CN111696058A - Image processing method, device and storage medium - Google Patents

Image processing method, device and storage medium Download PDF

Info

Publication number
CN111696058A
Authority
CN
China
Prior art keywords
image
target
jnd
jnds
compensation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010462573.1A
Other languages
Chinese (zh)
Inventor
杨佳义
李愿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
College Of Mobile Telecommunications Chongqing University Of Posts And Telecommunications
Original Assignee
College Of Mobile Telecommunications Chongqing University Of Posts And Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by College Of Mobile Telecommunications Chongqing University Of Posts And Telecommunications
Priority to CN202010462573.1A
Publication of CN111696058A
Legal status: Pending

Classifications

    • G06T5/94
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face

Abstract

The application discloses an image processing method, an image processing apparatus, and a storage medium. The method includes: acquiring an image to be processed; acquiring a just-noticeable difference (JND) corresponding to each pixel in the image to be processed to obtain a plurality of JNDs; determining a segmentation threshold; and performing image enhancement processing on the image to be processed based on the plurality of JNDs and the segmentation threshold to obtain a target image. By means of the method and apparatus, image quality in low-illumination environments is improved.

Description

Image processing method, device and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, and a storage medium.
Background
With the widespread use of electronic devices (such as mobile phones and tablet computers), electronic devices support ever more applications and ever more powerful functions. They are developing in diversified and personalized directions and have become indispensable electronic products in users' daily lives.
At present, photographing has become a standard feature of electronic devices. However, in low-illumination environments (low-light, night-vision, or other special scenes), video images often suffer from low contrast, a small amount of information, a narrow gray-scale spectral bandwidth, weak image layering, and similar defects, making them difficult for human eyes to recognize. How to improve image quality in low-illumination environments is therefore a problem that urgently needs to be solved.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device and a storage medium, which can improve the image quality in a low-illumination environment.
In a first aspect, an embodiment of the present application provides an image processing method, where the method includes:
acquiring an image to be processed;
acquiring a just-noticeable difference (JND) corresponding to each pixel in the image to be processed to obtain a plurality of JNDs;
determining a segmentation threshold;
and performing image enhancement processing on the image to be processed based on the JNDs and the segmentation threshold value to obtain a target image.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including: a first acquisition unit, a second acquisition unit, a determination unit and an image enhancement unit, wherein,
the first acquisition unit is used for acquiring an image to be processed;
the second obtaining unit is used for obtaining a just-noticeable difference (JND) corresponding to each pixel in the image to be processed to obtain a plurality of JNDs;
the determining unit is used for determining a segmentation threshold;
and the image enhancement unit is used for carrying out image enhancement processing on the image to be processed based on the JNDs and the segmentation threshold value to obtain a target image.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs including instructions for performing some or all of the steps described in the method according to the first aspect of the embodiments of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium is used to store a computer program, where the computer program is executed by a processor to implement part or all of the steps described in the method according to the first aspect of the present application.
In a fifth aspect, the present application provides a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps described in the method according to the first aspect of the present application. The computer program product may be a software installation package.
The embodiment of the application has the following beneficial effects:
It can be seen that, in the image processing method, apparatus, and storage medium described in the embodiments of the present application, an image to be processed is acquired, a just-noticeable difference (JND) corresponding to each pixel in the image to be processed is acquired to obtain a plurality of JNDs, a segmentation threshold is determined, and image enhancement processing is performed on the image to be processed based on the plurality of JNDs and the segmentation threshold to obtain a target image, which helps improve image quality in low-illumination environments.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1A is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure;
fig. 1B is a schematic flowchart of an image processing method according to an embodiment of the present application;
FIG. 1C is a schematic flow chart of an automatic optimization model for the compensation scale factor k according to an embodiment of the present disclosure;
FIG. 1D is a schematic diagram illustrating a prediction model for the optimal compensation coefficient k provided in an embodiment of the present application;
FIG. 1E is a schematic flow chart of a compensation algorithm provided by an embodiment of the present application;
fig. 1F is a schematic flowchart of creating a compensation linked list according to an embodiment of the present application;
fig. 1G is a schematic illustration of a real-time video processing window provided by an embodiment of the present application;
fig. 1H is a schematic illustration of a local video file processing player according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of another image processing method provided in the embodiments of the present application;
fig. 3 is a schematic diagram of another hardware structure of an electronic device according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The following are detailed below.
The terms "first," "second," "third," and "fourth," etc. in the description and claims of this application and in the accompanying drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Hereinafter, some terms in the present application are explained to facilitate understanding by those skilled in the art.
Electronic devices may include various handheld devices having wireless communication capabilities, in-vehicle devices, wearable devices (e.g., smart watches, smart glasses, smart bracelets, pedometers, etc.), smart cameras (e.g., smart single-lens reflex cameras, high-speed cameras), computing devices, or other processing devices communicatively connected to a wireless modem, as well as various forms of User Equipment (UE), Mobile Stations (MS), terminal devices (terminal device), and so forth. For convenience of description, the above-mentioned devices are collectively referred to as electronic devices.
As shown in fig. 1A, fig. 1A is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The electronic device includes a processor, a memory, a signal processor, a transceiver, a display screen, a speaker, a microphone, a Random Access Memory (RAM), a camera, a sensor, a network module, and the like. The memory, the signal processor, the speaker, the microphone, the RAM, the camera, the sensor, and the network module are connected to the processor, and the transceiver is connected to the signal processor.
The processor is the control center of the electronic device. It connects the various parts of the whole electronic device using various interfaces and lines, and executes the various functions of the device and processes data by running or executing the software programs and/or modules stored in the memory and calling the data stored in the memory, thereby monitoring the electronic device as a whole. The processor may be a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or a Network Processing Unit (NPU).
Further, the processor may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor.
The memory is used to store software programs and/or modules, and the processor executes various functional applications and data processing of the electronic device by running the software programs and/or modules stored in the memory. The memory mainly includes a program storage area and a data storage area: the program storage area can store the operating system, software programs required by at least one function, and the like; the data storage area can store data created according to the use of the electronic device, and the like. Furthermore, the memory may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
Wherein the sensor comprises at least one of: light-sensitive sensors, gyroscopes, infrared proximity sensors, vibration detection sensors, pressure sensors, etc. Among them, the light sensor, also called an ambient light sensor, is used to detect the ambient light brightness. The light sensor may include a light sensitive element and an analog to digital converter. The photosensitive element is used for converting collected optical signals into electric signals, and the analog-to-digital converter is used for converting the electric signals into digital signals. Optionally, the light sensor may further include a signal amplifier, and the signal amplifier may amplify the electrical signal converted by the photosensitive element and output the amplified electrical signal to the analog-to-digital converter. The photosensitive element may include at least one of a photodiode, a phototransistor, a photoresistor, and a silicon photocell.
The camera may be a visible light camera (general view angle camera, wide angle camera), an infrared camera, or a dual camera (having a distance measurement function), which is not limited herein.
The network module may be at least one of: a Bluetooth module, a wireless fidelity (Wi-Fi) module, etc., which is not limited herein.
Based on the electronic device described in fig. 1A, the following image processing method can be executed, and the specific steps are as follows:
acquiring an image to be processed;
acquiring a just-noticeable difference (JND) corresponding to each pixel in the image to be processed to obtain a plurality of JNDs;
determining a segmentation threshold;
and performing image enhancement processing on the image to be processed based on the JNDs and the segmentation threshold value to obtain a target image.
It can be seen that the image processing method described in the embodiment of the present application acquires an image to be processed, acquires a just-noticeable difference (JND) corresponding to each pixel in the image to be processed to obtain a plurality of JNDs, determines a segmentation threshold, and performs image enhancement processing on the image based on the plurality of JNDs and the segmentation threshold to obtain a target image.
The following describes embodiments of the present application in detail.
As shown in fig. 1B, fig. 1B is a schematic flowchart of an image processing method provided in an embodiment of the present application, and is applied to the electronic device shown in fig. 1A, where the method includes:
101. An image to be processed is acquired.
The image to be processed may be any frame of a video, or an image shot by a camera in a dark environment, and may be a grayscale image or a color image. In the embodiment of the application, the electronic device may include a camera; the camera may be an infrared camera or a visible light camera, and of course may also be a dual camera or multiple cameras.
102. The just-noticeable difference (JND) corresponding to each pixel in the image to be processed is acquired to obtain a plurality of JNDs.
In a specific implementation, the electronic device may obtain, through related techniques, the Just Noticeable Difference (JND), i.e., the smallest perceivable difference, corresponding to each pixel in the image to be processed. In the related art, the difference just distinguishable by human vision is called the JND, and the rule by which contrast resolution changes with the background gray level is as follows:
[Equation image in the original, not reproduced: JND as a piecewise function of the background gray level i.]
In the formula, i is the background gray level. For i ∈ [0, 47] (scotopic vision), the just-distinguishable gray difference of the human eye varies exponentially; for i ∈ (47, 255] (photopic vision), the just-distinguishable difference is 1.17 to 1.75 gray levels.
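For illustration, a minimal C++ sketch of a JND curve with this shape follows. The patent's exact equation is an image not reproduced above, so the exponential constants for the scotopic region and the linear interpolation across the photopic region are assumptions, not the patent's values.

#include <cmath>

// Hedged sketch of a JND-versus-background-gray curve with the behavior
// described above; A, B and the photopic interpolation are assumed values.
double jndOf(int i) {                        // i: background gray level, 0..255
    if (i <= 47) {
        const double A = 20.0, B = 0.08;     // assumed shape parameters
        return A * std::exp(-B * i) + 1.17;  // scotopic: exponential variation
    }
    // photopic: just-distinguishable difference between 1.17 and 1.75 gray
    // levels; the linear ramp across (47, 255] is an assumption
    return 1.17 + (1.75 - 1.17) * (i - 47) / (255.0 - 47.0);
}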
Optionally, between the step 101 and the step 102, the following steps may be further included:
A1, acquiring the target ambient light brightness;
and A2, when the target ambient light brightness is lower than a preset ambient light brightness threshold, executing the step of acquiring the just-noticeable difference (JND) corresponding to each pixel in the image to be processed to obtain a plurality of JNDs.
The preset ambient light brightness threshold may be set by the user or set by system default. In a specific implementation, an ambient light sensor may be arranged in the electronic device to detect the ambient light brightness. Specifically, the electronic device may acquire the target ambient light brightness; when it is lower than the preset ambient light brightness threshold, step 102 may be executed; otherwise, the image may be processed directly with a pre-stored image enhancement algorithm. In this way, in a scotopic environment, image enhancement is carried out according to the method of this embodiment, which helps improve image quality.
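A minimal sketch of this ambient-light gate follows; the image type and the two enhancement routines are hypothetical placeholders for steps 102 to 104 of this method and for the pre-stored algorithm, respectively.

#include <cstdint>
#include <vector>

using Image = std::vector<std::uint8_t>;       // flattened gray image (assumption)

void enhanceWithJndSegmentation(Image& img);   // steps 102-104 of this method
void enhanceWithDefaultAlgorithm(Image& img);  // pre-stored enhancement algorithm

// A2: run the JND pipeline only when the scene is darker than the threshold.
void processFrame(Image& img, double ambientLux, double luxThreshold) {
    if (ambientLux < luxThreshold)
        enhanceWithJndSegmentation(img);
    else
        enhanceWithDefaultAlgorithm(img);
}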
103. A segmentation threshold is determined.
The segmentation threshold may be set by the user or by system default; of course, it may also be determined from the plurality of JNDs described above. There may be one or more segmentation thresholds, and a segmentation threshold may be any real number greater than 0.
Optionally, the step 103 of determining the segmentation threshold may include the following steps:
311. determining the mean JND corresponding to the plurality of JNDs;
312. determining the target mean square deviation corresponding to the plurality of JNDs;
313. determining a target adjusting coefficient corresponding to the target mean square error according to a mapping relation between a preset mean square error and an adjusting coefficient;
314. and adjusting the mean JND according to the target adjusting coefficient to obtain the segmentation threshold.
In a specific implementation, the electronic device may pre-store a mapping relationship between mean square deviations and adjustment coefficients. The electronic device may determine the mean JND corresponding to the plurality of JNDs, determine the target mean square deviation corresponding to the plurality of JNDs, determine the target adjustment coefficient corresponding to the target mean square deviation according to the pre-stored mapping, and adjust the mean JND according to the target adjustment coefficient to obtain the segmentation threshold. For example, the value range of the target adjustment coefficient may be (-0.2, 0.2), and the segmentation threshold = mean JND × (1 + target adjustment coefficient).
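A minimal sketch of steps 311 to 314 follows; the deviation-to-coefficient mapping is a hypothetical stand-in for the pre-stored mapping, and only its (-0.2, 0.2) output range comes from the text.

#include <algorithm>
#include <cmath>
#include <vector>

// Derive the segmentation threshold from the JND statistics (steps 311-314).
double segmentationThreshold(const std::vector<double>& jnds) {
    double mean = 0.0;
    for (double j : jnds) mean += j;
    mean /= jnds.size();                        // step 311: mean JND

    double dev = 0.0;
    for (double j : jnds) dev += (j - mean) * (j - mean);
    dev = std::sqrt(dev / jnds.size());         // step 312: target mean square deviation

    // step 313: look up the target adjustment coefficient (assumed mapping,
    // clamped to the (-0.2, 0.2) range mentioned above)
    double coeff = std::clamp(0.1 * (dev - 1.0), -0.2, 0.2);

    return mean * (1.0 + coeff);                // step 314: segmentation threshold
}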
Optionally, the step 103 of determining the segmentation threshold may include the following steps:
321. acquiring target vision parameters of a user;
322. and determining a segmentation threshold corresponding to the target vision parameter according to a mapping relation between a preset vision parameter and a threshold.
Wherein the vision parameter may be at least one of: right eye vision, left eye vision, astigmatism, axis of astigmatism, eye sensitivity, color sensitivity, whether or not there is color weakness, whether or not there is color blindness, etc., without limitation.
In a specific implementation, a mapping relationship between vision parameters and thresholds may be stored in the electronic device in advance. The electronic device may then acquire the user's target vision parameter and determine the corresponding segmentation threshold according to this mapping, obtaining a segmentation threshold suited to the user's eyes, which helps improve the user experience.
104. Image enhancement processing is performed on the image to be processed based on the plurality of JNDs and the segmentation threshold to obtain a target image.
The electronic device can divide the plurality of JNDs into at least two sets based on the segmentation threshold, and then perform image enhancement processing on the image to be processed according to the characteristics of each set to obtain the target image.
Optionally, in step 104, the performing of image enhancement processing on the image to be processed based on the plurality of JNDs and the segmentation threshold to obtain the target image may include the following steps:
41. dividing the plurality of JNDs according to the segmentation threshold to obtain a first JND set and a second JND set, wherein the first JND set contains the JNDs greater than or equal to the segmentation threshold, and the second JND set contains the JNDs smaller than the segmentation threshold;
42. and performing image enhancement processing on the pixel value corresponding to each JND in the first JND set by adopting a first preset algorithm, and performing image enhancement processing on the pixel value corresponding to each JND in the second JND set by adopting a second preset algorithm to obtain the target image.
The first preset algorithm and the second preset algorithm can be set by a user or defaulted by a system, and can be at least one of the following algorithms: histogram equalization, gray scale stretching, proportional integral compensation algorithm, wavelet transform, etc., without limitation.
In a specific implementation, the electronic device may divide the plurality of JNDs according to the segmentation threshold to obtain a first JND set (JNDs greater than or equal to the segmentation threshold) and a second JND set (JNDs smaller than the segmentation threshold). It may then perform image enhancement processing on the pixel value corresponding to each JND in the first JND set using the first preset algorithm, and on the pixel value corresponding to each JND in the second JND set using the second preset algorithm, to obtain the target image, as sketched below.
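The sketch below illustrates this split-and-dispatch; the two compensation routines are declarations standing in for the first and second preset algorithms detailed next.

#include <cstddef>
#include <cstdint>
#include <vector>

std::uint8_t compensateProportionalIntegral(std::uint8_t og);  // first preset algorithm
std::uint8_t compensateLinearStretch(std::uint8_t og);         // second preset algorithm

// Steps 41-42: route each pixel by comparing its JND with the threshold.
void enhance(std::vector<std::uint8_t>& pixels,
             const std::vector<double>& jnds, double threshold) {
    for (std::size_t p = 0; p < pixels.size(); ++p) {
        if (jnds[p] >= threshold)                               // first JND set
            pixels[p] = compensateProportionalIntegral(pixels[p]);
        else                                                    // second JND set
            pixels[p] = compensateLinearStretch(pixels[p]);
    }
}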
Further optionally, the first preset algorithm is a proportional integral compensation algorithm, and a specific formula is as follows:
[Equation image in the original, not reproduced: the proportional-integral compensation formula relating the target gray value TG(x, y) to the original gray value OG(x, y), the scale factor k, and JND(i).]
where JND(i) is the JND corresponding to gray level i, k is the target contrast resolution compensation scale factor, and OG(x, y) and TG(x, y) respectively denote the original gray value before compensation and the target gray value after compensation at image pixel coordinates (x, y).
In a specific implementation, k is a real number greater than 0. Since the just-resolvable difference is 1.17 to 1.75 gray levels, a JND greater than the segmentation threshold can be considered to have reached, or nearly reached, a level the user can resolve. The corresponding pixels therefore only need fine adjustment, e.g., proportional-integral compensation, to suppress over-enhancement so that the enhancement effect is just right.
Further optionally, the method may further include the steps of:
b1, acquiring target environment parameters;
b2, determining the target contrast resolution compensation proportionality coefficient corresponding to the target environmental parameter according to the mapping relation between the preset environmental parameter and the contrast resolution compensation proportionality coefficient.
Wherein the environmental parameter may be at least one of: ambient light level, ambient color temperature, weather, geographic location, etc., without limitation.
In specific implementation, the electronic device may pre-store a mapping relationship between a preset environmental parameter and the contrast resolution compensation scaling factor. Furthermore, the electronic device may obtain the target environment parameter, and determine the target contrast resolution compensation scaling factor corresponding to the target environment parameter according to a mapping relationship between the preset environment parameter and the contrast resolution compensation scaling factor.
Further optionally, the second preset algorithm is a linear stretch compensation algorithm, and the specific formula is as follows:
TG(x, y) = TG(x, y)_Th + a × [OG(x, y) - TG(x, y)_Th]
where TG(x, y)_Th is the compensated target gray value at the threshold, a is the linear stretch adjustment coefficient, and OG(x, y) and TG(x, y) respectively denote the original gray value before compensation and the target gray value after compensation at image pixel coordinates (x, y).
In a specific implementation, a is a real number greater than 0, e.g., a = 1.5. The segmentation threshold is typically chosen in combination with the characteristics of the user's (or the human) eye, so JNDs below the threshold correspond to differences the eye cannot yet resolve; performing image enhancement based on the segmentation threshold therefore effectively amplifies these low, hard-to-resolve differences.
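A minimal sketch of this linear stretch follows; tgTh is the compensated target gray value at the threshold, a = 1.5 follows the example above, and clamping the result to [0, 255] is an added assumption.

#include <algorithm>
#include <cstdint>

// TG = TG_Th + a * (OG - TG_Th), per the formula above.
std::uint8_t linearStretch(std::uint8_t og, double tgTh, double a = 1.5) {
    double tg = tgTh + a * (static_cast<double>(og) - tgTh);
    return static_cast<std::uint8_t>(std::clamp(tg, 0.0, 255.0));  // assumed clamp
}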
Further optionally, the method may further include the steps of:
b3, acquiring target shooting parameters;
b4, determining the target linear stretching adjustment coefficient corresponding to the target shooting parameter according to the mapping relation between the preset shooting parameter and the linear stretching adjustment coefficient.
In the embodiment of the present application, the shooting parameters may be at least one of the following: sensitivity ISO, exposure time, EV value, shutter size, and the like, which are not limited herein.
In specific implementation, a mapping relationship between the preset shooting parameters and the linear stretching adjustment coefficients can be prestored in the electronic device, the electronic device can acquire the target shooting parameters, and then the target linear stretching adjustment coefficients corresponding to the target shooting parameters can be determined according to the mapping relationship between the preset shooting parameters and the linear stretching adjustment coefficients.
For example, as can be seen from the related art above, in order to ensure that photopic gray levels are properly resolved, the segmentation threshold, i.e., the photopic compensation degree CDegree, is set to 1.5, and the gray level corresponding to CDegree = 1.5 is the contrast resolution compensation threshold Th. The thresholded piecewise compensation algorithm is defined as follows:
[Equation image in the original, not reproduced: the piecewise compensation formula, applying proportional-integral compensation when the compensation degree exceeds 1.5 and linear stretch compensation when it is below 1.5.]
where TG(x, y)_Th is the compensated target gray value at the threshold. When the compensation degree CDegree > 1.5, proportional-integral compensation is used; when CDegree < 1.5, linear stretch compensation is performed.
In a specific implementation, in video image processing applications, the electronic device may adaptively extract the compensation scale factor k. Different compensation coefficients yield different compensation effects: as k increases, the compensation effect of the image first improves and then degrades, exhibiting a convex characteristic, so an optimal value of k exists. By extracting characteristic parameters of the video image, selecting the best compensated image by means of subjective evaluation, and establishing an automatic optimization model for the compensation scale factor k, the optimal compensation coefficient for a single frame is obtained in real time. The modeling scheme is shown in FIG. 1C.
Further, in a specific implementation, 20 original images with average gray values between 0 and 47 can be collected in a low-illumination environment, and 5 different compensation scale coefficients k are applied to each image to obtain a test image sample set. The image with the best visual effect is selected by organizing a subjective evaluation with 10 observers under the viewing conditions of the subjective assessment standard for digital television picture quality (GY/T134-1998). Observers' evaluations differ with factors such as cognitive orientation, knowledge background, and mood, so mathematical statistics are applied to the evaluation results: for each group of data, results with excessive deviation are discarded, and the final result is extracted by averaging. The prediction model of the optimal compensation scale factor k, fitted with the mathematical statistics of the data analysis software AGrapher, is shown in FIG. 1D.
In the figure, the abscissa is the average gray level AGO (Average Gray of the Original image), the ordinate is the scale factor k, and the square points are experimental optimization data. The prediction model for the optimal compensation scale factor k is:
k = 2.9/AGO + 0.28, 0 < AGO ≤ 47
wherein, AGO is the average gray value of the original image.
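A minimal sketch of applying this prediction model to a frame follows; computing AGO as a plain average over a flattened gray buffer and the fallback outside the fitted range are assumptions.

#include <cstdint>
#include <vector>

// k = 2.9/AGO + 0.28 for 0 < AGO <= 47, per the fitted model above.
double optimalCompensationK(const std::vector<std::uint8_t>& pixels) {
    double ago = 0.0;
    for (std::uint8_t v : pixels) ago += v;
    ago /= pixels.size();                       // AGO: average gray of original image
    if (ago <= 0.0 || ago > 47.0) return 0.28;  // outside fitted range (assumed fallback)
    return 2.9 / ago + 0.28;
}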
Further, the DirectShow platform provided by Microsoft can be used to develop an application program with adaptive video image compensation enhancement and to verify the practicality and real-time performance of the compensation algorithm.
DirectShow is a general-purpose video processing technology on the Windows platform that can quickly build video playback, capture, and editing applications. To realize adaptive compensation enhancement of video images, a Contrast Resolution Compensation (CRCompensate) filter is developed, with the standard transform filter base class CTransformFilter as its parent class. The overridden CTransformFilter::CheckInputType calls a CanPerformTransform helper to check the media format of the input video; CTransformFilter::CheckTransform checks whether the output media type matches the input media type; CTransformFilter::DecideBufferSize negotiates and reserves the buffer size of each sample through the filter's output pin and the downstream filter's input pin; CTransformFilter::GetMediaType determines the output media type from the media types supported by the upstream filter; and the contrast resolution compensation algorithm is implemented in CTransformFilter::Transform.
The Transform function of the CRCompensate class obtains video stream data from the input sample, processes it, places the processed data into the output sample's memory, and sends it on to the video renderer. The flow of the algorithm is shown in FIG. 1E.
In the Transform function, the code for the current media type is obtained as follows:
AM_MEDIA_TYPE *pmt = &(m_pInput->CurrentMediaType()); // the AM_MEDIA_TYPE structure defines the media type
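A hedged sketch of such a Transform override follows; the class name and the per-pixel helper are illustrative, and a real filter must honor the negotiated media type (stride, RGB/YUV layout) rather than treating the buffer as flat gray bytes as done here.

BYTE CompensatePixel(BYTE og);  // hypothetical per-pixel compensation helper

HRESULT CCRCompensate::Transform(IMediaSample *pIn, IMediaSample *pOut)
{
    BYTE *pSrc = nullptr, *pDst = nullptr;
    pIn->GetPointer(&pSrc);                   // input video stream data
    pOut->GetPointer(&pDst);                  // output sample memory
    long len = pIn->GetActualDataLength();

    for (long i = 0; i < len; ++i)
        pDst[i] = CompensatePixel(pSrc[i]);   // apply the compensation algorithm

    pOut->SetActualDataLength(len);           // hand the result downstream
    return S_OK;
}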
For the filter graph manager to invoke the compensation filter, the filter is registered with a CLSID (class identifier) of CLSID_CRCompensate.
The DirectShow SDK (Software Development Kit) provides application samples for various streaming media: the routine simplepreseter provides an application example of local video playback, and the routine AMCap provides an example of a video capture application. The main flow of constructing the video image processing filter chain (linked list) is shown in FIG. 1F.
The CLSID of the compensation filter is defined in the filter's interface header; by including initguid.h, the CLSID definition is converted into a GUID (Globally Unique Identifier) constant. The video capture processing chain is built in the BuildProcessGraph function, where the RenderStream intelligent-connection method of the ICaptureGraphBuilder2 interface automatically completes the construction of the filter chain downstream of the specified pin. To build the new processing chain with the compensation filter inserted, the Smart Tee filter created by intelligent connection is first located in the original chain, the filters behind it are destroyed, and chains for real-time preview and real-time processing are then built.
The FindFilterByCLSID function locates the Smart Tee filter by its CLSID (CLSID_SmartTee), the NukeDownstream function destroys all filters downstream of the Smart Tee filter, and the AddFilter method inserts the contrast resolution compensation filter and builds the chain behind it, as sketched below.
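A hedged sketch of this rebuilding step follows; FindFilterByCLSID and NukeDownstream are the AMCap-style helpers named above, CLSID_CRCompensate is the filter's registered class identifier, and the filter name string and omitted error handling are simplifications.

IBaseFilter *FindFilterByCLSID(IGraphBuilder *pGraph, REFGUID clsid);  // AMCap-style helper
void NukeDownstream(IGraphBuilder *pGraph, IBaseFilter *pFilter);      // AMCap-style helper

HRESULT InsertCompensationFilter(IGraphBuilder *pGraph,
                                 ICaptureGraphBuilder2 *pBuilder,
                                 IBaseFilter *pCaptureFilter)
{
    // Locate the Smart Tee created by intelligent connection and tear down
    // everything downstream of it.
    IBaseFilter *pSmartTee = FindFilterByCLSID(pGraph, CLSID_SmartTee);
    if (pSmartTee)
        NukeDownstream(pGraph, pSmartTee);

    // Create and insert the contrast resolution compensation filter.
    IBaseFilter *pCRC = nullptr;
    CoCreateInstance(CLSID_CRCompensate, nullptr, CLSCTX_INPROC_SERVER,
                     IID_IBaseFilter, reinterpret_cast<void **>(&pCRC));
    pGraph->AddFilter(pCRC, L"Contrast Resolution Compensation");

    // RenderStream auto-completes the preview chain from the capture pin
    // through the compensation filter to the renderer.
    return pBuilder->RenderStream(&PIN_CATEGORY_PREVIEW, &MEDIATYPE_Video,
                                  pCaptureFilter, pCRC, nullptr);
}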
A video compensation processing command is added to the original AMCap window; the real-time video processing window is shown in FIG. 1G.
The local video file processing player is designed as shown in FIG. 1H; the main dialog displays the video images before and after compensation on the left and right, respectively.
Optionally, in step 101, before acquiring the image to be processed, the method may further include the following steps:
c1, acquiring a first face image;
c2, dividing the first face image into a plurality of areas;
c3, determining the distribution density of the characteristic points of each area in the plurality of areas to obtain a characteristic point distribution density set, wherein each area corresponds to one characteristic point distribution density;
c4, determining a target mean value and a target mean square error corresponding to the feature point distribution density set;
c5, determining a target image enhancement algorithm corresponding to the target average value according to the mapping relation between the preset average value and the image enhancement algorithm;
c6, determining a target fine tuning coefficient corresponding to the target mean square error according to a mapping relation between a preset mean square error and the fine tuning coefficient;
c7, adjusting the algorithm control parameters of the target image enhancement algorithm according to the target fine adjustment coefficients to obtain target algorithm control parameters;
c8, carrying out image enhancement processing on the first face image according to the target algorithm control parameter and the target image enhancement algorithm to obtain a second face image;
c9, matching the second face image with a preset face template;
and C10, when the second face image is successfully matched with the preset face template, executing the step of acquiring the image to be processed.
In the embodiment of the application, a preset face template can be stored in the electronic device in advance. The image enhancement algorithm may be at least one of: gamma correction, histogram equalization, image sharpening, wavelet transformation, gray stretching, and the like, without limitation. Each image enhancement algorithm corresponds to an algorithm control parameter, and the algorithm control parameter is used to control the degree of image enhancement. The electronic device may pre-store a mapping relationship between preset average values and image enhancement algorithms, and a mapping relationship between preset mean square errors and fine-tuning coefficients.
In a specific implementation, the electronic device may acquire the first face image and divide it into a plurality of regions. The number of regions may be understood as 3 or more; the size of each region is within a preset area range, the regions may be equal or unequal in size, and the preset area range may be set by the user or by system default. The electronic device may determine the feature point distribution density of each region to obtain a feature point distribution density set, in which each region corresponds to one feature point distribution density; that is, the number of feature points in each region and the corresponding region area may be determined, and the ratio between the number of feature points and the corresponding region area is taken as the feature point distribution density. The electronic device may then determine a target mean value and a target mean square error corresponding to the feature point distribution density set: the target mean value is the mean of the density set, and the target mean square error is computed from the target mean value and the density set. The mean value reflects the overall characteristics of the image, while the mean square error reflects the correlation between regions, so the image enhancement algorithm and its control parameters can be selected by combining the image's overall characteristics with the regional correlation, which helps improve image enhancement efficiency, i.e., the quality of the face image.
Further, the electronic device may determine the target image enhancement algorithm corresponding to the target mean value according to the pre-stored mapping between mean values and image enhancement algorithms, and the target fine-tuning coefficient corresponding to the target mean square error according to the pre-stored mapping between mean square errors and fine-tuning coefficients. The electronic device may then adjust the algorithm control parameter of the target image enhancement algorithm according to the target fine-tuning coefficient to obtain the target algorithm control parameter, and perform image enhancement processing on the first face image according to the target algorithm control parameter and the target image enhancement algorithm to obtain a second face image. Since the second face image has been enhanced, the electronic device can match it against the preset face template; when the match succeeds, the step of acquiring the image to be processed is executed, and otherwise the user is prompted to input a face image again. In this way, face recognition efficiency is improved, unlocking by face is realized, and image enhancement of the image to be processed is performed only after a successful unlock, which helps improve the security of the electronic device.
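A minimal sketch of the density statistics in steps C3 and C4 follows; region splitting and feature detection are assumed to have produced the per-region counts and areas.

#include <cstddef>
#include <vector>

struct DensityStats { double mean; double mse; };

// Steps C3-C4: per-region feature-point density, then its mean and
// mean square error over the whole density set.
DensityStats densityStats(const std::vector<int>& featureCounts,
                          const std::vector<double>& regionAreas) {
    std::vector<double> density(featureCounts.size());
    for (std::size_t r = 0; r < density.size(); ++r)
        density[r] = featureCounts[r] / regionAreas[r];   // step C3

    double mean = 0.0;
    for (double d : density) mean += d;
    mean /= density.size();                               // target mean value

    double mse = 0.0;
    for (double d : density) mse += (d - mean) * (d - mean);
    mse /= density.size();                                // target mean square error

    return {mean, mse};
}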
In summary, the image processing method described in the embodiment of the present application acquires an image to be processed, acquires the just-noticeable difference (JND) corresponding to each pixel in the image to be processed to obtain a plurality of JNDs, determines a segmentation threshold, and performs image enhancement processing on the image based on the plurality of JNDs and the segmentation threshold to obtain a target image.
Referring to fig. 2, fig. 2 is a schematic flow chart of an image processing method according to an embodiment of the present application, and as shown in the figure, the image processing method is applied to the electronic device shown in fig. 1A, and includes:
201. An image to be processed is acquired.
202. The target ambient light brightness is acquired.
203. When the target ambient light brightness is lower than a preset ambient light brightness threshold, the just-noticeable difference (JND) corresponding to each pixel in the image to be processed is acquired to obtain a plurality of JNDs.
204. A segmentation threshold is determined.
205. Image enhancement processing is performed on the image to be processed based on the plurality of JNDs and the segmentation threshold to obtain a target image.
For the detailed description of steps 201 to 205, reference may be made to corresponding steps of the image processing method described in fig. 1B, which are not described herein again.
It can be seen that the image processing method described in the embodiments of the present application acquires an image to be processed and the target ambient light brightness; when the target ambient light brightness is lower than a preset ambient light brightness threshold, it acquires the just-noticeable difference (JND) corresponding to each pixel in the image to be processed to obtain a plurality of JNDs, determines a segmentation threshold, and performs image enhancement processing on the image based on the plurality of JNDs and the segmentation threshold to obtain a target image. Based on human visual perception characteristics and the idea of contrast resolution difference compensation, a thresholded contrast resolution compensation algorithm is provided that effectively mines scotopic vision information in low-illumination environments and improves image quality without introducing spurious information. The method can be widely applied to low-illumination scenes under different illumination conditions, mining scotopic image information and helping to improve image quality in low-illumination environments.
In accordance with the foregoing embodiments, please refer to fig. 3. Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in the figure, the electronic device includes a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor. In the embodiment of the present application, the programs include instructions for performing the following steps:
acquiring an image to be processed;
acquiring a just-noticeable difference (JND) corresponding to each pixel in the image to be processed to obtain a plurality of JNDs;
determining a segmentation threshold;
and performing image enhancement processing on the image to be processed based on the JNDs and the segmentation threshold value to obtain a target image.
It can be seen that the electronic device described in the embodiment of the present application acquires an image to be processed, acquires a just-noticeable difference (JND) corresponding to each pixel in the image to be processed to obtain a plurality of JNDs, determines a segmentation threshold, and performs image enhancement processing on the image based on the plurality of JNDs and the segmentation threshold to obtain a target image.
In one possible example, in terms of performing image enhancement processing on the image to be processed based on the plurality of JNDs and the segmentation threshold to obtain a target image, the program includes instructions for performing the following steps:
dividing the plurality of JNDs according to the segmentation threshold to obtain a first JND set and a second JND set, wherein the first JND set contains the JNDs greater than or equal to the segmentation threshold, and the second JND set contains the JNDs smaller than the segmentation threshold;
and performing image enhancement processing on the pixel value corresponding to each JND in the first JND set by adopting a first preset algorithm, and performing image enhancement processing on the pixel value corresponding to each JND in the second JND set by adopting a second preset algorithm to obtain the target image.
In one possible example, wherein the first preset algorithm is a proportional integral compensation algorithm, the specific formula is as follows:
[Equation image in the original, not reproduced: the proportional-integral compensation formula relating the target gray value TG(x, y) to the original gray value OG(x, y), the scale factor k, and JND(i).]
where JND(i) is the JND corresponding to gray level i, k is the target contrast resolution compensation scale factor, and OG(x, y) and TG(x, y) respectively denote the original gray value before compensation and the target gray value after compensation at image pixel coordinates (x, y).
In one possible example, the program includes instructions for performing the steps of:
acquiring target environment parameters;
and determining the target contrast resolution compensation proportionality coefficient corresponding to the target environmental parameter according to a mapping relation between a preset environmental parameter and the contrast resolution compensation proportionality coefficient.
In one possible example, wherein the second preset algorithm is a linear stretch compensation algorithm, the specific formula is as follows:
TG(x, y) = TG(x, y)_Th + a × [OG(x, y) - TG(x, y)_Th]
where TG(x, y)_Th is the compensated target gray value at the threshold, a is the linear stretch adjustment coefficient, and OG(x, y) and TG(x, y) respectively denote the original gray value before compensation and the target gray value after compensation at image pixel coordinates (x, y).
In one possible example, the program further includes instructions for performing the steps of:
acquiring target shooting parameters;
and determining the target linear stretching adjusting coefficient corresponding to the target shooting parameter according to a mapping relation between preset shooting parameters and linear stretching adjusting coefficients.
In one possible example, in determining the segmentation threshold, the above program includes instructions for performing the following steps:
determining the mean JND corresponding to the plurality of JNDs;
determining the target mean square deviation corresponding to the plurality of JNDs;
determining a target adjusting coefficient corresponding to the target mean square error according to a mapping relation between a preset mean square error and an adjusting coefficient;
and adjusting the mean JND according to the target adjusting coefficient to obtain the segmentation threshold.
The above description has introduced the solution of the embodiments of the present application mainly from the perspective of the method-side implementation process. It is understood that, to realize the above functions, the electronic device includes corresponding hardware structures and/or software modules for performing the respective functions. Those skilled in the art will readily appreciate that the units and algorithm steps of the examples described in connection with the embodiments provided herein can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and design constraints of the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Fig. 4 is a block diagram showing functional units of an image processing apparatus 400 according to an embodiment of the present application. The image processing apparatus 400 is applied to an electronic device, and the apparatus 400 may include: a first acquisition unit 401, a second acquisition unit 402, a determination unit 403, and an image enhancement unit 404, wherein,
the first obtaining unit 401 is configured to obtain an image to be processed;
the second obtaining unit 402 is configured to obtain a just-noticeable difference (JND) corresponding to each pixel in the image to be processed, so as to obtain a plurality of JNDs;
the determining unit 403 is configured to determine a segmentation threshold;
the image enhancement unit 404 is configured to perform image enhancement processing on the image to be processed based on the multiple JNDs and the segmentation threshold to obtain a target image.
It can be seen that the image processing apparatus described in this embodiment of the present application acquires an image to be processed, acquires a just-noticeable difference (JND) corresponding to each pixel in the image to be processed to obtain a plurality of JNDs, determines a segmentation threshold, and performs image enhancement processing on the image based on the plurality of JNDs and the segmentation threshold to obtain a target image.
Optionally, in terms of performing image enhancement processing on the image to be processed based on the multiple JNDs and the segmentation threshold to obtain a target image, the image enhancement unit 404 is specifically configured to:
dividing the plurality of JNDs according to the segmentation threshold to obtain a first JND set and a second JND set, wherein the first JND set contains the JNDs greater than or equal to the segmentation threshold, and the second JND set contains the JNDs smaller than the segmentation threshold;
and performing image enhancement processing on the pixel value corresponding to each JND in the first JND set by adopting a first preset algorithm, and performing image enhancement processing on the pixel value corresponding to each JND in the second JND set by adopting a second preset algorithm to obtain the target image.
Optionally, the first preset algorithm is a proportional integral compensation algorithm, and a specific formula is as follows:
[Equation image in the original, not reproduced: the proportional-integral compensation formula relating the target gray value TG(x, y) to the original gray value OG(x, y), the scale factor k, and JND(i).]
where JND(i) is the JND corresponding to gray level i, k is the target contrast resolution compensation scale factor, and OG(x, y) and TG(x, y) respectively denote the original gray value before compensation and the target gray value after compensation at image pixel coordinates (x, y).
Optionally, the image enhancement unit 404 is specifically configured to:
acquiring target environment parameters;
and determining the target contrast resolution compensation proportionality coefficient corresponding to the target environmental parameter according to a mapping relation between a preset environmental parameter and the contrast resolution compensation proportionality coefficient.
Optionally, the second preset algorithm is a linear stretch compensation algorithm, and a specific formula is as follows:
TG(x, y) = TG(x, y)_Th + a × [OG(x, y) - TG(x, y)_Th]
where TG(x, y)_Th is the compensated target gray value at the threshold, a is the linear stretch adjustment coefficient, and OG(x, y) and TG(x, y) respectively denote the original gray value before compensation and the target gray value after compensation at image pixel coordinates (x, y).
Optionally, the image enhancement unit 404 is specifically configured to:
acquiring target shooting parameters;
and determining the target linear stretching adjusting coefficient corresponding to the target shooting parameter according to a mapping relation between preset shooting parameters and linear stretching adjusting coefficients.
Optionally, in terms of determining the segmentation threshold, the determining unit 403 is specifically configured to:
determining the mean JND corresponding to the plurality of JNDs;
determining the target mean square deviation corresponding to the plurality of JNDs;
determining a target adjusting coefficient corresponding to the target mean square error according to a mapping relation between a preset mean square error and an adjusting coefficient;
and adjusting the mean JND according to the target adjusting coefficient to obtain the segmentation threshold.
It is to be understood that the functions of each program module of the image processing apparatus of this embodiment may be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process may refer to the relevant description of the foregoing method embodiment, which is not described herein again.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, the computer program enabling a computer to execute part or all of the steps of any one of the methods described in the above method embodiments, and the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising an electronic device.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, or a magnetic or optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing associated hardware; the program may be stored in a computer-readable memory, which may include a flash memory disk, a read-only memory (ROM), a random access memory (RAM), a magnetic or optical disk, and the like.
The foregoing detailed description of the embodiments of the present application illustrates the principles and implementations of the present application; the above description of the embodiments is provided only to help understand the method and core concept of the present application. Meanwhile, a person skilled in the art may, following the idea of the present application, make changes to the specific embodiments and the scope of application; in summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. An image processing method, characterized in that the method comprises:
acquiring an image to be processed;
acquiring the just noticeable difference (JND) corresponding to each pixel in the image to be processed, to obtain a plurality of JNDs;
determining a segmentation threshold;
and performing image enhancement processing on the image to be processed based on the plurality of JNDs and the segmentation threshold to obtain a target image.
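For illustration only, a minimal sketch of the claimed flow on an 8-bit grayscale image. The per-pixel JND model below (a background-luminance visibility threshold in the spirit of Chou-Li-style models) is an assumption; claim 1 does not fix a particular JND computation, and claim 7 refines the threshold step.

```python
import numpy as np

def luminance_jnd(gray: np.ndarray) -> np.ndarray:
    """Assumed per-pixel JND from background luminance: darker regions
    tolerate larger gray-level differences (luminance masking)."""
    bg = gray.astype(np.float64)
    return np.where(bg <= 127,
                    17.0 * (1.0 - np.sqrt(bg / 127.0)) + 3.0,
                    3.0 / 128.0 * (bg - 127.0) + 3.0)

# Random data stands in for a captured low-illumination frame.
image = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
jnds = luminance_jnd(image)          # a plurality of JNDs, one per pixel
threshold = float(jnds.mean())       # simplest threshold; see claim 7
```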
2. The method according to claim 1, wherein the performing image enhancement processing on the image to be processed based on the plurality of JNDs and the segmentation threshold to obtain a target image comprises:
dividing the plurality of JNDs according to the segmentation threshold to obtain a first JND set and a second JND set, wherein the first JND set comprises the JNDs greater than or equal to the segmentation threshold and the second JND set comprises the JNDs less than the segmentation threshold;
and performing image enhancement processing on the pixel value corresponding to each JND in the first JND set by using a first preset algorithm, and performing image enhancement processing on the pixel value corresponding to each JND in the second JND set by using a second preset algorithm, to obtain the target image.
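A minimal sketch of this two-branch structure, assuming NumPy boolean masking over a grayscale image; the two compensation algorithms are passed in as callables, since their concrete forms are the subject of claims 3 and 5.

```python
import numpy as np

def enhance(image, jnds, threshold, compensate_high, compensate_low):
    """Split pixels into two JND sets by the threshold and enhance
    each set with its own compensation algorithm."""
    high = jnds >= threshold                 # first JND set
    target = image.astype(np.float64)
    # First preset algorithm on pixels whose JND reaches the threshold.
    target[high] = compensate_high(target[high], jnds[high])
    # Second preset algorithm on the remaining pixels (second JND set).
    target[~high] = compensate_low(target[~high], jnds[~high])
    return np.clip(target, 0, 255).astype(image.dtype)
```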
3. The method according to claim 2, wherein the first preset algorithm is a proportional integral compensation algorithm given by the following formula:
[formula provided as image FDA0002511532950000011 in the original publication; it relates TG(x, y) to OG(x, y), k, and JND(i), but the expression itself is not recoverable from the text]
wherein JND(i) is the JND corresponding to pixel i, k is the target contrast resolution compensation scale factor, and OG(x, y) and TG(x, y) respectively denote the original gray value before compensation and the target gray value after compensation at image pixel coordinates (x, y).
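Because the claimed formula survives only as an image reference, the sketch below assumes a simple proportional form, TG = OG + k × JND(i); this is an illustrative guess consistent with the stated symbols, not the claimed expression itself.

```python
import numpy as np

def proportional_compensation(og: np.ndarray, jnds: np.ndarray,
                              k: float = 0.5) -> np.ndarray:
    """Assumed proportional compensation: raise each gray value by a
    k-scaled multiple of its JND. The claimed formula may differ."""
    return og + k * jnds
```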
4. The method of claim 3, further comprising:
acquiring target environment parameters;
and determining the target contrast resolution compensation scale factor corresponding to the target environment parameters according to a preset mapping between environment parameters and contrast resolution compensation scale factors.
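A sketch of such a lookup, assuming ambient illuminance (lux) as the environment parameter; the table values are hypothetical, and the claim-6 mapping from shooting parameters to the linear stretch adjustment coefficient would take the same shape.

```python
# Hypothetical preset mapping from ambient illuminance (lux) to the
# contrast resolution compensation scale factor k; values are made up.
LUX_TO_K = [(10.0, 0.8), (100.0, 0.5), (float("inf"), 0.3)]

def scale_factor_for(lux: float) -> float:
    """Look up k for the measured ambient light level."""
    return next(k for bound, k in LUX_TO_K if lux <= bound)
```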
5. The method according to any one of claims 2 to 4, wherein the second preset algorithm is a linear stretch compensation algorithm given by the following formula:
TG(x, y) = TG(x, y)_Th + a × [OG(x, y) − TG(x, y)_Th]
wherein TG(x, y)_Th denotes the gray value corresponding to the segmentation threshold, a denotes the linear stretch adjustment coefficient, and OG(x, y) and TG(x, y) respectively denote the original gray value before compensation and the target gray value after compensation at image pixel coordinates (x, y).
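This formula transcribes directly; a minimal sketch, with the threshold gray value and the coefficient supplied by the caller. With a > 1 the stretch expands contrast away from the threshold gray value; with 0 < a < 1 it compresses toward it.

```python
import numpy as np

def linear_stretch_compensation(og: np.ndarray, tg_th: float,
                                a: float) -> np.ndarray:
    """Linear stretch about the threshold gray value:
    TG(x, y) = TG_Th + a * (OG(x, y) - TG_Th)."""
    return tg_th + a * (og - tg_th)
```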
6. The method of claim 5, further comprising:
acquiring target shooting parameters;
and determining the target linear stretch adjustment coefficient corresponding to the target shooting parameters according to a preset mapping between shooting parameters and linear stretch adjustment coefficients.
7. The method of any one of claims 1 to 6, wherein the determining a segmentation threshold comprises:
determining the mean JND of the plurality of JNDs;
determining the target mean square deviation of the plurality of JNDs;
determining the target adjustment coefficient corresponding to the target mean square deviation according to a preset mapping between mean square deviations and adjustment coefficients;
and adjusting the mean JND according to the target adjustment coefficient to obtain the segmentation threshold.
8. An image processing apparatus, characterized in that the apparatus comprises: a first acquisition unit, a second acquisition unit, a determination unit, and an image enhancement unit, wherein
the first acquisition unit is configured to acquire an image to be processed;
the second acquisition unit is configured to acquire the just noticeable difference (JND) corresponding to each pixel in the image to be processed, to obtain a plurality of JNDs;
the determination unit is configured to determine a segmentation threshold;
and the image enhancement unit is configured to perform image enhancement processing on the image to be processed based on the plurality of JNDs and the segmentation threshold to obtain a target image.
9. An electronic device, comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the one or more programs comprising instructions for performing the steps in the method of any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, implements the method of any one of claims 1 to 7.
CN202010462573.1A 2020-05-27 2020-05-27 Image processing method, device and storage medium Pending CN111696058A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010462573.1A CN111696058A (en) 2020-05-27 2020-05-27 Image processing method, device and storage medium

Publications (1)

Publication Number Publication Date
CN111696058A true CN111696058A (en) 2020-09-22

Family

ID=72478525

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010462573.1A Pending CN111696058A (en) 2020-05-27 2020-05-27 Image processing method, device and storage medium

Country Status (1)

Country Link
CN (1) CN111696058A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070039347A (en) * 2005-10-07 2007-04-11 삼성전자주식회사 Method and system for enhancement color image quality
CN101911716A (en) * 2008-01-18 2010-12-08 汤姆森许可贸易公司 Method for assessing perceptual quality
CN102708872A (en) * 2012-06-11 2012-10-03 武汉大学 Method for acquiring horizontal azimuth parameter codebook in three-dimensional (3D) audio
CN102881010A (en) * 2012-08-28 2013-01-16 北京理工大学 Method for evaluating perception sharpness of fused image based on human visual characteristics
CN107547895A (en) * 2016-06-29 2018-01-05 腾讯科技(深圳)有限公司 A kind of image processing method and its device
WO2019052329A1 (en) * 2017-09-12 2019-03-21 Oppo广东移动通信有限公司 Facial recognition method and related product
CN108537758A (en) * 2018-04-23 2018-09-14 北京理工大学 A kind of method for enhancing picture contrast based on display and human-eye visual characteristic
CN110765502A (en) * 2019-10-30 2020-02-07 Oppo广东移动通信有限公司 Information processing method and related product
CN111147301A (en) * 2019-12-27 2020-05-12 咻享智能(深圳)有限公司 Wireless Internet of things grouping management method and related device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨佳义等 (Yang Jiayi et al.): "低照度视频图像自适应补偿增强的设计与实现" [Design and implementation of adaptive compensation and enhancement for low-illuminance video images], 《计算机应用》 [Journal of Computer Applications], vol. 40, no. 8, 30 April 2020 (2020-04-30), pages 1-1 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112184581A (en) * 2020-09-27 2021-01-05 腾讯科技(深圳)有限公司 Image processing method, image processing apparatus, computer device, and medium
CN112184581B (en) * 2020-09-27 2023-09-05 腾讯科技(深圳)有限公司 Image processing method, device, computer equipment and medium
WO2022111269A1 (en) * 2020-11-26 2022-06-02 百果园技术(新加坡)有限公司 Method and device for enhancing video details, mobile terminal, and storage medium
CN114842579A (en) * 2022-04-26 2022-08-02 深圳市凯迪仕智能科技有限公司 Intelligent lock, image processing method and related product
CN114842579B (en) * 2022-04-26 2024-02-20 深圳市凯迪仕智能科技股份有限公司 Intelligent lock, image processing method and related products

Similar Documents

Publication Publication Date Title
CN109636754B Extremely-low-illumination image enhancement method based on a generative adversarial network
CN110839129A (en) Image processing method and device and mobile terminal
US11127117B2 (en) Information processing method, information processing apparatus, and recording medium
CN111696058A (en) Image processing method, device and storage medium
CN108234882B (en) Image blurring method and mobile terminal
CN107948733B (en) Video image processing method and device and electronic equipment
CN105208281A (en) Night scene shooting method and device
CN109462745B (en) White balance processing method and mobile terminal
WO2022160895A1 (en) Image processing method, image processing apparatus, electronic system and readable storage medium
EP4340383A1 (en) Image processing method and related device thereof
CN112419167A (en) Image enhancement method, device and storage medium
CN113313661A (en) Image fusion method and device, electronic equipment and computer readable storage medium
CN113411498B (en) Image shooting method, mobile terminal and storage medium
US20200204723A1 (en) Image processing apparatus, imaging apparatus, image processing method, imaging method, and program
JP7136956B2 (en) Image processing method and device, terminal and storage medium
CN109104578B (en) Image processing method and mobile terminal
CN108616687B (en) Photographing method and device and mobile terminal
CN111541937B (en) Image quality adjusting method, television device and computer storage medium
CN114820405A (en) Image fusion method, device, equipment and computer readable storage medium
CN107682611B (en) Focusing method and device, computer readable storage medium and electronic equipment
CN113673474B (en) Image processing method, device, electronic equipment and computer readable storage medium
WO2016026072A1 (en) Method, apparatus and computer program product for generation of extended dynamic range color images
KR20210077579A (en) Electronic device and operating method for generating high dynamic range image
CN115205164B (en) Training method of image processing model, video processing method, device and equipment
TW201525942A (en) Image processing method and image processing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination