CN112270721A - Color sensing method based on human eye color sensitivity function - Google Patents
- Publication number: CN112270721A
- Application number: CN202011104136.9A
- Authority
- CN
- China
- Prior art keywords
- color
- human eye
- sensitivity function
- perception
- input image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/11—Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
- G06F17/13—Differential equations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
- G01J3/46—Measurement of colour; Colour measuring devices, e.g. colorimeters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
- G01J3/46—Measurement of colour; Colour measuring devices, e.g. colorimeters
- G01J2003/467—Colour computing
Abstract
The invention discloses a color perception method based on a human eye color sensitivity function, implemented according to the following steps: step 1, read the colors of an input image; step 2, process the input image with a human eye color sensitivity function; step 3, simulate contrast and assimilation in color perception by feeding the processed image into a color perception neural network model; step 4, return the image obtained in step 3 to step 3 and iterate until convergence; step 5, after convergence, compute and output the color offset, completing color perception. Through this color offset, the method simulates the color perception phenomena of the human eye during color recognition.
Description
Technical Field
The invention belongs to the technical field of image processing, and relates to a color perception method based on a human eye color sensitivity function.
Background
In recent years, with the development of computer vision, color perception has become an important research direction, and many computational models of color perception have been proposed. Rudd (2010) proposed that the direction of brightness induction (contrast versus assimilation) depends on the width and luminance of the annulus surrounding a disc. Vanrell et al. (2011) represented the color of image points as affected by assimilation and contrast effects of the surrounding environment. Kim (2015) simulated the brightness contrast and assimilation phenomena described by Vanrell with a biophysical retina model and obtained qualitative agreement with psychophysical data. Song et al. (2019) proposed a color-perception neural field framework that addresses the interaction between color and space while unifying color contrast and assimilation. Despite this progress, large differences remain between computer vision and human vision in the perception of color: differences in the spatial distribution and position of colors cause the human eye to perceive the same color differently.
Disclosure of Invention
The invention aims to provide a color perception method based on a human eye color sensitivity function, which simulates, through a color offset, the color perception phenomena of the human eye during color recognition.
The invention adopts the technical scheme that a color perception method based on a human eye color sensitivity function is implemented according to the following steps:
step 1, reading the color of an input image;
step 2, processing an input image by using a human eye color sensitivity function;
step 3, simulating contrast assimilation in color perception, and enabling an image processed by a human eye color sensitivity function to be used as input to enter a color perception neural network model;
step 4, returning to step 3 and iterating until convergence;
and 5, calculating and outputting the offset after convergence to finish color perception.
The invention is also characterized in that:
step 2 is specifically carried out as follows:
In the CIE1964 (X, Y, Z) color space, the tristimulus values of an object color are:

X = k Σ_λ s(λ) r(λ) x̄(λ) Δλ
Y = k Σ_λ s(λ) r(λ) ȳ(λ) Δλ
Z = k Σ_λ s(λ) r(λ) z̄(λ) Δλ (1)

where s(λ) is the relative spectral power distribution of the light source; r(λ) is the spectral reflectance of the object; k is an adjustment (normalizing) factor; and x̄(λ), ȳ(λ), z̄(λ) are the spectral tristimulus values of the standard observer of the International Commission on Illumination. Let Δλ = 1 nm, and suppose that at wavelength λ_i the standard-observer spectral tristimulus values change by Δx̄(λ_i), Δȳ(λ_i), Δz̄(λ_i) while the values at other wavelengths in the visible range remain unchanged; the tristimulus values of the object color then change by ΔX, ΔY, ΔZ:

ΔX = k s(λ_i) r(λ_i) Δx̄(λ_i) Δλ
ΔY = k s(λ_i) r(λ_i) Δȳ(λ_i) Δλ
ΔZ = k s(λ_i) r(λ_i) Δz̄(λ_i) Δλ (2)

The colors of the input image of step 1 are converted into the more perceptually uniform CIE1976 (L*, a*, b*) color space:

L* = 116 (Y/Y0)^(1/3) − 16
a* = 500 [(X/X0)^(1/3) − (Y/Y0)^(1/3)]
b* = 200 [(Y/Y0)^(1/3) − (Z/Z0)^(1/3)] (3)

Differentiating equation (3) yields:

ΔL* = (116/3) (Y/Y0)^(−2/3) ΔY/Y0
Δa* = (500/3) [(X/X0)^(−2/3) ΔX/X0 − (Y/Y0)^(−2/3) ΔY/Y0]
Δb* = (200/3) [(Y/Y0)^(−2/3) ΔY/Y0 − (Z/Z0)^(−2/3) ΔZ/Z0] (4)

Substituting equation (4) into the color-difference formula yields ΔE in terms of Δx̄(λ_i), Δȳ(λ_i), Δz̄(λ_i):

ΔE = [(ΔL*)² + (Δa*)² + (Δb*)²]^(1/2), with ΔL*, Δa*, Δb* given by equation (4) (5)

where X0, Y0, Z0 are the tristimulus values of the standard illuminant. When only x̄(λ_i) changes, the expression of the human eye color sensitivity function is:

S_x̄(λ_i) = ΔE/Δx̄(λ_i) (6)

The input image processed with this sensitivity function is thereby obtained.
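As a concrete illustration of step 2, the sketch below numerically evaluates how sensitive the CIELAB color difference ΔE is to a small change Δx̄(λ_i) at a single wavelength, following the CIE tristimulus and L*a*b* computations above. The flat test spectra, the 1 nm sampling, and the choice to hold the white point fixed under the perturbation are assumptions of this sketch, not requirements of the method:

```python
import numpy as np

def delta_e_sensitivity(s, r, xbar, ybar, zbar, i, d_xbar=1e-3):
    """Numerical sketch of the human-eye color sensitivity: the CIELAB
    color difference dE produced by a small change d_xbar in the
    standard-observer value x_bar at wavelength index i alone.
    All spectra are sampled at 1 nm steps (so delta-lambda = 1)."""
    k = 100.0 / np.sum(s * ybar)  # normalizing factor, Y of white = 100

    def lab(X, Y, Z, X0, Y0, Z0):
        fx, fy, fz = np.cbrt(X / X0), np.cbrt(Y / Y0), np.cbrt(Z / Z0)
        return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

    # tristimulus values of the object color and of the illuminant (white point)
    X, Y, Z = (k * np.sum(s * r * w) for w in (xbar, ybar, zbar))
    X0, Y0, Z0 = (k * np.sum(s * w) for w in (xbar, ybar, zbar))

    L1, a1, b1 = lab(X, Y, Z, X0, Y0, Z0)
    # perturb x_bar at wavelength i only: dX = k * s[i] * r[i] * d_xbar
    # (white point held fixed -- an assumption of this sketch)
    L2, a2, b2 = lab(X + k * s[i] * r[i] * d_xbar, Y, Z, X0, Y0, Z0)
    dE = np.sqrt((L2 - L1)**2 + (a2 - a1)**2 + (b2 - b1)**2)
    return dE / d_xbar  # sensitivity: color difference per unit change
```

A larger return value means the eye would notice a change in the observer curve at that wavelength more strongly.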
Step 3, simulating contrast assimilation in color perception, and enabling an image processed by a human eye color sensitivity function to be used as input to enter a color perception neural network model;
assuming that when the time interval is 0, based on a Wilson-Cowan type integral differential equation, τ is 1;
wherein a represents the neural activity of the neural mass (r, c) at time t, with a range [0, 1], representing a stimulus or physical activity:
-a(t):=-a(r,c,t) (8)
f represents that the sigmoid activation function is an s-shaped curve, and converges to 0 and 1 at +/-1:
omega describes the combined effect of contrast and assimilation in color perception, omega xioppRepresents a color space, where ω (r, c, r ', c') ═ g (r-r ') f (c-c'); depends on the respective opponent expressions of the color space f (c, c ') and the physical space g (r, r'); g (r-r') is the difference of classical gaussians, and local stimulation with respect to color is obtained when g (0) > 0; f (c, c') is the difference of the Gaussian kernel functions of two variables in the color space, and the simulation of the color perception contrast of the input image is completedAnd (4) transforming.
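The spatial and chromatic kernel factors described above can be sketched as differences of Gaussians. The widths and weights below are illustrative assumptions, not values fixed by the invention:

```python
import numpy as np

def dog(x, sigma1, sigma2, w1=1.0, w2=1.0):
    """Difference of two normalized Gaussians. With sigma1 < sigma2 and
    equal weights, dog(0) > 0, giving the local excitation described
    for g(r - r') above."""
    g = lambda s: np.exp(-np.asarray(x)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
    return w1 * g(sigma1) - w2 * g(sigma2)

def omega(dr, c, cp, s_spat=(1.0, 3.0), s_chrom=(0.1, 0.3)):
    """Sketch of the factorized kernel: a spatial difference-of-Gaussians
    in dr = r - r' times a chromatic difference-of-Gaussians in c - c'.
    The sigma values are placeholder assumptions."""
    return dog(dr, *s_spat) * dog(c - cp, *s_chrom)
```

Near the center (dr = 0) the kernel is positive (excitatory, driving assimilation), while at larger spatial offsets it turns negative (inhibitory, driving contrast).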
H(r, c, t) represents the image color input delivered by cells in the LGN to the neural mass (r, c):

H(r, c, t) := h(c − I(r, t)) (10)

where I(r, t) := I(r, t, w) is the input image processed with the human eye color sensitivity function of equation (6).

With dt the minimum time step, the Wilson-Cowan-type neurodynamic integro-differential equation is re-expressed in discrete form as:

a(r, c, t + dt) = a(r, c, t) + dt [−a(r, c, t) + ∫∫ ω(r, c; r′, c′) f(a(r′, c′, t)) dr′ dc′ + H(r, c, t)] (11)
Step 4 is specifically implemented as follows: return to step 3 and continue iterating until convergence.

When Euler's method is used to discretize the neurodynamic equation, a fixed-point iteration with dt = 1 is used to maintain the dynamic steady state; the discrete update is repeated until the activity a no longer changes, at which point convergence is complete.
Step 5 is specifically: after convergence, the difference between the image color values obtained in step 4 and the color values of the input image is the offset; outputting this offset completes color perception.
The beneficial effects of the invention are as follows: the color perception method based on the human eye color sensitivity function simulates, through the color offset, the color perception phenomena of the human eye during color recognition. Adding the human eye color sensitivity function improves color perception performance and brings the computed results closer to biological experimental results, so the method better matches the characteristics of biological color perception.
Drawings
Fig. 1 is a flow chart of a color perception method based on a human eye color sensitivity function according to the present invention.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
The invention discloses a color perception method based on a human eye color sensitivity function, which is implemented according to the following steps as shown in figure 1:
step 1, reading the color of an input image;
step 2, processing an input image by using a human eye color sensitivity function;
step 2 is specifically carried out as follows:
In the CIE1964 (X, Y, Z) color space, the tristimulus values of an object color are:

X = k Σ_λ s(λ) r(λ) x̄(λ) Δλ
Y = k Σ_λ s(λ) r(λ) ȳ(λ) Δλ
Z = k Σ_λ s(λ) r(λ) z̄(λ) Δλ (1)

where s(λ) is the relative spectral power distribution of the light source; r(λ) is the spectral reflectance of the object; k is an adjustment (normalizing) factor; and x̄(λ), ȳ(λ), z̄(λ) are the spectral tristimulus values of the standard observer of the International Commission on Illumination. Let Δλ = 1 nm, and suppose that at wavelength λ_i the standard-observer spectral tristimulus values change by Δx̄(λ_i), Δȳ(λ_i), Δz̄(λ_i) while the values at other wavelengths in the visible range remain unchanged; the tristimulus values of the object color then change by ΔX, ΔY, ΔZ:

ΔX = k s(λ_i) r(λ_i) Δx̄(λ_i) Δλ
ΔY = k s(λ_i) r(λ_i) Δȳ(λ_i) Δλ
ΔZ = k s(λ_i) r(λ_i) Δz̄(λ_i) Δλ (2)

The colors of the input image of step 1 are converted into the more perceptually uniform CIE1976 (L*, a*, b*) color space:

L* = 116 (Y/Y0)^(1/3) − 16
a* = 500 [(X/X0)^(1/3) − (Y/Y0)^(1/3)]
b* = 200 [(Y/Y0)^(1/3) − (Z/Z0)^(1/3)] (3)

Differentiating equation (3) yields:

ΔL* = (116/3) (Y/Y0)^(−2/3) ΔY/Y0
Δa* = (500/3) [(X/X0)^(−2/3) ΔX/X0 − (Y/Y0)^(−2/3) ΔY/Y0]
Δb* = (200/3) [(Y/Y0)^(−2/3) ΔY/Y0 − (Z/Z0)^(−2/3) ΔZ/Z0] (4)

Substituting equation (4) into the color-difference formula yields ΔE in terms of Δx̄(λ_i), Δȳ(λ_i), Δz̄(λ_i):

ΔE = [(ΔL*)² + (Δa*)² + (Δb*)²]^(1/2), with ΔL*, Δa*, Δb* given by equation (4) (5)

where X0, Y0, Z0 are the tristimulus values of the standard illuminant. When only x̄(λ_i) changes, the expression of the human eye color sensitivity function is:

S_x̄(λ_i) = ΔE/Δx̄(λ_i) (6)

The input image processed with this sensitivity function is thereby obtained.
Step 3, simulating contrast assimilation in color perception, and enabling an image processed by a human eye color sensitivity function to be used as input to enter a color perception neural network model;
step 3 is specifically carried out as follows:
Assuming the time delay is 0, the model is based on a Wilson-Cowan-type integro-differential equation with time constant τ = 1:

τ ∂a(r, c, t)/∂t = −a(r, c, t) + ∫∫ ω(r, c; r′, c′) f(a(r′, c′, t)) dr′ dc′ + H(r, c, t) (7)

where a represents the neural activity of the neural mass at position r and color coordinate c at time t, with range [0, 1] from quiescence to maximal activity:

−a(t) := −a(r, c, t) (8)

f is the sigmoid activation function, an S-shaped curve saturating at 0 and 1 (the logistic function):

f(x) = 1/(1 + e^(−x)) (9)

ω describes the combined effect of contrast and assimilation in color perception and is defined on the opponent color space, with ω(r, c; r′, c′) := g(r − r′) f(c, c′); it factorizes into an opponent term in color space, f(c, c′), and a term in physical space, g(r − r′). g(r − r′) is a classical difference of Gaussians, producing local excitation with respect to color when g(0) > 0; f(c, c′) is a difference of two-variable Gaussian kernel functions in color space. Together they simulate the contrast and assimilation of color perception for the input image.

H(r, c, t) represents the image color input delivered by cells in the LGN to the neural mass (r, c):

H(r, c, t) := h(c − I(r, t)) (10)

where I(r, t) := I(r, t, w) is the input image processed with the human eye color sensitivity function of equation (6).

With dt the minimum time step, the Wilson-Cowan-type neurodynamic integro-differential equation is re-expressed in discrete form as:

a(r, c, t + dt) = a(r, c, t) + dt [−a(r, c, t) + ∫∫ ω(r, c; r′, c′) f(a(r′, c′, t)) dr′ dc′ + H(r, c, t)] (11)
step 4, returning to step 3 and iterating until convergence;
Step 4 is specifically implemented as follows: return to step 3 and continue iterating until convergence.

When Euler's method is used to discretize the neurodynamic equation, a fixed-point iteration with dt = 1 is used to maintain the dynamic steady state; the discrete update is repeated until the activity a no longer changes, at which point convergence is complete.
And 5, calculating and outputting the offset after convergence to finish color perception.
Step 5 is specifically: after convergence, the difference between the image color values obtained in step 4 and the color values of the input image is the offset; outputting this offset completes color perception.
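The five steps can be strung together in one end-to-end sketch. Here `sensitivity` stands in for the human eye color sensitivity function of equation (6), W for a discretization of the kernel ω, and the logistic for the sigmoid f; all three are placeholder assumptions used only to illustrate the flow from input colors to output offset:

```python
import numpy as np

def perceive(I, W, sensitivity, dt=1.0, tol=1e-6, max_iter=500):
    """End-to-end sketch of steps 1-5: read colors I (step 1), weight
    them by a sensitivity factor (step 2, placeholder), iterate the
    discretized Wilson-Cowan update to convergence (steps 3-4), and
    return the color offset: converged values minus input (step 5)."""
    f = lambda x: 1.0 / (1.0 + np.exp(-x))
    H = sensitivity * I                      # step 2 (placeholder weighting)
    a = H.copy()                             # step 3: initial activity
    for _ in range(max_iter):                # step 4: iterate to convergence
        a_new = a + dt * (-a + W @ f(a) + H)
        if np.max(np.abs(a_new - a)) < tol:
            break
        a = a_new
    return a_new - I                         # step 5: color offset
```

The returned offset is the quantity the method outputs: how far the perceived colors have shifted from the physically measured input colors.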
Example 1
Step 1 is executed: the S-cone image color induction data published by Monnier in 2004 are used as the data set.
Step 2 is executed: read the input color image, process the input image data with the human eye color sensitivity function of equation (6), and output the result.
Step 3 is executed: contrast and assimilation in color perception are simulated by passing the image data processed with the human eye color sensitivity function through the color perception neural network model.
Step 4 is executed: judge whether the obtained result has converged; if so, output the final result; if not, continue executing step 3 until the result converges, then output.
Step 5 is executed: output the color offset produced by the computer in the color perception process.
The color perception method based on the human eye color sensitivity function simulates, through the color offset, the color perception phenomena of the human eye during color recognition. Adding the human eye color sensitivity function improves color perception performance and brings the computed results closer to biological experimental results, so the method better matches the characteristics of biological color perception.
Claims (5)
1. A color perception method based on a human eye color sensitivity function is characterized by comprising the following steps:
step 1, reading the color of an input image;
step 2, processing an input image by using a human eye color sensitivity function;
step 3, simulating contrast assimilation in color perception, and enabling an image processed by a human eye color sensitivity function to be used as input to enter a color perception neural network model;
step 4, returning to step 3 and iterating until convergence;
and 5, calculating and outputting the offset after convergence to finish color perception.
2. The method as claimed in claim 1, wherein the step 2 is implemented as follows:
In the CIE1964 (X, Y, Z) color space, the tristimulus values of an object color are:

X = k Σ_λ s(λ) r(λ) x̄(λ) Δλ
Y = k Σ_λ s(λ) r(λ) ȳ(λ) Δλ
Z = k Σ_λ s(λ) r(λ) z̄(λ) Δλ (1)

where s(λ) is the relative spectral power distribution of the light source; r(λ) is the spectral reflectance of the object; k is an adjustment (normalizing) factor; and x̄(λ), ȳ(λ), z̄(λ) are the spectral tristimulus values of the standard observer of the International Commission on Illumination. Let Δλ = 1 nm, and suppose that at wavelength λ_i the standard-observer spectral tristimulus values change by Δx̄(λ_i), Δȳ(λ_i), Δz̄(λ_i) while the values at other wavelengths in the visible range remain unchanged; the tristimulus values of the object color then change by ΔX, ΔY, ΔZ:

ΔX = k s(λ_i) r(λ_i) Δx̄(λ_i) Δλ
ΔY = k s(λ_i) r(λ_i) Δȳ(λ_i) Δλ
ΔZ = k s(λ_i) r(λ_i) Δz̄(λ_i) Δλ (2)

The colors of the input image of step 1 are converted into the more perceptually uniform CIE1976 (L*, a*, b*) color space:

L* = 116 (Y/Y0)^(1/3) − 16
a* = 500 [(X/X0)^(1/3) − (Y/Y0)^(1/3)]
b* = 200 [(Y/Y0)^(1/3) − (Z/Z0)^(1/3)] (3)

Differentiating equation (3) yields:

ΔL* = (116/3) (Y/Y0)^(−2/3) ΔY/Y0
Δa* = (500/3) [(X/X0)^(−2/3) ΔX/X0 − (Y/Y0)^(−2/3) ΔY/Y0]
Δb* = (200/3) [(Y/Y0)^(−2/3) ΔY/Y0 − (Z/Z0)^(−2/3) ΔZ/Z0] (4)

Substituting equation (4) into the color-difference formula yields ΔE in terms of Δx̄(λ_i), Δȳ(λ_i), Δz̄(λ_i):

ΔE = [(ΔL*)² + (Δa*)² + (Δb*)²]^(1/2), with ΔL*, Δa*, Δb* given by equation (4) (5)

where X0, Y0, Z0 are the tristimulus values of the standard illuminant. When only x̄(λ_i) changes, the expression of the human eye color sensitivity function is:

S_x̄(λ_i) = ΔE/Δx̄(λ_i) (6)

The input image processed with this sensitivity function is thereby obtained.
3. The method for color perception based on the human eye color sensitivity function as claimed in claim 2, wherein the step 3 is implemented as follows:
Assuming the time delay is 0, the model is based on a Wilson-Cowan-type integro-differential equation with time constant τ = 1:

τ ∂a(r, c, t)/∂t = −a(r, c, t) + ∫∫ ω(r, c; r′, c′) f(a(r′, c′, t)) dr′ dc′ + H(r, c, t) (7)

where a represents the neural activity of the neural mass at position r and color coordinate c at time t, with range [0, 1] from quiescence to maximal activity:

−a(t) := −a(r, c, t) (8)

f is the sigmoid activation function, an S-shaped curve saturating at 0 and 1 (the logistic function):

f(x) = 1/(1 + e^(−x)) (9)

ω describes the combined effect of contrast and assimilation in color perception and is defined on the opponent color space, with ω(r, c; r′, c′) := g(r − r′) f(c, c′); it factorizes into an opponent term in color space, f(c, c′), and a term in physical space, g(r − r′). g(r − r′) is a classical difference of Gaussians, producing local excitation with respect to color when g(0) > 0; f(c, c′) is a difference of two-variable Gaussian kernel functions in color space. Together they simulate the contrast and assimilation of color perception for the input image.

H(r, c, t) represents the image color input delivered by cells in the LGN to the neural mass (r, c):

H(r, c, t) := h(c − I(r, t)) (10)

where I(r, t) := I(r, t, w) is the input image processed with the human eye color sensitivity function of equation (6).

With dt the minimum time step, the Wilson-Cowan-type neurodynamic integro-differential equation is re-expressed in discrete form as:

a(r, c, t + dt) = a(r, c, t) + dt [−a(r, c, t) + ∫∫ ω(r, c; r′, c′) f(a(r′, c′, t)) dr′ dc′ + H(r, c, t)] (11)
4. The method as claimed in claim 3, wherein step 4 is specifically implemented as follows: return to step 3 and continue iterating until convergence; a fixed-point iteration of the discrete update with dt = 1 is used, and convergence is complete when the activity a no longer changes.
5. The color perception method based on the human eye color sensitivity function as claimed in claim 4, wherein step 5 is specifically: after convergence, the difference between the image color values obtained in step 4 and the color values of the input image is the offset; outputting this offset completes color perception.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011104136.9A CN112270721A (en) | 2020-10-15 | 2020-10-15 | Color sensing method based on human eye color sensitivity function |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112270721A true CN112270721A (en) | 2021-01-26 |
Family
ID=74337194
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011104136.9A Pending CN112270721A (en) | 2020-10-15 | 2020-10-15 | Color sensing method based on human eye color sensitivity function |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112270721A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5742296A (en) * | 1992-01-08 | 1998-04-21 | Canon Kabushiki Kaisha | Image processing method and apparatus therefor |
WO2000065847A1 (en) * | 1999-04-28 | 2000-11-02 | Sony Corporation | Methods and apparatus for color device characterization |
US20160180552A1 (en) * | 2014-12-23 | 2016-06-23 | Mediatek Singapore Pte. Ltd. | Method And Device Of Constructing Uniform Color Space Directly From Raw Camera RGB |
CN105825020A (en) * | 2016-03-23 | 2016-08-03 | 天津师范大学 | Computing method of three-dimensional perceptual color gamut |
CN108020519A (en) * | 2017-12-11 | 2018-05-11 | 齐鲁工业大学 | A kind of virtual multiple light courcess spectrum reconstruction method based on color constancy |
Non-Patent Citations (2)
Title |
---|
CANDAN CENGIZ ; ERDOGAN KOSE: "Modelling of color perception of different eye colors using artificial neural networks", NEURAL COMPUTING AND APPLICATIONS * |
HAN XIAOWEI: "Research on Key Technologies of Color Image Processing", China Master's Theses Full-text Database, Information Science and Technology *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right |
TA01 | Transfer of patent application right | Effective date of registration: 2024-01-29 |
Address after: 518000, 1002, Building A, Zhiyun Industrial Park, No. 13, Huaxing Road, Henglang Community, Longhua District, Shenzhen, Guangdong Province; Applicant after: Shenzhen Wanzhida Technology Co., Ltd. (China)
Address before: No. 19 Jinhua Road, Beilin District, Xi'an, Shaanxi Province, 710048; Applicant before: XI'AN POLYTECHNIC University (China)
|
TA01 | Transfer of patent application right |