CN112270721A - Color sensing method based on human eye color sensitivity function - Google Patents

Color sensing method based on human eye color sensitivity function

Info

Publication number
CN112270721A
CN112270721A
Authority
CN
China
Prior art keywords
color
human eye
sensitivity function
perception
input image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011104136.9A
Other languages
Chinese (zh)
Inventor
赵雪青 (Zhao Xueqing)
冯一凡 (Feng Yifan)
杨坤 (Yang Kun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Wanzhida Technology Co ltd
Original Assignee
Xi'an Polytechnic University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Polytechnic University
Priority to CN202011104136.9A
Publication of CN112270721A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/11 Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
    • G06F 17/13 Differential equations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J 3/46 Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/90 Determination of colour characteristics
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J 3/46 Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J 2003/467 Colour computing

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Computational Mathematics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Algebra (AREA)
  • Operations Research (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Color Image Communication Systems (AREA)

Abstract

The invention discloses a color perception method based on a human eye color sensitivity function, implemented according to the following steps: step 1, reading the colors of an input image; step 2, processing the input image with a human eye color sensitivity function; step 3, simulating contrast and assimilation in color perception, with the image processed by the human eye color sensitivity function entering a color perception neural network model as input; step 4, returning to step 3 and iterating until convergence; and step 5, after convergence, calculating and outputting the color offset to complete color perception. The method has the characteristic of simulating, through the color offset, the color perception phenomena of the human eye during color recognition.

Description

Color sensing method based on human eye color sensitivity function
Technical Field
The invention belongs to the technical field of image processing, and relates to a color perception method based on a human eye color sensitivity function.
Background
In recent years, with the development of computer vision, many computational models of color perception have been studied, and color perception has become an important research direction in computer vision. Rudd (2010) proposed that the direction of brightness induction (contrast versus assimilation) depends on the width and luminance of the ring surrounding a disc. Vanrell et al. (2011) presented a representation of the color at each point of an image as affected by assimilation and contrast effects from its surroundings. Kim (2015) simulated the brightness contrast and assimilation phenomena described by Vanrell with a biophysical retina model and obtained results in qualitative agreement with psychophysical data. Song et al. (2019) proposed a color perception neural field framework model that addresses the important problem of color-space interaction while unifying color contrast and assimilation. Despite the tremendous development of computer vision, large differences remain between computer vision and human vision, particularly in the perception of color: differences in the spatial distribution and arrangement of colors cause the human eye to perceive colors differently.
Disclosure of Invention
The invention aims to provide a color perception method based on a human eye color sensitivity function, which has the characteristic of simulating, through a color offset, the color perception phenomena of the human eye during color recognition.
The invention adopts the technical scheme that a color perception method based on a human eye color sensitivity function is implemented according to the following steps:
step 1, reading the color of an input image;
step 2, processing an input image by using a human eye color sensitivity function;
step 3, simulating contrast and assimilation in color perception, with the image processed by the human eye color sensitivity function entering a color perception neural network model as input;
step 4, returning to step 3 and iterating until convergence;
and step 5, after convergence, calculating and outputting the color offset to complete color perception.
The invention is also characterized in that:
step 2 is specifically carried out as follows:
in CIE1964(X, Y, Z) color space, the tristimulus values of object colors are:
X = k ∑λ s(λ) r(λ) x̄(λ) Δλ,   Y = k ∑λ s(λ) r(λ) ȳ(λ) Δλ,   Z = k ∑λ s(λ) r(λ) z̄(λ) Δλ    (1)
where s(λ) is the relative spectral power distribution of the light source, r(λ) is the spectral reflectance of the object, k is an adjustment factor, and x̄(λ), ȳ(λ), z̄(λ) are the spectral tristimulus values of the standard observer of the International Commission on Illumination (CIE). Let Δλ = 1 nm and suppose that, at a single wavelength λi, one of the spectral tristimulus values of the standard observer changes while no change occurs at the other wavelengths of the visible range; the tristimulus values of the object color then change by ΔX, ΔY, ΔZ:
ΔX = k s(λi) r(λi) Δx̄(λi) Δλ,   ΔY = k s(λi) r(λi) Δȳ(λi) Δλ,   ΔZ = k s(λi) r(λi) Δz̄(λi) Δλ    (2)
Transform the colors of the input image from step 1 into the more uniform CIE 1976 (L*, a*, b*) color space:
L* = 116 (Y/Y0)^(1/3) − 16,   a* = 500 [(X/X0)^(1/3) − (Y/Y0)^(1/3)],   b* = 200 [(Y/Y0)^(1/3) − (Z/Z0)^(1/3)]    (3)
differentiating equation 3 yields:
ΔL* = (116/3) (Y/Y0)^(−2/3) (ΔY/Y0),
Δa* = (500/3) [(X/X0)^(−2/3) (ΔX/X0) − (Y/Y0)^(−2/3) (ΔY/Y0)],
Δb* = (200/3) [(Y/Y0)^(−2/3) (ΔY/Y0) − (Z/Z0)^(−2/3) (ΔZ/Z0)]    (4)
where the auxiliary quantities appearing in equation (4) are defined by the accompanying expression (reproduced as a formula image in the original filing).
Substituting equation (4) into the color difference formula ΔE = [(ΔL*)² + (Δa*)² + (Δb*)²]^(1/2) gives:
ΔE = { [(116/3) (Y/Y0)^(−2/3) (ΔY/Y0)]² + [(500/3) ((X/X0)^(−2/3) (ΔX/X0) − (Y/Y0)^(−2/3) (ΔY/Y0))]² + [(200/3) ((Y/Y0)^(−2/3) (ΔY/Y0) − (Z/Z0)^(−2/3) (ΔZ/Z0))]² }^(1/2)    (5)
where X0, Y0, Z0 are the tristimulus values of the standard illuminant. When only a single spectral tristimulus value of the standard observer at wavelength λi changes, the human eye color sensitivity function is expressed as:
(Equation (6): the human eye color sensitivity function; the expression is reproduced as a formula image in the original filing.)
resulting in a processed input image.
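The derivation above can be followed numerically. The sketch below is a minimal illustration under stated assumptions, not the patented formula itself: it computes the tristimulus values X, Y, Z, converts them to CIE 1976 (L*, a*, b*) as in equation (3), perturbs one color-matching value at a single wavelength λi, and reports the resulting color difference ΔE as a finite-difference stand-in for the sensitivity function of equation (6). The spectral arrays s, r, xbar, ybar, zbar (sampled at 1 nm over the visible range) and the reference white X0, Y0, Z0 are assumed to be supplied by the caller.

```python
import numpy as np

def tristimulus(s, r, xbar, ybar, zbar, dlam=1.0):
    """Tristimulus values X, Y, Z of an object color; k normalizes Y of the
    perfect reflecting diffuser to 100."""
    k = 100.0 / np.sum(s * ybar * dlam)
    X = k * np.sum(s * r * xbar * dlam)
    Y = k * np.sum(s * r * ybar * dlam)
    Z = k * np.sum(s * r * zbar * dlam)
    return X, Y, Z

def xyz_to_lab(X, Y, Z, X0, Y0, Z0):
    """Equation (3): CIE 1976 (L*, a*, b*) relative to the reference white (X0, Y0, Z0)."""
    fx, fy, fz = (X / X0) ** (1 / 3), (Y / Y0) ** (1 / 3), (Z / Z0) ** (1 / 3)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def sensitivity_at(i, s, r, xbar, ybar, zbar, X0, Y0, Z0, delta=1e-3):
    """Finite-difference analogue of the sensitivity function: color difference ΔE
    produced by a small change of xbar at the single wavelength index i, with all
    other wavelengths left unchanged."""
    lab0 = xyz_to_lab(*tristimulus(s, r, xbar, ybar, zbar), X0, Y0, Z0)
    xbar_p = xbar.copy()
    xbar_p[i] += delta                      # perturb x̄(λi) only
    lab1 = xyz_to_lab(*tristimulus(s, r, xbar_p, ybar, zbar), X0, Y0, Z0)
    dE = np.sqrt(sum((u - v) ** 2 for u, v in zip(lab1, lab0)))
    return dE / delta
```

Scanning the index i across the visible range gives a discrete sensitivity curve; in the method described here, the image colors read in step 1 are processed with the sensitivity function of equation (6) to obtain the input handed to step 3.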
Step 3, simulating contrast and assimilation in color perception, with the image processed by the human eye color sensitivity function entering the color perception neural network model as input, is specifically carried out as follows:
Assume, in the limit of a vanishing time interval, a Wilson-Cowan type integro-differential equation with time constant τ = 1:
∂a(r, c, t)/∂t = −a(r, c, t) + ∬ ω(r, c, r′, c′) f(a(r′, c′, t)) dr′ dc′ + h(c − I(r, t))    (7)
where a represents the neural activity of the neural mass (r, c) at time t, taking values in the range [0, 1] that indicate the degree of stimulation:
−a(t) := −a(r, c, t)    (8)
f is a sigmoid (s-shaped) activation function whose values converge to 0 and 1 at its two extremes:
(Equation (9): the sigmoid activation function; the expression is reproduced as a formula image in the original filing.)
ω describes the combined effect of contrast and assimilation in color perception, with Ω_opp denoting the opponent color space, where ω(r, c, r′, c′) := g(r − r′)·f(c, c′); it depends on the respective opponent representations of the color space, f(c, c′), and of the physical (spatial) domain, g(r, r′). g(r − r′) is a classical difference of Gaussians, with g(0) > 0 so that local stimulation with respect to color is obtained; f(c, c′) is the difference of two Gaussian kernel functions of the two variables in color space. Together these terms complete the simulation of contrast and assimilation in the color perception of the input image.
H(r, c, t) represents the image color input delivered to the neural mass (r, c) by cells in the LGN: H(r, c, t) := h(c − I(r, t))    (10)
where h is the input kernel, given as a formula image in the original filing, and I(r, t) := I(r, t, w) is the input image processed with the human eye color sensitivity function of equation (6).
With dt as the minimum time step, the Wilson-Cowan type neurodynamic integro-differential equation is re-expressed as:
a(r, c, t + dt) = a(r, c, t) + dt·[ −a(r, c, t) + ∬ ω(r, c, r′, c′) f(a(r′, c′, t)) dr′ dc′ + h(c − I(r, t)) ]    (12)
Step 4 is specifically implemented as follows: returning to step 3 and continuing to iterate until convergence.
When the Euler method is used to model the neurodynamics, a fixed-point iteration with dt = 1 is adopted in order to maintain a dynamic steady state, and the iteration is run until convergence is completed.
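For concreteness, the discretized update of step 3 and the fixed-point iteration of step 4 can be sketched as follows. This is a minimal sketch under stated assumptions rather than the patented implementation: the color-domain kernel and the exact parameters appear only as formula images in the filing, so the interaction term here keeps only a spatial difference-of-Gaussians applied channel-wise, the activation is a generic logistic sigmoid, and all parameter values (sigma_center, sigma_surround, w_surround, slope) are illustrative assumptions. The processed image I is assumed to be a (height, width, 3) array scaled to [0, 1].

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def sigmoid(x, slope=4.0):
    """Illustrative s-shaped activation f: tends to 0 and 1 at its two extremes."""
    return 1.0 / (1.0 + np.exp(-slope * (x - 0.5)))

def dog(a, sigma_center=1.0, sigma_surround=4.0, w_surround=0.8):
    """Spatial difference-of-Gaussians interaction g(r - r'), applied channel-wise."""
    center = gaussian_filter(a, sigma=(sigma_center, sigma_center, 0))
    surround = gaussian_filter(a, sigma=(sigma_surround, sigma_surround, 0))
    return center - w_surround * surround

def run_neural_field(I, dt=1.0, tol=1e-4, max_iter=200):
    """Fixed-point (Euler, dt = 1) iteration of the discretized Wilson-Cowan type
    equation: a <- a + dt * ( -a + DoG(f(a)) + h ), with the input term h driven by
    the sensitivity-processed image I."""
    a = I.astype(float).copy()
    h = I.astype(float)                      # simplified stand-in for h(c - I(r, t))
    for _ in range(max_iter):
        a_new = a + dt * (-a + dog(sigmoid(a)) + h)
        a_new = np.clip(a_new, 0.0, 1.0)     # keep neural activity in [0, 1]
        if np.max(np.abs(a_new - a)) < tol:  # convergence test for the fixed point
            return a_new
        a = a_new
    return a
```

With dt = 1 the Euler step reduces to a fixed-point update that is repeated until the activity no longer changes; the converged activity is the perceived color field used in step 5.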
Step 5 is specifically as follows: after convergence is completed, the difference between the image color values obtained in step 4 and the color values of the input image is the offset, and the color perception is completed.
The beneficial effects of the invention are as follows: the color perception method based on the human eye color sensitivity function simulates, through the color offset, the color perception phenomena of the human eye during color recognition. Introducing the human eye color sensitivity function improves the color perception performance and brings the computed results closer to biological experimental results, so that the method better matches the characteristics of biological color perception.
Drawings
Fig. 1 is a flow chart of a color perception method based on a human eye color sensitivity function according to the present invention.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
The invention discloses a color perception method based on a human eye color sensitivity function, which is implemented according to the following steps as shown in figure 1:
step 1, reading the color of an input image;
step 2, processing an input image by using a human eye color sensitivity function;
step 2 is specifically carried out as follows:
in CIE1964(X, Y, Z) color space, the tristimulus values of object colors are:
X = k ∑λ s(λ) r(λ) x̄(λ) Δλ,   Y = k ∑λ s(λ) r(λ) ȳ(λ) Δλ,   Z = k ∑λ s(λ) r(λ) z̄(λ) Δλ    (1)
where s(λ) is the relative spectral power distribution of the light source, r(λ) is the spectral reflectance of the object, k is an adjustment factor, and x̄(λ), ȳ(λ), z̄(λ) are the spectral tristimulus values of the standard observer of the International Commission on Illumination (CIE). Let Δλ = 1 nm and suppose that, at a single wavelength λi, one of the spectral tristimulus values of the standard observer changes while no change occurs at the other wavelengths of the visible range; the tristimulus values of the object color then change by ΔX, ΔY, ΔZ:
ΔX = k s(λi) r(λi) Δx̄(λi) Δλ,   ΔY = k s(λi) r(λi) Δȳ(λi) Δλ,   ΔZ = k s(λi) r(λi) Δz̄(λi) Δλ    (2)
Transform the colors of the input image from step 1 into the more uniform CIE 1976 (L*, a*, b*) color space:
L* = 116 (Y/Y0)^(1/3) − 16,   a* = 500 [(X/X0)^(1/3) − (Y/Y0)^(1/3)],   b* = 200 [(Y/Y0)^(1/3) − (Z/Z0)^(1/3)]    (3)
differentiating equation 3 yields:
ΔL* = (116/3) (Y/Y0)^(−2/3) (ΔY/Y0),
Δa* = (500/3) [(X/X0)^(−2/3) (ΔX/X0) − (Y/Y0)^(−2/3) (ΔY/Y0)],
Δb* = (200/3) [(Y/Y0)^(−2/3) (ΔY/Y0) − (Z/Z0)^(−2/3) (ΔZ/Z0)]    (4)
where the auxiliary quantities appearing in equation (4) are defined by the accompanying expression (reproduced as a formula image in the original filing).
Substituting equation (4) into the color difference formula ΔE = [(ΔL*)² + (Δa*)² + (Δb*)²]^(1/2) gives:
ΔE = { [(116/3) (Y/Y0)^(−2/3) (ΔY/Y0)]² + [(500/3) ((X/X0)^(−2/3) (ΔX/X0) − (Y/Y0)^(−2/3) (ΔY/Y0))]² + [(200/3) ((Y/Y0)^(−2/3) (ΔY/Y0) − (Z/Z0)^(−2/3) (ΔZ/Z0))]² }^(1/2)    (5)
where X0, Y0, Z0 are the tristimulus values of the standard illuminant. When only a single spectral tristimulus value of the standard observer at wavelength λi changes, the human eye color sensitivity function is expressed as:
(Equation (6): the human eye color sensitivity function; the expression is reproduced as a formula image in the original filing.)
resulting in a processed input image.
Step 3, simulating contrast and assimilation in color perception, with the image processed by the human eye color sensitivity function entering the color perception neural network model as input;
step 3 is specifically carried out as follows:
Assume, in the limit of a vanishing time interval, a Wilson-Cowan type integro-differential equation with time constant τ = 1:
∂a(r, c, t)/∂t = −a(r, c, t) + ∬ ω(r, c, r′, c′) f(a(r′, c′, t)) dr′ dc′ + h(c − I(r, t))    (7)
where a represents the neural activity of the neural mass (r, c) at time t, taking values in the range [0, 1] that indicate the degree of stimulation:
−a(t) := −a(r, c, t)    (8)
f is a sigmoid (s-shaped) activation function whose values converge to 0 and 1 at its two extremes:
(Equation (9): the sigmoid activation function; the expression is reproduced as a formula image in the original filing.)
ω describes the combined effect of contrast and assimilation in color perception, with Ω_opp denoting the opponent color space, where ω(r, c, r′, c′) := g(r − r′)·f(c, c′); it depends on the respective opponent representations of the color space, f(c, c′), and of the physical (spatial) domain, g(r, r′). g(r − r′) is a classical difference of Gaussians, with g(0) > 0 so that local stimulation with respect to color is obtained; f(c, c′) is the difference of two Gaussian kernel functions of the two variables in color space. Together these terms complete the simulation of contrast and assimilation in the color perception of the input image.
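As an illustration of the interaction weight described above, the sketch below writes ω(r, c, r′, c′) = g(r − r′)·f(c, c′) as a product of two difference-of-Gaussians kernels, one over pixel positions and one over color values. The widths and surround weights (sigma_c, sigma_s, w) are illustrative assumptions; the actual kernels of the method are given as formula images in the filing.

```python
import numpy as np

def gaussian(d2, sigma):
    """Isotropic Gaussian of squared distance d2 with width sigma."""
    return np.exp(-d2 / (2.0 * sigma ** 2)) / (2.0 * np.pi * sigma ** 2)

def g_spatial(r, r_prime, sigma_c=1.0, sigma_s=4.0, w=0.8):
    """g(r - r'): classical difference of Gaussians over pixel positions; with these
    widths g(0) > 0, so local stimulation dominates."""
    d2 = float(np.sum((np.asarray(r, float) - np.asarray(r_prime, float)) ** 2))
    return gaussian(d2, sigma_c) - w * gaussian(d2, sigma_s)

def f_color(c, c_prime, sigma_c=0.1, sigma_s=0.4, w=0.8):
    """f(c, c'): difference of two Gaussian kernels of the distance between the
    opponent color values c and c'."""
    d2 = float(np.sum((np.asarray(c, float) - np.asarray(c_prime, float)) ** 2))
    return gaussian(d2, sigma_c) - w * gaussian(d2, sigma_s)

def omega(r, c, r_prime, c_prime):
    """omega(r, c, r', c') = g(r - r') * f(c, c'): joint spatial/color interaction weight."""
    return g_spatial(r, r_prime) * f_color(c, c_prime)
```

Because each factor is positive for nearby positions and colors and negative farther away, this product form is one way to realize the combined contrast and assimilation effect that the term ω describes.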
H(r, c, t) represents the image color input delivered to the neural mass (r, c) by cells in the LGN: H(r, c, t) := h(c − I(r, t))    (10)
where h is the input kernel, given as a formula image in the original filing, and I(r, t) := I(r, t, w) is the input image processed with the human eye color sensitivity function of equation (6).
With dt as the minimum time step, the Wilson-Cowan type neurodynamic integro-differential equation is re-expressed as:
a(r, c, t + dt) = a(r, c, t) + dt·[ −a(r, c, t) + ∬ ω(r, c, r′, c′) f(a(r′, c′, t)) dr′ dc′ + h(c − I(r, t)) ]    (12)
step 4, returning to step 3 and iterating until convergence;
step 4 is specifically implemented as follows: returning to step 3 and continuing to iterate until convergence.
when modeling neurodynamics in an equation using the euler equation, to maintain the dynamics steady state, a fixed-point iterative method is used (dt ═ 1). When dt is 1, convergence is completed.
Step 5, after convergence, calculating and outputting the color offset to complete color perception.
Step 5 is specifically as follows: after convergence is completed, the difference between the image color values obtained in step 4 and the color values of the input image is the offset, and the color perception is completed.
Example 1
Step 1 is executed: the S-cone color induction data proposed by Monnier in 2004 are adopted as the data set.
Step 2 is executed: the input color image is read, the image data are processed with the human eye color sensitivity function of equation (6), and the result is output.
Step 3 is executed: contrast and assimilation in color perception are simulated, and the image data processed with the human eye color sensitivity function are computed through the color perception neural network model.
Step 4 is executed: whether the obtained result has converged is judged; if so, the final result is output; if not, step 3 is repeated until the result converges and is then output.
Step 5 is executed: the color offset produced in the color perception process is output.
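As a usage illustration of the sketches above, the example can be driven end to end as follows. The helper process_with_sensitivity is a hypothetical stand-in for the step 2 processing of equation (6), run_neural_field refers to the earlier sketch, and the Monnier (2004) stimulus is assumed to be available as an RGB array scaled to [0, 1]; none of these names come from the patent itself.

```python
import numpy as np

def process_with_sensitivity(image, weights):
    """Hypothetical stand-in for step 2: re-weight the color channels of the input
    image with precomputed sensitivity weights (illustrative only)."""
    return image * np.asarray(weights, dtype=float).reshape(1, 1, -1)

def color_perception(image, weights):
    """End-to-end sketch of steps 1-5: process the input colors (step 2), run the
    neural-field model to convergence (steps 3-4), and return the color offset
    (step 5: perceived minus input color)."""
    processed = process_with_sensitivity(image, weights)
    converged = run_neural_field(processed)   # defined in the earlier sketch
    return converged - image                  # step 5: the color offset

# usage (stimulus assumed to be a (height, width, 3) array scaled to [0, 1]):
# offset = color_perception(stimulus, weights=[1.0, 1.0, 1.0])
```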
The color perception method based on the human eye color sensitivity function simulates, through the color offset, the color perception phenomena of the human eye during color recognition. Introducing the human eye color sensitivity function improves the color perception performance and brings the computed results closer to biological experimental results, so that the method better matches the characteristics of biological color perception.

Claims (5)

1. A color perception method based on a human eye color sensitivity function is characterized by comprising the following steps:
step 1, reading the color of an input image;
step 2, processing an input image by using a human eye color sensitivity function;
step 3, simulating contrast and assimilation in color perception, with the image processed by the human eye color sensitivity function entering a color perception neural network model as input;
step 4, returning to step 3 and iterating until convergence;
and step 5, after convergence, calculating and outputting the color offset to complete color perception.
2. The method as claimed in claim 1, wherein the step 2 is implemented as follows:
in CIE1964(X, Y, Z) color space, the tristimulus values of object colors are:
X = k ∑λ s(λ) r(λ) x̄(λ) Δλ,   Y = k ∑λ s(λ) r(λ) ȳ(λ) Δλ,   Z = k ∑λ s(λ) r(λ) z̄(λ) Δλ    (1)
where s(λ) is the relative spectral power distribution of the light source, r(λ) is the spectral reflectance of the object, k is an adjustment factor, and x̄(λ), ȳ(λ), z̄(λ) are the spectral tristimulus values of the standard observer of the International Commission on Illumination (CIE); let Δλ = 1 nm and suppose that, at a single wavelength λi, one of the spectral tristimulus values of the standard observer changes while no change occurs at the other wavelengths of the visible range; the tristimulus values of the object color then change by ΔX, ΔY, ΔZ:
ΔX = k s(λi) r(λi) Δx̄(λi) Δλ,   ΔY = k s(λi) r(λi) Δȳ(λi) Δλ,   ΔZ = k s(λi) r(λi) Δz̄(λi) Δλ    (2)
transform the colors of the input image from step 1 into the more uniform CIE 1976 (L*, a*, b*) color space:
L* = 116 (Y/Y0)^(1/3) − 16,   a* = 500 [(X/X0)^(1/3) − (Y/Y0)^(1/3)],   b* = 200 [(Y/Y0)^(1/3) − (Z/Z0)^(1/3)]    (3)
differentiating equation 3 yields:
ΔL* = (116/3) (Y/Y0)^(−2/3) (ΔY/Y0),
Δa* = (500/3) [(X/X0)^(−2/3) (ΔX/X0) − (Y/Y0)^(−2/3) (ΔY/Y0)],
Δb* = (200/3) [(Y/Y0)^(−2/3) (ΔY/Y0) − (Z/Z0)^(−2/3) (ΔZ/Z0)]    (4)
where the auxiliary quantities appearing in equation (4) are defined by the accompanying expression (reproduced as a formula image in the original filing).
Substituting equation (4) into the color difference formula ΔE = [(ΔL*)² + (Δa*)² + (Δb*)²]^(1/2) gives:
ΔE = { [(116/3) (Y/Y0)^(−2/3) (ΔY/Y0)]² + [(500/3) ((X/X0)^(−2/3) (ΔX/X0) − (Y/Y0)^(−2/3) (ΔY/Y0))]² + [(200/3) ((Y/Y0)^(−2/3) (ΔY/Y0) − (Z/Z0)^(−2/3) (ΔZ/Z0))]² }^(1/2)    (5)
where X0, Y0, Z0 are the tristimulus values of the standard illuminant; when only a single spectral tristimulus value of the standard observer at wavelength λi changes, the human eye color sensitivity function is expressed as:
(Equation (6): the human eye color sensitivity function; the expression is reproduced as a formula image in the original filing.)
resulting in a processed input image.
3. The method for color perception based on the human eye color sensitivity function as claimed in claim 2, wherein the step 3 is implemented as follows:
assume, in the limit of a vanishing time interval, a Wilson-Cowan type integro-differential equation with time constant τ = 1:
∂a(r, c, t)/∂t = −a(r, c, t) + ∬ ω(r, c, r′, c′) f(a(r′, c′, t)) dr′ dc′ + h(c − I(r, t))    (7)
where a represents the neural activity of the neural mass (r, c) at time t, taking values in the range [0, 1] that indicate the degree of stimulation:
−a(t) := −a(r, c, t)    (8)
f is a sigmoid (s-shaped) activation function whose values converge to 0 and 1 at its two extremes:
(Equation (9): the sigmoid activation function; the expression is reproduced as a formula image in the original filing.)
ω describes the combined effect of contrast and assimilation in color perception, with Ω_opp denoting the opponent color space, where ω(r, c, r′, c′) := g(r − r′)·f(c, c′); it depends on the respective opponent representations of the color space, f(c, c′), and of the physical (spatial) domain, g(r, r′); g(r − r′) is a classical difference of Gaussians, with g(0) > 0 so that local stimulation with respect to color is obtained; f(c, c′) is the difference of two Gaussian kernel functions of the two variables in color space; together these terms complete the simulation of contrast and assimilation in the color perception of the input image.
H(r, c, t) represents the image color input delivered to the neural mass (r, c) by cells in the LGN:
H(r, c, t) := h(c − I(r, t))    (10)
where h is the input kernel, given as a formula image in the original filing, and I(r, t) := I(r, t, w) is the input image processed with the human eye color sensitivity function of equation (6).
With dt as the minimum time step, the Wilson-Cowan type neurodynamic integro-differential equation is re-expressed as:
a(r, c, t + dt) = a(r, c, t) + dt·[ −a(r, c, t) + ∬ ω(r, c, r′, c′) f(a(r′, c′, t)) dr′ dc′ + h(c − I(r, t)) ]    (12)
4. the method as claimed in claim 3, wherein the step 4 is implemented as follows: turning to the step 3, continuing to perform iteration until convergence;
when dt is 1, convergence is completed.
5. The color perception method based on the human eye color sensitivity function as claimed in claim 4, wherein the step 5 is specifically: after convergence is completed, the difference between the image color values obtained in step 4 and the color values of the input image is the offset, and the color perception is completed.
CN202011104136.9A 2020-10-15 2020-10-15 Color sensing method based on human eye color sensitivity function Pending CN112270721A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011104136.9A CN112270721A (en) 2020-10-15 2020-10-15 Color sensing method based on human eye color sensitivity function

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011104136.9A CN112270721A (en) 2020-10-15 2020-10-15 Color sensing method based on human eye color sensitivity function

Publications (1)

Publication Number Publication Date
CN112270721A true CN112270721A (en) 2021-01-26

Family

ID=74337194

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011104136.9A Pending CN112270721A (en) 2020-10-15 2020-10-15 Color sensing method based on human eye color sensitivity function

Country Status (1)

Country Link
CN (1) CN112270721A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5742296A (en) * 1992-01-08 1998-04-21 Canon Kabushiki Kaisha Image processing method and apparatus therefor
WO2000065847A1 (en) * 1999-04-28 2000-11-02 Sony Corporation Methods and apparatus for color device characterization
US20160180552A1 (en) * 2014-12-23 2016-06-23 Mediatek Singapore Pte. Ltd. Method And Device Of Constructing Uniform Color Space Directly From Raw Camera RGB
CN105825020A (en) * 2016-03-23 2016-08-03 天津师范大学 Computing method of three-dimensional perceptual color gamut
CN108020519A (en) * 2017-12-11 2018-05-11 齐鲁工业大学 A kind of virtual multiple light sources spectrum reconstruction method based on color constancy

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5742296A (en) * 1992-01-08 1998-04-21 Canon Kabushiki Kaisha Image processing method and apparatus therefor
WO2000065847A1 (en) * 1999-04-28 2000-11-02 Sony Corporation Methods and apparatus for color device characterization
US20160180552A1 (en) * 2014-12-23 2016-06-23 Mediatek Singapore Pte. Ltd. Method And Device Of Constructing Uniform Color Space Directly From Raw Camera RGB
CN105825020A (en) * 2016-03-23 2016-08-03 天津师范大学 Computing method of three-dimensional perceptual color gamut
CN108020519A (en) * 2017-12-11 2018-05-11 齐鲁工业大学 A kind of virtual multiple light sources spectrum reconstruction method based on color constancy

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CANDAN CENGIZ ; ERDOGAN KOSE: "Modelling of color perception of different eye colors using artificial neural networks", NEURAL COMPUTING AND APPLICATIONS *
Han Xiaowei: "Research on Key Technologies of Color Image Processing", China Master's Theses Full-text Database, Information Science and Technology Series *

Similar Documents

Publication Publication Date Title
Zhang et al. A retinal mechanism inspired color constancy model
CN110097609B (en) Sample domain-based refined embroidery texture migration method
CN107346653B (en) GAMMA curve adjusting method and device based on deep learning
CN110059593B (en) Facial expression recognition method based on feedback convolutional neural network
CN108304920A (en) A method of multiple dimensioned learning network is optimized based on MobileNets
CN111402285B (en) Contour detection method based on visual mechanism dark edge enhancement
Ding et al. Product color emotional design adaptive to product shape feature variation
CN105701540B (en) A kind of self-generating neutral net construction method
CN114581356B (en) Image enhancement model generalization method based on style migration data augmentation
CN113988123A (en) Electroencephalogram fatigue prediction method based on self-weighted increment RVFL network
CN101674490B (en) Color image color constant method based on retina vision mechanism
CN100585635C (en) Visualization method for Munsell colour model computer
CN112270721A (en) Color sensing method based on human eye color sensitivity function
Chao The fractal artistic design based on interactive genetic algorithm
CN104092919B (en) Chromatic adaptation transformation optimizing method and system for color digital imaging system
TW201901622A (en) Color processing program, color processing method, color feeling inspection system, output system, color vision correction image processing system, and color vision analog image processing system
Niklas Effects of hypothetical developmental barriers and abrupt environmental changes on adaptive walks in a computer-generated domain for early vascular land plants
CN115830051A (en) Visual bionic edge detection method based on texture gradient adjustment
Kurniawan et al. Premise Parameter Optimization on Adaptive Network Based Fuzzy Inference System Using Modification Hybrid Particle Swarm Optimization and Genetic Algorithm
CN108873705A (en) A kind of HH neuron synchronisation control means based on non-linearity PID
Kobayashi Multi-objective aesthetic design optimization for minimizing the effect of variation in customer Kansei
CN110717893B (en) Edge detection method based on visual nerve pathway
CN113465742A (en) Illumination optimization-based white light source illumination color resolution capability quantification method and system
Yang et al. An Intelligent Designing Approach for Personalized Products Using Product Gene
CN111047581A (en) Image significance detection method based on Itti model and capsule neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20240129

Address after: 518000 1002, Building A, Zhiyun Industrial Park, No. 13, Huaxing Road, Henglang Community, Longhua District, Shenzhen, Guangdong Province

Applicant after: Shenzhen Wanzhida Technology Co.,Ltd.

Country or region after: China

Address before: 710048 Shaanxi province Xi'an Beilin District Jinhua Road No. 19

Applicant before: XI'AN POLYTECHNIC University

Country or region before: China
