CN105741235A - Visual rivalry improving method of complementary color three-dimensional image for improving color fidelity - Google Patents


Publication number
CN105741235A
CN105741235A (application CN201610111418.9A; granted as CN105741235B)
Authority
CN
China
Prior art keywords: color, lab, rgb, color space, pixel
Legal status: Granted
Application number
CN201610111418.9A
Other languages
Chinese (zh)
Other versions
CN105741235B (en)
Inventor
齐敏
杜乾敏
程恭
朱柏飞
魏效昱
Current Assignee
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date
Filing date
Publication date
Application filed by Northwestern Polytechnical University
Priority to CN201610111418.9A (patent CN105741235B)
Publication of CN105741235A
Application granted; publication of CN105741235B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00: Geometric image transformation in the plane of the image
    • G06T3/40: Scaling the whole image or part thereof
    • G06T5/90
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20212: Image combination
    • G06T2207/20221: Image fusion; Image merging

Abstract

The invention provides a visual rivalry improvement method for complementary-color stereoscopic (anaglyph) images that improves color fidelity. The method comprises: transforming the left view and right view of a stereo image pair from the RGB color space to the CIE L*a*b* color space; computing the sensing region of the cyan lens in the CIE L*a*b* color space; matching and mapping the right view onto the two-dimensional sensing region of the cyan lens and computing the corresponding a* and b* values of each pixel of the mapped right view; adjusting the lightness of the left and right views based on chromaticity and computing the corresponding L* value of each pixel of the adjusted views; and inversely transforming the processed left and right views from the CIE L*a*b* color space back to the RGB color space to synthesize a red-cyan complementary-color anaglyph. The method improves the color fidelity of the anaglyph while effectively reducing the visual fatigue of stereo image viewing and providing a more comfortable stereoscopic experience.

Description

Visual rivalry improvement method for anaglyph images with improved color fidelity
Technical field
The present invention relates to anaglyph image synthesis technology, and in particular to a red-cyan anaglyph synthesis technique that mitigates the visual rivalry phenomenon in anaglyph images and improves color fidelity.
Background technology
With 2016 hailed as "the first year of virtual reality", virtual reality applications have begun to enter the stage of commercial promotion. Correspondingly, the hardware level of stereoscopic display devices keeps improving, and public demand for and enthusiasm about stereo display are unprecedented. Stereo display technologies fall into two broad classes. The first is autostereoscopic (glasses-free) display, which requires no aid to be worn, including volumetric display, LCD-based autostereoscopy, glasses-free display and holographic display; this class is still at the research stage, and because of limitations in technology, high equipment cost and content production, practical commercial adoption will take a long time. The second class requires wearable auxiliary equipment and is based on the principle of binocular parallax; its representative devices are head-mounted displays and stereo glasses. A head-mounted display is suitable only for individual viewing. With the rapid development of stereo display hardware, the volume and weight of head-mounted helmets have dropped markedly, evolving into relatively light glasses, now also called stereo glasses; their price remains high, so they are referred to here as professional stereo glasses. An example is the Rift from the U.S. company Oculus, priced at $599 and additionally requiring a powerful PC costing around $1000. The other kind of stereo glasses provides only the stereo-separation function and belongs to mass consumer goods; being inexpensive, they are referred to here as consumer-level stereo glasses, and they divide into anaglyph glasses based on color separation, polarized glasses based on light separation, and shutter glasses based on time division. Polarized glasses must be paired with dedicated stereoscopic projection equipment and are typically used in settings such as cinemas and exhibitions. Anaglyph glasses separate the two eye images by the color filtering of the lenses; stereo images and videos published on the internet are commonly viewed with such glasses, which are widely used in personal consumer settings such as PCs, mobile phones and iPads and have an extremely large user base.
At present, virtual reality hardware still faces many practical problems: (1) Content ("film source") lacks universality: each stereoscopic display device can only use content in its own proprietary format. This lack of universality particularly hampers the popularization of emerging stereoscopic devices, which have little content at launch and must keep investing heavily in content-production software in order to reach a sufficient user base. (2) The volume and weight of head-mounted displays and professional stereo glasses are still excessive; compared with Microsoft's augmented-reality holographic glasses HoloLens, they appear clumsy and heavy, which affects viewing comfort and immersion.
Surveying the various stereo display technologies and analyzing the working principles of their devices and the characteristics of their stereo images, it can be found that only complementary-color (anaglyph) stereo display, based on color separation, possesses the basic conditions for solving the above problems. Anaglyph display is the earliest stereo display technology. In the early days of stereo display hardware, some problems of the color-separation technique were not well solved, the display effect was unsatisfactory, the technique was gradually forgotten, and its development therefore stalled. Today, however, after the rapid development of stereo display hardware, the inherent character of anaglyph display brings new opportunities. Analysis leads to the following conclusions: (1) It can, in principle, solve the content-universality problem. An anaglyph image or video can be adapted to all stereoscopic display devices: the images or video signals of the two complementary color channels can serve as the input sources of the two LCD panels of professional stereo glasses, achieving stereo display, or as the input sources of the left and right projection devices of a polarized display, likewise achieving stereo display. That is, anaglyph image and video production qualifies as a universal method of stereo content production. Replacing the current situation, in which each kind of device maintains its own separately produced content, with a unified production technology avoids the cost of producing the same content repeatedly; each device can use both its own proprietary content and the universal content, which will rapidly enrich the supply of stereo material. This matters even more for newly launched stereoscopic display devices, which can draw on a large body of existing content from the moment of release, greatly aiding their popularization. (2) It greatly facilitates the miniaturization of stereoscopic display devices. Anaglyph imaging realizes stereo vision with two color channels; the viewing device only needs the corresponding color-filtering function, with no LCD, power supply or other accessories. In principle, current anaglyph glasses could therefore be made in the form of contact lenses, using production processes similar to those of contact or cosmetic lenses; this would be a qualitative leap in device miniaturization. (3) It benefits the stereo viewing experience of handheld mobile devices with their huge user base. Alongside the explosive growth of head-mounted displays and professional stereo glasses, the stereo display of handheld devices such as mobile phones and iPads has begun to attract VR equipment vendors. Google plans to invest in new stereo display glasses for phone users, but has not departed from the thinking of its simple Cardboard viewer and must develop a dedicated mobile operating system and chip. By the above analysis, for handheld mobile devices anaglyph display potentially offers the most convenient viewing mode; for users who wear glasses, an electrostatic complementary-color film attached to their everyday lenses easily solves the difficulty such users have in viewing stereo images and videos on other stereoscopic display devices, and in the future these users could also view even more conveniently with contact lenses.
In view of the above advantages that the inherent characteristics of anaglyph display exhibit under today's technology, the present invention studies the anaglyph synthesis technique of complementary-color stereo display. It addresses the most widespread red-cyan anaglyph, whose viewing equipment is a pair of red-cyan anaglyph glasses with a red filter and a cyan filter as its two lenses.
The concepts involved in anaglyph synthesis include:
(1) Binocular parallax stereo vision and anaglyph images
When the two human eyes view an object at some distance, the visual images received by the left and right eyes differ because of the interocular distance; they form a stereo image pair, where the scene image seen by the left eye is called the left view and that seen by the right eye is called the right view. The brain fuses the fine differences between the two images and perceives physiological depth cues, producing the sense of depth. Anaglyph synthesis takes a color stereo image pair as source material and computes, from its left and right views, a single anaglyph image superimposing two kinds of image information: one is the image the left eye should perceive through the red filter, computed from the left view and called the "perceived left view"; the other is the image the right eye should perceive through the cyan filter, computed from the right view and called the "perceived right view". When the anaglyph is viewed through red-cyan glasses, the brain fuses the perceived views seen by the two eyes, interprets the depth information, and produces stereopsis.
(2) Visual rivalry
Visual rivalry occurs when the contours, brightness or colors of the images presented to the two eyes differ greatly; the brain cannot fully fuse the left and right views into one stereoscopic image, causing visual perception to alternate dynamically: at one moment the image seen by the left eye is clear and the visual stimulus of the right eye is suppressed, and at the next moment the opposite holds. The stimuli of the two eyes both compete for visual dominance but cannot be fused. Visual rivalry is the main cause affecting the stability of stereoscopic perception of anaglyph images.
(3) The three elements of color
The three elements of color are hue, saturation and lightness, used to describe the human perception of color. Hue reflects the category of a color and depends on the dominant wavelength of the light emitted or reflected by an object; it is generally identified by color name. Saturation refers to the purity of a color: the higher the chromatic content, the higher the saturation and the purer the color. Lightness refers to the relative brightness or darkness of a color.
Hue and saturation together are referred to as chromaticity.
(4) Color spaces
A color space is also called a color model. The color spaces involved in anaglyph synthesis are: (a) the RGB color space, where R, G, B denote red, green and blue respectively; (b) the CIE L*a*b* color space, where L* is psychological lightness and a*, b* are two chromatic channels perceptually consistent with hue and saturation; (c) the CIE XYZ color space, developed from the RGB space as a basic method of color measurement, with X, Y, Z as its three primaries; (d) the HSV color space, where H is hue, S is saturation and V is value (brightness).
Common anaglyph synthesis algorithms fall into two broad classes: empirical algorithms and computational algorithms. Empirical algorithms ignore the spectral parameters of the display device and the absorption curve functions of the glasses over different wavebands; their main representative is the channel-addition method. Its idea is to use two mutually complementary color channels which, together with the matching anaglyph glasses, separate the colors and thereby produce stereo perception. However, channel addition often causes severe visual rivalry, making viewers uncomfortable and easily inducing visual fatigue. Improved algorithms on this basis mainly fall into two classes: one uses grayscale transformation, first converting the left view, or both the left and right views, of the stereo pair to grayscale and then synthesizing by channel addition; the other uses gamma correction, mapping the cyan component into the red component to improve the stereoscopic effect of the synthesized anaglyph. But these improvements adjust the mapping parameters of channel addition purely by experience, without rigorous logical analysis, so the stereo effect varies greatly across different scenes after synthesis. Computational algorithms take into account the spectral parameters of the display device and the absorption curve functions of the glasses over different wavebands; their representative algorithms are the XYZ method, the Lab method and the HSV method. The XYZ method converts the input stereo pair into the CIE XYZ color space for synthesis, minimizing by least squares the color error between the original image and the image perceived through the glasses. Because of the non-uniformity of the CIE XYZ color space, the synthesized anaglyph shows a certain color distortion, and the visual rivalry phenomenon remains obvious. The Lab method transplants the XYZ method into the CIE L*a*b* color space, aiming to exploit the perceptual uniformity of that space for a better stereo effect; it effectively mitigates the color distortion problem, but its synthesized anaglyphs still show obvious visual rivalry. The HSV method directly exploits the fact that the HSV color space better matches human description of color: it matches saturation in HSV space and then transforms back to the RGB space to synthesize the anaglyph. It reduces the impact of visual rivalry to some extent, but its color distortion is relatively severe, with particularly large distortion in blue and red regions.
Summary of the invention
To overcome the deficiencies of the prior art, the present invention provides a visual rivalry improvement method for anaglyph images that improves color fidelity. It uses hue invariance as the matching condition of the color adjustment in order to balance visual rivalry against color distortion, so that the synthesized red-cyan anaglyph achieves more accurate color reconstruction while reducing visual rivalry; at the same time, through dynamic lightness regulation based on hue and saturation, it further reduces visual rivalry and improves the stability of stereoscopic perception. The method can thus improve the color fidelity of the anaglyph while effectively reducing visual fatigue in stereo image viewing and obtaining a more comfortable stereoscopic experience.
The technical solution adopted by the present invention to solve its technical problem comprises the following steps:
First, the left and right views of the stereo image pair are transformed from the RGB color space to the CIE L*a*b* color space. Then the sensing region of the cyan lens in the CIE L*a*b* color space is computed. With hue invariance as the constraint condition, the right view is matched and mapped onto the two-dimensional sensing region of the cyan lens, and the a*, b* values corresponding to each pixel of the mapped right view are computed. Based on the chromaticity (hue and saturation) values, the lightness of the left and right views is adjusted, and the L* value corresponding to each pixel of the adjusted left and right views is computed. Finally, the processed left and right views are transformed back from the CIE L*a*b* color space to the RGB color space to synthesize the red-cyan anaglyph. The main steps are:
Step 1: In the RGB color space, normalize the R, G, B channel values of each left-view pixel IL,RGB and right-view pixel IR,RGB of the stereo image pair so that they are distributed in the range [0, 1]. Using the CIE XYZ color space as an intermediary, convert the left-view and right-view pixels from the RGB color space to the CIE L*a*b* color space:
[IL,XYZ_X IL,XYZ_Y IL,XYZ_Z]^T = Cs · [IL,RGB_R IL,RGB_G IL,RGB_B]^T
[IR,XYZ_X IR,XYZ_Y IR,XYZ_Z]^T = Cs · [IR,RGB_R IR,RGB_G IR,RGB_B]^T
where IL,RGB_R, IL,RGB_G, IL,RGB_B are the R, G, B channel values of the left-view pixel IL,RGB and IL,XYZ_X, IL,XYZ_Y, IL,XYZ_Z are its X, Y, Z component values in the CIE XYZ color space; likewise IR,RGB_R, IR,RGB_G, IR,RGB_B are the R, G, B channel values of the right-view pixel IR,RGB and IR,XYZ_X, IR,XYZ_Y, IR,XYZ_Z are its X, Y, Z component values in the CIE XYZ color space. Cs is the transformation matrix from the RGB color space to the CIE XYZ color space.
Using the transformation relation between the CIE XYZ and CIE L*a*b* color spaces, further convert the left-view and right-view pixels already transformed to CIE XYZ into the CIE L*a*b* color space:
IL,Lab_L = 116 · f(IL,XYZ_Y / Ywhite) - 16
IL,Lab_a = 500 · (f(IL,XYZ_X / Xwhite) - f(IL,XYZ_Y / Ywhite))
IL,Lab_b = 200 · (f(IL,XYZ_Y / Ywhite) - f(IL,XYZ_Z / Zwhite))
and analogously for the right view, where IL,Lab_L, IL,Lab_a, IL,Lab_b are the L*, a*, b* component values of the left-view pixel IL,RGB in the CIE L*a*b* color space and IR,Lab_L, IR,Lab_a, IR,Lab_b are those of the right-view pixel IR,RGB. The expression of f(·) is
f(t) = t^(1/3), for t > (6/29)^3; f(t) = t / (3·(6/29)^2) + 4/29, otherwise.
[Xwhite Ywhite Zwhite] is the reference white point of the CIE XYZ color space, [Xwhite Ywhite Zwhite]^T = Cs × [1 1 1]^T.
Performing the above computation on every pixel of the left and right views completes the conversion of the stereo image pair from the RGB color space to the CIE L*a*b* color space.
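As a minimal sketch of the Step 1 conversion: the numeric value of the matrix Cs is not reproduced in this text, so the widely used sRGB RGB-to-XYZ matrix is assumed here as a stand-in; the reference white is derived as Cs × [1 1 1]^T exactly as prescribed above, and the function names are illustrative only.

```python
import numpy as np

# Assumed stand-in for Cs: the standard sRGB RGB -> XYZ matrix.
CS = np.array([[0.4124, 0.3576, 0.1805],
               [0.2126, 0.7152, 0.0722],
               [0.0193, 0.1192, 0.9505]])

def f(t):
    """CIE L*a*b* nonlinearity: t^(1/3) above (6/29)^3, linear below."""
    delta = 6.0 / 29.0
    t = np.asarray(t, dtype=float)
    return np.where(t > delta ** 3, np.cbrt(t), t / (3 * delta ** 2) + 4.0 / 29.0)

def rgb_to_lab(rgb, cs=CS):
    """Convert one normalized [0, 1] RGB pixel to CIE L*a*b* via CIE XYZ."""
    xyz = cs @ np.asarray(rgb, dtype=float)
    white = cs @ np.ones(3)          # reference white: Cs x [1 1 1]^T
    x, y, z = xyz / white
    L = 116.0 * f(y) - 16.0
    a = 500.0 * (f(x) - f(y))
    b = 200.0 * (f(y) - f(z))
    return float(L), float(a), float(b)
```

By construction, the white point itself maps to L* = 100 with a* = b* = 0, and black maps to L* = 0, which is a quick sanity check of any implementation of this step.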
Step 2: Define a full-color right view containing all colors, stored in the RGB color space as a three-dimensional matrix C256×256×256, each dimension of the matrix taking integer values from 0 to 255. Compute the distribution region in the CIE L*a*b* color space of this full-color right view after passing through the cyan lens; this is the sensing region of the cyan lens in the CIE L*a*b* color space. The computation is as follows:
Replace C256×256×256 with 100 HS maps whose V ranges over the integers from 0 to 99, V being the value (brightness) component of the HSV color space. An HS map is the two-dimensional plane obtained by fixing V at a constant within its range, with hue H as the horizontal axis (positive direction to the right, integer range 0 to 359) and saturation S as the vertical axis (positive direction downward, integer range 0 to 99). Let HSi denote the map corresponding to V = i, where i is an integer with 0 ≤ i ≤ 99; let IHS_Ri, IHS_Gi, IHS_Bi denote the R, G, B channel values in the RGB color space of the colors of HSi, and IRP,XYZ_Xi, IRP,XYZ_Yi, IRP,XYZ_Zi their corresponding X, Y, Z component values in the CIE XYZ color space; then
[IRP,XYZ_Xi IRP,XYZ_Yi IRP,XYZ_Zi]^T = AR · [IHS_Ri IHS_Gi IHS_Bi]^T
and the corresponding L*, a*, b* component values IRP,Lab_Li, IRP,Lab_ai, IRP,Lab_bi of the colors of HSi in the CIE L*a*b* color space are obtained by the conversion of Step 1 with reference white [XR_white YR_white ZR_white]. AR is the transformation matrix from the RGB color space to CIE XYZ computed from the absorption curve function of the cyan lens over the different wavebands, and [XR_white YR_white ZR_white] is the reference white point of the CIE XYZ color space corresponding to the transformation matrix AR:
[XR_white YR_white ZR_white]^T = AR × [1 1 1]^T
From all the computed IRP,Lab_Li, IRP,Lab_ai, IRP,Lab_bi, draw the three-dimensional graph in the CIE L*a*b* color space; this gives the three-dimensional sensing region of the cyan lens. The region is distributed as a series of point clusters which, observed spatially, are approximately perpendicular to the a*-b* plane. Projecting the three-dimensional sensing region onto the a*-b* plane gives the two-dimensional sensing region, namely the area enclosed by curve one and curve two. The equation of curve one is b1 = -5.1146×10^(-4)·(a1)^2 - 0.8724·a1 + 22.5848, and the equation of curve two is b2 = 0.0029·(a2)^2 - 0.89·a2 - 5.5028, where b1, a1 are the b* and a* component values of curve one and b2, a2 are the b* and a* component values of curve two.
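The two boundary curves can be coded directly from the equations above. The `inside_region` test below additionally assumes, from the description of the region enclosed by the two curves, that at a given a* value the region lies between them; that orientation is the author's reading, not an explicit statement of the text.

```python
def curve1(a):
    """Boundary curve one of the two-dimensional sensing region."""
    return -5.1146e-4 * a ** 2 - 0.8724 * a + 22.5848

def curve2(a):
    """Boundary curve two of the two-dimensional sensing region."""
    return 0.0029 * a ** 2 - 0.89 * a - 5.5028

def inside_region(a, b):
    """Assumed orientation: the region lies between the two curves
    at the given a* value."""
    lo, hi = sorted((curve2(a), curve1(a)))
    return lo <= b <= hi
```

For example, at a* = 0 the boundaries evaluate to b* = 22.5848 (curve one) and b* = -5.5028 (curve two), so the achromatic point (0, 0) falls inside the region.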
Step 3: Perform matching and mapping in the a*-b* plane, mapping the colors of the right view onto the two-dimensional sensing region of the cyan lens in the CIE L*a*b* color space.
Denote by IRM,Lab_au, IRM,Lab_bu the a*, b* component values of a pixel outside the upper-right boundary (curve one) after matching and mapping, and its hue by Hu_RM; denote by IR,Lab_au, IR,Lab_bu the a*, b* component values before matching and mapping and by Hu_R the corresponding hue. Hue invariance requires Hu_RM = Hu_R, from which it is derived that
IRM,Lab_bu / IRM,Lab_au = IR,Lab_bu / IR,Lab_au
Similarly, denote by IRM,Lab_ad, IRM,Lab_bd the a*, b* component values of a pixel outside the lower-left boundary (curve two) after matching and mapping and by Hd_RM its hue, and by IR,Lab_ad, IR,Lab_bd, Hd_R the corresponding values before mapping; hue invariance likewise gives
IRM,Lab_bd / IRM,Lab_ad = IR,Lab_bd / IR,Lab_ad
Mapping a pixel outside the upper-right boundary onto boundary curve one of the two-dimensional sensing region of the cyan lens gives
IRM,Lab_bu = -5.1146×10^(-4)·(IRM,Lab_au)^2 - 0.8724·IRM,Lab_au + 22.5848
and solving this together with the hue-invariance relation yields IRM,Lab_au and IRM,Lab_bu. This operation is performed for all pixels outside the upper-right boundary.
Mapping a pixel outside the lower-left boundary onto boundary curve two of the two-dimensional sensing region of the cyan lens gives
IRM,Lab_bd = 0.0029·(IRM,Lab_ad)^2 - 0.89·IRM,Lab_ad - 5.5028
and solving this together with the hue-invariance relation yields IRM,Lab_ad and IRM,Lab_bd. This operation is performed for all pixels outside the lower-left boundary.
Color points inside or on the boundary of the two-dimensional sensing region remain unchanged. After the mapping, the a*, b* component values of the right view in the CIE L*a*b* color space are IRM,Lab_au, IRM,Lab_bu, IRM,Lab_ad, IRM,Lab_bd; the a* and b* component values are further uniformly denoted IRM,Lab_a and IRM,Lab_b respectively.
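Under hue invariance, mapping an exterior point onto a boundary curve amounts to scaling (a*, b*) toward the origin at a fixed b*/a* ratio until it meets the boundary quadratic. The sketch below follows that reading; the text's own closed-form expressions are not reproduced here, and degenerate cases (a* = b* = 0, a negative discriminant) are deliberately not handled.

```python
import math

# Coefficients (c2, c1, c0) of b = c2*a^2 + c1*a + c0 for the two
# boundary curves given in the text.
CURVE1 = (-5.1146e-4, -0.8724, 22.5848)
CURVE2 = (0.0029, -0.89, -5.5028)

def map_to_curve(a, b, coeffs):
    """Scale (a*, b*) at a constant b*/a* ratio (constant hue) until it
    meets the boundary quadratic: solve c2*(t*a)^2 + c1*(t*a) + c0 = t*b
    for the scale factor t and return (t*a, t*b)."""
    c2, c1, c0 = coeffs
    A = c2 * a * a
    B = c1 * a - b
    C = c0
    if abs(A) < 1e-12:            # a* = 0: the equation is linear in t
        t = -C / B
    else:
        disc = math.sqrt(B * B - 4 * A * C)
        roots = [(-B + disc) / (2 * A), (-B - disc) / (2 * A)]
        t = min(r for r in roots if r > 0)  # smallest positive scale
    return t * a, t * b
```

By construction, the mapped point lies exactly on the chosen curve while its b*/a* ratio, and hence its hue angle, equals that of the input point.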
Step 4: Adjust the lightness of the left and right views separately in the CIE L*a*b* color space.
Denote the red point of the RGB color space by Red = (1, 0, 0). Converting it to the CIE L*a*b* color space, the hue corresponding to this red point is computed to be HRed = 41.7515. Taking the threshold T = 15, lightness adjustment is applied to colors whose hue lies in the range (HRed - T, HRed + T).
For the right view, denote the lightness of a pixel before adjustment by IR,Lab_L and the lightness after adjustment by IRM,Lab_L. The adjusted lightness is obtained from IR,Lab_L through the weight coefficient Wr, which is computed from the hue and saturation values of the pixel before adjustment together with the constants Wmax = 0.4, Smax = 50 and Smin = 40.
This operation is performed on all pixels of the right view.
For the left view, denote the lightness of a pixel before adjustment by IL,Lab_L and the lightness after adjustment by ILM,Lab_L. The adjusted lightness is obtained through the weight coefficient Wl, likewise computed from the hue and saturation values of the pixel before adjustment.
This operation is performed on all pixels of the left view.
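The printed form of the weight coefficients is not reproduced in this text, so the sketch of the Step 4 brightness regulation below assumes a simple linear ramp in saturation between Smin and Smax, capped at Wmax and active only for hues within T of HRed, with the adjusted lightness L' = (1 - Wr)·L. Both the ramp and the multiplicative form are assumptions; only the constants come from the text.

```python
H_RED, T = 41.7515, 15.0          # red hue and threshold from the text
W_MAX, S_MAX, S_MIN = 0.4, 50.0, 40.0

def red_weight(hue, sat):
    """Assumed form of the weight coefficient Wr: a linear ramp in
    saturation between S_MIN and S_MAX, capped at W_MAX, active only
    for hues within T of H_RED."""
    if not (H_RED - T < hue < H_RED + T):
        return 0.0
    if sat <= S_MIN:
        return 0.0
    if sat >= S_MAX:
        return W_MAX
    return W_MAX * (sat - S_MIN) / (S_MAX - S_MIN)

def adjust_lightness(L, hue, sat):
    """Assumed multiplicative adjustment: L' = (1 - Wr) * L."""
    return (1.0 - red_weight(hue, sat)) * L
```

Under these assumptions a fully saturated near-red pixel has its lightness reduced by at most 40 percent, while pixels outside the red hue band are left untouched, which matches the stated intent of dimming highly saturated red regions to reduce rivalry.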
Step 5: After the above matching, mapping and lightness adjustment, the three components of each right-view pixel in the CIE L*a*b* color space are IRM,Lab_L, IRM,Lab_a, IRM,Lab_b. First convert them to the CIE XYZ color space:
IRM,XYZ_X = XR_white · f^(-1)((IRM,Lab_L + 16)/116 + IRM,Lab_a/500)
IRM,XYZ_Y = YR_white · f^(-1)((IRM,Lab_L + 16)/116)
IRM,XYZ_Z = ZR_white · f^(-1)((IRM,Lab_L + 16)/116 - IRM,Lab_b/200)
where IRM,XYZ_X, IRM,XYZ_Y, IRM,XYZ_Z are the X, Y, Z component values in the CIE XYZ color space of the right-view pixel after matching and mapping, and f^(-1)(·) is the inverse function of f(·). Then transform from the CIE XYZ color space back to the RGB color space through AR^(-1):
[IRM,RGB_R IRM,RGB_G IRM,RGB_B]^T = AR^(-1) · [IRM,XYZ_X IRM,XYZ_Y IRM,XYZ_Z]^T
This operation is performed on all pixels of the right view.
For the left view after lightness adjustment, the L* value of each pixel is ILM,Lab_L; denote its a*, b* component values by ILM,Lab_a, ILM,Lab_b. Using the CIE XYZ color space as an intermediary, transform it back to the RGB color space:
ILM,XYZ_X = XL_white · f^(-1)((ILM,Lab_L + 16)/116 + ILM,Lab_a/500)
ILM,XYZ_Y = YL_white · f^(-1)((ILM,Lab_L + 16)/116)
ILM,XYZ_Z = ZL_white · f^(-1)((ILM,Lab_L + 16)/116 - ILM,Lab_b/200)
where ILM,XYZ_X, ILM,XYZ_Y, ILM,XYZ_Z are the X, Y, Z component values in the CIE XYZ color space of the left view after lightness adjustment; AL is the transformation matrix from the RGB color space to CIE XYZ computed from the absorption curve function of the red lens over the different wavebands, and [XL_white YL_white ZL_white] is the reference white point of the CIE XYZ color space corresponding to the transformation matrix AL:
[XL_white YL_white ZL_white]^T = AL × [1 1 1]^T
Through AL^(-1) the pixel is converted from the CIE XYZ color space to the RGB color space:
[ILM,RGB_R ILM,RGB_G ILM,RGB_B]^T = AL^(-1) · [ILM,XYZ_X ILM,XYZ_Y ILM,XYZ_Z]^T
Let the finally synthesized red-cyan anaglyph pixel be expressed in the RGB color space as IA,RGB, with corresponding R, G, B channel values IA,RGB_R, IA,RGB_G and IA,RGB_B. Take the G and B channel values IRM,RGB_G and IRM,RGB_B of the processed right-view pixel as the G and B channel values of the anaglyph pixel, and the R channel value ILM,RGB_R of the processed left-view pixel as the R channel value of the red-cyan anaglyph pixel: [IA,RGB_R IA,RGB_G IA,RGB_B] = [ILM,RGB_R IRM,RGB_G IRM,RGB_B]. Multiplying each component by 255 gives the pixel color value IA,RGB of the finally synthesized red-cyan anaglyph.
Performing the above operations on every pixel finally yields the red-cyan anaglyph.
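The final channel composition of Step 5 is straightforward to sketch: the red channel comes from the processed left view, the green and blue channels from the processed right view. The helper name and array layout below are illustrative; inputs are assumed to be float arrays in [0, 1].

```python
import numpy as np

def compose_anaglyph(left_rgb, right_rgb):
    """Compose the red-cyan anaglyph: R from the processed left view,
    G and B from the processed right view; scale to 8-bit output."""
    left = np.asarray(left_rgb, dtype=float)
    right = np.asarray(right_rgb, dtype=float)
    out = np.empty_like(left)
    out[..., 0] = left[..., 0]     # R channel <- left view
    out[..., 1] = right[..., 1]    # G channel <- right view
    out[..., 2] = right[..., 2]    # B channel <- right view
    return np.clip(np.round(out * 255), 0, 255).astype(np.uint8)
```

The same function works unchanged on a single pixel of shape (1, 3) or a full image of shape (H, W, 3), since the channel indexing is applied along the last axis.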
The beneficial effects of the invention are as follows. It jointly addresses the visual rivalry phenomenon, which easily causes visual fatigue in anaglyph imaging, and the color distortion problem, processing the left and right views of the stereo image pair separately to synthesize a red-cyan anaglyph. Exploiting the perceptual uniformity of the CIE L*a*b* color space, and based on hue invariance, the color distribution of the right view is matched and mapped onto the two-dimensional sensing region of the cyan lens in the a*-b* plane of the CIE L*a*b* color space, solving the color distortion caused by the cyan lens blocking part of the colors. According to the changes in hue and saturation values, the lightness of the left and right views is dynamically adjusted, lowering the lightness of overly saturated pixels and thereby reducing visual rivalry. Unlike the XYZ and HSV algorithms, which improve the visual rivalry phenomenon at the cost of large color distortion, the present invention considers both problems of anaglyph imaging together and reaches an effective balance between improving color fidelity and reducing visual rivalry while improving the stability of stereoscopic perception, so that the synthesized red-cyan anaglyph can provide a more comfortable stereoscopic viewing experience.
Brief description of the drawings
Fig. 1 is the flow chart of the method of the present invention;
Fig. 2 is a schematic diagram of the matching boundary of the two-dimensional sensing region of the cyan lens in the CIE L*a*b* color space;
In the figure: 1, curve one; 2, curve two.
Detailed description of the invention
The present invention is further described below with reference to the drawings and embodiments; the present invention includes but is not limited to the following embodiment.
Step 1: Convert the left and right views of the stereo image pair from the RGB color space to the CIE L*a*b* color space.
First, in the RGB color space, normalize the R, G, B channel values of each left-view pixel IL,RGB and right-view pixel IR,RGB of the stereo image pair so that they are distributed in the range [0, 1]. Then convert the left-view and right-view pixels from the RGB color space to the CIE L*a*b* color space; this conversion uses the CIE XYZ color space as an intermediary:
[IL,XYZ_X IL,XYZ_Y IL,XYZ_Z]^T = Cs · [IL,RGB_R IL,RGB_G IL,RGB_B]^T
[IR,XYZ_X IR,XYZ_Y IR,XYZ_Z]^T = Cs · [IR,RGB_R IR,RGB_G IR,RGB_B]^T
where IL,RGB_R, IL,RGB_G, IL,RGB_B are the R, G, B channel values of the left-view pixel IL,RGB and IL,XYZ_X, IL,XYZ_Y, IL,XYZ_Z are its X, Y, Z component values in the CIE XYZ color space. Correspondingly, IR,RGB_R, IR,RGB_G, IR,RGB_B are the R, G, B channel values of the right-view pixel IR,RGB and IR,XYZ_X, IR,XYZ_Y, IR,XYZ_Z are its X, Y, Z component values in the CIE XYZ color space. Cs is the transformation matrix from the RGB color space to the CIE XYZ color space; its value takes the accepted value from the published literature.
Then, using the transformation relation between the CIE XYZ and CIE L*a*b* color spaces, the left-view and right-view pixels transformed to CIE XYZ are further converted to the CIE L*a*b* color space:
IL,Lab_L = 116 · f(IL,XYZ_Y / Ywhite) - 16
IL,Lab_a = 500 · (f(IL,XYZ_X / Xwhite) - f(IL,XYZ_Y / Ywhite))
IL,Lab_b = 200 · (f(IL,XYZ_Y / Ywhite) - f(IL,XYZ_Z / Zwhite))
and analogously for the right view, where IL,Lab_L, IL,Lab_a, IL,Lab_b are the L*, a*, b* component values of the left-view pixel IL,RGB in the CIE L*a*b* color space, and IR,Lab_L, IR,Lab_a, IR,Lab_b are those of the right-view pixel IR,RGB. The expression of f(·) is:
f(t) = t^(1/3), for t > (6/29)^3; f(t) = t / (3·(6/29)^2) + 4/29, otherwise.
[X_white Y_white Z_white] is the reference white point of the CIE XYZ color space, computed by:

[X_white Y_white Z_white]^T = Cs × [1 1 1]^T   (7)

The reference white point serves as the unified standard of color reference, ensuring the consistency of color measurement and evaluation.
The above computation is performed on every pixel of the left view and the right view, completing the conversion of the stereo image pair from the RGB color space to the CIE L*a*b* color space.
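The conversion in Step 1 can be sketched as follows. This is a minimal illustration, not the patent's MATLAB implementation; it assumes the Cs matrix quoted above and the standard CIE f(t) with threshold 0.008856, and function names such as rgb_to_lab are illustrative:

```python
import numpy as np

# RGB -> CIE XYZ transformation matrix Cs (accepted literature value quoted in the patent)
CS = np.array([[0.4243, 0.3105, 0.1657],
               [0.2492, 0.6419, 0.1089],
               [0.0265, 0.1225, 0.8614]])

# Reference white point: [X Y Z]_white = Cs x [1 1 1]^T, formula (7)
WHITE = CS @ np.ones(3)

def f(t):
    """Standard CIE L*a*b* compression function."""
    t = np.asarray(t, dtype=float)
    return np.where(t > 0.008856, np.cbrt(t), 7.787 * t + 16.0 / 116.0)

def rgb_to_lab(rgb):
    """Convert one normalized [0, 1] RGB pixel to CIE L*a*b* via CIE XYZ."""
    xyz = CS @ np.asarray(rgb, dtype=float)
    fx, fy, fz = f(xyz / WHITE)
    L = 116.0 * fy - 16.0
    a = 500.0 * (fx - fy)
    b = 200.0 * (fy - fz)
    return L, a, b
```

A quick sanity check: the reference white itself, rgb_to_lab((1, 1, 1)), yields L* = 100 and a* = b* = 0.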
Step 2: Calculate the perception region of the cyan lens.
Color is the visual perception produced in the human eye by visible light radiation of different wavelengths; a color is usually a compound light combined from monochromatic lights, and the perceived color is determined by the relative power distribution of the light radiation across wavelengths. The two lenses of complementary-color anaglyph glasses based on the dark purple scheme have different transmittances for different wavebands. The filter used for the right view is cyan, synthesized from the two colors G and B; when the right eye views the image through this lens, some colors are blocked by the lens and cannot be observed by the right eye, causing color distortion in the anaglyph finally seen. Here, the region in the CIE L*a*b* color space occupied by all colors that the right eye can observe and perceive through the cyan lens is called the three-dimensional perception region of the cyan lens; the projection of the three-dimensional perception region onto the a*-b* plane is called the two-dimensional perception region; both are referred to as perception regions. In anaglyph synthesis, all colors of the right view must be mapped into the perception region of the cyan lens (the interior of the region together with its boundary). In particular, a color lying outside the perception region must, while its hue is kept unchanged, be mapped at least onto the boundary of the perception region so that it can pass through the filter and be perceived by the eye.
Define a panchromatic right view containing all colors; its colors are stored in a three-dimensional matrix C_256×256×256 in the RGB color space, each dimension of the matrix indexed by the integers 0 to 255. The distribution region in the CIE L*a*b* color space of this panchromatic right view after passing through the cyan lens is the perception region of the cyan lens in the CIE L*a*b* color space. It is computed as follows.
To reduce the amount of computation, the conversion rule between the RGB color space and the HSV color space is used to replace C_256×256×256 with 100 HS figures in which V varies over the integers from 0 to 99. V is the value (brightness) component of the HSV color space; an HS figure is the two-dimensional plane figure obtained when V is held at a constant in its range, with hue H as the horizontal axis and saturation S as the vertical axis; the positive direction of the H axis is horizontally to the right with integer values from 0 to 359, and the positive direction of the S axis is vertically downward with integer values from 0 to 99. Let HS_i be the figure corresponding to V = i, where i is an integer and 0 ≤ i ≤ 99; let I_HS_Ri, I_HS_Gi, I_HS_Bi denote the R, G, B channel values in the RGB color space of the colors of figure HS_i, and let I_RP,XYZ_Xi, I_RP,XYZ_Yi, I_RP,XYZ_Zi denote their corresponding X, Y, Z component values in the CIE XYZ color space. Then:

[I_RP,XYZ_Xi  I_RP,XYZ_Yi  I_RP,XYZ_Zi]^T = A_R × [I_HS_Ri  I_HS_Gi  I_HS_Bi]^T   (8)

I_RP,Lab_Li = 116·f(I_RP,XYZ_Yi / Y_R_white) - 16
I_RP,Lab_ai = 500·[f(I_RP,XYZ_Xi / X_R_white) - f(I_RP,XYZ_Yi / Y_R_white)]   (9)
I_RP,Lab_bi = 200·[f(I_RP,XYZ_Yi / Y_R_white) - f(I_RP,XYZ_Zi / Z_R_white)]

In the formulas, I_RP,Lab_Li, I_RP,Lab_ai, I_RP,Lab_bi are the corresponding L*, a*, b* component values in the CIE L*a*b* color space of the colors of figure HS_i, and A_R is the RGB-to-CIE XYZ transformation matrix calculated from the absorption curve function of the cyan lens for the colors of the different wavebands; its value is the accepted value in the published literature:

A_R = [0.0153  0.1092  0.1171
       0.0176  0.3088  0.0777
       0.0201  0.1016  0.6546]   (10)
[X_R_white Y_R_white Z_R_white] is the reference white point of the CIE XYZ color space corresponding to the transformation matrix A_R, computed by:

[X_R_white Y_R_white Z_R_white]^T = A_R × [1 1 1]^T   (11)
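The construction of the perception point cloud from the HS figures can be sketched as follows. This is a subsampled illustration assuming the A_R matrix quoted above; colorsys expects H, S, V in [0, 1], so the integer axes are rescaled, and the sampling steps are arbitrary choices, not values from the patent:

```python
import colorsys
import numpy as np

# RGB -> CIE XYZ matrix A_R for the cyan lens (value quoted in the patent)
A_R = np.array([[0.0153, 0.1092, 0.1171],
                [0.0176, 0.3088, 0.0777],
                [0.0201, 0.1016, 0.6546]])
WHITE_R = A_R @ np.ones(3)  # reference white for A_R, formula (11)

def f(t):
    t = np.asarray(t, dtype=float)
    return np.where(t > 0.008856, np.cbrt(t), 7.787 * t + 16.0 / 116.0)

def cyan_lens_point_cloud(h_step=10, s_step=10, v_step=10):
    """Sample the HS figures for V = 0..99 and return the (a*, b*) cloud
    of colors as perceived through the cyan lens."""
    pts = []
    for v in range(0, 100, v_step):          # the 100 HS figures, subsampled
        for h in range(0, 360, h_step):      # hue axis, integers 0..359
            for s in range(0, 100, s_step):  # saturation axis, integers 0..99
                rgb = colorsys.hsv_to_rgb(h / 360.0, s / 99.0, v / 99.0)
                xyz = A_R @ np.array(rgb)
                fx, fy, fz = f(xyz / WHITE_R)
                pts.append((500.0 * (fx - fy), 200.0 * (fy - fz)))  # (a*, b*)
    return np.array(pts)
```

Projecting such a cloud onto the a*-b* plane and fitting its boundary is what yields curves one and two below.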
From all the computed I_RP,Lab_Li, I_RP,Lab_ai, I_RP,Lab_bi, a three-dimensional graph is drawn in the CIE L*a*b* color space, giving the three-dimensional perception region of the cyan lens. This region is distributed as a series of point-cluster bodies which, observed in space, are approximately perpendicular to the a*-b* plane. Projecting the three-dimensional perception region onto the a*-b* plane gives the two-dimensional perception region; the boundary point clusters of the two-dimensional perception region are fitted with quadratic polynomials, and the fitting result is shown in Fig. 2. The upper-right boundary fitting curve is curve one, with equation:
b1 = -5.1146×10^-4·(a1)^2 - 0.8724·(a1) + 22.5848   (12)
In the formula, b1 and a1 are the b* and a* component values on curve one. The lower-left boundary fitting curve is curve two, with equation:
b2 = 0.0029·(a2)^2 - 0.89·(a2) - 5.5028   (13)
In the formula, b2 and a2 are the b* and a* component values on curve two. Thus, in the a*-b* plane, the two-dimensional perception region of the cyan lens is the region enclosed by curve one and curve two.
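Under the assumption that a point lies inside the enclosed region exactly when its b* value is at or below curve one and at or above curve two, a membership test can be sketched as:

```python
def curve_one(a):
    # Upper-right boundary of the 2-D perception region, formula (12)
    return -5.1146e-4 * a**2 - 0.8724 * a + 22.5848

def curve_two(a):
    # Lower-left boundary of the 2-D perception region, formula (13)
    return 0.0029 * a**2 - 0.89 * a - 5.5028

def in_perception_region(a, b):
    """True if (a*, b*) lies inside the region enclosed by the two curves
    (boundary points included)."""
    return curve_two(a) <= b <= curve_one(a)
```

For example, the achromatic point (0, 0) lies between curve_two(0) = -5.5028 and curve_one(0) = 22.5848 and is therefore inside the region.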
Step 3: Matching mapping.

When an arbitrary right view is converted from the RGB color space to the CIE L*a*b* color space, most color points already fall inside the perception region of the cyan lens and need no adjustment, but some color points fall outside this region. So that the right eye can observe and perceive these colors, they are adjusted onto the boundary of the perception region of the cyan lens; this is accomplished by the matching mapping. Related research shows that in the final stage of color perception the human visual system generally characterizes color by its three elements. The present invention uses the hue element among them, adjusting colors under the constraint that hue remains unchanged; this better conforms to the human eye's perception of color and can improve the color fidelity of the synthesized anaglyph. Choosing hue invariance as the constraint means the matching mapping need only be carried out in the a*-b* plane, mapping the colors of the right view onto the two-dimensional perception region of the cyan lens in the CIE L*a*b* color space.
1. Principles of the matching mapping

The matching mapping follows two principles:

Principle (1): the hue value of every pixel remains unchanged before and after the mapping, i.e., hue invariance. This serves as the matching condition for colors during the mapping process.

Principle (2): the a*, b* component values of every pixel of the right view must be mapped into the two-dimensional perception region of the cyan lens or onto its boundary.
2. Matching mapping with hue invariance as the constraint

The color distribution of an arbitrary right view converted from the RGB color space to the CIE L*a*b* color space is compared with the two-dimensional perception region of the cyan lens enclosed by curve one and curve two.
Denote the a*, b* component values of a pixel outside the upper-right boundary (curve one) after the matching mapping as I_RM,Lab_au, I_RM,Lab_bu, with hue value H_RMu:

H_RMu = (180/π)·arctan(I_RM,Lab_bu / I_RM,Lab_au)   (14)

The corresponding a*, b* component values before the matching mapping are I_R,Lab_au, I_R,Lab_bu, with hue value H_Ru:

H_Ru = (180/π)·arctan(I_R,Lab_bu / I_R,Lab_au)   (15)

By principle (1), H_RMu = H_Ru, from which it is derived that:

I_RM,Lab_bu / I_RM,Lab_au = I_R,Lab_bu / I_R,Lab_au   (16)

Denote the a*, b* component values of a pixel outside the lower-left boundary (curve two) after the matching mapping as I_RM,Lab_ad, I_RM,Lab_bd, with hue value H_RMd:

H_RMd = (180/π)·arctan(I_RM,Lab_bd / I_RM,Lab_ad)   (17)

The corresponding a*, b* component values before the matching mapping are I_R,Lab_ad, I_R,Lab_bd, with hue value H_Rd:

H_Rd = (180/π)·arctan(I_R,Lab_bd / I_R,Lab_ad)   (18)

By principle (1), H_RMd = H_Rd, from which it is derived that:

I_RM,Lab_bd / I_RM,Lab_ad = I_R,Lab_bd / I_R,Lab_ad   (19)
For a pixel outside the upper-right boundary (curve one), by principle (2) it is mapped onto boundary curve one of the two-dimensional perception region of the cyan lens, i.e., I_RM,Lab_au, I_RM,Lab_bu satisfy formula (12). From formula (12) and formula (16):

I_RM,Lab_bu / I_RM,Lab_au = I_R,Lab_bu / I_R,Lab_au
I_RM,Lab_bu = -5.1146×10^-4·(I_RM,Lab_au)^2 - 0.8724·I_RM,Lab_au + 22.5848   (20)

from which the solution I_RM,Lab_au, I_RM,Lab_bu can be obtained. This operation is performed for all pixels outside the upper-right boundary (curve one).

For a pixel outside the lower-left boundary (curve two), by principle (2) it is mapped onto boundary curve two of the two-dimensional perception region of the cyan lens, i.e., I_RM,Lab_ad, I_RM,Lab_bd satisfy formula (13). From formula (13) and formula (19):

I_RM,Lab_bd / I_RM,Lab_ad = I_R,Lab_bd / I_R,Lab_ad
I_RM,Lab_bd = 0.0029·(I_RM,Lab_ad)^2 - 0.89·I_RM,Lab_ad - 5.5028   (21)

from which the solution I_RM,Lab_ad, I_RM,Lab_bd can be obtained. This operation is performed for all pixels outside the lower-left boundary (curve two).

Color points located inside the two-dimensional perception region or on its boundary remain unchanged.
At this point, the matching mapping of the right view in the a*-b* plane is complete, giving the mapped a*, b* component values of the right view in the CIE L*a*b* color space, I_RM,Lab_au, I_RM,Lab_bu, I_RM,Lab_ad, I_RM,Lab_bd; the a* and b* component values are further denoted uniformly as I_RM,Lab_a and I_RM,Lab_b, respectively.
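System (20) reduces to a single quadratic in a* once the hue-ray slope k = b*/a* is fixed. A sketch of the solve follows; the root-selection rule (keeping the root on the same hue ray as the original point) is an assumption not spelled out in the text, and a pixel with a* = 0 would need separate handling:

```python
import math

def map_to_curve_one(a0, b0):
    """Map an (a*, b*) point outside curve one onto the curve while keeping
    the hue ratio b*/a* unchanged (system (20)); assumes a0 != 0."""
    k = b0 / a0  # hue-ray slope, fixed by hue invariance, formula (16)
    # k*a = -5.1146e-4*a^2 - 0.8724*a + 22.5848
    # => 5.1146e-4*a^2 + (k + 0.8724)*a - 22.5848 = 0
    A, B, C = 5.1146e-4, k + 0.8724, -22.5848
    disc = math.sqrt(B * B - 4.0 * A * C)
    roots = ((-B + disc) / (2.0 * A), (-B - disc) / (2.0 * A))
    # keep the root on the same hue ray as the original point
    a = next(r for r in roots if r * a0 > 0)
    return a, k * a
```

The mapped point then satisfies both constraints of system (20): it lies on curve one and preserves b*/a*.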
Step 4: Brightness adjustment of the left view and the right view in the CIE L*a*b* color space.

In a dark purple anaglyph, if the saturation and brightness of the red component are both too high, the visual rivalry phenomenon becomes quite obvious and directly affects the comfort of viewing the anaglyph. Therefore, brightness adjustment is applied here to colors near the red component, within a certain range, in both the left view and the right view, to reduce the impact of visual rivalry.

Denote the red point of the RGB color space as Red = (1, 0, 0). After converting it to the CIE L*a*b* color space, the hue value corresponding to this red point Red is calculated to be H_Red = 41.7515. Based on experience summarized from many experiments, the threshold is taken as T = 15, and brightness adjustment is applied to colors whose hue value lies in the range (H_Red - T, H_Red + T).
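The constant H_Red can be reproduced from the Step 1 conversion. The sketch below assumes the Cs matrix and reference white given earlier, and the hue formula H = (180/π)·arctan(b*/a*) used below for H_R; the function name hue_deg is illustrative:

```python
import math
import numpy as np

CS = np.array([[0.4243, 0.3105, 0.1657],
               [0.2492, 0.6419, 0.1089],
               [0.0265, 0.1225, 0.8614]])
WHITE = CS @ np.ones(3)

def f(t):
    t = np.asarray(t, dtype=float)
    return np.where(t > 0.008856, np.cbrt(t), 7.787 * t + 16.0 / 116.0)

def hue_deg(rgb):
    """Hue angle H = (180/pi)*arctan(b*/a*) of a normalized RGB color."""
    fx, fy, fz = f((CS @ np.asarray(rgb, dtype=float)) / WHITE)
    a = 500.0 * (fx - fy)
    b = 200.0 * (fy - fz)
    return math.degrees(math.atan2(b, a))

H_red = hue_deg((1.0, 0.0, 0.0))  # close to the patent's H_Red = 41.7515
```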
For the right view, the brightness of a pixel before adjustment is I_R,Lab_L, and the brightness after adjustment is denoted I_RM,Lab_L; then:

where W_r is a weight coefficient that varies with the color saturation of the corresponding pixel of the right view; its expression is:

In the formula, W_max = 0.4, S_max = 50 and S_min = 40, set according to the published experimental experience of the HSV algorithm.
H_R and S_R are the hue value and saturation value of the pixel before brightness adjustment, computed by:

H_R = (180/π)·arctan(I_R,Lab_b / I_R,Lab_a)   (24)
S_R = sqrt((I_R,Lab_a)^2 + (I_R,Lab_b)^2)   (25)

Formulas (22) and (23) show that the higher the saturation of a pixel, the larger W_r becomes and, correspondingly, the smaller the adjusted brightness I_RM,Lab_L becomes, counteracting the impact of visual rivalry; when the brightness adjustment of a pixel is small, the visual rivalry it causes is itself also small.

This operation is performed for all pixels of the right view.
Similarly, for the left view, the brightness of a pixel before adjustment is I_L,Lab_L, and the brightness after adjustment is denoted I_LM,Lab_L; then:

In the formula, W_l is a weight coefficient that varies with the color saturation of the corresponding pixel of the left view; its role is analogous to that of W_r, and it is embodied as:

where H_L and S_L are the hue value and saturation value of the pixel before adjustment, computed by:

H_L = (180/π)·arctan(I_L,Lab_b / I_L,Lab_a)   (28)
S_L = sqrt((I_L,Lab_a)^2 + (I_L,Lab_b)^2)   (29)

This operation is performed for all pixels of the left view.
This completes the brightness adjustment of the left view and the right view, giving their L* component values in the CIE L*a*b* color space after adjustment, I_LM,Lab_L and I_RM,Lab_L. The brightness adjustment, on the one hand, improves the color accuracy of the red regions and avoids excessive color distortion; on the other hand, it avoids the visual rivalry caused by the brightness imbalance perceived between the left and right views.
Step 5: Convert the processed left view and right view back from the CIE L*a*b* color space to the RGB color space and synthesize the dark purple anaglyph.

1. Conversion of the right view

After the above matching mapping and brightness adjustment, the three components of a right-view pixel in the CIE L*a*b* color space are I_RM,Lab_L, I_RM,Lab_a, I_RM,Lab_b. First, convert them to the CIE XYZ color space:

I_RM,XYZ_X = X_R_white · f^-1((I_RM,Lab_L + 16)/116 + I_RM,Lab_a/500)
I_RM,XYZ_Y = Y_R_white · f^-1((I_RM,Lab_L + 16)/116)   (30)
I_RM,XYZ_Z = Z_R_white · f^-1((I_RM,Lab_L + 16)/116 - I_RM,Lab_b/200)

In the formulas, I_RM,XYZ_X, I_RM,XYZ_Y, I_RM,XYZ_Z are the X, Y, Z component values in the CIE XYZ color space of the right-view pixel after the matching mapping. f^-1(·) is the inverse function of f(·):

f^-1(t) = t^3,                  t > 0.008856^(1/3)
f^-1(t) = (t - 16/116)/7.787,   otherwise   (31)

Then, through A_R^-1, convert from the CIE XYZ color space back to the RGB color space:

[I_RM,RGB_R  I_RM,RGB_G  I_RM,RGB_B]^T = A_R^-1 × [I_RM,XYZ_X  I_RM,XYZ_Y  I_RM,XYZ_Z]^T   (32)

This operation is performed for all pixels of the right view.
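Since f^-1 must undo f on both branches, the pair can be sketched and round-tripped as follows (standard CIE formulas rather than patent-specific code):

```python
def f(t):
    # CIE compression function, as in Step 1
    return t ** (1.0 / 3.0) if t > 0.008856 else 7.787 * t + 16.0 / 116.0

def f_inv(t):
    # Inverse of f: cube above the break point f(0.008856) = 0.008856**(1/3)
    return t ** 3 if t > 0.008856 ** (1.0 / 3.0) else (t - 16.0 / 116.0) / 7.787
```

Both branches round-trip: f_inv(f(x)) recovers x for values above and below the 0.008856 threshold, up to floating-point error.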
2. Conversion of the left view

For the left view after brightness adjustment, the pixel L* value obtained is I_LM,Lab_L; denote the a*, b* component values as I_LM,Lab_a, I_LM,Lab_b. Since the left view does not undergo the matching mapping, its a*, b* components are unchanged: I_LM,Lab_a = I_L,Lab_a and I_LM,Lab_b = I_L,Lab_b. Similarly, using the CIE XYZ color space as an intermediary, it is converted back to the RGB color space:

I_LM,XYZ_X = X_L_white · f^-1((I_LM,Lab_L + 16)/116 + I_LM,Lab_a/500)
I_LM,XYZ_Y = Y_L_white · f^-1((I_LM,Lab_L + 16)/116)   (33)
I_LM,XYZ_Z = Z_L_white · f^-1((I_LM,Lab_L + 16)/116 - I_LM,Lab_b/200)

In the formulas, I_LM,XYZ_X, I_LM,XYZ_Y, I_LM,XYZ_Z are the X, Y, Z component values in the CIE XYZ color space of the left view after brightness adjustment. A_L is the RGB-to-CIE XYZ transformation matrix calculated from the absorption curve function of the red lens for the colors of the different wavebands; its value is the accepted value in the published literature:

A_L = [0.1840  0.0179  0.0048
       0.0876  0.0118  0.0018
       0.0005  0.0012  0.0159]   (34)

[X_L_white Y_L_white Z_L_white] is the reference white point of the CIE XYZ color space corresponding to the transformation matrix A_L, computed by:

[X_L_white Y_L_white Z_L_white]^T = A_L × [1 1 1]^T   (35)

Through A_L^-1, it is converted from the CIE XYZ color space to the RGB color space:

[I_LM,RGB_R  I_LM,RGB_G  I_LM,RGB_B]^T = A_L^-1 × [I_LM,XYZ_X  I_LM,XYZ_Y  I_LM,XYZ_Z]^T   (36)
3. Synthesis of the dark purple anaglyph

Let the pixel of the finally synthesized dark purple anaglyph be denoted I_A,RGB in the RGB color space, with corresponding R, G, B channel values I_A,RGB_R, I_A,RGB_G and I_A,RGB_B. The G and B channel values I_RM,RGB_G and I_RM,RGB_B of the processed right-view pixel are taken as the G and B channel values of the anaglyph pixel, and the R channel value I_LM,RGB_R of the processed left-view pixel is taken as the R channel value of the dark purple anaglyph pixel. Then:

[I_A,RGB_R  I_A,RGB_G  I_A,RGB_B] = [I_LM,RGB_R  I_RM,RGB_G  I_RM,RGB_B]   (37)

Multiplying each component by 255 gives the pixel color value I_A,RGB of the finally synthesized dark purple anaglyph. Performing this operation on each pixel finally gives the dark purple anaglyph.
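The channel recombination of formula (37) can be sketched for whole images at once (left_rgb and right_rgb are hypothetical arrays holding the processed, [0, 1]-normalized views):

```python
import numpy as np

def synthesize_anaglyph(left_rgb, right_rgb):
    """Combine the R channel of the processed left view with the G and B
    channels of the processed right view (formula (37)), scaled to 0..255."""
    out = np.empty_like(left_rgb)
    out[..., 0] = left_rgb[..., 0]    # R from the left view
    out[..., 1] = right_rgb[..., 1]   # G from the right view
    out[..., 2] = right_rgb[..., 2]   # B from the right view
    return np.clip(out * 255.0, 0, 255).astype(np.uint8)
```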
At this point, through the above five steps, the synthesis of the dark purple anaglyph is complete.
The following four embodiments were carried out on a computer with an AMD Athlon(tm) II X2 245 processor, a clock frequency of 3.01 GHz and 2 GB of memory. The source material is a color stereo image pair; the input is the left view and the right view of the color stereo image pair; the programming language is MATLAB, and the implementation platform is MATLAB 2010a.
In the embodiments, four feature points with obvious geometric features that are easy to locate are found on the image; 4 pairs of feature points corresponding to the four feature points are determined on the anaglyph and on one view of the stereo image pair (here the right view is selected), the RGB color values of the 4 pairs of feature points are detected, and the color fidelity of the algorithms is verified with the "color-difference percentage" index. In the RGB color space, the color-difference percentage of each pair of feature points is defined as:

where R1, G1, B1 are the R, G, B component values of the feature point on the anaglyph, and R0, G0, B0 are the R, G, B component values of the corresponding feature point on the right view of the stereo image pair. The larger the color-difference percentage, the worse the color fidelity of the anaglyph synthesized by the algorithm; the smaller the color-difference percentage, the better the color fidelity.

The XYZ algorithm, the HSV algorithm and the algorithm proposed by the present invention are each used to synthesize a dark purple anaglyph, and the results are calculated and compared. The anaglyphs synthesized by the empirical synthesis algorithm and the Lab algorithm show markedly greater visual rivalry than the above three algorithms; this lies outside the scope of the visual rivalry improvement addressed by the present invention, so those two algorithms do not take part in the comparison.
Embodiment 1:

The input is the left view and the right view of the "duck" color stereo image pair, of size 400 × 300 pixels; the XYZ algorithm, the HSV algorithm and the present invention are each used to synthesize a dark purple anaglyph. According to the obvious geometric features of the non-occluded regions of the right view, four feature points are chosen, denoted point 1, point 2, point 3 and point 4; 4 pairs of feature points are determined accordingly on the right view and the dark purple anaglyph, their R, G, B component values are recorded, and the color-difference percentages are calculated. The results are shown in Table 1:

Table 1. Color comparison of the dark purple anaglyph synthesis algorithms on the "duck" image

As can be seen from Table 1, the color-difference percentage of the anaglyph synthesized by the present invention at point 1 is 9.03% lower than that of the XYZ algorithm and 11.18% lower than that of the HSV algorithm; at point 2 the color-difference percentage reaches the minimum, its value being 10.81% lower than the XYZ algorithm and 12.32% lower than the HSV algorithm; at point 3 it is 4.54% lower than the XYZ algorithm and 1.42% lower than the HSV algorithm; at point 4 it is 3.08% lower than the XYZ algorithm and 2.79% lower than the HSV algorithm. These data show that the dark purple anaglyph synthesized by the present invention improves visual rivalry while effectively improving color fidelity.
Embodiment 2:

The input is the left view and the right view of the "memorial archway" color stereo image pair, of size 800 × 600 pixels; the XYZ algorithm, the HSV algorithm and the present invention are each used to synthesize a dark purple anaglyph. According to the obvious geometric features of the non-occluded regions of the right view, four feature points are chosen, denoted point 1, point 2, point 3 and point 4; 4 pairs of feature points are determined accordingly on the right view and the dark purple anaglyph, their R, G, B component values are recorded, and the color-difference percentages are calculated. The results are shown in Table 2:

Table 2. Color comparison of the dark purple anaglyph synthesis algorithms on the "memorial archway" image

As can be seen from Table 2, the color-difference percentage of the anaglyph synthesized by the present invention reaches the minimum at point 1, its value being 38.02% lower than the XYZ algorithm and 26.15% lower than the HSV algorithm; at point 2 it is 2.2% lower than the XYZ algorithm and 1.29% lower than the HSV algorithm; at point 3 it is 14.13% lower than the XYZ algorithm and 17.7% lower than the HSV algorithm; at point 4 it is 14.83% lower than the XYZ algorithm and 1.9% lower than the HSV algorithm. These data show that in the regions where the color fidelity of the dark purple anaglyphs synthesized by the XYZ algorithm and the HSV algorithm is relatively low, the anaglyph synthesized by the present invention, while improving visual rivalry, improves color fidelity especially markedly.
Embodiment 3:

The input is the left view and the right view of the "superman" color stereo image pair, of size 420 × 330 pixels; the XYZ algorithm, the HSV algorithm and the present invention are each used to synthesize a dark purple anaglyph. According to the obvious geometric features of the non-occluded regions of the right view, four feature points are chosen, denoted point 1, point 2, point 3 and point 4; 4 pairs of feature points are determined accordingly on the right view and the dark purple anaglyph, their R, G, B component values are recorded, and the color-difference percentages are calculated. The results are shown in Table 3:

Table 3. Color comparison of the dark purple anaglyph synthesis algorithms on the "superman" image

As can be seen from Table 3, the color-difference percentage of the anaglyph synthesized by the present invention reaches the minimum at point 1, its value being 27.26% lower than the XYZ algorithm and 31.04% lower than the HSV algorithm; at point 2 it is 19.04% lower than the XYZ algorithm and 22.85% lower than the HSV algorithm; at point 3 it is 8.04% lower than the XYZ algorithm and 13.43% lower than the HSV algorithm; at point 4 it is 2.75% lower than the XYZ algorithm and 4.4% lower than the HSV algorithm. These data show that the dark purple anaglyph synthesized by the present invention improves visual rivalry while effectively improving color fidelity.
Embodiment 4:

The input is the left view and the right view of the "ocean" color stereo image pair, of size 480 × 220 pixels; the XYZ algorithm, the HSV algorithm and the present invention are each used to synthesize a dark purple anaglyph. According to the obvious geometric features of the non-occluded regions of the right view, four feature points are chosen, denoted point 1, point 2, point 3 and point 4; 4 pairs of feature points are determined accordingly on the right view and the dark purple anaglyph, their R, G, B component values are recorded, and the color-difference percentages are calculated. The results are shown in Table 4:

Table 4. Color comparison of the dark purple anaglyph synthesis algorithms on the "ocean" image

As can be seen from Table 4, the color-difference percentage of the anaglyph synthesized by the present invention reaches the minimum at point 1, its value being 14.68% lower than the XYZ algorithm and 17.39% lower than the HSV algorithm; at point 2 it is 7.76% lower than the XYZ algorithm and 22.07% lower than the HSV algorithm; at point 3 it is 5.47% lower than the XYZ algorithm and 9.85% lower than the HSV algorithm; at point 4 it is 5.41% lower than the XYZ algorithm and 8.44% lower than the HSV algorithm. These data show that the dark purple anaglyph synthesized by the present invention improves visual rivalry while effectively improving color fidelity.
The above embodiments also show that, in the process of synthesizing the anaglyph, different colors are distorted to different degrees: in general, gray and green are distorted less, while red and blue are distorted more.

Claims (1)

1. A visual rivalry improvement method for complementary-color anaglyph images that improves color fidelity, characterized by comprising the following steps:
Step 1: in the RGB color space, normalize the R, G, B channel values of each left-view pixel IL,RGB and right-view pixel IR,RGB of the stereo image pair so that they lie in the range [0, 1]; using the CIE XYZ color space as an intermediary, convert the left-view and right-view pixels from the RGB color space to the CIE L*a*b* color space: [IL,XYZ_X IL,XYZ_Y IL,XYZ_Z]^T = Cs × [IL,RGB_R IL,RGB_G IL,RGB_B]^T, [IR,XYZ_X IR,XYZ_Y IR,XYZ_Z]^T = Cs × [IR,RGB_R IR,RGB_G IR,RGB_B]^T, in which IL,RGB_R, IL,RGB_G, IL,RGB_B are the R, G, B channel values of the left-view pixel IL,RGB, IL,XYZ_X, IL,XYZ_Y, IL,XYZ_Z are the X, Y, Z component values of the left-view pixel IL,RGB in the CIE XYZ color space, IR,RGB_R, IR,RGB_G, IR,RGB_B are the R, G, B channel values of the right-view pixel IR,RGB, IR,XYZ_X, IR,XYZ_Y, IR,XYZ_Z are the X, Y, Z component values of the right-view pixel in the CIE XYZ color space, and Cs is the transformation matrix from the RGB color space to the CIE XYZ color space, Cs = [0.4243 0.3105 0.1657; 0.2492 0.6419 0.1089; 0.0265 0.1225 0.8614];
Using the transformation relation between the CIE XYZ color space and the CIE L*a*b* color space, the left-view and right-view pixels transformed to the CIE XYZ color space are further converted to the CIE L*a*b* color space: IL,Lab_L = 116·f(IL,XYZ_Y/Ywhite) - 16, IL,Lab_a = 500·[f(IL,XYZ_X/Xwhite) - f(IL,XYZ_Y/Ywhite)], IL,Lab_b = 200·[f(IL,XYZ_Y/Ywhite) - f(IL,XYZ_Z/Zwhite)]; IR,Lab_L = 116·f(IR,XYZ_Y/Ywhite) - 16, IR,Lab_a = 500·[f(IR,XYZ_X/Xwhite) - f(IR,XYZ_Y/Ywhite)], IR,Lab_b = 200·[f(IR,XYZ_Y/Ywhite) - f(IR,XYZ_Z/Zwhite)], in which IL,Lab_L, IL,Lab_a, IL,Lab_b are the L*, a*, b* component values of the left-view pixel IL,RGB in the CIE L*a*b* color space, IR,Lab_L, IR,Lab_a, IR,Lab_b are the L*, a*, b* component values of the right-view pixel IR,RGB in the CIE L*a*b* color space, [Xwhite Ywhite Zwhite] is the reference white point of the CIE XYZ color space, and [Xwhite Ywhite Zwhite]^T = Cs × [1 1 1]^T;
Each pixel in left view and right view is carried out above-mentioned computing, thus completing stereo pairs from RGB color to CIEL*a*b*The conversion of color space;
Step 2, one panchromatic right view comprising all colours of definition, its color is at RGB color three-dimensional matrice C256×256×256Storage, the every one-dimensional element of matrix is the integer from 0 to 255;Calculate this panchromatic right view through after cyan eyeglass at CIEL*a*b*The distributed areas of color space, are cyan eyeglass at CIEL*a*b*The sensing region of color space, its computational methods are as follows:
Replace C256×256×256 with 100 HS figures in which V varies over the integers from 0 to 99, where V is the value component of the HSV color space and an HS figure is the two-dimensional plane figure obtained when V is held at a constant in its range, with hue H as the horizontal axis and saturation S as the vertical axis; the positive direction of the H axis is horizontally to the right with integer values from 0 to 359, and the positive direction of the S axis is vertically downward with integer values from 0 to 99; let HSi be the figure corresponding to V = i, where i is an integer and 0 ≤ i ≤ 99; let IHS_Ri, IHS_Gi, IHS_Bi denote the R, G, B channel values in the RGB color space of the colors of figure HSi, and IRP,XYZ_Xi, IRP,XYZ_Yi, IRP,XYZ_Zi their corresponding X, Y, Z component values in the CIE XYZ color space; then [IRP,XYZ_Xi IRP,XYZ_Yi IRP,XYZ_Zi]^T = AR × [IHS_Ri IHS_Gi IHS_Bi]^T, IRP,Lab_Li = 116·f(IRP,XYZ_Yi/YR_white) - 16, IRP,Lab_ai = 500·[f(IRP,XYZ_Xi/XR_white) - f(IRP,XYZ_Yi/YR_white)], IRP,Lab_bi = 200·[f(IRP,XYZ_Yi/YR_white) - f(IRP,XYZ_Zi/ZR_white)], in which IRP,Lab_Li, IRP,Lab_ai, IRP,Lab_bi denote the corresponding L*, a*, b* component values in the CIE L*a*b* color space of the colors of figure HSi, AR = [0.0153 0.1092 0.1171; 0.0176 0.3088 0.0777; 0.0201 0.1016 0.6546] is the RGB-to-CIE XYZ transformation matrix calculated from the absorption curve function of the cyan lens for the colors of the different wavebands, and [XR_white YR_white ZR_white] is the reference white point of the CIE XYZ color space corresponding to the transformation matrix AR, [XR_white YR_white ZR_white]^T = AR × [1 1 1]^T;
From all the computed IRP,Lab_Li, IRP,Lab_ai, IRP,Lab_bi, a three-dimensional graph is drawn in the CIE L*a*b* color space, giving the three-dimensional perception region of the cyan lens; this region is distributed as a series of point-cluster bodies which, observed in space, are approximately perpendicular to the a*-b* plane; projecting the three-dimensional perception region onto the a*-b* plane gives the two-dimensional perception region, namely the region enclosed by curve one and curve two, where the equation of curve one is b1 = -5.1146×10^-4·(a1)^2 - 0.8724·(a1) + 22.5848 and the equation of curve two is b2 = 0.0029·(a2)^2 - 0.89·(a2) - 5.5028, in which b1, a1 are the b* and a* component values on curve one, and b2, a2 are the b* and a* component values on curve two;
Step 3: carry out the matching mapping in the a*-b* plane, mapping the colors of the right view onto the two-dimensional perception region of the cyan lens in the CIE L*a*b* color space;
Denote the a*, b* component values of a pixel outside the upper-right boundary (curve one) after the matching mapping as IRM,Lab_au, IRM,Lab_bu, with hue value HRMu = (180/π)·arctan(IRM,Lab_bu/IRM,Lab_au); the corresponding a*, b* component values before the matching mapping are IR,Lab_au, IR,Lab_bu, with hue value HRu = (180/π)·arctan(IR,Lab_bu/IR,Lab_au); it is derived that IRM,Lab_bu/IRM,Lab_au = IR,Lab_bu/IR,Lab_au;
Denote the a*, b* component values of a pixel outside the lower-left boundary (curve two) after the matching mapping as IRM,Lab_ad, IRM,Lab_bd, with hue value HRMd = (180/π)·arctan(IRM,Lab_bd/IRM,Lab_ad); the corresponding a*, b* component values before the matching mapping are IR,Lab_ad, IR,Lab_bd, with hue value HRd = (180/π)·arctan(IR,Lab_bd/IR,Lab_ad); it is derived that IRM,Lab_bd/IRM,Lab_ad = IR,Lab_bd/IR,Lab_ad;
By on the two-dimentional sensing region boundary curve one of the pixel-map outside border, curve one upper right side to cyan eyeglass, have
IRM,Lab_bu / IRM,Lab_au = IR,Lab_bu / IR,Lab_au
IRM,Lab_bu = -5.1146×10^-4·(IRM,Lab_au)^2 - 0.8724·IRM,Lab_au + 22.5848
Try to achieve IRM,Lab_au、IRM,Lab_buSolution;All pixels outside for border, curve one upper right side perform this operation;
By on the two-dimentional sensing region boundary curve two of the pixel-map outside border, curve two lower left to cyan eyeglass, have
IRM,Lab_bd / IRM,Lab_ad = IR,Lab_bd / IR,Lab_ad
IRM,Lab_bd = 0.0029·(IRM,Lab_ad)^2 - 0.89·IRM,Lab_ad - 5.5028
Try to achieve IRM,Lab_ad、IRM,Lab_bdSolution;All pixels outside for border, curve two lower left perform this operation;
Color points located inside the two-dimensional perception region or on its boundary remain unchanged; after the mapping, the a*, b* component values of the right view in the CIE L*a*b* color space are IRM,Lab_au, IRM,Lab_bu, IRM,Lab_ad, IRM,Lab_bd, and the a*, b* component values are further denoted uniformly as IRM,Lab_a, IRM,Lab_b;
Step 4, at CIEL*a*b*Left view and right view are carried out brightness regulation by color space respectively;
Red point in note RGB color is Red=(1,0,0), converts it to CIEL*a*b*After color space, calculating the tone value corresponding to this redness point Red is HRed=41.7515;Take threshold value T=15, to tone value at (HRed-T, HRed+ T) color in scope carries out brightness regulation;
To right view, the brightness regulating preceding pixel is IR,Lab_L, the brightness after note adjustment is IRM,Lab_L, have
wherein Wr is a weight coefficient, in which Wmax = 0.4, Smax = 50, Smin = 40; HR = (180/π)·arctan(IR,Lab_b/IR,Lab_a) and SR = sqrt((IR,Lab_a)^2 + (IR,Lab_b)^2) are the hue value and saturation value of the pixel before brightness adjustment, respectively;
All pixels of right view are carried out above operation;
To left view, the brightness regulating preceding pixel is IL,Lab_L, the brightness after note adjustment is ILM,Lab_L, have
In formula, WlFor weight coefficient,Wherein, H L = 180 π · a r c t a n ( I L , L a b _ b I L , L a b _ a ) With S L = ( I L , L a b _ a ) 2 + ( I L , L a b _ b ) 2 Respectively regulate tone value and the intensity value of preceding pixel;All pixels of left view are carried out above operation;
Step 5, through above-mentioned coupling map and brightness regulation after, the right view pixel obtained is at CIEL*a*b*The three-component of color space is IRM,Lab_L、IRM,Lab_a、IRM,Lab_b;First, CIE XYZ color space is converted it to:
IRM,XYZ_X = XR_white · f^-1((IRM,Lab_L + 16)/116 + IRM,Lab_a/500)
IRM,XYZ_Y = YR_white · f^-1((IRM,Lab_L + 16)/116)
IRM,XYZ_Z = ZR_white · f^-1((IRM,Lab_L + 16)/116 - IRM,Lab_b/200)
In these formulas, I_RM,XYZ_X, I_RM,XYZ_Y and I_RM,XYZ_Z are the X, Y, Z component values in CIE XYZ color space of the right-view pixel after the matching mapping, and f^(-1)(·) is the inverse function of f(·).
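The inverse Lab-to-XYZ transform above can be sketched with the standard CIE inverse of f (the 6/29-threshold form); the function names are illustrative, and the reference white triple is whichever [X_white, Y_white, Z_white] applies to the view being converted:

```python
def f_inv(t):
    # inverse of the CIE f() function used in the L*a*b* definition
    delta = 6.0 / 29.0
    if t > delta:
        return t ** 3
    return 3.0 * delta ** 2 * (t - 4.0 / 29.0)

def lab_to_xyz(L, a, b, white):
    # white = (Xw, Yw, Zw): reference white of the corresponding transform matrix
    Xw, Yw, Zw = white
    fy = (L + 16.0) / 116.0
    X = Xw * f_inv(fy + a / 500.0)
    Y = Yw * f_inv(fy)
    Z = Zw * f_inv(fy - b / 200.0)
    return X, Y, Z
```

For example, L* = 100 with a* = b* = 0 maps back to the reference white itself.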
Then convert from CIE XYZ color space back to RGB color space via A_R^(-1):
[I_RM,RGB_R  I_RM,RGB_G  I_RM,RGB_B]^T = A_R^(-1) × [I_RM,XYZ_X  I_RM,XYZ_Y  I_RM,XYZ_Z]^T
The above operation is applied to all pixels of the right view.
For the left view after brightness adjustment, the L* value of each resulting pixel is I_LM,Lab_L, and the a*, b* component values are denoted I_LM,Lab_a and I_LM,Lab_b. Using CIE XYZ color space as an intermediary, convert back to RGB color space:
I_LM,XYZ_X = X_L_white · f^(-1)((I_LM,Lab_L + 16)/116 + I_LM,Lab_a/500)
I_LM,XYZ_Y = Y_L_white · f^(-1)((I_LM,Lab_L + 16)/116)
I_LM,XYZ_Z = Z_L_white · f^(-1)((I_LM,Lab_L + 16)/116 - I_LM,Lab_b/200)
In these formulas, I_LM,XYZ_X, I_LM,XYZ_Y and I_LM,XYZ_Z are the X, Y, Z component values in CIE XYZ color space of the left view after brightness adjustment. A_L is the transformation matrix from RGB color space to CIE XYZ, computed from the absorption-curve function of the red lens for colors of different wavebands:

A_L = | 0.1840  0.0179  0.0048 |
      | 0.0876  0.0118  0.0018 |
      | 0.0005  0.0012  0.0159 |

[X_L_white  Y_L_white  Z_L_white] is the reference white point of CIE XYZ color space corresponding to the transformation matrix A_L:

[X_L_white  Y_L_white  Z_L_white]^T = A_L × [1  1  1]^T
Then convert from CIE XYZ color space to RGB color space via A_L^(-1):
[I_LM,RGB_R  I_LM,RGB_G  I_LM,RGB_B]^T = A_L^(-1) × [I_LM,XYZ_X  I_LM,XYZ_Y  I_LM,XYZ_Z]^T
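With A_L given explicitly above, the conversion back to RGB is a plain linear solve; a sketch in NumPy, with the reference white computed from A_L × [1 1 1]^T as in the text (the helper name is illustrative):

```python
import numpy as np

# transformation matrix A_L for the red lens, as given in the text
A_L = np.array([
    [0.1840, 0.0179, 0.0048],
    [0.0876, 0.0118, 0.0018],
    [0.0005, 0.0012, 0.0159],
])

# reference white: [Xw Yw Zw]^T = A_L x [1 1 1]^T
white_L = A_L @ np.ones(3)

def xyz_to_rgb(xyz, A):
    # RGB = A^(-1) x XYZ; solving the linear system avoids forming A^(-1) explicitly
    return np.linalg.solve(A, np.asarray(xyz, dtype=float))
```

As a sanity check, converting the reference white back through A_L^(-1) recovers RGB = (1, 1, 1).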
Let the pixel of the finally synthesized dark-purple anaglyph in RGB color space be I_A,RGB, with corresponding R, G, B channel values I_A,RGB_R, I_A,RGB_G and I_A,RGB_B. Take the G and B channel values I_RM,RGB_G and I_RM,RGB_B of the right-view pixel processed as above as the G and B channel values of the anaglyph pixel, and the R channel value I_LM,RGB_R of the processed left-view pixel as the R channel value of the dark-purple anaglyph pixel: [I_A,RGB_R  I_A,RGB_G  I_A,RGB_B] = [I_LM,RGB_R  I_RM,RGB_G  I_RM,RGB_B]. Multiplying each component by 255 yields the pixel color value I_A,RGB of the finally synthesized dark-purple anaglyph.
Performing the above operations on each pixel yields the final dark-purple anaglyph.
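The channel recombination of this final step (R from the processed left view, G and B from the processed right view, each component scaled by 255) can be sketched as follows; the function name is hypothetical:

```python
import numpy as np

def synthesize_anaglyph(left_rgb, right_rgb):
    """Combine processed left/right views (float arrays in [0, 1], shape HxWx3)
    into the dark-purple anaglyph: R from the left view, G and B from the right."""
    out = np.empty_like(left_rgb)
    out[..., 0] = left_rgb[..., 0]    # R channel from the left view
    out[..., 1] = right_rgb[..., 1]   # G channel from the right view
    out[..., 2] = right_rgb[..., 2]   # B channel from the right view
    # scale each component by 255 and quantize to 8-bit pixel values
    return np.clip(np.rint(out * 255.0), 0, 255).astype(np.uint8)
```

For a single pixel with left R = 1.0 and right G = 0.5, B = 0.25, the synthesized value is (255, 128, 64).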
CN201610111418.9A 2016-02-29 2016-02-29 Improve the anaglyphs visual rivalry ameliorative way of color fidelity Active CN105741235B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610111418.9A CN105741235B (en) 2016-02-29 2016-02-29 Improve the anaglyphs visual rivalry ameliorative way of color fidelity

Publications (2)

Publication Number Publication Date
CN105741235A true CN105741235A (en) 2016-07-06
CN105741235B CN105741235B (en) 2019-03-29

Family

ID=56249637

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610111418.9A Active CN105741235B (en) 2016-02-29 2016-02-29 Improve the anaglyphs visual rivalry ameliorative way of color fidelity

Country Status (1)

Country Link
CN (1) CN105741235B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1996386A (en) * 2006-12-28 2007-07-11 钟磊 3D digital image synthesizing method
CN101329761A (en) * 2008-07-25 2008-12-24 北京中星微电子有限公司 Method and apparatus for regulating keystone distortion of a projecting equipment
US20100208044A1 (en) * 2009-02-19 2010-08-19 Real D Stereoscopic systems for anaglyph images

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GENG YUJIE: "Research on Stereoscopic Image Synthesis Methods", China Masters' Theses Full-text Database, Information Science and Technology Series *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107123093A (en) * 2016-12-07 2017-09-01 重庆微标科技股份有限公司 A kind of processing method and processing device of vehicle image
CN109636739A (en) * 2018-11-09 2019-04-16 深圳市华星光电半导体显示技术有限公司 The treatment of details method and device of image saturation enhancing
CN109636739B (en) * 2018-11-09 2020-07-10 深圳市华星光电半导体显示技术有限公司 Detail processing method and device for enhancing image saturation
CN109559692A (en) * 2018-12-11 2019-04-02 惠科股份有限公司 A kind of driving method of display module, drive system and display device
CN114902659A (en) * 2019-12-27 2022-08-12 株式会社索思未来 Image processing apparatus, image processing method, and program
CN114902659B (en) * 2019-12-27 2023-08-15 株式会社索思未来 Image processing apparatus and image processing method
CN112950510A (en) * 2021-03-22 2021-06-11 南京莱斯电子设备有限公司 Large-scene splicing image chromatic aberration correction method
CN112950510B (en) * 2021-03-22 2024-04-02 南京莱斯电子设备有限公司 Large scene spliced image chromatic aberration correction method
CN116245753A (en) * 2022-12-30 2023-06-09 北京华云星地通科技有限公司 Red and blue stereoscopic satellite cloud image generation method, system, electronic equipment and medium
CN116245753B (en) * 2022-12-30 2023-10-03 北京华云星地通科技有限公司 Red and blue stereoscopic satellite cloud image generation method, system, electronic equipment and medium

Similar Documents

Publication Publication Date Title
CN105741235A (en) Visual rivalry improving method of complementary color three-dimensional image for improving color fidelity
CN109495734B (en) Image processing method and apparatus for autostereoscopic three-dimensional display
US9934575B2 (en) Image processing apparatus, method and computer program to adjust 3D information based on human visual characteristics
Terzić et al. Methods for reducing visual discomfort in stereoscopic 3D: A review
CN102831866B (en) Stereoscopic display device and driving method thereof
EP2259601B1 (en) Image processing method, image processing device, and recording medium
US9380284B2 (en) Image processing method, image processing device and recording medium
EP2323416A2 (en) Stereoscopic editing for video production, post-production and display adaptation
WO2010131985A1 (en) Conversion of input image data for different display devices
JP2002528746A (en) A method for recording and viewing stereoscopic images in color using a multi-chrome filter
Weissman et al. A simple method for measuring crosstalk in stereoscopic displays
US9111377B2 (en) Apparatus and method for generating a multi-viewpoint image
Zhu et al. Processing images for red–green dichromats compensation via naturalness and information-preservation considered recoloring
Chen et al. A method of stereoscopic display for dynamic 3D graphics on android platform
US20210208407A1 (en) Head Mounted System with Color Specific Modulation
CN102026012A (en) Generation method and device of depth map through three-dimensional conversion to planar video
KR20200035784A (en) Method and apparatus for processing hologram image data
Woods et al. Characterizing and reducing crosstalk in printed anaglyph stereoscopic 3D images
Güzel et al. ChromaCorrect: prescription correction in virtual reality headsets through perceptual guidance
CN110234004A (en) Display device, display methods and recording medium
JP2015097049A (en) Image processor and image processing method
Mangiat et al. Disparity remapping for handheld 3D video communications
TWI526717B (en) Naked eye stereoscopic display device and method for arranging pixel thereof
JP2012060345A (en) Multi-viewpoint image creation device, multi-viewpoint image creation method and multi-viewpoint image display system
CN102572456B (en) Color correction method for glass-type stereo display device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant