CN110060291A - Three-dimensional apparent distance resolving method considering human factors - Google Patents

Three-dimensional apparent distance resolving method considering human factors Download PDF

Info

Publication number
CN110060291A
CN110060291A CN201910273315.6A
Authority
CN
China
Prior art keywords
dimensional
model
comfort level
dimensional depth
apparent range
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910273315.6A
Other languages
Chinese (zh)
Other versions
CN110060291B (en)
Inventor
权巍
赵云秀
韩成
李华
胡汉平
张超
蒋振刚
杨华民
冯欣
丁莹
姜珊
刘祎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changchun University of Science and Technology
Original Assignee
Changchun University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changchun University of Science and Technology filed Critical Changchun University of Science and Technology
Priority to CN201910273315.6A priority Critical patent/CN110060291B/en
Publication of CN110060291A publication Critical patent/CN110060291A/en
Application granted granted Critical
Publication of CN110060291B publication Critical patent/CN110060291B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20228Disparity calculation for image-based rendering

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The present invention relates to a three-dimensional apparent-distance resolving model based on human factors, characterized in that: the on-screen parallax of the image is extracted and its stereoscopic depth is calculated; secondly, the region of interest of the image is extracted, its parallactic angle, width angle and foreground-background contrast are obtained, and its comfort is calculated with an objective visual-comfort resolving model; then, a subjective experiment is carried out to obtain the apparent-distance values subjectively perceived for the stereo images; finally, the association between the resolved stereoscopic depth and the objective comfort is established from the subjective measurements, yielding an apparent-distance resolving model based on stereoscopic comfort. Because the model is built starting from the human eye, it is more objective and gives better guidance for stereoscopic shooting; by jointly considering stereoscopic depth, stereoscopic visual comfort and other multi-dimensional factors, the evaluation results of the model are more accurate and more credible.

Description

Three-dimensional apparent distance resolving method considering human factors
Technical field
The present invention relates to a three-dimensional apparent distance resolving method that considers human factors, and belongs to the technical fields of computer vision and image processing.
Background art
In recent years stereoscopic display technology and the related hardware have developed rapidly, and stereoscopic video, games and similar content have fully entered everyday life through mobile phones, tablets, televisions, cinema and various head-mounted displays. During stereoscopic viewing, the sense of depth is the main advantage of stereoscopic display over two-dimensional display, so the strength of the depth effect must be controlled appropriately: "false 3D" pictures without any visual impact should be avoided, and so should pictures that make the audience uncomfortable or that cannot be fused stereoscopically at all.
The stereoscopic depth can be resolved from many factors such as the left and right stereo images and the parameters of the shooting and playback environment. However, the stereoscopic effect that users actually perceive through the various display media differs from the resolved stereoscopic depth. At present, within the whole life cycle of stereoscopic content, the stereoscopic effect can only be judged from the audience's viewing experience after final viewing, so timely control of the depth effect cannot be achieved.
The three-dimensional apparent distance is the spatial distance of the stereoscopic content as perceived by the viewer. Unlike the stereoscopic depth, it cannot be computed directly; its influencing factors are complex and varied, and in addition to the factors involved in the stereoscopic-depth resolving model they include factors of the viewer, i.e. human factors. Because a stereoscopic picture is imaged on the retina differently from normal viewing of a real object, problems such as the accommodation-vergence conflict lower the visual comfort during viewing, the viewing experience is worse, and a certain amount of fatigue is produced, which causes the perceived apparent distance to deviate from the resolved stereoscopic depth value. Therefore human factors must be taken into account in order to resolve the three-dimensional apparent distance more accurately.
Summary of the invention
The purpose of the present invention is to provide a three-dimensional apparent-distance resolving model based on human factors. In spatial perception, the apparent distance that people perceive in a virtual environment deviates significantly from the actual depth: although the accuracy of apparent-distance estimation in the real world is about 94%, it drops to roughly 80% on average in a virtual environment, i.e. distances are underestimated or compressed by about 20%, and the perceived apparent distance is further influenced by factors such as parallax, colour and brightness. In view of this, the present invention takes the stereoscopic-depth resolving model as its basis, uses the result of an objective comfort model as human-factor feedback to establish a three-dimensional apparent-distance resolving model, completes the calculation of the apparent distance automatically, and obtains a more accurate apparent distance.
The technical scheme of the present invention is realized as follows: a three-dimensional apparent-distance resolving model based on human factors, characterized in that the on-screen parallax of the image is extracted and its stereoscopic depth is calculated; secondly, the region of interest of the image is extracted, its parallactic angle, width angle and foreground-background contrast are obtained, and its comfort is calculated with the objective visual-comfort resolving model; then a subjective experiment is carried out to obtain the apparent-distance value subjectively perceived for the stereo image; finally, the association between the resolved stereoscopic depth and the objective comfort is established from the subjective measurements, and an apparent-distance resolving model based on stereoscopic comfort is established. The specific steps are as follows:
Step 1: extract the screen parallax of the stereo image pair when it is projected, and calculate its stereoscopic depth according to the following formula:
where SD denotes the resolved stereoscopic depth, e denotes the interpupillary distance of the viewer (usually taken as 6.5 cm), V denotes the distance from the viewer to the playback screen, and Z denotes the on-screen parallax of the stereo image pair during playback;
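As an illustration of step 1 only, the sketch below assumes the standard binocular similar-triangle geometry, in which the on-screen parallax Z, the interpupillary distance e and the viewing distance V determine the depth relative to the screen plane; the function name and the sign convention for Z are assumptions and not necessarily the patent's exact formula.

```python
def stereoscopic_depth(e: float, V: float, Z: float) -> float:
    """Stereoscopic depth SD resolved from the on-screen parallax Z.

    Assumed similar-triangle geometry: Z = e * SD / (V + SD), hence
    SD = Z * V / (e - Z). Positive Z (uncrossed parallax) puts the point
    behind the screen, negative Z puts it in front. All lengths must use
    the same unit, e.g. centimetres (e is usually 6.5 cm).
    """
    if Z >= e:
        raise ValueError("parallax must stay below the interpupillary distance")
    return Z * V / (e - Z)

# Example: e = 6.5 cm, V = 150 cm, Z = 1 cm gives a depth of about 27 cm
# behind the screen.
print(stereoscopic_depth(6.5, 150.0, 1.0))
```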
Step 2: obtain the parallactic angle and the width angle of the foreground region of the stereo image and the contrast between the foreground and background regions, and calculate its objective visual comfort according to the following formula:
VC(D, w, c) = 4.8736 - 0.7084D + 0.1912ln(w) - 0.0208D·ln(w) + 0.0015c² - 0.0572c (0.50 ≤ D ≤ 2.00, 0.25 ≤ w ≤ 4.00)
where D denotes the parallactic angle of the foreground region, w denotes the width angle of the foreground region, and c denotes the contrast between the foreground and background regions. The parallactic angle is required to lie in the range 0.5° to 2.0° and the width angle in the range 0.25° to 4.0°; VC is the objective stereoscopic visual comfort;
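For reference, the sketch below transcribes the comfort polynomial directly; the function name and the range check are assumptions, while the coefficients are exactly those stated above.

```python
import math

def objective_visual_comfort(D: float, w: float, c: float) -> float:
    """Objective stereoscopic visual comfort VC(D, w, c).

    D: parallactic angle of the foreground region, 0.50 <= D <= 2.00 (degrees)
    w: width angle of the foreground region, 0.25 <= w <= 4.00 (degrees)
    c: contrast between the foreground and background regions
    """
    if not (0.50 <= D <= 2.00 and 0.25 <= w <= 4.00):
        raise ValueError("D or w lies outside the model's validity range")
    return (4.8736 - 0.7084 * D + 0.1912 * math.log(w)
            - 0.0208 * D * math.log(w) + 0.0015 * c ** 2 - 0.0572 * c)
```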
Step 3: carry out a subjective experiment on the 21 groups of real-shot stereo images, manually measure the apparent distance subjectively perceived by the experimental subjects, collate the data, and obtain the statistical averages.
Step 4: calculate the difference dif between the perceived apparent distance and the stereoscopic depth of each group of images, analyse the relationship between this difference and the comfort VC, and obtain the following linear relationship by curve fitting:
dif = 0.13 - 0.0353*VC
where dif is the difference between the perceived apparent distance (GTPD) and the stereoscopic depth (SD), dif = GTPD - SD, and VC is the objective stereoscopic visual comfort;
Step 5: rearrange the formula to finally obtain the association between apparent distance, stereoscopic depth and stereoscopic comfort, i.e. the resolving model of the apparent distance:
OPD = SD + 0.13 - 0.0353*VC
where OPD denotes the objective apparent distance calculated by the model, SD denotes the stereoscopic depth resolved by the model, and VC denotes the comfort value calculated by the objective stereoscopic visual-comfort model.
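A minimal sketch of the final resolving model of step 5, with SD taken from step 1 and VC from step 2; the function name is an assumption and the fitted constants are used exactly as stated.

```python
def apparent_distance(SD: float, VC: float) -> float:
    """Objective apparent distance: OPD = SD + 0.13 - 0.0353 * VC.

    SD: stereoscopic depth resolved in step 1
    VC: objective stereoscopic visual comfort from step 2
    """
    return SD + 0.13 - 0.0353 * VC
```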
The positive effect of the present invention is that an objective estimation of the apparent distance of a stereo image is realized: by analysing the result of the stereoscopic-depth resolving model and the result of the objective visual-comfort model and associating them with the subjectively perceived apparent distance, a resolving model of the objective three-dimensional apparent distance is established. This effectively saves the labour cost of traditional manual evaluation and removes the dependence on the evaluator; because the model is built starting from the human eye, it is more objective and gives better guidance for stereoscopic shooting; and by jointly considering stereoscopic depth, stereoscopic visual comfort and other multi-dimensional factors, the evaluation results of the model are more accurate and more credible.
Description of the drawings
Fig. 1 is the flow diagram of the present invention.
Fig. 2 is the binocular camera system.
Fig. 3 shows the right views of the experimental groups.
Fig. 4 shows the experimental environment.
Fig. 5 shows the region-of-interest maps of the experimental groups.
Fig. 6 is the trend chart of stereoscopic depth and apparent distance.
Fig. 7 shows the relationship between stereoscopic comfort and the difference between apparent distance and stereoscopic depth.
Fig. 8 is the trend chart of the perceived apparent distance, the stereoscopic depth and the resolved apparent distance.
Specific embodiment
The present invention will be further described with reference to the accompanying drawings and embodiments. In this embodiment, 21 groups of real-shot stereo image pairs are acquired with Daheng industrial cameras, and the model is verified with the stereoscopic image database provided by the Korea Advanced Institute of Science and Technology. The flow chart is shown in Fig. 1, and the specific steps are as follows:
Step 1: image acquisition is carried out with two Daheng MER-310-12UC industrial cameras (as shown in Fig. 2); the camera resolution is 2048 × 1536. The two cameras are of the same model, and basic parameters such as white balance, gain and mode are kept identical. For data acquisition the cameras are first connected to the computer for signal transmission and mounted on a stable horizontal tripod; after assembly the alignment is corrected, and the pictures of the two cameras are monitored on the display through the software supplied with the Daheng cameras. The aperture, focal length, focus and other parameters of the two cameras are synchronized, and shooting is performed synchronously with the baseline set to the human interpupillary distance (5.5-7.5 cm; 6.5 cm is chosen in this experiment). During stereo shooting the scene is mostly arranged in two layers and the picture is kept simple, so that a good stereoscopic effect is highlighted through the layering. Fig. 3 shows the right views of the 21 collected image groups.
Step 2: fifteen experimental subjects aged between 19 and 28 are selected. Their ages lie within a reasonable range, they have normal visual acuity, and they have no visual disorders such as colour blindness or anomalous trichromatism. They have no adverse reaction when wearing stereo glasses during viewing, can accurately and stably judge the position of the perceived stereoscopic content, and can determine the perceived apparent distance while viewing. The subjects are also distinguished according to whether they have experience in observing stereoscopic content and whether they understand the relevant professional knowledge: subjects with such experience can carry out the subjective evaluation better, whereas the other observers, lacking this experience, first receive a series of training sessions, after which they can determine the subjective judgement results more accurately.
Step 3: stereo images are viewed with an NVIDIA Quadro K620 graphics card, a Samsung 2233RZ 3D display, and NVIDIA wireless time-division (shutter) stereo glasses. The viewing distance is 1.5 metres, about 5 times the display height; the horizontal (50 cm) and vertical (30 cm) viewing angles are 18.96° and 8.58° respectively. The experimental environment follows the ITU-R BT.500-11 and ITU-R BT.1438 recommendations; to keep the viewers' eyes comfortable during the test, an eye-rest interval is inserted between the playback of every two images. The viewing process imitates a cinema environment with dimmed ambient light; Fig. 4 shows a subject being tested in the darkened room.
Step 4: to obtain the perceived apparent distance, subjective evaluation is used. In the related research, measurement methods are mainly of three types: verbal estimation, visually imagined movement and perceptual matching. Verbal estimation means directly reporting the depth estimate in metres. Visually imagined movement means that the subject observes the target and imagines walking towards it; the imagined time together with the subject's usual walking speed is used to judge the depth. In this embodiment the experiment uses the verbal-estimation method to quantify the perceived apparent distance directly. The apparent distance perceived by each viewer for each image is obtained by manual measurement, the acquired apparent distances are averaged and the data are collated, yielding the apparent-distance values of part of the experimental image pairs shown in Table 1.
Table 1 Apparent distances of part of the image pairs
Step 5: the stereoscopic depth corresponding to each experimental image pair is calculated from parameters such as the parallax during projection, the viewer's interpupillary distance and the viewing distance. The stereoscopic depth of a given frame during playback can be calculated according to the following formula:
where SD denotes the resolved stereoscopic depth, e denotes the interpupillary distance of the viewer (usually taken as 6.5 cm), V denotes the distance from the viewer to the playback screen, and Z denotes the on-screen parallax of the stereo image pair during playback. Table 2 lists part of the resolved stereoscopic-depth values corresponding to the experimental image pairs (shown in Fig. 3).
Table 2 Stereoscopic depths of part of the image pairs
Step 6: obtain the disparity map of each experimental image and calculate its planar saliency map with the GBVS algorithm. Then fuse the disparity map and the planar saliency map according to the following formula to obtain the stereoscopic saliency map:
IIS(x, y) = ω_1·IS_R(x, y) + ω_2·D_R(x, y)
where IS_R(x, y) is the planar saliency map, D_R(x, y) is the disparity map, and ω_1, ω_2 are their weights; this embodiment takes ω_1 = ω_2 = 0.5. To obtain the region of interest, the stereoscopic saliency map is threshold-segmented to obtain the mask image IIM(x, y); the specific method is given by the following formula:
where C(x, y) is the pixel value at (x, y) and T is the segmentation threshold. If C(x, y) > T the pixel belongs to the region of interest and corresponds to the white area of the mask image; otherwise it belongs to the black area. The region-of-interest mask obtained in this way is multiplied with the right view and with the disparity map to obtain the region-of-interest image and the region-of-interest disparity map respectively, and the region of interest is taken as the foreground region. Inverting the region-of-interest mask and multiplying it with the right view gives the background region. Fig. 5 shows the region-of-interest maps corresponding to the experimental images (Fig. 3).
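A minimal sketch of the fusion and thresholding of step 6, assuming the planar saliency map (for example from GBVS) and the disparity map are already available as H × W arrays normalized to [0, 1] and that the right view is an H × W × 3 colour image; the function names and the choice of threshold are assumptions.

```python
import numpy as np

def stereoscopic_saliency(planar_saliency, disparity, w1=0.5, w2=0.5):
    """Fused stereoscopic saliency IIS = w1 * IS_R + w2 * D_R."""
    return w1 * np.asarray(planar_saliency) + w2 * np.asarray(disparity)

def roi_split(right_view, disparity, saliency, threshold):
    """Threshold the fused saliency into an ROI mask and split the right view
    and the disparity map into foreground (ROI) and background parts."""
    mask = np.asarray(saliency) > threshold         # True: region of interest (white)
    foreground = right_view * mask[..., None]       # region-of-interest image
    roi_disparity = disparity * mask                # region-of-interest disparity map
    background = right_view * (~mask)[..., None]    # inverted mask gives the background
    return mask, foreground, roi_disparity, background
```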
Step 7: the objective visual comfort corresponding to each experimental image can be calculated according to the following formula:
VC(D, w, c) = 4.8736 - 0.7084D + 0.1912ln(w) - 0.0208D·ln(w) + 0.0015c² - 0.0572c (0.50 ≤ D ≤ 2.00, 0.25 ≤ w ≤ 4.00)
where D denotes the parallactic angle of the foreground region, w denotes the width angle of the foreground region, and c denotes the contrast between the foreground and background regions. The parallactic angle is required to lie in the range 0.5° to 2.0° and the width angle in the range 0.25° to 4.0°. D, w and c are calculated according to the following formulas:
where D_f is the mean disparity of the foreground region, o_f denotes the foreground region, |o_f| denotes the total number of pixels in o_f, D is the mean parallactic angle of the foreground region, k is the projection magnification that converts the image disparity into the physical parallax on the screen, and L is the viewing distance from the screen.
where N_f denotes the number of horizontal lines in the foreground region, and the lengths of the individual horizontal lines of the foreground region are averaged to give the mean width W; w is the width angle, k is the projection magnification, and L is the viewing distance from the screen.
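As an illustration of how an on-screen extent such as the mean foreground disparity D_f or the mean foreground width W can be converted into a visual angle, the sketch below uses a simple arctangent of the physical extent over the viewing distance; this common approximation is an assumption here and not necessarily the patent's exact expression for D and w.

```python
import math

def visual_angle_deg(extent_px: float, k: float, L: float) -> float:
    """Visual angle in degrees subtended by an on-screen extent.

    extent_px: extent in image pixels (e.g. mean foreground disparity D_f
               or mean foreground width W)
    k:         projection magnification converting pixels to screen length
    L:         viewing distance from the screen, in the same length unit
    """
    return math.degrees(math.atan2(k * extent_px, L))

# Assumed usage: D = visual_angle_deg(D_f, k, L); w = visual_angle_deg(W, k, L)
```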
Each colour channel is quantized to 16 different values, which reduces the number of colours by a factor of 4096. The RGB space is then converted to the Lab space and the set of colour distances between the foreground and background regions is obtained. D_r(r_1, r_2) is the colour distance between regions r_1 and r_2. In the formula, f(c_{k,i}) denotes the frequency with which the i-th colour c_{k,i} appears among all n_k colours of the k-th region r_k, and d(c_{1,i}, c_{2,j}) denotes the colour distance in Lab space between the i-th colour of region 1 and the j-th colour of region 2. The spatially weighted region contrast (i.e. c in the objective comfort model) uses S_r(r_k, r_i), the spatial distance between regions r_k and r_i, and σ_s, which controls the strength of the spatial weighting: the larger σ_s, the weaker the influence of the spatial weight and the more significant the influence of the background region. The spatial distance between two regions is defined as the Euclidean distance between the region centroids, where the pixel coordinates are normalized.
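A sketch of the spatially weighted foreground-background contrast c described above, following the region-contrast scheme the text outlines; the exponential form of the spatial weight, the default value of σ_s and the function name are assumptions.

```python
import numpy as np

def region_contrast(colors_f, freqs_f, colors_b, freqs_b,
                    centroid_f, centroid_b, sigma_s=0.4):
    """Spatially weighted contrast c between the foreground and background regions.

    colors_f, colors_b: (n, 3) arrays of quantized Lab colours of each region
    freqs_f, freqs_b:   (n,) frequencies (probabilities) of those colours
    centroid_f, centroid_b: region centroids in normalized pixel coordinates
    sigma_s: controls the strength of the spatial weighting (assumed default)
    """
    colors_f, colors_b = np.asarray(colors_f), np.asarray(colors_b)
    # Frequency-weighted mean of the pairwise Lab colour distances d(c_1i, c_2j).
    pairwise = np.linalg.norm(colors_f[:, None, :] - colors_b[None, :, :], axis=-1)
    d_r = np.asarray(freqs_f) @ pairwise @ np.asarray(freqs_b)
    # Spatial weight from the Euclidean distance between the region centroids
    # (assumed exponential fall-off controlled by sigma_s).
    s_r = np.linalg.norm(np.asarray(centroid_f) - np.asarray(centroid_b))
    return np.exp(-s_r / sigma_s) * d_r
```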
Table 3 lists the visual comfort (VC) values corresponding to part of the experimental images.
Table 3 Visual comfort values of the images
Step 8: to analyse the relationship between stereoscopic depth and apparent distance, the trend chart of stereoscopic depth and apparent distance in Fig. 6 is examined. It can be observed that the apparent distance perceived by the human eye is largely determined by the stereoscopic depth, but it differs numerically from the stereoscopic depth resolved by the model; human factors are therefore brought into the scope of the apparent-distance calculation. Since stereoscopic visual comfort directly reflects the viewer's experience of stereoscopic content and is also one of the important criteria for evaluating stereo-image quality, this embodiment uses the stereoscopic comfort as the human-factor feedback element in the quantification of the apparent distance. The stereoscopic depth (SD) and the stereoscopic visual comfort (VC) of each image pair are calculated with the stereoscopic-depth resolving model and the objective comfort evaluation model, and are analysed together with the apparent-distance value (GTPD) obtained from the subjective evaluation.
Step 9: to obtain the association between stereoscopic depth, stereoscopic visual comfort and apparent distance, the difference between the apparent distance and the stereoscopic depth (dif = GTPD - SD) is first calculated. As shown in Fig. 7, with increasing visual comfort the difference between the apparent distance actually perceived by the human eye and the resolved stereoscopic depth keeps decreasing. The relationship between the stereoscopic comfort and the difference is analysed further; a scatter plot is built and the fitted curve is obtained as follows:
dif = 0.13 - 0.0353*VC
Step 10: rearranging the formula gives the three-dimensional apparent-distance resolving model shown below:
OPD = SD + 0.13 - 0.0353*VC
Step 11: in this embodiment, five commonly used objective indicators are used to analyse the correlation between the model estimates and the subjective evaluation values. The apparent distances of these images are calculated with the model of the present invention, and then the Pearson linear correlation coefficient (PLCC), the Kendall rank-order correlation coefficient (KRCC), the Spearman rank-order correlation coefficient (SROCC), the mean absolute error (MAE) and the root-mean-square error (RMSE) between the subjectively perceived apparent-distance measurements on one hand and the objectively resolved apparent distance and the corresponding stereoscopic depth on the other are compared. In addition, images with good visual comfort and images with poor visual comfort are selected from the IVY image database as a test set, and the same five indicators are compared for the objectively resolved apparent distance, the corresponding stereoscopic depth and the subjectively perceived apparent distance. The results verify that the apparent-distance values obtained by the method of the present invention are closer to the apparent distance perceived by the human eye than the results of the stereoscopic-depth resolving model, and the performance is better.
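A minimal sketch of the step-11 comparison using SciPy's correlation functions; the function and variable names are assumptions, and GTPD, OPD and SD stand for the arrays of subjectively perceived distances, model outputs and plain stereoscopic depths respectively.

```python
import numpy as np
from scipy import stats

def agreement_metrics(subjective, objective):
    """PLCC, KRCC, SROCC, MAE and RMSE between subjective and objective values."""
    subjective = np.asarray(subjective, dtype=float)
    objective = np.asarray(objective, dtype=float)
    plcc, _ = stats.pearsonr(subjective, objective)
    krcc, _ = stats.kendalltau(subjective, objective)
    srocc, _ = stats.spearmanr(subjective, objective)
    mae = float(np.mean(np.abs(subjective - objective)))
    rmse = float(np.sqrt(np.mean((subjective - objective) ** 2)))
    return {"PLCC": plcc, "KRCC": krcc, "SROCC": srocc, "MAE": mae, "RMSE": rmse}

# e.g. agreement_metrics(GTPD, OPD) versus agreement_metrics(GTPD, SD)
```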

Claims (1)

1. A three-dimensional apparent-distance resolving model based on human factors, characterized in that: the on-screen parallax of the image is extracted and its stereoscopic depth is calculated; secondly, the region of interest of the image is extracted, its parallactic angle, width angle and foreground-background contrast are obtained, and its comfort is calculated with the objective visual-comfort resolving model; then a subjective experiment is carried out to obtain the apparent-distance value subjectively perceived for the stereo image; finally, the association between the resolved stereoscopic depth and the objective comfort is established from the subjective measurements, and an apparent-distance resolving model based on stereoscopic comfort is established; the specific steps are as follows:
Step 1: extract the screen parallax of the stereo image pair when it is projected, and calculate its stereoscopic depth according to the following formula:
where SD denotes the resolved stereoscopic depth, e denotes the interpupillary distance of the viewer, usually taken as 6.5 cm, V denotes the distance from the viewer to the playback screen, and Z denotes the on-screen parallax of the stereo image pair during playback;
Step 2: obtain the parallactic angle and the width angle of the foreground region of the stereo image and the contrast between the foreground and background regions, and calculate its objective visual comfort according to the following formula:
VC(D, w, c) = 4.8736 - 0.7084D + 0.1912ln(w) - 0.0208D·ln(w) + 0.0015c² - 0.0572c (0.50 ≤ D ≤ 2.00, 0.25 ≤ w ≤ 4.00)
where D denotes the parallactic angle of the foreground region, w denotes the width angle of the foreground region, and c denotes the contrast between the foreground and background regions; the parallactic angle is required to lie in the range 0.5° to 2.0° and the width angle in the range 0.25° to 4.0°; VC is the objective stereoscopic visual comfort;
Step 3: carry out a subjective experiment on the 21 groups of real-shot stereo images, manually measure the apparent distance subjectively perceived by the experimental subjects, collate the data and obtain the statistical averages;
Step 4: calculate the difference dif between the perceived apparent distance and the stereoscopic depth of each group of images, analyse the relationship between this difference and the comfort VC, and obtain the following linear relationship by curve fitting:
dif = 0.13 - 0.0353*VC
where dif is the difference between the perceived apparent distance (GTPD) and the stereoscopic depth (SD), dif = GTPD - SD, and VC is the objective stereoscopic visual comfort;
Step 5: rearrange the formula to finally obtain the association between apparent distance, stereoscopic depth and stereoscopic comfort, i.e. the resolving model of the apparent distance:
OPD = SD + 0.13 - 0.0353*VC
where OPD denotes the objective apparent distance calculated by the model, SD denotes the stereoscopic depth resolved by the model, and VC denotes the comfort value calculated by the objective stereoscopic visual-comfort model.
CN201910273315.6A 2019-04-04 2019-04-04 Three-dimensional apparent distance resolving method considering human factors Active CN110060291B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910273315.6A CN110060291B (en) 2019-04-04 2019-04-04 Three-dimensional apparent distance resolving method considering human factors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910273315.6A CN110060291B (en) 2019-04-04 2019-04-04 Three-dimensional apparent distance resolving method considering human factors

Publications (2)

Publication Number Publication Date
CN110060291A (en) 2019-07-26
CN110060291B CN110060291B (en) 2023-01-31

Family

ID=67318414

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910273315.6A Active CN110060291B (en) 2019-04-04 2019-04-04 Three-dimensional apparent distance resolving method considering human factors

Country Status (1)

Country Link
CN (1) CN110060291B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110222757A1 (en) * 2010-03-10 2011-09-15 Gbo 3D Technology Pte. Ltd. Systems and methods for 2D image and spatial data capture for 3D stereo imaging
CN103118265A (en) * 2011-11-16 2013-05-22 克里斯蒂数字系统美国有限公司 A collimated stereo display system
CN102402005A (en) * 2011-12-06 2012-04-04 北京理工大学 Bifocal-surface monocular stereo helmet-mounted display device with free-form surfaces
US20160249037A1 (en) * 2013-10-30 2016-08-25 Tsinghua University Method for acquiring comfort degree of motion-sensing binocular stereoscopic video
US20150229904A1 (en) * 2014-02-10 2015-08-13 Sony Corporation Image processing method, image processing device, and electronic device
CN103986925A (en) * 2014-06-05 2014-08-13 吉林大学 Method for evaluating vision comfort of three-dimensional video based on brightness compensation
GB201419379D0 (en) * 2014-10-31 2014-12-17 Nokia Corp Method for alignment of low-quality noisy depth map to the high-resolution colour image
CN104887316A (en) * 2015-04-24 2015-09-09 长春理工大学 Virtual three-dimensional endoscope displaying method based on active three-dimensional displaying technology
CN106570900A (en) * 2016-10-11 2017-04-19 宁波大学 Three-dimensional image relocation method
CN109167988A (en) * 2018-08-29 2019-01-08 长春理工大学 A kind of stereo image vision comfort level evaluation method based on D+W model and contrast

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YUJI NOJIRI et al.: "Measurement of parallax distribution, and its application to the analysis of visual comfort for stereoscopic HDTV", Stereoscopic Displays and Virtual Reality Systems X *
张玉强: "Stereoscopic effect control algorithm and application for multi-viewpoint virtual scenes", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112153362A (en) * 2020-09-15 2020-12-29 清华大学深圳国际研究生院 Method and system for measuring stereoscopic depth of naked eye 3D display system

Also Published As

Publication number Publication date
CN110060291B (en) 2023-01-31

Similar Documents

Publication Publication Date Title
JP7094266B2 (en) Single-depth tracking-accommodation-binocular accommodation solution
Vienne et al. Depth perception in virtual reality systems: effect of screen distance, environment richness and display factors
CN103096125B (en) Stereoscopic video visual comfort evaluation method based on region segmentation
Mittal et al. Algorithmic assessment of 3D quality of experience for images and videos
US8675045B2 (en) Method of simulating blur in digitally processed images
CN105930821A (en) Method for identifying and tracking human eye and apparatus for applying same to naked eye 3D display
CN104853185A (en) Stereo video comfort evaluation method combining multiple parallaxes with motion
CN103986925B (en) based on the stereoscopic video visual comfort evaluation method of luminance compensation
CN106973288B (en) A kind of three-dimensional video-frequency Comfort Evaluation method and device
Cooper et al. The perceptual basis of common photographic practice
JP2010531102A (en) Method and apparatus for generating and displaying stereoscopic image with color filter
US20180288405A1 (en) Viewing device adjustment based on eye accommodation in relation to a display
CN108449596A (en) A kind of 3D stereo image quality appraisal procedures of fusion aesthetics and comfort level
US20180249148A1 (en) Wide-angle stereoscopic vision with cameras having different parameters
CN207589060U (en) A kind of naked-eye stereoscopic display device of combination visual fatigue detection
Vaziri et al. Egocentric distance judgments in full-cue video-see-through vr conditions are no better than distance judgments to targets in a void
Campagnoli et al. Explicit and implicit depth-cue integration: evidence of systematic biases with real objects
CN110060291A (en) It is a kind of consider people because stereopsis in distance calculation method
Kim et al. Quality assessment of perceptual crosstalk on two-view auto-stereoscopic displays
CN108259888A (en) The test method and system of stereo display effect
CN109167988B (en) Stereo image visual comfort evaluation method based on D + W model and contrast
CN104883577B (en) A kind of three-dimensional video-frequency comfort level Enhancement Method adjusted based on parallax change continuity
CN206650798U (en) The test system of stereo display effect
CN109031667B (en) Virtual reality glasses image display area transverse boundary positioning method
Yang et al. 3-D visual discomfort assessment considering optical and neural attention models

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant