TW201730627A - Light field display metrology - Google Patents

Light field display metrology

Info

Publication number
TW201730627A
TW201730627A (Application TW105135772A)
Authority
TW
Taiwan
Prior art keywords
display
calibration
camera
color
depth
Prior art date
Application number
TW105135772A
Other languages
Chinese (zh)
Other versions
TWI648559B (en)
Inventor
Ivan L. Yeoh
Lionel E. Edwin
Sam Miller
Original Assignee
Magic Leap, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magic Leap, Inc.
Publication of TW201730627A
Application granted
Publication of TWI648559B

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/14Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/22Measuring arrangements characterised by the use of optical techniques for measuring depth
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/46Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J3/50Measurement of colour; Colour measuring devices, e.g. colorimeters using electric radiation detectors
    • G01J3/506Measurement of colour; Colour measuring devices, e.g. colorimeters using electric radiation detectors measuring the colour produced by screens, monitors, displays or CRTs
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/003Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/006Electronic inspection or testing of displays and display drivers, e.g. of LED or LCD displays
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/144Processing image signals for flicker reduction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/324Colour aspects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/327Calibration thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/383Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/011Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/011Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10052Images from lightfield camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/20Linear translation of a whole image or part thereof, e.g. panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/60Rotation of a whole image or part thereof
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/0233Improving the luminance or brightness uniformity across the screen
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/0242Compensation of deficiencies in the appearance of colours
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/0271Adjustment of the gradation levels within the range of the gradation scale, e.g. by redistribution or clipping
    • G09G2320/0276Adjustment of the gradation levels within the range of the gradation scale, e.g. by redistribution or clipping for the purpose of adaptation to the characteristics of a display device, i.e. gamma correction
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/028Improving the quality of display appearance by changing the viewing angle properties, e.g. widening the viewing angle, adapting the viewing angle to the view direction
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/029Improving the quality of display appearance by monitoring one or more pixels in the display panel, e.g. by monitoring a fixed reference pixel
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0626Adjustment of display parameters for control of overall brightness
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0666Adjustment of display parameters for control of colour parameters, e.g. colour temperature
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0673Adjustment of display parameters for control of gamma adjustment, e.g. selecting another gamma curve
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0693Calibration of display systems
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0464Positioning
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/16Calculation or use of calculated indices related to luminance levels in display data
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/2092Details of a display terminals using a flat panel, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/3406Control of illumination source
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/388Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
    • H04N13/395Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes

Abstract

Examples of a light field metrology system for use with a display are disclosed. The light field metrology system may capture images of a projected light field and determine focus depths (or lateral focus positions) for various regions of the light field using the captured images. The determined focus depths (or lateral positions) may then be compared with the intended focus depths (or lateral positions) to quantify the imperfections of the display. Based on the measured imperfections, an appropriate error correction may be performed on the light field to compensate for them. The display can be an optical display element in a head-mounted display, for example, an optical display element capable of generating multiple depth planes, or a light field display.
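The measure-compare-correct loop the abstract describes can be sketched in a few lines. This is a minimal illustration with made-up numbers, not data from the patent: per-region measured focus depths are compared with the intended depth, and each region is pre-compensated by the negative of its measured error.

```python
# A minimal sketch (illustrative values only) of light field metrology:
# measure per-region focus depths, quantify deviation from the intended
# depth, and derive a per-region correction.

INTENDED_DEPTH_DPT = 2.0  # the display intends a single 2.0-diopter plane

# Hypothetical measured focus depths (diopters) for a 2x3 grid of regions.
measured = [
    [2.00, 2.03, 1.98],
    [2.01, 2.00, 2.02],
]

def depth_error_map(measured, intended):
    """Per-region deviation of measured focus depth from the intended depth."""
    return [[m - intended for m in row] for row in measured]

def apply_correction(measured, error):
    """Pre-compensate each region by the negative of its measured error."""
    return [
        [m - e for m, e in zip(mrow, erow)]
        for mrow, erow in zip(measured, error)
    ]

errors = depth_error_map(measured, INTENDED_DEPTH_DPT)
corrected = apply_correction(measured, errors)
```

After correction, every region sits at the intended depth; in practice the correction would be folded into the rendered light field rather than applied to the measurement.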

Description

Optical metrology system for detecting light field defects generated by a display

[Cross-Reference to Related Applications]

This application claims priority to U.S. Patent Application No. 62/250,925, filed November 4, 2015, entitled LIGHT FIELD ERROR CORRECTION; U.S. Patent Application No. 62/278,779, filed January 14, 2016, entitled LIGHT FIELD ERROR CORRECTION; U.S. Patent Application No. 62/250,934, filed November 4, 2015, entitled AUTOMATED CALIBRATION IMAGE PROJECTION AND CAPTURE FOR DISPLAY CALIBRATION; U.S. Patent Application No. 62/278,824, filed January 14, 2016, entitled DYNAMIC CALIBRATION OF A DISPLAY BASED ON EYE-TRACKING; and U.S. Patent Application No. 62/278,794, filed January 14, 2016, entitled CHROMATIC BALANCING A DISPLAY HAVING VARYING CHROMATICITY ACROSS A FIELD OF VIEW. The contents of all of these U.S. applications are incorporated herein by reference.

This disclosure relates to virtual reality and augmented reality imaging and visualization systems, and more particularly to metrology systems for measuring and calibrating the optical properties of such imaging and visualization systems. Also disclosed herein is dynamic calibration of virtual reality and augmented reality imaging and visualization systems based on eye tracking.

Modern computing and display technologies have facilitated the development of systems for so-called "virtual reality" and "augmented reality" experiences, in which digitally reproduced images, or portions of them, are presented to a user in a manner in which they seem to be, or may be perceived as, real. A virtual reality ("VR") scenario typically involves the presentation of digital or virtual image information without transparency to actual real-world visual input. An augmented reality ("AR") scenario typically involves the presentation of digital or virtual image information as an augmentation of the visualization of the real environment around the user. A mixed reality ("MR") scenario involves merging the real world and the virtual world to produce a new environment in which real-world objects and virtual-world objects co-exist and interact in real time. The human visual perception system has proven to be complex, so producing VR, AR, and MR technology that presents virtual image elements comfortably, naturally, and richly among other virtual or real-world image elements is extremely challenging. The systems and methods of the present disclosure address various difficulties arising in virtual reality, augmented reality, and mixed reality.

Embodiments of an imaging system include a projection device for projecting an image toward a viewer's eye, the image comprising a light field of a virtual object, where the virtual object is configured to be projected as if located at one or more intended focus depths, and a light field metrology device for detecting imperfections in the light field. The light field metrology device may be configured to capture one or more images corresponding to a portion of the light field, analyze the one or more captured images to identify one or more perceived focus depths at which the corresponding portions of the light field are in focus, create a depth map based at least in part on the identified focus depths, and compare the created depth map with the one or more intended focus depths. The system can generate a correction for dynamically calibrating the spatial and/or chromatic imperfections of a wearable display system.
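The metrology steps in this embodiment can be illustrated with a focus-sweep sketch. This is an assumed-data example, not the patent's implementation: the capture camera sweeps through candidate focus depths, each region's sharpness is scored per captured image, and the depth at which a region is sharpest is taken as its perceived focus depth, yielding a depth map to compare against the intended depth.

```python
# A sketch, under assumed data, of identifying perceived focus depths by
# focus sweep: for each region, the candidate depth whose captured image
# scores highest on sharpness is that region's perceived focus depth.

SWEEP_DEPTHS = [0.5, 1.0, 1.5, 2.0, 2.5]  # candidate focus depths (diopters)

# Hypothetical sharpness scores: sharpness[region][i] is the focus score of
# a region in the image captured at SWEEP_DEPTHS[i].
sharpness = {
    "upper-left":  [0.2, 0.5, 0.9, 0.6, 0.3],   # sharpest at 1.5 dpt
    "lower-right": [0.1, 0.3, 0.6, 0.8, 0.4],   # sharpest at 2.0 dpt
}

def perceived_depth(scores, depths):
    """Depth at which this region's captured image is sharpest."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return depths[best]

intended = 2.0  # the display intended every region at 2.0 diopters

# Depth map: per-region perceived focus depth.
depth_map = {r: perceived_depth(s, SWEEP_DEPTHS) for r, s in sharpness.items()}

# Imperfections: regions whose perceived depth deviates from the intended one.
defects = {r: d - intended for r, d in depth_map.items() if d != intended}
```

A real system would compute sharpness from image gradients or contrast in each region rather than use precomputed scores, but the map-then-compare structure is the same.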

The details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages will become apparent from the description, the drawings, and the claims. It should be noted that neither this summary nor the following detailed description is intended to define or limit the scope of the inventive subject matter.

100‧‧‧擴增實境場景 100‧‧‧Augmented reality scene

110‧‧‧真實環境公園化設定 110‧‧‧Real environment park setting

120‧‧‧混凝土平台 120‧‧‧Concrete platform

130‧‧‧機器人雕像 130‧‧‧Robot statue

140‧‧‧卡通虛擬角色 140‧‧‧ Cartoon virtual character

200‧‧‧顯示系統 200‧‧‧Display system

204‧‧‧觀看者、穿戴者、使用者 204‧‧‧ Viewers, wearers, users

208‧‧‧顯示器 208‧‧‧ display

212‧‧‧框架 212‧‧‧Frame

216‧‧‧揚聲器 216‧‧‧Speaker

220‧‧‧操作地耦接 220‧‧‧Operating coupling

224‧‧‧局部處理和數據模組 224‧‧‧Local Processing and Data Modules

228‧‧‧遠程處理模組 228‧‧‧Remote processing module

232‧‧‧遠程數據儲存庫 232‧‧‧Remote Data Repository

236、240‧‧‧通訊線路 236, 240‧‧‧ communication lines

302、304‧‧‧眼睛 302, 304‧‧‧ eyes

306‧‧‧平面 306‧‧‧ Plane

400‧‧‧顯示系統 400‧‧‧Display system

405‧‧‧波導疊層組件 405‧‧‧Wave laminate assembly

410‧‧‧眼睛 410‧‧‧ eyes

420、422、424、426、428‧‧‧波導 420, 422, 424, 426, 428‧‧ ‧Band

430、432、434、436‧‧‧特徵(透鏡) 430, 432, 434, 436‧‧ ‧ features (lens)

438‧‧‧補償透鏡層 438‧‧‧Compensation lens layer

440、442、444、446、448‧‧‧影像成形裝置 440, 442, 444, 446, 448‧ ‧ image forming devices

450‧‧‧控制器 450‧‧‧ Controller

452、454‧‧‧成像系統 452, 454‧‧‧ imaging system

456‧‧‧世界 456‧‧‧World

466‧‧‧使用者輸入裝置 466‧‧‧User input device

460、462、464、466、468‧‧‧光提取光學元件 460, 462, 464, 466, 468‧ ‧ light extraction optics

500‧‧‧眼動追踪相機 500‧‧‧ eye tracking camera

505‧‧‧光源 505‧‧‧Light source

510‧‧‧邊緣 Edge of 510‧‧

515‧‧‧出射光束 515‧‧‧Output beam

604‧‧‧主平面波導 604‧‧‧Main Plane Waveguide

612‧‧‧分佈平面波導 612‧‧‧Distributed planar waveguide

608、616‧‧‧繞射光學元件(DOE) 608, 616‧‧‧Diffractive Optical Elements (DOE)

620‧‧‧彩色光源 620‧‧‧Color light source

624‧‧‧光纖 624‧‧‧ fiber optic

628‧‧‧中空管 628‧‧‧ hollow tube

632‧‧‧懸臂 632‧‧‧ cantilever

636‧‧‧驅動電子 636‧‧‧Drive Electronics

640‧‧‧導線 640‧‧‧ wire

644‧‧‧元件 644‧‧‧ components

648‧‧‧鏡面 648‧‧‧Mirror

702‧‧‧校準圖案(棋盤方格圖案) 702‧‧‧ calibration pattern (checkerboard checkered pattern)

704‧‧‧光場影像 704‧‧‧Light field imagery

802‧‧‧預期位置 802‧‧‧ expected location

804‧‧‧檢測位置 804‧‧‧Detection location

806‧‧‧線 806‧‧‧ line

900‧‧‧預期影像位置 900‧‧‧ Expected image location

900a‧‧‧顯示影像位置 900a‧‧‧Display image location

901‧‧‧平移向量 901‧‧‧ translation vector

902、904‧‧‧中心位置 902, 904‧‧‧ central location

906‧‧‧顯示影像 906‧‧‧Display image

907‧‧‧旋轉量 907‧‧‧Rotation

908‧‧‧中心點 908‧‧‧ center point

910‧‧‧位置 910‧‧‧ position

912‧‧‧影像 912‧‧ images

913‧‧‧縮放量 913‧‧‧Zoom

914、922‧‧‧期望影像 914, 922‧‧‧ Expected images

916、920‧‧‧顯示影像 916, 920‧‧‧ display images

917‧‧‧縮放量 917‧‧‧Zoom

918‧‧‧預期影像 918‧‧‧ Expected images

1002‧‧‧投影深度平面 1002‧‧‧Drop depth plane

1004‧‧‧預期深度平面 1004‧‧‧Expected depth plane

1006‧‧‧旋轉軸線 1006‧‧‧Rotation axis

1102、1104‧‧‧區域 1102, 1104‧‧‧ area

1202‧‧‧位置 1202‧‧‧Location

1204‧‧‧模式 1204‧‧‧ mode

1206、1208‧‧‧輝度值 1206, 1208‧‧‧ luminance values

1302、1304‧‧‧強度分佈 1302, 1304‧‧‧ intensity distribution

1400‧‧‧繪製圖 1400‧‧‧ Drawing

1402‧‧‧紅色層 1402‧‧‧Red layer

1404‧‧‧藍色層 1404‧‧‧Blue layer

1406‧‧‧綠色層 1406‧‧‧Green layer

1408‧‧‧平均輝度 1408‧‧‧ average brightness

1410‧‧‧最大輝度(平均+最大誤差) 1410‧‧‧Maximum brightness (average + maximum error)

1412‧‧‧最小輝度(平均-最大誤差) 1412‧‧‧ Minimum brightness (average-maximum error)

1500‧‧‧曲線圖 1500‧‧‧Curve

1502‧‧‧紅色層 1502‧‧‧ red layer

1504‧‧‧藍色層 1504‧‧‧Blue layer

1506‧‧‧綠色層 1506‧‧‧Green layer

1600‧‧‧流程圖 1600‧‧‧flow chart

1602‧‧‧相機校準 1602‧‧‧ Camera calibration

1604‧‧‧空間誤差 1604‧‧‧ Spatial error

1604a‧‧‧XY中心 1604a‧‧‧XY Center

1604b‧‧‧聚集旋轉 1604b‧‧‧Gathering rotation

1604c‧‧‧聚集縮放 1604c‧‧‧Gathering zoom

1604d‧‧‧空間映射 1604d‧‧‧ Spatial mapping

1606‧‧‧色彩誤差 1606‧‧‧Color error

1606a‧‧‧輝度平坦化 1606a‧‧ ‧ Brightness flattening

1606b‧‧‧色彩平衡 1606b‧‧‧Color balance

1702、1710‧‧‧物件 1702, 1710‧‧‧ objects

1708‧‧‧距離 1708‧‧‧Distance

1706、1712‧‧‧光線 1706, 1712‧‧‧ rays

1800‧‧‧測量系統 1800‧‧‧Measurement system

1802‧‧‧顯示器 1802‧‧‧ display

1804‧‧‧光線 1804‧‧‧Light

1806‧‧‧照相機 1806‧‧‧ camera

1808‧‧‧控制器 1808‧‧‧ Controller

1900‧‧‧影像 1900‧‧ images

1902、1904‧‧‧區域 1902, 1904‧‧‧ areas

1910‧‧‧深度圖 1910‧‧Deep map

1912‧‧‧測量焦深 1912‧‧‧Measured depth of focus

1914‧‧‧焦點深度 1914‧‧‧Focus of depth

1916、1918‧‧‧區域 1916, 1918‧‧‧ Area

1920‧‧‧深度圖 1920‧‧ depth map

1922‧‧‧預期深度位置 1922‧‧‧ Expected depth position

1924‧‧‧聚焦深度 1924‧‧‧ Depth of focus

2001‧‧‧程序 2001‧‧‧Program

2002‧‧‧設定聚焦深度 2002‧‧‧Set focus depth

2004‧‧‧擷取聚焦深度處的虛擬目標圖案的影像 2004‧‧‧ Capture images of virtual target patterns at depth of focus

2006‧‧‧更多聚焦深度 2006‧‧‧More depth of focus

2008‧‧‧改變聚焦深度 2008‧‧‧Change the depth of focus

2010‧‧‧識別目標圖案在不同區域聚焦的深度 2010‧‧‧ Identify the depth of focus of the target pattern in different areas

2012‧‧‧根據識別的聚焦深度來創建深度圖 2012‧‧‧Create a depth map based on the identified depth of focus

2014‧‧‧深度圖與期望的聚焦深度作比較 2014‧‧‧ depth map compared to expected depth of focus

2016‧‧‧執行誤差校正 2016‧‧‧Performance error correction

2150‧‧‧方法 2150‧‧‧ method

2160‧‧‧獲取顯示器影像 2160‧‧‧Get the monitor image

2170‧‧‧契合全局變換參數 2170‧‧‧Compatible with global transformation parameters

2180‧‧‧契合局部變換參數 2180‧‧‧Conforming to local transformation parameters

2190‧‧‧變換參數儲存在與顯示器相關聯的儲存器中 2190‧‧‧Transform parameters are stored in the memory associated with the display

2200‧‧‧校準系統 2200‧‧‧ calibration system

2202‧‧‧display

2204‧‧‧calibration pattern

2206‧‧‧light field

2208‧‧‧camera

2302‧‧‧feature points

2304‧‧‧check box

2306‧‧‧pixel

2308‧‧‧arrow

2400‧‧‧process

2402‧‧‧project image

2404‧‧‧capture image of the projected image

2406‧‧‧calculate distortion

2408‧‧‧more positions

2410‧‧‧project image at new position

2412‧‧‧generate distortion map

2414‧‧‧perform error correction

2500‧‧‧display

2503i1, 2503i2, 2503i3‧‧‧wavelengths

2505‧‧‧waveguide

2505a, 2505b‧‧‧major surfaces

2507‧‧‧light input coupling element

2509‧‧‧light output coupling element

2511‧‧‧light distribution element

2513‧‧‧wavelength-selective filter

2600‧‧‧dynamic calibration system

2602‧‧‧points

2602a, 2602b‧‧‧positions

2610‧‧‧dynamic calibration processor

2700‧‧‧method

2710‧‧‧track eye to determine eye position

2720‧‧‧access calibration for the display based on the determined eye position

2730‧‧‧apply calibration to the display to correct spatial and/or color defects in the display

2805‧‧‧processing flow

2810‧‧‧eye-proxy camera calibration system

2820‧‧‧generate calibration for each eye-proxy camera grid point

2830‧‧‧store calibrations in retrievable memory (dynamic calibration LUT)

2840‧‧‧eye tracking system

2850‧‧‧fetch calibration from storage based on eye perspective position

2860‧‧‧apply calibration to the display

2870‧‧‧user views the calibrated display
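The dynamic-calibration flow labeled 2805 above (items 2810-2870) can be sketched in code. This is a hedged illustration only, not the implementation disclosed here: the grid positions, the translation-only correction, and the helper names (`generate_calibration`, `fetch_calibration`, `apply_calibration`) are all hypothetical stand-ins.

```python
# Hypothetical sketch of flow 2805: a calibration is generated for each
# eye-proxy camera grid point (2820), stored in a LUT (2830), fetched by
# the tracked eye position (2840/2850), and applied to the display (2860).

GRID = [(x, y) for x in (-4.0, 0.0, 4.0) for y in (-4.0, 0.0, 4.0)]  # eye-proxy positions, mm

def generate_calibration(grid_point):
    """Stand-in for the eye-proxy camera calibration system (2810/2820);
    returns a per-position spatial correction (here just a 2D pixel shift)."""
    x, y = grid_point
    return {"dx": 0.1 * x, "dy": 0.1 * y}

lut = {gp: generate_calibration(gp) for gp in GRID}  # 2830: dynamic calibration LUT

def fetch_calibration(eye_pos):
    """2850: fetch the calibration of the grid point nearest the tracked eye position."""
    nearest = min(lut, key=lambda gp: (gp[0] - eye_pos[0]) ** 2 + (gp[1] - eye_pos[1]) ** 2)
    return lut[nearest]

def apply_calibration(image_points, cal):
    """2860: apply the spatial correction to rendered image points."""
    return [[x + cal["dx"], y + cal["dy"]] for x, y in image_points]

eye_pos = (3.2, -0.5)              # 2840: reported by the eye-tracking system
cal = fetch_calibration(eye_pos)   # nearest grid point is (4.0, 0.0)
corrected = apply_calibration([[100.0, 200.0]], cal)  # shifted by (0.4, 0.0) px
```

A real system might interpolate between neighboring grid calibrations rather than snap to the nearest grid point; the nearest-neighbor lookup keeps the sketch short.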

FIG. 1 is a schematic diagram of a user viewing an augmented reality scene with an augmented reality (AR) device.

FIG. 2 is a schematic diagram of an embodiment of a wearable display system.

FIG. 3 is a schematic diagram of simulating three-dimensional imagery using multiple depth planes.

FIG. 4 is a schematic diagram of an embodiment of a stacked waveguide assembly for outputting image information to a user.

FIG. 5 is a schematic diagram of an embodiment of exit beams output by a waveguide.

FIG. 6 is a schematic diagram of an optical system including a waveguide apparatus, an optical coupler subsystem for coupling light to or from the waveguide apparatus, and a control subsystem, used to generate a multi-focal volumetric display, image, or light field.

FIG. 7 is a schematic diagram of an example of distortions produced when a display system projects a calibration pattern.

FIG. 8 is a schematic diagram of an example vector field, generated from one or more captured images, that maps deviations between the expected positions and the actual displayed positions of points in a projected light field.

FIG. 9A is a schematic diagram of an example of in-plane translation spatial error.

FIG. 9B is a schematic diagram of an example of aggregate rotation spatial error.

FIG. 9C is a schematic diagram of an example of aggregate scaling spatial error.

FIG. 9D is a schematic diagram of another example of aggregate scaling spatial error.

FIG. 9E is a schematic diagram of an example of the residual spatial error remaining after correcting XY translation, rotation, and scaling.

FIG. 10A is a schematic diagram of an example of viewing multiple depth planes at different depths.

FIGS. 10B-10D are schematic diagrams of examples of out-of-plane spatial errors produced when observing a projected depth plane.

FIG. 11 is a schematic diagram of an example captured image of a projected test image.

FIG. 12A is a schematic diagram of an intensity histogram generated from a captured image of a projected test image.

FIG. 12B is a schematic diagram of an intensity profile generated from a captured image of a projected test image.

FIG. 13 is a schematic intensity histogram illustrating differences among the mode, the median, and the mean.

FIG. 14A is a schematic diagram of red-green-blue (RGB) intensities generated from a captured image of a projected test image.

FIG. 14B is a schematic plot mapping the maximum color imbalance error.

FIG. 15 is a schematic diagram of color-corrected RGB intensities for a display system having red, green, and blue layers of differing intensities.

FIG. 16 is a flowchart of an embodiment of performing an image correction process on a display system.

FIG. 17A is a schematic diagram of an example of viewing an object with a normal light field.

FIG. 17B is a schematic diagram of an example of viewing an object with a defective light field.

FIG. 18 is a schematic diagram of an embodiment of a metrology system for measuring the light field quality of a display.

FIG. 19A is a schematic diagram of an example image captured by a camera focused at a particular focus depth.

FIG. 19B is a schematic depth diagram of an example of performing focus-depth measurements.

FIG. 19C is a schematic depth diagram of an example of capturing one or more generated images.

FIG. 20 is a flowchart of an embodiment of measuring the quality of a virtual target pattern generated by a light field display.

FIG. 21 is a flowchart of an embodiment of a method for calibrating a display.

FIG. 22 is a schematic diagram of an embodiment of a calibration system using a calibration pattern.

FIG. 23A is a schematic diagram of an embodiment of a checkerboard calibration pattern.

FIG. 23B is a schematic diagram of an embodiment of a single-pixel calibration pattern.

FIG. 24 is a schematic flowchart of a process for calibrating a projected light field.

FIG. 25A is a schematic top view of an embodiment of a display including a waveguide, a light input coupling element, a light redistribution element, and a light output coupling element.

FIG. 25B is a schematic cross-sectional view of the display of FIG. 25A along the axis A-A'.

FIG. 26 is a schematic diagram of an embodiment of a dynamic calibration system for a display, in which a calibration may be applied to correct spatial and/or color errors at grid reference positions (shown as dots).

FIG. 27 is a schematic flowchart of a method of dynamically calibrating a display based on eye tracking.

FIG. 28 is a schematic flowchart of the interaction between a factory calibration system and the dynamic calibration system associated with a particular display.

Throughout the drawings, reference numerals may be reused to indicate correspondence between referenced elements. The drawings are provided to illustrate the embodiments described herein and are not intended to limit the scope of the disclosure.

In order for a three-dimensional (3D) display to produce a true sensation of depth, and more specifically, a simulated sensation of surface depth, it may be desirable for each point in the display's field of view to produce an accommodative response corresponding to its virtual depth. If the accommodative response to a displayed point does not correspond to the virtual depth of that point, as determined by the binocular depth cues of vergence and stereopsis, the human eye may experience an accommodation conflict, resulting in unstable imaging, eye strain, headaches, and, in the absence of accommodation information, an almost complete lack of surface depth.

Virtual reality and augmented reality experiences may be provided by a display system having a display in which images corresponding to a plurality of depth planes are provided to a viewer. The images may be different for each depth plane (e.g., providing slightly different presentations of a scene or object) and may be separately focused by the viewer's eyes, thereby helping to provide the user with depth cues based on the accommodation of the eye required to bring into focus different image features of a scene located on different depth planes, and/or based on observing different image features on different depth planes being out of focus. As discussed elsewhere herein, such depth cues provide credible perceptions of depth.

■3D display

FIG. 1 depicts an illustration of an augmented reality scene with certain virtual reality objects and certain actual reality objects viewed by a person. FIG. 1 depicts an augmented reality scene 100, in which a user of AR technology sees a real-world park-like setting 110 featuring people, trees, buildings in the background, and a concrete platform 120. In addition to these items, the user of the AR technology also perceives that he "sees" a robot statue 130 standing on the real-world concrete platform 120, and a flying cartoon avatar character 140 in the form of a bumblebee, even though these elements do not exist in the real world.

In order for a three-dimensional (3D) display to produce a true sensation of depth, and more specifically, a simulated sensation of surface depth, it may be desirable for each point in the display's field of view to produce an accommodative response corresponding to its virtual depth. If the accommodative response to a displayed point does not correspond to the virtual depth of that point, as determined by the binocular depth cues of vergence and stereopsis, the human eye may experience an accommodation conflict, resulting in unstable imaging, eye strain, headaches, and, in the absence of accommodation information, an almost complete lack of surface depth.

Virtual reality, augmented reality, and mixed reality experiences may be provided by a display system having a display in which images corresponding to a plurality of depth planes are provided to a viewer. The images may be different for each depth plane (e.g., providing slightly different presentations of a scene or object) and may be separately focused by the viewer's eyes, thereby helping to provide the user with depth cues based on the accommodation of the eye required to bring into focus different image features of a scene located on different depth planes, and/or based on observing different image features on different depth planes being out of focus. As discussed elsewhere herein, such depth cues provide credible perceptions of depth.

FIG. 2 is a schematic diagram of an example wearable display system 200 that can be used to present a VR, AR, or MR experience to a display system wearer or viewer 204. The display system 200 includes a display 208 and various mechanical and electronic modules and systems to support the functioning of the display 208. The display 208 may be coupled to a frame 212, which is wearable by a display system user, wearer, or viewer 204 and which is configured to position the display 208 in front of the eyes of the wearer 204. The display 208 may be a light field display. In some embodiments, a speaker 216 is coupled to the frame 212 and positioned adjacent the ear canal of the user (in some embodiments, another speaker, not shown, is positioned adjacent the other ear canal of the user to provide stereo/shapeable sound control). The display 208 is operatively coupled 220, such as by a wired lead or wireless connectivity, to a local data processing module 224, which may be mounted in a variety of configurations, such as fixedly attached to the frame 212, fixedly attached to a helmet or hat worn by the user, embedded in headphones, or otherwise removably attached to the user 204 (e.g., in a backpack-style configuration, in a belt-coupling style configuration).

The local processing and data module 224 may include a hardware processor, as well as digital memory, such as non-volatile memory (e.g., flash memory), both of which may be utilized to assist in the processing, caching, and storage of data. The data may include data (a) captured from sensors (which may be, e.g., operatively coupled to the frame 212 or otherwise attached to the user 204), such as image capture devices (e.g., cameras), microphones, inertial measurement units, accelerometers, compasses, GPS units, radio devices, and/or gyroscopes; and/or (b) acquired and/or processed using the remote processing module 228 and/or the remote data repository 232. After such processing or retrieval, the local processing and data module 224 may be operatively coupled by communication links 236 and/or 240, such as via wired or wireless communication links, to the remote processing module 228 and the remote data repository 232, such that these remote modules 228, 232 are available as resources to the local processing and data module 224. In addition, the remote processing module 228 and the remote data repository 232 may be operatively coupled to each other.

In some embodiments, the remote processing module 228 may include one or more processors configured to analyze and process data and/or image information, such as video information captured by an image capture device. The video data may be stored locally in the local processing and data module 224 and/or in the remote data repository 232. In some embodiments, the remote data repository 232 may include a digital data storage facility, which may be available through the internet or other networking configurations in a "cloud" resource configuration. In some embodiments, all data is stored and all computations are performed in the local processing and data module 224, allowing fully autonomous use from any remote module.

The human visual system is complicated, and providing a realistic perception of depth is challenging. Without being limited by theory, it is believed that viewers of an object may perceive the object as being "three-dimensional" due to a combination of vergence and accommodation. Vergence movements of the two eyes relative to each other (i.e., rolling movements of the pupils toward or away from each other to converge the lines of sight of the eyes to fixate upon an object) are closely associated with focusing (or "accommodation") of the lenses of the eyes. Under normal conditions, changing the focus of the lenses of the eyes, or accommodating the eyes, to change focus from one object to another object at a different distance will automatically cause a matching change in vergence to the same distance, under a relationship known as the "accommodation-vergence reflex." Likewise, under normal conditions, a change in vergence will trigger a matching change in accommodation. Display systems that provide a better match between accommodation and vergence may form more realistic or comfortable simulations of three-dimensional imagery.

FIG. 3 is a schematic diagram of an approach for simulating three-dimensional imagery using multiple depth planes. With reference to FIG. 3, objects at various distances from the eyes 302 and 304 on the z-axis are accommodated by the eyes 302 and 304 so that those objects are in focus. The eyes 302 and 304 assume particular accommodated states to bring into focus objects at different distances along the z-axis. Consequently, a particular accommodated state may be said to be associated with a particular one of the depth planes 306, which has an associated focal distance, such that objects or parts of objects at a particular depth are in focus when the eye is in the accommodated state for that depth plane. In some embodiments, three-dimensional imagery may be simulated by providing different presentations of an image for each of the eyes 302 and 304, and also by providing different presentations of the image corresponding to each of the depth planes. While shown as separate for clarity of illustration, it should be understood that the fields of view of the eyes 302 and 304 may overlap, for example, as the distance along the z-axis increases. In addition, while shown as flat for ease of illustration, it should be understood that the contours of a depth plane may be curved in physical space, such that all features in a depth plane are in focus with the eye in a particular accommodated state. Without being limited by theory, it is believed that the human eye typically can interpret a finite number of depth planes to provide depth perception. Consequently, a highly believable simulation of perceived depth may be achieved by presenting, to the eye, different images corresponding to each of this limited number of depth planes.
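The idea that a finite set of depth planes can stand in for continuous depth can be sketched numerically. This is an illustration only; the plane spacings and the `nearest_depth_plane` helper are hypothetical assumptions, not taken from this disclosure. Accommodation is conveniently measured in diopters (D = 1 / distance in meters), and a renderer can assign each virtual object to the closest available plane in dioptric space.

```python
# Illustrative only: hypothetical depth planes expressed in diopters
# (D = 1 / distance in meters).

DEPTH_PLANES_D = [0.33, 1.0, 2.0, 3.0]  # roughly 3 m, 1 m, 0.5 m, 0.33 m

def nearest_depth_plane(object_distance_m):
    """Return the depth plane (in diopters) nearest the object's dioptric distance."""
    d = 1.0 / object_distance_m
    return min(DEPTH_PLANES_D, key=lambda plane: abs(plane - d))

# A virtual object at 0.8 m (1.25 D) is rendered on the 1.0 D plane:
plane = nearest_depth_plane(0.8)
```

Spacing planes uniformly in diopters, rather than in meters, roughly matches how the eye's accommodative tolerance behaves, which is why near planes are packed closer together in metric distance.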

■Waveguide stack assembly

FIG. 4 is a schematic diagram of a stacked waveguide assembly for outputting image information to a user. A display system 400 includes a stack of waveguides, or stacked waveguide assembly 405, that may be utilized to provide three-dimensional perception to the eye 410 or brain using a plurality of waveguides 420, 422, 424, 426, 428. In some embodiments, the display system 400 may correspond to the system 200 of FIG. 2, with FIG. 4 schematically showing some parts of that system 200 in greater detail. For example, in some embodiments, the waveguide assembly 405 may be integrated into the display 208 of FIG. 2.

With continued reference to FIG. 4, the waveguide assembly 405 may also include a plurality of features 430, 432, 434, 436 between the waveguides. In some embodiments, the features 430, 432, 434, 436 may be lenses. In some embodiments, the features 430, 432, 434, 436 may not be lenses; rather, they may be spacers (e.g., cladding layers and/or structures for forming air gaps).

The waveguides 420, 422, 424, 426, 428 and/or the plurality of lenses 430, 432, 434, 436 may be configured to send image information to the eye with various levels of wavefront curvature or light ray divergence. Each waveguide level may be associated with a particular depth plane and may be configured to output image information corresponding to that depth plane. Image forming devices 440, 442, 444, 446, 448 may be utilized to inject image information into the waveguides 420, 422, 424, 426, 428, each of which may be configured to distribute incoming light across each respective waveguide, for output toward the eye 410. Light exits an output surface of the image forming devices 440, 442, 444, 446, 448 and is injected into a corresponding input edge of the waveguides 420, 422, 424, 426, 428. In some embodiments, a single beam of light (e.g., a collimated beam) may be injected into each waveguide to output an entire field of cloned collimated beams that are directed toward the eye 410 at particular angles (and amounts of divergence) corresponding to the depth plane associated with that particular waveguide.

In some embodiments, the image forming devices 440, 442, 444, 446, 448 are discrete displays, each of which produces image information for injection into a corresponding waveguide 420, 422, 424, 426, 428, respectively. In some other embodiments, the image forming devices 440, 442, 444, 446, 448 are the output ends of a single multiplexed display, which may, for example, pipe image information via one or more optical conduits (such as fiber optic cables) to each of the image forming devices 440, 442, 444, 446, 448.

A controller 450 controls the operation of the stacked waveguide assembly 405 and the image forming devices 440, 442, 444, 446, 448. In some embodiments, the controller 450 includes programming (e.g., instructions in a non-transitory computer-readable medium) that regulates the timing and provision of image information to the waveguides 420, 422, 424, 426, 428. In some embodiments, the controller 450 may be a single integral device, or a distributed system connected by wired or wireless communication channels. The controller 450 may be part of the processing module 224 or 228 (shown in FIG. 2) in some embodiments. In some embodiments, the controller may be in communication with an inward-facing imaging system 452 (e.g., a digital camera), an outward-facing imaging system 454 (e.g., a digital camera), and/or a user input device 466. The inward-facing imaging system 452 (e.g., a digital camera) may be used to capture images of the eye 410 to, for example, determine the size and/or orientation of the pupil of the eye 410. The outward-facing imaging system 454 may be used to image a portion of the world 456. The user may input commands to the controller 450 via the user input device 466 to interact with the display system 400.

The waveguides 420, 422, 424, 426, 428 may be configured to propagate light within each respective waveguide by total internal reflection (TIR). The waveguides 420, 422, 424, 426, 428 may each be planar or have another shape (e.g., curved), with major top and bottom surfaces and edges extending between those major top and bottom surfaces. In the illustrated configuration, the waveguides 420, 422, 424, 426, 428 may each include light extracting optical elements 460, 462, 464, 466, 468 configured to extract light out of a waveguide by redirecting the light propagating within each respective waveguide out of the waveguide, to output image information to the eye 410. Extracted light may also be referred to as outcoupled light, and light extracting optical elements may also be referred to as outcoupling optical elements. An extracted beam of light is output by the waveguide at locations at which the light propagating within the waveguide strikes a light redirecting element. The light extracting optical elements 460, 462, 464, 466, 468 may, for example, have reflective and/or diffractive optical features. While illustrated as disposed at the bottom major surfaces of the waveguides 420, 422, 424, 426, 428 for ease of description and drawing clarity, in some embodiments the light extracting optical elements 460, 462, 464, 466, 468 may be disposed at the top and/or bottom major surfaces, and/or may be disposed directly in the volume of the waveguides 420, 422, 424, 426, 428. In some embodiments, the light extracting optical elements 460, 462, 464, 466, 468 may be formed in a layer of material that is attached to a transparent substrate to form the waveguides 420, 422, 424, 426, 428. In some other embodiments, the waveguides 420, 422, 424, 426, 428 may be a monolithic piece of material, and the light extracting optical elements 460, 462, 464, 466, 468 may be formed on a surface and/or in the interior of that piece of material.

With continued reference to FIG. 4, as discussed herein, each waveguide 420, 422, 424, 426, 428 is configured to output light to form an image corresponding to a particular depth plane. For example, the waveguide 420 nearest the eye may be configured to deliver collimated light, as injected into such waveguide 420, to the eye 410. The collimated light may be representative of the optical infinity focal plane. The next waveguide up 422 may be configured to send out collimated light that passes through the first lens 430 (e.g., a concave lens) before it reaches the eye 410. The first lens 430 may be configured to create a slight amount of convex wavefront curvature, so that the eye/brain interprets light coming from that next waveguide up 422 as coming from a first focal plane closer inward toward the eye 410 from optical infinity. Similarly, the third up waveguide 424 passes its output light through both the first lens 430 and the second lens 432 before reaching the eye 410. The combined optical power of the first lens 430 and the second lens 432 may be configured to create another incremental amount of convex wavefront curvature, so that the eye/brain interprets light coming from the third waveguide 424 as coming from a second focal plane that is even closer inward toward the person from optical infinity than was the light from the next waveguide up 422.

The other waveguides (e.g., waveguides 426, 428) and lenses (e.g., lenses 434, 436) are similarly configured, with the highest waveguide 428 in the stack sending its output through all of the lenses between it and the eye, for an aggregate focal power representative of the closest focal plane to the person. To compensate for the stack of lenses 430, 432, 434, 436 when viewing/interpreting light coming from the world 456 on the other side of the stacked waveguide assembly 405, a compensating lens layer 438 may be disposed at the top of the stack to compensate for the aggregate power of the lens stack 430, 432, 434, 436 below. Additional waveguide/lens pairings may be utilized to provide additional perceived focal planes. The light extracting optical elements 460, 462, 464, 466, 468 of the waveguides 420, 422, 424, 426, 428 and the focusing aspects of the lenses 430, 432, 434, 436 may be static (i.e., not dynamic or electro-active). In some alternative embodiments, they may be dynamic using electro-active features.
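The aggregate focal power described above can be illustrated numerically. This is a minimal sketch assuming idealized thin lenses in contact, so that dioptric powers simply add; the specific power values are hypothetical and not taken from this disclosure.

```python
# Hypothetical thin-lens sketch: dioptric powers of stacked lenses add, so
# light from a higher waveguide, which traverses more negative lenses,
# appears to come from a nearer focal plane. Power values are illustrative.

LENS_POWERS_D = [-0.5, -0.5, -1.0, -1.0]  # assumed powers of lenses 430, 432, 434, 436

def aggregate_power(n_lenses_traversed):
    """Net power, in diopters, seen by light passing the first n lenses of the stack."""
    return sum(LENS_POWERS_D[:n_lenses_traversed])

# Waveguide 420 traverses no lens: 0 D, i.e. optical infinity.
# Waveguide 424 traverses lenses 430 and 432: -1.0 D, i.e. a plane at about 1 m.
plane_424 = aggregate_power(2)

# The compensating lens layer (438) cancels the full stack for world light:
compensating_power = -sum(LENS_POWERS_D)  # +3.0 D in this example
```

In this toy model, choosing the compensating power as the negative of the stack's total is exactly what lets real-world light pass through with no net dioptric change.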

With continued reference to FIG. 4, the light extracting optical elements 460, 462, 464, 466, 468 may be configured to both redirect light out of their respective waveguides and to output that light with the appropriate amount of divergence or collimation for the particular depth plane associated with the waveguide. As a result, waveguides having different associated depth planes may have different configurations of light extracting optical elements, which output light with a different amount of divergence depending on the associated depth plane. In some embodiments, as discussed herein, the light extracting optical elements 460, 462, 464, 466, 468 may be volumetric or surface features, which may be configured to output light at specific angles. For example, the light extracting optical elements 460, 462, 464, 466, 468 may be volume holograms, surface holograms, and/or diffraction gratings. Light extracting optical elements, such as diffraction gratings, are described in U.S. Patent Application Publication No. 2015/0178939, filed June 25, 2015, which is incorporated herein by reference in its entirety. In some embodiments, the features 430, 432, 434, 436 may not be lenses; rather, they may simply be spacers (e.g., cladding layers and/or structures for forming air gaps).

In some embodiments, the light extracting optical elements 460, 462, 464, 466, 468 are diffractive features that form a diffraction pattern, or "diffractive optical elements" (DOEs). Preferably, the DOEs have a relatively low diffraction efficiency, so that only a portion of the light of the beam is deflected away toward the eye 410 at each intersection with the DOE, while the rest continues to move through the waveguide via total internal reflection. The light carrying the image information is thus divided into a number of related exit beams that exit the waveguide at a multiplicity of locations, and the result, for this particular collimated beam bouncing around within the waveguide, is a fairly uniform pattern of exit emission toward the eye 410.
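The effect of a low diffraction efficiency can be made concrete with a small calculation. This is an illustration under the stated assumption of a constant per-intersection efficiency, not a claim about any particular DOE: if a fraction eta of the remaining beam exits at each DOE intersection, the k-th exit beam carries eta * (1 - eta)**(k - 1) of the injected power, which falls off only gently when eta is small, yielding the fairly uniform exit pattern described.

```python
# Sketch assuming a constant diffraction efficiency eta at every DOE
# intersection along the waveguide.

def exit_fractions(eta, n_intersections):
    """Fraction of the injected power emitted at each DOE intersection."""
    remaining = 1.0
    fractions = []
    for _ in range(n_intersections):
        fractions.append(eta * remaining)  # portion deflected toward the eye here
        remaining *= 1.0 - eta             # the rest continues by TIR
    return fractions

fracs = exit_fractions(0.05, 10)
# With eta = 0.05 the first exit carries 5% and the tenth about 3.15%:
# a gentle falloff across the exit locations.
```

The geometric falloff also shows the trade-off: a larger eta brightens the first few exits at the expense of uniformity, which is why a relatively low efficiency is preferred here.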

在一些實施例中,一個或多個的DOE可以被切換為開或關的狀態。在開的狀態時它們主動地繞射,反之在關的狀態時它們不顯著繞射。例如,一可切換的DOE可以包括一聚合物分散液晶層,在該層中,微滴包括在主介質內的一衍射圖案,且微滴折射率可以被切換來基本匹配主體材料的折射指數(在這種情況下,光柵不會明顯地繞射入射光),或微滴可以切換為一不匹配主介質折射指數的指數(在這種情況下,光柵主動地繞射入射光)。 In some embodiments, one or more DOEs may be switchable between "on" states in which they actively diffract and "off" states in which they do not significantly diffract. For instance, a switchable DOE may comprise a layer of polymer dispersed liquid crystal, in which microdroplets comprise a diffraction pattern in a host medium, and the refractive index of the microdroplets can be switched to substantially match the refractive index of the host material (in which case the pattern does not appreciably diffract incident light), or the microdroplets can be switched to an index that does not match that of the host medium (in which case the pattern actively diffracts incident light).

在一些實施例中,深度平面的數量和分佈和/或景深可以基於觀看者的眼睛的瞳孔大小和/或取向而動態地改變。在一些實施例中,面向內的成像系統452(例如,數位照相機)可以用於擷取眼睛410的影像,以確定眼睛410瞳孔的尺寸和/或取向。在一些實施例中,面向內的成像系統452可以附接到框架212(如圖2所示),並且可以與處理模組224和/或228通訊聯繫,處理模組224和/或228可以處理來自面向內的成像系統452的影像信息,以確定,例如使用者204的眼睛的瞳孔直徑和/或取向。 In some embodiments, the number and distribution of depth planes and/or depth of field may be varied dynamically based on the pupil size and/or orientation of the eyes of the viewer. In some embodiments, an inward-facing imaging system 452 (e.g., a digital camera) may be used to capture images of the eye 410 to determine the size and/or orientation of the pupil of the eye 410. In some embodiments, the inward-facing imaging system 452 may be attached to the frame 212 (as illustrated in FIG. 2) and may be in communication with the processing modules 224 and/or 228, which may process image information from the inward-facing imaging system 452 to determine, e.g., the pupil diameter and/or orientation of the eyes of the user 204.

在一些實施例中,面向內的成像系統452(例如,數位相機)可以觀察使用者的運動,諸如眼睛運動和臉部運動。面向內的成像系統452可以用於擷取眼睛410的影像以確定眼睛410的瞳孔尺寸和/或取向。面向內的成像系統452可以用於獲得影像以確定使用者正在看的方向(例如,眼姿勢)或用於使用者的生物測定(例如,經由虹膜辨識)。可以分析由面向內的成像系統452獲得的影像來確定使用者的眼睛姿勢和/或情緒,顯示系統400可據以決定應該向使用者呈現哪個音頻或視覺內容。顯示系統400還可以使用諸如慣性測量元件(IMU)、加速度計、陀螺儀等的感測器來確定頭部姿勢(例如,頭部位置或頭部取向)。頭部姿勢可以單獨使用或與眼睛姿勢組合使用來與音軌和/或當前音頻內容相互作用。 In some embodiments, the inward-facing imaging system 452 (e.g., a digital camera) can observe the movements of the user, such as eye movements and facial movements. The inward-facing imaging system 452 may be used to capture images of the eye 410 to determine the pupil size and/or orientation of the eye 410. The inward-facing imaging system 452 can be used to obtain images for determining the direction the user is looking (e.g., eye pose) or for biometric identification of the user (e.g., via iris recognition). The images obtained by the inward-facing imaging system 452 may be analyzed to determine the user's eye pose and/or mood, which can be used by the display system 400 to decide which audio or visual content should be presented to the user. The display system 400 may also determine head pose (e.g., head position or head orientation) using sensors such as inertial measurement units (IMUs), accelerometers, gyroscopes, etc. Head pose may be used alone or in combination with eye pose to interact with audio tracks and/or current audio content.

在一些實施例中,可以為每隻眼睛使用一台照相機,以分別確定每隻眼睛的瞳孔大小和/或方向,從而允許將所要呈現給每隻眼睛的影像信息動態地剪裁傳送到該眼睛。在一些實施例中,可以為每隻眼睛至少使用一台照相機,以獨立地及分別地確定每隻眼睛的瞳孔大小和/或眼睛姿勢,允許將所要呈現給每隻眼睛的影像信息動態地剪裁傳送到該眼睛。在一些其它實施例中,僅確定單眼410的瞳孔直徑和/或取向(例如,每對眼睛僅使用單一台照相機),並且假設觀看者204的兩隻眼睛是相似的。 In some embodiments, one camera may be utilized for each eye to separately determine the pupil size and/or orientation of each eye, thereby allowing the presentation of image information to each eye to be dynamically tailored to that eye. In some embodiments, at least one camera may be utilized for each eye to independently determine the pupil size and/or eye pose of each eye separately, again allowing the image information presented to each eye to be dynamically tailored to that eye. In some other embodiments, the pupil diameter and/or orientation of only a single eye 410 is determined (e.g., using only a single camera per pair of eyes) and the two eyes of the viewer 204 are assumed to be similar.

例如,景深可以與觀看者的瞳孔大小的改變成反比。結果,隨著觀看者眼睛瞳孔的尺寸減小,景深增加,使得原本因位置超出眼睛可辨別的聚焦深度而不可辨別的一個平面,會隨著瞳孔尺寸的減小和相應的景深增加而變得可辨別並更加聚焦。同樣,用於向觀看者呈現不同圖像的間隔開的深度平面的數量可隨著瞳孔大小的縮小而減小。例如,在不調整眼睛遠離一個深度平面到另一深度平面的調節下,觀察者也許不能在一個瞳孔尺寸下清楚地感知第一深度平面和第二深度平面的細節。但是,這兩個深度平面可以在不改變調節的情況下,在另一瞳孔尺寸下同時充分對焦到使用者。 For example, depth of field may change inversely with a viewer's pupil size. As a result, as the sizes of the pupils of the viewer's eyes decrease, the depth of field increases, such that one plane that is not discernible because its location is beyond the depth of focus of the eye may become discernible, and appear more in focus, with the reduction of pupil size and the commensurate increase in depth of field. Likewise, the number of spaced-apart depth planes used to present different images to the viewer may be decreased with decreased pupil size. For example, a viewer may not be able to clearly perceive the details of both a first depth plane and a second depth plane at one pupil size without adjusting the accommodation of the eye away from one depth plane and to the other depth plane. These two depth planes may, however, be sufficiently in focus at the same time to the user at another pupil size without changing accommodation.

在一些實施例中,顯示系統可以根據瞳孔大小和/或取向的確定性,或者根據接收到特定瞳孔大小和/或取向的電性信號指示,來改變接收影像信息的波導的數量。例如,如果使用者的眼睛不能在與兩個波導相關聯的兩個深度平面之間進行區分,則控制器450可以被配置或編程為停止向這些波導中的其中一個提供影像信息。這可以有利地減少系統上的處理負擔,從而增加系統的反應性。在用於波導的DOE可在開啟和關閉狀態之間切換的實施例中,當波導接收影像信息時,DOE可以切換到關閉狀態。 In some embodiments, the display system may vary the number of waveguides receiving image information based upon determinations of pupil size and/or orientation, or upon receiving electrical signals indicative of a particular pupil size and/or orientation. For example, if the user's eyes are unable to distinguish between two depth planes associated with two waveguides, then the controller 450 may be configured or programmed to cease providing image information to one of these waveguides. Advantageously, this may reduce the processing burden on the system, thereby increasing the responsiveness of the system. In embodiments in which the DOEs for a waveguide are switchable between on and off states, the DOE may be switched to the off state when the waveguide receives image information.

在一些實施例中,可能期望出射光束滿足直徑小於觀察者眼睛直徑的條件。然而,鑑於觀看者的瞳孔尺寸的變化性,滿足該條件可能是具有挑戰性的。在一些實施例中,藉由反應於觀察者的瞳孔尺寸的確定性來改變出射光束的尺寸,使在寬範圍的瞳孔尺寸上滿足該條件。例如,隨著瞳孔尺寸減小,出射光束的尺寸也可以減小。在一些實施例中,可以使用可變孔徑來改變出射光束尺寸。 In some embodiments, it may be desirable to have an exit beam meet the condition of having a diameter that is less than the diameter of the eye of a viewer. However, meeting this condition may be challenging in view of the variability in size of the viewer's pupils. In some embodiments, this condition is met over a wide range of pupil sizes by varying the size of the exit beam in response to determinations of the size of the viewer's pupil. For example, as the pupil size decreases, the size of the exit beam may also decrease. In some embodiments, the exit beam size may be varied using a variable aperture.

顯示系統400可以包括對世界456的一部分進行成像的面向外的成像系統454(例如,一台數位相機)。該部分的世界456可以被稱為視場(FOV),並且該成像系統454有時被稱為FOV相機。可供觀看者204觀看或成像的整個區域可以被稱為關注區域(FOR)。FOR可以包括圍繞顯示系統400的4π球面度立體角。在顯示系統400的一些實施例中,FOR可以包括顯示系統400的使用者204周圍的基本上所有的立體角,因為使用者204可以移動他們的頭部和眼睛看到圍繞著使用者的物件(在使用者的前面,後面,上面,下面或側面)。從面向外的成像系統454獲得的影像可以用於追踪使用者做出的手勢(例如,手或手指手勢),檢測在世界456中使用者前面的物件等等。 The display system 400 can include an outward-facing imaging system 454 (e.g., a digital camera) that images a portion of the world 456. This portion of the world 456 may be referred to as the field of view (FOV), and the imaging system 454 is sometimes referred to as an FOV camera. The entire region available for viewing or imaging by the viewer 204 may be referred to as the field of regard (FOR). The FOR may include 4π steradians of solid angle surrounding the display system 400. In some implementations of the display system 400, the FOR may include substantially all of the solid angle around the user 204 of the display system 400, because the user 204 can move their head and eyes to look at objects surrounding the user (in front of, behind, above, below, or on the sides of the user). Images obtained from the outward-facing imaging system 454 can be used to track gestures made by the user (e.g., hand or finger gestures), detect objects in the world 456 in front of the user, and so forth.

顯示系統400可以包括使用者輸入裝置466,通過該使用者輸入裝置466,使用者可以向控制器450輸入命令以與顯示系統400交互作用。例如,使用者輸入裝置466可以包括觸摸板、觸摸屏、操縱桿、多自由度(DOF)控制器、電容感測裝置、遊戲控制器、鍵盤、滑鼠、方向鍵(D-pad)、棒、觸覺裝置、圖騰(例如,具有如虛擬使用者輸入裝置的功用)等等。在某些情況下,使用者可以使用手指(例如,拇指)在觸控輸入裝置上按壓或滑動來向顯示系統400提供輸入(例如,提供使用者輸入到由顯示系統400提供的使用者界面)。在使用顯示系統400期間,使用者可以手持該使用者輸入裝置466,並且該使用者輸入裝置466可以與顯示系統400有線或無線溝通。 The display system 400 can include a user input device 466 by which the user can input commands to the controller 450 to interact with the display system 400. For example, the user input device 466 can include a trackpad, a touchscreen, a joystick, a multiple degree-of-freedom (DOF) controller, a capacitive sensing device, a game controller, a keyboard, a mouse, a directional pad (D-pad), a wand, a haptic device, a totem (e.g., functioning as a virtual user input device), and so forth. In some cases, the user may use a finger (e.g., a thumb) to press or swipe on a touch-sensitive input device to provide input to the display system 400 (e.g., to provide user input to a user interface provided by the display system 400). The user input device 466 may be held by the user's hand during use of the display system 400, and may be in wired or wireless communication with the display system 400.

圖5係為出射光束從波導輸出的實施例。圖中表示出一個波導,但應當理解的是,在疊層波導組件405內的其它波導也有類似功能,其中疊層波導組件405包括多個波導。光505在波導420的輸入端邊緣510被注入到波導420內,並藉由TIR在波導420內傳遞。在光505撞擊DOE 460的各個點處,一部分的光會離開波導作為出射光束515。出射光束515係以基本上平行來表示,但取決於與波導420相關聯的深度平面,它們也可以被重定向以一角度傳遞到眼睛410(例如,形成發散的出射光束)。應該理解的是,基本上平行的出射光束可以表示具有光提取光學元件的波導,該光提取光學元件外耦合光以形成看起來設置在距眼睛410一大距離(例如,光學無窮遠)的深度平面上的影像。其它波導或其他組的光提取光學元件可以輸出更發散的出射光束圖案,這將需要眼睛410調節到一更近的距離以使其在視網膜上聚焦,並且會被大腦解讀為來自一個比光學無窮遠更接近眼睛410的距離的光。 FIG. 5 shows an example of exit beams output by a waveguide. One waveguide is illustrated, but it will be appreciated that other waveguides in the stacked waveguide assembly 405 may function similarly, where the stacked waveguide assembly 405 includes multiple waveguides. Light 505 is injected into the waveguide 420 at the input edge 510 of the waveguide 420 and propagates within the waveguide 420 by TIR. At points where the light 505 impinges on the DOE 460, a portion of the light exits the waveguide as exit beams 515. The exit beams 515 are illustrated as substantially parallel but, depending on the depth plane associated with the waveguide 420, they may also be redirected to propagate to the eye 410 at an angle (e.g., forming divergent exit beams). It will be appreciated that substantially parallel exit beams may be indicative of a waveguide with light extraction optical elements that outcouple light to form images that appear to be set on a depth plane at a large distance (e.g., optical infinity) from the eye 410. Other waveguides or other sets of light extraction optical elements may output a more divergent exit beam pattern, which would require the eye 410 to accommodate to a closer distance to bring it into focus on the retina and which would be interpreted by the brain as light coming from a distance closer to the eye 410 than optical infinity.

圖6係為顯示系統400的另一實施例,其包括一波導裝置,用於將光耦合到波導裝置或從波導裝置耦合光的一光耦合器子系統以及一控制子系統。顯示系統400可以用於產生一多焦矩體積、影像或光場。顯示系統400可以包括一個或多個主平面波導604(在圖6中僅表示出一個)以及包括與至少幾個主波導604中的每一個相關聯的一個或多個DOEs 608。平面波導604可以類似於圖4中所討論的波導420、422、424、426、428。光學系統可以使用分佈波導裝置,以沿著第一軸(圖6的垂直或Y軸)傳播光,並沿著第一軸(例如,Y軸)擴展光的有效出射光瞳。分佈波導裝置,可以,例如包括分佈平面波導612和與分佈平面波導612相關聯的至少一個DOE 616(由雙虛線示出)。分佈平面波導612可以在至少某些方面與主平面波導604相似或相同,具有與其不同的取向。同樣地,至少一個DOE 616可以在至少某些方面與DOE 608相似或相同。例如,分佈平面波導612和/或DOE 616可以分別地由與主平面波導604和/或DOE 608相同的材料來組成。圖6所示的光學系統可以集成到圖2所示的可穿 FIG. 6 shows another embodiment of the display system 400, which includes a waveguide apparatus, an optical coupler subsystem for optically coupling light to or from the waveguide apparatus, and a control subsystem. The display system 400 can be used to generate a multi-focal volumetric image or light field. The display system 400 can include one or more primary planar waveguides 604 (only one is shown in FIG. 6) and one or more DOEs 608 associated with each of at least some of the primary waveguides 604. The planar waveguides 604 can be similar to the waveguides 420, 422, 424, 426, 428 discussed with reference to FIG. 4. The optical system may employ a distribution waveguide apparatus to relay light along a first axis (the vertical or Y-axis in the view of FIG. 6) and to expand the light's effective exit pupil along the first axis (e.g., the Y-axis). The distribution waveguide apparatus may, for example, include a distribution planar waveguide 612 and at least one DOE 616 (illustrated by double dashed lines) associated with the distribution planar waveguide 612. The distribution planar waveguide 612 may be similar or identical in at least some respects to the primary planar waveguide 604, having a different orientation therefrom. Likewise, the at least one DOE 616 may be similar or identical in at least some respects to the DOE 608. For example, the distribution planar waveguide 612 and/or DOE 616 may be comprised of the same materials as the primary planar waveguide 604 and/or DOE 608, respectively.
The optical system shown in FIG. 6 can be integrated into the wearable display system 200 shown in FIG. 2.

該傳播的光和出射光瞳擴展的光從分佈波導裝置光耦合到一個或多個主平面波導604。主平面波導604沿著較佳正交於第一軸的第二軸(例如圖6的水平或X軸)傳遞光。值得注意的是,第二軸線可以是相對於第一軸線的非正交軸線。主平面波導604沿著該第二軸(例如,X軸)擴展光的有效出射光瞳。例如,分佈平面波導612可以沿著垂直或Y軸傳播和擴展光,並將該光傳遞到該主平面波導604,而該主平面波導604再沿著水平或X軸傳播和擴展光。 The relayed and exit-pupil-expanded light is optically coupled from the distribution waveguide apparatus into the one or more primary planar waveguides 604. The primary planar waveguide 604 relays light along a second axis, preferably orthogonal to the first axis (e.g., the horizontal or X-axis in the view of FIG. 6). Notably, the second axis can be a non-orthogonal axis relative to the first axis. The primary planar waveguide 604 expands the light's effective exit pupil along that second axis (e.g., the X-axis). For example, the distribution planar waveguide 612 can relay and expand light along the vertical or Y-axis and pass that light to the primary planar waveguide 604, which relays and expands light along the horizontal or X-axis.

顯示系統400可包括能夠被光耦合到一單模光纖624的近端中的一個或多個彩色光源(例如,紅色,綠色和藍色激光)620。光纖624的末端部分可以通過壓電材料的中空管628串線連接或接收,並且該末端從該中空管628突出形成可固定擺動的柔性懸臂632。壓電管628可與四個象限電極(圖示未揭露)相關聯。例如,電極可以鍍在中空管628的外部、外表面或外週或直徑上。芯電極(圖示未揭露)也位於管628的芯、中心、內週或內徑中。 The display system 400 may include one or more sources of colored light (e.g., red, green, and blue laser light) 620 which may be optically coupled into a proximal end of a single-mode optical fiber 624. A distal end of the optical fiber 624 may be threaded or received through a hollow tube 628 of piezoelectric material, with the end protruding from the tube 628 as a fixed-free flexible cantilever 632. The piezoelectric tube 628 can be associated with four quadrant electrodes (not illustrated). The electrodes may, for example, be plated on the outside, outer surface or outer periphery or diameter of the tube 628. A core electrode (not illustrated) is also located in a core, center, inner periphery or inner diameter of the tube 628.

驅動電子636,例如經由導線640驅動對應的兩電極在兩個獨立的軸線上彎曲壓電管628。光纖624突出的末端尖端具有機械共振模式。共振頻率可以取決於光纖624的直徑、長度和材料特性。在該光纖懸臂632的機械共振之第一模式附近振動壓電管628,使得光纖懸臂632振動,可以掃描大的偏角。 Drive electronics 636, for example electrically coupled via wires 640, drive opposing pairs of electrodes to bend the piezoelectric tube 628 in two axes independently. The protruding distal tip of the optical fiber 624 has mechanical modes of resonance. The frequencies of resonance can depend upon the diameter, length, and material properties of the optical fiber 624. By vibrating the piezoelectric tube 628 near a first mode of mechanical resonance of the fiber cantilever 632, the fiber cantilever 632 is caused to vibrate and can sweep through large deflections.

通過在兩個軸線上激發共振振動,光纖懸臂632的尖端在填充二維(2-D)掃描的區域中進行雙軸掃描。通過與光纖懸臂632的掃描同步地調變光源620的強度,從光纖懸臂632出射的光形成影像。在美國專利申請案第2014/0003762號中提供了這種設置的描述,其通過引用的方式將其全部內容併入本發明中。 By stimulating resonant vibration in two axes, the tip of the fiber cantilever 632 is scanned biaxially in an area that fills a two-dimensional (2-D) scan. By modulating the intensity of the light source(s) 620 in synchrony with the scan of the fiber cantilever 632, light emerging from the fiber cantilever 632 forms an image. Descriptions of such a setup are provided in U.S. Patent Application Publication No. 2014/0003762, which is incorporated by reference herein in its entirety.

光耦合器子系統的元件644對從掃描光纖懸臂632出射的光進行準直。準直光被鏡面648反射到包含至少一個衍射光學元件(DOE)616的窄分佈平面波導612中。準直光通過全內反射沿著分佈平面波導612垂直地(與圖6的視圖有關)傳播,並且這樣反覆地與DOE 616相互交叉。DOE 616更可取的是具有低衍射效率。這導致一部分(例如,10%)的光在每個與DOE 616的交叉點朝向較大的主平面波導604的邊緣衍射,以及一部分的光繼續透過TIR在其原始軌跡上沿著分佈平面波導612的長度移動。 A component 644 of the optical coupler subsystem collimates the light emerging from the scanning fiber cantilever 632. The collimated light is reflected by a mirrored surface 648 into the narrow distribution planar waveguide 612, which contains at least one diffractive optical element (DOE) 616. The collimated light propagates vertically (relative to the view of FIG. 6) along the distribution planar waveguide 612 by total internal reflection, and in doing so repeatedly intersects with the DOE 616. The DOE 616 preferably has a low diffraction efficiency. This causes a fraction (e.g., 10%) of the light to be diffracted toward the edge of the larger primary planar waveguide 604 at each point of intersection with the DOE 616, and a fraction of the light to continue on its original trajectory down the length of the distribution planar waveguide 612 via TIR.

在與DOE 616交叉的每個交叉點,額外的光朝向主波導604的入口衍射。通過將入射光分成多個輸出耦合組,光的出射光瞳由分佈平面波導612中的DOE 616垂直地擴展。從分佈平面波導612耦合出的該垂直擴展的光進入主平面波導604的邊緣。 At each point of intersection with the DOE 616, additional light is diffracted toward the entrance of the primary waveguide 604. By dividing the incident light into multiple outcoupled sets, the exit pupil of the light is expanded vertically by the DOE 616 in the distribution planar waveguide 612. This vertically expanded light coupled out of the distribution planar waveguide 612 enters the edge of the primary planar waveguide 604.

進入主平面波導604的光沿著主平面波導604經由TIR水平傳播(與圖6的視圖有關)。當光經由TIR沿著主平面波導604至少一部分的長度水平傳播時,光在多個點處與DOE 608相交叉。DOE 608可以有利地被設計或配置為具有作為線性繞射圖案和徑向對稱衍射圖案的總和的相位分佈,以產生光的偏轉和聚焦。DOE 608可以有利地具有低衍射效率(例如10%),使得只有一部分光束的光會在每個與DOE 608的交叉點朝向觀看者的眼睛偏轉,而其餘的光繼續透過TIR傳播通過主平面波導604。 Light entering the primary waveguide 604 propagates horizontally (relative to the view of FIG. 6) along the primary waveguide 604 via TIR. As the light propagates horizontally along at least a portion of the length of the primary waveguide 604 via TIR, it intersects with the DOE 608 at multiple points. The DOE 608 may advantageously be designed or configured to have a phase profile that is a summation of a linear diffraction pattern and a radially symmetric diffractive pattern, to produce both deflection and focusing of the light. The DOE 608 may advantageously have a low diffraction efficiency (e.g., 10%), so that only a portion of the light of the beam is deflected toward the eye of the viewer at each intersection with the DOE 608, while the rest of the light continues to propagate through the primary planar waveguide 604 via TIR.
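As an illustrative sketch only (the specific functional form below is an assumption; the text does not give one), a phase profile that is the summation of a linear grating term and a radially symmetric focusing term could be written as:

```latex
% Hypothetical DOE phase profile: linear grating term + radially symmetric lens term.
% k_g (grating spatial frequency), \lambda (wavelength), and f (focus level) are
% assumed parameters, not values taken from the text.
\phi(x, y) \;=\; \underbrace{k_g\, x}_{\text{linear pattern (deflection)}}
\;+\; \underbrace{\frac{\pi}{\lambda f}\left(x^{2} + y^{2}\right)}_{\text{radially symmetric pattern (focusing)}}
```

In a form like this, the linear term steers the outcoupled beam while the quadratic term imparts wavefront curvature corresponding to a focus level f, matching the deflection-plus-focusing behavior described above.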

在傳播光和DOE 608之間的每個交叉點處,一部分的光朝向主平面波導604的相鄰面衍射,允許光離開TIR並從主平面波導604的表面射出。在一些實施例中,DOE 608的徑向對稱衍射圖案額外地對衍射光賦予一焦點水平,既對個別光束的光波前賦予形狀(例如,賦予曲率),也以與所設計的焦點水平相匹配的角度來引導光束。 At each point of intersection between the propagating light and the DOE 608, a fraction of the light is diffracted toward the adjacent face of the primary planar waveguide 604, allowing the light to escape TIR and emerge from the face of the primary planar waveguide 604. In some embodiments, the radially symmetric diffraction pattern of the DOE 608 additionally imparts a focus level to the diffracted light, both shaping the light wavefront (e.g., imparting a curvature) of the individual beams and steering the beams at an angle that matches the designed focus level.

因此,這些不同的路徑可以使光通過在不同角度、不同焦點水平的多個DOE 608耦合到主平面波導604之外,和/或在出射光瞳產生不同的填充圖案。在出射光瞳處的不同填充圖案可以有利地用於創建具有多個深度平面的光場顯示器。波導組件中的每個層或者在疊層中的一組層(例如,3層)可以用於產生相應的顏色(例如,紅色、藍色、綠色)。因此,例如,可以採用第一組三個相鄰層以分別在第一焦深產生紅色、藍色和綠色光。第二組三個相鄰層可以用於在第二焦深分別地產生紅色、藍色和綠色光。可以採用多個集合來產生具有各種焦深的完整3D或4D彩色影像光場。 Accordingly, these different pathways can cause the light to be coupled out of the primary planar waveguide 604 by a multiplicity of DOEs 608 at different angles and focus levels, and/or with different fill patterns at the exit pupil. Different fill patterns at the exit pupil can be beneficially used to create a light field display with multiple depth planes. Each layer in the waveguide assembly, or a set of layers (e.g., 3 layers) in the stack, may be employed to generate a respective color (e.g., red, blue, green). Thus, for example, a first set of three adjacent layers may be employed to respectively produce red, blue, and green light at a first focal depth. A second set of three adjacent layers may be employed to respectively produce red, blue, and green light at a second focal depth. Multiple sets may be employed to generate a full 3D or 4D color image light field with various focal depths.

■AR系統的其它組件■Other components of the AR system

在許多實施例中,AR系統可以包括除了可穿戴顯示系統200(或光學系統400)之外的其他組件。AR裝置可以,例如,包括一個或多個觸覺裝置或組件。觸覺裝置或組件可用來向使用者提供觸覺感知。例如,當觸摸虛擬內容(例如,虛擬物體、虛擬工具、其他虛擬構造)時,觸覺裝置或組件可以提供壓力和/或紋理的觸覺感知。觸覺感知可以複製虛擬物體所代表的實體物件的觸感,或者可以複製虛擬內容所代表的想像物件或角色(例如,龍)的觸感。在一些實施例中,觸覺裝置或組件可以由使用者佩戴(例如,使用者可穿戴手套)。在一些實施例中,觸覺裝置或組件可以由使用者握住。 In many implementations, the AR system may include other components in addition to the wearable display system 200 (or optical system 400). The AR devices may, for example, include one or more haptic devices or components. The haptic devices or components may be operable to provide a tactile sensation to a user. For example, the haptic devices or components may provide a tactile sensation of pressure and/or texture when touching virtual content (e.g., virtual objects, virtual tools, other virtual constructs). The tactile sensation may replicate the feel of a physical object which a virtual object represents, or may replicate the feel of an imagined object or character (e.g., a dragon) which the virtual content represents. In some implementations, haptic devices or components may be worn by the user (e.g., a user-wearable glove). In some implementations, haptic devices or components may be held by the user.

AR系統可以,例如,包括可由使用者操縱以允許與AR系統的輸入或交互的一個或多個實體物件。這些實體物件在這裡被稱為圖騰(totems)。某些圖騰可以採取無生命物件的形式,例如一塊金屬或塑料、牆壁、桌子的表面。或者,某些圖騰可以採取有生命物件的形式,例如使用者的手。如本發明所述,圖騰可以實際上不具有任何實體輸入結構(例如,鍵、觸發器、操縱桿、軌跡球、搖桿開關)。相反,圖騰可以簡單地提供實體表面,並且AR系統可以呈現使用者界面,使得使用者看起來在圖騰的一個或多個表面上。例如,AR系統可以使電腦鍵盤和觸控板的影像看起來位於圖騰的一個或多個表面上。例如,AR系統可以使虛擬電腦鍵盤和虛擬觸控板呈現在當作圖騰的薄矩形鋁板的表面上。矩形板本身不具有任何實體鍵或觸控板或傳感器。然而,AR系統可以檢測使用者對矩形板的操作、交互作用或觸摸,作為經由虛擬鍵盤和/或虛擬觸控板做出的選擇或輸入。 The AR system may, for example, include one or more physical objects which are manipulable by the user to allow input or interaction with the AR system. These physical objects are referred to herein as totems. Some totems may take the form of inanimate objects, for example a piece of metal or plastic, a wall, a surface of a table. Alternatively, some totems may take the form of animate objects, for example a hand of the user. As described herein, the totems may not actually have any physical input structures (e.g., keys, triggers, joystick, trackball, rocker switch). Instead, the totem may simply provide a physical surface, and the AR system may render a user interface so as to appear to the user to be on one or more surfaces of the totem. For example, the AR system may render an image of a computer keyboard and trackpad to appear to reside on one or more surfaces of a totem. For instance, the AR system may render a virtual computer keyboard and virtual trackpad to appear on a surface of a thin rectangular plate of aluminum which serves as a totem. The rectangular plate does not itself have any physical keys or trackpad or sensors. However, the AR system may detect user manipulation or interaction or touches with the rectangular plate as selections or inputs made via the virtual keyboard and/or virtual trackpad.

在美國專利公開第2015/0016777號中描述了可與本發明的AR裝置、HMD和顯示系統一起使用的觸覺裝置和圖騰的實例,其通過引用的方式將其全部內容併入本發明中。 Examples of haptic devices and totems that can be used with the AR devices, HMDs, and display systems of the present invention are described in U.S. Patent Publication No. 2015/0016777, the entire disclosure of which is incorporated herein by reference.

■顯示系統上執行錯誤校正的實施例■Example of performing error correction on the display system

如上所述,一顯示系統可以包括一堆疊波導組件,例如圖4-6所示,其具有多個顯示層,顯示層包括帶有衍射光柵的基底材料,以重定向光來產生撞擊在眼睛上的數位化光場。在一些實施例中,波導組件依每個顏色、每個深度包括一個基底層。例如,雙深度平面RGB顯示器可以具有總共6個波導層。顯示系統可以是可穿戴顯示系統200的實施例。 As discussed above, a display system may comprise a stacked waveguide assembly, e.g., as illustrated in FIGS. 4-6, with multiple display layers comprising substrate material with diffraction gratings that redirect light to produce the digitized light field that impinges upon the eye. In some embodiments, the waveguide assembly comprises one substrate layer per color per depth. For example, a two-depth-plane RGB display can have a total of 6 waveguide layers. The display system can be an embodiment of the wearable display system 200.

在堆疊波導組件中,存在可能引入造成影像品質劣化的一系列潛在現象。這可以包括重影(多個影像)、失真、未對準(顏色或深度之間的),以及視場上的顏色強度變化。另外,某些類型的偽像可能在其他類型的條件下發生,例如,當用激光而非LED照射時(例如,斑點、條帶、牛頓條紋),或當輸出耦合光束的密度小於一定量時(例如,波前稀疏,其可以被感覺為好像通過屏幕門或柵欄觀看)。 In a stacked waveguide assembly, there is a range of potential phenomena that can introduce artifacts causing image quality degradation. These include ghosting (multiple images), distortions, misalignments (between colors or depths), and color intensity variation across the field of view. Additionally, certain types of artifacts can occur under other conditions, e.g., when illuminated with a laser as opposed to an LED (e.g., speckle, banding, Newton fringes), or when the density of outcoupled beams is less than a certain amount (e.g., wavefront sparsity, which may be perceived as if viewing through a screen door or picket fence).

由於光場顯示器的光學器件中的缺陷,渲染引擎(render engine)中理想的三維網格在通過光學器件顯示時會變得失真。為了識別和校正期望影像和實際顯示的影像之間的失真,可以使用顯示系統投影校準圖案,例如棋盤圖案。 Due to imperfections in the optics of a light field display, a perfect three-dimensional grid in the render engine can become distorted when displayed through the optics. To identify and correct distortions between the expected image and the actually displayed image, a calibration pattern, such as a checkerboard pattern, can be projected using the display system.

圖7係為當經由顯示系統之投影校準圖案702時可能發生失真的實施例。校準圖案702可以是適於執行空間或色彩校準的任何類型的圖案(例如,包括多個棋盤方格的棋盤圖案)。校準圖案702可以包括任何類型的測試或校準圖案,例如幾何圖案或隨機圖案。投影校準圖案702產生生成光場影像704。存在於影像704中的失真可以包括空間失真(例如,當可見像素不在它被期望的視場內位置時)以及色彩失真(例如,當可見像素的顏色值與期望的不同時)。例如,棋盤方格圖案702可以從它們在影像704中的預期位置偏移(例如,空間誤差)。另外,影像704中的某些棋盤方格可以以其他顏色(例如紫色)出現(例如,色彩誤差),來代替以黑色和白色顯示的棋盤方格。可以使用光場計量系統來測量顯示誤差,該光場計量系統可以包括一數位照相機,定位成獲取由顯示器投影的校準圖案之影像。在一些實施例中,可以擷取相當於偏移到不同位置的校準影像的多個影像,以便獲得關於期望位置與實際位置的更細密紋理的訊息。數位相機可以被用以聚焦在不同的聚焦深度上,以便確定在不同區域上所顯示的影像(例如,所顯示的校準圖案上的特徵)聚焦於哪個深度上。 FIG. 7 illustrates an example of distortions that may occur when a calibration pattern 702 is projected via the display system. The calibration pattern 702 can be any type of pattern suitable for performing spatial or chromatic calibration (e.g., a checkerboard pattern comprising a plurality of checkerboard squares), and can comprise any type of test or calibration pattern, such as a geometric pattern or a random pattern. Projecting the calibration pattern 702 results in the generated light field image 704. Distortions present in the image 704 can include spatial distortions (e.g., when a visible pixel is not where it is expected to be within the field of view) as well as chromatic distortions (e.g., when the color value of a visible pixel is different from what is expected). For example, squares of the checkerboard pattern 702 may be shifted from their expected positions in the image 704 (e.g., spatial errors). Additionally, instead of being displayed in black and white, some checkerboard squares in the image 704 may appear in another color, e.g., purple (e.g., chromatic errors). Display errors can be measured using a light field metrology system, which can include a digital camera positioned to acquire images of the calibration pattern projected by the display. In some embodiments, multiple images corresponding to the calibration image shifted to different positions may be captured in order to obtain finer-grained information on expected versus actual positions. The digital camera can be focused at different focus depths in order to determine the depths at which imagery displayed in different regions (e.g., features on the displayed calibration pattern) comes into focus.

根據一些實施例,在不同聚焦深度擷取多個影像以確定所顯示影像的不同區域的深度,在下文中結合圖17到圖20有更詳細的描述。以下結合圖22到圖24更詳細地描述可以在各種實施例中使用的不同類型的校準圖案。 Capturing multiple images at different focus depths in order to determine the depths of different regions of a displayed image is, according to some embodiments, described in greater detail below in connection with FIGS. 17-20. Different types of calibration patterns that can be used in various embodiments are described in greater detail below in connection with FIGS. 22-24.
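The focus-sweep idea above can be sketched in a few lines of code: score the image captured at each candidate focus depth with a sharpness metric and keep the depth that maximizes the score. This is only an illustrative sketch; the tiny images, the candidate depths, and the simple gradient-based sharpness metric are assumptions for demonstration, not the metrology system's actual procedure.

```python
def sharpness(image):
    """Sum of squared horizontal/vertical differences: higher = sharper edges."""
    score = 0.0
    for r in range(len(image)):
        for c in range(len(image[0])):
            if c + 1 < len(image[0]):
                score += (image[r][c + 1] - image[r][c]) ** 2
            if r + 1 < len(image):
                score += (image[r + 1][c] - image[r][c]) ** 2
    return score


def best_focus_depth(images_by_depth):
    """Return the focus depth whose captured image scores sharpest."""
    return max(images_by_depth, key=lambda depth: sharpness(images_by_depth[depth]))


# Toy focus sweep: the pattern is crisp at 1.0 diopter and blurred elsewhere.
crisp = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
blurred = [[0.4, 0.6, 0.4], [0.6, 0.4, 0.6], [0.4, 0.6, 0.4]]
captures = {0.5: blurred, 1.0: crisp, 1.5: blurred}
print(best_focus_depth(captures))  # → 1.0
```

In a real metrology rig, the images would come from the camera at each focus setting and the sharpness metric would typically be applied per region of the displayed calibration pattern, yielding a per-region depth estimate.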

■空間誤差■Spatial errors

空間誤差可以包括幾種不同的表現。例如,空間未對準包括顯示層的平移或旋轉。空間誤差還會涉及在顯示器的深度平面的視場(FOV)上變化的非線性空間失真。 Spatial errors can include several different manifestations. For example, spatial misalignments include translations or rotations of a display layer. Spatial errors can also involve nonlinear spatial distortions that vary across the field of view (FOV) of a depth plane of the display.

空間誤差可以是顯示系統內的機械或光學缺陷的一徵狀。經由解讀所測量的空間誤差,可以導出量化系統的光學機械品質的度量指標以及建議改進方法的度量指標。例如,表示深度平面旋轉的空間誤差可以建議顯示器相對於期望位置被機械地轉動。每個顏色的平面縮放可以建議透鏡系統消色差不足。 Spatial errors may be a symptom of mechanical or optical defects within the display system. By interpreting the measured spatial errors, metrics that quantify the opto-mechanical quality of the system and that suggest methods for its improvement can be derived. For example, a spatial error representing a depth plane rotation may suggest that the display is mechanically rotated relative to a desired position. Per-color plane scaling may suggest that the lens system is not sufficiently achromatic.

為了識別空間誤差,光場計量系統包括影像擷取裝置,例如數位相機,可以用於擷取由顯示系統投影的一個或多個影像(例如,校準圖案的投影),並產生表示實際顯示影像與期望影像的偏差的向量場。該向量場可以是三維向量場,其包括顯示器的xy平面中的平面內偏差和z方向(深度)上的平面外偏差,或者是僅包括xy平面中偏差的二維向量場。在一些實施例中,可以由顯示系統的每個深度平面或每個色彩平面生成一向量場。在一些實施例中,深度可以用屈光度測量,表示該層的焦距的倒數(以米為單位)。 To identify spatial errors, the light field metrology system can include an image capture device, such as a digital camera, that can be used to capture one or more images projected by the display system (e.g., projections of a calibration pattern) and to generate a vector field representing deviations of the actual displayed image from the expected image. The vector field may be a three-dimensional vector field including in-plane deviations in the xy-plane of the display and out-of-plane deviations in the z-direction (depth), or a two-dimensional vector field including deviations only in the xy-plane. In some embodiments, a vector field can be generated for each depth plane or each color plane of the display system. In some embodiments, depth may be measured in diopters, representing the inverse of the focal distance of the layer in meters.

圖8係為可以從一投影光場中的點的預期位置及其實際顯示位置之間的映射偏差的一個或多個擷取影像生成的向量場之實施例。投影光場中的點可以對應於校準影像中的特徵(例如,校準棋盤方格的中心和角)。向量場中的每個向量表示在光場中的預期位置和其相應的實際位置之間的失真。在該實施例中,失真向量場是2D。在所示的向量場中,使用第一顏色和標記類型(例如,“O”802用於預期位置)來標記一特徵的預期位置,而使用第二種(例如,“X”804用於檢測位置)來標記檢測到的位置。每對相應的預期位置和顯示位置由線806連接,線806可以包括一箭頭,指示將檢測到的顯示位置校正為預期位置所需的校正方向。 FIG. 8 illustrates an example of a vector field generated from one or more captured images, mapping deviations between expected positions of points in a projected light field and their actual displayed positions. The points in the projected light field may correspond to features in a calibration image (e.g., the centers and corners of calibration checkerboard squares). Each vector in the vector field represents the distortion between an expected position in the light field and the corresponding actual position. In this example, the distortion vector field is 2D. In the illustrated vector field, the expected positions of features are marked using a first color and marker type (e.g., an "O" 802 for an expected position), and the detected positions using a second (e.g., an "X" 804 for a detected position). Each pair of corresponding expected and displayed positions is connected by a line 806, which may include an arrow indicating the direction of the correction needed to correct the detected display position to the expected position.
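The expected-versus-detected mapping of FIG. 8 can be sketched as follows. The feature coordinates are made up for illustration; in practice the expected positions would come from the calibration pattern and the detected positions from camera images of the projected pattern.

```python
def distortion_vector_field(expected, detected):
    """Pair each expected feature position with the 2D correction vector that
    would move the detected (displayed) position back to the expected one."""
    field = []
    for (ex, ey), (dx, dy) in zip(expected, detected):
        field.append(((ex, ey), (ex - dx, ey - dy)))
    return field


# Toy checkerboard feature centers: expected positions vs. measured positions.
expected = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
detected = [(0.1, 0.0), (1.0, -0.2), (0.0, 1.1), (0.9, 1.0)]
for point, correction in distortion_vector_field(expected, detected):
    print(point, correction)
```

Each printed pair corresponds to one "O"/"X" pair in FIG. 8: the point is the expected position, and the correction vector is the arrow along line 806 from the detected position back to it.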

Using the vector field, local or global distortion information can be extracted (e.g., in-plane translation, aggregate scaling, aggregate rotation, mean pixel warp, or diopter error, as described below). For example, a distortion map can be generated from the determined vector field. The distortion map can be used to analyze the distribution of pixel position error values (e.g., vector magnitudes) over the generated vector field. The distortion map may be a histogram of the frequency of pixel position errors (e.g., pixel position error magnitude plotted against the frequency with which that error magnitude occurs in the vector field). Other types of maps can be used to analyze other properties of the vector field (e.g., distortion direction).
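The histogram-style distortion map described above might be computed from the vector magnitudes along these lines (a minimal sketch; the error values and the 0.5-pixel bin width are hypothetical):

```python
import numpy as np

# Hypothetical pixel position error magnitudes (in pixels) taken from a
# distortion vector field.
error_magnitudes = np.array([0.2, 0.4, 0.4, 0.5, 1.1, 1.2, 2.5])

# Distortion map: counts of error magnitudes per 0.5-pixel histogram bin.
counts, bin_edges = np.histogram(error_magnitudes,
                                 bins=np.arange(0.0, 3.5, 0.5))
```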

Spatial errors can be roughly divided into in-plane and out-of-plane spatial errors. An in-plane spatial error refers to a spatial error along a particular depth plane (e.g., the xy plane of the coordinate system shown in FIG. 6) at a particular depth (measured along the z axis). One or more metrics for the different categories of spatial errors can be derived from the vector field (e.g., as shown in FIG. 8). Each of these metrics can be defined on a per-layer basis (e.g., for each individual display layer corresponding to a particular combination of color and depth, such as a red 3-diopter display layer, a green 1-diopter display layer, etc.) or on a per-display basis (e.g., quantifying the overall fidelity of the display in a concise set of parameters).

■ In-plane spatial errors

In some embodiments, the in-plane spatial error can be decomposed into a number of different components, each corresponding to a different type of error. These components may include translation error, rotation error, scaling error, or nonlinear spatial error. Each of these error components can be corrected individually or sequentially.

■ In-plane translation error

FIG. 9A is an example of in-plane (xy) translation spatial error (also referred to as xy centration). An xy translation error refers to an offset, in x and/or y pixels, of the center of a display layer's displayed image from its expected position, and the xy translation metric is intended to inform mechanical or display alignment. In FIG. 9A, an expected image position 900 (represented in this example by a red rectangle) is translated to a displayed image position 900a (represented by a green shape with non-straight edges). Using the center position 902 of the displayed image position 900a and the center position 904 of the expected image position 900, an xy translation error can be corrected by performing one or more shifts (along the determined translation vector 901) so that the displayed center position 902 is aligned with the expected center position 904 (via mechanical alignment of the display, software correction of the displayed image, or a combination of both). One or more metrics of the measured xy translation spatial error may include a translation error, measured per layer, which measures the layer center relative to an expected or reference position (e.g., the optical axis of the display), and a maximum translation offset, measured per display, which identifies the maximum translation between any two display layers to quantify the overall translation registration.
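A minimal sketch of estimating and correcting an xy translation error from matched feature positions (the coordinates and the centroid-based center estimate are assumptions for illustration, not the patent's exact procedure):

```python
import numpy as np

# Hypothetical expected and detected feature positions for one display layer.
expected = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
detected = expected + np.array([2.0, -1.0])   # layer shifted by (+2, -1) px

# xy translation error: offset between the displayed and expected image
# centers, here estimated as the difference of feature centroids.
translation = detected.mean(axis=0) - expected.mean(axis=0)

# Software correction: shift the displayed positions back by the translation.
corrected = detected - translation
```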

■ Aggregate rotation error

FIG. 9B is an example of aggregate rotation spatial error. Aggregate rotation refers to the overall rotation angle of the displayed image about its center relative to the expected position of the image. Although spatial distortions cannot always be fully described by a simple affine rotation, an aggregate rotation metric can be used to provide the rotation angle by which the pixel position error (between the displayed and expected image positions) is minimized. The aggregate rotation metric is intended to inform mechanical or display alignment. As shown in FIG. 9B, aggregate rotation can be corrected by rotating the displayed image 906 about a center point 908 by a specified rotation amount 907 to a position 910 corresponding to the expected position (via mechanical alignment of the display, software correction of the displayed image, or both). Reported metrics may include a rotation error, measured per layer, which identifies the measured orientation relative to an expected or reference orientation (e.g., the horizontal axis of the display), and a maximum rotation offset, measured per display, which identifies the maximum rotation between any two display layers to quantify the overall rotation registration.
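One way to obtain the rotation angle that minimizes pixel position error is the closed-form 2D Procrustes (Kabsch) solution. The sketch below assumes matched expected/detected feature lists with invented coordinates; it is an illustration, not the patent's stated method:

```python
import numpy as np

def aggregate_rotation(expected, detected):
    """Angle (radians) that best rotates `expected` onto `detected`
    about the common centroid, in the least-squares sense."""
    e = expected - expected.mean(axis=0)
    d = detected - detected.mean(axis=0)
    # 2D Procrustes: optimal angle from summed cross and dot products.
    cross = np.sum(e[:, 0] * d[:, 1] - e[:, 1] * d[:, 0])
    dot = np.sum(e[:, 0] * d[:, 0] + e[:, 1] * d[:, 1])
    return np.arctan2(cross, dot)

# Hypothetical layer rotated by 5 degrees about its center.
theta = np.deg2rad(5.0)
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
expected = np.array([[-1.0, -1.0], [1.0, -1.0], [1.0, 1.0], [-1.0, 1.0]])
detected = expected @ rot.T

recovered = aggregate_rotation(expected, detected)
```

The correction then rotates the displayed image by `-recovered` about its center.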

■ Aggregate scaling error

FIG. 9C is an example of aggregate scaling spatial error. Aggregate scaling indicates the overall scale factor of the displayed image about its center relative to the expected image. Although spatial distortions may not be fully described by a simple affine scaling, an aggregate scaling metric can indicate the scale factor by which the pixel position error is minimized. The aggregate scaling metric is intended to inform optical design or display alignment. As shown in FIG. 9C, aggregate scaling spatial error can be corrected by scaling the size of the displayed image 912 by a specified scaling amount 913 to match the size of the expected image 914. Reported metrics for aggregate scaling may include a scaling error, measured per layer, which measures the image relative to a reference (e.g., a physical target in the calibration setup), and a maximum scaling offset, measured per display, which indicates the maximum scaling between any two display layers to quantify the overall scale registration.

FIG. 9D is another example of aggregate scaling spatial error. The displayed image 916 appears smaller than the expected image 918. To correct the scaling error, the displayed image 916 is scaled up by a scaling amount 917 to match the size of the expected image 918.
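When the image centers are aligned, the aggregate scale factor that minimizes the pixel position error has a simple least-squares form. A sketch with hypothetical positions (the 4% shrink is invented for the example):

```python
import numpy as np

# Hypothetical expected and displayed feature positions; the displayed
# image is uniformly 4% too small about its center.
expected = np.array([[-10.0, -10.0], [10.0, -10.0],
                     [10.0, 10.0], [-10.0, 10.0]])
detected = 0.96 * expected

# Least-squares aggregate scale of detected relative to expected, about
# the common centroid: scale = sum(e . d) / sum(e . e).
e = expected - expected.mean(axis=0)
d = detected - detected.mean(axis=0)
scale = np.sum(e * d) / np.sum(e * e)

# Correction: scale the displayed image by 1/scale to match the expected size.
corrected = d / scale + expected.mean(axis=0)
```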

■ Pixel warp error

FIG. 9E is an example of the residual spatial error after corrections for xy translation, rotation, and scaling have been performed. The residual error (also referred to as pixel warp or spatial mapping) indicates the average residual Euclidean pixel position error in the xy spatial distortion profile after translation, rotation, and scaling (e.g., as illustrated in FIGS. 9A-9D); it gives a measure of the nonlinear or non-affine warping characteristics of the display system and is used to inform display design and quality control. Reported metrics for pixel warp may include a mean pixel warp (MPW), measured per layer, which indicates the average residual Euclidean pixel position error in xy after translation, rotation, and scaling, referenced to an ideal grid, and a maximum mean pixel warp (max MPW), measured per display, which indicates the largest MPW among the display layers to quantify the overall warp. In some embodiments, the residual pixel warp can be corrected by a spatial mapping performed using a processing module (e.g., module 224 or 228) to align the displayed image 920 with the expected image 922.
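A sketch of computing MPW and max MPW from residual errors (the residual values and the per-layer MPWs are invented for illustration):

```python
import numpy as np

# Hypothetical residual (dx, dy) errors per feature, after translation,
# rotation, and scaling have already been corrected.
residuals = np.array([[0.3, -0.4], [0.0, 0.5], [-0.6, 0.8], [0.1, 0.0]])

# Mean pixel warp (MPW): mean residual Euclidean pixel position error.
mpw = np.linalg.norm(residuals, axis=1).mean()

# Max MPW across display layers quantifies the overall warp
# (the other two layer MPWs here are hypothetical).
layer_mpws = [mpw, 0.2, 0.9]
max_mpw = max(layer_mpws)
```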

■ Out-of-plane spatial errors

Digital light field display systems, such as those shown in FIGS. 4-6, are capable of producing depth planes that appear to be located at different depths (in the z direction) from the viewer (see, e.g., FIG. 3). In some embodiments, the depth planes correspond to planes that appear to be at different distances from the viewer. As is common in optics, rather than referring to the distance of a depth plane from the display, different depth planes can be referenced by the inverse distance, measured in diopters (m^-1). For example, a display may have two depth planes located at depths of 3 diopters (1/3 m) and 1 diopter (1 m). Due to imperfections in the display system, the diopter profile across a depth plane may not be as expected. For example, an image displayed on a depth layer may have a diopter profile corresponding to an incorrect distance, or a focus that varies across the display FOV.
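The diopter bookkeeping above is just reciprocal distance; a tiny sketch (the 0.4 m measured depth is hypothetical):

```python
# Depth planes are referenced by inverse distance in diopters (1/m),
# rather than by their distance from the display.
def to_diopters(distance_m):
    return 1.0 / distance_m

near_plane = to_diopters(1.0 / 3.0)    # 1/3 m -> 3 diopters
far_plane = to_diopters(1.0)           # 1 m   -> 1 diopter

# Diopter error of a measured plane relative to its expected depth
# (0.4 m is a hypothetical measured depth for the 3-diopter plane).
measured = to_diopters(0.4)
diopter_error = measured - near_plane  # negative: plane appears too far
```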

Out-of-plane spatial error (also referred to as diopter error) is a measure of the diopter (depth) error of the depth planes and is used to inform errors in optical, mechanical, and waveguide alignment or design. Reported diopter error metrics may include a diopter error, measured per layer, indicating the amount of error between the expected and measured depth planes, and a maximum diopter error, indicating the largest depth error among the depth planes.

FIG. 10A is an example of viewing multiple depth planes at different depths. In the example, three different depth planes are shown, but a display system may contain more or fewer depth planes. Additionally, each depth plane may correspond to multiple waveguide layers (e.g., RGB color layers).

FIGS. 10B-10D are examples of the types of out-of-plane spatial errors that may occur when viewing the projected depth planes shown in FIG. 10A. For example, a projected depth plane may be displaced to a different depth, so that it appears at a depth greater or smaller than expected (FIG. 10B). A depth plane may be misoriented, exhibiting a bulk rotation from the expected depth (FIG. 10C). A depth plane may exhibit a non-uniform profile characteristic of grating imperfections (FIG. 10D). A depth plane may exhibit a combination of the errors shown in FIGS. 10B-10D.

FIG. 10E is another example of out-of-plane spatial error. A misaligned projected depth plane 1002 is shown relative to an expected depth plane 1004. In the example, the misalignment includes a depth plane rotation. To correct the out-of-plane spatial error, a rotation axis 1006 can be identified, and a rotation can be performed on the projected depth plane 1002 about the identified rotation axis 1006 so that the projected depth plane 1002 is substantially aligned with the expected depth plane 1004. Although the rotation axis 1006 is shown as an axis parallel to the expected depth plane 1004 (e.g., a vertical axis), it should be understood that the rotation axis may be in any direction.

Although diopter errors are distinct from the in-plane spatial errors associated with in-plane distortion, diopter errors can potentially affect in-plane spatial errors, for example because incorrect assumptions about pixel depth can lead to viewpoint-dependent spatial distortion. For example, for a defective depth plane having regions at depths different from the expected depth, pixels may shift non-uniformly relative to the viewer position, resulting in a varying warping of the image.

In some embodiments, the correction techniques described herein for in-plane spatial errors (e.g., xy centration, aggregate scaling, aggregate rotation, and spatial mapping) can be extended to three dimensions. For example, centration can be performed in three dimensions by identifying the position of the center point of a displayed plane in an xyz coordinate system and shifting the plane (e.g., along the x, y, and z axes) so that the center point is aligned with the expected position.

■ Quantifying spatial errors with the distortion vector field

As described herein with reference to FIG. 8, a multi-dimensional (e.g., 2D or 3D) distortion vector field can be generated by measuring the displacement of image features from their expected positions to their displayed positions. A distortion vector field can be computed for each layer of a multi-layer display (e.g., a display including the stacked waveguide assembly 405). The distortion vector field can be used to capture and characterize the distortion of the light field projected by the display. For example, vector analysis operations can be performed on the distortion vector field to determine certain spatial errors. The light field metrology system can compute such vector operations as part of the analysis of images, acquired by a metrology camera (e.g., a digital camera or a light field camera), of a calibration pattern (e.g., a checkerboard) projected by the display. Such vector analysis techniques are not limited to light field displays and can be applied to any multi-dimensional metrology or to the calibration of any type of display.

Given a multi-dimensional distortion vector field, the curl of the vector field can be computed to determine local rotation. Averaging the curl over a region in the display FOV provides a measure of the aggregate rotation error in that region. In discrete depth plane embodiments of a light field display, the curl of the distortion vector field can provide information about in-plane rotation or out-of-plane rotation of a layer.

The divergence of the distortion vector field can be computed to determine scaling error. In embodiments having multiple layers (e.g., RGB color layers) that together produce a full-color image at each depth, the scaling error can be used to provide information relevant to scaling calibration.

Vector integral theorems (e.g., Stokes' theorem or the divergence theorem (Gauss's theorem)) can be applied to the distortion vector field to compute the curl and divergence of the vector field over a region in the FOV of the display (e.g., the rotation or aggregate scaling of that region). The Euclidean mean of the vectors in the distortion vector field can be computed to obtain information about the non-affine spatial transformations introduced by the distortion.
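A discrete sketch of these curl/divergence measurements on a sampled distortion field (the rigid-rotation test field is synthetic, and `np.gradient` supplies the finite differences; this is an illustration, not the patent's stated implementation):

```python
import numpy as np

# Hypothetical 2D distortion vector field sampled on a regular pixel grid:
# a pure rigid rotation about the grid center, so divergence ~ 0 and curl
# ~ constant (= 2 * omega).
n = 21
y, x = np.mgrid[0:n, 0:n].astype(float)
cx = cy = (n - 1) / 2.0
omega = 0.01                      # small local rotation angle (radians)
vx = -omega * (y - cy)            # displacement field of a rigid rotation
vy = omega * (x - cx)

# Finite-difference curl and divergence of the field.
dvx_dy, dvx_dx = np.gradient(vx)  # np.gradient returns (d/drow, d/dcol)
dvy_dy, dvy_dx = np.gradient(vy)
curl = dvy_dx - dvx_dy            # local rotation measure
div = dvx_dx + dvy_dy             # local scaling measure

mean_curl = curl.mean()           # ~ 2 * omega for a rigid rotation
mean_div = div.mean()             # ~ 0 for a pure rotation
```

Averaging `curl` or `div` over a sub-region of the FOV gives the region's aggregate rotation or scaling, as the text describes.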

■ Quantifying color errors

Color errors occur when the color value of a displayed pixel differs from its expected color value. To evaluate color errors, a calibration image can be projected using the display system. The calibration image may be the same calibration image used to perform spatial error correction, or it may be a different calibration image. For example, the calibration image may be a solid image of a particular color (e.g., red) at a particular luminance level (e.g., maximum brightness). The output of the projected calibration image can be captured using an image capture device (e.g., one or more cameras). FIG. 11 is an example of a captured image of a projected calibration image. Although the calibration image may have a constant luminance level throughout, due to the presence of color errors the luminance of the displayed calibration image varies across the field of view of the display. For example, some regions 1102 of the captured image may have high luminance levels, while other regions 1104 may exhibit lower luminance levels, resulting in dark regions or dark bands on the display. In some embodiments, the calibration image may include a color calibration pattern rather than a single color.

In some embodiments of the display, the observed luminance topology may depend on wavelength. For example, the luminance variations may differ for red, green, and blue, so that a projected image appears in a color different from the expected one (indicating an imbalance among the red, green, and blue components). For example, a projected white calibration image may appear purplish where the green luminance level is lower than the red and blue luminance levels. Additionally, the luminance variations may also depend on the viewer position (e.g., if the camera were moved, the dark band at region 1102 might appear to move to a different position in the FOV). These phenomena can make it challenging to maintain color uniformity and white balance across the FOV (particularly when the luminance or color balance depends on viewer position), and ultimately affect the color accuracy of the content being displayed.

Each display layer in the display system is associated with a chromaticity characteristic, which measures color, and a luminance characteristic, which measures brightness or intensity. Accordingly, color errors can be roughly divided into luminance flatness errors and chromatic uniformity errors.

■ Luminance flatness

A luminance flatness metric can be used to quantify how much luminance variation each display layer exhibits. In general, in a stacked waveguide assembly, because each display layer is produced by a different waveguide in the stack (see, e.g., the waveguide assembly 405 in FIG. 4), different display layers may exhibit different luminance variations across the field of view.

To measure the luminance flatness of a display layer, a luminance value (also referred to as an intensity value) can be determined for some or all of the pixels of a captured image. Although this disclosure primarily refers to luminance values of individual pixels, in other embodiments luminance values may be determined for regions of multiple pixels (e.g., an N×M grid of pixels) rather than for single pixels. In some embodiments, each determined luminance value may be assigned to a luminance bin covering one or more of a range of luminance values. For example, for an 8-bit color display system, 256 luminance bins corresponding to the 8-bit depth can be used.

From the determined luminance values, a number of luminance flatness metrics can be computed by the metrology system. For example, a mode can be computed, representing the most common pixel luminance value across the display. From the mode, a half pixel population range (HPPR) can be determined, representing the range of luminance values, or number of luminance bins adjacent to the mode, that covers 50% of the pixel population. A small HPPR indicates that the luminance of the display layer is substantially uniform across the display. Luminance values may also be referred to as intensity values; for the purposes of this disclosure, the terms luminance and intensity are used interchangeably.

FIG. 12A is an intensity histogram generated from a captured image of a projected calibration image (e.g., as shown in FIG. 11). The intensity histogram plots luminance values against the frequency with which they occur in the captured image (e.g., the number of pixels having that luminance value). The mode is the luminance value that occurs most often in the image (e.g., at location 1202).

FIG. 12B is an intensity profile generated from a captured image of a projected calibration image. In the intensity profile, the mode occurs at a luminance value 1204 (which, in this example, is 236). From the mode, a deviation range centered on the mode 1204, representing the range between a luminance value 1206 and a luminance value 1208, is determined such that the range covers 50% of the pixel population of the image. The HPPR is determined from the computed deviation range (e.g., the difference between luminance value 1206 and luminance value 1208).

For an ideal display layer, given uniform input illumination, the intensity values across the field would be identical (e.g., HPPR = 0). Deviations from this ideal behavior manifest as a spread of pixel intensity values away from the mode value. The HPPR attempts to capture, as a metric, the spread of the distribution away from the mode. A substantially uniform luminance can have a small HPPR, for example an HPPR that is small compared to the range of possible luminance values (e.g., 255 for 8-bit color). For example, a substantially uniform (e.g., flat) luminance display may have a ratio of HPPR to total color range that is less than about 10%, less than about 5%, less than about 1%, or less than about 0.1%.
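One plausible reading of the HPPR computation grows an interval of histogram bins outward from the mode until half the pixel population is covered (a sketch; the greedy growth rule and the sample data are assumptions, not the patent's exact procedure):

```python
import numpy as np

def hppr(luma, n_bins=256):
    """Half pixel population range: width (in luminance bins) of an
    interval containing the mode that covers >= 50% of the pixels."""
    counts, _ = np.histogram(luma, bins=n_bins, range=(0, n_bins))
    mode = int(np.argmax(counts))
    lo = hi = mode
    covered = counts[mode]
    half = 0.5 * counts.sum()
    while covered < half:
        # Greedily grow the interval toward the richer neighboring bin.
        left = counts[lo - 1] if lo > 0 else -1
        right = counts[hi + 1] if hi < n_bins - 1 else -1
        if right >= left:
            hi += 1
            covered += counts[hi]
        else:
            lo -= 1
            covered += counts[lo]
    return hi - lo

# Perfectly flat layer: every pixel has the same luminance -> HPPR = 0.
flat = np.full(1000, 200)
# Layer with widely spread luminances -> much larger HPPR.
spread = np.concatenate([np.full(400, 200), np.arange(0, 256),
                         np.arange(0, 256), np.full(88, 100)])
```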

The HPPR can be thought of as a variation on the interquartile range, which measures the spread of a distribution away from the median rather than away from the mode. The median of the pixel intensity values does not relate as directly to the flat-intensity response desired of a display layer. FIG. 13 is an example of intensity histograms 1302, 1304 illustrating the differences among the mode, the median, and the mean (μ). The medians of the two distributions 1302, 1304 are the same in this example. The two distributions 1302, 1304 have standard deviations σ of 0.8 and 2, respectively. As shown in FIG. 13, if the intensity distribution of the image is close to normal (e.g., intensity distribution 1302), the mode, median, and mean will be very similar. Conversely, if the intensity distribution is not close to a normal distribution (e.g., intensity distribution 1304), the mode, median, and mean of the intensity distribution may differ substantially from one another.

For each display layer of the display, luminance flattening seeks to reduce the luminance variation across the display field of view. Because the luminance of a pixel generally cannot be increased above its maximum value, luminance flattening is typically an overall luminance-reducing step in which pixel luminances are compressed toward a particular profile across the layer so that the luminance of the layer is as flat as possible.

For example, luminance flattening can be performed so that the maximum pixel luminance equals the luminance of the pixel with the lowest luminance value, reducing the luminance of the display layer to a substantially minimal, flat level. Alternatively, the pixel luminances can be capped at a selected luminance value that is greater than the lowest pixel luminance value. This may not flatten the overall luminance completely, because pixels with luminances below the selected value remain, and the remaining luminance variation is not uniform. In some embodiments, reducing the luminance value of a pixel or group of pixels includes identifying an amount by which to reduce the luminance value of the pixel or group of pixels. In other embodiments, reducing the luminance value of a pixel or group of pixels includes identifying a scaling factor by which the luminance value of the pixel or group of pixels is reduced toward a minimum luminance value or a threshold luminance value.

In some embodiments, if the initial luminance flatness of the display layer is good (e.g., the HPPR is below a threshold), the luminance values can be reduced to the minimum luminance value to provide a flat luminance field. On the other hand, if the luminance flatness is poor (e.g., the HPPR exceeds a threshold) or the minimum luminance value is low (e.g., below a minimum threshold), a selected maximum luminance value can be used instead. The luminance flattening can be performed in a software module (e.g., processing module 224, 228).
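A sketch of the two flattening strategies described above (the luminance field and the 175 cap value are hypothetical):

```python
import numpy as np

# Hypothetical luminance field of one display layer (8-bit values),
# varying across a small field of view.
luma = np.array([[200.0, 180.0, 190.0],
                 [170.0, 160.0, 175.0],
                 [185.0, 165.0, 195.0]])

# Strategy 1 - flatten to the minimum: per-pixel scale factors bring
# every pixel down to the lowest luminance, yielding a flat field.
target = luma.min()
scale = target / luma          # per-pixel attenuation factors (<= 1)
flattened = luma * scale       # equals `target` everywhere

# Strategy 2 - cap at a selected value above the minimum; pixels already
# below the cap are unchanged, so some variation remains.
threshold = 175.0
capped = np.minimum(luma, threshold)
```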

Luminance flattening may reduce the luminance level by a different amount for each display layer. However, different luminance levels across the layers within the same color cluster (e.g., an RGB layer group) would result in a loss of white balance, which can be handled by correcting the chromatic uniformity of the display.

■ Chromatic uniformity

Chromaticity generally refers to the color component of a display, independent of luminance. As described above, the display layers in a display system may include red, green, and blue display layers, although it should be understood that in other embodiments other numbers, types, or colors of display layers, or combinations thereof, may be used. In the following examples, RGB color layers are described for purposes of illustration, but this is not a limitation on the color balancing methods (which can be applied to any combination of display colors).

If the luminance variations of corresponding red, green, and blue display layers are the same, chromaticity is maintained across the display. Conversely, if the luminance variations of the corresponding red, green, and blue display layers differ, the chromaticity of the displayed image will differ from what is expected. For example, for a white calibration image, if the red and blue layers have higher luminance than the green layer, regions of the white calibration image will appear purplish. These deviations from the expected white can be referred to as being off grayscale.

A chromatic uniformity metric can capture how far off grayscale the image is. The metric may include a mean color error, indicating the average, across the FOV, of the deviations of the red, green, and blue values from a corresponding mean color. The smaller the mean color error, the closer the image appears to grayscale. The mean color error can be normalized to a dimensionless value by dividing by the mean color or by the color range (e.g., 255 for 8-bit color). In various embodiments, a display can be considered to have achieved chromatic uniformity if the mean color error is less than 10%, less than 5%, less than 1%, or some other threshold.
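A sketch of one way such a mean color error might be computed and normalized (the per-channel maps, the per-pixel mean-color baseline, and the 5% threshold are assumptions for illustration, not the patent's exact formula):

```python
import numpy as np

# Hypothetical per-pixel R, G, B luminances for a projected white image
# (a 2x2 field of view, 8-bit values).
r = np.array([[230.0, 225.0], [228.0, 232.0]])
g = np.array([[200.0, 205.0], [198.0, 202.0]])
b = np.array([[226.0, 224.0], [229.0, 227.0]])

# Per-pixel mean color, and each channel's average deviation from it.
mean_color = (r + g + b) / 3.0
mean_color_error = np.mean([np.abs(r - mean_color).mean(),
                            np.abs(g - mean_color).mean(),
                            np.abs(b - mean_color).mean()])

# Normalize by the 8-bit color range to a dimensionless fraction.
normalized_error = mean_color_error / 255.0
chromatically_uniform = normalized_error < 0.05   # e.g., a 5% threshold
```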

FIG. 14A is an example of a red-green-blue (RGB) intensity map generated from a captured image of a projected test image. The red and blue layers 1402 and 1404 have luminances roughly similar to each other, and both the red layer 1402 and the blue layer 1404 have higher luminance than the green layer 1406. As a result, regions of the projected white test image appear purplish (red plus blue; see, e.g., FIG. 11B).

FIG. 14B is a plot 1408 mapping the maximum color imbalance error. A mean luminance 1410 can be determined as the average of the luminance values of the red, green, and blue color layers. A "mean + max error" surface 1412 represents the maximum luminance value among the red, green, and blue layers, while a "mean − max error" surface 1414 represents the minimum luminance value among the red, green, and blue layers.

FIG. 15 is a color-corrected RGB intensity map for the display system of FIG. 14A, whose red, green, and blue layers have differing intensities across the display field of view. As described below and shown in plot 1500, in this embodiment the maximum R and B luminance values over most of the display have been lowered to a level below the G luminance values in order to provide chromatic uniformity.

As shown in FIG. 14A, before color correction the luminances of the red and blue layers are much higher than the luminance of the green layer over most of the FOV, which causes large areas of a captured image of a white calibration image to appear purple. During color correction in this embodiment, for each point of a depth plane, the lowest luminance value among the color layers associated with that depth plane (e.g., red, green, and blue) is identified, and the luminance of each color layer at that point is set to that lowest value. For example, as shown in FIG. 15, the luminances of the red and blue layers 1502 and 1504 are lowered to match the luminance of the green layer 1506 (e.g., compare the RGB intensity map of FIG. 14A with that of FIG. 15). As a result, the red and blue layers are corrected so that they match the intensity of the green layer, thereby reducing the amount of off-grayness in the projected image.
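A minimal sketch of this per-point correction, assuming each color layer is given as a 2D list of luminance values (the function name and data layout are illustrative):

```python
def balance_color_layers(red, green, blue):
    """At each point of the depth plane, clamp every color layer to the
    minimum luminance of the three layers at that point, as described
    above.  Each argument is a 2D list of luminance values."""
    rows, cols = len(red), len(red[0])
    out_r, out_g, out_b = [], [], []
    for i in range(rows):
        r_row, g_row, b_row = [], [], []
        for j in range(cols):
            m = min(red[i][j], green[i][j], blue[i][j])  # lowest layer wins
            r_row.append(m)
            g_row.append(m)
            b_row.append(m)
        out_r.append(r_row)
        out_g.append(g_row)
        out_b.append(b_row)
    return out_r, out_g, out_b
```

After the correction, all three layers have equal luminance at every point, so a white test image no longer acquires a color cast.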

■ Image Correction Processing

Image calibration refers to characterizing a display device with respect to the previously defined image quality metrics (see the description of FIGS. 7-15). Image correction refers to the corrective actions taken to improve image quality. The image quality metrics inform the corrective actions taken to improve or optimize the display device's image quality; image correction is therefore closely tied to each image quality metric.

FIG. 16 is a flowchart 1600 of an embodiment of an image correction process performed on a display system. At block 1602, a camera used to capture the projected images (e.g., the camera 1806 of the metrology system 1800 described below) is calibrated. Camera calibration characterizes the accuracy of the camera in capturing and representing the actual visual/display information. To ensure that any metric measured from the captured images arises from the display system, rather than from errors associated with the camera, the camera used for image correction should be calibrated before image correction is performed.

In some embodiments, camera calibration includes performing at least one of flat-field correction (e.g., ensuring that the camera's intensity response is uniform across its FOV), lens distortion correction (e.g., identifying and compensating for lens distortion), or pixel scaling (e.g., identifying the relationship between pixel size in the camera's image capture and pixel size in the display system). In some embodiments, a display-to-camera pixel mapping can be applied to perform the transfer between display pixel values and camera pixel values. The display-to-camera pixel mapping can be based on a first global nonlinear gamma function that maps display color pixel values into a first intermediate color space, a local, pixel-dependent coupling function that maps the first intermediate color space into a second intermediate color space, and a second global nonlinear gamma function that maps the second intermediate color space into pixel intensities in the camera color space. Details of an embodiment of the display-to-camera pixel mapping are described below with reference to FIG. 21.
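A hedged sketch of such a three-stage mapping for a single pixel; the specific gamma exponents and the multiplicative form of the per-pixel coupling function are assumptions for illustration, not the patent's actual model:

```python
def display_to_camera(display_value, gain, gamma1=2.2, gamma2=1.0 / 2.2):
    """Three-stage display-to-camera pixel mapping sketch:
    1. a first global nonlinear gamma into a first intermediate space,
    2. a local, pixel-dependent coupling (here assumed to be a simple
       multiplicative gain) into a second intermediate space,
    3. a second global gamma into camera pixel intensity.
    `display_value` is a normalized display pixel value in [0, 1];
    `gain` is the coupling value for this particular pixel."""
    t1 = display_value ** gamma1        # first global gamma function
    t2 = gain * t1                      # local, pixel-dependent coupling
    return t2 ** gamma2                 # second global gamma function
```

With unit gain and inverse exponents, the mapping reduces to the identity, which is a useful sanity check when fitting the per-pixel coupling values.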

At block 1604, spatial error correction can be performed on the display system. Spatial error correction includes capturing one or more images of the projected light field with the calibrated camera, which can be used to generate a distortion vector field between the image positions displayed by the display and the expected image positions. In some embodiments, a separate vector field is generated for each display layer. Using the generated vector field, one or more spatial corrections can be performed, which can include XY centration (block 1604a), aggregate rotation (block 1604b), aggregate scaling (block 1604c), or spatial mapping (block 1604d). In some embodiments, each of these corrections is performed on a per-layer basis.

XY centration refers to the translational spatial error of the center of a display layer's displayed image relative to the expected image position. Performing XY centration can include identifying the center point of the displayed image and shifting the image along a determined translation vector so that the center point corresponds to the expected center position. An embodiment of XY centration correction is described with reference to FIG. 9A.

Aggregate rotation refers to the overall rotational error between the displayed image and its expected position. Performing aggregate rotation can include identifying the center point of the displayed image and rotating the image about the identified center point by a determined rotation amount (e.g., to a position at which the pixel position error relative to the expected image position is minimized). An embodiment of aggregate rotation correction is described with reference to FIG. 9B.

Aggregate scaling refers to the overall scaling error between the displayed image and the expected image. Performing aggregate scaling can include identifying the center point of the displayed image and scaling the image about the identified center point by a specified factor (e.g., a factor at which the pixel position error relative to the expected image position is minimized). Embodiments of aggregate scaling are described with reference to FIGS. 9C and 9D.
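The three corrections above (XY centration, aggregate rotation, aggregate scaling) can, under a similarity-transform assumption, be estimated jointly from matched displayed/expected feature positions with a least-squares (Procrustes-style) fit. This sketch is illustrative and is not the patent's prescribed estimation method:

```python
import math

def fit_similarity(displayed, expected):
    """Estimate the translation (XY centration), aggregate rotation
    angle, and aggregate scale that best align displayed feature
    positions with their expected positions, assuming a 2D similarity
    model.  Points are (x, y) tuples; names are illustrative."""
    n = len(displayed)
    cx_d = sum(p[0] for p in displayed) / n
    cy_d = sum(p[1] for p in displayed) / n
    cx_e = sum(p[0] for p in expected) / n
    cy_e = sum(p[1] for p in expected) / n
    tx, ty = cx_e - cx_d, cy_e - cy_d            # XY centration vector
    dot = cross = var = 0.0
    for (xd, yd), (xe, ye) in zip(displayed, expected):
        xd, yd = xd - cx_d, yd - cy_d            # center both point sets
        xe, ye = xe - cx_e, ye - cy_e
        dot += xd * xe + yd * ye
        cross += xd * ye - yd * xe
        var += xd * xd + yd * yd
    theta = math.atan2(cross, dot)               # aggregate rotation
    scale = math.hypot(dot, cross) / var         # aggregate scaling
    return (tx, ty), theta, scale
```

The residual left after removing this fitted similarity transform is what the spatial-mapping (pixel warp) step described next would absorb.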

While XY centration, aggregate rotation, and aggregate scaling can be used to correct linear or affine spatial errors, a display layer's displayed image can also contain additional nonlinear or non-affine spatial errors. A spatial mapping can be performed to correct any residual errors (e.g., nonlinear or non-affine errors) that remain after the XY centration, aggregate rotation, and aggregate scaling corrections have been performed. This spatial mapping can also be referred to as a pixel warp; an embodiment is described with reference to FIG. 9E.

In some embodiments, the spatial errors can be separated into in-plane spatial errors and out-of-plane spatial errors (sometimes referred to as diopter errors). For example, a display layer can be corrected for in-plane spatial errors before the out-of-plane spatial errors are corrected, or vice versa. Alternatively, the in-plane and out-of-plane spatial errors can be corrected together.

At block 1606, color error correction can be performed on the display system. Color error correction can include luminance flattening 1606a or color balancing 1606b. In some embodiments, luminance flattening is performed on a per-layer basis, while color balancing is performed for each color cluster (e.g., each RGB cluster).

Luminance flattening refers to reducing the luminance variation across a display layer. In some embodiments, luminance flattening includes reducing the luminance of all pixels in the displayed FOV to a minimum luminance value. Alternatively, all pixels in the display's FOV whose luminance exceeds a maximum or threshold luminance can be reduced to equal that maximum/threshold value, while pixels whose luminance is below the maximum/threshold value can remain unchanged. In some embodiments, the luminance values can be scaled according to the distance between the luminance and the threshold luminance value. An embodiment of luminance flattening is described with reference to FIGS. 12A and 12B.
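A minimal sketch of the two flattening variants described above (the function name and 2D-list representation are illustrative):

```python
def flatten_luminance(layer, threshold=None):
    """Flatten a display layer's luminance, as described above.

    With no threshold, every pixel is reduced to the layer's minimum
    luminance.  With a threshold, only pixels brighter than the
    threshold are clamped down to it; dimmer pixels are unchanged.
    `layer` is a 2D list of luminance values."""
    if threshold is None:
        floor = min(min(row) for row in layer)       # layer-wide minimum
        return [[floor for _ in row] for row in layer]
    return [[min(v, threshold) for v in row] for row in layer]
```

The thresholded variant trades some residual non-uniformity for higher overall brightness compared with flattening to the global minimum.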

Color balancing can include reducing the off-grayness effect caused by intensity mismatches between the different color layers in a color cluster (e.g., an RGB cluster). Color balancing can be performed by lowering, at each location in a depth plane, the luminance of the color layers to match the luminance of the color layer in that color cluster having the lowest luminance at that location. For example, for each pixel in the FOV, the luminances of the red, green, and blue color layers are set to the lowest of the three color layers at that location. In some embodiments, luminances above a threshold luminance value are reduced to the threshold luminance value or to the minimum luminance value in the color cluster at that location, whichever is greater. In some embodiments, the luminance can be scaled according to the distance between the luminance and the threshold luminance value. Embodiments of color balancing are described with reference to FIGS. 14A and 15.

In some implementations, image calibration (to quantify the image quality metrics) is performed for each display system during the manufacturing process. The information associated with the image quality metrics and the corrections that can be used to improve or optimize the display system can be stored in non-transitory memory associated with the display system (e.g., the data module 224 or the remote data repository 232). During use of the display system, the image correction information can be applied to the display to perform the appropriate corrections, providing the user of the display system with an improved or optimized image in which image errors in the display are reduced or eliminated. For example, the local or remote processing modules 224, 228 can use the image correction information in real time to provide improved images to the user. Embodiments of the calibration process are described with reference to FIGS. 27 and 28.

■ Embodiments of Depth Plane Metrology

Embodiments of the display systems described herein can generate light fields (see the description of FIGS. 1-6). Thus, just as a real (physical) object located at some distance from the wearer of the display produces a light field impinging on the eyes, a virtual object placed at a given depth produces a (digital) light field that makes the virtual object come into focus at the intended depth. This permits vergence-accommodation matching and a more convincing mixed reality presentation.

Due to imperfections in the generated light field (e.g., due to defects in the waveguides of the waveguide assembly 405), a virtual object may come into focus at a depth different from the intended one, even though the content creator placed the virtual object at a specific depth from the viewer in the rendering engine. This causes a vergence-accommodation mismatch. In some cases, different portions of a virtual object may come into focus at different depths. These depth mismatches may correspond to a type of out-of-plane spatial error, such as the examples shown in FIGS. 10A-10E.

Accordingly, this disclosure describes embodiments of metrology systems that can measure the quality of the light field generated by a display. Some metrology systems can map the topology and quality of the light field generated by the display and can provide information that guides the assessment of that light field's quality. Some metrology systems can capture the vector light field (e.g., direction and magnitude) generated by the display and permit analysis of focus and depth imperfections in the display. The information generated by the metrology systems described herein can be used in the spatial and chromatic calibration techniques applied to the light field display. Although the metrology systems described herein have particular application to light field displays (e.g., embodiments of the display systems 200, 400), this is not a limitation: the metrology systems can also be used in other embodiments, for example to measure the light from any type of display. Embodiments of the metrology system can be used to determine a 3D distortion field, from which effective spatial calibration information for the display can be derived. The metrology system can also be used for binocular calibration and for monocular RGB and depth plane calibration.

FIG. 17A shows an example of an object 1702 viewed by an eye 304 with a normal light field. The object 1702 may correspond to a real object, or to a virtual object generated by a substantially defect-free light field. The light rays 1706 associated with a point on the object 1702 appear to diverge from a single point, causing that point on the object 1702 to come into focus at a distance 1708 from the eye 304.

FIG. 17B shows an example of viewing an object 1710 with an imperfect light field. The object 1710 may correspond to a virtual object, for example one generated by a display system (e.g., the display system 400 shown in FIGS. 4 and 6). Due to imperfections in the generated light field, for example defects in the waveguides 420, 422, 424, 426, 428, 604, the light rays 1712 intended to correspond to a particular point on the object 1710 may appear to diverge from different points, or may exhibit a divergence different from what is intended. As a result, the object 1710 may be out of focus at the distance 1708. In addition, different portions of the object 1710 may come into focus at different depths or distances.

A metrology system can be used to measure the quality of the light field generated by a display. FIG. 18 shows an embodiment of a metrology system 1800 for measuring the light field quality of a display 1802. The display 1802 generates a light field comprising light rays 1804 directed toward a camera 1806. The display 1802 may correspond to a stacked waveguide assembly (e.g., the stacked waveguide assembly 405 shown in FIG. 4). Although the light rays 1804 are illustrated as substantially parallel, this is for illustration only: the light rays 1804 may be projected in different directions (e.g., divergent) in order to convey the different depths of one or more virtual objects presented by the light field. In addition, the light rays 1804 may be non-parallel due to defects in the display 1802 (see FIG. 17B).

In some embodiments, the camera 1806 can be used to capture at least a portion of the generated light field in order to measure, for example, the perceived depth of a virtual object represented in the light field. The camera 1806 can be configured to focus at a particular depth or distance (also referred to below as a "focus depth"). In some embodiments, this can be done using a lens with a small depth of focus (DOF). For example, the DOF can be smaller than the Z-distance over which display imperfections typically cause the focus depth to deviate from the intended focus depth (e.g., smaller than the distance between a peak of the measured depth map 1924 and the intended focus depth 1922 shown in FIG. 19C). In other embodiments, the DOF can be smaller than the camera-to-display distance multiplied by a factor, where the factor can be less than about 0.1, less than about 0.01, less than about 0.001, and so forth. The camera 1806 can be configured to capture a particular portion of the light field or the entire light field. The camera 1806 can be configured to capture the portion of the light field associated with a particular virtual object to be presented using the light field. The camera 1806 can be positioned so that it captures images substantially similar to what the eye 304 would perceive. The camera 1806 and the display 1802 can be moved relative to each other to map out the light field. For example, the relative motion can be parallel to the display 1802 (e.g., along the X direction shown in FIG. 18, or along the Y direction perpendicular to X and Z, not shown) or perpendicular to the display 1802 (e.g., along the Z direction shown in FIG. 18). In other embodiments, scanning optics (not shown) can be used to scan the camera 1806 and the display 1802 relative to each other. In some embodiments, the camera 1806 can be used to capture portions of the generated light field in order to determine a distortion map (e.g., as shown in FIG. 8), which can be used to identify spatial errors in the projected image (e.g., the in-plane spatial errors shown in FIGS. 9A-9E or the out-of-plane spatial errors shown in FIGS. 10A-10E). In addition, the camera 1806 can be used to identify luminance or chromatic errors in the generated light field (e.g., as shown in FIGS. 11-15).

In some embodiments, the camera 1806 can be movable so as to be oriented in different directions. For example, although the camera 1806 is shown facing the display 1802 orthogonally, the camera 1806 can also be rotated (e.g., about the Y axis or the X axis) so that it faces the display 1802 at different angles, allowing the camera 1806 to measure the light field generated by the display 1802 in different directions or orientations.

In various embodiments, the camera 1806 can be a digital camera, for example a short-focus digital camera. In other embodiments, the camera 1806 can be a light field camera.

The camera 1806 can be connected to a controller 1808 used to control the focus depth of the camera 1806, the field of view of the camera 1806, the exposure time, the relative movement of the camera 1806 and the display 1802, and so forth. In some embodiments, the controller 1808 can correspond to the controller 450 shown in FIG. 4. The controller 1808 can include a hardware processor and non-transitory data storage.

FIG. 19A shows an example of an image 1900 captured by a camera (e.g., the camera 1806) focused at a particular focus depth. The image 1900 can contain one or more regions 1902 that are in focus as well as one or more regions 1904 that are out of focus. Since the camera 1806 can be configured to focus at different focus depths, which regions of the image are in or out of focus can change. For example, if the camera is refocused to a different focus depth, the regions 1902 can move out of focus while portions of the regions 1904 can come into focus. By capturing multiple images of the light field at multiple different focus depths, the perceived depth of each region of the light field can be determined. For example, each pixel of a camera-captured image can be associated with the particular focus depth at which the portion of the light field corresponding to that pixel is in focus. A depth map or graph mapping regions of the generated light field to perceived depths can then be constructed. In addition, the depth map or graph can be specific to the focus depths that the display is intended to project, so that the intended focus depths of virtual objects presented in the light field can be compared with the actually measured focus depths.
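One common way to score how close a region of a capture is to being in focus is a contrast-based focus measure, such as the summed squared Laplacian response; this is an illustrative autofocus-style measure, not a method prescribed by the patent:

```python
def sharpness(region):
    """Contrast-based focus measure for a 2D grayscale region (a list
    of rows): the sum of squared 4-neighbor Laplacian responses over
    the region's interior pixels.  Higher values indicate the region
    is closer to being in focus at the camera's current focus depth."""
    score = 0.0
    for i in range(1, len(region) - 1):
        for j in range(1, len(region[0]) - 1):
            lap = (region[i - 1][j] + region[i + 1][j] +
                   region[i][j - 1] + region[i][j + 1] -
                   4.0 * region[i][j])
            score += lap * lap
    return score
```

A uniform (defocused) region scores zero, while a region containing sharp edges scores high; comparing scores for the same region across captures at different focus depths identifies the depth at which that region is in focus.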

FIG. 19B shows an example of a depth graph illustrating focus-depth measurements performed with the metrology system 1800. The graph 1910 plots the measured focus depth 1912 of the generated light field along a line across the light field emitted by the display 1802 (e.g., along the horizontal X axis of the light field shown in FIG. 18). In some embodiments, the graph 1910 can be generated by sweeping the focus depth of the camera 1806 across multiple different focus depths. For example, the camera 1806 can be focused at a focus depth 1914 (indicated by the horizontal dashed line). In an ideal display, the generated light field would make the actually measured depth of a virtual object exactly the intended depth; in an actual display, however, the two may differ because of defects in the display. Thus, regions of the light field whose measured focus depth is close to the focus depth 1914 (e.g., the region 1916) may be perceived as in focus, while regions whose measured focus depth differs significantly from the focus depth 1914 (e.g., the region 1918) may be perceived as out of focus.

FIG. 19C shows an example of a depth map generated from one or more captured images. The depth map 1920 contains an intended depth position 1922, at which the display 1802 intends the generated image to come into focus (shown as a horizontal plane in FIG. 19C), and a measured depth map 1924, which represents the focus depths (Z) at which the generated image actually comes into focus. Comparing the intended focus depth 1922 with the measured focus depths 1924 allows imperfections in the light field generated by the display 1802 to be identified and quantified across the display's field of view (FOV).

For example, if the intended focus depth of light focused at a horizontal position (X0, Y0) is Z0, and the measured focus depth at that position is Z, then (Z - Z0) is a measure of the display's focus imperfection at the position (X0, Y0). In some implementations, the actual horizontal position (X, Y) at which the light comes into focus can also be measured. In such embodiments, a vector measurement of the actual focus position relative to the intended focus position, (X, Y, Z) - (X0, Y0, Z0), can represent the imperfections in the light field generated by the display. This vector measurement of display imperfections provides both in-plane and out-of-plane (e.g., diopter) errors in 3D. In some embodiments, only the 2D vector error measurement (X, Y) - (X0, Y0) is used to measure (and calibrate) the in-plane errors. In some cases, the display's focus errors can be determined on a pixel-by-pixel basis. However, because of the large number of pixels in many displays (e.g., millions of pixels), the focus error data may be determined only for portions of the display or for groups of pixels sampled across the display (e.g., a 10x10 or 100x100 sampling across the display). The checkerboard pattern need not be square and can be designed to match the pixel structure of the display.
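The vector error and the out-of-plane error described above can be sketched as follows (function names are illustrative; positions are (x, y, z) tuples, with depths in meters for the diopter computation):

```python
def distortion_vector(measured, expected):
    """3D error vector (X, Y, Z) - (X0, Y0, Z0) between the measured
    and intended focus positions of one sampled point.  The X/Y
    components are the in-plane error; the Z component is the
    out-of-plane (depth) error."""
    return tuple(m - e for m, e in zip(measured, expected))


def diopter_error(measured_depth_m, expected_depth_m):
    """Out-of-plane error expressed in diopters (1/meters), the unit
    conventionally used for accommodation/focus error."""
    return 1.0 / measured_depth_m - 1.0 / expected_depth_m
```

Evaluating `distortion_vector` over the sampled grid yields the 3D distortion field from which spatial calibration information can be derived.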

FIG. 20 is a flowchart of an embodiment of a process 2001 for measuring the quality of a virtual target pattern generated by a light field display. The process 2001 can be performed by the metrology system 1800, for example by the controller 1808. In some implementations, the virtual target pattern is a checkerboard pattern with an array of alternating bright and dark regions. The checkerboard pattern can be used to sample portions of the display (e.g., a 10x10 or 100x100 or other size checkerboard), or its size can correspond to the number of pixels in the display. In other cases, pixel-by-pixel data can be acquired by sequentially switching one or more groups of pixels on and off and capturing images of the switched-on pixels. The checkerboard pattern (or sequence of switched on/off pixels) can comprise a random sequence of bright and dark regions, a geometric pattern of bright and dark regions, or any other type of calibration pattern. Embodiments of checkerboard patterns and pixel on/off sequences are described below with reference to FIGS. 22-23B. At block 2002, an initial focus depth can be set. In some embodiments, this includes setting the depth of a focusing lens on the camera. The initial focus depth can correspond to any depth of the virtual target pattern; for example, it can correspond to a minimum or maximum depth associated with the virtual target pattern.

At block 2004, an image of the virtual target pattern is captured at the selected focus depth. In some embodiments, the image can include portions that are in focus and portions that are out of focus. In some embodiments, the extent of the image can be concentrated on a particular virtual object associated with the virtual target pattern. In other embodiments, the image can correspond to the entire light field, including multiple virtual objects. The image can include pixel-by-pixel focus depth information across the virtual target pattern.

At block 2006, it is determined whether there are additional focus depths at which images are to be captured. If so, then at block 2008 a new focus depth is selected. In some embodiments, the number of focus depths can be based at least in part on the number of different depths that the display system can present (e.g., the number of depth planes 306 shown in FIG. 3 or the number of waveguides in the waveguide assembly shown in FIG. 4). In some embodiments, if the image is focused on a particular virtual object, the range of focus depths can be associated with one or more depths of that virtual object (e.g., the minimum and maximum depths associated with the virtual object).

If there are no additional focus depths at which to capture images, then at block 2010 the captured images of the virtual target pattern are analyzed to identify the depths Z (or lateral positions (X, Y)) at which different regions of the target pattern are actually in focus. For example, each captured image of the virtual target pattern corresponds to a particular focus depth and contains portions that are in focus and portions that are out of focus. In some embodiments, each image can be divided into one or more regions corresponding to regions of the light field, and autofocus techniques can be used to determine the depth at which each region comes into focus. In some embodiments, each region can correspond to a pixel.
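A sketch of this depth-from-focus analysis: given a stack of per-region focus scores (one capture per focus depth), each region is assigned the focus depth at which it scored sharpest. The data layout and names are illustrative:

```python
def depth_from_focus(stack, depths):
    """Build a per-region depth map from a focus sweep.

    `stack[k][i][j]` is a focus score (e.g., from a contrast-based
    focus measure) for region (i, j) of the image captured at focus
    depth `depths[k]`.  Each region is assigned the depth whose
    capture scored sharpest."""
    rows, cols = len(stack[0]), len(stack[0][0])
    depth_map = []
    for i in range(rows):
        row = []
        for j in range(cols):
            best = max(range(len(depths)), key=lambda k: stack[k][i][j])
            row.append(depths[best])
        depth_map.append(row)
    return depth_map
```

The resulting grid of depths is the measured depth map that is subsequently compared against the intended focus depths.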

在區塊2012,會至少部分地依據測量的聚焦深度(或橫向位置)來建立深度圖。該深度圖可以是將聚焦深度映射到光場位置的任何類型的數據結構或可視化。例如,該深度圖可以包括擷取影像的一個或多個像素的深度信息(例如,Z軸聚焦深度的測量,或結合橫向聚焦位置(X和/或Y位置)的Z軸聚焦深度的測量)。在一些實施例中,像素可以對應於與目標虛擬物體相關聯的像素雲(pixel cloud)。這樣,該深度圖可以指定當通過顯示光學器件觀看時虛擬物體的實際感知深度。 At block 2012, a depth map is created based, at least in part, on the measured focus depths (or lateral positions). The depth map can be any type of data structure or visualization that maps focus depths to light field positions. For example, the depth map may include depth information for one or more pixels of the captured images (e.g., measurements of Z-axis focus depth, or of Z-axis focus depth combined with lateral focus position (X and/or Y positions)). In some embodiments, the pixels may correspond to a pixel cloud associated with the target virtual object. In this way, the depth map can specify the actual perceived depth of a virtual object when viewed through the display optics.
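The per-region focus analysis and depth-map construction of blocks 2010 and 2012 can be sketched in code. This is a minimal illustration under stated assumptions, not the implementation described here: the Laplacian-variance focus metric, the square region size, and the function names are choices made for clarity.

```python
import numpy as np

def focus_metric(region):
    """Variance of a discrete Laplacian; large when the region is sharply in focus."""
    lap = (-4.0 * region
           + np.roll(region, 1, axis=0) + np.roll(region, -1, axis=0)
           + np.roll(region, 1, axis=1) + np.roll(region, -1, axis=1))
    return lap.var()

def build_depth_map(image_stack, focus_depths, region=16):
    """Assign each region the focus depth whose image maximizes the focus metric.

    image_stack: list of 2-D grayscale images, one per selected focus depth.
    focus_depths: the depth at which each image in the stack was captured.
    """
    h, w = image_stack[0].shape
    gh, gw = h // region, w // region
    depth_map = np.zeros((gh, gw))
    for gy in range(gh):
        for gx in range(gw):
            ys, xs = gy * region, gx * region
            scores = [focus_metric(img[ys:ys + region, xs:xs + region])
                      for img in image_stack]
            depth_map[gy, gx] = focus_depths[int(np.argmax(scores))]
    return depth_map
```

The resulting depth map could then be compared against the depths at which the virtual objects were intended to appear, as in block 2014.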

在區塊2014,可以將深度圖與一個或多個期望聚焦深度進行比較,其中該期望聚焦深度對應於一個或多個虛擬物體被呈現的深度。通過檢查虛擬物體的實際感知深度與虛擬物體意圖出現的焦點深度之間的差異,可以發現光場中的缺陷和/或偏離。 At block 2014, the depth map can be compared with one or more desired focus depths, where the desired focus depths correspond to the depths at which one or more virtual objects are intended to be presented. By examining the differences between the actual perceived depths of virtual objects and the focus depths at which the virtual objects are intended to appear, defects and/or deviations in the light field can be identified.

在區塊2016,可以至少部分地根據深度圖和期望的聚焦深度的比較結果來執行誤差校正。該誤差校正可以補償光場顯示中的缺陷或顯示器投射的影像內容。 At block 2016, error correction can be performed based, at least in part, on the comparison between the depth map and the desired focus depths. The error correction can compensate for defects in the light field display or in the image content projected by the display.

可以對光場顯示器的波導組件405中的每個波導重複程序2001,以映射每個波導的缺陷。在某些情況下,可以存在對應於多個深度平面的多個波導以及對應於多種顏色(例如,紅色(R)、綠色(G)和藍色(B))的多個波導。例如,某些顯示器,對於每個深度平面存在三個色彩平面,因此具有兩個深度平面的波導組件可以具有2x3=6個波導。照相機1806可以是色彩敏感的照相機,或是多個照相機的組合,其中每一個對色彩的子集敏感。由計量系統1800獲得的聚焦深度信息可以用於確定聚焦誤差的空間分佈以及顯示器的色彩(彩色)缺陷的分佈。 Process 2001 can be repeated for each waveguide in the waveguide assembly 405 of the light field display to map the defects of each waveguide. In some cases, there may be multiple waveguides corresponding to multiple depth planes, as well as multiple waveguides corresponding to multiple colors (e.g., red (R), green (G), and blue (B)). For example, in some displays there are three color planes for each depth plane, so a waveguide assembly with two depth planes can have 2x3 = 6 waveguides. Camera 1806 can be a color-sensitive camera, or a combination of cameras, each sensitive to a subset of the colors. The focus depth information obtained by the metrology system 1800 can be used to determine the spatial distribution of focus errors as well as the distribution of chromatic (color) defects of the display.

在一些實施例中,可以使用光場相機來擷取由顯示器1802產生的光場(例如,使用具有掃描焦點的數位相機),來代替在多個不同的聚焦深度所擷取的多個影像。可以分析擷取的光場的焦點和/或深度缺陷。通過分析擷取的光場中的光線之向量,可以確定各個區域的聚焦深度。然後可以比較所識別的焦點深度和一個或多個預期焦點深度,並且可以執行適當的誤差校正(如在區塊2016中)。例如,將實際測量的對焦位置(X,Y,Z)與預期的聚焦位置(X0,Y0,Z0)相關聯的向量誤差可以被確定為:向量誤差=(X,Y,Z)-(X0,Y0,Z0),用於描述顯示器產生的光場中的缺陷。 In some embodiments, instead of multiple images captured at multiple different focus depths, a light field camera can be used to capture the light field produced by the display 1802 (e.g., using a digital camera with a scanning focus). The captured light field can be analyzed for focus and/or depth defects. By analyzing the vectors of the light rays in the captured light field, the focus depth of each region can be determined. The identified focus depths can then be compared with one or more expected focus depths, and appropriate error correction can be performed (as in block 2016). For example, a vector error relating the measured actual focus position (X, Y, Z) to the expected focus position (X0, Y0, Z0) can be determined as: vector error = (X, Y, Z) - (X0, Y0, Z0), which describes defects in the light field produced by the display.
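The vector-error computation above is straightforward; a sketch (the array shapes and the RMS summary are assumptions added for illustration) is:

```python
import numpy as np

def light_field_vector_errors(measured, expected):
    """Per-sample vector error (X, Y, Z) - (X0, Y0, Z0) between measured and
    expected focus positions, plus an RMS magnitude summarizing the defects."""
    errors = np.asarray(measured, dtype=float) - np.asarray(expected, dtype=float)
    rms = float(np.sqrt((errors ** 2).sum(axis=-1).mean()))
    return errors, rms
```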

■色彩平衡顯示器的實施例方法■ Example Methods of Color Balancing a Display

如上所述,全色顯示器的一些實現方式係藉由組合顯示器投射紅色(R)、綠色(G)和藍色(B)波長的光,並在觀看者的視網膜上產生三色刺激反應。理想的顯示器對於這三個色彩層具有空間均勻的輝度;然而,由於硬體缺陷,實際顯示器在視場的輝度上會具有一些量的變化。如果對於不同的色彩層有不相同的變化,則該變化會在顯示器的視場(FOV)(例如,如圖11所示)上產生不均勻的色度。本發明描述了校正色彩變化並試圖讓FOV上的色度均勻的方法之實施例。例如,可以調節顯示器的各個色彩層(例如,R、G和B)的強度,使得顯示器的白點在FOV上基本上是均勻的。 As described above, some implementations of a full-color display produce an image by combining projected light of red (R), green (G), and blue (B) wavelengths, which produces a tristimulus response on the viewer's retina. An ideal display has spatially uniform luminance for the three color layers; however, due to hardware imperfections, a real display will have some amount of variation in luminance across the field of view. If that variation differs between the color layers, it produces non-uniform chromaticity across the display's field of view (FOV) (e.g., as shown in FIG. 11). Embodiments of methods for correcting chromatic variations, in an attempt to make the chromaticity uniform across the FOV, are described herein. For example, the intensities of the display's individual color layers (e.g., R, G, and B) can be adjusted so that the white point of the display is substantially uniform across the FOV.

在一些實施方式中,這裡描述的光場度量系統可以用於顯示器的色彩平衡。例如,數位彩色照相機可以拍攝顯示器的影像(例如,使用如圖18所示的度量系統1800),對於顯示器的一些或全部像素,可以從該影像來確定顯示器的色彩反應。在許多顯示器中,存在三個色彩層(例如,R、G和B),然而,本發明不限於RGB或3色顯示器。本發明方法可以應用於任何數量的色彩層(例如,2、3、4、5、6或更多)和任何顏色(例如青色、洋紅色、黃色、黑色)的選擇。 In some implementations, the light field metrology system described herein can be used to color balance a display. For example, a digital color camera can capture images of the display (e.g., using the metrology system 1800 shown in FIG. 18), from which the color response of the display can be determined for some or all of the display's pixels. In many displays there are three color layers (e.g., R, G, and B); however, the present disclosure is not limited to RGB or three-color displays. The method can be applied to any number of color layers (e.g., 2, 3, 4, 5, 6, or more) and any choice of colors (e.g., cyan, magenta, yellow, black).

圖14A(色彩校準前)和圖15(色彩校準後)係為用於RGB顯示器的特定執行的量測色彩平衡之實施例。圖14A和圖15為包括橫跨顯示器像素(水平軸)的R、G和B強度(垂直軸)的分佈繪製圖(分別為1400,1500)。圖14B包括用於顯示器像素(水平軸)的最大色彩非平衡(垂直軸)的曲線1408,呈現出顏色校正之前的平均值,以及平均值加上或減去最大誤差。 FIG. 14A (before color calibration) and FIG. 15 (after color calibration) show example measurements of color balance for a particular implementation of an RGB display. FIGS. 14A and 15 are plots (1400 and 1500, respectively) of the distributions of R, G, and B intensities (vertical axis) across the display's pixels (horizontal axis). FIG. 14B includes a curve 1408 of the maximum color imbalance (vertical axis) across the display's pixels (horizontal axis), showing the mean value before color correction as well as the mean plus or minus the maximum error.

如上所述,圖14A呈現出未校準的顯示器在顯示器的像素上具有明顯不均勻的顏色反應。紅色和藍色的反應相同,R和B強度朝著繪製圖1400的右側突出。綠色反應通常小於R或B反應,並向著繪製圖1400的右邊減小。圖15揭露出應用以下所描述的色彩校準之後,已校準的顯示器在顯示器像素上具有實質上均勻的色彩反應。 As shown in FIG. 14A, the uncalibrated display has a markedly non-uniform color response across the display's pixels. The red and blue responses are essentially the same, with the R and B intensities peaking toward the right side of plot 1400. The green response is generally smaller than the R or B response and decreases toward the right side of plot 1400. FIG. 15 shows that, after applying the color calibration described below, the calibrated display has a substantially uniform color response across the display's pixels.

本發明所述的色彩平衡系統和方法之實施例,提供調整多色顯示器中的至少某些色彩層的強度之技術,使得顯示器的白點在顯示器FOV上基本上是均勻的。在各種實施方式中,顯示器可以是光場顯示器。例如,顯示器可以具有在多個深度平面向觀看者呈現彩色影像的能力。色彩平衡系統和方法的實施例可以應用於色彩平衡顯示器208(圖2)、顯示系統400(圖4-6)和顯示器2500(圖25A、25B、26)。 Embodiments of the color balancing systems and methods described herein provide techniques for adjusting the intensities of at least some of the color layers in a multi-color display so that the white point of the display is substantially uniform across the display's FOV. In various implementations, the display can be a light field display. For example, the display may have the ability to present color images to a viewer at multiple depth planes. Embodiments of the color balancing systems and methods can be applied to color balance the display 208 (FIG. 2), the display system 400 (FIGS. 4-6), and the display 2500 (FIGS. 25A, 25B, 26).

人的眼睛不會以線性方式來感知光的準位。例如,與理想、線性的顯示器相比,人眼對暗色調的變化比對類似的亮色調變化更為敏感,這允許人類視覺系統在寬範圍的輝度準位上操作。現實世界的顯示器也不提供精確的線性亮度反應。此外,數位影像通常被編碼用以表示在感知上更均勻的色調準位。人的視覺感知、顯示輸出和影像編碼通常被塑造為遵循相對於亮度或色階的近似冪定律關係。例如,輸出電平與輸入電平的伽馬次方成比例:Vout ∝ Vin^γ。這種非線性、冪定律特性通常被稱為伽馬校正、伽馬編碼或單純地稱為伽瑪。 The human eye does not perceive light levels in a linear fashion. For example, compared to an idealized linear display, the human eye is more sensitive to changes in dark tones than to similar changes in light tones, which allows the human visual system to operate over a wide range of brightness levels. Real-world displays also do not provide an exactly linear brightness response. In addition, digital images are commonly encoded to represent tonal levels that are more perceptually uniform. Human visual perception, display output, and image encoding are all typically modeled as following an approximate power-law relationship with respect to brightness or color level. For example, the output level is proportional to the input level raised to a power gamma: Vout ∝ Vin^γ. This nonlinear, power-law characteristic is commonly referred to as gamma correction, gamma encoding, or simply gamma.
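As a small illustration of the power-law relationship (the exponent value 2.2 is a commonly used convention, assumed here rather than specified by this description):

```python
def gamma_encode(v, gamma=2.2):
    """Map a linear level in [0, 1] to a perceptually more uniform encoded level."""
    return v ** (1.0 / gamma)

def gamma_decode(v, gamma=2.2):
    """Invert gamma_encode: recover the linear level, V_out = V_in ** gamma."""
    return v ** gamma
```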

在某些實施例中,如果顯示器中各個色彩層的輝度平坦度在跨越顯示器的FOV上幾乎是均勻的,則色彩平衡可以包括縮放各個色彩層的強度,以實現跨越顯示器的均勻色彩平衡。在各種實施例中,如果跨越顯示器的FOV上的輝度變化小於1%、小於5%或小於10%,則顯示器可以具有適當的輝度平坦度。由於顯示器的伽馬反應和人類視覺感知,這種直接的縮放在某些情況下可能具有某些缺點。 In some embodiments, if the luminance flatness of each color layer in the display is approximately uniform across the display's FOV, color balancing can include scaling the intensities of the individual color layers to achieve a uniform color balance across the display. In various embodiments, the display may have adequate luminance flatness if the luminance variation across the display's FOV is less than 1%, less than 5%, or less than 10%. Because of the display's gamma response and human visual perception, such direct scaling may in some cases have certain drawbacks.
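When each layer is sufficiently flat, the scaling step can be as simple as one gain per color layer. A sketch follows; scaling every layer down to the dimmest layer (so that no gain exceeds the drive range) is an assumed design choice, not one stated here:

```python
import numpy as np

def layer_gains(layer_mean_luminance):
    """Per-layer gains that equalize mean luminance across color layers by
    scaling each layer down to the dimmest layer's level."""
    m = np.asarray(layer_mean_luminance, dtype=float)
    return m.min() / m
```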

如果顯示器的色彩層不具有實質的輝度平坦性,則色彩平衡可以不僅僅包括縮放各個色彩層的強度。例如,色彩平衡可以嘗試在顯示器的每個像素(或一組像素)處單獨地平衡白點。在某些這樣的實施例中,可以實現跨越顯示器的FOV的色彩平衡,而不使跨越FOV上的輝度變平。可以附加或替代地執行輝度平坦化以達到色彩平衡。 If the display's color layers do not have substantial luminance flatness, color balancing may involve more than scaling the intensities of the individual color layers. For example, color balancing may attempt to balance the white point individually at each pixel (or group of pixels) of the display. In some such embodiments, color balance across the display's FOV can be achieved without flattening the luminance across the FOV. Luminance flattening may additionally or alternatively be performed to achieve color balance.

色彩平衡顯示器的目的是讓顯示器的觀看者感知跨越顯示器FOV的均勻色彩平衡。為了測量和調整顯示器的色彩平衡,使用校準相機(而不是人眼)來記錄顯示輸出的影像。可以假定相機代表人類對顯示輸出的感知,因此如果顯示器的相機影像是色彩平衡的,則觀看者對顯示器的感知也將是色彩平衡的。 The goal of color balancing a display is for a viewer of the display to perceive a uniform color balance across the display's FOV. To measure and adjust the display's color balance, a calibration camera (rather than a human eye) is used to record images of the display output. The camera can be assumed to represent human perception of the display output, so that if the camera images of the display are color balanced, the viewer's perception of the display will also be color balanced.

在一些實施例中,以下模式用於將顯示器色彩層之像素值轉換成校準相機所測量的顏色像素值。在以下實施例中,有三個色彩層,被假設為R、G和B;然而,這是為了說明的目的,而不是限制。在其他情況下,色彩層的任何數量和色調都可用於色彩平衡技術的實施例。此外,在應用此模式之前,可以考慮顯示器和相機的像素大小之間的適當縮放。 In some embodiments, the following model is used to convert the pixel values of the display's color layers into the color pixel values measured by the calibration camera. In the following example there are three color layers, assumed to be R, G, and B; however, this is for purposes of illustration and not limitation. In other cases, any number and hue of color layers can be used with embodiments of the color balancing technique. Further, appropriate scaling between the pixel sizes of the display and the camera can be accounted for before applying the model.

在等式(1)中,[Rd、Gd、Bd]為發送到顯示器的RGB影像的強度值。在許多情況下(例如,標準RGB或sRGB),強度值在0和255之間。伽馬1{}表示第一非線性伽馬函數(具有指數γ1),其將顯示色階映射到中間色調表示式[R1 G1 B1]。耦合()表示將色度值[R1 G1 B1]映射到第二中間色調表示式[R2 G2 B2]的函數。耦合()函數可以是線性函數,例如,3×3矩陣(在3個色彩層的情況下)。在其他實施例中,耦合()函數可以是非線性的。伽馬2{}為將第二中間色調表示式[R2 G2 B2]映射到由校準相機記錄的像素強度[Rc Gc Bc]的第二非線性伽馬函數(具有指數γ2)。 In Equation (1), [Rd, Gd, Bd] are the intensity values of the RGB image sent to the display. In many cases (e.g., standard RGB, or sRGB), the intensity values are between 0 and 255. Gamma1{} denotes a first nonlinear gamma function (with exponent γ1) that maps the display color levels to an intermediate color representation [R1 G1 B1]. Coupling() denotes a function that maps the values [R1 G1 B1] to a second intermediate color representation [R2 G2 B2]. The Coupling() function can be a linear function, for example a 3×3 matrix (in the case of three color layers). In other embodiments, the Coupling() function can be nonlinear. Gamma2{} is a second nonlinear gamma function (with exponent γ2) that maps the second intermediate color representation [R2 G2 B2] to the pixel intensities [Rc Gc Bc] recorded by the calibration camera.
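A sketch of the Equation (1) pipeline with global gamma exponents and a per-pixel linear 3×3 Coupling(). The exact exponent conventions, the [0, 1] normalization, and the function name are assumptions chosen for illustration:

```python
import numpy as np

def predict_camera_rgb(display_rgb, coupling, gamma1=2.2, gamma2=2.2):
    """Eq. (1): [Rc Gc Bc] = Gamma2{ Coupling( Gamma1{ [Rd Gd Bd] } ) }.

    display_rgb: (H, W, 3) display values normalized to [0, 1].
    coupling:    (H, W, 3, 3) per-pixel linear coupling matrices.
    gamma1, gamma2: global exponents (constant across the FOV).
    """
    mid1 = display_rgb ** gamma1                       # Gamma1{}: [R1 G1 B1]
    mid2 = np.einsum('hwij,hwj->hwi', coupling, mid1)  # Coupling(): [R2 G2 B2]
    return np.clip(mid2, 0.0, 1.0) ** gamma2           # Gamma2{}: camera [Rc Gc Bc]
```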

在一些實施例中,第一和第二伽馬函數是在顯示器的FOV上的全局函數(例如,指數γ1和γ2在FOV上是恆定的)。耦合()可以是局部(像素相關)函數,其在整個FOV中從像素到像素而變化。由耦合()函數提供的每像素顏色映射允許每個像素達到色彩平衡。 In some embodiments, the first and second gamma functions are global across the display's FOV (e.g., the exponents γ1 and γ2 are constant over the FOV). Coupling() can be a local (pixel-dependent) function that varies from pixel to pixel across the FOV. The per-pixel color mapping provided by the Coupling() function allows each pixel to be color balanced.

為了確定函數伽馬1{}、伽馬2{}和耦合(),可以由照相機擷取顯示器的一系列一個或多個影像,並且可以由編程的分析系統來分析,以執行反覆最適化演算法(例如,爬山演算法、局部搜索、單純形法、遺傳算法等),找出使顯示器達到合理色彩平衡的伽馬和耦合函數的合適契合。當分析系統搜索伽馬和耦合函數的合適契合時,分析系統可以在反覆過程期間藉由擷取顯示器的附加影像(一個或多個)來使用反饋。例如,可以藉由反覆地調整這些函數來確定函數伽馬1{}、伽馬2{}和耦合(),以改善或優化跨越顯示器的FOV的相機影像的色彩平衡。可以反覆地調整這些函數,直到在反覆過程期間獲取的相機影像的白點在顯示器的FOV上基本上是均勻的。在各種實施例中,基本上均勻的白點分佈與白點的變化相關聯,該變化在顏色系統內被測量為跨越該FOV之小於10%、小於5%或小於1%的白點值。例如,可以使用由國際照明委員會(CIE)提供的色彩空間。在一些實施例中,基本上均勻的白點分佈可與小於色彩空間的恰辨差(JND)之閾值數量的白點變化相關聯。在一些實施例中,首先反覆地計算伽馬傳遞函數伽瑪1{}和伽瑪2{},然後一旦已經計算了伽馬函數(例如,指數γ1和γ2),再計算耦合()函數。 To determine the functions Gamma1{}, Gamma2{}, and Coupling(), a series of one or more images of the display can be captured by the camera and analyzed by a programmed analysis system that executes an iterative optimization algorithm (e.g., hill climbing, local search, the simplex method, genetic algorithms, etc.) to find a suitable fit of the gamma and coupling functions that provides a reasonably color balanced display. While the analysis system searches for a suitable fit of the gamma and coupling functions, it can use feedback during the iterative process by capturing additional image(s) of the display. For example, the functions Gamma1{}, Gamma2{}, and Coupling() can be determined by iteratively adjusting these functions to improve or optimize the color balance of the camera images across the display's FOV. The functions can be adjusted iteratively until the white point of the camera images acquired during the iterative process is substantially uniform across the display's FOV. In various embodiments, a substantially uniform white point distribution is associated with a variation in white point, measured within a color system, of less than 10%, less than 5%, or less than 1% of the white point value across the FOV. For example, a color space provided by the International Commission on Illumination (CIE) can be used. In some embodiments, a substantially uniform white point distribution can be associated with a variation in white point that is less than a threshold number of just noticeable differences (JNDs) of the color space. In some embodiments, the gamma transfer functions Gamma1{} and Gamma2{} are first computed iteratively, and then, once the gamma functions (e.g., the exponents γ1 and γ2) have been computed, the Coupling() function is computed.
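A minimal stand-in for the first stage of such a fit, recovering a single global gamma exponent from display/camera sample pairs. A brute-force grid search replaces the hill-climbing or simplex optimizer described above, and the grid range and sample format are assumptions:

```python
import numpy as np

def fit_global_gamma(display_levels, camera_levels):
    """Search a 1-D grid of candidate exponents for the least-squares best fit of
    camera = display ** gamma over normalized sample pairs in (0, 1]."""
    d = np.asarray(display_levels, dtype=float)
    c = np.asarray(camera_levels, dtype=float)
    grid = np.linspace(0.5, 4.0, 351)  # candidate exponents, step 0.01
    errors = [np.sum((d ** g - c) ** 2) for g in grid]
    return float(grid[int(np.argmin(errors))])
```

In a full system this 1-D fit would be repeated for each gamma function, with the per-pixel Coupling() fit afterwards, following the ordering described above.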

一種用於在製造環境中校準顯示器的生產過程,可以當顯示器沿著生產線傳輸時自動描繪顯示器的特性。例如,在生產過程中的適當點,本發明所述的校準相機和分析系統可以執行反覆分析以獲得用於特定顯示器的伽馬傳遞函數和耦合函數,並將所得的伽馬和耦合函數儲存在與顯示器相關聯的儲存器。然後,顯示器具有自動執行色彩平衡的能力。 A production process for calibrating displays in a manufacturing environment can automatically characterize displays as they pass along the production line. For example, at a suitable point in the production process, the calibration camera and analysis system described herein can perform the iterative analysis to obtain the gamma transfer functions and coupling function for a particular display, and store the resulting gamma and coupling functions in a memory associated with the display. The display then has the ability to automatically perform color balancing.

在特定顯示器的使用期間,一旦伽馬傳遞函數伽馬1{}和伽馬2{}以及耦合()函數對於該特定顯示器是已知時,則可將適當的顯示像素值[Rd Gd Bd]輸入到等式(1)以實現色彩平衡輸出。例如,為特定顯示器確定的伽馬指數和耦合()函數可以儲存在顯示器內合宜的儲存器中,並被取出以變換輸入影像像素顏色值,以提供來自顯示器的色彩平衡輸出。在一些實施例中,可穿戴顯示系統200的局部處理和數據模組224可以儲存伽馬傳遞和耦合函數,並且處理模組可以利用等式(1)輸出即時色彩平衡影像(圖2)。在其他實施例中,顯示系統400的控制器450可以根據等式(1)和儲存的伽馬和耦合函數(圖4)來執行色彩平衡。在其它實施例中,如下所述,動態校準系統2600的動態校準處理器2610可以使用等式(1)和儲存的伽馬和耦合函數對顯示器2500(圖26)執行色彩平衡。 During use of a particular display, once the gamma transfer functions Gamma1{} and Gamma2{} and the Coupling() function are known for that display, appropriate display pixel values [Rd Gd Bd] can be input to Equation (1) to achieve a color balanced output. For example, the gamma exponents and Coupling() function determined for a particular display can be stored in suitable memory in the display and retrieved to transform input image pixel color values, so as to provide a color balanced output from the display. In some embodiments, the local processing and data module 224 of the wearable display system 200 can store the gamma transfer and coupling functions, and the processing module can use Equation (1) to output color balanced images in real time (FIG. 2). In other embodiments, the controller 450 of the display system 400 can perform the color balancing according to Equation (1) and the stored gamma and coupling functions (FIG. 4). In still other embodiments, as described below, the dynamic calibration processor 2610 of the dynamic calibration system 2600 can perform color balancing for the display 2500 (FIG. 26) using Equation (1) and the stored gamma and coupling functions.
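At run time the stored calibration is applied in the other direction: given a target (color balanced) camera response, solve Equation (1) for the display values. A sketch under the same assumed conventions as before (global exponents, per-pixel linear coupling, values normalized to [0, 1]):

```python
import numpy as np

def color_balanced_display_rgb(target_camera_rgb, coupling, gamma1=2.2, gamma2=2.2):
    """Invert Eq. (1) per pixel: find [Rd Gd Bd] such that the calibration camera
    would record the target [Rc Gc Bc]."""
    mid2 = target_camera_rgb ** (1.0 / gamma2)        # invert Gamma2{}
    inv = np.linalg.inv(coupling)                     # per-pixel inverse Coupling()
    mid1 = np.einsum('hwij,hwj->hwi', inv, mid2)
    return np.clip(mid1, 0.0, 1.0) ** (1.0 / gamma1)  # invert Gamma1{}
```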

參考圖27和圖28所述的根據眼動追踪來動態地校準顯示器的方法2700或處理流程2805的實施例(以下更詳細地描述),也可以執行色彩平衡和其他誤差校正/校準的功能。例如,在方法2700的區塊2720處取出的校準可以包括伽瑪和耦合函數,並且在區塊2730,可以藉由使用等式(1)和所取出的伽馬和耦合函數來校正顯示器的色彩缺陷。作為另一實施例,處理流程2805的區塊2880可以取出伽馬和耦合函數並在校準期間應用它們。 Embodiments of the method 2700 or the process flow 2805 for dynamically calibrating a display based on eye tracking, described in greater detail below with reference to FIGS. 27 and 28, can also perform color balancing and other error correction/calibration functions. For example, the calibration retrieved at block 2720 of the method 2700 can include the gamma and coupling functions, and at block 2730 the color defects of the display can be corrected by using Equation (1) and the retrieved gamma and coupling functions. As another example, block 2880 of the process flow 2805 can retrieve the gamma and coupling functions and apply them during calibration.

圖21係為用於校準顯示器的方法2150之實施流程圖。顯示器可以是光場顯示器,例如顯示器208(圖2)、顯示系統400(圖4-6)或顯示器2500(圖25A、25B、26)。方法2150可由分析系統來執行(包括相機和由電腦硬體執行的分析程序,例如圖18所示的計量系統1800),作為顯示器製造過程的生產線之一部分(例如,作為參考圖28所述的過程2805之一部分)。方法2150可以作為參考圖16所述的流程圖1600之區塊1602的相機校準之一部分來執行。在一些實施例中,方法2150應用等式(1)以確定顯示器與相機之間的適當變換(假設相機代表顯示器的觀看者的視覺感知)。在區塊2160,由相機擷取顯示器的影像。在區塊2170,確定在顯示器和相機之間變換的全局變換參數。全局變換參數可以包括不在跨越顯示器的FOV上變化的參數(例如,不是像素相關的參數)。例如,全局變換參數可以包括Gamma1{}和Gamma2{}函數。在某些情況下,方法2150可返回到區塊2160以獲取一個或多個附加影像,作為用於確定全局變換參數的反饋過程的一部分疊代。在獲得對全局變換參數的合適契合之後,方法2150移動到區塊2180,其中將局部(例如,像素相關)變換參數契合到相機影像。例如,局部變換參數可以包括耦合()函數(例如,跨越顯示器的FOV的像素位置處的該函數值)。在某些情況下,方法2150可返回到區塊2160以獲取一個或多個附加影像,作為用於確定局部變換參數的反饋過程的一部分疊代。在一些實施例中,在區塊2160處獲得附加影像之後,方法2150可跳回到區塊2180以繼續契合局部變換參數,而不經過區塊2170,因為先前已確定了全局變換參數。在獲得局部變換參數與相機影像的合適契合之後,方法2150移動到區塊2190,其中局部變換參數和全局變換參數被儲存在與顯示器相關聯的儲存器中(例如,局部數據模塊71)。如上所述,在用於動態校準顯示器的方法2700的區塊2720處,可以存取局部和全局變換參數作為顯示器校準的一部分,並且在區塊2730,可以應用局部和全局變換參數以及等式(1),從顯示器中產生色彩平衡影像。 FIG. 21 is a flowchart of an example method 2150 for calibrating a display. The display can be a light field display, e.g., the display 208 (FIG. 2), the display system 400 (FIGS. 4-6), or the display 2500 (FIGS. 25A, 25B, 26). The method 2150 can be performed by an analysis system (including a camera and an analysis program executed by computer hardware, such as the metrology system 1800 shown in FIG. 18) as part of a production line in a display manufacturing process (e.g., as part of the process 2805 described with reference to FIG. 28). The method 2150 can be performed as part of the camera calibration of block 1602 of the flowchart 1600 described with reference to FIG. 16. In some embodiments, the method 2150 applies Equation (1) to determine an appropriate transformation between the display and the camera (which is assumed to represent the visual perception of a viewer of the display). At block 2160, an image of the display is captured by the camera. At block 2170, global transformation parameters for the transformation between the display and the camera are determined. The global transformation parameters can include parameters that do not vary across the display's FOV (e.g., parameters that are not pixel-dependent). For example, the global transformation parameters can include the Gamma1{} and Gamma2{} functions. In some cases, the method 2150 can return to block 2160 to acquire one or more additional images as part of an iterative feedback process for determining the global transformation parameters. After a suitable fit to the global transformation parameters is obtained, the method 2150 moves to block 2180, where local (e.g., pixel-dependent) transformation parameters are fit to the camera images. For example, the local transformation parameters can include the Coupling() function (e.g., the values of this function at pixel positions across the display's FOV). In some cases, the method 2150 can return to block 2160 to acquire one or more additional images as part of an iterative feedback process for determining the local transformation parameters. In some embodiments, after an additional image is acquired at block 2160, the method 2150 can jump back to block 2180 to continue fitting the local transformation parameters, rather than passing through block 2170, because the global transformation parameters have previously been determined. After a suitable fit of the local transformation parameters to the camera images is obtained, the method 2150 moves to block 2190, where the local and global transformation parameters are stored in a memory associated with the display (e.g., the local data module 71). As described above, at block 2720 of the method 2700 for dynamically calibrating a display, the local and global transformation parameters can be accessed as part of the display's calibration, and at block 2730 the local and global transformation parameters and Equation (1) can be applied to produce color balanced images from the display.

儘管針對顯示器的色彩平衡的情況進行了描述,但是本系統和方法不受限於此,並且可以應用於校正顯示器的其他色彩(或空間)缺陷(例如,上述的任何色彩或空間缺陷)。例如,如上所述,顯示器可能呈現輝度平坦度的變化,並且本發明的分析技術的實施例可以確定用於校正輝度平坦度缺陷的輝度平坦度校準。另外地或替代地,顯示器可能展示空間缺陷,包括平面內平移、旋轉、縮放或扭曲誤差以及平面外(例如,焦深)誤差。本發明的分析技術的實施例可以確定用於這些空間誤差中的某些或全部的校準。 Although described in the context of color balancing a display, the present systems and methods are not so limited and can be applied to correct other chromatic (or spatial) defects of a display (e.g., any of the chromatic or spatial defects described above). For example, as described above, a display may exhibit luminance flatness variations, and embodiments of the analysis techniques described herein can determine a luminance flatness calibration used to correct luminance flatness defects. Additionally or alternatively, a display may exhibit spatial defects, including in-plane translation, rotation, scaling, or warping errors, as well as out-of-plane (e.g., focal depth) errors. Embodiments of the analysis techniques described herein can determine calibrations for some or all of these spatial errors.

■使用校準圖案的顯示校準實施例■ Example Display Calibration Using Calibration Patterns

顯示器中的缺陷可能導致顯示器投射的虛擬物體看起來空間失真或色度失真。為了校正這些失真,可以首先藉由測量該失真,然後執行任何必要的誤差校正(例如,使用圖18中所示的計量系統1800)來校準顯示器。顯示器校準可以包括使用顯示器來投影校準圖案,例如棋盤圖案(例如,如圖7所示),以及用相機擷取所得到的影像。然後可以處理所擷取的影像,通過量化圖案特徵點的期望位置與其測量位置之間的誤差,來確定校準圖案的特徵點位置處的失真。對於具有單獨的顏色層(例如,紅色(R)、綠色(G)和藍色(B))的顯示器,該校準也可以校正顏色品質和影像品質。 Defects in a display can cause virtual objects projected by the display to appear spatially or chromatically distorted. To correct these distortions, the display can be calibrated by first measuring the distortions and then performing any necessary error correction (e.g., using the metrology system 1800 shown in FIG. 18). Display calibration can include using the display to project a calibration pattern, e.g., a checkerboard pattern (e.g., as shown in FIG. 7), and capturing the resulting image with a camera. The captured image can then be processed to determine the distortions at the feature point positions of the calibration pattern by quantifying the errors between the expected positions of the pattern's feature points and their measured positions. For displays with separate color layers (e.g., red (R), green (G), and blue (B)), this calibration can also correct color quality and image quality.

圖22係為使用校準圖案的校準系統2200之實施例。顯示器2202可以被配置為投影校準圖案2204以生成光場2206,可以使用諸如相機2208的成像裝置來擷取生成的光場2206。在一些實施例中,顯示器2202包括堆疊波導組件(例如,如圖4或圖6所示)或其他類型的光場顯示器。在一些實施例中,相機2208(或顯示器2202)被配置為可移動的,使得系統2200能夠從不同的橫向位置、深度或角度擷取光場2206的影像。在一些實施例中,校準系統2200可以類似於圖18的計量系統1800。例如,顯示器2202、光場2206和相機2208可以對應於計量系統1800的顯示器1802、光場1804和相機1806。 FIG. 22 illustrates an example calibration system 2200 that uses a calibration pattern. The display 2202 can be configured to project a calibration pattern 2204 to generate a light field 2206, which can be captured using an imaging device such as a camera 2208. In some embodiments, the display 2202 comprises a stacked waveguide assembly (e.g., as shown in FIG. 4 or FIG. 6) or another type of light field display. In some embodiments, the camera 2208 (or the display 2202) is configured to be movable so that the system 2200 can capture images of the light field 2206 from different lateral positions, depths, or angles. In some embodiments, the calibration system 2200 can be similar to the metrology system 1800 of FIG. 18. For example, the display 2202, light field 2206, and camera 2208 can correspond to the display 1802, light field 1804, and camera 1806 of the metrology system 1800.

在該實施例中,校準圖案2204包括棋盤圖案,其中不同區域具有不同(例如,交替)的光學特性,例如輝度(例如,亮或暗)、色度、色調、飽和度、顏色等。棋盤圖案可以是規則圖案(例如,如圖22所示)或不規則圖案。校準圖案2204包含可用於測量由相機2208擷取的影像中的失真量的多個特徵點。例如,棋盤圖案的特徵點包括在棋盤的檢查框之間的邊框和角落上的點,或在檢查框中心的點。校準圖案2204可以與顯示器2202的尺寸相同或者比顯示器2202小。較小的校準圖案可以在顯示器2202上移動,並且當測量顯示器2202的失真時,相機2208可以在圖案移動時獲取校準圖案2204的多個影像。在一些實施例中,可以根據運算上優化的序列對校準圖案2204進行隨機採樣。 In this example, the calibration pattern 2204 comprises a checkerboard pattern in which different regions have different (e.g., alternating) optical characteristics, such as luminance (e.g., light or dark), chromaticity, hue, saturation, color, and so forth. The checkerboard pattern can be a regular pattern (e.g., as shown in FIG. 22) or an irregular pattern. The calibration pattern 2204 contains a plurality of feature points that can be used to measure the amount of distortion in the images captured by the camera 2208. For example, the feature points of a checkerboard pattern include points on the edges and corners between the checker boxes of the checkerboard, or points at the centers of the checker boxes. The calibration pattern 2204 can be the same size as the display 2202 or smaller than the display 2202. A smaller calibration pattern can be moved across the display 2202, and the camera 2208 can acquire multiple images of the calibration pattern 2204 as the pattern is moved while the distortions of the display 2202 are measured. In some embodiments, the calibration pattern 2204 can be randomly sampled according to a computationally optimized sequence.

由於顯示器2202中的誤差(例如,一個或多個波導或透鏡中的缺陷),光場2206可能包含導致虛擬物體或光場中的圖案出現失真的缺陷。這可能使校準圖案2204上的特徵點的預期聚焦位置(橫向或深度)與它們在由相機2208擷取的影像中的實際測量位置之間產生偏差。藉由比較校準圖案2204的特徵點的實際測量位置與這些特徵點的預期位置,可以識別和測量失真所引起的偏差。在一些實施例中,校準圖案包括顏色信息,使得顯示器2202的色彩誤差也可以由系統2200量化。在一些實施例中,可以生成失真圖,以用於顯示器2202的空間或色彩誤差的誤差校正(例如,如圖8所示)。 Because of errors in the display 2202 (e.g., defects in one or more waveguides or lenses), the light field 2206 may contain imperfections that cause virtual objects, or patterns in the light field, to appear distorted. This can produce deviations between the expected focus positions (lateral or depth) of the feature points of the calibration pattern 2204 and their actual measured positions in the images captured by the camera 2208. By comparing the actual measured positions of the feature points of the calibration pattern 2204 with the expected positions of those feature points, the deviations caused by the distortions can be identified and measured. In some embodiments, the calibration pattern includes color information so that chromatic errors of the display 2202 can be quantified by the system 2200. In some embodiments, a distortion map can be generated for error correction of spatial or chromatic errors of the display 2202 (e.g., as shown in FIG. 8).

在一些實施例中,校準圖案2204中的每個檢查框2304對應於顯示器2202的單個像素,這可以允許在逐個像素的基礎上直接測量顯示缺陷。在其他實施例中,每個檢查框2304對應於多個像素(例如,N×M像素網格,N或M至少一個大於1)。在某些這樣的實施例中,校準圖案的較粗粒度意味著在採樣點處獲得失真信息,並且可以對其進行內插以獲得每個像素的失真信息。例如,在圖23A所示的棋盤圖案中,可以針對對應於特徵點2302的圖案位置(例如,檢查框的邊框、角或中心上的點)測量失真信息。檢查框區域2304中的其他點的失真信息,可以從與鄰近特徵點2302相關聯的測量失真值推斷或內插得出。 In some embodiments, each checker box 2304 in the calibration pattern 2204 corresponds to a single pixel of the display 2202, which can allow display defects to be measured directly on a pixel-by-pixel basis. In other embodiments, each checker box 2304 corresponds to multiple pixels (e.g., an N×M grid of pixels, with at least one of N or M greater than 1). In some such embodiments, the coarser granularity of the calibration pattern means that distortion information is obtained at sample points, which can be interpolated to obtain per-pixel distortion information. For example, in the checkerboard pattern shown in FIG. 23A, distortion information can be measured at pattern positions corresponding to the feature points 2302 (e.g., points on the edges, corners, or centers of the checker boxes). Distortion information for other points in the checker box regions 2304 can be inferred or interpolated from the measured distortion values associated with nearby feature points 2302.

棋盤投影擷取過程識別特徵點(例如,檢查框的邊緣),以量化失真校準所用的預期位置與測量位置之間的誤差。與顯示器中的像素數目相比,特徵點可以是稀疏的。例如,高清晰度顯示器可以包括數百萬個像素(例如,對於1920x1080像素分辨率為210萬像素),而校準圖案中的方格804的數量可以少得多(例如,50×50、100×100或500×500圖案)。因此,在使用單個投影擷取的系統2200的實施例中,所獲得的是粗略採樣的測量,可以對其進行內插以估計每個像素的失真。 The checkerboard projection-and-capture process identifies feature points (e.g., the edges of the checker boxes) to quantify the errors between the expected and measured positions used for distortion calibration. The feature points can be sparse compared to the number of pixels in the display. For example, a high-definition display can include millions of pixels (e.g., 2.1 million pixels for a 1920x1080 pixel resolution), while the number of checker boxes 804 in the calibration pattern can be far smaller (e.g., 50x50, 100x100, or 500x500 patterns). Accordingly, in embodiments of the system 2200 that use a single projection and capture, the coarsely sampled measurements can be interpolated to estimate per-pixel distortions.
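The interpolation step can be sketched as a bilinear upsampling of the sparse, per-feature-point distortion field to a per-pixel field. The grid layout (one (dx, dy) sample per feature point on a regular grid) is an assumption made for illustration:

```python
import numpy as np

def densify_distortion(sparse, out_h, out_w):
    """Bilinearly upsample a sparse distortion field of shape (gh, gw, 2), one
    (dx, dy) sample per feature point, to a per-pixel field (out_h, out_w, 2).
    Requires gh >= 2 and gw >= 2."""
    gh, gw, _ = sparse.shape
    ys = np.linspace(0.0, gh - 1, out_h)
    xs = np.linspace(0.0, gw - 1, out_w)
    y0 = np.clip(np.floor(ys).astype(int), 0, gh - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, gw - 2)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    a = sparse[y0][:, x0]      # top-left neighbors
    b = sparse[y0][:, x0 + 1]  # top-right
    c = sparse[y0 + 1][:, x0]  # bottom-left
    d = sparse[y0 + 1][:, x0 + 1]
    return (a * (1 - wy) * (1 - wx) + b * (1 - wy) * wx
            + c * wy * (1 - wx) + d * wy * wx)
```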

為了獲得用於顯示器的準確的每像素失真信息,系統2200的實施例可以藉由使用不同或位移的校準圖案來自動獲得失真信息。可以投影不同的校準圖案,或者可以遞增地位移相同的圖案,使得顯示器2202的整個像素空間被測量。自動影像投影和擷取搭配不同的位移校準圖案,允許對顯示器2202的失真進行像素級準確的映射。 To obtain accurate per-pixel distortion information for the display, embodiments of the system 2200 can automate the acquisition of distortion information by using different or shifted calibration patterns. Different calibration patterns can be projected, or the same pattern can be shifted incrementally, so that the entire pixel space of the display 2202 is measured. Automated image projection and capture with different shifted calibration patterns allows pixel-accurate mapping of the distortions of the display 2202.

By automatically repeating the checkerboard projection-and-capture with the calibration pattern shifted by, for example, one pixel, the system 2200 can obtain improved distortion information on a per-pixel basis. For example, the camera 2208 can capture an image of the pattern each time the pattern is shifted. Because of each repeated image capture, the feature points of the projected calibration pattern map onto different groups of pixels. The shifting of the calibration pattern can be repeated until a dense sampling of the display's distortion field is obtained. For example, the checkerboard can be projected and shifted through a number of positions corresponding to the pixels in a checker square, allowing distortion information to be measured for every pixel of the display. In other embodiments, the shift can be other than one pixel, e.g., 2, 3, 4, 5, 8, 16, or more pixels. The shift can also differ for different directions on the display; for example, the x-shift need not be the same as the y-shift.
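The shift-and-capture loop can be illustrated schematically. In this hypothetical sketch, `project_and_measure` stands in for the display/camera hardware and returns a measured distortion for the feature point it is given; treating each square corner as a feature point, shifting the checkerboard one pixel at a time covers every display pixel.

```python
# Illustrative shift-and-capture loop: the same checkerboard is re-projected
# at one-pixel offsets until every display pixel has coincided with a
# feature point at least once. Names are assumptions, not the patent's API.

def dense_distortion_scan(square, width, height, project_and_measure):
    """square: checker square size in pixels. Returns {pixel: distortion}."""
    samples = {}
    for dy in range(square):            # shift the pattern one pixel at a time
        for dx in range(square):
            # feature points of the shifted pattern (here: square corners)
            points = [(x + dx, y + dy)
                      for y in range(0, height, square)
                      for x in range(0, width, square)
                      if x + dx < width and y + dy < height]
            for p in points:            # one measurement per feature point
                samples[p] = project_and_measure(p)
    return samples

# usage with a dummy measurement: a 2-pixel square on a 4x4 display
scan = dense_distortion_scan(2, 4, 4, lambda p: (0.0, 0.0))
```

With a square size of S pixels, S×S shifted captures suffice to sample every pixel, matching the observation above that the number of positions is based on the pixels per checker square.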

Although checkerboard-pattern examples are disclosed herein, it should be understood that other types of patterns can also be used. For example, other geometric patterns can be used, random patterns can be used, or any other type of calibration or test pattern can be used. In some embodiments, a calibration pattern in which a single pixel of the display is turned on is used. FIG. 23B shows an embodiment of a single-pixel calibration pattern in which a single pixel 2306 has been turned on. From the captured image of each resulting frame, the per-pixel transfer function from the display device to the viewer's scene can be quantified. After each image capture, the position of the displayed pixel 2306 can be moved across the display (e.g., in the direction indicated by the arrow 2308) by a set distance (e.g., a single pixel). By automatically scanning every pixel of the display, a full quantification of the quality of the display device can be obtained. In other embodiments, the shift of the illuminated pixel can be a different number of pixels, e.g., 2, 3, 4, 5, 8, 16, or more pixels; the shift can be different for different lateral directions on the display; or multiple pixels can be illuminated in each image capture (rather than the single pixel shown in FIG. 23B).

FIG. 24 schematically illustrates a process 2400 for performing automated display calibration. The process 2400 can be performed, for example, as part of the processes 2700 and 2805 described with reference to FIGS. 27 and 28. At block 2402, a calibration pattern is projected by the display. The calibration pattern can include any pattern having one or more feature points that can be produced by the display. In some embodiments, the calibration pattern comprises a checkerboard pattern. In other embodiments, other types of calibration patterns can be used, such as a single-pixel pattern.

At block 2404, a camera or other type of image capture device is used to capture an image of the displayed calibration pattern. If there are errors or defects in the light field produced by the display, portions of the displayed calibration pattern may be distorted, with one or more of the feature points of the calibration pattern appearing at positions different from their expected locations. The luminance or chromaticity of the image may also differ from the expected luminance or chromaticity of the calibration pattern.

At block 2406, a distortion representing the error between the expected positions of the feature points of the calibration pattern and their captured positions is determined. For example, for a single-pixel calibration pattern, distortion information can be calculated for the particular pixel position of the pattern. For a checkerboard pattern, distortion information can be calculated for the pixels corresponding to the feature points of the checkerboard (e.g., the edges, corners, or centers of the checker squares). In some embodiments, a luminance or chromaticity error between the luminance or chromaticity of the calibration pattern and the corresponding luminance or chromaticity of its captured image is determined.
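Block 2406 reduces to comparing expected and measured feature points: the spatial error is the vector from the expected to the measured position, and a luminance (or chromaticity) error can be recorded alongside it. A minimal sketch, in which the field names and the single scalar luminance channel are illustrative assumptions:

```python
# Hypothetical per-feature-point error computation for block 2406.

def feature_errors(expected, measured):
    """expected/measured: {feature_id: (x, y, luminance)}.
    Returns {feature_id: ((dx, dy), dL)} where (dx, dy) is the spatial
    distortion vector and dL the luminance error."""
    errors = {}
    for fid, (ex, ey, el) in expected.items():
        mx, my, ml = measured[fid]
        errors[fid] = ((mx - ex, my - ey), ml - el)
    return errors
```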

At block 2408, it is determined whether there are any additional positions across the display at which the calibration pattern should be projected. If it is determined that additional positions exist, then at block 2410 the calibration pattern is shifted and projected at the new position, another image of the pattern can be captured (block 2404), and the amount of distortion can be calculated (block 2406). In some embodiments, the number of different positions at which the calibration pattern is displayed is based on the calibration pattern used. For example, for a single-pixel calibration pattern, the number of positions can correspond to the total number of pixels the display can show. For a checkerboard pattern, the number of positions can be based on the number of pixels in each checker square.

Once the calibration pattern has been displayed at all desired positions, at block 2412 the calculated distortions can be aggregated and used to generate a distortion map that includes distortion information for each pixel (or group of pixels) of the display. The distortion information can include spatial distortion due to focus errors (e.g., in-plane or out-of-plane errors) or color errors (e.g., luminance or chromaticity errors). At block 2414, error correction can be performed on the display using the calculated distortion map. For example, the distortion information (e.g., the distortion map) can be stored by the data modules 224, 232 of the wearable display system 200 shown in FIG. 2. The processing modules 224, 228 of the wearable display system 200 can use the distortion information to correct spatial or chromatic aberrations in the display 208, so that images perceived by the wearer 204 of the display system 200 are at least partially compensated.
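One way to picture the error correction of block 2414 is to pre-warp each frame with the distortion map, so that the displayed (distorted) result approximates the intended image. The nearest-neighbor sampling and the sign convention below are assumptions for illustration, not the patent's method:

```python
# Hedged sketch: render, at pixel (x, y), the value intended for the location
# where pixel (x, y) actually lands after the display's distortion d(p).

def prewarp(image, distortion, width, height):
    """image[(x, y)] = intensity; distortion[(x, y)] = (dx, dy)."""
    out = {}
    for y in range(height):
        for x in range(width):
            dx, dy = distortion[(x, y)]
            # nearest-neighbor sample, clamped to the image bounds
            sx = min(max(int(round(x + dx)), 0), width - 1)
            sy = min(max(int(round(y + dy)), 0), height - 1)
            out[(x, y)] = image[(sx, sy)]
    return out
```

A production correction would more likely use sub-pixel (e.g., bilinear) resampling; the clamped nearest-neighbor lookup keeps the sketch short.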

In some embodiments, for a light field display, the process 2400 can be performed for each waveguide in the waveguide assembly 405 of the light field display, so as to calibrate each waveguide. In some cases, there can be multiple waveguides corresponding to multiple depth planes as well as multiple waveguides corresponding to multiple colors (e.g., red (R), green (G), and blue (B)). For example, for some displays there are three color planes for each depth plane, so a waveguide assembly with two depth planes can have 2×3=6 waveguides. Furthermore, in addition to the pixel positions, the chromaticity and quality can be calibrated in order to correct chromatic (color) defects of the display. For example, the camera 2208 can be a camera sensitive to multiple colors, or a combination of cameras each sensitive to a subset of colors, used to capture images of the projected light field in which deviations between the captured color or luminance values of the projected pattern 2204 and the expected color or luminance values can be identified.

■ Waveguide display embodiments

FIG. 25A shows a top view of an embodiment of a display 2500 that includes a waveguide 2505, a light input coupling element 2507, a light redistribution element 2511, and a light output coupling element 2509. FIG. 25B is a schematic cross-sectional view of the display 2500 of FIG. 25A along the axis A-A'.

The waveguide 2505 can be part of the waveguide stack 405 in the display system 400 shown in FIG. 4. For example, the waveguide 2505 can correspond to one of the waveguides 420, 422, 424, 426, 428, and the light output coupling element 2509 can correspond to the light extraction optical elements 460, 462, 464, 466, 468 of the display system 400.

The display 2500 couples incident light of different wavelengths, represented by the rays 2503i1, 2503i2, and 2503i3 (solid, dashed, and dash-double-dot lines, respectively), into the waveguide 2505 through the light input coupling element 2507. The incident light entering the waveguide 2505 can be projected from an image injection device (e.g., one of the image injection devices 440, 442, 444, 446, 448 shown in FIG. 4). The light input coupling element 2507 couples the wavelengths of the incident light into the waveguide 2505 at angles suitable for propagation through the waveguide 2505 supported by total internal reflection (TIR).

The light redistribution element 2511 is disposed in the optical path along which the different wavelengths of light 2503i1, 2503i2, and 2503i3 propagate through the waveguide 2505. The light distribution element 2511 is configured to redirect a portion of the light from the light input coupling element 2507 toward the light output coupling element 2509 and to increase the beam size of the interacting light along the propagation direction. The light distribution element 2511 can therefore advantageously enlarge the exit pupil of the display device 2500. In some embodiments, the light distribution element 2511 can thus function as an orthogonal pupil expander (OPE).

The light output coupling element 2509 redirects the in-coupled light incident on it out of the xy-plane of the waveguide 2505 at appropriate angles (e.g., in the z-direction) and with appropriate efficiency, for the different wavelengths and at the different depth planes, so that a viewer can perceive color images of good visual quality. The light output coupling element 2509 can have optical power that imparts divergence to the light exiting the waveguide 2505, so that the image formed by that light appears (to the viewer) to originate at a certain depth. The light output coupling element 2509 can enlarge the exit pupil of the display 2500 and can direct light to the viewer's eye, acting as an exit pupil expander (EPE).

The light input coupling element 2507, the light output coupling element 2509, and the light distribution element 2511 can comprise a plurality of gratings, for example analog surface relief gratings (ASR), binary surface relief structures (BSR), volume holographic optical elements (VHOE), digital surface relief structures, and/or volume phase holographic material (e.g., holograms recorded in volume phase holographic material), or switchable diffractive optical elements (e.g., polymer dispersed liquid crystal (PDLC) gratings). In various embodiments, the light input coupling element 2507 can include one or more optical prisms, or optical elements including one or more diffractive elements and/or refractive elements. The various sets of diffractive or grating structures can be formed on the waveguide by using fabrication methods such as injection compression molding, UV replication, or nanoimprinting of the diffractive structures.

The light input coupling element 2507, the light output coupling element 2509, or the light distribution element 2511 need not be a single element (e.g., as shown in FIGS. 25A and 25B); each such element can instead comprise a plurality of elements. These elements can be disposed on one (or both) of the major surfaces 2505a, 2505b of the waveguide 2505. As shown in FIGS. 25A and 25B, the light input coupling element 2507, the light output coupling element 2509, and the light distribution element 2511 are disposed on the major surface 2505a of the waveguide 2505.

In some embodiments, one or more wavelength selective filters can be associated with the light input coupling element 2507, the light output coupling element 2509, or the light distribution element 2511. The display 2500 shown in FIG. 25A includes a wavelength selective filter 2513 that is integrated into or onto a surface of the waveguide 2505. The wavelength selective filter can be configured to filter out a portion of the light at one or more wavelengths propagating along various directions in the waveguide 2505. The wavelength selective filter can be an absorptive filter, such as a color band absorber.

■ Dynamic calibration of AR or VR displays based on eye tracking

A display system can be calibrated (spatially and/or chromatically) to produce images of improved quality. In the case of some near-eye displays (e.g., the stacked waveguide assembly 405 shown in FIG. 4 as used in the display 208 shown in FIG. 2, or the display 2500 described with reference to FIGS. 25A and 25B), the calibration can be quite accurate for a nominally fixed eye position (e.g., the wearer looking straight ahead through the display 208), but less accurate for other eye pose directions or positions. The calibration of the display can therefore depend on the eye position or the eye direction. If only a single (e.g., reference) position is used for calibration, uncorrected errors will remain when the wearer looks toward a different position (e.g., away from the reference position).

This disclosure also describes embodiments of dynamic calibration of the wearable display system 400 using eye tracking, in which the spatial and/or color calibration changes with changes in eye position (or, in some cases, eye direction). Some such calibrations provide a feed-forward calibration system that maintains high-quality images over a wide range of eye motion. In some embodiments, the calibration is performed in real time by a hardware processor (e.g., the processing modules 224, 228 of the wearable display system 200 or the controller 450 of the display system 400) without the addition of special hardware.

The calibration can compensate for (or correct) spatial errors and/or color (chromatic) errors in the field of view of the display. For example, the spatial errors can include in-plane translation, rotation, scaling, or warping errors, as well as out-of-plane (e.g., depth of focus) errors. The color errors can include luminance flatness or color uniformity errors for each displayed color (e.g., R, G, and B).
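As a toy illustration of separating these error categories, the in-plane translation component of a measured displacement field can be taken as the mean displacement, with the remainder treated as residual warp. Rotation and scaling terms could be fit analogously; they are omitted here for brevity, and all names are illustrative:

```python
# Hypothetical decomposition: split a displacement field into a uniform
# in-plane translation plus a residual (warp-like) component.

def split_translation(points, displaced):
    """points/displaced: parallel lists of (x, y) positions before/after.
    Returns ((tx, ty), residual) where residual is the per-point remainder."""
    n = len(points)
    tx = sum(q[0] - p[0] for p, q in zip(points, displaced)) / n
    ty = sum(q[1] - p[1] for p, q in zip(points, displaced)) / n
    residual = [(q[0] - p[0] - tx, q[1] - p[1] - ty)
                for p, q in zip(points, displaced)]
    return (tx, ty), residual
```

A pure translation of the field leaves a zero residual; any remaining structure in the residual would be attributed to rotation, scaling, or warping terms.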

FIG. 26 schematically illustrates an embodiment of a dynamic calibration system 2600 for the display 2500, for which a calibration can be applied to correct the spatial and/or color errors at grid reference positions (indicated by the dots 2602). The dynamic calibration system 2600 can include the display 2500, an inward-facing imaging system (e.g., the eye-tracking camera 500), and a dynamic calibration processor 2610 (which retrieves and applies the calibration). FIG. 26 shows an embodiment of the display 2500 that includes embodiments of the optical elements described with reference to FIGS. 25A and 25B. The light output coupling element 2509 directs light to the viewer's eye. When the viewer's eye is positioned at different positions 2602 relative to the light output coupling element 2509, the optical calibration of the display 2500 for each particular eye position (the points 2602 shown in FIG. 26) can be different. For example, the calibration when the eye is over the position 2602a, near the center of the light output coupling element 2509, can differ from the calibration when the eye is over the position 2602b, toward the upper left corner of the light output coupling element 2509, and similarly for any other of the positions 2602 on the optical element 2509.

As the user's eye moves relative to the display, the field of view (FOV) of the display remains substantially the same, but the spatial and/or chromatic distortion in the display can change as the eye translates relative to the display. Since the FOV comprises the range of angles over which images are presented to the user, the calibration data (at a given position relative to the display) can account for essentially all orientations or viewing angles of the eye. For example, when the user directs his or her vision at different angles (while maintaining the same position relative to the display), the user merely views different portions of the image, subject to the same overall distortion. Thus, at any given position, as the direction of the eye changes (e.g., as the eye gaze direction changes), the eye's view generally remains within the FOV of the display, and the same calibration (for the given eye position) can be used for substantially all eye directions. Certain embodiments of the calibration system therefore utilize a position-dependent calibration that has no additional orientation dependence.

Note that the points 2602, 2602a, 2602b are for reference only and do not form part of the light output coupling element 2509 or the display 2500. Furthermore, although nine positions 2602 in a 3×3 grid are illustrated in FIG. 26, this is for illustrative purposes only; it should be understood that the number (or arrangement) of positions used for calibrating the display 2500 can differ from that shown in FIG. 26. For example, in various embodiments, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 16, 20, 25, 100, 256, or more calibration positions are used. The calibration positions can be arranged in a 2×2, 3×3, 4×4, 5×5, 6×6, 7×7, 9×9, or other size grid, or in other patterns or position layouts.

The calibration for one or more positions on the display 2500 can be determined using a light field metrology system that measures the errors in a calibration pattern (e.g., a checkerboard) projected by the display. The calibration can depend on the position on the display from which the display is viewed. For example, the metrology system can sweep an eye-proxy camera relative to the display (e.g., by relatively translating the camera and the display) to simulate the range of positions of the user's eye. As the camera sweeps relative to the display, at each sampled point 2602 the metrology system can build a calibration (correction), thereby producing a set of calibrations versus eye-proxy positions.

The calibration for a particular display can be stored by the data modules 224, 228 of the wearable display system 200 as a lookup table (LUT) (or other efficient data structure). In other embodiments, an analytical model can be fit to the calibration data obtained from the metrology system, and the fitted analytical model can be stored by the wearable display system 200. Other modeling or data referencing methods can also be used to store the calibrations. As described above, the calibration can include spatial and/or color corrections generated for each calibration position of the display (e.g., the embodiment of the 3×3 grid of calibration positions of the display 2500 shown in FIG. 26). Note that, in various embodiments, to obtain the calibrations, the display is swept (translated) relative to a fixed camera, the camera is swept (translated) relative to a fixed display, or the camera and the display are swept (translated) relative to each other.
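A stored set of per-position calibrations might be organized as a simple LUT keyed by eye-proxy grid position; the sketch below shows storage plus nearest-position retrieval. The grid coordinates and the calibration payload (here just a label) are placeholders, not the patent's data layout:

```python
# Hypothetical calibration LUT keyed by eye-proxy position.

class CalibrationLUT:
    def __init__(self):
        self.table = {}                     # (gx, gy) -> calibration data

    def store(self, pos, calibration):
        self.table[pos] = calibration

    def nearest(self, eye_xy):
        """Return the stored grid position closest to the tracked eye."""
        return min(self.table,
                   key=lambda p: (p[0] - eye_xy[0]) ** 2
                                 + (p[1] - eye_xy[1]) ** 2)
```

Nearest-position lookup is the simplest retrieval policy; interpolation between neighboring positions, as described below for eye poses between grid points, refines it.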

In embodiments in which the field of view (FOV) of the eye-proxy camera is larger than the FOV of the display, the calibration camera can be placed at a number of discrete positions relative to the display (e.g., at the positions 2602 indicated by the dots), and one or more calibration images will provide sufficient information about the display's defects to determine the calibration at each discrete position. In some such embodiments, the camera can capture the full FOV of the display, and the orientation (e.g., pointing direction) of the camera may not need to be changed at each position 2602. In other embodiments, the orientation of the camera (at each position 2602) can be changed to obtain additional images to map out the FOV of the display (e.g., when the FOV of the camera is smaller than the FOV of the display).

The calibration positions can represent eye positions relative to the display 2500. For example, a wearer of the display 2500 will typically position the display so that the wearer's eye (in the xy-plane) is approximately near the center of the light output coupling element 2509, e.g., with the eye over the position 2602a. The calibration for the position 2602a (near the center of the optical element 2509) therefore corresponds to light propagating approximately perpendicular to the display 2500 (e.g., substantially along the z-direction) and can be applied by the dynamic calibration processor 2610. If the wearer's eye moves up and to the left, over the position 2602b (near the upper left corner of the optical element 2509), the calibration for the position 2602b can be applied by the processor 2610. The eye-tracking camera 500 can image the eye (e.g., in real time), and the dynamic calibration processor 2610 can use the eye-tracking data to determine the position of the eye, select the appropriate calibration (based on the determined eye position), and apply that calibration to the display. In some embodiments, the eye position is determined from the corneal position and the gaze direction. Furthermore, in other embodiments, the eye orientation (e.g., the gaze direction) can be determined, and an orientation-dependent calibration can be used.

Embodiments of the wearable display system 200 can include an embodiment of the dynamic calibration system 2600 shown in FIG. 26. For example, the eye-tracking camera 500 (described with reference to FIG. 4) can be fixed to the frame of the wearable display system 200 and can dynamically measure the wearer's eye pose (e.g., eye position or eye direction). The images from the camera 500 can be used by the dynamic calibration processor 2610 to determine the wearer's eye pose in real time or near real time. When the dynamically calibrated system is in operation, the eye-tracking camera can inform the dynamic calibration processor 2610 of the wearer's current eye pose in real time or near real time. The dynamic calibration processor 2610 can fetch and apply an appropriate calibration (e.g., an appropriate calibration LUT stored in the data modules 224, 228) based on the measured eye pose (e.g., position or orientation). In cases where the wearer is not looking directly at a stored calibration position, or the wearer's eye is not directly over a calibration position, the dynamic calibration processor can interpolate (or extrapolate) between the calibrations at nearby calibration positions (e.g., including at least the calibration position closest to the wearer's eye pose) to determine an appropriate calibration to apply for the wearer's current eye pose. Thus, the display system 200 (with the dynamic calibration system 2600) can correct the defects (spatial or color) in the display, thereby providing the wearer with color images of good quality. As described herein, in some cases the calibration depends on the eye position relative to the display and not on the eye orientation (e.g., gaze direction), although this is not a limitation.
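The interpolation between nearby stored calibrations can be sketched with bilinear weights over a unit grid of calibration positions. Reducing each stored calibration to a single scalar is purely for illustration; a real correction would interpolate full distortion maps or LUT entries the same way:

```python
# Hypothetical sketch: blend the four stored calibrations surrounding the
# tracked eye position (ex, ey), given in grid units, with bilinear weights.
# Assumes (ex, ey) lies strictly inside the stored grid.

def interpolated_calibration(lut, ex, ey):
    """lut[(i, j)] -> scalar calibration value at grid position (i, j)."""
    i0, j0 = int(ex), int(ey)
    fx, fy = ex - i0, ey - j0
    return ((1 - fx) * (1 - fy) * lut[(i0, j0)]
            + fx * (1 - fy) * lut[(i0 + 1, j0)]
            + (1 - fx) * fy * lut[(i0, j0 + 1)]
            + fx * fy * lut[(i0 + 1, j0 + 1)])
```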

The dynamic calibration processor 2610 can be implemented as software stored in memory (e.g., the data modules 224, 228), and the software instructions can be executed by one or both of the processing modules 224, 228 or by the controller 450. Continuously adjusting the calibration can thus produce high-quality images over a wide range of input motions of the wearer's eyes.

In some implementations, the calibrations are stored for a reduced number of calibration positions (e.g., a 2×2 or 3×3 grid) to reduce the amount of data storage. As described above, the dynamic calibration processor can interpolate or extrapolate to determine the calibration for an eye pose that is not directly at a stored calibration position.

In some embodiments, the wearable display system 200 uses a single eye-tracking camera to measure the pose of one of the wearer's eyes, and the dynamic calibration processor 2610 infers the pose of the wearer's other eye relative to the display system 200 (since the eyes generally point in the same direction). In other embodiments, the wearable display system 200 uses two eye-tracking cameras (one for each eye) and measures the pose of each eye independently. In some embodiments, a separate calibration is stored for each display in the wearable system (in many cases there are two displays, one in front of each of the wearer's eyes, so two calibrations are stored). In other embodiments, a single calibration (e.g., an average calibration) is stored and used for all of the displays in the wearable system 200.

The eye-tracking camera (or another type of inward-facing imaging system) can image the periocular region of the user's face. The periocular region can include the eye and the region around the eye. For example, the periocular region can include the eye (e.g., the eye socket) and the region around the eye. The region around the eye can include, for example, the eyebrow, the nose, the cheek, and portions of the forehead. The periocular region can have a variety of features, such as the shape of the eyebrows, the corners of the eye, features of the eyelid, and so forth. In some embodiments, one or more of these features can be represented by keypoints, point clouds, or other types of mathematical representations. The wearable device can identify these features in an image and use them to determine the relative position between the wearable display system and the user's face. In some embodiments, the wearable display system 200 can compute the relative position for each eye separately. For example, when the wearable device has one or two eye cameras, each configured to image one of the user's eyes, the wearable device can compute one relative position between the left eye and the wearable display system and another relative position between the right eye and the wearable display system. The wearable device can also track the relative position of each corresponding eye separately. Because the relative position between the left eye and the wearable display system can differ from the relative position between the right eye and the wearable display system (e.g., when the wearable system tilts to one side), the adjustment of the rendering position of a virtual object can be different for the left-eye display and the right-eye display.

The wearable display system can use neural networks or visual keypoint techniques to compute and track periocular features, such as the scale-invariant feature transform (SIFT), speeded-up robust features (SURF), oriented FAST and rotated BRIEF (ORB), binary robust invariant scalable keypoints (BRISK), fast retina keypoint (FREAK), and so forth. In some embodiments, a particular facial feature can be tracked using a detector designed specifically for that feature. For example, periocular features such as eye corners, nose features, and mouth corners can be identified and tracked individually using various algorithms. Tracking one or more of these periocular features separately can be advantageous because they are prone to substantial motion when the user is expressing himself or herself or is speaking. The detectors associated with these features can take their range of mobility into account. For example, some facial features move in certain directions while remaining stable in others (e.g., eyebrows tend to move up or down but not left or right). The wearable system can statistically analyze the movements of facial features. These statistics can be used to determine the likelihood that a facial feature will move in a particular direction. In some embodiments, one or more facial features can be removed or left untracked. For example, the wearable display system can ignore eye movements when tracking the position of the periocular region.
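The statistical analysis of feature motion described above can be sketched in a few lines. This is a hedged illustration with invented frame-to-frame displacement samples for an eyebrow keypoint, not the system's actual implementation:

```python
import numpy as np

# Hypothetical frame-to-frame displacements (dx, dy) of an eyebrow
# keypoint; eyebrows tend to move vertically rather than horizontally.
displacements = np.array([
    [0.1, 2.0], [-0.2, 1.5], [0.0, -1.8], [0.1, 2.2], [-0.1, -2.1],
])

# Per-axis variance of the motion; the dominant axis tells a
# feature-specific detector which direction of movement to expect.
var_x, var_y = displacements.var(axis=0)
likely_vertical = var_y > var_x
print(likely_vertical)  # this feature mostly moves up/down
```

A detector built this way could, for example, widen its vertical search window for eyebrows while keeping the horizontal window narrow.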

The wearable display system can also use visual simultaneous localization and mapping (vSLAM) techniques, such as sequential Bayesian estimators (e.g., Kalman filters, extended Kalman filters, etc.), bundle adjustment, and so forth, to identify and track facial features. In some embodiments, the wearable device can be configured to allow depth perception. For example, the wearable system can construct a dense map, which encodes at least a portion of the face, from data captured by one or more cameras. Unlike a keypoint map, the dense map can comprise patches or regions of the face whose 3D shape is measured. The patches or regions can be used to compute the location of the HMD relative to the user's face, using techniques such as the iterative closest point algorithm or similar algorithms.
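Once point correspondences are fixed, the patch-based alignment mentioned above (iterative closest point or similar) reduces to a best-fit rigid transform between two point sets. Below is a sketch of that inner step (the Kabsch algorithm) using invented face-patch points; it is an illustration under those assumptions, not the HMD's implementation:

```python
import numpy as np

def rigid_fit(src, dst):
    """Best-fit rotation R and translation t with R @ p + t mapping src
    onto dst -- the inner step of an iterative-closest-point alignment."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical 3D points on a face patch, and the same patch as seen
# after the HMD shifts by a small known translation.
src = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
dst = src + np.array([0.02, -0.01, 0.0])    # pure translation here

R, t = rigid_fit(src, dst)
print(np.round(t, 3))  # recovers the assumed HMD shift
```

In a full ICP loop, the correspondence search and this rigid fit would alternate until the alignment converges.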

In some embodiments, the images acquired by the eye cameras can be low-resolution images, because the wearable display system 200 may not need high-quality images to track periocular features. Additionally or alternatively, the images obtained from the eye imagers can be downsampled relative to their original resolution or to the resolution used in other applications (e.g., eye tracking).

The wearable display system 200 can analyze the images obtained by one or two eye cameras and use various techniques to determine the relative position between the display of the display system and the user. The relative position between the display and the user's eyes may be the normal resting position of the display system 200 relative to the user's face. The normal resting position of the display system 200 can be determined during an initialization phase of the wearable system. For example, when the user first uses the wearable system, the wearable system can build a face model (e.g., a map of the user's face) and determine, based on that face model, the normal resting position of the display relative to the user's eyes.

While the user is using the wearable system 200, the wearable system can use various techniques to keep track of the relative position between the display and the user. For example, the wearable device can identify and track visual keypoints associated with periocular features. The wearable system can also match regions of the face identified in the acquired images against a dense map of the user's face to compute the position of the display relative to the face.

Thus, various eye-tracking or face-imaging techniques can be used to determine (statically or dynamically) the relative position between the user's eyes and the display of the display system. The display system 200 can then select and apply an appropriate spatial and/or color calibration to the display based at least in part on the determined relative eye position, as further described herein.

FIG. 27 is a flow diagram of a method 2700 for dynamically calibrating a display based on eye tracking. The method 2700 can be performed by the dynamic calibration system 2600. At block 2710, the user's eyes are tracked to determine the eye position of the user relative to the display. For example, the camera 500 of the display system 2600 can determine the user's eye position. One or both eyes can be tracked. At block 2720, a calibration based on the determined eye position is retrieved. At block 2730, the calibration is applied to the display to correct spatial and/or color defects in the display. For example, the dynamic calibration processor 2610 can apply corrections that adjust the properties of the light injected into the waveguides of the display so that the desired light beams are output by the display. In some cases, the light can be injected with a slightly different color, position, or orientation to adjust for the display's defects. For example, one or more of the RGB color values of an input image to be projected by the display can be modified by the corresponding RGB calibration (based on the position of the user's eyes), and the modified RGB values sent to the display for projection. The net effect of the imperfect display projecting the modified RGB values is a projected image that at least partially corrects the defects (spatial and/or color) of the display. In other cases, actively controlled diffractive optical elements in the waveguide assembly can be adjusted by the dynamic calibration processor so that the beams projected from the display at least partially correct the defects in the display. In some implementations, the method 2700 runs as a real-time feedback loop, such that the eye-tracking camera 500 monitors the user's eyes and, if a change in eye position is detected, a new calibration (for the new eye position) is used to calibrate the display. In some cases, a new calibration is applied only if the change in eye position exceeds a threshold (e.g., a fraction of the spacing between the grid of calibration positions). Some such implementations can advantageously provide a continuously calibrated display for the user to view. In some implementations, the method 2700 can be performed occasionally (e.g., when the user places the display over the user's eyes) or periodically (e.g., to correct for incidental slippage between the display and the user's eyes).
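The feedback loop of method 2700 — retrieve the calibration for the measured eye position, switching only when the eye has moved more than a fraction of the calibration-grid spacing — might be sketched as follows. The grid spacing, threshold, and calibration values are all invented for the illustration:

```python
import numpy as np

GRID_SPACING = 3.0                   # mm between calibration positions (assumed)
THRESHOLD = 0.5 * GRID_SPACING       # re-calibrate only on a sizeable move

# Hypothetical per-position color calibrations: eye position -> RGB gains.
calibrations = {
    (0.0, 0.0): np.array([1.00, 0.98, 1.02]),
    (3.0, 0.0): np.array([1.01, 0.97, 1.00]),
    (0.0, 3.0): np.array([0.99, 1.00, 1.01]),
}

def nearest_calibration(eye_pos):
    key = min(calibrations, key=lambda p: np.hypot(p[0] - eye_pos[0],
                                                   p[1] - eye_pos[1]))
    return calibrations[key]

last_pos = np.array([0.0, 0.0])
gains = nearest_calibration(last_pos)

def on_eye_tracked(eye_pos):
    """One feedback-loop step: switch calibration only past the threshold."""
    global last_pos, gains
    if np.linalg.norm(eye_pos - last_pos) > THRESHOLD:
        last_pos = eye_pos
        gains = nearest_calibration(eye_pos)
    return gains

print(on_eye_tracked(np.array([2.9, 0.1])))  # large move -> new gains chosen
```

A production system would interpolate between neighboring grid calibrations rather than snap to the nearest one; the threshold simply keeps small tracking jitter from forcing constant recalibration.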

FIG. 28 is a flow diagram 2805 of the interaction between a factory calibration system and the dynamic calibration system associated with a particular display. In this embodiment, an eye-proxy camera calibration system 2810 is used in a factory (manufacturing) setting to determine the position-dependent calibrations of a display being manufactured. At block 2820, the process analyzes one or more calibration images for each particular display being manufactured and generates a calibration for each eye-proxy position. At block 2830, the calibrations are stored in storage associated with the particular display, so that each display can retrieve the calibrations customized for that particular display during the manufacturing process. For example, the calibrations can be stored as a lookup table (LUT) in the data module 224 of the display 208 or in the remote data repository 232. This portion of the process flow 2805 can be performed once for each display during manufacturing to provide a customized calibration for each display.

In this embodiment, each display system (e.g., the wearable display system 200) can perform real-time calibration using the calibrations stored at block 2830. For example, the eye-tracking system 2840 of the display (which can include the eye-tracking camera 500) can determine the position of the cornea of the eye and the gaze direction of the eye in order to determine the eye position. At block 2850, the display system (e.g., via the dynamic calibration processor 2610) can retrieve the appropriate calibration from storage based on the determined eye position. At block 2860, the calibration is applied to the display (e.g., via the dynamic calibration processor 2610) to correct the spatial and/or chromatic errors of the display. At block 2870, the wearer is able to view imagery projected by the calibrated display. As the wearer's eye position relative to the display changes, the processing flow in the display system can update the calibration, for example, in real time.

Although embodiments of the dynamic calibration system 2600 have been described in the context of a display in a wearable display system, this is not a limitation, and a dynamic calibration system (e.g., an eye-tracking camera and a dynamic calibration processor) can be used for any display (wearable or non-wearable) whose calibration is valid only near a nominal viewing position (e.g., perpendicular to the center of the display). For example, a dynamic calibration system can be used with flat panel displays, liquid crystal displays, light-emitting diode displays, microelectromechanical systems (MEMS) displays, and so forth.

■ Additional Aspects of Performing Image Correction

In a 1st aspect, a computer-implemented method for performing image correction on a display is disclosed. The method, under control of a display calibration system comprising computer hardware and a camera, comprises: calibrating the camera; capturing, with the camera, an image of a light field projected by the display, the light field being associated with a display layer of the display; generating, based at least in part on the captured image, a vector field comprising vectors corresponding to deviations between projected positions and expected positions of points of the display layer; performing, using the generated vector field, at least one of: centration correction, aggregate rotation correction, aggregate scaling correction, or spatial mapping for the display; determining, based at least in part on the captured image, a plurality of luminance values corresponding to a plurality of points of the display layer; and performing, using the determined plurality of luminance values, at least one of luminance flattening or chromatic balancing for the display.

In a 2nd aspect, the computer-implemented method of aspect 1, wherein performing centration correction comprises: identifying a center point of the projected display layer; and determining a translation vector, wherein the translation vector corresponds to a translation error between the identified center point and an expected center point position.

In a 3rd aspect, the computer-implemented method of aspect 1 or aspect 2, wherein performing aggregate rotation comprises: identifying a center point of the projected display layer; and determining a rotation amount, wherein the rotation amount corresponds to a rotation of the projected display layer about the center point such that an amount of pixel error between the projected positions and the expected positions is minimized.

In a 4th aspect, the computer-implemented method of any one of aspects 1-3, wherein performing aggregate scaling comprises: identifying a center point of the projected display layer; and determining a scaling amount, wherein the scaling amount corresponds to a scaling of the projected display layer about the center point such that an amount of pixel error between the projected positions and the expected positions is minimized.

In a 5th aspect, the computer-implemented method of any one of aspects 1-4, wherein performing spatial mapping comprises identifying a non-linear transformation to align the projected positions of the display layer with the expected positions.
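As a rough numerical illustration of the vector-field analysis in aspects 1-5 (and the curl/divergence formulations of aspects 9 and 10): the mean of the deviation vectors estimates the translation error, while the discrete divergence and curl of the field expose aggregate scaling and rotation. The synthetic deviation field below is invented for the sketch:

```python
import numpy as np

# Synthetic deviation field on a pixel grid: each entry is the (dx, dy)
# offset of a point's projected position from its expected position.
# Here the field is a pure translation plus a small isotropic expansion
# about the center, chosen only for illustration.
n = 33
ys, xs = np.mgrid[0:n, 0:n] - n // 2
dx = 0.5 + 0.01 * xs        # 0.5 px translation + 1% scaling along x
dy = -0.2 + 0.01 * ys       # -0.2 px translation + 1% scaling along y

translation = np.array([dx.mean(), dy.mean()])   # centration correction

# Discrete divergence (d dx/dx + d dy/dy) reveals aggregate scaling;
# discrete curl (d dy/dx - d dx/dy) reveals aggregate rotation.
divergence = np.gradient(dx, axis=1) + np.gradient(dy, axis=0)
curl = np.gradient(dy, axis=1) - np.gradient(dx, axis=0)

print(np.round(translation, 2))      # ~[0.5, -0.2]
print(round(divergence.mean(), 3))   # ~0.02 (1% per axis)
print(round(curl.mean(), 3))         # ~0.0  (no rotation in this field)
```

Whatever residual remains after removing the translation, rotation, and scaling components would be handled by the non-linear spatial mapping of aspect 5.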

In a 6th aspect, the computer-implemented method of any one of aspects 1-5, wherein performing luminance flattening comprises: determining a minimum luminance value of the plurality of luminance values; and reducing all luminance values of the plurality of luminance values to the minimum luminance value.

In a 7th aspect, the computer-implemented method of any one of aspects 1-5, wherein performing luminance flattening comprises: determining a threshold luminance value; and reducing all luminance values of the plurality of luminance values that are greater than the threshold luminance value to the threshold luminance value.

In an 8th aspect, the computer-implemented method of any one of aspects 1-7, wherein performing chromatic balancing comprises: identifying a color cluster associated with the display layer, the color cluster comprising at least one additional display layer; for each point of the plurality of points on the display layer, comparing the luminance value corresponding to the point on the display layer with the luminance value corresponding to the point on the additional display layer; and reducing each of the plurality of luminance values to the lowest luminance value associated with its corresponding point.
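Aspects 6-8 amount to clamping operations over per-layer luminance maps. A minimal sketch with invented 2x2 luminance maps for the red, green, and blue layers of one depth plane:

```python
import numpy as np

# Assumed luminance maps (arbitrary values, one entry per sampled point).
red   = np.array([[1.0, 0.9], [0.8, 1.1]])
green = np.array([[0.9, 1.0], [0.7, 1.0]])
blue  = np.array([[1.1, 0.8], [0.9, 1.2]])

# Aspect 6: flatten one layer down to its minimum luminance.
flat_red = np.full_like(red, red.min())

# Aspect 7: flatten by clamping to a threshold instead.
threshold = 1.0
clamped_red = np.minimum(red, threshold)

# Aspect 8: balance the color cluster by reducing every point to the
# lowest luminance among the layers at that point.
balanced = np.minimum(np.minimum(red, green), blue)

print(flat_red.max(), clamped_red.max(), balanced.max())
```

The trade-off is visible in the sketch: flattening to the minimum (aspect 6) sacrifices the most brightness, while the threshold variant (aspect 7) keeps more output at the cost of residual non-uniformity below the threshold.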

In a 9th aspect, the computer-implemented method of any one of aspects 1-8, wherein performing aggregate rotation correction comprises computing a curl of the vector field.

In a 10th aspect, the computer-implemented method of any one of aspects 1-9, wherein performing aggregate scaling correction comprises computing a divergence of the vector field.

In an 11th aspect, the computer-implemented method of any one of aspects 1-10, wherein the display comprises a light field display.

In a 12th aspect, the computer-implemented method of aspect 11, wherein the light field display comprises a stacked waveguide assembly.

In a 13th aspect, the computer-implemented method of aspect 12, wherein the stacked waveguide assembly comprises two or more waveguides corresponding respectively to two or more depth planes.

In a 14th aspect, the computer-implemented method of aspect 13, wherein each depth plane is associated with a red display layer, a green display layer, and a blue display layer.

In a 15th aspect, a method of calibrating a display is disclosed. The method, under control of a display calibration system comprising computer hardware, comprises: accessing an image of a calibration pattern projected by the display; determining spatial distortions between expected positions of calibration points in the projected light field and actual displayed positions in the image; analyzing the spatial distortions to determine a spatial calibration for the display; and storing the spatial calibration in non-transitory memory associated with the display.

In a 16th aspect, the method of aspect 15, wherein the spatial calibration corrects for one or more of: in-plane spatial errors or out-of-plane spatial errors.

In a 17th aspect, the method of aspect 15 or aspect 16, wherein the spatial calibration corrects for one or more of: translation error, rotation error, scaling error, or pixel warping.

In an 18th aspect, the method of any one of aspects 15-17, further comprising: determining a chromatic distortion from the image; analyzing the chromatic distortion to determine a chromatic calibration for the display; and storing the chromatic calibration in non-transitory storage associated with the display.

In a 19th aspect, the method of aspect 18, wherein the chromatic calibration corrects for luminance flatness or chromatic uniformity of the display.

■ Additional Aspects of the Optical Metrology System

In a 20th aspect, an optical metrology system for measuring defects in a light field generated by a display is disclosed. The optical metrology system comprises: a display configured to project a target light field comprising a virtual object having an intended focus position; a camera configured to obtain images of the target light field; and a processor programmed with executable instructions to: access one or more images corresponding to a portion of the light field; analyze the one or more images to identify a measured focus position corresponding to a position at which the virtual object is in focus; and determine defects in the light field based at least in part on a comparison of the measured focus position with the intended focus position.

In a 21st aspect, the optical metrology system of aspect 20, wherein the display comprises a light field display.

In a 22nd aspect, the optical metrology system of aspect 20 or aspect 21, wherein the display comprises a stack of waveguides configured to output light so as to project the virtual object to a particular depth plane.

In a 23rd aspect, the optical metrology system of any one of aspects 20-22, wherein the camera comprises a digital camera having a small depth of field.

In a 24th aspect, the optical metrology system of aspect 23, wherein the camera has a focus, and the system is configured to sweep the focus of the camera over a range of focuses to obtain the one or more images.

In a 25th aspect, the optical metrology system of any one of aspects 20-22, wherein the camera comprises a light field camera.

In a 26th aspect, the optical metrology system of any one of aspects 20-25, wherein the virtual object comprises a checkerboard pattern, a geometric pattern, or a stochastic pattern.

In a 27th aspect, the optical metrology system of any one of aspects 20-26, wherein the display comprises a plurality of pixels, and the target light field corresponds to a subset of less than all of the pixels being illuminated.

In a 28th aspect, the optical metrology system of any one of aspects 20-27, wherein the measured focus position comprises a depth of focus.

In a 29th aspect, the optical metrology system of aspect 28, wherein the measured focus position further comprises a lateral focus position.

In a 30th aspect, the optical metrology system of aspect 29, wherein the determined defects are based at least in part on an error vector between the intended focus position and the measured focus position.

In a 31st aspect, the optical metrology system of any one of aspects 20-30, wherein the determined defects comprise spatial defects.

In a 32nd aspect, the optical metrology system of any one of aspects 20-31, wherein the determined defects comprise chromatic defects.

In a 33rd aspect, the optical metrology system of any one of aspects 20-32, wherein the processor is further programmed to determine an error correction for the display based at least in part on the determined defects.

In a 34th aspect, a method for measuring defects in a light field is disclosed. The method comprises: accessing one or more images of a light field corresponding to a portion of a light field projected by a display, the portion of the light field having an intended focus position; analyzing the one or more images to identify a measured focus position corresponding to a position at which the light field is in focus; and determining defects in the light field based at least in part on a comparison of the measured focus position with the intended focus position.
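One way to realize the focus sweep of aspects 24 and 34-35 is to capture a focus stack with a small-depth-of-field camera and take the sharpest frame as the measured focus depth. In the sketch below, the focus stack is simulated with a Gaussian blur whose width grows away from an assumed true focus depth, and Laplacian variance stands in for whatever sharpness metric the system actually uses:

```python
import numpy as np

def blur(img, sigma):
    """Separable Gaussian blur: a crude stand-in for defocus."""
    x = np.arange(-8, 9)
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    out = np.apply_along_axis(lambda r: np.convolve(r, k, 'same'), 0, img)
    return np.apply_along_axis(lambda r: np.convolve(r, k, 'same'), 1, out)

def sharpness(img):
    """Variance of a discrete Laplacian: peaks when the image is in focus."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return lap.var()

# Simulated focus sweep over a projected checkerboard: the defocus blur
# is smallest at an assumed true focus depth of 1.0 (arbitrary units).
board = (np.indices((64, 64)).sum(0) % 2).astype(float)
depths = np.linspace(0.0, 2.0, 21)
stack = [blur(board, 0.5 + abs(d - 1.0)) for d in depths]

measured = depths[int(np.argmax([sharpness(f) for f in stack]))]
print(measured)  # sharpest frame -> measured focus depth
# defect = measured focus depth - intended focus depth
```

A light field camera (aspect 25) could skip the sweep entirely, since refocused slices at different depths can be synthesized from a single capture.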

In a 35th aspect, the method of aspect 34, comprising sweeping a focus of a camera to obtain the one or more images.

In a 36th aspect, the method of aspect 34, comprising obtaining the one or more images with a light field camera.

In a 37th aspect, the method of any one of aspects 34-36, further comprising projecting a light field image comprising a checkerboard pattern.

In a 38th aspect, the method of any one of aspects 34-37, further comprising determining an error correction for the light field based at least in part on the determined defects.

■ Additional Aspects of Calibrating a Display

In a 39th aspect, a calibration system for a display is provided. The calibration system comprises: a camera configured to acquire images of the display; and a hardware processor in communication with the camera, the hardware processor programmed to: receive the images of the display; determine a calibration for the display; and store the calibration in storage associated with the display.

In a 40th aspect, the calibration system of aspect 39, wherein the calibration comprises a spatial calibration to correct spatial defects in the display.

In a 41st aspect, the calibration system of aspect 39, wherein the calibration comprises a chromatic calibration to correct color defects in the display.

In a 42nd aspect, the calibration system of any one of aspects 39-41, wherein the display comprises a plurality of pixels in a field of view, and wherein, to determine the calibration, the hardware processor is programmed to: determine global transformation parameters that are independent of the pixels in the field of view of the display; and determine local transformation parameters that depend on the pixels in the field of view of the display.

In a 43rd aspect, the calibration system of aspect 42, wherein the global transformation parameters comprise one or more non-linear gamma corrections.

In a 44th aspect, the calibration system of aspect 42 or aspect 43, wherein the local transformations comprise linear functions.

In a 45th aspect, the calibration system of any one of aspects 39-44, wherein, to determine the calibration, the hardware processor is programmed to iteratively solve for the calibration using feedback from images captured by the camera.

In a 46th aspect, the calibration system of any one of aspects 39-45, wherein the calibration comprises a chromatic calibration, the display comprises a plurality of color levels that provide a white point, and, to determine the calibration, the hardware processor is programmed to adjust intensities of the color levels such that the white point is substantially uniform across the field of view of the display.

In a 47th aspect, the calibration system of aspect 46, wherein, to determine the calibration, the hardware processor is programmed to: solve for a first gamma correction that maps color levels sent to the display to a first intermediate color representation; solve for a pixel-dependent coupling function that maps the first intermediate color representation to a second intermediate color representation; and solve for a second gamma correction that maps the second intermediate color representation to the color levels registered by the camera.

In a 48th aspect, the calibration system of aspect 47, wherein the hardware processor is programmed to solve for the pixel-dependent coupling function before solving for the first gamma correction and the second gamma correction.
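The layered model of aspects 47-48 — camera levels arise from a first gamma correction, then a pixel-dependent coupling, then a second gamma correction — can be illustrated with a toy forward model. The gamma exponents and per-pixel coupling gains below are invented for the sketch; it only shows that, with the global gammas taken as known, a single flat-field measurement suffices to recover the coupling map:

```python
import numpy as np

# Toy forward model for one color channel (invented parameters):
# display level -> gamma1 -> per-pixel coupling gain -> gamma2 -> camera.
gamma1 = lambda v: v ** 2.2          # global, pixel-independent
gamma2 = lambda v: v ** (1 / 2.4)    # global, pixel-independent
coupling = np.array([[1.00, 0.92], [0.85, 1.05]])  # pixel-dependent

def camera_response(level):
    """Predicted camera reading at each pixel for one sent color level."""
    return gamma2(coupling * gamma1(level))

# With the global gammas assumed known, invert them around a flat-field
# measurement to expose the per-pixel coupling.
measured = camera_response(0.5)
recovered = measured ** 2.4 / gamma1(0.5)
print(np.round(recovered, 2))  # matches the assumed coupling map
```

In the actual iterative solution (aspect 45), neither gamma would be known in advance; the solver would alternate between fitting the global gammas and the pixel-dependent coupling against camera feedback until the model converges.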

In a 49th aspect, the calibration system of any one of aspects 39-48, wherein the display comprises a light field display.

In a 50th aspect, the calibration system of any one of aspects 39-49, wherein the display comprises a stackable waveguide assembly comprising a plurality of waveguides.

In a 51st aspect, the calibration system of any one of aspects 39-50, wherein the display is configured for a wearable display system.

In a 52nd aspect, a method for calibrating a display is provided. The method, under control of a dynamic calibration system performed by computer hardware, comprises: accessing a calibration for the display; determining, based at least in part on the accessed calibration, a correction to apply to the display to at least partially correct a defect in the display; and applying the correction to the display.

In a 53rd aspect, the method of aspect 52, wherein the accessed calibration comprises a chromatic calibration.

In a 54th aspect, the method of aspect 53, wherein the display comprises a plurality of pixels in a field of view, and the chromatic calibration comprises a plurality of pixel-independent non-linear gamma corrections and a pixel-dependent coupling function.

In a 55th aspect, the method of any one of aspects 52-54, wherein the display comprises a light field display.

In a 56th aspect, a wearable display system is provided, comprising: a display; non-transitory storage configured to store the calibration; and a hardware processor in communication with the non-transitory storage and programmed to perform the method of any one of aspects 14 to 17.

■ Additional Aspects of Calibration Patterns

在第57方面中,一種用於校準顯示器產生的光場的光學系統,包括:顯示器,用以投射包括含有特徵點的校準圖案的目標光場;照相機,用以獲得該目標光場的影像;處理器,其編程可執行指令用以:對於多個位置中的每一個:使得該顯示器將該校準圖案投影在該多個位置中的位置處;使得該相機獲得所投影的校準圖案的影像;計算該特徵點的失真,其中該失真對應於該特徵點的預期位置和該特徵點的測量位置之間的一誤差或校準圖案的期望輝度或色度與校準圖案的測量亮度或色度之間的一誤差;以及反應於對該多個位置中的下一位置的確定,將該校準圖案平移使其在該下一位置處顯示。 In an 57th aspect, an optical system for calibrating a light field generated by a display, comprising: a display for projecting a target light field including a calibration pattern containing feature points; and a camera for obtaining an image of the target light field; a processor that programs executable instructions for: for each of the plurality of locations: causing the display to project the calibration pattern at a location in the plurality of locations; such that the camera obtains an image of the projected calibration pattern; Calculating a distortion of the feature point, wherein the distortion corresponds to an error between the expected position of the feature point and the measured position of the feature point or between a desired luminance or chromaticity of the calibration pattern and a measured brightness or chromaticity of the calibration pattern An error; and reacting to the determination of the next of the plurality of locations, translating the calibration pattern to display at the next location.

In a 58th aspect, the optical system of aspect 57, wherein the calibration pattern comprises a checkerboard pattern.

In a 59th aspect, the optical system of aspect 58, wherein the number of the plurality of positions corresponds to the number of pixels in a checker box of the checkerboard pattern.

In a 60th aspect, the optical system of aspect 57, wherein the calibration pattern comprises a single-pixel pattern.

In a 61st aspect, the optical system of aspect 60, wherein the number of the plurality of positions corresponds to the number of pixels of the display.

In a 62nd aspect, the optical system of any one of aspects 57 to 61, wherein the processor is further programmed to generate a distortion map based at least in part on the calculated distortions corresponding to the plurality of positions.

In a 63rd aspect, the optical system of any one of aspects 57 to 62, wherein the processor is further programmed to determine an error correction for the display based at least in part on the calculated distortions corresponding to the plurality of positions.

In a 64th aspect, the optical system of any one of aspects 57 to 63, wherein the display comprises separate red, green, and blue color layers.

In a 65th aspect, the optical system of any one of aspects 57 to 64, wherein the display comprises a light field display.

In a 66th aspect, the optical system of aspect 65, wherein the light field display comprises a stacked waveguide assembly.

In a 67th aspect, the optical system of aspect 66, wherein the stacked waveguide assembly comprises two or more waveguides corresponding respectively to two or more depth planes.

In a 68th aspect, the optical system of any one of aspects 57 to 67, wherein the calculated distortion comprises a luminance distortion or a chromatic distortion.

In a 69th aspect, a method for calibrating a light field generated by a display is provided. The method comprises: for each position of a plurality of positions: causing the display to project a calibration pattern having a feature point at that position of the plurality of positions; causing a camera to obtain an image of the projected calibration pattern; calculating a distortion of the feature point, wherein the distortion corresponds to an error between an expected position of the feature point and a measured position of the feature point, or an error between an expected luminance or chromaticity of the feature point and a measured luminance or chromaticity; and, in response to a determination that there is a next position of the plurality of positions, shifting the calibration pattern so that it is displayed at the next position.

In a 70th aspect, the method of aspect 69, wherein the calibration pattern is a checkerboard pattern.

In a 71st aspect, the method of aspect 70, wherein the number of the plurality of positions corresponds to the number of pixels in a checker box of the checkerboard pattern.

In a 72nd aspect, the method of aspect 69, wherein the calibration pattern comprises a single-pixel pattern, a random pattern, or a geometric pattern.

In a 73rd aspect, the method of aspect 72, wherein the number of the plurality of positions corresponds to the number of pixels of the display.

In a 74th aspect, the method of any one of aspects 69 to 73, further comprising generating a distortion map based at least in part on the calculated distortions corresponding to the plurality of positions.

In a 75th aspect, the method of any one of aspects 69 to 74, further comprising determining an error correction for the display based at least in part on the calculated distortions corresponding to the plurality of positions.

In a 76th aspect, the method of any one of aspects 69 to 75, wherein the display comprises separate red, green, and blue color layers.

In a 77th aspect, the method of any one of aspects 69 to 76, wherein the display comprises a light field display.

In a 78th aspect, the method of aspect 77, wherein the light field display comprises a stacked waveguide assembly.

In a 79th aspect, the method of aspect 78, wherein the stacked waveguide assembly comprises two or more waveguides corresponding respectively to two or more depth planes.

In an 80th aspect, the method of any one of aspects 69 to 79, wherein the calculated distortion comprises a luminance distortion or a chromatic distortion.

■ Other Aspects of Performing Dynamic Calibration

In an 81st aspect, a display system is provided. The display system comprises: an eye-tracking camera; a display; non-transitory data storage configured to store a plurality of calibrations for the display, each calibration of the plurality of calibrations associated with a calibration position relative to the display; and a hardware processor in communication with the eye-tracking camera, the display, and the non-transitory data storage, the hardware processor programmed to: determine an eye position, relative to the display, of a user of the display; access, based at least in part on the eye position, one or more of the plurality of calibrations; determine, based at least in part on the one or more of the plurality of calibrations, a correction to apply to the display to at least partially correct an imperfection in the display; and apply the correction to the display.

In an 82nd aspect, the display system of aspect 81, wherein the number of calibration positions is 2, 3, 4, 5, 6, 7, 8, 9, or more.

In an 83rd aspect, the display system of aspect 81 or aspect 82, wherein the calibration positions are distributed across the display in a grid.

In an 84th aspect, the display system of aspect 83, wherein the grid comprises a 2x2, 3x3, 5x5, or 9x9 grid.

In an 85th aspect, the display system of any one of aspects 81 to 84, wherein the one or more of the plurality of calibrations comprises a calibration associated with the calibration position closest to the eye position.

In an 86th aspect, the display system of any one of aspects 81 to 85, wherein, to determine the correction, the hardware processor is programmed to interpolate or extrapolate among the one or more of the plurality of calibrations.
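One way such interpolation among stored calibrations could be realized (a minimal sketch under the assumption of scalar or array-valued per-position corrections and an inverse-distance weighting scheme; the patent does not prescribe a particular interpolation method, and the function names are hypothetical):

```python
import numpy as np

def interpolate_calibration(eye_xy, grid_xy, grid_corrections):
    """Blend stored per-position corrections by inverse-distance
    weighting; fall back to the nearest stored calibration when the
    eye sits exactly on a calibration position."""
    eye = np.asarray(eye_xy, dtype=float)
    pts = np.asarray(grid_xy, dtype=float)              # (N, 2) calibration positions
    corrs = np.asarray(grid_corrections, dtype=float)   # (N, ...) correction data
    d = np.linalg.norm(pts - eye, axis=1)
    if np.any(d < 1e-9):                                # exactly on a stored position
        return corrs[np.argmin(d)]
    w = 1.0 / d
    w /= w.sum()
    return np.tensordot(w, corrs, axes=1)               # weighted blend
```

For eye positions outside the grid the same weights effectively extrapolate from the nearest calibration positions, which matches the interpolate-or-extrapolate language of this aspect.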

In an 87th aspect, the display system of any one of aspects 81 to 86, wherein each calibration of the plurality of calibrations corrects a spatial imperfection of the display, a color imperfection of the display, or both spatial and color imperfections.

In an 88th aspect, the display system of any one of aspects 81 to 87, wherein the display comprises a light field display.

In an 89th aspect, the display system of any one of aspects 81 to 88, wherein the display comprises a stackable waveguide assembly comprising a plurality of waveguides.

In a 90th aspect, the display system of any one of aspects 81 to 89, wherein the display is configured as a wearable display system.

In a 91st aspect, a head-mounted display is provided, comprising the display system of any one of aspects 81 to 90.

In a 92nd aspect, a method for calibrating a display is provided. The method, under control of a dynamic calibration system performed by computer hardware, comprises: determining an eye position of a user of the display; accessing, based at least in part on the determined eye position, a calibration for the display, the calibration associated with a calibration position near the determined eye position; determining, based at least in part on the accessed calibration, a correction to apply to the display to at least partially correct an imperfection in the display; and applying the correction to the display.

In a 93rd aspect, the method of aspect 92, wherein accessing the calibration comprises selecting one or more calibrations from a plurality of calibrations, each calibration associated with a different calibration position relative to the display.

In a 94th aspect, the method of aspect 93, wherein the calibration positions are arranged in a grid across the display.

In a 95th aspect, the method of any one of aspects 92 to 94, wherein the calibration corrects a spatial imperfection of the display, a color imperfection of the display, or both spatial and color imperfections.

In a 96th aspect, the method of any one of aspects 92 to 95, wherein determining the correction comprises interpolating or extrapolating among one or more calibrations associated with calibration positions near the eye pose.

In a 97th aspect, the method of any one of aspects 92 to 96, wherein the display comprises a light field display.

In a 98th aspect, a head-mounted display is provided, comprising an eye-tracking system and a hardware processor programmed to perform the method of any one of aspects 92 to 97.

■ Additional Aspects of the Optical Metrology System

In a 99th aspect, an optical metrology system for measuring imperfections in a light field generated by a display is provided. The optical metrology system comprises: a display configured to project a target light field comprising a virtual object having an intended focus position; a camera configured to obtain images of the target light field; and a hardware processor programmed with executable instructions to: access one or more images corresponding to a portion of the light field; analyze the one or more images to identify a measured focus position corresponding to a position at which the virtual object is in focus; and determine an imperfection in the light field based at least in part on comparing the measured focus position with the intended focus position.

In a 100th aspect, the optical metrology system of aspect 99, wherein the display comprises a stack of waveguides configured to output light to project the virtual object to at least one depth plane.

In a 101st aspect, the optical metrology system of any one of aspects 99 to 100, wherein the camera comprises a digital camera having a small depth of focus.

In a 102nd aspect, the optical metrology system of aspect 101, wherein the camera has a focus, and the system is configured to sweep the focus of the camera over a range of focuses to obtain the one or more images.
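The focus sweep of this aspect can be illustrated with a simple focus-stacking sketch (the gradient-energy sharpness metric and the function names are my own assumptions; any focus metric that peaks at best focus would serve):

```python
import numpy as np

def sharpness(image):
    """Gradient-energy focus metric: larger when the image is in focus."""
    gy, gx = np.gradient(np.asarray(image, dtype=float))
    return float((gx ** 2 + gy ** 2).mean())

def measured_focus_depth(focus_depths, images):
    """Given one image captured at each focus depth of a
    small-depth-of-focus camera, return the depth whose image is
    sharpest, i.e. the measured focus position of the virtual object."""
    scores = [sharpness(img) for img in images]
    return focus_depths[int(np.argmax(scores))]
```

Comparing the returned depth with the intended depth of the virtual object then yields the out-of-plane component of the imperfection.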

In a 103rd aspect, the optical metrology system of any one of aspects 99 to 102, wherein the camera comprises a light field camera.

In a 104th aspect, the optical metrology system of any one of aspects 99 to 103, wherein the virtual object comprises a checkerboard pattern, a geometric pattern, or a random pattern.

In a 105th aspect, the optical metrology system of any one of aspects 99 to 104, wherein the display comprises a plurality of pixels, and the target light field corresponds to fewer than all of the pixels being illuminated.

In a 106th aspect, the optical metrology system of any one of aspects 99 to 105, wherein the measured focus position comprises a depth of focus.

In a 107th aspect, the optical metrology system of aspect 106, wherein the measured focus position further comprises a lateral focus position.

In a 108th aspect, the optical metrology system of any one of aspects 99 to 107, wherein the determined imperfection is based at least in part on an error vector between the intended focus position and the measured focus position.

In a 109th aspect, the optical metrology system of any one of aspects 99 to 108, wherein the hardware processor is further programmed to determine an error correction for the display based at least in part on the determined imperfection.

In a 110th aspect, the optical metrology system of any one of aspects 99 to 109, wherein the hardware processor is further programmed to apply a display-to-camera pixel mapping to transfer pixel values of the display to pixel values of the camera.

In a 111th aspect, the optical metrology system of aspect 110, wherein the display-to-camera pixel mapping comprises: a first gamma correction that maps color levels of the display to a first intermediate color representation; a pixel-dependent coupling function that maps the first intermediate color representation to a second intermediate color representation; and a second gamma correction that maps the second intermediate color representation to the color levels registered by the camera.
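The three-stage mapping of this aspect composes cleanly as a function pipeline. A minimal sketch (the power-law gammas and identity coupling in the test are illustrative placeholders; in practice each stage would be fitted from measurements):

```python
import numpy as np

def display_to_camera(levels, gamma1, coupling, gamma2):
    """Composite display-to-camera pixel mapping:
    Camera = Gamma2(Coupling(Gamma1(Display))).

    gamma1, gamma2 : pixel-independent nonlinear responses
    coupling       : per-pixel mixing of the intermediate representation
    """
    x = gamma1(np.asarray(levels, dtype=float))  # display color levels -> 1st intermediate
    x = coupling(x)                              # 1st intermediate -> 2nd intermediate
    return gamma2(x)                             # 2nd intermediate -> camera color levels
```

With the coupling set to the identity and the two gammas chosen as inverse power laws, the round trip reproduces the input, which is a convenient sanity check when fitting the stages.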

在第112方面,根據方面99至111中任一方面的光學計量系統,其中所確定的缺陷包括空間缺陷。 The optical metrology system of any of aspects 99 to 111, wherein the determined defect comprises a spatial defect.

在第113個方面,根據方面112的光學計量系統,其中該空間缺陷包括平面內平移、旋轉、縮放或扭曲誤差或平面外或焦點深度誤差中的一個或多個。 In an eleventh aspect, the optical metrology system of aspect 112, wherein the spatial defect comprises one or more of an in-plane translation, rotation, scaling or distortion error or an out-of-plane or focus depth error.

在第114個方面,根據方面99至113中任一方面的光學計量系統,其中所確定的缺陷包括色彩缺陷。 The optical metrology system of any of aspects 99 to 113, wherein the determined defect comprises a color defect.

在第115個方面,根據方面114的光學計量系統,其中該色彩缺陷包括與該顯示器可顯示的顏色相關聯的輝度平坦度或色彩均勻度誤差中的一個或多個。 In an eleventh aspect, the optical metrology system of aspect 114, wherein the color defect comprises one or more of luminance flatness or color uniformity error associated with a color displayable by the display.

在第116個方面,提供了一種用於在顯示器上執行影像校正的光學計量系統。該系統包括:照相機,用以擷取顯示器投射的光場的影像,該光場與顯示器的顯示層相關聯;硬體處理器,其編程可執行指令用以:至少部分地基於該相機擷取的影像生成向量場,該向量場包括對應於該顯示層的點的投影位置和預期位置之間的偏差的向量;計算,至少部分地基於該向量場,至少其中一個:一中心校正、一聚集旋轉校正、一聚集縮放校正或空間映射,用於顯示器;計算,至少部分地基於該相機擷取的影像,與該顯示層上的多個點相對應的輝度值;並且計算,至少部分地基於所確定的輝度值,一輝度平坦校正或一色平衡校正,用於顯示器。 In a 116th aspect, an optical metrology system for performing image correction on a display is provided. The system includes a camera for capturing an image of a light field projected by the display, the light field being associated with a display layer of the display, and a hardware processor programming executable instructions for: based at least in part on capturing the camera An image generation vector field, the vector field comprising a vector corresponding to a deviation between a projected position of the point of the display layer and an expected position; computing, based at least in part on the vector field, at least one of: a center correction, an aggregation Rotation correction, an aggregate zoom correction or spatial mapping for a display; computing, based at least in part on an image captured by the camera, a luminance value corresponding to a plurality of points on the display layer; and calculating, based at least in part on The determined luminance value, a luminance flat correction or a one color balance correction, is used for the display.

在第117個方面,根據方面116的光學計量系統,其中顯示器的顯示層包括色彩層或深度層。 In an eleventh aspect, the optical metrology system of aspect 116, wherein the display layer of the display comprises a color layer or a depth layer.

在第118方面,根據方面116至117中任一方面的光學計量系統,其中該照相機包括具有小焦深的光場照相機或數位照相機。 The optical metrology system of any of aspects 116 to 117, wherein the camera comprises a light field camera or a digital camera having a small depth of focus.

在第119方面,根據方面116至118中任一方面的光學計量系統,其中為了計算該中心校正,該硬體處理器被編程為確定對應於該投影顯示層的所識別的中心點與該投影顯示層的中心點之間的平移誤差的平移向量,以及預期中心點位置。 The optical metrology system of any one of aspects 116 to 118, wherein the hardware processor is programmed to determine the identified center point corresponding to the projection display layer and the projection for calculating the center correction The translation vector of the translation error between the center points of the display layer, as well as the expected center point position.

在第120方面中,根據方面116至119中任一方面的光學計量系統,其中為了計算聚集旋轉校正,該硬體處理器被編程為確定對應於投影顯示層圍繞中心點的旋轉的旋轉量,例如投影位置和預期位置之間的像素誤差量被減小或最小化。 The optical metrology system of any one of aspects 116 to 119, wherein the hardware processor is programmed to determine an amount of rotation corresponding to a rotation of the projection display layer about the center point in order to calculate the aggregate rotation correction, For example, the amount of pixel error between the projected position and the expected position is reduced or minimized.

在第121個方面中,根據方面116至120中任一個的光學計量系統,其中為了計算總計旋轉校正,該硬體處理器被編程為計算向量場的捲曲。 In an eleventh aspect, the optical metrology system of any one of aspects 116 to 120, wherein the hardware processor is programmed to calculate the curl of the vector field in order to calculate the total rotation correction.

在第122方面中,根據方面116至121中任一項該的光學計量系統,其中為了計算該聚集縮放校正,該硬體處理器被編程為確定與該投影顯示層圍繞中心點的縮放相對應的縮放量,投影位置和預期位置之間的像素誤差量被減小或最小化。 The optical metrology system of any one of aspects 116 to 121, wherein the hardware processor is programmed to determine a scaling corresponding to the center of the projection display layer in order to calculate the aggregate scaling correction The amount of quantization, the amount of pixel error between the projected position and the expected position is reduced or minimized.

在第123個方面,根據方面116至122中任一個的光學計量系統,其中為了計算該總計縮放校正,該硬體處理器被 編程為計算該向量場的發散度。 The optical metrology system of any one of aspects 116 to 122, wherein the hardware processor is Programming to calculate the divergence of the vector field.
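The curl and divergence of a sampled distortion vector field can be estimated with finite differences. A sketch (the averaging over the field to obtain a single aggregate rotation or scaling figure is my assumption about how the aggregate corrections would be summarized):

```python
import numpy as np

def curl_and_divergence(vx, vy):
    """Mean curl and divergence of a 2-D distortion field sampled on a
    pixel grid (vx, vy are the x- and y-components of the deviation at
    each grid point). The mean curl estimates the aggregate rotation
    of the projected layer, the mean divergence its aggregate scaling."""
    dvx_dy, dvx_dx = np.gradient(np.asarray(vx, dtype=float))
    dvy_dy, dvy_dx = np.gradient(np.asarray(vy, dtype=float))
    curl = dvy_dx - dvx_dy   # z-component of the curl of (vx, vy)
    div = dvx_dx + dvy_dy    # divergence of (vx, vy)
    return curl.mean(), div.mean()
```

For a pure small rotation by angle ω about the center, the field has curl 2ω and zero divergence; for a pure scaling by factor (1+s) it has divergence 2s and zero curl, which is why these two operators separate the rotation and scaling corrections.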

In a 124th aspect, the optical metrology system of any one of aspects 116 to 123, wherein, to calculate the spatial mapping, the hardware processor is programmed to determine a nonlinear transformation to align the projected positions of the display layer with the expected positions.

In a 125th aspect, the optical metrology system of any one of aspects 116 to 124, wherein, to calculate the luminance flattening correction, the hardware processor is programmed to: determine a threshold luminance value; and calculate an amount by which to lower each luminance value greater than the threshold luminance value to the threshold luminance value.

In a 126th aspect, the optical metrology system of any one of aspects 116 to 125, wherein, to calculate the chromatic balancing correction, the hardware processor is programmed to: identify a color cluster associated with the display layer, the color cluster comprising at least one additional display layer; for each point of the display layer, compare the luminance value corresponding to that point on the display layer with the luminance value corresponding to that point on the additional display layer; and calculate an amount by which to lower each luminance value to the lowest luminance value associated with its corresponding point.
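Both corrections of aspects 125 and 126 reduce luminance down to a common level, never up. A minimal sketch (choosing the layer minimum as the default threshold, and representing the corrections as non-positive offsets, are illustrative assumptions):

```python
import numpy as np

def luminance_flattening(layer, threshold=None):
    """Per-point offsets that lower every luminance above the
    threshold down to the threshold. With the default threshold (the
    layer minimum) the corrected field becomes uniform."""
    layer = np.asarray(layer, dtype=float)
    if threshold is None:
        threshold = layer.min()
    return np.minimum(layer, threshold) - layer    # non-positive offsets

def chromatic_balancing(*layers):
    """For each point, lower every color layer in the cluster to the
    lowest luminance observed at that point across the layers;
    returns one offset field per layer."""
    stack = np.asarray(layers, dtype=float)
    floor = stack.min(axis=0)                      # pointwise lowest luminance
    return [floor - layer for layer in stack]      # per-layer offsets
```

Adding each offset field to its layer yields the flattened or balanced luminance, at the cost of overall brightness, which is the usual trade-off of clamp-down corrections.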

■ Other Aspects of Dynamic Display Calibration

In a 127th aspect, a display system is provided. The display system comprises: an eye-tracking camera; a display; non-transitory data storage configured to store a plurality of calibrations for the display, each calibration of the plurality of calibrations associated with a calibration position relative to the display; and a hardware processor in communication with the eye-tracking camera, the display, and the non-transitory data storage, the hardware processor programmed to: determine, based on information from the eye-tracking camera, an eye position, relative to the display, of a user of the display; access, based at least in part on the determined eye position, one or more of the plurality of calibrations; calculate, based at least in part on the one or more of the plurality of calibrations, a correction to apply to the display to at least partially correct an imperfection in the display; and apply the correction to the display.

In a 128th aspect, the display system of aspect 127, wherein the number of calibration positions is 2, 3, 4, 5, 6, 7, 8, 9, or more.

In a 129th aspect, the display system of any one of aspects 127 to 128, wherein the calibration positions are distributed across the display in a grid.

In a 130th aspect, the display system of aspect 129, wherein the grid comprises a 2x2, 3x3, 5x5, or 9x9 grid.

In a 131st aspect, the display system of any one of aspects 127 to 130, wherein the one or more of the plurality of calibrations comprises a calibration associated with the calibration position closest to the eye position.

In a 132nd aspect, the display system of any one of aspects 127 to 131, wherein, to calculate the correction, the hardware processor is programmed to interpolate or extrapolate among the one or more of the plurality of calibrations based at least in part on the calibration positions of the one or more of the plurality of calibrations and the determined eye position.

In a 133rd aspect, the display system of any one of aspects 127 to 132, wherein the display comprises a first display associated with a first eye of the user and a second display associated with a second eye of the user, and the hardware processor is programmed to determine the eye position of the user relative to the first display and to apply the determined eye position to calculate the correction for the second display.

In a 134th aspect, the display system of any one of aspects 127 to 133, wherein the display comprises a first display associated with a first eye of the user and a second display associated with a second eye of the user, and at least some of the plurality of calibrations represent average calibrations for the first display and the second display.

In a 135th aspect, the display system of any one of aspects 127 to 134, wherein the display comprises a light field display.

In a 136th aspect, the display system of any one of aspects 127 to 135, wherein the display comprises a stackable waveguide assembly comprising a plurality of waveguides.

In a 137th aspect, the display system of any one of aspects 127 to 136, wherein the display is configured as a head-mounted wearable display system.

In a 138th aspect, the display system of any one of aspects 127 to 137, wherein each calibration of the plurality of calibrations corrects a spatial imperfection of the display, a color imperfection of the display, or both spatial and color imperfections.

In a 139th aspect, the display system of aspect 138, wherein the spatial imperfection comprises one or more of an in-plane translation, rotation, scaling, or warping error, or an out-of-plane or depth-of-focus error.

In a 140th aspect, the display system of aspect 138, wherein the color imperfection comprises one or more of a luminance flatness error or a chromatic uniformity error associated with a color displayable by the display.

In a 141st aspect, a method for calibrating a display is provided. The method, under control of a dynamic calibration system performed by computer hardware, comprises: determining an eye position of a user of the display; accessing, based at least in part on the determined eye position, a calibration for the display, wherein the calibration is selected based on its associated calibration position and the determined eye position; calculating, based at least in part on the accessed calibration, a correction to apply to the display to at least partially correct an imperfection in the display; and applying the correction to the display.

In a 142nd aspect, the method of aspect 141, wherein accessing the calibration comprises selecting one or more calibrations from a plurality of calibrations, each calibration associated with a different calibration position relative to the display.

In a 143rd aspect, the method of aspect 142, wherein the calibration positions are arranged in a grid across the display.

In a 144th aspect, the method of any one of aspects 142 to 143, wherein calculating the correction comprises interpolating or extrapolating among one or more of the plurality of calibrations based on the associated calibration positions of the one or more calibrations and the determined eye position.

In a 145th aspect, the method of any one of aspects 141 to 144, further comprising capturing an image of an eye of the user of the display and determining the eye position based at least in part on the image of the eye.

In a 146th aspect, the method of any one of aspects 141 to 145, wherein calculating the correction comprises correcting a spatial imperfection of the display, a color imperfection of the display, or both spatial and color imperfections.

In a 147th aspect, a wearable display system is provided, comprising: an inward-facing imaging system; a display; non-transitory data storage configured to store a plurality of calibrations for the display, each calibration of the plurality of calibrations associated with a calibration position relative to the display; and a hardware processor in communication with the inward-facing imaging system, the display, and the non-transitory data storage, the hardware processor programmed to: determine, using the inward-facing imaging system, an eye position of a user of the display relative to the display; calculate, based at least in part on the determined eye position and one or more of the plurality of calibrations, a correction to apply to the display to at least partially correct one or more of a spatial imperfection of the display or a color imperfection of the display; and apply the correction to the display.

In a 148th aspect, the wearable display system of aspect 147, wherein the hardware processor is programmed to apply the correction via a feedback loop that monitors changes in the eye position.

In a 149th aspect, the wearable display system of any one of aspects 147 to 148, wherein the hardware processor is programmed to determine a change of the eye position relative to a previous eye position and, if the change exceeds a threshold, to calculate the correction.
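The thresholded feedback step of aspects 148 and 149 amounts to recomputing the correction only when the eye has moved far enough from the position at which the current correction was computed. A sketch (the return convention and the `calibrate` callback are hypothetical):

```python
import numpy as np

def maybe_recalibrate(eye_xy, prev_eye_xy, threshold, calibrate):
    """One iteration of the eye-position feedback loop: if the eye has
    moved more than `threshold` from the reference position, compute a
    new correction and update the reference; otherwise keep the current
    correction. Returns (new_correction_or_None, reference_position)."""
    delta = np.linalg.norm(np.subtract(eye_xy, prev_eye_xy))
    if delta > threshold:
        return calibrate(eye_xy), tuple(eye_xy)   # recalibrate at new position
    return None, tuple(prev_eye_xy)               # keep current correction
```

Skipping recalibration for sub-threshold motion avoids continuously recomputing corrections for the small fixational eye movements an eye tracker reports.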

In a 150th aspect, the wearable display system of any one of aspects 147 to 149, wherein the spatial imperfection comprises one or more of an in-plane translation, rotation, scaling, or warping error, or an out-of-plane or depth-of-focus error.

In a 151st aspect, the wearable display system of any one of aspects 147 to 150, wherein the color imperfection comprises one or more of a luminance flatness error or a chromatic uniformity error associated with a color displayable by the display.

■結論■Conclusion

本發明所描述和/或附圖中所描繪的過程、方法和算法中的每一個具體實施例可由一個或多個物理計算系統、硬體電腦處理器、專用集成電路(ASIC)執行的代碼模組,和/或被配置為執行特定和特定計算機指令的電子硬體來完全實現或部分自動化。例如,計算系統可以包括用特定計算機指令或專用計算機編程的通用計算機(例如,伺服器)、專用電路等。代碼模組可以被編譯並鏈接到可執行程序中,安裝在動態鏈接庫中,或者可以用解釋程式語言編寫。在一些實現中,特定操作和方法可以由特定於給定功能的電路執行。 Each of the processes, methods, and algorithms described and/or depicted in the present disclosure can be executed by one or more physical computing systems, hardware computer processors, application specific integrated circuits (ASICs). Groups, and/or electronic hardware configured to perform specific and specific computer instructions are fully implemented or partially automated. For example, a computing system can include a general purpose computer (eg, a server), a special purpose circuit, or the like, programmed with a particular computer instruction or a special purpose computer. Code modules can be compiled and linked into an executable program, installed in a dynamic link library, or written in an interpreter language. In some implementations, certain operations and methods may be performed by circuitry that is specific to a given function.

此外,本發明所實施的某些功能在數學上、計算上或技術上相當複雜,使得需要專用硬體或一個或多個物理計算設備(利用適當的專用可執行指令)來執行功能,例如,由於涉及的計算的體積或複雜性或者基本上即時地提供結果。例如,視頻可以包括許多幀,每個幀具有數百萬個像素,並且特別地需要編程的計算機硬體來處理視頻數據,以在商業上合理的時間量內提供期望的影像處理任務或應用。 Moreover, some of the functions implemented by the present invention are mathematically, computationally, or technically complex, such that a dedicated hardware or one or more physical computing devices (with appropriate dedicated executable instructions) are required to perform functions, for example, The results are provided due to the volume or complexity of the calculations involved or substantially instantaneously. For example, a video may include many frames, each having millions of pixels, and in particular requiring programmed computer hardware to process the video data to provide a desired image processing task or application for a commercially reasonable amount of time.

Code modules or any type of data may be stored on any type of non-transitory computer-readable medium, such as physical computer storage including hard drives, solid state memory, random access memory (RAM), read only memory (ROM), optical discs, volatile or non-volatile storage, combinations of the same, and/or the like. The methods and modules (or data) may also be transmitted as generated data signals (e.g., as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission media, including wireless-based and wired/cable-based media, and may take a variety of forms (e.g., as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). The results of the disclosed processes or process steps may be stored, persistently or otherwise, in any type of non-transitory, tangible computer storage, or may be communicated via a computer-readable transmission medium.

Any processes, blocks, states, steps, or functionalities in the flow diagrams described herein and/or depicted in the accompanying figures should be understood as potentially representing code modules, segments, or portions of code that include one or more executable instructions (e.g., logical or arithmetical) or steps for implementing specific functions. The various processes, blocks, states, steps, or functionalities can be combined, rearranged, added to, deleted from, modified, or otherwise changed in various ways. In some embodiments, additional or different computing systems or code modules may perform some or all of the functionalities described herein. The methods and processes described herein are also not limited to any particular sequence, and the blocks, steps, or states relating thereto can be performed in other sequences that are appropriate, for example, in serial, in parallel, or in some other manner. Tasks or events may be added to or removed from the disclosed embodiments. Moreover, the separation of various system components in the implementations described herein is for illustrative purposes and should not be understood as requiring such separation in all implementations. It should be understood that the described program components, methods, and systems can generally be integrated together in a single computer product or packaged into multiple computer products. Many implementation variations are possible.

The processes, methods, and systems may be implemented in a network (or distributed) computing environment. Network environments include enterprise-wide computer networks, intranets, local area networks (LAN), wide area networks (WAN), personal area networks (PAN), cloud computing networks, crowd-sourced computing networks, the Internet, and the World Wide Web. The network may be a wired or a wireless network, or any other type of communication network.

The systems, methods, and devices of the present disclosure each have several innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein. The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. Various modifications to the example implementations described in this disclosure may be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of this disclosure. Thus, the claims are not intended to be limited to the implementations shown herein, but are to be accorded the widest scope consistent with this disclosure and with the principles and novel features disclosed herein.

Certain features that are described in this specification in the context of separate implementations can also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations, and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or a variation of a subcombination. No single feature or group of features is necessary or indispensable to each and every embodiment.

Conditional language used herein, such as "can," "could," "e.g.," and the like, unless specifically stated otherwise or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments, or that one or more embodiments necessarily include logic for deciding, with or without operator input or prompting, whether these features, elements, and/or steps are included or are to be performed in any particular embodiment. The terms "comprising," "including," "having," and the like are synonymous, are used inclusively in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term "or" is used in its inclusive sense (and not in its exclusive sense), so that when used, for example, to connect a list of elements, the term "or" means one, some, or all of the elements in the list. In addition, the articles "a," "an," and "the" as used in this application and the appended claims are to be construed to mean "one or more" or "at least one" unless specified otherwise.

As used herein, a phrase referring to "at least one of" a list of items refers to any combination of those items, including single members. As an example, "at least one of A, B, or C" is intended to cover: A; B; C; A and B; A and C; B and C; and A, B, and C. Conjunctive language, such as the phrase "at least one of X, Y, and Z," unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. may be X, Y, or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present.

Similarly, while operations may be depicted in the drawings in a particular order, it is to be recognized that such operations need not be performed in the particular order shown or in sequential order, and that not all illustrated operations need be performed, to achieve desirable results. Further, the drawings may schematically depict one or more example processes in the form of a flowchart. However, other operations that are not depicted can be incorporated into the example methods and processes that are schematically illustrated. For example, one or more additional operations can be performed before, after, simultaneously with, or between any of the illustrated operations. Additionally, the operations may be rearranged or reordered in other implementations. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products. Additionally, other implementations are within the scope of the following claims; in some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results.

1800‧‧‧Metrology system

1802‧‧‧Display

1804‧‧‧Light rays

1806‧‧‧Camera

1808‧‧‧Controller

Claims (28)

1. An optical metrology system for measuring imperfections in a light field generated by a display, the optical metrology system comprising: a display configured to project a target light field comprising a virtual object having an intended focus position; a camera configured to capture images of the target light field; and a hardware processor programmed with executable instructions to: obtain one or more images corresponding to a portion of the light field; analyze the one or more images to identify a measured focus position at which the virtual object is in focus; and determine imperfections in the light field based at least in part on a comparison of the measured focus position with the intended focus position. 2. The optical metrology system of claim 1, wherein the display comprises a stack of waveguides configured to output light so as to project the virtual object to at least one depth plane. 3. The optical metrology system of claim 1, wherein the camera comprises a digital camera having a small depth of focus. 4. The optical metrology system of claim 3, wherein the camera has a focus, and the system is configured to sweep the focus of the camera over a range of focuses to obtain the one or more images. 5. The optical metrology system of claim 1, wherein the camera comprises a light field camera. 6. The optical metrology system of claim 1, wherein the virtual object comprises a checkerboard pattern, a geometric pattern, or a stochastic pattern. 7. The optical metrology system of claim 1, wherein the display comprises a plurality of pixels, and the target light field corresponds to a subset of fewer than all of the plurality of pixels being illuminated. 8. The optical metrology system of claim 1, wherein the measured focus position comprises a depth of focus. 9. The optical metrology system of claim 1, wherein the measured focus position further comprises a lateral focus position. 10. The optical metrology system of claim 1, wherein determining the imperfections in the light field is based at least in part on an error vector between the measured focus position and the intended focus position. 11. The optical metrology system of claim 1, wherein the hardware processor is further programmed to determine an error correction for the display based at least in part on the determined imperfections. 12. The optical metrology system of claim 1, wherein the hardware processor is further programmed to apply a display-to-camera pixel mapping to convert pixel values of the display to pixel values registered by the camera. 13. The optical metrology system of claim 12, wherein the display-to-camera pixel mapping comprises: a first gamma correction that maps color levels of the display to a first intermediate color representation; a pixel-dependent coupling function that maps the first intermediate color representation to a second intermediate color representation; and a second gamma correction that maps the second intermediate color representation to color levels registered by the camera.
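The focus-sweep measurement recited above (a small-depth-of-focus camera swept over a range of focuses to find where the virtual object is sharpest) can be sketched as follows. This is an illustrative sketch, not the patent's procedure: the `capture` callback, the variance-of-Laplacian sharpness metric, and all names are assumptions.

```python
import numpy as np

def measured_focus_depth(capture, focus_depths):
    """Sweep the camera focus over `focus_depths` and return the depth at
    which the captured image of the virtual object is sharpest, i.e. an
    estimate of the measured focus position. `capture(d)` is assumed to
    return a 2-D grayscale image focused at depth `d`; the sharpness metric
    (variance of a Laplacian response) is an illustrative choice.
    """
    laplacian = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], float)

    def sharpness(img):
        img = np.asarray(img, float)
        h, w = img.shape
        # Valid 2-D convolution with the 3x3 Laplacian kernel, then variance.
        acc = np.zeros((h - 2, w - 2))
        for dy in range(3):
            for dx in range(3):
                acc += laplacian[dy, dx] * img[dy:dy + h - 2, dx:dx + w - 2]
        return acc.var()

    scores = [sharpness(capture(d)) for d in focus_depths]
    return focus_depths[int(np.argmax(scores))]
```

In use, `capture` would trigger the metrology camera at each focus setting; the argmax over sharpness scores plays the role of identifying the in-focus position that is then compared against the intended focus position.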
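The three-stage display-to-camera pixel mapping recited above (first gamma correction, pixel-dependent coupling function, second gamma correction) can be sketched as a composition of functions. The power-law gammas and the identity/crosstalk couplings below are illustrative placeholders, not the calibrated functions the system would actually fit.

```python
import numpy as np

def display_to_camera(display_levels, g1=2.2, g2=1.0 / 2.2, coupling=None):
    """Map display color levels to predicted camera-registered levels:
    display levels --Gamma1--> first intermediate color representation
                   --pixel-dependent Coupling--> second intermediate representation
                   --Gamma2--> camera-registered color levels.
    """
    rgb = np.asarray(display_levels, dtype=float)
    linear = np.clip(rgb, 0.0, 1.0) ** g1           # first gamma correction
    if coupling is None:
        coupling = lambda x: x                      # coupling placeholder: identity
    coupled = coupling(linear)                      # pixel-dependent coupling function
    return np.clip(coupled, 0.0, 1.0) ** g2         # second gamma correction

# A toy coupling that leaks 5% of green into red, as a per-pixel crosstalk model:
leak = lambda x: x @ np.array([[1.0, 0.0, 0.0], [0.05, 1.0, 0.0], [0.0, 0.0, 1.0]])
print(display_to_camera([0.5, 0.5, 0.5], coupling=leak))
```

With an identity coupling the two gammas cancel and camera levels reproduce display levels; the calibration problem is then fitting the coupling (and the gammas) so predicted camera levels match captured images.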
14. The optical metrology system of any one of claims 1 to 13, wherein the determined imperfections comprise a spatial imperfection. 15. The optical metrology system of claim 14, wherein the spatial imperfection comprises one or more of an in-plane translation, rotation, scaling, or warping error, or an out-of-plane or depth-of-focus error. 16. The optical metrology system of any one of claims 1 to 13, wherein the determined imperfections comprise a chromatic imperfection. 17. The optical metrology system of claim 16, wherein the chromatic imperfection comprises one or more of a luminance flatness error or a chromaticity uniformity error associated with a color displayable by the display. 18. An optical metrology system for measuring imperfections in a light field generated by a display and for performing image correction on the display, the optical metrology system comprising: a camera configured to capture images of a light field projected by a display, the light field associated with a display layer of the display; and a hardware processor programmed with executable instructions to: generate, based at least in part on the images captured by the camera, a vector field comprising vectors corresponding to deviations between projected positions and expected positions of points of the display layer; calculate, based at least in part on the vector field, at least one of a centration correction, an aggregate rotation correction, an aggregate scaling correction, or a spatial mapping for the display; calculate, based at least in part on the images captured by the camera, luminance values corresponding to a plurality of points on the display layer; and calculate, based at least in part on the determined luminance values, a luminance flattening correction or a chromatic balancing correction for the display. 19. The optical metrology system of claim 18, wherein the display layer of the display comprises a color layer or a depth layer. 20. The optical metrology system of claim 18, wherein the camera comprises a light field camera or a digital camera having a small depth of focus. 21. The optical metrology system of claim 18, wherein to calculate the centration correction, the hardware processor is programmed to determine a translation vector corresponding to a translation error between an identified center point of the projected display layer and an expected center point position. 22. The optical metrology system of claim 18, wherein to calculate the aggregate rotation correction, the hardware processor is programmed to determine a rotation amount by which the projected display layer is rotated about a center point such that the deviations between the projected positions and the expected positions are reduced or minimized. 23. The optical metrology system of claim 18, wherein to calculate the aggregate rotation correction, the hardware processor is programmed to calculate a curl of the vector field.
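The vector-field corrections recited above (a centration translation, an aggregate rotation related to the curl of the field, and an aggregate scaling related to its divergence) can be sketched with a small least-squares decomposition. This is an illustrative sketch under stated assumptions, not the patent's exact procedure: it fits one affine map to the whole field, and all function and variable names are assumptions.

```python
import numpy as np

def distortion_corrections(expected, measured):
    """Decompose a display-calibration vector field into gross corrections.

    `expected` and `measured` are (N, 2) arrays of point positions on a
    display layer; their difference is the vector field of deviations. The
    mean of the field gives a centration (translation) correction; fitting
    the residual field linearly, the antisymmetric part of the fitted map
    corresponds to the curl (aggregate rotation) and its trace to the
    divergence (aggregate scaling).
    """
    expected = np.asarray(expected, float)
    measured = np.asarray(measured, float)
    field = measured - expected                      # deviation vectors
    translation = field.mean(axis=0)                 # centration correction

    # Fit field - translation ~= A @ (p - center) by least squares.
    center = expected.mean(axis=0)
    P = expected - center
    F = field - translation
    X, *_ = np.linalg.lstsq(P, F, rcond=None)        # solves P @ X ~= F
    A = X.T                                          # so F_i ~= A @ P_i
    rotation = 0.5 * (A[1, 0] - A[0, 1])             # curl/2 -> small-angle rotation (radians)
    scale = 1.0 + 0.5 * (A[0, 0] + A[1, 1])          # divergence/2 -> aggregate scale factor
    return translation, rotation, scale
```

Applying the negated translation, rotation, and scale to rendered content would be one way to reduce or minimize the deviations before solving for any remaining nonlinear spatial mapping.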
24. The optical metrology system of claim 18, wherein to calculate the aggregate scaling correction, the hardware processor is programmed to determine a scaling amount by which the projected display layer is scaled about a center point such that the deviations between the projected positions and the expected positions are reduced or minimized. 25. The optical metrology system of claim 18, wherein to calculate the aggregate scaling correction, the hardware processor is programmed to calculate a divergence of the vector field. 26. The optical metrology system of claim 18, wherein to calculate the spatial mapping, the hardware processor is programmed to determine a nonlinear transformation that aligns the projected positions of the display layer with the expected positions. 27. The optical metrology system of claim 18, wherein to calculate the luminance flattening correction, the hardware processor is programmed to: determine a threshold luminance value; and calculate an amount by which each luminance value greater than the threshold luminance value is lowered to the threshold luminance value. 28. The optical metrology system of claim 18, wherein to calculate the chromatic balancing correction, the hardware processor is programmed to: identify a color cluster associated with the display layer, the color cluster comprising at least one additional display layer; for each point of the display layer, compare a luminance value corresponding to the point on the display layer with a luminance value corresponding to the point on the at least one additional display layer; and calculate an amount by which each luminance value is lowered to the lowest luminance value associated with its corresponding point.
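The luminance flattening and chromatic balancing corrections recited above can be sketched as simple per-pixel reductions. This is an illustrative sketch: the choice of threshold policy (flattening down to the dimmest observed pixel) and all names are assumptions, since the claims leave the threshold open.

```python
import numpy as np

def luminance_flattening(layer, threshold=None):
    """Return the amount by which each luminance value above a threshold is
    lowered to the threshold. Defaulting the threshold to the minimum
    observed luminance flattens the layer down to its dimmest pixel.
    """
    layer = np.asarray(layer, float)
    if threshold is None:
        threshold = layer.min()                       # flatten down to the dimmest pixel
    return np.clip(layer - threshold, 0.0, None)      # per-pixel reduction amounts

def color_balancing(*layers):
    """For each point, lower each color layer's luminance to the lowest
    luminance across the color cluster at that point; returns the per-layer
    reduction amounts."""
    stack = np.stack([np.asarray(l, float) for l in layers])
    lowest = stack.min(axis=0)                        # per-point minimum across layers
    return stack - lowest                             # per-layer reduction amounts
```

Subtracting the returned reduction maps from the rendered intensities would leave each layer flat, and each color cluster balanced, at the expense of peak luminance, which is the usual trade-off for this kind of correction.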
TW105135772A 2015-11-04 2016-11-03 Display light field metering system TWI648559B (en)

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US201562250925P 2015-11-04 2015-11-04
US201562250934P 2015-11-04 2015-11-04
US62/250,925 2015-11-04
US62/250,934 2015-11-04
US201662278824P 2016-01-14 2016-01-14
US201662278794P 2016-01-14 2016-01-14
US201662278779P 2016-01-14 2016-01-14
US62/278,779 2016-01-14
US62/278,824 2016-01-14
US62/278,794 2016-01-14

Publications (2)

Publication Number Publication Date
TW201730627A true TW201730627A (en) 2017-09-01
TWI648559B TWI648559B (en) 2019-01-21

Family

ID=58634472

Family Applications (2)

Application Number Title Priority Date Filing Date
TW107136549A TWI695999B (en) 2015-11-04 2016-11-03 Calibration system for a display
TW105135772A TWI648559B (en) 2015-11-04 2016-11-03 Display light field metering system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
TW107136549A TWI695999B (en) 2015-11-04 2016-11-03 Calibration system for a display

Country Status (11)

Country Link
US (7) US10260864B2 (en)
EP (4) EP4235639A3 (en)
JP (5) JP7210280B2 (en)
KR (4) KR20240017132A (en)
CN (4) CN108474737B (en)
AU (4) AU2016349891B9 (en)
CA (2) CA3004271A1 (en)
IL (4) IL292793B1 (en)
NZ (2) NZ742532A (en)
TW (2) TWI695999B (en)
WO (2) WO2017079333A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI817335B (en) * 2022-01-25 2023-10-01 宏碁股份有限公司 Stereoscopic image playback apparatus and method of generating stereoscopic images thereof

Families Citing this family (191)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10261321B2 (en) 2005-11-08 2019-04-16 Lumus Ltd. Polarizing optical system
WO2016020630A2 (en) 2014-08-08 2016-02-11 Milan Momcilo Popovich Waveguide laser illuminator incorporating a despeckler
US8693731B2 (en) 2012-01-17 2014-04-08 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging
US11493998B2 (en) 2012-01-17 2022-11-08 Ultrahaptics IP Two Limited Systems and methods for machine control
JP6102602B2 (en) 2013-07-23 2017-03-29 ソニー株式会社 Image processing apparatus, image processing method, image processing program, and imaging apparatus
IL235642B (en) 2014-11-11 2021-08-31 Lumus Ltd Compact head-mounted display system protected by a hyperfine structure
CN107873086B (en) 2015-01-12 2020-03-20 迪吉伦斯公司 Environmentally isolated waveguide display
IL237337B (en) 2015-02-19 2020-03-31 Amitai Yaakov Compact head-mounted display system having uniform image
EP3062142B1 (en) 2015-02-26 2018-10-03 Nokia Technologies OY Apparatus for a near-eye display
NZ773847A (en) 2015-03-16 2022-07-01 Magic Leap Inc Methods and systems for diagnosing and treating health ailments
CN107924085B (en) 2015-06-15 2022-09-02 奇跃公司 Virtual and augmented reality systems and methods
US9910276B2 (en) 2015-06-30 2018-03-06 Microsoft Technology Licensing, Llc Diffractive optical elements with graded edges
US10670862B2 (en) 2015-07-02 2020-06-02 Microsoft Technology Licensing, Llc Diffractive optical elements with asymmetric profiles
US9864208B2 (en) * 2015-07-30 2018-01-09 Microsoft Technology Licensing, Llc Diffractive optical elements with varying direction for depth modulation
US10038840B2 (en) 2015-07-30 2018-07-31 Microsoft Technology Licensing, Llc Diffractive optical element using crossed grating for pupil expansion
US10073278B2 (en) 2015-08-27 2018-09-11 Microsoft Technology Licensing, Llc Diffractive optical element using polarization rotation grating for in-coupling
EP3359999A1 (en) 2015-10-05 2018-08-15 Popovich, Milan Momcilo Waveguide display
US10429645B2 (en) 2015-10-07 2019-10-01 Microsoft Technology Licensing, Llc Diffractive optical element with integrated in-coupling, exit pupil expansion, and out-coupling
US10241332B2 (en) 2015-10-08 2019-03-26 Microsoft Technology Licensing, Llc Reducing stray light transmission in near eye display using resonant grating filter
US9946072B2 (en) * 2015-10-29 2018-04-17 Microsoft Technology Licensing, Llc Diffractive optical element with uncoupled grating structures
WO2017079333A1 (en) 2015-11-04 2017-05-11 Magic Leap, Inc. Light field display metrology
US10234686B2 (en) 2015-11-16 2019-03-19 Microsoft Technology Licensing, Llc Rainbow removal in near-eye display using polarization-sensitive grating
NZ747005A (en) 2016-04-08 2020-04-24 Magic Leap Inc Augmented reality systems and methods with variable focus lens elements
WO2018067357A2 (en) 2016-10-05 2018-04-12 Magic Leap, Inc. Periocular test for mixed reality calibration
CA2992213C (en) 2016-10-09 2023-08-29 Yochay Danziger Aperture multiplier using a rectangular waveguide
KR102541662B1 (en) 2016-11-08 2023-06-13 루머스 리미티드 Light-guide device with optical cutoff edge and corresponding production methods
IL303676B1 (en) 2016-11-18 2024-02-01 Magic Leap Inc Spatially variable liquid crystal diffraction gratings
WO2018094093A1 (en) 2016-11-18 2018-05-24 Magic Leap, Inc. Waveguide light multiplexer using crossed gratings
US11067860B2 (en) 2016-11-18 2021-07-20 Magic Leap, Inc. Liquid crystal diffractive devices with nano-scale pattern and methods of manufacturing the same
CA3045663A1 (en) 2016-12-08 2018-06-14 Magic Leap, Inc. Diffractive devices based on cholesteric liquid crystal
KR102550742B1 (en) 2016-12-14 2023-06-30 매직 립, 인코포레이티드 Patterning of liquid crystals using soft-imprint replication of surface alignment patterns
US10650552B2 (en) 2016-12-29 2020-05-12 Magic Leap, Inc. Systems and methods for augmented reality
EP3343267B1 (en) 2016-12-30 2024-01-24 Magic Leap, Inc. Polychromatic light out-coupling apparatus, near-eye displays comprising the same, and method of out-coupling polychromatic light
WO2018122859A1 (en) 2016-12-31 2018-07-05 Lumus Ltd. Eye tracker based on retinal imaging via light-guide optical element
US10108014B2 (en) * 2017-01-10 2018-10-23 Microsoft Technology Licensing, Llc Waveguide display with multiple focal depths
AU2018210527B2 (en) 2017-01-23 2022-12-01 Magic Leap, Inc. Eyepiece for virtual, augmented, or mixed reality systems
EP3574360A4 (en) 2017-01-28 2020-11-11 Lumus Ltd. Augmented reality imaging system
IL292456B (en) 2017-02-22 2022-08-01 Lumus Ltd Light guide optical assembly
JP7158395B2 (en) 2017-02-23 2022-10-21 マジック リープ, インコーポレイテッド Variable focus imaging device based on polarization conversion
TWI663427B (en) * 2017-03-15 2019-06-21 宏碁股份有限公司 Head mounted display and chroma aberration compensation method using sub-pixel shifting
IL303471A (en) 2017-03-21 2023-08-01 Magic Leap Inc Eye-imaging apparatus using diffractive optical elements
CN110476118B (en) * 2017-04-01 2021-10-15 深圳市大疆创新科技有限公司 Low profile multiband hyperspectral imaging for machine vision
CN107068114B (en) * 2017-04-24 2019-04-30 北京小米移动软件有限公司 Screen color method of adjustment, device, equipment and storage medium
CN108931357B (en) * 2017-05-22 2020-10-23 宁波舜宇车载光学技术有限公司 Test target and corresponding lens MTF detection system and method
US10634921B2 (en) * 2017-06-01 2020-04-28 NewSight Reality, Inc. See-through near eye optical display
US10921613B2 (en) * 2017-06-01 2021-02-16 NewSight Reality, Inc. Near eye display and related computer-implemented software and firmware
US11119353B2 (en) 2017-06-01 2021-09-14 E-Vision Smart Optics, Inc. Switchable micro-lens array for augmented reality and mixed reality
WO2019014861A1 (en) * 2017-07-18 2019-01-24 Hangzhou Taruo Information Technology Co., Ltd. Intelligent object tracking
US11243434B2 (en) 2017-07-19 2022-02-08 Lumus Ltd. LCOS illumination via LOE
US10578870B2 (en) 2017-07-26 2020-03-03 Magic Leap, Inc. Exit pupil expander
CN109387939B (en) * 2017-08-09 2021-02-12 中强光电股份有限公司 Near-to-eye display device and correction method of display image thereof
TWI646466B (en) * 2017-08-09 2019-01-01 宏碁股份有限公司 Vision range mapping method and related eyeball tracking device and system
US10551614B2 (en) * 2017-08-14 2020-02-04 Facebook Technologies, Llc Camera assembly with programmable diffractive optical element for depth sensing
CN111629653A (en) 2017-08-23 2020-09-04 神经股份有限公司 Brain-computer interface with high speed eye tracking features
US11160449B2 (en) * 2017-08-29 2021-11-02 Verily Life Sciences Llc Focus stacking for retinal imaging
US10586342B2 (en) 2017-08-31 2020-03-10 Facebook Technologies, Llc Shifting diffractive optical element for adjustable depth sensing resolution
CN107680047A (en) * 2017-09-05 2018-02-09 北京小鸟看看科技有限公司 A kind of virtual reality scenario rendering intent, image processor and wear display device
JP7280250B2 (en) 2017-09-21 2023-05-23 マジック リープ, インコーポレイテッド Augmented reality display with waveguide configured to capture images of the eye and/or environment
US11368657B2 (en) * 2017-09-28 2022-06-21 Disney Enterprises, Inc. Light field based projector calibration method and system
WO2019077614A1 (en) 2017-10-22 2019-04-25 Lumus Ltd. Head-mounted augmented reality device employing an optical bench
EP4329296A2 (en) 2017-11-15 2024-02-28 Magic Leap, Inc. System and methods for extrinsic calibration of cameras and diffractive optical elements
US11181977B2 (en) 2017-11-17 2021-11-23 Dolby Laboratories Licensing Corporation Slippage compensation in eye tracking
US11282133B2 (en) 2017-11-21 2022-03-22 International Business Machines Corporation Augmented reality product comparison
US10586360B2 (en) * 2017-11-21 2020-03-10 International Business Machines Corporation Changing view order of augmented reality objects based on user gaze
CN111417883B (en) 2017-12-03 2022-06-17 鲁姆斯有限公司 Optical equipment alignment method
US11280937B2 (en) 2017-12-10 2022-03-22 Magic Leap, Inc. Anti-reflective coatings on optical waveguides
US10643576B2 (en) 2017-12-15 2020-05-05 Samsung Display Co., Ltd. System and method for white spot Mura detection with improved preprocessing
US10681344B2 (en) * 2017-12-15 2020-06-09 Samsung Display Co., Ltd. System and method for mura detection on a display
CN111683584A (en) 2017-12-15 2020-09-18 奇跃公司 Eyepiece for augmented reality display system
KR20200100720A (en) 2017-12-20 2020-08-26 매직 립, 인코포레이티드 Insert for augmented reality viewing device
FI129586B (en) * 2017-12-22 2022-05-13 Dispelix Oy Multipupil waveguide display element and display device
WO2019133505A1 (en) * 2017-12-29 2019-07-04 Pcms Holdings, Inc. Method and system for maintaining color calibration using common objects
NL2020216B1 (en) * 2017-12-30 2019-07-08 Zhangjiagang Kangde Xin Optronics Mat Co Ltd Method for reducing crosstalk on an autostereoscopic display
KR20200102408A (en) * 2018-01-02 2020-08-31 루머스 리미티드 Augmented Reality Display with Active Alignment and Corresponding Method
US20190038964A1 (en) * 2018-01-12 2019-02-07 Karthik Veeramani Personalized calibration and adaption of vr experience
AU2019209950A1 (en) 2018-01-17 2020-07-09 Magic Leap, Inc. Display systems and methods for determining registration between a display and a user's eyes
JP7390297B2 (en) 2018-01-17 2023-12-01 マジック リープ, インコーポレイテッド Eye rotation center determination, depth plane selection, and rendering camera positioning within the display system
DE102018105917A1 (en) * 2018-03-14 2019-09-19 tooz technologies GmbH A method for user-specific calibration of a superimposed on the head of a user display device for an augmented presentation
US10860399B2 (en) 2018-03-15 2020-12-08 Samsung Display Co., Ltd. Permutation based stress profile compression
US10755676B2 (en) 2018-03-15 2020-08-25 Magic Leap, Inc. Image correction due to deformation of components of a viewing device
JP7377413B2 (en) * 2018-04-12 2023-11-10 Toppanホールディングス株式会社 Light field image generation system, image display system, shape information acquisition server, light field image generation method, and image display method
IL259518B2 (en) 2018-05-22 2023-04-01 Lumus Ltd Optical system and method for improvement of light field uniformity
BR112020023513A2 (en) 2018-05-23 2021-02-09 Lumus Ltd. optical system
JP7319303B2 (en) 2018-05-31 2023-08-01 マジック リープ, インコーポレイテッド Radar head pose localization
US10495882B1 (en) * 2018-06-04 2019-12-03 Facebook Technologies, Llc Positioning cameras in a head mounted display to capture images of portions of a face of a user
CN112400157A (en) 2018-06-05 2021-02-23 奇跃公司 Homography transformation matrix based temperature calibration of viewing systems
IL279500B (en) 2018-06-21 2022-09-01 Lumus Ltd Measurement technique for refractive index inhomogeneity between plates of a lightguide optical element (loe)
US11415812B2 (en) 2018-06-26 2022-08-16 Lumus Ltd. Compact collimating optical device and system
US11579441B2 (en) 2018-07-02 2023-02-14 Magic Leap, Inc. Pixel intensity modulation using modifying gain values
US11856479B2 (en) 2018-07-03 2023-12-26 Magic Leap, Inc. Systems and methods for virtual and augmented reality along a route with markers
US11510027B2 (en) 2018-07-03 2022-11-22 Magic Leap, Inc. Systems and methods for virtual and augmented reality
US11106033B2 (en) * 2018-07-05 2021-08-31 Magic Leap, Inc. Waveguide-based illumination for head mounted display system
JP7408621B2 (en) 2018-07-13 2024-01-05 Magic Leap, Inc. System and method for binocular deformation compensation of displays
US10884492B2 (en) * 2018-07-20 2021-01-05 Avegant Corp. Relative position based eye-tracking system
TWI675583B (en) * 2018-07-23 2019-10-21 Wistron Corporation Augmented reality system and color compensation method thereof
US11624929B2 (en) 2018-07-24 2023-04-11 Magic Leap, Inc. Viewing device with dust seal integration
US11598651B2 (en) 2018-07-24 2023-03-07 Magic Leap, Inc. Temperature dependent calibration of movement detection devices
CN112689869A (en) 2018-07-24 2021-04-20 Magic Leap, Inc. Display system and method for determining registration between a display and an eye of a user
WO2020023672A1 (en) * 2018-07-24 2020-01-30 Magic Leap, Inc. Display systems and methods for determining vertical alignment between left and right displays and a user's eyes
WO2020028834A1 (en) 2018-08-02 2020-02-06 Magic Leap, Inc. A viewing system with interpupillary distance compensation based on head motion
WO2020028191A1 (en) 2018-08-03 2020-02-06 Magic Leap, Inc. Unfused pose-based drift correction of a fused pose of a totem in a user interaction system
CN108985291B (en) * 2018-08-07 2021-02-19 Northeastern University Binocular tracking system based on single camera
US10861415B2 (en) * 2018-08-14 2020-12-08 Facebook Technologies, Llc Display device with throughput calibration
US10607353B2 (en) * 2018-08-30 2020-03-31 Facebook Technologies, Llc Structured light depth sensing
US11205378B1 (en) * 2018-09-07 2021-12-21 Apple Inc. Dynamic uniformity compensation for electronic display
US11141645B2 (en) 2018-09-11 2021-10-12 Real Shot Inc. Athletic ball game using smart glasses
US11103763B2 (en) 2018-09-11 2021-08-31 Real Shot Inc. Basketball shooting game using smart glasses
WO2020059157A1 (en) * 2018-09-20 2020-03-26 Sony Interactive Entertainment Inc. Display system, program, display method, and head-mounted device
US10664050B2 (en) * 2018-09-21 2020-05-26 Neurable Inc. Human-computer interface using high-speed and accurate tracking of user interactions
US11733523B2 (en) 2018-09-26 2023-08-22 Magic Leap, Inc. Diffractive optical elements with optical power
US10795630B2 (en) 2018-10-10 2020-10-06 International Business Machines Corporation Configuring computing device to utilize a multiple display arrangement by tracking eye movement
US10803791B2 (en) 2018-10-31 2020-10-13 Samsung Display Co., Ltd. Burrows-wheeler based stress profile compression
TWM642752U (en) 2018-11-08 2023-06-21 Lumus Ltd. Light-guide display with reflector
WO2020102412A1 (en) 2018-11-16 2020-05-22 Magic Leap, Inc. Image size triggered clarification to maintain image sharpness
WO2020106824A1 (en) 2018-11-20 2020-05-28 Magic Leap, Inc. Eyepieces for augmented reality display system
KR20210106990A (en) 2018-11-30 2021-08-31 PCMS Holdings, Inc. Method and apparatus for estimating scene illuminants based on skin reflectance database
KR102221991B1 (en) * 2018-12-06 2021-03-04 Korea Photonics Technology Institute Apparatus and Method for Discriminating whether Display Serves Function of Accommodation to the Observer or not by Generating Patterns
US10990168B2 (en) * 2018-12-10 2021-04-27 Samsung Electronics Co., Ltd. Compensating for a movement of a sensor attached to a body of a user
CN113196139B (en) 2018-12-20 2023-08-11 Snap Inc. Flexible eyewear device with dual cameras for generating stereoscopic images
US20200209669A1 (en) * 2018-12-28 2020-07-02 Lightspace Technologies, SIA Electro-optical unit for volumetric display device
CN111399633B (en) * 2019-01-03 2023-03-31 Ganzin Technology, Inc. Correction method for eyeball tracking application
JP7190580B2 (en) * 2019-01-09 2022-12-15 Vuzix Corporation Color correction of virtual images in near-eye displays
US11200655B2 (en) 2019-01-11 2021-12-14 Universal City Studios Llc Wearable visualization system and method
TWI767179B (en) * 2019-01-24 2022-06-11 HTC Corporation Method, virtual reality system and recording medium for detecting real-world light resource in mixed reality
US11686935B2 (en) * 2019-01-29 2023-06-27 Meta Platforms Technologies, Llc Interferometric structured illumination for depth determination
CN113518961A (en) 2019-02-06 2021-10-19 Magic Leap, Inc. Targeted intent based clock speed determination and adjustment to limit total heat generated by multiple processors
EP3924759A4 (en) 2019-02-15 2022-12-28 Digilens Inc. Methods and apparatuses for providing a holographic waveguide display using integrated gratings
US10866422B2 (en) 2019-02-21 2020-12-15 Microsoft Technology Licensing, Llc Micro LED display system
US10650785B1 (en) * 2019-02-21 2020-05-12 Microsoft Technology Licensing, Llc Color management of display device
US11762623B2 (en) 2019-03-12 2023-09-19 Magic Leap, Inc. Registration of local content between first and second augmented reality viewers
TWI800657B (en) 2019-03-12 2023-05-01 Lumus Ltd. Image projector
US11024002B2 (en) * 2019-03-14 2021-06-01 Intel Corporation Generating gaze corrected images using bidirectionally trained network
KR20200109766A (en) * 2019-03-14 2020-09-23 Samsung Electronics Co., Ltd. Correction pattern gaining apparatus for correction of noise by optical element included in display and method of gaining noise using the same
US11445232B2 (en) 2019-05-01 2022-09-13 Magic Leap, Inc. Content provisioning system and method
EP3736796A1 (en) * 2019-05-07 2020-11-11 Wooptix S.L. Method and optical system for characterizing displays
US11308873B2 (en) 2019-05-23 2022-04-19 Samsung Display Co., Ltd. Redundancy assisted noise control for accumulated iterative compression error
JP2022535460A (en) 2019-06-07 2022-08-08 Digilens Inc. Waveguides incorporating transmission and reflection gratings, and associated fabrication methods
US11650423B2 (en) 2019-06-20 2023-05-16 Magic Leap, Inc. Eyepieces for augmented reality display system
JP7394152B2 (en) * 2019-06-24 2023-12-07 Magic Leap, Inc. Customized polymer/glass diffractive waveguide stack for augmented reality/mixed reality applications
AU2020300121A1 (en) 2019-07-04 2022-02-03 Lumus Ltd. Image waveguide with symmetric beam multiplication
CN110310313B (en) * 2019-07-09 2021-10-01 The 13th Research Institute of China Electronics Technology Group Corporation Image registration method, image registration device and terminal
CN114424147A (en) * 2019-07-16 2022-04-29 Magic Leap, Inc. Determining eye rotation center using one or more eye tracking cameras
JP2022542363A (en) 2019-07-26 2022-10-03 Magic Leap, Inc. Systems and methods for augmented reality
US11156829B2 (en) * 2019-07-29 2021-10-26 Facebook Technologies, Llc Pupil expander calibration
EP3786767B1 (en) * 2019-07-29 2023-11-08 HTC Corporation Eye tracking method, head-mounted display, and computer readable storage medium
EP4010756A4 (en) 2019-08-09 2023-09-20 Light Field Lab, Inc. Light field display system based digital signage system
WO2021041949A1 (en) 2019-08-29 2021-03-04 Digilens Inc. Evacuating bragg gratings and methods of manufacturing
US11245931B2 (en) 2019-09-11 2022-02-08 Samsung Display Co., Ltd. System and method for RGBG conversion
RU2724442C1 (en) * 2019-09-12 2020-06-23 Samsung Electronics Co., Ltd. Device and method for determining eye focusing distance for a head-mounted display device, and head-mounted display device
GB2578523B (en) * 2019-09-25 2021-08-11 Dualitas Ltd Holographic projection
KR102349087B1 (en) * 2019-10-10 2022-01-12 Korea Institute of Science and Technology Method for controlling robot based on brain-computer interface and apparatus for controlling meal assistance robot thereof
US11256214B2 (en) 2019-10-18 2022-02-22 Looking Glass Factory, Inc. System and method for lightfield capture
CN110766733B (en) * 2019-10-28 2022-08-12 Guangdong 3vjia Information Technology Co., Ltd. Single-space point cloud registration method and device
US11288503B2 (en) * 2019-11-04 2022-03-29 Facebook Technologies, Llc Systems and methods for image adjustment based on pupil size
KR20210053665A (en) * 2019-11-04 2021-05-12 LG Electronics Inc. Method and apparatus for enhancing image illumination intensity
WO2021097323A1 (en) 2019-11-15 2021-05-20 Magic Leap, Inc. A viewing system for use in a surgical environment
CN111047562B (en) * 2019-11-26 2023-09-19 Lenovo (Beijing) Co., Ltd. Processing method, processing device, electronic equipment and storage medium
JP7396738B2 (en) 2019-12-05 2023-12-12 Lumus Ltd. Light-guiding optics with complementary coated partial reflectors and light-guiding optics with reduced light scattering
US10965931B1 (en) * 2019-12-06 2021-03-30 Snap Inc. Sensor misalignment compensation
JP2023504368A (en) 2019-12-06 2023-02-03 Magic Leap, Inc. Encoding stereo splash screens in still images
KR20220111285A (en) 2019-12-08 2022-08-09 Lumus Ltd. Optical system with compact image projector
CN113010125B (en) 2019-12-20 2024-03-19 Tobii AB Method, computer program product, and binocular headset controller
KR20210096449A (en) 2020-01-28 2021-08-05 Samsung Electronics Co., Ltd. Method of playing image on HUD system and HUD system
EP3875999A1 (en) * 2020-03-06 2021-09-08 Micledi Microdisplays BV Full color display systems and calibration methods thereof
CN111445453B (en) * 2020-03-25 2023-04-25 Senlan Information Technology (Shanghai) Co., Ltd. Method, system, medium and device for judging deviation of key image acquired by camera
CN111707187B (en) * 2020-05-12 2022-05-24 Shenzhen University Measuring method and system for large part
US11449004B2 (en) 2020-05-21 2022-09-20 Looking Glass Factory, Inc. System and method for holographic image display
WO2021262759A1 (en) * 2020-06-22 2021-12-30 Digilens Inc. Systems and methods for real-time color correction of waveguide based displays
WO2021262860A1 (en) 2020-06-23 2021-12-30 Looking Glass Factory, Inc. System and method for holographic communication
CN115867962A (en) * 2020-06-26 2023-03-28 Magic Leap, Inc. Color uniformity correction for display devices
US11151755B1 (en) * 2020-07-29 2021-10-19 Adobe Inc. Image processing for increasing visibility of obscured patterns
WO2022025891A1 (en) * 2020-07-30 2022-02-03 Hewlett-Packard Development Company, L.P. Amounts of wavelengths of light during periods of time
EP4002346A1 (en) 2020-11-12 2022-05-25 Micledi Microdisplays BV Video pipeline system and method for improved color perception
US11442541B1 (en) * 2020-11-13 2022-09-13 Meta Platforms Technologies, Llc Color-based calibration for eye-tracking
IL309921A (en) 2020-11-18 2024-03-01 Lumus Ltd Optical-based validation of orientations of internal facets
WO2022119940A1 (en) 2020-12-01 2022-06-09 Looking Glass Factory, Inc. System and method for processing three dimensional images
US11733773B1 (en) 2020-12-29 2023-08-22 Meta Platforms Technologies, Llc Dynamic uniformity correction for boundary regions
TW202244552A (en) 2021-03-01 2022-11-16 Lumus Ltd. Optical system with compact coupling from a projector into a waveguide
US11681363B2 (en) * 2021-03-29 2023-06-20 Meta Platforms Technologies, Llc Waveguide correction map compression
US11735138B2 (en) * 2021-04-22 2023-08-22 GM Global Technology Operations LLC Dual image plane HUD with automated illuminance setting for AR graphics displayed in far virtual image plane
CN116783539A (en) 2021-05-19 2023-09-19 Lumus Ltd. Active optical engine
WO2023026266A1 (en) 2021-08-23 2023-03-02 Lumus Ltd. Methods of fabrication of compound light-guide optical elements having embedded coupling-in reflectors
US11900845B2 (en) 2021-10-28 2024-02-13 Samsung Electronics Co., Ltd. System and method for optical calibration of a head-mounted display
US11927757B1 (en) 2021-10-29 2024-03-12 Apple Inc. Electronic device display having distortion compensation
US11722655B2 (en) * 2021-11-30 2023-08-08 SoliDDD Corp. Low latency networking of plenoptic data
US11710212B1 (en) * 2022-01-21 2023-07-25 Meta Platforms Technologies, Llc Display non-uniformity correction
US11754846B2 (en) 2022-01-21 2023-09-12 Meta Platforms Technologies, Llc Display non-uniformity correction
CN116524045A (en) * 2022-03-29 2023-08-01 Tencent Technology (Shenzhen) Co., Ltd. Color calibration method, apparatus, computer device, and computer-readable storage medium
WO2023200176A1 (en) * 2022-04-12 2023-10-19 Samsung Electronics Co., Ltd. Electronic device for displaying 3D image, and method for operating electronic device
WO2024016163A1 (en) * 2022-07-19 2024-01-25 Jade Bird Display (shanghai) Limited Methods and systems for virtual image compensation and evaluation
CN115931303B (en) * 2022-10-26 2023-11-17 Jiangxi Phoenix Optical Technology Co., Ltd. Test method of polychromatic diffraction optical waveguide

Family Cites Families (103)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05161166A (en) * 1991-12-04 1993-06-25 Sony Corp Stereoscopic video signal generator
US6222525B1 (en) 1992-03-05 2001-04-24 Brad A. Armstrong Image controllers with sheet connected sensors
JP3309431B2 (en) * 1992-07-15 2002-07-29 Fuji Xerox Co., Ltd. Information processing device
US6011581A (en) * 1992-11-16 2000-01-04 Reveo, Inc. Intelligent method and system for producing and displaying stereoscopically-multiplexed images of three-dimensional objects for use in realistic stereoscopic viewing thereof in interactive virtual reality display environments
US5594563A (en) 1994-05-31 1997-01-14 Honeywell Inc. High resolution subtractive color projection system
US5670988A (en) 1995-09-05 1997-09-23 Interlink Electronics, Inc. Trigger operated electronic device
JPH1020245A (en) * 1996-07-01 1998-01-23 Canon Inc Depth sampling type stereoscopic picture forming and displaying device
JPH11203986A (en) 1998-01-16 1999-07-30 Denso Corp Multifunctional switch device
US6456339B1 (en) * 1998-07-31 2002-09-24 Massachusetts Institute Of Technology Super-resolution display
US20020063807A1 (en) 1999-04-19 2002-05-30 Neal Margulis Method for Performing Image Transforms in a Digital Display System
JP4348839B2 (en) * 2000-06-28 2009-10-21 Sony Corporation Inspection apparatus and inspection method
US6816625B2 (en) * 2000-08-16 2004-11-09 Lewis Jr Clarence A Distortion free image capture system and method
US7308157B2 (en) * 2003-02-03 2007-12-11 Photon Dynamics, Inc. Method and apparatus for optical inspection of a display
US7530315B2 (en) * 2003-05-08 2009-05-12 Lone Star Ip Holdings, Lp Weapon and weapon system employing the same
JP2005101828A (en) * 2003-09-24 2005-04-14 Canon Inc Image processing system, method for processing image, its recording medium, and program
USD514570S1 (en) 2004-06-24 2006-02-07 Microsoft Corporation Region of a fingerprint scanning device with an illuminated ring
JP4965800B2 (en) 2004-10-01 2012-07-04 Canon Inc. Image display system
JP4560368B2 (en) 2004-10-08 2010-10-13 Canon Inc. Eye detection device and image display device
JP2006153914A (en) * 2004-11-25 2006-06-15 Canon Inc Liquid crystal projector
US11428937B2 (en) 2005-10-07 2022-08-30 Percept Technologies Enhanced optical and perceptual digital eyewear
US20070081123A1 (en) 2005-10-07 2007-04-12 Lewis Scott W Digital eyewear
US8696113B2 (en) 2005-10-07 2014-04-15 Percept Technologies Inc. Enhanced optical and perceptual digital eyewear
US20080144174A1 (en) 2006-03-15 2008-06-19 Zebra Imaging, Inc. Dynamic autostereoscopic displays
US9843790B2 (en) * 2006-03-15 2017-12-12 Fovi 3D, Inc. Dynamic autostereoscopic displays
US8406562B2 (en) * 2006-08-11 2013-03-26 Geo Semiconductor Inc. System and method for automated calibration and correction of display geometry and color
EP1962517A1 (en) * 2007-02-21 2008-08-27 STMicroelectronics (Research & Development) Limited Error reduction in image sensors
JP2008258802A (en) 2007-04-03 2008-10-23 Canon Inc Image display system
WO2008129421A1 (en) * 2007-04-18 2008-10-30 Micronic Laser Systems Ab Method and apparatus for mura detection and metrology
JP5089405B2 (en) * 2008-01-17 2012-12-05 Canon Inc. Image processing apparatus, image processing method, and imaging apparatus
JP2010199659A (en) 2009-02-23 2010-09-09 Panasonic Corp Image processing apparatus, and image processing method
WO2010131400A1 (en) * 2009-05-14 2010-11-18 Nanao Corporation Stereoscopic image display apparatus
JP2010271565A (en) * 2009-05-22 2010-12-02 Seiko Epson Corp Head-mounted display device
HU0900478D0 (en) 2009-07-31 2009-09-28 Holografika Hologrameloeallito Method and apparatus for displaying 3d images
US20120212499A1 (en) * 2010-02-28 2012-08-23 Osterhout Group, Inc. System and method for display content control during glasses movement
US20150309316A1 (en) * 2011-04-06 2015-10-29 Microsoft Technology Licensing, Llc Ar glasses with predictive control of external device based on event input
US8564647B2 (en) 2010-04-21 2013-10-22 Canon Kabushiki Kaisha Color management of autostereoscopic 3D displays
US8922636B1 (en) * 2010-08-20 2014-12-30 The United States Of America As Represented By The Secretary Of The Navy Synthetic aperture imaging for fluid flows
KR101494066B1 (en) * 2010-10-05 2015-02-16 Empire Technology Development LLC Generation of depth data based on spatial light pattern
US20120113223A1 (en) 2010-11-05 2012-05-10 Microsoft Corporation User Interaction in Augmented Reality
US9304319B2 (en) 2010-11-18 2016-04-05 Microsoft Technology Licensing, Llc Automatic focus improvement for augmented reality displays
CN103688208B (en) 2010-12-24 2017-06-06 Magic Leap, Inc. Ergonomics head-mounted display apparatus and optical system
US10156722B2 (en) 2010-12-24 2018-12-18 Magic Leap, Inc. Methods and systems for displaying stereoscopy with a freeform optical system with addressable focus for virtual and augmented reality
US8643684B2 (en) * 2011-01-18 2014-02-04 Disney Enterprises, Inc. Multi-layer plenoptic displays that combine multiple emissive and light modulating planes
RU2017118159A (en) 2011-05-06 2018-10-30 Magic Leap, Inc. Massive simultaneous remote digital presence world
JP5414946B2 (en) 2011-06-16 2014-02-12 Panasonic Corporation Head-mounted display and method for adjusting misalignment thereof
US8546454B2 (en) * 2011-07-26 2013-10-01 Unitel Technologies, Inc. Process and method for the production of dimethylether (DME)
JP2013037021A (en) 2011-08-03 2013-02-21 Canon Inc Display and head-mounted display
JP2013045001A (en) * 2011-08-25 2013-03-04 Fujitsu Ltd Color display method and color display device
WO2013033195A2 (en) * 2011-08-30 2013-03-07 Microsoft Corporation Head mounted display with iris scan profiling
US10795448B2 (en) 2011-09-29 2020-10-06 Magic Leap, Inc. Tactile glove for human-computer interaction
JP6119091B2 (en) 2011-09-30 2017-04-26 Seiko Epson Corporation Virtual image display device
US9157286B2 (en) * 2011-10-11 2015-10-13 Warrier Rig Ltd Portable pipe handling system
KR102005106B1 (en) 2011-10-28 2019-07-29 Magic Leap, Inc. System and method for augmented and virtual reality
KR102116697B1 (en) 2011-11-23 2020-05-29 Magic Leap, Inc. Three dimensional virtual and augmented reality display system
US8913789B1 (en) 2012-01-06 2014-12-16 Google Inc. Input methods and systems for eye positioning using plural glints
KR102095330B1 (en) 2012-04-05 2020-03-31 Magic Leap, Inc. Wide-field of view (fov) imaging devices with active foveation capability
US20130300635A1 (en) * 2012-05-09 2013-11-14 Nokia Corporation Method and apparatus for providing focus correction of displayed information
US8989535B2 (en) * 2012-06-04 2015-03-24 Microsoft Technology Licensing, Llc Multiple waveguide imaging structure
US9671566B2 (en) 2012-06-11 2017-06-06 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
CA2876335C (en) 2012-06-11 2020-04-28 Magic Leap, Inc. Multiple depth plane three-dimensional display using a wave guide reflector array projector
US9077973B2 (en) * 2012-06-29 2015-07-07 Dri Systems Llc Wide field-of-view stereo vision platform with dynamic control of immersive or heads-up display operation
US9835864B2 (en) 2012-07-24 2017-12-05 Sony Corporation Image display apparatus and method for displaying image
US8754829B2 (en) * 2012-08-04 2014-06-17 Paul Lapstun Scanning light field camera and display
CN104813218A (en) 2012-09-11 2015-07-29 Magic Leap, Inc. Ergonomic head mounted display device and optical system
IL283193B (en) 2013-01-15 2022-08-01 Magic Leap Inc System for scanning electromagnetic imaging radiation
JP2014142383A (en) 2013-01-22 2014-08-07 Canon Inc Image forming apparatus
CN105247447B (en) * 2013-02-14 2017-11-10 Facebook, Inc. Eye tracking and calibration system and method
US20140240842A1 (en) * 2013-02-22 2014-08-28 Ian Nguyen Alignment-insensitive image input coupling
CA3157218A1 (en) 2013-03-11 2014-10-09 Magic Leap, Inc. System and method for augmented and virtual reality
US9424467B2 (en) * 2013-03-14 2016-08-23 Disney Enterprises, Inc. Gaze tracking and recognition with image location
JP6326482B2 (en) 2013-03-15 2018-05-16 Magic Leap, Inc. Display system and method
WO2014144828A1 (en) * 2013-03-15 2014-09-18 Scalable Display Technologies, Inc. System and method for calibrating a display system using a short throw camera
GB201305726D0 (en) * 2013-03-28 2013-05-15 Eye Tracking Analysts Ltd A method for calibration free eye tracking
TWI508554B (en) 2013-05-21 2015-11-11 Univ Nat Taiwan An image focus processing method based on light-field camera and the system thereof are disclosed
JP2013240057A (en) * 2013-05-30 2013-11-28 Denso Corp Adjustment method of head-up display device
US9874749B2 (en) 2013-11-27 2018-01-23 Magic Leap, Inc. Virtual and augmented reality systems and methods
US10262462B2 (en) 2014-04-18 2019-04-16 Magic Leap, Inc. Systems and methods for augmented and virtual reality
US10295338B2 (en) 2013-07-12 2019-05-21 Magic Leap, Inc. Method and system for generating map data from an image
US9146862B2 (en) * 2013-07-18 2015-09-29 International Business Machines Corporation Optimizing memory usage across multiple garbage collected computer environments
JP5693803B1 (en) 2013-07-26 2015-04-01 Citizen Holdings Co., Ltd. Light source device and projection device
US9557856B2 (en) * 2013-08-19 2017-01-31 Basf Se Optical detector
US20150104101A1 (en) * 2013-10-14 2015-04-16 Apple Inc. Method and ui for z depth image segmentation
EP4321915A2 (en) 2013-10-16 2024-02-14 Magic Leap, Inc. Virtual or augmented reality headsets having adjustable interpupillary distance
JP6287095B2 (en) * 2013-11-19 2018-03-07 Seiko Epson Corporation Optical device and electronic apparatus
US9857591B2 (en) 2014-05-30 2018-01-02 Magic Leap, Inc. Methods and system for creating focal planes in virtual and augmented reality
EP4220999A3 (en) 2013-11-27 2023-08-09 Magic Leap, Inc. Virtual and augmented reality systems and methods
US10620457B2 (en) 2013-12-17 2020-04-14 Intel Corporation Controlling vision correction using eye tracking and depth detection
US9804395B2 (en) 2014-01-29 2017-10-31 Ricoh Co., Ltd Range calibration of a binocular optical augmented reality system
WO2015117043A1 (en) 2014-01-31 2015-08-06 Magic Leap, Inc. Multi-focal display system and method
EP3100098B8 (en) 2014-01-31 2022-10-05 Magic Leap, Inc. Multi-focal display system and method
US10203762B2 (en) 2014-03-11 2019-02-12 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
US10264211B2 (en) * 2014-03-14 2019-04-16 Comcast Cable Communications, Llc Adaptive resolution in software applications based on dynamic eye tracking
JP6550460B2 (en) 2014-05-09 2019-07-24 Google LLC System and method for identifying eye signals, and continuous biometric authentication
USD759657S1 (en) 2014-05-19 2016-06-21 Microsoft Corporation Connector with illumination region
CA3124368C (en) 2014-05-30 2023-04-25 Magic Leap, Inc. Methods and systems for generating virtual content display with a virtual or augmented reality apparatus
USD752529S1 (en) 2014-06-09 2016-03-29 Comcast Cable Communications, Llc Electronic housing with illuminated region
CN104155819B (en) * 2014-08-04 2017-03-15 Shanghai AVIC Optoelectronics Co., Ltd. Pixel structure and driving method thereof, and display device
US10067561B2 (en) 2014-09-22 2018-09-04 Facebook, Inc. Display visibility based on eye convergence
US20160131902A1 (en) 2014-11-12 2016-05-12 Anthony J. Ambrus System for automatic eye tracking calibration of head mounted display device
USD758367S1 (en) 2015-05-14 2016-06-07 Magic Leap, Inc. Virtual reality headset
WO2017079333A1 (en) 2015-11-04 2017-05-11 Magic Leap, Inc. Light field display metrology
USD805734S1 (en) 2016-03-04 2017-12-26 Nike, Inc. Shirt
USD794288S1 (en) 2016-03-11 2017-08-15 Nike, Inc. Shoe with illuminable sole light sequence

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI817335B (en) * 2022-01-25 2023-10-01 Acer Incorporated Stereoscopic image playback apparatus and method of generating stereoscopic images thereof

Also Published As

Publication number Publication date
US11454495B2 (en) 2022-09-27
TWI695999B (en) 2020-06-11
IL259074B (en) 2022-06-01
JP2019501564A (en) 2019-01-17
CN108476311A (en) 2018-08-31
US11226193B2 (en) 2022-01-18
JP7210280B2 (en) 2023-01-23
AU2016349891A1 (en) 2018-05-31
IL259074A (en) 2018-07-31
CN113489967A (en) 2021-10-08
US20210148697A1 (en) 2021-05-20
AU2021202036A1 (en) 2021-04-29
US20200225024A1 (en) 2020-07-16
US11898836B2 (en) 2024-02-13
EP3371573B1 (en) 2022-06-15
NZ742532A (en) 2019-05-31
US10571251B2 (en) 2020-02-25
KR102633000B1 (en) 2024-02-01
KR20180081103A (en) 2018-07-13
CN113358045A (en) 2021-09-07
IL309607A (en) 2024-02-01
IL259072B (en) 2022-06-01
US11536559B2 (en) 2022-12-27
IL292793B1 (en) 2024-02-01
CN108474737A (en) 2018-08-31
IL292793A (en) 2022-07-01
JP7218398B2 (en) 2023-02-06
US20190226830A1 (en) 2019-07-25
AU2021202036B2 (en) 2022-06-09
WO2017079329A1 (en) 2017-05-11
JP2023053974A (en) 2023-04-13
JP2019504292A (en) 2019-02-14
JP6983773B2 (en) 2021-12-17
EP4235639A3 (en) 2023-10-25
CA3004278A1 (en) 2017-05-11
US20170122725A1 (en) 2017-05-04
KR20230151554A (en) 2023-11-01
KR20180080302A (en) 2018-07-11
US20170124928A1 (en) 2017-05-04
US20190323825A1 (en) 2019-10-24
KR20240017132A (en) 2024-02-06
EP3371972A1 (en) 2018-09-12
JP2021073820A (en) 2021-05-13
EP4080194A1 (en) 2022-10-26
AU2022224797B2 (en) 2023-06-29
AU2016349895B2 (en) 2022-01-13
WO2017079333A1 (en) 2017-05-11
EP3371573A1 (en) 2018-09-12
EP3371972A4 (en) 2019-05-01
KR102592980B1 (en) 2023-10-20
AU2016349891B2 (en) 2021-04-22
US10378882B2 (en) 2019-08-13
JP7189243B2 (en) 2022-12-13
EP4235639A2 (en) 2023-08-30
CN108476311B (en) 2021-04-27
EP3371573A4 (en) 2019-05-08
EP3371972B1 (en) 2023-06-07
TWI648559B (en) 2019-01-21
AU2022224797A1 (en) 2022-09-22
NZ742518A (en) 2019-08-30
CN108474737B (en) 2021-04-06
AU2016349891B9 (en) 2021-05-06
JP2021141612A (en) 2021-09-16
US20230108721A1 (en) 2023-04-06
CA3004271A1 (en) 2017-05-11
IL259072A (en) 2018-07-31
US10260864B2 (en) 2019-04-16
AU2016349895A1 (en) 2018-05-31
TW201908817A (en) 2019-03-01

Similar Documents

Publication Publication Date Title
AU2022224797B2 (en) 2023-06-29 Dynamic display calibration based on eye-tracking