CN117501084A - Apparatus, system, and method for selectively compensating corrective lenses applied to a display device during testing - Google Patents

Info

Publication number
CN117501084A
CN117501084A (application CN202280043295.3A)
Authority
CN
China
Prior art keywords
image
display device
controller
compensation element
conoscope
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280043295.3A
Other languages
Chinese (zh)
Inventor
Robin Sharma
Yunhan Li
Difei Qi
Konstantin Kudinov
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meta Platforms Technologies LLC
Original Assignee
Meta Platforms Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/721,760 (US20220404608A1)
Application filed by Meta Platforms Technologies LLC filed Critical Meta Platforms Technologies LLC
Priority claimed from PCT/US2022/033874 (WO2022266380A1)
Publication of CN117501084A

Landscapes

  • Lens Barrels (AREA)

Abstract

An apparatus comprising: (1) a conoscope configured to receive an image emitted by a display device through a corrective lens; (2) a variable compensation element coupled to the conoscope, wherein the variable compensation element is capable of selectively modifying the image emitted by the display device to compensate for an optical effect exerted on the image by the corrective lens; and (3) a controller coupled to the variable compensation element, wherein the controller: (1) receives a compensation parameter indicative of the optical effect exerted by the corrective lens on the image; (2) selects, based at least in part on the compensation parameter, a feature of the variable compensation element that compensates for the optical effect; and (3) causes the feature of the variable compensation element to be applied to the image. Various other apparatuses, systems, and methods are also disclosed.

Description

Apparatus, system, and method for selectively compensating corrective lenses applied to a display device during testing
Technical Field
The present disclosure relates generally to apparatus, systems, and methods for selectively compensating corrective lenses applied to display devices during testing. As will be explained in more detail below, these devices, systems, and methods may provide a number of features and benefits.
Background
In some examples, a display of an augmented reality (AR) and/or virtual reality (VR) device may emit collimated light toward an eye of a user. In some AR/VR devices, a prescription lens may be placed between the user's eye and the AR/VR display to correct imperfections and/or refractive errors in the user's vision. Unfortunately, these prescription lenses are typically not easily removable. Furthermore, these prescription lenses may complicate characterizing the performance of the AR/VR display via metrics such as the modulation transfer function (MTF) and/or display color uniformity. The MTF of an AR/VR display may represent its resolution properties and/or quantify the sharpness of the display at different spatial frequencies.
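For readers unfamiliar with the metric, the short sketch below (not part of the disclosure) estimates an MTF curve from a measured edge profile using the common edge-spread-function approach; the synthetic edge and pixel pitch are illustrative assumptions.

```python
import numpy as np

def mtf_from_edge(edge_profile: np.ndarray, pixel_pitch_mm: float):
    """Estimate an MTF curve from a 1-D edge-spread function (ESF).

    Returns (spatial frequencies in cycles/mm, MTF normalized to 1 at DC).
    """
    lsf = np.gradient(edge_profile)       # line spread function = d(ESF)/dx
    lsf = lsf * np.hanning(lsf.size)      # window to reduce FFT leakage
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                         # normalize so MTF(0) = 1
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)
    return freqs, mtf

# Synthetic example: a slightly blurred dark-to-bright edge.
x = np.linspace(-1.0, 1.0, 256)
esf = 1.0 / (1.0 + np.exp(-x / 0.02))     # smooth step ~ blurred edge
freqs, mtf = mtf_from_edge(esf, pixel_pitch_mm=0.005)
```

In a defocused or lens-distorted image the edge spreads further, the line spread function widens, and the MTF falls off at lower spatial frequencies, which is why the well-focused image discussed next matters.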
In some cases, it may be useful to characterize display parameters (e.g., MTF and/or color uniformity) of an AR/VR device that includes a prescription lens. However, accurate evaluation may require a well-focused image, and the presence of the prescription lens may distort the image emitted by the AR/VR device, making a well-focused image difficult to obtain. For example, prescription lenses are typically customized for a particular user and/or a particular group of users. Thus, such prescription lenses may introduce variable modifications to the wavefront of light from the AR/VR display, which makes determining the MTF and/or measuring color uniformity difficult.
Visual perception may account for over 80% of all information received and/or consumed by humans from the real world. For this reason, a primary task in creating a realistic and/or believable AR/VR experience across all modalities may involve generating artificial images for the human eye. Efforts to this end may require the development and/or manufacture of so-called near-eye displays (NEDs). The development and/or manufacture of such NEDs can be a complex scientific and/or engineering effort that includes not only developing and/or manufacturing the display itself but also creating all the infrastructure necessary for characterizing, testing, and/or performing metrology on such displays in a scalable manner.
In some cases, characterization of the AR/VR display may be useful for various quality and/or calibration purposes. Examples of applications for display characterization and/or measurement include, but are not limited to: evaluation of a display prototype, quality control on a display production line, compliance checking related to a display (e.g., related to safety regulations), calibration and adjustment of a display prior to or during use, combinations or variations of one or more of the above, and/or any other suitable application.
Contemporary AR/VR displays may push the limits of optical system design with the aim of increasing the user's field of view and/or eyebox, improving display efficiency, and/or reducing display size and/or weight. In the race for better performance and/or efficiency, some metrics (e.g., MTF and/or angular uniformity) may be sacrificed in order to optimize other parameters. This tradeoff may be reasonable because non-uniformity and/or other defects of the display are typically detected and/or corrected by proper calibration during testing. While this may relieve some of the pressure on display developers, such an approach may present additional challenges for optical metrology because of the large amounts of data that must be collected, processed, and/or stored to facilitate such testing and/or calibration. This approach may also result in a very long factory process that requires expensive automated machinery with custom functions and/or custom devices.
Examples of test and calibration procedures that an AR/VR display may undergo include display MTF testing and/or color uniformity calibration. One purpose of display MTF testing is to ensure that the virtual image created by the display meets the required MTF specification when viewed from several different angles. In some examples, such MTF testing may involve and/or deploy complex systems with several cameras. In such examples, each camera may view the virtual image at a defined angle and/or perform the necessary data collection. Using several cameras in this manner may allow and/or facilitate the simultaneous acquisition of information about all angles. However, this approach may have a number of significant drawbacks, including high cost (e.g., the equipment may require up to 30 expensive cameras), complex alignment and/or calibration procedures, large system size and/or footprint, heavy weight, and/or substantial inconvenience.
In some cases, MTF measurements may be more complex if the AR/VR device under test is adjusted for multiple users with different visual acuity. For example, the AR/VR device may include and/or incorporate prescription lenses for one or both eyes, and these prescription lenses may be non-detachably secured and/or attached to the AR/VR display. Thus, removing the prescription lenses may not be feasible outside of professional servicing. These prescription lenses may modify the image projected by the AR/VR display, similar to how prescription glasses compensate for myopia or hyperopia.
Further challenges may be presented by the need to determine the MTF of an AR/VR display with a prescription lens and/or to measure the color uniformity of the AR/VR display. Ametropia correction for the distance vision of any given eye may involve three primary variables, namely sphere, cylinder, and/or cylinder axis. Because these variables each take on a range of values, there may be a very large number of unique lens prescriptions (e.g., 500,000 different options). Even for variations in optical power alone, a large number (e.g., hundreds or more) of prescription lens powers may need to be covered during display characterization (e.g., to determine display parameters such as MTF).
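To give a sense of how quickly these variables multiply, the sketch below counts prescriptions over assumed grids; the ranges and quarter-diopter steps are illustrative, not values taken from the disclosure.

```python
import numpy as np

# Illustrative (assumed) prescription grids in quarter-diopter steps.
spheres = np.arange(-12.0, 8.25, 0.25)    # 81 spherical powers
cylinders = np.arange(-6.0, 0.25, 0.25)   # 25 cylindrical powers
axes_deg = range(0, 180)                  # 180 one-degree axis values

unique_prescriptions = len(spheres) * len(cylinders) * len(axes_deg)
print(unique_prescriptions)               # 81 * 25 * 180 = 364,500
```

Even before counting the axis, the sphere-cylinder grid alone spans roughly two thousand power combinations, consistent with the hundreds of lens powers mentioned above.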
In some cases, determining the MTF and/or color uniformity of an AR/VR display whose prescription lens provides astigmatic correction may prove particularly challenging. Such prescription lenses may add further variable modifications to the wavefront of light emitted from the AR/VR display. One approach to addressing this challenge may involve focusing the image sequentially along different astigmatic axes. Unfortunately, this approach may entail extensive focus control, additional adjustments of the device and/or equipment under test, complex data processing, and/or large measurement errors.
In some cases, it may be very useful to determine the display MTF and/or color uniformity using a single camera and a single image acquisition. However, without a compensation element, the prescription lens may introduce aberrations into the outgoing light wavefront that dominate any MTF or color uniformity measurement. Thus, compensating for these aberrations may be required for accurate MTF and/or color uniformity determinations. Accordingly, the present disclosure identifies and addresses the need for additional and improved apparatuses, systems, and methods for selectively compensating prescription lenses applied to display devices during testing.
Disclosure of Invention
According to a first aspect of the present disclosure, there is provided an apparatus comprising: a conoscope configured to receive an image emitted by a display device through a corrective lens; a variable compensation element coupled to the conoscope, wherein the variable compensation element is capable of selectively modifying the image emitted by the display device to compensate for an optical effect exerted on the image by the corrective lens; and a controller coupled to the variable compensation element, wherein the controller: receives a compensation parameter indicative of the optical effect exerted by the corrective lens on the image; selects, based at least in part on the compensation parameter, a feature of the variable compensation element that compensates for the optical effect; and causes the feature of the variable compensation element to be applied to the image.
The apparatus may further include an image sensor coupled to the variable compensation element, wherein: the image sensor is configured to sense the image as compensated by the feature of the variable compensation element; and the controller: receives data from the image sensor representing the image sensed by the image sensor; and determines a display parameter of the display device based at least in part on the data representing the image.
The display parameter may include at least one of: a modulation transfer function of the display device; a color uniformity measurement of the display device; a resolution of the display device; a sharpness measurement of the display device; or a brightness uniformity measurement of the display device.
The variable compensation element may comprise a phase plate array comprising a plurality of selectable features. The controller may select one of the plurality of selectable features included on the phase plate array for application to the image before the image reaches the image sensor.
The plurality of selectable features included on the phase plate array may include at least one Pancharatnam-Berry phase (PBP) optical element of a set of PBP optical elements.
The apparatus may further include a phase plate positioning mechanism communicatively coupled to the controller and configured to move the phase plate array. The controller may instruct the phase plate positioning mechanism to move the phase plate array to a position such that the one of the plurality of selectable features is applied to the image as the image passes through the conoscope.
The phase plate positioning mechanism may include one or more actuators. The conoscope may include an optical tray positioned in the optical path of the image. The phase plate positioning mechanism may engage the one or more actuators to move the phase plate array relative to the optical tray such that the one of the plurality of selectable features is applied to the image as the image passes through the optical path within the conoscope.
The controller may provide the display parameter of the display device to a user interface or computing device for evaluation.
The compensation parameter may include at least one of: a spherical power of the corrective lens; a cylindrical power of the corrective lens; or a cylinder axis of the corrective lens.
The conoscope may include: a collection lens configured to collect light representing the image emitted by the display device; and an imaging lens configured to form, from the light, an image for presentation on an image sensor.
The collection lens may collect light emitted by the display device over a range of viewing angles.
According to a second aspect, there is provided a system comprising: a display device including a corrective lens; and an imaging camera device optically coupled to the display device, wherein the imaging camera device comprises the apparatus of the first aspect.
According to a third aspect, there is provided a method comprising: optically coupling a display device to a conoscope, the conoscope configured to receive an image emitted by the display device through a corrective lens; receiving, by a controller coupled to a variable compensation element positioned in an optical path of the conoscope, a compensation parameter indicative of at least one optical effect exerted by the corrective lens on the image; selecting, by the controller, based at least in part on the compensation parameter, a feature of the variable compensation element that compensates for the optical effect; and applying, by the controller, the feature of the variable compensation element to the image in the conoscope.
Drawings
The accompanying drawings illustrate several examples and are a part of the specification. The drawings together with the following description illustrate and explain various principles of the disclosure.
FIG. 1 is an illustration of an exemplary apparatus for selectively compensating a corrective lens applied to a display device during testing.
FIG. 2 is an illustration of an exemplary conoscope incorporated into an apparatus that facilitates selectively compensating corrective lenses applied to a display device during testing.
Fig. 3 is an illustration of an exemplary apparatus for selectively compensating a corrective lens applied to a display device during testing.
FIG. 4 is an illustration of an exemplary system for selectively compensating a corrective lens applied to a display device during testing.
FIG. 5 is an illustration of an exemplary variable compensation element that includes various selectable features that can be applied within the conoscope.
FIG. 6 is a flow chart of an exemplary method for selectively compensating a corrective lens applied to a display device during testing.
Fig. 7 is an illustration of an exemplary augmented reality environment.
Fig. 8 is an illustration of an exemplary virtual reality headset.
While the examples described herein are susceptible to various modifications and alternative forms, specific examples are shown in the drawings and will be described in detail herein. However, the examples described herein are not intended to be limited to the particular forms disclosed. Rather, the present disclosure covers all modifications, combinations, equivalents, and alternatives falling within the disclosure.
Detailed Description
As will be described in more detail below, an apparatus and/or system may be configured to compensate for many possible prescription lenses in order to transfer a clear image from an AR/VR display under test onto an image sensor. By doing so, the apparatus and/or system may facilitate and/or support accurately determining the MTF and/or color uniformity of the AR/VR display. In some examples, the apparatus and/or system may be configured to obtain the MTF and/or color uniformity of the AR/VR display. In such examples, the MTF and/or color uniformity determination may indicate and/or represent image sharpness on the AR/VR display. These MTF and/or color uniformity determinations may be used to ensure that the AR/VR display meets and/or satisfies certain quality criteria prior to distribution and/or shipment.
In some examples, such apparatuses and/or systems may include a light collection element (e.g., an optical element such as a conoscope lens) configured to collect light from a display over a range of viewing angles. By collecting light from a range of viewing angles, such apparatuses and/or systems may alleviate and/or eliminate the conventional requirement to arrange multiple cameras at multiple locations for testing and/or calibration. Thus, such apparatuses and/or systems may require and/or include only a single camera, thereby helping to reduce costs and greatly simplify alignment and/or operation during testing and/or calibration.
In some examples, such apparatuses and/or systems may implement and/or include a compensation element (e.g., a feature of a phase plate array) that helps compensate for prescription lenses incorporated into the AR/VR device under test. In one example, the compensation element may be positioned, placed, and/or located within or near a pupil-conjugate region of a conoscope included in the apparatus and/or system. In this example, the compensation element may be selected from a plurality of such elements (e.g., incorporated in a phase plate array) based at least in part on the nature and/or characteristics of the prescription lens applied to the AR/VR device. The compensation element may compensate and/or account for the effect of the prescription lens on image quality. Thus, the compensation element may enable the optical construction of the apparatus and/or system to be achieved and/or realized with a very simple design (e.g., including certain fixed optical features).
In some examples, adjustment for a prescription lens incorporated into an AR/VR device may involve automatically selecting and/or applying a compensation element from an array of such elements disposed on a phase plate array. In such an example, light from the AR/VR display may be modified by the prescription lens, while the compensation element may reverse the effect of the prescription lens. In one example, such apparatuses and/or systems may include and/or represent a doubly telecentric afocal optical relay, a telecentric eyepiece lens, an image sensor, and/or a compensation plate (e.g., a phase plate array).
In some examples, such apparatuses and/or systems may facilitate and/or support determination and/or calibration of the MTF and/or the color uniformity or sharpness of an AR/VR display mounted and/or equipped with a prescription lens. These prescription lenses may provide and/or apply only spherical correction, only cylindrical correction, or both spherical and cylindrical correction. In one example, such an apparatus and/or system may include and/or represent a conoscope and/or another optic configured to collect light from an AR/VR display at several viewing angles to facilitate and/or support the use of a single camera to determine the MTF and/or color uniformity. Additionally or alternatively, such apparatuses and/or systems may include and/or represent special prisms and/or reflectors (e.g., spatial reflectors) configured to collect light from an AR/VR display at several viewing angles.
In some examples, the compensation elements of such devices and/or systems may include and/or represent at least one Pancharatnam-Berry phase (PBP) optical element. In one example, the selection, adjustment, insertion, and/or application of such compensation elements may be automated and directed by a controller (e.g., a processing device). Additionally or alternatively, such compensation elements may be selected based at least in part on one or more adjustment parameters and/or optical properties of a prescription lens applied to the tested AR/VR display.
Detailed descriptions of exemplary devices, systems, components, and corresponding embodiments for selectively compensating corrective lenses applied to a display device during testing are provided below with reference to fig. 1-5. In addition, a method for selectively compensating a corrective lens applied to a display device during testing is described in detail in connection with fig. 6. The discussion corresponding to fig. 7 and 8 will provide a detailed description of the types of example artificial reality devices, wearable devices, and/or related systems that may be tested and/or calibrated using one of the apparatus, systems, and/or methods disclosed herein.
Fig. 1 illustrates an example apparatus 100 that facilitates and/or supports selectively compensating corrective lenses applied to a display device during testing. As shown in fig. 1, the exemplary apparatus 100 may include and/or represent a conoscope 102, a variable compensation element 104, and/or a controller 106. In some examples, the conoscope 102 may be configured to receive, detect, and/or accept an image emitted by a display device through a corrective lens. In such examples, the variable compensation element 104 may be physically, directly, indirectly, and/or optically coupled to the conoscope 102. In one example, the variable compensation element 104 may be capable of selectively modifying the image emitted by the display device to compensate for optical effects imposed on the image by the corrective lens.
In some examples, the controller 106 may be physically, directly, indirectly, and/or communicatively coupled to the variable compensation element 104. In one example, the controller 106 may receive and/or obtain compensation parameters that are indicative of the optical effects exerted by the corrective lens on the image. The controller 106 may receive and/or obtain the compensation parameter in a variety of different ways. For example, the controller 106 may receive and/or obtain compensation parameters as user input via a user interface from a technician testing the display device. In another example, the display device may be programmed with information identifying certain properties of the corrective lens, and the controller 106 may be communicatively coupled to the display device (e.g., via a wireless connection and/or a wired connection). In this example, the controller 106 may receive and/or obtain information from the display device identifying those properties of the corrective lens via the communicative coupling.
In some examples, the compensation parameters may constitute and/or represent one or more features and/or attributes of the corrective lens through which the image is transmitted. Additionally or alternatively, the compensation parameters may constitute and/or represent one or more features and/or properties of the optical effect exerted by the corrective lens on the image. Examples of compensation parameters include, but are not limited to: the spherical power of the corrective lens, the cylindrical power of the corrective lens, the cylinder axis of the corrective lens, combinations or variations of one or more of the above, and/or any other suitable compensation parameter.
In some examples, the controller 106 may select features of the variable compensation element 104 that compensate and/or account for optical effects imposed by the corrective lens based at least in part on the compensation parameters. In one example, the variable compensation element 104 may include and/or represent a phase plate array having a plurality of selectable features. In this example, the controller 106 may identify one of a plurality of selectable features of the array of phase plates as being capable of counteracting and/or reversing the optical effects exerted by the corrective lens on the image. For example, selected features of the phase plate array, when applied to an image, may counteract and/or reverse optical effects imposed on the image by the spherical power, cylindrical power, and/or cylinder axis of the corrective lens. Such cancellation and/or reversal of optical effects may correct and/or adjust the image for proper testing and/or calibration of the display device by the apparatus 100 and/or the corresponding image sensor.
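As a concrete illustration of this selection step, the following sketch picks, from a discrete set of plate features, the one that best cancels a given prescription; the data structure, residual metric, and axis weighting are illustrative assumptions rather than the algorithm of the disclosure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PlateFeature:
    index: int       # position of the feature on the phase plate array
    sph: float       # spherical power of the feature, in diopters
    cyl: float       # cylindrical power of the feature, in diopters
    axis_deg: float  # cylinder-axis orientation of the feature, in degrees

def select_feature(features, lens_sph, lens_cyl, lens_axis_deg):
    """Return the feature whose powers best cancel the corrective lens.

    A feature compensates the lens when its powers are equal and
    opposite, so we minimize the residual |lens + feature| with a small
    penalty for cylinder-axis mismatch (axes wrap modulo 180 degrees).
    """
    def residual(f: PlateFeature) -> float:
        diff = abs(lens_axis_deg - f.axis_deg) % 180.0
        axis_err = min(diff, 180.0 - diff)
        return abs(lens_sph + f.sph) + abs(lens_cyl + f.cyl) + 0.01 * axis_err
    return min(features, key=residual)

# Example: a -2.00 / -0.50 x 90 lens is best cancelled by a +2.00 / +0.50 feature.
plate = [PlateFeature(1, 0.0, 0.0, 0.0), PlateFeature(2, 2.0, 0.5, 90.0)]
best = select_feature(plate, lens_sph=-2.0, lens_cyl=-0.5, lens_axis_deg=90.0)
assert best.index == 2
```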
In some examples, the controller 106 may cause the feature of the variable compensation element 104 to be applied to the image. In one example, upon selecting a feature of the variable compensation element 104, the controller 106 may manipulate and/or adjust the variable compensation element 104 such that the selected feature is applied to the image before the image reaches the image sensor. The controller 106 may manipulate and/or adjust the variable compensation element 104 in a variety of different ways. For example, the apparatus 100 may include and/or represent a phase plate positioning mechanism (not necessarily shown in fig. 1) communicatively coupled to the controller 106 and/or configured to move the variable compensation element 104 in one or more directions (e.g., rotationally, laterally, horizontally, vertically, diagonally, etc.). In this example, the controller 106 may direct and/or instruct the phase plate positioning mechanism to move the variable compensation element 104 to a position and/or arrangement in which the selected feature is applied to the image as the image passes through the conoscope 102.
In some examples, the conoscope 102 may include and/or represent any type or form of device and/or component that performs, facilitates, and/or supports conoscopic measurements and/or observations. In one example, the conoscope 102 may include and/or represent one or more optical components. Examples of such optical components include, but are not limited to: lenses, quarter-wave plates, reflectors, polarizers, retarders, partial reflectors, reflective polarizers, optical films, compensators, beam splitters, alignment layers, color filters, protective sheets, glass elements, plastic elements, apertures, Fresnel lenses, convex lenses, concave lenses, filters, spherical lenses, cylindrical lenses, coatings, combinations or variations of one or more of the above, and/or any other suitable optical component. In one example, the conoscope 102 may include and/or represent a telecentric lens stack capable of performing one or more optical functions, including: image scaling, lens correction, polarization, reflection, retardation, refraction, light collection, optical aberration correction, gamma correction and/or gamma adjustment, multi-image blending and/or multi-image overlay, display overload compensation, mura correction, dithering, image decompression, noise correction, image distortion correction, contrast adjustment, and/or sharpening, among others.
In some examples, the various optical components of the conoscope 102 may include and/or represent a variety of different materials. Examples of such materials include, but are not limited to: plastic, glass (e.g., crown glass), polycarbonate, combinations or variations of one or more of the above, and/or any other suitable material. The various optical components may be defined and/or formed in various shapes and/or sizes.
In some examples, the variable compensation element 104 may include and/or represent any type or form of phase plate array and/or phase compensation plate. In such examples, the phase plate array and/or phase compensation plate may include and/or represent a plurality of selectable features (e.g., phase compensation elements). In one example, the plurality of selectable features may include and/or represent a set of PBP optical elements and/or correctors. Additionally or alternatively, at least a portion of the variable compensation element 104 may be sized and/or dimensioned to fit within an optical tray positioned in the optical path of the image within the conoscope 102.
In some examples, the variable compensation element 104 may include and/or represent one or more spherical-power features and/or adapters capable of counteracting and/or reversing the spherical power or effect of a corrective lens applied to a display device under test. In other examples, the variable compensation element 104 may include and/or represent one or more cylindrical-power features and/or adapters capable of counteracting and/or reversing the cylindrical power or effect of a corrective lens applied to a display device under test. Additionally or alternatively, the variable compensation element 104 may include and/or represent one or more cylinder-axis features and/or adapters capable of counteracting and/or reversing the cylinder axis of a corrective lens applied to a display device under test. Finally, the variable compensation element 104 may include and/or represent one or more field-curvature features and/or adapters capable of counteracting and/or reversing the field curvature and/or magnification effects of a corrective lens applied to a display device under test.
In some examples, the controller 106 may include and/or represent any type or form of hardware-implemented processing device and/or system capable of interpreting and/or executing computer-readable instructions. In one example, the controller 106 may access and/or modify certain software modules stored in memory to facilitate and/or support selectively compensating corrective lenses applied to the display device during testing. Examples of the controller 106 include, but are not limited to: a physical processor, a central processing unit (CPU), a microprocessor, a microcontroller, a field-programmable gate array (FPGA) implementing a soft-core processor, an application-specific integrated circuit (ASIC), a portion of one or more of the above, variations or combinations of one or more of the above, and/or any other suitable controller.
In some examples, the apparatus 100 may include and/or represent one or more additional components, devices, and/or mechanisms that are not necessarily shown and/or identified in fig. 1-5. For example, the conoscope 102 may include and/or represent one or more lenses that are not necessarily shown and/or labeled in fig. 1-5. In another example, although not necessarily shown and/or identified in fig. 1-5, the apparatus 100 may include and/or represent additional optical components, circuitry, processors, storage devices, cables, connectors, springs, motors, actuators, and the like. In further examples, the apparatus 100 may exclude and/or omit one or more of the components, devices, and/or mechanisms shown and/or identified in fig. 1-5.
Fig. 2 illustrates an exemplary embodiment of the conoscope 102 that facilitates and/or supports selectively compensating corrective lenses applied to a display device during testing. As shown in fig. 2, the exemplary conoscope 102 may include and/or represent a front lens 202, a back lens 204, an aperture stop 206, and/or a circular polarizer 210. In one example, the front lens 202 may include and/or represent a collection lens configured to collect collimated light representing an image emitted by a display device. In this example, the front lens 202 may collect collimated light emitted by the display device over a range of viewing angles. Additionally or alternatively, the back lens 204 may include and/or represent an imaging lens configured to form an image from the collimated light for presentation on an image sensor.
In some examples, the front lens 202 and the back lens 204 may be similar and/or identical to each other. In such an example, the front lens 202 and the back lens 204 may be positioned and/or oriented in opposite directions relative to each other within the conoscope 102.
In some examples, the front lens 202 may be positioned and/or placed closest to and/or in proximity to the display device under test. In such an example, the aperture stop 206 may be positioned and/or placed closest to and/or near the image sensor. In one example, the circular polarizer 210 and/or the back lens 204 may be positioned and/or placed in close proximity to each other between the front lens 202 and the aperture stop 206. More specifically, the circular polarizer 210 may be positioned and/or placed between the front lens 202 and the back lens 204. Additionally or alternatively, the back lens 204 may be positioned and/or placed between the circular polarizer 210 and the aperture stop 206. Thus, light emitted by the display device may enter the conoscope 102 at and/or through the front lens 202 and then proceed toward the circular polarizer 210. After passing through the circular polarizer 210, the light may pass toward the back lens 204 and then exit the conoscope 102 via the aperture stop 206.
Fig. 3 illustrates an exemplary embodiment of the apparatus 100 that facilitates and/or supports selectively compensating corrective lenses applied to a display device during testing. As shown in fig. 3, this exemplary embodiment of the apparatus 100 may include and/or represent the conoscope 102, the variable compensation element 104, and/or the controller 106. In some examples, the apparatus 100 may include and/or represent a phase plate positioning mechanism 304 communicatively coupled to the controller 106. Additionally or alternatively, the phase plate positioning mechanism 304 may be physically and/or communicatively coupled to the variable compensation element 104. In such an example, the phase plate positioning mechanism 304 may be configured to move the variable compensation element 104 in one or more directions (e.g., rotationally, laterally, horizontally, vertically, diagonally, etc.) relative to the conoscope 102. In one example, the phase plate positioning mechanism 304 may move the variable compensation element 104 to a position in which the feature selected by the controller 106 is applied to the image as the image passes through the conoscope 102.
In some examples, the phase plate positioning mechanism 304 may include and/or represent one or more devices and/or components that apply a force to the variable compensation element 104. The force exerted by such devices and/or components may cause and/or direct the variable compensation element 104 to move to a position relative to the optical path of the conoscope 102. Examples of such devices and/or components include, but are not limited to: actuators, rotators, servo motors, direct current (DC) motors, alternating current (AC) motors, variations or combinations of one or more of the above, and/or any other suitable devices and/or components.
In some examples, by moving the variable compensation element 104 to this position, the phase plate positioning mechanism 304 may align the selected feature of the variable compensation element 104 with the optical path of the conoscope 102. For example, the phase plate positioning mechanism 304 may engage one or more actuators to move the variable compensation element 104 relative to and/or within the optical tray of the conoscope 102 such that the selected feature is applied to the image as the image passes through the optical path of the conoscope 102.
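The sketch below shows one way a controller might translate a selected feature index into actuator moves; the grid geometry, pitch, and actuator API are hypothetical.

```python
# Hypothetical geometry: features laid out on an 8 x 4 grid of fixed pitch.
FEATURE_PITCH_MM = 12.5  # assumed center-to-center spacing of the features
GRID_COLS = 8            # assumed number of features per row

def feature_to_xy(index: int) -> tuple[float, float]:
    """Convert a 1-based feature index into plate coordinates in mm."""
    row, col = divmod(index - 1, GRID_COLS)
    return col * FEATURE_PITCH_MM, row * FEATURE_PITCH_MM

def center_feature_on_axis(actuators, index: int) -> None:
    """Drive the x/y actuators (hypothetical API) so the chosen feature
    sits on the conoscope's optical axis."""
    x_mm, y_mm = feature_to_xy(index)
    actuators.move_to(x=x_mm, y=y_mm)
```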
Fig. 4 illustrates an example system 400 that facilitates and/or supports selectively compensating a corrective lens 404 applied to a display device 402 during testing. As shown in fig. 4, the exemplary system 400 may include and/or represent the display device 402 and/or an imaging camera device 410. In some examples, the imaging camera device 410 may be optically coupled to the display device 402. In one example, the imaging camera device 410 may include and/or represent the conoscope 102, the variable compensation element 104, and/or the controller 106. In this example, the imaging camera device 410 may include and/or represent an image sensor 408 that is physically, directly, indirectly, and/or optically coupled to the variable compensation element 104.
In some examples, the display device 402 may include and/or represent a light source 412 that emits, projects, and/or transmits the image 406 as collimated light through the corrective lens 404. In such examples, the corrective lens 404 may compensate for and/or address defects and/or deficiencies in a user's vision (much like prescription glasses). Due to the corrective lens 404, the image 406 may experience one or more optical effects that alter and/or distort the collimated light in one way or another relative to its original form.
In some examples, the corrective lens 404 may modify, change, and/or alter the image 406 to compensate for refractive errors experienced by users. Examples of such refractive errors include, but are not limited to: myopia, hyperopia, astigmatism, presbyopia, combinations or variations of one or more of the above, and/or any other vision defect and/or deficiency. To accurately test and/or calibrate the quality of the image 406 after passing through the corrective lens 404, the image 406 may undergo further modifications by various adapters included in the imaging camera device 410 before reaching the image sensor 408. In some examples, the corrective lens 404 may include and/or represent any type or form of prescription lens and/or transmissive optical device prescribed by an eye-care professional (e.g., an optician and/or optometrist).
In one example, the image 406 may pass, traverse, and/or advance along an optical path 416 through various components of the conoscope 102 (e.g., including the front lens 202, the circular polarizer 210, and/or the back lens 204) before reaching the image sensor 408 for evaluation and/or analysis. In this example, the image 406 may also pass, traverse, and/or advance through the selected feature of the variable compensation element 104 before reaching the image sensor 408 for evaluation and/or analysis. In doing so, the image 406 may undergo and/or experience a transformation and/or change that returns the collimated light to its original form (e.g., prior to the optical effect applied by the corrective lens 404). In other words, the selected feature of the variable compensation element 104 may adjust and/or process the collimated light such that the image 406 reaches the image sensor 408 in the form it had before the corrective lens 404. In some examples, the image sensor 408 may detect, sense, analyze, and/or evaluate the image 406 to obtain information and/or insight regarding the quality of the image 406 projected and/or presented by the display device 402.
In some examples, the conoscope 102 may include and/or represent an optical tray 414 positioned and/or disposed in or along the optical path 416 traversed by the image 406. In one example, the variable compensation element 104 may be inserted and/or mounted into the optical tray 414. In this example, the optical tray 414 may constitute, represent, and/or form an interface and/or receptacle between the variable compensation element 104 and the conoscope 102. Additionally or alternatively, the phase plate positioning mechanism 304 may engage one or more actuators to move the variable compensation element 104 relative to the optical tray 414. Through this movement, the phase plate positioning mechanism 304 may effectively apply a selectable feature of the variable compensation element 104 to the image 406 and/or align it with the image 406 as the image 406 traverses the optical path 416 within the conoscope 102.
In some examples, the image sensor 408 may be communicatively coupled to the controller 106 and/or may provide data representing the image 406 to the controller 106 for further evaluation and/or analysis. In one example, the controller 106 may receive data representing the image 406 from the image sensor 408. In this example, the controller 106 may determine, infer, and/or identify one or more display parameters of the display device 402 based, at least in part, on the data representing the image 406. Examples of such display parameters include, but are not limited to: the MTF of the display device 402, the color uniformity measurement of the display device 402, the resolution (e.g., angular resolution) of the display device 402, the sharpness measurement of the display device 402, the brightness uniformity measurement of the display device 402, combinations or variations of one or more of the above, and/or any other suitable display parameter.
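As a hedged illustration of how one such display parameter could be computed from sensor data, the sketch below implements a simple brightness-uniformity metric (minimum over maximum of regional mean luminance); the 3 x 3 grid and the metric are common conventions, not necessarily those of the disclosure.

```python
import numpy as np

def luminance_uniformity(image: np.ndarray, grid=(3, 3)) -> float:
    """Return a brightness-uniformity score in (0, 1].

    Splits the sensor image into grid regions, averages luminance per
    region, and reports min/max; 1.0 means perfectly uniform.
    """
    row_blocks = np.array_split(image, grid[0], axis=0)
    means = [block.mean()
             for rows in row_blocks
             for block in np.array_split(rows, grid[1], axis=1)]
    return float(min(means) / max(means))

# Example with a synthetic frame that is slightly dimmer at the top.
frame = np.linspace(0.9, 1.0, 300).reshape(300, 1) * np.ones((300, 400))
print(round(luminance_uniformity(frame), 3))   # ~0.93
```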
In some examples, the controller 106 may use such display parameters to determine whether the display device 402 meets and/or satisfies certain quality criteria. Additionally or alternatively, the controller 106 may provide such display parameters to a computing device and/or user interface (e.g., a monitor) operated by and/or accessible to a technician responsible for ensuring certain quality criteria of the display device 402. In one example, the controller 106 may devise a strategy for correcting one or more defects detected in the display device 402 based at least in part on the display parameters. Further, the controller 106 may initiate and/or perform one or more actions directed to correcting the one or more defects detected in the display device 402. For example, the controller 106 may notify and/or inform a technician how to correct and/or resolve the one or more defects detected in the display device 402.
Fig. 5 illustrates an exemplary embodiment of the variable compensation element 104. In some examples, the exemplary embodiment of the variable compensation element 104 in fig. 5 may include and/or represent a phase plate array having a set of PBP optical elements and/or correctors. As shown in fig. 5, such an embodiment of the variable compensation element 104 may include and/or represent features 502 (1), 502 (2), 502 (3), 502 (4), 502 (5), 502 (6), 502 (7), 502 (8), 502 (9), 502 (10), 502 (11), 502 (12), 502 (13), 502 (14), 502 (15), 502 (16), 502 (17), 502 (18), 502 (19), 502 (20), 502 (21), 502 (22), 502 (23), 502 (24), 502 (25), 502 (26), 502 (27), 502 (28), 502 (29), 502 (30), 502 (31), and/or 502 (32). In one example, each of the features 502 (1) through 502 (32) may include and/or represent a different and/or unique PBP optical element. In this example, the features 502 (1) through 502 (32) may include and/or represent combinations of different spherical and/or cylindrical powers.
In some examples, the controller 106 may select one of the features 502 (1) through 502 (32) of the variable compensation element 104 to apply to the image 406. As a specific example, the feature 502 (1) may include and/or represent the following combination: an SPH of 0 diopters and a CYL of 0 diopters. In this example, the feature 502 (16) may include and/or represent the following combination: an SPH of -4 diopters and a CYL of -0.5 diopters. Additionally or alternatively, the feature 502 (32) may include and/or represent the following combination: an SPH of -4 diopters and a CYL of -1.5 diopters.
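One plausible way to encode this 32-feature layout as a lookup table is sketched below; the exact ordering and the intermediate SPH/CYL values are assumptions chosen only to match the three features named above.

```python
# Assumed layout: 8 spherical powers per row, 4 cylindrical powers per column.
SPH_VALUES = [0.0, -0.5, -1.0, -1.5, -2.0, -2.5, -3.0, -4.0]  # diopters (assumed)
CYL_VALUES = [0.0, -0.5, -1.0, -1.5]                          # diopters (assumed)

features = {
    1 + col + row * len(SPH_VALUES): (sph, cyl)
    for row, cyl in enumerate(CYL_VALUES)
    for col, sph in enumerate(SPH_VALUES)
}

assert features[1] == (0.0, 0.0)     # matches feature 502(1) above
assert features[16] == (-4.0, -0.5)  # matches feature 502(16) above
assert features[32] == (-4.0, -1.5)  # matches feature 502(32) above
```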
Fig. 6 is a flow chart of an exemplary method 600 for manufacturing and/or assembling an imaging camera device that facilitates selectively compensating corrective lenses applied to a display device during testing. The steps shown in fig. 6 may incorporate and/or involve various sub-steps and/or variations consistent with the description provided above in connection with fig. 1-5.
As shown in fig. 6, the method 600 may include and/or involve a step (610) of optically coupling a display device to a conoscope configured to receive an image emitted by the display device through a corrective lens. Step 610 may be performed in a variety of ways, including any of those described above in connection with fig. 1-5. For example, a test rig manufacturer and/or contractor may optically couple a display device to a conoscope configured to receive an image emitted by the display device through a corrective lens.
The method 600 may further include and/or involve a step (620) of receiving a compensation parameter indicative of an optical effect exerted by the corrective lens on the image. Step 620 may be performed in a variety of ways, including any of those described above in connection with fig. 1-5. For example, a controller may be coupled to a variable compensation element positioned in the optical path of the conoscope. In this example, the controller may receive a compensation parameter that represents the optical effect exerted by the corrective lens on the image.
The method 600 may further include and/or involve a step (630) of selecting, based at least in part on the compensation parameter, a feature of the variable compensation element that compensates for the optical effect. Step 630 may be performed in a variety of ways, including any of those described above in connection with fig. 1-5. For example, the controller may select, based at least in part on the compensation parameter, a feature of the variable compensation element that compensates for the optical effect.
The method 600 may further include and/or involve a step (640) of applying the feature of the variable compensation element to the image in the conoscope. Step 640 may be performed in a variety of ways, including any of those described above in connection with fig. 1-5. For example, the controller may apply the feature of the variable compensation element to the image in the conoscope.
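Putting steps 610 through 640 together, a compact end-to-end sketch of the test flow might look as follows; every object and method name here is hypothetical glue code, not an API from the disclosure.

```python
def run_display_test(conoscope, controller, display, image_sensor):
    """Sketch of the flow in fig. 6 using hypothetical component APIs."""
    conoscope.couple_to(display)                        # step 610
    params = controller.receive_compensation_params()   # step 620
    feature = controller.select_feature(params)         # step 630
    controller.apply_feature(feature)                   # step 640
    frame = image_sensor.capture()                      # sense compensated image
    return controller.compute_display_params(frame)     # e.g., MTF, uniformity
```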
Embodiments of the present disclosure may include or be implemented in conjunction with various types of artificial reality systems. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, and may include, for example, virtual reality, augmented reality, mixed reality, hybrid reality, or some combination and/or derivative thereof. Artificial reality content may include entirely computer-generated content, or computer-generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or multiple channels (e.g., stereoscopic video that produces a three-dimensional (3D) effect for the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof that are used, for example, to create content in an artificial reality and/or are otherwise used in an artificial reality (e.g., to perform activities in an artificial reality).
The artificial reality system may be implemented in a variety of different form factors and configurations. Some artificial reality systems may be designed to operate without a near-eye display (NED). Other artificial reality systems may include NEDs that also provide visibility to the real world (e.g., augmented reality system 700 in FIG. 7) or that visually immerse the user in artificial reality (e.g., virtual reality system 800 in FIG. 8). While some artificial reality devices may be stand-alone systems, other artificial reality devices may communicate and/or cooperate with external devices to provide an artificial reality experience to a user. Examples of such external devices include a handheld controller, a mobile device, a desktop computer, a device worn by a user, a device worn by one or more other users, and/or any other suitable external system.
Turning to fig. 7, the augmented reality system 700 may include an eyeglass device 702 having a frame 710 configured to hold a left display device 715 (A) and a right display device 715 (B) in front of a user's eyes. The display devices 715 (A) and 715 (B) may act together or independently to present an image or series of images to a user. Although the augmented reality system 700 includes two displays, embodiments of the present disclosure may be implemented in augmented reality systems having a single NED or more than two NEDs.
In some examples, the augmented reality system 700 may include one or more sensors, such as sensor 740. The sensor 740 may generate measurement signals in response to movement of the augmented reality system 700 and may be located substantially anywhere on the frame 710. The sensor 740 may represent one or more of a variety of different sensing mechanisms, such as a position sensor, an inertial measurement unit (inertial measurement unit, IMU), a depth camera assembly, a structured light emitter and/or detector, or any combination thereof. In some examples, the augmented reality system 700 may or may not include the sensor 740, or may include more than one sensor. In examples where the sensor 740 includes an IMU, the IMU may generate calibration data based on measurement signals from the sensor 740. Examples of the sensor 740 may include, but are not limited to: accelerometers, gyroscopes, magnetometers, other suitable types of sensors that detect motion, sensors for error correction of the IMU, or some combination thereof.
In some examples, the augmented reality system 700 may also include a microphone array having a plurality of acoustic transducers 720 (A) through 720 (J), collectively referred to as the acoustic transducers 720. The acoustic transducers 720 may represent transducers that detect changes in air pressure caused by sound waves. Each acoustic transducer 720 may be configured to detect sound and convert the detected sound into an electronic format (e.g., an analog or digital format). The microphone array in fig. 7 may, for example, include ten acoustic transducers: 720 (A) and 720 (B), which may be designed to be placed within the respective ears of the user; acoustic transducers 720 (C), 720 (D), 720 (E), 720 (F), 720 (G), and 720 (H), which may be positioned at various locations on the frame 710; and/or acoustic transducers 720 (I) and 720 (J), which may be positioned on a corresponding neck strap 705.
In some examples, one or more of the acoustic transducers 720 (A) through 720 (J) may function as output transducers (e.g., speakers). For example, the acoustic transducers 720 (A) and/or 720 (B) may be earbuds or any other suitable type of headphone or speaker.
The configuration of the individual acoustic transducers 720 in the microphone array may vary. Although the augmented reality system 700 is shown in fig. 7 as having ten acoustic transducers 720, the number of acoustic transducers 720 may be more or less than ten. In some examples, using a greater number of acoustic transducers 720 may increase the amount of audio information collected and/or increase the sensitivity and accuracy of the audio information. In contrast, using a fewer number of acoustic transducers 720 may reduce the computational power required by the associated controller 750 to process the collected audio information. In addition, the location of each acoustic transducer 720 in the microphone array may vary. For example, the locations of the acoustic transducers 720 may include defined locations on the user, defined coordinates on the frame 710, orientations associated with each acoustic transducer 720, or some combination thereof.
Acoustic transducers 720 (A) and 720 (B) may be positioned on different parts of the user's ear, such as behind the pinna, behind the tragus, and/or within the auricle or the ear fossa. Alternatively, there may be additional acoustic transducers 720 on or around the ear in addition to the acoustic transducers 720 within the ear canal. Positioning an acoustic transducer 720 near the ear canal of the user may enable the microphone array to collect information about how sounds reach the ear canal. By positioning at least two of the acoustic transducers 720 on both sides of the user's head (e.g., as binaural microphones), the augmented reality system 700 may simulate binaural hearing and capture a 3D stereo sound field around the user's head. In some examples, acoustic transducers 720 (A) and 720 (B) may be connected to the augmented reality system 700 via a wired connection 730, while in other embodiments the acoustic transducers 720 (A) and 720 (B) may be connected to the augmented reality system 700 via a wireless connection (e.g., a BLUETOOTH connection). In other embodiments, acoustic transducers 720 (A) and 720 (B) may not be used at all in conjunction with the augmented reality system 700.
The acoustic transducers 720 on the frame 710 may be positioned in a variety of different ways, including along the length of the temples, across the bridge, above or below the display devices 715 (A) and 715 (B), or some combination thereof. The acoustic transducers 720 may also be oriented such that the microphone array is able to detect sounds in a wide range of directions around a user wearing the augmented reality system 700. In some examples, an optimization process may be performed during manufacture of the augmented reality system 700 to determine the relative positioning of the individual acoustic transducers 720 in the microphone array.
In some examples, the augmented reality system 700 may include or be connected to an external device (e.g., a pairing device), such as a neck strap 705. Neck strap 705 generally represents any type or form of mating device. Accordingly, the following discussion of neck strap 705 may also apply to a variety of other paired devices, such as charging boxes, smartwatches, smartphones, bracelets, other wearable devices, hand-held controllers, tablet computers, laptop computers, other external computing devices, and the like.
As shown, the neck strap 705 may be coupled to the eyewear device 702 by one or more connectors. These connectors may be wired or wireless and may include electronic components and/or non-electronic components (e.g., structural components). In some cases, the eyewear device 702 and the neck strap 705 may operate independently without any wired or wireless connection between them. Although fig. 7 shows the components in the eyewear device 702 and the components in the neck strap 705 at example locations on the eyewear device 702 and the neck strap 705, the components may be located elsewhere on the eyewear device 702 and/or the neck strap 705 and/or distributed across the eyewear device and/or the neck strap in a different manner. In some examples, the components in the eyewear device 702 and the neck strap 705 may be located on one or more additional peripheral devices that are paired with the eyewear device 702, the neck strap 705, or some combination thereof.
Pairing an external device (e.g., neck strap 705) with an augmented reality eyewear device may enable the eyewear device to implement the form factor of a pair of eyewear while still providing sufficient battery power and computing power for the extended capabilities. Some or all of the battery power, computing resources, and/or additional features of the augmented reality system 700 may be provided by, or shared between, the paired device and the eyeglass device, thereby generally reducing the weight, heat distribution, and form factor of the eyeglass device while still maintaining the desired functionality. For example, the neck strap 705 may allow for multiple components to be otherwise included in the eyeglass apparatus to be included in the neck strap 705, as users may bear heavier weight loads on their shoulders than they bear on their heads. The neck strap 705 may also have a large surface area to spread and dissipate heat to the surrounding environment through the large surface area. Thus, the neck strap 705 may allow for greater battery power and greater computing power than would otherwise be possible on a stand-alone eyeglass device. Because the weight carried in neck strap 705 may be less invasive to the user than the weight carried in eyeglass device 702, the user may endure wearing a lighter eyeglass device and carrying or wearing a paired device for a longer period of time than a user would endure wearing a heavy, independent eyeglass device, thereby enabling the user to more fully integrate the artificial reality environment into their daily activities.
The neck strap 705 may be communicatively coupled with the eyewear device 702 and/or with other devices. These other devices may provide certain functions (e.g., tracking, positioning, depth mapping, processing, storage, etc.) to the augmented reality system 700. In the example of fig. 7, the neck strap 705 may include two acoustic transducers (e.g., 720 (I) and 720 (J)) that are part of the microphone array (or potentially form their own microphone sub-array). The neck strap 705 may also include a controller 725 and a power supply 735.
Acoustic transducers 720 (I) and 720 (J) in the neck strap 705 may be configured to detect sound and convert the detected sound into an electronic format (analog or digital). In the example of fig. 7, acoustic transducers 720 (I) and 720 (J) may be positioned on the neck strap 705, thereby increasing the distance between them and the other acoustic transducers 720 positioned on the eyewear device 702. In some cases, increasing the distance between the acoustic transducers 720 in the microphone array may improve the accuracy of beamforming performed via the microphone array. For example, if a sound is detected by acoustic transducers 720 (C) and 720 (D), and the distance between acoustic transducers 720 (C) and 720 (D) is greater than the distance between acoustic transducers 720 (D) and 720 (E), the determined source location of the detected sound may be more accurate than if the sound had been detected by acoustic transducers 720 (D) and 720 (E).
The controller 725 in the neck strap 705 may process information generated by the sensors on the neck strap 705 and/or the augmented reality system 700. For example, the controller 725 may process information from the microphone array that describes sounds detected by the microphone array. For each detected sound, the controller 725 may perform a direction-of-arrival (DOA) estimation to estimate the direction from which the detected sound arrived at the microphone array. As the microphone array detects sounds, the controller 725 may populate an audio data set with this information. In examples where the augmented reality system 700 includes an inertial measurement unit (IMU), the controller 725 may perform all inertial and spatial calculations based on data from the IMU located on the eyewear device 702. A connector may convey information between the augmented reality system 700 and the neck strap 705, and between the augmented reality system 700 and the controller 725. The information may be in the form of optical data, electrical data, wireless data, or any other transmittable data form. Moving the processing of information generated by the augmented reality system 700 to the neck strap 705 may reduce weight and heat in the eyewear device 702, making it more comfortable for the user.
A power supply 735 in the neck strap 705 may provide power to the eyewear device 702 and/or the neck strap 705. The power supply 735 may include, without limitation, a lithium-ion battery, a lithium-polymer battery, a primary lithium battery, an alkaline battery, or any other form of power storage. In some cases, the power supply 735 may be a wired power source. Including the power supply 735 on the neck strap 705 instead of on the eyewear device 702 may help better distribute the weight and heat generated by the power supply 735.
As mentioned, some artificial reality systems may, instead of blending an artificial reality with actual reality, substantially replace one or more of a user's sensory perceptions of the real world with a virtual experience. One example of this type of system is a head-worn display system that mostly or completely covers a user's field of view, such as the virtual reality system 800 in fig. 8. The virtual reality system 800 may include a front rigid body 802 and a band 804 shaped to fit around a user's head. The virtual reality system 800 may also include output audio transducers 806 (A) and 806 (B). Furthermore, although not shown in fig. 8, the front rigid body 802 may include one or more electronic elements, including one or more electronic displays, one or more inertial measurement units (IMUs), one or more tracking emitters or detectors, and/or any other suitable device or system for creating an artificial reality experience.
Artificial reality systems may include a variety of types of visual feedback mechanisms. For example, display devices in the augmented reality system 700 and/or the virtual reality system 800 may include one or more liquid crystal displays (LCDs), light emitting diode (LED) displays, micro-LED displays, organic LED (OLED) displays, digital light projection (DLP) micro-displays, liquid crystal on silicon (LCoS) micro-displays, and/or any other suitable type of display screen. These artificial reality systems may include a single display screen for both eyes or may provide a display screen for each eye, which may allow for additional flexibility for varifocal adjustments or for correcting a user's refractive error. Some of these artificial reality systems may also include optical subsystems having one or more lenses (e.g., concave or convex lenses, Fresnel lenses, adjustable liquid lenses, etc.) through which a user may view a display screen. These optical subsystems may serve a variety of purposes, including collimating light (e.g., making an object appear at a greater distance than its physical distance), magnifying light (e.g., making an object appear larger than its actual size), and/or relaying light (e.g., to the viewer's eyes). These optical subsystems may be used in a non-pupil-forming (direct-view) architecture (e.g., a single-lens configuration that directly collimates light but results in so-called pincushion distortion) and/or a pupil-forming architecture (e.g., a multi-lens configuration that produces so-called barrel distortion to nullify pincushion distortion).
In addition to or instead of using display screens, some of the artificial reality systems described herein may include one or more projection systems. For example, display devices in the augmented reality system 700 and/or the virtual reality system 800 may include micro-LED projectors that project light (e.g., using a waveguide) into display devices, such as transparent combiner lenses, that allow ambient light to pass through. The display devices may refract the projected light toward a user's pupil and may enable a user to simultaneously view both artificial reality content and the real world. The display devices may accomplish this using any of a variety of different optical components, including waveguide components (e.g., holographic, planar, diffractive, polarized, and/or reflective waveguide elements), light-manipulation surfaces and elements (e.g., diffractive, reflective, and refractive elements and gratings), coupling elements, etc. Artificial reality systems may also be configured with any other suitable type or form of image projection system, such as retinal projectors used in virtual retina displays.
The artificial reality systems described herein may also include various types of computer vision components and subsystems. For example, the augmented reality system 700 and/or the virtual reality system 800 may include one or more optical sensors, such as two-dimensional (2D) or 3D cameras, structured light transmitters and detectors, time-of-flight depth sensors, single-beam or sweeping laser rangefinders, 3D LiDAR sensors, and/or any other suitable type or form of optical sensor. An artificial reality system may process data from one or more of these sensors to identify a user's location, map the real world, provide a user with context about real-world surroundings, and/or perform a variety of other functions.
The artificial reality systems described herein may also include one or more input and/or output audio transducers. Output audio transducers may include voice coil speakers, ribbon speakers, electrostatic speakers, piezoelectric speakers, bone conduction transducers, cartilage conduction transducers, tragus-vibration transducers, and/or any other suitable type or form of audio transducer. Similarly, input audio transducers may include condenser microphones, dynamic microphones, ribbon microphones, and/or any other type or form of input transducer. In some examples, a single transducer may be used for both audio input and audio output.
In some examples, the artificial reality systems described herein may also include tactile (i.e., haptic) feedback systems, which may be incorporated into headwear, gloves, clothing, hand-held controllers, environmental devices (e.g., chairs, floor mats, etc.), and/or any other type of device or system. Haptic feedback systems may provide various types of cutaneous feedback, including vibration, force, traction, texture, and/or temperature. Haptic feedback systems may also provide various types of kinesthetic feedback, such as motion and compliance. Haptic feedback may be implemented using motors, piezoelectric actuators, fluidic systems, and/or a variety of other types of feedback mechanisms. Haptic feedback systems may be implemented independently of other artificial reality devices, within other artificial reality devices, and/or in conjunction with other artificial reality devices.
By providing haptic perception, auditory content, and/or visual content, an artificial reality system may create a complete virtual experience or enhance a user's real-world experience in various contexts and environments. For example, an artificial reality system may assist or extend a user's perception, memory, or cognition in a particular environment. Some systems may enhance user interaction with others in the real world or may enable more immersive interaction with others in the virtual world. The artificial reality system may also be used for educational purposes (e.g., for teaching or training in schools, hospitals, government institutions, military institutions, businesses, etc.), entertainment purposes (e.g., for playing video games, listening to music, watching video content, etc.), and/or for accessibility purposes (e.g., as a hearing aid, visual aid, etc.). Examples disclosed herein may enable or enhance a user's artificial reality experience in one or more of these contexts and environments, and/or in other contexts and environments.
The process parameters and sequence of steps described and/or illustrated herein are given by way of example only and may be varied as desired. For example, although the steps illustrated and/or described herein may be illustrated or discussed in a particular order, the steps need not be performed in the order illustrated or discussed. Various example methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
The preceding description has been provided to enable others skilled in the art to best utilize various aspects of the exemplary embodiments disclosed herein. This exemplary description is not intended to be exhaustive or to be limited to any precise form disclosed. Many modifications and variations are possible without departing from the scope of this disclosure. The embodiments disclosed herein should be considered in all respects illustrative and not restrictive. Reference should be made to any claims appended hereto and their equivalents in determining the scope of the present disclosure.
Unless otherwise noted, the terms "connected to" and "coupled to" (and their derivatives), as used in the specification and/or claims, are to be construed as permitting both direct and indirect (i.e., via other elements or components) connection. In addition, the terms "a" or "an," as used in the specification and/or claims, are to be construed as meaning "at least one of." Finally, for ease of use, the terms "including" and "having" (and their derivatives), as used in the specification and/or claims, are interchangeable with and have the same meaning as the word "comprising."

Claims (13)

1. An apparatus, comprising:
an axicon configured to receive an image emitted by a display device through a corrective lens;
a variable compensation element coupled to the axicon, wherein the variable compensation element is capable of selectively modifying the image emitted by the display device to compensate for optical effects exerted on the image by the corrective lens; and
a controller coupled to the variable compensation element, wherein the controller:
receives a compensation parameter representative of the optical effect exerted by the corrective lens on the image;
selects a characteristic of the variable compensation element that compensates for the optical effect based at least in part on the compensation parameter; and
applies the characteristic of the variable compensation element to the image.
2. The apparatus of claim 1, further comprising: an image sensor coupled to the variable compensation element, wherein:
the image sensor is configured to sense the image as compensated by the characteristic of the variable compensation element; and
the controller:
receives data from the image sensor representative of the image sensed by the image sensor; and
determines a display parameter of the display device based at least in part on the data representative of the image.
3. The apparatus of claim 2, wherein the display parameters include at least one of:
a modulation transfer function of the display device;
a color uniformity measurement of the display device;
a resolution of the display device;
a sharpness measurement of the display device; or
a brightness uniformity measurement of the display device.
4. The apparatus of claim 2 or 3, wherein:
the variable compensation element includes a phase plate array including a plurality of selectable features; and
the controller selects one of the plurality of selectable features included on the phase plate array for application to the image before the image reaches the image sensor.
5. The apparatus of claim 4, wherein the plurality of selectable features included on the phase plate array comprises at least one Pancharatnam-Berry phase optic of a set of Pancharatnam-Berry phase optics.
6. The apparatus of claim 4 or 5, further comprising a phase plate positioning mechanism communicatively coupled to the controller and configured to move the phase plate array;
wherein the controller directs the phase plate positioning mechanism to move the phase plate array to a position that causes the one of the plurality of selectable features to be applied to the image as the image passes through the axicon.
7. The apparatus of claim 6, wherein:
the phase plate positioning mechanism includes one or more actuators;
the axicon includes an optical train positioned in an optical path of the image; and
the phase plate positioning mechanism engages the one or more actuators to move the phase plate array relative to the optical train such that the one of the plurality of selectable features is applied to the image as the image traverses the optical path within the axicon.
8. The apparatus of any of claims 2-7, wherein the controller provides the display parameter of the display device to a user interface or computing device for evaluation.
9. The apparatus of any preceding claim, wherein the compensation parameter comprises at least one of:
a spherical power of the corrective lens;
a cylindrical power of the corrective lens; or
a cylindrical axis orientation of the corrective lens.
10. The apparatus of any preceding claim, wherein the axicon comprises:
a collection lens configured to collect light representative of the image emitted by the display device; and
an imaging lens configured to form the image from the light for presentation on the image sensor.
11. The apparatus of claim 10, wherein the collection lens collects the light emitted by the display device across a range of viewing angles.
12. A system, comprising:
a display device including a corrective lens; and
an imaging camera device optically coupled to the display device, wherein the imaging camera device comprises the apparatus of any preceding claim.
13. A method, comprising:
optically coupling a display device to an axicon configured to receive an image emitted by the display device through a corrective lens;
receiving, by a controller coupled to a variable compensation element positioned in an optical path of the axicon, a compensation parameter representative of at least one optical effect exerted by the corrective lens on the image;
selecting, by the controller, a characteristic of the variable compensation element that compensates for the optical effect based at least in part on the compensation parameter; and
applying, by the controller, the characteristic of the variable compensation element to the image in the axicon.
CN202280043295.3A 2021-06-18 2022-06-16 Apparatus, system, and method for selectively compensating corrective lenses applied to a display device during testing Pending CN117501084A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US63/212,260 2021-06-18
US17/721,760 US20220404608A1 (en) 2021-06-18 2022-04-15 Apparatus, system, and method for selectively compensating for corrective lenses applied to display devices during testing
US17/721,760 2022-04-15
PCT/US2022/033874 WO2022266380A1 (en) 2021-06-18 2022-06-16 Apparatus, system, and method for selectively compensating for corrective lenses applied to display devices during testing

Publications (1)

Publication Number Publication Date
CN117501084A true CN117501084A (en) 2024-02-02

Family

ID=89667684

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280043295.3A Pending CN117501084A (en) 2021-06-18 2022-06-16 Apparatus, system, and method for selectively compensating corrective lenses applied to a display device during testing

Country Status (1)

Country Link
CN (1) CN117501084A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination