CN116583777A - Dispensing system - Google Patents

Dispensing system

Info

Publication number
CN116583777A
CN116583777A (application CN202180068070.9A)
Authority
CN
China
Prior art keywords
lens
user
sample
lens groups
station
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180068070.9A
Other languages
Chinese (zh)
Inventor
A·V·米拉贝拉
F·C·王
A·G·韦伯
D·B·帕斯穆伊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc filed Critical Apple Inc
Publication of CN116583777A publication Critical patent/CN116583777A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01MTESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02Testing optical properties
    • G01M11/0242Testing optical properties by measuring geometrical properties or aberrations
    • G01M11/0257Testing optical properties by measuring geometrical properties or aberrations by analyzing the image formed by the object to be tested
    • G01M11/0264Testing optical properties by measuring geometrical properties or aberrations by analyzing the image formed by the object to be tested by using targets or reference patterns
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/02Mountings, adjusting means, or light-tight connections, for optical elements for lenses
    • G02B7/14Mountings, adjusting means, or light-tight connections, for optical elements for lenses adapted to interchange lenses
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/22Social work or social welfare, e.g. community support activities or counselling services

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Optics & Photonics (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Analytical Chemistry (AREA)
  • Economics (AREA)
  • Chemical & Material Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Child & Adolescent Psychology (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Eyeglasses (AREA)

Abstract

Different users of head-mounted devices have different needs for vision correction. A system can be provided to determine the corrective lens that is most appropriate for a given user. A dispenser can contain a plurality of different lenses that provide different types of vision correction. The dispenser can provide an appropriate one of the lenses and the user can use the lens with the head-mounted device during the experience session. During the experience session, the user can verify that the lens is satisfactory.

Description

Dispensing system
Cross Reference to Related Applications
The present application claims the benefit of U.S. provisional application No. 63/060,589, entitled "DISPENSING SYSTEM," filed on August 3, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present description relates generally to dispensing systems, including dispensing systems that relate to components for use with head-mounted devices.
Background
A user may wear a head-mounted device to view visual information displayed within the user's field of view. The head-mounted device may be used as a Virtual Reality (VR) system, an Augmented Reality (AR) system, and/or a Mixed Reality (MR) system. The user may observe outputs provided by the head-mounted device, such as visual information provided on a display. The display may optionally allow the user to view the environment external to the head-mounted device. Other outputs provided by the head-mounted device may include speaker output and/or haptic feedback. The user may further interact with the head-mounted device by providing inputs for processing by one or more components of the head-mounted device. For example, a user may provide tactile inputs, voice commands, and other inputs while the device is mounted to the user's head.
Drawings
Some features of the subject technology are set forth in the following claims. However, for purposes of explanation, several embodiments of the subject technology are set forth in the following figures.
FIG. 1 illustrates an exemplary environment in which an HMD presentation system may be implemented in accordance with one or more implementations.
FIG. 2 illustrates a flow diagram of an exemplary process for an input device in accordance with one or more implementations.
FIG. 3 illustrates a flow diagram of an exemplary process for a dispenser in accordance with one or more implementations.
FIG. 4 illustrates a flow diagram of an exemplary process for a head-mounted device in accordance with one or more implementations.
Fig. 5 illustrates an exemplary network environment in which an HMD presentation system may be implemented in accordance with one or more implementations.
Fig. 6 illustrates an exemplary electronic device that may be used in an HMD presentation system in accordance with one or more implementations.
Fig. 7 illustrates a top view of a head mounted device and a lens device according to some embodiments of the present disclosure.
Fig. 8 illustrates a top view of the head-mounted device of fig. 7 with the lens device of fig. 7 mounted therein, according to some embodiments of the present disclosure.
Fig. 9 illustrates a block diagram of a head mounted device according to some embodiments of the present disclosure.
Detailed Description
The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the subject technology may be practiced. The accompanying drawings are incorporated in and constitute a part of this specification. The specific embodiments include specific details for the purpose of providing a thorough understanding of the subject technology. However, it will be clear and apparent to one skilled in the art that the subject technology is not limited to the specific details shown herein and may be practiced without these specific details. In some instances, well-known structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology.
A head-mounted device, such as a head-mounted display, headset, goggles, smart glasses, head-up display, etc., may perform a series of functions managed by components (e.g., sensors, circuitry, and other hardware) included in the wearable device.
The visual output characteristics of the head-mounted device may be provided in a manner that accommodates the user's vision (including vision impairment and/or vision correction requirements). For example, the head-mounted device may include or may be combined with a corrective lens that allows the user to properly view the visual output characteristics of the head-mounted device. To allow a given head mounted device to be used by different users, corrective lenses may be provided as separate modules that are attachable, removable, and/or exchangeable with other corrective lenses. Thus, any given user may properly view visual output characteristics when using a headset with the appropriate corresponding corrective lens group.
It may be desirable to allow a user to determine which of various lenses is suitable for use with the head-mounted device, to be provided with the appropriate lens, to experience the lens, and to purchase it if satisfied. Provided in combination, these capabilities may facilitate the user's decision-making when shopping for a head-mounted device and its corresponding components.
Different users of head-mounted devices have different needs for vision correction. Systems and methods may be provided to determine the corrective lens that is best suited for a given user. Such systems and methods may include an input device for determining which of a plurality of existing corrective lenses are appropriate for a given user. The dispenser may provide a selected one of the plurality of lenses for use by a user with the head-mounted device. The headset may then be operated with the lens, including any suitable adjustment based on the selection of the lens.
These and other embodiments are discussed below with reference to fig. 1-9. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes only and should not be construed as limiting.
Fig. 1 illustrates an exemplary HMD presentation system 10 for providing a selected lens arrangement for use by a user in accordance with one or more implementations. However, not all of the depicted components may be used in all implementations, and one or more implementations may include additional or different components than those shown in the figures. Variations in the arrangement and type of these components may be made without departing from the spirit or scope of the claims set forth herein. Additional components, different components, or fewer components may be provided.
The HMD presentation system 10 may be implemented by an entity (such as a retail store, school, company, hotel, cruise ship, stadium, museum, etc.) that deploys the lens-dispensing system of the present disclosure. The facility 2 including the HMD presentation system 10 may encompass all or part of one or more structures, such as one or more retail store buildings, one or more school buildings, one or more office buildings, and the like. In fig. 1, facility 2 is shown as a retail store. The retail store may provide an inventory of devices (e.g., head-mounted devices) and/or components (e.g., lenses) for users to interact with, so that a user may decide whether to purchase one of the devices and/or components.
As shown in fig. 1, HMD presentation system 10 may include an input station 20, a distribution station 40, a presentation station 60, and/or a point-of-sale station 80. It should be appreciated that stations may be combined, for example, by providing input station 20 and distribution station 40 as a single station 12. It should also be appreciated that one or more of the stations may be omitted and/or located elsewhere (e.g., outside of facility 2). It should also be appreciated that one or more additional stations may be included.
Input station 20 may facilitate determining a user's need for corrective lenses. For example, input station 20 may include a device that determines which of a variety of lenses are suitable for use with a head mounted device.
In some examples, input station 20 may include an input device 22 that provides a user interface for inputting information about a given user. The input device 22 may include, for example, a touch screen, keyboard, mouse, microphone, camera, etc. The input device 22 may present selectable elements (e.g., from a menu) or another input format (e.g., text, handwriting, etc.). The input device 22 may also include output components such as a display, speakers, haptic devices, and the like. The input device 22 may be operated by the user or another person to input vision information related to the user's vision condition and/or the need for vision correction. Such information may include diagnostics, test results, prescriptions, and/or user preferences. The user may optionally enter the user's own information so that the information is not necessarily available to another person.
In some examples, input station 20 may include and/or interact with a user's own user device 24. The user device may include a phone, a tablet, a computer, a laptop, a watch, a wearable device, a head mounted device, or another electronic device. Information relating to the user's vision condition and/or the need for vision correction may be stored on or accessible by the user device 24. Such information may include diagnostics, test results, prescriptions, and/or user preferences. The user may operate the user device 24 to authorize transmission of such information within the HMD presentation system 10.
In some examples, input station 20 may include a detector 32 that detects a condition of a corrective lens and/or the user. Such detection may be performed within the facility 2, so that the user does not need to bring vision correction information before entering the facility 2. As shown in fig. 1, the detector 32 may be operated to analyze an existing corrective lens of eyeglasses, contact lenses, or other eyewear. The reference lens 30 may be user-owned and provide the vision correction needed and/or desired by the user. The detector 32 may include a camera 34 that captures an image of a reference pattern 36 as viewed through the reference lens 30. Based on the image captured by the camera 34 and the known reference pattern 36, the detector 32 may determine the optical effect of the reference lens 30 and thus the type of the reference lens 30. Such detection may be used to determine information related to the user's vision condition and/or need for vision correction.
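The pattern-based detection described above can be sketched in code. The following is a minimal illustrative example, not the patent's implementation: it assumes the detector compares the apparent spacing of a known grid pattern seen through the lens against the spacing observed with no lens, and classifies the gross lens type from the resulting magnification. The function name and tolerance value are assumptions introduced for illustration.

```python
# Hypothetical sketch: classify a reference lens from the magnification it
# imparts on a known reference pattern. A converging (positive) lens
# magnifies the pattern; a diverging (negative) lens minifies it.

def estimate_lens_type(spacing_through_lens_px: float,
                       spacing_without_lens_px: float,
                       tolerance: float = 0.02) -> str:
    """Infer the gross lens type from the observed pattern magnification."""
    magnification = spacing_through_lens_px / spacing_without_lens_px
    if abs(magnification - 1.0) <= tolerance:
        return "plano"  # no significant correction detected
    return "positive" if magnification > 1.0 else "negative"

print(estimate_lens_type(11.0, 10.0))  # magnified pattern -> "positive"
print(estimate_lens_type(9.0, 10.0))   # minified pattern -> "negative"
```

A production detector would of course analyze full image distortion (including cylinder axis) rather than a single spacing ratio; this sketch only conveys the principle of inferring lens properties from a known pattern.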
It should be appreciated that input station 20 may include other devices such as a detector that measures one or more characteristics of a user's own eyes to determine a user's vision condition and/or a need for vision correction. It should also be appreciated that multiple input types may be combined to verify and/or confirm visual information from one type to another.
As further shown in fig. 1, the dispensing station 40 may include a dispenser 42 that contains a plurality of sample lens groups 200 and dispenses a selected one or more of the sample lens groups 200. The sample lens group 200 may include multiple types corresponding to different types of optical effects and/or vision correction. For example, each of the sample lens groups 200 may have a spherical surface, a cylindrical surface, and/or another type of correction. The correction differences between the plurality of sample lens groups 200 may include variations in correction type, diopter, axis of correction, and the like. Various corrective combinations may be combined for some or all of the sample lens groups 200. Each correction type or combination of correction types may be identified by a lens type. For example, each of the sample lens groups 200 may have a known type of correction based on the identity of the lens. A corresponding identifier, such as a stock keeping unit ("SKU"), may be assigned for reference.
The selection from sample lens group 200 may be based on information from input station 20. In the event input station 20 determines the type of correction desired, a corresponding sample lens group 200 may be further determined. For example, one of the plurality of sample lens groups 200 may be selected from the dispensing station 40. The selection may be based on information (e.g., diagnosis, test results, prescription, and/or user preferences) that determines which of sample lens groups 200 most closely matches that collected at input station 20.
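The closest-match selection described above can be sketched as follows. This is an illustrative assumption of how such a selection might work: the SKU strings, field names, and the simple diopter-space distance metric are all hypothetical and not taken from the patent.

```python
# Hypothetical sketch: pick the sample lens group whose correction most
# closely matches the prescription gathered at the input station.
from dataclasses import dataclass

@dataclass(frozen=True)
class SampleLensGroup:
    sku: str         # illustrative stock keeping unit identifier
    sphere: float    # spherical power, diopters
    cylinder: float  # cylindrical power, diopters

def select_closest(groups, sphere: float, cylinder: float) -> SampleLensGroup:
    """Return the group minimizing a simple squared distance in diopter space."""
    return min(groups, key=lambda g: (g.sphere - sphere) ** 2
                                     + (g.cylinder - cylinder) ** 2)

inventory = [
    SampleLensGroup("LENS-0000", 0.0, 0.0),
    SampleLensGroup("LENS-M200", -2.0, 0.0),
    SampleLensGroup("LENS-M200C050", -2.0, -0.5),
]
print(select_closest(inventory, sphere=-2.25, cylinder=-0.5).sku)
# -> LENS-M200C050 (nearest available discrete correction)
```

Note how a -2.25 D prescription maps to the nearest discrete sample (-2.00 D): the sample inventory need not exactly match the user's correction, consistent with the discussion of discrete corrective capabilities below.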
It should be appreciated that any number of sample lens groups 200 may be provided by the dispensing station 40. However, the sample lens groups 200 provided at the dispensing station 40 may still be limited in number and thus offer discrete corrective capabilities, such that any two sample lens groups 200 are separated by a corrective difference. It should also be appreciated that the sample lens groups 200 may be provided for experience purposes and need not exactly match the vision correction needs of the user. Thus, the number of sample lens groups 200 available for demonstration purposes may be less than the number of user lens groups available for purchase, production, and/or delivery to a user. For example, the sample lens groups 200 of the dispensing station 40 may be provided for experience purposes, while the user is offered the ability to purchase custom lenses and/or lenses from a larger inventory, in which different lenses are separated by smaller differences in vision correction capability.
As shown in fig. 1, the dispensing station 40 may include a dispenser 42 that contains sample lens groups 200 and provides the ability to dispense one or more of the sample lens groups 200 to a user. For example, dispenser 42 may identify any given one of sample lens groups 200 based on an identifier corresponding to or otherwise associated with information received from input station 20. The dispenser 42 may dispense a selected one of the sample lens groups 200 by one or more of a variety of mechanisms. For example, the dispenser 42 may indicate to a user which of the sample lens groups 200 are to be used for a given user. By way of further example, the dispenser 42 may move one of the sample lens groups 200 to a user accessible location or otherwise make such sample lens group 200 available. The corresponding mechanisms may include actuators, belts, arms, and/or trays for moving the sample lens assembly 200 and/or presenting the sample lens assembly to a user. By way of further example, the dispenser 42 may mount a selected one of the sample lens groups 200 to the head-mounted device 100. Thus, the dispenser 42 can dispense both the sample lens group 200 and the headset 100 for use by a user.
Additionally or alternatively, other accessories and/or components may be assigned for use with the headset 100. For example, the dispenser 42 or another dispenser and/or inventory source may dispense or otherwise provide accessories and/or components that are specific and/or customized for a particular user. Such accessories and/or components may include a light seal, a head fixation element, and/or another piece of equipment fitted to the user. Additionally or alternatively, optional equipment for use with the headset 100 may be provided based on input and/or user selection.
It should be appreciated that the dispenser 42 may provide more than one sample lens group 200 for experience by a user. For example, a plurality of sample lens groups 200 representing the closest approximation range of the desired correction may be provided for experience by the user. The user may then be provided with an opportunity to experience each of the assigned sample lens groups 200 with the headset 100 and decide which sample lens group is preferred.
As further shown in fig. 1, the HMD presentation system 10 may also include a presentation station 60. Presentation station 60 may include space and/or equipment to facilitate user operation of headset 100 (e.g., a sample headset). If not previously provided (e.g., at the dispensing station 40), the presentation station 60 may provide a head mounted device. In the presentation station 60, a user may utilize the sample lens group 200 and/or any other equipment to operate the headset 100 as part of a presentation of the experience of such equipment. The user may be provided with the ability to select one or more sample lens groups 200 and/or another piece of equipment.
As further shown in fig. 1, the HMD presentation system 10 may also include a point-of-sale station 80. The point-of-sale station 80 may include a point-of-sale device 82 operable by a user or another person to perform a transaction based on a selected one or more of the sample lens groups 200 and/or the head-mounted device 100. For example, when the user has selected the sample lens group 200 or another piece of equipment, the user and/or another person may place an order or otherwise purchase the sample lens group 200, the head mounted device 100, and/or another piece of equipment. The generation of the order and/or purchase may be based on a determination made by input station 20 and/or a user selection.
It should be appreciated that the ordered and/or purchased lens group and/or head-mounted device may be different from the sample lens group 200 and/or the head-mounted device 100 experienced by the user in the presentation station 60. For example, upon completion of the presentation at presentation station 60, the user may return the sample lens group 200 and/or the head-mounted device 100 to their source (e.g., dispensing station 40). Optionally, the sample lens group 200 and/or the head-mounted device 100 may be handled and/or cleaned by a staff member in preparation for use by other users. When a user places an order and/or makes a purchase, a different lens group (e.g., a user lens group) and/or head-mounted device may be provided, ordered, and/or delivered to the user. As discussed herein, the user's ordered and/or purchased lens group may differ from the experienced lens group because the ordered and/or purchased lens group may be more representative of the corrective needs determined for the user. This may be achieved by offering a wider array of user lenses than is available for experience.
FIG. 2 illustrates a flow diagram of an exemplary process 300 for an input device in accordance with one or more implementations. For purposes of explanation, process 300 is described herein primarily with reference to the apparatus of input station 20 of fig. 1. However, process 300 is not limited to input station 20 of fig. 1, and one or more blocks (or operations) of process 300 may be performed by one or more other components, devices, and/or stations. The devices of input station 20 are also presented as exemplary devices, and the operations described herein may be performed by any suitable device. For further explanation purposes, the blocks of process 300 are described herein as occurring sequentially or linearly. However, multiple blocks of process 300 may occur in parallel. Furthermore, the blocks of process 300 need not be performed in the order shown, and/or one or more blocks of process 300 need not be performed and/or may be replaced by other operations.
Process 300 may begin when the input station determines a lens type (302), as described herein. Such determination may be based on user input, record retrieval, and/or direct detection of the user and/or existing corrective lenses. The input station may then transmit an indication of the lens type for dispensing the appropriate lens (304). The transmission may include the determined information and/or an identity of the lens corresponding to the determined information.
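The two steps of process 300 can be sketched as a small serialization routine. The message fields, function name, and JSON encoding below are illustrative assumptions; the patent does not specify a transmission format.

```python
# Hypothetical sketch of process 300: determine a lens type (302) and
# transmit an indication of it for dispensing (304). The indication here
# carries both an identifier and the underlying correction parameters.
import json

def build_lens_indication(lens_sku: str, sphere: float, cylinder: float) -> str:
    """Serialize the determined lens type for transmission (step 304)."""
    return json.dumps({"sku": lens_sku, "sphere": sphere, "cylinder": cylinder})

message = build_lens_indication("LENS-M200", -2.0, 0.0)
print(message)
```

Carrying both the identifier and the correction parameters lets downstream stations (the dispenser and the head-mounted device) act on whichever form of the indication they need.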
FIG. 3 illustrates a flow diagram of an exemplary process 400 for a dispenser in accordance with one or more implementations. For purposes of explanation, the process 400 is described herein primarily with reference to the dispenser 42 of the dispensing station 40 of fig. 1. However, process 400 is not limited to dispensing station 40 of fig. 1, and one or more blocks (or operations) of process 400 may be performed by one or more other components, devices, and/or stations. The dispenser 42 of the dispensing station 40 is also presented as an exemplary device, and the operations described herein may be performed by any suitable device. For further explanation purposes, the blocks of process 400 are described herein as occurring sequentially or linearly. However, multiple blocks of process 400 may occur in parallel. Furthermore, the blocks of process 400 need not be performed in the order shown, and/or one or more blocks of process 400 need not be performed and/or may be replaced by other operations.
Process 400 may begin when an indication of a lens type is received by a dispensing station (402), as described herein. The dispenser may then dispense the lens based on the lens type (404). Alternatively, the dispenser may mount the lens on the head mounted device (406), and both the lens and the head mounted device may be dispensed together.
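The dispensing steps of process 400 can be sketched as follows. The class, method names, and inventory representation are hypothetical, introduced only to illustrate receiving an indication (402), dispensing a matching sample lens group (404), and optionally mounting it to a head-mounted device before dispensing (406).

```python
# Hypothetical sketch of process 400 at the dispensing station.

class Dispenser:
    def __init__(self, inventory: dict[str, int]):
        # Maps a lens identifier (SKU) to the count of sample lens groups held.
        self.inventory = inventory

    def dispense(self, sku: str, mount_to_hmd: bool = False) -> dict:
        """Dispense one sample lens group for the indicated lens type (404),
        optionally mounted to a head-mounted device first (406)."""
        if self.inventory.get(sku, 0) <= 0:
            raise LookupError(f"no sample lens group in stock for {sku}")
        self.inventory[sku] -= 1
        return {"sku": sku, "mounted": mount_to_hmd}

dispenser = Dispenser({"LENS-M200": 2})
print(dispenser.dispense("LENS-M200", mount_to_hmd=True))
```

A real dispenser would drive actuators, belts, arms, and/or trays as described above; this sketch only models the inventory-and-selection logic.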
FIG. 4 illustrates a flow diagram of an exemplary process 500 for a head-mounted device in accordance with one or more implementations. For purposes of explanation, the process 500 is described herein primarily with reference to the headset 100 of fig. 1. However, process 500 is not limited to head-mounted device 100 of fig. 1, and one or more blocks (or operations) of process 500 may be performed by one or more other components, devices, and/or stations. The head mounted device 100 is also presented as an exemplary device, and the operations described herein may be performed by any suitable device. For further explanation purposes, the blocks of process 500 are described herein as occurring sequentially or linearly. However, multiple blocks of process 500 may occur in parallel. Furthermore, the blocks of process 500 need not be performed in the order shown, and/or one or more blocks of process 500 need not be performed and/or may be replaced by other operations.
Process 500 may begin when a headset receives an indication of a lens type (502), as described herein. The headset may output an image based on the lens type (504). For example, the image displayed for viewing through the lens may be changed based on known characteristics and properties of the lens. Additionally or alternatively, the headset may detect characteristics of the user based on the lens type (506). For example, the sensor may perform detection and/or measurement based on the view through the lens. By providing an indication of the type of lens to the head-mounted device, the sensor may be calibrated to interpret the view through the lens.
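The two adjustments of process 500 can be sketched with a small lookup of per-lens compensation values. The profile table, scale factors, and function names below are illustrative assumptions; the patent does not specify how the compensation is computed.

```python
# Hypothetical sketch of process 500: given the indicated lens type (502),
# the head-mounted device adjusts its output image to compensate for the
# lens's known optical effect (504) and corrects sensor readings taken
# through the lens (506).

LENS_PROFILES = {
    # sku: (display scale compensating for lens magnification,
    #       correction factor for gaze measurements made through the lens)
    "LENS-0000": (1.00, 1.00),
    "LENS-M200": (1.04, 0.96),
}

def render_scale(sku: str) -> float:
    """Scale applied to the displayed image for this lens type (step 504)."""
    return LENS_PROFILES[sku][0]

def calibrated_gaze(sku: str, raw_gaze_deg: float) -> float:
    """Correct a raw gaze angle measured through the lens (step 506)."""
    return raw_gaze_deg * LENS_PROFILES[sku][1]

print(render_scale("LENS-M200"))  # 1.04
print(calibrated_gaze("LENS-M200", 10.0))
```

In practice the compensation would be a full per-pixel pre-distortion and a multi-parameter sensor calibration, but the principle is the same: knowing the lens type lets both the display pipeline and the sensors account for the view through the lens.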
Fig. 5 illustrates an exemplary network environment in which the HMD presentation system 10 may be implemented in accordance with one or more implementations. However, not all of the depicted components may be used in all implementations, and one or more implementations may include additional or different components than those shown in the figures. Variations in the arrangement and type of these components may be made without departing from the spirit or scope of the claims set forth herein. Additional components, different components, or fewer components may be provided.
The network environment includes various devices, access points 18, networks 16, and servers 14. Access point 18 and/or network 16 may communicatively couple, for example, various devices to server 14 and/or to each other. In one or more implementations, the network 16 may be an interconnection network that may include the Internet or devices communicatively coupled to the Internet.
Server 14 may include one or more server devices that may facilitate providing services to users, such as records related to the user's vision information, content for use by the head-mounted device 100, and/or order fulfillment systems. In one or more implementations, the server 14 may include and/or be communicatively coupled to one or more service provider servers.
The input device 22, the user device 24, the detector 32, the dispenser 42, the head-mounted device 100, and/or the point-of-sale device 82 may include a communication interface for communicating with each other. Such communication may be direct and/or indirect (e.g., via access point 18, network 16, and/or server 14). The communication interfaces may include one or more wired or wireless communication interfaces, such as one or more Universal Serial Bus (USB) interfaces, Near Field Communication (NFC) radios, Wireless Local Area Network (WLAN) radios, Bluetooth radios, Zigbee radios, cellular radios, and/or other radios.
Fig. 6 illustrates an exemplary electronic device 600 that may be used in an HMD presentation system in accordance with one or more implementations. Electronic device 600 may correspond to the input device 22, the user device 24, the detector 32, the dispenser 42, the head-mounted device 100, and/or the point-of-sale device 82. However, not all of the depicted components may be used in all implementations, and one or more implementations may include additional or different components than those shown in the figure. Variations in the arrangement and type of these components may be made without departing from the spirit or scope of the claims set forth herein. Additional components, different components, or fewer components may be provided.
The electronic device 600 may include a host processor 602, memory 604, one or more input/output devices 606, a communication interface 608, and/or one or more sensors 612, among other components.
The host processor 602 (which may also be referred to as an application processor or processor) may comprise suitable logic, circuitry, and/or code that may be enabled to process data and/or control operation of the electronic device 600. In this regard, the host processor 602 may be enabled to provide control signals to various other components of the electronic device 600. Host processor 602 may also control the transfer of data between portions of electronic device 600. In addition, host processor 602 may enable an operating system or otherwise execute code to manage the operation of electronic device 600. The memory 604 may comprise suitable logic, circuitry, and/or code that may enable storage of various types of information, such as received data, generated data, code, and/or configuration information. Memory 604 may include, for example, Random Access Memory (RAM), Read-Only Memory (ROM), flash memory, and/or magnetic storage.
The communication interface 608 may comprise suitable logic, circuitry, and/or code that may enable wired or wireless communication, such as with the access point 18, the network 16, the server 14, and/or other electronic devices of the HMD presentation system 10. The communication interface 608 of any given device may provide a communication link with a communication interface of any other device within the HMD presentation system 10. Such communication may be direct or indirect (e.g., through intermediaries). The communication interface 608 may include, for example, one or more of a bluetooth communication interface, an NFC interface, a Zigbee communication interface, a WLAN communication interface, a USB communication interface, or generally any communication interface.
The one or more sensors 612 may include, for example, one or more image sensors, one or more depth sensors, one or more infrared sensors, one or more thermal (e.g., infrared) sensors, and/or any sensor that may be generally used to detect and/or measure a lens or user.
In one or more implementations, one or more of the host processor 602, the memory 604, the one or more sensors 612, the communication interface 608, and/or one or more portions thereof may be implemented in software (e.g., subroutines and code), in hardware (e.g., an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a state machine, a gated logic component, a discrete hardware component, or any other suitable device), and/or in a combination of both.
Referring now to figs. 7 and 8, a dispensed lens or lens group may be mounted for use with a head-mounted device.
According to some embodiments, for example as shown in fig. 7, the head-mounted device 100 includes a housing 110 that is worn on the head of a user. The housing 110 may be positioned in front of the user's eyes to provide information within the field of view of the user.
The housing 110 may be supported on the head of a user using the fixation element 120. The fixation element 120 may wrap around or extend along opposite sides of the user's head. The fixation element 120 may optionally include headphones for wrapping around or otherwise engaging or resting on the user's ears. It should be appreciated that other configurations may be applied to secure the headset 100 to the head of a user. For example, one or more straps, bands, covers, caps, or other components may be used in addition to or in lieu of the illustrated components of the headset 100. As another example, the fixation element 120 may include multiple components to engage the head of a user. The fixation element 120 may extend from the housing 110 and/or the light seal 190.
While the light seal 190 is schematically illustrated with a particular size and shape, it should be understood that the light seal 190 (particularly at its inner side) may have a size and shape that accommodate the face of a user wearing the headset 100. For example, the inner side may provide a shape that substantially matches the contours of the user's face around the user's eyes. The inner side may be provided with one or more features that allow the light seal 190 to conform to the face of the user, to enhance comfort, and to inhibit light from entering the light seal 190 at the points of contact with the face. For example, the inner side may provide a flexible, soft, resilient, and/or compliant structure.
When the user wears the headset 100, the light seal 190 may be placed against the user's face and/or head. The light seal 190 may include a base that provides structural support for one or more other components of the light seal 190. The light seal 190 may define an interior space through which light may pass, thereby providing a view of the display 140 to a user wearing the headset. Such a view may be enhanced by preventing light from the external environment from entering the light seal 190.
Given the variety of head and face shapes that different users may have, it may be desirable to provide the light seal 190 with customization and adjustability so that the headset 100 is in a desired position and orientation relative to the user's face and head during use. Accordingly, the light seal 190 may be selected and/or adjusted based on a given user. To accommodate different users, the light seal 190 may be swapped with other light seals. It should be understood that the light seal 190 may be dispensed by the dispenser used for lenses or by a different dispenser for demonstration purposes. The housing 110 and/or the light seal 190 may provide a nose pad or another feature to rest on the nose of the user.
The housing 110 may provide structure about its peripheral region to support any internal components of the housing 110 in their assembled position. For example, the housing 110 may enclose and support various internal components (including, for example, integrated circuit chips, processors, memory devices, and other circuitry) to provide computing and functional operations for the headset 100, as discussed further herein. While several components are shown within the housing 110, it should be understood that some or all of these components may be located anywhere within or on the headset 100. For example, one or more of these components may be positioned within the fixation element 120 of the headset 100.
The housing 110 may include and/or support one or more cameras 130. The camera 130 may be positioned on or near the outer side 112 of the housing 110 to capture an image of a view external to the headset 100. As used herein, the outer side of a portion of a head-mounted device is the side facing away from the user and/or toward the external environment. The captured image may be available for display to a user or stored for any other purpose. Each of the cameras 130 may be movable along the outer side 112. For example, a track or other guide may be provided to facilitate movement of the camera 130 therealong.
The head mounted device 100 may include a display 140 that provides visual output for viewing by a user wearing the head mounted device 100. One or more displays 140 may be positioned on or near the inner side 114 of the housing 110. As used herein, the interior side 114 of a portion of the head-mounted device is the side facing the user and/or facing away from the external environment.
The display 140 may transmit light from the physical environment (e.g., as captured by a camera module) for viewing by a user. Such a display 140 may include optical properties, such as lenses for vision correction based on incident light from the physical environment. Additionally or alternatively, the display 140 may provide information as a display within the user's field of view. Such information may be provided to the exclusion of a view of the physical environment or in addition to (e.g., overlaid on) a view of the physical environment. As used herein, a physical environment refers to the physical world with which people can interact without the assistance of an electronic system. In contrast, a computer-generated reality environment refers to a wholly or partially simulated environment with which people interact via an electronic system. Examples of computer-generated reality include virtual reality, mixed reality, and augmented reality. Electronic systems that enable people to interact with various computer-generated reality environments include head-mounted devices, projection-based systems, heads-up displays (HUDs), vehicle windshields having integrated display capability, windows having integrated display capability, displays formed as lenses designed to be placed on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablet computers, and desktop/laptop computers. A head-mounted system may have an integrated opaque display and one or more speakers. Alternatively, a head-mounted system may be configured to accept an external opaque display (e.g., a smartphone). A head-mounted system may instead have a transparent or translucent display rather than an opaque display.
Each display 140 may be adjusted to align with a corresponding eye of a user. For example, each display 140 may be moved along one or more axes until the center of each display 140 is aligned with the center of the corresponding eye. Accordingly, the distance between the displays 140 may be set based on the user's interpupillary distance (IPD). The IPD is defined as the distance between the centers of the pupils of the user's eyes.
A pair of displays 140 may be mounted to the housing 110 and spaced apart by a distance. The distance between the pair of displays 140 may be designed to correspond to the IPD of the user. The distance may be adjustable to account for the different IPDs of different users who may wear the headset 100. For example, either or both of the displays 140 may be movably mounted to the housing 110 to permit the displays 140 to move or translate laterally to make the distance greater or smaller. Any type of manual or automatic mechanism may be used to permit adjustment of the distance between the displays 140. For example, the displays 140 may be mounted to the housing 110 via slidable tracks or guides that permit manual or electronically actuated movement of one or more display elements of the displays 140 to adjust the distance between them.
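The IPD-based spacing described above can be sketched as a simple symmetric placement computation. This is an illustrative assumption, not the patent's implementation: the function name, the one-dimensional coordinate model, and the millimeter units are all hypothetical.

```python
# Hypothetical sketch of IPD-based display positioning. The patent does not
# specify this computation; names, units, and the 1-D model are assumptions.

def display_positions(ipd_mm: float, center_mm: float = 0.0) -> tuple[float, float]:
    """Place the two display centers symmetrically about the face midline
    so that their separation equals the user's interpupillary distance."""
    half = ipd_mm / 2.0
    return (center_mm - half, center_mm + half)

left, right = display_positions(ipd_mm=63.0)
# The two displays end up 63 mm apart, centered on the midline.
```

An actuated mechanism could drive each display toward the position returned here, whether the adjustment is manual or electronic.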
Additionally or alternatively, the display element may be moved to a target location based on a desired visual effect corresponding to a user's perception of the display 140 when the display 140 is positioned at the target location. The target location may be determined based on the focal length of the user and/or the optical elements of the system. For example, the user's eyes and/or the optical elements of the system may determine how the visual output of the display 140 will be perceived by the user. The distance between the display 140 and the user's eye and/or the distance between the display 140 and one or more optical elements may be varied to place the display 140 at, within, or outside of the corresponding focal distance. Such adjustments may be made to accommodate the eyes, corrective lenses, and/or desired optical effects of a particular user.
The headset 100 may include a sensor 170. The sensor 170 may be positioned and arranged to detect characteristics of the user, such as facial features. For example, such user sensors may perform facial feature detection, facial motion detection, facial recognition, eye tracking, pupil measurement, user emotion detection, voice detection, and the like.
As further shown in fig. 7, the sample lens group 200 may be provided separately from the head-mounted device 100 and/or may be combined with the head-mounted device. The sample lens group 200 may be or include one or more lenses 250 for providing corrective vision power. It should be appreciated that where multiple lenses are used, lenses 250 of sample lens group 200 may be provided together or separately (e.g., for combination).
As shown in fig. 7 and 8, the connector may facilitate coupling the sample lens group 200 to the head-mounted device 100 in a relative position and orientation that aligns the lens 250 of the sample lens group 200 in a preferred position and orientation with respect to the display 140 of the head-mounted device 100 for viewing by a user. The headset 100 and the sample lens set 200 may be securely and releasably coupled together. For example, the HMD connector 180 may releasably engage the lens connector 280. One or more of a variety of mechanisms may be provided to secure the components to one another. For example, mechanisms such as locks, latches, snaps, slides, channels, screws, clasps, threads, magnets, pins, interference (e.g., friction) fits, roller presses, bayonet locks, fused materials, fabrics, knits, braids, and/or combinations thereof may be included to couple and/or secure together the headset 100 and the sample lens set 200. The components may remain fixed to each other until the optional release mechanism is actuated. A release mechanism may be provided for access by a user.
As further shown in fig. 8, sample lens assembly 200 may optionally be coupled to housing 110 while positioned within or near light seal 190. It should be appreciated that the connector may allow the sample lens group 200 to be securely held in any position that places the lens 250 within the field of view of the user.
Although the lens 250 may be interposed between the user and the display 140, the lens 250 may also be interposed between the sensor 170 and the user. Thus, the sensor 170 may perform detection and/or measurement based on a view through the lens 250. By providing an indication of the lens type to the head-mounted device 100, the sensor 170 can be calibrated to interpret the view through the lens 250. For example, an image captured by the sensor 170 may be slightly distorted by the lens 250. With the lens type of the lens 250 known, the sensor 170 may be calibrated accordingly to perform its detection and/or measurement appropriately.
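The calibration idea above, compensating sensor measurements for the known distortion of the installed lens, can be sketched as a lookup of a per-lens-type correction. Everything here is an assumption for illustration: the lens-type table, the first-order radial distortion model, and the coefficient values are hypothetical and do not come from the patent.

```python
# Illustrative sketch of calibrating eye-facing sensor coordinates for a
# known lens type. The table, model, and values are hypothetical assumptions.

LENS_DISTORTION = {
    "plano": 0.0,        # no corrective power, so no compensation needed
    "minus_2d": -0.015,  # hypothetical radial coefficient for a -2.0 D lens
    "plus_2d": 0.012,    # hypothetical radial coefficient for a +2.0 D lens
}

def compensate_point(x: float, y: float, lens_type: str) -> tuple[float, float]:
    """Invert a first-order radial distortion so that a point measured
    through the lens maps back to approximately undistorted coordinates."""
    k1 = LENS_DISTORTION[lens_type]
    r2 = x * x + y * y          # squared radius from the optical center
    scale = 1.0 + k1 * r2       # first-order radial distortion factor
    return (x / scale, y / scale)
```

In this sketch, once the dispenser communicates the lens type, the head-mounted device would simply select the matching coefficient before interpreting eye-tracking measurements.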
Referring now to fig. 9, components of the head-mounted device may be operably connected to provide the capabilities described herein. Fig. 9 shows a simplified block diagram of an exemplary headset 100 according to one embodiment of the invention. It should be appreciated that the components described herein may be provided on one, some, or all of the housing, light seal, and/or head fixation element. It should be understood that additional components, different components, or fewer components than those shown may be utilized within the scope of the subject disclosure.
As shown in fig. 9, the head mounted device 100 may include a controller 178 (e.g., control circuitry) having one or more processing units including or configured to access a memory 182 having instructions stored thereon. The instructions or computer program may be configured to perform one or more of the operations or functions described with respect to the headset 100. The controller 178 may be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions. For example, the controller 178 may include one or more of the following: a microprocessor, a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), or a combination of such devices. As described herein, the term "processor" is intended to encompass a single processor or processing unit, multiple processors, multiple processing units, or one or more other suitably configured computing elements.
The memory 182 may store electronic data that may be used by the head-mounted device 100. For example, the memory 182 may store electronic data or content such as audio and video files, documents and applications, device settings and user preferences, timing and control signals or data for various modules, data structures, or databases, and the like. The memory 182 may be configured as any type of memory. By way of example only, the memory 182 may be implemented as random access memory, read-only memory, flash memory, removable memory, or other types of storage elements, or combinations of such devices.
The head-mounted device 100 may also include a display 140 for displaying visual information for a user. The display 140 may provide visual (e.g., image or video) output. The display 140 may be or include an opaque, transparent, and/or translucent display. The display 140 may have a transparent or translucent medium through which light representing an image is directed to the user's eyes. The display 140 may utilize digital light projection, OLED, LED, uLED, liquid crystal on silicon, laser scanning light sources, or any combination of these technologies. The medium may be an optical waveguide, a holographic medium, an optical combiner, an optical reflector, or any combination thereof. In one embodiment, the transparent or translucent display may be configured to selectively become opaque. Projection-based systems may employ retinal projection technology that projects a graphical image onto a person's retina. Projection systems may also be configured to project virtual objects into the physical environment, for example as a hologram or onto a physical surface. The headset 100 may include an optical subassembly configured to help optically adjust and properly project the image-based content displayed by the display 140 for close-up viewing. The optical subassembly may include one or more lenses, mirrors, or other optical devices.
The headset 100 may include one or more sensors 170, as described herein. The headset 100 may include one or more other sensors. Such sensors may be configured to sense substantially any type of feature, such as, but not limited to, image, pressure, light, touch, force, temperature, position, motion, and the like. For example, the sensor may be a photodetector, a temperature sensor, a light or optical sensor, an atmospheric pressure sensor, a humidity sensor, a magnet, a gyroscope, an accelerometer, a chemical sensor, an ozone sensor, a particle count sensor, or the like. As another example, the sensor may be a biosensor for tracking biometric characteristics, such as health and activity metrics. Other user sensors may perform facial feature detection, facial motion detection, facial recognition, eye tracking, user emotion detection, voice detection, and the like. The sensor may include a camera that may capture image-based content of the outside world.
The headset 100 may include input/output components 186, which may include any suitable components for connecting the headset 100 to other devices. Suitable components may include, for example, audio/video jacks, data connectors, or any additional or alternative input/output components. The input/output component 186 may include buttons, keys, or another feature that may act as a keyboard for user operation.
The headset 100 may include a microphone 188 as described herein. The microphone 188 may be operably connected to the controller 178 for detecting sound levels and communicating detections for further processing, as described further herein.
The head mounted device 100 may include a speaker 194 as described herein. The speaker 194 may be operably connected to the controller 178 to control speaker output, including sound levels, as further described herein.
The head-mounted device 100 may include a communication interface 192 for communicating with one or more servers or other devices using any suitable communication protocol. For example, the communication interface 192 may support Wi-Fi (e.g., 802.11 protocol), ethernet, bluetooth, high frequency systems (e.g., 900MHz, 2.4GHz, and 5.6GHz communication systems), infrared, TCP/IP (e.g., any of the protocols used in each of the TCP/IP layers), HTTP, bitTorrent, FTP, RTP, RTSP, SSH, any other communication protocol, or any combination thereof. Communication interface 192 may also include an antenna for transmitting and receiving electromagnetic signals.
The head-mounted device 100 may include a battery 160 that may charge and/or power the components of the head-mounted device 100. The battery 160 may also charge and/or power components connected to the headset 100.
Thus, embodiments of the present disclosure accommodate different users of head-mounted devices having different needs for vision correction. Systems and methods may be provided to determine the corrective lens that is best suited for a given user. Such systems and methods may include an input device for determining which of a plurality of existing corrective lenses are appropriate for a given user. The dispenser may provide a selected one of the plurality of lenses for use by a user with the head-mounted device. The headset may then be operated with the lens, including any suitable adjustment based on the selection of the lens.
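The selection step summarized above, determining which of a limited set of existing corrective lenses best suits a given user, can be sketched as a nearest-match search over the dispenser's inventory. This is a minimal sketch under stated assumptions: the stocked power values and the single spherical-power model are hypothetical, and a real system would likely also consider cylinder, axis, and other prescription parameters.

```python
# Hypothetical dispenser selection logic: pick the sample lens group whose
# spherical power is closest to the measured reference lens. The inventory
# values and single-power model are illustrative assumptions.

SAMPLE_LENS_POWERS_D = [-6.0, -4.0, -2.0, 0.0, 2.0, 4.0]  # diopters (assumed stock)

def select_sample_lens(measured_power_d: float) -> float:
    """Return the stocked sample lens power most similar to the power
    measured from the user's reference lens."""
    return min(SAMPLE_LENS_POWERS_D, key=lambda p: abs(p - measured_power_d))

select_sample_lens(-2.25)  # nearest stocked power is -2.0
```

This mirrors the idea in clause 10 below, that the dispensed sample lens group is the one providing vision correction most similar to that of the reference lens.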
For convenience, various examples of aspects of the disclosure are described below as clauses. These examples are provided by way of example and not limitation of the subject technology.
Clause a: a system, comprising: a station, the station comprising: an input device for determining a lens type for a user; a dispenser for dispensing one of a plurality of lens groups based on the lens type; and a station communication interface configured to communicate an indication of the one of the plurality of lens groups; a headset, the headset comprising: a display; an HMD connector configured to receive the one of a plurality of lens groups; and an HMD communication interface configured to receive the indication of the lens type from the station communication interface.
Clause B: a station, comprising: a detector for measuring a characteristic of the reference lens; and a dispenser including a plurality of sample lens groups, the dispenser configured to dispense one of the plurality of sample lens groups based on the characteristics of the reference lens, wherein each of the plurality of sample lens groups includes a lens connector configured to engage an HMD connector of a head-mounted device that includes a display visible through the one of the plurality of sample lens groups.
Clause C: a system, comprising: an input device for determining a lens type for a user; and a dispenser including a plurality of sample lens groups, the dispenser configured to dispense one of the plurality of sample lens groups based on the lens type, the sample lens groups being usable with a sample headset; and a point-of-sale device configured to generate an order for a purchasable head mounted device and one of a plurality of purchasable lens groups based on the lens type, the number of purchasable lens groups being greater than the number of sample lens groups of the dispenser.
One or more of the above clauses may include one or more of the following features. It should be noted that any of the following clauses may be combined with each other in any combination and placed into a corresponding independent clause, e.g., clause A, B, or C.
Clause 1: the display is configured to output an image based on the indication of the one of the plurality of lens groups.
Clause 2: the head-mounted device further includes a sensor configured to detect a characteristic of the user through the one of the plurality of lens groups based on the indication of the one of the plurality of lens groups.
Clause 3: the input device is a detector for measuring characteristics of the reference lens.
Clause 4: the detector includes: a reference pattern; a camera configured to capture an image of the reference pattern through the reference lens; and a processor configured to: comparing the image with the reference pattern; and generating the indication of the lens type based on a comparison of the image to the reference pattern.
Clause 5: the input device includes a user interface for receiving an indication of the lens type, wherein the lens type corresponds to a prescription for the user.
Clause 6: the input device is operatively connected to the station communication interface to receive an indication of the lens type from an external device, wherein the lens type corresponds to a prescription for the user.
Clause 7: the station is configured to mount the one of the plurality of lens groups at the HMD connector of the head-mounted device.
Clause 8: the detector includes a camera and a reference pattern, wherein the camera is configured to capture an image of the reference pattern through the reference lens.
Clause 9: the detector also includes a processor configured to compare the image to the reference pattern.
Clause 10: the dispenser is further configured to dispense a plurality of sample lens groups having different optical characteristics from each other among the sample lens groups, wherein one of the plurality of sample lens groups is a sample lens group that provides vision correction most similar to that provided by the reference lens.
Clause 11: the dispenser is further configured to dispense the head mounted device.
Clause 12: the dispenser is further configured to mount the one of the plurality of sample lens groups at the HMD connector of the head-mounted device.
Clause 13: the input device is operatively connected to the station communication interface to receive an indication of the lens type from an external device, wherein the lens type corresponds to a prescription for the user.
Clause 14: each of the sample lens groups includes a lens connector configured to engage an HMD connector of the sample headset that includes a display visible through the one of the sample lens groups.
Those of skill in the art will appreciate that the various illustrative blocks, modules, elements, components, methods, and algorithms described herein may be implemented as electronic hardware, computer software, or combinations of both. To illustrate this interchangeability of hardware and software, various illustrative blocks, modules, elements, components, methods, and algorithms have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application. The various components and blocks may be arranged differently (e.g., arranged in a different order, or divided in a different manner) without departing from the scope of the subject technology.
It should be understood that the specific order or hierarchy of blocks in the processes disclosed herein is an illustration of exemplary approaches. Based upon design preferences, it should be understood that the specific order or hierarchy of blocks in the processes may be rearranged, or that all illustrated blocks may be performed. Any of these blocks may be performed simultaneously. In one or more implementations, multitasking and parallel processing may be advantageous. Furthermore, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
As used in this specification and any claims of this patent application, the terms "base station," "receiver," "computer," "server," "processor," and "memory" refer to an electronic or other technical device. These terms exclude a person or group of people. For purposes of this specification, the term "display" or "displaying" means displaying on an electronic device.
In the present technology, the use of personal information data may be used to benefit the user. For example, health and fitness data may be used to provide insight into the overall health of a user, or may be used as positive feedback to individuals using technology to pursue health goals. The present disclosure contemplates that entities responsible for collecting, analyzing, disclosing, transmitting, storing, or otherwise using such personal information data will adhere to established privacy policies and/or privacy practices. Further, it is an object of the present disclosure that personal information data should be managed and processed to minimize the risk of inadvertent or unauthorized access or use.
As used herein, the phrase "at least one of" preceding a series of items, with the term "and" or "or" separating any of the items, modifies the list as a whole, rather than each member (i.e., each item) of the list. The phrase "at least one of" does not require selection of at least one of each item listed; rather, the phrase allows a meaning that includes at least one of any one of the items, and/or at least one of any combination of the items, and/or at least one of each of the items. By way of example, the phrases "at least one of A, B, and C" and "at least one of A, B, or C" each refer to only A, only B, or only C; any combination of A, B, and C; and/or at least one of each of A, B, and C.
The predicates "configured to", "operable to", and "programmed to" do not imply any particular tangible or intangible modification of a subject, but rather are intended to be used interchangeably. In one or more implementations, a processor configured to monitor and control an operation or a component may also mean the processor being programmed to monitor and control the operation, or the processor being operable to monitor and control the operation. Likewise, a processor configured to execute code may be construed as a processor programmed to execute code or operable to execute code.
Phrases such as an aspect, this aspect, another aspect, some aspects, one or more aspects, an implementation, the implementation, another implementation, some implementations, one or more implementations, an embodiment, the embodiment, another embodiment, some embodiments, one or more embodiments, a configuration, the configuration, another configuration, some configurations, one or more configurations, subject technology, disclosure, the present disclosure, other variations, and the like are all for convenience and do not imply that disclosure involving such one or more phrases is essential to the subject technology, or that such disclosure applies to all configurations of the subject technology. The disclosure relating to such one or more phrases may apply to all configurations or one or more configurations. The disclosure relating to such one or more phrases may provide one or more examples. A phrase such as an aspect or some aspects may refer to one or more aspects and vice versa, and this applies similarly to other previously described phrases.
The word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment described herein as "exemplary" or as an "example" is not necessarily to be construed as preferred or advantageous over other embodiments. Furthermore, to the extent that the terms "include," "have," and the like are used in either the description or the claims, such terms are intended to be inclusive in a manner similar to the term "comprise," as "comprise" is interpreted when employed as a transitional word in a claim.
All structural and functional equivalents to the elements of the various aspects described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims. Moreover, nothing disclosed herein is intended to be dedicated to the public, regardless of whether such disclosure is explicitly recited in the claims. No claim element is to be construed under the provisions of 35 U.S.C. §112 unless the element is expressly recited using the phrase "means for" or, in the case of a method claim, the phrase "step for".
The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the language of the claims, wherein reference to an element in the singular is not intended to mean "only one" unless specifically so stated, but rather "one or more." The term "some" refers to one or more unless specifically stated otherwise. Pronouns in the masculine (e.g., his) include the feminine and neuter gender (e.g., her and its), and vice versa. Headings and subheadings, if any, are used for convenience only and do not limit the subject disclosure.

Claims (20)

1. A system, comprising:
a station, the station comprising:
an input device configured to determine a lens type for a user;
a dispenser configured to provide one of a plurality of lens groups based on the lens type; and
a station communication interface configured to communicate an indication of the one of the plurality of lens groups; and
a headset, the headset comprising:
a display;
an HMD connector configured to receive the one of a plurality of lens groups; and
an HMD communication interface configured to receive the indication of the lens type from the station communication interface.
2. The system of claim 1, wherein the display is configured to output an image based on the indication of the one of the plurality of lens groups.
3. The system of claim 1, wherein the head-mounted device further comprises a sensor configured to detect a characteristic of the user through the one of the plurality of lens groups based on the indication of the one of the plurality of lens groups.
4. The system of claim 1, wherein the input device is a detector for measuring characteristics of a reference lens.
5. The system of claim 4, wherein the detector comprises:
a reference pattern;
a camera configured to capture an image of the reference pattern through the reference lens; and
a processor configured to:
compare the image with the reference pattern; and
generate the indication of the lens type based on the comparison of the image and the reference pattern.
6. The system of claim 1, wherein the input device comprises a user interface for receiving an indication of the lens type, wherein the lens type corresponds to a prescription for the user.
7. The system of claim 1, wherein the input device is operatively connected to the station communication interface to receive an indication of the lens type from an external device, wherein the lens type corresponds to a prescription for the user.
8. The system of claim 1, wherein the station is configured to mount the one of the plurality of lens groups at the HMD connector of the headset.
9. A station, comprising:
a detector for measuring a characteristic of a reference lens; and
a dispenser comprising a plurality of sample lens groups, the dispenser configured to provide one of the plurality of sample lens groups based on the characteristics of the reference lens, wherein each of the plurality of sample lens groups comprises a lens connector configured to engage an HMD connector of a head-mounted device comprising a display visible through the one of the plurality of sample lens groups.
10. The station of claim 9, wherein the detector comprises a camera and a reference pattern, wherein the camera is configured to capture an image of the reference pattern through the reference lens.
11. The station of claim 10, wherein the detector further comprises a processor configured to compare the image to the reference pattern.
12. The station of claim 9, wherein the dispenser is further configured to dispense multiple of the sample lens groups having optical characteristics that differ from one another, wherein the one of the plurality of sample lens groups is the sample lens group that provides vision correction most similar to the vision correction provided by the reference lens.
13. The station of claim 9, wherein the dispenser is further configured to dispense the head mounted device.
14. The station of claim 13, wherein the dispenser is further configured to mount the one of the plurality of sample lens groups at the HMD connector of the head-mounted device.
15. A system, comprising:
an input device for determining a lens type for a user;
a dispenser comprising a plurality of sample lens groups, the dispenser configured to provide one of the plurality of sample lens groups based on the lens type, the sample lens groups being usable with a sample headset; and
a point-of-sale device configured to generate an order for a purchasable head mounted device and one of a plurality of purchasable lens groups based on the lens type, the number of purchasable lens groups being greater than the number of sample lens groups of the dispenser.
16. The system of claim 15, wherein the input device is a detector for measuring characteristics of a reference lens.
17. The system of claim 16, wherein the detector comprises:
a reference pattern;
a camera configured to capture an image of the reference pattern through the reference lens; and
a processor configured to:
compare the image with the reference pattern; and
generate an indication of the lens type based on the comparison of the image and the reference pattern.
18. The system of claim 15, wherein the input device comprises a user interface for receiving an indication of the lens type, wherein the lens type corresponds to a prescription for the user.
19. The system of claim 15, wherein the input device is operatively connected to a station communication interface to receive an indication of the lens type from an external device, wherein the lens type corresponds to a prescription for the user.
20. The system of claim 15, wherein each of the sample lens groups comprises a lens connector configured to engage an HMD connector of the sample headset, the sample headset comprising a display visible through the one of the plurality of sample lens groups.
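Claims 5 and 17 recite a detector that captures an image of a reference pattern through the user's reference lens, compares that image with the pattern itself, and generates an indication of the lens type. The sketch below illustrates one way such a comparison could work. It is a hypothetical simplification, not the claimed implementation: the grid-spacing metric, the thin-lens magnification approximation, the focal length, and all names (`SampleLensGroup`, `estimate_diopters`, `match_lens_group`) are assumptions for illustration only.

```python
# Hypothetical sketch of the lens-type matching step described in claims 5 and 17:
# estimate the magnification the reference lens introduces into an image of a
# reference grid, convert it to an approximate spherical power, and select the
# sample lens group with the closest correction. All details are assumptions.

from dataclasses import dataclass


@dataclass
class SampleLensGroup:
    label: str
    diopters: float  # assumed spherical power of the sample group


def grid_spacing(line_positions):
    """Mean spacing (pixels) between consecutive grid-line positions."""
    gaps = [b - a for a, b in zip(line_positions, line_positions[1:])]
    return sum(gaps) / len(gaps)


def estimate_diopters(reference_lines, imaged_lines, focal_length_m=0.1):
    """Estimate lens power from grid magnification.

    Simplified thin-lens approximation (an assumption of this sketch):
    magnification m relates to power P (diopters) via m ~ 1 / (1 - P * f),
    so P = (1 - 1/m) / f.
    """
    m = grid_spacing(imaged_lines) / grid_spacing(reference_lines)
    return (1 - 1 / m) / focal_length_m


def match_lens_group(diopters, groups):
    """Pick the sample group whose correction is closest to the estimate."""
    return min(groups, key=lambda g: abs(g.diopters - diopters))


groups = [
    SampleLensGroup("A", -4.0),
    SampleLensGroup("B", -2.0),
    SampleLensGroup("C", 0.0),
    SampleLensGroup("D", +2.0),
]

# Reference grid lines at 10 px spacing; imaged through the lens they appear
# at 12.5 px spacing, i.e. magnification 1.25.
ref = [0, 10, 20, 30, 40]
img = [0, 12.5, 25, 37.5, 50]

power = estimate_diopters(ref, img)  # ~ +2.0 D under the assumed model
best = match_lens_group(power, groups)
print(best.label)
```

In this toy scenario the 1.25x magnification maps to roughly +2 diopters, so group "D" would be dispensed; a real detector would likely measure distortion across the full lens surface rather than a single magnification factor.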
CN202180068070.9A 2020-08-03 2021-07-29 Dispensing system Pending CN116583777A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US202063060589P 2020-08-03 2020-08-03
US63/060,589 2020-08-03
PCT/US2021/043765 WO2022031519A1 (en) 2020-08-03 2021-07-29 Dispensing system

Publications (1)

Publication Number Publication Date
CN116583777A true CN116583777A (en) 2023-08-11

Family

ID=77412380

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180068070.9A Pending CN116583777A (en) 2020-08-03 2021-07-29 Dispensing system

Country Status (2)

Country Link
CN (1) CN116583777A (en)
WO (1) WO2022031519A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7840444B2 (en) * 2006-12-05 2010-11-23 Essilor International Compagnie Generale D'optique Lens ordering and delivery system for head mounted display
US10055929B2 (en) * 2014-12-31 2018-08-21 Essilor International Automated eyewear kiosk
EP3128362B1 (en) * 2015-08-05 2023-10-04 Essilor International Method for determining a parameter of an optical equipment
WO2017134275A1 (en) * 2016-02-05 2017-08-10 Eidgenossische Technische Hochschule Zurich Methods and systems for determining an optical axis and/or physical properties of a lens and use of the same in virtual imaging and head-mounted displays
US10310598B2 (en) * 2017-01-17 2019-06-04 Facebook Technologies, Llc Varifocal head-mounted display including modular air spaced optical assembly
US10321820B1 (en) * 2017-12-21 2019-06-18 Facebook Technologies, Llc Measuring optical properties of an eyewear device
US20200096775A1 (en) * 2018-09-24 2020-03-26 Apple Inc. Display System With Interchangeable Lens

Also Published As

Publication number Publication date
WO2022031519A1 (en) 2022-02-10

Similar Documents

Publication Publication Date Title
US10908421B2 (en) Systems and methods for personal viewing devices
KR102300390B1 (en) Wearable food nutrition feedback system
CN112368628B (en) Adjustable electronic device system with face mapping
CN214540234U (en) Head-mounted device
CN108474952A (en) Wear-type electronic equipment
US20160363763A1 (en) Human factor-based wearable display apparatus
KR101203921B1 (en) Information providing apparatus using an eye tracking and local based service
CN110275602A (en) Artificial reality system and head-mounted display
US20230229007A1 (en) Fit detection for head-mountable devices
KR102219659B1 (en) Method and system for virtual reality-based eye health measurement
CN112285930B (en) Optical alignment for head-mounted devices
EP2583131B1 (en) Personal viewing devices
US20230264442A1 (en) Dispensing system
CN116507961A (en) Headset with modular assembly for fit adjustment
CN117295995A (en) Fitting detection system for head-mountable device
CN116583777A (en) Dispensing system
US20230043585A1 (en) Ultrasound devices for making eye measurements
CN209821501U (en) Electronic device, system and head-mounted device
US11729373B1 (en) Calibration for head-mountable devices
US20240230318A1 (en) Fit detection system for head-mountable devices
US20240004459A1 (en) Fit guidance for head-mountable devices
US20240064280A1 (en) Optical assemblies for shared experience
US11982816B1 (en) Wearable devices with adjustable fit
CN117590599A (en) Optical component for shared experience
US20230418019A1 (en) Electronic Device With Lens Position Sensing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination