CN117441331A - Adaptive brightness for augmented reality display - Google Patents


Info

Publication number
CN117441331A
Authority
CN
China
Prior art keywords
wearable device
value
ambient light
measurement
projection element
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280040857.9A
Other languages
Chinese (zh)
Inventor
杰弗里·迈克尔·德瓦尔
亚历山大·卡内
多米尼克·施尼策尔
马蒂厄·埃曼努尔·比尼奥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Snap Inc
Original Assignee
Snap Inc
Priority claimed from US17/582,633 external-priority patent/US11823634B2/en
Application filed by Snap Inc filed Critical Snap Inc
Priority claimed from PCT/US2022/032175 external-priority patent/WO2022260954A1/en
Publication of CN117441331A publication Critical patent/CN117441331A/en


Landscapes

  • Controls And Circuits For Display Device (AREA)

Abstract

Systems and methods for adaptively adjusting the brightness of a wearable device projection system are disclosed. The system and method perform operations comprising: causing a projection element of the AR wearable device to project an image; receiving a measurement of ambient light from an ambient light sensor; adjusting one or more hardware parameters of a projection element of the AR wearable device based on the measurement of ambient light; modifying one or more color values of an image displayed by a projection element of the AR wearable device based on the measurement of ambient light; and projecting an image with the modified color values using a projection element of the AR wearable device having the adjusted one or more hardware parameters.

Description

Adaptive brightness for augmented reality display
Cross Reference to Related Applications
The present application claims priority from U.S. provisional application Ser. No. 63/208,810, filed on June 9, 2021, and U.S. patent application Ser. No. 17/582,633, filed on January 24, 2022. The contents of these prior applications are considered to be part of the present application and are incorporated herein by reference in their entirety.
Technical Field
The present disclosure relates generally to a wearable device having a display system. In particular, the present disclosure proposes systems and methods for using a digital light projector for an augmented reality wearable device.
Background
The wearable device may be implemented with a transparent or translucent display through which a user of the wearable device may view the surrounding environment. Such devices enable a user to view through a transparent or translucent display to view the surrounding environment, and also to see objects (e.g., virtual objects such as images, videos, text, etc.) that are generated for display to appear as part of and/or overlay on the surrounding environment.
Drawings
To facilitate identification of a discussion of any particular element or act, one or more of the highest digit(s) in a reference number refers to the figure number in which that element was first introduced.
Fig. 1 is a perspective view of a wearable device according to some examples.
Fig. 2 is a block diagram illustrating a network environment for operating an Augmented Reality (AR) wearable device, according to one example.
Fig. 3 is a block diagram illustrating an AR wearable device according to one example.
Fig. 4 is a block diagram illustrating a DLP projector of an AR wearable device according to one example.
Fig. 5 is a block diagram illustrating a DLP controller according to one example.
Fig. 6 illustrates an AR wearable device according to some examples.
Fig. 7 is a flow chart illustrating example operations of a DLP controller according to some examples.
FIG. 8 is a diagrammatic representation of a machine in the form of a computer system within which a set of instructions may be executed to cause the machine to perform any one or more of the methodologies discussed herein, according to some examples.
Fig. 9 is a block diagram illustrating a software architecture within which an example may be implemented.
Detailed Description
The following description describes systems, methods, techniques, sequences of instructions, and computer-machine program products that illustrate examples of the present subject matter. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various examples of the present subject matter. It will be apparent, however, to one skilled in the art, that the examples of the subject matter may be practiced without some or other of these specific details. Examples merely typify possible variations. Unless explicitly stated otherwise, structures (e.g., structural components such as modules) are optional and may be combined or sub-divided, and operations (e.g., in a process, algorithm, or other function) may be varied in sequence or combined or sub-divided.
An AR wearable device implemented with a transparent or semi-transparent display enables a user to view through the transparent or semi-transparent display to view the surrounding environment. In addition, the AR wearable device may enable a user to see objects (e.g., virtual objects such as images, videos, text, etc.) rendered in a display of the AR wearable device to appear as part of the surrounding environment and/or overlaid on the surrounding environment. Such AR wearable devices may provide an augmented reality experience for a user.
The virtual object may be rendered based on the position of the wearable device relative to the physical object or relative to a frame of reference (external to the wearable device) such that the virtual object appears correctly in the display. The virtual object appears to be aligned with a physical object perceived by a user of the AR wearable device. Graphics (e.g., graphical elements containing indications and guides) appear to be attached to a physical object of interest. To do this, the AR wearable device detects the physical object and tracks the pose of the AR wearable device relative to the position of the physical object. The pose identifies the position and orientation of the object relative to a frame of reference or relative to another object.
In one example, the AR wearable device includes a projector (e.g., a Digital Light Projector (DLP)) that displays the virtual object on a display of the AR wearable device (e.g., a lens of the AR wearable device). DLP projectors operate by projecting light from a light source (e.g., one or more LEDs, such as red, green, and blue LEDs) through a color wheel toward a DMD (digital micromirror device). The DMD controls whether or not the colored light is reflected toward the display of the AR wearable device. DLP projectors create color for the human eye by cycling through (R) red, (G) green, and (B) blue bit planes at a very high rate (e.g., 10 kHz). The sum of all bit planes creates an impression of color for the human eye. The order of the displayed bit planes (in terms of power and color) is optimized separately for each DLP projector. Thus, different DLP projectors will have different color cycling arrangements. Further, depending on the frame rate of the DLP projector, the DLP projector repeats the bit-plane sequence to fill the frame time (e.g., cycle). Thus, each DLP projector is typically configured to optimize its bit-plane sequence (for power saving, color calibration, or rainbow artifact reduction, as in wall projectors).
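As a rough illustration of the bit-plane repetition described above, the following sketch repeats an RGB bit-plane sequence until one frame time is filled. The function name and the 10 kHz bit-plane rate and 60 Hz frame rate are only assumed example figures; real DLP timing and bit-plane ordering are device-specific.

# Illustrative sketch only: repeats the RGB bit-plane sequence to fill one frame period.
BITPLANE_RATE_HZ = 10_000   # example bit-plane cycling rate
FRAME_RATE_HZ = 60          # example projector frame rate

def bitplane_schedule(sequence=("R", "G", "B")):
    planes_per_frame = BITPLANE_RATE_HZ // FRAME_RATE_HZ
    schedule = []
    while len(schedule) + len(sequence) <= planes_per_frame:
        schedule.extend(sequence)  # repeat the color sequence to fill the frame time
    return schedule

# len(bitplane_schedule()) -> 165 bit planes (55 RGB repetitions) per frame in this example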
In some implementations, AR wearable devices using DLP projectors encounter difficulties when operating in environments where the ambient brightness varies widely, from dim indoor lighting to bright outdoor lighting (e.g., from a dimly lit restaurant to an outdoor park). In such environments, the AR wearable device should display objects with a perceived brightness (effective brightness value) that matches the ambient lighting conditions of the operating environment, while also reducing the amount of power used and maintaining high contrast. Some AR wearable devices may increase the brightness of the projected image by repeating the projected RGB bit planes, for example by projecting an RGBRGB bit-plane sequence (e.g., repeating the same RGB pixel values twice), which is referred to as 0% dark time. This yields a greater effective brightness but may result in image ghosting and color separation (color break-up). To avoid image ghosting and color separation, AR wearable devices may introduce a short blanking period, for example by projecting a dark (fully black, no light) bit plane after the RGB bit planes, which is referred to as 50% dark time. This approach addresses image ghosting and color separation, but cannot reach the same peak brightness as 0% dark time, although it can reach brightness levels lower than those achievable with 0% dark time.
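For concreteness, a minimal sketch of the two bit-plane patterns just described; the mode names come from the text, while the function itself is only an assumption for illustration.

def frame_bitplanes(dark_time_percent):
    # 0% dark time: the RGB bit planes are repeated twice (RGBRGB).
    if dark_time_percent == 0:
        return ["R", "G", "B", "R", "G", "B"]
    # 50% dark time: one RGB pass followed by a blanking (all-black) period.
    if dark_time_percent == 50:
        return ["R", "G", "B", "dark", "dark", "dark"]
    raise ValueError("only 0% and 50% dark time are illustrated in this sketch")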
The disclosed systems and methods include operations for configuring the operation of a DLP projector used in an AR wearable device based on measurements of ambient light received from an ambient light sensor of the AR wearable device. The disclosed system can dynamically switch between 0% dark time and 50% dark time and can modify pixel values in software to achieve a desired effective brightness level (ranging from the minimum brightness level achievable with 50% dark time up to the maximum brightness level achievable with 0% dark time) without introducing image ghosting or color separation. By sensing ambient brightness (measuring ambient light) and controlling the LED current, projector dark time, and projected RGB composition in combination, the AR wearable device can accommodate the varied ambient light of different real-world environments while maintaining the desired perceived brightness and saving power.
Fig. 1 is a perspective view of an AR wearable device (e.g., AR glasses 100) according to some examples. The AR glasses 100 may include a frame 132, where the frame 132 is made of any suitable material, such as plastic or metal (including any suitable shape memory alloy). In one or more examples, the frame 132 includes a front piece 134, the front piece 134 including a first or left optical element holder 114 (e.g., a display holder or a lens holder) and a second or right optical element holder 120 connected by a nosepiece 118. The front piece 134 additionally includes a left end 110 and a right end 124. A first or left optical element 116 and a second or right optical element 122 may be disposed within the respective left and right optical element holders 114, 120. The optical elements 116 and 122 may be colored (where colored layers are placed on top of the lenses or glass that make up the optical elements 116 and 122) or transparent (where no colored layers are placed). Each of the right optical element 122 and the left optical element 116 may be a lens, a display assembly, or a combination of the foregoing. Any of the display assemblies disclosed herein may be disposed in the AR glasses 100.
The frame 132 additionally includes a left arm or temple piece 136 and a right arm or temple piece 138, which are coupled to the respective left end 110 and right end 124 of the front piece 134 by any suitable means, such as a hinge (not shown), or are rigidly or fixedly secured to the front piece 134 so as to be integral with it. In one or more implementations, each of the temple pieces 136 and 138 includes a first portion 108 coupled to the respective left end 110 or right end 124 of the front piece 134 and any suitable second portion 126 for coupling to the ear of a user. In one example, the front piece 134 may be formed from a single piece of material so as to have a unitary or one-piece construction. In one example, such as shown in FIG. 1, the entire frame 132 may be formed from a single piece of material so as to have a unitary or integral construction.
The AR glasses 100 may include a computing device, such as the computer 128, which may be of any suitable type so as to be carried by the frame 132 and, in one or more examples, of a suitable size and shape so as to be at least partially disposed in one of the temple piece 136 and the temple piece 138. In one or more examples, as shown in fig. 1, the size and shape of the computer 128 is similar to the size and shape of one of the temple pieces 138 (or the temple piece 136), and the computer 128 is thus disposed almost entirely, if not entirely, within the structure and boundaries of such temple piece 138. In one or more examples, the computer 128 is disposed in both the temple piece 136 and the temple piece 138. The computer 128 may include one or more processors and memory, wireless communication circuitry, and a power supply. As discussed below, the computer 128 includes low-power circuitry, high-speed circuitry, and a display processor. Various other examples may include these elements configured differently or integrated together in different ways. Aspects of the computer 128 may be implemented as illustrated by the wearable device 210 discussed below. In some aspects, the computer 128 implements a DLP controller as discussed below.
The computer 128 additionally includes a battery 106 or other suitable portable power source. In one example, the battery 106 is disposed in one of the temple pieces 136 or the temple piece 138. In the AR glasses 100 shown in fig. 1, the battery 106 is shown disposed in the left temple piece 136 and electrically coupled to the rest of the computer 128 disposed in the right temple piece 138 using the connection 130. AR glasses 100 may include a connector or port (not shown), wireless receiver, transmitter or transceiver (not shown), or a combination of these devices accessible from outside of frame 132, suitable for charging battery 106.
In one or more implementations, the AR glasses 100 include an imaging device 102 (e.g., a camera). Although two cameras are depicted, other examples contemplate the use of a single camera or additional (i.e., more than two) cameras. In one or more examples, the glasses 100 include any number of input sensors (e.g., one or more ambient light sensors) or peripheral devices in addition to the camera 102. The front piece 134 is provided with: an outwardly facing, forward-facing, front, or outer surface 112 that faces forward or away from the user when the glasses 100 are mounted on the face of the user; and an opposite inwardly facing, rearward-facing, rear, or inner surface 104 that faces the user's face when the glasses 100 are mounted on the user's face. Such sensors may include: an inward-facing video sensor or digital imaging module, such as a camera, which may be mounted on the inner surface 104 of the front piece 134, disposed elsewhere within the inner surface 104 of the front piece 134, or mounted on the frame 132 so as to face the user; and an outward-facing video sensor or digital imaging module, such as the camera 102, which may be mounted on the outer surface 112 of the front piece 134, disposed elsewhere within the outer surface 112 of the front piece 134, or mounted on the frame 132 so as to face away from the user. Such sensors, peripherals, or peripheral devices may additionally include biometric sensors, positioning sensors, ambient light sensors, thermal temperature sensors, or any other such sensors.
Fig. 2 is a network diagram illustrating a network environment 200 suitable for operating an AR wearable device 210, according to some examples. The network environment 200 includes an AR wearable device 210, a client device 211, and a server 212 communicatively coupled to each other directly or via a network 204. The AR wearable device 210 and the server 212 may each be implemented in whole or in part in a computer system as described below with respect to fig. 8 and 9. Server 212 may be part of a network-based system. For example, the network-based system may be or include a cloud-based server system that provides additional information to the AR wearable device 210, such as virtual content (e.g., an image of a three-dimensional model of a virtual object).
The client device 211 may be a smart phone, tablet, laptop, access point, or any other such device capable of connecting with the wearable device 210 using both a low power wireless connection and a high speed wireless connection. Client device 211 is connected to server 212 and network 204. Network 204 may include any combination of wired and wireless connections. Server 212 may be one or more computing devices that are part of a service or network computing system. Any elements of client device 211, as well as server 212 and network 204, may be implemented using the details of software architecture 704 or machine 800 depicted in fig. 8 and 9. The client device 211 may provide one or more images for display to the AR wearable device 210. The client device 211 may receive input from a user (e.g., by moving a slider between a minimum brightness setting position and a maximum brightness setting position) to adjust the brightness setting of the AR wearable device 210. In response, the client device 211 sends the brightness value set by the user based on the received input to the AR wearable device 210. In this case, AR wearable device 210 sets the effective brightness value to the setting indicated by client device 211, independent of any ambient light measurements received from the ambient light sensor.
The user 206 operates the AR wearable device 210. The user 206 may be a human user (e.g., a human), a machine user (e.g., a computer configured by a software program to interact with the AR wearable device 210), or any suitable combination thereof (e.g., a machine-assisted human or a machine supervised by a human). The user 206 is not part of the network environment 200, but is associated with an AR wearable device 210.
The AR wearable device 210 may be a computing device with a display, such as a smart phone, a tablet computer, or a wearable computing device (e.g., AR glasses or a head-mounted display device). The computing device may be handheld or may be removably mounted to the head of the user 206. In one example, the display may be a screen displaying content captured with a camera of the AR wearable device 210. In another example, the display of the device may be transparent, such as in the lenses of wearable computing eyewear. In another example, the display of AR wearable device 210 may present images received from client device 211, for example, by using one or more projection elements operated by a DLP controller.
The user 206 operates an application of the AR wearable device 210. The application may include an AR application configured to provide the user 206 with an experience triggered by the physical object 208, such as a two-dimensional physical object (e.g., a picture), a three-dimensional physical object (e.g., a statue), a location (e.g., in a facility), or any reference in a real-world physical environment (e.g., a perceived wall or corner of furniture). For example, user 206 may direct the camera of AR wearable device 210 to capture an image of physical object 208. The images are tracked and recognized locally in the AR wearable device 210 using a local context recognition dataset module of the AR application of the AR wearable device 210. The local context recognition dataset module may include a library of virtual objects associated with real world physical objects or references. The AR application then generates additional information (e.g., a three-dimensional model) corresponding to the identified image in response to identifying the image, and presents the additional information as one or more images in a display of the AR wearable device 210. If the captured image is not recognized locally at the AR wearable device 210, the AR wearable device 210 downloads additional information (e.g., a three-dimensional model) corresponding to the captured image from a database of the server 212 or from the client device 211 over the network 204.
In one example, the server 212 may be configured to detect and identify the physical object 208 based on sensor data (e.g., image and depth data) from the AR wearable device 210, and to determine the pose of the AR wearable device 210 and the physical object 208 based on the sensor data. The server 212 may also generate virtual objects based on the poses of the AR wearable device 210 and the physical object 208. The server 212 transmits the virtual objects to the AR wearable device 210. Object recognition, tracking, and AR rendering may be performed on the AR wearable device 210, on the server 212, or on a combination of the AR wearable device 210 and the server 212.
Any of the machines, databases, or devices illustrated in fig. 2 may be implemented in a general-purpose computer that is modified (e.g., configured or programmed) by software to be a special-purpose computer to perform one or more of the functions described herein for that machine, database, or device. As used herein, a "database" is a data storage resource and may store data structured as text files, tables, spreadsheets, relational databases (e.g., object-relational databases), triple stores, hierarchical data stores, or any suitable combination thereof. Furthermore, any two or more of the machines, databases, or devices illustrated in fig. 2 may be combined into a single machine, and the functionality described herein with respect to any single machine, database, or device may be subdivided among multiple machines, databases, or devices.
Network 204 may be any network that enables communication between or among machines (e.g., server 212), databases, and devices (e.g., AR wearable device 210). Thus, the network 204 may be a wired network, a wireless network (e.g., mobile, bluetooth, short-range network, or long-range network or cellular network), or any suitable combination thereof. Network 204 may include one or more portions that constitute a private network, a public network (e.g., the internet), or any suitable combination thereof.
Fig. 3 is a block diagram illustrating modules (e.g., components) of the AR wearable device 210 according to some examples. The AR wearable device 210 includes a sensor 302, a display system 304, a processor 308, and a storage device 306. Examples of the AR wearable device 210 include a wearable computing device, AR glasses, a desktop computer, a vehicle computer, a tablet computer, a navigation device, a portable media device, or a smart phone.
The sensors 302 include, for example, optical sensors 316 (e.g., imaging devices such as color imaging devices, thermal imaging devices, depth sensors, and one or more gray scale, global shutter tracking imaging devices) and inertial sensors 318 (e.g., gyroscopes, accelerometers). Other examples of sensors 302 include proximity or location sensors (e.g., near field communication, GPS, bluetooth, wi-Fi), ambient light sensors, thermal temperature sensors, audio sensors (e.g., microphones), or any suitable combination thereof. Note that the sensor 302 described herein is for illustration purposes, and thus the sensor 302 is not limited to the sensor described above.
The display system 304 includes a screen 324 and a DLP projector 326. The DLP projector 326 includes one or more projection elements (e.g., LEDs) that project an image of the virtual object onto the screen 324. In one example, the screen 324 may be transparent or translucent such that the user 206 may view through the screen 324 (in the AR use case). The DLP projector 326 is configured to operate with a predictable color sequence, a single RGB color cycle per frame, and a short pixel persistence. The DLP projector 326 is described in more detail below with respect to FIG. 4.
The processor 308 includes an AR application 310, a tracking system 312, and a DLP controller 314. The AR application 310 uses computer vision to detect and identify the physical environment or the physical object 208. The AR application 310 (which may be implemented in part by the client device 211) retrieves a virtual object (e.g., a 3D object model) based on the identified physical object 208 or physical environment. The AR application 310 renders the virtual object in the display system 304. For AR applications, the AR application 310 includes a local rendering engine that generates a visualization of a virtual object that is overlaid on (e.g., superimposed on, or otherwise displayed in coordination with) an image of the physical object 208 captured by the optical sensor 316. The visualization of the virtual object may be manipulated by adjusting the position of the physical object 208 relative to the optical sensor 316 (e.g., its physical location, orientation, or both). Similarly, the visualization of the virtual object may be manipulated by adjusting the pose of the AR wearable device 210 relative to the physical object 208.
In one example, the AR application 310 includes a brightness control application. The brightness control application receives the measurement of ambient light from the ambient light sensor (e.g., sensor 302) and adjusts one or more hardware parameters of a projection element (e.g., of the DLP projector 326) of the AR wearable device based on the measurement of ambient light. The AR application 310 modifies one or more color values of an image displayed by the projection element of the AR wearable device based on the measurement of ambient light, and projects the image with the modified color values using the projection element of the AR wearable device with the adjusted one or more hardware parameters. In an example, the AR application 310 calculates an effective brightness value based on the ambient light measurements, and this value is used to control the adjustments to the one or more hardware parameters and the one or more color values. In some examples, the AR application 310 selects between operating the projection elements of the DLP projector 326 at 0% Dark Time (DT) or 50% DT as part of achieving the calculated effective brightness value.
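A minimal sketch of this flow is shown below. All function, method, and parameter names here are assumptions for illustration only, not APIs from the disclosure; the brightness calculation and the brightness-to-parameter mapping are sketched further below in this description.

def update_projection(ambient_light_sensor, projector, image):
    lux = ambient_light_sensor.read()                      # receive a measurement of ambient light
    brightness = compute_effective_brightness(lux)         # e.g. a fraction between 0.0 and 1.0
    params = brightness_to_parameters(brightness)          # LED current, dark-time mode, RGB scale
    projector.set_led_current(params["led_current_ma"])    # adjust hardware parameters
    projector.set_dark_time(params["dark_time"])
    scaled_image = scale_rgb(image, params["rgb_scale"])   # modify color values in software
    projector.project(scaled_image)                        # project the modified image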
The tracking system 312 tracks the pose (e.g., position and orientation) of the AR wearable device 210 relative to the real-world environment 202 using optical sensors (e.g., depth-enabled 3D cameras, image cameras), inertial sensors (e.g., gyroscopes, accelerometers), wireless sensors (Bluetooth, Wi-Fi), GPS sensors, and/or audio sensors to determine the location of the AR wearable device 210 within the real-world environment 202. For example, the tracking system 312 accesses inertial sensor data from the inertial sensor 318 and optical sensor data from the optical sensor 316, and determines the pose of the AR wearable device 210 based on the combined inertial sensor data and optical sensor data. In another example, the tracking system 312 determines the pose (e.g., position, location, orientation) of the AR wearable device 210 relative to a frame of reference (e.g., the real-world environment 202). In another example, the tracking system 312 includes a visual odometry system that estimates the pose of the AR wearable device 210 based on a 3D map of feature points derived from the inertial sensor data and the optical sensor data.
The DLP controller 314 communicates data signals to the DLP projector 326 to project virtual content (including one or more images) onto a screen 324 (e.g., a transparent display). The DLP controller 314 includes hardware that converts signals from the AR application 310 into display signals for the DLP projector 326. In one example, DLP controller 314 is part of processor 308. In another example, DLP controller 314 is part of DLP projector 326.
In one example, the DLP controller 314 configures the DLP projector 326 to operate with a predictable color sequence, a single RGB color cycle per frame, and a shorter pixel persistence. For example, the DLP controller 314 determines or identifies a color sequence pattern for the DLP projector 326. The DLP controller 314 instructs the light source (or color filter system) of the DLP projector 326 to produce a single color cycle per frame. The DLP controller 314 also instructs the Digital Micromirror Device (DMD) of the DLP projector 326 to generate shorter pixel persistence. DLP controller 314 is described in more detail below with respect to FIG. 5.
The storage device 306 stores virtual object content 320 and DLP configuration settings 322. Virtual object content 320 includes, for example, a database of visual references (e.g., images) and corresponding experiences (e.g., three-dimensional virtual objects, interactive features of three-dimensional virtual objects). In one example, the storage device 306 includes a primary content data set, a contextual content data set, and a visual content data set. The primary content data set includes, for example, a first set of images and corresponding experiences (e.g., interactions with a three-dimensional virtual object model). For example, the image may be associated with one or more virtual object models. The primary content data set may include a set of core images. The set of core images may include a limited number of images identified by server 212. For example, the set of core images may include images depicting covers of ten most viewed physical objects and their corresponding experiences (e.g., virtual objects representing the ten most viewed physical objects). In another example, the server 212 may generate the first set of images based on the most popular or frequently scanned images received at the server 212. Thus, the primary content data set is not dependent on the physical object or image obtained by the optical sensor 316.
The DLP configuration settings 322 include, for example, effective brightness values, settings of the DLP projector 326, and/or settings determined by the DLP controller 314. Examples of settings include RGB bit-plane cycling rate, frame rate, color sequence, luminance values, and pixel persistence time.
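An illustrative data structure for the DLP configuration settings 322 listed above might look like the following; the field names and types are assumptions, not taken from the disclosure.

from dataclasses import dataclass
from typing import Tuple

@dataclass
class DlpConfigurationSettings:
    effective_brightness: int          # e.g. 0 (minimum) to 255 (maximum)
    bitplane_cycle_rate_hz: int        # RGB bit-plane cycling rate
    frame_rate_hz: int                 # projector frame rate
    color_sequence: Tuple[str, ...]    # e.g. ("R", "G", "B")
    pixel_persistence_ms: float        # pixel persistence time, e.g. less than 3 ms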
Any one or more of the modules described herein may be implemented using hardware (e.g., a processor of a machine) or a combination of hardware and software. For example, any of the modules described herein may configure a processor to perform the operations described herein for that module. Furthermore, any two or more of these modules may be combined into a single module, and the functionality described herein for a single module may be subdivided among multiple modules. Furthermore, according to various examples, modules described herein as being implemented within a single machine, database, or device may be distributed across multiple machines, databases, or devices.
Fig. 4 is a block diagram illustrating a DLP projector 326 according to one example. The DLP projector 326 includes a light source 402 (also referred to as a light source component or projection element), a condenser lens 404, a shaping lens 406, a DMD 408, and a projection lens 410. The light source 402 includes, for example, a pressurized bulb, a laser, or high-power LEDs. In one example, the light source 402 includes three colored LEDs: a blue LED 412, a red LED 414, and a green LED 416. Each colored LED emits colored light toward its corresponding collimating lens (e.g., collimating lens 418, collimating lens 420, collimating lens 422) according to its corresponding hardware parameters.
The DLP controller 314 interfaces with the light source 402 of the DLP projector 326 and controls the light source 402 to generate a single RGB repeat per frame. In one example, the DLP controller 314 interfaces with the light source 402 and identifies a color sequence of the light source 402. For example, the DLP controller 314 queries the DLP projector 326 and identifies the model number of the DLP projector 326. The DLP controller 314 identifies the color sequence of the light source 402 based on the model number of the DLP projector 326.
In another example, the light source 402 includes, for example, a white light source (not shown) and a color wheel (not shown) divided into primary colors (red, green, and blue). The color wheel rotates at a high speed (e.g., 7200 RPM). The DLP controller 314 synchronizes the rotational motion of the color wheel such that the green component is displayed on the DMD when the green portion of the color wheel is in front of the lamp. The same is true for the red, blue and other parts. The colors are sequentially displayed at a rate high enough so that the viewer sees a composite (full color) image. The black color is created by directing unused light away from the light source 402. For example, unused light is scattered to reflect and dissipate on the inner walls of DMD 408 or projection lens 410. The DLP controller 314 operates the light source 402 such that the color wheel rotates one RGB cycle per frame.
The condenser lens 404 focuses light from the light source 402 onto the shaping lens 406. The shaping lens 406 diffuses the light from the light source 402 onto the DMD 408. The DMD 408 includes hundreds of individual micromirrors. Digital signals representing 0 and 1 drive the micromirrors to rotate to a selected angle, reflecting unwanted light away and directing the desired light to the projection lens 410. Through persistence of vision, light of different colors is synthesized into a color image by the human eye. In one example, the DLP controller 314 controls the DMD 408 to reduce the persistence of each pixel. Persistence refers to the time each pixel remains lit. High persistence (e.g., 8.3 ms at 120 Hz) results in blurring and smearing of the image. The DLP controller 314 reduces the persistence of each pixel to, for example, less than 3 ms.
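The persistence figure quoted above follows directly from the refresh rate; a small sketch of that arithmetic (illustrative only, with an assumed function name):

def frame_period_ms(refresh_rate_hz):
    # Full frame period: at 120 Hz this is about 8.3 ms, the "high persistence"
    # figure mentioned above; the DLP controller targets under 3 ms per pixel.
    return 1000.0 / refresh_rate_hz

# frame_period_ms(120) -> approximately 8.33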
Fig. 5 illustrates the DLP controller 314 according to one example. The DLP controller 314 includes a motion color artifact compensation module 506 and a low persistence module 508. The motion color artifact compensation module 506 reduces color artifacts produced by motion of the AR wearable device 210. For example, when the user 206, on whose head the AR wearable device 210 is mounted, moves his or her head, the displayed virtual content separates into its primary colors (RGB); more precisely, the color sequence becomes visible.
DLP projectors that utilize a mechanically rotating color wheel exhibit this color separation, which is also known as the "rainbow effect. This is best described as a brief flash of perceived red, blue and green "shadows", which is most often observed when the projected content is characterized by a moving bright or white object in a high contrast area on a nearly fully dark or black background. A short-lived visible separation of colors may also be apparent when the observer moves their eyes rapidly over the projected image. Generally, the faster the user moves his eyes/head, the farther apart the colors appear.
The motion color artifact compensation module 506 reduces or eliminates rainbow effects by compensating for color artifacts based on the predictable data. In other words, the motion color artifact compensation module 506 predicts where and how colors are rendered in the event of user motion and compensates for motion-to-photon delays based on each color. In one example, the motion color artifact compensation module 506 includes a color cycling module 502 and a color sequence module 504.
The color cycling module 502 configures the light source 402 to generate only a single repetition of the primary colors (RGB) per image frame. For example, a conventional light source generates four RGB color cycles (RGB-RGB-RGB-RGB) per frame (e.g., at about 60 Hz). Multiple color cycles can result in a stutter effect, since the picture is seen four times at slightly different locations. This stutter effect is particularly exacerbated during head movements of the AR wearable device 210 while virtual content is displayed.
The color sequence module 504 identifies or determines the color sequence of the light source 402. As previously described, in conventional DLP projectors, when a user moves his or her head, the displayed virtual content separates into its primary colors; more precisely, the color sequence becomes visible. For example, a simple RGB sequence would have its three colors bleed apart. The faster the user moves his or her head, the farther apart the colors appear. A high-frequency color sequence may be used to mitigate color bleeding; however, high frequencies may result in motion blur and unreadable text. The color sequence module 504 identifies the color sequence (R, G, and B) of the light source 402 and counteracts the effects of color separation based on the predicted color sequence for each frame.
The low persistence module 508 reduces persistence for each pixel by controlling the DMD 408 to direct light from the light source 402 away from the projection lens 410. In one example, the low persistence module 508 reduces persistence for each pixel to, for example, less than 3ms. In another example, DLP controller 314 controls DMD 408 to display black (direct light away from projection lens 410) for 50% of the frame time, thereby causing a shift in the respective color planes.
In addition, the DLP controller 314 may control the current of each LED of the light source 402, thereby providing a range of possible brightness levels. However, the available current range limits the upper and lower bounds of the LED brightness. In some examples, the current range is chosen so that the maximum LED brightness achieves a perceived brightness corresponding to outdoor ambient brightness. This may leave a lower bound on perceived brightness that is still greater than indoor ambient brightness, resulting in a projection that is too bright to be comfortable for the user in an indoor environment. Thus, turning off the projector for a portion of each frame (i.e., dark time) may be utilized to achieve a lower perceived brightness in an indoor environment while maintaining the ability to achieve the maximum perceived brightness in an outdoor environment (i.e., through maximum current and 0% dark time). In addition, other methods may be utilized. For example, the RGB compositor may change the hue and color spectrum of any given color to be displayed in order to increase or decrease the perceived brightness of that color. Each of these methods, alone or in combination, enables the AR wearable device to operate at various ambient brightness levels while maintaining low power consumption and extending battery life.
In an example, the DLP controller 314 may operate the LEDs of the light source 402 in a 0% Dark Time (DT) mode or a 50% DT mode. In the 0% DT mode, the DLP controller 314 performs two repetitions (e.g., RGBRGB) for each desired color. Further, the DLP controller 314 may drive the LEDs at the maximum amount of current in the 0% DT mode. This results in the maximum achievable effective brightness setting. In such implementations, the 0% DT mode also consumes the maximum amount of power. To reduce the amount of power and brightness, the DLP projector 326 may operate the LEDs of the light source 402 in the 50% DT mode. In the 50% DT mode, the DLP projector 326 performs one repetition (e.g., RGB, dark) for each desired color. That is, in this mode, the LEDs are driven for half the number of repetitions of the 0% DT mode. To achieve the same effective or perceived brightness level in the 50% DT mode as in the 0% DT mode, the current used to drive the LEDs would need to be doubled. However, certain projection elements of the light source 402 cannot be driven at such high currents without damaging the components.
One advantage of operating the LEDs of the light source 402 in the 50% DT mode is that a lower minimum perceived brightness level can be achieved than in the 0% DT mode. That is, driving the LEDs with the minimum operating current in the 50% DT mode produces a lower effective brightness than that perceived when driving the LEDs with the same minimum operating current in the 0% DT mode. According to some implementations, the DLP projector 326 may select between operating the LEDs of the light source 402 in the 0% DT mode or in the 50% DT mode depending on the desired or calculated effective brightness value for a given real-world environment. In addition, the DLP projector 326 may also modify the color values of pixels of an image received for display on the AR wearable device 210 in order to further reduce or increase the perceived effective brightness value. By using a combination of modifying the hardware parameters of the LEDs (e.g., changing the DT operation mode and modifying the current used to drive the LEDs) and modifying the software pixel values (e.g., changing white pixels in the image to gray pixels, or scaling or multiplying the red, green, and blue pixel values of the image by a specified value) based on the current ambient light measurements, the effective brightness perceived by the user through the AR wearable device 210 can be set more effectively for different real-world environments, including bright outdoor environments and darker indoor environments. This enables the AR wearable device 210 to be used in a wider range of environments without negatively affecting the overall user experience.
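The doubling of drive current mentioned above can be seen from a simplified model in which perceived brightness scales with the LED current multiplied by the fraction of the frame during which the LEDs are lit; this is a rough approximation for illustration, not a formula from the disclosure.

def approximate_perceived_brightness(led_current, dark_time_fraction):
    # Simplified: brightness is roughly current times the lit fraction of the frame.
    return led_current * (1.0 - dark_time_fraction)

# Matching 0% DT at current I requires roughly 2*I at 50% DT:
# approximate_perceived_brightness(1.0, 0.0) == approximate_perceived_brightness(2.0, 0.5)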
In some examples, the DLP controller 314 receives a measurement of ambient light from an ambient light sensor of the AR wearable device 210. The measurement of ambient light may be received in lux (lx) and is used as one input to the effective brightness setting calculated by the DLP controller 314. In one example, the DLP controller 314 may automatically change or control the effective brightness setting of the AR wearable device 210. In another example, the DLP controller 314 may change or control the effective brightness setting of the AR wearable device 210 based on manual input control (e.g., based on input received from a user directly on the AR wearable device 210 or input received from an application implemented on the client device 211 in communication with the AR wearable device 210). The effective brightness setting may be specified within a range from 0 (minimum brightness value) to 255 (maximum brightness value). In some implementations, when the current thermal temperature of the AR wearable device is at or above a specified threshold, the maximum brightness value may be reduced based on the current thermal temperature (e.g., from 255 to 80% of the maximum brightness value, such as 204). When the current thermal temperature returns below the specified threshold, the maximum brightness value returns to a default value, such as 255.
In an example, the DLP controller 314 determines that the measurement of ambient light is less than a first threshold (e.g., 10 lx). In this case, the DLP controller 314 sets the effective brightness setting to a first value that is 30% of the maximum brightness setting. The DLP controller 314 may determine that the ambient light measurement is within a first range (e.g., between 10 lx and 200 lx). In this case, the DLP controller 314 may calculate the effective brightness setting as an interpolation between the first value (e.g., 30% of the maximum brightness setting) and a second value (e.g., 100% of the maximum brightness setting). As an example, the DLP controller 314 may calculate the difference between the current measurement of ambient light and the first threshold. The DLP controller 314 may then calculate the effective brightness value by linearly interpolating between the first value and the second value based on the calculated difference. For example, the DLP controller 314 may determine that the current measurement of ambient light is 50 lx; in this case, the DLP controller 314 may calculate the effective brightness value as 39% of the maximum brightness setting (based on linearly interpolating between 30% and 100% according to the 40 lx difference above the first threshold). The DLP controller 314 may determine that the measurement of ambient light exceeds the maximum of the first range (e.g., is greater than 200 lx). In this case, the DLP controller 314 may set the effective brightness setting to the second value (e.g., 100% of the maximum brightness setting).
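A sketch of this interpolation using the example thresholds from the text (10 lx, 200 lx, 30%, 100%); note that the worked example above (50 lx yielding 39%) suggests the disclosure's exact mapping may differ from the plain linear interpolation shown here, so treat this only as an illustration.

def effective_brightness_percent(ambient_lux,
                                 low_threshold_lux=10.0,
                                 high_threshold_lux=200.0,
                                 low_percent=30.0,
                                 high_percent=100.0):
    # Below the first threshold: use the first (minimum) value.
    if ambient_lux < low_threshold_lux:
        return low_percent
    # Above the first range: use the second (maximum) value.
    if ambient_lux > high_threshold_lux:
        return high_percent
    # Within the first range: linearly interpolate between the two values.
    fraction = (ambient_lux - low_threshold_lux) / (high_threshold_lux - low_threshold_lux)
    return low_percent + fraction * (high_percent - low_percent)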
In some examples, the DLP controller 314 may calculate the effective brightness value based on whether the lenses of the AR wearable device 210 are colored or transparent. In particular, the DLP controller 314 may determine that the lens is colored. In this case, the DLP controller 314 may determine that the measurement of ambient light is less than the first threshold (e.g., 10 lx). In response, the DLP controller 314 sets the effective brightness setting to a particular value, such as 0% of the maximum brightness setting. The DLP controller 314 may determine that the ambient light measurement is within a second range (e.g., between 25 lx and 300 lx). In this case, the DLP controller 314 may calculate the effective brightness setting as an interpolation between the particular value (e.g., 0% of the maximum brightness setting) and a second value (e.g., 100% of the maximum brightness setting). As an example, the DLP controller 314 may calculate the difference between the current measurement of ambient light and the first threshold, and may then calculate the effective brightness value by linearly interpolating between the two values based on the calculated difference. For example, the DLP controller 314 may determine that the current measurement of ambient light is 50 lx; in this case, the DLP controller 314 may calculate the effective brightness value as 39% of the maximum brightness setting (based on linearly interpolating according to the 40 lx difference above the first threshold). The DLP controller 314 may determine that the measurement of ambient light exceeds the maximum of the second range (e.g., is greater than 200 lx or 300 lx). In this case, the DLP controller 314 may set the effective brightness setting to the second value (e.g., 100% of the maximum brightness setting).
As another example, the DLP controller 314 may determine that the lens is transparent. In this case, the DLP controller 314 may determine that the measurement of ambient light is less than a second threshold (e.g., 25 lx), which may be set greater than the first threshold used when the lens is determined to be colored. That is, the DLP controller 314 may select between a first threshold (associated with colored lenses) and a second threshold (associated with transparent lenses) from a plurality of thresholds for controlling the effective brightness based on ambient light measurements. In response to selecting the second threshold, the DLP controller 314 sets the effective brightness setting to a particular value, such as 0% of the maximum brightness setting, if the ambient light measurement is less than the second threshold. The DLP controller 314 may determine that the ambient light measurement is within a third range (e.g., between 10 lx and 200 lx). In this case, the DLP controller 314 may calculate the effective brightness setting as an interpolation between the particular value (e.g., 0% of the maximum brightness setting) and a second value (e.g., 100% of the maximum brightness setting). As an example, the DLP controller 314 may calculate the difference between the current measurement of ambient light and the second threshold, and may then calculate the effective brightness value by linearly interpolating between the two values based on the calculated difference. For example, the DLP controller 314 may determine that the current measurement of ambient light is 50 lx; in this case, the DLP controller 314 may calculate the effective brightness value as 39% of the maximum brightness setting (based on linearly interpolating according to the difference above the second threshold). The DLP controller 314 may determine that the measurement of ambient light exceeds the maximum of the third range (e.g., is greater than 200 lx or 300 lx). In this case, the DLP controller 314 may set the effective brightness setting to the second value (e.g., 100% of the maximum brightness setting).
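The lens-dependent threshold selection described in the last two paragraphs can be summarized in a small sketch; the specific lux values are the examples given above, and the function name is an assumption.

def low_light_threshold_lux(lens_is_colored):
    # Colored (tinted) lenses use the first threshold (e.g., 10 lx); transparent
    # lenses use the larger second threshold (e.g., 25 lx).
    return 10.0 if lens_is_colored else 25.0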
In some implementations, the DLP controller 314 may calculate the effective brightness setting of the AR wearable device 210 by taking the minimum of the screen brightness scaled by an animation scale (which may be set between 0 and 1) and the thermal limit of the AR wearable device 210. That is, if the screen brightness calculated in the manner discussed above (e.g., based on user input or based on measurements of ambient light received from an ambient light sensor) is below the maximum brightness associated with the thermal limit, the DLP controller 314 sets the brightness setting to the brightness value calculated based on the user input or based on the measurements of ambient light received from the ambient light sensor. If the screen brightness calculated in the manner discussed above is greater than or exceeds the maximum brightness associated with the thermal limit, the DLP controller 314 sets the brightness setting to the thermal limit (e.g., 80% of the maximum brightness value, or any number between 80% and 100%).
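A one-line sketch of this computation, assuming the animation scale multiplies the screen brightness; the names are illustrative only.

def final_brightness(screen_brightness, animation_scale, thermal_limit):
    # screen_brightness: from user input or from ambient light measurements.
    # animation_scale: a factor between 0 and 1.
    # thermal_limit: e.g. 80% to 100% of the maximum brightness value.
    return min(screen_brightness * animation_scale, thermal_limit)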
In some implementations, the DLP controller 314 changes projector LED current, operating mode (e.g., 50% DT (50-DT) or 0% DT (0-DT)) and composition of the image RGB values based on the calculated effective brightness setting. An illustrative way in which DLP controller 314 changes these parameters is summarized in the following table:
Table 1: hardware and software modifications for controlling brightness
In particular, the DLP controller 314 may access the ambient light measurements to calculate an effective brightness value, or may receive input from a user to set the effective brightness value. In response to determining that the effective brightness value is set to the maximum brightness setting (e.g., 100%), the DLP controller 314 may drive the projection elements (e.g., LEDs) of the projector at the maximum current and set the operating mode to 0% DT. In this case, the DLP controller 314 may output the image on the AR wearable device 210 with the received RGB values (e.g., without scaling the RGB values). In some cases, the DLP controller 314 may determine that the effective brightness value is set to a brightness amount between 80% and 100%. In this case, the DLP controller 314 may drive the projection elements (e.g., LEDs) of the projector with a current that scales linearly relative to the maximum current. That is, the lower the effective brightness level within the 80% to 100% range, the lower the linearly scaled current (e.g., if the effective brightness level decreases from 100% to 95%, which is a 5% decrease, the DLP controller 314 decreases the current by 5%). The DLP controller 314 may set the operating mode to 50% DT and may output the image on the AR wearable device 210 with the received RGB values (e.g., without scaling the RGB values).
In some cases, the DLP controller 314 may determine that the effective brightness value is set to a brightness amount between 0% (or a value slightly greater than 0%) and 80%. In this case, the DLP controller 314 may drive the projection elements (e.g., LEDs) of the projector at the minimum current and may set the operating mode to 50% DT. The DLP controller 314 may also linearly scale the RGB values of the image on the AR wearable device 210. That is, the DLP controller 314 may modify the values of the received image by scaling the RGB values based on the amount by which the effective brightness is reduced relative to 80%, prior to displaying the image on the display of the AR wearable device 210. For example, if the effective brightness is calculated to be 60%, the RGB values may be linearly scaled by a first value, and if the effective brightness is calculated to be 40%, the RGB values may be linearly scaled by a second value (lower than the first value). The lowest brightness is achieved by driving the LEDs with no current, setting the operating mode to 50% DT, and scaling the RGB values of the image by 0%.
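The mapping summarized in Table 1 and the two preceding paragraphs might be implemented along the following lines. This is a sketch under assumptions: the current values are placeholders, the exact scaling is device-specific, and the function name is illustrative.

def brightness_to_parameters(brightness_fraction, max_current_ma=100.0, min_current_ma=0.0):
    # brightness_fraction: effective brightness as a fraction between 0.0 and 1.0.
    if brightness_fraction >= 1.0:
        # Maximum brightness: maximum current, 0% dark time, unmodified RGB values.
        return {"led_current_ma": max_current_ma, "dark_time": 0.0, "rgb_scale": 1.0}
    if brightness_fraction >= 0.8:
        # Between 80% and 100%: current scales linearly with brightness,
        # 50% dark time, unmodified RGB values.
        return {"led_current_ma": max_current_ma * brightness_fraction,
                "dark_time": 0.5, "rgb_scale": 1.0}
    # Between 0% and 80%: minimum current, 50% dark time, RGB values scaled linearly.
    return {"led_current_ma": min_current_ma, "dark_time": 0.5,
            "rgb_scale": brightness_fraction / 0.8}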
Fig. 6 illustrates a wearable device (e.g., glasses 600) according to some examples. The glasses 600 may be of any suitable type, including the glasses 100, and like reference numerals have been used to describe like components of the glasses 600 and the glasses 100. For simplicity, only a portion of the glasses 600 is shown in fig. 6. The glasses 600 include an optical lens 602 secured within each of the left and right optical element holders 114, 120. Each of the optical lenses 602 has a respective front surface (not shown) and an opposite rear surface 604. The left end 110 and right end 124 of the front piece 134 may include respective left and right frame extensions 608, 614 extending rearward from the respective left end 110 and right end 124. As discussed above with respect to fig. 1, a left temple piece 136 (not shown in fig. 6) and a right temple piece 138 may be provided, and the left temple piece 136 and right temple piece 138 may be fixedly secured to the respective frame extensions 608 and 614 or removably attached to the respective frame extensions 608 and 614. In one or more examples, a connector mechanism 616 is provided for securing the temple piece 136 and the temple piece 138 to the respective frame extension 608 and frame extension 614.
The glasses 600 include a computer 620 (e.g., a computing system), which can be similar to the computer 128 or the machine 800. In the example of fig. 6, the computer 620 is powered by a suitable rechargeable battery (not shown), which may be similar to the battery 106. The computer 620 may be implemented using one or more of the processor elements of the AR wearable device 210 (including the processor 308) and/or the storage device 306.
The glasses 600 also include a sensor 618 (e.g., one or more cameras, ambient light sensors), which sensor 618 may be similar to the optical sensor 316 and/or other digital sensors, and may face inward (e.g., toward the user) and/or outward (e.g., away from the user). The data feed from the sensor 618 is provided to the computer 620. In the example of fig. 6, the computer 620 is disposed within the first portion 108 of the right temple piece 138, although in alternative examples the computer 620 may be disposed elsewhere. In the example of fig. 6, the right temple piece 138 includes a removable cover portion 606 for accessing a connector mechanism 616 or other electronic components of the eyeglass 600.
The glasses 600 include an optical assembly 622 for displaying images to the user. In the example of fig. 6, one optical assembly 622 is shown, but in other examples, an optical assembly may be provided for each of the user's eyes (e.g., associated with both the temple piece 136 and the temple piece 138). The optical assembly 622 includes a light source, such as a projector 610, disposed in one of the arms or temples of the glasses (e.g., the temple piece 136 of the glasses 600). In one or more examples, the projector 610 is a three-color laser projector using a scanning mirror or galvanometer. The computer 620 is connected to the projector 610.
Fig. 7 is a flow chart of a process 700 according to some examples. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of operations may be rearranged. The process terminates when its operations are completed. The process may correspond to a method, procedure, etc. The steps of a method may be performed in whole or in part, may be performed in combination with some or all of the steps of other methods, and may be performed by any number of different systems or any portion thereof (e.g., a processor included in any system).
At operation 701, the DLP controller 314 instructs the projection element of the AR wearable device to project an image, as discussed above.
At operation 702, the DLP controller 314 receives a measurement of ambient light from an ambient light sensor, as discussed above.
At operation 703, the DLP controller 314 adjusts one or more hardware parameters of a projection element of the AR wearable device based on the measurement of ambient light, as discussed above.
At operation 704, the DLP controller 314 modifies one or more color values of the image displayed by the projection element of the AR wearable device based on the measurement of ambient light, as discussed above.
At operation 705, the DLP controller 314 projects an image with modified color values using a projection element of the AR wearable device with the adjusted one or more hardware parameters, as discussed above.
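For illustration only, the following Python sketch mirrors the flow of operations 701 through 705. The class and function names (Projector, Frame, adaptive_brightness_step) and all numeric mappings are hypothetical placeholders introduced for the example; they are not the interface of the DLP controller 314 or the actual parameter values used by the AR wearable device.

```python
# Illustrative sketch only: every name and numeric mapping below is a
# hypothetical placeholder, not the DLP controller 314 interface.
from dataclasses import dataclass


@dataclass
class Frame:
    """Stand-in for an image to be projected; pixel values are in [0.0, 1.0]."""
    pixels: list[tuple[float, float, float]]

    def scaled(self, factor: float) -> "Frame":
        # Operation 704: modify the image's color values.
        return Frame([(r * factor, g * factor, b * factor) for r, g, b in self.pixels])


class Projector:
    """Stand-in for the projection element of the AR wearable device."""

    def set_hardware_params(self, led_current_ma: float, dark_time_pct: int) -> None:
        # Operation 703: adjust hardware parameters (e.g., LED current, dark time).
        print(f"LED current: {led_current_ma:.0f} mA, dark time: {dark_time_pct}%")

    def project(self, frame: Frame) -> None:
        # Operations 701 and 705: project the (possibly modified) image.
        print(f"projecting {len(frame.pixels)} pixel(s)")


def adaptive_brightness_step(projector: Projector, ambient_lux: float, frame: Frame) -> None:
    # Operation 702: a measurement of ambient light has been received (ambient_lux).
    # Map the measurement to a normalized effective brightness (placeholder mapping).
    effective = min(ambient_lux / 10_000.0, 1.0)

    # Operation 703: brighter surroundings call for more projector output.
    led_current_ma = 50.0 + 150.0 * effective
    dark_time_pct = 0 if effective >= 1.0 else 50
    projector.set_hardware_params(led_current_ma, dark_time_pct)

    # Operations 704 and 705: scale the color values and project the result.
    projector.project(frame.scaled(0.5 + 0.5 * effective))


adaptive_brightness_step(Projector(), ambient_lux=3_000.0, frame=Frame([(1.0, 0.5, 0.2)]))
```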
Machine architecture
Fig. 8 is a diagrammatic representation of a machine 800 within which instructions 808 (e.g., software, programs, applications, applets, apps, or other executable code) for causing the machine 800 to perform any one or more of the methods discussed herein may be executed. For example, the instructions 808 may cause the machine 800 to perform any one or more of the methods described herein. The instructions 808 transform a generic, un-programmed machine 800 into a specific machine 800 that is programmed to perform the described and illustrated functions in the manner described. The machine 800 may operate as a standalone device or may be coupled (e.g., networked) to other machines. In a networked deployment, the machine 800 may operate in the capacity of a server machine or a client machine in server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. Machine 800 may include, but is not limited to, a server computer, a client computer, a Personal Computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a Personal Digital Assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), other smart devices, a web device, a network router, a network switch, a network bridge, or any machine capable of sequentially or otherwise executing instructions 808 that specify actions to be taken by machine 800. Furthermore, while only a single machine 800 is illustrated, the term "machine" shall also be taken to include a collection of machines that individually or jointly execute instructions 808 to perform any one or more of the methodologies discussed herein. For example, the machine 800 may include the client device 102 or any one of several server devices that form part of the messaging server system 108. In some examples, machine 800 may also include both a client system and a server system, where certain operations of a particular method or algorithm are performed on the server side and certain operations of a particular method or algorithm are performed on the client side.
Machine 800 may include a processor 802, a memory 804, and input/output (I/O) components 838, which may be configured to communicate with each other via a bus 840. In an example, the processor 802 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Radio Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, a processor 806 and a processor 810 that execute instructions 808. The term "processor" is intended to include a multi-core processor, which may include two or more separate processors (sometimes referred to as "cores") that may execute instructions simultaneously. Although fig. 8 shows multiple processors 802, machine 800 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof.
The memory 804 includes a main memory 812, a static memory 814, and a storage unit 816, all accessible to the processors 802 via the bus 840. The main memory 812, the static memory 814, and the storage unit 816 store the instructions 808 embodying any one or more of the methodologies or functions described herein. The instructions 808 may also reside, completely or partially, within the main memory 812, within the static memory 814, within a machine-readable medium within the storage unit 816, within at least one of the processors 802 (e.g., within a processor's cache memory), or within any suitable combination thereof, during execution thereof by the machine 800.
The I/O components 838 may include a variety of components that receive input, provide output, generate output, send information, exchange information, capture measurements, and so forth. The particular I/O components 838 included in a particular machine will depend on the type of machine. For example, a portable machine such as a mobile phone may include a touch input device or other such input mechanism, while a headless server machine would be unlikely to include such a touch input device. It should be appreciated that the I/O component 838 can include many other components not shown in FIG. 8. In various examples, the I/O components 838 may include a user output component 824 and a user input component 826. The user output component 824 may include visual components (e.g., a display such as a Plasma Display Panel (PDP), a Light Emitting Diode (LED) display, a Liquid Crystal Display (LCD), a projector, or a Cathode Ray Tube (CRT)), audible components (e.g., speakers), tactile components (e.g., vibration motor, resistance mechanism), other signal generators, and so forth. The user input component 826 can include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, an optoelectronic keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, touchpad, trackball, joystick, motion sensor, or other pointing instrument), tactile input components (e.g., physical buttons, a touch screen providing positioning and force of a touch or touch gesture, or other tactile input components), audio input components (e.g., a microphone), and the like.
In other examples, the I/O components 838 may include a biometric component 828, a motion component 830, an environmental component 832, or a location component 834, among various other components. For example, the biometric component 828 includes components for detecting expressions (e.g., hand expressions, facial expressions, voice expressions, body gestures, or eye tracking), measuring biological signals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identifying a person (e.g., voice recognition, retinal recognition, facial recognition, fingerprint recognition, or electroencephalogram-based recognition), and the like. The motion component 830 includes an acceleration sensor component (e.g., accelerometer), a gravity sensor component, a rotation sensor component (e.g., gyroscope).
Environmental components 832 include, for example, one or more cameras (with still image/photo and video functionality), an illumination sensor component (e.g., a photometer), a temperature sensor component (e.g., one or more thermometers that detect ambient temperature), a humidity sensor component, a pressure sensor component (e.g., a barometer), an auditory sensor component (e.g., one or more microphones that detect background noise), a proximity sensor component (e.g., an infrared sensor that detects nearby objects), a gas sensor (e.g., a gas detection sensor that detects the concentration of hazardous gas or measures contaminants in the atmosphere for safety), or other components that may provide an indication, measurement, or signal corresponding to the surrounding physical environment.
Regarding cameras, the client device 102 may have a camera system including, for example, a front camera on a front surface of the client device 102 and a rear camera on a rear surface of the client device 102. The front camera may, for example, be used to capture still images and video (e.g., "selfies") of the user of the client device 102, which may then be enhanced with the enhancement data (e.g., filters) described above. The rear camera may, for example, be used to capture still images and video in a more traditional camera mode, with these images similarly being enhanced with enhancement data. In addition to the front and rear cameras, the client device 102 may also include a 360° camera for capturing 360° photos and videos.
Further, the camera system of the client device 102 may include dual rear cameras (e.g., a main camera and a depth-sensing camera), or even triple, quad, or penta rear camera configurations on the front and rear sides of the client device 102. These multiple camera systems may include, for example, a wide-angle camera, an ultra-wide-angle camera, a telephoto camera, a macro camera, and a depth sensor.
The location component 834 includes a position sensor component (e.g., a GPS receiver component), an altitude sensor component (e.g., an altimeter or barometer that detects barometric pressure from which altitude can be derived), an orientation sensor component (e.g., a magnetometer), and so forth.
Communication may be accomplished using a variety of techniques. The I/O components 838 also include a communication component 836, which is operable to couple the machine 800 to the network 820 or the device 822 via a respective coupling or connection. For example, the communication component 836 may include a network interface component or another suitable device for interfacing with the network 820. In other examples, the communication component 836 may include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components for providing communication via other modalities. The device 822 may be another machine or any of a variety of peripheral devices (e.g., a peripheral device coupled via USB).
Further, the communication component 836 may detect identifiers or include components operable to detect identifiers. For example, the communication component 836 may include a Radio Frequency Identification (RFID) tag reader component, an NFC smart tag detection component, an optical reader component (e.g., an optical sensor for detecting one-dimensional barcodes such as Universal Product Code (UPC) barcodes, multi-dimensional barcodes such as Quick Response (QR) codes, Aztec codes, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, and UCC RSS-2D barcodes, and other optical codes), or an acoustic detection component (e.g., a microphone for identifying tagged audio signals). In addition, various information may be derived via the communication component 836, such as location via Internet Protocol (IP) geolocation, location via Wi-Fi® signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and so forth.
The various memories (e.g., main memory 812, static memory 814, and memory of processor 802) and storage unit 816 may store one or more sets of instructions and data structures (e.g., software) embodying or used by any one or more of the methods or functions described herein. These instructions (e.g., instructions 808), when executed by the processor 802, cause various operations to implement the disclosed examples.
The instructions 808 may be transmitted or received over the network 820 via a network interface device (e.g., a network interface component included in the communication component 836) using a transmission medium and using any one of a number of well-known transmission protocols (e.g., hypertext transfer protocol (HTTP)). Similarly, the instructions 808 may be transmitted or received via a coupling (e.g., peer-to-peer coupling) to the device 822 using a transmission medium.
Software architecture
Fig. 9 is a block diagram 900 illustrating a software architecture 904, which software architecture 904 may be installed on any one or more of the devices described herein. The software architecture 904 is supported by hardware, such as a machine 902, the machine 902 including a processor 920, memory 926 and an I/O component 938. In this example, the software architecture 904 may be conceptualized as a stack of layers, with each layer providing a particular function. The software architecture 904 includes layers such as an operating system 912, libraries 910, frameworks 908, and applications 906. In operation, the application 906 calls an API call 950 through the software stack and receives a message 952 in response to the API call 950.
The operating system 912 manages hardware resources and provides common services. The operating system 912 includes, for example, a kernel 914, services 916, and drivers 922. The kernel 914 acts as an abstraction layer between the hardware and the other software layers. For example, the kernel 914 provides memory management, processor management (e.g., scheduling), component management, networking, and security settings, among other functions. The services 916 may provide other common services for the other software layers. The drivers 922 are responsible for controlling or interfacing with the underlying hardware. For instance, the drivers 922 may include display drivers, imaging device drivers, Bluetooth® or Bluetooth® Low Energy drivers, flash memory drivers, serial communication drivers (e.g., USB drivers), Wi-Fi® drivers, audio drivers, power management drivers, and so forth.
The libraries 910 provide a common low-level infrastructure used by the applications 906. The libraries 910 may include system libraries 918 (e.g., a C standard library) providing functions such as memory allocation functions, string manipulation functions, mathematical functions, and the like. In addition, the libraries 910 may include API libraries 924, such as media libraries (e.g., libraries supporting the presentation and manipulation of various media formats, such as Moving Picture Experts Group-4 (MPEG4), Advanced Video Coding (H.264 or AVC), Moving Picture Experts Group Layer-3 (MP3), Advanced Audio Coding (AAC), Adaptive Multi-Rate (AMR) audio codec, Joint Photographic Experts Group (JPEG or JPG), or Portable Network Graphics (PNG)), graphics libraries (e.g., an OpenGL framework used to render two-dimensional (2D) and three-dimensional (3D) graphic content on a display), database libraries (e.g., SQLite, which provides various relational database functions), web libraries (e.g., WebKit, which provides web browsing functionality), and the like. The libraries 910 may also include a wide variety of other libraries 928 to provide many other APIs to the applications 906.
Framework 908 provides a common high-level infrastructure used by applications 906. For example, framework 908 provides various Graphical User Interface (GUI) functions, advanced resource management, and advanced location services. Framework 908 can provide a wide variety of other APIs that can be used by applications 906, some of which can be specific to a particular operating system or platform.
In an example, the applications 906 can include a home application 936, a contacts application 930, a browser application 932, a book-viewer application 934, a positioning application 942, a media application 944, a messaging application 946, a gaming application 948, and a variety of other applications such as an external application 940. The applications 906 are programs that execute functions defined in the programs. One or more of the applications 906, structured in a variety of manners, may be created using a variety of programming languages, such as an object-oriented programming language (e.g., Objective-C, Java, or C++) or a procedural programming language (e.g., C or assembly language). In a particular example, the external application 940 (e.g., an application developed using the ANDROID™ or IOS™ Software Development Kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as IOS™, ANDROID™, WINDOWS® Phone, or another mobile operating system. In this example, the external application 940 can invoke the API calls 950 provided by the operating system 912 to facilitate the functionality described herein.
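As a rough illustration of this layering, the sketch below traces a single application-level brightness request down through a framework, a library, and an operating-system driver. All class and method names are invented for the example and do not correspond to any actual API of the software architecture 904.

```python
# Hypothetical example of the app -> framework -> library -> driver call path;
# none of these names are real APIs from the software architecture 904.

class DisplayDriver:                          # analogous to the drivers 922 layer
    def set_backlight(self, level: float) -> None:
        print(f"driver: backlight set to {level:.2f}")


class SystemLibrary:                          # analogous to the libraries 910 layer
    def __init__(self, driver: DisplayDriver) -> None:
        self.driver = driver

    def write_brightness(self, level: float) -> None:
        self.driver.set_backlight(max(0.0, min(level, 1.0)))


class UiFramework:                            # analogous to the framework 908 layer
    def __init__(self, library: SystemLibrary) -> None:
        self.library = library

    def set_brightness(self, level: float) -> str:
        self.library.write_brightness(level)  # the outgoing call (cf. API call 950)
        return "ok"                           # the returned message (cf. message 952)


# An application (cf. applications 906) issuing the call and receiving the response.
response = UiFramework(SystemLibrary(DisplayDriver())).set_brightness(0.7)
print(response)
```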
Glossary of terms
"carrier wave signal" refers to any intangible medium capable of storing, encoding or carrying instructions for execution by a machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such instructions. The instructions may be transmitted or received over a network using a transmission medium via a network interface device.
"client device" refers to any machine that interfaces with a communication network to obtain resources from one or more server systems or other client devices. The client device may be, but is not limited to, a mobile phone, desktop computer, laptop computer, portable Digital Assistant (PDA), smart phone, tablet, super book, netbook, laptop computer, multiprocessor system, microprocessor-based or programmable consumer electronics, game console, set top box, or any other communication device that a user may use to access a network.
"communication network" refers to one or more portions of a network, the network may be an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a Wireless LAN (WLAN), a Wide Area Network (WAN), a Wireless WAN (WWAN), a Virtual Private Network (VPN) Metropolitan Area Networks (MANs), the Internet, portions of the Public Switched Telephone Network (PSTN), plain Old Telephone Service (POTS) networks, cellular telephone networks, wireless networks, A network, other type of network, or a combination of two or more such networks. For example, the network or portion of the network may comprise a wireless network or cellular network, and the coupling may be a Code Division Multiple Access (CDMA) connection, a global system for mobile communications (GSM) connection, or other type of cellular or wireless coupling. In this example, the coupling may implement any of a variety of types of data transmission techniques, such as single carrier radio transmission techniques (1 xRTT), evolution-data optimized (EVDO) techniques, genericPacket Radio Service (GPRS) technology, enhanced data rates for GSM evolution (EDGE) technology, third generation partnership project (3 GPP) including 3G, fourth generation wireless (4G) networks, universal Mobile Telecommunications System (UMTS), high Speed Packet Access (HSPA), worldwide Interoperability for Microwave Access (WiMAX), long Term Evolution (LTE) standards, other data transmission technologies defined by various standards setting organizations, other long distance protocols, or other data transmission technologies.
"component" refers to a device, physical entity, or logic having boundaries defined by function or subroutine calls, branch points, APIs, or other techniques that provide partitioning or modularization of particular processing or control functions. Components may be combined with other components via their interfaces to perform machine processes. A component may be a packaged functional hardware unit designed for use with other components and may be part of a program that typically performs certain of the relevant functions.
The components may constitute software components (e.g., code embodied on a machine-readable medium) or hardware components. A "hardware component" is a tangible unit capable of performing certain operations and may be configured or arranged in some physical manner. In various examples, one or more computer systems (e.g., stand-alone computer systems, client computer systems, or server computer systems) or one or more hardware components of a computer system (e.g., processors or groups of processors) may be configured by software (e.g., an application or application part) as hardware components that operate to perform certain operations as described herein.
The hardware components may also be implemented mechanically, electronically, or in any suitable combination thereof. For example, a hardware component may include specialized circuitry or logic permanently configured to perform certain operations. The hardware component may be a special purpose processor such as a Field Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC). The hardware components may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, the hardware components may include software that is executed by a general purpose processor or other programmable processor. Once configured by such software, the hardware components become the specific machines (or specific components of machines) uniquely tailored to perform the configured functions, and are no longer general purpose processors. It will be appreciated that it may be decided, for cost and time considerations, to implement a hardware component mechanically in dedicated and permanently configured circuitry or in temporarily configured (e.g., by software configuration) circuitry. Thus, the phrase "hardware component" (or "hardware-implemented component") should be understood to include a tangible entity, i.e., an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in some manner or perform certain operations described herein.
Considering the example where hardware components are temporarily configured (e.g., programmed), it is not necessary to configure or instantiate each of the hardware components at any one time. For example, where the hardware components include a general-purpose processor that is configured by software to be a special-purpose processor, the general-purpose processor may be configured at different times as respective different special-purpose processors (e.g., including different hardware components). The software configures one or more particular processors accordingly, for example, to constitute particular hardware components at one time and different hardware components at different times.
A hardware component may provide information to and receive information from other hardware components. Thus, the described hardware components may be considered to be communicatively coupled. Where multiple hardware components are present at the same time, communication may be achieved by signal transmission (e.g., through appropriate circuitry and buses) between or among two or more of the hardware components. In examples where multiple hardware components are configured or instantiated at different times, communication between such hardware components may be achieved, for example, by storing information in a memory structure accessible to the multiple hardware components and retrieving the information in the memory structure. For example, one hardware component may perform an operation and store an output of the operation in a memory device communicatively coupled thereto. Additional hardware components may then access the memory device at a later time to retrieve and process the stored output. The hardware component may also initiate communication with an input device or an output device, and may operate on a resource (e.g., a collection of information).
Various operations of the example methods described herein may be performed, at least in part, by one or more processors that are temporarily configured (e.g., via software) or permanently configured to perform the relevant operations. Whether temporarily configured or permanently configured, such a processor may constitute a processor-implemented component that operates to perform one or more operations or functions described herein. As used herein, "processor-implemented components" refers to hardware components implemented using one or more processors. Similarly, the methods described herein may be implemented, at least in part, by processors, with particular one or more processors being examples of hardware. For example, at least some operations of the method may be performed by one or more processors 802 or processor-implemented components. In addition, one or more processors may also operate to support execution of related operations in a "cloud computing" environment or as "software as a service" (SaaS) operations. For example, at least some of the operations may be performed by a set of computers (as examples of machines including processors), where the operations are accessible via a network (e.g., the internet) and via one or more suitable interfaces (e.g., APIs). The performance of certain of the operations may be distributed among processors, not only residing within a single machine, but also deployed across multiple machines. In some examples, the processor or processor-implemented components may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other examples, the processor or processor-implemented components may be distributed across multiple geolocations.
"computer-readable storage medium" refers to both machine storage media and transmission media. Thus, the term includes both storage devices/media and carrier wave/modulated data signals. The terms "machine-readable medium," "computer-readable medium," and "device-readable medium" mean the same thing and may be used interchangeably in this disclosure.
"ephemeral message" refers to a message that is accessible for a limited duration of time. The transient message may be text, images, video, etc. The access time for the ephemeral message may be set by the message sender. Alternatively, the access time may be a default setting or a setting specified by the recipient. The message is temporary regardless of the setup technique.
"machine storage media" refers to single or multiple storage devices and media (e.g., centralized or distributed databases, as well as associated caches and servers) that store the executable instructions, routines, and data. Accordingly, the term should be taken to include, but is not limited to, solid-state memory, as well as optical and magnetic media, including memory internal or external to the processor. Specific examples of machine storage media, computer storage media, and device storage media include: nonvolatile memory including, for example, semiconductor memory devices such as erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), FPGA, and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disk; CD-ROM and DVD-ROM discs. The terms "machine storage medium," "device storage medium," "computer storage medium" mean the same thing and may be used interchangeably in this disclosure. The terms "machine storage medium," computer storage medium, "and" device storage medium "expressly exclude carrier waves, modulated data signals, and other such media, and at least some of the carrier waves, modulated data signals, and other such media are encompassed by the term" signal medium.
"non-transitory computer-readable storage medium" refers to a tangible medium capable of storing, encoding or carrying instructions for execution by a machine.
"signal medium" refers to any intangible medium capable of storing, encoding or carrying instructions for execution by a machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of software or data. The term "signal medium" shall be taken to include any form of modulated data signal, carrier wave, and the like. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. The terms "transmission medium" and "signal medium" mean the same thing and may be used interchangeably in this disclosure.
Changes and modifications may be made to the disclosed examples without departing from the scope of the present disclosure. These and other changes or modifications are intended to be included within the scope of the present disclosure as expressed in the appended claims.

Claims (20)

1. A method, comprising:
causing, by one or more processors of an Augmented Reality (AR) wearable device, a projection element of the AR wearable device to project an image;
receiving, by the one or more processors of the AR wearable device, a measurement of ambient light from an ambient light sensor;
adjusting one or more hardware parameters of the projection element of the AR wearable device based on the measurement of ambient light;
modifying one or more color values of an image projected by the projection element of the AR wearable device based on the measurement of ambient light; and
causing the image with the modified one or more color values to be projected using the projection element of the AR wearable device with the adjusted one or more hardware parameters.
2. The method of claim 1, wherein the AR wearable device comprises electronic glasses.
3. The method of any of claims 1-2, further comprising:
calculating an effective luminance value based on the measurement of ambient light, wherein the one or more hardware parameters are adjusted and the one or more color values are modified based on the effective luminance value.
4. The method of claim 3, wherein calculating the effective luminance value comprises:
receiving input from a user on an application implemented on a mobile device associated with the AR wearable device to adjust a brightness setting, the effective brightness being calculated independently of the ambient light in response to receiving the input from the user.
5. The method of any of claims 1-4, wherein calculating the effective luminance value comprises:
determining that a lens of the AR wearable device comprises a colored lens or a transparent lens;
selecting a first threshold value from a plurality of threshold values for setting a screen brightness value based on determining that a lens of the AR wearable device includes the colored lens; and
selecting a second threshold value from the plurality of threshold values for setting the screen brightness value based on determining that the lens of the AR wearable device includes the transparent lens.
6. The method of claim 5, further comprising:
in response to determining that the lens of the AR wearable device includes the colored lens:
comparing the measurement of the ambient light with the first threshold of the plurality of thresholds; and
setting the screen brightness value to a given value based on comparing the measurement of the ambient light with the first threshold.
7. The method of any one of claims 1 to 6, further comprising:
in response to determining that the measurement of ambient light is less than the first threshold, setting the screen brightness value to zero;
in response to determining that the measurement of ambient light is greater than the first threshold and within a range of values, setting the screen brightness value as an interpolation between a minimum value and a maximum value; and
In response to determining that the measurement of ambient light is greater than a third threshold, the screen brightness value is set to a maximum value.
8. The method of any of claims 1-7, wherein calculating the effective luminance value comprises:
in response to determining that the lens of the AR wearable device includes the transparent lens:
comparing the measurement of the ambient light with the second threshold of the plurality of thresholds; and
setting the screen brightness value to a given value based on comparing the measurement of the ambient light with the second threshold.
9. The method of any one of claims 1 to 8, further comprising:
determining that a thermal limit of the AR wearable device has been reached; and
preventing the effective luminance value from exceeding a maximum luminance value associated with the thermal limit.
10. The method of claim 9, wherein the maximum luminance value comprises 80% or less.
11. The method of any of claims 1 to 10, wherein the effective luminance value is calculated as a minimum of a screen luminance value calculated based on a measurement of the ambient light and the maximum luminance value associated with the thermal limit.
12. The method of any one of claims 1 to 11, further comprising:
determining that the effective luminance value is within a first range of values; and
in response to determining that the effective luminance value is within the first range of values, linearly changing a Light Emitting Diode (LED) current value of the projection element between a maximum current and a minimum current.
13. The method of claim 12, further comprising:
determining that the effective luminance value is below the first range of values; and
in response to determining that the effective luminance value is below the first range of values, multiplying the red, green, and blue pixel values of the image by a specified value.
14. The method of any of claims 1 to 13, further comprising selecting between operating the projection element at 0% Dark Time (DT) or 50% DT based on the effective brightness value.
15. The method of claim 14, wherein, in response to determining that the effective luminance value is a maximum effective luminance value, operating the projection element at the 0% DT drives a current to the projection element at a maximum value.
16. The method of any of claims 1-15, wherein, in response to determining that the effective luminance value is less than the maximum effective luminance value and greater than a second value, the projection element is operated at the 50% DT and a current is driven to the projection element in a linear range between a minimum value and a maximum value, wherein the red, green, and blue pixel values of the image are linearly scaled down.
17. The method of claim 16, wherein, in response to determining that the effective brightness value corresponds to the second value, the projection element is operated at the 50% DT and the current driven to the projection element in the linear range corresponds to the minimum value, wherein the red, green, and blue pixel values of the image are multiplied by 0.
18. A system, comprising:
one or more processors of an Augmented Reality (AR) wearable device, the one or more processors configured to perform operations comprising:
causing a projection element of the AR wearable device to project an image;
receiving a measurement of ambient light from an ambient light sensor;
adjusting one or more hardware parameters of the projection element of the AR wearable device based on the measurement of ambient light;
modifying one or more color values of an image projected by the projection element of the AR wearable device based on the measurement of ambient light; and
causing the image with the modified one or more color values to be projected using the projection element of the AR wearable device with the adjusted one or more hardware parameters.
19. The system of claim 18, wherein the AR wearable device comprises electronic glasses.
20. A non-transitory computer-readable medium comprising instructions that, when executed by one or more processors of an Augmented Reality (AR) wearable device, configure the AR wearable device to perform operations comprising:
causing a projection element of the AR wearable device to project an image;
receiving a measurement of ambient light from an ambient light sensor;
adjusting one or more hardware parameters of the projection element of the AR wearable device based on the measurement of ambient light;
modifying one or more color values of an image projected by the projection element of the AR wearable device based on the measurement of ambient light; and
causing the image with the modified one or more color values to be projected using the projection element of the AR wearable device with the adjusted one or more hardware parameters.
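For illustration only, the following Python sketch follows the general shape of the effective-brightness logic recited in claims 3 through 17 (lens-dependent thresholds, interpolation between minimum and maximum values, a thermal cap, and selection between 0% and 50% dark time). Every threshold, range, and constant below is an invented placeholder rather than a value specified by the claims.

```python
# Illustrative placeholders only: the thresholds, ranges, and constants are not
# taken from the claims; only the overall structure mirrors claims 3-17.

def screen_brightness(ambient_lux: float, tinted_lens: bool) -> float:
    # Claims 5-8: choose thresholds according to whether the lens is tinted or clear.
    low, high = (200.0, 2_000.0) if tinted_lens else (500.0, 5_000.0)
    if ambient_lux < low:
        return 0.0                     # claim 7: below the threshold -> zero
    if ambient_lux > high:
        return 1.0                     # claim 7: above the range -> maximum
    # Claim 7: within the range -> interpolate between minimum and maximum.
    return (ambient_lux - low) / (high - low)


def effective_brightness(ambient_lux: float, tinted_lens: bool, thermal_limited: bool) -> float:
    # Claims 9-11: take the minimum of the screen brightness and a thermal cap.
    thermal_cap = 0.8 if thermal_limited else 1.0
    return min(screen_brightness(ambient_lux, tinted_lens), thermal_cap)


def projector_settings(effective: float) -> tuple[int, float, float]:
    # Claims 14-17: 0% dark time only at maximum effective brightness; otherwise
    # 50% dark time with LED current in a linear range and RGB values scaled down.
    if effective >= 1.0:
        return 0, 1.0, 1.0             # (dark time %, LED current fraction, RGB scale)
    return 50, effective, effective


print(projector_settings(effective_brightness(1_000.0, tinted_lens=True, thermal_limited=False)))
```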
CN202280040857.9A 2021-06-09 2022-06-03 Adaptive brightness for augmented reality display Pending CN117441331A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US63/208,810 2021-06-09
US17/582,633 US11823634B2 (en) 2021-06-09 2022-01-24 Adaptive brightness for augmented reality display
US17/582,633 2022-01-24
PCT/US2022/032175 WO2022260954A1 (en) 2021-06-09 2022-06-03 Adaptive brightness for augmented reality display

Publications (1)

Publication Number Publication Date
CN117441331A true CN117441331A (en) 2024-01-23

Family

ID=89554005

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280040857.9A Pending CN117441331A (en) 2021-06-09 2022-06-03 Adaptive brightness for augmented reality display

Country Status (1)

Country Link
CN (1) CN117441331A (en)

Similar Documents

Publication Publication Date Title
JP7397777B2 (en) Virtual reality, augmented reality, and mixed reality systems and methods
JP7042286B2 (en) Smoothly changing forbidden rendering
KR102624635B1 (en) 3D data generation in messaging systems
US9576397B2 (en) Reducing latency in an augmented-reality display
US11624925B2 (en) Eyewear with integrated peripheral display
US10922878B2 (en) Lighting for inserted content
US10095034B1 (en) Eyewear with integrated heads-up display
CN115113399B (en) Augmented reality display for macular degeneration
KR20210138484A (en) System and method for depth map recovery
US11823634B2 (en) Adaptive brightness for augmented reality display
US20230194859A1 (en) System for using digital light projectors for augmented reality
WO2022260954A1 (en) Adaptive brightness for augmented reality display
WO2022146781A1 (en) Digital light projectors for augmented reality
CN117441331A (en) Adaptive brightness for augmented reality display
US11431955B1 (en) Systems and methods for temporal anti-aliasing
CN116916809A (en) Ophthalmic imaging using a head-mounted device
EP2706508B1 (en) Reducing latency in an augmented-reality display
US20230315383A1 (en) Wearable device ar object voice-based interaction
US20230418062A1 (en) Color calibration tool for see-through augmented reality environment
US20230215108A1 (en) System and method for adaptive volume-based scene reconstruction for xr platform applications
US20240069637A1 (en) Touch-based augmented reality experience
WO2023172894A1 (en) High dynamic range for dual pixel sensors
CN117425869A (en) Dynamic over-rendering in post-distortion
CN115661408A (en) Generating and modifying hand representations in an artificial reality environment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination