WO2024054846A1 - Online calibration of sensor alignment in a head-worn device - Google Patents

Online calibration of sensor alignment in a head-worn device

Info

Publication number
WO2024054846A1
Authority
WO
WIPO (PCT)
Prior art keywords
smart glasses
measurement unit
inertial measurement
camera
calibration
Prior art date
Application number
PCT/US2023/073552
Other languages
French (fr)
Inventor
Zhiheng Jia
Joshua Anthony HERNANDEZ
Qiyue Zhang
Chao GUO
Original Assignee
Google Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google Llc filed Critical Google Llc
Publication of WO2024054846A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C19/00Gyroscopes; Turn-sensitive devices using vibrating masses; Turn-sensitive devices without moving masses; Measuring angular rate using gyroscopic effects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking

Definitions

  • the present disclosure relates to an online calibration of sensors (e.g., cameras) of a head-worn device to enable combination or comparison of their sensor data.
  • a head-mounted device may be configured to provide a user with a virtual reality (VR) or augmented reality (AR) experience.
  • the head-mounted device may be implemented as smart glasses in an optical see-through (OST) configuration in which virtual content can be displayed on a heads-up display (HUD) through which the user can view the world.
  • the head-mounted device may be implemented as smart glasses in a video see-through (VST) configuration in which virtual content can be displayed on a display on which the user can view images of the world captured by a camera. In either case, the displayed content may be presented to both eyes of the user in a binocular display.
  • the quality of this binocular (i.e., stereoscopic) display may require the images to be aligned for each eye. For example, a vertical misalignment between the left and right displays and the corresponding left and right eyes of the user may cause significant strain on the user’s eyes.
  • the proposed solution uses a plurality of inertial measurement units (e.g., 2~6 IMUs) to estimate relative poses between cameras to detect and compensate for vertical misalignment in content displayed on a binocular display in a head-worn device, such as smart glasses.
  • the techniques described herein relate to a method, including: coupling a first camera at a first fixed position with respect to a first inertial measurement unit; coupling a second camera at a second fixed position with respect to a second inertial measurement unit; coupling the first inertial measurement unit and the second inertial measurement unit to a frame of smart glasses; obtaining a first measurement of a movement of the smart glasses from the first inertial measurement unit; obtaining a second measurement of the movement of the smart glasses from the second inertial measurement unit; comparing the first measurement and the second measurement to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration based on the relative pose difference.
  • the techniques described herein relate to smart glasses including: a first camera mechanically coupled to a first inertial measurement unit; a second camera mechanically coupled to a second inertial measurement unit; a frame mechanically coupling the first inertial measurement unit and the second inertial measurement unit; and a processor configured by software instructions to perform a calibration including: obtaining a first measurement of a movement of the smart glasses from the first inertial measurement unit; obtaining a second measurement of the movement of the smart glasses from the second inertial measurement unit; comparing the first measurement and the second measurement to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration file based on the relative pose difference.
  • the techniques described herein relate to smart glasses including: a first camera and a first inertial measurement unit mechanically coupled to a first circuit board; a second camera and a second inertial measurement unit mechanically coupled to a second circuit board; a frame including the first circuit board and the second circuit board; and a processor in the frame that is electrically coupled to the first circuit board and the second circuit board, the processor configured by software instructions to perform an online calibration including: obtaining a first rotation measurement of the smart glasses from a first gyroscope of the first inertial measurement unit; obtaining a second rotation measurement of the smart glasses from a second gyroscope of the second inertial measurement unit; applying the first rotation measurement and the second rotation measurement to a Kalman filter to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration file to include camera extrinsics based on the relative pose difference.
  • FIG. 1 is a top-view of optical see-through (OST) smart glasses according to a possible implementation of the present disclosure.
  • FIG. 2 is a top-view of video see-through (VST) smart glasses according to a possible implementation of the present disclosure.
  • FIG. 3 is a block diagram of smart glasses according to a possible implementation of the present disclosure.
  • FIG. 4 is a system block diagram of an IMU configured for a 6DOF measurement according to a possible implementation of the present disclosure.
  • FIG. 5 is a front view of smart glasses with a first IMU and a second IMU configured to measure a movement of the smart glasses according to an implementation of the present disclosure.
  • FIG. 6 is a flowchart of a method of online calibration of smart glasses according to a possible implementation of the present disclosure.
  • FIG. 7 is a flowchart of a method for deriving camera extrinsics according to a possible implementation of the present disclosure.
  • FIG. 8 is a flowchart of a method for reducing a misalignment in a binocular display using camera extrinsics according to a possible implementation of the present disclosure.
  • Accurate display of virtual content may require a precise positioning based on images captured by different cameras in a head-mounted device (e.g., smart glasses) for displaying virtual content (e.g., AR content, VR content).
  • the relative position/orientation of a world facing camera and an eye-tracking camera may be used to determine where on a heads-up display (i.e., HUD) to render virtual content for a user.
  • Further, when rendering stereoscopically, a very small misalignment (i.e., vertical) between the relative positions of the left/right eyes with respect to the left/right head-up display may cause discomfort to a user.
  • One technical problem facing the accurate display is that the cameras may change relative position/orientation (i.e., relative poses) due to deformation, use (e.g., aging), and temperature. Accordingly, it may be desirable to determine (i.e., calibrate) the extrinsics between cameras on smart glasses during use (i.e., online) in a way that is not disruptive to a user. A technical problem with this online calibration is that it may be too computationally complex for the limited computing and power resources of smart glasses.
  • the disclosed smart glasses address these technical problems by rigidly coupling a position sensor (i.e., IMU) to each camera and using the IMU data (i.e., instead of camera data) to determine camera extrinsics.
  • the disclosure further provides a method to simplify the extrinsic calculation in order to make the process practical for the limited resources of the smart glasses.
  • the relative position/orientation (i.e., pose) between cameras is referred to as the extrinsics of the cameras (i.e., camera extrinsics).
  • One aspect of the invention is mechanically coupling an IMU to each camera in the smart glasses and determining camera extrinsics for the cameras based on IMU data.
  • the smart glasses of the disclosure include a plurality of IMUs (i.e., multi IMUs) to sense relative camera poses on the device. This is different from a device that includes multiple IMUs to reduce noise for a single pose measurement.
  • the number of IMUs in the smart glasses may match the number of camera sensors in the smart glasses.
  • the IMU measurements may consume less power for sensing camera extrinsics than image measurements and therefore may be better suited to the resources of the smart glasses.
  • the IMU data can be combined with image data from the cameras (i.e., sensor fusion) to determine the camera extrinsics. These implementations may consume more power, but may offer increased accuracy.
  • misalignments may be responded to as they happen, and without user intervention (or knowledge), due to the disclosed online calibration (i.e., OC).
  • estimates from the disclosed approach can be computed instantaneously (i.e., instantaneous estimates). This is different from other approaches that require integration.
  • An online calibration may be triggered on a schedule (e.g., periodically) or in response to an event corresponding to a misalignment. For example, an online calibration could be triggered in response to a change in temperature or in response to a change in status (e.g., use from non-use). Triggering an online calibration in response to an event could be advantageous from a power efficiency standpoint.
  • the smart glasses are considered to have a rigid or semi-rigid frame to which the IMUs (and the cameras) are affixed.
  • a force applied to the frame may move the frame without deforming the frame so that the IMUs (and the cameras) remain in a fixed position with respect to each other.
  • portions of the frame may be moved but the movement is constrained in a particular direction (e.g., temple hinge). In either case, a movement of one portion of the frame can be related to another portion of the frame according to the shape and dimensions of the frame.
  • a movement of the frame of glasses may be measured simultaneously by each IMU and compared to determine camera extrinsics.
  • a rotation rate of the frame may be measured by a first IMU and a second IMU.
  • the measurements may be compared to determine their relative orientation.
  • a rotation that is identically measured, in three dimensions, by each IMU may indicate that the IMUs are aligned (in x, y, z).
  • the measurements may be applied to a Kalman filter in order to estimate the measurements in the presence of noise.
  • a camera and an IMU may be mechanically coupled in a fixed spatial relationship (i.e., rigid connection) so that a movement (e.g., rotation) of the camera may be sensed by the IMU.
  • the camera and the IMU may be mechanically coupled to a circuit board (e.g., FR4) that does not bend in response to a movement of the smart glasses.
  • couplings, connections, or elements that are described as rigid imply that a movement does not cause the coupling, connection, or element to significantly deform (e.g., change shape, bend).
  • the smart glasses may be configured to display stereoscopic images to a user’s left eye and right eye. For an optimal stereoscopic experience, the display of these images should each be aligned with the user’s respective eye.
  • An aspect of the present disclosure is that a misalignment between left and right images of a stereoscopic display is reduced based on camera extrinsics measured for eye tracking cameras of the left eye and the right eye.
  • Smart glasses may include a plurality of sensors configured to sense the user wearing the head-worn device, a plurality of sensors configured to sense the real environment, and a binocular display configured to display information regarding the environment. The sensors and binocular display work together to provide the user with virtual information displayed as if it were located in the real environment. Problems may occur when the sensed information about the user or the environment is incorrect. For example, misalignments in the displayed information may interfere with an experience of a user.
  • a binocular display misalignment (e.g., vertical misalignment) is correlated with a level of comfort experienced by a user of a head worn device for extended reality (e.g., augmented reality, virtual reality).
  • the display vertical misalignment (DVM) can be caused by a misalignment of sensors attached to, or otherwise integrated with, a body (i.e., frame) of the device.
  • Previous image-based approaches to correct for misalignments may not be practical in a mobile environment due to power requirements and times required for processing (e.g., integrating).
  • One technical effect of the disclosed systems and methods is that a vertical misalignment can be corrected in a way that is faster and more power efficient than other, entirely image-based approaches.
  • FIG. 1 is a top view of smart glasses in an OST configuration according to a possible implementation of the present disclosure.
  • the OST smart glasses may be worn on the face of a user like conventional glasses but can have sensors integrated within the temples, frames, and/or lenses of the glasses.
  • the smart glasses may include a left temple arm 121 (i.e., left temple) and a right temple arm 122 (i.e., right temple).
  • the left temple and the right temple are coupled by flexible hinges to a left side and a right side of a frame holding a left lens 111 and a right lens 112.
  • the smart glasses may have a target alignment (i.e., expected configuration, factory-setting) between the left and right temples and the frames when worn on the head of a user.
  • the target alignment may include the temples parallel with each other and orthogonal to the frame. Deviations to this target alignment may occur when the flexible hinges are broken or otherwise strained from use, which can cause errors in the sensing associated with the smart glasses.
  • a calibration, which is used to compare and/or combine the data from the sensors, may be generated based on the target alignment.
  • a calibration may include extrinsics that specify the relative poses (e.g., translation, rotations) of sensors (e.g., cameras, inertial sensors, etc.) of the smart glasses in space (e.g., relative to a global coordinate system).
  • an eye-tracking (ET) camera is positioned in a temple of the OST smart glasses to capture images of an eye of the user.
  • a front-facing (i.e., head tracking (HT)) camera is positioned in the frame of the smart glasses to capture images of the environment viewed by the eye of the user.
  • a calibration may be used to correlate the images from these cameras. This calibration may become less accurate when the temple and the frame are misaligned (i.e., relative to the target alignment). As a result, it may be difficult, or impossible, to determine exactly where, in the image of the environment, the user is looking. Accordingly, it may be desirable to adjust the calibration to accommodate misalignments of the sensors from a factory setting.
  • FIG. 2 is a top-view of smart glasses in a video see-through (VST) configuration according to a possible implementation of the present disclosure.
  • The VST smart glasses may include a left video see-through camera (i.e., left VST camera) and a right video see-through camera (i.e., right VST camera).
  • Both cameras are directed towards the environment of the user and are configured to capture stereo views of the environment that can be displayed on a binocular display of a user.
  • Misalignment (e.g., vertical misalignment) between the left view and the right view may occur when the frames are bent or otherwise distorted from damage or use.
  • An inertial measurement unit may be included with each camera of smart glasses to monitor and update the extrinsics of the smart glasses.
  • the cameras included in the smart glasses may include any combination of head-tracking camera (i.e., HeT-cam), hand-tracking camera (i.e., HT-cam), head/hand-tracking camera (i.e., HeT/HT-cam), eye-tracking camera (i.e., ET-cam), face-tracking camera (i.e., FT-cam), and video see-through camera (i.e., VST-cam).
  • the smart glasses 100 in the OST configuration can include a left ET-cam 131 that includes a left ET-IMU 133, a right ET-cam 132 that includes a right ET-IMU 134, a left HeT/HT-cam 143 that includes a left HeT/HT-IMU 141, a right HeT/HT-cam 144 that includes a right HeT/HT-IMU 142, and a FT-cam 150 that may include a FT-IMU.
  • the smart glasses in the VST configuration can include a left ET-cam 131 that includes a left ET-IMU 133, a right ET-cam 132 that includes a right ET-IMU 134, a left HeT/HT-cam 143 that includes a left HeT/HT-IMU 141, a right HeT/HT-cam 144 that includes a right HeT/HT-IMU 142, a left VST-cam 163 that includes a left VST-IMU 161, a right VST-cam 164 that includes a right VST-IMU 162, and a FT-cam 150 that may include a FT-IMU.
  • TABLE 1 includes IMU information for various sensors on the smart glasses in the different configurations.
  • the rows of TABLE 1 may correspond to a need or importance of the multiple IMUs for online calibration of the smart glasses.
  • the IMUs for the OLED lens/pancake lens, which are used for a virtual reality (VR) head-mounted display (HMD) (e.g., the OLED lens is part of the image source, and the pancake lens projects the image to the eye of the user), may be highly desirable (e.g., required) to minimize vertical misalignment.
  • an IMU for the indirect time-of-flight (iToF) camera, which provides depth information, may be less desirable (e.g., not required), especially if its field-of-view (FOV) has an overlap with an infrared (IR) image FOV that is greater than an amount (e.g., 30 degrees).
  • the IMUs for the Left/Right VST-cams may not be required if each VST-cam is rigidly coupled to its corresponding VST-display.
  • the IMUs for the HeT/HT-cams may be optional (e.g., may not be used) if their FOVs have an overlap greater than an amount (e.g., 30 degrees).
  • the IMUs for the ET-cams may be necessary if the ET-cam can experience a deformation significant to a vertical misalignment. That is, if a deformation of the ET-cam can cause a vertical misalignment greater than a threshold (e.g., 4 arcmin).
  • FIG. 3 is a block diagram of smart glasses according to a possible implementation of the present disclosure.
  • the smart glasses 300 includes a camera (e.g., first camera 311) configured to capture images of a field-of-view (e.g., first field-of-view 315).
  • the motion-tracking device may further include a processor 350, and images from the first camera 311 may be analyzed by the processor to identify one or more features in the images for motion tracking. Tracking pixel positions of the one or more features over consecutive images may help to determine a motion (e.g., rotation) of the smart glasses 300.
  • Each camera may have an IMU integrated or otherwise affixed to the camera to measure its position and orientation.
  • the first camera 311 may include a first IMU 301 and the second camera 312 may include a second IMU 302.
  • the smart glasses 300 further includes a second camera 312 configured to capture images of a second field-of-view 316, which may overlap a portion of the first field-of-view 315.
  • the cameras may be aligned and focused so that a first image (e.g., right image) of the first field-of-view and a second image (e.g., left image) of the second field-of-view may be combined to form a stereoscopic image.
  • the stereoscopic images may help to track the one or more features in three dimensions.
  • the smart glasses 300 further includes a memory 360.
  • the memory may be a non-transitory computer-readable medium and may be configured to store instructions that, when executed by the processor 350, can configure the motion-tracking device to perform the disclosed methods.
  • the memory 360 may be configured to store a calibration 361 related to an estimated relative pose transformation (i.e., extrinsics) between the first camera 311 and the second camera 312.
  • the head-mounted device may include a temperature sensor 340 configured to measure a temperature corresponding to a temperature of the first IMU 301 and/or a temperature of the second IMU 302. A change in temperature detected by the temperature sensor 340 may be used to trigger a calibration process to update the calibration.
  • the smart glasses 300 may further include a binocular display 390.
  • the display 390 is a heads-up display (i.e., HUD).
  • the smart glasses 300 may further include a battery 380.
  • the battery may be configured to provide energy to the subsystems, modules, and devices of the smart glasses 300 to enable their operation.
  • the battery 380 may be rechargeable and have an operating life (e.g., lifetime) between charges.
  • a battery level may determine how the calibration is updated. For example, at a lower battery level (e.g., below a threshold), the calibration process may be configured to estimate the camera extrinsics using data from the IMUs alone.
  • at a higher battery level (e.g., above a threshold), the calibration process may be configured to estimate the camera extrinsics using data from the IMUs plus images from the cameras.
  • the smart glasses 300 may further include a communication interface 370.
  • the communication interface may be configured to communicate information digitally over a wireless communication link 371 (e.g., WiFi, Bluetooth, etc.).
  • the motion-tracking device may be communicatively coupled to a network 372 (i.e., the cloud) or a device (e.g., mobile phone 373) over the wireless communication link 371.
  • the wireless communication link may allow operations of a computer-implemented method to be divided between devices and/or could allow for remote storage of the calibration 361.
  • the smart glasses are augmented-reality glasses.
  • the smart glasses are a virtual-reality visor.
  • FIG. 4 is a system block diagram of an IMU configured for a 6DOF measurement according to a possible implementation of the present disclosure.
  • the IMU 400 may output a motion tracking measurement having six components (i.e., 6 degrees of freedom (DOF)) including a linear acceleration in an x-direction (a_x), a linear acceleration in a y-direction (a_y), a linear acceleration in a z-direction (a_z), an angular rotation (R_x) about an x-axis (ROLL), an angular rotation (R_y) around a y-axis (PITCH), and an angular rotation (R_z) around a z-axis (YAW).
  • the six components are relative to a coordinate system that may be aligned with, or define, a coordinate system of the IMU.
  • the IMU 400 may include a gyroscope module 410 including an X-axis gyroscope configured to measure a first angular rotation 411 (i.e., R_x) around an X-axis of the coordinate system; a Y-axis gyroscope configured to measure a second angular rotation 412 (i.e., R_y) around a Y-axis of the coordinate system; and a Z-axis gyroscope configured to measure a third angular rotation 413 (i.e., R_z) around a Z-axis of the coordinate system associated with the AR device.
  • a gyroscope of the IMU 400 may be implemented as a micro-electromechanical system (MEMS) in which a movement of a mass affixed to springs can be capacitively sensed to determine rotation. The alignment of the mass and the springs can determine the axis of the sensed rotation. Accordingly, the IMU 400 may include three MEMS gyroscopes, each aligned to sense a corresponding rotation around an axis of the coordinate system.
  • the IMU 400 may further include an accelerometer module 420 that includes an X-axis accelerometer configured to measure a first linear acceleration (a_x) in an X-direction; a Y-axis accelerometer configured to measure a second linear acceleration (a_y) in a Y-direction; and a Z-axis accelerometer configured to measure a third linear acceleration (a_z) in a Z-direction.
  • An accelerometer of the IMU 400 may be implemented as a MEMS configured to capacitively sense a force (e.g., gravity 421) exerted on a movable mass to determine an acceleration.
  • the accelerometer may sense displacement or velocity by processing (e.g., integrating) the acceleration over time.
  • the mechanical nature of the MEMS sensors described above can make their responses sensitive to changes in temperature and/or to changes in their installed environment. For example, a temperature change or a force due to use (or misuse) of the motion-tracking device can alter the sensitivity of the MEMS devices. For example, dropping or bending the motion-tracking device can cause a change in the installed environment, thereby changing the response of a gyroscope or an accelerometer of the IMU.
  • the changes described above can make the sensed (i.e., measured) rotations and/or displacements differ from the actual rotations and/or displacements.
  • the difference between a measured parameter (e.g., rotation, displacement) and an actual parameter is referred to as a bias.
  • An output of the IMU may be considered as including an IMU measurement (IMU_MEAS) and a bias (BIAS). When the bias is zero, the measured parameter matches the actual parameter. Accordingly, it may be desirable to estimate/compensate for the bias for any, or all, outputs of the IMU.
  • the bias may be a function of temperature (i.e., BIAS(T)).
  • the IMU 400 may include a temperature sensor 440 within or near the IMU 400 so as to measure a temperature (T) that corresponds to the temperature of the gyroscope module 410 and the accelerometer module 420.
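  • As a concrete illustration of a temperature-dependent bias, the following minimal sketch (not taken from the disclosure; the linear model, reference temperature, and names are illustrative assumptions) removes a gyroscope bias modeled as linear in the measured temperature:

```python
def corrected_rate(raw_rate_dps, temp_c, bias_at_25c_dps, bias_slope_dps_per_c):
    """Illustrative correction for measured = actual + BIAS(T), with BIAS
    modeled as linear in temperature. Rates are in degrees per second."""
    bias_dps = bias_at_25c_dps + bias_slope_dps_per_c * (temp_c - 25.0)
    return raw_rate_dps - bias_dps

# Example: a reading of 0.9 dps at 35 C, with a 0.5 dps bias at 25 C and a
# slope of 0.02 dps/C, is corrected to 0.9 - (0.5 + 0.2) = 0.2 dps.
```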
  • the IMU 400 can further include a magnetometer 430 that includes an X-axis magnetometer configured to measure a first magnetic field strength (i.e., H_x) in an X-direction of the coordinate system, a Y-axis magnetometer configured to measure a second magnetic field strength (i.e., H_y) in a Y-direction of the coordinate system, and a Z-axis magnetometer configured to measure a third magnetic field strength (i.e., H_z) in a Z-direction of the coordinate system.
  • the magnetic field strengths may be relative to the Earth’s magnetic field 431 (i.e., north (N)).
  • the relative pose difference between IMUs may be estimated based on movement measurements measured by different IMUs on the smart glasses. For example, measurements of the same movement relative to each IMU’s frame of reference may be compared to estimate how the frames of reference differ. A 3DOF estimate of this difference may be obtained by comparing the three-dimensional (3DOF) measurements captured by the gyroscope of each IMU.
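  • One way such a gyroscope-to-gyroscope comparison could be realized is sketched below: synchronized angular-rate samples from the two IMUs are fit with a single rotation by solving an orthogonal Procrustes (Wahba-style) problem. This is an illustrative estimator under the stated assumptions, not necessarily the formulation used in the disclosure, and all names are hypothetical.

```python
import numpy as np

def estimate_relative_rotation(rates_imu1, rates_imu2):
    """Find the rotation R such that rates_imu1[k] ~= R @ rates_imu2[k].

    rates_imu1, rates_imu2: (N, 3) arrays of simultaneous gyroscope samples
    (rad/s) from two IMUs rigidly attached to the same frame. Both IMUs
    observe the same rigid-body rotation, so paired samples differ only by
    the (unknown) relative orientation of the IMU coordinate systems.
    """
    B = rates_imu1.T @ rates_imu2                    # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(B)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])   # enforce det(R) = +1
    return U @ D @ Vt                                # rotation: IMU2 frame -> IMU1 frame
```

The fit is only well conditioned when the glasses rotate about more than one axis during the sampling window; motion about a single axis leaves the estimate ambiguous.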
  • FIG. 5 is a front view of smart glasses with a first IMU and a second IMU configured to measure a movement of the smart glasses according to an implementation of the present disclosure.
  • a frame 500 of the smart glasses experiences a movement 501.
  • a first inertial measurement unit (i.e., first IMU 510) is configured to obtain a first measurement 511 of the movement.
  • the first measurement 511 is in a first frame of reference (i.e., first coordinate system) associated with the first IMU 510.
  • the first measurement includes an x-rotation (R_X1), a y-rotation (R_Y1), and a z-rotation (R_Z1).
  • a second inertial measurement unit (i.e., second IMU 520) is configured to obtain a second measurement of the movement.
  • the second measurement is in a second frame of reference (i.e., second coordinate system) associated with the second IMU 520.
  • the second measurement includes an x-rotation (R_X2), a y-rotation (R_Y2), and a z-rotation (R_Z2).
  • as shown, the frames of reference are aligned but offset along the frame of the smart glasses (i.e., a factory setting).
  • This factory setting may be known or set at the time of fabrication and may provide an initial condition to which subsequent measurements are compared.
  • the IMUs may be coupled to cameras and are rigidly coupled to the frame 500 of the smart glasses.
  • the cameras can be a left eye-tracking camera disposed in a left lens portion of the frame of the smart glasses and a right eye-tracking camera disposed in a right lens portion of the frame. Because the IMUs are rigidly coupled to the frame, each IMU will measure different versions of the movement 501.
  • the rotation measurement of the second IMU 520 may have a component with an opposite sign as compared to a corresponding component of the rotation measurement of the first IMU 510. Comparing the first measurement and the second measurement of the movement can be used to estimate the relative pose difference between the first IMU 510 and the second IMU 520. Because each IMU may be rigidly coupled to a camera, the IMU-IMU extrinsics may be used to derive the corresponding camera extrinsics. The derived camera extrinsics may then be stored in a calibration that can be used to adjust or modify the images captured by the cameras so that they can be combined or compared.
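  • The derivation of camera extrinsics from IMU-IMU extrinsics can be viewed as a chain of rigid transforms. The sketch below assumes the camera-to-IMU mounting transforms are available (e.g., from a factory calibration); the numeric values and names are illustrative only.

```python
import numpy as np

def pose(R, t):
    """Pack a 3x3 rotation and a translation vector into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Fixed camera<->IMU mounts from the factory calibration (illustrative values).
T_cam1_imu1 = pose(np.eye(3), np.array([0.005, 0.0, 0.0]))   # camera1 <- IMU1
T_imu2_cam2 = pose(np.eye(3), np.array([-0.005, 0.0, 0.0]))  # IMU2 <- camera2

# Relative pose estimated online from the two IMUs (illustrative value).
T_imu1_imu2 = pose(np.eye(3), np.array([0.0, -0.13, 0.0]))   # IMU1 <- IMU2

# Chaining the transforms yields the camera1 <- camera2 extrinsics for the calibration.
T_cam1_cam2 = T_cam1_imu1 @ T_imu1_imu2 @ T_imu2_cam2
```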
  • Various rotations may be measured and the disclosure is not limited to the example shown in FIG. 5. For example, even no movement may provide information about the IMUs. For example, a non-zero measurement made during a zero rotation may be associated with an error (i.e., bias) of the gyroscope, which may be used in the estimate of the relative pose difference.
  • FIG. 6 is a flowchart of a method of online calibration of smart glasses according to a possible implementation of the present disclosure.
  • the method 600 may be performed without participation (or knowledge) of a user. In other words, the method 600 may be transparent to a user. The method 600 may be performed while a user is otherwise using the smart glasses. In other words, the method 600 may be performed online.
  • Each iteration of the method 600 may result in an estimate of a relative pose difference between IMUs (i.e., IMU-IMU extrinsics). In other words, no integration period may be necessary for an estimate, but an estimate may require multiple iterations of the method 600 to converge on an estimate with confidence (i.e., with an error below a threshold).
  • the method 600 may be performed (i.e., executed) periodically or whenever it is likely that the cameras have shifted from their default (i.e., factory) set positions. Accordingly, the method 600 includes triggering 610 a process to update a calibration corresponding to the relative positions/orientation of the cameras. In other words, the method includes triggering 610 a calibration (e.g., online calibration).
  • the triggering can be based on a change in temperature. In a possible implementation, a temperature sensor is monitored, and when the temperature changes by more than a threshold, a trigger signal may be generated to configure the processor of the smart glasses to perform the calibration.
  • the triggering can also be based on a change in a status of the smart glasses. For example, a sensor of the smart glasses may generate the trigger signal when the smart glasses are transitioned from an idle state (e.g., charging, no movement) to an operating state (e.g., movement, interaction).
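  • A trigger policy of this kind can be quite simple; the sketch below combines a temperature-change threshold with an idle-to-active status transition. The threshold value and names are assumptions for illustration, not values from the disclosure.

```python
def should_trigger_calibration(temp_c, last_calibration_temp_c,
                               was_idle, is_idle, temp_threshold_c=2.0):
    """Return True when an online calibration should be (re)started."""
    temperature_changed = abs(temp_c - last_calibration_temp_c) > temp_threshold_c
    became_active = was_idle and not is_idle   # e.g., taken off the charger and worn
    return temperature_changed or became_active
```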
  • the method 600 may further include obtaining 620 a first measurement from a first IMU 611 and a second measurement from a second IMU 621.
  • the first IMU 611 is mechanically coupled to a first camera 601 (e.g., left eye-tracking camera) and the second IMU 621 is mechanically coupled to a second camera 602 (e.g., right eye-tracking camera).
  • the first camera 601 and the first IMU 611 may be coupled to a first circuit board and the second camera 602 and the second IMU 621 may be coupled to a second circuit board.
  • the first circuit board may be separate from the second circuit board but both circuit boards may be coupled to a body (i.e., frame) of the smart glasses.
  • the cameras and the IMUs are either directly or indirectly connected so that the physical connection is mechanically stable (i.e., does not bend or break) in response to a movement.
  • the first camera 601 and the first IMU 611 can be held fixed in their relative positions by the first circuit board;
  • the second camera 602 and the second IMU 621 can be held fixed in their relative positions by the second circuit board, and;
  • the first circuit board and the second circuit board can be held fixed in their relative positions by the frame of the smart glasses.
  • the first measurement may be a first rotation measured by a first gyroscope of the first IMU 611 and the second measurement may be a second rotation measured by a second gyroscope of the second IMU 621.
  • the first measurement and the second measurement may each have three degrees of freedom (3DOF).
  • Each gyroscope may capture a rotation in three dimensions (e.g., x, y, z) aligned with a coordinate system of each gyroscope.
  • the coordinate systems of the gyroscopes may not be aligned, but the initial (i.e., factory-set) positions of the gyroscopes may be known and used in the comparison. Further, the coordinate system of a gyroscope may not be aligned with a coordinate system of its corresponding camera, but the relative positions may be known or measured (e.g., using image data and IMU data).
  • Obtaining 620 the rotation measurements of a movement of the smart glasses may include buffering the measurements from either (or both) IMUs. Buffering the measurements can synchronize the first measurement and the second measurement in time. The synchronization can help the comparison by improving the likelihood that each measurement corresponds to the same overall movement of the smart glasses.
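  • One simple way to realize this buffering and synchronization is to pair each buffered sample from one IMU with the nearest-in-time sample from the other and to drop pairs whose timestamps differ by more than a tolerance; the tolerance below is an illustrative assumption.

```python
import numpy as np

def pair_buffered_samples(buffer1, buffer2, tolerance_s=0.002):
    """Pair (timestamp, rate_vector) samples from two IMU buffers.

    Each sample in buffer1 is matched to the closest-in-time sample in
    buffer2; pairs whose timestamps differ by more than tolerance_s are
    dropped, so both measurements describe approximately the same movement.
    """
    if not buffer2:
        return []
    times2 = np.array([t for t, _ in buffer2])
    pairs = []
    for t1, w1 in buffer1:
        i = int(np.argmin(np.abs(times2 - t1)))
        if abs(times2[i] - t1) <= tolerance_s:
            pairs.append((w1, buffer2[i][1]))
    return pairs
```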
  • the method 600 further includes estimating 630 a relative pose difference between the first IMU 611 and the second IMU 621 (i.e., IMU-IMU extrinsics) based on the movement measurements (e.g., buffered rotation measurements).
  • the estimating may include comparing the first measurement and the second measurement.
  • the first measurement and the second measurement are applied to a Kalman filter.
  • the Kalman filter is configured to output a state that corresponds to a relative pose difference (e.g., pose transformation) between the IMUs.
  • the method 600 may be iterated in order for the Kalman filter to converge on a solution.
  • the method may include checking 640 the convergence of the Kalman filter to determine if a stable state has been obtained. If the estimate is not converged, the method 600 may repeat obtaining measurements and estimating extrinsics. In other words, the method 600 may obtain subsequent measurements of the movement of the smart glasses from the first IMU 611 and the second IMU 621 and apply the subsequent measurements to the Kalman filter until the state has converged. When the estimate has converged, the method 600 includes updating 650 a calibration file stored in a memory (i.e., calibration 660) to include the relative pose difference between the cameras corresponding to the first IMU 611 and the second IMU 621.
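  • A toy version of such a filter is sketched below: the state is a small-angle correction to a nominal IMU1-to-IMU2 rotation, each synchronized gyroscope pair provides one measurement update, and convergence is checked against the trace of the state covariance. This is a simplified, assumed formulation for illustration only, not the filter defined in the disclosure.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix such that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

class RelativePoseFilter:
    """Kalman-style estimator of a small-angle misalignment between two IMUs.

    State x (3-vector): R_12 ~= R_nominal @ (I + skew(x)), where R_12 maps
    vectors from the IMU2 frame into the IMU1 frame. Each update consumes a
    pair of synchronized angular-rate samples (rad/s).
    """

    def __init__(self, R_nominal, meas_var=1e-4, prior_var=1e-2):
        self.R_nom = R_nominal
        self.x = np.zeros(3)              # small-angle correction state
        self.P = np.eye(3) * prior_var    # state covariance
        self.R = np.eye(3) * meas_var     # measurement noise covariance

    def update(self, w1, w2):
        predicted = self.R_nom @ (np.eye(3) + skew(self.x)) @ w2
        residual = w1 - predicted
        H = -self.R_nom @ skew(w2)        # Jacobian of the prediction w.r.t. x
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ residual
        self.P = (np.eye(3) - K @ H) @ self.P

    def converged(self, trace_tol=1e-6):
        return float(np.trace(self.P)) < trace_tol
```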
  • the calibration method may be enhanced or supplemented using image data from the first camera 601 and the second camera 602.
  • camera data (e.g., VST camera data) may be analyzed with IMU data to help determine or refine the estimate.
  • image analysis may help refine a measurement of a relative movement measured by an IMU.
  • Image and IMU data may be analyzed (e.g., compared) to determine a relative position/orientation (i.e., pose) difference between the IMU and its respective camera (i.e., IMU-CAM extrinsics).
  • the method 600 may include checking 670 to see if image analysis is available (e.g., images captured, battery level suitable). If so, the method 600 may use sensor fusion to analyze the image and IMU (i.e., rotation) data to update 680 extrinsics (e.g., CAM-IMU extrinsics, CAM-CAM extrinsics), which can be recorded with the calibration 660.
  • Values for an initial relative transformation between pairs of IMUs may be stored in the calibration at a time of manufacture.
  • the method may update these values continually or regularly to account for changes due to age (i.e., wear, use), temperature, and the like.
  • Because the disclosed method may be performed while a user operates the smart glasses, it may be referred to as being performed "online".
  • the methods may be considered an online calibration method for determining transformations between the frames of two cameras.
  • FIG. 7 is a flowchart of a method 700 for deriving camera extrinsics according to a possible implementation of the present disclosure. When multiple pairs of cameras, sensors, and/or projectors exist, the transform between a first pair may be based on a transform between other pairs.
  • a system including three IMUs (1, 2, 3) may have three possible frame transformations (1-2, 2-3, and 1-3).
  • a transform between a first frame pair (1-3) of the three frame pairs (1-2, 2-3, 1-3) may be a function of (e.g., the sum of) the transform between a second frame pair (1-2) and a third frame pair (2-3).
  • the method 700 of computing transforms between pairs of cameras, sensors, and/or projectors may be extended mathematically to cover more (e.g., all) possible transformations.
  • the method 700 includes estimating 710 camera extrinsics based on the first IMU 701 and the second IMU 702 (i.e., CAM1-CAM2 extrinsics).
  • the method 700 further includes estimating 720 camera extrinsics based on the second IMU 702 and the third IMU 703 (i.e., CAM2-CAM3 extrinsics).
  • the method 700 further includes deriving 730 a relative position/orientation/offset difference between the first IMU 701 (i.e., first camera) and the third IMU 703 (i.e., third camera) based on the CAM1-CAM2 extrinsics and the CAM2-CAM3 extrinsics. Deriving 730 (i.e., rather than estimating from IMU data) may provide more efficient estimates when the number of cameras in the smart glasses is large.
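  • Expressed with 4x4 homogeneous transforms, the derivation in FIG. 7 reduces to a matrix product; a minimal sketch with illustrative placeholder values:

```python
import numpy as np

# T_a_b: pose of frame b expressed in frame a (4x4 homogeneous matrix).
# Identity placeholders stand in for the two pairs estimated from IMU data.
T_cam1_cam2 = np.eye(4)
T_cam2_cam3 = np.eye(4)

# The CAM1-CAM3 extrinsics are derived by composition rather than estimated directly.
T_cam1_cam3 = T_cam1_cam2 @ T_cam2_cam3
```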
  • FIG. 8 is a flowchart of a method for reducing a misalignment in a binocular display using camera extrinsics according to a possible implementation of the present disclosure.
  • the binocular display includes a left display for a left eye of the user and a right display for a right eye of the user.
  • the method 800 includes obtaining 810 (i.e., capturing) left eye images from a left eye-tracking camera and right eye images from a right eye-tracking camera.
  • the method 800 further includes determining 820 relative positions between the left eye and the left display based on the left eye images and determining 820 relative positions between the right eye and the right display based on the right eye images.
  • This determination may be based on a relative pose difference (i.e., camera extrinsics) between the left eye camera and the right eye camera.
  • the method may include retrieving left/right eye-tracking camera extrinsics from the stored calibration 660.
  • the method 800 further includes adjusting 830 the left images displayed on the left display and adjusting 830 the right images displayed on the right display based on the relative positions of the displays with respect to the eyes. The adjustment can reduce a vertical misalignment between the left and right displayed images, which can reduce eye strain for a user viewing the binocular display.
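  • As a rough sketch of the adjustment step, a small vertical angular misalignment taken from the camera extrinsics can be converted into a display pixel offset and applied to one of the rendered images. The focal-length handling, sign convention, and whole-pixel shift below are simplifying assumptions, not the adjustment defined in the disclosure.

```python
import numpy as np

def vertical_offset_px(vertical_misalignment_rad, focal_length_px):
    """Convert a small vertical angular misalignment into a pixel offset."""
    return focal_length_px * np.tan(vertical_misalignment_rad)

def shift_image_vertically(image, offset_px):
    """Shift an image down (positive offset) or up (negative), zero-filling the edges."""
    shifted = np.zeros_like(image)
    o = int(round(offset_px))
    if o > 0:
        shifted[o:] = image[:-o]
    elif o < 0:
        shifted[:o] = image[-o:]
    else:
        shifted = image.copy()
    return shifted
```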
  • the online calibration can include capturing first sensor data from a first sensor of a head-worn device and capturing second sensor data from a second sensor of the head-worn device.
  • the online calibration can further include determining an instantaneous local acceleration and rotation of the first sensor, while ignoring a first sensor noise and/or a first sensor bias of the first sensor.
  • the online calibration further includes determining an instantaneous local acceleration and rotation of the second sensor, while ignoring a second sensor noise and/or a second sensor bias of the second sensor.
  • the online calibration further includes determining a misalignment between the first sensor data and the second sensor data based on a differential analysis of the acceleration and rotation of the first sensor and the acceleration and rotation of the second sensor. The differential analysis can compensate for the sensor noise and sensor bias.
  • the online calibration further includes adjusting the first sensor data and/or the second sensor data based on a calibration to correct for the misalignment.
  • the first sensor can be a first IMU configured to capture the instantaneous acceleration and rotation local to the first IMU (i.e., first IMU data).
  • the second sensor can be a second IMU configured to capture instantaneous acceleration and rotation local to the second IMU (i.e., second IMU data).
  • determining the misalignment between the first sensor data and the second sensor data is based on the first IMU data and the second IMU data. This determination can include determining at least one difference between nominal extrinsics (e.g., corresponding to no misalignment) and actual extrinsics (e.g., corresponding to the determined misalignment).
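  • For the rotational part, the size of the difference between nominal and actual extrinsics can be summarized as the angle of the relative rotation; a short, rotation-only sketch with illustrative names:

```python
import numpy as np

def rotational_misalignment_deg(R_nominal, R_actual):
    """Angle, in degrees, of the rotation separating nominal from actual extrinsics."""
    R_delta = R_nominal.T @ R_actual
    cos_angle = np.clip((np.trace(R_delta) - 1.0) / 2.0, -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_angle)))
```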
  • Example 1 A method, comprising: coupling a first camera at a first fixed position with respect to a first inertial measurement unit; coupling a second camera at a second fixed position with respect to a second inertial measurement unit; coupling the first inertial measurement unit and the second inertial measurement unit to a frame of smart glasses; obtaining a first measurement of a movement of the smart glasses from the first inertial measurement unit; obtaining a second measurement of the movement of the smart glasses from the second inertial measurement unit; comparing the first measurement and the second measurement to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration based on the relative pose difference.
  • Example 2 The method according to Example 1, wherein the first camera and the first inertial measurement unit are both on a first circuit board and the second camera and the second inertial measurement unit are both on a second circuit board.
  • Example 3 The method according to Example 1, wherein the first camera is a left eye-tracking camera and the second camera is a right eye-tracking camera.
  • Example 4 The method according to Example 3, wherein the left eye-tracking camera is disposed in a left lens portion of the frame of the smart glasses and the right eye-tracking camera is disposed in a right lens portion of the frame of the smart glasses.
  • Example 5 The method according to Example 1, wherein the first measurement is captured by a first gyroscope of the first inertial measurement unit and the second measurement is captured by a second gyroscope of the second inertial measurement unit.
  • Example 6 The method according to Example 5, wherein the first measurement is a first rotation and the second measurement is a second rotation, the first rotation and the second rotation each having three degrees of freedom.
  • Example 7 The method according to Example 1, further comprising buffering data from the first inertial measurement unit and the second inertial measurement unit to synchronize the first measurement and the second measurement in time.
  • Example 8 The method according to Example 1, wherein comparing the first measurement and the second measurement to estimate the relative pose difference between the first inertial measurement unit and the second inertial measurement unit includes: applying the first measurement and the second measurement to a Kalman filter, the Kalman filter is configured to output a state corresponding to the relative pose difference.
  • Example 9 The method according to Example 8, further comprising: obtaining subsequent measurements of the movement of the smart glasses from the first inertial measurement unit and the second inertial measurement unit; and applying the subsequent measurements to the Kalman filter until the state is converged.
  • Example 10 The method according to Example 1, wherein the method is an online process to update the calibration, the method further including: triggering the online process to update the calibration based on a change in temperature of the smart glasses.
  • Example 11 The method according to Example 1, wherein the method is an online process to update the calibration, the method further including: triggering the online process to update the calibration based on a change in status of the smart glasses.
  • Example 12 Smart glasses comprising: a first camera mechanically coupled to a first inertial measurement unit; a second camera mechanically coupled to a second inertial measurement unit; a frame mechanically coupling the first inertial measurement unit and the second inertial measurement unit; and a processor configured by software instructions to perform a calibration including: obtaining a first measurement of a movement of the smart glasses from the first inertial measurement unit; obtaining a second measurement of the movement of the smart glasses from the second inertial measurement unit; comparing the first measurement and the second measurement to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration file based on the relative pose difference.
  • Example 13 The smart glasses according to Example 12, wherein the first camera and the first inertial measurement unit are both on a first circuit board and the second camera and the second inertial measurement unit are both on a second circuit board.
  • Example 14 The smart glasses according to Example 12, wherein the first camera is a left eye-tracking camera and the second camera is a right eye-tracking camera.
  • Example 15 The smart glasses according to Example 14, wherein the left eye-tracking camera is disposed in a left lens portion of the frame of the smart glasses and the right eye-tracking camera is disposed in a right lens portion of the frame of the smart glasses.
  • Example 16 The smart glasses according to Example 12, wherein the first measurement is captured by a first gyroscope of the first inertial measurement unit and the second measurement is captured by a second gyroscope of the second inertial measurement unit.
  • Example 17 The smart glasses according to Example 16, wherein the first measurement is a first rotation and the second measurement is a second rotation, the first rotation and the second rotation each having three degrees of freedom.
  • Example 18 The smart glasses according to Example 12, wherein the calibration further includes: buffering data from the first inertial measurement unit and the second inertial measurement unit to synchronize the first measurement and the second measurement in time.
  • Example 19 The smart glasses according to Example 12, wherein comparing the first measurement and the second measurement to estimate the relative pose difference between the first inertial measurement unit and the second inertial measurement unit includes: applying the first measurement and the second measurement to a Kalman filter, the Kalman filter configured to output a state corresponding to the relative pose difference.
  • Example 20 The smart glasses according to Example 12, wherein the calibration is performed online while a user is otherwise operating the smart glasses and wherein the calibration is triggered based on a change in a temperature of the smart glasses or a status of the smart glasses.
  • Example 21 Smart glasses comprising: a first camera and a first inertial measurement unit mechanically coupled to a first circuit board; a second camera and a second inertial measurement unit mechanically coupled to a second circuit board; a frame including the first circuit board and the second circuit board; and a processor in the frame that is electrically coupled to the first circuit board and the second circuit board, the processor configured by software instructions to perform an online calibration including: obtaining a first rotation measurement of the smart glasses from a first gyroscope of the first inertial measurement unit; obtaining a second rotation measurement of the smart glasses from a second gyroscope of the second inertial measurement unit; applying the first rotation measurement and the second rotation measurement to a Kalman filter to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration file to include camera extrinsics based on the relative pose difference.
  • Example 22 The smart glasses according to Example 21, wherein the online calibration is triggered based on a change in a temperature of the smart glasses or a status of the smart glasses.
  • Example 23 The smart glasses according to Example 21, wherein: the first camera and the first inertial measurement unit are held fixed in their relative positions by the first circuit board; the second camera and the second inertial measurement unit are held fixed in their relative positions by the second circuit board; and the first circuit board and the second circuit board are held fixed in their relative positions by the frame of the smart glasses.
  • Methods discussed above may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof.
  • the program code or code segments to perform the necessary tasks may be stored in a machine or computer readable medium such as a storage medium.
  • a processor(s) may perform the necessary tasks.

Abstract

A head-worn device, such as smart glasses, may be configured with multiple sensors to provide an augmented reality or virtual reality experience to a user. The sensors may be calibrated so that misalignments, such as a rotation or a translation, may be compensated when combining, or otherwise comparing, data from the sensors. Position sensors mechanically coupled to the multiple sensors may allow for the calibration to be continuously, or periodically, updated as the smart glasses change shape during use due to damage and/or adjustment.

Description

ONLINE CALIBRATION OF SENSOR ALIGNMENT IN A HEAD-WORN DEVICE
CROSS REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of U.S. Provisional Application No. 63/374,654, filed September 6, 2022, the disclosure of which is incorporated herein in its entirety.
FIELD OF THE DISCLOSURE
[0002] The present disclosure relates to an online calibration of sensors (e.g., cameras) of a head-worn device to enable combination or comparison of their sensor data.
BACKGROUND
[0003] A head-mounted device (i.e., head-worn device) may be configured to provide a user with a virtual reality (VR) or augmented reality (AR) experience. For example, the head-mounted device may be implemented as smart glasses in an optical see-through (OST) configuration in which virtual content can be displayed on a heads-up display (HUD) through which the user can view the world. Alternatively, the head-mounted device may be implemented as smart glasses in a video see-through (VST) configuration in which virtual content can be displayed on a display on which the user can view images of the world captured by a camera. In either case, the displayed content may be presented to both eyes of the user in a binocular display. The quality of this binocular (i.e., stereoscopic) display may require the images to be aligned for each eye. For example, a vertical misalignment between the left and right displays and the corresponding left and right eyes of the user may cause significant strain on the user’s eyes.
SUMMARY
[0004] The proposed solution uses a plurality of inertial measurement units (e.g., 2~6 IMUs) to estimate relative poses between cameras to detect and compensate for vertical misalignment in content displayed on a binocular display in a head-worn device, such as smart glasses.
[0005] In some aspects, the techniques described herein relate to a method, including: coupling a first camera at a first fixed position with respect to a first inertial measurement unit; coupling a second camera at a second fixed position with respect to a second inertial measurement unit; coupling the first inertial measurement unit and the second inertial measurement unit to a frame of smart glasses; obtaining a first measurement of a movement of the smart glasses from the first inertial measurement unit; obtaining a second measurement of the movement of the smart glasses from the second inertial measurement unit; comparing the first measurement and the second measurement to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration based on the relative pose difference.
[0006] In some aspects, the techniques described herein relate to smart glasses including: a first camera mechanically coupled to a first inertial measurement unit; a second camera mechanically coupled to a second inertial measurement unit; a frame mechanically coupling the first inertial measurement unit and the second inertial measurement unit; and a processor configured by software instructions to perform a calibration including: obtaining a first measurement of a movement of the smart glasses from the first inertial measurement unit; obtaining a second measurement of the movement of the smart glasses from the second inertial measurement unit; comparing the first measurement and the second measurement to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration file based on the relative pose difference.
[0007] In some aspects, the techniques described herein relate to smart glasses including: a first camera and a first inertial measurement unit mechanically coupled to a first circuit board; a second camera and a second inertial measurement unit mechanically coupled to a second circuit board; a frame including the first circuit board and the second circuit board; and a processor in the frame that is electrically coupled to the first circuit board and the second circuit board, the processor configured by software instructions to perform an online calibration including: obtaining a first rotation measurement of the smart glasses from a first gyroscope of the first inertial measurement unit; obtaining a second rotation measurement of the smart glasses from a second gyroscope of the second inertial measurement unit; applying the first rotation measurement and the second rotation measurement to a Kalman filter to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration file to include camera extrinsics based on the relative pose difference.
[0008] The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the disclosure, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] FIG. 1 is a top-view of optical see-through (OST) smart glasses according to a possible implementation of the present disclosure.
[0010] FIG. 2 is a top-view of video see-through (VST) smart glasses according to a possible implementation of the present disclosure.
[0011] FIG. 3 is a block diagram of smart glasses according to a possible implementation of the present disclosure.
[0012] FIG. 4 is a system block diagram of an IMU configured for a 6DOF measurement according to a possible implementation of the present disclosure.
[0013] FIG. 5 is a front view of smart glasses with a first IMU and a second IMU configured to measure a movement of the smart glasses according to an implementation of the present disclosure.
[0014] FIG. 6 is a flowchart of a method of online calibration of smart glasses according to a possible implementation of the present disclosure.
[0015] FIG. 7 is a flowchart of a method for deriving camera extrinsics according to a possible implementation of the present disclosure.
[0016] FIG. 8 is a flowchart of a method for reducing a misalignment in a binocular display using camera extrinsics according to a possible implementation of the present disclosure.
[0017] The components in the drawings are not necessarily to scale relative to each other. Like reference numerals designate corresponding parts throughout the several views.
DETAILED DESCRIPTION
[0018] Accurate display of virtual content (e.g., AR content, VR content) may require precise positioning based on images captured by different cameras in a head-mounted device (e.g., smart glasses). For example, the relative position/orientation of a world-facing camera and an eye-tracking camera may be used to determine where on a heads-up display (i.e., HUD) to render virtual content for a user. Further, when rendering stereoscopically, a very small (e.g., vertical) misalignment between the relative positions of the left/right eyes with respect to the left/right heads-up displays may cause discomfort to a user. One technical problem facing accurate display is that the cameras may change relative position/orientation (i.e., relative poses) due to deformation, use (e.g., aging), and temperature. Accordingly, it may be desirable to determine (i.e., calibrate) the extrinsics between cameras on smart glasses during use (i.e., online) in a way that is not disruptive to a user. A technical problem with this online calibration is that it may be too computationally complex for the limited computing and power resources of smart glasses. The disclosed smart glasses address these technical problems by rigidly coupling a position sensor (i.e., IMU) to each camera and using the IMU data (i.e., instead of camera data) to determine camera extrinsics. The disclosure further provides a method to simplify the extrinsic calculation in order to make the process practical for the limited resources of the smart glasses.
[0019] In the disclosure, the relative position/orientation (i.e., pose) between cameras is referred to as the extrinsics of the cameras (i.e., camera extrinsics). One aspect of the invention is mechanically coupling an IMU to each camera in the smart glasses and determining camera extrinsics for the cameras based on IMU data. In other words, the smart glasses of the disclosure include a plurality of IMUs (i.e., multi IMUs) to sense relative camera poses on the device. This is different from a device that includes multiple IMUs to reduce noise for a single pose measurement. The number of IMUs in the smart glasses may match the number of camera sensors in the smart glasses. The IMU measurements may consume less power for sensing camera extrinsics than image measurements and therefore may be better suited to the resources of the smart glasses. In some implementations, however, the IMU data can be combined with image data from the cameras (i.e., sensor fusion) to determine the camera extrinsics. These implementations may consume more power, but may offer increased accuracy.
[0020] In the disclosure, the determination of extrinsics while a user is operating the smart glasses is referred to as online calibration (i.e., OC). Another aspect of the invention is that misalignments may be responded to as they happen and without user intervention (or knowledge) due to the online calibration. In other words, the disclosed approach can be computed instantaneously (i.e., instantaneous estimates). This is different from other approaches that require integration. An online calibration may be triggered on a schedule (e.g., periodically) or in response to an event corresponding to a misalignment. For example, an online calibration could be triggered in response to a change in temperature or in response to a change in status (e.g., use from non-use). Triggering an online calibration in response to an event could be advantageous from a power efficiency standpoint.
[0021] In the disclosure, the smart glasses are considered to have a rigid or semi-rigid frame to which the IMUs (and the cameras) are affixed. In a rigid frame, a force applied to the frame may move the frame without deforming the frame so that the IMUs (and the cameras) remain in a fixed position with respect to each other. In a semi-rigid frame, portions of the frame may be moved but the movement is constrained in a particular direction (e.g., temple hinge). In either case, a movement of one portion of the frame can be related to another portion of the frame according to the shape and dimensions of the frame. Accordingly, another aspect of the invention is that a movement of the frame of glasses may be measured simultaneously by each IMU and compared to determine camera extrinsics. For example, a rotation rate of the frame may be measured by a first IMU and a second IMU. The measurements may be compared to determine their relative orientation. For example, a rotation that is identically measured, in three dimensions, by each IMU may indicate that the IMUs are aligned (in x, y, z). In a possible implementation, the measurements may be applied to a Kalman filter in order to estimate the measurements in the presence of noise. Likewise, a camera and an IMU may be mechanically coupled in a fixed spatial relationship (i.e., rigid connection) so that a movement (e.g., rotation) of the camera may be sensed by the IMU. For example, the camera and the IMU may be mechanically coupled to a circuit board (e.g., FR4) that does not bend in response to a movement of the smart glasses. Accordingly, couplings, connections, or elements that are described as rigid imply that a movement does not cause the coupling, connection, or element to significantly deform (e.g., change shape, bend).
[0022] In the disclosure, the smart glasses may be configured to display stereoscopic images to a user’s left eye and right eye. For an optimal stereoscopic experience, the display of these images should each be aligned with the user’s respective eye. An aspect of the present disclosure is that a misalignment between left and right images of a stereoscopic display is reduced based on camera extrinsics measured for eye tracking cameras of the left eye and the right eye. Smart glasses may include a plurality of sensors configured to sense the user wearing the head-worn device, a plurality of sensors configured to sense the real environment, and a binocular display configured to display information regarding the environment. The sensors and binocular display work together to provide the user with virtual information displayed as if it were located in the real environment. Problems may occur when the sensed information about the user or the environment is incorrect. For example, misalignments in the displayed information may interfere with an experience of a user.
[0023] A binocular display misalignment (e.g., vertical misalignment) is correlated with a level of comfort experienced by a user of a head-worn device for extended reality (e.g., augmented reality, virtual reality). The display vertical misalignment (DVM) can be caused by a misalignment of sensors attached to, or otherwise integrated with, a body (i.e., frame) of the device. Previous image-based approaches to correct for misalignments may not be practical in a mobile environment due to power requirements and times required for processing (e.g., integrating). One technical effect of the disclosed systems and methods is that a vertical misalignment can be corrected in a manner that is faster and more power efficient than entirely image-based approaches.
[0024] FIG. 1 is a top view of smart glasses in an OST configuration according to a possible implementation of the present disclosure. The OST smart glasses may be worn on the face of a user like conventional glasses but can have sensors integrated within the temples, frames, and/or lenses of the glasses. For example, the smart glasses may include a left temple arm 121 (i.e., left temple) and a right temple arm 122 (i.e., right temple). The left temple and the right temple are coupled by flexible hinges to a left side and a right side of a frame holding a left lens 111 and a right lens 112. The smart glasses may have a target alignment (i.e., expected configuration, factory setting) between the left and right temples and the frames when worn on the head of a user. For example, the target alignment may include the temples parallel with each other and orthogonal to the frame. Deviations from this target alignment may occur when the flexible hinges are broken or otherwise strained from use, which can cause errors in the sensing associated with the smart glasses. One reason for this may be that a calibration, which is used to compare and/or combine the data from the sensors, may be generated based on the target alignment. A calibration may include extrinsics that specify the relative poses (e.g., translations, rotations) of sensors (e.g., cameras, inertial sensors, etc.) of the smart glasses in space (e.g., relative to a global coordinate system).
[0025] In a possible implementation, an eye-tracking (ET) camera is positioned in a temple of the OST smart glasses to capture images of an eye of the user. Additionally, a front-facing (i.e., head tracking (HT)) camera is positioned in the frame of the smart glasses to capture images of the environment viewed by the eye of the user. A calibration may be used to correlate the images from these cameras. This calibration may become less accurate when the temple and the frame are misaligned (i.e., relative to the target alignment). As a result, it may be difficult, or impossible, to determine exactly where, in the image of the environment, the user is looking. Accordingly, it may be desirable to adjust the calibration to accommodate misalignments of the sensors from a factory setting.
[0026] FIG. 2 is a top-view of smart glasses in a video see-through (VST) configuration according to a possible implementation of the present disclosure. In a possible implementation, a left video see-through camera (i.e., left VST camera) is located at a left lens position and a right video see-through camera (i.e., right VST camera) is located at a right lens position. Both cameras are directed towards the environment of the user and are configured to capture stereo views of the environment that can be displayed on a binocular display of a user. Misalignment (e.g., vertical misalignment) between the left and right views may occur when the frames are bent or otherwise distorted from damage or use. When this misalignment becomes too large (e.g., > 4 arcmin), a user may experience negative symptoms. Accordingly, it may be desirable to adjust the alignment to accommodate misalignments of the cameras from a factory setting. The automatic adjustment is referred to as online calibration (OC).
[0027] An inertial measurement unit (IMU) may be included with each camera of smart glasses to monitor and update the extrinsics of the smart glasses. The cameras included in the smart glasses may include any combination of head-tracking camera (i.e., HeT-cam), hand-tracking camera (i.e., HT-cam), head/hand-tracking camera (i.e., HeT/HT-cam), eye-tracking camera (i.e., ET-cam), face-tracking camera (i.e., FT-cam), and video see-through camera (i.e., VST-cam).
[0028] As shown in FIG. 1, the smart glasses 100 in the OST configuration can include a left ET-cam 131 that includes a left ET-IMU 133, a right ET-cam 132 that includes a right ET-IMU 134, a left HeT/HT-cam 143 that includes a left HeT/HT-IMU 141, a right HeT/HT-cam 144 that includes a right HeT/HT-IMU 142, and an FT-cam 150 that may include an FT-IMU.
[0029] As shown in FIG. 2, the smart glasses in the VST configuration can include a left ET-cam 131 that includes a left ET-IMU 133, a right ET-cam 132 that includes a right ET-IMU 134, a left HeT/HT-cam 143 that includes a left HeT/HT-IMU 141, a right HeT/HT-cam 144 that includes a right HeT/HT-IMU 142, a left VST-cam 163 that includes a left VST-IMU 161, a right VST-cam 164 that includes a right VST-IMU 162, and an FT-cam 150 that may include an FT-IMU.
TABLE 1: Possible IMU number and locations for smart glasses
[TABLE 1 appears as an image (imgf000010_0001) in the original publication.]
[0030] TABLE 1 includes IMU information for various sensors on the smart glasses in the different configurations. The rows of TABLE 1 may correspond to a need or importance of the multiple IMUs for online calibration of the smart glasses. For example, the IMUs for the OLED lens/pancake lens, which are used for a virtual reality (VR) head-mounted display (HMD) (e.g., the OLED lens is part of the image source, and the pancake lens projects the image to the eye of the user), may be highly desirable (e.g., required) to minimize vertical misalignment. Whereas, an IMU for the indirect time-of-flight (iToF) camera, which provides depth information, may be less desirable (e.g., not required), especially if its field-of-view (FOV) has an overlap with an infrared (IR) image FOV that is greater than an amount (e.g., 30 degrees).
[0031] The IMUs for the Left/Right VST-cams may not be required if each VST-cam is rigidly coupled to its corresponding VST-display.
[0032] The IMUs for the HeT/HT-cams may be optional (e.g., may not be used) if their FOVs have an overlap greater than an amount (e.g., 30 degrees).
[0033] The IMUs for the ET-cams may be necessary if a deformation of the ET-cam contributes significantly to a vertical misalignment. That is, if a deformation of the ET-cam can cause a vertical misalignment greater than a threshold (e.g., 4 arcmin).
[0034] The IMUs for the FT-cams may be optionally used if additional accuracy in the alignment with the other sensors is required.
[0035] FIG. 3 is a block diagram of smart glasses according to a possible implementation of the present disclosure. The smart glasses 300 includes a camera (e.g., first camera 311) configured to capture images of a field-of-view (e.g., first field-of-view 315). The motion-tracking device may further include a processor 350, and images from the first camera 311 may be analyzed by the processor to identify one or more features in the images for motion tracking. Tracking pixel positions of the one or more features over consecutive images may help to determine a motion (e.g., rotation) of the smart glasses 300. Each camera may have an IMU integrated with or otherwise affixed to the camera to measure its position and orientation. For example, the first camera 311 may include a first IMU 301 and the second camera 312 may include a second IMU 302.
[0036] The smart glasses 300 further includes a second camera 312 configured to capture images of a second field-of-view 316, which may overlap a portion of the first field-of-view 315. The cameras may be aligned and focused so that a first image (e.g., right image) of the first field- of-view and a second image (e.g., left image) of the second field-of-view may be combined to form a stereoscopic image. The stereoscopic images may help to track the one or more features in three dimensions.
[0037] The smart glasses 300 further includes a memory 360. The memory may be a non-transitory computer-readable medium and may be configured to store instructions that, when executed by the processor 350, can configure the motion-tracking device to perform the disclosed methods. For example, the memory 360 may be configured to store a calibration related to an estimated pose transformation (i.e., extrinsics) between the first camera 311 and the second camera 312. The head-mounted device may include a temperature sensor 340 configured to measure a temperature corresponding to a temperature of the first IMU 301 and/or a temperature of the second IMU 302. A change in temperature detected by the temperature sensor 340 may be used to trigger a calibration process to update the calibration.
[0038] The smart glasses 300 may further include a binocular display 390. In a possible implementation, the display 390 is a heads-up display (i.e., HUD). The smart glasses 300 may further include a battery 380. The battery may be configured to provide energy to the subsystems, modules, and devices of the smart glasses 300 to enable their operation. The battery 380 may be rechargeable and have an operating life (e.g., lifetime) between charges.
[0039] A battery level may determine how the calibration is updated. For example, a lower battery level (e.g., below a threshold) may configure the calibration process to estimate the camera extrinsics using only data from the IMUs, while a higher battery level (e.g., above a threshold) may configure the calibration process to estimate the camera extrinsics using data from the IMUs plus images from the cameras.
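The following is an illustrative sketch (added editorially and not part of the original disclosure) of how the battery-dependent mode selection described above could look in software. It is written in Python; the threshold value, constant names, and function name are assumptions of this example.

IMU_ONLY = "imu_only"                # lower-power mode: extrinsics from IMU data alone
IMU_PLUS_IMAGES = "imu_plus_images"  # higher-power mode: IMU data fused with camera images

def select_calibration_mode(battery_level: float, threshold: float = 0.3) -> str:
    """Return a calibration mode based on the remaining battery fraction (0.0 to 1.0)."""
    if battery_level < threshold:
        return IMU_ONLY
    return IMU_PLUS_IMAGES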
[0040] The smart glasses 300 may further include a communication interface 370. The communication interface may be configured to communicate information digitally over a wireless communication link 371 (e.g., WiFi, Bluetooth, etc.). For example, the motion-tracking device may be communicatively coupled to a network 372 (i.e., the cloud) or a device (e.g., mobile phone 373) over the wireless communication link 371. The wireless communication link may allow operations of a computer-implemented method to be divided between devices and/or could allow for remote storage of the calibration 361. In a possible implementation, the smart glasses are augmented-reality glasses. In another possible implementation, the smart glasses are a virtual-reality visor.
[0041] FIG. 4 is a system block diagram of an IMU configured for a 6DOF measurement according to a possible implementation of the present disclosure. The IMU 400 may output a motion tracking measurement having six components (i.e., 6 degrees of freedom (DOF)) including a linear acceleration in an x-direction (ax), a linear acceleration in a y-direction (ay), a linear acceleration in a z-direction (az), an angular rotation (Rx) about an x-axis (ROLL), an angular rotation (Ry) around a y-axis (PITCH), and an angular rotation (Rz) around a z-axis (YAW). The six components are relative to a coordinate system that may be aligned with, or define, a coordinate system of the IMU.
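As a minimal sketch of the 6DOF output described above (editorial illustration only; field names are assumptions, and the gyroscope values are expressed as angular rates), one possible representation of a single IMU sample is:

from dataclasses import dataclass

@dataclass
class ImuSample:
    t: float    # timestamp in seconds
    ax: float   # linear acceleration along x (m/s^2)
    ay: float   # linear acceleration along y (m/s^2)
    az: float   # linear acceleration along z (m/s^2)
    rx: float   # angular rate about x, roll (rad/s)
    ry: float   # angular rate about y, pitch (rad/s)
    rz: float   # angular rate about z, yaw (rad/s)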
[0042] The IMU 400 may include a gyroscope module 410 including an X-axis gyroscope configured to measure a first angular rotation 411 (i.e., Rx) around an X-axis of the coordinate system; a Y-axis gyroscope configured to measure a second angular rotation 412 (i.e., Ry) around a Y-axis of the coordinate system; and a Z-axis gyroscope configured to measure a third angular rotation 413 (i.e., Rz) around a Z-axis of the coordinate system associated with the AR device.
[0043] A gyroscope of the IMU 400 may be implemented as a micro-electromechanical system (MEMS) in which a movement of a mass affixed to springs can be capacitively sensed to determine rotation. The alignment of the mass and the springs can determine the axis of the sensed rotation. Accordingly, the IMU 400 may include three MEMS gyroscopes, each aligned to sense a corresponding rotation around an axis of the coordinate system.
[0044] The IMU 400 may further include an accelerometer module 420 that includes an X- axis accelerometer configured to measure a first linear acceleration (ax) in an X-direction; a Y- axis accelerometer configured to measure a second linear acceleration (ay) in a Y-direction; and a Z-axis accelerometer configured to measure a third linear acceleration (az) in a Z-direction.
[0045] An accelerometer of the IMU 400 may be implemented as a MEMS configured to capacitively sense a force (e.g., gravity 421) exerted on a movable mass to determine an acceleration. The accelerometer may sense displacement or velocity by processing (e.g., integrating) the acceleration over time.
[0046] The mechanical nature of the MEMS sensors described above can make their responses sensitive to changes in temperature and/or to changes in their installed environment. For example, a temperature change or a force due to use (or misuse) of the motion-tracking device can alter the sensitivity of the MEMS devices. For example, dropping or bending the motion-tracking device can cause a change in the installed environment, thereby changing the response of a gyroscope or an accelerometer of the IMU.
[0047] The changes described above can make the sensed (i.e., measured) rotations and/or displacements differ from the actual rotations and/or displacements. The differences between a measured parameter (e.g., rotation, displacement) and an actual parameter is referred to as a bias. An output of the IMU may be considered as including an IMU measurement (IMU_MEAS) and a bias (BIAS). When the bias is zero, the measured parameter matches the actual parameter. Accordingly, it may be desirable to estimate/compensate for the bias for any, or all, outputs of the IMU.
[0048] As mentioned above, the bias may be a function of temperature (i.e., BIAS(T)). Accordingly, the IMU 400 may include a temperature sensor 440 within or near the IMU 400 so as to measure a temperature (T) that corresponds to the temperature of the gyroscope module 410 and the accelerometer module 420.
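Purely as an illustrative sketch of the temperature-dependent bias relationship noted above (not part of the disclosure), a simple correction could model BIAS(T) as linear in temperature and subtract it from the raw gyroscope output. The linear model, coefficient names, and reference temperature are assumptions of this example.

import numpy as np

def correct_gyro(raw_rate: np.ndarray, temperature_c: float,
                 b0: np.ndarray, b1: np.ndarray, t_ref: float = 25.0) -> np.ndarray:
    """Subtract an estimated temperature-dependent bias from a raw 3-axis gyro sample.

    Assumes BIAS(T) = b0 + b1 * (T - t_ref), with b0, b1 calibrated per axis."""
    bias = b0 + b1 * (temperature_c - t_ref)
    return raw_rate - bias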
[0049] In a possible implementation, the IMU 400 can further include a magnetometer 430 that includes an X-axis magnetometer configured to measure a first magnetic field strength (i.e., Hx) in an X-direction of the coordinate system, a Y-axis magnetometer configured to measure a second magnetic field strength (i.e., Hy) in a Y-direction of the coordinate system, and a Z-axis magnetometer configured to measure a third magnetic field strength (i.e., Hz) in a Z-direction of the coordinate system. The magnetic field strengths may be relative to the Earth’s magnetic field 431 (i.e., north (N)).
[0050] The relative pose difference between IMUs (i.e., IMU-IMU extrinsics) may be estimated based on movement measurements measured by different IMUs on the smart glasses. For example, measurements of the same movement relative to each IMU’s frame of reference may be compared to estimate how the frames of reference differ. A 3DOF estimate of this difference may be obtained by comparing the three-dimensional (3DOF) measurements captured by the gyroscope of each IMU.
[0051] FIG. 5 is a front view of smart glasses with a first IMU and a second IMU configured to measure a movement of the smart glasses according to an implementation of the present disclosure. As shown, a frame 500 of the smart glasses experiences a movement 501. A first inertial measurement unit (i.e., first IMU 510) is configured to obtain a first measurement 511 of the movement. The first measurement 511 is in a first frame of reference (i.e., first coordinate system) associated with the first IMU 510. For example, the first measurement includes an x-rotation (Rx1), a y-rotation (Ry1), and a z-rotation (Rz1). A second inertial measurement unit (i.e., second IMU 520) is configured to obtain a second measurement 521 of the movement. The second measurement is in a second frame of reference (i.e., second coordinate system) associated with the second IMU 520. For example, the second measurement includes an x-rotation (Rx2), a y-rotation (Ry2), and a z-rotation (Rz2).
[0052] For the example shown, the frames of reference are aligned but offset along the frame of the smart glasses. This factory setting may be known or set at the time of fabrication and may provide an initial condition to which subsequent measurements are compared. The IMUs may be coupled to cameras and are rigidly coupled to the frame 500 of the smart glasses. For example, the cameras can be a left eye-tracking camera disposed in a left lens portion of the frame of the smart glasses and a right eye-tracking camera disposed in a right lens portion of the frame. Because the IMUs are rigidly coupled to the frame, each IMU will measure a different version of the movement 501. For example, the rotation measurement of the second IMU 520 may have a component with an opposite sign as compared to a corresponding component of the rotation measurement of the first IMU 510. Comparing the first measurement and the second measurement of the movement can be used to estimate the relative pose difference between the first IMU 510 and the second IMU 520. Because each IMU may be rigidly coupled to a camera, the IMU-IMU extrinsics may be used to derive the corresponding camera extrinsics. The derived camera extrinsics may then be stored in a calibration that can be used to adjust or modify the images captured by the cameras so that they can be combined or compared.
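By way of editorial illustration (this sketch is not the algorithm recited in the disclosure), the comparison of rotation measurements can be expressed as a least-squares fit: because both gyroscopes ride on the same rigid frame, synchronized angular-rate vectors satisfy w1 ≈ R · w2, where R is the unknown relative rotation, and R can be recovered with a standard Kabsch/Wahba (SVD-based) solution. The use of Python/NumPy and the function name are assumptions; varied motion about multiple axes is needed for the estimate to be well conditioned.

import numpy as np

def estimate_relative_rotation(w1: np.ndarray, w2: np.ndarray) -> np.ndarray:
    """w1, w2: (N, 3) arrays of time-synchronized gyro samples from IMU 1 and IMU 2.

    Returns the 3x3 rotation mapping vectors expressed in the IMU-2 frame into
    the IMU-1 frame, minimizing sum ||w1_i - R @ w2_i||^2 (Kabsch solution)."""
    h = w2.T @ w1                                  # 3x3 correlation matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))         # enforce a proper rotation (det = +1)
    return vt.T @ np.diag([1.0, 1.0, d]) @ u.T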
[0053] Various rotations may be measured and the disclosure is not limited to the example shown in FIG. 5. For example, even the absence of movement may provide information about the IMUs: a non-zero measurement made during a zero rotation may be associated with an error (i.e., bias) of the gyroscope, which may be used in the estimate of the relative pose difference.
[0054] FIG. 6 is a flowchart of a method of online calibration of smart glasses according to a possible implementation of the present disclosure. The method 600 may be performed without participation (or knowledge) of a user. In other words, the method 600 may be transparent to a user. The method 600 may be performed while a user is otherwise using the smart glasses. In other words, the method 600 may be performed online. Each iteration of the method 600 may result in an estimate of a relative pose difference between IMUs (i.e., IMU-IMU extrinsics). In other words, no integration period may be necessary for an estimate, but an estimate may require multiple iterations of the method 600 to converge on an estimate with confidence (i.e., with an error below a threshold).
[0055] The method 600 may be performed (i.e., executed) periodically or whenever it is likely that the cameras have shifted from their default (i.e., factory-set) positions. Accordingly, the method 600 includes triggering 610 a process to update a calibration corresponding to the relative positions/orientations of the cameras. In other words, the method includes triggering 610 a calibration (e.g., online calibration). The triggering can be based on a change in temperature. In a possible implementation, a temperature sensor is monitored and when the temperature changes by more than a threshold, a trigger signal may be generated to configure the processor of the smart glasses to perform the calibration. The triggering can also be based on a change in a status of the smart glasses. For example, a sensor of the smart glasses may generate the trigger signal when the smart glasses are transitioned from an idle state (e.g., charging, no movement) to an operating state (e.g., movement, interaction).
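A minimal sketch of this triggering logic follows (editorial illustration only; the temperature threshold and function name are assumptions, not values from the disclosure).

def should_trigger_calibration(prev_temp_c: float, temp_c: float,
                               prev_idle: bool, idle: bool,
                               temp_threshold_c: float = 2.0) -> bool:
    """Trigger when temperature changes by more than a threshold, or when the
    glasses transition from an idle state (e.g., charging) to an operating state."""
    temp_changed = abs(temp_c - prev_temp_c) > temp_threshold_c
    became_active = prev_idle and not idle
    return temp_changed or became_active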
[0056] The method 600 may further include obtaining 620 a first measurement from a first
IMU 611 and a second measurement from a second IMU 621. The first IMU 611 is mechanically coupled to a first camera 601 (e.g., left eye-tracking camera) and the second IMU 621 is mechanically coupled to a second camera 602 (e.g., right eye-tracking camera). For example, the first camera 601 and the first IMU 611 may be coupled to a first circuit board and the second camera 602 and the second IMU 621 may be coupled to a second circuit board. The first circuit board may be separate from the second circuit board but both circuit boards may be coupled to a body (i.e., frame) of the smart glasses. In this way the cameras and the IMUs are either directly or indirectly connected so that the physical connection is mechanically stable (i.e., does not bend or break) in response to a movement. In other words, in response to a movement of the smart glasses, (i) the first camera 601 and the first IMU 611 can be held fixed in their relative positions by the first circuit board; (ii) the second camera 602 and the second IMU 621 can be held fixed in their relative positions by the second circuit board, and; (iii) the first circuit board and the second circuit board can be held fixed in their relative positions by the frame of the smart glasses. As a result, their measured movements in response to the movement of the smart glasses may be compared.
[0057] The first measurement may be a first rotation measured by a first gyroscope of the first IMU 611 and the second measurement may be a second rotation measured by a second gyroscope of the second IMU 621. The first measurement and the second measurement may each have three degrees of freedom (3DOF). Each gyroscope may capture a rotation in three dimensions (e.g., x, y, z) aligned with a coordinate system of that gyroscope. The coordinate systems of the gyroscopes may not be aligned, but knowledge of the initial (i.e., factory-set) positions of the gyroscopes may be known and used in the comparison. Further, the coordinate system of a gyroscope may not be aligned with a coordinate system of its corresponding camera, but knowledge of the relative positions may be known or measured (e.g., using image data and IMU data).
[0058] Obtaining 620 the rotation measurements of a movement of the smart glasses may include buffering the measurements from either (or both) IMUs. Buffering the measurements can synchronize the first measurement and the second measurement in time. The synchronization can help the comparison by improving the likelihood that each measurement corresponds to the same overall movement of the smart glasses.
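As an illustrative sketch of the buffering and time synchronization described above (an editorial addition; a practical implementation would also handle clock offset and dropped samples), one stream can be resampled onto the other stream's timestamps over their overlapping interval. Timestamps are assumed to be sorted; names are hypothetical.

import numpy as np

def synchronize(t1, w1, t2, w2):
    """t1: (N,), t2: (M,) timestamps; w1: (N, 3), w2: (M, 3) gyro samples.

    Returns w1 restricted to the overlapping time window and w2 linearly
    interpolated onto those same timestamps."""
    t_lo, t_hi = max(t1[0], t2[0]), min(t1[-1], t2[-1])
    keep = (t1 >= t_lo) & (t1 <= t_hi)
    w2_on_t1 = np.column_stack([np.interp(t1[keep], t2, w2[:, k]) for k in range(3)])
    return w1[keep], w2_on_t1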
[0059] The method 600 further includes estimating 630 a relative pose difference between the first IMU 611 and the second IMU 621 (i.e., IMU-IMU extrinsics) based on the movement measurements (e.g., buffered rotation measurements). The estimating may include comparing the first measurement and the second measurement. In a possible implementation, the first measurement and the second measurement are applied to a Kalman filter. The Kalman filter is configured to output a state that corresponds to a relative pose difference (e.g., pose transformation) between the IMUs. As mentioned, the method 600 may be iterated in order for the Kalman filter to converge on a solution. Accordingly, the method may include checking 640 the convergence of the Kalman filter to determine if a stable state has been obtained. If the estimate has not converged, the method 600 may repeat obtaining measurements and estimating extrinsics. In other words, the method 600 may obtain subsequent measurements of the movement of the smart glasses from the first IMU 611 and the second IMU 621 and apply the subsequent measurements to the Kalman filter until the state has converged. When the estimate has converged, the method 600 includes updating 650 a calibration file stored in a memory (i.e., calibration 660) to include the relative pose difference between the cameras corresponding to the first IMU 611 and the second IMU 621.
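The following is a minimal sketch (an editorial illustration, not the filter design of the disclosure) of a Kalman filter whose state is a small-angle misalignment between the two gyroscope frames. It assumes the factory extrinsics have already been applied to the second IMU's measurements, so that w1 ≈ (I + [delta]x) w2 and the residual z = w1 - w2 ≈ -[w2]x · delta is linear in the state. The noise levels, class name, and convergence test are placeholder assumptions.

import numpy as np

def skew(v):
    """Cross-product (skew-symmetric) matrix such that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

class MisalignmentKalmanFilter:
    def __init__(self, q=1e-8, r=1e-4):
        self.x = np.zeros(3)        # estimated misalignment angles (radians)
        self.P = np.eye(3) * 1e-2   # state covariance
        self.Q = np.eye(3) * q      # process noise (misalignment drifts slowly)
        self.R = np.eye(3) * r      # gyro measurement noise

    def update(self, w1, w2):
        """One iteration with a pair of synchronized gyro samples (3-vectors, rad/s)."""
        self.P = self.P + self.Q                       # predict step for a quasi-static state
        H = -skew(w2)                                  # linearized measurement matrix
        z = w1 - w2                                    # measured residual rotation rate
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(3) - K @ H) @ self.P
        return self.x

    def converged(self, tol=1e-6):
        """Consider the state converged when the covariance has collapsed below tol."""
        return float(np.trace(self.P)) < tol

Note that when the frame is not rotating (w2 ≈ 0) the update contributes no new information, which is consistent with the method iterating over multiple measurements of movement until convergence.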
[0060] In a possible implementation, the calibration method may be enhanced or supplemented using image data from the first camera 601 and the second camera 602. For example, when images are available for analysis and the resources of the camera are available, camera data (e.g., VST camera data) may be analyzed with IMU data to help determine or refine the estimate. For example, image analysis may help refine a measurement of a relative movement measured by an IMU. Image and IMU data may be analyzed (e.g., compared) to determine a relative position/orientation (i.e., pose) difference between the IMU and its respective camera (i.e., IMU-CAM extrinsics). Accordingly, the method 600 may include checking 670 to see if image analysis is available (e.g., images captured, battery level suitable). If so, the method 600 may use sensor fusion to analyze the image and IMU (i.e., rotation) data to update 680 extrinsics (e.g., CAM-IMU extrinsics, CAM-CAM extrinsics), which can be recorded with the calibration 660.
[0061] Values for an initial relative transformation between pairs of IMUs may be stored in the calibration at a time of manufacture. The method may update these values continually or regularly to account for changes due to age (i.e., wear, use), temperature, and the like. Because the disclosed method may be performed while a user operates the smart glasses, it may be referred to as being performed “online”. For example, the methods may be considered an online calibration method for determining transformations between the frames of two cameras.
[0062] FIG. 7 is a flowchart of a method 700 for deriving camera extrinsics according to a possible implementation of the present disclosure. When multiple pairs of cameras, sensors, and/or projectors exist, the transform between a first pair may be based on the transforms between other pairs. For example, a system including three IMUs (1, 2, 3) may have three possible frame transformations (1-2, 2-3, and 1-3). A transform between a first frame pair (1-3) of the three frame pairs (1-2, 2-3, 1-3) may be a function of (e.g., the composition of) the transform between a second frame pair (1-2) and a third frame pair (2-3). Accordingly, the method 700 of computing transforms between pairs of cameras, sensors, and/or projectors may be extended mathematically to cover more (e.g., all) possible transformations.
[0063] The method 700 includes estimating 710 camera extrinsics based on the first IMU 701 and the second IMU 702 (i.e., CAM1-CAM2 extrinsics). The method 700 further includes estimating 720 camera extrinsics based on the second IMU 702 and the third IMU 703 (i.e., CAM2-CAM3 extrinsics). The method 700 further includes deriving 730 a relative position/orientation/offset difference between the first IMU 701 (i.e., first camera) and the third IMU 703 (i.e., third camera) based on the CAM1-CAM2 extrinsics and the CAM2-CAM3 extrinsics. Deriving 730 (i.e., rather than estimating from IMU data) may provide more efficient estimates when the number of cameras in the smart glasses is large.
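As an editorial illustration of the derivation step, chaining extrinsics reduces to composing homogeneous transforms. The convention assumed here (not stated in the disclosure) is that T_a_b maps points expressed in frame b into frame a, so the 1-3 extrinsics are the product of the 1-2 and 2-3 extrinsics.

import numpy as np

def compose(T_1_2: np.ndarray, T_2_3: np.ndarray) -> np.ndarray:
    """T_1_2, T_2_3: 4x4 homogeneous transforms; returns the derived T_1_3."""
    return T_1_2 @ T_2_3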
[0064] FIG. 8 is a flowchart of a method for reducing a misalignment in a binocular display using camera extrinsics according to a possible implementation of the present disclosure. The binocular display includes a left display for a left eye of the user and a right display for a right eye of the user. The method 800 includes obtaining 810 (i.e., capturing) left eye images from a left eye-tracking camera and right eye images from a right eye-tracking camera. The method 800 further includes determining 820 relative positions between the left eye and the left display based on the left eye images and determining 820 relative positions between the right eye and the right display based on the right eye images. This determination may be based on a relative pose difference (i.e., camera extrinsics) between the left eye camera and the right eye camera. Accordingly, the method may include retrieving left/right eye-tracking camera extrinsics from the stored calibration 660. The method 800 further includes adjusting 830 the left images displayed on the left display and adjusting 830 the right images displayed on the right display based on the relative positions of the displays with respect to the eyes. The adjustment can reduce a vertical misalignment between the images presented to the eyes, which can reduce eye strain for a user viewing the binocular display.
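A small editorial sketch of the adjustment step follows: a vertical angular misalignment (e.g., expressed in arcmin, as elsewhere in this disclosure) can be converted into a pixel offset for one of the displayed images, given the display's effective focal length in pixels. The function name and parameters are assumptions of this example.

import math

def vertical_shift_pixels(misalignment_arcmin: float, focal_length_px: float) -> float:
    """Pixel offset that compensates a small vertical angular misalignment."""
    angle_rad = misalignment_arcmin * math.pi / (180.0 * 60.0)
    return focal_length_px * math.tan(angle_rad)   # small-angle: ~ focal_length_px * angle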
[0065] In a possible implementation, the online calibration can include capturing first sensor data from a first sensor of a head-worn device and capturing second sensor data from a second sensor of the head-worn device. The online calibration can further include determining an instantaneous local acceleration and rotation of the first sensor, while ignoring a first sensor noise and/or a first sensor bias of the first sensor. The online calibration further includes determining an instantaneous local acceleration and rotation of the second sensor, while ignoring a second sensor noise and/or a second sensor bias of the second sensor. The online calibration further includes determining a misalignment between the first sensor data and the second sensor data based on a differential analysis of the acceleration and rotation of the first sensor and the acceleration and rotation of the second sensor. The differential analysis can compensate for the sensor noise and sensor bias. The online calibration further includes adjusting the first sensor data and/or the second sensor data based on a calibration to correct for the misalignment.
[0066] The first sensor can be a first IMU configured to capture the instantaneous acceleration and rotation local to the first IMU (i.e., first IMU data), and the second sensor can be a second IMU configured to capture the instantaneous acceleration and rotation local to the second IMU (i.e., second IMU data). Accordingly, determining the misalignment between the first sensor data and the second sensor data is based on the first IMU data and the second IMU data. This determination can include determining at least one difference between nominal extrinsics (e.g., corresponding to no misalignment) and actual extrinsics (e.g., corresponding to the determined misalignment).
[0067] In what follows, some example implementations of the disclosure are described.
[0068] Example 1. A method, comprising: coupling a first camera at a first fixed position with respect to a first inertial measurement unit; coupling a second camera at a second fixed position with respect to a second inertial measurement unit; coupling the first inertial measurement unit and the second inertial measurement unit to a frame of smart glasses; obtaining a first measurement of a movement of the smart glasses from the first inertial measurement unit; obtaining a second measurement of the movement of the smart glasses from the second inertial measurement unit; comparing the first measurement and the second measurement to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration based on the relative pose difference.
[0069] Example 2. The method according to Example 1, wherein the first camera and the first inertial measurement unit are both on a first circuit board and the second camera and the second inertial measurement unit are both on a second circuit board.
[0070] Example 3. The method according to Example 1, wherein the first camera is a left eye-tracking camera and the second camera is a right eye-tracking camera.
[0071] Example 4. The method according to Example 3, wherein the left eye-tracking camera is disposed in a left lens portion of the frame of the smart glasses and the right eye-tracking camera is disposed in a right lens portion of the frame of the smart glasses.
[0072] Example 5. The method according to Example 1, wherein the first measurement is captured by a first gyroscope of the first inertial measurement unit and the second measurement is captured by a second gyroscope of the second inertial measurement unit.
[0073] Example 6. The method according to Example 5, wherein the first measurement is a first rotation and the second measurement is a second rotation, the first rotation and the second rotation each having three degrees of freedom.
[0074] Example 7. The method according to Example 1, further comprising buffering data from the first inertial measurement unit and the second inertial measurement unit to synchronize the first measurement and the second measurement in time.
[0075] Example 8. The method according to Example 1, wherein comparing the first measurement and the second measurement to estimate the relative pose difference between the first inertial measurement unit and the second inertial measurement unit includes: applying the first measurement and the second measurement to a Kalman filter, the Kalman filter is configured to output a state corresponding to the relative pose difference.
[0076] Example 9. The method according to Example 8, further comprising: obtaining subsequent measurements of the movement of the smart glasses from the first inertial measurement unit and the second inertial measurement unit; and applying the subsequent measurements to the Kalman filter until the state is converged.
[0077] Example 10. The method according to Example 1, wherein the method is an online process to update the calibration, the method further including: triggering the online process to update the calibration based on a change in temperature of the smart glasses.
[0078] Example 11. The method according to Example 1, wherein the method is an online process to update the calibration, the method further including: triggering the online process to update the calibration based on a change in status of the smart glasses.
[0079] Example 12. Smart glasses comprising: a first camera mechanically coupled to a first inertial measurement unit; a second camera mechanically coupled to a second inertial measurement unit; a frame mechanically coupling the first inertial measurement unit and the second inertial measurement unit; and a processor configured by software instructions to perform a calibration including: obtaining a first measurement of a movement of the smart glasses from the first inertial measurement unit; obtaining a second measurement of the movement of the smart glasses from the second inertial measurement unit; comparing the first measurement and the second measurement to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration file based on the relative pose difference.
[0080] Example 13. The smart glasses according to Example 12, wherein the first camera and the first inertial measurement unit are both on a first circuit board and the second camera and the second inertial measurement unit are both on a second circuit board.
[0081] Example 14. The smart glasses according to Example 12, wherein the first camera is a left eye-tracking camera and the second camera is a right eye-tracking camera.
[0082] Example 15. The smart glasses according to Example 14, wherein the left eye-tracking camera is disposed in a left lens portion of the frame of the smart glasses and the right eye-tracking camera is disposed in a right lens portion of the frame of the smart glasses.
[0083] Example 16. The smart glasses according to Example 12, wherein the first measurement is captured by a first gyroscope of the first inertial measurement unit and the second measurement is captured by a second gyroscope of the second inertial measurement unit.
[0084] Example 17. The smart glasses according to Example 16, wherein the first measurement is a first rotation and the second measurement is a second rotation, the first rotation and the second rotation each having three degrees of freedom.
[0085] Example 18. The smart glasses according to Example 12, wherein the calibration further includes: buffering data from the first inertial measurement unit and the second inertial measurement unit to synchronize the first measurement and the second measurement in time.
[0086] Example 19. The smart glasses according to Example 12, wherein comparing the first measurement and the second measurement to estimate the relative pose difference between the first inertial measurement unit and the second inertial measurement unit includes: applying the first measurement and the second measurement to a Kalman filter, the Kalman filter configured to output a state corresponding to the relative pose difference.
[0087] Example 20. The smart glasses according to Example 12, wherein the calibration is performed online while a user is otherwise operating the smart glasses and wherein the calibration is triggered based on a change in a temperature of the smart glasses or a status of the smart glasses.
[0088] Example 21. Smart glasses comprising: a first camera and a first inertial measurement unit mechanically coupled to a first circuit board; a second camera and a second inertial measurement unit mechanically coupled to a second circuit board; a frame including the first circuit board and the second circuit board; and a processor in the frame that is electrically coupled to the first circuit board and the second circuit board, the processor configured by software instructions to perform an online calibration including: obtaining a first rotation measurement of the smart glasses from a first gyroscope of the first inertial measurement unit; obtaining a second rotation measurement of the smart glasses from a second gyroscope of the second inertial measurement unit; applying the first rotation measurement and the second rotation measurement to a Kalman filter to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration file to include camera extrinsics based on the relative pose difference.
[0089] Example 22. The smart glasses according to Example 21, wherein the online calibration is triggered based on a change in a temperature of the smart glasses or a status of the smart glasses.
[0090] Example 23. The smart glasses according to Example 21, wherein: the first camera and the first inertial measurement unit are held fixed in their relative positions by the first circuit board; the second camera and the second inertial measurement unit are held fixed in their relative positions by the second circuit board; and the first circuit board and the second circuit board are held fixed in their relative positions by the frame of the smart glasses.
[0091] A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the specification.
[0092] In addition, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other steps may be provided, or steps may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems. Accordingly, other implementations are within the scope of the following claims.
[0093] While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the scope of the implementations. It should be understood that they have been presented by way of example only, not limitation, and various changes in form and details may be made. Any portion of the apparatus and/or methods described herein may be combined in any combination, except mutually exclusive combinations. The implementations described herein can include various combinations and/or subcombinations of the functions, components and/or features of the different implementations described.
[0094] Methods discussed above, some of which are illustrated by the flow charts, may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine or computer readable medium such as a storage medium. A processor(s) may perform the necessary tasks.
[0095] Specific structural and functional details disclosed herein are merely representative for purposes of describing example implementations. Example implementations may, however, be embodied in many alternate forms and should not be construed as limited to only the implementations set forth herein.
[0096] It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example implementations. As used herein, the term and/or includes any and all combinations of one or more of the associated listed items.
[0097] It will be understood that when an element is referred to as being connected or coupled to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being directly connected or directly coupled to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., between versus directly between, adjacent versus directly adjacent, etc.).
[0098] The terminology used herein is for the purpose of describing particular implementations only and is not intended to be limiting of example implementations. As used herein, the singular forms a, an and the are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms comprises, comprising, includes and/or including, when used herein, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
[0099] It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may in fact be executed concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
[00100] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example implementations belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[00101] Lastly, it should also be noted that whilst the accompanying claims set out particular combinations of features described herein, the scope of the present disclosure is not limited to the particular combinations hereafter claimed, but instead extends to encompass any combination of features or implementations herein disclosed irrespective of whether or not that particular combination has been specifically enumerated in the accompanying claims at this time.

Claims

CLAIMS A method, compri sing : coupling a first camera at a first fixed position with respect to a first inertial measurement unit; coupling a second camera at a second fixed position with respect to a second inertial measurement unit; coupling the first inertial measurement unit and the second inertial measurement unit to a frame of smart glasses; obtaining first measurement of a movement of the smart glasses from the first inertial measurement unit; obtaining a second measurement of the movement of the smart glasses from the second inertial measurement unit; comparing the first measurement and the second measurement to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration based on the relative pose difference. The method according to claim 1, wherein the first camera and the first inertial measurement unit are both on a first circuit board and the second camera and the second inertial measurement unit are both on a second circuit board. The method according to claims 1 or 2, wherein the first camera is a left eye-tracking camera and the second camera is a right eye-tracking camera. The method according to claim 3, wherein the left eye-tracking camera is disposed in a left lens portion of the frame of the smart glasses and the right eye-tracking camera is disposed in a right lens portion of the frame of the smart glasses. The method according to any one of the preceding claims, wherein the first measurement is captured by a first gyroscope of the first inertial measurement unit and the second measurement is captured by a second gyroscope of the second inertial measurement unit. The method according to claim 5, wherein the first measurement is a first rotation and the second measurement is a second rotation, the first rotation and the second rotation each having three degrees of freedom. The method according to any one of the preceding claims, further comprising buffering data from the first inertial measurement unit and the second inertial measurement unit to synchronize the first measurement and the second measurement in time. The method according to any one of the preceding claims, wherein comparing the first measurement and the second measurement to estimate the relative pose difference between the first inertial measurement unit and the second inertial measurement unit includes: applying the first measurement and the second measurement to a Kalman filter, the Kalman filter is configured to output a state corresponding to the relative pose difference. The method according to claim 8, further comprising: obtaining subsequent measurements of the movement of the smart glasses from the first inertial measurement unit the second inertial measurement unit; and applying the subsequent measurements to the Kalman filter until the state is converged. The method according to any one of the preceding claims, wherein the method is an online process to update the calibration, the method further including: triggering the online process to update the calibration based on a change in temperature of the smart glasses. The method according to any one of the preceding claims, wherein the method is an online process to update the calibration, the method further including: triggering the online process to update the calibration based on a change in status of the smart glasses. 
Smart glasses comprising: a first camera mechanically coupled to a first inertial measurement unit; a second camera mechanically coupled to a second inertial measurement unit; a frame mechanically coupling the first inertial measurement unit and the second inertial measurement unit; and a processor configured by software instructions to perform a calibration including: obtaining a first measurement of a movement of the smart glasses from the first inertial measurement unit; obtaining a second measurement of the movement of the smart glasses from the second inertial measurement unit; comparing the first measurement and the second measurement to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and updating a calibration file based on the relative pose difference. The smart glasses according to claim 12, wherein the first camera and the first inertial measurement unit are both on a first circuit board and the second camera and the second inertial measurement unit are both on a second circuit board. The smart glasses according to claims 12 or 13, wherein the first camera is a left eyetracking camera and the second camera is a right eye-tracking camera. The smart glasses according to claim 14, wherein the left eye-tracking camera is disposed in a left lens portion of the frame of the smart glasses and the right eye-tracking camera is disposed in a right lens portion of the frame of the smart glasses. The smart glasses according to any one of claims 12 to 15, wherein the first measurement is captured by a first gyroscope of the first inertial measurement unit and the second measurement is captured by a second gyroscope of the second inertial measurement unit. The smart glasses according to claim 16, wherein the first measurement is a first rotation and the second measurement is a second rotation, the first rotation and the second rotation each having three degrees of freedom. The smart glasses according to any one of claims 12 to 17, wherein the calibration further includes: buffering data from the first inertial measurement unit and the second inertial measurement unit to synchronize the first measurement and the second measurement in time. The smart glasses according to any one of claims 12 to 18, wherein comparing the first measurement and the second measurement to estimate the relative pose difference between the first inertial measurement unit and the second inertial measurement unit includes: applying the first measurement and the second measurement to a Kalman filter, the Kalman filter configured to output a state corresponding to the relative pose difference. The smart glasses according to any one of claims 12 to 19, wherein the calibration is performed online while a user is otherwise operating the smart glasses and wherein the calibration is triggered based on a change in a temperature of the smart glasses or a status of the smart glasses.
21. Smart glasses comprising:
a first camera and a first inertial measurement unit mechanically coupled to a first circuit board;
a second camera and a second inertial measurement unit mechanically coupled to a second circuit board;
a frame including the first circuit board and the second circuit board; and
a processor in the frame that is electrically coupled to the first circuit board and the second circuit board, the processor configured by software instructions to perform an online calibration including:
obtaining a first rotation measurement of the smart glasses from a first gyroscope of the first inertial measurement unit;
obtaining a second rotation measurement of the smart glasses from a second gyroscope of the second inertial measurement unit;
applying the first rotation measurement and the second rotation measurement to a Kalman filter to estimate a relative pose difference between the first inertial measurement unit and the second inertial measurement unit; and
updating a calibration file to include camera extrinsics based on the relative pose difference.

22. The smart glasses according to claim 21, wherein the online calibration is triggered based on a change in a temperature of the smart glasses or a status of the smart glasses.

23. The smart glasses according to claims 21 or 22, wherein: the first camera and the first inertial measurement unit are held fixed in their relative positions by the first circuit board; the second camera and the second inertial measurement unit are held fixed in their relative positions by the second circuit board; and the first circuit board and the second circuit board are held fixed in their relative positions by the frame of the smart glasses.
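A minimal sketch of the extrinsics update in claim 21: because each camera is rigid to its IMU on a shared circuit board (claim 23), refreshed camera-to-camera extrinsics can be obtained by composing the fixed per-board camera-to-IMU transforms with the online estimate of the IMU-to-IMU pose. Since gyroscopes observe only rotation, this sketch assumes the translational part of the IMU-to-IMU pose is kept from factory calibration; the transform convention and function names are assumptions for illustration.

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t.
    Convention: T_a_b maps points expressed in frame b into frame a."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def update_camera_extrinsics(T_cam1_imu1, T_imu2_cam2, R_imu1_imu2, t_imu1_imu2):
    """Compose the fixed per-board transforms (factory-calibrated) with the
    online IMU 1 -> IMU 2 estimate to refresh camera 1 -> camera 2 extrinsics:
    T_cam1_cam2 = T_cam1_imu1 @ T_imu1_imu2 @ T_imu2_cam2."""
    T_imu1_imu2 = make_T(R_imu1_imu2, t_imu1_imu2)
    return T_cam1_imu1 @ T_imu1_imu2 @ T_imu2_cam2
```

The result would overwrite the corresponding entry in the calibration file so that downstream consumers (e.g., eye-tracking fusion) read consistent extrinsics after thermal or mechanical deformation of the frame.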
PCT/US2023/073552 2022-09-06 2023-09-06 Online calibration of sensor alignment in a head-worn device WO2024054846A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263374654P 2022-09-06 2022-09-06
US63/374,654 2022-09-06

Publications (1)

Publication Number Publication Date
WO2024054846A1 (en) 2024-03-14

Family

ID=88207564

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/073552 WO2024054846A1 (en) 2022-09-06 2023-09-06 Online calibration of sensor alignment in a head-worn device

Country Status (1)

Country Link
WO (1) WO2024054846A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220051441A1 (en) * 2018-12-21 2022-02-17 Magic Leap, Inc. Multi-camera cross reality device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LLOYD HAYDN HUGHES: "Enhancing Mobile Camera Pose Estimation Through the Inclusion of Sensors", 1 December 2014 (2014-12-01), XP055391425, Retrieved from the Internet <URL:http://appliedmaths.sun.ac.za/~wbrink/students/LHughes2014.pdf> *