CN209928142U - Head-mounted device - Google Patents

Head-mounted device

Info

Publication number
CN209928142U
CN209928142U (Application CN201920757997.3U)
Authority
CN
China
Prior art keywords
module
arm
frame
head-mounted device
Prior art date
Legal status
Active
Application number
CN201920757997.3U
Other languages
Chinese (zh)
Inventor
P·X·王
D·C·马修
Current Assignee
Apple Inc
Original Assignee
Apple Inc
Priority date
Filing date
Publication date
Priority claimed from US 16/378,357 (published as US 2020/0201042 A1)
Application filed by Apple Inc
Application granted
Publication of CN209928142U

Classifications

    • G PHYSICS › G02 OPTICS › G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS › G02B27/00 › G02B27/01 Head-up displays › G02B27/017 Head mounted › G02B27/0176 Head mounted characterised by mechanical features
    • G PHYSICS › G02 OPTICS › G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS › G02B27/00 › G02B27/01 Head-up displays › G02B27/017 Head mounted › G02B2027/0178 Eyeglass type

Abstract

The utility model discloses a head-mounted device. The head-mounted device includes: a frame module, the frame module comprising: a frame; a display element supported by the frame; a first attachment element; and a first communication interface; an arm module, the arm module comprising: an electronic component; a second attachment element; and a second communication interface, wherein the display element is operatively connected to the electronic component via the first communication interface and the second communication interface when the arm module is attached to the frame module; and a release mechanism for releasing the arm module from the frame module.

Description

Head-mounted device
Technical Field
The present description relates generally to head mounted devices and, more particularly, to modular configuration of head mounted devices.
Background
A user may wear a head-mounted device to display visual information within the user's field of view. The head-mounted device may be used as a Virtual Reality (VR) system, an Augmented Reality (AR) system, and/or a Mixed Reality (MR) system. The user may observe output provided by the head-mounted device, such as visual information provided on a display. The display may optionally allow the user to view the environment external to the head-mounted device. Other outputs provided by the head-mounted device may include audio outputs and/or haptic feedback. The user may further interact with the head-mounted device by providing input for processing by one or more components of the head-mounted device. For example, when the device is mounted to the user's head, the user may provide tactile input, voice commands, and other input.
SUMMARY OF THE UTILITY MODEL
According to an embodiment of the present disclosure, a head-mounted device is provided. The head-mounted device includes: a frame module, an arm module, and a release mechanism. The frame module includes: a frame; a display element supported by the frame; a first attachment element; and a first communication interface. The arm module includes: an electronic component; a second attachment element; and a second communication interface, wherein the display element is operatively connected to the electronic component via the first communication interface and the second communication interface when the arm module is attached to the frame module. The release mechanism is for releasing the arm module from the frame module.
According to another embodiment of the present disclosure, there is provided a head-mounted device, characterized in that the head-mounted device includes: a frame module and an arm module. The frame module includes: a frame; and a display element supported by the frame. The arm module is removably attached to the frame module and includes electronic components operatively connected to the display element when the arm module is attached to the frame module, the arm module being interchangeable with other arm modules removably attached to the frame module.
According to still another embodiment of the present disclosure, there is also provided a head-mounted device, including: a frame module and first and second arm modules. The frame module includes: a frame; a controller within the frame; and an attachment element. The first arm module includes a first electronic component for performing a first function, wherein the first electronic component is operatively connected to the controller when the first arm module is attached to the attachment element. The second arm module includes a second electronic component for performing a second function different from the first function, wherein the second electronic component is operatively connected to the controller when the second arm module is attached to the attachment element.
Drawings
Some of the features of the subject technology are set forth in the appended claims. However, for purposes of explanation, several embodiments of the subject technology are set forth in the following figures.
Fig. 1 illustrates a perspective view of a head-mounted device in an assembled configuration according to some embodiments of the present disclosure.
Fig. 2 illustrates a perspective view of a head-mounted device in an unassembled configuration according to some embodiments of the present disclosure.
Fig. 3 illustrates a perspective view of a portion of a head-mounted device in an unassembled configuration according to some embodiments of the present disclosure.
Fig. 4 illustrates a perspective view of a portion of an arm module of a head-mounted device according to some embodiments of the present disclosure.
Fig. 5 illustrates a perspective view of portions of a head-mounted device in an unassembled configuration according to some embodiments of the present disclosure.
Fig. 6 illustrates a perspective view of portions of a head-mounted device in an assembled configuration according to some embodiments of the present disclosure.
Fig. 7 illustrates a block diagram of a head-mounted device having a frame module and two arm modules, according to some embodiments of the present disclosure.
Fig. 8 illustrates a front view and a side view of a first arm module of a head-mounted device according to some embodiments of the present disclosure.
Fig. 9 illustrates a front view and a side view of a second arm module of a head-mounted device according to some embodiments of the present disclosure.
Fig. 10 illustrates front and side views of a third arm module of a head-mounted device according to some embodiments of the present disclosure.
Fig. 11 illustrates a perspective view of an arm module and an additional module according to some embodiments of the present disclosure.
Fig. 12 illustrates a block diagram of an arm module and additional modules according to some embodiments of the present disclosure.
Fig. 13 illustrates a perspective view of a head mounted device with a replaceable display element, according to some embodiments of the present disclosure.
Fig. 14 illustrates a perspective view of a frame module of a head mounted device with a replaceable display element according to some embodiments of the present disclosure.
Fig. 15 illustrates a perspective view of a frame module and an additional module according to some embodiments of the present disclosure.
Fig. 16 illustrates a block diagram of a frame module and additional modules, according to some embodiments of the present disclosure.
Fig. 17 illustrates a back view of a frame module having a first support member according to some embodiments of the present disclosure.
Fig. 18 illustrates a back view of a frame module having a second support member according to some embodiments of the present disclosure.
Detailed Description
Cross Reference to Related Applications
This application claims the benefit of U.S. Provisional Application No. 62/782,260, entitled "MODULAR SYSTEM FOR HEAD-MOUNTED DEVICE," filed on December 19, 2018, the entire contents of which are incorporated herein by reference.
The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the subject technology may be practiced. The accompanying drawings are incorporated herein and constitute a part of the detailed description. The detailed description includes specific details for the purpose of providing a thorough understanding of the subject technology. It will be apparent, however, to one skilled in the art that the subject technology is not limited to the specific details shown herein and may be practiced without these specific details. In some instances, well-known structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology.
A head-mounted device, such as a head-mounted display, a headset, goggles, smart glasses, a heads-up display, etc., may perform a range of functions determined by the components (e.g., sensors, circuitry, and other hardware) included with the manufactured wearable device. However, space, cost, and other considerations may limit the ability to provide each component that can provide the desired functionality. For example, different users may have different preferences for the components and functionality provided by a given head-mounted device. Some users may desire certain capabilities, such as high resolution display and long battery life, while other users may desire other capabilities, such as a smaller form factor. Furthermore, a given user may desire different functionality at different times. For example, a given user may desire a high resolution display at home and a long battery life when out.
In view of the variety of components and functions required, it would be beneficial to allow a user to change the components and functions of a head mounted device to customize the user experience according to the user's desires. The head mounted device of the present disclosure facilitates customization, adaptation, and modification by a user according to the user's desires.
The system of the present disclosure may provide a head-mounted device with replaceable modules that provide a variety of different components and functions to achieve a user-desired result. The modular configuration allows a user to easily customize a head-mounted device with one or more arm modules to provide features that are integrated with other operations of the frame module of the head-mounted device. The arm modules can be easily interchanged with one another to provide different components and functions at different times. Thus, the frame module of the present disclosure need not include permanent components that provide every function a user may later desire. Rather, the head-mounted device may have expanded and customizable capabilities through the use of one or more arm modules.
These and other embodiments are discussed below with reference to fig. 1-18. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes only and should not be construed as limiting.
According to some embodiments, such as shown in fig. 1, the head mounted device 10 includes a frame module 100 that is worn on the head using one or more arm modules 200. The frame module 100 may be positioned in front of the user's eyes to provide information within the user's field of view. The frame module 100 may include a frame 102 that provides a nose pad 106 or another feature to rest on the nose of the user. The frame 102 also includes one or more display elements 104 and a bridge 108 over the nose pad 106 and connecting the plurality of display elements 104.
The frame 102 may be used to surround a peripheral region of the frame module 100 and support the internal components of the frame module 100 in their assembled position. For example, the frame 102 may enclose and support various internal components (including, for example, integrated circuit chips, processors, memory devices, and other circuitry) to provide computing and functional operations for the head-mounted device 10, as discussed further herein.
The display element 104 may transmit light from the physical environment for viewing by a user. Such display elements 104 may include optical properties such as lenses for vision correction based on incident light from the physical environment. Additionally or alternatively, the display element 104 may provide information as a display within the user's field of view. Such information may be provided in lieu of or in addition to (e.g., overlaid on) a view of the physical environment.
A physical environment refers to a physical world in which people can sense and/or interact without the aid of an electronic system. Physical environments such as physical parks include physical objects such as physical trees, physical buildings, and physical people. People can directly sense and/or interact with the physical environment, such as through vision, touch, hearing, taste, and smell.
In contrast, a computer-generated reality (CGR) environment refers to a fully or partially simulated environment in which people sense and/or interact via electronic systems. In CGR, a subset of the person's physical movements, or a representation thereof, is tracked and, in response, one or more characteristics of one or more virtual objects simulated in the CGR environment are adjusted in a manner that complies with at least one law of physics. For example, the CGR system may detect head rotations of a person and, in response, adjust the graphical content and sound field presented to the person in a manner similar to how such views and sounds change in the physical environment. In some cases (e.g., for accessibility reasons), adjustments to the characteristics of virtual objects in the CGR environment may be made in response to representations of physical motion (e.g., voice commands).
A person may utilize any of their senses to sense and/or interact with CGR objects, including vision, hearing, touch, taste, and smell. For example, a person may sense and/or interact with audio objects that create a 3D or spatial audio environment that provides a perception of a point audio source in 3D space. As another example, an audio object may enable audio transparency that selectively introduces ambient sound from a physical environment with or without computer-generated audio. In some CGR environments, a person may sense and/or interact only with audio objects.
Examples of CGR include virtual reality and mixed reality.
A Virtual Reality (VR) environment refers to a simulated environment designed to be based entirely on computer-generated sensory input for one or more senses. The VR environment includes a plurality of virtual objects that a person can sense and/or interact with. For example, computer-generated images of trees, buildings, and avatars representing people are examples of virtual objects. A person may sense and/or interact with a virtual object in the VR environment through simulation of the presence of the person within the computer-generated environment, and/or through simulation of a subset of the physical movements of the person within the computer-generated environment.
In contrast to VR environments designed to be based entirely on computer-generated sensory inputs, a Mixed Reality (MR) environment refers to a simulated environment designed to introduce sensory inputs from a physical environment, or representations thereof, in addition to computer-generated sensory inputs (e.g., virtual objects). On the virtuality continuum, a mixed reality environment is anywhere between, but does not include, the wholly physical environment at one end and the virtual reality environment at the other end.
In some MR environments, computer-generated sensory inputs may be responsive to changes in sensory inputs from the physical environment. Additionally, some electronic systems for presenting MR environments may track position and/or orientation relative to the physical environment to enable virtual objects to interact with real objects (i.e., physical objects or representations thereof from the physical environment). For example, the system may account for motion so that a virtual tree appears stationary relative to the physical ground.
Examples of mixed reality include augmented reality and augmented virtuality.
An Augmented Reality (AR) environment refers to a simulated environment in which one or more virtual objects are superimposed over a physical environment or representation thereof. For example, an electronic system for presenting an AR environment may have a transparent or translucent display through which a person may directly view the physical environment. The system may be configured to present virtual objects on a transparent or translucent display such that a person perceives the virtual objects superimposed over a physical environment with the system. Alternatively, the system may have an opaque display and one or more imaging sensors that capture images or videos of the physical environment, which are representations of the physical environment. The system combines the image or video with a virtual object and presents the combination on an opaque display. A person utilizes the system to indirectly view a physical environment via an image or video of the physical environment and perceive a virtual object superimposed over the physical environment. As used herein, video of the physical environment displayed on an opaque display is referred to as "passthrough video," meaning that the system captures images of the physical environment using one or more image sensors and uses those images in rendering the AR environment on the opaque display. Further alternatively, the system may have a projection system that projects the virtual object into the physical environment, for example as a hologram or on a physical surface, so that a person perceives the virtual object superimposed on the physical environment with the system.
An augmented reality environment also refers to a simulated environment in which a representation of a physical environment is transformed by computer-generated sensory information. For example, in providing pass-through video, the system may transform one or more sensor images to impose a perspective (e.g., viewpoint) different from the perspective captured by the imaging sensors. As another example, a representation of a physical environment may be transformed by graphically modifying (e.g., magnifying) a portion thereof, such that the modified portion may be a representative but not photorealistic version of the originally captured images. As another example, a representation of a physical environment may be transformed by graphically eliminating or obscuring portions thereof.
An augmented virtuality (AV) environment refers to a simulated environment in which a virtual or computer-generated environment incorporates one or more sensory inputs from the physical environment. The sensory inputs may be representations of one or more characteristics of the physical environment. For example, an AV park may have virtual trees and virtual buildings, but a person's face is realistically reproduced from images taken of a physical person. As another example, a virtual object may take the shape or color of a physical object imaged by one or more imaging sensors. As a further example, a virtual object may adopt shadows consistent with the position of the sun in the physical environment.
There are many different types of electronic systems that enable a person to sense and/or interact with various CGR environments. Embodiments include head-mounted systems, projection-based systems, head-up displays (HUDs), vehicle windshields integrated with display capabilities, windows integrated with display capabilities, displays formed as lenses designed for placement on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smart phones, tablets, and desktop/laptop computers. The head-mounted system may have one or more speakers and an integrated opaque display. Alternatively, the head-mounted system may be configured to accept an external opaque display (e.g., a smartphone). The head-mounted system may incorporate one or more imaging sensors for capturing images or video of the physical environment, and/or one or more microphones for capturing audio of the physical environment. The head mounted system may have a transparent or translucent display instead of an opaque display. A transparent or translucent display may have a medium through which light representing an image is directed to a person's eye. The display may utilize digital light projection, OLED, LED, uLED, liquid crystal on silicon, laser scanning light sources, or any combination of these technologies. The medium may be an optical waveguide, a holographic medium, an optical combiner, an optical reflector, or any combination thereof. In one embodiment, a transparent or translucent display may be configured to selectively become opaque. Projection-based systems may employ retinal projection techniques that project a graphical image onto a person's retina. The projection system may also be configured to project the virtual object into the physical environment, for example as a hologram or on a physical surface.
Referring again to fig. 1, the frame module 100 may be supported on the head of the user using the arm modules 200. The arm modules 200 may wrap or extend along opposite sides of the user's head, like eyeglass temples. The arm module 200 may also include an earpiece 298 for wrapping around or otherwise engaging the user's ear. It should be understood that other configurations may be employed to secure the head-mounted device 10 to the head of the user. For example, one or more straps, bands, caps, hats, or other components may be used in addition to or in place of the illustrated components of the head-mounted device 10. By further example, an arm module may extend around the head of the user to both sides of the frame module 100.
Referring to fig. 2, the system of the present disclosure provides a head-mounted device with a replaceable module that provides a variety of different components and functions to achieve a user-desired result. For example, the head mounted device 10 may be provided with an arm module 200 detachable from the frame module 100. The frame attachment element 110 of the frame module 100 may interact with the arm attachment element 210 of the arm module 200 to provide a secure and reversible coupling. The modular configuration allows a user to easily customize a head-mounted device with one or more arm modules to provide features that are integrated with other operations of the frame module of the head-mounted device. The arm modules 200 can be easily interchanged with one another to provide different components and functions at different times.
As used herein, "modular" or "module" may refer to a feature that allows a user to connect, install, remove, exchange, and/or replace an item (such as an arm module) in conjunction with another item (such as a frame module of a head-mounted device). The connection of an arm module to a frame module can be made and reversed, after which another arm module can be connected to the same frame module, or the same arm module can be connected to another frame module. In this way, multiple arm modules may be interchangeable with one another with respect to a given frame module. Additionally, multiple frame modules may be interchangeable with one another with respect to a given arm module.
The arm module may be connected to the frame module in a manner that allows the arm module to be removed thereafter. The connection may be fully reversible such that when the arm module and the frame module are disconnected, each of the arm module and the frame module returns to a state that was maintained before the connection. The connection may be completely repeatable, so that after the arm and frame modules are disconnected, the same or different frame and arm module pairs may be connected in the same manner. The arm module and the frame module may be securely and temporarily connected, rather than permanently, fixedly or resiliently connected (e.g., by chemical and/or molecular bonding). For example, the connection and disconnection of the arm module and the frame module is facilitated in a manner that does not cause permanent damage, breakage, or deformation to the arm module or the frame module.
The user can easily connect and disconnect the arm module to and from the frame module. The connection and/or disconnection can be repeated and reversed by hand without the need for tools. For example, a locking mechanism and/or a release mechanism may be provided on the arm module and/or the frame module for ready access by a user. The force required to connect and/or disconnect the arm module and the frame module may be within the range a user can typically apply with a finger. For example, the force required to connect and/or disconnect the arm module and the frame module may be less than 1 N, 5 N, 10 N, 15 N, 20 N, 25 N, or 30 N. Additionally or alternatively, the connection and/or disconnection may be achieved and/or facilitated through the use of tools.
The arm module and the frame module may be connected in a manner that fixes their relative positions with respect to each other. The arm module and the frame module may also be connected in a manner that provides a communication link therebetween. The fixed relative position and the communication link can be established and maintained while the arm module and the frame module are connected, and both are released when the arm module is disconnected from the frame module.
While different arm modules may provide different features and/or functions, multiple arm modules may be interchangeable with one another by providing at least some features that are similar or identical between the multiple arm modules. For example, different arm modules may be secured to a given frame module by the same securing mechanism. By way of further example, different arm modules can establish a communication connection with a given frame module via the same communication mechanism. Thus, the frame module can accommodate replacement of different arm modules by providing the same securing mechanism and communication mechanism on different arm modules. Also, the arm module may accommodate replacement of different frame modules by providing the same securing mechanism and communication mechanism on different frame modules.
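For illustration only, the interchangeability described above can be modeled in software. The following Swift sketch uses hypothetical names (ArmModuleContract, FrameModuleModel, BasicArm, and so on) that do not appear in the disclosure; it simply shows a frame accepting, and later releasing, any arm module that exposes the same attachment and communication contract.

```swift
// Illustrative model of interchangeable arm modules. All names are hypothetical;
// the patent describes hardware, not a software API.
protocol ArmModuleContract {
    var identifier: String { get }
    var capabilities: Set<ArmCapability> { get }
    /// Called once the attachment elements engage and the communication interfaces mate.
    func didAttach()
    /// Called just before the release mechanism frees the module.
    func willDetach()
}

enum ArmCapability: Hashable {
    case displayDriver
    case battery(milliampHours: Int)
    case biometricSensing
    case audioOutput
}

struct FrameModuleModel {
    private(set) var attachedArms: [any ArmModuleContract] = []

    // Any arm module that exposes the shared contract can be installed or swapped.
    mutating func attach(_ arm: any ArmModuleContract) {
        attachedArms.append(arm)
        arm.didAttach()
    }

    mutating func detach(identifier: String) {
        guard let index = attachedArms.firstIndex(where: { $0.identifier == identifier }) else { return }
        attachedArms[index].willDetach()
        attachedArms.remove(at: index)
    }
}

// Example: a basic arm module offering a display driver and a small battery.
struct BasicArm: ArmModuleContract {
    let identifier = "arm.basic"
    let capabilities: Set<ArmCapability> = [.displayDriver, .battery(milliampHours: 250)]
    func didAttach() { print("\(identifier) attached") }
    func willDetach() { print("\(identifier) detaching") }
}

var frame = FrameModuleModel()
frame.attach(BasicArm())               // prints "arm.basic attached"
frame.detach(identifier: "arm.basic")  // prints "arm.basic detaching"
```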
The plurality of arm modules may have other features that are similar or identical between the plurality of arm modules. For example, the plurality of arm modules may include housings having the same or similar size, shape, contour, dimension, aspect ratio, surface features, texture, color, and/or indicia. The common feature allows users to swap arm modules with each other while maintaining a consistent user experience across different arm modules at different times of use.
Additionally or alternatively, at least one of the size, shape, contour, dimension, aspect ratio, surface feature, texture, color, and/or indicia may be different between the plurality of arm modules. For example, different arm modules may have different sizes and/or shapes to accommodate different head and/or facial structures. This may allow the user to select between multiple arm modules providing different ergonomic features so that the user may select one depending on the comfort provided. By further example, different arm modules may have different aesthetic features to provide different fashion and appearance options to the user.
Additionally or alternatively, at least one of the size, shape, contour, dimension, aspect ratio, surface feature, texture, color, and/or indicia may be different between the plurality of frame modules. For example, different frame modules may have different sizes and/or shapes to accommodate different head and/or facial structures. This may allow the user to select between multiple frame modules providing different ergonomic features so that the user may select one depending on the comfort provided. By further embodiments, different frame modules may provide different functional features, such as different lenses for vision correction, so that a user may select a frame module suitable for a given activity (e.g., driving, reading, etc.). By further embodiments, different frame modules may have different aesthetic features to provide different fashion and appearance options to the user.
Fig. 3 illustrates a perspective view of a frame module and an arm module, each having a mechanical attachment mechanism and a communication connection, according to some embodiments of the present disclosure. Although only one arm module is shown in fig. 3, it should be understood that the description herein may be applied to each of a plurality of arm modules.
As shown in fig. 3, the arm module 200 may be attached to the frame module 100 of the head-mounted device 10 using the frame attachment element 110 and the arm attachment element 210. For example, the frame attachment element 110 and the arm attachment element 210 may mechanically engage each other to secure the arm module 200 to the frame module 100. The frame attachment element 110 and the arm attachment element 210 may have complementary shapes to facilitate engagement. For example, frame attachment element 110 and/or arm attachment element 210 may form a protrusion, and arm attachment element 210 and/or frame attachment element 110 may form a groove. The groove may have a shape and/or size that is complementary to the shape and/or size of the frame attachment element 110. It should be understood that various shapes and/or sizes may be provided to achieve engagement between frame attachment element 110 and arm attachment element 210. It will be further understood that any number of frame attachment elements 110 and arm attachment elements 210 may be provided. While certain mechanical attachment mechanisms are depicted, it should be understood that other mechanical attachment mechanisms are also contemplated.
The frame attachment element 110 and the arm attachment element 210 may provide a rotational engagement such that the arm module 200 can rotate relative to the frame module 100. This provides a collapsed configuration in which the arm module 200 is closer to the frame module 100 for storage when not in use. Such attachment may be achieved by mechanical hinges or magnetic couplings that allow relative rotational movement.
As further shown in fig. 3 and 4, the frame module 100 may be provided with a frame communication interface 120 and the arm module 200 may be provided with an arm communication interface 220. The communication interfaces 120 and 220 may include pairs of electrically conductive contacts configured to make electrical contact when the frame attachment element 110 and the arm attachment element 210 are engaged with each other. For example, one or more of the communication interfaces 120 and 220 may include movable elements for making electrical connections, such as at least partially collapsible pogo pins and/or at least partially flexible contact pads. By further example, the pogo pins may be spring loaded and/or the contact pads may be formed of conductive foam or elastomer.
Additionally or alternatively, the communication interfaces 120 and 220 may include connectors that are manually connected to establish a communication connection separate from the mechanical attachment provided by the attachment elements 110 and 210. Such connectors may include ZIF connectors, non-ZIF connectors, slide connectors, flip actuator connectors, and/or FPC-to-board connectors. Additionally or alternatively, the frame communication interface 120 and/or the arm communication interface 220 may provide a direct (e.g., board-to-board) connection between a controller of the frame module 100 and a controller of the arm module 200.
Additionally or alternatively, the communication interfaces 120 and 220 may form waveguides for conducting light between the frame module and the arm module. For example, such a waveguide may allow light generated in the arm module to be directed to the display element of the frame module for viewing by a user.
It should be understood that various other communication connections may be provided between the frame communication interface 120 and the arm communication interface 220. Direct contact may not be required to establish a communication connection. For example, the communicative coupling between the frame communication interface 120 and the arm communication interface 220 may include a wireless interface, a Bluetooth interface, a near-field communication interface, a magnetic interface, an inductive interface, a resonant interface, a capacitive coupling interface, a Wi-Fi interface, an optical interface, an acoustic interface, and/or other communication interfaces.
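As an illustrative aside (not part of the disclosure), if both interfaces support more than one of the couplings listed above, they might settle on a shared link in a preference order. The Swift sketch below assumes a hypothetical LinkType enumeration and a simple "most preferred coupling supported by both sides" rule.

```swift
// Hypothetical link negotiation between a frame communication interface and an
// arm communication interface; the patent lists possible couplings but not a
// selection rule, so the preference order here is an assumption.
enum LinkType: Int, Comparable {
    case wiredContacts   // e.g., pogo pins or contact pads
    case optical         // e.g., a waveguide between the modules
    case nearField
    case bluetooth
    case wifi

    static func < (lhs: LinkType, rhs: LinkType) -> Bool { lhs.rawValue < rhs.rawValue }
}

/// Pick the most preferred (lowest raw value) coupling supported by both sides.
func negotiateLink(frameSupports: Set<LinkType>, armSupports: Set<LinkType>) -> LinkType? {
    frameSupports.intersection(armSupports).min()
}

let link = negotiateLink(frameSupports: [.wiredContacts, .bluetooth],
                         armSupports: [.bluetooth, .wifi])
print(link == .bluetooth)  // true: the only coupling both sides support
```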
Fig. 5 and 6 show perspective views of embodiments of mechanical and communicative connections. Although only one arm module is shown in fig. 5 and 6, it should be understood that the description herein may be applied to each of a plurality of arm modules.
As shown in fig. 5, the arm attachment element 210 of the arm module 200 may be inserted laterally or otherwise into the frame attachment element 110 of the frame module 100. As such, the arm module 200 may be configured to slide relative to the frame module 100. Additionally or alternatively, the arm attachment element 210 may be pressed, snapped, or otherwise inserted forward into the frame attachment element 110. Once inserted, the arm attachment element 210 may be locked or otherwise secured within the frame attachment element 110. The electrical connection may be made and maintained while the frame attachment element 110 and the arm attachment element 210 are mechanically secured, such as via the frame communication interface 120 and the arm communication interface 220. When the frame module 100 is connected to the arm module 200, their components may be in operative communication.
Additional or alternative mechanisms may be provided to lock the arm module 200 in place relative to the frame module 100. For example, mechanisms such as locks, latches, snaps, screws, clasps, threads, magnets, pins, interference (e.g., friction) fits, knurled presses, bayonets, and/or combinations thereof may be included to lock the arm module 200 to the frame module 100 when the frame attachment element 110 and the arm attachment element 210 are engaged with one another. The arm module 200 may remain locked from sliding relative to the frame module 100 until the release mechanism 192 is actuated. The release mechanism 192 may be provided on an outer surface of the head mounted device 10 for access by a user. For example, the release mechanism 192 may be disposed on an exterior surface of the frame module 100 and/or the arm module 200. With the locking mechanism locking the arm module 200 in place relative to the frame module 100, the release mechanism 192, when actuated, may move and act on the locking mechanism to release it. For example, the release mechanism 192, when actuated, may release one or more locks, latches, snaps, screws, clasps, threads, magnets, pins, interference (e.g., friction) fits, knurling presses, bayonets, and/or combinations thereof that previously locked the arm module 200 to the frame module 100. At least some of the interaction between the release mechanism 192 and the locking mechanism may be within the frame module 100 and/or the arm module 200.
Fig. 7 illustrates a block diagram of a frame module and an arm module, according to some embodiments of the present disclosure. It should be understood that the components described herein may be provided on either or both of the frame module and/or the arm module. In some embodiments, components are provided by arm modules rather than frame modules to reduce redundancy and increase customization based on the selection of arm modules.
As shown in fig. 7, the frame module 100 may include a controller 180 having one or more processing units that include or are configured to access a memory having instructions stored thereon. The instructions or computer programs may be configured to perform one or more of the operations or functions described with respect to the frame module 100. The controller 180 may be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions. For example, the controller 180 may include one or more of the following: a microprocessor, a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), or a combination of such devices. As described herein, the term "processor" is intended to encompass a single processor or processing unit, a plurality of processors, a plurality of processing units, or one or more other suitably configured computing elements. The memory may store electronic data usable by the frame module 100. For example, the memory may store electronic data or content, such as, for example, audio and video files, documents and applications, device settings and user preferences, timing and control signals or data for various modules, data structures or databases, and so forth. The memory may be configured as any type of memory. By way of example only, the memory may be implemented as random access memory, read only memory, flash memory, removable memory, other types of storage elements, or a combination of such devices.
The frame module 100 may also include a display element 104 for displaying visual information to a user. The display element 104 may provide visual (e.g., image or video) output. The display element 104 may be or include an opaque, transparent, and/or translucent display. The transparent or translucent display element 104 may have a medium through which light representing an image is directed to the user's eye. The display element 104 may utilize digital light projection, OLED, LED, uLED, liquid crystal on silicon, laser scanning light sources, or any combination of these technologies. The medium may be an optical waveguide, a holographic medium, an optical combiner, an optical reflector, or any combination thereof. In one embodiment, a transparent or translucent display may be configured to selectively become opaque. Projection-based systems may employ retinal projection techniques that project a graphical image onto a person's retina. The projection system may also be configured to project the virtual object into the physical environment, for example as a hologram or on a physical surface.
The frame module 100 may also include any number of auxiliary frame electronic components 190. Embodiments of the auxiliary frame electronic components 190 include an interface for receiving input from a user and/or providing output to a user. Examples of such interfaces include a speaker, a microphone, a haptic device, and/or another I/O component. For example, a speaker (e.g., headphones) may be provided by or connectable to the frame module and/or the arm module. Such speaker components may be modular components that may operate independently of and/or in concert with the head-mounted device. The speaker component may communicate wirelessly with the head-mounted device and/or another device. The speaker component can transmit power to and/or receive power from the head-mounted device and/or another device. Such power transfer may be wired and/or wireless.
The haptic device may be implemented as any suitable device configured to provide force feedback, vibratory feedback, haptic sensations, or the like. For example, in one embodiment, the haptic device may be implemented as a linear actuator configured to provide intermittent haptic feedback, such as a tap. Other user interface embodiments include one or more buttons, dials, crowns, switches, or other devices for receiving input from a user. The auxiliary frame electronic components 190 may include additional displays and/or projectors for displaying images on surfaces other than the head-mounted device.
Other embodiments of the auxiliary frame electronic components 190 include sensors. Such sensors may be configured to sense substantially any type of characteristic, such as, but not limited to, image, pressure, light, touch, force, temperature, position, motion, and the like. For example, the sensor may be a photodetector, a temperature sensor, a light or optical sensor, an atmospheric pressure sensor, a humidity sensor, a magnet, a gyroscope, an accelerometer, a chemical sensor, an ozone sensor, a particle counting sensor, and the like. By further example, the sensor may be a biometric sensor for tracking biometric features, such as health and activity metrics.
Other embodiments of the auxiliary frame electronic components 190 include a power source and/or power transmission components for recharging the power source. Additional embodiments of the auxiliary frame electronic components 190 include communication components that facilitate the transfer of data and/or power to or from other electronic devices across standardized or proprietary protocols.
As further shown in fig. 7, the arm module 200 includes components for performing selected functions and interacting with the frame module 100. While the arm modules 200 of fig. 7 are shown as being substantially similar, it should be understood that individual arm modules 200 may be provided with features that differ from one another.
The frame module 100 may include a frame communication interface 120, and the arm modules 200 may each include an arm communication interface 220 to facilitate a communicative coupling between the frame module 100 and the arm module 200. For example, the communication link may operatively connect components of the frame module 100 (such as the controller 180) to components of the arm module 200.
Each arm module 200 may include a controller 280 having one or more processing units that include or are configured to access a memory having instructions stored thereon. The memory of the controller 280 and/or the arm module 200 may be the same, similar, or different than the memory of the controller 180 and/or the frame module 100.
The frame module 100 may be controlled at least in part by the controller 280 of the arm module 200. For example, when the arm module 200 is connected to the frame module 100, the controller 280 of the arm module 200 may be operatively connected to and/or control one or more components of the frame module 100 via the communication connections provided by the frame communication interface 120 and the arm communication interface 220.
Additionally or alternatively, each arm module 200 may be at least partially controlled by the controller 180 of the frame module 100. For example, when the arm module 200 is connected to the frame module 100, the controller 180 of the frame module 100 may be operatively connected to and/or control one or more components of the arm module 200 via the communication connections provided by the frame communication interface 120 and the arm communication interface 220.
The arm module 200 may serve as a power source for the frame module 100. By providing power using the detachable module, a user may select such an arm module according to anticipated power requirements. As shown in fig. 7, the arm module 200 may include a battery 224 for storing power and providing power to the frame module 100 and/or the arm module 200. Optionally, the arm module 200 may recharge the battery of the frame module 100, for example, by directing power from the battery 224 across the frame communication interface 120 and the arm communication interface 220. Other paths are contemplated, such as another link or wireless charging. The battery 224 may be a replaceable battery, a rechargeable battery, or a tethered power source that receives power from a source external to the arm module 200, such as from a USB cable, a Lightning cable, or other interface. One or more batteries of the head-mounted device may transmit power to and/or receive power from another device. Such power transfer may be wired and/or wireless.
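A minimal sketch of the power-sharing behavior described above, in Swift, assuming hypothetical battery figures and a reserve threshold that are not specified in the disclosure:

```swift
// Illustrative sketch of the power sharing described above: an attached arm
// module tops up the frame module's battery while keeping a reserve for itself.
// The reserve value and capacities are assumptions, not figures from the patent.
struct Battery {
    var chargeMilliampHours: Double
    let capacityMilliampHours: Double
}

func rechargeFrameBattery(frame: inout Battery, arm: inout Battery,
                          armReserve: Double = 100) {
    let deficit = frame.capacityMilliampHours - frame.chargeMilliampHours
    let available = max(0, arm.chargeMilliampHours - armReserve)
    let transferred = min(deficit, available)
    arm.chargeMilliampHours -= transferred
    frame.chargeMilliampHours += transferred
}

var frameBattery = Battery(chargeMilliampHours: 40, capacityMilliampHours: 120)
var armBattery = Battery(chargeMilliampHours: 500, capacityMilliampHours: 600)
rechargeFrameBattery(frame: &frameBattery, arm: &armBattery)
print(frameBattery.chargeMilliampHours, armBattery.chargeMilliampHours)  // 120.0 420.0
```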
Each arm module 200 may also include any number of auxiliary arm electronic components 290. By providing electronic components on the removable module, the user can select the arm module that provides the selected function when desired. At other times, other arm modules may be selected, thereby reducing the need to have all features available at all times in the frame module or a fixed arm module. Each arm module may have the same or different auxiliary arm electronic components 290.
An embodiment of the auxiliary arm electronics 290 includes a display driver. By providing a display driver with a detachable module, a user may select such an arm module when certain display features are desired. Such a display driver may be configured to control the display elements 104 of the frame module 100.
Other embodiments of the auxiliary arm electronics 290 include sensors. By providing sensing capabilities to the removable module, a user may select such an arm module when it is desired to sense a particular condition. Such sensors may be configured to sense substantially any type of characteristic, such as, but not limited to, image, pressure, light, touch, force, temperature, position, motion, and the like. For example, the sensor may be a photodetector, a temperature sensor, a light or optical sensor, an atmospheric pressure sensor, a humidity sensor, a magnet, a gyroscope, an accelerometer, a chemical sensor, an ozone sensor, a particle counting sensor, and the like. The sensor may be used to sense environmental conditions in an adjacent environment. The sensor may be exposed to the environment, for example with an opening provided in the arm module 200.
Additional embodiments of the auxiliary arm electronics 290 include a biosensor. By providing the removable module with biosensing capabilities, the user may select such an arm module when it is desired to track a biometric feature, such as a health and activity metric. The one or more biometric sensors may include optical and/or electronic biometric sensors that may be used to calculate one or more biometric characteristics. For example, the biosensor may include a light source and a photodetector to form a photoplethysmography (PPG) sensor. Optical (e.g., PPG) sensors may be used to calculate various biological characteristics including, but not limited to, heart rate, respiration rate, blood oxygen level, blood volume estimation, blood pressure, or combinations thereof. One or more of the sensors may also be configured to perform electrical measurements using one or more electrodes. One or more electrical sensors may be used to measure Electrocardiogram (ECG) characteristics, skin resistance, and other electrical characteristics of the user's body. Additionally or alternatively, the biosensor may be configured to measure body temperature, exposure to ultraviolet radiation, and other health-related information.
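The disclosure does not describe how a heart-rate figure would actually be computed from the PPG signal; as one illustrative possibility only, a simple peak-counting estimate over a known sampling window could look like the following Swift sketch.

```swift
// One possible way to turn a PPG photodetector trace into a heart-rate figure:
// count peaks over a known window. Purely illustrative; the patent does not
// describe any particular computation.
func estimateHeartRate(ppgSamples: [Double], sampleRateHz: Double) -> Double {
    guard ppgSamples.count > 2, sampleRateHz > 0 else { return 0 }
    let mean = ppgSamples.reduce(0, +) / Double(ppgSamples.count)
    var peaks = 0
    for i in 1..<(ppgSamples.count - 1) {
        // A peak: a sample above the mean that is higher than both neighbors.
        if ppgSamples[i] > mean,
           ppgSamples[i] > ppgSamples[i - 1],
           ppgSamples[i] > ppgSamples[i + 1] {
            peaks += 1
        }
    }
    let windowSeconds = Double(ppgSamples.count) / sampleRateHz
    return Double(peaks) / windowSeconds * 60  // beats per minute
}
```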
Additional embodiments of the auxiliary arm electronics 290 include user sensors. Such sensors may be used to detect characteristics associated with a user and/or other individual wearing the head-mounted device. For example, the user sensor may perform facial feature detection, facial motion detection, facial recognition, eye tracking, user mood detection, user emotion detection, voice detection, and the like.
Other embodiments of the auxiliary arm electronics 290 include components for receiving input from a user, providing output to a user, and/or performing other functions. Examples of such components include speakers, microphones, displays, touch sensors, haptic devices, cameras, optical sensors, magnets, gyroscopes, accelerometers, and/or other I/O components. The I/O components may be used to detect and interpret user input. The I/O components may be used to provide information to a user. The I/O components may also be used to capture information related to the user and/or the environment.
It should be understood that the components discussed herein for either the frame module or the arm modules may be provided on either or both of the frame module and/or the arm modules. In addition, selecting different modules provides a customized experience for the user.
Referring now to fig. 8-10, one or more of the various arm modules may be used with a given frame module of a head-mounted device at different times. Fig. 8-10 illustrate front and side views of different arm modules for use with a frame module according to some embodiments of the present disclosure. Although each of fig. 8-10 shows only one arm module, it should be understood that the description herein may be applied to each of a plurality of arm modules. Any number of frame modules and arm modules may be provided. Each arm module may be configured differently. For example, the functional and aesthetic aspects may be configured differently. The additional arm modules may also have the same or different components. As discussed further herein, different components may provide different functions, such that attachment of a given arm module provides different functions for the same frame module of the head-mounted device. Each arm module may include one or more functional components, such as sensors, biosensors, batteries, I/O components, communication interfaces, controllers, and the like, as discussed further herein.
As shown in fig. 8, the first arm module 200 may include an arm communication interface 220 for communicating with the frame module. The arm module 200 may also include a controller 280, a display driver 230 configured to control the display elements of the frame module, and a battery 224.
As shown in fig. 9, the second arm module 300 may be provided with different and/or additional components with respect to the first arm module 200 of fig. 8. For example, the second arm module 300 may include an arm communication interface 320 for communicating with the frame module. The arm communication interface 320 may include additional connections for managing larger communication bandwidth. The second arm module 300 may also include a controller 380, a display driver 330 configured to control the display elements of the frame module, and a battery 324. The controller 380 may have greater processing power than the controller 280 of the first arm module 200. The display driver 330 may have a greater display capability (e.g., resolution, frame rate, etc.) than the display driver 230 of the first arm module 200. The battery 324 may have a greater charge capacity than the battery 224 of the first arm module 200. The second arm module 300 may optionally include additional components (not shown) not included in the first arm module 200. In this way, one or more components of the second arm module 300 may provide greater capabilities. It will be appreciated that the second arm module 300 may provide other features not provided by the first arm module 200, such as size, shape, and other visual features. Accordingly, a user may select and install the second arm module 300 to provide selected functional and/or aesthetic features not provided by the first arm module 200.
As shown in fig. 10, the third arm module 400 may be provided with different and/or additional components with respect to the first arm module 200 of fig. 8 and/or the second arm module 300 of fig. 9. For example, the third arm module 400 may include an arm communication interface 420 for communicating with the frame module. The arm communication interface 420 may include additional connections for managing larger communication bandwidth. The third arm module 400 may also include a controller 480, a display driver 430 configured to control the display elements of the frame module, and a battery 424. The controller 480 may have greater processing power than the controller 280 of the first arm module 200 and/or the controller 380 of the second arm module 300. The display driver 430 may have a greater display capability (e.g., resolution, frame rate, etc.) than the display driver 230 of the first arm module 200 and/or the display driver 330 of the second arm module 300. The battery 424 may have a greater charge capacity than the battery 224 of the first arm module 200 and/or the battery 324 of the second arm module 300. The third arm module 400 may optionally include auxiliary electronic components 490 that are not included in the first arm module 200 and/or the second arm module 300. In this way, one or more components of the third arm module 400 may provide greater capabilities. It will be appreciated that the third arm module 400 may provide other features not provided by the first arm module 200 and/or the second arm module 300, such as size, shape, and other visual features. Accordingly, a user may select and install the third arm module 400 to provide selected functional and/or aesthetic features not provided by the first arm module 200 and/or the second arm module 300.
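To make the trade-off among the first, second, and third arm modules concrete, the following Swift sketch picks a module according to the user's current priority. The capability figures below are placeholders for illustration, not specifications from the disclosure.

```swift
// Sketch of the selection a user makes among interchangeable arm modules.
// Figures are invented placeholders.
struct ArmModuleSpec {
    let name: String
    let batteryMilliampHours: Int
    let displayFrameRateHz: Int
    let hasAuxiliaryComponents: Bool
}

enum Priority { case batteryLife, displayQuality }

func pick(from modules: [ArmModuleSpec], priority: Priority) -> ArmModuleSpec? {
    switch priority {
    case .batteryLife:    return modules.max { $0.batteryMilliampHours < $1.batteryMilliampHours }
    case .displayQuality: return modules.max { $0.displayFrameRateHz < $1.displayFrameRateHz }
    }
}

let catalog = [
    ArmModuleSpec(name: "first arm module",  batteryMilliampHours: 300, displayFrameRateHz: 60,  hasAuxiliaryComponents: false),
    ArmModuleSpec(name: "second arm module", batteryMilliampHours: 450, displayFrameRateHz: 90,  hasAuxiliaryComponents: false),
    ArmModuleSpec(name: "third arm module",  batteryMilliampHours: 600, displayFrameRateHz: 120, hasAuxiliaryComponents: true),
]
print(pick(from: catalog, priority: .batteryLife)?.name ?? "none")  // "third arm module"
```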
It will be appreciated that functional differences between arm modules may refer to both the purpose of the component and the parameters of its operation. For example, while components of different arm modules may all serve a common purpose, the components may operate differently to accomplish that purpose. For example, different components may be used to sense different conditions based on the user's desired operation. Other variations, such as size, shape, and material selection may be provided so that the user may select the arm module that is best suited for user comfort and/or component performance.
Different arm modules may also differ in mechanical configuration, such as material characteristics and/or structural features, which may help define shape, size, flexibility, rigidity, feel, and/or aesthetic properties, such as color, pattern, and/or material, to provide a different look and feel. Further, each arm module may have a different housing with a different color, material, shape, pattern, etc. The housing may provide different aesthetic features, appearance features, and/or look and feel than other housings in the system.
While the components of different arm modules may differ, the arm modules 200, 300, and 400 may have the same or similar arm communication interfaces 220, 320, and 420 so that each arm module 200, 300, and 400 may be attached to and communicate with the same frame module 100 in the same or similar manner.
Thus, each arm module is configured to provide different functional and/or aesthetic features than one or more other arm modules in the system. In this way, the user may select an arm module having a desired function and/or look and feel. This selection may occur at the time of purchase, allowing differentiation from other purchasers, or from a collection of some or all of the arm modules, so that the user may select the desired arm module at the appropriate time. In one embodiment, one arm module may be configured for use outside the home, while another arm module may be configured for use in the home. Any combination of aesthetic and functional features may be provided to create different head-mounted devices. When combined with different frame modules, the system becomes highly customizable. The user may create different head-mounted devices from a set of arm modules by selecting one frame module. If multiple frame modules and arm modules are provided, any number of different head-mounted device configurations may be made.
Referring now to fig. 11 and 12, the arm module may be provided with one or more additional modules for further enhancing the functionality of the head-mounted device. For example, as shown in fig. 11, the add-on module 500 may be attached and operably connected to the arm module 200 independently of the attachment of the arm module 200 to the frame module. Although only one arm module is shown in fig. 11 and 12, it should be understood that the description herein may be applied to each of a plurality of arm modules.
The additional module 500 may be provided at any portion of the arm module 200. For example, the additional module 500 may be disposed on an inner side, outer side, top, bottom, front, or back of the arm module 200. The additional module 500 may protrude from a surface of the arm module 200. In this way, the arm module 200 may provide a continuous surface at its outer periphery while accepting the add-on module 500 when desired by the user. In such a configuration, the add-on module 500 may be effectively positioned to perform certain functions, such as directional sensing. For example, sensors mounted on the add-on module may be oriented in multiple directions to sense conditions near the arm module 200. Alternatively, the add-on module 500 may be inserted into a recess of the arm module 200 such that a portion of the add-on module 500 is flush with a portion of the arm module 200.
As shown in fig. 12, in addition to the arm communication interface 220, the arm module 200 may include an additional arm communication interface 222 for communicating with the add-on module 500. The add-on module 500 may include an additional communication interface 520 for communicating with the arm module 200.
The add-on module 500 may include a controller 580 having one or more processing units that include or are configured to access memory having instructions stored thereon. The memory of the controller 580 and/or the add-on module 500 may be the same as, similar to, or different from the memory of the controller 280 and/or the arm module 200.
The add-on module 500 may also include any number of auxiliary additional electronic components 590. By providing electronic components on a removable module, the user can optionally attach the add-on module 500 when its functionality is desired. At other times, the add-on module may be omitted, thereby reducing the need to have all features available in the arm module 200 at all times. Examples of auxiliary additional electronic components 590 include any of the electronic components discussed herein with respect to auxiliary frame electronic components 190 and auxiliary arm electronic components 290.
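For illustration only, the following sketch (with hypothetical names) shows how an arm module might track an attached add-on module and relay its auxiliary data toward the frame module over the existing arm-to-frame connection:

import Foundation

// An add-on module exposes a name and a way to read its auxiliary data.
struct AddOnModule {
    let name: String
    let readAuxiliaryData: () -> [UInt8]
}

// The arm module tracks at most one attached add-on and relays its data
// toward the frame module over the existing arm-to-frame connection.
final class ArmModule {
    private(set) var addOn: AddOnModule?
    var forwardToFrame: (([UInt8]) -> Void)?

    func attach(_ module: AddOnModule) {
        addOn = module
        print("Add-on attached: \(module.name)")
    }

    func detachAddOn() {
        addOn = nil
    }

    func relayAuxiliaryData() {
        guard let addOn = addOn else { return }
        forwardToFrame?(addOn.readAuxiliaryData())
    }
}

let arm = ArmModule()
arm.forwardToFrame = { bytes in print("Frame received \(bytes.count) bytes") }
arm.attach(AddOnModule(name: "depth-camera", readAuxiliaryData: { [0x01, 0x02] }))
arm.relayAuxiliaryData()   // Frame received 2 bytes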
Referring now to fig. 13 and 14, the frame module may be provided with replaceable display elements. For example, as shown in fig. 13, one configuration of display element 104 may be swapped with an additional display element 704 having at least one different characteristic (e.g., resolution, color, thickness, etc.). For example, one of the display elements may be opaque and the other may be transparent or translucent. Additionally or alternatively, one display element may provide display capability for both eyes, while another display element may provide display capability for only one eye or no display capability for either eye. Additionally or alternatively, one display element may provide optical correction, while another display element may provide a different optical correction or no optical correction.
As shown in fig. 14, the display element 104 may be configured to slide relative to the frame module 100. Additionally or alternatively, the display element 104 may be pressed, snapped, or otherwise inserted into the frame 120 of the frame module 100. Once inserted, the display element 104 may be locked or otherwise secured within the frame 120. An electrical connection may be made and maintained while the display element 104 is mechanically secured.
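One possible arrangement, sketched below with hypothetical names and fields, is for the frame module to read a descriptor of the newly secured display element so that a display driver can be configured to match; the descriptor fields shown are assumptions, not taken from the present disclosure.

import Foundation

// A descriptor the frame module might read once a display element is locked in
// and its electrical connection is established.
struct DisplayElementDescriptor {
    enum Transparency { case opaque, translucent, transparent }

    let width: Int
    let height: Int
    let transparency: Transparency
    let coversBothEyes: Bool
    let opticalCorrectionDiopters: Double?   // nil if the element has no correction
}

struct FrameDisplayHost {
    private(set) var installedDisplay: DisplayElementDescriptor?

    // Called when a display element is inserted and secured; the attached arm
    // module's display driver could then be configured to match.
    mutating func displayElementInserted(_ descriptor: DisplayElementDescriptor) {
        installedDisplay = descriptor
        print("Configured display at \(descriptor.width)x\(descriptor.height)")
    }

    mutating func displayElementRemoved() {
        installedDisplay = nil
    }
}

var host = FrameDisplayHost()
host.displayElementInserted(
    DisplayElementDescriptor(
        width: 1920,
        height: 1080,
        transparency: .translucent,
        coversBothEyes: true,
        opticalCorrectionDiopters: -1.5
    )
)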
Referring now to fig. 15 and 16, the frame module may be provided with one or more additional modules for further enhancing the functionality of the head-mounted device. For example, as shown in fig. 15, the add-on module 600 may be attached and operably connected to the frame module 100 independently of the attachment of the frame module 100 to the arm module.
The additional module 600 may be provided at any portion of the frame module 100. For example, the additional module 600 may be disposed on an inner side, outer side, top, bottom, front, or back of the frame module 100. The additional module 600 may protrude from a surface of the frame module 100. In this way, the frame module 100 may provide a continuous surface at its outer periphery while accepting the add-on module 600 when desired by the user. In such a configuration, the add-on module 600 may be effectively positioned to perform certain functions, such as directional sensing. For example, sensors mounted on the add-on module may be oriented in multiple directions to sense conditions near the frame module 100. Alternatively, the add-on module 600 may be inserted into a recess of the frame module 100 such that a portion of the add-on module 600 is flush with a portion of the frame module 100.
As shown in fig. 16, in addition to the frame communication interface, the frame module 100 may also include an additional frame communication interface 122 for communicating with the add-on module 600. The add-on module 600 may include an additional communication interface 620 for communicating with the frame module 100.
The add-on module 600 may include a controller 680 having one or more processing units that include or are configured to access memory having instructions stored thereon. The memory of the controller 680 and/or the add-on module 600 may be the same as, similar to, or different from the memory of the controller 180 and/or the frame module 100.
The add-on module 600 may also include any number of auxiliary additional electronic components 690. By providing electronic components on the removable module, the user can optionally provide additional modules 600 when desired for selected functionality. At other times, additional modules may be selected or deselected, thereby reducing the need to have all features available in the frame module 100 at all times. Examples of auxiliary additional electronic components 690 include any of the electronic components discussed herein with respect to auxiliary frame electronic components 190 and auxiliary arm electronic components 290.
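As an illustrative sketch only (names hypothetical), the frame module's controller could merge the features contributed by whatever is currently attached, whether an arm module, an add-on on the arm, or an add-on on the frame, so that software sees a single combined feature set:

import Foundation

// Each attached unit (arm module, arm add-on, frame add-on) contributes a set
// of feature names; the frame controller merges them into one feature set.
struct AttachedUnit {
    let name: String
    let features: Set<String>
}

func combinedFeatures(arm: AttachedUnit?,
                      armAddOn: AttachedUnit?,
                      frameAddOn: AttachedUnit?) -> Set<String> {
    var features: Set<String> = []
    for unit in [arm, armAddOn, frameAddOn].compactMap({ $0 }) {
        features.formUnion(unit.features)
    }
    return features
}

let basicArm = AttachedUnit(name: "arm-basic", features: ["speaker", "microphone"])
let cameraClip = AttachedUnit(name: "camera-clip", features: ["camera"])

print(combinedFeatures(arm: basicArm, armAddOn: nil, frameAddOn: cameraClip))
// e.g. ["camera", "microphone", "speaker"] (set ordering is not guaranteed)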
Referring now to fig. 17 and 18, the frame module may be provided with replaceable user engagement portions. For example, as shown in fig. 17, a user engagement portion 150 may be provided in one configuration for use. The user engagement portion 150 may include or be coupled to the frame 102 and optionally extend over the bridge 108 and/or nose 106 of the frame module 100. The user engagement portion 150 may expose at least a portion of the display element 104 for viewing.
In another configuration, as shown in fig. 18, an additional user engagement portion 850 is provided. The additional user engagement portion 850 may have different characteristics relative to the user engagement portion 150. For example, the additional user engagement portion 850 may provide a greater degree of engagement with the user's face. By further example, the additional user engagement portion 850 may be shaped to enclose the space between the user's eyes and the display element 104, thereby blocking external light sources. The additional user engagement portion 850 may include or be coupled to the frame 102 and optionally extend over the bridge 808 and/or nose 806 of the frame module 100. One or both of the user engagement portions 150 and 850 may be configured to conform to the contours of the user's face. For example, the user engagement portions 150 and 850 may comprise a flexible material. By further example, the user engagement portions 150 and 850 may comprise a material that conforms to the user's face during an initial stage and hardens (e.g., cures) to a desired shape that is maintained after the initial stage.
Thus, one of the different user engagement portions 150 and 850 may be selected at different times to achieve the desired performance of the head-mounted device.
Accordingly, embodiments of the present disclosure provide a head-mounted device with replaceable modules that provide a variety of different components and functions to achieve results desired by a user. The modular configuration allows a user to easily customize a head-mounted device with one or more arm modules that provide features integrated with other operations of the frame module of the head-mounted device. The arm modules can be easily interchanged with one another to provide different components and functions at different times. Thus, the frame module of the present disclosure need not include permanent components that provide every function a user may later desire. Rather, the head-mounted device may have expanded and customizable capabilities through the use of one or more arm modules.
As described above, one aspect of the present technology may include collecting and using data from a variety of sources. The present disclosure contemplates that, in some instances, such collected data may include personal information data that uniquely identifies or may be used to contact or locate a particular person. Such personal information data may include demographic data, location-based data, phone numbers, email addresses, Twitter IDs, home addresses, data or records related to the user's health or fitness level (e.g., vital sign measurements, medication information, exercise information), date of birth, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information data in the present technology may be useful to benefit the user. For example, health and fitness data may be used to provide insight into the overall health condition of a user, or may be used as positive feedback for individuals using technology to pursue health goals.
The present disclosure contemplates that entities responsible for collecting, analyzing, publishing, transmitting, storing, or otherwise using such personal information data will comply with established privacy policies and/or privacy practices. In particular, such entities should implement and consistently adhere to privacy policies and practices that are recognized as meeting or exceeding industry or government requirements for maintaining the privacy and security of personal information data. Such policies should be easily accessible to users and should be updated as data collection and/or usage changes. Personal information from users should be collected for legitimate and reasonable uses by the entity and not shared or sold outside of those legitimate uses. Furthermore, such collection or sharing should occur only after receiving the user's informed consent. Furthermore, such entities should consider taking any steps necessary to safeguard and secure access to such personal information data and to ensure that others with access to the personal information data comply with their privacy policies and procedures. In addition, such entities may subject themselves to third-party evaluations to certify their compliance with widely accepted privacy policies and practices. In addition, policies and practices should be adapted to the particular type of personal information data being collected and/or accessed, and to applicable laws and standards, including jurisdiction-specific considerations. For example, in the United States, the collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA), while health data in other countries may be subject to other regulations and policies and should be handled accordingly. Therefore, different privacy practices should be maintained for different types of personal data in each country.
Regardless of the foregoing, the present disclosure also contemplates embodiments in which a user selectively blocks the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware elements and/or software elements may be provided to prevent or block access to such personal information data. For example, in the case of an ad delivery service, the present technology may be configured to allow a user to opt in or opt out of participating in the collection of personal information data during registration for the service or at any time thereafter. In another example, the user may choose not to provide emotion-related data for targeted content delivery services. In yet another example, the user may choose to limit the length of time that emotion-related data is kept, or to prohibit the development of a baseline emotional profile altogether. In addition to providing opt-in and opt-out options, the present disclosure contemplates providing notifications related to the access or use of personal information. For example, a user may be notified upon downloading an application that their personal information data will be accessed, and then be reminded again just before the personal information data is accessed by the application.
Further, it is an aim of the present disclosure that personal information data should be managed and handled in a way that minimizes the risk of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health-related applications, data de-identification may be used to protect the privacy of a user. De-identification may be facilitated, where appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of stored data (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
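As a minimal sketch of these de-identification ideas (removing direct identifiers and keeping location only at a city level), the following example uses hypothetical field names that are not drawn from the present disclosure:

import Foundation

// A raw sample containing direct identifiers alongside the measurement.
struct HealthSample {
    let userIdentifier: String
    let dateOfBirth: Date?
    let streetAddress: String?
    let city: String
    let heartRate: Int
}

// The de-identified form keeps the measurement and only city-level location.
struct DeidentifiedSample {
    let city: String
    let heartRate: Int
}

// Drop the user ID, date of birth, and street address; retain coarse location.
func deidentify(_ sample: HealthSample) -> DeidentifiedSample {
    DeidentifiedSample(city: sample.city, heartRate: sample.heartRate)
}

let raw = HealthSample(userIdentifier: "user-123",
                       dateOfBirth: nil,
                       streetAddress: "1 Example Way",
                       city: "Cupertino",
                       heartRate: 72)
print(deidentify(raw))   // DeidentifiedSample(city: "Cupertino", heartRate: 72)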
Thus, while the present disclosure broadly covers the use of personal information data to implement one or more of the various disclosed embodiments, the present disclosure also contemplates that the various embodiments may be implemented without the need to access such personal information data. That is, the various embodiments of the present technology are not rendered inoperable by the lack of all or a portion of such personal information data. For example, content may be selected and delivered to a user by inferring preferences based on non-personal information data or an absolute minimum amount of personal information, such as content requested by a device associated with the user, other non-personal information available to the content delivery service, or publicly available information.
Unless specifically stated otherwise, a reference to an element in the singular is not intended to mean one and only one, but rather one or more. For example, "a" module may refer to one or more modules. An element preceded by "a", "an", "the", or "said" does not, without further constraints, preclude the existence of additional identical elements.
Headings and sub-headings (if any) are used for convenience only and do not limit the invention. The word "exemplary" is used herein to mean serving as an example or illustration. To the extent that the terms "includes," "has," and the like are used, such terms are intended to be inclusive in a manner similar to the term "comprising" as "comprising" is interpreted when employed as a transitional word in a claim. Relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
Phrases such as an aspect, the aspect, another aspect, some aspects, one or more aspects, a specific implementation, the specific implementation, another specific implementation, some specific implementation, one or more specific implementations, embodiments, the embodiment, another embodiment, some embodiments, one or more embodiments, configurations, the configuration, other configurations, some configurations, one or more configurations, the subject technology, the disclosure, the present disclosure, other variations, and the like are for convenience and do not imply that a disclosure relating to such one or more phrases is essential to the subject technology or that such disclosure applies to all configurations of the subject technology. Disclosure relating to such one or more phrases may apply to all configurations or one or more configurations. Disclosure relating to such one or more phrases may provide one or more examples. Phrases such as an aspect or some aspects may refer to one or more aspects and vice versa and this applies similarly to the other preceding phrases.
The phrase "at least one of," preceding a series of items, separates any of the items by the terms "and" or, "modifying the list as a whole rather than each member of the list. The phrase "at least one" does not require the selection of at least one item; rather, the phrase allows the meaning of at least one of any one item and/or at least one of any combination of items and/or at least one of each item to be included. For example, each of the phrases "at least one of A, B and C" or "A, B or at least one of C" refers to a alone, B alone or C alone; A. any combination of B and C; and/or A, B and C.
It should be understood that the specific order or hierarchy of steps, operations, or processes disclosed is an illustration of exemplary approaches. Unless specifically stated otherwise, it is understood that a specific order or hierarchy of steps, operations, or processes may be performed in a different order. Some of the steps, operations, or processes may be performed concurrently. The accompanying method claims, if any, present elements of the various steps, operations, or processes in a sample order, and are not meant to be limited to the specific order or hierarchy presented. These may be performed serially, linearly, in parallel, or in a different order. It should be understood that the described instructions, operations, and systems may generally be integrated together in a single software/hardware product or packaged into multiple software/hardware products.
In one aspect, the term "coupled" or the like may refer to a direct coupling. In another aspect, the term "coupled" or the like may refer to an indirect coupling.
Terms such as top, bottom, front, rear, side, horizontal, vertical, and the like refer to an arbitrary frame of reference, rather than to the ordinary gravitational frame of reference. Thus, such a term may refer to a direction that extends upwardly, downwardly, diagonally, or horizontally in a gravitational frame of reference.
The present disclosure is provided to enable one of ordinary skill in the art to practice the various aspects described herein. In some instances, well-known structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology. The present disclosure provides various embodiments of the subject technology, and the subject technology is not limited to these embodiments. Various modifications to these aspects will be readily apparent to those skilled in the art, and the principles described herein may be applied to other aspects.
All structural and functional equivalents to the elements of the various aspects described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the claims. No claim element is to be construed under the provisions of 35 U.S.C. § 112 unless the element is expressly recited using the phrase "means for" or, in the case of a method claim, the element is recited using the phrase "step for".
The title, background, brief description of the drawings, abstract, and drawings are hereby incorporated into this disclosure and are provided as illustrative examples of the disclosure, not as limiting descriptions. They are not to be considered as limiting the scope or meaning of the claims. In addition, in the detailed description, it can be seen that the description provides illustrative examples, and that various features are grouped together in various implementations for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed configuration or operation. The claims are hereby incorporated into the detailed description, with each claim standing on its own as a separately claimed subject matter.
The claims are not intended to be limited to the aspects described herein, but are to be accorded the full scope consistent with the language of the claims and to encompass all legal equivalents. Notwithstanding, none of the claims is intended to embrace subject matter that fails to satisfy the requirements of the applicable patent law, nor should they be interpreted in such a manner.

Claims (10)

1. A head-mounted device, characterized in that the head-mounted device comprises:
a frame module, the frame module comprising:
a frame;
a display element supported by the frame;
a first attachment element; and
a first communication interface;
an arm module, the arm module comprising:
an electronic component;
a second attachment element; and
a second communication interface, wherein the display element is operatively connected to the electronic component via the first communication interface and the second communication interface when the arm module is attached to the frame module; and
a release mechanism for releasing the arm module from the frame module.
2. The head-mounted device of claim 1, wherein the electronic component comprises a display driver for controlling the display element of the frame module.
3. The head-mounted device of claim 2, wherein the arm module further comprises:
a controller for operating the display element of the frame module;
a microphone;
a speaker; and
a battery for supplying power to the display element of the frame module.
4. A head-mounted device, characterized in that the head-mounted device comprises:
a frame module, the frame module comprising:
a frame; and
a display element supported by the frame; and
an arm module removably attached to the frame module and including electronic components operatively connected to the display element when the arm module is attached to the frame module, the arm module being interchangeable with other arm modules removably attached to the frame module.
5. The head-mounted device of claim 4, wherein:
the frame module further includes:
a controller;
a first attachment element; and
a first communication interface;
the arm module further includes:
a second attachment element; and
a second communication interface, wherein the controller is operatively connected to the electronic component via the first communication interface and the second communication interface when the arm module is attached to the frame module; and
a release mechanism on an outer surface of the head-mounted device for releasing the arm module from the frame module.
6. The head-mounted device of claim 5, wherein:
the arm module further includes:
a third attachment element; and
a third communication interface;
the head-mounted device further comprises an additional module comprising:
an additional electronic component;
a fourth attachment element; and
a fourth communication interface, wherein the additional electronic component is operatively connected to the controller via the third communication interface and the fourth communication interface when the additional module is attached to the arm module.
7. The head-mounted device of claim 5, wherein:
the frame module further includes:
a third attachment element; and
a third communication interface;
the head-mounted device further comprises an additional module comprising:
an additional electronic component;
a fourth attachment element; and
a fourth communication interface, wherein the additional electronic component is operably connected to the controller via the third communication interface and the fourth communication interface when the additional module is attached to the frame module.
8. A head-mounted device, characterized in that the head-mounted device comprises:
a frame module, the frame module comprising:
a frame;
a controller within the frame; and
an attachment element; and
a first arm module comprising a first electronic component for performing a first function, wherein the first electronic component is operatively connected to the controller when the first arm module is attached to the attachment element; and
a second arm module comprising a second electronic component for performing a second function different from the first function, wherein the second electronic component is operably connected to the controller when the second arm module is attached to the attachment element.
9. The head-mounted device of claim 8, wherein:
the frame module includes a display element;
the first electronic component comprises a first display driver to operate the display element at a first resolution; and
the second electronic component includes a second display driver configured to operate the display element at a second resolution different from the first resolution.
10. The head-mounted device of claim 8, wherein the frame module further comprises a release mechanism on an outer surface of the frame module for releasing the first arm module or the second arm module from the attachment element.
CN201920757997.3U 2018-12-19 2019-05-24 Head-mounted device Active CN209928142U (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201862782260P 2018-12-19 2018-12-19
US62/782,260 2018-12-19
US16/378,357 2019-04-08
US16/378,357 US20200201042A1 (en) 2018-12-19 2019-04-08 Modular system for head-mounted device

Publications (1)

Publication Number Publication Date
CN209928142U true CN209928142U (en) 2020-01-10

Family

ID=66380181

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201920757997.3U Active CN209928142U (en) 2018-12-19 2019-05-24 Head-mounted device

Country Status (1)

Country Link
CN (1) CN209928142U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113671694A (en) * 2020-05-13 2021-11-19 宇龙计算机通信科技(深圳)有限公司 Head-mounted display device

Similar Documents

Publication Publication Date Title
JP7291784B2 (en) Modular system for head-mounted devices
CN214540234U (en) Head-mounted device
CN211834370U (en) Head-mounted display, face interface for head-mounted display and display system
US20230350503A1 (en) Ring input devices
CN213659407U (en) Crown input and feedback for head-mounted devices
CN112444996A (en) Head-mounted device with tension adjustment
US20230350633A1 (en) Shared data and collaboration for head-mounted devices
US11175734B1 (en) Wrist tracking devices
US20230229007A1 (en) Fit detection for head-mountable devices
WO2021194636A1 (en) Head-mounted device comprising headband
CN112423190A (en) Audio-based feedback for head-mounted devices
CN110888234A (en) Display system
US20230305301A1 (en) Head-mountable device with connectable accessories
CN209928142U (en) Head-mounted device
US20230229010A1 (en) Head-mountable device for posture detection
US11714286B1 (en) Accessory for head-mountable device
US20210373592A1 (en) Optical Module With Conformable Portion
CN112241200A (en) Object tracking for head mounted devices
US11953690B2 (en) Head-mountable device and connector
CN112526750A (en) Head-mounted display
CN112285928A (en) Head-mounted device
US11729373B1 (en) Calibration for head-mountable devices
US11789276B1 (en) Head-mounted device with pivoting connectors
US11726523B1 (en) Head-mountable device with variable stiffness head securement
US11709365B1 (en) Motor temperature sensor for head-mountable device

Legal Events

Date Code Title Description
GR01 Patent grant