CN117355785A - Front and surrounding viewing lens zones for virtual reality headgear - Google Patents

Front and surrounding viewing lens zones for virtual reality headgear

Info

Publication number
CN117355785A
CN117355785A (application CN202280035120.8A)
Authority
CN
China
Prior art keywords
primary
virtual reality
surrounding
image
display screen
Prior art date
Legal status
Pending
Application number
CN202280035120.8A
Other languages
Chinese (zh)
Inventor
P·J·格尔根
M·E·格雷厄姆
Current Assignee
Universal City Studios LLC
Original Assignee
Universal City Studios LLC
Priority date
Filing date
Publication date
Priority claimed from US 17/742,214 (published as US20220365349A1)
Application filed by Universal City Studios LLC
Priority claimed from PCT/US2022/029024 (published as WO2022241136A1)
Publication of CN117355785A
Legal status: Pending

Abstract

The present disclosure is directed to a virtual reality headgear (10) that includes primary lens regions (16, 18) and surrounding lens regions (20, 22). The virtual reality headgear (10) may also include one or more primary display screens (30) aligned with the primary lens regions (16, 18) and a plurality of secondary display screens (32) aligned with the surrounding lens regions (20, 22). The surrounding lens regions (20, 22) may include a translucent texture that distorts the image viewed through them so that it appears like peripheral vision and augments the primary image viewed through the primary lens regions (16, 18).

Description

Front and surrounding viewing lens zones for virtual reality headgear
Cross reference to related applications
The present application claims priority to and the benefit of U.S. Provisional Application No. 63/242,115, entitled "Front and Surrounding Viewing Lens Zones for Virtual Reality Headgear," filed September 9, 2021, and U.S. Provisional Application No. 63/189,051, entitled "Front and Surrounding Viewing Lens Zones for Virtual Reality Headgear," filed May 14, 2021, both of which are hereby incorporated by reference in their entirety for all purposes.
Technical Field
The present disclosure relates generally to the field of virtual reality headgear. More particularly, embodiments of the present disclosure relate to lenses for virtual reality headgear.
Background
The virtual reality headgear may include one or more display screens that display images to a user of the virtual reality headgear. The displayed images may be viewed through a lens of the virtual reality headgear. The virtual reality headgear may display images and/or present special effects to simulate various environments and/or scenes. For example, the virtual reality headgear may display visual features along with sound effects corresponding to the visual features to further assist in immersing the user in the virtual reality environment.
This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present technology, which are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. It should be understood, therefore, that these statements are to be read in this light, and not as admissions of prior art.
Disclosure of Invention
The following sets forth an overview of certain embodiments disclosed herein. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, the disclosure may encompass a variety of aspects that may not be set forth below.
In one embodiment, a virtual reality headset includes a housing, a primary display screen supported within the housing and configured to display a primary image, and a secondary display screen supported within the housing and configured to display a secondary image. The virtual reality headset may also include an optical element including a primary optical zone and a surrounding optical zone. Further, the primary optical zone enables viewing of the primary content displayed on the primary display screen, and the surrounding optical zone enables viewing of the secondary content displayed on the secondary display screen.
Further, in one embodiment, the virtual reality headset includes a housing, a primary display screen supported within the housing and configured to display a primary image, and a secondary display screen supported within the housing and configured to display a secondary image. Further, the virtual reality headset includes an optical element that includes a primary optical zone and a surrounding optical zone. The primary optical zone is positioned in alignment with the primary display screen and is transparent, and the surrounding optical zone is positioned in alignment with the secondary display screen and is translucent.
Further, in an embodiment, a method of operating a virtual reality headgear includes displaying a primary image from a perspective of a viewer on a primary display screen positioned in front of a transparent primary lens region of the virtual reality headgear. The method further includes displaying a secondary image from the perspective of the viewer on a secondary display screen positioned in front of a translucent surrounding lens region of the virtual reality headgear. The method further includes coordinating, via one or more processors, display of the primary image and display of the secondary image to create a virtual environment having a surrounding visual effect.
Drawings
These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
fig. 1 is a front perspective view of optical elements of a virtual reality headset and portions of an electrical housing of the virtual reality headset according to an embodiment of the present disclosure;
FIG. 2 is a rear view of a virtual reality headset having optical elements including left and right primary lens regions and left and right surrounding lens regions according to an embodiment of the present disclosure;
FIG. 3 is a front perspective view of a virtual reality headset with optical elements including left and right primary lens regions and left and right surrounding lens regions, with portions of the virtual reality headset removed and/or transparent to enable visualization of various features of the virtual reality headset, according to an embodiment of this disclosure;
fig. 4 is a front perspective view of a virtual reality headset according to an embodiment of the present disclosure, wherein the electrical housing is transparent to enable visualization of various features inside the electrical housing;
Fig. 5 is a rear perspective view of a virtual reality headset according to an embodiment of the present disclosure; and
fig. 6 is a perspective view of an embodiment of a ride vehicle that may carry a user while wearing virtual reality headgear according to an embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the drawings and the specific embodiments illustrated in the drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one of ordinary skill in the art that the embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to (and encompasses) any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises," "comprising," "includes," "including," and/or "having," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Furthermore, as used herein, the term "if" may be interpreted to mean "when" or "upon" or "in response to determining" or "in response to detecting," depending on the context.
Some existing virtual reality headgear may have one or two lenses that enable viewing of images displayed on a display screen of the virtual reality headgear. Advantageously, the present disclosure provides a virtual reality headset having an optical element (e.g., one or more lenses; a lens assembly) that includes a primary lens zone (e.g., a main lens zone; a central or forward lens zone) of the optical element and a peripheral lens zone (e.g., a secondary lens zone; a side lens zone) of the optical element. The surrounding lens region of the optical element may be used to provide the user with surrounding vision at various times and/or in some manner to achieve a more immersive experience for the user. For example, the surrounding lens region of the optical element may enable visualization of additional color effects or other special effects to enhance the user's experience in viewing the primary image at the primary lens region of the optical element.
More particularly, provided herein is virtual reality headgear including optical elements that enable viewing of content displayed on one or more screens positioned in the virtual reality headgear. The optical element may be divided into two zones (e.g., left and right zones; left and right of the center line). The optical element may include left and right primary lens regions that may each be positioned over a primary display screen (e.g., a high definition display screen) and may enable a primary image displayed on the high definition display screen to be viewed by a user having their eyes positioned behind (e.g., looking through) the left and right primary lens regions. The optical element may also include left and right peripheral lens regions that may each be positioned over a secondary display screen (e.g., a low definition display screen) and may enable images displayed on the secondary display screen to be viewed by the user (e.g., in their peripheral vision, while viewing through the left and right primary lens regions).
The left and right peripheral lens regions may be formed of frosted glass and/or a translucent material such that images viewed through the left and right peripheral lens regions may have an appearance comparable or similar (e.g., unfocused, unclear) to peripheral vision (e.g., when a user views a real world environment without virtual reality headgear). The left and right peripheral lens regions may be formed of a textured material. In an embodiment, the left peripheral lens region may be connected to the left primary lens region via a textured material and the right peripheral lens region may be connected to the right primary lens region via a textured material. In an embodiment, the transparency may gradually decrease from the outer edge of the left primary lens region to the outer edge of the left peripheral lens region and from the outer edge of the right primary lens region to the outer edge of the right peripheral lens region (e.g., via a change in texture, such as a gradually increasing surface roughness).
While the disclosed embodiments are generally described in the context of virtual reality headgear, it should be understood that virtual reality headgear as provided herein may also be used in other contexts and with other projection and/or viewing systems. For example, the virtual reality headgear may use an external screen or projection member. In addition, the virtual reality headgear may utilize an external control system that may send commands and/or instructions for the images to be displayed for each virtual reality headgear within the environment. Thus, the particular configuration (e.g., material, shape, size) of the virtual reality headgear may be implemented according to the desired end use.
Fig. 1 is a front perspective view of an optical element 14 (e.g., one or more lenses; lens assemblies) of a virtual reality headset 10 and a portion of an electrical housing 12 (e.g., shell) of the virtual reality headset 10, according to an embodiment of the disclosure. In one embodiment, the virtual reality headset 10 may include an optical element 14 and an electrical housing 12 that holds the optical element 14. The optical element 14 may be formed of a single material, such as polymethyl methacrylate (PMMA), an optically transparent material (e.g., having at least 80 percent, 85 percent, 90 percent, 92 percent or more optical transmission or light propagation), or any other material including a suitable refractive index. The optical element 14 may also be formed from a plurality of different materials such that the different materials form different sections of the optical element 14. Different materials may each occupy equal sections of the optical element 14 or sections of variable size of the optical element 14. The electrical housing 12 may include or enclose a plurality of electrical components including one or more display screens, audio elements, special effect elements, controllers, and the like. The electrical housing 12 may contain the optical element 14, and the optical element 14 may be detachable from the electrical housing 12.
The electrical housing 12 may include one or more display screens to display images (e.g., video feeds, pictures) that can be viewed through the optical element 14 (e.g., by a user wearing the virtual reality headset 10). The one or more display screens may include a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED) display, or any other suitable display type. The one or more display screens may include a variety of display screen types. For example, one of the display screens may correspond to an LCD display and another display screen may correspond to an OLED display. In addition, the one or more display screens may include a high definition display screen and/or a low definition display screen. As set forth above, the virtual reality headset 10 may include multiple display screens that may display different images. The optical element 14 may be at least partially translucent in all sections of the optical element 14 (e.g., one or more sections may be transparent or translucent to enable light to propagate through the optical element 14; not opaque in any section of the optical element 14). However, it should be understood that the degree to which the material is translucent may vary across the optical element 14. The optical element 14 may also include one or more opaque regions that may be consistent with a theme or image being viewed through the optical element 14. In this way, the optical element 14 permits a user wearing the virtual reality headset 10 to view the image(s) displayed on the display screen(s). The optical element 14 may be coupled to the electrical housing 12, but the optical element 14 may also be detachable or removable from the electrical housing 12. For ease of discussion, the virtual reality headset 10 and its components may be described with reference to a vertical axis or direction 2, a longitudinal axis or direction 4, and/or a lateral axis or direction 6. The virtual reality headset 10 may also have a centerline 8 extending in the longitudinal direction 4 and separating the left and right portions of the virtual reality headset 10.
Fig. 2 is a rear view of a virtual reality headgear 10 having an optical element 14 according to an embodiment of this disclosure, the optical element 14 including left and right primary lens regions 16, 18 (e.g., left/first primary lens region 16 and right/second primary lens region 18) and left and right peripheral lens regions 20, 22 (e.g., left/first peripheral lens region 20 and right/second peripheral lens region 22). The left and right primary lens regions 16, 18 may together be referred to as a primary lens region or primary optical region. The left and right peripheral lens regions 20, 22 may collectively be referred to as a peripheral lens region or a peripheral optical region. In the illustrated embodiment, the optical element 14 may include left and right primary lens regions 16, 18 for viewing primary content (e.g., images, video) and left and right surrounding lens regions 20, 22 for viewing surrounding image effects and/or additional special effects that enhance the immersive nature of the primary content. Additional special effects may include non-visual special effects including, but not limited to, wind effects, sound effects, vibration effects, scent effects, haptic effects, and the like. The left primary lens region 16 may be disposed on a first lateral side (e.g., left side) of the virtual reality headset 10 and the right primary lens region 18 may be disposed on a second lateral side (e.g., right side) of the virtual reality headset 10. The left and right primary lens regions 16, 18 may be shaped to form a notched member 26 (e.g., a gap) of the virtual reality headset 10 that may fit over the nose of a user. It should be appreciated that the left and right primary lens regions 16, 18 may be any suitable shape and/or have any configuration to correspond to the desired overall design for the virtual reality headset 10. The left and right primary lens regions 16, 18 may be positioned on the virtual reality headset 10 to correspond to and sit in front of the left and right eyes of the user when the virtual reality headset 10 is worn by the user. The left primary lens region 16 may be separated (e.g., physically separated) from the right primary lens region 18 and/or may be separated by a portion of the electrical housing 12 of the virtual reality headset 10 (e.g., the portion defining the notched member 26 of the virtual reality headset 10).
The textured edge 28 may be presented around all or some of the respective peripheral edges of the left and right primary lens regions 16, 18. For example, the textured edge 28 may transition the left primary lens region 16 to the left peripheral lens region 20 and the right primary lens region 18 to the right peripheral lens region 22. The textured edge 28 surrounding the left primary lens region 16 may be separated from the textured edge 28 surrounding the right primary lens region 18. The textured edge 28 may also include one or more embedded sensors that may detect the positioning of the virtual reality headset 10 or be used to collect additional data on the virtual reality headset 10. In an embodiment, the left and right primary lens regions 16, 18 may be formed of a transparent material, the left and right peripheral lens regions 20, 22 may be formed of a translucent material, and the textured edge 28 may be positioned between the transparent and translucent materials. The textured edge 28 may have a surface roughness that is greater than the surface roughness of the left and right primary lens regions 16, 18 and/or the surface roughness of the left and right peripheral lens regions 20, 22. For example, the textured edge 28 may have a texture that is visible and/or detectable by touch, while the left and right primary lens regions 16, 18 and/or the left and right surrounding lens regions 20, 22 may not have any texture (e.g., a smooth surface) that is visible and/or detectable by touch. As another example, the textured edge 28 may have a texture that is visible and/or detectable by touch, and the left and right surrounding lens regions 20, 22 may also have a texture that is visible and/or detectable by touch; however, the surface roughness of the textured edge 28 may be greater than the surface roughness of the left and right peripheral lens regions 20, 22. In an embodiment, the surface roughness of the textured edge 28 may be less than the surface roughness of the left and right peripheral lens regions 20, 22. More generally, in one embodiment, the texture characteristics of the textured edge 28 may be different from the texture characteristics of the left and right peripheral lens regions 20, 22. Thus, the textured edge 28 may be considered to be distinct from the left and right peripheral lens regions 20, 22 even though texture is present in these sections of the optical element 14. It should be appreciated that the textured edge 28 is an optional feature and may be omitted. Alternatively, the left and right primary lens regions 16, 18 may transition to the left and right peripheral lens regions 20, 22 without any distinct intermediate sections (e.g., having different characteristics; without the textured edge 28). The left and right peripheral lens regions 20, 22 may each include the same or varying surface roughness as compared to the textured edge 28. For example, the left peripheral lens region 20 may have a greater surface roughness than the right peripheral lens region 22.
The left and right primary lens regions 16, 18 may enable a user to view images displayed on primary screens positioned in the virtual reality headset 10. It should be appreciated that the left primary lens region 16 may enable viewing of different content than the right primary lens region 18 based on the image displayed by the virtual reality headset 10. For example, the portion of the image exposed to the left primary lens region 16 may be different from the portion of the image exposed to the right primary lens region 18. As shown, the left primary lens region 16 may be associated with a respective opaque divider 31 (e.g., having a tapered or funnel shape) that extends toward the primary screen to limit viewing of the primary content to the left primary lens region 16. Although only the opaque divider 31 for the left primary lens region 16 is shown in fig. 2 for image clarity, it should be appreciated that the right primary lens region 18 may also be associated with a corresponding opaque divider having the same characteristics. As discussed above, the left and right primary lens regions 16, 18 may be formed of PMMA or another optically transparent material. Further, the left and right primary lens regions 16, 18 may be formed from separate pieces of material and may be held in place relative to one another via the electrical housing 12 or other frame structure, or the left and right primary lens regions 16, 18 may be formed from a single piece of material (e.g., a one-piece construction).
As noted, the textured edge 28 may be presented around all or some of the periphery of the left primary lens region 16, and may also be presented around all or some of the periphery of the right primary lens region 18. For example, the textured edge 28 may form a gradient texture that transitions into a solid, uniform texture that extends across the left and right peripheral lens regions 20 and 22. The left and right peripheral lens regions 20, 22 may have greater blur (e.g., be more opaque) than the left and right primary lens regions 16, 18. This may enable the image viewed through the left and right peripheral lens regions 20, 22 to appear blurred in a manner corresponding to the appearance of peripheral vision. As discussed above, the left and right peripheral lens regions 20, 22 may be formed of frosted glass or another translucent material. The translucent material may be a plastic material, a fabric material, or any suitable lens material configured to be positioned in front of the user's eyes to be within and/or to affect the user's field of view, to allow light to pass through the material, and/or to enable the user to view images and/or objects through the material. Indeed, the term "lens region" refers to any portion that is configured to be positioned in front of the user's eyes to be within and/or to affect the user's field of view, allow light to pass through the material, and/or enable the user to view images and/or objects through the material. The left and right primary lens regions 16, 18 and the left and right surrounding lens regions 20, 22 may be formed of the same material (e.g., PMMA or other optically transparent material), but the left and right surrounding lens regions 20, 22 may be treated such that the left and right surrounding lens regions 20, 22 have greater blur than the left and right primary lens regions 16, 18. For example, the left and right peripheral lens regions 20, 22 may be textured (e.g., via etching) and/or coated (e.g., via paint). In this case, the left primary lens region 16 and the left peripheral lens region 20 may be formed from a single piece of material (e.g., a one-piece construction), and the right primary lens region 18 and the right peripheral lens region 22 may be formed from a single piece of material (e.g., a one-piece construction). In an embodiment, the left and right primary lens regions 16, 18 and the left and right peripheral lens regions 20, 22 may be formed from a single piece of material (e.g., a one-piece construction). However, it should be appreciated that the left and right peripheral lens regions 20, 22 may be formed from separate pieces of material (e.g., separate from each other and from the left and right primary lens regions 16, 18) and may be held in place relative to each other via the electrical housing 12 or other frame structure.
In an embodiment, each of the left and right peripheral lens regions 20, 22 may be formed from a plurality of materials corresponding to sections of each of the left and right peripheral lens regions 20, 22. For example, each of the left and right peripheral lens regions 20, 22 may include an outer edge of greater blur (e.g., formed from a band of black plastic or paint) and, near the adjacent primary lens region (i.e., the left primary lens region 16 for the left peripheral lens region 20 and the right primary lens region 18 for the right peripheral lens region 22), an inner edge of lesser blur (e.g., formed from a band of mixed plastic or paint). The blur may vary between the outer edge of greater blur and the inner edge of lesser blur. For example, the blur may transition gradually between the outer edge of greater blur and the inner edge of lesser blur. This may cause the color effects displayed on the secondary screens to be viewed through the left and right peripheral lens regions 20, 22 with light and/or color similar to peripheral vision.
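For illustration only, the gradual inner-to-outer blur transition described above can be prototyped numerically. The following is a minimal sketch in Python; the linear ramp and the endpoint blur levels are assumptions for illustration, as the disclosure only requires a gradual transition:

```python
# Minimal sketch of a blur profile across a peripheral lens region.
# The linear ramp and the 0.2/0.9 endpoint blur levels are assumed
# for illustration; they are not values from this disclosure.

def blur_profile(x: float, inner_blur: float = 0.2, outer_blur: float = 0.9) -> float:
    """Blur level at normalized position x, where x = 0.0 is the inner edge
    (adjacent to the primary lens region) and x = 1.0 is the outer edge."""
    x = min(max(x, 0.0), 1.0)  # clamp to the lens region
    return inner_blur + (outer_blur - inner_blur) * x

# Sample the profile at a few positions across the peripheral region.
print([round(blur_profile(x / 4), 3) for x in range(5)])
# [0.2, 0.375, 0.55, 0.725, 0.9]
```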
The left and right primary lens regions 16, 18 may each have a generally circular shape (e.g., have curved edges), although the left and right primary lens regions 16, 18 may each have straight edges along the notched member 26. The left and right primary lens regions 16, 18 may also include curved edges along the notched member 26 instead of straight edges, or any other suitable edge shape corresponding to the notched member 26. The left and right peripheral lens regions 20, 22 may each have a laterally extending portion and a longitudinally extending portion. For example, referring to the left peripheral lens region 20 shown in fig. 2, the left peripheral lens region 20 includes a laterally extending portion 21 (e.g., in the same plane as the left primary lens region 16) and a longitudinally extending portion 23 joined together via a curved portion 25. The laterally extending portion 21 and the longitudinally extending portion 23 may be oriented at any suitable angle relative to each other, such as an angle between 60 and 120 degrees, 70 and 110 degrees, or 80 and 100 degrees. In an embodiment, the laterally extending portion 21 may extend from the first central portion 27 to the second central portion 29 to wrap around the left primary lens region 16. The right surrounding lens region 22 may have the same features. It should be appreciated that the various portions of the left and right peripheral lens regions 20, 22 may have the same or different characteristics. For example, the laterally extending portion 21 and the longitudinally extending portion 23 may have different degrees of blur (e.g., due to different textures, coatings, or base materials). In an embodiment, the left and right peripheral lens regions 20, 22 may not include any laterally extending portions, but may instead include only longitudinally extending portions. For example, the left and right primary lens regions 16, 18 may extend and/or be shaped to extend laterally across the virtual reality headset 10. Thus, it should be appreciated that the left and right peripheral lens regions 20, 22 may have any of a variety of shapes and configurations to accommodate the left and right primary lens regions 16, 18 and to work in conjunction with the left and right primary lens regions 16, 18, regardless of the shape and configuration of the left and right primary lens regions 16, 18.
As discussed in more detail below, the virtual reality headset 10 may include one or more primary screens 30 (e.g., a primary screen; high definition) positioned in front of (e.g., aligned with; overlapping from the perspective of a user wearing the virtual reality headset 10) the left and right primary lens regions 16, 18 and a plurality of secondary screens 32 (e.g., low definition) positioned in front of (e.g., aligned with; overlapping from the perspective of a user wearing the virtual reality headset 10) the left and right surrounding lens regions 20, 22. In particular, at least one of the secondary screens 32 may be positioned in front of the left surrounding lens region 20 and at least one of the secondary screens 32 may be positioned in front of the right surrounding lens region 22. The one or more primary screens 30 and the plurality of secondary screens 32 may include a variety of display types, including Liquid Crystal Displays (LCDs), Organic Light Emitting Diode (OLED) displays, or any other suitable display type. For example, one of the one or more primary screens 30 may correspond to an LCD display and the plurality of secondary screens 32 may correspond to OLED displays. It should be appreciated that any suitable combination of display types may be utilized by the one or more primary screens 30 and/or the plurality of secondary screens 32. The images displayed on the plurality of secondary screens 32 may be coordinated with the images displayed on the one or more primary screens 30. This may enable content displayed on the plurality of secondary screens 32 to enhance content displayed on the one or more primary screens 30. For example, the primary screen 30 may be instructed to display a video of a character that speaks of a sunrise and/or points to a virtual area that is outside of the field of view (e.g., in the user's peripheral vision), and the secondary screens 32 may be instructed to display a bright light effect to simulate the sunrise and/or an event within the virtual area. It should be appreciated that speakers within the virtual reality headset 10 (and/or within the real world environment) may provide audio effects corresponding to images presented via the virtual reality headset 10.
With the foregoing in mind, fig. 3 is a front perspective view of a virtual reality headset 10 having an optical element 14, the optical element 14 including left and right primary lens regions 16, 18 and left and right peripheral lens regions 20, 22, according to an embodiment of this disclosure. The primary screen 30 can be seen in fig. 3 disposed immediately in front of the left and right primary lens regions 16, 18. A partition may be present between the left and right primary lens regions 16, 18 to isolate content displayed on the left side of the primary screen 30 from the right side of the primary screen 30. The partition may include one or more dividers extending along the longitudinal axis 4 between the primary screen 30 and the left and right primary lens regions 16, 18. In an embodiment, the partition may include one or more dividers that isolate and/or block light from the primary screen 30 from propagating through the left and right peripheral lens regions 20, 22. For example, two tapered, opaque dividers 31 (only one shown for image clarity) and/or opaque dividers of other shapes may connect each of the left and right primary lens regions 16, 18 to the primary screen 30, thus isolating the left and right peripheral lens regions 20, 22 (e.g., entirely or substantially) from the primary screen 30. The secondary screens 32 may be positioned on the left and right lateral sides of the electrical housing 12 at an angle (e.g., relative to the primary screen 30; to match/align with the left and right peripheral lens regions 20, 22) to provide light and special effects for viewing by a user through the left peripheral lens region 20 (over the left secondary screen 32) and through the right peripheral lens region 22 (over the right secondary screen 32).
In one embodiment, the virtual reality headset 10 may include three display screens: a single high resolution primary screen 30 to display primary content viewable through the left and right primary lens regions 16, 18, and two secondary screens 32 to display special effects or light content viewable through the left surrounding lens region 20 (corresponding to the secondary screen 32 on the left) and the right surrounding lens region 22 (corresponding to the secondary screen 32 on the right). For example, the primary screen 30 may play a sunset video viewable through the left and right primary lens regions 16, 18. The secondary screens 32 may display a light effect matching the color of the sunset displayed on the primary screen 30. The light effects of the sunset may be viewed through the left and right surrounding lens regions 20, 22 and may assist in providing an enhanced virtual reality experience for the user. The left secondary screen 32 may also display an image of the sun at a first time that is viewable through the left surrounding lens region 20. At a later time, the left secondary screen 32 may no longer display the image of the sun and the right secondary screen 32 may display an image of the sun that can be viewed through the right surrounding lens region 22. This may provide, via the virtual reality headset 10, the effect of the sun rising on the left and setting on the right. It should be appreciated that different content may be displayed on the secondary screens 32 corresponding to the left and right peripheral lens regions 20, 22, depending on the desired effect. The primary screen 30 and the secondary screens 32 may also form a single display screen (e.g., one piece).
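The time-coordinated left-to-right sun effect described above can be expressed as a simple scheduler. The following is a minimal sketch in Python, assuming hypothetical Screen objects and a fixed effect duration (neither is specified by this disclosure):

```python
# Hypothetical sketch of the sun "rising on the left, setting on the right"
# coordination described above. The Screen class and its show() call are
# assumed stand-ins for real display drivers, not an API of this disclosure.
import time

class Screen:
    """Stand-in for a display driver; real hardware would blit frames here."""
    def __init__(self, name: str):
        self.name = name

    def show(self, content: str) -> None:
        print(f"{self.name}: {content}")

def play_sun_arc(primary, left_secondary, right_secondary, duration_s=10.0):
    """Step the sun effect from the left secondary screen to the right one
    while the primary screen plays the main (e.g., sunset) video."""
    start = time.monotonic()
    while (elapsed := time.monotonic() - start) < duration_s:
        t = elapsed / duration_s          # 0.0 -> 1.0 over the effect
        primary.show(f"sunset video frame @ t={t:.2f}")
        if t < 0.5:
            left_secondary.show("sun image (bright)")
            right_secondary.show("warm glow only")
        else:
            left_secondary.show("warm glow only")
            right_secondary.show("sun image (bright)")
        time.sleep(1 / 30)                # ~30 Hz update

if __name__ == "__main__":
    play_sun_arc(Screen("primary"), Screen("left secondary"), Screen("right secondary"))
```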
The discussion herein of the electrical components of the electrical housing 12 may be better understood in view of the operation and features of the optical element 14 and the screens 30, 32 described above. Fig. 4 is a front perspective view of the virtual reality headgear 10 according to an embodiment of the present disclosure. The electrical components of the electrical housing 12 may include the one or more primary screens 30 and the plurality of secondary screens 32. In an embodiment, the top of the electrical housing 12 may include or enclose one or more driver boards 34 and a controller 36 (e.g., an electronic controller) that may be used to send instructions corresponding to images to be displayed on the primary screen 30 and/or the secondary screens 32 of the virtual reality headset 10. The controller 36 may include one or more processors and may be disposed in the top of the electrical housing 12, and/or may be separate from the virtual reality headset 10 such that a central controller 38 (described in more detail below) may act to transmit or send commands to the virtual reality headset 10 to display an image according to a desired effect. In addition, the controller 36 may be capable of generating surrounding display content for the secondary screens 32 based on primary display content, received from an external controller or processor, to be displayed on the primary screen 30.
The controller 36 may be operative to send commands to the primary screen 30 and/or the secondary screens 32 of the virtual reality headset 10 to display an image. The controller 36 may send different image commands to each display screen of the virtual reality headset 10. For example, the controller 36 may send commands to the primary screen 30 to display primary images and/or primary video feed content. The controller 36 may then send a different surrounding image and/or effect to each of the secondary screens 32 positioned on each side of the virtual reality headset 10. In one embodiment, the primary screen 30 may be commanded via the controller 36 to display a dark passage or important video information intended to be the primary focus on the primary screen 30. In this case, the controller 36 may not send any commands to project an image on the secondary screens 32, so that focus is directed to the primary image displayed on the primary screen 30. In another example, the primary effect may be a beach scene, and the controller 36 may send commands to the primary screen 30 to display beach images, and may send commands to the secondary screens 32 to display bright light effects, beach images, or other corresponding effects (e.g., bright light to represent the sun) on each of the secondary screens 32. Thus, it should be appreciated that the primary screen 30 and the secondary screens 32 may display similar images (e.g., similar quality and/or imagery, such as the ocean, sand, and sky of the beach images) while the characteristics (e.g., texture or blur) of the left and right surrounding lens regions 20, 22 alter the propagation of light to simulate peripheral vision, and/or the primary screen 30 and the secondary screens 32 may display different images (e.g., different quality and/or imagery, such as the ocean, sand, and sky in high definition on the primary screen 30, and soft blue and yellow colors across the secondary screens 32) to simulate peripheral vision. In this way, the controller 36 may act to send image commands to the primary screen 30 and/or the secondary screens 32 based on the desired images and effects to be displayed on the virtual reality headset 10.
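One way the controller 36 could derive matching surrounding colors from a primary frame is by sampling the frame's edge bands. Here is a minimal sketch, assuming NumPy frames in H x W x 3 layout; the band width and the function name are illustrative choices, not part of this disclosure:

```python
# Hedged sketch: deriving peripheral light-effect colors from a primary frame
# by averaging the colors along its left and right edges. The 32-pixel band
# width is an arbitrary illustration value.
import numpy as np

def peripheral_colors(frame: np.ndarray, band: int = 32):
    """Return (left_rgb, right_rgb) average colors from the edge bands of an
    H x W x 3 uint8 frame, for display on the left/right secondary screens."""
    left = frame[:, :band].reshape(-1, 3).mean(axis=0)
    right = frame[:, -band:].reshape(-1, 3).mean(axis=0)
    return tuple(left.astype(int)), tuple(right.astype(int))

# Example: a frame that is blue on the left and orange on the right.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[:, :320] = (30, 60, 200)
frame[:, 320:] = (230, 140, 30)
print(peripheral_colors(frame))  # approximately ((30, 60, 200), (230, 140, 30))
```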
In an embodiment, a central controller 38 (e.g., an electronic controller) separate from the virtual reality headset 10 may function to send instructions to one or more virtual reality headsets 10 within the environment. This may enable the central controller 38 to transmit instructions corresponding to the same image to multiple virtual reality headsets 10 within the environment. For example, multiple users may be within an environment, and the central controller 38 may transmit instructions to display the same image on the primary screen 30 of each virtual reality headset 10. The central controller 38 may also send instructions to display images and/or effects on the secondary screens 32 of each virtual reality headset 10. Further, the controller 36 may be capable of displaying secondary content on the secondary screens 32 based on instructions associated with the primary content received from the central controller 38. For example, the controller 36 may select or determine a certain video or image to be displayed on the primary screen 30 and also send instructions to the secondary screens 32 to display a supplemental image or effect based on instructions received from the central controller 38. This may enable a unified experience for multiple users within the environment. The central controller 38 may also function to send unique instructions to each virtual reality headset 10 within the environment. This may enable each virtual reality headset 10 to display a different image on the primary screen 30 and/or the secondary screens 32. This may enable each user to have a unique experience within the environment while also offloading image processing to the central controller 38 within the environment.
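A central controller broadcasting a shared cue to every headset on a local network could look like the following minimal sketch. The message schema ("cue", "clip", "t"), the port, and UDP broadcast itself are invented for illustration; the disclosure does not specify a transport or protocol:

```python
# Hedged sketch of a central controller 38 broadcasting a display cue to
# every headset on the local network. All names and values here are
# illustrative assumptions, not a protocol defined by this disclosure.
import json
import socket
import time

BROADCAST_ADDR = ("255.255.255.255", 50555)  # assumed port

def broadcast_cue(clip_id: str, timestamp_s: float) -> None:
    """Send one JSON cue telling all listening headsets which clip to play."""
    msg = json.dumps({"cue": "play", "clip": clip_id, "t": timestamp_s}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(msg, BROADCAST_ADDR)

# Each headset's controller 36 would listen on the same port, start the named
# clip on its primary screen 30, and derive or look up its secondary content.
broadcast_cue("beach_scene", time.time())
```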
As discussed herein, the left and right peripheral lens regions 20, 22 may be treated to enable the left and right peripheral lens regions 20, 22 to simulate peripheral vision. In an embodiment, all or some portions of the left and right peripheral lens regions 20, 22 may include a film that may be electrically activated. The blur of the film may vary based on the current applied to the film. The film may be laid on top of an electrical wiring structure that allows current to be sent through the film to provide a desired level of blur for the left and right surrounding lens regions 20, 22. It should be appreciated that the film may be used in combination with other techniques, such as translucent base materials, frosted base materials, flexible and/or non-flexible materials, and/or texture-based materials. It should also be appreciated that a different current may be sent to the left surrounding lens region 20 than to the right surrounding lens region 22, such that each of the left and right surrounding lens regions 20, 22 may have the same or different blur.
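For illustration, the current-to-blur relationship described above could be driven by a simple mapping in the controller firmware. Below is a minimal sketch; the linear mapping and the 0 to 20 mA range are assumptions for illustration, not device specifications from this disclosure:

```python
# Hedged sketch: mapping a requested blur level to a drive current for an
# electrically activated film. The linear mapping and the 0-20 mA range are
# illustrative assumptions, not specifications from this disclosure.

def film_drive_current_ma(blur_level: float,
                          min_ma: float = 0.0,
                          max_ma: float = 20.0) -> float:
    """Convert blur_level in [0.0, 1.0] (clear -> fully diffuse) to a drive
    current in milliamps, clamping out-of-range requests."""
    blur_level = min(max(blur_level, 0.0), 1.0)
    return min_ma + blur_level * (max_ma - min_ma)

# The left and right films can be driven independently, per the text above.
left_ma = film_drive_current_ma(0.8)   # strongly diffused left region
right_ma = film_drive_current_ma(0.3)  # lightly diffused right region
print(left_ma, right_ma)               # 16.0 6.0
```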
The virtual reality headset 10 may include various other features. For example, in one embodiment, the electrical housing 12 may include a game engine that provides a camera view, enabling images to be rendered using a virtual camera. The game engine may include a processor and software including morphing algorithms to alter the image projected on the secondary screens 32 so as not to distract from the primary content, while still providing enhanced light and image effects to mimic peripheral vision. It should also be appreciated that the secondary screens 32 may be replaced or augmented with one or more light emitters (e.g., light emitting diodes [LEDs]) to provide additional visual effects visible through the left and right peripheral lens regions 20, 22 in a low cost, compact form. In this case, the one or more light emitters may be illuminated in coordination with the image on the primary screen 30 to provide different colors and/or brightnesses to simulate peripheral vision.
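As one interpretation of the morphing step described above, the peripheral frame could be blurred and desaturated before display so it reads as unfocused side vision rather than competing with the primary content. Below is a minimal sketch using Pillow; the blur radius and saturation factor are illustrative assumptions, not parameters from this disclosure:

```python
# Hedged sketch of one possible "morphing" step: soften and desaturate a
# peripheral frame before showing it on a secondary screen. The radius and
# saturation values are arbitrary illustration choices.
from PIL import Image, ImageEnhance, ImageFilter

def soften_for_periphery(frame: Image.Image,
                         blur_radius: float = 12.0,
                         saturation: float = 0.6) -> Image.Image:
    """Blur the frame, then reduce its color saturation."""
    out = frame.filter(ImageFilter.GaussianBlur(blur_radius))
    return ImageEnhance.Color(out).enhance(saturation)

if __name__ == "__main__":
    img = Image.new("RGB", (320, 240), (230, 140, 30))  # placeholder frame
    soften_for_periphery(img).save("peripheral_preview.png")
```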
Fig. 5 is a rear perspective view of virtual reality headgear 10 having optical elements 14 according to an embodiment of the present disclosure. As discussed above, the left and right peripheral lens regions 20, 22 may comprise a translucent material that is connected to (e.g., adjoins, surrounds) the left and right primary lens regions 16, 18. While the left and right peripheral lens regions 20, 22 may generally be isolated from the primary screen 30 positioned in front of the left and right primary lens regions 16, 18, some of the light from the primary screen 30 may diverge into the left and right peripheral lens regions 20, 22 (e.g., due to the connection between the left and right peripheral lens regions 20, 22 and the left and right primary lens regions 16, 18).
The electrical housing 12 may also include and/or support speakers to provide sound effects that augment the virtual reality experience and correspond to images displayed on the primary screen 30. For example, the primary screen 30 may display a bonfire, and the speaker may play a crackling sound effect to simulate the sound of firewood burning. The components of the virtual reality headset 10 may also include other haptic, sound, light, wind, and/or other special effect components to augment the primary image and to aid in the immersive experience for the user. This may work in combination with the left and right surrounding lens regions 20, 22, which enable a viewer to view enhanced light effects and/or other display elements that may enhance the immersive experience by expanding the user's view beyond the left and right primary lens regions 16, 18. As discussed herein, the left primary lens region 16 and the left peripheral lens region 20 may be of one-piece construction and/or may be coupled together (e.g., via an adhesive and/or welding) to form a single lens structure (e.g., continuous without gaps), and the left primary lens region 16 and the left peripheral lens region 20 may be distinguished from one another by differences in transparency, texture, and/or positioning/display alignment. Similarly, the right primary lens region 18 and the right surrounding lens region 22 may be of one-piece construction and/or coupled together (e.g., via an adhesive and/or welding) to form a single lens structure (e.g., continuous without gaps), and the right primary lens region 18 and the right surrounding lens region 22 may be distinguished from one another by differences in transparency, texture, and/or positioning/display alignment.
In addition, the virtual reality headgear 10 may be part of an amusement park attraction 46 that includes a ride vehicle 42. The ride vehicle 42 may be maneuvered to correspond to the images displayed by the virtual reality headset 10 (on the lenses of the virtual reality headset 10) according to the methods discussed above.
With the foregoing in mind, fig. 6 is a perspective view of an embodiment of a ride vehicle 42 and one or more users 44, each of whom may utilize a respective virtual reality headgear 10 in an amusement park attraction 46. As set forth above, the primary screen 30 and/or the secondary screens 32 may be configured to display images to the user 44 of the virtual reality headset 10 such that the one or more users 44 may view images (e.g., video feeds) during movement of the ride vehicle 42. In the illustrated embodiment, each user 44 of the amusement park attraction 46 has a corresponding virtual reality headgear 10 to view the images. The controller of the virtual reality headset 10 may be configured to send commands to each primary screen 30 and/or secondary screen 32 to output the same image to each optical element 14 such that a common video feed is viewable by each user 44. However, in an embodiment, the image may be different for each virtual reality headset 10, such that each user 44 may view a unique image (e.g., a unique video feed). For example, each user 44 may be assigned a particular role (e.g., captain, pilot) as part of the amusement park attraction 46. The respective virtual reality headgear of each user 44 may display video feed content specifically corresponding to the role assigned to the user 44 such that the user 44 may experience the amusement park attraction 46 from the perspective of the assigned role.
In an embodiment, the virtual reality headset 10 may be configured to output a combination of common video feed content and unique video feed content within the amusement park attraction 46. For example, the virtual reality headgear 10 may display the common video feed during an introductory portion of the amusement park attraction 46. During subsequent portions of the amusement park attraction 46, the controller of each of the virtual reality headgear 10 may send commands to the primary screen 30 and/or the secondary screens 32 to output unique video feed content (e.g., associated with a particular role) to the user 44. Users 44 with no role, or users 44 whose roles have no active tasks, may continue to receive the common video feed content, as in the sketch below. However, during some portions of the amusement park attraction 46, the virtual reality headgear 10 may output a unique video feed to each user 44 of the amusement park attraction 46. Additionally, in an embodiment, the virtual reality headset 10 may receive commands to display certain images and/or video feeds from a central controller that may be programmed to send commands corresponding to the ride vehicle 42 movements and/or other effects of the amusement park attraction 46.
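A minimal sketch of this role-based feed selection follows, assuming hypothetical clip names and a plain dictionary lookup (the disclosure does not specify how roles map to content):

```python
# Hypothetical role-to-feed mapping; clip names are invented for illustration.
from typing import Optional

ROLE_FEEDS = {
    "captain": "captain_feed.mp4",
    "pilot": "pilot_feed.mp4",
}
COMMON_FEED = "common_feed.mp4"  # shown to users without an active role/task

def feed_for(role: Optional[str], has_active_task: bool) -> str:
    """Pick a unique feed only for users whose role currently has a task;
    everyone else keeps watching the common video feed."""
    if role in ROLE_FEEDS and has_active_task:
        return ROLE_FEEDS[role]
    return COMMON_FEED

print(feed_for("pilot", True))    # pilot_feed.mp4
print(feed_for("pilot", False))   # common_feed.mp4
print(feed_for(None, False))      # common_feed.mp4
```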
In an embodiment, the image includes text-based information, pictures, video, or some combination thereof. For example, the amusement park attraction 46 may be a virtual reality type attraction such that the image includes a video image of a virtual reality environment. In another example, the image may include text-based instructions for the amusement park attraction 46. The text-based instructions may inform the user 44 about how to use the virtual reality headset 10 and/or perform actions that are part of the amusement park attraction 46.
Although only certain features have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention. While certain disclosed embodiments have been disclosed in the context of amusement parks or theme parks, it should be understood that certain embodiments may also be directed to other uses. Furthermore, it should be understood that certain elements of the disclosed embodiments may be combined with or interchanged with one another.
The technology presented and claimed herein is referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible, or purely theoretical. Furthermore, if any claims appended to the end of this specification contain one or more elements designated as "means for [performing] [a function]..." or "step for [performing] [a function]...," it is intended that such elements be interpreted under 35 U.S.C. § 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements not be interpreted under 35 U.S.C. § 112(f).

Claims (20)

1. A virtual reality headset, comprising:
a housing;
a primary display screen supported within the housing and configured to display a primary image;
a secondary display screen supported within the housing and configured to display a secondary image; and
an optical element comprising a primary optical zone and a surrounding optical zone, wherein the primary optical zone enables viewing of the primary image displayed on the primary display screen and the surrounding optical zone enables viewing of the secondary image displayed on the secondary display screen.
2. The virtual reality headset of claim 1, comprising a textured edge around at least a portion of the primary optical zone.
3. The virtual reality headset of claim 2, wherein the textured edge separates the primary optical zone from the surrounding optical zone.
4. The virtual reality headset of claim 1, wherein the primary image comprises a high definition image and the secondary image comprises a low definition image.
5. The virtual reality headset of claim 1, wherein the surrounding optical zone comprises a left surrounding optical zone on a left side of the virtual reality headset and a right surrounding optical zone on a right side of the virtual reality headset.
6. The virtual reality headset of claim 1, wherein the surrounding optical zone comprises a translucent material, a surface texture, or any combination thereof.
7. The virtual reality headset of claim 1, wherein the primary optical zone comprises polymethyl methacrylate (PMMA) or another optically transparent material comprising at least 85 percent optical transmission.
8. The virtual reality headset of claim 1, wherein the surrounding optical zone comprises one or more sensors, speakers, light emitting diodes, or any combination thereof.
9. The virtual reality headset of claim 1, wherein the primary display screen and the secondary display screen form a single display screen.
10. The virtual reality headset of claim 1, wherein the surrounding optical zone comprises a translucent film, and the virtual reality headset comprises circuitry configured to deliver current through the translucent film to adjust the blur of the surrounding optical zone.
11. A virtual reality headset, comprising:
a housing;
a primary display screen supported within the housing and configured to display a primary image;
a secondary display screen supported within the housing and configured to display a secondary image; and
an optical element comprising a primary optical zone and a peripheral optical zone, wherein the primary optical zone is positioned in alignment with the primary display screen and is transparent, and the peripheral optical zone is positioned in alignment with the secondary display screen and is translucent.
12. The virtual reality headset of claim 11, wherein the primary display screen comprises a high definition display and the secondary display screen comprises a low definition display.
13. The virtual reality headset of claim 11, wherein the surrounding optical zone is translucent due to a translucent base material, a surface texture, a coating, or any combination thereof.
14. The virtual reality headset of claim 13, wherein the surrounding optical zone is translucent due to an electrically activated film.
15. The virtual reality headset of claim 14, wherein the surrounding optical zone comprises a left surrounding lens zone on a left lateral side of the virtual reality headset and a right surrounding lens zone on a right lateral side of the virtual reality headset.
16. The virtual reality headset of claim 11, wherein the primary display screen and the secondary display screen form a single display screen.
17. A method of operating a virtual reality headset, the method comprising:
displaying a primary image from a viewer perspective on a primary display screen positioned in front of a transparent primary lens region of the virtual reality headset;
displaying a secondary image from the viewer perspective on a secondary display screen positioned in front of a translucent surrounding lens region of the virtual reality headset; and
coordinating, via one or more processors, the display of the primary image and the display of the secondary image to create a virtual environment having a surrounding visual effect.
18. The method of claim 17, wherein displaying the primary image comprises displaying the primary image with high definition and displaying the secondary image comprises displaying the secondary image with low definition.
19. The method of claim 17, comprising displaying an additional secondary image on an additional secondary display screen positioned in front of an additional translucent surrounding lens region of the virtual reality headset.
20. The method of claim 19, comprising coordinating, via the one or more processors, the display of the primary image, the display of the secondary image, and the display of the additional secondary image to create the virtual environment with a surrounding visual effect.
CN202280035120.8A 2021-05-14 2022-05-12 Front and surrounding viewing lens zones for virtual reality headgear Pending CN117355785A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US63/189051 2021-05-14
US63/242115 2021-09-09
US17/742214 2022-05-11
US17/742,214 US20220365349A1 (en) 2021-05-14 2022-05-11 Front and peripheral view lens areas for virtual reality headsets
PCT/US2022/029024 WO2022241136A1 (en) 2021-05-14 2022-05-12 Front and peripheral view lens areas for virtual reality headsets

Publications (1)

Publication Number Publication Date
CN117355785A (en)

Family

ID=89371510

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280035120.8A Pending CN117355785A (en) 2021-05-14 2022-05-12 Front and surrounding viewing lens zones for virtual reality headgear

Country Status (1)

Country Link
CN (1) CN117355785A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (ref country code: HK; ref legal event code: DE; ref document number: 40105615)