WO2022241136A1 - Front and peripheral view lens areas for virtual reality headsets - Google Patents


Info

Publication number
WO2022241136A1
WO2022241136A1 (PCT/US2022/029024)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual reality
primary
reality headset
images
display screen
Prior art date
Application number
PCT/US2022/029024
Other languages
English (en)
Inventor
Patrick John Goergen
Martin Evan Graham
Original Assignee
Universal City Studios Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/742,214 external-priority patent/US20220365349A1/en
Application filed by Universal City Studios Llc filed Critical Universal City Studios Llc
Priority to EP22727652.4A priority Critical patent/EP4338003A1/fr
Priority to CN202280035120.8A priority patent/CN117355785A/zh
Priority to CA3216715A priority patent/CA3216715A1/fr
Priority to JP2023570449A priority patent/JP2024521662A/ja
Priority to KR1020237043019A priority patent/KR20240008347A/ko
Publication of WO2022241136A1 publication Critical patent/WO2022241136A1/fr

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0123Head-up displays characterised by optical features comprising devices increasing the field of view

Definitions

  • the present disclosure relates generally to the field of virtual reality headsets. More specifically, embodiments of the present disclosure relate to lenses for virtual reality headsets.
  • a virtual reality headset may include one or more display screens that display images to a user of the virtual reality headset.
  • the displayed images may be viewed through lenses of the virtual reality headset.
  • the virtual reality headset may display the images and/or render special effects to reflect a variety of environments and/or scenarios.
  • the virtual reality headset may display a visual feature along with sound effects that correspond to the visual feature to aid in further immersing the user in a virtual reality environment.
  • a virtual reality headset includes a housing, a primary display screen supported within the housing and configured to display primary images, and a secondary display screen supported within the housing and configured to display secondary images.
  • the virtual reality headset may also include an optical element that includes a primary optical area and a peripheral optical area. Further, the primary optical area enables viewing of main content displayed on the primary display screen and the peripheral optical area enables viewing of secondary content displayed on the secondary display screen.
  • a virtual reality headset includes a housing, a primary display screen supported within the housing and configured to display primary images, and a secondary display screen supported within the housing and configured to display secondary images. Further, the virtual reality headset includes an optical element that includes a primary optical area and a peripheral optical area. The primary optical area is positioned to align with the primary display screen and is transparent, and the peripheral optical area is positioned to align with the secondary display screen and is translucent.
  • a method of operating a virtual reality headset includes displaying primary images on a primary display screen positioned in front of a transparent primary lens area of the virtual reality headset from a perspective of a viewer.
  • the method also includes displaying secondary images on a secondary display screen positioned in front of a translucent peripheral lens area of the virtual reality headset from the perspective of the viewer.
  • the method further includes coordinating, via one or more processors, the displaying of the primary images and the displaying of the secondary images to create a virtual environment with peripheral vision effects.
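The operating method above can be sketched as a short routine; the function and frame names below are illustrative assumptions, not part of the disclosure.

```python
# Minimal sketch (hypothetical names, not from the disclosure): coordinating
# primary (main) frames with secondary (peripheral-effect) frames so that
# both are presented for the same timestep of the virtual environment.

def coordinate_frames(primary_frames, secondary_frames):
    """Pair each primary frame with the peripheral frame scheduled for the
    same timestep; hold the last peripheral frame when none is scheduled."""
    paired = []
    last_secondary = None
    for i, main_frame in enumerate(primary_frames):
        if i < len(secondary_frames) and secondary_frames[i] is not None:
            last_secondary = secondary_frames[i]
        paired.append((main_frame, last_secondary))
    return paired

# A peripheral light effect is cued only on the final timestep.
frames = coordinate_frames(
    ["scene0", "scene1", "scene2"],
    [None, None, "bright_light"],
)
```

In this sketch, the one or more processors would drive the primary display screen with the first element of each pair and the secondary display screens with the second.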
  • FIG. 1 is a front perspective view of an optical element of a virtual reality headset and a portion of an electrical casing of the virtual reality headset, in accordance with an embodiment of the present disclosure;
  • FIG. 2 is a rear view of the virtual reality headset with the optical element that includes a left and right primary lens area and a left and right peripheral lens area, in accordance with an embodiment of the present disclosure;
  • FIG. 3 is a front perspective view of the virtual reality headset with the optical element that includes the left and right primary lens area and the left and right peripheral lens area, wherein certain portions of the virtual reality headset are removed and/or transparent to enable visualization of various features of the virtual reality headset, in accordance with an embodiment of the present disclosure;
  • FIG. 4 is a front perspective view of the virtual reality headset, wherein the electrical casing is transparent to enable visualization of various features internal to the electrical casing, in accordance with an embodiment of the present disclosure;
  • FIG. 5 is a rear perspective view of the virtual reality headset, in accordance with an embodiment of the present disclosure; and
  • FIG. 6 is a perspective view of an embodiment of a ride vehicle that may carry a user while the user wears the virtual reality headset, in accordance with an embodiment of the present disclosure.
  • Some existing virtual reality headsets may have one or two lenses that enable viewing an image displayed on a display screen of the virtual reality headset.
  • the present disclosure provides a virtual reality headset with an optical element (e.g., one or more lenses; a lens assembly) that includes a primary lens area (e.g., main lens area; center or forward lens area) of the optical element and a peripheral lens area (e.g., secondary lens area; side lens area) of the optical element.
  • the peripheral lens area of the optical element may be used to provide a peripheral view to the user at various times and/or in certain ways to enable a more immersive experience for the user.
  • the peripheral lens areas of the optical element may enable visualization of additional color effects or other special effects to enhance the user’s experience in viewing a main image at the primary lens area of the optical element.
  • a virtual reality headset that includes an optical element that enables viewing of content displayed on one or more screens located in the virtual reality headset.
  • the optical element may be partitioned into two areas (e.g., left and right areas; left and right of a centerline).
  • the optical element may include left and right primary lens areas (e.g., a left primary lens area and a right primary lens area), which may each be positioned over a primary display screen (e.g., high-definition display screen) and may enable a main image displayed on the high-definition display screen to be viewed by the user positioning their eyes behind (e.g., looking through) the left and right primary lens area.
  • the optical element may also include left and right peripheral lens areas (e.g., a left peripheral lens area and a right peripheral lens area), which may each be positioned over secondary display screens (e.g., low-definition display screens) and may enable images displayed on the secondary display screens to be viewed by the user (e.g., in their peripheral vision; while looking through the left and right primary lens area).
  • the left and right peripheral lens area may be formed from frosted glass and/or a semi-transparent material, such that the images viewed through the left and right peripheral lens area may have an appearance (e.g., unfocused, fuzzy) that compares to or is similar to peripheral vision (e.g., when the user views a real-world environment without the virtual reality headset).
  • the left and right peripheral lens area may be formed from a textured material.
  • the left peripheral lens area may be connected to the left primary lens area via a textured material
  • the right peripheral lens area may be connected to the right primary lens area via a textured material.
  • the transparency may gradually decrease (e.g., via changes in the texture, such as by a surface roughness that provides the texture gradually increasing) from an outer edge of the left primary lens area to an outer edge of the left peripheral lens area and from an outer edge of the right primary lens area to an outer edge of the right peripheral lens area.
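A gradual transparency ramp like the one described can be sketched numerically; the edge positions and transparency values below are illustrative assumptions, not values from the disclosure.

```python
# Minimal sketch (assumed values, not from the disclosure): linearly
# interpolating transparency from the outer edge of a primary lens area
# (fully transparent) to the outer edge of the adjoining peripheral lens
# area (mostly translucent), mirroring a gradually increasing surface
# roughness.

def transparency_at(position, primary_edge, peripheral_edge,
                    t_primary=1.0, t_peripheral=0.3):
    """Transparency at a lateral distance `position` from the lens center."""
    if position <= primary_edge:
        return t_primary          # inside the primary lens area: clear
    if position >= peripheral_edge:
        return t_peripheral       # at the outer peripheral edge
    frac = (position - primary_edge) / (peripheral_edge - primary_edge)
    return t_primary + frac * (t_peripheral - t_primary)
```

Any monotone ramp (linear, smoothstep, etc.) would serve; linear interpolation is shown only for concreteness.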
  • the virtual reality headsets may also be used in other contexts and with other projection and/or viewing systems.
  • the virtual reality headsets may use an external screen or projection component.
  • the virtual reality headsets may utilize an external control system that may send commands and/or instructions for the images to be displayed for each virtual reality headset within an environment. Accordingly, the particular construction of the virtual reality headset (e.g., materials, shape, size) may be implemented according to the desired end use.
  • FIG. 1 is a front perspective view of an optical element 14 (e.g., one or more lenses; a lens assembly) of a virtual reality headset 10 and a portion of an electrical casing 12 (e.g., housing) of the virtual reality headset 10, in accordance with an embodiment of the present disclosure.
  • the virtual reality headset 10 may include the optical element 14 and the electrical casing 12 that holds the optical element 14.
  • the optical element 14 may be formed from a single material, such as polymethyl methacrylate (PMMA) or another optically clear material.
  • the optical element 14 may also be formed from multiple different materials, such that different materials form different sections of the optical element 14. The different materials may each take up equal sections of the optical element 14 or variable size sections of the optical element 14.
  • the electrical casing 12 may include or surround multiple electrical components, including one or more display screens, audio elements, special effect elements, controllers, and the like. The electrical casing 12 may contain the optical element 14, and the optical element 14 may be detachable from the electrical casing 12.
  • the electrical casing 12 may include one or more display screens to display the image (e.g., video feed, picture) that can be viewed through the optical element 14 (e.g., by a user wearing the virtual reality headset 10).
  • the one or more display screens may include a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or any other suitable display type.
  • the one or more display screens may include a variety of display screen types. For example, one of the display screens may correspond to an LCD display, and another display screen may correspond to an OLED display. Additionally, the one or more display screens may include high-definition display screens and/or low-definition display screens.
  • the virtual reality headset 10 may include multiple display screens that may display different images.
  • the optical element 14 may be at least partially translucent in all sections of the optical element 14 (e.g., one or more sections may be transparent or translucent to enable light transmission through the optical element 14; not opaque in any section of the optical element 14). However, it should be understood that the degree to which the material is translucent may vary across the optical element 14.
  • the optical element 14 may also include one or more opaque areas that may coincide with the theme or image viewed through the optical element 14. In this way, the optical element 14 permits the user wearing the virtual reality headset 10 to view the image(s) displayed on the one or more display screens.
  • the optical element 14 may be coupled to the electrical casing 12, but the optical element 14 may also be detachable or removable from the electrical casing 12.
  • the virtual reality headset 10 and its components may be described with reference to a vertical axis or direction 2, a longitudinal axis or direction 4, and/or a lateral axis or direction 6.
  • the virtual reality headset 10 may also have a centerline 8 that extends in the longitudinal direction 4 and separates a left portion and a right portion of the virtual reality headset 10.
  • FIG. 2 is a rear view of the virtual reality headset 10 with the optical element 14 that includes left and right primary lens areas 16, 18 (e.g., a left/first primary lens area 16 and a right/second primary lens area 18) and left and right peripheral lens areas 20, 22 (e.g., a left/first peripheral lens area 20 and a right/second peripheral lens area 22), in accordance with an embodiment of the present disclosure.
  • the left and right primary lens areas 16, 18 may be referred to as a primary lens area or a primary optical area.
  • the left and right peripheral lens areas 20, 22 may be referred to as a peripheral lens area or a peripheral optical area.
  • the optical element 14 may include the left and right primary lens areas 16, 18 to view main content (e.g., images, videos) and the left and right peripheral lens areas 20, 22 to view peripheral image effects and/or additional special effects that enhance the immersive nature of the main content.
  • the additional special effects may include non-visual special effects, including but not limited to wind effects, sound effects, vibration effects, scent emitting effects, haptic effects, and the like.
  • the left primary lens area 16 may be disposed on a first lateral side (e.g., left side) of the virtual reality headset 10 and the right primary lens area 18 may be disposed on a second lateral side (e.g., right side) of the virtual reality headset 10.
  • the left and right primary lens areas 16, 18 may be shaped to form a notched component 26 (e.g., gap) of the virtual reality headset 10 that may be fitted to a nose of the user. It should be understood that the left and right primary lens areas 16, 18 may be any suitable shape and/or have any configuration to correspond to a desired overall design for the virtual reality headset 10.
  • the left and right primary lens areas 16, 18 may be positioned on the virtual reality headset 10 to correspond to and be in front of a left eye and a right eye of the user while the user wears the virtual reality headset 10.
  • the left primary lens area 16 may be separate from the right primary lens area 18 (e.g., physically separate) and/or may be divided by a portion of the electrical casing 12 (e.g., that defines the notched component 26) of the virtual reality headset 10.
  • a textured edge 28 may be present about all or some of respective circumferences of the left and right primary lens areas 16, 18.
  • the textured edge 28 may transition the left primary lens area 16 to the left peripheral lens area 20 and the right primary lens area 18 to the right peripheral lens area 22.
  • the textured edge 28 that surrounds the left primary lens area 16 may be separate from the textured edge 28 that surrounds the right primary lens area 18.
  • the textured edge 28 may also include one or more embedded sensors that may detect a location of the virtual reality headset 10, or be used to collect additional data on the virtual reality headset 10.
  • the left and right primary lens areas 16, 18 may be formed from a transparent material, the left and right peripheral lens areas 20, 22 may be formed from a translucent material, and the textured edge 28 may be positioned between the transparent material and the translucent material.
  • the textured edge 28 may have a surface roughness that is greater than the surface roughness of the left and right primary lens areas 16, 18 and/or the surface roughness of the left and right peripheral lens areas 20, 22.
  • the textured edge 28 may have texture that is visible and/or detectable by touch, while the left and right primary lens areas 16, 18 and/or the left and right peripheral lens areas 20, 22 may not have any texture that is visible and/or detectable by touch (e.g., smooth surfaces).
  • the textured edge 28 may have texture that is visible and/or detectable by touch, and the left and right peripheral lens areas 20, 22 may also have texture that is visible and/or detectable by touch; however, the surface roughness of the textured edge 28 may be greater than the surface roughness of the left and right peripheral lens areas 20, 22. In an embodiment, the surface roughness of the textured edge 28 may be less than the surface roughness of the left and right peripheral lens areas 20, 22. More generally, in an embodiment, characteristics of the texture of the textured edge 28 may be different from characteristics of the texture of the left and right peripheral lens areas 20, 22. Thus, the textured edge 28 may be considered distinct from the left and right peripheral lens areas 20, 22 even though texture is present in these sections of the optical element 14.
  • the textured edge 28 is an optional feature and may be omitted.
  • the left and right primary lens areas 16, 18 may transition to the left and right peripheral lens areas 20, 22 without any distinct intermediate section (e.g., having different characteristics; without the textured edge 28).
  • the left and right peripheral lens areas 20, 22 may each include the same or varied surface roughness compared to the textured edge 28.
  • the left peripheral lens area 20 may have a greater surface roughness than the right peripheral lens area 22.
  • the left and right primary lens areas 16, 18 may enable the user to view an image displayed on a primary screen located in the virtual reality headset 10. It should be understood that the left primary lens area 16 may enable viewing of different content than the right primary lens area 18 based on the image displayed by the virtual reality headset 10. For example, a portion of the image that is exposed to the left primary lens area 16 may be different from a portion of the image that is exposed to the right primary lens area 18. As shown, the left primary lens area 16 may be associated with a respective non-transparent divider 31 (e.g., having a cone or funnel shape), and the right primary lens area 18 may be associated with a respective non-transparent divider 31.
  • the left and right primary lens areas 16, 18 may be formed from PMMA or another optically clear material. Furthermore, the left and right primary lens areas 16, 18 may be formed from separate pieces of material and may be held in appropriate relative positions via the electrical casing 12 or other frame structure, or the left and right primary lens areas 16, 18 may be formed from a single piece of material (e.g., one-piece construction).
  • the textured edge 28 may be present around all or some of the circumference of the left primary lens area 16 and also present around all or some of the circumference of the right primary lens area 18.
  • the textured edge 28 may form a gradient texture that transitions to a solid uniform texture that extends across the left peripheral lens area 20 and the right peripheral lens area 22.
  • the left and right peripheral lens areas 20, 22 may have greater opacity (e.g., be less transparent) than the left and right primary lens areas 16, 18. This may enable the images viewed through the left and right peripheral lens areas 20, 22 to appear blurred in a manner that corresponds to a peripheral vision viewpoint.
  • the left and right peripheral lens areas 20, 22 may be formed from frosted glass or another semi-transparent material.
  • the semi-transparent material may be a plastic material, fabric material, or any suitable lens material that is configured to be positioned in front of the eyes of the user to be within and/or affect a view of the user, to allow light to pass through the material, and/or to enable the user to view images and/or objects through the material.
  • as used herein, a lens area refers to any part that is configured to be positioned in front of the eyes of the user to be within and/or affect a view of the user, to allow light to pass through the material, and/or to enable the user to view images and/or objects through the material.
  • the left and right primary lens areas 16, 18 and the left and right peripheral lens areas 20, 22 may be formed from the same material (e.g., PMMA or other optically clear material), but the left and right peripheral lens areas 20, 22 may be treated to make the left and right peripheral lens areas 20, 22 have a greater opacity than the left and right primary lens areas 16, 18.
  • the left and right peripheral lens areas 20, 22 may be texturized (e.g., via etching) and/or coated (e.g., via paint).
  • the left primary lens area 16 and the left peripheral lens area 20 may be formed from a single piece of material (e.g., one-piece construction), and the right primary lens area 18 and the right peripheral lens area 22 may be formed from a single piece of material (e.g., one-piece construction).
  • the left and right primary lens areas 16, 18 and the left and right peripheral lens areas 20, 22 may be formed from a single piece of material (e.g., one-piece construction).
  • the left and right peripheral lens areas 20, 22 may be formed from separate pieces of material (e.g., separate from one another and separate from the left and right primary lens areas 16, 18) and may be held in appropriate relative positions via the electrical casing 12 or other frame structure.
  • each of the left and right peripheral lens areas 20, 22 may be formed from multiple materials that correspond to sections of each of the left and right peripheral lens areas 20, 22.
  • each of the left and right peripheral lens areas 20, 22 may include a greater opacity outer edge (e.g., formed by a strip of black plastic or paint) and a lesser opacity inner edge (e.g., formed by a strip of blended plastic or paint) near the left primary lens area 16 for the left peripheral lens area 20 and the right primary lens area 18 for the right peripheral lens area 22.
  • An opacity may vary between the greater opacity outer edge and the lesser opacity inner edge.
  • the opacity may transition gradually between the greater opacity outer edge and the lesser opacity inner edge. This may cause the color effects displayed on the secondary screens to be viewed through the left and right peripheral lens areas 20, 22 with light and/or color similar to peripheral vision.
  • the left and right primary lens areas 16, 18 may each have a generally circular shape (e.g., with a curved edge), although the left and right primary lens areas 16, 18 may each have a straight edge along the notched components 26.
  • the left and right primary lens areas 16, 18 may also include a curved edge rather than a straight edge along the notched components 26, or any other suitable edge shape to correspond to the notched components 26.
  • the left and right peripheral lens areas 20, 22 may each have a laterally-extending portion and a longitudinally-extending portion. For example, with reference to the left peripheral lens area 20 shown in the figures, the left peripheral lens area 20 includes a laterally-extending portion 21 (e.g., in a same plane as the left primary lens area 16) and a longitudinally-extending portion 23 that are joined together via a bend portion 25.
  • the laterally-extending portion 21 and the longitudinally-extending portion 23 may be oriented at any suitable angle relative to one another, such as an angle between 60 to 120 degrees, 70 to 110 degrees, or 80 to 100 degrees.
  • the right peripheral lens area 22 may have the same features. It should be appreciated that the various portions of the left and right peripheral lens areas 20, 22 may have the same or different characteristics. For example, the laterally-extending portion 21 and the longitudinally-extending portion 23 may have different opacities (e.g., due to different textures, coatings, or base materials). In an embodiment, the left and right peripheral lens areas 20, 22 may not include any laterally-extending portions, but instead may only include the longitudinally-extending portions. For example, the left and right primary lens areas 16, 18 may extend and/or be shaped to extend laterally across the virtual reality headset 10. Thus, it should be appreciated that the left and right peripheral lens areas 20, 22 may have any of a variety of shapes and configurations to accommodate and to work in conjunction with the left and right primary lens areas 16, 18 regardless of the shape and configuration of the left and right primary lens areas 16, 18.
  • the virtual reality headset 10 may include one or more primary screens 30 (e.g., main screens; high-definition) located in front of the left and right primary lens areas 16, 18 (e.g., aligned; overlapping from a perspective of the user wearing the virtual reality headset 10) and multiple secondary screens 32 (e.g., low-definition) located in front of the left and right peripheral lens areas 20, 22 (e.g., aligned; overlapping from the perspective of the user wearing the virtual reality headset 10).
  • at least one of the secondary screens 32 may be located in front of the left peripheral lens area 20 and at least one of the secondary screens 32 may be located in front of the right peripheral lens area 22.
  • the one or more primary screens 30 and the multiple secondary screens 32 may include a variety of display types including a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or any other suitable display type.
  • for example, one of the one or more primary screens 30 may correspond to an LCD display, and one or more of the multiple secondary screens 32 may correspond to an OLED display.
  • any suitable combination of display type may be utilized by the one or more primary screens 30 and/or the multiple secondary screens 32.
  • the images displayed on the multiple secondary screens 32 may be coordinated with the image displayed on the one or more primary screens 30. This may enable the content displayed on the multiple secondary screens 32 to enhance content displayed on the one or more primary screens 30.
  • the primary screens 30 may be sent instructions to display a video that displays a character talking about a sunrise and/or pointing to a virtual area that would be out of view in the peripheral vision of the user, and the secondary screens 32 may be sent instructions to display a bright light effect to simulate the sunrise and/or event within the virtual area. It should be appreciated that speakers within the virtual reality headset 10 (and/or within a real-world environment) may provide audio effects that correspond to the images presented via the virtual reality headset 10.
  • FIG. 3 is a front perspective view of the virtual reality headset 10 with the optical element 14 that includes the left and right primary lens areas 16, 18 and the left and right peripheral lens areas 20, 22, in accordance with an embodiment of the present disclosure.
  • the primary screen 30 can be observed in FIG. 3 as disposed directly in front of the left and right primary lens areas 16, 18.
  • the partition may include one or more dividers that extend along the longitudinal axis 4 between the primary screen 30 and the left and right primary lens areas 16, 18.
  • the partition may include one or more dividers that isolate and/or block light transmission from the primary screen 30 through the left and right peripheral lens areas 20, 22.
  • two cone-shaped, non-transparent dividers 31 may connect each of the left and right primary lens areas 16, 18 to the primary screen 30, thus isolating (e.g., completely or substantially) the left and right peripheral lens areas 20, 22 from the primary screen 30.
  • the secondary screens 32 may be located at an angle (e.g., relative to the primary screen 30; to match/align with the left and right peripheral lens areas 20, 22) on the left and right lateral sides of the electrical casing 12 to provide light and special effects for the user to view through the left peripheral lens area 20 over the left secondary screen 32 and through the right peripheral lens area 22 over the right secondary screen 32.
  • the virtual reality headset 10 may include three display screens: a single high-resolution primary screen 30 to display main content that can be viewed through the left and right primary lens areas 16, 18, and two secondary screens 32 to display special effects or light content that can be viewed through the left peripheral lens area 20 corresponding to the secondary screen 32 on the left and through the right peripheral lens area 22 corresponding to the secondary screen 32 on the right.
  • the primary screen 30 may play a video of a sunset that can be viewed through the left and right primary lens areas 16, 18.
  • the secondary screens 32 may display light effects
  • the left secondary screen 32 may also display an image of a sun at a first time that can be viewed through the left peripheral lens area 20. At a later time, the left secondary screen 32 may no longer display an image of the sun and the right secondary screen 32 may display the image of the sun that can be viewed through the right peripheral lens area 22. This may provide the effect of the sun rising on the left and setting on the right via the virtual reality headset 10. It should be understood that different content may be displayed on the secondary screens 32 corresponding to the left peripheral lens area 20 and the right peripheral lens area 22 according to the desired effect.
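The left-to-right sun effect described above amounts to time-based routing of one image between the two secondary screens; a minimal sketch follows, with an assumed switch time and content identifiers that are not from the disclosure.

```python
# Minimal sketch (hypothetical timing, not from the disclosure): route the
# sun image to the left secondary screen early in the scene and to the
# right secondary screen later, suggesting a sun that rises on the left
# and sets on the right of the user's peripheral vision.

def secondary_content(t_seconds, switch_time=30.0):
    """Return (left_screen, right_screen) content at time t_seconds."""
    if t_seconds < switch_time:
        return ("sun_image", None)
    return (None, "sun_image")
```

A real implementation could interpolate position and brightness rather than switching abruptly; the binary switch is shown only to make the routing explicit.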
  • the primary screen 30 and the secondary screen 32 may also form a single display screen (e.g., one piece).
  • FIG. 4 is a front perspective view of the virtual reality headset 10, in accordance with an embodiment of the present disclosure.
  • the electrical components of the electrical casing 12 may include the one or more primary screens 30 and the multiple secondary screens 32.
  • a top of the electrical casing 12 may include or surround one or more driver boards 34 and a controller 36 (e.g., electronic controller) that may be used to send instructions corresponding to images to be displayed on the primary screen 30 and/or the secondary screens 32 of the virtual reality headset 10.
  • the controller 36 may include one or more processors and may be disposed in the top of the electrical casing 12 and/or may be separate from the virtual reality headset 10 such that a central controller 38, described in more detail below, may function to transmit or send commands to the virtual reality headset 10 to display images according to desired effects. Additionally, the controller 36 may be able to generate peripheral display content corresponding to the secondary screens 32 based on primary display content received from an external controller or processor to be displayed on the primary screen 30.
  • the controller 36 may function to send commands to the primary screen 30 and/or the secondary screens 32 of the virtual reality headset 10 to display images.
  • the controller 36 may send different image commands to each display screen of the virtual reality headset 10. For example, the controller 36 may send a command to the primary screen 30 to display a main image
  • the controller 36 may then send a different peripheral image and/or effect to each of the secondary screens 32 located on each side of the virtual reality headset 10.
  • the primary screen 30 may be sent a command via the controller 36 to display a dark alley scene or an important video message that corresponds to a main focus on the primary screen 30.
  • the controller 36 may not send any command to project images on the secondary screens 32 so that the focus will be directed on the main image displayed on the primary screen 30.
  • the main effect may be a beach scenery and the controller 36 may send a command to the primary screen 30 to display a beach image, and may send a command to the secondary screens 32 to display a bright light effect or display a beach image or other corresponding effect (e.g., bright light to represent the sun) on each of the secondary screens 32.
  • the primary screen 30 and the secondary screens 32 may display a similar image (e.g., similar quality and/or imagery, such as ocean, sand, and sky in the beach image), and the characteristics of the left and right peripheral lens areas 20, 22 (e.g., the texture or the opacity) may alter the transmission of light to simulate peripheral vision. Alternatively, the primary screen 30 and the secondary screens 32 may display different images (e.g., different quality and/or imagery, such as the ocean, sand, and sky in high-definition on the primary screen 30 and soft blue and yellow colors across the secondary screens 32) to simulate peripheral vision.
  • the controller 36 may function to send image commands to the primary screen 30 and/or secondary screens 32 based on the desired image and effect to be displayed on the virtual reality headset 10.
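The per-scene command logic described above can be sketched as a simple dispatch (the scene names and command strings are assumptions made for the sake of the example):

```python
def build_screen_commands(scene):
    """Return per-screen commands for the primary screen 30 and the two
    secondary screens 32, mirroring the examples in the text: a dark
    alley keeps the sides blank to hold focus on the main image, while
    a beach scene adds bright-light effects on both sides."""
    if scene == "dark_alley":
        return {"primary": "dark_alley_video",
                "left_secondary": None,
                "right_secondary": None}
    if scene == "beach":
        return {"primary": "beach_image",
                "left_secondary": "bright_light",
                "right_secondary": "bright_light"}
    raise ValueError(f"unknown scene: {scene!r}")
```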
  • a central controller 38 that is separate from the virtual reality headset 10 may function to send instructions to one or more virtual reality headsets 10 within an environment. This may enable the central controller 38 to transmit instructions corresponding to the same image to multiple virtual reality headsets 10 within an environment. For example, multiple users may be within the environment, and the central controller 38 may transmit instructions to display the same image on the primary screen 30 of each virtual reality headset 10. The central controller 38 may also send instructions to display an image and/or effect on the secondary screens 32 of each virtual reality headset 10. Further, the controller 36 may be able to display secondary content on the secondary screens 32 based on the instructions associated with primary content received from the central controller 38. For example, the controller 36 may select or determine that a certain video or image is to be displayed on the primary screen 30, and also send instructions to the secondary screens 32 to display a complementary image or effect.
  • the central controller 38 may also function to send unique instructions to each virtual reality headset 10 within the environment. This may enable each virtual reality headset 10 to display different images on the primary screen 30 and/or secondary screens 32. This may enable each user to have a unique experience within the environment, while also offloading image processing to the central controller 38 within the environment.
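A central controller that either broadcasts one feed to every headset in the environment or assigns unique feeds per headset might look like the following sketch (the class and method names are hypothetical):

```python
class CentralController:
    """Sends display instructions to the headsets in an environment."""

    def __init__(self, headset_ids):
        self.headset_ids = list(headset_ids)

    def broadcast(self, image):
        # Shared experience: the same image goes to every headset.
        return {hid: image for hid in self.headset_ids}

    def assign(self, images_by_headset, fallback):
        # Unique experience: each headset gets its own image, with a
        # shared fallback for headsets that were not given one.
        return {hid: images_by_headset.get(hid, fallback)
                for hid in self.headset_ids}
```

Broadcasting offloads image selection from the individual headsets, while `assign` lets each headset display a different image on its primary screen 30 and/or secondary screens 32.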
  • the left and right peripheral lens areas 20, 22 may be treated to enable them to simulate peripheral vision.
  • all or some portions of the left and right peripheral lens areas 20, 22 may include a film that may be electrically activated.
  • the opacity of the film may vary based on an electrical current that is applied to the film.
  • the film may be layered on top of electrical wiring that allows the electrical current to be sent through the film to provide a level of opacity desired for the left and right peripheral lens areas 20, 22.
  • the film may be used in combination with other techniques, such as a translucent base material, a frosted base material, flexible and/or non-flexible material, and/or a textured base material.
  • the left peripheral lens area 20 may be sent a different current than the right peripheral lens area 22, such that each of the left and right peripheral lens areas 20, 22 may have the same or different opacity.
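Driving each peripheral lens film independently could be modeled as below (the linear opacity-to-current response and the full-scale current value are assumptions for illustration; a real film would follow its manufacturer's response curve):

```python
def film_current_ma(opacity, full_scale_ma=20.0):
    """Map a desired opacity (0.0 clear .. 1.0 fully opaque) for an
    electrically activated film to a drive current in milliamps,
    assuming a simple linear response."""
    if not 0.0 <= opacity <= 1.0:
        raise ValueError("opacity must be within [0.0, 1.0]")
    return opacity * full_scale_ma

# Per the text, the left and right films may receive different currents:
left_ma = film_current_ma(0.3)   # more transparent left lens area
right_ma = film_current_ma(0.7)  # more opaque right lens area
```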
  • the virtual reality headset 10 may include various other features.
  • the electrical casing 12 may include a game engine that includes a camera view that enables an image to be rendered by utilizing a virtual camera.
  • the game engine may include a processor and software that includes a distortion algorithm to alter the images projected on the secondary screens 32 to not distract from the main content, while still providing enhanced light and image effects to mimic peripheral field of vision.
  • the secondary screens 32 may be replaced or supplemented with one or more light emitters (e.g., light emitting diodes [LEDs]) to provide additional visual effects that are visible through the left and right peripheral lens areas 20, 22 in a low-cost, compact form.
  • the one or more light emitters may be illuminated in coordination with the images on the primary screen 30, such as to provide different colors and/or brightness to simulate peripheral vision.
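One crude way to coordinate the light emitters (or secondary screens 32) with the primary image is to collapse an edge column of the primary frame into a single soft color, a stand-in for the distortion step described above (the function name and pixel layout are illustrative assumptions):

```python
def edge_tint(edge_column):
    """Collapse one edge column of the primary frame - a list of
    (r, g, b) pixels - into a single averaged color that a peripheral
    light emitter or secondary screen could display."""
    if not edge_column:
        raise ValueError("edge column must contain at least one pixel")
    n = len(edge_column)
    return tuple(sum(p[i] for p in edge_column) // n for i in range(3))
```

Averaging deliberately discards detail, so the peripheral light hints at the main content without distracting from it.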
  • FIG. 5 is a rear perspective view of the virtual reality headset 10 with the optical element 14, in accordance with an embodiment of the present disclosure.
  • the left and right peripheral lens areas 20, 22 may include a semi-transparent material that connects to (e.g., abuts, surrounds) the left and right primary lens areas 16, 18. While the left and right peripheral lens areas 20, 22 may generally be isolated from the primary screen 30 that is positioned in front of the left and right primary lens areas 16, 18, some of the light from the primary screen 30 may diffuse into the left and right peripheral lens areas 20, 22 (e.g., due to the connection between the left and right peripheral lens areas 20, 22 and the left and right primary lens areas 16, 18).
  • the electrical casing 12 may also include and/or support a speaker to aid in producing sound effects that enhance the virtual reality experience and correspond to images displayed on the primary screen 30.
  • the primary screen 30 may display a campfire, and the speaker may play a crackling sound effect to mimic the sound of firewood burning.
  • the components of the virtual reality headset 10 may also include other haptic, sound, light, wind, and/or other special effect components to supplement the main image and aid in the immersive experience for the user. This may work in combination with the left and right peripheral lens areas 20, 22 that enable the viewer to view enhanced light effects and/or other display elements that may further the immersive experience by expanding the user’s viewpoint past the left and right primary lens areas 16, 18.
  • the left primary lens area 16 and the left peripheral lens area 20 may be a one-piece structure and/or may be coupled together (e.g., via adhesive and/or welds) to form a single lens structure (e.g., gaplessly continuous), and the left primary lens area 16 and the left peripheral lens area 20 may be distinguished from one another due to differences in transparency, texture, and/or location/display alignment.
  • the right primary lens area 18 and the right peripheral lens area 22 may be a one-piece structure and/or coupled together (e.g., via adhesive and/or welds) to form a single lens structure (e.g., gaplessly continuous), and the right primary lens area 18 and the right peripheral lens area 22 may be distinguished from one another due to differences in transparency, texture, and/or location/display alignment.
  • the virtual reality headset 10 may be part of an amusement park attraction 46 that includes a ride vehicle 42.
  • the ride vehicle 42 may be maneuvered to correspond with the images projected by the virtual reality headset 10 according to the methods discussed above for displaying images on the lenses of the virtual reality headset 10.
  • FIG. 6 is a perspective view of an embodiment of a ride vehicle 42 and one or more users 44 that each may utilize a respective virtual reality headset 10 in an amusement park attraction 46.
  • the primary screen 30 and/or the secondary screens 32 may be configured to display images to the user 44 of the virtual reality headset 10 such that the one or more users 44 may view the images (e.g., video feed) during movement of the ride vehicle 42.
  • each user 44 of the amusement park attraction 46 has a respective virtual reality headset 10 to view the images.
  • the controllers of the virtual reality headsets 10 may be configured to send a command to each primary screen 30 and/or the secondary screens 32 to output the same images to each optical element 14, such that a common video feed is viewable for each user 44.
  • the images may be different for each virtual reality headset 10 such that each user 44 may view unique images (e.g., unique video feed).
  • each user 44 may be assigned a specific role (e.g., captain, pilot, navigator) as part of the amusement park attraction 46.
  • the respective virtual reality headset of each user 44 may display a video feed specific to the role assigned to the user 44 such that the user 44 may experience the amusement park attraction 46 from the perspective of their assigned role.
  • the virtual reality headset 10 may be configured to output a combination of both the common feed and unique video feeds within the amusement park attraction 46.
  • the virtual reality headset 10 may display the common feed during an introductory portion of the amusement park attraction 46.
  • the controller of each of the virtual reality headsets 10 may send a command to the primary screen 30 and/or the secondary screens 32 to output unique video feeds to users 44 (e.g., associated with specific roles).
  • the users 44 without roles or the users 44 with roles that do not have active tasks may continue to receive the common feed.
  • the virtual reality headset 10 may output unique video feeds to each user 44 of the amusement park attraction 46.
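The common-versus-role-specific feed selection described for the attraction can be sketched as follows (the feed names and phase labels are made up for illustration):

```python
def select_feed(role, phase, common_feed="common_feed"):
    """Choose a video feed for one user 44: everyone gets the common
    feed during the introduction, and once tasks begin, users with an
    assigned role (e.g., captain, pilot, navigator) switch to a
    role-specific feed while the rest stay on the common feed."""
    if phase == "intro" or role is None:
        return common_feed
    return f"{role}_feed"
```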
  • the virtual reality headsets 10 may receive commands to display certain images and/or video feeds from a central controller that may be programmed to send commands to correspond to ride vehicle 42 motion and/or other effects of the amusement park attraction 46.
  • the image includes a text-based message, a picture, a video, or some combination thereof.
  • the amusement park attraction 46 may be a virtual reality type of attraction.
  • the image may include text-based instructions for the amusement park attraction 46.
  • the text-based instructions may inform the user 44 on how to use the virtual reality headset 10 and/or perform actions as part of the amusement park attraction 46.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Lenses (AREA)

Abstract

The present disclosure relates to a virtual reality headset (10) that includes a primary lens area (16, 18) and a peripheral lens area (20, 22). The virtual reality headset (10) may also include one or more primary display screens (30) aligned with the primary lens areas (16, 18) and multiple secondary display screens (32) aligned with the peripheral lens areas (20, 22). The peripheral lens areas (20, 22) may include a semi-transparent texture to distort images viewed through the peripheral lens area (20, 22) so that they appear as if seen in peripheral vision and to complement main images viewed through the primary lens area (16, 18).
PCT/US2022/029024 2021-05-14 2022-05-12 Front and peripheral view lens areas for virtual reality headsets WO2022241136A1 (fr)

Priority Applications (5)

Application Number Priority Date Filing Date Title
EP22727652.4A EP4338003A1 (fr) 2021-05-14 2022-05-12 Front and peripheral view lens areas for virtual reality headsets
CN202280035120.8A CN117355785A (zh) 2021-05-14 2022-05-12 Front and peripheral view lens areas for virtual reality headsets
CA3216715A CA3216715A1 (fr) 2021-05-14 2022-05-12 Front and peripheral view lens areas for virtual reality headsets
JP2023570449A JP2024521662A (ja) 2021-05-14 2022-05-12 Front and peripheral view lens areas for virtual reality headsets
KR1020237043019A KR20240008347A (ko) 2021-05-14 2022-05-12 Front and peripheral view lens areas for virtual reality headsets

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US202163189051P 2021-05-14 2021-05-14
US63/189,051 2021-05-14
US202163242115P 2021-09-09 2021-09-09
US63/242,115 2021-09-09
US17/742,214 2022-05-11
US17/742,214 US20220365349A1 (en) 2021-05-14 2022-05-11 Front and peripheral view lens areas for virtual reality headsets

Publications (1)

Publication Number Publication Date
WO2022241136A1 true WO2022241136A1 (fr) 2022-11-17

Family

ID=81927925

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/029024 WO2022241136A1 (fr) 2021-05-14 2022-05-12 Front and peripheral view lens areas for virtual reality headsets

Country Status (5)

Country Link
EP (1) EP4338003A1 (fr)
JP (1) JP2024521662A (fr)
KR (1) KR20240008347A (fr)
CA (1) CA3216715A1 (fr)
WO (1) WO2022241136A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993005432A1 (fr) * 1991-09-04 1993-03-18 Concept Vision Systems, Inc. Wide-angle viewer
WO2012039877A1 (fr) * 2010-09-21 2012-03-29 Microsoft Corporation Opacity filter for see-through head mounted display
US20170039766A1 (en) * 2015-08-07 2017-02-09 Ariadne's Thread (Usa), Inc. (Dba Immerex) Modular multi-mode virtual reality headset
US20190026871A1 (en) * 2016-01-06 2019-01-24 Samsung Electronics Co., Ltd. Head-mounted electronic device
US20190339528A1 (en) * 2015-03-17 2019-11-07 Raytrx, Llc Wearable image manipulation and control system with high resolution micro-displays and dynamic opacity augmentation in augmented reality glasses
US20200233189A1 (en) * 2019-01-17 2020-07-23 Sharp Kabushiki Kaisha Wide field of view head mounted display


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
BEHNAM BASTANI ET AL: "Foveated Pipeline for AR/VR Head-Mounted Displays", INFORMATION DISPLAY, 1 November 2017 (2017-11-01), XP055655202, Retrieved from the Internet <URL:https://onlinelibrary.wiley.com/doi/pdfdirect/10.1002/j.2637-496X.2017.tb01040.x> [retrieved on 20200107], DOI: 10.1002/j.2637-496X.2017.tb01040.x *
WATARU YAMADA ET AL: "Expanding the Field-of-View of Head-Mounted Displays with Peripheral Blurred Images", USER INTERFACE SOFTWARE AND TECHNOLOGY, ACM, 2 PENN PLAZA, SUITE 701 NEW YORK NY 10121-0701 USA, 16 October 2016 (2016-10-16), pages 141 - 142, XP058281059, ISBN: 978-1-4503-4531-6, DOI: 10.1145/2984751.2985735 *
XIAO ROBERT ET AL: "Augmenting the Field-of-View of Head-Mounted Displays with Sparse Peripheral Displays", USER INTERFACE SOFTWARE AND TECHNOLOGY, ACM, 2 PENN PLAZA, SUITE 701 NEW YORK NY 10121-0701 USA, 7 May 2016 (2016-05-07), pages 1221 - 1232, XP058517669, ISBN: 978-1-4503-4531-6, DOI: 10.1145/2858036.2858212 *

Also Published As

Publication number Publication date
CA3216715A1 (fr) 2022-11-17
EP4338003A1 (fr) 2024-03-20
JP2024521662A (ja) 2024-06-04
KR20240008347A (ko) 2024-01-18

Similar Documents

Publication Publication Date Title
CN107037587B Compact augmented reality/virtual reality display
US8976323B2 Switching dual layer display with independent layer content and a dynamic mask
US8711061B2 Multiplanar image displays and media formatted to provide 3D imagery without 3D glasses
CN102540464B Head-mounted display device providing surround video
AU2012352273B2 Display of shadows via see-through display
CN102754013B Three-dimensional stereoscopic imaging method, system, and imaging device
US8537075B2 Environmental-light filter for see-through head-mounted display device
EP0640859B1 Image display unit
US20120068913A1 Opacity filter for see-through head mounted display
EP3183615A1 Head-mounted display comprising an electrochromic dimming module for augmented and virtual reality perception
US20060284788A1 Infinity tunnel display system with floating dynamic image
CN107728319B Visual display system and method, and head-mounted display device
CN105911696A Virtual reality imaging apparatus and integrated virtual reality device
US20220365349A1 Front and peripheral view lens areas for virtual reality headsets
JP7426413B2 Blended-mode three-dimensional display system and method
EP4338003A1 Front and peripheral view lens areas for virtual reality headsets
CN111247473B Display device and display method using an apparatus that provides visual cues
US11887263B1 Adaptive rendering in artificial reality environments
CN207516640U Visual display system and head-mounted display device
CN117355785A Front and peripheral view lens areas for virtual reality headsets
CN201637936U Three-dimensional stereoscopic imaging system and imaging device
WO2019118100A1 Backlit three-dimensional lamp
CN213023957U Holographic-effect image display apparatus
US20240242447A1 Adaptive rendering in artificial reality environments
CN209746346U Transparent projection screen based on double imaging

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22727652

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: P6002765/2023

Country of ref document: AE

WWE Wipo information: entry into national phase

Ref document number: 3216715

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 202280035120.8

Country of ref document: CN

Ref document number: 2023570449

Country of ref document: JP

ENP Entry into the national phase

Ref document number: 20237043019

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 1020237043019

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 2022727652

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2022727652

Country of ref document: EP

Effective date: 20231214