WO2023163958A1 - Head-mounted display testing system and method - Google Patents

Head-mounted display testing system and method

Info

Publication number
WO2023163958A1
Authority
WO
WIPO (PCT)
Prior art keywords
head
display
mounted display
ride
display screen
Prior art date
Application number
PCT/US2023/013540
Other languages
English (en)
Inventor
Patrick John Goergen
Robert Michael Jordan
Eric To Chan
Original Assignee
Universal City Studios Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 18/171,546 (US20230267590A1)
Application filed by Universal City Studios Llc
Priority to CN202380022949.9A (CN118742936A)
Publication of WO2023163958A1

Links

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00: Simulators for teaching or training purposes
    • G09B 9/02: Simulators for teaching or training purposes for teaching control of vehicles or other craft

Definitions

  • the present disclosure relates generally to the field of amusement park entertainment. More specifically, embodiments of the present disclosure relate to a system to test the durability of head-mounted augmented reality/virtual reality displays worn during amusement rides.
  • Amusement parks and/or theme parks are designed to provide entertainment to guests. Areas of the amusement park may have different themes that are specifically targeted to certain audiences. For example, some areas may include themes that are traditionally of interest to children, while other areas may include themes that are traditionally of interest to more mature audiences. Generally, such areas having themes may be referred to as an attraction or a themed attraction. It is recognized that it may be desirable to enhance the immersive experience for guests in such attractions, such as by augmenting the themes with virtual features. Such virtual features can be provided to the guests via augmented reality (AR)/virtual reality (VR) head-mounted displays.
  • a head-mounted display testing system includes a mount configured to receive the head-mounted display that includes a display screen and at least one camera coupled to the mount and positioned to capture the display screen when the head-mounted display is on the mount.
  • the head-mounted display testing system includes an actuator coupled to the mount and an actuator controller of the ride motion simulator, which drives motion of the actuator to cause the mount to simulate motion of a ride according to a motion pattern.
  • the head-mounted display testing system also includes a controller, which receives a signal from the at least one camera comprising captured images of the display screen of the head-mounted display and identifies deviations from an expected display on the display screen based on the signal.
  • a method of testing the head-mounted displays includes receiving an indication that the head-mounted display is on the ride motion simulator, controlling the ride motion simulator to move according to a ride motion pattern, and detecting a display screen on the head-mounted display using at least one camera of the ride motion simulator during movement in the ride motion pattern.
  • the method of testing the head-mounted displays includes identifying deviations from an expected display on the display screen based on the signal received from the at least one camera and generating an indication of head-mounted display quality based on the identified deviations.
  • the method of testing head-mounted displays includes receiving an indication that the head-mounted display is on the ride motion simulator, controlling the ride motion simulator to move according to a ride motion pattern, and detecting a display screen on the head-mounted display using at least one camera of the ride motion simulator during movement in the ride motion pattern.
  • the method of testing the head-mounted displays includes identifying, while the ride motion simulator is moving, deviations from an expected display on the display screen based on the signal received from the at least one camera and generating a time of failure for the head-mounted display responsive to identifying the deviations.
  • FIG. 1 is a head-mounted display testing system that includes a ride motion simulator that simulates motion of an amusement ride in which the head-mounted display may be utilized, in accordance with present embodiments;
  • FIG. 2 is a perspective view of a ride motion simulator, in accordance with present embodiments.
  • FIG. 3 is a perspective view of the ride motion simulator during the head-mounted display testing, in accordance with present embodiments;
  • FIG. 4 is a perspective view of housing and screen of the head-mounted display featuring a displayed image and a marker, in accordance with present embodiments;
  • FIG. 5 is a flow chart of a process for testing the quality of the head-mounted display, in accordance with present embodiments; and
  • FIG. 6 is a block diagram of a head-mounted display testing system including a head-mounted display, a ride motion simulator and a controller, in accordance with present embodiments.
  • An amusement park may include an augmented reality (AR), a virtual reality (VR), and/or a mixed reality (a combination of AR and VR) system (e.g., AR/VR system) to enhance a guest experience of an amusement park attraction by providing guests with AR/VR experiences (e.g., AR experiences, VR experiences, or both).
  • the AR/VR system may include a head-mounted display (e.g., electronic goggles or displays, eyeglasses), which may be worn by a guest to enable the guest to view virtual or augmented reality features.
  • the head-mounted display may be utilized to enhance a guest experience by overlaying virtual features onto a real- world environment of the amusement park, by providing adjustable virtual environments to provide different experiences in an attraction, and so forth.
  • head-mounted displays are provided to guests and returned after each ride cycle to be used by the next guest.
  • each individual head-mounted display experiences normal wear and tear from heavy, repeated use as well as specific stresses that are associated with the motion pattern of the associated amusement ride.
  • a cable connecting the head-mounted display to the ride vehicle may experience stretching, compression, or other mechanical stresses. Because this cable transfers data to the head-mounted display, repeated cable stresses may lead to a deterioration in the operation of the head-mounted display over time.
  • the disclosed embodiments provide a head-mounted display testing system to monitor head-mounted display operation to assess a relationship between ride motion and eventual malfunction of the head-mounted display (e.g., the image displayed by the head-mounted display screen is invisible, blurry, interrupted, or otherwise compromised) due to use in high-motion environments such as an amusement ride.
  • the head-mounted display testing system may include a ride motion simulator that includes an actuator and a robotic arm attached to a mount.
  • the mount receives and retains the head-mounted display during testing.
  • the mount may be generally sized and shaped to resemble a human head.
  • the mount may be a three-dimensionally (3D) printed humanoid mannequin head.
  • the head-mounted display may thus be retained on the ride motion simulator via the mount.
  • the actuator controller may receive instructions to actuate the mount based on the motion pattern of an amusement ride.
  • the controller of the ride motion simulator may then drive the robotic arm and an actuator according to the ride motion pattern, thus simulating the amusement ride.
  • the mount of the ride motion simulator may include one or more sensors, such as outward-facing cameras, that can be generally positioned as eyes on a face.
  • the outward-facing cameras may capture (e.g., record, photograph) the active display screen of the head-mounted display.
  • the screen of the head-mounted display displays the desired images that are coordinated with ride motion of the amusement ride.
  • the head-mounted display, in an embodiment, may also include a marker or fiducial that is captured by the one or more sensors to confirm that the head-mounted display remains securely mounted during testing.
  • the cameras capture the screen of the head-mounted display.
  • a controller receives the information captured by the cameras, detects the displayed image, and determines an operational state of the head-mounted display based on correspondence or deviation of the displayed image from an expected display. For example, if the marker is detected but no displayed image is detected on the screen of the head-mounted display, this indicates that the screen of the head-mounted display may be broken or otherwise compromised. If neither the displayed image nor the marker is detected, the head-mounted display may have shifted or fallen off the mount. If the displayed images are detected but deviate from corresponding expected images (e.g., stored reference images), the head-mounted display can be flagged as nonoperational. Conversely, correspondence of the detected displayed images with expected images can be used to mark the head-mounted display as operating normally. Flagged head-mounted displays can be further evaluated to identify a failure mode and/or for maintenance or cable replacement.
  • This operational state may be used to determine parameters such as the average lifetime (e.g., number of ride cycles before image quality is compromised) of a head-mounted display in an amusement park environment. Further, by testing different ride motion patterns, particular patterns may be identified as being associated with shorter use lifespans. Thus, future amusement rides using head-mounted displays can be designed to include less disruptive motion patterns and to avoid motion patterns associated with head-mounted display malfunction and shorter lifespans.
  • FIG. 1 is a representation of the head-mounted display testing system 10.
  • the head-mounted display testing system 10 includes a ride motion simulator 12.
  • the ride motion simulator 12 receives ride motion pattern information to permit testing using a particular ride motion pattern.
  • the ride motion pattern executed by the system 10 is received from a ride controller 32 of an individual amusement ride 14 or is generated based on the predicted ride motion of the amusement ride caused by the ride controller 32.
  • the amusement ride 14 involves a ride vehicle 18 that travels along a ride path 20 causing the ride vehicle 18 to move according to a particular ride motion pattern.
  • passengers 16 are positioned within a ride vehicle 18.
  • the passengers 16 may be able to view physical structures 22 in the real-world environment 24 through the screens of the head-mounted displays 26 as well as displayed virtual features 28.
  • the virtual features 28 may be overlaid onto the real-world environment 24 so that the passengers 16 are able to view both the physical structures 22 in the real-world environment 24 and the virtual features 28 simultaneously.
  • the head-mounted displays may, additionally or alternatively, be used in a fully immersive VR configuration in which the passengers 16 view the displayed images on the head-mounted displays 26 in an immersive manner and cannot view the real-world environment through lenses of the head-mounted display 26.
  • the displayed images may include images of the physical structures 22 that are captured by cameras and, in turn provided as displayed images on the head-mounted displays 26.
  • Each passenger 16 may be presented with different virtual features 28 so that each passenger 16 has a different experience on the ride 14 in an embodiment.
  • the ride vehicle 18 on the ride path 20 may drop, roll, pitch, and yaw so that the passengers 16 of the amusement ride 14 may experience ride forces.
  • ride forces may help bring the head-mounted displays 26 to the end of their lifecycles faster than regular-use conditions (e.g., head-mounted display 26 being worn by a standing user or a walking user).
  • a cable 30, which may be tethered to the head-mounted display 26 at one end and to the ride vehicle 18 at another end, may have internal leads that fray or stretch faster due to rapid movement in response to the ride forces.
  • the cable 30 connects the head-mounted display 26 to a controller, which is responsible for providing signals via the cable 30 to drive display of the virtual features 28.
  • damage to the cable 30 may compromise the quality of the displayed images of the head-mounted display 26.
  • the virtual features 28 or images may become distorted or otherwise altered.
  • the ride forces may cause the head-mounted display 26 to shift on the passenger’s head such that the display screen is outside of the passenger’s field of view.
  • ride forces associated with the high motion environment may cause the head-mounted display to become dislodged or fall off the passenger’s head during the ride.
  • the ride motion simulator 12 places the head-mounted display 26 in a high motion environment outside of an amusement ride 14.
  • the ride motion simulator 12 includes a mount 36 that holds the head-mounted display 26 (e.g., via a guest interface device) as well as a robotic arm 34 and an actuator 44 that move the mount 36 according to the ride motion pattern of the ride controller 32.
  • the head-mounted display 26 is positioned on the mount 36, and the ride motion simulator 12 moves the mount 36 according to a motion pattern of the amusement ride 14.
  • the ride controller 32 may generate a ride control signal that guides the ride vehicle 18 along the ride path 20 according to the ride motion pattern.
  • the ride controller 32 may drive the ride vehicle 18 according to one or more different motion patterns.
  • the ride motion simulator 12 can simulate the movement (e.g., drop, pitch, yaw, roll) of the ride vehicle 18 and the passengers 16 during the amusement ride 14.
  • the ride control signal of the ride controller 32 is used to generate a simulation signal that translates the ride vehicle motion to predicted motion of a passenger’s head within the ride vehicle 18. The simulation signal is then used to drive the ride motion simulator 12 to execute motion patterns.
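  • The source does not specify how the ride control signal is translated into the simulation signal. Below is a minimal sketch of one plausible approach, assuming the ride motion pattern is available as a time series of ride vehicle poses and that the passenger's head is modeled as a fixed, rigid offset from the vehicle seat; the function and parameter names are illustrative and not taken from the disclosure.

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Assumed seat-to-head offset for a seated passenger, in metres (illustrative value).
HEAD_OFFSET_M = np.array([0.0, 0.0, 0.65])

def vehicle_poses_to_head_poses(vehicle_positions, vehicle_euler_rpy):
    """Translate ride-vehicle motion into target head poses for the simulator.

    vehicle_positions : (N, 3) array of vehicle positions over time.
    vehicle_euler_rpy : (N, 3) array of vehicle roll/pitch/yaw angles in radians.
    Returns head positions (N, 3) and head orientations (N, 3), under the
    rigid-body assumption that the head moves with the vehicle.
    """
    rotations = Rotation.from_euler("xyz", vehicle_euler_rpy)
    # Rotate the seat-to-head offset into the world frame and add it to the
    # vehicle position to get the predicted head position.
    head_positions = np.asarray(vehicle_positions) + rotations.apply(HEAD_OFFSET_M)
    # Under the same assumption, the head orientation tracks the vehicle orientation.
    return head_positions, np.asarray(vehicle_euler_rpy)
```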
  • the signal to drive the motion pattern may be provided to the controller of the ride motion simulator 12 directly from the ride controller 32 or from another source.
  • the ride motion simulator 12 may execute the ride motion pattern to test the head-mounted display 26 under high-motion conditions that simulate the motion of the amusement ride 14.
  • FIG. 2 is a perspective view of components of the ride motion simulator 12.
  • the ride motion simulator 12 does not have an associated head-mounted display 26.
  • the ride motion simulator 12 includes a robot arm 34 connected to a mount 36.
  • the mount 36 may be implemented as a three-dimensional (3D) humanoid mannequin head.
  • the mount 36 may include one or more sensors, illustrated here as outward-facing cameras 40.
  • the camera(s) 40 are generally positioned to correspond to the position of eyes on a face.
  • the camera(s) 40 may be positioned at locations on the mount 36 that arrange the camera(s) 40 relative to the head-mounted display 26 with an appropriate spatial relationship.
  • the robot arm 34 may include one or more joints 42 that increase the range of motion of the mount 36. Additionally, the robot arm 34 includes an actuator 44 coupled to the mount 36 and/or the robot arm 34. The movement of the actuator 44 and the robot arm 34 may allow the ride motion simulator 12 to replicate the motion of the head-mounted display 26 inside the ride vehicle 18 during the amusement ride 14.
  • FIG. 3 is a perspective view of the ride motion simulator 12.
  • when the head-mounted display 26 is mounted on the ride motion simulator 12, it rests on the mount 36 as the head-mounted display 26 may rest on the head of the passenger 16 of the ride vehicle 18 during the amusement ride 14, and/or so as to align the one or more cameras 40 with the display of the head-mounted display 26.
  • the cable 30 of the head-mounted display 26 is coupled to a fixed structure 43 at a first end 45 and to the head-mounted display 26 at a second end 47.
  • the coupling at one or both the first end 45 or the second end 47 can be a reversible coupling. Images for display are provided to the head-mounted display 26 via the cable 30.
  • the head-mounted display 26 is a unitary assembly that includes display screens and a band, strap, or other feature that retains the head-mounted display 26 during use.
  • the head-mounted display 26 may be a two-part assembly including a display portion and a headband or head interface portion. The two parts of the assembly can be coupled and decoupled to one another.
  • the head-mounted display 26 may be secured on the mount 36 using a guest interface device.
  • the guest interface device is designed to enable the display portion of the head-mounted display 26 to quickly transition between being mounted onto the passenger’s head and being taken off after the ride.
  • the guest interface device is configured to be affixed to the passenger’s head and, thus, enable the passenger to comfortably wear the head-mounted display 26 throughout various attractions or while traversing certain amusement park environments.
  • the guest interface device may include a head strap assembly that is configured to span about a circumference of the passenger’s head and configured to be tightened (e.g., constricted) on the passenger’s head.
  • the head strap assembly facilitates affixing the guest interface device to the head of the passenger, such that the guest interface device may be utilized to retain the head-mounted display 26 on the passenger.
  • the head strap assembly and the guest interface device may be used to affix the guest interface device to the mount 36.
  • the mount 36 moves according to a motion pattern of the amusement ride 14.
  • the robot arm 34 may also move about one or more joints 42 aiding the actuator 44 in executing the ride motion pattern.
  • during this motion, the one or more cameras 40 capture the footage (e.g., images and/or videos) displayed on the head-mounted display 26.
  • the simulated motion causes cable stress in a manner similar to that of the actual amusement ride.
  • the effect of motion on cable integrity can be assessed. That is, because the cable 30 provides images for display, a decrease in image quality may be tied to motion- induced cable damage.
  • FIG. 4 is a perspective view of a housing 50 and the screen 52 of the head-mounted display 26 configured for head-mounted display testing.
  • the head-mounted display 26 includes a screen 52 that is coupled to a head-mounted display housing 50.
  • the screen 52 may include one or more screens, lenses, or displays (e.g., transparent, semitransparent or opaque).
  • the screen 52 may include transparent (e.g., see-through) light emitting diode (LED) displays or transparent (e.g., see-through) organic light emitting diode (OLED) displays.
  • the screen 52 may be formed from a single-piece construction that spans a certain distance so as to display images (e.g., virtual features 28) to both eyes of the user. That is, in such embodiments (e.g., the embodiment shown in FIG. 4), the screen 52 may be formed from a single, continuous piece of material, where a first portion of the screen 52 may be aligned with a first eye of the user and a second portion of the screen 52 may be aligned with a second eye of the user.
  • the screen 52 may be a multi-piece construction that is formed from two or more separate screens, lenses, or displays.
  • the screen 52 may be implemented as two separate screens, corresponding to a left-eye screen and a right-eye screen.
  • the screen 52 may display an image 54 (e.g., a virtual image), for example, an image of a car as in the illustrated embodiment.
  • a marker 56 (e.g., a physical marker) may be positioned on the head-mounted display 26 (e.g., on the housing 50). The marker can be a sticker, a printed marker, a fiducial marker, or the like.
  • the marker 56 may be used in conjunction with images captured by the camera(s) 40 to determine quality or operating state of the head-mounted display 26.
  • when the displayed image 54 is detected on the screen 52 during testing, the head-mounted display 26 is mounted and operating. However, if the displayed image 54 on the screen 52 is not detected, it may indicate that the head-mounted display 26 is not working properly (e.g., due to cable damage) or that the head-mounted display 26 has shifted or detached from the mount 36. The marker 56 may help differentiate between these two scenarios. If the displayed image 54 is not detected but the marker 56 is detected by the one or more cameras 40, then the head-mounted display 26 can be characterized as being nonfunctional while being affixed to the mount 36.
  • the head-mounted display 26 may have detached from the mount 36 or otherwise shifted such that the screen 52 is no longer in the field of view of the one or more cameras 40.
  • FIG. 5 is a flow chart of a process for ascertaining the quality of the head-mounted display 26, in accordance with present embodiments.
  • the process begins with receiving an indication that the head-mounted display 26 is on the ride motion simulator (block 62).
  • this may involve the one or more cameras 40 capturing the marker 56 in their field of view and relaying the captured footage (e.g., images and/or video) to a processor, which executes instructions to process the camera footage (e.g., captured images) to detect a presence or absence of the marker 56 in the captured images.
  • an image recognition algorithm can be used to detect the marker 56 in the camera footage.
  • the marker 56 may be selected to include an image or text that is not likely to be duplicated within the images displayed on the head-mounted display 26 to aid in identification of the marker 56.
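  • The disclosure does not name a specific image recognition algorithm for detecting the marker 56. A minimal sketch using normalized cross-correlation template matching with OpenCV is shown below, assuming a stored grayscale template image of the marker; the function name and threshold are illustrative assumptions.

```python
import cv2

def marker_present(frame_bgr, marker_template_gray, threshold=0.8):
    """Return True if the marker template is found in a captured camera frame.

    frame_bgr            : colour frame as returned by cv2.VideoCapture.read().
    marker_template_gray : grayscale template image of the marker (e.g., marker 56).
    threshold            : minimum normalized cross-correlation score to accept.
    """
    frame_gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Slide the template over the frame and keep the best match score.
    scores = cv2.matchTemplate(frame_gray, marker_template_gray, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, _ = cv2.minMaxLoc(scores)
    return best_score >= threshold
```

  • Template matching of this kind assumes the marker appears at a roughly constant scale and orientation relative to the camera(s) 40, which is plausible here because both the cameras and the head-mounted display are fixed to the mount 36; a dedicated fiducial-marker library would be a more robust choice if the marker 56 is a fiducial tag.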
  • the ride motion simulator is controlled to move according to the ride motion pattern (block 64).
  • the motion pattern may be selected from a variety of different motion patterns of an amusement ride 14.
  • the motion pattern executed by the ride motion simulator may involve the movements 46, 48 that resemble or simulate the motions (e.g., drop, pitch, yaw, roll, etc.) of the ride vehicle 18 and its passengers 16 during an amusement ride 14.
  • the motion pattern can be obtained from the amusement ride controller 32.
  • the images displayed on the head-mounted display screen 52 are detected using the one or more cameras 40 of the ride motion simulator (block 66).
  • the images displayed on the head-mounted display 26 may include both the marker 56 and the displayed image 54.
  • deviations from an expected display in the detected images are identified (block 68).
  • possible deviations from the expected display may include absence of the displayed image 54, shifting of the displayed image 54 fully or partially outside the field of view of the one or more cameras 40, blurring or interruption of the displayed image 54, absence of the marker 56, and shifting of the marker 56 fully or partially outside the field of view of the one or more cameras 40.
  • the deviations from an expected display may be detected, at a processor, by an image recognition algorithm.
  • the deviations may be detected by comparison with a signal provided to the head-mounted display 26 and intended to cause display of the images.
  • the signal may be used as a reference image set, and the captured images can be compared to the reference images to identify any deviations.
  • the comparison of the captured images to the reference images can be time-aligned, such that the comparison occurs between corresponding time points relative to the start of display (e.g., according to same time points of a video display). Further, the identification of deviations may be based on a frame-by-frame comparison.
  • the system can determine that an individual frame is different than its corresponding reference frame using image differencing techniques, e.g., pixel-by-pixel comparisons, to generate an image distance score between each compared set.
  • the system 10 may extract the characteristics of each captured image and represent it as a multi-dimensional feature vector.
  • a distance measure D quantifies the similarity between the two images, where D is defined in the multi-dimensional feature space.
  • the system 10 may use global or local similarity measures.
  • a global similarity value may be generated and, when outside of a predetermined range, the frame can be identified as having deviations from a reference image. When the global similarity value is within the predetermined range, the frame can be identified as conforming or having no deviations.
  • the operational state of the head-mounted display can be characterized.
  • the head-mounted display can be identified as operating normally or malfunctioning.
  • the system 10 can start a counter of displayed and captured frames identified as having deviations.
  • when the counter reaches a threshold (e.g., at least 10 identified frames), the system 10 can flag the head-mounted display 26 as being in a malfunctioning operational state. Below the threshold, the head-mounted display 26 can be categorized as operating normally, and the system 10 can provide the appropriate indication.
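  • As one illustrative reading of the frame-by-frame comparison and deviation counter described above, the sketch below computes a per-frame distance score (mean absolute pixel difference, an assumed metric) against time-aligned reference frames and flags the head-mounted display 26 once the count of deviating frames reaches a threshold; the metric and the numeric thresholds are assumptions rather than values from the disclosure.

```python
import numpy as np

def frame_distance(captured, reference):
    """Per-frame image distance: mean absolute pixel difference on a 0-255 scale.
    Assumes both frames are grayscale arrays of identical shape."""
    return float(np.mean(np.abs(captured.astype(np.int16) - reference.astype(np.int16))))

def assess_operational_state(captured_frames, reference_frames,
                             distance_threshold=20.0, deviation_count_threshold=10):
    """Time-aligned, frame-by-frame comparison of captured frames against
    reference frames. A frame whose distance score exceeds distance_threshold
    counts as deviating; once deviation_count_threshold such frames are seen,
    the head-mounted display is flagged as malfunctioning."""
    deviating_frames = 0
    for captured, reference in zip(captured_frames, reference_frames):
        if frame_distance(captured, reference) > distance_threshold:
            deviating_frames += 1
            if deviating_frames >= deviation_count_threshold:
                return "malfunctioning"
    return "operating normally"
```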
  • an indication of head-mounted display quality based on the identified deviations is generated (block 70). For example, if both the displayed image 54 and the marker 56 are detected by the at least one camera 40, then the indication of the head-mounted display quality may convey that the head-mounted display is operational. On the other hand, if the quality of the displayed image 54 has been compromised (e.g., disappearance, blurring, or interruption of the displayed image 54) but the marker 56 is detected, then the indication of the head-mounted display quality may convey that the screen 52 of the head-mounted display 26 is malfunctioning (e.g., due to fraying of the cable 30). If the head-mounted display 26 is determined to be malfunctioning, the system 10 can generate a flag. In addition, if neither the displayed image 54 nor the marker 56 is detected, then the indication may convey that the head-mounted display 26 may have shifted or detached from the mount 36.
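  • The mapping from the marker and displayed-image detections to a quality indication can be summarized by the following sketch; the returned labels are illustrative, and the case of a visible image without a visible marker is not discussed in the disclosure, so it is treated here as needing manual review.

```python
def quality_indication(image_detected: bool, marker_detected: bool) -> str:
    """Combine the two detections into a head-mounted display quality indication."""
    if image_detected and marker_detected:
        return "operational"
    if marker_detected and not image_detected:
        # Display still affixed to the mount but the image is missing or
        # compromised (e.g., due to cable damage): flag for inspection.
        return "malfunctioning - flagged"
    if not image_detected and not marker_detected:
        # Neither the displayed image nor the marker is visible: the display
        # has likely shifted on, or detached from, the mount.
        return "shifted or detached from mount"
    # Image visible without the marker is not addressed in the source text.
    return "indeterminate - manual review"
```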
  • the indications of the head-mounted display quality may be used to determine quantities like a mean time to failure of the head-mounted display.
  • the system 10 may test the head-mounted display 26 by repeating the motion pattern associated with an amusement ride over a number of cycles until a change in the operational state is detected.
  • the images displayed by the head-mounted display 26 may conform to an expected display (e.g., show high correspondence with the reference images) for cycles 1-1000 during testing.
  • thereafter, the captured images of the one or more cameras 40 may show more deviations and less conformance with the reference images. If the deviations are greater than a threshold, the mean time to failure can be marked as being associated with the cycle number at which the deviations pass the threshold.
  • the mean time to failure may be indicated by a number of ride cycles to failure in some embodiments. In other embodiments, the mean time to failure may be an overall testing time until failure.
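  • A minimal sketch of deriving a cycles-to-failure figure from per-cycle results is given below, assuming each simulated ride cycle yields a fraction of deviating frames; the threshold value is an illustrative assumption.

```python
from typing import Iterable, Optional

def cycles_to_failure(deviation_fraction_per_cycle: Iterable[float],
                      deviation_threshold: float = 0.05) -> Optional[int]:
    """Return the first ride-cycle number at which the fraction of deviating
    frames exceeds deviation_threshold, or None if no failure was observed.

    deviation_fraction_per_cycle : per-cycle values such as
        (deviating frames) / (total frames) for each simulated ride cycle.
    """
    for cycle_number, fraction in enumerate(deviation_fraction_per_cycle, start=1):
        if fraction > deviation_threshold:
            return cycle_number
    return None
```

  • Multiplying the returned cycle number by the duration of one simulated ride cycle would give the alternative expression of the result as an overall testing time until failure.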
  • the operating state of the head-mounted display can change over time and over the course of testing.
  • the head-mounted display 26 may be characterized as operating normally at a first time point and in a malfunctioning operational state at a second, later, time point.
  • the assessment of the operational state may be conducted in substantially real-time such that the system 10 flags malfunction as it occurs. Further, in an embodiment, the system 10 may instruct the ride motion simulator 12 to cease actuation upon identification of malfunction of the head-mounted display 26. In another embodiment, the assessment is done retroactively.
  • testing can occur using a predetermined number of ride cycle simulations (e.g., 500) to determine whether the head- mounted display 26 is sufficiently robust to operate normally over the predetermined number of cycles.
  • the disclosed testing techniques may be used to test different motion patterns relative to one another.
  • certain motion patterns may have a faster time to failure relative to others.
  • Such motion patterns can be flagged, and rides can be adjusted to change operating characteristics to avoid the flagged motion patterns and/or to include motion patterns with longer times to failure to preserve lifespan of head-mounted displays.
  • the disclosed techniques can be used for design of amusement rides using head-mounted displays.
  • FIG. 6 is a block diagram of the head-mounted display testing system 10 that incorporates a head-mounted display 26, a ride motion simulator 12 and a controller 96, in accordance with present embodiments.
  • the head-mounted display 26, the ride motion simulator 12, and the controller 96 may be separate components that are used together to test the head-mounted display quality.
  • the head-mounted display 26 may include a controller 74, for example, a central controller that includes one or more processors 78 and one or more memory devices 76.
  • the one or more processors 78 may execute software programs and/or instructions to adjust displayed images, such as virtual features 28, displayed on the head-mounted display screen 52.
  • the processor(s) 78 may include multiple microprocessors, one or more “general-purpose” microprocessors, one or more special-purpose microprocessors, and/or one or more application-specific integrated circuits (ASICs), and/or one or more reduced instruction set (RISC) processors.
  • the memory device(s) 76 may include one or more storage devices, and may store machine-readable and/or processor-executable instructions (e.g., firmware or software) for the processor(s) 78 to execute, such as instructions relating to adjusting display of a virtual object.
  • the memory device(s) 76 may store, for example, control software, look up tables, configuration data, and so forth, to facilitate adjusting display of a virtual object.
  • the processor(s) 78 and the memory device(s) 76 may be external to the controller 74.
  • the memory device(s) 76 may include a tangible, non-transitory, machine-readable medium, such as a volatile memory (e.g., a random access memory (RAM)) and/or a nonvolatile memory (e.g., a read-only memory (ROM), flash memory, hard drive, and/or any other suitable optical, magnetic, or solid-state storage medium).
  • the head-mounted display 26 may include a microphone 80, a speaker 82, a receiver 84 and a transmitter 86.
  • the receiver 84 picks up the audio output so that the wearer of the head-mounted display 26 can hear the audio through the speaker 82.
  • the microphone 80 receives an audio input from the head-mounted display wearer, and the transmitter 86 transmits the audio input to other head-mounted display wearers.
  • the head-mounted display 26 may include a plurality of sensors 88 (e.g., a camera, eye tracking sensors, heart rate sensor, equipment monitoring sensors, hand tracking sensors, and the like).
  • the ride motion simulator 12 includes one or more cameras 40, an actuator 44 and an actuator controller 90, such as a central controller, which includes one or more processor(s) 92 and one or more memory device(s) 94.
  • the processor(s) 92 and the memory device(s) 94 may be external to the actuator controller 90.
  • the actuator controller 90 receives the ride motion pattern (e.g., from the ride controller 32) and drives motion of the actuator 44 to simulate motion of the amusement ride 14 according to a motion pattern.
  • the memory device(s) 94 may store machine-readable and/or processor executable instructions for the ride motion pattern execution.
  • the processor(s) 92 may execute software programs and/or instructions to direct the movement of the robotic arm 34 and the actuator 44 to move the mount 36 according to a motion pattern.
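  • The disclosure does not describe the control interface between the actuator controller 90 and the robot arm 34. The sketch below assumes the ride motion pattern is stored as timestamped joint targets and that a hypothetical send_joint_targets callable commands the hardware; it simply replays the pattern on schedule.

```python
import time

def execute_ride_motion_pattern(motion_pattern, send_joint_targets):
    """Replay a ride motion pattern on the simulator hardware.

    motion_pattern     : list of (timestamp_seconds, joint_targets) pairs, sorted
                         by timestamp relative to the start of the ride cycle.
    send_joint_targets : callable that commands the robot arm/actuator
                         (hypothetical; the real interface depends on the hardware).
    """
    start = time.monotonic()
    for timestamp, joint_targets in motion_pattern:
        # Wait until the pattern calls for this pose, then command it.
        delay = timestamp - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        send_joint_targets(joint_targets)
```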
  • the controller 96 receives the footage captured by the camera(s) 40 for processing (e.g., image detection).
  • the controller 96 includes processor(s) 98, memory device(s) 100, an input/output (I/O) port 102, and a user interface 104.
  • the I/O port 102 may receive the footage from the camera(s) 40.
  • the processor(s) 98 may execute software programs and/or instructions to receive the signal from the one or more camera(s) 40 and to identify deviations from an expected display on the display screen 52 based on the signal. For example, a processor 98 may run an image recognition algorithm to identify the displayed image 54 and the marker 56 on the display screen 52.
  • the memory device(s) 100 may include one or more storage devices and may store machine-readable and/or processor-executable instructions (e.g., firmware or software) for the processor(s) 98 to execute, such as instructions relating to receiving the camera signal and identifying deviations from an expected display.
  • the controller 96 may thus generate an indication of the head-mounted display quality based on the identified deviations.
  • the user interface 104 may be used to configure the controller 96 for processing the camera signal.
  • the user interface 104 may include a graphical user interface (GUI).
  • the user interface 104 may be used to set the expected display (e.g., expected image 54 and marker 56) so that deviations from the expected display can be identified in the detected images (e.g., detected image 54 and marker 56), as described in block 68.
  • the user interface 104 may be used to communicate the indication of the head-mounted display quality to the users.
  • the user interface 104 may display a notification indicating that the head-mounted display 26 is malfunctioning.
  • the user interface 104 may display a notification indicating that a head-mounted display 26 is on the ride motion simulator 12 prior to the beginning of the head-mounted display testing.
  • the head-mounted display 26 can display images that are provided to the controller 74 of the head-mounted display 26 via the cable 30. These images may be provided as reference images 110 to the controller 96 and used to identify deviations. For example, the reference images 110 can be compared to the captured images from the camera(s) 40. If the head-mounted display is operating as intended, the images displayed on the display screen 52 of the head-mounted display 26 and captured by the camera(s) 40 should have high conformance to the reference images 110.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A head-mounted display testing system may include a ride motion simulator for simulating motion of an amusement ride, as well as a head-mounted display that is mounted on the ride motion simulator and displays a virtual image. The ride motion simulator includes one or more outward-facing cameras configured to capture the virtual image and a physical marker when the head-mounted display is on the ride motion simulator. The controller is configured to receive the head-mounted display screen content captured by the one or more cameras and to generate an indication of head-mounted display quality based on deviations from an expected display.
PCT/US2023/013540 2022-02-22 2023-02-21 Head-mounted display testing system and method WO2023163958A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202380022949.9A CN118742936A (zh) 2022-02-22 2023-02-21 Head-mounted display testing system and method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202263312491P 2022-02-22 2022-02-22
US63/312,491 2022-02-22
US18/171,546 2023-02-20
US18/171,546 US20230267590A1 (en) 2022-02-22 2023-02-20 Head-mounted display testing system and method

Publications (1)

Publication Number Publication Date
WO2023163958A1 (fr)

Family

ID=85640678

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/013540 WO2023163958A1 (fr) Head-mounted display testing system and method

Country Status (1)

Country Link
WO (1) WO2023163958A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3115873A1 (fr) * 2015-07-06 2017-01-11 Seiko Epson Corporation Dispositif d'affichage monté sur la tête et programme informatique
US20200215687A1 (en) * 2019-01-04 2020-07-09 Universal City Studios Llc Extended reality ride test assembly for amusement park system


Similar Documents

Publication Publication Date Title
US11200655B2 (en) Wearable visualization system and method
US10606348B2 (en) Systems and methods for generating augmented and virtual reality images
US10373334B2 (en) Computer program, object tracking method, and object tracking device
CN108351523A (zh) Depth mapping using stereo cameras and structured light with a head-mounted display
EP3908387B1 (fr) Systems and methods for a connected augmented environment
KR102371072B1 (ko) Method, device, and system for providing a real-time broadcasting platform using motion and face capture
US11749141B2 (en) Information processing apparatus, information processing method, and recording medium
KR20210107786A (ko) Augmented reality system for an amusement ride vehicle
JP2024074862A (ja) Method and system for performing eye tracking using an off-axis camera
JP7451514B2 (ja) System and method for processing images for a display device
US20230267590A1 (en) Head-mounted display testing system and method
WO2023163958A1 (fr) Head-mounted display testing system and method
KR101019829B1 (ko) Sensing processing device and method for a moving ball, and virtual golf simulation device using the same
KR20240154014A (ko) Head-mounted display testing system and method
CN118742936A (zh) Head-mounted display testing system and method
KR101019847B1 (ko) Sensing processing device and method for a moving ball, and virtual golf simulation device using the same
CN114385002B (zh) Smart device control method and apparatus, server, and storage medium
US20220401849A1 (en) Systems and methods for projection mapping for an attraction system
US11688040B2 (en) Imaging systems and methods for correcting visual artifacts caused by camera straylight
CN113227884B (zh) Augmented reality system for an amusement ride
CA3219103A1 (fr) Systems and methods for projection mapping for an attraction system
CN115668107A (zh) Wearable visualization device with retractable cover
TW201918832A (zh) Wearable virtual-real interactive presentation system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23711313

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 1020247031085

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 2023711313

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 11202405130X

Country of ref document: SG

ENP Entry into the national phase

Ref document number: 2023711313

Country of ref document: EP

Effective date: 20240923