WO2023219966A1 - Robotic arm integrated immersive reality - Google Patents

Robotic arm integrated immersive reality

Info

Publication number
WO2023219966A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
actuator
control system
ride vehicle
movement
Prior art date
Application number
PCT/US2023/021391
Other languages
French (fr)
Inventor
Nathaniel David Helmick IV
Bryan ZEBLECKES
Original Assignee
Universal City Studios LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/880,174 (published as US20230364523A1)
Application filed by Universal City Studios LLC
Publication of WO2023219966A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63G MERRY-GO-ROUNDS; SWINGS; ROCKING-HORSES; CHUTES; SWITCHBACKS; SIMILAR DEVICES FOR PUBLIC AMUSEMENT
    • A63G31/00 Amusement arrangements
    • A63G31/16 Amusement arrangements creating illusions of travel

Definitions

  • Immersive environments may include physical props and set pieces, robotic or mechanical elements, and/or display surfaces that present media.
  • the immersive environment may include audio effects, smoke effects, and/or motion effects.
  • immersive environments may include a combination of dynamic and static elements.
  • an attraction system includes a ride vehicle configured to move within the attraction system, a display coupled to the ride vehicle, and a control system communicatively coupled to the display.
  • the control system is configured to determine a position and/or orientation of the display within the attraction system and operate the display to present an image based on the position and/or orientation of the display.
  • a non-transitory computer-readable medium includes instructions that, when executed by processing circuitry, cause the processing circuitry to instruct an actuator of an attraction system to move a display coupled to the actuator, determine a positioning of the display, determine image data based on the positioning of the display, and instruct the display to present an image based on the image data.
  • an attraction system includes a display coupled to an actuator, a ride vehicle configured to move and cause movement of the display, and a control system communicatively coupled to the display.
  • the control system is configured to instruct the actuator to move the display, determine a position and/or orientation of the display within the attraction system, and operate the display to present an image based on the position and/or orientation of the display.
  • FIG. 1 is a schematic diagram of an embodiment of an attraction system that includes a display, in accordance with an aspect of the present disclosure
  • FIG. 2 is a perspective view of an embodiment of an attraction system that includes a display coupled to a ride vehicle, in accordance with an aspect of the present disclosure
  • FIG. 3 is a perspective view of an embodiment of an attraction system that includes a display coupled to a ride vehicle, in accordance with an aspect of the present disclosure
  • FIG. 4 is a perspective view of an embodiment of an attraction system that includes a display coupled to an actuator, in accordance with an aspect of the present disclosure
  • FIG. 5 is a perspective view of an embodiment of an attraction system that includes multiple displays coupled to respective actuators, in accordance with an aspect of the present disclosure
  • FIG. 6 is a perspective view of an embodiment of an attraction system that includes a ride vehicle and a display coupled to an actuator, in accordance with an aspect of the present disclosure
  • FIG. 7 is a perspective view of an embodiment of an attraction system that includes a ride vehicle, an actuator coupled to the ride vehicle, and a display coupled to the actuator, in accordance with an aspect of the present disclosure
  • FIG. 8 is a flowchart of an embodiment of a method or process for operating an attraction system, in accordance with an aspect of the present disclosure.
  • the present disclosure is directed to an attraction system of an amusement or theme park.
  • the amusement park may include a variety of features, such as rides (e.g., a roller coaster), theatrical shows, set designs, performers, and/or decoration elements, to entertain guests.
  • Show effects may be used to supplement or complement the features, such as to provide the guests with a more immersive and/or unique experience.
  • the show effects may be presented to emulate real world elements in order to present a more realistic atmosphere for the guests.
  • the show effect may include a display configured to present an output, such as an image and/or a video (e.g., a series of images), to the guests.
  • the output (e.g., imagery) of the display may include a visualization of a virtual environment (e.g., a virtual space defining a three-dimensional (3-D) model).
  • the display may be positioned to enable the guests to view the output.
  • the display may be moved to enable improved visibility of the display based on a position of the guests, and/or the display may be coupled to a ride vehicle of the attraction system, and movement of the ride vehicle may drive corresponding movement of the display.
  • the movement of the display and the output of the display may cooperatively entertain the guests.
  • it may be desirable to adjust the output provided by the display in a manner that enhances portrayal of the virtual environment to the guests.
  • it may be desirable to operate the display to present an output that portrays corresponding movement of the display in the virtual environment (e.g., a perception of movement through the virtual environment).
  • embodiments of the present disclosure are directed to determining a positioning (e.g., a position, an orientation, a location) of the display and instructing the display to present an output based on the determined positioning.
  • the display may be instructed to present an updated output.
  • the updated output may portray movement within a virtual environment corresponding to movement of the display.
  • respective outputs (e.g., image data that causes the display to present imagery) may be associated with different positionings of the display, and an associated output may be selected for presentation via the display based on the determined positioning.
  • the output of the display may be more closely associated with movement of the display, and such operation of the display may provide benefits that are not easily achieved via existing techniques of operating a display (e.g., using a predetermined media file).
  • the output may more realistically portray movement within the virtual environment based on movement of the display in the attraction system to provide a more realistic experience to the guests.
  • the output of the display may be more unique or personalized, such as for an embodiment in which the movement of the display may be different or customizable for different ride cycles (e.g., based on a user input).
  • operating the display based on a determined positioning of the display may improve operation of the attraction system to entertain the guests.
  • FIG. 1 is a schematic diagram of an embodiment of an attraction system 50.
  • the attraction system 50 may be configured to entertain one or more guests 52.
  • the attraction system 50 may include a ride vehicle 54 in which a guest 52 may be positioned.
  • the ride vehicle 54 may move to provide movement sensations (e.g., a gravitational force, an inertial force, a postural adjustment) and/or carry the guest 52 to different parts of the attraction system 50.
  • the ride vehicle 54 may travel (e.g., translate) along a path 56, which may include a track, an open pathway, or any suitable route that the ride vehicle 54 may navigate.
  • the ride vehicle 54 may remain at a particular location within the attraction system 50 and may, for instance, move (e.g., pivot, rotate) about a base to change orientations at the location in order to move the guest 52 within the attraction system 50.
  • the movement of the ride vehicle 54 may entertain the guest 52.
  • the guest 52 may navigate through and/or within the attraction system 50 without the ride vehicle 54.
  • the attraction system 50 may include another path (e.g., a queue, an open pathway, a walkway) that the guest 52 may move along, and the attraction system 50 may include various features to entertain the guest 52 moving through and/or within the attraction system 50.
  • the attraction system 50 may also include a show effect system 58 configured to provide further entertainment to the guest 52.
  • the show effect system 58 may include a display 60 configured to provide an output (e.g., a media output, a virtual output) viewable to the guest 52.
  • the output may include imagery, such as an image or a video (e.g., a plurality of images that is sequentially presented).
  • the output provided by the display 60 may present a virtual environment in which the guest 52 may be immersed.
  • the output may be based on the movement of the display 60 within the attraction system 50 to portray movement of the display 60 in the virtual environment, thereby providing a more realistic appearance of the virtual environment.
  • the attraction system 50 may include a control system 62 (e.g., an automation controller, a programmable controller, an electronic controller) communicatively coupled to the display 60 and configured to operate the display 60.
  • the control system 62 may include a memory 64 and processing circuitry 66.
  • the memory 64 may include volatile memory, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM), optical drives, hard disc drives, solid-state drives, or any other non-transitory computer-readable medium that includes instructions.
  • the processing circuitry 66 may be configured to execute such instructions.
  • the processing circuitry 66 may include one or more application specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more general purpose processors, or any combination thereof.
  • the control system 62 may be configured to operate the display 60 to provide an output based on a positioning, such as position and/or an orientation, of the display 60.
  • the show effect system 58 may include a sensor 68 configured to monitor a parameter associated with the positioning of the display 60.
  • the sensor 68 may transmit data indicative of the parameter to the control system 62, and the control system 62 may operate the display 60 based on the parameter.
  • the sensor 68 may include a position sensor, which may include an optical sensor (e.g., a camera), a remote sensing device (e.g., a light detection and ranging sensor), a proximity sensor, another suitable position sensor, or any combination thereof.
  • the parameter monitored by the sensor 68 may directly correspond to a positioning of the display 60 determined by the sensor 68.
  • the sensor 68 may include a movement or motion sensor, which may include an accelerometer, a gyroscope, an inertial measurement unit, another movement sensor, or any combination thereof.
  • the parameter monitored by the sensor 68 may include movement or an adjustment of the positioning of the display 60.
  • the control system 62 may determine a change in positioning of the display 60 based on movement data received from the sensor 68. For example, the control system 62 may determine an updated positioning of the display 60 based on a determined movement of the display 60 from a previous positioning of the display 60.
  • the control system 62 may, for example, determine a position and/or orientation of the display 60 within a virtual 3-D coordinate system based on the data received from the sensor 68 indicating the positioning of the display 60 in the real world. The control system 62 may then operate the display 60 based on the position and/or orientation of the display 60 within the 3-D coordinate system.
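To make this pose-tracking step concrete, the following Python sketch dead-reckons a display pose from motion-sensor readings and maps it into a virtual 3-D coordinate system. It is a minimal illustration under assumed names and units; the patent does not prescribe any particular implementation.

```python
import numpy as np

class DisplayPoseTracker:
    """Dead-reckons the display's position and orientation from
    motion-sensor data (e.g., an IMU serving as sensor 68) and maps
    the result into a virtual 3-D coordinate system. Hypothetical
    sketch; fields, units, and the flat scaling are assumptions."""

    def __init__(self, position=(0.0, 0.0, 0.0), yaw=0.0):
        self.position = np.asarray(position, dtype=float)  # meters, real world
        self.velocity = np.zeros(3)                        # m/s
        self.yaw = yaw                                     # radians, about vertical

    def update(self, accel, yaw_rate, dt):
        """Integrate acceleration (m/s^2) and yaw rate (rad/s) over a
        timestep dt (s) to estimate an updated pose from the previous one."""
        self.velocity += np.asarray(accel, dtype=float) * dt
        self.position += self.velocity * dt
        self.yaw += yaw_rate * dt
        return self.position.copy(), self.yaw

    def to_virtual(self, origin=(0.0, 0.0, 0.0), scale=1.0):
        """Map the real-world pose to a pose in the virtual 3-D
        coordinate system (an identity mapping when scale == 1)."""
        return np.asarray(origin, dtype=float) + scale * self.position, self.yaw
```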
  • the control system 62 may refer to information or data that associates or maps different image data to respective positionings (e.g., a position, an orientation, a coordinate, a location) of the display 60 within the 3-D coordinate system.
  • the control system 62 may select the image data associated with the positioning of the display 60 based on the referenced information, and the control system 62 may instruct the display 60 to present an image based on the selected image data.
  • the control system 62 may transmit the image data to the display 60 to cause the display 60 to present an image corresponding to the image data.
  • each image data associated with the various positionings of the display 60 within the 3-D coordinate system may correspond to an appearance of a virtual environment viewed from the positioning within the 3-D coordinate system.
  • the control system 62 may determine updated image data associated with an updated appearance of the virtual environment (e.g., viewed from the positioning within the 3-D coordinate system).
  • the control system 62 may transmit the updated image data to the display 60 to cause the display 60 to present an image providing the updated appearance of the virtual environment, such as to portray movement of the display 60 within the virtual environment corresponding to physical movement of the display 60 in the real world.
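The information that maps image data to respective positionings can be pictured as a lookup keyed on a quantized pose. A minimal sketch follows; the quantization grid, key layout, and file names are assumptions for illustration, not details from the patent.

```python
# Hypothetical mapping from quantized poses in the 3-D coordinate
# system to image data (here, file names of pre-rendered frames).
pose_to_image = {
    (0, 0, 0, 0): "env_origin.png",
    (2, 0, 0, 0): "env_forward.png",
    (2, 0, 0, 90): "env_forward_right.png",
}

def quantize(position, yaw_deg, grid=0.5, angle_step=15):
    """Snap a continuous (x, y, z, yaw) pose onto the lookup grid."""
    x, y, z = (round(p / grid) for p in position)
    return (x, y, z, round(yaw_deg / angle_step) * angle_step)

def select_image_data(position, yaw_deg):
    """Select the image data associated with the display's current
    positioning, mirroring the selection step described above."""
    return pose_to_image.get(quantize(position, yaw_deg))

# Example: a display 1 m forward of the origin, rotated 85 degrees,
# snaps to the key (2, 0, 0, 90) and resolves to the forward-right view.
print(select_image_data((1.0, 0.0, 0.0), 85))  # env_forward_right.png
```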
  • an actuator 70 of the show effect system 58 may be configured to move the display 60.
  • the actuator 70 may include a robotic arm with segments that are movable relative to one another, and operation of the actuator 70 may move the segments to drive movement of the display 60.
  • the actuator 70 may include a combination of winches, pulleys, and/or cables or ropes coupled to the display 60, and operation of the actuator 70 may cause the winches and pulleys to move (e.g., pull, extend) the cables or ropes to move the display 60.
  • the actuator 70 may include a platform assembly (e.g., a parallel manipulator) having multiple platforms configured to move relative to one another.
  • the display 60 may be coupled to one of the platforms, and operation of the actuator 70 to drive movement of the platforms may cause movement of the display 60.
  • the actuator 70 may additionally or alternatively include any suitable component or device configured to drive the display 60 to move.
  • the control system 62 may be communicatively coupled to the actuator 70 and may be configured to instruct the actuator 70 to move the display 60.
  • the control system 62 may be configured to instruct the actuator 70 to move the display 60 in a pre-programmed, preset, or predetermined path or manner, such as based on a particular ride cycle (e.g., in which the ride vehicle 54 travels along the path 56) and/or based on a time of operation of the attraction system 50.
  • the control system 62 may be configured to instruct the actuator 70 to move based on input data.
  • the control system 62 may receive data (e.g., from the sensor 68) indicative of a position of the guest 52 in the attraction system 50.
  • the guest 52 may navigate the attraction system 50 and/or move via movement of the ride vehicle 54 carrying the guest 52 along the path 56.
  • the control system 62 may instruct the actuator 70 to move based on the determined position of the guest 52, such as to position the display 60 relative to the guest 52 to enable the guest 52 to view the output of the display 60.
  • the control system 62 may receive a user input and operate the actuator 70 based on the user input. Therefore, the control system 62 may include or be communicatively coupled to a user interface 72, such as a touch screen, a trackpad, a mouse, a button, a switch, a dial, a slider, a lever, and the like.
  • the guest 52 may interact with the user interface 72 to transmit the user input, and the control system 62 may receive the user input and instruct the actuator 70 to move the display 60 based on the user input.
  • the user input may be indicative of a request to move the display 60 in a particular direction and/or to a target position, and the control system 62 may be configured to instruct the display 60 to move in the particular direction and/or to the target position in response.
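As an illustration of these input-driven moves, the sketch below chooses a target position for the actuator either from an explicit user-input request or by holding a fixed offset from the guest. The offset value and function name are assumptions.

```python
def actuator_target(guest_position, user_request=None,
                    offset=(0.0, 0.75, 1.5)):
    """Return a target position for actuator 70: honor an explicit
    user-input request (e.g., received via user interface 72) when
    present, otherwise hold the display at a fixed offset from the
    guest so the output remains viewable."""
    if user_request is not None:
        return user_request
    return tuple(g + d for g, d in zip(guest_position, offset))
```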
  • the control system 62 may include a single controller configured to determine the positioning of the display 60 (e.g., via the sensor 68), operate the display 60 to present content, and instruct the actuator 70 to move the display 60.
  • the control system 62 may include one or more separate controllers (e.g., a primary controller, a secondary controller) configured to coordinate to operate the display 60, the actuator 70, and/or any other feature of the attraction system 50.
  • a first controller of the control system 62 may determine the positioning of the display 60, a second controller of the control system 62 may operate the display 60 (e.g., based on communication with the first controller), and a third controller of the control system 62 may instruct the actuator 70 to move the display 60 (e.g., based on communication with the first controller and/or the second controller).
  • the control system 62 may include any suitable number of controllers to operate one or more subsystems (e.g., the display 60, the actuator 70, the ride vehicle 54) of the attraction system 50.
  • the actuator 70 may be attached to, secured to, mounted to, or engaged with the ride vehicle 54.
  • the display 60 may be coupled to the ride vehicle 54, and the actuator 70 may be coupled to and configured to drive movement of the ride vehicle 54.
  • operation of the actuator 70 may move the ride vehicle 54 and drive corresponding movement of the display 60 coupled to the ride vehicle 54.
  • the actuator 70 may be coupled to the ride vehicle 54, and the display 60 may be coupled to the actuator 70.
  • movement of the ride vehicle 54 (e.g., along the path 56) may drive corresponding movement of the actuator 70 and the display 60.
  • the control system 62 may also operate the actuator 70 to cause additional movement of the display 60.
  • the actuator 70 may be separate from the ride vehicle 54.
  • the actuator 70 may be coupled to a base that is not a part of the ride vehicle 54, and the actuator 70 may therefore move separately from the ride vehicle 54. In other words, movement of the ride vehicle 54 may not directly drive movement of the actuator 70 and the display 60.
  • the control system 62 may instruct the display 60 to present a particular output.
  • Operating the display 60 to present a location-based output may improve the experience provided to the guest 52. For instance, operation of the display 60 may be more flexible as compared to outputting a preprogrammed or predetermined media file.
  • the control system 62 may be configured to cause the display 60 to present a different output (e.g., based on different movement of the display 60) for different ride cycles, thereby providing a more unique or personalized experience to the guest 52.
  • the output provided by the display 60 may correspond to the movement and/or determined positioning of the display 60 more appropriately as compared to a preprogrammed media file. That is, the output may be more synchronized with or may match the movement of the display 60 more closely.
  • different ride cycles may have varying movements and/or motions (e.g., speeds, acceleration, vibration) of the display 60, such as based on a condition of the actuator 70, a weight or number of the guests 52 in the ride vehicle 54 driving motion of the display 60, a wind speed, and so forth.
  • a preprogrammed media file such as a preset video, may not be altered for different ride cycles.
  • the preprogrammed media file may not match the different movements and/or motions associated with certain ride cycles.
  • an output provided by the display 60 based on the position and/or orientation of the display 60 may accommodate the variation in movements and/or motions for different ride cycles.
  • the display 60 may present a realistic appearance of movement through a virtual environment in correspondence with movement of the display 60. This may be achieved by using a 3-D model, which may include an application that uses mathematical modeling to depict a 3-D environment as a virtual space and map the 3-D environment to the 3-D coordinate system to account for all or a subset of views into the virtual space that can be achieved within a display envelope (e.g., boundaries of the positioning of the display 60) in the 3-D coordinate system.
  • coordination between the 3-D model and the display 60 or multiple displays may be used to depict views into the virtual space from available perspectives (e.g., viewpoints of riders in the ride vehicle 54 driving motion of the display 60) within the display envelope.
  • presented content (e.g., images) may be updated so that changes, including minor, small, or incremental changes, in positioning of the display 60 in the physical world are accounted for by the 3-D model based on corresponding changes in positioning of the display 60 in the 3-D coordinate system, without requiring a special media track for differing paths and/or without creating discontinuity between movements of the display 60 and content presented via the display 60.
  • content depicting a view into the virtual space associated with the 3-D environment may be selected.
  • a change in positioning of the display 60 in the real world may not directly correspond to the same change in positioning of the display 60 with respect to the 3-D coordinate system.
  • a certain movement of the display 60 in the real world may be amplified or muted in the 3-D coordinate system.
  • movement of the display 60 along a vertical distance in the real world may cause movement of the display 60 along an increased vertical distance in the 3-D coordinate system.
  • rotation of the display 60 at a certain angle in the real world may cause rotation of the display 60 at a reduced angle in the 3-D coordinate system.
  • Such discrepancies between the change in positioning of the display 60 in the real world and the change in positioning of the display 60 in the 3-D coordinate system may cause certain content associated with the 3-D environment to be selected and presented to provide a more desirable experience to the guests.
  • the presented content may depict movement of the display 60 at a high speed in the 3-D environment to provide an appearance that the guest is traveling at a higher speed in the real world than that actually effectuated by the ride vehicle 54.
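The amplified and muted mappings described above amount to per-axis gains applied when a real-world pose change is translated into a virtual one. A worked sketch with invented gain values:

```python
def real_to_virtual_delta(delta_position, delta_yaw_deg,
                          translation_gain=(1.0, 3.0, 1.0),
                          rotation_gain=0.5):
    """Apply per-axis gains so that, e.g., a 0.5 m real-world rise
    becomes a 1.5 m virtual climb (gain 3.0 on the vertical axis),
    while a 40-degree real rotation appears as only 20 degrees
    (gain 0.5). Gain values are illustrative assumptions."""
    scaled = tuple(d * g for d, g in zip(delta_position, translation_gain))
    return scaled, delta_yaw_deg * rotation_gain

# Example: rising 0.5 m and rotating 40 degrees in the real world.
print(real_to_virtual_delta((0.0, 0.5, 0.0), 40.0))
# ((0.0, 1.5, 0.0), 20.0)
```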
  • operating the display 60 to provide an output based on positioning instead of using a preprogrammed media file may facilitate ease of adjusting operation of the attraction system 50.
  • it may be desirable to adjust the movement of the display 60, such as by adjusting movement of the ride vehicle 54 (e.g., by changing operation of the actuator 70, by changing a feature of the path 56), in order to change the entertainment provided to the guests 52.
  • an updated output of the display 60 may be readily available for presentation.
  • the movement of the display 60 may include updated positioning of the display 60 (e.g., within the 3-D coordinate system), and image data may already be mapped to the updated positioning (e.g., as part of a comprehensive digital 3-D model including the 3-D environment mapped to the 3-D coordinate system).
  • the image data may be readily generated or selected based on the 3-D model, such as the positioning of the display 60 within the 3-D coordinate system and a correlation between the positioning of the display 60 within the 3-D coordinate system (e.g., as caused by the position and/or orientation of the display 60 within the attraction system 50) and the 3-D environment while the display 60 is being moved.
  • the display 60 may be separate from the guest 52. That is, the display 60 may not be directly in contact with or equipped by the guest 52, such as via a device or component (e.g., a headset) that may significantly limit a field of view of the guest 52. Thus, movement of the display 60 may be separate from movement of the guest 52. For example, multiple guests 52 may be able to view a single display 60. Therefore, a single display 60 may provide immersive entertainment to several guests 52 to increase efficient operation of the attraction system 50.
  • the output provided by the display 60 may be more suitable or be of better quality for various guests 52 as compared to a wearable device that provides an output (e.g., a headset that provides an output directly in front of a user) that may vary in quality for different guests 52 (e.g., guests 52 having different pupil distances to affect a quality of the output provided in front of them).
  • the display 60 may be positioned to enable the guest 52 to view other features of the attraction system 50.
  • the control system 62 may cause the actuator 70 to position the display 60 at a particular offset distance from the guest 52 to avoid blocking a field of view of the guest 52, thereby enabling the guest 52 to see additional show effects 74 of the show effect system 58.
  • the additional show effects 74 may include an animated figure, a decoration, a smoke effect, a fog effect, lighting, another output (e.g., other imagery provided by another display), or any other suitable show effect that the control system 62 may operate to further enhance the experience provided to the guest 52.
  • the guest 52 may be able to view the virtual elements provided by the display 60 in addition to real world elements of the attraction system 50.
  • because the guest 52 may not be in direct contact with the display 60, a desirable structure or integrity of the display 60 may be maintained. For example, wear or deterioration of the display 60 (e.g., that would otherwise be caused by contact or interaction between the guest 52 and the display 60) may be reduced to increase a useful lifespan of the display 60, thereby reducing maintenance operations to repair or replace the display 60.
  • a cleaner appearance or structure of the display 60 may be maintained.
  • fewer operations may be performed with respect to the display 60, thereby further improving an efficiency and/or reducing a cost associated with operation of the attraction system 50.
  • the display 60 described herein may improve a performance of the attraction system 50 to entertain the guest 52 in an efficient manner.
  • the display 60 may also be used to improve performance of maintenance and/or service operations of the attraction system 50.
  • a user (e.g., an operator, a technician) may be able to more easily inspect other components or features of the attraction system 50 (e.g., the ride vehicle 54, the additional show effects 74) that are separate from the display 60, such as in conjunction with inspection of the display 60.
  • the maintenance and/or service operations may be more efficiently performed.
  • FIG. 2 is a perspective view of an embodiment of the attraction system 50 that includes the ride vehicle 54.
  • Guests 52 may be positioned within a chassis 100 of the ride vehicle 54, and the actuator 70, shown as a robotic arm in the illustrated embodiment, may be coupled to the chassis 100. Operation of the actuator 70 may drive movement of the ride vehicle 54 to move and entertain the guests 52.
  • the control system 62 may be configured to instruct the actuator 70 to move the ride vehicle 54 during operation of the attraction system 50.
  • the actuator 70 may be configured to translate the ride vehicle 54 along a first axis 102 (e.g., a longitudinal axis), along a second axis 104 (e.g., a vertical axis), along a third axis 106 (e.g., a lateral axis), and/or along any other suitable axis (e.g., an axis between the first axis 102, the second axis 104, and the third axis 106). Additionally, or alternatively, the actuator 70 may be configured to rotate the ride vehicle 54 about the first axis 102, about the second axis 104, about the third axis 106, or about any other suitable axis.
  • the actuator 70 may include any suitable component, device, system, or assembly in an additional or alternative embodiment.
  • the control system 62 may be configured to instruct the actuator 70 to move the ride vehicle 54 based on any suitable parameter, such as in a predetermined motion or route for a ride cycle, based on sensor data (e.g., received from the sensor 68), and/or based on a user input (e.g., received via the user interface 72).
  • the display 60 may be coupled to (e.g., directly coupled to, mounted to, secured to, attached to, engaged with) the chassis 100 at a position in which multiple guests 52 positioned within the chassis 100 may see the output provided by the display 60.
  • the display 60 may be coupled to (e.g., indirectly coupled to) the actuator 70 via the ride vehicle 54.
  • the coupling between the display 60 and the chassis 100 may enable movement of the ride vehicle 54 to drive corresponding movement of the display 60.
  • the display 60 may be fixedly coupled to the chassis 100 such that relative movement between the ride vehicle 54 and the display 60 may be blocked, thereby enabling the guests 52 to view the output provided by the display 60 regardless of the position and/or orientation of the ride vehicle 54.
  • the control system 62 may be configured to operate the display 60 to provide an output based on a position of the display 60.
  • the display 60 may be at a first position and/or a first orientation, and the control system 62 may be configured to instruct the display 60 to present imagery based on the first position and/or the first orientation.
  • the ride vehicle 54 may move (e.g., via the actuator 70) to transition the display 60 to a second position and/or a second orientation, and the control system 62 may be configured to instruct the display 60 to present updated imagery based on the second position and/or the second orientation.
  • the sensor 68 may monitor the position and/or orientation of the display 60 (e.g., within the attraction system 50), and the control system 62 may determine the position and/or orientation of the display 60 based on data received from the sensor 68.
  • the control system 62 may determine the position and/or orientation of the display 60 based on operation of the actuator 70, such as an instructed positioning and/or movement of the actuator 70.
  • the control system 62 may operate the display 60 to present an output based on the determined position and/or orientation of the display 60 (e.g., based on image data mapped to the position and/or orientation of the display 60).
  • the updated imagery may portray movement of the ride vehicle 54 through a virtual environment to correspond with movement of the ride vehicle 54 in the real world, as caused by the actuator 70.
  • the control system 62 may cause the display 60 to present imagery that portrays forward movement in the virtual environment (e.g., by updating a position of various virtual elements in the imagery provided by the display 60).
  • the control system 62 may cause the display 60 to present imagery that portrays a more upward view of the virtual environment. As such, a more realistic portrayal of the virtual environment in coordination with the movement of the ride vehicle 54 may be achieved via the operation of the display 60.
  • FIG. 3 is a perspective view of an embodiment of the attraction system 50 that includes the ride vehicle 54.
  • the display 60 is coupled to (e.g., directly coupled to, fixedly coupled to) the chassis 100 of the ride vehicle 54, and the ride vehicle 54 is configured to move along a path 56.
  • the path 56 may include various features, such as a hill, a drop, a turn, a curve, an inversion, and so forth, that may adjust the position and/or orientation of the ride vehicle 54 (e.g., about the axes 102, 104, 106) during movement along the path 56.
  • Such movement of the ride vehicle 54 may also drive corresponding movement of the display 60.
  • the control system 62 may be configured to operate the display 60 to update the output presented by the display 60 to the guests 52.
  • the control system 62 may receive data from the sensor 68 indicating the position and/or orientation of the display 60, and the control system 62 may operate the display 60 to present an output based on the position and/or orientation of the display 60.
  • although the display 60 is positioned in front of the guests 52 in the embodiments illustrated in FIGS. 2 and 3, the display 60 may be arranged in a different position with respect to the guests 52, such as behind the guests 52 and/or at a lateral side (e.g., a right side, a left side) of the chassis 100, in an additional or alternative embodiment.
  • the display 60 may be positioned to enable the guests 52 to see a path of travel in front of the ride vehicle 54.
  • the control system 62 may operate the display 60 based on a position and/or orientation of the display 60 to enhance the experience provided to the guests 52.
  • the control system 62 may operate the display 60 to portray movement of the ride vehicle 54 along various virtual elements to cause the guests 52 to perceive movement of the ride vehicle 54 within a virtual environment in correspondence with movement of the ride vehicle 54 in the real world.
  • while the display 60 may be coupled to the chassis 100 at a position that enables the guest 52 to view the output provided by the display 60, the guest 52 may also be able to view other features and aspects of the attraction system 50. That is, the display 60 may be positioned to avoid blocking a field of view from the ride vehicle 54 to other locations within the attraction system 50. For example, other show effects (e.g., the additional show effects 74), such as lighting, smoke effects, fog effects, and so forth, may be provided, and such show effects may supplement the output (e.g., imagery) provided by the display 60. Thus, the display 60 may be one of many show effects provided by the attraction system 50 and experienced by the guests 52.
  • FIG. 4 is a perspective view of an embodiment of the attraction system 50.
  • the display 60 may be configured to move relative to a guest 52 that is not in the ride vehicle 54, such as a guest 52 that is navigating (e.g., walking) through and/or within the attraction system 50.
  • the display 60 may be directly coupled to (e.g., secured to, engaged with, mounted to, attached to) the actuator 70, and the actuator 70 may be coupled to a base 150.
  • the control system 62 may be configured to instruct the base 150 to move within the attraction system 50 and/or to instruct the actuator 70 to move (e.g., translate, rotate) the display 60 relative to the base 150.
  • the control system 62 may instruct the actuator 70 and/or the base 150 to move the display 60 in any suitable direction with respect to the axes 102, 104, 106.
  • the actuator 70 may include any suitable mechanism or components configured to cause the display 60 to move within the attraction system 50.
  • the control system 62 may be configured to position the display 60 relative to the guest 52 to enable the guest 52 to view the output presented by the display 60. To that end, the control system 62 may be configured to determine a viewing perspective of the guest 52, such as based on a position of the guest 52, an orientation of the guest 52, a facial feature of the guest 52, and so forth. The control system 62 may be configured to instruct the actuator 70 to move based on the viewing perspective of the guest 52 to enable the guest 52 to see the output presented by the display 60 as the guest 52 moves within the attraction system 50. For instance, the control system 62 may instruct the actuator 70 to move the display 60 and maintain positioning of the display 60 in front of and/or adjacent to the guest 52.
  • the sensor 68 may be configured to determine a parameter indicative of the viewing perspective of the guest 52.
  • the control system 62 may be configured to instruct the actuator 70 to move based on data received from the sensor 68 indicating the viewing perspective of the guest 52.
  • the sensor 68 may include a camera configured to capture an image of the guest 52, and the control system 62 may determine the viewing perspective of the guest 52 based on the image.
  • the guest 52 may be in possession of and/or may equip a device 152 of the attraction system 50, such as a band, a strap, a headset, a mobile device, and the like.
  • the sensor 68 may be configured to track the positioning of the device 152, and the control system 62 may determine a viewing perspective of the guest 52 based on data indicative of the positioning of the device 152.
  • the control system 62 may be configured to operate the display 60 to present an output based on the position and/or orientation of the display 60 (e.g., as arranged via the actuator 70).
  • the control system 62 may be configured to cause the display 60 to present imagery of a virtual environment and, as the display 60 moves, the imagery presented by the display 60 may be adjusted to present different viewpoints of the virtual environment.
  • the control system 62 may be configured to instruct the actuator 70 to move the display 60 to a first position based on a first viewing perspective of the guest 52 to enable the guest 52 to view the display 60, and the control system 62 may instruct the display 60 to present first imagery of a virtual environment based on the first position of the display 60.
  • the guest 52 may move and have a second viewing perspective, and the control system 62 may be configured to instruct the actuator 70 to move the display 60 to a second position based on the second viewing perspective of the guest 52 to enable the guest 52 to view the display 60.
  • the control system 62 may also instruct the display 60 to present second imagery of the virtual environment based on the second position of the display 60, and the second imagery may include a different viewpoint of the virtual environment than that provided by the first imagery. That is, the second imagery may update the appearance of the virtual environment to portray that, due to movement of the guest 52 through and/or within the attraction system 50, the guest 52 is viewing the virtual environment from a different perspective.
  • the guest 52 may perceive that movement through and/or within the attraction system 50 causes navigation through the virtual environment.
  • the virtual environment may be more realistically presented to the guest 52.
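One pass of a control loop for this arrangement might look like the following, where `sensor`, `actuator`, `display`, and `renderer` are hypothetical interfaces assumed for illustration rather than components specified by the patent:

```python
def follow_guest_once(sensor, actuator, display, renderer,
                      offset=(0.0, 0.75, 1.5)):
    """Estimate the guest's position (e.g., from a camera image or by
    tracking device 152), reposition the display to stay in view, then
    refresh the imagery for the display's resulting pose. The offset
    and all interface methods are assumptions."""
    guest = sensor.read_guest_position()
    target = tuple(g + d for g, d in zip(guest, offset))
    actuator.move_to(target)                 # keep the display viewable
    pose = sensor.read_display_pose()        # where the display ended up
    display.show(renderer.view_from(pose))   # viewpoint-matched imagery
```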
  • FIG. 5 is a perspective view of an embodiment of the attraction system 50.
  • the attraction system 50 may include multiple areas, such as rooms, levels, enclosed volumes, and the like.
  • a first area 180 may include a first actuator 70A coupled to (e.g., directly coupled to) a first display 60A.
  • a first guest 52A may be positioned in the first area 180 and may navigate through the first area 180.
  • the control system 62 may determine a viewing perspective of the first guest 52A (e.g., via data received from the sensor 68) in the first area 180 and operate the first actuator 70A based on the viewing perspective, such as to move the first display 60A in front of and/or adjacent to the first guest 52A in the first area 180.
  • the control system 62 may also operate the first display 60A to present an output based on movement of the first display 60A, such as based on a position and/or orientation of the first display 60A in the first area 180.
  • a second area 182 may include a second actuator 70B coupled to a second display 60B.
  • a second guest 52B may be positioned in the second area 182 and may navigate through the second area 182.
  • the control system 62 may determine a viewing perspective of the second guest 52B (e.g., via data received from the sensor 68), operate the second actuator 70B based on the viewing perspective in the second area 182, and operate the second display 60B to present an output based on movement of the second display 60B.
  • the control system 62 may operate the first display 60A and the second display 60B independently from one another, as well as the first actuator 70A and the second actuator 70B independently from one another.
  • the control system 62 may operate the first actuator 70A and the first display 60A based on the viewing perspective of the first guest 52A in the first area 180 regardless of the viewing perspective of the second guest 52B in the second area 182.
  • the control system 62 may operate the second actuator 70B and the second display 60B based on the viewing perspective of the second guest 52B in the second area 182 regardless of the viewing perspective of the first guest 52A in the first area 180.
  • the attraction system 50 may present a unique experience to each of the guests 52 in the separate areas 180, 182.
  • a guest 52 may navigate through the areas 180, 182 in a sequential manner, such as from the first area 180 to the second area 182.
  • the control system 62 may be configured to adjust control of the displays 60 and/or the actuators 70 based on the transition of the guest 52 between the first area 180 and the second area 182.
  • the control system 62 may operate the first actuator 70A and/or the first display 60A to present an output to the first guest 52A based on movement of the first guest 52A within the first area 180.
  • the first guest 52A may navigate from the first area 180 to the second area 182, and the control system 62 may then determine that the first guest 52A is in the second area 182 and not in the first area 180 (e.g., based on data received from the sensor 68). In response, the control system 62 may operate the second actuator 70B and/or the second display 60B, instead of the first actuator 70A and/or the first display 60A, to present an output to the first guest 52A based on movement of the first guest 52A within the second area 182. Thus, respective operation of the displays 60 and/or the actuators 70 may be associated with one of the areas 180, 182.
  • the output presented by the first display 60A may be associated with a first virtual environment (e.g., a first graphics-based representation based on a 3-D model), and the output presented by the second display 60B may be associated with a second virtual environment (e.g., a second graphics-based representation based on a 3-D model) that is different than the first virtual environment. Therefore, as the first guest 52A transitions between different areas 180, 182, the first guest 52A may view a different virtual environment via the displays 60. In this manner, the first guest 52A may perceive a transition between different virtual environments as a result of transitioning between the different areas 180, 182. As a result, a unique experience may be provided to the guests 52 as the guests 52 navigate between different areas 180, 182 within the attraction system 50.
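The per-area handoff described above can be sketched as routing each guest to the actuator/display pair, and virtual environment, assigned to the guest's current area. The dictionary layout and method names are assumptions:

```python
def serve_guest(guest_area, guest_pose, areas):
    """Operate only the actuator/display pair assigned to the area the
    guest currently occupies (e.g., areas 180 and 182 with displays
    60A and 60B), each area carrying its own virtual environment."""
    unit = areas[guest_area]            # {"actuator": ..., "display": ..., "env": ...}
    unit["actuator"].track(guest_pose)  # keep this area's display in view
    frame = unit["env"].render_for(guest_pose)
    unit["display"].show(frame)         # other areas' displays are untouched
```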
  • FIG. 6 is a perspective view of an embodiment of the attraction system 50.
  • the display 60 is coupled to (e.g., directly coupled to) the actuator 70, and the actuator 70 is separate from the ride vehicle 54. That is, the ride vehicle 54 is not coupled to the actuator 70.
  • the ride vehicle 54 may be coupled to a separate actuator, and/or the ride vehicle 54 may be configured to move along a path (e.g., the path 56 of FIGS. 1 and 3).
  • the ride vehicle 54 may be configured to move, and the control system 62 may be configured to operate the actuator 70 to move the display 60 as the ride vehicle 54 is in motion.
  • the control system 62 may be configured to determine a position and/or orientation of the ride vehicle 54 in the attraction system 50 (e.g., based on data received from the sensor 68), and the control system 62 may be configured to operate the actuator 70 to position the display 60 based on the position and/or orientation of the ride vehicle 54.
  • the control system 62 may be configured to operate the actuator 70 to position the display 60 in front of and/or adjacent to the ride vehicle 54 to enable the guests 52 positioned within the ride vehicle 54 to see the output of the display 60.
  • the control system 62 may be configured to operate the actuator 70 independently of movement of the ride vehicle 54.
  • the control system 62 may be configured to operate the actuator 70 to move the display 60 in a predetermined manner and/or based on a user input regardless of the position and/or orientation of the ride vehicle 54. In either embodiment, the control system 62 may be configured to operate the display 60 to provide an output based on a position and/or orientation of the display 60.
  • FIG. 7 is a perspective view of an embodiment of the attraction system 50.
  • the actuator 70 is coupled to (e.g., directly coupled to) the chassis 100 of the ride vehicle 54, and the display 60 is coupled to (e.g., directly coupled to) the actuator 70.
  • the ride vehicle 54 may be configured to move, such as via an actuator separate from the actuator 70 and/or along a path. Movement of the ride vehicle 54 may drive movement of the actuator 70 and therefore movement of the display 60.
  • the control system 62 may be configured to operate the actuator 70 to move the display 60 relative to the chassis 100.
  • movement of the display 60 may be caused by movement of the ride vehicle 54 (e.g., of the chassis 100) and/or by operation of the actuator 70.
  • the control system 62 may be configured to instruct the actuator 70 to move the display 60 in a predetermined path, based on a position and/or orientation of the guests 52 within the ride vehicle 54, based on a position and/or orientation of the ride vehicle 54 in the attraction system 50, based on a user input received from one of the guests 52, and the like.
  • the control system 62 may cause the actuator 70 to move the display 60 in a manner that enables the guests 52 to view the output of the display 60 more easily based on the positioning of the ride vehicle 54.
  • the control system 62 may be configured to operate the display 60 to provide an output based on the position and/or orientation of the display 60 in the attraction system 50 as caused by the movement of the ride vehicle 54 and/or the operation of the actuator 70.
  • the control system 62 may determine a position and/or orientation of the ride vehicle 54, a position and/or orientation of the actuator 70, and the position and/or orientation of the display 60 based on the determined positions and/or orientations of the ride vehicle 54 and the actuator 70.
  • the control system 62 may determine the position and/or orientation of the display 60 without determining the position and/or orientation of the ride vehicle 54 and/or of the actuator 70.
  • the sensor 68 may be configured to directly determine the position and/or orientation of the display 60, and the control system 62 may directly determine the position and/or orientation of the display 60 based on data received from the sensor 68 without having to determine the position and/or orientation of the ride vehicle 54 and/or the actuator 70.
  • the control system 62 may cause the display 60 to provide an output, such as imagery of a virtual environment, that the guests 52 may view from within the chassis 100 as the ride vehicle 54 moves through the attraction system 50.
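Deriving the display's world pose from the ride vehicle's pose and the actuator's pose relative to the chassis is a standard rigid-transform composition. A sketch, assuming each pose is expressed as a rotation-matrix/translation pair:

```python
import numpy as np

def compose_display_pose(vehicle_pose, actuator_pose):
    """Compose the ride vehicle's world pose with the actuator's pose
    relative to the chassis to obtain the display's world pose. Each
    pose is an (R, t) pair: a 3x3 rotation matrix and a translation
    vector. A textbook composition, assumed for illustration."""
    R_vehicle, t_vehicle = vehicle_pose
    R_actuator, t_actuator = actuator_pose          # relative to chassis 100
    R_display = R_vehicle @ R_actuator              # world orientation
    t_display = R_vehicle @ t_actuator + t_vehicle  # world position
    return R_display, t_display
```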
  • FIG. 8 is a flowchart of an embodiment of a method or process 200 for operating an attraction system (e.g., the attraction system 50 of FIGS. 1-7).
  • any suitable device may perform the method 200.
  • the method 200 may be implemented by executing instructions stored in a tangible, non-transitory, computer-readable medium (e.g., the memory 64 of the control system 62).
  • the method 200 may be performed at least in part by one or more software components, one or more hardware components, one or more software applications, and the like. While the method 200 is described using steps in a specific sequence, additional steps may be performed, the described steps may be performed in different sequences than the sequence illustrated, and/or certain described steps may be skipped or not performed altogether.
  • a position and/or orientation of a display may be determined.
  • the position and/or orientation of the display may be determined based on data received from a sensor.
  • the data may include an image of the attraction system, movement of the display, a positioning of a ride vehicle to which the display is coupled, a positioning of an actuator to which the display is coupled, and so forth.
  • the position and/or orientation of the display may be determined or derived based on a determined position and/or orientation of an actuator and/or ride vehicle to which the display is coupled.
  • the actuator and/or ride vehicle may be instructed to move to a target position and/or target orientation (e.g., based on a user input), and the position and/or orientation of the display may be determined based on the target position and/or target orientation of the actuator and/or ride vehicle.
  • a target position and/or target orientation e.g., based on a user input
  • the position and/or orientation of the display may be determined based on the target position and/or target orientation of the actuator and/or ride vehicle.
  • image data may be determined based on the position and/or orientation of the display.
  • reference data that associates respective image data with corresponding positions and/or orientations of the display may be utilized, and the image data associated with the determined position and/or determined orientation of the display may be selected based on the reference data.
  • the image data may correspond to imagery of a virtual environment.
  • the display may be instructed to present an output based on the image data.
  • the image data may be transmitted to the display to cause the display to present the output according to the image data.
  • the output may include an image, such as a single image.
  • the output may include a video or multiple images presented in a sequential order.
  • the steps of the method 200 may be repeated during operation of the attraction system. That is, the position and/or orientation of the display may be repeatedly or continually determined.
  • updated image data may be repeatedly determined, and the display may be repeatedly instructed to present an output based on the updated image data.
  • movement of the display may cause the display to present an updated output.
  • the output presented by the display may include imagery of a virtual environment seen from a particular viewpoint corresponding to the positioning of the display, and movement of the display may cause the display to present updated imagery of the virtual environment from an updated viewpoint corresponding to the movement of the display to an updated positioning.
  • the updated output presented by the display may portray movement of the display in the virtual environment in accordance with movement of the display in the real world. In this manner, the appearance of the virtual environment may be more realistically portrayed.
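Taken together, the steps of method 200 reduce to a loop that repeatedly determines the display's pose, determines image data for that pose, and instructs the display. The sketch below assumes hypothetical `sensor`, `display`, and `image_for_pose` interfaces, not components named by the patent:

```python
import time

def run_method_200(sensor, display, image_for_pose, period_s=1 / 30):
    """Repeatedly determine the display's position/orientation,
    determine image data for that positioning, and instruct the
    display to present the corresponding output."""
    while True:
        pose = sensor.read_display_pose()  # determine position/orientation
        image = image_for_pose(pose)       # determine image data for the pose
        display.show(image)                # present the output
        time.sleep(period_s)               # repeat during operation
```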

Abstract

An attraction system includes a ride vehicle configured to move within the attraction system, a display coupled to the ride vehicle, and a control system communicatively coupled to the display. The control system is configured to determine a position and/or orientation of the display within the attraction system and operate the display to present an image based on the position and/or orientation of the display.

Description

ROBOTIC ARM INTEGRATED IMMERSIVE REALITY
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims priority to and the benefit of U.S. Provisional Application No. 63/340,279, entitled “ROBOTIC ARM INTEGRATED IMMERSIVE REALITY,” filed May 10, 2022, which is hereby incorporated by reference in its entirety for all purposes.
BACKGROUND
[0002] This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present techniques, which are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
[0003] Throughout amusement parks and other entertainment venues, special effects can be used to help immerse guests in the experience of a ride or attraction. Immersive environments may include physical props and set pieces, robotic or mechanical elements, and/or display surfaces that present media. In addition, the immersive environment may include audio effects, smoke effects, and/or motion effects. Thus, immersive environments may include a combination of dynamic and static elements. With the increasing sophistication and complexity of modern ride attractions and the corresponding increase in expectations among theme or amusement park patrons, improved and more creative attractions are desirable, including ride attractions having more complex, immersive, and/or realistic special effects, such as visual effects.
BRIEF DESCRIPTION
[0004] A summary of certain embodiments disclosed herein is set forth below. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, this disclosure may encompass a variety of aspects that may not be set forth below.
[0005] In an embodiment, an attraction system includes a ride vehicle configured to move within the attraction system, a display coupled to the ride vehicle, and a control system communicatively coupled to the display. The control system is configured to determine a position and/or orientation of the display within the attraction system and operate the display to present an image based on the position and/or orientation of the display.
[0006] In an embodiment, a non-transitory computer-readable medium includes instructions that, when executed by processing circuitry, cause the processing circuitry to instruct an actuator of an attraction system to move a display coupled to the actuator, determine a positioning of the display, determine image data based on the positioning of the display, and instruct the display to present an image based on the image data.
[0007] In an embodiment, an attraction system includes a display coupled to an actuator, a ride vehicle configured to move and cause movement of the display, and a control system communicatively coupled to the display. The control system is configured to instruct the actuator to move the display, determine a position and/or orientation of the display within the attraction system, and operate the display to present an image based on the position and/or orientation of the display.
DRAWINGS
[0008] These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0009] FIG. 1 is a schematic diagram of an embodiment of an atraction system that includes a display, in accordance with an aspect of the present disclosure;
[0010] FIG. 2 is a perspective view of an embodiment of an attraction system that includes a display coupled to a ride vehicle, in accordance with an aspect of the present disclosure;

[0011] FIG. 3 is a perspective view of an embodiment of an attraction system that includes a display coupled to a ride vehicle, in accordance with an aspect of the present disclosure;
[0012] FIG. 4 is a perspective view of an embodiment of an attraction system that includes a display coupled to an actuator, in accordance with an aspect of the present disclosure;
[0013] FIG. 5 is a perspective view of an embodiment of an attraction system that includes multiple displays coupled to respective actuators, in accordance with an aspect of the present disclosure;
[0014] FIG. 6 is a perspective view of an embodiment of an attraction system that includes a ride vehicle and a display coupled to an actuator, in accordance with an aspect of the present disclosure;
[0015] FIG. 7 is a perspective view of an embodiment of an attraction system that includes a ride vehicle, an actuator coupled to the ride vehicle, and a display coupled to the actuator, in accordance with an aspect of the present disclosure; and
[0016] FIG. 8 is a flowchart of an embodiment of a method or process for operating an attraction system, in accordance with an aspect of the present disclosure.
DETAILED DESCRIPTION
[0017] One or more specific embodiments will be described below. In an effort to provide a concise description of these embodiments, not all features of an actual implementation are described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.

[0018] When introducing elements of various embodiments of the present disclosure, the articles “a,” “an,” and “the” are intended to mean that there are one or more of the elements. The terms “comprising,” “including,” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements. Additionally, it should be understood that references to “one embodiment” or “an embodiment” of the present disclosure are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features.
[0019] The present disclosure is directed to an attraction system of an amusement or theme park. The amusement park may include a variety of features, such as rides (e.g., a roller coaster), theatrical shows, set designs, performers, and/or decoration elements, to entertain guests. Show effects may be used to supplement or complement the features, such as to provide the guests with a more immersive and/or unique experience. For example, the show effects may be presented to emulate real world elements in order to present a more realistic atmosphere for the guests.
[0020] In an embodiment, the show effect may include a display configured to present an output, such as an image and/or a video (e.g., a series of images), to the guests. For example, the output (e.g., imagery) of the display may include a visualization of a virtual environment (e.g., a virtual space defining a three-dimensional (3-D) model). During operation of the attraction system, the display may be positioned to enable the guests to view the output. For instance, the display may be moved to enable improved visibility of the display based on a position of the guests, and/or the display may be coupled to a ride vehicle of the attraction system, and movement of the ride vehicle may drive corresponding movement of the display. The movement of the display and the output of the display may cooperatively entertain the guests.
[0021] It may be desirable to adjust the output provided by the display in a manner that enhances portrayal of the virtual environment to the guests. As an example, as the display moves within the attraction system, it may be desirable to operate the display to present an output that portrays corresponding movement of the display in a virtual environment. Thus, the virtual environment (e.g., a perception of movement through the virtual environment) may be portrayed more realistically. Accordingly, embodiments of the present disclosure are directed to determining a positioning (e.g., a position, an orientation, a location) of the display and instructing the display to present an output based on the determined positioning. Thus, as the display moves, the display may be instructed to present an updated output. By way of example, the updated output may portray movement within a virtual environment corresponding to movement of the display. In an embodiment, respective outputs (e.g., image data that causes the display to present imagery) may be associated with a corresponding positioning of the display. Thus, based on a determined positioning of the display, an associated output may be selected for presentation via the display. In this manner, the output of the display may be more closely associated with movement of the display, and such operation of the display may provide benefits that are not easily achieved via existing techniques of operating a display (e.g., using a predetermined media file). As an example, the output may more realistically portray movement within the virtual environment based on movement of the display in the attraction system to provide a more realistic experience to the guests. As another example, the output of the display may be more unique or personalized, such as for an embodiment in which the movement of the display may be different or customizable for different ride cycles (e.g., based on a user input). Thus, operating the display based on a determined positioning of the display may improve operation of the attraction system to entertain the guests.
[0022] With the preceding in mind, FIG. 1 is a schematic diagram of an embodiment of an attraction system 50. The attraction system 50 may be configured to entertain one or more guests 52. In an embodiment, the attraction system 50 may include a ride vehicle 54 in which a guest 52 may be positioned. During operation of the attraction system 50, the ride vehicle 54 may move to provide movement sensations (e.g., a gravitational force, an inertial force, a postural adjustment) and/or carry the guest 52 to different parts of the attraction system 50. For example, the ride vehicle 54 may travel (e.g., translate) along a path 56, which may include a track, an open pathway, or any suitable route that the ride vehicle 54 may navigate. Additionally, or alternatively, the ride vehicle 54 may remain at a particular location within the attraction system 50 and may, for instance, move (e.g., pivot, rotate) about a base to change orientations at the location in order to move the guest 52 within the attraction system 50. The movement of the ride vehicle 54 may entertain the guest 52. Moreover, the guest 52 may navigate through and/or within the attraction system 50 without the ride vehicle 54. As an example, the attraction system 50 may include another path (e.g., a queue, an open pathway, a walkway) that the guest 52 may move along, and the attraction system 50 may include various features to entertain the guest 52 moving through and/or within the attraction system 50.
[0023] The attraction system 50 may also include a show effect system 58 configured to provide further entertainment to the guest 52. In an embodiment, the show effect system 58 may include a display 60 configured to provide an output (e.g., a media output, a virtual output) viewable to the guest 52. The output may include imagery, such as an image or a video (e.g., a plurality of images that is sequentially presented). For example, the output provided by the display 60 may present a virtual environment in which the guest 52 may be immersed. The output may be based on the movement of the display 60 within the attraction system 50 to portray movement of the display 60 in the virtual environment, thereby providing a more realistic appearance of the virtual environment.
[0024] To this end, the attraction system 50 (e.g., the show effect system 58) may include a control system 62 (e.g., an automation controller, a programmable controller, an electronic controller) communicatively coupled to the display 60 and configured to operate the display 60. The control system 62 may include a memory 64 and processing circuitry 66. The memory 64 may include volatile memory, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM), optical drives, hard disc drives, solid-state drives, or any other non-transitory computer-readable medium that includes instructions. The processing circuitry 66 may be configured to execute such instructions. For example, the processing circuitry 66 may include one or more application specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more general purpose processors, or any combination thereof.
[0025] The control system 62 may be configured to operate the display 60 to provide an output based on a positioning, such as a position and/or an orientation, of the display 60. To this end, the show effect system 58 may include a sensor 68 configured to monitor a parameter associated with the positioning of the display 60. The sensor 68 may transmit data indicative of the parameter to the control system 62, and the control system 62 may operate the display 60 based on the parameter. In an embodiment, the sensor 68 may include a position sensor, which may include an optical sensor (e.g., a camera), a remote sensing device (e.g., a light detection and ranging sensor), a proximity sensor, another suitable position sensor, or any combination thereof. In such an embodiment, the parameter monitored by the sensor 68 may directly correspond to a positioning of the display 60 determined by the sensor 68. In an additional or alternative embodiment, the sensor 68 may include a movement or motion sensor, which may include an accelerometer, a gyroscope, an inertial measurement unit, another movement sensor, or any combination thereof. In such an embodiment, the parameter monitored by the sensor 68 may include movement or an adjustment of the positioning of the display 60. The control system 62 may determine a change in positioning of the display 60 based on movement data received from the sensor 68. For example, the control system 62 may determine an updated positioning of the display 60 based on a determined movement of the display 60 from a previous positioning of the display 60.
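As a non-limiting illustration of the movement-sensor case, the following sketch shows one way processing circuitry might integrate motion-sensor samples to maintain an estimate of the display's positioning. The class, field names, and sample values are hypothetical and serve only to clarify the idea of updating a positioning from movement data.

```python
import numpy as np

class DisplayPoseTracker:
    """Minimal dead-reckoning sketch: updates a stored display pose from
    motion-sensor samples (e.g., from an inertial measurement unit). All
    names and values here are illustrative assumptions, not part of this
    disclosure."""

    def __init__(self, position, yaw_deg=0.0):
        self.position = np.asarray(position, dtype=float)  # meters, world frame
        self.velocity = np.zeros(3)                        # m/s
        self.yaw_deg = yaw_deg                             # heading about the vertical axis

    def update(self, accel, yaw_rate_deg, dt):
        """Advance the pose estimate by one sensor sample.

        accel        -- linear acceleration in the world frame (m/s^2)
        yaw_rate_deg -- angular rate about the vertical axis (deg/s)
        dt           -- sample interval (s)
        """
        self.velocity += np.asarray(accel, dtype=float) * dt
        self.position += self.velocity * dt
        self.yaw_deg = (self.yaw_deg + yaw_rate_deg * dt) % 360.0
        return self.position, self.yaw_deg

# Usage: one 10 ms sample indicating gentle forward acceleration and a slow turn.
tracker = DisplayPoseTracker(position=[0.0, 0.0, 1.5])
print(tracker.update(accel=[0.2, 0.0, 0.0], yaw_rate_deg=1.5, dt=0.01))
```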
[0026] The control system 62 may, for example, determine a position and/or orientation of the display 60 within a virtual 3-D coordinate system based on the data received from the sensor 68 indicating the positioning of the display 60 in the real world. The control system 62 may then operate the display 60 based on the position and/or orientation of the display 60 within the 3-D coordinate system. By way of example, the control system 62 may refer to information or data that associates or maps different image data to respective positionings (e.g., a position, an orientation, a coordinate, a location) of the display 60 within the 3-D coordinate system. In response to determining the positioning of the display 60 within the 3-D coordinate system, the control system 62 may select the image data associated with the positioning of the display 60 based on the referenced information, and the control system 62 may instruct the display 60 to present an image based on the selected image data. As an example, the control system 62 may transmit the image data to the display 60 to cause the display 60 to present an image corresponding to the image data. For instance, each image data associated with the various positionings of the display 60 within the 3-D coordinate system may correspond to an appearance of a virtual environment viewed from the positioning within the 3-D coordinate system. Based on a change in positioning of the display 60 within the 3-D coordinate system (e.g., caused by movement of the display 60 in the real world), the control system 62 may determine updated image data associated with an updated appearance of the virtual environment (e.g., viewed from the positioning within the 3-D coordinate system). The control system 62 may transmit the updated image data to the display 60 to cause the display 60 to present an image providing the updated appearance of the virtual environment, such as to portray movement of the display 60 within the virtual environment corresponding to physical movement of the display 60 in the real world.
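A minimal sketch of associating image data with respective positionings of the display within the 3-D coordinate system follows. The quantization scheme, table, and image file names are hypothetical stand-ins for the referenced information described above; a deployed system might instead render views from a 3-D model on the fly.

```python
# Hypothetical mapping from quantized display poses in the virtual 3-D
# coordinate system to image data (here, simply file names). Keys are
# (x, y, z, yaw) tuples; the poses and names are illustrative only.
POSE_TO_IMAGE = {
    (0, 0, 0, 0): "canyon_entry.png",
    (1, 0, 0, 0): "canyon_mid.png",
    (1, 0, 0, 90): "canyon_side_wall.png",
}

def quantize_pose(position, yaw_deg, cell=1.0, yaw_step=90):
    """Snap a continuous pose to the nearest lookup key."""
    x, y, z = (int(round(c / cell)) for c in position)
    yaw = int(round(yaw_deg / yaw_step) * yaw_step) % 360
    return (x, y, z, yaw)

def select_image(position, yaw_deg):
    """Return the image data associated with the display's current pose."""
    key = quantize_pose(position, yaw_deg)
    return POSE_TO_IMAGE.get(key)  # None if the pose is outside the mapped envelope

# As the display moves forward and rotates, different image data is selected.
print(select_image([0.1, 0.0, 0.2], 2.0))    # canyon_entry.png
print(select_image([0.9, 0.1, -0.1], 88.0))  # canyon_side_wall.png
```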
[0027] In an embodiment, an actuator 70 of the show effect system 58 (e.g., of the attraction system 50) may be configured to move the display 60. As an example, the actuator 70 may include a robotic arm with segments that are movable relative to one another, and operation of the actuator 70 may move the segments to drive movement of the display 60. As another example, the actuator 70 may include a combination of winches, pulleys, and/or cables or ropes coupled to the display 60, and operation of the actuator 70 may cause the winches and pulleys to move (e.g., pull, extend) the cables or ropes to move the display 60. As a further example, the actuator 70 may include a platform assembly (e.g., a parallel manipulator) having multiple platforms configured to move relative to one another. The display 60 may be coupled to one of the platforms, and operation of the actuator 70 to drive movement of the platforms may cause movement of the display 60. The actuator 70 may additionally or alternatively include any suitable component or device configured to drive the display 60 to move.
[0028] The control system 62 may be communicatively coupled to the actuator 70 and may be configured to instruct the actuator 70 to move the display 60. In one embodiment, the control system 62 may be configured to instruct the actuator 70 to move the display 60 in a pre-programmed, preset, or predetermined path or manner, such as based on a particular ride cycle (e.g., in which the ride vehicle 54 travels along the path 56) and/or based on a time of operation of the attraction system 50. In an additional or alternative embodiment, the control system 62 may be configured to instruct the actuator 70 to move based on input data. As an example, the control system 62 may receive data (e.g., from the sensor 68) indicative of a position of the guest 52 in the attraction system 50. For instance, the guest 52 may navigate the attraction system 50 and/or move via movement of the ride vehicle 54 carrying the guest 52 along the path 56. The control system 62 may instruct the actuator 70 to move based on the determined position of the guest 52, such as to position the display 60 relative to the guest 52 to enable the guest 52 to view the output of the display 60. As another example, the control system 62 may receive a user input and operate the actuator 70 based on the user input. Therefore, the control system 62 may include or be communicatively coupled to a user interface 72, such as a touch screen, a trackpad, a mouse, a button, a switch, a dial, a slider, a lever, and the like. The guest 52 may interact with the user interface 72 to transmit the user input, and the control system 62 may receive the user input and instruct the actuator 70 to move the display 60 based on the user input. For instance, the user input may be indicative of a request to move the display 60 in a particular direction and/or to a target position, and the control system 62 may be configured to instruct the actuator 70 to move the display 60 in the particular direction and/or to the target position in response.
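For illustration only, the sketch below shows one possible way a control system could compute a target display placement from a guest's position and nudge that target in response to a user input. The function names, offset, and step size are hypothetical choices, not a prescribed implementation.

```python
def display_target_from_guest(guest_pos, offset=(0.0, 2.0, 0.3)):
    """Place the display a fixed offset in front of (and slightly above) the
    guest so the output stays visible as the guest moves. The offset values
    are illustrative placeholders."""
    return tuple(g + o for g, o in zip(guest_pos, offset))

def handle_user_input(current_target, direction, step=0.25):
    """Nudge the display target in response to a user-interface request
    (e.g., a button press asking that the display move left or right)."""
    moves = {"left": (-step, 0.0, 0.0), "right": (step, 0.0, 0.0),
             "up": (0.0, 0.0, step), "down": (0.0, 0.0, -step)}
    dx, dy, dz = moves[direction]
    return (current_target[0] + dx, current_target[1] + dy, current_target[2] + dz)

target = display_target_from_guest((3.0, 1.0, 0.0))
target = handle_user_input(target, "left")
print(target)  # the updated position the actuator would be instructed to reach
```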
[0029] In an embodiment, the control system 62 may include a single controller configured to determine the positioning of the display 60 (e.g., via the sensor 68), operate the display 60 to present content, and instruct the actuator 70 to move the display 60. In an additional or alternative embodiment, the control system 62 may include one or more separate controllers (e.g., a primary controller, a secondary controller) configured to coordinate to operate the display 60, the actuator 70, and/or any other feature of the attraction system 50. By way of example, a first controller of the control system 62 may determine the positioning of the display 60, a second controller of the control system 62 may operate the display 60 (e.g., based on communication with the first controller), and a third controller of the control system 62 may instruct the actuator 70 to move the display 60 (e.g., based on communication with the first controller and/or the second controller). Indeed, the control system 62 may include any suitable number of controllers to operate one or more subsystems (e.g., the display 60, the actuator 70, the ride vehicle 54) of the attraction system 50.
[0030] In an embodiment, the actuator 70 may be attached to, secured to, mounted to, or engaged with the ride vehicle 54. By way of example, the display 60 may be coupled to the ride vehicle 54, and the actuator 70 may be coupled to and configured to drive movement of the ride vehicle 54. Thus, operation of the actuator 70 may move the ride vehicle 54 and drive corresponding movement of the display 60 coupled to the ride vehicle 54. Additionally, or alternatively, the actuator 70 may be coupled to the ride vehicle 54, and the display 60 may be coupled to the actuator 70. Thus, movement of the ride vehicle 54 (e.g., along the path 56) may drive movement of the actuator 70 and the display 60. The control system 62 may also operate the actuator 70 to cause additional movement of the display 60. In an additional or alternative embodiment, the actuator 70 may be separate from the ride vehicle 54. For example, the actuator 70 may be coupled to a base that is not a part of the ride vehicle 54, and the actuator 70 may therefore move separately from the ride vehicle 54. In other words, movement of the ride vehicle 54 may not directly drive movement of the actuator 70 and the display 60.
[0031] In any case, based on the movement of the display 60, as caused by movement of the ride vehicle 54 and/or movement of the actuator 70, the control system 62 may instruct the display 60 to present a particular output. Operating the display 60 to present a location-based output may improve the experience provided to the guest 52. For instance, operation of the display 60 may be more flexible as compared to outputting a preprogrammed or predetermined media file. As an example, the control system 62 may be configured to cause the display 60 to present a different output (e.g., based on different movement of the display 60) for different ride cycles, thereby providing a more unique or personalized experience to the guest 52. As another example, the output provided by the display 60 may correspond to the movement and/or determined positioning of the display 60 more appropriately as compared to a preprogrammed media file. That is, the output may be more synchronized with or may match the movement of the display 60 more closely. For instance, different ride cycles may have varying movements and/or motions (e.g., speeds, acceleration, vibration) of the display 60, such as based on a condition of the actuator 70, a weight or number of the guests 52 in the ride vehicle 54 driving motion of the display 60, a wind speed, and so forth. However, a preprogrammed media file, such as a preset video, may not be altered for different ride cycles. Thus, the preprogrammed media file may not match the different movements and/or motions associated with certain ride cycles.
[0032] However, an output provided by the display 60 based on the position and/or orientation of the display 60 may accommodate the variation in movements and/or motions for different ride cycles. For instance, the display 60 may present a realistic appearance of movement through a virtual environment in correspondence with movement of the display 60. This may be achieved by using a 3-D model, which may include an application that uses mathematical modeling to depict a 3-D environment as a virtual space and map the 3-D environment to the 3-D coordinate system to account for all or a subset of views into the virtual space that can be achieved within a display envelope (e.g., boundaries of the positioning of the display 60) in the 3-D coordinate system. In other words, coordination between the 3-D model and the display 60 or multiple displays may be used to depict views into the virtual space from available perspectives (e.g., viewpoints of riders in the ride vehicle 54 driving motion of the display 60) within the display envelope. Thus, changes in positioning of the display 60 in the physical world, including minor, small, or incremental changes, are accounted for in the presented content (e.g., images) by the 3-D model based on corresponding changes in positioning of the display 60 in the 3-D coordinate system, without requiring a special media track for differing paths and/or without creating discontinuity between movements of the display 60 and content presented via the display 60. For example, based on a positioning of the display 60 in the 3-D coordinate system, content depicting a view into the virtual space associated with the 3-D environment may be selected.
[0033] In some embodiments, a change in positioning of the display 60 in the real world may not directly correspond to the same change in positioning of the display 60 with respect to the 3-D coordinate system. For instance, a certain movement of the display 60 in the real world may be amplified or muted in the 3-D coordinate system. As an example, movement of the display 60 along a vertical distance in the real world may cause movement of the display 60 along an increased vertical distance in the 3-D coordinate system. As another example, rotation of the display 60 at a certain angle in the real world may cause rotation of the display 60 at a reduced angle in the 3-D coordinate system. Such discrepancies between the change in positioning of the display 60 in the real world and the change in positioning of the display 60 in the 3-D coordinate system may cause certain content associated with the 3-D environment to be selected and presented to provide a more desirable experience to the guests. For example, the presented content may depict movement of the display 60 at a high speed in the 3-D environment to provide an appearance that the guest is traveling at a higher speed in the real world than that actually effectuated by the ride vehicle 54.
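The amplification or muting described above can be pictured as per-axis gains between real-world motion and motion in the 3-D coordinate system, as in the following sketch. The gain values are hypothetical, chosen only to show vertical travel being amplified while rotation is muted.

```python
# Illustrative per-axis gains between real-world display motion and motion in
# the 3-D coordinate system: vertical travel is amplified, rotation is muted.
# The gain values are hypothetical assumptions made for this example.
GAINS = {"x": 1.0, "y": 1.0, "z": 3.0, "yaw": 0.5}

def real_to_virtual_delta(dx, dy, dz, dyaw_deg):
    """Map a change in real-world pose to a change in virtual pose."""
    return (dx * GAINS["x"], dy * GAINS["y"], dz * GAINS["z"],
            dyaw_deg * GAINS["yaw"])

# A 0.5 m real climb reads as a 1.5 m virtual climb; a 30-degree real turn
# reads as a gentler 15-degree virtual turn.
print(real_to_virtual_delta(0.0, 0.0, 0.5, 30.0))
```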
[0034] Moreover, operating the display 60 to provide an output based on positioning instead of using a preprogrammed media file may facilitate ease of adjusting operation of the attraction system 50. By way of example, it may be desirable to adjust the movement of the display 60, such as by adjusting movement of the ride vehicle 54 (e.g., by changing operation of the actuator 70, by changing a feature of the path 56), in order to change the entertainment provided to the guests 52. Instead of having to prepare an updated preprogrammed media file, such as by creating an initial preprogrammed media file, testing the output of the initially created preprogrammed media file in conjunction with movement of the display 60, and adjusting the initially created preprogrammed media file based on testing, an updated output of the display 60 may be readily available for presentation. That is, the movement of the display 60 may include updated positioning of the display 60 (e.g., within the 3-D coordinate system), and image data may already be mapped to the updated positioning (e.g., as part of a comprehensive digital 3-D model including the 3-D environment mapped to the 3-D coordinate system). Thus, the image data may be readily generated or selected based on the 3-D model, such as the positioning of the display 60 within the 3-D coordinate system and a correlation between the positioning of the display 60 within the 3-D coordinate system (e.g., as caused by the position and/or orientation of the display 60 within the attraction system 50) and the 3-D environment while the display 60 is being moved.
[0035] Furthermore, the display 60 may be separate from the guest 52. That is, the display 60 may not be directly in contact with or equipped by the guest 52, such as via a device or component (e.g., a headset) that may significantly limit a field of view of the guest 52. Thus, movement of the display 60 may be separate from movement of the guest 52. For example, multiple guests 52 may be able to view a single display 60. Therefore, a single display 60 may provide immersive entertainment to several guests 52 to increase efficient operation of the attraction system 50. In addition, the output provided by the display 60 may be more suitable or be of better quality for various guests 52 as compared to a wearable device that provides an output (e.g., a headset that provides an output directly in front of a user) that may vary in quality for different guests 52 (e.g., guests 52 having different pupil distances that affect a quality of the output provided in front of them). Moreover, the display 60 may be positioned to enable the guest 52 to view other features of the attraction system 50. For instance, the control system 62 may cause the actuator 70 to position the display 60 at a particular offset distance from the guest 52 to avoid blocking a field of view of the guest 52, thereby enabling the guest 52 to see additional show effects 74 of the show effect system 58. By way of example, the additional show effects 74 may include an animated figure, a decoration, a smoke effect, a fog effect, lighting, another output (e.g., other imagery provided by another display), or any other suitable show effect that the control system 62 may operate to further enhance the experience provided to the guest 52. Indeed, the guest 52 may be able to view the virtual elements provided by the display 60 in addition to real world elements of the attraction system 50.
[0036] Further still, since the guest 52 may not be in direct contact with the display 60, a desirable structure or integrity of the display 60 may be maintained. As an example, wear or deterioration of the display 60 (e.g., that would otherwise be caused by contact or interaction between the guest 52 and the display 60) may be reduced to increase a useful lifespan of the display 60, thereby reducing maintenance operations to repair or replace the display 60. As another example, a cleaner appearance or structure of the display 60 may be maintained. Thus, fewer operations may be performed with respect to the display 60, thereby further improving an efficiency and/or reducing a cost associated with operation of the attraction system 50. Indeed, the display 60 described herein may improve a performance of the attraction system 50 to entertain the guest 52 in an efficient manner.
[0037] In one embodiment, the display 60 may also be used to improve performance of maintenance and/or service operations of the attraction system 50. By way of example, a user (e.g., an operator, a technician) may more easily navigate through and/or within the attraction system 50 (e.g., without having to equip an additional component that displays imagery) to inspect whether movement of the display 60 and/or the output provided by the display 60 are desirable. In addition, the user may be able to more easily inspect other components or features of the attraction system 50 (e.g., the ride vehicle 54, the additional show effects 74) that are separate from the display 60, such as in conjunction with inspection of the display 60. As such, the maintenance and/or service operations may be more efficiently performed.
[0038] FIG. 2 is a perspective view of an embodiment of the attraction system 50 that includes the ride vehicle 54. Guests 52 may be positioned within a chassis 100 of the ride vehicle 54, and the actuator 70, shown as a robotic arm in the illustrated embodiment, may be coupled to the chassis 100. Operation of the actuator 70 may drive movement of the ride vehicle 54 to move and entertain the guests 52. As an example, the control system 62 may be configured to instruct the actuator 70 to move the ride vehicle 54 during operation of the attraction system 50. For instance, the actuator 70 may be configured to translate the ride vehicle 54 along a first axis 102 (e.g., a longitudinal axis), along a second axis 104 (e.g., a vertical axis), along a third axis 106 (e.g., a lateral axis), and/or along any other suitable axis (e.g., an axis between the first axis 102, the second axis 104, and the third axis 106). Additionally, or alternatively, the actuator 70 may be configured to rotate the ride vehicle 54 about the first axis 102, about the second axis 104, about the third axis 106, or about any other suitable axis. Although the illustrated actuator 70 includes a robotic arm, the actuator 70 may include any suitable component, device, system, or assembly in an additional or alternative embodiment. The control system 62 may be configured to instruct the actuator 70 to move the ride vehicle 54 based on any suitable parameter, such as in a predetermined motion or route for a ride cycle, based on sensor data (e.g., received from the sensor 68), and/or based on a user input (e.g., received via the user interface 72).
[0039] The display 60 may be coupled to (e.g., directly coupled to, mounted to, secured to, attached to, engaged with) the chassis 100 at a position in which multiple guests 52 positioned within the chassis 100 may see the output provided by the display 60. In this manner, the display 60 may be coupled to (e.g., indirectly coupled to) the actuator 70 via the ride vehicle 54. The coupling between the display 60 and the chassis 100 may enable movement of the ride vehicle 54 to drive corresponding movement of the display 60. For instance, the display 60 may be fixedly coupled to the chassis 100 such that relative movement between the ride vehicle 54 and the display 60 may be blocked, thereby enabling the guests 52 to view the output provided by the display 60 regardless of the position and/or orientation of the ride vehicle 54. The control system 62 may be configured to operate the display 60 to provide an output based on a position of the display 60. By way of example, the display 60 may be at a first position and/or a first orientation, and the control system 62 may be configured to instruct the display 60 to present imagery based on the first position and/or the first orientation. During operation of the attraction system 50, the ride vehicle 54 may move (e.g., via the actuator 70) to transition the display 60 to a second position and/or a second orientation, and the control system 62 may be configured to instruct the display 60 to present updated imagery based on the second position and/or the second orientation.

[0040] In an example embodiment, the sensor 68 may monitor the position and/or orientation of the display 60 (e.g., within the attraction system 50), and the control system 62 may determine the position and/or orientation of the display 60 based on data received from the sensor 68. In an additional or alternative embodiment, the control system 62 may determine the position and/or orientation of the display 60 based on operation of the actuator 70, such as an instructed positioning and/or movement of the actuator 70. In either embodiment, the control system 62 may operate the display 60 to present an output based on the determined position and/or orientation of the display 60 (e.g., based on image data mapped to the position and/or orientation of the display 60).
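As one hedged illustration of determining the display pose from an instructed vehicle positioning rather than from a sensor, the sketch below composes a ride-vehicle pose with a fixed chassis mount offset. The offset and frame conventions are assumptions made for the example.

```python
import math

def display_pose_from_vehicle(vehicle_pos, vehicle_yaw_deg,
                              mount_offset=(0.0, 1.2, 0.8)):
    """Derive the display pose from an instructed ride-vehicle pose when the
    display is fixedly mounted to the chassis (no sensor needed). The mount
    offset is expressed in the vehicle frame and is illustrative only."""
    yaw = math.radians(vehicle_yaw_deg)
    ox, oy, oz = mount_offset
    # Rotate the mount offset into the world frame, then translate by the
    # vehicle position; the display shares the vehicle's heading.
    wx = vehicle_pos[0] + ox * math.cos(yaw) - oy * math.sin(yaw)
    wy = vehicle_pos[1] + ox * math.sin(yaw) + oy * math.cos(yaw)
    wz = vehicle_pos[2] + oz
    return (wx, wy, wz), vehicle_yaw_deg

# A vehicle at (10, 4, 0) heading 90 degrees places the mounted display here:
print(display_pose_from_vehicle((10.0, 4.0, 0.0), 90.0))
```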
[0041] In an embodiment, the updated imagery may portray movement of the ride vehicle 54 through a virtual environment to correspond with movement of the ride vehicle 54 in the real world, as caused by the actuator 70. As an example, based on a determination that the ride vehicle 54 moves in a forward direction (e.g., along the first axis 102), the control system 62 may cause the display 60 to present imagery that portrays forward movement in the virtual environment (e.g., by updating a position of various virtual elements in the imagery provided by the display 60). As another example, based on a determination that the ride vehicle 54 rotates (e.g., about the third axis 106) to cause the guests 52 to face more upwardly, the control system 62 may cause the display 60 to present imagery that portrays a more upward view of the virtual environment. As such, a more realistic portrayal of the virtual environment in coordination with the movement of the ride vehicle 54 may be achieved via the operation of the display 60.
[0042] FIG. 3 is a perspective view of an embodiment of the attraction system 50 that includes the ride vehicle 54. In the illustrated embodiment, the display 60 is coupled to (e.g., directly coupled to, fixedly coupled to) the chassis 100 of the ride vehicle 54, and the ride vehicle 54 is configured to move along the path 56. By way of example, the path 56 may include various features, such as a hill, a drop, a turn, a curve, an inversion, and so forth, that may adjust the position and/or orientation of the ride vehicle 54 (e.g., about the axes 102, 104, 106) during movement along the path 56. Such movement of the ride vehicle 54 may also drive corresponding movement of the display 60. As such, during movement of the ride vehicle 54 along the path 56, the control system 62 may be configured to operate the display 60 to update the output presented by the display 60 to the guests 52. For example, the control system 62 may receive data from the sensor 68 indicating the position and/or orientation of the display 60, and the control system 62 may operate the display 60 to present an output based on the position and/or orientation of the display 60.
[0043] Although the display 60 is positioned in front of the guests 52 in the embodiments illustrated in FIGS. 2 and 3, the display 60 may be arranged in a different position with respect to the guests 52, such as behind the guests 52 and/or at a lateral side (e.g., a right side, a left side) of the chassis 100, in an additional or alternative embodiment. For example, the display 60 may be positioned to enable the guests 52 to see a path of travel in front of the ride vehicle 54. The control system 62 may operate the display 60 based on a position and/or orientation of the display 60 to enhance the experience provided to the guests 52. By way of example, in an embodiment in which the display 60 is positioned laterally with respect to the guests 52, the control system 62 may operate the display 60 to portray movement of the ride vehicle 54 along various virtual elements to cause the guests 52 to perceive movement of the ride vehicle 54 within a virtual environment in correspondence with movement of the ride vehicle 54 in the real world.
[0044] In either embodiment shown in FIGS. 2 and 3, although the display 60 may be coupled to the chassis 100 at a position to enable the guest 52 to view the output provided by the display 60, the guest 52 may also be able to view other features and aspects of the attraction system 50. That is, the display 60 may be positioned to avoid blocking a field of view from the ride vehicle 54 to other locations within the attraction system 50. For example, other show effects (e.g., the additional show effects 74), such as lighting, smoke effects, fog effects, and so forth, may be provided, and such show effects may supplement the output (e.g., imagery) provided by the display 60. Thus, the display 60 may be one of many show effects provided by the attraction system 50 and experienced by the guests 52.
[0045] FIG. 4 is a perspective view of an embodiment of the attraction system 50. The display 60 may be configured to move relative to a guest 52 that is not in the ride vehicle 54, such as a guest 52 that is navigating (e.g., walking) through and/or within the attraction system 50. For example, the display 60 may be directly coupled to (e.g., secured to, engaged with, mounted to, attached to) the actuator 70, and the actuator 70 may be coupled to a base 150. The control system 62 may be configured to instruct the base 150 to move within the attraction system 50 and/or to instruct the actuator 70 to move (e.g., translate, rotate) the display 60 relative to the base 150. For instance, the control system 62 may instruct the actuator 70 and/or the base 150 to move the display 60 in any suitable direction with respect to the axes 102, 104, 106. In an additional or alternative embodiment, the actuator 70 may include any suitable mechanism or components configured to cause the display 60 to move within the attraction system 50.
[0046] In an embodiment, the control system 62 may be configured to position the display 60 relative to the guest 52 to enable the guest 52 to view the output presented by the display 60. For this reason, the control system 62 may be configured to determine a viewing perspective of the guest 52, such as based on a position of the guest 52, an orientation of the guest 52, a facial feature of the guest 52, and so forth. The control system 62 may be configured to instruct the actuator 70 to move based on the viewing perspective of the guest 52 to enable the guest 52 to see the output presented by the display 60 as the guest 52 moves within the attraction system 50. For instance, the control system 62 may instruct the actuator 70 to move the display 60 and maintain positioning of the display 60 in front of and/or adjacent to the guest 52.
[0047] In an embodiment, the sensor 68 may be configured to determine a parameter indicative of the viewing perspective of the guest 52. The control system 62 may be configured to instruct the actuator 70 to move based on data received from the sensor 68 indicating the viewing perspective of the guest 52. As an example, the sensor 68 may include a camera configured to capture an image of the guest 52, and the control system 62 may determine the viewing perspective of the guest 52 based on the image. As another example, the guest 52 may be in possession of and/or may equip a device 152 of the attraction system 50, such as a band, a strap, a headset, a mobile device, and the like. The sensor 68 may be configured to track the positioning of the device 152, and the control system 62 may determine a viewing perspective of the guest 52 based on data indicative of the positioning of the device 152.
[0048] The control system 62 may be configured to operate the display 60 to present an output based on the position and/or orientation of the display 60 (e.g., as arranged via the actuator 70). As an example, the control system 62 may be configured to cause the display 60 to present imagery of a virtual environment and, as the display 60 moves, the imagery presented by the display 60 may be adjusted to present different viewpoints of the virtual environment. For instance, the control system 62 may be configured to instruct the actuator 70 to move the display 60 to a first position based on a first viewing perspective of the guest 52 to enable the guest 52 to view the display 60, and the control system 62 may instruct the display 60 to present first imagery of a virtual environment based on the first position of the display 60. The guest 52 may move and have a second viewing perspective, and the control system 62 may be configured to instruct the actuator 70 to move the display 60 to a second position based on the second viewing perspective of the guest 52 to enable the guest 52 to view the display 60. The control system 62 may also instruct the display 60 to present second imagery of the virtual environment based on the second position of the display 60, and the second imagery may include a different viewpoint of the virtual environment than that provided by the first imagery. That is, the second imagery may update the appearance of the virtual environment to portray that, due to movement of the guest 52 through and/or within the attraction system 50, the guest 52 is viewing the virtual environment from a different perspective. Thus, the guest 52 may perceive that movement through and/or within the attraction system 50 causes navigation through the virtual environment. In this manner, the virtual environment may be more realistically presented to the guest 52.
[0049] FIG. 5 is a perspective view of an embodiment of the attraction system 50. The attraction system 50 may include multiple areas, such as rooms, levels, enclosed volumes, and the like. A first area 180 may include a first actuator 70A coupled to (e.g., directly coupled to) a first display 60A. A first guest 52A may be positioned in the first area 180 and may navigate through the first area 180. The control system 62 may determine a viewing perspective of the first guest 52A (e.g., via data received from the sensor 68) in the first area 180 and operate the first actuator 70A based on the viewing perspective, such as to move the first display 60A in front of and/or adjacent to the first guest 52A in the first area 180. The control system 62 may also operate the first display 60A to present an output based on movement of the first display 60A, such as based on a position and/or orientation of the first display 60A in the first area 180. Additionally, a second area 182 may include a second actuator 70B coupled to a second display 60B. A second guest 52B may be positioned in the second area 182 and may navigate through the second area 182. The control system 62 may determine a viewing perspective of the second guest 52B (e.g., via data received from the sensor 68), operate the second actuator 70B based on the viewing perspective in the second area 182, and operate the second display 60B to present an output based on movement of the second display 60B.
[0050] Indeed, the control system 62 may operate the first display 60A and the second display 60B independently from one another, as well as the first actuator 70A and the second actuator 70B independently from one another. For example, the control system 62 may operate the first actuator 70A and the first display 60A based on the viewing perspective of the first guest 52A in the first area 180 regardless of the viewing perspective of the second guest 52B in the second area 182, and the control system 62 may operate the second actuator 70B and the second display 60B based on the viewing perspective of the second guest 52B in the second area 182 regardless of the viewing perspective of the first guest 52A in the first area 180. Thus, the attraction system 50 may present a unique experience to each of the guests 52 in the separate areas 180, 182.
[0051] In an embodiment, a guest 52 may navigate through the areas 180, 182 in a sequential manner, such as from the first area 180 to the second area 182. The control system 62 may be configured to adjust control of the displays 60 and/or the actuators 70 based on the transition of the guest 52 between the first area 180 and the second area 182. By way of example, in response to determining that the first guest 52A is in the first area 180, the control system 62 may operate the first actuator 70A and/or the first display 60A to present an output to the first guest 52A based on movement of the first guest 52A within the first area 180. The first guest 52A may navigate from the first area 180 to the second area 182, and the control system 62 may then determine that the first guest 52A is in the second area 182 and not in the first area 180 (e.g., based on data received from the sensor 68). In response, the control system 62 may operate the second actuator 70B and/or the second display 60B, instead of the first actuator 70A and/or the first display 60A, to present an output to the first guest 52A based on movement of the first guest 52A within the second area 182. Thus, respective operation of the displays 60 and/or the actuators 70 may be associated with one of the areas 180, 182. For example, the output presented by the first display 60A may be associated with a first virtual environment (e.g., a first graphics-based representation based on a 3-D model), and the output presented by the second display 60B may be associated with a second virtual environment (e.g., a second graphics-based representation based on a 3-D model) that is different than the first virtual environment. Therefore, as the first guest 52A transitions between different areas 180, 182, the first guest 52A may view a different virtual environment via the displays 60. In this manner, the first guest 52A may perceive a transition between different virtual environments as a result of transitioning between the different areas 180, 182. As a result, a unique experience may be provided to the guests 52 as the guests 52 navigate between different areas 180, 182 within the attraction system 50.
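A hedged sketch of this area-based handoff follows. The area keys, rig names, and environment labels are hypothetical placeholders for whichever displays, actuators, and 3-D models an implementation associates with each area.

```python
# Hypothetical area handoff: each area has its own actuator/display pair and
# its own virtual environment; the rig serving a guest follows the guest's
# current area. All identifiers here are illustrative assumptions.
AREA_RIGS = {
    "area_180": {"display": "display_60A", "environment": "forest_model"},
    "area_182": {"display": "display_60B", "environment": "ocean_model"},
}

def rig_for_guest(guest_area):
    """Return the display rig (and virtual environment) for the guest's area."""
    return AREA_RIGS[guest_area]

# As the guest transitions between areas, a different rig and a different
# virtual environment take over.
print(rig_for_guest("area_180"))  # first display, first virtual environment
print(rig_for_guest("area_182"))  # second display, second virtual environment
```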
[0052] FIG. 6 is a perspective view of an embodiment of the attraction system 50. In the illustrated embodiment, the display 60 is coupled to (e.g., directly coupled to) the actuator 70, and the actuator 70 is separate from the ride vehicle 54. That is, the ride vehicle 54 is not coupled to the actuator 70. For example, the ride vehicle 54 may be coupled to a separate actuator, and/or the ride vehicle 54 may be configured to move along a path (e.g., the path 56 of FIGS. 1 and 3). The ride vehicle 54 may be configured to move, and the control system 62 may be configured to operate the actuator 70 to move the display 60 as the ride vehicle 54 is in motion. In an embodiment, the control system 62 may be configured to determine a position and/or orientation of the ride vehicle 54 in the attraction system 50 (e.g., based on data received from the sensor 68), and the control system 62 may be configured to operate the actuator 70 to position the display 60 based on the position and/or orientation of the ride vehicle 54. For example, the control system 62 may be configured to operate the actuator 70 to position the display 60 in front of and/or adjacent to the ride vehicle 54 to enable the guests 52 positioned within the ride vehicle 54 to see the output of the display 60. In an additional or alternative embodiment, the control system 62 may be configured to operate the actuator 70 independently of movement of the ride vehicle 54. By way of example, the control system 62 may be configured to operate the actuator 70 to move the display 60 in a predetermined manner and/or based on a user input regardless of the position and/or orientation of the ride vehicle 54. In either embodiment, the control system 62 may be configured to operate the display 60 to provide an output based on a position and/or orientation of the display 60.
[0053] FIG. 7 is a perspective view of an embodiment of the attraction system 50. In the illustrated embodiment, the actuator 70 is coupled to (e.g., directly coupled to) the chassis 100 of the ride vehicle 54, and the display 60 is coupled to (e.g., directly coupled to) the actuator 70. During operation of the attraction system 50, the ride vehicle 54 may be configured to move, such as via an actuator separate from the actuator 70 and/or along a path. Movement of the ride vehicle 54 may drive movement of the actuator 70 and therefore movement of the display 60. Moreover, the control system 62 may be configured to operate the actuator 70 to move the display 60 relative to the chassis 100. Thus, movement of the display 60 may be caused by movement of the ride vehicle 54 (e.g., of the chassis 100) and/or by operation of the actuator 70. By way of example, the control system 62 may be configured to instruct the actuator 70 to move the display 60 in a predetermined path, based on a position and/or orientation of the guests 52 within the ride vehicle 54, based on a position and/or orientation of the ride vehicle 54 in the attraction system 50, based on a user input received from one of the guests 52, and the like. For instance, the control system 62 may cause the actuator 70 to move the display 60 in a manner that enables the guests 52 to view the output of the display 60 more easily based on the positioning of the ride vehicle 54.
[0054] The control system 62 may be configured to operate the display 60 to provide an output based on the position and/or orientation of the display 60 in the attraction system 50 as caused by the movement of the ride vehicle 54 and/or the operation of the actuator 70. As an example, the control system 62 may determine a position and/or orientation of the ride vehicle 54, a position and/or orientation of the actuator 70, and the position and/or orientation of the display 60 based on the determined positions and/or orientations of the ride vehicle 54 and the actuator 70. As another example, the control system 62 may determine the position and/or orientation of the display 60 without determining the position and/or orientation of the ride vehicle 54 and/or of the actuator 70. For instance, the sensor 68 may be configured to directly determine the position and/or orientation of the display 60, and the control system 62 may directly determine the position and/or orientation of the display 60 based on data received from the sensor 68 without having to determine the position and/or orientation of the ride vehicle 54 and/or the actuator 70. In any case, based on the determined position and/or orientation of the display 60, the control system 62 may cause the display 60 to provide an output, such as imagery of a virtual environment, that the guests 52 may view from within the chassis 100 as the ride vehicle 54 moves through the attraction system 50.

[0055] FIG. 8 is a flowchart of an embodiment of a method or process 200 for operating an attraction system (e.g., the attraction system 50 of FIGS. 1-7). Any suitable device (e.g., the processing circuitry 66 of the control system 62 of FIGS. 1-7) may perform the method 200. In one embodiment, the method 200 may be implemented by executing instructions stored in a tangible, non-transitory, computer-readable medium (e.g., the memory 64 of the control system 62). For example, the method 200 may be performed at least in part by one or more software components, one or more hardware components, one or more software applications, and the like. While the method 200 is described using steps in a specific sequence, additional steps may be performed, the described steps may be performed in different sequences than the sequence illustrated, and/or certain described steps may be skipped or not performed altogether.
[0056] At block 202, a position and/or orientation of a display may be determined. In one embodiment, the position and/or orientation of the display may be determined based on data received from a sensor. For example, the data may include an image of the attraction system, movement of the display, a positioning of a ride vehicle to which the display is coupled, a positioning of an actuator to which the display is coupled, and so forth. Indeed, the position and/or orientation of the display may be determined or derived based on a determined position and/or orientation of an actuator and/or ride vehicle to which the display is coupled. For example, the actuator and/or ride vehicle may be instructed to move to a target position and/or target orientation (e.g., based on a user input), and the position and/or orientation of the display may be determined based on the target position and/or target orientation of the actuator and/or ride vehicle.
[0057] At block 204, image data may be determined based on the position and/or orientation of the display. For example, reference data that associates respective image data with corresponding positions and/or orientations of the display may be utilized, and the image data associated with the determined position and/or determined orientation of the display may be selected based on the reference data. In an embodiment, the image data may correspond to imagery of a virtual environment.
[0058] At block 206, the display may be instructed to present an output based on the image data. For example, the image data may be transmitted to the display to cause the display to present the output according to the image data. In an embodiment, the output may include an image, such as a single image. In an additional or alternative embodiment, the output may include a video or multiple images presented in a sequential order.
[0059] The steps of the method 200 may be repeated during operation of the attraction system. That is, the position and/or orientation of the display may be repeatedly or continually determined. In response, updated image data may be repeatedly determined, and the display may be repeatedly instructed to present an output based on the updated image data. As such, movement of the display may cause the display to present an updated output. By way of example, the output presented by the display may include imagery of a virtual environment seen from a particular viewpoint corresponding to the positioning of the display, and movement of the display may cause the display to present updated imagery of the virtual environment from an updated viewpoint corresponding to the movement of the display to an updated positioning. Thus, the updated output presented by the display may portray movement of the display in the virtual environment in accordance with movement of the display in the real world. In this manner, the appearance of the virtual environment may be more realistically portrayed.
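A minimal sketch of this repeated determine-select-present loop appears below, assuming hypothetical callables for the sensor query, the image-data lookup, and the display driver. The frame period and stub data are illustrative, not prescribed.

```python
import time

def run_show_effect_loop(get_display_pose, select_image_data, present,
                         period_s=1 / 30):
    """Repeatedly determine the display's position/orientation (block 202),
    determine image data for that pose (block 204), and instruct the display
    to present it (block 206). The callables are placeholders for sensor
    queries, 3-D-model lookups, and display drivers."""
    while True:
        pose = get_display_pose()             # block 202
        image_data = select_image_data(pose)  # block 204
        if image_data is not None:
            present(image_data)               # block 206
        time.sleep(period_s)                  # repeat each frame

# Usage sketch with stub callables (runs three frames, then stops).
if __name__ == "__main__":
    frames = iter([(0, 0, 0), (1, 0, 0), (2, 0, 0)])
    def get_pose():
        return next(frames)                   # stands in for a sensor read
    def select(pose):
        return f"frame_for_{pose}.png"        # stands in for a model lookup
    def present(img):
        print("presenting", img)              # stands in for a display driver
    try:
        run_show_effect_loop(get_pose, select, present, period_s=0.0)
    except StopIteration:
        pass                                  # stub data exhausted
```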
[0060] While only certain features of the disclosure have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the disclosure.
[0061] The techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as “means for [perform]ing [a function]…” or “step for [perform]ing [a function]…”, it is intended that such elements are to be interpreted under 35 U.S.C. 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. 112(f).

Claims

1. An attraction system, comprising:
a ride vehicle configured to move within the attraction system;
a display coupled to the ride vehicle; and
a control system communicatively coupled to the display, wherein the control system is configured to perform operations comprising:
determining a position and/or orientation of the display within the attraction system; and
operating the display to present an image based on the position and/or orientation of the display.
2. The attraction system of claim 1, wherein the control system is configured to perform operations comprising:
determining image data based on the position and/or orientation of the display; and
transmitting the image data to the display to cause the display to present the image based on the image data.
3. The attraction system of claim 2, wherein the control system is configured to perform operations comprising:
referencing data associating a plurality of image data to respective corresponding positions and/or orientations of the display, wherein the plurality of image data comprises the image data; and
selecting the image data from the plurality of image data based on the image data being associated with the position and/or orientation of the display by the data.
4. The attraction system of claim 1, wherein the ride vehicle is configured to move along a path within the attraction system to drive movement of the display.
5. The attraction system of claim 1, comprising an actuator, wherein the control system is communicatively coupled to the actuator, and the control system is configured to instruct the actuator to cause movement of the ride vehicle within the attraction system.
6. The atraction system of claim 5, wherein the actuator comprises a robotic arm.
7. The atraction system of claim 1, wherein the image depicts a view into a virtual space based on a three-dimensional (3-D) model.
8. A non-transitory computer-readable medium comprising instructions that, when executed by processing circuitry, cause the processing circuitry to perform operations comprising: instructing an actuator of an atraction system to move a display coupled to the actuator; determining a positioning of the display; determining image data based on the positioning of the display; and instructing the display to present an image based on the image data.
9. The non-transitory computer-readable medium of claim 8, wherein the instructions, when executed by the processing circuitry, cause the processing circuitry to determine the image data from a three-dimensional (3-D) model of a virtual space based on the positioning of the display within a 3-D coordinate system of the 3-D model and a correlation between the positioning of the display within the 3-D coordinate system and the virtual space.
10. The non-transitory computer-readable medium of claim 8, wherein the actuator is coupled to a ride vehicle of the attraction system, the display is coupled to the actuator, the instructions, when executed by the processing circuitry, cause the processing circuitry to instruct the actuator to move the ride vehicle, and movement of the ride vehicle causes corresponding movement of the display.
11. The non-transitory computer-readable medium of claim 10, wherein the instructions, when executed by the processing circuitry, cause the processing circuitry to perform operations comprising: determining an updated positioning of the display in response to instructing the actuator to move the display; determining additional image data based on the updated positioning of the display; and instructing the display to present an additional image based on the additional image data.
12. The non-transitory computer-readable medium of claim 8, wherein the instructions, when executed by the processing circuitry, cause the processing circuitry to perform operations comprising: receiving data from a sensor; and determining the positioning of the display based on the data received from the sensor.
13. The non-transitory computer-readable medium of claim 8, wherein the instructions, when executed by the processing circuitry, cause the processing circuitry to perform operations comprising: determining a positioning of a guest within the attraction system; and instructing the actuator to move the display based on the positioning of the guest.
14. The non-transitory computer-readable medium of claim 8, wherein the instructions, when executed by the processing circuitry, cause the processing circuitry to perform operations comprising: determining a positioning of a ride vehicle within the attraction system; and instructing the actuator to move the display based on the positioning of the ride vehicle.
15. An attraction system, comprising: a display coupled to an actuator; a ride vehicle configured to move and cause movement of the display; and a control system communicatively coupled to the display, wherein the control system is configured to perform operations comprising: instructing the actuator to move the display; determining a position and/or orientation of the display within the attraction system; and operating the display to present an image based on the position and/or orientation of the display.
16. The attraction system of claim 15, wherein the actuator is coupled to the ride vehicle, movement of the ride vehicle drives movement of the actuator, the movement of the actuator drives movement of the display, and the control system is configured to instruct the actuator to move the display relative to the ride vehicle.
17. The attraction system of claim 15, wherein the control system is configured to present the image based on a correlation between the position and/or orientation and an aspect of a virtual space defined by a three-dimensional (3-D) model.
18. The attraction system of claim 15, wherein the control system is configured to perform operations comprising: receiving a user input; and instructing the actuator to move the display based on the user input.
19. The attraction system of claim 15, comprising a sensor, wherein the control system is communicatively coupled to the sensor, and the control system is configured to perform operations comprising: receiving data from the sensor; and determining the position and/or orientation of the display based on the data received from the sensor.
20. The attraction system of claim 19, wherein the sensor comprises a position sensor, a movement sensor, or both.
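By way of a further non-limiting illustration, the following sketch corresponds to the selection operations recited in claims 2 and 3: image data is associated in advance with respective display positionings, and the entry associated with the current positioning is selected for presentation. The table contents, coordinates, and file names are hypothetical examples, not data disclosed in this application.

from math import dist

# Assumed pre-authored association of display positions (x, y, z) to image data.
IMAGE_TABLE = {
    (0.0, 0.0, 1.5): "scene_entry.png",
    (2.0, 0.0, 1.5): "scene_midpoint.png",
    (4.0, 1.0, 1.5): "scene_finale.png",
}

def select_image_data(display_position):
    """Select the image data associated with the nearest stored positioning."""
    nearest = min(IMAGE_TABLE, key=lambda stored: dist(stored, display_position))
    return IMAGE_TABLE[nearest]

# Example: a display partway along the path selects the midpoint imagery.
print(select_image_data((1.8, 0.1, 1.5)))  # -> scene_midpoint.png

A denser table, or interpolation between neighboring entries, could trade storage for smoother transitions; alternatively, as recited in claim 9, the image data could be rendered on demand from a three-dimensional model using the display's positioning as the virtual camera pose.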
PCT/US2023/021391 2022-05-10 2023-05-08 Robotic arm integrated immersive reality WO2023219966A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202263340279P 2022-05-10 2022-05-10
US63/340,279 2022-05-10
US17/880,174 US20230364523A1 (en) 2022-05-10 2022-08-03 Robotic arm integrated immersive reality
US17/880,174 2022-08-03

Publications (1)

Publication Number Publication Date
WO2023219966A1

Family

ID=86688543

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/021391 WO2023219966A1 (en) 2022-05-10 2023-05-08 Robotic arm integrated immersive reality

Country Status (1)

Country Link
WO (1) WO2023219966A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6179619B1 (en) * 1997-05-13 2001-01-30 Shigenobu Tanaka Game machine for moving object
US20100053029A1 (en) * 2008-09-02 2010-03-04 Disney Enterprises, Inc. Mobile projected sets
US20160346704A1 (en) * 2014-08-11 2016-12-01 Mack Rides Gmbh & Co. Kg Method for Operating a Device, in Particular an Amusement Ride, Transport Means, a Fitness Apparatus or the Like
US20170364145A1 (en) * 2014-08-18 2017-12-21 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23728516

Country of ref document: EP

Kind code of ref document: A1