WO2022216913A1 - Systems and methods for dynamic projection mapping for animated figures - Google Patents
- Publication number
- WO2022216913A1 (PCT/US2022/023802)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- projector
- emitter
- prop
- animated
Classifications
- H04N9/3194: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]; testing thereof including sensor feedback
- G06T7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/292: Analysis of motion; multi-camera tracking
- G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- H04N9/3147: Projection devices for colour picture display; multi-projection systems
- H04N9/3185: Video signal processing therefor; geometric adjustment, e.g. keystone or convergence
- G01B11/2513: Measuring contours or curvatures by projecting a pattern on the object, with several lines projected in more than one direction, e.g. grids
- G06T2207/10048: Image acquisition modality; infrared image
- G06T2207/30196: Subject of image; human being, person
- G06T2207/30204: Subject of image; marker
Definitions
- Amusement parks and other entertainment venues contain, among many other attractions, animated figures to entertain park guests that are queued for or within a ride experience.
- Certain animated figures may be brought to life by projection mapping, which traditionally directs predetermined appearances onto the animated figures.
- a particular animated figure may be visually supplemented with a canned or fixed set of images, which may align with preprogrammed movements of the animated figure.
- advancements may be made to further immerse the guests within a particular attraction, ride, or interactive experience.
- certain animated figures have an internally-positioned projector that generates an unrealistic backlighting or glow via internal or rear projection through a semi-transparent projection surface of the animated figure.
- a dynamic projection mapping system includes a projector configured to project visible light.
- the dynamic projection mapping system also includes a calibration assembly having a sensor configured to detect the visible light projected by the projector and an emitter configured to emit infrared light.
- the dynamic projection mapping system further includes a tracking sensor configured to detect the infrared light emitted by the emitter.
- the dynamic projection mapping system further includes one or more processors configured to perform a calibration to align the projector and the tracking sensor based on sensor signals received from the sensor and from the tracking sensor.
- a dynamic projection mapping system includes a projector configured to project light.
- the dynamic projection mapping system also includes a calibration assembly having a sensor configured to detect the light projected by the projector and an emitter configured to emit light.
- the dynamic projection mapping system further includes a tracking sensor configured to detect the light emitted by the emitter.
- the dynamic projection mapping system further includes one or more processors configured to establish a common origin within a show space for the projector and the tracking sensor based on sensor signals from the sensor and the tracking sensor.
- a method of operating a projection system and an optical tracking system for dynamic projection mapping includes instructing, via one or more processors, a projector to project visible light.
- the method also includes receiving, at the one or more processors, a first sensor signal from a sensor of a calibration assembly, wherein the first sensor signal indicates receipt of the visible light at the sensor.
- the method further includes instructing, via the one or more processors, an emitter of the calibration assembly to emit infrared light.
- the method further includes receiving, at the one or more processors, a second sensor signal from a tracking sensor, wherein the second sensor signal indicates receipt of the infrared light at the tracking sensor.
- the method further includes calibrating, via the one or more processors, the projector and the tracking sensor based on the first sensor signal and the second sensor signal.
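The claimed calibration steps reduce, in essence, to collecting matched observations (a projector pixel seen at the calibration sensor, and the emitter's position seen by the tracking sensor) and solving for the transform that aligns the two devices. A minimal sketch, assuming a simple affine alignment model and using hypothetical helper names (`calibrate`, `to_projector`) that are not from the patent:

```python
import numpy as np

def calibrate(projector_pts, tracker_pts):
    """Estimate an affine map from tracker space into projector space
    from matched calibration observations, via least squares."""
    tracker = np.asarray(tracker_pts, dtype=float)
    projector = np.asarray(projector_pts, dtype=float)
    # Homogeneous tracker coordinates [x, y, 1] so translation is included.
    A = np.hstack([tracker, np.ones((len(tracker), 1))])
    # Solve A @ M ~= projector for the 3x2 affine matrix M.
    M, *_ = np.linalg.lstsq(A, projector, rcond=None)
    return M

def to_projector(M, point):
    """Map a tracked point into projector pixel coordinates."""
    x, y = point
    return np.array([x, y, 1.0]) @ M
```

With several calibration points spread across the show space, the recovered matrix lets the media controller convert any tracked position directly into projector coordinates.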
- FIG. 1 is a schematic diagram illustrating an embodiment of the reactive media system including a projection system and a motion tracking system, in accordance with an embodiment of the present disclosure
- FIG. 2 is a block diagram of an embodiment of the reactive media system of FIG. 1, in accordance with an embodiment of the present disclosure
- FIG. 3 is a front view of human-like facial features projection mapped onto a head portion of an animated figure using the reactive media system of FIG. 1, in accordance with an embodiment of the present disclosure
- FIG. 4 is a front view of animal-like facial features that are projection mapped onto a head portion of an animated figure using the reactive media system of FIG. 1, in accordance with an embodiment of the present disclosure
- FIG. 5 is a perspective view of a show set of an attraction that may utilize the reactive media system of FIG. 1, in accordance with an embodiment of the present disclosure
- FIG. 6 is a perspective view of the show set of the attraction of FIG. 5, wherein an origin point is established for the motion tracking system, in accordance with an embodiment of the present disclosure
- FIG. 7 is a perspective view of the show set of the attraction of FIGS. 5 and 6, wherein the origin point is established for the projection system, in accordance with an embodiment of the present disclosure
- FIG. 8 is a side view of a sensor/emitter assembly that may be utilized to calibrate the projection system and the motion tracking system of FIG. 1, wherein the sensor/emitter assembly utilizes a beam splitter, in accordance with an embodiment of the present disclosure
- FIG. 9A is a side view of a sensor/emitter assembly that may be utilized to calibrate the projection system and the motion tracking system of FIG. 1, wherein the sensor/emitter assembly utilizes fiber-optic strands, in accordance with an embodiment of the present disclosure
- FIG. 9B is an end view of the sensor/emitter assembly of FIG. 9A, in accordance with an embodiment of the present disclosure
- FIG. 10A is a side view of a sensor/emitter assembly that may be utilized to calibrate the projection system and the motion tracking system of FIG. 1, wherein the sensor/emitter assembly includes a housing with multiple light pipes, in accordance with an embodiment of the present disclosure
- FIG. 10B is an end view of the sensor/emitter assembly of FIG. 10A, in accordance with an embodiment of the present disclosure
- FIG. 10C is an end view of the sensor/emitter assembly of FIG. 10A, wherein an emitter includes multiple fiber-optic strands, in accordance with an embodiment of the present disclosure
- FIG. 11A is a side view of a sensor/emitter assembly that may be utilized to calibrate the projection system and the motion tracking system of FIG. 1, wherein the sensor/emitter assembly includes a housing with an emitter, in accordance with an embodiment of the present disclosure;
- FIG. 11B is an end view of the sensor/emitter assembly of FIG. 11A, in accordance with an embodiment of the present disclosure
- FIG. 12A is a side view of a sensor/emitter assembly that may be utilized to calibrate the projection system and the motion tracking system of FIG. 1, wherein the sensor/emitter assembly includes a housing with multiple emitters, in accordance with an embodiment of the present disclosure;
- FIG. 12B is an end view of the sensor/emitter assembly of FIG. 12A, in accordance with an embodiment of the present disclosure
- FIG. 13A is a side view of a sensor/emitter assembly that may be utilized to calibrate the projection system and the motion tracking system of FIG. 1, wherein the sensor/emitter assembly includes a housing with multiple emitters and a sensor mounted on a printed circuit board, in accordance with an embodiment of the present disclosure;
- FIG. 13B is an end view of the sensor/emitter assembly of FIG. 13A, in accordance with an embodiment of the present disclosure;
- FIG. 13C is a cross-sectional view of the sensor/emitter assembly of FIG. 13A, taken at line 13C-13C, in accordance with an embodiment of the present disclosure
- FIG. 14A is a side view of a sensor/emitter assembly that may be utilized to calibrate the projection system and the motion tracking system of FIG. 1, wherein the sensor/emitter assembly includes a prismatic light pipe for multiple emitters and a shrouded light pipe for a sensor, in accordance with an embodiment of the present disclosure;
- FIG. 14B is an end view of the sensor/emitter assembly of FIG. 14A, in accordance with an embodiment of the present disclosure
- FIG. 15 is a first side view of a sensor/emitter assembly that may be utilized to calibrate the projection system and the motion tracking system of FIG. 1, wherein the sensor/emitter assembly includes one or more emitters at the first side, in accordance with an embodiment of the present disclosure
- FIG. 16 is a second side view of the sensor/emitter assembly of FIG. 15, wherein the sensor/emitter assembly includes one or more sensors at the second side, in accordance with an embodiment of the present disclosure.
- FIG. 17 is a schematic view of an image that includes indications of multiple markers within an attraction and that may be used to calibrate the projection system and the motion tracking system of FIG. 1, in accordance with an embodiment of the present disclosure.
- Present embodiments are directed to a reactive media system for an amusement attraction, such as an attraction in which a projector of a media control system directs images onto an external surface of a prop, such as an animated figure.
- a reactive media system for an amusement attraction, such as an attraction in which a projector of a media control system directs images onto an external surface of a prop, such as an animated figure.
- the animated figure may appear more lifelike than certain animated figure systems that internally project images through a semi-transparent surface of an animated figure, thereby generating an unnatural or ethereal glowing appearance.
- the reactive media system leverages external tracking (e.g., via optical performance capture or optical motion capture) of the animated figure to dynamically generate and provide images onto the external surface of the animated figure.
- the animated figure may be fitted with trackers that enable tracking cameras of a motion tracking system of a media control system to discern movements, positions, and orientations of the animated figure in real-time.
- the media control system may operate independently of the animated figure (e.g., by not relying on position, velocity, and/or acceleration information regarding actuators of the animated figure), and the media control system may dynamically generate and fit projected images onto the interactive animated figure at a realistic framerate that emulates live characters, such as by presenting textures, colors, and/or movements that appear to be indistinguishable from the animated figure.
- the media control system may generate and update a skeletal model of the animated figure based on feedback from the tracking cameras.
- the skeletal model generally represents the moveable portions of the animated figure, and is dynamically updated to represent a current three-dimensional position (e.g., including x, y, and z coordinates), orientation, and scale of the animated figure or portions thereof (e.g., a pose of the animated figure).
- the media control system therefore utilizes the skeletal model to generate the images for projection that precisely suit the current position and orientation of the animated figure.
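As a rough illustration of the skeletal-model idea (not code from the patent), a minimal model might store joint positions, overwrite them with tracking-camera feedback each frame, and expose aggregate pose quantities for the renderer; all names here are hypothetical:

```python
class SkeletalModel:
    """Minimal skeletal model: joint positions are updated from tracking
    feedback and aggregate pose quantities are exposed for projection."""

    def __init__(self, joints):
        self.joints = dict(joints)  # joint name -> (x, y, z) in show space

    def update(self, tracked):
        """Overwrite joint positions with the latest tracked coordinates."""
        self.joints.update(tracked)

    def centroid(self):
        """Current 3D centroid of all joints, a crude stand-in for pose."""
        n = len(self.joints)
        return tuple(sum(p[i] for p in self.joints.values()) / n
                     for i in range(3))
```

A real system would also carry per-joint orientation and scale, but the update-from-feedback loop is the same shape.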
- a calibration may be carried out to align and to coordinate the tracking cameras of the motion tracking system and the projector of a projection system.
- the calibration may be done during the setup of a unified tracking and projection system.
- the unified tracking and projection system may simplify the calculations needed for calibration because the tracking system and the projection system are rigidly mounted to one another, such that displacement of one directly affects the other. For example, accidentally bumping the projection system frame may displace it a certain distance, but the tracking system would be displaced by that same distance.
- the adjustments that need to be made may be simplified, since adjusting the unified tracking and projection system adjusts the tracking system and the projection system simultaneously.
- the unified tracking and projection system may include various combinations of tracking systems and projection systems. These combinations are discussed below in further detail; no single combination is preferred over another, and each disclosed embodiment may be considered a plausible solution.
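The benefit of rigid mounting can be sketched in a few lines: with a fixed camera-to-projector offset, any displacement of the rig moves both devices equally, so their relative transform never changes. A hypothetical illustration (names are not from the patent):

```python
class UnifiedRig:
    """Projector and tracking camera rigidly mounted with a fixed
    relative offset; moving the rig moves both devices together."""

    def __init__(self, camera_pos, projector_offset):
        self.camera_pos = list(camera_pos)               # rig position in show space
        self.projector_offset = tuple(projector_offset)  # fixed camera->projector offset

    @property
    def projector_pos(self):
        # Projector position is always camera position plus the rigid offset.
        return tuple(c + o for c, o in zip(self.camera_pos,
                                           self.projector_offset))

    def bump(self, delta):
        """An accidental displacement moves camera and projector equally,
        leaving the camera-to-projector relationship unchanged."""
        self.camera_pos = [c + d for c, d in zip(self.camera_pos, delta)]
```

Because `projector_offset` never changes, a calibration expressed in camera coordinates remains valid after the rig is bumped.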
- the prop may be a full animated robotic figure.
- the prop may be formed by one or more objects (e.g., simpler than a full animated robotic figure) that are moved around via complex SAE.
- the prop may represent a character (e.g., a human-like character, an animal-like character) or may not represent a character (e.g., an inanimate object, such as a building, furniture, water).
- FIG. 1 illustrates a reactive media system 8 (e.g., dynamic projection mapping system) of an amusement attraction 10 that includes a prop, which may be referred to as an animated figure 12, that receives images 14 (e.g., projected content) from a projector 16 (e.g., external projector, optical projector with lens) of a media control system 20.
- the amusement attraction 10 is a show set having a stage ceiling 22, a stage floor 24, and scenery objects 26 disposed between the stage ceiling 22 and the stage floor 24.
- the show set may also include any suitable stage lighting devices 30, such as the illustrated lighting instruments or devices. From a guest area 32 of the amusement attraction 10, multiple guests 34 may view and/or interact with the animated figure 12.
- the reactive media system 8 may be utilized to entertain guests 34 in any suitable entertainment environment, such as a dark ride, an outdoor arena, an environment adjacent to a ride path of a ride vehicle carrying the guests 34, and so forth.
- the projector 16 is external to the animated figure 12, thereby enabling an enclosed volume within the animated figure 12 to be utilized to house components other than the projector 16, such as certain actuation systems.
- the projector 16 is disposed in front of the animated figure 12 and obstructed from sight of the guests 34 by an overhang 36 of the stage ceiling 22.
- the projector 16 directs the images 14 onto an external surface 40 of a body 42 (e.g., structure) of the animated figure 12, which may correspond to a head portion 44 of the animated figure 12.
- the media control system 20 may therefore deliver realistic and engaging textures to the head portion 44, thereby providing an immersive and interactive experience to the guests 34.
- the animated figure 12 is part of a motion control system 50 (e.g., prop control system) that may operate independently of the media control system 20.
- the motion control system 50 may leverage interactive data to dynamically update the animated figure 12.
- the motion control system 50 may instruct actuators to adjust the animated figure 12 and/or to adjust the position of any other suitable components of the amusement attraction 10 that may be viewable to the guests 34.
- the motion control system 50 may control an actuatable motion device 66 (e.g., actuatable motion base) that is physically coupled to the animated figure 12.
- the actuatable motion device 66 may be any suitable motion generating assembly that may move (e.g., translate, rotate) the animated figure 12 laterally, longitudinally, and/or vertically. Furthermore, it should be appreciated that the actuatable motion device 66 may be or include a suspension system and/or flying system that is coupled to the animated figure 12 from above the stage floor 24.
- trackers 60 may be positioned on the animated figure 12.
- the trackers 60 may be positioned on a back surface 62 or on any suitable surface of the animated figure 12.
- the trackers 60 enable a tracking camera 64 of the media control system 20 to sense or resolve a position and an orientation of the animated figure 12 within the amusement attraction 10, such as via optical performance capture or optical motion capture techniques.
- the projector 16 may project the images 14 onto the animated figure 12 in synchronization with an actual, current position and orientation (e.g., pose) of the animated figure 12, without relying on position, velocity, and/or acceleration information from actuators of the animated figure 12.
- the media control system 20 may verify the positioning and operation of the projector 16 based on actuator-derived information from the animated figure 12.
- the reactive media system 8 may include any suitable number of projectors 16, trackers 60, and tracking cameras 64.
- more than one animated figure 12 may be included within a single amusement attraction 10, and the reactive media system 8 may include at least one projector 16 for each animated figure 12.
- the particular infrastructure of the reactive media system 8 enables any number of animated figures 12 that are moveable within an optical range of at least one tracking camera 64 and moveable within a projection cone of at least one projector 16 to receive the images 14.
- multiple projectors 16 may be provided to deliver content to multiple sides of a single animated figure 12.
- certain embodiments of the animated figure 12 may include at least two trackers 60 to enable the tracking camera 64 to resolve the relative positioning of the at least two trackers 60 for efficient tracking of the animated figure 12, though it should be understood that changes in position of a single tracker 60 may also enable resolution of the position of the animated figure 12 with a less complex system.
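The two-tracker remark can be made concrete: from two tracked marker positions, both a position (the midpoint) and an orientation (the heading between the markers) are recoverable, here shown in 2D as a hedged sketch with hypothetical names:

```python
import math

def pose_from_two_trackers(p1, p2):
    """Recover a 2D position and heading from two tracked markers:
    position is the midpoint, heading the angle from marker 1 to 2."""
    mx = (p1[0] + p2[0]) / 2
    my = (p1[1] + p2[1]) / 2
    heading = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    return (mx, my), heading
```

With a single tracker, only position changes are observable directly, which is why the patent notes that two trackers make orientation resolution more efficient.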
- the projectors 16 and the tracking cameras 64 may be physically coupled to one another.
- the projectors 16 and the tracking cameras 64 may be rigidly mounted to a frame to form a unified system 160 so that the projectors 16 and the tracking cameras 64 remain in fixed positions relative to one another (e.g., with a known offset).
- the tracking cameras 64 may be rigidly mounted to frames of the projectors 16.
- the unified system 160 may simplify a calibration of the projectors 16 and the tracking cameras 64. Further, the unified system 160 blocks (e.g., reduces or eliminates) an amount of drift between the projectors 16 and the tracking cameras 64 during operation of the attraction 10.
- the calibration is performed to establish a relationship between the projectors 16 and the tracking cameras 64 to enable the projectors 16 to project the images 14 onto the animated figures 12 that are tracked via the tracking cameras 64.
- the calibration may occur prior to operation of the attraction 10. For example, the calibration may occur before the week begins, each day before the amusement park opens, before each cycle of the attraction 10, or any combination thereof. In an embodiment, the calibration may be triggered by a large movement of the animated figure 12 (e.g., movement beyond a threshold distance across the show set).
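The movement-triggered variant could be sketched as a simple threshold check; the threshold value below is purely illustrative, not a figure from the patent:

```python
import math

RECALIBRATION_DISTANCE = 2.0  # meters; hypothetical threshold across the show set

def needs_recalibration(last_pos, current_pos,
                        threshold=RECALIBRATION_DISTANCE):
    """Trigger recalibration when the figure has moved at least a
    threshold distance since the last calibration."""
    return math.dist(last_pos, current_pos) >= threshold
```

In practice this check would run alongside the scheduled calibrations (daily, per cycle), recalibrating whichever condition fires first.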
- FIG. 1 illustrates an example of an interactive data source 70 that includes guest sensors 72.
- the guest sensors 72 may collect guest input from any guests 34 within the guest area 32.
- the guest input is one form of interactive data that may be utilized to adaptively update the animated figure 12 or the amusement attraction 10.
- the motion control system 50 may generate a response for the animated figure 12 to perform based on the interactive data, and then instruct actuators of the animated figure 12 to perform the response.
- the animated figure 12 is covered with the trackers 60 (e.g., visible or non-visible; active or passive; retro-reflective markers or active emitters). These discrete points on the animated figure 12 may be used directly as visual reference points, on which to base the 2D or 3D pose estimation process. These discrete points may also be identified and fed through a machine learning algorithm, compared against known ground-truth surface poses, with pose matches made in real time.
- the animated figure 12 may be coated with unique patterning (such as facial features imprinted or embedded in the surface).
- the unique patterning may be made up of both infrared reflective and infrared absorbent pigments of a same visible base color, which causes a uniform looking surface in the visible light spectrum (which is best for projecting colored light imagery).
- the patterning would be highly visible and trackable by a trained pose estimation and prediction algorithm running on the hardware.
- This use of specialized pigments prevents the projected visible light from confusing a visible-light sensor tracking system, and avoids having to perform difference calculations from a known base surface against frame-by-frame projected visible light overlays.
- visibly flat surfaces may contain hidden infrared reflecting/absorbing patterning that is viewable by the tracking cameras 64.
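The ground-truth pose matching described above could, in its simplest form, score observed marker points against a library of known poses and pick the best match. This sketch uses sum-of-squared-distance scoring and hypothetical names, standing in for the trained pose-estimation algorithm the patent describes:

```python
def match_pose(observed, pose_library):
    """Return the name of the library pose whose marker layout best
    matches the observed 2D marker points (lowest squared error)."""
    def error(pose_pts):
        return sum((ox - px) ** 2 + (oy - py) ** 2
                   for (ox, oy), (px, py) in zip(observed, pose_pts))
    # Choose the pose whose stored marker layout minimizes the error.
    return min(pose_library, key=lambda name: error(pose_library[name]))
```

A production system would use a learned model and handle marker correspondence and occlusion, but the nearest-ground-truth comparison is the core idea.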
- FIG. 2 is a block diagram of the reactive media system 8 having the media control system 20 that may operate to externally project images onto the animated figure 12 (e.g., without communicatively coupling to or relying exclusively on the motion control system 50).
- the media control system 20 may not directly transmit to or receive communication signals from the motion control system 50.
- the interactive data sources 70 may be communicatively coupled upstream of both the media control system 20 and the motion control system 50 to enable coordination of the media control system 20 and the motion control system 50, without intercommunication between the control systems 20, 50.
- a network device 90 such as a switch or a hub, may be communicatively coupled directly downstream of the interactive data sources 70 to facilitate efficient communications between the interactive data sources 70 and the control systems 20, 50.
- the network device 90 may be omitted, that multiple network devices 90 may be implemented, or that any other suitable data management device may be utilized to facilitate delivery of data from the interactive data sources 70 to the control systems 20, 50.
- the animated figure 12 includes a figure processor 100 and a figure memory 104, which may collectively form all or a portion of a figure controller 102 of the motion control system 50.
- the trackers 60 are disposed on the body 42 of the animated figure 12 to enable the tracking cameras 64 of the media control system 20 to sense the position and orientation, or pose, of the animated figure 12.
- the trackers 60 may be active devices, which may each emit an individualized signal to the tracking cameras 64.
- the trackers 60 may emit infrared light, electromagnetic energy, or any other suitable signal that is detectable by the tracking cameras 64 (and, at least in some cases, undetectable by the guests 34).
- the trackers 60 may be passive devices (e.g., reflectors, pigmented portions) that do not emit a signal and that enable the tracking cameras 64 to precisely distinguish the passive devices from other portions of the animated figure 12 and/or the amusement attraction 10.
- the animated figure 12 is fitted with any suitable actuators 106 that enable the animated figure 12 to move (e.g., ambulate, translate, rotate, pivot, lip synchronize) in a realistic and life-emulating manner.
- the interactive data sources 70 may include any suitable data source that provides a variable set of data over time as interactive data 109.
- the guest sensors 72 may sense guest interactions and relay interactive data indicative of the guest interactions to the figure controller 102. Then, the figure controller 102 may instruct the actuators 106 to dynamically manipulate the animated figure 12 to immediately respond to the interactive data 109.
- the media control system 20 may include the projector 16, the tracking cameras 64, a camera network device 110, and/or a media controller 112.
- the media controller 112 is communicatively coupled to the interactive data sources 70 (e.g., via the network device 90), thereby enabling the media controller 112 to dynamically react to the interactive data 109 and/or to other changes in the amusement attraction 10.
- the media control system 20 may be communicatively isolated from the motion control system 50. That is, the motion control system 50 may be independent from the media control system 20.
- this independence provides operational freedom to the animated figure 12 for adaptively responding to the interactive data 109 in substantially real-time (e.g., within microseconds or milliseconds of an interaction), while the media control system 20 monitors or traces movements of the animated figure 12 to project images thereon, also in substantially real-time.
- the media control system 20 simultaneously performs a media feedback loop that modifies the images that are projected onto the animated figure 12.
- the media control system 20 leverages the tracking cameras 64.
- a type or configuration of the tracking cameras 64 may be individually selected to correspond to and to detect a type of the trackers 60.
- the positioning of the trackers 60, in conjunction with geometric or skeletal models of the animated figure 12, facilitates coordination of projection onto the animated figure 12 in different orientations.
- the tracking cameras 64 are communicatively coupled to the camera network device 110, which relays signals indicative of the current three-dimensional position (e.g., including x, y, and z coordinates relative to an origin) and orientation of the animated figure 12 or portions thereof (e.g., a pose of the animated figure 12) to the media controller 112.
- the camera network device 110 is therefore a network switch or sensor hub that consolidates multiple streams of information from the tracking cameras 64 for efficient processing by the media controller 112.
- the media controller 112 includes a media processor 114 and a media memory 116, which operate together to determine, generate, and/or adjust dynamic images to be projected onto the animated figure 12 in its current position and orientation.
- the media controller 112 may instruct the projector 16 to project the dynamic images onto the animated figure 12.
- the images may be wholly rendered on demand based on a current pose (e.g., position and orientation) of the animated figure 12. In less complex configurations, the images may be generated by adapting a prerecorded video stream to the current pose of the animated figure 12.
- the media controller 112 may be any suitable media generator or game engine with significant processing power and reduced latency. It should be understood that the media controller 112 is therefore capable of generating the images to be projected onto the animated figure 12 in substantially real-time, based on the data received from the tracking cameras 64.
- the media controller 112 may maintain a skeletal model or algorithm that represents the animated figure 12 and its actuatable portions (e.g., jaw, limbs, joints). Based on the data, the media controller 112 may update the skeletal model to represent an actual, current position and orientation of the animated figure 12, and then generate the images to be projected onto the animated figure 12 having the current position and orientation.
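- As a minimal sketch of the skeletal-model bookkeeping described above: the controller keeps a model of the figure's actuatable portions, overwrites it with the latest tracked positions, and reads off the joints bounding the target figure portion. The names (`Joint`, `SkeletalModel`) and sample positions are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Joint:
    name: str
    position: tuple  # (x, y, z) in the attraction's coordinate system

@dataclass
class SkeletalModel:
    joints: dict = field(default_factory=dict)

    def update_from_trackers(self, tracker_readings):
        """Overwrite each joint's pose with its latest tracked position."""
        for name, position in tracker_readings.items():
            self.joints[name] = Joint(name, position)

    def silhouette_points(self, joint_names):
        """Return positions of the joints bounding the target figure portion."""
        return [self.joints[n].position for n in joint_names if n in self.joints]

model = SkeletalModel()
model.update_from_trackers({"jaw": (0.1, 1.6, 0.0), "head": (0.0, 1.7, 0.0)})
print(model.silhouette_points(["head", "jaw"]))
```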
- the projector 16 may include a projector processor 120 and a projector memory 122 to facilitate the presentation of the images onto the animated figure 12.
- the projector processor 120 generally receives data indicative of the images from the media controller 112, and then instructs a light source within the projector 16 to output the images through a lens.
- the projector 16 may be moveable or actuatable to follow and align with the animated figure 12, such as based on commands received from the media controller 112.
- the projector 16 may be stationary.
- the media controller 112 may determine a current silhouette or a shape of a target figure portion of the animated figure 12 that is to receive projected images based on the updated skeletal model, and then instruct the projector 16 to provide the images onto the silhouette.
- the processors 100, 114, 120 are each any suitable processor that can execute instructions for carrying out the presently disclosed techniques, such as a general-purpose processor, system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), a processor of a programmable logic controller (PLC), a processor of an industrial PC (IPC), or some other similar processor configuration. These instructions are encoded in programs or code stored in a tangible, non-transitory, computer-readable medium, such as the memories 104, 116, 122 and/or other storage circuitry or device. As such, the figure processor 100 is coupled to the figure memory 104, the media processor 114 is coupled to the media memory 116, and the projector processor 120 is coupled to the projector memory 122.
- the present embodiment of the reactive media system 8 also includes a show control system 130 that coordinates additional output devices of the amusement attraction 10.
- a show controller 132 of the show control system 130 is communicatively coupled between the network device 90 and one or multiple lighting output devices 134, audio output devices 136, and/or venue-specific special effect output devices 138 (e.g., fog machines, vibration generators, actuatable portions of the scenery objects 26).
- FIG. 3 is a front view of the images 14 provided onto the head portion 44 of the body 42 of the animated figure 12.
- the images 14 may include features or textures that resemble a face. For example, eyebrows, eyes, a nose, lips, and/or wrinkles may be projected onto the head portion 44.
- the animated figure 12 is outfitted with a costume element (e.g., a hat, wig, jewelry), and the media controller 112 and/or the projector 16 may identify an outline of the external surface 40 of the animated figure 12 formed by the costume element (e.g., via projection masking). Then, the projector 16 directs the images 14 to a target portion or figure portion of the external surface 40 of the animated figure 12.
- the media control system 20 may monitor movement of the animated figure 12, such as large movements across the stage and/or small movements of an articulating jaw, and project appropriate, realistic images onto the head portion 44 of the animated figure 12.
- FIG. 4 is a front view of the images 14 provided onto the external surface 40 of the animated figure 12.
- the images 14 provide the animated figure 12 with a character, non-human, or fanciful appearance, such as the appearance of an owl.
- the external surface 40 of the head portion 44 may be textured to complement the images 14.
- the images 14 may also include fantastical, fanciful, or non-human images and/or effects, such as flames, smoke, shapeshifting, color morphing, and so forth.
- In FIG. 5, the amusement attraction 10 is shown including the tracking cameras 64, one or more anchor markers 150, one or more objects 154, and the animated figure 12.
- the tracking cameras 64 may use the anchor markers 150 as reference points to establish a position of the animated figure 12 within the amusement attraction 10.
- FIG. 6 illustrates an embodiment of the attraction 10 including an origin point 156 (e.g., common origin point), which may be centered on one of the anchor markers 150 disposed on one of the objects 154.
- the object 154 may be static, such that it is stationary during the calibration and during the cycle of the amusement attraction 10 (e.g., and the animated figure 12 moves relative to the object 154 during the cycle of the amusement attraction 10).
- the origin point 156 on the top of the object 154 may establish a coordinate system (e.g., 2D or 3D; relative coordinate system for the amusement attraction 10) that does not change during the cycle of the amusement attraction 10. Then, the tracking cameras 64 reference the origin point 156 and the coordinate system to track the animated figure 12 within the coordinate system. Additionally, as shown in FIG.
- the projector 16 may also reference the origin point 156 and the coordinate system to enable the projector 16 to accurately project the images 14 onto the animated figure 12 during the cycle of the amusement attraction 10 (e.g., at all times and in all poses). In this way, the tracking cameras 64 and the projector 16 are calibrated and aligned with one another. In operation during the cycle of the amusement attraction 10, when the tracking cameras 64 detect that the animated figure 12 is at a first set of coordinates, the media controller may then instruct the projector 16 to project the image to the animated figure 12 at the first set of coordinates. Because the tracking cameras 64 and the projector 16 have been calibrated and aligned with one another, the image is properly aligned and mapped onto the animated figure 12.
- the media control system may operate generally as a 2D solution (e.g., in an XY coordinate system), such that the animated figure 12 is captured with the tracking cameras 64, features or markers of the animated figure 12 are identified in 2D space with a shared X/Y origin point of the tracking cameras 64 and the projector 16, and the images are mapped directly to the animated figure 12 in the 2D space.
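- The 2D solution described above can be sketched as fitting a planar affine map from tracking-camera space into the projector's raster from a few shared-origin correspondences, then mapping tracked marker positions directly. The correspondence values and function names below are illustrative assumptions, not the patent's specific method.

```python
import numpy as np

def fit_affine_2d(camera_pts, projector_pts):
    """Least-squares affine map: [u, v] = A @ [x, y] + t, solved via lstsq."""
    X = np.hstack([camera_pts, np.ones((len(camera_pts), 1))])  # (N, 3)
    params, *_ = np.linalg.lstsq(X, projector_pts, rcond=None)   # (3, 2)
    return params

def apply_affine_2d(params, pts):
    """Map tracked 2D marker positions into projector pixel coordinates."""
    X = np.hstack([pts, np.ones((len(pts), 1))])
    return X @ params

# Three non-collinear correspondences fully determine an affine map.
cam = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
proj = np.array([[100.0, 200.0], [300.0, 200.0], [100.0, 400.0]])
A = fit_affine_2d(cam, proj)
print(apply_affine_2d(A, np.array([[0.5, 0.5]])))  # center maps to (200, 300)
```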
- the media control system may operate generally as a 3D solution (e.g., in an XYZ coordinate system). In such cases, machine learning may be used to solve for an estimation of a pose of the animated figure 12 in 3D space.
- the animated figure 12 has a face
- this may generally be a type of facial tracking in which a machine learning model is trained on an extensive set of labeled and tagged facial images, noting pose, expression, proportions, and surface features. The resulting pose estimation can then be used to project masks or digital costume and effects elements in real time.
- the methods may measure the relative position of the tracking cameras 64 of the motion tracking system and the environment, as well as the relative position of the projection lenses of the projectors 16 and the environment.
- the methods may determine the relative position of the motion tracking system and the projection lens (e.g., establish a common origin and coordinate system).
- FIG. 8 illustrates an embodiment of a calibration system 200 with a sensor/emitter assembly 202 (e.g., calibration assembly; co-aligned sensor/emitter assembly).
- the sensor/emitter assembly 202 may be positioned within the amusement attraction, such as on a stationary object within the amusement attraction (e.g., the object 154 in FIG. 6).
- the sensor/emitter assembly 202 may be positioned on a moving object within the show set (e.g., the animated figure 12 of FIG. 1); however, the moving object may be stationary during the calibration with the sensor/emitter assembly 202.
- the sensor/emitter assembly 202 may include a sensor 162 (e.g., light detector) and an emitter 164 (e.g., light emitter).
- the sensor 162 and the emitter 164 may be co-aligned (e.g., coaxial) by means of a beam splitter 166 that is coupled to a fiber optic strand 168.
- the sensor 162 may be a visible light sensor (e.g., a photodiode) to enable the sensor 162 to detect light from the projectors (e.g., the light from the projectors 16 may only be within the visible light spectrum).
- the emitter 164 may be an infrared (IR) light emitting diode (LED) to facilitate detection of light from the emitter 164 by the tracking cameras (e.g., the tracking cameras may only capture light with wavelengths associated with IR light).
- the sensor 162 may detect any type of light (e.g., a first type of light), and the emitter 164 may emit any type of light (e.g., a second type of light that is the same or different than the first type of light).
- the calibration system 200 may further include a sensing area 170 (e.g., end portion; tip) that takes in the visible light from the projectors and that passes the IR light from the emitter 164.
- the emitter 164 may first emit light to enable detection of the light by the tracking cameras and to enable the motion tracking system to determine the coordinates of a point (e.g., at the sensing area 170) in space (e.g., 3D space; the coordinate system).
- the sensor/emitter assembly 202 is illuminated with a light scan (e.g., structured light or similar) from the projector to enable detection of the light by the sensor 162 at the point in space.
- An algorithm in the media controller then makes the two equal to each other (e.g., associated with each other), such that the coordinates (X, Y, Z) of the point in space are equal to a pixel location (X1, Y1) relative to the projector's raster.
- This process is completed for N co-aligned points in space (e.g., at least 3, 4, 5, 6, 7, or 8) to obtain a matrix of points that can then be used by the media controller to calibrate the motion tracking system and the projection system to accurately project onto a 3D object (e.g., the exterior surface of the animated figure) that moves through the space.
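- One conventional way to turn such a matrix of 3D-point/pixel correspondences into a usable calibration is a direct linear transform (DLT): each co-aligned point contributes two rows to a homogeneous system, and at least six points determine a 3x4 projection matrix up to scale. The patent does not name a specific algorithm, so this standard DLT is an illustrative stand-in, checked here on a synthetic matrix.

```python
import numpy as np

def dlt_projection_matrix(points_3d, pixels_2d):
    """Estimate a 3x4 projection matrix from >= 6 point correspondences."""
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, pixels_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    # Null-space solution via SVD: the singular vector with the smallest
    # singular value, reshaped to 3x4 (defined only up to scale).
    _, _, vt = np.linalg.svd(np.asarray(rows))
    return vt[-1].reshape(3, 4)

def project(P, point_3d):
    """Apply the projection matrix and dehomogenize to pixel coordinates."""
    h = P @ np.append(point_3d, 1.0)
    return h[:2] / h[2]

# Synthetic check: recover an illustrative projection matrix from 6 points.
P_true = np.array([[800., 0., 320., 10.],
                   [0., 800., 240., 20.],
                   [0., 0., 1., 5.]])
pts = np.array([[0, 0, 1], [1, 0, 2], [0, 1, 2],
                [1, 1, 1], [2, 1, 3], [1, 2, 3.]])
pix = np.array([project(P_true, p) for p in pts])
P_est = dlt_projection_matrix(pts, pix)
```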
- the light scan from the projector 16 may be carried out in a particular manner.
- a logarithmic or binary search may be used in order to collapse on a single known value in a sorted array.
- a pixel position is a known value (based on detection of a max value in data from the sensor 162).
- the search algorithm becomes quaternary in nature. From there, the goal becomes determining the location of the sensor/emitter assembly 202 as seen by the projector 16 (e.g., 2D projector) into the show space (e.g., 3D space).
- any number of sections may be separately, sequentially illuminated as part of the light scan from the projector 16 to facilitate the calibration.
- the projector 16 may project the visible light in a first section at a first time, a second section at a second time after the first time, and so on to facilitate the calibration.
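- The sectioned light scan and quaternary search described above might be sketched as follows: the projector lights one quadrant of the raster at a time, the sensor's response selects the quadrant containing the sensing area, and recursing halves both axes each round. The sensor is simulated in software here; a real system would read the sensor 162.

```python
def quaternary_search(raster_w, raster_h, sensor_hit):
    """sensor_hit(x0, y0, x1, y1) -> True if the sensor lies in that region."""
    x0, y0, x1, y1 = 0, 0, raster_w, raster_h
    while x1 - x0 > 1 or y1 - y0 > 1:
        mx, my = (x0 + x1) // 2, (y0 + y1) // 2
        # Illuminate each quadrant in turn; descend into the one that triggers.
        # Zero-area quadrants (when one axis has collapsed) are skipped.
        for qx0, qy0, qx1, qy1 in [(x0, y0, mx, my), (mx, y0, x1, my),
                                   (x0, my, mx, y1), (mx, my, x1, y1)]:
            if qx1 > qx0 and qy1 > qy0 and sensor_hit(qx0, qy0, qx1, qy1):
                x0, y0, x1, y1 = qx0, qy0, qx1, qy1
                break
    return x0, y0

# Simulated sensor sitting at pixel (700, 450) of a 1920x1080 raster.
target = (700, 450)
hit = lambda a, b, c, d: a <= target[0] < c and b <= target[1] < d
print(quaternary_search(1920, 1080, hit))  # -> (700, 450)
```

Because both axes halve each round, a 1920x1080 raster needs only about eleven rounds of four illuminations to collapse onto a single pixel.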
- the sensor 162 may be physically separated from the emitter 164.
- the fiber optic strands 168 from the sensor 162 and the emitter 164 may converge into one another at the sensing area 170.
- although the sensing area 170 is shown with a lens having a curved end surface (e.g., concave to bend away from the fiber optic strands 168) at an end or tip, it should be appreciated that the lens may have a flat end surface.
- the fiber optic strand 168 running from the sensor 162 may have a smaller diameter than that of the fiber optic strand 168 associated with the emitter 164.
- the difference in the diameters enables the fiber optic strands 168 to form a concentric ring arrangement at the sensing area 170, as shown in FIG. 9B.
- a respective end of the fiber optic strand 168 that extends from the sensor 162 and a respective end of the fiber optic strand 168 that extends from the emitter 164 are concentric (e.g., co-axial; one circumferentially surrounds the other).
- the concentric ring arrangement may efficiently calibrate the light from the projector to the 3D space in the attraction.
- the concentric ring arrangement may include a fiber-optic strand and/or light-pipe arrangement, a sensor/emitter tip, and/or a sensor amplifier to provide a holistic sensing solution.
- This may include a customization of a bifurcated fiber-optic diffuse sensor/emitter tip, combined with a sensor amplifier.
- This may include a custom rigid, flexible, or hybrid light-pipe that is combined with discrete sensors and/or emitters, and this may or may not also utilize a custom printed circuit board.
- the emitter 164 and the sensor 162 provide different functionality.
- the purpose of the emitter 164 is to provide a tracking point or “marker” for the tracking cameras.
- Any of a variety of IR LED(s) may be utilized as the emitter 164, and the emitter optical output (beam angle) is equivalent to the IR LED specification.
- the emitter 164 may emit light at an approximately 850nm wavelength.
- the sensor 162 is used to detect visible light from the projectors.
- a diameter of the sensor 162 may be sized to correspond with the size of 1 pixel at a target pixel pitch (e.g., 20 pixels per inch; 0.05 inches or 1.27mm per pixel); however, the sensor 162 may be larger or smaller than 1 pixel. In another embodiment, the diameter of the sensor 162 may be approximately 0.5mm.
- the sensor 162 may have peak response in the human-visible light spectrum. Ideally, the sensor 162 is a high resolution (16 bit) ambient light sensor and provides a linear response in the range of 0-65k lux.
- the sensor 162 may have a reading value that increases as the light moves closer to a center of the sensor 162, which may enable sub-pixel accuracy (e.g., accuracy finer than one pixel of the projector).
- the sensor 162 may be a small array of sensors (e.g., phototransistor array) to achieve a similar result.
- the sensor 162 is immune to IR light (inclusive of light leak from the emitter 164).
- the sensor 162 is not a photoresistor or phototransistor.
- a matching sensor amplifier (or equivalent, or similar) shall be connected to the tip of the sensor/emitter assembly 202.
- the emitter 164 may always be on (illuminated).
- the emitter 164 may be controllable, such as via a simple Negative-Positive-Negative (NPN) digital I/O bit.
- the sensor 162 is configured to convert the visible light to an analog signal that is either directly outputted as an analog output (e.g., 0-5V, 0-10V, 0-15V, 0-20V, 5-10V, 5-15V, 5-20V), or is sensed as a threshold on the sensor amplifier.
- the sensor bandwidth or scan rate may be at least 50Hz, ideally at least 100Hz (or at least 150Hz, 200Hz, 250Hz).
- Compatible voltages for the system may be 24 Vdc or 5 Vdc or any other suitable Vdc.
- the sensor amplifier shall be as small as possible as it will be hidden in unconventional (e.g., non-Deutsches Institut für Normung [DIN] rail) mounting locations. It should be appreciated that the optical components are isolated (e.g., the IR light emitted by the emitter 164 is isolated from the visible light received at the sensor 162, and vice versa).
- FIGS 10A-14B illustrate various configurations of a sensor/emitter assembly having a concentric ring arrangement.
- FIG. 10A illustrates an embodiment of a sensor/emitter assembly 180 (e.g., calibration assembly) with an output fiber-optic strand 182 and an input fiber-optic strand 184.
- the fiber-optic strands 182, 184 may be supported within a housing 186 (e.g., annular housing), and the fiber-optic strands 182, 184 may exit the housing 186 at an exit 190 (e.g., termination) to extend to the emitter and the sensor.
- FIG. 10B illustrates the concentric ring arrangement formed by the emitter (e.g., via the output fiber-optic strand 182) and the sensor (e.g., via the input fiber-optic strand 184).
- FIG. 10C illustrates an embodiment in which multiple output fiber-optic strands 182 are distributed in a ring.
- FIG. 11A illustrates an embodiment of a sensor/emitter assembly 190 (e.g., calibration assembly) with the emitter 164, as well as an input fiber-optic strand 194 that extends to the sensor.
- the emitter 164 and the input fiber-optic strand 194 may be supported within a housing 196 (e.g., annular housing), and the input fiber-optic strand 194 may exit the housing 196 via an exit 198 to extend to the sensor.
- a light pipe 199 (e.g., annular pipe)
- FIG. 11B illustrates the concentric ring arrangement formed by the emitter 164 and the sensor (e.g., via the input fiber-optic strand 194).
- FIG. 12A illustrates an embodiment of a sensor/emitter assembly 210 (e.g., calibration assembly) with multiple emitters 164 (e.g., arranged in a ring), as well as an input fiber-optic strand 214.
- the emitters 164 and the input fiber-optic strand 214 may be supported within a housing 216 (e.g., annular housing), and the input fiber-optic strand 214 may exit the housing 216 to extend to the sensor.
- FIG. 12B illustrates the concentric ring arrangement formed by the emitter 164 and the sensor (e.g., via the input fiber-optic strand 214).
- FIG. 13 A illustrates an embodiment of a sensor/emitter assembly 220 (e.g., calibration assembly) with multiple emitters 164 (e.g., arranged in a ring), as well as the sensor 162.
- the emitters 164 and the sensor 162 may be supported within the housing 226.
- the emitters 164 and the sensor 162 may also be supported on a printed circuit board 228 to facilitate coordinated emission of the light by the emitters 164, as well as processing and communication of light detected via the sensor 162, for example.
- a first light pipe 222 extends to the sensor 162 to guide and to isolate the light from the projector
- a second light pipe 224 surrounds the first light pipe 222 to guide and to isolate the light emitted by the emitter 164.
- FIG. 13B illustrates the concentric ring arrangement formed by the sensor 162 (e.g., via the first light pipe 222) and the emitters 164 (e.g., via the second light pipe 224).
- FIG. 13C is taken within line 13C-13C in FIG. 13A, and FIG. 13C illustrates the emitters 164 and the sensor 162 mounted on the printed circuit board 228.
- FIG. 14A illustrates an embodiment of a sensor/emitter assembly 230 (e.g., calibration assembly) with multiple emitters 164 (e.g., arranged in a ring), as well as the sensor 162.
- a first light pipe 232 (e.g., shrouded light pipe; annular pipe)
- a second light pipe 234 (e.g., prismatic light pipe; annular pipe)
- the emitters 164 and the sensor 162 may be supported on a printed circuit board 238 to facilitate coordinated emission of the light by the emitters 164, as well as processing and communication of light detected via the sensor 162, for example.
- FIG. 14B illustrates the concentric ring arrangement formed by the sensor 162 (e.g., via the first light pipe 232) and the emitters 164 (e.g., via the second light pipe 234), as well as the emitters 164 and the sensor 162 mounted on the printed circuit board 238.
- FIG. 15 illustrates a first side (e.g., a front side) of an embodiment of a sensor/emitter assembly 240 (e.g., calibration assembly).
- an emitter 164 may be positioned such that the emitter 164 is visible from the first side of the sensor/emitter assembly 240.
- the emitter 164 may be positioned on a first surface of a housing 242 (e.g., rigid housing; plate).
- the emitter 164 may function in a similar manner, or the same manner, as described in one or more embodiments mentioned herein.
- FIG. 16 illustrates a second side (e.g., a rear side) of an embodiment of the sensor/emitter assembly 240.
- a sensor 162 may be positioned such that the sensor 162 is visible from the second side of the sensor/emitter assembly 240.
- the sensor 162 may be positioned on a second surface of the housing 242 (e.g., opposite the first surface of the housing).
- the sensor 162 may function in a similar manner, or the same manner, as described in one or more embodiments mentioned herein. With reference to FIGS.
- the emitter 164 and the sensor 162 may both be positioned on the housing 242 such that the emitter 164 and the sensor 162 are in fixed positions relative to one another, but may not be co-located (e.g., not facing the same direction and/or one does not circumferentially surround the other).
- the emitter 164 and the sensor 162 are shown as being positioned on opposite sides of the housing 242 in a co-axial (e.g., aligned) relationship; however, the emitter 164 and the sensor 162 may be positioned on the housing 242 (or on multiple housings or separate structures) such that the emitter 164 and the sensor 162 are in fixed positions relative to one another without being co-located or co-axial.
- the emitter 164 and the sensor 162 may be located at two different fixed positions in the same plane (e.g., spaced apart from one another in the same plane; co-linear).
- the emitter 164 and the sensor 162 may be located at two different fixed positions in different planes (e.g., offset in three-dimensions) and/or on different housings (or different, physically separate structures).
- the spatial relationship between the sensor 162 and the emitter 164 is known and taken into account.
- the housing 242 (or the multiple housings or the physically separate structures) may be coupled to an object that is configured to be worn, held, or carried by the prop (e.g., band, clothing, jewelry, or tool).
- the calibration may be carried out via other types of calibration assemblies.
- multiple retro-reflective dots (e.g., markers; 7, 8, 9, 10, or more) may be placed in the attraction at known locations/coordinates.
- the projector 16 may scan across the raster (e.g., a light scan; across two-dimensional pixels that form the raster).
- An imaging sensor 250 (e.g., camera) mounted to the projector 16 may capture/generate an image 252 of the attraction.
- the imaging sensor 250 detects a bright point 254, and thus, the image 252 includes indications of the bright points 254. Based on the relative locations of all the bright points 254 detected by the imaging sensor 250, the media controller may determine a respective location (e.g., coordinates) that correspond to each of the bright points 254.
- a first bright point 254 that is in an upper right of the image 252 corresponds to a first retro-reflective dot on a ceiling (e.g., at a first known location/coordinates in the attraction), while a second bright point 254 that is in a lower left of the image 252 corresponds to a second retro-reflective dot on a floor (e.g., at a second known location/coordinates in the attraction).
- the imaging sensor does not need to be high resolution or well-aligned to the projector.
- the media controller may determine a respective pixel that corresponds to each of the retro-reflective dots (and thus, links the respective pixel to the coordinates in the attraction).
- the data is provided to a reverse mapping algorithm that calculates a location of the projector 16 relative to the retro-reflective dots (and thus, relative to the coordinates in the attraction/the coordinate system for the attraction).
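- Under the common pinhole-model assumption, the reverse-mapping step might be sketched as extracting the center of projection from an estimated 3x4 projection matrix P = [M | p4]: the projector's location C is the point that P projects to nowhere, so P @ [C, 1] = 0 and C = -M⁻¹ p4. The matrix values below are illustrative, not taken from the patent.

```python
import numpy as np

def projection_center(P):
    """Recover the center of projection from a 3x4 projection matrix."""
    M, p4 = P[:, :3], P[:, 3]
    # Solve M @ C = -p4, i.e. C = -inv(M) @ p4, without forming the inverse.
    return -np.linalg.solve(M, p4)

# Illustrative projection matrix relating attraction coordinates to pixels.
P = np.array([[800., 0., 320., 10.],
              [0., 800., 240., 20.],
              [0., 0., 1., 5.]])
print(projection_center(P))  # projector location in attraction coordinates
```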
- the tracking cameras of the motion tracking system may also detect the retro-reflective dots in the attraction.
- the tracking cameras may include a light source (e.g., light rings) to illuminate the retro-reflective dots to facilitate detection of the retro-reflective dots.
- the media controller may establish a location of the tracking cameras relative to the coordinates in the attraction/the coordinate system for the attraction.
- the projector system and the motion tracking system may share an origin point/coordinate system.
- this technique provides passive markers (e.g., the retro- reflective dots) and the same markers are used to calibrate/align the projector 16 and the tracking cameras 64.
- this technique is forgiving of minor movements as the calibration may be completed even if objects (e.g., the walls or the objects in the attraction) move relative to one another.
- the calibration process may utilize detectors (e.g., light detectors; instead of passive retro-reflective dots).
- multiple detectors may be placed in the attraction (e.g., on the walls or objects; on the prop, such as on the animated figure) at known locations/coordinates.
- the projector 16 may perform a light scan across the attraction 10, and each of the detectors may be triggered when the light is detected to provide an X/Y coordinate as position input to the media controller. Then, a reverse mapping is performed to establish a location of the projector 16 relative to the detectors (and thus, relative to the coordinates in the attraction/the coordinate system for the attraction).
- the tracking cameras of the motion tracking system may also detect the detectors in the attraction.
- the projector system and the motion tracking system may share an origin point/coordinate system.
- the calibration process may utilize a combination of emitters (e.g., light emitters, such as LEDs; instead of passive retro-reflective dots) and detectors (e.g., light detectors).
- multiple emitters and detectors may be placed in the attraction (e.g., on the walls or objects; on the prop, such as on the animated figure) at known locations/coordinates.
- the projector 16 may perform a light scan across the attraction 10.
- the emitter illuminates (e.g., emits light).
- the imaging sensor 250 may capture/generate an image that includes indications of the emitters.
- the media controller may determine a respective pixel that corresponds to each of the emitter/detector pairs (and thus, links the respective pixel to the coordinates in the attraction).
- the data is provided to a reverse mapping algorithm that calculates a location of the projector 16 relative to the emitter/detector pairs (and thus, relative to the coordinates in the attraction/the coordinate system for the attraction).
- the tracking cameras of the motion tracking system may also detect the emitter/detector pairs in the attraction in the same way (e.g., the tracking cameras are associated with a light source).
- the emitters of the emitter/detector pairs may be turned on to emit light (e.g., in sequence or simultaneously).
- detection of light from the light scan of the projector 16 at the detector may cause the emitter to turn off.
- This may be detected by the imaging sensor 250 to enable the imaging sensor 250 to capture/generate an image that includes indications of the emitters.
- the light emitted by the emitters may be detected by the tracking cameras. In this way, the projector system and the motion tracking system may share an origin point/coordinate system.
- the calibration process may utilize emitters (e.g., light emitters, such as LEDs; instead of passive retro-reflective dots).
- the emitters may be placed in the attraction (e.g., on the walls or objects; on the prop, such as on the animated figure) at known locations/coordinates.
- the calibration process may be carried out without any detectors (e.g., light detectors) in the attraction.
- the emitters may emit infrared light, which may be detected by the imaging sensor 250.
- the projector 16 may perform a light scan across the attraction 10.
- the imaging sensor 250 may detect when the light from the light scan crosses the previously identified emitters.
- the imaging sensor 250 may be a high-resolution camera and may be configured to capture both visible light and invisible light.
- the calibration process may utilize multiple imaging sensors 250, and the data may be averaged and/or compared to provide increased accuracy (e.g., as compared to only one imaging sensor 250).
- the light emitted by the emitters may be detected by the tracking cameras. In this way, the projector system and the motion tracking system may share an origin point/coordinate system.
- the calibration process may utilize a rigid frame structure that is dropped or otherwise temporarily placed into the attraction (e.g., via an automated and/or motorized system).
- the rigid frame structure as well as detectors (e.g., light detectors) coupled thereto, may be passed over with a light scan from the projector 16 to facilitate the calibration process described in detail herein.
- Once the origin point/coordinate system is established, the rigid frame structure may be removed from the attraction (or the show set of the attraction).
- the rigid frame structure is used only temporarily for calibration purposes (e.g., it is brought in, calibration is completed, and then it is taken out).
- the rigid frame structure may be a space frame (or physical “wireframe”) that actually encompasses an object of interest (e.g., the prop, such as the animated figure) onto which the projector 16 will project images during the show.
- the rigid frame structure may be the object of interest, such as a piece of show action equipment that is projected onto during the show, but that only makes an appearance during one time period in the show.
- This calibration technique may work well for an amorphous surface, like a fabric ghost that does not have any geometric features or structure to otherwise mount retro-reflective dots, detectors, or emitters.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Geometry (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
Claims
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023561390A JP2024514565A (en) | 2021-04-09 | 2022-04-07 | System and method for dynamic projection mapping of anime figures |
CA3213712A CA3213712A1 (en) | 2021-04-09 | 2022-04-07 | Systems and methods for dynamic projection mapping for animated figures |
KR1020237038644A KR20230165343A (en) | 2021-04-09 | 2022-04-07 | System and method for dynamic projection mapping correction |
CN202280027526.1A CN117157971A (en) | 2021-04-09 | 2022-04-07 | System and method for dynamic projection mapping of animated figures |
EP22719170.7A EP4320857A1 (en) | 2021-04-09 | 2022-04-07 | Systems and methods for dynamic projection mapping for animated figures |
Applications Claiming Priority (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202163173327P | 2021-04-09 | 2021-04-09 | |
US63/173,327 | 2021-04-09 | ||
US202163177234P | 2021-04-20 | 2021-04-20 | |
US63/177,234 | 2021-04-20 | ||
US202163212507P | 2021-06-18 | 2021-06-18 | |
US63/212,507 | 2021-06-18 | ||
US17/714,818 | 2022-04-06 | ||
US17/714,818 US20220323874A1 (en) | 2021-04-09 | 2022-04-06 | Systems and methods for dynamic projection mapping for animated figures |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022216913A1 true WO2022216913A1 (en) | 2022-10-13 |
Family
ID=81386928
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2022/023802 WO2022216913A1 (en) | 2021-04-09 | 2022-04-07 | Systems and methods for dynamic projection mapping for animated figures |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2022216913A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023239802A1 (en) * | 2022-06-08 | 2023-12-14 | Universal City Studios Llc | Calibration systems and methods for dynamic projection mapping |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050030486A1 (en) * | 2003-08-06 | 2005-02-10 | Lee Johnny Chung | Method and system for calibrating projectors to arbitrarily shaped surfaces with discrete optical sensors mounted at the surfaces |
2022
- 2022-04-07 WO PCT/US2022/023802 patent/WO2022216913A1/en active Application Filing
Non-Patent Citations (1)
Title |
---|
DAIKI TONE ET AL: "FibAR: Embedding Optical Fibers in 3D Printed Objects for Active Markers in Dynamic Projection Mapping", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 6 February 2020 (2020-02-06), XP081943614, DOI: 10.1109/TVCG.2020.2973444 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4077787B2 (en) | Interactive video display system | |
KR20220099580A (en) | Head-mounted display for virtual and mixed reality with inside-out positional, user body and environment tracking | |
US11772276B2 (en) | Systems and methods for optical performance captured animated figure with real-time reactive projected media | |
JP7482881B2 (en) | Augmented reality systems for recreational vehicles | |
EP3878529A1 (en) | Interactive entertainment system | |
US20220323874A1 (en) | Systems and methods for dynamic projection mapping for animated figures | |
WO2022216913A1 (en) | Systems and methods for dynamic projection mapping for animated figures | |
US11189061B2 (en) | Systems and methods for virtual feature development | |
EP3454098A1 (en) | System with semi-transparent reflector for mixed/augmented reality | |
US20210374982A1 (en) | Systems and Methods for Illuminating Physical Space with Shadows of Virtual Objects | |
EP3729235B1 (en) | Data processing | |
US20230403381A1 (en) | Calibration systems and methods for dynamic projection mapping | |
US20220347705A1 (en) | Water fountain controlled by observer | |
US20220327754A1 (en) | Systems and methods for animated figure display | |
WO2023239802A1 (en) | Calibration systems and methods for dynamic projection mapping | |
KR102124564B1 (en) | Apparatus and Method For Image Processing Based on Position of Moving Light Source in Augmented Reality | |
US20240280700A1 (en) | Optical tracking system with data transmission via infrared | |
WO2023277020A1 (en) | Image display system and image display method | |
RU2772301C1 (en) | Augmented reality system for riding attraction | |
JP2011215919A (en) | Program, information storage medium and image generation system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22719170 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: P6002373/2023 Country of ref document: AE |
|
ENP | Entry into the national phase |
Ref document number: 3213712 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023561390 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11202307523P Country of ref document: SG |
|
ENP | Entry into the national phase |
Ref document number: 20237038644 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020237038644 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022719170 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022719170 Country of ref document: EP Effective date: 20231109 |