WO2023196845A2 - System and method for providing dynamic backgrounds in live-action videography - Google Patents

System and method for providing dynamic backgrounds in live-action videography

Info

Publication number
WO2023196845A2
WO2023196845A2 (PCT application PCT/US2023/065368)
Authority
WO
WIPO (PCT)
Prior art keywords
display
background
display wall
computer
camera
Application number
PCT/US2023/065368
Other languages
French (fr)
Other versions
WO2023196845A3 (en)
Inventor
Aaron Daly
Wes PALMER
Julian SARMIENTO
Johnny FISK
Original Assignee
Fusefx Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Fusefx Corporation filed Critical Fusefx Corporation
Publication of WO2023196845A2 publication Critical patent/WO2023196845A2/en
Publication of WO2023196845A3 publication Critical patent/WO2023196845A3/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/2224 Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Definitions

  • the present disclosure describes a complete and user-friendly system that includes an extensive library of cinematic backdrop assets and virtual production technology.
  • a system configured to allow tablet devices to send commands to servers, which change and adjust content in real-time on a wall of LED displays, where the content is a series of background images.
  • an e-commerce ecosystem that allows filmmakers to purchase or rent content, which is delivered on demand to a stage anywhere in the world.
  • the system and method described herein enable directors of photography to retain greater creative control during production. Without the extensive pre-production or post-production creation and manipulation of digital assets, filmmakers experience all of the upsides of virtual production, with the added benefit of creative control on set. Production designers and art directors can select backdrops from an image library for virtual productions.
  • the backdrops may include motion elements like fog, moving traffic, or other moving imagery, allowing the backdrop to take on greater depth and become a more integrated part of the set.
  • advantages of the system and method described herein include time and cost savings, integration between background and camera/lighting controls, improved creative control, improved availability of background imagery, better integration between set and background, and improved final output of production.
  • FIG. 1 illustrates an exemplary embodiment of the virtual production system.
  • FIGS. 2-9 illustrate an exemplary embodiment of the mobile application.
  • a system comprising a library of interactive background imagery for virtual production, a computer-implemented method allowing the images to be directly manipulated on set as “live action backdrops,” and a user-friendly mobile application (“app”) to control the image being used.
  • the library comprises background images created specifically for use in filmmaking.
  • the background images are recorded separately using any recording means such as a standard video camera (digital or not) or a movie camera.
  • the images have been shot at optimal resolution, taking into account the specified image properties required within the film and television industry.
  • manipulation of the images includes adjustments to image depth, color, contrast, time of day, horizon line, blur, motion, focus, zoom, temperature, and parallax.
  • the app requires virtually no training and can be used directly on set while the image is being displayed on a wall of light emitting diode (LED) panels.
  • the system allows for real-time changes without latency.
  • the manipulation of the background images may be done dynamically while filming.
  • a method for synchronizing a background action sequence with a foreground action sequence comprising: displaying the background action sequence on a LED display; providing the foreground action sequence in front of the LED display; and causing a change in the background action sequence as a function of and in accordance with action occurring in the foreground action sequence.
  • a human operator causes the change in the background action sequence as a result of a visual cue received by any member of the film crew.
  • the director can also produce final images that do not need to be retouched in the post-production phase and the overall costs of production are reduced.
  • the actors can see the context within which they are performing as the action unrolls, instead of having to imagine what the background scene will look like.
  • the actor is at least partially surrounded by one or more displays presenting images of a virtual environment.
  • the displays surrounding the stage area are formed from multiple LED panels that are generally fixed in position and mostly surround the stage area.
  • the displays can be greater than 10 feet tall, for example, greater than 20 ft, 30 ft, 40 ft or even taller than 50 ft or even 60 ft tall.
  • the displays can be greater than 20 feet wide, for example, greater than 30 ft, 50 ft, 70 ft, or even wider than 90 ft.
  • sensors, for example an OptiTrack system, can be used to determine the position and orientation of the taking camera during a performance, and/or the camera can be a motion-tracking-equipped camera.
  • the system adjusts the virtual environment displayed by the immersive cave or walls in real-time to correspond to the orientation and position of the taking camera. In this way, images of the virtual environment, created using an LED background display that displays a series of composite images generated from a plurality of digital high-resolution photographic images, can be perspective-correct over a performance.
  • portions of the displays surrounding the stage area can be used to simulate LED lights that illuminate the stage area.
  • the number, location, color, and intensity of the simulated lights can be selected in order to achieve a desired lighting effect.
  • the system described herein allows portable devices to display, control, manipulate high resolution image-based backgrounds for filmmaking applications.
  • the system comprises both a mobile application and a desktop application that can be used with a 3D software development tool such as Unity or Unreal Engine, for example using Unreal’s nDisplay functions to drive a display.
  • the mobile application is designed to be used with tablet devices, but can also be run on cell phones or other desktop computers.
  • the mobile application is an Android application.
  • the mobile application can also be deployed to iOS, Windows and Mac devices.
  • the mobile application is platform agnostic.
  • FIGS. 2-9 illustrate an embodiment of the mobile app user interface.
  • the mobile application has an intuitive user interface and sends commands via the OSC protocol to a variety of servers running the desktop application, which interpret those commands and modify content that is displayed and updated in real-time.
  • the system described herein includes a server, which is created for every “listening computer” within the setup for the intended display.
  • the software is display agnostic.
  • the server can include a processor, a processing acceleration unit, a digital signal processor (DSP), a special purpose processor, and/or the like.
  • the server can include/operate a real-time gaming engine such as Unreal Engine or other similar rendering engine.
  • the real-time gaming engine can load the virtual environment for display on the displays surrounding the stage area.
  • the mobile application sends commands via the OSC protocol to each “listening computer,” which then updates its variables and changes the way that content is rendered using Unreal Engine’s real-time renderer.
  • the mobile device is connected to the servers either wirelessly via secure WiFi connection to a router, or is hardwired via Ethernet to a connected router.
  • a separate media server is integrated into the system and interfaces directly with Unreal Engine and OSC, to playback all video components and integrate them into the final display output in real-time.
  • the mobile application sends OSC commands to the desktop application, which then sends additional OSC commands to the media server to control playback and drive updates to the display output.
  • the system described herein is an ecommerce ecosystem that allows users to digitally “rent” or download virtual production backdrops from an extensive online library, and implement them in a filmmaking context as a new alternative to traditional backdrops or green/blue screen for compositing in post-production.
  • the system comprises a series of web servers, a client-facing web and ecommerce interface, a back-end system to track rentals and background expiration, and the actual system implementation on set to drive the content.
  • a user is able to make a transaction on the client-facing web interface, which triggers content downloads and availability on the back-end server.
  • the user can then access that content from a set or soundstage that they intend to use the background on.
  • the system then interfaces with the back-end server, verifies that the content has been purchased and the timeframe for which it should be available, and allows the user to download and implement the solution.
  • the system described herein allows the manipulation of high resolution image-based backgrounds, including changes to image depth, horizon line, zoom, focus, color temperature, time of day, lighting, motion, frustum (camera position), and/or fog.
  • a series of high-resolution aligned and stitched panoramic images are fed into the system. Those images are then mapped onto a three-dimensional object that matches the exact curvature and size of the intended display, and are then manipulated by the application as a material and a shader.
  • a user may adjust a series of simple sliders in an app, and control saturation and exposure, as well as add digital lights to selective areas and shape the appearance of the background in real-time, for example continuously and simultaneously with the live action being viewed and/or captured during a production shoot (i.e., it removes the need for stopping the production to adjust or transition the background image).
  • the images can also be offset on a vertical or horizontal basis to change the appearance of a horizon line and accommodate multi-story sets and different creative decisions.
  • the high-resolution images can also be zoomed into with projected cross-hairs to achieve a desired result.
  • the depth of field may be controlled, such as in accordance with the methods disclosed in U.S. Patent No. 7,164,462, whose disclosure is incorporated by reference in its entirety.
  • blending and/or overlaying techniques can be used on the background image to create special visual effects.
  • one of the objects of the background image can be set to a different level of focus manually in one image and blended into the second image. The sharpness of the objects in the combined image then varies.
  • the user can integrate motion elements shot on set (boats and water moving on a river, for example) to bring the background to life. Those elements can be controlled, scrubbed, and played back on command.
  • the user can also integrate simulated fog/rain effects that are adjusted on a depth-based basis, in addition to transparency, radius, position, color, direction and speed.
  • the desktop application receives a normalized value from the mobile application between 0 and 1.
  • the desktop application then uses the total number of images (X) within a given set to compute a step ratio of 1/(X-1) between adjacent images, and blends them accordingly within the normalized range.
  • a linear interpolation is then applied between images at that proper ratio to create a seamless blending effect.
  • a reverse linear interpolation is applied for the digital lighting effects, effectively allowing an alternative time of day to be displayed within a very specific, defined region of the image in the form of an inverse mask.
  • This solution allows the user to shape the image without introducing unwanted noise and grain from purely boosting exposure or other base variables of an image, mixing elements of night and day as needed to achieve a desired creative result.
  • Digital lights can be shaped with temperature, tint, opacity, radius, feathering, and exposure, and “barn doors” can be used to make a variety of adjustments and shape the appearance of the top, left, right, and bottom edges of the light.
  • Digital lights can be added and removed via the mobile application, and easily moved around the background image by touch, with all changes updated and rendered on the display in real-time.
  • additional displays can also be controlled by both applications.
  • the mobile application allows a user to sample a color from the background that they currently have displayed, and send commands to the desktop application to populate that additional display with that color.
  • the sampled color changes dynamically along with other changes a user may make in the mobile application, to reflect digital lighting adjustments and other creative changes.
  • the user can also manually override that color, and dial in specific hue, saturation, temperature, and exposure adjustments to send a specific color to those display methods.
  • the system can utilize three-dimensional camera position data to project an “inner frustum.” That frustum can be controlled separately from the rest of the image (blurred, defocused, applied with a specific green color for the purposes of pulling a chroma key, among other functions), and can be used to simulate the background appearing to a camera to be at a different distance than the reality of the display.
  • the application can receive a distance variable to simulate in three-dimensional space, only within the confines of the inner frustum.
  • a three-dimensional object is scaled proportionally to fill the display at that specified distance. For example, this allows the system to create the illusion to a tracked camera that a skyline background image of New York City is a full mile away, instead of the reality of only ten feet from the image on a display.
  • U.S. Patent No. 11,132,837, whose disclosure is incorporated by reference in its entirety, discloses a method wherein the images of the virtual environment within a frustum of the camera are updated on one or more displays based on movement of the camera.
  • the background is offset as the camera moves to achieve an appropriate amount of parallax with all foreground objects in front of the display, as if the background image is more distant than it is.
  • the user can use IP addresses to designate proper listening servers to receive commands from the application, and will eventually be able to send DMX commands based on color samples to any lighting fixtures that have been designated to receive them. Those fixtures could then be updated in real-time to reflect color changes made to the background image.
  • the arrangement of all settings can be saved as preset files, so users can save preferences and specific background settings, share them and apply those settings to achieve the same appearance across multiple shoot days and applications, as well as quickly switch between appearances as needed.

Abstract

A system configured to allow tablet devices to send commands to servers, which change and adjust virtual production backdrops in real-time on a wall of LED displays surrounding a stage area for filmmaking, where the adjustments include image depth, color, contrast, time of day, horizon line, blur, motion, focus, zoom, temperature, and parallax.

Description

SYSTEM AND METHOD FOR PROVIDING DYNAMIC BACKGROUNDS IN LIVE-ACTION VIDEOGRAPHY
[0001] FIELD
[0002] Described herein are systems and methods in lighting and backdrop technologies for the live entertainment and film & television industries.
[0003] BACKGROUND AND SUMMARY
[0004] Filmmakers who shoot visual content (film, television, commercials, etc.) within a stage environment have previously only been able to work with visual effects content produced before or after their shoot, such as computer-generated imagery, custom photography specific to the production, or stock photography from various sources. The custom content created for these productions is typically expensive, has long lead times, and cannot be controlled by filmmakers during their shoot. Adjustments that need to be made are time-intensive, happen off-set, and are beyond the creative control of the cinematographer or the production designer. Furthermore, there is no complete system that includes a library of images created for filmmakers along with a control mechanism that can be used during the shooting of a production from anywhere on the set.
[0005] The present disclosure describes a complete and user-friendly system that includes an extensive library of cinematic backdrop assets and virtual production technology. In one aspect, provided herein is a system configured to allow tablet devices to send commands to servers, which change and adjust content in real-time on a wall of LED displays, where the content is a series of background images. In another aspect, also provided herein is an e-commerce ecosystem that allows filmmakers to purchase or rent content, which is delivered on demand to a stage anywhere in the world.
[0006] The system and method described herein enable directors of photography to retain greater creative control during production. Without the extensive pre-production or post-production creation and manipulation of digital assets, filmmakers experience all of the upsides of virtual production, with the added benefit of creative control on set. Production designers and art directors can select backdrops from an image library for virtual productions. The backdrops may include motion elements like fog, moving traffic, or other moving imagery, allowing the backdrop to take on greater depth and become a more integrated part of the set. In summary, the advantages of the system and method described herein include time and cost savings, integration between background and camera/lighting controls, improved creative control, improved availability of background imagery, better integration between set and background, and improved final output of production.
[0007] BRIEF DESCRIPTION OF THE DRAWINGS
[0008] Aspects of the present disclosure are best understood from the following detailed description when read with the accompanying figures.
[0009] FIG. 1 illustrates an exemplary embodiment of the virtual production system.
[0010] FIGS. 2-9 illustrate an exemplary embodiment of the mobile application.
[0011] DETAILED DESCRIPTION
[0012] Provided herein is a system comprising a library of interactive background imagery for virtual production, a computer-implemented method allowing the images to be directly manipulated on set as “live action backdrops,” and a user-friendly mobile application (“app”) to control the image being used. In some embodiments, the library comprises background images created specifically for use in filmmaking. In some embodiments, the background images are recorded separately using any recording means such as a standard video camera (digital or not) or a movie camera. In some embodiments, the images have been shot at optimal resolution, taking into account the specified image properties required within the film and television industry. In some embodiments, manipulation of the images includes adjustments to image depth, color, contrast, time of day, horizon line, blur, motion, focus, zoom, temperature, and parallax. In some embodiments, the app requires virtually no training and can be used directly on set while the image is being displayed on a wall of light emitting diode (LED) panels. In some embodiments, the system allows for real-time changes without latency. In some embodiments, the manipulation of the background images may be done dynamically while filming.
[0013] In another aspect, provided herein is a method for synchronizing a background action sequence with a foreground action sequence, the method comprising: displaying the background action sequence on a LED display; providing the foreground action sequence in front of the LED display; and causing a change in the background action sequence as a function of and in accordance with action occurring in the foreground action sequence. In some embodiments, a human operator causes the change in the background action sequence as a result of a visual cue received by any member of the film crew. In accordance with this aspect of the disclosure provided herein, it becomes possible to have the action happen with an exotic location as its backdrop without requiring the displacement of an entire film crew. The director can also produce final images that do not need to be retouched in the post-production phase, and the overall costs of production are reduced. Moreover, the actors can see the context within which they are performing as the action unrolls, instead of having to imagine what the background scene will look like. In some embodiments, the actor is at least partially surrounded by one or more displays presenting images of a virtual environment.
[0014] Display and Stage
[0015] In another aspect, the displays surrounding the stage area are formed from multiple LED panels that are generally fixed in position and mostly surround the stage area. In some embodiments, the displays can be greater than 10 feet tall, for example, greater than 20 ft, 30 ft, 40 ft, or even taller than 50 ft or 60 ft. The displays can be greater than 20 feet wide, for example, greater than 30 ft, 50 ft, 70 ft, or even wider than 90 ft. In some embodiments, sensors, for example an OptiTrack system, can be used to determine the position and orientation of the taking camera during a performance, and/or the camera can be a motion-tracking-equipped camera. The system adjusts the virtual environment displayed by the immersive cave or walls in real-time to correspond to the orientation and position of the taking camera. In this way, images of the virtual environment, created using an LED background display that displays a series of composite images generated from a plurality of digital high-resolution photographic images, can be perspective-correct over a performance.
[0016] In some embodiments, portions of the displays surrounding the stage area can be used to simulate LED lights that illuminate the stage area. The number, location, color, and intensity of the simulated lights can be selected in order to achieve a desired lighting effect.
[0017] Mobile Application
[0018] In another aspect, the system described herein allows portable devices to display, control, and manipulate high resolution image-based backgrounds for filmmaking applications. In some embodiments, the system comprises both a mobile application and a desktop application that can be used with a 3D software development tool such as Unity or Unreal Engine, for example using Unreal’s nDisplay functions to drive a display.
[0019] The mobile application is designed to be used with tablet devices, but can also be run on cell phones or desktop computers. In some embodiments, the mobile application is an Android application. In some embodiments, the mobile application can also be deployed to iOS, Windows and Mac devices. In some embodiments, the mobile application is platform agnostic. FIGS. 2-9 illustrate an embodiment of the mobile app user interface.
[0020] The mobile application has an intuitive user interface and sends commands via the OSC protocol to a variety of servers running the desktop application, which interpret those commands and modify content that is displayed and updated in real-time.
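As a rough illustration of the control path described above (not part of the patent disclosure), a single slider value can be packed into an OSC 1.0 message and fired over UDP using only the Python standard library. The address pattern `/background/exposure` and the port number are hypothetical:

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Pad an OSC string with NULs to a multiple of 4 bytes
    (OSC strings always carry at least one NUL terminator)."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode a single-float OSC 1.0 message: padded address pattern,
    padded type-tag string (",f"), then a big-endian float32 argument."""
    return (osc_pad(address.encode("ascii"))
            + osc_pad(b",f")
            + struct.pack(">f", value))

def send_background_command(address: str, value: float,
                            host: str = "127.0.0.1", port: int = 8000) -> bytes:
    """Fire-and-forget UDP send of one OSC command to a listening computer."""
    packet = osc_message(address, value)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, (host, port))
    sock.close()
    return packet
```

In practice the mobile app would emit such packets continuously as a slider moves, one per listening server.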
[0021] Server
[0022] In another aspect, the system described herein includes a server, which is created for every “listening computer” within the setup for the intended display. For example, that could be a single computer driving a projector, or ten computers driving portions of a large LED display. The software is display agnostic.
[0023] In some embodiments, the server can include a processor, a processing acceleration unit, a digital signal processor (DSP), a special purpose processor, and/or the like. In some embodiments, the server can include/operate a real-time gaming engine such as Unreal Engine or other similar rendering engine. In some embodiments, the real-time gaming engine can load the virtual environment for display on the displays surrounding the stage area.
[0024] The mobile application sends commands via the OSC protocol to each “listening computer,” which then updates its variables and changes the way that content is rendered using Unreal Engine’s real-time renderer.
[0025] The mobile device is connected to the servers either wirelessly via secure WiFi connection to a router, or is hardwired via Ethernet to a connected router. The number of servers and machines to interface with changes depending on the individual stage setup and intended final output resolution.
[0026] Optionally, a separate media server is integrated into the system and interfaces directly with Unreal Engine and OSC, to playback all video components and integrate them into the final display output in real-time. The mobile application sends OSC commands to the desktop application, which then sends additional OSC commands to the media server to control playback and drive updates to the display output.
[0027] Product Ecosystem
[0028] In another aspect, the system described herein is an ecommerce ecosystem that allows users to digitally “rent” or download virtual production backdrops from an extensive online library, and implement them in a filmmaking context as a new alternative to traditional backdrops or green/blue screen for compositing in post-production.
[0029] In some embodiments, the system comprises a series of web servers, a client-facing web and ecommerce interface, a back-end system to track rentals and background expiration, and the actual system implementation on set to drive the content.
[0030] In some embodiments, a user is able to make a transaction on the client-facing web interface, which triggers content downloads and availability on the back-end server. The user can then access that content from a set or soundstage that they intend to use the background on. The system then interfaces with the back-end server, verifies that the content has been purchased and the timeframe for which it should be available, and allows the user to download and implement the solution.
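The purchase-verification step described above could be sketched as follows; the record fields and the day-based rental window are illustrative assumptions, not details taken from the disclosure:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Rental:
    """A backdrop rental record as the back-end might track it."""
    backdrop_id: str
    purchased_at: datetime
    rental_days: int

    @property
    def expires_at(self) -> datetime:
        return self.purchased_at + timedelta(days=self.rental_days)

def content_available(rentals: dict, backdrop_id: str, now: datetime) -> bool:
    """Verify the content has been purchased and is within its rental
    timeframe before allowing download to the stage."""
    rental = rentals.get(backdrop_id)
    return rental is not None and rental.purchased_at <= now < rental.expires_at
```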
[0031] Manipulation of Image Files
[0032] In another aspect, the system described herein allows the manipulation of high resolution image-based backgrounds, including changes to image depth, horizon line, zoom, focus, color temperature, time of day, lighting, motion, frustum (camera position), and/or fog. In some embodiments, a series of high-resolution aligned and stitched panoramic images are fed into the system. Those images are then mapped onto a three-dimensional object that matches the exact curvature and size of the intended display, and are then manipulated by the application as a material and a shader.
[0033] In some embodiments, a user may adjust a series of simple sliders in an app, and control saturation and exposure, as well as add digital lights to selective areas and shape the appearance of the background in real-time, for example continuously and simultaneously with the live action being viewed and/or captured during a production shoot (i.e., it removes the need for stopping the production to adjust or transition the background image).
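A minimal sketch of what one exposure/saturation slider pair could do to a single linear RGB pixel is shown below; the exposure-in-stops gain and the Rec. 709 luma weights are assumptions about one reasonable implementation, not the patent's actual math:

```python
def adjust_pixel(rgb, exposure_stops=0.0, saturation=1.0):
    """Apply slider-style exposure and saturation to one linear RGB pixel.
    Exposure is in photographic stops (a 2**stops gain); saturation
    linearly interpolates each channel between the pixel's luminance
    (saturation=0, grayscale) and its original value (saturation=1)."""
    r, g, b = (c * 2.0 ** exposure_stops for c in rgb)
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 weights
    return tuple(luma + saturation * (c - luma) for c in (r, g, b))
```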
[0034] In some embodiments, the images can also be offset on a vertical or horizontal basis to change the appearance of a horizon line and accommodate multi-story sets and different creative decisions. The high-resolution images can also be zoomed into with projected cross-hairs to achieve a desired result.
[0035] In some embodiments, the depth of field may be controlled, such as in accordance with the methods disclosed in U.S. Patent No. 7,164,462, whose disclosure is incorporated by reference in its entirety.
[0036] In some embodiments, blending and/or overlaying techniques can be used on the background image to create special visual effects. For example, one of the objects of the background image can be set to a different level of focus manually in one image and blended into the second image. The sharpness of the objects in the combined image then varies.
[0037] In some embodiments, the user can integrate motion elements shot on set (boats and water moving on a river, for example) to bring the background to life. Those elements can be controlled, scrubbed, and played back on command. The user can also integrate simulated fog/rain effects that are adjusted on a depth-based basis, in addition to transparency, radius, position, color, direction and speed.
[0038] To achieve a seamless shift from the user perspective as the perceived time of day shifts between day and night, the desktop application receives a normalized value between 0 and 1 from the mobile application. The desktop application then uses the total number of images (X) within a given set to compute a step ratio of 1/(X-1) between adjacent images, and blends them accordingly within the normalized range. A linear interpolation is then applied between images at that proper ratio to create a seamless blending effect.
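The arithmetic in this paragraph can be sketched directly: with X images spanning the normalized 0-1 range, adjacent images sit 1/(X-1) apart, so the incoming value selects a bracketing pair and a local interpolation factor. This is a minimal reading of the description, using scalars as stand-ins for image data:

```python
def blend_weights(num_images: int, t: float):
    """Map a normalized 0..1 time-of-day value onto a set of images.
    With X images, adjacent images are 1/(X-1) apart along the range;
    return the bracketing pair's indices and the lerp factor between them."""
    assert num_images >= 2 and 0.0 <= t <= 1.0
    ratio = 1.0 / (num_images - 1)          # step between adjacent images
    lower = min(int(t / ratio), num_images - 2)
    local = (t - lower * ratio) / ratio     # 0..1 within the bracketing pair
    return lower, lower + 1, local

def blend(values, t: float) -> float:
    """Linear interpolation between per-image scalar values at the ratio."""
    i, j, f = blend_weights(len(values), t)
    return values[i] * (1.0 - f) + values[j] * f
```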
[0039] A reverse linear interpolation is applied for the digital lighting effects, effectively allowing an alternative time of day to be displayed within a very specific, defined region of the image in the form of an inverse mask. This solution allows the user to shape the image without introducing unwanted noise and grain from purely boosting exposure or other base variables of an image, mixing elements of night and day as needed to achieve a desired creative result.
[0040] Digital lights can be shaped with temperature, tint, opacity, radius, feathering, and exposure, and “barn doors” can be used to make a variety of adjustments and shape the appearance of the top, left, right, and bottom edges of the light.
[0041] Digital lights can be added and removed via the mobile application, and easily moved around the background image by touch, with all changes updated and rendered on the display in real-time.
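One plausible, purely illustrative model of a digital light with a radius, a feathered edge, an exposure level, and barn-door bounds, evaluated per pixel (parameter names are assumptions, not from the patent):

```python
import math

def light_intensity(px: float, py: float, light: dict) -> float:
    """Contribution of one digital light at pixel (px, py): a full-exposure
    radial core, feathered linearly to zero, hard-clipped by 'barn door'
    bounds on the top, left, right, and bottom edges."""
    # Barn doors: no contribution outside the light's edge bounds.
    if not (light["left"] <= px <= light["right"]
            and light["top"] <= py <= light["bottom"]):
        return 0.0
    d = math.hypot(px - light["x"], py - light["y"])
    if d <= light["radius"]:
        return light["exposure"]            # full-intensity core
    if d >= light["radius"] + light["feather"]:
        return 0.0                          # beyond the feather band
    # Linear falloff across the feather band.
    return light["exposure"] * (1.0 - (d - light["radius"]) / light["feather"])
```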
[0042] In some embodiments, additional displays (adjacent LED panels, additional projectors, etc.) can also be controlled by both applications. The mobile application allows a user to sample a color from the background that they currently have displayed, and send commands to the desktop application to populate that additional display with that color. The sampled color changes dynamically along with other changes a user may make in the mobile application, to reflect digital lighting adjustments and other creative changes. The user can also manually override that color, and dial in specific hue, saturation, temperature, and exposure adjustments to send a specific color to those display methods.
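A non-limiting sketch of the color-sampling behavior described in paragraph [0042] (the function name, pixel-list representation, and region encoding are illustrative assumptions): the sampled region is averaged to produce the color sent to the additional display, unless a manual override is supplied.

```python
def sampled_display_color(background_px, region, override=None):
    """Average the pixels at the indices in `region` of the background to
    drive an additional display; a manual `override` (an (r, g, b) tuple)
    takes precedence when set."""
    if override is not None:
        return override
    samples = [background_px[i] for i in region]
    n = len(samples)
    return tuple(sum(c[k] for c in samples) // n for k in range(3))
```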
[0043] If three-dimensional camera tracking hardware is implemented in the on-set environment, the system can utilize three-dimensional camera position data to project an “inner frustum.” That frustum can be controlled separately from the rest of the image (blurred, defocused, or filled with a specific green color for the purposes of pulling a chroma key, among other functions), and can be used to make the background appear to a camera to be at a different distance than the actual distance of the display.
[0044] To achieve that, the application can receive a distance variable to simulate in three-dimensional space, only within the confines of the inner frustum. When enabled, a three-dimensional object is scaled proportionally to fill the display at that specified distance. For example, this allows the system to create the illusion to a tracked camera that a skyline background image of New York City is a full mile away, instead of the reality of only ten feet from the image on a display. U.S. Patent No. 11,132,837 discloses a method wherein the images of the virtual environment within a frustum of the camera are updated on one or more displays based on movement of the camera, the disclosure of which is incorporated by reference in its entirety.
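A non-limiting sketch of the proportional scaling in paragraph [0044] (the function name and the 20-foot display width are hypothetical; the ten-foot display distance and one-mile simulated distance follow the example above): for the plane to subtend the same angle from the camera as the physical display, its width scales linearly with the simulated distance.

```python
def simulated_plane_width(display_width, display_distance, simulated_distance):
    """Width a background plane must have, when placed at simulated_distance,
    to exactly fill a display of display_width located at display_distance
    (i.e., the same angular size from the tracked camera's position)."""
    return display_width * simulated_distance / display_distance

# e.g., a 20 ft wide wall 10 ft away, simulated at one mile (5280 ft):
# the virtual plane would be 20 * 5280 / 10 = 10560 ft wide.
```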
[0045] The background is offset as the camera moves to achieve an appropriate amount of parallax with all foreground objects in front of the display, as if the background image is more distant than it is.
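A minimal sketch of the parallax offset in paragraph [0045], derived by intersecting the camera-to-point ray with the display plane (the function name and the specific camera movement values are illustrative assumptions):

```python
def background_offset(camera_offset, display_distance, simulated_distance):
    """Lateral offset, in the display plane, applied to a background point
    meant to appear at simulated_distance when the camera moves laterally
    by camera_offset. A point at the display's own distance needs no offset;
    a very distant point shifts by nearly the full camera movement."""
    return camera_offset * (1.0 - display_distance / simulated_distance)
```

For a display ten feet away and a background simulated at one mile, a two-foot camera move shifts the background by nearly the full two feet, as a distant object should appear to.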
[0046] In some embodiments, as illustrated in Fig. 9, the user can use IP addresses to designate proper listening servers to receive commands from the application, and will eventually be able to send DMX commands based on color samples to any lighting fixtures that have been designated to receive them. Those fixtures could then be updated in real-time to reflect color changes made to the background image.
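A non-limiting sketch of mapping a sampled color onto DMX channel values (the function name and fixture patching are assumptions; a standard DMX512 universe carries 512 channels of 8-bit levels, and the actual transport to the listening server, e.g. Art-Net or sACN, is outside this sketch):

```python
def rgb_to_dmx_channels(color, start_channel=1, universe_size=512):
    """Map a sampled (r, g, b) color onto a DMX universe for a basic
    3-channel RGB fixture patched at start_channel (1-based). Values are
    clamped to the 0-255 range required by DMX512 channel levels."""
    universe = [0] * universe_size
    for offset, value in enumerate(color):
        universe[start_channel - 1 + offset] = max(0, min(255, int(value)))
    return universe
```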
[0047] In some embodiments, as illustrated in Fig. 9, the arrangement of all settings can be saved as preset files, so users can save preferences and specific background settings, share them, and apply those settings to achieve the same appearance across multiple shoot days and applications, as well as quickly switch between appearances as needed.
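A minimal sketch of preset persistence as described in paragraph [0047] (the use of JSON files and the function names are assumptions; the disclosure does not specify a file format):

```python
import json

def save_preset(path, settings):
    """Persist the current background settings (a plain dict) as a preset
    file that can be shared and re-applied on later shoot days."""
    with open(path, "w") as f:
        json.dump(settings, f, indent=2, sort_keys=True)

def load_preset(path):
    """Load a previously saved preset so the same appearance can be restored."""
    with open(path) as f:
        return json.load(f)
```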
[0048] Finally, it should be understood that the features described above in accordance with any one of the various aspects and embodiments of the present disclosure can be used in combination with each other.

Claims

What is Claimed:
1. A system for providing dynamic backgrounds in live-action videography, comprising:
a camera for recording live-action video;
a display wall, situated within a line of sight of the camera, for displaying a background;
a database of images for displaying on the display wall;
a computer in data communication with said database and said display wall;
a mobile device for providing inputs to said computer;
the mobile device configured to display a list of the images in the database for selection for display on the display wall;
the computer receiving a selection of one of the images;
the display wall displaying the selected image;
the mobile device configured to present a list of modifications for the selected image, the modification altering the appearance of the image on the display wall;
the computer receiving a selection of a modification;
the computer processing the selected modification to the image to generate a background;
the display wall displaying the background.
2. The system of claim 1, wherein the image is a video.
3. The system of claim 1, wherein the background is displayed on the display wall in real-time with the selected modification.
4. The system of claim 1, wherein the modification changes the apparent time of day of the background.
5. The system of claim 1, wherein the modification pans, tilts, positions, or zooms the background.
6. The system of claim 1, wherein the modification changes the shape, exposure, color temperature, or spotlight of the background.
7. The system of claim 1, wherein the modification defocuses or focuses the background.
8. The system of claim 1, wherein the modification adds fog or atmospheric effects.
9. The system of claim 1, wherein the modification inserts a chromakey over at least a portion of the background.
10. The system of claim 1, further comprising:
an LED light source positioned outside a line of sight of the camera;
the LED light source configured to emit light at a color or color temperature based on the background.
11. The system of claim 10, the mobile device configured to control the LED light source.
12. The system of claim 11, further comprising: the mobile device configured to allow selection of a portion of the background; the mobile device sending a command to the LED light source to match the color of the LED light source to the portion of the background.
13. The system of claim 1, further comprising: said computer in data communication with said camera; said computer receiving information regarding a position of the camera relative to a position of the display wall; said computer adjusting the background to maintain a consistent apparent distance to the camera.
14. The system of claim 13, wherein the position of the camera includes a zoom level of the camera.
15. The system of claim 1, wherein the modification inserts a dynamic image over at least a portion of the background.
16. The system of claim 1, further comprising an actor between the camera and the display wall.
17. A film generated by a system comprising:
a camera for recording live-action video;
a display wall, situated within a line of sight of the camera, for displaying a background;
a database of images for displaying on the display wall;
a computer in data communication with said database and said display wall;
a mobile device for providing inputs to said computer;
the mobile device configured to display a list of the images in the database for selection for display on the display wall;
the computer receiving a selection of one of the images;
the display wall displaying the selected image;
the mobile device configured to present a list of modifications for the selected image, the modification altering the appearance of the image on the display wall;
the computer receiving a selection of a modification;
the computer processing the selected modification to the image to generate a background;
the display wall displaying the background.
18. A system for providing dynamic backgrounds in live-action videography, comprising:
an LED display wall for displaying a background;
a database of images for displaying on the display wall;
a computer in data communication with said database and said display wall;
a mobile device for providing inputs to said computer;
the mobile device configured to display a list of the images in the database for selection for display on the display wall;
the computer causing the display wall to display an image selected by said mobile device;
an LED light source configured to emit light at a color or color temperature based on the video displayed on the display wall.
19. A system for providing dynamic backgrounds in live-action videography, comprising:
a camera for recording live-action video;
a display wall, situated within a line of sight of the camera, for displaying a background;
a database of images for displaying on the display wall;
a computer in data communication with said camera, said database, and said display wall;
a mobile device for providing inputs to said computer;
the mobile device configured to display a list of the images in the database for selection for display on the display wall;
the computer causing the display wall to display a background on the display wall based on the selected image;
the computer receiving information regarding a position of the camera relative to a position of the display wall;
said computer adjusting at least a portion of the background based on the position of the camera.
20. A method for providing a background for live-action videography, comprising:
generating at least one composite image from a plurality of digital high resolution photographic images,
wherein the plurality of digital high resolution photographic images are images of the same subject shot from the same angle;
wherein the only essential difference between the plurality of images is the lighting;
wherein the plurality of digital high resolution photographic images are panoramic high resolution photographic images; and
wherein a computer adjusts at least a portion of the background based on a position of a camera.
21. The method of claim 20, wherein the plurality of digital high resolution photographic images are real-life landscapes having a resolution of at least 14K, greater than 16K, or equal to or greater than 18K pixels.
22. A method for providing a series of background images for live-action videography, comprising: generating at least one composite image from a plurality of digital high resolution photographic images wherein the high resolution is greater than 100 MP.
23. An ecommerce system for digitally transacting virtual production backdrops, comprising:
one or more web servers configured to display a client-facing web interface;
a back-end system configured to track rental expiration of virtual production backdrops; and
a user-operated display system configured to display virtual production backdrops on set;
wherein the back-end system makes a virtual production backdrop available for download to the user-operated display system when the user makes a rental transaction for the virtual production backdrop on the client-facing web interface.
24. A method for providing virtual production backdrops on demand, comprising:
receiving an order for a virtual production backdrop for a period of availability from a user on a client-facing web interface;
verifying the order for the virtual production backdrop and the period of availability;
triggering content availability on a back-end server; and
allowing the virtual production backdrop to be downloaded by the user.
PCT/US2023/065368, filed 2023-04-05: System and method for providing dynamic backgrounds in live-action videography (WO2023196845A2)

Applications Claiming Priority (2):
- US202263328148P, priority date 2022-04-06, filed 2022-04-06
- US 63/328,148, 2022-04-06

Publications (2):
- WO2023196845A2, published 2023-10-12
- WO2023196845A3, published 2023-11-16




Legal Events:
- 121 (EP): the EPO has been informed by WIPO that EP was designated in this application (ref document number 23785592; country: EP; kind code: A2)