EP3213505A1 - Synchronized media servers and projectors

Synchronized media servers and projectors

Info

Publication number
EP3213505A1
EP3213505A1 (application EP15854753.9A)
Authority
EP
European Patent Office
Prior art keywords
slave
video
projector
projector system
synchronization
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP15854753.9A
Other languages
English (en)
French (fr)
Other versions
EP3213505A4 (de)
Inventor
Diego Duyvejonck
Jerome Delvaux
Alexander William Gocke
Emmanuel Cappon
Scott STREMPLE
Current Assignee
Barco Inc
Original Assignee
Barco Inc
Priority date
Filing date
Publication date
Application filed by Barco Inc
Publication of EP3213505A1
Publication of EP3213505A4
Legal status: Withdrawn


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43076 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/04 Synchronising
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/4104 Peripherals receiving signals from specially adapted client devices
    • H04N21/4122 Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41415 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance involving a public display, viewable by several users in a public space outside their home, e.g. movie theatre, information kiosk
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433 Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4331 Caching operations, e.g. of an advertisement for later insertion during playback
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363 Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43632 Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wired protocol, e.g. IEEE 1394
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141 Constructional details thereof
    • H04N9/3147 Multi-projection systems

Definitions

  • the present disclosure generally relates to distributed projector systems that provide synchronized videos projected onto a plurality of screens.
  • Digital cinema servers and projectors receive digital content for projection in a theater or other venue.
  • the content can be packaged in one or more digital files for delivery and storage on a media server.
  • the media server can then extract the digital content from the one or more digital files for display using one or more projectors.
  • the content can be 3D video projected onto a screen where slightly different visual content is projected for simultaneous observation in the right and left eyes of a viewer to create the illusion of depth.
  • a multi-projection system can be used to display video on a plurality of screens in a venue, such as in a theater or auditorium, to facilitate an immersive experience for the viewer.
  • Example embodiments described herein have innovative features, no single one of which is indispensable or solely responsible for their desirable attributes. Without limiting the scope of the claims, some of the advantageous features will now be summarized.
  • An immersive display system can include a plurality of projection systems arranged to provide immersive viewing of video.
  • Such an immersive display system can include a plurality of projector systems that each projects video wherein video frames from each video are synchronized with one another.
  • Each projector system can be configured to project its video onto a projection surface placed around an audience. In this way, the audience can experience a sense of immersion into the environment depicted in the video.
  • Synchronized video provided by the plurality of projector systems may be projected on the plurality of projection surfaces creating a unified video presentation.
  • Such immersive display systems are capable of generating audiovisual presentations with a relatively high level of realism due at least in part to video being simultaneously presented to the viewer from many directions.
  • movie theaters provide a single screen for viewing projected video content.
  • the video content can be digitally stored as a package of digital files on a media server that the media server decodes to provide to the projector.
  • single-screen projector systems are not configured to provide multi-view content (e.g., media streams designed to be projected onto a plurality of screens).
  • Combining a plurality of such single-screen projector systems to enable presentation of multi-view content to create an immersive display system presents a number of challenges. For example, to provide an immersive audiovisual environment it can be important to reduce or eliminate issues that may destroy the immersive quality of the experience for the viewer.
  • a master projector system can generate a synchronization signal based at least in part on the media stream provided by the master projector system and transmit the synchronization signal serially to a plurality of slave projector systems (e.g., creating a chain of projector systems).
  • each slave projector system in the chain receiving the synchronization signal can (1) pass the signal to a subsequent slave projector system and (2) process the synchronization signal to determine when to display a video frame so that it is synchronized with the video of the master projector system.
  • the projector systems can thus be connected in serial, or chained together, to synchronize the media streams of each, the synchronization signal being provided by the master projector system.
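The chained synchronization described above can be sketched in Python. This is an illustrative model only; class names such as `SyncSignal` and `SlaveProjectorSystem` are assumptions, not patent terminology. Each slave (1) forwards the signal to the next link in the chain and (2) uses the per-frame time code to schedule its own display:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class SyncSignal:
    """Synchronization signal carrying a per-frame time code."""
    frame_number: int
    timecode_ms: float  # presentation time of the frame, in milliseconds


class SlaveProjectorSystem:
    def __init__(self, name: str, next_slave: Optional["SlaveProjectorSystem"] = None):
        self.name = name
        self.next_slave = next_slave  # next link in the chain, if any
        self.displayed_frames = []    # (frame_number, timecode) pairs shown

    def receive_sync(self, signal: SyncSignal) -> None:
        # (1) pass the signal on to the next slave in the chain
        if self.next_slave is not None:
            self.next_slave.receive_sync(signal)
        # (2) use the time code to decide when to display the matching frame
        self.displayed_frames.append((signal.frame_number, signal.timecode_ms))


class MasterProjectorSystem:
    def __init__(self, first_slave: SlaveProjectorSystem, fps: float = 24.0):
        self.first_slave = first_slave
        self.frame_period_ms = 1000.0 / fps

    def present_frame(self, frame_number: int) -> None:
        # generate the synchronization signal from the master's own timeline
        signal = SyncSignal(frame_number, frame_number * self.frame_period_ms)
        self.first_slave.receive_sync(signal)


# Chain: master -> slave 200b -> slave 200c
slave_c = SlaveProjectorSystem("200c")
slave_b = SlaveProjectorSystem("200b", next_slave=slave_c)
master = MasterProjectorSystem(slave_b)

for frame in range(3):
    master.present_frame(frame)
```

Because every slave sees the same signal, each displays the same frame numbers at the same scheduled times, regardless of its position in the chain.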
  • an immersive display system comprises at least 3 screens with at least 3 sequentially chained projector systems.
  • Digital media content can be downloaded to each projector system, the projector system comprising a media server and a projector wherein the media server drives the projector.
  • a master projector system creates and transmits a synchronization signal to enable synchronous projection of video content by all projector systems with sub-frame accuracy.
  • the synchronization signal gets passed sequentially among the at least 3 chained projector systems.
  • a sequentially chained projector system can utilize a simple wired connection (e.g., a coaxial cable between projector systems) to transmit a synchronization signal derived from standard timecodes used in media environments.
  • Using sequentially chained projector systems can also simplify the generation and transmission of the synchronization signal.
  • the serial connection design may reduce or eliminate the need for signal amplification relative to an immersive display system employing a parallel connection infrastructure.
  • the serial connection design reduces or eliminates a need for a centralized distribution system to distribute the synchronization signal to each projector system relative to an immersive display system employing a parallel connection infrastructure.
  • serial connection design enables flexibility in the number of projector systems in the immersive display system because the addition of another projector system simply entails adding another link in the projector system chain. This may provide an advantage over a parallel connection design as a maximum number of projector systems may be reached in a parallel system when the synchronization signal distribution system runs out of available physical connection points.
  • the serial connection design also can result in relatively small latency between projector systems.
  • the synchronization signal may also enable synchronization of video with different frame rates, aspect ratios, codecs, or the like due at least in part to the synchronization signal being independent of these parameters.
  • a master projector system can generate a synchronization signal based at least in part on a signal coming from its framebuffer and a slave projector system can synchronize its video based at least in part on regulating the signal output of its framebuffer according to information in the synchronization signal.
  • multi-view content can be packaged for digital delivery and ingestion by a media server, wherein the package comprises a plurality of channels of video to be displayed by a corresponding plurality of projector systems.
  • each of the video channels can conform to standard digital content packaging formats (e.g., such as standards set by Digital Cinema Initiatives, LLC, or DCI).
  • a master projector system comprising a master server can ingest the package, extract the digital files, and distribute the video channels to other slave projector systems.
  • the master projector system can selectively distribute video data to the projector system intended to play the video data.
  • each projector system in the immersive display system ingests the entire package and is configured to determine the appropriate video channel to decode and display.
  • the master server is configured to automatically determine the appropriate slave projector system for each video channel in the package and to transmit the video channel data to the slave projector system, where transmission can occur prior to presentation of the multi-view content, during playback wherein the slave projector system buffers the video channel data, or the video channel data is delivered and displayed in real-time.
  • the master projector system includes hardware and/or software components that distinguish it from slave projector systems.
  • a slave projector system can include a synchronization module or card that allows it to frame- lock the video presentation based at least in part on the synchronization signal originating from the master projector system.
  • the master projector system and slave projector system contain the same hardware and/or software components but the master projector system is configured to act as the master while other projector systems are configured to act as slaves.
  • a projector system comprises a media server and projector integrated into a single physical unit.
  • a projector system comprises a media server and a projector that are physically separate and communicatively coupled to one another (e.g., through a wired or wireless connection).
  • FIG. 1 illustrates an example immersive display system for providing an immersive display experience.
  • FIG. 2 illustrates a plurality of example projector systems ingesting digital content for display.
  • FIG. 3 illustrates a plurality of example projector systems displaying synchronized video.
  • FIG. 4 illustrates a block diagram of an example media server system.
  • FIG. 5 illustrates a slave media server receiving a synchronization signal and transmitting the synchronization signal to the next slave media server in the chain.
  • FIG. 6 illustrates an example media server module configured to allow a projector system to be upgraded from a single-screen projector system to a projector system that can be part of an immersive display system.
  • FIG. 7 illustrates a flow chart of an example method of synchronizing multiple media streams in serially connected media servers in an immersive display system.
  • FIG. 8 illustrates a flow chart of an example method of synchronizing a slave video with a master video based at least in part on a synchronization signal from a master projector system.
  • FIG. 9 illustrates an example immersive display system for providing an immersive display experience with control connections between projector systems and content lines connecting the projector systems to a server.
  • FIG. 1 illustrates an example immersive display system 100 comprising a plurality of projectors, 200a-c, configured to project images onto corresponding screens 105a- c for providing an immersive display experience.
  • the immersive display system 100 can include, for example and without limitation, multiple direct-view displays, multiple rear- projection displays, and/or multiple front-projection displays, such as screens 105a-c. There can be gaps between adjacent displays. For example, screens 105a-c can have gaps between them as depicted in FIG. 1. In some embodiments, the gaps can be relatively small, close to zero, or zero.
  • the immersive display system 100 can include a plurality of flat or curved displays or screens or it can include a single curved display or screen. The screens can be rotated relative to one another. The screens 105a-c can also have respective inclinations relative to one another.
  • the screens 105a-c of the immersive display system 100 can include flat screens, curved screens, or a combination of both.
  • the example immersive display system 100 includes three planar front- projection screens wherein the image on each screen is provided by a projector system.
  • Projector system 200a is configured to project video onto screen 105a
  • projector system 200b is configured to project video onto screen 105b
  • projector system 200c is configured to project video onto screen 105c.
  • Sound systems may be mounted behind screen 105a, screen 105b and/or screen 105c.
  • Light emerging from the projector systems 200a-c can each have different spectra. This may result in color differences between the images provided by these projector systems. These color differences can be electronically compensated.
  • An example method for compensating color differences between two projectors is disclosed in U.S. Pat. Pub. No. 2007/0127121 to B. Maximus et al., which is incorporated by reference herein in its entirety.
  • the spectra of the projector systems 200a-c can be configured to project, after electronic compensation, color images with a color gamut according to Rec. 709 or DCI P3, for example.
  • the projector systems 200a-c refer to devices configured to project video on the screens 105a-c.
  • These projector systems 200a-c can include a media server and a projector.
  • the media server is physically separate from the projector and is communicably coupled (e.g., through wired or wireless connections) to the projector.
  • the projector system comprises an integrated media server and projector.
  • the media server portion of the projector system can include hardware and software components configured to receive, store, and decode media content.
  • the media server can include hardware and software configured to ingest and decode digital content files, to produce a media stream (e.g., video and audio), and to send image data to the projector.
  • the media server can include modules for ingesting digital content, decoding ingested content, generating video from the decoded content, generating audio from the decoded content, providing security credentials to access secure content, generating or interpreting synchronization signals to provide a synchronized presentation, and the like.
  • the projector can include an optical engine, a modulation element, optics, and the like to enable the projector to produce, modulate, and project an image.
  • the projector may be implemented using a cathode ray tube (CRT), a liquid crystal display (LCD), digital light processing (DLP), digital micro-mirror devices (DMD), etc.
  • the projector systems 200a-c can be configured to provide video with an aspect ratio and resolution conforming to any of a number of standards including, for example and without limitation, 4K (e.g., 3636x2664, 3996x2160, 3840x2160, 4096x2160, etc.), 2K (e.g., 1828x1332, 1998x1080), HD (e.g., 1920x1080, 1280x720), or the like.
  • the projector systems 200a-c can be configured to provide video with a variety of frame rates including, for example and without limitation, 24 fps, 30 fps, 60 fps, 120 fps, etc.
  • the projector systems 200a-c can be configured to display synchronized 3D content (e.g., stereoscopic video) on two or more screens.
  • the projector system 200a can be configured to be the master projector system.
  • the master projector system or the master media server provides the synchronization signal to which the slave projector systems synchronize their output.
  • the master projector system 200a ingests, decodes, and/or provides the main audiovisual presentation in the immersive display system 100.
  • Projector systems 200b and 200c are slave projector systems.
  • a slave projector system or slave media server provides images synchronized to the master system wherein synchronization is based at least in part on the synchronization signal provided by the master projector system.
  • a slave projector system may provide video that is projected peripheral, adjacent, near, and/or otherwise complementary to the video provided by the master system.
  • the master projector system 200a transmits a synchronization signal over the cabled connection 130a to a first slave projector system (e.g., projector system 200b) that then transmits the same synchronization signal over the cabled connection 130b to a second slave projector system (e.g., projector system 200c).
  • the synchronization signal is the same or substantially the same for all projector systems to enable globally synchronized video in the immersive display system. Accordingly, due at least in part to the projector systems 200a-c projecting video based on the synchronization signal, a synchronized video presentation is provided on the screens 105a-c.
  • synchronized video includes video from different projector systems having corresponding frames that are displayed within a sufficiently small time window from one another so as to be displayed substantially simultaneously.
  • synchronized video includes video wherein corresponding frames are displayed such that a time between the display of the synchronized frames is less than or equal to about 1 ms, less than or equal to about 500 μs, less than or equal to about 350 μs, less than or equal to about 250 μs, or less than or equal to about 200 μs.
  • Such synchronization can be referred to as having sub-frame accuracy in its synchronization.
  • sub-frame accuracy can include synchronization that has a latency between corresponding frames that is less than about 10% of the frame rate, less than about 5% of the frame rate, less than about 1% of the frame rate, or less than about 0.1% of the frame rate.
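The sub-frame accuracy figures above can be checked numerically. The sketch below reads "percent of the frame rate" as a fraction of the frame period (an assumption; the helper names are illustrative):

```python
def frame_period_ms(fps: float) -> float:
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps


def is_sub_frame_accurate(latency_ms: float, fps: float, fraction: float = 0.1) -> bool:
    """True if the latency between corresponding frames stays within the
    given fraction of the frame period."""
    return latency_ms <= fraction * frame_period_ms(fps)


# At 24 fps a frame lasts ~41.7 ms, so a 1 ms skew is well within 10%:
assert is_sub_frame_accurate(1.0, 24.0)
# A skew of about half a frame period is not:
assert not is_sub_frame_accurate(20.0, 24.0)
```

At higher frame rates the tolerance tightens proportionally: at 120 fps, 10% of the frame period is only ~0.83 ms.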
  • the master projector system 200a can control display of a video in units of frames and synchronize the video frames from projector systems 200b and 200c using a time code for each frame, the time code being carried by the synchronization signal, as described in greater detail herein with reference to FIG. 4. Accordingly, the projector systems 200a-c can accurately synchronize the video projected on screens 105a-c based at least in part on the time code for each frame in the synchronization signal.
  • the immersive display system 100 can include DCI- compliant projector systems 200a-c configured to play DCI-compliant content inside a movie theater.
  • the DCI-compliant content can include a media stream (e.g., video data or video and audio data extracted from digital content).
  • the media stream is provided as a digital cinema package ("DCP") comprising compressed, encrypted, and packaged data for distribution to movie theaters, for example.
  • the data can include a digital cinema distribution master (“DCDM”) comprising the image structure, audio structure, subtitle structure, and the like mapped to data file formats.
  • the data can include picture essence files and audio essence files that make up the audiovisual presentation in the DCP.
  • the DCP can include a composition which includes all of the essence and metadata required for a single digital presentation of a feature, trailer, advertisement, logo, or the like.
  • the projector systems 200a-c can be configured to ingest the DCP and generate a visually indistinguishable copy of the DCDM and then use that copy of the DCDM to generate image and sound for presentation to an audience.
  • FIG. 1 illustrates 3 projector systems 200a-c and 3 screens.
  • the immersive display system can include a different number of projector systems and/or screens.
  • the immersive display system 100 can include 2, 3, 4, 5, 6, 7, 8, 9, 10, or more than 10 projector systems.
  • the immersive display system 100 can include 2, 3, 4, 5, 6, 7, 8, 9, 10, or more than 10 screens.
  • the immersive display system 100 can be configured such that more than one projector system provides video on a single screen, such that the images substantially overlap.
  • the immersive display system 100 can be configured such that projector systems provide video on a single screen wherein the videos from projector systems minimally overlap, are adjacent to one another, or are near one another to provide a substantially unitary video presentation.
  • FIG. 2 illustrates a plurality of example projector systems 200a-d ingesting digital content 110 for display in an immersive display system 100.
  • the digital content 110 can be any collection of digital files that include content data and metadata that make up a composition to be displayed by the immersive display system 100.
  • the digital content 110 can be ingested by the plurality of projector systems 200a-d through network connection 150.
  • the media servers 210a-d can be configured to extract the appropriate data files from the ingested digital content 110 and to decode the appropriate video content to send to the corresponding projector 220a-220d.
  • the master projector system 200a can generate a synchronization signal to send over a cable 130 (e.g., a coaxial cable) to a first slave projector system 200b.
  • the first slave projector system 200b can then send the synchronization signal to a second slave projector system 200c that can then send it to a third slave projection system 200d and so on.
  • the immersive display system 100 can display synchronized video from a plurality of projector systems.
  • the master and slave projector systems 200a-d can be configured to ingest only portions of the digital content 110 intended for that particular projector system.
  • a projector system 200a-d can download portions of a digital package that contain the data sufficient to provide the content intended for that particular projector system.
  • the master projector system 200a ingests the entire digital content 110 and distributes a copy of that digital content 110 to the slave projector systems 200b-d.
  • after ingesting the digital content 110, the master distributes a portion of the digital content 110 to each slave projector system 200b-d wherein the portion transmitted to a slave projector system contains the data sufficient for that particular slave projector system to provide its audiovisual presentation.
  • Transmission of digital content 110 can occur over the network connection 150 which can be a wired connection, a wireless connection, or a combination of both wired and wireless connections.
  • the master projector system 200a can transmit copies of the digital content 110 or copies of portions of the digital content 110 to the slave projector systems 200b-d over the network connection 150. In such circumstances, the master projector system 200a can transmit the digital content 110 to the slave projector systems 200b-d prior to presentation of the composition contained in the digital content 110, during presentation of the composition by buffering the data in each slave projector system 200b-d, and/or during presentation of the composition in real time. In some implementations, the master projector system 200a can transmit information to the slave projector systems 200b-d indicating which portion of the digital content 110 that each slave projector system should ingest. Based at least in part on this information, each slave projector system 200b, 200c, 200d can ingest a portion of the digital content 110.
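The buffered-during-playback delivery path above can be sketched minimally; the class and method names here are illustrative assumptions, not patent terminology:

```python
from collections import deque


class SlaveContentBuffer:
    """Minimal FIFO buffer a slave might use when content chunks arrive
    from the master while the composition is already playing."""

    def __init__(self):
        self.buffer = deque()

    def receive(self, chunk: bytes) -> None:
        # the master pushes chunks over the network connection 150
        self.buffer.append(chunk)

    def next_chunk(self) -> bytes:
        # playback pulls the oldest buffered chunk
        return self.buffer.popleft()


slave = SlaveContentBuffer()
for chunk in (b"frame-000", b"frame-001", b"frame-002"):
    slave.receive(chunk)

assert slave.next_chunk() == b"frame-000"  # FIFO order preserved
```

The same structure degenerates to the other two modes: full delivery in advance fills the buffer completely before playback, and real-time delivery keeps the buffer depth near one chunk.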
  • the digital content 110 and the projector systems 200a-d can be configured to conform to digital cinema standards, such as the Digital Cinema System Specification ("DCSS") that describes, among other things, data file formats, hardware capabilities, security standards, and the like.
  • Such specifications like the DCSS, can allow a variety of different content producers, different distributors, and different presenters to work together to generate, distribute, and display digital content to an audience, such as movies. Movie theaters and other such venues that present digital content, such as movies, can then invest in systems that can display digital content packaged according to the specification.
  • the digital content 110 can include data that conforms to a single specification.
  • the digital content 110 can be a digital cinema package ("DCP").
  • the standard associated with the DCP can be expanded to include multi-view content (e.g., video to be simultaneously displayed on a plurality of screens).
  • the DCP can include the data (e.g., audio, video, and metadata) for each screen in the immersive display system 100.
  • the DCP can be configured to have a composition playlist (“CPL") for each screen in the immersive display system 100.
  • the digital content 110 includes a DCP for each screen in the immersive display system 100.
  • each projector system 200a-d can implement a smart ingest function that limits ingestion of the digital content 110 to the relevant portions of the digital content 110 for its intended screen.
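A smart ingest function of this kind might be sketched as a simple filter over the package's assets. The field names (`name`, `screens`) are assumptions for illustration, not part of any DCP specification:

```python
def smart_ingest(package_assets, my_screen):
    """Return only the asset names needed for this projector system's screen.

    package_assets: list of dicts, each with a 'name' and the set of
                    'screens' whose presentation uses that asset
    """
    return [a["name"] for a in package_assets if my_screen in a["screens"]]


package = [
    {"name": "picture_main.mxf", "screens": {"main"}},
    {"name": "picture_left.mxf", "screens": {"left"}},
    {"name": "audio_shared.mxf", "screens": {"main", "left", "right"}},
]

# A left-screen projector system downloads only its own picture essence
# plus any assets shared across screens:
assert smart_ingest(package, "left") == ["picture_left.mxf", "audio_shared.mxf"]
```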
  • the immersive display system 100 displays DCP content from the projector systems 200a-d blended together to accommodate a curved screen.
  • the digital content 110 can include data that conforms to a plurality of specifications.
  • a portion of the digital content 110 can be a DCP while other portions can include data conforming to another specification.
  • the digital content 110 can include data that conforms to a specification and data that does not conform to a specification.
  • a portion of the digital content 110 can be a DCP while other portions can include non-DCP data.
  • the systems and methods described herein can advantageously allow the synchronization of video from a plurality of projector systems when the digital content 110 conforms to a single specification, multiple specifications, or a combination of a single specification and no specification.
  • This advantage is realized due at least in part to the master projector system 200a generating the synchronization signal after decoding the video content.
  • the master projector system 200a can generate an appropriate timeline and metadata independent of the format of the digital content 110 and encode that information into the synchronization signal.
  • the synchronization can be done between the video frames (e.g., line-based synchronization).
  • the master projector system 200a can generate the synchronization signal after the frame buffer output in the projector, prior to the modulation portion of the projector (e.g., a DMD chip).
  • Each slave projector 200b-d can receive the synchronization signal and control its display of video based on the timeline and metadata in the signal.
  • the slave projector systems 200b-d can frame-lock to the synchronization signal at a similar hardware level to the master projector system 200a (e.g., after the frame buffer and prior to the modulation chip).
  • the projector systems 200a-d can be synchronized on a frame basis, frame-locking content wherein timing is linked on a frame-by-frame basis.
  • the immersive display system 100 can synchronize the projector systems 200a-d with each other for content playback with sub-frame accuracy, wherein each server has a different DCP, a different CPL in a single DCP, and/or a DCP and non-DCP content.
  • the immersive display system 100 can also synchronize video having different aspect ratios, different content formats (e.g., JPEG2000, MPEG4, etc.), and/or different frame rates.
  • side screens can have a frame rate that is higher than a frame rate of the main screen or vice versa.
  • synchronization of different frame rates can occur where the differing frame rates are multiples of one another (e.g., 30 fps and 60 fps), multiples of a common frame rate (e.g., 24 fps and 60 fps are both multiples of 12), or where the data rate of the synchronization signal allows for synchronization at differing frame rates (e.g., where the base frequency of the synchronization signal is a multiple of possible frame rates).
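The frame-rate compatibility rules above can be sketched as a small check. This is an illustrative sketch, not part of the disclosed embodiments; the function name and the `sync_hz` parameter are assumptions for illustration:

```python
from math import gcd

def rates_compatible(master_fps, slave_fps, sync_hz=None):
    """Check whether two integer frame rates can share one sync timeline.

    Rates are treated as compatible when one is an integer multiple of
    the other, when both are multiples of a common base rate (gcd > 1),
    or when a given sync-signal base frequency is a multiple of both.
    """
    if master_fps % slave_fps == 0 or slave_fps % master_fps == 0:
        return True
    if gcd(master_fps, slave_fps) > 1:
        return True
    if sync_hz is not None and sync_hz % master_fps == 0 and sync_hz % slave_fps == 0:
        return True
    return False
```

For example, 24 fps and 60 fps are compatible through their common base of 12 fps, while 24 fps and 25 fps would only synchronize if the sync signal's base frequency (e.g., 600 Hz) were a multiple of both.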
  • the immersive display system 100 can also synchronize video that is stereoscopic, not stereoscopic, or a combination of both.
  • the immersive display system 100 can display non-DCP content (e.g., provided by slave projector systems 200b, 200c, and/or 200d) synchronized to DCP content (e.g., provided by the master projector system 200a).
  • the immersive display system 100 can display dynamic content (e.g., feeds from social media, advertisements, news feeds, etc.) synchronized with the main video presentation (e.g., a feature film).
  • one or more of the slave projector systems 200b-d provides the dynamic, synchronized content overlaid on the main screen or on a side screen.
  • the master projector system 200a provides content from a DCP on a main screen and at least 2 slave projector systems 200b, 200c provide content from a non-DCP source on side screens.
  • the master projector system 200a and the slave projector systems 200b-d provide subtitles synchronized across multiple screens. For example, different subtitles can be displayed on different screens. In certain implementations, the subtitles are a part of the content package. In some implementations, subtitles can be acquired from a different source and can be displayed on one or more screens in synchrony with the video.
  • one or more of the slave projector systems 200b-d are configured to display news feeds (e.g., rich site summary or RSS feeds) on side screens synchronized with the video provided by the master projector system 200a on a main screen.
  • the master projector system 200a can display a composition on the main screen (e.g., a feature film in JPEG2000) and at least one slave projector system 200b-d can overlay a live rendering of an RSS feed (e.g., a feed from a social networking website like Twitter®) over the composition.
  • the projector systems 200a-d can leverage existing hardware and software configured to display subtitles to display additional or alternative textual content.
  • one or more of the media servers 210a-210d can be physically separate from its associated projector 220a-d.
  • the synchronization signal can be generated by the main media server 210a prior to the frame buffer output, as described herein.
  • the slave media servers 210b-d can synchronize video output prior to its frame buffer output.
  • the master projector system 200a and the slave projector systems 200b-d can be substantially identical devices.
  • the user can configure the devices to assume the roles of master and slave.
  • the content ingested by the devices determines, at least in part, the role of the projector system (e.g., master or slave).
  • the immersive display system 100 can thus be configured to not include any specific main server that controls the slave projector systems.
  • the master projector system 200a can include hardware and/or software components that differentiate it from the slave projector systems 200b-d.
  • the master projector system 200a can include hardware and/or software specific to generating the synchronization signal.
  • the slave projector systems 200b-d can include hardware and/or software specific to synchronizing video output based on the synchronization signal.
  • the immersive display system 100 can operate in an automation system (e.g., a theater management system or "TMS," and/or a screen management system or "SMS").
  • the immersive display system 100 can be treated as a single entity. This can allow existing TMS's to expand operation relatively easily from exclusively operating single-screen projection systems to incorporating the immersive display system 100.
  • FIG. 3 illustrates a plurality of example projector systems 200a-c displaying synchronized video on a plurality of screens.
  • the projector systems 200a-c are connected in serial with cables 130a and 130b to relay a synchronization signal from the master projector system 200a to a first slave projector system 200b and then to a second slave projector system 200c.
  • Communication over the cables 130a, 130b occurs in real time and at a data rate sufficient to synchronize video between the projector systems 200a-c with sub- frame accuracy.
  • each projector system 200a-c can have a single DCP stored thereon.
  • the master projector system 200a can extract video from its DCP and generate a synchronization signal based at least in part on the video.
  • the master projector system 200a can display the video on a main screen.
  • the master projector system 200a can transmit the synchronization signal to the first slave projector system 200b over a first coaxial cable.
  • the first slave projector system 200b can extract video from its DCP and use the synchronization signal to synchronize the presentation of the video on a first side screen with that of the master projector system 200a.
  • the first slave projector system 200b can also re-transmit the synchronization signal, in parallel with processing the synchronization signal, to the second slave projector system 200c over a second coaxial cable.
  • the second slave projector system 200c can extract video from its DCP and use the synchronization signal to synchronize the presentation of the video on a second side screen with that of the master projector system 200a.
  • the projector systems 200a-c can also be communicably coupled via a network connection 150 that can be wired (e.g., using an Ethernet connection), wireless (e.g., using a wireless networking protocol such as IEEE 802.11), or a combination of both. In some implementations, communication over the network connection 150 does not need to support real-time communication.
  • video can be sent from the master projector system 200a to one or more of the slave projector systems 200b, 200c over the network connection 150.
  • the video can be sent prior to presenting the video or while the video is being presented (e.g., using a buffering system or in real time).
  • each projector system 200a-c uses the network connection 150 to ingest content to be displayed. It is to be understood that the slave projector systems 200b, 200c can be synchronized with the master projector system 200a without the network connection 150 being present, or with the network connection 150 connected to fewer than all of the projector systems 200a-c.
  • one or more projector system 200a-c can ingest content for presentation from a computer readable storage medium (e.g., a Blu-Ray disc, a USB drive, etc.) and that content can be synchronized with the content provided by the serially connected projector systems 200a-c with cables 130a, 130b.
  • the master projector system 200a produces a synchronization signal based at least in part on the content it is providing (e.g., video and/or audio) to provide over the cable 130a to a first slave projector system 200b.
  • the synchronization signal can be time-coded so that the first slave projector system 200b can synchronize its video output with the output of the master projector system 200a.
  • Additional slave projector systems can be added to the chain of projector systems by adding another link in the chain.
  • the second slave projector system 200c can be added by connecting it to the first slave projector system 200b with the cable 130b.
  • the first slave projector system 200b can then propagate the synchronization signal it received from the master projector system 200a to the second slave projector system 200c.
  • the total number of projector systems 200 can be varied based at least in part on the intended use.
  • the maximum number of projector systems 200 can be based at least in part on acceptable accuracy in synchronization.
  • Each additional projector system increases the overall latency in the system, potentially reducing synchronization accuracy.
  • the latency in the system can be related to the time between when the master projector system 200a sends the synchronization signal and when the last slave projector system in the chain receives the synchronization signal.
  • the latency in the system can also be related to the time difference between when a video frame is displayed in the master projector system 200a and when a corresponding video frame is displayed in a slave projector system 200b, 200c.
  • the latency in the system can also be related to a time difference between when a video frame theoretically should be displayed and when the video frame is actually displayed.
  • the acceptable accuracy in synchronization can be measured as a fraction of the time between successive video frames in the master video (e.g., the frame display time for the video provided by the master projector system 200a) where the acceptable accuracy can be less than or equal to about 10% of the frame display time, less than or equal to about 5% of the frame display time, less than or equal to about 1% of the frame display time, less than or equal to about 0.1% of the frame display time, or less than or equal to about 0.01% of the frame display time.
  • the acceptable accuracy in synchronization can be measured in seconds where the acceptable accuracy can be less than or equal to about 1 ms, less than or equal to about 500 μs, less than or equal to about 350 μs, less than or equal to about 250 μs, less than or equal to about 200 μs, less than or equal to about 100 μs, or less than or equal to about 50 μs.
  • the total number of systems is, for example and without limitation, 3 systems, 4 systems, 5 systems, 6 systems, 7 systems, 8 systems, 9 systems, 10 systems, 15 systems, 20 systems, at least 10 systems, or between 3 and 10 systems.
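The tolerance figures above can be related to chain length with a back-of-the-envelope calculation. This sketch assumes a constant per-hop relay latency, which the disclosure does not specify:

```python
def tolerance_us(fps, fraction):
    """Synchronization tolerance in microseconds, expressed as a fraction
    of the frame display time (interval between successive master frames)."""
    return 1_000_000 / fps * fraction

def max_chain_hops(per_hop_latency_us, tolerance_limit_us):
    """Largest number of serial relay hops whose accumulated latency still
    fits within the tolerance (assumes each hop adds the same latency)."""
    return int(tolerance_limit_us // per_hop_latency_us)
```

At 24 fps a 1% tolerance is about 417 μs, so with an assumed 32 μs relay latency per hop, roughly 13 hops would fit within that budget.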
  • the synchronization signal can be configured to be a waveform that can encode data into it (e.g., using digital methods).
  • the synchronization signal can include words defined to be 64 bits of information each.
  • Each word can include synchronization information (e.g., a time code) and/or additional information such as commands for slave projector systems.
  • the data rate of the synchronization signal can be about 2 Mbps. If each word in the signal is 64 bits, the latency due to constraints imposed by the synchronization signal is about 32 μs per word (2 Mbps / 64 bits per word is about 31,250 words/s). Other data rates and/or word sizes are possible that would result in different calculated and measured latencies.
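The word-latency arithmetic in the preceding item can be reproduced directly:

```python
def word_latency_us(data_rate_bps, word_bits):
    """Time to serialize one sync word at the given data rate, in microseconds."""
    return word_bits / data_rate_bps * 1_000_000

def words_per_second(data_rate_bps, word_bits):
    """How many whole words the link can carry each second."""
    return data_rate_bps // word_bits
```

With 64-bit words at 2 Mbps this gives 32 μs per word and 31,250 words per second, matching the figures above.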
  • the synchronization signal can also include commands intended for one or more slave projector systems 200b, 200c. Commands can also be sent to slave projector systems 200b, 200c over the network connection 150. Commands encoded into the synchronization signal can be commands intended for real time or near real time execution. For example, commands sent over the synchronization signal can include commands to change a side screen color space, brightness, or the like based at least in part on metadata in the content (e.g., if an image on a side screen is bright, a command in the synchronization signal can direct the side screen projectors to output less light to reduce effects on the content projected on the main screen). Commands sent over the network connection 150 can be commands intended for near real time execution or delayed execution. For example, commands to open or close a dowser or to load content can be sent over the network connection 150.
  • the synchronization signal can include a timeline adjustment that provides the ability to adjust individual videos displayed by each projector system.
  • the master projector system 200a can be configured to continue displaying video.
  • the synchronization signal can be implemented by modifying or employing existing synchronization technology such as linear time coding ("LTC") or AES3 signals (e.g., a signal conforming to the IEC 60958 standard) that can be used over coaxial cables or other similar cables thus reducing potential obstacles to implementing the technology in existing projector or other display systems.
  • FIG. 4 illustrates a block diagram of an example media server system 210.
  • the media server system 210 can be a master media server or a slave media server.
  • the media server system 210 can be configured to generate a synchronization signal (e.g., when it is a master media server system), transmit the synchronization signal (e.g., over a synchronization link such as a coaxial cable), receive a synchronization signal (e.g., when it is a slave media server system), synchronize presentation of a video based at least in part on the synchronization signal, send and receive communications over a network connection, process digital files to generate a video, provide security credentials to extract video, and the like.
  • the media server system 210 can include hardware and software sufficient to accomplish the functionality described herein.
  • the media server system 210 includes a controller 201, such as a computer processor, and a data store 202, such as non-transitory computer storage.
  • the controller 201 can be configured to provide computational power and to direct and coordinate the execution of functions sufficient to provide the targeted and desired functionality of the media server system 210.
  • the data store 202 can be used to store digital files, e.g., a DCP, software, executable instructions, configuration settings, calibration information, and the like.
  • the media server 210 provides a user interface or a control program accessible over a network connection that allows a user or other system to provide commands to the media server system 210, to monitor a status of the media server system, and/or to request information from the media server system 210.
  • the media server system 210 includes a communication module 203 configured to process, send, receive, construct, and/or interpret information over a network connection, such as the network connection 150 described herein with reference to FIGS. 2 and 3.
  • the communication module 203 can be configured to ingest digital content for display by an associated projector.
  • the communication module 203 can be configured to perform a smart ingest function wherein data necessary for displaying content on the associated projector is ingested and other data is not ingested.
  • the communication module 203 can be configured to send commands to be performed by connected media servers.
  • a master media server can command one or more slave media servers to control its associated projector system by dowsing the shutter or other similar functionality.
  • the communication module 203 in a slave projector system can be configured to receive and interpret commands received from a master projector system.
  • the media server system 210 includes a media module 204 configured to process digital data to generate a video presentation.
  • the media module 204 can be configured to extract packaged files from a standard format, such as a DCP package, and to provide an appropriate signal to a projector so that the projector displays intended video. For example, to display a feature film, the media module 204 can decompress digital files, identify an appropriate playlist file, decode associated image essence files, decode associated audio essence files, and produce a video signal that is sent to a projector for display.
  • the media server system 210 includes a security module 205 configured to provide appropriate security functionality to access secure digital files.
  • a DCP can be encrypted to prevent unauthorized access.
  • the security module 205 can provide appropriate security credentials and decrypt the digital files so that the media module 204 can access the files.
  • the security module can also provide security functionality when the video signal generated by the media module 204 is to be sent over a cable to the projector, such as when the projector is physically separate from the media server system 210.
  • the media server system 210 includes a synchronization module 206 configured to generate a synchronization signal (e.g., when the media server 210 is part of a master projector system), transmit the synchronization signal (e.g., over a synchronization cable), and/or process the synchronization signal (e.g., when the media server 210 is part of a slave projector system).
  • the synchronization module 206 can be configured to generate the synchronization signal.
  • the synchronization signal can be generated independent of synchronization information provided in the digital files related to the composition (e.g., video presentation).
  • the synchronization signal can be generated based at least in part on the video signal generated by the media module.
  • the synchronization signal can be generated based at least in part on the output of a frame buffer in the projector, prior to (or in parallel with) the video signal being input into a modulation chip in the projector.
  • the synchronization signal can be a waveform having information that is encoded therein.
  • the waveform can utilize a biphase mark code ("BMC") to encode data (e.g., as used in AES3 and S/PDIF signals).
  • the synchronization signal encoded with BMC can be polarity insensitive, which can be advantageous in an immersive display system with a plurality of projector systems.
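Biphase mark coding itself is a standard line code (it underlies AES3 and S/PDIF); the following minimal encoder/decoder pair illustrates why the resulting waveform is polarity insensitive. The half-cell list representation is an assumption for illustration:

```python
def bmc_encode(bits, level=1):
    """Encode a bit sequence with biphase mark code.

    Each bit occupies two half-cells. The level always inverts at the
    start of a bit cell; a logical 1 inverts again at mid-cell, a 0 holds.
    """
    out = []
    for bit in bits:
        level ^= 1            # transition at every cell boundary
        out.append(level)
        if bit:
            level ^= 1        # extra mid-cell transition encodes a 1
        out.append(level)
    return out

def bmc_decode(halves):
    """Recover bits by comparing the two half-cells of each bit cell."""
    return [0 if halves[i] == halves[i + 1] else 1
            for i in range(0, len(halves), 2)]
```

Because the decoder compares the two half-cells of each bit rather than reading absolute levels, inverting the entire waveform decodes to the same bit stream.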
  • the waveform can be divided into words, or groups of bits, with information encoded at particular places within the words.
  • the waveform can have one or more encoded words.
  • the synchronization waveform can be a modified linear time code ("LTC") or a modified AES3 signal.
  • the waveform can encode SMPTE timecode data to enable synchronization of slave projector systems to the master projector system.
  • the waveform can also encode commands or other information (e.g., metadata) addressed to or intended for one or more projector systems.
  • the synchronization signal can include two 64-bit words.
  • the first word can include a 24-bit frame number, a valid status bit, a channel status bit, a user data bit, a parity bit (e.g., to validate a received word).
  • an active edge in the user data bit can be used to indicate that the master projector system will start the next frame.
  • the second word can contain a command structure used by a master projector system to provide commands to connected slave projector systems. Additional data, such as metadata, can be included in the first or second word.
  • the second word can include a 24-bit instruction from the master projector system to connected slave projector systems.
  • the metadata can be used to provide information to the slave projector systems to modify their functionality. For example, metadata can be used to indicate that the master projector system is paused. The slave projector systems can then pause their playback until another signal is received indicating that playback on the master projector system has resumed.
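A possible packing of the first 64-bit word can be sketched as follows. The disclosure names the fields (24-bit frame number, valid, channel, user-data, and parity bits) but not their bit positions, so the layout below is an assumption:

```python
# Bit positions are illustrative assumptions, not taken from the disclosure.
FRAME_MASK = (1 << 24) - 1
VALID_BIT, CHANNEL_BIT, USER_BIT, PARITY_BIT = 24, 25, 26, 27

def pack_sync_word(frame_number, valid=1, channel=0, user=0):
    """Pack the frame number and status flags into one sync word."""
    word = frame_number & FRAME_MASK
    word |= valid << VALID_BIT | channel << CHANNEL_BIT | user << USER_BIT
    # even parity over the payload so a receiver can validate the word
    if bin(word).count("1") % 2:
        word |= 1 << PARITY_BIT
    return word

def unpack_frame_number(word):
    """Extract the 24-bit frame number from a sync word."""
    return word & FRAME_MASK

def parity_ok(word):
    """True when the word (including the parity bit) has even parity."""
    return bin(word).count("1") % 2 == 0
```

A receiving slave can check `parity_ok` before trusting the frame number, mirroring the validation role of the parity bit.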
  • the synchronization signal can be a modification of standard signals, such as the LTC or AES3 signal. This can allow existing projector systems, hardware, and/or software to incorporate elements sufficient to implement the synchronization signal in a relatively straight-forward and easy fashion.
  • the synchronization module 206 can include look-up tables, data structures, data tables, data bases, or the like to interpret the signals encoded into the synchronization signal.
  • the synchronization module 206 can include a command table that correlates commands with numbers encoded into the synchronization signal.
  • the synchronization signal can have a data rate of about 2 Mbps.
  • the time to transmit a word is about 32 μs.
  • the time to transmit a packet is about 64 μs.
  • the synchronization module 206 can be configured to adjust display of a video frame based at least in part on the synchronization signal. For example, the synchronization module 206 can wait for a matching frame id received in the synchronization signal. When the matching frame id is received, the synchronization module 206 can indicate to the projector to display the appropriate video frame.
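The wait-for-matching-frame-id behavior can be sketched as a loop. The `display` callable and the 24-bit frame-number field position are illustrative assumptions:

```python
def slave_display_loop(sync_words, frames, display):
    """Hold each decoded frame until the incoming sync word carries a
    matching frame id, then hand the frame to the projector via `display`
    (a stand-in for the frame-buffer handoff)."""
    pending = {frame_id: frame for frame_id, frame in frames}
    for word in sync_words:
        frame_id = word & ((1 << 24) - 1)   # assumed 24-bit frame-number field
        if frame_id in pending:
            display(pending.pop(frame_id))
```

Frames whose ids never arrive in the sync stream are simply held back, which is one plausible way a slave could stay aligned with the master's timeline.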
  • the synchronization module 206 generates the synchronization signal based at least in part on audio provided by the media module 204.
  • sound can be generated by the master projector system and the timing of the audio can drive the video synchronization chain.
  • the audio can be processed by the media module 204 in real time and the video frames can be specified in terms of the number of clock cycles relative to the audio clock domain. This can enable automatic alignment of audio and video during playback.
  • continuous or substantially continuous adjustments to video playback can be performed during the video blanking time slot (e.g., using a backpressure algorithm). Accordingly, the master projector system can play audio in real time and display the video synchronized to the audio using the media module 204.
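Specifying video frames in audio clock cycles, as described above, reduces to integer arithmetic; the 48 kHz clock and 24 fps rate below are illustrative values, not taken from the disclosure:

```python
def frame_start_cycle(frame_index, audio_clock_hz=48_000, fps=24):
    """Audio-clock cycle at which a video frame should be presented when
    frame timing is expressed in the audio clock domain (2,000 cycles
    per frame at these example rates)."""
    return frame_index * audio_clock_hz // fps
```

Driving video timing from the audio clock in this way keeps audio and video aligned automatically during playback.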
  • the master projector system also provides a synchronization signal via the synchronization module 206 to connected slave projector systems.
  • the slave projector systems can then synchronize their video to this synchronization signal provided by the master projector system, making it so that their video is synchronized with the master video and not necessarily to their audio.
  • FIG. 5 illustrates a slave media server 210 receiving a synchronization signal at a first connector 502 and transmitting the synchronization signal from a second connector 504 to the next slave media server in the chain.
  • the first and second connectors 502, 504 can be standard connectors used for sync-in and sync-out signals generally used in this field, such as BNC connectors.
  • the slave media server 210 can loop the signal from the Rx port to the Tx port using active electronics 505.
  • the active electronics includes an amplifier.
  • the active electronics 505 are configured to reduce the introduction of latency into the synchronization chain.
  • the slave media server 210 can also direct the synchronization signal to the synchronization module for processing and utilization.
  • the slave media server 210 can include hardware and software components configured to extract synchronization information from the synchronization signal and to control video playback based at least in part on the extracted synchronization information.
  • the slave media server 210 can include hardware configured to frame-lock its playback to the synchronization signal.
  • FIG. 6 illustrates an example media server module 610 configured to allow a projector system to be upgraded from a single-screen projector system to a projector system that can be part of an immersive display system, such as the immersive display system described herein with reference to FIGS. 1 and 2.
  • the media server module 610 can include connectors and interface elements 620 to provide compatibility with existing projector system infrastructure, such as those present in a screen management system ("SMS").
  • the media server module 610 can also include electronics 630 configured to provide the functionality described herein with reference to FIG. 4.
  • the connectors 620 and the electronics 630 can be configured to receive a synchronization signal and control video playback based at least in part on the synchronization signal.
  • the connectors 620 and the electronics 630 can be configured to generate a custom synchronization signal to synchronize video playback among a plurality of projector systems.
  • the connectors 620 and the electronics 630 can be configured to be part of a sequentially chained projector system wherein the synchronization signal is passed sequentially among serially connected projector systems.
  • the media server module 610 can be configured to be integrated into a projector system such that the media server module 610 is configured to drive a projector of the projector system. In this way, the media server module 610 allows the projector system to be updated and upgraded without requiring replacement of the projector. Thus, the media server module 610 provides a way to upgrade a projector system to include immersive presentation capabilities described herein.
  • the media server module 610 can act as an integrated cinema media processor, providing functionality of an integrated cinema processor and a media server. This can convert a projector system into a DCI-compliant projector and media server.
  • FIG. 7 illustrates a flow chart of an example method 700 of synchronizing multiple media streams in serially connected media servers in an immersive display system.
  • the method 700 can be performed by a plurality of projector systems and/or media servers in an immersive display system.
  • One or more media servers, such as the media servers described herein with reference to FIGS. 2, 4, 5, or 6, can perform one or more of the steps of the method 700.
  • one or more modules of the media server such as those described herein with reference to FIG. 4, can perform one or more of the steps of the method 700.
  • a single step of the method 700 may be performed by more than one module and/or projector system.
  • a master projector system extracts a composition for presentation.
  • the composition can include video and/or audio to be presented to an audience.
  • the composition can include video to be displayed by the master projector system.
  • the composition can include video to be displayed by two or more slave projector systems.
  • the master projector system can transmit the data sufficient to display the video to the respective slave projector systems.
  • two or more slave projector systems each extract a composition for presentation by the respective slave projector system.
  • the master projector system generates a synchronization signal based at least in part on the extracted composition.
  • the synchronization signal can encode data words into a synchronization waveform.
  • the encoded data words can include synchronization information in the form of a timecode.
  • the master projector system generates the synchronization signal based at least in part on audio in the composition for presentation by the master projector system.
  • the master projector system transmits the synchronization signal to a first slave projector system.
  • the master projector system can transmit the synchronization signal over a coaxial cable or other cable with a signal line and a ground.
  • the first slave projector system receives the synchronization signal and retransmits the synchronization signal to a second slave projector system.
  • the first slave projector system can receive the synchronization signal at an input synchronization connector and transmit the synchronization signal at an output synchronization connector.
  • the master projector system displays a video frame from the extracted composition.
  • the first slave projector system and the second slave projector system display video frames synchronized with the video frame displayed by the master projector system wherein the displayed video frames are synchronized based at least in part on the synchronization signal generated by the master projector system.
  • Each of the first and second slave projector systems can process the received synchronization signal to extract synchronization information.
  • each of the first and second slave projector systems can control playback of its video (e.g., control timing of when to display a video frame) based at least in part on the extracted synchronization information.
  • FIG. 8 illustrates a flow chart of an example method 800 of synchronizing a slave video with a master video based at least in part on a synchronization signal from a master projector system.
  • the method can be performed by a slave projector system in an immersive display system, such as the slave projector systems described herein with reference to FIGS. 1-5.
  • the slave projector system can include hardware and software configured to perform the steps in the method 800, and each step in the method can be performed by one or more components and/or one or more modules of the slave projector system.
  • one or more steps in the method 800 can be performed by any combination of hardware and software of the slave projector system.
  • the method 800 can allow a slave projector system to synchronize a slave video with a master video.
  • the slave projector system can comprise a modified single projector system.
  • a projector system can be retrofit with a module, such as the module 610 described herein with reference to FIG. 6, that is configured to receive a synchronization signal and to synchronize its video based at least in part on the received synchronization signal.
  • the synchronization signal can be generated by a master projector system that has not been specifically designed to be part of an immersive display system.
  • a projector system configured for use in a single-screen theater can generate a synchronization signal based on standards such as LTC or AES3.
  • the slave projector system can receive the synchronization signal and synchronize its video based on that generated synchronization signal.
  • an immersive display system can be created using existing hardware and retrofitting one or more projector systems (e.g., with the module 610) to act as slave projector systems.
  • the slave projector system receives a synchronization signal.
  • the synchronization signal can be generated by a master projector system or another system configured to generate the synchronization signal.
  • the synchronization signal can be based on standard synchronization signals (e.g., LTC, AES3, etc.) or it can conform to a format that the slave projector system can process and from which it can extract synchronization information.
  • the synchronization signal can be received over a cable that has a signal line and a ground line, such as a coaxial cable. It is to be understood that other cabling options are within the scope of this disclosure including, for example and without limitation, serial cables, twisted pair cables, USB cables, and the like.
  • the slave projector system transmits the received synchronization signal to another slave projector system over another cable (e.g., a cable different from the cable used to receive the synchronization signal).
  • the slave projector system can include active electronics configured to receive the synchronization signal and pass that signal to the next slave projector system in the chain.
  • the slave projector system includes amplifiers, filters, and/or other electronics configured to reduce degradation of the synchronization signal as it is passed from one slave projector system to the next.
  • the slave projector system extracts synchronization information from the received synchronization signal. This can occur in parallel with the transmission of the synchronization signal in block 810. This can be done to reduce or minimize latency in the immersive display system.
  • the synchronization information can include information sufficient for the slave projector system to provide a video frame synchronized with a video provided by another projector system (e.g., a master projector system and/or other slave projector systems).
  • the synchronization information can include, for example and without limitation, frame numbers, timestamps, timecodes, metadata, command(s) for the slave projector system, and the like, as described in greater detail herein.
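One way to picture extracting such synchronization information is a fixed binary packet layout. The layout below (frame number, timestamp, command code) is purely hypothetical, chosen only to illustrate extraction; the disclosure does not specify a wire format.

```python
import struct

# Hypothetical fixed layout for a synchronization packet (not a
# published format): 4-byte frame number, 8-byte timestamp in
# microseconds, 1-byte command code, all big-endian.
SYNC_FORMAT = ">IQB"
SYNC_SIZE = struct.calcsize(SYNC_FORMAT)  # 13 bytes

def pack_sync(frame_number: int, timestamp_us: int, command: int) -> bytes:
    """Build a sync packet as a master might emit it."""
    return struct.pack(SYNC_FORMAT, frame_number, timestamp_us, command)

def unpack_sync(packet: bytes) -> dict:
    """Extract the synchronization information a slave can act on."""
    frame_number, timestamp_us, command = struct.unpack(
        SYNC_FORMAT, packet[:SYNC_SIZE])
    return {"frame": frame_number,
            "timestamp_us": timestamp_us,
            "command": command}
```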
  • the slave projector system provides a video frame synchronized with a video provided by another projector system.
  • the slave projector system can synchronize the video frame at the framebuffer.
  • the slave projector system can synchronize the video frame at a point in the processing chain prior to the framebuffer, such as at the video decoding stage.
  • the synchronized video frame can be displayed on a screen along with video from other projector systems to provide an immersive viewing experience for a viewer.
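The framebuffer-level synchronization can be illustrated with a small decision function: the slave compares its local frame counter to the master's frame number and repeats or skips frames to stay in step. The function name and the repeat/skip policy are illustrative assumptions, not the disclosed mechanism.

```python
def framebuffer_action(local_frame: int, master_frame: int,
                       tolerance: int = 0) -> str:
    """Decide how a slave's framebuffer reacts to the master's frame
    number: repeat when ahead, skip when behind, display when in step."""
    drift = local_frame - master_frame
    if drift > tolerance:
        return "repeat"   # slave is ahead: hold the current frame
    if drift < -tolerance:
        return "skip"     # slave is behind: drop frames to catch up
    return "display"      # in sync: present the frame normally
```

The same comparison could equally be applied earlier in the processing chain, e.g. by throttling the video decoder rather than the framebuffer.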
  • FIG. 9 illustrates an example immersive display system 899 for providing an immersive display experience with control connections (e.g., connections for transmitting commands) between projector systems 900a-c and connections for transmitting content to projector systems 900a-c from a server node 980.
  • Projector systems 900a-c can be sequentially chained, as previously described in this disclosure, with cables 930a-b.
  • Each of projector systems 900a-c can also comprise a media server.
  • One or more media servers, such as the media servers described herein with reference to FIGS. 2, 4, 5, or 6, can be used as the media servers for any of projector systems 900a-c.
  • the media servers of projector systems 900a-c can also include an integrated cinema media processor, which can be a single or unitary electronics board that combines the functionalities of an integrated cinema processor and a media server.
  • Server 990 can first host the cinema content.
  • the cinema content can be stored as DCI-compliant content, including media streams such as DCPs and/or DCDMs.
  • the systems and methods provided in this disclosure can be applied to any file format used to deliver and/or package digital cinema content such as, but not limited to, REDCODE, Tagged Image File Format ("TIFF"), Tag Image File Format/Electronic Photography ("TIFF/EP"), Digital Negative files ("DNG"), Extensible Metadata Platform files ("XMP"), Exchangeable image file format ("Exif"), etc.
  • Server 990 can include, or be coupled to, a network attached storage (“NAS”).
  • Server 990 may also be a component of a TMS or may be part of a standalone system.
  • the media streams can consist of a single file, a merged file, or a plurality of files.
  • the cinema content on server 990 can be stored in compressed, encrypted, and/or packaged form and/or uncompressed, decrypted, and/or unpackaged form.
  • server 990 can run software and/or have hardware that decompresses, decrypts, and/or unpackages cinema content.
  • the data can be uploaded onto server 990 already decompressed, decrypted, and/or unpackaged.
  • Cinema content from server 990 can be transmitted to server node 980 in compressed, encrypted, and/or packaged form and/or decompressed, decrypted, and/or unpackaged form.
  • the cinema content can be transmitted over a cable that has a signal line and a ground line.
  • cables can include coaxial cables, Ethernet cables, HDMI cables, component cables, HD-SDI cables, etc. It is to be understood that other cabling options are within the scope of this disclosure including, for example and without limitation, serial cables, twisted pair cables, USB cables, and the like.
  • data can be transferred using a 1000BASE-T GB transceiver and/or any cable and/or component conforming to IEEE's Gigabit Ethernet standard.
  • cables can be replaced by wireless transmission (e.g., using a wireless networking protocol such as IEEE 802.11).
  • the cinema content can be transmitted over cables 940a-c, which can comprise any of the aforementioned cables or wireless transmission, to each of projector systems 900a-c, respectively.
  • the cinema content can be configured to have a composition playlist ("CPL") for each of projector systems 900a-c.
  • each projector system 900a-c can implement a smart ingest function that limits ingestion of the cinema content to the relevant portions for that particular projector system.
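The smart ingest function can be pictured as filtering a composition playlist down to the assets addressed to one projector. The data representation below is a deliberately simplified stand-in; a real CPL is an XML document defined by the SMPTE/DCI specifications.

```python
def smart_ingest(cpl_assets, projector_id):
    """Return only the asset identifiers relevant to one projector.

    `cpl_assets` is a hypothetical, simplified stand-in for a
    composition playlist: a list of (asset_id, target_projector) pairs.
    This sketch only illustrates limiting ingestion to the relevant
    portions of the cinema content.
    """
    return [asset_id
            for asset_id, target in cpl_assets
            if target == projector_id]

# Hypothetical playlist targeting the three projector systems 900a-c.
cpl = [("reel-1-left", "900a"), ("reel-1-center", "900b"),
       ("reel-1-right", "900c"), ("reel-2-left", "900a")]
```

With this playlist, projector system 900a would ingest only the two "left" reels and leave the center and right assets to the other projectors.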
  • the cinema content can be received by an integrated cinema media processor located at each of projector systems 900a-c.
  • the cinema content can be received by the integrated cinema media processor in decompressed, decrypted, and/or unpackaged form.
  • projector systems 900a-c may not further decompress, decrypt, un-package and/or process the cinema content for viewing.
  • the cinema content can be received in a compressed, encrypted, and/or packaged form.
  • the integrated cinema media processor of projector systems 900a-c may decompress, decrypt, and/or un-package the cinema content before it can be viewed.
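The conditional processing described above can be summarized as a small sketch that derives which inverse operations the integrated cinema media processor must apply before the content can be viewed. The flag and step names are illustrative assumptions; actual DCP handling involves MXF un-wrapping, AES decryption, and JPEG 2000 decoding.

```python
def prepare_for_playback(flags: dict) -> list:
    """Return the ordered list of inverse operations to apply to
    received cinema content, based on the form it arrived in. Content
    that arrives fully unwrapped needs no further processing."""
    steps = []
    if flags.get("packaged"):
        steps.append("unpackage")
    if flags.get("encrypted"):
        steps.append("decrypt")
    if flags.get("compressed"):
        steps.append("decompress")
    return steps
```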
  • Projector systems 900a-c can be configured to project video (e.g., onto a screen) based on at least the received cinema content.
  • cables 950a-c can also be used. Cables 950a-c can comprise any of the abovementioned cables or wireless transmissions, and further provide connectivity between projector systems 900a-c. In some cases, cables 950a-c can provide additional communication between projector systems 900a-c in which each can send commands and/or control signals to the other projectors. In some cases, cables 950a-c can also transmit cinema content compressed, encrypted, and/or packaged, and/or uncompressed, decrypted, and/or unpackaged between projector systems 900a-c.
  • One or more of the projector systems 900a-c can be connected to one or more user interfaces to provide user inputs and/or control.
  • the user interfaces can also display statuses, statistics/data, and log histories.
  • the user interfaces can also contain software to manipulate/edit cinema content and/or process cinema content.
  • projector system 900a can be connected to interface 970 by cable 975, which can be any of the abovementioned cables or wireless transmissions.
  • Interface 970 can be a personal computer, tablet, mobile device, web browser, and/or any device that can send and receive signals to projector system 900a.
  • Projector system 900a can also be coupled to a computer, such as a touchscreen panel computer ("TPC") 960, which can further allow user inputs and/or control.
  • a computing system that has components including a central processing unit (CPU), input/output (I/O) components, storage, and memory may be used to execute the projector system, or specific components of the projector system.
  • the executable code modules of the projector system can be stored in the memory of the computing system and/or on other types of non-transitory computer-readable storage media.
  • the projector system may be configured differently than described above.
  • Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computers, computer processors, or machines configured to execute computer instructions.
  • the code modules may be stored on any type of non-transitory computer-readable medium or tangible computer storage device, such as hard drives, solid state memory, optical discs, and/or the like.
  • the systems and modules may also be transmitted as generated data signals (e.g., as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (e.g., as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames).
  • the processes and algorithms may be implemented partially or wholly in application-specific circuitry.
  • the results of the disclosed processes and process steps may be stored, persistently or otherwise, in any type of non-transitory computer storage such as, e.g., volatile or non-volatile storage.
  • the term "or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list.
  • Conjunctive language such as the phrase "at least one of X, Y and Z," unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. may be either X, Y or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y and at least one of Z to each be present.
  • the terms "about” or “approximate” and the like are synonymous and are used to indicate that the value modified by the term has an understood range associated with it, where the range can be +20%, +15%, +10%, +5%, or +1%.
  • the term “substantially” is used to indicate that a result (e.g., measurement value) is close to a targeted value, where close can mean, for example, the result is within 80% of the value, within 90% of the value, within 95% of the value, or within 99% of the value.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Controls And Circuits For Display Device (AREA)
EP15854753.9A 2014-10-28 2015-09-29 Synchronisierte medienserver und projektoren Withdrawn EP3213505A4 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462069720P 2014-10-28 2014-10-28
PCT/US2015/053003 WO2016069175A1 (en) 2014-10-28 2015-09-29 Synchronized media servers and projectors

Publications (2)

Publication Number Publication Date
EP3213505A1 true EP3213505A1 (de) 2017-09-06
EP3213505A4 EP3213505A4 (de) 2018-04-04

Family

ID=55858153

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15854753.9A Withdrawn EP3213505A4 (de) 2014-10-28 2015-09-29 Synchronisierte medienserver und projektoren

Country Status (4)

Country Link
EP (1) EP3213505A4 (de)
KR (1) KR20170088357A (de)
CN (1) CN105917645A (de)
WO (1) WO2016069175A1 (de)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9838675B2 (en) 2015-02-03 2017-12-05 Barco, Inc. Remote 6P laser projection of 3D cinema content
JP6693367B2 (ja) * 2016-09-21 2020-05-13 セイコーエプソン株式会社 プロジェクションシステムシステム、及びプロジェクションシステムの制御方法
CN107920237A (zh) * 2016-10-10 2018-04-17 深圳市光峰光电技术有限公司 投影仪
EP3528490B1 (de) 2016-10-31 2021-07-07 Huawei Technologies Co., Ltd. Bilddatensynchronisierungsverfahren und endgerät
CN106328106A (zh) * 2016-11-09 2017-01-11 佛山市高明区子昊钢琴有限公司 一种多媒体钢琴及其自动演奏方法、系统
KR101823033B1 (ko) * 2017-01-26 2018-03-14 씨제이씨지브이 주식회사 실시간 렌더링 효율을 개선 시킨 영상 관리 시스템 및 그 방법
JP2020106580A (ja) * 2018-12-26 2020-07-09 セイコーエプソン株式会社 表示装置、表示システム、及び表示方法
KR102390872B1 (ko) * 2019-12-19 2022-04-26 애드커넥티드 주식회사 복수의 커넥티드 장치 간 디지털 콘텐츠의 재생 동기화를 맞추는 방법 및 이를 이용한 장치
CN113099193B (zh) * 2019-12-23 2022-11-25 明基智能科技(上海)有限公司 投影仪、沉浸式投影系统及方法
CN114079761B (zh) * 2020-08-21 2024-05-17 深圳市环球数码科技有限公司 集成的符合dci标准的电影播放器及dlp投影设备
CN112422770A (zh) * 2020-11-18 2021-02-26 厦门视诚科技有限公司 一种针对多台4k分辨率的视频处理器的同步方法及系统
WO2022181859A1 (ko) * 2021-02-26 2022-09-01 애드커넥티드 주식회사 복수의 커넥티드 장치 간 디지털 콘텐츠의 재생 동기화를 맞추는 방법 및 이를 이용한 장치

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4456927A (en) * 1981-12-02 1984-06-26 Vidicraft, Inc. Video circuitry
EP0589217A1 (de) * 1992-09-24 1994-03-30 Siemens Stromberg-Carlson Verfahren und Vorrichtung zur Synchronisierung einer seriellen Linienstrecke
JPH0964874A (ja) * 1995-08-28 1997-03-07 Sony Corp データ伝送方法およびデータ伝送システム
CN1501365B (zh) * 1999-01-27 2023-04-25 皇家菲利浦电子有限公司 记录载体和提供记录载体的方法
US7012906B1 (en) * 1999-03-15 2006-03-14 Lg Information & Communications, Ltd. Pilot signals for synchronization and/or channel estimation
JP3826659B2 (ja) * 2000-03-27 2006-09-27 セイコーエプソン株式会社 投写表示システムおよび投写型表示装置
WO2002086745A2 (en) * 2001-04-23 2002-10-31 Quantum 3D, Inc. System and method for synchronization of video display outputs from multiple pc graphics subsystems
US7068278B1 (en) * 2003-04-17 2006-06-27 Nvidia Corporation Synchronized graphics processing units
JP4977950B2 (ja) * 2004-02-04 2012-07-18 セイコーエプソン株式会社 マルチ画面映像再生システム、映像再生方法及び表示装置
ITBG20070012U1 (it) * 2007-07-16 2009-01-17 Antonio Francesco Valli Manica monolitica da impiegarsi nella lavorazione di capi di abbigliamento,in materiale elastomerico,ottenuta per immersione della forma nell'elastomero in forma liquida,con la tecnica del slash moulding o per spruzzatura dell'elastomero liquido sull
US8963802B2 (en) * 2010-03-26 2015-02-24 Seiko Epson Corporation Projector, projector system, data output method of projector, and data output method of projector system
WO2012060814A1 (en) * 2010-11-01 2012-05-10 Hewlett-Packard Development Company, L.P. Image display using a virtual projector array
CN103649904A (zh) * 2011-05-10 2014-03-19 Nds有限公司 自适应内容呈现
US8621527B2 (en) * 2012-05-04 2013-12-31 Thales Avionics, Inc. Aircraft in-flight entertainment system with robust daisy-chained network
JP2013247591A (ja) * 2012-05-29 2013-12-09 Sony Corp 上映管理装置および上映管理方法
GB2502578B (en) * 2012-05-31 2015-07-01 Canon Kk Method, device, computer program and information storage means for transmitting a source frame into a video display system
US9247180B2 (en) * 2012-12-27 2016-01-26 Panasonic Intellectual Property Corporation Of America Video display method using visible light communication image including stripe patterns having different pitches
KR20140063534A (ko) * 2014-01-23 2014-05-27 씨제이씨지브이 주식회사 다면 상영 시스템

Also Published As

Publication number Publication date
KR20170088357A (ko) 2017-08-01
WO2016069175A1 (en) 2016-05-06
EP3213505A4 (de) 2018-04-04
CN105917645A (zh) 2016-08-31

Similar Documents

Publication Publication Date Title
US20160119507A1 (en) Synchronized media servers and projectors
EP3213505A1 (de) Synchronisierte medienserver und projektoren
US11544029B2 (en) System and method for synchronized streaming of a video-wall
US11006168B2 (en) Synchronizing internet (“over the top”) video streams for simultaneous feedback
US9936185B2 (en) Systems and methods for merging digital cinema packages for a multiscreen environment
US9628868B2 (en) Transmission of digital audio signals using an internet protocol
KR101575138B1 (ko) 무선 3d 스트리밍 서버
EP3477952B1 (de) Vorrichtung, systeme und verfahren zur verteilung von digitalen inhalten
JP4990762B2 (ja) インターネットプロトコルに用いるストリーミングオーディオとストリーミングビデオとの同期保持
US9591043B2 (en) Computer-implemented method, computer system, and computer program product for synchronizing output of media data across a plurality of devices
WO2021031739A1 (zh) 云桌面视频播放方法、服务器、终端及存储介质
WO2017172514A1 (en) Synchronized media content on a plurality of display systems in an immersive media system
US20130016196A1 (en) Display apparatus and method for displaying 3d image thereof
WO2018128879A1 (en) Branch device bandwidth management for video streams
WO2020241309A1 (ja) 同期制御装置、同期制御方法及び同期制御プログラム
WO2014162748A1 (ja) 受信装置、及び受信方法
CN111294628B (zh) 一种多通道沉浸式影音视频控制系统
WO2017101356A1 (zh) 视频信号处理设备
US11856242B1 (en) Synchronization of content during live video stream
Ryan Variable frame rate display for cinematic presentations
Llobera et al. Creating and broadcasting video-based multi-platform experiences
KR101794521B1 (ko) 3차원 영상 출력 시스템
KR101474142B1 (ko) 네트워크 상에서 실시간 스테레오스코픽 3d 비디오의 시간적 비동기를 해결하기 위한 시간적 동기화 방법 및 시스템
WO2017101338A1 (zh) 视频信号处理方法及设备

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20170526

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20180302

RIC1 Information provided on ipc code assigned before grant

Ipc: H04N 21/4363 20110101ALI20180226BHEP

Ipc: H04N 21/414 20110101ALI20180226BHEP

Ipc: H04N 21/41 20110101ALI20180226BHEP

Ipc: H04N 5/74 20060101AFI20180226BHEP

Ipc: H04N 21/433 20110101ALI20180226BHEP

Ipc: H04N 5/04 20060101ALI20180226BHEP

Ipc: H04N 21/43 20110101ALI20180226BHEP

Ipc: H04N 9/31 20060101ALI20180226BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20181002