WO2017172514A1 - Synchronized media content on a plurality of display systems in an immersive media system - Google Patents

Synchronized media content on a plurality of display systems in an immersive media system

Info

Publication number
WO2017172514A1
Authority
WO
WIPO (PCT)
Prior art keywords
slave
video
display system
synchronization
master
Prior art date
Application number
PCT/US2017/024003
Other languages
English (en)
Inventor
Alexander William GOCKE
Diego Duyvejonck
Scott STREMPLE
Jerome Delvaux
Emmanuel Cappon
Original Assignee
Barco, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Barco, Inc. filed Critical Barco, Inc.
Publication of WO2017172514A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/04Synchronising

Definitions

  • the present disclosure generally relates to systems and methods of an immersive media system that provides synchronized video on a plurality of display systems.
  • Media content can be delivered to homes and other venues for viewing. For example, in many households, content is delivered via cable, the internet, and/or antenna.
  • Example embodiments described herein have innovative features, no single one of which is indispensable or solely responsible for their desirable attributes. Without limiting the scope of the claims, some of the advantageous features will now be summarized.
  • An immersive media system can include a plurality of display systems arranged to provide immersive viewing of video.
  • an immersive media system can comprise a plurality of television screens arranged around a viewer and/or audience. In this way, the viewer/audience can experience a sense of immersion into the environment depicted in the video. Synchronized video provided by the plurality of television screens can create a unified video presentation.
  • Such immersive media systems are capable of generating audiovisual presentations with a relatively high level of realism due at least in part to video being simultaneously presented to the viewer from many directions.
  • media content is provided for a single display system for viewing.
  • Such single-screen display systems are not configured to provide multi-view content (e.g., media streams designed to be shown on a plurality of screens).
  • Combining a plurality of such single-screen display systems to enable presentation of multi-view content to create an immersive media system presents a number of challenges. For example, to provide an immersive audiovisual environment it can be important to reduce or eliminate issues that may destroy the immersive quality of the experience for the viewer.
  • if video from different screens is not synchronized, a viewer can become disoriented, distracted, or can otherwise lose a sense of immersion in the environment.
  • Combining single-screen display systems can result in video synchronization issues because such display systems may not be configured to synchronize video with other display systems.
  • attempts to convert single-screen display systems to be part of an immersive media system can result in out-of-sync video on different displays, reducing viewer enjoyment and satisfaction.
  • media content can be streamed to a single receiving device (e.g., a set-top box) via cable, internet, antenna, etc.
  • another challenge of providing media content to the plurality of display systems is distributing the data to each of them.
  • a master display system can generate a synchronization signal based at least in part on the media stream provided to the master display system and transmit the synchronization signal serially to a plurality of slave display systems (e.g., creating a chain of displays).
  • each slave display system in the chain receiving the synchronization signal can (1) pass the signal to a subsequent slave display system and (2) process the synchronization signal to determine when to display a video frame so that it is synchronized with the video of the master display system.
  • the displays can thus be connected in serial, or chained together, to synchronize the media streams of each, the synchronization signal being provided by the master display system.
  • an immersive media system comprises at least 3 displays that are sequentially chained.
  • Media content can be downloaded to each display.
  • a master display creates and transmits a synchronization signal to enable synchronous projection of video content by all display systems with sub-frame accuracy.
  • the synchronization signal gets passed sequentially among the at least 3 chained displays.
  • a sequentially chained display system can utilize a wireless connection (e.g., using a wireless networking protocol such as IEEE 802.11n) to transmit a synchronization signal derived from standard timecodes used in media environments.
  • Using sequentially chained display systems can also simplify the generation and transmission of the synchronization signal.
  • the serial wireless connection design can reduce or eliminate the need for signal amplification relative to an immersive media system employing a parallel connection infrastructure.
  • the serial connection design reduces or eliminates a need for a centralized distribution system to distribute the synchronization signal to each display system relative to an immersive media system employing a parallel connection infrastructure.
  • serial connection design enables flexibility in the number of display systems in the immersive media system because the addition of another display simply entails adding another link in the display system chain. This can provide an advantage over a parallel connection design as a maximum number of display systems can be reached in a parallel system when the synchronization signal distribution system runs out of available connection points.
  • the serial connection design also can result in relatively small latency between display systems.
  • the synchronization signal can also enable synchronization of video with different frame rates, aspect ratios, codecs, or the like due at least in part to the synchronization signal being independent of these parameters.
  • a master display system can generate a synchronization signal based at least in part on a signal coming from its framebuffer and a slave display system can synchronize its video based at least in part on regulating the signal output of its framebuffer according to information in the synchronization signal.
  • multi-view content can be packaged for digital delivery and ingested by a receiving device at the venue, wherein the package comprises a plurality of channels of video to be displayed by a corresponding plurality of display systems.
  • each of the video channels can conform to standard digital content packaging formats (e.g., MKV, MP4, XVID, AVI, WMV, MOV, and/or other media formats).
  • the master display system can selectively distribute video data to the display intended to play the video data.
  • each display in the immersive media system ingests the entire package and is configured to determine the appropriate video channel to decode and display.
  • the master display system is configured to automatically determine the appropriate slave display system for each video channel in the package and to transmit the video channel data to that slave display system, where transmission can occur prior to presentation of the multi-view content, during playback (with the slave display system buffering the video channel data), or in real time as the video channel data is delivered and displayed.
  • the master display system includes hardware and/or software components that distinguish it from slave display systems.
  • a slave display system can include a synchronization module or card that allows it to frame-lock the video presentation based at least in part on the synchronization signal originating from the master display.
  • the master display system and slave display system contain the same hardware and/or software components, but the master display system is configured to act as the master while other display systems are configured to act as slaves.
  • a display system comprises a media server and display screen integrated into a single physical unit.
  • a display system comprises a media server and a display screen that are physically separate and communicatively coupled to one another (e.g., through a wired or wireless connection).
  • FIG. 1 illustrates an example method of generating media content, delivering the media content to a home or other venue, receiving and unpacking the content, sending the content to a first display system, and then sending the content to display systems in a chained sequence.
  • FIG. 2 illustrates an example immersive media system having a plurality of display systems.
  • FIG. 3 illustrates a plurality of example display systems ingesting digital content for viewing.
  • FIG. 4 illustrates a block diagram of an example media server for a display system.
  • FIG. 5 illustrates a flow chart of an example method of synchronizing multiple media streams in serially connected media servers in an immersive media system.
  • FIG. 6 illustrates a flow chart of an example method of synchronizing a slave video with a master video based at least in part on a synchronization signal from a master display system.
  • FIG. 1 illustrates an example method of generating media content, delivering the media content to a home or other venue, receiving and unpacking the content, sending the content to a first display system, and then sending a synchronization signal to display systems in a chained sequence.
  • Block 100 shows generating media content.
  • the media content can be generated by a service provider (e.g., a cable company, movie distributor, internet company, streaming service, etc.).
  • the media content can be in a variety of formats including MKV, MP4, XVID, AVI, WMV, MOV, and/or other media formats.
  • the media content can also be formatted for display in a variety of resolutions including 4K (e.g., 3636x2664, 3996x2160, 3840x2160, 4096x2160, etc.), 2K (e.g., 1828x1332, 1998x1080), HD (e.g., 1920x1080, 1280x720), SD (640x480) or the like.
  • the media content can also be in a variety of frame rates including, for example and without limitation, 24 fps, 30 fps, 60 fps, 120 fps, etc.
  • the media content can also undergo additional compression during generation.
  • color information in 4K video can be encoded using chroma subsampling and other techniques in order to enable transmission of the video using lower bandwidths.
  • compression of the media content can be achieved by utilizing a lower resolution signal (e.g., lower resolution than 4K resolution).
  • the lower resolution signal can be 2K, HD, SD, or the like.
  • the lower resolution signal can comprise video frames, where each video frame comprises pixels for displaying the video frame. When viewed in a sequence, the video frames can appear as a movie.
  • Each pixel can have a color.
  • each entry can comprise three values representing the relative red, green, and blue composition of a pixel. In other cases, each entry can be a scaled number between zero (0) and one (1), reflecting a color and/or color intensity.
  • each entry can be a number in any range of numbers reflecting different colors.
  • the entries can be letters and/or numbers that correspond to particular colors and/or color intensities.
  • the color of a pixel can be represented in any number of ways. In this disclosure, any way of representing the color, color intensity, and/or any other attribute of a pixel (e.g., hue, contrast, brightness, shade, grayscale, etc.) will be called the "color" of the pixel.
  • the visual combination of the pixel colors can create the image perceived by a viewer.
  • using 2K signals as the lower resolution signal may be desirable because 2K signals require less upscaling to achieve 4K resolution.
  • the lower resolution signal can be smaller in size (e.g., smaller in bytes, megabytes, gigabytes, terabytes, and the like to transmit or store the signal) than a 4K signal.
  • An up-conversion signal can be transmitted along with the lower resolution signal in order to upscale the lower resolution signal to 4K resolution.
  • the up-conversion signal can be a separate signal or integrated into the lower resolution signal itself.
  • the up-conversion signal can contain instructions for each pixel and/or groups of pixels in each frame to upscale the signal, such as by interpolating additional pixels between pixels of the lower resolution signal.
  • a 100 x 200 pixel frame can be upscaled by a factor of two, creating a 200 x 400 pixel frame.
  • for example, for each pixel of the 100 x 200 pixel frame, additional pixels can be interpolated so that the frame enlarges to 200 x 400 pixels.
  • the lower resolution signal can be upscaled by a receiving device.
  • the receiving device processes the up-conversion signal, which instructs the receiving device to determine the color of an interpolated pixel based on pixels in other frames and/or other locations within the same frame of the interpolated pixel.
  • interpolation can be used as desired. Some examples of techniques are nearest-neighbor interpolation, bilinear interpolation, bi-cubic interpolation, and/or directional upscaling. In some embodiments, more holistic methods can be implemented, such as, without limitation, comparing blocks (e.g., sections of adjacent pixels in a frame) or pixels along visual lines (e.g., edges) in a frame to find the color of an interpolated pixel.
  • the lower resolution signal can have frames of higher resolution mixed in.
  • the lower resolution signal can comprise 100 frames.
  • the majority of the frames can have 2K resolution, however, some subset of the frames can have 4K resolution.
  • every fifth frame can have 4K resolution.
  • a frame with 2K resolution can interpolate pixels based, at least in part, on pixel values of the frames that have 4K resolution.
  • the interpolated pixel value can be calculated relative to the corresponding pixel in the sequentially closest preceding 4K resolution frame and/or sequentially closest succeeding 4K resolution frame.
  • the up-conversion signal can comprise instructions including how to adjust the pixel color relative to those corresponding pixels.
  • the instructions can be pixel specific (e.g., instructions in how to change the color of the particular interpolated pixel) or for groups of pixels.
  • the instructions can specify how to change blocks of pixels.
  • Such instructions can comprise mathematical transformations as desired.
  • the instructions can comprise mathematical functions comprising addition, multiplication, division, subtraction, etc.
  • the media content can be made available in any number of mediums including DVD, Blu-Ray, Redray, hard drive/media drive, internet, broadcast signal, etc.
  • the content can be formatted to be played by a receiving device, such as a set-top box.
  • the video content can further encode content for a plurality of display systems, including 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, or more display systems.
  • Block 101 shows delivering media content to home or other venue. Delivery can be accomplished by physical delivery of a disk (e.g., DVD, Blu-Ray, Redray), hard drive/media drive, or other medium. It can also be accomplished by sending information over telecommunication lines, cable lines, wirelessly, and/or through a broadcast signal.
  • the delivered media content is received and unpacked by a receiving device.
  • a set-top box can be configured to decode, decompress, and/or ingest the media content for viewing.
  • the media content is sent to the plurality of display systems.
  • the media content can be sent to just a master display system.
  • the master display system can then connect to one or more slave display systems to distribute all or a portion of the media content to each of those connected display systems.
  • the media content can be distributed over cables that have signal lines and ground lines, such as coaxial cables, Ethernet cables, HDMI cables, and the like. It is to be understood that other cabling options are within the scope of this disclosure including, for example and without limitation, serial cables, twisted pair cables, USB cables, and the like.
  • data can be transferred using a 1000BASE-T gigabit transceiver and/or any cable and/or component conforming to the IEEE Gigabit Ethernet standard.
  • cables can be replaced by wireless transmission (e.g., using a wireless networking protocol such as IEEE 802.11n).
  • the receiving device can distribute all or a portion of the media content to each of the display systems.
  • Each display system can optionally include a media server configured to receive and transmit media content.
  • the media server can also be configured to receive and transmit synchronization signals.
  • a synchronization signal is sent from the first display system to the second display system.
  • the signal can be sent by wireless transmission using a wireless networking protocol such as IEEE 802.11n. It can also be sent over cables that have signal lines and ground lines, such as coaxial cables, Ethernet cables, HDMI cables, and the like.
  • FIG. 2 illustrates an example immersive media system having a plurality of display systems.
  • the immersive media system 200 comprises a plurality of display systems 204a-c, configured to display media content for providing an immersive media experience.
  • display systems 204a-c can comprise multiple direct-view displays, multiple rear-projection displays, multiple front-projection displays, liquid-crystal displays (LCDs), light-emitting diode (LED) displays, LED LCD displays, in-plane switching panels (IPSs), cathode ray tubes, plasma displays, ultra high definition (UHD) panels, 4K displays, retina displays, organic LED displays, surfaces, canvases, and/or any displays, televisions, monitors, panels, and/or devices known in the art for visual presentation.
  • the immersive media system 200 can include a plurality of flat or curved displays or screens or it can include a single curved display or screen.
  • the displays can be rotated relative to one another.
  • Display systems 204a-c can also have respective inclinations relative to one another.
  • the screens 204a-c of the immersive media system 200 can include flat screens, curved screens, or a combination of both.
  • the viewer can be positioned so as to view displays 204a-c.
  • viewers can be located at seating 201.
  • the example immersive media system 200 includes display systems 204a-c configured to show video. Sound systems can be included in display systems 204a-c and/or otherwise positioned to provide an immersive experience.
  • a media server is physically separate from the display and is communicably coupled (e.g., through wired or wireless connections) to the display.
  • the display comprises an integrated media server.
  • the media server can include hardware and software components configured to receive, store, and decode media content.
  • the media server can include hardware and software configured to ingest and decode digital content files, to produce a media stream (e.g., video and audio), to send image data to the display.
  • the media server can include modules for ingesting digital content, decoding ingested content, generating video from the decoded content, generating audio from the decoded content, providing security credentials to access secure content, generating or interpreting synchronization signals to provide a synchronized presentation, and the like.
  • the display system 204a can be configured to be the master display system.
  • the master display system or the master media server provides the synchronization signal to which the slave display systems synchronize their output.
  • the master display system 204a ingests, decodes, and/or provides the main audiovisual presentation in the immersive media system 200.
  • Display systems 204b and 204c are slave display systems.
  • a slave display system or slave media server provides images synchronized to the master system wherein synchronization is based at least in part on the synchronization signal provided by the master display system.
  • a slave display system can provide video that is projected peripheral, adjacent, near, and/or otherwise complementary to the video provided by the master display system.
  • the master display system 204a transmits a synchronization signal over a cable and/or wirelessly.
  • the synchronization signal is the same or substantially the same for all display systems to enable globally synchronized video in the immersive media system. Accordingly, due at least in part to display systems 204a-c displaying video based on the synchronization signal, a synchronized video presentation is provided.
  • synchronized video includes video shown from different display systems having corresponding frames that are displayed within a sufficiently small time window from one another so as to be displayed substantially simultaneously.
  • synchronized video includes video wherein corresponding frames are displayed such that a time between the display of the synchronized frames is less than or equal to about 1 ms, less than or equal to about 500 μs, less than or equal to about 350 μs, less than or equal to about 250 μs, or less than or equal to about 200 μs.
  • Such synchronization can be referred to as having sub-frame accuracy in its synchronization. For example, for a video that has a frame rate of 30 fps (or 60 fps), each frame of video is displayed for about 33.3 ms (or 16.7 ms). Videos that are synchronized to within a fraction of the time a video frame is displayed can be said to have sub-frame accuracy.
  • sub-frame accuracy can include synchronization that has a latency between corresponding frames that is less than about 10% of the frame period, less than about 5% of the frame period, less than about 1% of the frame period, or less than about 0.1% of the frame period.
  • the master display system 204a can control display of a video in units of frames and synchronize the video frames from display systems 204b and 204c using a time code for each frame, the time code being carried by the synchronization signal, as described in greater detail herein with reference to FIG. 3. Accordingly, the display systems 204a-c can accurately synchronize the video based at least in part on the time code for each frame in the synchronization signal.
  • FIG. 2 illustrates 3 display systems 204a-c.
  • the immersive media system 200 can include a different number of display systems.
  • immersive media system 200 can include 2, 3, 4, 5, 6, 7, 8, 9, 10, or more than 10 display systems.
  • the immersive media system 200 can be configured such that more than one display system provides video on a single screen, such that the images substantially overlap.
  • the immersive media system 200 can be configured such that display systems provide video on a single display screen wherein the videos from display systems minimally overlap, are adjacent to one another, or are near one another to provide a substantially unitary video presentation.
  • FIG. 3 illustrates a plurality of example display systems 302a-c ingesting digital content 301 for display in an immersive media system 300.
  • the digital content 301 can be any collection of digital files that include content data and metadata that make up a composition to be displayed by the immersive media system 300.
  • the digital content 301 can be received by receiving device 306.
  • Receiving device 306 can process and ingest digital content 301.
  • Receiving device 306 can also pass digital content 301 through network connection 350 in non-ingested form to the plurality of display systems 302a-c, where the digital content can be ingested at one or more of the plurality of display systems 302a-c.
  • the media servers 310a-c can be configured to extract the appropriate data files from the ingested digital content 301 and to decode the appropriate video content to send to the corresponding display systems 302a-c.
  • the master display system 302a can generate a first synchronization signal 304a to send over router 303.
  • Router 303 can relay the synchronization signal to the first slave display system 302b.
  • the first slave display system 302b, having received the synchronization signal as synchronization signal 305a, can then send it onward to the second slave display system 302c as synchronization signal 305b.
  • This method and system of using router 303 to relay synchronization signals can repeat for each display system in immersive media system 300. In this way, immersive media system 300 can display synchronized video from a plurality of display systems.
  • the master and slave display systems 302a-c can be configured to ingest only portions of the digital content 301 intended for that particular display system.
  • a display system 302a-c can download portions of a digital package that contain the data sufficient to provide the content intended for that particular display system.
  • the master display system 302a ingests the entire digital content 301 and distributes a copy of that digital content 301 to the slave display systems 302b-c.
  • after ingesting the digital content 301, the master display system 302a distributes a portion of the digital content 301 to each slave display system 302b-c, wherein the portion transmitted to each slave display system contains the data sufficient for that particular slave display system to provide its audiovisual presentation.
  • Transmission of digital content 301 can occur over the network connection 350 which can be a wired connection, a wireless connection, or a combination of both wired and wireless connections.
  • the master display system 302a can transmit copies of the digital content 301 or copies of portions of the digital content 301 to the slave display systems 302b-c over the network connection 350.
  • the master display system 302a can transmit the digital content 301 to the slave display systems 302b-c prior to presentation of the composition contained in the digital content 301, during presentation of the composition by buffering the data in each slave display system 302b-c, and/or during presentation of the composition in real time.
  • the master display system 302a can transmit information to the slave display systems 302b-c indicating which portion of the digital content 301 that each slave display system should ingest.
  • each slave display system 302b, 302c can ingest a portion of the digital content 301.
  • the digital content 301 can include data that is encoded and/or compressed.
  • the digital content 301 can conform to a variety of formats and/or specifications.
  • digital content 301 can comprise files encoded using WMV3, WMA, VC-1 Advanced Profile, high-efficiency video coding (HEVC) (e.g., H.265), H.264, MPEG-4, VP8, VP9, Daala, Theora, and the like.
  • the files can include Digital Rights Management ("DRM").
  • the files can include video and sound for the plurality of display systems (e.g., 1, 2, 3, 4, 5, 6, 7, 8, or more display systems) in immersive media system 300.
  • the files can be display-system specific (e.g., content for master display system 302a, slave display system 302b, or slave display system 302c) or can be run on any display system.
  • digital content 301 can include metadata or other files that can be used to identify and/or designate files for playing on a particular display system in immersive media system 300.
  • a playlist or directory can be used to designate files for playing on the particular display system.
  • the other files, playlist, and/or directory can be used at least in part to implement a smart ingest function that limits ingestion of digital content 301 to the portions relevant to its intended display system.
  • This smart ingestion can occur at the receiving device 306, wherein the appropriate content is sent to the correct display system, or can occur at each or any of the media servers of the display systems (e.g., media servers 310a-c).
  • the immersive media system 300 displays content from the display systems 302a-c blended together to accommodate a curved display screen.
  • the immersive media system 300 can display other content synchronized to digital content 301. This can allow for dynamic content (e.g., feeds from social media, advertisements, news feeds, etc.) to be displayed along with a main video presentation (e.g., a feature film).
  • one or more of the display systems 302a-c provides the dynamic, synchronized content overlaid on the display screens of the display systems.
  • the systems and methods described herein can advantageously allow the synchronization of video from a plurality of display systems when the digital content 301 conforms to a single specification, multiple specifications, or a combination of a single specification and no specification.
  • This advantage is realized due at least in part to the master display system 302a generating the synchronization signal after the media content has been decoded and/or ingested.
  • the master display system 302a can generate an appropriate timeline and metadata independent of the format of the digital content 301 and encode that information into the synchronization signal.
  • the synchronization can be done between the video frames (e.g., line-based synchronization).
  • the master display system 302a can generate the synchronization signal after the frame buffer output in the display, prior to the showing of the frame on the display screen.
  • Each slave display 302b-c can receive the synchronization signal and control its display of video based on the timeline and metadata in the signal.
  • slave display systems 302b-c can frame-lock to the synchronization signal at a similar hardware level to the master display system 302a (e.g., after the frame buffer and prior to the modulation chip).
  • the display systems 302a-c can be synchronized on a frame basis, frame-locking content wherein timing is linked on a frame-by-frame basis.
  • the immersive media system 300 can synchronize display systems 302a-c with each other for content playback with sub-frame accuracy, wherein each media server of each display system has files in a different format and/or following a different specification.
  • the immersive media system 300 can also synchronize video having different aspect ratios, different content formats, and/or different frame rates.
  • side screens can have a frame rate that is higher than a frame rate of the main screen or vice versa.
  • synchronization of different frame rates can occur where the differing frame rates are multiples of one another (e.g., 30 fps and 60 fps), multiples of a common frame rate (e.g., 24 fps and 60 fps are both multiples of 12), or where the data rate of the synchronization signal allows for synchronization at differing frame rates (e.g., where the base frequency of the synchronization signal is a multiple of possible frame rates).
  • the immersive media system 300 can also synchronize video that is stereoscopic, not stereoscopic, or a combination of both.
  • the master display system 302a and the slave display systems 302b-c can be substantially identical devices.
  • the user can configure the devices to assume the roles of master and slave.
  • the content ingested by the devices determines, at least in part, the role of the display system (e.g., master or slave).
  • the immersive media system 300 can thus be configured to not include any specific main server that controls the slave display systems.
  • the master display system 302a can include hardware and/or software components that differentiate it from the slave display systems 302b-c.
  • the master display system 302a can include hardware and/or software specific to generating the synchronization signal.
  • the slave display systems 302b-c can include hardware and/or software specific to synchronizing video output based on the synchronization signal.
  • FIG. 4 illustrates a block diagram of an example media server system 410.
  • the media server system 410 can be a master media server or a slave media server.
  • the media server system 410 can be configured to generate a synchronization signal (e.g., when it is a master media server system), transmit the synchronization signal (e.g., over a synchronization link such as a coaxial cable), receive a synchronization signal (e.g., when it is a slave media server system), synchronize presentation of a video based at least in part on the synchronization signal, send and receive communications over a network connection, process digital files to generate a video, provide security credentials to extract video, and the like.
  • the media server system 410 can include hardware and software sufficient to accomplish the functionality described herein.
  • the media server system 410 includes a controller 401, such as a computer processor, and a data store 402, such as a non-transitory computer storage. Controller 401 can be configured to provide computational power and to direct and coordinate the execution of functions sufficient to provide the targeted and desired functionality of the media server system 410.
  • the data store 402 can be used to store digital files, e.g., software, executable instructions, configuration settings, calibration information, and the like.
  • the media server 410 provides a user interface or a control program accessible over a network connection that allows a user or other system to provide commands to the media server system 410, to monitor a status of the media server system, and/or to request information from the media server system 410.
  • the media server system 410 includes a communication module 403 configured to process, send, receive, construct, and/or interpret information over a network connection, such as the network connection 350 described herein with reference to FIG. 3.
  • the communication module 403 can be configured to ingest digital content for display by an associated display.
  • the communication module 403 can be configured to perform a smart ingest function wherein data necessary for displaying content on the associated display is ingested and other data is not ingested.
  • the communication module 403 can be configured to send commands to be performed by connected media servers.
  • a master media server can command one or more slave media servers to control its associated display system, for example by dousing the shutter or other similar functionality.
  • the communication module 403 in a slave display system can be configured to receive and interpret commands received from a master display system.
  • the media server system 410 includes a media module 404 configured to process digital data to generate a video presentation.
  • the media module 404 can be configured to extract packaged files from a standard format, such as a DCP package, and to provide an appropriate signal to a display screen so that the display screen displays intended video. For example, to display a feature film, the media module 404 can decompress digital files, identify an appropriate playlist file, decode associated image essence files, decode associated audio essence files, and produce a video signal that is sent to a display screen for display.
  • the media server system 410 includes a security module 405 configured to provide appropriate security functionality to access secure and/or encrypted digital files.
  • the security module 405 can provide appropriate security credentials and decrypt the digital files so that the media module 404 can access the files.
  • the security module can also provide security functionality when the video signal generated by the media module 404 is to be sent over a cable to the display screen, such as when the display screen is physically separate from the media server system 410.
  • the media server system 410 includes a synchronization module 406 configured to generate a synchronization signal (e.g., when the media server 410 is part of a master display system), transmit the synchronization signal (e.g., wirelessly and/or over a cable), and/or process the synchronization signal (e.g., when the media server 410 is part of a slave display system).
  • the synchronization module 406 can be configured to generate the synchronization signal.
  • the synchronization signal can be generated independent of synchronization information provided in the digital files related to the composition (e.g., video presentation). For example, the synchronization signal can be generated based at least in part on the video signal generated by the media module.
  • the synchronization signal can be generated based at least in part on the output of a frame buffer in the display screen, prior to (or in parallel with) the video signal being input to the display screen.
  • the synchronization signal can be a waveform having information that is encoded therein.
  • the waveform can utilize a biphase mark code to encode data (e.g., as used in AES3 and S/PDIF signals).
  • the synchronization signal encoded with biphase mark code can be polarity insensitive which can be advantageous in an immersive media system with a plurality of display systems.
  • the waveform can be divided into words, or groups of bits, with information encoded at particular places within the words.
  • the waveform can have one or more encoded words.
  • the synchronization waveform can be a modified linear time code or a modified AES3 signal.
  • the waveform can encode SMPTE timecode data to enable synchronization of slave display systems to the master display system.
  • the waveform can also encode commands or other information (e.g., metadata) addressed to or intended for one or more display systems.
  • the synchronization signal can include two 64-bit words.
  • the first word can include a 24-bit frame number, a valid status bit, a channel status bit, a user data bit, and a parity bit (e.g., to validate a received word).
  • an active edge in the user data bit can be used to indicate that the master display system will start the next frame.
  • the second word can contain a command structure used by a master display system to provide commands to connected slave display systems. Additional data, such as metadata, can be included in the first or second word.
  • the second word can include a 24-bit instruction from the master display system to connected slave display systems.
  • the metadata can be used to provide information to the slave display systems to modify their functionality.
  • Metadata can be used to indicate that the master display system is paused.
  • the slave display systems can then pause their playback until another signal is received indicating that playback on the master display system has resumed.
  • the synchronization signal can be a modification of standard signals, such as the linear time code or AES3 signal. This can allow existing display systems, hardware, and/or software to incorporate elements sufficient to implement the synchronization signal in a relatively straightforward and easy fashion.
  • the synchronization module 406 can include look-up tables, data structures, data tables, data bases, or the like to interpret the signals encoded into the synchronization signal.
  • the synchronization module 406 can include a command table that correlates commands with numbers encoded into the synchronization signal.
  • the synchronization signal can have a data rate of about 2 Mbps.
  • at this data rate, the time to transmit a 64-bit word is about 32 μs.
  • the time to transmit a packet is about 64 μs.
  • the synchronization module 406 can be configured to adjust display of a video frame based at least in part on the synchronization signal. For example, the synchronization module 406 can wait for a matching frame id received in the synchronization signal. When the matching frame id is received, the synchronization module 406 can indicate to the display system to display the appropriate video frame.
  • the synchronization module 406 generates the synchronization signal based at least in part on audio provided by the media module 404.
  • sound can be generated by the master display system and the timing of the audio can drive the video synchronization chain.
  • the audio can be processed by the media module 404 in real time and the video frames can be specified in terms of the number of clock cycles relative to the audio clock domain. This can enable automatic alignment of audio and video during playback.
  • continuous or substantially continuous adjustments to video playback can be performed during the video blanking time slot (e.g., using a back-pressure algorithm). Accordingly, the master display system can play audio in real time and display the video synchronized to the audio using the media module 404.
  • the master display system also provides a synchronization signal via the synchronization module 406 to connected slave display systems.
  • the slave display systems can then synchronize their video to this synchronization signal provided by the master display system, so that their video is synchronized with the master video and not necessarily with their own audio.
  • media server 410 can be configured to allow a display system to be upgraded from a single-display system to a display system that can be part of an immersive media system, such as the immersive media system described herein with reference to FIGS. 1 and 2. For example, the media server can be attachable to an existing display system.
  • FIG. 5 illustrates a flow chart of an example method 500 of synchronizing multiple media streams in serially connected media servers in an immersive media system.
  • the method 500 can be performed by a plurality of display systems and/or media servers in an immersive media system.
  • One or more media servers, such as the media servers described herein with reference to FIGS. 3 or 4, can perform one or more of the steps of the method 500.
  • one or more modules of the media server such as those described herein with reference to FIG. 4, can perform one or more of the steps of the method 500.
  • a single step of the method 500 can be performed by more than one module and/or display system.
  • a master display system and/or a receiving device extracts a composition for presentation.
  • the composition can include video and/or audio to be presented to an audience.
  • the composition can include video to be displayed by the master display system.
  • the composition can include video to be displayed by two or more slave display systems.
  • the master display system can transmit the data sufficient to display the video to the respective slave display systems.
  • two or more slave display systems each extract a composition for presentation by the respective slave display system.
  • the master display system generates a synchronization signal based at least in part on the extracted composition.
  • the synchronization signal can encode data words into a synchronization waveform.
  • the encoded data words can include synchronization information in the form of a timecode.
  • the master display system generates the synchronization signal based at least in part on audio in the composition for presentation by the master display system.
  • the master display system transmits the synchronization signal to a first slave display system.
  • the master display system can transmit the synchronization signal over a coaxial cable or other cable with a signal line and a ground.
  • the first slave display system receives the synchronization signal and retransmits the synchronization signal to a second slave display system.
  • the first slave display system can receive the synchronization signal at an input synchronization connector and transmit the synchronization signal at an output synchronization connector.
  • the master display system displays a video frame from the extracted composition.
  • the first slave display system and the second slave display system display video frames synchronized with the video frame displayed by the master display system wherein the displayed video frames are synchronized based at least in part on the synchronization signal generated by the master display system.
  • Each of the first and second slave display systems can process the received synchronization signal to extract synchronization information.
  • each of the first and second slave display systems can control playback of its video (e.g., control timing of when to display a video frame) based at least in part on the extracted synchronization information.
  • FIG. 6 illustrates a flow chart of an example method 600 of synchronizing a slave video with a master video based at least in part on a synchronization signal from a master display system.
  • the method can be performed by a slave display system in an immersive media system, such as the slave display systems described herein with reference to FIGS. 1-5.
  • the slave display system can include hardware and software configured to perform the steps in the method 600, and each step in the method can be performed by one or more components and/or one or more modules of the slave display system. Similarly, one or more steps in the method 600 can be performed by any combination of hardware and software of the slave display system.
  • the method 600 can allow a slave display system to synchronize a slave video with a master video.
  • the slave display system can comprise a modified single display system.
  • a display system can be retrofit with a module, such as the module 410 described herein with reference to FIG. 4, that is configured to receive a synchronization signal and to synchronize its video based at least in part on the received synchronization signal.
  • the synchronization signal can be generated by a master display system that has not been specifically designed to be part of an immersive media system.
  • a display system configured for use in a single-display system can generate a synchronization signal based on standards such as LTC or AES3.
  • the slave display system can receive the synchronization signal and synchronize its video based on that generated synchronization signal.
  • an immersive media system can be created using existing hardware and retrofitting one or more display systems (e.g., with the module 410) to act as slave display systems.
  • the slave display system receives a synchronization signal.
  • the synchronization signal can be generated by a master display system or another system configured to generate the synchronization signal.
  • the synchronization signal can be based on standard synchronization signals (e.g., LTC, AES3, etc.) or it can conform to a format that the slave display system can process and from which it can extract synchronization information.
  • the synchronization signal can be received wirelessly and/or over a cable that has a signal line and a ground line, such as a coaxial cable. It is to be understood that other cabling options are within the scope of this disclosure including, for example and without limitation, serial cables, twisted pair cables, USB cables, and the like.
  • the slave display system transmits the received synchronization signal to another slave display system over wireless transmission or over another cable (e.g., a cable different from the cable used to receive the synchronization signal).
  • the slave display system can include active electronics configured to receive the synchronization signal and pass that signal to the next slave display system in the chain.
  • the slave display system includes amplifiers, filters, and/or other electronics configured to reduce degradation of the synchronization signal as it is passed from one slave display system to the next.
  • the slave display system extracts synchronization information from the received synchronization signal. This can occur in parallel with the transmission of the synchronization signal in block 610. This can be done to reduce or minimize latency in the immersive media system.
  • the synchronization information can include information sufficient for the slave display system to provide a video frame synchronized with a video provided by another display system (e.g., a master display system and/or other slave display systems).
  • the synchronization information can include, for example and without limitation, frame numbers, timestamps, timecodes, metadata, command(s) for the slave display system, and the like, as described in greater detail herein.
  • the slave display system provides a video frame synchronized with a video provided by another display system.
  • the slave display system can synchronize the video frame at the framebuffer.
  • the slave display system can synchronize the video frame at a point in the processing chain prior to the framebuffer, such as at the video decoding stage.
  • the synchronized video frame can be displayed on a screen along with video from other display systems to provide an immersive viewing experience for a viewer.
  • a computing system that has components including a central processing unit (CPU), input/output (I/O) components, storage, and memory can be used to execute the display system, or specific components of the display system.
  • the executable code modules of the display system can be stored in the memory of the computing system and/or on other types of non-transitory computer-readable storage media.
  • the display system can be configured differently than described above.
  • Each of the processes, methods, and algorithms described in the preceding sections can be embodied in, and fully or partially automated by, code modules executed by one or more computers, computer processors, or machines configured to execute computer instructions.
  • the code modules can be stored on any type of non-transitory computer-readable medium or tangible computer storage device, such as hard drives, solid state memory, optical disc, and/or the like.
  • the systems and modules can also be transmitted as generated data signals (e.g., as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and can take a variety of forms (e.g., as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames).
  • the processes and algorithms can be implemented partially or wholly in application-specific circuitry.
  • the results of the disclosed processes and process steps can be stored, persistently or otherwise, in any type of non-transitory computer storage such as, e.g., volatile or non-volatile storage.
  • the term "or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list.
  • Conjunctive language such as the phrase "at least one of X, Y and Z," unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. can be either X, Y or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y and at least one of Z to each be present.
  • the terms "about” or “approximate” and the like are synonymous and are used to indicate that the value modified by the term has an understood range associated with it, where the range can be ⁇ 20%, ⁇ 15%, ⁇ 10%, ⁇ 5%, or ⁇ 1%.
  • the term “substantially” is used to indicate that a result (e.g., measurement value) is close to a targeted value, where close can mean, for example, the result is within 80% of the value, within 90% of the value, within 95% of the value, or within 99% of the value.

Abstract

An immersive display system is disclosed. The immersive display system comprises a master projection system and a plurality of slave projection systems. The master projection system synchronizes the display of a video with the plurality of slave projection systems using a synchronization signal. The synchronization signal is transmitted sequentially from the master projection system to each of the slave projection systems, the projection systems being serially connected to one another. Sub-frame video synchronization is performed by means of the sequentially chained projection systems.
PCT/US2017/024003 2016-03-29 2017-03-24 Synchronized media content on a plurality of display systems in an immersive media system WO2017172514A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662314851P 2016-03-29 2016-03-29
US62/314,851 2016-03-29

Publications (1)

Publication Number Publication Date
WO2017172514A1 true WO2017172514A1 (fr) 2017-10-05

Family

ID=59966380

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/024003 WO2017172514A1 (fr) Synchronized media content on a plurality of display systems in an immersive media system

Country Status (1)

Country Link
WO (1) WO2017172514A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6122000A (en) * 1997-06-03 2000-09-19 Hewlett Packard Company Synchronization of left/right channel display and vertical refresh in multi-display stereoscopic computer graphics systems
US20030179782A1 (en) * 1998-09-23 2003-09-25 Eastty Peter Charles Multiplexing digital signals
US20090036159A1 (en) * 2000-01-31 2009-02-05 3E Technologies International, Inc. Broadband communications access device
US20120159026A1 (en) * 2009-07-22 2012-06-21 Teruo Kataoka Synchronous control system including a master device and a slave device, and synchronous control method for controlling the same
US20150348558A1 (en) * 2010-12-03 2015-12-03 Dolby Laboratories Licensing Corporation Audio Bitstreams with Supplementary Data and Encoding and Decoding of Such Bitstreams
US20140152784A1 (en) * 2012-12-05 2014-06-05 Sony Network Entertainment International Llc Method and apparatus for synchronizing of 3-d display devices
US20160080710A1 (en) * 2014-09-17 2016-03-17 Pointcloud Media, LLC Tri-surface image projection system and method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110896500A (zh) * 2019-11-29 2020-03-20 中国电影科学技术研究所 Control apparatus and system for digital film playback
WO2024046269A1 (fr) * 2022-09-01 2024-03-07 维沃移动通信有限公司 Display module, display system, display drive circuit, display method, and electronic device
CN116506480A (zh) * 2023-05-11 2023-07-28 广州市埃威姆电子科技有限公司 Immersive performance logic-linkage show control system
CN116506480B (zh) * 2023-05-11 2023-12-26 广州市埃威姆电子科技有限公司 Immersive performance logic-linkage show control system

Similar Documents

Publication Publication Date Title
US20160119507A1 (en) Synchronized media servers and projectors
US9628868B2 (en) Transmission of digital audio signals using an internet protocol
EP3213505A1 Synchronized projectors and media servers
US8988506B2 (en) Transcoder supporting selective delivery of 2D, stereoscopic 3D, and multi-view 3D content from source video
TWI595777B (zh) 透過hdmi發送顯示管理元數據
US11544029B2 (en) System and method for synchronized streaming of a video-wall
US9967599B2 (en) Transmitting display management metadata over HDMI
US11122253B2 (en) Dynamic distribution of multi-dimensional multimedia content
KR20100106567A (ko) 모바일 고화질 멀티미디어 인터페이스를 생성하여 용이하게 하기 위한 방법과 장치와 시스템
US9749682B2 (en) Tunneling HDMI data over wireless connections
US10547885B2 (en) Adaptively selecting content resolution
WO2017172514A1 (fr) Synchronized media content on a plurality of display systems in an immersive media system
US10375349B2 (en) Branch device bandwidth management for video streams
WO2017076913A1 Streaming of upgraded images to legacy and upgraded display devices
WO2017101356A1 Video signal processing device
Reinhard et al. High dynamic range video production, delivery and rendering
US11856242B1 (en) Synchronization of content during live video stream
Reinhard et al. High dynamic range video chains
WO2017101338A1 Video signal processing method and device

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17776348

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17776348

Country of ref document: EP

Kind code of ref document: A1