WO2010002361A1 - Method and apparatus for dynamic displays for digital cinema - Google Patents
- Publication number
- WO2010002361A1 (PCT/US2008/008166)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- dynamic information
- server
- subtitle
- display system
- digital display
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/32—Details specially adapted for motion-picture projection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/482—End-user interface for program selection
- H04N21/4825—End-user interface for program selection using a list of items to be played back in a given order, e.g. playlists
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/32—Details specially adapted for motion-picture projection
- G03B21/43—Driving mechanisms
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/414—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
- H04N21/41415—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance involving a public display, viewable by several users in a public space outside their home, e.g. movie theatre, information kiosk
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
Definitions
- the present principles relate generally to digital cinema and, more particularly, to a method and apparatus for dynamic displays for digital cinema.
- a method for use in a digital display system includes generating dynamic information for display by processing a subtitle file including at least one element related to the dynamic information.
- an apparatus for use in a digital display system includes a display manager for generating dynamic information for display by processing a subtitle file including at least one element related to the dynamic information.
- FIG. 1 is a block diagram of a digital cinema presentation system in a theatrical auditorium providing a dynamically generated display, in accordance with an embodiment of the present principles
- FIG. 2 is a schematic diagram of a show playlist, the composition playlists the show playlist references, the track and asset file components which are referenced by the dynamic content composition, and a timeline indicating important events relative to the generation and display of dynamic content, in accordance with an embodiment of the present principles;
- FIG. 3 is a diagram showing an exemplary composition, in accordance with an embodiment of the present principles
- FIG. 4A shows an exemplary subtitle track file making use of timed text, in accordance with an embodiment of the present principles
- FIG. 4B shows a portion of an alternative exemplary subtitle track file making use of timed text, in accordance with an embodiment of the present principles
- FIG. 4C shows a portion of an alternative exemplary subtitle track file making use of dynamic sub-pictures, in accordance with an embodiment of the present principles.
- FIG. 5 is a flow diagram of an exemplary method for displaying dynamic information in a digital display system, in accordance with an embodiment of the present principles.
- the present principles are directed to a method and apparatus for dynamic displays for digital cinema.
- the present description illustrates the present principles. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the present principles and are included within its spirit and scope. All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the present principles and the concepts contributed by the inventor(s) to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the present principles, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
- processor or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (“DSP”) hardware, read-only memory (“ROM”) for storing software, random access memory (“RAM”), and non-volatile storage.
- DSP digital signal processor
- ROM read-only memory
- RAM random access memory
- any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
- any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function.
- the present principles as defined by such claims reside in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein.
- such phrasing is intended to encompass the selection of the first listed option (A) only, or the selection of the second listed option (B) only, or the selection of the third listed option (C) only, or the selection of the first and the second listed options (A and B) only, or the selection of the first and third listed options (A and C) only, or the selection of the second and third listed options (B and C) only, or the selection of all three options (A and B and C).
- This may be extended, as readily apparent by one of ordinary skill in this and related arts, for as many items listed.
- “one or more elements” and “at least one element” are used interchangeably.
- the present principles are directed to a method and apparatus for dynamic preshow displays for digital cinema.
- one or more embodiments allow for dynamically providing information to a cinema audience, particularly in relationship to a current show (e.g., one currently playing or about to be played) in the cinema.
- information may include, but is not limited to, specifying how many minutes remain until show time, real-time news, weather, sports score updates, and so forth.
- a composition playlist is inserted into a show playlist (SPL) such as those supported by widely available digital cinema servers.
- SPL show playlist
- the composition playlist uses a mechanism normally reserved for retrieving a subtitle element to describe dynamically defined timed text, and/or to retrieve still images dynamically created in accordance with the present principles, among others.
- An exemplary use envisioned is for the creation of a "time remaining until the feature" message, and a dynamically created countdown. Additional exemplary uses include the creation of real-time news updates such as current sports scores, and theater status updates such as "the concession lines are short" or "the show is delayed", among others.
- personalized or customized messages can be displayed, such as "Happy Birthday, Kerry!" or "Please Marry Me, Rond…". Such messages can help promote a cinema outing for special occasions.
- the digital cinema system 100 includes digital cinema server 110, a storage device 112 and digital cinema projector 120. Server 110 and projector 120 are interconnected, preferably as described below.
- An example of server 110 is the DCP-2000 by DOREMI LABS, INC. of Burbank, CA.
- An example of projector 120 includes many of those based on the Texas Instruments DLP Cinema® light engine, such as the STARUS NC2500S by NEC Corporation of America, of Santa Clara, CA.
- Digital cinema system 100 also preferably includes audio equipment (not shown) suitable for reproducing a sound track in the auditorium of screen 130.
- Server 110 has a connection 116 to network 118.
- Projector 120 has a connection 122 to network 118.
- Server 110 and projector 120 are in communication through connections 116 and 122.
- Network 118 may be implemented as, for example, a 100bT Ethernet network.
- Server 110 also has a connection 114 to projector 120 through which, during playout, an image stream is provided to projector 120.
- Connection 114 may be implemented as, for example, a dual high definition serial digital interface (dual HD- SDI).
- Projector 120 is arranged to make a projection on screen 130.
- the projection on screen 130 may include, for example, but is not limited to, an image 132 such as a helpful image of delicious popcorn, a text message including words 134, 136, and 138, and a display 140 of time remaining.
- the source of presentation elements 132, 134, 136, 138, and 140 is described in further detail below, in conjunction with FIGs. 2, 3, 4A-C.
- At least the display of time remaining 140 is dynamic content, preferably provided by server 110 to projector 120, for the projection on screen 130.
- the storage device 112 includes digital cinema content, and may be connected directly to server 110 (as shown) or may be accessed through network 118 (network access not shown). This content is further described below.
- a theater management station (TMS) 150 having a connection 152 to network 118 can be used by theater personnel for controlling and monitoring server 110 and projector 120.
- a projectionist can directly control and operate server 110 and projector 120 through dedicated user interfaces (not shown).
- a display manager 177 manages the displaying of dynamic text and/or other information (e.g., graphics, and so forth), also collectively referred to herein as "dynamic information", in accordance with the present principles.
- display manager 177 is implemented in digital server 110.
- display manager 177 could be implemented as a stand-alone apparatus having communication with digital server 110, for example through network 118.
- display manager 177 could reside on theater management system 150.
- display manager 177 may be a distributed entity, for example, with components on both digital server 110 and theater management system 150.
- the display manager 177, in conjunction with the server 110 and/or other components in system 100, provides at least one subtitle file that includes at least one element relating to the dynamic information, and during or shortly before playout of a show playlist (to be discussed in conjunction with timeline 270 in FIG. 2), dynamically resolves the at least one element and reproduces (e.g., visually and/or audibly) the dynamic information (for example, on screen 130 and/or another screen(s)).
- display manager 177 may simply reproduce audio information (e.g., "the movie will begin in 10 minutes").
- the display manager 177, at the least, manages the reproduction (display) of image/video data in various embodiments in accordance with the present principles.
- the display manager 177 may also be configured for receiving at least one user input relating to the dynamic information, which contains customized or personalized information or messages to be reproduced on screen 130 and/or one or more other screens and/or one or more speakers (not shown, and may or may not be integrated with the screens).
- it may be advantageous to configure the display manager 177 as a distributed entity that includes a user interface at the theater management station 150 for receiving customized information.
- the storage device 112 includes show data 200.
- the show data 200 includes files in accordance with the corresponding file formats described by the Society of Motion Picture and Television Engineers (SMPTE) of White Plains, NY, in their published and emerging standards, including D-Cinema Packaging - Sound & Picture Track File (SMPTE 429-3).
- in FIG. 2, composition playlists 220, 230, 240, 250, and 260 are meant to correlate in time with the various events indicated along the timeline.
- the show data 200 is structured as follows: the show playlist 210 lists each composition that can play.
- the show playlist there may be one or more levels of indirection through files that collect compositions into convenient groups (not shown), but the indirection is resolvable to a sequence of compositions such as represented by composition playlists (CPLs) 220, 230, 240, 250, and 260.
- CPLs composition playlists
- CPL 220 is an advertisement for soda
- CPL 230 is a dynamic ad for popcorn
- CPL 240 is a trailer for an upcoming movie release
- CPL 250 is a transitional element which, in a showmanly manner, for example, concludes the pre-show and prepares the audience for the feature presentation
- CPL 260 is the main attraction (i.e., the feature itself). While an actual preshow would generally have more ads, more trailers, and additional transitional elements (e.g., a "coming soon" element before the trailers), the CPLs shown are representative.
- each of the CPLs will generally reference picture and audio track files, and some of the CPLs may reference a subtitle track file; although the composition described by each CPL includes its associated track files, these associations are shown only for CPL 230 for the sake of brevity.
- the CPL 230 enumerates three track files, picture track file 232, audio track file 234, and subtitle track file 236.
- in an alternative embodiment, CPL 230 references subtitle track file 236' instead of subtitle track file 236.
- the picture track file 232 provides the data necessary to display plausible images of delicious popcorn 132.
- Picture track file 232 is retrieved by server 110 from storage 112, decoded into an image signal which is transmitted via connection 114 to projector 120.
- Projector 120 formats the image signal as needed, and the corresponding images, after additional processing for the subtitles, are displayed.
- the audio track file 234 provides the data necessary to provide an audio track in synchronization with the display of picture track file 232.
- the subtitle track file 236 or 236' provides a subtitle sequence that is overlaid with the images provided from picture track file 232. Words 134, 136, and 138 appear over time, but are preferably statically encoded in subtitle track file 236 or 236'. Time remaining 140 is dynamically provided during the processing of subtitle track file 236, or during the processing of dynamic subpicture 238 referenced by subtitle track file 236'.
- timed text describes subtitles that are provided as text and fonts with the text being rendered using the fonts into a subtitle image that will be overlaid in synchronization with the images provided by the picture track file 232.
- Subpicture describes subtitles that are provided as still image files, typically in the portable network graphics (PNG) file format, such as a still image of the time remaining 140.
- PNG portable network graphics
- An advantage of the subpicture approach is that a graphical image of a clock showing the time remaining (not shown) could be used in lieu of a numeric display shown as time remaining 140.
- the timeline 270 includes a series of events arranged with time advancing in the direction of the arrow.
- at ingest time 272, a time preferably well in advance of playout, the various elements of the show are copied to the storage device 112, preferably under the control of server 110 or a similarly trusted entity.
- at ingest, a CPL is provided to server 110 along with each of the track files which the CPL references.
- the CPL preferably includes a hash value corresponding to each of the track files that is derived from the corresponding track file.
- by verifying the hash values, server 110 is assured that the files are complete and have been neither miscopied nor subjected to tampering.
- the presence of the hash value within the CPL is optional.
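When hash values are present, the ingest-time integrity check can be sketched as below. The digest and encoding chosen here (SHA-1, base64) are an assumption drawn from common D-Cinema packaging practice; the applicable SMPTE packaging standard governs the actual form.

```python
import base64
import hashlib


def asset_hash(path: str) -> str:
    """Compute the base64-encoded SHA-1 digest of a track file --
    assumed here to be the form of hash value carried in a CPL's
    per-asset Hash element."""
    digest = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return base64.b64encode(digest.digest()).decode("ascii")


def verify_ingest(path: str, expected_hash: str) -> bool:
    """Return True when the copied track file matches the hash in the
    CPL, i.e. the file is complete, not miscopied, and untampered."""
    return asset_hash(path) == expected_hash
```

Streaming the file in chunks keeps memory use bounded even for multi-gigabyte picture track files.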
- Prep-suite time 280 occurs shortly before playout begins, e.g., within about 30 minutes.
- Showtime 286 is preferably the published show time, which is generally a few minutes before the start of the feature, but in no case should it be any later than the start of the feature.
- Feature start time 288 is the time when the feature actually commences.
- shortly before playout begins, at prep-suite time 280, the server 110 prepares for imminent playout of the performance described by show playlist 210.
- Communication between projector 120 and server 110 across network 118 is established and confirmed.
- the presence of current, valid decryption keys (not shown) is confirmed for any CPL identified by show playlist 210 that requires decryption keys.
- the server 110 may buffer a quantity of picture and audio data to allow for a rapid onset of playout when triggered.
- projector 120 may be provided with a listing of subtitle track files, such as subtitle track file 236, that will appear in show playlist 210.
- the subtitle track file 236 is not transferred to projector 120 at this time.
- as playout proceeds, CPLs such as soda ad 220 are presented on screen 130 in the auditorium.
- notice prefetch time 282 occurs a few moments before notice time 284, the time when CPL 230 begins its playout; at prefetch time, projector 120 requests subtitle track file 236 (or, in an alternative embodiment, track file 236') from server 110 through network 118.
- subtitle track file 236 is processed by server 110 and any dynamic text callout (described in more detail below), for example as may be implemented using "server-side includes" (the processing being performed by the display manager 177, which, in a preferred embodiment, comprises an HTTP server on digital cinema server 110 provisioned to interpret the server-side includes), is replaced by appropriate text values.
- This substitution may be made with a predetermined expectation of the interval between notice prefetch time 282 and notice time 284, the current time, showtime 286, and feature start time 288, or similar information as parameters. These times may be expressed in absolute clock time, or may be expressed as a time relative to the beginning of the show playlist, or other convenient reference. These, or other parameters, may be combined by a predetermined algorithm to return the substitution text.
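The substitution pass described above can be sketched as a small text transformation over the subtitle track file. The directive syntax and the handler registry here are assumptions for illustration; the real format is whichever server-side-include dialect the server's HTTP service supports.

```python
import re

# An Apache-style server-side include embedded as an XML comment, e.g.
# <!--#exec cmd="TimeToNextShow 00:00:02" -->. The exact directive
# syntax is an assumption for illustration.
SSI = re.compile(r'<!--#exec\s+cmd="(\w+)\s*([^"]*)"\s*-->')


def resolve_includes(subtitle_xml: str, handlers: dict) -> str:
    """Replace each server-side include with the text returned by the
    matching handler, as display manager 177 would when the subtitle
    track file is processed at notice prefetch time."""
    def substitute(match):
        name, argument = match.group(1), match.group(2)
        return handlers[name](argument)
    return SSI.sub(substitute, subtitle_xml)
```

For example, with `handlers = {"TimeToNextShow": lambda t: "5:12"}`, the comment above is replaced by the text `5:12` before the file is supplied to the projector; the rest of the XML passes through untouched.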
- several exemplary algorithms are described below in conjunction with FIGs. 4A and 4B.
- projector 120 will request the subpicture 238 from server 110 shortly before the point at which it is needed for display according to subtitle track file 236'.
- the PNG image file returned in response to the request by projector 120 will be created or selected dynamically by server 110 according to a predetermined algorithm known to server 110 and associated with the identification in subtitle track file 236' by which projector 120 refers to subpicture file 238.
- This dynamic subpicture file generating or selecting algorithm may accept any of the parameters mentioned above and behaves similarly to the exemplary algorithms described in conjunction with FIGs. 4A and 4B, though executed in response to a request induced, for example, in conjunction with FIG. 4C.
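One minimal way for server 110 to satisfy such a request is to select among pre-rendered PNG frames, one per second of remaining time. The one-file-per-second naming scheme below is purely a hypothetical illustration of the selection step; dynamic generation of the PNG is an alternative.

```python
import os


def select_subpicture(png_dir: str, seconds_remaining: int) -> str:
    """Map the time remaining to a pre-rendered countdown PNG.
    The remaining_<M>_<SS>.png naming scheme is hypothetical; the
    projector only ever sees the UUID from subtitle track file 236',
    which server 110 resolves to whichever file this returns."""
    minutes, seconds = divmod(max(seconds_remaining, 0), 60)
    return os.path.join(png_dir, f"remaining_{minutes}_{seconds:02d}.png")
```

Selecting among pre-rendered frames keeps the per-request work trivial; a graphical clock face could be substituted simply by swapping the rendered frame set.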
- FIG. 3 shows CPL data 300, including the preferred implementation of CPL 230.
- a universally unique identifier (UUID) 302 occurs in CPL data 300 to define the UUID by which CPL 230 will be referenced. This is the reference by which SPL 210 can unambiguously identify CPL 230.
- CPL 230 can be identified in SPL 210 by a separate UUID that may be called out in CPL data 300 as the ContentVersion identifier (not shown).
- picture element 332, audio element 334, and subtitle element 336 are included in CPL data 300.
- in extensible markup language (XML) data, an element that begins, as does 332, with an angle-bracketed tag such as "MainPicture" ends with an angle-bracketed closing tag "/MainPicture".
- XML extensible markup language
- These elements identify the corresponding picture track file 232, audio track file 234, and subtitle track file 236 using the UUIDs 333, 335, and 337, respectively.
- Other well-known metadata is provided.
- the association between these UUIDs and the corresponding files is provided to server 110 at ingest time 272 and is subsequently tracked by server 110.
- FIG. 4A shows the subtitle track file data 400 of the present principles.
- This subtitle track file is identified by the UUID 402 which matches the UUID 337 in the subtitle element 336 of CPL data 300, and so subtitle track file data 400 is the subtitle track data referenced by CPL data 300 in subtitle element 336.
- Subtitle elements 434, 436, and 438 operate as follows: each identifies a time interval relative to the playout of CPL 230 during which the corresponding text should appear. Also, a location of the text on screen 130 is described. Subtitle 434 defines that the text 134' should appear one-half of a second into the playout of CPL 230 at the location of word 134. One-half second later, subtitle 436 shows text 136' at the location of word 136. One-half second after that, subtitle 436 ends and is replaced by subtitle 438 which shows text 136" and 138' at the locations of words 136 and 138, respectively. The effect to the audience is that the words "There's still time:" appear at half-second intervals.
- Subtitles 440, 442, and 444 each provide text to be displayed as time remaining 140. The text is updated each second, since subtitles 440, 442, and 444 begin at consecutive seconds beginning two seconds into the playout of CPL 230, and each lasts for one second.
- the astute reader will note that the text present in subtitle track file data 400 at location 140' and at locations 140" does not resemble the display of time remaining at location 140.
- the text present is an XML comment of the form associated with a server-side include, in this case, of a format known to be supported by the Apache hypertext transfer protocol server, provided by THE APACHE SOFTWARE FOUNDATION of Forest Hill, MD.
- Server-side includes of this format, or similar formats, are provided by most hypertext transfer protocol (HTTP) servers.
- the server-side include should be in a format supported by the HTTP service provided by digital cinema server 110.
- the server-side include is an instruction to the HTTP service running on server 110 to execute a specific script in the Perl language named "TimeToNextShow" and to provide to that script the parameter of "00:00:02".
- the result returned by that script would be a few characters of text such as "5:12" that represents an amount of time remaining before the feature starts. This is the text displayed as time remaining 140.
- the server-side includes called out as 140" both specify additional executions of the "TimeToNextShow” script, but with different parameters due to their deeper offset into the playout of CPL 230.
- the parameter value is the value of the TimeIn parameter for the corresponding subtitle 440, 442, or 444.
- the reason is that the projector is going to request subtitle track file 236 at notice prefetch time 282. If the script TimeToNextShow did not accept a parameter, then at the time that subtitle track file data 400 was processed by server 110 and supplied to projector 120, each of the three occurrences of the TimeToNextShow script would evaluate to the same text value.
- the script can perform the following steps: (a) note the UUID 402 of CPL 230; (b) find a match for UUID 402 in SPL 210; (c) determine the interval between the start of CPL 230 in SPL 210 and the start of the feature CPL 260; (d) subtract from the interval the value of the parameter; and (e) return the resulting interval, i.e., the time remaining before the feature at the time the specific subtitle is to be displayed, as a text string, preferably in a M:SS (minutes and seconds) format.
- the resulting text provided by the script is substituted for the entirety of the XML comment providing the server-side include.
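Steps (a) through (e) can be sketched as follows. The playlist is reduced to an ordered list of (UUID, duration) pairs, a simplification assumed here purely to illustrate the arithmetic; the real SPL and CPL structures are the XML documents described above.

```python
def time_to_next_show(spl, cpl_uuid: str, feature_uuid: str,
                      time_in_seconds: float) -> str:
    """Sketch of the TimeToNextShow script. spl is an ordered list of
    (uuid, duration_seconds) pairs standing in for SPL 210."""
    # (a)/(b): walk the show playlist, recording the start offset of
    # each composition, until both UUIDs have been located
    start = 0.0
    offsets = {}
    for uuid, duration in spl:
        offsets[uuid] = start
        start += duration
    # (c): interval between the start of this CPL and the feature start
    interval = offsets[feature_uuid] - offsets[cpl_uuid]
    # (d): subtract the subtitle's TimeIn parameter
    remaining = interval - time_in_seconds
    # (e): return the time remaining as a text string in M:SS format
    minutes, seconds = divmod(int(remaining), 60)
    return f"{minutes}:{seconds:02d}"
```

With a playlist mirroring FIG. 2 (soda ad, popcorn ad, trailer, transition, feature), calling this with the popcorn CPL's UUID and the TimeIn value "00:00:02" yields the time remaining at the instant that particular subtitle is displayed.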
- subtitle 450 (which would replace subtitle 434, 436, and 438) displays the explicit text "Show starts at” and the server-side include appends a few characters of text representing the value of the environment variable "SHOWTIME" on server 110. Such an environment variable would be available from the schedule (not shown) on which server 110 is operating. A second and a half later, a second subtitle 452 is added to the display (which would replace subtitles 440, 442, and 444) which provides the explicit text "Time now is” and the server-side include which executes a Linux command to print the current time in H:MM:SS format.
- timed text subtitle 440 is replaced by subpicture subtitle 440' in which no server-side include is necessary. Instead, subtitle 440' provides UUID 460 which tells projector 120 that it will need to request a subpicture PNG file.
- the subtitle track file data 400 with subtitle 440' replacing subtitle 440 is an embodiment of subtitle track file 236', and UUID 460 will correspond to dynamic subpicture 238.
- when projector 120 requests the file for UUID 460, server 110 dynamically creates or selects the corresponding subpicture, as described above.
- other messages can be included by server 110, such as the status of concession lines ("Concession lines are short! No waiting!") or real-time sports scores ("UCLA 21, USC 0") scraped from external web sites or subscription news feeds.
- a user interface can be provided at TMS 150.
- Personalized messages, such as those previously mentioned, can be entered and assigned to variables in the system.
- the CPL (not shown) for a pre-built, appropriately themed composition would be manually included in the show playlist. For instance, a theatergoer wishes to celebrate her husband's birthday. The theatergoer selects a specific "Happy Birthday" composition from a selection of special event compositions. The theater manager incorporates the CPL corresponding to the theatergoer's selection into the show playlist, and the composition assets are ingested (if not already ingested).
- the theater manager solicits appropriate personalized text for inclusion in the celebratory composition, and the text is entered as the value of a variable WHOSE_BIRTHDAY to which the celebratory composition will refer, much as the value of the variable SHOWTIME was referenced in subtitle 450.
- such a personalized composition can be implemented by a CPL that requests a subpicture which server 110 computes on the basis of WHOSE_BIRTHDAY.
- in another embodiment, server 110 does not have server-side includes enabled; instead, the XML comment embodying the include is passed to projector 120 to be interpreted by projector 120 before being used for display.
- such interpretation by projector 120 may be implemented as client-side includes or another script function.
- Such functions are commonly implemented by the recipient of an XML file.
- while comment-embedded functions may be used, as with server-side includes, a more common form is to use script tags to embed an operation in the subtitle track file data.
- a suitable and common language for such embedding is JavaScript. Such an embodiment would require the projector 120 to interpret JavaScript.
- because a digital cinema server 110 has a more predictable computational burden and more excess processing power than the digital cinema projector 120, especially when handling subtitle overlays, it is preferable to allocate the burden for dynamic manipulations to the server.
- subtitle rendering and overlay on the picture essence is performed by the server, and the projector has no participation in the subtitle rendering or overlay process.
- the projector merely displays the server- overlaid images provided via connection 114.
- the present principles also apply to the processing of subtitle track files or the dynamic creation of subpicture files in such an embodiment.
- server 110, the TMS 150, or another entity could provide an updated version of subtitle track file 236, wherein an external program generates a complete subtitle track file 236 that has counting down text such as "5:12", “5:11", and "5:10" in lieu of the server-side includes in subtitles 440, 442, and 444, respectively.
- Such an updated version of the subtitle track file is generated by a process that maintains the necessary elements of the subtitle track file, including the spatial and time-critical aspects of the file.
- Such a dynamically generated subtitle track file might be referred to within CPL data 300 and the UUID 337 would be recognized by the server 110 as a call to generate and return a fresh version of the subtitle track file data 400 by a mechanism other than server-side includes.
- Such an implementation is readily understood by one of ordinary skill in this and related arts, and is not substantially different in the computational burden placed on server 110 than the preferred embodiment described herein.
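Such an external generator might compute the counting-down entries like this. The (TimeIn, TimeOut, text) tuple representation stands in for the real XML subtitle elements and is an assumption for illustration only.

```python
def countdown_subtitles(start_seconds: int, count: int,
                        first_time_in: float = 2.0) -> list:
    """Build the one-second subtitle entries an external program would
    write into a regenerated subtitle track file: counting-down text
    such as "5:12", "5:11", "5:10" in place of server-side includes.
    Times are seconds into the playout of CPL 230."""
    entries = []
    for i in range(count):
        minutes, seconds = divmod(start_seconds - i, 60)
        time_in = first_time_in + i
        entries.append((time_in, time_in + 1.0, f"{minutes}:{seconds:02d}"))
    return entries
```

Each generated entry preserves the one-second cadence and two-second initial offset of subtitles 440, 442, and 444; serializing the tuples into the subtitle track file's XML schema is the remaining step.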
- a corresponding PNG subpicture can be created from a digital image, for example from a network accessible camera (i.e., a "web-cam", not shown) overlooking the concession stand queues.
- Server 110 obtains the digital image and processes the digital image as needed.
- another device such as the TMS can periodically access the web-cam (not shown) and format an image taken with the camera into the appropriate scale and aspect ratio(s) called for by the various auditoriums. These images could be updated occasionally (e.g., once per minute) and left where they can be accessed by server 110, or periodically pushed by the TMS 150 to storage 112.
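The scale and aspect-ratio formatting step mentioned above reduces to fitting the camera frame into each auditorium's target raster. A minimal sketch, assuming example dimensions (a 640x480 web-cam frame fitted into a 2048x858 scope-format raster; the numbers are illustrative, not from the description):

```python
# Sketch of the aspect-preserving fit the TMS would apply to a web-cam
# frame before leaving it for server 110; dimensions are example values.

def fit_to_screen(src_w: int, src_h: int, dst_w: int, dst_h: int):
    """Return (w, h) of the largest scaled copy of the source frame that
    fits inside the destination while preserving the source aspect ratio."""
    scale = min(dst_w / src_w, dst_h / src_h)
    return round(src_w * scale), round(src_h * scale)
```

Each auditorium's raster would call for its own `fit_to_screen` target, which is why the description speaks of "aspect ratio(s)" in the plural.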
- a similar implementation could be used to provide an image of individuals appropriate to messages such as "Sylvain: Tonight's Manager On Duty" or "This auditorium cleaned with pride by: Mark" to attribute the operation and maintenance of the theater to those responsible.
- the dynamically generated subtitles need not be presented in the image displayed on screen 130 by projector 120.
- dynamically generated displays may be presented on an additional display device (not shown), such as an LED multi-line character display located near the screen such as the SUPERTITLE supplied by DIGITAL TECH SERVICES of Portsmouth, VA, or written on or near the screen by a laser projection system such as the CINEMA SUBTITLING SYSTEM supplied by DIGITAL THEATER SYSTEMS of Agoura Hills, CA.
- communication from server 110 to the additional display device (not shown) would be via network 118 or other direct or wireless connection.
- Server 110 would react to commands in subtitle track file 236, or other analogous files providing a list of events and the times at which they should occur, by providing the appropriate text to the additional display device at substantially the time when the display of the text is appropriate.
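The timed-event dispatch described above can be sketched as follows. The `(time, text)` event tuples and the polling loop are illustrative assumptions; in practice the events would be parsed from subtitle track file 236 or an analogous timed-event file, and the yielded text sent to the additional display device over network 118.

```python
# Sketch of server 110 releasing text to an auxiliary display at the
# times listed in a subtitle-like event file; tuples are illustrative.

def due_events(events, now_seconds, sent):
    """Yield the text of events whose display time has arrived, marking
    each as sent so a later poll does not re-emit it.

    events: list of (time_seconds, text) pairs.
    sent:   set tracking already-dispatched events across polls."""
    for t, text in events:
        if t <= now_seconds and (t, text) not in sent:
            sent.add((t, text))
            yield text
```

A dispatch loop would poll this at the playback clock rate and forward each yielded string to the LED or laser-projection display.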
- the projector 120 may provide text to the additional display device as a response to a client-side include or script.
- the method 500 includes a start block 505 that passes control to a function block 510.
- the function block 510 provides at least one subtitle file that includes at least one element relating to the dynamic information, and passes control to a function block 520.
- the function block 520 dynamically resolves the element (i.e., in some embodiments, provides a solution by acting on and replacing an instruction or comment in the subtitle file) relating to the dynamic information on the basis of dynamic values such as the current time or customized information.
- the specific operation or action required for dynamically resolving a given element depends on the implementation details.
- Such operations may include, among others, expanding a server-side include and generating or selecting an appropriate subpicture file as described above.
- control is passed to a function block 525, which reproduces (e.g., visually and/or audibly) the resolved form of the subtitle file on a reproduction device (e.g., one or more screens and/or one or more speakers, and so forth).
- the reproduction of the dynamically generated information may be done as a part of a presentation of content by a digital display system, e.g., prior to the showing of a movie in a cinema.
- control is passed to an end block 599.
- the dynamic information to be reproduced contains customized or personalized information, which may be provided by a user.
- an optional function block 515 receives at least one user input relating to the customized information, and the customized information is used in function block 520 for dynamically resolving the at least one element associated with the dynamic information.
- the dynamically resolved information is then reproduced (see function block 525).
- the display manager 177 may be used for receiving the user input, and customized information may be reproduced by the digital display system on screen 130 and/or one or more other screens and/or speakers (not shown).
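Function blocks 510 through 525 with the optional user input of block 515 can be sketched end to end. The `${...}` placeholder syntax is an assumption for illustration, not the element format of the actual subtitle file:

```python
import string

# Sketch of blocks 510-525: a subtitle file with a placeholder element
# (510), optional user input (515), dynamic resolution (520); block 525
# would hand the resolved text to the reproduction device.

def resolve_subtitle(template: str, custom: dict) -> str:
    """Block 520: replace each dynamic element with its current value;
    unresolved placeholders are left intact."""
    return string.Template(template).safe_substitute(custom)

subtitle = "Happy Birthday, ${name}!"              # block 510
user_input = {"name": "Sylvain"}                   # block 515 (user-provided)
resolved = resolve_subtitle(subtitle, user_input)  # block 520
```

`safe_substitute` is chosen so that an element with no matching customized value survives unresolved rather than raising an error mid-show.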
- the teachings of the present principles may be implemented as a combination of hardware and software.
- the software may be implemented as an application program tangibly embodied on a program storage unit.
- the application program may be uploaded to, and executed by, a machine comprising any suitable architecture.
- the machine is implemented on a computer platform having hardware such as one or more central processing units ("CPU"), a random access memory ("RAM"), and input/output ("I/O") interfaces.
- the computer platform may also include an operating system and microinstruction code.
- the various processes and functions described herein may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU.
- various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Transforming Electric Information Into Light Information (AREA)
Abstract
Description
Claims
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP08779907A EP2304499A1 (en) | 2008-06-30 | 2008-06-30 | Method and apparatus for dynamic displays for digital cinema |
JP2011516242A JP5717629B2 (en) | 2008-06-30 | 2008-06-30 | Method and apparatus for dynamic display for digital movies |
CA2728326A CA2728326A1 (en) | 2008-06-30 | 2008-06-30 | Method and apparatus for dynamic displays for digital cinema |
KR1020107029614A KR20110029141A (en) | 2008-06-30 | 2008-06-30 | Method and apparatus for dynamic displays for digital cinema |
US12/737,228 US20110090397A1 (en) | 2008-06-30 | 2008-06-30 | Method and apparatus for dynamic displays for digital cinema |
CN2008801301066A CN102077138A (en) | 2008-06-30 | 2008-06-30 | Method and apparatus for dynamic displays for digital cinema |
PCT/US2008/008166 WO2010002361A1 (en) | 2008-06-30 | 2008-06-30 | Method and apparatus for dynamic displays for digital cinema |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2008/008166 WO2010002361A1 (en) | 2008-06-30 | 2008-06-30 | Method and apparatus for dynamic displays for digital cinema |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010002361A1 true WO2010002361A1 (en) | 2010-01-07 |
Family
ID=40361580
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2008/008166 WO2010002361A1 (en) | 2008-06-30 | 2008-06-30 | Method and apparatus for dynamic displays for digital cinema |
Country Status (7)
Country | Link |
---|---|
US (1) | US20110090397A1 (en) |
EP (1) | EP2304499A1 (en) |
JP (1) | JP5717629B2 (en) |
KR (1) | KR20110029141A (en) |
CN (1) | CN102077138A (en) |
CA (1) | CA2728326A1 (en) |
WO (1) | WO2010002361A1 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8140407B2 (en) * | 2009-08-06 | 2012-03-20 | International Business Machines Corporation | Method, system, and storage medium for substituting media preview items for suppressed media preview items |
JP2013051462A (en) * | 2011-08-30 | 2013-03-14 | Sony Corp | Screening management system, screening management method and program |
CN107431780A (en) * | 2015-01-27 | 2017-12-01 | 巴科公司 | System and method for merging the digital movie bag for multiple screens environment |
CN109788335B (en) * | 2019-03-06 | 2021-08-17 | 珠海天燕科技有限公司 | Video subtitle generating method and device |
CN112995134B (en) * | 2021-02-03 | 2022-03-18 | 中南大学 | Three-dimensional video streaming media transmission method and visualization method |
CN112882678B (en) * | 2021-03-15 | 2024-04-09 | 百度在线网络技术(北京)有限公司 | Image-text processing method, image-text processing display method, image-text processing device, image-text processing equipment and storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2001013301A2 (en) | 1999-08-13 | 2001-02-22 | Cinecast, Llc | System and method for digitally providing and displaying advertisement information to cinemas and theaters |
US20050057724A1 (en) | 2003-09-11 | 2005-03-17 | Eastman Kodak Company | Method for staging motion picture content by exhibitor |
US20060245727A1 (en) | 2005-04-28 | 2006-11-02 | Hiroshi Nakano | Subtitle generating apparatus and method |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08275205A (en) * | 1995-04-03 | 1996-10-18 | Sony Corp | Method and device for data coding/decoding and coded data recording medium |
JP3326670B2 (en) * | 1995-08-02 | 2002-09-24 | ソニー株式会社 | Data encoding / decoding method and apparatus, and encoded data recording medium |
US6384893B1 (en) * | 1998-12-11 | 2002-05-07 | Sony Corporation | Cinema networking system |
JP2001036485A (en) * | 1999-07-16 | 2001-02-09 | Nippon Telegr & Teleph Corp <Ntt> | Broadcast contents configuration method, broadcast reception method, broadcast reception system and storage medium storing broadcast reception program |
US20020069107A1 (en) * | 1999-12-22 | 2002-06-06 | Werner William B. | Video presentation scheduling and control method and system |
KR100341444B1 (en) * | 1999-12-27 | 2002-06-21 | 조종태 | Subtitle management method for digital video disk |
US6829003B2 (en) * | 2000-06-02 | 2004-12-07 | Pentax Corporation | Sampling pulse generator of electronic endoscope |
US7487112B2 (en) * | 2000-06-29 | 2009-02-03 | Barnes Jr Melvin L | System, method, and computer program product for providing location based services and mobile e-commerce |
US20020108114A1 (en) * | 2001-02-08 | 2002-08-08 | Sony Corporation | System and method for presenting DVD bulletin board screen personalized to viewer |
US6700640B2 (en) * | 2001-03-02 | 2004-03-02 | Qualcomm Incorporated | Apparatus and method for cueing a theatre automation system |
JP2003030287A (en) * | 2001-07-16 | 2003-01-31 | Yasuo Nishimori | System for supporting travel agency |
GB0129669D0 (en) * | 2001-12-12 | 2002-01-30 | Slaughter Paul | Apparatus and method |
US7075587B2 (en) * | 2002-01-04 | 2006-07-11 | Industry-Academic Cooperation Foundation Yonsei University | Video display apparatus with separate display means for textual information |
CN1720711B (en) * | 2002-10-04 | 2012-07-18 | Rgb系统公司 | Apparatus and system for providing universal web access function |
US7034916B2 (en) * | 2002-12-04 | 2006-04-25 | Eastman Kodak Company | Scheduling between digital projection and film projection corresponding to a predetermined condition |
US6812994B2 (en) * | 2002-12-04 | 2004-11-02 | Eastman Kodak Company | Streamlined methods and systems for scheduling and handling digital cinema content in a multi-theatre environment |
US20050076372A1 (en) * | 2002-12-04 | 2005-04-07 | Moore Leslie G. | Method for rapidly changing digital content for a digital cinema house |
US7236227B2 (en) * | 2002-12-04 | 2007-06-26 | Eastman Kodak Company | System for management of both pre-show and feature presentation content within a theatre |
US20040181807A1 (en) * | 2003-03-11 | 2004-09-16 | Theiste Christopher H. | System and method for scheduling digital cinema content |
US20040181819A1 (en) * | 2003-03-11 | 2004-09-16 | Theiste Christopher H. | System and method for scheduling in-theatre advertising |
JP2005004726A (en) * | 2003-05-20 | 2005-01-06 | Victor Co Of Japan Ltd | Electronic service manual creating method, additional data generating method, program for creating electronic service manual, and program for generating additional data |
JP4228854B2 (en) * | 2003-09-18 | 2009-02-25 | セイコーエプソン株式会社 | projector |
US20050081144A1 (en) * | 2003-10-13 | 2005-04-14 | Bankers Systems Inc. | Document creation system and method using knowledge base, precedence, and integrated rules |
JP2006059129A (en) * | 2004-08-20 | 2006-03-02 | Kobo Itonaga | Picture recording apparatus |
US20060078273A1 (en) * | 2004-10-07 | 2006-04-13 | Eastman Kodak Company | Promotional materials derived from digital cinema data stream |
US7382991B2 (en) * | 2005-04-27 | 2008-06-03 | Samsung Electronics Co., Ltd. | Apparatus and method for displaying toner residual quantity |
JP4466459B2 (en) * | 2005-04-28 | 2010-05-26 | ソニー株式会社 | Character information generation apparatus and method, character information display apparatus and method, digital movie screening method and system, and caption generation apparatus |
US20090172028A1 (en) * | 2005-07-14 | 2009-07-02 | Ana Belen Benitez | Method and Apparatus for Providing an Auxiliary Media In a Digital Cinema Composition Playlist |
US20090024922A1 (en) * | 2006-07-31 | 2009-01-22 | David Markowitz | Method and system for synchronizing media files |
JP4311475B2 (en) * | 2007-05-10 | 2009-08-12 | ソニー株式会社 | Digital cinema processing apparatus, ingest method, and program |
-
2008
- 2008-06-30 EP EP08779907A patent/EP2304499A1/en not_active Ceased
- 2008-06-30 WO PCT/US2008/008166 patent/WO2010002361A1/en active Application Filing
- 2008-06-30 CA CA2728326A patent/CA2728326A1/en not_active Abandoned
- 2008-06-30 US US12/737,228 patent/US20110090397A1/en not_active Abandoned
- 2008-06-30 JP JP2011516242A patent/JP5717629B2/en not_active Expired - Fee Related
- 2008-06-30 CN CN2008801301066A patent/CN102077138A/en active Pending
- 2008-06-30 KR KR1020107029614A patent/KR20110029141A/en not_active Application Discontinuation
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2458885A1 (en) | 2010-11-24 | 2012-05-30 | SmarDTV S.A. | A method and apparatus for controlling a display on a host device |
WO2012069321A1 (en) | 2010-11-24 | 2012-05-31 | Nagravision S.A. | A method and apparatus for controlling a display on a host device |
WO2016012978A1 (en) * | 2014-07-23 | 2016-01-28 | Active Solutions S.À.R.L. | System and method for creating and displaying recipient gifts for display at a venue display |
Also Published As
Publication number | Publication date |
---|---|
JP5717629B2 (en) | 2015-05-13 |
CA2728326A1 (en) | 2010-01-07 |
CN102077138A (en) | 2011-05-25 |
US20110090397A1 (en) | 2011-04-21 |
KR20110029141A (en) | 2011-03-22 |
EP2304499A1 (en) | 2011-04-06 |
JP2011529643A (en) | 2011-12-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20110090397A1 (en) | Method and apparatus for dynamic displays for digital cinema | |
KR100922999B1 (en) | Apparatus and methods for creating a playlist, a theatre manager apparatus, and a computer readable medium | |
JP5026423B2 (en) | Method and apparatus for delivering electronic messages | |
US6700640B2 (en) | Apparatus and method for cueing a theatre automation system | |
JP2017504230A (en) | Video broadcast system and method for distributing video content | |
AU2002244195A1 (en) | Apparatus and method for building a playlist | |
CN101652993A (en) | Method and apparatus for content distribution to and playout with a digital cinema system | |
US20190158928A1 (en) | Video summary information playback device and method and video summary information providing server and method | |
WO2002017633A2 (en) | Method and system for active modification of video content responsively to processes and data embedded in a video stream | |
CN113225587B (en) | Video processing method, video processing device and electronic equipment | |
US20150095940A1 (en) | Playlist content selection system and method | |
JP2007019768A (en) | Tag information generating apparatus, tag information generating method, and program | |
US20090064257A1 (en) | Compact graphics for limited resolution display devices | |
KR101050186B1 (en) | Multi media process play system and method | |
JPWO2003065311A1 (en) | DATA GENERATION DEVICE, DATA REPRODUCTION DEVICE, DATA PROCESSING SYSTEM, AND METHOD THEREOF | |
JP2015065694A (en) | Method and apparatus for dynamic display for digital cinema | |
JP2009060411A (en) | Vod system, and content distributing method for vod system | |
US8208787B2 (en) | SMMD media producing and reproducing apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200880130106.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 08779907 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 8988/DELNP/2010 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2728326 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12737228 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2011516242 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20107029614 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2008779907 Country of ref document: EP |