WO2016017451A1 - Reception device, reception method, transmission device, and transmission method - Google Patents
Reception device, reception method, transmission device, and transmission method
- Publication number
- WO2016017451A1 (PCT/JP2015/070498, JP2015070498W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- application
- information
- control information
- trigger
- unit
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04H—BROADCAST COMMUNICATION
- H04H20/00—Arrangements for broadcast or for distribution combined with broadcast
- H04H20/86—Arrangements characterised by the broadcast information itself
- H04H20/93—Arrangements characterised by the broadcast information itself which locates resources of other pieces of information, e.g. URL [Uniform Resource Locator]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04H—BROADCAST COMMUNICATION
- H04H60/00—Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
- H04H60/09—Arrangements for device control with a direct linkage to broadcast information or to broadcast space-time; Arrangements for control of broadcast-related services
- H04H60/13—Arrangements for device control affected by the broadcast information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/2183—Cache memory
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/237—Communication with additional data server
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/435—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/654—Transmission by server directed to the client
- H04N21/6543—Transmission by server directed to the client for forcing some client operations, e.g. recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8126—Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8166—Monomedia components thereof involving executable data, e.g. software
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8456—Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
- H04N21/8543—Content authoring using a description language, e.g. Multimedia and Hypermedia information coding Expert Group [MHEG], eXtensible Markup Language [XML]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
- H04N21/8547—Content authoring involving timestamps for synchronizing content
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/858—Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/858—Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot
- H04N21/8586—Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot by using a URL
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04H—BROADCAST COMMUNICATION
- H04H2201/00—Aspects of broadcast communication
- H04H2201/40—Aspects of broadcast communication characterised in that additional data relating to the broadcast data are available via a different channel than the broadcast channel
Definitions
- The present technology relates to a reception device, a reception method, a transmission device, and a transmission method, and in particular to a reception device, a reception method, a transmission device, and a transmission method capable of reliably operating an application executed in conjunction with AV content.
- AV (Audio Video) content such as programs broadcast by terrestrial broadcasting is sometimes redistributed by cable television (CATV: Cable Television) or satellite broadcasting (see, for example, Patent Document 1).
- The present technology has been made in view of such a situation, and aims to make it possible to reliably operate an application executed in conjunction with AV content.
- The reception device according to the first aspect of the present technology includes a first acquisition unit that acquires trigger information including at least location information as information for controlling the operation of an application executed in conjunction with AV (Audio Video) content, a second acquisition unit that acquires application control information for controlling the operation of the application, and a control unit that controls the operation of the application based on the trigger information and the application control information.
- The trigger information may include time information serving as a reference for the time used to control the operation of the application, the second acquisition unit may acquire schedule control information that defines the operation of the application in time series, and, when the time measured based on the time information has passed a time specified in the schedule control information, the control unit may control the operation of the application according to the action information for the application corresponding to that time.
- The application may be composed of a plurality of files, the second acquisition unit may acquire cache control information for controlling the caching of the file group constituting the application, and the control unit may hold the file group constituting the application in a cache memory based on the cache control information.
- The trigger information may include editing information for editing the contents defined in the schedule control information, and the control unit may edit the schedule control information based on the editing information.
- The location information may be information for acquiring the application control information, the schedule control information, and the cache control information, and the application control information, the schedule control information, and the cache control information may be associated with one another by the identification information of the application.
- The trigger information may include action information for the application, and the control unit may control the operation of the application according to the action information included in the trigger information when the trigger information is acquired.
- The application may be composed of a plurality of files, the second acquisition unit may acquire cache control information for controlling the caching of the file group constituting the application, and the control unit may hold the file group constituting the application in a cache memory based on the cache control information.
- The location information may be information for acquiring the application control information and the cache control information, and the trigger information, the application control information, and the cache control information may be associated by the identification information of the application.
- The AV content may be broadcast content transmitted by a digital broadcast signal, the trigger information may be distributed in a digital broadcast signal or distributed from a server on the Internet, and the first acquisition unit may acquire the trigger information distributed by broadcasting or communication.
- the receiving device may be an independent device, or may be an internal block constituting one device.
- the reception method according to the first aspect of the present technology is a reception method corresponding to the reception device according to the first aspect of the present technology described above.
- In the reception method according to the first aspect of the present technology, trigger information including at least location information is acquired as information for controlling the operation of an application executed in conjunction with AV content, application control information for controlling the operation of the application is acquired, and the operation of the application is controlled based on the trigger information and the application control information.
- The transmission device according to the second aspect of the present technology includes an acquisition unit that acquires AV content, a first generation unit that generates trigger information including at least location information as information for controlling the operation of an application executed in conjunction with the AV content, a second generation unit that generates application control information for controlling the operation of the application, and a transmission unit that transmits the trigger information and the application control information together with the AV content.
- The first generation unit may generate the trigger information including time information serving as a reference for the time used to control the operation of the application, the second generation unit may generate schedule control information that defines the operation of the application in time series, and the transmission unit may transmit the trigger information including the time information and the schedule control information.
- The application may include a plurality of files, the second generation unit may generate cache control information for controlling the caching of the file group constituting the application, and the transmission unit may further transmit the cache control information.
- The first generation unit may generate the trigger information including editing information for editing the contents defined in the schedule control information, and the transmission unit may transmit the trigger information including the editing information.
- The location information may be information for acquiring the application control information, the schedule control information, and the cache control information, and the application control information, the schedule control information, and the cache control information may be associated with one another by the identification information of the application.
- the first generation unit may generate the trigger information including action information for the application, and the transmission unit may transmit the trigger information including the action information.
- The application may include a plurality of files, the second generation unit may generate cache control information for controlling the caching of the file group constituting the application, and the transmission unit may further transmit the cache control information.
- The location information may be information for acquiring the application control information and the cache control information, and the trigger information, the application control information, and the cache control information may be associated by the identification information of the application.
- the AV content may be broadcast content, and the transmission unit may transmit the trigger information and application control information together with the AV content using a digital broadcast signal.
- the transmission device according to the second aspect of the present technology may be an independent device, or may be an internal block constituting one device.
- a transmission method according to the second aspect of the present technology is a transmission method corresponding to the transmission device according to the second aspect of the present technology described above.
- In the transmission method according to the second aspect of the present technology, AV content is acquired, trigger information including at least location information is generated as information for controlling the operation of an application executed in conjunction with the AV content, application control information for controlling the operation of the application is generated, and the trigger information and the application control information are transmitted together with the AV content.
- the trigger information includes at least location information as information for controlling the operation of the application.
- the metadata is composed of AIT (Application Information Table), EMT (Event Message Table), and CCT (Content Configuration Table).
- AIT is application control information for controlling the operation of the application.
- EMT is schedule control information that defines the operation of an application in time series.
- the CCT is cache control information for controlling the cache of the file group constituting the application.
- FIG. 1 is a diagram illustrating a configuration of trigger information.
- Trigger information has a structure in which a command ID (cmdID) indicating the type of trigger information and a URI (Uniform resource identifier) as location information that is the trigger information body are described. For example, “0” is designated as the command ID when trigger information is applied to application control related to the present technology.
- locator_part corresponds to a URI, and terms can be optionally specified.
- In the following description, the command ID is omitted.
- In the terms, action information (action), media time information (media_time), or event information (event) is specified. In the action information, various actions for the application are specified. In the media time information, information (time information) indicating a reference time (for example, the current time) for controlling the operation of the application is designated. In the event information, information (edit information) for editing the contents defined in the EMT is designated.
- In the terms, spread information (spread), version information (version), and other parameters can also be specified. In the spread information, information for probabilistically distributing operations related to the application is designated. In the version information, version information of the AIT, EMT, and CCT is specified. These parameters are connected by "&".
- In the action information specified in the terms of FIG. 1, the application ID (appID) and the action code (action_code) are connected by a period. In the application ID, identification information of the target application is specified. In the action code, an action to be executed by the application identified by the application ID is specified. As the action code, codes such as "a1" to "a4" and "e1" to "eN" are specified.
- Prefetch is an action for instructing acquisition of an application.
- The prefetch action code is "a1". Note that prefetch is sometimes referred to as "prepare"; in this description, the term is unified as "prefetch".
- Execute is an action for instructing acquisition or activation of an application.
- If the target application has been suspended, its execution is resumed by the execute action.
- the execute action code is “a2”.
- Kill is an action for terminating a running application.
- the kill action code is "a3".
- Suspend is an action for suspending a running application and causing it to pause.
- the suspend action code is "a4".
- The event ID N is an integer of 1 or more. In event_data, data associated with the stream event is specified. In event_time, a time for executing an action on the application is designated.
- In the event information specified in the terms of FIG. 1, an event ID (eventID) and an edit code (edit_code) are connected by a period. In the event ID, identification information of the target event among the events defined in time series in the EMT is specified. In the edit code, information (edit information) for editing the event identified by the event ID is specified.
- As the edit information, deletion (delete) or update (update) can be specified. In the case of an update, information indicating the time after updating is specified in the event time (event_time). The edit code for deletion is "1", and the edit code for update is "2".
- FIG. 4 is a diagram illustrating a description example of trigger information.
- “xbc.tv/e12” on the first line indicates trigger information when no terms are specified.
- metadata (AIT, EMT, CCT)
- URL (Uniform Resource Locator)
- SCS (Service Channel Signaling)
- FLUTE (File Delivery over Unidirectional Transport)
- In the trigger information in which terms are specified, action information (a: action) is specified as a term; an event (stream event) can also be specified as a term.
- FIG. 5 is a diagram illustrating an example of the syntax of AIT.
- AIT is described in a markup language such as XML (Extensible Markup Language).
- In FIG. 5, among the elements and attributes, "@" is prefixed to attributes. Indented elements and attributes are specified with respect to their upper elements.
- the ApplicationDiscovery element in the ServiceDiscovery element is an upper element of the DomainName attribute and the ApplicationList element.
- the DomainName attribute specifies the name of the domain.
- the ApplicationList element is an upper element of the Application element, and describes one or a plurality of Application elements as a list.
- the Application element is an upper element of the appName element, applicationIdentifier element, applicationDescriptor element, applicationTransport element, applicationLocation element, and applicationBoundary element.
- the name of the application is specified in the appName element.
- In the applicationIdentifier element, information related to the identification information of the application is specified.
- the applicationIdentifier element is an upper element of the orgId element and the appId element.
- An organization ID is specified in the orgId element.
- An application ID is specified in the appId element. This application ID corresponds to the application ID of the trigger information in FIG.
- the applicationDescriptor element is a higher element of the type element, controlCode element, serviceBound element, priority element, and icon element.
- In the type element, type information about the application is specified.
- In the controlCode element, an action to be executed by the application is specified.
- As this action information, for example, auto start, present, kill, or prefetch is designated.
- Auto start is an action for instructing automatic execution of an application.
- Present means that the application is not executed automatically.
- a kill is an action for terminating a running application.
- Prefetch is an action for instructing acquisition of an application.
- In the serviceBound element, information indicating whether the application depends on the service is specified.
- In the priority element, information indicating the priority when there are a plurality of applications is specified.
- In the icon element, the acquisition destination and size of the icon used in the application are specified.
- the applicationTransport element is an upper element of the type attribute, URLBase element, and URLExtension element.
- In the type attribute, type information regarding application transmission is specified.
- The application URL from which the application is acquired is specified by the URLBase element, the URLExtension element, and the applicationLocation element.
- In the applicationBoundary element, a domain indicating the range in which the application operates is specified.
- FIG. 5 also shows the number of occurrences (Cardinality): when "1" is specified, exactly one such element or attribute is specified; when "0..1" is specified, specifying the element or attribute is optional; when "1..N" is specified, one or more elements or attributes are specified; and when "0..N" is specified, specifying one or more elements or attributes is optional. The meaning of the number of occurrences is the same in the other syntaxes described later.
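- As a concrete illustration of the syntax in FIG. 5, the following is a minimal, hypothetical AIT instance sketch. The domain name, identifiers, URLs, and attribute values are illustrative assumptions and do not appear in the original description; only the element nesting follows the table above.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<ServiceDiscovery>
  <ApplicationDiscovery DomainName="xbc.tv">
    <ApplicationList>
      <Application>
        <appName>App1</appName>
        <applicationIdentifier>
          <orgId>1</orgId>
          <appId>1</appId>
        </applicationIdentifier>
        <applicationDescriptor>
          <type>HTML5</type>
          <!-- auto start, present, kill, or prefetch -->
          <controlCode>AUTOSTART</controlCode>
          <serviceBound>true</serviceBound>
          <priority>1</priority>
          <!-- attribute names for the icon acquisition destination and size are assumed -->
          <icon href="http://xbc.tv/app1/icon.png" size="64x64"/>
        </applicationDescriptor>
        <applicationTransport type="HTTP">
          <URLBase>http://xbc.tv/</URLBase>
          <URLExtension>app1/</URLExtension>
        </applicationTransport>
        <applicationLocation>index.html</applicationLocation>
        <applicationBoundary>xbc.tv</applicationBoundary>
      </Application>
    </ApplicationList>
  </ApplicationDiscovery>
</ServiceDiscovery>
```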
- FIG. 6 is a diagram illustrating an example of syntax of EMT.
- the EMT is described in a markup language such as XML.
- The EMT element is an upper element of the majorProtocolversion attribute, minorProtocolVersion attribute, id attribute, EMTVersion attribute, beginMT attribute, LiveTrigger element, and Event element.
- Syntax version information is specified in the majorProtocolversion attribute and the minorProtocolVersion attribute.
- In the id attribute, identification information for identifying the EMT is specified. As this identification information, a character string in which domain_name and program_id (segment_id) are concatenated with "/" is designated.
- In the EMTVersion attribute, EMT version information is specified.
- In the beginMT attribute, information indicating the time at which the media time corresponding to the EMT starts is specified.
- In the LiveTrigger element, information about trigger information (live trigger information) transmitted from a server on the Internet is described. In the LiveTrigger element, a URL attribute and a pollPeriod attribute are specified. In the URL attribute, a URL for connecting to the server that provides the live trigger information is specified. In the pollPeriod attribute, the polling period for acquiring the live trigger information from the server is specified.
- In the Event element, event information is specified in time series.
- the Event element is an upper element of the id attribute, appID attribute, action attribute, startTime attribute, endTime attribute, and Data element.
- An event ID is specified in the id attribute.
- An application ID is specified in the appID attribute.
- In the action attribute, an action to be executed by the application is specified. As this action, prefetch, execute, suspend, kill, or inject event is designated.
- Prefetch is an action for instructing acquisition of an application. As described above, prefetch may be referred to as “prepare”.
- Execute is an action for instructing acquisition or activation of an application.
- execution of the application is resumed.
- Suspend is an action for suspending a running application and causing it to pause.
- Kill is an action for terminating a running application.
- An inject event is an action for firing an event as a stream event.
- In the startTime attribute, information indicating the start time of the validity period of the action for the application is specified.
- In the endTime attribute, information indicating the end time of the validity period of the action for the application is specified.
- The validity period of the action for the application is determined by the startTime attribute and the endTime attribute, which indicate two points on the progress time axis of the corresponding AV content. For example, when the progress timing of the AV content, measured by the receiver's internal clock (media time information), has passed the validity start time indicated by the startTime attribute, the action corresponding to that validity start time is validated. In this case, only the startTime attribute may be specified without specifying the endTime attribute. Conversely, when the progress timing of the AV content is outside the validity period, the action corresponding to the validity period may be invalidated. That is, in the receiver, when the time measured by the internal clock (media time information) satisfies a predetermined validity condition based on the validity period or the like, the action corresponding to that validity period is validated.
- the Data element specifies the data used in the event when an injection event is specified as action information.
- a data ID for identifying the data is specified by the dataID attribute of the child element of the Data element.
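- The following is a minimal, hypothetical EMT instance sketch based on the syntax in FIG. 6. The id follows the domain_name/program_id form described above; the server URL, polling period, action token spellings, time values, and the child element carrying the dataID are illustrative assumptions rather than values from the original description.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<EMT majorProtocolversion="1" minorProtocolVersion="0"
     id="xbc.tv/e12" EMTVersion="1" beginMT="0">
  <!-- live trigger information delivered from a server on the Internet -->
  <LiveTrigger URL="http://trigger.example.com/live" pollPeriod="30"/>
  <!-- prefetch application 1 once the media time passes the start time -->
  <Event id="1" appID="1" action="prefetch" startTime="0"/>
  <!-- execute application 1 -->
  <Event id="2" appID="1" action="execute" startTime="60"/>
  <!-- fire a stream event for application 1 within a bounded validity period -->
  <Event id="3" appID="1" action="injectEvent" startTime="120" endTime="130">
    <Data>
      <!-- the element name "DataItem" is hypothetical; dataID identifies the event data -->
      <DataItem dataID="1">event payload</DataItem>
    </Data>
  </Event>
</EMT>
```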
- FIG. 7 is a diagram for explaining the outline of CCT.
- FIG. 7 shows the structure of an application having an application ID “a1”.
- the application includes a plurality of files such as HTML (HyperText Markup Language) files and JPEG (Joint Photographic Experts Group) files.
- In FIG. 7, an HTML file is represented by "An", and a resource file referred to from an HTML file, such as a JPEG file, is represented by "Bn", where "n" is a number for identifying each file.
- HTML file A11 on the top page is linked to the HTML file A01 and the HTML file A12.
- the HTML file A01 is linked to the HTML file A02, and the HTML file A02 is further linked to the HTML file A03.
- Each HTML file, together with the resource files it references, constitutes a presentation unit (PU: Presentation Unit). For example, the HTML file A12 refers to the resource files B07, B09, B12, and B13; the HTML file A01 refers to the resource files B01, B03, and B04; and the HTML file A02 refers to the resource files B05 and B06.
- the receiver can speed up the processing related to the application by holding the file group constituting the application in the cache memory.
- However, the files that can be held in the cache memory are limited by the capacity of the cache memory. Therefore, the CCT provides cache control information for controlling the caching of the file group constituting the application, so that the receiver can hold files in the cache memory according to the capacity of its cache memory.
- For example, if the receiver can hold an intermediate number of files, larger than the minimum set consisting of only the presented HTML file and its resource files ("Medium Cache" in the center of FIG. 8), the surrounding file group is also held in the cache memory.
- In this way, the receiver can adaptively hold files in the cache memory according to the capacity of the cache memory. In addition, since files distributed by broadcast are delivered cyclically, a desired file that is missed may become available only after tens of seconds or minutes; by holding files in the cache memory appropriately, the receiver can avoid such a case.
- FIG. 9 is a diagram showing an example of CCT syntax.
- the CCT is described in a markup language such as XML.
- The CCT element is an upper element of the majorProtocolversion attribute, minorProtocolVersion attribute, CCTVersion attribute, baseURI attribute, and Application element.
- Syntax version information is specified in the majorProtocolversion attribute and the minorProtocolVersion attribute.
- the CCTVersion attribute specifies CCT version information.
- In the baseURI attribute, a common URL serving as the base of the URLs related to the CCT is specified.
- In the Application element, cache control information for each application is specified.
- the Application element is an upper element of the appID attribute, the size attribute, and the PU element.
- An application ID is specified in the appID attribute. This application ID corresponds to an application ID such as AIT in FIG.
- In the size attribute, information indicating the size of the entire application is specified.
- In the PU element, cache control information for each presentation unit (PU) is specified.
- the PU element is an upper element of the id attribute, the size attribute, the Item element, and the LinkedPU element.
- In the id attribute, PU identification information (PU_ID in FIG. 7 and the like) is designated.
- the size attribute information indicating the size in PU units is specified. The receiver can determine the file to be held in the cache memory by checking the size information for each PU.
- In the Item element, information about each file constituting the PU is specified.
- the Item element is an upper element of the primary attribute, the uri attribute, and the type attribute.
- In the primary attribute, information on the primary file in the PU is specified. The primary file is an HTML file, and this information enables the receiver to recognize that a specific PU has been taken in.
- the URL of each file is specified in the uri attribute.
- a URL relative to the URL specified by the baseURI attribute is specified.
- In the type attribute, "m" or "p" is specified as type information.
- When "m" is specified as this type information, it indicates that the file is a file constituting the PU.
- When "p" is specified as the type information, it indicates that the file is not a file constituting the PU but a specific file.
- The type information "p" is specified for a specific file that the broadcaster wants the receiver to forcibly acquire in advance.
- the LinkedPU element specifies information related to the PU linked to the target PU.
- As this information, identification information of the linked PU (PU_ID in FIG. 7 and the like) is designated.
- Thereby, the receiver side can hold, for example, an intermediate file group that is larger than the minimum file group in the cache memory ("Medium Cache" in the center of FIG. 8).
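- Combining the syntax in FIG. 9 with the PU structure of FIG. 7, a CCT instance might be sketched as follows. This is a hypothetical example: the file names, sizes, PU identifiers, and the attribute name on the LinkedPU element are assumptions for illustration only.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<CCT majorProtocolversion="1" minorProtocolVersion="0"
     CCTVersion="1" baseURI="http://xbc.tv/app1/">
  <Application appID="1" size="2048">
    <!-- PU built around the HTML file A01 and the resource files it references -->
    <PU id="PU-A01" size="512">
      <Item primary="true" uri="a01.html" type="m"/>
      <Item primary="false" uri="b01.jpg" type="m"/>
      <Item primary="false" uri="b03.jpg" type="m"/>
      <Item primary="false" uri="b04.jpg" type="m"/>
      <!-- a specific file that the broadcaster wants acquired in advance -->
      <Item primary="false" uri="promo.jpg" type="p"/>
      <!-- neighbouring PU that may also be cached when capacity allows -->
      <LinkedPU id="PU-A02"/>
    </PU>
    <!-- PU built around the HTML file A02 -->
    <PU id="PU-A02" size="384">
      <Item primary="true" uri="a02.html" type="m"/>
      <Item primary="false" uri="b05.jpg" type="m"/>
      <Item primary="false" uri="b06.jpg" type="m"/>
    </PU>
  </Application>
</CCT>
```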
- AV content such as programs can be transmitted by digital broadcasting using the IP (Internet Protocol) transmission method.
- FIG. 10 is a diagram showing a system pipe model of digital broadcasting using the IP transmission method.
- a plurality of BBP (Base Band Packet) streams are transmitted in a broadcast wave (RF Channel) having a predetermined frequency band.
- Each BBP stream includes NTP (Network Time Protocol), multiple service channels (Service Channel), ESG (Electronic Service Guide) service, and LLS (Low Layer Signaling).
- the NTP, service channel, and ESG service are transmitted according to the UDP / IP (User Datagram Protocol / Internet Protocol) protocol, but the LLS is transmitted on the BBP stream.
- NTP is time information.
- ESG service is an electronic service guide.
- In the LLS, low layer signaling information is transmitted. As the LLS signaling information, metadata such as SCD (Service Configuration Description), EAD (Emergency Alert Description), and RRD (Region Rating Description) is transmitted.
- The SCD indicates the BBP stream configuration and the service configuration in the broadcast network by an ID system corresponding to the MPEG2-TS (Moving Picture Experts Group phase 2 - Transport Stream) system.
- the SCD includes attribute / setting information for each service, bootstrap information for connecting to the ESG service and the SCS, and the like.
- the EAD contains information about emergency notifications.
- the RRD includes rating information.
- the LLS signaling information such as SCD, EAD, and RRD is described in a markup language such as XML.
- the service channel (hereinafter referred to as “service”) is composed of SCS (Service Channel Signaling) and components that make up programs such as video, audio, and subtitles.
- A common IP address is assigned to the elements constituting each service, and the components, the SCS, and the like can be packaged for each service using this IP address.
- The SCS transmits signaling information in service units.
- As the SCS signaling information, metadata such as USD (User Service Description), MPD (Media Presentation Description), SDP (Session Description Protocol), FDD (File Delivery Description), SPD (Service Parameter Description), and IS (Initialization Segment) is transmitted.
- the USD includes reference information for referring to SCS signaling information such as MPD, FDD, and SDP.
- The USD may also be referred to as USBD (User Service Bundle Description).
- The MPD includes information such as a segment URL for each component stream transmitted in service units. Note that the MPD conforms to the MPEG-DASH (Moving Picture Experts Group - Dynamic Adaptive Streaming over HTTP) standard; a minimal illustrative fragment is sketched after this list.
- the SDP includes service attributes for each service, stream configuration information and attributes, filter information, location information, and the like.
- the FDD includes information such as location information (for example, URL) and TOI (Transport Object Identifier) as index information for each TSI (Transport Session Identifier).
- Note that FLUTE+ (FLUTE plus), obtained by extending FLUTE, may be used instead of FLUTE.
- the SPD is composed of various parameters specified at the service and component level.
- the IS includes control information related to segment data of video and audio components transmitted in the FLUTE session.
- Each segment of the FLUTE session conforms to the ISO Base Media File Format.
- The SCS signaling information such as USD, MPD, SDP, FDD, SPD, and IS is described in a markup language such as XML. The IS may be transmitted in the video or audio stream instead of in the SCS stream. Hereinafter, when there is no need to particularly distinguish the LLS signaling information and the SCS signaling information, they will be simply referred to as "signaling information".
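- For reference, the MPD mentioned above follows the MPEG-DASH schema. The fragment below is a minimal, hypothetical sketch showing how a segment URL for a video component stream could be described; all identifiers, URLs, and numeric values are illustrative and are not taken from the original description.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="dynamic"
     minBufferTime="PT2S" profiles="urn:mpeg:dash:profile:isoff-live:2011">
  <Period id="1" start="PT0S">
    <AdaptationSet mimeType="video/mp4" segmentAlignment="true">
      <Representation id="video" bandwidth="2000000" codecs="avc1.4d401f">
        <!-- template from which the per-segment URLs of the video stream are derived -->
        <SegmentTemplate timescale="90000" duration="180000"
                         initialization="video-init.mp4"
                         media="video-$Number$.mp4" startNumber="1"/>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>
```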
- an RF channel ID is assigned to a broadcast wave (RF Channel) having a predetermined frequency band, for example, for each broadcaster.
- a BBP stream ID (BBP Stream ID) is assigned to one or a plurality of BBP streams transmitted on each broadcast wave.
- a service ID (Service ID) is assigned to one or more services transmitted in each BBP stream.
- In the ID system of the IP transmission scheme, a configuration corresponding to the combination (triplet) of the network ID (Network ID), transport stream ID (Transport Stream ID), and service ID (Service ID) used in the MPEG2-TS system is adopted, and this triplet indicates the BBP stream configuration and the service configuration in the broadcast network.
- the RF channel ID and BBP stream ID correspond to the network ID and transport stream ID in the MPEG2-TS system.
- FIG. 11 and 12 are diagrams for explaining use case 1.
- In FIGS. 11 and 12, the time direction runs from the left side to the right side of the drawings; although the two figures are separate drawings, they are assumed to be continuous in time across the vertical dotted line L1 in the drawings.
- In FIG. 11, the transmitter of the broadcast station (broadcaster) identified by the RF channel ID transmits a BBP stream identified by the BBP stream ID in a digital broadcast signal using the IP transmission method. In this BBP stream, streams of the AV content ("A/V" in the figure), the SCS signaling information ("SCS" in the figure), the metadata (transmitted in the "SCS" stream in the figure), and the application ("NRT" in the figure) that constitute the service identified by the service ID are transmitted.
- Trigger information is embedded in the video data constituting the AV content. In addition, an application server (Application Server) that distributes applications and a metadata server (Metadata Server) that distributes metadata are provided on the Internet.
- In use case 1, a recorded program such as a drama is transmitted as the AV content ("A/V" in the figure) distributed from the transmitter, and a receiver installed in each home or the like reproduces the recorded program by connecting to the A/V stream.
- the receiver reads the SCD from the memory, connects to the SCS stream transmitted by the broadcast wave according to the SCS Bootstrap information, and acquires the SCS signaling information (S11 in FIG. 11). Note that the receiver acquires LLS signaling information transmitted by LLS and records it in the memory during the initial scan processing. In the redistribution environment, when signaling information cannot be acquired, it is not necessary to acquire such information.
- the receiver acquires trigger information transmitted in the video stream at a timing when the transmitter transmits the trigger information (S12 in FIG. 11).
- This trigger information includes location information (Locator) and media time information (Media Time).
- the receiver sets the media time information included in the trigger information, and starts measuring the time based on the media time information (S13 in FIG. 11).
- Next, the receiver determines whether the distribution route of the metadata is broadcast or communication. In the redistribution environment, when the signaling information cannot be acquired, the distribution route may be determined to be communication only.
- When the metadata is distributed by broadcasting, the receiver connects to the SCS stream according to the SDP, FDD, and the like included in the SCS signaling information, and acquires the metadata file transmitted in the FLUTE session (S14 in FIG. 11).
- When the metadata is distributed by communication, the receiver connects to the metadata server via the Internet according to the location information included in the trigger information, and acquires the metadata file (S14 in FIG. 11).
- This metadata includes AIT, EMT, and CCT.
- the AIT includes application control information such as an organization ID (OrgID), an application ID (AppID), and an application URL (App_URL).
- In the EMT, action information for each application is specified in time series (see also the sketch below). As action information for application 1 (App1), prefetch (Pref) at time T0, execute (Exec) at time T1, inject event (Inj_A_E) at time T2, suspend (Susp) at time T4, execute (Exec) at time T5, and kill (Kill) at time T6 are defined. As action information for application 2 (App2), prefetch (Pref) at time T3, execute (Exec) at time T4, and kill (Kill) at time T5 are defined.
- the CCT includes cache control information such as the URL of each file that makes up the PU for application 1 and application 2.
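- Expressed against the EMT syntax of FIG. 6, the timeline above could be sketched roughly as follows. This is a hypothetical illustration: the event IDs and application IDs are arbitrary, the action token spellings are assumed, and the symbolic times T0 to T6 stand in for concrete media-time values.

```xml
<EMT id="xbc.tv/e12" EMTVersion="1" beginMT="T0">
  <!-- application 1 (App1) -->
  <Event id="1" appID="1" action="prefetch"    startTime="T0"/>
  <Event id="2" appID="1" action="execute"     startTime="T1"/>
  <Event id="3" appID="1" action="injectEvent" startTime="T2"/>
  <Event id="4" appID="1" action="suspend"     startTime="T4"/>
  <Event id="5" appID="1" action="execute"     startTime="T5"/>
  <Event id="6" appID="1" action="kill"        startTime="T6"/>
  <!-- application 2 (App2) -->
  <Event id="7" appID="2" action="prefetch"    startTime="T3"/>
  <Event id="8" appID="2" action="execute"     startTime="T4"/>
  <Event id="9" appID="2" action="kill"        startTime="T5"/>
</EMT>
```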
- In the receiver, the time measurement based on the media time information is started in the process of step S13, and whether the measured time has reached (passed) the times specified in the action information for each application in the EMT is constantly monitored.
- When the measured time reaches time T0, it is time to execute the prefetch action for application 1 (App1) specified in the EMT, so the receiver refers to the AIT according to the application ID (AppID) of application 1 and acquires the application URL of application 1 (S15 in FIG. 11).
- the receiver refers to the CCT and determines a file to be held in the cache memory from among the file group constituting the application 1.
- a file to be held in the cache memory is determined according to the capacity of the cache memory included in the receiver.
- Based on the USD included in the SCS signaling information acquired in the process of step S11, the application URL, and the application item URLs (URLs indicating the acquisition destinations of the files to be held in the cache memory), the receiver determines whether the distribution route of application 1 (its files) is broadcast or communication. In the redistribution environment, when the signaling information cannot be acquired, the distribution route may be determined to be communication only.
- When application 1 (its files) is distributed by broadcasting, the receiver connects to the NRT stream according to the SDP, FDD, and the like included in the SCS signaling information, and acquires the files of application 1 transmitted in the FLUTE session (S15 in FIG. 11).
- When application 1 (its files) is distributed by communication, the receiver connects to the application server via the Internet according to the application URL and the like, and acquires the files of application 1 (S15 in FIG. 11).
- In this way, the files of application 1 distributed by broadcasting or communication are acquired and held in the cache memory (Cache). Which files of application 1 are held in the cache memory is determined based on the capacity of the cache memory and the CCT as the cache control information.
- When the measured time reaches time T3, it is time to execute the prefetch action for application 2 (App2) specified in the EMT, so the receiver refers to the AIT according to the application ID (AppID) of application 2 and acquires the application URL of application 2 (S18 in FIG. 12).
- the receiver refers to the CCT and determines a file to be held in the cache memory among the file group configuring the application 2.
- a file to be held in the cache memory is determined according to the capacity of the cache memory included in the receiver.
- Based on the USD included in the SCS signaling information acquired in step S11, the application URL, and the application item URLs (URLs indicating the acquisition destinations of the files to be held in the cache memory), the receiver determines whether the distribution route of application 2 (its files) is broadcast or communication. In the redistribution environment, when the signaling information cannot be acquired, the distribution route may be determined to be communication only.
- When application 2 (its files) is distributed by broadcasting, the receiver connects to the NRT stream according to the SDP, FDD, and the like included in the SCS signaling information, and acquires the files of application 2 transmitted in the FLUTE session (S18 in FIG. 12).
- When application 2 (its files) is distributed by communication, the receiver connects to the application server via the Internet according to the application URL and the like, and acquires the files of application 2 (S18 in FIG. 12).
- In this way, the files of application 2 distributed by broadcasting or communication are acquired and held in the cache memory (Cache). Which files of application 2 are held in the cache memory is determined based on the capacity of the cache memory and the CCT as the cache control information.
- When the measured time reaches time T4, it is time to execute the suspend action for application 1 (App1) and the execute action for application 2 (App2) specified in the EMT, so the receiver suspends the running application 1 and saves it in the cache memory (Cache) (S19 in FIG. 12). Subsequently, the receiver reads and executes application 2 (its files) held in the cache memory (S19 in FIG. 12). As a result, in the receiver, application 2 operates instead of application 1 in conjunction with the recorded program.
- When the measured time reaches time T5, it is time to execute the kill action for application 2 (App2) and the execute action for application 1 (App1) specified in the EMT, so the receiver terminates the running application 2 (S20 in FIG. 12). Subsequently, application 1, which was saved in the cache memory in the process of step S19, is read and its execution is resumed (S20 in FIG. 12). As a result, in the receiver, application 2 that has been executed in conjunction with the recorded program is terminated, and application 1 again operates in conjunction with the recorded program.
- FIG. 13 shows the correspondence of each data in use case 1.
- FIG. 13 shows that metadata such as AIT and EMT is acquired according to location information included in the trigger information.
- The AIT and the EMT are associated with each other by the application ID. Further, the application is acquired according to the application URL in the AIT.
- FIGS. 14 and 15 are diagrams for explaining use case 2.
- In FIGS. 14 and 15, the time direction runs from the left side to the right side of the drawings; although the two figures are separate drawings, they are assumed to be continuous in time across the vertical dotted line L2 in the drawings.
- In FIG. 14, as in FIG. 11 and the like described above, the transmitter of the broadcasting station transmits a BBP stream in a broadcast wave of digital broadcasting using the IP transmission method. In this BBP stream, streams of the AV content ("A/V" in the figure), the SCS signaling information ("SCS" in the figure), the metadata (transmitted in the "SCS" stream in the figure), and the application ("NRT" in the figure) are transmitted, and trigger information is embedded in the video data constituting the AV content. In addition, an application server (Application Server) that distributes applications and a metadata server (Metadata Server) that distributes metadata are provided on the Internet.
- In use case 2, a live program such as a sports broadcast is transmitted as the AV content ("A/V" in the figure) distributed from the transmitter, and a receiver installed in each home or the like reproduces the live program by connecting to the A/V stream.
- the receiver reads the SCD from the memory, connects to the SCS stream transmitted by the broadcast wave according to the SCS Bootstrap information, and acquires the SCS signaling information (S31 in FIG. 14). In the redistribution environment, when signaling information cannot be acquired, it is not necessary to acquire such information.
- the receiver acquires the trigger information transmitted in the video stream at the timing when the transmitter transmits the trigger information (S32 in FIG. 14).
- This trigger information includes location information (Locator). Also, prefetch is added to the location information as action information for the application 1 (App1).
- Next, the receiver determines whether the distribution route of the metadata is broadcast or communication. In the redistribution environment, when the signaling information cannot be acquired, the distribution route may be determined to be communication only.
- When the metadata is distributed by broadcasting, the receiver connects to the SCS stream according to the SDP, FDD, and the like included in the SCS signaling information, and acquires the metadata file transmitted in the FLUTE session (S33 in FIG. 14).
- When the metadata is distributed by communication, the receiver connects to the metadata server via the Internet according to the location information included in the trigger information, and acquires the metadata file (S33 in FIG. 14).
- This metadata includes AIT and CCT.
- the AIT includes application control information such as an organization ID (OrgID), an application ID (AppID), and an application URL (App_URL).
- the CCT includes cache control information such as the URL of each file constituting the PU for the application 1 and the application 2.
- the receiver refers to the AIT according to the application ID (AppID) of the application 1 (App1) that is the target of the prefetch action added to the location information included in the trigger information, and acquires the application URL of the application 1 (S34 in FIG. 14).
- the receiver refers to the CCT and determines a file to be held in the cache memory from among the file group constituting the application 1.
- Based on the USD included in the SCS signaling information acquired in the process of step S31, the application URL, and the application item URLs (URLs indicating the acquisition destinations of the files to be held in the cache memory), the receiver determines whether the distribution route of application 1 is broadcast or communication. In the redistribution environment, when the signaling information cannot be acquired, the distribution route may be determined to be communication only.
- When application 1 (its files) is distributed by broadcasting, the receiver connects to the NRT stream according to the SDP, FDD, and the like included in the SCS signaling information, and acquires the files of application 1 transmitted in the FLUTE session (S34 in FIG. 14).
- When application 1 (its files) is distributed by communication, the receiver connects to the application server via the Internet according to the application URL and the like, and acquires the files of application 1 (S34 in FIG. 14).
- In this way, the files of application 1 distributed by broadcasting or communication are acquired and held in the cache memory (Cache). Which files of application 1 are held in the cache memory is determined based on the capacity of the cache memory and the CCT as the cache control information.
- the receiver monitors whether or not the trigger information is transmitted in the video stream, and the trigger information is acquired at the timing when the transmitter transmits the trigger information (S35 in FIG. 14).
- an execute action for application 1 (App1) is added to the location information.
- the receiver reads and executes the application 1 held in the cache memory after confirming with reference to the AIT according to the trigger information (S36 in FIG. 14). Thereby, in the receiver, the application 1 operates in conjunction with the live program.
- In the receiver, whether trigger information is transmitted in the video stream is constantly monitored, and the trigger information is acquired at the timing when the transmitter transmits it (S37 in FIG. 14). In this trigger information, an inject event action for application 1 (App1) is added to the location information.
- the receiver fires an event for the running application 1 after confirming with reference to the AIT (S38 in FIG. 14).
- the display of the application 1 being executed in conjunction with the live program is switched.
- the receiver acquires the trigger information at the timing when the transmitter transmits the trigger information (S39 in FIG. 15).
- a prefetch action for application 2 (App2) is added to the location information.
- the receiver refers to the AIT according to the application ID (AppID) of the application 2 added to the location information and acquires the application URL of the application 2 (S40 in FIG. 15).
- In addition, the receiver refers to the CCT and determines the files to be held in the cache memory from among the file group constituting application 2.
- Based on the USD included in the SCS signaling information acquired in the process of step S31, the application URL, and the application item URLs (URLs indicating the acquisition destinations of the files to be held in the cache memory), the receiver determines whether the distribution route of application 2 is broadcast or communication. In the redistribution environment, when the signaling information cannot be acquired, the distribution route may be determined to be communication only.
- When application 2 (its files) is distributed by broadcasting, the receiver connects to the NRT stream according to the SDP, FDD, and the like included in the SCS signaling information, and acquires the files of application 2 transmitted in the FLUTE session (S40 in FIG. 15).
- When application 2 (its files) is distributed by communication, the receiver connects to the application server via the Internet according to the application URL and the like, and acquires the files of application 2 (S40 in FIG. 15).
- In this way, the files of application 2 distributed by broadcasting or communication are acquired and held in the cache memory (Cache). Which files of application 2 are held in the cache memory is determined based on the capacity of the cache memory and the CCT as the cache control information.
- the trigger information is acquired at the timing when the transmitter transmits the trigger information (S41 in FIG. 15).
- a suspend action for application 1 (App1) and an execute action for application 2 (App2) are added to the location information.
- According to the suspend action for application 1 added to the location information, the receiver, after confirming by referring to the AIT, suspends the running application 1 and saves it in the cache memory (Cache) (S42 in FIG. 15). Subsequently, according to the execute action for application 2 added to the location information, the receiver, after confirming by referring to the AIT, reads and executes application 2 held in the cache memory (S42 in FIG. 15). Thereby, in the receiver, application 2 operates instead of application 1 in conjunction with the live program.
- the trigger information is acquired at the timing when the transmitter transmits the trigger information (S43 in FIG. 15).
- a kill action for application 2 (App2) and an execute action for application 1 (App1) are added to the location information.
- According to the kill action for application 2 added to the location information, the receiver, after confirming by referring to the AIT, terminates the running application 2 (S44 in FIG. 15). Subsequently, according to the execute action for application 1 added to the location information, the receiver, after confirming by referring to the AIT, reads application 1 saved in the cache memory in the process of step S42 and resumes its execution (S44 in FIG. 15). Thereby, in the receiver, application 2 that has been executed in conjunction with the live program is terminated, and application 1 operates again in conjunction with the live program.
- the trigger information is acquired at the timing when the transmitter transmits the trigger information (S45 in FIG. 15).
- a kill action for application 1 (App1) is added to the location information.
- According to the kill action for application 1 added to the location information, the receiver, after confirming by referring to the AIT, terminates the running application 1. Thereby, in the receiver, application 1 that has been executed in conjunction with the live program is terminated, and only the live program is displayed.
- FIG. 16 shows the correspondence of each data in use case 2.
- FIG. 16 shows that metadata such as AIT is acquired according to location information included in the trigger information. Further, it is indicated that the application is acquired according to the application URL of AIT.
- FIG. 17 and 18 are diagrams for explaining use case 3.
- In FIGS. 17 and 18, the time direction runs from the left side to the right side of the drawings; although the two figures are separate drawings, they are assumed to be continuous in time across the vertical dotted line L3 in the drawings.
- the transmitter of the broadcasting station transmits the BBP stream by the broadcast wave of the digital broadcast using the IP transmission method, as in FIG. 11 and the like described above.
- AV content (“A / V” in the figure)
- SCS signaling information (“SCS” in the figure)
- metadata (“SCS” in the figure)
- NRT application
- trigger information is embedded in the video data constituting the AV content.
- In addition, an application server (Application Server) provided on the Internet distributes applications, and a metadata server (Metadata Server) distributes metadata.
- a receiver installed in each home or the like is playing a program by connecting to an A / V stream.
- the receiver reads the SCD from the memory, connects to the SCS stream transmitted by the broadcast wave according to the SCS Bootstrap information, and acquires the SCS signaling information (S51 in FIG. 17).
- the receiver acquires the trigger information transmitted in the video stream at the timing when the transmitter transmits the trigger information (S52 in FIG. 17).
- This trigger information includes location information (Locator) and media time information (Media Time).
- the receiver sets the media time information included in the trigger information and starts measuring the time based on the media time information (S53 in FIG. 17).
- the receiver determines whether the metadata distribution route is broadcast or communication. In the redistribution environment, when the signaling information cannot be acquired, it may be determined that the distribution route is communication only.
- When the metadata is distributed by broadcasting, the receiver connects to the SCS stream according to SDP or FDD included in the SCS signaling information, and acquires the metadata file transmitted in the FLUTE session (S54 in FIG. 17).
- On the other hand, when the metadata is distributed by communication, the receiver connects to the metadata server via the Internet according to the location information included in the trigger information, and acquires the metadata file (S54 in FIG. 17).
- This metadata includes AIT, EMT, and CCT.
- the AIT includes application control information such as an organization ID (OrgID), an application ID (AppID), and an application URL (App_URL).
- action information for each application is specified in time series.
- In the EMT, prefetch (Pref) at time T0, execute (Exec) at time T1, and inject event (Inj_A_E) at time T2 are specified as actions for application 1 (App1).
- the CCT includes cache control information such as the URL of each file that constitutes the PU for the application 1.
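- As a non-normative sketch (Python dataclasses with illustrative field names, not the actual schema), the relationship among the three kinds of metadata can be pictured as follows: the AIT carries per-application control information, the EMT lists time-stamped actions per application, and the CCT lists the files of each presentation unit (PU) to be cached.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class AitEntry:                 # application control information
        org_id: int
        app_id: str
        app_url: str

    @dataclass
    class EmtAction:                # action information specified in time series
        app_id: str
        time: float                 # media time at which the action is executed
        action: str                 # e.g. "prefetch", "execute", "inject_event", "suspend", "kill"

    @dataclass
    class CctEntry:                 # cache control information
        app_id: str
        pu_files: List[str] = field(default_factory=list)  # URLs of files making up a PU

    # Example loosely corresponding to use case 3 (times and URLs are illustrative)
    ait = [AitEntry(org_id=1, app_id="App1", app_url="http://example.com/app1/")]
    emt = [EmtAction("App1", 0.0, "prefetch"),
           EmtAction("App1", 10.0, "execute"),
           EmtAction("App1", 20.0, "inject_event")]
    cct = [CctEntry("App1", ["http://example.com/app1/index.html",
                             "http://example.com/app1/logo.jpg"])]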
- the time measurement based on the media time information was started in the process of step S53, and the receiver constantly monitors whether the measured time has reached (passed) the times specified in the action information for each application in the EMT.
- When the measured time reaches the time T0, it is time to execute the prefetch action for the application 1 (App1) defined in the EMT, so the receiver refers to the AIT according to the application ID (AppID) of the application 1 and acquires the application URL of the application 1 (S55 in FIG. 17). In addition, the receiver refers to the CCT and determines the files to be held in the cache memory from among the file group constituting the application 1.
- Based on the USD included in the SCS signaling information acquired in the process of step S51, the application URL, and the application item URL (URL indicating the acquisition destination of the file held in the cache memory), the receiver determines whether the distribution route of the application 1 (file) is broadcast or communication. In the redistribution environment, when the signaling information cannot be acquired, it may be determined that the distribution route is communication only.
- When the application 1 (file) is distributed by broadcasting, the receiver connects to the NRT stream according to SDP, FDD, etc. included in the SCS signaling information, and acquires the file of the application 1 transmitted in the FLUTE session (S55 in FIG. 17).
- On the other hand, when the application 1 (file) is distributed by communication, the receiver connects to the application server via the Internet according to the application URL or the like, and acquires the file of the application 1 (S55 in FIG. 17).
- the application 1 (file) distributed by broadcasting or communication is acquired and held in the cache memory (Cache).
- the file of the application 1 held in the cache memory is managed based on the capacity of the cache memory and the CCT as cache control information.
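- As a rough illustration only (not the normative CCT behavior), the decision of which files to hold can be sketched as follows in Python, assuming the CCT lists the files of each presentation unit (PU) with their sizes and that whole PUs are kept in order until the cache capacity would be exceeded:

    def select_files_to_cache(pu_list, cache_capacity):
        """pu_list: list of (pu_name, [(file_url, size_in_bytes), ...]) taken from the CCT.
        Returns the file URLs to hold, keeping whole PUs while they fit in the cache."""
        selected, used = [], 0
        for pu_name, files in pu_list:
            pu_size = sum(size for _, size in files)
            if used + pu_size > cache_capacity:
                break                     # this PU (and later ones) would overflow the cache
            selected.extend(url for url, _ in files)
            used += pu_size
        return selected

    # Illustrative numbers only
    pus = [("PU1", [("index.html", 40_000), ("app.js", 120_000)]),
           ("PU2", [("scene2.html", 300_000)])]
    print(select_files_to_cache(pus, cache_capacity=200_000))   # keeps only PU1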
- the trigger information is acquired at the timing when the transmitter transmits the trigger information (S56 in FIG. 17).
- event information for updating the execution time of the execute action for the application 1 (App1) from the time T1 to the time T1A is added to the location information.
- the receiver updates the execution time of the execute action for the application 1 stipulated in EMT from time T1 to time T1A (S57 in FIG. 17).
- When the measured time reaches the time T1A, it is time to execute the execute action for the application 1 defined in the EMT, so the receiver reads the application 1 held in the cache memory and executes it (S58 in FIG. 17). Thereby, in the receiver, the application 1 operates in conjunction with the program.
- the receiver acquires the trigger information at the timing when the transmitter transmits the trigger information (S59 in FIG. 18).
- event information for deleting the injection event action at time T2 for the application 1 (App1) is added to the location information.
- the receiver deletes the injection event action at time T2 for application 1 (App1) defined in EMT (S60 in FIG. 18).
- the trigger information is acquired at the timing when the transmitter transmits the trigger information (S61 in FIG. 18).
- an injection event action for application 1 (App1) is added to the location information.
- the receiver fires an event for the running application 1 after confirming with reference to the AIT (S62 in FIG. 18).
- Thereby, the display of the application 1 executed in conjunction with the program is switched. That is, here, the event for the application 1 is fired at the changed timing conveyed by the event information included in the trigger information, not at the time specified in advance in the EMT.
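- The editing of the EMT by event information carried in the trigger information, as in steps S57 and S60, can be sketched as follows (Python; the event-information field names and the concrete time values are assumptions for illustration):

    def apply_event_info(emt, event_info):
        """emt: list of dicts like {"app_id": "App1", "time": 10.0, "action": "execute"}.
        event_info: assumed form {"app_id", "action", "old_time", "new_time"};
        new_time=None means the action is deleted (as in step S60)."""
        updated = []
        for entry in emt:
            matches = (entry["app_id"] == event_info["app_id"]
                       and entry["action"] == event_info["action"]
                       and entry["time"] == event_info["old_time"])
            if not matches:
                updated.append(entry)
            elif event_info["new_time"] is not None:
                updated.append({**entry, "time": event_info["new_time"]})  # update execution time
            # else: drop the entry (delete the action)
        return updated

    emt = [{"app_id": "App1", "time": 10.0, "action": "execute"},
           {"app_id": "App1", "time": 20.0, "action": "inject_event"}]
    # update the execute action from T1 to T1A (as in step S57)
    emt = apply_event_info(emt, {"app_id": "App1", "action": "execute",
                                 "old_time": 10.0, "new_time": 12.0})
    # delete the inject event action at T2 (as in step S60)
    emt = apply_event_info(emt, {"app_id": "App1", "action": "inject_event",
                                 "old_time": 20.0, "new_time": None})
    print(emt)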
- FIG. 19 is a diagram for explaining use case 4.
- the transmitter of the broadcasting station (broadcasting company) transmits the BBP stream by the broadcast wave of the digital broadcasting using the IP transmission method.
- this BBP stream a stream of AV content (“A / V” in the figure) constituting the service is transmitted. Note that the file transmitted in the stream is transmitted in the FLUTE session.
- an application server (Application Server) provided on the Internet distributes applications
- a metadata server (Metadata Server) distributes metadata.
- an ACR server (ACR Server) is provided on the Internet, and in response to inquiries from receivers, AV content is identified using ACR (Automatic Content Recognition) technology, and depending on the identification result Provide trigger information.
- a recorded program such as a drama is transmitted as AV content ("A / V" in the figure) distributed from the transmitter.
- a receiver installed in each home or the like reproduces a recorded program by connecting to an A / V stream.
- the receiver transmits a feature amount (hereinafter referred to as "fingerprint information") extracted from at least one of the video data and the audio data of the recorded program being played back to the ACR server via the Internet (S71).
- This fingerprint information is transmitted from the receiver to the ACR server, for example, every few seconds.
- When the ACR server receives the fingerprint information from the receiver, the ACR server identifies the recorded program being played back by the receiver by collating the fingerprint information with a database using the ACR technology, and generates trigger information corresponding to the identification result. The ACR server transmits the trigger information corresponding to the ACR identification result to the receiver via the Internet.
- watermark information (Water Mark) may be used instead of fingerprint information.
- the watermark information can include information for specifying the program scene; in this case, it is not necessary to specify the program scene on the ACR server side.
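- The periodic inquiry from the receiver to the ACR server can be sketched as follows (Python; the server URL, the request format, and the fingerprint extraction are placeholders, since no concrete protocol is specified here):

    import json
    import time
    import urllib.request

    ACR_SERVER_URL = "http://acr.example.com/query"   # placeholder, not a real endpoint

    def extract_fingerprint(av_frame) -> bytes:
        # Placeholder: a real receiver derives the fingerprint from video and/or audio data.
        return bytes(av_frame)

    def query_acr_server(fingerprint: bytes) -> dict:
        """Send the fingerprint to the ACR server and return the trigger information (JSON assumed)."""
        req = urllib.request.Request(ACR_SERVER_URL, data=fingerprint,
                                     headers={"Content-Type": "application/octet-stream"})
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())

    def acr_polling_loop(get_current_frame, handle_trigger, period_seconds=5):
        # The receiver repeats the inquiry every few seconds while the content is played back.
        while True:
            fingerprint = extract_fingerprint(get_current_frame())
            handle_trigger(query_acr_server(fingerprint))
            time.sleep(period_seconds)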
- the receiver receives and acquires trigger information transmitted from the ACR server via the Internet (S72).
- This trigger information includes location information (Locator) and media time information (Media Time).
- the receiver sets the media time information included in the trigger information and starts measuring the time based on the media time information (S73).
- the receiver connects to the metadata server via the Internet according to the location information included in the trigger information acquired in step S72, and acquires a metadata file (S74).
- This metadata includes AIT, EMT, and CCT.
- the AIT includes application control information such as an organization ID (OrgID), an application ID (AppID), and an application URL (App_URL).
- action information for each application is specified in time series.
- the action information for application 1 (App1) includes prefetch (Pref) at time T0, execute (Exec) at time T1, inject event (Inj_A_E) at time T2, suspend (Susp) at time T4, execute (Exec) at time T5, and kill (Kill) at time T6.
- prefetch (Pref) at time T3, execute (Exec) at time T4, and kill (Kill) at time T5 are defined as action information for application 2 (App2).
- the CCT includes cache control information such as the URL of each file that makes up the PU for application 1 and application 2.
- the time measurement based on the media time information was started in the process of step S73, and whether the measured time has reached (passed) the times specified in the action information for each application in the EMT is constantly monitored.
- When the measured time reaches the time T0, it is time to execute the prefetch action for the application 1 (App1) defined in the EMT, so the receiver refers to the AIT according to the application ID (AppID) of the application 1 and acquires the application URL of the application 1 (S75). In addition, the receiver refers to the CCT and determines the files to be held in the cache memory from among the file group constituting the application 1.
- the receiver connects to the application server via the Internet according to the application URL or the like, and acquires the file of the application 1 (S75).
- the application 1 distributed by communication is acquired and held in the cache memory (Cache).
- the file of the application 1 held in the cache memory is managed based on the capacity of the cache memory and the CCT as cache control information.
- Although the subsequent operation of the application 1 is not shown in FIG. 19, when the measured time reaches the time T2, the inject event action for the application 1 (App1) defined in the EMT is executed, and in the receiver, an event for the application 1 being executed is fired.
- When the measured time reaches the time T3, the prefetch action for the application 2 (App2) defined in the EMT is executed, so the receiver connects to the application server via the Internet according to the application URL for acquiring the application 2, acquires the file of the application 2, and holds it in the cache memory (Cache).
- When the measured time reaches the time T5, the kill action for application 2 (App2) and the execute action for application 1 (App1) specified in the EMT are executed at the same time: the receiver terminates the running application 2 and, subsequently, reads the application 1 saved in the cache memory and resumes its execution.
- FIG. 20 shows the correspondence of each data in use case 4.
- the trigger information is not transmitted as a video stream or the like, but is acquired as a result of the inquiry by transmitting fingerprint information or watermark information to the ACR server.
- FIG. 20 shows that metadata such as AIT and EMT is acquired according to location information included in the trigger information.
- the AIT and the EMT are associated with each other by the application ID.
- the application is acquired according to the application URL of AIT.
- the use case 4 has been described above.
- FIG. 21 is a diagram for explaining use case 5.
- the transmitter of the broadcasting station (broadcasting company) transmits the BBP stream by the broadcast wave of the digital broadcasting using the IP transmission method.
- this BBP stream a stream of AV content (“A / V” in the figure) constituting each service is transmitted in a FLUTE session.
- an application server (Application Server) provided on the Internet distributes applications
- a metadata server (Metadata Server) distributes metadata.
- an ACR server (ACR server) provides trigger information according to the AV content identification result using the ACR technology.
- a live program such as a sports broadcast is transmitted as AV content (“A / V” in the figure) distributed from the transmitter.
- a receiver installed in each home or the like reproduces a live program by connecting to an A / V stream.
- the receiver transmits fingerprint information extracted from at least one of video data and audio data of a live program being reproduced to the ACR server via the Internet (S81).
- When the ACR server receives the fingerprint information from the receiver, the ACR server identifies the live program being played back on the receiver by collating the fingerprint information with a database using the ACR technology, and generates trigger information according to the identification result.
- the ACR server transmits trigger information to the receiver via the Internet. Note that watermark information may be used instead of fingerprint information.
- the receiver receives and acquires the trigger information transmitted from the ACR server via the Internet (S82).
- This trigger information includes location information. Further, a prefetch action is added to the location information as action information for the application 1 (App1).
- the receiver connects to the metadata server via the Internet in accordance with the location information included in the trigger information acquired in step S82, and acquires a metadata file (S83).
- This metadata includes AIT and CCT.
- the AIT includes application control information such as an organization ID (OrgID), an application ID (AppID), and an application URL (App_URL).
- the CCT includes cache control information such as the URL of each file constituting the PU for the application 1 and the like.
- the receiver refers to the AIT according to the application ID (AppID) of the application 1 (App1) that is the target of the prefetch action added to the location information included in the trigger information, and acquires the application URL of the application 1 (S84).
- the receiver refers to the CCT and determines a file to be held in the cache memory from among the file group constituting the application 1.
- the receiver connects to the application server via the Internet according to the application URL and the application item URL (URL indicating the acquisition destination of the file held in the cache memory), and acquires the file of the application 1 (S84).
- the application 1 distributed by communication is acquired and held in the cache memory (Cache).
- the file of the application 1 held in the cache memory is managed based on the capacity of the cache memory and the CCT as cache control information.
- Thereafter, the fingerprint information extracted from the live program being played back is periodically (for example, every few seconds) transmitted to the ACR server via the Internet, and trigger information corresponding to the ACR identification result is acquired.
- In this trigger information, an execute action for the application 1 (App1) is added to the location information.
- the receiver reads and executes the application 1 held in the cache memory after confirming with reference to the AIT according to the trigger information (S86). Thereby, in the receiver, the application 1 operates in conjunction with the live program.
- Although the subsequent operation of the application 1 is not shown in FIG. 21, the receiver thereafter continues to periodically (for example, every few seconds) transmit the fingerprint information extracted from the live program being played back to the ACR server via the Internet, and acquires the corresponding trigger information.
- the receiver fires an event for the application 1 being executed according to the trigger information.
- Further, the receiver connects to the application server via the Internet according to the trigger information, acquires the file of the application 2, and holds it in the cache memory (Cache).
- Then, the running application 1 is saved in the cache memory, and the application 2 held in the cache memory is read and executed.
- Thereafter, the running application 2 is terminated, the application 1 saved in the cache memory is read, and its execution is resumed. If the acquired trigger information includes a kill action for the application 1 (App1), the receiver terminates the application 1 being executed.
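- The sequence of actions appearing in this use case (prefetch, execute, suspend with saving to the cache, resume, kill, event firing) can be pictured as a small action dispatcher (Python; the cache and the application engine are simplified stand-ins for the units described later, not the actual implementation):

    class AppEngine:
        """Simplified stand-in for an application engine acting on trigger actions."""
        def __init__(self):
            self.cache = {}          # app_id -> application state held in the cache memory
            self.running = None      # app_id of the application currently executed

        def dispatch(self, app_id, action):
            if action == "prefetch":
                self.cache[app_id] = f"files of {app_id}"      # acquire and hold in the cache
            elif action == "execute":
                self.running = app_id                          # read from the cache and execute
            elif action == "suspend":
                self.cache[app_id] = f"suspended state of {app_id}"
                self.running = None                            # save the running app to the cache
            elif action == "kill":
                self.running = None                            # terminate the running app
            elif action == "inject_event":
                print(f"event fired for {app_id}")             # fire an event for the running app

    engine = AppEngine()
    for app_id, action in [("App1", "prefetch"), ("App1", "execute"),
                           ("App2", "prefetch"), ("App1", "suspend"),
                           ("App2", "execute"), ("App2", "kill"), ("App1", "execute")]:
        engine.dispatch(app_id, action)
    print(engine.running)   # App1 resumes after App2 is killed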
- FIG. 22 shows the correspondence of each data in use case 5.
- the trigger information is not transmitted as a video stream or the like, but is acquired as a result of the inquiry by transmitting fingerprint information or watermark information to the ACR server.
- FIG. 22 shows that metadata such as AIT is acquired according to location information included in the trigger information. Further, it is indicated that the application is acquired according to the application URL of AIT.
- the use case 5 has been described above.
- FIG. 23 is a diagram for explaining use case 6.
- the transmitter of the broadcasting station (broadcasting company) transmits the BBP stream by the broadcast wave of the digital broadcast using the IP transmission method.
- this BBP stream a stream of AV contents (“A / V” in the figure), SCS signaling information (“SCS” in the figure), and application (“NRT” in the figure) constituting the service is transmitted.
- an application server (Application Server) provided on the Internet distributes applications.
- Although a metadata server (Metadata Server) is illustrated, it is assumed here that metadata is not distributed.
- a receiver installed in each home or the like is playing a program by connecting to an A / V stream.
- the receiver reads the SCD from the memory, connects to the SCS stream transmitted on the broadcast wave according to the SCS Bootstrap information, and acquires the SCS signaling information (S91).
- This SCS signaling information includes AIT and CCT in addition to USD and the like.
- the AIT in FIG. 23 includes prefetch (Prefetch) as action information in addition to an organization ID (OrgID), an application ID (AppID), and an application URL (App_URL).
- the CCT also includes cache control information such as the URL of each file that constitutes the PU for the application 1.
- the receiver refers to the AIT and acquires an application URL corresponding to the application ID (AppID) of the application 1 (App1) that is the target of the prefetch action (S92).
- the receiver refers to the CCT and determines a file to be held in the cache memory from among the file group constituting the application 1.
- Based on the USD included in the SCS signaling information acquired in the process of step S91, the application URL, and the application item URL (URL indicating the acquisition destination of the file held in the cache memory), the receiver determines whether the distribution route of the application 1 is broadcast or communication.
- When the application 1 (file) is distributed by broadcasting, the receiver connects to the NRT stream according to SDP, FDD, etc. included in the SCS signaling information, and acquires the file of the application 1 transmitted in the FLUTE session (S92).
- On the other hand, when the application 1 (file) is distributed by communication, the receiver connects to the application server via the Internet according to the application URL or the like, and acquires the file of the application 1 (S92).
- the application 1 (file) distributed by broadcasting or communication is acquired and held in the cache memory (Cache).
- the file of the application 1 held in the cache memory is managed based on the capacity of the cache memory and the CCT as cache control information.
- the receiver monitors whether the AIT and the CCT included in the SCS signaling information transmitted in the SCS stream have been updated, and when at least one of the AIT and the CCT is updated, the SCS signaling information including the updated AIT and CCT is acquired (S93).
- In this AIT, an execute action for the application 1 (App1) is designated.
- the receiver reads and executes the application 1 held in the cache memory according to the AIT (S94). Thereby, in the receiver, the application 1 operates in conjunction with the program.
- the receiver continues to monitor the update of AIT and CCT, and when at least one of AIT and CCT is updated, SCS signaling information including the AIT and CCT is acquired (S95).
- In this AIT, a kill action for the application 1 (App1) is designated.
- the receiver terminates the running application 1 according to the AIT. Thereby, in the receiver, the application 1 executed in conjunction with the program is terminated and only the program is displayed.
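- The update monitoring of the AIT and CCT in the SCS signaling information (steps S93 and S95) can be sketched as a simple version check (Python; how update versions are actually signaled is not specified here, so a version number per table is assumed for illustration):

    def watch_scs_updates(scs_snapshots, on_update):
        """scs_snapshots: iterable of dicts like {"AIT": {"version": 1, ...}, "CCT": {...}}.
        Calls on_update(table_name, table) whenever the version of the AIT or CCT changes."""
        last_versions = {}
        for scs in scs_snapshots:
            for name in ("AIT", "CCT"):
                table = scs.get(name)
                if table is None:
                    continue
                if last_versions.get(name) != table["version"]:
                    last_versions[name] = table["version"]
                    on_update(name, table)

    snapshots = [
        {"AIT": {"version": 1, "action": "prefetch"}, "CCT": {"version": 1}},
        {"AIT": {"version": 2, "action": "execute"},  "CCT": {"version": 1}},  # like S93/S94
        {"AIT": {"version": 3, "action": "kill"},     "CCT": {"version": 1}},  # like S95
    ]
    watch_scs_updates(snapshots, lambda name, table: print(name, "updated:", table))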
- the use case 6 has been described above.
- FIG. 24 is a diagram illustrating a configuration example of a broadcast communication system.
- the system means a set of a plurality of components (devices and the like).
- the broadcast communication system 1 in FIG. 24 has a configuration for realizing the use cases 1 to 6 described above. That is, in FIG. 24, the broadcast communication system 1 includes a transmission device 10, a reception device 20, an application server 30, a metadata server 40, and an ACR server 50.
- the receiving device 20 is connected to the application server 30, the metadata server 40, and the ACR server 50 via the Internet 90.
- the transmitting device 10 transmits AV content such as pre-recorded programs and live programs, and signaling information, by digital broadcast signals. Further, the transmission device 10 transmits trigger information, metadata, or an application by including it in the digital broadcast signal.
- the transmission device 10 corresponds to the above-described transmitter, and is provided by, for example, a broadcaster and disposed in the broadcast station.
- the receiving device 20 receives the digital broadcast signal transmitted from the transmitting device 10.
- the receiving device 20 acquires and outputs AV content video and audio based on signaling information obtained from the digital broadcast signal.
- the receiving device 20 receives a digital broadcast signal from the transmitting device 10 and acquires trigger information, metadata, or an application.
- the receiving device 20 connects to the application server 30 via the Internet 90 and acquires an application. Further, the receiving device 20 connects to the metadata server 40 via the Internet 90 and acquires metadata.
- the receiving device 20 controls the operation of the application acquired through broadcasting or communication based on the signaling information, trigger information, and metadata acquired through broadcasting or communication.
- the receiving device 20 is a television receiver or the like corresponding to the above-described receiver, and is disposed in the home.
- the application server 30 distributes the application to the receiving device 20 via the Internet 90 in response to a request from the receiving device 20.
- the application server 30 corresponds to the above-described application server (“Application Server” in FIG. 11 and the like), and is installed by, for example, a broadcaster.
- the metadata server 40 distributes metadata to the receiving device 20 via the Internet 90 in response to a request from the receiving device 20.
- the metadata server 40 corresponds to the above-described metadata server (“Metadata Server” in FIG. 11 and the like), and is installed by, for example, a broadcaster.
- the receiving device 20 makes an inquiry about trigger information by connecting to the ACR server 50 via the Internet 90. At that time, the receiving device 20 transmits fingerprint information to the ACR server 50. The receiving device 20 acquires trigger information transmitted from the ACR server 50, and controls the operation of the application based on the trigger information.
- the ACR server 50 performs ACR processing on the fingerprint information in response to an inquiry from the receiving device 20 and identifies AV content being played back by the receiving device 20.
- the ACR server 50 generates trigger information corresponding to the ACR identification result, and transmits the trigger information to the receiving device 20 via the Internet 90.
- the ACR server 50 corresponds to the above-described ACR server (“ACR server” in FIG. 19 and the like), and is installed by, for example, a broadcaster.
- the broadcast communication system 1 is configured as described above. Next, a configuration example of each device configuring the broadcast communication system 1 of FIG. 24 will be described.
- FIG. 25 is a diagram illustrating a configuration example of the transmission apparatus in FIG.
- the transmission apparatus 10 includes a signaling information generation unit 111, a signaling information processing unit 112, a metadata generation unit 113, a metadata processing unit 114, an audio data acquisition unit 115, an audio encoder 116, a video data acquisition unit 117, a video encoder 118, a trigger information generation unit 119, a multiplexing unit 120, and a transmission unit 121.
- the signaling information generation unit 111 generates signaling information and supplies it to the signaling information processing unit 112.
- the signaling information processing unit 112 processes the signaling information supplied from the signaling information generation unit 111 and supplies it to the multiplexing unit 120.
- the metadata generation unit 113 generates metadata and supplies it to the metadata processing unit 114.
- the metadata processing unit 114 processes the metadata supplied from the metadata generation unit 113 and supplies the processed metadata to the multiplexing unit 120.
- the audio data acquisition unit 115 acquires audio data of AV content from an external server, microphone, recording medium, or the like, and supplies it to the audio encoder 116.
- the audio encoder 116 encodes the audio data supplied from the audio data acquisition unit 115 in accordance with an encoding method such as MPEG (Moving Picture Experts Group), and supplies the encoded data to the multiplexing unit 120.
- the video data acquisition unit 117 acquires video data of AV content from an external server, camera, recording medium, or the like, and supplies it to the video encoder 118 and the trigger information generation unit 119.
- the video encoder 118 encodes the video data supplied from the video data acquisition unit 117 in accordance with an encoding method such as MPEG, and supplies the encoded video data to the multiplexing unit 120.
- the trigger information generation unit 119 generates trigger information in accordance with the progress of the AV content corresponding to the video data supplied from the video data acquisition unit 117, and supplies the trigger information to the video encoder 118 or the multiplexing unit 120.
- the video encoder 118 can embed the trigger information supplied from the trigger information generation unit 119 in the video data and encode the video data.
- the multiplexing unit 120 multiplexes the signaling information from the signaling information processing unit 112, the metadata from the metadata processing unit 114, the audio data from the audio encoder 116, and the video data from the video encoder 118, and supplies the resulting BBP stream to the transmission unit 121.
- When the trigger information is supplied from the trigger information generation unit 119, the multiplexing unit 120 further multiplexes the trigger information with the audio data and the video data to generate the BBP stream.
- the metadata does not necessarily need to be transmitted when the distribution route is communication. In this case, the metadata does not have to be included in the BBP stream.
- the application may be transmitted by being included in the BBP stream.
- the transmission unit 121 transmits the BBP stream supplied from the multiplexing unit 120 as a digital broadcast signal via the antenna 122.
- In the above, the case where the trigger information is embedded in the video data and the case where the trigger information is multiplexed into the BBP stream have been illustrated, but the trigger information may be arranged by other methods, for example by embedding the trigger information in the audio data.
- FIG. 26 is a diagram illustrating a configuration example of the reception apparatus in FIG.
- the receiving apparatus 20 includes a tuner 212, a demultiplexing unit 213, an audio decoder 214, an audio output unit 215, a video decoder 216, a video output unit 217, a control unit 218, a memory 219, an input unit 220, a communication unit 221, An application engine 222 and a cache memory 223 are included.
- the tuner 212 selects and demodulates a digital broadcast signal received via the antenna 211, and supplies a BBP stream obtained as a result to the demultiplexing unit 213.
- the demultiplexing unit 213 separates the BBP stream supplied from the tuner 212 into audio data, video data, signaling information, and metadata.
- the demultiplexing unit 213 supplies audio data to the audio decoder, video data to the video decoder, and signaling information and metadata to the control unit 218, respectively.
- the audio decoder 214 decodes the audio data supplied from the demultiplexing unit 213 by a decoding method corresponding to the encoding method by the audio encoder 116 (FIG. 25), and the resulting audio data is converted into the audio output unit 215 and It supplies to the control part 218.
- the audio output unit 215 outputs the audio data supplied from the audio decoder 214 to a speaker (not shown).
- the speaker outputs sound corresponding to the audio data supplied from the audio output unit 215.
- the video decoder 216 decodes the video data supplied from the demultiplexing unit 213 by a decoding method corresponding to the encoding method by the video encoder 118 (FIG. 25), and the resulting video data is converted into the video output unit 217 and It supplies to the control part 218.
- the video output unit 217 outputs the video data supplied from the video decoder 216 to a display (not shown).
- the display displays video corresponding to the video data supplied from the video output unit 217.
- the control unit 218 controls the operation of each unit of the receiving device 20 such as the tuner 212, the demultiplexing unit 213, and the communication unit 221.
- the memory 219 stores various data supplied from the control unit 218.
- the input unit 220 receives an operation from the user and supplies an operation signal corresponding to the operation to the control unit 218.
- control unit 218 acquires signaling information and metadata supplied from the demultiplexing unit 213. Further, the control unit 218 acquires trigger information or fingerprint information based on the audio data supplied from the audio decoder 214 or the video data supplied from the video decoder 216. The control unit 218 supplies fingerprint information to the communication unit 221.
- the communication unit 221 connects to the application server 30 via the Internet 90 and requests an application according to the control from the control unit 218.
- the communication unit 221 acquires an application transmitted from the application server 30 via the Internet 90 and stores it in the cache memory 223.
- the communication unit 221 connects to the metadata server 40 via the Internet 90 and requests metadata in accordance with the control from the control unit 218.
- the communication unit 221 acquires metadata transmitted from the metadata server 40 via the Internet 90 and supplies the metadata to the control unit 218.
- the communication unit 221 connects to the ACR server 50 via the Internet 90 according to control from the control unit 218, transmits fingerprint information, and inquires about trigger information.
- the communication unit 221 acquires trigger information transmitted from the ACR server 50 via the Internet 90 and supplies the trigger information to the control unit 218.
- the control unit 218 controls the operation of the application acquired through broadcasting or communication based on the signaling information, trigger information, and metadata acquired through broadcasting or communication.
- the application engine 222 reads and executes the application held in the cache memory 223 according to the control from the control unit 218.
- the application engine 222 controls operations such as application pause (interruption), event firing, and termination according to control from the control unit 218.
- the video data of the application is supplied to the video output unit 217.
- the video output unit 217 combines the video data supplied from the application engine 222 with the video data supplied from the video decoder 216, and displays a video obtained thereby on the display.
- Note that when the application is distributed by broadcasting, the application separated by the demultiplexing unit 213 is held in the cache memory 223.
- FIG. 27 is a diagram illustrating a functional configuration example of a part that performs processing related to application control in the control unit 218 in FIG.
- the control unit 218 includes a signaling information acquisition unit 251, a trigger information acquisition unit 252, a metadata acquisition unit 253, a fingerprint information acquisition unit 254, an analysis unit 255, a media time timing unit 256, and an application control unit 257.
- the signaling information acquisition unit 251 connects to the SCS stream according to the SCS Bootstrap information, acquires the SCS signaling information, and supplies it to the analysis unit 255.
- the trigger information acquisition unit 252 constantly monitors the video data supplied from the video decoder 216, acquires the trigger information embedded in the video data, and supplies it to the analysis unit 255.
- the trigger information acquisition unit 252 supplies the media time information to the media time timing unit 256 when the media time information is included in the trigger information.
- the trigger information acquisition unit 252 also monitors the packets including the trigger information separated by the demultiplexing unit 213 and acquires the trigger information therefrom.
- the metadata acquisition unit 253 acquires metadata distributed by broadcast or communication according to the analysis result by the analysis unit 255 and supplies the metadata to the analysis unit 255.
- the fingerprint information acquisition unit 254 acquires (extracts) fingerprint information from at least one of the audio data supplied from the audio decoder 214 and the video data supplied from the video decoder 216, and supplies the fingerprint information to the communication unit 221.
- the communication unit 221 connects to the ACR server 50 via the Internet 90 and transmits fingerprint information.
- the communication unit 221 receives trigger information transmitted from the ACR server 50 and supplies the trigger information to the trigger information acquisition unit 252.
- the trigger information acquisition unit 252 acquires the trigger information supplied from the communication unit 221 and supplies it to the analysis unit 255.
- the analysis unit 255 is supplied with signaling information from the signaling information acquisition unit 251, trigger information from the trigger information acquisition unit 252, and metadata from the metadata acquisition unit 253.
- the analysis unit 255 analyzes at least one of the signaling information, trigger information, and metadata, and supplies the analysis result to the metadata acquisition unit 253 or the application control unit 257.
- the media time timing unit 256 sets the media time information supplied from the trigger information acquisition unit 252 and measures the time based on the media time information.
- the application control unit 257 controls the operation of the application by controlling the application engine 222 (FIG. 26) according to the analysis result from the analysis unit 255.
- FIG. 28 is a diagram illustrating a configuration example of each server in FIG. FIG. 28 shows a configuration example of the application server 30, the metadata server 40, and the ACR server 50.
- the application server 30 includes a control unit 311, an application generation unit 312, an application holding unit 313, and a communication unit 314.
- the control unit 311 controls the operation of each unit of the application server 30.
- the application generation unit 312 generates an application (for example, composed of an HTML file or a JPEG file) that is executed in conjunction with the AV content in accordance with the control from the control unit 311, and stores the generated application in the application storage unit 313.
- the communication unit 314 communicates with the receiving device 20 via the Internet 90 according to control from the control unit 311.
- the control unit 311 constantly monitors the communication status of the communication unit 314, and when an application is requested from the receiving device 20, acquires the application from the application holding unit 313 and supplies it to the communication unit 314.
- the communication unit 314 transmits the application to the requesting receiving device 20 via the Internet 90 in accordance with the control from the control unit 311.
- The application server 30 is configured as described above.
- the metadata server 40 includes a control unit 411, a metadata generation unit 412, a metadata holding unit 413, and a communication unit 414.
- the control unit 411 controls the operation of each unit of the metadata server 40.
- the metadata generation unit 412 generates metadata including at least one piece of information among AIT, EMT, and CCT in accordance with control from the control unit 411 and holds the metadata in the metadata holding unit 413.
- the communication unit 414 communicates with the receiving device 20 via the Internet 90 according to the control from the control unit 411.
- the control unit 411 constantly monitors the communication status of the communication unit 414, and when metadata is requested from the receiving device 20, acquires the metadata from the metadata holding unit 413 and supplies it to the communication unit 414.
- the communication unit 414 transmits metadata to the requesting receiving apparatus 20 via the Internet 90 in accordance with control from the control unit 411.
- the metadata server 40 is configured as described above.
- the ACR server 50 includes a communication unit 511, an ACR identification processing unit 512, an FP database 513, a trigger information generation unit 514, and a trigger information database 515.
- the communication unit 511 communicates with the receiving device 20 via the Internet 90.
- the communication unit 511 receives the fingerprint information and supplies the fingerprint information to the ACR identification processing unit 512.
- the ACR identification processing unit 512 collates the fingerprint information supplied from the communication unit 511 with the FP database 513 prepared in advance, and performs ACR identification processing for identifying the AV content being played back by the receiving device 20 that is the inquiry source.
- the ACR identification processing unit 512 supplies the result of the ACR identification processing to the trigger information generation unit 514.
- the fingerprint information is, for example, unique information of all or part of AV content, and a large number of unique information of AV content is registered in the FP database 513 in advance.
- In the ACR identification processing, the degree of similarity or coincidence between the received fingerprint information and the unique information registered in the FP database 513 is determined.
- a known technique disclosed by various documents or the like can be used as a method for determining the degree of similarity or the degree of coincidence.
- the trigger information generation unit 514 generates trigger information based on the result of the ACR identification process supplied from the ACR identification processing unit 512 and various types of information registered in the trigger information database 515, and supplies the trigger information to the communication unit 511. To do.
- the communication unit 511 transmits the trigger information supplied from the trigger information generation unit 514 to the receiving device 20 that is the inquiry source via the Internet 90.
- the ACR server 50 is configured as described above.
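- The server-side flow (collate the fingerprint information with the FP database, generate trigger information, and reply) can be sketched as follows (Python; an exact-match lookup is a simplification of the similarity/coincidence determination described above, and the database contents are illustrative):

    class AcrServer:
        """Sketch of the ACR server: FP database lookup followed by trigger generation."""
        def __init__(self, fp_database, trigger_database):
            self.fp_database = fp_database            # fingerprint -> content identifier
            self.trigger_database = trigger_database  # content identifier -> trigger information

        def identify(self, fingerprint):
            # A real ACR process judges similarity/coincidence; exact lookup stands in for that here.
            return self.fp_database.get(fingerprint)

        def handle_inquiry(self, fingerprint):
            content_id = self.identify(fingerprint)
            if content_id is None:
                return None
            return self.trigger_database.get(content_id)

    server = AcrServer(
        fp_database={b"fp-bytes-of-scene": "drama_ep1"},
        trigger_database={"drama_ep1": {"locator": "http://example.com/metadata",
                                        "media_time": 123.0}},
    )
    print(server.handle_inquiry(b"fp-bytes-of-scene"))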
- step S111 the signaling information generation unit 111 generates signaling information.
- step S112 the signaling information processing unit 112 processes the signaling information generated in step S111.
- step S113 the metadata generation unit 113 generates metadata.
- step S114 the metadata processing unit 114 processes the metadata generated in the process of step S113.
- step S115 the audio data acquisition unit 115 acquires audio data of AV content from an external server or the like.
- step S116 the audio encoder 116 encodes the audio data obtained in step S115 in accordance with an encoding method such as MPEG.
- step S117 the video data acquisition unit 117 acquires video data of AV content from an external server or the like.
- In step S118, the trigger information generation unit 119 generates trigger information in accordance with the progress of the AV content corresponding to the video data acquired in the process of step S117.
- step S119 the video encoder 118 encodes the video data obtained in step S117 in accordance with an encoding method such as MPEG. However, when encoding the video data, the video encoder 118 embeds the trigger information supplied from the trigger information generation unit 119 in the video data and performs the encoding.
- In step S120, the multiplexing unit 120 multiplexes the signaling information processed in step S112, the metadata processed in step S114, the audio data encoded in step S116, and the video data encoded in step S119, and supplies the resulting BBP stream to the transmission unit 121.
- step S121 the transmission unit 121 transmits the BBP stream generated in the process of step S120 as a digital broadcast signal using the IP transmission method via the antenna 122.
- the digital broadcast signal transmission process of FIG. 29 ends.
- step S211 the tuner 212 selects and demodulates a digital broadcast signal received through the antenna 211 using the IP transmission method.
- step S212 the demultiplexing unit 213 separates audio data and video data from the BBP stream demodulated in step S211.
- step S213 the audio decoder 214 decodes the audio data separated in the process of step S212 by a decoding method corresponding to the encoding method by the audio encoder 116 (FIG. 25).
- step S214 the video decoder 216 decodes the video data separated in the process of step S212 by a decoding method corresponding to the encoding method by the video encoder 118 (FIG. 25).
- step S215 the audio output unit 215 outputs the audio data decoded in the process of step S213 to a speaker (not shown).
- step S216 the video output unit 217 outputs the video data decoded in step S214 to a display (not shown). As a result, the video of the AV content is displayed on the display, and audio synchronized with the video is output from the speaker.
- When the process of step S216 is completed, the digital broadcast signal reception process of FIG. 30 ends.
- step S231 the signaling information acquisition unit 251 acquires SCS signaling information by connecting to the SCS stream in accordance with the SCS Bootstrap information.
- the SCS signaling information is analyzed by the analysis unit 255.
- step S232 the trigger information acquisition unit 252 constantly monitors the video data supplied from the video decoder 216, and acquires trigger information embedded in the video data. This trigger information is analyzed by the analysis unit 255.
- step S233 the media time timing unit 256 sets the media time information according to the analysis result of the trigger information acquired in the process of step S232, and starts measuring the time based on the media time information.
- In step S234, the metadata acquisition unit 253 acquires the metadata (AIT, EMT, CCT) distributed by broadcasting or communication according to the analysis result by the analysis unit 255. Specifically, based on the USD included in the SCS signaling information acquired in the process of step S231 and the location information included in the trigger information acquired in the process of step S232, it is determined whether the metadata distribution route is broadcast or communication.
- When the metadata is distributed by broadcasting, the metadata acquisition unit 253 connects to the SCS stream according to SDP or FDD included in the SCS signaling information, and acquires the metadata file transmitted in the FLUTE session.
- On the other hand, when the metadata is distributed by communication, the metadata acquisition unit 253 controls the communication unit 221 to connect to the metadata server 40 via the Internet 90 according to the location information included in the trigger information and acquire the metadata file. This metadata is analyzed by the analysis unit 255.
- In step S235, it is determined whether or not the time measured by the media time timing unit 256 has reached the start time of an event defined in the EMT list. If it is determined in step S235 that the measured time has not reached the start time of an event specified in the EMT list, the process returns to step S235, and the determination process in step S235 is repeated. That is, in step S235, the process proceeds to step S236 after waiting for the measured time to reach the start time of an event specified in the EMT list.
- step S236 the application control unit 257 controls the application engine 222 to execute the action of the application corresponding to the event for which the time measurement time is determined to be the start time in the determination process of step S235.
- step S237 it is determined whether there is an event to be executed remaining in the EMT list. If it is determined in step S237 that there are still events to be executed, the process returns to step S235, and the subsequent processes are repeated.
- For example, the EMT defines, as action information for application 1 (App1), prefetch (Pref) at time T0, execute (Exec) at time T1, inject event (Inj_A_E) at time T2, suspend (Susp) at time T4, execute (Exec) at time T5, and kill (Kill) at time T6, and, as action information for application 2 (App2), prefetch (Pref) at time T3, execute (Exec) at time T4, and kill (Kill) at time T5.
- In this case, the processes of steps S235 to S237 are repeated, and at the timing when the time measured by the media time timing unit 256 coincides with each of the times T0 to T6, the corresponding actions, such as the prefetch action and execute action for the application 1 and the prefetch action for the application 2, are sequentially executed.
- If it is determined in step S237 that there are no more events to be executed, the application control process linked to the recorded program in FIG. 31 ends.
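- Steps S235 to S237 amount to a loop that waits for the measured media time to reach each start time in the EMT and then executes the corresponding action, roughly as in the following sketch (Python; the media-time clock is simulated and the EMT entries are illustrative):

    def run_emt(emt, current_media_time, dispatch):
        """emt: list of (time, app_id, action) tuples taken from the EMT list.
        current_media_time: callable returning the time measured from the media time information.
        dispatch: callable executing an application action (prefetch, execute, ...)."""
        pending = sorted(emt)                 # events still to be executed
        while pending:                        # step S237: are events left?
            t, app_id, action = pending[0]
            if current_media_time() >= t:     # step S235: start time reached?
                dispatch(app_id, action)      # step S236: execute the action
                pending.pop(0)

    # Simulated media time advancing by 1 per call, for illustration only
    clock = iter(range(100))
    emt = [(0, "App1", "prefetch"), (3, "App1", "execute"), (5, "App2", "prefetch")]
    run_emt(emt, lambda: next(clock), lambda app_id, action: print(app_id, action))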
- Note that it is assumed that the receiving device 20 receives a digital broadcast signal from the transmitting device 10 and reproduces a live program such as a sports broadcast as AV content.
- step S251 the signaling information acquisition unit 251 acquires SCS signaling information by connecting to the SCS stream according to the SCS Bootstrap information.
- the SCS signaling information is analyzed by the analysis unit 255.
- step S252 the trigger information acquisition unit 252 constantly monitors the video data supplied from the video decoder 216, and acquires trigger information embedded in the video data. This trigger information is analyzed by the analysis unit 255.
- In step S253, the metadata acquisition unit 253 acquires the metadata (AIT, CCT) distributed by broadcasting or communication according to the analysis result by the analysis unit 255. Specifically, based on the USD included in the SCS signaling information acquired in the process of step S251 and the location information included in the trigger information acquired in the process of step S252, it is determined whether the metadata distribution route is broadcast or communication.
- When the metadata is distributed by broadcasting, the metadata acquisition unit 253 connects to the SCS stream according to SDP or FDD included in the SCS signaling information, and acquires the metadata file transmitted in the FLUTE session.
- On the other hand, when the metadata is distributed by communication, the metadata acquisition unit 253 controls the communication unit 221 to connect to the metadata server 40 via the Internet 90 according to the location information included in the trigger information and acquire the metadata file. This metadata is analyzed by the analysis unit 255.
- In step S254, the application control unit 257 acquires the application distributed by broadcasting or communication and causes it to be supplied to the application engine 222 in accordance with the analysis result by the analysis unit 255. Specifically, based on the USD included in the SCS signaling information acquired in the process of step S251, the application URL acquired in the process of step S253, and the application item URL (URL indicating the acquisition destination of the file held in the cache memory), it is determined whether the application distribution route is broadcast or communication.
- When the application is distributed by broadcasting, the application control unit 257 connects to the NRT stream according to SDP or FDD included in the SCS signaling information, and acquires the application file transmitted in the FLUTE session.
- On the other hand, when the application is distributed by communication, the application control unit 257 controls the communication unit 221 to connect to the application server 30 via the Internet 90 according to the application URL or the like and acquire the application file.
- the application acquired in this way is held in the cache memory 223.
- the application file held in the cache memory 223 is managed based on the capacity of the cache memory 223 and the CCT as cache control information.
- step S255 the trigger information acquisition unit 252 constantly monitors the video data supplied from the video decoder 216, and determines whether the trigger information embedded in the video data has been acquired. If it is determined in step S255 that the trigger information has been acquired, the process proceeds to step S256.
- step S256 the application control unit 257 controls the application engine 222 to execute the action of the application included in the trigger information acquired in the process of step S255. Note that when the process of step S256 ends, the process proceeds to step S257. If it is determined in step S255 that the trigger information has not been acquired, the process in step S256 is skipped, and the process proceeds to step S257.
- step S257 it is determined whether or not the live program being played has ended. If it is determined in step S257 that the live program being reproduced has not ended, the process returns to step S255, and the subsequent processes are repeated.
- If it is determined in step S257 that the live program has ended, the application control process linked to the live program in FIG. 32 ends.
- the flow of application control processing linked to live programs has been described above.
- the application control process linked to this live program corresponds to the use cases 2 and 5 described above.
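- Steps S255 to S257 correspond to a loop that waits for trigger information during playback and executes the action it carries, roughly as below (Python; the trigger source and the playback state are abstracted into iterators for illustration):

    def live_program_loop(trigger_source, dispatch, program_is_playing):
        """trigger_source: callable returning a trigger dict or None (step S255).
        dispatch: executes the application action named in the trigger (step S256).
        program_is_playing: callable returning False once the live program ends (step S257)."""
        while program_is_playing():
            trigger = trigger_source()
            if trigger is not None:
                dispatch(trigger["app_id"], trigger["action"])

    # Illustration: three polls, one of which carries a trigger, then the program ends
    polls = iter([None, {"app_id": "App1", "action": "execute"}, None])
    remaining = iter([True, True, True, False])
    live_program_loop(lambda: next(polls, None),
                      lambda app_id, action: print(app_id, action),
                      lambda: next(remaining, False))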
- (Hybrid application control processing) Next, the flow of the hybrid type application control process executed by the receiving device 20 of FIG. 24 will be described with reference to the flowchart of FIG. 33. Note that it is assumed that AV content such as a program is being played back on the receiving device 20.
- step S271 the signaling information acquisition unit 251 acquires the SCS signaling information by connecting to the SCS stream according to the SCS Bootstrap information.
- the SCS signaling information is analyzed by the analysis unit 255.
- step S272 the trigger information acquisition unit 252 constantly monitors the video data supplied from the video decoder 216, and acquires trigger information embedded in the video data. This trigger information is analyzed by the analysis unit 255.
- step S273 the media time timing unit 256 sets the media time information according to the analysis result of the trigger information acquired in the process of step S272, and starts measuring the time based on the media time information.
- In step S274, the metadata acquisition unit 253 acquires the metadata (AIT, EMT, CCT) distributed by broadcasting or communication according to the analysis result by the analysis unit 255. Specifically, based on the USD included in the SCS signaling information acquired in the process of step S271 and the location information included in the trigger information acquired in the process of step S272, it is determined whether the metadata distribution route is broadcast or communication.
- When the metadata is distributed by broadcasting, the metadata acquisition unit 253 connects to the SCS stream according to SDP or FDD included in the SCS signaling information, and acquires the metadata file transmitted in the FLUTE session.
- On the other hand, when the metadata is distributed by communication, the metadata acquisition unit 253 controls the communication unit 221 to connect to the metadata server 40 via the Internet 90 according to the location information included in the trigger information and acquire the metadata file. This metadata is analyzed by the analysis unit 255.
- In step S275, it is determined whether or not the time measured by the media time timing unit 256 has reached the start time of an event specified in the EMT list. If it is determined in step S275 that the measured time has reached the start time of an event specified in the EMT list, the process proceeds to step S276.
- step S276 the application control unit 257 controls the application engine 222 to execute the action of the application corresponding to the event for which it is determined in the determination process in step S275 that the time measurement has reached the start time. Note that when the process of step S276 ends, the process proceeds to step S277. If it is determined in step S275 that the time measured is not the start time of the event specified in the EMT list, the process in step S276 is skipped, and the process proceeds to step S277.
- step S277 the video data supplied from the video decoder 216 is constantly monitored to determine whether or not trigger information embedded in the video data has been acquired. If it is determined in step S277 that trigger information has been acquired, the process proceeds to step S278.
- step S278 it is determined whether or not event information is specified in the trigger information acquired in the process of step S277. If it is determined in step S278 that event information is specified in the trigger information, the process proceeds to step S279.
- step S279 the analysis unit 255 edits the EMT according to the editing content of the event information included in the trigger information.
- For example, the EMT is edited such that the execution time of the execute action for the application 1 defined in the EMT is updated from the time T1 to the time T1A, or the inject event action for the application 1 at the time T2 is deleted.
- On the other hand, if it is determined in step S278 that event information is not specified in the trigger information, the process proceeds to step S280.
- step S280 the application control unit 257 controls the application engine 222 to execute an application action included in the trigger information.
- If it is determined in step S277 that trigger information has not been acquired, steps S278 to S280 are skipped, and the process proceeds to step S281. When the process of step S279 or S280 is completed, the process also proceeds to step S281.
- step S281 it is determined whether or not the program being played has ended. If it is determined in step S281 that the program being reproduced has not ended, the process returns to step S275, and the subsequent processing is repeated. If it is determined in step S281 that the program being played has ended, the hybrid type application control process in FIG. 33 ends.
- In step S311, the control unit 311 constantly monitors the communication status of the communication unit 314 and determines whether an application has been requested by the receiving device 20. If it is determined in step S311 that an application has not been requested, the determination process of step S311 is repeated. That is, in step S311, the process waits until an application is requested by the receiving device 20 and then proceeds to step S312.
- In step S312, the communication unit 314 acquires the application held in the application holding unit 313 under the control of the control unit 311.
- In step S313, the communication unit 314 transmits the application acquired in step S312 to the requesting receiving device 20 via the Internet 90 under the control of the control unit 311.
- When the process of step S313 ends, the application distribution process of FIG. 34 ends.
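The application distribution processing of FIG. 34 is essentially a request-response loop. The following is a hypothetical sketch using Python's http.server purely for illustration; the specification does not prescribe a particular protocol stack, and APP_STORE merely stands in for the application holding unit 313.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from pathlib import Path

APP_STORE = Path("./applications")        # stand-in for the application holding unit


class ApplicationHandler(BaseHTTPRequestHandler):
    def do_GET(self) -> None:             # step S311: a request has arrived
        app_file = APP_STORE / self.path.lstrip("/")
        if not app_file.is_file():
            self.send_error(404)
            return
        data = app_file.read_bytes()      # step S312: acquire the held application
        self.send_response(200)
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)            # step S313: transmit to the receiving device


if __name__ == "__main__":
    HTTPServer(("", 8080), ApplicationHandler).serve_forever()
```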
- In step S411, the control unit 411 constantly monitors the communication status of the communication unit 414 and determines whether metadata has been requested by the receiving device 20. If it is determined in step S411 that metadata has not been requested, the determination process of step S411 is repeated. That is, in step S411, the process waits until metadata is requested by the receiving device 20 and then proceeds to step S412.
- In step S412, the communication unit 414 acquires the metadata held in the metadata holding unit 413 under the control of the control unit 411.
- In step S413, the communication unit 414 transmits the metadata acquired in step S412 to the requesting receiving device 20 via the Internet 90 under the control of the control unit 411.
- When the process of step S413 ends, the metadata distribution process of FIG. 35 ends.
- In step S511, it is determined whether an inquiry about trigger information, that is, fingerprint information, has been received from the receiving device 20. If it is determined in step S511 that fingerprint information has not been received, the determination process of step S511 is repeated. In other words, in step S511, the process proceeds to step S512 after the fingerprint information is received by the communication unit 511.
- In step S512, the ACR identification processing unit 512 performs ACR identification processing, collating the fingerprint information received in step S511 against the FP database 513 prepared in advance to identify the AV content being played back by the inquiring receiving device 20.
- In step S513, the trigger information generation unit 514 generates trigger information based on the result of the ACR identification processing obtained in step S512 and the various kinds of information registered in the trigger information database 515.
- In step S514, the communication unit 511 transmits the trigger information generated in step S513 to the inquiring receiving device 20 via the Internet 90.
- When the process of step S514 ends, the trigger information distribution process of FIG. 36 ends.
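The trigger information distribution processing of FIG. 36 reduces to a lookup of the received fingerprint against a prepared database followed by trigger generation. A hypothetical sketch, in which the matching, the database layout, and the Trigger fields are simplified placeholders:

```python
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class Trigger:
    content_id: str
    location: str      # location information for acquiring the metadata
    media_time: float  # playback position within the identified content


FP_DATABASE: Dict[str, str] = {}       # fingerprint -> content_id, prepared in advance
TRIGGER_DATABASE: Dict[str, str] = {}  # content_id -> location information


def handle_inquiry(fingerprint: str, playback_position: float) -> Optional[Trigger]:
    """Identify the content from the fingerprint (ACR) and build trigger information."""
    content_id = FP_DATABASE.get(fingerprint)                  # step S512
    if content_id is None:
        return None
    location = TRIGGER_DATABASE[content_id]
    return Trigger(content_id, location, playback_position)    # step S513
```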
- In the above description, broadcast content such as recorded programs and live programs has been described as the AV content, but the AV content may instead be communication content streamed from a streaming server (not shown) via the Internet 90.
- FIG. 37 is a diagram illustrating a hardware configuration example of a computer that executes the above-described series of processes by means of a program.
- In the computer 900, a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, and a RAM (Random Access Memory) 903 are connected to one another by a bus 904.
- An input / output interface 905 is further connected to the bus 904.
- An input unit 906, an output unit 907, a recording unit 908, a communication unit 909, and a drive 910 are connected to the input / output interface 905.
- the input unit 906 includes a keyboard, a mouse, a microphone, and the like.
- the output unit 907 includes a display, a speaker, and the like.
- the recording unit 908 includes a hard disk, a nonvolatile memory, and the like.
- the communication unit 909 includes a network interface or the like.
- the drive 910 drives a removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- In the computer 900 configured as described above, the CPU 901 loads the program stored in the ROM 902 or the recording unit 908 into the RAM 903 via the input/output interface 905 and the bus 904 and executes it, whereby the above-described series of processes is performed.
- the program executed by the computer 900 can be provided by being recorded on a removable medium 911 as a package medium, for example.
- the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- The program can be installed in the recording unit 908 via the input/output interface 905 by loading the removable medium 911 into the drive 910. The program can also be received by the communication unit 909 via a wired or wireless transmission medium and installed in the recording unit 908. Alternatively, the program can be installed in the ROM 902 or the recording unit 908 in advance.
- The processing performed by the computer according to the program does not necessarily have to be performed in time series in the order described in the flowcharts. That is, the processing performed by the computer according to the program includes processing executed in parallel or individually (for example, parallel processing or object-based processing).
- the program may be processed by a single computer (processor) or may be distributedly processed by a plurality of computers.
- The present technology can take the following configurations.
- (1) A receiving device including: a first acquisition unit that acquires trigger information including at least location information as information for controlling the operation of an application executed in conjunction with AV (Audio Video) content; a second acquisition unit that acquires application control information for controlling the operation of the application; and a control unit that controls the operation of the application based on the trigger information and the application control information.
- (2) The receiving device according to (1), in which the trigger information includes time information serving as a reference of the time at which the operation of the application is controlled, the second acquisition unit acquires schedule control information that defines the operation of the application in time series, and, when the time measured with reference to the time information has passed a time specified in the schedule control information, the control unit controls the operation of the application according to the action information for the application corresponding to that time.
- (3) The receiving device according to (2), in which the application is composed of a plurality of files, the second acquisition unit acquires cache control information for controlling the caching of the file group constituting the application, and the control unit causes a cache memory to hold the file group constituting the application based on the cache control information.
- (4) The receiving device according to (2) or (3), in which the trigger information includes editing information for editing the contents defined in the schedule control information, and the control unit edits the schedule control information based on the editing information.
- (5) The receiving device according to (3) or (4), in which the location information is information for acquiring the application control information, the schedule control information, and the cache control information, and the application control information, the schedule control information, and the cache control information are associated with one another by identification information of the application.
- (6) The receiving device according to (1), in which the trigger information includes action information for the application, and, when the trigger information is acquired, the control unit controls the operation of the application according to the action information included in the trigger information.
- (7) The receiving device according to (6), in which the application is composed of a plurality of files, the second acquisition unit acquires cache control information for controlling the caching of the file group constituting the application, and the control unit causes a cache memory to hold the file group constituting the application based on the cache control information.
- (8) The receiving device according to (7), in which the location information is information for acquiring the application control information and the cache control information, and the trigger information, the application control information, and the cache control information are associated with one another by identification information of the application.
- (9) The receiving device according to any one of (1) to (8), in which the AV content is broadcast content transmitted by a digital broadcast signal, the trigger information is distributed by being included in a digital broadcast signal or is distributed from a server on the Internet, and the first acquisition unit acquires the trigger information distributed by broadcasting or by communication.
- (10) A receiving method of a receiving device, including steps in which the receiving device acquires trigger information including at least location information as information for controlling the operation of an application executed in conjunction with AV content, acquires application control information for controlling the operation of the application, and controls the operation of the application based on the trigger information and the application control information.
- (11) A transmitting device including: an acquisition unit that acquires AV content; a first generation unit that generates trigger information including at least location information as information for controlling the operation of an application executed in conjunction with the AV content; a second generation unit that generates application control information for controlling the operation of the application; and a transmission unit that transmits the trigger information and the application control information together with the AV content.
- (12) The transmitting device according to (11), in which the first generation unit generates the trigger information including time information serving as a reference of the time at which the operation of the application is controlled, the second generation unit generates schedule control information that defines the operation of the application in time series, and the transmission unit transmits the trigger information including the time information and the schedule control information.
- (13) The transmitting device according to (12), in which the application is composed of a plurality of files, the second generation unit generates cache control information for controlling the caching of the file group constituting the application, and the transmission unit further transmits the cache control information.
- (14) The transmitting device according to (12) or (13), in which the first generation unit generates the trigger information including editing information for editing the contents defined in the schedule control information, and the transmission unit transmits the trigger information including the editing information.
- (15) The transmitting device according to (13) or (14), in which the location information is information for acquiring the application control information, the schedule control information, and the cache control information, and the application control information, the schedule control information, and the cache control information are associated with one another by identification information of the application.
- (16) The transmitting device according to (11), in which the first generation unit generates the trigger information including action information for the application, and the transmission unit transmits the trigger information including the action information.
- (17) The transmitting device according to (16), in which the application is composed of a plurality of files, the second generation unit generates cache control information for controlling the caching of the file group constituting the application, and the transmission unit further transmits the cache control information.
- (18) The transmitting device according to (17), in which the location information is information for acquiring the application control information and the cache control information, and the trigger information, the application control information, and the cache control information are associated with one another by identification information of the application.
- (19) The transmitting device according to any one of (11) to (18), in which the AV content is broadcast content, and the transmission unit transmits the trigger information and the application control information together with the AV content by a digital broadcast signal.
- (20) A transmitting method of a transmitting device, including steps in which the transmitting device acquires AV content, generates trigger information including at least location information as information for controlling the operation of an application executed in conjunction with the AV content, generates application control information for controlling the operation of the application, and transmits the trigger information and the application control information together with the AV content.
- 1 broadcast communication system, 10 transmitting device, 20 receiving device, 30 application server, 40 metadata server, 50 ACR server, 90 Internet, 111 signaling information generation unit, 113 metadata generation unit, 115 audio data acquisition unit, 117 video data acquisition unit, 119 trigger information generation unit, 121 transmission unit, 212 tuner, 218 control unit, 221 communication unit, 222 application engine, 223 cache memory, 251 signaling information acquisition unit, 252 trigger information acquisition unit, 253 metadata acquisition unit, 254 fingerprint information acquisition unit, 255 analysis unit, 256 media time timing unit, 257 application control unit, 311 control unit, 313 application holding unit, 314 communication unit, 411 control unit, 413 metadata holding unit, 414 communication unit, 511 communication unit, 512 ACR identification processing unit, 514 trigger information generation unit, 900 computer, 901 CPU
Description
- 2. Overview of digital broadcasting by the IP transmission scheme
- 3. Specific use cases
- (1) Use case 1: Control of an application linked to a recorded program
- (2) Use case 2: Control of an application linked to a live program
- (3) Use case 3: Control of a hybrid application
- (4) Use case 4: Control of an application linked to a recorded program (ACR-compatible)
- (5) Use case 5: Control of an application linked to a live program (ACR-compatible)
- (6) Use case 6: Control of an application using SCS signaling information
- 4. System configuration
- 5. Flow of processing executed by each device
- 6. Computer configuration
FIG. 1 is a diagram showing the configuration of trigger information.
FIG. 4 is a diagram showing a description example of trigger information.
FIG. 5 is a diagram showing an example of the syntax of the AIT.
FIG. 6 is a diagram showing an example of the syntax of the EMT. The EMT is described in a markup language such as XML.
FIG. 7 is a diagram explaining an overview of the CCT.
FIG. 10 is a diagram showing the system pipe model of digital broadcasting by the IP transmission scheme.
FIG. 24 is a diagram showing a configuration example of the broadcast communication system. A system here means a set of a plurality of components (devices and the like).
FIG. 25 is a diagram showing a configuration example of the transmitting device of FIG. 24.
FIG. 26 is a diagram showing a configuration example of the receiving device of FIG. 24.
FIG. 27 is a diagram showing a functional configuration example of the part of the control unit 218 of FIG. 26 that performs processing related to application control.
FIG. 28 is a diagram showing configuration examples of the servers of FIG. 24, namely the application server 30, the metadata server 40, and the ACR server 50.
First, with reference to the flowchart of FIG. 29, the flow of the digital broadcast signal transmission processing executed by the transmitting device 10 of FIG. 24 will be described.
Next, with reference to the flowchart of FIG. 30, the flow of the digital broadcast signal reception processing executed by the receiving device 20 of FIG. 24 will be described.
Next, with reference to the flowchart of FIG. 31, the flow of the application control processing linked to a recorded program, executed by the receiving device 20 of FIG. 24, will be described. Prior to this processing, it is assumed that the receiving device 20 has received the digital broadcast signal from the transmitting device 10 and is playing back a recorded program such as a drama as the AV content.
Next, with reference to the flowchart of FIG. 32, the flow of the application control processing linked to a live program, executed by the receiving device 20 of FIG. 24, will be described. Prior to this processing, it is assumed that the receiving device 20 has received the digital broadcast signal from the transmitting device 10 and is playing back a live program such as a sports broadcast as the AV content.
Next, with reference to the flowchart of FIG. 33, the flow of the hybrid application control processing executed by the receiving device 20 of FIG. 24 will be described. It is assumed that the receiving device 20 is playing back AV content such as a program.
Next, with reference to the flowchart of FIG. 34, the flow of the application distribution processing executed by the application server 30 of FIG. 24 will be described.
Next, with reference to the flowchart of FIG. 35, the flow of the metadata distribution processing executed by the metadata server 40 of FIG. 24 will be described.
Next, with reference to the flowchart of FIG. 36, the flow of the trigger information distribution processing executed by the ACR server 50 of FIG. 24 will be described.
(1)
A receiving device including: a first acquisition unit that acquires trigger information including at least location information as information for controlling the operation of an application executed in conjunction with AV (Audio Video) content; a second acquisition unit that acquires application control information for controlling the operation of the application; and a control unit that controls the operation of the application based on the trigger information and the application control information.
(2)
The receiving device according to (1), in which the trigger information includes time information serving as a reference of the time at which the operation of the application is controlled, the second acquisition unit acquires schedule control information that defines the operation of the application in time series, and, when the time measured with reference to the time information has passed a time specified in the schedule control information, the control unit controls the operation of the application according to the action information for the application corresponding to that time.
(3)
The receiving device according to (2), in which the application is composed of a plurality of files, the second acquisition unit acquires cache control information for controlling the caching of the file group constituting the application, and the control unit causes a cache memory to hold the file group constituting the application based on the cache control information.
(4)
The receiving device according to (2) or (3), in which the trigger information includes editing information for editing the contents defined in the schedule control information, and the control unit edits the schedule control information based on the editing information.
(5)
The receiving device according to (3) or (4), in which the location information is information for acquiring the application control information, the schedule control information, and the cache control information, and the application control information, the schedule control information, and the cache control information are associated with one another by identification information of the application.
(6)
The receiving device according to (1), in which the trigger information includes action information for the application, and, when the trigger information is acquired, the control unit controls the operation of the application according to the action information included in the trigger information.
(7)
The receiving device according to (6), in which the application is composed of a plurality of files, the second acquisition unit acquires cache control information for controlling the caching of the file group constituting the application, and the control unit causes a cache memory to hold the file group constituting the application based on the cache control information.
(8)
The receiving device according to (7), in which the location information is information for acquiring the application control information and the cache control information, and the trigger information, the application control information, and the cache control information are associated with one another by identification information of the application.
(9)
The receiving device according to any one of (1) to (8), in which the AV content is broadcast content transmitted by a digital broadcast signal, the trigger information is distributed by being included in a digital broadcast signal or is distributed from a server on the Internet, and the first acquisition unit acquires the trigger information distributed by broadcasting or by communication.
(10)
A receiving method of a receiving device, including steps in which the receiving device acquires trigger information including at least location information as information for controlling the operation of an application executed in conjunction with AV content, acquires application control information for controlling the operation of the application, and controls the operation of the application based on the trigger information and the application control information.
(11)
A transmitting device including: an acquisition unit that acquires AV content; a first generation unit that generates trigger information including at least location information as information for controlling the operation of an application executed in conjunction with the AV content; a second generation unit that generates application control information for controlling the operation of the application; and a transmission unit that transmits the trigger information and the application control information together with the AV content.
(12)
The transmitting device according to (11), in which the first generation unit generates the trigger information including time information serving as a reference of the time at which the operation of the application is controlled, the second generation unit generates schedule control information that defines the operation of the application in time series, and the transmission unit transmits the trigger information including the time information and the schedule control information.
(13)
The transmitting device according to (12), in which the application is composed of a plurality of files, the second generation unit generates cache control information for controlling the caching of the file group constituting the application, and the transmission unit further transmits the cache control information.
(14)
The transmitting device according to (12) or (13), in which the first generation unit generates the trigger information including editing information for editing the contents defined in the schedule control information, and the transmission unit transmits the trigger information including the editing information.
(15)
The transmitting device according to (13) or (14), in which the location information is information for acquiring the application control information, the schedule control information, and the cache control information, and the application control information, the schedule control information, and the cache control information are associated with one another by identification information of the application.
(16)
The transmitting device according to (11), in which the first generation unit generates the trigger information including action information for the application, and the transmission unit transmits the trigger information including the action information.
(17)
The transmitting device according to (16), in which the application is composed of a plurality of files, the second generation unit generates cache control information for controlling the caching of the file group constituting the application, and the transmission unit further transmits the cache control information.
(18)
The transmitting device according to (17), in which the location information is information for acquiring the application control information and the cache control information, and the trigger information, the application control information, and the cache control information are associated with one another by identification information of the application.
(19)
The transmitting device according to any one of (11) to (18), in which the AV content is broadcast content, and the transmission unit transmits the trigger information and the application control information together with the AV content by a digital broadcast signal.
(20)
A transmitting method of a transmitting device, including steps in which the transmitting device acquires AV content, generates trigger information including at least location information as information for controlling the operation of an application executed in conjunction with the AV content, generates application control information for controlling the operation of the application, and transmits the trigger information and the application control information together with the AV content.
Claims (20)
- 1. A receiving device comprising: a first acquisition unit that acquires trigger information including at least location information as information for controlling the operation of an application executed in conjunction with AV (Audio Video) content; a second acquisition unit that acquires application control information for controlling the operation of the application; and a control unit that controls the operation of the application based on the trigger information and the application control information.
- 2. The receiving device according to claim 1, wherein the trigger information includes time information serving as a reference of the time at which the operation of the application is controlled, the second acquisition unit acquires schedule control information that defines the operation of the application in time series, and, when the time measured with reference to the time information has passed a time specified in the schedule control information, the control unit controls the operation of the application according to the action information for the application corresponding to that time.
- 3. The receiving device according to claim 2, wherein the application is composed of a plurality of files, the second acquisition unit acquires cache control information for controlling the caching of the file group constituting the application, and the control unit causes a cache memory to hold the file group constituting the application based on the cache control information.
- 4. The receiving device according to claim 3, wherein the trigger information includes editing information for editing the contents defined in the schedule control information, and the control unit edits the schedule control information based on the editing information.
- 5. The receiving device according to claim 4, wherein the location information is information for acquiring the application control information, the schedule control information, and the cache control information, and the application control information, the schedule control information, and the cache control information are associated with one another by identification information of the application.
- 6. The receiving device according to claim 1, wherein the trigger information includes action information for the application, and, when the trigger information is acquired, the control unit controls the operation of the application according to the action information included in the trigger information.
- 7. The receiving device according to claim 6, wherein the application is composed of a plurality of files, the second acquisition unit acquires cache control information for controlling the caching of the file group constituting the application, and the control unit causes a cache memory to hold the file group constituting the application based on the cache control information.
- 8. The receiving device according to claim 7, wherein the location information is information for acquiring the application control information and the cache control information, and the trigger information, the application control information, and the cache control information are associated with one another by identification information of the application.
- 9. The receiving device according to claim 1, wherein the AV content is broadcast content transmitted by a digital broadcast signal, the trigger information is distributed by being included in a digital broadcast signal or is distributed from a server on the Internet, and the first acquisition unit acquires the trigger information distributed by broadcasting or by communication.
- 10. A receiving method of a receiving device, comprising steps in which the receiving device acquires trigger information including at least location information as information for controlling the operation of an application executed in conjunction with AV content, acquires application control information for controlling the operation of the application, and controls the operation of the application based on the trigger information and the application control information.
- 11. A transmitting device comprising: an acquisition unit that acquires AV content; a first generation unit that generates trigger information including at least location information as information for controlling the operation of an application executed in conjunction with the AV content; a second generation unit that generates application control information for controlling the operation of the application; and a transmission unit that transmits the trigger information and the application control information together with the AV content.
- 12. The transmitting device according to claim 11, wherein the first generation unit generates the trigger information including time information serving as a reference of the time at which the operation of the application is controlled, the second generation unit generates schedule control information that defines the operation of the application in time series, and the transmission unit transmits the trigger information including the time information and the schedule control information.
- 13. The transmitting device according to claim 12, wherein the application is composed of a plurality of files, the second generation unit generates cache control information for controlling the caching of the file group constituting the application, and the transmission unit further transmits the cache control information.
- 14. The transmitting device according to claim 13, wherein the first generation unit generates the trigger information including editing information for editing the contents defined in the schedule control information, and the transmission unit transmits the trigger information including the editing information.
- 15. The transmitting device according to claim 14, wherein the location information is information for acquiring the application control information, the schedule control information, and the cache control information, and the application control information, the schedule control information, and the cache control information are associated with one another by identification information of the application.
- 16. The transmitting device according to claim 11, wherein the first generation unit generates the trigger information including action information for the application, and the transmission unit transmits the trigger information including the action information.
- 17. The transmitting device according to claim 16, wherein the application is composed of a plurality of files, the second generation unit generates cache control information for controlling the caching of the file group constituting the application, and the transmission unit further transmits the cache control information.
- 18. The transmitting device according to claim 17, wherein the location information is information for acquiring the application control information and the cache control information, and the trigger information, the application control information, and the cache control information are associated with one another by identification information of the application.
- 19. The transmitting device according to claim 11, wherein the AV content is broadcast content, and the transmission unit transmits the trigger information and the application control information together with the AV content by a digital broadcast signal.
- 20. A transmitting method of a transmitting device, comprising steps in which the transmitting device acquires AV content, generates trigger information including at least location information as information for controlling the operation of an application executed in conjunction with the AV content, generates application control information for controlling the operation of the application, and transmits the trigger information and the application control information together with the AV content.
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020167007523A KR102459246B1 (ko) | 2014-08-01 | 2015-07-17 | 수신 장치, 수신 방법, 송신 장치 및 송신 방법 |
CN201580002007.XA CN105594220B (zh) | 2014-08-01 | 2015-07-17 | 接收装置、接收方法、传输装置以及传输方法 |
EP15828221.0A EP3177026B1 (en) | 2014-08-01 | 2015-07-17 | Reception device, reception method, transmission device, and transmission method |
MX2016003752A MX369424B (es) | 2014-08-01 | 2015-07-17 | Dispositivo de recepción, método de recepción, dispositivo de transmisión y método de transmisión. |
JP2016510532A JPWO2016017451A1 (ja) | 2014-08-01 | 2015-07-17 | 受信装置、受信方法、送信装置、及び、送信方法 |
CA2924036A CA2924036C (en) | 2014-08-01 | 2015-07-17 | Receiving device, receiving method, transmitting device, and transmitting method |
US14/916,507 US11528539B2 (en) | 2014-08-01 | 2015-07-17 | Receiving device, receiving method, transmitting device, and transmitting method |
US17/983,101 US11889163B2 (en) | 2014-08-01 | 2022-11-08 | Receiving device, receiving method, transmitting device, and transmitting method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-158231 | 2014-08-01 | ||
JP2014158231 | 2014-08-01 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/916,507 A-371-Of-International US11528539B2 (en) | 2014-08-01 | 2015-07-17 | Receiving device, receiving method, transmitting device, and transmitting method |
US17/983,101 Continuation US11889163B2 (en) | 2014-08-01 | 2022-11-08 | Receiving device, receiving method, transmitting device, and transmitting method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016017451A1 true WO2016017451A1 (ja) | 2016-02-04 |
Family
ID=55217356
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/070498 WO2016017451A1 (ja) | 2014-08-01 | 2015-07-17 | 受信装置、受信方法、送信装置、及び、送信方法 |
Country Status (8)
Country | Link |
---|---|
US (2) | US11528539B2 (ja) |
EP (1) | EP3177026B1 (ja) |
JP (1) | JPWO2016017451A1 (ja) |
KR (1) | KR102459246B1 (ja) |
CN (1) | CN105594220B (ja) |
CA (1) | CA2924036C (ja) |
MX (1) | MX369424B (ja) |
WO (1) | WO2016017451A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3232668A4 (en) | 2014-12-10 | 2018-06-13 | LG Electronics Inc. | Broadcast signal transmission device, broadcast signal reception device, broadcast signal transmission method and broadcast signal reception method |
KR102014800B1 (ko) | 2015-07-06 | 2019-08-27 | 엘지전자 주식회사 | 방송 신호 송신 장치, 방송 신호 수신 장치, 방송 신호 송신 방법, 및 방송 신호 수신 방법 |
WO2017144126A1 (en) * | 2016-02-22 | 2017-08-31 | Telefonaktiebolaget Lm Ericsson (Publ) | A method and entity for audience measurement |
Family Cites Families (62)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3338810B2 (ja) | 1999-11-09 | 2002-10-28 | エヌイーシーケーブルメディア株式会社 | ディジタルテレビ受信機の番組選択装置およびその周波数変換データ取得方法 |
US20020147984A1 (en) * | 2000-11-07 | 2002-10-10 | Tomsen Mai-Lan | System and method for pre-caching supplemental content related to a television broadcast using unprompted, context-sensitive querying |
US20040059775A1 (en) * | 2002-09-19 | 2004-03-25 | International Business Machines Corporation | Method, system and computer program product for remotely building and delivering customized multimedia presentations |
US7516472B2 (en) * | 2003-02-28 | 2009-04-07 | Microsoft Corporation | Using broadcast television as a portal for video on demand solicitation |
US20040237102A1 (en) * | 2003-03-07 | 2004-11-25 | Richard Konig | Advertisement substitution |
KR100849842B1 (ko) * | 2003-12-23 | 2008-08-01 | 삼성전자주식회사 | 방송용 어플리케이션을 실행하는 장치 및 실행 방법 |
US20050246757A1 (en) * | 2004-04-07 | 2005-11-03 | Sandeep Relan | Convergence of network file system for sharing multimedia content across several set-top-boxes |
US7784071B2 (en) * | 2005-06-01 | 2010-08-24 | General Instrument Corporation | Method and apparatus for linking a plurality of user devices to a service location |
US7966392B2 (en) * | 2005-07-29 | 2011-06-21 | Cyber Solutions Inc. | Network management system and network management information collecting method |
US7530088B2 (en) * | 2005-09-20 | 2009-05-05 | International Business Machines Corporation | Topology based proximity validation for broadcast content |
US8078500B2 (en) * | 2005-12-27 | 2011-12-13 | The Pen | Security enhancements using fixed unique system identifiers for executing a transaction |
US8196169B1 (en) * | 2006-09-18 | 2012-06-05 | Nvidia Corporation | Coordinate-based set top box policy enforcement system, method and computer program product |
US7894370B2 (en) * | 2007-03-09 | 2011-02-22 | Nbc Universal, Inc. | Media content distribution system and method |
JP4946592B2 (ja) * | 2007-04-20 | 2012-06-06 | 株式会社日立製作所 | ダウンロード装置および方法、コンテンツ送受信システムおよび方法 |
JP5024610B2 (ja) * | 2007-05-31 | 2012-09-12 | ソニー株式会社 | 情報処理システム、情報処理装置、情報処理方法、及びプログラム |
JP4819161B2 (ja) * | 2007-08-07 | 2011-11-24 | パナソニック株式会社 | ネットワークavコンテンツ再生システム、サーバ、プログラムおよび記録媒体 |
EP2188996A2 (en) * | 2007-08-16 | 2010-05-26 | ST-NXP Wireless (Holding) AG | Digital video broadcast receiver and method for receiving digital video broadcast data |
JP4730626B2 (ja) * | 2008-06-13 | 2011-07-20 | ソニー株式会社 | コンテンツ供給装置、コンテンツ供給方法、およびプログラム |
US20100017820A1 (en) * | 2008-07-18 | 2010-01-21 | Telephoto Technologies Inc. | Realtime insertion of video content in live broadcasting |
US8321904B2 (en) * | 2008-11-05 | 2012-11-27 | At&T Intellectual Property I, L.P. | System and method to enable access to broadband services |
US9641889B2 (en) * | 2009-07-31 | 2017-05-02 | Bce Inc. | Method and system for controlling media conveyance by a device to a user based on current location of the device |
US9191624B2 (en) * | 2009-08-26 | 2015-11-17 | At&T Intellectual Property I, L.P. | System and method to determine an authorization of a wireless set-top box device to receive media content |
KR101669287B1 (ko) * | 2009-09-01 | 2016-11-09 | 삼성전자주식회사 | 제 3의 원격 유저 인터페이스 장치를 통한 원격 유저 인터페이스 장치의 제어 방법 및 장치 |
US20110060993A1 (en) * | 2009-09-08 | 2011-03-10 | Classified Ventures, Llc | Interactive Detailed Video Navigation System |
EP2495964B1 (en) * | 2009-10-30 | 2020-11-18 | Maxell, Ltd. | Content-receiving device |
JP4878642B2 (ja) * | 2009-12-15 | 2012-02-15 | シャープ株式会社 | コンテンツ配信システム、コンテンツ配信装置、コンテンツ再生端末およびコンテンツ配信方法 |
EP2365711B1 (de) * | 2010-03-12 | 2016-02-10 | Siemens Aktiengesellschaft | Drahtlosnetzwerk, insbesondere für Automatisierungs-, Echtzeit- und/oder Industrie-Anwendungen |
US20110296472A1 (en) * | 2010-06-01 | 2011-12-01 | Microsoft Corporation | Controllable device companion data |
JP5765558B2 (ja) * | 2010-08-27 | 2015-08-19 | ソニー株式会社 | 受信装置、受信方法、送信装置、送信方法、プログラム、および放送システム |
JP5703664B2 (ja) * | 2010-09-30 | 2015-04-22 | ソニー株式会社 | 受信装置、受信方法、送信装置、送信方法、プログラム、および放送システム |
KR101980712B1 (ko) * | 2011-02-15 | 2019-05-21 | 엘지전자 주식회사 | 방송 서비스 전송 방법, 그 수신 방법 및 그 수신 장치 |
CN102883344B (zh) * | 2011-07-15 | 2015-05-27 | 华为终端有限公司 | 一种无线网络管理消息交互的方法及装置 |
CN103026681B (zh) * | 2011-08-01 | 2015-06-17 | 华为技术有限公司 | 基于视频的增值业务实现方法、服务器和系统 |
RU2594000C2 (ru) * | 2011-08-05 | 2016-08-10 | Сони Корпорейшн | Приемное устройство, способ приема, носитель записи и система обработки информации |
US20110296452A1 (en) * | 2011-08-08 | 2011-12-01 | Lei Yu | System and method for providing content-aware persistent advertisements |
EP3439294B1 (en) * | 2011-08-24 | 2023-06-14 | Saturn Licensing LLC | Reception apparatus, reception method, program and information processing system |
TWI528749B (zh) * | 2011-09-06 | 2016-04-01 | Sony Corp | A signal receiving device, a signal receiving method, an information processing program and an information processing system |
US8776145B2 (en) * | 2011-09-16 | 2014-07-08 | Elwha Llc | In-transit electronic media with location-based content |
US9219950B2 (en) * | 2011-11-01 | 2015-12-22 | Sony Corporation | Reproduction apparatus, reproduction method, and program |
US8869196B2 (en) * | 2011-11-18 | 2014-10-21 | Verizon Patent And Licensing Inc. | Programming based interactive content |
EP2786562B1 (en) * | 2011-11-29 | 2017-06-21 | Nagravision S.A. | Method and system to confirm co-location of multiple devices within a geographic area |
CN103167046B (zh) * | 2011-12-09 | 2017-04-12 | 华为技术有限公司 | 获取组播地址的方法、装置及系统 |
US9113230B2 (en) * | 2011-12-21 | 2015-08-18 | Sony Corporation | Method, computer program, and reception apparatus for delivery of supplemental content |
US10271081B2 (en) * | 2012-03-07 | 2019-04-23 | The Directv Group, Inc. | Method and system for detecting unauthorized use of a set top box using satellite signal identification |
JP5896222B2 (ja) * | 2012-03-21 | 2016-03-30 | ソニー株式会社 | 端末装置、中継装置、情報処理方法、プログラム、およびコンテンツ識別システム |
US9936231B2 (en) * | 2012-03-21 | 2018-04-03 | Saturn Licensing Llc | Trigger compaction |
US20150161632A1 (en) * | 2012-06-15 | 2015-06-11 | Anthony W. Humay | Intelligent social polling platform |
MX338057B (es) * | 2012-06-19 | 2016-04-01 | Sony Corp | Extensiones a la tabla de parametros del activador para television interactiva. |
JP6019442B2 (ja) * | 2012-06-22 | 2016-11-02 | 株式会社アウトスタンディングテクノロジー | 空間光伝送を使用するコンテンツ提供システム |
US9154840B2 (en) * | 2012-07-31 | 2015-10-06 | Sony Corporation | Reception apparatus, reception method, transmission apparatus, and transmission method |
CN104662925B (zh) | 2012-09-12 | 2018-12-04 | Lg电子株式会社 | 处理交互服务的设备和方法 |
US9936256B2 (en) * | 2012-11-28 | 2018-04-03 | Saturn Licensing Llc | Receiver, reception method, transmitter and transmission method |
CN103024450B (zh) * | 2012-12-10 | 2016-09-14 | 惠州Tcl移动通信有限公司 | 一种通过nfc技术实现互动电视的方法及系统 |
US9124911B2 (en) * | 2013-02-15 | 2015-09-01 | Cox Communications, Inc. | Storage optimization in a cloud-enabled network-based digital video recorder |
EP2802152B1 (en) * | 2013-05-07 | 2017-07-05 | Nagravision S.A. | Method for secure processing a stream of encrypted digital audio / video data |
JP6616064B2 (ja) * | 2013-07-25 | 2019-12-04 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | 送信方法および受信方法 |
US9231718B2 (en) * | 2013-08-08 | 2016-01-05 | EchoStar Technologies, L.L.C. | Use of television satellite signals to determine location |
JP6625318B2 (ja) * | 2013-08-29 | 2019-12-25 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 送信方法および受信方法 |
US9848232B2 (en) * | 2014-01-28 | 2017-12-19 | Echostar Technologies L.L.C. | Acquiring network configuration data for a television receiver |
WO2015182490A1 (ja) * | 2014-05-30 | 2015-12-03 | ソニー株式会社 | 受信装置、受信方法、送信装置、及び、送信方法 |
EP3235329B1 (en) * | 2014-12-19 | 2022-04-13 | Nokia Solutions and Networks Oy | Proximity services device-to-device communication services control |
KR102653289B1 (ko) * | 2016-01-15 | 2024-04-02 | 소니그룹주식회사 | 수신 장치, 송신 장치 및 데이터 처리 방법 |
-
2015
- 2015-07-17 JP JP2016510532A patent/JPWO2016017451A1/ja active Pending
- 2015-07-17 CN CN201580002007.XA patent/CN105594220B/zh active Active
- 2015-07-17 US US14/916,507 patent/US11528539B2/en active Active
- 2015-07-17 EP EP15828221.0A patent/EP3177026B1/en active Active
- 2015-07-17 WO PCT/JP2015/070498 patent/WO2016017451A1/ja active Application Filing
- 2015-07-17 MX MX2016003752A patent/MX369424B/es active IP Right Grant
- 2015-07-17 CA CA2924036A patent/CA2924036C/en active Active
- 2015-07-17 KR KR1020167007523A patent/KR102459246B1/ko active IP Right Grant
-
2022
- 2022-11-08 US US17/983,101 patent/US11889163B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140040965A1 (en) * | 2012-07-31 | 2014-02-06 | Sony Corporation | Receiving device, receiving method, transmitting device, and transmitting method |
US20140043540A1 (en) * | 2012-08-13 | 2014-02-13 | Sony Corporation | Reception apparatus, reception method, transmission apparatus, and transmission method |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2017141701A1 (ja) * | 2016-02-15 | 2018-12-13 | ソニー株式会社 | 受信装置、送信装置、及び、データ処理方法 |
JP2023525655A (ja) * | 2021-04-20 | 2023-06-19 | テンセント・アメリカ・エルエルシー | イベントメッセージトラックにおけるイベント間の拡張された関係シグナリング |
JP7513748B2 (ja) | 2021-04-20 | 2024-07-09 | テンセント・アメリカ・エルエルシー | イベントメッセージトラックにおけるイベント間の拡張された関係シグナリング |
Also Published As
Publication number | Publication date |
---|---|
CN105594220A (zh) | 2016-05-18 |
EP3177026A1 (en) | 2017-06-07 |
CA2924036A1 (en) | 2016-02-04 |
US20230071040A1 (en) | 2023-03-09 |
US11889163B2 (en) | 2024-01-30 |
MX2016003752A (es) | 2016-06-30 |
CN105594220B (zh) | 2020-08-07 |
US20160205449A1 (en) | 2016-07-14 |
EP3177026A4 (en) | 2018-02-21 |
KR102459246B1 (ko) | 2022-10-27 |
US11528539B2 (en) | 2022-12-13 |
KR20170039070A (ko) | 2017-04-10 |
EP3177026B1 (en) | 2021-03-31 |
CA2924036C (en) | 2023-08-22 |
JPWO2016017451A1 (ja) | 2017-05-18 |
MX369424B (es) | 2019-11-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102040623B1 (ko) | 양방향 서비스를 처리하는 장치 및 방법 | |
US11889163B2 (en) | Receiving device, receiving method, transmitting device, and transmitting method | |
US9912971B2 (en) | Apparatus and method for processing an interactive service | |
JP6225109B2 (ja) | 受信装置、受信方法、送信装置、及び送信方法 | |
KR101939296B1 (ko) | 양방향 서비스를 처리하는 장치 및 방법 | |
KR20150073987A (ko) | 양방향 서비스를 처리하는 장치 및 방법 | |
KR20150067148A (ko) | 수신 장치, 수신 방법, 송신 장치 및 송신 방법 | |
CN107517411B (zh) | 一种基于GStreamer框架的视频播放方法 | |
US20210152875A1 (en) | Reception apparatus, reception method, transmission apparatus, and transmission method for controlling termination of application | |
KR20170141677A (ko) | 수신 장치, 송신 장치 및 데이터 처리 방법 | |
KR102440142B1 (ko) | 수신 장치, 수신 방법, 송신 장치, 및 송신 방법 | |
WO2016035589A1 (ja) | 受信装置、受信方法、送信装置、及び、送信方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2016510532 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14916507 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2924036 Country of ref document: CA |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15828221 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20167007523 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2016/003752 Country of ref document: MX |
|
REEP | Request for entry into the european phase |
Ref document number: 2015828221 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015828221 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |