JP5978000B2 - Receiver - Google Patents

Receiver

Info

Publication number
JP5978000B2
Authority
JP
Japan
Prior art keywords
application
unit
receiver
server
broadcast
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2012112969A
Other languages
Japanese (ja)
Other versions
JP2013066160A (en)
Inventor
馬場 秋継
松村 欣司
茂明 三矢
秀 武智
藤沢 寛
俊二 砂崎
浩行 浜田
Original Assignee
日本放送協会
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2011184565
Application filed by 日本放送協会
Priority to JP2012112969A
Priority claimed from EP12828400.7A (external priority)
Publication of JP2013066160A
Publication of JP5978000B2
Application granted
Legal status: Active (current)
Anticipated expiration

Description

  The present invention relates to a receiver.

In recent years, research has been conducted toward realizing broadcasting/communication cooperation services. As an example, a broadcasting/communication cooperation service is a service in which a receiver used by a viewer receives broadcast program content broadcast from a broadcasting station and distribution content delivered from a service provider over a telecommunication line such as the Internet, and reproduces the broadcast program content and the distribution content in cooperation with each other.
The receiver can receive the broadcasting/communication cooperation service by executing an application that displays and outputs the broadcast program content and the distribution content.
In addition, attempts have been made to let a viewer receive a broadcasting/communication cooperation service on a communication terminal by operating the receiver and the communication terminal in cooperation.

  Conventionally, application execution environments in digital broadcasting have been standardized and put into operation (see, for example, Non-Patent Document 1). This standard describes technical matters concerning the use of network devices, so that devices connected to a network and the functions of those devices can be used in conjunction with data broadcast content.

"Application Execution Environment Standard for Digital Broadcasting", ARIB STD-B23, 1.2 edition, Japan Radio Industry Association, July 2009

When a receiver and a communication terminal are linked, there is a demand to link an application executed by the receiver with an application executed by the communication terminal. However, some languages for client-side applications, such as HTML5, cannot describe an application that operates as the communication parent (server side, host side).
That is, when the applications executed by the receiver and the communication terminal are each written in a language such as HTML5, both are client-side applications, and there is a problem in that they cannot communicate with each other.

  The present invention has been made in view of the above circumstances, and an object thereof is to provide a receiver capable of causing an application executed by the receiver and an application executed by a communication terminal to operate in cooperation regardless of the language specification of the applications.

[1] In order to solve the above problems, a receiver according to the present invention includes: a broadcast receiving unit that receives a broadcast signal; a separation unit that separates a broadcast stream from the broadcast signal received by the broadcast receiving unit; an application information acquisition unit that acquires, from the broadcast stream separated by the separation unit, information on an application to be executed by the receiver itself; an application execution unit that executes the application indicated by the information acquired by the application information acquisition unit; a server unit that receives requests output by the execution of applications by the application execution unit and by a terminal that executes an application; a connect unit that establishes connections between the server unit and the application execution unit and between the server unit and the terminal; and a bridge unit that, via the connections established by the connect unit, outputs to the terminal a request that the server unit received from the application execution unit and outputs to the application execution unit a request that the server unit received from the terminal.
[2] In the receiver according to [1], the bridge unit determines whether the relationship between the application executed by the application execution unit and the application executed by the terminal satisfies a predetermined condition, and when the condition is satisfied, outputs to the terminal, via the connection established by the connect unit, the request that the server unit received from the application execution unit and outputs to the application execution unit the request that the server unit received from the terminal.
[3] In the receiver according to [2], when the application execution unit and the terminal each establish a connection with the server unit by executing their applications, they output to the connect unit type information indicating the type of application with which they are to cooperate, and the predetermined condition used by the bridge unit is that the type information the connect unit received from the application execution unit matches the type information received from the terminal.
[4] In the receiver according to [2] or [3], the bridge unit determines the predetermined condition when a connection is established by the connect unit.
[5] In the receiver according to any one of [2] to [4], when the bridge unit determines that the predetermined condition is satisfied, it performs processing to establish a bridge connection between the server unit and the terminal, generates identification information for identifying the bridge connection, and outputs the identification information to the application execution unit; when the server unit receives from the application execution unit a request to which the identification information is attached, it outputs the request to the terminal at the connection destination of the bridge connection identified by that identification information, and when the server unit receives a request from the terminal at the bridge connection destination to which the identification information is assigned, it outputs the request to the application execution unit.

  According to the present invention, a server unit is provided that receives requests from the application executed by the receiver and from the application executed by a terminal; the server unit can output a request received from the terminal to the application execution unit and can output a request received from the application execution unit to the terminal. That is, the receiver according to the present invention can cause the application executed by the receiver and the application executed by the communication terminal to operate in cooperation regardless of the language specification of the applications.

A block diagram showing the functional configuration of a receiving system according to an embodiment of the present invention.
A diagram showing the parties who use the broadcasting/communication cooperation system to which the present invention is applied and the relationships among them.
A diagram showing the overall configuration of the broadcasting/communication cooperation system.
A diagram showing the terminal cooperation model of the broadcasting/communication cooperation system.
A conceptual diagram of the service classification of the broadcasting/communication cooperation system.
A diagram showing an example of the text representation of the AIT used in the broadcasting/communication cooperation system.
A diagram showing the life cycle of an application in the broadcasting/communication cooperation system.
A diagram showing the data flow between providers in the broadcasting/communication cooperation system.
A diagram showing the data flow in the broadcasting/communication cooperation system as a whole.
A diagram showing the sequence of the recommendation service in the broadcasting/communication cooperation system.
A diagram showing the transfer protocol stack in the broadcasting/communication cooperation system.
A diagram showing the concept of the terminal cooperation manager.
A diagram showing an example in which a receiver and a portable terminal cooperate through WebSocket communication.
A diagram showing an example in which an application of the receiver and an application of a portable terminal cooperate through WebSocket communication.
A diagram showing the application management model in the broadcasting/communication cooperation system.
A diagram showing the functional model of the secure manager in the broadcasting/communication cooperation system.
A diagram showing the concept of the screen presentation control system in the broadcasting/communication cooperation system.
A diagram showing the basic operation model of screen presentation control in the broadcasting/communication cooperation system.
An example of screen presentation control according to the policy level in the broadcasting/communication cooperation system.
An example of presentation control when an earthquake early warning is received in the broadcasting/communication cooperation system.
An overall configuration diagram of the broadcasting/communication cooperation system according to an embodiment of the present invention.
A functional block diagram showing the internal configuration of the receiver according to the embodiment.
A block diagram showing the detailed configuration of the application execution control unit according to the embodiment.
A block diagram showing the detailed configuration of the presentation control unit according to the embodiment.
A schematic external front view when an infrared remote control is used as the operation reception unit in an embodiment of the present invention.
A sequence diagram showing the procedure of processing among the receiver according to the embodiment, a receiver application server, and a content delivery server.
A flowchart showing the processing procedure when the receiver according to the embodiment operates in response to operation of the operation reception unit.
A flowchart showing the processing procedure when the receiver according to the embodiment operates in response to operation of the operation reception unit.
A flowchart showing the processing procedure of the operation of the receiver according to the embodiment.
A sequence diagram showing the procedure of processing among the receiver according to the embodiment, a device, and a terminal application server.
A sequence diagram showing the procedure of cooperation processing between the receiver and a device.
A sequence diagram showing the cooperation procedure between applications.
A flowchart showing the procedure of the bridge determination processing by the receiver.
A diagram showing the data structure of an event information table.

Hereinafter, embodiments for carrying out the present invention will be described in detail with reference to the drawings.
One embodiment of the present invention is a receiver capable of switching, by a simple operation, between a state in which only a broadcast service is received and a state in which a stream-dependent service, which is one service form of the broadcasting/communication cooperation service, is received. The present embodiment is also a receiver that can switch the currently received broadcasting/communication cooperation service to the broadcast service under the control of the broadcasting/communication cooperation service provider. The present embodiment is also a receiver that can acquire an application to be executed in the broadcasting/communication cooperation service, and content data related to the application, from an external supply source in response to a request from the receiver itself. Further, the present embodiment is a receiver, and a receiving system including a receiver and a device (terminal), that can dynamically change the application to be executed by a device operating in cooperation with the receiver.

FIG. 1 is a block diagram showing a functional configuration of a receiving system according to an embodiment of the present invention.
As shown in the figure, the receiving system includes a receiver 4 and a device 8.
The receiver 4 includes a broadcast receiving unit 401, a separation unit 402, a communication input/output unit 411, an application execution control unit 412, an operation input unit 414, a channel selection unit 415, an external I/F unit 417, and an operation reception unit 474.
The application execution control unit 412 includes an application storage unit 431, an application control unit 434, an application execution unit 435, a resource access control unit 438, and a resource control unit 439.
The application control unit 434 includes an application information acquisition unit 472, an activation control unit 473, and an end control unit 481.
The operation input unit 414 includes an activation request signal acquisition unit 471.
The external I/F unit 417 includes a device-side server unit 491, a receiver-side server unit 492, a connect unit 493, and a bridge unit 494. A device-side server unit 491 may be provided for each device 8 to be connected, so that device-side server units 491 and devices 8 are connected on a one-to-one basis, or a single device-side server unit 491 may be provided and connected to a plurality of devices 8.
The device 8 includes a connection control unit 501, a terminal application acquisition unit 502 (shown abbreviated in the figure), and a terminal application execution unit 503 (shown abbreviated in the figure).
As described above, the device 8 is a terminal (electronic device, information processing apparatus) having a communication function, such as a mobile phone, a PDA, a smartphone, a tablet, or a personal computer.
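
The relay behavior of the external I/F unit 417 can be pictured with a small sketch. The following is a hedged, hypothetical TypeScript illustration only, not the implementation of this receiver: it assumes the Node.js "ws" library, assumes the receiver-side and device-side server units are WebSocket endpoints on arbitrary ports, and assumes that the first message sent by each application carries the type information of the application it wishes to cooperate with (see [3] above).

```typescript
// Hypothetical sketch: two WebSocket endpoints (receiver-side and device-side
// server units) and a bridge that pairs connections whose cooperation "type"
// information matches, then relays requests in both directions.
import { WebSocketServer, WebSocket } from 'ws';

interface PendingClient {
  socket: WebSocket;
  appType: string; // type information sent when the connection is established
}

const receiverSide = new WebSocketServer({ port: 8880 }); // for the application execution unit
const deviceSide = new WebSocketServer({ port: 8881 });   // for terminal (device) applications

const waitingReceiverApps: PendingClient[] = [];
const waitingDeviceApps: PendingClient[] = [];
let nextBridgeId = 1;

// Bridge unit: when the type information from both sides matches, establish a
// bridge connection, hand the identification information to the receiver-side
// application, and relay subsequent messages in both directions.
function tryBridge(): void {
  for (const r of waitingReceiverApps) {
    const d = waitingDeviceApps.find((c) => c.appType === r.appType);
    if (!d) continue;
    const bridgeId = nextBridgeId++;
    r.socket.send(JSON.stringify({ event: 'bridged', bridgeId }));
    r.socket.on('message', (msg) => d.socket.send(msg.toString()));
    d.socket.on('message', (msg) => r.socket.send(msg.toString()));
    waitingReceiverApps.splice(waitingReceiverApps.indexOf(r), 1);
    waitingDeviceApps.splice(waitingDeviceApps.indexOf(d), 1);
    return;
  }
}

function accept(queue: PendingClient[]) {
  return (socket: WebSocket) => {
    // The first message is assumed to carry the type of application to cooperate with.
    socket.once('message', (msg) => {
      queue.push({ socket, appType: msg.toString() });
      tryBridge();
    });
  };
}

receiverSide.on('connection', accept(waitingReceiverApps));
deviceSide.on('connection', accept(waitingDeviceApps));
```

This is consistent with the WebSocket-based cooperation between a receiver application and a portable terminal application illustrated in the drawings, but the port numbers, message formats, and library choice are assumptions made only for this sketch.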

[Description of Broadcast and Communication Cooperation System to which the Present Invention is Applied]
Here, a broadcasting/communication cooperation system to which the present invention is applied will be described. The broadcasting/communication cooperation system (broadcasting/communication integration system, broadcasting/communication system, transmission/reception system) to which the present invention is applied is, for example, a Hybridcast (registered trademark) system, and realizes a broadcasting/communication cooperation service (Hybridcast (registered trademark) service, broadcasting/communication integration service, broadcasting/communication service). The broadcasting/communication cooperation service realized by this system links a digital broadcasting service with a communication service using the Internet or the like. For example, in a broadcasting/communication cooperation service, a receiver such as a digital television, a personal computer, or a portable terminal simultaneously displays the display screen of a broadcast program (hereinafter also referred to as a "program") transmitted by broadcasting (hereinafter referred to as the "program display screen" or "broadcast screen") and the display screen of services and content acquired by communication by an application installed in the receiver (hereinafter referred to as the "application screen" or "application display screen").

[1. System model]
[1.1 Users of the broadcasting / communication cooperation system]
FIG. 2 is a diagram showing a person who uses the broadcasting / communication cooperation system and its relationship.
A broadcasting station, which sends out scheduled programs, distributes the programs to viewers by broadcast radio waves or over a communication network. Broadcasting stations also provide service providers with metadata related to programs in order to enhance broadcasting/communication cooperation services.

  A service provider that provides a broadcasting/communication cooperation service produces and distributes content and applications (hereinafter also referred to as "apps") for providing the broadcasting/communication cooperation service to viewers. Hereinafter, "application" by itself refers to an application for providing a broadcasting/communication cooperation service (an application of the broadcasting/communication cooperation service). The producer and the distributor of content or applications need not be the same service provider, and a broadcasting station may also serve as a service provider. A service provider can also provide link information to other service providers. A service provider can apply for registration of an application and obtain approval from the system administrator in order to indicate that the provided application is official. Approved applications are not restricted in their operation on the receiver. On the other hand, the screen displayed by an unapproved application cannot be overlaid on the program display screen or its sound, but the application display screen can be reduced and displayed outside the broadcast program screen. An approved application is called an A (Authorized) application, and an unapproved application is called a general application. The A application is also referred to as an official application, a registered application, an authenticated application, an authorized application, or an A (Authorized) type application. The general application is also referred to as an unofficial application, an unauthenticated application, an unauthorized application, a U (Unauthorized) type application, or a U application.

  The system administrator is an organization that certifies that an application (receiver application) provided to viewers is an A application (official). The system administrator's decision on whether to approve an application for which registration has been applied is made on commission from the broadcasting station.

  An application for performing various settings may be installed in the receiver. At this time, the display screen of the application in the receiver may overlap the display screen (video) of the program.

A viewer who views a program broadcast by a broadcasting station enjoys a broadcasting / communication cooperation service.
The viewer can download or start the application according to his / her will. Further, the viewer can overlap the display screen of the application with the display screen (video) of the program according to his / her intention.

[1.2 System configuration of broadcasting / communication cooperation system]
FIG. 3 is a diagram illustrating the overall configuration of the broadcast communication cooperation system. The broadcasting / communication cooperation system is configured by functionally adding a “broadcasting station server group”, a “service provider server group”, and a “receiver” to the current broadcasting station equipment using radio waves.

  The broadcasting station has broadcasting station facilities. Further, the broadcasting station configures and manages both the broadcasting station server group and the service provider server group. The service provider configures and manages a service provider server group. The system administrator manages and operates the repository server. The receiver manufacturer manufactures and sells the receiver. The viewer has a receiver and enjoys a broadcasting / communication cooperation service.

A receiver (Hybridcast (registered trademark) receiver, broadcast receiving communication device) is equipped with a standardized common API (Application Program Interface). Further, the receiver receives broadcasts of the current system such as terrestrial digital broadcast and BS (broadcasting satellite) digital broadcast.
The broadcasting station equipment multiplexes a signal for starting a broadcasting / communication cooperation service into a broadcasting wave. The multiplexing method will be described later.

[1.3 Configuration example of broadcast station server group]
The broadcasting station server group manages and distributes content and metadata held by broadcasting stations.
For example, the broadcasting station server group includes various servers, a data storage unit (DB (database)), and an API; the servers include a content management server, a viewer management server, a content distribution server, and a broadcasting station service server.

  The content management server manages programs and metadata, which are broadcast content. It includes a program management server that manages broadcast programs and a metadata management server that manages metadata related to the programs. The metadata indicates, for example, the program title, program ID, program summary, performers, staff, broadcast date and time, script, captions, and commentary.

  The viewer management server manages viewers (users), and the content distribution server distributes content data by communication. The broadcast station service server is a server for the broadcast station to provide a service to a service provider. Services provided by the broadcast station service server include, for example, a social network service operated by a broadcast station and a web log (blog) for each broadcast program.

  The data storage unit of the broadcast station server group includes a part for storing contents and metadata held by the broadcast station and a database. The stored data can be accessed only by the service provider that is managing it, and is restricted so that it cannot be accessed by others.

  The API of the broadcasting station server group is an API for providing data in response to requests from the service provider server group. Here, an API refers to a program that an application calls in order to receive a service, and its execution unit.

[1.4 Service provider server group configuration example]
The service provider server group, managed and operated by a service provider, manages and provides applications and content. The service provider server group includes a receiver application server, a service server, a content distribution server, a data storage unit (DB (database)), and an API.

The receiver application server is a server that manages an application of the broadcasting / communication cooperation service. The service provider stores, manages, and distributes applications that operate on the receiver.
Service providers are composed of groups or individuals. In response to a request from the receiver, the receiver application server notifies the receiver of the storage location of the application file (the application file will be described later) and distributes the application file.

  The service server is a server that provides a service in response to a request from an application operating on the receiver. Examples of the service server include a multilingual subtitle server, a speech speed conversion voice server, a social TV server, a recommendation server, a program review server, and a bookmark server.

  The content distribution server is a server that provides content in response to a request from an application running on the receiver. Examples of the content distribution server include a VOD (Video On Demand) distribution server, a caption distribution server, and a multi-view distribution server.

  The data storage unit of the service provider server group is a place for storing content data, metadata, data created by the service provider, viewer data, and application files. The data stored in the data storage unit can be accessed only by the service provider that is managing it, and cannot be accessed by others.

  The API of the service server group is an API for providing application files, contents, and services in response to a request from an application running on the receiver.

[1.5 Receiver]
The receiver receives and displays broadcasts of the current system and executes the broadcasting/communication cooperation service. Current broadcasting includes terrestrial digital broadcasting, satellite broadcasting such as BS digital broadcasting, and data broadcasting. The receiver is connected to the Internet.

The receiver makes an application download request to the service provider server based on the information multiplexed on the received broadcast wave. When the receiver executes an application program included in the downloaded application file, the application operates on the receiver. An application operating on the receiver accesses the service provider server and acquires content.
Further, the receiver has a broadcasting / communication cooperation function which is a function necessary for executing a broadcasting / communication cooperation service such as a synchronization function and an application control function. Since the API for the broadcasting / communication cooperation function is shared, it is easy to create an application and the application does not depend on the receiver.
The broadcasting / communication cooperation service also incorporates functions for cooperation with devices such as personal computers and portable terminals.

  The broadcasting / communication cooperation function includes a broadcasting / communication cooperation basic function and an optional function to be implemented as necessary. The receiver manufacturer implements the broadcasting / communication cooperation basic function in all receivers. The application uses the broadcasting / communication cooperation function through the API. The broadcasting / communication cooperation function operates based on an API described later.

  The API implemented by the receiver is defined so that the operation of the application is the same without depending on the receiver. Since all applications process the receiver through the API, the application cannot access functions specific to the receiver without going through the API.

[1.6 Terminal linkage model]
FIG. 4 is a diagram illustrating a terminal cooperation model of the broadcasting / communication cooperation system.
The receiver can provide a service in cooperation with a terminal such as a portable terminal. Examples of terminals to be linked include a personal computer, a mobile phone, a tablet, a smartphone, and a PDA (Personal Digital Assistant). The receiver provides, as an API, a function that can be used by other terminals as a receiver function. An API that provides a function that can be used by other terminals is called a terminal cooperation API. For example, an application running on a mobile terminal can access a broadcasting resource such as acquisition of program information or call a receiver function such as reproduction control by using the terminal cooperation API.

[1.6.1 Terminal cooperation API]
The terminal cooperation API is an API that allows other terminals, and applications running on those terminals, to use the functions of the receiver. Terminals targeted for cooperation include terminals on a home network (LAN) and terminals accessed through the Internet. The definitions of the APIs that provide the various operations are described later.

[1.6.2 Terminal cooperation API provision process]
A terminal cooperation API providing process running on the receiver serves the terminal cooperation API. This process runs residently, like a kind of daemon process.

[1.6.3 Protocol for calling API]
For example, RESTful (REST: Representational State Transfer), UPnP (Universal Plug and Play), XMPP (eXtensible Messaging and Presence Protocol), or the like is used as a protocol for calling the terminal cooperation API.
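
As an illustration of a REST-style call, the following hedged TypeScript sketch shows how an application on a terminal might query the receiver's terminal cooperation API for program information. The host, path, and response fields are assumptions introduced only for this example and are not defined by any standard cited here.

```typescript
// Hypothetical REST-style call from a terminal application to the receiver's
// terminal cooperation API. Endpoint and response shape are illustrative only.
interface ProgramInfo {
  programId: string;
  title: string;
  broadcastDateTime: string;
}

async function getCurrentProgramInfo(receiverHost: string): Promise<ProgramInfo> {
  const res = await fetch(`http://${receiverHost}/terminal-cooperation/programInfo/current`);
  if (!res.ok) {
    throw new Error(`terminal cooperation API call failed: ${res.status}`);
  }
  return (await res.json()) as ProgramInfo;
}

// Example: an application on a mobile terminal in the same home network (LAN).
getCurrentProgramInfo('192.168.11.10:8880')
  .then((info) => console.log(`Now broadcasting: ${info.title}`))
  .catch(console.error);
```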

[1.6.4 Push Notification Function]
The receiver also supports a Notification function in which a server on the Internet notifies the receiver of information by pushing. The receiver receives information notified by a push from a server or the like. Some receiver operations may be controlled by the Notification function, and the Notification function is also defined as part of the terminal cooperation API specification.
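
One possible realization of the Notification function is sketched below in TypeScript. It is a hypothetical example only: it assumes the Node.js "ws" library, a WebSocket-based push channel, and a simple JSON message format, none of which are mandated by the text above.

```typescript
// Hypothetical push listener: the receiver keeps a WebSocket open to a server
// and reacts to pushed messages such as a dynamically delivered AIT.
import { WebSocket } from 'ws';

interface PushMessage {
  kind: 'AIT' | 'info';
  payload: unknown;
}

function listenForNotifications(url: string, onAit: (ait: unknown) => void): void {
  const socket = new WebSocket(url);
  socket.on('message', (data) => {
    const msg = JSON.parse(data.toString()) as PushMessage;
    if (msg.kind === 'AIT') {
      onAit(msg.payload); // e.g. start or terminate an application as the pushed AIT instructs
    }
  });
}

// The notification server URL is an assumption for illustration.
listenForNotifications('wss://notification.example.org/receiver', (ait) => {
  console.log('received AIT by push', ait);
});
```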

[2. Broadcast communication application]
[2.1 Service and application model]
The application model of the broadcasting / communication cooperation system is a model added or changed based on the concept of the application model of DVB-GEM1.2.

[2.1.1 Broadcast / communication cooperation application]
The operation of the application of the broadcasting / communication cooperation service is classified into two patterns: an operation linked to AV (Audio Visual) content (linked) and an operation of the application alone (not linked). AV content is broadcast content (program) or communication content (VoD, etc.).

In the case of linkage, application life cycle control such as activation is performed in conjunction with broadcast or communication content. The application is activated based on an AIT (Application Information Table) (application information table, application activation information) distributed together with the AV content. In this case, in addition to startup and termination operations by the viewer, a provider of AV content such as a broadcaster can also control a life cycle such as automatic startup and termination of an application.
On the other hand, in the case of non-linkage, the application is started and terminated without being linked to broadcasting or communication content. In this case, the life cycle of the application such as the start and end of the application is controlled only by the viewer.

[2.1.2 Service]
Conventionally, a service has meant a series of programs that can be broadcast as part of a schedule organized by a broadcaster. This concept is expanded in the broadcasting/communication cooperation system, and two service types are defined: stream-dependent services and independent services.

FIG. 5 shows a conceptual diagram of service types.
In the receiver, the related applications are started by selecting (pseudo-tuning to) a stream-dependent service or an independent service.
The stream-dependent service extends the conventional service concept and is configured by adding an application (or a plurality of applications) that operates in conjunction with an AV stream transmitted by broadcasting or communication. An application can be started in conjunction with the selection/playback of the AV stream (channel selection in the case of broadcasting).
On the other hand, an independent service does not include a video/audio stream and is configured only of an application (or a plurality of applications). When the viewer selects the independent service, the application is started.

[2.1.3 Launching apps acquired on the fly and launching installed apps]
There are two methods for starting an application: a method of starting an application file acquired on the fly, and a method of starting an application file stored (installed) in a receiver in advance. On-the-fly is a method of acquiring an application file by communication when an application is executed, and is also referred to as a non-install type or a direct execution type.

  The receiver activates the application program of an application file in its local file system based on the application signaling by AIT described later. When the receiver acquires and installs an application file by communication, it rewrites the information in the location hierarchy set in the related AIT (see section 2.5.1) to the location on the local file system. Accordingly, an operation for generating a value that identifies the independent service (required for each AIT unit of an independent service) is also needed.

[2.2 Application announcement method (signaling)]
[2.2.1 Application startup information (AIT)]
An application included in a service is announced by the application activation information notified when the service is selected. The AIT defined by ARIB STD-B23 (hereinafter referred to as ARIB-J) is used as the application activation information. In both the stream-dependent service and the independent service, the AIT for the service is announced. Details of how the AIT is sent in each service are shown below.

FIG. 6 is a diagram showing an example of a text expression of AIT used in the broadcasting / communication cooperation system.
The AIT used in the broadcasting/communication cooperation system is based on the AIT defined by ARIB-J. The AIT has a binary representation for transmission in an SI (Service Information) table and a text representation (AIT File) in XML (Extensible Markup Language) format; the figure shows an example of the text representation. The AIT describes an application ID (applicationIdentifier) for identifying the application, a control code (controlCode) for controlling the application state, location information (location) indicating the storage location of the application file, and so on.
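
The fields named above can be modeled as follows. This is a simplified, hypothetical TypeScript view for illustration only; the actual AIT defined by ARIB-J contains more fields and more control codes than the three discussed in this document.

```typescript
// Simplified model of the AIT fields named in the text: an application
// identifier, a control code governing the application state, and the
// location (storage location) of the application file.
type ControlCode = 'AUTOSTART' | 'DESTROY' | 'KILLALL'; // only the codes discussed here

interface AitEntry {
  applicationIdentifier: { orgId: number; appId: number }; // assumed structure
  controlCode: ControlCode;
  location: string; // storage location of the application file
}

// Example entry matching the acquisition example in section 2.5.1; the
// identifier values are placeholders.
const exampleEntry: AitEntry = {
  applicationIdentifier: { orgId: 0x17, appId: 0x1 },
  controlCode: 'AUTOSTART',
  location: 'http://192.168.11.37/demo.jar',
};
```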

[2.2.2 Dissemination of applications linked to AV contents]
  Applications linked to AV content are announced either by multiplexing the AIT onto the AV content transmitted as an MPEG (Moving Picture Experts Group)-2 TS (Transport Stream), or by sending the AIT information separately. By transmitting the AIT in conjunction with the AV content, life cycle control, such as starting an application linked to a broadcast program or dynamically starting an application in step with the progress of the program, can be performed at the receiver.
Announcement methods include, for example: (1) addition of an AIT ES (Elementary Stream); (2) addition of a descriptor to the EIT (Event Information Table); (3) carousel transmission; (4) acquisition of the AIT file by communication; and (5) dynamic transmission of the AIT file by communication.

(1) When an ES for AIT is added, the AIT ES is multiplexed on the broadcast TS in the same manner as in ARIB-J.

(2) In the case of adding a descriptor to the EIT, a descriptor is added to the EIT (p/f), and the same information as that transmitted by the AIT is transmitted, as in the presentation control described later.

(3) In the case of transmission by carousel, the AIT is transmitted by a DSM-CC (Digital Storage Media Command and Control) data carousel; for example, the AIT file is transmitted in a specific module. Transmission in the carousel incurs some acquisition-time overhead, but requires no change to the current broadcast signal.
As an example of carousel operation, the component tag and module of the carousel that carries the broadcasting/communication cooperation activation file are fixed. For example, "AA" is set as the component tag, "0000" is set as the module ID, and a type indicating AIT is set in the Type descriptor of the module. The receiver monitors the module for updates and, when an update is detected, rereads the AIT and executes the control specified by the AIT (application life cycle control).

(4) In the case of acquiring the AIT file by communication, a separately prepared AIT file is acquired at the same time as the AV content is selected. For example, information describing the AV content to be reproduced (content ID) and the application activation information (AIT) is acquired as the starting point. The concepts of usage-unit content and entry components in server-type broadcasting (ARIB TR-B27) can be used.

  (5) In the case of dynamic AIT transmission by communication, control for starting a new application or ending an active application during reproduction of AV content is performed by the AIT transmitted by communication. When control is performed at a timing that is not assumed in advance, notification by push through communication is performed.

[2.2.3 Dissemination of applications that operate independently]
The receiver acquires an AIT including activation information of an application that operates independently by communication. Independent applications are obtained from known application repositories. The procedure for obtaining the startup information of each independent application is shown below.

(1) Set the application repository location in the receiver. It may be set in advance at the time of shipment, or a plurality of repositories may be added later by some method.
(2) When the application menu is opened, the receiver acquires a list of applications (including an AIT location description of each application) from the application repository, and displays the applications on the menu.
(3) The AIT of the application selected by the viewer is acquired from communication.

  The above procedure is executed using a WEB (Web) API provided by the repository. In addition, since an application that operates independently does not operate in conjunction with AV content, dynamic life cycle control is not performed at a predetermined timing. Control (such as termination) at a timing not specified in advance is performed by notification (notification) by push through communication.
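
The three-step procedure above can be sketched as follows. The repository URL and response shape are assumptions for this hypothetical TypeScript illustration; the actual WEB API is whatever the repository provides.

```typescript
// Hypothetical sketch of section 2.2.3: fetch the application list (each entry
// carrying the location of its AIT) from the repository, then fetch the AIT of
// the application the viewer selected.
interface RepositoryApp {
  name: string;
  aitLocation: string; // AIT location description of the application
}

async function fetchApplicationMenu(repositoryUrl: string): Promise<RepositoryApp[]> {
  const res = await fetch(`${repositoryUrl}/applications`);
  return (await res.json()) as RepositoryApp[];
}

async function fetchSelectedAit(app: RepositoryApp): Promise<string> {
  const res = await fetch(app.aitLocation);
  return res.text(); // the AIT File in XML text representation
}

// Example usage with an illustrative repository URL.
fetchApplicationMenu('https://repository.example.org')
  .then((apps) => fetchSelectedAit(apps[0]))
  .then((ait) => console.log(ait))
  .catch(console.error);
```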

[2.3 Application startup and termination]
[2.3.1 Application life cycle]
[2.3.1.1 Lifecycle]
FIG. 7 is a diagram illustrating a life cycle of an application.
Following the application states in ARIB-J, the application states are "Not Loaded", "Loaded", "Paused", "Started", and "Destroyed". Among these five states, the series of processes from when an application is loaded and executed until it is terminated is called the application life cycle, and control of the transitions between these states is called life cycle control.

[2.3.1.2 Basic life cycle control of applications linked to AV contents]
The life cycle control of an application linked to AV content is basically performed through selection of a stream dependent service.
The selection of the stream dependent service is made by the viewer. A service is a set of a series of contents including AV contents and applications, and a life cycle such as activation and termination is controlled by a control code included in an AIT sent together with the application. A single service may include a plurality of applications that operate simultaneously.

  A service can be selected, triggering application activation, by control from an application through a receiver API, by control from the navigator that is a resident application of the receiver, and, in the case of a broadcast service, by operation of a remote control button. When the service is switched, the presentation of the content (AV content and applications) included in the services before and after the switch is also switched. When the applications included in the services before and after the switch differ, the application that was running before the switch is terminated by the service switch, and a different application can be started after the switch. Details of these operations are described later in section 2.4.

[2.3.2 Application startup]
[2.3.2.1 Startup by AIT]
When a service (a stream-dependent service or an independent service) is selected at the receiver, an application for which "auto-start" is specified in the control code of the AIT provided together with the service is started automatically upon service selection, without any explicit action by the viewer. While the service is selected, the life cycle is controlled by the application signaling for that service. For example, in the case of a broadcast service, the receiver constantly monitors the AIT transmitted together with the broadcast and responds to changes. In this way, application signaling such as AIT transmission makes it possible to automatically start (auto-start) a new application partway through at the receiver.

  An application for which "auto-start" is not designated in the application activation information of the AIT is not started automatically and requires explicit activation by the viewer. This explicit activation is performed by the application launcher, a resident application of the receiver. For example, while a broadcast service is selected, pressing the broadcasting/communication cooperation service button on the remote control opens the application activation menu of the receiver and displays a list of applications linked to the current broadcast (or communication) service. The viewer then selects and starts the desired application on the receiver.
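
The control-code handling described in sections 2.3.2.1, 2.3.3.1, and 3.1 can be summarized in a short sketch. The runtime interface below is an assumption introduced only for this hypothetical TypeScript illustration.

```typescript
// Hedged sketch of life cycle control driven by application signaling: when
// the AIT transmitted with the service changes, start applications marked
// auto-start, terminate applications marked destroy, and terminate everything
// on KILLALL (e.g. an emergency warning broadcast).
interface SignaledApp {
  appId: string;
  controlCode: 'AUTOSTART' | 'DESTROY' | 'KILLALL';
}

interface ApplicationRuntime {
  isRunning(appId: string): boolean;
  start(appId: string): void;
  stop(appId: string): void;
  stopAll(): void;
}

function applyAit(entries: SignaledApp[], runtime: ApplicationRuntime): void {
  for (const entry of entries) {
    switch (entry.controlCode) {
      case 'AUTOSTART':
        // started automatically, without explicit action by the viewer
        if (!runtime.isRunning(entry.appId)) runtime.start(entry.appId);
        break;
      case 'DESTROY':
        if (runtime.isRunning(entry.appId)) runtime.stop(entry.appId);
        break;
      case 'KILLALL':
        runtime.stopAll();
        return;
    }
  }
}
```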

[2.3.2.2 Start from broadcasting / communication cooperation application]
Since a plurality of applications can be started in the service, another application included in the same service may be started from the started application. In the ARIB-J application execution environment, an API for starting another application by specifying an application ID is defined. In other execution environments, an API having the same function is defined.

[2.3.2.3 Startup from BML (Broadcast Markup Language)]
Since the receiver includes the current BML data broadcasting execution environment in addition to the broadcasting/communication cooperation application execution environment, an API for controlling the startup of broadcasting/communication cooperation applications is added as a BML API. BML is a multimedia coding system defined in ARIB STD-B24 and is adopted as the data broadcasting system in current Japanese terrestrial/BS/CS digital broadcasting.

[2.3.2.4 Launching an independently running application]
An independent service is a virtual service consisting only of applications. When an independent application is selected, its AIT is acquired and the application is started by the same mechanism as the activation by AIT described in section 2.3.2.1. In an independent service, at least one auto-start application is started. An independent service is selected from, for example, the application launcher.

[2.3.3 Termination of application]
[2.3.3.1 End by AIT]
The life cycle of a started application is controlled by the application signaling for the service. For example, in the case of broadcasting, the receiver constantly monitors the AIT transmitted together with the broadcast, and when the control code "destroy" is designated for a running application, the application is terminated. Termination control of a linked application can also be performed when the AIT is multiplexed into a stream-dependent service transmitted by communication.

[2.3.3.2 Termination by application itself]
The application itself terminates itself using the termination API.

[2.3.3.3 Termination by other applications]
Using the application termination API executed by the application, other running applications are terminated. In this case, an appropriate security policy for terminating other applications is necessary.

[2.3.3.4 Termination when switching to another service]
At the time of switching to another service at the receiver, among the applications included in the stream dependent service, the application included in the service before the switching is terminated, and the application signaled by the new service is started. If the same application is included in the services before and after switching, the operation can be continued. This is controlled by a flag in the AIT. Details of the service bound application, which is an application included in the stream dependent service, will be described later in section 4.2.

[2.3.3.5 End by receiver]
The receiver terminates the designated application by the receiver function. For example, a list of applications that are activated by the receiver is displayed, and the designated application is terminated by the viewer's selection.

[2.3.3.6 Dynamic application termination]
In order to dynamically control the end of the application, an AIT file instructing the end of the application is transmitted to the receiver. In this case, push notification (Notification) of AIT is performed.

[2.3.4 Launching multiple applications]
[2.3.4.1 Application signaled within the same service]
The receiver can simultaneously execute applications listed in the AIT in the same service.

[2.3.4.2 Simultaneous activation of applications that operate independently of applications linked to AV content]
An application linked to AV content is activated only within the stream dependent service. On the other hand, an application that operates independently can be activated at the same time as an application that works with AV content or another application that operates independently.

[2.3.4.3 Resource management when starting multiple applications]
When multiple applications are started, they may require the same receiver resources (e.g., the display). The receiver is equipped with a mechanism such as a resource manager and performs operations such as allocating resources appropriately and stopping the execution of applications when resources are not available.
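
A minimal sketch of such a resource manager is shown below; it is a hypothetical TypeScript illustration only, granting a named resource to one application at a time and refusing it otherwise.

```typescript
// Hypothetical resource manager: one owner per resource (e.g. the display).
class ResourceManager {
  private owners = new Map<string, string>(); // resource name -> application ID

  // Returns false when the resource is not available; the caller may then
  // stop the requesting application or queue it.
  acquire(resource: string, appId: string): boolean {
    if (this.owners.has(resource)) return false;
    this.owners.set(resource, appId);
    return true;
  }

  release(resource: string, appId: string): void {
    if (this.owners.get(resource) === appId) this.owners.delete(resource);
  }
}
```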

[2.4 Application boundary]
[2.4.1 Basic handling of bound / unbound]
There are two types of applications: bound applications, which are associated with a composition service, and unbound applications, which are not. The composition service with which a bound application is associated is the composition service from which the AIT containing the activation information of the application was obtained.

  A bound application is executable while the associated composition service is being received. That is, it is started by the AIT of the composition service, and its execution ends when reception of the composition service ends (when the channel being received is changed). Another application started from a bound application is also handled as a bound application. When the first-started application, which is the parent of a series of related bound applications, is terminated, the other applications it started are also terminated.

  Since an unbound application is not tied to a composition service, its execution continues even if the composition service being received changes. Since the AIT for starting the application cannot be obtained from the composition service, the activation information is given to the receiver by other means (for example, by obtaining the AIT File linked to the application using the application launcher or the like), and the application is then started. Another application started from an unbound application is also handled as an unbound application. Such an application is basically terminated explicitly by the viewer's operation, but it is also terminated when an instruction to terminate all applications (KILLALL) is given by the AIT of the service being received.

[2.4.2 Handling specific to unbound applications]
An unbound application is not associated with a composition service; however, as shown in section 2.3.2.4, by associating it with a virtual composition service (generated in the receiver, for example when the receiver is started), the same startup processing mechanism as for bound applications can be applied.

  The method of generating the virtual composition service depends on the implementation of the receiver, and the identification value given to that composition service also differs from receiver to receiver. However, if application files are stored in the receiver and can be started from the application launcher at an arbitrary timing, the ID identifying the virtual composition service and the acquisition destination of the application file need to be updated (because the AIT acquired from the service provider server or the repository describes that server as the acquisition destination, it must be changed so that the file is acquired from the storage area in the receiver).

[2.5 Application acquisition method]
[2.5.1 Acquisition based on AIT]
As described above, the activation information of all applications is given by the AIT.
Acquisition of the application file is instructed by the location information of the application included in the AIT. For example, in the example of FIG. 3, the location information is described in the hierarchy "/ApplicationList/Application/applicationSpecificDescriptor/dvbjDescriptor/location" (in the XML, as the content of the location element). The location information is described as, for example, "http://192.168.11.37/demo.jar".
The above is an example of acquiring demo.jar (a Java (registered trademark) application archive) using the HTTP (Hypertext Transfer Protocol) protocol. The transport protocol to be used and the package format of the application are described later.

[2.5.2 Application package format]
The package format of the application depends on the application format (Java (registered trademark) or HTML5). The receiver acquires the series of files necessary for starting the application (such as the program body and image files) by acquiring a group of files or an entry file; this series of files is the application file. For example, formats such as a compressed set of files (such as a zip file), a Jar file (for the Java (registered trademark) execution environment), an entry HTML file (for the HTML5 execution environment), or a uniquely defined entry file are used.

[2.5.3 Application transmission method]
The transmission method used when acquiring the application file via the network includes acquisition using the HTTP protocol and acquisition using the FILE protocol.
In the case of acquisition using the HTTP protocol, acquisition is performed with the GET method, and the location in the AIT is specified as "http://~".
On the other hand, in the case of acquisition using the FILE protocol, which specifies an application file (application program) stored (installed) locally on the receiver, the location in the AIT is specified as "file:///~".
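
A hedged TypeScript sketch of this acquisition step is shown below. It assumes a Node.js environment; the demo.jar URL comes from the example in section 2.5.1, and the file:// branch simply reads the locally installed application file.

```typescript
// Hypothetical sketch: acquire an application file from the location given in
// the AIT, using HTTP GET for "http://~" locations and the local file system
// for "file:///~" locations.
import { readFile } from 'node:fs/promises';

async function acquireApplicationFile(location: string): Promise<Uint8Array> {
  if (location.startsWith('http://')) {
    const res = await fetch(location); // GET method
    return new Uint8Array(await res.arrayBuffer());
  }
  if (location.startsWith('file:///')) {
    return readFile(new URL(location)); // locally stored (installed) application file
  }
  throw new Error(`unsupported location: ${location}`);
}

// Example from section 2.5.1 (a Java application archive fetched over HTTP).
acquireApplicationFile('http://192.168.11.37/demo.jar')
  .then((bytes) => console.log(`acquired ${bytes.length} bytes`))
  .catch(console.error);
```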

[3. Interface conditions]
[3.1 Broadcasting/communication cooperation service control signals in the broadcast wave]
The broadcast wave needs a mechanism for sending the application activation information described in section 2.2.2. Furthermore, in order to forcibly terminate all applications in the event of an emergency warning broadcast, "KILLALL" is added to the AIT application control codes (application_control_code) defined in ARIB STD-B23, Part 2, Section 10.16.3.2. Table 1 shows the meaning of the added control code "KILLALL".

  Also, descriptors are added to EIT and AIT in order to control application presentation from the relationship between the application and AV content. Details will be described later in section 4.3.

[3.2 Broadcasting Station Server Group API]
FIG. 8 is a diagram illustrating a data flow between business operators in the broadcast communication cooperation system, and FIG. 9 is a diagram illustrating a data flow in the entire broadcast communication cooperation system.
Here, a description is given of the API definitions, shown in FIG. 8, between the broadcasting station server group and the per-service servers of the service provider server group, between the broadcasting station server group and the broadcasting/communication cooperation infrastructure server, and between the broadcasting/communication cooperation infrastructure server and the per-service servers of the service provider server group, as well as the APIs, shown in FIG. 9, between receiver control and the broadcasting/communication cooperation infrastructure server and between the metadata and the per-service servers.

[3.2.1 API]
Communication between the broadcast station server, which is each server constituting the broadcast station server group, and the service provider server, which is each server constituting the service provider server group, is in the REST format. Also, since it is expected that the directory structure of the server differs between the broadcasting station server and the service provider server depending on the service to be provided, the API is negotiated between the two. Examples of URLs of the broadcast station server and service provider server are shown below.

http://hybridcast.org/{broadcaster name}/{server name}/{content ID}/{data to be managed}/{sort method}/{first item},{number}/?{parameter}={value}/

[3.2.2 Recommendation service]
FIG. 10 is a diagram illustrating a sequence of a recommendation service. Methods used between the service provider server group and the interface unit of the broadcast station server are “GET”, “POST”, “PUT”, and “DELETE”. An example of the command format is shown below.

(1) http://hybridcast.or.jp/{broadcast station name}/{server name}/{content ID}/{data to be managed}/{sort method}/{first item},{number}/
(2) http://hybridcast.or.jp/{broadcast station name}/{server name}/{viewer ID}/{data to be managed}/{sort method}/{first item},{number}/
(3) http://hybridcast.or.jp/{broadcast station name}/{server name}/{review ID}/{data to be managed}/{sort method}/{first item},{number}/

  Parameters include {broadcast station name}, {server name}, {content ID}, {viewer ID}, {review ID}, {data to be managed}, {sort method}, {first item}, {number}, and so on.
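
The URL pattern above can be assembled mechanically. The helper below is a hypothetical TypeScript illustration; all parameter values in the usage example are placeholders, not real broadcaster or server names.

```typescript
// Hypothetical helper that assembles a request URL following the pattern in
// section 3.2.2.
interface ApiRequest {
  broadcaster: string;  // {broadcast station name}
  server: string;       // {server name}
  id: string;           // {content ID}, {viewer ID}, or {review ID}
  managedData: string;  // {data to be managed}
  sort: string;         // {sort method}
  first: number;        // {first item}
  count: number;        // {number}
}

function buildRequestUrl(base: string, req: ApiRequest): string {
  return (
    `${base}/${req.broadcaster}/${req.server}/${req.id}/` +
    `${req.managedData}/${req.sort}/${req.first},${req.count}/`
  );
}

// e.g. GET the first ten reviews of a piece of content, newest first.
console.log(
  buildRequestUrl('http://hybridcast.or.jp', {
    broadcaster: 'exampleBroadcaster',
    server: 'reviewServer',
    id: 'content-0001',
    managedData: 'review',
    sort: 'newest',
    first: 1,
    count: 10,
  })
);
```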

[3.2.3 Data to be managed]
Data to be managed includes content information, user information, user generated content information, device information, and authentication information.
Content information includes data indicating the title, summary, genre, broadcast date and time, broadcast duration, video mode, audio mode, caption data, script, performers, music, producer, production company, work, recommended programs, video URI, number of playbacks, CM, time stamp information, and the like. The user information includes data indicating the user's (viewer's) name, age, gender, region, number of reviews written, number of comments written, favorites, friend list, playback position (time), playback end position (time), program viewing history, and the like. The user-generated content information includes data indicating the content ID, user ID, review content, review writing time, review evaluation, and the like. The device information includes a device ID. The authentication information includes an authentication ID.
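
As a data-structure illustration, the managed data listed above might be modeled as follows. These TypeScript interfaces are a simplified, hypothetical view showing only a subset of the listed fields.

```typescript
// Simplified view of the managed data described in section 3.2.3.
interface ContentInfo {
  title: string;
  summary: string;
  genre: string;
  broadcastDateTime: string;
  videoUri: string;
  playbackCount: number;
}

interface UserInfo {
  name: string;
  age: number;
  gender: string;
  region: string;
  favorites: string[];
  programViewingHistory: string[];
}

interface UserGeneratedContentInfo {
  contentId: string;
  userId: string;
  reviewContent: string;
  reviewWrittenAt: string;
  reviewEvaluation: number;
}

interface DeviceInfo {
  deviceId: string;
}

interface AuthenticationInfo {
  authenticationId: string;
}
```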

[3.3 Transport format]
[3.3.1 Video / audio handled by communication]
Video and audio handled by communications conform to the Digital TV Network Functional Specification, Streaming Functional Specification, Protocol Version V1.1 (Digital Television Information Society).

[3.3.1.1 Relationship to Video / Audio Monomedia Format]
For multiplexing video encoded with MPEG-2 Video or H.264/MPEG-4 AVC (Advanced Video Coding), audio encoded with MPEG-1 Audio Layer II or MPEG-2 Audio AAC, subtitles, and the like, the MPEG-2 Transport Stream format is used. However, MPEG2-TS, MMT (MPEG Media Transport), MP4, and the like can also be used.

[3.3.1.2 Transfer protocol]
FIG. 11 is a diagram illustrating a transfer protocol stack.
Stream transmission uses RTP (Real-Time Transport Protocol) / UDP (User Datagram Protocol) and HTTP / TCP (Transmission Control Protocol). When RTP / UDP is used, error correction information may be transmitted as an option. In addition, when HTTP / TCP is used, stream control is performed using HTTP connections, methods, and headers. When transmission is performed by RTP (Real-time Transport Protocol), RTSP (Real Time Streaming Protocol) is used as stream control information.

[3.3.2 Subtitles]
Multilingual subtitles comply with the Timed Text Markup Language (W3C (World Wide Web Consortium)). Note that synchronization is performed separately at the application level. Each required font is downloaded from the server as necessary; for example, a font file is placed in the HTTP payload, in which case Web Dynamic Fonts or PFR (Portable Font Resource) is used.
The font size is preferably about 5-35 MB (megabytes).

[3.4 Monomedia format]
The monomedia encoding in the broadcasting / communication cooperation service uses the one defined below.

[3.4.1 Video]
For moving images, the MPEG-2 Video system defined in ARIB STD-B32 Edition 2.4, Part 1, Section 3.1 and the MPEG4-AVC system defined in Section 3.2 are used, and the encoding parameter restrictions for the television service specified in that standard are applied.

[3.4.2 Voice]
MPEG-2 Audio and PCM (Pulse Code Modulation) (AIFF-C (Audio Interchange File Format Compression)) are used for audio.
For MPEG-2 Audio, the MPEG-2 AAC system defined in ARIB STD-B32 Edition 2.4, Part 2, Section 3.1 is used, and the encoding parameter constraints defined in Chapter 5 are applied.
For PCM, the system defined in ARIB STD-B24 Edition 5.4, Volume 1, Part 2, Section 6.2 is used.
For additional sound, the system defined in ARIB STD-B24 Edition 5.4, Volume 1, Part 2, Section 6.4 is used.

[3.4.3 Still image]
For JPEG (Joint Photographic Experts Group), the encoding method defined in ARIB STD-B24 Edition 5.4, Volume 1, Part 2, Section 5.4 is used.
For PNG (Portable Network Graphics), the method defined in ISO/IEC 15948:2003 is used. This is the same content as the W3C Recommendation Portable Network Graphics (PNG) Specification (Second Edition).

[3.4.4 characters]
For character encoding, the international coded character set defined in ARIB STD-B24 Edition 5.4, Volume 1, Part 2, Section 7.2 is used.
For the character code set, the BMP (Basic Multilingual Plane) set defined in Section 7.2.1.1.3 is used, and Table 7-20 is applied. ISO/IEC 10646:2003 Supplement 5 and Supplement 6 are also applied.
For external characters, the method defined in ARIB STD-B24 Edition 5.4, Volume 1, Part 2, Section 1.2.1.2 or in ARIB STD-B23, Part 1, Section 5.2.1.2 is applied.
As control codes, only APR (CR) and APD (LF) are used among the C0 control codes defined in ARIB STD-B24 Edition 5.4, Volume 1, Part 2, Section 7.2.2.1. Other C0 control codes and C1 control codes are not used.
Character code conversion is performed in accordance with ARIB STD-B24 Edition 5.4, Volume 1, Part 2, Appendix E.

  When information is encoded by a method other than the character encoding methods defined above, it is converted into one of the above character encodings by an appropriate process on the transmission side or in the receiver. That is, character codes in other encoding methods are not handled directly by the application.

[3.5 Application format]
This section describes how applications executable on the receiver are described. The connection between the execution environment that executes applications created by these description methods and the secure manager is shown in Chapter 4.

[3.5.1 Application format executable by receiver]
BML (ARIB STD-B24), ARIB-J (ARIB STD-B23), and HTML5 (W3C HTML5 Working Draft, 13 January 2011) are used as description methods for applications executable by the receiver.

[3.5.2 BML]
The receiver has a function of presenting a BML document that conforms to the terrestrial digital broadcast operation regulations (ARIB TR-B14) or the BS digital broadcast operation regulations (ARIB TR-B15).
The receiver must be able to present data broadcasting services provided by terrestrial digital broadcasting or BS digital broadcasting in accordance with the existing standards. However, the receiver is only required to present BML content distributed by broadcasting using the data carousel method; presentation of BML content provided by communication over the HTTP protocol (TR-B14, Volume 3, Part 2, Section 5.14; TR-B15, Part 1, Part 3, Section 8.14) is not mandatory.

  In addition, browser.startHybridcastApp() and getAITInfo() are defined as broadcast extension APIs for starting the communication applications defined below from data broadcast content (BML).

  Table 2 shows the definition of browser.startHybridcastApp (). browser.startHybridcastApp () is an API for starting a broadcasting / communication cooperation application.

  Table 3 shows the definition of getAITInfo (). getAITInfo () is an API for acquiring the latest AIT information included in the service being received.
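
The following is a minimal ECMAScript (JavaScript) sketch of using these broadcast extension APIs from data broadcast content. Since Tables 2 and 3 are not reproduced here, the argument passed to browser.startHybridcastApp() and the null check on getAITInfo() are assumptions for illustration.

<< Example of starting a broadcasting/communication cooperation application from BML (JavaScript) >>
// Acquire the latest AIT information of the service being received.
var aitInfo = getAITInfo();
if (aitInfo != null) {
  // Hypothetical application location; in practice it would be taken from the AIT.
  browser.startHybridcastApp("http://example.com/apps/program-linked/index.html");
}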

[3.5.3 HTML5]
[3.5.3.1 Description method]
The receiver supports HTML5 as a description method for presentation-engine-type applications provided via communication. The following JavaScript APIs are supported. Among the APIs below, those still under study by the W3C are listed with their Working Draft (WD) or Editor's Draft (ED). However, the APIs related to the data carousel transmitted by the broadcast wave are not essential.

(1) System Information API (W3C Working Draft 02 Feb. 2010)
(2) WebSocket API (W3C Editor's Draft 28 Feb. 2011)
(3) File API (W3C Working Draft 26 Oct. 2010)
(4) Permissions for File API and System Information API (Permissions for Device API Access, W3C Working Draft 05 Oct. 2010)
(5) Device Description Repository Simple API (W3C Recommendation 05 Dec. 2008)
(6) API for Media Resource 1.0 (W3C Working Draft 08 June 2010)
(7) Web Storage (W3C Working Draft 08 Feb. 2011)
(8) Server-Sent Events (W3C Editor's Draft 28 Feb. 2011)
(9) Indexed Database API (W3C Working Draft 19 Aug. 2010)
(10) SI Access API
(11) Tuning API
(12) Print
(13) Reservations

[3.5.3.2 Browser]
The HTML5 browser of the receiver implements a JavaScript processing system, Web Workers (W3C Working Draft 08 Feb. 2011), the Widget Interface (W3C Working Draft 3 Feb. 2011), and the HTML Canvas 2D Context (W3C Editor's Draft 28 Feb. 2011). Web Workers support multitasking, the Widget Interface supports independent applications, and the HTML Canvas 2D Context is necessary to support 2D vector graphics.
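
The following JavaScript sketch exercises two of the browser functions named above, a Web Worker for multitasking and the HTML Canvas 2D Context for 2D graphics. It uses only the standard W3C APIs; the element sizes and values are arbitrary illustrations.

<< Example of Web Workers and HTML Canvas 2D Context (JavaScript) >>
// Draw on a 2D canvas (presentation work on the main thread).
const canvas = document.createElement('canvas');
canvas.width = 320;
canvas.height = 180;
document.body.appendChild(canvas);

const ctx = canvas.getContext('2d');
ctx.fillStyle = '#003366';
ctx.fillRect(0, 0, canvas.width, canvas.height);

// Run background work without blocking presentation (multitasking via a Web Worker).
const workerSource = 'onmessage = (e) => postMessage(e.data * 2);';
const worker = new Worker(
  URL.createObjectURL(new Blob([workerSource], { type: 'text/javascript' })));
worker.onmessage = (e) => {
  ctx.fillStyle = '#ffffff';
  ctx.fillText('result: ' + e.data, 10, 20);
};
worker.postMessage(21);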

[3.5.4 ARIB-J]
The receiver supports ARIB-J as a description method of an application execution engine type application provided from communication. Further, DVB Bluebook A153 (GEM Media Synchronization API) is used as a synchronization API between a plurality of streams.

[3.6 Receiver API]
Hereinafter, receiver APIs usable in HTML5 and ARIB-J will be described.

[3.6.1 Namespace]
The name space is a character string description rule for specifying the positions of various resources such as video / audio contents, applications, monomedia files, and the like that are handled on the server or in the receiver. Namespace notation for referring to various resources used in section 3.5.2 and after is specified for each classification. The resources include resources on the Internet server, resources on the application cache, and broadcasting resources. Resources on the Internet server include stream resources such as VOD content and file resources such as applications and other resources referred to by applications. Broadcast resources include stream resources such as programs being broadcast, past and future programs, and carousel resources such as modules and event messages.

[3.6.2 Broadcasting / communication cooperation interface]
The broadcasting/communication cooperation interface includes the following interfaces; a usage sketch is shown after the list.

(1) getRunningApplications(): Acquires information on the applications being executed. The return value of getRunningApplications() includes apps[] and, for each application, application_id and running_level. In apps[], a list of running applications is set. In application_id, an application ID is set; it is null when the application is a general application (unofficial application). In running_level, an execution level (authentication result and viewer setting state) is set.
From a security point of view, information that can be acquired for other applications should be restricted.

(2) queryApplicationInfo (): Acquires information about the specified application.
(3) getProgramInfo (): Acquires information on the broadcast being received. Return values are tuner_state, network_id, ts_id, orig_ts_id, service_id, event_id, and content_id. In tuner_state, a value indicating the reception state is set.
(4) getEPGInfo (): Acquires various information in the EIT (+ SDT) of the broadcast being received.
(5) saveApplicationToCache (): Saves the application file on the server in the cache.

(6) queryApplicationInCache(): Searches for an application file (application program) in the cache. Its argument is application_id, in which an application ID issued by the certification authority is set, and its return values are user_apps[], broadcaster_apps[], and vendor_apps[]. In addition, getDSMCCModule() acquires the specified module from the broadcast wave, addBroadcastSignalListener() registers a listener that monitors updates of SI, emergency information, the carousel, and event messages, and getListFromHybridcastMenu() acquires a list of top menu applications.

(7) addApplicationToHybridcastMenu (): Adds an application to the top menu.
(8) getKeyFromBroadcast (): Acquires key information for limited server access from broadcasting.
(9) querySupportedFunction (): Queries the application browser function. This is used for the purpose of checking whether the function / API is available.
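
The following JavaScript sketch illustrates how an application might call the interfaces listed above. It assumes that the receiver exposes them to HTML5 applications as global functions; any argument or return-value details beyond those described in this section are assumptions.

<< Usage sketch of the broadcasting/communication cooperation interface (JavaScript) >>
if (typeof getProgramInfo === 'function') {
  const program = getProgramInfo();          // tuner_state, network_id, ts_id, ...
  console.log('reception state:', program.tuner_state);
}

if (typeof addBroadcastSignalListener === 'function') {
  // Monitor SI, emergency information, carousel, and event-message updates.
  addBroadcastSignalListener(function (signalEvent) {
    console.log('broadcast signal update:', signalEvent);
  });
}

if (typeof getRunningApplications === 'function') {
  const result = getRunningApplications();
  result.apps.forEach(function (app) {
    // application_id is null for general (unofficial) applications.
    console.log(app.application_id, app.running_level);
  });
}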

[3.6.3 BroadcastSignalListener interface]
The BroadcastSignalListener interface is a listener interface for monitoring the SI, emergency information, carousel, and event messages acquired from broadcasting. An event of this interface also occurs when the associated organization service is changed during execution of a bound application.

[3.6.4 LocalDatabase interface]
The LocalDatabase interface is an interface for holding and managing viewer information in the receiver. The viewer information is information that should not be output to the server side such as personal information, and is minimum information such as a viewer ID and a receiver ID.

[3.6.5 Synchronization API]
An API similar to the DVB Bluebook A153 (GEM Stream Synchronization API) is introduced as the SynchronizationManager interface. In addition, the following interfaces are added as APIs; a usage sketch is shown after the list.

(1) getCurrentSTC(): Acquires the current STC (System Time Clock) value. In the MPEG-2 Systems standard, a PCR (Program Clock Reference) signal is multiplexed into the MPEG-2 transport stream and distributed so that the system clock (STC) inside the receiver is synchronized with the STC on the transmission side.
(2) getCurrentPositionInProgram (): Acquires the elapsed time from the start of the program.
(3) delayStreamPresentation (): Starts delay presentation of the broadcast stream being presented.
(4) getCurrentDelay (): Acquires the delay time amount (from the original presentation time) of the broadcast stream being presented.
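
The following JavaScript sketch illustrates a possible use of the synchronization API above. The availability checks, the millisecond unit, and the argument of delayStreamPresentation() are assumptions, since this section defines only the purpose of each interface.

<< Usage sketch of the synchronization API (JavaScript) >>
if (typeof getCurrentSTC === 'function') {
  console.log('STC:', getCurrentSTC());
  console.log('elapsed time in program:', getCurrentPositionInProgram());

  // Delay the broadcast stream being presented, then confirm the delay amount.
  delayStreamPresentation(2000);          // assumed to take a delay value in milliseconds
  console.log('current delay:', getCurrentDelay());
}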

[3.6.6 SecurityException interface]
An interface for exceptions that occur when an application makes function calls or property operations that are prohibited at the current execution level. The SecurityException interface is raised by calling each of the above APIs or by various operations on an object that references a broadcast (<video> for HTML5 or the OO Controller for ARIB-J).

[3.7 Receiver function]
The receiver has the following functions as built-in functions.
[3.7.1 Application Launcher]
The application launcher is a receiver built-in function that provides application selection for application launch. The target of selection is as follows.
・ Applications registered or stored in the receiver.
-Pre-installed application in the receiver.
・ Applications registered in known repositories.
・ Applications whose application control code is other than AUTO_START among applications whose start instructions are described by application start information.
・Applications for which AUTO_START has been instructed by the user (this is a function for enabling an application to be started again when the conditions under which the application can be executed, such as the service being watched or the time, are satisfied).

The application launcher can be started by, for example, pressing a specific key on the remote control or selecting it from the receiver's built-in menu.
In addition, the application launcher has a function that allows applications in the repository to be registered in the launcher itself, so that the user can start a specific application quickly and easily. Furthermore, the application launcher can be activated by the user at any timing regardless of the execution state of applications, and it does not affect their execution state.

[3.8 Terminal linkage function]
The receiver makes an API (terminal cooperation API) available to applications that operate on portable terminals connected via a home LAN or the like. This allows the television (receiver) and various terminals to operate in conjunction with each other. For example, an application on a mobile terminal can call receiver functions such as acquiring information about the program being viewed on the TV, changing the broadcast channel displayed on the TV, or controlling playback of VOD content.
In order to realize such a terminal cooperation function, the receiver includes a terminal cooperation manager. The details of the terminal cooperation function by the terminal cooperation manager are described below. In this embodiment, the terminal cooperation manager is provided in the external I / F unit 417 shown in FIG.

[3.8.1 Terminal cooperation scope]
The scope of terminal cooperation at the receiver is assumed as follows.
Home network: A case where a terminal linked with a receiver is connected to a home LAN together.
・ Internet: A case where a plurality of receivers are linked through the Internet, or a receiver and a portable terminal are linked.

[3.8.2 Terminal cooperation manager]
The terminal cooperation manager is a software module of the receiver that starts up together with the receiver; in addition to establishing communication between terminals, it controls receiver functions in response to API calls from the terminal of the cooperation partner and notifies that terminal of information by push.
In order for the receiver and the terminal to cooperate, the terminal cooperation manager defines a device discovery protocol for establishing communication and a communication protocol for calling and returning a cooperation API and a return value.

FIG. 12 is a diagram illustrating the concept of the terminal cooperation manager.
As shown in FIG. 12, the receiver has a configuration that supports a plurality of device discovery protocols and communication protocols. For example, a plurality of different protocols may be supported for cooperation within the home network, or different protocols may be used for the home network and for cooperation through the Internet.
The receiver also provides a mechanism for not only cooperation with the receiver function of the receiver but also cooperation (inter-application cooperation) with the application operating on the receiver for the application operating on the mobile terminal.

An API for calling a receiver function is defined as an extended API for applications, but an API having basically the same function is provided for a mobile terminal.
Regarding cooperation between an application and a mobile terminal, what functions and how to cooperate are different depending on the application. In this terminal cooperation function, a general-purpose mechanism for transmitting and receiving commands and various kinds of information for cooperation between an application and a portable terminal is provided.
The terminal cooperation manager has a bridge function corresponding to a plurality of protocols, and bridges transmission / reception of data corresponding to each protocol in a receiver function call from a mobile terminal and communication with an application. The terminal cooperation manager is provided in the receiver 4 as a part of the external I / F unit 417 shown in FIG.

[3.8.3 Device discovery mechanism]
The terminal linkage function covers not only the home network but also the connection through the Internet, but this section describes the specifications for device discovery in the home network. It is desirable that other existing protocols can be easily implemented and used, and the following specifications are examples.

For example, SSDP (Simple Service Discovery Protocol), which is a UPnP (Universal Plug and Play) device discovery protocol, can be used for device discovery.
SSDP is used in UPnP to search for UPnP devices and services on the network by their names (URIs). Using this function, the device cooperation service name (URI) is specified and a device for cooperation is discovered. So that a mobile terminal can find a receiver capable of cooperation, an SSDP-compatible function is added to the terminal cooperation manager, and when an SSDP search message is received, a response containing the connection destination (IP address and port number) is returned.
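
The following is a minimal JavaScript (Node.js) sketch of such an SSDP search from the terminal side. The search target (ST) name used for the terminal cooperation service is an assumption, since this section only requires that the service name (URI) be specified.

<< Example of SSDP device discovery (JavaScript) >>
const dgram = require('dgram');

const SSDP_ADDR = '239.255.255.250';
const SSDP_PORT = 1900;
const SEARCH_TARGET = 'urn:example:service:TerminalCooperation:1';   // assumed service name

const message = Buffer.from(
  'M-SEARCH * HTTP/1.1\r\n' +
  `HOST: ${SSDP_ADDR}:${SSDP_PORT}\r\n` +
  'MAN: "ssdp:discover"\r\n' +
  'MX: 2\r\n' +
  `ST: ${SEARCH_TARGET}\r\n` +
  '\r\n');

const socket = dgram.createSocket('udp4');
socket.on('message', (msg, rinfo) => {
  // The terminal cooperation manager is expected to answer with the
  // connection destination (IP address and port number).
  console.log(`response from ${rinfo.address}:`, msg.toString());
});
socket.send(message, SSDP_PORT, SSDP_ADDR, (err) => {
  if (err) console.error(err);
});
setTimeout(() => socket.close(), 3000);   // stop listening after the MX wait time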

[3.8.4 Cooperation protocol between terminals]
This section specifies the protocol between the terminal cooperation manager of the receiver and the terminal to be linked, describing the protocol to be used for each cooperation scope: cooperation in the home network and cooperation through the Internet. It is desirable to use existing protocols that can be easily implemented and used; the following specifications are examples.

[3.8.4.1 Cooperation protocol in home network]
(1) WebSocket
WebSocket is one of the specifications related to HTML5 and is a mechanism that enables bidirectional message transmission and reception by connecting an HTML5 application (client) and a server with WebSocket. Unlike stateless HTTP connections, a WebSocket connection can be maintained, so real-time message exchange between applications and push delivery from the server side can be realized.

FIG. 13 is a diagram illustrating an example in which a receiver and a mobile terminal are linked through communication using WebSocket.
As shown in FIG. 13, a bidirectional communication path is created by making a WebSocket connection from an HTML5 application operating on the mobile terminal to the receiver, and various commands and information can be transmitted and received over this path. This is realized by adding a WebSocket bridge function to the terminal cooperation manager; the WebSocket bridge operates as a WebSocket server and controls receiver functions according to the functions (APIs) called through the WebSocket.

  Specific APIs and event notifications from the server side (receiver) are realized by defining a message format on WebSocket. A JSON format or an XML format is assumed as the message format. As an example, messages for controlling a channel change of the receiver from the mobile terminal are shown below, followed by a sketch of transmitting such a message.

<< Example of JSON format >>
{"ActionName": "RemoteControl", "Arguments": {"Function": "SelectAir", "URI": "arib://onid.tsid.svid/"}}

<< Example of XML format >>
<?xml version="1.0" encoding="UTF-8"?>
<Action>
  <ActionName>RemoteControl</ActionName>
  <Arguments>
    <Function>SelectAir</Function>
    <URI>arib://onid.tsid.svid/</URI>
  </Arguments>
</Action>
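
The following JavaScript sketch transmits the JSON message shown above from an HTML5 application on the mobile terminal to the WebSocket bridge of the receiver. The WebSocket URL is an assumption; in practice it would be obtained through the device discovery mechanism of section 3.8.3.

<< Example of sending the remote control message over WebSocket (JavaScript) >>
// Hypothetical address of the receiver-side WebSocket bridge.
const socket = new WebSocket('ws://192.168.11.5:8880/external_app/');

socket.onopen = () => {
  // Ask the receiver to change the channel (JSON message from the example above).
  socket.send(JSON.stringify({
    ActionName: 'RemoteControl',
    Arguments: { Function: 'SelectAir', URI: 'arib://onid.tsid.svid/' }
  }));
};

socket.onmessage = (event) => {
  // Push notifications from the receiver (server side) arrive here.
  console.log('from receiver:', event.data);
};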

(2) HTTP
Since the terminal cooperation manager of the receiver has an HTTP server bridge function, the various terminal cooperation APIs can also be called using the general-purpose HTTP protocol. However, since HTTP is a pull-type protocol, a pseudo-push technique such as long polling must be supported in order to deliver push notifications from the receiver (server side).

(3) UPnP
UPnP (Universal Plug and Play) is a protocol for connecting various devices such as personal computers and AV devices in a home through a network and providing functions to each other.
Many devices such as DLNA compatible devices based on UPnP are already in widespread use.
Since the receiver and the portable terminal each support UPnP, a device cooperation function in accordance with the UPnP specification can be realized.
In UPnP, there are servers that provide services (functions) and clients that use them, and either the receiver or the linked mobile terminal can act as a UPnP server or a UPnP client. When the receiver provides various information related to broadcast programs to other terminals, this is realized by adding a UPnP bridge function and operating it as a UPnP server.
Specific APIs and event notifications from the server side (receiver) are defined according to the UPnP format.

[3.8.4.2 Cooperation protocol over the Internet]
Cooperation between terminals is performed using XMPP, which is a protocol for exchanging messages on the Internet in real time, or WebSocket.

[3.8.5 Cooperation between application and terminal]
A case is assumed in which an application running on the receiver and the mobile terminal communicate with each other and cooperate.
By adding a WebSocket bridge function to the receiver, a general socket communication path using WebSocket can be constructed when both the receiver application and the application operating on the linked portable terminal are based on HTML5.

FIG. 14 is a diagram illustrating an example in which a receiver application and a mobile terminal application are linked by communication using WebSocket.
To link HTML5-based applications with each other, as shown in FIG. 14, the WebSocket bridge of the terminal cooperation manager has two WebSocket server functions. The WebSocket server A (the receiver-side server unit 492 shown in FIG. 1) operates as the connection destination for the receiver application, and the WebSocket server B (the device-side server unit 491 shown in FIG. 1) operates as the connection destination for the HTML5 application on the cooperation terminal. Data received by one WebSocket server is transmitted to the connection destination of the other WebSocket server, so that bidirectional communication by WebSocket is possible between the receiver and the cooperation destination terminal. For example, the receiver application connects to the known local WebSocket server using the WebSocket API of HTML5 (e.g., ws://localhost:8880/hybridcast_app/). The HTML5 application running on the cooperation terminal identifies a connectable WebSocket server on the receiver using the device discovery mechanism (e.g., ws://192.168.11.5:8880/external_app/) and connects to it. A communication path using WebSocket is thus created between the two, and cooperation becomes possible by transmitting and receiving commands and data over this path.
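
The following JavaScript sketch shows the two connections described above. The first fragment runs in the receiver application and the second in the HTML5 application on the cooperation terminal (they run on different devices); the addresses are the example endpoints given above, and in practice the receiver address would be obtained by device discovery.

<< Example of inter-application cooperation through the WebSocket bridge (JavaScript) >>
// Receiver-side HTML5 application: connects to the local WebSocket server A.
const receiverSide = new WebSocket('ws://localhost:8880/hybridcast_app/');
receiverSide.onmessage = (e) => console.log('from terminal application:', e.data);
receiverSide.onopen = () => receiverSide.send('hello from receiver application');

// Terminal-side HTML5 application: connects to WebSocket server B on the receiver
// found through device discovery.
const terminalSide = new WebSocket('ws://192.168.11.5:8880/external_app/');
terminalSide.onmessage = (e) => console.log('from receiver application:', e.data);
terminalSide.onopen = () => terminalSide.send('hello from terminal application');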

[4. Security]
[4.1 Management of broadcasting / communication cooperation application]
In order to disseminate and activate the broadcasting / communication cooperation service while satisfying the requirements of broadcasting companies, a framework that allows not only broadcasting companies and their related parties but also a wide range of service companies and individuals to participate is necessary. In this broadcasting / communication cooperation system, applications are classified into “A application” and “general application” from the viewpoint of security, and both applications can be executed in the receiver.

FIG. 15 shows the application management model in the broadcasting/communication cooperation system. By pre-registering with the registration manager (a third-party organization), an "A application" is assured of the operation expected in the specifications of the broadcasting/communication cooperation system. An "A application" is given an ID and a signature at the time of registration; the signature is verified at the receiver by the secure manager defined in section 2.2, access to all APIs is permitted, and program-linked services using broadcast resources can be provided. In addition, the AIT transmitted from the broadcaster enables fine-grained presentation control according to the broadcaster's requirements.
On the other hand, a "general application" does not need to be registered in advance, but the operation expected in the specifications of the broadcasting/communication cooperation system is not guaranteed, and broadcasting-related APIs cannot be used from the application. Since a "general application" is not assigned an ID and signature, it is difficult to identify individual applications, but such an application can be executed under presentation restrictions based on the requirements of the broadcaster.

[4.2 Secure Manager Functional Model]
FIG. 16 shows a functional model of the secure manager. The secure manager is a function that comprehensively manages security in the receiver.

[4.2.1 Application monitoring / control functions]
As described above, applications that run on the receiver are roughly classified into two types, "A applications" and "general applications", depending on the distribution form of the application files. "A applications" and "general applications" are distinguished by the presence or absence of an ID and signature as shown in section 4.1, and they differ in the API access range within the receiver and in the control range from the broadcaster. The purpose of the application monitoring/control function is to identify whether an application is an A application or a general application and to reliably control its operation at execution time.

(1) Application authentication: For every application to be executed, the receiver identifies whether it is an A application or a general application, and for an A application it further identifies the ID. An A application and a general application are distinguished by confirming the presence or absence of a signature attached to the application file (application program). If the application is an A application, the receiver further acquires the application ID described in the signature. Application identification is performed when an application is acquired or activated.
(2) Screen presentation control: described later in section 4.3.
(3) Resource access control: The receiver performs access control to APIs such as broadcast resources of the application being executed. When an application tries to access an API, if the application is a general application, the access is restricted by the type of API.
Further, when the application accesses the screen display API on the display, screen presentation control is executed based on the type of the A application or the general application and the presentation policy of the selected broadcaster. Details will be described later in section 4.3.
(4) Revocation: An application revocation function is provided.

[4.2.2 Receiver protection]
The receiver has protection functions such as viewer information protection and virus countermeasures.

[4.3 Application screen presentation control]
[4.3.1 Overview of screen presentation control]
In the broadcasting/communication cooperation service, the convenience of the broadcasting service can be expanded by presenting related communication applications simultaneously with the broadcast program. On the other hand, when the communication service is used, it is assumed that broadcast programs and communication applications are presented together on the receiver screen. Depending on the presentation method, the screen of a communication application may overlap the broadcast program, and there is a risk that this not only impairs the uniqueness and workability of the broadcast program but also prevents emergency information such as earthquake early warnings from being conveyed accurately to viewers. Screen presentation control therefore performs application presentation control based on the intention of the broadcaster in the broadcasting/communication cooperation service.

  FIG. 17 is a diagram illustrating the concept of the screen presentation control method. The screen presentation control method is intended to have the receiver reflect, for each broadcast program, the broadcaster's presentation policy on how communication applications are displayed on the screen; this is called content presentation control. Content presentation control realizes presentation control in units of programs according to the program composition, presentation control for events occurring during a program such as an earthquake early warning, and presentation control in units of applications.

[4.3.2 Basic operation of screen presentation control]
FIG. 18 is a diagram illustrating a basic operation model of screen presentation control. In order to reflect the broadcaster's presentation policy to the receiver, the method of presenting the communication content for the broadcast program, which is assumed in advance by the broadcaster, is managed as a presentation rule by the receiver. Specifically, as a method for presenting communication content, levels are classified according to the order of superposition and the difference in arrangement, and a table of presentation levels (policy levels) and presentation methods is held in the receiver as presentation rules. The broadcaster multiplexes and transmits the designated presentation level to the broadcast wave, and the receiver collates the presentation level with the presentation rules to determine the presentation method. Thereby, presentation control based on the presentation policy of the broadcaster can be realized.

[4.3.3 Transmission and multiplexing of control information]
Regarding the format of the control information for transmitting the broadcaster's presentation policy, three specific examples are given of methods that use the program arrangement information used in digital broadcasting. For screen presentation control in units of programs, there are a method using the existing EIT (Event Information Table) and a method using an extended EIT (EIT+). For screen presentation control in units of services (channels), there is a method in which the AIT in the broadcast signal is extended and used. In addition, for screen presentation control in units of events that occur in real time during a program, there is a method using information transmitted from the broadcasting station other than the program arrangement information. Details of these four methods are described below.

(1) EIT program genre (EIT): The policy level is determined from the program genre described in the existing EIT content descriptor. For this purpose, the receiver manages a correspondence table between program genres and policy levels. The corresponding ARIB standard reference is ARIB STD-B10, Part 2, Section 6.2.4 and Appendix H.

  Table 4 is a table showing a specific example of the relationship between the program genre and the policy level. The program genre (program_genre) is composed of two stages of “content_nibble_level1” (0x0 to 0xF) representing the major classification and “content_nibble_level2” (0x0 to 0xF) representing the middle classification. The table managed by the receiver covers the genre of the middle category, and defines a policy level value.

(2) New descriptor added to EIT (EIT +): A new descriptor is added to the event information section of EIT, and policy information is described. The receiver interprets this descriptor and executes a desired process, thereby realizing control according to the policy level for each program. The relationship with the ARIB standard is ARIB TR-B14 (second volume) Part 3 31.3, ARIB STD-B10 Volume 2 5.2.7.

  Table 5 shows the structure of the event security descriptor. In the case of EIT+, the event security descriptor shown in Table 5 is newly defined, stored in the descriptor area of the EIT, and transmitted. In the event security descriptor, a policy level (policy_level), an application ID (application_identifier), a control code (application_control_code), a priority (application_priority), a protocol identification (protocol_id), and a program-related flag (associated_application_flag) are set.

policy_level represents a policy level in units of programs. The policy level is a value from 1 to 4.
application_identifier () is an identifier for identifying an application. Table 6 shows the structure of application_identifier ().

organization_id represents the organization that created the application and takes a value of 0x00000063 or greater. application_id represents a number for identifying the application and is uniquely assigned within the organization identified by organization_id.

  application_control_code specifies a control code for controlling the application state. Table 7 shows the definition of the control code.

  application_priority indicates a policy level for each application. The policy level for each application indicates the relative priority among the applications announced in the service. The priority is a value from 1 to 4.

  protocol_id indicates a protocol for transmitting the application file. Table 8 shows the protocol_id specification.

associated_application_flag indicates whether the application is linked to the program. Table 9 shows the definition of associated_application_flag .

(3) AIT table definition and addition of new descriptor (AIT +): AIT is extended to transmit policy information. The receiver interprets this table and executes a desired process, thereby realizing control corresponding to a policy level for an event that occurs at any time. The association with the ARIB standard is ARIB STD-B23 Part 2 10.16.

Table 10 shows the data structure of AIT. The AIT shown in Table 10 is an extension of the AIT data structure defined by ARIB STD-B23. AIT describes a policy level, an application ID, and a control code. The AIT is transmitted in the section format, and is always transmitted while the event is continuing. The application ID is described in application_identifier (), and the control code is described in application_control_code.
These details are the same as those described in (2) EIT extension.
Further, in order to describe the policy level, a new security policy descriptor is defined and stored in the AIT common descriptor loop for transmission.

  Table 11 shows the structure of a newly defined security policy descriptor.

(4) Emergency warning broadcasting and earthquake early warning (EWS/EEW): The policy level is determined using emergency information transmitted from the broadcasting station. The receiver assumes in advance a correspondence between emergency information and policy levels. For emergency warning broadcasting, the emergency warning broadcast start flag in the TMCC is monitored, the occurrence and termination of emergency information are detected, and the policy level at that time can be determined. The corresponding ARIB standard references are ARIB STD-B31, Section 3.15 and ARIB STD-B24, Volume 1, Part 3, Chapter 9.

  The above methods (1) to (4) can be sent simultaneously in parallel. It is therefore necessary to determine which of the transmitted values is used, and with what priority, when determining the policy level. The priority order is as follows.

EWS / EEW> AIT +> EIT +> EIT

  Based on this priority, the receiver determines the policy level, thereby enabling screen presentation control that prioritizes an emergency event based on the intention of the broadcaster.
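
The following JavaScript sketch shows one way the receiver could resolve the policy level according to this priority order. The object layout, the field names, and the default value used when nothing is sent are assumptions for illustration.

<< Sketch of policy level determination by priority (JavaScript) >>
// Each source yields a policy level (1-4) or null when it is not being sent.
function determinePolicyLevel(sources) {
  const order = ['ewsEew', 'aitPlus', 'eitPlus', 'eit'];   // highest priority first
  for (const key of order) {
    const level = sources[key];
    if (level != null) {
      return level;     // the first source present, in priority order, wins
    }
  }
  return 1;             // assumed default when no policy information is sent
}

// Example: an earthquake early warning (level 4) overrides a program-level EIT+ value.
console.log(determinePolicyLevel({ ewsEew: 4, aitPlus: null, eitPlus: 2, eit: 1 })); // 4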

[4.3.4 Example of screen presentation control]
FIG. 19 shows an example of screen presentation control according to the policy level; a sketch of the corresponding presentation rules follows the list below.
When the policy level of the program is "1", both the application screen of the A application and the application screen of the general application are permitted to be superimposed on the broadcast screen.
When the policy level of the program is "2", only the A application is permitted to be superimposed on the broadcast screen; the application screen of the general application is prohibited from being superimposed on the broadcast screen, and only display outside the broadcast screen is permitted.
When the policy level of the program is "3", display of both the application screen of the A application and the application screen of the general application is permitted, but superimposition on the broadcast screen is prohibited for all application screens; only display outside the broadcast screen is permitted.
When the policy level is “4”, only the full screen display of the broadcast screen is permitted.
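
The following JavaScript sketch expresses the presentation rules described above as a table held in the receiver, as referred to before the list. The object layout is an assumption, while the permitted and prohibited combinations follow the example of FIG. 19.

<< Sketch of presentation rules per policy level (JavaScript) >>
const presentationRules = {
  1: { aAppOnBroadcast: true,  generalAppOnBroadcast: true,  outsideDisplay: true  },
  2: { aAppOnBroadcast: true,  generalAppOnBroadcast: false, outsideDisplay: true  },
  3: { aAppOnBroadcast: false, generalAppOnBroadcast: false, outsideDisplay: true  },
  4: { aAppOnBroadcast: false, generalAppOnBroadcast: false, outsideDisplay: false } // broadcast full screen only
};

function allowSuperimpose(policyLevel, isAApplication) {
  const rule = presentationRules[policyLevel];
  return isAApplication ? rule.aAppOnBroadcast : rule.generalAppOnBroadcast;
}

console.log(allowSuperimpose(2, true));   // true: A application may be superimposed
console.log(allowSuperimpose(2, false));  // false: general application may not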

  FIG. 20 shows an example of presentation control at the time of receiving an earthquake early warning. When the policy level of program A is "1", the application screen of the A application and the application screen of the general application are displayed superimposed on the broadcast screen during the broadcast time zone of program A. However, the receiver determines that the policy level in the time zone in which the earthquake early warning occurs within the broadcast time zone of program A is the policy level "4" of the earthquake early warning. Therefore, even within the broadcast time zone of program A, the receiver prohibits superimposing the application screen of the A application and the application screen of the general application on the broadcast screen during the time zone in which the earthquake early warning is occurring.

[Description of an embodiment of the present invention to which the above-described broadcast communication cooperation system is applied]
Next, an embodiment of the present invention shown in FIG. 1 will be described.
FIG. 21 is an overall configuration diagram of a broadcasting/communication cooperation system according to an embodiment of the present invention. As shown in the figure, the broadcasting/communication cooperation system of the present embodiment includes a broadcaster apparatus 1 owned by a broadcasting station, a service provider server group 2 held by a service provider, a repository server 3 held by a system administrator, and a receiver 4 held by a viewer. Although only one receiver 4 is shown in the figure, a plurality of receivers 4 are actually provided.

The broadcaster apparatus 1 includes a broadcast transmission apparatus 11 and a broadcast station server group 12.
The broadcast transmission apparatus 11 corresponds to the broadcasting station equipment shown in FIG. 3 and is digital broadcasting equipment composed of program organization equipment, program sending equipment, transmission equipment, and the like.

The broadcast transmission apparatus 11 includes a broadcast-related data management unit 111, a signal setting unit 112, and a broadcast transmission unit 113.
The broadcast-related data management unit 111 manages program security policy data for each program, application security policy data for the A application, other policy data, and the like.
The program security policy data includes policy level data indicating a policy level of the program, an application ID of an application bound to the program, a control code for the application bound to the program, and the like.
The application security policy data includes information for identifying a program to which the application is bound, application protocol identification, location information, and the like. The location information indicates the storage location (storage location) of the application, and is, for example, the URL of the receiver application server 21 or the repository server 3 that can download the application. The protocol identification indicates whether the application is transmitted by broadcast or communication.
Only the A application is bound to the program.

The policy data includes presentation rule data and a policy level table.
The presentation rule data describes a presentation method for each policy level. A presentation method includes a screen display method and an audio output method. Examples of screen display methods include displaying only the broadcast screen (the video of the program); displaying the application screens (the video of the applications) of both the A application and the general application superimposed on the broadcast screen or outside the broadcast screen; and displaying only the application screen of the A application superimposed on the broadcast screen while displaying the application screen of the general application outside the broadcast screen. Examples of audio output methods include outputting only the sound of the broadcast program, and outputting the sound of the broadcast program and the sound of the A application or the general application independently or in combination.
The policy level table is data describing the policy level corresponding to the genre of the program and the policy level of each event. An event is the content of a broadcast that does not necessarily occur in conjunction with a program, such as an emergency warning signal or an earthquake early warning.

The signal setting unit 112 sets various data in the broadcast signal transmitted by the broadcast sending unit 113.
The signal setting unit 112 sets AIT and program policy level data for a broadcast signal based on program security policy data and application security policy data managed by the broadcast-related data management unit 111. The signal setting unit 112 multiplexes the AIT of the application bound to the program as an independent ES on the broadcast signal (broadcast TS) or sets it in the data carousel. Alternatively, the signal setting unit 112 sets information equivalent to the AIT of the application bound to the program in the EIT. Further, the signal setting unit 112 sets the policy level data of the program to EIT (Table 5) or AIT (Table 11). Note that when the policy level corresponding to the genre of the program is used, the policy level data need not be set in the broadcast signal. Further, the signal setting unit 112 sets the application file to a data carousel or the like. In addition, the signal setting unit 112 sets the policy data managed by the broadcast-related data management unit 111 to a broadcast signal in a section format, or to an engineering service or a data carousel.
The broadcast sending unit 113 transmits a broadcast signal for digital broadcasting. The broadcast signal includes information set by the signal setting unit 112.

The broadcast station server group 12 corresponds to the broadcast station server group shown in FIG. 3, and includes a content management server 13, a content distribution server 16, a broadcast station service server 17, and a notification server 18.
The content management server 13 includes a program management server 14 and a metadata management server 15. The program management server 14 manages already broadcasted programs and broadcasted programs. The metadata management server 15 manages metadata regarding each program. The metadata includes, for example, program title, program ID, program outline, performer, broadcast date and time, script, caption, and commentary data.

The content distribution server 16 is connected to the receiver 4 via the communication network 9 such as the Internet, and distributes content data of the content requested from the receiver 4.
The broadcast station service server 17 transmits the content data of the broadcast station service to the service provider server group 2. Examples of broadcasting station services include social network services and blog services.

  The notification server 18 is connected to the receiver 4 via the communication network 9 and, based on the program security policy data and the application security policy data acquired from the broadcast-related data management unit 111 of the broadcast transmission device 11, distributes to the receiver 4 the AIT (FIG. 6) of the application bound to the program and the policy level data of the program. In addition, the notification server 18 distributes the policy data acquired from the broadcast-related data management unit 111 of the broadcast transmission device 11 to the receiver 4. Note that all or part of this information may be transmitted by the broadcast sending unit 113 of the broadcast transmission device 11 using only the broadcast signal, without being distributed from the notification server 18.

  The service provider server group 2 corresponds to the service provider server group shown in FIG. 3 and includes a receiver application server 21, a service server 22, a content distribution server 23, and a notification server 24. The receiver application server 21, service server 22, content distribution server 23, and notification server 24 are connected to the receiver 4 via the communication network 9.

The receiver application server 21 manages each application and distributes application files to the receiver 4.
The service server 22 is, for example, a multilingual subtitle server, a speech speed conversion voice server, a social TV server, a recommendation server, a bookmark server, and the like, and distributes content data of a service requested from the receiver 4.
The content distribution server 23 is, for example, a VOD distribution server, a caption distribution server, or a multi-view distribution server, and distributes content data of content requested from the receiver 4.
The notification server 24 transmits the application AIT (FIG. 6) to the receiver 4. In the case of the A application, the notification server 24 may transmit AIT (FIG. 6) based on the program security policy data and application security policy data acquired from the broadcast related data management unit 111 of the broadcast transmission device 11.

The repository server 3 corresponds to the repository shown in FIG. 3 and is connected to the receiver 4 via the communication network 9. The repository server 3 applies an electronic signature to an application file (application program) generated by a service provider and transmits the data necessary for authenticating the electronic signature of the application file to the receiver 4. The repository server 3 also transmits to the receiver 4 data indicating a list of A applications and the location information of the A applications. The repository server 3 may transmit the digitally signed application file of an A application to the receiver 4, or the receiver application server 21 may receive the digitally signed application file of the A application from the repository server 3 and transmit it to the receiver 4. The repository server 3 may also transmit the AIT of the A application to the receiver 4.
Furthermore, the repository server 3 may transmit to the receiver 4 the AIT (FIG. 6) of the A application bound to the program, based on the program security policy data and the application security policy data received from the broadcast-related data management unit 111 of the broadcast transmission device 11.

  The receiver 4 corresponds to the receiver shown in FIG. 3 and is, for example, a device such as a television receiver, a set top box, a personal computer, or a portable terminal.

  FIG. 22 is a functional block diagram showing the internal configuration of the receiver 4. As shown in the figure, the receiver 4 includes a broadcast receiving unit 401, a separation unit 402, a clock 403, a first synchronization buffer 404-1, a second synchronization buffer 404-2, a first decoder 405-1, a second decoder 405-2, a data broadcast execution unit 406, a video control unit 407, a video display unit 408, an audio control unit 409, an audio output unit 410, a communication input/output unit 411, an application execution control unit 412, a presentation control unit 413, an operation input unit 414, a channel selection unit 415, a local information storage unit 416, and an external I/F unit 417.

The broadcast receiving unit 401 is a tuner that receives a broadcast signal. The broadcast signal is either a wireless broadcast signal or a wired broadcast signal. A radio broadcast signal is a signal obtained by receiving a broadcast radio wave (ground wave) transmitted by a transmission antenna on the broadcast station side or a satellite wave relayed by a satellite with a reception antenna. The wired broadcast signal is a signal transmitted from the broadcast station side via an optical cable, a coaxial cable, or the like. The broadcast receiving unit 401 receives a broadcast signal, demodulates it, and outputs a broadcast stream (TS).
The separation unit 402 is a demultiplexer that separates the broadcast stream supplied from the broadcast receiving unit 401 into PCR (Program Clock Reference), video data, audio data, subtitle data, data broadcasting, PSI (Program Specific Information)/SI (Service Information), and the AIT transmitted as an independent elementary stream (ES). The AIT may be included in the data broadcast, or the same content as the AIT may be set in the EIT constituting the SI. The separation unit 402 may also separate an application file from the broadcast signal and output it.

  The communication input / output unit 411 inputs and outputs data through communication via the communication network 9. The communication input / output unit 411 outputs the AIT and application file transmitted via the communication network 9 to the application execution control unit 412. Further, the communication input / output unit 411 outputs the policy level data and policy data of the program transmitted via the communication network 9 to the presentation control unit 413. The communication input / output unit 411 communicates content data distributed from the content distribution server 16 and the content distribution server 23 and content data distributed from the service server 22 in accordance with an instruction of an application executed by the application execution control unit 412. The data is received via the network 9 and output to the second synchronization buffer 404-2.

The operation input unit 414 is an interface that receives an operation by the viewer, and is, for example, a receiving device that receives information input by the viewer from a remote controller, a mobile phone, a tablet terminal, or the like, a keyboard, a mouse, or the like. The operation input unit 414 outputs media (terrestrial / BS) and channel selection instructions input by the viewer to the channel selection unit 415. In addition, the operation input unit 414 outputs instructions for starting and ending the broadcasting / communication cooperation service and instructions for the application to the application execution control unit 412.
The channel selection unit 415 controls media and channels received by the broadcast reception unit 401 in accordance with the operation input to the operation input unit 414.

  The data broadcast execution unit 406 executes the data broadcast application transmitted by the digital broadcast signal, and outputs data broadcast image (graphic) data to the video control unit 407. The data broadcast execution unit 406 includes an API for starting an application of the broadcast communication cooperation service. When the data broadcast execution unit 406 executes the data broadcast application and an API for starting the application of the broadcast communication cooperation service is called, the data broadcast execution unit 406 instructs the application execution control unit 412 to start the application. Further, the data broadcast execution unit 406 acquires the AIT and application file transmitted by the data carousel from the data broadcast and outputs them to the application execution control unit 412. Further, the data broadcast execution unit 406 acquires the policy data transmitted by the data carousel from the data broadcast and outputs it to the presentation control unit 413.

  The application execution control unit 412 executes an application of the broadcasting / communication cooperation service. The application execution control unit 412 instructs the second decoder 405-2 to decode the content data received from the content distribution server 16, the content distribution server 23, or the service server 22 in accordance with the application being executed. The content data includes one or both of video data and audio data. The video data is, for example, a moving image, a still image, text data, or the like. The application execution control unit 412 outputs graphic (video) data and video control instructions to the video control unit 407 and outputs audio data and voice control instructions to the audio control unit 409 according to the application being executed.

The clock 403 outputs a timer counter value. The clock 403 adjusts the frequency of the oscillator according to the timer counter value indicated by the PCR, and synchronizes the time with the broadcast transmission side.
The first synchronization buffer 404-1 stores video data, audio data, and caption data output from the separation unit 402. A PES (Packetized Elementary Stream) generated from an elementary stream (ES) of video data, audio data, and subtitle data is set by being divided into transport packets constituting a broadcast stream (TS). The PES header includes PTS (Presentation Time Stamp). The first synchronization buffer 404-1 outputs the video data, audio data, and caption data output from the separation unit 402 in units of PES packets according to the instruction of the first decoder 405-1.
The second synchronization buffer 404-2 stores content data and service content data received by the communication input / output unit 411. Alternatively, the second synchronization buffer 404-2 stores video data, audio data, and caption data output from the separation unit 402 in accordance with a viewer instruction input by the operation input unit 414. The second synchronization buffer 404-2 outputs the stored content data or video data, audio data, and caption data of the program in units of PES packets according to the instruction of the second decoder 405-2.

The first decoder 405-1 identifies the PES packet in the first synchronization buffer 404-1, in which the PTS corresponding to the time output from the clock 403 is set, and the video data encoded from the identified PES packet Audio data and subtitle data are read, and the read data are decoded and output.
The second decoder 405-2 identifies the content data or the PES packet of the program in the second synchronization buffer 404-2 in which the PTS corresponding to the time output from the clock 403 is set, and from the identified PES packet The encoded video data, audio data, and subtitle data are read, and the read data are decoded and output.

  The presentation control unit 413 determines a presentation method (screen display method and audio output method) according to the policy level of the selected program or the policy level of the event that is occurring and the presentation rule data. The presentation control unit 413 instructs the video control unit 407 to display the broadcast screen, the application screen of the A application, and the application screen of the general application according to the determined screen display method. Furthermore, the presentation control unit 413 instructs the audio control unit 409 to output the sound based on the broadcast sound data, the sound based on the sound data of the A application, and the sound based on the sound data of the general application according to the determined sound output method. .

  The video control unit 407 displays, on the video display unit 408, the broadcast screen based on the video data and caption data of the program output from the first decoder 405-1 and the application screens of the A application and the general application based on the video data of the content data output from the second decoder 405-2, in accordance with the screen display method instructed by the presentation control unit 413 or the application execution control unit 412. When graphic (video) data is output from the application execution control unit 412 by executing an application, the video control unit 407 also displays a screen based on that data on the video display unit 408 in accordance with the instructed screen display method. The second decoder 405-2 may also output video data and caption data of other programs.

  The video display unit 408 is a general display and displays broadcast and application screens. For example, the video display unit 408 displays an application screen composed of moving images, still images, or text from the content data received via the communication network 9, or of graphics output from the application execution control unit 412 by executing an application, or a video that combines such an application screen with the broadcast screens of other programs.

  The audio control unit 409 causes the audio output unit 410 to output the audio based on the audio data of the program output from the first decoder 405-1, the audio of the A application and the general application based on the audio data of the content data output from the second decoder 405-2, and the audio based on the audio data output from the application execution control unit 412 by executing an application, according to the audio output method instructed by the presentation control unit 413 or the application execution control unit 412. Note that the second decoder 405-2 may also output audio data of another program. The audio output unit 410 is a general speaker and outputs broadcast and application audio.

The local information storage unit 416 stores various data such as user information.
An external interface unit (hereinafter referred to as “external I / F unit”) 417 transmits and receives data to and from the device 8 connected to a home network such as a LAN (Local Area Network). The device 8 is a terminal that operates in cooperation with the receiver 4, and is, for example, a personal computer, a mobile phone, a tablet, a smartphone, or a PDA.

  When the receiver 4 is a set top box or the like, the video display unit 408 and the audio output unit 410 are external devices connected to the receiver 4.

FIG. 23 is a block diagram illustrating a detailed configuration of the application execution control unit 412.
As shown in the figure, the application execution control unit 412 includes an application storage unit 431, an application authentication unit 432, an application management unit 433, an application control unit 434, an application execution unit 435, a resource access control unit 438, and a resource control unit 439.

  The application storage unit 431 stores application files received by the communication input/output unit 411 via the communication network 9, application files acquired from the data broadcast by the data broadcast execution unit 406, and application files separated from the broadcast signal by the separation unit 402. Application files may also be stored in the application storage unit 431 in advance at the time of shipment. The application storage unit 431 includes a main storage device and an auxiliary storage device such as a disk; for example, an application file is stored on the disk and read into the main storage device at the time of execution. In this case, the application file of an application executed on-the-fly is not stored on the disk but only in the main storage device, and is deleted from the main storage device when execution is completed.

  The application authentication unit 432 receives data necessary for authentication of the electronic signature from the repository server 3, and verifies the electronic signature added to the application file (application program) using the received data. For example, the application authentication unit 432 uses the public key received from the repository server 3 to decrypt the digitally signed application file, and determines that verification of the electronic signature has succeeded when a predetermined data string is obtained as a result. When verification of the electronic signature succeeds, the application authentication unit 432 determines that the application is an A application; when verification fails or no electronic signature is attached, it determines that the application is a general application.
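As a non-limiting illustration of this classification, the sketch below uses the standard Node.js crypto.verify() function in place of the decryption procedure described above; the function names and the choice of SHA-256 are assumptions.

    // Sketch: classify an application as "A" or "general" by signature verification.
    import { verify } from "crypto";

    function classifyApplication(appFile: Buffer, signature: Buffer | undefined, publicKeyPem: string): "A" | "general" {
      if (!signature) return "general";                       // no electronic signature attached
      const ok = verify("sha256", appFile, publicKeyPem, signature);  // verify with the repository server's public key
      return ok ? "A" : "general";                            // success -> A application, failure -> general application
    }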

  The application management unit 433 manages the activation or stop state of the application by the application execution unit 435 and the output status of the activated application. The output status is information indicating whether an image or sound is output from a running application. In response to the inquiry from the presentation control unit 413, the application management unit 433 returns the output status of the activated application and a response indicating whether the activated application is an A application or a general application.

  The application control unit 434 controls activation and stopping of applications in the application execution unit 435 according to the control code for the application bound to the program and according to instructions for the application input via the operation input unit 414. In addition, the application control unit 434 instructs the application execution unit 435 to start an application whose start is instructed by the data broadcast execution unit 406. When the channel is changed according to an input from the operation input unit 414, the application control unit 434 instructs the application execution unit 435 to terminate the application bound to the program of the channel before the change and to start the application bound to the program of the channel after the change. Note that the application control unit 434 acquires the application bound to the program and the control code for the bound application from the AIT carried as an independent ES of the broadcast signal or included in the data broadcast, from the AIT obtained from the EIT of the broadcast signal, or from the AIT received from the notification server 18 or the notification server 24 via the communication input/output unit 411. In addition, the application control unit 434 transmits an application file download request addressed to the location information set in the AIT. The repository server 3 or the receiver application server 21 that has received the download request from the receiver 4 distributes the application file to the receiver 4.

  The application execution unit 435 includes a receiver API unit 436 and a terminal cooperation API unit 437. In accordance with an instruction from the application control unit 434, the application execution unit 435 reads the application program of the application whose start is instructed from the application storage unit 431 and executes it. When the application execution unit 435 executes the application program, the application operates on the receiver 4; through its execution, the application execution unit 435 requests content from the content distribution server 16 or the content distribution server 23 via the communication network 9, or requests a service from the service server 22. Also, by executing the application program, the application execution unit 435 outputs graphic data and video control instructions to the video control unit 407 and outputs audio data and audio control instructions to the audio control unit 409.

The receiver API unit 436 executes the receiver API, which is an API for using each resource in the receiver 4, when the application execution unit 435 executes an application. When the receiver API unit 436 executes the receiver API, the resources in the receiver 4 can be used from the application program executed by the application execution unit 435.
The terminal cooperation API unit 437 executes the terminal cooperation API, which is an API that allows the device 8 on the home network that can communicate with the external I/F unit 417, and devices connected via the communication network 9, to use the functions of the receiver 4. When the terminal cooperation API unit 437 executes the terminal cooperation API, resources in the receiver 4 can be used from a device 8 connected via the home network or a device connected via the communication network 9.

The resource control unit 439 controls access from the receiver API unit 436 and the terminal cooperation API unit 437 to each functional unit that is a resource in the receiver 4.
The resource access control unit 438 controls whether access from the receiver API unit 436 and the terminal cooperation API unit 437 to each functional unit in the receiver 4 is permitted. The resource access control unit 438 performs this control according to whether the application that is the caller of each API executed by the receiver API unit 436 or the terminal cooperation API unit 437 is an A application or a general application.
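A minimal sketch of such an access check is shown below; the set of resources and the rule that general applications may use only a restricted subset are hypothetical examples, since the concrete permission table is not specified here.

    // Sketch of the check performed by the resource access control unit 438 (hypothetical permissions).
    type AppKind = "A" | "general";
    type Resource = "tuner" | "videoPlane" | "audioMixer" | "localStorage" | "network";

    // Assumed subset of resources available to general (unauthenticated) applications.
    const allowedForGeneral: ReadonlySet<Resource> = new Set<Resource>(["network", "localStorage"]);

    function isAccessPermitted(caller: AppKind, resource: Resource): boolean {
      if (caller === "A") return true;            // A applications: full access to receiver resources
      return allowedForGeneral.has(resource);     // general applications: restricted subset only
    }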

  FIG. 24 is a block diagram illustrating a detailed configuration of the presentation control unit 413. As shown in the figure, the presentation control unit 413 includes a policy data management unit 451, a policy data storage unit 452, an event interpretation unit 453, a policy level collation unit 454, an event control unit 455, a program policy storage unit 456, a policy arbitration unit 457, and a policy level storage unit 458.

  The policy data storage unit 452 stores policy data including presentation rule data and a policy level table. The policy data management unit 451 manages the policy data stored in the policy data storage unit 452. The policy data management unit 451 outputs the policy level table read from the policy data storage unit 452 to the policy level collation unit 454, and outputs the presentation rule data read from the policy data storage unit 452 to the policy arbitration unit 457. Further, the policy data management unit 451 receives policy data transmitted by broadcasting from the separation unit 402 or the data broadcast execution unit 406, and receives policy data transmitted by communication from the communication input/output unit 411. The policy data management unit 451 updates the policy data stored in the policy data storage unit 452 with the policy data transmitted by broadcasting or communication.

  The event interpretation unit 453 analyzes the broadcast signal received by the broadcast receiver 401 and the data broadcast and subtitle data separated by the separation unit 402, and detects the occurrence or end of an event. When the event interpretation unit 453 detects (interprets) the occurrence or end of an event, it outputs the event number of the detected event and status data indicating the occurrence or end to the policy level collation unit 454.

The policy level collation unit 454 refers to the policy level table to determine (collate) the policy level corresponding to the genre of each program indicated by the EIT and the policy level corresponding to the event specified by the event number. The policy level collation unit 454 outputs the broadcast start time and broadcast end time data of the program acquired from the SI input from the separation unit 402, together with the policy level of the program (hereinafter referred to as the "program policy level"), to the event control unit 455. When the program policy level is set in the EIT, the policy level collation unit 454 outputs the broadcast start time and broadcast end time data of the program and the program policy level of the program acquired from the EIT to the event control unit 455.
Further, when the policy level collation unit 454 acquires the program policy level from the AIT, it outputs the acquired program policy level to the policy arbitration unit 457. The policy level collation unit 454 also outputs the policy level determined in accordance with the event number (hereinafter referred to as the "trigger policy level") to the policy arbitration unit 457.

  The program policy storage unit 456 stores the program start time and program end time in association with the program policy level. The event control unit 455 associates the program start time and program end time data input from the policy level collation unit 454 with the program policy level, writes them in the program policy storage unit 456, and manages the time for executing display control based on the information stored in the program policy storage unit 456. When the event control unit 455 refers to the program start time data stored in the program policy storage unit 456 and detects that an execution time to be notified has arrived, it outputs the execution time and the program policy level corresponding to that execution time to the policy arbitration unit 457.

  The policy level storage unit 458 stores the execution time, program policy level, trigger policy level, and status data input to the policy arbitration unit 457. The policy arbitration unit 457 determines a policy level from the execution time and program policy level input from the event control unit 455 and the trigger policy level input from the policy level collation unit 454. For example, the trigger policy level may be adopted as the policy level, or the higher of the program policy level and the trigger policy level may be adopted as the policy level.

  When the program policy level acquired from the AIT is input from the policy level collation unit 454, the policy arbitration unit 457 gives priority to the program policy level input from the policy level collation unit 454 over the program policy level input from the event control unit 455. That is, the policy arbitration unit 457 determines the policy level from the program policy level obtained from the AIT and the trigger policy level. The policy arbitration unit 457 then refers to the presentation rule data and determines the screen display method and audio output method (presentation method) based on the determined policy level, the information acquired from the application management unit 433 indicating whether the running application is an A application, and the output status. The policy arbitration unit 457 outputs the determined screen display method to the video control unit 407, and outputs the determined audio output method to the audio control unit 409.
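The arbitration described above can be illustrated by the following sketch, which assumes numeric policy levels and adopts one of the example rules given in the text (the higher of the program policy level and the trigger policy level), with the AIT-derived program policy level taking precedence when present.

    // Sketch of policy level arbitration (numeric levels assumed).
    function arbitratePolicyLevel(
      programLevelFromSchedule: number,          // program policy level from SI/EIT via the event control unit
      programLevelFromAit: number | undefined,   // program policy level from the AIT, if any
      triggerLevel: number | undefined           // trigger policy level from the event number, if any
    ): number {
      const programLevel = programLevelFromAit ?? programLevelFromSchedule;  // AIT value takes precedence
      return triggerLevel === undefined ? programLevel : Math.max(programLevel, triggerLevel);
    }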

Here, returning to FIG. 1, the operation accepting unit 474, the activation request signal acquisition unit 471, the application information acquisition unit 472, the activation control unit 473, and the end control unit 481 will be described.
The connection control unit 501, the terminal application acquisition unit 502, and the terminal application execution unit 503 will also be described.

  The operation reception unit 474 is an operation device that transmits an operation signal in accordance with an operation by a viewer (operator). The operation reception unit 474 is, for example, a remote controller capable of remotely operating the receiver 4, an operation panel disposed on the body of the receiver 4, or the like. The remote controller includes a mobile terminal (such as a mobile phone, a smartphone, or a tablet terminal) that is executing an application program for realizing a remote control function, a computer device, a car navigation terminal, and the like.

  The operation reception unit 474 includes an operation unit for operating the receiver 4. In the present embodiment, the operation unit of the operation reception unit 474 includes, for example, a power button, numeric buttons from "0" to "9" (channel designation buttons), a channel switching button, a volume control button, and a data broadcasting service button, which are the same operation buttons as those provided on a conventional television remote controller. The data broadcasting service button is an operation button for switching between displaying and hiding data broadcasting each time it is pressed while the receiver 4 is receiving data broadcasting. The data broadcasting service button is also called a data broadcast button, d button, or D button.

  In addition, the operation unit of the operation reception unit 474 is provided with a broadcast communication cooperation service button (not shown in FIG. 1). This broadcast communication cooperation service button is an operation button for causing the receiver 4 to start receiving the broadcast communication cooperation service. The broadcast communication cooperation service button is also called a broadcast communication cooperation button, h button, or H button. When the broadcast communication cooperation service button is pressed, the operation reception unit 474 transmits an activation request signal for requesting the start of the stream dependent service in the broadcast communication cooperation service.

  Note that the operation unit of the operation reception unit 474 may be realized by, for example, a touch panel and a graphical user interface (GUI).

The operation input unit 414 receives the operation signal transmitted by the operation reception unit 474.
The activation request signal acquisition unit 471 of the operation input unit 414 takes in an activation request signal among the operation signals received by the operation input unit 414, generates an activation request command, and supplies the activation request command to the application control unit 434 of the application execution control unit 412.
When the operation input unit 414 receives an operation signal transmitted by operating a broadcast service return button provided on the operation unit of the operation reception unit 474, for example a numeric button or a channel switching button, the operation input unit 414 generates an end request command and supplies this end request command to the application control unit 434.

  The application information acquisition unit 472 of the application control unit 434 takes in, regularly or irregularly, the AIT (Application Information Table) supplied from the separation unit 402. More specifically, the application information acquisition unit 472 takes in, regularly or irregularly, the ES (Elementary Stream) of the AIT extracted from the TS (Transport Stream; broadcast stream) corresponding to the channel selected as desired among the media selected by the viewer.

  As described above, the AIT is information including information related to the application, control information for the application, and information for specifying the application. As shown in FIG. 6, the application information includes an application name (appName) and an application ID (orgId, appId). The control information for the application includes the application control code (application_control_code) shown in Table 7 above. This application control code is, for example, data for controlling the life cycle of the application. The information specifying the application includes location information (location). This location information is information for specifying the application file and its storage location. The location information in the stream dependent service is a URL (Uniform Resource Locator) of the receiver application server 21 or the repository server 3 (application server) that stores the application file.
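For reference, the AIT fields referred to above can be pictured as the following TypeScript interface; the overall shape is illustrative only and the field types are assumptions.

    // Sketch of an AIT entry as used by the application information acquisition unit 472.
    interface AitEntry {
      appName: string;                 // application name
      orgId: number;                   // organization part of the application ID
      appId: number;                   // application part of the application ID
      applicationControlCode: number;  // life-cycle control code, e.g. 0x12 (PRESENT), 0x03 (DESTROY)
      location: string;                // URL of the receiver application server 21 or repository server 3
    }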

The application information acquisition unit 472 extracts the application control code and application ID from the captured AIT, and supplies the application control code and application ID to the end control unit 481.
Note that the application information acquisition unit 472 may extract the application name from the captured AIT instead of the application ID or together with the application ID and supply it to the end control unit 481.

  Further, when the application information acquisition unit 472 captures the activation request command supplied from the activation request signal acquisition unit 471, it extracts the application name, application control code, and location information from the AIT captured at the same timing as, or immediately after, the activation request command. The application name, application control code, and location information extracted by the application information acquisition unit 472 are referred to as application information. The application information acquisition unit 472 supplies the extracted application information to the activation control unit 473.

Further, the application information acquisition unit 472 takes in PSI (Program Specific Information, program identification information) / SI (Service Information, program arrangement information) supplied from the separation unit 402 regularly or irregularly. More specifically, the application information acquisition unit 472 periodically or irregularly extracts the PSI / SI extracted from the TS corresponding to the desired channel selected from the above-described desired media. take in.
PSI / SI is information including metadata related to a broadcast program. As described above, the metadata is information related to a broadcast program such as a program ID, a program overview, performers, staff, broadcast date and time, script, captions, and commentary.

In the present embodiment, PSI / SI includes application designation information for designating an application (also referred to as a terminal application) to be executed by the device 8 when the receiver 4 and the device 8 are operated in cooperation. The application designation information may be identification information for identifying a terminal application, or may be a URL of an external terminal application server that stores the terminal application.
The application designation information may be included in the metadata.
The signal setting unit 112 of the broadcast transmission apparatus 11 multiplexes application designation information in the TS in association with the broadcast program or its contents or its progress.

The application information acquisition unit 472 extracts application designation information and metadata from the captured PSI / SI, and supplies all or part of the application designation information and metadata to the application execution unit 435.
Either or both of the application designation information and the metadata may be included in an ES provided independently in association with a specific identifier. In this case, the separation unit 402 extracts the application designation information and/or the metadata from the ES using the specific identifier as a clue.

  The activation control unit 473 captures application information supplied from the application information acquisition unit 472, that is, an application name, an application control code, and location information. The activation control unit 473 controls the activation of the application corresponding to the application information when the application control code is identified as data indicating standby of the application.

Specifically, when the application control code is identified as data indicating application standby, the activation control unit 473 generates an application request command and supplies the application request command to the communication input/output unit 411. That is, the activation control unit 473 functions as an application request unit.
The data indicating the standby of the application is, for example, “12 (hexadecimal number)” (identification name: PRESENT) in the application control code shown in Table 7 above.
In addition to the above “12 (hexadecimal number)” (identification name: PRESENT), the data indicating the standby of the application may be a dedicated code.
The application request command is a command indicating an acquisition request (download request) for the application file, with the location indicated by the location information as the request destination.
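A minimal sketch of this behavior is shown below: when the control code indicates standby (PRESENT, 0x12), an application file is requested from the location written in the AIT. fetch() merely stands in for the communication input/output unit 411, and the inline type is hypothetical.

    // Sketch of the activation control: PRESENT -> download request to the location in the AIT.
    const PRESENT = 0x12;

    async function handleActivationRequest(
      ait: { applicationControlCode: number; location: string }
    ): Promise<ArrayBuffer | undefined> {
      if (ait.applicationControlCode !== PRESENT) return undefined;  // not in standby: do nothing
      const response = await fetch(ait.location);                    // download request to the application server
      return response.arrayBuffer();                                 // application file to be stored and executed
    }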

  Note that control of application activation by the activation control unit 473 includes reading an application file stored in the application storage unit 431 into the application execution unit 435 to start execution.

In addition, when the activation control unit 473 receives the application file acquisition notification supplied from the communication input / output unit 411, the activation control unit 473 generates an application execution request command and supplies the application execution request command to the application execution unit 435.
The application execution request command is a command indicating an execution start request for the application corresponding to the application file acquisition notification.

  When the termination control unit 481 of the application control unit 434 takes in the termination request command supplied from the operation input unit 414, it detects the application processing state (application execution state) from the application execution unit 435. When the detected application execution state is the execution state, the end control unit 481 generates an application execution end command and supplies the application execution end command to the application execution unit 435. That is, when the termination control unit 481 captures the termination request command supplied from the operation input unit 414, it controls the termination of execution of the running application associated with the TS being captured.

  Further, the end control unit 481 takes in the application control code and application ID supplied from the application information acquisition unit 472. When the captured application control code is identified as data for instructing the end of execution of the application, the end control unit 481 controls the end of execution of the application corresponding to the application ID associated with the application control code.

Specifically, if the end control unit 481 identifies that the application control code is data instructing the end of execution of the application, the end control unit 481 generates an application execution end command including the application ID associated with the application control code. The application execution end command is supplied to the application execution unit 435.
The data for instructing the end of execution of the application is, for example, “03 (hexadecimal number)” (identification name: DESTROY) in the application control code shown in Table 7 above.

  In addition to the above "03 (hexadecimal number)" (identification name: DESTROY), the data for instructing the end of execution of the application may be "04 (hexadecimal number)" (identification name: KILL), "08 (hexadecimal number)" (identification name: KILL ALL), or another dedicated code.
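The corresponding end control can be sketched as follows; the terminate() callback is a hypothetical stand-in for supplying an application execution end command to the application execution unit 435.

    // Sketch of end control: DESTROY, KILL, and KILL ALL end the application identified by appId.
    const END_CODES = new Set([0x03, 0x04, 0x08]);  // DESTROY, KILL, KILL ALL

    function handleControlCode(code: number, appId: number, terminate: (appId: number) => void): void {
      if (END_CODES.has(code)) {
        terminate(appId);  // corresponds to issuing an application execution end command
      }
    }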

  The application execution unit 435 described above includes a content request unit (not shown) as part of its functional configuration. This content request unit generates a content request command for requesting, from the content distribution server 16 of the broadcast station server group 12 or the content distribution server 23 of the service provider server group 2, the content data necessary for the application execution process, and supplies the content request command to the communication input/output unit 411.

  As described above, the application execution unit 435 has a terminal cooperation API (Application Program Interface) for realizing a cooperation function between the receiver 4 and the device 8. When executing an application (also referred to as a cooperative application) that executes a cooperative process with the device 8, the application execution unit 435 causes the cooperative application to call the terminal cooperative API and execute the application.

  The application execution unit 435 takes in the application designation information and metadata supplied from the application information acquisition unit 472 after the receiver 4 and the device 8 start operating in cooperation. The cooperative operation of the receiver 4 and the device 8 means establishing a communication path between the receiver 4 and the device 8 so that the receiver 4 and the device 8 cooperate to realize the receiving state of the broadcasting/communication cooperation service. Communication between the receiver 4 and the device 8 is performed via a WebSocket connection using a WebSocket bridge function included in the bridge unit 493 of the external I/F unit 417.

Based on the information specifying the device 8 that is operating in cooperation, the information specifying the terminal application to be executed by the device 8 (application designation information), and the metadata, the application execution unit 435 designates the terminal application for the device 8 and performs activation control. The information specifying the device 8 is information acquired from the device 8 when the application execution unit 435 establishes cooperation between the receiver 4 and the device 8.
In addition, when the device 8 to be operated in cooperation with the receiver 4 is specified in advance, the information specifying the device 8 may be omitted.

  In the terminal cooperation API, the application execution unit 435 provides an InvokeApplicationOnDevice() method for designating and starting a terminal application on the device 8 based on the information specifying the device 8, the application designation information, and the metadata. The InvokeApplicationOnDevice() method has, for example, the following format.

Method name: InvokeApplicationOnDevice ()
Function: Designates and starts a terminal application for the device 8.
Arguments: device_dev, application_id, strings_parameters

In the InvokeApplicationOnDevice() method, device_dev is information (a device ID) specifying the device 8 that is operating in cooperation, application_id is information (application designation information, an application ID) specifying the terminal application, and strings_parameters is the metadata.
That is, after the device 8 and the receiver 4 start the cooperative operation, when the application execution unit 435 executes the InvokeApplicationOnDevice() method, it transmits, to the device 8 corresponding to device_dev via the external I/F unit 417, a start command for starting the terminal application corresponding to application_id.
At this time, strings_parameters is included as an argument in this start command.

  Note that the application execution unit 435 may execute the InvokeApplicationOnDevice() method with strings_parameters omitted from the arguments. In the present embodiment, the case where strings_parameters is included in the arguments will be described.
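As an illustration of how a cooperative application might call this method, the sketch below assumes string-valued arguments and hypothetical identifier values; only the method name and the order of the arguments follow the description above.

    // Sketch of a call to the terminal cooperation API (hypothetical interface and values).
    interface TerminalCooperationApi {
      InvokeApplicationOnDevice(device_dev: string, application_id: string, strings_parameters: string): void;
    }

    function startTerminalAppOnDevice(api: TerminalCooperationApi): void {
      // device_dev: ID of the cooperating device 8, application_id: terminal application to start,
      // strings_parameters: metadata (e.g. cast information) passed as an argument of the start command.
      api.InvokeApplicationOnDevice("device-001", "vote-app-001", JSON.stringify({ program: "quiz", cast: ["performer A"] }));
    }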

Further, the communication input / output unit 411 described above includes an application acquisition unit (not shown) and a content acquisition unit as its functional configuration.
The application acquisition unit takes in an application file transmitted from an external server that has received the application request signal transmitted by the communication input/output unit 411, for example the receiver application server 21 or the repository server 3, and stores the application file in the application storage unit 431.
The content acquisition unit takes in content data transmitted from an external server that has received the content request signal transmitted by the communication input/output unit 411, for example the content distribution server 16 or the content distribution server 23, and supplies this content data to the second synchronization buffer 404-2.

  The connection control unit 501 of the device 8 is a communication interface for communicating with the external I/F unit 417 of the receiver 4. When the application execution unit 435 executes the InvokeApplicationOnDevice() method after the device 8 and the receiver 4 start cooperative operation, the connection control unit 501 captures the activation command, which carries strings_parameters as an argument, supplied from the external I/F unit 417, and supplies the activation command to the terminal application acquisition unit 502.

  The terminal application acquisition unit 502 takes in the activation command supplied from the connection control unit 501, acquires a terminal application based on the activation command, and supplies the acquired terminal application to the terminal application execution unit 503 together with strings_parameters.

Specifically, the terminal application acquisition unit 502 includes a terminal application storage unit (not shown). The terminal application acquisition unit 502 determines whether device_dev indicated by the start command corresponds to the device 8. When device_dev corresponds to the device 8, the terminal application acquisition unit 502 determines whether a terminal application corresponding to application_id indicated by the activation command is stored in the terminal application storage unit. When the terminal application corresponding to application_id is stored in the terminal application storage unit, the terminal application acquisition unit 502 reads the terminal application from the terminal application storage unit and supplies it to the terminal application execution unit 503 along with strings_parameters.
When device_dev does not correspond to the device 8, the terminal application acquisition unit 502 may transmit error information to the receiver 4 via the connection control unit 501.

When the terminal application corresponding to application_id is not stored in the terminal application storage unit, the terminal application acquisition unit 502 generates a terminal application request signal indicating an acquisition request for the terminal application, and transmits the terminal application request signal to the terminal application server. Then, the terminal application acquisition unit 502 takes in the terminal application supplied from the terminal application server in response to the terminal application request signal, and supplies this terminal application to the terminal application execution unit 503 together with strings_parameters.
The terminal application acquisition unit 502 may store the captured terminal application in the terminal application storage unit.
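The acquisition logic of the terminal application acquisition unit 502 can be sketched as follows; the storage map, the fetchFromServer callback, and the string representation of a terminal application are hypothetical stand-ins.

    // Sketch: use a locally stored terminal application if available, otherwise request it from the server.
    interface StartCommand {
      device_dev: string;
      application_id: string;
      strings_parameters: string;
    }

    async function acquireTerminalApp(
      cmd: StartCommand,
      ownDeviceId: string,
      storage: Map<string, string>,                      // application_id -> stored terminal application
      fetchFromServer: (appId: string) => Promise<string>
    ): Promise<string | undefined> {
      if (cmd.device_dev !== ownDeviceId) return undefined;   // not addressed to this device: report an error instead
      const cached = storage.get(cmd.application_id);
      if (cached !== undefined) return cached;                // already stored locally
      const app = await fetchFromServer(cmd.application_id);  // terminal application request to the server
      storage.set(cmd.application_id, app);                   // may be stored for later reuse
      return app;
    }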

  The terminal application is, for example, a remote control application for realizing a remote control function for remotely operating the receiver 4, an answer (voting) application for transmitting a viewer's answer to a broadcasting station or a service provider in a quiz program, a variety program, or the like, or a client application for a web log, chat, SNS (Social Networking Service), or another client function.

The terminal application execution unit 503 takes in the terminal application and strings_parameters supplied from the terminal application acquisition unit 502 and executes the terminal application using the strings_parameters.
For example, when strings_parameters is information related to a cast or staff of a broadcast program, the terminal application execution unit 503 executes the terminal application using information related to the cast or staff of the broadcast program, that is, broadcast resources.

  FIG. 25 is a schematic external front view of an infrared remote controller used as the operation reception unit 474. As shown in the figure, the operation reception unit 474 includes buttons similar to those provided on a conventional TV remote controller, such as a power button, numeric buttons, a channel switching button, a volume control button, and a data broadcasting service button, and additionally includes a broadcast communication cooperation service button 475.

  Next, a description will be given of the process in which, when the broadcast communication cooperation service button 475 of the operation reception unit 474 is pressed, the receiver 4 acquires an AIT, acquires an application file from an external server based on the AIT, executes the application, and acquires and presents related content from an external server in accordance with the execution process of the application.

FIG. 26 is a sequence diagram showing processing procedures of the receiver 4, the receiver application server 21, and the content distribution server 23.
In step S1, when the receiver 4 receives the activation request signal transmitted from the operation reception unit 474, it extracts the ES of the AIT from the TS corresponding to the channel selected as desired among the media selected by the viewer.
Next, in step S2, when the receiver 4 identifies that the application control code included in the AIT is data indicating application standby, the receiver 4 transmits an application request signal to the receiver application server 21.
Next, in step S3, when the receiver application server 21 receives and captures the application request signal transmitted by the receiver 4, the receiver application server 21 reads the application file specified by the application request signal.
Next, in step S4, the receiver application server 21 transmits the application file to the receiver 4 that has transmitted the application request signal.

Next, in step S5, the receiver 4 receives and takes in the application file transmitted by the receiver application server 21, and starts application execution processing.
Next, in step S6, the receiver 4 transmits to the content distribution server 23 a content request signal for acquiring content data that is necessary for the execution process of the application.
Next, in step S7, when the content distribution server 23 receives and takes in the content request signal transmitted by the receiver 4, it reads out the content data specified by the content request signal.
Next, in step S8, the content distribution server 23 transmits content data to the receiver 4 that has transmitted the content request signal.
Next, in step S9, the receiver 4 receives and takes in the content data transmitted by the content distribution server 23, decodes the content, displays it, and outputs it.

  In FIG. 26, the same processing is performed when the repository server 3 is provided instead of the receiver application server 21 and when the content distribution server 16 is provided instead of the content distribution server 23.

Next, the operation of the receiver 4 that receives the operation signal transmitted from the operation reception unit 474 will be described in detail.
FIG. 27 and FIG. 28 are flowcharts showing the processing procedure of the operation when the receiver 4 operates according to the operation of the operation receiving unit 474.
The receiver 4 starts executing the processing according to this flowchart in a state where it captures the broadcast signal and extracts the broadcast content from the TS corresponding to the media and channel selected by the viewer, that is, in a state where it displays the desired program and outputs its audio.

First, in step S11, the operation input unit 414 accepts reception of an operation signal transmitted from the operation reception unit 474. When the operation input unit 414 receives the operation signal transmitted from the operation reception unit 474, it takes in the operation signal and proceeds to the process of step S12.
In step S12, the operation input unit 414 proceeds to the process of step S13 when the received operation signal is the activation request signal (S12: YES), and proceeds to the process of step S21 in FIG. 28 when the received operation signal is not the activation request signal (S12: NO).

In step S13, the activation request signal acquisition unit 471 takes in the activation request signal, generates an activation request command, and supplies the activation request command to the application control unit 434 of the application execution control unit 412.
Next, the application information acquisition unit 472 of the application control unit 434 takes in the activation request command supplied from the activation request signal acquisition unit 471.
Next, the application information acquisition unit 472 takes in the AIT supplied from the separation unit 402. This AIT is information extracted from the TS corresponding to the channel selected as desired among the media desired by the viewer.

In step S14, the application information acquisition unit 472 extracts the application name, application control code, and location information as application information from the captured AIT, and supplies the application information to the activation control unit 473.
Next, the activation control unit 473 takes in the application information supplied from the application information acquisition unit 472, that is, the application name, application control code, and location information.

Next, in step S15, if the activation control unit 473 identifies that the application control code is data indicating application standby (S15: YES), the process proceeds to step S16; if it identifies that the application control code is not data indicating application standby (S15: NO), the process of this flowchart is terminated.
Specifically, if the activation control unit 473 identifies that the application control code is, for example, "12 (hexadecimal number)" (identification name: PRESENT), the process proceeds to step S16; if it identifies that the application control code is not "12 (hexadecimal number)" (identification name: PRESENT), the process of this flowchart is terminated.

In step S16, the activation control unit 473 generates an application request command specifying the location indicated by the location information as the request destination of the application file, and supplies the application request command to the communication input/output unit 411.
Next, the communication input / output unit 411 receives the application request command supplied from the activation control unit 473, and transmits this application request command as an application request signal to the request destination. This application request signal is, for example, a signal obtained by converting an application request command into an IP (Internet Protocol) packet.
That is, the communication input / output unit 411 transmits an application request signal to the receiver application server 21 or the repository server 3 that is a request destination indicated by the application request command.

  Next, in step S17, the communication input / output unit 411 accepts reception of an application file transmitted by the request destination (the receiver application server 21 or the repository server 3). When the communication input / output unit 411 receives the application file transmitted by the request destination, the communication input / output unit 411 takes in the application file and moves to the process of step S18.

  In step S18, the communication input/output unit 411 supplies the captured application file to the application storage unit 431 of the application execution control unit 412 for storage.

In step S19, the communication input/output unit 411 supplies an application file acquisition notification to the activation control unit 473.
Next, the activation control unit 473 captures an application file acquisition notification supplied from the communication input / output unit 411.
Next, the activation control unit 473 generates an application execution request command for the application corresponding to the application file acquisition notification, and supplies the application execution request command to the application execution unit 435.
Next, the application execution unit 435 takes in an application execution request command supplied from the activation control unit 473.
Next, the application execution unit 435 reads the application file designated by the fetched application execution request command from the application storage unit 431, starts application execution processing, and ends the processing of this flowchart.

  On the other hand, in step S21 of FIG. 28, when the received operation signal is an operation signal transmitted by operating a broadcast service return button (for example, a numeric button or a channel switching button) provided in the operation unit of the operation reception unit 474 (S21: YES), the operation input unit 414 proceeds to the process of step S22; otherwise, the process proceeds to step S25.

In step S22, the operation input unit 414 generates an end request command and supplies the end request command to the application control unit 434.
Next, the termination control unit 481 of the application control unit 434 receives the termination request command supplied from the operation input unit 414 and detects the application execution state from the application execution unit 435.

  Next, in step S23, when the detected application execution state is the execution state (S23: YES), the end control unit 481 proceeds to the process of step S24; when the application execution state is not the execution state (S23: NO), the process of this flowchart is terminated.

In step S24, the termination control unit 481 generates an application execution termination command and supplies the application execution termination command to the application execution unit 435.
Next, when the application execution unit 435 receives the application execution end command supplied from the end control unit 481, the application execution unit 435 ends the execution process of the application currently being executed and ends the process of this flowchart.

On the other hand, in step S25, the operation input unit 414 supplies a code for specifying a medium or a channel to the channel selection unit 415, or supplies a code for setting the volume to a volume control circuit (not shown), according to the operation signal other than the activation request signal.
The receiver 4 then starts the operation according to the operation signal other than the activation request signal, and ends the process of this flowchart.

Next, the operation of the receiver 4 that terminates execution of an application based on control from the provider side of the broadcasting / communication cooperation service will be described.
FIG. 29 is a flowchart illustrating a processing procedure of the operation of the receiver 4.
First, in step S31, the application information acquisition unit 472 takes in the AIT supplied from the separation unit 402.

In step S32, the application information acquisition unit 472 extracts the application control code and the application ID from the captured AIT, and supplies the application control code and the application ID to the end control unit 481.
Next, the end control unit 481 takes in the application control code and the application ID supplied from the application information acquisition unit 472.

Next, in step S33, when the end control unit 481 identifies that the application control code is data instructing the end of execution of the application (S33: YES), the process proceeds to step S34; when it identifies that the application control code is not data instructing the end of execution of the application (S33: NO), the process returns to step S31.
Specifically, when the end control unit 481 identifies that the application control code is, for example, "03 (hexadecimal number)" (identification name: DESTROY), the process proceeds to step S34; when it identifies that the application control code is not "03 (hexadecimal number)" (identification name: DESTROY), the process returns to step S31.

In step S34, the termination control unit 481 generates an application execution termination command including the application ID associated with the application control code, and supplies the application execution termination command to the application execution unit 435.
Next, the application execution unit 435 takes in the application execution end command supplied from the end control unit 481.
Next, when the application execution unit 435 receives the application execution end command supplied from the end control unit 481, the application execution unit 435 ends the execution process of the application that is currently being executed, and ends the process of this flowchart.

Next, the operation of the receiving system that dynamically changes the terminal application executed by the device 8 that operates in cooperation with the receiver 4 will be described.
FIG. 30 is a sequence diagram illustrating processing procedures performed by the receiver 4, the device 8, and the terminal application server.
First, in step S41, the receiver 4 and the device 8 establish cooperation.
Specifically, the application execution unit 435 of the receiver 4 causes the cooperation application that executes the cooperation processing with the device 8 to call the terminal cooperation API and execute the processing. When the application execution unit 435 executes the terminal cooperation API, the receiver 4 and the device 8 establish cooperation.

Here, a detailed operation of the cooperation process between the receiver 4 and the device 8 in step S41 will be described.
FIG. 31 is a sequence diagram illustrating the procedure of the cooperation process between the receiver 4 and the device 8.
First, when the terminal application execution unit 503 of the device 8 executes the receiver connection application in response to an operation by the user of the device 8, in step S411 the connection control unit 501 performs a search for the receiver 4 by UPnP SSDP.

  When the receiver 4 receives the search by the device 8, the external I/F unit 417 of the receiver 4 transmits a response including the WebSocket connection address and port number to the device 8 in step S412. Specifically, the external I/F unit 417 includes a device-side server unit 491 realized by executing a WebSocket server program, and notifies the device 8 of the connection address and port number used to access the device-side server unit 491.

  When the device 8 receives the response including the connection address and the port number from the receiver 4, in step S413 the terminal application execution unit 503 of the device 8 displays, on a display (not shown) of the device 8, the receivers 4 that responded to the search as a list of available receivers 4. The terminal application execution unit 503 then waits for selection, from the list, of the receiver 4 to cooperate with.

  Next, in step S414, when one receiver 4 is selected from the list of available receivers 4 by the user of the device 8, the terminal application execution unit 503 accepts the selection of the receiver 4 to cooperate with.

  Next, in step S415, the connection control unit 501 transmits a WebSocket handshake request to the receiver 4 that has received the selection in step S414. Specifically, the connection control unit 501 transmits a handshake request with the connection address and port number received from the receiver 4 in step S412 as the destination.

  When the receiver 4 receives the handshake request from the device 8, in step S416 the connection unit 493 of the receiver 4 returns a WebSocket handshake response to the device 8 via the device-side server unit 491 and establishes a connection. That is, the connection unit 493 establishes a WebSocket connection between the device-side server unit 491 and the device 8. Thereafter, the receiver 4 and the device 8 can perform bidirectional communication via this connection. At this time, the device 8 transmits a request for event reception registration (registration for receiving terminal application activation information) to the receiver 4, and the receiver 4 registers the event reception in response to the request. Accordingly, when application designation information is included in the broadcast stream separated by the separation unit 402, the receiver 4 notifies the registered device 8 of the activation command of the application by push transmission.
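For illustration, the device-side processing of steps S415 and S416 might look like the following sketch, which uses the standard browser WebSocket API; the registration message format and the URL path are assumptions not specified in the text.

    // Sketch: open a WebSocket to the address and port notified by the receiver,
    // register for event (start command) reception, and handle pushed messages.
    function connectToReceiver(address: string, port: number): WebSocket {
      const socket = new WebSocket(`ws://${address}:${port}/`);   // handshake request (step S415)
      socket.onopen = () => {
        // connection established (step S416); request event reception registration (format assumed)
        socket.send(JSON.stringify({ type: "registerEventReception" }));
      };
      socket.onmessage = (ev: MessageEvent) => {
        // a pushed start command carrying application_id and strings_parameters would arrive here
        console.log("start command from receiver:", ev.data);
      };
      return socket;
    }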

Returning to FIG. 30, when cooperation is established between the receiver 4 and the device 8, in step S42 the receiver 4 acquires the application designation information and metadata from the TS being received.
Specifically, the application information acquisition unit 472 takes in the PSI/SI extracted from the TS corresponding to the channel selected as desired among the media selected as desired by the viewer, and captures the application designation information and metadata from this PSI/SI.

Next, in step S43, based on the information specifying the device 8 that is operating in cooperation, the application designation information (first information) taken in from the PSI/SI in the process of step S42, and the metadata, the receiver 4 designates the terminal application for the device 8 and performs activation control.
Specifically, the application execution unit 435 executes the InvokeApplicationOnDevice() method of the terminal cooperation API and pushes a start command (second information) including strings_parameters, which starts the terminal application corresponding to application_id on the device 8 corresponding to device_dev, to the device 8 via the external I/F unit 417 through the WebSocket connection established in step S41.

  Next, in step S44, the device 8 receives the start control of the terminal application from the receiver 4, and displays information on the designated terminal application on a display (not shown). Then, when the device 8 receives an instruction to start the terminal application by the user, the device 8 generates a terminal application request signal for the specified terminal application, and transmits the terminal application request signal to the terminal application server.

Specifically, the connection control unit 501 takes in an activation command supplied from the external I / F unit 417 with strings_parameters as an argument, and supplies this activation command to the terminal application acquisition unit 502.
Next, the terminal application acquisition unit 502 takes in an activation command supplied from the connection control unit 501.
Next, the terminal application acquisition unit 502 acquires a terminal application based on this activation command, and supplies this terminal application to the terminal application execution unit 503 together with strings_parameters.

More specifically, the terminal application acquisition unit 502 determines whether device_dev indicated by the start command corresponds to the device 8. When device_dev corresponds to the device 8, the terminal application acquisition unit 502 determines whether a terminal application corresponding to application_id indicated by the start command is stored in the terminal application storage unit.
However, FIG. 30 is an example when the terminal application corresponding to application_id is not stored in the terminal application storage unit.
When the terminal application corresponding to application_id is not stored in the terminal application storage unit, the terminal application acquisition unit 502 generates a terminal application request signal indicating an acquisition request for the terminal application, and the terminal application request signal is transmitted to the terminal application server. Send to.

Next, in step S45, when the terminal application server receives and captures the terminal application request signal transmitted by the device 8, the terminal application server reads the file of the terminal application specified by the terminal application request signal.
Next, in step S46, the terminal application server transmits a file of the terminal application to the device 8 that has transmitted the terminal application request signal.

Next, in step S47, the device 8 receives and takes in the file of the terminal application transmitted from the terminal application server.
Specifically, the terminal application acquisition unit 502 takes in the terminal application supplied from the terminal application server in response to the terminal application request signal, and supplies this terminal application to the terminal application execution unit 503 together with strings_parameters.

Next, in step S48, the device 8 starts terminal application execution processing on the captured terminal application file.
Specifically, the terminal application execution unit 503 takes in the terminal application and strings_parameters supplied from the terminal application acquisition unit 502, and starts execution processing of the terminal application using the strings_parameters.
Next, in step S49, the receiver 4 performs the cooperative application execution process, and the device 8 performs the terminal application execution process. Thereby, the receiver 4 and the device 8 realize a cooperative operation using broadcast resources.

  Note that the connection established in step S41 is released when the cooperative application of the receiver 4 or the receiver connection application of the device 8 is terminated. Therefore, when the device 8 operates as a single task, that is, when the receiver connection application ends upon the activation of the terminal application in step S48, the terminal application newly establishes a WebSocket connection with the receiver 4. On the other hand, when the device 8 operates by multitasking and the terminal application and the receiver connection application operate in parallel, the connection is maintained. In this case, therefore, the terminal application may perform the cooperation processing using the connection established by the receiver connection application.

Next, an operation for linking applications of the receiver 4 and the device 8 will be described.
In this example, the receiver 4 is connected to a plurality of devices 8 (devices 8-1 and 8-2), and the case where an application is executed in the order of the device 8-1, the receiver 4, and the device 8-2 will be described.
FIG. 32 is a sequence diagram illustrating a cooperation procedure between applications.
First, the devices 8-1 and 8-2 acquire a terminal application according to the procedure of steps S41 to S47 described above (step S51). Further, the receiver 4 acquires a receiver application from a receiver application server (not shown) based on the application designation information acquired in step S42 described above and the information on the application to be executed by the receiver 4 indicated by the metadata (step S52). Note that the terminal application executed by the devices 8-1 and 8-2 and the receiver application executed by the receiver 4 have the same value for the connection type, which is information indicating the type of application to be linked.

Next, the device 8-1 executes the terminal application (step S53). Next, the device 8-1 transmits a handshake request to the receiver 4 through the procedure of steps S411 to S415 described above, by the executed terminal application (step S54).
When the connection unit 493 of the receiver 4 acquires the handshake request via the device-side server unit 491, the bridge unit 494 performs a bridge determination process for determining whether or not to perform a bridge connection process between the application executed by the application execution unit 435 and the application executed by the device 8 (step S55).

Here, the detailed operation of the bridge determination process by the receiver 4 will be described.
FIG. 33 is a flowchart illustrating a procedure of bridge determination processing by the receiver 4.
First, the connection unit 493 of the external I / F unit 417 waits for reception of a handshake request from the device-side server unit 491 or the receiver-side server unit 492 (step S501). Then, the connect unit 493 receives a handshake request from the application execution unit 435 or the device 8 (step S502). The handshake request includes a port number and a connection type in addition to the connection address.
Here, the connection unit 493 and the bridge unit 494 read the port number to identify whether the received handshake request comes from an application executed by the device 8 or from an application executed by the application execution unit 435.
For example, a handshake request to port 1000 comes from an application executed by the application execution unit 435, and a handshake request to any of ports 1001 to 1010 comes from an application executed by the device 8.
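As a non-limiting illustration of this port-based identification, the following TypeScript sketch classifies a handshake request by its destination port. The constant names are assumptions; the port values are taken only from the example in the preceding paragraph.

```typescript
// Illustrative only: classify the origin of a handshake request by port number.
type HandshakeOrigin = "receiver-application" | "device" | "unknown";

const RECEIVER_PORT = 1000;                     // example receiver-side server port
const DEVICE_PORTS = { min: 1001, max: 1010 };  // example device-side server ports

function classifyHandshake(port: number): HandshakeOrigin {
  if (port === RECEIVER_PORT) {
    return "receiver-application";  // request from the application execution unit
  }
  if (port >= DEVICE_PORTS.min && port <= DEVICE_PORTS.max) {
    return "device";                // request from an application on the device 8
  }
  return "unknown";
}

console.log(classifyHandshake(1000)); // "receiver-application"
console.log(classifyHandshake(1003)); // "device"
```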

Next, the connection unit 493 determines whether or not the port indicated by the handshake request is being used by another connection (step S503). If the connection unit 493 determines that the port is already in use (step S503: YES), the connection unit 493 ends the bridge determination process without establishing the connection.
On the other hand, when determining that the port is not in use (step S503: NO), the connection unit 493 refers to the port number and determines whether or not the handshake request is from the device 8 (step S504).

If the connection unit 493 determines that the handshake request is from the device 8 (step S504: YES), the connection unit 493 establishes a WebSocket connection between the device-side server unit 491 and the device 8 (step S505). Next, the bridge unit 494 determines whether or not a connection on the receiver side has already been established (step S506). If the bridge unit 494 determines that the connection on the receiver side has not yet been established (step S506: NO), it ends the bridge determination process without performing the bridge connection.
On the other hand, if the bridge unit 494 determines that the connection on the receiver side has been established (step S506: YES), it determines whether or not the connection type of that connection matches the connection type of the connection generated in step S505 (step S507).

When it is determined that the connection type of the connection on the receiver side does not match the connection type of the connection generated in step S505 (step S507: NO), the bridge unit 494 ends the bridge determination process without performing the bridge connection.
On the other hand, if the bridge unit 494 determines that the connection type of the connection on the receiver side matches the connection type of the connection generated in step S505 (step S507: YES), the connection on the receiver side and the connection generated in step S505 Are bridge-connected (step S508). Here, the bridge connection means that the request received by the receiver-side server unit 492 via the receiver-side connection is transferred to the device-side server unit 491 and transmitted to the device 8 via the device-side connection. Further, the request received by the device-side server unit 491 via the device-side connection is transferred to the receiver-side server unit 492 and transmitted to the application execution unit 435 via the receiver-side connection.
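The bridge connection just described amounts to bidirectional forwarding between the receiver-side connection and the device-side connection. The following TypeScript sketch assumes the Node.js 'ws' package purely for illustration and shows one possible shape of such forwarding; it is a sketch, not the claimed configuration.

```typescript
// Illustrative only: forward messages between the receiver-side connection and
// the device-side connection in both directions, and tear down both together.
import WebSocket, { RawData } from "ws";

function bridge(receiverSide: WebSocket, deviceSide: WebSocket): void {
  // Request from the receiver application -> forward to the device.
  receiverSide.on("message", (data: RawData, isBinary: boolean) =>
    deviceSide.send(data, { binary: isBinary }));
  // Request from the device application -> forward to the receiver application.
  deviceSide.on("message", (data: RawData, isBinary: boolean) =>
    receiverSide.send(data, { binary: isBinary }));

  // If either side closes, release the other end of the bridge as well.
  receiverSide.on("close", () => deviceSide.close());
  deviceSide.on("close", () => receiverSide.close());
}
```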

If the connection unit 493 determines in step S504 that the handshake request is not from the device 8 (step S504: NO), the connection unit 493 establishes a WebSocket connection between the receiver-side server unit 492 and the application execution unit 435 (step S509). Next, the bridge unit 494 determines whether or not one or more device-side connections have been established (step S510). When it is determined that no device-side connection has been established (step S510: NO), the bridge unit 494 ends the bridge determination process without performing the bridge connection.
On the other hand, when the bridge unit 494 determines that one or more device-side connections have been established (step S510: YES), the bridge unit 494 selects, one by one, each device-side server unit 491 with which a connection has been established, and performs the following steps S512 to S513 for each selected device-side server unit 491 (step S511).

First, it is determined whether or not the connection type of the connection in the device-side server unit 491 selected in step S511 matches the connection type of the connection generated in step S509 (step S512).
If the bridge unit 494 determines that the connection type of the connection on the selected device side does not match the connection type of the connection generated in step S509 (step S512: NO), the bridge unit 494 does not perform the bridge connection and the next device side server unit 491 is selected.
On the other hand, when the bridge unit 494 determines that the connection type of the connection on the selected device side matches the connection type of the connection generated in step S509 (step S512: YES), the bridge unit 494 bridge-connects the connection on the selected device side and the connection generated in step S509 (step S513), and then selects the next device-side server unit 491.
Through the above processing, the external I/F unit 417 of the receiver 4 establishes a connection with the application execution unit 435 or the device 8 and performs the bridge connection process. When the bridge connection process is performed, the bridge unit 494 generates identification information (session_id) for specifying the connection for each bridge connection, and notifies the device 8 and the application execution unit 435 of the identification information.
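A minimal sketch, with assumed types and message format, of generating identification information (session_id) per bridge connection and notifying both endpoints, as described above:

```typescript
// Illustrative only: one session_id is generated per bridge connection and
// both endpoints are notified of it.
interface BridgeEndpoints {
  receiverApp: (msg: string) => void; // delivery path to the application execution unit
  device: (msg: string) => void;      // delivery path to the bridged device 8
}

const bridges = new Map<number, BridgeEndpoints>();
let nextSessionId = 1;

function registerBridge(endpoints: BridgeEndpoints): number {
  const sessionId = nextSessionId++;
  bridges.set(sessionId, endpoints);
  const notice = JSON.stringify({ notice: "bridged", session_id: sessionId });
  endpoints.receiverApp(notice); // notify the receiver application
  endpoints.device(notice);      // notify the device
  return sessionId;
}

// Example: two bridges as in FIG. 32 (session_id = 1 and session_id = 2).
registerBridge({ receiverApp: console.log, device: (m) => console.log("to device 8-1:", m) });
registerBridge({ receiverApp: console.log, device: (m) => console.log("to device 8-2:", m) });
```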

  Returning to FIG. 32, when the external I/F unit 417 of the receiver 4 performs the above-described bridge determination process, the connection of the receiver-side server unit 492 has not been established at this point, so the external I/F unit 417 establishes the connection with the device 8-1 without performing the bridge connection process (step S56).

Next, the application execution unit 435 of the receiver 4 executes the receiver application acquired in step S52 (step S57). Next, the application execution unit 435 outputs a handshake request to the external I/F unit 417 by the executed receiver application, in the same procedure as steps S411 to S415 described above (step S58).
When the connection unit 493 of the external I/F unit 417 acquires the handshake request via the receiver-side server unit 492, the bridge unit 494 performs the bridge determination process described above (step S55). As a result, the connection unit 493 establishes a connection between the application execution unit 435 and the receiver-side server unit 492. Since the connection between the device 8-1 and the device-side server unit 491 has been established in step S56, the bridge unit 494 performs a bridge connection process between the device-side server unit 491 and the receiver-side server unit 492 (step S59). At this time, the bridge unit 494 notifies the application execution unit 435 and the device 8-1 of identification information (session_id = 1) that identifies the bridge connection.

Next, the device 8-2 executes the terminal application acquired in Step S51 (Step S60). Next, the device 8-2 transmits a handshake request to the external I / F unit 417 by the procedure of steps S411 to S415 described above by the executed terminal application (step S61).
When the connection unit 493 of the external I/F unit 417 acquires the handshake request via the device-side server unit 491, the bridge unit 494 performs the above-described bridge determination process (step S62). As a result, the connection unit 493 establishes a connection between the device 8-2 and the device-side server unit 491. In addition, since the connection between the application execution unit 435 and the receiver-side server unit 492 was established in step S58, the bridge unit 494 performs a bridge connection process between the device-side server unit 491 and the receiver-side server unit 492 (step S63). At this time, the bridge unit 494 notifies the application execution unit 435 and the device 8-2 of identification information (session_id = 2) that identifies the bridge connection.

  Thus, the receiver 4 can accept a request from the device 8 via the external I / F unit 417 regardless of the language specification of the application to be executed. The device 8 can accept a request from the receiver 4 via the external I / F unit 417 regardless of the language specification of the application to be executed.

  In addition, when the application execution unit 435 of the receiver 4 outputs a request in which the identification information (session_id) is embedded to the external I/F unit 417 in subsequent processing, the bridge unit 494 outputs the request to the device 8 indicated by the identification information. When the bridge unit 494 receives a request from the connection-destination device 8 to which the identification information is assigned, the bridge unit 494 outputs the request combined with the identification information to the application execution unit 435. As a result, the receiver application can perform cooperative processing while distinguishing the destination of each request.
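A minimal, self-contained sketch of the session_id-based routing described above; the request format, field names, and send functions are assumptions, and only the use of the embedded session_id to select the destination reflects the text.

```typescript
// Illustrative only: route by the embedded session_id, and attach the
// session_id when relaying requests back to the receiver application.
type Send = (msg: string) => void;

const deviceBySession = new Map<number, Send>(); // session_id -> device-side connection
const toReceiverApp: Send = (msg) => console.log("to receiver application:", msg);

// Request output by the application execution unit with session_id embedded.
function onRequestFromReceiverApp(raw: string): void {
  const req = JSON.parse(raw) as { session_id: number; body: string };
  deviceBySession.get(req.session_id)?.(req.body); // forward to the indicated device
}

// Request received from the device at the connection destination of a bridge.
function onRequestFromDevice(sessionId: number, body: string): void {
  toReceiverApp(JSON.stringify({ session_id: sessionId, request: body }));
}

// Example wiring with two hypothetical bridged devices.
deviceBySession.set(1, (msg) => console.log("to device 8-1:", msg));
deviceBySession.set(2, (msg) => console.log("to device 8-2:", msg));
onRequestFromReceiverApp(JSON.stringify({ session_id: 2, body: "show-related-info" }));
onRequestFromDevice(1, "user-answer=B");
```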

As described above, the receiver 4 according to the embodiment of the present invention receives the operation signal transmitted from the operation reception unit 474 according to the operation by the viewer. The operation reception unit 474 is provided with a broadcast communication cooperation service button 475 that causes the receiver 4 to start receiving the broadcast communication cooperation service. When the broadcast communication cooperation service button 475 is pressed, the operation reception unit 474 transmits an activation request signal.
When receiving the activation request signal, the receiver 4 acquires an AIT from a TS corresponding to a desired channel in a desired medium, and extracts application information (application name, application control code, and location information) from the AIT.
When the application control code is data indicating the standby of the application, the receiver 4 transmits an application request signal to the receiver application server 21 or the repository server 3 that is a request destination of the application file.
When receiving the application file supplied from the request destination of the application, the receiver 4 takes in the application file and starts executing the application.
The receiver 4 supplies a content request signal to the content distribution server 16 or the content distribution server 23 in order to acquire content data necessary for the application execution process.
When the receiver 4 receives the supply of the content data from the request destination of the content data, the receiver 4 captures and presents the content data.
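As a non-limiting sketch of the activation flow summarized above (activation request, application information from the AIT, application request signal, execution, content request signal, presentation), the following TypeScript uses hypothetical URLs and names; the actual servers and message formats are those described in the embodiment, not in this sketch.

```typescript
// Illustrative only: URLs, names, and formats below are assumptions.
interface AppInfo {
  name: string;     // application name taken from the AIT
  location: string; // location information (URL of the application file)
}

async function startBroadcastCommunicationService(appInfo: AppInfo): Promise<void> {
  // Application request signal to the receiver application server or repository server.
  const appResponse = await fetch(appInfo.location);
  const appSource = await appResponse.text();
  console.log(`executing application "${appInfo.name}" (${appSource.length} bytes)`);

  // Content request signal to a content distribution server (placeholder URL).
  const contentResponse = await fetch("https://content.example/related-data.json");
  console.log("presenting content data:", await contentResponse.json());
}

// Example call with hypothetical application information extracted from an AIT.
startBroadcastCommunicationService({
  name: "program-guide",
  location: "https://apps.example/guide.html",
}).catch(console.error);
```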

According to this configuration, the receiver 4 can switch, in response to operation of the broadcasting/communication cooperation service button 475 provided in the operation reception unit 474 by the viewer (operator), from a state in which the broadcast service is received to a state in which a stream-dependent service, which is one service form of the broadcasting/communication cooperation service, is received.
Therefore, according to the receiver 4, it is possible to switch from the broadcast service to the broadcast communication cooperative service by a simple operation.

Further, when a broadcast service return button (for example, a numeric button or a channel switching button) provided on the operation unit of the operation reception unit 474 is operated, the receiver 4 receives the operation signal supplied from the operation reception unit 474 and checks the execution state of the application.
If the application is currently in an execution state, the receiver 4 ends the execution process of the application that is being executed.

According to this configuration, the receiver 4 can switch, in response to operation of the numeric buttons or the channel switching button provided in the operation reception unit 474 by the viewer (operator), from receiving the stream-dependent service, which is one service form of the broadcasting/communication cooperation service, to receiving the broadcast service.
Therefore, according to the receiver 4, it is possible to switch from the broadcasting / communication cooperation service to the broadcasting service by a simple operation.

  Further, when the receiver 4 identifies that the application control code included in the AIT obtained from the TS is data instructing the end of execution of the application, the receiver 4 terminates the execution process of the application indicated by the application ID associated with that application control code.

According to this configuration, the receiver 4 can switch the broadcast communication cooperation service currently received to the broadcast service by the control from the provider side of the broadcast communication cooperation service.
Therefore, according to the receiver 4, the broadcast communication cooperation service received on the receiver side can be switched to the broadcast service by the control from the provider side of the broadcast communication cooperation service.

  Moreover, according to this configuration, the receiver 4 can acquire the application to be executed by its own apparatus, and the content data related to this application, from an external supply source in response to a request from its own apparatus.

In the receiving system according to the present embodiment, when the receiver 4 executes a cooperative application that performs cooperative processing with the device 8, the cooperative application calls the terminal cooperation API.
After establishing cooperation with the device 8, the receiver 4 acquires application designation information and metadata from the receiving TS.
Based on the information specifying the device 8 operating in cooperation, the application designation information specifying the terminal application to be executed by the device 8, and the metadata, the receiver 4 designates the terminal application to the device 8 and performs activation control.
Upon receiving activation control of the receiver 4, the device 8 acquires a terminal application designated to be executed by the device 8 from the internal terminal application storage unit or an external terminal application server.
The device 8 executes the acquired terminal application using the metadata acquired from the receiver 4.

According to this configuration, the application to be executed by the device 8 operating in cooperation with the receiver 4 can be changed in real time based on the application designation information obtained from the broadcast stream.
Therefore, the device 8 that operates in cooperation with the receiver 4 can execute the terminal application by changing the terminal application according to the broadcast program or the contents thereof or the progress status thereof.

  Note that when the activation control unit 473 of the receiver 4 recognizes that the application control code included in the acquired AIT is data instructing automatic activation of the application, it transmits an application request signal to the receiver application server 21 regardless of the operation of the operation reception unit 474.

  Also, the present embodiment is an example in which, when the broadcasting/communication cooperation service button 475 provided in the operation reception unit 474 is pressed while the receiver 4 is receiving a broadcast service, the receiver 4 transitions from the state in which only the broadcast service is received to the state in which the broadcasting/communication cooperation service is received. In addition to this, for example, when the broadcasting/communication cooperation service button 475 is pressed while the receiver 4 is not receiving any service, for example in a standby state, the receiver 4 may first enter the state of receiving the broadcast service and then transition to the state of receiving the broadcasting/communication cooperation service.

Further, the present embodiment is an example in which the application information acquisition unit 472 acquires an AIT ES multiplexed on a TS obtained from a broadcast signal.
In addition to this, the broadcast transmission apparatus 11 may transmit, as a broadcast signal, a TS in which an EIT (Event Information Table) provided with a descriptor including AIT information is multiplexed, and in the receiver 4 that receives this broadcast signal, the separation unit 402 may extract the AIT from the EIT multiplexed in the TS and the application information acquisition unit 472 may acquire the AIT.

FIG. 34 shows the data structure of the EIT.
Details of the data structure of the EIT are described, for example, in "Technical Data for Digital Terrestrial Television Broadcasting Operation Regulations", ARIB TR-B14, 4.4 edition, second volume, Japan Radio Industry Association, March 2011 (Part 4, Volume 3, 31.3).
For example, the signal setting unit 112 of the broadcast transmission apparatus 11 stores the AIT information in a descriptor (descriptor()) of the EIT having the data structure shown in FIG. 34. Then, the signal setting unit 112 generates a TS in which the EIT storing the AIT is multiplexed, and supplies this TS to the broadcast transmission unit 113.
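A minimal sketch of carrying AIT bytes in a descriptor of the tag/length/payload form used by SI tables such as the EIT; the descriptor tag value below is a placeholder, not a value taken from any standard or from the embodiment.

```typescript
// Illustrative only: wrap AIT bytes into a generic tag/length/payload descriptor.
function buildAitDescriptor(aitBytes: Uint8Array, descriptorTag = 0xf0): Uint8Array {
  if (aitBytes.length > 255) {
    throw new Error("descriptor payload is limited to 255 bytes");
  }
  const descriptor = new Uint8Array(2 + aitBytes.length);
  descriptor[0] = descriptorTag;   // placeholder tag for "AIT information"
  descriptor[1] = aitBytes.length; // descriptor_length
  descriptor.set(aitBytes, 2);     // AIT information carried in the EIT
  return descriptor;
}

// Example with dummy AIT bytes.
console.log(buildAitDescriptor(new Uint8Array([0x00, 0x10, 0x01])));
```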

Alternatively, the broadcast transmission apparatus 11 may transmit an AIT described in BML (Broadcast Markup Language) by the DSM-CC (Digital Storage Media - Command and Control) data carousel transmission method, and in the receiver 4 that receives this broadcast signal, the separation unit 402 may extract the AIT from the data broadcast content and the application information acquisition unit 472 may acquire the AIT.

For details of the DSM-CC data carousel transmission method, see, for example, "Data Broadcast Coding System and Transmission System Standards in Digital Broadcasting", ARIB STD-B24, 5.1 edition, Volume III, Japan Radio Industry Association, March 2007 (Part 3, Chapter 6).
In this case, the signal setting unit 112 fixes the component tag and the module used for carousel transmission of the AIT. For example, the signal setting unit 112 fixes the component tag to "AA" (hexadecimal) and fixes the module ID, which is the module identification information, to "0". Then, the signal setting unit 112 sets a type for identifying the AIT in the type descriptor of the module.
On the other hand, the separation unit 402 monitors the modules in the TS, and when detecting a module whose module ID is “0”, extracts the AIT corresponding to the Type identifier from the detected module.
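A minimal sketch, with simplified field names, of the receiver-side filtering described above: carousel modules are watched and the AIT is extracted from the module whose component tag, module ID, and type match the fixed values chosen by the signal setting unit. The numeric values are the examples from the text; the type string is an assumption.

```typescript
// Illustrative only: pick out the AIT module from a list of carousel modules.
interface CarouselModule {
  componentTag: number;
  moduleId: number;
  typeDescriptor: string; // type identifying the payload (format assumed)
  payload: Uint8Array;
}

const AIT_COMPONENT_TAG = 0xaa; // fixed by the signal setting unit (example value)
const AIT_MODULE_ID = 0;        // fixed module identification information (example value)

function extractAit(modules: CarouselModule[]): Uint8Array | null {
  for (const m of modules) {
    if (
      m.componentTag === AIT_COMPONENT_TAG &&
      m.moduleId === AIT_MODULE_ID &&
      m.typeDescriptor === "application/ait" // assumed type identifying the AIT
    ) {
      return m.payload; // AIT carried in the data carousel
    }
  }
  return null;
}
```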

  Alternatively, the AIT associated with the broadcast content identification information, or the AIT including the broadcast content identification information, may be stored in the notification server 18 in the broadcast station server group 12 or in the notification server 24 in the service provider server group 2, and in the receiver 4 that has captured the broadcast signal, the application control unit 434 may acquire the content identification information from the separation unit 402 and acquire the AIT from the notification server 18 or the notification server 24 via the communication input/output unit 411.

Moreover, in order to realize the function of switching the broadcast service being received on the receiver side to the broadcasting/communication cooperation service under control from the provider side of the broadcasting/communication cooperation service, the following configuration may be adopted.
That is, the application information acquisition unit 472 extracts an application control code and a predetermined flag from the captured AIT, and supplies the application control code and the flag to the activation control unit 473. The predetermined flag is, for example, data provided in the AIT that designates either application activation or data broadcast presentation.
The activation control unit 473 takes in the application control code and flag supplied from the application information acquisition unit 472.
When the activation control unit 473 identifies that the application control code is data for instructing the automatic activation of the application, the activation control unit 473 controls either the execution start of the application or the presentation of data broadcasting according to the flag.
The data for instructing the automatic activation of the application is, for example, “01 (hexadecimal number)” (identification name: AUTOSTART) in the application control code shown in Table 7 above.
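A minimal sketch of this dispatch; only the AUTOSTART value 0x01 comes from the text, while the flag values and handler names are assumptions.

```typescript
// Illustrative only: dispatch on the application control code and the flag
// that selects between application activation and data broadcast presentation.
const AUTOSTART = 0x01; // application control code for automatic activation

type Flag = "application" | "data-broadcast"; // assumed values of the predetermined flag

function onAitControlInfo(
  controlCode: number,
  flag: Flag,
  startApplication: () => void,
  presentDataBroadcast: () => void,
): void {
  if (controlCode !== AUTOSTART) return; // only automatic activation is handled here
  if (flag === "application") {
    startApplication();       // begin execution of the application
  } else {
    presentDataBroadcast();   // present the data broadcast instead
  }
}

// Example call with hypothetical handlers.
onAitControlInfo(AUTOSTART, "data-broadcast",
  () => console.log("starting application"),
  () => console.log("presenting data broadcast"));
```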

  A part of the functions of the receiver 4 of the present embodiment may be realized by a computer. In this case, this may be achieved by recording a computer program for realizing those functions on a computer-readable recording medium, and causing a computer system to read and execute the computer program recorded on the recording medium. Here, the computer system includes an operating system (OS) and hardware such as peripheral devices. The computer-readable recording medium refers to a portable recording medium such as a flexible disk, a magneto-optical disk, an optical disk, or a memory card, or a storage device such as a magnetic hard disk or a solid state drive built into the computer system. Furthermore, the computer-readable recording medium may include a medium that dynamically holds the program for a short time, such as a communication line used when the program is transmitted via a network such as the Internet or via a telephone line or a mobile phone network, and a medium that holds the program for a certain period of time, such as a volatile memory inside a computer system serving as a server or a client in that case. Further, the above program may realize a part of the functions described above, and may also realize the functions described above in combination with a program already recorded in the computer system.

  As described above, the embodiment of this invention has been explained in detail with reference to the drawings; however, the specific configuration is not limited to that embodiment, and designs and the like within a range that does not depart from the gist of this invention are also included.

  In the present embodiment, the case where the bridge unit 494 bridge-connects connections having the same connection type included in the handshake request has been described; however, the bridge connection may be made under another condition. For example, the condition may be that the information of the application executed by the application execution unit 435 and the information of the application executed by the device 8 are included in the same broadcast stream separated by the separation unit 402. This can be realized by acquiring and recording the application information from the broadcast stream each time the separation unit 402 separates the broadcast signal into the broadcast stream.
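A minimal sketch expressing the bridge condition as a pluggable predicate, so that the embodiment's connection-type match and the alternative same-broadcast-stream condition described above are interchangeable; the type names are assumptions.

```typescript
// Illustrative only: the bridge condition as an interchangeable predicate.
interface ConnectionInfo {
  connectionType: string;     // type information sent in the handshake request
  broadcastStreamId?: string; // stream from which the application information was obtained
}

type BridgeCondition = (receiverSide: ConnectionInfo, deviceSide: ConnectionInfo) => boolean;

const sameConnectionType: BridgeCondition = (r, d) => r.connectionType === d.connectionType;

const sameBroadcastStream: BridgeCondition = (r, d) =>
  r.broadcastStreamId !== undefined && r.broadcastStreamId === d.broadcastStreamId;

function shouldBridge(r: ConnectionInfo, d: ConnectionInfo, condition: BridgeCondition): boolean {
  return condition(r, d);
}

// Example: connections carrying the same connection type are bridged.
console.log(shouldBridge({ connectionType: "quiz" }, { connectionType: "quiz" }, sameConnectionType)); // true
```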

  Further, in the present embodiment, the case where the connection is not established when the port is already in use has been described; however, the present invention is not limited to this. For example, processing may be performed so that a port specified later is connected preferentially.

  In this embodiment, an example in which one terminal application is assigned to one port has been described. However, a configuration in which a single port is connected to a plurality of terminal applications may be employed. In this case, the determination as to whether or not the port is being used in step S503 is omitted.

DESCRIPTION OF SYMBOLS 1 ... Broadcast provider apparatus 11 ... Broadcast transmission apparatus 111 ... Broadcast related data management part 112 ... Signal setting part 113 ... Broadcast transmission part 12 ... Broadcasting station server group 13 ... Content management server 14 ... Program management server 15 ... Metadata management server DESCRIPTION OF SYMBOLS 16 ... Content delivery server 17 ... Broadcasting station service server 18 ... Notification server 2 ... Service provider server group 21 ... Receiver application server 22 ... Service server 23 ... Content delivery server 24 ... Notification server 3 ... Repository server 4 ... Receiver 401 ... broadcast receiving unit 402 ... separating unit 403 ... clock 404-1 ... first synchronization buffer 404-2 ... second synchronization buffer 405-1 ... first decoder 405-2 ... second decoder 406 ... data broadcast execution unit 407 ... Video control unit 408 ... Video display unit 409 ... Sound control unit 410 ... Sound output Unit 411 ... communication input / output unit 412 ... application execution control unit 413 ... presentation control unit 414 ... operation input unit 415 ... channel selection unit 416 ... local information storage unit 417 ... external I / F unit 431 ... application storage unit 432 ... application authentication unit 433 ... Application management unit 434 ... Application control unit 435 ... Application execution unit 436 ... Receiver API unit 437 ... Terminal cooperation API unit 438 ... Resource access control unit 439 ... Resource control unit 451 ... Policy data management unit 452 ... Policy data storage unit 453 ... Event interpretation unit 454 ... Policy level collation unit 455 ... Event control unit 456 ... Program policy storage unit 457 ... Policy arbitration unit 458 ... Policy level storage unit 9 ... Communication network 471 ... Activation request signal acquisition unit 472 Application information acquiring unit 473 ... start control unit 474 ... operation accepting unit 475 ... linked digital terrestrial television broadcasting service button 481 ... end control unit 491 ... equipment-side server unit (server unit)
492 ... Receiver-side server unit (server unit)
493 ... Connect unit 494 ... Bridge unit 501 ... Connection control unit 502 ... Terminal application acquisition unit 503 ... Terminal application execution unit

Claims (5)

  1. A receiver comprising:
    a broadcast receiving unit for receiving a broadcast signal;
    a separation unit that separates a broadcast stream from the broadcast signal received by the broadcast receiving unit;
    an application information acquisition unit that acquires, from the broadcast stream separated by the separation unit, information of an application to be executed by the own apparatus;
    an application execution unit that executes the application indicated by the information acquired by the application information acquisition unit;
    a server unit that receives a request output through execution of the application by the application execution unit and a request output through execution of an application by a terminal that executes the application;
    a connection unit that establishes connections between the server unit and each of the application execution unit and the terminal; and
    a bridge unit that, via the connections established by the connection unit, outputs the request received by the server unit from the application execution unit to the terminal, and outputs the request received by the server unit from the terminal to the application execution unit.
  2. The receiver according to claim 1, wherein the bridge unit determines whether or not a relationship between the application executed by the application execution unit and the application executed by the terminal satisfies a predetermined condition, and, when the condition is satisfied, outputs the request received by the server unit from the application execution unit to the terminal and outputs the request received by the server unit from the terminal to the application execution unit, via the connections established by the connection unit.
  3. The receiver according to claim 2, wherein, when the application execution unit and the terminal each establish a connection with the server unit by executing the application, the application execution unit and the terminal output, to the connection unit, type information indicating the type of application to be linked, and
    the predetermined condition in the bridge unit is that the type information received by the connection unit from the application execution unit matches the type information received from the terminal.
  4. The receiver according to claim 2 or 3, wherein the bridge unit determines the predetermined condition when a connection is established by the connection unit.
  5. The receiver according to any one of claims 2 to 4, wherein, when it is determined that the predetermined condition is satisfied, the bridge unit performs a bridge connection process between the server unit and the terminal, generates identification information specifying the bridge connection, and transmits the identification information to the application execution unit; when the identification information is included in a request received by the server unit from the application execution unit, the bridge unit outputs the request to the terminal at the connection destination of the bridge connection to which the identification information is assigned; and when a request is received from the terminal at the connection destination of the bridge connection to which the identification information is assigned, the bridge unit outputs a combination of the request and the identification information to the application execution unit.
