KR20160123982A - Collaborating service providing method for media sharing and system thereof - Google Patents

Collaborating service providing method for media sharing and system thereof

Info

Publication number
KR20160123982A
Authority
KR
South Korea
Prior art keywords
content
screen
information
taas
secondary screen
Prior art date
Application number
KR1020160022671A
Other languages
Korean (ko)
Other versions
KR102052385B1 (en)
Inventor
박상욱
지덕구
고은진
장종현
한미경
Original Assignee
한국전자통신연구원 (Electronics and Telecommunications Research Institute, ETRI)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원
Publication of KR20160123982A
Application granted granted Critical
Publication of KR102052385B1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/43615Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/64Addressing
    • H04N21/6405Multicasting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The present invention relates to a method and a system for providing a collaborating service for media sharing. The method for providing a collaborating service for media sharing includes the steps of: receiving multi-track content from a broadcasting server; discovering, when a collaborating service for the multi-track content is requested, a neighboring device able to provide the collaborating service; and dividing the multi-track content and transmitting the divided content to the neighboring device. According to the present invention, a plurality of smart devices may provide a collaborating service for media sharing.

Description

Technical Field [0001] The present invention relates to a collaborative service providing method and system for media sharing.

Embodiments of the present invention relate to a method and system for providing collaborative services for media sharing.

Smart devices that connect to the Internet over gigabit networks and perform multiple functions, such as web browsing and computing, are spreading rapidly. A smart device can be, for example, a smartphone, a smart pad, or a smart TV. In recent years, demand has grown to consume UHD (Ultra High Definition) high-quality content using a plurality of smart devices.

Multi-screen service technology refers to techniques for providing video content using a plurality of smart devices. A multi-screen is the set of screens of the smart devices owned by a user, and a multi-screen service is a service provided through that multi-screen. For content providers, multi-screen service technology lets users consume one piece of content with the same user experience across various smart devices. For users, it enables consuming content on smart devices such as smart TVs, smartphones, and smart pads. For network operators, it allows network resources to be used effectively in providing multi-screen services.

Korean Patent Laid-Open Publication No. 2014-0039869 (Web-based multi-network adaptive multi-screen service method and apparatus thereof)

Discovery and launch protocol specification (DIAL), version 1.7.1 (Netflix)

Embodiments of the present invention allow a plurality of smart devices to provide collaborative services for media sharing.

According to an aspect of the present invention, there is provided a method of providing a collaborative service, the method comprising: receiving multi-track content from a broadcast server; discovering, when a collaboration service for the multi-track content is requested, a peripheral device capable of providing the collaboration service; and separating the multi-track content and transmitting the separated content to the peripheral device.

In one embodiment, the step of discovering the peripheral device includes: multicasting a discovery request message including type information of an auxiliary content included in the multi-track content to peripheral devices; And receiving a discovery response message from a peripheral device capable of reproducing or reproducing the auxiliary content.

In one embodiment, the method may further include a step of mapping type information of the auxiliary content and information of a peripheral device that has transmitted the discovery response message and storing the mapping information.

In one embodiment, the step of transmitting the separated content to the peripheral device may include transmitting the separated content to the peripheral device with reference to the mapped information.

In one embodiment, the auxiliary content may be auxiliary video or real effect metadata.

In one embodiment, the step of transmitting the separated content to the peripheral device may include transmitting the separated content to the peripheral device by referring to the application information obtained from the peripheral device.

A collaboration service providing system according to another embodiment of the present invention includes: a first gateway that belongs to a first subnet and transmits, using a set protocol, requests necessary for providing a collaboration service received from a first device belonging to the first subnet; a management server that forwards the requests to a second gateway belonging to a second subnet in which a second device is receiving content from a streaming server; and the second gateway, which multicasts the requests received from the management server to devices belonging to the second subnet and transmits, using the set protocol, responses received from a third device belonging to the second subnet. The third device may receive additional content from the streaming server using information received from the second gateway.

A method of providing a collaborative service by a user device according to an embodiment of the present invention includes: executing an application to be used for content playback according to an execution command from another device that is reproducing content received from a streaming server; receiving, from a management server, information on subsequent viewing content that starts from the content playback time of the other device; and receiving and reproducing the subsequent viewing content from the streaming server using the received information.

In one embodiment, the information on the subsequent viewing content may be a uniform resource identifier (URI) of the subsequent viewing content.

In one embodiment, the method may further comprise the step of responding to a discovery request from the other device.

In one embodiment, responding to the discovery request may comprise responding to the discovery request using Universal Plug and Play (UPnP).

In one embodiment, the method may further include, in response to a request from the other device, transmitting the application information available to the other device to the other device.

In one embodiment, the step of transmitting the application information may include transmitting the application information using REST (REpresentational State Transfer).

According to embodiments of the present invention, a plurality of smart devices can provide a collaboration service for media sharing.

FIG. 1 is a conceptual diagram for explaining a method of providing a collaboration service according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating the method of providing a collaboration service shown in FIG. 1.
FIG. 3 is a conceptual diagram illustrating a method of providing a collaboration service according to another embodiment of the present invention.
FIG. 4 is a flowchart for explaining the method of providing a collaboration service shown in FIG. 3.
FIG. 5 is a conceptual diagram illustrating a method of providing a collaboration service according to another embodiment of the present invention.
FIG. 6 is a flowchart for explaining the method of providing a collaboration service shown in FIG. 5.
FIG. 7 is a conceptual diagram illustrating a method of providing a collaboration service according to another embodiment of the present invention.
FIG. 8 is a flowchart for explaining the method of providing a collaboration service shown in FIG. 7.

In the following description of the embodiments of the present invention, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present invention rather unclear.

Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings.

Hereinafter, in describing embodiments of the present invention, a smart device such as a TV, a Blu-ray player, a smart phone, or a tablet is referred to as a screen. The screen can play back the content using web standard technology (e.g., HTTP Live Streaming (HLS)).

FIG. 1 is a conceptual diagram for explaining a method of providing a collaboration service according to an embodiment of the present invention.

In the following description, a non-mobile device such as a TV or a Blu-ray player is referred to as a primary screen 100 for convenience, and a mobile device such as a smartphone or a tablet is referred to as a secondary screen 200.

The secondary screen 200 can receive a list of available content from the TaaS software platform server 400 (hereinafter referred to as the TaaS server). The TaaS server 400 can manage information on the contents that can be provided to a device (e.g., a primary screen or a secondary screen). The TaaS server 400 manages device information, or information on the group to which a device belongs, and can respond accordingly when an arbitrary device requests the information of the available content.

The secondary screen 200 can present the list of available contents to the user and receive the content selected by the user from the streaming server 300 for playback. If the user of the secondary screen 200 wishes to play the content being played on the secondary screen 200 on another device (e.g., the primary screen 100), the secondary screen 200 performs discovery and application execution procedures with the primary screen 100. Accordingly, the primary screen 100 can receive, from the streaming server 300, the content that was being reproduced on the secondary screen 200 and play it back.

FIG. 2 is a flowchart illustrating the method of providing a collaboration service shown in FIG. 1. Depending on the embodiment, at least one of the steps shown in FIG. 2 may be omitted.

In step 201, the secondary screen 200 requests the TaaS server 400 to transmit its list of available content and may display the content list received from the TaaS server 400. The content list may include at least one of a file name of the content, a Uniform Resource Identifier (URI), and a thumbnail. The content list may be written in various standard formats (for example, JSON (JavaScript Object Notation)), and the URI may be an identifier of the content stored in the streaming server 300. REST (REpresentational State Transfer) may be used for requesting and transferring the content list.
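The content list exchange above can be sketched as follows. This is a minimal illustration only: the JSON field names ("filename", "uri", "thumbnail") are assumptions, since the patent states only that the list may contain a file name, a URI, and a thumbnail, without fixing a schema.

```python
import json

# Hypothetical JSON content list as the TaaS server might return it over REST.
# Field names are illustrative assumptions, not a published TaaS schema.
raw_list = json.dumps([
    {"filename": "drama_ep1.mp4",
     "uri": "http://address/drama_ep1.m3u8",
     "thumbnail": "http://address/drama_ep1.jpg"},
    {"filename": "concert.mp4",
     "uri": "http://address/concert.m3u8",
     "thumbnail": "http://address/concert.jpg"},
])

def parse_content_list(body):
    """Turn the JSON body into (filename, uri) pairs for on-screen display."""
    return [(e["filename"], e["uri"]) for e in json.loads(body)]

entries = parse_content_list(raw_list)
```

The secondary screen would render these pairs in its UI and keep the URI of whichever entry the user selects for the later content provision request.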

The user of the secondary screen 200 can select any one of contents included in the contents list to reproduce. The selection of the content can be done through various user interfaces (e.g., touch screen).

In step 203, the streaming server 300 may receive a content provision request from the secondary screen 200 and provide the secondary screen 200 with content corresponding to the request. If adaptive content can be provided, the streaming server 300 may generate a list of adaptive content that can be provided to the secondary screen 200 and provide the generated list to the secondary screen 200. Here, adaptive content means content whose file size suits the resolution and bandwidth of the receiving device (e.g., the secondary screen); that is, the list of adaptive content may include information on a plurality of adaptive contents that have different file sizes but the same substance. If any one of the adaptive contents is selected by the user or by the secondary screen 200 itself, the secondary screen 200 may request the streaming server 300 to provide the selected adaptive content and receive and play back the adaptive content corresponding to the request. The content (or adaptive content) may be provided in various formats (for example, HTTP adaptive streaming such as HLS (HTTP Live Streaming)).
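A device-side selection from such an adaptive content list might look like the sketch below. The variant URIs and the "bandwidth_kbps" field are invented for illustration; the patent only says the variants differ in file size while containing the same content.

```python
# Hypothetical adaptive-content list: several encodings of the same content.
variants = [
    {"uri": "http://address/movie_480p.m3u8",  "bandwidth_kbps": 1500},
    {"uri": "http://address/movie_1080p.m3u8", "bandwidth_kbps": 6000},
    {"uri": "http://address/movie_2160p.m3u8", "bandwidth_kbps": 16000},
]

def pick_variant(variants, available_kbps):
    """Pick the highest-bitrate variant the device's bandwidth can sustain,
    falling back to the smallest variant if none fits."""
    fitting = [v for v in variants if v["bandwidth_kbps"] <= available_kbps]
    if fitting:
        return max(fitting, key=lambda v: v["bandwidth_kbps"])
    return min(variants, key=lambda v: v["bandwidth_kbps"])

chosen = pick_variant(variants, available_kbps=8000)
```

In HLS this selection is normally driven by the BANDWIDTH attributes of a master playlist; the function above just makes the decision rule explicit.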

If the provision of a collaboration service (e.g., an N-screen service) is requested, at step 205 the secondary screen 200 can perform a procedure for discovering a peripheral device capable of providing the collaboration service. For example, the secondary screen 200 may send a discovery request (e.g., an M-SEARCH request as defined in UPnP Specification 1.3.2) to the primary screen 100 located around the secondary screen 200 and receive a discovery response (e.g., an M-SEARCH response as defined in UPnP Specification 1.3.3) from the primary screen 100. The discovery request can be multicast.
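The multicast discovery request mentioned here is the standard SSDP M-SEARCH datagram from the UPnP Device Architecture. The sketch below only constructs the message so the wire format is visible; actually multicasting it would require a UDP socket, and the search target value is left generic rather than tied to any TaaS-specific device type.

```python
# SSDP well-known multicast address and port (UPnP Device Architecture).
SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900

def build_msearch(search_target="ssdp:all", mx=2):
    """Build the M-SEARCH request datagram that UPnP discovery multicasts."""
    lines = [
        "M-SEARCH * HTTP/1.1",
        f"HOST: {SSDP_ADDR}:{SSDP_PORT}",
        'MAN: "ssdp:discover"',
        f"MX: {mx}",             # max seconds a device may wait before replying
        f"ST: {search_target}",  # search target: which devices should answer
    ]
    return ("\r\n".join(lines) + "\r\n\r\n").encode("ascii")

msg = build_msearch()
```

Devices matching the ST header answer with a unicast M-SEARCH response carrying their description URL, which is how the secondary screen learns of nearby primary screens.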

In step 207, an application information acquisition procedure may be performed. For example, the secondary screen 200 can request, from the primary screen 100, information on the applications capable of reproducing the content, and receive that application information from the primary screen 100. The application information request and response can be performed using REST. The application information may be, for example, an application name or the URI of the application. The application information request can be multicast.

In step 209, the secondary screen 200 may display the application information executable on the primary screen 100. If a plurality of primary screens 100 are found, the secondary screen 200 displays information (e.g., device names) on the plurality of primary screens 100, and the user can select any one primary screen 100. When a primary screen 100 is selected, the secondary screen 200 displays the application information executable on the selected primary screen 100, and any one of these applications can be selected. If an application to be used for content playback is selected, the secondary screen 200 may send the primary screen 100 an application execution command instructing it to execute the selected application. REST can be used to transfer application execution commands. Information (e.g., a URI) on the content that was being played back on the secondary screen 200 may be transmitted to the primary screen 100 together with the application execution command.
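A REST application execution command carrying the content URI might be shaped like the following sketch. The `/apps/{name}` path echoes the DIAL-style launch convention cited in the prior art, but the path and the JSON key are assumptions for illustration, not a published TaaS API.

```python
import json

def build_exec_command(app_name, content_uri):
    """Return (method, path, body) for a hypothetical app-launch POST whose
    body carries the URI of the content the primary screen should resume."""
    path = f"/apps/{app_name}"              # DIAL-style launch path (assumed)
    body = json.dumps({"contentUri": content_uri})
    return "POST", path, body

method, path, body = build_exec_command("hls-player", "http://address/*.m3u8")
```

The primary screen would launch the named application and hand it the URI from the body, which is exactly the information step 211 says it uses to request the content from the streaming server.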

In step 211, the primary screen 100 executes the corresponding application in accordance with the application execution command received from the secondary screen 200, requests the streaming server 300 to provide the content using the content information received together with the command (e.g., http://address/*.m3u8), and receives the content from the streaming server 300 and plays it back.

FIG. 3 is a conceptual diagram for explaining a method of providing a collaboration service according to another embodiment of the present invention.

Referring to FIG. 3, when the user of the secondary screen 200 wishes to reproduce, on another device (e.g., the primary screen 100), the content being reproduced on the secondary screen 200, the primary screen 100 can reproduce the content from the point at which it was being reproduced on the secondary screen 200.

If there is a request for providing a collaboration service from the secondary screen 200, the streaming server 300 may newly create and store content that starts from the playback point on the secondary screen 200. The TaaS server 400 may then store information (e.g., a URI) about the subsequent viewing content in correspondence with the user of the secondary screen 200 or the user group to which the secondary screen 200 belongs. Here, it is assumed that the primary screen 100 and the secondary screen 200 are devices belonging to the same user and that the TaaS server 400 holds information about the primary screen 100 and the secondary screen 200.

FIG. 4 is a flowchart illustrating the method of providing a collaboration service shown in FIG. 3. Depending on the embodiment, at least one of the steps shown in FIG. 4 may be omitted.

In step 405, it is assumed that content is being played back on the secondary screen 200. Step 201 and step 203 described with reference to FIG. 2 may be performed before step 405, and it is assumed that the URI of the currently playing content is "http://address/*.m3u8".

If there is a collaboration service provision request from the user of the secondary screen 200, the secondary screen 200 may transmit the collaboration service provision request to the streaming server 300 at step 407. Accordingly, the streaming server 300 can check the current playback time of the content being played back on the secondary screen 200 and generate subsequent viewing content starting from that playback time. Here, it is assumed that the URI of the subsequent viewing content is "http://address/*.m3u8/start/50/*new.m3u8".
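The derivation of the subsequent-viewing URI can be mirrored textually as below. The `/start/<seconds>/*new.m3u8` pattern comes from the patent's own example; a real server would instead cut a new HLS playlist beginning at the matching media segment rather than rewriting the URI string.

```python
def resume_uri(original_uri, start_seconds):
    """Derive the subsequent-viewing URI from the original content URI and
    the playback position, following the patent's example pattern."""
    return f"{original_uri}/start/{start_seconds}/*new.m3u8"

# The patent's example: original content at 50 seconds of playback.
uri = resume_uri("http://address/*.m3u8", 50)
```

This URI is what the streaming server registers with the TaaS server, so any later device of the same user can resume from second 50 rather than the beginning.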

The streaming server 300 may provide the TaaS server 400 with information (e.g., a URI) about the subsequent viewing content. Here, the streaming server 300 may transmit information about the secondary screen 200 to the TaaS server 400 together with the information on the subsequent viewing content. Accordingly, the TaaS server 400 can manage information about which user (or device or user group) the subsequent viewing content relates to. Thus, when any device requests a list of available contents, the TaaS server 400 can provide that device with a list of the contents available to it by referring to the managed information.

For example, if there is an available-content provision request from another device held by the user of the secondary screen 200, or from a device held by the user group to which the user of the secondary screen 200 belongs, the TaaS server 400 may provide a content list including information on the subsequent viewing content (http://address/*.m3u8/start/50/*new.m3u8). Conversely, when there is an available-content provision request from a device unrelated to the user of the secondary screen 200, the TaaS server 400 may provide a content list including information on the original content (http://address/*.m3u8).

In step 409, a discovery procedure and an application information acquisition procedure may be performed. Step 409 is the same as steps 205 and 207 described with reference to FIG. 2, so a detailed description is omitted.

In step 411, the secondary screen 200 displays the application information executable on the primary screen 100 and can receive a selection of any one application from the user. Once the application is selected, the secondary screen 200 may send the primary screen 100 an application execution command instructing it to execute the selected application.

At step 413, the primary screen 100 may execute the requested application and request the TaaS server 400 to transfer the available content list. Accordingly, the TaaS server 400 can provide the primary screen 100 with a list of the contents available on the primary screen 100. Assuming, as described above, that the primary screen 100 and the secondary screen 200 are held by the same user, the content list that the TaaS server 400 provides to the primary screen 100 includes the information on the subsequent viewing content (http://address/*.m3u8/start/50/*new.m3u8). When playback of the subsequent viewing content is requested, the primary screen 100 transmits to the streaming server 300 a content provision request including the information on the subsequent viewing content (http://address/*.m3u8/start/50/*new.m3u8). Accordingly, the streaming server 300 can provide the subsequent viewing content to the primary screen 100, and the primary screen 100 can play it back.

FIG. 5 is a conceptual diagram for explaining a method of providing a collaboration service according to another embodiment of the present invention.

The discovery procedure using UPnP described above may not be performable between devices located on different subnets. Accordingly, the TaaS gateways 410 and 420 and the TaaS server 400 cooperate so that collaboration services can be performed among devices located on different subnets. In addition to the conventional general gateway functions, the TaaS gateways 410 and 420 may further convert the messages used for discovery, application information acquisition, and application execution according to a set protocol and deliver them to the counterpart gateway.

FIG. 6 is a flowchart illustrating the method of providing a collaboration service shown in FIG. 5. Depending on the embodiment, at least one of the steps shown in FIG. 6 may be omitted.

In step 601, it is assumed that a session has been established between the remote device 10 located in the subnet 1000 and the primary screen 100 located in the subnet 2000, and that the primary screen 100 is reproducing content received from the streaming server 300. The session may be a session for a video conference or a video lecture and may be formed according to various conventionally used methods. The TaaS server 400 may be a management server that manages information about the TaaS gateways 410 and 420 participating in the session. For convenience of explanation, FIG. 5 shows one device forming a session with the remote device 10, but a plurality of devices belonging to different subnets may form sessions with the remote device 10.

In step 603, if there is additional content to be shared in addition to the content currently being played back, the user of the remote device 10 can upload the additional content to the streaming server 300.

The TaaS server 400 can receive, from the remote device 10 or the streaming server 300, information about the additional content and about the remote device 10, and can map and manage that information.

Thereafter, a series of procedures may be performed so that the additional content can be played on a device (e.g., the secondary screen 200) located around the primary screen 100. For example, the remote device 10 may send a discovery request to the first TaaS gateway 410, and the first TaaS gateway 410 may forward the discovery request to the TaaS server 400. The TaaS server 400 may forward the discovery request to the gateway that is in session with the remote device 10 (e.g., the second TaaS gateway 420). The second TaaS gateway 420 can deliver the discovery request (possibly in multicast form) to the devices (e.g., the primary screen 100 and the secondary screen 200) located in the subnet 2000 to which the second TaaS gateway 420 belongs. Accordingly, the primary screen 100 and the secondary screen 200 can respond to the discovery request toward the second TaaS gateway 420. The second TaaS gateway 420 may send the discovery responses received from the primary screen 100 and the secondary screen 200 to the TaaS server 400. The TaaS server 400 may then forward the discovery responses to the first TaaS gateway 410, and the first TaaS gateway 410 may forward them to the remote device 10. Thereafter, according to the same procedure as the discovery request and discovery response process described above, requests and responses for application information acquisition can be performed.
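The gateway relay can be pictured as each TaaS gateway wrapping the local message in an envelope that the TaaS server can route to the peer gateway. The envelope fields ("session", "origin", "kind", "payload") are invented for illustration; the patent says only that the gateways convert messages "according to a set protocol".

```python
import json

def wrap_for_relay(session_id, origin_subnet, kind, payload):
    """A gateway wraps a local message for unicast relay via the TaaS server."""
    return json.dumps({
        "session": session_id,    # lets the TaaS server locate the peer gateway
        "origin": origin_subnet,  # where responses must be routed back
        "kind": kind,             # "discovery", "app-info", or "app-exec"
        "payload": payload,       # the original UPnP/REST message body
    })

def unwrap_at_peer(envelope):
    """The receiving gateway extracts the payload to re-multicast locally."""
    msg = json.loads(envelope)
    return msg["kind"], msg["payload"]

env = wrap_for_relay("sess-1", "subnet-1000", "discovery", "M-SEARCH ...")
kind, payload = unwrap_at_peer(env)
```

Responses travel the reverse path: the peer gateway wraps each local reply with the same session identifier so the TaaS server can forward it back to the originating subnet.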

In step 605, the remote device 10 displays the obtained application information and can receive a selection of one of the applications from the user. The remote device 10 may send an application execution command instructing execution of the selected application. The application execution command can be transmitted to the second TaaS gateway 420 according to the same procedure as the discovery request described above, and the second TaaS gateway 420 can transmit the application execution command to the secondary screen 200. At this time, information (e.g., a URI) on the additional content may be transmitted together with the application execution command.

In step 607, the secondary screen 200 executes the corresponding application in accordance with the application execution command received from the second TaaS gateway 420 and uses the information (e.g., a URI) on the additional content received from the second TaaS gateway 420 to receive the additional content from the streaming server 300 and play it back.

FIG. 7 is a conceptual diagram for explaining a method of providing a collaboration service according to another embodiment of the present invention.

The primary screen 100 can receive multi-track content from the broadcast server 500 and play it. The multi-track content may be, for example, a mixture of a plurality of images (for example, a main image and a sub-image to be displayed in PIP format), a mixture of at least one image and additional information, or a mixture of at least one image and sensation effect metadata for reproducing sensation effects. The sensation effect may be, for example, an effect relating to any of lighting, wind, vibration, and aroma.

Assuming that the multi-track content includes one main image, one auxiliary image, and one set of sensation effect metadata, the main image can be reproduced on the primary screen 100, the auxiliary image on the secondary screen 200, and the sensation effect by the sensation effect reproducing apparatus 120.

FIG. 8 is a flowchart for explaining the method of providing a collaboration service shown in FIG. 7. Depending on the embodiment, at least one of the steps shown in FIG. 8 may be omitted.

At step 801, the primary screen 100 may receive multi-track content from the broadcast server 500. Here, it is assumed that the multi-track content is a mixture of one main image, one auxiliary image, and one set of sensation effect metadata. Hereinafter, for convenience of explanation, the main image is referred to as main content, and tracks other than the main image, such as the auxiliary image and the sensation effect metadata, are referred to as auxiliary content.

In steps 803 and 805, the primary screen 100 may perform discovery procedures and application information acquisition procedures for peripheral devices. The discovery procedure and application information acquisition procedure of steps 803 and 805 may be performed in essentially the same manner as steps 205 and 207 described with reference to FIG.

However, auxiliary content type information may further be used for the discovery request and the application information request in steps 803 and 805. The auxiliary content type information may indicate whether the auxiliary content is video, audio, or a sensation effect. For example, the primary screen 100 may multicast a discovery request message including information indicating that the auxiliary content type is video in order to discover a device capable of playing the auxiliary video. Likewise, the primary screen 100 may multicast a discovery request message including information indicating that the auxiliary content type is a sensation effect in order to discover a device capable of reproducing sensation effects. The application information acquisition procedure may be performed in the same manner.
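A typed discovery exchange of this kind can be sketched as a message carrying the auxiliary content type, which a peripheral answers only if it can handle that type. The message schema and the type names ("video", "audio", "sensory-effect") are assumptions for illustration; the patent does not define a wire format.

```python
import json

def build_typed_discovery(aux_type):
    """Build a discovery request that names the auxiliary content type."""
    return json.dumps({"request": "discovery", "auxContentType": aux_type})

def should_respond(device_capabilities, request_json):
    """A peripheral responds only if it can render the requested type."""
    req = json.loads(request_json)
    return req["auxContentType"] in device_capabilities

# The primary screen looks for a device that can reproduce sensation effects.
req = build_typed_discovery("sensory-effect")
```

Under this scheme the secondary screen (capable of "video") stays silent on a sensation effect request, while the sensation effect reproducing apparatus answers, which is exactly the selective response behavior the step describes.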

Through the discovery procedure and the application information acquisition procedure, the primary screen 100 can learn that the secondary screen 200 is a device capable of reproducing the auxiliary image and that the sensation effect reproducing apparatus 120 is a device capable of reproducing sensation effects. That is, it is assumed that the secondary screen 200 responded to the discovery request message including the information indicating that the auxiliary content type is video, and that the sensation effect reproducing apparatus 120 responded to the discovery request message including the information indicating that the auxiliary content type is a sensation effect. It is also assumed that the primary screen 100 acquires the application information available on the secondary screen 200 and the sensation effect reproducing apparatus 120.

The primary screen 100 may store and manage mapping information in which the auxiliary content type information and device information are mapped to each other. For example, the primary screen 100 may store and manage mapping information in which the secondary screen 200 is mapped to the auxiliary video, and the sensory-effect reproducing apparatus 120 is mapped to the sensory-effect metadata.
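The mapping information amounts to a small lookup table from auxiliary content type to device. A minimal sketch, where the device identifiers are illustrative placeholders:

```python
# Mapping managed by the primary screen: auxiliary content type -> device.
# The device identifiers below are illustrative placeholders.
content_device_map = {}

def register_device(aux_content_type: str, device_id: str) -> None:
    # Record which discovered device handles which auxiliary content type.
    content_device_map[aux_content_type] = device_id

def device_for(aux_content_type: str) -> str:
    # Look up the device that should receive this auxiliary content.
    return content_device_map[aux_content_type]

register_device("video", "secondary-screen-200")
register_device("effect", "effect-device-120")
```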

At step 807, the primary screen 100 may display the application information available at the secondary screen 200 and the sensory-effect reproducing apparatus 120 so that an application can be selected by the user. When an application is selected by the user, the primary screen 100 may transmit, to the secondary screen 200 and the sensory-effect reproducing apparatus 120, an application execution command that commands execution of the selected application. As a result, the secondary screen 200 and the sensory-effect reproducing apparatus 120 can execute the application whose execution has been commanded.
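The application execution command resembles a DIAL-style application launch (the DIAL specification is among this document's cited references): an HTTP POST to an application resource on the target device. The `/apps/<name>` path and port are illustrative assumptions, not taken from the patent.

```python
import http.client

def launch_application(device_host: str, app_name: str,
                       port: int = 8008, timeout: float = 3.0) -> int:
    """Command a peripheral device to execute the selected application by
    POSTing to a DIAL-style application resource. Returns the HTTP status;
    DIAL servers reply 201 Created on a successful launch."""
    conn = http.client.HTTPConnection(device_host, port, timeout=timeout)
    try:
        conn.request("POST", f"/apps/{app_name}", body="")
        return conn.getresponse().status
    finally:
        conn.close()
```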

In step 809, the primary screen 100 may separate the multitrack content into the main video, the auxiliary video, and the sensory-effect metadata. The primary screen 100 can reproduce the main video by itself, and can transmit the auxiliary content to the peripheral devices with reference to the mapping information it manages. For example, the primary screen 100 may provide the auxiliary video to the secondary screen 200 and provide the sensory-effect metadata to the sensory-effect reproducing apparatus 120. Thus, the secondary screen 200 reproduces the auxiliary video, and the sensory-effect reproducing apparatus 120 reproduces the sensory effect.
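Step 809's separate-and-distribute behavior can be sketched as a demultiplexing dispatch: the primary screen keeps the main track and forwards each auxiliary track to the device recorded in its mapping. The track names and the `send` callback are illustrative assumptions.

```python
def distribute_multitrack(tracks: dict, mapping: dict, send) -> bytes:
    """Separate a multitrack item and dispatch its auxiliary tracks.

    tracks  : {'main': ..., 'video': ..., 'effect': ...}
    mapping : auxiliary content type -> device identifier
    send    : callback send(device_id, payload) used for delivery
    Returns the main track, which the primary screen plays itself.
    """
    main = tracks["main"]
    for content_type, payload in tracks.items():
        if content_type == "main":
            continue  # the main video is reproduced locally
        send(mapping[content_type], payload)
    return main
```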

The embodiments of the invention described above may be implemented in any of a variety of ways. For example, embodiments of the present invention may be implemented using hardware, software, or a combination thereof. When implemented in software, the software may run on one or more processors using various operating systems or platforms. In addition, such software may be written in any of a number of suitable programming languages, and may be compiled into machine code or intermediate code executable in a framework or virtual machine.

Also, when embodiments of the present invention are implemented on one or more processors, one or more programs for carrying out the methods of the various embodiments of the invention discussed above may be stored on a processor-readable medium (e.g., a memory, a floppy disk, a hard disk, a compact disk, an optical disk, a magnetic tape, or the like).

Claims (1)

A method for a user device to provide a collaboration service, the method comprising:
receiving multitrack content from a broadcast server;
discovering a peripheral device capable of providing the collaboration service when a collaboration service for the multitrack content is requested; and
separating the multitrack content and transmitting the separated content to the peripheral device.
KR1020160022671A 2015-04-15 2016-02-25 Collaborating service providing method for media sharing and system thereof KR102052385B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020150053406 2015-04-15
KR20150053406 2015-04-15

Publications (2)

Publication Number Publication Date
KR20160123982A true KR20160123982A (en) 2016-10-26
KR102052385B1 KR102052385B1 (en) 2019-12-06

Family

ID=57251946

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160022671A KR102052385B1 (en) 2015-04-15 2016-02-25 Collaborating service providing method for media sharing and system thereof

Country Status (1)

Country Link
KR (1) KR102052385B1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120082723A (en) * 2011-01-14 2012-07-24 전자부품연구원 Collaborative service system using mobile device and method for providing collaborative service using mobile device in smart tv of the system
KR20140039869A (en) 2012-09-25 2014-04-02 한국전자통신연구원 Method and apparatus for providing of web-based multi-network adaptive multi-screen service
KR20140050917A (en) * 2012-10-22 2014-04-30 광운대학교 산학협력단 The differential media content transmission method and system according to the network status in a home environment
KR20140117977A (en) * 2013-03-27 2014-10-08 건국대학교 산학협력단 N-Screen Service System for ASMD support and the method thereof


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DIAL (Discovery and launch protocol specification), version 1.7.1(Netflix)

Also Published As

Publication number Publication date
KR102052385B1 (en) 2019-12-06

Similar Documents

Publication Publication Date Title
CN109963162B (en) Cloud directing system and live broadcast processing method and device
JP5335087B2 (en) Information processing system and information processing apparatus
TWI669957B (en) Media projection method, media projection device, control terminal, and cloud server
EP3996355B1 (en) Method for transferring media stream and user equipment
CN105765984B (en) Emit equipment, launching technique, receiving device and method of reseptance
JP2012169739A (en) Video division reproduction method, video reproduction method, video division reproduction system and video division reproduction program
KR102499231B1 (en) Receiving device, sending device and data processing method
WO2015035742A1 (en) Method, terminal and system for audio and video sharing of digital television
US11252478B2 (en) Distribution device, distribution method, reception device, reception method, program, and content distribution system
WO2018034172A1 (en) Information processing device, client device, and data processing method
CA3038028A1 (en) Receiving device, transmitting device, and data processing method
KR20180099109A (en) System and method for requesting real time broadcasting
KR102052385B1 (en) Collaborating service providing method for media sharing and system thereof
JPWO2016174959A1 (en) Reception device, transmission device, and data processing method
WO2016051802A1 (en) Content reception system, content reception device, display device, content reception system control method, and program
GB2561343A (en) Signalling of auxiliary content for a broadcast signal
KR20160107617A (en) Method, user device and computer program for providing video service
Bassbouss Concepts and models for creating distributed multimedia applications and content in a multiscreen environment
CN105721887A (en) Video playing method, apparatus and system
CN105007514A (en) Interactive system for television networks
CN104639518A (en) Session building method and device and session content delivering method and device
CN112565810A (en) Merging processing method, device, medium and electronic equipment for live broadcast room
KR20150103789A (en) Contents prefetch scheme based on ip multicast for seamless screen switching
CN105187920A (en) Interactive system for television network

Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E701 Decision to grant or registration of patent right