KR102052385B1 - Collaborating service providing method for media sharing and system thereof - Google Patents


Info

Publication number
KR102052385B1
Authority
KR
South Korea
Prior art keywords
content
information
collaboration service
providing
screen
Prior art date
Application number
KR1020160022671A
Other languages
Korean (ko)
Other versions
KR20160123982A (en)
Inventor
박상욱
지덕구
고은진
장종현
한미경
Original Assignee
한국전자통신연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원
Publication of KR20160123982A
Application granted
Publication of KR102052385B1


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/43615Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/64Addressing
    • H04N21/6405Multicasting

Abstract

Embodiments of the present invention relate to a method and system for providing a collaboration service for media sharing. A method for providing a collaboration service according to an embodiment of the present invention includes: receiving multi-track content from a broadcast server; discovering a peripheral device capable of providing the collaboration service when a collaboration service for the multi-track content is requested; and separating the multi-track content and transmitting the separated content to the peripheral device. According to embodiments of the present disclosure, a plurality of smart devices may provide a collaboration service for media sharing.

Description

Collaborating service providing method for media sharing and system thereof

Embodiments of the present invention relate to a method and system for providing a collaboration service for media sharing.

Smart devices capable of performing multiple functions, such as browsing the web and performing computing by connecting to the Internet through a giga network, are spreading. Smart devices include, for example, smartphones, smart pads, and smart TVs. Recently, there is a growing demand for consuming high-quality content at the Ultra High Definition (UHD) level using a plurality of smart devices.

Multi-screen service technology is a technology for providing video content using a plurality of smart devices. Multi-screen refers to the set of screens of the various smart devices owned by a user, and multi-screen service refers to a service that provides the user's content and services through that multi-screen. From the content provider's perspective, multi-screen service technology allows users to use a single piece of content with the same user experience on various smart devices. From the user's perspective, multi-screen service technology allows content to be consumed using smart devices such as smart TVs, smartphones, and smart pads. In addition, from the network operator's perspective, multi-screen service technology makes it possible to use network resources effectively in providing a multi-screen service.

Korean Publication No. 2014-0039869 (Web-based multi-network adaptive multi-screen service method and device therefor)

DIAL (Discovery and launch protocol specification), version 1.7.1 (Netflix)

Embodiments of the present disclosure allow a plurality of smart devices to provide a collaboration service for media sharing.

According to an embodiment of the present invention, a method of providing a collaboration service by a user device includes: receiving multi-track content from a broadcast server; discovering a peripheral device capable of providing the collaboration service when a collaboration service for the multi-track content is requested; and separating the multi-track content and transmitting the separated content to the peripheral device.

In an embodiment, the discovering of the peripheral device may include: multicasting, to peripheral devices, a discovery request message into which type information of auxiliary content included in the multi-track content is inserted; and receiving a discovery response message from a peripheral device capable of playing or reproducing the auxiliary content.

According to an embodiment of the present disclosure, the method may further include mapping and storing the type information of the auxiliary content and information of the peripheral device that has transmitted the discovery response message.

In an embodiment, the transmitting of the separated content to the peripheral device may include transmitting the separated content to the peripheral device with reference to the mapped information.

In an embodiment, the auxiliary content may be an auxiliary image or sensory effect metadata.

The transmitting of the separated content to the peripheral device may include transmitting the separated content to the peripheral device by referring to application information obtained from the peripheral device.

A collaboration service providing system according to another embodiment of the present invention includes: a first gateway that belongs to a first subnet, receives requests for providing a collaboration service from a first device belonging to the first subnet, and transmits them using an established protocol; a management server that establishes a session with the first device, forwards the requests to a second subnet in which a second device receiving content from a streaming server is located, and forwards responses received from the second subnet to the first gateway; and a second gateway that multicasts requests received from the management server to devices belonging to the second subnet and transmits responses received from a third device belonging to the second subnet using the established protocol, wherein the third device may be provided with additional content from the streaming server using information received from the second gateway.

According to an embodiment of the present disclosure, a method of providing a collaboration service by a user device may include: executing an application to be used for content playback according to an execution command from another device that is playing content received from a streaming server; receiving, from a management server, information on resume content that starts from the content playback time of the other device; and receiving the resume content from the streaming server using the received resume content information and playing it.

In an embodiment of the present disclosure, the information about the resume content may be a Uniform Resource Identifier (URI) of the resume content.

In one embodiment, the method may further include responding to a discovery request from the other device.

In an embodiment, the responding to the discovery request may include responding to the discovery request using Universal Plug and Play (UPnP).

In an embodiment, the method may further include transmitting application information available to the other device according to a request from the other device.

In an embodiment, the transmitting of the application information may include transmitting the application information by using a representational state transfer (REST).

According to embodiments of the present disclosure, a plurality of smart devices may provide a collaboration service for media sharing.

FIG. 1 is a conceptual diagram illustrating a method of providing a collaboration service according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating the method of providing a collaboration service shown in FIG. 1;
FIG. 3 is a conceptual diagram illustrating a method of providing a collaboration service according to another embodiment of the present invention;
FIG. 4 is a flowchart illustrating the method of providing a collaboration service shown in FIG. 3;
FIG. 5 is a conceptual diagram illustrating a method of providing a collaboration service according to another embodiment of the present invention;
FIG. 6 is a flowchart illustrating the method of providing a collaboration service shown in FIG. 5;
FIG. 7 is a conceptual diagram illustrating a method of providing a collaboration service according to another embodiment of the present invention;
FIG. 8 is a flowchart illustrating the method of providing a collaboration service shown in FIG. 7.

Hereinafter, in describing the embodiments of the present invention, when it is determined that a detailed description of a related known function or configuration may unnecessarily obscure the subject matter of the present invention, the detailed description thereof will be omitted.

Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings.

Hereinafter, in describing the embodiments of the present invention, smart devices such as a TV, a Blu-ray player, a smartphone, and a tablet are referred to as screens. A screen may play content using web standard technology (e.g., HTTP Live Streaming (HLS)).

FIG. 1 is a conceptual diagram illustrating a method of providing a collaboration service according to an embodiment of the present invention.

In the following description, for convenience, devices without mobility such as TVs and Blu-ray players are referred to as primary screens 100, and devices with mobility such as smartphones and tablets are referred to as secondary screens 200.

The secondary screen 200 may receive a list of available content from the TaaS software platform server 400 (hereinafter referred to as the TaaS server). The TaaS server 400 may manage information on content that can be provided to a device (e.g., a primary screen or a secondary screen). The TaaS server 400 may also manage device information or information on the group to which a device belongs, and, when any device requests information on available content, may respond to the request.

The secondary screen 200 may present the list of available content to the user, and receive and play the content selected by the user from the streaming server 300. If the user of the secondary screen 200 wants to play content being played on the secondary screen 200 on another device (for example, the primary screen 100), the secondary screen 200 may perform discovery and application execution procedures with the primary screen 100. Accordingly, the primary screen 100 may receive the content being played on the secondary screen 200 from the streaming server 300 and play it.

FIG. 2 is a flowchart illustrating a method of providing a collaboration service shown in FIG. 1. According to an embodiment, at least one of the steps illustrated in FIG. 2 may be omitted.

In step 201, the secondary screen 200 may request the TaaS server 400 to send a list of available content, and display the content list received from the TaaS server 400. The content list may include at least one of a content file name, a Uniform Resource Identifier (URI), and a thumbnail. The content list may be prepared in various standard formats (e.g., JavaScript Object Notation (JSON)), and the URI may be an identifier of content stored in the streaming server 300. Representational State Transfer (REST) may be used for requesting and transmitting the content list.
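
As an illustration, such a content list might be encoded in JSON as follows. The field names and server address are hypothetical; the patent specifies only that the list may carry a file name, URI, and thumbnail.

```python
import json

# Hypothetical JSON content list, as the TaaS server might return it over REST.
# Field names ("name", "uri", "thumbnail") are illustrative, not from the patent.
response_body = """
[
  {"name": "movie.mp4",
   "uri": "http://address/movie/index.m3u8",
   "thumbnail": "http://address/movie/thumb.jpg"},
  {"name": "lecture.mp4",
   "uri": "http://address/lecture/index.m3u8",
   "thumbnail": "http://address/lecture/thumb.jpg"}
]
"""

content_list = json.loads(response_body)
# The secondary screen would render these names for the user to choose from.
titles = [item["name"] for item in content_list]
```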

The user of the secondary screen 200 may select one of the contents included in the content list to play. The selection of content may be made through various user interfaces (e.g., touch screens).

In step 203, the streaming server 300 may receive a content provision request from the secondary screen 200 and provide the corresponding content to the secondary screen 200. If adaptive content can be provided, the streaming server 300 may generate a list of adaptive content that can be provided to the secondary screen 200 and provide the generated list to the secondary screen 200. Here, adaptive content means content whose file size suits the resolution and bandwidth of the receiving device (e.g., a secondary screen). That is, the list of adaptive content may include information about a plurality of adaptive contents that carry the same content but have different file sizes. When one of the adaptive contents is selected by the user of the secondary screen 200 or by the secondary screen 200 itself, the secondary screen 200 may request the streaming server 300 to provide the selected adaptive content, and may receive and play the adaptive content corresponding to the request. The content (or adaptive content) may be provided in various formats (e.g., HTTP adaptive streaming such as HTTP Live Streaming (HLS)).
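
In HLS, such a list of adaptive variants corresponds to a master playlist. The sketch below parses a minimal, hypothetical master playlist and picks the highest-bandwidth variant that fits the device's available bandwidth; the playlist contents and URIs are illustrative.

```python
# Minimal, hypothetical HLS master playlist listing three adaptive variants.
master_playlist = """#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/index.m3u8
"""

def select_variant(playlist: str, available_bandwidth: int) -> str:
    """Return the URI of the best variant not exceeding the available bandwidth."""
    variants = []  # (bandwidth, uri) pairs
    lines = playlist.strip().splitlines()
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF:"):
            attrs = line.split(":", 1)[1]
            bw = next(int(a.split("=")[1]) for a in attrs.split(",")
                      if a.startswith("BANDWIDTH="))
            variants.append((bw, lines[i + 1]))
    fitting = [v for v in variants if v[0] <= available_bandwidth]
    # Fall back to the smallest variant if none fits.
    return max(fitting)[1] if fitting else min(variants)[1]
```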

If the provision of a collaboration service (e.g., an N-screen service) is requested, in step 205 the secondary screen 200 may perform a procedure for discovering a peripheral device capable of providing the collaboration service. For example, the secondary screen 200 may transmit a discovery request (e.g., the M-SEARCH request defined in UPnP specification 1.3.2) to the primary screen 100 located near the secondary screen 200, and receive a discovery response (e.g., the M-SEARCH response defined in UPnP specification 1.3.3) from the primary screen 100. The discovery request may be multicast.
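
A UPnP M-SEARCH discovery request is a plain-text HTTPU message multicast to 239.255.255.250:1900. The sketch below only builds the message; the search target (ST) value shown is illustrative.

```python
SSDP_ADDR = ("239.255.255.250", 1900)  # standard SSDP multicast address/port

def build_msearch(search_target: str, mx: int = 3) -> bytes:
    """Build an SSDP M-SEARCH request as defined in the UPnP device architecture."""
    lines = [
        "M-SEARCH * HTTP/1.1",
        f"HOST: {SSDP_ADDR[0]}:{SSDP_ADDR[1]}",
        'MAN: "ssdp:discover"',
        f"MX: {mx}",                # max seconds a device may wait before replying
        f"ST: {search_target}",     # search target, e.g. a device or service type
        "", "",                     # blank line terminates the header block
    ]
    return "\r\n".join(lines).encode("ascii")

# Sending it would use a UDP socket, for example:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(build_msearch("ssdp:all"), SSDP_ADDR)
request = build_msearch("urn:schemas-upnp-org:device:MediaRenderer:1")
```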

In step 207, an application information acquisition procedure may be performed. For example, the secondary screen 200 may request, from the discovered primary screen 100, information on applications capable of playing the content, and receive that application information from the primary screen 100. The application information request and response may be performed using REST. The application information may be the URI (e.g., the application name) of an application. The application information request may be multicast.

In step 209, the secondary screen 200 may display the application information executable on the primary screen 100. If a plurality of primary screens 100 are found, the secondary screen 200 may display information on the plurality of primary screens 100 (e.g., device names), and the user may select any one primary screen 100 among them. When a primary screen 100 is selected, the secondary screen 200 may display the application information executable on the selected primary screen 100 and receive a selection of one of those applications. When an application to be used for content playback is selected, the secondary screen 200 may transmit to the primary screen 100 an application execution command instructing it to execute the selected application. REST may be used to transmit the application execution command. Information (e.g., the URI) on the content being played on the secondary screen 200 may be transmitted to the primary screen 100 together with the application execution command.

In step 211, the primary screen 100 may execute the application according to the application execution command received from the secondary screen 200, request the streaming server 300 to provide the content using the content information (e.g., http://address/*.m3u8) received from the secondary screen 200, and receive and play the content from the streaming server 300.
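
This discover-then-launch flow resembles the DIAL protocol cited above, in which launching an application is an HTTP POST to an application URL on the target device, with launch parameters in the body. The sketch below only assembles such a request; the device address, application name, and payload format are assumptions for illustration.

```python
def build_launch_request(device_addr: str, app_name: str, content_uri: str) -> str:
    """Assemble a DIAL-style application launch request.
    device_addr and app_name are hypothetical; the content URI is passed as
    the POST body, as DIAL passes launch parameters."""
    body = content_uri
    return (
        f"POST /apps/{app_name} HTTP/1.1\r\n"
        f"Host: {device_addr}\r\n"
        "Content-Type: text/plain; charset=utf-8\r\n"
        f"Content-Length: {len(body)}\r\n"
        "\r\n"
        f"{body}"
    )

request = build_launch_request(
    "192.168.0.10:8008",        # assumed primary-screen address
    "player",                   # assumed application name
    "http://address/*.m3u8",    # content URI from the patent's example
)
```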

FIG. 3 is a conceptual diagram illustrating a method of providing a collaboration service according to another embodiment of the present invention.

Referring to FIG. 3, when the user of the secondary screen 200 wants to play content being played on the secondary screen 200 on another device (e.g., the primary screen 100), the primary screen 100 may play the content from the point at which the secondary screen 200 was playing it.

To this end, when there is a request from the secondary screen 200 for providing the collaboration service, the streaming server 300 may newly generate and store content (hereinafter referred to as resume content) that starts from the point being played on the secondary screen 200. In addition, the TaaS server 400 may store information (e.g., the URI) on the resume content in correspondence with the user of the secondary screen 200 or the user group to which the secondary screen 200 belongs. Here, it is assumed that the primary screen 100 and the secondary screen 200 belong to the same user, and that the TaaS server 400 holds information on the primary screen 100 and the secondary screen 200.

FIG. 4 is a flowchart illustrating a method of providing a collaboration service shown in FIG. 3. According to an embodiment, at least one of the steps illustrated in FIG. 4 may be omitted.

In step 405, it is assumed that content is being played on the secondary screen 200. Before step 405, steps 201 and 203 described with reference to FIG. 2 may have been performed, and it is assumed that the URI of the content currently being played is "http://address/*.m3u8".

If there is a request from the user of the secondary screen 200 for providing the collaboration service, in step 407 the secondary screen 200 may transmit the corresponding collaboration service provision request to the streaming server 300. Accordingly, the streaming server 300 may check the current playback time of the content being played on the secondary screen 200 and generate resume content starting from that playback time. Here, it is assumed that the URI of the resume content is "http://address/*.m3u8/start/50/*new.m3u8".
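
Following the patent's example URIs, the resume content identifier embeds the playback offset (here, 50) into the path. A minimal sketch of how the streaming server might derive such a URI, assuming that path convention:

```python
def make_resume_uri(original_uri: str, start_seconds: int) -> str:
    """Derive a resume-content URI embedding the playback offset, following
    the '<original>/start/<seconds>/*new.m3u8' pattern of the patent's example."""
    return f"{original_uri}/start/{start_seconds}/*new.m3u8"

# The playback time (50) would in practice be reported by the secondary screen.
resume_uri = make_resume_uri("http://address/*.m3u8", 50)
```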

The streaming server 300 may provide the TaaS server 400 with information (e.g., the URI) on the resume content. Here, when providing the information on the resume content to the TaaS server 400, the streaming server 300 may also transmit information on the secondary screen 200. Accordingly, the TaaS server 400 may manage information on which user (or device or user group) the resume content is associated with. Therefore, when any device later requests a list of available content, the TaaS server 400 may provide that device with a list of the content available on it by referring to this management information.

For example, when there is a request for a list of available content from another device owned by the user of the secondary screen 200, or from a device owned by the user group to which the user of the secondary screen 200 belongs, the TaaS server 400 may provide a content list that includes information on the resume content (http://address/*.m3u8/start/50/*new.m3u8). Conversely, when there is a request for available content from a device unrelated to the user of the secondary screen 200, the TaaS server 400 may provide a content list that includes information on the original content (http://address/*.m3u8).
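
The per-user bookkeeping described above can be sketched as a mapping from a user (or user group) to its resume-content URIs; the data structure and function names are illustrative, not from the patent.

```python
# Hypothetical TaaS-server bookkeeping: which resume content belongs to which user.
resume_index = {}

def register_resume_content(user: str, resume_uri: str) -> None:
    """Record that resume content was generated for this user (or user group)."""
    resume_index.setdefault(user, []).append(resume_uri)

def content_list_for(user: str, original_uris: list) -> list:
    """Original content for everyone, plus resume content for its owner only."""
    return original_uris + resume_index.get(user, [])

register_resume_content("user-a", "http://address/*.m3u8/start/50/*new.m3u8")
list_a = content_list_for("user-a", ["http://address/*.m3u8"])  # owner's devices
list_b = content_list_for("user-b", ["http://address/*.m3u8"])  # unrelated device
```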

In step 409, a discovery procedure and an application information acquisition procedure may be performed. Step 409 is the same as steps 205 and 207 described with reference to FIG. 2, and thus a detailed description is omitted.

In step 411, the secondary screen 200 may display the application information executable on the primary screen 100 and receive a selection of one application from the user. When the application is selected, the secondary screen 200 may transmit to the primary screen 100 an application execution command instructing it to execute the selected application.

In step 413, the primary screen 100 may execute the requested application and request the TaaS server 400 to transmit a list of available content. Accordingly, the TaaS server 400 may provide the primary screen 100 with a list of content available on the primary screen 100. As described above, assuming that the primary screen 100 and the secondary screen 200 are held by the same user, the content list provided by the TaaS server 400 to the primary screen 100 may include information on the resume content (http://address/*.m3u8/start/50/*new.m3u8). When playback of the resume content is requested, the primary screen 100 may transmit to the streaming server 300 a content provision request including the information on the resume content (http://address/*.m3u8/start/50/*new.m3u8). Accordingly, the streaming server 300 may provide the resume content to the primary screen 100, and the primary screen 100 may play the resume content.

FIG. 5 is a conceptual diagram illustrating a method of providing a collaboration service according to another embodiment of the present invention.

The discovery procedure using UPnP described above may not be performable between devices located in different subnets. Accordingly, the TaaS gateways 410 and 420 and the TaaS server 400 cooperate to enable a collaboration service between devices located in different subnets. The TaaS gateways 410 and 420 may further provide a function of converting the messages used for discovery, application information acquisition, and application execution according to an established protocol and forwarding them to the counterpart gateway.

FIG. 6 is a flowchart for explaining a collaboration service providing method illustrated in FIG. 5. According to an embodiment, at least one of the steps illustrated in FIG. 6 may be omitted.

In step 601, it is assumed that a session is established between the remote device 10 located in subnet 1000 and the primary screen 100 located in subnet 2000, and that the remote device 10 and the primary screen 100 are playing content received from the streaming server 300. The session may be a session for video conferencing or a video lecture, and may be formed according to various conventional methods. The TaaS server 400 may be a management server that manages information on the TaaS gateways 410 and 420 participating in the session. For convenience of description, FIG. 5 illustrates one device forming a session with the remote device 10, but a plurality of devices belonging to different subnets may form sessions with the remote device 10.

In step 603, if there is additional content to be shared other than the content currently being played, the user of the remote device 10 may upload the additional content to the streaming server 300.

The TaaS server 400 may receive additional content and information on the remote device 10 from the remote device 10 or the streaming server 300, and map and manage corresponding information.

Thereafter, a series of procedures may be performed so that the additional content can be played on a device (e.g., the secondary screen 200) located near the primary screen 100. For example, the remote device 10 may transmit a discovery request to the first TaaS gateway 410, and the first TaaS gateway 410 may forward the discovery request to the TaaS server 400. The TaaS server 400 may forward the discovery request (e.g., in multicast form) to the gateway (e.g., the second TaaS gateway 420) that has established a session with the remote device 10. The second TaaS gateway 420 may transmit the discovery request (in multicast form) to the devices (e.g., the primary screen 100 and the secondary screen 200) located in the subnet 2000 to which it belongs. Accordingly, the primary screen 100 and the secondary screen 200 may respond to the discovery request toward the second TaaS gateway 420. The second TaaS gateway 420 may send the discovery responses received from the primary screen 100 and the secondary screen 200 to the TaaS server 400. Accordingly, the TaaS server 400 may transmit the discovery responses to the first TaaS gateway 410, and the first TaaS gateway 410 may transmit the corresponding discovery responses to the remote device 10. Thereafter, the requests and responses for acquiring application information may be performed according to the same process as the discovery request and discovery response process described above.
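
The relay chain above (device, gateway, management server, peer gateway, subnet devices) can be sketched with in-memory calls standing in for the network links; the class names and message shapes are illustrative only.

```python
# In-memory sketch of the cross-subnet discovery relay; method calls stand in
# for the network links between device, gateways, and management server.

class Gateway:
    def __init__(self, name, server):
        self.name = name
        self.server = server
        self.subnet_devices = []        # devices reachable by local multicast

    def forward_to_server(self, message):
        """Convert a local discovery request and hand it to the management server."""
        return self.server.relay(self, message)

    def multicast_to_subnet(self, message):
        """Deliver the request to every device in this gateway's subnet."""
        return [dev.respond(message) for dev in self.subnet_devices]

class ManagementServer:                 # plays the role of the TaaS server
    def __init__(self):
        self.sessions = {}              # source gateway -> session peer gateway

    def relay(self, source, message):
        peer = self.sessions[source]
        return peer.multicast_to_subnet(message)

class Device:
    def __init__(self, name):
        self.name = name
    def respond(self, message):
        return f"{self.name} responds to {message}"

server = ManagementServer()
gw1, gw2 = Gateway("gw1", server), Gateway("gw2", server)
server.sessions = {gw1: gw2, gw2: gw1}
gw2.subnet_devices = [Device("primary-screen"), Device("secondary-screen")]
# A request from the remote device's subnet fans out in the peer subnet.
responses = gw1.forward_to_server("discovery-request")
```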

In step 605, the remote device 10 may display the obtained application information and receive a selection of one application from the user. The remote device 10 may transmit an application execution command instructing execution of the selected application. The application execution command may be delivered to the second TaaS gateway 420 according to the same process as the discovery request described above, and the second TaaS gateway 420 may transmit the corresponding application execution command to the secondary screen 200. At this time, the information (e.g., URI) on the additional content may be transmitted together with the application execution command.

In step 607, the secondary screen 200 may execute the corresponding application according to the application execution command received from the second TaaS gateway 420, and, using the information (e.g., URI) on the additional content received from the second TaaS gateway 420, receive the additional content from the streaming server 300 and play it.

FIG. 7 is a conceptual diagram illustrating a method of providing a collaboration service according to another embodiment of the present invention.

The primary screen 100 may receive multi-track content from the broadcast server 500 and play it. The multi-track content may be, for example, a form in which a plurality of images (e.g., a main image and an auxiliary image to be displayed in PIP format) are mixed, a form in which at least one image and additional information such as text are mixed, or a form in which at least one image and sensory effect metadata for reproducing sensory effects are mixed. A sensory effect may be, for example, an effect relating to any one of lighting, wind, vibration, and fragrance.

Assuming that the multi-track content includes one main image, one auxiliary image, and one piece of sensory effect metadata, the main image may be played on the primary screen 100, the auxiliary image on the secondary screen 200, and the sensory effect may be reproduced by the sensory effect reproducing apparatus 120.

FIG. 8 is a flowchart for explaining a collaboration service providing method illustrated in FIG. 7. According to an embodiment, at least one of the steps illustrated in FIG. 8 may be omitted.

In step 801, the primary screen 100 may receive multi-track content from the broadcast server 500. Here, it is assumed that the multi-track content is a form in which one main image, one auxiliary image, and one piece of sensory effect metadata are mixed. Hereinafter, for convenience of explanation, the main image is referred to as the main content, and the content other than the main image, for example the auxiliary image and the sensory effect metadata, is referred to as auxiliary content.

In steps 803 and 805, the primary screen 100 may perform a discovery procedure and an application information acquisition procedure for a peripheral device. The discovery procedure and the application information acquisition procedure of steps 803 and 805 may be performed in basically the same manner as those of steps 205 and 207 described with reference to FIG. 2.

However, auxiliary content type information may be further used for the discovery request and the application information request in steps 803 and 805. The auxiliary content type information may be information indicating whether the corresponding auxiliary content is a video, a voice, or a sensory effect. For example, the primary screen 100 may multicast a discovery request message in which information indicating that the auxiliary content type is a video is inserted in order to find a device capable of playing the auxiliary video. Also, for example, the primary screen 100 may multicast a discovery request message in which information indicating that the auxiliary content type is a sensory effect is inserted in order to find a device that can reproduce the sensory effect. The application information acquisition procedure may be performed in the same manner.
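
Reusing the M-SEARCH message shape discussed for FIG. 2, the auxiliary content type could be carried in an extra header of the discovery request. The header name below is an assumption made for illustration; UPnP itself does not define a content-type field for M-SEARCH.

```python
def build_typed_discovery(content_type: str) -> bytes:
    """Build a discovery request carrying auxiliary content type information.
    The CONTENTTYPE.TAAS.ORG header is hypothetical, shown only to illustrate
    inserting type information ('video', 'audio', or 'sensory-effect')."""
    lines = [
        "M-SEARCH * HTTP/1.1",
        "HOST: 239.255.255.250:1900",
        'MAN: "ssdp:discover"',
        "MX: 3",
        "ST: ssdp:all",
        f"CONTENTTYPE.TAAS.ORG: {content_type}",  # hypothetical vendor header
        "", "",
    ]
    return "\r\n".join(lines).encode("ascii")

video_req = build_typed_discovery("video")            # find auxiliary-image players
effect_req = build_typed_discovery("sensory-effect")  # find effect reproducers
```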

Through the discovery procedure and the application information acquisition procedure, the primary screen 100 may find the secondary screen 200 as a device capable of playing the auxiliary image, and the sensory effect reproducing apparatus 120 as a device capable of reproducing the sensory effect. That is, it is assumed that the secondary screen 200 responded to the discovery request message carrying the information indicating that the auxiliary content type is an image, and that the sensory effect reproducing apparatus 120 responded to the discovery request message carrying the information indicating that the auxiliary content type is a sensory effect. In addition, it is assumed that the primary screen 100 has obtained the application information available on the secondary screen 200 and the sensory effect reproducing apparatus 120.

The primary screen 100 may store and manage mapping information in which the auxiliary content type information and the device information are mapped. For example, the primary screen 100 may store and manage mapping information in which the secondary screen 200 is mapped to the auxiliary image and the sensory effect reproducing apparatus 120 is mapped to the sensory effect metadata.

In operation 807, the primary screen 100 may display the application information available in the secondary screen 200 and the sensory effect reproducing apparatus 120 so that the user can select an application. When an application is selected by the user, the primary screen 100 may transmit an application execution command to the secondary screen 200 and the sensory effect reproducing apparatus 120, instructing them to execute the selected application. Accordingly, the secondary screen 200 and the sensory effect reproducing apparatus 120 execute the commanded application.

In operation 809, the primary screen 100 may separate the multi-track content into the main video, the auxiliary video, and the sensory effect metadata. The primary screen 100 reproduces the main video by itself. The primary screen 100 may transmit each piece of auxiliary content to the corresponding peripheral device with reference to the mapping information it manages. For example, the primary screen 100 may provide the auxiliary video to the secondary screen 200 and provide the sensory effect metadata to the sensory effect reproduction apparatus 120. Accordingly, the secondary screen 200 reproduces the auxiliary video, and the sensory effect reproduction apparatus 120 reproduces the sensory effect.
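Operation 809 can be sketched as a split-and-route step. The track names, the in-memory content representation, and the send() stub are assumptions for illustration only:

```python
def distribute(multi_track_content: dict, mapping_info: dict, send) -> str:
    """Separate multi-track content and route each auxiliary track to the
    device registered for its type; return the main video for local play."""
    main_video = multi_track_content.pop("main_video")
    for track_type, track_data in multi_track_content.items():
        device = mapping_info.get(track_type)
        if device is not None:
            # e.g. auxiliary video -> secondary screen 200,
            #      sensory effect metadata -> apparatus 120
            send(device, track_data)
    return main_video  # reproduced by the primary screen itself

sent = []
content = {"main_video": "MAIN", "auxiliary_video": "AUX", "sensory_effect": "SEM"}
mapping = {"auxiliary_video": 200, "sensory_effect": 120}
local = distribute(content, mapping, lambda dev, data: sent.append((dev, data)))
assert local == "MAIN"
assert (200, "AUX") in sent and (120, "SEM") in sent
```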

Embodiments of the present invention described above may be implemented in any of various ways. For example, they may be implemented using hardware, software, or a combination thereof. When implemented in software, they may be realized as software running on one or more processors using various operating systems or platforms. Such software may be written in any suitable programming language and may be compiled into machine code or into intermediate code executable in a framework or virtual machine.

In addition, embodiments of the present invention may be embodied as one or more programs recorded on a processor-readable medium (e.g., a memory, a floppy disk, a hard disk, a compact disc, an optical disc, or a magnetic tape) that, when executed on one or more processors, perform a method implementing the various embodiments of the present invention discussed above.

Claims (10)

A method of providing a collaboration service by a user device, the method comprising:
Receiving multi-track content from a broadcast server;
Multicasting, when a collaboration service for the multi-track content is requested, a discovery request message including type information of auxiliary content included in the multi-track content;
Receiving a discovery response message from a device capable of playing the auxiliary content among a plurality of peripheral devices capable of providing the collaboration service;
Storing mapping information generated by mapping the type information of the auxiliary content to device information of the device that transmitted the discovery response message; and
Separating the multi-track content and transmitting the separated content, with reference to the mapping information, to a device capable of playing the separated content.
delete

delete

delete

In claim 1,
wherein the multi-track content includes at least two of a main video, an auxiliary video, and sensory effect metadata.
A method of providing a collaboration service by a second device, the method comprising:
Receiving an execution command from a first device playing first content;
Executing an application according to the execution command and receiving, from a management server, information on resume content of the first content; and
Receiving the resume content from a streaming server using the received information on the resume content,
wherein the resume content is newly generated in the streaming server as content starting from a time point after the first content is played on the first device, and
wherein the receiving of the information on the resume content comprises, when the second device does not belong to the same user group as the first device, receiving information on the original content of the first content from the management server instead of the information on the resume content.
In claim 6,
wherein the information on the resume content includes a Uniform Resource Identifier (URI) of the resume content.
delete

In claim 6,
wherein the second device belongs to the same user group as the first device.
delete
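The user-group decision described in the claims above, where a second device in the same user group receives information on the resume content (a URI) while a device outside the group receives information on the original content instead, can be sketched as follows. The function name, the group representation, and the example URIs are all assumptions for illustration:

```python
def content_info_for(second_device_group: str, first_device_group: str,
                     resume_uri: str, original_uri: str) -> dict:
    """Decide, as the management server would, which content information
    to return to the second device."""
    if second_device_group == first_device_group:
        # Same user group: hand out the resume content's URI.
        return {"content": "resume", "uri": resume_uri}
    # Different user group: fall back to the original content's URI.
    return {"content": "original", "uri": original_uri}

# A family member's second device continues from where playback stopped:
info = content_info_for("family", "family",
                        "http://stream.example/resume/42",
                        "http://stream.example/original/42")
assert info["content"] == "resume"

# A device outside the user group gets the original content instead:
info = content_info_for("guest", "family",
                        "http://stream.example/resume/42",
                        "http://stream.example/original/42")
assert info["content"] == "original"
```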
KR1020160022671A 2015-04-15 2016-02-25 Collaborating service providing method for media sharing and system thereof KR102052385B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20150053406 2015-04-15
KR1020150053406 2015-04-15

Publications (2)

Publication Number Publication Date
KR20160123982A KR20160123982A (en) 2016-10-26
KR102052385B1 true KR102052385B1 (en) 2019-12-06

Family

ID=57251946

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160022671A KR102052385B1 (en) 2015-04-15 2016-02-25 Collaborating service providing method for media sharing and system thereof

Country Status (1)

Country Link
KR (1) KR102052385B1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101276205B1 (en) * 2011-01-14 2013-06-20 전자부품연구원 Collaborative service system using mobile device and method for providing collaborative service using mobile device in smart TV of the system
KR101711161B1 (en) 2012-09-25 2017-03-13 한국전자통신연구원 Method and apparatus for providing of web-based multi-network adaptive multi-screen service
KR101406425B1 (en) * 2012-10-22 2014-06-27 광운대학교 산학협력단 The differential media content transmission method and system according to the network status in a home environment
KR20140117977A (en) * 2013-03-27 2014-10-08 건국대학교 산학협력단 N-Screen Service System for ASMD support and the method thereof

Also Published As

Publication number Publication date
KR20160123982A (en) 2016-10-26

Similar Documents

Publication Publication Date Title
CN109963162B (en) Cloud directing system and live broadcast processing method and device
TWI669957B (en) Media projection method, media projection device, control terminal, and cloud server
US8762465B2 (en) Method for providing a content-sharing service, and device therefor
JP6006337B2 (en) Method, apparatus, and system for media playback processing and control
EP2933982B1 (en) Media stream transfer method and user equipment
US20110296460A1 (en) Method and apparatus for providing remote user interface (ui) service
WO2015035742A1 (en) Method, terminal and system for audio and video sharing of digital television
KR20120114016A (en) Method and apparatus for network adaptive streaming user data in a outer terminal
CN109983777B (en) Method, client device and controller system for enabling media orchestration
US20170105034A1 (en) Communication apparatus, communication method, and program
US20200351559A1 (en) Distribution device, distribution method, reception device, reception method, program, and content distribution system
WO2016174960A1 (en) Reception device, transmission device, and data processing method
WO2018034172A1 (en) Information processing device, client device, and data processing method
JPWO2018079295A1 (en) Information processing apparatus and information processing method
CN109792556B (en) Receiving apparatus, transmitting apparatus, and data processing method
JP5300951B2 (en) CONTENT PROCESSING SYSTEM, SERVER DEVICE, CONTENT REPRODUCTION DEVICE, CONTROL METHOD, AND CONTROL PROGRAM
KR101231821B1 (en) Method and System for providing contents continuous play service
KR102052385B1 (en) Collaborating service providing method for media sharing and system thereof
WO2018043111A1 (en) Information processing device, information processing method, and information processing system
JPWO2016174959A1 (en) Reception device, transmission device, and data processing method
US8255556B2 (en) Multicast and synchronization emulation for content transformed streams
WO2016051802A1 (en) Content reception system, content reception device, display device, content reception system control method, and program
WO2011118498A1 (en) Content distribution system, content distribution method, and content distribution program
CN105721887A (en) Video playing method, apparatus and system
WO2014183539A1 (en) Session setup method and apparatus, and session content delivery method and apparatus

Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E701 Decision to grant or registration of patent right