WO2018080180A1 - System and method for providing augmented reality service associated with broadcasting - Google Patents

System and method for providing augmented reality service associated with broadcasting

Info

Publication number
WO2018080180A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
broadcasting
terminal device
broadcasting content
additional
Application number
PCT/KR2017/011881
Other languages
English (en)
French (fr)
Inventor
Ji Eun Kim
Jong Han Lee
Hye Kyung Park
Kyung Hun Kang
Shin Lee
Ho Sang CHA
Original Assignee
Cj Hellovision Co., Ltd.
Application filed by Cj Hellovision Co., Ltd.
Publication of WO2018080180A1

Classifications

    • H04N21/41 - Structure of client; Structure of client peripherals
    • H04N21/4223 - Cameras
    • H04N21/41265 - The peripheral being portable, e.g. PDAs or mobile phones, having a remote control device for bidirectional communication between the remote control device and client device
    • H04N21/434 - Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4345 - Extraction or processing of SI, e.g. extracting service information from an MPEG stream
    • H04N21/4347 - Demultiplexing of several video streams
    • H04N21/4394 - Processing of audio elementary streams involving operations for analysing the audio stream, e.g. detecting features or characteristics in audio streams
    • H04N21/44008 - Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H04N21/4622 - Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
    • H04N21/812 - Monomedia components thereof involving advertisement data
    • H04N21/816 - Monomedia components thereof involving special video data, e.g. 3D video
    • H04N21/8186 - Monomedia components thereof involving executable data, e.g. software specially adapted to be executed by a peripheral of the client device, e.g. by a reprogrammable remote control
    • G06T19/006 - Mixed reality

Definitions

  • the present invention relates to a system and method for providing an augmented reality service associated with broadcasting and, more particularly, to a system and method for providing an augmented reality service associated with broadcasting, which capture an image output by a broadcasting reception apparatus using an App of a terminal device and output additional information related to the captured image in association with the image.
  • Augmented reality is a technology for showing the real world seen by a user and a virtual world having augmented information in a single image form by overlaying the real world with the virtual world.
  • AR is also called mixed reality (MR) because the real world is overlaid with a virtual world having additional information in real time to produce a single image.
  • a smartphone best implements the AR technology.
  • the smartphone can provide a function of designating information, such as the position and telephone number of a surrounding store, in the form of a three-dimensional image when the camera of the smartphone turns to the store.
  • An image device using a set-top box does not provide a way to immediately check information required by a user even though the information is included in the content being watched. For example, if a user watching a drama wants to know where the street along which a character walks is located, or what clothes the character is wearing, the user inevitably has to look the information up separately using a computer.
  • Likewise, while a user watches a home shopping channel, the image device using a set-top box delivers only the information provided by the show host, one-sidedly, to the user. Furthermore, the image device is limited in that the user can purchase only the product currently being sold, and must purchase it either by an ARS telephone call or through an Internet shopping mall.
  • Patent Document 1 Korean Patent No. 1,111,389 (February 24, 2012)
  • the present invention has been made to solve the above problems occurring in the prior art and an object of the present invention is to provide a system and method for providing an augmented reality service associated with broadcasting, wherein when a TV screen is recognized by the camera of a terminal device while a user watches TV, additional information related to a recognized image is displayed through the terminal device.
  • Another object of the present invention is to provide a system and method for providing an augmented reality service associated with broadcasting, which can recognize which scene of which program an image captured by the terminal device corresponds to.
  • Yet another object of the present invention is to provide a system and method for providing an augmented reality service associated with broadcasting, wherein AR information or VR information related to an image captured by the terminal device is output either synchronized in real time or not synchronized in real time, depending on the type of broadcasting content that is being displayed on a screen.
  • a terminal device including a display unit, a photographing unit, and a broadcasting-associated augmented reality App driving processing unit configured to capture an image output by a broadcasting reception apparatus through the photographing unit when a previously stored broadcasting-augmented reality App is driven and to display additional information related to the captured image on the display unit.
  • the broadcasting-augmented reality App driving processing unit may include an image acquisition module configured to drive the photographing unit and capture an image displayed on a screen of the broadcasting reception apparatus when the broadcasting-augmented reality App is driven, a source information acquisition module configured to obtain source information for recognizing broadcasting content to which the captured image corresponds, a recognition module configured to receive the meta information of the broadcasting content corresponding to the source information from an additional service server, and an augmented AR/VR service module configured to obtain related AR information or VR information based on the received meta information and to display the AR information or VR information in association with the captured image in accordance with regulations defined in the meta information.
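For illustration, the sketch below models the four modules named above as plain functions wired into a single processing loop. Every name and type (CapturedImage, SourceInfo, MetaInfo, BroadcastArAppProcessor, and so on) is a hypothetical stand-in chosen for this example, not an API defined by the patent.

```kotlin
// Illustrative sketch of the App driving processing unit; all names are assumptions.

data class CapturedImage(val frameId: Long)
data class SourceInfo(val kind: String, val value: String)      // EPG id, non-audible frequency, fingerprint, ...
data class MetaInfo(val contentId: String, val arInfoIds: List<String>)
data class ArInfo(val id: String, val payload: String)

class BroadcastArAppProcessor(
    private val capture: () -> CapturedImage,                   // image acquisition module
    private val acquireSource: (CapturedImage) -> SourceInfo,   // source information acquisition module
    private val recognize: (SourceInfo) -> MetaInfo,            // recognition module (asks the additional service server)
    private val fetchAr: (String) -> ArInfo,                    // AR/VR service module: fetch related AR information...
    private val display: (CapturedImage, List<ArInfo>) -> Unit  // ...and display it with the captured image
) {
    // One pass: capture a TV frame, recognize the broadcast it belongs to, overlay AR information.
    fun runOnce() {
        val frame = capture()
        val meta = recognize(acquireSource(frame))
        display(frame, meta.arInfoIds.map(fetchAr))
    }
}

fun main() {
    BroadcastArAppProcessor(
        capture = { CapturedImage(frameId = 1) },
        acquireSource = { SourceInfo("epg", "drama-ep12") },
        recognize = { MetaInfo(it.value, listOf("ar-coat")) },
        fetchAr = { ArInfo(it, "coat worn by the main character") },
        display = { frame, ars -> println("frame ${frame.frameId} + ${ars.map { a -> a.payload }}") }
    ).runOnce()
}
```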
  • the source information may include at least one of an electronic program guide (EPG), a non-audible frequency, image information and a sound generated from a program which is being watched.
  • the AR/VR service module may output the AR information or VR information by synchronizing the AR information or VR information and the broadcasting content in real time or without synchronizing the AR information or VR information and the broadcasting content in real time depending on the type of broadcasting content which is being displayed on a screen.
  • the broadcasting-associated AR App driving processing unit may further include a service providing module configured to provide access address information corresponding to a selected specific object when the specific object is selected in an image in which the AR information or VR information is displayed and to access a corresponding service server according to the access address information.
  • an additional service server including a database configured to store at least one of broadcasting content, meta information of each piece of broadcasting content, additional information and source information, a recognition unit configured to recognize broadcasting content corresponding to source information in the database when a broadcasting content recognition request signal including the source information is received from a terminal device and to transmit meta information of the recognized broadcasting content to the terminal device, and an augmented AR/VR providing unit configured to obtain additional information about a corresponding scene or object from the database when the additional information is requested by the terminal device and to transmit the obtained additional information to the terminal device.
  • the source information may include at least one of an EPG, a non-audible frequency, image information, and a sound generated from a program which is being watched.
  • the additional information may be AR information or VR information.
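A minimal server-side sketch of this arrangement follows, with in-memory maps standing in for the database; the class and field names (AdditionalServiceServer, MetaInfo, ArInfo) are assumptions made for this example.

```kotlin
// Minimal server-side sketch: recognition unit + AR/VR providing unit over an in-memory "database".

data class MetaInfo(val contentId: String, val realTimeSynchronized: Boolean, val arInfoIds: List<String>)
data class ArInfo(val id: String, val payload: String)

class AdditionalServiceServer(
    private val metaBySource: Map<String, MetaInfo>,  // source information -> meta information (recognition unit)
    private val arInfoById: Map<String, ArInfo>       // additional information store (AR/VR providing unit)
) {
    // Recognition unit: map source information (EPG id, fingerprint, ...) to meta information.
    fun recognize(sourceKey: String): MetaInfo? = metaBySource[sourceKey]

    // AR/VR providing unit: return additional information about a scene or object.
    fun provideAdditionalInfo(arInfoId: String): ArInfo? = arInfoById[arInfoId]
}

fun main() {
    val server = AdditionalServiceServer(
        metaBySource = mapOf("epg:ch7:drama-ep12" to MetaInfo("drama-ep12", true, listOf("ar-coat"))),
        arInfoById = mapOf("ar-coat" to ArInfo("ar-coat", "coat worn by the main character"))
    )
    val meta = server.recognize("epg:ch7:drama-ep12")
    println(meta)
    println(meta?.arInfoIds?.mapNotNull(server::provideAdditionalInfo))
}
```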
  • a system for providing an augmented reality service associated with broadcasting including a broadcasting reception apparatus configured to output broadcasting content, a terminal device configured to capture an image output by the broadcasting reception apparatus, display the captured image on a screen, obtain additional information related to the captured image from an additional service server, and display the obtained additional information on the screen, and the additional service server configured to provide the additional information to the terminal device.
  • the terminal device may drive a photographing unit to capture an image displayed on a screen of the broadcasting reception apparatus, may obtain source information for recognizing broadcasting content to which the captured image corresponds, may receive meta information of the broadcasting content corresponding to the source information from the additional service server by requesting the meta information, may obtain related AR information or VR information from the additional service server based on the received meta information, and may display the AR information or VR information in association with the captured image in accordance with regulations defined in the meta information.
  • the terminal device may output the AR information or VR information by synchronizing the AR information or VR information and the broadcasting content in real time or without synchronizing the AR information or VR information and the broadcasting content in real time depending on the type of broadcasting content which is being displayed on a screen.
  • the terminal device may provide access address information corresponding to a selected specific object when the specific object is selected in an image in which the AR information or VR information is displayed, and may access a corresponding service server according to the access address information.
  • the additional service server may recognize broadcasting content corresponding to the source information and transmit the meta information of the recognized broadcasting content to the terminal device.
  • the additional service server may obtain AR information or VR information about a corresponding scene or object and transmit the AR information or VR information to the terminal device.
  • a method of providing, by a terminal device, a broadcasting-associated type augmented reality service including the steps of (a) when a previously stored broadcasting-augmented reality App is driven, driving a photographing unit to capture an image displayed on a screen of a broadcasting reception apparatus, (b) obtaining source information for recognizing broadcasting content to which the captured image corresponds, (c) obtaining the meta information of the broadcasting content corresponding to the source information, and (d) obtaining related AR information or VR information based on the meta information and displaying the AR information or VR information in association with the captured image.
  • the source information may include at least one of an electronic program guide (EPG), a non-audible frequency, image information and a sound generated from a program which is being watched.
  • the method of providing a broadcasting-associated type augmented reality service may further include the step of providing access address information corresponding to a selected specific object when the specific object is selected in an image in which the AR information or VR information is displayed and accessing a corresponding service server according to the access address information after the step (d).
  • the AR information or VR information may be output by synchronizing the AR information or VR information and the broadcasting content in real time or without synchronizing the AR information or VR information and the broadcasting content in real time depending on the type of broadcasting content which is being displayed on a screen.
  • a method of providing a broadcasting-associated type augmented reality service including the steps of (a) when a broadcasting-augmented reality App is executed, driving, by a terminal device, a photographing unit to capture an image output by a broadcasting reception apparatus and obtaining source information for recognizing broadcasting content to which the captured image corresponds, (b) transmitting, by the terminal device, a broadcasting content recognition request signal including the obtained source information to an additional service server, (c) recognizing, by the additional service server, the broadcasting content corresponding to the source information, obtaining meta information of the recognized broadcasting content, and transmitting the obtained meta information to the terminal device, (d) interpreting, by the terminal device, the meta information and requesting additional information from the additional service server if the additional information is necessary, (e) obtaining, by the additional service server, the additional information about a corresponding scene or object and transmitting the obtained additional information to the terminal device, and (f) displaying, by the terminal device, the additional information in association with the captured image.
  • the additional information may be AR information or VR information.
  • the terminal device may output the additional information by synchronizing the additional information and the broadcasting content in real time or without synchronizing the additional information and the broadcasting content in real time depending on the type of broadcasting content which is being displayed on a screen.
  • the method of providing a broadcasting-associated type AR service may further include the step of providing access address information corresponding to a selected specific object when the specific object is selected in an image being displayed in association with the additional information and accessing a corresponding service server according to the access address information after the step (f).
  • the "method of providing a broadcasting-associated type AR service" described above may be implemented in the form of a program, and may be recorded on a recording medium readable by an electronic device or distributed through a program download management device (e.g., a server).
  • When a user selects a specific object in an image being displayed in association with AR information or VR information, the user can access a corresponding service server using access address information corresponding to the selected object, and can display, through his or her terminal device, a specific broadcasting screen or an image of a specific photographed area in which a product the user is interested in appears. Accordingly, the user can obtain information about the corresponding product using an AR service method and immediately access a shopping mall from which the product can be purchased.
  • Effects of the present invention are not limited to the aforementioned effects and may include various other effects within a range that is evident to a person having ordinary skill in the art from the following description.
  • FIG. 1 is a diagram showing a system for providing an augmented reality service associated with broadcasting according to an embodiment of the present invention.
  • FIG. 2 is a diagram for illustrating broadcasting content according to an embodiment of the present invention.
  • FIG. 3 is a block diagram schematically showing the configuration of a terminal device according to an embodiment of the present invention.
  • FIG. 4 is a detailed diagram showing a broadcasting-associated AR App driving processing unit of FIG. 3.
  • FIG. 5 is a block diagram schematically showing the configuration of an additional service server according to an embodiment of the present invention.
  • FIG. 6 is a diagram showing a method of providing a broadcasting-associated type AR service according to an embodiment of the present invention.
  • FIGS. 7 to 14 are exemplary diagrams of screens for illustrating the broadcasting-associated type AR service.
  • 100: broadcasting server, 200: broadcasting reception apparatus, 300: terminal device, 310, 410: communication unit, 366: recognition module, 368: AR/VR service module, 430: recognition unit, 440: AR/VR providing unit
  • broadcasting content: content transmitted by a broadcasting server
  • the broadcasting content may be a concept including video content, and media content of various forms.
  • FIG. 1 is a diagram showing a system for providing an augmented reality service associated with broadcasting according to an embodiment of the present invention
  • FIG. 2 is a diagram for illustrating broadcasting content according to an embodiment of the present invention.
  • the system for providing an augmented reality service associated with broadcasting includes a broadcasting server 100, a broadcasting reception apparatus 200, a terminal device 300 and an additional service server 400.
  • the broadcasting server 100 transmits broadcasting content over a broadcasting network or through the Internet.
  • the broadcasting content may be classified into real-time relay content provided through real-time relay and pre-produced content that has been produced and edited before it is provided to a viewer.
  • the broadcasting content may be classified depending on whether it has been previously produced or whether it is synchronized in real time as in FIG. 2.
  • the broadcasting content may be real-time synchronization pre-produced content that has been previously produced and is synchronized in real time, real-time asynchronous pre-produced content that has been previously produced and is not synchronized in real time, real-time synchronization relay content that has not been previously produced, but is synchronized in real time or real-time asynchronous relay content that has not been previously produced and is not synchronized in real time.
  • the real-time synchronization pre-produced content may include adult AR content, kids AR content and a bidirectional AR advertisement.
  • the real-time asynchronous pre-produced content may include a real estate AR advertisement and local channel VR content.
  • the real-time synchronization relay content may include media commerce and real-time sports information.
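These four categories can be summarized as two independent axes, as in the hypothetical enum sketch below; the type and value names are illustrative only.

```kotlin
// Sketch of the content categories in FIG. 2 as two axes; names are assumptions.

enum class Production { PRE_PRODUCED, RELAY }
enum class Synchronization { REAL_TIME_SYNCHRONIZED, ASYNCHRONOUS }

data class BroadcastingContentType(val production: Production, val sync: Synchronization)

// Examples named in the description for each combination.
fun exampleOf(type: BroadcastingContentType): String = when {
    type.production == Production.PRE_PRODUCED && type.sync == Synchronization.REAL_TIME_SYNCHRONIZED ->
        "adult AR content, kids AR content, bidirectional AR advertisement"
    type.production == Production.PRE_PRODUCED ->
        "real estate AR advertisement, local channel VR content"
    type.sync == Synchronization.REAL_TIME_SYNCHRONIZED ->
        "media commerce, real-time sports information"
    else -> "real-time asynchronous relay content"
}

fun main() {
    println(exampleOf(BroadcastingContentType(Production.PRE_PRODUCED, Synchronization.ASYNCHRONOUS)))
}
```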
  • a broadcasting content provider may previously write meta information for representing or describing a corresponding situation in each scene through the analysis of the entire content and a related scenario before it provides the corresponding content, and may provide the written meta information.
  • a corresponding content provider may analyze the contents and scenario of the drama, may write meta information for describing characters appearing in the drama and clothes worn by the characters in the drama and may provide the meta information to a viewer.
  • In the case of real-time relay content, it is necessary to write meta information for representing or describing an ever-changing situation or a specific event, such as a "home run" or a "goal", in real time and to provide the written meta information.
  • For example, in the case of a baseball relay, a corresponding content provider may write meta information for representing or describing a situation occurring vividly on the spot, for example, whether a ball thrown by the pitcher is a strike or a ball, and whether a hitter makes a hit, hits a home run, or is out, and may provide the written meta information to a viewer.
  • the broadcasting server 100 has to produce AR information (VR information) together when producing broadcasting content and to also produce meta information including information about the output time, output location and processing event of the AR information in the broadcasting content. That is, for a broadcasting-associated type AR service, the broadcasting server 100 also produces not only AR information, but meta information capable of indicating a relation between broadcasting content and the AR information in addition to the broadcasting content.
  • The meta information is data that indicates, for given broadcasting content, whether AR information is present, the location where the AR information is stored, and other information related to the AR information.
  • the meta information may include information about the output time, output location or processing event of the AR information.
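One plausible shape for such meta information is sketched below; the field names (outputTimeMs, outputRegion, processingEvent) are assumptions for exposition, not a format defined by the patent.

```kotlin
// Hedged sketch of meta information carrying output time, output location and processing event.

data class Rect(val x: Double, val y: Double, val width: Double, val height: Double)

data class ArReference(
    val arInfoId: String,          // where the AR information is stored / how to fetch it
    val outputTimeMs: Long?,       // null for content that is not synchronized in real time
    val outputRegion: Rect,        // where on the captured image the AR information is drawn
    val processingEvent: String    // e.g. "show-product-tag", "open-vr-menu" (illustrative)
)

data class MetaInfo(
    val contentId: String,
    val realTimeSynchronized: Boolean,
    val arReferences: List<ArReference>
)

fun main() {
    val meta = MetaInfo(
        contentId = "drama-ep12",
        realTimeSynchronized = true,
        arReferences = listOf(
            ArReference("ar-coat", outputTimeMs = 754_000,
                        outputRegion = Rect(0.62, 0.35, 0.2, 0.4),
                        processingEvent = "show-product-tag")
        )
    )
    println(meta)
}
```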
  • the broadcasting server 100 has to produce AR information (VR information) together when producing broadcasting content, and also produces meta information including information about the output location and processing event of the AR information in the broadcasting content. In this case, it is not necessary to synchronize the output time of the broadcasting content and the output time of the AR information because the broadcasting content and the AR information are not synchronized in real time.
  • the broadcasting server 100 writes AR information for representing or describing an ever-changing situation or a specific event and meta information indicative of a relation between broadcasting content and the AR information in real time, and provides the written meta information.
  • The term "event" is used to include at least any one of a specific situation, a specific scene, and specific contents within the content.
  • the broadcasting server 100 transmits broadcasting content to the broadcasting reception apparatus 200 over a broadcasting network or through the Internet, and transmits broadcasting content, AR information and meta information to the additional service server 400.
  • the broadcasting server 100 transmits broadcasting content (video/audio) for a broadcasting program according to a broadcasting organization table and time synchronization information to the broadcasting reception apparatus 200 over a broadcasting network.
  • the time synchronization information is a time synchronization signal used to synchronize video and audio in the broadcasting reception apparatus 200.
  • the broadcasting server 100 may produce broadcasting content by inserting source information for recognizing the broadcasting content into the broadcasting content.
  • the source information for recognizing the broadcasting content may be a non-audible frequency, for example.
  • the broadcasting server 100 has been illustrated as inputting a non-audible frequency into broadcasting content, but source information for recognizing broadcasting content may be inserted in a separate device.
  • the broadcasting reception apparatus 200 receives broadcasting content from the broadcasting server 100 and displays the received broadcasting content on a TV screen.
  • the broadcasting reception apparatus 200 may be configured in the form of an apparatus capable of transmitting and receiving data broadcasting information, such as a set-top box (STB), TV having an STB embedded therein, DTV or a PC.
  • the broadcasting reception apparatus 200 may select a broadcasting channel required by a user through an embedded tuner, and may receive the broadcasting content of the required broadcasting channel through a radio frequency (RF) or over an Internet protocol (IP) network.
  • the broadcasting reception apparatus 200 receives broadcasting content and meta information including information related to the broadcasting content over a broadcasting network or a communication network.
  • the terminal device 300 captures an image displayed on a screen of the broadcasting reception apparatus 200 using the camera function of the terminal device 300 in real time, enables the captured image to be viewed through a screen of the terminal device 300, and displays AR information or VR information at a corresponding location in a screen of the terminal device 300, thereby enabling an AR advertisement or an AR data service.
  • the terminal device 300 is a user's terminal onto which a broadcasting-associated AR application (App) for an AR implementation, which operates in conjunction with a TV screen, has been downloaded.
  • the terminal device may be configured in a form having a camera function and capable of capturing a screen of the broadcasting reception apparatus 200 through the camera function in real time and of displaying the captured image, such as a smartphone, a mobile phone, iPad, a tablet PC, a notebook or a PC.
  • The broadcasting-associated AR App captures an image displayed on a screen of the broadcasting reception apparatus, identifies in real time which scene of which program the captured image corresponds to, and provides related AR information or VR information.
  • The broadcasting-associated AR App may be received from a download server on the Internet, such as an App store, and installed, or may be software embedded in the terminal device itself.
  • the terminal device 300 photographs a broadcasting screen being output by the broadcasting reception apparatus 200 in the state in which the broadcasting-associated AR App has been executed, displays the photographed screen, recognizes broadcasting content corresponding to a captured image, obtains meta information corresponding to the recognized broadcasting content, secures related additional information based on the meta information, and displays the additional information associated with the broadcasting content in accordance with regulations defined in the meta information.
  • the additional information may be AR information or VR information, but the additional information is hereinafter illustrated as being limited to AR information, for convenience of description.
  • In order to display AR information associated with broadcasting content, the terminal device 300 has to recognize which program (or broadcasting content) a captured image corresponds to. To this end, the terminal device 300 transmits source information for recognizing the program (or broadcasting content) that is being watched to the additional service server 400, and receives the meta information of the recognized broadcasting content from the additional service server 400.
  • the source information for recognizing broadcasting content may include an EPG, a non-audible frequency, image information and a sound generated from a program that is being watched.
  • the additional service server 400 that has received the source information recognizes corresponding broadcasting content by analyzing the source information, and transmits the meta information of the recognized broadcasting content to the terminal device 300.
  • a method for the terminal device 300 to recognize a current watching image (program) is described in more detail below.
  • The terminal device 300 may be paired with the set-top box of the broadcasting reception apparatus 200 and may confirm the TV watching context and the program being watched by recognizing the EPG of the video. That is, by performing pairing with the broadcasting reception apparatus 200, the terminal device 300 receives the EPG information (watching information) of the broadcasting content that is being output by the broadcasting reception apparatus 200, transmits a broadcasting content recognition request signal including the received EPG information to the additional service server 400, and receives meta information corresponding to the EPG information from the additional service server 400. In this case, the terminal device 300 calls out the broadcasting reception apparatus 200 by transmitting a signal according to a predefined wireless communication standard using the unique information of the broadcasting reception apparatus 200.
  • When a response is received from the broadcasting reception apparatus 200, the terminal device 300 performs pairing for a connection with the broadcasting reception apparatus 200.
  • A short-distance communication standard, such as Bluetooth, Zigbee, Wi-Fi or WCDMA, may be used as the wireless communication standard.
  • various wireless communication standards may be selected.
  • Various known technologies may be adopted as the pairing technology.
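The sketch below illustrates the pairing-then-EPG idea with a hypothetical SetTopBoxLink interface; no real set-top box or pairing API is implied, and the names are assumptions for this example.

```kotlin
// Illustrative sketch only: a hypothetical pairing + EPG query flow.

data class EpgInfo(val channelId: String, val programId: String, val positionMs: Long)

interface SetTopBoxLink {
    fun pair(deviceUniqueId: String): Boolean   // call out the STB and wait for a response
    fun currentEpg(): EpgInfo                   // watching information of the content on air
}

class EpgSourceInfoAcquirer(private val stb: SetTopBoxLink) {
    // Returns EPG-based source information only if pairing succeeded.
    fun acquire(deviceUniqueId: String): EpgInfo? =
        if (stb.pair(deviceUniqueId)) stb.currentEpg() else null
}

fun main() {
    // Fake in-memory set-top box for demonstration.
    val fakeStb = object : SetTopBoxLink {
        override fun pair(deviceUniqueId: String) = true
        override fun currentEpg() = EpgInfo("ch7", "drama-ep12", 754_000)
    }
    println(EpgSourceInfoAcquirer(fakeStb).acquire("terminal-300"))
}
```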
  • the terminal device 300 may recognize a current watching image (program) by recognizing a non-audible frequency output by the broadcasting reception apparatus 200 along with broadcasting content.
  • The non-audible frequency may be a sound at a frequency that a person cannot hear, i.e., a frequency outside the audible range of 20-20,000 Hz.
  • the terminal device 300 transmits a broadcasting content recognition request signal including the non-audible frequency to the additional service server 400, and receives meta information corresponding to the non-audible frequency from the additional service server 400.
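One conventional way to detect such a tone on the terminal side is the Goertzel algorithm, which measures the energy of a single frequency bin in a block of microphone samples. The 21 kHz marker frequency and the detection threshold in the sketch below are illustrative assumptions, not values from the patent.

```kotlin
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.sin

// Hedged sketch: detecting an above-audible marker tone with the Goertzel algorithm.
fun goertzelPower(samples: DoubleArray, sampleRateHz: Double, targetHz: Double): Double {
    val omega = 2.0 * PI * targetHz / sampleRateHz
    val coeff = 2.0 * cos(omega)
    var s1 = 0.0
    var s2 = 0.0
    for (x in samples) {
        val s0 = x + coeff * s1 - s2
        s2 = s1
        s1 = s0
    }
    // Power of the target frequency bin after processing the block of samples.
    return s1 * s1 + s2 * s2 - coeff * s1 * s2
}

fun main() {
    val sampleRate = 48_000.0
    val markerHz = 21_000.0                   // assumed marker frequency above the audible range
    val n = 4800                              // 100 ms analysis window
    val withMarker = DoubleArray(n) { i -> 0.1 * sin(2.0 * PI * markerHz * i / sampleRate) }
    val silence = DoubleArray(n)
    val power = goertzelPower(withMarker, sampleRate, markerHz)
    println("marker power=$power, silence power=${goertzelPower(silence, sampleRate, markerHz)}, detected=${power > 1.0}")
}
```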
  • the terminal device 300 may recognize a current watching image (program) using a scheme for extracting image information.
  • The image information may be image data (the whole image or a partial image) displayed on a screen for a predetermined specific time, or image motion change information displayed on a screen for a predetermined specific time. That is, the terminal device 300 may recognize watching data through an API capable of tracking image data indicating which channel has been watched for several minutes.
  • the terminal device 300 transmits a broadcasting content recognition request signal, including image data displayed on a screen for a predetermined specific time, to the additional service server 400, and receives meta information corresponding to the image data from the additional service server 400.
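A simple difference hash ("dHash") over a small grayscale thumbnail is one plausible form of such image information; the sketch below is illustrative only, and a production system would likely use a more robust video fingerprint.

```kotlin
// Hedged sketch: 64-bit difference-hash fingerprint of a 9x8 grayscale thumbnail (row-major, 0..255).
fun dHash(thumb9x8: IntArray): Long {
    require(thumb9x8.size == 72) { "expected a 9x8 grayscale thumbnail" }
    var hash = 0L
    var bit = 0
    for (row in 0 until 8) {
        for (col in 0 until 8) {
            val left = thumb9x8[row * 9 + col]
            val right = thumb9x8[row * 9 + col + 1]
            if (left > right) hash = hash or (1L shl bit)
            bit++
        }
    }
    return hash
}

// Hamming distance between two fingerprints; a small distance suggests the same scene.
fun distance(a: Long, b: Long): Int = java.lang.Long.bitCount(a xor b)

fun main() {
    val frameA = IntArray(72) { i -> (i * 3) % 256 }
    val frameB = IntArray(72) { i -> (i * 3) % 256 }.also { it[10] = 255 }  // nearly identical frame
    println("distance=${distance(dHash(frameA), dHash(frameB))}")            // expect a small value
}
```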
  • the terminal device 300 may confirm watching data by recognizing a sound generated from a program that is being watched.
  • the terminal device 300 transmits a broadcasting content request signal, including audio data corresponding to a sound generated from a program that is being watched, to the additional service server 400, and receives meta information corresponding to the audio data from the additional service server 400.
  • the terminal device 300 that has received meta information for broadcasting content that is being watched analyzes the meta information, receives corresponding AR information by requesting it from the additional service server 400, and displays the received AR information in association with broadcasting content. That is, since the meta information includes information about the AR information, the terminal device 300 may request required AR information from the additional service server 400 based on the analysis of the meta information, and may receive the requested AR information. Thereafter, the terminal device 300 synchronizes the AR information in real time or does not synchronize the AR information in real time depending on the type of broadcasting content that is being displayed on a screen, and outputs the AR information.
  • the terminal device 300 may recognize a captured image, may obtain at least one piece of AR information corresponding to the recognized image, and may provide the at least one piece of AR information so that the AR information is combined with the image and displayed through the broadcasting-associated AR App. Accordingly, the terminal device 300 can display an image with which AR information has been combined.
  • The terminal device 300 may display additional information about the corresponding advertisement on a screen. Furthermore, if the terminal device 300 executes the broadcasting-associated AR App and photographs a screen of a drama while the drama is broadcast by the broadcasting reception apparatus 200, the terminal device 300 may display, on its screen, special content that does not appear in the broadcast itself, as if the TV frame had been extended by 1 inch, may display the profile or news of a main character on the screen when the main character is photographed, or may display information about products worn by a main character on the screen.
  • The terminal device 300 may display a game in which a game character hidden in the specific program appears and points are scored when the character is captured on the screen.
  • the terminal device 300 may detect whether a user has selected a specific object or specific AR information in an image that is being displayed, and may provide access address information corresponding to the selected and detected specific object or specific AR information through the broadcasting-associated AR App, so the terminal device 300 can access a corresponding service server (not shown) according to the access address information. For example, when a user selects a specific object in an image in which AR information has been displayed, the terminal device 300 can be connected to an Internet website from which a product corresponding to the specific object can be purchased.
  • the additional service server 400 receives broadcasting content, meta information and AR information from the broadcasting server 100 and provides meta information or AR information, requested by the terminal device 300, to the terminal device 300.
  • the additional service server 400 recognizes corresponding broadcasting content by analyzing the source information and transmits the meta information of the recognized broadcasting content to the terminal device 300.
  • the additional service server 400 obtains the AR information for a corresponding scene or object and transmits the AR information to the terminal device 300.
  • the AR information may include product information and/or advertisement information associated with the scene or object of broadcasting content.
  • the additional service server 400 is described in detail with reference to FIG. 5.
  • FIG. 3 is a block diagram schematically showing the configuration of the terminal device according to an embodiment of the present invention
  • FIG. 4 is a detailed diagram showing the broadcasting-associated AR App driving processing unit of FIG. 3.
  • the terminal device 300 includes a communication unit 310, an input unit 320, a display unit 330, a storage unit 340, a photographing unit 350, a broadcasting-associated AR App driving processing unit 360 and a control unit 370.
  • The communication unit 310 is an element for transmitting and receiving data to and from various electronic devices. Specifically, the communication unit 310 may connect the terminal device 300 to the broadcasting reception apparatus and the additional service server over a wired communication network or a wireless communication network, and may transmit and receive various data, including an advertisement request and a retargeting advertisement, based on the connection.
  • the communication unit 310 may include various wired communication modules or wireless communication modules and transmit or receive data in accordance with various wireless or wired communication standards.
  • the communication unit 310 may be implemented in a form including various standard communication modules, such as ITU, IEEE, ISO and IEC, and may be implemented in a form including various communication modules in addition to the standard communication modules.
  • The input unit 320 is an element for receiving information from a user. Specifically, the input unit 320 may be used to receive a user's request to use the augmented reality or virtual reality service provided by the additional service server.
  • the input unit 320 may be implemented using an input device, such as a keypad or a touch panel, and may be implemented using various input devices in addition to the input devices. Furthermore, the input unit 320 may also be implemented in the form of a touch screen integrated with the display unit 330.
  • the display unit 330 is an element for displaying various types of information related to the operation of the terminal device 300, and may display various types of information, such as a captured image, augmented reality and virtual reality, through the photographing unit.
  • The display unit 330 may be implemented using various display devices including an LCD and an LED.
  • the storage unit 340 is an element for storing data related to the operation of the terminal device 300.
  • the storage unit 340 may store various data, including data for implementing a UI, data transferred through the communication unit 310 and data inputted through the input unit 320.
  • the storage unit 340 may be implemented in a form that stores the broadcasting-associated AR App and including a variety of types of memory devices.
  • the photographing unit 350 is an element for capturing an image and may be a camera, for example.
  • the broadcasting-associated AR App driving processing unit 360 drives the photographing unit 350.
  • the photographing unit 350 captures an image displayed on a screen of the broadcasting reception apparatus.
  • The broadcasting-associated AR App driving processing unit 360 identifies in real time which scene of which program the captured image corresponds to and provides related AR information or VR information.
  • The broadcasting-associated AR App driving processing unit 360 includes an image acquisition module 362, a source information acquisition module 365, a recognition module 366, an AR/VR service module 368 and a service providing module 369.
  • the image acquisition module 362 drives the photographing unit 350, and the photographing unit 350 captures an image displayed on a screen of the broadcasting reception apparatus.
  • The source information acquisition module 365 obtains source information for recognizing which program (or broadcasting content) the image obtained by the image acquisition module 362 corresponds to.
  • the source information for recognizing broadcasting content may include an EPG, a non-audible frequency, image information, and a sound generated from a program that is being watched.
  • the recognition module 366 transmits the source information, obtained by the source information acquisition module 365, to the additional service server and receives the meta information of broadcasting content corresponding to the source information from the additional service server. Reference may be made to the detailed description of FIG. 1 for a method for the recognition module 366 to recognize the broadcasting content corresponding to the source information.
  • the AR/VR service module 368 secures related AR information based on the meta information received from the recognition module 366, and displays the AR information associated with the broadcasting content in accordance with regulations defined in the meta information. That is, since the meta information includes information about the AR information, the AR/VR service module 368 may request required AR information from the additional service server based on the analysis of the meta information, and may receive the requested AR information.
  • the AR/VR service module 368 outputs the AR information by synchronizing the AR information and the broadcasting content in real time or without synchronizing them in real time depending on the type of broadcasting content that is being displayed on a screen.
  • the AR/VR service module 368 may output AR information by synchronizing the AR information and broadcasting content in real time based on meta information.
  • the AR/VR service module 368 may use a method of buffering broadcasting content through a buffering unit (not shown), for example. That is, in the case of real-time synchronized content, broadcasting content may be buffered because time may be taken to receive AR information requested by the AR/VR service module 368 for the synchronization of the broadcasting content and the AR information. Such buffering is performed by the buffering unit (not shown).
  • the buffering unit may delay the playback of broadcasting content until the entire AR information is fetched for the purpose of media synchronization because some time is taken to fetch the AR information over a communication network based on meta information. Furthermore, for media synchronization, scheduling may be adjusted in such a manner that meta information related to the AR information of a next program is transmitted while a current program is broadcasted and thus the terminal device fetches the AR information from a server or a content provider and performs synchronization.
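The buffering idea can be sketched as two output paths: one that waits for the AR information before rendering (real-time synchronized content) and one that renders immediately and attaches the information when it arrives. The 5-second cap and all names below are assumptions made for this example.

```kotlin
import java.util.concurrent.CompletableFuture
import java.util.concurrent.TimeUnit

// Hedged sketch of synchronized vs. asynchronous output of AR information.
data class ArInfo(val id: String, val payload: String)

class ArSynchronizer(private val fetchAr: (String) -> CompletableFuture<ArInfo>) {

    // Synchronized path: buffer (wait) until the AR information arrives, then render both together.
    fun renderSynchronized(arId: String, renderFrameWithAr: (ArInfo) -> Unit) {
        val ar = fetchAr(arId).get(5, TimeUnit.SECONDS)   // buffering window (assumed 5 s cap)
        renderFrameWithAr(ar)
    }

    // Asynchronous path: render immediately and attach the AR/VR information when it arrives.
    fun renderAsynchronous(arId: String, renderFrame: () -> Unit, attachAr: (ArInfo) -> Unit) {
        renderFrame()
        fetchAr(arId).thenAccept { attachAr(it) }
    }
}

fun main() {
    val sync = ArSynchronizer { id ->
        CompletableFuture.supplyAsync { Thread.sleep(100); ArInfo(id, "product tag") }
    }
    sync.renderSynchronized("ar-coat") { println("frame + ${it.payload} shown together") }
    sync.renderAsynchronous("ar-vr-tour", { println("frame shown now") }, { println("later: ${it.payload}") })
    Thread.sleep(300)  // let the asynchronous fetch complete before the demo exits
}
```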
  • the AR/VR service module 368 may output a VR image related to broadcasting content that is being watched without synchronizing the VR image and the broadcasting content.
  • the AR/VR service module 368 outputs AR information by inserting link information or additional information about one or more regions of content that is being displayed using an AR scheme.
  • When a specific object is selected in an image in which additional information (AR information or VR information) is being displayed, the service providing module 369 provides access address information corresponding to the selected object so that the service server corresponding to the access address information is accessed. Accordingly, a user can purchase a product through the accessed service server.
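A minimal sketch of this selection-to-access flow, assuming each AR object simply carries an access address (URL); the names and the URL are illustrative only.

```kotlin
// Hedged sketch of the service providing module's selection handling.
data class ArObject(val label: String, val accessAddress: String)

class ServiceProvidingModule(private val open: (String) -> Unit) {
    // Called when the user taps a specific object in the AR/VR overlay.
    fun onObjectSelected(obj: ArObject) {
        println("selected '${obj.label}', accessing ${obj.accessAddress}")
        open(obj.accessAddress)   // e.g. hand off to a browser or an in-app commerce page
    }
}

fun main() {
    val module = ServiceProvidingModule { url -> println("opening $url") }
    module.onObjectSelected(ArObject("coat worn by the main character", "https://shop.example.com/coat"))
}
```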
  • the control unit 370 is an element for controlling the operations of various elements of the terminal device 300, including the communication unit 310, the input unit 320, the display unit 330, the storage unit 340, the photographing unit 350 and the broadcasting-associated AR App driving processing unit 360.
  • the control unit 370 may include at least one operation device.
  • the operation device may be a general-purpose central processing unit (CPU), a programmable device (e.g., CPLD or FPGA) implemented suitably for a special purpose, an application-specific integrated circuit (ASIC) or a microcontroller chip.
  • Each of the elements of the terminal device 300 may be implemented in the form of a software module or hardware module executed by a processor or may be implemented in a form in which a software module and a hardware module are combined.
  • the software module, the hardware module and/or the form in which a software module and a hardware module are combined, which is executed by a processor as described above, may be implemented as a hardware system (e.g., a computer system).
  • FIG. 5 is a block diagram schematically showing the configuration of the additional service server according to an embodiment of the present invention.
  • the additional service server 400 includes a communication unit 410, a database (DB) 420, a recognition unit 430, an AR/VR providing unit 440 and a control unit 450.
  • the communication unit 410 is an element for transmitting and receiving data to and from various electronic devices. Specifically, the communication unit 410 may be connected to two or more devices and may transmit and receive various types of information, such as meta information and AR information.
  • the DB 420 stores broadcasting content, meta information, and additional information about each scene or object of each piece of broadcasting content and also stores source information for the recognition of corresponding content for each piece of broadcasting content.
  • The additional information may include AR information or VR information.
  • the source information may include an EPG, a non-audible frequency, image information, and a sound generated from a program that is being watched.
  • the recognition unit 430 recognizes corresponding broadcasting content by analyzing the source information and transmits the meta information of the recognized broadcasting content to the terminal device.
  • the recognition unit 430 recognizes broadcasting content corresponding to the EPG information by searching the DB 420 and transmits the meta information of the recognized broadcasting content to the terminal device. Furthermore, if the source information is a non-audible frequency, the recognition unit 430 recognizes broadcasting content corresponding to the non-audible frequency by searching the DB 420 and transmits the meta information of the recognized broadcasting content to the terminal device. Furthermore, if the source information is image data, the recognition unit 430 recognizes broadcasting content corresponding to the image data by searching the DB 420 and transmits the meta information of the recognized broadcasting content to the terminal device. Furthermore, if the source information is audio data, the recognition unit 430 recognizes broadcasting content corresponding to the audio data by searching the DB 420 and transmits the meta information of the recognized broadcasting content to the terminal device.
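The dispatch logic amounts to looking up meta information by whichever kind of source information arrived, as in this sketch; the maps stand in for DB 420 and all type names are assumptions.

```kotlin
// Hedged sketch of the recognition unit's dispatch over the four kinds of source information.
sealed class SourceInfo {
    data class Epg(val programId: String) : SourceInfo()
    data class NonAudibleFrequency(val hz: Int) : SourceInfo()
    data class ImageFingerprint(val hash: Long) : SourceInfo()
    data class AudioFingerprint(val hash: Long) : SourceInfo()
}

data class MetaInfo(val contentId: String)

class RecognitionUnit(
    private val byEpg: Map<String, MetaInfo>,
    private val byFrequency: Map<Int, MetaInfo>,
    private val byImageHash: Map<Long, MetaInfo>,
    private val byAudioHash: Map<Long, MetaInfo>
) {
    fun recognize(source: SourceInfo): MetaInfo? = when (source) {
        is SourceInfo.Epg -> byEpg[source.programId]
        is SourceInfo.NonAudibleFrequency -> byFrequency[source.hz]
        is SourceInfo.ImageFingerprint -> byImageHash[source.hash]
        is SourceInfo.AudioFingerprint -> byAudioHash[source.hash]
    }
}

fun main() {
    val unit = RecognitionUnit(
        byEpg = mapOf("drama-ep12" to MetaInfo("drama-ep12")),
        byFrequency = mapOf(21_000 to MetaInfo("home-shopping-live")),
        byImageHash = emptyMap(), byAudioHash = emptyMap()
    )
    println(unit.recognize(SourceInfo.NonAudibleFrequency(21_000)))
}
```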
  • the AR/VR providing unit 440 obtains the additional information about a corresponding scene or object and transmits it to the terminal device.
  • the additional information may include product information and/or advertisement information related to a scene or object of broadcasting content.
  • the control unit 450 is an element for controlling the operations of various elements of the additional service server 400, including the communication unit 410, the DB 420, the recognition unit 430 and the AR/VR providing unit 440.
  • the control unit 450 may include at least one operation device.
  • the operation device may be a general-purpose central processing unit (CPU), a programmable device (e.g., CPLD or FPGA) implemented suitably for a special purpose, an application-specific integrated circuit (ASIC) or a microcontroller chip.
  • Each of the elements of the additional service server 400 may be implemented in the form of a software module or hardware module executed by a processor or may be implemented in a form in which a software module and a hardware module are combined.
  • the software module, the hardware module and/or the form in which a software module and a hardware module are combined, which is executed by a processor as described above, may be implemented as a hardware system (e.g., a computer system).
  • FIG. 6 is a diagram showing a method of providing a broadcasting-associated type AR service according to an embodiment of the present invention
  • FIGS. 7 to 14 are exemplary diagrams of screens for illustrating the broadcasting-associated type AR service.
  • the broadcasting server transmits broadcasting content to the broadcasting reception apparatus (S602).
  • the broadcasting reception apparatus outputs the broadcasting content (S604).
  • the terminal device drives the photographing unit to capture an image of the broadcasting content (S608) and obtains source information for recognizing the broadcasting content in the captured image (S610).
  • the source information may include an EPG, a non-audible frequency, image information, and a sound generated from a program that is being watched.
  • the terminal device transmits a broadcasting content recognition request signal, including the obtained source information, to the additional service server (S612).
  • the broadcasting content recognition request signal may include the source information and terminal ID information.
  • the additional service server recognizes broadcasting content corresponding to the source information by analyzing the source information, obtains the meta information of the recognized broadcasting content, and transmits the obtained meta information to the terminal device (S614, S616).
  • the terminal device interprets the meta information (S618), requests and receives the corresponding additional information from the additional service server (S620, S622), and displays the received additional information in association with the image being displayed on the screen (S624). That is, since the meta information includes information about the additional information, the terminal device may request the required additional information from the additional service server based on its analysis of the meta information and receive it. Thereafter, depending on the type of broadcasting content being displayed on the screen, the terminal device outputs the additional information either synchronized with the broadcasting content in real time or without real-time synchronization. A terminal-side sketch of steps S608 to S624 is given after this description.
  • When the broadcasting-associated AR App of the terminal device is executed and photographs adult content output by the broadcasting reception apparatus as in FIG. 7(a), the terminal device recognizes a mosaic screen indicated by A and displays content not having a mosaic pattern as in (b).
  • When the broadcasting-associated AR App of the terminal device is executed and photographs education content for kids output by the broadcasting reception apparatus as in FIG. 8(a), the terminal device recognizes a captured image and displays additional information, such as music, as in (b).
  • a trigger and guide for AR recognition are exposed.
  • the terminal device recognizes a trigger for AR recognition and displays the inside of a vehicle on a screen with augmented reality as in (c).
  • the terminal device displays a VR image menu for watching VR images related to the attractions of the local channel being watched.
  • the terminal device displays a pre-produced VR image on a screen as in (b).
  • the terminal device displays a VR image menu for watching VR images related to the apartment being watched.
  • the terminal device displays a pre-produced VR image of the inside of the apartment on a screen as in (b).
  • the VR image related to the broadcasting content can be checked in an asynchronous manner through the broadcasting-associated AR App of the terminal device.
  • When the broadcasting-associated AR App of the terminal device is executed and photographs broadcasting content output by the broadcasting reception apparatus as in FIG. 12(a), the terminal device exposes a related product tag on a captured image as in (b). When a user clicks on the product tag, the terminal device moves to a detailed page (media commerce page). The user can purchase the corresponding product through the detailed page.
  • the terminal device displays additional information (e.g., the current score, player information, and highlight images) related to the corresponding sports broadcast in a PIP form as in (b).
  • the terminal device may display a game character hidden in a screen as in (b).
  • a user may obtain the displayed game character, store or bring up the obtained game character, turn it into an emoticon, and use the emoticon in a messenger App.
  • the method of providing a broadcasting-associated type AR service may be written in the form of a program, and pieces of code and code segments forming the program may be easily inferred by a programmer skilled in the art. Furthermore, a program regarding the "method of providing a broadcasting-associated type AR service" may be stored in information storage media (or readable media) and may be read and executed by an electronic device.
  • the technological characteristics described in this specification, and an implementation for executing them, may be implemented using a digital electronic circuit; using computer software, firmware, or hardware including the structure described in this specification and structural equivalents thereof; or using a combination of one or more of them.
  • the implementation for executing the technological characteristics described in this specification may be implemented in the form of a computer program product, that is, a module of computer program instructions encoded on a program storage medium, in order to control the operation of a processing system or for execution by the processing system.
  • the term "system" or "server" covers all apparatuses, devices, and machines for processing data, including, for example, apparatuses having a processor, a computer, or multiple processors or computers.
  • in addition to hardware, the processing system may include code that forms an execution environment for a computer program, for example, code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
  • a computer program (also known as a program, software, a software application, a script, or code) may be written in any form of programming language, including a compiled or interpreted language or a declarative and/or procedural language, and may be implemented in any form, including an independent program, a module, a component, a subroutine, or other units suitable for use in a computer environment.
  • the elements for executing the technological characteristics of the present invention, included in the block diagrams and flowcharts shown in the accompanying drawings of this specification, denote logical boundaries between the elements.
  • the illustrated elements and their functions may be executed in the form of an independent software module, a monolithic software structure, code, a service, or a combination thereof, stored in a medium executable by a computer including a processor capable of executing stored program code and instructions, whereby the functions of the illustrated elements may be implemented. Accordingly, all such embodiments should be construed as falling within the scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Computer Hardware Design (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Information Transfer Between Computers (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
PCT/KR2017/011881 2016-10-27 2017-10-26 System and method for providing augmented reality service associated with broadcasting WO2018080180A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020160140725A KR101850482B1 (ko) 2016-10-27 2016-10-27 System and method for broadcasting-associated augmented reality service
KR10-2016-0140725 2016-10-27

Publications (1)

Publication Number Publication Date
WO2018080180A1 true WO2018080180A1 (en) 2018-05-03

Family

ID=62023790

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/011881 WO2018080180A1 (en) 2016-10-27 2017-10-26 System and method for providing augmented reality service associated with broadcasting

Country Status (2)

Country Link
KR (1) KR101850482B1 (ko)
WO (1) WO2018080180A1 (ko)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020015971A1 (en) * 2018-07-19 2020-01-23 Arcelik Anonim Sirketi A television and a virtual reality headset adapted to be used with the same
CN114268840A (zh) * 2021-12-20 2022-04-01 中国电信股份有限公司 Video push method and apparatus, storage medium, and electronic device

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102123597B1 (ko) * 2018-09-18 2020-06-16 전남대학교산학협력단 Method and terminal for providing manufacturing service information in a dual-reality environment in which augmented reality and virtual reality are integrated
KR102255635B1 (ko) * 2018-10-23 2021-05-25 (주)티비스톰 Method for transmitting personalized multiple streams and apparatus supporting the same
KR102175535B1 (ko) * 2018-10-26 2020-11-06 주식회사 엘지유플러스 System for providing TV program information using augmented reality and method for providing TV program information thereof
KR102208916B1 (ko) 2019-03-12 2021-01-27 최재호 Broadcast program recognition system based on image recognition
KR102218843B1 (ko) 2019-11-19 2021-02-22 광운대학교 산학협력단 Multi-camera augmented reality broadcasting system based on overlapping layers using a stereo camera and method for providing the same
CN111208962A (zh) * 2019-12-30 2020-05-29 上海曼恒数字技术股份有限公司 Display device connection control method and system
KR102299565B1 (ko) * 2021-04-27 2021-09-08 (주)비코 Method for recognizing and image-processing a person object in real time in a real-time broadcast video, and apparatus for performing the method
KR102574730B1 (ko) * 2021-09-30 2023-09-06 주식회사 엘지유플러스 Method for providing an augmented reality screen and a remote control using AR glasses, and apparatus and system therefor

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120004320A (ko) * 2010-07-06 2012-01-12 김덕중 Method for providing augmented reality linked to a TV screen in a mobile environment, and additional service server and broadcasting system therefor
KR20120071483A (ko) * 2010-12-23 2012-07-03 한국전자통신연구원 System and method for broadcast AR advertisement service based on media ID matching
US20140059604A1 (en) * 2012-08-22 2014-02-27 Electronics And Telecommunications Research Institute Terminal, system and method for providing augmented broadcasting service
US20140112265A1 (en) * 2012-10-19 2014-04-24 Electronics And Telecommunications Research Institute Method for providing augmented reality, and user terminal and access point using the same
KR20150083355A (ko) * 2014-01-09 2015-07-17 한국전자통신연구원 Method, apparatus and system for providing augmented media service

Also Published As

Publication number Publication date
KR101850482B1 (ko) 2018-04-19

Similar Documents

Publication Publication Date Title
WO2018080180A1 (en) System and method for providing augmented reality service associated with broadcasting
WO2018186592A1 (en) Electronic device and operating method thereof
WO2014025219A1 (en) Portable terminal device and method for operating the same
WO2013073748A1 (en) Display device and method for providing content using the same
WO2018128298A1 (en) Electronic apparatus and controlling method thereof
WO2019054791A1 (ko) Method and device for executing content
WO2018048178A1 (en) Display device
WO2015005722A1 (en) Mobile device, display apparatus and method for sharing contents thereof
WO2014178507A1 (en) Display apparatus and searching method
WO2016137134A1 (en) Broadcast receiving apparatus and method for controlling the same
WO2018131806A1 (en) Electronic apparatus and method of operating the same
WO2019216554A1 (en) Electronic apparatus and controlling method thereof
WO2015182893A1 (en) Apparatus and method for providing information
WO2018093138A1 (en) Electronic apparatus and method of operating the same
WO2018066938A1 (ko) Display device, control method therefor, and information providing system
WO2020071870A1 (en) Image display device and method of providing broadcast program information
WO2015182844A1 (ko) Display device, user terminal device, server, and control method therefor
WO2014189289A1 (ko) Method and device for providing associated service
WO2017188568A1 (en) Display device for providing scrap function and method of operating the same
WO2011083951A2 (ko) Playback control system and playback control method using a terminal device
WO2016080585A1 (en) System and method for providing cloud based user interfaces
WO2012020945A2 (ko) Method for inputting data on an image display device and image display device thereof
WO2021172713A1 (ko) Electronic device and control method therefor
EP3494704A1 (en) Electronic apparatus and method of operating the same
WO2021029453A1 (ko) Display device and operating method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17866146

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 06/09/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17866146

Country of ref document: EP

Kind code of ref document: A1