KR102055198B1 - Recording medium recording program for method of providing augmented reality contents during video call service, and apparatus therefor - Google Patents


Info

Publication number
KR102055198B1
Authority
KR
South Korea
Prior art keywords
augmented reality
reality content
terminal
video call
image data
Prior art date
Application number
KR1020150004019A
Other languages
Korean (ko)
Other versions
KR20160086560A (en)
Inventor
Kim Ji-hoon (김지훈)
Kang Sang-cheol (강상철)
Original Assignee
SK Telecom Co., Ltd. (에스케이텔레콤 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SK Telecom Co., Ltd.
Priority to KR1020150004019A priority Critical patent/KR102055198B1/en
Publication of KR20160086560A publication Critical patent/KR20160086560A/en
Application granted granted Critical
Publication of KR102055198B1 publication Critical patent/KR102055198B1/en

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 - Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/20 - Services signalling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel
    • H04W4/21 - Services signalling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel for social networking applications
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/14 - Systems for two-way working
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W24/00 - Supervisory, monitoring or testing arrangements
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W88/00 - Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W88/02 - Terminal devices

Abstract

The present invention relates to a video call service, and more particularly, to a computer-readable recording medium recording a program for a method of providing augmented reality content during a video call, and an apparatus therefor.
According to an embodiment of the present invention, a program for a method of providing augmented reality content during a video call causes a terminal to transmit a video call request message to a counterpart terminal with which the video call is to be performed; to check network status information when the video call with the counterpart terminal is connected; and, when the network status information indicates that augmented reality content can be transmitted, to generate augmented reality content corresponding to the image data the terminal captures and generates, and to transmit the augmented reality content to the counterpart terminal together with the image data.

Description

A computer-readable recording medium recording a program for a method of providing augmented reality content during a video call, and an apparatus therefor.

The present invention relates to a video call service, and more particularly, to a computer-readable recording medium recording a program for a method of providing augmented reality content during a video call, and an apparatus therefor.

The contents described in this section merely provide background information on the present embodiment and do not constitute prior art.

Recently, domestic mobile communication has been shifting from the 3G WCDMA service to LTE owing to the growth of Long Term Evolution (LTE) networks.

Since an LTE network supports all-IP communication between entities, all traffic is carried over IP, from the radio link connecting a user terminal to an eNodeB (base station) to the packet data network (PDN) connecting to a service entity.

However, when a voice call occurs while a terminal is attached to an LTE network, the voice call service is provided through circuit switching according to the CS fallback scheme, which is a drawback.

To address this problem, the VoLTE service, which provides faster call connection and clearer sound quality than existing voice calls, has recently been attracting attention.

Voice over LTE (VoLTE) is a service that supports voice and video calls using a packet bearer in an LTE network. The terminal attaches to the LTE network, completes subscriber authentication and location update, is assigned a session and bearer for multimedia service, and then receives packet-based voice and video call service at an appropriate quality-of-service level.

However, VoLTE-based video telephony services to date only support transmission and reception of the captured video data and audio data exchanged between the sender and receiver, and do not support various additional services.

Korean Patent Publication No. 10-2012-0030540, published March 28, 2012 (title: Mobile communication method, mobile communication system, subscriber management server device and switching center)

The present invention has been proposed to solve the above-described problems, and an object of the present invention is to provide a computer-readable recording medium recording a program for a method of providing augmented reality content during a video call, and an apparatus therefor.

In particular, an object of the present invention is to provide a computer-readable recording medium recording a program for a method of providing augmented reality content during a video call, and an apparatus therefor, in which the video data and audio data of the video call and the augmented reality content corresponding to the video data are transmitted through different bearers of the same Access Point Name, thereby providing augmented reality content seamlessly.

In addition, an object of the present invention is to provide augmented reality content in a variety of ways during a video call, by determining which entity generates the content according to network conditions, or by dynamically generating augmented reality content according to the tracked object or user settings.

However, the objects of the present invention are not limited to the above, and other objects not mentioned will be clearly understood from the following description.

A computer-readable recording medium according to an embodiment of the present invention for achieving the above object records a program for a method of providing augmented reality content during a video call, the method comprising: transmitting, by a terminal, a video call request message to a counterpart terminal with which the video call is to be performed; checking, by the terminal, network status information when the video call with the counterpart terminal is connected; and, when the network status information indicates that augmented reality content can be transmitted, generating augmented reality content corresponding to the image data captured by the terminal and transmitting the generated augmented reality content to the counterpart terminal together with the image data.
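The claimed steps reduce to a simple decision flow: request the call, check network status once connected, and attach AR content only when the network can carry it. The sketch below is illustrative only; the class and field names (`Terminal`, `NetworkStatus`, `ar_capable`) are assumptions, not names from the patent.

```python
# Hypothetical sketch of the claimed method's control flow.
from dataclasses import dataclass

@dataclass
class NetworkStatus:
    ar_capable: bool  # whether current conditions allow AR content transfer

class Terminal:
    def __init__(self, status: NetworkStatus):
        self.status = status
        self.sent = []  # record of transmissions, for illustration only

    def start_video_call(self, peer: str, image_data: bytes):
        self.sent.append(("INVITE", peer))           # video call request message
        if self.status.ar_capable:                   # network status check
            ar = self.generate_ar_content(image_data)
            self.sent.append(("MEDIA+AR", image_data, ar))  # image data + AR
        else:
            self.sent.append(("MEDIA", image_data))  # image data only

    def generate_ar_content(self, image_data: bytes) -> dict:
        # stand-in for object tracking/recognition on the captured frame
        return {"type": "overlay", "source_len": len(image_data)}

t = Terminal(NetworkStatus(ar_capable=True))
t.start_video_call("peer-terminal", b"frame")
```

When `ar_capable` is false, the same call path degrades to plain video, matching the fallback described in the claims below.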

In this case, the checking of the network status information may be performed by the terminal receiving the network status information from a quality management server.

In addition, in the transmitting to the counterpart terminal, when the network status information indicates that augmented reality content cannot be transmitted, only the image data captured by the terminal may be transmitted to the counterpart terminal.

The transmitting to the counterpart terminal may include: driving a camera module when the network status information indicates that augmented reality content can be transmitted, and tracking and recognizing a preset object in the image data captured in real time through the camera module; generating augmented reality content for the preset object as the terminal tracks and recognizes it; and transmitting, by the terminal, the generated augmented reality content to the counterpart terminal together with the image data.
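The per-frame loop this step describes can be sketched as follows. `detect_object` is a hypothetical stand-in for a real tracker/recognizer; the patent does not specify a recognition algorithm.

```python
# Illustrative per-frame loop: track a preset object in captured frames
# and emit AR content only for frames in which the object is recognized.

def detect_object(frame: str, target: str) -> bool:
    # stand-in recognizer: "recognizes" the target if its label appears
    return target in frame

def process_frames(frames, target="face"):
    out = []
    for frame in frames:
        if detect_object(frame, target):             # track and recognize
            ar = {"target": target, "frame": frame}  # generate AR content
            out.append((frame, ar))                  # send frame + AR
        else:
            out.append((frame, None))                # send frame only
    return out

result = process_frames(["face@10,20", "empty", "face@12,22"])
```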

The transmitting of the video call request message may include transmitting, by the terminal, the video call request message to the counterpart terminal through a designated bearer of a designated Access Point Name; the terminal may then transmit the augmented reality content to the counterpart terminal either through the same bearer of that Access Point Name over which the video call request message was transmitted, or through a bearer of that Access Point Name designated for transmitting and receiving augmented reality content.
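The bearer choice above is a two-way branch within one APN. A minimal sketch, with illustrative bearer names not taken from the patent:

```python
# Within one APN, AR content may ride the same bearer as the video call
# or a bearer designated for AR traffic. Bearer identifiers are invented
# for illustration.

def select_ar_bearer(apn: str, has_dedicated_ar_bearer: bool) -> str:
    if has_dedicated_ar_bearer:
        return f"{apn}/bearer-ar"    # bearer designated for AR content
    return f"{apn}/bearer-call"      # reuse the video-call bearer
```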

In addition, in the transmitting to the counterpart terminal, the terminal may variably adjust the bit rate of the augmented reality content according to the network status information and transmit it to the counterpart terminal at the adjusted rate.

In addition, the augmented reality content may include at least one of the following fields: content type, content display coordinates, content delivery type, content quality, and content delivery information.
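One possible concrete shape for those fields is sketched below. The patent only names the fields; the types, example values, and encoding here are assumptions for illustration.

```python
# Hypothetical record type for the AR-content fields listed above.
from dataclasses import dataclass, asdict

@dataclass
class ARContent:
    content_type: str      # e.g. "sticker", "3d-model", "text" (assumed)
    display_coords: tuple  # (x, y) position relative to the video frame
    delivery_type: str     # e.g. "inline" payload vs. "url" reference
    quality: str           # e.g. "high", "medium", "low"
    delivery_info: str     # payload or reference needed to fetch/render

payload = asdict(ARContent("sticker", (120, 80), "inline", "high", "heart.png"))
```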

In addition, the method may further include receiving, by the terminal, either the video data transmitted from the counterpart terminal connected to the video call, or that video data together with augmented reality content generated corresponding to it, under the control of the quality management server according to the network status information.

In this case, when receiving only image data from the counterpart terminal, the terminal may itself generate augmented reality content corresponding to the image data.

An apparatus supporting the method of providing augmented reality content during a video call according to an embodiment of the present invention for achieving the above object includes: a call processing module that, when a video call is connected to a counterpart terminal, checks network status information and, if augmented reality content transmission is possible, controls augmented reality content generated by an AR content generation module to be transmitted to the counterpart terminal together with the image data; and the AR content generation module, configured to generate augmented reality content corresponding to the image data at the request of the call processing module.
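The two-module split of the apparatus claim can be sketched as below. The interfaces are hypothetical; the patent specifies only the division of responsibility, not method signatures.

```python
# Illustrative two-module apparatus: the call processing module decides
# based on network status; the AR content generation module is called into.

class ARContentGenerationModule:
    def generate(self, image_data: bytes) -> dict:
        return {"overlay_for": len(image_data)}  # stand-in for real AR

class CallProcessingModule:
    def __init__(self, ar_module: ARContentGenerationModule):
        self.ar_module = ar_module

    def on_call_connected(self, image_data: bytes, ar_capable: bool) -> dict:
        if ar_capable:                                  # network status check
            ar = self.ar_module.generate(image_data)    # request generation
            return {"image": image_data, "ar": ar}      # send both
        return {"image": image_data}                    # send image only

call = CallProcessingModule(ARContentGenerationModule())
```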

According to the computer-readable recording medium recording a program for a method of providing augmented reality content during a video call of the present invention, augmented reality content can be provided during a video call.

In addition, according to the present invention, by transmitting the video data and audio data of the video call and the augmented reality content corresponding to the video data through different bearers of the same Access Point Name, augmented reality content can be provided more seamlessly.

In addition, according to the present invention, since the originating terminal, the server, or the receiving terminal generates the augmented reality content according to network conditions, augmented reality content can be provided more flexibly during a high-quality video call.

In addition, various effects other than those described above may be disclosed directly or implicitly in the detailed description of the embodiments of the present invention below.

FIG. 1 is a schematic configuration diagram of a communication system to which a method of providing augmented reality content during a video call is applied, according to an exemplary embodiment of the present invention.
FIG. 2 is a block diagram showing the main configuration of a terminal according to an embodiment of the present invention.
FIG. 3 is a block diagram showing the main configuration of a quality management server according to an embodiment of the present invention.
FIG. 4 is a data flow diagram schematically illustrating an initial call connection procedure according to an embodiment of the present invention.
FIG. 5 is a flowchart illustrating a method of providing augmented reality content during a video call according to an embodiment of the present invention.
FIG. 6 is a data flow diagram illustrating a method of providing augmented reality content during a video call according to an embodiment of the present invention.
FIG. 7 is a data flow diagram illustrating a method of providing augmented reality content during a video call according to another embodiment of the present invention.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. In describing the operating principle of the preferred embodiments, detailed descriptions of related known functions or configurations will be omitted when they would unnecessarily obscure the subject matter of the present invention; omitting such unnecessary description conveys the core of the invention more clearly. In addition, the present invention may be modified in various ways and may have various embodiments; specific embodiments are illustrated in the drawings and described in detail, but this is not intended to limit the present invention to those embodiments. All changes, equivalents, and substitutes included in the spirit and technical scope of the present invention are to be understood as included.

In addition, terms including ordinal numbers, such as first and second, are used to describe various components, but only to distinguish one component from another, not to limit the components. For example, without departing from the scope of the present invention, a second component may be referred to as a first component, and similarly, a first component may be referred to as a second component.

In addition, when a component is referred to as being "connected" or "coupled" to another component, it may be connected or coupled logically or physically. In other words, a component may be directly connected or coupled to another component, but it should be understood that still other components may exist in between, so the connection or coupling may also be indirect.

In addition, the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In addition, the terms "comprises" or "having" used herein indicate the presence of the features, numbers, steps, operations, components, parts, or combinations thereof described in the specification, and do not exclude in advance the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

Now, a computer-readable recording medium recording a program for a method of providing augmented reality content during a video call and an apparatus therefor according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings. In this case, the same reference numerals are used for parts having similar configurations and functions throughout the drawings, and redundant descriptions thereof will be omitted.

In addition, for convenience of description, the method of providing augmented reality content during a video call according to an embodiment of the present invention will be described using the example of an operation performed in a Long Term Evolution (LTE) communication system. However, the method of providing augmented reality content during a video call of the present invention is not limited to the LTE communication system, and it is obvious that the present invention may also be applied to equivalent or other communication systems.

Hereinafter, a communication system to which a method for providing augmented reality content during a video call according to an embodiment of the present invention is applied will be described.

1 is a schematic configuration diagram of a communication system to which a method for providing augmented reality content during a video call is applied according to an exemplary embodiment of the present invention.

Referring to FIG. 1, a system 1000 to which a method of providing augmented reality content during a video call according to an exemplary embodiment of the present invention is applied may be configured through connections between terminals 100, an access network, and a core network.

Herein, the terminals 100a and 100b (hereinafter referred to as 100 when no distinction is required) are user devices capable of performing a video call according to a user's manipulation. The terminal 100 may perform voice or data communication via the access networks 510a and 510b (hereinafter referred to as 510 when no distinction is required) and the core network 520, and may transmit and receive various information by interworking with the quality management server 200 through a Packet Data Network (PDN) 530, i.e., the Internet. For this purpose, the terminal 100 of the present invention may include a browser for transmitting and receiving information, a memory for storing programs and protocols, a microprocessor for executing and controlling various programs, and the like; in particular, it may include a session connection module, for example a VoLTE module, for using the video call service according to an embodiment of the present invention.

With such terminals 100, when the calling terminal 100a requests a VoLTE-based video call connection to the receiving terminal 100b and the receiving terminal 100b accepts it, the video call service can be used with the counterpart terminal 100b.

The communication network of the present invention supporting this process may include an access network 510 and a core network 520. Here, the access network 510 is divided into an originating access network 510a and a receiving access network 510b, and it is assumed that both sides use the same core network 520; however, the core network 520 may also be configured to be divided into an originating side and a receiving side.

Here, the access network 510 is a user access network supporting communication with each terminal 100 and may preferably be a radio access network. The core network 520 is a core system for providing the VoLTE service to terminals 100 connected to the access network 510, and may support communication between the terminals 100 connected to the access network 510. The core network 520 also supports the connection with the PDN 530, an external packet data network. For example, the core network 520 may upload packet data to the quality management server 200 connected through the PDN 530, and may download packet data from the PDN 530 to the terminal 100.

Describing each of the above networks in more detail, first, the access network 510 may include a plurality of base stations, the eNBs 511. The plurality of eNBs 511 support next-generation technologies and services such as LTE, performing functions such as radio frequency (RF) conversion, transmit/receive signal strength and quality measurement, baseband signal processing, and channel card resource management. The eNB 511 ensures stable connection with the plurality of terminals 100. For example, the eNBs 511 may be geographically distributed to support the connection of terminals 100 within their communication range, and in particular may process connection requests of the terminals 100 according to a wireless communication scheme.

In addition, the eNB 511 may collect and analyze, from the terminal 100, channel measurement result information for cells adjacent to the current cell to determine and command a handover. To this end, the eNB 511 may support control protocols such as the Radio Resource Control protocol related to radio resource management.

Specifically, the core network 520 is a network system that performs main functions for mobile communication services, such as call processing, mobility control, and switching between access networks 510, and may be implemented with various functional elements. As shown in FIG. 1, it may include a Packet Data Network Gateway (PDN GW or P-GW) 523, a Serving Gateway (S-GW) 522, and a Mobility Management Entity (MME) 521. The above-described functional elements may be implemented in one server device, or in separate independent devices as shown in the drawing.

First, the MME 521 is a functional element for managing the mobility of the terminal 100. For example, the MME 521 manages the terminal 100 in idle mode, selects the S-GW 522 and the P-GW 523 to support the connection of the terminal 100 according to its subscriber information, in particular its Access Point Name (APN) information, and controls the session and bearer of the terminal 100 through the selected S-GW 522 and P-GW 523. For example, when the terminal 100 belongs to a user subscribed to a general LTE service, the MME 521 checks the APN corresponding to the general LTE service in association with an HSS (not shown) and selects the S-GW 522 and P-GW 523 to process the corresponding LTE data session; when the terminal 100 belongs to a user subscribed to the VoLTE service, the MME 521 selects the S-GW 522 and P-GW 523 to process the corresponding VoLTE data session, and controls the session and bearer necessary for the terminal 100 to be set up through the selected S-GW 522 and P-GW 523. Thereafter, under the control of the MME 521, the terminal 100 is allocated an LTE data session (e.g., APN: lte.sktelecom.com) or a VoLTE data session (e.g., APN: volte.sktelecom.com or ims.sktelecom.com), the necessary information may be transmitted, and a request generated by the terminal 100 may be processed by the corresponding S-GW 522 and P-GW 523 selected through the MME 521.
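The APN-based selection the paragraph walks through can be sketched as a two-stage lookup: subscribed service to APN, then APN to an S-GW/P-GW pair. The APN strings are the examples from the text; the mapping tables and gateway names are invented for illustration.

```python
# Simplified sketch of APN-based gateway selection by the MME.

SERVICE_TO_APN = {
    "lte": "lte.sktelecom.com",
    "volte": "volte.sktelecom.com",   # or ims.sktelecom.com per the text
}

APN_TO_GATEWAYS = {
    "lte.sktelecom.com": ("S-GW-1", "P-GW-1"),
    "volte.sktelecom.com": ("S-GW-2", "P-GW-2"),
}

def select_gateways(subscribed_service: str):
    apn = SERVICE_TO_APN[subscribed_service]        # from subscriber info (HSS)
    sgw, pgw = APN_TO_GATEWAYS[apn]                 # gateways for that APN
    return apn, sgw, pgw
```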

In addition, when establishing a session and bearer, the MME 521 of the present invention may control a separate bearer to be set up for transmitting and receiving augmented reality content when the terminal 100 belongs to a user subscribed to a VoLTE service capable of supporting augmented reality.

In addition, the MME 521 may perform functions related to roaming and authentication for the terminal 100, and may process bearer signaling generated by the terminal 100. Here, the messages transmitted and received between the terminal 100 and the MME 521 are Non Access Stratum (NAS) messages.

In addition, the MME 521 supports a plurality of Tracking Area Identities (TAIs) and is connected to the eNBs 511 that individually support those TAIs. Accordingly, eNBs 511 supporting the same TAI may be connected to the same MME 521, and eNBs 511 supporting different TAIs may be connected to different MMEs 521.

The S-GW 522 is a serving gateway that processes traffic for a plurality of eNBs 511. It interworks with the P-GW 523 and supports data transmission and reception for the terminal 100. That is, packet data generated by a terminal 100 connected to the core network 520, for example a video call request message, is transmitted through the eNB 511 of the access network 510 and the S-GW 522 and P-GW 523 of the core network 520; the MME 521 is not involved in packet data transmission and reception.

The P-GW 523 assigns the Internet Protocol (IP) address of the terminal 100 and performs the packet-data-related functions of the core network 520. In addition, the P-GW 523 determines the bearer bandwidth to be provided to the subscriber of the terminal 100, may be in charge of forwarding and routing functions for packet data, applies different QoS policies according to the type of service to which each terminal 100 subscribes, and controls traffic.

Here, the eNB 511 and the MME 521 may be connected through the S1-MME interface, the eNB 511 and the S-GW 522 through the S1-U interface, and the S-GW 522 and the P-GW 523 through the S5 interface.

In addition, although the core network 520 according to an embodiment of the present invention is described as implemented with the MME 521, the S-GW 522, and the P-GW 523 as entities, it may be configured to further include, in addition to those entities, a Policy and Charging Enforcement Function (PCEF) device for enforcing policy and billing, a Home Subscriber Server (HSS), and the like. The HSS stores and manages subscription information for each terminal 100 and, at the request of the MME 521 during the initial call connection procedure of the terminal 100, transmits subscriber information to the MME 521, in particular information on the designated APN of the subscribed VoLTE service, thereby supporting session and bearer allocation according to the APN.

In addition, the quality management server 200 connected through the PDN 530 monitors network status information as the video call between the originating terminal 100a and the receiving counterpart terminal 100b is connected, and can serve to determine which entity generates the augmented reality content.

In such a system 1000, the originating terminal 100a transmits a video call setup request to the receiving counterpart terminal 100b. When the video call is established upon the counterpart terminal 100b's acceptance, the originating terminal recognizes the image data it captures and generates, generates augmented reality content corresponding to that image data, and then transmits it to the counterpart terminal 100b through the originating access network 510a, the core network 520, and the receiving access network 510b.

Here, the augmented reality content is delivered through the same APN as the image data and audio data, but over a different bearer, i.e., along a different path. In addition, the bit rate of the augmented reality content of the present invention may be adjusted and transmitted under the control of the quality management server 200; alternatively, the terminal 100a may transmit only voice data and video data to the counterpart terminal 100b, and the quality management server 200 may generate and transmit the augmented reality content to the counterpart terminal 100b, or the augmented reality content may be generated in the counterpart terminal 100b itself.
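The three alternatives above amount to a decision about who generates the AR content: the originating terminal, the quality management server, or the receiving terminal. A sketch of one such decision rule follows; the inputs and their priority order are assumptions, since the passage does not fix the exact criteria.

```python
# Illustrative decision table for which entity generates AR content.

def choose_ar_generator(sender_uplink_ok: bool, server_capable: bool) -> str:
    if sender_uplink_ok:
        return "originating-terminal"       # sender generates and transmits
    if server_capable:
        return "quality-management-server"  # server generates mid-path
    return "receiving-terminal"             # receiver generates locally
```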

The method of providing augmented reality content during a video call in the terminal 100 will be described in more detail later. A processor mounted in each device constituting the system 1000 of the present invention may process program instructions for executing the method according to the present invention. In one implementation, this processor may be a single-threaded processor; in other implementations, it may be a multithreaded processor. Furthermore, the processor can process instructions stored in memory or on storage devices.

Hereinafter, the main configuration and operation of the terminal 100 according to an embodiment of the present invention will be described.

2 is a block diagram showing a main configuration of a terminal according to an embodiment of the present invention.

Referring to FIGS. 1 and 2, the terminal 100 according to an embodiment of the present invention may be configured to include a communication unit 10, an input unit 20, a control unit 30, an image capturing unit 40, a storage unit 50, and an output unit 60.

Describing each component in more detail, first, the communication unit 10 includes a session connection module 11 for the video call service, and may support the transmission and reception of various information generated in procedures such as accessing the core network 520 via the access network 510 and performing a video call with the counterpart terminal 100b.

The input unit 20 transmits to the controller 30 various information, such as numeric and text information input by the user, and signals input in connection with setting and controlling various functions of the terminal 100. In particular, the input unit 20 of the present invention supports operations such as receiving user input for executing a separate application that runs a video call service capable of generating augmented reality content, or receiving the number of the counterpart terminal 100b with which the video call is to be performed. In addition, the input unit 20 of the present invention may include a module, such as a microphone, that processes the user's voice signal to generate voice data, and may transmit the voice data input by the user to the controller 30 during a video call.

As described above, the input unit 20 may include key input means such as a keyboard or keypad, touch input means such as a touch sensor or touch pad, voice input means, and gesture input means such as a gyro sensor, geomagnetic sensor, acceleration sensor, or proximity sensor. In addition, it may include any type of input means currently under development or to be developed in the future.

The controller 30 performs overall control of the terminal 100. In hardware, it may include at least one processor, such as a central processing unit (CPU) or micro processing unit (MPU), an execution memory into which data is loaded (e.g., registers and/or random access memory (RAM)), and a bus for inputting and outputting data to and from the processor and memory. In software, it may include predetermined program routines or program data that are loaded into the execution memory from a recording medium and computed by the processor to perform functions defined in the terminal 100 (e.g., the function of generating augmented reality content during a video call). In other words, among the functions provided in the terminal 100 for processing the method of generating augmented reality content during a video call according to an embodiment of the present invention, the components that can be processed by software may be regarded as functions of the controller 30.

The controller 30 of the present invention is functionally connected with at least one of the components provided to support the method of generating augmented reality content during a video call according to an embodiment of the present invention. That is, the controller 30 is functionally connected to the communication unit 10, the input unit 20, the image capturing unit 40, the storage unit 50, and the output unit 60, and controls the supply of power and the flow of signals to each component for the execution of its functions.

In particular, the controller 30 of the present invention may perform an initial call connection procedure for accessing the core network 520 via the access network 510. In response to a user request, the controller 30 transmits a video call connection request message to the counterpart terminal 100b, which is the receiving terminal. When the counterpart terminal 100b accepts the video call connection, the controller 30 drives the camera module 41, tracks and recognizes a preset object in the image data captured by the camera module 41 to generate augmented reality content, and supports a process of transmitting the generated augmented reality content to the counterpart terminal 100b.

The operation of the controller 30 of the present invention will be clearly understood through the flowcharts described below. The controller 30 of the present invention may be configured to include a call processing module 31 and an AR content generation module 32.

Here, the call processing module 31 processes a process of connecting to the access network 510, a process of connecting to the core network 520, and a process of connecting a video call to the counterpart terminal 100b. The AR content generation module 32 tracks and recognizes a preset object in the image data captured once the video call with the counterpart terminal 100b is established, generates augmented reality content corresponding thereto, and supports a process in which the generated content is delivered to the counterpart terminal 100b through the call processing module 31.

The image capturing unit 40 includes a camera module 41, and drives the camera module 41 under the control of the controller 30 to capture images and generate image data.

The storage unit 50 may temporarily store various data generated while executing application programs, including the application program required for operating the functions according to an embodiment of the present invention. The storage unit 50 may include a storage medium such as a flash memory, a hard disk, a multimedia-card-micro-type memory (e.g., SD or XD memory), a RAM, or a ROM.

The output unit 60 displays information on the series of operation states and operation results that occur while the terminal 100 performs its functions. In particular, the output unit 60 of the present invention may include a display module (not shown) for outputting a video call screen and a voice output module (not shown) for outputting voice data generated by converting the user's voice signal.

As such, the main components of the terminal 100 of the present invention have been described with reference to FIG. 2. The terminal 100 of the present invention may be implemented in various forms. For example, the terminal 100 described herein may refer to a user equipment (UE), a mobile station (MS), a mobile terminal (MT), a subscriber station (SS), a mobile subscriber station (MSS), a portable subscriber station (PSS), an access terminal (AT), or the like, and may include all or some of the functions of such mobile terminals, subscriber stations, and portable subscriber stations. In addition, the terminal 100 may be classified, according to its implementation form, as a smartphone, a tablet PC, a personal digital assistant (PDA), a portable multimedia player (PMP), or the like.

Hereinafter, the main configuration and operation method of the quality control server 200 according to an embodiment of the present invention will be described.

FIG. 3 is a block diagram showing the main configuration of a quality control server according to an embodiment of the present invention.

Referring to FIGS. 1 and 3, the quality control server 200 according to an exemplary embodiment of the present invention may include a server communication unit 210, a server control unit 220, and a server storage unit 230.

Describing each component in more detail, first, the server communication unit 210 is connected via the PDN 530 and serves to monitor network status information on the session between the terminal 100a and the counterpart terminal 100b to which the video call is connected.

The server controller 220 determines, according to the network status information monitored by the server communication unit 210, whether it is suitable for the terminal 100 to generate augmented reality content, and supports a process of transmitting this determination to the terminal 100 through the server communication unit 210. In addition, the server controller 220 monitors the network status information at regular intervals and transmits the monitored network status information to the terminal 100a, the originating terminal, so that the terminal 100a can variably adjust the bit rate when generating augmented reality content. Furthermore, when it is unsuitable for the terminal 100 to generate the augmented reality content, the server controller 220 may directly generate the augmented reality content, adjust the bit rate of the generated augmented reality content according to the network status information, and support a process of transmitting it to the counterpart terminal 100b, the receiving terminal.

Here, the network status information is a concept encompassing various information that may be generated on the access network 510, the core network 520, and the PDN 530, such as network connection status, communication bandwidth, and load information.

In addition, although not shown in the drawings, the server controller 220 of the present invention may be configured to include a call processing module (not shown) that supports a video call processing process between the terminals 100, and an AR content generation module (not shown) that can generate augmented reality content from the image data transmitted and received between the terminals 100.

The server storage unit 230 stores a variety of information according to an embodiment of the present invention. For example, the server storage unit 230 may store information related to an AR engine required for generating augmented reality content.

The main configuration and operation method of the quality control server 200 according to the embodiment of the present invention have been described above.

Meanwhile, the memory mounted in each device for implementing the present invention, such as the terminal 100 and the quality control server 200, stores information in the device. In one embodiment, the memory is a computer readable medium. In one implementation, the memory may be a volatile memory unit, and for other implementations, the memory may be a nonvolatile memory unit. In one embodiment, the storage device is a computer readable medium. In various different implementations, the storage device may include, for example, a hard disk device, an optical disk device, or some other mass storage device.

In addition, the term '~ module' used in the embodiments of the present invention refers to a software component, and a '~ module' performs certain roles. By way of example, a '~ module' may include software components, object-oriented software components, class components, task components, processes, functions, properties, procedures, subroutines, segments of program code, drivers, data, databases, data structures, tables, arrays, and variables. The functions provided in the components and '~ modules' may be combined into a smaller number of components and '~ modules' or further separated into additional components and '~ modules'.

Although the specification and drawings describe exemplary device configurations, the functional operations and subject-matter implementations described herein may be embodied in other types of digital electronic circuitry, or in computer software, firmware, or hardware including the structures disclosed herein and their structural equivalents, or in a combination of one or more of them. Implementations of the subject matter described herein may be implemented as one or more computer program products, that is, one or more modules of computer program instructions encoded on a tangible program storage medium for controlling, or for execution by, the operation of an apparatus according to the invention. The computer-readable medium may be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter affecting a machine-readable propagated signal, or a combination of one or more thereof.

Hereinafter, a method of providing augmented reality content during a video call according to an embodiment of the present invention will be described.

First, before describing a method for providing augmented reality content during a video call according to an embodiment of the present invention, an initial call connection procedure for this will be described with reference to FIG. 4.

FIG. 4 is a data flow diagram schematically illustrating an initial call connection procedure according to an embodiment of the present invention.

Prior to the description with reference to FIG. 4, it should be noted that the initial call connection procedure shown in FIG. 4 schematically presents only the steps necessary to explain the present invention; in actual operation, various steps other than those shown may be performed as part of the initial call connection procedure.

Referring to FIG. 4, in the initial call access procedure, the terminal 100 performs an RRC connection establishment process with an eNB 511 (S101). For example, the terminal 100 searches for an eNB 511 located within a predetermined radius, establishes radio link synchronization with the discovered eNB 511, and transmits an RRC Connection Request message to the eNB 511. In response, the terminal 100 may receive an RRC Connection Setup message from the eNB 511. When the RRC connection establishment is completed through this procedure, the terminal 100 sets its terminal ID to its IMSI (International Mobile Subscriber Identity) and transmits an Attach Request message, including information on the network capabilities supported by the terminal 100, to the MME 521 (S103).

In addition, the eNB 511 transmits an Initial UE Message to the MME 521 (S105), and the MME 521 that receives it assigns an S1AP UE ID, identification information for the corresponding terminal 100 managed between the eNB 511 and the MME 521, thereby establishing an S1 signaling connection between the eNB 511 and the MME 521. The MME 521, having confirmed the IMSI and the network capability information of the terminal 100 through the Attach Request message, performs a subscriber authentication procedure using this information (S107). Although not shown in the figure, the subscriber authentication procedure by the MME 521 can be performed by obtaining an authentication vector for the subscriber from the HSS (not shown) and performing mutual authentication between the MME 521 and the terminal 100. Since the specific subscriber authentication procedure follows a known procedure, its description is omitted.

Thereafter, the MME 521 registers the terminal 100 as a subscriber to the network and performs a location registration procedure with the HSS (not shown) to determine what services the user can use (S109). That is, the MME 521 transmits an Update Location Request message to the HSS (not shown) indicating that the terminal 100 is connected to it and located in an area managed by the MME 521, and receives subscriber information, including APN information, from the HSS (not shown) through an Update Location Answer message (S111).

Through this process, the MME 521 may check the service information to which the terminal 100 is subscribed, and controls the S-GW 522 and the P-GW 523 to perform the session establishment and bearer establishment procedures for providing the subscribed service according to the confirmed APN information. For example, the MME 521 allocates an EPS bearer ID for the terminal 100 and determines the P-GW 523 for accessing the confirmed APN.

Since the terminal 100 is subscribed to the APN supporting the VoLTE service, the MME 521 determines the P-GW 523 capable of processing the service corresponding to that APN, and then selects the S-GW 522 to be connected with the corresponding P-GW 523. Thereafter, the MME 521 transmits a session setup request message to the selected S-GW 522 (S113), and the S-GW 522 that receives it transmits a session setup request message to the P-GW 523. Through this process, the P-GW 523 and the S-GW 522 establish a tunnel for transmitting downlink traffic, and the P-GW 523 allocates the IP address to be used when the terminal 100 uses the APN. In addition, although not shown in the drawing, the P-GW 523 may further perform a subscriber profile checking procedure in connection with a PCRF (not shown). When the session setup is completed through all the related procedures, the P-GW 523 transmits a session establishment completion message to the S-GW 522 (S117), and the S-GW 522 transmits a session establishment completion message to the MME 521 (S119).

Thereafter, the MME 521 transmits and receives an Initial Context Setup Request/Response message to and from the eNB 511, completing the initial access process (S123).

In addition, according to an embodiment of the present invention, the MME 521 may control the S-GW 522 and the P-GW 523 to configure three basic bearers for the VoLTE service at session establishment and bearer allocation. For example, a bearer for transmitting/receiving image data, a bearer for transmitting/receiving voice data, and a bearer for IMS signaling may be configured, and the augmented reality content of the present invention may be transmitted over the bearer for IMS signaling. Depending on the system design of the MME 521, an additional bearer for transmitting and receiving augmented reality content, beyond the three basic bearers, may also be allocated.
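The bearer configuration described above can be modeled as a minimal sketch. The class and field names below (`Bearer`, `VolteSession`, the purpose strings) are illustrative assumptions, not part of the patent or of any 3GPP API:

```python
from dataclasses import dataclass, field

@dataclass
class Bearer:
    """A single EPS bearer within the VoLTE APN session (hypothetical model)."""
    bearer_id: int
    purpose: str  # e.g. "video", "voice", "ims-signaling", "ar-content"

@dataclass
class VolteSession:
    """Bearers configured at session establishment (illustrative only)."""
    apn: str
    bearers: list = field(default_factory=list)
    _next_id: int = 1

    def allocate_bearer(self, purpose: str) -> Bearer:
        b = Bearer(self._next_id, purpose)
        self._next_id += 1
        self.bearers.append(b)
        return b

# The three basic bearers for the VoLTE service described above.
session = VolteSession(apn="volte-apn")
for purpose in ("video", "voice", "ims-signaling"):
    session.allocate_bearer(purpose)

# Depending on the system design, an additional bearer dedicated to
# augmented reality content may be allocated instead of reusing the
# IMS signaling bearer.
session.allocate_bearer("ar-content")
```

In this sketch the fourth allocation is optional, mirroring the design choice in the embodiment between reusing the IMS signaling bearer and allocating a dedicated one.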

After this process, the terminal 100 can perform a video call with the counterpart terminal through the designated S-GW 522 and P-GW 523. In this case, the terminal 100 may transmit image data, voice data, and augmented reality content to the counterpart terminal through the bearers of the APN designated for the VoLTE service in the initial call access procedure.

Hereinafter, such a method of providing augmented reality content during a video call according to an exemplary embodiment of the present invention will be described.

FIG. 5 is a flowchart illustrating a method of providing augmented reality content during a video call according to an embodiment of the present invention.

Before describing with reference to FIGS. 1 and 5, the terminal 100a and the counterpart terminal 100b are described separately: the terminal 100a is assumed to be the originating terminal performing the video call, and the counterpart terminal 100b is assumed to be the receiving terminal. In addition, it is assumed that both the terminal 100a and the counterpart terminal 100b are subscriber terminals subscribed to the VoLTE service and are connected to the core network 520 as described with reference to FIG. 4.

The terminal 100a selects the counterpart terminal 100b with which to conduct a video call and transmits a video call request message to the selected counterpart terminal 100b (S201). Here, the video call request message may be transmitted to the counterpart terminal 100b through the bearer for IMS signaling of the APN designated for the VoLTE service; although it passes through the terminal-side eNB 511a, the S-GW 522, the P-GW 523, and the counterpart-terminal-side eNB 511b, a description of this process is omitted.

The counterpart terminal 100b that receives the request may accept or reject the video call connection at its user's request. When the counterpart terminal 100b accepts the video call transmitted from the terminal 100a (S203), the video call connection is completed between the terminal 100a and the counterpart terminal 100b.

When the video call connection with the counterpart terminal 100b is completed, the terminal 100a may drive the camera module to generate image data captured through the camera module, and may check network status information (S205) to determine whether to transmit augmented reality content.

That is, the terminal 100a may receive and check the network status information from the quality control server 200 at regular intervals, and determine whether the confirmed network status information indicates that augmented reality content transmission is possible (S207).

If it is determined that the network status information received from the quality management server 200 indicates that augmented reality content transmission is possible, the terminal 100a generates augmented reality content corresponding to the image data captured in real time through the camera module (S209), and may transmit the generated augmented reality content to the counterpart terminal 100b together with the image data (S211).

On the contrary, if it is determined in step S207 that the network status information received from the quality management server 200 indicates that augmented reality content transmission is impossible, the terminal 100a may transmit only the image data to the counterpart terminal 100b. In this case, the quality management server 200 or the counterpart terminal 100b may generate the augmented reality content corresponding to the image data according to the network condition on the receiving side.
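The sender-side branch of steps S207 to S211 can be sketched as follows. The function names (`decide_transmission`, `generate_ar_content`) and the dictionary payload shape are hypothetical; the actual generation involves object tracking and recognition, which is reduced here to a placeholder:

```python
def generate_ar_content(frame: bytes) -> bytes:
    # Placeholder for object tracking/recognition and overlay
    # generation; the real pipeline is described in FIG. 6.
    return b"AR:" + frame[:4]

def decide_transmission(network_ok: bool, frame: bytes) -> dict:
    """Simplified sketch of steps S207-S211: if the monitored network
    status permits it, generate AR content for the captured frame and
    send both; otherwise send the frame alone, leaving AR generation
    to the server or the receiving terminal."""
    if network_ok:
        return {"image": frame, "ar": generate_ar_content(frame)}
    return {"image": frame, "ar": None}
```

The `"ar": None` branch corresponds to the case above in which only image data is transmitted and the quality management server 200 or the counterpart terminal 100b generates the content instead.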

The terminal 100a has been described above as transmitting the image data, or the image data together with the generated augmented reality content, to the counterpart terminal 100b; however, when a voice signal is input from the user and voice data is generated, the generated voice data may also be transmitted to the counterpart terminal 100b.

In addition, while the video call is connected, the terminal 100a according to an embodiment of the present invention may transmit the image data, the audio data, and the augmented reality content corresponding to the image data to the counterpart terminal 100b over different bearers of the same APN. In this case, the bearer over which the augmented reality content is transmitted may be the default bearer allocated for signaling in the initial call access procedure, or, depending on the system implementation, a bearer additionally allocated for transmitting and receiving augmented reality content.

The counterpart terminal 100b may combine and output the data transmitted through the different bearers; when only the video data, or the video data and the audio data, are received from the terminal 100a, the counterpart terminal 100b may itself generate and output the augmented reality content corresponding to the image data transmitted from the terminal 100a.

Such a method of providing augmented reality content during a video call according to an exemplary embodiment of the present invention will be described in more detail with reference to FIGS. 6 and 7.

FIG. 6 is a data flow diagram illustrating a method of providing augmented reality content during a video call according to an embodiment of the present invention.

Before describing with reference to FIG. 6, note that in this method for providing augmented reality content during a video call according to an embodiment of the present invention, the description focuses on the process in which the terminal 100a, as the originating terminal, generates the augmented reality content and transmits it to the counterpart terminal 100b, as the receiving terminal.

In addition, transmission and reception of information between the terminal 100a and the counterpart terminal 100b pass through the eNB 511 of the access network 510 and the various entities of the core network 520, but a description of this is omitted for convenience. However, the intervention of the quality management server 200 connected through the PDN 530 will be described.

First, the terminal 100a selects the counterpart terminal 100b and transmits a video call request message to it (S301). In this case, the video call request message is transmitted through the IMS signaling bearer of the APN designated in the initial call connection procedure, as described with reference to FIG. 4. When the video call acceptance message is transmitted from the counterpart terminal 100b (S303), the video call is connected between the terminal 100a and the counterpart terminal 100b, and the quality management server 200 monitors the network status information of the terminal 100a and the counterpart terminal 100b for which the video call is connected (S305).

In addition, as the video call is connected, the terminal 100a drives the camera module (S307) and tracks and recognizes a preset object in the image data captured through the camera module (S309).

In this case, when a plurality of objects exist in the captured image data, the terminal 100a recognizes the plurality of objects and, among the recognized objects, determines as the preset object either the object designated by the user of the terminal 100a, a set object, or the most easily recognized object, and performs the process of tracking and recognizing only that object. For example, when a person other than the user of the terminal 100a appears together with the user in the image data captured through the camera module of the terminal 100a, the terminal 100a recognizes and tracks only the object designated as the user. As another example, when a preset advertisement object exists in the image data captured through the camera module, only the corresponding advertisement object may be tracked and recognized; the preset advertisement object is information that may be set by the user or through the quality control server 200. As yet another example, when a plurality of objects exist in the image data, only the frontmost and most easily recognized object may be tracked and recognized.
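The object-selection priority just described (user-designated object, then preset advertisement object, then the most easily recognized object) can be sketched as a small selection function. The dictionary keys `label` and `recognition_score` are assumed data shapes, not taken from the patent:

```python
def select_target_object(objects, user_designated=None, ad_objects=frozenset()):
    """Pick the single object to track among several recognized objects,
    following the priority described above (a sketch under assumed data
    shapes): 1) an object the user designated, 2) a preset advertisement
    object, 3) otherwise the most easily recognized object.
    Each object is a dict with 'label' and 'recognition_score'."""
    labels = {o["label"] for o in objects}
    if user_designated in labels:
        return user_designated
    for o in objects:
        if o["label"] in ad_objects:
            return o["label"]
    # Fall back to the object with the highest recognition score,
    # standing in for "frontmost and most easily recognized".
    return max(objects, key=lambda o: o["recognition_score"])["label"]
```

The fallback uses a recognition score as a proxy for how easily an object is recognized; the embodiment leaves the exact criterion to the implementation.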

Thereafter, the terminal 100a generates the augmented reality content for the preset object being tracked and recognized (S311). Here, depending on the implementation form, the augmented reality content may be generated as content of various types, such as an image, a video, or an animation. In addition, information guiding a major event related to the user of the terminal 100a (e.g., a birthday or a meeting schedule) may be generated as augmented reality content, and brief information on the recent communication history (e.g., call history and message transmission/reception history) may also be generated as augmented reality content. Here, the major event related to the user and the recent communication history may be generated using the information stored in the terminal 100a.
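Composing overlay information from stored events and communication history, as described above, might look like the following sketch. The data shapes (date/description tuples, a list of recent contact names) and the formatting are illustrative assumptions:

```python
from datetime import date

def ar_overlay_text(today, events, recent_contacts, limit=2):
    """Compose overlay text for the AR content from information stored
    on the terminal (hypothetical data shapes):
    - events: list of (date, description) tuples, e.g. birthdays
      or meeting schedules
    - recent_contacts: contact names from recent calls/messages
    """
    lines = [f"Today: {desc}" for d, desc in events if d == today]
    if recent_contacts:
        # Keep only a brief summary of the recent communication history.
        lines.append("Recent: " + ", ".join(recent_contacts[:limit]))
    return "\n".join(lines)
```

Rendering this text as an image, video, or animation overlay is left out; the sketch only shows how stored terminal data could feed the content.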

While the terminal 100a generates the augmented reality content, the quality control server 200 may continuously monitor the network status information at predetermined intervals (S305) and report it to the terminal 100a.

Upon receiving this, the terminal 100a may variably adjust the bit rate of the augmented reality content according to the network status information (S315), and transmit the adjusted augmented reality content to the counterpart terminal 100b together with the image data and the audio data (S317). Here, the image data, the audio data, and the augmented reality content corresponding to the image data are transmitted to the counterpart terminal 100b through different bearers of the same APN, and the bearer over which the augmented reality content is delivered may be the bearer for IMS signaling or a bearer additionally allocated for augmented reality content transmission and reception.
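A minimal sketch of the variable bit-rate adjustment in step S315 follows. The parameter names and the specific policy (cap by reported bandwidth, with a floor) are assumptions for illustration; the patent does not specify the adjustment formula:

```python
def adjust_bitrate(base_bitrate_kbps, available_bandwidth_kbps, min_kbps=64):
    """Variably adjust the AR content bit rate according to the network
    status information (simplified sketch): cap the bit rate by the
    bandwidth reported by the quality management server, never dropping
    below a floor that keeps the content usable."""
    return max(min_kbps, min(base_bitrate_kbps, available_bandwidth_kbps))
```

Under this policy a congested report lowers the AR content bit rate while the video and audio bearers are handled separately, matching the per-bearer transmission described above.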

Thereafter, the counterpart terminal 100b outputs the received video data, audio data, and augmented reality content through the video call screen (S319). The same may also be output through the video call screen of the terminal 100a.

Here, the augmented reality content generated corresponding to the image data is configured to include fields for content type, content display coordinates, content delivery type, content quality, and content delivery information. The content type indicates the type of the generated augmented reality content (e.g., image, video, 3D animation, or audio) and may have a 1-byte integer value. The content display coordinates are the coordinate values for mapping the augmented reality content onto the image data and may have an 8-byte integer value in total, 4 bytes each for the x and y coordinates; when no coordinate value needs to be sent, the field may be set to a null value. The content delivery type may have a 1-byte integer value; when the augmented reality content is obtained from an external server (not shown) connected through the PDN 530, the corresponding information is entered in the content delivery information field, and the content delivery type field indicates this case. The content quality is a 1-byte integer value that can be variably adjusted according to the network status information received from the quality management server 200; it may have a value between 0 and 255, with smaller numbers indicating higher quality.
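The fixed-size fields described above can be packed as a binary header. The byte order and the omission of the variable-length content delivery information field are assumptions made for this sketch; the patent specifies only the field sizes:

```python
import struct

# Header layout per the field description above: content type (1 byte),
# x/y display coordinates (4 bytes each), content delivery type (1 byte),
# content quality (1 byte, 0-255, smaller = higher quality).
# Big-endian order and the trailing variable-length delivery-info field
# are illustrative assumptions.
AR_HEADER = ">BiiBB"  # uint8, int32, int32, uint8, uint8

def pack_ar_header(ctype, x, y, delivery_type, quality):
    assert 0 <= quality <= 255
    return struct.pack(AR_HEADER, ctype, x, y, delivery_type, quality)

def unpack_ar_header(data):
    return struct.unpack(AR_HEADER, data)
```

For example, an image-type content (type 1) mapped at coordinates (120, 80) with the highest practical quality packs into an 11-byte header, to which the delivery information could be appended.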

In addition, the terminal 100a may receive and output image data, audio data, and augmented reality content photographed by the counterpart terminal 100b from the counterpart terminal 100b.

That is, the video call screen shown on the terminal 100a is divided into an area for checking the image of the counterpart terminal 100b and an area for checking the terminal's own image. In the area for checking its own image, the image data captured through the terminal 100a and the augmented reality content corresponding thereto are output, and in the area for checking the counterpart's image, the image data received from the counterpart terminal 100b and the augmented reality content corresponding thereto are output. Alternatively, the terminal 100a may receive only image data from the counterpart terminal 100b and itself generate the augmented reality content from the received image data.

Hereinafter, a method of providing augmented reality content during a video call according to another embodiment of the present invention will be described.

FIG. 7 is a data flowchart illustrating a method of providing augmented reality content during a video call according to another embodiment of the present invention.

Before describing with reference to FIG. 7, note that in this method for providing augmented reality content during a video call according to another embodiment of the present invention, the description covers the process in which the terminal 100a, as the originating terminal, transmits the video data (or video data and audio data) of the video call to the counterpart terminal 100b, as the receiving terminal, and the augmented reality content is generated by the quality management server 200 or the counterpart terminal 100b.

In addition, as described with reference to FIG. 6, transmission and reception of information between the terminal 100a and the counterpart terminal 100b pass through the eNB 511 of the access network 510 and the various entities of the core network 520, but a description of this is omitted for convenience. However, the intervention of the quality management server 200 connected through the PDN 530 will be described.

First, the terminal 100a selects the counterpart terminal 100b and transmits a video call request message to it (S401). In this case, the video call request message is transmitted through the IMS signaling bearer of the APN designated in the initial call connection procedure, as described with reference to FIG. 4. When the video call acceptance message is transmitted from the counterpart terminal 100b (S403), the video call is connected between the terminal 100a and the counterpart terminal 100b, and as the video call is connected, the quality management server 200 monitors the network status information of the terminal 100a and the counterpart terminal 100b (S405).

In addition, the quality management server 200 may periodically transmit the monitored network status information to the terminal 100a (S407). The quality management server 200 may transmit the network status information only to the originating terminal requesting the video call, or may transmit it simultaneously to all terminals connected to the video call.

Upon receiving this, the terminal 100a checks the network status information received from the quality management server 200. When the current network condition makes augmented reality content transmission impossible, the terminal 100a generates the image data captured through the camera module driven upon video call connection, generates audio data from the voice signal input by the user (S411), and transmits only the video data and the audio data to the counterpart terminal 100b (S413).

On the other hand, when the network condition allows augmented reality content transmission, the terminal 100a may generate and transmit the augmented reality content as described with reference to FIG. 6.

Meanwhile, all information transmitted from the terminal 100a to the counterpart terminal 100b to which the video call is connected may pass through the quality control server 200. When the video data (or the video data and the audio data) are transmitted from the terminal 100a, the quality management server 200 checks the network condition of the counterpart terminal 100b (S415).

As a result of the check, when the network condition of the counterpart terminal 100b makes augmented reality content transmission impossible, it is preferable to deliver only the minimum packet data, so the quality management server 200 controls the received data to be transmitted to the counterpart terminal 100b as it is (S417). In this case, each piece of data transmitted by the quality management server 200 may be transmitted through different bearers of the same APN.

The counterpart terminal 100b that receives this information tracks and recognizes a preset object in the image data and generates augmented reality content for the preset object (S419 to S421).

For example, when a plurality of objects exist in the image data received from the terminal 100a, the counterpart terminal 100b recognizes the plurality of objects and, among the recognized objects, determines as the preset object either the object designated as the user of the terminal 100a, a set object, or the most easily recognized object, and performs the process of tracking and recognizing only that object. For example, when a person other than the user of the terminal 100a appears together with the user in the image data transmitted from the terminal 100a through the quality control server 200, the counterpart terminal 100b may recognize and track only the object designated as the user of the terminal 100a. As another example, when a preset advertisement object exists in the image data captured through the camera module, only the corresponding advertisement object may be tracked and recognized. As yet another example, when a plurality of users are photographed, only the frontmost and most easily recognized object may be tracked and recognized.

Thereafter, the counterpart terminal 100b generates augmented reality content for the preset object that has been tracked and recognized (S421). The generated augmented reality content may be of various types, such as an image, a video, or an animation. In addition, depending on the implementation, information guiding a major event (e.g., a birthday or a meeting schedule) related to the user of the terminal 100a and pre-stored in the counterpart terminal 100b may be generated as augmented reality content, and brief information about the recent communication history, such as the call history and the message transmission/reception history, may also be generated as augmented reality content. Here, the major event related to the user and the recent communication history may be generated using information stored in the counterpart terminal 100b.
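Composing such overlay items from locally stored data can be sketched as follows. The function name and the record layout are assumptions for illustration; the patent only states that major events and brief recent-communication information may become augmented reality content.

```python
import datetime

# Hypothetical sketch: build AR overlay text items for the tracked user
# from information stored on the counterpart terminal (S421).

def build_ar_overlay(user: dict, call_log: list[str],
                     today: datetime.date) -> list[str]:
    items = []
    birthday = user.get("birthday")  # datetime.date stored on the terminal
    if birthday and (birthday.month, birthday.day) == (today.month, today.day):
        items.append(f"Today is {user['name']}'s birthday!")
    if call_log:
        items.append(f"Last call: {call_log[-1]}")  # brief recent history
    return items
```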

The counterpart terminal 100b then outputs the image data, the audio data, and the augmented reality content through the video call screen (S427).

In addition, when it is determined in step S415 that the network condition of the counterpart terminal 100b is good, the quality management server 200 may itself generate augmented reality content corresponding to the image data transmitted from the terminal 100a, and may control the augmented reality content, the image data, and the audio data to be transmitted to the counterpart terminal 100b, respectively (S425). In this way, the quality management server 200, which has better system performance than the terminal, generates the augmented reality content, thereby providing a variety of high-quality augmented reality content.

In addition, when the augmented reality content is generated by the quality management server 200, the augmented reality content may be transmitted to the counterpart terminal 100b with its bit rate variably adjusted according to the network status information of the counterpart terminal 100b.
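One way such variable adjustment could work is a simple bit-rate ladder, sketched below. The ladder values and the 30% bandwidth share reserved for the AR stream are illustrative assumptions, not figures from the patent.

```python
# Hypothetical sketch of variably adjusting the AR stream bit rate to the
# counterpart terminal's measured network capacity, keeping most of the
# bandwidth for the call media itself.

def ar_bitrate_kbps(available_kbps: int) -> int:
    ladder = [2000, 1000, 500, 250]     # candidate AR bit rates, high to low
    budget = int(available_kbps * 0.3)  # assumed 30% share for the AR stream
    for rate in ladder:
        if rate <= budget:
            return rate
    return 0  # network too poor: skip AR and send the call media only
```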

In addition, according to an embodiment of the present invention, it has been described that the quality management server 200 monitors the network conditions of the sender and the receiver to control which entity generates the augmented reality content, and that the type of augmented reality content varies according to the generating entity. For example, when the terminal 100 generates augmented reality content corresponding to the image data captured through its own camera module, or corresponding to the image data received from the counterpart terminal, the terminal 100 generates the augmented reality content using information stored in the terminal 100. When the augmented reality content is generated in the quality management server 200, by contrast, the augmented reality content can be generated, in conjunction with an external server, using various publicly available information related to the tracked object, regardless of the information stored in the terminal 100.

On the other hand, the quality management server 200 may control the generating entity according to the type of the augmented reality content. For example, when the augmented reality content generated corresponding to the image data is set to utilize content stored in the terminal 100 (e.g., when generating augmented reality content reflecting the communication history), the quality management server 200 may support the terminal 100 in generating the augmented reality content itself, even when the network condition makes it difficult to transmit and receive augmented reality content, and may control only the transmission bit rate of the generated augmented reality content to be adjusted according to the network condition. Conversely, when the augmented reality content is set to be generated regardless of the content stored in the terminal 100, the server side may be supported to generate various types of augmented reality content by interworking with various external service servers (not shown).
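The two-way control described above (generating entity chosen by content type and by network condition) can be condensed into a small decision function. This is a hedged sketch: the function name and the boolean inputs are assumptions used only to restate the paragraph's logic.

```python
# Hypothetical sketch of the quality management server's choice of which
# entity generates the AR content: content tied to terminal-stored data
# (e.g. communication history) stays on the terminal even under a poor
# network, with only the transmission bit rate adapted; content independent
# of terminal data may be generated server-side via external service servers.

def choose_generator(content_uses_terminal_data: bool, network_ok: bool) -> str:
    if content_uses_terminal_data:
        return "terminal"  # server only throttles the AR stream's bit rate
    return "server" if network_ok else "terminal"
```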

In the above, the method for providing augmented reality content during a video call according to an exemplary embodiment of the present invention has been described.

The method of providing augmented reality content during a video call according to an embodiment of the present invention, described with reference to FIGS. 5 to 7, may be performed sequentially as shown in the drawings, but some steps may be performed simultaneously or in reverse order. For example, the network status information that the terminal 100 receives from the quality management server 200 may be received at regular intervals while the video call is connected, regardless of the other operations in the terminal 100.

In addition, the method of providing augmented reality content during a video call as described above may be provided in the form of a computer-readable medium suitable for storing computer program instructions and data. A program recorded on a recording medium for implementing the method according to an embodiment of the present invention implements the functions of: transmitting, by the terminal, a video call request message to a counterpart terminal with which a video call is to be performed; checking, by the terminal, network status information when the video call with the counterpart terminal is connected; and, when the checked network status information indicates that augmented reality content can be transmitted, generating augmented reality content corresponding to the image data captured by the terminal and transmitting the generated augmented reality content together with the image data to the counterpart terminal.

In this case, the program recorded in the recording medium may be read, installed, and executed by a computer to perform the above functions.

Here, in order for a computer to read the program recorded on the recording medium and execute the functions implemented by the program, the above-described program may include code written in a computer language, such as C, C++, JAVA, or machine language, that the computer's processor (CPU) can read through the computer's device interface.

Such code may include function code related to the functions defining the above-described operations, and may include execution-procedure-related control code necessary for the processor of the computer to execute those functions according to a predetermined procedure. In addition, the code may further include memory-reference-related code indicating at which location (address) of the computer's internal or external memory the additional information or media required by the processor should be referenced. Furthermore, when the processor of the computer needs to communicate with another remote computer or server in order to execute the above-described functions, the code may further include communication-related code indicating how the processor should use the computer's communication module to communicate with the remote computer or server, and what information or media should be transmitted and received during the communication.

Computer-readable media suitable for storing such computer program instructions and data include, for example, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as compact disc read-only memory (CD-ROM) and digital video disc (DVD); magneto-optical media such as floptical disks; and semiconductor memories such as read-only memory (ROM), random-access memory (RAM), flash memory, erasable programmable ROM (EPROM), and electrically erasable programmable ROM (EEPROM). The processor and the memory can be supplemented by, or integrated with, special-purpose logic circuitry.

The computer-readable recording medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion. In addition, a functional program for implementing the present invention, and the codes and code segments associated therewith, may be easily inferred or changed by programmers skilled in the art, in consideration of the system environment of the computer that reads the recording medium and executes the program.

Although this specification contains numerous specific implementation details, these should not be construed as limitations on the scope of any invention or of the claims, but rather as descriptions of features that may be specific to a particular embodiment of a particular invention. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Furthermore, although features may be described above as operating in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or a variation of a subcombination.

Likewise, although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of the various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together into a single software product or packaged into multiple software products.

The present invention relates to a video call service, and more particularly, to a computer-readable recording medium recording a program for a method of providing augmented reality content during a video call, which can provide augmented reality content during a video call, and to an apparatus therefor.

According to the present invention, by transmitting the image data of the video call and the augmented reality content generated corresponding to the image data through different bearers, respectively, high-quality augmented reality content can be provided during the video call, which will contribute to the development of the video call service industry.

In addition, the present invention has industrial applicability because it is not only sufficiently marketable or commercially viable, but also capable of being clearly implemented in practice.

100: terminal 10: communication unit
11: session connection module 20: input unit
30: control unit 31: call processing module
32: AR content generation module 40: Image capturing unit
41: camera module 50: storage unit
60: output unit 510: connection network
511: eNB 520: core network
521: MME 522: S-GW
523: P-GW 530: PDN
200: quality management server

Claims (10)

  1. A computer-readable recording medium recording a program for a method of providing augmented reality content during a video call, the method comprising:
    transmitting, by a terminal, a video call request message through a designated bearer of a designated access point name to a counterpart terminal with which a video call is to be performed;
    checking, by the terminal, network status information when the video call with the counterpart terminal is connected; and
    when the checked network status information indicates that augmented reality content can be transmitted, generating, by the terminal, augmented reality content corresponding to image data captured by the terminal, and transmitting the generated augmented reality content together with the image data to the counterpart terminal through the same bearer as the bearer through which the video call request message was transmitted, or through a bearer of the access point name designated for transmitting and receiving augmented reality content.
  2. The computer-readable recording medium of claim 1,
    wherein the checking of the network status information comprises
    receiving, by the terminal, the network status information from a quality management server.
  3. The computer-readable recording medium of claim 1,
    wherein, in the transmitting to the counterpart terminal,
    when the checked network status information indicates that augmented reality content cannot be transmitted, the terminal transmits the captured image data to the counterpart terminal.
  4. The computer-readable recording medium of claim 1,
    wherein the transmitting to the counterpart terminal comprises:
    when the checked network status information indicates that augmented reality content can be transmitted, driving a camera module and tracking and recognizing, by the terminal, a preset object in image data captured in real time through the driven camera module;
    generating, by the terminal, augmented reality content for the tracked and recognized preset object; and
    transmitting, by the terminal, the generated augmented reality content to the counterpart terminal together with the image data.
  5. delete
  6. The computer-readable recording medium of claim 1,
    wherein, in the transmitting to the counterpart terminal,
    the terminal variably adjusts the bit rate of the augmented reality content according to the network status information and transmits the augmented reality content to the counterpart terminal.
  7. The computer-readable recording medium of claim 1,
    wherein the augmented reality content
    comprises at least one field of a content type, content display coordinates, a content delivery type, a content quality, and content delivery information.
  8. The computer-readable recording medium of claim 1, wherein the method further comprises:
    receiving, by the terminal, image data transmitted from the counterpart terminal to which the video call is connected, or receiving the image data together with augmented reality content generated corresponding to the image data, under control according to network status information of a quality management server.
  9. The computer-readable recording medium of claim 8, wherein the method further comprises:
    generating, by the terminal, augmented reality content corresponding to the image data when only the image data is received from the counterpart terminal.
  10. An apparatus for supporting a method of providing augmented reality content during a video call, the apparatus comprising:
    a call processing module configured to check network status information when a video call with a counterpart terminal is connected through a designated bearer of a designated access point name, to request generation of augmented reality content when the network status information indicates that augmented reality content can be transmitted, and to control the generated augmented reality content to be transmitted to the counterpart terminal together with image data through the same bearer as the bearer through which a video call request message was transmitted or through a bearer of the access point name designated for transmitting and receiving augmented reality content; and
    an AR content generation module configured to generate the augmented reality content corresponding to the image data at the request of the call processing module.
KR1020150004019A 2015-01-12 2015-01-12 Recording medium recording program for method of providing augmented reality contents during video call service, and apparatus teherfor KR102055198B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150004019A KR102055198B1 (en) 2015-01-12 2015-01-12 Recording medium recording program for method of providing augmented reality contents during video call service, and apparatus teherfor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150004019A KR102055198B1 (en) 2015-01-12 2015-01-12 Recording medium recording program for method of providing augmented reality contents during video call service, and apparatus teherfor

Publications (2)

Publication Number Publication Date
KR20160086560A KR20160086560A (en) 2016-07-20
KR102055198B1 true KR102055198B1 (en) 2019-12-12

Family

ID=56679935

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150004019A KR102055198B1 (en) 2015-01-12 2015-01-12 Recording medium recording program for method of providing augmented reality contents during video call service, and apparatus teherfor

Country Status (1)

Country Link
KR (1) KR102055198B1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190018243A (en) 2017-08-14 2019-02-22 라인 가부시키가이샤 Method and system for navigation using video call
KR20190120122A (en) 2019-10-15 2019-10-23 라인 가부시키가이샤 Method and system for navigation using video call

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100050694A (en) * 2008-11-06 2010-05-14 삼성전자주식회사 Method for transmitting and receiving of video telephony
JP4608005B1 (en) 2009-07-03 2011-01-05 株式会社エヌ・ティ・ティ・ドコモ Mobile communication method, mobile communication system, subscriber management server device and switching center
KR20120050258A (en) * 2010-11-10 2012-05-18 엘지전자 주식회사 Video conference system and method thereof
KR20120120858A (en) * 2011-04-25 2012-11-02 강준규 Service and method for video call, server and terminal thereof

Also Published As

Publication number Publication date
KR20160086560A (en) 2016-07-20

Similar Documents

Publication Publication Date Title
US10492116B2 (en) Traffic offload via local network
JP2018137765A (en) Communication system, method and device
US10219143B2 (en) Data transmission method, mobility management entity, and mobile terminal
JP6409871B2 (en) Resuming multiple packet services in mobile networks
US10064030B2 (en) Method of operating function and resource of electronic device
US20160150498A1 (en) Proximity-based service registration method and related apparatus
US10321371B2 (en) Method and apparatus for communication over network slices in wireless communication systems
US9554406B2 (en) Method for device to device communication and control node using the same
US9503951B2 (en) Method and apparatus for switch
EP3264853A1 (en) Rrc connection establishing method and apparatus, computer program and recording medium
US9357573B2 (en) Method of providing service continuity between cellular communication and device to-device communication
US9271222B2 (en) Method and apparatus for implementing access to machine to machine (M2M) core network
US9357359B2 (en) Dynamic quality of service (QoS) for services over cellular
KR20190020142A (en) Method for interworking between networks in a wireless communication system and apparatus therefor
JP5646647B2 (en) Method and apparatus for use in a communication network
KR101464417B1 (en) System and method for determining establishment causes for emergency sessions
US8385893B2 (en) Multi-SIM status update system
KR101617930B1 (en) Methods, apparatuses and computer readable storage media for detecting gesture-based commands for a group communication session on a wireless communications device
JP2012142950A (en) Mechanism for third generation partnership project multiple inter-network quality of service continuity
US10492127B2 (en) Wireless communications access method, apparatus, processor, and wireless terminal
JP5383907B2 (en) Paging of user equipment (UE) in a wireless communication system
US20130242866A1 (en) Method for device to device communication and base station and user equipment using the same
US20110294548A1 (en) Methods for handling an apparatus terminated communication request and communication apparatuses utilizing the same
US10542482B2 (en) Access control to services in a network
JP5132815B2 (en) Method and system for determining activation of signaling saving function in idle mode

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right