KR20130066121A - Server and method for receiving video sharing request from a plurality of devices and providing 3d video data to one of the plurality of devices, the devices


Info

Publication number
KR20130066121A
Authority
KR
South Korea
Prior art keywords
image data
terminal
image
information
terminals
Prior art date
Application number
KR1020110132817A
Other languages
Korean (ko)
Inventor
양정엽
박희선
변은영
서현득
한상봉
Original Assignee
주식회사 케이티
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 케이티
Priority to KR1020110132817A
Publication of KR20130066121A

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 — Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 — Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 — Processing image signals
    • H04N 13/128 — Adjusting depth or disparity
    • H04N 13/194 — Transmission of image signals
    • H04N 13/20 — Image signal generators
    • H04N 13/261 — Image signal generators with monoscopic-to-stereoscopic image conversion

Abstract

A server, a method, and a terminal for providing 3D image data to a terminal are provided. More specifically, mapping information between a plurality of terminals and 3D conversion parameters is generated based on performance information of each of the terminals; a request signal is received asking that an image in use on one of the terminals be shared with another terminal; 3D image data is generated from the image data of that image based on the mapping information; and the generated 3D image data is provided to the other terminal.

Description

SERVER AND METHOD FOR RECEIVING VIDEO SHARING REQUEST FROM A PLURALITY OF DEVICES AND PROVIDING 3D VIDEO DATA TO ONE OF THE PLURALITY OF DEVICES, THE DEVICES

The present disclosure relates to a server, a method, and a terminal for providing 3D image data to a terminal, and more particularly, to a server, a method, and a terminal for providing 3D image data to one of a plurality of terminals in response to a video sharing request among the terminals.

An N-screen service allows a user to access, on various devices such as a TV, a PC, a tablet PC, or a smartphone, services that were previously used independently on each device, centered on the user or the content. Providing an N-screen service requires technology for playing the same content simultaneously on a plurality of devices of various types, and for seamlessly continuing content played on one of those devices on another. In this regard, Korean Patent Publication No. 2011-0009587 discloses a configuration for resuming video content between heterogeneous terminals by synchronizing playback history between the content servers that provide content to a plurality of terminals.

Meanwhile, three-dimensional (3D) image technology, a display technology that provides a realistic stereoscopic effect, is drawing attention with the development of multimedia and broadcasting technology. In general, a 3D image is an image in which an object is expressed three-dimensionally by adding a depth axis to a two-dimensional (2D) plane image that has only horizontal and vertical axes.

Recently, as demand for 3D video has soared, technologies for handling 3D video are required when providing an N-screen service. In particular, since the N-screen service shares content among various devices, it is important to provide 3D image data that takes the performance of each terminal into account.

According to the present embodiments, 3D image data suited to a target terminal is generated in consideration of the performance of the target terminal that will share the 3D image data, or of the network quality. In addition, the conversion parameters for generating the 3D image data are determined efficiently in consideration of the target terminal's performance or the network quality, and an optimal conversion parameter for generating a 3D image from a 2D image is determined. However, the technical problems to be solved by the present embodiments are not limited to those described above, and other technical problems may exist.

As a technical means for achieving the above objects, an embodiment of the present invention provides a 3D image data providing server including: a mapping information generation unit that generates mapping information between a plurality of terminals and 3D conversion parameters based on performance information of each of the plurality of terminals; a request signal receiving unit that receives, from a first terminal among the plurality of terminals, a request signal asking that an image in use on the first terminal be shared with a second terminal; an image data generation unit that generates 3D image data from the image data of the image based on the mapping information; and an image data providing unit that provides the generated 3D image data to the second terminal.

According to another embodiment of the present invention, the mapping information generation unit generates mapping information among the plurality of terminals, state information indicating the state of the network, and the 3D conversion parameters, based on the state information and the performance information. According to yet another embodiment, the 3D conversion parameter is a parameter for adjusting the degree of stereoscopic effect of the generated 3D image data, and includes an angle value and a speed value.

Another embodiment of the present invention provides a 3D image data providing method including: generating mapping information between a plurality of terminals and 3D conversion parameters based on performance information of each of the plurality of terminals; receiving, from a first terminal among the plurality of terminals, a request signal asking that an image in use on the first terminal be continued on a second terminal; generating 3D image data from the image data of the image based on the mapping information; and providing the generated 3D image data to the second terminal.

Still another embodiment of the present invention provides a terminal including: a mapping information generation unit that generates mapping information between a plurality of receiving terminals and 3D conversion parameters based on performance information of each of the plurality of receiving terminals; a request signal receiving unit that receives, from a user interface, a request signal asking that an image be shared with any one of the plurality of receiving terminals; an image data generation unit that generates 3D image data from the image data of the image based on the mapping information; and an image data providing unit that provides the generated 3D image data to the one of the receiving terminals.

By generating the 3D image data based on the state information of the network and the performance of the terminal, 3D image data suited to the target terminal can be generated in consideration of the performance of the target terminal that shares the 3D image data, or of the transmission network quality. By mapping the state information of the network and the performance information of the terminal to 3D conversion parameters, the conversion parameters for generating 3D image data can be determined efficiently. In addition, conversion efficiency can be improved by using a speed value and an angle value as the 3D conversion parameters for generating a 3D image from a 2D image.

FIG. 1 is a block diagram of an image data providing system according to an exemplary embodiment of the present invention.
FIG. 2 is a block diagram of the 3D image data providing server 10 shown in FIG. 1.
FIG. 3 is a diagram illustrating an example of mapping information generated by the mapping information generation unit 11 of FIG. 2.
FIG. 4 is a configuration diagram of a terminal 40 according to an embodiment of the present invention.
FIG. 5 is a flowchart illustrating a method of providing 3D image data according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry them out. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In the drawings, parts irrelevant to the description are omitted in order to describe the present invention clearly, and like reference numerals designate like parts throughout the specification.

Throughout the specification, when a part is referred to as being "connected" to another part, this includes not only being "directly connected" but also being "electrically connected" with another element in between. Also, when a part is said to "comprise" an element, this means that it may further include other elements, not that it excludes them, unless specifically stated otherwise.

FIG. 1 is a block diagram of an image data providing system according to an exemplary embodiment of the present invention. Referring to FIG. 1, the image data providing system includes a 3D image data providing server 10 and a plurality of terminals 21 to 23. However, since the image data providing system of FIG. 1 is only one embodiment of the present invention, the present invention should not be interpreted as limited to FIG. 1. For example, according to various embodiments of the present disclosure, the image data providing system may further include a content server that provides 2D image data to the plurality of terminals 21 to 23.

Each component of the image data providing system of FIG. 1 is typically connected through a network. The network refers to a connection structure capable of exchanging information between nodes such as terminals and servers. Examples of such a network include the Internet, a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a personal area network (PAN), and the like.

The 3D image data providing server 10 provides 3D image data to the plurality of terminals 21 to 23. In this case, the 3D image data providing server 10 may share 3D image data between the plurality of terminals 21 to 23. For example, the 3D image data providing server 10 may provide the terminal 23 with 3D image data used in the terminal 21 among the plurality of terminals 21 to 23.

The 3D image data providing server 10 provides 3D image data to the plurality of terminals 21 to 23 in response to request signals received from them. For example, in response to a request signal received from the terminal 21 among the plurality of terminals 21 to 23, the 3D image data providing server 10 may provide the terminal 23 with 3D image data in use on the terminal 21. Here, the request signal is a signal requesting that image data be shared between terminals.

The 3D image data providing server 10 may convert 2D image data into 3D image data and provide the converted 3D image data to the plurality of terminals 21 to 23. For example, the 3D image data providing server 10 may convert 2D image data in use on the terminal 21 among the plurality of terminals 21 to 23 into 3D image data and provide the converted 3D image data to the terminal 23.

The 3D image data providing server 10 may generate 3D image data in consideration of terminal performance. For example, the 3D image data providing server 10 may generate the 3D image data provided to the terminal 23 among the plurality of terminals 21 to 23 in consideration of the type of the terminal 23 (e.g., a large-screen device, a mobile terminal, etc.) and its performance (e.g., resolution and graphics card capability).

The 3D image data providing server 10 may generate 3D image data in consideration of the state of the network. For example, the 3D image data providing server 10 may determine the quality of the 3D data provided to the terminal 23 in consideration of the state of the network to which the terminal 23 belongs, and generate the 3D data provided to the terminal 23 based on the determined quality.

The 3D image data providing server 10 may generate 3D image data by applying a different stereoscopic effect to each terminal. In this case, the 3D image data providing server 10 may receive setting information for adjusting the stereoscopic effect from the terminal.

As such, the 3D image data providing server 10 may provide optimal 3D image data for each terminal in consideration of the performance and network conditions of each terminal. In this way, a user who uses a plurality of terminals can be provided with content customized for each terminal.

The plurality of terminals 21 to 23 transmit, to the 3D image data providing server 10, a sharing request for sharing 3D image data with another terminal. In addition, the plurality of terminals 21 to 23 display the 3D image data received from the 3D image data providing server 10 on their displays.

According to various embodiments of the present disclosure, each of the plurality of terminals 21 to 23 may be any of various types of terminals. For example, a terminal may be a TV device, a computer, or a portable terminal capable of connecting to a remote server via a network. Examples of TV devices include a smart TV and an IPTV set-top box; examples of computers include a desktop or laptop equipped with a web browser. Examples of portable terminals include all kinds of handheld wireless communication devices that guarantee portability and mobility, such as PCS (Personal Communication System), GSM (Global System for Mobile communications), PDC (Personal Digital Cellular), PHS (Personal Handyphone System), PDA (Personal Digital Assistant), IMT (International Mobile Telecommunication)-2000, CDMA (Code Division Multiple Access)-2000, W-CDMA (Wideband CDMA), and WiBro (Wireless Broadband Internet) terminals, smartphones, and tablet PCs.

Hereinafter, the operation of each component included in the image data providing system of FIG. 1 will be described in more detail.

FIG. 2 is a block diagram of the 3D image data providing server 10 shown in FIG. 1. Referring to FIG. 2, the 3D image data providing server 10 includes a mapping information generation unit 11, a request signal receiving unit 12, an image data generation unit 13, an image data providing unit 14, and a database 15.

However, the 3D image data providing server 10 shown in FIG. 2 is only one implementation example of the present invention, and various modifications are possible based on the components shown in FIG. 2. For example, the 3D image data providing server 10 may further include a manager interface for receiving commands or information from a manager. The manager interface may be an input device such as a keyboard or a mouse, or a graphical user interface (GUI) presented on an image display device. As another example, the 3D image data providing server 10 may further include a communication unit for transmitting data to and receiving data from the first terminal 21. In this case, the communication unit receives data from the first terminal 21 via the network and transfers the received data to other components inside the 3D image data providing server 10, or transmits data received from those components to the first terminal 21.

The mapping information generation unit 11 generates mapping information between the plurality of terminals and the 3D conversion parameters based on performance information of each of the plurality of terminals. Here, the plurality of terminals may be terminals registered in a single set of user information; in other words, they may be terminals matched with a single user who uses the N-screen service. Examples of the performance information include information indicating the ability of each terminal to process 3D image data, such as the terminal type, performance, resolution, and screen type of each of the plurality of terminals.
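For illustration only, a minimal sketch (in Python) of the per-terminal performance record described above; the field names are hypothetical and not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class TerminalProfile:
    """Performance information reported for one registered terminal (illustrative fields)."""
    terminal_id: str    # e.g. a phone number or device identifier
    terminal_type: str  # e.g. "smart_tv", "tablet", "smartphone"
    resolution: str     # e.g. "SD", "HD", "UHD"
    screen_type: str    # e.g. "large_screen", "mobile"
    supports_3d: bool   # whether the device can render stereoscopic output

# Example: three terminals registered under one user's N-screen account
user_terminals = [
    TerminalProfile("tv-01", "smart_tv", "UHD", "large_screen", True),
    TerminalProfile("tab-01", "tablet", "HD", "mobile", True),
    TerminalProfile("ph-01", "smartphone", "SD", "mobile", True),
]
```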

A 3D conversion parameter is a conversion parameter for generating image data of a 3D image from the image data of a 2D image, and is used to adjust the degree of stereoscopic effect of the 3D image data. According to an embodiment of the present invention, the 3D conversion parameter includes a speed value and an angle value.

The mapping information generation unit 11 may generate the mapping information based on the performance information of each of the plurality of terminals and state information indicating the state of the network. In this case, the mapping information maps the plurality of terminals and the state information to 3D conversion parameters; for example, it maps a pair of performance information and state information to a 3D conversion parameter.

The mapping information generation unit 11 generates a plurality of pieces of mapping information corresponding to a plurality of sets of user information. For example, the mapping information generation unit 11 may generate unique mapping information for each set of user information and store the generated mapping information in the database 15 in association with that user information. In this way, mapping information corresponding to each set of user information is stored in the database 15, and each piece of mapping information may include the 3D conversion parameters corresponding to the performance information of the terminals matched with that user information. For example, the database 15 may store three terminal types registered under user A's information, together with the 3D conversion parameters mapped to each terminal. As described above, the 3D conversion parameters may also be mapped to the terminals together with the network state information.

In this way, the mapping information generation unit 11 can profile and manage 3D conversion parameters for each of the plurality of terminals, per set of user information, and the 3D image data providing server 10 can provide 3D image data with a different level of stereoscopic effect for each of the user's terminals.

For example, when a user who is watching a 3D video on a first terminal wants to watch the same 3D video on a second terminal, the 3D image data providing server 10 can provide the second terminal with the 3D conversion parameters or 3D image data corresponding to the second terminal, so that the user can enjoy the 3D video with the stereoscopic effect the user prefers when using the second terminal.

As another example, when a user wants to convert a 2D video being used on a first terminal into 3D and watch it on both a second terminal and a third terminal, the 3D image data providing server 10 can provide the second terminal with the 3D conversion parameters or 3D image data corresponding to the second terminal while simultaneously providing the third terminal with the 3D conversion parameters or 3D image data corresponding to the third terminal, so that the user can watch a 3D video tailored to each terminal.

The mapping information generation unit 11 may generate the mapping information based on setting information received from the first terminal 21 or the second terminal 23. In other words, the mapping information generation unit 11 may receive setting information on the 3D conversion parameters from the first terminal 21 or the second terminal 23 and generate the mapping information based on the received setting information.

For example, the mapping information generation unit 11 may provide at least one piece of basic 3D image data to the first terminal 21 or the second terminal 23, receive, as setting information, selection information on one of the pieces of basic 3D image data from that terminal, and generate the mapping information based on the received setting information. As another example, the mapping information generation unit 11 may provide at least one piece of basic 3D image data to the first terminal 21 or the second terminal 23, receive, as setting information, change information on one of the pieces of basic 3D image data from that terminal, and generate the mapping information based on the received setting information. In this case, providing the basic 3D image data to the first terminal 21 or the second terminal 23 may be performed by the image data providing unit 14 of FIG. 2.

The mapping information generation unit 11 may receive terminal-specific setting information from the first terminal 21 or the second terminal 23. For example, the mapping information generation unit 11 may receive, from the first terminal 21 or the second terminal 23, setting information corresponding to the 3D image suited to the first terminal 21 and setting information corresponding to the 3D image suited to the second terminal 23. Through this, as described above, 3D image data with a different level of stereoscopic effect can be provided for each of the user's terminals.

The mapping information generation unit 11 may update the mapping information based on setting information received from the first terminal 21 or the second terminal 23. For example, when the image data providing unit 14 provides 3D image data to the first terminal 21 or the second terminal 23 based on the mapping information stored in the database 15, and the mapping information generation unit 11 receives, as setting information, change information on the provided 3D image data from the first terminal 21 or the second terminal 23, the mapping information generation unit 11 may update the mapping information based on the received setting information.
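As an illustration of the update path just described, the sketch below (Python, continuing the hypothetical structures from the earlier sketch; none of these names come from the patent) shows how change information received as setting information could overwrite the stored 3D conversion parameter for one terminal:

```python
# Hypothetical in-memory stand-in for the database 15:
# user_id -> terminal_id -> (speed value V, angle value A)
mapping_db: dict[str, dict[str, tuple[float, float]]] = {
    "user-A": {"tv-01": (20.0, 3.0), "tab-01": (10.0, 1.5), "ph-01": (5.0, 0.8)},
}

def apply_setting_info(user_id: str, terminal_id: str,
                       new_params: tuple[float, float]) -> None:
    """Update the stored 3D conversion parameter after a terminal reports
    change information (setting information) on the 3D image data it received."""
    mapping_db[user_id][terminal_id] = new_params

# Example: the tablet asks for a weaker stereoscopic effect.
apply_setting_info("user-A", "tab-01", (10.0, 1.0))
```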

FIG. 3 is a diagram illustrating an example of mapping information generated by the mapping information generation unit 11 of FIG. 2. Referring to FIG. 3, the horizontal axis (x axis) of the mapping information carries the network state information, and the vertical axis (y axis) carries the performance information of the terminal. As shown in FIG. 3, examples of the network state information include 10 Mbps and 100 Mbps, indicating the network quality or transmission bandwidth, and examples of the terminal performance information include the screen type or resolution of the terminal, such as ultra-high resolution (Ultra HD).

Referring to FIG. 3, the mapping information includes a 3D conversion parameter mapped to each pair of network state information and terminal performance information. As shown in FIG. 3, a 3D conversion parameter is a pair (V, A) of a speed value and an angle value, for example (5, 0.8), (10, 1.5), or (20, 3.0). Here, (5, 0.8) may be the 3D conversion parameter mapped to a network state of 10 Mbps and a terminal whose resolution is SD or lower. However, FIG. 3 merely illustrates one example of mapping information, and modifications may be made according to various embodiments of the present disclosure. For example, the mapping information may be adjusted according to the type of image. In addition, the mapping information may prioritize the performance information of the terminal and consider different network state information for each level of terminal performance.
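A minimal sketch of the lookup implied by FIG. 3 (Python). Only the (5, 0.8) entry for 10 Mbps and SD-or-lower terminals is stated explicitly in the text; the placement of the other example parameters in the table below is an assumption for illustration.

```python
# (network bandwidth in Mbps, terminal resolution class) -> (speed value V, angle value A)
MAPPING_INFO = {
    (10, "SD"):   (5, 0.8),    # stated in the text for 10 Mbps and SD-or-lower terminals
    (100, "HD"):  (10, 1.5),   # assumed placement of the example parameters
    (100, "UHD"): (20, 3.0),   # assumed placement of the example parameters
}

def lookup_3d_params(bandwidth_mbps: int, resolution_class: str) -> tuple[float, float]:
    """Return the (V, A) 3D conversion parameter mapped to a network-state /
    terminal-performance pair, falling back to the most conservative entry."""
    return MAPPING_INFO.get((bandwidth_mbps, resolution_class), (5, 0.8))
```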

The mapping information generation unit 11 may generate a plurality of pieces of mapping information. In this case, the mapping information generation unit 11 may generate different mapping information according to the type or importance of the image, or different mapping information for each user.

The request signal receiving unit 12 receives, from the first terminal 21 among the plurality of terminals, a request signal asking that an image in use on the first terminal 21 be shared with the second terminal 23. The request signal may be a request to continue, on the second terminal 23, the image in use on the first terminal 21. The request signal may further include playback information indicating how far the video has been played on the first terminal 21, and identification information of the second terminal 23, such as its telephone number.

The request signal receiving unit 12 may perform user authentication or terminal authentication for the first terminal 21. When the first terminal 21 and the second terminal 23 are terminals mapped to the same user information, user authentication verifies whether the user information of the user of the first terminal 21 is valid, and terminal authentication verifies whether the first terminal 21 is a valid terminal.

The image data generation unit 13 generates 3D image data from the image data of the image based on the mapping information. Here, the image may be a 2D image. However, according to another embodiment of the present invention, the image may be a 3D image; in that case, the image in use on the first terminal 21 is a 3D image, and the image data generation unit 13 obtains the image data of the 2D image corresponding to the 3D image and generates the 3D image data from the obtained 2D image data. The image data generation unit 13 may obtain the image data of the 2D image corresponding to the 3D image by using identification information of the 3D image received from the first terminal 21.

The image data generation unit 13 generates 3D image data from the image data of the image based on the mapping information. More specifically, the image data generation unit 13 may generate the 3D image data by rotating the 2D image data of the image based on the 3D conversion parameter included in the mapping information. As described above, the 3D conversion parameter may include an angle value and a speed value; that is, the image data generation unit 13 may generate the 3D image data by rotating the 2D image data of the image based on the angle value and the speed value. Here, the angle value indicates the angle by which the 2D image data is rotated, and the speed value indicates the speed at which the 2D image data is rotated.

The image data generation unit 13 may generate left-eye image data by rotating the 2D image data to the left based on the angle value and the speed value, generate right-eye image data by rotating the 2D image data to the right based on the angle value and the speed value, and generate the 3D image data by synthesizing the generated left-eye image data and right-eye image data.

The image data generation unit 13 may separate a background image and an object image from the 2D image data, process the separated background image, apply a depth value to the separated object image based on the angle value and the speed value, and generate the 3D image data using the processed background image and the depth-applied object image.
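The sketch below is a crude approximation of the rotation-based conversion described above (Python with NumPy), not the patent's exact algorithm: it stands in for the left/right rotation by displacing a segmented object layer horizontally by a disparity derived from the angle value, scaled by the speed value, and compositing it over the background.

```python
import numpy as np

def make_stereo_pair(image: np.ndarray, object_mask: np.ndarray,
                     angle_value: float, speed_value: float):
    """Approximate left/right eye views for a single-channel image and a 0/1
    object mask of the same shape. The disparity is a crude stand-in for
    rotating the object about a vertical axis by the angle value."""
    disparity = int(round(np.tan(angle_value) * speed_value))

    background = image * (1 - object_mask)   # "processed" background (kept as-is here)
    obj = image * object_mask

    def composite(shift: int) -> np.ndarray:
        shifted_obj = np.roll(obj, shift, axis=1)
        shifted_mask = np.roll(object_mask, shift, axis=1)
        return np.where(shifted_mask > 0, shifted_obj, background)

    left_eye = composite(-disparity)   # object displaced left for the left-eye view
    right_eye = composite(disparity)   # object displaced right for the right-eye view
    return left_eye, right_eye

# The "3D image data" could then be a synthesized frame, e.g. side-by-side:
# frame_3d = np.concatenate(make_stereo_pair(img, mask, 0.8, 5), axis=1)
```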

Hereinafter, an example will be described in which the image data generation unit 13 separates the background image and the object image, applies a depth value to the separated object image by generating left-eye image data and right-eye image data from the object image (which is 2D image data), synthesizes the generated left-eye and right-eye image data into a 3D object image (which is 3D image data), and generates the 3D image data using the generated 3D object image and the processed background image. However, this example is only one embodiment of the present invention, and the present invention is not limited thereto.

According to an embodiment of the present invention, the image data generation unit 13 generates a rendered image from the afterimage produced by rotating the 2D object image. In this case, different rendered images are generated according to the angle value and the speed value of the rotation, through which the quality (e.g., resolution) of the 3D image data can be adjusted.

According to an embodiment of the present invention, the image data generation unit 13 generates a left-eye image and a right-eye image from the 2D object image, and generates a 3D stereoscopic image by synthesizing the generated left-eye and right-eye images. In other words, the image data generation unit 13 converts an arbitrary point P(x, y) of the 2D object image into a point Q(x, y, z) of the 3D stereoscopic image.

According to an embodiment of the present invention, the image data generation unit 13 generates the left-eye image by rotating the 2D object image to the right about a rotation axis, and generates the right-eye image by rotating the 2D object image to the left about the rotation axis. Here, the rotation axis is the line connecting two arbitrary points F(x, y) and G(x, y) near the center of the 2D object image, and F and G may be determined as the two points with the largest gray-level difference along the center dividing line of the 2D object image.

According to an embodiment of the present invention, the image data generation unit 13 rotates the 2D object image data using a vector consisting of the speed value and the angle value. The speed value may be denoted by V and indicates how fast the 2D object image is rotated; its unit may typically be m/s, but is not limited thereto. The angle value may be denoted by A and indicates how far the 2D object image is rotated; its unit may typically be radians, but is not limited thereto.

According to an embodiment of the present invention, the image data generation unit 13 extracts the edges of the solid of revolution generated according to the speed value and the angle value, forms a closed curve from the extracted edges, generates the left-eye image and the right-eye image using the region enclosed by the closed curve, and generates the 3D object image by synthesizing the generated left-eye and right-eye images.

According to an embodiment of the present invention, the image data generation unit 13 may generate the 3D object from the 2D object image and then fill the remaining background portion with the afterimage of the solid of revolution. In general, the surface area of the solid of revolution may be determined as shown in Equation 1 according to the speed value and the angle value, where S is the surface area of the solid of revolution, F and G are the two points defining the rotation axis, h(x) is the cross section of the 2D object image, A is the angle value, and V is the speed value.

[Equation 1]

(Equation 1 is provided as an image in the original publication and is not reproduced here.)
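For background only, and not as the patent's Equation 1: the classical surface-area formula for a solid of revolution about the axis segment from F to G with cross section h(x) is

```latex
S \;=\; 2\pi \int_{F}^{G} h(x)\,\sqrt{1 + \bigl(h'(x)\bigr)^{2}}\; dx
```

Equation 1 may build on a formula of this kind, additionally involving the angle value A and the speed value V; its exact form is only available in the image referenced above.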

The image data providing unit 14 provides the generated 3D image data to the second terminal 23. In this case, the image data providing unit 14 may provide the 2D image data and the generated 3D image data to the second terminal 23.

According to another embodiment of the present invention, the image data providing unit 14 may provide 3D image data to both the first terminal 21 and the second terminal 23, and may provide different 3D image data to each. For example, the image data providing unit 14 may provide the second terminal 23 with 3D image data generated from the 2D image data for the second terminal, and provide the first terminal 21 with 3D image data generated from the 2D image data for the first terminal.

According to another embodiment of the present invention, the image data providing unit 14 may provide 3D image data to the second terminal 23 and a third terminal (not shown). For example, when the request signal receiving unit 12 receives, from the first terminal 21, a request signal asking that the image in use on the first terminal 21 be shared with the second terminal 23 and the third terminal (not shown), the image data generation unit 13 generates 3D image data from the 2D image data, and the image data providing unit 14 provides the generated 3D image data to the second terminal 23 and the third terminal (not shown).

According to another embodiment of the present invention, the image data providing unit 14 may provide different 3D image data to the second terminal 23 and the third terminal (not shown). For example, when the request signal receiving unit 12 receives, from the first terminal 21, a request signal asking that the image in use on the first terminal 21 be shared with the second terminal 23 and the third terminal (not shown), the image data generation unit 13 generates first 3D image data and second 3D image data from the 2D image data, and the image data providing unit 14 provides the generated first 3D image data to the second terminal 23 and the generated second 3D image data to the third terminal (not shown). In this case, the image data generation unit 13 generates the first 3D image data using the 3D conversion parameter corresponding to the second terminal 23, and generates the second 3D image data using the 3D conversion parameter corresponding to the third terminal (not shown).

According to another embodiment of the present invention, the image data providing unit 14 may transmit generation information for the 3D image data to a content server (not shown), thereby requesting the content server (not shown) to transmit the 3D image data to the second terminal 23. In this case, the image data generation unit 13 may receive image information on the image data of the 2D image from the content server (not shown) and, based on the received information, generate the generation information describing how to generate the 3D image data from the image data of the 2D image.

The database 15 stores data. The data includes data input and output between the components inside the 3D image data providing server 10, as well as data input and output between the 3D image data providing server 10 and components outside it. For example, the database 15 may store the request signal received from the first terminal 21 and the mapping information output by the mapping information generation unit 11. Examples of such a database 15 include a hard disk drive, a read-only memory (ROM), a random access memory (RAM), a flash memory, and a memory card existing inside or outside the 3D image data providing server 10.

FIG. 4 is a configuration diagram of a terminal 40 according to an embodiment of the present invention. Referring to FIG. 4, the terminal 40 includes a mapping information generation unit 41, a request signal receiving unit 42, an image data generation unit 43, an image data providing unit 44, and a database 45. However, the terminal 40 shown in FIG. 4 is only one implementation example of the present invention, and various modifications are possible based on the components shown in FIG. 4. For example, the terminal 40 may further include an image data receiving unit (not shown) that receives image data of a 2D image or a 3D image from a content server or from the 3D image data providing server 10. As another example, the terminal 40 may further include a user interface for receiving commands or information from the user. In this case, the user interface may be an input device such as a keyboard or a mouse, or a graphical user interface (GUI) presented on the image display device.

The mapping information generation unit 41 generates mapping information between a plurality of receiving terminals and 3D conversion parameters based on performance information of each of the plurality of receiving terminals. Here, the plurality of receiving terminals are registered in the same user information as the terminal 40, and the mapping information generation unit 41 may generate mapping information among the plurality of terminals, state information indicating the state of the network, and the 3D conversion parameters based on the state information and the performance information. The 3D conversion parameter may include an angle value and a speed value as parameters for adjusting the degree of stereoscopic effect of the 3D image data.

The request signal receiving unit 42 receives a request signal asking that an image be shared with any one of the plurality of receiving terminals, either from one of the plurality of receiving terminals 50 or from a user interface (not shown). The image data generation unit 43 generates 3D image data from the image data of the image based on the mapping information. The image data providing unit 44 provides the generated 3D image data to the receiving terminal 50. The database 45 stores data.

The terminal 40 is a terminal that can itself perform the functions of the 3D image data providing server 10 described above. In other words, the terminal 40 can perform the operations of both the first terminal 21 and the 3D image data providing server 10 described above with reference to FIGS. 1 to 3. The receiving terminal 50 of FIG. 4 corresponds to the second terminal 23 described above with reference to FIGS. 1 to 3.

Therefore, matters not described below with respect to the terminal 40 are the same as, or can easily be inferred by those skilled in the art from, what has been described above with respect to the first terminal 21 and the 3D image data providing server 10 with reference to FIGS. 1 to 3, and are omitted here. In particular, the descriptions of the mapping information generation unit 11, the request signal receiving unit 12, the image data generation unit 13, the image data providing unit 14, and the database 15 given with reference to FIGS. 1 to 3 apply mutatis mutandis to the mapping information generation unit 41, the request signal receiving unit 42, the image data generation unit 43, the image data providing unit 44, and the database 45 of the terminal 40.

FIG. 5 is a flowchart illustrating a method of providing 3D image data according to an embodiment of the present invention. The 3D image data providing method according to the exemplary embodiment shown in FIG. 5 includes steps processed in time series by the 3D image data providing server 10 according to the exemplary embodiment shown in FIG. 2. Therefore, even where omitted below, the description of the 3D image data providing server 10 of FIG. 2 also applies to the 3D image data providing method according to the exemplary embodiment shown in FIG. 5.

In operation S51, the mapping information generation unit 11 generates mapping information between the plurality of terminals and the 3D conversion parameters based on the performance information of each of the plurality of terminals. In operation S52, the request signal receiving unit 12 receives, from the first terminal 21 among the plurality of terminals, a request signal asking that the image in use on the first terminal 21 be shared with the second terminal 23. In operation S53, the image data generation unit 13 generates 3D image data from the image data of the image based on the mapping information. In operation S54, the image data providing unit 14 provides the generated 3D image data to the second terminal 23.
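A minimal end-to-end sketch of operations S51 to S54 (Python, reusing the hypothetical structures and functions from the earlier sketches; the class, method, and field names are illustrative, not from the patent):

```python
class Image3DProvidingServer:
    """Illustrative skeleton of the server 10: S51 build mapping info,
    S52 accept a share request, S53 convert to 3D, S54 deliver."""

    def __init__(self, terminal_profiles, network_bandwidth_mbps):
        self.profiles = {p.terminal_id: p for p in terminal_profiles}
        self.network_bandwidth_mbps = network_bandwidth_mbps
        self.mapping_info = {}

    def generate_mapping_info(self):                                          # S51
        for tid, profile in self.profiles.items():
            self.mapping_info[tid] = lookup_3d_params(
                self.network_bandwidth_mbps, profile.resolution)

    def on_share_request(self, source_id, target_id, image_2d, object_mask):  # S52
        speed, angle = self.mapping_info[target_id]
        left, right = make_stereo_pair(image_2d, object_mask, angle, speed)   # S53
        self.provide(target_id, (left, right))                                # S54

    def provide(self, target_id, image_3d):
        # Stand-in for streaming the generated 3D image data to the second terminal.
        print(f"sending 3D image data ({image_3d[0].shape}) to {target_id}")
```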

The 3D image data providing method described with reference to FIG. 5 may also be implemented in the form of a recording medium containing computer-executable instructions, such as program modules executed by a computer. Computer-readable media can be any available media that can be accessed by a computer, and include volatile and nonvolatile media as well as removable and non-removable media. Computer-readable media may include both computer storage media and communication media. Computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storing information such as computer-readable instructions, data structures, program modules, or other data. Communication media typically include computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave, or other transport mechanism, and include any information delivery media.

The foregoing description of the present invention is for illustration, and those skilled in the art will understand that the present invention may easily be modified into other specific forms without changing its technical spirit or essential features. It is therefore to be understood that the embodiments described above are illustrative in all respects and not restrictive. For example, each component described as a single entity may be implemented in a distributed manner, and components described as distributed may be implemented in a combined form.

The scope of the present invention is defined by the following claims rather than by the above description, and all changes or modifications derived from the meaning and scope of the claims and their equivalents should be construed as falling within the scope of the present invention.

10: 3D video data providing server
11: Mapping Information Generator
12: request signal receiving unit
13: Image data generator
14: image data providing unit
40: terminal

Claims (20)

A 3D image data providing server for providing 3D image data to a terminal, the server comprising:
a mapping information generation unit configured to generate mapping information between a plurality of terminals and 3D conversion parameters based on performance information of each of the plurality of terminals;
a request signal receiving unit configured to receive, from a first terminal among the plurality of terminals, a request signal requesting that an image in use on the first terminal be shared with a second terminal;
an image data generation unit configured to generate 3D image data from the image data of the image based on the mapping information; and
an image data providing unit configured to provide the generated 3D image data to the second terminal.
The server of claim 1,
wherein the mapping information generation unit generates mapping information among the plurality of terminals, state information indicating a state of a network, and the 3D conversion parameters, based on the state information and the performance information.
The server of claim 1,
wherein the 3D conversion parameter is a parameter for adjusting a degree of stereoscopic effect of the generated 3D image data.
The server of claim 1,
wherein the 3D conversion parameter comprises an angle value and a speed value.
The server of claim 1,
wherein the plurality of terminals are terminals registered in the same user information.
The server of claim 2,
wherein the mapping information generation unit generates a plurality of pieces of mapping information corresponding to a plurality of sets of user information.
The server of claim 1,
wherein the mapping information generation unit generates the mapping information based on setting information received from the first terminal or the second terminal.
The server of claim 1,
wherein the mapping information generation unit receives update information on the provided 3D image data from the second terminal, and updates the mapping information based on the received update information.
The server of claim 1,
wherein the image data of the image is 2D image data of the image, and
wherein the image data generation unit generates the 3D image data by rotating the 2D image data of the image based on the 3D conversion parameter included in the mapping information.
The server of claim 9,
wherein the 3D conversion parameter includes an angle value and a speed value, and
wherein the image data generation unit generates the 3D image data by rotating the 2D image data of the image based on the angle value and the speed value.
The server of claim 10,
wherein the image data generation unit separates a background image and an object image from the 2D image data, processes the separated background image, applies a depth value to the separated object image based on the angle value and the speed value, and generates the 3D image data using the processed background image and the depth-applied object image.
The server of claim 10,
wherein the image data generation unit generates left-eye image data by rotating the 2D image data to the right based on the angle value and the speed value, generates right-eye image data by rotating the 2D image data to the left based on the angle value and the speed value, and generates the 3D image data by synthesizing the generated left-eye image data and the generated right-eye image data.
The server of claim 10,
wherein the angle value represents an angle by which the 2D image data is rotated, and the speed value represents a speed at which the 2D image data is rotated.
The server of claim 1,
wherein the image in use on the first terminal is a 3D image, and
wherein the image data generation unit obtains image data of a 2D image corresponding to the 3D image and generates the 3D image data from the obtained image data of the 2D image.
The server of claim 1,
wherein the request signal receiving unit receives, from the first terminal among the plurality of terminals, a request signal requesting that the image in use on the first terminal be continued on the second terminal.
The server of claim 9,
wherein the image data providing unit provides the 2D image data and the 3D image data to the second terminal.
The server of claim 1,
wherein the request signal receiving unit receives, from the first terminal, a request signal requesting that the image in use on the first terminal be shared with the second terminal and a third terminal,
wherein the image data generation unit generates first 3D image data and second 3D image data from the image data of the image, and
wherein the image data providing unit provides the generated first 3D image data to the second terminal and provides the generated second 3D image data to the third terminal.
The server of claim 1,
wherein the image data generation unit receives image information on the image data of the image from a content server, and generates, based on the received information, generation information for generating 3D image data from the image data of the image,
wherein the image data providing unit transmits the generated generation information to the content server, and
wherein the content server generates the 3D image data based on the received generation information.
A method of providing 3D image data to a terminal by a 3D image data providing server, the method comprising:
generating mapping information between a plurality of terminals and 3D conversion parameters based on performance information of each of the plurality of terminals;
receiving, from a first terminal among the plurality of terminals, a request signal requesting that an image in use on the first terminal be shared with a second terminal;
generating 3D image data from the image data of the image based on the mapping information; and
providing the generated 3D image data to the second terminal.
A terminal for providing 3D image data to a receiving terminal, the terminal comprising:
a mapping information generation unit configured to generate mapping information between a plurality of receiving terminals and 3D conversion parameters based on performance information of each of the plurality of receiving terminals;
a request signal receiving unit configured to receive, from a user interface, a request signal requesting that an image be shared with any one of the plurality of receiving terminals;
an image data generation unit configured to generate 3D image data from the image data of the image based on the mapping information; and
an image data providing unit configured to provide the generated 3D image data to the one of the receiving terminals.
KR1020110132817A 2011-12-12 2011-12-12 Server and method for receiving video sharing request from a plurality of devices and providing 3d video data to one of the plurality of devices, the devices KR20130066121A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110132817A KR20130066121A (en) 2011-12-12 2011-12-12 Server and method for receiving video sharing request from a plurality of devices and providing 3d video data to one of the plurality of devices, the devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020110132817A KR20130066121A (en) 2011-12-12 2011-12-12 Server and method for receiving video sharing request from a plurality of devices and providing 3d video data to one of the plurality of devices, the devices

Publications (1)

Publication Number Publication Date
KR20130066121A true KR20130066121A (en) 2013-06-20

Family

ID=48862490

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110132817A KR20130066121A (en) 2011-12-12 2011-12-12 Server and method for receiving video sharing request from a plurality of devices and providing 3d video data to one of the plurality of devices, the devices

Country Status (1)

Country Link
KR (1) KR20130066121A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102335096B1 (en) * 2021-05-21 2021-12-03 손승호 System for providing video production service compositing figure video and ground video
KR102438910B1 (en) * 2022-01-24 2022-09-01 주식회사 엔에스랩 Method of providing 3D content in response to device changes



Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination