KR20110112211A - System and method for providing multimedia service in a communication system - Google Patents


Info

Publication number
KR20110112211A
Authority
KR
South Korea
Prior art keywords
effect
sensory
multimedia
information
binary
Prior art date
Application number
KR1020110030397A
Other languages
Korean (ko)
Inventor
이은서
최범석
Original Assignee
한국전자통신연구원 (Electronics and Telecommunications Research Institute, ETRI)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자통신연구원 (Electronics and Telecommunications Research Institute, ETRI)
Priority to US13/080,095 priority Critical patent/US20110282967A1/en
Publication of KR20110112211A publication Critical patent/KR20110112211A/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/258Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8543Content authoring using a description language, e.g. Multimedia and Hypermedia information coding Expert Group [MHEG], eXtensible Markup Language [XML]

Abstract

The present invention relates to a multimedia service providing system and method for providing high-capacity multimedia content, together with the various sensory effects of that content, to users at high speed and in real time. According to a service request, the multimedia content of the multimedia service is generated along with sensory effect information representing the sensory effects of the content; the sensory effect information is encoded in binary notation using a binary representation encoding scheme; the binary-encoded sensory effect information is converted into control information in binary notation; and the multimedia content and the sensory effects are provided to the user in real time through device commands issued according to the binary control information.

Description

System and method for providing multimedia service in communication system

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a communication system and, more particularly, to a multimedia service providing system and method for providing various forms of multimedia content, together with the various sensory effects of that content, to users at high speed and in real time.

In communication systems, active research is being conducted on providing users with services of various quality-of-service (QoS) levels at high transmission speeds. Such systems transmit various types of service data to users quickly and stably over limited resources, according to the service requests of users who wish to receive those services, and various measures have been proposed for providing each user with the requested service.

Meanwhile, in current communication systems, methods for transmitting large amounts of service data at high speed have been proposed to meet users' varied service needs; in particular, methods for high-speed transmission of the large volumes of multimedia data required by users of diverse multimedia services are being actively researched. In other words, users want to receive higher-quality multimedia services through the communication system; in particular, they want to be provided not only with the multimedia content of those services but also with the various sensory effects of that content, yielding a higher-quality multimedia service.

However, current communication systems are limited to satisfying users' multimedia service requests by transmitting only the multimedia content. In particular, no concrete scheme has yet been proposed for providing users with the various sensory effects of the multimedia content in addition to the content itself, as required to meet the demand for higher-quality multimedia services. That is, current communication systems lack a concrete method for transmitting multimedia content and its various sensory effects at high speed so as to provide various high-quality multimedia services to each user in real time.

Accordingly, there is a need for a method of providing high-quality, high-capacity multimedia services at high speed in accordance with users' service needs in a communication system and, in particular, of providing the high-quality, high-capacity multimedia service required by each user in real time.

Accordingly, an object of the present invention is to provide a system and method for providing a multimedia service in a communication system.

Another object of the present invention is to provide a multimedia service providing system and method for providing various high-quality multimedia services to users in real time in accordance with a user's service request in a communication system.

Another object of the present invention is to provide a system and method that transmit the multimedia content of the multimedia service each user wishes to receive, together with the various sensory effects of that content, at high speed in a communication system, thereby providing various high-quality multimedia services to each user in real time.

A system of the present invention for achieving the above objects, in a system for providing a multimedia service in a communication system, includes: a service provider for providing, according to users' service requests, the multimedia content of the multimedia service and sensory effect information indicating the sensory effects of that content; a user server for receiving multimedia data including the multimedia content and the sensory effect information, converting the sensory effect information from the multimedia data into control information, and providing the control information; and user devices for providing the multimedia content and the sensory effects to the users in real time through device commands issued according to the control information.

Another system of the present invention for achieving the above objects, in a system for providing a multimedia service in a communication system, includes: a generator for generating, according to a user's service request, the multimedia content of the multimedia service and sensory effect information indicating the sensory effects of that content; an encoder for encoding the sensory effect information in binary notation using a binary representation coding scheme; and a transmitter for transmitting multimedia data including the multimedia content and the binary-encoded sensory effect information.

In accordance with another aspect of the present invention, a method of providing a multimedia service in a communication system includes: generating, according to users' service requests, the multimedia content of the multimedia service and sensory effect information indicating the sensory effects of that content; encoding the sensory effect information in binary notation using a binary representation coding scheme; converting the binary-encoded sensory effect information into control information in binary notation; and providing the multimedia content and the sensory effects to the users in real time through device commands issued according to the binary control information.

The present invention reliably provides the various high-quality multimedia services that each user wishes to receive in a communication system; in particular, it provides each user with the multimedia content of the service together with the various sensory effects of that content. In addition, by encoding the information indicating the sensory effects using a binary representation, the present invention transmits the multimedia content and its sensory effects at high speed; accordingly, the content and sensory effects can be provided to each user in real time, that is, the various high-quality multimedia services can be provided to users in real time.

FIG. 1 schematically illustrates the structure of a multimedia service providing system according to an embodiment of the present invention.
FIG. 2 schematically illustrates the structure of a service provider in a multimedia service providing system according to an exemplary embodiment of the present invention.
FIG. 3 schematically illustrates the structure of a user server in a multimedia service providing system according to an exemplary embodiment of the present invention.
FIG. 4 schematically illustrates the structure of a user device in a multimedia service providing system according to an exemplary embodiment of the present invention.
FIG. 5 illustrates a location model of sensory effect metadata in a multimedia service providing system according to an exemplary embodiment of the present invention.
FIG. 6 illustrates a movement pattern in a sensory effect of the multimedia service providing system according to an embodiment of the present invention.
FIG. 7 illustrates a motion trace sample pattern in a sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.
FIG. 8 illustrates an incline pattern in a sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.
FIG. 9 illustrates a shake pattern in a sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.
FIG. 10 illustrates a wave pattern in a sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.
FIG. 11 illustrates a spin pattern in a sensory effect of the multimedia service providing system according to an embodiment of the present invention.
FIG. 12 illustrates a turn pattern in a sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.
FIG. 13 illustrates a collide pattern in a sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.
FIG. 14 illustrates a horizontal movement pattern in a sensory effect of the multimedia service providing system according to an embodiment of the present invention.
FIG. 15 illustrates a vertical movement pattern in a sensory effect of the multimedia service providing system according to an embodiment of the present invention.
FIG. 16 illustrates a direction incline pattern in a sensory effect of the multimedia service providing system according to an embodiment of the present invention.
FIG. 17 illustrates a direction shake pattern in a sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.
FIG. 18 illustrates a shake motion distance in a sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.
FIGS. 19 and 20 illustrate wave motion directions in a sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.
FIGS. 21 and 22 illustrate a wave motion starting direction in a sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.
FIG. 23 illustrates a wave motion distance in a sensory effect of the multimedia service providing system according to an embodiment of the present invention.
FIG. 24 illustrates a turn pattern direction in a sensory effect of the multimedia service providing system according to an embodiment of the present invention.
FIG. 25 illustrates a horizontal collide pattern in a sensory effect of the multimedia service providing system according to an embodiment of the present invention.
FIG. 26 illustrates a vertical collide pattern in a sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.
FIG. 27 schematically illustrates a multimedia service providing process of the multimedia service providing system according to an embodiment of the present invention.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description, only the parts necessary for understanding the operation of the present invention are described; descriptions of other parts are omitted so as not to obscure the gist of the invention.

The present invention proposes a multimedia service providing system and method for providing various high-quality multimedia services at high speed and in real time in a communication system. According to an embodiment of the present invention, the multimedia content of the multimedia service and the various sensory effects of that content are transmitted at high speed, according to the service requests of users who wish to receive various high-quality services, thereby providing the high-quality multimedia services required by each user in real time.

In an embodiment of the present invention, the multimedia content and its various sensory effects are transmitted at high speed by making maximum use of the resources available for providing the multimedia service. Because the multimedia content users want is a large amount of data, most of the available resources are consumed by its transmission. The resources left for transmitting the various sensory effects, which must also be delivered to provide the high-quality multimedia services users require, are therefore even more limited. As a result, to provide various high-quality multimedia services to users quickly and in real time, high-speed transmission of the sensory effects is as essential as high-speed transmission of the large volume of multimedia content.

That is, in an embodiment of the present invention, to provide the multimedia service required by each user at high speed and in real time over the available resources, the multimedia content is coded and, in particular, the information indicating the various sensory effects (hereinafter, 'sensory effect information') is encoded using a binary representation. This minimizes the data size of the sensory effect information, so that the multimedia content and its various sensory effects are transmitted at high speed; as a result, the content and sensory effects, that is, the various high-quality multimedia services, are provided to the users in real time.

In addition, according to an embodiment of the present invention, in Moving Picture Experts Group (MPEG)-V, the sensory effect information of the multimedia content is transmitted at high speed using a binary code notation; that is, sensory effect data or sensory effect metadata is transmitted at high speed through binary notation, so that the various sensory effects of the content are provided in real time to each user receiving the multimedia service.

Here, the embodiment of the present invention relates to high-speed transmission of sensory effect information, that is, sensory effect data or sensory effect metadata as defined in Part 3 of MPEG-V. A service provider, which generates and provides or sells the multimedia content of various high-quality multimedia services according to each user's service request, encodes the multimedia content and, in particular, encodes the various sensory effects of that content in binary notation, that is, encodes the sensory effect information using a binary code notation. The service provider then transmits the multimedia content and the binary-encoded sensory effect information at high speed to a user server, such as a home server.

Because the service provider encodes and transmits the sensory effect information in binary notation, the sensory effect information is transmitted at high speed even though the resources available for it, that is, the resources remaining after the transmission of the large multimedia content, are extremely limited. Accordingly, the service provider transmits both the multimedia content and the sensory effect information to the user server at high speed and, as a result, provides each user with the various sensory effects of the content in real time.

Here, the user server outputs the multimedia service and transmits the multimedia content and sensory effect information to the user devices that provide the actual multimedia service to each user. The user server converts the binary-encoded sensory effect information into control information for device control of each user device, and transmits the binary-notation control information to each user device. Each user device is then controlled according to the binary control information to output its sensory effect, that is, to provide users with the various sensory effects of the multimedia content in real time.

For example, Part 3 of MPEG-V defines a schema for efficiently describing the various sensory effects that may appear in a scene of multimedia content or in a real environment. When wind blows in a particular movie scene, for instance, a wind-like sensory effect is described using the predefined schema and inserted into the multimedia data; when a home server plays the movie from that multimedia data, it extracts the sensory effect information and synchronizes with a user device capable of producing a wind effect, such as a fan, to give the user the sensation of wind. In another example, a learner (that is, a user) purchases user devices capable of producing various sensory effects and places them in the home, while an instructor (that is, a service provider) delivers a lecture remotely (that is, transmits multimedia data); by transmitting various sensory effects matched to the lecture content (that is, the multimedia content), the instructor delivers a more realistic education, in other words, a higher-quality multimedia service.
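
The home-server behavior above can be sketched as follows. This is a minimal illustration only: the element and attribute names in the XML snippet are hypothetical and merely loosely modeled on the idea of MPEG-V Part 3 sensory effect metadata, not the actual schema.

```python
import xml.etree.ElementTree as ET

# Illustrative sensory-effect description; element and attribute names
# are assumptions for this sketch, not the real MPEG-V Part 3 schema.
SEM_DOCUMENT = """
<SEM>
  <Effect type="Wind" intensity="50" start="00:01:23" duration="5"/>
  <Effect type="Vibration" intensity="30" start="00:02:10" duration="2"/>
</SEM>
"""

def extract_effects(xml_text):
    """Parse the effect descriptions a home server would extract from multimedia data."""
    root = ET.fromstring(xml_text)
    return [{"type": node.get("type"),
             "intensity": int(node.get("intensity")),
             "start": node.get("start"),
             "duration": int(node.get("duration"))}
            for node in root.findall("Effect")]

effects = extract_effects(SEM_DOCUMENT)
# The wind effect would then be routed to a fan-type user device in sync
# with playback of the corresponding scene.
wind = [e for e in effects if e["type"] == "Wind"][0]
print(wind["intensity"])  # 50
```

As the description notes, this XML form must be parsed before the effects can be rendered, which is the overhead the binary notation is meant to avoid.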

To provide high-quality multimedia services, the sensory effect information delivered with the multimedia content may be described as an extensible markup language (XML) document. For example, when the service provider describes the sensory effect information as an XML document, it is transmitted to the user server as an XML document; the user server that receives it must then parse the XML document and analyze the sensory effect information it contains.

When the user server must parse an XML document to obtain the sensory effect information, there is a limit to how quickly various high-quality multimedia services can be provided to users in real time. In the embodiment of the present invention, however, because the sensory effect information is encoded and transmitted in binary notation as described above, parsing of the XML document and analysis of the sensory effect information are unnecessary, so that various high-quality multimedia services are provided to users at high speed and in real time. In other words, in Part 3 of MPEG-V, the embodiment of the present invention compresses and transmits the sensory effect information through a binary code notation rather than as an XML document; this reduces the number of bits, that is, the amount of resources, used to transmit the sensory effect information, and transmits it quickly and efficiently by omitting the XML parsing and analysis step. Next, the multimedia service providing system according to an exemplary embodiment of the present invention will be described in more detail with reference to FIG. 1.
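
The size advantage of a binary representation over an XML document can be illustrated with a toy encoding. The fixed record layout below is an assumption made for this sketch, not the binary syntax defined by MPEG-V.

```python
import struct

# Hypothetical fixed binary layout for one sensory effect record:
# effect-type code (1 byte), intensity 0-100 (1 byte),
# start time in ms (4 bytes), duration in ms (4 bytes).
EFFECT_CODES = {"Wind": 1, "Vibration": 2, "Temperature": 3}

def encode_effect_binary(effect_type, intensity, start_ms, duration_ms):
    return struct.pack(">BBII", EFFECT_CODES[effect_type],
                       intensity, start_ms, duration_ms)

def encode_effect_xml(effect_type, intensity, start_ms, duration_ms):
    return ('<Effect type="%s" intensity="%d" start="%d" duration="%d"/>'
            % (effect_type, intensity, start_ms, duration_ms)).encode("utf-8")

binary = encode_effect_binary("Wind", 50, 83000, 5000)
xml = encode_effect_xml("Wind", 50, 83000, 5000)
print(len(binary), len(xml))  # 10 66
```

Even in this toy case the binary record is several times smaller than the equivalent XML element, and it can be consumed directly with a fixed-layout unpack instead of an XML parse, which is the gain the embodiment relies on.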

FIG. 1 is a diagram schematically illustrating a structure of a multimedia service providing system according to an exemplary embodiment of the present invention.

Referring to FIG. 1, the multimedia service providing system includes a service provider 110 that generates and provides or sells the various high-quality multimedia services each user wishes to receive according to the user's service request; a user server 130 that transmits the multimedia service provided by the service provider 110 to the user; and a plurality of user devices that output the multimedia service transmitted from the user server 130 and actually provide it to the user, for example, user device 1 152, user device 2 154, user device 3 156, and user device N 158.

As described above, the service provider 110 generates the multimedia content of the multimedia service each user wishes to receive according to the users' service requests, and also generates sensory effect information for providing the various sensory effects of that content to each user. The service provider 110 encodes the multimedia content and the sensory effect information and transmits them to the user server 130 at high speed.

In this case, the service provider 110 encodes the sensory effect information in binary notation as described above, that is, using a binary notation encoding scheme, so that the data size of the sensory effect information is minimized; the binary-notation sensory effect information of minimum data size is then transmitted to the user server 130. The service provider 110 thus transmits the multimedia data at high speed by making maximum use of the resources available for providing the multimedia service. In particular, the service provider 110 transmits, as the multimedia data, the encoded multimedia content and the binary-encoded sensory effect information to the user server 130; that is, the multimedia data transmitted to the user server 130 includes the encoded multimedia content and the binary-encoded sensory effect information.

Here, the service provider 110 may be a content provider that generates the multimedia service, a communication provider that provides or sells the multimedia service, a service seller, or the like. The service provider 110 will be described in more detail with reference to FIG. 2, so a detailed description is omitted here.

The user server 130 receives the multimedia data from the service provider 110, transmits the multimedia content included in the multimedia data to the corresponding user device, for example, user device 1 152, and converts the binary-encoded sensory effect information included in the multimedia data into control information, which it transmits to the corresponding user devices, for example, user device 2 154, user device 3 156, and user device N 158. The user server 130 may receive the binary-encoded sensory effect information of the multimedia content from the service provider 110 as described above, or it may receive sensory effect information in the form of an XML document from another, conventional service provider.

When the user server 130 receives binary-encoded sensory effect information, it converts the sensory effect information into control information in binary notation and transmits the binary-encoded control information to the user devices 152, 154, 156, and 158, or it transmits the binary-notation sensory effect information itself to the user devices 152, 154, 156, and 158 as control information. When the user server 130 receives sensory effect information as an XML document, it converts that sensory effect information into control information, encodes the control information in binary notation, and transmits the binary-encoded control information to the user devices 152, 154, 156, and 158.
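
The user server's conversion of a binary sensory-effect record into a binary device command might look like the sketch below. The record layout, device identifiers, and command layout are all illustrative assumptions; the patent does not specify them.

```python
import struct

# Hypothetical mapping from effect-type code to a target device identifier;
# both the codes and the device IDs are assumptions for this sketch.
EFFECT_TO_DEVICE = {1: 0x21, 2: 0x22}   # e.g. wind -> fan, vibration -> chair

def effect_to_command(effect_record):
    """Convert one binary sensory-effect record into a binary device command.

    Assumed input layout:  type(1B) intensity(1B) start_ms(4B) duration_ms(4B).
    Assumed output layout: device_id(1B) level(1B) duration_ms(4B).
    """
    etype, intensity, start_ms, duration_ms = struct.unpack(">BBII", effect_record)
    device_id = EFFECT_TO_DEVICE[etype]
    level = intensity * 255 // 100       # rescale 0-100 intensity to a 0-255 device level
    return start_ms, struct.pack(">BBI", device_id, level, duration_ms)

record = struct.pack(">BBII", 1, 50, 83000, 5000)   # a wind effect record
start, command = effect_to_command(record)
print(start, command.hex())  # 83000 217f00001388
```

Note that both input and output stay in fixed binary layouts, so no document parsing is needed at any step, which matches the description's rationale for the binary notation.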

Here, the user server 130 may be a terminal that receives multimedia data from the service provider 110, or a home server or the like for the user devices 152, 154, 156, and 158 that output the multimedia content and its various sensory effects to actual users. The user server 130 will be described in more detail with reference to FIG. 3, so a detailed description is omitted here.

In addition, the user devices 152, 154, 156, and 158 receive the multimedia content and control information from the user server 130, and output, that is, provide, the actual multimedia content and its various sensory effects to each user. Here, the user devices comprise a device that outputs the multimedia content, that is, its image and sound, for example, user device 1 152, and devices that each output one of the various sensory effects of the content, namely user devices 154, 156, and 158.

As described above, user device 1 152 outputs the image and sound of the multimedia service and provides them to the users, while the remaining user devices 154, 156, and 158 each receive binary-encoded control information from the user server 130 and are controlled according to it to output the corresponding sensory effect. In particular, because this control information directs the sensory effect to be output simultaneously with the image and sound of the multimedia service, and because the binary-encoded control information requires no analysis, the remaining user devices 154, 156, and 158 output the sensory effect at high speed; as a result, the sensory effect is provided to the users in real time, simultaneously with the image and sound of the multimedia service.

Here, the user devices 152, 154, 156, and 158 may be a video display and speaker for outputting the image and sound, as well as various devices for outputting each of the sensory effects, such as a fan, an air conditioner, a humidifier, a hot-air fan, or a boiler. That is, the user devices 152, 154, 156, and 158 are controlled according to the binary-encoded control information to provide users with high-quality multimedia services in real time; they provide not only the image and sound that constitute the multimedia content but also the various sensory effects, simultaneously and in real time. The various sensory effects of the multimedia content may include, for example, a light effect, a colored light effect, a flash light effect, a temperature effect, a wind effect, a vibration effect, a spraying effect, a water sprayer effect, a scent effect, a smoke effect, a color correction effect, motion and kinesthetic effects (for example, a rigid body motion effect, a passive kinesthetic motion effect, a passive kinesthetic force effect, and an active kinesthetic effect), a tactile effect, and the like. The user devices 152, 154, 156, and 158 will be described in more detail with reference to FIG. 4, so a detailed description is omitted here.
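
On the device side, acting on a binary command reduces to a fixed-layout unpack followed by actuation, with no parsing step. The 6-byte command layout below is the same illustrative assumption used throughout this sketch, not a format from the patent or from MPEG-V.

```python
import struct

# Minimal device-side dispatch: a user device reads a fixed-layout binary
# command and actuates immediately. The layout device_id(1B) level(1B)
# duration_ms(4B) is an assumption for illustration.
def handle_command(command, actuators):
    device_id, level, duration_ms = struct.unpack(">BBI", command)
    actuators[device_id](level, duration_ms)

log = []
# A fan registered under the assumed device ID 0x21; a real device would
# drive hardware here instead of appending to a log.
actuators = {0x21: lambda level, ms: log.append(("fan", level, ms))}
handle_command(struct.pack(">BBI", 0x21, 127, 5000), actuators)
print(log)  # [('fan', 127, 5000)]
```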

As described above, in the system for providing a multimedia service according to an embodiment of the present invention, the service provider 110 generates sensory effect information in real time according to the multimedia content, or obtains sensory effect information as an XML document; the service provider 110 then encodes the sensory effect information in binary notation, and the binary-encoded sensory effect information is transmitted to the user server 130 over a network.

In other words, in the system according to an embodiment of the present invention, the service provider 110 encodes the sensory effect information of the multimedia content using a binary notation encoding scheme, as in Part 3 of MPEG-V, and transmits the binary-encoded sensory effect information together with the multimedia content to the user server 130 as multimedia data. The system thereby transmits the multimedia data while making maximum use of the network available for providing the multimedia service; in particular, because the sensory effect information is encoded using a binary notation encoding scheme, its data size is minimized, and as a result the multimedia data is transmitted to the user server 130 at high speed and in real time.

In addition, the user server 130 receives the sensory effect information encoded in binary notation and thereby obtains, at high speed, the sensory effect information for providing various high-quality multimedia services to users. After converting the sensory effect information into control information, the user server 130 encodes the converted control information in binary notation and transmits it to the respective user devices 152, 154, 156, and 158. Each of the user devices 152, 154, 156, and 158 then executes a device command according to the control information encoded in binary notation, thereby providing various sensory effects to users in real time simultaneously with the multimedia content. Next, the service provider 110 in the multimedia service providing system according to an exemplary embodiment of the present invention will be described in more detail with reference to FIG. 2.

FIG. 2 is a diagram schematically illustrating the structure of a service provider in a multimedia service providing system according to an exemplary embodiment of the present invention.

Referring to FIG. 2, the service provider 110 may include, as described above, a generator 1 210 that generates the multimedia content of the multimedia service that each user wants to receive in response to the users' service requests; a generator 2 220 that generates the various sensory effects of the multimedia content, that is, sensory effect information, or obtains sensory effect information as an XML document; an encoder 1 230 that encodes the multimedia content; an encoder 2 240 that encodes the sensory effect information using a binary notation encoding scheme; and a transmitter 1 250 that transmits the multimedia data including the encoded multimedia content and sensory effect information to the user server 130.

The generator 1 210 generates multimedia content corresponding to the various high-quality multimedia services that users want to receive, or obtains the multimedia content from an external device. In order to provide various high-quality multimedia services to users, the generator 2 220 generates sensory effect information of the multimedia content so that various sensory effects can be provided simultaneously with the multimedia content, or receives and obtains sensory effect information as an XML document from an external device.

The encoder 1 230 encodes the multimedia content using a predetermined encoding scheme. The encoder 2 240 encodes the sensory effect information using a binary notation encoding scheme, that is, in binary notation, whereby the sensory effect information is encoded into a binary code in the form of a stream. In other words, the encoder 2 240 serves as a sensory effect stream encoder and outputs a sensory effect stream in which the sensory effect information is encoded in binary notation.

Here, in encoding the sensory effect information in binary notation, the encoder 2 240 follows defined syntax, binary representation, and semantics of the sensory effects corresponding to the sensory effect information. In addition, as the sensory effect information is encoded in binary notation, its data size is minimized. As described above, when the user server 130 receives the sensory effect information in binary notation, it identifies the sensory effect information and converts it into control information through stream decoding of the binary code, without having to analyze the sensory effect information. The sensory effect information and its binary notation encoding will be described in more detail below, so a detailed description is omitted here.

The transmitter 1 250 transmits the multimedia data including the multimedia content and the sensory effect information, that is, the encoded multimedia content and the sensory effect information encoded into the binary code, to the user server 130. Here, as described above, the sensory effect information is encoded into a binary code in the form of a stream and transmitted, that is, transmitted as a sensory effect information stream encoded in binary notation, so that the transmitter 1 250 can make maximum use of the available resources and transmit the multimedia data to the user server 130 at high speed and in real time. Next, the user server 130 in the multimedia service providing system according to an exemplary embodiment of the present invention will be described in more detail with reference to FIG. 3.

FIG. 3 is a diagram schematically illustrating the structure of a user server in a multimedia service providing system according to an exemplary embodiment of the present invention.

Referring to FIG. 3, the user server 130 may include a receiver 1 310 that receives the multimedia data from the service provider 110; a decoder 1 320 that, as described above, decodes the sensory effect information encoded in binary notation in the received multimedia data; a converter 330 that converts the decoded sensory effect information into control information for device control of each of the user devices 152, 154, 156, and 158; an encoder 3 340 that encodes the converted control information using a binary notation encoding scheme; and a transmitter 2 350 that transmits the multimedia content in the multimedia data and the control information encoded in binary notation to the respective user devices 152, 154, 156, and 158.

As described above, the receiver 1 310 receives, from the service provider 110, the multimedia data including the multimedia content and the sensory effect information of the multimedia content encoded in binary notation. In addition, the receiver 1 310 may receive multimedia data including multimedia content and sensory effect information as an XML document from another service provider.

The decoder 1 320 decodes the sensory effect information encoded in binary notation from the multimedia data. In this case, since the sensory effect information encoded in binary notation is a sensory effect stream encoded as a binary code in the form of a stream, the decoder 1 320 serves as a sensory effect stream decoder: it decodes the sensory effect stream encoded in binary notation and transmits the decoded sensory effect information to the converter 330. In addition, when the receiver 1 310 receives multimedia data including sensory effect information as an XML document, the decoder 1 320 analyzes and confirms the sensory effect information of the XML document, and transmits the confirmed sensory effect information to the converter 330.

The converter 330 converts the sensory effect information into control information for device control of the user devices 152, 154, 156, and 158. In this case, the converter 330 converts the sensory effect information into the control information in consideration of the capability information of the user devices 152, 154, 156, and 158.

Here, the receiver 1 310 of the user server 130 receives the capability information of the user devices 152, 154, 156, and 158 from the user devices 152, 154, 156, and 158, respectively. In particular, as described above, since the user server 130 manages and controls the user devices 152, 154, 156, and 158, each of the user devices transmits its capability information to the user server 130 upon its initial connection and setup to the user server 130 for providing the multimedia service.

Therefore, the converter 330 converts the sensory effect information into the control information in consideration of the capability information so that the user devices 152, 154, 156, and 158 can accurately output the sensory effects indicated by the sensory effect information, that is, so that the sensory effects of the multimedia content can be accurately provided to the user in real time; through the device control according to the control information, the user devices 152, 154, 156, and 158 accurately provide the sensory effects of the multimedia content to the users in real time.

The encoder 3 340 encodes the converted control information using a binary notation encoding scheme, that is, the control information is encoded in binary notation into a binary code in the form of a stream. In other words, the encoder 3 340 serves as a device control stream encoder and outputs a device control stream in which the control information for controlling the devices is encoded in binary notation.

Here, as the control information is encoded in binary notation, the control information in binary notation becomes a control signal for each of the user devices 152, 154, 156, and 158; upon receiving the control information in binary notation, each of the user devices 152, 154, 156, and 158 is controlled through stream decoding of the binary code to output a sensory effect, without analyzing the control information. In addition, as described above, the receiver 1 310 of the user server 130 receives, from the service provider 110, either the sensory effect information of the multimedia content encoded in binary notation or sensory effect information as an XML document.

More specifically, when the receiver 1 310 receives the sensory effect information encoded in binary notation, as described above, the decoder 1 320 stream-decodes the sensory effect information encoded in binary notation, the converter 330 converts the sensory effect information into control information in consideration of the capability information of the user devices 152, 154, 156, and 158, and the encoder 3 340 then encodes the converted control information in binary notation; the control information encoded in binary notation is transmitted to the user devices 152, 154, 156, and 158, respectively.

Alternatively, when the receiver 1 310 receives the sensory effect information encoded in binary notation, the user server 130 may, as described above, use the sensory effect information in binary notation itself as the control information and transmit it to the user devices 152, 154, 156, and 158. In this case, the decoder 1 320 stream-decodes the sensory effect information encoded in binary notation, the control information conversion operation in the converter 330 is not performed, and the encoder 3 340 encodes the decoded sensory effect information in binary notation in consideration of the capability information of the user devices 152, 154, 156, and 158. In other words, the encoder 3 340 outputs the sensory effect information encoded in binary notation in consideration of the capability information as control information encoded in binary notation for device control of the user devices 152, 154, 156, and 158, and the control information encoded in binary notation is transmitted to the user devices 152, 154, 156, and 158, respectively.

In addition, when the receiver 1 310 receives sensory effect information as an XML document, the decoder 1 320 analyzes and confirms the sensory effect information of the XML document, the converter 330 converts the confirmed sensory effect information into control information in consideration of the capability information of the user devices 152, 154, 156, and 158, and the encoder 3 340 then encodes the converted control information in binary notation; the control information encoded in binary notation is transmitted to the user devices 152, 154, 156, and 158, respectively.

For example, when the user server 130 receives sensory effect information in binary representation, or sensory effect information as an XML document, including two-level wind effect information (for example, a wind of 2 m/s), the user server 130 identifies the user device providing the wind effect through the capability information of the user devices 152, 154, 156, and 158; for example, it identifies a fan through the capability information of the fan and controls that device to output the effect, that is, operates the fan at three levels. Here, having confirmed through the capability information of the fan that the fan outputs a wind of 2 m/s when operating at three levels, the user server 130 transmits to the fan control information in binary notation that causes the fan to be so controlled. The fan receives the control information in binary notation from the user server 130, decodes it, and operates at three levels. As a result, the users are provided, in real time, with a 2 m/s wind blowing effect simultaneously with the multimedia content they are watching.
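The conversion step in this example can be sketched as follows. This is a hedged illustration only: the capability structure (a list of per-level wind speeds) and the function name are assumptions made for illustration, not the MPEG-V device capability schema.

```python
# Illustrative converter step: map a requested wind-effect intensity (in m/s)
# onto a fan's discrete operating levels using its capability information.
def wind_to_fan_level(wind_mps, capability):
    """Pick the lowest fan level whose output meets the requested wind speed."""
    for level, level_mps in enumerate(capability["level_speeds_mps"], start=1):
        if level_mps >= wind_mps:
            return level
    return len(capability["level_speeds_mps"])  # clamp to the maximum level

# Assumed capability: the fan reports the wind speed it produces at each level.
fan_capability = {"level_speeds_mps": [0.5, 1.2, 2.0, 3.5]}  # levels 1..4
level = wind_to_fan_level(2.0, fan_capability)
print(level)  # → 3, matching the example of a fan operated at three levels
```

The resulting level would then be encoded in binary notation by the encoder 3 340 and streamed to the fan.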

The transmitter 2 350 transmits the multimedia content included in the multimedia data and the control information encoded in binary notation to the user devices 152, 154, 156, and 158, respectively. Here, the control information encoded in binary notation is transmitted to each of the user devices 152, 154, 156, and 158 in the form of a stream. Next, the user devices 152, 154, 156, and 158 in the multimedia service providing system according to an exemplary embodiment of the present invention will be described in detail with reference to FIG. 4.

FIG. 4 is a diagram schematically illustrating the structure of a user device in a multimedia service providing system according to an exemplary embodiment of the present invention.

Referring to FIG. 4, the user device includes a receiver 2 410 that receives the multimedia content or the control information encoded in binary notation from the user server 130; a decoder 2 420 that decodes the multimedia content or the control information encoded in binary notation; a controller 430 that performs device control according to the decoded control information; and an output unit 440 that outputs the multimedia content, or outputs the various sensory effects of the multimedia content to the user, thereby providing various high-quality multimedia services.

The receiver 2 410 receives the multimedia content transmitted from the transmitter 2 350 of the user server 130, or receives the control information encoded in binary notation. Here, the control information encoded in binary notation is transmitted in the form of a stream, and the receiver 2 410 receives the control information stream encoded in binary notation. In addition, as described above, when the user device is a device that outputs the multimedia content, that is, the images and sound of the multimedia service, the receiver 2 410 receives the multimedia content, the decoder 2 420 decodes the multimedia content, and the output unit 440 then outputs the multimedia content, that is, provides the user with the images and sound of the multimedia service. Hereinafter, for convenience of description, the case in which the receiver 2 410 receives the control information encoded in binary notation, that is, the case in which the user device is a device that provides the various sensory effects of the multimedia content to users, will be described.

The decoder 2 420 decodes the control information in binary notation received in the form of a stream. In this case, since the control information encoded in binary notation is a control information stream encoded as a binary code in the form of a stream, the decoder 2 420 serves as a device control stream decoder: it decodes the control information stream encoded in binary notation and transmits the decoded control information to the controller 430 as a device control signal.

The controller 430 receives the control information from the decoder 2 420 as a control signal and performs device control according to the control information. That is, the controller 430 controls the user device so as to provide the user with the sensory effects of the multimedia content according to the control information. In this case, as the control information is encoded in binary notation and transmitted from the user server 130, the user device outputs the sensory effect at high speed without performing an analysis and confirmation operation on the control information, and thus provides the sensory effect to users in real time simultaneously with the multimedia content.

In contrast, if the receiver 2 410 were to receive control information as an XML document, the decoder 2 420 would have to analyze and confirm the control information of the XML document before the controller 430 could output the sensory effect through device control according to the confirmed control information; because of this analysis and confirmation operation, the sensory effect could not be output at high speed, and as a result the user device could not provide the sensory effect to users in real time simultaneously with the multimedia content. However, the user server 130 of the multimedia service providing system according to an embodiment of the present invention encodes the control information in binary notation in consideration of the capability information of the user devices 152, 154, 156, and 158, and transmits it to the user devices 152, 154, 156, and 158, respectively. Accordingly, each of the user devices 152, 154, 156, and 158 outputs the sensory effect at high speed without performing an analysis and confirmation operation on the control information, and as a result provides the sensory effect in real time simultaneously with the multimedia content.

The output unit 440 outputs the sensory effect of the multimedia content in accordance with the device control according to the control information in binary notation. Next, the sensory effects and sensory effect information of the multimedia content, and the binary notation encoding of the sensory effects by the service provider 110, will be described in more detail.

First, the sensory effect information, that is, the base datatypes and elements of the sensory effect metadata, will be described. Their syntax can be expressed as shown in Table 1 below. Here, Table 1 shows the syntax of the sensory effect metadata.

<!-- ################################################ -->
<!-- SEM Base Attributes                              -->
<!-- ################################################ -->
<attributeGroup name="SEMBaseAttributes">
  <attribute name="activate" type="boolean" use="optional"/>
  <attribute name="duration" type="positiveInteger" use="optional"/>
  <attribute name="fade" type="positiveInteger" use="optional"/>
  <attribute name="alt" type="anyURI" use="optional"/>
  <attribute name="priority" type="positiveInteger" use="optional"/>
  <attribute name="location" type="mpeg7:termReferenceType" use="optional"/>
  <attributeGroup ref="sedl:SEMAdaptabilityAttributes"/>
</attributeGroup>
<simpleType name="intensityValueType">
  <restriction base="float"/>
</simpleType>
<simpleType name="intensityRangeType">
  <restriction>
    <simpleType>
      <list itemType="float"/>
    </simpleType>
    <length value="2" fixed="true"/>
  </restriction>
</simpleType>

<!-- ################################################ -->
<!-- SEM Adaptability Attributes                      -->
<!-- ################################################ -->
<attributeGroup name="SEMAdaptabilityAttributes">
  <attribute name="adaptType" type="sedl:adaptTypeType" use="optional"/>
  <attribute name="adaptRange" type="sedl:adaptRangeType" default="10" use="optional"/>
</attributeGroup>
<simpleType name="adaptTypeType">
  <restriction base="NMTOKEN">
    <enumeration value="Strict"/>
    <enumeration value="Under"/>
    <enumeration value="Over"/>
    <enumeration value="Both"/>
  </restriction>
</simpleType>
<simpleType name="adaptRangeType">
  <restriction base="unsignedInt">
    <minInclusive value="0"/>
    <maxInclusive value="100"/>
  </restriction>
</simpleType>

<!-- ################################################ -->
<!-- SEM Base type                                    -->
<!-- ################################################ -->
<complexType name="SEMBaseType" abstract="true">
  <complexContent>
    <restriction base="anyType">
      <attribute name="id" type="ID" use="optional"/>
    </restriction>
  </complexContent>
</complexType>

The binary coding notation, or binary representation, of the base datatypes and elements of the sensory effect metadata may be represented as in Table 2 below. Here, Table 2 shows the binary representation of the base datatypes and elements of the sensory effect metadata.

SEMBaseAttributes {                        Number of bits   Mnemonic
  activateFlag                             1                bslbf
  durationFlag                             1                bslbf
  fadeFlag                                 1                bslbf
  altFlag                                  1                bslbf
  priorityFlag                             1                bslbf
  locationFlag                             1                bslbf
  if (activateFlag) {
    activate                               1                bslbf
  }
  if (durationFlag) {
    duration                               32               uimsbf
  }
  if (fadeFlag) {
    fade                                   32               uimsbf
  }
  if (altFlag) {
    alt                                                     UTF-8
  }
  if (priorityFlag) {
    priority                               32               uimsbf
  }
  if (locationFlag) {
    location                               7                bslbf (Table 4)
  }
  SEMAdaptabilityAttributes                                 SEMAdaptabilityAttributes
}

SEMAdaptabilityAttributes {                Number of bits   Mnemonic
  adaptTypeFlag                            1                bslbf
  if (adaptTypeFlag) {
    adaptType                              2                bslbf (Table 6)
  }
  adaptRange                               7                uimsbf
}

SEMBaseType {                              Number of bits   Mnemonic
  idFlag                                   1                bslbf
  if (idFlag) {
    id                                     See ISO 10646    UTF-8
  }
}
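The flag-then-value pattern of Table 2 can be sketched as a small bit writer. This is a hedged illustration under stated assumptions: field order and widths follow the table above, but the writer class and function names are illustrative, and the variable-length UTF-8 alt payload is omitted for brevity.

```python
# Minimal bit writer that serializes SEMBaseAttributes per the Table 2 layout:
# six 1-bit presence flags, then each present value at its fixed width.
class BitWriter:
    def __init__(self):
        self.bits = []

    def write(self, value, width):
        self.bits += [int(b) for b in format(value, f"0{width}b")]

    def to_bytes(self):
        padded = self.bits + [0] * (-len(self.bits) % 8)  # pad to byte boundary
        return bytes(
            int("".join(map(str, padded[i:i + 8])), 2)
            for i in range(0, len(padded), 8)
        )

def encode_sem_base_attributes(attrs):
    w = BitWriter()
    for name in ("activate", "duration", "fade", "alt", "priority", "location"):
        w.write(1 if name in attrs else 0, 1)       # six presence flags, 1 bit each
    if "activate" in attrs:
        w.write(1 if attrs["activate"] else 0, 1)   # activate: 1 bit, bslbf
    if "duration" in attrs:
        w.write(attrs["duration"], 32)              # duration: 32 bits, uimsbf
    if "fade" in attrs:
        w.write(attrs["fade"], 32)                  # fade: 32 bits, uimsbf
    # alt (UTF-8, variable length) omitted in this sketch
    if "priority" in attrs:
        w.write(attrs["priority"], 32)              # priority: 32 bits, uimsbf
    if "location" in attrs:
        w.write(attrs["location"], 7)               # location: 7-bit code
    return w.to_bytes()

stream = encode_sem_base_attributes({"activate": True, "duration": 500})
print(stream.hex())  # → c2000003e8
```

Only 5 bytes carry an activated effect with a 32-bit duration, which is the compactness the description attributes to binary notation.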

In addition, the semantics of the base datatypes and elements of the sensory effect metadata may be represented as shown in Table 3 below. Here, Table 3 shows the semantics of the SEMBaseAttributes.

activateFlag: This field signals the presence of the activate attribute. If it is set to "1" the activate attribute follows.
durationFlag: This field signals the presence of the duration attribute. If it is set to "1" the duration attribute follows.
fadeFlag: This field signals the presence of the fade attribute. If it is set to "1" the fade attribute follows.
altFlag: This field signals the presence of the alt attribute. If it is set to "1" the alt attribute follows.
priorityFlag: This field signals the presence of the priority attribute. If it is set to "1" the priority attribute follows.
locationFlag: This field signals the presence of the location attribute. If it is set to "1" the location attribute follows.
activate: Indicates whether the effect shall be activated. A value of true means the effect shall be activated, and false means the effect shall be deactivated.
duration: Describes the duration according to the time scheme used. The time scheme used shall be identified by means of the si:absTimeScheme and si:timeScale attributes, respectively.
fade: Describes the fade time according to the time scheme used, within which the defined intensity shall be reached. The time scheme used shall be identified by means of the si:absTimeScheme and si:timeScale attributes, respectively.
alt: Describes an alternative effect identified by a URI.
NOTE 1 - The alternative might point to an effect (or a list of effects) within the same description or an external description.
NOTE 2 - The alternative might be used in case the original effect cannot be processed.
EXAMPLE 1 - The alternative effect is chosen because the original intended effect cannot be processed due to a lack of devices supporting this effect.
priority: Describes the priority for effects with respect to other effects in the same group of effects sharing the same point in time when they should become available for consumption. A value of one indicates the highest priority, and larger values indicate lower priorities.
NOTE 3 - The priority might be used to process effects within a group of effects according to the capabilities of the adaptation engine.
EXAMPLE 2 - The adaptation engine processes the individual effects of a group of effects according to their priority in descending order due to its limited capabilities. That is, effects with low priority might get lost.
location: Describes the location from where the effect is expected to be received from the user's perspective according to the x-, y-, and z-axes as depicted in the location model for sensory effect metadata. A classification scheme that may be used for this purpose is the LocationCS defined in Annex A.2.1. The terms from the LocationCS shall be concatenated with the ":" sign in the order of the x-, y-, and z-axes to uniquely define a location within the three-dimensional space. For referring to a group of locations, a wildcard mechanism may be employed using the "*" sign.
EXAMPLE 4 - urn:mpeg:mpeg-v:01-SI-LocationCS-NS:center:middle:front defines the location as follows: center on the x-axis, middle on the y-axis, and front on the z-axis. That is, it describes all effects at the center, middle, front side of the user.
EXAMPLE 5 - urn:mpeg:mpeg-v:01-SI-LocationCS-NS:left:*:midway defines the location as follows: left on the x-axis, any location on the y-axis, and midway on the z-axis. That is, it describes all effects at the left, midway side of the user.
EXAMPLE 6 - urn:mpeg:mpeg-v:01-SI-LocationCS-NS:*:*:back defines the location as follows: any location on the x-axis, any location on the y-axis, and back on the z-axis. That is, it describes all effects at the back of the user.
In the binary description, the following mapping table is used for location.

In the SEM base attribute semantics shown in Table 3 above, the location uses a location model for sensory effect metadata as shown in FIG. 5. FIG. 5 is a diagram illustrating a location model of sensory effect metadata in a multimedia service providing system according to an exemplary embodiment of the present invention.

That is, as shown in FIG. 5, the location model of the sensory effect metadata includes, on the x-y-z spatial coordinates, back 502, midway 504, front 506, bottom 508, middle 510, left 512, centerleft 514, center 516, centerright 518, right 520, and top 522. Here, the location model of the sensory effect metadata may further include more locations by subdividing the x-y-z spatial coordinates beyond the locations shown in FIG. 5.

Each location of the sensory effect metadata on the x-y-z spatial coordinates illustrated in FIG. 5 may be represented in binary notation as shown in Table 4 below; that is, in the SEM base attribute semantics shown in Table 3, the location is encoded in binary notation. Here, Table 4 shows the binary representation of the locations on the x-y-z spatial coordinates.

location   term of location
0000000    *:*:*
0000001    left:*:*
0000010    centerleft:*:*
0000011    center:*:*
0000100    centerright:*:*
0000101    right:*:*
0000110    *:bottom:*
0000111    *:middle:*
0001000    *:top:*
0001001    *:*:back
0001010    *:*:midway
0001011    *:*:front
0001100    left:bottom:*
0001101    centerleft:bottom:*
0001110    center:bottom:*
0001111    centerright:bottom:*
0010000    right:bottom:*
0010001    left:middle:*
0010010    centerleft:middle:*
0010011    center:middle:*
0010100    centerright:middle:*
0010101    right:middle:*
0010110    left:top:*
0010111    centerleft:top:*
0011000    center:top:*
0011001    centerright:top:*
0011010    right:top:*
0011011    left:*:back
0011100    centerleft:*:back
0011101    center:*:back
0011110    centerright:*:back
0011111    right:*:back
0100000    left:*:midway
0100001    centerleft:*:midway
0100010    center:*:midway
0100011    centerright:*:midway
0100100    right:*:midway
0100101    left:*:front
0100110    centerleft:*:front
0100111    center:*:front
0101000    centerright:*:front
0101001    right:*:front
0101010    *:bottom:back
0101011    *:middle:back
0101100    *:top:back
0101101    *:bottom:midway
0101110    *:middle:midway
0101111    *:top:midway
0110000    *:bottom:front
0110001    *:middle:front
0110010    *:top:front
0110011    left:bottom:back
0110100    centerleft:bottom:back
0110101    center:bottom:back
0110110    centerright:bottom:back
0110111    right:bottom:back
0111000    left:middle:back
0111001    centerleft:middle:back
0111010    center:middle:back
0111011    centerright:middle:back
0111100    right:middle:back
0111101    left:top:back
0111110    centerleft:top:back
0111111    center:top:back
1000000    centerright:top:back
1000001    right:top:back
1000010    left:bottom:midway
1000011    centerleft:bottom:midway
1000100    center:bottom:midway
1000101    centerright:bottom:midway
1000110    right:bottom:midway
1000111    left:middle:midway
1001000    centerleft:middle:midway
1001001    center:middle:midway
1001010    centerright:middle:midway
1001011    right:middle:midway
1001100    left:top:midway
1001101    centerleft:top:midway
1001110    center:top:midway
1001111    centerright:top:midway
1010000    right:top:midway
1010001    left:bottom:front
1010010    centerleft:bottom:front
1010011    center:bottom:front
1010100    centerright:bottom:front
1010101    right:bottom:front
1010110    left:middle:front
1010111    centerleft:middle:front
1011000    center:middle:front
1011001    centerright:middle:front
1011010    right:middle:front
1011011    left:top:front
1011100    centerleft:top:front
1011101    center:top:front
1011110    centerright:top:front
1011111    right:top:front
1100000 ~ 1111111    Reserved

The semantics of the adaptability attributes of the sensory effect metadata can be expressed as shown in Table 5 below. Here, Table 5 shows the semantics of the SEMAdaptabilityAttributes.

adaptTypeFlag: This field signals the presence of the adaptType attribute. If it is set to "1" the adaptType attribute follows.
adaptType: Describes the preferred type of adaptation with the following possible instantiations:
Strict: An adaptation by approximation may not be performed.
Under: An adaptation by approximation may be performed with a smaller effect value than the specified effect value.
Over: An adaptation by approximation may be performed with a greater effect value than the specified effect value.
Both: An adaptation by approximation may be performed between the upper and lower bounds specified by adaptRange.
adaptRange: Describes the upper and lower bounds in percentage for the adaptType. If the adaptType is not present, adaptRange shall be ignored. The value of adaptRange should be between 0 and 100.
adaptRangeFlag: Flag indicating whether the adaptRange attribute is used. If the value is 1, the adaptRange attribute is used.

In the SEM adaptability attribute semantics shown in Table 5, the adaptType is encoded in binary notation as shown in Table 6 below. Here, Table 6 shows the binary notation of the adaptType.

adaptType   Semantics
00          Strict
01          Under
10          Over
11          Both
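Combining Table 6 with the SEMAdaptabilityAttributes layout of Table 2 (adaptTypeFlag 1 bit, adaptType 2 bits if present, adaptRange 7 bits) gives a compact sketch. The function below is illustrative; its default adaptRange of 10 mirrors the schema default.

```python
# 2-bit adaptType codes from Table 6.
ADAPT_TYPE = {"Strict": 0b00, "Under": 0b01, "Over": 0b10, "Both": 0b11}

def encode_adaptability(adapt_type=None, adapt_range=10):
    """Serialize SEMAdaptabilityAttributes as a bit string per Table 2."""
    bits = ""
    if adapt_type is None:
        bits += "0"                                    # adaptTypeFlag = 0
    else:
        bits += "1"                                    # adaptTypeFlag = 1
        bits += format(ADAPT_TYPE[adapt_type], "02b")  # adaptType, 2 bits
    bits += format(adapt_range, "07b")                 # adaptRange, 7 bits (0..100)
    return bits

print(encode_adaptability("Both", 50))  # → 1110110010
```

A "Both" adaptation with a 50% range thus occupies only 10 bits in the stream.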

In addition, the semantics of the SEM base type of the sensory effect metadata may be represented as in Table 7 below. Here, Table 7 shows the semantics of the SEMBaseType.

SEMBaseType: Provides the topmost type of the base type hierarchy.
id: Identifies the id of the SEMBaseType.
idFlag: This field, which is only present in the binary representation, signals the presence of the id attribute. If it is set to "1" the id attribute follows.

Next, regarding the sensory effect information, that is, the root element of the sensory effect metadata, its syntax may be represented as shown in Table 8 below. Here, Table 8 shows the syntax of the root element.

<!-- ################################################ -->
<!-- Definition of the SEM root element               -->
<!-- ################################################ -->
<element name="SEM">
  <complexType>
    <sequence>
      <element name="DescriptionMetadata" type="sedl:DescriptionMetadataType" minOccurs="0" maxOccurs="1"/>
      <choice maxOccurs="unbounded">
        <element name="Declarations" type="sedl:DeclarationsType"/>
        <element name="GroupOfEffects" type="sedl:GroupOfEffectsType"/>
        <element name="Effect" type="sedl:EffectBaseType"/>
        <element name="ReferenceEffect" type="sedl:ReferenceEffectType"/>
      </choice>
    </sequence>
    <attribute name="autoExtraction" type="sedl:autoExtractionType"/>
    <anyAttribute namespace="##other" processContents="lax"/>
  </complexType>
</element>

<simpleType name="autoExtractionType">
  <restriction base="string">
    <enumeration value="audio"/>
    <enumeration value="visual"/>
    <enumeration value="both"/>
  </restriction>
</simpleType>

The binary encoding notation or binary notation of the root elements of the sensory effect metadata may be represented as in Table 9 below. Here, Table 9 is a table showing the binary representation of the root elements of the sensory effect metadata.

SEM {                                Number of bits           Mnemonic
  DescriptionMetadataFlag            1                        bslbf
  if (DescriptionMetadataFlag) {
    DescriptionMetadata                                       DescriptionMetadata
  }
  NumOfElements                                               vluimsbf5
  for (k = 0; k < NumOfElements; k++) {
    ElementID                        4                        uimsbf (Table 3)
    Element                                                   Element
  }
  autoExtractionID                   2                        uimsbf (Table 4)
  anyAttributeType                                            anyAttributeType
}

anyAttributeType {                   Number of bits           Mnemonic
  siAttributes                                                siAttributeList
  anyAttributeFlag                   1                        bslbf
  if (anyAttributeFlag) {
    SizeOfanyAttribute                                        vluimsbf5
    anyAttribute                     SizeOfanyAttribute * 8   bslbf
  }
}
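
Several length fields above use the vluimsbf5 mnemonic (variable-length unsigned integer, most significant bit first, in 5-bit chunks). The sketch below assumes the usual MPEG chunking convention, where each 5-bit chunk carries a one-bit "more chunks follow" flag and four value bits; the function names are illustrative:

```python
def encode_vluimsbf5(value: int) -> str:
    """Encode a non-negative integer as a vluimsbf5 bit string.

    Assumes the common MPEG vluimsbf layout: 5-bit chunks, each holding a
    1-bit continuation flag followed by 4 value bits, MSB first.
    """
    if value < 0:
        raise ValueError("vluimsbf5 encodes unsigned integers only")
    # Split the value into 4-bit groups, most significant group first.
    nibbles = []
    while True:
        nibbles.append(value & 0xF)
        value >>= 4
        if value == 0:
            break
    nibbles.reverse()
    chunks = []
    for i, nib in enumerate(nibbles):
        more = "1" if i < len(nibbles) - 1 else "0"
        chunks.append(more + format(nib, "04b"))
    return "".join(chunks)

def decode_vluimsbf5(bits: str) -> int:
    """Decode a vluimsbf5 bit string back to an unsigned integer."""
    value = 0
    for i in range(0, len(bits), 5):
        chunk = bits[i:i + 5]
        value = (value << 4) | int(chunk[1:], 2)
        if chunk[0] == "0":  # continuation flag cleared: last chunk
            break
    return value
```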

In addition, semantics of the root elements of the sensory effect metadata may be expressed as shown in Table 10 below. Here, Table 10 is a table showing the SEM root element semantics.

Name Definition
SEM Serves as the root element for sensory effects metadata.
DescriptionMetadataFlag This field, which is only present in the binary representation, indicates the presence of the DescriptionMetadata element. If it is 1, the DescriptionMetadata element is present; otherwise it is not present.
DescriptionMetadata Describes general information about the sensory effects metadata.
EXAMPLE - Creation information or Classification Scheme Alias.
NumOfElements This field, which is only present in the binary representation, specifies the number of Element instances accommodated in the SEM.
Declarations Declaration of effects, groups of sensory effects, or parameters.
Effect Describes an effect.
GroupOfEffects Describes a group of effects.
ReferenceEffect Refers to a sensory effect, group of sensory effects, or parameter.
ElementID This field, which is only present in the binary representation, describes which SEM scheme shall be used. In the binary description, the following mapping table is used.
Element Declaration of effects, groups of sensory effects, or parameters.
autoExtractionID Describes whether an automatic extraction of sensory effects from the media resource, which is described by this sensory effect metadata, is desirable. The following values are available:
audio: the automatic extraction of sensory effects from the audio part of the media resource, which is described by this sensory effect metadata, is desirable.
visual: the automatic extraction of sensory effects from the visual part of the media resource, which is described by this sensory effect metadata, is desirable.
both: the automatic extraction of sensory effects from both the audio and visual parts of the media resource, which is described by this sensory effect metadata, is desirable.
In the binary description, the following mapping table is used.
anyAttributeType Reserved area (type of anyAttribute).
siAttributes Makes reference to the siAttributeList.
anyAttributeFlag This field signals the presence of the anyAttribute attribute. If it is set to "1", the anyAttribute follows.
SizeOfanyAttribute Number of bytes in the array for anyAttribute.
anyAttributeType Provides an extension mechanism for including attributes from namespaces other than the target namespace. Attributes that shall be included are the XML streaming instructions as defined in ISO/IEC 21000-7 for the purpose of identifying process units and associating time information to them.

In the SEM root element semantics shown in Table 10, the element ID (ElementID) may be represented in binary notation as shown in Table 11 below. Here, Table 11 is a table showing the binary representation of the element ID.

ElementID  Element
0          Reserved
1          Declarations
2          GroupOfEffects
3          Effect
4          ReferenceEffect
5          Parameter
6-15       Reserved

In addition, in the SEM root element semantics shown in Table 10 above, the autoExtractionID may be represented in binary notation as shown in Table 12 below. Here, Table 12 is a table showing the binary notation of the autoExtractionID.

autoExtractionID  autoExtractionType
00                audio
01                visual
10                both
11                Reserved

Next, regarding the si attribute list (siAttributeList): first, the XML representation syntax of the si attribute list can be represented as shown in Table 13. Table 13 is a table showing the XML representation syntax of the siAttributeList.

<?xml version="1.0"?>
<!-- Digital Item Adaptation ISO/IEC 21000-7 Second Edition -->
<!-- Schema for XML Streaming Instructions -->
<schema
  version="ISO/IEC 21000-7 2nd"
  id="XSI-2nd.xsd"
  xmlns="http://www.w3.org/2001/XMLSchema"
  xmlns:si="urn:mpeg:mpeg21:2003:01-DIA-XSI-NS"
  targetNamespace="urn:mpeg:mpeg21:2003:01-DIA-XSI-NS"
  elementFormDefault="qualified">

  <annotation>
    <documentation>
      Declaration of attributes used for XML streaming instructions
    </documentation>
  </annotation>

  <!-- The following attribute defines the process units -->
  <attribute name="anchorElement" type="boolean"/>

  <!-- The following attribute indicates that the PU shall be encoded as Random Access Point -->
  <attribute name="encodeAsRAP" type="boolean"/>

  <attribute name="puMode" type="si:puModeType"/>
  <simpleType name="puModeType">
    <restriction base="string">
      <enumeration value="self"/>
      <enumeration value="ancestors"/>
      <enumeration value="descendants"/>
      <enumeration value="ancestorsDescendants"/>
      <enumeration value="preceding"/>
      <enumeration value="precedingSiblings"/>
      <enumeration value="sequential"/>
    </restriction>
  </simpleType>

  <!-- The following attributes define the time properties -->
  <attribute name="timeScale" type="unsignedInt"/>
  <attribute name="ptsDelta" type="unsignedInt"/>
  <attribute name="absTimeScheme" type="string"/>
  <attribute name="absTime" type="string"/>
  <attribute name="pts" type="nonNegativeInteger"/>

</schema>

The binary encoding notation or binary notation of the syntax shown in Table 13 may be expressed as shown in Table 14 below. Here, Table 14 is a table showing a binary representation syntax.

siAttributeList {                    (Number of bits)          (Mnemonic)
  anchorElementFlag                  1                         bslbf
  encodeAsRAPFlag                    1                         bslbf
  puModeFlag                         1                         bslbf
  timeScaleFlag                      1                         bslbf
  ptsDeltaFlag                       1                         bslbf
  absTimeSchemeFlag                  1                         bslbf
  absTimeFlag                        1                         bslbf
  ptsFlag                            1                         bslbf
  absTimeSchemeLength                                          vluimsbf5
  absTimeLength                                                vluimsbf5
  if (anchorElementFlag) {
    anchorElement                    1                         bslbf
  }
  if (encodeAsRAPFlag) {
    encodeAsRAP                      1                         bslbf
  }
  if (puModeFlag) {
    puMode                           3                         bslbf (Table 5)
  }
  if (timeScaleFlag) {
    timeScale                        32                        uimsbf
  }
  if (ptsDeltaFlag) {
    ptsDelta                         32                        uimsbf
  }
  if (absTimeSchemeFlag) {
    absTimeScheme                    8 * absTimeSchemeLength   bslbf
  }
  if (absTimeFlag) {
    absTime                          8 * absTimeLength         bslbf
  }
  if (ptsFlag) {
    pts                                                        vluimsbf5
  }
}
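
The flag-then-conditional-field pattern of Table 14 can be sketched for two representative fields. The field widths follow the table (anchorElement is 1 bit, timeScale is a 32-bit unsigned integer), but the function name and the restriction to two attributes are illustrative only:

```python
def encode_si_attributes(anchor_element=None, time_scale=None) -> str:
    """Sketch of the presence-flag pattern for two siAttributeList fields.

    Each optional attribute contributes one presence flag up front; the
    attribute's payload bits are appended only when its flag is set to 1.
    """
    bits = []
    bits.append("1" if anchor_element is not None else "0")  # anchorElementFlag
    bits.append("1" if time_scale is not None else "0")      # timeScaleFlag
    if anchor_element is not None:
        bits.append("1" if anchor_element else "0")          # anchorElement, 1 bit
    if time_scale is not None:
        bits.append(format(time_scale, "032b"))              # timeScale, 32-bit uimsbf
    return "".join(bits)
```

When neither attribute is present, only the two flag bits are emitted, which is the point of the pattern: absent fields cost one bit each.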

The semantics of the si attribute list are shown in Table 15 below. Here, Table 15 is a table showing the semantics of the siAttributeList.

Names Description
anchorElementFlag This field, which is only present in the binary representation, indicates the presence of the anchorElement attribute. If it is 1, the anchorElement attribute is present; otherwise it is not present.
encodeAsRAPFlag This field, which is only present in the binary representation, indicates the presence of the encodeAsRAP attribute. If it is 1, the encodeAsRAP attribute is present; otherwise it is not present.
puModeFlag This field, which is only present in the binary representation, indicates the presence of the puMode attribute. If it is 1, the puMode attribute is present; otherwise it is not present.
timeScaleFlag This field, which is only present in the binary representation, indicates the presence of the timeScale attribute. If it is 1, the timeScale attribute is present; otherwise it is not present.
ptsDeltaFlag This field, which is only present in the binary representation, indicates the presence of the ptsDelta attribute. If it is 1, the ptsDelta attribute is present; otherwise it is not present.
absTimeSchemeFlag This field, which is only present in the binary representation, indicates the presence of the absTimeScheme attribute. If it is 1, the absTimeScheme attribute is present; otherwise it is not present.
absTimeFlag This field, which is only present in the binary representation, indicates the presence of the absTime attribute. If it is 1, the absTime attribute is present; otherwise it is not present.
ptsFlag This field, which is only present in the binary representation, indicates the presence of the pts attribute. If it is 1, the pts attribute is present; otherwise it is not present.
absTimeSchemeLength This field, which is only present in the binary representation, specifies the length of each absTimeScheme instance in bytes. The value of this element is the size of the largest absTimeScheme instance, aligned to a byte boundary by bit stuffing using 0-7 '1' bits.
absTimeLength This field, which is only present in the binary representation, specifies the length of each absTime instance in bytes. The value of this element is the size of the largest absTime instance, aligned to a byte boundary by bit stuffing using 0-7 '1' bits.
anchorElement Describes whether the element shall be an anchor element. A value of true (= 1) means the element shall be an anchor element; false (= 0) means it shall not be an anchor element.
The anchorElement attribute allows one to indicate whether an XML element is an anchor element, i.e., the starting point for composing the process unit.
encodeAsRAP Describes whether the process unit shall be encoded as a random access point. A value of true (= 1) means the process unit shall be encoded as a random access point; false (= 0) means it shall not be encoded as a random access point.
puMode Specifies how elements are aggregated to the anchor element to compose the process unit. For detailed information the reader is referred to ISO/IEC JTC 1/SC 29/WG 11/N9899.
For example, puMode = "descendants" means that the process unit contains the anchor element and its descendant elements. Note that the anchor elements are pictured in white.
In the binary description, the following mapping table is used.
timeScale Describes a time scale.
ptsDelta Describes a processing time stamp delta.
absTimeScheme Describes an absolute time scheme.
absTime Describes an absolute time.
pts Describes a processing time stamp (PTS).

In the semantics of the si attribute list shown in Table 15, the puMode may be represented in binary notation as shown in Table 16 below. That is, in the semantics of the si attribute list shown in Table 15, the puMode is encoded in binary notation. Here, Table 16 shows the binary representation of the puMode.

puMode  puModeType
000     self
001     ancestors
010     descendants
011     ancestorsDescendants
100     preceding
101     precedingSiblings
110     sequential
111     Reserved

Next, when describing the sensory effect information, that is, the description metadata of the sensory effect metadata, the syntax may be expressed as shown in Table 17 below. Here, Table 17 is a table showing the description metadata syntax.

<!-- ################################################ -->
<!-- Definition of Description Metadata Type -->
<!-- ################################################ -->
<complexType name="DescriptionMetadataType">
  <complexContent>
    <extension base="mpeg7:DescriptionMetadataType">
      <sequence>
        <element name="ClassificationSchemeAlias" minOccurs="0" maxOccurs="unbounded">
          <complexType>
            <complexContent>
              <extension base="sedl:SEMBaseType">
                <attribute name="alias" type="NMTOKEN" use="required"/>
                <attribute name="href" type="anyURI" use="required"/>
              </extension>
            </complexContent>
          </complexType>
        </element>
      </sequence>
    </extension>
  </complexContent>
</complexType>

The binary encoding notation or binary notation of the description metadata of the sensory effect metadata may be represented as shown in Table 18 below. Here, Table 18 is a table showing the binary notation of the description metadata of the sensory effect metadata.

DescriptionMetadata {                Number of bits   Mnemonic
  MPEG7DescriptionMetadata           1                mpeg7:DescriptionMetadataType
  NumOfClassSchemeAlias                               vluimsbf5
  for (k = 0; k < NumOfClassSchemeAlias; k++) {
    SEMBaseType[k]                                    SEMBaseType
    alias[k]                                          UTF-8
    href[k]                                           UTF-8
  }
}

In addition, the semantics of the description metadata of the sensory effect metadata may be expressed as shown in Table 19 below. Here, Table 19 is a table showing the description metadata semantics (Semantics of the DescriptionMetadata).

Name Definition
DescriptionMetadata DescriptionMetadataType extends mpeg7:DescriptionMetadataType and provides a sequence of classification schemes for usage in the SEM description.
MPEG7DescriptionMetadata Makes reference to mpeg7:DescriptionMetadata.
NumOfClassSchemeAlias This field, which is only present in the binary representation, specifies the number of ClassificationSchemeAlias instances accommodated in the description metadata.
SEMBase Describes a base type of the sensory effect metadata.
ClassificationSchemeAlias Classification scheme referenced by a URI.
alias Describes the alias assigned to the ClassificationScheme. The scope of the assigned alias shall be the entire description, regardless of where the ClassificationSchemeAlias appears in the description.
href Describes a reference to the classification scheme that is being aliased, using a URI. The classification schemes defined in this part of ISO/IEC 23005, whether normative or informative, shall be referenced by the uri attribute of the ClassificationScheme for that classification scheme.

Next, when describing the sensory effect information, that is, the declarations of the sensory effect metadata, the syntax may be represented as shown in Table 20 below. Here, Table 20 is a table showing the declarations syntax.

<!-- ################################################ -->
<!-- Declarations type -->
<!-- ################################################ -->
<complexType name="DeclarationsType">
  <complexContent>
    <extension base="sedl:SEMBaseType">
      <choice maxOccurs="unbounded">
        <element ref="sedl:GroupOfEffects"/>
        <element ref="sedl:Effect"/>
        <element ref="sedl:Parameter"/>
      </choice>
    </extension>
  </complexContent>
</complexType>

The binary encoding notation or binary notation of the declaration of the sensory effect metadata may be represented as in Table 21 below. Here, Table 21 is a table showing the binary notation of the declaration of the sensory effect metadata.

Declarations {                       Number of bits   Mnemonic
  SEMBaseType                                         SEMBaseType
  NumOfElements                                       vluimsbf5
  for (k = 0; k < NumOfElements; k++) {
    ElementID                        4                bslbf
    Element                                           Element
  }
}

In addition, the semantics of the declaration of the sensory effect metadata may be expressed as shown in Table 22 below. Here, Table 22 shows semantics of the DeclarationsType.

Name Definition
SEMBaseType Describes a base type of the sensory effect metadata.
NumOfElements This field, which is only present in the binary representation, specifies the number of Element instances accommodated in the SEM.
ElementID This field, which is only present in the binary representation, describes which SEM scheme shall be used. In the binary description, make reference to Table 3 (ElementID).
Element See SEM root elements.
Effect See SEM root elements.
GroupOfEffects See SEM root elements.
Parameter Parameter of a sensory effect.

Next, when describing the sensory effect information, that is, a group of effects of the sensory effect metadata, the syntax may be expressed as shown in Table 23 below. Here, Table 23 is a table showing syntax of the effect group.

<!-- ################################################ -->
<!-- Group of Effects type -->
<!-- ################################################ -->
<complexType name="GroupOfEffectsType">
  <complexContent>
    <extension base="sedl:SEMBaseType">
      <choice minOccurs="2" maxOccurs="unbounded">
        <element ref="sedl:Effect"/>
        <element ref="sedl:ReferenceEffect"/>
      </choice>
      <attributeGroup ref="sedl:SEMBaseAttributes"/>
      <anyAttribute namespace="##other" processContents="lax"/>
    </extension>
  </complexContent>
</complexType>

The binary coding notation or binary notation of the effect group of the sensory effect metadata may be represented as in Table 24 below. Here, Table 24 shows a binary representation of the effect group of the sensory effect metadata.

GroupOfEffects {                     Number of bits           Mnemonic
  SEMBaseType                                                 SEMBaseType
  NumOfElements                      5                        uimsbf
  for (k = 0; k < NumOfElements; k++) {
    ElementID                        4                        bslbf
    Element                                                   bslbf
  }
  SEMBaseAttributes                                           SEMBaseAttributes
  anyAttributeType                   SizeOfanyAttribute * 8   anyAttributeType
}

In addition, the semantics of the effect group of the sensory effect metadata may be expressed as shown in Table 25 below. Here, Table 25 is a table showing the Semantics of the Group Of Effects Type.

Name Definition
SEMBaseType Describes a base type of the sensory effect metadata.
NumOfElements This field, which is only present in the binary representation, specifies the number of Element instances accommodated in the SEM.
ElementID This field, which is only present in the binary representation, describes which SEM scheme shall be used. In the binary description, make reference to Table 3 (ElementID). NOTE: The ElementID is restricted to the values 3 and 4.
Element See SEM root elements.
GroupOfEffectsType Tool to express two or more sensory effects.
Effect See SEM root elements.
SEMBaseAttributes Describes a group of attributes for the effects.
anyAttributeType Reserved area (type of anyAttribute).

Next, the sensory effect information, that is, the effect of the sensory effect metadata, will be described. The syntax may be expressed as shown in Table 26 below. Here, Table 26 is a table showing the effect syntax.

<!-- ################################################ -->
<!-- Effect base type -->
<!-- ################################################ -->
<complexType name="EffectBaseType" abstract="true">
  <complexContent>
    <extension base="sedl:SEMBaseType">
      <sequence minOccurs="0">
        <element name="SupplementalInformation" type="sedl:SupplementalInformationType" minOccurs="0"/>
      </sequence>
      <attribute name="autoExtraction" type="sedl:autoExtractionType"/>
      <attributeGroup ref="sedl:SEMBaseAttributes"/>
      <anyAttribute namespace="##other" processContents="lax"/>
    </extension>
  </complexContent>
</complexType>

<complexType name="SupplementalInformationType">
  <sequence>
    <element name="ReferenceRegion" type="mpeg7:SpatioTemporalLocatorType"/>
    <element name="Operator" type="sedl:OperatorType" minOccurs="0"/>
  </sequence>
</complexType>

<simpleType name="OperatorType">
  <restriction base="NMTOKEN">
    <enumeration value="Average"/>
    <enumeration value="Dominant"/>
  </restriction>
</simpleType>

In addition, the binary coding notation or binary notation of the effect of the sensory effect metadata may be represented as shown in Table 27 below. Here, Table 27 is a table showing the binary notation of the effect of the sensory effect metadata.

Effect {                             Number of bits            Mnemonic
  EffectTypeID                       4                         uimsbf (Table 6)
  EffectBaseType                                               EffectBaseType
  EffectType                         See subclauses 4.2-4.15   EffectType
}

EffectBaseType {                     Number of bits            Mnemonic
  SEMBaseType                                                  SEMBaseType
  SupplementalInformationType                                  SupplementalInformationType
  Operator                           1                         bslbf
  ReferenceRegion
  autoExtractionID                   2                         uimsbf (Table 4)
  SEMBaseAttributes                                            SEMBaseAttributes
  anyAttributeType                                             anyAttributeType
  if (anyAttributeFlag) {
    SizeOfanyAttribute                                         vluimsbf5
    anyAttribute                     SizeOfanyAttribute * 8    bslbf
  }
}

SupplementalInformationType {        Number of bits            Mnemonic
  ReferenceRegion
  Operator                           3                         bslbf (Table 7)
}

In the binary notation of the effects shown in Table 27, the effect type ID may be represented as shown in Table 28 below. Here, Table 28 is a table showing the effect type ID in the binary notation of the effect.

EffectTypeID  EffectType
0             Reserved
1             LightType
2             FlashType
3             TemperatureType
4             WindType
5             VibrationType
6             SprayingType
7             ScentType
8             FogType
9             ColorCorrectionType
10            RigidBodyMotionType
11            PassiveKinestheticMotionType
12            PassiveKinestheticForceType
13            ActiveKinestheticType
14            TactileType
15            Reserved
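
The 4-bit EffectTypeID codes above map directly to effect type names. A minimal decode sketch (the dictionary and function names are illustrative, not from the standard):

```python
# 4-bit EffectTypeID codes following the rows of Table 28.
EFFECT_TYPES = {
    1: "LightType", 2: "FlashType", 3: "TemperatureType", 4: "WindType",
    5: "VibrationType", 6: "SprayingType", 7: "ScentType", 8: "FogType",
    9: "ColorCorrectionType", 10: "RigidBodyMotionType",
    11: "PassiveKinestheticMotionType", 12: "PassiveKinestheticForceType",
    13: "ActiveKinestheticType", 14: "TactileType",
}

def decode_effect_type(bits: str) -> str:
    """Map a 4-bit EffectTypeID field to its effect type name."""
    type_id = int(bits, 2)
    # IDs 0 and 15 are Reserved in Table 28.
    return EFFECT_TYPES.get(type_id, "Reserved")
```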

In addition, the semantics of the effect of the sensory effect metadata may be expressed as shown in Table 29 below. Here, Table 29 is a table showing the Semantics of the EffectBaseType (Semantics of the EffectBaseType).

Name Definition
EffectTypeID This field, which is only present in the binary representation, specifies a descriptor identifier. The descriptor identifier indicates the descriptor type accommodated in the Effect.
EffectBaseType EffectBaseType extends SEMBaseType and provides a base abstract type for a subset of types defined as part of the sensory effects metadata types.
SEMBaseAttributes Describes a group of attributes for the effects.
anyAttributeType Provides an extension mechanism for including attributes from namespaces other than the target namespace. Attributes that shall be included are the XML streaming instructions as defined in ISO/IEC 21000-7 for the purpose of identifying process units and associating time information to them.
EXAMPLE - si:pts describes the point in time when the associated information shall become available to the application for processing.

In addition, the semantics of the supplemental information of the effect may be expressed as shown in Table 30 below. Here, Table 30 is a table showing the semantics of the SupplementalInformationType used in the binary notation of the effect shown in Table 27.

Name Definition
SupplementalInformationType Describes the supplemental information.
ReferenceRegion Describes the reference region for automatic extraction from video. If autoExtraction is not present or is not equal to video, this element shall be ignored. The localization scheme used is identified by means of the mpeg7:SpatioTemporalLocatorType that is defined in ISO/IEC 15938-5.
Operator Describes the preferred type of operator for extracting sensory effects from the reference region of the video, with the following possible instantiations:
Average: extracts sensory effects from the reference region by calculating the average value.
Dominant: extracts sensory effects from the reference region by calculating the dominant value.

In the supplemental information type semantics shown in Table 30, the operator may be represented in binary notation as shown in Table 31 below. That is, the operator is encoded in binary notation. Here, Table 31 is a table showing the binary notation of the operator.

Operator  Semantics
000       Reserved
001       Average
010       Dominant
011-111   Reserved

Next, describing the sensory effect information, that is, the reference effect of the sensory effect metadata, the syntax may be expressed as shown in Table 32 below. Here, Table 32 is a table showing the reference effect syntax.

<!-- ################################################ -->
<!-- Reference Effect type -->
<!-- ################################################ -->
<complexType name="ReferenceEffectType">
  <complexContent>
    <extension base="sedl:SEMBaseType">
      <attribute name="uri" type="anyURI" use="required"/>
      <attributeGroup ref="sedl:SEMBaseAttributes"/>
      <anyAttribute namespace="##other" processContents="lax"/>
    </extension>
  </complexContent>
</complexType>

The binary encoding notation or binary notation of the reference effect of the sensory effect metadata may be represented as shown in Table 33 below. Here, Table 33 is a table showing the binary notation of the reference effect of the sensory effect metadata.

ReferenceEffect {                    Number of bits           Mnemonic
  SEMBaseType                                                 SEMBaseType
  uri                                                         UTF-8
  SEMBaseAttributes                                           SEMBaseAttributes
  anyAttributeType                                            anyAttributeType
  anyAttributeFlag                   1                        bslbf
  if (anyAttributeFlag) {
    SizeOfanyAttribute                                        vluimsbf5
    anyAttribute                     SizeOfanyAttribute * 8   bslbf
  }
}

In addition, the semantics of the reference effect of the sensory effect metadata may be expressed as shown in Table 34 below. Here, Table 34 is a table showing the Semantics of the ReferenceEffectType.

Name Definition
ReferenceEffectType Tool for describing a reference to a sensory effect, group of sensory effects, or parameter.
uri Describes a reference to a sensory effect, group of sensory effects, or parameter by a Uniform Resource Identifier (URI). Its target type must be one of, or derived from, sedl:EffectBaseType, sedl:GroupOfEffectsType, or sedl:ParameterBaseType.
SEMBaseAttributes Describes a group of attributes for the effects.
anyAttribute Provides an extension mechanism for including attributes from namespaces other than the target namespace. Attributes that shall be included are the XML streaming instructions as defined in ISO/IEC 21000-7 for the purpose of identifying process units and associating time information to them.
Attributes included here override the attribute values possibly defined within the sensory effect, group of effects, or parameter referenced by the uri.
EXAMPLE - si:pts describes the point in time when the associated information shall become available to the application for processing.

Next, the sensory effect information, that is, the parameters of the sensory effect metadata, will be described. The syntax may be represented as in Table 35 below. Here, Table 35 is a table showing the parameter syntax.

<!-- ################################################ -->
<!-- Parameter Base type -->
<!-- ################################################ -->
<complexType name="ParameterBaseType" abstract="true">
  <complexContent>
    <extension base="sedl:SEMBaseType"/>
  </complexContent>
</complexType>

The binary coding notation or binary notation of the parameter of the sensory effect metadata may be represented as in Table 36 below. Here, Table 36 is a table which shows the binary notation of the parameter of the sensory effect metadata.

ParameterBaseType {                  Number of bits   Mnemonic
  SEMBaseType                                         SEMBaseType
}

In addition, the semantics of the parameter of the sensory effect metadata may be expressed as shown in Table 37 below. Here, Table 37 is a table showing the parameter base type semantics.

Name Definition ParameterBaseType Provides the topmost type of the parameter base type hierarchy.

Next, the sensory effect information, that is, the color correction parameter type of the sensory effect metadata will be described. First, the XML representation syntax of the color correction parameter type may be represented as shown in Table 38 below. Table 38 shows an XML notation syntax of the color correction parameter type.

<!-- ################################################ -->
<!-- Definition of Color Correction Parameter type -->
<!-- ################################################ -->
<complexType name="ColorCorrectionParameterType">
  <complexContent>
    <extension base="sedl:ParameterBaseType">
      <sequence>
        <element name="ToneReproductionCurves" type="sedl:ToneReproductionCurvesType" minOccurs="0"/>
        <element name="ConversionLUT" type="sedl:ConversionLUTType"/>
        <element name="ColorTemperature" type="sedl:IlluminantType" minOccurs="0"/>
        <element name="InputDeviceColorGamut" type="sedl:InputDeviceColorGamutType" minOccurs="0"/>
        <element name="IlluminanceOfSurround" type="mpeg7:unsigned12" minOccurs="0"/>
      </sequence>
    </extension>
  </complexContent>
</complexType>

<complexType name="ToneReproductionCurvesType">
  <sequence maxOccurs="256">
    <element name="DAC_Value" type="mpeg7:unsigned8"/>
    <element name="RGB_Value" type="mpeg7:doubleVector"/>
  </sequence>
</complexType>

<complexType name="ConversionLUTType">
  <sequence>
    <element name="RGB2XYZ_LUT" type="mpeg7:DoubleMatrixType"/>
    <element name="RGBScalar_Max" type="mpeg7:doubleVector"/>
    <element name="Offset_Value" type="mpeg7:doubleVector"/>
    <element name="Gain_Offset_Gamma" type="mpeg7:DoubleMatrixType"/>
    <element name="InverseLUT" type="mpeg7:DoubleMatrixType"/>
  </sequence>
</complexType>

<complexType name="IlluminantType">
  <choice>
    <sequence>
      <element name="XY_Value" type="dia:ChromaticityType"/>
      <element name="Y_Value" type="mpeg7:unsigned7"/>
    </sequence>
    <element name="Correlated_CT" type="mpeg7:unsigned8"/>
  </choice>
</complexType>

<complexType name="InputDeviceColorGamutType">
  <sequence>
    <element name="IDCG_Type" type="string"/>
    <element name="IDCG_Value" type="mpeg7:DoubleMatrixType"/>
  </sequence>
</complexType>

The binary encoding notation or binary notation of the syntax shown in Table 38 may be expressed as shown in Table 39 below. Here, Table 39 is a table showing binary representation syntax.

ColorCorrectionParameterType {       (Number of bits)   (Mnemonic)
  ParameterBaseType                                     ParameterBaseType
  ToneReproductionFlag               1                  bslbf
  ColorTemperatureFlag               1                  bslbf
  InputDeviceColorGamutFlag          1                  bslbf
  IlluminanceOfSurroundFlag          1                  bslbf
  if (ToneReproductionFlag) {
    ToneReproductionCurves                              ToneReproductionCurvesType
  }
  ConversionLUT                                         ConversionLUTType
  if (ColorTemperatureFlag) {
    ColorTemperature                                    IlluminantType
  }
  if (InputDeviceColorGamutFlag) {
    InputDeviceColorGamut                               InputDeviceColorGamutType
  }
  if (IlluminanceOfSurroundFlag) {
    IlluminanceOfSurround            12                 uimsbf
  }
}

ToneReproductionCurvesType {         (Number of bits)   (Mnemonic)
  NumOfRecords                       8                  uimsbf
  for (i = 0; i < NumOfRecords; i++) {
    DAC_Value                        8                  mpeg7:unsigned8
    RGB_Value                        32 * 3             mpeg7:doubleVector
  }
}

ConversionLUTType {                  (Number of bits)   (Mnemonic)
  RGB2XYZ_LUT                        32 * 3 * 3         mpeg7:DoubleMatrixType
  RGBScalar_Max                      32 * 3             mpeg7:doubleVector
  Offset_Value                       32 * 3             mpeg7:doubleVector
  Gain_Offset_Gamma                  32 * 3 * 3         mpeg7:DoubleMatrixType
  InverseLUT                         32 * 3 * 3         mpeg7:DoubleMatrixType
}

IlluminantType {                     (Number of bits)   (Mnemonic)
  ElementType                        2                  bslbf (Table 8)
  if (ElementType == 00) {
    XY_Value                         32 * 2             dia:ChromaticityType
    Y_Value                          7                  uimsbf
  } else if (ElementType == 01) {
    Correlated_CT                    8                  uimsbf
  }
}

InputDeviceColorGamutType {          (Number of bits)   (Mnemonic)
  typeLength                                            vluimsbf5
  IDCG_Type                          8 * typeLength     bslbf
  IDCG_Value                         32 * 3 * 2         mpeg7:DoubleMatrixType
}
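
The IlluminantType structure above is a two-branch choice keyed by the 2-bit ElementType field: code 00 selects the chromaticity form (XY_Value plus a 7-bit Y_Value), while code 01 selects an 8-bit correlated colour temperature code. A sketch of that choice, assuming IEEE 754 single-precision packing for the two chromaticity values (the exact 32-bit layout of dia:ChromaticityType is an assumption here, and the function name is illustrative):

```python
import struct

def encode_illuminant(xy=None, y=None, correlated_ct=None) -> str:
    """Sketch of the IlluminantType choice in Table 39.

    ElementType 00 selects the chromaticity branch (two 32-bit XY values
    followed by a 7-bit Y_Value); ElementType 01 selects an 8-bit
    Correlated_CT code. The float packing of XY_Value is an assumption.
    """
    if correlated_ct is not None:
        return "01" + format(correlated_ct, "08b")
    bits = ["00"]
    for v in xy:  # two 32-bit values for the (x, y) chromaticity
        packed = struct.unpack(">I", struct.pack(">f", v))[0]
        bits.append(format(packed, "032b"))
    bits.append(format(y, "07b"))  # 7-bit Y_Value
    return "".join(bits)
```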

The semantics of the color correction parameter type are shown in Table 40 below. Here, Table 40 is a table showing the color correction parameter type semantics (Semantics of the ColorCorrectionParameterType).

Names Description
ParameterBaseType Describes a base type of the parameter metadata.
ToneReproductionFlag This field, which is only present in the binary representation, indicates the presence of the ToneReproductionCurves element. If it is 1, the ToneReproductionCurves element is present; otherwise it is not present.
ColorTemperatureFlag This field, which is only present in the binary representation, indicates the presence of the ColorTemperature element. If it is 1, the ColorTemperature element is present; otherwise it is not present.
InputDeviceColorGamutFlag This field, which is only present in the binary representation, indicates the presence of the InputDeviceColorGamut element. If it is 1, the InputDeviceColorGamut element is present; otherwise it is not present.
IlluminanceOfSurroundFlag This field, which is only present in the binary representation, indicates the presence of the IlluminanceOfSurround element. If it is 1, the IlluminanceOfSurround element is present; otherwise it is not present.
ToneReproductionCurves This curve shows the characteristics (e.g., gamma curves for the R, G, and B channels) of the input display device.
ConversionLUT A look-up table (matrix) converting an image between an image color space (e.g., RGB) and a standard connection space (e.g., CIE XYZ).
ColorTemperature An element describing a white point setting (e.g., D65, D93) of the input display device.
InputDeviceColorGamut An element describing the input display device color gamut, which is represented by the chromaticity values of the R, G, and B channels at maximum DAC values.
IlluminanceOfSurround An element describing the illuminance level of the viewing environment. The illuminance is represented in lux.

The semantics of the tone reproduction curves in the color correction parameter type semantics shown in Table 40 are as shown in Table 41 below. Here, Table 41 is a table showing the tone reproduction curve type semantics (Semantics of the ToneReproductionCurvesType).

NumOfRecords: This field, which is only present in the binary representation, specifies the number of record (DAC and RGB value) instances accommodated in the ToneReproductionCurves.
DAC_Value: An element describing discrete DAC values of the input device.
RGB_Value: An element describing normalized gamma curve values with respect to the DAC values. The order of describing the RGB_Value is R_n, G_n, B_n.

In addition, the semantics of the conversion LUT (ConversionLUT) in the color correction parameter type semantics shown in Table 40 are as shown in Table 42 below. Here, Table 42 is a table showing the conversion LUT type semantics.

RGB2XYZ_LUT: This look-up table (matrix) converts an image from RGB to CIE XYZ. The size of the conversion matrix is 3x3 (rendered as figures in the original). The way of describing the values in the binary representation is row by row, i.e., [m11, m12, m13; m21, m22, m23; m31, m32, m33].
RGBScalar_Max: An element describing the maximum RGB scalar values for GOG transformation. The order of describing the RGBScalar_Max is R_max, G_max, B_max.
Offset_Value: An element describing the offset values of the input display device when the DAC is 0. The value is described in CIE XYZ form, in the order X, Y, Z.
Gain_Offset_Gamma: An element describing the gain, offset, and gamma of the RGB channels for GOG transformation. The size of the Gain_Offset_Gamma matrix is 3x3. The way of describing the values in the binary representation is in the order of [Gain_r, Gain_g, Gain_b; Offset_r, Offset_g, Offset_b; Gamma_r, Gamma_g, Gamma_b].
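The Gain_Offset_Gamma and RGBScalar_Max elements parameterize the gain-offset-gamma (GOG) display model. A hedged single-channel sketch follows; the table only names the parameters, so the exact formula and the clipping of the base at zero are assumptions based on the standard GOG form:

```python
def gog_channel(dac, dac_max, gain, offset, gamma):
    """Gain-Offset-Gamma (GOG) model for a single display channel.

    Maps a DAC value to a normalized linear scalar:
        scalar = (gain * dac / dac_max + offset) ** gamma
    with the base clipped at 0 so the power is well defined.
    """
    base = gain * (dac / dac_max) + offset
    return max(base, 0.0) ** gamma
```

With gain 1.0, offset 0.0, and gamma 2.2, a full-scale DAC value maps to a scalar of 1.0 and a zero DAC value maps to 0.0.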
InverseLUT: This look-up table (matrix) converts an image from CIE XYZ to RGB. The size of the conversion matrix is 3x3 (rendered as figures in the original). The way of describing the values in the binary representation is row by row, i.e., [m11, m12, m13; m21, m22, m23; m31, m32, m33].

The semantics of the illuminant type are as shown in Table 43 below. Here, Table 43 is a table showing the illuminant type semantics (Semantics of the IlluminantType).

ElementType: This field, which is only present in the binary representation, describes which illuminant scheme shall be used. In the binary description, the following mapping table is used.
XY_Value: An element describing the chromaticity of the light source. The ChromaticityType is specified in ISO/IEC 21000-7.
Y_Value: An element describing the luminance of the light source, between 0 and 100.
Correlated_CT: Indicates the correlated color temperature of the overall illumination. The value expression is obtained by quantizing the range [1667, 25000] into 2^8 bins in a non-uniform way as specified in ISO/IEC 15938-5.

In the illuminant type semantics shown in Table 43, the element type may be represented in binary notation as shown in Table 44 below. That is, the element type is encoded in binary notation. Here, Table 44 shows the binary representation of the element types.

IlluminantType   Illuminant
00               xy and Y value
01               Correlated_CT

The semantics of the input device color gamut in the color correction parameter type semantics shown in Table 40 are as shown in Table 45 below. Here, Table 45 is a table showing the input device color gamut type semantics (Semantics of the InputDeviceColorGamutType).

typeLength: This field, which is only present in the binary representation, specifies the length of each IDCG_Type instance in bytes. The value of this element is the size of the largest IDCG_Type instance, aligned to a byte boundary by bit stuffing using 0-7 '1' bits.
IDCG_Type: An element describing the type of the input device color gamut (e.g., NTSC, SMPTE).
IDCG_Value: An element describing the chromaticity values of the RGB channels when the DAC values are maximum. The size of the IDCG_Value matrix is 3x2 (rendered as figures in the original). The way of describing the six values in the binary representation is element by element, row by row.

Hereinafter, the sensory effect vocabulary, that is, the binary representation of the sensory effect information, will be described in more detail through examples of various sensory effects coded using the binary notation scheme. As described above, the various sensory effects of the multimedia content include a lighting effect, a color tone effect, a flash (glare) effect, a temperature effect, a wind effect, a vibration effect, a spraying effect such as a water spray effect, a scent effect, a smoke (fog) effect, a color correction effect, and motion and kinesthetic effects (e.g., a rigid body motion effect, a passive kinesthetic motion effect, a passive kinesthetic force effect, an active kinesthetic effect, and a tactile effect).

First, the lighting effect will be described in detail. The syntax of the lighting effect can be expressed as shown in Table 46 below. Here, Table 46 is a table showing the syntax of the lighting effect.

<!-- ################################################ -->
<!-- SEV Light type                                   -->
<!-- ################################################ -->
<complexType name="LightType">
  <complexContent>
    <extension base="sedl:EffectBaseType">
      <attribute name="color" type="sev:colorType" use="optional"/>
      <attribute name="intensity-value" type="sedl:intensityValueType" use="optional"/>
      <attribute name="intensity-range" type="sedl:intensityRangeType" use="optional"/>
    </extension>
  </complexContent>
</complexType>

<simpleType name="colorType">
  <union memberTypes="mpeg7:termReferenceType sev:colorRGBType"/>
</simpleType>

<simpleType name="colorRGBType">
  <restriction base="NMTOKEN">
    <whiteSpace value="collapse"/>
    <pattern value="#[0-9A-Fa-f]{6}"/>
  </restriction>
</simpleType>

In addition, the binary coding notation or binary notation of the lighting effect can be represented as shown in Table 47. Here, Table 47 is a table showing the binary notation of the lighting effect.

LightType {                     Number of bits   Mnemonic
    colorFlag                   1                bslbf
    intensityValueFlag          1                bslbf
    intensityRangeFlag          1                bslbf
    if (colorFlag) {
        color                   9                colorType
    }
    if (intensityValueFlag) {
        intensityValue          32               fsfb
    }
    if (intensityRangeFlag) {
        intensityRange[0]       32               fsfb
        intensityRange[1]       32               fsfb
    }
}

ColorType {                     Number of bits   Mnemonic
    NamedcolorFlag              1
    if (NamedcolorFlag) {
        NamedColorType          9                bslbf (Table 9)
    } else {
        colorRGBType            56               bslbf
    }
}
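The flag-gated layout of Table 47 can be sketched as a small encoder. This is illustrative only: the standard packs the bits into bytes, and the color field is itself a ColorType structure, of which only the 9-bit named-color branch is shown here; the function name and '0'/'1' string output are assumptions of the sketch.

```python
import struct

def encode_light_type(named_color=None, intensity=None, intensity_range=None):
    """Pack the LightType fields of Table 47 into a '0'/'1' string.

    named_color: 9-bit named-color code from Table 49, or None;
    intensity / intensity_range: floats written as 32-bit IEEE-754 (fsfb).
    """
    def f32(x):
        # 32-bit big-endian IEEE-754 float, emitted as bit characters
        return "".join(f"{b:08b}" for b in struct.pack(">f", x))

    bits = ["1" if named_color is not None else "0",
            "1" if intensity is not None else "0",
            "1" if intensity_range is not None else "0"]
    if named_color is not None:
        bits.append(f"{named_color:09b}")   # 9-bit colorType
    if intensity is not None:
        bits.append(f32(intensity))         # 32-bit fsfb
    if intensity_range is not None:
        lo, hi = intensity_range
        bits.append(f32(lo) + f32(hi))      # 2 x 32-bit fsfb
    return "".join(bits)
```

For example, a light effect carrying only the named color `blue` (code 000010011 per Table 49) serializes to three flag bits followed by nine color bits.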

In addition, the semantics of the lighting effect can be expressed as shown in Table 48 below. Here, Table 48 is a table showing the light type semantics (Semantics of the LightType).

LightType: Tool for describing a light effect.
colorFlag: This field, which is only present in the binary representation, indicates the presence of the color attribute. If it is 1 then the color attribute is present, otherwise it is not present.
intensityValueFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-value attribute. If it is 1 then the intensity-value attribute is present, otherwise it is not present.
intensityRangeFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-range attribute. If it is 1 then the intensity-range attribute is present, otherwise it is not present.
color: Describes the color of the light effect as a reference to a classification scheme (CS) term or as an RGB value. A CS that may be used for this purpose is the ColorCS defined in Annex A.2.1 (see also A.2.2 of ISO/IEC 23005-6).
intensity-value: Describes the intensity of the light effect in terms of illumination in lux.
intensity-range: Describes the domain of the intensity value.

In the light type semantics shown in Table 48, the color may be represented in binary notation as shown in Table 49 below. That is, the color is encoded in binary notation. Here, Table 49 is a table showing the binary notation of colors, namely the named color types.

NamedcolorType Term ID of color 000000000 alice_blue 000000001 alizarin 000000010 amaranth 000000011 amaranth_pink 000000100 amber 000000101 amethyst 000000110 apricot 000000111 aqua 000001000 aquamarine 000001001 army_green 000001010 asparagus 000001011 atomic_tangerine 000001100 auburn 000001101 azure_color_wheel 000001110 azure_web 000001111 baby_blue 000010000 beige 000010001 bistre 000010010 black 000010011 blue 000010100 blue_pigment 000010101 blue_ryb 000010110 blue_green 000010111 blue-green 000011000 blue-violet 000011001 bondi_blue 000011010 brass 000011011 bright_green 000011100 bright_pink 000011101 bright_turquoise 000011110 brilliant_rose 000011111 brink_pink 000100000 bronze 000100001 brown 000100010 buff 000100011 burgundy 000100100 burnt_orange 000100101 burnt_sienna 000100110 burnt_umber 000100111 camouflage_green 000101000 caput_mortuum 000101001 cardinal 000101010 carmine 000101011 carmine_pink 000101100 carnation_pink 000101101 Carolina_blue 000101110 carrot_orange 000101111 celadon 000110000 cerise 000110001 cerise_pink 000110010 cerulean 000110011 cerulean_blue 000110100 champagne 000110101 charcoal 000110110 chartreuse_traditional 000110111 chartreuse_web 000111000 cherry_blossom_pink 000111001 chestnut 000111010 chocolate 000111011 cinnabar 000111100 cinnamon 000111101 cobalt 000111110 Columbia_blue 000111111 copper 001000000 copper_rose 001000001 coral 001000010 coral_pink 001000011 coral_red 001000100 corn 001000101 cornflower_blue 001000110 cosmic_latte 001000111 cream 001001000 crimson 001001001 cyan 001001010 cyan_process 001001011 dark_blue 001001100 dark_brown 001001101 dark_cerulean 001001110 dark_chestnut 001001111 dark_coral 001010000 dark_goldenrod 001010001 dark_green 001010010 dark_khaki 001010011 dark_magenta 001010100 dark_pastel_green 001010101 dark_pink 001010110 dark_scarlet 001010111 dark_salmon 001011000 dark_slate_gray 001011001 dark_spring_green 001011010 dark_tan 001011011 dark_turquoise 001011100 dark_violet 
001011101 deep_carmine_pink 001011110 deep_cerise 001011111 deep_chestnut 001100000 deep_fuchsia 001100001 deep_lilac 001100010 deep_magenta 001100011 deep_magenta 001100100 deep_peach 001100101 deep_pink 001100110 denim 001100111 dodger_blue 001101000 ecru 001101001 egyptian_blue 001101010 electric_blue 001101011 electric_green 001101100 elctric_indigo 001101101 electric_lime 001101110 electric_purple 001101111 emerald 001110000 eggplant 001110001 falu_red 001110010 fern_green 001110011 firebrick 001110100 flax 001110101 forest_green 001110110 french_rose 001110111 fuchsia 001111000 fuchsia_pink 001111001 gamboge 001111010 gold_metallic 001111011 gold_web_golden 001111100 golden_brown 001111101 golden_yellow 001111110 goldenrod 001111111 grey-asparagus 010000000 green_color_wheel_x11_green 010000001 green_html / css_green 010000010 green_pigment 010000011 green_ryb 010000100 green_yellow 010000101 gray 010000110 han_purple 010000111 harlequin 010001000 heliotrope 010001001 Hollywood_cerise 010001010 hot_magenta 010001011 hot_pink 010001100 indigo_dye 010001101 international_klein_blue 010001110 international_orange 010001111 Islamic_green 010010000 ivory 010010001 jade 010010010 kelly_green 010010011 khaki 010010100 khaki_x11_light_khaki 010010101 lavender_floral 010010110 lavender_web 010010111 lavender_blue 010011000 lavender_blush 010011001 lavender_grey 010011010 lavender_magenta 010011011 lavender_pink 010011100 lavender_purple 010011101 lavender_rose 010011110 lawn_green 010011111 lemon 010100000 lemon_chiffon 010100001 light_blue 010100010 light_pink 010100011 lilac 010100100 lime_color_wheel 010100101 lime_web_x11_green 010100110 lime_green 010100111 linen 010101000 magenta 010101001 magenta_dye 010101010 magenta_process 010101011 magic_mint 010101100 magnolia 010101101 malachite 010101110 maroon_html / css 010101111 marron_x11 010110000 maya_blue 010110001 mauve 010110010 mauve_taupe 010110011 medium_blue 010110100 medium_carmine 010110101 
medium_lavender_magenta 010110110 medum_purple 010110111 medium_spring_green 010111000 midnight_blue 010111001 midnight_green_eagle_green 010111010 mint_green 010111011 misty_rose 010111100 moss_green 010111101 mountbatten_pink 010111110 mustard 010111111 myrtle 011000000 navajo_white 011000001 navy_blue 011000010 ochre 011000011 office_green 011000100 old_gold 011000101 old_lace 011000110 old_lavender 011000111 old_rose 011001000 olive 011001001 olive_drab 011001010 olivine 011001011 orange_color_wheel 011001100 orange_ryb 011001101 orange_web 011001110 orange_peel 011001111 orange-red 011010000 orchid 011010001 pale_blue 011010010 pale_brown 011010011 pale_carmine 011010100 pale_chestnut 011010101 pale_cornflower_blue 011010110 pale_magenta 011010111 pale_pink 011011000 pale_red-violet 011011001 papaya_whip 011011010 pastel_green 011011011 pastel_pink 011011100 peach 011011101 peach-orange 011011110 peach-yellow 011011111 pear 011100000 periwinkle 011100001 persian_blue 011100010 persian_green 011100011 persian_indigo 011100100 persian_orange 011100101 persian_red 011100110 persian_pink 011100111 persian_rose 011101000 persimmon 011101001 pine_green 011101010 pink 011101011 pink-orange 011101100 platinum 011101101 plum_web 011101110 powder_blue_web 011101111 puce 011110000 prussian_blue 011110001 psychedelic_purple 011110010 pumpkin 011110011 purple_html / css 011110100 purple_x11 011110101 purple_taupe 011110110 raw_umber 011110111 razzmatazz 011111000 red 011111001 red_pigment 011111010 red_ryb 011111011 red-violet 011111100 rich_carmine 011111101 robin_egg_blue 011111110 rose 011111111 rose_madder 100000000 rose_taupe 100000001 royal_blue 100000010 royal_purple 100000011 ruby 100000100 russet 100000101 rust 100000110 safety_orange_blaze_orange 100000111 saffron 100001000 salmon 100001001 sandy_brown 100001010 sangria 100001011 sapphire 100001100 scarlet 100001101 school_bus_yellow 100001110 sea_green 100001111 seashell 100010000 selective_yellow 100010001 
sepia 100010010 shamrock_green 100010011 shocking_pink 100010100 silver 100010101 sky_blue 100010110 slate_grey 100010111 smalt_dark_powder_blue 100011000 spring_bud 100011001 spring_green 100011010 steel_blue 100011011 tan 100011100 tangerine 100011101 tangerine_yellow 100011110 taupe 100011111 tea_green 100100000 tea_rose_orange 100100001 tea_rose_rose 100100010 teal 100100011 tenne_tawny 100100100 terra_cotta 100100101 thistle 100100110 tomato 100100111 turquoise 100101000 tyrian_purple 100101001 ultramarine 100101010 ultra_pink 100101011 united_nation_blue 100101100 vegas_gold 100101101 vermilion 100101110 violet 100101111 violet_web 100110000 violet_ryb 100110001 viridian 100110010 wheat 100110011 white 100110100 wisteria 100110101 yellow 100110110 yellow_process 100110111 yellow_ryb 100111000 yellow_green 100111001-111111111 Reserved

And, the semantics of the color type can be expressed as shown in Table 50 below. Here, Table 50 is a table showing the color type semantics.

NamedcolorFlag: This field, which is only present in the binary representation, indicates a choice of the color descriptions. If it is 1 then the color is described by mpeg7:termReferenceType, otherwise the color is described by colorRGBType.
NamedColorType: This field, which is only present in the binary representation, describes the color in terms of the ColorCS in Annex A.2.1.
colorRGBType: This field, which is only present in the binary representation, describes the color in terms of colorRGBType.
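The one-bit NamedcolorFlag selects between the two color encodings, which can be sketched as follows. The 56-bit width of colorRGBType matches the seven ASCII characters of the '#RRGGBB' pattern in Table 46; treating it as seven 8-bit characters is an assumption of this sketch, as is the function name.

```python
def encode_color_type(named_code=None, rgb=None):
    """Encode the ColorType choice of Table 47: a 1-bit NamedcolorFlag
    followed by either a 9-bit named-color code (Table 49) or a 56-bit
    colorRGBType, assumed here to be the seven ASCII characters of the
    '#RRGGBB' lexical form."""
    if named_code is not None:
        return "1" + f"{named_code:09b}"
    if len(rgb) != 7 or not rgb.startswith("#"):
        raise ValueError("colorRGBType must match '#RRGGBB'")
    return "0" + "".join(f"{ord(c):08b}" for c in rgb)
```

Either branch yields a fixed total width: 10 bits for a named color, 57 bits for an RGB literal.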

Next, the flash effect will be described in detail. The syntax of the flash effect can be expressed as shown in Table 51 below. Here, Table 51 is a table showing the syntax of the flash effect.

<!-- ################################################ -->
<!-- SEV Flash type                                   -->
<!-- ################################################ -->
<complexType name="FlashType">
  <complexContent>
    <extension base="sev:LightType">
      <attribute name="frequency" type="positiveInteger" use="optional"/>
    </extension>
  </complexContent>
</complexType>

In addition, the binary encoding notation or binary notation of the flash effect may be represented as shown in Table 52 below. Here, Table 52 is a table which shows the binary notation of the said flash effect.

FlashType {                     Number of bits   Mnemonic
    LightType                                    LightType
    frequencyFlag               1                bslbf
    if (frequencyFlag) {
        frequency               5                uimsbf
    }
}

In addition, the semantics of the flash effect can be expressed as shown in Table 53 below. Here, Table 53 is a table showing the flash type semantics (Semantics of the FlashType).

FlashType: Tool for describing a flash effect.
LightType: Describes the base type of a light effect.
frequency: Describes the number of flickerings per second.
EXAMPLE: The value 10 means it will flicker 10 times each second.

Next, the temperature effect will be described in detail. The syntax of the temperature effect can be expressed as shown in Table 54 below. Here, Table 54 is a table showing the syntax of the temperature effect.

<!-- ################################################ -->
<!-- SEV Temperature type                             -->
<!-- ################################################ -->
<complexType name="TemperatureType">
  <complexContent>
    <extension base="sedl:EffectBaseType">
      <attribute name="intensity-value" type="sedl:intensityValueType" use="optional"/>
      <attribute name="intensity-range" type="sedl:intensityRangeType" use="optional"/>
    </extension>
  </complexContent>
</complexType>

In addition, the binary encoding notation or binary notation of the temperature effect may be represented as shown in Table 55 below. Here, Table 55 is a table which shows the binary notation of the said temperature effect.

TemperatureType {               Number of bits   Mnemonic
    intensityValueFlag          1                bslbf
    intensityRangeFlag          1                bslbf
    if (intensityValueFlag) {
        intensityValue          32               fsfb
    }
    if (intensityRangeFlag) {
        intensityRange[0]       32               fsfb
        intensityRange[1]       32               fsfb
    }
}

In addition, the semantics of the temperature effect can be expressed as shown in Table 56 below. Here, Table 56 is a table showing the temperature type semantics (Semantics of the TemperatureType).

TemperatureType: Tool for describing a temperature effect.
intensityValueFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-value attribute. If it is 1 then the intensity-value attribute is present, otherwise it is not present.
intensityRangeFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-range attribute. If it is 1 then the intensity-range attribute is present, otherwise it is not present.
intensity-value: Describes the intensity of the temperature effect in terms of heating/cooling in Celsius.
intensity-range: Describes the domain of the intensity value.
  intensity-range[0]: minimum intensity
  intensity-range[1]: maximum intensity
EXAMPLE: [0.0, 100.0] on the Celsius scale or [32.0, 212.0] on the Fahrenheit scale.
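TemperatureType shares this two-flag binary layout with WindType, VibrationType, and FogType (Tables 58, 61, and 73 below), so a single decoder covers all four; only the unit of the decoded intensity differs. A sketch over a '0'/'1' string (the function name and string input are illustrative; real streams are packed bytes):

```python
import struct

def decode_intensity_effect(bits):
    """Decode the common layout of Tables 55/58/61/73: two 1-bit presence
    flags followed by optional 32-bit IEEE-754 (fsfb) intensity fields."""
    def f32(b):
        # interpret 32 bit characters as a big-endian IEEE-754 float
        return struct.unpack(">f", int(b, 2).to_bytes(4, "big"))[0]

    has_value, has_range = bits[0] == "1", bits[1] == "1"
    pos, out = 2, {}
    if has_value:
        out["intensity-value"] = f32(bits[pos:pos + 32])
        pos += 32
    if has_range:
        out["intensity-range"] = (f32(bits[pos:pos + 32]),
                                  f32(bits[pos + 32:pos + 64]))
    return out
```

For a temperature effect the decoded intensity-value is in Celsius; for wind it is on the Beaufort scale, for vibration on the Richter scale, and for fog in ml/h.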

Next, the wind effect will be described in detail. The syntax of the wind effect can be expressed as shown in Table 57 below. Here, Table 57 is a table showing the syntax of the wind effect.

<!-- ################################################ -->
<!-- SEV Wind type                                    -->
<!-- ################################################ -->
<complexType name="WindType">
  <complexContent>
    <extension base="sedl:EffectBaseType">
      <attribute name="intensity-value" type="sedl:intensityValueType" use="optional"/>
      <attribute name="intensity-range" type="sedl:intensityRangeType" use="optional"/>
    </extension>
  </complexContent>
</complexType>

The binary encoding notation or binary notation of the wind effect may be represented as shown in Table 58 below. Here, Table 58 is a table showing the binary notation of the wind effect.

WindType {                      Number of bits   Mnemonic
    intensityValueFlag          1                bslbf
    intensityRangeFlag          1                bslbf
    if (intensityValueFlag) {
        intensityValue          32               fsfb
    }
    if (intensityRangeFlag) {
        intensityRange[0]       32               fsfb
        intensityRange[1]       32               fsfb
    }
}

In addition, the semantics of the wind effect can be expressed as shown in Table 59 below. Here, Table 59 is a table showing the Semantics of the WindType.

WindType: Tool for describing a wind effect.
intensityValueFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-value attribute. If it is 1 then the intensity-value attribute is present, otherwise it is not present.
intensityRangeFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-range attribute. If it is 1 then the intensity-range attribute is present, otherwise it is not present.
intensity-value: Describes the intensity of the wind effect in terms of strength on the Beaufort scale.
intensity-range: Describes the domain of the intensity value.
  intensity-range[0]: minimum intensity
  intensity-range[1]: maximum intensity
EXAMPLE: [0.0, 12.0] on the Beaufort scale.

Next, the vibration effect will be described in detail. The syntax of the vibration effect can be expressed as shown in Table 60 below. Here, Table 60 is a table showing the syntax of the vibration effect.

<!-- ################################################ -->
<!-- SEV Vibration type                               -->
<!-- ################################################ -->
<complexType name="VibrationType">
  <complexContent>
    <extension base="sedl:EffectBaseType">
      <attribute name="intensity-value" type="sedl:intensityValueType" use="optional"/>
      <attribute name="intensity-range" type="sedl:intensityRangeType" use="optional"/>
    </extension>
  </complexContent>
</complexType>

The binary coding notation or binary notation of the vibration effect may be represented as in Table 61 below. Here, Table 61 is a table showing the binary notation of the vibration effect.

VibrationType {                 Number of bits   Mnemonic
    intensityValueFlag          1                bslbf
    intensityRangeFlag          1                bslbf
    if (intensityValueFlag) {
        intensityValue          32               fsfb
    }
    if (intensityRangeFlag) {
        intensityRange[0]       32               fsfb
        intensityRange[1]       32               fsfb
    }
}

In addition, the semantics of the vibration effect can be expressed as shown in Table 62 below. Here, Table 62 is a table showing vibration type semantics (Semantics of the VibrationType).

VibrationType: Tool for describing a vibration effect.
intensityValueFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-value attribute. If it is 1 then the intensity-value attribute is present, otherwise it is not present.
intensityRangeFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-range attribute. If it is 1 then the intensity-range attribute is present, otherwise it is not present.
intensity-value: Describes the intensity of the vibration effect in terms of strength on the Richter scale.
intensity-range: Describes the domain of the intensity value.
  intensity-range[0]: minimum intensity
  intensity-range[1]: maximum intensity
EXAMPLE: [0.0, 10.0] on the Richter magnitude scale.

Next, the spraying effect will be described in detail. The syntax of the spraying effect can be expressed as shown in Table 64 below. Here, Table 64 is a table showing the syntax of the spraying effect.

<!-- ################################################ -->
<!-- Definition of Spraying type                      -->
<!-- ################################################ -->
<complexType name="SprayingType">
  <complexContent>
    <extension base="sedl:EffectBaseType">
      <attribute name="intensity-value" type="sedl:intensityValueType" use="optional"/>
      <attribute name="intensity-range" type="sedl:intensityRangeType" use="optional"/>
      <attribute name="sprayingType" type="mpeg7:termReferenceType"/>
    </extension>
  </complexContent>
</complexType>

The binary encoding notation or binary notation of the spray effect may be represented as in Table 65 below. Here, Table 65 is a table which shows the binary notation of the said spray effect.

SprayingType {                  Number of bits   Mnemonic
    intensityValueFlag          1                bslbf
    intensityRangeFlag          1                bslbf
    if (intensityValueFlag) {
        intensityValue          32               fsfb
    }
    if (intensityRangeFlag) {
        intensityRange[0]       32               fsfb
        intensityRange[1]       32               fsfb
    }
    SprayingID                  8                bslbf (Table 10)
}

In addition, the semantics of the spray effect can be shown as Table 66 below. Here, Table 66 is a table showing the spray type semantics (Semantics of the SprayingType).

SprayingType: Tool for describing a spraying effect.
intensityValueFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-value attribute. If it is 1 then the intensity-value attribute is present, otherwise it is not present.
intensityRangeFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-range attribute. If it is 1 then the intensity-range attribute is present, otherwise it is not present.
intensity-value: Describes the intensity of the spraying effect in ml/h.
intensity-range: Describes the domain of the intensity value.
  intensity-range[0]: minimum intensity
  intensity-range[1]: maximum intensity
EXAMPLE: [0.0, 10.0] ml/h.
sprayingType: Describes the type of the spraying effect as a reference to a classification scheme term. A CS that may be used for this purpose is the SprayingTypeCS defined in Annex A.2.6.

In the spraying type semantics shown in Table 66, the spraying type can be represented in binary notation as shown in Table 67 below. That is, the spraying type is encoded in binary notation. Here, Table 67 is a table showing the binary notation of the spraying type.

SprayingID            Spraying type
00000000              Reserved
00000001              Purified water
00000010 ~ 11111111   Reserved

Next, the scent effect will be described in detail. The syntax of the scent effect can be expressed as shown in Table 68 below. Here, Table 68 is a table showing the syntax of the scent effect.

<!-- ################################################ -->
<!-- Definition of Scent type                         -->
<!-- ################################################ -->
<complexType name="ScentType">
  <complexContent>
    <extension base="sedl:EffectBaseType">
      <attribute name="scent" type="mpeg7:termReferenceType" use="optional"/>
      <attribute name="intensity-value" type="sedl:intensityValueType" use="optional"/>
      <attribute name="intensity-range" type="sedl:intensityRangeType" use="optional"/>
    </extension>
  </complexContent>
</complexType>

The binary encoding notation or binary notation of the fragrance effect may be represented as shown in Table 69 below. Here, Table 69 is a table which shows the binary notation of the said fragrance effect.

ScentType {                     Number of bits   Mnemonic
    intensityValueFlag          1                bslbf
    intensityRangeFlag          1                bslbf
    if (intensityValueFlag) {
        intensityValue          32               fsfb
    }
    if (intensityRangeFlag) {
        intensityRange[0]       32               fsfb
        intensityRange[1]       32               fsfb
    }
    ScentID                     16               bslbf (Table 11)
}

In addition, the semantics of the fragrance effect can be shown as Table 70 below. Here, Table 70 is a table showing the fragrance type semantics (Semantics of the ScentType).

ScentType: Tool for describing a scent effect.
intensityValueFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-value attribute. If it is 1 then the intensity-value attribute is present, otherwise it is not present.
intensityRangeFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-range attribute. If it is 1 then the intensity-range attribute is present, otherwise it is not present.
scent: Describes the scent to use. A CS that may be used for this purpose is the ScentCS defined in Annex A.2.3.
intensity-value: Describes the intensity of the scent effect in ml/h.
intensity-range: Describes the domain of the intensity value.
  intensity-range[0]: minimum intensity
  intensity-range[1]: maximum intensity
EXAMPLE: [0.0, 10.0] ml/h.

In the scent type semantics shown in Table 70, the scent can be represented in binary notation as shown in Table 71 below. That is, the scent is encoded in binary notation. Here, Table 71 is a table showing the binary notation of the scent.

ScentID                                 Scent
0000000000000000                        Reserved
0000000000000001                        rose
0000000000000010                        acacia
0000000000000011                        chrysanthemum
0000000000000100                        lilac
0000000000000101                        mint
0000000000000110                        jasmine
0000000000000111                        pine tree
0000000000001000                        orange
0000000000001001                        grape
0000000000001010 ~ 1111111111111111    Reserved

Next, the smoke (fog) effect will be described in detail. The syntax of the fog effect can be expressed as shown in Table 72 below. Here, Table 72 is a table showing the syntax of the fog effect.

<!-- ################################################ -->
<!-- Definition of Fog type                           -->
<!-- ################################################ -->
<complexType name="FogType">
  <complexContent>
    <extension base="sedl:EffectBaseType">
      <attribute name="intensity-value" type="sedl:intensityValueType" use="optional"/>
      <attribute name="intensity-range" type="sedl:intensityRangeType" use="optional"/>
    </extension>
  </complexContent>
</complexType>

In addition, the binary encoding notation or binary notation of the fog effect may be represented as shown in Table 73 below. Here, Table 73 is a table showing the binary notation of the fog effect.

FogType {                       Number of bits   Mnemonic
    intensityValueFlag          1                bslbf
    intensityRangeFlag          1                bslbf
    if (intensityValueFlag) {
        intensityValue          32               fsfb
    }
    if (intensityRangeFlag) {
        intensityRange[0]       32               fsfb
        intensityRange[1]       32               fsfb
    }
}

In addition, the semantics of the fog effect can be expressed as shown in Table 74 below. Here, Table 74 is a table showing the fog type semantics (Semantics of the FogType).

FogType: Tool for describing a fog effect.
intensityValueFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-value attribute. If it is 1 then the intensity-value attribute is present, otherwise it is not present.
intensityRangeFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-range attribute. If it is 1 then the intensity-range attribute is present, otherwise it is not present.
intensity-value: Describes the intensity of the fog effect in ml/h.
intensity-range: Describes the domain of the intensity value.
  intensity-range[0]: minimum intensity
  intensity-range[1]: maximum intensity
EXAMPLE: [0.0, 10.0] ml/h.

Next, the color correction effect will be described in detail. The syntax of the color correction effect can be expressed as shown in Table 75 below. Here, Table 75 is a table showing the syntax of the color correction effect.

<!-- ##################################################### -->
<!-- Definition of Color Correction type                   -->
<!-- ##################################################### -->
<complexType name="ColorCorrectionType">
  <complexContent>
    <extension base="sedl:EffectBaseType">
      <choice minOccurs="0">
        <element name="SpatioTemporalLocator" type="mpeg7:SpatioTemporalLocatorType"/>
        <element name="SpatioTemporalMask" type="mpeg7:SpatioTemporalMaskType"/>
      </choice>
      <attribute name="intensity-value" type="sedl:intensityValueType"
        use="optional"/>
      <attribute name="intensity-range" type="sedl:intensityRangeType"
        use="optional" fixed="0 1"/>
    </extension>
  </complexContent>
</complexType>

The binary encoding notation or binary notation of the color correction effect may be represented as shown in Table 76 below. Here, Table 76 is a table showing the binary notation of the color correction effect.

ColorCorrectionType {              Number of bits    Mnemonic
  intensityValueFlag               1                 bslbf
  intensityRangeFlag               1                 bslbf
  regionTypeChoice                 1                 bslbf
  if (regionTypeChoice) {
    SpatioTemporalLocator                            mpeg7:SpatioTemporalLocatorType
  } else {
    SpatioTemporalMask                               mpeg7:SpatioTemporalMaskType
  }
  if (intensityValueFlag) {
    intensity-value                32                fsbf
  }
  if (intensityRangeFlag) {
    intensity-range                64                fsbf
  }
}
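As a minimal illustration of how a decoder would branch on the three leading flag bits of Table 76 (read MSB first, per the bslbf mnemonic), consider the sketch below; the function name is ours, not the standard's.

```python
def parse_color_correction_flags(first_byte):
    """Extract the three leading 1-bit fields of the ColorCorrectionType
    binary notation (Table 76) from the first byte of the stream,
    reading most significant bit first."""
    return {
        "intensityValueFlag": (first_byte >> 7) & 1,
        "intensityRangeFlag": (first_byte >> 6) & 1,
        # 1 selects SpatioTemporalLocator, 0 selects SpatioTemporalMask
        "regionTypeChoice": (first_byte >> 5) & 1,
    }
```

A byte beginning 101... would therefore signal an intensity-value, no intensity-range, and a SpatioTemporalLocator region.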

In addition, the semantics of the color correction effect may be expressed as shown in Table 77 below. Here, Table 77 is a table showing color correction type semantics (Semantics of the ColorCorrectionType).

Name / Description

ColorCorrectionType: Tool for describing a color correction effect.

intensityValueFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-value attribute. If it is 1, the intensity-value attribute is present; otherwise it is not present.

intensityRangeFlag: This field, which is only present in the binary representation, indicates the presence of the intensity-range attribute. If it is 1, the intensity-range attribute is present; otherwise it is not present.

regionTypeChoice: This field, which is only present in the binary representation, specifies the choice of the spatio-temporal region types. If it is 1, the SpatioTemporalLocator is present; otherwise the SpatioTemporalMask is present.

intensity-value: Describes the intensity of the color correction effect in terms of "on" and "off", with 1 (on) and 0 (off).

intensity-range: Describes the domain of the intensity value, i.e., 1 (on) and 0 (off).
intensity-range[0]: minimum intensity
intensity-range[1]: maximum intensity

SpatioTemporalLocator: Describes the spatio-temporal localization of the moving region using mpeg7:SpatioTemporalLocatorType (optional), which indicates the regions in a video segment where the color correction effect is applied. The mpeg7:SpatioTemporalLocatorType is specified in ISO/IEC 15938-5.

SpatioTemporalMask: Describes a spatio-temporal mask that defines the spatio-temporal composition of the moving region (optional), which indicates the masks in a video segment where the color correction effect is applied. The mpeg7:SpatioTemporalMaskType is specified in ISO/IEC 15938-5.

Next, the rigid body motion effect, as a motion effect, will be described in detail. The syntax of the rigid body motion effect can be expressed as shown in Table 78 below. Here, Table 78 is a table showing the syntax of the rigid body motion effect.

<!-- ##################################################### -->
<!-- Definition of Rigid Body Motion type                  -->
<!-- ##################################################### -->
<complexType name="RigidBodyMotionType">
  <complexContent>
    <extension base="sedl:EffectBaseType">
      <sequence>
        <element name="MoveToward" type="sev:MoveTowardType"
          minOccurs="0"/>
        <element name="TrajectorySamples" type="mpeg7:FloatMatrixType"
          minOccurs="0" maxOccurs="unbounded"/>
        <element name="Incline" type="sev:InclineType" minOccurs="0"/>
        <element name="Shake" type="sev:ShakeType" minOccurs="0"/>
        <element name="Wave" type="sev:WaveType" minOccurs="0"/>
        <element name="Spin" type="sev:SpinType" minOccurs="0"/>
        <element name="Turn" type="sev:TurnType" minOccurs="0"/>
        <element name="Collide" type="sev:CollideType" minOccurs="0"/>
      </sequence>
    </extension>
  </complexContent>
</complexType>

<!-- ##################################################### -->
<!-- Definition of Move Toward type                        -->
<!-- ##################################################### -->
<complexType name="MoveTowardType">
  <choice minOccurs="0">
    <element name="Speed" type="float"/>
    <element name="Acceleration" type="float"/>
  </choice>
  <attribute name="directionV" type="MoveTowardAngleType" use="optional" default="0"/>
  <attribute name="directionH" type="MoveTowardAngleType" use="optional" default="0"/>
  <attribute name="distance" type="float" use="optional"/>
</complexType>

<!-- ##################################################### -->
<!-- Definition of Incline type                            -->
<!-- ##################################################### -->
<complexType name="InclineType">
  <sequence>
    <choice minOccurs="0">
      <element name="PitchSpeed" type="float"/>
      <element name="PitchAcceleration" type="float"/>
    </choice>
    <choice minOccurs="0">
      <element name="rollSpeed" type="float"/>
      <element name="rollAcceleration" type="float"/>
    </choice>
    <choice minOccurs="0">
      <element name="yawSpeed" type="float"/>
      <element name="yawAcceleration" type="float"/>
    </choice>
  </sequence>
  <attribute name="pitch" type="sev:InclineAngleType" use="optional" default="0"/>
  <attribute name="roll" type="sev:InclineAngleType" use="optional" default="0"/>
  <attribute name="yaw" type="sev:InclineAngleType" use="optional" default="0"/>
</complexType>

<!-- ##################################################### -->
<!-- Definition of Shake type                              -->
<!-- ##################################################### -->
<complexType name="ShakeType">
  <attribute name="direction" type="mpeg7:termReferenceType"
    use="optional"/>
  <attribute name="count" type="float" use="optional"/>
  <attribute name="distance" type="float" use="optional"/>
</complexType>

<!-- ##################################################### -->
<!-- Definition of Wave type                               -->
<!-- ##################################################### -->
<complexType name="WaveType">
  <attribute name="direction" type="mpeg7:termReferenceType"
    use="optional"/>
  <attribute name="startDirection" type="mpeg7:termReferenceType"
    use="optional"/>
  <attribute name="count" type="float" use="optional"/>
  <attribute name="distance" type="float" use="optional"/>
</complexType>

<!-- ##################################################### -->
<!-- Definition of Spin type                               -->
<!-- ##################################################### -->
<complexType name="SpinType">
  <attribute name="direction" type="mpeg7:termReferenceType"
    use="optional"/>
  <attribute name="count" type="float" use="optional"/>
</complexType>

<!-- ##################################################### -->
<!-- Definition of Turn type                               -->
<!-- ##################################################### -->
<complexType name="TurnType">
  <attribute name="direction" type="sev:TurnAngleType" use="optional"/>
  <attribute name="speed" type="float" use="optional"/>
</complexType>

<!-- ##################################################### -->
<!-- Definition of Collide type                            -->
<!-- ##################################################### -->
<complexType name="CollideType">
  <attribute name="directionH" type="sev:MoveTowardAngleType"
    use="optional" default="0"/>
  <attribute name="directionV" type="sev:MoveTowardAngleType"
    use="optional" default="0"/>
  <attribute name="speed" type="float" use="optional"/>
</complexType>

<!-- ##################################################### -->
<!-- Definition of Rigid Body Motion base type             -->
<!-- ##################################################### -->
<simpleType name="TurnAngleType">
  <restriction base="integer">
    <minInclusive value="-180"/>
    <maxInclusive value="180"/>
  </restriction>
</simpleType>
<simpleType name="InclineAngleType">
  <restriction base="integer">
    <minInclusive value="-359"/>
    <maxInclusive value="359"/>
  </restriction>
</simpleType>
<simpleType name="MoveTowardAngleType">
  <restriction base="integer">
    <minInclusive value="0"/>
    <maxInclusive value="359"/>
  </restriction>
</simpleType>
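For illustration, an instance fragment conforming to RigidBodyMotionType might carry a single Shake child with its optional attributes. The sketch below builds such a fragment with Python's standard library; the namespace URI, the element naming, and the direction term reference are assumptions made for the example, since the actual values depend on the SEDL/SEV schema files.

```python
import xml.etree.ElementTree as ET

# Assumed namespace URI for the sev prefix (illustrative only).
SEV = "urn:mpeg:mpeg-v:2010:01-SEV-NS"
ET.register_namespace("sev", SEV)

# A RigidBodyMotion effect carrying one Shake pattern: shake 5 times
# over a 10 cm span. The attribute names follow the ShakeType schema.
effect = ET.Element(f"{{{SEV}}}RigidBodyMotion")
ET.SubElement(effect, f"{{{SEV}}}Shake",
              {"direction": "urn:shake:heave",  # hypothetical term reference
               "count": "5", "distance": "10"})

xml_str = ET.tostring(effect, encoding="unicode")
```

The resulting serialization nests the optional Shake element inside the rigid body motion effect, mirroring the sequence defined in Table 78.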

The binary encoding notation (binary notation) of the rigid body motion effect may be represented as shown in Table 79 below. Here, Table 79 is a table showing the binary notation of the rigid body motion effect.

RigidBodyMotionEffect {            Number of bits    Mnemonic
  MoveTowardFlag                   1                 bslbf
  TrajectorySamplesFlag            1                 bslbf
  InclineFlag                      1                 bslbf
  ShakeFlag                        1                 bslbf
  WaveFlag                         1                 bslbf
  SpinFlag                         1                 bslbf
  TurnFlag                         1                 bslbf
  CollideFlag                      1                 bslbf
  if (MoveTowardFlag) {
    MoveToward                                       MoveTowardType
  }
  if (TrajectorySamplesFlag) {
    SizeOfIntensityRow             4                 uimsbf
    SizeOfIntensityColumn          16                uimsbf
    for (k = 0; k < (SizeOfIntensityRow * SizeOfIntensityColumn); k++) {
      ArrayIntensity[k]            32                fsfb
    }
  }
  if (InclineFlag) {
    Incline                                          InclineType
  }
  if (ShakeFlag) {
    Shake                                            ShakeType
  }
  if (WaveFlag) {
    Wave                                             WaveType
  }
  if (SpinFlag) {
    Spin                                             SpinType
  }
  if (TurnFlag) {
    Turn                                             TurnType
  }
  if (CollideFlag) {
    Collide                                          CollideType
  }
}

MoveTowardType {
  SpeedOrAccelerationFlag          1                 bslbf
  isSpeed                          1                 bslbf
  distanceFlag                     1                 bslbf
  if (SpeedOrAccelerationFlag) {
    if (isSpeed) {
      Speed                        32                fsfb
    } else {
      Acceleration                 32                fsfb
    }
  }
  directionV                       9                 uimsbf
  directionH                       9                 uimsbf
  if (distanceFlag) {
    distance                       32                fsfb
  }
}

InclineType {
  PitchSpeedOrPitchAccelerationFlag  1               bslbf
  isPitchSpeed                     1                 bslbf
  RollSpeedOrRollAccelerationFlag  1                 bslbf
  isRollSpeed                      1                 bslbf
  YawSpeedOrYawAccelerationFlag    1                 bslbf
  isYawSpeed                       1                 bslbf
  if (PitchSpeedOrPitchAccelerationFlag) {
    if (isPitchSpeed) {
      PitchSpeed                   32                fsfb
    } else {
      PitchAcceleration            32                fsfb
    }
  }
  if (RollSpeedOrRollAccelerationFlag) {
    if (isRollSpeed) {
      RollSpeed                    32                fsfb
    } else {
      RollAcceleration             32                fsfb
    }
  }
  if (YawSpeedOrYawAccelerationFlag) {
    if (isYawSpeed) {
      YawSpeed                     32                fsfb
    } else {
      YawAcceleration              32                fsfb
    }
  }
  pitch                            10                bslbf
  roll                             10                bslbf
  yaw                              10                bslbf
}

ShakeType {
  directionFlag                    1                 bslbf
  countFlag                        1                 bslbf
  distanceFlag                     1                 bslbf
  if (directionFlag) {
    direction                      2                 bslbf
  }
  if (countFlag) {
    count                          32                fsfb
  }
  if (distanceFlag) {
    distance                       32                fsfb
  }
}

WaveType {
  directionFlag                    1                 bslbf
  startDirectionFlag               1                 bslbf
  countFlag                        1                 bslbf
  distanceFlag                     1                 bslbf
  if (directionFlag) {
    direction                      1                 bslbf
  }
  if (startDirectionFlag) {
    startDirection                 1                 bslbf
  }
  if (countFlag) {
    count                          32                fsfb
  }
  if (distanceFlag) {
    distance                       32                fsfb
  }
}

SpinType {
  directionFlag                    1                 bslbf
  countFlag                        1                 bslbf
  if (directionFlag) {
    direction                      3                 bslbf
  }
  if (countFlag) {
    count                          32                fsfb
  }
}

TurnType {
  directionFlag                    1                 bslbf
  speedFlag                        1                 bslbf
  if (directionFlag) {
    direction                      9                 simsbf
  }
  if (speedFlag) {
    speed                          32                fsfb
  }
}

CollideType {
  speedFlag                        1                 bslbf
  directionH                       9                 uimsbf
  directionV                       9                 uimsbf
  if (speedFlag) {
    speed                          32                fsfb
  }
}
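To make the bit-level layout of Table 79 concrete, here is a sketch of the ShakeType portion: three 1-bit flags, an optional 2-bit direction code, then optional 32-bit floats. The BitWriter helper is our own convenience, not part of the standard.

```python
import struct

class BitWriter:
    """Minimal MSB-first bit packer (illustrative helper)."""
    def __init__(self):
        self.bits = []

    def write(self, value, nbits):
        # Emit nbits bits of value, most significant bit first.
        for i in range(nbits - 1, -1, -1):
            self.bits.append((value >> i) & 1)

    def to_bytes(self):
        padded = self.bits + [0] * (-len(self.bits) % 8)
        return bytes(
            int("".join(map(str, padded[i:i + 8])), 2)
            for i in range(0, len(padded), 8)
        )

def encode_shake(direction=None, count=None, distance=None):
    """Sketch of ShakeType in Table 79: directionFlag, countFlag,
    distanceFlag, then the optional direction code and floats."""
    w = BitWriter()
    w.write(1 if direction is not None else 0, 1)  # directionFlag
    w.write(1 if count is not None else 0, 1)      # countFlag
    w.write(1 if distance is not None else 0, 1)   # distanceFlag
    if direction is not None:
        w.write(direction, 2)                      # 2-bit direction code
    if count is not None:
        for b in struct.pack(">f", count):         # count, 32-bit float
            w.write(b, 8)
    if distance is not None:
        for b in struct.pack(">f", distance):      # distance, 32-bit float
            w.write(b, 8)
    return w.to_bytes()
```

Encoding a heave-direction shake (code 01) with count 2.0 produces 37 bits, padded to 5 bytes.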

In addition, the semantics of the rigid body motion effect can be expressed as shown in Table 80 below. Here, Table 80 is a table showing the rigid body motion type semantics (Semantics of the RigidBodyMotionType).

Name / Definition

RigidBodyMotionType: Tool for describing a rigid body motion effect.

MoveTowardFlag: This field, which is only present in the binary representation, indicates the presence of the MoveToward element. If it is 1, the MoveToward element is present; otherwise it is not present.

TrajectorySamplesFlag: This field, which is only present in the binary representation, indicates the presence of the TrajectorySamples element. If it is 1, the TrajectorySamples are present; otherwise they are not present.

InclineFlag: This field, which is only present in the binary representation, indicates the presence of the Incline element. If it is 1, the Incline element is present; otherwise it is not present.

ShakeFlag: This field, which is only present in the binary representation, indicates the presence of the Shake element. If it is 1, the Shake element is present; otherwise it is not present.

WaveFlag: This field, which is only present in the binary representation, indicates the presence of the Wave element. If it is 1, the Wave element is present; otherwise it is not present.

SpinFlag: This field, which is only present in the binary representation, indicates the presence of the Spin element. If it is 1, the Spin element is present; otherwise it is not present.

TurnFlag: This field, which is only present in the binary representation, indicates the presence of the Turn element. If it is 1, the Turn element is present; otherwise it is not present.

CollideFlag: This field, which is only present in the binary representation, indicates the presence of the Collide element. If it is 1, the Collide element is present; otherwise it is not present.

MoveToward: This pattern covers the three-dimensional movement of 6DoF, which means changing the location without rotation. The type is sev:MoveTowardType.

TrajectorySamples: This pattern describes a set of position and orientation samples that the rigid body will follow. The type is mpeg7:FloatMatrixType.

SizeOfIntensityRow: Describes the row size of ArrayIntensity (usually 6).

SizeOfIntensityColumn: Describes the column size of ArrayIntensity.

ArrayIntensity: Describes a 6-by-m matrix, where the 6 rows contain three positions (Px, Py, Pz in millimeters) and three orientations (Ox, Oy, Oz in degrees); m represents the number of position samples.

Incline: This pattern covers the pitching, yawing, and rolling motion of 6DoF, which means changing the rotation without changing the location. The type is sev:InclineType.

Shake: This pattern is a continuous motion moving from one side to the opposite side repeatedly. This is an abstracted motion pattern which can alternatively be expressed by repetition of the Move pattern. The type is sev:ShakeType.

Wave: This pattern is a continuous motion from side-up to side-down like the surface of water. This is an abstracted motion pattern which can alternatively be expressed by repetition of the rolling or pitching of the Incline pattern. The type is sev:WaveType.

Spin: This pattern is a continuous turning based on a central point inside, without changing the place. This is an abstracted motion pattern which can alternatively be expressed by repetition of the yawing of the Incline pattern. The type is sev:SpinType.

Turn: This pattern is a motion of moving toward some direction. This is an abstracted motion pattern which can alternatively be expressed by repetition of the Move and Incline patterns. The type is sev:TurnType.

Collide: This pattern is a motion in which a moving object collides against something. This is an abstracted motion pattern which can alternatively be expressed by repetition of the Move and Incline patterns. The type is sev:CollideType.
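The 6-by-m ArrayIntensity layout can be sketched as follows. The sample tuples and the row-major serialization order are our reading of Tables 79 and 80, not normative; each sample carries (Px, Py, Pz) in millimeters and (Ox, Oy, Oz) in degrees.

```python
def flatten_trajectory(samples):
    """Arrange trajectory samples into the 6-by-m ArrayIntensity layout:
    row r holds component r (Px, Py, Pz, Ox, Oy, Oz) of every sample,
    and the matrix is serialized row by row as in the single loop of
    Table 79 over SizeOfIntensityRow * SizeOfIntensityColumn."""
    m = len(samples)
    rows = [[s[r] for s in samples] for r in range(6)]
    flat = [v for row in rows for v in row]  # row-major serialization
    return 6, m, flat
```

For two samples, the flattened array has 12 entries, with flat[6*0 + k] holding Px of sample k and flat[6*... + k] indexed by row and column.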

In the rigid body motion type semantics shown in Table 80, MoveToward represents the movement 600 in a given direction on the x-y-z coordinate system, as shown in FIG. 6. Here, FIG. 6 is a diagram illustrating a movement pattern in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.

In addition, in the rigid body motion type semantics shown in Table 80, TrajectorySamples represents a set of position coordinates 700 describing a trajectory, as shown in FIG. 7. Here, FIG. 7 is a diagram illustrating a motion trace sample pattern in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.

In the rigid body motion type semantics shown in Table 80, the incline represents pitch 810, yaw 820, and roll 830 on the x-y-z coordinate system, as shown in FIG. 8. Here, FIG. 8 is a diagram illustrating an incline pattern in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.

In addition, in the rigid body motion type semantics shown in Table 80, the shake represents a continuous movement pattern 950 of a reciprocating motion 900, as shown in FIG. 9. Here, FIG. 9 is a diagram illustrating a shake pattern in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.

Further, in the rigid body motion type semantics shown in Table 80, the wave represents a continuous wave pattern 1050, such as the ripple 1000 of the water surface, as shown in FIG. 10. Here, FIG. 10 is a diagram illustrating a wave pattern in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.

In the rigid body motion type semantics shown in Table 80, the spin represents a continuous movement pattern 1150 rotating about one axis 1100, as shown in FIG. 11. Here, FIG. 11 is a diagram illustrating a spin pattern in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.

In addition, in the rigid body motion type semantics shown in Table 80, the turn represents a turn-type motion pattern 1250 that rotates in a specific direction about the reference point 1200, as shown in FIG. 12. Here, FIG. 12 is a diagram illustrating a turn pattern in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.

In addition, in the rigid body motion type semantics shown in Table 80, the collide represents the impact 1350 of a predetermined object against another object due to a collision along its movement 1300, as shown in FIG. 13. Here, FIG. 13 is a diagram illustrating a collide pattern in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.

In addition, the semantics of the rigid body motion effect can be expressed as shown in Table 81 below. Here, Table 81 is a table showing the MoveToward type semantics.

Name / Definition

SpeedOrAccelerationFlag: This field, which is only present in the binary representation, specifies the choice of the MoveToward characteristics. If it is 1, the Speed or Acceleration element is present; otherwise neither is present.

isSpeed: This field, which is only present in the binary representation, specifies the choice of the MoveToward characteristics. If it is 1, the Speed element is present; otherwise the Acceleration element is present.

distanceFlag: This field, which is only present in the binary representation, indicates the presence of the distance attribute. If it is 1, the distance attribute is present; otherwise it is not present.

Speed: Describes the moving speed in centimeters per second.

Acceleration: Describes the acceleration in centimeters per square second.

directionH: Describes the horizontal direction of moving in terms of angle. The type is sev:MoveTowardAngleType. The angle starts from the front-center of the rigid body and increases CCW.

directionV: Describes the vertical direction of moving in terms of angle. The type is sev:MoveTowardAngleType. The angle starts from the front-center of the rigid body and increases CCW.

distance: Describes the distance between the origin and destination in centimeters.

In the MoveToward type semantics shown in Table 81, directionH represents the horizontal direction of movement in angle units, as shown in FIG. 14, where horizontal movement 1410 at a predetermined location point 1400 is represented by directionH 0 (1420), directionH 90 (1430), directionH 180 (1440), and directionH 270 (1450). Here, FIG. 14 is a diagram illustrating a horizontal movement pattern in the sensory effect of the multimedia service providing system according to an embodiment of the present invention.

In addition, in the MoveToward type semantics shown in Table 81, directionV represents the vertical direction of movement in angle units, as shown in FIG. 15, where vertical movement 1510 at a predetermined location point 1500 is represented by directionV 0 (1520), directionV 90 (1530), directionV 180 (1540), and directionV 270 (1550). Here, FIG. 15 is a diagram illustrating a vertical movement pattern in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.
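Since MoveTowardAngleType restricts angles to the integers 0 to 359 (Table 78) and Table 81 measures them CCW from the front-center of the rigid body, an application might normalize arbitrary angles into this convention. The helper below is a convenience of our own, not defined by the text:

```python
def to_move_toward_angle(angle_degrees):
    """Normalize an arbitrary angle (degrees, CCW positive) into the
    MoveTowardAngleType range 0..359, where 0 points to the
    front-center of the rigid body."""
    return int(round(angle_degrees)) % 360
```

For instance, a clockwise quarter turn (-90 degrees) maps to directionH 270, matching the CCW convention of FIG. 14.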

In addition, the semantics of the rigid body motion effect can be expressed as shown in Table 82 below. Here, Table 82 is a table showing the Incline type semantics.

Name / Definition

PitchSpeedOrPitchAccelerationFlag: This field, which is only present in the binary representation, specifies the choice of the pitch characteristics. If it is 1, the PitchSpeed or PitchAcceleration element is present; otherwise neither is present.

isPitchSpeed: This field, which is only present in the binary representation, specifies the choice of the pitch characteristics. If it is 1, the PitchSpeed element is present; otherwise the PitchAcceleration element is present.

RollSpeedOrRollAccelerationFlag: This field, which is only present in the binary representation, specifies the choice of the roll characteristics. If it is 1, the RollSpeed or RollAcceleration element is present; otherwise neither is present.

isRollSpeed: This field, which is only present in the binary representation, specifies the choice of the roll characteristics. If it is 1, the RollSpeed element is present; otherwise the RollAcceleration element is present.

YawSpeedOrYawAccelerationFlag: This field, which is only present in the binary representation, specifies the choice of the yaw characteristics. If it is 1, the YawSpeed or YawAcceleration element is present; otherwise neither is present.

isYawSpeed: This field, which is only present in the binary representation, specifies the choice of the yaw characteristics. If it is 1, the YawSpeed element is present; otherwise the YawAcceleration element is present.

PitchSpeed: Describes the rotation speed based on the X-axis in degrees per second.

PitchAcceleration: Describes the acceleration based on the X-axis in degrees per square second.

RollSpeed: Describes the rotation speed based on the Z-axis in degrees per second.

RollAcceleration: Describes the acceleration based on the Z-axis in degrees per square second.

YawSpeed: Describes the rotation speed based on the Y-axis in degrees per second.

YawAcceleration: Describes the acceleration based on the Y-axis in degrees per square second.

pitch: Describes the rotation based on the X-axis in terms of angle. A positive value means the rotation angle in the direction of the pitch arrow.

roll: Describes the rotation based on the Z-axis in terms of angle. A positive value means the rotation angle in the direction of the roll arrow.

yaw: Describes the rotation based on the Y-axis in terms of angle. A positive value means the rotation angle in the direction of the yaw arrow.

In the incline type semantics shown in Table 82, the pitch, roll, and yaw represent, respectively, the magnitude 1610 of the rotation about the x-axis, the magnitude 1620 of the rotation about the z-axis, and the magnitude 1630 of the rotation about the y-axis on the x-y-z coordinate axes, as shown in FIG. 16. Here, FIG. 16 is a diagram illustrating an incline pattern in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.

In addition, the semantics of the rigid body motion effect can be expressed as shown in Table 83 below. Here, Table 83 is a table showing the shake type semantics (Semantics of the ShakeType).

Name / Definition

directionFlag: This field, which is only present in the binary representation, indicates the presence of the direction attribute. If it is 1, the direction attribute is present; otherwise it is not present.

countFlag: This field, which is only present in the binary representation, indicates the presence of the count attribute. If it is 1, the count attribute is present; otherwise it is not present.

distanceFlag: This field, which is only present in the binary representation, indicates the presence of the distance attribute. If it is 1, the distance attribute is present; otherwise it is not present.

direction: Describes the direction of the shake motion. A CS that may be used for this purpose is the ShakeDirectionCS defined in Annex A.2.4.

count: Describes the times to shake during the duration time.

distance: Describes the distance between the two ends of the shaking motion in centimeters.

In the shake type semantics shown in Table 83, the direction indicates the direction of the shake motion 1700 in space, as shown in FIG. 17, that is, heave 1710, sway 1720, and surge 1730. Here, FIG. 17 is a diagram illustrating a direction shake pattern in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention. In addition, the direction of the direction shake pattern may be represented in binary notation as shown in Table 84 below. That is, in the shake type semantics shown in Table 83, the direction is encoded in binary notation. Here, Table 84 is a table showing the binary notation of the direction.

direction (Shake)    Semantics
00                   Reserved
01                   Heave
10                   Sway
11                   Surge
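A decoder's mapping of the 2-bit direction field of Table 84 back to its term can be sketched as below; returning None for the Reserved code is a local choice for the example, not something the table prescribes.

```python
def decode_shake_direction(code):
    """Map the 2-bit shake direction code of Table 84 to its term.
    Code 0b00 is Reserved and yields None here."""
    return {0b01: "Heave", 0b10: "Sway", 0b11: "Surge"}.get(code)
```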

In the shake type semantics shown in Table 83, the distance indicates the moving distance 1800 of the shake motion 1850, as shown in FIG. 18. Here, FIG. 18 is a diagram illustrating a shake motion distance in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.

In addition, the semantics of the rigid body motion effect can be expressed as shown in Table 85 below. Here, Table 85 is a table showing the wave type semantics.

Name / Definition

directionFlag: This field, which is only present in the binary representation, indicates the presence of the direction attribute. If it is 1, the direction attribute is present; otherwise it is not present.

startDirectionFlag: This field, which is only present in the binary representation, indicates the presence of the startDirection attribute. If it is 1, the startDirection attribute is present; otherwise it is not present.

countFlag: This field, which is only present in the binary representation, indicates the presence of the count attribute. If it is 1, the count attribute is present; otherwise it is not present.

distanceFlag: This field, which is only present in the binary representation, indicates the presence of the distance attribute. If it is 1, the distance attribute is present; otherwise it is not present.

direction: Describes the direction of the wave motion. A CS that may be used for this purpose is the WaveDirectionCS defined in Annex A.2.8.

startDirection: Describes whether it starts towards the up direction or the down direction. A CS that may be used for this purpose is the WaveStartDirectionCS defined in Annex A.2.9.

count: Describes the times to wave during the duration time.

distance: Describes the distance between the top and the bottom of the wave motion in centimeters.

In the wave type semantics shown in Table 85, the direction represents a continuous wave pattern, such as a water-surface ripple, at predetermined positions 1900 and 2000, as shown in FIGS. 19 and 20; in particular, the wave pattern direction is front-rear 1910 or left-right 2010. Here, FIGS. 19 and 20 illustrate wave motion directions in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention. In addition, the direction of the wave pattern may be represented in binary notation as shown in Table 86 below. That is, in the wave type semantics shown in Table 85, the direction is encoded in binary notation. Here, Table 86 is a table showing the binary notation of the direction.

direction (Wave)    Semantics
0                   Left-right
1                   Front-rear

In the wave type semantics shown in Table 85, the startDirection indicates the start direction of the wave patterns 2100 and 2200, as shown in FIGS. 21 and 22, in particular the down direction 2110 and the up direction 2210. Here, FIGS. 21 and 22 are views illustrating a wave motion starting direction in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention. The start direction of the wave pattern may be represented in binary notation as shown in Table 87 below. That is, in the wave type semantics shown in Table 85, the startDirection is encoded in binary notation. Here, Table 87 is a table showing the binary notation of the startDirection.

startDirection (Wave)    Semantics
0                        Up
1                        Down

In addition, in the wave type semantics shown in Table 85, the distance represents the maximum distance 2310 of the wave pattern 2300, as shown in FIG. 23. Here, FIG. 23 is a diagram illustrating a wave motion distance in the sensory effect of the multimedia service providing system according to an embodiment of the present invention.

In addition, the semantics of the rigid body motion effect can be expressed as shown in Table 88 below. Here, Table 88 is a table showing the turn type semantics.

Name / Definition

directionFlag: This field, which is only present in the binary representation, indicates the presence of the direction attribute. If it is 1, the direction attribute is present; otherwise it is not present.

speedFlag: This field, which is only present in the binary representation, indicates the presence of the speed attribute. If it is 1, the speed attribute is present; otherwise it is not present.

direction: Describes the turning direction in terms of angle. The type is sev:TurnAngleType.

speed: Describes the turning speed in degrees per second.

In the turn type semantics shown in Table 88, the direction indicates the turn direction, as shown in FIG. 24, in particular the direction -90 (2410) and the direction 90 (2420) of the turn pattern. 24 is a diagram illustrating a turn pattern direction in the sensory effect of the multimedia service providing system according to an embodiment of the present invention.

In addition, the semantics of the rigid body motion effect can be expressed as shown in Table 89 below. Here, Table 89 is a table showing the spin type semantics.

Name / Definition

directionFlag: This field, which is only present in the binary representation, indicates the presence of the direction attribute. If it is 1, the direction attribute is present; otherwise it is not present.

countFlag: This field, which is only present in the binary representation, indicates the presence of the count attribute. If it is 1, the count attribute is present; otherwise it is not present.

direction: Describes the direction of the spinning based on the 3 axes. A CS that may be used for this purpose is the SpinDirectionCS defined in Annex A.2.5.
NOTE 1: Forward-spin based on the x axis (which is "xf" in the classification scheme) indicates the spinning direction by the pitch arrow. Otherwise, backward-spin based on the x axis (which is "xb" in the classification scheme) indicates the opposite spinning direction of "xf".

count: Describes the times to spin during the duration time.

In the spin type semantics shown in Table 89, the direction is encoded in binary notation as shown in Table 90 below. Here, Table 90 is a table showing the binary notation of the direction.

direction (Spin)   Semantics
000                Reserved
001                xf
010                xb
011                yf
100                yb
101                zf
110                zb
111                Reserved
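The 3-bit direction codes of Table 90 amount to a small lookup between classification-scheme terms and code values. A minimal Python sketch is given below; the dictionary and function names are illustrative assumptions, not part of any MPEG-V reference software.

```python
# Illustrative mapping of Table 90: 3-bit codes for spin-direction terms
# of the SpinDirectionCS. Codes 0b000 and 0b111 are reserved.
SPIN_DIRECTION_CODES = {
    "xf": 0b001,  # forward spin about the x axis
    "xb": 0b010,  # backward spin about the x axis
    "yf": 0b011,
    "yb": 0b100,
    "zf": 0b101,
    "zb": 0b110,
}

def encode_spin_direction(term: str) -> int:
    """Return the 3-bit code for a spin-direction term."""
    return SPIN_DIRECTION_CODES[term]

def decode_spin_direction(code: int) -> str:
    """Return the CS term for a 3-bit code; reserved codes raise KeyError."""
    inverse = {v: k for k, v in SPIN_DIRECTION_CODES.items()}
    return inverse[code]
```

Round-tripping a term through the two helpers recovers the original term, which is the property a decoder relies on.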

In addition, the semantics of the rigid body motion effect can be shown in Table 91 below. Here, Table 91 is a table showing the collide type semantics.

speedFlag: This field, which is present only in the binary representation, indicates the presence of the speed attribute. If it is 1, the speed attribute is present; otherwise it is not present.
directionH: Describes the horizontal direction of receiving the impact in terms of angle. The type is sev:MoveTowardAngleType. The angle starts from the front-center of the rigid body and increases turning right.
directionV: Describes the vertical direction of receiving the impact in terms of angle. The type is sev:MoveTowardAngleType. The angle starts from the front-center of the rigid body and increases turning up.
speed: Describes the speed of the colliding object in terms of centimeters per second.

In the collide type semantics shown in Table 91, directionH indicates the horizontal direction of the impact in angle units, as shown in FIG. 25. Horizontal movement 2510 at position point 2500 is represented by directionH 0 (2520), directionH 90 (2530), directionH 180 (2540), and directionH 270 (2550). Here, FIG. 25 is a diagram illustrating a horizontal collide pattern in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.

In the collide type semantics shown in Table 91, directionV indicates the vertical direction of the impact in angle units, as shown in FIG. 26. Vertical movement 2610 at position point 2600 is indicated by directionV 0 (2620), directionV 90 (2630), directionV 180 (2640), and directionV 270 (2650). Here, FIG. 26 is a diagram illustrating a vertical collide pattern in the sensory effect of the multimedia service providing system according to an exemplary embodiment of the present invention.

Next, the passive kinesthetic motion effect as a motion effect will be described in detail. The syntax of the passive kinesthetic motion effect can be expressed as shown in Table 92 below. Here, Table 92 is a table showing the syntax of the passive kinesthetic motion effect.

<!-- ################################################ -->
<!-- SEV Passive Kinesthetic Motion type              -->
<!-- ################################################ -->
<complexType name="PassiveKinestheticMotionType">
  <complexContent>
    <extension base="sev:RigidBodyMotionType">
      <attribute name="updaterate" type="positiveInteger" use="required"/>
    </extension>
  </complexContent>
</complexType>

The binary coding notation, or binary notation, of the passive kinesthetic motion effect may be represented as shown in Table 93 below. Here, Table 93 is a table showing the binary notation of the passive kinesthetic motion effect.

PassiveKinestheticMotion {   Number of bits   Mnemonic
  RigidBodyMotionType        See subclause    RigidBodyMotionType
  updaterate                 16               uimsbf
}
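The updaterate field above is a 16-bit uimsbf (unsigned integer, most significant bit first), which corresponds to big-endian packing. A minimal sketch in Python follows; the helper names are hypothetical, and the RigidBodyMotionType payload that precedes the field in the actual stream is omitted.

```python
import struct

def pack_updaterate(updaterate: int) -> bytes:
    """Pack the 16-bit uimsbf updaterate field of Table 93 as two
    big-endian bytes (most significant bit first)."""
    if not 0 < updaterate < 2**16:
        raise ValueError("updaterate must be positive and fit in 16 bits")
    return struct.pack(">H", updaterate)

def unpack_updaterate(payload: bytes) -> int:
    """Recover the updaterate value from its 2-byte field."""
    (value,) = struct.unpack(">H", payload)
    return value
```

For example, an updaterate of 20 samples per second is carried as the two bytes 0x00 0x14.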

In addition, the semantics of the passive kinesthetic motion effect can be shown as in Table 94 below. Here, Table 94 is a table showing the semantics of the passive kinesthetic motion type.

PassiveKinestheticMotionType: Tool for describing a passive kinesthetic motion effect. This type defines a passive kinesthetic motion mode. In this mode, a user holds the kinesthetic device softly and the kinesthetic device guides the user's hand according to the recorded motion trajectories, which are specified by three positions and three orientations.
TrajectorySamples: Tool for describing a passive kinesthetic interaction. The passive kinesthetic motion data are comprised of a 6-by-m matrix, where the 6 rows contain three positions (Px, Py, Pz, in millimeters) and three orientations (Ox, Oy, Oz, in degrees). These six data are updated with the same updaterate.
updaterate: Describes the number of data updates per second. For example, the value 20 means the kinesthetic device will move to 20 different positions and orientations each second.
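Under these semantics, the playback duration of a recorded trajectory follows directly from the number of columns m of the 6-by-m matrix and the updaterate. The helper below is an illustrative sketch of that relationship, not something defined by the specification.

```python
def motion_duration_seconds(num_samples: int, updaterate: int) -> float:
    """Return the playback time of a recorded passive kinesthetic motion:
    m trajectory samples consumed at `updaterate` samples per second."""
    return num_samples / updaterate
```

A 6-by-100 trajectory matrix played at an updaterate of 20 therefore lasts 5 seconds.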

Next, the passive kinesthetic force effect will be described in detail. The syntax of the passive kinesthetic force effect can be expressed as shown in Table 95 below. Here, Table 95 is a table showing the syntax of the passive kinesthetic force effect.

<!-- ################################################ -->
<!-- SEV Passive Kinesthetic Force type               -->
<!-- ################################################ -->
<complexType name="PassiveKinestheticForceType">
  <complexContent>
    <extension base="sedl:EffectBaseType">
      <sequence>
        <element name="passivekinestheticforce"
                 type="mpeg7:FloatMatrixType"/>
      </sequence>
      <attribute name="updaterate" type="positiveInteger" use="required"/>
    </extension>
  </complexContent>
</complexType>

The binary coding notation, or binary notation, of the passive kinesthetic force effect may be represented as shown in Table 96 below. Here, Table 96 is a table showing the binary notation of the passive kinesthetic force effect.

PassiveKinestheticForce {                            Number of bits   Mnemonic
  EffectBaseType                                                      EffectBaseType
  SizeOfforceRow                                     4                uimsbf
  SizeOfforceColumn                                  16               uimsbf
  for (k = 0; k < (SizeOfforceRow * SizeOfforceColumn); k++) {
    force[k]                                         32               fsfb
  }
  updaterate                                         16               uimsbf
}
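The Table 96 layout concatenates a 4-bit row size, a 16-bit column size, one 32-bit float per matrix element, and a 16-bit updaterate, bit by bit with the most significant bit first. The Python sketch below illustrates that packing under stated simplifications: the EffectBaseType payload is omitted, the final byte is zero-padded, and the function name is hypothetical.

```python
import struct

def pack_passive_kinesthetic_force(force_matrix, updaterate):
    """Sketch of the Table 96 bitstream (EffectBaseType omitted):
    4-bit SizeOfforceRow, 16-bit SizeOfforceColumn, row*column 32-bit
    big-endian IEEE 754 floats, then a 16-bit updaterate, all MSB first
    and zero-padded to a whole number of bytes."""
    rows, cols = len(force_matrix), len(force_matrix[0])
    bits = format(rows, "04b")
    bits += format(cols, "016b")
    for row in force_matrix:
        for value in row:
            # reinterpret the float's IEEE 754 bytes as a 32-bit integer
            (word,) = struct.unpack(">I", struct.pack(">f", value))
            bits += format(word, "032b")
    bits += format(updaterate, "016b")
    bits += "0" * (-len(bits) % 8)  # byte-align the tail
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
```

A 6-by-2 force/torque matrix thus occupies 4 + 16 + 12*32 + 16 = 420 bits, padded to 53 bytes; the first byte carries the row size 6 (0110) followed by the top bits of the column size.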

In addition, the semantics of the passive kinesthetic force effect can be expressed as shown in Table 97 below. Here, Table 97 is a table showing the semantics of the passive kinesthetic force type.

PassiveKinestheticForceType: Tool for describing a passive kinesthetic force/torque effect. This type defines a passive kinesthetic force/torque mode. In this mode, a user holds the kinesthetic device softly and the kinesthetic device guides the user's hand according to the recorded force/torque histories.
passivekinestheticforce: Describes a passive kinesthetic force/torque sensation. The passive kinesthetic force/torque data are comprised of a 6-by-m matrix, where the 6 rows contain three forces (Fx, Fy, Fz, in Newtons) and three torques (Tx, Ty, Tz, in Newton-millimeters) for the force/torque trajectories. These six data are updated with the same updaterate.
SizeOfforceRow: Describes the row size of force (usually 6).
SizeOfforceColumn: Describes the column size of force.
force: Describes a 6-by-m matrix, where the 6 rows contain three forces (Fx, Fy, Fz, in Newtons) and three torques (Tx, Ty, Tz, in Newton-millimeters) for the force/torque trajectories. 'm' represents the number of position samples.
updaterate: Describes the number of data updates per second.

Next, the active kinesthetic effect will be described in detail. The syntax of the active kinesthetic effect can be expressed as shown in Table 98 below. Here, Table 98 is a table showing the syntax of the active kinesthetic effect.

<!-- ################################################ -->
<!-- SEV Active Kinesthetic type                      -->
<!-- ################################################ -->
<complexType name="ActiveKinestheticType">
  <complexContent>
    <extension base="sedl:EffectBaseType">
      <sequence>
        <element name="activekinesthetic"
                 type="sev:ActiveKinestheticForceType"/>
      </sequence>
    </extension>
  </complexContent>
</complexType>
<complexType name="ActiveKinestheticForceType">
  <attribute name="Fx" type="float"/>
  <attribute name="Fy" type="float"/>
  <attribute name="Fz" type="float"/>
  <attribute name="Tx" type="float" use="optional"/>
  <attribute name="Ty" type="float" use="optional"/>
  <attribute name="Tz" type="float" use="optional"/>
</complexType>

The binary coding notation, or binary notation, of the active kinesthetic effect may be represented as in Table 99 below. Here, Table 99 is a table showing the binary notation of the active kinesthetic effect.

ActiveKinesthetic {          Number of bits   Mnemonic
  EffectBaseType                              EffectBaseType
  TxFlag                     1                bslbf
  TyFlag                     1                bslbf
  TzFlag                     1                bslbf
  FxFlag                     1                bslbf
  FyFlag                     1                bslbf
  FzFlag                     1                bslbf
  if (TxFlag) {
    Tx                       32               fsfb
  }
  if (TyFlag) {
    Ty                       32               fsfb
  }
  if (TzFlag) {
    Tz                       32               fsfb
  }
  if (FxFlag) {
    Fx                       32               fsfb
  }
  if (FyFlag) {
    Fy                       32               fsfb
  }
  if (FzFlag) {
    Fz                       32               fsfb
  }
}
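The flag-then-optional-value pattern of Table 99 can be illustrated with a short Python sketch. For simplicity the six 1-bit flags are packed into one byte (the normative stream is not byte-aligned), the EffectBaseType payload is omitted, and the function name is hypothetical.

```python
import struct

def pack_active_kinesthetic(tx=None, ty=None, tz=None, fx=None, fy=None, fz=None):
    """Sketch of the Table 99 layout: six presence flags in the order
    Tx, Ty, Tz, Fx, Fy, Fz (MSB first), then a 32-bit big-endian float
    for each attribute whose flag is set. Absent attributes cost no
    payload bits, which is the point of the flag scheme."""
    values = [tx, ty, tz, fx, fy, fz]
    flags = 0
    payload = b""
    for i, v in enumerate(values):
        if v is not None:
            flags |= 1 << (7 - i)  # Tx in bit 7, Fz in bit 2; bits 1-0 pad
            payload += struct.pack(">f", v)
    return bytes([flags]) + payload
```

Sending only Fx, for example, sets just the FxFlag bit and carries a single 4-byte float, so an effect with one force component costs 5 bytes here instead of 25.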

In addition, the semantics of the active kinesthetic effect can be expressed as shown in Table 100 below. Here, Table 100 is a table showing the semantics of the active kinesthetic type.

ActiveKinestheticType: Tool for describing an active kinesthetic effect. This type defines an active kinesthetic interaction mode. In this mode, when a user touches an object by his/her will, the computed contact forces and torques are provided.
ActiveKinestheticForceType: Describes three forces (Fx, Fy, Fz) and three torques (Tx, Ty, Tz) for each axis in an active kinesthetic mode. Force is represented in the unit of N (Newton) and torque is represented in the unit of Nmm (Newton-millimeter).
activekinesthetic: Tool for describing an active kinesthetic interaction.
txFlag: This field, which is present only in the binary representation, indicates the presence of the tx attribute. If it is 1, the tx attribute is present; otherwise it is not present.
tyFlag: This field, which is present only in the binary representation, indicates the presence of the ty attribute. If it is 1, the ty attribute is present; otherwise it is not present.
tzFlag: This field, which is present only in the binary representation, indicates the presence of the tz attribute. If it is 1, the tz attribute is present; otherwise it is not present.
fxFlag: This field, which is present only in the binary representation, indicates the presence of the fx attribute. If it is 1, the fx attribute is present; otherwise it is not present.
fyFlag: This field, which is present only in the binary representation, indicates the presence of the fy attribute. If it is 1, the fy attribute is present; otherwise it is not present.
fzFlag: This field, which is present only in the binary representation, indicates the presence of the fz attribute. If it is 1, the fz attribute is present; otherwise it is not present.
Tx: Torque for the x-axis in an active kinesthetic mode. Torque is represented in the unit of Nmm (Newton-millimeter).
Ty: Torque for the y-axis in an active kinesthetic mode. Torque is represented in the unit of Nmm (Newton-millimeter).
Tz: Torque for the z-axis in an active kinesthetic mode. Torque is represented in the unit of Nmm (Newton-millimeter).
Fx: Force for the x-axis in an active kinesthetic mode. Force is represented in the unit of N (Newton).
Fy: Force for the y-axis in an active kinesthetic mode. Force is represented in the unit of N (Newton).
Fz: Force for the z-axis in an active kinesthetic mode. Force is represented in the unit of N (Newton).

Next, the tactile effect will be described in detail. The syntax of the tactile effect can be expressed as shown in Table 101 below. Here, Table 101 is a table showing the syntax of the tactile effect.

<!-- ################################################ -->
<!-- SEV Tactile type                                 -->
<!-- ################################################ -->
<complexType name="TactileType">
  <complexContent>
    <extension base="sedl:EffectBaseType">
      <sequence>
        <choice>
          <element name="ArrayIntensity" type="mpeg7:FloatMatrixType"/>
          <element name="TactileVideo" type="anyURI"/>
        </choice>
      </sequence>
      <attribute name="tactileEffect" type="mpeg7:termReferenceType" use="optional"/>
      <attribute name="updaterate" type="positiveInteger" use="optional"/>
    </extension>
  </complexContent>
</complexType>

The binary coding notation, or binary notation, of the tactile effect may be represented as shown in Table 102 below. Here, Table 102 is a table showing the binary notation of the tactile effect.

Tactile {                                            Number of bits   Mnemonic
  EffectBaseType                                                      EffectBaseType
  TactileFlag                                        1                bslbf
  tactileEffectFlag                                  1                bslbf
  updaterateFlag                                     1                bslbf
  if (TactileFlag) {
    SizeOfIntensityRow                               4                uimsbf
    SizeOfIntensityColumn                            16               uimsbf
    for (k = 0; k < (SizeOfIntensityRow * SizeOfIntensityColumn); k++) {
      ArrayIntensity[k]                              32               fsfb
    }
  } else {
    TactileVideo                                                      UTF-8
  }
  if (tactileEffectFlag) {
    tactileEffect                                    3                bslbf (Table 104)
  }
  if (updaterateFlag) {
    updaterate                                       16               uimsbf
  }
}

In addition, the semantics of the tactile effect may be represented as shown in Table 103 below. Here, Table 103 is a table showing the tactile type semantics.

TactileType: Tool for describing a tactile effect. Tactile effects can provide vibrations, pressures, temperature, etc., directly onto some areas of human skin through many types of actuators such as vibration motors, air-jets, and piezo-actuators. A tactile effect may effectively be represented by an ArrayIntensity or by a TactileVideo, each of which can be composed of an m-by-n matrix that is mapped to m-by-n actuators in a tactile device. A tactile video is defined as a grayscale video formed with m-by-n pixels matched to the m-by-n tactile actuator array.
tactileFlag: This field, which is present only in the binary representation, specifies the choice of the tactile effect source. If it is 1, the ArrayIntensity is present; otherwise the TactileVideo is present.
tactileEffectFlag: This field, which is present only in the binary representation, indicates the presence of the tactileEffect attribute. If it is 1, the tactileEffect attribute is present; otherwise it is not present.
updateRateFlag: This field, which is present only in the binary representation, indicates the presence of the updateRate attribute. If it is 1, the updateRate attribute is present; otherwise it is not present.
SizeOfIntensityRow: Describes the row size of ArrayIntensity (usually 6).
SizeOfIntensityColumn: Describes the column size of ArrayIntensity.
ArrayIntensity: Describes intensities in terms of physical quantities for all elements of the m-by-n matrix of tactile actuators. For the temperature tactile effect, for example, intensity is specified in the unit of Celsius. For the vibration tactile effect, intensity is specified in the unit of mm (amplitude). For the pressure tactile effect, intensity is specified in the unit of Newton/mm².
TactileVideo: Describes intensities in terms of a grayscale (0-255) video of tactile information. This grayscale value (0-255) can be divided into several levels according to the number of levels that a device produces.
tactileEffect: Describes the tactile effect to use. A CS that may be used for this purpose is the TactileEffectCS defined in Annex A.2.4. This refers to the preferred tactile effects. In the binary description, the following mapping table is used.
updaterate: Describes the number of data updates per second.

In the tactile type semantics shown in Table 103, the tactile effect is encoded in binary notation as shown in Table 104 below. Here, Table 104 is a table showing the binary notation of the tactile effect.

tactileEffect   TactileEffectType
000             vibration
001             temperature
010             pressure
011-111         Reserved
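Two pieces of the tactile semantics above lend themselves to a short illustration: the 3-bit effect codes of Table 104, and the division of the 0-255 TactileVideo grayscale into the number of levels a device can produce. The Python sketch below models both; the dictionary and function names are illustrative assumptions.

```python
# 3-bit codes of Table 104; codes 0b011 through 0b111 are reserved.
TACTILE_EFFECT_CODES = {
    "vibration": 0b000,
    "temperature": 0b001,
    "pressure": 0b010,
}

def quantize_tactile_video(pixels, levels):
    """Map grayscale (0-255) tactile-video intensities onto `levels`
    device-producible levels, as described for TactileVideo in Table 103.
    `pixels` is an m-by-n list of lists of ints in [0, 255]."""
    step = 256 / levels
    return [[min(int(p // step), levels - 1) for p in row] for row in pixels]
```

For a device with 4 intensity levels, for instance, grayscale 0 maps to level 0, 64 to level 1, 128 to level 2, and 255 to the top level 3.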

Next, an operation of the multimedia service providing system according to an exemplary embodiment of the present invention will be described in more detail with reference to FIG. 27.

FIG. 27 is a diagram schematically illustrating a process of providing a multimedia service of a multimedia service providing system according to an embodiment of the present invention.

Referring to FIG. 27, in operation 2710, a service provider of the multimedia service providing system generates multimedia content of a multimedia service to be provided to users according to a service request of the users, together with sensory effect information of the multimedia content.

In operation 2720, the service provider encodes the generated multimedia content, and encodes the sensory effect information in binary notation, that is, using a binary notation coding scheme. Here, since the binary notation encoding of the sensory effect information has been described in detail above, a detailed description thereof will be omitted.

In operation 2730, the service provider transmits the multimedia data including the encoded multimedia content and sensory effect information encoded in the binary notation.

In operation 2740, the user server of the multimedia service providing system receives the multimedia data, and decodes the sensory effect information encoded in the binary notation from the received multimedia data.

In operation 2750, the user server converts the sensory effect information into control information in consideration of capability information of each user device, and encodes the converted control information in binary notation, that is, using a binary notation coding scheme. Here, since the conversion of the control information and the binary notation encoding of the control information have been described in detail above, a detailed description thereof will be omitted.

Then, in step 2760, the user server transmits the multimedia content and the control information encoded in the binary notation to the user devices, respectively.

In operation 2770, each of the user devices of the multimedia service providing system may simultaneously provide the multimedia content and the sensory effects of the multimedia content to the users in real time through device control based on the control information encoded in the binary notation, thereby providing various high-quality multimedia services.
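The flow of operations 2710 through 2770 can be summarized in a toy end-to-end sketch: the provider packages content with encoded sensory effects, and the user server adapts each effect to device capabilities before the devices render it. All class names, the capability format, and the stand-in list-based "binary" coding below are hypothetical simplifications of the embodiment described above, not the disclosed implementation.

```python
class ServiceProvider:
    """Operations 2710-2730: generate content plus sensory effects and
    package them as multimedia data."""
    def create(self, content, effects):
        return {"content": content, "effects_bin": self.encode(effects)}

    @staticmethod
    def encode(effects):
        # stand-in for the binary notation coding scheme
        return [(name, intensity) for name, intensity in effects]

class UserServer:
    """Operations 2740-2760: decode effects and convert them into
    device control information bounded by each device's capability."""
    def __init__(self, device_caps):
        self.device_caps = device_caps  # e.g. {"wind": {"max": 10}}

    def to_control(self, multimedia_data):
        controls = {}
        for name, intensity in multimedia_data["effects_bin"]:
            if name in self.device_caps:  # clamp to device capability
                controls[name] = min(intensity, self.device_caps[name]["max"])
        return multimedia_data["content"], controls

provider = ServiceProvider()
data = provider.create("movie.mp4", [("wind", 12), ("vibration", 5)])
server = UserServer({"wind": {"max": 10}, "vibration": {"max": 10}})
content, controls = server.to_control(data)
```

Here a requested wind intensity of 12 is clamped to the device maximum of 10, mirroring how the user server tailors sensory effect information to capability information before operation 2770.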

While the invention has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention. Therefore, the scope of the present invention should not be limited to the described embodiments, but should be determined not only by the scope of the following claims, but also by the equivalents of the claims.

Claims (16)

A system for providing a multimedia service in a communication system, the system comprising:
A service provider for providing multimedia content of the multimedia service and sensory effect information indicating sensory effects of the multimedia content according to a service request for a multimedia service that users want to receive;
A user server configured to receive multimedia data including the multimedia content and the sensory effect information, convert the sensory effect information from the multimedia data into control information, and provide the control information; And
User devices that provide the multimedia content and the sensory effects to the users in real time through device control according to the control information.
The system of claim 1,
Wherein the service provider encodes the sensory effect information in a binary representation.

The system of claim 2,
Wherein the multimedia data includes the multimedia content and the sensory effect information encoded in the binary notation.
The system of claim 3, wherein the service provider comprises:
A generator for generating the multimedia content and the sensory effect information;
An encoder for encoding the sensory effect information by using a binary notation coding scheme; And
And a transmitter for transmitting the multimedia data.
The system of claim 4,
Wherein the encoder encodes the sensory effect information into a sensory effect stream in binary notation.

The system of claim 2,
Wherein the sensory effects include a light effect, a colored light effect, a flash light effect, a temperature effect, a wind effect, a vibration effect, a spraying effect, a scent effect, a smoke effect, a color correction effect, a rigid body motion effect, a passive kinesthetic motion effect, a passive kinesthetic force effect, an active kinesthetic effect, and a tactile effect.
The system of claim 6,
Wherein the service provider defines syntax, binary representation, and semantics of the sensory effects.
A system for providing a multimedia service in a communication system, the system comprising:
A generator for generating multimedia content of the multimedia service and generating sensory effect information indicating sensory effects of the multimedia content according to a service request for a multimedia service that users want to receive;
An encoder for encoding the sensory effect information in binary notation using a binary representation coding scheme; And
And a transmitter for transmitting the multimedia data including the multimedia content and sensory effect information encoded in the binary notation.
The system of claim 8,
Wherein the encoder encodes the sensory effect information into a sensory effect stream in binary notation.
The system of claim 8,
Wherein the sensory effects include a light effect, a colored light effect, a flash light effect, a temperature effect, a wind effect, a vibration effect, a spraying effect, a scent effect, a smoke effect, a color correction effect, a rigid body motion effect, a passive kinesthetic motion effect, a passive kinesthetic force effect, an active kinesthetic effect, and a tactile effect.
The system of claim 10,
Wherein the encoder defines the syntax, binary representation, and semantics of the sensory effects.
A method for providing a multimedia service in a communication system, the method comprising:
Generating multimedia content of the multimedia service and generating sensory effect information indicating sensory effects of the multimedia content according to a service request for a multimedia service that users want to receive;
Encoding the sensory effect information in binary notation using a binary representation coding scheme;
Converting the sensory effect information encoded in the binary notation into control information in binary notation; And
Providing the multimedia content and the sensory effects to the users in real time through device control according to the control information in the binary notation.
The method of claim 12,
Further comprising transmitting multimedia data including the multimedia content and the sensory effect information encoded in the binary notation.
The method of claim 12,
Wherein the encoding in the binary notation comprises encoding the sensory effect information into a sensory effect stream in binary notation.
The method of claim 12,
Wherein the sensory effects include a light effect, a colored light effect, a flash light effect, a temperature effect, a wind effect, a vibration effect, a spraying effect, a scent effect, a smoke effect, a color correction effect, a rigid body motion effect, a passive kinesthetic motion effect, a passive kinesthetic force effect, an active kinesthetic effect, and a tactile effect.
16. The method of claim 15,
Wherein the encoding in binary notation comprises defining the syntax, binary representation, and semantics of the sensory effects.
KR1020110030397A 2010-04-05 2011-04-01 System and method for providing multimedia service in a communication system KR20110112211A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/080,095 US20110282967A1 (en) 2010-04-05 2011-04-05 System and method for providing multimedia service in a communication system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20100031129 2010-04-05
KR1020100031129 2010-04-05

Publications (1)

Publication Number Publication Date
KR20110112211A true KR20110112211A (en) 2011-10-12

Family

ID=45028081

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110030397A KR20110112211A (en) 2010-04-05 2011-04-01 System and method for providing multimedia service in a communication system

Country Status (1)

Country Link
KR (1) KR20110112211A (en)


Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination