EP3399763A1 - Method and system for haptic data encoding - Google Patents

Method and system for haptic data encoding

Info

Publication number
EP3399763A1
Authority
EP
European Patent Office
Prior art keywords
haptic
data
user device
haptic data
encoded
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP18180688.6A
Other languages
German (de)
English (en)
French (fr)
Inventor
Loc Phan
Satvir Singh BHATIA
Stephen D. Rank
Christopher J Ullrich
Jean Francois DIONNE
Hugues-Antoine Oliver
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Immersion Corp
Original Assignee
Immersion Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Immersion Corp filed Critical Immersion Corp
Publication of EP3399763A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/637Control signals issued by the client directed to the server or network components
    • H04N21/6377Control signals issued by the client directed to the server or network components directed to server
    • H04N21/6379Control signals issued by the client directed to the server or network components directed to server directed to encoder, e.g. for requesting a lower encoding rate
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/215Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • A63F13/28Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285Generating tactile feedback signals via the game input device, e.g. force feedback
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/165Management of the audio stream, e.g. setting of volume, audio stream path
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B6/00Tactile signalling systems, e.g. personal calling systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/70Media network packetisation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/44Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/46Embedding additional information in the video signal during the compression process
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/233Processing of audio elementary streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234336Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by media transcoding, e.g. video is transformed into a slideshow of still pictures or audio is converted into text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/23614Multiplexing of additional data and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/438Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
    • H04N21/4382Demodulation or channel decoding, e.g. QPSK demodulation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/438Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
    • H04N21/4385Multiplex stream processing, e.g. multiplex stream decrypting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/4508Management of client data or end-user data
    • H04N21/4516Management of client data or end-user data involving client characteristics, e.g. Set-Top-Box type, software version or amount of memory available
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/633Control signals issued by server directed to the network components or client
    • H04N21/6332Control signals issued by server directed to the network components or client directed to client
    • H04N21/6336Control signals issued by server directed to the network components or client directed to client directed to decoder
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server
    • H04N21/6582Data stored in the client, e.g. viewing habits, hardware capabilities, credit card number
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8106Monomedia components thereof involving special audio data, e.g. different tracks for different languages
    • H04N21/8113Monomedia components thereof involving special audio data, e.g. different tracks for different languages comprising music, e.g. song in MP3 format
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8126Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
    • H04N21/8133Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts specifically related to the content, e.g. biography of the actors in a movie, detailed information about an article seen in a video program
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/816Monomedia components thereof involving special video data, e.g 3D video
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/85406Content authoring involving a specific file format, e.g. MP4 format

Definitions

  • the present invention is directed to systems and methods for encoding haptic data, in particular encoding a haptic stream as part of digital content for storing and/or for transferring over a network.
  • a haptic data stream is typically represented in raw pulse code modulation ("PCM") data format.
  • there may be periods of silence in the stream, i.e. periods during which no haptic data is streamed, which may waste time and bandwidth by streaming zeroes or unnecessary data over the network, and this is not ideal for either the content distributor or the end user.
  • the storing of haptic data streams faces the same challenge.
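The silence problem described above can be illustrated with a simple run-length scheme. This is a hypothetical sketch for illustration only, not the patent's encoder: instead of streaming long runs of zero samples, the stream carries (value, count) pairs so a silent stretch collapses to a single pair.

```python
def rle_encode(samples):
    """Collapse runs of identical PCM samples (e.g. silence) into (value, count) pairs."""
    if not samples:
        return []
    encoded = [[samples[0], 1]]
    for s in samples[1:]:
        if s == encoded[-1][0]:
            encoded[-1][1] += 1
        else:
            encoded.append([s, 1])
    return [tuple(p) for p in encoded]

def rle_decode(pairs):
    """Expand (value, count) pairs back into the raw sample stream."""
    out = []
    for value, count in pairs:
        out.extend([value] * count)
    return out

# A short haptic burst followed by a long silent period:
stream = [90, 120, 60, 30] + [0] * 1000
packed = rle_encode(stream)
assert rle_decode(packed) == stream
assert len(packed) == 5  # the 1000 zeros collapse to one (0, 1000) pair
```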
  • adaptive bit rate streaming is a common practice for varying the quality of audio/video signals when streaming or transmitting large amounts of data over a digital network. This is done so that the smooth streaming or playback of audio and video signals is still feasible by adapting to varying network speeds and/or congestion. For example, if a video is streaming at very high quality, and then the network over which it is streaming experiences heavy network congestion, or the download speed suddenly becomes slower, the streaming server can still transmit data, but will transmit lower quality data to conserve network bandwidth. Once the congestion has been mitigated and the network speed has increased, the data sent will be higher quality as network bandwidth is more freely available. This may be done by encoding data at multiple bit rates so that the amount of data to be transmitted is much less.
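The adaptive bit rate idea above amounts to picking, from a ladder of pre-encoded variants, the highest-quality one that fits the currently measured bandwidth. The helper and the kbps figures below are hypothetical, purely to illustrate the selection logic:

```python
def pick_variant(available_kbps, variants):
    """Pick the highest-bitrate variant that fits the measured bandwidth.

    `variants` maps a quality label to its bitrate in kbps; falls back to
    the lowest-bitrate variant if none fits.
    """
    fitting = [(rate, name) for name, rate in variants.items() if rate <= available_kbps]
    if fitting:
        return max(fitting)[1]
    return min((rate, name) for name, rate in variants.items())[1]

ladder = {"low": 400, "medium": 1200, "high": 4000}
assert pick_variant(5000, ladder) == "high"
assert pick_variant(1500, ladder) == "medium"   # congestion: drop quality
assert pick_variant(100, ladder) == "low"       # nothing fits: lowest rung
```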
  • Haptic signals may be interpreted as audio signals if the signal is simply a waveform. However, simply treating haptic signals like audio signals and transcoding the haptic signal at multiple bit rates may not offer much room to adapt the quality for various network speeds.
  • a high quality haptic signal may have a sample rate of 8 kHz, which would be considered a very low sample rate for an audio signal and would yield a low quality audio signal.
  • by degrading the quality of the haptic signal, a completely different user experience may be introduced, and degrading the quality by arbitrarily removing bits of the same stream may result in unintended texturing and take away from the clarity of the signal.
  • a method that includes: receiving digital content data including audio data and/or video data; generating haptic data using at least some of the received digital content data; encoding the haptic data for efficient transmission over a communication network; multiplexing the encoded haptic data with the received digital content data; embedding information for decoding the encoded haptic data in metadata of the multiplexed data stream; and sending the multiplexed data stream over the communication network.
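The sequence of steps in the first aspect can be sketched end to end. All helpers, field names and the "raw-u8" codec label below are hypothetical stand-ins, and JSON stands in for a real container format; the sketch only illustrates the order of operations (generate, encode, multiplex, embed metadata, send):

```python
import json

def build_multiplexed_stream(content):
    """Illustrative pipeline: derive haptic data from received content,
    encode it, multiplex it with the content, and embed decoding info
    in metadata of the multiplexed stream."""
    haptic = [abs(s) // 2 for s in content["audio"]]   # generate haptic data from audio
    encoded = bytes(haptic)                            # "encode" (placeholder encoder)
    muxed = {
        "audio": content["audio"],                     # original content data
        "haptic": list(encoded),                       # multiplexed haptic track
        "metadata": {"haptic_codec": "raw-u8", "sample_rate_hz": 8000},
    }
    return json.dumps(muxed)                           # serialized stream to send

packet = json.loads(build_multiplexed_stream({"audio": [10, -20, 30]}))
assert packet["haptic"] == [5, 10, 15]
assert packet["metadata"]["haptic_codec"] == "raw-u8"
```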
  • the first aspect may be modified in any suitable way as disclosed herein including but not limited to any one or more of the following.
  • the method may be a computer-implemented method for enriching user digital content experience with haptic data and may be implemented on a computing device having a processor programmed with a computer program module.
  • the computer-implemented method may transfer the haptic data together with other digital content data over a communication network for an end user to experience the haptic effects generated on an end user device that is coupled to the network.
  • the method may include analyzing the haptic data to determine at least one characteristic of the haptic data, and encoding the haptic data includes encoding, based on the determined characteristic, the haptic data to meet a pre-defined criteria.
  • the pre-defined criteria may include an output encoded haptic bit stream having the least number of bits.
  • the method may include the step of receiving an endpoint configuration of an end user device from the end user device, and the pre-defined criteria includes preserving a range of frequencies of the haptic data that corresponds to the received endpoint configuration of the end user device.
  • Encoding the haptic data may include selecting an encoder from a list of pre-determined encoders based on the determined characteristic and applying the selected encoder to transform the haptic data.
  • the list of pre-determined encoders may include an Advanced Audio Coding encoder and/or a Free Lossless Audio encoder.
  • the characteristics of the haptic data may comprise one or more types of haptic output devices used to generate haptic effects based on the haptic data, intended use case of the haptic data, magnitude of the haptic data, frequency of the haptic data, and length of the silence in the haptic data.
  • the types of haptic output devices may include one or more of the group consisting of an eccentric rotating mass actuator, a linear resonant actuator, and a piezoelectric actuator.
  • the intended use case of the haptic data may be selected from the group consisting of music, movies and games.
  • Encoding the haptic data may include applying each one of a list of pre-determined encoders to the haptic data to generate a corresponding list of encoded haptic data streams, and selecting, using the pre-defined criteria, an encoded haptic data stream from the list of encoded haptic data streams.
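The "apply each encoder, then select" strategy above can be sketched with stand-in codecs. zlib and bz2 here are hypothetical substitutes for the AAC / Free Lossless Audio encoders named in the text, used only because they are readily available; the selection criterion is the one stated above, the output with the least number of bits:

```python
import bz2
import zlib

def encode_smallest(raw: bytes):
    """Apply each encoder in a pre-determined list to the haptic data and
    keep the encoded stream meeting the least-number-of-bits criterion."""
    encoders = {"zlib": zlib.compress, "bz2": bz2.compress}
    results = {name: enc(raw) for name, enc in encoders.items()}
    best = min(results, key=lambda name: len(results[name]))
    return best, results[best]

name, blob = encode_smallest(bytes(1000))   # 1000 silent samples
assert len(blob) < 1000                     # smaller than the raw stream
assert name in ("zlib", "bz2")
```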
  • Encoding the haptic data may include encoding the haptic data for a predetermined list of density factors to generate a plurality of encoded haptic data streams, each encoded haptic data stream corresponding to a density factor of the predetermined list of density factors, and the method includes receiving network bandwidth information from an end user device over a communication network, and selecting one of the encoded haptic data streams based on a corresponding density factor matching a condition of the communication network bandwidth for multiplexing with the received digital content data.
  • a first encoded data stream of the plurality of encoded data streams may correspond to a first density factor of the predetermined list of density factors, and a second encoded data stream of the plurality of encoded data streams may correspond to a second density factor of the predetermined list of density factors, wherein the second encoded data stream comprises more haptic data than the first data stream and the second density factor is greater than the first density factor.
  • the method may include detecting a change in the network bandwidth information, selecting a different one of the encoded haptic data streams based on the corresponding density factor matching the condition of the communication network bandwidth, and transmitting the selected different one of the encoded data streams to the end user device.
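The density-factor selection described above can be sketched as follows. The mapping of density factors to required bit rates is hypothetical, chosen only to show how a bandwidth change causes a switch to a different pre-encoded stream:

```python
def select_stream(bandwidth_kbps, streams):
    """Pick the encoded haptic stream whose density factor matches the
    reported bandwidth condition: the densest stream whose required bit
    rate still fits, falling back to the sparsest stream if none fits.

    `streams` maps density factor -> required bit rate in kbps."""
    fitting = [d for d, rate in streams.items() if rate <= bandwidth_kbps]
    return max(fitting) if fitting else min(streams)

ladder = {1: 50, 2: 150, 4: 400}         # higher density factor = more haptic data
assert select_stream(500, ladder) == 4   # ample bandwidth: densest stream
assert select_stream(200, ladder) == 2   # congestion detected: switch down
assert select_stream(10, ladder) == 1    # worst case: sparsest stream
```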
  • a system that includes a processor configured to receive digital content data including audio data and/or video data, generate haptic data using at least some of the received digital content data, encode the haptic data for efficient transmission over a communication network, multiplex the encoded haptic data with the received digital content data, embed information for decoding the encoded haptic data in metadata of the multiplexed data stream, and send the multiplexed data stream over the communication network.
  • the system includes a user device configured to receive the multiplexed encoded haptic data and digital content data over the communication network.
  • the user device includes a haptic output device configured to output a haptic effect based on the haptic data.
  • the second aspect may be modified in any suitable way as disclosed herein including but not limited to any one or more of the following.
  • the processor may be further configured to analyze the haptic data to determine at least one characteristic of the haptic data, and encode the haptic data based on the determined characteristic to meet a pre-defined criteria.
  • the pre-defined criteria may include an output encoded haptic bit stream having the least number of bits.
  • the processor may be configured to receive an endpoint configuration of the user device from the user device, and the pre-defined criteria includes preserving a range of frequencies of the haptic data that corresponds to the received endpoint configuration of the end user device.
  • the processor may be configured to select an encoder from a list of pre-determined encoders based on the determined characteristic and apply the selected encoder to transform the haptic data.
  • the list of pre-determined encoders may include an Advanced Audio Coding encoder and/or a Free Lossless Audio encoder.
  • the system may be configured such that the characteristics of the haptic data comprise one or more types of haptic output devices used to generate haptic effects based on the haptic data, intended use case of the haptic data, magnitude of the haptic data, frequency of the haptic data, and length of the silence in the haptic data.
  • the system may be configured such that the types of haptic output devices include one or more of the group consisting of an eccentric rotating mass actuator, a linear resonant actuator, and a piezoelectric actuator.
  • the system may be configured such that the intended use case of the haptic data is selected from the group consisting of music, movies and games.
  • the processor may be configured to apply each one of a list of pre-determined encoders to the haptic data to generate a corresponding list of encoded haptic data streams, and select, using the pre-defined criteria, an encoded haptic data stream from the list of encoded haptic data streams.
  • the processor may be configured to encode the haptic data for a predetermined list of density factors to generate a plurality of encoded haptic data streams, each encoded haptic data stream corresponding to a density factor of the predetermined list of density factors, receive network bandwidth information from an end user device over a communication network, and select one of the encoded haptic data streams based on a corresponding density factor matching a condition of the communication network bandwidth to multiplex with the received digital content data.
  • the system may be configured such that a first encoded data stream of the plurality of encoded data streams corresponds to a first density factor of the predetermined list of density factors, and a second encoded data stream of the plurality of encoded data streams corresponds to a second density factor of the predetermined list of density factors, and wherein the second encoded data stream comprises more haptic data than the first data stream and the second density factor is greater than the first density factor.
  • the processor may be configured to detect a change in the network bandwidth information, select a different one of the encoded haptic data streams based on the corresponding density factor matching the condition of the communication network bandwidth, and transmit the selected different one of the encoded data streams to the end user device.
  • a computer-implemented method for enriching user digital content experience with haptic data is implemented on a computing device having a processor programmed with a computer program module.
  • the method includes receiving digital content data including audio data and/or video data; generating haptic data using at least some of the received digital content data; encoding the generated haptic data for efficient transmission over a communication network; multiplexing the encoded haptic data with the received digital content data; embedding information for decoding the encoded haptic data in metadata of the multiplexed data stream; and sending the multiplexed data stream over the communication network.
  • a computer-implemented method for transferring haptic data together with other digital content data over a network for an end user to experience the haptic effects generated on an end user device. The end user device is coupled to the network.
  • the method is implemented on a computing device having a processor programmed with a computer program module.
  • the method includes analyzing haptic data to determine at least one characteristic of the haptic data, and encoding, based on the determined characteristic, the haptic data to meet a pre-defined criteria.
  • a computer-implemented method for enriching user digital content experience with haptic data is implemented on a computing device having a processor programmed with a computer program module.
  • the method includes receiving raw haptic data, encoding the received raw haptic data for a predetermined list of density factors to generate a list of encoded haptic data streams, each encoded haptic data stream corresponding to a density factor of the predetermined list of density factors, receiving network bandwidth information from an end user device over a communication network, selecting one of the encoded haptic data streams based on a corresponding density factor matching a condition of the communication network bandwidth, and transmitting the selected encoded haptic data stream to the end user device.
  • any of the third, fourth and fifth aspects are intended, where technically compatible, to be able to include or be modified by any suitable feature or principle described herein, including but not limited to any one or more of the optional features described for the first and second aspects.
  • Embodiments as described herein relate to systems and methods for generating, transferring and/or storing haptic data as part of digital content to enrich user experience when consuming the digital content.
  • digital content refers to information that can be transferred and stored at a non-transitory storage medium, and may include, for example, audio and/or video data.
  • digital content includes information encoded using various file formats and/or other un-encoded content that can be transferred and stored at a non-transitory storage medium.
  • FIG. 1 illustrates a method 100 for creating and encoding haptic data to be eventually played with other digital content, either locally or remotely, in accordance with an embodiment as described herein.
  • the method 100 includes a source multimedia processing stage 110, an audio-to-haptic conversion stage 120, a haptic encoding stage 130, a multiplexing stage 140, and multimedia interleaving stage 150.
  • haptic data may be created from the audio data of a multimedia data stream.
  • the audio data may be extracted during the source multimedia processing stage 110, and converted to a raw haptic data stream, as explained in more detail below, during the audio-to-haptic conversion stage 120.
  • the raw haptic data may be further encoded during the haptic encoding stage 130 so that less network bandwidth or less storage space will be needed for the transfer or storage of the haptic data.
  • the encoded haptic data joins the source multimedia data stream so that the encoded haptic data and the source multimedia data may be interleaved during the multimedia interleaving stage 150.
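The five stages of method 100 can be sketched as a chain of simple transforms. This is an illustrative sketch only: every function name here is a hypothetical stand-in for the patent's stages, the envelope follower is a crude placeholder for real audio-to-haptic conversion, and the "compression" is a toy.

```python
# Hypothetical sketch of the encode-side pipeline (stages 110-150):
# extract audio -> convert to raw haptics -> encode -> multiplex -> interleave.

def extract_audio(multimedia):                 # stage 110: source multimedia processing
    return multimedia["audio"]

def audio_to_haptic(pcm):                      # stage 120: crude envelope follower
    return [abs(s) for s in pcm]

def encode_haptic(raw):                        # stage 130: toy compression - drop zero samples
    return [(i, v) for i, v in enumerate(raw) if v > 0]

def multiplex(multimedia, haptic):             # stage 140: join haptic track to the source
    return {**multimedia, "haptic": haptic}

def interleave(stream):                        # stage 150: tag tracks for transport
    return [(track, stream[track]) for track in ("video", "audio", "haptic")]

source = {"video": ["frame0"], "audio": [0.0, 0.5, -0.25, 0.0]}
raw = audio_to_haptic(extract_audio(source))   # [0.0, 0.5, 0.25, 0.0]
encoded = encode_haptic(raw)                   # [(1, 0.5), (2, 0.25)]
container = interleave(multiplex(source, encoded))
```

The point of the sketch is only the ordering: conversion happens before encoding, and the encoded haptic track rejoins the source stream before interleaving.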
  • the audio data extracted from the multimedia data during stage 110 may be just raw data, e.g., a bit stream in PCM format.
  • the audio data may have been encoded already, such as encoded using an Advanced Audio Coding (AAC) encoder.
  • the encoded audio data may be first decoded, either as the last step of the stage 110 or as the first step during stage 120.
  • the haptic data that is used during stage 130 may not be generated from the audio data contained in multimedia data 110, but may instead come from a different source, such as from a raw haptic data storage 125, as illustrated in FIG. 1 .
  • the video data instead of or in addition to the audio data may be used to generate haptic data in the method 100 described above.
  • certain video data transitions (e.g., a lightning scene) may be used to generate haptic data.
  • FIG. 2 illustrates a method 200 of decoding and displaying haptic data in synchronization with other encoded multimedia data, according to an embodiment as described herein.
  • a de-multiplexer 204 may first separate an interleaved multimedia data stream 202 into different data streams, including an audio data stream 206, a video data stream 208 and a haptic data stream 210 at an end device in which the interleaved multimedia data 202 is to be played.
  • a decoder/synchronizer 212 may contain one or more audio decoders, one or more video decoders and one or more haptic decoders (not depicted).
  • the decoded audio data and video data may be sent to audio/video renderers 214 (e.g., speakers and display screens) for playback.
  • the decoded haptic data stream 216 may be sent to one or more haptic output devices, such as devices H1, H2, H3, H4 and/or H5, for displaying in synchronization with the audio and video data.
  • the haptic output device H1...H5 may consist of at least one endpoint processor 218, a vibration amplitude modulator 220, a vibration renderer 224 and at least one actuator 226.
  • Other signal processing devices 222 may also be used to alter the signal(s) output by the endpoint processor(s). The illustrated embodiment is not intended to be limiting in any way.
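The playback chain of FIG. 2 can be sketched in the same spirit: demultiplex the container, then pass the decoded haptic stream through endpoint processing and amplitude modulation before rendering. All functions and the gain/scale parameters here are hypothetical illustrations, not the patent's actual signal processing.

```python
# Hypothetical sketch of the decode-side chain in FIG. 2.

def demultiplex(container):                    # de-multiplexer 204
    return container["audio"], container["video"], container["haptic"]

def endpoint_process(haptic, gain=1.0):        # endpoint processor 218 (stand-in)
    # apply a device-specific gain, clamped to the actuator's drive range
    return [min(1.0, s * gain) for s in haptic]

def modulate_amplitude(haptic, scale):         # vibration amplitude modulator 220
    return [s * scale for s in haptic]

container = {"audio": [], "video": [], "haptic": [0.2, 0.9, 0.4]}
_, _, haptic = demultiplex(container)
drive = modulate_amplitude(endpoint_process(haptic, gain=1.2), scale=0.5)
```

The drive values would then feed the vibration renderer 224 and actuator 226 for each haptic output device H1...H5.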
  • haptic output device may include an actuator, for example, an electromagnetic actuator such as an Eccentric Rotating Mass (“ERM”) in which an eccentric mass is moved by a motor, a Linear Resonant Actuator (“LRA”) in which a mass attached to a spring is driven back and forth, or a “smart material” such as piezoelectric materials, electro-active polymers or shape memory alloys, a macro-composite fiber actuator, an electro-static actuator, an electro-tactile actuator, and/or another type of actuator that provides a physical feedback such as a haptic (e.g., vibrotactile) feedback.
  • the haptic output device may include non-mechanical or non-vibratory devices such as those that use electrostatic friction (“ESF”), ultrasonic surface friction (“USF”), or those that induce acoustic radiation pressure with an ultrasonic haptic transducer, or those that use a haptic substrate and a flexible or deformable surface, or those that provide projected haptic output such as a puff of air using an air jet, and so on.
  • FIG. 3 illustrates an embodiment of a system 300 for transferring encoded haptic data with multimedia data over a communication network.
  • the system 300 includes a processor 310, a remote storage 330, end user devices 334, 336 and 338, and one or more communication network(s) 332 connecting processor 310, remote storage 330 and end user devices 334, 336, 338.
  • the processor 310 may include its own electronic storage 328 in addition to or in place of the remote storage 330.
  • the remote storage 330 may also include a processor 329 and non-transitory storage media 331.
  • the processor 329 may maintain a digital content database and profile information of the end user devices 334, 336, 338, as described in further detail below with respect to FIG. 4 .
  • the processor 310 may be a general-purpose or specific-purpose processor or microcontroller for managing or controlling the operations and functions of the system 300.
  • the processor 310 may be specifically designed as an application-specific integrated circuit ("ASIC") embedded in, for example, the end user device 334 and configured to provide haptic effects through a haptic output device in the end user device 334 to enhance a user's enjoyment of a movie being played on the end user device 334.
  • the processor 310 may also be configured to determine, based on predefined factors, what haptic effects are to be generated based on the feedback received over the communication network(s) 332 from another remote end user device 336, and then provide streaming commands that may be used to drive a haptic output device on the end user device 336, for example.
  • the network 332 may include wired or wireless connections.
  • the network may include any one or more of, for instance, the Internet, an intranet, a PAN (Personal Area Network), a LAN (Local Area Network), a WAN (Wide Area Network), a SAN (Storage Area Network), a MAN (Metropolitan Area Network), a wireless network, a cellular communications network, a Public Switched Telephone Network, and/or other network.
  • the communication network 332 may be a CDN (Content Distribution Network), which is typically used to distribute content (such as websites, videos, music, etc.) to enable much faster access to such assets or content globally.
  • a CDN utilizes Edge servers around the world to essentially mirror frequently accessed content, and also automatically manages the lifecycle of the content on the Edge servers. The actual assets or content are typically stored and updated on a more permanent origin server, and the CDN automatically accesses the assets and content, as needed, and mirrors them to the appropriate Edge servers.
  • the processor 310 may include a plurality of processors, each configured to perform certain functions within the system 300. In an embodiment, the processor 310 may be configured to execute one or more computer program modules.
  • the one or more computer program modules may include at least one multimedia module 312, one or more haptic encoding modules 314, a user interface module 324, and a communication module 326.
  • the haptic encoding modules 314 may include an audio-to-haptic conversion module 316, a haptic compression module 318, a haptic density transcoding module 320 and a multimedia assembling module 322.
  • the multimedia assembling module 322 may be configured to assemble a multimedia data stream including audio, video and haptic data according to an MP4 container format, as illustrated in FIG. 3 .
  • the local storage 328 and the remote storage 330 may be used to store various profiles for different use cases of haptics and generated haptic data streams, before compression and after compression, as discussed in more detail below.
  • modules 312-326 are illustrated in FIG 3 as being co-located within a single processing unit 310, in embodiments in which the processor 310 includes multiple processing units, one or more of modules 312-326 may be located remotely from the other modules.
  • the description of the functionality provided by the different modules 312-326 described in more detail below is for illustrative purposes, and is not intended to be limiting, as any of the modules 312-326 may provide more or less functionality than is described.
  • one or more of the modules 312-326 may be eliminated, and some or all of its functionality may be provided by other ones of the modules 312-326.
  • the processor 310 may be configured to execute one or more additional modules that may perform some or all of the functionality attributed below to one of the modules 312-326. The function and operation of modules 312-326 are described in more detail below.
  • there are various types of haptic output devices that may be used to display or play back haptic data, and different types of haptic output devices and different audio-to-haptics algorithms may result in haptic data streams with different characteristics. A list of audio-to-haptic conversion algorithms that includes a specific algorithm tailored for each type of haptic output device and each specific end user device may become difficult to maintain as the number of conversion algorithms and haptic output devices increases.
  • each type of haptic output device included in an end user device may be treated as an endpoint, and each audio-to-haptics conversion algorithm for that haptic output device may be treated as a profile that aims to give a particular experience.
  • Embodiments as described herein allow the system to automatically detect the type of haptic output device, and any additional factors that may be used to change audio-to-haptics conversion profiles.
  • the user may also select which profiles he/she wants active during a given use case.
  • Embodiments as described herein may save compilation time by having a developer compile all available algorithms for each endpoint (i.e., haptic output device) at most once. There are different situations in which a user may want a different experience, such that one algorithm on the device at a time may not be enough.
  • the audio-to-haptics automatic conversion algorithm may not be the same for music as it is for movies, and additionally, as it is for games, etc.
  • the endpoint(s) (i.e. haptic output device(s)) in an end user device are typically fixed, but the embodiments described herein allow an end user device to adapt its behavior based on application focus, sensor-based data and environmental factors.
  • FIG. 4 illustrates an embodiment of a database 400 of different profiles used for generating haptic data.
  • the profiles may be stored in the electronic storage 328.
  • the profiles may be organized according to their use case, for example, movie use case 410, music use case 420 and gaming use case 430.
  • in the movie use case 410, the haptic effects are intended for an end user to experience while watching a movie or any other video, such as a television show or an advertisement.
  • in the music use case 420, the haptic effects are intended for an end user to experience while listening to music.
  • in the gaming use case 430, the haptic effects are intended for an end user to experience while playing a game.
  • the profiles may be stored for each endpoint (i.e. haptic output device on an end user device that generally has audio-to-haptics conversion algorithms).
  • profile 412 is stored for an end user device 334 that has an audio-to-haptic conversion algorithm specifically for a piezoelectric actuator
  • profile 414 is stored for an end user device 336 having an ERM actuator and corresponding audio-to-haptic conversion algorithm
  • profile 416 is stored for an end user device 338 having an LRA and corresponding audio-to-haptic conversion algorithm
  • profile 422 is stored for the end user device 334 that has a piezoelectric actuator
  • profile 424 is stored for the end user device 336 that includes an ERM actuator
  • profile 426 is stored for the end user device 338 that includes an LRA.
  • profile 432 is stored for a gaming use case 430 in which the haptic output device is a piezoelectric actuator
  • profile 434 is stored for a gaming use case and an ERM actuator
  • profile 436 is stored for a gaming use case and an LRA.
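The database 400 can be thought of as a look-up table keyed by use case and endpoint. The sketch below mirrors the profile numbering given above for FIG. 4; the string keys ("movie", "piezo", etc.) are hypothetical identifiers introduced only for illustration.

```python
# Hypothetical look-up table mirroring database 400 of FIG. 4:
# (use case, endpoint type) -> profile id.

PROFILES = {
    ("movie",  "piezo"): 412, ("movie",  "erm"): 414, ("movie",  "lra"): 416,
    ("music",  "piezo"): 422, ("music",  "erm"): 424, ("music",  "lra"): 426,
    ("gaming", "piezo"): 432, ("gaming", "erm"): 434, ("gaming", "lra"): 436,
}

def select_profile(use_case, endpoint):
    """Return the profile id for a given use case and endpoint (actuator type)."""
    return PROFILES[(use_case, endpoint)]
```

A device with multiple endpoints would simply perform one look-up per endpoint for the active use case.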
  • a single end user device may contain more than one endpoint.
  • Embodiments as described herein may save compilation time by having all available profile algorithms compiled for each endpoint, at most once, either offline or in real time.
  • embodiments as described herein allow the endpoint to adapt its behavior based on application focus, sensor-based data and environment factors even though the endpoint (haptic output device) in a single device may not be changed.
  • the system 300 may automatically detect the type of haptic output device and additional factors.
  • the audio-to-haptic module 316 may communicate with the end user device 334 through the communication module 326 over the communication network 332 to determine the type(s) of haptic output device(s) that end user device 334 has, which may be used to change audio-to-haptics conversion profiles.
  • a user of the end user device 336 may communicate with the haptic encoding module 314 over the communication network 332 through the user interface module 324 to select which profiles he/she wants active during a given use case.
  • FIG. 5 illustrates a user-selectable embodiment of the audio-to-haptic module 316 in which haptic data is generated based on endpoints and use cases.
  • the audio-to-haptic module 316 pre-stores all available audio-to-haptics algorithms in the database 400 depicted in FIG 4 .
  • Each audio-to-haptics algorithm may be identified in a way that is understandable to a user. This may be done either via an audio-to-haptics settings application/activity, or a separate application/activity.
  • the audio-to-haptic module 316 determines the use case based on the requested information, or user input. In an embodiment, the audio-to-haptic module 316 may also check during the use-case determination step 504 which application is running in the foreground or the background, so the device may re-configure the audio-to-haptic module 316 to execute a user selected audio-to-haptics algorithm.
  • the audio-to-haptic module 316 determines the endpoint configuration of the requesting end user device. In an embodiment, the audio-to-haptic module 316 may simply make the determination by obtaining user input through the user interface module 324. In an embodiment, the audio-to-haptic module 316 may look up in a registered end user device database stored in the electronic storage 328 to obtain the endpoint configuration. In an embodiment, the audio-to-haptic module 316 may communicate with the requesting end user device via the communication module 326 to obtain the endpoint configuration. In an embodiment, an end user device may have multiple endpoints for the purpose of displaying different types of haptic effects. For example, the end user device 338 may have an LRA, a piezoelectric actuator, and an ERM actuator.
  • the audio-to-haptic module 316 may identify the user-selected profile(s). In an embodiment, this may be achieved by using a look-up table in the database 400, as depicted in FIG. 4 .
  • the audio-to-haptic module 316 may configure itself to apply the appropriate algorithm(s) to audio data 510 and generate raw haptic data 514. For example, if the user has a device that has an LRA and selects a 'bassy' profile for music playback, when music is playing, the audio-to-haptic module 316 understands that it is a music use case, and selects the 'bassy' processing algorithm (e.g., low frequencies only) out of the profile 416 that is created specifically for an LRA. If the device has a piezoelectric actuator or an ERM actuator instead of an LRA, the corresponding algorithm will be selected from the profiles 412, 414 for either of those endpoints.
  • the processor 310 may include a haptic-to-haptic module (not shown) that is configured to apply an appropriate algorithm to convert the raw haptic data 514 intended for one endpoint to raw haptic data for another endpoint.
  • the haptic-to-haptic module may convert the raw haptic data 514 that was generated for an LRA to haptic data for a piezoelectric actuator or vice-versa.
  • Such a haptic-to-haptic signal conversion may be completed when the haptic data is being encoded or when the haptic data is decoded at the end user device.
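One crude way to illustrate such a haptic-to-haptic conversion is collapsing an LRA-style oscillating drive signal into a smoothed amplitude envelope that a slower ERM actuator could track. This is an assumption-laden sketch, not the patent's conversion algorithm; the rectify-and-average approach and the window size are invented for illustration.

```python
# Illustrative haptic-to-haptic conversion (not from the patent): turn an
# LRA-style bipolar signal into a non-negative amplitude envelope for an ERM.

def lra_to_erm(samples, window=4):
    # full-wave rectify: an ERM responds to drive magnitude, not polarity
    rectified = [abs(s) for s in samples]
    # moving average to remove the resonant carrier an ERM cannot follow
    out = []
    for i in range(len(rectified)):
        chunk = rectified[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

The reverse direction (ERM envelope to LRA drive) would instead re-modulate the envelope onto the LRA's resonant frequency.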
  • FIG. 6 illustrates an auto detection embodiment 600 of the audio-to-haptic module 316 in which haptic data is generated based on endpoints and use cases without the user specifying an audio-to-haptics algorithm.
  • the audio-to-haptic module 316 may register information of a digital content playing application when the application is installed on an end user device (or system 300). Examples of such applications include an audio player, a video player, or a game player.
  • the audio-to-haptic module 316 may register environmental factors, such as factors that may be measured by a sensor(s) on the end user device or system. For example, the lighting surrounding the end user device (e.g. bright or dark), the temperature external to the device, etc. may be sensed and used as input to determine which profile to use at a given time. For example, if it is determined that the device is being used in a low lighting environment, more pronounced or less pronounced haptic effects may be desirable by the user.
  • the audio-to-haptic module 316 may determine the use case based on the requested information or user input. At 616, the audio-to-haptic module 316 determines the endpoint configuration of the end user device on which the haptic effects are to be displayed alongside the multimedia data.
  • the audio-to-haptic module 316 may make the determination by obtaining user input through the user interface module 324. In an embodiment, the audio-to-haptic module 316 may communicate with the requesting end user device (e.g., end user device 338) via the communication module 326 to obtain the endpoint configuration. In an embodiment, an end user device may have multiple endpoints for the purpose of displaying different haptic effects. For example, the end user device 338 may have an LRA, a piezoelectric actuator, and/or an ERM actuator.
  • the audio-to-haptic module 316 may identify an appropriate profile or conversion algorithm to use that takes into consideration all the information collected and determined during steps 610-616, without a user selection. In an embodiment, this may be achieved by using a look-up table in the database 400 depicted in FIG. 4 .
  • the audio-to-haptic module 316 may then configure itself to apply the determined conversion algorithm to audio data 622 and generate raw haptic data 624.
  • the auto-detection embodiment 600 as illustrated in FIG. 6 differs from the user-selectable embodiment 500 illustrated in FIG. 5 in that a user does not have to interact directly with the system 300 or the end user device 334, 336, 338 to configure the audio-to-haptic module 316.
  • the audio-to-haptic module 316 may use external factors including sensor information, the use case, as well as the endpoint (haptic output device) configuration in an end user device to adaptively select an audio-to-haptic conversion algorithm that is best suited for a given specific application for the digital content to be played.
  • the encoder may encode the haptic data for each endpoint and multiplex the haptic streams, and all of the haptic streams may be communicated to the end user device.
  • the end user device may identify and extract the appropriate haptic signal that matches the endpoint (haptic output device) in that particular end user device.
  • the generated haptic data as discussed above may be referred to as a raw haptic data stream.
  • storing and/or transferring a raw haptic data stream may not be the most efficient way to communicate haptic data for various reasons. For example, there may be periods of silence (no haptic data) in the stream.
  • FIG. 7 illustrates an embodiment of a method 700 for encoding a raw haptic data stream for transmission over a network.
  • the haptic encoding module 314 receives raw haptic data that may be generated by the audio-to-haptic module 316 (e.g., step 514 of FIG. 5 and step 624 in FIG. 6 ) and/or the haptic-to-haptic module.
  • the content of the haptic data is analyzed to determine the characteristics of the raw haptic data stream, which may include the types of haptic output device the haptic data are intended to be played by, the intended use case of the raw haptic data, the frequency of the raw haptic data, the length of the silence in the raw haptic data stream, the amplitude of the raw haptic data, etc.
  • the haptic encoding module 314 may then encode the raw haptic data based on the determined characteristics of the raw haptic data to meet pre-defined criteria.
  • the haptic encoding module 314 may first select an encoder from a list of encoders for that particular stream and then at 710 apply the selected encoder to the raw haptic data stream.
  • audio encoders there are many audio encoders that may be used to encode a haptic stream for compression purposes.
  • audio encoders have their advantages and disadvantages.
  • the AAC mentioned above may be better in terms of space and quality for general encoding, and the Free Lossless Audio Codec (FLAC) may be better when the stream has a lot of silence.
  • Any other suitable encoder (i.e., digital signal coder/compressor) other than AAC or FLAC may be used.
  • An appropriate encoder should be selected to match the determined characteristics of the raw haptic data stream to ensure the efficacy and efficiency of the encoded data stream.
  • pre-determined criteria may be used to preserve the range of frequency of the raw haptic data that matches the haptic output device to be used for playback at the end user device, which may require the haptic encoding module 314 to access the database 400 that includes the pre-stored use cases and endpoint profiles discussed above with respect to FIG 4 .
  • the haptic encoding module 314 may encode the raw haptic data stream with each encoder in a list of pre-determined encoders, and then at 714 select the encoded haptic data stream that has the least number of bits as the encoding output. In an embodiment, at 716 the haptic encoding module 314 may then store the information about the encoder used and any other information needed for decoding the encoded haptic data stream.
  • the encoder information for the encoded haptic data stream may be embedded in the container format's metadata. On the end user device side, this information may be parsed out and the correct decoder may be used to decode the stream. For example, the encoder information may be embedded in user private or a custom atom in the MPEG4 format.
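Steps 712-716 amount to: encode the raw stream with each candidate encoder, keep the smallest result, and record which encoder was used so the end user device can select the matching decoder. In the sketch below, zlib is only a stand-in for real codecs such as AAC or FLAC, and the metadata dict stands in for the container's custom atom.

```python
import zlib

# Sketch of multi-encoder selection: try each encoder, keep the fewest bits,
# and record the winning encoder's name for the decoder side.

ENCODERS = {
    "identity": lambda b: b,              # no compression (baseline)
    "zlib":     lambda b: zlib.compress(b),
}

def encode_smallest(raw: bytes):
    best_name, best_out = min(
        ((name, enc(raw)) for name, enc in ENCODERS.items()),
        key=lambda item: len(item[1]),
    )
    metadata = {"encoder": best_name}     # would be embedded in container metadata
    return metadata, best_out

# a long run of silence compresses well, so zlib should win here
meta, payload = encode_smallest(b"\x00" * 1000)
```

On the receiving side, the metadata is parsed out first and dispatched to the corresponding decoder, mirroring the custom-atom approach described above for MPEG4.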
  • the haptic encoding module 314 may also take into consideration the bandwidth availability in encoding the raw haptic data when the raw haptic data is to be streamed over a communication network, as discussed in more detail below.
  • a method of preserving the haptic signal quality is provided by leveraging the characteristics of haptic waveforms in the haptic data stream while also sending fewer bits. Because a goal of using a multiple-encoding scheme is to transmit a smaller amount of data for congested networks, embodiments as described herein provide adaptive density streaming for a haptic data stream.
  • the density factor may determine how dense or sparse a signal is by applying haptics to certain amounts of data based on a threshold. If this is done at the transcoding stage to encode haptic data streams using different density factors, haptics may be sent with less data when network bandwidth is not freely available, and with more data when there is ample available network bandwidth.
  • the relevant and high quality part of haptic data is kept while unnecessary data are cut so that less of the available network bandwidth is occupied at a given time.
  • adding a separate type of media to be streamed along with audio/video/metadata may naturally affect the audio/video/metadata quality.
  • Embodiments as described herein may help minimize how much the quality of the audio/video signal is affected, while providing a pleasant haptic experience with less overhead.
  • FIGs. 8A-8C illustrate embodiments of different density factors that may be used for encoding a raw haptic signal (data stream) for streaming over a network.
  • FIG. 8A illustrates a medium density factor
  • FIG. 8B illustrates a high density factor
  • FIG. 8C illustrates a low density factor.
  • the lighter shaded areas in FIGs. 8A-8C ("data used" areas) indicate what parts of the signal are used and encoded, and the darker shaded areas in FIGs. 8A and 8C ("data ignored" areas) indicate what parts of the signal are ignored.
  • FIG. 8A illustrates a raw haptic data signal to be encoded using a medium density factor. If the signal represents the raw haptic data that is input into the transcoder, the transcoder may output haptic data corresponding to a signal within just the "data used" area, which would be less data than what exists in the signal with a high density factor.
  • FIG. 8B illustrates a raw haptic data to be encoded using a high density factor. For streams when network bandwidth is available, using such a high density factor in the transcoding may be useful as more data may stream across the network.
  • FIG. 8C illustrates a raw haptic data to be encoded using a low density factor. For streams when there is a large amount of network congestion, this type of transcoding may be suitable, as the output stream would deliver fewer bits of data and only send data for relatively high magnitude events.
  • the amount of data to appear in the output signal may be adjusted by changing the density factor, which allows the haptic encoding module 314 to forego certain information, which may be of less importance (e.g., signals that are so low in magnitude that the signals are not worth playing, or miniscule details in the haptic signal that may be ignored to preserve bandwidth), in the signal when needed and include more data when more bandwidth over the network becomes available.
  • certain information e.g., signals that are so low in magnitude that the signals are not worth playing, or miniscule details in the haptic signal that may be ignored to preserve bandwidth
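The density-factor behavior of FIGs. 8A-8C can be sketched as a magnitude threshold: a higher density keeps more of the signal, a lower density keeps only high-magnitude events. The linear mapping from density factor to threshold below is an assumption for illustration; the patent does not specify the mapping.

```python
# Hypothetical density-factor thresholding sketch (FIGs. 8A-8C): samples whose
# magnitude falls below the threshold are dropped from the encoded stream.

def apply_density(samples, density, max_density=15):
    # density = max_density -> threshold 0.0 (keep everything)
    # density = 1           -> high threshold (keep only strong events)
    threshold = 1.0 - density / max_density
    return [(i, s) for i, s in enumerate(samples) if abs(s) >= threshold]

signal = [0.05, 0.3, 0.95, 0.1, 0.6]
low  = apply_density(signal, density=1)    # sparse: only the 0.95 peak survives
high = apply_density(signal, density=15)   # dense: every sample survives
```

Transcoding the same raw stream with several density factors yields the family of streams that the adaptive delivery described below selects among.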
  • FIG. 9 illustrates an embodiment of an adaptive haptic density transcoding method 900.
  • transcoding is interchangeable with encoding of the raw haptic data but emphasizes that the encoding is for a conversion from a source (e.g., a streaming server) to a different destination (e.g., a remote end user device).
  • the input is raw haptic data 902, such as data generated by the audio-to-haptic module 316, as described above.
  • a haptic transcoder 904 encodes the raw haptic data 902 at multiple bit rates, each corresponding to a specific density factor as described above with respect to FIGs. 8A-8C .
  • a haptic web server 906 stores the haptic bit streams encoded with different density factors, and maintains a manifest file 908 for the encoded haptic data streams.
  • An end user device 912 (such as end user devices 334, 336, 338 in FIG. 3 ) communicates with the haptic web server 906 through a network 910 (or 332 as in FIG. 3 ) to receive one of the encoded haptic data streams for a given raw haptic signal 902.
  • the density of the haptic data stream may be adjusted by modifying a density factor between 1 and 15 (15 being highest density, 1 being lowest density). This value acts as a threshold (as discussed above) for how much data will be included in the encoded haptic data stream. As a result, the lower the density value, the less low-magnitude (less-important) raw haptic data will be included in the encoded haptic data stream. As the transcoder 904 increases the density threshold, more low-magnitude and detailed data are incrementally added to the encoded haptic data stream. Therefore, higher densities map to better network connectivity and lower densities map to lesser network connectivity.
  • the haptic web server 906 stores the haptic bit streams 1-5 encoded with five different density factors, and maintains a manifest file 908 for the encoded haptic data streams corresponding to a specific raw haptic data input.
  • stream 1 in web server 906 is encoded with a density factor of 1, stream 2 with a density factor of 2, and so on.
  • the end user device 912 is made aware of the available haptic data streams at different bit rates and segments of encoded haptic data streams. When starting, the end user device 912 may request the segments from the lowest bit rate haptic data stream 1.
  • the end user device 912 may request the next higher bit rate segments, e.g., segments of stream 2 or what would be considered to be a medium density stream, as illustrated in FIG. 10 . Later, if the end user device 912 determines that the download speed for a segment is lower than the bit rate for the segment, the end user device 912 determines that the network throughput has deteriorated, and may request a lower bit rate segment, as illustrated in FIG 10 . If the available bandwidth significantly increases, the end user device 912 may request a higher bit rate segment or what would be considered to be a high density stream, as illustrated in FIG. 10 , and so on.
  • the segment size may vary depending on the implementations. In an embodiment, the segment size may be as low as 2 seconds. In an embodiment, the segment size may be as high as 10 seconds.
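The client-side logic of FIGs. 9-10 is essentially: compare the measured download throughput for the last segment against the current stream's bit rate, then step the requested density stream up or down. The stream numbering (1-5) follows the haptic web server example above; the bit-rate values are invented for illustration.

```python
# Sketch of adaptive haptic stream selection on the end user device.

def next_stream(current, stream_bitrates, throughput_bps):
    """Pick which density stream to request for the next segment."""
    bitrate = stream_bitrates[current]
    if throughput_bps < bitrate and current > 1:
        return current - 1    # throughput deteriorated: request a lower-density stream
    if throughput_bps > bitrate and current < len(stream_bitrates):
        return current + 1    # headroom available: request a higher-density stream
    return current

# hypothetical bit rates for streams 1 (lowest density) through 5 (highest)
BITRATES = {1: 8_000, 2: 16_000, 3: 32_000, 4: 64_000, 5: 128_000}
```

For example, a client on stream 2 that measures 40 kbps of throughput would step up to stream 3, while a client on stream 3 that measures 10 kbps would fall back to stream 2.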
  • the haptic density transcoding module 320 may be integrated with the same algorithms that transcode audio/video for network transmission (e.g., multimedia module 312 in FIG. 3 ).
  • the number of haptic streams created would be equal to the number of audio/video streams created, and the density factor for each haptic stream would not necessarily have to be different for each transcoding. For example, in an embodiment, if there are 10 different audio/video streams created, raw haptic data may be transcoded 5 times with 5 different density factors.
  • the adaptive bitrate transcoding of the haptic data discussed above may not be the same as adaptive bitrate streaming used to pre-render audio/video at different qualities to support varied network connectivity when streaming media data. The latter typically focuses on sending fewer bits through applying compression algorithm(s) to all of the input audio data or video data.
  • the adaptive bitrate transcoding according to embodiments as described herein sends less of the signal, but still provides relative haptic effects by forgoing specific sections of the haptic signal that are not important (i.e. low magnitude) and focusing on higher priority sections of the haptic signal. This may preserve the actual haptic quality by leveraging the characteristics of the haptic waveforms while also sending fewer bits, thus reducing the bandwidth needed.
  • the same format of the original haptic signal may be kept for the encoded haptic data with a different density.
  • the format of the source may not be preserved, as it may be re-encoded into a different format to compress the data for network transmission.
  • Embodiments as described herein also provide a method for embedding haptic bit streams in a MP4 compatible file container so that haptic data may be streamed with other digital content without impact on the delivery of the other digital content.
  • the same method may be applied to other similar file containers as well.
  • Although MP4 is described below as the container format for streaming digital content over a network, because MP4 is currently considered to provide the most flexibility in terms of supported formats, as well as the ability to be modified for embedding varying types of data interleaved with the standard audio and video streams to create novel experiences, it should be understood that embodiments as described herein may also be applied to other container formats. Although varying types of data include additional audio streams, closed captioning streams, etc., embodiments as described herein will be described with respect to interleaving a haptic stream with the audio and/or video streams that can exist within an MP4 container. On the end user device side, the embedded haptic signal may be extracted and played back over a haptic output device, simultaneously with standard audio and video playback. It should be appreciated that the proposed scheme applies to digital content other than audio and video as well.
  • the haptic signal is encoded similarly to an audio signal when using the variable haptic encoder discussed with respect to FIG. 7 above. In its current form, this allows the haptic signal to be supported by MP4, but also requires enough distinction between the haptic and audio box structures to ensure that the haptic signal is not interpreted as an audio signal by standard MP4 players. Conversely, existing audio signals should not be mistaken for haptic signals.
  • the haptic signal may be encoded as AAC for convenience. This encoding may change depending on the method used to provide an encoded haptic signal (data stream), as discussed above, but the box structure would remain similar.
  • the haptic signal needs a separate 'trak' box within the standard 'moov' box, alongside the other 'trak' boxes that exist for audio and video. Table I below lists the haptic box structure.
  • the MPEG4 Part-12 Standard (ISO/IEC 14496-12 Media File Format) (hereinafter referred to as the "Formats Standard") provides general information on the MP4 box hierarchy shown in Table I.
  • Table I: Box Hierarchy for Haptics
      moov — Container for all metadata
        trak — Container for an individual track or stream
          mdia — Container for the media information in a track
            mdhd — Media header, overall information about the media
            hdlr — Declares the media (handler) type
            minf — Media information container
              nmhd — Null media header
              stbl — Sample Table Box
                stsd — Sample Descriptions (codec types, initialization, etc.)
                  hapt — Haptic box, indicating this track as a haptic stream
                    esds — Elementary Stream Description
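A minimal walk of this box hierarchy can be sketched as follows. The helper handles only the basic 32-bit size/type box layout and is an illustrative assumption, not a complete ISO/IEC 14496-12 reader:

```python
import struct

def box(btype: str, payload: bytes = b"") -> bytes:
    """Serialize one MP4 box: 32-bit big-endian size, 4-char type, payload."""
    return struct.pack(">I", 8 + len(payload)) + btype.encode("ascii") + payload

def child_types(payload: bytes) -> list:
    """List the types of the boxes laid end-to-end in a container's payload."""
    types, offset = [], 0
    while offset + 8 <= len(payload):
        size, = struct.unpack_from(">I", payload, offset)
        types.append(payload[offset + 4:offset + 8].decode("ascii"))
        offset += size
    return types

# Build the haptic branch of the hierarchy in Table I.
stsd = box("stsd", box("hapt", box("esds")))
trak = box("trak", box("mdia", box("minf", box("stbl", stsd))))
```

Slicing off a box's 8-byte header (`trak[8:]`) yields its payload, whose children can then be enumerated with `child_types`, mirroring the nesting shown in Table I.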
  • the haptic data should not be rendered as audio or video. Because most players will try to render any combination or number of video streams or any combination of audio streams specified by their respective 'trak' boxes, the haptic data should be denoted as a timed metadata track ('meta'). This may be done by setting the handler_type field to 'meta' in the 'hdlr' box. Timed metadata tracks, specified in Section 8.4.3 in the Formats Standard, are for storing time-based media content. Storing the haptic signal in this type of track allows the stream to be considered media, but not of an audio or video format. In this case, specifying a timed metadata stream for haptic data is needed to ensure that during playback, the haptic data will not be rendered as audio, despite its original encoding being an audio format.
  • FIGs. 11-13 illustrate handler reference boxes for sound 1100, video 1200, and haptics 1300 and show how the sound, video, and haptic handler types differ when embedded together within an MP4 file.
  • the handler_type for each 'trak' 1102, 1202, 1302 is specified in the 'hdlr' box 1104, 1204, 1304 as shown in FIGs. 11-13 .
  • the handler types for sound, video and timed metadata tracks, respectively, are 'soun' (1106 in FIG. 11 ), 'vide' (1206 in FIG. 12 ), and 'meta' (1306 in FIG. 13 ), as specified in Section 8.4.3 of the Formats Standard.
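The distinction between the three handler types can be sketched by serializing the 'hdlr' FullBox and checking the handler_type field. The field layout follows the Formats Standard; the human-readable track name is an illustrative assumption:

```python
import struct

def hdlr_box(handler_type: bytes, name: bytes = b"") -> bytes:
    """Serialize an 'hdlr' FullBox: version/flags, pre_defined,
    handler_type ('soun', 'vide', or 'meta'), 12 reserved bytes,
    and a null-terminated name."""
    payload = (
        b"\x00\x00\x00\x00"    # version (1 byte) + flags (3 bytes)
        + b"\x00\x00\x00\x00"  # pre_defined
        + handler_type         # the 4-character handler type
        + b"\x00" * 12         # reserved
        + name + b"\x00"       # null-terminated human-readable name
    )
    return struct.pack(">I", 8 + len(payload)) + b"hdlr" + payload

# handler_type sits at byte offset 16 of the serialized box
# (8-byte box header + 4-byte version/flags + 4-byte pre_defined).
haptic_hdlr = hdlr_box(b"meta", b"Haptic Stream")
```

A player distinguishing the tracks of FIGs. 11-13 need only compare these four bytes: 'soun' and 'vide' tracks are rendered as usual, while the 'meta' track carries the haptic data.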
  • a null media header box (see 'nmhd') (1408 in FIG. 14 ) is under the 'minf' container box (1406 in FIG. 14 ), as specified under Section 8.4.5.5 in the Formats Standard. This indicates that the stream is not audio ('smhd' - sound media header) (1402 in FIG. 14 ) or video ('vmhd' - video media header) (1404 in FIG. 14 ), but may still be interpreted as a media stream.
  • FIG. 14 illustrates this in more detail by showing the full box structures for sound, video, and haptics, respectively.
  • within the 'stbl' (Sample Table Box) is the 'stsd' box, the Sample Description Box, as seen in Table I above, in which the format of the 'trak' is specified.
  • the first child box in 'stsd' is typically 'mp4a' for AAC audio tracks.
  • FOURCC: four-character code
  • the 'meta' handler type requires a metadata sample entry ("MetaDataSampleEntry") class to be implemented, but this class is defined as an empty class that may be extended by new subclasses. In an embodiment, this empty class is not extended with any additional information. Any haptic-specific information is stored in the 'esds' box, as seen in Table I and described below.
  • timescale is a 32-bit unsigned integer that contains the number of time units which pass in one second. For example, if the haptic track has an update rate of 50 samples per second, this timescale is set to 50.
  • the duration field is a 64-bit unsigned integer which declares the length of this haptic track in the scale of timescale. For example, if the timescale is set to 50, every sample has the length of 20 ms. If the haptic track is 5 seconds long, the duration field should have a value of 250 (5*1000/20).
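The relationship between timescale, per-sample length, and the duration field in the two examples above can be checked directly. This is a sketch; the function names are chosen for illustration and are not from the specification:

```python
def sample_length_ms(timescale: int) -> float:
    """Length of one sample in milliseconds for a given timescale
    (timescale = number of time units per second)."""
    return 1000.0 / timescale

def duration_field(track_seconds: float, timescale: int) -> int:
    """Value of the 'duration' field: track length expressed in
    timescale units."""
    return round(track_seconds * timescale)
```

For a 50-samples-per-second haptic track, each sample is 20 ms long, and a 5-second track yields a duration of 250 (5 * 1000 / 20), matching the example above.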
  • the only child box under the 'hapt' container box is the 'esds' box (Elementary Stream Description), also referred to as the ESDescriptor.
  • this box contains information used to decode the stream associated with the 'trak' that it resides in. It may be used similarly with haptics. Syntax details on the 'esds' box are provided in Section 8.3.3.1 of the Systems Standard.
  • FIG. 15 illustrates an embodiment of a valid setting of an object profile indication (“objectProfileIndication”) value for haptic data in an MP4 file.
  • objectProfileIndication: object profile indication
  • the ES_Descriptor box contains a decoder configuration descriptor (“DecoderConfigDescriptor”) structure, which contains the parameters and requirements to parse and read the elementary stream.
  • DecoderConfigDescriptor there are fields for the objectProfileIndication value and the decoder specific information (“DecoderSpecificInfo”) structure.
  • the objectProfileIndication value provides the object profile type for the stream. This field is set to a value between 0xC0 and 0xFE, a range of user-private object profile types, which, when used within a 'hapt' box structure, will be known as a haptic type (see Table 8-5 of the Systems Standard).
  • the DecoderSpecificInfo structure is an abstract class that is extended by other classes, based on the objectProfileIndication value (see Table 8-5 of the Systems Standard). For one haptics implementation, this structure may be extended by the haptic specific configuration ("HapticSpecificConfig") class, which contains haptic-specific information such as the signal type and the actuator that the encoded signal was created for.
  • HapticSpecificConfig: haptic specific configuration
  • the DecoderConfigDescriptor must contain a subclass of the DecoderSpecificInfo abstract class as described in Section 8.3.3 of the Systems Standard.
  • the DecoderSpecificInfo is extended by an audio-specific configuration ("AudioSpecificConfig") class, as described in Section 1.6.2.1 of the MPEG4 Part-3 Standard (ISO/IEC 14496-3 Audio) (hereinafter referred to as the "Audio Standard").
  • AudioSpecificConfig: audio-specific configuration
  • the DecoderSpecificInfo is extended by a HapticSpecificConfig class as described herein.
  • HapticSpecificConfig extends the abstract class DecoderSpecificInfo, as defined in the Systems Standard, when the objectTypeIndication and streamType values are 0xC0 and 0x20, respectively, which indicates that this stream contains haptic information.
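A decoder's check for a haptic elementary stream, as described above, reduces to inspecting two descriptor fields. The sketch below illustrates that test; the function and constant names are chosen for illustration:

```python
# User-private objectProfileIndication values per Table 8-5 of the
# Systems Standard (0xC0 through 0xFE).
USER_PRIVATE = range(0xC0, 0xFF)

def is_haptic_stream(object_type_indication: int, stream_type: int) -> bool:
    """True when the ES descriptor fields mark a haptic elementary
    stream: objectTypeIndication 0xC0 (a user-private profile used
    here as the haptic type) together with streamType 0x20."""
    return object_type_indication == 0xC0 and stream_type == 0x20
```

A standard MP4 player that does not recognize the user-private profile simply skips the track, while a haptic-aware player routes it to the HapticSpecificConfig parser.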
  • Tables II-VIII below provide additional information that may be used for the syntax of HapticSpecificConfig, haptic stream type, sampling frequency, actuator configuration, channel configuration, endpoint configuration, and haptic decoder type. Table II: Syntax of HapticSpecificConfig Syntax No.
  • Table III: Haptic Stream Type
      HapticStreamType — Description
      0x0 — Reserved
      0x1 — Reserved
      0x2 — Haptic Encoded Stream
      0x3 — Audio Encoded Stream
    Table IV: Sampling Frequency Index
      Sampling Frequency Index — Value
      0x0 — 8000
      0x1 - 0xE — Reserved
      0xF — Escape value
  • the description above specifies the encoded haptic stream rather than the actual stream itself.
  • These streams are interleaved in the 'mdat' box, which resides at the same level as the 'moov' box in the MP4 box structure.
  • the 'mdat' box contains the actual data that the player parses, decodes, and renders on the end-platform. It is the responsibility of the formatting software to determine the segment sizes for each stream and to interleave all of the encoded segments of video, audio, and metadata (haptic) samples.
  • the haptic signal may be extracted out of the interleaved data box by referencing the offsets to segments in the sample table, similar to how extraction is done for audio and video signals.
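Extraction of the haptic samples from the interleaved 'mdat' payload can then be sketched as offset/size slicing, with the offset and size lists standing in for the sample table entries. The interleaved layout below is fabricated for illustration:

```python
def extract_samples(mdat, offsets, sizes):
    """Slice each haptic sample out of the interleaved media data box
    using the offsets and sizes recorded in the sample table."""
    return [mdat[o:o + s] for o, s in zip(offsets, sizes)]

# Illustrative interleaving: video, audio, haptic segments per time slice.
mdat = b"VVVVAAAAHHVVVVAAAAHH"
haptic = extract_samples(mdat, offsets=[8, 18], sizes=[2, 2])
```

The same slicing, driven by different offset/size lists, recovers the audio and video segments, which is why the haptic track can ride alongside them without affecting their delivery.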
  • databases may be, include, or interface to, for example, an ORACLE™ relational database sold commercially by Oracle Corporation.
  • Other databases, such as INFORMIX™, DB2 (Database 2), or other data storage, including file-based or query formats, platforms, or resources such as OLAP (On-Line Analytical Processing), SQL (Structured Query Language), a SAN (storage area network), MICROSOFT ACCESS™, or others may also be used, incorporated, or accessed.
  • the database may comprise one or more such databases that reside in one or more physical devices and in one or more physical locations.
  • the database may store a plurality of types of data and/or files and associated data or file descriptions, administrative information, or any other data.
  • the methods described herein may be embodied in one or more pieces of software.
  • the software is preferably held or otherwise encoded upon a memory device such as, but not limited to, any one or more of a hard disk drive, RAM, ROM, solid state memory, or other suitable memory device or component configured to store software.
  • the methods may be realised by executing/running the software. Additionally or alternatively, the methods may be hardware encoded.
  • the methods encoded in software or hardware are preferably executed using one or more processors.
  • the memory and/or hardware and/or processors are preferably comprised as, at least part of, one or more servers and/or other suitable computing systems.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Information Transfer Between Computers (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Details Of Audible-Bandwidth Transducers (AREA)
EP18180688.6A 2013-05-24 2014-05-23 Method and system for haptic data encoding Ceased EP3399763A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361827341P 2013-05-24 2013-05-24
US201361874920P 2013-09-06 2013-09-06
US201361907318P 2013-11-21 2013-11-21
EP14169721.9A EP2806353B1 (en) 2013-05-24 2014-05-23 Method and system for haptic data encoding

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
EP14169721.9A Division EP2806353B1 (en) 2013-05-24 2014-05-23 Method and system for haptic data encoding

Publications (1)

Publication Number Publication Date
EP3399763A1 true EP3399763A1 (en) 2018-11-07

Family

ID=50943052

Family Applications (2)

Application Number Title Priority Date Filing Date
EP14169721.9A Active EP2806353B1 (en) 2013-05-24 2014-05-23 Method and system for haptic data encoding
EP18180688.6A Ceased EP3399763A1 (en) 2013-05-24 2014-05-23 Method and system for haptic data encoding

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP14169721.9A Active EP2806353B1 (en) 2013-05-24 2014-05-23 Method and system for haptic data encoding

Country Status (5)

Country Link
US (3) US9437087B2 (zh)
EP (2) EP2806353B1 (zh)
JP (2) JP6411069B2 (zh)
KR (1) KR102176391B1 (zh)
CN (2) CN109876431A (zh)

Families Citing this family (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2806353B1 (en) * 2013-05-24 2018-07-18 Immersion Corporation Method and system for haptic data encoding
US9619980B2 (en) 2013-09-06 2017-04-11 Immersion Corporation Systems and methods for generating haptic effects associated with audio signals
US9652945B2 (en) * 2013-09-06 2017-05-16 Immersion Corporation Method and system for providing haptic effects based on information complementary to multimedia content
US9576445B2 (en) 2013-09-06 2017-02-21 Immersion Corp. Systems and methods for generating haptic effects associated with an envelope in audio signals
US9711014B2 (en) 2013-09-06 2017-07-18 Immersion Corporation Systems and methods for generating haptic effects associated with transitions in audio signals
US9349378B2 (en) * 2013-11-19 2016-05-24 Dolby Laboratories Licensing Corporation Haptic signal synthesis and transport in a bit stream
CN106537291A (zh) * 2014-07-07 2017-03-22 意美森公司 第二屏幕触觉
US10268286B2 (en) 2014-12-23 2019-04-23 Immersion Corporation Controlling power distribution to haptic output devices
KR101790892B1 (ko) * 2016-05-17 2017-10-26 주식회사 씨케이머티리얼즈랩 음향 신호를 촉각 신호로 변환하기 방법 및 이를 이용하는 햅틱 장치
EP3267288A1 (en) * 2016-07-08 2018-01-10 Thomson Licensing Method, apparatus and system for rendering haptic effects
JP2018060313A (ja) * 2016-10-04 2018-04-12 ソニー株式会社 受信装置および方法、送信装置および方法、並びにプログラム
CN106454365B (zh) * 2016-11-22 2018-07-10 包磊 多媒体数据的编码、解码方法及编码、解码装置
US11290755B2 (en) * 2017-01-10 2022-03-29 Qualcomm Incorporated Signaling data for prefetching support for streaming media data
US10075251B2 (en) 2017-02-08 2018-09-11 Immersion Corporation Haptic broadcast with select haptic metadata based on haptic playback capability
CN106803982A (zh) * 2017-03-01 2017-06-06 深圳市集贤科技有限公司 一种带按摩功能的vr遥控器
US20190041987A1 (en) 2017-08-03 2019-02-07 Immersion Corporation Haptic effect encoding and rendering system
CN107566847B (zh) * 2017-09-18 2020-02-14 浙江大学 一种将触感数据编码为视频流进行保存和传输的方法
CN112313603B (zh) * 2018-06-28 2024-05-17 索尼公司 编码装置、编码方法、解码装置、解码方法和程序
JP7351299B2 (ja) * 2018-07-03 2023-09-27 ソニーグループ株式会社 符号化装置、符号化方法、復号装置、復号方法、プログラム
US11568718B2 (en) * 2018-12-13 2023-01-31 Sony Group Corporation Information processing apparatus, information processing system, information processing method, and program
CN113260488B (zh) * 2019-02-01 2024-08-06 索尼集团公司 解码装置、解码方法和程序
US10921893B2 (en) * 2019-02-04 2021-02-16 Subpac, Inc. Personalized tactile output
EP3938867A4 (en) * 2019-04-26 2022-10-26 Hewlett-Packard Development Company, L.P. SPATIAL AUDIO AND HAPTICS
US10951951B2 (en) * 2019-07-30 2021-03-16 Sony Interactive Entertainment Inc. Haptics metadata in a spectating stream
US11340704B2 (en) * 2019-08-21 2022-05-24 Subpac, Inc. Tactile audio enhancement
WO2021044901A1 (ja) * 2019-09-03 2021-03-11 ソニー株式会社 制御装置、スピーカ装置および音声出力方法
US10984638B1 (en) * 2019-10-17 2021-04-20 Immersion Corporation Systems, devices, and methods for encoding haptic tracks
US11282281B2 (en) * 2019-11-13 2022-03-22 At&T Intellectual Property I, L.P. Activation of extended reality actuators based on content analysis
JPWO2021176904A1 (zh) * 2020-03-05 2021-09-10
JPWO2021220659A1 (zh) * 2020-04-29 2021-11-04
JP7552338B2 (ja) * 2020-12-18 2024-09-18 株式会社Jvcケンウッド 情報提供装置、情報提供方法、およびプログラム
CN114115538A (zh) * 2021-11-23 2022-03-01 瑞声开泰声学科技(上海)有限公司 一种触觉反馈数据处理方法、装置及计算机可读存储介质
GB2615361B (en) * 2022-02-08 2024-05-29 Sony Interactive Entertainment Europe Ltd Method for generating feedback in a multimedia entertainment system
WO2023176928A1 (ja) * 2022-03-18 2023-09-21 ソニーグループ株式会社 情報処理装置および方法
US20240129578A1 (en) * 2022-10-17 2024-04-18 Tencent America LLC Method and apparatus for defining frames and timed referenced network abstraction layer (nals) structure in haptics signals
US20240129047A1 (en) * 2022-10-17 2024-04-18 Tencent America LLC Method for creating sparse isobmff haptics tracks
US20240129579A1 (en) * 2022-10-17 2024-04-18 Tencent America LLC Isobmff haptic tracks with sample anchoring of haptic effects
WO2024096812A1 (en) * 2022-10-31 2024-05-10 Razer (Asia-Pacific) Pte. Ltd. Method and system for generating haptic effects
CN116185172A (zh) * 2022-12-08 2023-05-30 瑞声开泰声学科技(上海)有限公司 触觉效果数据的压缩和加密方法、系统和相关设备
US20240233498A1 (en) * 2023-01-06 2024-07-11 Tencent America LLC Method and apparatus for frame-accurate haptics interchange file format
WO2024166652A1 (ja) * 2023-02-10 2024-08-15 ソニーグループ株式会社 復号装置および方法、並びに符号化装置および方法

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090079690A1 (en) * 2007-09-21 2009-03-26 Sony Computer Entertainment America Inc. Method and apparatus for enhancing entertainment software through haptic insertion
US20090096632A1 (en) * 2007-10-16 2009-04-16 Immersion Corporation Synchronization of haptic effect data in a media stream
US20090128306A1 (en) * 2007-11-21 2009-05-21 The Guitammer Company Capture and remote reproduction of haptic events in synchronous association with the video and audio capture and reproduction of those events
US20090189748A1 (en) * 2006-08-24 2009-07-30 Koninklijke Philips Electronics N.V. Device for and method of processing an audio signal and/or a video signal to generate haptic excitation
US20110128132A1 (en) * 2006-04-13 2011-06-02 Immersion Corporation System and method for automatically producing haptic events from a digital audio signal
US20110133910A1 (en) * 2008-10-10 2011-06-09 Internet Services Llc System and method for transmitting haptic data in conjunction with media data
US20110149156A1 (en) * 2008-07-15 2011-06-23 Sharp Kabushiki Kaisha Data transmitting apparatus, data receiving apparatus, data transmitting method, data receiving method, and audio-visual environment controlling method

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE60124746T2 (de) * 2000-09-22 2007-09-13 Koninklijke Philips Electronics N.V. Videocodierung mit hybrider temporeller und snr-bezogener feingranularer skalierbarkeit
US7016409B2 (en) * 2003-11-12 2006-03-21 Sony Corporation Apparatus and method for use in providing dynamic bit rate encoding
JP2006163579A (ja) * 2004-12-03 2006-06-22 Sony Corp 情報処理システム、情報処理装置及び情報処理方法
EP3287874A1 (en) * 2006-04-06 2018-02-28 Immersion Corporation Systems and methods for enhanced haptic effects
US20110280398A1 (en) * 2010-05-17 2011-11-17 Anatoly Fradis Secured content distribution system
US8717152B2 (en) * 2011-02-11 2014-05-06 Immersion Corporation Sound to haptic effect conversion system using waveform
US9083821B2 (en) 2011-06-03 2015-07-14 Apple Inc. Converting audio to haptic feedback in an electronic device
CN102208188B (zh) * 2011-07-13 2013-04-17 华为技术有限公司 音频信号编解码方法和设备
JPWO2013008869A1 (ja) * 2011-07-14 2015-02-23 株式会社ニコン 電子機器及びデータ生成方法
US20150304249A1 (en) * 2011-09-06 2015-10-22 Andras Valkó Device and Method for Progressive Media Download with Multiple Layers or Streams
EP2806353B1 (en) * 2013-05-24 2018-07-18 Immersion Corporation Method and system for haptic data encoding
CN104980247B (zh) * 2014-04-04 2019-11-22 北京三星通信技术研究有限公司 自适应调整调制编码方式和参考信号图样的方法、基站、终端和系统


Also Published As

Publication number Publication date
US9437087B2 (en) 2016-09-06
JP2014239430A (ja) 2014-12-18
JP6630798B2 (ja) 2020-01-15
JP6411069B2 (ja) 2018-10-24
KR102176391B1 (ko) 2020-11-09
EP2806353A1 (en) 2014-11-26
KR20140138087A (ko) 2014-12-03
CN104184721B (zh) 2019-03-12
JP2019024228A (ja) 2019-02-14
US10085069B2 (en) 2018-09-25
EP2806353B1 (en) 2018-07-18
CN109876431A (zh) 2019-06-14
US20140347177A1 (en) 2014-11-27
US20160345073A1 (en) 2016-11-24
US10542325B2 (en) 2020-01-21
CN104184721A (zh) 2014-12-03
US20190098368A1 (en) 2019-03-28

Similar Documents

Publication Publication Date Title
US10542325B2 (en) Method and system for haptic data encoding and streaming using a multiplexed data stream
JP6516766B2 (ja) 分割タイムドメディアデータのストリーミングを改善するための方法、デバイス、およびコンピュータプログラム
US10375373B2 (en) Method and apparatus for encoding three-dimensional (3D) content
US8898228B2 (en) Methods and systems for scalable video chunking
EP2666288B1 (en) Apparatus and method for storing and playing content in a multimedia streaming system
CN107071513B (zh) 用于提供媒体内容的方法、客户机和服务器
CN103141115B (zh) 用于媒体流传送的客户端、内容创建器实体及其方法
CN103814562A (zh) 用信号表示片段的特性以用于媒体数据的网络流式传输
GB2533624A (en) Methods, devices, and computer programs for improving coding of media presentation description data
JP7238948B2 (ja) 情報処理装置および情報処理方法
GB2599170A (en) Method, device, and computer program for optimizing indexing of portions of encapsulated media content data
CN112369041B (zh) 播放媒体的方法和计算机可读存储装置
JP7241874B2 (ja) カプセル化されたメディアコンテンツの利用可能な部分をシグナリングするための方法、装置、及びコンピュータプログラム
CN113364728B (zh) 媒体内容接收方法、装置、存储介质和计算机设备
KR101656102B1 (ko) 컨텐츠 파일 생성/제공 장치 및 방법
Ohm et al. Multimedia Representation Standards

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AC Divisional application: reference to earlier application

Ref document number: 2806353

Country of ref document: EP

Kind code of ref document: P

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190507

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20190903

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20211028