WO2011007289A2 - A method and a system for a generation of an excitation effect
Classifications
- H04S3/002 (Stereophonic systems; systems employing more than two channels; non-adaptive circuits, e.g. manually adjustable or static, for enhancing the sound image or the spatial distribution)
- H04R1/1091 (Earpieces, earphones and monophonic headphones; details not provided for in groups H04R1/1008 to H04R1/1083)
Definitions
- In the example of Fig. 4, the first stream 22 is the audio stream, whose audio feature is its amplitude or its dominant frequency, the second stream 24 is the video stream, whose video feature is its average motion vector magnitude, and the excitation effect 20 is a haptic effect produced by an actuator.
- Fig. 2 schematically shows a second exemplary embodiment of the system according to the invention, wherein the excitation effect 20 is generated if the detected first stream feature and the detected second stream feature have values according to a predetermined rule.
- Here the predetermined rule implies that the excitation effect 20 is generated if the first stream feature 36 has a value A higher than a predetermined first threshold of the first stream A1 and if the second stream feature 38 has a value V higher than a predetermined first threshold of the second stream V1.
- Alternatively, the predetermined rule implies that the excitation effect 20 is generated if the first stream feature 36 has a value A higher than a predetermined first threshold of the first stream A1 and lower than a predetermined second threshold of the first stream A2, and if the second stream feature 38 has a value V higher than a predetermined first threshold of the second stream V1 and lower than a predetermined second threshold of the second stream V2.
- The first threshold of the first stream A1 is lower than the second threshold of the first stream A2, and the first threshold of the second stream V1 is lower than the second threshold of the second stream V2.
- Fig. 5 schematically shows an exemplary embodiment of the system according to the invention.
- the system is configured for generating an excitation effect 20 based on a multimedia stream 2, wherein the multimedia stream comprises at least a first stream 22 and a second stream 24.
- the system comprises a first stream analyzer 4 for performing a first stream analysis 30 and generating a first stream analysis result 14 and a second stream analyzer 6 for performing a second stream analysis 32 and generating a second stream analysis result 16.
- the system further comprises an excitation effect generator 8 for generating an excitation output signal 18 based on the first stream analysis result 14 and on the second stream analysis result 16, and an excitation means 10 for generating 34 the excitation effect 20 based on the excitation output signal 18.
- the excitation effect generator 8 is configured for generating the excitation output signal 18 synchronously to the time stamp t1 where the first stream 22 has a first stream feature 36 that dictates that the excitation effect 20 should be generated and where the second stream 24 has a second stream feature 38 that dictates that the excitation effect 20 should be generated.
- the system further comprises an adaptation means 40 for adaptation of a parameter of the excitation effect 20.
- the adaptation means will set the parameter of the excitation effect, for example its intensity, its duration, or another suitable parameter.
- the first stream 22 is a video stream
- the second stream 24 is an audio stream
- the excitation means is a touch blanket.
- the touch blanket is a device with a two-dimensional matrix configuration of a number of haptic actuators.
- the parameter of the excitation effect can be a motion of a haptic pattern on the touch blanket that corresponds to a dominant direction of motion vectors in the video stream.
- Another parameter of the excitation effect can be a speed of said motion pattern that corresponds to the average magnitude of motion vectors in the video stream.
- the adaptation means 40 can be implemented as a Central Processor Unit (CPU) running adaptation software.
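The touch-blanket mapping described above, where a haptic pattern moves across a two-dimensional actuator matrix in the dominant direction of the video's motion vectors, can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation; all function names, the grid size, and the one-cell-per-step motion model are assumptions.

```python
import math

def dominant_direction(motion_vectors):
    """Dominant direction of the video's motion vectors: angle of their vector sum."""
    sx = sum(dx for dx, dy in motion_vectors)
    sy = sum(dy for dx, dy in motion_vectors)
    return math.atan2(sy, sx)

def step_pattern(pos, angle, grid_w, grid_h):
    """Move a haptic pattern one cell across a grid_w x grid_h actuator
    matrix in the given direction, clamping at the blanket's edges."""
    x, y = pos
    x = min(max(x + round(math.cos(angle)), 0), grid_w - 1)
    y = min(max(y + round(math.sin(angle)), 0), grid_h - 1)
    return (x, y)

# Mostly rightward motion in the video moves the pattern one cell to the right.
angle = dominant_direction([(1, 0), (2, 0), (1, 1)])
print(step_pattern((3, 3), angle, 8, 8))  # (4, 3)
```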
Abstract
A method of generating an excitation effect (20) based on a multimedia stream (2) wherein the multimedia stream comprises at least a first stream (22) and a second stream (24). The method comprises the following steps: performing a first analysis (30) of the first stream (22) and a second analysis (32) of the second stream (24), and generating (34) the excitation effect (20) based on the results (14;16) of the first analysis (30) and the second analysis (32). Performing the first analysis (30) and the second analysis (32) comprises detecting a time stamp (t1) where the first stream (22) has a first stream feature (36) that dictates that the excitation effect (20) should be generated and where the second stream (24) has a second stream feature (38) that dictates that the excitation effect (20) should be generated. The excitation effect (20) is generated synchronously to the time stamp (t1) detected by the first analysis (30) and the second analysis (32). The invention also provides a system for generating an excitation effect (20) based on a multimedia stream (2) that comprises at least a first stream (22) and a second stream (24) and is configured for making use of the method described above. The method and the system afford an improved quality of excitation experiences.
Description
A method and a system for a generation of an excitation effect
FIELD OF THE INVENTION
The invention relates to a method and a system for generation of an excitation effect based on a multimedia stream, wherein the multimedia stream comprises at least a first stream and a second stream, comprising the following steps: performing a first analysis of the first stream, performing a second analysis of the second stream and generating the excitation effect based on the results of the first analysis and the second analysis.
BACKGROUND OF THE INVENTION
An embodiment of such a method and system is disclosed in patent application WO 2008/023346 A1. The known device for processing an audio signal and/or video signal comprises a haptic excitation generating unit adapted for generating a haptic excitation of a specific body part of a user by generating an airflow through a vent in accordance with the audio signal and/or the video signal to be reproduced.
SUMMARY OF THE INVENTION
It is an object of the present invention to afford a method and a system with an improved quality of excitation experiences.
This object is achieved with the method of producing an excitation effect based on a multimedia stream as defined in Claim 1. The multimedia stream comprises at least a first stream and a second stream. The method according to the invention comprises the following steps: performing a first analysis of the first stream and a second analysis of the second stream, and generating the excitation effect based on the results of the first analysis and the second analysis. The method according to the invention is characterized in that performing the first analysis and the second analysis comprises detecting a time stamp where the first stream has a first stream feature that dictates that the excitation effect should be generated and where the second stream has a second stream feature that dictates that the excitation effect should be generated, and in that the excitation effect is generated substantially synchronously to the time stamp detected by the first analysis and the second analysis. Since the excitation effect is generated synchronously to the time stamp indicated by both the first stream feature and the second stream feature, the probability of the correct generation of the excitation effect and consequently the quality of the excitation experience for a user are improved.
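The claimed detection step, triggering the effect only at time stamps where both streams' features dictate it, can be sketched in Python. This is an illustrative sketch only; the function name, the per-frame feature lists, and the predicate values are assumptions, not taken from the patent.

```python
def detect_excitation_timestamps(first_feature, second_feature,
                                 first_dictates, second_dictates):
    """Return time stamps where BOTH per-frame feature sequences
    dictate that the excitation effect should be generated."""
    return [t for t, (f1, f2) in enumerate(zip(first_feature, second_feature))
            if first_dictates(f1) and second_dictates(f2)]

# Example with placeholder feature values: only t=2 satisfies both conditions.
audio = [0.1, 0.9, 0.95, 0.2]   # e.g. normalized amplitude per frame
video = [6.0, 1.0, 7.5, 8.0]    # e.g. motion vector magnitude per frame
ts = detect_excitation_timestamps(audio, video,
                                  lambda a: a > 0.8, lambda v: v > 5.0)
print(ts)  # [2]
```

At t=1 only the audio condition holds and at t=3 only the video condition holds, so neither frame triggers the effect, which is the point of requiring agreement between the two analyses.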
An embodiment of the method according to the invention has the feature that the second analysis is performed only on a part of the second stream substantially close to the time stamp where the first analysis detected a first stream feature that dictates that the excitation effect should be generated. This embodiment provides an improved performance of the method since the second analysis is not done on the whole second stream. Nevertheless, the method provides the same probability of the correct generation of the excitation effect and the same quality of the excitation experience for a user as in the previous embodiment.
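The windowed variant above can be sketched as follows: the second (e.g. video) analysis is evaluated only within a small window around each candidate time stamp from the first (e.g. audio) analysis. All names, the window size, and the threshold are illustrative assumptions.

```python
def windowed_second_analysis(candidates, second_feature_at, window=2):
    """Confirm candidate time stamps from the first analysis by running
    the second analysis only within +/- `window` frames of each candidate,
    instead of over the whole second stream."""
    confirmed = []
    for t in candidates:
        nearby = range(max(t - window, 0), t + window + 1)
        if any(second_feature_at(u) > 5.0 for u in nearby):  # assumed threshold
            confirmed.append(t)
    return confirmed

# Only frames near the candidates are ever inspected.
video = [0.0, 0.0, 0.0, 0.0, 6.0, 0.0, 0.0, 0.0, 0.0, 0.0]
def video_feature_at(u):
    return video[u] if u < len(video) else 0.0

print(windowed_second_analysis([3, 8], video_feature_at))  # [3]
```

Candidate t=3 is confirmed because frame 4 lies in its window; candidate t=8 sees no qualifying video feature nearby and is discarded.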
An embodiment of the method according to the invention has the feature that the generated excitation effect is characterized by at least one parameter and in that a value of the parameter is dependent on characteristics of the multimedia stream. That means that not only will the excitation effect be generated at the right moment according to the characteristics of the multimedia stream, but also the excitation effect's characteristics, represented by the parameter, will depend on the characteristics of the multimedia stream, in order to further improve the excitation experience for the user.
An embodiment of the method according to the invention has the feature that the parameter of the generated excitation effect is an intensity or a duration of the generated excitation effect.
An embodiment of the method according to the invention has the feature that the multimedia stream is an audio-video stream, the first stream is an audio stream, the second stream is a video stream, the first analysis is an audio analysis, the second analysis is a video analysis, the first stream feature is an audio feature, and the second stream feature is a video feature. The audio feature can be an amplitude of the audio stream, an audio frequency of the audio stream, an amplitude of a value in the power spectrum calculated from the audio stream, or another suitable characteristic of the audio stream. The video feature can be for example a vector length or a direction of a motion vector of the video stream.
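Two of the feature computations mentioned above can be sketched briefly: a peak-amplitude audio feature and an average motion-vector-magnitude video feature. This is an illustrative sketch; the function names and sample data are assumptions, and real implementations would work on decoded PCM blocks and decoder-supplied motion vectors.

```python
import math

def audio_amplitude(samples):
    """Audio feature: peak amplitude of a block of (normalized) PCM samples."""
    return max(abs(s) for s in samples)

def mean_motion_magnitude(motion_vectors):
    """Video feature: average length of per-block motion vectors (dx, dy)."""
    return sum(math.hypot(dx, dy) for dx, dy in motion_vectors) / len(motion_vectors)

print(audio_amplitude([0.1, -0.7, 0.3]))        # 0.7
print(mean_motion_magnitude([(3, 4), (0, 0)]))  # 2.5
```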
An embodiment of the method according to the invention has the feature that the excitation effect is a haptic effect and/or a light effect. The haptic effect can be produced by an actuator, by a vent, or another suitable producing means.
An embodiment of the method according to the invention has the feature that the excitation effect is generated if the detected first stream feature and the detected second stream feature have values according to a predetermined rule. Further improvement of the excitation experience for the user is achieved by tuning said predetermined rule until the most satisfactory results for the user are achieved.
An embodiment of the method according to the invention has the feature that the predetermined rule implies that the excitation effect is generated if the first stream feature has a value higher than a predetermined first threshold of the first stream and if the second stream feature has a value higher than a predetermined first threshold of the second stream. If the first stream is an audio stream and the second stream is a video stream for example, the first stream feature could be the amplitude of the audio stream or the dominant frequency of the audio stream and said value of the first stream feature will be the amplitude value or the value of a detected dominant frequency respectively, and the second stream feature could be the motion vector of the video stream and said value of the second stream feature will be the motion vector magnitude.
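The single-threshold rule described above can be sketched directly, using the A/A1 and V/V1 notation that the patent's figure description introduces. The threshold values here are illustrative assumptions.

```python
A1 = 0.8  # assumed first threshold of the first (audio) stream
V1 = 5.0  # assumed first threshold of the second (video) stream

def rule_single_threshold(A, V):
    """Generate the excitation effect iff the audio feature value A exceeds A1
    and the video feature value V exceeds V1."""
    return A > A1 and V > V1

print(rule_single_threshold(0.9, 6.0))  # True
print(rule_single_threshold(0.9, 4.0))  # False
```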
An embodiment of the method according to the invention has the feature that the predetermined rule implies that the excitation effect is generated if the first stream feature has a value higher than a predetermined first threshold of the first stream and lower than a predetermined second threshold of the first stream, and if the second stream feature has a value higher than a predetermined first threshold of the second stream and lower than a predetermined second threshold of the second stream, wherein the first threshold of the first stream is lower than the second threshold of the first stream and the first threshold of the second stream is lower than the second threshold of the second stream. That means that the excitation effect will not be generated if the first stream feature has a value higher than the second threshold of the first stream or if the second stream feature has a value higher than the second threshold of the second stream; such a measure can, in certain cases depending on the characteristics of the multimedia stream, further improve the quality of the excitation experience for the user.
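The band rule above, with a lower and an upper threshold per stream, reduces to two range checks. The threshold values are illustrative assumptions following the A1 < A2 and V1 < V2 constraint stated in the text.

```python
A1, A2 = 0.5, 0.9  # assumed audio thresholds, with A1 < A2
V1, V2 = 3.0, 8.0  # assumed video thresholds, with V1 < V2

def rule_band(A, V):
    """Generate the excitation effect iff both feature values lie strictly
    inside their respective threshold bands."""
    return A1 < A < A2 and V1 < V < V2

print(rule_band(0.7, 5.0))   # True
print(rule_band(0.95, 5.0))  # False (A above A2 suppresses the effect)
```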
It is also an object of the present invention to provide a system for producing an improved excitation effect based on a multimedia stream. The system according to the invention is defined in claim 10. The multimedia stream comprises at least a first stream and a second stream. The system according to the invention comprises: a first stream analyzer for performing a first stream analysis and generating a first stream analysis result, a second stream analyzer for performing a second stream analysis and generating a second stream analysis result, an excitation effect generator for generating an excitation output signal based on the first stream analysis result and on the second stream analysis result, and an excitation means for generating the excitation effect based on the excitation output signal. The system according to the invention is characterized in that the excitation effect generator is configured for generating the excitation output signal by making use of the method as defined in any one of claims 1 or 2.
An embodiment of the system according to the invention has the feature that the generated excitation effect is characterized by at least one parameter and in that the system further comprises an adaptation means for adaptation of the parameter by making use of the method as defined in any one of claims 3 to 9.
An embodiment of the system according to the invention has the feature that the adaptation means is a Central Processor Unit (CPU), a microprocessor or any other suitable unit configured for running adaptation software.
The method and the system for a generation of an excitation effect based on a multimedia stream described above can be used in many multimedia systems, among others:
a home cinema set comprising a BluRay-player or a DVD-player to which haptic actuators can be attached via the system according to the invention,
a television-set to which haptic effect devices and/or RGB lamps can be attached via the system according to the invention,
a gaming device and any suitable excitation means that are driven by the system according to the invention,
- a hard-disc recorder, BluRay-recorder or DVD-recorder where the system according to the invention based on the recorded multimedia stream generates an excitation effect which is recorded together with the multimedia stream,
an excitation blanket for a relaxation chair, a massaging-chair or a bed connected via the system according to the invention to a multimedia player,
- relaxation or massaging mats connected via the system according to the invention to a multimedia player,
a content creation system where an excitation effect content is created by the system according to the invention based on a multimedia content,
a system for a cinema where excitation effects are generated by the system according to the invention based on an audio-video content of a movie and particular excitation means driven by the system according to the invention are integrated in the cinema chairs,
any other system where excitation effects are produced based on a multimedia stream.
BRIEF DESCRIPTION OF THE DRAWINGS
In the following, the invention and further aspects will be described and explained, by way of example, using the following figures:
Figs. 1A and 1B schematically show a first exemplary embodiment of the method according to the invention, wherein the method comprises a first analysis of a first stream, a second analysis of a second stream and generating an excitation effect;
Fig. 2 schematically indicates a second exemplary embodiment of the method according to the invention, wherein an excitation effect is generated if a first stream feature has a value higher than a predetermined first threshold of a first stream and if a second stream feature has a value higher than a predetermined first threshold of a second stream;
Fig. 3 schematically indicates a third exemplary embodiment of the method according to the invention, wherein an excitation effect is generated if a first stream feature has a value higher than a predetermined first threshold of a first stream and lower than a predetermined second threshold of the first stream, and if a second stream feature has a value higher than a predetermined first threshold of a second stream and lower than a predetermined second threshold of the second stream, wherein the first threshold of the first stream is lower than the second threshold of the first stream and the first threshold of the second stream is lower than the second threshold of the second stream;
Fig. 4 schematically indicates an exemplary embodiment of an audio-video stream used by the method according to the invention and the system according to the invention for the generation of an excitation effect;
Fig. 5 schematically shows an exemplary embodiment of the system according to the invention for generating an excitation effect based on a multimedia stream.
DETAILED DESCRIPTION OF THE EMBODIMENTS
In the following description of the preferred embodiments, reference is made to the accompanying drawings which form a part thereof. Specific embodiments, in which the invention may be practiced, are shown in the following description by way of illustration. It is also understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention. It is noted that the same reference signs will be used for indicating the same or similar parts in the several embodiments.
A first embodiment of the invention is shown in Figs. 1A and 1B. A method of generating an excitation effect 20 based on a multimedia stream 2 is schematically shown in Fig. 1A. The multimedia stream comprises at least a first stream 22 and a second stream 24. The method of generating the excitation effect comprises the steps of performing a first analysis 30 of the first stream 22 and performing a second analysis 32 of the second stream 24. The method further comprises the step of generating 34 the excitation effect 20 based on results 14;16 of the first analysis 30 and the second analysis 32. The steps of performing the first analysis 30 and performing the second analysis 32 comprise detecting a time stamp t1, see Fig. 1B, where substantially close to t1 the first stream 22 has a first stream feature 36 that dictates that the excitation effect 20 should be generated and where substantially close to t1 the second stream 24 has a second stream feature 38 that dictates that the excitation effect 20 should be generated. The excitation effect 20 is generated and felt by a user 12 synchronously or at least substantially synchronously to the time stamp t1 detected by the first analysis 30 and the second analysis 32. Since the excitation effect 20 is generated synchronously to the time stamp t1 indicated by both the first stream feature 36 and the second stream feature 38, the probability of the correct generation of the excitation effect 20 and consequently the quality of the excitation experience for the user 12 are improved.
The multimedia stream 2 can be any multimedia stream known in the art having at least two streams, for example: audio, video, subtitles, still pictures, or another suitable stream. Examples of a multimedia stream are: an MPEG-2 stream, a DVD stream, or another suitable stream. MPEG-2 is a standard for the generic coding of moving pictures, video, and associated audio information. The MPEG-2 stream is an example of a stream comprising an audio stream which can be used as the first stream 22 and a video stream which can be used as the second stream 24. The DVD stream comprises an audio stream, a video stream, a subtitles stream, a still pictures stream and other streams according to the DVD standard known in the art. The first stream 22 and the second stream 24 of the DVD stream can be any combination of the mentioned streams, for example audio stream and video stream, audio stream and subtitles stream, video stream and subtitles stream, etc.
Not only the moment of generation of the excitation effect is determined by the features of the first and the second stream, but also the characteristics of the generated excitation effect can be related to these features. For example, the intensity of the excitation effect or the duration of the excitation effect can be set according to the first stream feature and the second stream feature.
The excitation effect can be any excitation effect known in the art or a combination of such effects. Examples of excitation effects are: a haptic effect produced by an actuator or by a vent, a light effect, or another suitable excitation effect.
If the first stream 22 is the audio stream having as the audio feature its amplitude or its dominant frequency, the second stream 24 is the video stream having as the video feature its average motion vector magnitude and the excitation effect 20 is a haptic effect produced by an actuator, the intensity and the duration of the generated haptic effect can be linearly proportional to the value of the amplitude of the audio stream and to the magnitude of the motion vector of the video stream. The haptic effect intensity linearly dependent on the audio stream amplitude and the magnitude of the video stream motion vector can be presented by the following equation: Hei=X*Aa+Y*Vmvi+Z, where Hei is the haptic effect intensity, Aa is the value of the amplitude of the audio stream, Vmvi is the magnitude of the motion vector of the video stream and X, Y and Z are constants. X, Y and Z constants can be tuned in order to achieve a required haptic experience. It has to be understood that this is only an example and that the parameters of the excitation effect can be any possible mathematical function of the parameters of the first stream 22 and the second stream 24.
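The linear relation Hei = X*Aa + Y*Vmvi + Z above can be sketched in code as follows (a minimal illustration; the function name and the default values of the constants X, Y and Z are assumptions chosen for the example, not values from the description — in practice they would be tuned to achieve the required haptic experience):

```python
def haptic_effect_intensity(audio_amplitude, motion_vector_magnitude,
                            x=0.5, y=0.5, z=0.0):
    """Linear mapping Hei = X*Aa + Y*Vmvi + Z.

    audio_amplitude        -- Aa, the value of the amplitude of the audio stream
    motion_vector_magnitude -- Vmvi, the magnitude of the video stream motion vector
    x, y, z                -- illustrative tuning constants X, Y and Z
    """
    return x * audio_amplitude + y * motion_vector_magnitude + z
```

Any other mathematical function of the two stream parameters could be substituted in the same way, as the description notes.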
Fig. 2 schematically shows a second exemplary embodiment of the system according to the invention, wherein the excitation effect 20 is generated if the detected first stream feature and the detected second stream feature have values according to a predetermined rule. More particularly, the predetermined rule implies that the excitation effect 20 is generated if the first stream feature 36 has a value A higher than a predetermined first threshold of the first stream A1 and if the second stream feature 38 has a value V higher than a predetermined first threshold of the second stream V1.
Another example of the predetermined rule is schematically shown in Fig. 3 as a third exemplary embodiment of the system according to the invention, wherein the predetermined rule implies that the excitation effect 20 is generated if the first stream feature 36 has a value A higher than a predetermined first threshold of the first stream A1 and lower than a predetermined second threshold of the first stream A2 and if the second stream feature 38 has a value V higher than a predetermined first threshold of the second stream V1 and lower than a predetermined second threshold of the second stream V2. The first threshold of the first stream A1 is lower than the second threshold of the first stream A2 and the first threshold of the second stream V1 is lower than the second threshold of the second stream V2.
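The two predetermined rules of Figs. 2 and 3 can be expressed as simple predicates (an illustrative sketch; the function and parameter names are assumptions, not part of the description):

```python
def rule_fig2(a, v, a1, v1):
    # Fig. 2 rule: both features must exceed their first thresholds.
    return a > a1 and v > v1

def rule_fig3(a, v, a1, a2, v1, v2):
    # Fig. 3 rule: both features must lie strictly inside the bands
    # (A1, A2) and (V1, V2), with A1 < A2 and V1 < V2.
    return a1 < a < a2 and v1 < v < v2
```

The Fig. 3 rule is strictly stricter than the Fig. 2 rule: any time stamp it accepts is also accepted by the Fig. 2 rule, but not vice versa.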
Applying the predetermined rule shown in Fig. 2 to the first stream A and the second stream V shown in Fig. 4 implies that the excitation effects will be generated synchronously to the time stamps t1, t2 and t3. The excitation effect will not be generated synchronously to the time stamp t4, since there the feature 38 of the second stream V is not higher than the predetermined first threshold of the second stream V1.
Applying the predetermined rule shown in Fig. 3 to the first stream A and the second stream V shown in Fig. 4 implies that the excitation effects will be generated synchronously to the time stamp t1. The excitation effect will not be generated synchronously to the time stamps:
- t2 (since there the feature 36 of the first stream A is not lower than the predetermined second threshold of the first stream A2),
- t3 (since there the feature 38 of the second stream V is not lower than the predetermined second threshold of the second stream V2), and
- t4 (since there the feature 38 of the second stream V is not higher than the predetermined first threshold of the second stream V1).
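The application of such a predetermined rule to sampled feature values of the two streams, as in the Fig. 4 example, can be sketched as follows (the threshold values and sample data are illustrative assumptions chosen to mimic the described behaviour at t1..t4; the function name is likewise an assumption):

```python
def detect_time_stamps(audio, video, rule):
    # Return the indices (time stamps) where the predetermined rule
    # holds for both stream features simultaneously.
    return [t for t, (a, v) in enumerate(zip(audio, video)) if rule(a, v)]

A1, A2, V1, V2 = 1.0, 3.0, 1.0, 3.0   # illustrative thresholds
audio = [2.0, 4.0, 2.0, 2.0]          # feature A at t1..t4 (Fig. 4-like pattern)
video = [2.0, 2.0, 4.0, 0.5]          # feature V at t1..t4

# Fig. 2 rule: fires at indices 0, 1, 2 (t1, t2, t3), not at t4.
both_above = detect_time_stamps(audio, video,
                                lambda a, v: a > A1 and v > V1)

# Fig. 3 rule: fires only at index 0 (t1).
in_band = detect_time_stamps(audio, video,
                             lambda a, v: A1 < a < A2 and V1 < v < V2)
```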
Fig. 5 schematically shows an exemplary embodiment of the system according to the invention. The system is configured for generating an excitation effect 20 based on a multimedia stream 2, wherein the multimedia stream comprises at least a first stream 22 and a second stream 24. The system comprises a first stream analyzer 4 for performing a first stream analysis 30 and generating a first stream analysis result 14 and a second stream analyzer 6 for performing a second stream analysis 32 and generating a second stream analysis result 16. The system further comprises an excitation effect generator 8 for generating an excitation output signal 18 based on the first stream analysis result 14 and on the second stream analysis result 16 and an exciting means 10 for generating 34 the excitation effect 20 based on the excitation output signal 18. The excitation effect generator 8 is configured for generating the excitation output signal 18 synchronously to the time stamp t1 where the first stream 22 has a first stream feature 36 that dictates that the excitation effect 20 should be generated and where the second stream 24 has a second stream feature 38 that dictates that the excitation effect 20 should be generated.
In an embodiment the system further comprises an adaptation means 40 for adaptation of a parameter of the excitation effect 20. Depending on the detected features of the first stream 22 and the second stream 24 the adaptation means will set the parameter of the excitation effect, for example its intensity, its duration, or another suitable parameter.
In an exemplary embodiment the first stream 22 is a video stream, the second stream 24 is an audio stream and the exciting means is a touch blanket. The touch blanket is a device with a two-dimensional matrix configuration of a number of haptic actuators. The parameter of the excitation effect can be a motion of a haptic pattern on the touch blanket that corresponds to a dominant direction of motion vectors in the video stream. Another parameter of the excitation effect can be a speed of said motion pattern that corresponds to the average magnitude of motion vectors in the video stream.
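The mapping from video motion vectors to the motion of the haptic pattern on the touch blanket can be sketched as follows (an illustrative sketch; the per-block (dx, dy) input format and the function name are assumptions, not part of the description):

```python
import math

def pattern_motion(motion_vectors):
    """Derive touch-blanket pattern motion from one frame's motion vectors.

    motion_vectors -- list of (dx, dy) per-block motion vectors (assumed
                      input format).
    Returns (direction, speed): the dominant direction in radians, taken
    from the summed vector, and the average magnitude used as the speed
    of the haptic pattern.
    """
    sx = sum(dx for dx, _ in motion_vectors)
    sy = sum(dy for _, dy in motion_vectors)
    direction = math.atan2(sy, sx)  # dominant direction of motion
    speed = sum(math.hypot(dx, dy)
                for dx, dy in motion_vectors) / len(motion_vectors)
    return direction, speed
```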
The adaptation means 40 can be implemented as a Central Processing Unit (CPU) running adaptation software.
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments.
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
LIST OF REFERENCE NUMERALS:
2 A multimedia stream
4 A first stream analyzer
6 A second stream analyzer
8 An excitation effect generator
10 An exciting means
12 A user
14 A first stream analysis result
16 A second stream analysis result
18 An excitation output signal
20 An excitation effect
22 A first stream
24 A second stream
30 Performing a first analysis
32 Performing a second analysis
34 Generating an excitation effect
36 A first stream feature
38 A second stream feature
40 An adaptation means
t1 A time stamp
A An audio feature
V A video feature
A1 A first audio threshold
A2 A second audio threshold
V1 A first video threshold
V2 A second video threshold
Claims
1. A method of generating an excitation effect (20) based on a multimedia stream (2), wherein the multimedia stream comprises at least a first stream (22) and a second stream (24), which method comprises the following steps:
performing a first analysis (30) of the first stream (22) and performing a second analysis (32) of the second stream (24), and
generating (34) the excitation effect (20) based on results (14;16) of the first analysis (30) and the second analysis (32),
characterized in that,
performing the first analysis (30) and the second analysis (32) comprises detecting a time stamp (t1) where the first stream (22) has a first stream feature (36) that dictates that the excitation effect (20) should be generated and where the second stream (24) has a second stream feature (38) that dictates that the excitation effect (20) should be generated, and in that,
the excitation effect (20) is generated substantially synchronously to the time stamp (t1) detected by the first analysis (30) and the second analysis (32).
2. The method as claimed in claim 1, wherein performing the second analysis (32) is performed only on a part of the second stream substantially close to the time stamp (t1) where the first analysis (30) detected a first stream feature (36) that dictates that the excitation effect (20) should be generated.
3. The method as claimed in any one of the previous claims, wherein the generated excitation effect (20) is characterized by at least one parameter and in that a value of the parameter is dependent on a characteristic of the multimedia stream (2).
4. The method as claimed in claim 3, wherein the parameter of the generated excitation effect (20) is an intensity or duration of the generated excitation effect.
5. The method as claimed in any one of the previous claims, wherein:
- the multimedia stream (2) is an audio-video stream,
- the first stream (22) is an audio stream,
- the second stream (24) is a video stream,
- the first analysis (30) is an audio analysis,
- the second analysis (32) is a video analysis,
- the first stream feature (36) is an audio feature (A), and
- the second stream feature (38) is a video feature (V).
6. The method as claimed in any one of the previous claims, wherein the excitation effect (20) is a haptic effect and/or a light effect.
7. The method as claimed in any one of the previous claims, wherein the excitation effect (20) is generated if the detected first stream feature (36) and the detected second stream feature (38) have values according to a predetermined rule.
8. The method as claimed in claim 7, wherein the predetermined rule implies generating the excitation effect (20) if the first stream feature (36) has a value (A) higher than a predetermined first threshold of the first stream (A1) and if the second stream feature (38) has a value (V) higher than a predetermined first threshold of the second stream (V1).
9. The method as claimed in claim 7, wherein the predetermined rule implies generating the excitation effect (20) if:
- the first stream feature (36) has a value (A) higher than a predetermined first threshold of the first stream (A1) and lower than a predetermined second threshold of the first stream (A2), and if
- the second stream feature (38) has a value (V) higher than a predetermined first threshold of the second stream (V1) and lower than a predetermined second threshold of the second stream (V2),
- wherein the first threshold of the first stream (A1) is lower than the second threshold of the first stream (A2) and the first threshold of the second stream (V1) is lower than the second threshold of the second stream (V2).
10. A system for generating an excitation effect (20) based on a multimedia stream (2), wherein the multimedia stream comprises at least a first stream (22) and a second stream (24), which system comprises:
- a first stream analyzer (4) for performing a first stream analysis (30) and generating a first stream analysis result (14),
- a second stream analyzer (6) for performing a second stream analysis (32) and generating a second stream analysis result (16),
- an excitation effect generator (8) for generating an excitation output signal (18) based on the first stream analysis result (14) and the second stream analysis result (16), and
- an exciting means (10) for generating (34) the excitation effect (20) based on the excitation output signal (18),
characterized in that
the excitation effect generator (8) is configured for generating the excitation output signal (18) by making use of the method as defined in any of claims 1 or 2.
11. The system as claimed in claim 10, wherein the generated excitation effect (20) is characterized by at least one parameter and in that the system further comprises an adaptation means (40) for adaptation of the parameter by making use of the method as defined in any of claims 3 to 9.
12. The system as claimed in claim 11, wherein the adaptation means (40) is a Central Processing Unit (CPU) configured for running adaptation software.
13. A home-cinema system comprising the system as claimed in any one of claims 10, 11 or 12 and excitation means, wherein the excitation means are driven by the system based on a multimedia stream produced by the home-cinema system.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP09165731 | 2009-07-17 | ||
EP09165731.2 | 2009-07-17 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2011007289A2 true WO2011007289A2 (en) | 2011-01-20 |
WO2011007289A3 WO2011007289A3 (en) | 2011-04-21 |
Family
ID=43431868
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2010/053030 WO2011007289A2 (en) | 2009-07-17 | 2010-07-01 | A method and a system for a generation of an excitation effect |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2011007289A2 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103902215A (en) * | 2012-12-28 | 2014-07-02 | 联想(北京)有限公司 | Information processing method and electronic devices |
CN104049749A (en) * | 2013-03-15 | 2014-09-17 | 英默森公司 | Method and apparatus to generate haptic feedback from video content analysis |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008023346A1 (en) | 2006-08-24 | 2008-02-28 | Koninklijke Philips Electronics N.V. | Device for and method of processing an audio signal and/or a video signal to generate haptic excitation |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6930620B2 (en) * | 2002-01-15 | 2005-08-16 | Microsoft Corporation | Methods and systems for synchronizing data streams |
US9019087B2 (en) * | 2007-10-16 | 2015-04-28 | Immersion Corporation | Synchronization of haptic effect data in a media stream |
- 2010-07-01: WO PCT/IB2010/053030 patent WO2011007289A2/en (active Application Filing)
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008023346A1 (en) | 2006-08-24 | 2008-02-28 | Koninklijke Philips Electronics N.V. | Device for and method of processing an audio signal and/or a video signal to generate haptic excitation |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103902215A (en) * | 2012-12-28 | 2014-07-02 | 联想(北京)有限公司 | Information processing method and electronic devices |
CN104049749A (en) * | 2013-03-15 | 2014-09-17 | 英默森公司 | Method and apparatus to generate haptic feedback from video content analysis |
JP2014194765A (en) * | 2013-03-15 | 2014-10-09 | Immersion Corp | Method and apparatus to generate haptic feedback based on video content analysis |
EP2779675A3 (en) * | 2013-03-15 | 2015-03-04 | Immersion Corporation | Computer-implemented method and system of providing haptic feedback |
US9064385B2 (en) | 2013-03-15 | 2015-06-23 | Immersion Corporation | Method and apparatus to generate haptic feedback from video content analysis |
US9911196B2 (en) | 2013-03-15 | 2018-03-06 | Immersion Corporation | Method and apparatus to generate haptic feedback from video content analysis |
JP2019036351A (en) * | 2013-03-15 | 2019-03-07 | イマージョン コーポレーションImmersion Corporation | Method and apparatus to generate haptic feedback based on video content analysis |
US10482608B2 (en) | 2013-03-15 | 2019-11-19 | Immersion Corporation | Method and apparatus to generate haptic feedback from video content analysis |
EP3570553A1 (en) * | 2013-03-15 | 2019-11-20 | Immersion Corporation | Computer-implemented method and system of providing haptic feedback |
CN110536162A (en) * | 2013-03-15 | 2019-12-03 | 意美森公司 | The method and apparatus for generating the touch feedback from video content analysis |
Also Published As
Publication number | Publication date |
---|---|
WO2011007289A3 (en) | 2011-04-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6639602B2 (en) | Offline haptic conversion system | |
US9667907B2 (en) | System and method for haptic integration and generation in multimedia devices | |
JP6534799B2 (en) | Multiplexing and demultiplexing of haptic signals | |
EP3410259B1 (en) | Spatialized haptic feedback based on dynamically scaled values | |
US9640046B2 (en) | Media recognition and synchronisation to a motion signal | |
US20130120114A1 (en) | Biofeedback control system and method for human-machine interface | |
JP2009521170A (en) | Script synchronization method using watermark | |
KR20080109907A (en) | Systems and methods for enhanced haptic effects | |
JP2012506085A (en) | Controlling the user impact of the rendering environment | |
KR20100114857A (en) | Method and apparatus for representation of sensory effects using user's sensory effect preference metadata | |
US20190267043A1 (en) | Automated haptic effect accompaniment | |
Timmerer et al. | Assessing the quality of sensory experience for multimedia presentations | |
CN106406522A (en) | Virtual reality scene content adjustment method and apparatus | |
CN112470104B (en) | Encoding device, encoding method, decoding device, decoding method, transmission system, reception device, and program | |
WO2020031497A1 (en) | Preemptive driving of tactile feedback presentation device | |
JP2013538469A (en) | Realistic effect processing system and method | |
WO2011007289A2 (en) | A method and a system for a generation of an excitation effect | |
CN103916705B (en) | The method and apparatus of mosaic navigation is realized in electric terminal | |
US10440446B2 (en) | Method for generating haptic coefficients using an autoregressive model, signal and device for reproducing such coefficients | |
JP7272360B2 (en) | Encoding device, encoding method, decoding device, decoding method, program | |
CN108721890B (en) | VR action adaptation method, device and readable storage medium | |
US20180218576A1 (en) | Low bit rate parametric encoding and transport of haptic-tactile signals | |
JP6475921B2 (en) | communication unit | |
KR20100092689A (en) | Method and system for providing game using streaming sound source | |
CN109905768A (en) | A kind of internet television set-top box output video cardton judgment method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10740321 Country of ref document: EP Kind code of ref document: A2 |
NENP | Non-entry into the national phase in: |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 10740321 Country of ref document: EP Kind code of ref document: A2 |