WO1999051020A2 - Animation synchronization for computer implemented multimedia applications - Google Patents

Info

Publication number
WO1999051020A2
Authority
WO
WIPO (PCT)
Prior art keywords
multimedia
motion video
multimedia application
computer
time information
Prior art date
Application number
PCT/US1999/006983
Other languages
French (fr)
Inventor
Gaston R. Cangiano
Angela Jane Benitz
Original Assignee
Scientific Learning Corp.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scientific Learning Corp. filed Critical Scientific Learning Corp.
Priority to AU32180/99A priority Critical patent/AU3218099A/en
Publication of WO1999051020A2 publication Critical patent/WO1999051020A2/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4341Demultiplexing of audio and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43072Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances
    • G09B5/06Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B7/02Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student

Abstract

A synchronization module executes independently of a multimedia application and serves requests for timing data which the multimedia application uses to synchronize display of motion video and audio data of a multimedia display. The multimedia application disregards the motion video playback mechanism provided by the multimedia authoring tool by which the multimedia application is created and instead directly controls movement of individual objects of a motion video scene in accordance with time information received from the synchronization module. The timing data is relative to the beginning of playback of accompanying audio data such that audio and video of the multimedia display are synchronized regardless of the particular processing speed of the computer system within which the multimedia application executes.

Description

ANIMATION SYNCHRONIZATION FOR COMPUTER IMPLEMENTED MULTIMEDIA APPLICATIONS
SPECIFICATION
FIELD OF THE INVENTION
The present invention relates to computer-implemented multimedia applications and, in particular, to a mechanism for synchronizing sound and video playback in multimedia presentations.
BACKGROUND OF THE INVENTION
One of the fastest growing areas of computer processing is the development and proliferation of multimedia computer applications. Such multimedia computer applications include multimedia computer games, virtual reality systems, multimedia presentations, and computer implemented training systems. The combination of sound, text, and motion video in multimedia displays presents information in a particularly efficient and powerful manner such that each type of media provides a context for others of the types of media.
Multimedia computer applications are relatively new and few multimedia authoring tools are available for creating multimedia applications. A multimedia authoring tool is a computer process by which a user of a computer can create a multimedia application, i.e., a computer process which presents multimedia subject matter using one or more computer display devices including computer video display screens and/or loudspeakers. Currently, the most flexible and useful multimedia authoring tool available is the Director multimedia authoring tool available from Macromedia, Inc. of San Francisco, California.
The Director multimedia authoring tool provides a wide array of powerful multimedia processing modules but is rather limited in the manner in which a multimedia computer application created through the Director multimedia authoring tool can control the multimedia presentation created by the multimedia application. For example, the Director multimedia authoring tool provides a mechanism by which motion video can be displayed but generally provides no mechanism by which the speed at which the motion video plays can be synchronized with accompanying audio data. Specifically, displaying motion video through use of the mechanism provided by the Director multimedia authoring tool requires that each and every frame of the motion video image is displayed. If the multimedia application executes in a computer system which is particularly slow, the display of the motion video can lag behind the accompanying audio data. For example, a user can hear an animated character speak before the user sees the mouth of the character move. Conversely, if the particular computer system within which the multimedia application executes has a relatively high processing capacity, the motion video playback can outpace the playback of the accompanying audio data. Such is generally unpleasing to users and can be particularly undesirable in interactive multimedia applications in which the user is expected to react to visual and/or auditory cues. Without proper synchronization, the user may be unable to react as expected.
What is needed is a mechanism by which motion video can be synchronized with accompanying audio data by a multimedia application notwithstanding lack of synchronization mechanisms provided by the authoring tool with which the multimedia application is created.
SUMMARY OF THE INVENTION
In accordance with the present invention, a synchronization module executes independently of a multimedia application and serves requests for timing data which the multimedia application uses to synchronize display of motion video and audio data of a multimedia display. The multimedia application disregards the motion video playback mechanism provided by the multimedia authoring tool by which the multimedia application is created and instead directly controls movement of individual objects of a motion video scene in accordance with time information received from the synchronization module. The timing data is relative to the beginning of playback of accompanying audio data such that audio and video of the multimedia display are synchronized regardless of the particular processing speed of the computer system within which the multimedia application executes.
The multimedia application can be developed using the Macromedia Director multimedia authoring tool in this illustrative embodiment. Accordingly, the multimedia application includes no mechanisms by which time can be measured. Instead, the multimedia application can only initiate playback of a motion video and the individual frames of the motion video are displayed in sequence regardless of the rate at which the audio data plays.
Instead of displaying motion video as a typical motion video stream which includes a number of discrete frames, the multimedia application repeatedly redraws all objects of a scene in realtime according to time data received from the synchronization module. Specifically, the multimedia application queries the synchronization module by sending a synchronization query. The synchronization module receives the synchronization query and, in response thereto, records the current time as indicated in a clock within a processor of the computer system within which the multimedia application and the synchronization module execute. The synchronization module reports the current time to the multimedia application which then determines appropriate respective positions for all objects of the scene at the current time according to a number of motion models and draws all objects of the scene at the respective positions.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a block diagram of a computer system in which a multimedia application and synchronization module execute in accordance with the present invention.
Figures 2A-E are screen views of a motion video scene in which a ball follows a trajectory and bounces from a surface in a manner synchronized with accompanying sound in accordance with the present invention.
Figure 3 is a logic flow diagram of the control of the motion video scene of Figures 2A-E by the multimedia application and the synchronization module of Figure 1 in accordance with the present invention.
DETAILED DESCRIPTION
In accordance with the present invention, a synchronization module 150 (Figure 1) executes independently of a multimedia application 140 and serves requests for timing data which multimedia application 140 uses to synchronize display of motion video and audio data of a multimedia display. Multimedia application 140 is developed using the Macromedia Director multimedia authoring tool in this illustrative embodiment. Accordingly, multimedia application 140 includes no mechanisms by which time can be measured. Instead, multimedia application 140 can only initiate playback of a motion video and the individual frames of the motion video are displayed in sequence regardless of the rate at which the audio data plays.
Therefore, instead of displaying motion video as a typical motion video stream which includes a number of discrete images, multimedia application 140 repeatedly redraws all objects of a scene in realtime according to time data received from synchronization module 150. Specifically, multimedia application 140 queries synchronization module 150 by sending a synchronization query. Synchronization module 150 receives the synchronization query and, in response thereto, records the current time as indicated in a clock 160 within a processor 102. Synchronization module 150 reports the current time to multimedia application 140 which then determines appropriate respective positions for all objects of the scene at the current time according to a number of motion models and draws all objects of the scene at the respective positions. Thus, multimedia application 140 disregards the motion video playback mechanism provided by the multimedia authoring tool by which multimedia application 140 is created and instead directly controls movement of individual objects of the scene in accordance with time information received from synchronization module 150.
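The query-and-redraw cycle described above can be sketched in Python. This is an illustrative stand-in only: the patent's embodiment is a Director Xtra polling processor clock 160, and every name below (the class, methods, and parameters) is a hypothetical invention for this sketch, with a monotonic clock standing in for the processor clock.

```python
import time


class SynchronizationModule:
    """Stand-in for synchronization module 150: it notes the moment
    audio playback begins and answers each synchronization query with
    the time elapsed since that moment."""

    def start_audio(self):
        # A monotonic clock substitutes for clock 160 of the processor.
        self._t0 = time.monotonic()

    def query(self):
        # Receive the query, poll the clock, and report the current
        # time relative to the beginning of audio playback.
        return time.monotonic() - self._t0


def play_scene(sync, motion_models, duration, draw):
    """Repeatedly ask the module for the current audio-relative time,
    position every object of the scene for that time, and redraw,
    until the scene's time range is exceeded."""
    frames = 0
    t = sync.query()
    while t <= duration:
        # Determine each object's position from its motion model,
        # then draw all objects at those positions.
        positions = {name: model(t) for name, model in motion_models.items()}
        draw(positions)
        frames += 1
        t = sync.query()  # next synchronization query
    return frames
```

Because every redraw is driven by the audio-relative time rather than by a frame counter, a slower machine simply completes fewer iterations of the loop; it never falls behind the audio.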
The importance of synchronization of playback of motion video and accompanying audio data is particularly apparent when the user is expected to react to synchronized visual and auditory cues presented to the user during playback of a multimedia display by multimedia application 140, i.e., when multimedia application 140 is interactive. For example, the user can be expected to identify an animated character which speaks a particular word or phrase. For the user to properly identify the speaking character, the character must be shown to speak, e.g., by a moving mouth, at substantially the same time the user hears the particular word or phrase. If the audible word or phrase is not synchronized, e.g., because of the particular processing capacity of the computer system within which multimedia application 140 executes, the user may be unable to determine which of a number of animated characters is supposed to have spoken the particular word or phrase.
Multimedia application 140 and synchronization module 150 execute within a computer system 100 which is shown in Figure 1. Computer system 100 includes a processor 102 and memory 104 which is coupled to processor 102 through an interconnect 106. Interconnect 106 can be generally any interconnect mechanism for computer system components and can be, e.g., a bus, a crossbar, a mesh, a torus, or a hypercube. Processor 102 fetches from memory 104 computer instructions and executes the fetched computer instructions. Processor 102 also reads data from and writes data to memory 104 and sends data and control signals through interconnect 106 to one or more computer display devices 120 and receives data and control signals through interconnect 106 from one or more computer user input devices 130 in accordance with fetched and executed computer instructions. Processor 102 includes a clock 160 for synchronizing various operations carried out by processor 102.
Memory 104 can include any type of computer memory and can include, without limitation, randomly accessible memory (RAM), read-only memory (ROM), and storage devices which include storage media such as magnetic and/or optical disks. Memory 104 includes multimedia application 140 and synchronization module 150, each of which is all or part of one or more computer processes which in turn execute within processor 102 from memory 104. A computer process is generally a collection of computer instructions and data which collectively define a task performed by computer system 100.
Each of computer display devices 120 can be any type of computer display device including without limitation a printer, a cathode ray tube (CRT), a light-emitting diode (LED) display, or a liquid crystal display (LCD). Each of computer display devices 120 receives from processor 102 control signals and data and, in response to such control signals, displays the received data. Computer display devices 120, and the control thereof by processor 102, are conventional. One of computer display devices 120 includes a display screen 122 as shown.
In addition, loudspeaker 120D can be any loudspeaker and can include amplification and can be, for example, a pair of headphones. Loudspeaker 120D receives sound signals from audio processing circuitry 120C and produces corresponding sound for presentation to a user of computer system 100. Audio processing circuitry 120C receives control signals and data from processor 102 through interconnect 106 and, in response to such control signals, transforms the received data to a sound signal for presentation through loudspeaker 120D.
Each of user input devices 130 can be any type of user input device including, without limitation, a keyboard, a numeric keypad, or a pointing device such as an electronic mouse, trackball, lightpen, touch-sensitive pad, digitizing tablet, thumb wheels, or joystick. Each of user input devices 130 generates signals in response to physical manipulation by the listener and transmits those signals through interconnect 106 to processor 102.
As described above, multimedia application 140 and synchronization module 150 execute within processor 102 from memory 104. Specifically, processor 102 fetches computer instructions from multimedia application 140 and synchronization module 150 and executes those computer instructions. Processor 102, in executing multimedia application 140 and synchronization module 150, synchronizes playback of motion video and accompanying audio data in a multimedia presentation in a manner described more completely below.
In one embodiment, multimedia application 140 is a training game in which the user is expected to identify one of two animated characters which speaks a particular phoneme in a manner which is described more completely in co-pending U.S. Patent Application S/N 08/ , filed January 23, 1998 by William M. Jenkins, Ph.D. et al. and entitled "Adaptive Motivation for Computer-Assisted Training System" and in U.S. Patent Application S/N 08/ , filed , 1998 by William M. Jenkins, Ph.D. et al. and entitled "Method and Apparatus for Training of Sensory Perceptual System in LLI Subjects" and those descriptions are incorporated herein by reference. Briefly, a selected phoneme, e.g., "ba," is played for the user and a visual cue is displayed to indicate to the user that this phoneme is the one to identify. Each of two animated characters speaks a respective one of a pair of similar sounding phonemes, e.g., "ba" and "da." In other words, multimedia application 140 plays each of the pair of phonemes for the user while substantially simultaneously showing a respective one of two animated characters as speaking, e.g., by moving the mouth of the respective animated character. The user then identifies one of the two animated characters using conventional graphical user interface techniques. If the user identifies the animated character which spoke the previously indicated phoneme, the user has responded correctly and is rewarded. Otherwise, the user has responded incorrectly and has failed to identify the indicated phoneme.
To properly synchronize the visual and auditory cues in the multimedia display of multimedia application 140, multimedia application 140 processes according to logic flow diagram 300 (Figure 3). Loop step 302 and next step 310 define a loop in which multimedia application 140 controls animation of a number of animated objects of a scene according to steps 304-308 (Figure 3). In step 304, multimedia application 140 (Figure 1) queries synchronization module 150 by sending to synchronization module 150 a synchronization query. In one embodiment, synchronization module 150 is an Xtra as used by the Macromedia Director multimedia authoring tool. Xtras are known and are described only briefly herein for completeness. Synchronization module 150 is dynamically loaded and executed in response to a request by multimedia application 140. Synchronization module 150 shares a memory address space within memory 104 but has its own execution state and is scheduled for execution within computer system 100 concurrently with and independently of multimedia application 140.
Processing by synchronization module 150 is also shown in logic flow diagram 300 (Figure 3). In step 312, synchronization module 150 receives the synchronization query. In step 314 (Figure 3), synchronization module 150 retrieves a current time by polling clock 160 of processor 102 in a conventional manner, e.g., by invocation of clock polling mechanisms to which multimedia application 140 has no access by virtue of the multimedia authoring tool by which multimedia application 140 is created. In step 316 (Figure 3), synchronization module 150 (Figure 1) sends to multimedia application 140 a response message which includes data specifying the current time as polled from clock 160. After step 316 (Figure 3), synchronization module 150 (Figure 1) awaits the next synchronization query. Multimedia application 140 receives the response message with the current time in step 304 (Figure 3). In step 306, multimedia application 140 (Figure 1) determines appropriate respective positions for a number of objects in a motion video scene for the current time. In particular, multimedia application 140 specifies movement of each of the objects in the motion video scene as a function of time relative to initiation of the motion video scene. In step 308 (Figure 3), multimedia application 140 (Figure 1) draws the objects at the respective appropriate positions determined in step 306.
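Step 306, determining each object's position as a function of time relative to initiation of the scene, could be modeled as in the following Python sketch. The keyframe interpolation shown here is an illustrative assumption; the patent does not specify a particular motion model, and the function and parameter names are hypothetical.

```python
def make_motion_model(keyframes):
    """Build a position-versus-time function from a list of
    (time, position) keyframes sorted by time, linearly interpolating
    between adjacent keyframes.  This is one way to specify movement
    of an object as a function of time relative to initiation of the
    motion video scene."""
    def position(t):
        # Before the first keyframe, hold the initial position.
        if t <= keyframes[0][0]:
            return keyframes[0][1]
        # Between two keyframes, interpolate linearly.
        for (t0, p0), (t1, p1) in zip(keyframes, keyframes[1:]):
            if t0 <= t <= t1:
                f = (t - t0) / (t1 - t0)
                return p0 + f * (p1 - p0)
        # After the last keyframe, hold the final position.
        return keyframes[-1][1]
    return position
```

Given such a model, each iteration of the loop of steps 304-308 simply evaluates it at the current audio-relative time; no frame counting is involved, so the object's position is always correct for the time reported by the synchronization module.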
After step 308, processing by multimedia application 140 (Figure 1) transfers through next step 310 (Figure 3) to loop step 302 and therethrough to steps 304-308, in which multimedia application 140 (Figure 1) again polls synchronization module 150 for the current time according to clock 160 and redraws the objects of the motion video scene in accordance with newly determined respective positions according to the new current time. Thus, the objects of the motion video scene are repeatedly redrawn by multimedia application 140 to give the illusion of motion. When the current time as polled from synchronization module 150 exceeds the time range of the motion video scene as specified within multimedia application 140, or when rendering of the motion video scene is otherwise complete, processing according to logic flow diagram 300 (Figure 3) by multimedia application 140 (Figure 1) completes.
By determining appropriate respective positions of objects of a motion video scene for the current time in step 306 and drawing the objects at the respective appropriate positions in step 308, multimedia application 140 synchronizes the motion video animation with audio data which is playing concurrently. Specifically, it should be understood that initiation of processing according to logic flow diagram 300 (Figure 3) is substantially concurrent with initiation of playback of audio data, e.g., audio data 152, by multimedia application 140. In general, playback of audio data 152 by multimedia application 140 is asynchronous in that multimedia application 140 initiates playback of audio data 152 and can continue processing while audio data 152 is concurrently transferred through interconnect 106 to audio processing circuitry 120C for presentation through loudspeaker 120D. Typically, playback of audio data such as audio data 152 is at a specific rate such that the sound represented by the audio data is substantially faithfully reproduced for the user. Moving objects of the motion video scene in accordance with relative time ensures that the motion video scene is substantially synchronized with playback of audio data 152. The following example is illustrative.
Figures 2A-E show a motion video scene displayed in computer display 122 of Figure 1. The motion video scene includes a ball 202 (Figure 2A) following a trajectory 206 in which ball 202 bounces off of a surface 204. Trajectory 206 is not displayed for the user in computer display 122 but is shown for illustration purposes. Playback of audio data 152 (Figure 1) is ongoing during the motion video scene and is silent until a predetermined time at which audio data 152 includes a "bonk" noise to accompany bouncing of ball 202 (Figure 2A) from surface 204. In a first iteration of the loop of steps 304-310 (Figure 3), multimedia application 140 (Figure 1) determines the position of ball 202 (Figure 2A) at the current time relative to the beginning of playback of audio data 152 (Figure 1) and draws ball 202 at that position as shown in Figure 2A. In a subsequent iteration of the loop of steps 304-310 (Figure 3), multimedia application 140 (Figure 1) determines as the appropriate position of ball 202 the position shown in Figure 2B and redraws ball 202 at that position. Playback of audio data 152 remains silent through these early iterations of the loop of steps 304-310 (Figure 3).
Figure 2C shows the motion video scene corresponding to an iteration of the loop of steps 304-310 (Figure 3) at approximately the predetermined time at which audio data 152 (Figure 1) includes the "bonk" noise. Accordingly, the appropriate position of ball 202 is adjacent surface 204 and multimedia application 140 (Figure 1) displays ball 202 (Figure 2C) in this position regardless of the relative processing speed of computer system 100 (Figure 1) and, accordingly, the number of iterations of the loop of steps 304-310 (Figure 3) prior to the predetermined time of the "bonk" noise. Therefore, regardless of the number of discrete positions of ball 202 prior to the predetermined time, ball 202 is shown to bounce from surface 204 at the predetermined time at which the "bonk" noise is substantially simultaneously presented to the user through loudspeaker 120D.
In subsequent iterations of the loop of steps 304-310 (Figure 3), multimedia application 140 (Figure 1) shows ball 202 at positions shown in Figures 2D and 2E and playback of audio data 152 (Figure 1) is silent. Accordingly, the motion video scene shown in Figures 2A-E is substantially synchronized with playback of audio data 152. As a result, proper correlation in the perception of the user of related video and auditory components of the multimedia display of multimedia application 140 is improved substantially. The above description is illustrative only and is not limiting. The present invention is defined only by the claims which follow.
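The bouncing-ball example above can be captured by a position model keyed to the predetermined "bonk" time. The following Python sketch is illustrative only: the parabolic trajectory and all names are assumptions, since the patent describes the principle but not a specific formula.

```python
def ball_height(t, bonk_time, start_height):
    """Height of the ball at audio-relative time t.  The model is keyed
    so the ball reaches the surface (height 0) exactly at bonk_time,
    the predetermined time of the "bonk" noise in the audio, and then
    rebounds symmetrically.  Because height depends only on t, a slow
    machine draws fewer intermediate frames, but the bounce still
    coincides with the sound."""
    u = abs(t - bonk_time) / bonk_time  # 0 at the bounce, 1 at the ends
    return start_height * u * u         # parabolic fall and rebound
```

Evaluating this model at whatever times the synchronization module reports, whether that yields many frames or few, guarantees the ball is drawn touching the surface at the moment the "bonk" plays.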

Claims

What is claimed is:
1. A method for synchronizing motion video in a multimedia presentation by a multimedia application created by a multimedia authoring tool, the method comprising: forming a request for time information; sending the request to a synchronization module which executes concurrently with the multimedia application in such a manner that causes the synchronization module to respond with time information; determining respective positions of one or more objects in a motion video scene at a time indicated by the time information; and drawing the one or more objects at the respective positions.
2. The method of Claim 1 further comprising repeating the forming, sending, determining, and drawing.
3. The method of Claim 1 wherein the time information specifies a time relative to initiation of concurrently playing audio which corresponds to the motion video.
4. A computer readable medium useful in association with a computer which includes a processor and a memory, the computer readable medium including computer instructions which are configured to cause the computer to synchronize motion video in a multimedia presentation by a multimedia application created by a multimedia authoring tool by: forming a request for time information; sending the request to a synchronization module which executes concurrently with the multimedia application in such a manner that causes the synchronization module to respond with time information; determining respective positions of one or more objects in a motion video scene at a time indicated by the time information; and drawing the one or more objects at the respective positions.
5. The computer readable medium of Claim 4 wherein the computer instructions are further configured to cause the computer to synchronize motion video in a multimedia presentation by a multimedia application created by a multimedia authoring tool by repeating the forming, sending, determining, and drawing.
6. The computer readable medium of Claim 4 wherein the time information specifies a time relative to initiation of concurrently playing audio which corresponds to the motion video.
7. A computer system comprising: a processor; a memory operatively coupled to the processor; and a multimedia application (i) which executes in the processor from the memory and (ii) which is created by a multimedia authoring tool and (iii) which, when executed by the processor, causes the computer to synchronize motion video in a multimedia presentation by a multimedia application created by a multimedia authoring tool by: forming a request for time information; sending the request to a synchronization module which executes concurrently with the multimedia application in such a manner that causes the synchronization module to respond with time information; determining respective positions of one or more objects in a motion video scene at a time indicated by the time information; and drawing the one or more objects at the respective positions.
8. The computer system of Claim 7 wherein the multimedia application is further configured to cause the computer to synchronize motion video in a multimedia presentation by a multimedia application created by a multimedia authoring tool by repeating the forming, sending, determining, and drawing.
9. The computer system of Claim 7 wherein the time information specifies a time relative to initiation of concurrently playing audio which corresponds to the motion video.
PCT/US1999/006983 1998-03-31 1999-03-30 Animation synchronization for computer implemented multimedia applications WO1999051020A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU32180/99A AU3218099A (en) 1998-03-31 1999-03-30 Animation synchronization for computer implemented multimedia applications

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US5314398A 1998-03-31 1998-03-31
US09/053,143 1998-03-31

Publications (1)

Publication Number Publication Date
WO1999051020A2 true WO1999051020A2 (en) 1999-10-07

Family

ID=21982216

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/006983 WO1999051020A2 (en) 1998-03-31 1999-03-30 Animation synchronization for computer implemented multimedia applications

Country Status (2)

Country Link
AU (1) AU3218099A (en)
WO (1) WO1999051020A2 (en)

Also Published As

Publication number Publication date
AU3218099A (en) 1999-10-18

Similar Documents

Publication Publication Date Title
US10657727B2 (en) Production and packaging of entertainment data for virtual reality
US10872535B2 (en) Facilitating facial recognition, augmented reality, and virtual reality in online teaching groups
US5613056A (en) Advanced tools for speech synchronized animation
JP4225567B2 (en) Spatial access method for time-based information
US5526480A (en) Time domain scroll bar for multimedia presentations in a data processing system
EP1768011A1 (en) Inner force sense presentation device, inner force sense presentation method, and inner force sense presentation program
KR20010074565A (en) Virtual Reality System for Screen/Vibration/Sound
EP3264222B1 (en) An apparatus and associated methods
JPH09500741A (en) How to rewind a time-based script sequence
WO2012020242A2 (en) An augmented reality system
Miner et al. Computational requirements and synchronization issues for virtual acoustic displays
Pressing Some perspectives on performed sound and music in virtual environments
Schertenleib et al. Conducting a virtual orchestra
CN113810837B (en) Synchronous sounding control method of display device and related equipment
Cohen et al. Development and experimentation with synthetic visible speech
US11825170B2 (en) Apparatus and associated methods for presentation of comments
WO1999051020A2 (en) Animation synchronization for computer implemented multimedia applications
CN109616117A (en) A kind of mobile phone games control system and method based on speech recognition technology
JP4529360B2 (en) Body sensation apparatus, motion signal generation method and program
JP2000003171A (en) Fingering data forming device and fingering display device
JP2001268493A (en) Video reproducing device, video reproducing method and information recording medium
CN114842690B (en) Pronunciation interaction method, system, electronic equipment and storage medium for language courses
KR200241789Y1 (en) Virtual Reality System for Screen/Vibration/Sound
JP2000098866A (en) Kanji learning system
JP3512507B2 (en) 3D image processing device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM HR HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
NENP Non-entry into the national phase

Ref country code: KR

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase