WO2002017179A1 - Electronic study method using multimedia - Google Patents

Electronic study method using multimedia

Info

Publication number
WO2002017179A1
Authority
WO
WIPO (PCT)
Prior art keywords
lecture
producer
file
event
timetable
Prior art date
Application number
PCT/KR2001/000003
Other languages
French (fr)
Inventor
Jung-Hoon Bae
Original Assignee
4C Soft Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 4C Soft Inc filed Critical 4C Soft Inc
Priority to AU2001224098A priority Critical patent/AU2001224098A1/en
Publication of WO2002017179A1 publication Critical patent/WO2002017179A1/en

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 5/00: Electrically-operated educational appliances
    • G09B 7/00: Electrically-operated teaching apparatus or devices working with questions and answers

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

The present invention relates to an electronic learning method using multimedia wherein, by enabling a producer to control the generation time and duration of events employed in producing the contents for lectures, problems encountered in a real-time production process thereof can be solved and effective iterative learning can be achieved by implementing synchronization of lecture contents with audio files corresponding thereto. In the present invention, there is provided an electronic learning method using multimedia, comprising the steps of inputting an event time by a producer, recording lecture contents by the producer, and creating the lecture contents into a file by the producer, whereby the producer can produce a lecture file. Further, there is provided an electronic learning method using multimedia, comprising the steps of loading a lecture file by a computer, creating a timetable for a lecture schedule of the lecture file and playing back the lecture file by the computer, and selecting an arbitrary page by a learner, whereby the learner can play back the lecture file.

Description

ELECTRONIC STUDY METHOD USING MULTIMEDIA
Technical Field
The present invention relates to an electronic learning method using multimedia, and more particularly, to an electronic learning method using multimedia wherein by enabling a producer to control the generation time and duration of events employed in producing the contents for lectures, problems encountered in a real-time production process thereof can be solved and effective iterative learning can be achieved through synchronization of lecture contents with audio files corresponding thereto.
Background Art
In this specification, the term "events" refers to all kinds of object elements (for example, images, full-motion videos, straight lines and quadrangles) used to construct the lecture contents.
Generally, a program for remote lecturing is broadly divided into two categories. The first is a lecturer program for producing lectures; the other is a learner program for playing back the produced contents for the lectures. The conventional contents for remote lecturing comprise a combination of the object elements referred to as events, such as voice files of a lecturer, progressing screens of the lecturer or full-motion videos, lines, figures and image files. By properly combining these events in an intended sequence, the lecturer can show the lecture contents that the lecturer wants to communicate. In the prior art, however, the sequence and time for generating the respective events should be determined before the recording thereof in order to produce relatively perfect contents. This was inconvenient in that the lecturer had to generate specific events at the exact moment the lecturer wanted them executed, while recording the lecture.
In such a case, when the lecturer fails to generate the events at the desired time, the identical voice recording has to be repeated, and unnecessary processes such as synchronization of the sequence and duration of the events must also be repeated. Furthermore, the generation time and duration of the events cannot be accurately assigned within the given lecture time. Since a plurality of events cannot be simultaneously inputted, plural events cannot be generated at the same time. In addition, in a case where a specific event has to be modified, it cannot be modified on its own, so the entire lecture contents have to be modified. Furthermore, in the conventional learning program, when the learner tries to arbitrarily change the present page to a specific page during the learning, only the lecture page is changed and the lecturer's voice corresponding to the changed page is not synchronized with it. Since the lecturer's voice and the page contents thus proceed independently of each other, the learner cannot learn from the specific part that the learner wants.
Disclosure of Invention
The present invention is conceived to solve the above problems. An object of the present invention is to provide an electronic learning method using multimedia wherein, by enabling a producer to control the generation time and duration of events employed when producing the contents for lectures, problems encountered in a real-time production process thereof can be solved and effective iterative learning can be achieved through synchronization of lecture contents with audio files corresponding thereto.
The electronic learning method using multimedia according to the present invention comprises the steps of inputting an event time by a producer, recording lecture contents by the producer, and creating the lecture contents into a file by the producer, whereby the producer can create a lecture file.
Furthermore, the electronic learning method using multimedia according to the present invention comprises the steps of loading a lecture file by a computer, creating a timetable for a lecture schedule of the lecture file and playing back the lecture file by the computer, and selecting an arbitrary page by a learner, whereby the learner can play back the lecture file.
The step of creating the lecture file by the producer may further comprise a step of checking an event during the recording of the lecture contents when the producer does not directly input the event time.
The step of inputting the event time by the producer may further comprise a step of dividing data corresponding to the event time into page data and event data and storing the data.
The step of recording the lecture contents by the producer may also further comprise a step of recording the lecture contents after data corresponding to the event time has already been inputted. The step of selecting the arbitrary page by the learner may further comprise a step of reading a timetable corresponding to an application time of the selected page.
The step of reading the timetable may further comprise a step of playing back a next timetable according to whether or not the timetable comes to an end, in a case where an event address of the timetable is empty. The timetable may be formed into a multi-dimensional array of a tree structure, and each item of this array may comprise memory addresses of event data and page data.
Brief Description of Drawings
FIG. 1 is a flowchart showing a process of producing a lecture file according to the present invention.
FIG. 2 is a flowchart showing a process of playing back the lecture file according to the present invention.
FIG. 3 shows the constitution of a timetable according to the present invention. FIG. 4 shows a lecture-producing window according to the present invention.
Best Mode for Carrying Out the Invention
Hereinafter, a preferred embodiment of an electronic learning method using multimedia according to the present invention will be explained in detail with reference to the accompanying drawings.
An electronic learning method using multimedia according to the present invention is shown in FIG. 1. As shown, in a process of producing a lecture file according to the present invention, data corresponding to an event time are divided into page data and event data at step 101 where the event time is inputted, and are then stored (step 106). In a case where the event time is not directly inputted, the events can be checked during the recording of the lecture contents (step 102). Further, the generation time, generation sequence and duration of the events previously produced during the recording can be adjusted and arranged, and the events can be individually modified. When the check is completed (step 105), the recording is stopped (step 109); the lecture data are generated (step 110); and then the lecture data, the page data and the event data are stored (step 111). Next, the lecture file is created (step 112), and then the process of producing the lecture file is completed (step 113).
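For illustration only, the production flow of FIG. 1 might be traced in Python as in the sketch below. The function names, dictionary fields and the use of a JSON file are assumptions made for this example and are not taken from the patent.

```python
import json

def split_event_times(event_times):
    # Steps 101 and 106: divide the inputted event-time data into page data and event data.
    page_data = [item for item in event_times if item["kind"] == "page"]
    event_data = [item for item in event_times if item["kind"] == "event"]
    return page_data, event_data

def produce_lecture_file(event_times, audio_file, path):
    page_data, event_data = split_event_times(event_times)
    # Steps 109 to 111: after recording stops, generate the lecture data and store everything.
    lecture_data = {"audio": audio_file,
                    "total_time": max(item["time"] for item in event_times)}
    # Step 112: create the lecture file.
    with open(path, "w") as f:
        json.dump({"lecture": lecture_data, "pages": page_data, "events": event_data}, f)

produce_lecture_file(
    [{"kind": "page", "time": 0, "image": "page1.jpg"},
     {"kind": "event", "time": 5, "page": 1, "type": "line"}],
    audio_file="lecture.wav",
    path="lecture.json")
```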
On the other hand, when the producer executes the lecture file, the lecture data are instantly loaded into memory. When the first page is opened, a pointer page data function for allotting a data pointer in the form of the page data is executed. Then, the pointer allotted to the event data is stored, and the name of the image file used and an application time of the page are stored. Thereafter, when the events are executed in the page, a pointer event data function used for allotting a data pointer in the form of the event data is executed. Subsequently, the pointer allotted to the page data is stored, and the event data are stored in the memory space indicated by the pointer. Whenever pages or events are added, the page data function or event data function is executed and memory is accordingly allotted. The addresses of the allotted memory are stored in the event data or page data. When there is no additional page or event, a null value is stored in the memory address. When the producer clicks a save button or a lecture-creating button, the data temporarily stored in the memory are stored in the lecture file. In order to input the generation time of the individual events, check boxes of the events may be checked or the duration of the events may be directly inputted at a desired time during the recording. The checked or inputted time is then stored in the memory. Similarly, when the page is changed, the time at which the page should be changed is also stored in the memory.
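As a rough illustration of the page data and event data just described, the sketch below models the allotted pointers as Python object references and the null value as None. The class and field names are assumptions for this example, not the patent's own code.

```python
class EventData:
    def __init__(self, contents, page):
        self.page = page          # pointer back to the page data the event belongs to
        self.contents = contents  # the event itself (line, figure, image file, ...)
        self.next = None          # next event of the page, or None (the null value)

class PageData:
    def __init__(self, image_file, application_time):
        self.image_file = image_file              # name of the image file used
        self.application_time = application_time  # application time of the page (seconds)
        self.first_event = None                   # pointer to the first event, or None

    def add_event(self, contents):
        # Allot a new event and store its pointer in the page data.
        event = EventData(contents, page=self)
        event.next = self.first_event
        self.first_event = event
        return event

page = PageData("page1.jpg", application_time=0)
page.add_event({"type": "line", "time": 5})
```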
FIG. 2 is a flowchart showing a process of playing back the lecture file. The stored lecture file is loaded (step 201), the time of the lecture data is read (step 202), and thus a timetable is created (step 203). The timetable is created while checking the lecture time of the lecture file, and the current time duration of the page is initialized to zero (step 204). When an arbitrary page is selected (step 205), the timetable reads the current time duration (step 207). If there are any contents in a subsequent page, the contents are executed (step 211). Otherwise, it is determined, according to whether or not the timetable comes to an end, whether the next timetable will be played back. The lecture is then terminated when the timetable comes to an end (steps 209 and 214).
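A minimal sketch of this playback loop, assuming the timetable is simply a list with one slot per second of the lecture (a linked-list realization is sketched after the next paragraph), could look as follows; None marks a second at which no event is generated, and all names are illustrative.

```python
import time

def play_back(timetable, selected_page_time=0):
    current = selected_page_time          # steps 204/205: start from the selected page's time
    while current < len(timetable):       # steps 209/214: stop when the timetable ends
        slot = timetable[current]         # step 207: read the current time duration
        if slot is not None:
            for event in slot:            # step 211: execute the contents of this second
                print(f"{current:>3} s: execute {event}")
        time.sleep(1)                     # a real player would render audio and pages here
        current += 1

play_back([["open page #A"], None, ["draw line"], None], selected_page_time=0)
```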
On the other hand, when the learner executes the lecture file, the lecture time in the lecture data stored in the lecture file is read, and an array in the form of a timetable is created. Each item in the array has a linked-list structure containing the addresses of the event data. The memory address of the event that occurs first at each time is stored in the corresponding item of the array. When there is no event to be generated at that time, the item holds a null address. In addition, in a case where a plurality of events are generated at the same time, the first list item holds the memory address of the second one. Similarly, when there is no further event generated at the same time, the address of the next linked list is indicated as a null address. FIG. 3 shows the constitution of the timetable of the present invention. Each item in the array 1 of the timetable 300 holds the memory address of a page. The address #A of the contents of the page to be generated at the next time (second) has memory addresses #A-1 and #A-2 (in 301). The address #B of the contents of the events to be generated in the page concurrently with the opening thereof is stored in the memory address #A-2 (in 302). When there is no additional page or event, a null value is stored in the memory address.
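One way to realize the timetable of FIG. 3 is sketched below, assuming the lecture time is measured in whole seconds: an array with one item per second, each item holding the first event generated at that second, further simultaneous events chained through next links, and None standing in for the null addresses of the figure. All names here are assumptions for illustration.

```python
class TimetableEntry:
    def __init__(self, event):
        self.event = event  # reference (memory address) of the event data
        self.next = None    # next event generated at the same second, or None (null address)

def build_timetable(total_seconds, events):
    """events: iterable of (application_time_in_seconds, event_description) pairs."""
    table = [None] * (total_seconds + 1)      # one item per second of the lecture
    for application_time, event in events:
        entry = TimetableEntry(event)
        entry.next = table[application_time]  # chain events generated at the same time
        table[application_time] = entry
    return table

# Second 0 opens page #A and simultaneously generates event #B; second 5 draws a line.
timetable = build_timetable(10, [(0, "open page #A"), (0, "event #B"), (5, "draw line")])
```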
On the other hand, according to the electronic learning method using multimedia for accomplishing the object of the present invention, the lecture page is produced by using the events. Referring to FIG. 4, a lecture-producing window 400 comprises a menu item section 440 at a top side thereof, an event list 410 at a central left side thereof, a lecture page 430 at a central right side thereof, and a page list 420 at a bottom side thereof. A menu for inputting an event time operates in a lecture-producing program, and comprises a part 411 for indicating the total time of the lecture at a top portion thereof, and a part 412 including a number portion, an application time portion, a page number portion, a description portion and a type portion. Furthermore, the number portion indicates an event number and comprises check boxes and icons 413. The application time portion is an input portion for inputting the time at which the events should be generated after the lecture has been initiated. The page number portion indicates the lecture page number to which the event belongs. The description portion indicates additional descriptions of the events. The type portion indicates the types of the various events, such as lines, full-motion video files, image files and figures.
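An illustrative record for a single row of the event list 410 could look as follows; the field names simply mirror the columns described above and are assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class EventListRow:
    number: int            # event number, shown with a check box and icon 413
    application_time: int  # seconds after the lecture starts at which the event is generated
    page_number: int       # lecture page to which the event belongs
    description: str       # additional description of the event
    event_type: str        # "line", "full-motion video file", "image file", "figure", ...
    checked: bool = False  # whether the check box has been ticked during recording

row = EventListRow(1, application_time=12, page_number=1,
                   description="emphasis of the subject", event_type="line")
```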
There are two methods for determining the sequences and generation time of the events. In the first method, the producer produces and saves the events beforehand regardless of the generation sequences. When a recording button is clicked, the countdown is then started. At this time, by checking each check box of the desired events, the relevant events can be generated at the checked time upon the next playing back. Thus, the sequences of the events can be rearranged according to the checked sequence of the check boxes.
In the second method, the application time can be inputted, in units of seconds, into the menu for inputting the event time, after the time at which the events should occur has been determined beforehand. In this case, events having the same inputted time are generated simultaneously. In addition, when a specific event is to be modified, it can be modified individually simply by selecting it. In a preferred embodiment of the present invention, in a case where a null address is stored in the timetable, no event is generated at the corresponding time.
The address #A of the contents to be generated at a next time (second) has the memory addresses #A-1, #A-2, and the address #B of the contents of the events to be generated concurrently with the opening of the page is stored in the memory address #A-2.
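As a small illustration of the second method described above, grouping the rows of the event list by their inputted application time shows which events end up in the same timetable slot and are therefore generated simultaneously; the snippet and its names are assumptions for this example only.

```python
from collections import defaultdict

def group_by_application_time(rows):
    slots = defaultdict(list)
    for row in rows:
        slots[row["application_time"]].append(row["description"])
    return dict(slots)

print(group_by_application_time([
    {"application_time": 12, "description": "explanation of the subject"},
    {"application_time": 12, "description": "load auxiliary image file"},  # same second
    {"application_time": 30, "description": "emphasis of the subject"},
]))
# {12: ['explanation of the subject', 'load auxiliary image file'],
#  30: ['emphasis of the subject']}
```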
In another preferred embodiment of the present invention, in a case where the lecture contents are composed of a sequence of events such as the lecturer's greetings, explanation of the subject, emphasis of the subject, and loading of auxiliary image files, the lecturer can prepare beforehand events such as a full-motion video file containing the greetings, writing of the lecture subject on the blackboard, coloring of the subject, and the auxiliary image files. The lecturer then first confirms the playback time of the full-motion video file, and inputs the generation times of the explanation of the subject, the emphasis of the subject, the auxiliary image files, and so on. Alternatively, the check boxes of the events that follow immediately after the playback of the full-motion video file ends can be checked. Thus, the relevant events are generated, and a teaching plan constructed with such sequences can be shown to the learner while the generation time and sequence of the events are maintained upon playing back.
In a further preferred embodiment of the present invention, in a case where an English learner wants to repeatedly and intensively learn only a pronunciation part, the pronunciation part of the audio file synchronized with the relevant page can be repeated and learned by clicking only on a corresponding page without having to listen again to the lecture from the beginning.
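The page-synchronized repetition described above amounts to seeking the lecture audio to the application time of the selected page, roughly as sketched here; the seek callback and the list of pages are assumptions made for the example.

```python
def repeat_page(pages, seek_audio, page_number):
    """pages: list of (page_number, application_time_in_seconds) pairs."""
    for number, application_time in pages:
        if number == page_number:
            seek_audio(application_time)  # jump the synchronized audio to the page's time
            return application_time
    raise ValueError(f"page {page_number} is not in the lecture file")

pages = [(1, 0), (2, 45), (3, 110)]
repeat_page(pages, seek_audio=lambda t: print(f"audio seeks to {t} s"), page_number=2)
```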
Industrial Applicability
According to the electronic learning method using multimedia of the present invention, the producer can adjust and arrange the generation time, sequence and duration of the events produced before the recording of the lecture, without needing to produce the events during the recording of the lecture contents. Further, since a plurality of events can be inputted simultaneously, multiple events can be generated at the same time. In addition, the events can be individually modified, and thus the producer can easily produce the lecture file and proceed with the lecture smoothly. In this way, unnecessary repetition can also be eliminated. On the other hand, the learner can obtain an improved learning effect by virtue of relatively smooth and complete lecture contents. Further, since the contents of the page are fully synchronized with the lecturer's voice corresponding thereto, the advantage of an online learning system, namely that the learner can repeatedly play back a specific page or a desired part at any time during the learning, can be utilized to a considerable extent.

Claims

1. An electronic learning method using multimedia, comprising the steps of: inputting an event time by a producer; recording lecture contents by said producer; and creating said lecture contents into a file by said producer, whereby said producer can produce a lecture file.
2. An electronic learning method using multimedia, comprising the steps of: loading a lecture file by a computer; creating a timetable for a lecture schedule of said lecture file and playing back said lecture file by said computer; and selecting an arbitrary page by a learner, whereby said learner can play back said lecture file.
3. The method as claimed in Claim 1, wherein said step of creating said lecture file by said producer further comprises a step of checking an event during the recording of said lecture contents when said producer does not directly input said event time.
4. The method as claimed in Claim 1, wherein said step of inputting said event time by said producer further comprises a step of dividing data corresponding to said event time into page data and event data and storing said data.
5. The method as claimed in Claim 1, wherein said step of recording said lecture contents by said producer further comprises a step of recording said lecture contents after data corresponding to said event time has already been inputted.
6. The method as claimed in Claim 2, wherein said step of selecting said arbitrary page by said learner further comprises a step of reading a timetable corresponding to an application time of said selected page.
7. The method as claimed in Claim 6, wherein said step of reading said timetable further comprises a step of playing back a next timetable according to whether or not said timetable comes to an end, in a case where an event address of said timetable is empty.
8. The method as claimed in Claim 2, wherein said timetable is formed into a multi-dimensional array of a tree structure, and each item of said array comprises memory addresses of event data and page data.
PCT/KR2001/000003 2000-08-25 2001-01-02 Electronic study method using multimedia WO2002017179A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001224098A AU2001224098A1 (en) 2000-08-25 2001-01-02 Electronic study method using multimedia

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR2000/49668 2000-08-25
KR20000049668 2000-08-25

Publications (1)

Publication Number Publication Date
WO2002017179A1 true WO2002017179A1 (en) 2002-02-28

Family

ID=19685243

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2001/000003 WO2002017179A1 (en) 2000-08-25 2001-01-02 Electronic study method using multimedia

Country Status (2)

Country Link
AU (1) AU2001224098A1 (en)
WO (1) WO2002017179A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7721308B2 (en) 2005-07-01 2010-05-18 Microsoft Corproation Synchronization aspects of interactive multimedia presentation management
US7941522B2 (en) 2005-07-01 2011-05-10 Microsoft Corporation Application security in an interactive media environment
US8020084B2 (en) 2005-07-01 2011-09-13 Microsoft Corporation Synchronization aspects of interactive multimedia presentation management
US8108787B2 (en) 2005-07-01 2012-01-31 Microsoft Corporation Distributing input events to multiple applications in an interactive media environment
US8305398B2 (en) 2005-07-01 2012-11-06 Microsoft Corporation Rendering and compositing multiple applications in an interactive media environment
US8656268B2 (en) 2005-07-01 2014-02-18 Microsoft Corporation Queueing events in an interactive media environment
US8799757B2 (en) 2005-07-01 2014-08-05 Microsoft Corporation Synchronization aspects of interactive multimedia presentation management

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR19980043574U (en) * 1996-12-26 1998-09-25 박병재 Fuel injection quantity control device of car
JP2000059724A (en) * 1998-08-11 2000-02-25 Toshiba Syst Technol Corp Multimedia authoring method, its system and recording medium thereof

Also Published As

Publication number Publication date
AU2001224098A1 (en) 2002-03-04

Similar Documents

Publication Publication Date Title
Boyer From media anthropology to the anthropology of mediation
US6308042B1 (en) Computer based training system
US8001211B2 (en) Convergence-enabled DVD and web system
US20030211447A1 (en) Computerized learning system
JPH10145762A (en) Method for synchronizing presentation of static and dynamic components for interactive multimedia document
US4956806A (en) Method and apparatus for editing source files of differing data formats using an edit tracking file
US20030142097A1 (en) Electronic assembly procedure manual system
WO2002017179A1 (en) Electronic study method using multimedia
CN110299036A (en) Interaction reading method, device, system and storage medium
CN104572712A (en) Multimedia file browsing system and multimedia file browsing method
US20020136529A1 (en) Caption subject matter creating system, caption subject matter creating method and a recording medium in which caption subject matter creating program is stored
US6211868B1 (en) Editing method in a multimedia synchronous training system
KR20000069830A (en) Method for coding a presentation
CN110362675A (en) A kind of foreign language teaching content displaying method and system
CN104035824B (en) Clone method is operated between browser window
KR101161693B1 (en) Objected, and based on XML CMS with freely editing solution
KR100442417B1 (en) Educational digital contents in accordance with the objects divided by concept unit and a processing method via the contents
TWI262455B (en) Method and apparatus for POP-IN learning system
JPS6324276A (en) Learning apparatus
Omarali et al. Radio and audio in 2018
BE1006555A6 (en) A method for producing a control system display interface.
Shih et al. Multimedia presentation design using data flow diagrams
CN116685987A (en) Information processing device, information processing method, and information processing program
CN115002084A (en) Handwriting processing method and device, electronic equipment and storage medium
EP0403123B1 (en) Object addressability in data processing systems

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP