CN112468679B - Method and device for synchronously playing audio and video courseware and electronic equipment - Google Patents


Info

Publication number
CN112468679B
CN112468679B (granted from application CN202110153341.2A)
Authority
CN
China
Prior art keywords
audio
video
event
tag
courseware
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110153341.2A
Other languages
Chinese (zh)
Other versions
CN112468679A (en)
Inventor
范旭宇
王浩玮
陈磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Tuoke Network Technology Co ltd
Original Assignee
Beijing Tuoke Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Tuoke Network Technology Co ltd filed Critical Beijing Tuoke Network Technology Co ltd
Priority to CN202110153341.2A priority Critical patent/CN112468679B/en
Publication of CN112468679A publication Critical patent/CN112468679A/en
Application granted
Publication of CN112468679B publication Critical patent/CN112468679B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/04: Synchronising
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00: Electrically-operated educational appliances
    • G09B5/08: Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention provides a method, a device, and electronic equipment for synchronously playing audio and video courseware. The method includes: traversing all audio and video tags in the H5 courseware and setting a monitoring event; determining an operation event of a target audio/video tag according to the monitoring event and terminating the operation event; extracting the corresponding audio and video resource and determining a corresponding operation instruction according to the operation event; and processing the audio and video resource with a preset processing process according to the operation instruction, where the processing process is the one used to process the audio and video data transmitted by the user. With the method, device, and electronic equipment for synchronously playing audio and video courseware provided by the embodiments of the invention, the audio and video resources of the H5 courseware and the audio and video data transmitted by a user can be processed synchronously by a single processing process; even when the processing process plays the two kinds of audio and video simultaneously, the volume of neither is reduced, so synchronous processing of the two kinds of audio and video is achieved.

Description

Method and device for synchronously playing audio and video courseware and electronic equipment
Technical Field
The invention relates to the technical field of audio and video data processing, in particular to a method and a device for synchronously playing audio and video courseware, electronic equipment and a computer readable storage medium.
Background
At present, most applications installed on intelligent devices can play audio, but most of them play only one audio stream at a given time; even when multiple audio streams need to be played simultaneously, for example in a multi-person conversation, they can be played normally because they are of the same kind.
In an online education scene, the teacher's voice (or a student's voice) needs to be played; meanwhile, to reduce the amount of data transmitted over the network, only the H5 (HTML5) courseware, which has a smaller data volume, is transmitted instead of sharing the teacher's desktop, and the student-side application only needs to load the H5 courseware. However, the H5 courseware may also contain audio, which means the teacher's voice and the audio in the H5 courseware need to be played at the same time.
In some operating systems, such as iOS, when the audio in the H5 courseware is played, the incoming sound of the teacher or student causes the volume of the H5 courseware audio to be lowered, so the two kinds of sound cannot both be played properly at the same time.
Disclosure of Invention
In order to solve the existing technical problems, embodiments of the present invention provide a method, an apparatus, an electronic device, and a computer-readable storage medium for synchronously playing audio and video courseware.
In a first aspect, an embodiment of the present invention provides a method for synchronously playing audio and video courseware, including:
traversing all audio and video tags in the H5 courseware, and setting a monitoring event for monitoring the audio and video tags;
when the monitoring event determines that the audio/video tag changes, the changed audio/video tag is used as a target audio/video tag, an operation event of the target audio/video tag is determined according to the monitoring event, and the operation event is terminated;
extracting corresponding audio and video resources according to the audio and video resource address in the target audio and video tag, and determining a corresponding operation instruction according to the operation event;
and processing the audio and video resources by a preset processing process according to the operation instruction, wherein the processing process is used for processing audio and video data transmitted by a user.
In a second aspect, an embodiment of the present invention further provides a device for synchronously playing audio and video courseware, where the device includes:
the preprocessing module is used for traversing all audio and video tags in the H5 courseware and setting a monitoring event for monitoring the audio and video tags;
the monitoring module is used for taking the changed audio/video tag as a target audio/video tag when the monitoring event determines that the audio/video tag changes, determining an operation event of the target audio/video tag according to the monitoring event and terminating the operation event;
the determining module is used for extracting corresponding audio and video resources according to the audio and video resource address in the target audio and video tag and determining a corresponding operation instruction according to the operation event;
and the processing module is used for processing the audio and video resources according to the operation instruction by using a preset processing process, and the processing process is used for processing audio and video data transmitted by a user.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a bus, a transceiver, a memory, a processor, and a computer program that is stored in the memory and is executable on the processor, where the transceiver, the memory, and the processor are connected via the bus, and when the computer program is executed by the processor, the steps in any one of the above methods for synchronously playing audio and video courseware are implemented.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps in the method for synchronously playing the audio and video courseware described in any one of the above.
According to the method, device, electronic equipment, and computer-readable storage medium for synchronously playing audio and video courseware provided by the embodiments of the invention, a monitoring event is set to monitor the audio and video tags in the H5 courseware; when an operation event is monitored, the operation event is terminated, and the processing process processes the corresponding audio and video resource according to the operation instruction corresponding to that operation event. The intercepted audio and video resource is thus handled by the processing process, and when audio and video data transmitted by a user (a student or a teacher) needs to be played, the same processing process can handle it at the same time. A single processing process therefore processes the audio and video resources of the H5 courseware and the audio and video data transmitted by the user synchronously; even when the processing process plays the two kinds of audio and video simultaneously, the volume of neither is reduced, and the two kinds of audio and video are processed synchronously.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present invention, the drawings required to be used in the embodiments or the background art of the present invention will be described below.
Fig. 1 is a flowchart illustrating a method for synchronously playing audio and video courseware according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram illustrating an apparatus for synchronously playing audio and video courseware according to an embodiment of the present invention;
fig. 3 shows a schematic structural diagram of an electronic device for executing a method for synchronously playing audio and video courseware according to an embodiment of the present invention.
Detailed Description
In the description of the embodiments of the present invention, it should be apparent to those skilled in the art that the embodiments of the present invention can be embodied as methods, apparatuses, electronic devices, and computer-readable storage media. Thus, embodiments of the invention may be embodied in the form of: entirely hardware, entirely software (including firmware, resident software, micro-code, etc.), a combination of hardware and software. Furthermore, in some embodiments, embodiments of the invention may also be embodied in the form of a computer program product in one or more computer-readable storage media having computer program code embodied in the medium.
The computer-readable storage media described above may take any combination of one or more computer-readable storage media. The computer-readable storage medium includes: an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples of the computer-readable storage medium include: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash memory, an optical fiber, a Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any combination thereof. In embodiments of the invention, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer program code embodied on the computer-readable storage medium may be transmitted using any appropriate medium, including: wireless, wireline, optical fiber cable, radio frequency (RF), or any suitable combination thereof.
Computer program code for carrying out operations of embodiments of the present invention may be written in assembly instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, integrated circuit configuration data, or in one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as C or similar languages. The computer program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer.
The method, the device, and the electronic equipment are described below with reference to flowcharts and/or block diagrams.
It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions. These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer or other programmable data processing apparatus to function in a particular manner. Thus, the instructions stored in the computer-readable storage medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The embodiments of the present invention will be described below with reference to the drawings.
In current Android operating systems, when sounds from multiple sources are played, the sources do not affect one another; that is, the volume of no source is reduced. In operating systems such as iOS, however, when two or more sound sources play simultaneously, the system reduces the volume of every source except a designated one, so that the designated source is louder than the others and the user can hear it clearly. For example, when a short message arrives while music is playing, the music volume is temporarily lowered. When the H5 courseware is played, it is loaded into the application, so the H5 courseware and the teacher's voice ought to be played as the same sound source through that application; in practice, however, the volume of the H5 courseware audio is still reduced. In the course of creating the present invention, the inventors found that this is because the H5 courseware is loaded with WKWebView (a control used by iOS to load web pages), which opens a new process, while the voice of the teacher or student is played through another process; because the processes differ, the two sounds are treated as two sound sources. The embodiments of the present invention therefore play both sounds through a single process.
Fig. 1 shows a flowchart of a method for synchronously playing audio and video courseware according to an embodiment of the present invention. As shown in fig. 1, the method includes:
step 101: all the audio-video tags in the courseware are traversed through H5, and a listening event for listening to the audio-video tags is set.
In the embodiment of the invention, the H5 courseware is courseware in HTML5 format, which may be converted from another format or generated by writing HTML5 directly. The H5 courseware may include audio and/or video, which exist in the H5 courseware in the form of corresponding audio tags and video tags; in this embodiment, audio tags and/or video tags are collectively referred to as audio and video tags, that is, the audio and video tags may include only audio tags or only video tags, or may include both.
In the embodiment of the invention, when the H5 courseware contains one or more audio and video tags, all the audio and video tags are traversed; meanwhile, a monitoring event is added to monitor the audio and video tags in the H5 courseware, so that changes to the audio and video tags can be monitored in real time.
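As a minimal sketch of this step, the following TypeScript fragment (of the kind that could be injected into the WKWebView that loads the H5 courseware) traverses all audio and video tags and attaches a monitoring mechanism to each; the function name, the use of DOM event listeners, and the MutationObserver watching the src attribute are illustrative assumptions rather than details fixed by the embodiment.
```typescript
// Minimal sketch: traverse all <audio>/<video> tags in the H5 courseware and
// attach a monitoring mechanism to each one. Names are illustrative only.
type AVElement = HTMLAudioElement | HTMLVideoElement;

function monitorCoursewareTags(onChange: (el: AVElement) => void): void {
  const tags = Array.from(document.querySelectorAll<AVElement>('audio, video'));

  for (const el of tags) {
    // Autoplaying tags, and any explicit play, surface as a 'play' event.
    el.addEventListener('play', () => onChange(el));

    // A change of the src attribute means the tag is calling a new resource.
    const observer = new MutationObserver(() => onChange(el));
    observer.observe(el, { attributes: true, attributeFilter: ['src'] });
  }
}
```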
Step 102: and when the audio/video tag is determined to be changed by monitoring the event, taking the changed audio/video tag as a target audio/video tag, determining an operation event of the target audio/video tag according to the monitoring event, and terminating the operation event.
In the embodiment of the invention, the monitoring event monitors the audio and video tags at all times; when an audio/video tag changes, the monitoring event responds in time so that the operation event of the changed tag can be determined promptly. If an audio/video tag carries the autoplay attribute, it will start playing automatically; its behavior therefore needs to be monitored, and a play event is triggered when the tag changes. Likewise, if the audio/video resource address (src) in an audio/video tag changes, the tag is calling a new audio/video resource, and its behavior also needs to be monitored. Therefore, if the monitoring event detects that an audio/video tag starts playing automatically or that the resource address in an audio/video tag has changed, the tag is determined to have changed.
In the embodiment of the invention, if a certain audio/video tag changes, that tag is taken as the target audio/video tag; if the target audio/video tag generates an operation event, the monitoring event detects it, and the operation event in the target audio/video tag is terminated. Specifically, the operation event may be one or more of a play event, a pause event, and an end event. "Terminating the operation event" in the embodiment of the present invention means that the operation event generated by the target audio/video tag is not executed. For example, when the operation event is an end event, the end event itself is not executed, but this does not mean that playback of the audio/video cannot be ended in another way.
Step 103: and extracting corresponding audio and video resources according to the audio and video resource address in the target audio and video tag, and determining a corresponding operation instruction according to the operation event.
In the embodiment of the invention, after the monitoring event determines the target audio/video tag, the corresponding audio and video resource can be extracted according to the audio/video resource address in the target audio/video tag. The resource address may specifically be the URL (Uniform Resource Locator) of the audio/video resource; the resource can be located through the URL and then extracted. Meanwhile, when the monitoring event detects the operation event, a corresponding operation instruction is generated. For example, if the operation event is a play event, the generated operation instruction may be a play instruction.
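A minimal sketch of this step, under the assumption that the resource address is read from the DOM element and that the instruction names simply mirror the operation events, might look as follows; the type and function names are illustrative.
```typescript
// Minimal sketch of step 103: read the resource URL from the target tag and
// map the observed operation event to an operation instruction.
type OperationInstruction = 'play' | 'pause' | 'end';

function extractResourceUrl(el: HTMLMediaElement): string {
  // currentSrc also covers resources declared through <source> children.
  return el.currentSrc || el.src;
}

function toInstruction(eventType: string): OperationInstruction {
  switch (eventType) {
    case 'play':  return 'play';
    case 'pause': return 'pause';
    default:      return 'end';   // e.g. the DOM 'ended' event
  }
}
```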
Step 104: and processing the audio and video resources by a preset processing process according to the operation instruction, wherein the processing process is used for processing the audio and video data transmitted by the user.
In the embodiment of the invention, a processing process for processing the audio and video data transmitted by a user is preset. For example, in an online education scene, the sound made by a teacher or student is collected into an audio/video resource; when another terminal plays that resource, it is played by the processing process on the terminal side, and playback can also be paused or stopped. In addition, after the audio and video resource and the operation instruction have been determined, it is this processing process that handles the resource extracted from the H5 courseware; for example, when the operation instruction is a play instruction, the processing process plays the audio and video resource, so the audio and video in the H5 courseware and the audio and video generated by the teacher or student are processed by the same processing process.
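On iOS, one way the intercepted resource address and operation instruction could be handed to the native processing process is through a WKWebView script message handler; the sketch below assumes a handler named 'courseware' registered on the native side via WKUserContentController, which is an illustrative choice rather than something specified by the embodiment.
```typescript
// Minimal sketch of handing the intercepted resource over to the native
// processing process through the WKWebView JavaScript bridge. The handler
// name 'courseware' is assumed; the native side would register it with
// WKUserContentController before loading the H5 courseware.
interface CoursewareMessage {
  url: string;                           // audio/video resource address
  instruction: 'play' | 'pause' | 'end'; // operation instruction
}

interface WebKitBridge {
  messageHandlers?: Record<string, { postMessage(msg: unknown): void }>;
}

function forwardToProcessingProcess(msg: CoursewareMessage): void {
  const webkit = (window as { webkit?: WebKitBridge }).webkit;
  webkit?.messageHandlers?.['courseware']?.postMessage(msg);
}
```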
With the method for synchronously playing audio and video courseware provided by the embodiment of the invention, a monitoring event is set to monitor the audio and video tags in the H5 courseware; when an operation event is monitored, the operation event is terminated, and the processing process processes the corresponding audio and video resource according to the operation instruction corresponding to that operation event. The intercepted audio and video resource is thus handled by the processing process, and when audio and video data transmitted by a user (a student or a teacher) needs to be played, the same processing process handles it at the same time. A single processing process therefore processes the audio and video resources of the H5 courseware and the audio and video data transmitted by the user synchronously; even when the processing process plays the two kinds of audio and video simultaneously, the volume of neither is reduced, and the two kinds of audio and video are processed synchronously.
On the basis of the above embodiments, terminating the operation event may be done directly or indirectly, and includes: adjusting the volume of the target audio/video tag to zero; or sending a play-completion instruction to the target audio/video tag.
In the embodiment of the invention, the operation event may be terminated directly. Specifically, when an operation event such as a play event or a pause event of the target audio/video tag needs to be terminated, a play-completion instruction, that is, an end instruction, may be dispatched to the target audio/video tag. If the operation event to be terminated is an end event, the event terminates automatically, because an end event means the corresponding audio/video resource has already finished playing. Of course, to avoid having to distinguish the type of operation event, a play-completion instruction can simply be dispatched to the target audio/video tag regardless of what the operation event is.
Alternatively, the operation event may be terminated indirectly. In the embodiment of the invention, the purpose of terminating the operation event is to prevent the user from perceiving that the H5 courseware itself is processing the audio and video resource; the volume of the target audio/video tag can therefore be adjusted to zero. Even if the target tag still plays the corresponding resource, the user cannot perceive it because the tag's volume is zero, so the operation event of the target audio/video tag is effectively terminated.
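The two termination variants could be sketched as follows; dispatching a DOM 'ended' event as the play-completion instruction is an assumption about how that instruction might be expressed in the page, not a detail given by the embodiment.
```typescript
// Minimal sketch of the two termination variants described above.
function terminateDirectly(el: HTMLMediaElement): void {
  el.pause();                            // the tag does not carry on playing
  el.dispatchEvent(new Event('ended'));  // assumed form of the play-completion instruction
}

function terminateIndirectly(el: HTMLMediaElement): void {
  el.volume = 0;                         // the user cannot hear the tag
  el.muted = true;
}
```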
Optionally, if the operation event is terminated in the direct manner, that is, a play-completion instruction is sent to the target audio/video tag, then step 103, "determining a corresponding operation instruction according to the operation event", includes:
step A1: and determining the current operation instruction according to the last operation instruction determined by the last operation event of the target audio/video tag and the current operation event.
In the embodiment of the invention, in some target audio/video tags the play event and the pause event are triggered by the same component: each time the user triggers the component, a play event and a pause event are generated in alternation. It is therefore necessary to determine what the current operation event really is based on the previous operation event. For example, if the previous event was a play event, the operation event determined when the user triggers the component is a pause event; if the previous event was a pause event or an end event (an end-of-playback event), the operation event determined when the user triggers the component is a play event, provided the audio/video tag has not been replaced.
In addition, because a play-completion instruction is dispatched to the target audio/video tag whenever an operation event is terminated, in the situation above every operation event observed when the user triggers the component repeatedly is a play event, so the real current intent cannot be determined from the current operation event alone. The operation instruction includes one or more of a play instruction, a pause instruction, and an end instruction.
Specifically, if the play event and the pause event are generated by the same component, and the current operation event of the target audio/video tag is a play event, it is judged whether the operation instruction determined for the last operation event of the target audio/video tag was a play instruction. If it was a play instruction, the current operation instruction is a pause instruction; if it was not a play instruction (i.e. it was a pause instruction or an end instruction), the current operation instruction is a play instruction. In this way, the intent of the current operation can be determined accurately from the last operation instruction and the current operation event, the current operation instruction can be derived, and the processing process can respond correctly to the user's operation on the H5 courseware. A minimal sketch of this determination is given below.
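The sketch keeps the last instruction per tag in a map, which is an illustrative choice; the state handling and names are assumptions, not details fixed by the embodiment.
```typescript
// Minimal sketch: recover the real intent from the last instruction and the
// current operation event, for tags whose play/pause share one component.
type Instruction = 'play' | 'pause' | 'end';

const lastInstruction = new Map<HTMLMediaElement, Instruction>();

function determineInstruction(
  el: HTMLMediaElement,
  currentEvent: 'play' | 'pause' | 'ended'
): Instruction {
  let instruction: Instruction;
  if (currentEvent === 'play') {
    // Termination dispatches a play-completion instruction, so repeated
    // toggles all surface as play events; use the previous instruction
    // to decide whether the user really meant play or pause.
    instruction = lastInstruction.get(el) === 'play' ? 'pause' : 'play';
  } else if (currentEvent === 'pause') {
    instruction = 'pause';
  } else {
    instruction = 'end';
  }
  lastInstruction.set(el, instruction);
  return instruction;
}
```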
The method for synchronously playing audio and video courseware provided by the embodiment of the invention has been described above in detail. The method can also be implemented by a corresponding device, and the device for synchronously playing audio and video courseware provided by the embodiment of the invention is described in detail below.
Fig. 2 shows a schematic structural diagram of a device for synchronously playing audio and video courseware provided by the embodiment of the invention. As shown in fig. 2, the device for synchronously playing the audio and video courseware comprises:
the preprocessing module 21 is used for traversing all audio/video tags in the H5 courseware and setting a monitoring event for monitoring the audio/video tags;
the monitoring module 22 is configured to, when the monitoring event determines that the audio/video tag changes, use the changed audio/video tag as a target audio/video tag, determine an operation event of the target audio/video tag according to the monitoring event, and terminate the operation event;
the determining module 23 is configured to extract corresponding audio and video resources according to the audio and video resource address in the target audio and video tag, and determine a corresponding operation instruction according to the operation event;
and the processing module 24 is configured to process the audio and video resources according to the operation instruction by using a preset processing process, where the processing process is used to process audio and video data transmitted by a user.
On the basis of the above embodiment, the apparatus further includes:
and the tag change module is used for determining that the audio/video tag changes when the monitoring event monitors that the audio/video tag is automatically played or the audio/video resource address in the audio/video tag changes.
On the basis of the above embodiment, the monitoring module 22 terminating the operation event includes:
adjusting the volume of the target audio/video tag to zero; or sending a playing completion instruction to the target audio/video tag.
On the basis of the above embodiment, the determining module 23 determines the corresponding operation instruction according to the operation event, including:
and determining the current operation instruction according to the last operation instruction determined by the last operation event of the target audio/video tag and the current operation event.
On the basis of the above embodiment, the operation instruction includes one or more of a play instruction, a pause instruction, and an end instruction.
In addition, an embodiment of the present invention further provides an electronic device, which includes a bus, a transceiver, a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the transceiver, the memory, and the processor are connected via the bus, and when the computer program is executed by the processor, each process of the above-mentioned method embodiment for synchronously playing audio and video courseware is implemented, and the same technical effect can be achieved, and details are not repeated here to avoid repetition.
Specifically, referring to fig. 3, an embodiment of the present invention further provides an electronic device, which includes a bus 1110, a processor 1120, a transceiver 1130, a bus interface 1140, a memory 1150, and a user interface 1160.
In an embodiment of the present invention, the electronic device further includes: a computer program stored on the memory 1150 and executable on the processor 1120, the computer program when executed by the processor 1120 implementing the processes of the above-described method embodiments of synchronized playing of audiovisual courseware.
A transceiver 1130 for receiving and transmitting data under the control of the processor 1120.
In embodiments of the invention in which a bus architecture (represented by bus 1110) is used, bus 1110 may include any number of interconnected buses and bridges, with bus 1110 connecting various circuits including one or more processors, represented by processor 1120, and memory, represented by memory 1150.
Bus 1110 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an Accelerated Graphics Port (AGP), a processor bus, or a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include: an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Processor 1120 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method embodiments may be performed by integrated logic circuits in hardware or instructions in software in a processor. The processor described above includes: general purpose processors, Central Processing Units (CPUs), Network Processors (NPs), Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), Complex Programmable Logic Devices (CPLDs), Programmable Logic Arrays (PLAs), Micro Control Units (MCUs) or other Programmable Logic devices, discrete gates, transistor Logic devices, discrete hardware components. The various methods, steps and logic blocks disclosed in embodiments of the present invention may be implemented or performed. For example, the processor may be a single core processor or a multi-core processor, which may be integrated on a single chip or located on multiple different chips.
Processor 1120 may be a microprocessor or any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be directly performed by a hardware decoding processor, or may be performed by a combination of hardware and software modules in the decoding processor. The software modules may be located in a Random Access Memory (RAM), a Flash Memory (Flash Memory), a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), a register, and other readable storage media known in the art. The readable storage medium is located in a memory, and a processor reads information in the memory and completes the steps of the method in combination with hardware of the processor.
The bus 1110 may also connect various other circuits, such as peripheral devices, voltage regulators, or power management circuits, and the bus interface 1140 provides an interface between the bus 1110 and the transceiver 1130; these are well known in the art and are therefore not described further in the embodiments of the present invention.
The transceiver 1130 may be one element or may be multiple elements, such as multiple receivers and transmitters, providing a means for communicating with various other apparatus over a transmission medium. For example: the transceiver 1130 receives external data from other devices, and the transceiver 1130 transmits data processed by the processor 1120 to other devices. Depending on the nature of the computer system, a user interface 1160 may also be provided, such as: touch screen, physical keyboard, display, mouse, speaker, microphone, trackball, joystick, stylus.
It is to be appreciated that in embodiments of the invention, the memory 1150 may further include memory located remotely with respect to the processor 1120, which may be coupled to a server via a network. One or more portions of the above-described network may be an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a Wireless Local Area Network (WLAN), a Wide Area Network (WAN), a Wireless Wide Area Network (WWAN), a Metropolitan Area Network (MAN), the Internet, a Public Switched Telephone Network (PSTN), a Plain Old Telephone Service (POTS) network, a cellular telephone network, a wireless fidelity (Wi-Fi) network, or a combination of two or more of the above. For example, the cellular telephone network and the wireless network may be a Global System for Mobile Communications (GSM) system, a Code Division Multiple Access (CDMA) system, a Worldwide Interoperability for Microwave Access (WiMAX) system, a General Packet Radio Service (GPRS) system, a Wideband Code Division Multiple Access (WCDMA) system, a Long Term Evolution (LTE) system, an LTE Frequency Division Duplex (FDD) system, an LTE Time Division Duplex (TDD) system, a Long Term Evolution-Advanced (LTE-A) system, a Universal Mobile Telecommunications System (UMTS), an enhanced Mobile Broadband (eMBB) system, a massive Machine Type Communication (mMTC) system, an Ultra-Reliable Low-Latency Communication (URLLC) system, or the like.
It is to be understood that the memory 1150 in embodiments of the present invention can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. Wherein the nonvolatile memory includes: Read-Only Memory (ROM), Programmable ROM (PROM), Erasable PROM (EPROM), Electrically Erasable PROM (EEPROM), or Flash Memory.
The volatile memory includes: Random Access Memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as: Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Enhanced Synchronous DRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 1150 of the electronic device described in the embodiments of the invention includes, but is not limited to, the above and any other suitable types of memory.
In an embodiment of the present invention, memory 1150 stores the following elements of operating system 1151 and application programs 1152: an executable module, a data structure, or a subset thereof, or an expanded set thereof.
Specifically, the operating system 1151 includes various system programs such as: a framework layer, a core library layer, a driver layer, etc. for implementing various basic services and processing hardware-based tasks. Applications 1152 include various applications such as: media Player (Media Player), Browser (Browser), for implementing various application services. A program implementing a method of an embodiment of the invention may be included in application program 1152. The application programs 1152 include: applets, objects, components, logic, data structures, and other computer system executable instructions that perform particular tasks or implement particular abstract data types.
In addition, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements each process of the above method embodiment for synchronously playing audio and video courseware, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The computer-readable storage medium includes: permanent and non-permanent, removable and non-removable media may be tangible devices that retain and store instructions for use by an instruction execution apparatus. The computer-readable storage medium includes: electronic memory devices, magnetic memory devices, optical memory devices, electromagnetic memory devices, semiconductor memory devices, and any suitable combination of the foregoing. The computer-readable storage medium includes: phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), non-volatile random access memory (NVRAM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic tape cartridge storage, magnetic tape disk storage or other magnetic storage devices, memory sticks, mechanically encoded devices (e.g., punched cards or raised structures in a groove having instructions recorded thereon), or any other non-transmission medium useful for storing information that may be accessed by a computing device. As defined in embodiments of the present invention, the computer-readable storage medium does not include transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses traveling through a fiber optic cable), or electrical signals transmitted through a wire.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus, electronic device and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electrical, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to solve the problem to be solved by the embodiment of the invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present invention may be substantially or partially contributed by the prior art, or all or part of the technical solutions may be embodied in a software product stored in a storage medium and including instructions for causing a computer device (including a personal computer, a server, a data center, or other network devices) to execute all or part of the steps of the methods of the embodiments of the present invention. And the storage medium includes various media that can store the program code as listed in the foregoing.
The above description is only a specific implementation of the embodiments of the present invention, but the scope of the embodiments of the present invention is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the embodiments of the present invention, and all such changes or substitutions should be covered by the scope of the embodiments of the present invention. Therefore, the protection scope of the embodiments of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A method for synchronously playing audio and video courseware is characterized by comprising the following steps:
traversing all audio and video tags in the H5 courseware, and setting a monitoring event for monitoring the audio and video tags;
when the monitoring event determines that the audio/video tag changes, the changed audio/video tag is used as a target audio/video tag, an operation event of the target audio/video tag is determined according to the monitoring event, and the operation event is terminated;
extracting corresponding audio and video resources according to the audio and video resource address in the target audio and video tag, and determining a corresponding operation instruction according to the operation event;
and processing the audio and video resources by a preset processing process according to the operation instruction, wherein the processing process is used for processing audio and video data transmitted by a user.
2. The method of claim 1, further comprising:
and when the monitoring event monitors that the audio/video tag is automatically played or the audio/video resource address in the audio/video tag is changed, determining that the audio/video tag is changed.
3. The method of claim 1, wherein the terminating the operational event comprises:
adjusting the volume of the target audio/video tag to zero; or sending a playing completion instruction to the target audio/video tag.
4. The method of claim 1, wherein determining the corresponding operation instruction according to the operation event comprises:
and determining the current operation instruction according to the last operation instruction determined by the last operation event of the target audio/video tag and the current operation event.
5. The method of claim 1, wherein the operation instructions comprise one or more of a play instruction, a pause instruction, and an end instruction.
6. A device for synchronously playing audio and video courseware is characterized by comprising:
the preprocessing module is used for traversing all audio and video tags in the H5 courseware and setting a monitoring event for monitoring the audio and video tags;
the monitoring module is used for taking the changed audio/video tag as a target audio/video tag when the monitoring event determines that the audio/video tag changes, determining an operation event of the target audio/video tag according to the monitoring event and terminating the operation event;
the determining module is used for extracting corresponding audio and video resources according to the audio and video resource address in the target audio and video tag and determining a corresponding operation instruction according to the operation event;
and the processing module is used for processing the audio and video resources according to the operation instruction by using a preset processing process, and the processing process is used for processing audio and video data transmitted by a user.
7. The apparatus of claim 6, further comprising:
and the tag change module is used for determining that the audio/video tag changes when the monitoring event monitors that the audio/video tag is automatically played or the audio/video resource address in the audio/video tag changes.
8. The apparatus of claim 6, wherein the monitoring module terminating the operation event comprises:
adjusting the volume of the target audio/video tag to zero; or sending a playing completion instruction to the target audio/video tag.
9. An electronic device comprising a bus, a transceiver, a memory, a processor and a computer program stored on the memory and executable on the processor, the transceiver, the memory and the processor being connected via the bus, characterized in that the computer program, when executed by the processor, implements the steps in the method of synchronized playing of audio and video courseware as claimed in any one of claims 1 to 5.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of synchronously playing audio-video courseware according to any one of claims 1 to 5.
CN202110153341.2A 2021-02-04 2021-02-04 Method and device for synchronously playing audio and video courseware and electronic equipment Active CN112468679B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110153341.2A CN112468679B (en) 2021-02-04 2021-02-04 Method and device for synchronously playing audio and video courseware and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110153341.2A CN112468679B (en) 2021-02-04 2021-02-04 Method and device for synchronously playing audio and video courseware and electronic equipment

Publications (2)

Publication Number Publication Date
CN112468679A CN112468679A (en) 2021-03-09
CN112468679B (en) 2021-04-20

Family

ID=74802634

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110153341.2A Active CN112468679B (en) 2021-02-04 2021-02-04 Method and device for synchronously playing audio and video courseware and electronic equipment

Country Status (1)

Country Link
CN (1) CN112468679B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002077858A (en) * 2000-09-01 2002-03-15 Nippon Telegraph & Telephone East Corp Method for synchronously providing video/voice and data
CN103493085A (en) * 2011-02-24 2014-01-01 谷歌公司 Electronic book extension systems and methods
CN110609918A (en) * 2019-07-30 2019-12-24 李华坤 Audio playing software
CN111428059A (en) * 2020-03-19 2020-07-17 威比网络科技(上海)有限公司 Audio-associated multimedia data playing method and device, electronic equipment and storage medium
CN112004113A (en) * 2020-07-27 2020-11-27 北京大米科技有限公司 Teaching interaction method, device, server and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150332595A1 (en) * 2014-05-18 2015-11-19 Salah Shakir Integrated user authentication and proctoring system for online and distance education courses and methods of use


Also Published As

Publication number Publication date
CN112468679A (en) 2021-03-09

Similar Documents

Publication Publication Date Title
US20170111414A1 (en) Video playing method and device
JP6582100B2 (en) Method and apparatus for providing voice service
CN110417641B (en) Method and equipment for sending session message
WO2019218458A1 (en) Application program test method and device, mobile terminal and medium
US11482257B2 (en) Image display method and apparatus
EP2955713A1 (en) Synchronous audio playback method, apparatus and system
CN110704202B (en) Multimedia recording data sharing method and terminal equipment
US20180260478A1 (en) Audio processing method, server, user equipment, and system
CN108521612B (en) Video abstract generation method, device, server and storage medium
CN111435600B (en) Method and apparatus for processing audio
CN112685121B (en) Method and equipment for presenting session entry
WO2019218464A1 (en) Application program testing method and apparatus, and mobile terminal and medium
EP3174312A1 (en) Playback method and playback device for a multiroom sound system
CN112822431A (en) Method and equipment for private audio and video call
JP7375089B2 (en) Method, device, computer readable storage medium and computer program for determining voice response speed
CN112711477B (en) Method and device for switching application programs and electronic equipment
US10027994B2 (en) Interactive audio metadata handling
CN108829370B (en) Audio resource playing method and device, computer equipment and storage medium
WO2024051823A1 (en) Method for managing reception information and back-end device
CN112468679B (en) Method and device for synchronously playing audio and video courseware and electronic equipment
US11557303B2 (en) Frictionless handoff of audio content playing using overlaid ultrasonic codes
CN109547830B (en) Method and device for synchronous playing of multiple virtual reality devices
CN111526381B (en) Method and device for optimizing live broadcast resources and electronic equipment
CN113259385B (en) Echo eliminating method and device for audio playing and electronic equipment
CN111381797B (en) Processing method and device for realizing KTV function on client and user equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant