CN117692727A - Method, device and storage medium for generating covers of classroom teaching video - Google Patents

Method, device and storage medium for generating covers of classroom teaching video

Info

Publication number
CN117692727A
CN117692727A (application CN202211078128.0A)
Authority
CN
China
Prior art keywords
classroom
video
cover
teaching
classroom teaching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211078128.0A
Other languages
Chinese (zh)
Inventor
尹志超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Original Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Shiyuan Electronics Thecnology Co Ltd, Guangzhou Shirui Electronics Co Ltd filed Critical Guangzhou Shiyuan Electronics Thecnology Co Ltd
Priority to CN202211078128.0A priority Critical patent/CN117692727A/en
Publication of CN117692727A publication Critical patent/CN117692727A/en
Pending legal-status Critical Current

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

The embodiments of the disclosure provide a method, a device, a storage medium and an electronic device for generating a cover of a classroom teaching video. The method comprises the following steps: acquiring classroom teaching data and determining a classroom theme from the classroom teaching data, wherein the classroom teaching data comprises a classroom teaching video; in response to a classroom teaching video cover generation request, determining a video cover template corresponding to the classroom theme; performing atmosphere analysis on each video frame of the classroom teaching video to obtain classroom atmosphere scores, and taking a video frame with a high classroom atmosphere score from the classroom teaching video as the cover background image; and combining the video cover template and the cover background image into a classroom teaching video cover. The classroom teaching video cover obtained by the cover generation method of the embodiments of the disclosure presents the theme and atmosphere of the classroom teaching in a personalized way and provides a better user experience.

Description

Method, device and storage medium for generating covers of classroom teaching video
Technical Field
The embodiment of the disclosure relates to the field of teaching, in particular to a method and a device for generating a cover of a classroom teaching video and a storage medium.
Background
Classroom teaching is a concentrated embodiment of teacher instruction and student learning, and in order to review or summarize a lesson afterwards, outside the classroom, a classroom teaching video is usually recorded. Each classroom teaching video has a corresponding video cover; in conventional technology, a video frame is randomly extracted from the classroom teaching video and used as the video cover.
In the process of implementing the invention, the inventor found that using a randomly extracted classroom teaching video frame as the video cover may cause the following problem: the video cover is too generic, it cannot reflect the theme or atmosphere of the classroom teaching, and the user experience is poor.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a method, an apparatus, a storage medium and an electronic device for generating a cover of a classroom teaching video; the cover obtained by the generation method presents the theme and atmosphere of the classroom teaching in a personalized way and gives a better user experience.
According to a first aspect of an embodiment of the present disclosure, there is provided a method for generating a cover of a classroom teaching video, including the steps of:
acquiring classroom teaching data and determining a classroom theme from the classroom teaching data, wherein the classroom teaching data comprises a classroom teaching video;
in response to a classroom teaching video cover generation request, determining a video cover template corresponding to the classroom theme;
performing atmosphere analysis on each video frame of the classroom teaching video to obtain classroom atmosphere scores, and taking a video frame with a high classroom atmosphere score from the classroom teaching video as the cover background image; and
combining the video cover template and the cover background image into a classroom teaching video cover.
According to a second aspect of the embodiments of the present disclosure, there is provided a classroom teaching video cover generating device, including:
the classroom theme determining module, used for acquiring classroom teaching data and determining a classroom theme from the classroom teaching data, wherein the classroom teaching data comprises a classroom teaching video;
the cover template determining module, used for determining, in response to a classroom teaching video cover generation request, a video cover template corresponding to the classroom theme;
the cover background image determining module, used for performing atmosphere analysis on each video frame of the classroom teaching video to obtain classroom atmosphere scores, and taking a video frame with a high classroom atmosphere score from the classroom teaching video as the cover background image; and
the video cover synthesis module, used for synthesizing the video cover template and the cover background image into a classroom teaching video cover.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic device comprising a processor and a memory; the memory stores a computer program adapted to be loaded by the processor to execute the method for generating a cover of a classroom teaching video as described above.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the method for generating a cover of a classroom teaching video as described above.
According to the embodiments of the present disclosure, the classroom theme is determined from the classroom teaching data, and the corresponding video cover template is determined based on the classroom theme; the classroom atmosphere in the classroom teaching video is then scored, and a video frame with a high classroom atmosphere score is taken from the classroom teaching video as the cover background image; finally, the video cover template and the cover background image are combined into the classroom teaching video cover. Compared with the prior art, in which a randomly extracted classroom teaching video frame is used as the video cover and the user experience is therefore poorer, the video cover obtained by this cover generation method embodies the classroom theme and the classroom atmosphere more effectively.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
For a better understanding and implementation, the present invention is described in detail below with reference to the drawings.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present disclosure, and other drawings may be obtained according to these drawings without inventive effort to a person of ordinary skill in the art.
FIG. 1 is a schematic illustration of a classroom teaching scenario shown in one embodiment of the present disclosure;
FIG. 2 is a flow chart of a method of generating a cover for a classroom teaching video shown in one embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a video cover template shown in accordance with one embodiment of the present disclosure;
FIG. 4 is a flowchart of step S3 of a method for generating a cover of a classroom teaching video shown in one embodiment of the present disclosure;
FIG. 5 is a flow chart of a method of generating a cover for a classroom teaching video shown in another embodiment of the present disclosure;
FIG. 6 is a flowchart of step S5 of a method for generating a cover of a classroom teaching video according to another embodiment of the present disclosure;
FIG. 7 is a schematic block diagram of a classroom teaching video cover generation device shown in one embodiment of the present disclosure;
FIG. 8 is a schematic block diagram of a classroom teaching video cover generation system shown in accordance with another embodiment of the present disclosure;
fig. 9 is a schematic structural view of an electronic device according to an embodiment of the present disclosure.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present disclosure more apparent, the embodiments of the present disclosure will be described in further detail below with reference to the accompanying drawings. Where the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated.
It should be understood that the embodiments described in the examples described below do not represent all embodiments consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments of the present disclosure without inventive faculty, are intended to fall within the scope of the present disclosure.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to limit the disclosure. As used in this disclosure, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Furthermore, in the description of the present disclosure, unless otherwise indicated, "a plurality" means two or more. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items; for example, "A and/or B" may represent three cases: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
It should be appreciated that although the terms first, second, third, etc. may be used in this disclosure to describe various information, the information should not be limited by these terms; these terms are merely used to distinguish between similar objects and do not necessarily describe a particular order or sequence, nor should they be construed to indicate or imply relative importance. The specific meanings of these terms in this disclosure will be understood by those of ordinary skill in the art according to the particular case. Depending on the context, the term "if" as used in this disclosure may be interpreted as "when", "upon" or "in response to determining".
Please refer to fig. 1, which is a schematic diagram of a classroom teaching scenario according to an embodiment of the present disclosure. The classroom teaching scenario of the embodiments of the present disclosure includes a classroom 10, a number of internet of things (Internet of things, ioT) data collection devices disposed within the classroom 10.
The data acquisition devices of the internet of things can access the internet through a network access mode, and establish a data communication link with the classroom teaching video processing device 30, so that the data acquisition devices of the internet of things can send the acquired classroom teaching data to the classroom teaching video processing device 30. The network may be a communication medium of various connection types capable of providing a communication link between the internet of things data collection device and the classroom teaching video processing device 30, such as a wired communication link, a wireless communication link, or a fiber optic cable, etc., and the disclosure is not limited herein.
The Internet of Things (IoT) refers to collecting, in real time, any object or process that needs to be monitored, connected or interacted with, through various devices and technologies such as information sensors, radio frequency identification, global positioning systems, infrared sensors and laser scanners; gathering the required information about it, such as sound, light, heat, electricity, mechanics, chemistry, biology and position; and, through all kinds of possible network access, realizing ubiquitous connection between objects and between objects and people, so as to achieve intelligent sensing, identification and management of objects and processes. The Internet of Things is an information carrier based on the Internet, traditional telecommunication networks and the like, and enables all ordinary physical objects that can be independently addressed to form an interconnected network. The Internet of Things data acquisition devices can acquire classroom teaching data in real time, and the classroom teaching data may include the classroom teaching video and the teacher's teaching courseware.
In an alternative embodiment, the internet of things data collection device includes a front camera 21 and a front microphone 22 disposed in front of the classroom 10 toward a teacher, a rear camera 23 and a rear microphone 24 disposed in rear of the classroom 10 toward students, an interactive all-in-one machine 25 disposed in front of the classroom 10 on the teacher side, and a smart blackboard 26.
The front camera 21 is used for collecting full scene picture data of the classroom behaviors of students in the classroom 10, the rear microphone 24 is used for collecting voice data of the classroom behaviors of the students in the classroom 10, so that audio and video data of the classroom behaviors of the students in the classroom 10 are obtained, and the audio and video data of the students are transmitted to the classroom teaching video processing equipment 30 through the front camera 21 and the rear microphone 24.
The rear camera 23 collects full scene picture data of the classroom behaviors of the teacher in the classroom 10, and the front microphone 22 collects voice data of the classroom behaviors of the teacher in the classroom 10, so that audio and video data of the classroom behaviors of the teacher in the classroom 10 are obtained, and the audio and video data of the teacher are transmitted to the classroom teaching video processing device 30 through the rear camera 23 and the front microphone 22.
The interactive all-in-one machine 25 receives the teaching courseware data uploaded by the teacher, so that the teaching courseware is played in the classroom through the interactive all-in-one machine 25, and the teaching courseware data is transmitted to the classroom teaching video processing device 30 by the interactive all-in-one machine 25. Alternatively, the classroom teaching video processing device 30 is networked with the educational administration system and obtains the teaching courseware data directly from the educational administration system.
The blackboard writing data in the class is collected by the intelligent blackboard 26. Optionally, the intelligent blackboard 26 includes a main screen and two auxiliary screens, the two auxiliary screens being disposed on either side of the main screen; the main screen is typically a large-screen all-in-one machine with an operating system that supports touch operation, while the two auxiliary screens are typically used for writing. A blackboard writing content detection device is arranged on the frames of the main screen and the two auxiliary screens, thereby obtaining the blackboard writing data. Alternatively, the blackboard writing data on the intelligent blackboard 26 can be obtained by an infrared touch detection device provided on the frame of the intelligent blackboard 26.
The method for generating a cover of a classroom teaching video according to the embodiments of the present disclosure may be executed by the classroom teaching video processing device 30. The classroom teaching video processing device 30 may implement the method in software and/or hardware, and may be composed of one physical entity or of two or more physical entities. For example, the classroom teaching video processing device 30 may be an intelligent device such as a computer, a mobile phone, a tablet, an interactive tablet, a smart blackboard or a smart whiteboard; it may be a device independent of the above-mentioned Internet of Things data collection devices, or it may itself be one of the Internet of Things data collection devices, for example the interactive all-in-one machine 25. The classroom teaching video processing device 30 may run an application for performing the method for generating a cover of a classroom teaching video; the application may be presented in a form adapted to the electronic device, for example as an APP, and in some examples may also be presented in a form such as a system plug-in or a web plug-in.
In an alternative implementation, the classroom teaching video processing device 30 that performs the method for generating a cover of a classroom teaching video according to the embodiments of the present disclosure may be a cloud service platform; the cloud service platform comprises a streaming media server (Simple RTMP Server, SRS), a teaching analysis server and a multimedia analysis server (Media Content Analysis, MCA).
The plurality of Internet of Things data acquisition devices push the acquired classroom teaching data, such as the classroom teaching video data and the classroom teaching courseware data, to the streaming media server through communication protocols such as RTMP or RTMPS; the streaming media server sends the classroom teaching data to the teaching analysis server; meanwhile, the teaching analysis server controls the multimedia analysis server to pull the audio and video data from the streaming media server to generate the cover of the classroom teaching video.
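As an illustrative, non-limiting sketch of how a multimedia analysis server might pull frames from the streaming media server, the snippet below samples frames from an RTMP stream with OpenCV. The stream URL and the sampling interval are assumptions made for this sketch and are not defined by this disclosure.

```python
# Minimal sketch (not part of the disclosure): pulling and sampling frames from
# an RTMP stream published to the streaming media server. The URL and the
# sampling interval are illustrative assumptions.
import cv2

RTMP_URL = "rtmp://srs.example.com/live/classroom_001"  # hypothetical stream address

def sample_frames(url: str, every_n: int = 150):
    """Yield every n-th decoded frame pulled from the stream."""
    cap = cv2.VideoCapture(url)
    index = 0
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        if index % every_n == 0:
            yield index, frame
        index += 1
    cap.release()
```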
The method for generating the cover of the classroom teaching video according to the embodiment of the present disclosure will be described in detail with reference to the accompanying drawings.
Referring to fig. 2, the method for generating a cover of a classroom teaching video according to the embodiments of the present disclosure is executed by a classroom teaching video processing device and includes the following steps:
Step S1: acquiring classroom teaching data and determining a classroom theme from the classroom teaching data, wherein the classroom teaching data comprises a classroom teaching video.
In an alternative embodiment, the teaching time is a past classroom teaching time period. It can be understood that at this time, the classroom teaching video processing device or the device interacting with the classroom teaching video processing device stores the classroom teaching data of each teaching time period in advance, so that the classroom teaching video processing device can directly obtain the corresponding classroom teaching data from the stored data according to the classroom teaching time period.
In an alternative embodiment, the classroom teaching data includes one or more of the following: classroom teaching video, classroom teaching audio, and teaching courseware data. Multidimensional data such as classroom teaching video, classroom teaching audio and teaching courseware data are used as classroom teaching data sources, and multidimensional classroom teaching atmosphere analysis results can be obtained.
The classroom teaching audio and video include the voices and pictures of the teacher and the students in the classroom, and can be obtained through the front camera and front microphone arranged at the front of the classroom and the rear camera and rear microphone arranged at the rear of the classroom facing the students.
The teaching courseware data is the teaching courseware used by the teacher in class and can be obtained through the interactive all-in-one machine. Generally, before class, the teacher stores the courseware in the interactive all-in-one machine, and during class the courseware is played through the interactive all-in-one machine, so the courseware used by the teacher in class can be obtained from the interactive all-in-one machine. In addition, the interactive all-in-one machine can be networked with the educational administration system, and the teaching courseware can be downloaded from the educational administration system through the interactive all-in-one machine.
The teaching courseware data comprises teaching contents such as key knowledge points, interaction elements and the like. Optionally, an interactive courseware program can be installed in the interactive integrated machine, and a teacher can write a courseware through the interactive courseware program, so that the teaching courseware is obtained.
The classroom theme may be the central idea that the course in the classroom teaching intends to express, generally referring to its main content. Different disciplines have different classroom themes, and different chapters within the same discipline also have different classroom themes. Classroom themes also carry an atmosphere: some are lively and relaxed, some are serious, some are highly interactive and some are only moderately interactive, so a corresponding video cover template can be selected according to the classroom theme.
When the classroom teaching data includes teaching courseware data, determining the classroom theme from the classroom teaching data includes: determining the classroom theme from the teaching courseware data of the classroom teaching data, so that the classroom theme can be determined more quickly. In addition, the classroom theme can also be determined from the classroom teaching video.
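As a minimal sketch of determining the classroom theme from teaching courseware data, the snippet below reads the title of the first titled slide of a PowerPoint courseware file. The use of python-pptx and the fallback to the longest text block are illustrative assumptions; the disclosure does not prescribe a particular extraction algorithm.

```python
# Minimal sketch (assumption: the courseware is a .pptx file whose first slide
# title expresses the classroom theme).
from pptx import Presentation

def classroom_theme_from_courseware(pptx_path: str) -> str:
    prs = Presentation(pptx_path)
    # Prefer the first slide title as the central idea of the lesson.
    for slide in prs.slides:
        title = slide.shapes.title
        if title is not None and title.text_frame.text.strip():
            return title.text_frame.text.strip()
    # Fallback: longest text block on the first slide, if any.
    if len(prs.slides) == 0:
        return ""
    texts = [shape.text_frame.text for shape in prs.slides[0].shapes
             if shape.has_text_frame and shape.text_frame.text.strip()]
    return max(texts, key=len, default="")
```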
Step S2: in response to the classroom teaching video cover generation request, determining a video cover template corresponding to the classroom theme.
In an alternative embodiment, the classroom teaching video cover generation request may be triggered on the classroom teaching video processing device. For example, the classroom teaching video processing device displays the classroom teaching videos that have no cover, and the user can click the control corresponding to one of the classroom teaching videos to trigger the video cover generation instruction. In addition, the classroom teaching video cover generation request can also be triggered on the interactive all-in-one machine.
In another alternative embodiment, the classroom teaching video cover generation request may be sent to the classroom teaching video processing device by an external device that has established a communication link with the classroom teaching video processing device. Specifically, the external device may be an intelligent device such as a computer, a mobile phone, a tablet or an intelligent interactive tablet; a corresponding classroom teaching video processing application can be installed on the external device, and the user can trigger the classroom teaching video cover generation request in this application.
There are a plurality of video cover templates. In order to match a video cover template with the classroom theme, the video cover template can be determined according to the classroom theme, so that the video cover embodies the theme of the classroom teaching.
In an alternative embodiment, as shown in fig. 3, the video cover template includes a number of cover constituent elements, typesetting attributes between the cover constituent elements, and a base color scheme of the cover template.
Wherein the cover constituent elements at least include the cover title, the teaching teacher information and the courseware content information, and may further include constituent elements such as a cover subtitle, the teaching time, the course name and the subject to which the course belongs.
The cover titles, cover subtitles, course names and subjects to which the courses belong can be extracted from the teaching courseware data.
The teaching teacher information is extracted from the teaching courseware data; alternatively, teacher face features are obtained through portrait analysis of the classroom teaching video, and the teaching teacher information corresponding to the teacher face features is determined based on the association between teacher face features and teaching teacher information.
And the courseware content information is extracted from the teaching courseware data.
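Purely for illustration, a video cover template as described above (constituent elements, typesetting attributes between them, and a base color scheme) could be represented by a small data structure and looked up by classroom theme. The field names, the two example templates and the keyword rule below are assumptions made for this sketch, not limitations of this disclosure.

```python
# Minimal sketch (assumptions: field names, example templates and the
# theme-matching rule are all illustrative).
from dataclasses import dataclass

@dataclass
class CoverTemplate:
    elements: list      # cover constituent elements, e.g. ["title", "teacher_info", "courseware_info"]
    layout: dict        # typesetting attributes: element name -> (x, y, width, height) in pixels
    base_colors: tuple  # base color scheme of the template, as RGB tuples

TEMPLATES = {
    "lively":  CoverTemplate(["title", "teacher_info", "courseware_info"],
                             {"title": (60, 40, 840, 120)},
                             ((255, 183, 77), (255, 245, 235))),
    "serious": CoverTemplate(["title", "subtitle", "teacher_info", "courseware_info"],
                             {"title": (60, 60, 840, 100)},
                             ((38, 70, 83), (244, 244, 244))),
}

def template_for_theme(theme: str) -> CoverTemplate:
    # A toy keyword rule standing in for whatever theme-to-template matching is used.
    lively_cues = ("experiment", "game", "music", "art")
    return TEMPLATES["lively"] if any(c in theme.lower() for c in lively_cues) else TEMPLATES["serious"]
```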
Step S3: performing atmosphere analysis on each video frame of the classroom teaching video to obtain classroom atmosphere scores, and taking a video frame with a high classroom atmosphere score from the classroom teaching video as the cover background image.
The atmosphere analysis may include atmosphere analysis according to the expressions, postures and interaction data of teachers and students, and various weighting coefficients can be set during the atmosphere analysis according to the teacher's style, the courseware content and the like, so as to improve the accuracy of the atmosphere analysis.
In an alternative embodiment, as shown in fig. 4, step S3 includes:
s31, analyzing the classroom teaching video through a facial expression analysis algorithm to obtain a teacher and student classroom expression analysis result.
Emotion refers to a person's psychological experience, such as happiness, anger, sorrow, joy or fear, and reflects the person's attitude toward objective things. Emotion has positive and negative qualities: things that satisfy a person's needs produce positive experiences such as happiness and satisfaction; things that cannot satisfy a person's needs produce negative experiences such as anger, resentment and complaint; and things unrelated to a person's needs produce no particular emotion. Positive emotion can enhance a person's activity, while negative emotion reduces it. Therefore, the facial emotions expressed by teachers and students in the classroom are mostly related to the classroom atmosphere: when the classroom atmosphere is good, the facial expressions of teachers and students are rich, excited and active; when the classroom atmosphere is ordinary, their facial expressions are calm or negative.
In an alternative embodiment, each frame in the classroom teaching video is analyzed by a facial expression analysis algorithm to obtain a corresponding expression analysis result, and video frames with more active expressions are screened out.
S32, analyzing the classroom teaching video through a behavior analysis algorithm to obtain a teacher-student classroom teaching behavior analysis result.
Classroom teaching behaviors include student behaviors such as raising hands, standing up, sitting down, discussing, standing up to share, reciting, and writing on the blackboard.
S33, acquiring the spectral distribution of the sound energy of the teacher-student audio data, converting the teacher-student audio data in the classroom teaching video data into text data, and performing text semantic analysis on the text data to obtain a teacher-student interaction analysis result.
During certain periods of the classroom teaching, teacher-student interaction is strong, and richer interaction better reflects the current classroom atmosphere; therefore, obtaining the teacher-student interaction analysis result helps to find video frames with higher atmosphere scores.
S34, performing aggregation analysis on the teacher-student classroom expression analysis result, the teacher-student classroom teaching behavior analysis result and the teacher-student interaction analysis result to obtain the classroom atmosphere score, thereby realizing a multidimensional classroom atmosphere score.
S35, screening out a video frame with a higher classroom atmosphere score from the teaching video data as the video cover background image, so as to better present the classroom atmosphere.
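The following sketch illustrates, in simplified form, the aggregation in steps S31 to S35: per-frame expression, behavior and interaction scores are combined by a weighted sum into a classroom atmosphere score, and the highest-scoring frame is taken as the cover background. The three scoring functions and the weights are placeholders for the analysis algorithms referred to above, not implementations of them.

```python
# Minimal sketch of steps S34-S35 (the three per-frame scoring functions and
# the weights are illustrative placeholders, not the actual algorithms).
from typing import Callable, Sequence

def atmosphere_scores(frames: Sequence,
                      expression_score: Callable,
                      behaviour_score: Callable,
                      interaction_score: Callable,
                      weights=(0.4, 0.3, 0.3)) -> list:
    """Aggregate the multidimensional analysis results into one score per frame."""
    we, wb, wi = weights
    return [we * expression_score(f) + wb * behaviour_score(f) + wi * interaction_score(f)
            for f in frames]

def pick_cover_background(frames: Sequence, scores: Sequence):
    """Screen out the frame with the highest classroom atmosphere score."""
    best = max(range(len(scores)), key=scores.__getitem__)
    return frames[best]
```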
Step S4: combining the video cover template and the video cover background image into a classroom teaching video cover.
In an alternative embodiment, step S4 is preceded by the further step of:
and cutting the background image of the video cover to obtain a close-up picture and storing the close-up picture, and blurring the background image of the cover to embody individuation of the video cover and improve user experience.
In an alternative embodiment, as shown in fig. 5, step S4 further includes the steps of:
step S5: and carrying out classroom characteristic analysis based on the classroom teaching data to obtain a classroom analysis result, and carrying out color beautification on the classroom teaching video cover according to the classroom analysis result to obtain the target classroom teaching video cover.
And carrying out color beautification on the classroom teaching video cover according to the classroom analysis result, so that the classroom teaching video cover is more close to the classroom characteristics of the video and has individuation.
In an alternative embodiment, as shown in fig. 6, step S5 may include:
S51: performing courseware color-matching analysis and courseware style analysis based on the teaching courseware data to determine the courseware color matching and courseware style, and performing teacher teaching style analysis based on the classroom teaching video to determine the teacher teaching style.
The teaching courseware is usually made by the teaching teacher, and the color matching and style of the courseware are closely related to the teacher's preferences and represent them well; therefore, beautifying the cover colors in combination with the teacher's teaching style brings the video cover closer to the teacher's personal preferences, achieves personalization, and avoids covers that all look the same.
S52: determining a teacher color preference according to the courseware color matching, the courseware style and the teacher teaching style.
The teaching courseware may be PPT files; each PPT file presents a layout and colors whose elements were selected and made by the teaching teacher, so the style of the teaching courseware reflects the style of the teaching teacher.
S53: performing color beautification on the classroom teaching video cover according to the teacher color preference to obtain the target classroom teaching video cover, so that the generated video cover is more personalized and more attractive, and the user experience is improved.
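As an illustrative sketch of steps S51 to S53, the snippet below approximates the teacher color preference by the dominant color of a rendered courseware slide image and tints the cover toward it. The tiny k-means, the blend ratio and the use of a slide image as the color source are assumptions made for this sketch only.

```python
# Minimal sketch (assumptions: dominant slide color stands in for the teacher
# color preference, and a simple tint stands in for the color beautification).
import numpy as np
from PIL import Image

def dominant_color(image: Image.Image, k: int = 4) -> tuple:
    """Approximate the courseware color scheme with a tiny k-means over pixels."""
    pixels = np.asarray(image.convert("RGB").resize((64, 64))).reshape(-1, 3).astype(float)
    centers = pixels[np.random.choice(len(pixels), k, replace=False)]
    for _ in range(10):
        labels = np.argmin(((pixels[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        centers = np.array([pixels[labels == i].mean(axis=0) if (labels == i).any() else centers[i]
                            for i in range(k)])
    counts = np.bincount(labels, minlength=k)
    return tuple(int(c) for c in centers[counts.argmax()])

def beautify_cover(cover: Image.Image, preference_rgb: tuple, strength: float = 0.25) -> Image.Image:
    """Tint the cover toward the preferred color; strength controls the blend."""
    tint = Image.new("RGB", cover.size, preference_rgb)
    return Image.blend(cover.convert("RGB"), tint, strength)
```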
According to the embodiments of the present disclosure, the classroom theme is determined from the classroom teaching data, and the corresponding video cover template is determined based on the classroom theme; the classroom atmosphere in the classroom teaching video is then scored, and a video frame with a high classroom atmosphere score is taken from the classroom teaching video as the cover background image; finally, the video cover template and the cover background image are combined into the classroom teaching video cover. Compared with the prior art, in which a randomly extracted classroom teaching video frame is used as the video cover and the user experience is therefore poorer, the video cover obtained by this cover generation method embodies the classroom theme and the classroom atmosphere more effectively.
Further, the video covers generated by the method of the embodiments of the present disclosure are not all identical, they capture the highlight moments of the teaching video, blurry covers are avoided, the form and content of the covers are rich, and the covers are obtained efficiently.
In the embodiments of the present disclosure, the classroom teaching data comprises classroom teaching video and teaching courseware data. The data acquisition modes are described in detail below.
In an alternative embodiment, the data acquisition device of the internet of things comprises a front camera and a front microphone arranged in front of a classroom and facing a teacher, and a rear camera and a rear microphone arranged in rear of the classroom and facing students; the front camera and the rear microphone are used for collecting audio and video data of classroom behaviors of students in classrooms; the rear camera and the front microphone are used for collecting audio and video data of classroom behaviors of teachers in classrooms.
Fig. 7 is a schematic structural diagram of a classroom teaching video cover generating device according to a second embodiment of the present disclosure. The classroom teaching video cover generation device 700 includes:
the classroom theme determining module 701, configured to acquire classroom teaching data and determine a classroom theme from the classroom teaching data, wherein the classroom teaching data comprises a classroom teaching video;
the cover template determining module 702, configured to determine, in response to a classroom teaching video cover generation request, a video cover template corresponding to the classroom theme;
the cover background image determining module 703, configured to perform atmosphere analysis on each video frame of the classroom teaching video to obtain classroom atmosphere scores, and take a video frame with a high classroom atmosphere score from the classroom teaching video as the cover background image; and
the video cover synthesis module 704, configured to synthesize the video cover template and the cover background image into a classroom teaching video cover.
It should be noted that when the classroom teaching video cover generation apparatus provided in the second embodiment of the present disclosure executes the method for generating a cover of a classroom teaching video, the division into the above functional modules is merely illustrative; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules so as to complete all or part of the functions described above. In addition, the apparatus for generating a cover of a classroom teaching video provided in the second embodiment of the present disclosure belongs to the same concept as the method for generating a cover of a classroom teaching video in the first embodiment of the present disclosure; its detailed implementation is shown in the method embodiment and is not repeated here.
The classroom teaching video cover generation apparatus of the second embodiment of the present disclosure may be applied to a computer device, for example a classroom teaching video processing device, specifically for example a server, and the apparatus embodiment may be implemented by software, by hardware, or by a combination of hardware and software. Taking a software implementation as an example, the apparatus in a logical sense is formed when the processor of the computer device in which it is located reads the corresponding computer program instructions into memory and runs them. At the hardware level, the computer device in which the apparatus is located may include a processor and a memory coupled by a data bus or in another well-known manner.
Fig. 8 is a schematic structural diagram of a classroom teaching video cover generating system according to a third embodiment of the present disclosure. The system comprises: the system comprises a plurality of internet of things data acquisition devices 301 and classroom teaching video processing devices 302; a plurality of internet of things data acquisition devices 301 are arranged in a classroom; the plurality of data acquisition devices 301 of the Internet of things are connected with the classroom teaching video processing device 302.
Specifically, the plurality of internet of things data acquisition devices 301 include a front camera 3011 and a front microphone 3012 arranged in front of a classroom for students, a rear camera 3013 and a rear microphone 3014 arranged behind the classroom for teachers, an interaction integrated machine 3015 arranged in front of the classroom and located on the teacher side, an intelligent blackboard 3016 and a video display stand 3017.
The front camera 3011 is used for collecting full scene picture data of classroom behaviors of students in a classroom, and the rear microphone 3014 is used for collecting voice data of the classroom behaviors of the students in the classroom, so that audio and video data of the classroom behaviors of the students in the classroom are obtained.
Full-scene picture data of the teacher's classroom behaviors in the classroom is collected through the rear camera 3013, and voice data of the teacher's classroom behaviors in the classroom is collected through the front microphone 3012, so that audio and video data of the teacher's classroom behaviors in the classroom are obtained.
The interactive integrated machine 3015 plays the teaching courseware data uploaded by the teacher, and collects the teaching courseware data. In addition, the teaching plan data uploaded by the teacher is also acquired through the interaction integrated machine 3015.
Wherein, the intelligent blackboard 3016 is used as a blackboard writing carrier, so that the intelligent blackboard is used for collecting blackboard writing data in a classroom; specifically, the blackboard writing data on the intelligent blackboard is obtained through an infrared touch detection device arranged on the intelligent blackboard frame.
The video display stand 3017 displays students' paper assignments, such as pre-class exercises, in-class exercises and after-class exercises, and recognizes data of the paper assignments, such as names, assignment questions, knowledge points and solution ideas, through text recognition technology, thereby obtaining the assignment data.
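As a minimal sketch of the text recognition performed on the captured assignment images, the snippet below runs OCR with pytesseract. The choice of Tesseract and its simplified-Chinese language pack is an assumption for illustration; the disclosure does not specify the recognition engine.

```python
# Minimal sketch (assumptions: pytesseract with a simplified-Chinese model is
# installed; the actual recognition engine is not specified by the disclosure).
from PIL import Image
import pytesseract

def recognize_assignment(image_path: str) -> str:
    """Return the recognised text of a captured paper-assignment image."""
    return pytesseract.image_to_string(Image.open(image_path), lang="chi_sim")
```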
The classroom teaching video processing device 302 is configured to perform the related operations of the method for generating a cover of any classroom teaching video in the foregoing embodiments, and has corresponding functions and beneficial effects, which are not described herein.
Referring to fig. 9, a schematic structural diagram of an electronic device according to a third embodiment of the present disclosure is shown.
As shown in fig. 9, the electronic device 400 may be embodied as a computer, a mobile phone, a tablet, an interactive tablet, etc., and in an exemplary embodiment of the present disclosure, the electronic device 400 may include: at least one processor 401, at least one memory 402, at least one network interface 403, a user interface 404, and at least one communication bus 405.
Wherein a communication bus 405 is used to enable connected communications between these components.
The user interface 404 may include, among other things, a display screen and a camera; the user interface 404 may also include standard wired and wireless interfaces.
The network interface 403 may optionally include standard wired and wireless interfaces (e.g., WI-FI interfaces), among others.
Wherein the processor 401 may include one or more processing cores. The processor 401 connects the various parts of the entire electronic device 400 using various interfaces and lines, and performs various functions of the electronic device 400 and processes data by running or executing instructions, programs, code sets or instruction sets stored in the memory 402 and invoking data stored in the memory 402. Alternatively, the processor 401 may be implemented in at least one hardware form of digital signal processing (Digital Signal Processing, DSP), field programmable gate array (Field-Programmable Gate Array, FPGA) or programmable logic array (Programmable Logic Array, PLA). The processor 401 may integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, the application programs and the like; the GPU is used for rendering and drawing the content to be displayed on the display layer; the modem is used to handle wireless communications. It will be appreciated that the modem may also not be integrated into the processor 401 and may instead be implemented by a separate chip.
The memory 402 may include a random access memory (Random Access Memory, RAM) or a read-only memory (Read-Only Memory). Optionally, the memory 402 includes a non-transitory computer-readable storage medium. The memory 402 may be used to store instructions, programs, code sets or instruction sets. The memory 402 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function or an image playing function), instructions for implementing the above method embodiments, and the like; the data storage area may store the data referred to in the above method embodiments. The memory 402 may also optionally be at least one storage device located remotely from the processor 401. As shown in fig. 9, the memory 402, as a computer storage medium, may include an operating system, a network communication module and a user interface module.
In the electronic device 400 shown in fig. 9, the user interface 404 is mainly used as an interface for receiving user input and obtaining the data input by the user; the processor 401 may be used to invoke an application stored in the memory 402, for example a classroom analysis program, and to execute the related operations of any of the classroom analysis methods in the embodiments, with the corresponding functions and beneficial effects.
The fourth embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored; the instructions are adapted to be loaded and executed by a processor, and the specific execution process may refer to the description of the above embodiments and is not repeated here. The storage medium may be located in an electronic device such as a personal computer, a notebook computer, a smart phone or a tablet computer.
For the device embodiments, reference is made to the description of the method embodiments for the relevant points, since they essentially correspond to the method embodiments. The above-described apparatus embodiments are merely illustrative, in which components illustrated as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the disclosed solution. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
It will be apparent to those skilled in the art that embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random Access Memory (RAM) and/or nonvolatile memory, etc., such as Read Only Memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may store information by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other identical elements in the process, method, article or apparatus that comprises the element.
The foregoing is merely exemplary of the present disclosure and is not intended to limit the present disclosure. Various modifications and variations of this disclosure will be apparent to those skilled in the art. Any modifications, equivalent substitutions, improvements, or the like, which are within the spirit and principles of the present disclosure, are intended to be included within the scope of the claims of the present disclosure.

Claims (12)

1. A method for generating a cover of a classroom teaching video, characterized by comprising the following steps:
acquiring classroom teaching data, and determining a classroom theme from the classroom teaching data, wherein the classroom teaching data comprises a classroom teaching video;
in response to a classroom teaching video cover generation request, determining a video cover template corresponding to the classroom theme;
performing atmosphere analysis on each video frame of the classroom teaching video to obtain classroom atmosphere scores, and taking a video frame with a high classroom atmosphere score from the classroom teaching video as a cover background image; and
combining the video cover template and the cover background image into a classroom teaching video cover.
2. The method for generating a cover of a classroom teaching video according to claim 1, wherein: the video cover template comprises a plurality of cover constituent elements, typesetting attributes among the cover constituent elements and basic color matching of the cover template.
3. The method for generating a cover of a classroom teaching video according to claim 2, characterized in that the cover constituent elements at least comprise: a cover title, teaching teacher information and courseware content information,
wherein the cover title is extracted from the teaching courseware data;
the teaching teacher information is extracted from the teaching courseware data, or teacher face features are obtained through portrait analysis of the classroom teaching video and the teaching teacher information corresponding to the teacher face features is determined based on the association between teacher face features and teaching teacher information; and
the courseware content information is extracted from the teaching courseware data.
4. The method for generating a cover of a classroom teaching video according to claim 1, characterized in that the atmosphere analysis comprises atmosphere analysis according to the expressions, postures and interaction data of teachers and students,
and performing atmosphere analysis on each video frame of the classroom teaching video to obtain the classroom atmosphere score comprises the following steps:
analyzing the classroom teaching video through a facial expression analysis algorithm to obtain a teacher-student classroom expression analysis result;
analyzing the classroom teaching video through a behavior analysis algorithm to obtain a teacher-student classroom teaching behavior analysis result;
acquiring the spectral distribution of the sound energy of teacher-student audio data, converting the teacher-student audio data in the classroom teaching video data into text data, and performing text semantic analysis on the text data to obtain a teacher-student interaction analysis result;
performing aggregation analysis on the teacher-student classroom expression analysis result, the teacher-student classroom teaching behavior analysis result and the teacher-student interaction analysis result to obtain the classroom atmosphere score; and
screening out a video frame with a higher classroom atmosphere score from the teaching video data as the video cover background image.
5. The method for generating a cover for a classroom teaching video according to claim 4, wherein: before the step of combining the video cover template and the video cover background image into the classroom teaching video cover, the method further comprises the following steps:
and cutting the background image of the video cover to obtain a close-up picture and storing the close-up picture.
6. The cover generation method of classroom teaching video according to any one of claims 1 to 5, characterized in that: the method also comprises the steps of:
and carrying out classroom characteristic analysis based on the classroom teaching data to obtain a classroom analysis result, and carrying out color beautification on the classroom teaching video cover according to the classroom analysis result to obtain the target classroom teaching video cover.
7. The method for generating a cover for a classroom teaching video according to claim 6, wherein: the classroom characteristic analysis comprises courseware color matching analysis, courseware style analysis and teacher teaching style analysis;
performing classroom characteristic analysis based on the classroom teaching data to obtain the classroom analysis result, and performing color beautification on the classroom teaching video cover according to the classroom analysis result to obtain the target classroom teaching video cover, comprises the following steps:
performing courseware color-matching analysis and courseware style analysis based on the teaching courseware data to determine the courseware color matching and courseware style, and performing teacher teaching style analysis based on the classroom teaching video to determine the teacher teaching style;
determining a teacher color preference according to the courseware color matching, the courseware style and the teacher teaching style; and
performing color beautification on the classroom teaching video cover according to the teacher color preference to obtain the target classroom teaching video cover.
8. The method for generating a cover of a classroom teaching video according to claim 1, wherein: the classroom teaching video is acquired by the Internet of things data acquisition device in the teaching time,
the data acquisition equipment of the Internet of things comprises a front camera and a front microphone which are arranged in front of a classroom and facing a teacher, and a rear camera and a rear microphone which are arranged in rear of the classroom and facing students;
the front camera and the rear microphone are used for collecting audio and video data of classroom behaviors of students in classrooms; the rear camera and the front microphone are used for collecting audio and video data of classroom behaviors of teachers in classrooms.
9. The method for generating a cover of a classroom teaching video according to claim 1, wherein: the classroom teaching data also comprises teaching courseware data, and the classroom theme is determined from the classroom teaching data, and the method comprises the following steps:
and determining the class theme from the teaching courseware data of the class teaching data.
10. A classroom teaching video cover generation apparatus, characterized by comprising:
a classroom theme determining module, used for acquiring classroom teaching data and determining a classroom theme from the classroom teaching data, wherein the classroom teaching data comprises a classroom teaching video;
a cover template determining module, used for determining, in response to a classroom teaching video cover generation request, a video cover template corresponding to the classroom theme;
a cover background image determining module, used for performing atmosphere analysis on each video frame of the classroom teaching video to obtain classroom atmosphere scores, and taking a video frame with a high classroom atmosphere score from the classroom teaching video as a cover background image; and
a video cover synthesis module, used for synthesizing the video cover template and the cover background image into a classroom teaching video cover.
11. An electronic device, comprising a processor and a memory, wherein a computer program is stored in the memory and is adapted to be loaded by the processor to execute the method for generating a cover of a classroom teaching video according to any one of claims 1 to 9.
12. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method for generating a cover of a classroom teaching video according to any one of claims 1 to 9.
CN202211078128.0A 2022-09-05 2022-09-05 Method, device and storage medium for generating covers of classroom teaching video Pending CN117692727A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211078128.0A CN117692727A (en) 2022-09-05 2022-09-05 Method, device and storage medium for generating covers of classroom teaching video

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211078128.0A CN117692727A (en) 2022-09-05 2022-09-05 Method, device and storage medium for generating covers of classroom teaching video

Publications (1)

Publication Number Publication Date
CN117692727A true CN117692727A (en) 2024-03-12

Family

ID=90137780

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211078128.0A Pending CN117692727A (en) 2022-09-05 2022-09-05 Method, device and storage medium for generating covers of classroom teaching video

Country Status (1)

Country Link
CN (1) CN117692727A (en)

Similar Documents

Publication Publication Date Title
US11151892B2 (en) Internet teaching platform-based following teaching system
CN106485964B (en) A kind of recording of classroom instruction and the method and system of program request
CN109801194B (en) Follow-up teaching method with remote evaluation function
US20200286396A1 (en) Following teaching system having voice evaluation function
CN107992195A (en) A kind of processing method of the content of courses, device, server and storage medium
US11094215B2 (en) Internet-based recorded course learning following system and method
CN104021326B (en) A kind of Teaching Methods and foreign language teaching aid
CN110491218A (en) A kind of online teaching exchange method, device, storage medium and electronic equipment
CN110677685B (en) Network live broadcast display method and device
CN110287947A (en) Interaction classroom in interaction classroom determines method and device
CN114339285B (en) Knowledge point processing method, video processing method, device and electronic equipment
KR101858204B1 (en) Method and apparatus for generating interactive multimedia contents
CN110427349A (en) A kind of educational information shared platform and information sharing method
CN108847066A (en) A kind of content of courses reminding method, device, server and storage medium
CN110046290B (en) Personalized autonomous teaching course system
CN112165627A (en) Information processing method, device, storage medium, terminal and system
CN113963306B (en) Courseware title making method and device based on artificial intelligence
CN117692727A (en) Method, device and storage medium for generating covers of classroom teaching video
CN114913042A (en) Teaching courseware generation method and device, electronic equipment and storage medium
CN109191958A (en) Exchange method, device, terminal and the storage medium of information
KR20090097484A (en) System and method for language training using multi expression sentence image, recording medium and language study book therefor
KR20120027647A (en) Learning contents generating system and method thereof
Godwin-Jones Technology-mediated SLAEvolving Trends and Emerging Technologies
CN107154173B (en) Language learning method and system
KR102568378B1 (en) Apparatus for Learning Service of Foreign Language and Driving Method Thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination