CN113115037B - Online education method, system, equipment and storage medium - Google Patents

Online education method, system, equipment and storage medium Download PDF

Info

Publication number
CN113115037B
CN113115037B CN202110660037.7A CN202110660037A CN113115037B CN 113115037 B CN113115037 B CN 113115037B CN 202110660037 A CN202110660037 A CN 202110660037A CN 113115037 B CN113115037 B CN 113115037B
Authority
CN
China
Prior art keywords
frame
image
video
value
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110660037.7A
Other languages
Chinese (zh)
Other versions
CN113115037A (en
Inventor
高德平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Zhongpeng Education Technology Co ltd
Original Assignee
Shenzhen Zhongpeng Education Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Zhongpeng Education Technology Co ltd filed Critical Shenzhen Zhongpeng Education Technology Co ltd
Priority to CN202110660037.7A priority Critical patent/CN113115037B/en
Publication of CN113115037A publication Critical patent/CN113115037A/en
Application granted granted Critical
Publication of CN113115037B publication Critical patent/CN113115037B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136Incoming video signal characteristics or properties
    • H04N19/137Motion inside a coding unit, e.g. average field, frame or block difference
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction

Abstract

The application provides an online education method, a system, equipment and a storage medium, comprising the following steps: the camera collects the teaching video streaming data of the teacher in real time; calculating a difference matrix Hn of the nth frame and the previous frame of video image in real time from the second frame; traversing the matrix Hn; selecting RGB video image information corresponding to the region position information set S1 of the nth frame image as a data subset set S2; taking the RGB video image information of which the degree of change of the nth frame video image is greater than the threshold value W as a data subset set S3; MPS encoding is performed on the data subset set S3, and the obtained structured data is transmitted to the user end. The invention realizes the data transmission only aiming at the area where the video image changes remarkably, but not reserving the original image in the area where the video image changes remarkably, greatly reduces the data transmission amount, remarkably improves the data transmission efficiency, and greatly accelerates the image data selection speed by the calculation mode of the difference matrix to the video frame image area.

Description

Online education method, system, equipment and storage medium
Technical Field
The application relates to the technical field of computer vision, in particular to an online education method, a system, equipment and a storage medium.
Background
At present, along with the continuous increase of the scale of the short video or the real-time video of the online education, higher requirements are provided for the data transmission bandwidth, and the real-time video or the short video can enable a user to carry out the training of related courses at any time and any place in real time, so the video playing fluency degree of the courses is closely related to the network transmission speed and the transmission data volume, and especially during epidemic situations, the online education has greater popularization in the aspect of remote education and plays an important role in remote learning.
However, the traditional online education has low video data transmission efficiency, large video data transmission quantity and higher cost, and cannot meet the requirement of more students; the fast, convenient and efficient transmission of remote video information is necessary at present. In the traditional video transmission, all data are transmitted, and the structural characteristics of the transmitted data cannot be effectively utilized; therefore, a method capable of real-time, fast and reducing the transmission of invalid data volume is an urgent need, thereby improving the user experience.
Disclosure of Invention
In view of the questionThe present invention has been made to provide an online education method, system, device and storage medium that overcome or at least partially solve the above-mentioned problems, by transmitting only image data for a significantly varying area without leaving an original image for the significantly varying area according to the structural characteristics of a course video, the amount of transmission data is greatly reduced, and data transmission efficiency is significantly improved; the image data selection speed is greatly accelerated by the calculation mode of the difference degree matrix to the video frame image area; the image change degree calculation mode can remarkably improve the selection rate of the significant region and greatly reduce image distortion; a method of online education comprising the steps of: the camera collects the teaching video streaming data of the teacher in real time; calculating the difference matrix H of the nth frame and the previous frame in real time from the second framen
Figure 710173DEST_PATH_IMAGE001
Wherein the content of the first and second substances,
Figure 91476DEST_PATH_IMAGE002
a difference between R value matrices of R channels representing video RGB images of the n-th frame and the n-1 th frame,
Figure 727994DEST_PATH_IMAGE003
a difference between a matrix of G values representing G channels of video RGB images of the n-th frame and the n-1 st frame,
Figure 5391DEST_PATH_IMAGE004
a difference between the B value matrices of B channels representing the video RGB images of the n-th frame and the n-1 st frame,
Figure 97500DEST_PATH_IMAGE005
Figure 751335DEST_PATH_IMAGE006
Figure 976780DEST_PATH_IMAGE007
matrix proportionality coefficients of an R channel, a G channel and a B channel are respectively;
traverse matrix HnAn area position information set S1 in which the element in the recording matrix is not equal to 0;
selecting RGB video image information corresponding to the region position information set S1 of the nth frame image as a data subset set S2, calculating the image change degree of each connected region set in the data subset set S2, and the image change degree W of the connected region of the kth data subsetkComprises the following steps:
Figure 425079DEST_PATH_IMAGE008
wherein the content of the first and second substances,
Figure 532712DEST_PATH_IMAGE009
representing the difference between the mean gray levels of the connected regions of the kth data subset of the n-th frame and the n-1 frame of video,
Figure 459080DEST_PATH_IMAGE010
representing the difference between the image steepness of the connected region of the kth data subset of the n-th frame and the n-1 frame video,
Figure 335769DEST_PATH_IMAGE011
representing the proportion of the k-th data subset connected region in the whole video image, h is a set threshold value, if
Figure 689390DEST_PATH_IMAGE012
Then, then
Figure 487582DEST_PATH_IMAGE013
(ii) a If it is
Figure 952061DEST_PATH_IMAGE014
Then, then
Figure 886519DEST_PATH_IMAGE015
(ii) a The image steepness is to extract pixel points with the lowest gray value and the highest gray value in each image block, and is respectively T1 and T2;the steepness is L = (T2-T1)/q; wherein q is the number of interval pixels of the pixel points with the lowest gray value and the highest gray value in the image block;
Figure 207779DEST_PATH_IMAGE016
the gray level mean value of n data subset connected regions;
Figure 507915DEST_PATH_IMAGE017
the mean value of the image steepness of the n data subset connected regions is obtained;
taking the RGB video image information of which the degree of change of the nth frame video image is greater than the threshold value W as a data subset set S3;
MPS is performed on the data subset set S3, and the obtained structured data is transmitted to the user end, where the MPS is a model data storage format and a transmission format for expressing the linear optimization model.
Preferably, the step of obtaining the mean gray level of the connected region includes performing gray processing on the images of the connected region, and selecting a gray level threshold according to the maximum inter-class variance method OSTU.
Preferably, the step of obtaining the mean gray level of the connected region includes graying the video image by a maximum method, and taking the maximum value of the three-component brightness in the color image as the gray level value of the gray level map.
Preferably, the method further comprises preprocessing the video image before obtaining the gray level mean value of the connected region, and filtering and denoising the video image information.
Also disclosed is an online education system including:
the acquisition module is used for acquiring teaching video streaming data of the teacher in real time by the camera; calculating the difference matrix H of the nth frame and the previous frame in real time from the second framen
Figure 307244DEST_PATH_IMAGE018
Wherein the content of the first and second substances,
Figure 627367DEST_PATH_IMAGE019
a difference between R value matrices of R channels representing video RGB images of the n-th frame and the n-1 th frame,
Figure 322791DEST_PATH_IMAGE020
a difference between a matrix of G values representing G channels of video RGB images of the n-th frame and the n-1 st frame,
Figure 95574DEST_PATH_IMAGE021
a difference between the B value matrices of B channels representing the video RGB images of the n-th frame and the n-1 st frame,
Figure 433015DEST_PATH_IMAGE005
Figure 607644DEST_PATH_IMAGE022
Figure 270707DEST_PATH_IMAGE007
matrix proportionality coefficients of an R channel, a G channel and a B channel are respectively;
a traversing module for traversing the matrix HnAn area position information set S1 in which the element in the recording matrix is not equal to 0;
a region initial selection module, configured to select RGB video image information corresponding to the region position information set S1 of the nth frame of image as a data subset set S2, calculate an image change degree of each connected region set in the data subset set S2, and calculate an image change degree W of a connected region of the kth data subsetkComprises the following steps:
Figure 796366DEST_PATH_IMAGE023
wherein the content of the first and second substances,
Figure 671918DEST_PATH_IMAGE024
representing the difference between the mean gray levels of the connected regions of the kth data subset of the n-th frame and the n-1 frame of video,
Figure 701054DEST_PATH_IMAGE025
representing the difference between the image steepness of the connected region of the kth data subset of the n-th frame and the n-1 frame video,
Figure 472701DEST_PATH_IMAGE026
representing the proportion of the k-th data subset connected region in the whole video image, h is a set threshold value, if
Figure 488586DEST_PATH_IMAGE027
Then, then
Figure 167829DEST_PATH_IMAGE028
(ii) a If it is
Figure 317051DEST_PATH_IMAGE029
Then, then
Figure 790757DEST_PATH_IMAGE030
(ii) a The image steepness is to extract pixel points with the lowest gray value and the highest gray value in each image block, and is respectively T1 and T2; the steepness is L = (T2-T1)/q; wherein q is the number of interval pixels of the pixel points with the lowest gray value and the highest gray value in the image block;
Figure 759850DEST_PATH_IMAGE016
the gray level mean value of n data subset connected regions;
Figure 242784DEST_PATH_IMAGE017
the mean value of the image steepness of the n data subset connected regions is obtained;
the area selection module is used for taking the RGB video image information of which the variation degree of the nth frame video image is greater than the threshold value W as a data subset set S3;
and the coding transmission module is used for performing MPS coding on the data subset set S3 to obtain an MPS data packet, analyzing the MPS data packet according to a preset data structure to obtain structured data which accords with optimization processing, and transmitting the obtained structured data to a user side, wherein the MPS is a model data storage format and a transmission format which are used for expressing a linear optimization model.
Preferably, the method further comprises the following steps: and the graying module is used for performing graying processing on the images of the connected regions and selecting a grayscale threshold value according to the maximum inter-class variance method OSTU.
Preferably, the graying module is further configured to perform graying on the video image by using a maximum value method, and use the maximum value of the three-component brightness in the color image as the grayscale value of the grayscale image.
Preferably, the method further comprises the following steps: and the preprocessing module is used for preprocessing the video image and filtering and denoising the video image information before acquiring the gray average value of the connected region.
An apparatus comprising a processor, a memory and a computer program stored on the memory and capable of running on the processor, the computer program when executed by the processor implementing the steps of the online education method as described above.
A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, carries out the steps of the method of online education as described above.
The application has the following advantages:
the application relates to an online education method and system, which calculates a difference matrix H of an nth frame and a previous frame of video image in real time from a second framen(ii) a Traverse matrix HnAn area position information set S1 in which the element in the recording matrix is not equal to 0; selecting RGB video image information corresponding to the region position information set S1 of the nth frame image as a data subset set S2; taking the RGB video image information of which the degree of change of the nth frame video image is greater than the threshold value W as a data subset set S3; MPS encoding is performed on the data subset set S3, and the obtained structured data is transmitted to the user end. The invention realizes the data transmission only aiming at the area where the video image has obvious change, and the original image is reserved in the area where the video image has no obvious change, thereby greatly reducing the data transmission amount and obviously improving the data transmission efficiency.
Particularly, the image data selection speed is greatly increased by the calculation mode of the difference degree matrix to the video frame image area; the image change degree calculation mode can remarkably improve the selection rate of the salient region and greatly reduce image distortion.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings needed to be used in the description of the present application will be briefly introduced below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive labor.
FIG. 1 is a flow chart of a method of online education provided by an embodiment of the present application;
fig. 2 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
As understood by the technical personnel in the field, as for the background technology, the traditional online education has low video data transmission efficiency, large video data transmission quantity and higher cost, and can not meet the requirement of more students; the fast, convenient and efficient transmission of remote video information is necessary at present. In the traditional video transmission, all data are transmitted, and the structural characteristics of the transmitted data cannot be effectively utilized; therefore, a method capable of real-time, fast and reducing the transmission of invalid data volume is an urgent need, thereby improving the user experience. In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Example 1:
the invention provides an online education method, a system, equipment and a storage medium, according to the structural characteristics of a course video, the invention only transmits the image data of a significant change area, and the original image is reserved in the area without significant change, thereby greatly reducing the data transmission amount and remarkably improving the data transmission efficiency;
referring to fig. 1, a flowchart of an online education method provided by an embodiment of the present application is shown, including the steps of:
s100, collecting teaching video streaming data of a teacher in real time by a camera;
s200, calculating a difference matrix H of the nth frame and the previous frame of video image in real time from the second framen
Figure 980933DEST_PATH_IMAGE031
Wherein the content of the first and second substances,
Figure 891120DEST_PATH_IMAGE032
a difference between R value matrices of R channels representing video RGB images of the n-th frame and the n-1 th frame,
Figure 613089DEST_PATH_IMAGE033
a difference between a matrix of G values representing G channels of video RGB images of the n-th frame and the n-1 st frame,
Figure 368555DEST_PATH_IMAGE034
a difference between the B value matrices of B channels representing the video RGB images of the n-th frame and the n-1 st frame,
Figure 961210DEST_PATH_IMAGE005
Figure 776720DEST_PATH_IMAGE022
Figure 454826DEST_PATH_IMAGE007
matrix proportionality coefficients of an R channel, a G channel and a B channel are respectively;
s300, traversing matrix HnAn area position information set S1 in which the element in the recording matrix is not equal to 0;
s400, selecting RGB video image information corresponding to the region position information set S1 of the nth frame image as a data subset set S2, calculating the image change degree of each connected region set in the data subset set S2, and calculating the image change degree W of the connected region of the kth data subsetkComprises the following steps:
Figure 13983DEST_PATH_IMAGE035
wherein the content of the first and second substances,
Figure 729654DEST_PATH_IMAGE036
representing the difference between the mean gray levels of the connected regions of the kth data subset of the n-th frame and the n-1 frame of video,
Figure 450485DEST_PATH_IMAGE037
representing the difference between the image steepness of the connected region of the kth data subset of the n-th frame and the n-1 frame video,
Figure 881466DEST_PATH_IMAGE038
representing the proportion of the k-th data subset connected region in the whole video image, h is a set threshold value, if
Figure 978735DEST_PATH_IMAGE039
Then, then
Figure 545983DEST_PATH_IMAGE040
(ii) a If it is
Figure 703295DEST_PATH_IMAGE041
Then, then
Figure 355993DEST_PATH_IMAGE042
(ii) a The image steepness is to extract pixel points with the lowest gray value and the highest gray value in each image block, and is respectively T1 and T2; the steepness is L = (T2-T1)/q; wherein q is the number of interval pixels of the pixel points with the lowest gray value and the highest gray value in the image block;
Figure 991374DEST_PATH_IMAGE043
the gray level mean value of n data subset connected regions;
Figure 678707DEST_PATH_IMAGE044
the mean value of the image steepness of the n data subset connected regions is obtained;
s500, taking the RGB video image information with the n frame video image change degree larger than a threshold value W as a data subset set S3;
s600, performing MPS coding on the data subset set S3, and transmitting the obtained structured data to the user end, where MPS is a model data storage format and a transmission format for expressing a linear optimization model.
In an embodiment of the present application, the step of obtaining the mean gray level of the connected region includes performing graying processing on the image of the connected region, and selecting a gray level threshold according to the maximum inter-class variance method OSTU.
In an embodiment of the present application, the obtaining of the mean grayscale value of the connected region includes graying the video image by using a maximum method, and taking a maximum value of three-component luminance in the color image as a grayscale value of the grayscale image.
In an embodiment of the present application, the preprocessing of the video image and the filtering and denoising of the video image information are further included before the obtaining of the gray level mean value of the connected region.
Example 2:
the invention also discloses an online education system, comprising:
the acquisition module is used for acquiring teaching video streaming data of the teacher in real time by the camera; calculating the difference matrix H of the nth frame and the previous frame in real time from the second framen
Figure 741341DEST_PATH_IMAGE045
Wherein the content of the first and second substances,
Figure 881335DEST_PATH_IMAGE046
to representThe difference between the R value matrix of the R channels of the video RGB image of the n-th frame and the n-1 st frame,
Figure 54828DEST_PATH_IMAGE047
a difference between a matrix of G values representing G channels of video RGB images of the n-th frame and the n-1 st frame,
Figure 596667DEST_PATH_IMAGE048
a difference between the B value matrices of B channels representing the video RGB images of the n-th frame and the n-1 st frame,
Figure 95782DEST_PATH_IMAGE005
Figure 457493DEST_PATH_IMAGE022
Figure 700256DEST_PATH_IMAGE007
matrix proportionality coefficients of an R channel, a G channel and a B channel are respectively;
a traversing module for traversing the matrix HnAn area position information set S1 in which the element in the recording matrix is not equal to 0;
a region initial selection module, configured to select RGB video image information corresponding to the region position information set S1 of the nth frame of image as a data subset set S2, calculate an image change degree of each connected region set in the data subset set S2, and calculate an image change degree W of a connected region of the kth data subsetkComprises the following steps:
Figure 96602DEST_PATH_IMAGE049
wherein the content of the first and second substances,
Figure 766618DEST_PATH_IMAGE050
representing the difference between the mean gray levels of the connected regions of the kth data subset of the n-th frame and the n-1 frame of video,
Figure 350046DEST_PATH_IMAGE051
representing the k data subset connected region of the n frame and the n-1 frame videoThe difference in the steepness of the image of (2),
Figure 399429DEST_PATH_IMAGE052
representing the proportion of the k-th data subset connected region in the whole video image, h is a set threshold value, if
Figure 915861DEST_PATH_IMAGE053
Then, then
Figure 491199DEST_PATH_IMAGE054
(ii) a If it is
Figure 827502DEST_PATH_IMAGE055
Then, then
Figure 146488DEST_PATH_IMAGE056
(ii) a The image steepness is to extract pixel points with the lowest gray value and the highest gray value in each image block, and is respectively T1 and T2; the steepness is L = (T2-T1)/q; wherein q is the number of interval pixels of the pixel points with the lowest gray value and the highest gray value in the image block;
Figure 517426DEST_PATH_IMAGE057
the gray level mean value of n data subset connected regions;
Figure 263666DEST_PATH_IMAGE058
the mean value of the image steepness of the n data subset connected regions is obtained;
the area selection module is used for taking the RGB video image information of which the variation degree of the nth frame video image is greater than the threshold value W as a data subset set S3;
and the coding transmission module is used for performing MPS coding on the data subset set S3 to obtain an MPS data packet, analyzing the MPS data packet according to a preset data structure to obtain structured data which accords with optimization processing, and transmitting the obtained structured data to a user side, wherein the MPS is a model data storage format and a transmission format which are used for expressing a linear optimization model.
In an embodiment of the present application, the method further includes: and the graying module is used for performing graying processing on the images of the connected regions and selecting a grayscale threshold value according to the maximum inter-class variance method OSTU.
In an embodiment of the application, the graying module is further configured to perform graying on the video image by using a maximum value method, and use a maximum value of three-component brightness in the color image as a grayscale value of the grayscale image.
In an embodiment of the present application, the method further includes: and the preprocessing module is used for preprocessing the video image and filtering and denoising the video image information before acquiring the gray average value of the connected region.
Compared with the prior art, the technical scheme of the invention has the following beneficial effects:
the application provides an online education method, system, device and storage medium, which calculates the difference degree matrix H of the nth frame and the previous frame of video image in real time from the second framen(ii) a Traverse matrix HnAn area position information set S1 in which the element in the recording matrix is not equal to 0; selecting RGB video image information corresponding to the region position information set S1 of the nth frame image as a data subset set S2; taking the RGB video image information of which the degree of change of the nth frame video image is greater than the threshold value W as a data subset set S3; MPS encoding is performed on the data subset set S3, and the obtained structured data is transmitted to the user end. The invention realizes the data transmission only aiming at the area where the video image has obvious change, and the original image is reserved in the area where the video image has no obvious change, thereby greatly reducing the data transmission amount and obviously improving the data transmission efficiency.
Particularly, the image data selection speed is greatly increased by the calculation mode of the difference degree matrix to the video frame image area; the image change degree calculation mode can remarkably improve the selection rate of the salient region and greatly reduce image distortion.
Example 3:
referring to fig. 2, a computer device of an online education method of the present application is shown, which may specifically include the following:
the computer device 12 described above is embodied in the form of a general purpose computing device, and the components of the computer device 12 may include, but are not limited to: one or more processors or processing units 16, a memory 28, and a bus 18 that couples various system components including the memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus 18 structures, including a memory bus 18 or memory controller, a peripheral bus 18, an accelerated graphics port, and a processor or local bus 18 using any of a variety of bus 18 architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus 18, micro-channel architecture (MAC) bus 18, enhanced ISA bus 18, audio Video Electronics Standards Association (VESA) local bus 18, and Peripheral Component Interconnect (PCI) bus 18.
Computer device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The memory 28 may include computer system readable media in the form of volatile memory, such as random access memory 30 and/or cache memory 32. Computer device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (commonly referred to as "hard drives"). Although not shown in FIG. 2, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. The memory may include at least one program product having a set (e.g., at least one) of program modules 42, with the program modules 42 configured to carry out the functions of embodiments of the application.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules 42, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally perform the functions and/or methodologies of the embodiments described herein.
Computer device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, camera, etc.), with one or more devices that enable an operator to interact with computer device 12, and/or with any devices (e.g., network card, modem, etc.) that enable computer device 12 to communicate with one or more other computing devices. Such communication may be through the I/O interface 22. Also, computer device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN)), a Wide Area Network (WAN), and/or a public network (e.g., the Internet) via network adapter 20. As shown in FIG. 2, the network adapter 20 communicates with the other modules of the computer device 12 via the bus 18. It should be appreciated that although not shown in FIG. 2, other hardware and/or software modules may be used in conjunction with computer device 12, including but not limited to: microcode, device drivers, redundant processing units 16, external disk drive arrays, RAID systems, tape drives, and data backup storage systems 34, etc.
The processing unit 16 executes various functional applications and data processing by executing programs stored in the memory 28, for example, to implement an online education method provided by the embodiment of the present application.
That is, the processing unit 16 implements, when executing the program,: a method of online education comprising the steps of: the camera collects the teaching video streaming data of the teacher in real time; calculating the difference matrix H of the nth frame and the previous frame in real time from the second framen
Figure 352844DEST_PATH_IMAGE059
Wherein the content of the first and second substances,
Figure 475521DEST_PATH_IMAGE060
a difference between R value matrices of R channels representing video RGB images of the n-th frame and the n-1 th frame,
Figure 700966DEST_PATH_IMAGE047
a difference between a matrix of G values representing G channels of video RGB images of the n-th frame and the n-1 st frame,
Figure 883686DEST_PATH_IMAGE061
a difference between the B value matrices of B channels representing the video RGB images of the n-th frame and the n-1 st frame,
Figure 929002DEST_PATH_IMAGE005
Figure 855370DEST_PATH_IMAGE022
Figure 935321DEST_PATH_IMAGE007
matrix proportionality coefficients of an R channel, a G channel and a B channel are respectively;
traverse matrix HnAn area position information set S1 in which the element in the recording matrix is not equal to 0;
selecting RGB video image information corresponding to the region position information set S1 of the nth frame image as a data subset set S2, calculating the image change degree of each connected region set in the data subset set S2, and the image change degree W of the connected region of the kth data subsetkComprises the following steps:
Figure 288942DEST_PATH_IMAGE062
wherein the content of the first and second substances,
Figure 87134DEST_PATH_IMAGE063
representing the difference between the mean gray levels of the connected regions of the kth data subset of the n-th frame and the n-1 frame of video,
Figure 551613DEST_PATH_IMAGE064
representing the difference between the image steepness of the connected region of the kth data subset of the n-th frame and the n-1 frame video,
Figure 751651DEST_PATH_IMAGE065
representing the proportion of the k-th data subset connected region in the whole video image, h is a set threshold value, if
Figure 276173DEST_PATH_IMAGE066
Then, then
Figure 296081DEST_PATH_IMAGE067
(ii) a If it is
Figure 821042DEST_PATH_IMAGE068
Then, then
Figure 875585DEST_PATH_IMAGE069
(ii) a The image steepness is to extract pixel points with the lowest gray value and the highest gray value in each image block, and is respectively T1 and T2; the steepness is L = (T2-T1)/q; wherein q is the number of interval pixels of the pixel points with the lowest gray value and the highest gray value in the image block;
Figure 571009DEST_PATH_IMAGE070
the gray level mean value of n data subset connected regions;
Figure 343793DEST_PATH_IMAGE071
the mean value of the image steepness of the n data subset connected regions is obtained;
taking the RGB video image information of which the degree of change of the nth frame video image is greater than the threshold value W as a data subset set S3;
MPS is performed on the data subset set S3, and the obtained structured data is transmitted to the user end, where the MPS is a model data storage format and a transmission format for expressing the linear optimization model.
In an embodiment of the present application, there is also provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements an online education method as provided in all embodiments of the present application.
That is, the program when executed by the processor implements: a method of online education comprising the steps of: the camera collects the teaching video streaming data of the teacher in real time; calculating the difference matrix H of the nth frame and the previous frame in real time from the second framen
Figure 150075DEST_PATH_IMAGE072
Wherein the content of the first and second substances,
Figure 59125DEST_PATH_IMAGE073
a difference between R value matrices of R channels representing video RGB images of the n-th frame and the n-1 th frame,
Figure 191029DEST_PATH_IMAGE074
a difference between a matrix of G values representing G channels of video RGB images of the n-th frame and the n-1 st frame,
Figure 185530DEST_PATH_IMAGE075
a difference between the B value matrices of B channels representing the video RGB images of the n-th frame and the n-1 st frame,
Figure 61082DEST_PATH_IMAGE005
Figure 90218DEST_PATH_IMAGE022
Figure 393023DEST_PATH_IMAGE007
matrix proportionality coefficients of an R channel, a G channel and a B channel are respectively;
traverse matrix HnAn area position information set S1 in which the element in the recording matrix is not equal to 0;
selecting RGB video image information corresponding to the region position information set S1 of the nth frame image as a data subset set S2, and calculating each connected region set in the data subset set S2The degree of image change of the k-th data subset connected region WkComprises the following steps:
Figure 140400DEST_PATH_IMAGE076
wherein the content of the first and second substances,
Figure 554063DEST_PATH_IMAGE077
representing the difference between the mean gray levels of the connected regions of the kth data subset of the n-th frame and the n-1 frame of video,
Figure 437706DEST_PATH_IMAGE078
representing the difference between the image steepness of the connected region of the kth data subset of the n-th frame and the n-1 frame video,
Figure 911412DEST_PATH_IMAGE079
representing the proportion of the k-th data subset connected region in the whole video image, h is a set threshold value, if
Figure 880505DEST_PATH_IMAGE053
Then, then
Figure 100790DEST_PATH_IMAGE080
(ii) a If it is
Figure 838939DEST_PATH_IMAGE081
Then, then
Figure 483547DEST_PATH_IMAGE082
(ii) a The image steepness is to extract pixel points with the lowest gray value and the highest gray value in each image block, and is respectively T1 and T2; the steepness is L = (T2-T1)/q; wherein q is the number of interval pixels of the pixel points with the lowest gray value and the highest gray value in the image block;
Figure 674357DEST_PATH_IMAGE070
the gray level mean value of n data subset connected regions;
Figure 429823DEST_PATH_IMAGE071
the mean value of the image steepness of the n data subset connected regions is obtained;
taking the RGB video image information of which the degree of change of the nth frame video image is greater than the threshold value W as a data subset set S3;
MPS is performed on the data subset set S3, and the obtained structured data is transmitted to the user end, where the MPS is a model data storage format and a transmission format for expressing the linear optimization model.
Any combination of one or more computer-readable media may be employed. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C + + or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the operator's computer, partly on the operator's computer, as a stand-alone software package, partly on the operator's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the operator's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
While preferred embodiments of the present application have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the true scope of the embodiments of the application.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The method, system, device and storage medium for online education provided by the present application are described in detail above, and the principle and implementation of the present application are explained herein by applying specific examples, and the description of the above examples is only used to help understand the method and core ideas of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. An online education method, comprising the steps of: the camera collects the teaching video streaming data of the teacher in real time; calculating the difference matrix of the nth frame and the previous frame in real time from the second frame
Figure 131293DEST_PATH_IMAGE001
Figure 940986DEST_PATH_IMAGE002
Wherein the content of the first and second substances,
Figure 292333DEST_PATH_IMAGE003
a difference between R value matrices of R channels representing video RGB images of the n-th frame and the n-1 th frame,
Figure 238292DEST_PATH_IMAGE004
a difference between a matrix of G values representing G channels of video RGB images of the n-th frame and the n-1 st frame,
Figure 649682DEST_PATH_IMAGE005
a difference between the B value matrices of B channels representing the video RGB images of the n-th frame and the n-1 st frame,
Figure 684022DEST_PATH_IMAGE006
Figure 573480DEST_PATH_IMAGE007
Figure 639525DEST_PATH_IMAGE008
matrix proportionality coefficients of an R channel, a G channel and a B channel are respectively;
traversal matrix
Figure 221816DEST_PATH_IMAGE009
An area position information set S1 in which the element in the recording matrix is not equal to 0;
selecting RGB video image information corresponding to the region position information set S1 of the nth frame image as a data subset set S2, calculating the image change degree of each connected region set in the data subset set S2, and the image change degree of the connected region of the kth data subset
Figure 740522DEST_PATH_IMAGE010
Comprises the following steps:
Figure 168093DEST_PATH_IMAGE011
wherein the content of the first and second substances,
Figure 885382DEST_PATH_IMAGE012
representing the difference between the mean gray levels of the connected regions of the kth data subset of the n-th frame and the n-1 frame of video,
Figure 904153DEST_PATH_IMAGE013
representing the difference between the image steepness of the connected region of the kth data subset of the n-th frame and the n-1 frame video,
Figure 910156DEST_PATH_IMAGE014
is shown ask data subset connected regions account for the proportion of the whole video image, h is a set threshold value, if yes, the connected regions are connected with the video image
Figure 406996DEST_PATH_IMAGE015
Then, then
Figure 182054DEST_PATH_IMAGE016
(ii) a If it is
Figure 840568DEST_PATH_IMAGE017
Then, then
Figure 333867DEST_PATH_IMAGE018
(ii) a The image steepness is to extract pixel points with the lowest gray value and the highest gray value in each image block, and is respectively T1 and T2; the steepness is L = (T2-T1)/q; wherein q is the number of interval pixels of the pixel points with the lowest gray value and the highest gray value in the image block;
Figure 368819DEST_PATH_IMAGE019
is the average of the gray levels of all connected regions of the nth frame,
Figure 284470DEST_PATH_IMAGE020
the mean value of the image steepness of all the connected areas of the nth frame is obtained;
taking the RGB video image information of which the degree of change of the nth frame video image is greater than the threshold value W as a data subset set S3;
MPS is performed on the data subset set S3, and the obtained structured data is transmitted to the user end, where the MPS is a model data storage format and a transmission format for expressing the linear optimization model.
2. The method of claim 1, wherein the obtaining of the mean grayscale value of the connected regions comprises graying the images of the connected regions and selecting the grayscale threshold according to the maximum inter-class variance method OSTU.
3. The on-line education method as claimed in claim 1, wherein the step of obtaining the mean value of the gray levels of the connected regions comprises graying the video image by a maximum value method, and taking the maximum value of the three-component brightness in the color image as the gray value of the gray image.
4. The on-line education method as claimed in claim 2, wherein the step of obtaining the mean value of the gray scale of the connected component further comprises preprocessing the video image to remove noise and filter the video image information.
5. An online education system, comprising an acquisition module: the camera collects the teaching video streaming data of the teacher in real time; calculating the difference matrix of the nth frame and the previous frame in real time from the second frame
Figure 113886DEST_PATH_IMAGE022
Figure 94480DEST_PATH_IMAGE024
Wherein the content of the first and second substances,
Figure 933123DEST_PATH_IMAGE026
a difference between R value matrices of R channels representing video RGB images of the n-th frame and the n-1 th frame,
Figure 151615DEST_PATH_IMAGE028
a difference between a matrix of G values representing G channels of video RGB images of the n-th frame and the n-1 st frame,
Figure 10987DEST_PATH_IMAGE030
a difference between the B value matrices of B channels representing the video RGB images of the n-th frame and the n-1 st frame,
Figure 885402DEST_PATH_IMAGE032
Figure 527736DEST_PATH_IMAGE034
Figure 131892DEST_PATH_IMAGE036
matrix proportionality coefficients of an R channel, a G channel and a B channel are respectively;
a traversing module: traversal matrix
Figure 568690DEST_PATH_IMAGE038
An area position information set S1 in which the element in the recording matrix is not equal to 0;
the region primary selection module: selecting RGB video image information corresponding to the region position information set S1 of the nth frame image as a data subset set S2, calculating the image change degree of each connected region set in the data subset set S2, and the image change degree of the connected region of the kth data subset
Figure 258297DEST_PATH_IMAGE040
Comprises the following steps:
Figure 438743DEST_PATH_IMAGE042
wherein the content of the first and second substances,
Figure 897406DEST_PATH_IMAGE044
representing the difference between the mean gray levels of the connected regions of the kth data subset of the n-th frame and the n-1 frame of video,
Figure 505105DEST_PATH_IMAGE046
representing the difference between the image steepness of the connected region of the kth data subset of the n-th frame and the n-1 frame video,
Figure 682008DEST_PATH_IMAGE048
representing the proportion of the k-th data subset connected region in the whole video image, h is a set threshold value, if
Figure 134986DEST_PATH_IMAGE050
Then, then
Figure 982244DEST_PATH_IMAGE052
(ii) a If it is
Figure 760844DEST_PATH_IMAGE054
Then, then
Figure 425044DEST_PATH_IMAGE056
(ii) a The image steepness is to extract pixel points with the lowest gray value and the highest gray value in each image block, and is respectively T1 and T2; the steepness is L = (T2-T1)/q; wherein q is the number of interval pixels of the pixel points with the lowest gray value and the highest gray value in the image block;
Figure 947292DEST_PATH_IMAGE058
is the average of the gray levels of all connected regions of the nth frame,
Figure DEST_PATH_IMAGE060
the mean value of the image steepness of all the connected areas of the nth frame is obtained;
a region selection module: taking the RGB video image information of which the degree of change of the nth frame video image is greater than the threshold value W as a data subset set S3;
the coding transmission module: MPS coding is carried out on the data subset set S3 to obtain an MPS data packet, the MPS data packet is analyzed according to a preset data structure to obtain structured data which accord with optimization processing, the obtained structured data are transmitted to a user side, and MPS is a model data storage format and a transmission format which are used for expressing a linear optimization model.
6. The system of claim 5, wherein the graying module comprises a graying module for graying the connected region image and selecting the grayscale threshold according to the maximum inter-class variance method OSTU.
7. The system of claim 5, wherein the graying module is configured to perform graying on the video image by using a maximum value method, and the maximum value of the three-component brightness in the color image is used as the grayscale value of the grayscale image.
8. The system of claim 5, wherein the preprocessing module is further configured to preprocess the video image before obtaining the mean grayscale value of the connected component, and filter and denoise information of the video image.
9. An apparatus comprising a processor, a memory, and a computer program stored on the memory and capable of running on the processor, the computer program when executed by the processor implementing the method of any one of claims 1 to 4.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 4.
CN202110660037.7A 2021-06-15 2021-06-15 Online education method, system, equipment and storage medium Active CN113115037B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110660037.7A CN113115037B (en) 2021-06-15 2021-06-15 Online education method, system, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110660037.7A CN113115037B (en) 2021-06-15 2021-06-15 Online education method, system, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113115037A CN113115037A (en) 2021-07-13
CN113115037B true CN113115037B (en) 2021-09-14

Family

ID=76723492

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110660037.7A Active CN113115037B (en) 2021-06-15 2021-06-15 Online education method, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113115037B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113660495A (en) * 2021-08-11 2021-11-16 易谷网络科技股份有限公司 Real-time video stream compression method and device, electronic equipment and storage medium
CN114155254B (en) * 2021-12-09 2022-11-08 成都智元汇信息技术股份有限公司 Image cutting method based on image correction, electronic device and medium
CN114140542B (en) * 2021-12-09 2022-11-22 成都智元汇信息技术股份有限公司 Picture cutting method based on color compensation, electronic equipment and medium
CN115119016A (en) * 2022-06-29 2022-09-27 王雨佳 Information data encryption algorithm

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1946144A (en) * 2006-11-01 2007-04-11 李博航 Real time video image transmission technology
CN101184216A (en) * 2007-12-07 2008-05-21 广东纺织职业技术学院 Intelligent domestic gateway presentation video control method and system thereof
CN101321287A (en) * 2008-07-08 2008-12-10 浙江大学 Video encoding method based on movement object detection
CN103400154A (en) * 2013-08-09 2013-11-20 电子科技大学 Human body movement recognition method based on surveillance isometric mapping
CN104394418A (en) * 2014-09-23 2015-03-04 清华大学 Method and device for coding video data and method and device for decoding video data
CN105306960A (en) * 2015-10-18 2016-02-03 北京航空航天大学 Dynamic adaptive stream system for transmitting high-quality online course videos
CN105787597A (en) * 2016-01-20 2016-07-20 北京优弈数据科技有限公司 Data optimizing processing system
US9578324B1 (en) * 2014-06-27 2017-02-21 Google Inc. Video coding using statistical-based spatially differentiated partitioning
CN107147906A (en) * 2017-06-12 2017-09-08 中国矿业大学 A kind of virtual perspective synthetic video quality without referring to evaluation method
CN108021347A (en) * 2017-12-29 2018-05-11 航天科工智慧产业发展有限公司 A kind of method of Android terminal Screen sharing

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6647061B1 (en) * 2000-06-09 2003-11-11 General Instrument Corporation Video size conversion and transcoding from MPEG-2 to MPEG-4
JP5821610B2 (en) * 2011-12-20 2015-11-24 富士通株式会社 Information processing apparatus, information processing method, and program
US10448012B2 (en) * 2016-11-22 2019-10-15 Pixvana, Inc. System and method for data reduction based on scene content
JP7208356B2 (en) * 2018-09-26 2023-01-18 コーヒレント・ロジックス・インコーポレーテッド Generating Arbitrary World Views

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1946144A (en) * 2006-11-01 2007-04-11 李博航 Real time video image transmission technology
CN101184216A (en) * 2007-12-07 2008-05-21 广东纺织职业技术学院 Intelligent domestic gateway presentation video control method and system thereof
CN101321287A (en) * 2008-07-08 2008-12-10 浙江大学 Video encoding method based on movement object detection
CN103400154A (en) * 2013-08-09 2013-11-20 电子科技大学 Human body movement recognition method based on surveillance isometric mapping
US9578324B1 (en) * 2014-06-27 2017-02-21 Google Inc. Video coding using statistical-based spatially differentiated partitioning
CN104394418A (en) * 2014-09-23 2015-03-04 清华大学 Method and device for coding video data and method and device for decoding video data
CN105306960A (en) * 2015-10-18 2016-02-03 北京航空航天大学 Dynamic adaptive stream system for transmitting high-quality online course videos
CN105787597A (en) * 2016-01-20 2016-07-20 北京优弈数据科技有限公司 Data optimizing processing system
CN107147906A (en) * 2017-06-12 2017-09-08 中国矿业大学 A kind of virtual perspective synthetic video quality without referring to evaluation method
CN108021347A (en) * 2017-12-29 2018-05-11 航天科工智慧产业发展有限公司 A kind of method of Android terminal Screen sharing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
基于可区分边界和加权对比度优化的显著度检测算法;姜青竹等;《电子学报》;20170115(第01期);第150-159页 *

Also Published As

Publication number Publication date
CN113115037A (en) 2021-07-13

Similar Documents

Publication Publication Date Title
CN113115037B (en) Online education method, system, equipment and storage medium
CN109886210B (en) Traffic image recognition method and device, computer equipment and medium
CN113052868B (en) Method and device for training matting model and image matting
CN111738041A (en) Video segmentation method, device, equipment and medium
CN109194878B (en) Video image anti-shake method, device, equipment and storage medium
CN111986117A (en) System and method for correcting arithmetic operation
CN113344826A (en) Image processing method, image processing device, electronic equipment and storage medium
CN111382647B (en) Picture processing method, device, equipment and storage medium
CN113516697A (en) Image registration method and device, electronic equipment and computer-readable storage medium
CN111815748B (en) Animation processing method and device, storage medium and electronic equipment
CN111861204A (en) Course mobile learning evaluation system and method based on intelligent platform
CN114723652A (en) Cell density determination method, cell density determination device, electronic apparatus, and storage medium
CN115333879B (en) Remote conference method and system
CN113315995B (en) Method and device for improving video quality, readable storage medium and electronic equipment
CN112507243B (en) Content pushing method and device based on expressions
WO2023284236A1 (en) Blind image denoising method and apparatus, electronic device, and storage medium
CN115601820A (en) Face fake image detection method, device, terminal and storage medium
CN112990198B (en) Detection and identification method and system for water meter reading and storage medium
CN108419095A (en) A kind of streaming media transcoding method, apparatus, computer equipment and readable medium
CN113762260A (en) Method, device and equipment for processing layout picture and storage medium
CN112584117B (en) White balance adjusting method, device, equipment and storage medium
CN113117341B (en) Picture processing method and device, computer readable storage medium and electronic equipment
CN117173609A (en) Multi-scale feature and channel attention-based reference-free screen video quality evaluation method and device
US20230334626A1 (en) Techniques for denoising videos
KR20100049406A (en) Remote education server, method and computer readable media storing program for method thereof

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant