CN115334053B - Method for realizing associated screen projection in cloud conference and related products - Google Patents


Info

Publication number
CN115334053B
Authority
CN
China
Prior art keywords
cloud
file
compression
presenter
content
Prior art date
Legal status
Active
Application number
CN202210919844.0A
Other languages
Chinese (zh)
Other versions
CN115334053A (en)
Inventor
梅品西
Current Assignee
Shenzhen Happycast Technology Co Ltd
Original Assignee
Shenzhen Happycast Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Happycast Technology Co Ltd filed Critical Shenzhen Happycast Technology Co Ltd
Priority claimed from CN202210919844.0A
Publication of CN115334053A
Application granted
Publication of CN115334053B


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00: Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40: Support for services or applications
    • H04L65/403: Arrangements for multi-party communication, e.g. for conferences
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454: Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00: Speech recognition
    • G10L15/08: Speech classification or search
    • G10L15/16: Speech classification or search using artificial neural networks
    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00: Speech recognition
    • G10L15/26: Speech to text systems
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00: Network arrangements or protocols for supporting network services or applications
    • H04L67/01: Protocols
    • H04L67/10: Protocols in which an application is distributed across nodes in the network
    • H04L67/1095: Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes

Abstract

An embodiment of this application provides a method for implementing associated screen projection in a cloud conference, and related products. The method includes the following steps: a cloud conference server receives, from a presenter's cloud terminal, a file to be shared in a cloud conference; the cloud conference server analyzes the shared file to determine which portions the presenter is focusing on and which portions the presenter is not focusing on; the cloud conference server compresses the focused portion with a first compression algorithm to obtain a first compressed file, compresses the non-focused portion with a second compression algorithm to obtain a second compressed file, and shares both compressed files with the other cloud terminals in the cloud conference for screen projection. The technical scheme provided by this application improves the effectiveness of the cloud conference.

Description

Method for realizing associated screen projection in cloud conference and related products
Technical Field
The application relates to the technical field of electronics and communication, in particular to a method for realizing associated screen projection in a cloud conference and related products.
Background
Cloud conferencing is an efficient, convenient, and low-cost form of conferencing based on cloud computing. Through a simple internet interface, users can quickly and efficiently share voice, data files, and video with groups and clients around the world, while the cloud conference service provider handles the complex work of data transmission and processing behind the scenes.
In a cloud conference scenario, the volume of image data transmitted for the shared-file area can be excessive, which increases the latency of conference image data, degrades the conference, and worsens the user experience.
Disclosure of Invention
The embodiments of this application disclose a method for implementing associated screen projection in a cloud conference, and related products.
In a first aspect, a method for implementing associated screen projection in a cloud conference is provided, where the method includes the following steps:
the cloud conference server receives, from a presenter's cloud terminal, a file to be shared in the cloud conference;
the cloud conference server analyzes the shared file to determine which portions the presenter is focusing on and which portions the presenter is not focusing on;
the cloud conference server compresses the focused portion with a first compression algorithm to obtain a first compressed file, compresses the non-focused portion with a second compression algorithm to obtain a second compressed file, and shares both compressed files with the other cloud terminals in the cloud conference for screen projection;
the second compression algorithm compresses more aggressively (achieves a larger compression amount) than the first.
In a second aspect, a system for implementing associated screen projection in a cloud conference is provided, where the system includes:
a communication unit, configured to receive, from a presenter's cloud terminal, a file to be shared in the cloud conference;
a processing unit, configured to analyze the shared file to determine which portions the presenter is focusing on and which portions the presenter is not focusing on; to compress the focused portion with a first compression algorithm to obtain a first compressed file and the non-focused portion with a second compression algorithm to obtain a second compressed file; and to share both compressed files with the other cloud terminals in the cloud conference for screen projection;
the second compression algorithm compresses more aggressively (achieves a larger compression amount) than the first.
In a third aspect, there is provided an electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of the first aspect.
In a fourth aspect, a computer-readable storage medium is provided, storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method of the first aspect.
In a fifth aspect, a computer program product is provided, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program, the computer program being operable to cause a computer to perform part or all of the steps as described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
In this application, the cloud conference server receives, from a presenter's cloud terminal, a file to be shared in a cloud conference; it analyzes the shared file to determine which portions the presenter is focusing on and which portions the presenter is not; it compresses the focused portion with a first compression algorithm to obtain a first compressed file, compresses the non-focused portion with a second compression algorithm to obtain a second compressed file, and shares both compressed files with the other cloud terminals in the conference. By compressing the non-focused content more aggressively, the scheme reduces the network bandwidth consumed by that content as far as possible without affecting the presenter's ability to conduct the meeting: the total and per-frame data volumes shrink, data latency drops, network traffic is reduced, the cloud conference runs more smoothly, meeting quality rises, and the user experience improves.
Drawings
The drawings used in the embodiments of the present application are described below.
FIG. 1 is a schematic diagram of a framework of a cloud conference platform of the present application;
fig. 2 is a flow chart of a method for implementing associated screen projection in a cloud conference;
fig. 3 is a schematic structural diagram of a system for implementing associated screen projection in a cloud conference;
FIG. 4 is a schematic diagram of a split screen display provided herein;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below with reference to the accompanying drawings in the embodiments of the present application.
The term "and/or" in this application merely describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. The character "/" herein indicates an "or" relationship between the associated objects.
The term "plurality" in the embodiments of this application means two or more. Descriptors such as "first" and "second" are used only to distinguish the objects being described; they imply no ordering, place no particular limit on the number of devices, and should not be construed as limiting the embodiments. "Connection" in these embodiments covers direct connection, indirect connection, and other ways of enabling communication between devices, none of which is restricted here.
Referring to fig. 1, fig. 1 is a schematic diagram of the framework of a cloud conference platform. As shown in fig. 1, the platform comprises a number of cloud terminals connected together through a cloud conference server. A cloud terminal may specifically include a processor, a memory, a display screen, a communication circuit, an audio component, and a camera component, which may be connected through a bus or in other ways; the specific manner of connection is not limited here. A cloud terminal may connect to the cloud conference platform over a wired network or over the wireless network of a wireless communication system.
The wireless communication system may be: a global system for mobile communications (GSM), a code division multiple access (CDMA) system, a wideband code division multiple access (WCDMA) system, general packet radio service (GPRS), a long term evolution (LTE) system, an LTE-Advanced (LTE-A) system, a new radio (NR) system or an evolution of it, LTE over unlicensed spectrum (LTE-U), NR over unlicensed spectrum (NR-U), a universal mobile telecommunications system (UMTS), a next-generation communication system, or another communication system.
Referring to fig. 2, fig. 2 provides a flow chart of a method for implementing associated screen projection in a cloud conference. The method shown in fig. 2 may be performed within the framework of the cloud conference platform of fig. 1, either by a cloud terminal of that platform or by the cloud conference server. This embodiment takes the cloud conference server as the example; in practice a cloud terminal could perform it as well. The method shown in fig. 2 includes the following steps:
step S201, a cloud conference server receives a sharing file to be shared in a cloud conference of a cloud terminal of a presenter;
the shared file may be any file format, including but not limited to: PPT, word, WPS, etc. The receiving mode may be a wired receiving mode or a wireless receiving mode, and the specific receiving mode may be determined by a connection mode between the cloud terminal and the cloud conference server.
Step S202: the cloud conference server analyzes the shared file to determine which portions the presenter is focusing on and which portions the presenter is not focusing on.
the part of the content concerned can be the content which the main speaker is speaking or the content which the main speaker needs to speak within a set time, and the content can be any one or any combination of characters, pictures and videos.
For example, the focused portion may be content the presenter has marked for this presentation; it may of course take other forms, and this application does not limit the specific form of the focused portion.
Step S203: the cloud conference server compresses the focused portion with a first compression algorithm to obtain a first compressed file, compresses the non-focused portion with a second compression algorithm to obtain a second compressed file, and shares both compressed files with the other (non-presenter) cloud terminals in the cloud conference for screen projection.
the second compression algorithm is an algorithm having a larger compression amount than the first compression algorithm.
The first and second compression algorithms may be the same algorithm, with the second applied at a higher compression ratio than the first; they may also be different algorithms. Either way, general-purpose compression algorithms may be used.
For example, if a 100 Mb (megabit) file becomes 50 Mb after the first compression algorithm and 40 Mb after the second (anything below 50 Mb would do), the second algorithm can be regarded as having a larger compression amount than the first.
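The two-tier compression of step S203 can be sketched as follows. This is a minimal illustration only: the patent does not name concrete algorithms, so zlib compression levels stand in for the first and second algorithms here.

```python
import zlib

def compress_shared_file(focused: bytes, unfocused: bytes) -> tuple[bytes, bytes]:
    # zlib levels are stand-ins: level 1 (fast, larger output) plays the
    # role of the first algorithm, level 9 (slower, smaller output) plays
    # the role of the second, more aggressive algorithm.
    first_compressed = zlib.compress(focused, level=1)
    second_compressed = zlib.compress(unfocused, level=9)
    return first_compressed, second_compressed

data = b"slide content " * 1000
first, second = compress_shared_file(data, data)
# On the same input, the second algorithm's output is no larger than the first's,
# and both decompress back to the original content.
assert len(second) <= len(first)
assert zlib.decompress(first) == data and zlib.decompress(second) == data
```

In practice the two "algorithms" would more likely be two quality settings of an image or video codec, but the bandwidth trade-off is the same.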
To recap: the cloud conference server receives, from a presenter's cloud terminal, a file to be shared in a cloud conference; it analyzes the shared file to determine which portions the presenter is focusing on and which portions the presenter is not; it compresses the focused portion with a first compression algorithm to obtain a first compressed file, compresses the non-focused portion with a second compression algorithm to obtain a second compressed file, and shares both compressed files with the other cloud terminals in the conference. By compressing the non-focused content more aggressively, the scheme reduces the network bandwidth consumed by that content as far as possible without affecting the presenter's ability to conduct the meeting: the total and per-frame data volumes shrink, data latency drops, network traffic is reduced, the cloud conference runs more smoothly, meeting quality rises, and the user experience improves.
For example, sharing the first compressed file and the second compressed file with the other cloud terminals in the cloud conference may specifically include:
the cloud conference server obtains the network delay of each of the other cloud terminals and dynamically assigns, according to those delays, the transmission priority of the first and second compressed files for each terminal.
For example, dynamically assigning the transmission priorities according to network delay may specifically include:
sorting the other cloud terminals by network delay in ascending order to obtain a first sequence, and setting each terminal's transmission priority according to its position in that sequence: terminals earlier in the sequence get high priority (transmit first) and terminals later in the sequence get low priority (transmit later).
For example, if terminal 1 has a network delay of 10 ms and terminal 2 a delay of 20 ms, terminal 1 comes earlier in the sequence, so terminal 1's transmission priority is higher than terminal 2's.
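The delay-ascending priority rule can be sketched as a simple sort; terminal names and delay values below are illustrative.

```python
def assign_priorities(delays_ms: dict[str, int]) -> list[str]:
    # Sort terminals by ascending network delay: terminals earlier in the
    # returned sequence transmit first (high priority), later ones later.
    return sorted(delays_ms, key=delays_ms.get)

order = assign_priorities({"terminal1": 10, "terminal2": 20, "terminal3": 5})
# terminal3 (5 ms) transmits first, then terminal1 (10 ms), then terminal2 (20 ms).
assert order == ["terminal3", "terminal1", "terminal2"]
```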
For example, the cloud conference server analyzing the shared file to determine the focused and non-focused portions may specifically include:
the cloud conference server receives audio data collected by the presenter's cloud terminal, performs speech recognition on the audio data to obtain its text, determines that the content of the shared file corresponding to that text is the focused portion, and determines that the remaining content of the shared file is the non-focused portion.
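A crude sketch of this matching step: content whose words appear in the recognized speech counts as the focused portion. The word-overlap test is an assumption of this sketch; the patent does not specify how recognized text is matched to file content.

```python
def split_by_focus(paragraphs: list[str], recognized_text: str):
    # A paragraph is "focused" if the recognized speech shares any word
    # with it -- a deliberately crude stand-in for the matching step.
    spoken = set(recognized_text.lower().split())
    focused, unfocused = [], []
    for para in paragraphs:
        if set(para.lower().split()) & spoken:
            focused.append(para)
        else:
            unfocused.append(para)
    return focused, unfocused

slides = ["revenue grew in q3", "appendix: office locations"]
f, u = split_by_focus(slides, "now look at how revenue grew")
assert f == ["revenue grew in q3"]
assert u == ["appendix: office locations"]
```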
For example, the speech recognition used to determine the text of the audio data may specifically be an LSTM-based method, which may include:
forming the audio data into the input data X_t of each LSTM time step (where t identifies the time step) and obtaining the text of the audio data with the following formulas.
An LSTM cell can be divided into a forget gate, an input gate, and an output gate, corresponding to the following computations:
Forget gate:
f_t = σ(h_{t-1} * X_t + b_f)
Input gate:
i_t = σ(h_{t-1} * X_t + b_i)
C'_t = tanh(h_{t-1} * X_t + b_c)
Cell state:
C_t = C_{t-1} * f_t + i_t * C'_t
Output gate:
O_t = σ(h_{t-1} * X_t + b_o)
h_t = O_t * tanh(C_t)
Here b_f is the bias of f_t, a constant; likewise b_i, b_c, and b_o are the biases of their corresponding formulas, and O_t denotes the output at time t.
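The gate equations above can be written out directly in code. Note that they are a simplified, weight-free form: a production LSTM multiplies inputs by learned weight matrices, so the sketch below only mirrors the patent's scalar formulas.

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(h_prev: float, c_prev: float, x: float,
              b_f: float, b_i: float, b_c: float, b_o: float):
    # One step of the simplified (scalar, weight-free) cell from the text.
    f = sigmoid(h_prev * x + b_f)         # forget gate f_t
    i = sigmoid(h_prev * x + b_i)         # input gate i_t
    c_cand = math.tanh(h_prev * x + b_c)  # candidate state C'_t
    c = c_prev * f + i * c_cand           # cell state C_t
    o = sigmoid(h_prev * x + b_o)         # output gate O_t
    h = o * math.tanh(c)                  # output h_t
    return h, c

h, c = lstm_step(h_prev=0.5, c_prev=0.1, x=1.0, b_f=0.0, b_i=0.0, b_c=0.0, b_o=0.0)
assert 0.0 < h < 1.0 and 0.0 < c < 1.0  # tanh/sigmoid keep the state bounded
```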
For example, the method may further include:
obtaining the output result at each time step of the audio data and the confidence of each output; finding any time step i whose confidence falls below a first threshold; taking the confidences of the neighbouring time steps i-1 and i+1 and selecting whichever of the two has the higher confidence; if the higher confidence belongs to time i+1 (or i-1), combining that time step's input data with the input data of time i into one new input and computing a new output with the formulas above; and, if the confidence of the new output exceeds the mean of the confidences of the two combined time steps, replacing their output results with the new output.
By merging inputs in this way, the accuracy of the output results is improved, and hence the recognition accuracy on the audio data.
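The neighbour-merging step can be sketched like this; `recompute` stands in for re-running the LSTM formulas on a merged input, and all names and values are illustrative.

```python
def merge_low_confidence(inputs, outputs, confs, threshold, recompute):
    # Merge one low-confidence time step with its higher-confidence
    # neighbour; recompute(merged_input) returns (output, confidence).
    for i, c in enumerate(confs):
        if c >= threshold:
            continue
        neighbours = [j for j in (i - 1, i + 1) if 0 <= j < len(confs)]
        if not neighbours:
            continue
        j = max(neighbours, key=lambda k: confs[k])  # higher-confidence side
        lo, hi = min(i, j), max(i, j)
        merged = inputs[lo] + inputs[hi]
        new_out, new_conf = recompute(merged)
        # Keep the merge only if it beats the mean of the two confidences.
        if new_conf > (confs[i] + confs[j]) / 2:
            inputs = inputs[:lo] + [merged] + inputs[hi + 1:]
            outputs = outputs[:lo] + [new_out] + outputs[hi + 1:]
            confs = confs[:lo] + [new_conf] + confs[hi + 1:]
        break  # handle at most one merge per call in this sketch
    return inputs, outputs, confs

ins, outs, cs = merge_low_confidence(
    ["ab", "cd", "ef"], ["AB", "??", "EF"], [0.9, 0.3, 0.7],
    threshold=0.5, recompute=lambda x: (x.upper(), 0.8))
assert ins == ["abcd", "ef"] and outs == ["ABCD", "EF"] and cs == [0.8, 0.7]
```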
For example, sharing the first and second compressed files with the other cloud terminals in the cloud conference for screen projection may specifically include:
sharing the first and second compressed files with the other cloud terminals and instructing those terminals to decompress them and display the resulting first and second files in split-screen mode, where the first file may be placed in the central area and the second file in the edge area.
Fig. 4 is a schematic diagram of such a split-screen display. Suppose the first file has 2 pictures, picture 1 and picture 2, and the second file has 4 pictures, pictures 3, 4, 5, and 6. The 2 pictures can be placed on the two sides of the central area of the display, and the other 4 pictures on the 4 edges around the central area.
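The fig. 4 arrangement can be sketched as slot assignment. The slot names below are illustrative assumptions, since the figure only shows the idea of a centre region flanked by edge regions.

```python
def split_screen_layout(first_pics: list[str], second_pics: list[str]) -> dict[str, str]:
    # Focused pictures go to central slots, the rest to edge slots.
    centre_slots = ["centre-left", "centre-right"]
    edge_slots = ["top", "bottom", "left", "right"]
    layout = {}
    for slot, pic in zip(centre_slots, first_pics):
        layout[slot] = pic
    for slot, pic in zip(edge_slots, second_pics):
        layout[slot] = pic
    return layout

layout = split_screen_layout(["pic1", "pic2"], ["pic3", "pic4", "pic5", "pic6"])
assert layout["centre-left"] == "pic1" and layout["top"] == "pic3"
```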
For example, the method may further include:
if the cloud conference server receives, from a second cloud terminal (any one of the other cloud terminals), an indication that the edge area was clicked, the server determines from that indication the identifier of the clicked picture in the second file, compresses that picture with the first compression algorithm, and sends the result to the second cloud terminal.
With this scheme, when any terminal's user focuses on a picture, that picture is retransmitted, improving the clarity of the picture being attended to.
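The click-to-recompress step can be sketched as follows; zlib level 1 again stands in for the lighter first algorithm, and the picture store is an illustrative assumption.

```python
import zlib

def on_edge_click(picture_store: dict[str, bytes], picture_id: str) -> bytes:
    # When a viewer clicks an edge-area picture, recompress the original
    # with the lighter "first algorithm" and return it for retransmission
    # at higher quality.
    original = picture_store[picture_id]
    return zlib.compress(original, level=1)

store = {"pic3": b"pixels" * 100}
resent = on_edge_click(store, "pic3")
assert zlib.decompress(resent) == store["pic3"]
```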
Referring to fig. 3, fig. 3 is a schematic structural diagram of an implementation system of association screen projection in a cloud conference, where the system includes:
a communication unit 301, configured to receive, from a presenter's cloud terminal, a file to be shared in the cloud conference;
a processing unit 302, configured to analyze the shared file to determine which portions the presenter is focusing on and which portions the presenter is not focusing on; to compress the focused portion with a first compression algorithm to obtain a first compressed file and the non-focused portion with a second compression algorithm to obtain a second compressed file; and to share both compressed files with the other cloud terminals in the cloud conference for screen projection;
the second compression algorithm is a larger compression amount algorithm than the first compression algorithm.
In the system provided by this application, a file to be shared in a cloud conference is received from a presenter's cloud terminal; the shared file is analyzed to determine which portions the presenter is focusing on and which portions the presenter is not; the focused portion is compressed with a first compression algorithm to obtain a first compressed file, the non-focused portion with a second compression algorithm to obtain a second compressed file, and both compressed files are shared with the other cloud terminals in the conference. By compressing the non-focused content more aggressively, the scheme reduces the network bandwidth consumed by that content as far as possible without affecting the presenter's ability to conduct the meeting: the total and per-frame data volumes shrink, data latency drops, network traffic is reduced, the cloud conference runs more smoothly, meeting quality rises, and the user experience improves.
By way of example,
the processing unit is specifically configured to obtain the network delay of each of the other cloud terminals and to dynamically assign, according to those delays, the transmission priority of the first and second compressed files for each terminal.
By way of example,
the processing unit is specifically configured to sort the other cloud terminals by network delay in ascending order to obtain a first sequence, and to set each terminal's transmission priority according to its position in that sequence.
By way of example,
the communication unit is further configured to receive audio data collected by the presenter's cloud terminal;
the processing unit is further configured to perform speech recognition on the audio data to obtain its text, to determine that the content of the shared file corresponding to that text is the focused portion, and to determine that the remaining content of the shared file is the non-focused portion.
By way of example,
the processing unit is further configured to form the audio data into the input data X_t of each LSTM time step (where t identifies the time step) and to obtain the text of the audio data with the following formulas.
An LSTM cell can be divided into a forget gate, an input gate, and an output gate, corresponding to the following computations:
Forget gate:
f_t = σ(h_{t-1} * X_t + b_f)
Input gate:
i_t = σ(h_{t-1} * X_t + b_i)
C'_t = tanh(h_{t-1} * X_t + b_c)
Cell state:
C_t = C_{t-1} * f_t + i_t * C'_t
Output gate:
O_t = σ(h_{t-1} * X_t + b_o)
h_t = O_t * tanh(C_t)
Here b_f is the bias of f_t, a constant; likewise b_i, b_c, and b_o are the biases of their corresponding formulas, and O_t denotes the output at time t.
For example, the method may further include:
obtaining the output result at each time step of the audio data and the confidence of each output; finding any time step i whose confidence falls below a first threshold; taking the confidences of the neighbouring time steps i-1 and i+1 and selecting whichever of the two has the higher confidence; if the higher confidence belongs to time i+1 (or i-1), combining that time step's input data with the input data of time i into one new input and computing a new output with the formulas above; and, if the confidence of the new output exceeds the mean of the confidences of the two combined time steps, replacing their output results with the new output.
It will be appreciated that the apparatus, in order to achieve the above-described functions, comprises corresponding hardware and/or software modules for performing the respective functions. The steps of an algorithm for each example described in connection with the embodiments disclosed herein may be embodied in hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application in conjunction with the embodiments, but such implementation is not to be considered as outside the scope of this application.
This embodiment may divide the electronic device into functional modules according to the method examples above: for instance, one functional module may be assigned per function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in hardware. Note that the division of modules in this embodiment is schematic and merely a division by logical function; other divisions are possible in actual implementations.
It should be noted that, all relevant contents of each step related to the above method embodiment may be cited to the functional description of the corresponding functional module, which is not described herein.
In case an integrated unit is employed, the user equipment may comprise a processing module and a storage module. The processing module may be configured to control and manage actions of the user equipment, for example, may be configured to support the electronic device to execute the steps executed by the acquiring unit, the communication unit, and the processing unit. The memory module may be used to support the electronic device to execute stored program code, data, etc.
The processing module may be a processor or a controller. It may implement or execute the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. A processor may also be a combination that performs computing functions, e.g. one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The memory module may be a memory. The communication module may be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic equipment.
It should be understood that the connection relationships between the modules illustrated in the embodiments of the present application are only illustrative and do not limit the structure of the user equipment. In other embodiments of the present application, the user equipment may also use interfacing manners different from those in the foregoing embodiments, or a combination of multiple interfacing manners.
Referring to fig. 5, fig. 5 shows an electronic device 50 provided in an embodiment of the present application. The electronic device 50 includes a processor 501, a memory 502, a communication interface 503, and a display screen 504, where the processor 501, the memory 502, and the communication interface 503 are connected to each other through a bus.
The memory 502 includes, but is not limited to, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), or compact disc read-only memory (CD-ROM), and the memory 502 is used to store associated computer programs and data. The communication interface 503 is used to receive and transmit data.
The processor 501 may be one or more central processing units (central processing unit, CPU), and in the case where the processor 501 is a CPU, the CPU may be a single-core CPU or a multi-core CPU.
The processor 501 may include one or more processing units, such as an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be separate components or may be integrated into one or more processors. In some embodiments, the user equipment may also include one or more processing units. The controller can generate operation control signals according to instruction opcodes and timing signals to control instruction fetching and instruction execution. In other embodiments, a memory may also be provided in the processing unit for storing instructions and data, for example a cache memory. The cache may hold instructions or data that the processing unit has just used or uses cyclically; if the processing unit needs the instruction or data again, it can be fetched directly from the cache. Repeated accesses are thereby avoided and the waiting time of the processing unit is reduced, improving the efficiency with which the user equipment processes data or executes instructions.
In some embodiments, processor 501 may include one or more interfaces. The interfaces may include inter-integrated circuit (inter-integrated circuit, I2C) interfaces, inter-integrated circuit audio (inter-integrated circuit sound, I2S) interfaces, pulse code modulation (pulse code modulation, PCM) interfaces, universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interfaces, mobile industry processor interfaces (mobile industry processor interface, MIPI), general-purpose input/output (GPIO) interfaces, SIM card interfaces, and/or USB interfaces, among others. The USB interface is an interface conforming to the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface can be used for connecting a charger to charge the user equipment and can also be used for transmitting data between the user equipment and the peripheral equipment. The USB interface can also be used for connecting with a headset, and playing audio through the headset.
If the electronic device 50 is a cloud conference server or a cloud device, such as a smart phone, a computer device, or a server, the processor 501 in the electronic device 50 is configured to read the computer program code stored in the memory 502, and perform the following operations:
receiving a shared file, to be shared in the cloud conference, from the cloud terminal of a presenter; and identifying the shared file to determine a part of the content of the shared file that the presenter is focusing on and a part that the presenter is not focusing on;
compressing the focused part of the content with a first compression algorithm to obtain a first compressed file, compressing the non-focused part with a second compression algorithm to obtain a second compressed file, and sharing the first compressed file and the second compressed file with the other cloud terminals in the cloud conference for screen projection, where the second compression algorithm provides a greater amount of compression than the first compression algorithm.
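The two-tier compression step can be sketched as follows. This is a minimal illustration assuming byte-oriented content; it uses zlib at two different levels as hypothetical stand-ins for the first and second compression algorithms, since the patent does not name concrete algorithms, only that the second provides greater compression.

```python
import zlib

def compress_shared_file(focused, unfocused):
    """Two-tier compression sketch: the part the presenter is focusing on is
    compressed lightly (first algorithm), the rest more aggressively (second
    algorithm, greater amount of compression)."""
    # Hypothetical choice: zlib level 1 stands in for the "first compression
    # algorithm" (fast, lighter compression), level 9 for the "second".
    first_compressed = zlib.compress(focused, 1)
    second_compressed = zlib.compress(unfocused, 9)
    return first_compressed, second_compressed
```

Both outputs remain losslessly decompressible on the receiving cloud terminals; a real system would more likely vary lossy image or video quality per region, which this byte-level sketch does not attempt to model.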
Sharing the first compressed file and the second compressed file with the other cloud terminals in the cloud conference specifically includes:
obtaining the network delay of the other cloud terminals, and dynamically allocating the transmission priorities of the first compressed file and the second compressed file at the other cloud terminals according to the network delay.
Dynamically allocating the transmission priorities of the first compressed file and the second compressed file at the other cloud terminals according to the network delay specifically includes:
arranging the other cloud terminals in ascending order of network delay to obtain a first sequence, and setting the transmission priority of the other cloud terminals according to the order of the first sequence.
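The priority assignment described above can be sketched as follows; the terminal names and the millisecond delay values are hypothetical, and the sorted order itself serves as the priority ranking.

```python
def assign_transmission_priority(delays_ms):
    """Sort cloud terminals in ascending order of measured network delay to
    obtain the 'first sequence'; that order is used directly as the
    transmission priority (index 0 = highest priority)."""
    return sorted(delays_ms, key=delays_ms.get)
```

Sorting low-delay terminals first means the terminals most able to render promptly are served first, which matches the claim's ascending-order rule.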
Identifying the shared file to determine the part of its content that the presenter is focusing on and the part that the presenter is not focusing on specifically includes:
receiving audio data collected by the presenter's cloud terminal, performing speech recognition on the audio data to determine text information of the audio data, determining the content of the shared file that corresponds to the text information to be the part the presenter is focusing on, and determining the remaining content of the shared file to be the part the presenter is not focusing on.
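The attention-splitting step can be sketched as below. The word-overlap heuristic and its threshold are assumptions made for illustration; the patent only states that content "corresponding to" the recognized text information is treated as focused.

```python
def split_by_attention(transcript, sections):
    """Split a shared file's sections into (focused, unfocused) using the
    speech-recognition transcript: a section counts as focused when it shares
    at least two words with what the presenter said (hypothetical threshold)."""
    spoken = set(transcript.lower().split())
    focused, unfocused = [], []
    for section in sections:
        overlap = len(set(section.lower().split()) & spoken)
        (focused if overlap >= 2 else unfocused).append(section)
    return focused, unfocused
```

A production system would presumably use more robust matching (stemming, embeddings, or slide-level alignment), but the input/output contract is the same: transcript plus file sections in, focused and non-focused partitions out.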
All relevant content of each scenario of the above method embodiment may be found in the functional description of the corresponding functional module, and is not repeated here.
Embodiments of the present application also provide a computer readable storage medium having a computer program stored therein, which when run on a network device, implements the method flow shown in fig. 2.
Embodiments of the present application also provide a computer program product, which when run on a terminal, implements the method flow shown in fig. 2.
Embodiments of the present application also provide an electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of the embodiment shown in fig. 2.
The foregoing description of the embodiments of the present application has been presented primarily from the method-side perspective. It will be appreciated that, in order to achieve the above-described functions, the electronic device includes corresponding hardware structures and/or software modules for performing each function. Those of skill in the art will readily appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or as a combination of hardware and computer software. Whether a function is implemented as hardware or as computer-software-driven hardware depends on the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiments of the present application may divide the electronic device into functional units according to the above method example; for example, each functional unit may correspond to one function, or two or more functions may be integrated into one processing unit. The integrated units may be implemented in hardware or as software functional units. It should be noted that the division of units in the embodiments of the present application is schematic and is merely a division by logical function; other division manners may be used in actual implementation.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but those skilled in the art should understand that the present application is not limited by the order of actions described, as some steps may be performed in another order or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; for instance, the division of units is merely a division by logical function, and other division manners are possible in actual implementation, such as combining multiple units or components or integrating them into another system, or omitting or not performing some features. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical or in other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If implemented in the form of software functional units and sold or used as stand-alone products, the integrated units described above may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the various embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments may be implemented by a program instructing associated hardware, and the program may be stored in a computer-readable memory, which may include a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Claims (10)

1. A method for realizing associated screen projection in a cloud conference, characterized by comprising:
the cloud conference server receives a shared file, to be shared in the cloud conference, from a presenter's cloud terminal;
the cloud conference server identifies the shared file to determine a part of its content that the presenter is focusing on and a part that the presenter is not focusing on;
the cloud conference server compresses the focused part of the content with a first compression algorithm to obtain a first compressed file, compresses the non-focused part with a second compression algorithm to obtain a second compressed file, and shares the first compressed file and the second compressed file with other cloud terminals in the cloud conference for screen projection;
wherein the second compression algorithm provides a greater amount of compression than the first compression algorithm.
2. The method of claim 1, wherein sharing the first compressed file and the second compressed file to other cloud terminals in the cloud conference specifically comprises:
and acquiring network delay of other cloud terminals, and dynamically distributing transmission priorities of the first compressed file and the second compressed file in the other cloud terminals according to the network delay.
3. The method of claim 2, wherein dynamically allocating the transmission priorities of the first compressed file and the second compressed file at other cloud terminals according to the network delay specifically comprises:
and arranging other cloud terminals according to the network delay ascending order to obtain a first sequence, and setting the transmission priority of the other cloud terminals according to the sequence of the first sequence.
4. The method according to claim 1, wherein the identifying the shared file to determine the portion of the shared file that is of interest to the presenter and the portion of the shared file that is not of interest to the presenter specifically comprises:
receiving audio data collected by the presenter's cloud terminal, performing speech recognition on the audio data to determine text information of the audio data, determining the content of the shared file that corresponds to the text information to be the part of the content focused on by the presenter, and determining the other content of the shared file to be the part not focused on by the presenter.
5. An implementation system for associating screen projection in a cloud conference, the system comprising:
the communication unit is used for receiving a shared file to be shared in the cloud conference of the cloud terminal of the presenter;
the processing unit is used to identify the shared file to determine a part of its content that the presenter is focusing on and a part that the presenter is not focusing on; to compress the focused part with a first compression algorithm to obtain a first compressed file; to compress the non-focused part with a second compression algorithm to obtain a second compressed file; and to share the first compressed file and the second compressed file with other cloud terminals in the cloud conference for screen projection;
wherein the second compression algorithm provides a greater amount of compression than the first compression algorithm.
6. The system of claim 5, wherein
the processing unit is specifically configured to obtain network delay of other cloud terminals, and dynamically allocate transmission priorities of the first compressed file and the second compressed file at the other cloud terminals according to the network delay.
7. The system of claim 6, wherein
the processing unit is specifically configured to arrange other cloud terminals according to a network delay ascending order to obtain a first sequence, and set the transmission priority of the other cloud terminals according to the sequence of the first sequence.
8. The system of claim 5, wherein
the communication unit is also used for receiving audio data collected by the cloud terminal of the presenter;
the processing unit is further configured to perform speech recognition on the audio data to determine text information of the audio data, determine the content of the shared file that corresponds to the text information to be the part of the content focused on by the presenter, and determine the other content of the shared file to be the part not focused on by the presenter.
9. An electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps of the method of any of claims 1-4.
10. A computer readable storage medium having stored therein a computer program which, when run on a user equipment, performs the method of any of claims 1-4.
CN202210919844.0A 2022-08-03 2022-08-03 Method for realizing associated screen projection in cloud conference and related products Active CN115334053B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210919844.0A CN115334053B (en) 2022-08-03 2022-08-03 Method for realizing associated screen projection in cloud conference and related products

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210919844.0A CN115334053B (en) 2022-08-03 2022-08-03 Method for realizing associated screen projection in cloud conference and related products

Publications (2)

Publication Number Publication Date
CN115334053A CN115334053A (en) 2022-11-11
CN115334053B true CN115334053B (en) 2023-07-18

Family

ID=83919767

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210919844.0A Active CN115334053B (en) 2022-08-03 2022-08-03 Method for realizing associated screen projection in cloud conference and related products

Country Status (1)

Country Link
CN (1) CN115334053B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855059A (en) * 2012-08-21 2013-01-02 东莞宇龙通信科技有限公司 Terminal and information sharing method
CN103561259A (en) * 2013-07-10 2014-02-05 杭州云本科技有限公司 Network conference visual quality automatic evaluation method for application sharing services
CN105812714A (en) * 2016-03-18 2016-07-27 浙江万朋教育科技股份有限公司 Data compression method for shared PPT document pages
CN109525802A (en) * 2018-11-27 2019-03-26 平安科技(深圳)有限公司 A kind of video stream transmission method and device
CN111954051A (en) * 2020-02-11 2020-11-17 华为技术有限公司 Method, equipment and system for transmitting video and audio
CN111954028A (en) * 2020-10-19 2020-11-17 深圳乐播科技有限公司 Screen projection method, device and equipment of audio data and storage medium
CN112422591A (en) * 2021-01-25 2021-02-26 北京拓课网络科技有限公司 Method and device for transmitting video stream data and electronic equipment
CN113204687A (en) * 2020-11-10 2021-08-03 摩赛恩科技(苏州)有限公司 Automatic mass spectrum data uploading method and terminal equipment
CN114679437A (en) * 2022-03-11 2022-06-28 阿里巴巴(中国)有限公司 Teleconference method, data interaction method, device, and computer storage medium
CN114827134A (en) * 2022-07-01 2022-07-29 深圳乐播科技有限公司 Differentiated pushing method, related device and display method for cloud conference desktop
CN114816308A (en) * 2022-06-28 2022-07-29 深圳乐播科技有限公司 Information partition display method and related equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7849491B2 (en) * 2002-12-10 2010-12-07 Onlive, Inc. Apparatus and method for wireless video gaming
US7791656B2 (en) * 2005-08-16 2010-09-07 Konica Minolta Holdings, Inc. Image sensing apparatus and image processing method

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855059A (en) * 2012-08-21 2013-01-02 东莞宇龙通信科技有限公司 Terminal and information sharing method
CN103561259A (en) * 2013-07-10 2014-02-05 杭州云本科技有限公司 Network conference visual quality automatic evaluation method for application sharing services
CN105812714A (en) * 2016-03-18 2016-07-27 浙江万朋教育科技股份有限公司 Data compression method for shared PPT document pages
CN109525802A (en) * 2018-11-27 2019-03-26 平安科技(深圳)有限公司 A kind of video stream transmission method and device
CN111954051A (en) * 2020-02-11 2020-11-17 华为技术有限公司 Method, equipment and system for transmitting video and audio
CN111954028A (en) * 2020-10-19 2020-11-17 深圳乐播科技有限公司 Screen projection method, device and equipment of audio data and storage medium
CN113204687A (en) * 2020-11-10 2021-08-03 摩赛恩科技(苏州)有限公司 Automatic mass spectrum data uploading method and terminal equipment
CN112422591A (en) * 2021-01-25 2021-02-26 北京拓课网络科技有限公司 Method and device for transmitting video stream data and electronic equipment
CN114679437A (en) * 2022-03-11 2022-06-28 阿里巴巴(中国)有限公司 Teleconference method, data interaction method, device, and computer storage medium
CN114816308A (en) * 2022-06-28 2022-07-29 深圳乐播科技有限公司 Information partition display method and related equipment
CN114827134A (en) * 2022-07-01 2022-07-29 深圳乐播科技有限公司 Differentiated pushing method, related device and display method for cloud conference desktop

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Low power digital video compression hardware design; Ilker Hamzaoglu; 2016 International Conference on Design and Technology of Integrated Systems in Nanoscale Era (DTIS); full text *
Development of a cloud push projection system for mobile terminal image information; Tian Juncheng; Hu Kun; Cao Hongyu; Song Zhanwei; Journal of Jilin University (Information Science Edition) (03); full text *
Research and application of a network video conference system; Liu Xiaoyu; China Masters' Theses Full-text Database (Information Science and Technology); full text *

Also Published As

Publication number Publication date
CN115334053A (en) 2022-11-11

Similar Documents

Publication Publication Date Title
US20200068635A1 (en) Data-stream allocation method for link aggregation and related devices
WO2019042180A1 (en) Resource allocation method and related product
WO2019042169A1 (en) Resource allocation method and related products
WO2019042294A1 (en) Resource allocation method and related product
CN111132234A (en) Data transmission method and corresponding terminal
WO2019072208A1 (en) Application running control method and device
CN107517306B (en) Resource allocation method and related product
WO2019072180A1 (en) Method and apparatus for allocating resources to application
US20190034236A1 (en) Method For Resource Allocation And Terminal Device
US11182210B2 (en) Method for resource allocation and terminal device
KR20220104225A (en) Device Occupation Methods and Electronic Devices
CN113032112A (en) Resource scheduling method and device, electronic equipment and storage medium
CN114996168A (en) Multi-device cooperative test method, test device and readable storage medium
WO2020191729A1 (en) Resource indication method, apparatus and system on unlicensed spectrum, and storage medium
CN112463391B (en) Memory control method, memory control device, storage medium and electronic equipment
CN115334053B (en) Method for realizing associated screen projection in cloud conference and related products
CN115552518A (en) Signal encoding and decoding method and device, user equipment, network side equipment and storage medium
CN111245585B (en) Information sending method and device and parameter determining method and device
CN106330504B (en) Method for realizing application and service controller
CN115361569B (en) Dynamic frame screen projection method in cloud conference and related products
CN111432384B (en) Large-data-volume audio Bluetooth real-time transmission method for equipment with recording function
WO2021073394A1 (en) Data transmission method and apparatus, electronic device and storage medium
CN106776010B (en) Resource allocation method and terminal
CN112558885B (en) Memory using method of functional mobile phone and related product
CN112751819B (en) Processing method and device for online conference, electronic equipment and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant