CN116266085A - Window display method and device for collaborative interaction, electronic equipment and storage medium - Google Patents

Window display method and device for collaborative interaction, electronic equipment and storage medium

Info

Publication number
CN116266085A
CN116266085A (application CN202111554745.9A)
Authority
CN
China
Prior art keywords
audio
video
window
video stream
stream data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111554745.9A
Other languages
Chinese (zh)
Inventor
李炎桐
王达昇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Shizhen Information Technology Co Ltd
Original Assignee
Guangzhou Shizhen Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Shizhen Information Technology Co Ltd filed Critical Guangzhou Shizhen Information Technology Co Ltd
Priority to CN202111554745.9A
Publication of CN116266085A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/16: Sound input; Sound output
    • G06F 3/165: Management of the audio stream, e.g. setting of volume, audio stream path
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/10: File systems; File servers
    • G06F 16/17: Details of further file system functions
    • G06F 16/172: Caching, prefetching or hoarding of files
    • G06F 16/1734: Details of monitoring file system events, e.g. by the use of hooks, filter drivers, logs

Abstract

The embodiment of the invention discloses a window display method and device for collaborative interaction, electronic equipment and a storage medium. The method comprises the following steps: receiving audio and video stream data sent by at least one terminal device accessing the collaborative interaction, and buffering the audio and video stream data in a preset buffer area; displaying information blocks on an application interface of collaborative interaction, wherein each information block records data generated in a collaborative interaction process, and a thumbnail corresponding to the audio and video stream data is displayed on the audio and video stream information block; receiving a first triggering operation on at least one thumbnail in the audio and video stream information block; and adding an audio and video window on the application interface according to the first trigger operation, reading the audio and video stream data corresponding to the thumbnail confirmed by the trigger operation from the buffer area, decoding it to obtain an audio and video picture, and displaying the audio and video picture in the audio and video window. According to this scheme, the display of the audio and video window is not affected by user operation during the collaborative interaction, and smooth playing of the audio and video stream is realized.

Description

Window display method and device for collaborative interaction, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of interaction, in particular to a window display method and device for collaborative interaction, electronic equipment and a storage medium.
Background
In a conference scenario in the context of information technology, many people often discuss focus information presented around a conference device. In such a discussion, the focus information may come from files uploaded by the participants' terminal devices, files generated by the conference device, and, if there are remotely accessed participants, files generated by the remote conference devices.
To manage the associated data in a conference scene, existing conference support software manages different data on different functional pages, with audio and video data displayed on one functional page. In this mode, switching between audio and video data is cumbersome, and the display of the audio and video data depends on the display of its functional page: when the functional page is switched, the playing window of the audio and video data is hidden or interrupted and exits.
Disclosure of Invention
The invention provides a window display method and device for collaborative interaction, an electronic device and a storage medium, which are used to solve the technical problem that, in existing collaborative interaction, the display window of audio and video data is hidden or interrupted and exits when a functional page is switched.
In a first aspect, an embodiment of the present invention provides a method for displaying windows in collaborative interaction, including:
receiving audio and video stream data sent by at least one terminal device accessed to collaborative interaction, and buffering the audio and video stream data in a preset buffer area;
displaying information blocks on an application interface of collaborative interaction, wherein each information block records data generated in a collaborative interaction process, and a thumbnail corresponding to the audio and video stream data is displayed on the audio and video stream information block;
receiving a first triggering operation of at least one thumbnail in the audio/video stream information block;
and adding an audio and video window to the application interface according to the first trigger operation, reading audio and video stream data corresponding to the thumbnail confirmed by the trigger operation from the buffer area, decoding to obtain an audio and video picture, and displaying the audio and video picture on the audio and video window.
In a second aspect, an embodiment of the present invention further provides a window display apparatus for collaborative interaction, including:
the data receiving and caching unit is used for receiving audio and video stream data sent by at least one terminal device accessed to collaborative interaction and caching the audio and video stream data in a preset cache area;
The thumbnail display unit is used for displaying information blocks on the application interface of collaborative interaction, each information block records data generated in the collaborative interaction process, and the thumbnail corresponding to the audio and video stream data is displayed on the audio and video stream information block;
a first operation receiving unit, configured to receive a first trigger operation on at least one thumbnail in the audio/video stream information block;
and the first window display unit is used for adding an audio and video window to the application interface according to the first trigger operation, reading audio and video stream data corresponding to the thumbnail confirmed by the trigger operation from the buffer area, decoding to obtain an audio and video picture, and displaying the audio and video picture on the audio and video window.
In a third aspect, an embodiment of the present invention further provides an electronic device, including:
one or more processors;
a memory for storing one or more programs;
when the one or more programs are executed by the one or more processors, the electronic device is caused to implement the method for displaying windows for collaborative interaction as described in the first aspect.
In a fourth aspect, an embodiment of the present invention further provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a window display method of collaborative interaction as described in the first aspect.
The method receives audio and video stream data sent by at least one terminal device accessing the collaborative interaction and buffers the audio and video stream data in a preset buffer area; displays information blocks on an application interface of collaborative interaction, wherein each information block records data generated in the collaborative interaction process, and a thumbnail corresponding to the audio and video stream data is displayed on the audio and video stream information block; receives a first triggering operation on at least one thumbnail in the audio and video stream information block; and adds an audio and video window to the application interface according to the first trigger operation, reads the audio and video stream data corresponding to the thumbnail confirmed by the trigger operation from the buffer area, decodes it to obtain an audio and video picture, and displays the audio and video picture in the audio and video window. When a trigger operation is detected at a thumbnail, an audio and video window is added to the application interface of the collaborative interaction and the corresponding audio and video picture is displayed in that window. By cutting off the dependence of the audio and video window on a particular functional page, the audio and video window is displayed on the application interface independently of the other functional pages, so that the display of the audio and video window is not affected by user operations during the collaborative interaction, and smooth playing of the audio and video stream is realized.
Drawings
FIG. 1 is a method flow chart of a window display method for collaborative interaction provided by an embodiment of the invention;
FIG. 2 is an interface schematic of an interactive tablet;
FIG. 3 is a schematic diagram of an interactive tablet and a terminal device in multi-user communication according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an application interface for collaborative interaction according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of displaying an audio and video window in the prior art, based on FIG. 4;
FIG. 6 is a schematic diagram of the prior art based on FIG. 5;
FIG. 7 is a schematic display diagram of an audio and video window of a window display method for collaborative interaction according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of an audio and video stream data processing process of a window display method of collaborative interaction according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a window display device for collaborative interaction according to an embodiment of the present invention;
FIG. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are for purposes of illustration and not of limitation. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
It should be noted that, for the sake of brevity, this specification is not exhaustive of all of the alternative embodiments, and after reading this specification, one skilled in the art will appreciate that any combination of features may constitute an alternative embodiment as long as the features do not contradict each other.
The following describes each embodiment in detail.
Fig. 1 is a method flowchart of a method for displaying a window of collaborative interaction according to an embodiment of the present invention, as shown in the drawing, the method for displaying a window of collaborative interaction includes:
step S110: and receiving audio and video stream data sent by at least one terminal device accessed to collaborative interaction, and buffering the audio and video stream data in a preset buffer area.
In multi-person interaction scenes such as conferences, teaching, discussions and brainstorming, information has long been transmitted among participants through speech, vision and hearing. This provides a good communication experience, but recording the information exchanged depends on each participant's personal habits and abilities, so part of the information actually transmitted can be lost.
With the development of information technology, richer means are available for information display and information recording in multi-person interaction scenes, especially multi-person interaction based on an interactive tablet: each participant can bring the content to be displayed onto the interactive tablet for display, and the interactive tablet receives and stores the various kinds of information produced during the collaborative communication.
The interactive tablet in this scheme may be an integrated device that controls the content displayed on the display panel and realizes human-computer interaction through touch technology, integrating one or more functions of a projector, an electronic whiteboard, a projection screen, a sound system, a television, a video conference terminal and the like.
Generally, as shown in FIG. 2, the interactive tablet 11 includes at least one display screen 12. For example, the interactive tablet 11 is configured with a touch-enabled display screen 12, which may be a capacitive screen, a resistive screen or an electromagnetic screen. In this embodiment, the user may touch the display screen 12 with a finger or a stylus; accordingly, the interactive tablet 11 detects the touch position and responds according to it to implement the touch function. Typically, the interactive tablet 11 is installed with at least one operating system, which includes, but is not limited to, an Android system, a Linux system and a Windows system.
The interactive tablet serves as an interaction device mainly used to support multi-person communication scenes, and its software mainly realizes the functional requirements of such scenes. For example, the interactive tablet may install at least one application program with a document presentation function; such an application program may come with the operating system, and applications downloaded from a third-party device or server may also be installed. Optionally, a program with the document presentation function not only displays edited content but also provides editing functions during presentation, such as inserting a table, inserting a picture, drawing a table and drawing a picture, so that user input and the input content are displayed in real time during information display, further improving the information display and interaction effect of the presentation. As another example, the interactive tablet may be provided with at least one application program having a whiteboard function, with which users can write, modify and save records in real time in a multi-user collaborative communication scene. For multi-user communication based on the interactive tablet, the content files to be displayed during communication only need to be gathered on the interactive tablet and opened for display one by one, with comments recorded on the whiteboard during communication; each content file still exists as a discrete file.
In existing multi-person communication scenes based on an interactive tablet, data interaction is generally realized through conference support software, and the interaction and data transmission requirements of the multi-person communication scene are met by different functional pages of the conference support software. In this scheme, as shown in FIG. 3, the terminal device of each user logs into the collaboration platform to realize effective management of the conference process data. In the scenario shown in FIG. 3, the terminal device may be a personal computer 21 or a mobile terminal 22, and a single participant may be connected with only the personal computer 21 or the mobile terminal 22, or with both. A participant sends the content file to be displayed on the interactive tablet 11 to the interactive tablet 11, or displays it directly on the terminal device and casts the screen to the interactive tablet 11 for display. With the support of this architecture, multi-user communication is no longer centred on the interactive tablet 11; instead, all participants operate on their respective terminal devices, achieving a better communication effect. This multi-user communication process is also the collaborative interaction process.
In either of the multi-person communication modes described above, different application programs in the interactive tablet can be used together to better support the communication; for example, during a document presentation, supplementary explanation of the presented picture can be added through the real-time input of the whiteboard function. Based on the prepared content files, new content files are generated during communication, or the existing content files are updated.
The data generated during the conference includes relatively static data, such as the various uploaded files, which may not change during the interaction, and relatively dynamic data, such as audio and video stream data, which is updated in real time during the interaction. In the collaborative interaction process, the audio and video stream data mainly comprises call audio and video stream data and screen sharing video stream data. Call audio and video stream data refers to data acquired by a terminal device through audio and video acquisition devices (a microphone and a camera) and sent to the terminal devices participating in the collaborative interaction; screen sharing video stream data refers to image data of the whole screen, or of a certain area of the screen, that a terminal device transmits to the other terminal devices participating in the collaborative interaction. In general, each terminal device sends its audio and video stream data to the collaboration platform, and the collaboration platform performs unified distribution management.
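As a purely illustrative aid, not part of the claimed method, the following TypeScript sketch shows one possible data model for the two kinds of stream data just described; all names and fields are assumptions.
```typescript
// Hypothetical model of one packet of audio/video stream data routed through
// the collaboration platform; field names are illustrative only.
type AvStreamKind = "call" | "screen-share";

interface AvStreamPacket {
  streamId: string;     // identifies one path of audio/video stream data
  senderId: string;     // terminal device that produced the stream
  kind: AvStreamKind;   // call audio/video vs. screen-sharing video
  timestampMs: number;  // capture timestamp, used for ordering in the buffer
  payload: Uint8Array;  // encoded audio/video frame data
}
```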
In the existing playing mode, each terminal device plays the received audio and video stream data in real time on the corresponding functional page, as shown in FIG. 5. However, in this mode, when the "space" functional page shown in FIG. 5 is selected, the interface switches to the "space" page shown in FIG. 6 and the playing window is hidden or interrupted and exits, as also shown in FIG. 6, so the collaborative interaction is interrupted. In particular, when such an operation is performed on the terminal device of a remotely participating user, that participant may fail to obtain the focus information of the collaborative interaction.
In this scheme, in order to display the audio and video stream data properly, the audio and video stream data received from the other terminal devices in the collaborative interaction is not directly played and output by the conference support software in a corresponding playing window; instead, it is buffered in a preset buffer area, and the audio and video stream data is processed from that buffer area.
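Continuing the illustrative sketch above, and again as an assumption rather than the disclosed implementation, buffering might look like the following: incoming packets are queued per stream and only read out when a window asks for them.
```typescript
// Minimal sketch of the preset buffer area: packets are queued per stream
// instead of being handed straight to a playback window.
class AvStreamBuffer {
  private queues = new Map<string, AvStreamPacket[]>();

  push(packet: AvStreamPacket): void {
    const queue = this.queues.get(packet.streamId) ?? [];
    queue.push(packet);
    this.queues.set(packet.streamId, queue);
  }

  // Drain the buffered packets of one stream when an audio/video window needs them.
  read(streamId: string): AvStreamPacket[] {
    const queue = this.queues.get(streamId) ?? [];
    this.queues.set(streamId, []);
    return queue;
  }
}
```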
Step S120: and displaying information blocks on an application interface of collaborative interaction, wherein each information block records data generated in the collaborative interaction process, and thumbnail corresponding to the audio and video stream data is displayed on the audio and video stream information block.
In this scheme, the data generated in the multi-user collaborative interaction is managed on the collaboration platform, and users access their related data through the application interface of the collaborative interaction, both during the interaction and in subsequent use. As shown in FIG. 4, when a participant operates the terminal device to take part in the collaborative interaction, all file and data exchange with the interactive tablet and the other participants is completed in one application interface 30, which is the application interface of the collaborative interaction. In the application interface 30, operations such as sending and receiving files, screen casting, initiating and answering video calls and using the whiteboard during multi-user communication can be performed, as well as subsequent operations such as summarising and displaying the data of the multi-user communication. The application interface 30 contains functional pages 31 for implementing the various interactions or data management of the collaborative interaction process.
Based on the collaboration platform, the operations of each terminal device during multi-user communication and the corresponding data are comprehensively recorded and managed on the collaboration platform, which is built on a server. The collaboration platform can be understood as a platform that performs background management of all operations and corresponding data in the collaborative interaction process, and the application interface of the collaborative interaction can be understood as the interface on the terminal device through which the user interacts with the data managed by the background platform.
The data corresponding to each interactive operation is presented in the application interface 30 in the form of an information block 32. An information block 32 can be understood as a unit area in the application interface 30, with each unit area displaying one piece of data associated with the collaborative interaction. Here, one piece of data essentially means all the data generated after one collaboration control is triggered. For example, after the whiteboard control is triggered, the whiteboard is displayed and multiple input operations are performed on it; in this scheme, each input operation is not defined as one piece of data, but all the whiteboard data together is defined as one piece of data.
As shown in FIG. 4, audio and video stream data is one type of data in the collaborative interaction, and several paths of it generally exist at the same time. They are displayed in the form of thumbnails 322 in the same information block (i.e. the audio and video stream information block 321 defined in this embodiment). A thumbnail 322 is not a reduced display of a playing window of the audio and video stream data; it is only used to indicate the existence of the audio and video stream data and to preview it, and it has a control attribute that determines whether the corresponding audio and video stream data is selected for playing.
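The following TypeScript sketch, illustrative only and with interface and field names that are assumptions, restates this information-block and thumbnail model.
```typescript
// Sketch of the information block and thumbnail implied above: the thumbnail
// merely references a buffered stream, it is not a scaled-down player window.
interface Thumbnail {
  streamId: string;       // which buffered stream this thumbnail stands for
  previewImage?: string;  // optional still preview, e.g. a data URL
  selectable: boolean;    // control attribute: may the user trigger playback?
}

interface InfoBlock {
  blockId: string;
  kind: "file" | "whiteboard" | "av-stream";
  createdAt: number;         // information blocks are ordered by time on the interface
  thumbnails?: Thumbnail[];  // present only on the audio/video stream block
}
```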
Step S130: and receiving a first triggering operation of at least one thumbnail in the audio/video stream information block.
When a trigger operation, such as a mouse click operation or a touch click operation, is received in a display area corresponding to the thumbnail, the first trigger operation is confirmed to be received, and the first trigger operation is used for triggering the subsequent playing of the audio and video stream data.
Step S140: and adding an audio and video window to the application interface according to the first trigger operation, reading audio and video stream data corresponding to the thumbnail confirmed by the trigger operation from the buffer area, decoding to obtain an audio and video picture, and displaying the audio and video picture on the audio and video window.
When the first trigger operation is received, the corresponding audio and video stream data is identified according to the thumbnail on which the operation was performed, and an audio and video window is added to the application interface. The audio and video window in this scheme is the window with the highest display priority in the application interface, and its display state is not changed by switching to another functional page. The audio and video stream data to be displayed or played is buffered in the preset buffer area; when display or playback is needed, the corresponding audio and video stream data is read from the buffer area and decoded to obtain an audio and video picture, which is displayed in the audio and video window. By establishing a direct association between the audio and video window displayed at the front end and the audio and video stream data stored at the back end, the dependence of the audio and video window on a functional page is removed, interruption of the audio and video window caused by operations on functional pages during the collaborative interaction is avoided, and the continuity of information transmission in the collaborative interaction process is ensured.
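Putting the pieces of the sketch together, still as an illustration under the same assumptions, with decodeFrame and AvWindow standing in for a real decoder and window layer, handling the first trigger operation might look like this:
```typescript
// Assumed decoder and top-level window abstraction; the window is created on the
// application interface itself and does not belong to any functional page.
declare function decodeFrame(packet: AvStreamPacket): ImageBitmap | AudioBuffer;
declare class AvWindow {
  constructor(streamId: string);
  render(frame: ImageBitmap | AudioBuffer): void;
  close(): void;
}

// First trigger operation on a thumbnail: add a window, read the buffered
// stream data, decode it, and display the pictures in the window.
function onThumbnailTriggered(thumbnail: Thumbnail, buffer: AvStreamBuffer): AvWindow {
  const win = new AvWindow(thumbnail.streamId);
  for (const packet of buffer.read(thumbnail.streamId)) {
    win.render(decodeFrame(packet));
  }
  return win;
}
```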
If, with the audio and video window displayed according to this scheme as in FIG. 5, an operation to open the functional page corresponding to the "space" is detected, then as shown in FIG. 7 the functional page 31 corresponding to the "space" is opened while the display of the audio and video window 33 is not affected, so the user can keep viewing the focus information of the collaborative interaction and the collaborative interaction process is not interrupted.
For any one specific path of data, the audio and video stream data in this scheme may be audio stream data or video stream data. For audio stream data, the decoded picture may be only a static source picture, accompanied by audio output; for video stream data, a decoded dynamic picture and possibly audio output are obtained. This scheme only specifies that the picture is displayed in the added audio and video window; the audio can be output through a speaker or earphones according to the prior art.
As shown in FIG. 4, the application interface 30 also displays an audio and video invitation control 323. In FIG. 4 the audio and video invitation control 323 is placed in the audio and video stream information block 321, but it may also be placed in a fixed control area of the application interface 30. Based on the audio and video invitation control 323, the scheme can further realize the invitation of audio and video and the prompt when an invitation is received, through steps S151 to S153.
Step S151: and receiving a second triggering operation of the audio and video invitation control, and displaying the contact list of the collaborative interaction.
Step S152: and receiving a target confirmation operation in the contact list, and sending an audio and video connection request to the contact confirmed by the target confirmation operation.
Step S153: when receiving an audio and video connection request, caching corresponding audio and video request data in the cache area, adding an audio and video window on the application interface, and displaying information of the audio and video request on the audio and video window.
When a mouse click or touch click operation is detected in the display area corresponding to the audio and video invitation control, it is confirmed that the second trigger operation has been received, and the contact list corresponding to the user identifier associated with the terminal device is displayed, for example a department member list or a personal contact list. A target confirmation operation in the contact list, i.e. confirming at least one contact and confirming that the audio and video invitation is to be sent, is received, and an audio and video connection request is sent to the confirmed contact. Correspondingly, when the user identifier associated with a terminal device receives an audio and video connection request, the corresponding audio and video request data is cached in the cache area, an audio and video window is added to the application interface, and the information of the audio and video request is displayed in the audio and video window. Steps S151 to S153 describe the operations of a single terminal device during one audio and video connection request: the terminal device that initiates the request executes steps S151 and S152, and a terminal device that is invited to join the audio and video connection executes step S153.
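As an illustrative sketch only, with showContactList, sendConnectRequest and cacheRequestData as assumed helpers rather than the disclosed implementation, the invitation flow of steps S151 to S153 could be expressed as:
```typescript
interface Contact { userId: string; displayName: string; }
interface ConnectRequest { requestId: string; fromUserId: string; streamId: string; }

declare function showContactList(): Promise<Contact[]>;        // user picks the target contacts
declare function sendConnectRequest(targets: Contact[]): void; // sent via the collaboration platform
declare function cacheRequestData(request: ConnectRequest): void;

// Initiating side: second trigger operation on the invitation control (S151-S152).
async function onInviteControlTriggered(): Promise<void> {
  const targets = await showContactList();
  if (targets.length > 0) {
    sendConnectRequest(targets);
  }
}

// Invited side: cache the request data and show a window with the request info (S153).
function onConnectRequestReceived(request: ConnectRequest): AvWindow {
  cacheRequestData(request);
  return new AvWindow(request.streamId); // initially shows the caller's information
}
```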
Step S154: and receiving a connection confirmation operation in the audio/video window.
Step S155: and receiving the corresponding audio and video stream data and sending the audio and video stream data.
The information of the audio and video request displayed in the audio and video window is mainly information about the initiator of the request, together with controls for accepting or declining the connection. When a trigger operation is detected on the accept control, the audio and video connection is established: the corresponding audio and video stream data is received, and the audio and video stream data acquired by the local hardware or captured from the screen is sent. Steps S110 and S120 are then executed accordingly while the display state of the audio and video window is maintained, and the decoded audio and video pictures continue to be displayed in the audio and video window. When a trigger operation is detected on the decline control, the display of the audio and video window ends and no corresponding audio and video stream data is transmitted.
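A matching sketch of the accept/decline handling, where startLocalCapture is an assumed placeholder for the real capture and sending path:
```typescript
declare function startLocalCapture(): void; // begin sending locally acquired or screen-captured data

// Connection confirmation in the audio/video window (steps S154-S155).
function onRequestAnswered(win: AvWindow, accepted: boolean): void {
  if (accepted) {
    startLocalCapture();
    // Received stream data then follows steps S110/S120 and is decoded into this same window.
  } else {
    win.close(); // declined: end the window, no stream data is exchanged
  }
}
```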
A displayed audio and video window can be closed through a window closing operation; closing the audio and video window alone does not affect the receiving of the audio and video stream data or the display of the corresponding thumbnail in the information block. When the audio and video window needs to be displayed again later in the collaborative interaction, the display process of the audio and video window can be completed again through steps S130 and S140.
The data processing involved in displaying the audio and video stream data in an audio and video window can be seen in FIG. 8. In this scheme, a client service process 34 is dedicated to managing the received audio and video stream data. In FIG. 8, four paths of audio and video stream data, "data one" to "data four", are received; the types of the four paths may differ, but all of them are managed by the client service process 34. After receiving the data, the client service process 34 caches it in the preset cache area and sends the corresponding thumbnails to the information block for display; the audio and video stream data is not sent to a specific function window of the next layer for display or playback. When there is a playback requirement, an audio and video window of the corresponding type is added to the application interface: for example, the audio and video playing window 33a is used to play audio data and video data, the audio and video listening window 33b is used to display audio and video request information, and the screen sharing window 33c is used to play the shared screen picture. It should be noted that the various audio and video windows in this scheme need not differ in their concrete implementation; the different names are only used to distinguish the displayed content.
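The role of the client service process can be sketched as follows; again this is illustrative, and ClientService, publishThumbnail and the window kinds are assumed names:
```typescript
type WindowKind = "playback" | "listening" | "screen-share";

declare function publishThumbnail(streamId: string): void; // assumed helper: thumbnail into the info block

class ClientService {
  constructor(private buffer: AvStreamBuffer) {}

  // All incoming stream data is cached first; nothing is handed to a function page.
  onStreamData(packet: AvStreamPacket): void {
    this.buffer.push(packet);
    publishThumbnail(packet.streamId);
  }

  // A window is only created when playback is actually requested; the kinds
  // differ only in what they display, not in how the window behaves.
  openWindow(streamId: string, _kind: WindowKind): AvWindow {
    return new AvWindow(streamId);
  }
}
```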
Step S160: and when the audio and video window is closed, adding information blocks on the application interface of collaborative interaction, and recording corresponding audio and video stream data in the display process of the audio and video window in the added information blocks.
While the audio and video window is displayed, the screen can be recorded; the recording range may be the audio and video window or the whole application interface. The file obtained by screen recording can be stored on the collaboration platform or on the local terminal device and is recorded in a newly created information block. The information block does not store the data itself; it provides an interface for accessing the related data, and whether the data is stored on the collaboration platform or on the local terminal device, it can be accessed through the interface in the information block. The information blocks are recorded in the application interface in time order. On this basis, the data of the whole collaborative interaction process is recorded in an orderly way, and by accessing the application interface of the collaborative interaction a user can quickly obtain all the behaviours of the multi-user communication process and the data generated by them, which shortens the lengthy operation chain of file transfer and improves the efficiency of information transfer.
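A sketch of step S160 under the same assumptions, where stopRecording and appendInfoBlock are hypothetical helpers and the information block only holds an access handle:
```typescript
interface RecordingHandle { uri: string; storedAt: "platform" | "local"; }

declare function stopRecording(streamId: string): RecordingHandle; // assumed recorder API
declare function appendInfoBlock(block: InfoBlock & { recording: RecordingHandle }): void;

// When the audio/video window closes, append an information block that records
// where the screen recording of that window's display process can be accessed.
function onAvWindowClosed(streamId: string): void {
  const recording = stopRecording(streamId);
  appendInfoBlock({
    blockId: `rec-${streamId}`,
    kind: "av-stream",
    createdAt: Date.now(), // information blocks are kept in time order
    recording,             // access handle, not the recorded data itself
  });
}
```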
Based on the display of the information block, the specific display of the file data is further realized through step S161-step S164:
step S161: and receiving a selection operation on the application interface of the collaborative interaction, wherein the selection operation is used for selecting a target information block used for displaying file data from the information blocks of the application interface of the collaborative interaction.
Step S162: and displaying a display window of a target application, wherein the target application is an application program for opening file data corresponding to the target information block.
Step S163: and receiving a closing operation for the display window.
Step S164: and closing the display window according to the closing operation, and updating the display content of the target information block on the cooperative interactive application interface.
When one of the information blocks displayed in the application interface is selected through a selection operation, such as a mouse click or a touch click, that information block becomes the target information block and the file data corresponding to it needs to be displayed. According to the type of the file data, a display window of the corresponding application program is shown and the actual content of the file data is displayed in it; for example, a window of a word-processing program is opened and the word document is displayed in that window. When the display window is closed, any editing of the file data done in the display window is correspondingly updated to the target information block.
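Steps S161 to S164 can be sketched as follows, with openWithTargetApp and refreshInfoBlock as assumed placeholders for the real application launcher and interface update:
```typescript
declare function openWithTargetApp(blockId: string): Promise<{ edited: boolean }>;
declare function refreshInfoBlock(blockId: string): void;

// Selection of a target information block: open its file data in the display
// window of the target application, then sync any edits back when it closes.
async function onInfoBlockSelected(block: InfoBlock): Promise<void> {
  const result = await openWithTargetApp(block.blockId); // e.g. a word-processor window
  if (result.edited) {
    refreshInfoBlock(block.blockId); // update the block's displayed content
  }
}
```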
In addition, for screen sharing video stream data, a screen sharing operation input by the user can be received; according to the screen sharing operation, the currently displayed screen content is captured to form screen sharing video stream data, which is sent to the at least one terminal device accessing the collaborative interaction for display. Sending the content currently displayed by the interactive tablet to the currently accessed terminal devices for display enlarges the transmission range of the interactively displayed content and enhances the information transmission effect of the collaborative interaction process.
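A final sketch of this screen-sharing path, where captureScreenFrame and distributeToPeers are assumptions standing in for the real capture API and platform distribution:
```typescript
declare function captureScreenFrame(): Uint8Array;                // assumed screen-capture API
declare function distributeToPeers(packet: AvStreamPacket): void; // distribution via the collaboration platform

// Screen sharing operation: wrap the current screen content as a screen-share
// stream packet and hand it to the collaboration platform for the other devices.
function onScreenShareRequested(localUserId: string): void {
  const packet: AvStreamPacket = {
    streamId: `share-${localUserId}`,
    senderId: localUserId,
    kind: "screen-share",
    timestampMs: Date.now(),
    payload: captureScreenFrame(),
  };
  distributeToPeers(packet);
}
```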
The data associated with the collaborative interaction is displayed in the form of information blocks, not only in the group record interface but also in other display processes in which the data is managed in various ways, such as sorting and filtering. Until a particular information block is clicked and its specific data content is opened in a window of the application interface, the data is presented within the frame of the information block, which shows how the data is associated with the collaborative interaction and gives an overview of its status.
In summary, audio and video stream data sent by at least one terminal device accessing the collaborative interaction is received and buffered in a preset buffer area; information blocks are displayed on the application interface of the collaborative interaction, wherein each information block records data generated in the collaborative interaction process, and a thumbnail corresponding to the audio and video stream data is displayed on the audio and video stream information block; a first trigger operation on at least one thumbnail in the audio and video stream information block is received; and an audio and video window is added to the application interface according to the first trigger operation, the audio and video stream data corresponding to the thumbnail confirmed by the trigger operation is read from the buffer area and decoded to obtain an audio and video picture, and the audio and video picture is displayed in the audio and video window. When a trigger operation is detected at a thumbnail, an audio and video window is added to the application interface of the collaborative interaction and the corresponding audio and video picture is displayed in that window. By cutting off the dependence of the audio and video window on a particular functional page, the audio and video window is displayed on the application interface independently of the other functional pages, so that the display of the audio and video window is not affected by user operations during the collaborative interaction and smooth playing of the audio and video stream is realized.
Fig. 9 is a schematic structural diagram of a window display device for collaborative interaction according to an embodiment of the present invention. Referring to fig. 9, the window display apparatus of collaborative interaction includes: a data receiving buffer unit 210, a thumbnail display unit 220, a first operation receiving unit 230, and a first window display unit 240.
The data receiving and buffering unit 210 is configured to receive audio and video stream data sent by at least one terminal device accessing collaborative interaction, and buffer the audio and video stream data in a preset buffer area; the thumbnail display unit 220 is configured to display information blocks on an application interface of collaborative interaction, where each information block records data generated in a collaborative interaction process, and a thumbnail corresponding to the audio and video stream data is displayed on the audio and video stream information block; a first operation receiving unit 230, configured to receive a first trigger operation on at least one thumbnail in the audio/video stream information block; and the first window display unit 240 is configured to add an audio/video window to the application interface according to the first trigger operation, read audio/video stream data corresponding to the thumbnail confirmed by the trigger operation from the buffer area, decode the audio/video stream data to obtain an audio/video picture, and display the audio/video picture on the audio/video window.
On the basis of the embodiment, the application interface is also displayed with an audio/video invitation control;
correspondingly, the device further comprises:
the second operation receiving unit is used for receiving a second triggering operation of the audio and video invitation control and displaying the contact list of the collaborative interaction;
a connection request sending unit, configured to receive a target confirmation operation from the contact list, and send an audio/video connection request to a contact confirmed by the target confirmation operation;
and the second window display unit is used for caching corresponding audio and video request data in the cache area when receiving the audio and video connection request, adding an audio and video window in the application interface, and displaying information of the audio and video request in the audio and video window.
On the basis of the above embodiment, the apparatus further includes:
a third operation receiving unit, configured to receive a connection confirmation operation in the audio/video window;
and the data transmitting unit is used for receiving the audio and video stream data corresponding to the audio and video connection request and transmitting the acquired audio and video stream data.
On the basis of the above embodiment, the apparatus further includes:
and the data storage unit is used for adding information blocks on the application interface of the collaborative interaction when the audio and video window is closed, and recording corresponding audio and video stream data in the display process of the audio and video window in the added information blocks.
On the basis of the above embodiment, the apparatus further includes:
the information block selection unit is used for receiving selection operation on the cooperative interactive application interface, wherein the selection operation is used for selecting a target information block used for displaying file data from the information blocks of the cooperative interactive application interface;
the data display unit is used for displaying a display window of a target application, wherein the target application is an application program for opening file data corresponding to the target information block;
a closing operation receiving unit configured to receive a closing operation for the display window;
and the information block updating unit is used for closing the display window according to the closing operation and updating the display content of the target information block on the cooperative interactive application interface.
On the basis of the embodiment, the audio/video stream includes a call audio/video stream and a screen sharing video stream.
On the basis of the above embodiment, the apparatus further includes:
a sharing operation receiving unit for receiving a screen sharing operation input by a user;
and the shared content sending unit is used for acquiring the currently displayed screen content according to the screen sharing operation, forming screen sharing video stream data and sending the screen sharing video stream data to the at least one terminal device accessed to the collaboration interaction for display.
The window display device for collaborative interaction provided by the embodiment of the invention is contained in the electronic equipment, can be used for executing any window display method for collaborative interaction provided by the embodiment, and has corresponding functions and beneficial effects.
It should be noted that, in the embodiment of the window display device for collaborative interaction, each unit and module included are only divided according to the functional logic, but not limited to the above-mentioned division, so long as the corresponding functions can be implemented; in addition, the specific names of the functional units are also only for distinguishing from each other, and are not used to limit the protection scope of the present invention.
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. As shown in fig. 10, the electronic device includes a processor 310, a memory 320, an input device 330, an output device 340, and a communication device 350; the number of processors 310 in the electronic device may be one or more, and one processor 310 is taken as an example in fig. 10; the processor 310, the memory 320, the input device 330, the output device 340, and the communication device 350 in the electronic device may be connected by a bus or other means, and a bus connection is taken as an example in fig. 10.
The memory 320 is a computer-readable storage medium, and may be used to store a software program, a computer-executable program, and modules, such as program instructions/modules (e.g., the data receiving buffer unit 210, the thumbnail display unit 220, the first operation receiving unit 230, and the first window display unit 240) corresponding to a window display method of collaborative interaction in an embodiment of the present invention. The processor 310 executes various functional applications of the electronic device and data processing, i.e., a window display method implementing the above-described collaborative interaction, by running software programs, instructions, and modules stored in the memory 320.
Memory 320 may include primarily a program storage area and a data storage area, wherein the program storage area may store an operating system, at least one application program required for functionality; the storage data area may store data created according to the use of the electronic device, etc. In addition, memory 320 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, memory 320 may further include memory located remotely from processor 310, which may be connected to the electronic device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 330 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. The output device 340 may include a display device such as a display screen.
The electronic equipment comprises the window display device for the cooperative interaction, can be used for executing any window display method for the cooperative interaction, and has corresponding functions and beneficial effects.
The embodiments of the present invention also provide a storage medium containing computer executable instructions, which when executed by a computer processor, are configured to perform related operations in the window display method for collaborative interaction provided in any embodiment of the present application, and have corresponding functions and beneficial effects.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product.
Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions.
These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory. The memory may include volatile memory in a computer-readable medium, random Access Memory (RAM) and/or nonvolatile memory, etc., such as Read Only Memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
Computer readable media, including both non-transitory and non-transitory, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of storage media for a computer include, but are not limited to, phase change memory (PRAM), static Random Access Memory (SRAM), dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), read Only Memory (ROM), electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. Computer-readable media, as defined herein, does not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.

Claims (10)

1. A method for displaying windows in a collaborative interaction, comprising:
receiving audio and video stream data sent by at least one terminal device accessed to collaborative interaction, and buffering the audio and video stream data in a preset buffer area;
displaying information blocks on an application interface of collaborative interaction, wherein each information block records data generated in a collaborative interaction process, and a thumbnail corresponding to the audio and video stream data is displayed on the audio and video stream information block;
receiving a first triggering operation of at least one thumbnail in the audio/video stream information block;
and adding an audio and video window to the application interface according to the first trigger operation, reading audio and video stream data corresponding to the thumbnail confirmed by the trigger operation from the buffer area, decoding to obtain an audio and video picture, and displaying the audio and video picture on the audio and video window.
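As an illustration only of the flow recited in claim 1, the sketch below buffers incoming stream data, lists one thumbnail per buffered stream, and opens an audio and video window with decoded pictures when a thumbnail is triggered. The class, method, and decoder names are assumptions made for this sketch, not the claimed implementation.

```python
from collections import defaultdict


class CollaborationInterface:
    """Sketch of the claim-1 flow: buffer streams, show thumbnails, open windows."""

    def __init__(self, decoder):
        self.buffer = defaultdict(list)  # preset buffer area, keyed by stream id
        self.decoder = decoder           # assumed callable: bytes -> decoded picture
        self.windows = {}                # currently open audio/video windows

    def on_stream_data(self, stream_id, chunk):
        # Cache audio/video stream data sent by a terminal device
        # that has joined the collaborative interaction.
        self.buffer[stream_id].append(chunk)

    def thumbnails(self):
        # The audio/video stream information block would show one
        # thumbnail per buffered stream; here we simply list the ids.
        return list(self.buffer)

    def on_thumbnail_triggered(self, stream_id):
        # First trigger operation: add an audio/video window, read the
        # cached stream data, decode it, and display the pictures.
        pictures = [self.decoder(chunk) for chunk in self.buffer[stream_id]]
        self.windows[stream_id] = {"pictures": pictures, "open": True}
        return self.windows[stream_id]
```

In practice the dictionary standing in for a window would be a real UI window object, and the decoder an actual audio/video decoder; the sketch only fixes the order of operations.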
2. The method of claim 1, wherein the application interface further displays an audio and video invitation control;
correspondingly, the method further comprises:
receiving a second trigger operation on the audio and video invitation control, and displaying a contact list of the collaborative interaction;
receiving a target confirmation operation in the contact list, and sending an audio and video connection request to the contact confirmed by the target confirmation operation;
and when an audio and video connection request is received, caching corresponding audio and video request data in the buffer area, adding an audio and video window to the application interface, and displaying information of the audio and video request in the audio and video window.
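The invitation flow of claim 2 could be organized along the following lines; the transport object and its send_request method are hypothetical stand-ins for whatever signalling channel the collaboration uses, and the dictionaries stand in for UI elements.

```python
class InvitationFlow:
    """Sketch of the claim-2 audio/video invitation flow (names are illustrative)."""

    def __init__(self, contacts, transport):
        self.contacts = contacts    # contact list of the collaborative interaction
        self.transport = transport  # assumed to expose send_request(contact, payload)
        self.request_cache = {}     # stands in for the buffer area
        self.windows = {}           # stands in for audio/video windows on the interface

    def on_invite_control_triggered(self):
        # Second trigger operation on the audio/video invitation control:
        # return the contact list so a target contact can be confirmed.
        return self.contacts

    def on_target_confirmed(self, contact):
        # Send an audio/video connection request to the confirmed contact.
        self.transport.send_request(contact, {"type": "av-connection-request"})

    def on_connection_request(self, request_id, request_data):
        # Incoming request: cache the request data, add an audio/video window,
        # and display the request information in that window.
        self.request_cache[request_id] = request_data
        self.windows[request_id] = {"info": request_data, "open": True}
        return self.windows[request_id]
```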
3. The method according to claim 2, wherein after the corresponding audio and video request data is cached in the buffer area, the audio and video window is added to the application interface, and the information of the audio and video request is displayed in the audio and video window, the method further comprises:
receiving a connection confirmation operation in the audio and video window;
and receiving the corresponding audio and video stream data and sending audio and video stream data.
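A minimal sketch of the confirmation step in claim 3, assuming a transport object with subscribe and send methods (both names are assumptions): incoming chunks for the confirmed request are routed into the buffer area while outgoing stream data is sent back.

```python
def on_connection_confirmed(transport, request_id, buffer, outgoing_chunks):
    """Claim-3 sketch: after the confirmation operation in the window, receive
    the corresponding stream into the buffer area and send stream data back."""
    # Receive: route incoming chunks for this request into the buffer area.
    transport.subscribe(
        request_id,
        lambda chunk: buffer.setdefault(request_id, []).append(chunk),
    )
    # Send: push audio/video stream data for the confirmed connection.
    for chunk in outgoing_chunks:
        transport.send(request_id, chunk)
```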
4. The method of claim 1, wherein after the audio and video window is added to the application interface according to the first trigger operation, the audio and video stream data corresponding to the thumbnail confirmed by the trigger operation is read from the buffer area and decoded to obtain the audio and video picture, and the audio and video picture is displayed in the audio and video window, the method further comprises:
and when the audio and video window is closed, adding an information block to the application interface of the collaborative interaction, and recording, in the added information block, the audio and video stream data corresponding to the display process of the audio and video window.
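For claim 4, closing a window adds an information block that records the stream data shown during the window's display process. With the same dictionary stand-ins as in the earlier sketches (all names illustrative):

```python
def on_window_closed(windows, buffer, info_blocks, stream_id):
    """Claim-4 sketch: close the audio/video window and add an information block
    that records the stream data shown while the window was displayed."""
    window = windows.pop(stream_id, None)
    if window is None:
        return info_blocks  # nothing to record if no such window was open
    info_blocks.append({
        "type": "audio-video",
        "stream_id": stream_id,
        "recorded_data": list(buffer.get(stream_id, [])),
    })
    return info_blocks
```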
5. The method according to claim 4, wherein the method further comprises:
receiving a selection operation on the application interface of the collaborative interaction, wherein the selection operation is used for selecting, from the information blocks of the application interface, a target information block for displaying file data;
displaying a display window of a target application, wherein the target application is an application program for opening the file data corresponding to the target information block;
receiving a closing operation for the display window;
and closing the display window according to the closing operation, and updating the display content of the target information block on the application interface of the collaborative interaction.
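For the file-data blocks of claim 5, the interface resolves a target application for the block's file and refreshes the block's display content when that application's window closes. The following sketch assumes applications are looked up by file extension; the registry and the preview field are illustrative only.

```python
def open_target_block(block, app_registry):
    """Claim-5 sketch: resolve the target application for a file-data block
    and return a display window for it."""
    extension = block["file_name"].rsplit(".", 1)[-1].lower()
    target_app = app_registry.get(extension, "default-viewer")
    return {"app": target_app, "file": block["file_name"], "open": True}


def close_target_window(window, block):
    """Close the display window and update the target block's display content."""
    window["open"] = False
    block["preview"] = f"last opened with {window['app']}"
    return block
```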
6. The method of claim 1, wherein the audio-video stream data comprises talk audio-video stream data and screen sharing video stream data.
7. The method of claim 6, wherein the method further comprises:
receiving a screen sharing operation input by a user;
and according to the screen sharing operation, acquiring the currently displayed screen content, forming screen sharing video stream data, and sending the screen sharing video stream data to the at least one terminal device that has joined the collaborative interaction for display.
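Screen sharing as in claims 6 and 7 amounts to capturing the current screen, forming video stream data from it, and sending that data to the accessing terminal devices. In the sketch below, capture_frame, encode, and send are assumed callables supplied by the host application rather than part of the claimed method.

```python
import time


def share_screen(capture_frame, encode, send, fps=15, num_frames=150):
    """Claim-7 sketch: turn captured screen content into screen-sharing video
    stream data and push it to the terminal devices in the collaboration."""
    interval = 1.0 / fps
    for _ in range(num_frames):
        frame = capture_frame()  # currently displayed screen content
        packet = encode(frame)   # forms screen-sharing video stream data
        send(packet)             # delivered to each accessing terminal device
        time.sleep(interval)
```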
8. A collaborative interactive window display device, comprising:
a data receiving and caching unit, configured to receive audio and video stream data sent by at least one terminal device that has joined the collaborative interaction, and to cache the audio and video stream data in a preset buffer area;
a thumbnail display unit, configured to display information blocks on an application interface of the collaborative interaction, wherein each information block records data generated in the collaborative interaction process, and a thumbnail corresponding to the audio and video stream data is displayed on an audio and video stream information block;
a first operation receiving unit, configured to receive a first trigger operation on at least one thumbnail in the audio and video stream information block;
and a first window display unit, configured to add an audio and video window to the application interface according to the first trigger operation, read the audio and video stream data corresponding to the thumbnail confirmed by the trigger operation from the buffer area, decode the read data to obtain an audio and video picture, and display the audio and video picture in the audio and video window.
9. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the interactive tablet to implement the window display method for collaborative interaction of any one of claims 1-7.
10. A computer readable storage medium having stored thereon a computer program, characterized in that the program, when executed by a processor, implements a method for window display of collaborative interaction according to any of claims 1-7.
CN202111554745.9A 2021-12-17 2021-12-17 Window display method and device for collaborative interaction, electronic equipment and storage medium Pending CN116266085A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111554745.9A CN116266085A (en) 2021-12-17 2021-12-17 Window display method and device for collaborative interaction, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111554745.9A CN116266085A (en) 2021-12-17 2021-12-17 Window display method and device for collaborative interaction, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116266085A true CN116266085A (en) 2023-06-20

Family

ID=86743814

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111554745.9A Pending CN116266085A (en) 2021-12-17 2021-12-17 Window display method and device for collaborative interaction, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116266085A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination