WO2023138222A1 - Display device and live broadcast method - Google Patents

Display device and live broadcast method

Info

Publication number
WO2023138222A1
Authority
WO
WIPO (PCT)
Prior art keywords
class
live
camera
picture
interface
Application number
PCT/CN2022/135719
Other languages
English (en)
French (fr)
Inventor
江陆卫新
李英杰
Original Assignee
聚好看科技股份有限公司
Priority claimed from CN202210077270.7A (CN114390357B)
Priority claimed from CN202210405366.1A (CN116962729A)
Application filed by 聚好看科技股份有限公司
Publication of WO2023138222A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/47: End-user applications
    • H04N 21/472: End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/485: End-user interface for client configuration

Definitions

  • the present application relates to the technical field of display devices, in particular to a display device with a camera and a live broadcast method.
  • live classes are one of the rapidly developing scenarios for realizing online education.
  • the live class program can be configured in the display device.
  • In an online live class, it is necessary to simulate a real classroom scene: students need to see both the teacher's picture and the content the teacher writes on the blackboard.
  • A current live class usually uses a single camera to capture the teacher and the blackboard at the same time. If other pictures need to be shown during the live broadcast, a dedicated staff member has to manually move the camera, which is inflexible and affects the progress and effect of the live class.
  • The start time and end time of a current online live class are based on the times set by the teacher when creating the live class. When the end time is reached, the live class stops immediately.
  • If the teacher has unfinished content, he or she can only continue teaching in the next class. As a result, the teacher's class experience is poor.
  • The present disclosure provides a live broadcast method for a display device with a camera. The method comprises: in response to a start instruction generated by triggering a target live lesson control in a user interface, displaying a first live broadcast interface, the first live broadcast interface including at least one of a class button, an auxiliary stream start button, a teacher video window, and a play window in a lecture area; in response to a start-live instruction generated by triggering the class button, displaying whiteboard content in the play window of the lecture area, starting the first camera of the dual cameras and acquiring a first picture captured by it, displaying the first picture in the teacher video window, and sending the first picture as the main video stream to the student-side display device through the server for display; and in response to a dual-camera live broadcast instruction generated by triggering the auxiliary stream start button, starting the second camera of the dual cameras and acquiring a second picture captured by it, displaying the second picture in the play window of the lecture area, and sending the second picture as an auxiliary video stream to the student-side display device through the server for display.
  • The present disclosure also provides a display device, including: a display configured to display an image and/or a user interface; a first camera configured to capture a first picture; a second camera configured to capture a second picture; and a controller connected to the display, the first camera, and the second camera. The controller is configured to: in response to a start instruction generated by triggering a target live lesson control in the user interface, display a first live interface, the first live interface including at least one of a class button, an auxiliary stream start button, a teacher video window, and a play window in a handout area; in response to a start-live instruction generated by triggering the class button, display whiteboard content in the play window of the handout area, start the first camera and acquire the first picture captured by it, display the first picture in the teacher video window, and send the first picture as the main video stream to the student-side display device through the server for display; and in response to a dual-camera live broadcast instruction generated by triggering the auxiliary stream start button, start the second camera and acquire the second picture captured by it, display the second picture in the play window of the handout area, and send the second picture as an auxiliary video stream to the student-side display device through the server for display.
  • Fig. 1 shows a schematic diagram of an operation scenario between a display device and a control device according to some embodiments.
  • Fig. 2 shows a block diagram of the hardware configuration of a control device 100 according to some embodiments.
  • Fig. 3 shows a block diagram of the hardware configuration of a display device 200 according to some embodiments.
  • Fig. 4 shows a software configuration diagram of a display device 200 according to some embodiments.
  • Fig. 5 shows a flowchart of a method for performing a dual-camera live broadcast on a teacher-side display device according to some embodiments.
  • Fig. 6 shows a sequence diagram of a dual-camera live broadcast method according to some embodiments.
  • Fig. 7 shows a schematic diagram of the display effect of the main interface of the educational application program in a display device 200 according to some embodiments.
  • Fig. 8 shows a schematic diagram of the first live interface displayed by a teacher-side display device according to some embodiments.
  • Fig. 9 shows a schematic diagram of the second live interface displayed by a student-side display device according to some embodiments.
  • Fig. 10 shows a schematic diagram of displaying lecture content in the play window of the lecture area in the first live interface according to some embodiments.
  • Fig. 11 shows a schematic diagram of displaying an auxiliary stream setting pop-up window in the first live interface according to some embodiments.
  • Fig. 12 shows a schematic diagram of displaying the second picture in the play window of the handout area according to some embodiments.
  • Fig. 13 shows a schematic diagram of displaying the switched main and auxiliary video streams in the first live interface according to some embodiments.
  • Fig. 14 shows a flowchart of a method for performing a dual-camera live broadcast on a student-side display device according to some embodiments.
  • Fig. 15 shows a schematic diagram of displaying the switched main and auxiliary video streams in the second live interface according to some embodiments.
  • Fig. 16 shows a flowchart of a dual-camera live broadcast method performed by a server according to some embodiments.
  • Fig. 17 shows a schematic flowchart of a method for managing live class overruns (classes that run past their scheduled end time) according to some embodiments.
  • Fig. 18 shows a schematic diagram of the display effect of the live class overrun interface according to some embodiments.
  • Fig. 19 shows a sequence diagram of a method for managing live class overruns according to some embodiments.
  • the display device provided in the embodiments of the present application may have various implementation forms, for example, it may be a TV, a smart TV, a computer, a laser projection device, a monitor, an electronic bulletin board, an electronic table, etc.
  • Fig. 1 and Fig. 2 show a specific implementation of the display device of the present application.
  • Fig. 1 shows a schematic diagram of an operation scene between a display device and a control device according to some embodiments. As shown in FIG. 1 , the user can operate the display device 200 through the smart device 300 or the control device 100 .
  • The control device 100 may be a remote controller. Communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-range communication methods, and the display device 200 is controlled in a wireless or other wired manner.
  • the user can control the display device 200 by inputting user commands through buttons on the remote control, voice input, control panel input, and the like.
  • the smart device 300 (such as a mobile terminal, a tablet computer, a computer, a notebook computer, etc.) can also be used to control the display device 200 .
  • the display device 200 is controlled using an application program running on the smart device.
  • the display device may not use the aforementioned smart device or control device to receive instructions, but may receive user control through touch or gesture.
  • the display device 200 can also be controlled in a manner other than the control device 100 and the smart device 300.
  • the user's voice command control can be directly received through the module for acquiring voice commands configured inside the display device 200, or the user's voice command control can be received through the voice control device installed outside the display device 200.
  • the display device 200 also performs data communication with the server 400 .
  • the display device 200 may be allowed to communicate via a local area network (LAN), a wireless local area network (WLAN), and other networks.
  • the server 400 may provide various contents and interactions to the display device 200 .
  • the server 400 may be one cluster, or multiple clusters, and may include one or more types of servers.
  • Fig. 2 shows a block diagram of hardware configuration of the control device 100 according to some embodiments.
  • the control device 100 includes a controller 110 , a communication interface 130 , a user input/output interface 140 , a memory, and a power supply.
  • the control device 100 can receive the user's input operation instruction, and convert the operation instruction into an instruction that the display device 200 can recognize and respond to, and play an intermediary role between the user and the display device 200 .
  • FIG. 3 shows a block diagram of a hardware configuration of a display device 200 according to some embodiments.
  • the display device 200 includes at least one of a tuner and demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
  • the controller includes a processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, a first interface to an nth interface for input/output.
  • The display 260 includes a display screen component for presenting images and a drive component for driving image display, and is used to receive image signals output by the controller and to display video content, image content, menu control interfaces, and user control UI interfaces.
  • the display 260 can be a liquid crystal display, an OLED display, and a projection display, and can also be a projection device and a projection screen.
  • the communicator 220 is a component for communicating with external devices or servers according to various communication protocol types.
  • the communicator may include at least one of a Wifi module, a Bluetooth module, a wired Ethernet module and other network communication protocol chips or near field communication protocol chips, and an infrared receiver.
  • the display device 200 may establish transmission and reception of control signals and data signals with the external control device 100 or the server 400 through the communicator 220 .
  • the user interface can be used to receive control signals from the control device 100 (such as: an infrared remote controller, etc.).
  • the detector 230 is used to collect signals of the external environment or interaction with the outside.
  • the detector 230 includes a light receiver, a sensor for collecting ambient light intensity; or, the detector 230 includes an image collector, such as a camera, that can be used to collect external environmental scenes, user attributes or user interaction gestures, or, the detector 230 includes a sound collector, such as a microphone, for receiving external sound.
  • the external device interface 240 may include but not limited to the following: any one or more interfaces such as high-definition multimedia interface (HDMI), analog or data high-definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, etc. It may also be a composite input/output interface formed by the above-mentioned multiple interfaces.
  • The tuner-demodulator 210 receives broadcast television signals through wired or wireless reception, and demodulates audio/video signals as well as EPG data signals from multiple wireless or cable broadcast television signals.
  • the controller 250 and the tuner-demodulator 210 may be located in different split devices, that is, the tuner-demodulator 210 may also be located in an external device of the main device where the controller 250 is located, such as an external set-top box.
  • the controller 250 controls the work of the display device and responds to user operations through various software control programs stored in the memory.
  • the controller 250 controls the overall operations of the display device 200 . For example, in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
  • the controller includes at least one of a central processing unit (Central Processing Unit, CPU), a video processor, an audio processor, a graphics processing unit (Graphics Processing Unit, GPU), a RAM (Random Access Memory, RAM), a ROM (Read-Only Memory, ROM), a first interface to an nth interface for input/output, a communication bus (Bus), and the like.
  • the user can input a user command through a graphical user interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the graphical user interface (GUI).
  • the user may input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through a sensor to receive the user input command.
  • User interface is a medium interface for interaction and information exchange between application programs or operating systems and users, which realizes the conversion between the internal form of information and the form acceptable to users.
  • the commonly used form of user interface is the graphical user interface (Graphic User Interface, GUI), which refers to the user interface related to computer operation displayed in a graphical way. It may be an icon, window, control and other interface elements displayed on the display screen of the electronic device, where the control may include icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, Widgets and other visual interface elements.
  • FIG. 4 shows a software configuration diagram in the display device 200 according to some embodiments.
  • The system is divided into four layers, which from top to bottom are the applications (Applications) layer (abbreviated "application layer"), the application framework (Application Framework) layer (abbreviated "framework layer"), the Android runtime and system library layer (abbreviated "system runtime layer"), and the kernel layer.
  • there is at least one application program running in the application program layer and these application programs may be Window programs, system setting programs, or clock programs provided by the operating system; they may also be applications developed by third-party developers.
  • the application program packages in the application program layer are not limited to the above examples.
  • the framework layer provides application programming interface (application programming interface, API) and programming framework for the application.
  • the application framework layer includes some predefined functions.
  • The application framework layer is equivalent to a processing center, which decides the actions to be taken by the applications in the application layer.
  • Through the API interface, an application program can access the resources in the system and obtain system services during execution.
  • The application framework layer in some embodiments of the present application includes managers (Managers), content providers (Content Provider), a network management system, and the like, where the managers include at least one of the following modules: an Activity Manager, used to interact with all activities running in the system; a Location Manager, used to provide system services or applications with access to the system location service; a Notification Manager, used to control the display and clearing of notification messages; and a Window Manager, used to manage icons, windows, toolbars, wallpapers, and desktop components on the user interface.
  • different application programs can be configured in the display device, so that the display device provides different functions to enrich user experience.
  • the live class program can be configured in the display device to use the display device to realize application scenarios such as online live class, remote conference, real-time game live broadcast, and electronic whiteboard sharing.
  • The display device configured with the live class program can be used by any teaching institution, including schools, tutoring institutions, and so on. When classes are broadcast live on the display device, it provides users with a large-screen experience and can display more of the teaching content and operating functions required during the live broadcast.
  • the live class program in addition to being applied on the display device side, can also be configured on the mobile phone, Pad and PC clients.
  • the display device includes a display and a controller, the display is used to present a live teaching interface, and the controller is used to implement live teaching according to operations related to the live broadcast.
  • the live class program includes a first program and a second program.
  • the first program is the live class program configured in the teacher’s display device.
  • the second program is the live class program configured in the student’s display device.
  • the second program is started, the second live interface is displayed on the student’s display.
  • In an online live class, it is necessary to simulate a real classroom scene: students need to see both the teacher's picture and the content the teacher writes on the blackboard.
  • A current live class usually uses a single camera to capture the teacher and the blackboard at the same time. If other pictures need to be shown during the live broadcast, or the view needs to switch between the teacher picture and the blackboard picture, a dedicated staff member has to manually move the camera, which is inflexible and affects the progress and effect of the live class.
  • the embodiment of the present application provides a display device, which is equipped with two cameras, the first camera captures the teacher's picture, and the second camera captures the blackboard picture, and the teacher's picture and the blackboard picture are respectively displayed in the corresponding playback windows.
  • a switching entry is configured in the teacher's display device so that the teacher can dynamically switch the images in the two playback windows.
  • the user can customize the shooting parameters of the two cameras, or dynamically modify the shooting parameters during the live broadcast to save network resources.
  • both cameras may be built-in cameras of the display device; one of them may be a built-in camera and the other may be an external camera; or both cameras may be external cameras.
  • the first camera is used to collect the first picture, and the first picture may be the teacher's picture, therefore, the first camera faces the teacher's face.
  • the second camera is used to collect the second picture, which can be the blackboard picture, that is, the picture of the teacher manually writing on the blackboard. Therefore, the second camera faces the blackboard or desk.
  • Fig. 5 shows a flow chart of a teacher-end display device performing a dual-camera live broadcast solution according to some embodiments
  • Fig. 6 shows a sequence diagram of a dual-camera live broadcast solution according to some embodiments.
  • the display device at the teacher's end includes: a display configured to display a user interface; a first camera configured to capture a first picture; a second camera configured to capture a second picture; a controller connected to the display, the first camera, and the second camera respectively, and when executing the dual-camera live broadcast solution shown in FIGS. 5 and 6 , the controller is configured to perform the following steps:
  • the first live broadcast interface includes at least one of a class button, a supplementary stream start button, a teacher video window, and a play window in the handout area.
  • the first program (live class program) is configured in the teacher’s display device. Therefore, after starting the teacher’s display device, the first program icon will be displayed on the device home page. The user triggers the first program icon manually or by voice, and the homepage of the live class is displayed on the display.
  • the home page of the live class shows at least one live class control, and each live class control is used to create a corresponding live room for online live classes of teachers and students.
  • Fig. 7 shows a schematic diagram of the display effect of the main interface of the homepage of the live lesson in the display device 200 according to some embodiments.
  • After the teacher, as the lecturer, logs in to the live class program, the created courses are displayed on the main interface, and a new live class can also be created by clicking the "create live class" control in the lower right corner.
  • the controller displays the first live interface in response to triggering the activation instruction generated by the target live lesson control in the user interface, and is further configured to:
  • Step 111: In response to the start instruction generated by triggering the target live class control in the user interface, obtain from the server the audio and video SDK identifier and the message SDK identifier corresponding to the target live class. The audio and video SDK identifier identifies the audio and video SDK used to send and receive audio and video in the live broadcast room, and the message SDK identifier identifies the message SDK used to send and receive chat messages in the live broadcast room.
  • Step 112: Create a target live broadcast room corresponding to the target live class based on the audio and video SDK identifier and the message SDK identifier, and display the first live broadcast interface for presenting the target live broadcast room.
  • In a live class, both the teacher and the students need to enter the same live broadcast room, and audio, video, and message streams need to be transmitted between them. Therefore, when the teacher enters the target live room by triggering the target live lesson control displayed on the teacher's display device, the request carries the target live lesson ID, and the corresponding audio and video SDK ID and message SDK ID are first obtained from the server (SDK: Software Development Kit).
  • the audio and video SDK ID is the audio and video SDK room number
  • the message SDK ID is the message group ID.
  • The SDK method is then called, carrying the audio and video SDK ID, and sent to the audio and video server to create the corresponding target live room.
  • the teacher's display device sends messages to each student's display device participating in the class through the group, and the student sends a message to another student or teacher through the group.
  • the messages include but are not limited to interactive messages between teachers and students, and interactive messages between students.
  • the teacher-side display device sends signaling to each student-side display device participating in the class, the signaling includes but not limited to class start signaling, get out of class end signaling, main video stream and auxiliary video stream, etc.
  • When the messages and signaling are transmitted between the teacher and the students, they carry the audio and video SDK ID and the message SDK ID corresponding to the target live class.
  • the message/signal transmission between teachers and students, and between students and students is realized based on the audio and video SDK identification and message SDK identification corresponding to the target live class, so that the messages/signals generated by the teacher can be sent to each student terminal in the same live room to ensure that teachers and students are in the same live room for live classes.
  • the student terminal display device also joins the target live broadcast room through the methods of step 111 and step 112 .
  • the teacher's terminal and each student's terminal are located in the same live broadcast room, so live classes can be conducted.
  • the display device at the teacher end generates and displays the first live interface for presenting the target live room
  • the display device at the student end generates and displays the second live interface for presenting the target live room.
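A minimal Kotlin sketch of steps 111 and 112, showing how a teacher-side client might fetch the two SDK identifiers and create the target live room. The interfaces LessonServer, AvSdk, and MessageSdk and all field names are assumptions for illustration; the disclosure does not name a concrete SDK API.

```kotlin
// Hypothetical interfaces; the disclosure refers only to "the server", an
// "audio and video SDK" and a "message SDK" without naming concrete APIs.
data class LessonIds(val avRoomId: String, val messageGroupId: String)

interface LessonServer {
    // Step 111: look up the two SDK identifiers for the target live class.
    fun fetchLessonIds(liveLessonId: String): LessonIds
}

interface AvSdk {
    // Step 112: create the audio-video room identified by avRoomId.
    fun createRoom(avRoomId: String)
}

interface MessageSdk {
    // Join the chat group so messages and signaling reach this device.
    fun joinGroup(messageGroupId: String)
}

class TeacherLiveController(
    private val server: LessonServer,
    private val avSdk: AvSdk,
    private val messageSdk: MessageSdk,
) {
    fun enterTargetLiveRoom(liveLessonId: String): LessonIds {
        val ids = server.fetchLessonIds(liveLessonId)  // Step 111
        avSdk.createRoom(ids.avRoomId)                 // Step 112: audio-video room
        messageSdk.joinGroup(ids.messageGroupId)       // message group for chat/signaling
        // The caller would now render the first live interface for this room.
        return ids
    }
}
```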
  • Fig. 8 shows a schematic diagram of a first live broadcast interface displayed by a teacher-end display device according to some embodiments.
  • the first live interface includes at least one of a control bar, a message dialog box, a teacher video window, and a playback window in a handout area.
  • the control bar is displayed on one side of the first live broadcast interface, and the control bar includes a class button, an auxiliary stream start button, courseware controls, shared screen controls, student controls, and the like.
  • the class button is used to start the live class; the auxiliary stream start button is used to start the auxiliary stream camera for live class; the courseware control is used to trigger the display of courseware/handouts, including text content or video content; the sharing screen control is used to share the screen display content with the student terminal; the student control is used to view all student information participating in the live class.
  • the message dialog is used to send messages, the teacher's video window is used to display the teacher's picture in the initial state, and the playback window in the handout area is used to display the handout content in the initial state.
  • both the teacher’s video window and the play window in the handout area display blank content; if the teacher clicks on the courseware control before the class starts, the courseware content will be displayed in the play window in the handout area.
  • If the teacher clicks the class button and the courseware content was called out before clicking the class button, the courseware content is displayed in the play window of the lecture area, and the teacher's picture is displayed in the teacher video window.
  • If the teacher clicks the class button and the courseware content was not called out before clicking the class button, the whiteboard content is displayed in the play window of the lecture area, and the teacher's picture is displayed in the teacher video window.
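The two cases can be summarised by a small conditional; LectureContent and the function below are illustrative only.

```kotlin
// Illustrative model of what the lecture-area play window shows when the class
// button is clicked, depending on whether courseware was called out beforehand.
sealed class LectureContent {
    object Whiteboard : LectureContent()
    data class Courseware(val name: String) : LectureContent()
}

fun contentOnClassStart(coursewareName: String?): LectureContent =
    if (coursewareName != null) {
        LectureContent.Courseware(coursewareName)  // courseware called out before class button
    } else {
        LectureContent.Whiteboard                  // otherwise the whiteboard is shown
    }
```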
  • Fig. 9 shows a schematic diagram of a second live broadcast interface displayed by a student terminal display device according to some embodiments.
  • The number of operation buttons on the second live broadcast interface is smaller than on the first live broadcast interface.
  • the second live broadcast interface includes at least one of a message dialog box, a teacher's video window, and a play window in a handout area.
  • the display style and presentation content of the teacher video window and the play window in the handout area in the second live interface may correspond to those of the first live interface.
  • The teacher can then operate the live broadcast to conduct the class.
  • the teacher triggers the class button in the first live broadcast interface to generate a start live broadcast instruction, that is, the class start signal, and sends it to the display device of the student terminal through the message server, so that the student terminal enters the state of preparing for class.
  • After the teacher's end triggers the class button, it enters the state of preparing for class.
  • The first camera for collecting the teacher's picture is turned on, and the teacher's picture is displayed as the first picture in the teacher's video window of the first live broadcast interface. If the teacher did not call out the courseware content before clicking the class button, the whiteboard content is displayed in the play window of the lecture area of the first live interface, as shown in Fig. 8.
  • Before class starts, the student terminal displays the paused-state image of the live broadcast room (with the "not in class" overlay). After the first camera on the teacher's end is turned on, the student's end follows the teacher's end into the state of preparing for class.
  • When the student's end receives the class-start signal sent by the teacher's end, it hides the paused live broadcast status picture; the teacher's video window of the second live broadcast interface displays the teacher's picture, and the whiteboard content is displayed in the play window of the handout area of the second live broadcast interface.
  • the courseware control can be triggered to select the target lecture content and display it in the playback window of the lecture area.
  • the content of the handouts can be displayed before the teacher clicks the class button, or after the teacher clicks the class button.
  • Fig. 10 shows a schematic diagram of displaying lecture content in the play window of the lecture area in the first live broadcast interface according to some embodiments.
  • the teacher will trigger the courseware control to obtain the corresponding lecture content and display it in the playback window of the lecture area.
  • the display device at the teacher's end sends a signaling to the display device at the student's end through the audio and video server, so that the play window of the lecture area in the second live broadcast interface synchronously displays the lecture content.
  • a courseware preview area can also be displayed at the bottom of the playback window in the lecture area, and the courseware preview area is used to display the content of each page of the lecture content in the form of thumbnails.
  • the courseware preview area is only displayed in the first live broadcast interface on the teacher's end, not in the second live broadcast interface on the student end.
  • Before the lecture content is called out, the play window of the lecture area can occupy the maximum space of the first live interface without blocking other layout content in the first live interface. After the lecture content is called out on the teacher's end, in order to display the courseware preview area, the space occupied by the lecture area play window can be reduced in the vertical direction; that is, the lecture area play window is smaller when the courseware preview area is displayed than when it is not.
  • The teacher's picture (i.e., the first picture) is displayed in the teacher's video window, and the whiteboard content is displayed in the play window of the lecture area (when the courseware is not called out in advance), as shown in Fig. 8.
  • The teacher's picture (i.e., the first picture) is displayed in the teacher's video window, and the lecture content is displayed in the play window of the lecture area, as shown in Fig. 10.
  • In executing the operation of starting and acquiring the first picture captured by the first camera and displaying it in the teacher's video window in response to the start live broadcast instruction generated by triggering the class button, the controller is further configured to perform the following steps:
  • Step 121: In response to the start live broadcast instruction generated by triggering the class button, select the first camera as the main-stream camera for shooting the first picture, and set the main-stream shooting parameters of the main-stream camera.
  • Step 122: Start the main-stream camera, obtain the first picture captured by the main-stream camera based on the main-stream shooting parameters, and display the first picture as the main video stream in the teacher's video window.
  • When the teacher clicks the class button to start the live broadcast, the camera ID setting method is called to set the current default main-stream camera.
  • For example, the first camera can be obtained from the camera list as the main-stream camera, or the user can select one as the main-stream camera.
  • The main-stream shooting parameters include but are not limited to resolution, width and height dimensions, and so on.
  • The main-stream resolution can use the system default parameters; for example, the main-stream resolution is 480P.
  • the main stream is the first video stream pushed after the live broadcast starts. Therefore, the first picture captured by the first camera (main stream camera) is used as the main video stream.
  • the display device on the teacher's side pushes the captured first picture as the main video stream to the audio and video server.
  • the audio and video server sends the video signaling of the main stream connection to each student display device.
  • the audio and video server configures the main video stream ID for the main video stream, and then generates video signaling for mainstream connections based on the teacher ID, video stream ID, main video stream ID, main video stream level, available ID, audio and video SDK ID, and message SDK ID.
  • the teacher ID is used to identify the teacher
  • the video stream ID is used to identify the main video stream
  • the main video stream ID is used to indicate the type of the main video stream
  • the available ID is used to indicate that the main video stream can be transmitted to the student terminal.
  • the display device of the student side judges that the video stream is marked as the main video stream, and calls the audio and video subscription video stream method in the teacher's video component to subscribe to the video stream from the server.
  • the server sends the main video stream to the student display device and renders it in the subscription component, that is, the first picture is displayed as the main video stream in the teacher video window of the second live interface provided by the student display device.
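A minimal sketch of the main-stream flow just described (steps 121 and 122 plus the push to the audio and video server), assuming hypothetical Camera, AvPublisher, and TeacherVideoWindow types; the actual camera ID setting method and SDK calls are not named in the disclosure.

```kotlin
// Hypothetical camera and publishing wrappers, used only to illustrate steps
// 121 and 122 and the push of the main video stream to the audio-video server.
data class ShootingParams(val resolution: String = "480P", val mirrored: Boolean = false)

data class VideoStream(val cameraId: String, val params: ShootingParams)

interface Camera {
    val id: String
    fun start(params: ShootingParams): VideoStream
}

interface AvPublisher {
    // Pushes a stream to the audio-video server, tagged as main or auxiliary.
    fun publish(stream: VideoStream, isMainStream: Boolean)
}

interface TeacherVideoWindow {
    fun render(stream: VideoStream)
}

fun startMainStream(cameraList: List<Camera>, publisher: AvPublisher, window: TeacherVideoWindow) {
    // Step 121: take the first camera in the list as the main-stream camera and
    // use the system default main-stream shooting parameters (e.g. 480P).
    val mainCamera = cameraList.first()
    val mainParams = ShootingParams(resolution = "480P")

    // Step 122: start the camera, render the first picture in the teacher video
    // window, and push it to the server as the main video stream.
    val firstPicture = mainCamera.start(mainParams)
    window.render(firstPicture)
    publisher.publish(firstPicture, isMainStream = true)
}
```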
  • If the teacher wants to push a second video stream, it needs to be realized through dual cameras; the second video stream is the auxiliary stream.
  • the teacher triggers the auxiliary stream start button in the first live broadcast interface to generate a dual-camera live broadcast command to start the second camera to collect the second picture.
  • the second picture is displayed in the playback window of the handout area, and the second camera is the auxiliary stream camera.
  • The play window in the handout area may display either the whiteboard content or the handout content. When the whiteboard content is displayed in the play window of the handout area and the teacher triggers the auxiliary stream start button, the second camera can be started directly to capture the second picture. If the play window of the lecture area displays the lecture content, when the teacher triggers the auxiliary stream start button, the display of the lecture content is stopped first, and then the second camera is started to capture the second picture, so that the lecture content is replaced by the second picture in the lecture area play window.
  • the shooting parameters of the second camera may be different from those of the first camera.
  • the shooting parameters of the second camera are better than those of the first camera.
  • the setting of the shooting parameters of the second camera can be realized by popping up an auxiliary stream setting pop-up window in the first live broadcast interface.
  • the controller is further configured to perform the following steps when executing the dual-camera live broadcast instruction generated in response to triggering the auxiliary stream start button, starting and acquiring the second picture captured by the second camera, and displaying the second picture in the playback window of the handout area:
  • Step 131: In response to the dual-camera live broadcast instruction generated by triggering the auxiliary stream start button, display an auxiliary stream setting pop-up window, including a parameter selection box and an OK button, on the first live broadcast interface.
  • The parameter selection box is used to set the auxiliary-stream shooting parameters of the auxiliary-stream camera.
  • Step 132: Based on the parameter selection box, select the second camera as the auxiliary-stream camera for shooting the second picture, and after setting the auxiliary-stream shooting parameters of the auxiliary-stream camera, start the auxiliary-stream camera in response to the instruction generated by triggering the OK button.
  • Step 133: Obtain the second picture captured by the auxiliary-stream camera based on the auxiliary-stream shooting parameters, and display the second picture as the auxiliary video stream in the play window of the handout area.
  • When the auxiliary stream start button is triggered for the first time, the second camera can be turned on. At this time, the shooting parameters of the second camera need to be set first, so after the auxiliary stream start button is triggered, an auxiliary stream setting pop-up window pops up in the first live broadcast interface.
  • Fig. 11 shows a schematic diagram of displaying an auxiliary stream setting pop-up window in the first live broadcast interface according to some embodiments.
  • The auxiliary stream setting pop-up window includes a parameter selection box and an OK button.
  • The parameter selection box is used to set the auxiliary-stream shooting parameters of the auxiliary-stream camera, such as the camera, resolution, mirroring, and so on.
  • The teacher selects the second camera as the auxiliary-stream camera for shooting the second picture, and the auxiliary stream ID is configured for the auxiliary-stream camera.
  • For example, the second camera can be obtained from the camera list as the auxiliary-stream camera, or the user can select one as the auxiliary-stream camera.
  • Resolution can select high-resolution parameters, such as 480P, 720P, 1080P, etc.; mirror image can be selected to be on or off.
  • the auxiliary stream setting pop-up window may also include a camera image preview area for previewing images captured by the second camera.
  • After selecting the auxiliary-stream camera based on the parameter selection box and setting its auxiliary-stream shooting parameters, the teacher triggers the OK button in the auxiliary stream setting pop-up window; the camera shooting parameter setting method is called to complete the setting of the auxiliary-stream shooting parameters, and then the audio and video SDK method is called based on the auxiliary stream ID to start the auxiliary-stream camera.
  • the auxiliary stream camera captures the second picture based on the auxiliary stream shooting parameters, and then calls the rendering auxiliary stream method to display the second picture as the auxiliary video stream in the playback window of the handout area.
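A sketch of the auxiliary-stream start flow (steps 131 to 133), assuming hypothetical CameraService, LectureAreaWindow, and AuxStreamPublisher types; the resolution and mirroring parameters mirror the options in the pop-up, but the names are otherwise assumptions.

```kotlin
// Hypothetical types for the auxiliary-stream setting pop-up (steps 131 to 133).
data class AuxStreamSettings(
    val cameraId: String,
    val resolution: String,  // e.g. "480P", "720P" or "1080P"
    val mirrored: Boolean,
)

data class AuxPicture(val cameraId: String)

interface CameraService {
    fun setShootingParams(cameraId: String, resolution: String, mirrored: Boolean)
    fun start(cameraId: String): AuxPicture
}

interface LectureAreaWindow {
    fun render(picture: AuxPicture)
}

interface AuxStreamPublisher {
    fun publishAuxStream(picture: AuxPicture)
}

// Called when the teacher presses OK in the auxiliary-stream setting pop-up.
fun startAuxStream(
    settings: AuxStreamSettings,
    cameras: CameraService,
    window: LectureAreaWindow,
    publisher: AuxStreamPublisher,
) {
    // Step 132: apply the parameters chosen in the parameter selection box,
    // then start the second camera as the auxiliary-stream camera.
    cameras.setShootingParams(settings.cameraId, settings.resolution, settings.mirrored)
    val secondPicture = cameras.start(settings.cameraId)

    // Step 133: render the second picture in the lecture-area play window and
    // push it to the server as the auxiliary video stream.
    window.render(secondPicture)
    publisher.publishAuxStream(secondPicture)
}
```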
  • the images captured by the second camera and the content of the lecture are displayed in the same playback window of the lecture area. Therefore, at this time, there is a scene where the content displayed in the playback window of the lecture area is switched.
  • When the second camera is activated to switch the display content in the lecture area play window from the lecture content to the second picture, since the lecture area play window no longer displays the lecture content, the courseware preview area does not need to be displayed.
  • the occupied space of the lecture area playback window can be increased to maximize the space of the first live interface without blocking other layout content in the first live interface.
  • the size of the playback window in the handout area changes from small to large.
  • a tab area is displayed below the lecture area playback window in the first live interface, and the tab area displays the lecture tab, which is used to trigger the display of the lecture content.
  • the lecture content includes text lectures or video lectures, therefore, the lectures tab also includes a text lectures tab and a video lectures tab.
  • Fig. 12 shows a schematic diagram of displaying a second picture in the play window of the handout area according to some embodiments.
  • In Fig. 12, the second picture is a blackboard picture, in which the teacher is drawing a rabbit pattern, and the blackboard picture is displayed in the play window of the handout area.
  • The dual-camera live broadcast can be turned on or off by the teacher, and the picture from the second camera is not captured or displayed when the class button is initially triggered; therefore, the second picture captured by the second camera is displayed as the auxiliary video stream.
  • When the auxiliary stream start button is first triggered, an auxiliary stream setting pop-up window including a parameter selection box and an OK button is displayed on the first live broadcast interface to set the auxiliary-stream shooting parameters of the auxiliary-stream camera.
  • the teacher triggers the auxiliary stream start button during subsequent use, the second camera can be started directly according to the previously set parameters without displaying the auxiliary stream setting pop-up window. If the user needs to modify the auxiliary stream shooting parameters of the second camera, he can reset the auxiliary stream setting pop-up window by triggering the setting button in the first live broadcast interface.
  • the display device on the teacher's end pushes the captured second picture to the audio and video server as a secondary video stream.
  • After the audio and video server receives the auxiliary video stream sent by the teacher's end, it sends video signaling for the auxiliary stream connection to each student display device.
  • When the audio and video server generates the video signaling for the auxiliary stream connection, it configures the auxiliary video stream identifier for the auxiliary video stream, and then generates the video signaling based on the teacher ID, the video stream ID, the auxiliary video stream identifier, the auxiliary video stream level, the available identifier, the audio and video SDK identifier, and the message SDK identifier.
  • the video stream ID is used to represent the secondary video stream
  • the secondary video stream identifier is used to represent the type of the secondary video stream
  • the available identifier is used to represent that the secondary video stream can be transmitted to the student terminal.
  • the display device at the student terminal judges that the video stream identifier is the auxiliary video stream identifier, and calls the audio and video subscription video stream method in the auxiliary stream video component in the teacher's handout area to subscribe to the video stream from the server.
  • the server sends the secondary video stream to the display device at the student end, and renders it in the subscription component, that is, displays the second picture as the secondary video stream in the playback window of the lecture area of the second live interface provided by the display device at the student end.
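The stream-connection signaling and the student-side reaction to it could look roughly like the following; the StreamSignaling fields and the StreamSubscriber interface are assumptions based on the fields listed above.

```kotlin
// Assumed shape of the stream-connection signaling described above, and a sketch
// of how a student-side client might react to it.
data class StreamSignaling(
    val teacherId: String,
    val videoStreamId: String,
    val isAuxStream: Boolean,  // main vs. auxiliary video stream identifier
    val available: Boolean,    // whether the stream may currently be pulled
    val avRoomId: String,
    val messageGroupId: String,
)

interface StreamSubscriber {
    fun subscribe(videoStreamId: String, renderTarget: String)
    fun unsubscribe(videoStreamId: String)
}

fun onStreamSignaling(signal: StreamSignaling, subscriber: StreamSubscriber) {
    // Auxiliary streams render in the lecture-area play window; the main stream
    // renders in the teacher video window of the second live interface.
    val target = if (signal.isAuxStream) "lectureAreaPlayWindow" else "teacherVideoWindow"
    if (signal.available) {
        subscriber.subscribe(signal.videoStreamId, target)
    } else {
        // Unavailable flag: drop the subscription (used later when the auxiliary
        // stream is stopped on the teacher's end).
        subscriber.unsubscribe(signal.videoStreamId)
    }
}
```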
  • the main video stream is still in the delivery state, that is, the student display device is still displaying the main video stream.
  • the dual-camera function when the dual-camera function is enabled on the teacher's end, there is a primary video stream and an auxiliary video stream at the same time, and the resolutions of the primary video stream and the auxiliary video stream may be different.
  • The resolutions of the main and auxiliary video streams may differ because they are manually set by the teacher, because a video stream uploaded by the teacher to the server is automatically downgraded for network reasons (the main/auxiliary video stream level), or because they are automatically set by the live room corresponding to the target live class.
  • the main video stream carries the identification of the main video stream
  • the auxiliary video stream carries the identification of the auxiliary video stream, so as to distinguish them.
  • the teacher's video window displays corresponding content based on the main video stream ID
  • the playback window in the lecture area displays corresponding content based on the secondary video stream ID.
  • the lecture area play window displays the lecture content or the blackboard image. Therefore, in order to ensure that students and teachers can clearly watch the teaching content, the size of the playback window in the lecture area can be set to be larger than the teacher’s video window, so as to present the live broadcast effect of large and small screens.
  • the two cameras can be turned on to collect different teaching images, one camera collects the teacher’s image, and the other camera collects the blackboard image.
  • the teacher screen is displayed in the teacher video window (teacher and student side) of the live interface, and the blackboard screen is displayed in the playback window of the handout area of the live interface (teacher side and student side).
  • The blackboard picture and the handout content share one play window to improve the display effect.
  • different shooting parameters can be set for different cameras in order to save network resources.
  • An auxiliary stream tab is added to the tab area of the first live broadcast interface. When the auxiliary stream tab is selected, the second picture captured by the second camera is displayed in the play window of the handout area.
  • When the first live broadcast interface is generated and displayed, the auxiliary stream tab can be displayed synchronously in the tab area.
  • If dual-camera live broadcast is not enabled on the teacher's end and the teacher clicks the auxiliary stream tab, the play window of the lecture area switches to a blank screen; after dual-camera live broadcast is enabled on the teacher's end, if the teacher clicks the auxiliary stream tab, the play window of the lecture area switches to display the second picture captured by the second camera.
  • the controller is further configured to: respond to the operation of switching the secondary stream tab to the handout tab in the tab area, turn off the second camera, stop sending the auxiliary video stream to the student display device through the server, and switch and display the handout content in the play window of the handout area.
  • the teacher can switch the content displayed in the playback window of the handout area by switching buttons. Then, when the teacher switches the auxiliary stream tab to the handout tab based on the tab area, that is, when the handout tab is selected, the blackboard screen does not need to continue to be presented, and the second camera can be turned off at this time.
  • The auxiliary stream disconnect signaling includes the teacher ID, the video stream ID, the auxiliary video stream identifier, the unavailable identifier, the audio and video SDK identifier, the message SDK identifier, and so on.
  • the secondary video stream identifier is used to represent the type of the secondary video stream, and the unavailable flag is used to represent that the secondary video stream cannot be transmitted to the student terminal.
  • After receiving the video signaling sent by the server, the display device on the student side determines that the video stream identifier is the auxiliary video stream identifier, removes the subscription to the auxiliary video stream based on the unavailable flag and the auxiliary video stream identifier, and removes the corresponding auxiliary stream component; that is, the picture from the second camera is closed, and the original lecture content is displayed in the play window of the lecture area.
  • the second camera is turned off, and the auxiliary stream tab in the tab area is canceled.
  • In other embodiments, when the tab is switched, the second camera can remain on; only the sending of the auxiliary video stream to the student display device through the server is stopped, and the lecture content is switched into the play window of the lecture area. At this time, the auxiliary stream tab in the tab area is still displayed. If the teacher still needs to display the blackboard picture captured by the second camera, the auxiliary stream tab can be selected, and the blackboard picture currently captured by the second camera is then displayed in the play window of the lecture area on both the teacher's end and the student's end.
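A rough sketch of the tab-switching behaviour, covering both variants described above (turning the second camera off, or only stopping the push); all type names are placeholders.

```kotlin
// Placeholder types for the tab-switching behaviour; depending on the embodiment,
// selecting the handout tab either turns the second camera off or only stops the push.
enum class LectureTab { HANDOUT, AUX_STREAM }

interface AuxStreamSession {
    fun stopPublishing()    // stop sending the auxiliary video stream via the server
    fun resumePublishing()  // push the currently captured second picture again
    fun turnOffCamera()     // optionally also release the second camera
}

interface LectureWindow {
    fun showHandout()
    fun showAuxPicture()
}

fun onTabSelected(
    tab: LectureTab,
    aux: AuxStreamSession,
    window: LectureWindow,
    alsoTurnOffCamera: Boolean,
) {
    when (tab) {
        LectureTab.HANDOUT -> {
            aux.stopPublishing()
            if (alsoTurnOffCamera) aux.turnOffCamera()
            window.showHandout()
        }
        LectureTab.AUX_STREAM -> {
            aux.resumePublishing()
            window.showAuxPicture()
        }
    }
}
```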
  • the teacher can modify the shooting parameters of the auxiliary stream of the second camera during the live broadcast of the class.
  • For example, the second camera used as the auxiliary-stream camera fails and cannot be used, and a third camera needs to replace it as the auxiliary-stream camera to continue capturing blackboard pictures. The need to modify parameters can also occur in the scenario of changing the mirroring parameter, the scenario of reducing the resolution when the video freezes, and the scenario of increasing the resolution when video transmission is smooth.
  • a setting button is added in the tab area of the first live broadcast interface.
  • When the setting button is selected, the controller is further configured to perform the following steps:
  • Step 141: In response to the instruction generated by triggering the setting button in the first live broadcast interface, pop up the auxiliary stream setting pop-up window in the first live broadcast interface, and stop sending the auxiliary video stream to the student display device.
  • Step 142: When the modification of the auxiliary-stream shooting parameters is completed based on the auxiliary stream setting pop-up window, cancel the display of the auxiliary stream setting pop-up window, and continue to send the auxiliary video stream corresponding to the modified auxiliary-stream shooting parameters to the student display device through the server.
  • The setting button can be displayed on the first live interface at the same time as the teacher clicks the auxiliary stream start button.
  • the teacher triggers the setting button, and pops up the auxiliary stream setting pop-up window in the first live broadcast interface again.
  • The second picture, such as the rabbit pattern hand-drawn by the teacher, is displayed in the play window of the lecture area and in the camera picture preview box of the setting pop-up window.
  • the push of the auxiliary video stream to the student terminal can be stopped first, and the playback window of the lecture area of the student terminal displays the content of the whiteboard.
  • the implementation method please refer to the foregoing content, and details will not be described here.
  • the push of the auxiliary video stream to the student end may not be stopped, but the auxiliary video stream is sent to the student end through the server according to the previous parameters.
  • the playback window of the lecture area of the student end still displays the auxiliary video stream corresponding to the shooting parameters of the original auxiliary stream.
  • the teacher re-modifies the shooting parameters of the auxiliary video stream, based on the modified shooting parameters of the new auxiliary stream, the collected auxiliary video stream is sent to the student end through the server for display, and the playback window in the lecture area of the student end continues to display the auxiliary video stream corresponding to the shooting parameters of the new auxiliary stream.
  • The auxiliary-stream camera continues to capture the second picture based on the modified auxiliary-stream shooting parameters, and the newly collected auxiliary video stream continues to be sent to the student display device through the server. In this way, the video fluency of the student terminal can be ensured, for example, by manually reducing the resolution of the current push stream. For the pushing process of the auxiliary video stream, reference may be made to the foregoing content, and details are not repeated here.
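A sketch of steps 141 and 142, assuming a hypothetical AuxStreamPipeline; whether the push is paused while the pop-up is open is an embodiment choice, as noted above.

```kotlin
// Hypothetical pipeline used to illustrate steps 141 and 142.
data class AuxParams(val cameraId: String, val resolution: String, val mirrored: Boolean)

interface AuxStreamPipeline {
    fun pausePush()                     // stop sending the auxiliary video stream
    fun applyParams(params: AuxParams)  // reconfigure the auxiliary-stream camera
    fun resumePush()                    // continue pushing with the current parameters
}

class AuxSettingsFlow(
    private val pipeline: AuxStreamPipeline,
    private val pauseWhileEditing: Boolean,
) {
    // Step 141: the setting button reopens the pop-up; one variant also pauses the push.
    fun onSettingButtonClicked() {
        if (pauseWhileEditing) pipeline.pausePush()
    }

    // Step 142: the pop-up is confirmed with modified shooting parameters, for
    // example dropping to 480P when the student-side video freezes.
    fun onSettingsConfirmed(newParams: AuxParams) {
        pipeline.applyParams(newParams)
        pipeline.resumePush()
    }
}
```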
  • the shooting parameters of the first camera and the second camera are different, so that the resolutions of the main video stream and the auxiliary video stream are different, and thus the display resolutions of the teacher's video window and the playback window of the handout area are different. Since the playback window in the lecture area is a large display window and the teacher’s video window is a small display window, the shooting parameters of the second camera are better than those of the first camera, and the display resolution of the playback window in the lecture area is better than that of the teacher’s video window.
  • the two playback windows use different resolutions, which can reduce the network resources occupied by the main video stream.
  • the shooting parameters of the first camera and the second camera are the same, so that the resolutions of the primary video stream and the secondary video stream are the same, and thus the display resolutions of the teacher's video window and the playback window of the handout area are the same.
  • the first camera collects the main video stream data and displays it in the teacher's video window
  • the second camera collects the secondary video stream data and displays it in the playback window of the handout area. Then based on the actual classroom layout, the teacher can also arrange the blackboard behind the teacher's position, that is, the first camera can simultaneously capture the blackboard picture while capturing the teacher's picture.
  • the teacher arranges a manual writing panel on the desk, and arranges a blackboard behind the teacher.
  • the first camera faces the teacher
  • the second camera faces the desk.
  • the first camera captures the teacher's picture
  • the second camera captures the writing panel on the desk. That is, the teacher's video window displays the teacher's picture, and the playback window in the handout area displays the writing content in the writing panel.
  • the blackboard picture captured by the first camera is displayed on the teacher's video window.
  • since the size of the teacher's video window is smaller than that of the playback window in the handout area, in order to help students clearly see the teacher's blackboard writing, the blackboard picture displayed in the teacher's video window can be switched to the playback window of the handout area for display, while the teacher's video window displays the writing panel content collected by the second camera.
  • the first picture captured by the first camera can also be switched and displayed in the playback window of the larger lecture area.
  • a switch button may be added in the tab area of the first live broadcast interface.
  • a switching button is simultaneously displayed on the first live broadcast interface.
  • the switching button is used to realize the switching display of the images captured by the first camera and the second camera, that is, the switching display of the main and auxiliary video streams.
  • the controller is further configured to perform the following steps:
  • Step 151 When the teacher's video window in the first live interface displays the first picture and the lecture area play window displays the second picture, the switch button is triggered to display the first picture as the secondary video stream in the lecture area play window, and the second picture is displayed as the main video stream in the teacher's video window.
  • Step 152 Send a switching instruction to the server, the switching instruction is used to instruct the server to change the video stream ID of the first picture to the secondary video stream ID, and change the video stream ID of the second picture to the main video stream ID.
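  • The switch operation of Steps 151 and 152 can be pictured with the following TypeScript sketch; the message shape, the stream identifier values, and the `SignalingChannel` interface are assumptions for illustration rather than the actual signaling protocol of the application.

```typescript
// Hypothetical teacher-side handling of the switch button (Steps 151-152).
type StreamId = "main" | "aux";

interface SwitchInstruction {
  roomId: string;
  firstPictureStreamId: StreamId;  // identifier of the first picture after the switch
  secondPictureStreamId: StreamId; // identifier of the second picture after the switch
}

interface SignalingChannel {
  send(message: SwitchInstruction): Promise<void>;
}

async function onSwitchButtonTriggered(channel: SignalingChannel, roomId: string): Promise<void> {
  // Locally the teacher end swaps the two windows: the first picture moves to the
  // lecture area playback window and the second picture moves to the teacher video window.
  // The server is then told to swap the stream identifiers in the same way.
  await channel.send({
    roomId,
    firstPictureStreamId: "aux",
    secondPictureStreamId: "main",
  });
}
```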
  • the teacher can trigger the switch button, call the main and auxiliary stream switching method, and send a switching instruction to the server.
  • Fig. 13 shows a schematic diagram of displaying switched primary and secondary video streams in the first live broadcast interface according to some embodiments.
  • the teacher video window of the first live broadcast interface displays the second picture (such as a hand-drawn rabbit pattern) based on the main video stream identifier
  • the playback window in the lecture area displays the first picture (such as the teacher picture) based on the auxiliary video stream identifier, thereby completing the switching display of the main and auxiliary video streams.
  • the server responds to the switching instruction to switch the video stream identifiers of the main video stream and the auxiliary video stream, that is, change the main video stream identifier of the first picture to the auxiliary video stream identifier, and change the auxiliary video stream identifier of the second picture to the main video stream identifier.
  • the switching instruction sent by the teacher's display device to the server and the switched primary and secondary video stream data uploaded by the teacher end can be sent asynchronously, to allow time for the change of the primary and secondary video stream identifiers to complete.
  • when the server switches the video stream identifiers, it synchronously generates a switching notification and sends it to the student-side display device.
  • the remote student terminal will first receive the callback that the main and auxiliary streams are disconnected, that is, stop receiving the target video stream sent by the server, and display the whiteboard content in the teacher's video window and the playback window of the handout area.
  • the content of the whiteboard is blank and will not be displayed.
  • When the teacher end generates a switching instruction, the teacher end cancels the content display of the teacher's video window and the handout area playback window in the first live interface and switches to displaying blank content; at the same time, the server sends a switching notification to the student end, and the student's display device responds to the switching notification by likewise cancelling the content display of the teacher's video window and the handout area playback window in the second live interface and switching to displaying blank content.
  • After the server completes the switching of the video stream identifiers, it continues to send the main video stream and the auxiliary video stream to the student end; that is, the student end receives the main and auxiliary stream connection callback after the switch, and calls the subscription method again to render the switched main and auxiliary streams in the teacher's video window and the handout area playback window.
  • the teacher video window in the second live interface displays the second picture based on the main video stream identifier
  • the playback window in the lecture area displays the first picture based on the auxiliary video stream identifier, thereby completing the switching display of the main and auxiliary video streams.
  • After the teacher triggers the switching button, the student display device is controlled to first display the whiteboard content and then display the switched main and auxiliary video streams. However, since the switching operation is instantaneous, in terms of visual effect the student display device shows the switched playback window content after a brief "flash".
  • in order to reduce this "flickering" visual effect, the server can reduce the time consumed when changing the main and auxiliary video stream identifiers, and the switching instruction sent by the teacher's display device to the server and the switched main and auxiliary video stream data uploaded by the teacher end can be sent as synchronously as possible.
  • the teacher side triggers the window switching display
  • the teacher video window and the lecture area playback window on the teacher side and the student side no longer display blank whiteboard content, but instead display the handout content.
  • the content of the handout is the page shown before the teacher activated the dual camera function.
  • the teacher video window displays the first picture
  • the playback window of the handout area shows the handout content.
  • the lecture content is displayed in the teacher video window, and the first picture is displayed in the lecture area playback window.
  • the two cameras can be turned on to collect different teaching images, one camera collects the teacher’s image, and the other camera collects the blackboard image.
  • the student side can watch the teacher's live lecture based on the two playback windows at the same time, and the teacher side can flexibly and dynamically switch the images captured by the two cameras between the two playback windows. The operation is simple and convenient, and the user's class experience is improved.
  • Fig. 14 shows a flow chart of a method for performing a dual-camera live broadcast on a display device at a student terminal according to some embodiments.
  • a display device provided by an embodiment of the present application is applied to a display device at a student terminal, and includes: a display configured to display a user interface; a controller connected to the display.
  • the controller is configured to perform the following steps:
  • the target video stream refers to at least one of the first picture captured by the first camera and/or the second picture captured by the second camera acquired by the teacher-side display device after starting the live broadcast.
  • the video stream identifier indicates that the target video stream is the main video stream, display the main video stream in the teacher video window.
  • the video stream identifier indicates that the target video stream includes the main video stream and the auxiliary video stream, display the main video stream in the teacher's video window, and display the auxiliary video stream in the playback window of the handout area.
  • the second program (live class program) is configured in the display device of the student terminal. Therefore, after starting the display device of the student terminal, the icon of the second program is displayed on the home page of the device. The user triggers the second program icon manually or by voice to generate the homepage of the live class and display it on the display.
  • each live class control is used to enter a corresponding live room for online live classes of teachers and students.
  • a start instruction is generated to realize online class based on the target live class, and the second live interface is switched and displayed on the monitor.
  • the process of displaying the second live broadcast interface on the student terminal can refer to the process of displaying the first live broadcast interface on the teacher terminal.
  • the student terminal obtains the audio and video SDK ID and message SDK ID corresponding to the target live broadcast class from the server, and enters the target live broadcast room corresponding to the same target live broadcast class with the teacher terminal, so as to display the second live broadcast interface for presenting the target live broadcast room on the student terminal.
  • the teacher's display device sends the target video stream to the student's display device through the server.
  • the target video stream is the first picture captured by the first camera, which is the main video stream, that is, only the main video stream exists.
  • or, the target video stream is the first picture captured by the first camera and the second picture captured by the second camera, where the second picture is the secondary video stream, that is, the primary video stream and the secondary video stream exist simultaneously.
  • the teacher-side display device to send the target video stream to the student-side display device through the server reference may be made to the execution process of the teacher-side display device provided in the foregoing embodiments, which will not be repeated here.
  • the video stream identifier needs to be obtained from the target video stream. If the video stream identifier indicates that the target video stream is the main video stream, the main video stream is displayed in the teacher video window. If the video stream identifier indicates that the target video stream includes a main video stream and an auxiliary video stream, the main video stream is displayed in the teacher video window, and the auxiliary video stream is displayed in the playback window of the handout area.
  • the teacher only turns on the first camera but not the second camera, and the target video stream is the first picture as the main video stream. Therefore, the first picture is displayed in the teacher's video window, and the content of the whiteboard is displayed in the playback window of the handout area.
  • the target video stream includes the first picture as the main video stream and the second picture as the auxiliary video stream. Therefore, the first picture as the main video stream is displayed in the teacher video window, and the second picture as the secondary video stream is displayed in the playback window of the handout area.
  • the teacher triggers the lecture content to be displayed in the playback window of the lecture area before the dual-camera live broadcast is enabled on the teacher’s end, then after the dual-camera live broadcast is enabled, the student end needs to stop displaying the lecture content first, and then display the second screen as the secondary video stream.
  • the controller is further configured to:
  • Step 251 If the teacher’s video window displays the first picture as the main video stream, and the play window in the handout area displays the second picture as the secondary video stream, then when the teacher’s display device switches and displays the images captured by the camera, it receives a switch notification sent by the server.
  • Step 252 In response to the switching notification, stop receiving the target video stream sent by the server, and display the whiteboard content in both the teacher's video window and the playback window in the handout area.
  • Step 253 When the switching display of the pictures captured by the cameras is completed, continue to receive the first picture carrying the secondary video stream identifier and the second picture carrying the primary video stream identifier sent by the server.
  • Step 254 Display the first picture as the secondary video stream in the playback window of the lecture area based on the secondary video stream identifier, and display the second picture as the primary video stream in the teacher video window based on the primary video stream identifier.
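  • A minimal sketch of the student-side handling of Steps 251 to 254 is given below in TypeScript; the `StreamSubscriber` interface and the window names are illustrative assumptions, not the actual components of the application.

```typescript
// Hypothetical student-side reaction to the switching notification (Steps 251-254).
interface StreamSubscriber {
  unsubscribeAll(): Promise<void>;
  subscribe(streamId: "main" | "aux", window: "teacherVideo" | "lecturePlayback"): Promise<void>;
}

async function onSwitchNotification(
  subscriber: StreamSubscriber,
  showWhiteboard: () => void,
): Promise<void> {
  // Stop receiving the current target video stream and show whiteboard content in
  // both windows while the server swaps the stream identifiers.
  await subscriber.unsubscribeAll();
  showWhiteboard();
  // Re-subscribe after the switch: the main stream (now the second picture) is rendered
  // in the teacher video window, the auxiliary stream (now the first picture) in the
  // lecture area playback window.
  await subscriber.subscribe("main", "teacherVideo");
  await subscriber.subscribe("aux", "lecturePlayback");
}
```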
  • the teacher side display device sends a switching instruction to the server to instruct to change the video stream ID of the first picture to the auxiliary video stream ID, and change the video stream ID of the second picture to the main video stream ID.
  • the server generates a switching notification and sends it to the student display device.
  • the display device on the student side receives and responds to the switching notification, and first stops the screen display in the two playback windows, that is, the whiteboard content is displayed in both the teacher's video window and the playback window in the handout area.
  • Fig. 15 shows a schematic diagram of displaying switched primary and secondary video streams in the second live broadcast interface according to some embodiments.
  • the server completes the switching of the primary and secondary video stream identifiers, it continues to receive the first picture carrying the secondary video stream identifier and the second picture carrying the primary video stream identifier sent by the server, so as to display the first picture as the secondary video stream in the playback window of the lecture area based on the secondary video stream identifier, and display the second picture as the primary video stream in the teacher's video window based on the primary video stream identifier.
  • when the first live interface of the teacher-end display device only displays the teacher's picture in the teacher's video window, the second live interface of the student-side display device also only displays the teacher's picture in the teacher's video window.
  • when the first live interface of the display device at the teacher's end displays the teacher's picture in the teacher's video window and the blackboard picture in the play window of the handout area, the second live interface of the display device at the student's end also displays the teacher's picture in the teacher's video window and the blackboard picture in the play window of the handout area.
  • the display device on the teacher's end switches between large and small screens by switching screens
  • the display device on the student's end also simultaneously switches between large and small screens.
  • after the teacher triggers the switch button, the display device on the student end is controlled to first display the whiteboard content and then display the switched main and auxiliary video streams; however, because the switching operation is instantaneous, in terms of visual effect the display device on the student end shows the switched playback window content after a brief "flash".
  • Fig. 16 shows a flow chart of a server performing a dual-camera live broadcast method according to some embodiments.
  • a server provided by an embodiment of the present application is applied to an audio and video server, and includes: a controller configured to perform the following steps when executing the dual-camera live broadcast method shown in FIG. 16 :
  • the target video stream refers to at least one of the first picture captured by the first camera and/or the second picture captured by the second camera acquired by the teacher-side display device after starting the live broadcast.
  • the target video stream is the first picture captured by the first camera, configure the main video stream identifier for the first picture, and send the first picture carrying the main video stream identifier as the main video stream to the student display device for display.
  • the target video stream is the second picture captured by the second camera, configure the secondary video stream identifier for the second picture, and send the second picture carrying the secondary video stream identifier as the secondary video stream to the student display device for display.
  • the server is used to realize the video stream transmission and message transmission between the teacher-side display device and the student-side display device. Therefore, the server may include an audio and video server and a message server: the audio and video server realizes the video stream transmission between the teacher-side display device and the student-side display device, and the message server realizes the message transmission between them.
  • after the server receives the video stream sent by the teacher's display device, if only the main stream identifier is obtained from the video stream, the target video stream is the first picture captured by the first camera; if only the auxiliary stream identifier is obtained, the target video stream is the second picture captured by the second camera; if both the main stream identifier and the auxiliary stream identifier are obtained, the target video stream is the first picture captured by the first camera and the second picture captured by the second camera.
  • the server can configure corresponding video stream identifiers for the main video stream and the auxiliary video stream respectively, so that the student end can display the first picture in the teacher's video window based on the main video stream identifier, and display the second picture in the playback window of the lecture area based on the auxiliary video stream identifier.
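  • The identifier assignment described above can be illustrated with a short TypeScript sketch; the types and the mapping function are assumptions for illustration only, not the server's actual implementation.

```typescript
// Hypothetical tagging of incoming pictures with video stream identifiers on the server.
type CameraSource = "firstCamera" | "secondCamera";

interface TaggedStream {
  source: CameraSource;
  streamId: "main" | "aux";
}

function tagTargetVideoStream(sources: CameraSource[]): TaggedStream[] {
  // The first camera's picture is forwarded as the main video stream,
  // the second camera's picture as the auxiliary video stream.
  return sources.map((source) => ({
    source,
    streamId: source === "firstCamera" ? "main" : "aux",
  }));
}
```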
  • when the teacher side controls the screen switching, in order for the student side to perform the screen switching synchronously, the server switches the video stream identifiers of the main video stream and the auxiliary video stream, and then sends the main video stream and the auxiliary video stream to the student end based on the switched video stream identifiers.
  • the controller is further configured to perform the following steps:
  • Step 341 When switching and displaying the pictures captured by the cameras, in response to the switching instruction generated when the switch button is triggered on the teacher-side display device, change the video stream identifier of the first picture to the secondary video stream identifier and the video stream identifier of the second picture to the main video stream identifier, and send a switching notification to the display device of the student side.
  • Step 342 When the switching display of the pictures captured by the cameras is completed, send the first picture carrying the secondary video stream identifier and the second picture carrying the primary video stream identifier to the display device at the student end, so that the lecture area playback window of the student-side display device displays the first picture based on the secondary video stream identifier, and the teacher's video window displays the second picture based on the primary video stream identifier.
  • the server receives the switching command sent by the teacher’s display device, realizes the switching of the identification of the main and auxiliary video streams, and then sends the corresponding video stream to the student’s display device based on the switched video stream identification, so that the student’s display device realizes screen switching.
  • the smoothness of the transmission is determined by the network status. Therefore, in order to ensure the smoothness of the video between the two terminals, a method of layered video stream push can be adopted.
  • when transmitting the video stream, the controller is further configured to perform the following steps:
  • Step 351 when realizing the transmission of the video stream between the teacher-side display device and the student-side display device, according to the resolution level, transcode the video stream sent by the teacher-side display device into a multi-layer video stream, each layer of video stream corresponds to a resolution, and the video stream refers to at least one of the main video stream and the auxiliary video stream.
  • Step 352 When sending the target layer video stream sent by the display device at the teacher's end to the display device at the student's end, detect the packet loss rate of the video stream received by the display device at the student's end.
  • Step 353 If the packet loss rate is higher than the preset threshold, send the corresponding layer video stream to the student display device based on the next resolution level, which is lower than the resolution level corresponding to the target layer video stream.
  • when the server receives the video stream sent by the teacher's display device, it first divides the video stream into multiple layers according to resolution level before calling the audio and video SDK method to push the stream to the student end.
  • the video stream layers are sorted by resolution from high to low.
  • the server will transcode the video stream into a three-layer video stream, such as a three-layer video stream of 1920 ⁇ 1080, 960 ⁇ 540, and 480 ⁇ 270.
  • the server sends the divided multi-layer video stream to the display device at the student end, so that the sent video stream connection signaling includes the information of the video stream level.
  • the video stream level information is the information of the three-layer video stream.
  • the student end calls the audio and video subscription video stream method in the main (auxiliary) stream video component of the teacher's handout area to select the corresponding layer video stream, and subscribes to the server for the corresponding layer video stream.
  • the student terminal subscribes to the server for a 1920 ⁇ 1080 layered video stream, and after receiving the subscription message, the server sends the 1920 ⁇ 1080 layered video stream to the student’s display device, and renders and displays it in the subscribing component.
  • the server detects the packet loss rate of the video stream received by the display device of the student terminal in real time. If the packet loss rate is higher than the preset threshold, such as higher than 40%, it means that the current network status is poor, and the layer will be automatically lowered, and the next layer video stream will be sent to the student display device to ensure the smoothness of the student end.
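  • The layered-push fallback can be pictured with the following TypeScript sketch; the resolution ladder and the 40% threshold follow the description above, while the function and type names are assumptions for illustration.

```typescript
// Illustrative layer-downgrade logic for the layered video stream push.
interface Layer {
  width: number;
  height: number;
}

const LAYERS: Layer[] = [
  { width: 1920, height: 1080 },
  { width: 960, height: 540 },
  { width: 480, height: 270 },
];

const PACKET_LOSS_THRESHOLD = 0.4; // 40%

function nextLayerIndex(currentIndex: number, packetLossRate: number): number {
  // If the student end reports a packet loss rate above the threshold, drop to the
  // next lower-resolution layer (if one exists) to keep playback smooth.
  if (packetLossRate > PACKET_LOSS_THRESHOLD && currentIndex < LAYERS.length - 1) {
    return currentIndex + 1;
  }
  return currentIndex;
}
```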
  • the resolution may also be manually reduced for adjustment.
  • the teacher terminal triggers the setting button on the first live broadcast interface to pop up a secondary stream setting pop-up window, and manually modifies the resolution of the secondary video stream.
  • the video fluency of the student terminal is ensured by manually reducing the resolution of the current push stream.
  • when the server is performing a dual-camera live broadcast, it can configure the corresponding video stream identifier for each video stream type sent by the teacher's display device, so that the student end can display different pictures in the two corresponding playback windows based on the different video stream identifiers, so as to watch the teacher's live lecture.
  • when the teacher end flexibly and dynamically switches the pictures captured by the two cameras between the two playback windows, the server can switch the video stream identifiers of the main video stream and the auxiliary video stream, and continue to send the main and auxiliary video streams with the switched identifiers to the student end, so that the student end also switches and displays the corresponding pictures; the operation is simple and convenient, and improves the user's class experience.
  • Fig. 14 shows a flow chart of a method for performing a dual-camera live broadcast on a display device at a student terminal according to some embodiments.
  • the present application also provides a dual-camera live broadcast method, which is applied to the student terminal display device, and the method includes:
  • the second live broadcast interface includes at least one of the teacher video window and the play window in the handout area;
  • the target video stream refers to at least one of the first picture captured by the first camera and/or the second picture captured by the second camera acquired by the teacher-side display device after starting the live broadcast;
  • the video stream identifier indicates that the target video stream is the main video stream, display the main video stream in the teacher's video window;
  • the video stream identifier indicates that the target video stream includes a main video stream and an auxiliary video stream
  • the main video stream is displayed in the teacher's video window
  • the auxiliary video stream is displayed in the lecture area playback window.
  • Fig. 16 shows a flow chart of a server performing a dual-camera live broadcast method according to some embodiments.
  • the present application also provides a dual-camera live broadcast method applied to a server, the method comprising:
  • the target video stream refers to at least one of the first picture captured by the first camera and/or the second picture captured by the second camera acquired by the teacher-side display device after starting the live broadcast;
  • the target video stream is the first picture captured by the first camera, configure the main video stream identifier for the first picture, and send the first picture carrying the main video stream identifier as the main video stream to the display device at the student end for display;
  • the target video stream is the second picture captured by the second camera, configure a secondary video stream identifier for the second picture, and send the second picture carrying the secondary video stream identifier as a secondary video stream to the student display device for display.
  • the operation is as follows when the teacher enters the target live broadcast room by triggering the target live lesson control displayed in the teacher display device:
  • S611 Obtain the corresponding audio and video SDK identification from the server and add it to the corresponding live broadcast room;
  • S613 Send a live class class signaling to each student-side display device participating in the class through the group;
  • S614 Start the class live broadcast, push the main stream video, and send the main stream connection video signaling to the student-side display device
  • S618 The teacher finishes the teaching content, triggers the class dismissal control, and sends an end-of-class message to the server; or
  • S619 When the automatic end-of-class time arrives, the scheduled end-of-class task is triggered, and an end-of-class message is sent to the server.
  • the student terminal operates as follows when entering the target live broadcast room:
  • S621 Obtain the corresponding audio and video SDK identification from the server and join the corresponding live broadcast room;
  • S623 Receive the live class signal sent by the teacher from the server, hide the unattended class mask and display the live class content after receiving the message, and at this time, the playback window in the handout area displays the whiteboard content on the teacher's side;
  • S624 Receive the video signaling sent by the server, determine the video stream ID as the main video stream ID, call the audio and video subscription video stream method in the teacher video component, and subscribe the main video stream to the server; after receiving the subscription message, the server sends the main video stream to the student display device, and displays it in the teacher video window of the student display device
  • S625 After receiving the auxiliary stream video signaling, subscribe to the corresponding auxiliary stream video in the auxiliary stream video component in the handout area; after the subscription is successful, receive the auxiliary stream video sent by the server, and display it in the play window of the handout area;
  • S626 After receiving the switching signal of the main and auxiliary streams, respectively subscribe to the switched video streams in the teacher video component and the auxiliary stream video component in the handout area, and render them in corresponding windows after the subscription is successful;
  • S627 In response to the secondary stream disconnection signaling, remove the subscription of the secondary stream video, and display the lecture content in the playback window of the lecture area;
  • the teacher sets the start time (startTime) and end time (endTime) of the live class when creating the live class on the smart TV, for example, the live class starts at 8:00 and ends at 9:00. When the live class ends, that is, at 9:00, the live class will stop immediately. If the teacher still has unfinished content, he can only continue the lecture in the next class. In this manner, the interaction between the teacher and the display device 200 at the end of the class is poor, and the class experience is not good. In order to improve the experience effect of the teacher in the live broadcast class, the application also provides an implementation mode of dragging the class.
  • the server 400 responds to the user's request to create a live class through the display device 200, correspondingly creates a live class chat group, an audio and video live room, and creates a compulsory course end timer according to the class time set by the user on the display device 200.
  • the course compulsory end timer sets the first end time (delayEndTime) according to the class end time set by the teacher. If the class is not over by the first end time, the course compulsory end timer will send a time limit message to the live class chat group. That is to say, when the course compulsory end timer detects that the first end time has arrived, even if the teacher has not dismissed the class, the server 400 will control the display device 200 to forcibly end the course.
  • the function of the compulsory course end timer is to maintain the operating efficiency of the entire system, because the server 400 serves many users, and only when the current live course is finished can the system resources be released for use by other live course services.
  • Fig. 17 exemplarily shows a schematic flow chart of a method for managing live classes and dragging classes according to some embodiments.
  • the process of dragging class management for this live class is as follows:
  • S701 In response to the operation of the lecturer entering the live lecture, send a data acquisition request including the course identifier to the server.
  • the main interface displays a list of live lessons created by the instructor.
  • the display device 200 controls the display 260 to jump to the live lesson playback interface shown in FIG. 8 .
  • the display device 200 will send a data acquisition request carrying a course identifier to the server 400.
  • the data acquisition request may also include a user identifier, and the server 400 can determine whether the user is a lecturer or an ordinary student through the user identifier.
  • the time at which the lecturer starts the live lecture may be earlier than startTime or later than startTime.
  • when the server 400 receives the data request containing the course identifier sent by the display device 200, it obtains the live course detail data corresponding to the live course and the first end time of the live course from the database according to the course identifier, and feeds back the corresponding data to the display device 200.
  • the server 400 also needs to find the live class chat group ID and the audio and video live room ID created earlier in response to the user creating the live class. The server 400 can attach the live class chat group ID and the audio and video live room ID to the live class detail data and feed them back to the display device 200, or send them separately to the display device 200.
  • S702 Receive the live class detail data and the first end time of the live class sent by the server according to the course identifier, wherein the live class detail data is the data required for the live class, and the first end time is calculated by the server according to the second end time set by the teaching party (for example, the end time endTime set by the teaching party), and the first end time is greater than or equal to the second end time.
  • when the lecturer creates a live class through the display device 200, the server 400 obtains the start time and end time of the created live class from the display device 200, and sets the end time set by the lecturer as the second end time.
  • the operator can also set a third preset duration through the server 400, where the third preset duration is the duration (forceEndDelay) that allows the live broadcast class to be delayed.
  • the third preset duration can prevent the display device 200 from immediately closing the current live class when it reaches the second end time set by the lecturer, that is, appropriately setting a certain delay time for the live class, so as to improve the user experience of the lecturer.
  • the server 400 accumulates the third preset duration on the basis of the second end time, calculates the first end time, and feeds back the first end time to the display device 200 .
  • the server 400 starts the course mandatory end timer according to the calculated first end time. If the course mandatory end timer completes the timing of the first end time and does not receive the operation of the teaching party about dragging the class, the server 400 can directly force the end of the current live class, and spread the news of the end of the live class through the live class chat group.
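  • A minimal sketch of this server-side timing, assuming the parameter names endTime and forceEndDelay from the description, is shown below in TypeScript; the function names and the callback are illustrative, not the server's actual implementation.

```typescript
// Hypothetical server-side calculation of the first end time and the forced-end timer.
function computeFirstEndTime(endTime: Date, forceEndDelayMinutes: number): Date {
  // first end time (delayEndTime) = second end time (endTime) + allowed delay (forceEndDelay)
  return new Date(endTime.getTime() + forceEndDelayMinutes * 60_000);
}

function startForcedEndTimer(
  firstEndTime: Date,
  forceEndClass: () => void,
): ReturnType<typeof setTimeout> {
  const delayMs = Math.max(0, firstEndTime.getTime() - Date.now());
  // If the class has not been extended when the first end time arrives,
  // the class is forcibly ended and the end message is spread via the chat group.
  return setTimeout(forceEndClass, delayMs);
}
```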
  • S703 Set a first timer according to the first end time, so that the display device controls the display to display a dragging class interface after the first timer finishes counting seconds, wherein the dragging class interface includes a dragging class control, and the dragging class control is used to trigger a delay to stop the live class.
  • the display device 200 records the action time when the lecturer just entered the live class, and the display device 200 is set with a second preset duration, the second preset duration is the duration between displaying the dragging interface and the first end time, that is, the dragging interface is displayed for the second preset duration before the end of the course. For example, if the second end time is 9:00 and the third preset duration is 20 minutes, then the first end time is 9:20 and the second preset duration is 5 minutes, then the display device 200 will display the dragging interface at 9:15.
  • the display device 200 sequentially subtracts the action time and the second preset duration from the first end time to obtain a first timing duration, and sets the first timer according to the first timing duration.
  • the action time of the lecturer entering the live class is exactly startTime, and the startTime is 8:00
  • the calculation method of the first timer is 9:20 minus 8:00, and then subtract 5 minutes to get 1 hour and 15 minutes. That is to say, the first timer needs to complete the timing of 1 hour and 15 minutes.
  • For example, the JS language can be used to start a timer through setTimeout (a timer function), which takes two parameters: the behavior to execute when the timing completes, and the duration to time.
  • the duration to time is the first timing duration expressed in milliseconds
  • the behavior to execute when the timing completes is to display the dragging class interface.
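  • Following the setTimeout usage mentioned above, the first-timer calculation might look like the sketch below; the helper and argument names are assumptions for illustration.

```typescript
// Hypothetical first-timer setup on the lecturer's display device.
function startFirstTimer(
  firstEndTime: Date,           // e.g. 9:20
  actionTime: Date,             // time the lecturer entered the live class, e.g. 8:00
  secondPresetMinutes: number,  // how long before the first end time to show the prompt, e.g. 5
  showDraggingInterface: () => void,
): ReturnType<typeof setTimeout> {
  // first timing duration = first end time - action time - second preset duration
  const firstTimingMs =
    firstEndTime.getTime() - actionTime.getTime() - secondPresetMinutes * 60_000;
  // e.g. 9:20 - 8:00 - 5 minutes = 1 hour 15 minutes
  return setTimeout(showDraggingInterface, Math.max(0, firstTimingMs));
}
```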
  • the display 260 is controlled to display the dragging class interface.
  • the dragging interface may be a dragging prompt box.
  • Fig. 18 exemplarily shows a schematic diagram of a display effect of a live class dragging class interface according to some embodiments. As shown in FIG. 18 , the dragging prompt box is displayed floating on the play window of the lecture area of the first live broadcast interface of the lecturer.
  • the dragging class prompt box is provided with a dragging class control (“Dragging class for 10 minutes” in FIG. 18), a class dismissal control (“Class dismissal” in FIG. 18), and a closing control (“X” in FIG. 18).
  • the dragging class control is used to trigger the delayed stop of the live class
  • the class dismissal control is used to trigger the immediate stop of the live class
  • the close control is used to close the current dragging class prompt box.
  • S704 When receiving the closing operation on the dragging class interface, call the server interface at the first end time to set the live class to the dismissed state, so as to stop the live class.
  • when the display device 200 displays the dragging class interface, the lecturer, after seeing it, judges whether it is necessary to delay the end of class based on the current teaching situation. If the lecturer judges that the lecture content can be finished in the current live class and there is no need to extend the class, the dragging class interface displayed on the display 260 can be directly closed.
  • when the display device 200 receives the selection operation of the close control input by the lecturer, it hides the current dragging class interface, does not change the background logic, and closes the live class according to the first end time.
  • the display device 200 invokes the interface of the server 400 to set the live class to the dismissed state, so as to stop the live class.
  • the server 400 sends signaling to the student-side display devices through the live class chat group, informing them that the current live class is over, and the student-side display devices display an end-of-class interface.
  • the server 400 also sends signaling to the teaching party's display device through the live class chat group; the teaching party's display device controls the display 260 to exit the live class playback interface, stop the live class, and display how long the current live class lasted.
  • when the display device 200 displays the dragging class interface, if the teaching party does not need to extend the class, the teaching party may simply not operate on the dragging class interface; when no input operation on the dragging class interface is received within the first preset duration, the display device 200 calls the server interface at the first end time to set the live class to the dismissed state. For example, if the display device 200 does not receive any operation on the dragging class interface from the lecturer within 10 seconds, the dragging class interface is hidden, and the live class is closed according to the first end time by default.
  • S705 When receiving the selection operation of the class drag control, send a class delay request to the server, receive the third end time of the live class sent by the server according to the class delay request, and set a second timer according to the third end time, so that the display device controls the display to display the drag class interface again after the second timer finishes counting seconds.
  • when the display device 200 displays the dragging class interface, the lecturer, after viewing it, may judge based on the current teaching situation that the lecture content cannot be completed in the current live class and the class needs to be extended appropriately, and then the dragging class control can be selected.
  • when the display device 200 receives the class extension instruction input by the instructor, it sends a class delay request to the server 400.
  • after receiving the class delay request sent by the display device 200, the server 400 recalculates the third end time.
  • the third end time is the first end time plus the length of time to be procrastinated, plus the third preset time.
  • the server 400 adjusts the timing of the course compulsory end timer accordingly. For example, if the first end time is 9:20, the extension duration is 10 minutes, and the third preset duration is still 20 minutes, then the third end time is 9:20 plus 10 minutes, plus 20 minutes, which is 9:50. After calculating the third end time, the server 400 feeds back the third end time to the display device 200.
  • the display device 200 records the current time at which the selection of the drag control is received.
  • the display device 200 subtracts, from the third end time, the current time at which the selection of the dragging class control was received and the second preset duration, to obtain the second timing duration of the second timer.
  • the calculation method of the second timer is 9:50 minus 9:15, and then subtract 5 minutes to get 30 minutes, that is to say, the second timer needs to complete 30 minutes of timing.
  • the display device 200 controls the display to display the dragging interface again.
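  • The second-timer calculation after a class extension can be sketched in the same way; again the function names are illustrative assumptions rather than the actual implementation.

```typescript
// Hypothetical third end time and second timer after the dragging class control is selected.
function computeThirdEndTime(
  firstEndTime: Date,
  extensionMinutes: number,     // duration of the requested extension, e.g. 10
  forceEndDelayMinutes: number, // third preset duration, e.g. 20
): Date {
  return new Date(firstEndTime.getTime() + (extensionMinutes + forceEndDelayMinutes) * 60_000);
}

function startSecondTimer(
  thirdEndTime: Date,           // e.g. 9:50
  currentTime: Date,            // time the dragging class control was selected, e.g. 9:15
  secondPresetMinutes: number,  // e.g. 5
  showDraggingInterfaceAgain: () => void,
): ReturnType<typeof setTimeout> {
  const secondTimingMs =
    thirdEndTime.getTime() - currentTime.getTime() - secondPresetMinutes * 60_000;
  // e.g. 9:50 - 9:15 - 5 minutes = 30 minutes
  return setTimeout(showDraggingInterfaceAgain, Math.max(0, secondTimingMs));
}
```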
  • when the display device 200 receives the selection operation of the class dismissal control input by the lecturer, it immediately calls the interface of the server 400 to set the live class to the dismissed state.
  • Fig. 19 shows a sequence diagram of a method for managing live classes and dragging classes according to some embodiments. As shown in Figure 19, its operation is as follows:
  • the lecturer can call the interface of the server 400 through the display device 200 to create a live class, and set the start time and end time of the live class, where the end time is the second end time.
  • the server 400 obtains from the display device 200 the start time and the second end time of the live lecture created by the teaching party.
  • the operator sets the third preset duration on the server 400 side, and the server 400 obtains the first end time by adding the third preset duration to the second end time.
  • the server 400 also needs to create a live class chat group and an audio and video live room accordingly, and start a forced end timer for the class according to the first end time.
  • S903 The lecturer enters the live broadcast room to start the class.
  • the display device 200 sends a data acquisition request including the course identifier to the server 400.
  • the server 400 feeds back the first end time and live class detail data (including live class chat group IDs, audio and video live room IDs) to the display device 200 .
  • the display device 200 starts the class according to the received live class detail data, that is, joins the live class chat group, audio and video live room.
  • the display device 200 presets a second preset duration, that is, how long before the first end time is required to display the dragging interface.
  • the display device 200 calculates the first timing duration according to the first end time, and locally starts the first timer according to the first timing duration. After the first timer completes the first timing duration, the display device 200 controls to display the delay prompt.
  • S908 Immediately call the server interface to set the live class to the dismissed state when receiving the selection operation of the class dismissal control in the dragging class prompt.
  • S909 Send a class delay request to the server 400 if a selection operation of the drag control on the drag class interface is received.
  • S910 The server 400 recalculates the third end time according to the classroom delay request, and feeds back to the display device 200 .
  • the course compulsory end timer adjusts its timing accordingly; that is, the server 400 adjusts the timing of the course compulsory end timer according to the third end time.
  • the display device 200 calculates the second timing duration according to the third end time, starts the second timer locally, and repeats the above process until the live broadcast class stops.
  • S913 According to the received message of the end of the live broadcast, exit the chat group of the live broadcast and the audio and video live broadcast room, and jump to the end screen of the live broadcast.
  • the way to end the live class is to send a global message to each display device through the live class chat group.
  • the audio and video live room of the server 400 does not stop.
  • each display device will automatically stop accessing the live class chat group and stop uploading audio and video data.
  • the audio and video live room detects that there is no request for uploading or pulling down data for a period of time, and then automatically stops the audio and video live room.
  • the server 400 stops the audio and video live broadcast passively, under the control of the display devices that receive the message, rather than actively cutting it off; each display device performs its own control according to its own network delay.
  • the server 400 turns off the course compulsory end timer. If the lecturer still has not ended the class after the extension, the course compulsory end timer of the server 400 sends a time limit message to the live class chat group, the live class chat group sends a live class end message to each display device participating in the live class, and each display device exits the live class chat group and the audio and video live room according to the received message and jumps to the live broadcast end screen.
  • Some embodiments of the present application further provide a server, and the server is configured to: the server 400 receives a data acquisition request including a course identifier sent by the display device 200 .
  • the server 400 queries the database for the corresponding live class detail data and the first end time of the live class according to the course identifier, wherein the live class detail data is the data required to enter the live class, and the first end time is calculated according to the second end time set by the teaching party.
  • the server 400 sends the detailed data of the live class and the first end time of the live class to the display device 200, wherein the first end time is used for the display device to set a first timer, so that the display device controls the display to show the class dragging interface after the first timer finishes counting seconds.
  • the server 400 obtains the second end time set by the lecturer in the display device, and adds the second end time to a third preset duration to obtain the first end time, wherein the third preset duration is the duration that allows the live broadcast class to be delayed, and can be customized by the operator.
  • calculating the third end time includes: adding the first end time to the third preset duration and the duration of the class to be delayed to obtain the third end time.
  • the management method for dragging a live class includes: the display device 200 sends a data acquisition request including the course identifier to the server in response to the operation of the lecturer entering the live class.
  • the server 400 queries the corresponding live course detail data and the first end time from the database according to the course identifier, and sends them to the display device 200 .
  • the display device 200 receives the live class detail data and the first end time of the live class sent by the server 400, wherein the live class detail data is the data required for the live class, the first end time is calculated by the server according to the second end time set by the teaching party, and the first end time is greater than or equal to the second end time.
  • the display device 200 sets a first timer according to the first end time, so that the display device 200 controls the display 260 to display a class dragging interface after the first timer finishes counting seconds, wherein the class dragging interface includes a class dragging control, and the class dragging control is used to trigger a delay to stop the live class.
  • the display device 200 calls the server 400 interface at the first end time to set the live class to the dismissed state, so as to stop the live class.
  • when receiving the selection operation of the dragging class control, the display device 200 sends a class delay request to the server 400, receives the third end time of the live class sent by the server 400 according to the class delay request, and sets a second timer according to the third end time, so that the display device 200 controls the display to show the dragging class interface again after the second timer finishes counting seconds.
  • the dragging prompt box is displayed on the lecture area playback window of the first live interface of the lecturer.
  • the teacher is in front of the desk and can directly see the displayed content in the lecture area playback window, so he can see the live class dragging interface immediately.
  • the above-mentioned live class dragging management solution can be directly adopted.
  • if the teacher is writing on the blackboard and is not in front of the desk, he cannot immediately see the live class dragging interface.
  • the dragging management plan needs to be adjusted accordingly as follows.
  • the acquisition of the second picture captured by the second camera is started, and the second picture is displayed in the playback window of the handout area.
  • the teacher may manually write blackboard writing on the blackboard.
  • when the display device 200 displays the dragging class interface on the lecture area playback window, the teacher may not notice the dragging class interface and will not act on it.
  • the display device 200 will then default to extending the class, so it sends a class delay request to the server; the server calculates and sends the third end time of the live class according to the class delay request; the display device then displays a delay prompt box on the lecture area playback window, for example displaying "The end of class time has been delayed, delay time: 10 minutes", and at the same time closes the dragging class interface.
  • a class dismissal control and a close control may also be set on the delay prompt box.
  • the class dismissal control is used to trigger the immediate stop of the live class
  • the close control is used to close the current delay prompt box. If the display device 200 does not receive an operation on the delay interface from the lecturer, the interface will remain in the playback window, and the live class will be closed according to the third end time.
  • a second timer is set according to the third end time, so that the display device controls the display to close the delay interface after the second timer finishes counting seconds, and displays the dragging interface again, and repeats the above process until the live class stops.
  • when the server 400 receives the class extension request sent by the display device 200, it evaluates the current and upcoming system load. If there is performance headroom during the requested extension period, it feeds back a message that the extension is successful to each display device. If there is no headroom, or the headroom does not meet a preset standard, the extension is not allowed, or the extension is allowed with a prompt that system performance is insufficient. Afterwards, if the server 400 becomes overloaded or raises an early warning, it first throttles the extended live class services to maintain system efficiency; if this is still insufficient, other services are handled, so as to balance system fairness and maintain the class quality of normal live class business.
  • the teacher triggers the target live lesson control, and displays the first live broadcast interface including a class button, an auxiliary stream start button, a teacher video window, and a play window in a handout area.
  • the class button When starting the live broadcast, trigger the class button to start and obtain the first picture captured by the first camera, which will be displayed in the teacher's video window.
  • the teacher triggers the auxiliary stream start button to start and obtain the second picture captured by the second camera, which is displayed in the playback window of the lecture area; the first picture and the second picture are sent to the student terminal through the server for display.
  • the teacher triggers the switching button, and the teacher end and the student end simultaneously display the first picture in the lecture area playback window and the second picture in the teacher's video window.
  • two cameras are used to capture different teaching pictures, and the auxiliary teaching content and the handout content share a playback window to improve the display effect.
  • the teacher side can flexibly and dynamically switch the images captured by the two cameras between the two playback windows, the operation is simple and convenient, and the user's class experience is improved.
  • the present disclosure also provides a processing method for procrastination during the live broadcast.
  • the instructor can be asked whether the procrastination is required, so that the instructor can freely control the time to delay the end of the class, so that the unfinished course can be finished in one class, thereby improving the experience of the instructor.

Abstract

公开了一种带有摄像头的显示设备和相应的直播方法,包括:响应于触发用户界面中目标直播课控件产生的启动指令,显示第一直播界面,其包括上课按钮、辅流启动按钮、老师视频窗口和讲义区播放窗口中的至少一种;响应于触发上课按钮产生的开始直播指令,在讲义区播放窗口显示白板内容,启动并获取双摄像头中的第一摄像头采集的第一画面,将其显示在老师视频窗口,并将第一画面作为主视频流通过服务器发送至学生端显示设备进行显示;响应于触发辅流启动按钮产生的双摄像头直播指令,启动并获取第二摄像头采集的第二画面,将第二画面显示在讲义区播放窗口,并将第二画面作为辅视频流通过服务器发送至学生端显示设备进行显示。

Description

显示设备和直播方法
本申请要求2022年4月18日提交、申请号为No.202210405366.1、以及2022年1月24日提交的、申请号为No.202210077270.7的中国专利申请的优先权,其全部内容通过参引的方式结合入本文中。
技术领域
本申请涉及显示设备的技术领域,尤其涉及一种带有摄像头的显示设备和直播方法。
背景技术
随着显示设备的快速发展,显示设备的功能将越来越丰富,性能也越来越强大,目前,显示设备包括智能电视、智能手机、电脑,以及带有智能显示屏幕的产品等。在利用显示设备实现的不同场景中,直播课(在线直播上课)是一种实现在线教育且发展迅速的场景之一。在利用显示设备进行在线直播上课时,显示设备内可配置直播课程序。
在线直播课堂上课时,需要模拟真实上课场景,老师的画面和老师在黑板上的板书内容均需被学生看到。但是,目前的直播课通常采用一个摄像头同时拍摄老师和黑板的画面来进行直播。如需在直播过程中切换其他画面时,需由专门的工作人员手动挪动摄像头位置来解决,操作非常不灵活,影响直播课的进度和效果。
此外,目前在线直播课的开始时间和结束时间均按照老师在创建直播课时所设置的时间。等到直播课结束时间到时,直播课会立即停止,此时,若老师还有未讲完的内容,只能在下节课继续讲课。此方式下,使得老师上课体验效果不佳。
发明内容
本公开提供了一种用于带有摄像头的显示设备的直播方法,所述方法包括:响应于触发用户界面中目标直播课控件产生的启动指令,显示第一直播界面,所述第一直播界面中包括上课按钮、辅流启动按钮、老师视频窗口和讲义区播放窗口中的至少一种;响应于触发所述上课按钮产生的开始直播指令,在所述讲义区播放窗口显示白板内容,启动并获取双摄像头中的第一摄像头采集的第一画面,将所述第一画面显示在老师视频窗口,以及,将所述第一画面作为主视频流通过服务器发送至学生端显示设备进行显示;响应于触发所述辅流启动按钮产生的双摄像头直播指令,启动并获取双摄像头中的第二摄像头采集的第二画面,将所述第二画面显示在讲义区播放窗口,以及,将所述第二画面作为辅视频流通过服务器发送至学生端显示设备进行显示。
本公开还提供了一种显示设备,包括:显示器,被配置为显示图像和/或用户界面;第一摄像头,被配置为采集第一画面;第二摄像头,被配置为采集第二画面;分别与所述显示器、第一摄像头和第二摄像头连接的控制器,所述控制器被配置为:响应于触发用户界面中目标直播课控件产生的启动指令,显示第一直播界面,所述第一直播界面中包括上课按钮、辅流启动按钮、老师视频窗口和讲义区播放窗口中的至少一种;响应于 触发所述上课按钮产生的开始直播指令,在所述讲义区播放窗口显示白板内容,启动并获取所述第一摄像头采集的第一画面,将所述第一画面显示在老师视频窗口,以及,将所述第一画面作为主视频流通过服务器发送至学生端显示设备进行显示;响应于触发所述辅流启动按钮产生的双摄像头直播指令,启动并获取所述第二摄像头采集的第二画面,将所述第二画面显示在讲义区播放窗口,以及,将所述第二画面作为辅视频流通过服务器发送至学生端显示设备进行显示。
附图说明
图1示出了根据一些实施例的显示设备与控制装置之间操作场景的示意图;
图2示出了根据一些实施例的控制装置100的硬件配置框图;
图3示出了根据一些实施例的显示设备200的硬件配置框图;
图4示出了根据一些实施例的显示设备200中软件配置图;
图5示出了根据一些实施例的老师端显示设备执行双摄像头直播方法的流程图;
图6示出了根据一些实施例的双摄像头直播方法的时序图;
图7示出了根据一些实施例的显示设备200中教育应用程序的主界面显示效果示意图;
图8示出了根据一些实施例的老师端显示设备显示的第一直播界面的示意图;
图9示出了根据一些实施例的学生端显示设备显示的第二直播界面的示意图;
图10示出了根据一些实施例的第一直播界面中讲义区播放窗口显示讲义内容的示意图;
图11示出了根据一些实施例的在第一直播界面中显示辅流设置弹窗的示意图;
图12示出了根据一些实施例的在讲义区播放窗口显示第二画面的示意图;
图13示出了根据一些实施例的在第一直播界面中显示切换后主辅视频流的示意图;
图14示出了根据一些实施例的学生端显示设备执行双摄像头直播方法的流程图;
图15示出了根据一些实施例的在第二直播界面中显示切换后主辅视频流的示意图;
图16示出了根据一些实施例的服务器执行双摄像头直播方法的流程图;
图17示出了根据一些实施例的直播课拖堂管理方法的流程示意图;
图18示出了根据一些实施例的直播课拖堂界面的显示效果示意图;
图19示出了根据一些实施例的直播课拖堂管理方法的时序图。
具体实施方式
为使本申请的目的和实施方式更加清楚,下面将结合本申请示例性实施例中的附图,对本申请示例性实施方式进行清楚、完整地描述,显然,描述的示例性实施例仅是本申请一部分实施例,而不是全部的实施例。
需要说明的是,本申请中对于术语的简要说明,仅是为了方便理解接下来描述的实施方式,而不是意图限定本申请的实施方式。除非另有说明,这些术语应当按照其普通和通常的含义理解。
本申请实施方式提供的显示设备可以具有多种实施形式,例如,可以是电视、智能电视、电脑、激光投影设备、显示器(monitor)、电子白板(electronic bulletin board)、电 子桌面(electronic table)等。图1和图2为本申请的显示设备的一种具体实施方式。
图1示出了根据一些实施例的显示设备与控制装置之间操作场景的示意图。如图1所示,用户可通过智能设备300或控制装置100操作显示设备200。
在一些实施例中,控制装置100可以是遥控器,遥控器和显示设备的通信包括红外协议通信或蓝牙协议通信,及其他短距离通信方式,通过无线或有线方式来控制显示设备200。用户可以通过遥控器上按键、语音输入、控制面板输入等输入用户指令,来控制显示设备200。
在一些实施例中,也可以使用智能设备300(如移动终端、平板电脑、计算机、笔记本电脑等)以控制显示设备200。例如,使用在智能设备上运行的应用程序控制显示设备200。
在一些实施例中,显示设备可以不使用上述的智能设备或控制设备接收指令,而是通过触摸或者手势等接收用户的控制。
在一些实施例中,显示设备200还可以采用除了控制装置100和智能设备300之外的方式进行控制,例如,可以通过显示设备200设备内部配置的获取语音指令的模块直接接收用户的语音指令控制,也可以通过显示设备200设备外部设置的语音控制设备来接收用户的语音指令控制。
在一些实施例中,显示设备200还与服务器400进行数据通信。可允许显示设备200通过局域网(LAN)、无线局域网(WLAN)和其他网络进行通信连接。服务器400可以向显示设备200提供各种内容和互动。服务器400可以是一个集群,也可以是多个集群,可以包括一类或多类服务器。
图2示出了根据一些实施例的控制装置100的硬件配置框图。如图2所示,控制装置100包括控制器110、通信接口130、用户输入/输出接口140、存储器、供电电源。控制装置100可接收用户的输入操作指令,且将操作指令转换为显示设备200可识别和响应的指令,起到用户与显示设备200之间交互中介的作用。
图3示出了根据一些实施例的显示设备200的硬件配置框图。如图3,显示设备200包括调谐解调器210、通信器220、检测器230、外部装置接口240、控制器250、显示器260、音频输出接口270、存储器、供电电源、用户接口中的至少一种。
在一些实施例中控制器包括处理器,视频处理器,音频处理器,图形处理器,RAM,ROM,用于输入/输出的第一接口至第n接口。
显示器260包括用于呈现画面的显示屏组件,以及驱动图像显示的驱动组件,用于接收源自控制器输出的图像信号,进行显示视频内容、图像内容以及菜单操控界面的组件以及用户操控UI界面。
显示器260可为液晶显示器、OLED显示器、以及投影显示器,还可以为一种投影装置和投影屏幕。
通信器220是用于根据各种通信协议类型与外部设备或服务器进行通信的组件。例如:通信器可以包括Wifi模块,蓝牙模块,有线以太网模块等其他网络通信协议芯片或近场通信协议芯片,以及红外接收器中的至少一种。显示设备200可以通过通信器220与外部控制设备100或服务器400建立控制信号和数据信号的发送和接收。
用户接口,可用于接收控制装置100(如:红外遥控器等)的控制信号。
检测器230用于采集外部环境或与外部交互的信号。例如,检测器230包括光接收 器,用于采集环境光线强度的传感器;或者,检测器230包括图像采集器,如摄像头,可以用于采集外部环境场景、用户的属性或用户交互手势,再或者,检测器230包括声音采集器,如麦克风等,用于接收外部声音。
外部装置接口240可以包括但不限于如下:高清多媒体接口(HDMI)、模拟或数据高清分量输入接口(分量)、复合视频输入接口(CVBS)、USB输入接口(USB)、RGB端口等任一个或多个接口。也可以是上述多个接口形成的复合性的输入/输出接口。
调谐解调器210通过有线或无线接收方式接收广播电视信号,以及从多个无线或有线广播电视信号中解调出音视频信号,以及EPG数据信号。
在一些实施例中,控制器250和调谐解调器210可以位于不同的分体设备中,即调谐解调器210也可在控制器250所在的主体设备的外置设备中,如外置机顶盒等。
控制器250,通过存储在存储器中的各种软件控制程序,来控制显示设备的工作和响应用户的操作。控制器250控制显示设备200的整体操作。例如:响应于接收到用于选择在显示器260上显示UI对象的用户命令,控制器250便可以执行与由用户命令选择的对象有关的操作。
在一些实施例中控制器包括中央处理器(Central Processing Unit,CPU),视频处理器,音频处理器,图形处理器(Graphics Processing Unit,GPU),RAM(Random Access Memory,RAM),ROM(Read-Only Memory,ROM),用于输入/输出的第一接口至第n接口,通信总线(Bus)等中的至少一种。
用户可在显示器260上显示的图形用户界面(GUI)输入用户命令,则用户输入接口通过图形用户界面(GUI)接收用户输入命令。或者,用户可通过输入特定的声音或手势进行输入用户命令,则用户输入接口通过传感器识别出声音或手势,来接收用户输入命令。
“用户界面”,是应用程序或操作系统与用户之间进行交互和信息交换的介质接口,它实现信息的内部形式与用户可以接受形式之间的转换。用户界面常用的表现形式是图形用户界面(Graphic User Interface,GUI),是指采用图形方式显示的与计算机操作相关的用户界面。它可以是在电子设备的显示屏中显示的一个图标、窗口、控件等界面元素,其中控件可以包括图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、Widget等可视的界面元素。
图4示出了根据一些实施例的显示设备200中软件配置图。参见图4,在一些实施例中,将系统分为四层,从上至下分别为应用程序(Applications)层(简称“应用层”),应用程序框架(Application Framework)层(简称“框架层”),安卓运行时(Android runtime)和系统库层(简称“系统运行库层”),以及内核层。
在一些实施例中,应用程序层中运行有至少一个应用程序,这些应用程序可以是操作系统自带的窗口(Window)程序、系统设置程序或时钟程序等;也可以是第三方开发者所开发的应用程序。在具体实施时,应用程序层中的应用程序包不限于以上举例。
框架层为应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。应用程序框架层相当于一个处理中心,这个中心决定让应用层中的应用程序做出动作。应用程序通过API接口,可在执行中访问系统中的资源和取得系统的服务。
如图4所示,本申请一些实施例中应用程序框架层包括管理器(Managers)、提供器(Content Provider)、网络管理系统等,其中管理器包括以下模块中的至少一个:活动管理器(Activity Manager)用于和系统中正在运行的所有活动进行交互;位置管理器(Location Manager)用于给系统服务或应用提供系统位置服务的访问;文件包管理器(Package Manager)用于检索当前安装在设备上的应用程序包相关的各种信息;通知管理器(Notification Manager)用于控制通知消息的显示和清除;窗口管理器(Window Manager)用于管理用户界面上的图标、窗口、工具栏、壁纸和桌面部件。
在一些实施例中,显示设备中可配置不同的应用程序,以使显示设备提供不同的功能,丰富用户体验。例如,可在显示设备中配置直播课程序,以利用显示设备实现在线直播上课、远程会议、实时游戏直播、电子白板分享等应用场景。配置有直播课程序的显示设备可应用在任何教学机构,包括学校、补课机构等。在显示设备上进行教学直播时,可以提供给用户大屏体验,利用显示设备可以展示更多的教学内容以及教学直播过程中所需的操作功能等。
在一些实施例中,直播课程序除应用在显示设备端外,还可配置在手机端、Pad端和PC客户端等。
在一些实施例中,显示设备包括显示器和控制器,显示器用于呈现直播教学界面,控制器用于根据直播相关操作,实现教学直播。直播课程序包括第一程序和第二程序,第一程序为老师端显示设备中配置的直播课程序,在启动第一程序时,在老师端显示器中展示第一直播界面;第二程序为显示设备学生端中配置的直播课程序,在启动第二程序时,在学生端显示器中展示第二直播界面。
在线直播课堂上课时,需要模拟真实上课场景,老师的画面和老师在黑板上的板书内容均需被学生看到。但是,目前的直播课通常采用一个摄像头同时拍摄老师和黑板的画面来进行直播。如需在直播过程中切换其他画面,或者老师画面和黑板画面进行切换显示时,需由专门的工作人员手动挪动摄像头位置来解决,操作非常不灵活,影响直播课的进度和效果。
因此,为了在直播过程中,能够灵活切换画面,本申请实施例提供一种显示设备,其配置有两个摄像头,由第一摄像头采集老师画面,由第二摄像头采集黑板画面,老师画面和黑板画面分别显示在对应的播放窗口中。并且,在老师端显示设备中配置切换入口,以便老师可以动态切换两个播放窗口中的画面。此时,用户可自定义两个摄像头的拍摄参数,也可在直播过程中动态修改拍摄参数,以节省网络资源。
在一些实施例中,两个摄像头均可为显示设备内置的摄像头;也可其中一个为内置摄像头,另一个为外接摄像头;还可两个摄像头均为外接摄像头。
在一些实施例中,第一摄像头用于采集第一画面,第一画面可为老师画面,因此,第一摄像头朝向老师的面部。第二摄像头用于采集第二画面,第二画面可为黑板画面,即老师手动书写板书的画面,因此,第二摄像头朝向黑板或书桌等位置。
图5示出了根据一些实施例的老师端显示设备执行双摄像头直播方案的流程图;图6示出了根据一些实施例的双摄像头直播方案的时序图。根据本申请的一些实施例,老师端显示设备包括:显示器,被配置为显示用户界面;第一摄像头,被配置为采集第一画面;第二摄像头,被配置为采集第二画面;分别与显示器、第一摄像头和第二摄像头连接的控制器,在执行图5和图6所示的双摄像头直播方案时,控制器被配置为执行下述步骤:
S11、响应于触发用户界面中目标直播课控件产生的启动指令,显示第一直播界面,所述第一直播界面中包括上课按钮、辅流启动按钮、老师视频窗口和讲义区播放窗口中的至少一种。
老师端显示设备内配置第一程序(直播课程序),因此,在启动老师端显示设备后,在设备主页显示第一程序图标。用户通过手动或语音方式触发第一程序图标,在显示器中展示直播课首页。
直播课首页展示有至少一个直播课控件,每个直播课控件用于创建一个对应的直播间,进行老师和学生的在线直播课堂。在老师基于时间点触发目标直播课控件时,产生启动指令,实现基于目标直播课的在线上课,并在显示器中切换显示第一直播界面。
图7示出了根据一些实施例的显示设备200中直播课首页的主界面显示效果示意图。如图7所示,老师作为授课方,在登录直播课程序后,在主界面上显示有已创建的课程,并且还可以通过点击右下角的“创建直播课”控件创建新的直播课。
在一些实施例中,控制器响应于触发用户界面中目标直播课控件产生的启动指令,显示第一直播界面,被进一步配置为:
步骤111、响应于触发用户界面中目标直播课控件产生的启动指令,从服务器获取与目标直播课对应的音视频SDK标识和消息SDK标识,音视频SDK标识是指用于实现直播间音视频收发操作的音视频SDK的标识,消息SDK标识是指用于实现直播间聊天消息收发操作的消息SDK的标识。
步骤112、基于音视频SDK标识和消息SDK标识,创建与目标直播课对应的目标直播间,并显示用于呈现目标直播间的第一直播界面。
为实现在线直播上课,老师端和学生端均需进入同一个直播间,且需要进行老师端和学生端之间的音视频流和消息流的传输。因此,在老师端通过触发老师端显示设备中显示的目标直播课控件进入目标直播间时,携带目标直播课ID,先从服务器中获取对应的音视频SDK标识和消息SDK标识。SDK(Software Development Kit),为软件开发工具包。
音视频SDK标识即为音视频SDK房间号,消息SDK标识即为消息群组ID。调用SDK方法,携带音视频SDK标识发送至音视频服务器以创建对应的目标直播间。同时,携带消息SDK标识发送至消息服务器以创建目标直播间对应的群组。
老师端显示设备通过群组向每个参与上课的学生端显示设备发送消息,学生端通过群组向另一学生端或老师端发送消息,消息包括但不限于老师与学生的互动消息、学生间的互动消息等。在老师端显示设备向每个参与上课的学生端显示设备发送信令时,信令包括但不限于上课信令、下课信令、主视频流和辅视频流等。消息和信令在老师端和学生端传递时,均携带目标直播课对应的音视频SDK标识和消息SDK标识。也就是说,老师与学生之间,学生与学生之间的消息/信令传递均基于目标直播课对应的音视频SDK标识和消息SDK标识来实现,使得老师端产生的消息/信令可发送至处于同一直播间的各个学生端,以保证老师和学生处于同一直播间进行直播上课。
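作为对上述进入直播间流程的示意,下面给出一段JavaScript伪代码草图。其中 fetchLiveClassIds、rtcSdk.joinRoom、imSdk.joinGroup、renderFirstLiveInterface 等名称均为假设的示例接口,并非真实SDK方法,具体以实际使用的音视频SDK和消息SDK为准:

```javascript
// 示意:老师端进入目标直播课(接口名均为假设)
async function enterLiveClass(courseId, userId) {
  // 1. 携带目标直播课ID,从服务器获取音视频SDK标识和消息SDK标识
  const { rtcRoomId, imGroupId } = await fetchLiveClassIds(courseId, userId);

  // 2. 携带音视频SDK标识加入(或创建)对应的音视频直播间
  await rtcSdk.joinRoom({ roomId: rtcRoomId, userId });

  // 3. 携带消息SDK标识加入(或创建)对应的消息群组
  await imSdk.joinGroup({ groupId: imGroupId, userId });

  // 4. 渲染用于呈现目标直播间的第一直播界面
  renderFirstLiveInterface({ rtcRoomId, imGroupId });
}
```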
在一些实施例中,学生端显示设备也通过步骤111和步骤112的方法加入目标直播间。老师端和每个学生端均位于同一直播间中,因此,可进行直播上课。此时,老师端显示设备生成并展示第一直播界面,用于呈现目标直播间,学生端显示设备生成并展示第二直播界面,用于呈现目标直播间。
图8示出了根据一些实施例的老师端显示设备显示的第一直播界面的示意图。参见图8,为便于老师端显示设备显示不同的播放窗口,以及,实现动态切换播放窗口的效果,第一直播界面中包括控件栏、消息对话框、老师视频窗口和讲义区播放窗口中的至少一种。控件栏显示在第一直播界面的一侧,控件栏包括上课按钮、辅流启动按钮、课件控件、共享屏幕控件、学员控件等。上课按钮用于启动直播上课;辅流启动按钮用于启动辅流摄像头进行直播上课;课件控件用于触发显示课件/讲义等内容,包括文字内容或视频内容;共享屏幕控件用于向学生端分享屏幕显示内容;学员控件用于查看参与该直播课的所有学生信息。消息对话框用于实现消息发送,老师视频窗口用于在初始状态显示老师画面,讲义区播放窗口用于在初始状态显示讲义内容。
在开始上课前,老师视频窗口和讲义区播放窗口均显示空白内容;如果老师端在开始上课前点击课件控件,则在讲义区播放窗口显示课件内容。在老师端点击上课按钮后,且在点击上课按钮之前预先已调出课件内容时,则在讲义区播放窗口显示课件内容,在老师视频窗口显示老师画面。在老师端点击上课按钮后,且在点击上课按钮之前未预先调出课件内容时,则在讲义区播放窗口显示白板内容,在老师视频窗口显示老师画面。
图9示出了根据一些实施例的学生端显示设备显示的第二直播界面的示意图。参见图9,由于学生端的操作要少于老师端的操作,因此,第二直播界面的操作按钮要少于第一直播界面的操作按钮。第二直播界面包括消息对话框、老师视频窗口和讲义区播放窗口中的至少一种。
第二直播界面中的老师视频窗口和讲义区播放窗口展示样式和呈现内容可与第一直播界面的呈现方式相对应。
S12、响应于触发上课按钮产生的开始直播指令,在讲义区播放窗口显示白板内容,启动并获取第一摄像头采集的第一画面,将第一画面显示在老师视频窗口,以及,将第一画面作为主视频流通过服务器发送至学生端显示设备进行显示。
在老师和各个学生均进入同一直播间后,老师端即可操作直播上课。此时,老师触发第一直播界面中的上课按钮,产生开始直播指令,即上课信令,并通过消息服务器发送至学生端显示设备,使得学生端进入准备上课状态。
在老师端触发上课按钮后进入准备上课状态,此时,开启用于采集老师画面的第一摄像头,将老师画面作为第一画面显示在第一直播界面中的老师视频窗口中。如果老师点击上课按钮之前,未预先调出课件内容,则在第一直播界面中的讲义区播放窗口中显示白板内容,如图8所示。
同理,在刚进入直播间时,直播间为暂停状态,即还未进行直播上课,因此,学生端显示直播间暂停状态图(未上课遮罩)。在老师端开启第一摄像头,学生端跟随老师端进入准备上课状态,学生端收到老师端发送的上课信令时隐藏暂停直播状态图,且在第二直播界面中的老师视频窗口显示老师画面,在第二直播界面中的讲义区播放窗口中显示白板内容。
在一些实施例中,如果老师端开始进行内容讲解,那么可触发课件控件以选取目标讲义内容显示在讲义区播放窗口中。讲义内容可在老师端点击上课按钮之前即调出显示,也可在老师端点击上课按钮后再调出显示。
图10示出了根据一些实施例的第一直播界面中讲义区播放窗口显示讲义内容的示意图。参见图10,在开始直播之后,如果进行讲义内容展示,则老师触发课件控件,获 取对应的讲义内容,并显示在讲义区播放窗口。此时,老师端显示设备通过音视频服务器向学生端显示设备发送信令,使得第二直播界面中的讲义区播放窗口同步显示讲义内容。同时,还可在讲义区播放窗口的底部显示课件预览区,课件预览区用于以缩略图的形式显示讲义内容的每一页面的内容。课件预览区仅显示在老师端的第一直播界面中,并不显示在学生端的第二直播界面中。
在老师端未调出讲义内容时,讲义区播放窗口可在不对第一直播界面中其他布局内容造成遮挡的前提下最大限度占据第一直播界面的空间;在老师端调出讲义内容后,为便于显示课件预览区,可将讲义区播放窗口的占据空间从纵向方向上减少,即展示课件预览区时的讲义区播放窗口小于未展示课件预览区时的讲义区播放窗口。
在一些实施例中,在开始进行直播上课时,第一直播界面和第二直播界面中的老师视频窗口中显示老师画面(即第一画面),讲义区播放窗口中显示白板内容(未预先调出课件时),如图8所示。在基于展示内容进行直播上课时,第一直播界面和第二直播界面中的老师视频窗口中显示老师画面(即第一画面),讲义区播放窗口中显示讲义内容,如图10所示。
在一些实施例中,由于老师端显示设备连接有两个摄像头,那么在直播上课时,需确定哪一个作为拍摄老师画面的主流摄像头,哪个作为拍摄其他教学位置,如黑板画面的辅流摄像头。因此,控制器在执行响应于触发上课按钮产生的开始直播指令,启动并获取第一摄像头采集的第一画面,将第一画面显示在老师视频窗口,被进一步配置为:
步骤121、响应于触发上课按钮产生的开始直播指令,选取第一摄像头作为用于拍摄第一画面的主流摄像头,以及,设置主流摄像头的主流拍摄参数。
步骤122、启动主流摄像头,以及,获取主流摄像头基于主流拍摄参数采集的第一画面,将第一画面作为主视频流显示在老师视频窗口中。
老师点击上课按钮开始直播的同时,调用摄像头ID设置方法,设置当前默认主流摄像头。在选择主流摄像头时,可从摄像头列表中获取第一个摄像头作为主流摄像头;或者,由用户自定义选取一个作为主流摄像头。
为主流摄像头配置主流ID,同时,调用摄像头拍摄参数设置方法,为主流摄像头配置主流拍摄参数。主流拍摄参数包括但不限于分辨率、宽高尺寸等。主流分辨率可获取系统默认参数,示例性的,主流分辨率为480P。
在选出主流摄像头后,基于主流ID调用音视频sdk方法打开主流摄像头。主流摄像头基于主流拍摄参数实时采集第一画面。而后,调用渲染主流方法,在老师视频窗口渲染老师画面。
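主流摄像头的选取、参数设置与启动过程可参考下面的示意性JavaScript草图,其中 rtcSdk 的各方法名均为假设的示例接口,480P分辨率仅沿用上文的示例取值:

```javascript
// 示意:响应上课按钮,启动主流摄像头并渲染、推送老师画面(接口名均为假设)
async function startMainStream(teacherVideoWindow) {
  // 从摄像头列表中取第一个摄像头作为主流摄像头,也可由用户自定义选取
  const cameras = await rtcSdk.getCameraList();
  const mainCameraId = cameras[0].deviceId;

  // 设置主流拍摄参数,例如分辨率480P(系统默认参数,仅为示例)
  rtcSdk.setCaptureParams(mainCameraId, { resolution: '480p' });

  // 基于主流ID打开主流摄像头,并将第一画面作为主视频流渲染在老师视频窗口
  await rtcSdk.openCamera(mainCameraId);
  rtcSdk.renderLocalStream(mainCameraId, teacherVideoWindow);

  // 将第一画面作为主视频流推送到音视频服务器
  await rtcSdk.publishStream(mainCameraId, { streamType: 'main' });
}
```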
在一些实施例中,主流为开始直播后推送的第一路视频流,因此,将第一摄像头(主流摄像头)采集的第一画面作为主视频流。
因此,在老师端的第一摄像头采集到第一画面后,老师端显示设备将采集的第一画面作为主视频流推送到音视频服务器。音视频服务器接收到老师端发送的主视频流后,发送主流连接的视频信令至各个学生端显示设备。
音视频服务器在生成主流连接的视频信令时,为主视频流配置主视频流标识,而后基于老师ID、视频流ID、主视频流标识、主视频流层级、可用标识、音视频SDK标识和消息SDK标识等生成主流连接的视频信令。老师ID用于表征老师,视频流ID用于表征主视频流,主视频流标识用于表征主视频流类型,可用标识用于表征主视频流可被传 输至学生端。
学生端显示设备在接收到服务器发送的视频信令后,判断视频流标识为主视频流标识,则在老师视频组件中调用音视频订阅视频流方法,向服务器订阅该视频流。服务器收到此订阅消息后,向学生端显示设备发送主视频流,并在订阅组件中进行渲染,即将第一画面作为主视频流显示在学生端显示设备提供的第二直播界面的老师视频窗口中。
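主流连接的视频信令可以理解为一条携带若干字段的结构化消息,下面以示意性的JavaScript对象及学生端订阅逻辑加以说明,字段名与 subscribeStream 方法均为假设:

```javascript
// 示意:主流连接的视频信令(字段名均为假设)
const mainStreamSignal = {
  teacherId: 'teacher-001',    // 老师ID
  streamId: 'stream-main-001', // 视频流ID
  streamType: 'main',          // 主视频流标识
  streamLevel: 0,              // 主视频流层级
  available: true,             // 可用标识:可被传输至学生端
  rtcRoomId: 'room-123',       // 音视频SDK标识
  imGroupId: 'group-123',      // 消息SDK标识
};

// 示意:学生端收到视频信令后,按视频流标识订阅并在老师视频窗口渲染
function onVideoSignal(signal, teacherVideoWindow) {
  if (signal.streamType === 'main' && signal.available) {
    rtcSdk.subscribeStream(signal.streamId, teacherVideoWindow);
  }
}
```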
S13、响应于触发辅流启动按钮产生的双摄像头直播指令,启动并获取第二摄像头采集的第二画面,将第二画面显示在讲义区播放窗口,以及,将第二画面作为辅视频流通过服务器发送至学生端显示设备进行显示。
在直播上课过程中,如果老师端想要推送第二路视频流,则需要通过双摄像头实现,第二路视频流为辅流。此时,老师触发第一直播界面中的辅流启动按钮,产生双摄像头直播指令,以启动第二摄像头进行第二画面的采集,第二画面显示在讲义区播放窗口,第二摄像头为辅流摄像头。
由于在直播上课时,讲义区播放窗口可能显示白板内容,也可显示讲义内容。因此,在讲义区播放窗口显示白板内容时,则在老师触发辅流启动按钮时,可直接启动第二摄像头进行第二画面的采集。而如果讲义区播放窗口展示讲义内容,则在老师触发辅流启动按钮时,先将讲义内容停止显示,而后再启动第二摄像头进行第二画面的采集,以便由第二画面替换讲义内容展示在讲义区播放窗口。
在一些实施例中,由于第二摄像头采集的第二画面显示在讲义区播放窗口,为便于用户能够更加清晰地观看第二画面,因此,第二摄像头的拍摄参数可与第一摄像头的拍摄参数不同。示例性的,第二摄像头的拍摄参数优于第一摄像头的拍摄参数。且为便于第二摄像头的拍摄参数可实时修改,可通过在第一直播界面中弹出辅流设置弹窗的方式,实现对第二摄像头的拍摄参数的设置。
因此,控制器在执行响应于触发辅流启动按钮产生的双摄像头直播指令,启动并获取第二摄像头采集的第二画面,将第二画面显示在讲义区播放窗口,被进一步配置为执行下述步骤:
步骤131、响应于触发辅流启动按钮产生的双摄像头直播指令,在第一直播界面中显示包括参数选择框和确定按钮的辅流设置弹窗,参数选择框用于设置辅流摄像头的辅流拍摄参数。
步骤132、基于参数选择框选取第二摄像头作为用于拍摄第二画面的辅流摄像头,以及,设置辅流摄像头的辅流拍摄参数后,响应于触发确定按钮产生的指令,启动辅流摄像头。
步骤133、获取辅流摄像头基于辅流拍摄参数采集的第二画面,将第二画面作为辅视频流显示在讲义区播放窗口。
在老师触发第一直播界面中的辅流启动按钮后,即可开启第二摄像头。此时,需先设定第二摄像头的拍摄参数,因此,在触发辅流启动按钮后,即可在第一直播界面中弹出辅流设置弹窗。
图11示出了根据一些实施例的在第一直播界面中显示辅流设置弹窗的示意图。参见图11,辅流设置弹窗包括参数选择框和确定按钮。参数选择框用于设置辅流摄像头的辅流拍摄参数,如摄像头、分辨率、镜像等。
老师基于参数选择框选取第二摄像头作为用于拍摄第二画面的辅流摄像头,为辅流 摄像头配置辅流ID。在选择第二摄像头时,可从摄像头列表中获取第二个摄像头作为辅流摄像头;或者,由用户自定义选取一个作为辅流摄像头。分辨率可选取高分辨率参数,如480P、720P、1080P等;镜像可选择是否开启或关闭。辅流设置弹窗中还可包括摄像头画面预览区,用于预览第二摄像头所采集的画面。
在基于参数选择框选择辅流摄像头和设置辅流摄像头的辅流拍摄参数后,老师触发辅流设置弹窗中的确定按钮,调用摄像头拍摄参数设置方法完成对辅流摄像头的辅流拍摄参数的设定;然后基于辅流ID调用音视频SDK方法开启辅流摄像头。辅流摄像头基于辅流拍摄参数采集的第二画面,而后调用渲染辅流方法,将第二画面作为辅视频流显示在讲义区播放窗口。
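辅流摄像头的参数设定与启动可参考下面的示意性JavaScript草图,其中 rtcSdk 的各方法名均为假设,分辨率、镜像等取值来自辅流设置弹窗:

```javascript
// 示意:确认辅流设置弹窗后,启动辅流摄像头并推送辅视频流(接口名均为假设)
async function startSubStream(settings, lectureAreaWindow) {
  // settings 来自参数选择框:辅流摄像头、分辨率、镜像等
  const { cameraId, resolution = '720p', mirror = false } = settings;

  // 调用摄像头拍摄参数设置方法,完成辅流拍摄参数的设定
  rtcSdk.setCaptureParams(cameraId, { resolution, mirror });

  // 基于辅流ID开启辅流摄像头,并将第二画面作为辅视频流渲染在讲义区播放窗口
  await rtcSdk.openCamera(cameraId);
  rtcSdk.renderLocalStream(cameraId, lectureAreaWindow);

  // 将第二画面作为辅视频流推送到音视频服务器
  await rtcSdk.publishStream(cameraId, { streamType: 'sub' });
}
```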
在一些实施例中,在老师端启动第二摄像头时,第二摄像头采集的画面与讲义内容显示在同一讲义区播放窗口,因此,此时存在讲义区播放窗口切换内容显示的场景。在启动第二摄像头使得讲义区播放窗口中的展示内容由讲义内容切换为第二画面时,由于讲义区播放窗口不再显示讲义内容,那么课件预览区也无需显示,此时,可将讲义区播放窗口的占据空间增大,以不对第一直播界面中其他布局内容造成遮挡的前提下最大限度占据第一直播界面的空间。此时,在启动第二摄像头前后,讲义区播放窗口的尺寸由小变大。
在一些实施例中,在启动第二摄像头使得讲义区播放窗口中的展示内容由讲义内容切换为第二画面时,为便于老师端再次切换回讲义内容进行显示,在第一直播界面中的讲义区播放窗口的下方显示tab区,tab区显示讲义tab,讲义tab用于触发讲义内容的显示。讲义内容包括文本讲义或视频讲义,因此,讲义tab还包括文本讲义tab和视频讲义tab。
图12示出了根据一些实施例的在讲义区播放窗口显示第二画面的示意图。参见图12,在一些实施例中,第二画面为黑板画面,如黑板画面为老师在手绘兔子图案。为便于学生能看清楚黑板画面,即老师手动书写的教学内容,将黑板画面显示在讲义区播放窗口中。
在一些实施例中,双摄像头直播场景可由老师自定义开启或关闭,那么第二摄像头采集的画面并非在初始触发上课按钮时即采集并显示,因此,可将第二摄像头采集的第二画面作为辅视频流进行展示。
在一些实施例中,如果老师首次启动双摄功能,则在第一直播界面中显示包括参数选择框和确定按钮的辅流设置弹窗,以设置辅流摄像头的辅流拍摄参数。而如果后续使用时,老师触发辅流启动按钮,可直接按照以前设置的参数启动第二摄像头,而不显示辅流设置弹窗。如果用户需要对第二摄像头的辅流拍摄参数进行修改,则可通过触发第一直播界面中的设置按钮,再次调出辅流设置弹窗进行重新设置。
在老师端的第二摄像头采集到第二画面后,老师端显示设备将采集的第二画面作为辅视频流推送到音视频服务器。音视频服务器接收到老师端发送的主视频流后,发送辅流连接的视频信令至各个学生端显示设备。
音视频服务器在生成辅流连接的视频信令时,为辅视频流配置辅视频流标识,而后基于老师ID、视频流ID、辅视频流标识、辅视频流层级、可用标识、音视频SDK标识和消息SDK标识等生成辅流连接的视频信令。视频流ID用于表征辅视频流,辅视频流标识用于表征辅视频流类型,可用标识用于表征辅视频流可被传输至学生端。
学生端显示设备在接收到服务器发送的视频信令后,判断视频流标识为辅视频流标识,则在老师讲义区辅流视频组件中调用音视频订阅视频流方法,向服务器订阅该视频流。服务器收到此订阅消息后,向学生端显示设备发送辅视频流,并在订阅组件中进行渲染,即将第二画面作为辅视频流显示在学生端显示设备提供的第二直播界面的讲义区播放窗口中。在此过程中,主视频流仍处于传递状态,即学生端显示设备仍在显示主视频流。
在一些实施例中,在老师端启动双摄功能时,同时存在主视频流和辅视频流,主视频流和辅视频流的分辨率可能不同。造成主辅视频流分辨率不同的原因可为老师端手动设置的、老师端向服务器上传某一视频流因网络原因自动降级(主/辅视频流层级)导致的,或者由目标直播课对应的直播间自动设置的。
在一些实施例中,在老师端产生主视频流和辅视频流时,主视频流中携带主视频流标识,辅视频流中携带辅视频流标识,以进行区分。无论是在老师端还是在学生端,老师视频窗口基于主视频流标识显示对应内容,讲义区播放窗口基于辅视频流标识显示对应内容。
在一些实施例中,由于老师视频窗口显示老师画面,讲义区播放窗口显示讲义内容或黑板画面。因此,为保证学生和老师都能清晰地观看教学内容,可设定讲义区播放窗口的尺寸大于老师视频窗口,呈现大小屏的直播效果。
可见,该老师端显示设备在进行双摄像头直播时,可自行开启两个摄像头分别采集不同的教学画面,一个摄像头采集老师画面,一个摄像头采集黑板画面。老师画面显示在直播界面的老师视频窗口(老师端和学生端),黑板画面显示在直播界面的讲义区播放窗口(老师端和学生端)。黑板画面与讲义内容共用一个播放窗口,以增大显示效果。且可为不同的摄像头设置不同的拍摄参数,以便节省网络资源。
再次参见图12,在一些实施例中,在老师端开启双摄像头直播时,第一直播界面的tab区增加显示辅流tab。如果辅流tab被选中,说明第二摄像头采集的第二画面被显示在讲义区播放窗口中。
在一些实施例中,第一直播界面在生成并显示时可同步在tab区显示辅流tab。在老师端未开启双摄像头直播时,如果老师点击辅流tab,则讲义区播放窗口切换显示的内容为空白画面;在老师端开启双摄像头直播后,如果老师点击辅流tab,则讲义区播放窗口切换显示第二摄像头采集的第二画面。
那么,在不需要进行双摄像头直播时,控制器被进一步配置为:响应于在tab区中将辅流tab切换为讲义tab的操作,关闭第二摄像头,停止通过服务器向学生端显示设备发送辅视频流,并且,讲义内容切换显示在讲义区播放窗口中。
在老师端无需向学生展示其手动书写的教学内容时,老师可通过切换tab的方式实现讲义区播放窗口中展示内容的切换。那么,在老师基于tab区将辅流tab切换为讲义tab时,即讲义tab被选中时,则黑板画面无需继续呈现,此时可关闭第二摄像头。
第二摄像头被关闭后,辅视频流不再生成,因此,基于辅流ID调用音视频sdk方法停止辅流推送,并移除辅流视频控件。服务器收到此方法调用时,发送辅流端口信令至学生端显示设备。辅流端口信令包括老师ID、视频流ID、辅视频流标识、不可用标识、音视频SDK标识和消息SDK标识等。辅视频流标识用于表征辅视频流类型,不可用标识用于表征辅视频流不可被传输至学生端。
学生端显示设备在接收到服务器发送的视频信令后,判断视频流标识为辅视频流标识,且基于不可用标识和辅视频流ID移除辅视频流的订阅,并移除对应辅流组件,即关闭第二摄像头,并在讲义区播放窗口显示原讲义内容。
在一些实施例中,在执行tab切换后,第二摄像头被关闭,tab区中的辅流tab被取消显示。
在一些实施例中,在执行tab切换时,第二摄像头还可仍处于开启状态,仅停止通过服务器向学生端显示设备发送辅视频流,并且在讲义区播放窗口切换显示讲义内容。此时,tab区中的辅流tab仍进行显示。若老师还需展示第二摄像头采集的黑板画面,则再将辅流tab选中,以再将第二摄像头当前采集的黑板画面同步显示在老师端和学生端的讲义区播放窗口中。
在一些实施例中,在直播上课过程中,老师可对第二摄像头的辅流拍摄参数进行修改。该方案可发生在作为辅流摄像头的第二摄像头出现故障无法使用、需更换第三摄像头作为辅流摄像头继续采集黑板画面的场景,还可发生在修改镜像参数的场景,还可发生在视频卡顿时降低分辨率的场景,还可发生在视频传输流畅时增加分辨率的场景等。
为此,在第一直播界面的tab区中增加设置按钮。在需要对辅流拍摄参数进行修改时,设置按钮被选中,此时控制器被进一步配置为执行下述步骤:
步骤141、响应于触发第一直播界面中设置按钮产生的指令,在第一直播界面中弹出辅流设置弹窗,以及,停止向学生端显示设备发送辅视频流。
步骤142、在基于辅流设置弹窗完成对辅流拍摄参数的修改时,取消辅流设置弹窗的显示,以及,继续通过服务器向学生端显示设备发送修改后的辅流拍摄参数对应的辅视频流。
再次参见图11,为便于老师能够实时对辅流摄像头的辅流拍摄参数进行修改,可在老师端点击辅流启动按钮时,同步在第一直播界面中显示设置按钮。老师触发设置按钮,重新在第一直播界面中弹出辅流设置弹窗,图11中的讲义区播放窗口中和设置弹窗摄像头画面预览框中均显示第二画面,如老师手绘的兔子图案。
在一些实施例中,在触发设置按钮时,可先停止辅视频流向学生端的推流,学生端的讲义区播放窗口显示白板内容,具体实现方式可参照前述内容,此处不进行赘述。
在一些实施例中,在触发设置按钮时,也可不停止辅视频流向学生端的推流,而是按照之前的参数通过服务器向学生端发送辅视频流,此时学生端的讲义区播放窗口依然显示原辅流拍摄参数对应的辅视频流。而在老师端重新对辅视频流的拍摄参数修改后,再基于修改后的新辅流拍摄参数将采集的辅视频流通过服务器发送至学生端进行显示,学生端的讲义区播放窗口继续显示新辅流拍摄参数对应的辅视频流。
在老师基于辅流设置弹窗完成对任一辅流拍摄参数的修改后,触发辅流设置弹窗中的确定按钮,取消辅流设置弹窗的显示。同时,辅流摄像头基于修改后的辅流拍摄参数继续采集第二画面,而后,继续通过服务器向学生端显示设备发送新采集的辅视频流。可见,通过手动降低当前推流分辨率方式来保证学生端的视频流畅度。辅视频流的推流过程可参照前述内容,此处不进行赘述。
在一些实施例中,第一摄像头和第二摄像头的拍摄参数不同,使得主视频流和辅视频流的分辨率不同,进而老师视频窗口和讲义区播放窗口的显示分辨率不同。由于讲义区播放窗口为大尺寸显示窗口,老师视频窗口为小尺寸显示窗口,因此,第二摄像头的 拍摄参数优于第一摄像头的拍摄参数,进而讲义区播放窗口的显示分辨率优于老师视频窗口的显示分辨率。两个播放窗口采用不同的分辨率,可减少主视频流所占用的网络资源。
在一些实施例中,第一摄像头和第二摄像头的拍摄参数相同,使得主视频流和辅视频流的分辨率相同,进而老师视频窗口和讲义区播放窗口的显示分辨率相同。
在一些实施例中,第一摄像头采集主视频流数据并显示在老师视频窗口,第二摄像头采集辅视频流数据并显示在讲义区播放窗口。那么基于实际的课堂布置情况,老师还可将黑板布置在老师所在位置的后方,即第一摄像头在采集老师画面的同时,可同步采集到黑板画面。
那么在一种场景中,老师在书桌上布置一个手动书写面板,在老师的后方布置一个黑板。此时,第一摄像头朝向老师,第二摄像头朝向书桌。在双摄像头直播上课时,第一摄像头采集老师画面,第二摄像头采集书桌上的书写面板。即老师视频窗口显示老师画面,讲义区播放窗口显示书写面板中的书写内容。
如果老师需要在黑板上进行书写,以增加书写面积,则由第一摄像头采集黑板画面显示在老师视频窗口。而如果老师视频窗口的尺寸小于讲义区播放窗口的尺寸,因此,为便于学生能够清晰地观看到老师在黑板中的书写内容,则可将显示在老师视频窗口的黑板画面切换至讲义区播放窗口中进行展示,而老师视频窗口显示第二摄像头采集的书写面板内容。
那么在一种场景中,如果需要学生观看到老师的手部教学动作等情况,也可将第一摄像头采集的第一画面切换显示在尺寸较大的讲义区播放窗口中。
因此,为实现大小屏播放窗口的切换,再次参见图12,可在第一直播界面的tab区中增加切换按钮。老师端点击辅流启动按钮时,同步在第一直播界面中显示切换按钮,切换按钮用于实现第一摄像头和第二摄像头采集画面的切换显示,即主辅视频流的切换显示。此时,控制器被进一步配置为执行下述步骤:
步骤151、在第一直播界面中的老师视频窗口显示第一画面、讲义区播放窗口显示第二画面时,接收对切换按钮的触发,将第一画面作为辅视频流显示在讲义区播放窗口,以及,将第二画面作为主视频流显示在老师视频窗口。
步骤152、向服务器发送切换指令,切换指令用于指示服务器将第一画面的视频流标识变更为辅视频流标识,将第二画面的视频流标识变更为主视频流标识。
在两个摄像头均被开启的情况下,可灵活地切换两个播放窗口的画面。由此,在第一直播界面中的老师视频窗口显示第一画面、讲义区播放窗口显示第二画面时,老师可触发切换按钮,调用主辅流切换方法,向服务器发送切换指令。
图13示出了根据一些实施例的在第一直播界面中显示切换后主辅视频流的示意图。参见图13,在老师触发切换按钮后,在老师端显示设备中,第一直播界面的老师视频窗口基于主视频流标识显示第二画面(如手绘兔子图案),讲义区播放窗口基于辅视频流标识显示第一画面(如老师画面),进而完成主辅视频流的切换显示。
为实现主辅视频流数据的切换显示,由服务器响应切换指令对主视频流和辅视频流的视频流标识进行切换,即将第一画面的主视频流标识变更为辅视频流标识,将第二画面的辅视频流标识变更为主视频流标识。
在一些实施例中,老师端显示设备给服务器发的切换指令,和老师端上传切换后的 主辅视频流数据,可采用异步发送的方式,以留有时间完成主辅视频流标识的变更。
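主辅视频流的切换可参考下面的示意性JavaScript草图:老师端先在本地交换两个窗口的渲染内容,再向服务器异步发送切换指令;其中 rtcSdk.renderLocalStream、sendSwitchCommand 等名称均为假设:

```javascript
// 示意:老师端触发切换按钮,交换主辅视频流的显示窗口并通知服务器(接口名均为假设)
function switchMainAndSub(mainCameraId, subCameraId, teacherWin, lectureWin) {
  // 本地交换渲染:第二画面显示在老师视频窗口,第一画面显示在讲义区播放窗口
  rtcSdk.renderLocalStream(subCameraId, teacherWin);
  rtcSdk.renderLocalStream(mainCameraId, lectureWin);

  // 向服务器发送切换指令:指示将第一画面变更为辅视频流标识、第二画面变更为主视频流标识;
  // 切换指令与切换后的主辅视频流数据可采用异步发送,为服务器完成标识变更留出时间
  sendSwitchCommand({ newMainStream: subCameraId, newSubStream: mainCameraId });
}
```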
在一些实施例中,在服务器进行视频流标识的切换时,同步生成切换通知,发送至学生端显示设备。远端学生端会先收到主辅流断开的回调,即停止接收服务器发送的目标视频流,并在老师视频窗口和讲义区播放窗口均显示白板内容。
白板内容为空白内容,不进行内容展示。在老师端产生切换指令时,老师端取消第一直播界面中的老师视频窗口和讲义区播放窗口的内容展示,切换展示空白内容;同时通过服务器发送切换通知至学生端,而后学生端显示设备响应切换通知,也取消第二直播界面中的老师视频窗口和讲义区播放窗口的内容展示,切换展示空白内容。
在服务器完成视频流标识的切换后,服务器继续向学生端发送主视频流和辅视频流,即在学生端在收到切换后的主辅流连接回调,重新调用订阅方法在老师视频窗口和讲义播放窗口渲染切换后的主辅流。此时,在学生端显示设备中,第二直播界面中的老师视频窗口基于主视频流标识显示第二画面,讲义区播放窗口基于辅视频流标识显示第一画面,进而完成主辅视频流的切换显示。
由于老师触发切换按钮后会控制学生端显示设备先展示白板内容再展示切换后的主辅视频流,但是由于切换操作为瞬时完成的操作,因此,从视觉效果上,学生端显示设备呈现“闪一下”的效果后即显示播放窗口切换后的内容。
在一些实施例中,为减少“闪一下”的视觉效果,可降低服务器在变更主辅视频流标识时的耗时,尽量使得老师端显示设备给服务器发的切换指令,和老师端上传切换后的主辅视频流数据,达成同步发送的方式。或者,在老师端触发大小窗口切换显示时,在老师端和学生端的老师视频窗口和讲义区播放窗口不再显示空白的白板内容,而是替换显示讲义内容。此时,讲义内容为老师启动双摄功能之前所展示的页面。
在一些实施例中,在实现灵活切换两个播放窗口的画面时,还可应用在一个摄像头被开启的情况下,此时老师视频窗口显示第一画面,讲义区播放窗口显示讲义内容。老师在触发切换按钮后,将讲义内容显示在老师视频窗口,将第一画面显示在讲义区播放窗口。
可见,该老师端显示设备在进行双摄像头直播时,可自行开启两个摄像头分别采集不同的教学画面,一个摄像头采集老师画面,一个摄像头采集黑板画面。学生端可以同时基于两个播放窗口观看老师直播上课,老师端可对两个摄像头采集的画面在两个播放窗口间进行灵活动态切换,操作简单便利,提高用户的上课体验。
图14示出了根据一些实施例的学生端显示设备执行双摄像头直播方法的流程图。参见图14,本申请实施例提供的一种显示设备,应用于学生端显示设备,包括:显示器,被配置为显示用户界面;与显示器连接的控制器,在执行图14所述的双摄像头直播方法时,控制器被配置为执行下述步骤:
S21、响应于触发用户界面中目标直播课控件产生的启动指令,显示第二直播界面,第二直播界面中包括老师视频窗口和讲义区播放窗口中的至少一种。
S22、接收老师端显示设备通过服务器发送的目标视频流,获取目标视频流携带的视频流标识,目标视频流是指老师端显示设备在开始直播后获取的第一摄像头采集的第一画面和/或第二摄像头采集的第二画面中的至少一种。
S23、如果视频流标识表征目标视频流为主视频流,则将主视频流显示在老师视频窗口。
S24、如果视频流标识表征目标视频流包括主视频流和辅视频流,则将主视频流显示在老师视频窗口,将辅视频流显示在讲义区播放窗口。
学生端显示设备内配置第二程序(直播课程序),因此,在启动学生端显示设备后,在设备主页显示第二程序图标。用户通过手动或语音方式触发第二程序图标,生成直播课首页展示在显示器中。
直播课首页展示有至少一个直播课控件,每个直播课控件用于进入一个对应的直播间,进行老师和学生的在线直播课堂。在学生基于时间点触发目标直播课控件时,产生启动指令,实现基于目标直播课的在线上课,并在显示器中切换显示第二直播界面。
学生端在展示第二直播界面的过程可参照老师端展示第一直播界面的过程,学生端从服务器获取与目标直播课对应的音视频SDK标识和消息SDK标识,与老师端进入同一个目标直播课对应的目标直播间,以在学生端显示用于呈现目标直播间的第二直播界面。具体实现过程可参照前述内容,此处不进行赘述。
在老师端开始直播上课后,老师端显示设备通过服务器向学生端显示设备发送目标视频流。在开始上课时,目标视频流为第一摄像头采集的第一画面,为主视频流,即仅主视频流存在。在进行双摄像头直播时,目标视频流为第一摄像头采集的第一画面和第二摄像头采集的第二画面,第二摄像头采集的第二画面为辅视频流,即主视频流和辅视频流同时存在。老师端显示设备通过服务器向学生端显示设备发送目标视频流可参照前述实施例提供的老师端显示设备的执行过程,此处不赘述。
由于主视频流和辅视频流在第二直播界面中所展示的窗口不一致,因此,需从目标视频流中获取视频流标识。如果视频流标识表征目标视频流为主视频流,则将主视频流显示在老师视频窗口。如果视频流标识表征目标视频流包括主视频流和辅视频流,则将主视频流显示在老师视频窗口,将辅视频流显示在讲义区播放窗口。
此时存在至少两种实现场景,一种是在开始直播上课时,老师端仅开启第一摄像头还未开启第二摄像头,则目标视频流为作为主视频流的第一画面。因此,将第一画面显示在老师视频窗口,在讲义区播放窗口显示白板内容。
另一种是在进行双摄像头直播时,老师端开启第一摄像头和第二摄像头,则目标视频流包括作为主视频流的第一画面和作为辅视频流的第二画面。因此,将作为主视频流的第一画面显示在老师视频窗口,将作为辅视频流的第二画面显示在讲义区播放窗口。
如果在老师端开启双摄像头直播之前,老师端触发讲义内容显示在讲义区播放窗口,那么在开启双摄像头直播后,学生端需先将讲义内容停止显示,再显示作为辅视频流的第二画面。
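学生端依据视频流标识决定渲染窗口的逻辑可参考下面的示意性JavaScript草图,其中 rtcSdk.subscribeStream、hideHandoutContent 等名称均为假设:

```javascript
// 示意:学生端根据视频流标识,将目标视频流渲染到对应窗口(接口名均为假设)
function renderTargetStream(signal, teacherWin, lectureWin) {
  if (signal.streamType === 'main') {
    // 主视频流显示在老师视频窗口
    rtcSdk.subscribeStream(signal.streamId, teacherWin);
  } else if (signal.streamType === 'sub') {
    // 显示辅视频流前,若讲义区播放窗口正在显示讲义内容,则先停止讲义显示
    hideHandoutContent(lectureWin);
    // 辅视频流显示在讲义区播放窗口
    rtcSdk.subscribeStream(signal.streamId, lectureWin);
  }
}
```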
在一些实施例中,在老师端实现对两个摄像头采集画面的切换显示时,学生端同步进行切换显示。此时,控制器被进一步配置为:
步骤251、如果老师视频窗口显示作为主视频流的第一画面、讲义区播放窗口显示作为辅视频流的第二画面,则在老师端显示设备进行摄像头采集画面切换显示时,接收服务器发送的切换通知,切换通知是指服务器在响应老师端显示设备在触发切换按钮产生的切换指令后生成的通知,切换指令用于指示服务器将第一画面的视频流标识变更为辅视频流标识,将第二画面的视频流标识变更为主视频流标识。
步骤252、响应于切换通知,停止接收服务器发送的目标视频流,以及,在老师视频窗口和讲义区播放窗口均显示白板内容。
步骤253、在完成摄像头采集画面切换显示时,继续接收服务器发送的携带辅视频流标识的第一画面和携带主视频流标识的第二画面。
步骤254、基于辅视频流标识将第一画面作为辅视频流显示在讲义区播放窗口,以及,基于主视频流标识将第二画面作为主视频流显示在老师视频窗口。
在老师端基于第一直播界面中的切换按钮控制两个摄像头采集的画面切换至对方的播放窗口进行显示时,老师端显示设备向服务器发送切换指令,以指示将第一画面的视频流标识变更为辅视频流标识,将第二画面的视频流标识变更为主视频流标识。同时,服务器产生切换通知发送给学生端显示设备。
学生端显示设备接收并响应切换通知,先停止两个播放窗口中的画面显示,即在老师视频窗口和讲义区播放窗口均显示白板内容。
图15示出了根据一些实施例的在第二直播界面中显示切换后主辅视频流的示意图。参见图15,在服务器完成主辅视频流标识的切换后,再继续接收服务器发送的携带辅视频流标识的第一画面和携带主视频流标识的第二画面,以基于辅视频流标识将第一画面作为辅视频流显示在讲义区播放窗口,以及,基于主视频流标识将第二画面作为主视频流显示在老师视频窗口。
学生端显示设备在切换两个摄像头采集画面的显示具体实现过程可参照前述实施例的内容,此处不进行赘述。
在一些实施例中,老师端显示设备的第一直播界面中只在老师视频窗口展示老师画面时,学生端显示设备的第二直播界面中也只在老师视频窗口展示老师画面。在老师端显示设备的第一直播界面中在老师视频窗口展示老师画面以及在讲义区播放窗口显示黑板画面时,学生端显示设备的第二直播界面中也在老师视频窗口展示老师画面以及在讲义区播放窗口显示黑板画面。
同理,在老师端显示设备通过切换画面进行大小屏切换显示时,学生端显示设备也同步进行大小屏切换显示。虽然老师触发切换按钮后会控制学生端显示设备先展示白板内容再展示切换后的主辅视频流,但是由于切换操作为瞬时完成的操作,因此,从视觉效果上,学生端显示设备呈现“闪一下”的效果后即显示播放窗口切换后的内容。
可见,该学生端显示设备在进行双摄像头直播时,可通过老师端显示设备自行开启的两个摄像头分别采集不同的教学画面,以使学生端可以同时基于两个播放窗口观看老师直播上课。并且,在老师端对两个摄像头采集的画面在两个播放窗口间进行灵活动态切换时,学生端也进行对应的画面切换显示,操作简单便利,提高用户的上课体验。
图16示出了根据一些实施例的服务器执行双摄像头直播方法的流程图。参见图16,本申请实施例提供的一种服务器,应用于音视频服务器,包括:控制器,在执行图16所示的双摄像头直播方法时,被配置为执行下述步骤:
S31、接收老师端显示设备在开始直播后发送的目标视频流,目标视频流是指老师端显示设备在开始直播后获取的第一摄像头采集的第一画面和/或第二摄像头采集的第二画面中的至少一种。
S32、如果目标视频流为第一摄像头采集的第一画面,则为第一画面配置主视频流标识,以及,将携带主视频流标识的第一画面作为主视频流发送至学生端显示设备进行显示。
S33、如果目标视频流为第二摄像头采集的第二画面,则为第二画面配置辅视频流标 识,以及,将携带辅视频流标识的第二画面作为辅视频流发送至学生端显示设备进行显示。
在直播上课过程中,服务器用于实现老师端显示设备和学生端显示设备之间的视频流传输和消息传输。因此,服务器可包括音视频服务器和消息服务器,由音视频服务器实现老师端显示设备和学生端显示设备之间的视频流传输,由消息服务器实现老师端显示设备和学生端显示设备之间的消息传输。
服务器在接收到老师端显示设备发送的视频流后,如果从视频流中获取到主流ID,则说明目标视频流为第一摄像头采集的第一画面;如果从视频流中获取到辅流ID,则说明目标视频流为第二摄像头采集的第二画面;如果从视频流中获取到主流ID和辅流ID,则说明目标视频流为第一摄像头采集的第一画面和第二摄像头采集的第二画面。
学生端在接收到老师端通过服务器发送的主辅视频流时,由于学生端显示设备提供的第二直播界面包括两个播放窗口,为便于确定主视频流和辅视频流的显示位置,可由服务器为主视频流和辅视频流分别配置对应的视频流标识,以使得学生端能够基于主视频流标识在老师视频窗口显示第一画面,基于辅视频流标识在讲义区播放窗口显示第二画面。
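服务器为上行视频流配置视频流标识的过程可参考下面的示意性JavaScript草图,其中字段名与 forwardToStudents 等函数名均为假设:

```javascript
// 示意:服务器根据上行视频流携带的主流ID/辅流ID,为其配置视频流标识(字段名均为假设)
function tagIncomingStream(stream) {
  if (stream.mainCameraId) {
    // 携带主流ID:第一摄像头采集的第一画面,配置主视频流标识
    stream.streamType = 'main';
  } else if (stream.subCameraId) {
    // 携带辅流ID:第二摄像头采集的第二画面,配置辅视频流标识
    stream.streamType = 'sub';
  }
  // 将携带视频流标识的视频流发送至学生端显示设备进行显示
  forwardToStudents(stream);
}
```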
在一些实施例中,在老师端控制进行画面切换时,为便于通知学生端也同步进行画面切换,则由服务器对主视频流和辅视频流的视频流标识进行切换,再基于切换后的视频流标识向学生端发送主视频流和辅视频流。
为此,在进行画面切换时,控制器被进一步配置为执行下述步骤:
步骤341、在进行摄像头采集画面切换显示时,响应于老师端显示设备在触发切换按钮产生的切换指令,将第一画面的视频流标识变更为辅视频流标识,将第二画面的视频流标识变更为主视频流标识,以及,向学生端显示设备发送切换通知,切换通知用于控制学生端显示设备提供的第二直播界面中的老师视频窗口和讲义区播放窗口均显示白板内容。
步骤342、在完成摄像头采集画面切换显示时,将携带辅视频流标识的第一画面和携带主视频流标识的第二画面发送至学生端显示设备,使得学生端显示设备提供的讲义区播放窗口基于辅视频流标识显示第一画面,老师视频窗口基于主视频流标识显示第二画面。
在老师端触发切换按钮拉起画面切换进程时,服务器接收到老师端显示设备发送的切换指令,实现主辅视频流的标识的切换,再基于切换后的视频流标识将对应的视频流发送至学生端显示设备,以使得学生端显示设备实现画面切换。
服务器所执行的配置主视频流和辅视频流的视频流标识以及切换视频流标识的具体实现过程,可参照前述任一实施例的内容,此处不进行赘述。
在一些实施例中,在利用服务器实现老师端显示设备和学生端显示设备之间的信息传输时,传输的流畅度由网络状态来决定。因此,为保证两个终端间视频流畅度,可采用分层推送视频流的方式。
为此,在传输视频流时,控制器被进一步配置为执行下述步骤:
步骤351、在实现老师端显示设备和学生端显示设备之间视频流的传输时,按照分辨率层级,将老师端显示设备发送的视频流转码为多层视频流,每层视频流对应一种分辨率,视频流是指主视频流和辅视频流中的至少一种。
步骤352、在将老师端显示设备发送的目标层视频流发送至学生端显示设备时,检测学生端显示设备接收视频流的丢包率。
步骤353、如果丢包率高于预设阈值,则基于下一分辨率层级将对应层视频流发送至学生端显示设备,下一分辨率层级低于目标层视频流对应的分辨率层级。
为保证两端视频传输的流畅度,服务器在接收到老师端显示设备发送的视频流时,在调用音视频SDK方法向学生端进行推流之前,会先将视频流按照分辨率层级划分为多层视频流。各个层视频流按照由高到低的顺序排序。
例如,在老师端设置的视频流为1920×1080分辨率的视频流时,服务器会将该视频流转码处理成三层视频流,如1920×1080、960×540、480×270三层视频流。
服务器将划分好的多层视频流发送至学生端显示设备,使得发送的视频流连接信令中包括视频流层级的信息。学生端显示设备在收到服务器发送的连接信令时,视频流层级信息就是这三层视频流的信息。而后,学生端在老师讲义区主(辅)流视频组件中调用音视频订阅视频流方法选择对应层视频流,向服务器订阅对应层视频流。
例如,学生端向服务器订阅1920×1080层视频流,而后服务器收到此订阅消息后,向学生端显示设备发送1920×1080的层视频流,并在订阅组件中进行渲染显示。
在视频流传输过程中,视频流畅度与网络状态息息相关,因此,由服务器实时检测学生端显示设备接收视频流的丢包率。如果丢包率高于预设阈值,如高于40%,说明当前网络状态差,则自动降低层级,将下一层视频流发送给学生端显示设备,来保证学生端的流畅度。
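分层转码与基于丢包率的自动降级可参考下面的示意性JavaScript草图,其中40%阈值沿用上文示例,transcodeToLayers、getPacketLossRate、switchToLayer 等名称均为假设:

```javascript
// 示意:服务器按分辨率层级转码,并根据丢包率自动降级(接口名均为假设)
const LAYERS = ['1920x1080', '960x540', '480x270']; // 由高到低的分辨率层级

function serveLayeredStream(teacherStream, studentConn) {
  // 将老师端上行的视频流转码为多层视频流,每层对应一种分辨率
  const layeredStreams = transcodeToLayers(teacherStream, LAYERS);
  let currentLayer = studentConn.subscribedLayer || 0; // 学生端订阅的目标层

  // 实时检测学生端接收视频流的丢包率
  setInterval(() => {
    const lossRate = getPacketLossRate(studentConn);
    // 丢包率高于预设阈值(如40%)且存在更低分辨率层级时,自动降低一个层级
    if (lossRate > 0.4 && currentLayer < LAYERS.length - 1) {
      currentLayer += 1;
      switchToLayer(studentConn, layeredStreams[currentLayer]);
    }
  }, 1000); // 检测周期仅为示例
}
```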
在一些实施例中,为了保证视频流畅度,还可采用手动降低分辨率的方式进行调节。例如,老师端触发第一直播界面的设置按钮弹出辅流设置弹窗,手动修改辅视频流的分辨率。通过手动降低当前推流分辨率方式来保证学生端的视频流畅度,具体实现过程可参照前述实施例的内容,此处不进行赘述。
可见,该服务器在进行双摄像头直播时,可对老师端显示设备发送的视频流类型配置对应的视频流标识,以便学生端可以基于不同的视频流标识将不同的画面显示在对应的两个不同播放窗口中,以观看老师直播上课。并且,在老师端对两个摄像头采集的画面在两个播放窗口间进行灵活动态切换时,服务器可对主视频流和辅视频流的视频流标识进行切换,将视频流标识切换后的主视频流和辅视频流继续发送至学生端,使得学生端也进行对应的画面切换显示,操作简单便利,提高用户的上课体验。
图14示出了根据一些实施例的学生端显示设备执行双摄像头直播方法的流程图。参见图14,本申请还提供了一种双摄像头直播方法,应用于学生端显示设备,所述方法包括:
S21、响应于触发用户界面中目标直播课控件产生的启动指令,显示第二直播界面,所述第二直播界面中包括老师视频窗口和讲义区播放窗口中的至少一种;
S22、接收老师端显示设备通过服务器发送的目标视频流,获取所述目标视频流携带的视频流标识,所述目标视频流是指老师端显示设备在开始直播后获取的第一摄像头采集的第一画面和/或第二摄像头采集的第二画面中的至少一种;
S23、如果所述视频流标识表征目标视频流为主视频流,则将所述主视频流显示在老师视频窗口;
S24、如果所述视频流标识表征目标视频流包括主视频流和辅视频流,则将所述主视 频流显示在老师视频窗口,将所述辅视频流显示在讲义区播放窗口。
图16示出了根据一些实施例的服务器执行双摄像头直播方法的流程图。参见图16,本申请还提供了一种双摄像头直播方法,应用于服务器,所述方法包括:
S31、接收老师端显示设备在开始直播后发送的目标视频流,所述目标视频流是指老师端显示设备在开始直播后获取的第一摄像头采集的第一画面和/或第二摄像头采集的第二画面中的至少一种;
S32、如果所述目标视频流为第一摄像头采集的第一画面,则为所述第一画面配置主视频流标识,以及,将携带主视频流标识的第一画面作为主视频流发送至学生端显示设备进行显示;
S33、如果所述目标视频流为第二摄像头采集的第二画面,则为所述第二画面配置辅视频流标识,以及,将携带辅视频流标识的第二画面作为辅视频流发送至学生端显示设备进行显示。
再次参见图6,在一些实施例的双摄像头直播方案中,在老师端通过触发老师端显示设备中显示的目标直播课控件进入目标直播间时操作如下:
S611:从服务器中获取对应的音视频SDK标识并加入对应直播间;
S612:同时,从服务器中获取对应的消息SDK标识并加入对应群组;
S613:通过群组向每个参与上课的学生端显示设备发送直播上课信令;
S614:开始直播上课,推送主流视频,同时向学生端显示设备发送主流连接的视频信令
S615:在开始直播上课后,若老师端想要推送辅流,则点击辅流启动按钮,弹出辅流设置弹窗;在点击确定后,将第二摄像头对应的辅流推送至服务器;同时,向学生端显示设备发送辅流连接的视频信令
S616:当需要切换主流视频和辅流视频的播放窗口时,点击切换按钮,向服务器发送主辅流切换指令,指示服务器将第一画面的视频流标识变更为辅视频流标识,将第二画面的视频流标识变更为主视频流标识;
S617:若不再需要进行双摄像头直播,则响应于在tab区中将辅流tab切换为讲义tab的操作,关闭第二摄像头,向服务器发送辅流断开信令,停止通过服务器向学生端显示设备发送辅视频流,此时,在讲义区播放窗口中切换显示讲义内容;
S618:老师完成授课内容,触发下课控件,向服务器发送下课消息;或者
S619:到达自动下课时间,触发下课定时任务,向服务器发送下课消息。
相应地,学生端在进入目标直播间时操作如下:
S621:从服务器中获取对应的音视频SDK标识并加入对应直播间;
S622:同时,从服务器中获取对应的消息SDK标识并加入对应群组;
S623:从服务器接收老师发送的直播上课信令,在收到消息后,隐藏未上课遮罩、显示直播课内容,此时讲义区播放窗口显示老师端的白板内容;
S624:接收服务器发送的视频信令,判断视频流标识为主视频流标识,在老师视频组件中调用音视频订阅视频流方法,向服务器订阅该主视频流;服务器收到此订阅消息后,向学生端显示设备发送主视频流,并将其显示在学生端显示设备的老师视频窗口中
S625:在收到辅流视频信令后,在讲义区辅流视频组件中订阅对应的辅流视频;在订阅成功后,接收服务器发送的辅流视频,并将之显示在讲义区播放窗口中;
S626:在接收到主辅流切换信令后,在老师视频组件和讲义区辅流视频组件中分别订阅切换后的视频流,并在订阅成功后分别在对应的窗口中渲染;
S627:响应于辅流断开信令,移除辅流视频的订阅,在讲义区播放窗口中显示讲义内容;
S628:在接收到下课消息后,显示下课界面,结束课程。
在一些实施例中,老师在智能电视上创建直播课时设置直播课的开始时间(startTime)和结束时间(endTime),例如,直播课于8:00开始,并于9:00结束。等到直播课结束时间到时,也就是到9:00时,直播课会立即停止。若老师还有未讲完的内容,只能在下节课继续讲课。此方式下,使得老师上课结束时与显示设备200之间的交互性较差,上课体验效果不佳。为了提高老师上直播课的体验效果,本申请还提供拖堂处理的实施方式。
在一些实施例中,服务器400响应于用户通过显示设备200创建直播课的请求,相应的创建直播课聊天群组、音视频直播间,并根据用户在显示设备200中设置的上课时间创建课程强制结束计时器。课程强制结束计时器根据老师设置的下课时间设置第一结束时间(delayEndTime),若是在第一结束时间内还未下课,则课程强制结束计时器会向直播课聊天群组发送时限消息,直播课聊天群组向老师和学生所登录的显示设备200发送用于指示直播课要结束的消息。也就是说,在课程强制结束计时器监测到第一结束时间到时,即使老师还未下课,服务器400也会控制显示设备200强制结束课程。
在一些实施例中,课程强制结束计时器的作用是为了维护整个系统的运行效率,因为服务器400是为很多用户服务,只有结束了当前直播课,才能释放出系统资源为其他直播课业务使用。
下面结合附图,来介绍本申请一些实施例提供的直播课拖堂管理的过程。
图17中示例性示出了根据一些实施例的直播课拖堂管理方法的流程示意图。该直播课拖堂管理的过程如下:
S701:响应于授课方进入直播课的操作,向服务器发送包含课程标识的数据获取请求。
如图7所示,主界面上显示有授课方已创建的直播课列表,授课方在选中已创建的直播课对应的“上课”控件后,显示设备200控制显示器260跳转至图8所示的直播课播放界面。在一些实施例中,当授课方进入直播课之后,显示设备200会向服务器400发送携带有课程标识的数据获取请求,当然,所述数据获取请求中也可包含用户标识,服务器400通过用户标识可判定出该用户为授课方还是普通学员。另外,对于授课方进入直播课的时间,在此不做限定,可以早于startTime,也可晚于startTime。
在一些实施例中,服务器400在接收到显示设备200发送的包含课程标识的数据请求时,根据课程标识从数据库中获取与直播课相对应的直播课详情数据以及直播课第一结束时间,并将相应的数据反馈至显示设备200。另外,服务器400还需查找出之前响应于用户创建直播课所创建的直播课聊天群组标识和音视频直播间标识,服务器400可将直播课聊天群组标识和音视频直播间标识归属于直播课详情数据中反馈至显示设备200,也可以单独反馈给显示设备200。
S702:接收服务器根据所述课程标识发送的直播课详情数据以及直播课第一结束时间,其中,所述直播课详情数据为进行直播课所需数据,所述第一结束时间为所述服务 器根据所述授课方所设置的第二结束时间(例如授课方所设置结束时间endTime)计算得到,所述第一结束时间大于或等于所述第二结束时间。
在一些实施例中,授课方通过显示设备200创建直播课时,服务器400从显示设备200处获取所创建直播课的开始时间和结束时间,将授课方所设置结束时间设为第二结束时间。在一些实施例中,运营人员还可通过服务器400设置第三预设时长,这里所述第三预设时长为允许所述直播课延迟的时长(forceEndDelay)。此处第三预设时长可以避免显示设备200在到达授课方所设置的第二结束时间时,立即关闭当前直播课的情况,也就是适当为直播课设置一定的延迟时间,以提高授课方用户体验。
在一些实施例中,服务器400在所述第二结束时间的基础上累加第三预设时长,计算得到第一结束时间,并将第一结束时间反馈至显示设备200。这里,若是运营人员将第三预设时长设置为0,则第一结束时间等于第二结束时间。在一些实施例中,服务器400根据计算出的第一结束时间,启动课程强制结束定时器,若在课程强制结束定时器完成第一结束时间的计时时,未接收到授课方关于拖堂的操作时,则服务器400可以直接强制结束当前直播课,通过直播课聊天群组散播直播课程结束的消息。
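服务器侧第一结束时间与课程强制结束定时器之间的关系可参考下面的示意性JavaScript草图,时间均以毫秒时间戳表示,onForceEnd 等名称为假设:

```javascript
// 示意:服务器根据第二结束时间与第三预设时长计算第一结束时间,并启动课程强制结束定时器
function scheduleForceEnd(endTime, forceEndDelay, onForceEnd) {
  // 第一结束时间 = 授课方设置的第二结束时间 + 允许直播课延迟的时长(第三预设时长)
  const delayEndTime = endTime + forceEndDelay;

  // 到达第一结束时间仍未收到拖堂操作,则强制结束当前直播课
  const timer = setTimeout(onForceEnd, delayEndTime - Date.now());
  return { delayEndTime, timer };
}
```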
S703:根据所述第一结束时间设置第一定时器,以使所述显示设备在所述第一定时器完成计秒后控制所述显示器展示出拖堂界面,其中,所述拖堂界面包含拖堂控件,所述拖堂控件用于触发延时停止所述直播课。
在一些实施例中,显示设备200记录授课方刚进入直播课的动作时间,并且,显示设备200设置有第二预设时长,所述第二预设时长为展示所述拖堂界面距离所述第一结束时间之间的时长,也就是说,在课程结束前的第二预设时长展示出拖堂界面。例如,第二结束时间为9:00,第三预设时长20分钟,则第一结束时间9:20,且第二预设时长为5分钟,则显示设备200将会在9:15的时候展示出拖堂界面。
在一些实施例中,显示设备200将所述第一结束时间依次减去所述动作时间和第二预设时长,得到第一定时时长,根据所述第一定时时长设置所述第一定时器。承接上述例子,假如授课方进入直播课的动作时间刚好为startTime,且startTime为8:00,那么此处第一定时时长计算方式为9:20减去8:00,再减去5分钟,得到1小时15分钟,也就是说,第一定时器需要完成1小时15分钟的计时。
例如,利用JS语言,通过setTimeout(定时器函数)起定时器,在该定时器函数中要设置两个参数,分别为待定时的时间以及定时完成后要做的行为。这里,待定时的时间为以毫秒表示的第一定时时长,定时完成后要做的行为为展示出拖堂界面。
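承接上文对setTimeout用法的说明,第一定时器的设置可参考下面的示意性JavaScript草图,时间以毫秒计,showOvertimeDialog 为假设的界面展示函数:

```javascript
// 示意:根据第一结束时间设置第一定时器,提前第二预设时长展示拖堂界面
function scheduleOvertimeDialog(delayEndTime, enterTime, preShowDuration) {
  // 第一定时时长 = 第一结束时间 - 进入直播课的动作时间 - 第二预设时长
  // 例如 9:20 - 8:00 - 5分钟 = 1小时15分钟
  const firstTimerDuration = delayEndTime - enterTime - preShowDuration;

  // 定时完成后要做的行为:展示出拖堂界面(拖堂提示框)
  setTimeout(showOvertimeDialog, firstTimerDuration);
}
```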
在一些实施例中,在显示设备200完成第一定时器的计秒之后,控制显示器260显示出拖堂界面。在一些实施例中,拖堂界面可以为拖堂提示框。图18中示例性示出了根据一些实施例的直播课拖堂界面的显示效果示意图。如图18所示,所述拖堂提示框浮于授课方第一直播界面的讲义区播放窗口上展示。
在一些实施例中,所述拖堂提示框上设置有拖堂控件(图18中“拖堂10分钟”)、下课控件(图18中“下课”)以及关闭控件(图18中“X”)。其中,所述拖堂控件用于触发延时停止所述直播课,所述下课控件用于触发立即停止所述直播课,所述关闭控件用于关闭当前拖堂提示框。
S704:在接收到对所述拖堂界面的关闭操作时,在所述第一结束时间调用所述服务器接口将所述直播课置为下课状态,以停止所述直播课。
在一些实施例中,当显示设备200展示出拖堂界面时,授课方看到拖堂界面后,结合当前授课情况判断是否需要延迟下课。若是授课方判断出在当堂直播课中可以讲完该讲的内容,无需进行拖堂,则可直接关闭显示器260所展示出的拖堂界面。显示设备200接收到授课方输入的对关闭控件的选中操作时,隐藏当前拖堂界面,并不改变后台逻辑,依旧按照第一结束时间关闭直播课。
在一些实施例中,显示设备200调用所述服务器400接口将所述直播课置为下课状态,以停止所述直播课。服务器400通过直播课聊天群组向学员端的显示设备发送信令,告知学员端显示设备当前直播课结束,学员端显示设备展示出下课结束的界面。服务器400还通过直播课聊天群组向授课方的显示设备发送信令,授课方显示设备控制显示器260跳转出直播课播放界面,停止直播课,并展示出当前直播课进行了多长时间。
在一些实施例中,当显示设备200展示出拖堂界面时,授课方若是不需要进行拖堂,可以对拖堂界面不做处理,在第一预设时长内未接收到输入的对所述拖堂界面的操作时,显示设备200在所述第一结束时间调用所述服务器接口将所述直播课置为下课状态。例如,在10秒内,显示设备200没有接收到授课方对拖堂界面的操作时,隐藏拖堂界面,并默认还是按照第一结束时间关闭直播课。
S705:在接收到对所述拖堂控件的选中操作时,向所述服务器发送课堂延时请求,接收所述服务器根据所述课堂延时请求发送的直播课第三结束时间,根据所述第三结束时间设置第二定时器,以使所述显示设备在所述第二定时器完成计秒后控制所述显示器再次展示出所述拖堂界面。
在一些实施例中,当显示设备200展示出拖堂界面时,授课方看到拖堂界面后,结合当前授课情况判断出在当堂直播课中无法完成该讲的内容,需要适当进行拖堂,可对拖堂控件进行选中操作,显示设备200在接收到授课方输入的拖堂指示时,向服务器400发送课堂延时请求。
在一些实施例中,服务器400在接收到显示设备200发送的课堂延时请求后,重新计算第三结束时间,所述第三结束时间为第一结束时间加上待拖堂的时长,再加上第三预设时长。服务器400在计算出第三结束时间后,相应的调整课程强制结束定时器的计时秒数。例如,第一结束时间9:20,待拖堂的时长为10分钟,第三预设时长还是为20分钟,则第三结束时间为9:20加上10分钟,再加上20分钟,为9:50。服务器400在计算出第三结束时长后,将第三结束时长反馈至显示设备200。
在一些实施例中,显示设备200记录接收到对拖堂控件选中的当前时间。显示设备200根据所述第三结束时间依次减去接收到对拖堂控件选中的当前时间和第二预设时长,得到第二定时器的第二定时时长。承接上述例子,假如接收到对拖堂控件选中的当前时间为9:15,那么此处第二定时时长计算方式为9:50减去9:15,再减去5分钟,得到30分钟,也就是说,第二定时器需要完成30分钟的计时。在30分钟后,若是当前直播课还没有停止,则显示设备200控制显示器再次展示出所述拖堂界面。
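第二定时器的定时时长计算与第一定时器类似,可参考下面的示意性JavaScript草图(仅为示意,isClassOver、showOvertimeDialog 等名称为假设):

```javascript
// 示意:根据第三结束时间设置第二定时器,若直播课尚未停止则再次展示拖堂界面
function scheduleSecondTimer(thirdEndTime, selectTime, preShowDuration, isClassOver) {
  // 第二定时时长 = 第三结束时间 - 选中拖堂控件的当前时间 - 第二预设时长
  // 例如 9:50 - 9:15 - 5分钟 = 30分钟
  const secondTimerDuration = thirdEndTime - selectTime - preShowDuration;

  setTimeout(() => {
    if (!isClassOver()) {
      showOvertimeDialog(); // 再次展示拖堂界面,重复上述过程,直至直播课停止
    }
  }, secondTimerDuration);
}
```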
在一些实施例中,当显示设备200接收到授课方输入的对下课控件的选中操作时,立即调用服务器400的接口,将直播课置为下课状态。
下面结合附图,进一步介绍一下本申请一些实施例中直播课拖堂管理的过程。
图19示出了根据一些实施例的直播课拖堂管理方法的时序图。如图19所示,其操作如下:
S901:授课方通过显示设备200可以调用服务器400的接口,创建直播课,并设置直播课的开始时间和结束时间,这里结束时间即为第二结束时间。
S902:服务器400从显示设备200获取授课方创建直播课的开始时间和第二结束时间。运营人员在服务器400侧自定义设置第三预设时长,服务器400通过第二结束时间加第三预设时长得到第一结束时间。服务器400还需相应的创建直播课聊天群组和音视频直播间,并根据第一结束时间启动课程强制结束定时器。
S903:授课方进入直播间开始上课,此时,显示设备200向服务器400发送包含课程表示的数据获取请求。服务器400将第一结束时间和直播课详情数据(包括直播课聊天群组标识、音视频直播间标识)反馈至显示设备200。
S904:显示设备200根据接收到直播课详情数据开始上课,即加入直播课聊天群组、音视频直播间。
S905:显示设备200预设第二预设时长,也就是需要提前第一结束时间多久展示拖堂界面。显示设备200根据第一结束时间,计算出第一定时时长,并根据第一定时时长本地拉起第一定时器,在第一定时器完成第一定时时长后,显示设备200控制显示拖堂提示。
S906:在显示出拖堂提示后,若是在第一预设时长内显示设备200未接收到授课方对拖堂界面的任何操作,显示设备200还是按照第一结束时间停止直播课,即在第一结束时间后,显示设备200会调用服务器400接口,将直播课置为下课状态;或者
S907:在显示出拖堂提示后,若是在第一预设时长内显示设备200接收到授课方对拖堂界面的关闭操作,则显示设备200还是按照第一结束时间停止直播课,即在第一结束时间后,显示设备200会调用服务器400接口,将直播课置为下课状态。
S908:在接收到对拖堂提示中下课控件的选中操作时,立即调用服务器接口将直播课设置为下课状态。
S909:若是接收到对拖堂界面上拖堂控件的选中操作时,向服务器400发送课堂延时请求。
S910:服务器400根据课堂延时请求重新计算第三结束时间,并反馈至显示设备200。相应地,课程强制结束定时器调整其计时秒数。
S911:服务器400根据第三结束时间相应地调整课程强制结束定时器的计时时间。显示设备200根据第三结束时间计算第二定时时长,本地起第二定时器,并重复上述过程,直至直播课停止。
S912:老师结束直播课,则关闭课程强制结束定时器;否则,等课程强制结束定时器完成计时时,发送时限消息,解散直播课聊天群组,关闭音视频直播间,并通过直播课聊天群组发送直播课结束消息。
S913:根据接收到的直播课结束消息,退出直播课聊天群组、音视频直播间,并跳转至直播结束画面。
在一些实施例中,结束直播课的方式是通过直播课聊天群组向各个显示设备发送全局消息,此时服务器400的音视频直播间并不停止,各显示设备接收到消息后会自动停止访问直播课聊天群组,以及停止上传音视频数据。此时音视频直播间检测到一段时间没有上传和下拉数据的请求,则自动停止音视频直播间。这样通过显示设备接收到消息进行控制,由服务器400被动地停止音视频直播,可以避免出现服务器400主动切断的情况,各显示设备会根据自己的网络延迟实现不同的控制。
若在上课期间,授课方主动关闭直播课,进行下课,则服务器400关闭课程强制结束定时器。若授课方在延时拖堂之后,还未下课,则服务器400等课程强制结束定时器完成计时时,向直播课聊天群组发送时限消息,直播课聊天群组向参与直播课的各个显示设备发送直播课程结束消息,显示设备根据接收到的消息,退出直播课聊天群组和音视频直播间,并跳转至直播结束画面。
本申请中,通过临下课前在显示器上展示出拖堂界面,请示授课方是否需要进行拖堂,使得授课方自由控制延迟课程下课的时间,以便将未讲完的课程在一堂课中讲完,从而提高授课方的体验效果。
本申请一些实施例还提供了一种服务器,所述服务器被配置为:服务器400接收显示设备200发送的包含课程标识的数据获取请求。服务器400根据所述课程标识于数据库中查询相应的直播课详情数据以及直播课第一结束时间,其中,所述直播课详情数据为进入直播课所需数据,所述第一结束时间根据授课方所设置的第二结束时间计算得到。服务器400将所述直播课详情数据以及直播课第一结束时间发送至所述显示设备200,其中,所述第一结束时间用于所述显示设备设置第一定时器,以使所述显示设备在所述第一定时器完成计秒后控制所述显示器展示出拖堂界面。
在一些实施例中,服务器400获取所述显示设备中所述授课方所设置的第二结束时间,将所述第二结束时间与第三预设时长相加,得到所述第一结束时间,其中,所述第三预设时长为允许所述直播课延迟的时长,可由运营人员进行自定义配置。
在一些实施例中,服务器400在接收到所述显示设备200发送的课堂延时请求时,计算第三结束时间,包括:将所述第一结束时间与所述第三预设时长和待拖堂的时长相加,得到所述第三结束时间。
根据本申请的一些实施例,直播课拖堂管理方法包括:显示设备200响应于授课方进入直播课的操作,向服务器发送包含课程标识的数据获取请求。服务器400根据课程标识从数据库中查询相应的直播课详情数据和第一结束时间,并下发至显示设备200。显示设备200接收服务器400发送的直播课详情数据以及直播课第一结束时间,其中,所述直播课详情数据为进行直播课所需数据,所述第一结束时间为所述服务器根据所述授课方所设置的第二结束时间计算得到,所述第一结束时间大于或等于所述第二结束时间。显示设备200根据所述第一结束时间设置第一定时器,以使显示设备200在所述第一定时器完成计秒后控制所述显示器260展示出拖堂界面,其中,所述拖堂界面包含拖堂控件,所述拖堂控件用于触发延时停止所述直播课。在接收到对所述拖堂界面的关闭操作时,显示设备200在所述第一结束时间调用所述服务器400接口将所述直播课置为下课状态,以停止所述直播课。在接收到对所述拖堂控件的选中操作时,显示设备200向所述服务器400发送课堂延时请求,接收所述服务器400根据所述课堂延时请求发送的直播课第三结束时间,根据所述第三结束时间设置第二定时器,以使所述显示设备200在所述第二定时器完成计秒后控制所述显示器再次展示出所述拖堂界面。
参见图18中所示的根据一些实施例的直播课拖堂界面的显示效果示意图,在双摄像头直播方案中,拖堂提示框浮于授课方第一直播界面的讲义区播放窗口上展示。在讲义区播放窗口中播放的是课件内容的情形下,授课老师是位于书桌前的、可以直接看到讲义区播放窗口中的展示内容,由此可以第一时间看到直播课拖堂界面,此时可直接采用 如上所述的直播课拖堂管理方案。然而,当老师在黑板上进行书写而没有位于书桌前时,不能第一时间看到直播课拖堂界面,此时需要对拖堂管理方案进行相应的调整如下。
例如,响应于辅流启动按钮的触发,启动获取第二摄像头采集的第二画面,并将第二画面显示在讲义区播放窗口中,此时,老师可能在黑板上手动书写板书。此时若显示设备200在讲义区播放窗口上展示出拖堂界面,则老师很可能会注意不到该拖堂界面、对拖堂界面也不做处理。
在此情形下,若在第一预设时长内未接收到输入的对所述拖堂界面的操作时,显示设备200将默认需要拖堂,因此向所述服务器发送课堂延时请求;服务器则根据所述课堂延时请求,以之前所述的方法计算出并发送直播课第三结束时间,然后在讲义区播放窗口上展示延时提示框,其中例如显示“下课时间已延迟,延迟时间:10分钟”。与此同时,关闭拖堂界面。
在一些实施例中,所述延时提示框上还可设置有下课控件以及关闭控件。所述下课控件用于触发立即停止所述直播课,所述关闭控件用于关闭当前延时提示框。如果显示设备200没有接收到授课方对延时界面的操作,则该界面会一直留在播放窗口中,并按照第三结束时间关闭直播课。
进一步地,根据所述第三结束时间设置第二定时器,以使所述显示设备在所述第二定时器完成计秒后控制所述显示器关闭延时界面、再次展示出所述拖堂界面,重复上述过程,直至直播课停止。
在一些实施例中,服务器400在接收到显示设备200发送的拖堂请求时,会判断当前及其接下来的系统效率或性能。如果在预约延长的时间内系统性能有冗余,则正常反馈延时拖堂成功的消息给各个显示设备。如果在预约延长的时间内系统性能没有冗余或冗余达不到预设标准,则不允许拖堂。或者允许拖堂但提示系统性能不足。服务器400在接下来的时间中,如果出现运行卡顿或预警,则优先将拖堂的直播课业务进行降频,以维护系统效率。如果拖堂的直播课业务降频后仍不满足,则再处理其他业务。以兼顾系统的公平,维持正常直播课业务的上课质量。
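服务器对拖堂请求的处理策略可用下面的示意性JavaScript草图概括,其中系统性能冗余的评估方式与各名称均为假设,仅用于说明决策流程:

```javascript
// 示意:服务器根据预约延长时间内的系统性能冗余,决定是否允许拖堂(判断逻辑为假设)
function handleExtendRequest(request, systemMonitor) {
  // 评估预约延长时间内的系统性能冗余(评估方式为假设)
  const headroom = systemMonitor.estimateHeadroom(request.extendUntil);

  if (headroom >= systemMonitor.presetStandard) {
    // 系统性能有冗余:正常反馈延时拖堂成功的消息给各个显示设备
    return { allowed: true };
  }
  // 冗余不足:不允许拖堂,或允许拖堂但提示系统性能不足(两种策略择一,此处示意后者)
  return { allowed: true, warning: 'insufficient-capacity' };
}
```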
本公开实施例提供的带有摄像头的显示设备和直播方法,老师端触发目标直播课控件,显示包括上课按钮、辅流启动按钮、老师视频窗口和讲义区播放窗口的第一直播界面。在开始直播时,触发上课按钮,启动并获取第一摄像头采集的第一画面,显示在老师视频窗口。在进行双摄像头直播时,老师端触发辅流启动按钮,启动并获取第二摄像头采集的第二画面,显示在讲义区播放窗口;通过服务器将第一画面和第二画面发送至学生端进行显示。在进行画面切换时,老师端触发切换按钮,老师端和学生端同步将第一画面显示在讲义区播放窗口,将第二画面显示在老师视频窗口。可见,通过两个摄像头分别采集不同的教学画面,且将辅助教学内容与讲义内容共用一个播放窗口,以增大显示效果。老师端可对两个摄像头采集的画面在两个播放窗口间进行灵活动态切换,操作简单便利,提高用户的上课体验。同时,本公开还提供了对直播中拖堂的处理方法,能够通过临下课前在显示器上展示出拖堂界面,请示授课方是否需要进行拖堂,使得授课方自由控制延迟课程下课的时间,以便将未讲完的课程在一堂课中讲完,从而提高授课方的体验效果。
为了方便解释,已经结合具体的实施方式进行了上述说明。但是,上述示例性的讨论不是意图穷尽或者将实施方式限定到上述公开的具体形式。根据上述的教导,可以得 到多种修改和变形。上述实施方式的选择和描述是为了更好的解释本公开内容,从而使得本领域技术人员更好的使用所述实施方式。

Claims (20)

  1. 一种用于带有摄像头的显示设备的直播方法,所述方法包括:
    响应于触发用户界面中目标直播课控件产生的启动指令,显示第一直播界面,所述第一直播界面中包括上课按钮、辅流启动按钮、老师视频窗口和讲义区播放窗口中的至少一种;
    响应于触发所述上课按钮产生的开始直播指令,在所述讲义区播放窗口显示白板内容,启动并获取双摄像头中的第一摄像头采集的第一画面,将所述第一画面显示在老师视频窗口,以及,将所述第一画面作为主视频流通过服务器发送至学生端显示设备进行显示;
    响应于触发所述辅流启动按钮产生的双摄像头直播指令,启动并获取双摄像头中的第二摄像头采集的第二画面,将所述第二画面显示在讲义区播放窗口,以及,将所述第二画面作为辅视频流通过服务器发送至学生端显示设备进行显示。
  2. 根据权利要求1所述的直播方法,其中,所述响应于触发用户界面中目标直播课控件产生的启动指令,显示第一直播界面的步骤进一步包括:
    响应于触发用户界面中目标直播课控件产生的启动指令,从服务器获取与目标直播课对应的音视频SDK标识和消息SDK标识,所述音视频SDK标识是指用于实现直播间音视频收发操作的音视频SDK的标识,所述消息SDK标识是指用于实现直播间聊天消息收发操作的消息SDK的标识;
    基于所述音视频SDK标识和消息SDK标识,创建与所述目标直播课对应的目标直播间,并显示用于呈现所述目标直播间的第一直播界面。
  3. 根据权利要求1所述的直播方法,其中,所述响应于触发所述上课按钮产生的开始直播指令,启动并获取双摄像头中的第一摄像头采集的第一画面,将所述第一画面显示在老师视频窗口的步骤进一步包括:
    响应于触发所述上课按钮产生的开始直播指令,选取第一摄像头作为用于拍摄第一画面的主流摄像头,以及,设置所述主流摄像头的主流拍摄参数;
    启动所述主流摄像头,以及,获取所述主流摄像头基于所述主流拍摄参数采集的第一画面,将所述第一画面作为主视频流显示在老师视频窗口中。
  4. 根据权利要求1所述的直播方法,进一步包括:
    在所述第一直播界面中设置课件控件,所述课件控件用于触发讲义内容的显示;
    在进行双摄像头直播之前,接收对所述课件控件的触发,获取讲义内容,将所述讲义内容显示在讲义区播放窗口;
    在进行双摄像头直播时,响应于触发所述辅流启动按钮产生的双摄像头直播指令,将所述讲义内容停止显示,以及,启动并获取所述第二摄像头采集的第二画面,将所述第二画面显示在讲义区播放窗口。
  5. 根据权利要求4所述的直播方法,进一步包括:
    在进行双摄像头直播时,在所述第一直播界面中显示讲义tab和辅流tab,所述辅流tab用于表征第二摄像头采集的第二画面被显示;
    在不需要进行双摄像头直播时,响应于在所述tab区中将辅流tab切换为讲义tab的操作,关闭所述第二摄像头,停止通过服务器向学生端显示设备发送辅视频流,以及,在所述讲义区播放窗口切换显示讲义内容。
  6. 根据权利要求4所述的直播方法,其中,所述响应于触发所述辅流启动按钮产生的双摄像头直播指令,启动并获取所述第二摄像头采集的第二画面,将所述第二画面显示在讲义区播放窗口的步骤进一步包括:
    响应于触发所述辅流启动按钮产生的双摄像头直播指令,在所述第一直播界面中显示包括参数选择框和确定按钮的辅流设置弹窗,所述参数选择框用于设置辅流摄像头的辅流拍摄参数;
    基于参数选择框选取第二摄像头作为用于拍摄第二画面的辅流摄像头,以及,设置辅流摄像头的辅流拍摄参数后,响应于触发所述确定按钮产生的指令,启动所述辅流摄像头;
    获取所述辅流摄像头基于所述辅流拍摄参数采集的第二画面,将所述第二画面作为辅视频流显示在讲义区播放窗口。
  7. 根据权利要求6所述的直播方法,进一步包括:
    在需要对所述辅流拍摄参数进行修改时,响应于触发所述第一直播界面中设置按钮产生的指令,在所述第一直播界面中弹出辅流设置弹窗,以及,停止通过服务器向学生端显示设备发送辅视频流;
    在基于所述辅流设置弹窗完成对辅流拍摄参数的修改时,取消所述辅流设置弹窗的显示,以及,继续通过服务器向学生端显示设备发送修改后的辅流拍摄参数对应的辅视频流。
  8. 根据权利要求1所述的直播方法,进一步包括:
    在所述第一直播界面中设置切换按钮,所述切换按钮用于实现第一摄像头和第二摄像头采集画面的切换显示;
    在所述第一直播界面中的老师视频窗口显示第一画面、讲义区播放窗口显示第二画面时,接收对所述切换按钮的触发,将所述第一画面作为辅视频流显示在讲义区播放窗口,以及,将所述第二画面作为主视频流显示在老师视频窗口;
    向服务器发送切换指令,所述切换指令用于指示服务器将所述第一画面的视频流标识变更为辅视频流标识,将所述第二画面的视频流标识变更为主视频流标识。
  9. 根据权利要求1所述的直播方法,进一步包括:
    响应于授课方进入直播课的操作,向服务器发送包含课程标识的数据获取请求;
    接收服务器根据所述课程标识发送的直播课详情数据以及直播课第一结束时间,其中,所述直播课详情数据为进行直播课所需数据,所述第一结束时间为所述服务器根据所述授课方所设置的第二结束时间计算得到,所述第一结束时间大于或等于所述第二结束时间;
    根据所述第一结束时间设置第一定时器,以使所述显示设备在所述第一定时器完成计秒后控制所述显示器展示出拖堂界面,其中,所述拖堂界面包含拖堂控件,所述拖堂控件用于触发延时停止所述直播课;
    在接收到对所述拖堂界面的关闭操作时,在所述第一结束时间调用所述服务器接口将所述直播课置为下课状态,以停止所述直播课;
    在接收到对所述拖堂控件的选中操作时,向所述服务器发送课堂延时请求,接收所述服务器根据所述课堂延时请求发送的直播课第三结束时间,根据所述第三结束时间设置第二定时器,以使所述显示设备在所述第二定时器完成计秒后控制所述显示器再次展示出所述拖堂界面。
  10. 根据权利要求9所述的直播方法,进一步包括:
    在第一预设时长内未接收到输入的对所述拖堂界面的操作时,隐藏所述拖堂界面并在所述第一结束时间调用所述服务器接口将所述直播课置为下课状态。
  11. 根据权利要求9所述的直播方法,其中,所述拖堂界面包含下课控件,所述下课控件用于触发停止所述直播课,在接收到对所述下课控件的选中操作时,调用所述服务器接口将所述直播课置为下课状态。
  12. 根据权利要求9所述的直播方法,其中,所述根据所述第一结束时间设置第一定时器的步骤包括:
    获取所述授课方进入所述直播课的动作时间;
    将所述第一结束时间依次减去所述动作时间和第二预设时长,得到第一定时时长,其中,所述第二预设时长为展示所述拖堂界面的时间距离所述第一结束时间之间的时长;
    根据所述第一定时时长设置所述第一定时器。
  13. 根据权利要求9所述的直播方法,其中,所述根据第三结束时间设置第二定时器的步骤包括:
    获取接收到对拖堂控件选中的当前时间;
    将所述第三结束时间依次减去所述当前时间和第二预设时长,得到第二定时时长;
    根据所述第二定时时长设置所述第二定时器。
  14. 根据权利要求9所述的直播方法,其中,所述第一定时器完成计秒后控制所述显示器展示出拖堂界面的步骤包括:
    控制所述显示器于当前所述直播课的讲义区播放窗口上展示出拖堂提示框,其中,所述拖堂提示框浮于所述讲义区播放窗口上展示。
  15. 根据权利要求14所述的直播方法,其中,
    当辅流启动按钮被触发并在讲义区播放窗口中显示由第二摄像头采集的第二画面时,若在第一预设时长内未接收到输入的对所述拖堂界面的操作,向所述服务器发送课堂延时请求,接收所述服务器根据所述课堂延时请求发送的直播课第三结束时间。
  16. 根据权利要求15所述的直播方法,进一步包括:
    在讲义区播放窗口上展示延时提示框,并关闭拖堂界面。
  17. 根据权利要求16所述的直播方法,进一步包括:
    在所述延时提示框上设置下课控件,所述下课控件用于触发停止所述直播课,在接收到对所述下课控件的选中操作时,调用所述服务器接口将所述直播课置为下课状态。
  18. 根据权利要求15所述的直播方法,进一步包括:根据所述第三结束时间设置第二定时器,以使所述显示设备在所述第二定时器完成计秒后控制所述显示器再次展示出所述拖堂界面。
  19. 一种显示设备,包括:
    显示器,被配置为显示图像和/或用户界面;
    第一摄像头,被配置为采集第一画面;
    第二摄像头,被配置为采集第二画面;
    分别与所述显示器、第一摄像头和第二摄像头连接的控制器,所述控制器被配置为:
    响应于触发用户界面中目标直播课控件产生的启动指令,显示第一直播界面,所述第一直播界面中包括上课按钮、辅流启动按钮、老师视频窗口和讲义区播放窗口中的至少一种;
    响应于触发所述上课按钮产生的开始直播指令,在所述讲义区播放窗口显示白板内容,启动并获取所述第一摄像头采集的第一画面,将所述第一画面显示在老师视频窗口,以及,将所述第一画面作为主视频流通过服务器发送至学生端显示设备进行显示;
    响应于触发所述辅流启动按钮产生的双摄像头直播指令,启动并获取所述第二摄像头采集的第二画面,将所述第二画面显示在讲义区播放窗口,以及,将所述第二画面作为辅视频流通过服务器发送至学生端显示设备进行显示。
  20. 根据权利要求19所述的显示设备,所述控制器被进一步配置为:
    响应于授课方进入直播课的操作,向服务器发送包含课程标识的数据获取请求;
    接收服务器根据所述课程标识发送的直播课详情数据以及直播课第一结束时间,其中,所述直播课详情数据为进行直播课所需数据,所述第一结束时间为所述服务器根据所述授课方所设置的第二结束时间计算得到,所述第一结束时间大于或等于所述第二结束时间;
    根据所述第一结束时间设置第一定时器,以使所述显示设备在所述第一定时器完成计秒后控制所述显示器展示出拖堂界面,其中,所述拖堂界面包含拖堂控件,所述拖堂控件用于触发延时停止所述直播课;
    在接收到对所述拖堂界面的关闭操作时,在所述第一结束时间调用所述服务器接口将所述直播课置为下课状态,以停止所述直播课;
    在接收到对所述拖堂控件的选中操作时,向所述服务器发送课堂延时请求,接收所述服务器根据所述课堂延时请求发送的直播课第三结束时间,根据所述第三结束时间设置第二定时器,以使所述显示设备在所述第二定时器完成计秒后控制所述显示器再次展示出所述拖堂界面。
PCT/CN2022/135719 2022-01-24 2022-11-30 显示设备和直播方法 WO2023138222A1 (zh)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202210077270.7 2022-01-24
CN202210077270.7A CN114390357B (zh) 2022-01-24 2022-01-24 显示设备、服务器及直播课拖堂管理方法
CN202210405366.1A CN116962729A (zh) 2022-04-18 2022-04-18 一种双摄像头直播方法及显示设备、服务器
CN202210405366.1 2022-04-18

Publications (1)

Publication Number Publication Date
WO2023138222A1 true WO2023138222A1 (zh) 2023-07-27

Family

ID=87347747

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/135719 WO2023138222A1 (zh) 2022-01-24 2022-11-30 显示设备和直播方法

Country Status (1)

Country Link
WO (1) WO2023138222A1 (zh)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104539436A (zh) * 2014-12-22 2015-04-22 杭州施强网络科技有限公司 一种课堂内容实时直播方法及系统
CN104735416A (zh) * 2015-03-31 2015-06-24 宣城状元郎电子科技有限公司 跟踪摄像、录音信息采集处理直播网络教学系统
CN107155080A (zh) * 2016-09-06 2017-09-12 北京新学堂网络科技有限公司 一种模仿现场授课场景的课程视频制作方法
CN111182250A (zh) * 2019-11-29 2020-05-19 安徽文香信息技术有限公司 一种音视频教学录播系统及其控制方法
CN111245846A (zh) * 2020-01-15 2020-06-05 酷得少年(天津)文化传播有限公司 一种用于直播的信令传输系统及方法
CN112203106A (zh) * 2020-10-10 2021-01-08 深圳市捷视飞通科技股份有限公司 直播教学方法、装置、计算机设备和存储介质
CN112788361A (zh) * 2020-10-15 2021-05-11 聚好看科技股份有限公司 一种直播课回看方法、显示设备及服务器
CN112601124A (zh) * 2020-12-08 2021-04-02 聚好看科技股份有限公司 移动终端、服务器、显示设备及远程定时关机的控制方法
CN113645479A (zh) * 2021-08-12 2021-11-12 Vidaa美国公司 一种直播节目状态显示方法及显示设备

Similar Documents

Publication Publication Date Title
CN111741372B (zh) 一种视频通话的投屏方法、显示设备及终端设备
WO2020248640A1 (zh) 一种显示设备
CN114390359B (zh) 一种消息的展示方法及显示设备
WO2020248668A1 (zh) 一种显示器及图像处理方法
WO2020248795A1 (zh) 一种智能电视上视频通话界面切换方法及智能电视
CN111491190B (zh) 一种双系统摄像头切换控制方法及显示设备
WO2020248714A1 (zh) 一种数据传输方法及设备
CN112788422A (zh) 显示设备
WO2024041672A1 (zh) 一种基于iptv业务的vr全景视频播放方法和系统
CN112788378B (zh) 显示设备与内容显示方法
WO2020248681A1 (zh) 显示设备及蓝牙开关状态的显示方法
CN112788423A (zh) 一种显示设备及菜单界面的显示方法
CN112783380A (zh) 显示设备和方法
CN114390357B (zh) 显示设备、服务器及直播课拖堂管理方法
WO2023138222A1 (zh) 显示设备和直播方法
CN113938633B (zh) 一种视频通话处理方法及显示设备
CN116980554A (zh) 一种显示设备及视频会议界面显示方法
WO2021088326A1 (zh) 一种显示设备及来电显示方法
CN113938634A (zh) 一种多路视频通话处理方法及显示设备
CN116962729A (zh) 一种双摄像头直播方法及显示设备、服务器
WO2023240973A1 (zh) 显示设备及投屏方法
CN115086722B (zh) 一种副屏内容的展示方法及显示设备
CN111641855B (zh) 一种双屏显示设备及其音频输出方法
CN113630633B (zh) 显示设备及交互控制方法
CN111970547B (zh) 一种显示设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22921646

Country of ref document: EP

Kind code of ref document: A1