CN111277884A - Video playing method and device - Google Patents

Video playing method and device

Info

Publication number
CN111277884A
Authority
CN
China
Prior art keywords
display device
playing
target video
server
auditorium
Prior art date
Legal status
Granted
Application number
CN202010111719.8A
Other languages
Chinese (zh)
Other versions
CN111277884B (en)
Inventor
王光强
Current Assignee
Qingdao Hisense Media Network Technology Co Ltd
Original Assignee
Qingdao Hisense Media Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Qingdao Hisense Media Network Technology Co Ltd
Priority to CN202010111719.8A
Publication of CN111277884A
Application granted
Publication of CN111277884B
Status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network

Abstract

The present application discloses a video playing method and a video playing device, and belongs to the field of multimedia technologies. The method is applied to a target display device and includes: receiving a target video sent by a server, where the server is configured to send the target video to a plurality of display devices, the plurality of display devices are configured to play the target video synchronously, and the target display device is any one of the plurality of display devices; in the process of receiving the target video, playing the received part of the target video at a first playing speed; determining an absolute value of a time difference between the playing progress of the target video and a theoretical playing progress, where the theoretical playing progress is sent to the plurality of display devices by the server; and when the absolute value is greater than the first time length, adjusting the playing progress of the target video. The method and the device solve the problem that the playing progress of a video differs greatly among the display devices in a virtual auditorium. The present application is used for playing videos.

Description

Video playing method and device
Technical Field
The present application relates to the field of multimedia technologies, and in particular, to a video playing method and device.
Background
With the development of multimedia technology, social television has emerged. Social television seamlessly combines social media (such as instant messaging software) with a television, so that a user can communicate with relatives and friends through the social television while watching videos on it.
When using a social television, a user may trigger the social television to establish a virtual auditorium for a video and invite other social televisions to join the virtual auditorium to play the video together. During video playback, the social televisions in the virtual auditorium can exchange information, so that the users watching the video presented in the virtual auditorium can discuss it with one another. If the network state of a social television is poor and the remaining part of the video has not been obtained by the time the already-received part finishes playing, the social television waits until the remaining part is obtained before continuing to play it, and its playing progress therefore falls behind.
Because the network states of the social televisions in the virtual auditorium differ, the playing progress of the video can differ greatly across those social televisions.
Disclosure of Invention
The embodiments of the present application provide a video playing method and device, which can solve the problem that the playing progress of a video differs greatly among the social televisions in a virtual auditorium. The technical solution is as follows:
in one aspect, a video playing method is provided, including:
receiving a target video sent by a server, wherein the server is used for sending the target video to a plurality of display devices, the display devices are used for synchronously playing the target video, and the target display device is any one of the display devices;
in the process of receiving the target video, playing the received part of the target video at a first playing speed;
determining an absolute value of a time difference between the playing progress of the target video and a theoretical playing progress, wherein the theoretical playing progress is sent to the plurality of display devices by the server;
and when the absolute value is greater than the first time length, adjusting the playing progress of the target video.
In another aspect, a video playback device is provided, the video playback device including:
the system comprises a controller, a server and a plurality of display devices, wherein the controller is used for receiving a target video sent by the server through a communication interface, the server is used for sending the target video to the plurality of display devices, the plurality of display devices are used for synchronously playing the target video, and the video playing device is any one of the plurality of display devices;
the display screen is used for playing the received part of the target video at a first playing speed in the process of receiving the target video through the communication interface;
the controller is further configured to determine an absolute value of a time difference between the playing progress of the target video and a theoretical playing progress, where the theoretical playing progress is sent to the plurality of display devices by the server;
the controller is further configured to adjust the playing progress of the target video when the absolute value is greater than the first time duration.
The beneficial effects brought by the technical solution provided in the present application include:
In the video playing method provided in the present application, the display device can adjust the playing progress of the target video when the absolute value of the time difference between the playing progress of the target video and the theoretical playing progress is greater than the first time length, so that this time difference becomes smaller. Because each display device in the virtual auditorium plays the target video in this way, the difference between the playing progresses of the display devices can be reduced.
Drawings
FIG. 1 is a schematic illustration of an implementation environment to which various embodiments of the present application relate;
fig. 2 is a block diagram of a hardware configuration of a terminal according to an embodiment of the present disclosure;
fig. 3 is a schematic functional configuration diagram of a terminal according to an embodiment of the present application;
fig. 4 is a block diagram of a configuration of a software system in a terminal according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a server provided in an embodiment of the present application;
fig. 6 is a flowchart of a video playing method provided in an embodiment of the present application;
fig. 7 is a flowchart of another video playing method provided in an embodiment of the present application;
FIG. 8 is a schematic diagram of a create interface provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of a display interface of a second display device provided in an embodiment of the present application;
fig. 10 is a schematic diagram of a display interface of a display device according to an embodiment of the present application.
Detailed Description
To make the principles, technical solutions and advantages of the present application clearer, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
With the development of multimedia technology, social television is becoming more and more popular. A social television is usually externally connected with a camera, a microphone and the like, which makes audio and video communication on the social television possible, so that a user can interact with relatives and friends while watching videos through the social television, realizing a "chat while watching" function. At present, the concept of a virtual auditorium has been proposed for social television. Like a real auditorium, a virtual auditorium allows multiple persons to watch the same program; the difference is that in a virtual auditorium, the users watch the same program on different display devices. However, due to differences in network quality between display devices, different display devices currently reach different playing progresses while playing the same program. As a result, the content watched by the users at the same moment differs, the users cannot discuss the same content in real time, and the requirement of users for the "chat while watching" function of social television cannot be met.
The video playing method provided in the present application can reduce the difference in playing progress among the display devices in a virtual auditorium, and thus better meet users' requirements for social television.
Fig. 1 is a schematic view of a video playing system according to an embodiment of the present application. The video playback system may include a plurality of display devices 200 and a server 400 communicatively connected. The display device 200 provided by the embodiment of the application can be a liquid crystal display, an OLED display or a projection display device. The particular display device type, size, resolution, etc. are not limiting, and those skilled in the art will appreciate that the display device may be modified in performance and configuration as desired. The display device can be a television product, and the display device can provide a broadcast receiving television function and can additionally provide an intelligent network television function of a computer support function. For example, the display device may include a web tv, a smart tv, an Internet Protocol Television (IPTV), and the like. Optionally, the display device may also be a mobile terminal, a tablet computer, a notebook computer, and other smart devices. It should be noted that the video playing system may include one server 400 or a plurality of servers 400, and fig. 1 exemplifies that the video playing system includes only one server 400. It should be noted that the server 400 described in this embodiment may be one server, or may also be a server cluster formed by a plurality of servers.
Optionally, when the display apparatus 200 is a television product, the user may operate the display apparatus 200 through the first control device 100 and the second control device 300. The first control device 100 may be a remote controller that controls the display apparatus 200 wirelessly or through a wired connection, using infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods. The user may input user commands through keys on the remote controller, voice input, control panel input, and the like to control the display apparatus 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, and the like on the remote controller to control the display device 200.
In some embodiments, the display device 200 may also be controlled using an application program running on a smart device, and the second control apparatus 300 is a smart device. The application program may provide various controls for the User in an intuitive User Interface (UI) on a screen associated with the smart device through configuration. For example, the second control apparatus 300 may install a software application with the display device 200, implement connection communication through a network communication protocol, and implement the purpose of one-to-one control operation and data communication. Such as: it is possible to implement the establishment of a control command protocol with the display apparatus 200 using the second control device 300, synchronize the remote control keypad to the second control device 300, and implement the function of controlling the display apparatus 200 by controlling the user interface on the second control device 300. The audio and video content displayed on the second control device 300 can also be transmitted to the display device 200, so as to realize the synchronous display function.
As also shown in fig. 1, the display apparatus 200 performs data communication with the server 400 through various communication means. The display apparatus 200 may be allowed to establish a communication connection through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display apparatus 200. Illustratively, by sending and receiving information and interacting with an Electronic Program Guide (EPG), the display device 200 may receive software program updates or access a remotely stored digital media library. The server 400 may be one group of servers, multiple groups of servers, or one or more types of servers. Other network service content, such as video on demand and advertising services, may be provided through the server 400.
It should be noted that, when the display device is a mobile terminal, a tablet computer, a notebook computer, or other intelligent devices, the user may directly operate on the display device to control the display device, without controlling the display device through the control device.
Fig. 2 is a block diagram of a hardware configuration of a display device according to an embodiment of the present disclosure. As shown in fig. 2, the display device 200 includes a controller 210, a tuning demodulator 220, a communication interface 230, a detector 240, an input/output interface 250, a video processor 260-1, an audio processor 260-2, a display 280, an audio output 270, a memory 290, a power supply, and an infrared receiver.
The display 280 is used for receiving the image signal from the video processor 260-1 and displaying video content, images, and components of the menu manipulation interface. The display 280 includes a display screen for presenting pictures and a driving component for driving image display. The displayed video content may come from broadcast television content or from broadcast signals received via a wired or wireless communication protocol. Alternatively, various image contents sent by a network server and received via a network communication protocol may be displayed.
Meanwhile, the display 280 simultaneously displays a user manipulation UI interface generated in the display apparatus 200 and used to control the display apparatus 200.
And, a driving component for driving the display according to the type of the display 280. Alternatively, in case the display 280 is a projection display, it may also comprise a projection device and a projection screen.
The communication interface 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communication interface 230 may be a Wifi chip 231, a bluetooth communication protocol chip 232, a wired ethernet communication protocol chip 233, or other network communication protocol chips or near field communication protocol chips, and an infrared receiver (not shown in fig. 2).
The display apparatus 200 may establish control signal and data signal transmission and reception with an external control device or a content providing apparatus through the communication interface 230. And an infrared receiver, which may be an interface for receiving an infrared control signal of the first control device 100 (e.g., an infrared remote controller, etc.).
The detector 240 is a component used by the display device 200 to collect signals from the external environment or signals for interaction with the outside. The detector 240 includes a light receiver 242, a sensor for collecting the intensity of ambient light, so that display parameters can be adapted according to the collected ambient light.
The image acquisition device 241, such as a camera, may be used to collect the external environment scene and to collect attributes of the user or gestures used to interact with the user, so that display parameters can be changed adaptively and user gestures can be recognized, thereby implementing interaction with the user.
In some other exemplary embodiments, the detector 240 may further include a temperature sensor or the like, so that the display device 200 can adaptively adjust the display color temperature of the image by sensing the ambient temperature. For example, the display apparatus 200 may be adjusted to display a cooler tone when the ambient temperature is high, or a warmer tone when the ambient temperature is low.
In some other exemplary embodiments, the detector 240 may further include a sound collector, such as a microphone, which may be used to receive the user's sound, including a voice signal carrying a control instruction for controlling the display device 200, or to collect environmental sounds for identifying the type of environmental scene, so that the display device 200 can adapt to the environmental noise.
The input/output interface 250 enables data transmission between the display device 200 and other external devices under the control of the controller 210. Such as receiving video and audio signals or command instructions from an external device.
Input/output interface 250 may include, but is not limited to, the following: any one or more of high definition multimedia interface HDMI interface 251, analog or data high definition component input interface 253, composite video input interface 252, USB input interface 254, RGB ports (not shown in fig. 2), etc.
In some other exemplary embodiments, the input/output interface 250 may also form a composite input/output interface with the above-mentioned plurality of interfaces.
The tuning demodulator 220 receives the broadcast television signals in a wired or wireless receiving manner, may perform modulation and demodulation processing such as amplification, frequency mixing, resonance, and the like, and demodulates the television audio and video signals carried in the television channel frequency selected by the user and the EPG data signals from a plurality of wireless or wired broadcast television signals.
The tuner demodulator 220 is responsive to the user-selected television signal frequency and the television signal carried by the frequency, as selected by the user and controlled by the controller 210.
The tuner-demodulator 220 may receive signals in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcast, cable broadcast, satellite broadcast, or internet broadcast signals, etc.; and according to different modulation types, the modulation mode can be digital modulation or analog modulation. Depending on the type of television signal received, both analog and digital signals are possible.
In other exemplary embodiments, the tuner/demodulator 220 may be in an external device, such as an external set-top box. In this way, the set-top box outputs television audio/video signals after modulation and demodulation, and the television audio/video signals are input into the display device 200 through the input/output interface 250.
The video processor 260-1 is configured to receive an external video signal and, according to the standard codec protocol of the input signal, perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, so as to obtain a signal that can be displayed or played directly on the display device 200.
Illustratively, the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio and video data stream; for example, if an MPEG-2 stream is input, the demultiplexing module demultiplexes it into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
The image synthesis module is used for superimposing and mixing the GUI signal, generated by the graphics generator according to user input, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, for example converting a 60 Hz frame rate into a 120 Hz or 240 Hz frame rate, typically by means of frame interpolation.
The display formatting module is used for converting the frame-rate-converted video output signal into a signal that conforms to the display format, for example outputting an RGB data signal.
The audio processor 260-2 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, amplification processing, and the like to obtain an audio signal that can be played in the speaker.
In other exemplary embodiments, video processor 260-1 may comprise one or more chips. The audio processor 260-2 may also comprise one or more chips.
And, in other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or may be integrated together with the controller 210 in one or more chips.
The audio output 270 receives the sound signal output by the audio processor 260-2 under the control of the controller 210. In addition to the speaker 272 carried by the display device 200 itself, the audio output 270 may include an external sound output terminal 274 that can output sound to a sound-producing device of an external device, such as an external sound interface or an earphone interface.
The power supply provides power supply support for the display device 200 from the power input from the external power source under the control of the controller 210. The power supply may include a built-in power supply circuit installed inside the display device 200, or may be a power supply interface installed outside the display device 200 to provide an external power supply in the display device 200.
A user input interface for receiving an input signal of a user and then transmitting the received user input signal to the controller 210. The user input signal may be a remote controller signal received through an infrared receiver, and various user control signals may be received through the network communication module.
For example, when the user inputs a user command through the first control device 100 or the second control device 300, the user input interface forwards the user input to the controller 210, and the display apparatus 200 responds to the user input.
In some embodiments, a user may enter a user command at a Graphical User Interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the GUI. Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
The controller 210 controls the operation of the display apparatus 200 and responds to the user's operation through various software control programs stored in the memory 290.
As shown in fig. 2, the controller 210 includes a random access memory (RAM) 213, a read-only memory (ROM) 214, a graphics processor 216, a central processing unit (CPU) 212, and a communication interface 218, such as a first interface 218-1 through an n-th interface 218-n, as well as a communication bus. The RAM 213, the ROM 214, the graphics processor 216, the CPU 212, and the communication interface 218 are connected via the bus.
The ROM 214 stores instructions for various system boots. When the display apparatus 200 is powered on upon receiving a power-on signal, the CPU 212 executes the system boot instructions in the ROM, copies the operating system stored in the memory 290 to the RAM 213, and starts running the operating system. After the operating system has finished starting, the CPU 212 copies the various application programs in the memory 290 to the RAM 213 and then starts running the various application programs.
The graphics processor 216 is used for generating various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer, which generates the various objects based on the results of the arithmetic unit and displays the rendered result on the display 280.
CPU212 is operative to execute operating system and application program instructions stored in memory 290. And executing various application programs, data and contents according to various interactive instructions received from the outside so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU 212 may include a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor is used for performing some operations of the display apparatus 200 in a pre-power-up mode and/or for displaying a picture in the normal mode. The one or more sub-processors are used for operations in a standby mode and the like.
The controller 210 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
Wherein the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
The memory 290 includes a memory for storing various software modules for driving the display device 200. Such as: various software modules stored in memory 290, including: the system comprises a basic module, a detection module, a communication module, a display control module, a browser module, various service modules and the like.
The basic module is a bottom layer software module for processing signal communication between hardware in the display device 200 and sending processing and control signals to an upper layer module. The detection module is used for collecting various information from various sensors or user input interfaces, and the management module is used for performing digital-to-analog conversion and analysis management.
For example: the voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module is a module for controlling the display 280 to display image content, and may be used to play information such as multimedia image content and UI interface. And the communication module is used for carrying out control and data communication with external equipment. And the browser module is used for executing a module for data communication between browsing servers. And the service module is used for providing various services and modules including various application programs.
Meanwhile, the memory 290 is also used to store visual effect maps and the like for receiving external data and user data, images of respective items in various user interfaces, and a focus object.
Fig. 3 is a block diagram of a configuration of a first control device according to an embodiment of the present application. As shown in fig. 3, the first control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory 190, and a power supply 180.
The first control apparatus 100 is configured to control the display device 200 and may receive an input operation instruction of a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200. Such as: the user operates the channel up/down key of the first control apparatus 100, and the display device 200 responds to the channel up/down operation.
In some embodiments, the first control apparatus 100 may be a smart device. Such as: the first control apparatus 100 may install various applications that control the display device 200 according to user demands.
In some embodiments, as shown in fig. 1, the second control apparatus 300 or other intelligent electronic device may function similarly to the first control apparatus 100 after installing an application for manipulating the display device 200. Such as: the user can implement the function of the physical key of the first control apparatus 100 by installing an application, various function keys or virtual buttons of a graphic user interface available on the second control apparatus 300 or other intelligent electronic devices.
The controller 110 includes a processor 112, a RAM 113, a ROM 114, a communication interface, and a communication bus. The controller 110 is used to control the operation of the first control apparatus 100, the communication and coordination among its internal components, and the external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communication interface 130 may include at least one of a WiFi chip, a bluetooth module, an NFC module, and other near field communication modules.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, and other input interfaces. Such as: the user can realize a user instruction input function through actions such as voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display apparatus 200. In some embodiments, the interface may be an infrared interface or a radio frequency interface. For example, when an infrared signal interface is used, a user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and then sent to the display device 200 through the infrared sending module. For another example, when a radio frequency signal interface is used, a user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 through the radio frequency sending terminal.
In some embodiments, the first control device 100 includes at least one of a communication interface 130 and an output interface. The first control device 100 is configured with a communication interface 130, such as: the WiFi, bluetooth, NFC, etc. modules may transmit the user input command to the display device 200 through the WiFi protocol, or the bluetooth protocol, or the NFC protocol code.
The memory 190 stores, under the control of the controller 110, various operation programs, data, and applications for driving and controlling the first control apparatus 100. The memory 190 may store various control signal commands input by the user.
The power supply 180 provides operational power support for the components of the first control apparatus 100 under the control of the controller 110, and may include a battery and associated control circuitry.
Fig. 4 is a schematic functional configuration diagram of a display device according to an embodiment of the present application. As shown in fig. 4, the memory 290 is used to store an operating system, an application program, contents, user data, and the like, and performs system operations for driving the display device 200 and various operations in response to a user under the control of the controller 210. The memory 290 may include volatile and/or nonvolatile memory.
The memory 290 is specifically configured to store an operating program for driving the controller 210 in the display device 200, and to store various application programs installed in the display device 200, various application programs downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used to store system software such as an OS kernel, middleware, and applications, and to store input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers and related data such as the audio/video processors 260-1 and 260-2, the display 280, the communication interface 230, the tuning demodulator 220, the input/output interface of the detector 240, and the like.
In some embodiments, the memory 290 may store software and/or programs, such as software programs representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290, for example, includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, and other applications 2912, a browser module, and the like. The controller 210 performs functions such as: a broadcast television signal reception demodulation function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal reception function, an electric power control function, a software control platform supporting various functions, a browser function, and the like.
Fig. 5 is a block diagram of a configuration of a software system in a display device according to an embodiment of the present application. As shown in fig. 5, the operating system 2911 includes operating software for handling various basic system services and performing hardware-related tasks, and acts as an intermediary for data processing between application programs and hardware components. In some embodiments, part of the operating system kernel may contain a series of software used to manage the hardware resources of the display device and provide services to other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display screen, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application program to achieve accessibility and operability of the application program for displaying content.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
The user interface module 2911-3 is configured to provide an object for displaying a user interface, so that each application program can access the object, and user operability can be achieved.
Control applications 2911-4 for controllable process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911 or within the application program 2912; in some embodiments, it is implemented partly within the operating system 2911 and partly within the application program 2912. It is configured to listen for various user input events and, according to the recognition of various types of events or sub-events, to invoke handlers that perform one or more predefined sets of operations.
The event monitoring module 2914-1 is configured to monitor an event or a sub-event input by the user input interface.
The event identification module 2914-1 is configured to input definitions of various types of events for various user input interfaces, identify various events or sub-events, and transmit the same to a process for executing one or more corresponding sets of processes.
Here, the event or sub-event refers to an input detected by one or more sensors in the display apparatus 200 and an input of an external control device (e.g., the first control device 100 or the second control device 300). Such as: the method comprises the following steps of inputting various sub-events through voice, inputting gestures through gesture recognition, inputting sub-events through remote control key commands of the control device and the like. For example, the one or more sub-events in the remote control may include various forms, including but not limited to one or a combination of key press up/down/left/right, ok key, key press, etc., and non-physical key operation, such as move, press, release, etc.
The interface layout manager 2913, directly or indirectly receiving the input events or sub-events from the event transmission system 2914, monitors the input events or sub-events, and updates the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, and the size, position, and level of the container, and other various execution operations related to the layout of the interface.
Fig. 6 is a flowchart of a video playing method according to an embodiment of the present application. The method may be applied to a target display device, which may be the display device shown in fig. 1 or fig. 2. As shown in fig. 6, the method may include:
step 601, receiving a target video sent by a server, where the server is used to send the target video to multiple display devices, the multiple display devices are used to play the target video synchronously, and the target display device is any one of the multiple display devices.
Step 602, in the process of receiving the target video, playing the received part of the target video at the first playing speed.
Step 603, determining an absolute value of a time difference between the playing progress of the target video and a theoretical playing progress, wherein the theoretical playing progress is sent to the plurality of display devices by the server.
And step 604, adjusting the playing progress of the target video when the absolute value is larger than the first time length.
To sum up, in the video playing method provided in the embodiments of the present application, the display device may adjust the playing progress of the target video when the absolute value of the time difference between the playing progress of the target video and the theoretical playing progress is greater than the first time length, so that this time difference becomes small. Because each display device in the virtual auditorium plays the target video in this way, the difference between the playing progresses of the display devices can be reduced.
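For illustration only, the following Python sketch shows one way a target display device could carry out steps 603 and 604. The text above does not specify how the playing progress is adjusted; the seek-based adjustment, the 2-second value used for the first time length, and the Player interface are assumptions introduced here.

```python
from typing import Protocol


class Player(Protocol):
    """Assumed minimal playback interface on the target display device."""

    def get_position(self) -> float:
        """Current playing progress of the target video, in seconds."""
        ...

    def seek(self, position_s: float) -> None:
        """Jump to the given playing progress, in seconds."""
        ...


def check_and_adjust_progress(player: Player,
                              theoretical_progress_s: float,
                              first_time_length_s: float = 2.0) -> None:
    """Steps 603-604: compare the local playing progress with the theoretical
    playing progress sent by the server, and adjust the local progress when the
    absolute value of the time difference exceeds the first time length."""
    diff_s = player.get_position() - theoretical_progress_s  # step 603
    if abs(diff_s) > first_time_length_s:                    # step 604
        # One possible adjustment (not mandated by the text above): jump
        # directly to the theoretical playing progress.
        player.seek(theoretical_progress_s)
```

In practice the adjustment could also be made by temporarily raising or lowering the playing speed rather than seeking; the sketch only illustrates the comparison in step 603 and the threshold check in step 604.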
Fig. 7 is a flowchart of a video playing method according to an embodiment of the present application. The method may be applied to the video playing system shown in fig. 1, where the video playing system includes a first display device and a second display device, the number of the second display devices may be one or more, and the target display device in this embodiment may be any one of the first display device and the second display device. As shown in fig. 7, the method may include:
step 701, the first display device sends a request for creating a auditorium to the server, where the request for creating the auditorium carries an identifier of the target video.
Alternatively, the first display device may send a theater creation request to the server upon acquiring the creation instruction. Optionally, the auditorium creation request may carry an identification of the target video and an identification of the second display device.
When a user of a first display device needs to invite a user of a second display device to synchronously watch a target video, the first display device may be triggered to initiate the creation of a virtual auditorium. For example, a user of a first display device may select or enter an identification of a target video and an identification of a second display device on the first display device and trigger the first display device to generate a auditorium creation request for instructing a server to create an auditorium service based on the identification of the target video and the identification of the second display device. Thereafter, the user may operate the first display device to generate a creation instruction, and upon generation of the creation instruction, send an auditorium creation request to the server through a communication connection with the server. Alternatively, the user may select only the identifier of one second display device, or may select the identifiers of a plurality of second display devices. Optionally, the number of identifiers of the second display device selected by the user may or may not have an upper limit (e.g., 5, 6, or other values).
For example, when a user needs to create a virtual auditorium, the user may trigger the first display device to start the auditorium function, and the first display device may present a creation interface on which the user may select the identifier of the target video and the identifier of the second display device. Fig. 8 is a schematic diagram of a creation interface provided by an embodiment of the present application, and as shown in fig. 8, the creation interface 810 includes an addition control 811, an invitation control 812, a creation control 813, and a cancellation control 814. The user may click the add control 811 to trigger the first display device to determine the identity of the target video and the user may click the invite control 812 to trigger the first display device to determine the identity of the second display device. Thereafter, the user may click on the create control 813 to trigger the first display device to generate a create instruction, or the user may click on the cancel control 814 to cancel the creation of the virtual auditorium. Optionally, when the first display device is an intelligent television, the user may click a control in the creation interface through a remote controller or another control device connected to the first display device.
Optionally, the process of the first display device determining the identity of the target video may include: after the user clicks the add control 811, the first display device displays a candidate video interface including an identification of at least one candidate video (e.g., a video name or a poster of the video, etc.). The user may select an identifier of any one of the identifiers of the at least one candidate video, and the first display device may determine the identifier of the candidate video selected by the user as the identifier of the target video, and determine the candidate video indicated by the identifier of the candidate video as the target video. The first display device may also display an identification of the target video at the location where the control 811 is added in the creation interface.
Optionally, the first display device may be logged with a social account of the user, and the social account may be a user identifier of the first display device. The process of the first display device determining the identity of the second display device may comprise: after the user clicks the invitation control 812, the first display device displays a friend list corresponding to the social account of the user, where the friend list includes at least one candidate account. The user may select any one or more candidate accounts of the at least one candidate account, and the first display device may determine, as the second display device, all the display devices logged in with the candidate accounts, and determine an identifier of the second display device. The first display device may also display an identification of the second display device at a location in the creation interface where the invitation control 812 is located, such as displaying a selected candidate account or a label of the candidate account (e.g., a user avatar).
The identification of the display device may comprise a device identification or a user identification of the display device. For example, the device identifier of the second display device may include a unique identifier that is set when the second display device is shipped from a factory, and the user identifier of the second display device may include a user account (e.g., a social account of the user) logged in on the second display device. For example, in this embodiment of the application, the first display device may directly use the candidate account selected by the user as the identifier of the second display device, where the identifier is the user identifier of the second display device.
Optionally, as shown in FIG. 8, the creation interface may also include the name "XX auditorium" of the auditorium, which may be customized by the user of the first display device and may be changed by the user at any time.
Optionally, in step 701, the controller of the first display device may generate the auditorium creation request and then send the auditorium creation request to the server through the communication interface.
Step 702, the server creates an auditorium service according to the auditorium creation request.
After receiving the auditorium creation request sent by the first display device, the server may create an auditorium service according to the identifier of the target video carried by the auditorium creation request. Optionally, the server may further generate a binding relationship between the identifier of the first display device and the identifier of the auditorium service, and according to the binding relationship, the server may determine that the first display device is a display device accessing the auditorium service, that is, a display device in the virtual auditorium corresponding to the auditorium service. The respective display devices accessing the auditorium service may be used to play the target video corresponding to the auditorium service synchronously.
Step 703, the server sends the identifier of the auditorium service and the target video to the first display device.
The server can also allocate identifiers for the auditorium services after the auditorium services are created so as to distinguish the auditorium services in the server, wherein different auditorium services correspond to different identifiers. Alternatively, the identification of the auditorium service may be a string of characters randomly assigned by the server, or a number of the auditorium service ordered by the time of creation.
Optionally, the auditorium creation request sent by the first display device may further carry an identifier of the first display device, and after the auditorium service is created, the server may return the identifier of the auditorium service and the target video to the first display device according to the identifier of the first display device. Alternatively, the server may directly transmit the target video to the first display device, or the server may also send a storage address of the target video to the first display device to instruct the first display device to acquire the target video according to the storage address.
Optionally, in step 703, the server may send the identifier of the auditorium service and the target video to the communication interface of the first display device, and the controller of the first display device may obtain the identifier of the auditorium service and the target video received by the communication interface.
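A rough Python sketch of how the server side of steps 702 and 703 might be organized is shown below. The data structures, the UUID-based auditorium identifier, and the example storage address are assumptions for illustration; the description only requires that the server create the auditorium service, record which display devices are bound to it, and return the identifier of the auditorium service together with the target video (or its storage address).

```python
import uuid
from dataclasses import dataclass, field


@dataclass
class AuditoriumService:
    """Illustrative server-side record of one virtual auditorium."""
    auditorium_id: str
    target_video_id: str
    member_device_ids: set[str] = field(default_factory=set)  # bound display devices


auditoriums: dict[str, AuditoriumService] = {}  # auditorium_id -> service


def handle_auditorium_creation_request(first_device_id: str,
                                       target_video_id: str) -> dict:
    """Steps 702-703: create the auditorium service, bind the first display
    device to it, and return the auditorium identifier and the target video
    (represented here by a hypothetical storage address)."""
    auditorium_id = str(uuid.uuid4())               # identifier allocated by the server
    service = AuditoriumService(auditorium_id, target_video_id)
    service.member_device_ids.add(first_device_id)  # binding relationship of step 702
    auditoriums[auditorium_id] = service
    return {
        "auditorium_id": auditorium_id,
        # The server may send the video itself or its storage address (step 703).
        "target_video_url": f"https://example.com/videos/{target_video_id}",
    }
```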
Step 704, the server sends invitation information to the second display device according to the auditorium creation request, wherein the invitation information carries the identification of the auditorium service.
After receiving the auditorium creation request sent by the first display device and allocating the identifier for the auditorium service, the server may send invitation information carrying the identifier of the auditorium service to the second display device according to the identifier of the second display device carried in the auditorium creation request. Optionally, the invitation information may also carry an identification of the auditorium service, an identification of the first display device and an identification of the target video. The invitation information is used to invite the second display device to access the auditorium service to play the target video with the first display device.
Alternatively, the invitation information may be a prompt, invitation code or link, etc. The server sends the invitation information to the second display device, and then the second display device receives the invitation information, and the display screen of the second display device can display the invitation information.
Exemplarily, fig. 9 is a schematic diagram of a display interface of a second display device provided in an embodiment of the present application. As shown in fig. 9, the invitation information is a prompt message, which may be, for example, "Xiao Ming invites you to join auditorium XX to watch video xxx. Please click an option below to confirm whether to accept." Here, "Xiao Ming" may be the user identifier of the first display device, "auditorium XX" may be the identifier of the auditorium service, and "video xxx" may be the identifier of the target video, such as the name of the target video. For another example, if the invitation information is an invitation code, the second display device may query the information of the corresponding auditorium service according to the invitation code, and then determine whether to access the auditorium service. As another example, if the invitation information is a link, the user may directly click the link to make the second display device display the information of the auditorium service. The information of the auditorium service may include: an identifier of the display device that requested the server to create the auditorium service (e.g., the identifier of the first display device), and an identifier of the video corresponding to the auditorium service (e.g., the identifier of the target video).
Optionally, in step 704, the server may send the invitation information to a communication interface of the second display device, and the controller of the second display device may control the display screen to display the invitation information according to the invitation information.
Step 705, the second display device sends an auditorium access request to the server, where the auditorium access request carries an identifier of the auditorium service.
Optionally, the auditorium access request may carry an identification of the auditorium service and an identification of the second display device.
For example, after viewing the invitation information displayed by the second display device, the user may operate the second display device to trigger it to generate an auditorium access request and send the request to the server. With continued reference to fig. 9, the second display device may also display selection buttons when displaying the invitation information; for example, the selection buttons may include an "accept" button and a "cancel" button. When the user clicks the "accept" button, the second display device may determine that a trigger operation for generating an auditorium access request has been detected, generate the auditorium access request, and send it to the server. Optionally, the user of the second display device may also click the "cancel" button, in which case the second display device may stop displaying the invitation information and resume displaying the original interface.
Optionally, the controller of the second display device may generate the auditorium access request in step 705 and send the auditorium access request to the server via the communication interface.
Step 706, the server sends the target video to the second display device according to the auditorium access request.
After receiving the auditorium access request sent by the second display device, the server may determine the target video corresponding to the auditorium service according to the identifier of the auditorium service carried by the auditorium access request, and further send the target video to the second display device. The manner in which the server sends the target video to the second display device may refer to the record related to the server sending the target video to the first display device in step 703, which is not described in detail in this embodiment of the present application.
Optionally, the server may send, to the second display device, the part of the target video after the current playing progress of the first display device according to the playing progress of the first display device on the target video, or the server may send, to the second display device, the part of the target video after the theoretical playing progress, so as to ensure that the difference between the playing progresses of the first display device and the second display device is small. For example, if the second display device sends an auditorium access request to the server at a first moment, and the first display device has played to the 20th minute of the target video at this moment, that is, the playing progress of the first display device is the 20th minute, the server may send the portion of the target video after the 20th minute to the second display device.
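A minimal sketch of this choice of starting offset is given below, assuming the server tracks playing progress in seconds and the target video is addressable by time offset; both are assumptions for illustration, not details given by the embodiment.

def start_offset_for_new_member(first_device_progress_s, theoretical_progress_s=None):
    # Return the offset (in seconds) from which the server starts sending the
    # target video to a newly joined second display device: the theoretical
    # playing progress if it is the reference in use, otherwise the first
    # display device's current playing progress.
    if theoretical_progress_s is not None:
        return theoretical_progress_s
    return first_device_progress_s

# Example from the text: the first display device has played to the 20th
# minute, so the server sends the portion of the target video after 1200 s.
offset_s = start_offset_for_new_member(first_device_progress_s=20 * 60)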
It should be noted that, in this embodiment, the server sends the target video to the first display device after the auditorium service is created, and sends the part of the target video after the current playing progress of the first display device to the second display device after receiving the auditorium access request sent by the second display device. Optionally, the server may instead wait until it receives the auditorium access request sent by the second display device and then send the target video to the first display device and the second display device at the same time, so as to ensure that the difference between the playing progresses of the first display device and the second display device is small and that the user of the second display device can view the complete target video.
Alternatively, when the video playback system includes a plurality of second display devices, the server may transmit the target video to the first display device and all of the second display devices after receiving the auditorium access requests sent by all of the second display devices. Alternatively, the server may transmit the target video to the first display device and some of the second display devices after receiving the auditorium access requests sent by those second display devices, and, after receiving the auditorium access requests sent by the remaining second display devices, send to them the part of the target video after the theoretical playing progress (or the part after the playing progress of any display device in the virtual auditorium).
Optionally, when the server receives the auditorium access request of the second display device for the auditorium service, the server may further generate a binding relationship between the identifier of the second display device and the identifier of the auditorium service. According to the binding relationship, the server may add the second display device to the virtual auditorium corresponding to the auditorium service, and further send the target video to the second display device.
Optionally, in step 706, the controller of the second display device may receive the target video sent by the server through the communication interface.
Step 707, the first display device plays the received portion of the target video at the first play speed during the process of receiving the target video.
The server sequentially sends the frame images in the target video and the audio corresponding to the frame images to the first display device according to the display sequence of the frame images in the target video. After receiving at least a portion of the target video, the first display device may begin playing the received portion of the target video and may play the received portion at a default first play speed. Illustratively, the first play speed may be a speed at which 24 frames of images are played per second. The first playing speed may also be other customized playing speeds, such as a speed of playing 26 frames or 30 frames of images per second, which is not limited in the embodiment of the present application.
Optionally, in step 707, the controller of the first display device may control the display screen of the first display device to play the received portion of the target video at the first play speed during the receiving of the target video through the communication interface.
Step 708, the first display device determines a first absolute value of a time difference between a first playing progress of the target video and the theoretical playing progress.
It should be noted that, in order to distinguish the playing schedules of the target video on the first display device and the second display device, in the embodiment of the present application, the playing schedule of the target video on the first display device is referred to as a first playing schedule, an absolute value of a time difference between the first playing schedule and the theoretical playing schedule is referred to as a first absolute value, the playing schedule of the target video on the second display device is referred to as a second playing schedule, and an absolute value of a time difference between the second playing schedule and the theoretical playing schedule is referred to as a second absolute value.
Optionally, the first display device may determine the current playing progress at fixed intervals, and further determine the first absolute value of the time difference between the current playing progress and the theoretical playing progress. Optionally, the intervals between two successive determinations of the first absolute value by the first display device may also differ, which is not limited in the embodiment of the present application.
In some embodiments, the server sends the theoretical playing progress to a plurality of display devices accessing the service of the auditorium, and the plurality of display devices adjust the playing progress of the plurality of display devices according to the same theoretical playing progress.
In some embodiments, the theoretical play progress is: and when all the target videos are acquired, the progress of playing the target videos at the first playing speed is obtained by the first display device which is used for receiving the target videos in the multiple display devices for synchronously playing the target videos. The first display device to receive the target video may be the first display device to send the auditorium creation request to the server.
In some embodiments, the theoretical play progress is: after the target video service is started, the server increases the theoretical playing progress along with the increase of the starting time of the target video service according to the value determined by the starting time point of the target video service. The target video service may be the auditorium service.
Illustratively, the first display device may determine the theoretical playing progress through the following three alternative implementations:
In a first implementation manner, the first display device may determine the theoretical playing progress of the target video according to the starting playing time of the target video on the first display device and the first playing speed. If the first playing speed is the standard 1x playing speed, the first display device may determine the time difference between the current time and the starting playing time as the theoretical playing progress of the target video. For example, if the first display device determines that the starting playing time of the target video is 8:00, the first display device may determine that the theoretical playing progress of the target video at 8:01 is the first minute of the target video, that is, the content of the first minute of the target video should theoretically be playing at 8:01. Optionally, if the first playing speed is not the standard 1x playing speed, for example if it is a 2x playing speed, the first display device may determine twice the time difference between the current time and the starting playing time as the theoretical playing progress of the target video. Theoretical playing progresses corresponding to other first playing speeds can be obtained in the same way, which is not described again in the embodiments of the present application (a simple calculation for this implementation manner is sketched after the third implementation manner below).
In a second implementation manner, the server may record the creation completion time of the auditorium service, and the first display device may also determine the theoretical playing progress of the target video according to the creation completion time and the first playing speed. Compared with the first implementation manner, the implementation manner is only to change the starting playing time of the target video on the first display device to the creation completion time of the auditorium service, and the specific manner of determining the theoretical playing progress is the same as the first implementation manner, and is not described again in this embodiment of the application.
In a third implementation manner, the first display device may also determine the playing progress of any second display device as a theoretical playing progress; or the first display device may also directly determine the playing progress of the first display device as the theoretical playing progress.
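For illustration only, the first implementation manner can be written as the simple calculation sketched below; the function name, the use of seconds and the speed-multiplier convention are assumptions rather than limitations of the method.

import time

def theoretical_progress_s(start_play_time_s, speed_multiplier=1.0, now_s=None):
    # First implementation manner: the theoretical playing progress grows with
    # the wall-clock time elapsed since the starting playing time, scaled by
    # the playing-speed multiplier (1.0 for standard speed, 2.0 for double speed).
    if now_s is None:
        now_s = time.time()
    return max(0.0, (now_s - start_play_time_s) * speed_multiplier)

# Example from the text: playback starts at 8:00 (taken as t = 0 here), so at
# 8:01 the theoretical progress at standard speed is 60 seconds, i.e. the
# first minute of the target video.
assert theoretical_progress_s(0.0, 1.0, now_s=60.0) == 60.0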
Alternatively, step 708 may be performed by a controller in the first display device.
Step 709, when the first absolute value is greater than the first duration, the first display device adjusts the first playing progress of the target video.
The first display device may adjust the first playing progress of the target video once each time it determines that the first absolute value is greater than the first duration, until the first absolute value is less than or equal to the first duration. Optionally, a single adjustment of the playing progress may directly make the first absolute value less than or equal to the first duration; or, a single adjustment may not achieve this, in which case the first display device may continue to adjust the first playing progress after it next determines the first absolute value.
Alternatively, the first duration may be positively correlated with the time interval at which the first display device determines the first absolute value. The longer the playing time, the higher the probability that a difference in playing progress occurs; and the longer the first duration, the larger the difference in playing progress allowed between display devices, so the first display device may determine the first absolute value at longer intervals. Conversely, the smaller the first duration, the smaller the allowed difference in playing progress between display devices, so the first display device may determine the first absolute value at shorter intervals. When the first duration is longer, the first display device can be prevented from frequently determining the first absolute value and adjusting the first playing progress; when the first duration is shorter, the difference in playing progress between the display devices in the virtual auditorium (that is, the display devices accessing the auditorium service) can be kept small, ensuring the effect of the chat-while-watching function of the display devices.
Optionally, the first display device may obtain a first duration preset by developers, or the first duration may be flexibly adjusted by the user. For example, the first duration may be 5 seconds; a difference in playing progress within 5 seconds generally has little influence on users' communication about the video, and the frequency at which the first display device determines the first absolute value and adjusts the first playing progress is not too high. Alternatively, the first duration may be another value, such as 3 seconds, 4 seconds or 6 seconds.
The first display device may adjust the first playing progress of the target video in a plurality of ways, and the embodiment of the present application is explained by taking two alternative implementation manners as examples.
In an alternative implementation manner, the first display device may directly jump from the current playing progress of the target video to the theoretical playing progress. Illustratively, if at 8:00 the first display device determines that the first playing progress of the target video is 20 minutes 5 seconds and the theoretical playing progress of the target video is 20 minutes 15 seconds, the first display device may skip the portion of the target video between 20 minutes 5 seconds and 20 minutes 15 seconds and play the portion of the target video immediately after 20 minutes 15 seconds.
In another alternative implementation manner, the first display device plays the part of the target video after the current playing progress at a second playing speed, and resumes playing the part of the target video after the current playing progress at the first playing speed when the first absolute value is less than or equal to a second duration. The second playing speed is greater than the first playing speed, and the second duration is less than or equal to the first duration. This alternative implementation corresponds to fast-forward or double-speed playback of the target video.
Illustratively, the first duration is 5 seconds, the second duration is 0, the first playing speed is a speed of displaying 24 frames of images per second, and the second playing speed is a speed of displaying 48 frames of images per second, which is equivalent to playing the target video at double speed; the first playing speed is restored when the first playing progress is consistent with the theoretical playing progress. Suppose the first display device plays the target video at the first playing speed, and at 8:00:00 it determines that the first playing progress of the target video is 20 minutes 5 seconds while the theoretical playing progress of the target video is 20 minutes 20 seconds; the first display device may then increase the playing speed of the target video and play it at the second playing speed. Assuming that the first display device next determines the first absolute value at 8:00:15, by which time it has played at the second playing speed from 20 minutes 5 seconds, the first playing progress and the theoretical playing progress of the first display device are both 20 minutes 35 seconds, and the first absolute value is 0. The first display device may then restore the first playing speed and display 24 frames of images per second.
It should be noted that, if the first display device determines that the first absolute value is still greater than the second duration the next time it determines the first absolute value, the first display device may continue to play the target video at the second playing speed, and play the target video at the first playing speed again only when it determines that the first absolute value is less than or equal to the second duration. In this example, the second duration is taken as 0 for description; optionally, the second duration may also be equal to the first duration, for example 5 seconds, or the second duration may be 3 seconds, 2 seconds or another value, which is not limited in the embodiment of the present application.
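The two adjustment manners above can be sketched together as follows; the thresholds mirror the example (first duration of 5 seconds, second duration of 0), while the Player interface and the assumption that the display device lags behind the theoretical progress are illustrative only and not part of the claimed method.

FIRST_DURATION_S = 5.0    # allowed progress difference before adjustment
SECOND_DURATION_S = 0.0   # difference at which catch-up playback ends
FIRST_SPEED = 1.0         # e.g. 24 frames of images per second
SECOND_SPEED = 2.0        # e.g. 48 frames of images per second

class Player:
    # Minimal stand-in for the playback controller of a display device.
    def __init__(self):
        self.position_s = 0.0
        self.speed = FIRST_SPEED
    def seek_to(self, t_s):
        self.position_s = t_s
    def set_speed(self, multiplier):
        self.speed = multiplier

def adjust_progress(player, current_s, theoretical_s, catch_up=True):
    gap = abs(current_s - theoretical_s)
    if catch_up:
        # Second implementation manner: play faster once the gap exceeds the
        # first duration, and restore the first playing speed once the gap has
        # shrunk to the second duration or less.
        if gap > FIRST_DURATION_S:
            player.set_speed(SECOND_SPEED)
        elif gap <= SECOND_DURATION_S:
            player.set_speed(FIRST_SPEED)
    else:
        # First implementation manner: jump directly to the theoretical
        # playing progress whenever the gap exceeds the first duration.
        if gap > FIRST_DURATION_S:
            player.seek_to(theoretical_s)

# Example from the text: played 20 min 5 s, expected 20 min 20 s -> speed up.
p = Player()
adjust_progress(p, current_s=20 * 60 + 5, theoretical_s=20 * 60 + 20)
assert p.speed == SECOND_SPEED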
Optionally, in step 709, the controller of the first display device may adjust the first playing progress of the target video when the first absolute value is greater than the first duration, and control the display screen to play the target video at the adjusted playing progress.
Step 710, the first display device sends an acquisition request of the status indication information to the server.
Optionally, each display device (e.g., the first display device) in the virtual auditorium may also send an acquisition request for the status indication information to the server during the process of playing the target video. The acquisition request may carry the identification of the display device and the identification of the auditorium service. Alternatively, the first display device may send the acquisition request to the server at fixed intervals.
Alternatively, the controller of the first display device may generate the acquisition request and send the acquisition request to the server through the communication interface in step 710.
Step 711, the server sends the status indication information to the first display device according to the obtaining request.
When receiving an acquisition request sent by a display device, the server may determine the state of the auditorium service according to the identifier of the auditorium service carried by the acquisition request. The state of the auditorium service may be a stopped state or a running state: the state is the running state when the auditorium service is operating normally, and the stopped state when the auditorium service has been stopped for some reason. Further, the server may send the status indication information to the display device indicated by the identifier of the display device carried in the acquisition request. The status indication information is used for indicating whether the state of the auditorium service is the stopped state or the running state, and may carry the identification of the auditorium service.
Optionally, in step 711, the server may send the status indication information to the communication interface of the first display device according to the obtaining request.
Step 712, when the status indication information received by the first display device indicates that the state of the auditorium service is the stopped state, the first display device stops playing the target video.
When receiving the status indication information, the first display device may determine, according to the status indication information, whether the auditorium service is operating normally. When the status indication information indicates that the state of the auditorium service is the stopped state, the first display device can determine that the auditorium service is not operating normally, that the display devices accessing the auditorium service cannot synchronously play the target video normally, and that communication messages cannot be transmitted normally among the users of the display devices; the first display device may then stop accessing the auditorium service and may also stop playing the target video. Optionally, the first display device may also only stop accessing the auditorium service and continue playing the target video when triggered by the user, which is not limited in the embodiment of the present application.
Optionally, when the status indication information received by the first display device indicates that the state of the auditorium service is the stopped state, the first display device may further display exit reminding information to remind the user that the auditorium service has stopped and that the display devices accessing the auditorium service cannot continue to communicate about the target video.
It should be noted that the above steps 710 to 712 are described by taking as an example the case where the display device actively sends an acquisition request for the status indication information to the server and the server sends the status indication information in response. Alternatively, the server may monitor the state of the auditorium service and, when determining that the state of the auditorium service is the stopped state, send, to the display devices accessing the auditorium service, status indication information indicating that the state of the auditorium service is the stopped state. In this case, the first display device may not need to perform step 710.
Alternatively, in step 712, the controller may determine whether the status indication information received through the communication interface indicates that the state of the auditorium service is the stopped state, and control the display screen of the first display device to stop playing the target video when the status indication information indicates that the state of the auditorium service is the stopped state.
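Steps 710 to 712 amount to a simple polling loop on the display device side; in the sketch below, the request_status() call, the polling interval and the on_stopped() callback are all assumptions, since the embodiment does not specify how, or how often, the acquisition request is sent.

import time

STOPPED, RUNNING = "stopped", "running"

def poll_auditorium_status(request_status, auditorium_id, device_id,
                           on_stopped, interval_s=10.0, max_polls=None):
    # Periodically send an acquisition request for the status indication
    # information (step 710), read the server's reply (step 711), and stop
    # playing the target video once the auditorium service is reported as
    # stopped (step 712), optionally showing exit reminding information.
    polls = 0
    while max_polls is None or polls < max_polls:
        status = request_status(auditorium_id, device_id)  # hypothetical call
        if status == STOPPED:
            on_stopped()
            return
        polls += 1
        time.sleep(interval_s)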
Step 713, the second display device plays the received portion of the target video at the first play speed during the process of receiving the target video.
It should be noted that the second display device in step 713 may perform the same action as the first display device in step 707, and step 707 may be referred to in step 713, and the embodiments of the present application are not described again.
Step 714, the second display device determines a second absolute value of the time difference between the second playing progress of the target video and the theoretical playing progress.
It should be noted that, when determining the theoretical playing progress, the second display device may adopt any of the three optional implementation manners by which the first display device determines the theoretical playing progress, which are not described in detail in this embodiment of the application. If the second display device adopts the first of the three optional implementation manners, the second display device may obtain, from the server or the first display device, the starting playing time and the first playing speed of the target video on the first display device, and further determine the theoretical playing progress according to the starting playing time and the first playing speed. For a specific determination manner, reference may be made to the related description in step 708, which is not described again in the embodiments of the present application.
It should be noted that, in step 714, the second display device may perform the same action as that performed by the first display device in step 708, and step 714 may refer to step 708, which is not described again in this embodiment of the present application.
It should be noted that, in the embodiment of the present application, the determination of the first absolute value by the first display device and the determination of the second absolute value by the second display device need to be based on the same theoretical playing progress, and the theoretical playing progress is the same for each display device in the virtual auditorium. Optionally, in this embodiment of the present application, the theoretical playing progress may be the playing progress of any one display device in the virtual auditorium; in that case, that display device does not need to perform the steps of determining the absolute value and subsequently adjusting its playing progress.
Step 715, when the second absolute value is greater than the first duration, the second display device adjusts the second playing progress of the target video.
It should be noted that, in step 715, the second display device may perform the same action as that performed by the first display device in step 709, and step 715 may refer to step 709, which is not described again in this embodiment of the present application.
Optionally, the first duration and the second duration corresponding to each display device may be set by a user, and the first duration corresponding to the second display device may be the same as the first duration corresponding to the first display device or different from the first duration corresponding to the first display device; the second duration corresponding to the second display device may be the same as the second duration corresponding to the first display device, or may be different from the second duration corresponding to the first display device, and the embodiment of the application is not limited.
Step 716, the second display device sends a request for obtaining the status indication information to the server.
It should be noted that the second display device in step 716 may perform the same action as the first display device in step 710, and step 716 may refer to step 710, which is not described again in this embodiment of the present application.
Step 717, the server transmits the status indication information to the second display device according to the acquisition request.
It should be noted that, in step 717, the second display device may perform the same action as that performed by the first display device in step 711, and step 717 may refer to step 711, which is not described in detail in this embodiment of the application.
Step 718, when the status indication information received by the second display device indicates that the state of the auditorium service is the stopped state, the second display device stops playing the target video.
It should be noted that, in step 718, the second display device may perform the same action as that performed by the first display device in step 712, and step 718 may refer to step 712, which is not described again in this embodiment of the present application.
In the embodiment of the application, when the users of the display devices in the virtual auditorium communicate about the target video, they may communicate in a video call mode, a voice call mode or another mode (such as a text message mode). Alternatively, when the first display device sends the auditorium creation request to the server in step 701, it may simultaneously send the communication mode selected by the user to the server. In step 704, when the server sends the invitation information to the second display device, the server may also simultaneously send indication information of the communication mode selected by the user of the first display device to the second display device, or the invitation information may carry the indication information. After the user of the second display device accepts the invitation and triggers the second display device to send an auditorium access request to the server, the display devices in the virtual auditorium (such as the first display device and the second display device) can communicate in the communication mode indicated by the indication information. Optionally, multiple communication modes may be supported simultaneously between the display devices; in this case, the user of the second display device may also select a communication mode and communicate with the users of the other display devices in the virtual auditorium according to the selected communication mode.
Illustratively, assume the communication mode selected by the user of the first display device is the video call mode, and the first display device and the second display device are each provided with a camera and a microphone. After the first display device sends the auditorium creation request to the server, the first display device may turn on its camera and microphone, acquire an environment image through the camera and acquire environment sound through the microphone. After the second display device sends the auditorium access request to the server, the second display device may likewise turn on its camera and microphone, acquire an environment image through the camera and acquire environment sound through the microphone. The first display device can transmit the collected environment image and sound to the second display device, and the second display device can then display the image sent by the first display device and play the sound sent by the first display device. The second display device can also transmit the collected environment image and sound to the first display device, and the first display device can then display the image sent by the second display device and play the sound sent by the second display device. In this way, the user of the first display device can communicate with the user of the second display device. Optionally, the image and the sound may be transmitted directly between the first display device and the second display device, or may be forwarded by the server, which is not limited in the embodiment of the present application.
Optionally, fig. 10 is a schematic view of a display interface of a display device provided in an embodiment of the present application, where the display interface may be a display interface of a first display device, and may also be a display interface of a second display device. As shown in fig. 10, when playing the target video, the display device may display at least one floating window C, and images captured by other display devices except the display device in the virtual auditorium may be displayed in each floating window C.
Optionally, in the embodiment of the present application, after the creation of the auditorium service is completed and during the playing of the target video, the first display device may also send invitation information to other display devices under the trigger of the user, so as to invite the other display devices to access the auditorium service and join the virtual auditorium. At this time, the first display device may directly generate the invitation information and send it directly to the other display devices. Optionally, the first display device may also send the identifiers of the other display devices to the server, so that the server generates the invitation information and sends it to the other display devices. For the invitation information, reference may be made to the introduction of the invitation information in step 704, which is not described again in this embodiment of the present application. Optionally, in this embodiment of the present application, only the first display device that sent the auditorium creation request to the server may invite display devices to join the virtual auditorium, or any display device that has joined the virtual auditorium may invite other display devices to join it, and this embodiment of the present application is not limited thereto.
For example, continuing to refer to fig. 10, the display device may display an invitation control K, and the user may trigger the invitation control K to select an identifier of another display device to be invited, and then generate invitation information according to the identifier of the other display device. For the introduction of selecting the identifier of the other display device, reference may be made to the related introduction of the user selecting the identifier of the second display device in step 701, and details of the embodiment of the present application are not described again.
Optionally, in the embodiment of the present application, in the process of completing creation of the auditorium service and playing the target video, the first display device may also send a video switching request to the server under the trigger of the user, so as to switch the target video corresponding to the auditorium service. The video switch request may carry an identification of the auditorium service and an identification of the switched video. The server may send the switched video to each display device accessing the auditorium service according to the video switching request, so that each display device synchronously plays the switched video. It should be noted that, when playing the switched video, each display device may execute the same steps as those in playing the target video, which is not described in detail in this embodiment of the present application. Optionally, in this embodiment of the present application, the video switching request may be sent to the server only by the first display device that sends the auditorium creation request to the server, or any display device that joins the virtual auditorium may send the video switching request to the server, and this embodiment of the present application is not limited.
For example, a video switching control may be further displayed on the display device, and the user may select the identifier of the switched video by triggering the video switching control. For the introduction of selecting the identifier of the switched video, reference may be made to the related introduction of the user selecting the identifier of the target video in step 701, which is not described again in this embodiment of the present application.
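As an illustration only, the video switching request might carry the two identifications mentioned above and be fanned out by the server as sketched below; the field names, the auditorium_members mapping and the send_video helper are hypothetical.

def handle_video_switch(switch_request, auditorium_members, send_video):
    # Server-side sketch: on receiving a video switching request carrying the
    # identification of the auditorium service and of the switched video, send
    # the switched video to every display device accessing that service.
    auditorium_id = switch_request["auditorium_id"]
    video_id = switch_request["video_id"]
    for device_id in auditorium_members.get(auditorium_id, []):
        send_video(device_id, video_id)

# Hypothetical request; field names are illustrative only.
example_request = {"auditorium_id": "auditorium XX", "video_id": "video yyy"}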
Optionally, other controls may also be displayed on the display device, and the user may control the display device to perform corresponding operations by triggering these controls. Illustratively, continuing with reference to fig. 10, the other controls may include a sharpness control, a video window control, a chat sound control, and the like. The user can adjust the sharpness of the played video by triggering the sharpness control, select whether to display the floating windows used for displaying the images sent by other display devices by triggering the video window control, and select whether to play the sound sent by other display devices by triggering the chat sound control.
In the embodiment of the application, in the process of playing the target video, each display device in the virtual auditorium can determine the absolute value of the time difference between its playing progress of the target video and the theoretical playing progress, and then adjust its playing progress of the target video when the absolute value is greater than the first duration. In this way, the difference between the playing progress of each display device in the virtual auditorium and the theoretical playing progress can be kept small, so that the difference in playing progress between the display devices is small, the content watched by the users of the display devices is basically the same, and the users of the display devices can communicate about the same playing content.
To sum up, in the video playing method provided in the embodiment of the present application, the display device may adjust the playing progress of the target video when the absolute value of the time difference between the playing progress of the target video and the theoretical playing progress is greater than the first duration, so that the time difference between the playing progress of the target video and the theoretical playing progress is small. When each display device in the virtual auditorium plays the target video in this way, the difference between the playing progresses of the display devices can be reduced.
It should be noted that, in the foregoing embodiment of the present application, the first display device requests the server to create the auditorium service, and different display devices join the virtual auditorium corresponding to the auditorium service so as to synchronously play the target video. Optionally, the video playing method provided by the embodiment of the present application may also be applicable to scenes other than the virtual auditorium in which videos need to be played synchronously. For example, when watching a target video, a first display device may directly invite a second display device to also play the target video, or multiple display devices may play the same video in other manners, which is not limited in the embodiment of the present application.
It should be understood that the terms "first," "second," "third," and the like in the description and claims of the present application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application described herein can, for example, be implemented in sequences other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module," as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in this application refers to a component of an electronic device (such as the display device disclosed in this application) that is typically wirelessly controllable over a relatively short range of distances. Typically using infrared and/or Radio Frequency (RF) signals and/or bluetooth to connect with the electronic device, and may also include WiFi, wireless USB, bluetooth, motion sensor, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in the common remote control device with the user interface in the touch screen.
The term "gesture" as used in this application refers to a user's behavior through a change in hand shape or an action such as hand motion to convey a desired idea, action, purpose, or result.
It should be noted that the method embodiments provided in the embodiments of the present application and the corresponding apparatus embodiments may refer to each other, and the embodiments of the present application do not limit this. The sequence of the steps of the method embodiments provided in the embodiments of the present application can be appropriately adjusted, and steps can be added or removed according to the situation; any variation readily conceivable by those skilled in the art within the technical scope disclosed in the present application shall be covered by the protection scope of the present application, and is therefore not described in detail.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
Embodiments of the present application also provide a computer program product containing instructions, which when run on a computer, cause the computer to execute the method provided by the embodiments of the present application.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A video playing method is applied to a target display device and comprises the following steps:
receiving a target video sent by a server, wherein the server is used for sending the target video to a plurality of display devices, the display devices are used for synchronously playing the target video, and the target display device is any one of the display devices;
in the process of receiving the target video, playing the received part of the target video at a first playing speed;
determining an absolute value of a time difference between the playing progress of the target video and a theoretical playing progress, wherein the theoretical playing progress is sent to the plurality of display devices by the server;
and when the absolute value is greater than a first time length, adjusting the playing progress of the target video.
2. The method of claim 1, wherein the adjusting the playing progress of the target video comprises:
and playing the part of the target video after the current playing progress at a second playing speed, wherein the second playing speed is greater than the first playing speed.
3. The method of claim 2, wherein after the playing of the part of the target video after the current playing progress at the second playing speed, the method further comprises:
and when the absolute value is less than or equal to a second time length, playing the part of the target video after the current playing progress at the first playing speed, wherein the second time length is less than or equal to the first time length.
4. The method of claim 3, wherein the second time length is equal to zero.
5. The method according to any one of claims 1 to 4, wherein before receiving the target video sent by the server, the method further comprises:
sending a auditorium creation request to the server, wherein the auditorium creation request carries the identification of the target video;
receiving an identifier of a auditorium service sent by the server, wherein the auditorium service is created by the server according to the auditorium creation request;
the receiving of the target video sent by the server includes:
and receiving the target video sent by the server according to the auditorium creation request.
6. The method according to any one of claims 1 to 4, wherein before receiving the target video sent by the server, the method further comprises:
receiving invitation information sent by the server according to a received auditorium creation request, wherein the auditorium creation request is used for requesting creation of an auditorium service for the target video, the target display device is different from the display device sending the auditorium creation request, the auditorium creation request carries an identifier of the target display device, and the invitation information carries an identifier of the auditorium service;
sending an auditorium access request to the server according to the invitation information, wherein the auditorium access request carries the identifier of the auditorium service;
the receiving of the target video sent by the server includes:
and receiving the target video sent by the server according to the auditorium access request.
7. The method of claim 6, wherein the invitation information further carries an identification of the target video.
8. A video playback device, characterized in that the video playback device comprises:
the system comprises a controller, a server and a plurality of display devices, wherein the controller is used for receiving a target video sent by the server through a communication interface, the server is used for sending the target video to the plurality of display devices, the plurality of display devices are used for synchronously playing the target video, and the video playing device is any one of the plurality of display devices;
the display screen is used for playing the received part of the target video at a first playing speed in the process of receiving the target video through the communication interface;
the controller is further configured to determine an absolute value of a time difference between the playing progress of the target video and a theoretical playing progress, where the theoretical playing progress is sent to the plurality of display devices by the server;
the controller is further configured to adjust the playing progress of the target video when the absolute value is greater than a first time length.
9. The video playback device of claim 8, wherein the display screen is further configured to:
and playing the part of the target video after the current playing progress at a second playing speed, wherein the second playing speed is greater than the first playing speed.
10. The video playback device of claim 9, wherein the display screen is further configured to:
and when the absolute value is less than or equal to a second time length, playing the part of the target video after the current playing progress at the first playing speed, wherein the second time length is less than or equal to the first time length.
CN202010111719.8A 2020-02-24 2020-02-24 Video playing method and device Active CN111277884B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010111719.8A CN111277884B (en) 2020-02-24 2020-02-24 Video playing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010111719.8A CN111277884B (en) 2020-02-24 2020-02-24 Video playing method and device

Publications (2)

Publication Number Publication Date
CN111277884A true CN111277884A (en) 2020-06-12
CN111277884B CN111277884B (en) 2022-10-18

Family

ID=71002245

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010111719.8A Active CN111277884B (en) 2020-02-24 2020-02-24 Video playing method and device

Country Status (1)

Country Link
CN (1) CN111277884B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104125476A (en) * 2013-04-28 2014-10-29 腾讯科技(深圳)有限公司 Video playing method and device
CN106162236A (en) * 2015-03-23 2016-11-23 腾讯科技(深圳)有限公司 A kind of method and device of sharing video frequency
CN106507202A (en) * 2016-11-11 2017-03-15 传线网络科技(上海)有限公司 Control method for playing back and device
CN110719515A (en) * 2018-07-12 2020-01-21 北京优酷科技有限公司 Video playing method and device
CN110830823A (en) * 2019-11-27 2020-02-21 北京奇艺世纪科技有限公司 Play progress correction method and device, electronic equipment and readable storage medium

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021031940A1 (en) * 2019-08-18 2021-02-25 聚好看科技股份有限公司 Screening room service management method, interaction method, display device, and mobile terminal
CN113938633B (en) * 2020-06-29 2023-09-08 聚好看科技股份有限公司 Video call processing method and display device
CN113938633A (en) * 2020-06-29 2022-01-14 聚好看科技股份有限公司 Video call processing method and display device
CN112073791A (en) * 2020-08-03 2020-12-11 上海商泰汽车信息系统有限公司 Playing synchronization method and device, storage medium and user side
CN112015506A (en) * 2020-08-19 2020-12-01 北京字节跳动网络技术有限公司 Content display method and device
WO2022037552A1 (en) * 2020-08-19 2022-02-24 北京字节跳动网络技术有限公司 Content display method and apparatus
CN112015506B (en) * 2020-08-19 2023-08-22 北京字节跳动网络技术有限公司 Content display method and device
CN112423039A (en) * 2020-11-20 2021-02-26 广州欢网科技有限责任公司 Television cinema creation system and method
CN112911368A (en) * 2021-01-15 2021-06-04 北京字跳网络技术有限公司 Interaction method, interaction device, electronic equipment and storage medium
CN112887769A (en) * 2021-01-21 2021-06-01 海信视像科技股份有限公司 Display device
CN112887769B (en) * 2021-01-21 2023-09-19 青岛海信传媒网络技术有限公司 Display equipment
CN113556611A (en) * 2021-07-20 2021-10-26 上海哔哩哔哩科技有限公司 Video watching method and device
WO2023000896A1 (en) * 2021-07-20 2023-01-26 上海哔哩哔哩科技有限公司 Video viewing method and device
CN117201854A (en) * 2023-11-02 2023-12-08 广东朝歌智慧互联科技有限公司 Method and system for accurate seek video frames applied to video synchronous playing system

Also Published As

Publication number Publication date
CN111277884B (en) 2022-10-18

Similar Documents

Publication Publication Date Title
CN111277884B (en) Video playing method and device
CN111741372B (en) Screen projection method for video call, display device and terminal device
CN111050199B (en) Display device and scheduling method of Bluetooth communication resources of display device
CN111405338B (en) Intelligent image quality switching method and display device
CN111343489B (en) Display device and method for playing music in terminal
CN111752518A (en) Screen projection method of display equipment and display equipment
CN111836109A (en) Display device, server and method for automatically updating column frame
CN111479145A (en) Display device and television program pushing method
CN112188279A (en) Channel switching method and display equipment
CN112333509A (en) Media asset recommendation method, recommended media asset playing method and display equipment
CN111954059A (en) Screen saver display method and display device
CN112243141A (en) Display method and display equipment for screen projection function
CN113825032A (en) Media asset playing method and display equipment
CN110602540B (en) Volume control method of display equipment and display equipment
CN112203154A (en) Display device
CN112040276A (en) Video progress synchronization method, display equipment and refrigeration equipment
CN111669662A (en) Display device, video call method and server
CN113495711A (en) Display apparatus and display method
CN111741314A (en) Video playing method and display equipment
CN111263223A (en) Media volume adjusting method and display device
CN112118476B (en) Method for rapidly displaying program reservation icon and display equipment
CN113259733B (en) Display device
CN113495654A (en) Control display method and display device
CN112261463A (en) Display device and program recommendation method
CN114390190A (en) Display equipment and method for monitoring application to start camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant