
Communication terminal and multi-screen interactive video browsing method

Info

Publication number: CN114297435A
Application number: CN202011293785.8A
Authority: CN (China)
Legal status: Pending
Prior art keywords: video, communication terminal, display, user, key frame
Other languages: Chinese (zh)
Inventors: 张娜, 庞秀娟
Original Assignee: Hisense Visual Technology Co Ltd
Current Assignee: Vidaa Netherlands International Holdings BV
Application filed by Hisense Visual Technology Co Ltd, with priority to CN202011293785.8A

Abstract

The application provides a communication terminal and a multi-screen interactive video browsing method. The communication terminal displays a video sharing interface containing video directory entries for a plurality of videos to be shared; each entry contains several key frame images extracted from the corresponding video for the user to browse. Because the key frame images present the video content over a preset time period, the user can easily understand each video's content and select the right video file to push to the display device, which solves the problem that traditional video file browsing methods do not help the user select the video resource file to be shared.

Description

Communication terminal and multi-screen interactive video browsing method
Technical Field
The application relates to the technical field of communication, in particular to a communication terminal and a multi-screen interactive video browsing method.
Background
A communication terminal can establish a multi-screen interaction relationship with a display device based on a network connection. For example, a communication terminal such as a mobile phone may connect to a display device such as a smart television through a wireless network, so that the picture on the mobile phone is synchronously displayed on the smart television, or so that resources are shared between the two devices. Because the display device has a larger screen, it is better suited than the communication terminal to scenarios such as conference presentations; therefore, once the connection is established, the communication terminal can push local video resources to the display device for display.
Generally, a communication terminal pushes a video resource to a display device as follows: the user enters a file management interface through an interactive operation, and the interface presents a video list composed of icons for a number of video resource files. After the user selects the icon of any video resource file, the terminal sends that file to the display device through a UI (user interface) control or a preset sharing operation, so that the display device can play the shared video resource.
A video resource is typically presented in the file management interface as a thumbnail and a name, that is, the content of the resource is represented by its first frame or some specific frame. For some video resources, however, the thumbnail does not let the user effectively identify the resource, i.e., the user cannot tell which video the thumbnail corresponds to; and the names of some resources, such as videos recorded on a mobile phone, usually carry no meaning that expresses their specific content. As a result, the user cannot accurately select the video resource file to be shared when pushing a video, which reduces operation efficiency.
Disclosure of Invention
The application provides a communication terminal and a multi-screen interactive video browsing method, aiming to solve the problem that traditional video file browsing methods do not help the user select the video resource file to be shared.
In a first aspect, the present application provides a communication terminal comprising a display unit, a communication circuit, and a processor. The display unit is configured to present various user interfaces, including a video sharing interface. The communication circuit is configured to establish a communication connection with the display device in order to push shared video data to it. The processor is configured to perform the following program steps:
acquiring a multi-screen interaction instruction input by a user;
and in response to the multi-screen interaction instruction, controlling the display unit to display a video sharing interface.
The video sharing interface comprises a plurality of video directory entries, and each video directory entry comprises a plurality of key frame images extracted from videos to be shared.
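For illustration only, the following Kotlin sketch shows one way such a video directory entry could be modeled on an Android terminal, extracting key frame images at fixed intervals with the platform's MediaMetadataRetriever. The names (VideoDirectoryEntry, buildEntry) and the frame count are hypothetical and are not taken from the patent.

```kotlin
import android.graphics.Bitmap
import android.media.MediaMetadataRetriever

// Hypothetical model for one entry in the video sharing interface:
// a video to be shared, represented by several key frame images.
data class VideoDirectoryEntry(
    val path: String,
    val keyFrames: List<Bitmap>
)

// Extract `count` key frames spread evenly over the video's duration.
fun buildEntry(path: String, count: Int = 4): VideoDirectoryEntry {
    val retriever = MediaMetadataRetriever()
    retriever.setDataSource(path)
    val durationMs = retriever
        .extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION)
        ?.toLong() ?: 0L
    val frames = (0 until count).mapNotNull { i ->
        val timeUs = durationMs * 1000L * i / count
        // OPTION_CLOSEST_SYNC snaps each requested time to the nearest sync (key) frame.
        retriever.getFrameAtTime(timeUs, MediaMetadataRetriever.OPTION_CLOSEST_SYNC)
    }
    retriever.release()
    return VideoDirectoryEntry(path, frames)
}
```

Sync frames are used here because they can be decoded without neighboring frames, which keeps thumbnail extraction cheap while still sampling the video over its whole duration.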
The communication terminal provided in the first aspect can display the video sharing interface on its display unit according to a multi-screen interaction instruction input by the user. The video sharing interface contains video directory entries for a plurality of videos to be shared, and each entry contains several key frame images extracted from the corresponding video for the user to browse. Because the key frame images present the video content over a preset time period, the user can easily understand each video and select the video file to push to the display device, which solves the problem that traditional video file browsing methods do not help the user select the video resource file to be shared.
In a second aspect, the present application further provides a display device comprising a display, a communicator, and a controller. The display is configured to present a user interface and the video picture content shared by the communication terminal; the communicator is configured to establish a communication connection with the communication terminal in order to acquire the shared video data. The controller is configured to perform the following program steps:
after establishing communication connection with a communication terminal, acquiring a video file pushed by the communication terminal;
and parsing the video file and controlling the display to present its video content.
The video file is a video file selected by a user in a video sharing interface of the communication terminal; the video sharing interface comprises a plurality of video directory entries, and each video directory entry comprises a plurality of key frame images extracted from videos to be shared.
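As a companion sketch for the display-device side, and purely as an assumption (the patent does not specify the transport), the controller's two steps could look as follows if the pushed file arrived over a plain TCP socket; the function and parameter names are invented:

```kotlin
import java.io.File
import java.net.ServerSocket

// Hypothetical receiver: accept one pushed video file over TCP,
// write it to local storage, then hand it to a player callback.
fun receiveAndDisplay(port: Int, playVideo: (File) -> Unit) {
    ServerSocket(port).use { server ->
        server.accept().use { socket ->
            val file = File.createTempFile("shared", ".mp4")
            file.outputStream().use { out ->
                socket.getInputStream().copyTo(out)   // "acquire the video file"
            }
            playVideo(file)                           // "parse and display" its content
        }
    }
}
```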
According to the above technical solution, the display device provided in the second aspect can, once the communication connection is established, receive in real time through the communicator the video file pushed by the communication terminal, parse the received file into specific video picture content, and finally present that content on the display. Because the video file received by the display device is the one the user selected in the video sharing interface, where each video is represented by a video directory entry, the received video directly presents the content the user intends to share; this reduces repeated user operations and improves the user experience.
In a third aspect, the present application further provides a multi-screen interactive video browsing method, which is applied to a communication terminal that establishes a communication connection with a display device, and the method includes:
acquiring a multi-screen interaction instruction input by a user;
in response to the multi-screen interaction instruction, displaying a video sharing interface on the communication terminal.
The video sharing interface comprises a plurality of video directory entries, and each entry comprises a plurality of key frame images extracted from the corresponding video to be shared. The multi-screen interactive video browsing method provided in the third aspect can be configured in the processor of a communication terminal, so that the terminal displays the video sharing interface after obtaining a multi-screen interaction instruction input by the user. Since the method presents each video file through several key frame images, the user can fully understand the video content and conveniently and accurately select the video file to be shared.
In a fourth aspect, the present application further provides a multi-screen interactive video browsing system, which includes a communication terminal and a display device, where the communication terminal and the display device establish a communication connection;
the communication terminal is configured to: acquire a multi-screen interaction instruction input by the user and, in response to the instruction, display a video sharing interface, where the video sharing interface comprises a plurality of video directory entries and each entry comprises a plurality of key frame images extracted from the corresponding video to be shared; and
push a video file to the display device, where the video file is the video to be shared that the user selected in the video sharing interface of the communication terminal;
the display device is configured to: the method comprises the steps of obtaining a video file pushed by a communication terminal, analyzing the video file, and displaying video content of the video file.
According to the above technical solution, the multi-screen interactive video browsing system provided in the fourth aspect comprises a communication terminal and a display device that communicate with each other. After acquiring a multi-screen interaction instruction, the communication terminal displays a video sharing interface comprising a plurality of video directory entries, each containing several key frame images extracted from the corresponding video to be shared. After the user selects any video to be shared, the communication terminal pushes the selected video file to the display device, which then displays the file's specific video content. Because the system presents each video file through several key frame images, the user can fully understand the video content and conveniently and accurately select the video file to be shared.
Drawings
In order to explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below. Obviously, those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus in an embodiment of the present application;
Fig. 2 is a block diagram of a hardware configuration of a display device in an embodiment of the present application;
Fig. 3 is a schematic diagram of a software configuration of a display device in an embodiment of the present application;
Fig. 4 is a schematic diagram of the icon control interface display of an application program of a display device in an embodiment of the present application;
Fig. 5 is a schematic structural diagram of a communication terminal provided in an embodiment of the present application;
Fig. 6 is a schematic software architecture diagram of a communication terminal provided in an embodiment of the present application;
Fig. 7 is a schematic view of a user interface of a communication terminal provided in an embodiment of the present application;
Fig. 8 is a schematic diagram of a communication connection between a communication terminal and a display device provided in an embodiment of the present application;
Fig. 9 is a schematic view of a screen projection connection between a communication terminal and a display device provided in an embodiment of the present application;
Fig. 10 is a schematic data flow diagram of a multi-screen interactive video browsing method provided in an embodiment of the present application;
Fig. 11 is a flowchart of a multi-screen interactive video browsing method provided in an embodiment of the present application;
Fig. 12 is a schematic view of a video sharing interface provided in an embodiment of the present application;
Fig. 13 is a schematic flowchart of displaying a browsing interface provided in an embodiment of the present application;
Fig. 14 is a schematic view of a browsing interface provided in an embodiment of the present application;
Fig. 15 is a schematic flowchart of switching video files in a browsing interface provided in an embodiment of the present application;
Fig. 16 is a schematic view of a display device provided in an embodiment of the present application.
Detailed Description
The technical solution in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. In the description of the embodiments, unless otherwise specified, "/" means "or"; for example, A/B may mean A or B. "And/or" merely describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
In the following, the terms "first" and "second" are used for descriptive purposes only and should not be understood as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature; in the description of embodiments of the application, unless stated otherwise, "plurality" means two or more.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display apparatus 200 through the communication terminal 100A and the control device 100 (e.g., the remote controller 100B).
In some embodiments, communication terminal 100A, tablet, computer, laptop, and other smart devices may also be used to control display device 200. For example, the display device 200 is controlled using an application program running on the smart device. The application, through configuration, may provide the user with various controls in an intuitive User Interface (UI) on a screen associated with the smart device.
In some embodiments, the communication terminal 100A and the display device 200 may each install a software application and connect through a network communication protocol, achieving one-to-one control operation and data communication. For example, a control instruction protocol may be established between the communication terminal 100A and the display device 200, the remote control keyboard may be synchronized onto the communication terminal 100A, and the display device 200 may then be controlled through the user interface on the communication terminal 100A. The audio/video content displayed on the communication terminal 100A may also be transmitted to the display device 200 to achieve a synchronous display function.
As also shown in fig. 1, the display device 200 also performs data communication with the server 400 through various communication means. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. Illustratively, the display device 200 receives software program updates or accesses a remotely stored digital media library by sending and receiving information and through Electronic Program Guide (EPG) interactions. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers. Other web service contents, such as video on demand and advertisement services, are provided through the server 400.
The display device 200 may be a liquid crystal display, an OLED display, or a projection display device. The particular display device type, size, and resolution are not limiting, and those skilled in the art will appreciate that the display device 200 may vary in performance and configuration as needed.
In addition to the broadcast receiving television function, the display device 200 may provide smart network television functions with computer support, including but not limited to network TV, smart TV, Internet Protocol TV (IPTV), and the like.
A hardware configuration block diagram of a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 2.
In some embodiments, at least one of the controller 250, the tuner demodulator 210, the communicator 220, the detector 230, the input/output interface 255, the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the display apparatus 200.
In some embodiments, the display 275 receives the image signal output by the processor and displays video content, images, and the components of the menu manipulation interface.
In some embodiments, the display 275, includes a display screen assembly for presenting a picture, and a driving assembly that drives the display of an image.
In some embodiments, the displayed video content may come from broadcast television content or from various broadcast signals received via wired or wireless communication protocols; alternatively, various image contents sent from a network server may be received via a network communication protocol and displayed.
In some embodiments, the display 275 is used to present a user-manipulated UI interface generated in the display apparatus 200 and used to control the display apparatus 200.
In some embodiments, a driver assembly for driving the display is also included, depending on the type of display 275.
In some embodiments, display 275 is a projection display and may also include a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver.
In some embodiments, the display apparatus 200 may establish control signal and data signal transmission and reception with the external control device 100 or the content providing apparatus through the communicator 220.
In some embodiments, user interface 265 may be configured to receive infrared control signals from control device 100 (e.g., infrared remote control 100B, etc.).
In some embodiments, the detector 230 is a component the display device 200 uses to collect signals from the external environment or from interaction with the outside.
In some embodiments, the detector 230 includes a light receiver, a sensor for collecting the intensity of ambient light, so that display parameters can be changed adaptively according to the collected ambient light, and the like.
In some embodiments, the detector 230 may further include an image collector, such as a camera, which may be used to collect the external environment scene and the attributes of the user or the user's interaction gestures, so as to adaptively change display parameters and recognize user gestures, achieving interaction with the user.
In some embodiments, the detector 230 may also include a temperature sensor or the like, for example for sensing the ambient temperature.
In some embodiments, the display device 200 may adaptively adjust the display color temperature of the image: for example, it may be adjusted to display a cooler tone in a high-temperature environment, or a warmer tone in a low-temperature environment.
In some embodiments, the detector 230 may also include a sound collector, such as a microphone, which may be used to receive the user's voice. Illustratively, it may receive a voice signal containing a control instruction with which the user controls the display device 200, or collect ambient sound to recognize the type of the ambient scene, so that the display device 200 can adapt to ambient noise.
In some embodiments, as shown in fig. 2, the input/output interface 255 is configured to allow data transfer between the controller 250 and external other devices or other controllers 250. Such as receiving video signal data and audio signal data of an external device, or command instruction data, etc.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: the interface can be any one or more of a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface, a composite video input interface, a USB input interface, an RGB port and the like. The plurality of interfaces may form a composite input/output interface.
In some embodiments, as shown in fig. 2, the tuner demodulator 210 is configured to receive broadcast television signals through wired or wireless reception, perform processing such as amplification, mixing, and resonance, and demodulate, from among a plurality of wireless or wired broadcast television signals, the audio/video signal carried in the television channel frequency selected by the user, together with the EPG data signal.
In some embodiments, the frequency points demodulated by the tuner demodulator 210 are controlled by the controller 250, and the controller 250 can send out control signals according to user selection, so that the modem responds to the television signal frequency selected by the user and modulates and demodulates the television signal carried by the frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to the broadcasting system of the television signal. Or may be classified into a digital modulation signal, an analog modulation signal, and the like according to a modulation type. Or the signals are classified into digital signals, analog signals and the like according to the types of the signals.
In some embodiments, the controller 250 and the modem 210 may be located in different separate devices, that is, the modem 210 may also be located in an external device of the main device where the controller 250 is located, such as an external set-top box. Therefore, the set top box outputs the television audio and video signals modulated and demodulated by the received broadcast television signals to the main body equipment, and the main body equipment receives the audio and video signals through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
As shown in fig. 2, the controller 250 includes at least one of a random access memory 251 (RAM), a read-only memory 252 (ROM), a video processor, an audio processor, other processors (e.g., a graphics processing unit (GPU) or a central processing unit (CPU)), a communication interface, and a communication bus 256, where the communication bus connects the respective components.
In some embodiments, the RAM 251 is used to store temporary data for the operating system or other running programs, and the ROM 252 is used to store instructions for various system boots.
In some embodiments, the ROM 252 is used to store a Basic Input Output System (BIOS), which completes the power-on self-test of the system, initializes each functional module in the system, provides the drivers for the system's basic input/output, and boots the operating system.
In some embodiments, when the power-on signal is received, the display device 200 starts to power up, the CPU executes the system boot instruction in the ROM252, and copies the temporary data of the operating system stored in the memory to the RAM 251 so as to start or run the operating system. After the start of the operating system is completed, the CPU copies the temporary data of the various application programs in the memory to the RAM 251, and then, the various application programs are started or run.
In some embodiments, a processor is used to execute the operating system and application program instructions stored in the memory, and to execute various application programs, data, and contents according to interactive instructions received from the outside, so as to finally display and play various audio and video contents.
In some exemplary embodiments, a processor may include a plurality of processors: a main processor and one or more sub-processors. The main processor performs some operations of the display device 200 in the pre-power-up mode and/or displays the screen in normal mode; the one or more sub-processors handle operations in standby mode and the like.
In some embodiments, a graphics processor is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for display on the display.
In some embodiments, the video processor is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, the video processor includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like; a brief sketch of the first two stages follows this list.
The demultiplexing module demultiplexes the input audio/video data stream: if the input is an MPEG-2 stream, for example, it demultiplexes it into a video signal and an audio signal.
The video decoding module processes the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module superimposes and mixes the GUI signal generated by the graphics generator according to user input with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module converts the input video frame rate, for example from 60 Hz to 120 Hz or 240 Hz, usually by frame interpolation.
The display formatting module converts the frame-rate-converted video output signal into a signal conforming to the display format, such as an RGB data signal.
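As a rough, Android-flavored Kotlin illustration of the demultiplexing and video decoding stages, using the platform's MediaExtractor and MediaCodec; this is a generic sketch of the same pipeline, not the patent's hardware implementation:

```kotlin
import android.media.MediaCodec
import android.media.MediaExtractor
import android.media.MediaFormat

// Demultiplex a container file and create a decoder for its video track.
fun openVideoDecoder(path: String): Pair<MediaExtractor, MediaCodec> {
    val extractor = MediaExtractor()
    extractor.setDataSource(path)                       // demultiplexing stage
    for (i in 0 until extractor.trackCount) {
        val format = extractor.getTrackFormat(i)
        val mime = format.getString(MediaFormat.KEY_MIME) ?: continue
        if (mime.startsWith("video/")) {                // pick out the video signal
            extractor.selectTrack(i)
            val codec = MediaCodec.createDecoderByType(mime)  // video decoding stage
            codec.configure(format, /* surface = */ null, null, 0)
            codec.start()
            return extractor to codec
        }
    }
    error("no video track found in $path")
}
```

The extractor plays the role of the demultiplexing module (separating the video track from the stream), and the codec corresponds to the video decoding module; synthesis, frame rate conversion, and display formatting would happen downstream.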
In some embodiments, the graphics processor and the video processor may be integrated or configured separately. When integrated, they jointly process the graphics signals output to the display; when separate, they perform different functions, for example in a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, and amplification processing to obtain an audio signal that can be played in the speaker.
In some embodiments, the video processor may comprise one or more chips. The audio processor may also comprise one or more chips.
In some embodiments, the video processor and the audio processor may be separate chips or may be integrated together with the controller in one or more chips.
In some embodiments, the audio output, under the control of the controller 250, receives the sound signal output by the audio processor. Besides the speaker 286 carried by the display device 200 itself, the audio output may include an external sound output terminal that can output to an external sound-generating device, such as an external sound interface or an earphone interface, and may also include a near field communication module in the communication interface, for example a Bluetooth circuit for sound output through a Bluetooth speaker.
The power supply 290, under the control of the controller 250, provides the display device 200 with power input from an external power source. The power supply 290 may be a built-in power supply circuit installed inside the display device 200, or a power supply interface installed outside the display device 200 to supply external power to it.
The user interface 265 receives a user input signal and transmits it to the controller 250. The input signal may be a remote controller 100B signal received through the infrared receiver; various other user control signals may be received through the network communication circuit.
In some embodiments, the user inputs a command through the control device 100 or the communication terminal 100A, and the display device 200 responds to the input through the controller 250.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
The memory 260 includes a memory storing various software modules for driving the display device 200. Such as: various software modules stored in the first memory, including: at least one of a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
The base module is the bottom-layer software module for signal communication between the various hardware components in the display device 200 and for sending processing and control signals to the upper-layer modules. The detection module is used to collect various information from sensors or the user input interface, and to perform digital-to-analog conversion and analysis management.
For example, the voice recognition module includes a voice analysis module and a voice instruction database module. The display control module controls the display to present image content and can play multimedia image content, UI interfaces, and other information. The communication module performs control and data communication with external devices. The browser module performs data communication with browsing servers. The service modules provide various services and include various application programs. Meanwhile, the memory 260 may store visual effect maps for receiving external data and user data, images of various items in various user interfaces, a focus object, and the like.
Referring to fig. 3, in some embodiments, the system is divided into four layers, which are, from top to bottom, an Application layer (abbreviated as "application layer"), an Application Framework layer (abbreviated as "framework layer"), an Android runtime and system library layer (abbreviated as "system runtime library layer"), and a kernel layer.
In some embodiments, at least one application program runs in the application program layer, and the application programs can be Window (Window) programs carried by an operating system, system setting programs, clock programs, camera applications and the like; or may be an application developed by a third party developer such as a hi program, a karaoke program, a magic mirror program, or the like. In specific implementation, the application packages in the application layer are not limited to the above examples, and may actually include other application packages, which is not limited in this embodiment of the present application.
The framework layer provides an Application Programming Interface (API) and a programming framework for the applications of the application layer, and includes a number of predefined functions. The framework layer acts as a processing center that decides how the applications in the application layer act. Through the API interface, an application can access the resources in the system and obtain system services during execution.
As shown in fig. 3, in the embodiment of the present application, the application framework layer includes Managers, a Content Provider, and the like, where the managers include at least one of the following modules: an Activity Manager, which interacts with all activities running in the system; a Location Manager, which provides system services or applications with access to the system location service; a Package Manager, which retrieves various information about the application packages currently installed on the device; a Notification Manager, which controls the display and clearing of notification messages; and a Window Manager, which manages the icons, windows, toolbars, wallpapers, and desktop components on the user interface.
In some embodiments, the activity manager is to: managing the life cycle of each application program and the general navigation backspacing function, such as controlling the exit of the application program (including switching the user interface currently displayed in the display window to the system desktop), opening, backing (including switching the user interface currently displayed in the display window to the previous user interface of the user interface currently displayed), and the like.
In some embodiments, the window manager is configured to manage all window processes, such as obtaining a display size, determining whether a status bar is available, locking a screen, intercepting a screen, controlling a display change (e.g., zooming out, dithering, distorting, etc.) and the like.
In some embodiments, the system runtime library layer provides support for the upper framework layer: when the framework layer is in use, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions the framework layer requires.
In some embodiments, the kernel layer is the layer between hardware and software. As shown in fig. 3, the kernel layer includes at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, Wi-Fi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, touch sensor, and pressure sensor), and so on.
In some embodiments, the kernel layer further comprises a power driver module for power management.
In some embodiments, taking the magic mirror application (a photographing application) as an example: when the remote control receiving device receives an input operation from the remote controller 100B, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the input operation into a raw input event (including the value of the input operation, its timestamp, and other information) and stores it. The application framework layer obtains the raw input event from the kernel layer and, according to the current focus position, identifies the control corresponding to the event and treats the input operation as a confirmation operation on the magic mirror application icon; the magic mirror application then calls the framework-layer interface to start itself, which in turn calls the kernel layer to start the camera driver, so that a still image or video is captured through the camera. A sketch of the application-side callback for such a key event follows.
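On the application side, the confirmation key delivered through this chain arrives via the standard Android framework callback. A minimal, hypothetical Kotlin sketch (the activity and helper names are invented):

```kotlin
import android.app.Activity
import android.view.KeyEvent

class LauncherActivity : Activity() {
    // The framework delivers the raw input event, already mapped to a key code,
    // to the focused Activity through this callback.
    override fun onKeyDown(keyCode: Int, event: KeyEvent?): Boolean {
        if (keyCode == KeyEvent.KEYCODE_DPAD_CENTER) {
            openFocusedApplication()   // e.g., launch the app icon currently in focus
            return true                // event consumed
        }
        return super.onKeyDown(keyCode, event)
    }

    private fun openFocusedApplication() {
        // Hypothetical: resolve the focused icon and start its Activity via an Intent.
    }
}
```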
In some embodiments, for a display device with a touch function, taking a split-screen operation as an example: the display device receives an input operation (such as a split-screen gesture) that the user performs on the display screen, and the kernel layer generates a corresponding input event and reports it to the application framework layer. The activity manager of the application framework layer sets the window mode (such as multi-window mode) corresponding to the input operation, as well as the position and size of the window. The window manager of the application framework layer then draws the window according to the activity manager's settings and sends the drawn window data to the display driver of the kernel layer, which presents the corresponding application interfaces in different display areas of the screen.
In some embodiments, as shown in fig. 4, the application layer containing at least one application may display a corresponding icon control in the display, such as: the system comprises a live television application icon control, a video on demand application icon control, a media center application icon control, an application center icon control, a game application icon control and the like.
In some embodiments, the live television application may provide live television via different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
In some embodiments, a video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides a video display from some storage source. For example, the video on demand may come from a server side of the cloud storage, from a local hard disk storage containing stored video programs.
In some embodiments, the media center application may provide various applications for multimedia content playback. For example, a media center may offer services different from live television or video on demand, allowing the user to access various images or audio through the media center application.
In some embodiments, an application center may provide storage for various applications. An application may be a game, an ordinary application, or some other application that is related to a computer system or another device but can run on the smart television. The application center may obtain these applications from different sources, store them in local storage, and make them operable on the display device 200.
Fig. 5 shows a schematic configuration of the communication terminal 100A.
The following describes an embodiment taking the communication terminal 100A as an example. It should be understood that the communication terminal 100A shown in fig. 5 is only an example: it may have more or fewer components than shown, may combine two or more components, or may have a different configuration of components. The various components shown in the figure may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
A block diagram of a hardware configuration of the communication terminal 100A according to the exemplary embodiment is exemplarily shown in fig. 5. As shown in fig. 5, the communication terminal 100A includes: radio Frequency (RF) circuitry 110, memory 120, display unit 130, camera 140, sensor 150, audio circuitry 160, Wireless Fidelity (Wi-Fi) circuitry 170, processor 180, bluetooth circuitry 181, and power supply 190.
The RF circuit 110 may be used for receiving and transmitting signals during information transmission and reception or during a call, and may receive downlink data of a base station and then send the downlink data to the processor 180 for processing; the uplink data may be transmitted to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
The memory 120 may be used to store software programs and data. The processor 180 executes the various functions and data processing of the communication terminal 100A by running the software programs or data stored in the memory 120. The memory 120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. The memory 120 stores an operating system that enables the communication terminal 100A to operate, and may also store various application programs as well as the code for performing the methods described in the embodiments of the present application.
The display unit 130 may be used to receive input numeric or character information and generate signal input related to user settings and function control of the communication terminal 100A, and particularly, the display unit 130 may include a touch screen 131 disposed on the front surface of the communication terminal 100A and may collect touch operations of a user thereon or nearby, such as clicking a button, dragging a scroll box, and the like.
The display unit 130 may also be used to display a Graphical User Interface (GUI) of information input by or provided to the user and various menus of the communication terminal 100A. Specifically, the display unit 130 may include a display screen 132 disposed on the front surface of the communication terminal 100A. The display screen 132 may be configured in the form of a liquid crystal display, a light emitting diode, or the like. The display unit 130 may be used to display various graphical user interfaces described herein.
The touch screen 131 may cover the display screen 132, or the touch screen 131 and the display screen 132 may be integrated to implement the input and output functions of the communication terminal 100A, and after the integration, the touch screen may be referred to as a touch display screen for short. In the present application, the display unit 130 may display the application programs and the corresponding operation steps.
The camera 140 may be used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing elements convert the light signals into electrical signals which are then passed to the processor 180 for conversion into digital image signals.
The communication terminal 100A may further comprise at least one sensor 150, such as an acceleration sensor 151, a distance sensor 152, a fingerprint sensor 153, a temperature sensor 154. The communication terminal 100A may be further provided with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, an optical sensor, and a motion sensor.
The audio circuitry 160, speaker 161, microphone 162 may provide an audio interface between a user and the communication terminal 100A. The audio circuit 160 may transmit the electrical signal converted from the received audio data to the speaker 161, and convert the electrical signal into a sound signal for output by the speaker 161. The communication terminal 100A may also be provided with a volume button for adjusting the volume of the sound signal. On the other hand, the microphone 162 converts the collected sound signal into an electrical signal, converts the electrical signal into audio data after being received by the audio circuit 160, and outputs the audio data to the RF circuit 110 to be transmitted to, for example, another terminal or outputs the audio data to the memory 120 for further processing. In this application, the microphone 162 may capture the voice of the user.
Wi-Fi belongs to a short-range wireless transmission technology, and the communication terminal 100A may help a user to send and receive e-mails, browse web pages, access streaming media, etc. through the Wi-Fi circuit 170, which provides a wireless broadband internet access for the user.
The processor 180 is the control center of the communication terminal 100A; it connects the various parts of the terminal using various interfaces and lines, and performs the functions of the communication terminal 100A and processes data by running or executing the software programs stored in the memory 120 and calling the data stored in the memory 120. In some embodiments, the processor 180 may include one or more processing units; it may also integrate an application processor, which mainly handles the operating system, user interfaces, and applications, and a baseband processor, which mainly handles wireless communication. It will be appreciated that the baseband processor may also not be integrated into the processor 180. In the present application, the processor 180 may run the operating system, application programs, user interface display, touch response, and the processing methods described in the embodiments of the present application. In addition, the processor 180 is coupled with the display unit 130.
The Bluetooth circuit 181 is used for information interaction, through the Bluetooth protocol, with other Bluetooth devices that have Bluetooth circuits. For example, the communication terminal 100A may establish a Bluetooth connection with a wearable electronic device (e.g., a smart watch) having a Bluetooth circuit through the Bluetooth circuit 181, so as to exchange data.
The communication terminal 100A also includes a power supply 190 (such as a battery) to power the various components. The power supply may be logically connected to the processor 180 through a power management system to manage charging, discharging, power consumption, etc. through the power management system. The communication terminal 100A may also be configured with power buttons for powering the terminal on and off, and for locking the screen.
Fig. 6 is a block diagram of the software configuration of the communication terminal 100A of the embodiment of the present invention.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 6, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 6, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The telephone manager is for providing a communication function of the communication terminal 100A. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar; it conveys notification-type messages that can disappear automatically after a short stay without user interaction, for example notifications of download completion or message alerts. A notification may also appear in the top status bar of the system in the form of a chart or scrolling text, such as a notification of an application running in the background, or on the screen in the form of a dialog window. For example, text information may be prompted in the status bar, a prompt tone may sound, the communication terminal may vibrate, or an indicator light may flash. A minimal sketch of posting such a notification follows.
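As a concrete illustration, a minimal Kotlin sketch posting such a notification through the standard Android notification manager; the channel id and the texts are invented for the example:

```kotlin
import android.app.Notification
import android.app.NotificationChannel
import android.app.NotificationManager
import android.content.Context

// Post a "download complete" style notification through the notification manager.
fun notifyDownloadComplete(context: Context) {
    val manager = context.getSystemService(Context.NOTIFICATION_SERVICE) as NotificationManager
    val channel = NotificationChannel(
        "downloads", "Downloads", NotificationManager.IMPORTANCE_DEFAULT
    )
    manager.createNotificationChannel(channel)
    val notification = Notification.Builder(context, "downloads")
        .setSmallIcon(android.R.drawable.stat_sys_download_done)
        .setContentTitle("Download complete")
        .setContentText("The shared video has finished downloading.")
        .build()
    manager.notify(/* id = */ 1, notification)   // appears in the status bar
}
```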
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part consists of the functional interfaces that the Java language needs to call, and the other part is the Android core library.
The application layer and the application framework layer run in the virtual machine, which executes their Java files as binary files. The virtual machine performs functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. It may support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The following exemplifies the workflow of the software and hardware of the communication terminal 100A in connection with capturing a photographing scene.
When the touch screen 131 receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as touch coordinates and a timestamp of the touch operation) and stores it. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the event. Taking the example of a touch click whose corresponding control is the camera application icon: the camera application calls an interface of the application framework layer to start, further starts the camera driver by calling the kernel layer, and captures a still image or video through the camera 140.
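As a minimal illustration of the framework-layer call at the end of this chain, the following sketch uses a standard Android Intent to request image capture. It is an assumption for illustration only (REQUEST_IMAGE_CAPTURE is an arbitrary request code inside an Activity), not the embodiment's own implementation:

import android.content.Intent;
import android.provider.MediaStore;

// Inside an Activity: ask the framework layer to start a capture; the
// framework resolves the Intent and the kernel layer starts the camera driver.
static final int REQUEST_IMAGE_CAPTURE = 1;

void launchCamera() {
    Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    if (intent.resolveActivity(getPackageManager()) != null) {
        startActivityForResult(intent, REQUEST_IMAGE_CAPTURE);
    }
}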
The communication terminal 100A in the embodiment of the present application may be a mobile phone, a tablet computer, a wearable device, a notebook computer, a television, and the like.
Fig. 7 is a schematic diagram for illustrating a user interface on a communication terminal (e.g., communication terminal 100A of fig. 1). In some implementations, a user can open a corresponding application by touching an application icon on the user interface, or can open a corresponding folder by touching a folder icon on the user interface.
In the embodiment of the present application, as shown in fig. 8, the communication terminal 100A may establish a communication connection with the display device 200 to implement a multi-screen interaction function. The display device 200 is a device capable of presenting a display screen and performing data interaction with other devices, for example, a smart television, a tablet computer, an intelligent presentation screen, an intelligent projector, and the like.
In order to display a specific user interface and to be able to establish a communication connection with the communication terminal 100A, the display device 200 includes at least: a display 275, a communicator 220, and a controller 250. The display 275 is used for presenting a specific user interface, the communicator 220 is used for establishing a communication connection with the communication terminal 100A, and the controller 250 is used for receiving, sending and processing a display process and related data or control instructions in the communication process.
In some embodiments, the display apparatus 200 further includes at least one of a tuner demodulator, a detector, an input/output interface, an audio output interface, a memory, a power supply, a user interface, and an external device interface to assist the display and communication processes. For example, a detector such as a light receiver may collect signals from the external environment or from interaction with the outside, allowing display parameters to adapt to changes in ambient light.
The communication connection established between the communication terminal 100A and the display device 200 may be wired or wireless. For example, the communicator 220 may include at least one of a Wifi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, another network or near field communication protocol chip, and an infrared receiver. Accordingly, the communication terminal 100A and the display device 200 may be connected through a wireless local area network, bluetooth, near field communication, or the like.
After the communication connection is established between the communication terminal 100A and the display device 200, the communication terminal 100A may establish different data transmission channels with the display device 200 using different transmission protocols to implement the data transmission function. For example, in order to implement the screen projection function, a data transmission channel based on a Digital Living Network Alliance (DLNA) protocol may be established between the communication terminal 100A and the display device 200, and the data transmission channel may be used to transmit a screen projection data stream, so that the display device 200 can synchronously display the content in the communication terminal 100A.
The communication terminal 100A and the display device 200 can communicate data with each other using the established communication connection relationship. For example, the communication terminal 100A may push a video file stored therein to the display device 200, and the display device 200 may perform a video playing function according to the received video file, so as to present the video content through the large-size display 275 of the display device 200 in a scene of a meeting presentation, a movie showing, and the like. In this embodiment, a process of pushing a video file to the display device 200 by the communication terminal 100A is referred to as video sharing. It should be understood that the files pushed by the communication terminal 100A to the display device 200 include not only video files but also document files, picture files, and other files that can be played in the communication terminal 100A and the display device 200.
In order to implement the video sharing process, the user may first select a video file to be shared in the file management interface of the communication terminal 100A, and then perform a corresponding interactive action of "pushing" through the UI interface of the communication terminal 100A. At this time, the communication terminal 100A may transmit the video file through the communication device, and the display apparatus 200 may receive the video file through the communicator 220, perform a play-related operation such as decoding on the received video file according to a predetermined play manner, and finally present specific contents of the video on the display 275.
In some embodiments, as shown in FIG. 9, the delivery of video content may also be accomplished through an established screen-cast data channel in order to enable the presentation of the video content on the display 275 of the display device 200. The screen projection function may include two functions, one is that the display device 200 displays all the screens displayed by the communication terminal 100A, including a UI interface and a play interface on the communication terminal 100A, and the content displayed by the display device 200 changes along with the change of the interactive operation in the communication terminal 100A; the other is that the display device 200 only displays a part of the screen in the communication terminal 100A, for example, the display device 200 may only display a play interface through screen projection, and other operations on the communication terminal that are not related to play do not affect the content displayed by the display device 200 during screen projection.
For the screen projection function, after the screen projection connection is established, the communication terminal 100A may send screen projection data to the display device 200 through the screen projection data channel; the data may include data corresponding to the picture displayed on the communication terminal 100A as well as data corresponding to the video to be shared. The corresponding user operation process is as follows: after the communication terminal 100A displays the playing interface, the user activates the screen projection function through the "screen projection" button on the playing interface, and the played video data is sent to the display device 200 so that it can present the corresponding video content on the display 275. After these operations are completed, the communication terminal 100A may stop displaying the played video content while keeping the screen projection function running in the background, so that user operations do not affect the projected picture.
As can be seen, in the two video sharing modes, the user needs to select a video file to be shared in the communication terminal 100A, and then push the video file to the display device 200. The process of selecting the video file by the user can be completed in the file management interface or an application program with a file sharing function. For example, a user may invoke a "screen projection" function in a play program by running a media asset play program, such as "xx audio and video", select a local resource screen projection or a network resource screen projection, select a corresponding video file, and start a screen projection function.
On the communication terminal 100A, a video file is generally presented as a file icon associated with a file name. For example, in the file management interface, the icons of the video files stored in the communication terminal 100A may be arranged and displayed in sequence, with a file name such as "video1.AVI" displayed to the right of or below each icon. Since different video files have different contents, displaying video files by icons and file names cannot convey their specific contents; when many video files are stored in the communication terminal 100A, a user cannot determine which file is the one to be shared and has to determine the video contents through repeated open/close operations.
For this reason, the contents of a video file may be presented through a thumbnail instead of a file icon, where the thumbnail is generated from the first frame or a specific frame of the video file. However, due to the diversity of video contents, some videos cannot be effectively distinguished by a single thumbnail. For example, for movie-like video files, the first frame is usually a screening-license picture, so the thumbnails of such files all show the same license picture; the thumbnail content therefore cannot distinguish multiple movie files, and the user still cannot accurately select the video file to be shared.
In order to facilitate a user to select a video file to be shared, as shown in fig. 10 and 11, some embodiments of the present application provide a multi-screen interactive video browsing method, which may be applied to a communication terminal 100A that establishes a communication connection with a display device 200, where the method includes the following steps:
Acquiring a multi-screen interaction instruction input by a user.
The multi-screen interaction instruction is an instruction for starting the video sharing function, generated from the user's interactive input. For example, when the user starts an application related to the screen projection function, it is determined that the user has input a multi-screen interaction instruction. On some communication terminals, the multi-screen interaction instruction can also be generated by a specific gesture. For example, if a three-finger upward slide activates the screen projection function of a touch-enabled communication terminal 100A, then a three-finger upward slide input by the user indicates that a multi-screen interaction instruction has been input.
In some embodiments, to obtain the multi-screen interaction instruction, the communication terminal 100A may first detect an application start action input by the user and then parse which application the action specifies to run. If the specified application is a multi-screen interaction application, a multi-screen interaction instruction is generated.
The application start action may take different forms according to the UI interaction modes of the communication terminal 100A. For example, it may be a click touch input by the user in the application interface of the communication terminal 100A at a position within an application icon. After the user inputs the application start action, the communication terminal 100A may parse which application the action specifies to run and determine its type; if the started application is a multi-screen interaction application, such as a screen-casting application, a multi-screen interaction instruction is generated so that the processor 180 can obtain the instruction input by the user.
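A minimal sketch of this decision step is shown below; CAST_PACKAGE_NAMES and the instruction string are illustrative assumptions, not names defined by the embodiment:

import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

class InstructionGenerator {
    // Packages treated as multi-screen interaction applications (assumed values)
    private static final Set<String> CAST_PACKAGE_NAMES =
            new HashSet<>(Arrays.asList("com.example.screencast"));

    /** Returns a multi-screen interaction instruction if the started app qualifies. */
    String onApplicationStart(String packageName) {
        if (CAST_PACKAGE_NAMES.contains(packageName)) {
            return "MULTI_SCREEN_INTERACTION"; // consumed by the processor 180
        }
        return null; // ordinary application: no instruction generated
    }
}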
Since the multi-screen interaction function is implemented based on the communication connection established between the communication terminal 100A and the display device 200, in some embodiments, the multi-screen interaction instruction may further include an operation instruction related to establishing the communication connection. For example, when the user inputs a multi-screen interaction instruction at the communication terminal 100A, the communication terminal 100A may detect whether the display device 200 is currently connected. If the communication terminal 100A is connected with the display device 200, establishing a corresponding screen-casting data channel so as to execute the related actions of file sharing in the following; if the communication terminal 100A does not establish a communication connection with the display device 200, a setup interface related to establishing a connection may be presented on the communication terminal to guide the user to establish a communication connection relationship.
In addition, if the communication terminal 100A has established wireless connections with a plurality of display devices 200, a device list may be displayed on the communication terminal 100A after the multi-screen interaction instruction is input, and the user may select the push target in the list, so that the video file is pushed to the target display device 200.
Displaying a video sharing interface through the communication terminal.
After the user inputs the multi-screen interaction instruction, the communication terminal 100A may display a video sharing interface. As shown in fig. 8, the video sharing interface includes a plurality of video directory entries, and each entry includes a plurality of key frame images extracted from a video to be shared. In this embodiment, the video sharing interface is used to display each sharable video file for the user to select. Each sharable video file can be displayed in the video sharing interface as a video directory entry; the user can move a focus mark to any video directory entry through the interactive UI and select it, i.e., select the video file to be shared.
The focus mark is a mark form used to represent the selected object, and may be a marker graphic such as a pointer, box, or circle, or an interface display effect such as highlighting or magnification. For example, when the user clicks the area of the video directory entry corresponding to video 2 in the video sharing interface, the focus mark may be set on that entry, and the UI of the communication terminal 100A may enlarge and highlight the entry to indicate the current focus position.
Depending on the shape and area of the display region of the communication terminal 100A, the video sharing interface can present different display effects. For example, as shown in fig. 12, when the communication terminal 100A is a vertically held mobile phone, the display unit 130 is in a vertical display state whose width is smaller than its height; the video sharing interface may likewise be vertical, with the video directory entries to be shared arranged from top to bottom to form a list.
Obviously, in order to facilitate finding the specified video file, in the video sharing interface, the video directory entries may be arranged according to the video names or according to the modification dates of the video files. A multi-level menu may be set in the video sharing interface to display the video file list according to the storage location of the video file in the memory 120. For example, a plurality of folders including "movies, short videos, cameras", and the like may be set in the video sharing interface according to the source of the video file, and the user may select from different folders according to the type of the video to be shared.
When a plurality of video files are stored in the communication terminal 100A, each video directory entry may include a plurality of images in order to distinguish the files effectively; that is, in the video sharing interface, each video directory entry may be composed of a plurality of images, each of which is a key frame extracted from the video file. Obviously, when the contents of video files differ, the contents of the extracted key frame images also differ, so the video files can be effectively distinguished through a plurality of key frame images and the user can grasp their contents in time.
For example, for a video file, its video directory entry may consist of 10 key frame images from the first 1-10 s of the file. For video files with different contents, the probability that all 10 key frame images are the same or similar is extremely low, so different video contents can be effectively distinguished through a plurality of key frame images. In addition, for a common video file, the user can learn the video content from the key frame images within its first 1-10 s, making selection easier and improving the efficiency of the video sharing operation.
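As a rough data-structure sketch, one directory entry can be modeled as the video path plus its ordered key frames; the class and field names are assumptions for illustration:

import android.graphics.Bitmap;
import java.util.ArrayList;
import java.util.List;

// Illustrative only: one directory entry = one sharable video plus the
// key frame images extracted from its opening seconds (e.g. 1-10 s).
public class VideoDirectoryEntry {
    public final String videoPath;
    public final List<Bitmap> keyFrames = new ArrayList<>();

    public VideoDirectoryEntry(String videoPath) {
        this.videoPath = videoPath;
    }
}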
As can be seen from the foregoing technical solutions, the multi-screen interactive video browsing method provided in the foregoing embodiment can be configured in the processor 180 of the communication terminal 100A, so that after the communication terminal 100A obtains a multi-screen interactive instruction input by a user, a video sharing interface can be displayed. And the video sharing interface comprises a plurality of video directory entries, and each video directory entry comprises a plurality of key frame images extracted from the video to be shared. According to the method, the video file can be displayed through the plurality of key frame images, so that a user can fully know the video content, and the video file to be shared can be conveniently and accurately selected.
In the above embodiment, since the video sharing interface and the file management interface have different display forms for the video file, in order to present the video sharing interface, in some embodiments, the step of controlling the display unit to display the video sharing interface further includes the following steps:
Traversing the video resource files.
After a multi-screen interaction instruction input by a user is obtained, the communication terminal may traverse video resource files that can be shared in the communication terminal 100A before displaying a video sharing interface. The video resource files comprise local video files stored in the communication terminal and network resource files displayed through a communication terminal resource interface. In the video sharing process, the user can share the video files stored locally and can also share the video files in the network.
Obviously, for different types of video files, the data transmitted in the subsequent video sharing process is different. For example, for a local video file, the communication terminal 100A is required to transmit the video file to the display device 200, and thus a video data stream can be formed between the communication terminal 100A and the display device 200. For the network resource file, the communication terminal 100A is required to transmit address information of the network resource file to the display device 200, such as a URL address, so that the display device 200 acquires the video file from the network server according to the address information, and thus a video data stream can be formed between the display device 200 and the server 400.
Decoding the video data within a preset time period in each video resource file.
Having traversed the video resource files in the communication terminal 100A, the communication terminal may parse a partial segment of each traversed video file, i.e., decode the video data within the preset time period in each video resource file. Since only a partial video clip is parsed, the communication terminal 100A can complete video decoding in a short time after the traversal.
After traversing a video file, the communication terminal 100A may first determine the video length; if it exceeds a preset length value, the video content within a specific time period is parsed. For example, when the video length exceeds 10 s, the communication terminal 100A can parse the content within 1-10 s. If the video is shorter than the preset length value, the whole video can be parsed; for example, for a resource file whose total length is less than 10 s, key frames can be extracted over the whole video length.
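Assuming the same MediaMetadataRetriever API that this embodiment uses later for frame extraction, the length check might be sketched as follows:

import android.media.MediaMetadataRetriever;

// Decide how much of the video to parse: at most the first 10 seconds.
long parseWindowMs(String videoPath) {
    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
    retriever.setDataSource(videoPath);
    // METADATA_KEY_DURATION is reported as a string in milliseconds
    long durationMs = Long.parseLong(
            retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION));
    retriever.release();
    return Math.min(durationMs, 10_000L); // shorter videos are parsed in full
}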
Extracting a set number of key frame images from the video data, and buffering the key frame images.
After partially parsing a video file, the communication terminal 100A may extract key frames from the parsed content to obtain key frame images and form a video directory entry. For example, after the application is entered, the communication terminal 100A may run the above procedure in the background, traverse the sharable video files, and find 5 of them; it then buffers the key frames of the 1st-to-10th-second clips of the 5 videos, a total of 5 × 10 = 50 frames. Because the background program does not buffer all the key frames of every video, unnecessary memory occupation is reduced.
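A hedged sketch of this background traversal-and-cache step is given below, using android.util.LruCache to bound the buffer; the media directory, file extensions, and cache size are assumptions for illustration:

import android.graphics.Bitmap;
import android.media.MediaMetadataRetriever;
import android.util.LruCache;
import java.io.File;
import java.util.ArrayList;
import java.util.List;

class KeyFrameCacher {
    // Cache measured in frames rather than entries, e.g. 5 videos x 10 frames
    final LruCache<String, List<Bitmap>> frameCache = new LruCache<String, List<Bitmap>>(50) {
        @Override protected int sizeOf(String key, List<Bitmap> frames) {
            return frames.size();
        }
    };

    void cacheAll(File mediaDir) {
        File[] videos = mediaDir.listFiles(
                (dir, name) -> name.endsWith(".mp4") || name.endsWith(".avi"));
        if (videos == null) return;
        for (File video : videos) {
            MediaMetadataRetriever r = new MediaMetadataRetriever();
            r.setDataSource(video.getAbsolutePath());
            List<Bitmap> frames = new ArrayList<>();
            for (int sec = 1; sec <= 10; sec++) { // 1st to 10th second only
                Bitmap f = r.getFrameAtTime(sec * 1_000_000L,
                        MediaMetadataRetriever.OPTION_CLOSEST);
                if (f != null) frames.add(f);
            }
            r.release();
            frameCache.put(video.getAbsolutePath(), frames);
        }
    }
}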
Arranging the key frame images according to their time sequence in the video file to generate a video directory entry.
The cached key frame images can be grouped by extraction source, i.e., key frame images from the same video file form one group. The images within a group are arranged according to their time sequence in the video file to generate a video directory entry.
For example, if in the 1-10 s content of video 1 the timestamps of the extracted key frames in the video file are 00:01:30, 00:01:80, …, 00:10:00, the extracted key frame images may be arranged in that order and displayed in the first row. Similarly, the key frame images of video 2 are arranged the same way and displayed in the second row.
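A minimal sketch of this grouping-and-ordering step; KeyFrame is an assumed helper type carrying the extraction source and timestamp of each cached frame:

import java.util.ArrayList;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class KeyFrame {
    String videoPath;  // extraction source
    long timestampUs;  // position of the frame within the video file
}

class DirectoryEntryBuilder {
    // Group cached frames by source video, then order each group by timestamp
    Map<String, List<KeyFrame>> groupAndSort(List<KeyFrame> cached) {
        Map<String, List<KeyFrame>> byVideo = new HashMap<>();
        for (KeyFrame f : cached) {
            byVideo.computeIfAbsent(f.videoPath, k -> new ArrayList<>()).add(f);
        }
        for (List<KeyFrame> group : byVideo.values()) {
            group.sort(Comparator.comparingLong(f -> f.timestampUs));
        }
        return byVideo;
    }
}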
Controlling the display unit to display the video directory entries corresponding to the video files according to the arrangement order of the video files.
After extracting the key frame images, the communication terminal 100A may group and display the extracted key frame images according to the arrangement order of the video files, thereby forming a video sharing interface. For example, key frame images corresponding to 5 videos are displayed in the video sharing interface.
Due to the limited display area of the communication terminal 100A, the key frame images included in each video directory entry cannot all be displayed; only some can be shown in each entry, and the rest can be displayed through further operations such as paging or sliding. For example, in the video sharing interface, each video directory entry area may fully display the first 3 key frame images, while all of the video's key frame images can be viewed by continuing to slide.
It should be noted that, since different video files use different compression methods, the number of extractable key frame images also differs. In the embodiment of the present application, when the number of extractable key frame images exceeds a preset number, the number actually used may be set dynamically according to the processing capability of the communication terminal 100A and the number of sharable videos in it. For example, for a communication terminal 100A with strong processing capability and few sharable video files, the number of key frame images in each video directory entry may be increased so that the user obtains more video content information; for a communication terminal 100A with weak processing capability or many sharable video files, the number of key frame images in each entry may be set equal to the preset number, so that the key frame images of a plurality of video files are cached as soon as possible.
In addition, to present the video sharing interface promptly, the communication terminal 100A may display the interface while the key frame images are still being cached, showing the video directory entries according to the caching progress and the user's operations. For example, if at most 5 video directory entries can be displayed in one video sharing interface, the communication terminal 100A may display the interface once the key frame images of the first 5 video files are cached. That is, in the initial display state, the video sharing interface may include only the directory entries of the first 5 video files for the user to operate on. Meanwhile, the communication terminal 100A keeps the background caching program running and caches the key frame images of the 6th and subsequent video files in real time, improving page display efficiency.
In the above embodiments, a key frame image is an image that can represent the video content, usually a complete image in the video file that can be decoded without reference to other frames; for example, in coding based on IDR pictures, the key frame is an I frame while non-key frames are P frames or B frames. Accordingly, the step of extracting a set number of key frame images from the video data further comprises:
Analyzing the video file to obtain the video code stream within a preset time period;
Extracting the key byte bits from the video code stream;
Converting the code stream value of the key byte bits into a binary value, and converting preset bits of the binary value into a decimal value;
If the decimal value is equal to the key frame judgment value, extracting the image corresponding to the video code stream to generate a key frame image.
The key byte bit is the byte of specific length following a start code in the video code stream. For example, if a video code stream is 00 00 00 01 41 E6 60 …, then 00 00 00 01 is the start code, and the frame type can be detected from the next byte; the key byte is 0x41, whose binary value is 01000001. The last 5 bits convert to the decimal value 1, so the code stream is determined to be a non-partitioned non-IDR slice, i.e., a P frame.
If another code stream is 00 00 00 01 65 E8 …, the key byte is 0x65, which converts to the binary value 01100101. The last 5 bits convert to the decimal value 5, so the code stream is determined to be a slice of an IDR picture and is a key frame, i.e., an I frame.
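The arithmetic in these two examples amounts to reading the H.264 nal_unit_type from the five low-order bits of the byte after the start code; a direct sketch:

// Frame-type test for the byte following the 0x00000001 start code: the low
// 5 bits are the H.264 nal_unit_type, where 5 marks an IDR slice (I frame)
// and 1 a non-IDR slice (P frame), matching the 0x65 and 0x41 examples above.
static boolean isKeyFrame(byte keyByte) {
    int nalUnitType = keyByte & 0x1F; // keep the last 5 binary bits
    return nalUnitType == 5;
}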
The key frame extraction method above may be configured in the memory 120 of the communication terminal 100A in the form of a control program for the processor 180 to call. The concrete form of the control program may be written according to the operating system of the communication terminal 100A. Taking the Android platform as an example, the key frame extraction program can be expressed as:
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
retriever.setDataSource(dataPath);
// Acquire the key frame at the i-th second of the video; getFrameAtTime
// expects microseconds, so the i-th second corresponds to i * 1000000.
Bitmap bitmap = retriever.getFrameAtTime(i * 1000000L, MediaMetadataRetriever.OPTION_CLOSEST);
where bitmap is the key frame image at the i-th second.
Therefore, by extracting the key byte value from the video code stream and performing the binary and decimal conversions, the key frames contained in the code stream can be identified and the corresponding key frame images extracted.
According to the technical scheme, after the multi-screen interaction instruction input by the user is obtained, the key frame images can be extracted from the sharable video file by the communication terminal in the embodiment, and then the key frame images form the video directory items and are displayed in the video sharing interface. Therefore, in the embodiment, the plurality of key frame images can be used for showing the video file instead of the monotonous icon or thumbnail, so that a user can obtain the video content in time, and the video file to be shared can be conveniently selected to complete video sharing.
Since the area occupied by a video directory entry is limited, the communication terminal 100A cannot display all of the extracted key frame images at once, only some of them. For example, in a one-line video directory entry area, the display unit of the communication terminal 100A may show only the first 3 key frame images. For some video files, the first 3 key frame images are not enough to distinguish the file, so in some embodiments further operations may cause the communication terminal 100A to display the key frame images that are hidden or only partly shown. That is, the method further comprises the following steps:
Acquiring a page-turning instruction input by a user.
The page-turning instruction is an instruction for executing the page-turning function based on the user's interactive input; it can be a sliding instruction input in the video directory entry area under the current focus mark. For example, if the user wants to know the content of video 2, the user may first click the corresponding video directory entry so that it obtains the focus mark, whereupon the communication terminal 100A highlights it. If the user still cannot determine the video content from the displayed key frame images, a leftward drag can be input in the video directory entry area to enter a page-turning instruction.
Obviously, the form of the page-turning instruction differs according to the interaction modes the communication terminal 100A provides. For a communication terminal 100A that does not support touch operation, a page-turning button may be placed below the focused video directory entry in the interactive UI, and the instruction is input when the user clicks the button. For a communication terminal 100A that supports only physical key operations, the page-turning instruction may be entered by pressing the "left" or "right" key while the focus mark is on the video directory entry.
Controlling the display unit to scroll a plurality of key frame images in the video directory entry area along the sliding direction of the sliding instruction.
After the user inputs a page-turning instruction, the communication terminal 100A may scroll the positions of the key frame images in response, so that other key frame images become visible in the video directory entry. For example, after the user drags leftward in the video directory entry area, the communication terminal 100A may display the 4th key frame image, the 5th key frame image, and so on following the drag, until the user learns the video content and stops inputting page-turning instructions, or until all the key frame images have been displayed.
In some embodiments, the user may still not know the video content even after all the key frame images have been displayed through page-turning instructions. The communication terminal 100A may therefore continue to extract key frame images from the video file beyond the preset number and display them in the video directory entry in real time. For example, if after dragging to the 10th key frame image in the area of video 2 the user still cannot distinguish the video content, the leftward drag can be continued, triggering the communication terminal 100A to extract further key frame images. The communication terminal 100A may then obtain the key frame images corresponding to the 10-20 s content of video 2 by the key frame extraction method described above; each newly obtained key frame image is displayed after the 10th one, i.e., as the 11th key frame image, and so on, until the user learns the video content of video 2 through the key frame images.
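Continuing the MediaMetadataRetriever sketch used above, on-demand extraction of the next 10-second window might look like this; startSec (where the cached window ended) is an assumption of this illustration:

import android.graphics.Bitmap;
import android.media.MediaMetadataRetriever;
import java.util.ArrayList;
import java.util.List;

// Fetch the next 10-second window (e.g. 10-20 s) when paging past the cache
static List<Bitmap> loadNextWindow(String videoPath, int startSec) {
    MediaMetadataRetriever r = new MediaMetadataRetriever();
    r.setDataSource(videoPath);
    List<Bitmap> frames = new ArrayList<>();
    for (int sec = startSec + 1; sec <= startSec + 10; sec++) {
        Bitmap f = r.getFrameAtTime(sec * 1_000_000L,
                MediaMetadataRetriever.OPTION_CLOSEST);
        if (f != null) frames.add(f);
    }
    r.release();
    return frames;
}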
Because the area corresponding to a video directory entry is limited, and the video sharing interface generally needs to present the key frame images of a plurality of video files, the key frame images in the video sharing interface need to be simplified, i.e., displayed as thumbnails. However, a thumbnail loses some details when reduced, so to facilitate viewing, as shown in fig. 14, in some embodiments of the present application the video directory entry may be further displayed through a browsing interface. That is, the method further includes the following steps:
Acquiring a browsing instruction input by a user.
The browsing instruction is input through the user's interaction and is used to control the communication terminal 100A to display the complete key frame images. For example, as shown in fig. 13, when the user long-presses the video directory entry holding the focus mark, the browsing instruction is obtained. As another example, the user may also input a browsing instruction by clicking a "browse" button in the video sharing interface.
Extracting the key frame images contained in the video directory entry under the current focus cursor.
After acquiring the browsing instruction input by the user, the communication terminal 100A may extract the selected key frame image from the cache in response to the browsing instruction. For example, after the user inputs a long press instruction in the video directory entry area corresponding to the video 2, it indicates that the user wants to browse the video 2, so all the key frame images corresponding to the video 2 may be extracted from the cache for display in the browsing interface.
Controlling the display unit to display a browsing interface.
After extracting the key frame images, the communication terminal 100A may switch from the video sharing interface to the browsing interface and display the key frame images one by one. As shown in fig. 10, the extracted key frame images are displayed in sequence at a preset time interval. To show the complete key frame originals, a content area for presenting the key frame images to the user can be set in the browsing interface, and the key frame images can be displayed there cyclically at the preset interval. The interval of the cyclic presentation can be customized as required; for example, the display interval between two adjacent key frame images may be 1-5 s.
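A minimal sketch of the cyclic display using a standard Android Handler; imageView, keyFrames, and the 2 s interval are assumptions within the 1-5 s range mentioned above:

import android.graphics.Bitmap;
import android.os.Handler;
import android.os.Looper;
import android.widget.ImageView;
import java.util.List;

void startCycling(ImageView imageView, List<Bitmap> keyFrames) {
    final long intervalMs = 2_000L; // within the 1-5 s range above
    final Handler handler = new Handler(Looper.getMainLooper());
    handler.post(new Runnable() {
        int index = 0;
        @Override public void run() {
            imageView.setImageBitmap(keyFrames.get(index));
            index = (index + 1) % keyFrames.size(); // wrap back to the first frame
            handler.postDelayed(this, intervalMs);
        }
    });
}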
As can be seen, in this embodiment the key frame images of the video are displayed in detail through the browsing interface, compensating for the details lost in the thumbnails and making it easier for the user to identify the video content.
In some embodiments, the key frame images displayed in the browsing interface may also be associated with the browsing instruction input by the user. That is, after acquiring the browsing instruction, the communication terminal 100A may parse the starting key frame image specified in the instruction and control the display unit to display that key frame and the key frames after it in sequence. The starting key frame image is the key frame image on which the user performed the long-press action in the video sharing interface.
For example, for a movie-like video file, the key frame images at the beginning tend to be identical, such as screening-license pictures or distributor logos, so the user may choose to skip this repeated content and browse directly from the 3rd key frame image, i.e., input the browsing instruction by long-pressing the 3rd key frame image. After receiving the long press, the communication terminal 100A determines that the corresponding key frame image is the 3rd one; when the browsing interface is displayed, the 3rd key frame image is shown in the content area, and the 4th, 5th, and subsequent key frame images are displayed at the preset time interval.
Besides the content area, other controls can be included in the browsing interface for controlling the display effect in the browsing interface. For example, a plurality of controls may be presented below the content area, respectively: "previous", "next", "play", "pause", "push". The user can click any control to realize the corresponding function, for example, the user can switch the video directory items displayed in the browsing interface by clicking the "previous" button control and the "next" button control. Thus, as shown in fig. 15, in some embodiments, after displaying the browsing interface, the method may further include:
Acquiring a switching instruction input by a user.
The switching instruction can be input by clicking a button control in the browsing interface, and can also be input by a specific action gesture. For example, the user may input the switching instruction by inputting a slide instruction dragged upward or downward in the content area.
In response to the switching instruction, extracting the key frame images of the adjacent directory entry.
When the user inputs the switching instruction, the communication terminal 100A may display the adjacent directory entry in the browsing interface in the same manner as in the above embodiment. The adjacent directory entry is either the previous or the next video directory entry relative to the entry under the current focus mark; for example, the adjacent directory entry of video 2 is video 1 or video 3. Which one is switched to is determined by the switching instruction input by the user: when the user clicks the "previous" button, the communication terminal 100A extracts the key frame images corresponding to video 1; when the user clicks the "next" button, it extracts those corresponding to video 3.
Controlling the display unit to display the key frame images of the adjacent resource on the browsing interface.
After extracting the key frame images of the adjacent directory entry, the communication terminal 100A may display them in the browsing interface, completing the video browsing switch. For example, if after browsing the key frame images of video 2 the user determines that it is not the video to be shared, the user may click the "next" button in the browsing interface to switch to the key frame images of video 3, which the communication terminal 100A then displays in the content area. The display mode for video 3 may be the same as that for video 2.
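A minimal sketch of the switch itself, assuming the browsing UI keeps the directory entries in a list with a current index; entries, currentIndex, and showEntry are illustrative names:

import java.util.List;

class BrowsingSwitcher {
    List<VideoDirectoryEntry> entries; // as sketched earlier
    int currentIndex;

    // previous == true for the "previous" button, false for "next"
    void switchEntry(boolean previous) {
        int target = previous ? currentIndex - 1 : currentIndex + 1;
        if (target < 0 || target >= entries.size()) {
            return; // no adjacent directory entry in that direction
        }
        currentIndex = target;
        showEntry(entries.get(currentIndex)); // hypothetical display helper
    }

    void showEntry(VideoDirectoryEntry entry) { /* display the entry's key frames */ }
}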
Therefore, in the embodiment, by executing the switching operation in the browsing interface, the directory entries of different videos can be quickly switched, so that the user can conveniently know the video content in a clearer environment, and whether the current video file is the video file to be shared is determined.
In addition, after the user views the video content through the video sharing interface and the browsing interface, if it is determined that a certain video file is a file to be shared, a push instruction may be further input, so that the communication terminal 100A sends the corresponding video file to the display device 200, and video sharing is completed.
The specific pushing method can be as follows: in the video sharing interface, a user may input a push instruction by double-clicking a video directory entry, and then the communication terminal 100A sends a video resource file corresponding to the directory entry to the display device 200; in the browsing interface, the user may input a push instruction by clicking a "push" button control, and the communication terminal 100A also sends the currently browsed video resource file to the display device 200, so as to complete video sharing.
Based on the above multi-screen interactive video browsing method, some embodiments of the present application further provide a communication terminal 100A, including: a display unit 130, a communication circuit, and a processor 180. The display unit 130 is configured to present various user interfaces and present a video sharing interface. The communication circuitry may include bluetooth circuitry 181, Wi-Fi circuitry 170, etc. configured to establish a communication connection with display device 200 to push the shared video data to display device 200. The processor 180 is configured to perform the following program steps:
Acquiring a multi-screen interaction instruction input by a user;
Responding to the multi-screen interaction instruction, and controlling the display unit to display a video sharing interface.
The video sharing interface comprises a plurality of video directory entries, and each video directory entry comprises a plurality of key frame images extracted from videos to be shared.
As can be seen from the foregoing technical solutions, the communication terminal 100A provided in the foregoing embodiment can display a video sharing interface in the display unit 130 according to a multi-screen interaction instruction input by a user. In the video sharing interface, a plurality of video directory entries corresponding to videos to be shared can be included for a user to browse and view. Moreover, each video directory entry comprises a plurality of key frame images extracted from the video to be shared, and the video content in a preset time period can be displayed through the plurality of key frame images, so that a user can know the video content conveniently and select a video file to be shared to push to the display device 200, and the problem that a traditional video file browsing method is not beneficial to the user to select the video resource file to be shared is solved.
Based on the communication terminal 100A, some embodiments of the present application further provide a display device 200, which includes a display 275, a communicator 220, and a controller 250. The display 275 is configured to display a user interface and display video frame content shared by the communication terminal 100A, and the communicator 220 is configured to establish a communication connection with the communication terminal 100A, so as to obtain the shared video data at the communication terminal 100A.
As shown in fig. 16, the controller 250 is configured to execute the following program steps:
After establishing a communication connection with the communication terminal, acquiring a video file pushed by the communication terminal;
Analyzing the video file, and controlling the display to display the video content of the video file.
The video file is a video file selected by a user in a video sharing interface of the communication terminal; the video sharing interface comprises a plurality of video directory entries, and each video directory entry comprises a plurality of key frame images extracted from videos to be shared.
As can be seen from the foregoing technical solutions, the display device 200 provided in the foregoing embodiment may receive, through the communicator 220, a video file pushed by a communication terminal in real time after establishing a communication connection, parse the received video file to generate specific video picture content, and finally display the video picture content through the display 275. Because the video file received by the display device 200 is a video file selected by the user in the video sharing interface, and the video file is represented by the plurality of video directory entries in the video sharing interface, the received video can directly display the video content to be shared by the user, the number of times of repeated operations of the user is reduced, and the user experience is improved.
Based on the communication terminal 100A and the display device 200, some embodiments of the present application further provide a multi-screen interactive video browsing system, which includes the communication terminal 100A and the display device 200, where the communication terminal 100A and the display device 200 establish a communication connection;
The communication terminal 100A is configured to: acquire a multi-screen interaction instruction input by a user and, in response to the multi-screen interaction instruction, display a video sharing interface, wherein the video sharing interface comprises a plurality of video directory entries, each comprising a plurality of key frame images extracted from a video to be shared; and
push a video file to the display device 200, wherein the video file is the video to be shared selected by the user in the video sharing interface of the communication terminal;
The display device 200 is configured to: acquire the video file pushed by the communication terminal 100A, parse the video file, and display the video content of the video file.
According to the technical scheme, the multi-screen interactive video browsing system provided by the fourth aspect of the application comprises a communication terminal and a display device that communicate with each other. The communication terminal displays a video sharing interface after acquiring a multi-screen interaction instruction; the interface includes a plurality of video directory entries, and each entry includes a plurality of key frame images extracted from the video to be shared. After the user selects any video to be shared, the communication terminal pushes the selected video file to the display device, so that the display device can display its specific video content. The system displays video files through a plurality of key frame images, enabling the user to fully understand the video content and accurately select the video file to be shared.
The embodiments provided in the present application are only a few examples of the general concept of the present application, and do not limit the scope of the present application. Any other embodiments extended according to the scheme of the present application without inventive efforts will be within the scope of protection of the present application for a person skilled in the art.

Claims (10)

1. A communication terminal, comprising:
a display unit;
a communication circuit configured to establish a communication connection with a display device;
a processor configured to:
acquiring a multi-screen interaction instruction input by a user;
responding to the multi-screen interaction instruction, and controlling the display unit to display a video sharing interface, wherein the video sharing interface comprises a plurality of video directory entries, and each video directory entry comprises a plurality of key frame images extracted from the video to be shared.
2. The communication terminal according to claim 1, wherein in the step of obtaining the multi-screen interaction instruction input by the user, the processor is further configured to:
detecting an application starting action input by a user;
analyzing the application program appointed to run in the application starting action;
and if the specified running application program is a multi-screen interaction application, generating the multi-screen interaction instruction.
3. The communication terminal according to claim 1, wherein in the step of controlling the display unit to display the video sharing interface, the processor is further configured to:
traversing video resource files, wherein the video resource files comprise local video files stored in the communication terminal and network resource files displayed through a resource interface of the communication terminal;
decoding video data in a preset time period in each video resource file;
extracting a set number of key frame images from the video data;
and caching the key frame image.
4. The communication terminal according to claim 3, wherein in the step of controlling the display unit to display the video sharing interface, the processor is further configured to:
arranging the key frame images according to the time sequence of the key frame images in the video file to generate a video directory entry;
and controlling a display unit to display the video directory entries corresponding to the video files according to the arrangement sequence of the video files.
5. The communication terminal according to claim 3, wherein in the step of extracting a set number of key frame images from the video data, the processor is further configured to:
analyzing the video file to obtain a video code stream in a preset time period;
extracting key byte bits from the video code stream, wherein the key byte bits are byte bits behind a start code in the video code stream;
converting the code stream numerical value of the key byte bit into a binary numerical value, and converting a preset bit numerical value of the binary numerical value into a decimal numerical value;
and if the decimal value is equal to the key frame judgment value, extracting the image corresponding to the video code stream to generate a key frame image.
6. The communication terminal of claim 1, wherein the processor is further configured to:
acquiring a page turning instruction input by a user, wherein the page turning instruction is a sliding instruction input by the user in the video directory entry area under the current focus mark;
and responding to the page turning instruction, controlling the display unit to scroll and display a plurality of key frame images in the video directory entry area according to the sliding direction of the sliding instruction.
7. The communication terminal of claim 1, wherein the processor is further configured to:
acquiring a browsing instruction input by a user;
in response to the browsing instruction, extracting a key frame image contained in the video directory item under the current focus cursor;
and controlling the display unit to display a browsing interface, wherein the extracted key frame images are sequentially displayed in the browsing interface according to a preset time interval.
8. A multi-screen interactive video browsing method is applied to a communication terminal which establishes communication connection with display equipment, and comprises the following steps:
acquiring a multi-screen interaction instruction input by a user;
responding to the multi-screen interaction instruction, and displaying a video sharing interface through the communication terminal, wherein the video sharing interface comprises a plurality of video directory items, and each video directory item comprises a plurality of key frame images extracted from the video to be shared.
9. A display device, comprising:
a display;
a communicator configured to establish a communication connection with a communication terminal;
a controller configured to:
after communication connection is established with a communication terminal, a video file pushed by the communication terminal is obtained, wherein the video file is a video to be shared selected by a user in a video sharing interface of the communication terminal; the video sharing interface comprises a plurality of video directory entries, and each video directory entry comprises a plurality of key frame images extracted from videos to be shared;
and analyzing the video file, and controlling the display to display the video content of the video file.
10. A multi-screen interactive video browsing system is characterized by comprising a communication terminal and display equipment, wherein the communication terminal is in communication connection with the display equipment;
the communication terminal is configured to: acquire a multi-screen interaction instruction input by a user and, in response to the multi-screen interaction instruction, display a video sharing interface, wherein the video sharing interface comprises a plurality of video directory entries, each video directory entry comprising a plurality of key frame images extracted from a video to be shared; and
push a video file to the display device, wherein the video file is the video to be shared selected by the user in the video sharing interface of the communication terminal;
the display device is configured to: acquire the video file pushed by the communication terminal, parse the video file, and display the video content of the video file.
CN202011293785.8A 2020-11-18 2020-11-18 Communication terminal and multi-screen interactive video browsing method Pending CN114297435A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011293785.8A CN114297435A (en) 2020-11-18 2020-11-18 Communication terminal and multi-screen interactive video browsing method


Publications (1)

Publication Number Publication Date
CN114297435A true CN114297435A (en) 2022-04-08

Family

ID=80964378

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011293785.8A Pending CN114297435A (en) 2020-11-18 2020-11-18 Communication terminal and multi-screen interactive video browsing method

Country Status (1)

Country Link
CN (1) CN114297435A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221021

Address after: 83 Intekte Street, Devon, Netherlands

Applicant after: VIDAA (Netherlands) International Holdings Ltd.

Address before: 266555, No. 218, Bay Road, Qingdao economic and Technological Development Zone, Shandong

Applicant before: Hisense Video Technology Co.,Ltd.