CN111107428A - Method for playing two-way media stream data and display device

Method for playing two-way media stream data and display device

Info

Publication number
CN111107428A
Authority
CN
China
Prior art keywords: data, path, media stream, video, display
Legal status: Pending
Application number
CN201911222288.6A
Other languages
Chinese (zh)
Inventor
王良
武兵
李双增
Current Assignee
Vidaa Netherlands International Holdings BV
Original Assignee
Qingdao Hisense Media Network Technology Co Ltd
Application filed by Qingdao Hisense Media Network Technology Co Ltd
Priority to CN201911222288.6A
Priority to PCT/CN2020/079175 (WO2021109354A1)
Publication of CN111107428A

Classifications

    • H04N21/4402 Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/4312 Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4341 Demultiplexing of audio and video streams
    • H04N21/439 Processing of audio elementary streams

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The embodiments of the invention relate to the technical field of display, and in particular to a method for playing two-way media stream data and a display device, which are used to play two paths of media stream data on the display device. The method comprises the following steps: a browser function module of the display device acquires two paths of media stream data from a server, and a demultiplexing module demultiplexes the two paths of media stream data into video data and audio data; a decoding module decodes the demultiplexed media stream data to obtain a first path of media stream data comprising a first path of video data and a first path of audio data, and a second path of media stream data comprising at least a second path of video data; a playing module outputs the first path of video data in the decoded first path of media stream data to a first window on a display, and outputs the first path of audio data to an audio output interface; and the browser function module renders the decoded second path of video data into continuous video frames and outputs the continuous video frames to a second window on the display at a certain frame rate.

Description

Method for playing two-way media stream data and display device
Technical Field
The invention relates to the technical field of display, and in particular to a two-way video playing method and a display device.
Background
With the popularization of network technology, display devices can be used not only to play TV signals but also to play network signals. As user demands increase, a display device needs to support the playing of two paths of media stream data, and how to implement such two-way playing becomes a problem to be solved.
Disclosure of Invention
In view of this, the present invention provides a method for playing two-way media stream data and a display device, so as to implement the playing of two paths of media stream data and improve user experience.
Specifically, the invention is realized by the following technical scheme:
in a first aspect, the present invention provides a display device comprising:
a display;
an audio output interface;
the browser functional module is controlled by the controller and is used for acquiring two paths of media stream data;
the demultiplexing module is controlled by the controller and is used for demultiplexing the media stream data acquired by the browser function module into video data and audio data;
the decoding module is used for decoding the media stream data after demultiplexing to obtain a first path of media stream data comprising a first path of video data and a first path of audio data and a second path of media stream data at least comprising a second path of video data;
the playing module is controlled by the controller and is used for outputting a first path of video data in the decoded first path of media stream data to a first window on the display and outputting a first path of audio data to the audio output interface;
and the browser function module is controlled by the controller and is also used for rendering the second path of video data in the decoded second path of media stream data into continuous video frames and outputting the continuous video frames to a second window on the display according to a certain frame rate.
In a second aspect, the present invention further provides a method for playing two-way media stream data, where the method is applied to a display device, and the method includes:
the controller controls the browser functional module to acquire two paths of media stream data;
the controller controls the demultiplexing module to demultiplex the two paths of media stream data acquired by the browser function module into video data and audio data;
the decoding module decodes the demultiplexed media stream data to obtain a first path of media stream data comprising a first path of video data and a first path of audio data, and a second path of media stream data comprising at least a second path of video data;
the controller controls the playing module to output a first path of video data in the decoded first path of media stream data to a first window on the display and output a first path of audio data to the audio output interface;
and the controller controls the browser function module to render the second path of video data in the decoded second path of media stream data into continuous video frames, and outputs the continuous video frames to a second window on the display according to a certain frame rate.
In the above embodiment, the display device may obtain two paths of media stream data from the server through the browser function module, and the demultiplexing module demultiplexes the two paths of media stream data into video data and audio data; the decoding module decodes the media stream data after demultiplexing to obtain a first path of media stream data including a first path of video data and a first path of audio data and a second path of media stream data at least including a second path of video data; the playing module outputs a first path of video data in the decoded first path of media stream data to a first window on a display, and outputs a first path of audio data to the audio output interface; and the browser functional module renders the decoded second path of video data into continuous video frames and outputs the continuous video frames to a second window on the display according to a certain frame rate.
The invention can realize the frame-by-frame rendering of the video frames based on the rendering function of the browser, thereby achieving the effect of video playing and further realizing the function of double-path video playing on the display equipment. Furthermore, the invention can control the second video data in the RGB format to be compressed into YUV data and then sent to the browser functional module, thereby improving the efficiency of data transmission and data processing and ensuring that the video playing is smoother.
Drawings
Fig. 1A is a schematic diagram illustrating an operation scenario between a display device and a control apparatus;
fig. 1B is a block diagram schematically illustrating a configuration of the control apparatus 100 in fig. 1A;
fig. 1C is a block diagram schematically illustrating a configuration of the display device 200 in fig. 1A;
FIG. 1D is a block diagram illustrating an architectural configuration of an operating system in memory of display device 200;
FIG. 1E is a schematic diagram illustrating two-way playback of media stream data in a display;
flow charts of the compression of the second path of video data are illustrated in figs. 2-1 and 2-2;
a flow chart of rendering is illustrated in fig. 3;
an interaction flow diagram of a two-way video playback method is illustrated in fig. 4;
a flow chart of a two-way video playback method is illustrated in fig. 5.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information and, similarly, second information may also be referred to as first information, without departing from the scope of the present invention. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
Fig. 1A is a schematic diagram illustrating an operation scenario between a display device and a control apparatus. As shown in fig. 1A, the control apparatus 100 and the display device 200 may communicate with each other in a wired or wireless manner.
The control apparatus 100 is configured to control the display device 200: it receives operation instructions input by a user and converts them into instructions that the display device 200 can recognize and respond to, serving as an intermediary for interaction between the user and the display device 200. For example, the user operates the channel up/down keys on the control apparatus 100, and the display device 200 responds to the channel up/down operation.
The control device 100 may be a remote controller 100A, which uses infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods to control the display apparatus 200 wirelessly or in another wired manner. The user may input user instructions through keys on the remote controller, voice input, control panel input, etc., to control the display apparatus 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, etc. on the remote controller to control the display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, and the like. For example, the display device 200 is controlled using an application program running on the smart device. The application program may provide various controls to a user through an intuitive User Interface (UI) on a screen associated with the smart device through configuration.
For example, the mobile terminal 100B may install a software application with the display device 200 to implement connection communication through a network communication protocol for the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 100B may be caused to establish a control instruction protocol with the display device 200 to implement the functions of the physical keys as arranged in the remote control 100A by operating various function keys or virtual buttons of the user interface provided on the mobile terminal 100B. The audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
The display apparatus 200 may provide a network television function that combines a broadcast receiving function with a computer support function. The display device may be implemented as a digital television, a web television, an Internet Protocol Television (IPTV), or the like.
The display device 200 may be a liquid crystal display, an organic light emitting display, a projection device. The specific display device type, size, resolution, etc. are not limited.
The display apparatus 200 also performs data communication with the server 300 through various communication means. Here, the display apparatus 200 may be allowed to be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 300 may provide various contents and interactions to the display apparatus 200. By way of example, the display device 200 may send and receive information such as: receiving Electronic Program Guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The servers 300 may be a group or groups of servers, and may be one or more types of servers. Other web service contents such as a video on demand and an advertisement service are provided through the server 300.
Fig. 1B is a block diagram illustrating the configuration of the control device 100. As shown in fig. 1B, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, an output interface 150, and a power supply 160.
The controller 110 includes a Random Access Memory (RAM)111, a Read Only Memory (ROM)112, a processor 113, a communication interface, and a communication bus. The controller 110 is used to control the operation of the control device 100, as well as the internal components of the communication cooperation, external and internal data processing functions.
Illustratively, when an interaction of a user pressing a key disposed on the remote controller 100A or an interaction of touching a touch panel disposed on the remote controller 100A is detected, the controller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.
And a memory 120 for storing various operation programs, data and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 120 may store various control signal commands input by a user.
The communicator 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. For example, the control apparatus 100 transmits a control signal (e.g., a touch signal or a button signal) to the display device 200 via the communicator 130, and the control apparatus 100 may receive signals transmitted by the display device 200 via the communicator 130. The communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132. For example, when the infrared signal interface is used, a user input instruction needs to be converted into an infrared control signal according to an infrared control protocol, and the infrared control signal is sent to the display device 200 through the infrared sending module. For another example, when the RF signal interface is used, a user input command needs to be converted into a digital signal, modulated according to the RF control signal modulation protocol, and then transmitted to the display device 200 through the RF transmitting terminal.
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like, so that a user can input a user instruction regarding controlling the display apparatus 200 to the control apparatus 100 through voice, touch, gesture, press, and the like.
The output interface 150 outputs a user instruction received by the user input interface 140 to the display apparatus 200, or outputs an image or voice signal received by the display apparatus 200. Here, the output interface 150 may include an LED interface 151, a vibration interface 152 generating vibration, a sound output interface 153 outputting sound, a display 154 outputting an image, and the like. For example, the remote controller 100A may receive an output signal such as audio, video, or data from the output interface 150, and display the output signal in the form of an image on the display 154, in the form of audio on the sound output interface 153, or in the form of vibration on the vibration interface 152.
The power supply 160 provides operating power support for each element of the control device 100 under the control of the controller 110, and may take the form of a battery and associated control circuitry.
A hardware configuration block diagram of the display device 200 is exemplarily illustrated in fig. 1C. As shown in fig. 1C, the display apparatus 200 may include a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, an audio processor 280, an audio output interface 285, and a power supply 290.
The tuner demodulator 210 receives the broadcast television signal in a wired or wireless manner, may perform modulation and demodulation processing such as amplification, mixing, and resonance, and is configured to demodulate, from a plurality of wireless or wired broadcast television signals, an audio/video signal carried in a frequency of a television channel selected by a user, and additional information (e.g., EPG data).
The tuner demodulator 210, as selected by the user and controlled by the controller 250, responds to the frequency of the television channel selected by the user and the television signal carried on that frequency.
The tuner demodulator 210 can receive a television signal in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; and according to different modulation types, a digital modulation mode or an analog modulation mode can be adopted; and can demodulate the analog signal and the digital signal according to the different kinds of the received television signals.
In other exemplary embodiments, the tuner demodulator 210 may also be located in an external device, such as an external set-top box. In this case, the set-top box outputs a television signal after modulation and demodulation, which is input into the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the display apparatus 200 may transmit content data to an external apparatus connected via the communicator 220, or browse and download content data from an external apparatus connected via the communicator 220. The communicator 220 may include a network communication protocol module or a near field communication protocol module, such as a WIFI module 221, a bluetooth communication protocol module 222, and a wired ethernet communication protocol module 223, so that the communicator 220 may receive a control signal of the control device 100 according to the control of the controller 250 and implement the control signal as a WIFI signal, a bluetooth signal, a radio frequency signal, and the like.
The detector 230 is a component of the display apparatus 200 for collecting signals of an external environment or interaction with the outside. The detector 230 may include a sound collector 231, such as a microphone, which may be used to receive a user's sound, such as a voice signal of a control instruction of the user to control the display device 200; alternatively, ambient sounds may be collected that identify the type of ambient scene, enabling the display device 200 to adapt to ambient noise.
In some other exemplary embodiments, the detector 230, which may further include an image collector 232, such as a camera, a video camera, etc., may be configured to collect external environment scenes to adaptively change the display parameters of the display device 200; and the function of acquiring the attribute of the user or interacting gestures with the user so as to realize the interaction between the display equipment and the user.
In some other exemplary embodiments, the detector 230 may further include a light receiver for collecting the intensity of the ambient light to adapt to the display parameter variation of the display device 200.
In some other exemplary embodiments, the detector 230 may further include a temperature sensor, such as by sensing an ambient temperature, and the display device 200 may adaptively adjust a display color temperature of the image. For example, when the temperature is higher, the display apparatus 200 may be adjusted to display a color temperature of an image that is cooler; when the temperature is lower, the display device 200 may be adjusted to display a warmer color temperature of the image.
The external device interface 240 is a component for providing the controller 250 to control data transmission between the display apparatus 200 and an external apparatus. The external device interface 240 may be connected to an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner, and may receive data such as a video signal (e.g., moving image), an audio signal (e.g., music), additional information (e.g., EPG), etc. of the external apparatus.
The external device interface 240 may include: a High Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not shown), a red, green, blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to the operation of the user by running various software control programs (such as an operating system and various application programs) stored on the memory 260.
As shown in fig. 1C, the controller 250 includes a Random Access Memory (RAM)251, a Read Only Memory (ROM)252, a graphics processor 253, a CPU processor 254, a communication interface 255, and a communication bus 256. The RAM251, the ROM252, the graphic processor 253, and the CPU processor 254 are connected to each other through a communication bus 256 through a communication interface 255.
The ROM252 stores various system boot instructions. When the display apparatus 200 starts power-on upon receiving the power-on signal, the CPU processor 254 executes a system boot instruction in the ROM252, copies the operating system stored in the memory 260 to the RAM251, and starts running the boot operating system. After the start of the operating system is completed, the CPU processor 254 copies the various application programs in the memory 260 to the RAM251 and then starts running and starting the various application programs.
And a graphic processor 253 for generating various graphic objects such as icons, operation menus, and user input instruction display graphics, etc. The graphic processor 253 may include an operator for performing an operation by receiving various interactive instructions input by a user, and further displaying various objects according to display attributes; and a renderer for generating various objects based on the operator and displaying the rendered result on the display 275.
A CPU processor 254 for executing operating system and application program instructions stored in memory 260. And according to the received user input instruction, processing of various application programs, data and contents is executed so as to finally display and play various audio-video contents.
In some example embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and a plurality of or one sub-processor. A main processor for performing some initialization operations of the display apparatus 200 in the display apparatus preload mode and/or operations of displaying a screen in the normal mode. A plurality of or one sub-processor for performing an operation in a state of a standby mode or the like of the display apparatus.
The communication interface 255 may include a first interface to an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
Where the object may be any one of the selectable objects, such as a hyperlink or an icon. The operation related to the selected object is, for example, an operation of displaying a link to a hyperlink page, document, image, or the like, or an operation of executing a program corresponding to the object. The user input command for selecting the GUI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch panel, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
A memory 260 for storing various types of data, software programs, or applications for driving and controlling the operation of the display device 200. The memory 260 may include volatile and/or nonvolatile memory. And the term "memory" includes the memory 260, the RAM251 and the ROM252 of the controller 250, or a memory card in the display device 200.
In some embodiments, the memory 260 is specifically used for storing an operating program for driving the controller 250 of the display device 200; storing various application programs built in the display apparatus 200 and downloaded by a user from an external apparatus; data such as visual effect images for configuring various GUIs provided by the display 275, various objects related to the GUIs, and selectors for selecting GUI objects are stored.
In some embodiments, memory 260 is specifically configured to store drivers for tuner demodulator 210, communicator 220, detector 230, external device interface 240, video processor 270, display 275, audio processor 280, etc., and related data, such as external data (e.g., audio-visual data) received from the external device interface or user data (e.g., key information, voice information, touch information, etc.) received by the user interface.
In some embodiments, memory 260 specifically stores software and/or programs representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, as well as functions implemented by other programs (e.g., the middleware, APIs, or applications); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to enable control or management of system resources.
A block diagram of the architectural configuration of the operating system in the memory of the display device 200 is illustrated in fig. 1D. The operating system architecture comprises an application layer, a middleware layer and a kernel layer from top to bottom.
The application layer: the application programs built into the system and the non-system-level application programs belong to the application layer, which is responsible for direct interaction with the user. The application layer may include a plurality of applications, such as a setup application, a post application, a media center application, and the like. These application programs may be implemented as Web applications that execute based on a WebKit engine; in particular, they may be developed and executed based on HTML5, Cascading Style Sheets (CSS), and JavaScript, typically to control the operation or execution of a browser function module in the display device.
HTML, or HyperText Markup Language, is a standard markup language for creating web pages. It describes a web page by markup tags, where the HTML tags are used to describe characters, graphics, animation, sound, tables, links, etc.; a browser reads an HTML document, interprets the content of the tags in the document, and displays the content in the form of a web page.
CSS, or Cascading Style Sheets, is a computer language used to express the style of HTML documents, and may be used to define style structures such as fonts, colors, and positions. A CSS style can be stored directly in an HTML web page or in a separate style file, so that the styles in the web page can be controlled.
JavaScript is a language applied to web page programming that can be inserted into an HTML page and is interpreted and executed by the browser. The interaction logic of a Web application is implemented in JavaScript. JavaScript can encapsulate a JavaScript extension interface through the browser to communicate with the kernel layer.
the middleware layer may provide some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as multimedia and hypermedia information coding experts group (MHEG) middleware related to data broadcasting, DLNA middleware which is middleware related to communication with an external device, middleware which provides a browser environment in which each application program in the display device operates, and the like.
The kernel layer provides core system services, such as: file management, memory management, process management, network management, system security authority management and the like. The kernel layer may be implemented as a kernel based on various operating systems, for example, a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware, and provides device driver services for various hardware, for example: a display driver for the display, a camera driver for the camera, a key driver for the remote controller, a WiFi driver for the WiFi module, an audio driver for the audio output interface, a power management driver for the Power Management (PM) module, and so on.
A user interface 265 receives various user interactions. Specifically, it is used to transmit an input signal of a user to the controller 250 or transmit an output signal from the controller 250 to the user. For example, the remote controller 100A may transmit an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by the user to the user interface 265, and then the input signal is transferred to the controller 250 through the user interface 265; alternatively, the remote controller 100A may receive an output signal such as audio, video, or data output from the user interface 265 via the controller 250, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user interface 265 receives the user input commands through the GUI. Specifically, the user interface 265 may receive user input commands for controlling the position of a selector in the GUI to select different objects or items.
Alternatively, the user may input a user command by making a specific sound or gesture, and the user interface 265 receives the user input command by recognizing the sound or gesture through a sensor.

The video processor 270 is configured to receive an external video signal and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a video signal that can be directly displayed or played on the display 275.
Illustratively, the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is configured to demultiplex an input audio/video data stream. For example, for an input MPEG-2 stream (a compression standard for digital storage media moving images and audio), the demultiplexing module demultiplexes it into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
And the image synthesis module is used for carrying out superposition mixing processing on the GUI signal input by the user or generated by the user and the video image after the zooming processing by the graphic generator so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, for example converting an input 60 Hz video frame rate into a 120 Hz or 240 Hz frame rate, typically by means of frame interpolation.
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format of a display, such as converting the format of the signal output by the frame rate conversion module to output an RGB data signal.
A display 275 for receiving the image signal from the video processor 270 and displaying video content, images, and the menu manipulation interface. The displayed video content may come from the video content in the broadcast signal received by the tuner demodulator 210, or from video content input through the communicator 220 or the external device interface 240. The display 275 also displays a user manipulation interface (UI) that is generated in the display apparatus 200 and used to control the display apparatus 200.
And, the display 275 may include a display screen assembly for presenting a picture and a driving assembly for driving the display of an image. Alternatively, a projection device and projection screen may be included, provided display 275 is a projection display.
The audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification processing to obtain an audio signal that can be played by the speaker 286.
Illustratively, audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, Advanced Audio Coding (AAC), high efficiency AAC (HE-AAC), and the like.
The audio output interface 285 is used for receiving the audio signal output by the audio processor 280 under the control of the controller 250. The audio output interface 285 may include a speaker 286 or an external sound output terminal 287, such as an earphone output terminal, for output to a sound-producing device of an external apparatus.
In other exemplary embodiments, video processor 270 may comprise one or more chips. Audio processor 280 may also comprise one or more chips.
And, in other exemplary embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated with the controller 250 in one or more chips.
And a power supply 290 for supplying power supply support to the display apparatus 200 from the power input from the external power source under the control of the controller 250. The power supply 290 may be a built-in power supply circuit installed inside the display apparatus 200 or may be a power supply installed outside the display apparatus 200.
With reference to the display device in fig. 1C, when the display device of the present invention implements two-way video playing, some of the components in the display device specifically perform the following operations:
and the display is used for displaying the playing picture.
An audio output interface for outputting audio data;
the browser functional module is controlled by the controller and is used for acquiring two paths of media stream data from the server;
the demultiplexing module is controlled by the controller and is used for demultiplexing the media stream data acquired by the browser function module into video data and audio data;
the decoding module is used for decoding the media stream data after demultiplexing to obtain a first path of media stream data comprising a first path of video data and a first path of audio data and a second path of media stream data at least comprising a second path of video data;
the playing module is controlled by the controller and is used for outputting a first path of video data in the decoded first path of media stream data to a first window on the display and outputting a first path of audio data to the audio output interface;
and the browser function module is controlled by the controller and is also used for rendering the second path of video data in the decoded second path of media stream data into continuous video frames and outputting the continuous video frames to a second window on the display according to a certain frame rate.
Illustratively, as shown in fig. 1E, one path of media stream data, such as a variety program, is played full screen on the display, while another path of media stream data, such as an advertising video, is played in a window in the upper left corner of the display. In this way, advertisement information can be provided while the user watches the variety program, thereby improving the user experience.
In an embodiment, the controller may be specifically configured to control the decoding module to compress the second path of video data in RGB format into YUV data and then send the YUV data to the browser function module. Specifically, the media stream data is processed by the decoding module to obtain the second path of video data, which is RGB data. Generally, a video image adopts the RGB mode, where RGB is a combination of the three primary colors red, green, and blue, and each pixel point is composed of a set of three RGB components. For example, for a 4 × 2 picture, the RGB data is stored as follows:
R1R2R3R4R5R6R7R8G1G2G3G4G5G6G7G8B1B2B3B4B5B6B7B8;
To improve processing performance and further compress the data storage space, the RGB data format can be converted into the YUV data format, where Y represents the luminance, i.e. the gray value, and U and V are the chrominance components, which describe the color and saturation of the image and specify the color of a pixel. Taking the YUV420 data format as an example, every four Y samples share one set of UV components. For the 4 × 2 picture, YUV420 data can be stored in two formats:
NV12 data format:
Y1Y2Y3Y4Y5Y6Y7Y8U1V1U2V2;
YV12 data format:
Y1Y2Y3Y4Y5Y6Y7Y8U1U2V1V2;
Comparing the RGB data format with the YUV data format shows that sharing the UV components allows the YUV data to save half of the memory storage space; therefore, in this embodiment, the purpose of data compression can be achieved by converting RGB data into YUV data.
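The conversion can be illustrated with a minimal C++ sketch that turns planar RGB data into planar YUV420 data. The BT.601-style coefficients and the top-left chroma sampling used here are assumptions for illustration only; the embodiment merely refers to "a conversion algorithm" without specifying one.

    #include <cstdint>
    #include <vector>

    // Sketch: convert planar RGB (laid out as in the 4 x 2 example above)
    // into planar YUV420. The conversion coefficients are an assumed
    // BT.601-style approximation, not a formula mandated by this embodiment.
    struct Yuv420 {
        std::vector<uint8_t> y;  // one Y sample per pixel
        std::vector<uint8_t> u;  // one U sample per 2 x 2 block
        std::vector<uint8_t> v;  // one V sample per 2 x 2 block
    };

    Yuv420 rgbToYuv420(const uint8_t* r, const uint8_t* g, const uint8_t* b,
                       int width, int height) {
        Yuv420 out;
        out.y.resize(width * height);
        out.u.resize((width / 2) * (height / 2));
        out.v.resize((width / 2) * (height / 2));
        for (int row = 0; row < height; ++row) {
            for (int col = 0; col < width; ++col) {
                int i = row * width + col;
                // Luma for every pixel.
                out.y[i] = static_cast<uint8_t>(0.299 * r[i] + 0.587 * g[i] + 0.114 * b[i]);
                // One chroma pair per 2 x 2 block, sampled at the block's top-left pixel.
                if (row % 2 == 0 && col % 2 == 0) {
                    int c = (row / 2) * (width / 2) + (col / 2);
                    out.u[c] = static_cast<uint8_t>(-0.169 * r[i] - 0.331 * g[i] + 0.5 * b[i] + 128);
                    out.v[c] = static_cast<uint8_t>(0.5 * r[i] - 0.419 * g[i] - 0.081 * b[i] + 128);
                }
            }
        }
        return out;
    }

For the 4 × 2 example, this yields eight Y samples (Y1 to Y8) plus two U and two V samples, which is exactly the data that is then arranged as NV12 or YV12 in the two methods described below.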
There are two methods for compressing RGB data into YUV data:
In the first method, the controller may control the decoding module to convert the second path of video data in RGB format into YUV data, then sort the Y, U, and V data groups of the YUV data into third video data in a preset format, and send the third video data to the browser function module. The preset format is the NV12 format or the YV12 format.
For example, as shown in fig. 2-1, the decoding module may decode the H.264-format media stream data in hardware to obtain RGB data. Assuming the RGB data is a 4 × 2 block of pixels, it may be represented as:
R1R2R3R4R5R6R7R8G1G2G3G4G5G6G7G8B1B2B3B4B5B6B7B8;
the RGB data is converted into YUV data through a conversion algorithm, yielding:
Y:Y1Y2Y3Y4Y5Y6Y7Y8;
U:U1U2;
V:V1V2;
after encoding (herein, NV12 is taken as an example), NV12 data is obtained, namely:
Y1Y2Y3Y4Y5Y6Y7Y8U1V1U2V2。
This NV12 data is the second-path video data that is ultimately input to the browser function module.
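As a sketch of the sorting step in this first method, the following C++ fragment packs the three planes produced by the conversion into a single NV12 buffer (all Y samples first, then U and V interleaved pairwise), matching the Y1 Y2 ... Y8 U1 V1 U2 V2 layout above. The function name and buffer representation are illustrative assumptions.

    #include <cstdint>
    #include <vector>

    // First method (sketch): sort the planar Y, U, V groups into one NV12 buffer.
    std::vector<uint8_t> packNv12(const std::vector<uint8_t>& y,
                                  const std::vector<uint8_t>& u,
                                  const std::vector<uint8_t>& v) {
        std::vector<uint8_t> nv12;
        nv12.reserve(y.size() + u.size() + v.size());
        nv12.insert(nv12.end(), y.begin(), y.end());  // Y1 Y2 ... Y8
        for (size_t i = 0; i < u.size(); ++i) {       // U1 V1 U2 V2 ...
            nv12.push_back(u[i]);
            nv12.push_back(v[i]);
        }
        return nv12;
    }

This interleaving pass is the extra sorting work that the second method below avoids.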
In the second method, the controller may control the decoding module to convert the second path of video data in RGB format into YUV data, then encapsulate the Y, U, and V data groups of the YUV data separately to obtain virtual YV12 data, and send the virtual YV12 data to the browser function module.
For example, as shown in fig. 2-2, the decoding module may decode the H.264-format media stream data in hardware to obtain RGB data. Assuming the RGB data is a 4 × 2 block of pixels, it may be represented as:
R1R2R3R4R5R6R7R8G1G2G3G4G5G6G7G8B1B2B3B4B5B6B7B8;
the RGB data is converted into YUV data through a conversion algorithm, yielding:
Y:Y1Y2Y3Y4Y5Y6Y7Y8;
U:U1U2;
V:V1V2;
the YUV data is directly encapsulated to obtain virtual YV12 data, that is:
Y:Y1Y2Y3Y4Y5Y6Y7Y8;
U:U1U2;
V:V1V2。
This virtual YV12 data is the second-path video data that is eventually input to the browser function module.
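A corresponding sketch of the second method simply wraps the three planes as separately encapsulated buffers (the "virtual YV12" data) without any re-sorting; because YV12 stores U and V as whole planes rather than interleaved pairs, the planes can be handed to the browser function module as-is. The struct and function names are assumptions for illustration.

    #include <cstdint>
    #include <utility>
    #include <vector>

    // Second method (sketch): no sorting pass; each plane is kept as its own buffer.
    struct VirtualYv12Frame {
        std::vector<uint8_t> y;  // Y1 Y2 ... Y8
        std::vector<uint8_t> u;  // U1 U2
        std::vector<uint8_t> v;  // V1 V2
    };

    VirtualYv12Frame encapsulateVirtualYv12(std::vector<uint8_t> y,
                                            std::vector<uint8_t> u,
                                            std::vector<uint8_t> v) {
        // Move the planes into the wrapper; no copying or reordering is needed,
        // which is why this path skips the sorting step of the first method.
        return VirtualYv12Frame{std::move(y), std::move(u), std::move(v)};
    }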
In the first method, the Y, U, and V data groups of the YUV data need to be sorted into video data in a preset format (such as NV12). In the second method, because the U and V data in YV12 data are not interleaved, the U and V data can be stored directly and separately; therefore, after the RGB data is converted to obtain YUV data, no YUV sorting is performed according to the NV12 or YV12 format. Instead, the Y, U, and V data groups of the YUV data are encapsulated separately to obtain virtual YV12 data, and the virtual YV12 data is sent to the browser function module. The second method thus omits the data sorting process compared with the first method, further optimizing the encoding and decoding flow.
In an embodiment, the browser function module is specifically configured to render the second-path video data in YUV format through OpenGL to obtain continuous video frame images. For each frame of second-path video data in YUV format, the browser function module may encapsulate the data into a specified data packet, obtain the position coordinates of the second window on the display screen, determine the position coordinates of each pixel point in the second-path video data according to the position coordinates of the second window, fill the data in the specified data packet into the second window based on the pixel position coordinates, and display the video frame image corresponding to the second-path video data in the second window on the display screen.
As shown in fig. 3, in process stage A, the browser function module constructs a render tree by parsing HTML and CSS, where the data nodes in the render tree determine the position and size of the video playing window (i.e. the second window) on the display, both expressed as coordinates. Then, in process stage B, the browser function module encapsulates the second-path video data into a specified data packet, for example a DecoderTarget data packet, confirms the position coordinates of each pixel point through Box packing and Layout, and then fills each pixel point in the DecoderTarget data packet into the display area where the second window is located according to its position coordinates, so that the video frame image is displayed in the second window.
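As an illustration of this rendering path, the sketch below shows how each YUV frame could be drawn into the second window with OpenGL: the three planes are uploaded as single-channel textures, and a quad covering the window's rectangle is drawn with a YUV-to-RGB fragment shader. The OpenGL ES 2.0-style shader and the conversion coefficients are assumptions; the embodiment only states that rendering is performed through OpenGL.

    // Assumed GLSL (OpenGL ES 2.0 style) fragment shader: sample the Y, U and V
    // planes and convert to RGB for display in the second window.
    static const char* kYuvToRgbFragmentShader = R"(
        precision mediump float;
        varying vec2 vTexCoord;
        uniform sampler2D uTexY;   // luma plane
        uniform sampler2D uTexU;   // chroma U plane
        uniform sampler2D uTexV;   // chroma V plane
        void main() {
            float y = texture2D(uTexY, vTexCoord).r;
            float u = texture2D(uTexU, vTexCoord).r - 0.5;
            float v = texture2D(uTexV, vTexCoord).r - 0.5;
            // Assumed BT.601-style YUV -> RGB conversion.
            gl_FragColor = vec4(y + 1.402 * v,
                                y - 0.344 * u - 0.714 * v,
                                y + 1.772 * u,
                                1.0);
        }
    )";

    // Per frame (outline): upload the planes, position the quad at the second
    // window's coordinates taken from the render tree, then draw.
    //   glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h, GL_LUMINANCE, GL_UNSIGNED_BYTE, frame.y.data());
    //   ... same for the U and V planes at half resolution ...
    //   glViewport(windowX, windowY, windowWidth, windowHeight);
    //   glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

Drawing one such quad per decoded frame, at the frame rate of the second path, is what produces the continuous video effect in the second window.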
Fig. 4 illustrates the interaction flow of a two-way video playing method. The controller may control the browser function module to acquire two paths of media stream data from the server, as shown in step ① in fig. 4, and then control the demultiplexing module to demultiplex the two paths of media stream data acquired by the browser function module into video data and audio data.
The decoding module decodes the demultiplexed media stream data, as shown in step ② in fig. 4, to obtain the first path of video data and the second path of video data, and the controller controls the video playing module to output the first path of video data in the decoded first path of media stream data to the first window on the display.
The controller controls the decoding module to compress the second video data in RGB format into the second video data in YUV format, as shown in step ③ in fig. 4.
For each frame of YUV data, the controller controls the browser function module to encapsulate the YUV data into a specified data packet, as shown by arrow ④ in fig. 4, where the specified data packet may be a DecoderTarget data packet.
Then, the browser functional module may obtain the position coordinates of the second window in the display screen, determine the position coordinates of each pixel point in the video image according to the position coordinates of the second window, and fill the data in the designated data packet into the second window based on the position coordinates of the pixel point, which is a rendering process of the video image, as shown by an arrow ⑤ in fig. 4.
The rendered video frame image may be transmitted from GPU memory to the display memory, so that the video frame image corresponding to the second-path video data is displayed in the second window on the display screen, as indicated by arrow ⑥ in fig. 4. The browser performs the above operations for each frame of the second-path video data, so that continuous video frame images are displayed in the second window.
In some embodiments, if the controller (e.g., chip) has a strong dual-channel playing capability, for example, a dual-channel decoding module to support dual-channel video decoding, and a dual-channel playing module (e.g., a player provided by the controller) to support dual-channel video playing, a dual-channel video playing function may be implemented. However, in some embodiments, if the controller (e.g., a chip) does not have a two-way video playing function, for example, only has one-way decoding module or only has one-way playing module, on one hand, video playing can be implemented based on the one-way playing module, and on the other hand, frame-by-frame rendering of video frames can be implemented based on the rendering function of the browser itself, so as to achieve the effect of video playing, thereby implementing the two-way video playing function on the display device.
Furthermore, the video processor of the invention compresses the second-path video data in RGB format into YUV data before sending it to the browser, thereby improving the efficiency of data transmission and data processing and making video playing smoother.
Fig. 5 illustrates a flow chart of a two-way video playing method.
With the above-mentioned display device structure, the method for performing two-way video playing on the display device includes the following steps 501 to 505:
step 501: the controller controls the browser functional module to acquire two paths of media stream data from the server.
Step 502: the controller controls the demultiplexing module to demultiplex the two paths of media stream data acquired by the browser function module into video data and audio data.
Step 503: the decoding module decodes the demultiplexed media stream data to obtain a first path of media stream data comprising the first path of video data and the first path of audio data, and a second path of media stream data comprising at least the second path of video data.
Step 504: the controller controls the playing module to output the first path of video data in the decoded first path of media stream data to a first window on the display, and output the first path of audio data to the audio output interface.
Step 505: the controller controls the browser function module to render the second path of video data in the decoded second path of media stream data into continuous video frames, and outputs the continuous video frames to a second window on the display according to a certain frame rate.
In one embodiment, the decoding module compresses the second video data in RGB format into YUV data and sends the YUV data to the browser function module.
In an embodiment, the decoding module compressing the second-path video data in RGB format into YUV data and then sending the YUV data to the browser function module specifically includes: after converting the second-path video data in RGB format into YUV data, the decoding module encapsulates the Y, U, and V data groups of the YUV data separately to obtain virtual YV12 data, and sends the virtual YV12 data to the browser function module.
For a specific embodiment of the method for performing two-way video playing on the display device, reference may be made to the above-mentioned embodiment processed by each module in the display device, and details are not described here again.
In the above embodiment, the display device may obtain two paths of media stream data from the server through the browser function module, and the demultiplexing module demultiplexes the two paths of media stream data into video data and audio data; the decoding module decodes the media stream data after demultiplexing to obtain a first path of media stream data including a first path of video data and a first path of audio data and a second path of media stream data at least including a second path of video data; the playing module outputs a first path of video data in the decoded first path of media stream data to a first window on a display, and outputs a first path of audio data to the audio output interface; and the browser functional module renders the decoded second path of video data into continuous video frames and outputs the continuous video frames to a second window on the display according to a certain frame rate.
On one hand, the first path of media stream data can be played through a playing module in the display device; on the other hand, based on the rendering function of the browser, the frame-by-frame rendering of the video frames can be realized, so that the second path of media stream data is played; therefore, the function of two-way media stream data playing on the display equipment is realized.
Furthermore, the invention can control the second video data in the RGB format to be compressed into YUV data and then sent to the browser functional module, thereby improving the efficiency of data transmission and data processing and ensuring that the video playing is smoother.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (10)

1. A display device, comprising:
a display;
an audio output interface;
the controller is used for controlling the browser function module, the demultiplexing module, the decoding module and the playing module; the browser function module is used for acquiring two paths of media stream data;
the demultiplexing module is used for demultiplexing the two paths of media stream data acquired by the browser function module into video data and audio data respectively;
the decoding module is used for respectively decoding the two paths of demultiplexed media stream data to obtain a first path of media stream data comprising a first path of video data and a first path of audio data and a second path of media stream data at least comprising a second path of video data;
the playing module is used for outputting a first path of video data in the decoded first path of media stream data to a first window on the display and outputting a first path of audio data to the audio output interface;
and the browser function module is further configured to render the second path of video data in the decoded second path of media stream data into continuous video frames, and output the continuous video frames to a second window on the display according to a certain frame rate.
2. The display device of claim 1, wherein the browser function module is specifically configured to:
and rendering the second video data in the YUV format through OPENGL to obtain a continuous video frame image.
3. The display device of claim 2, wherein the browser function module is specifically configured to:
for each frame of the second video data in YUV format, package the second video data into a specified data packet;
acquire the position coordinates of the second window in the display screen of the display;
determine the position coordinates of each pixel in the second video data according to the position coordinates of the second window;
and fill the data in the specified data packet into the second window based on the position coordinates of the pixels, and display, through the display, a video frame image corresponding to the second video data in the second window.
4. The display device of claim 1, wherein the controller is configured to:
control the second video data in RGB format to be compressed into YUV data and then sent to the browser function module.
5. The display device of claim 4, wherein the controller is further configured to:
after the second video data in RGB format is converted into YUV data, arrange the Y, U and V groups of data in the YUV data into third video data in a preset format, and send the third video data to the browser function module.
6. The display device according to claim 5, wherein the preset format is the NV12 format or the YV12 format.
7. The display device of claim 4, wherein the controller is further configured to:
and after the second video data in the RGB format is converted into YUV data, respectively packaging Y, U, V groups of data in the YUV data to obtain virtual YV12 data, and sending the virtual YV12 data to the browser function module.
8. A method for playing two-way media stream data in a display device, characterized by comprising the following steps:
acquiring two paths of media stream data;
demultiplexing the acquired two paths of media stream data into video data and audio data;
decoding the two paths of media stream data after demultiplexing to obtain a first path of media stream data comprising a first path of video data and a first path of audio data and a second path of media stream data at least comprising a second path of video data;
outputting a first path of video data in the decoded first path of media stream data to a first window on a display, and outputting a first path of audio data to an audio output interface;
rendering the second path of video data in the decoded second path of media stream data into continuous video frames, and outputting the continuous video frames to a second window on the display according to a certain frame rate.
9. The method of claim 8, further comprising:
compressing the second video data in the RGB format into YUV data.
10. The method of claim 8, further comprising:
and after the second video data in the RGB format is converted into YUV data, respectively packaging Y, U, V groups of data in the YUV data to obtain virtual YV12 data.
CN201911222288.6A 2019-12-03 2019-12-03 Method for playing two-way media stream data and display equipment Pending CN111107428A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911222288.6A CN111107428A (en) 2019-12-03 2019-12-03 Method for playing two-way media stream data and display equipment
PCT/CN2020/079175 WO2021109354A1 (en) 2019-12-03 2020-03-13 Media stream data playback method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911222288.6A CN111107428A (en) 2019-12-03 2019-12-03 Method for playing two-way media stream data and display equipment

Publications (1)

Publication Number Publication Date
CN111107428A true CN111107428A (en) 2020-05-05

Family

ID=70420914

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911222288.6A Pending CN111107428A (en) 2019-12-03 2019-12-03 Method for playing two-way media stream data and display equipment

Country Status (2)

Country Link
CN (1) CN111107428A (en)
WO (1) WO2021109354A1 (en)


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6124897A (en) * 1996-09-30 2000-09-26 Sigma Designs, Inc. Method and apparatus for automatic calibration of analog video chromakey mixer
US8537201B2 (en) * 2010-10-18 2013-09-17 Silicon Image, Inc. Combining video data streams of differing dimensionality for concurrent display
US10097785B2 (en) * 2014-10-01 2018-10-09 Sony Corporation Selective sign language location
US9697630B2 (en) * 2014-10-01 2017-07-04 Sony Corporation Sign language window using picture-in-picture
CN105916002B (en) * 2016-02-05 2019-01-15 四川长虹电器股份有限公司 A kind of player windows display system and method for realizing soft or hard decoding switching
CN106604097B (en) * 2016-12-07 2020-08-11 广东威创视讯科技股份有限公司 Method and system for transmitting multiple video signals

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102724472A (en) * 2012-06-20 2012-10-10 杭州海康威视数字技术股份有限公司 Method and system for image data format conversion in image processing
CN104813334A (en) * 2012-11-26 2015-07-29 日立麦克赛尔株式会社 Network terminal system, display device, terminal device, information processing method in display device, and program
CN104202292A (en) * 2013-03-15 2014-12-10 株式会社理光 Distribution control system, distribution system and distribution control method
CN104079597A (en) * 2013-03-26 2014-10-01 华为终端有限公司 Transfer method of media stream and user equipment
CN106095241A (en) * 2016-06-14 2016-11-09 武汉深之度科技有限公司 The window display method of a kind of Web application, device and the equipment of calculating
CN106210703A (en) * 2016-09-08 2016-12-07 北京美吉克科技发展有限公司 The utilization of VR environment bust shot camera lens and display packing and system
CN106331850A (en) * 2016-09-18 2017-01-11 上海幻电信息科技有限公司 Browser live broadcast client, browser live broadcast system and browser live broadcast method
US20180131741A1 (en) * 2016-11-07 2018-05-10 Hanwha Techwin Co., Ltd. Adaptive media streaming method and apparatus according to decoding performance
CN106973320A (en) * 2017-04-18 2017-07-21 深圳创维-Rgb电子有限公司 A kind of multi-path flash demo method, system and intelligent television
CN109547838A (en) * 2018-12-06 2019-03-29 青岛海信传媒网络技术有限公司 The processing method and processing device of video window
CN110519628A (en) * 2019-09-20 2019-11-29 青岛海信移动通信技术股份有限公司 A kind of picture-in-picture display methods and display equipment

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111601158A (en) * 2020-05-14 2020-08-28 青岛海信传媒网络技术有限公司 Method for optimizing audio track cutting of streaming media pipeline and display equipment
CN111601158B (en) * 2020-05-14 2021-11-02 青岛海信传媒网络技术有限公司 Method for optimizing audio track cutting of streaming media pipeline and display equipment
CN111757176A (en) * 2020-06-11 2020-10-09 青岛海信传媒网络技术有限公司 Streaming media file safe playing method and display equipment
CN112911371A (en) * 2021-01-29 2021-06-04 Vidaa美国公司 Double-channel video resource playing method and display equipment
CN113038221A (en) * 2021-03-02 2021-06-25 海信电子科技(武汉)有限公司 Double-channel video playing method and display equipment
CN113038221B (en) * 2021-03-02 2023-02-28 Vidaa(荷兰)国际控股有限公司 Double-channel video playing method and display equipment

Also Published As

Publication number Publication date
WO2021109354A1 (en) 2021-06-10

Similar Documents

Publication Publication Date Title
CN111698551B (en) Content display method and display equipment
CN111083539A (en) Display device
WO2021109354A1 (en) Media stream data playback method and device
CN111601142B (en) Subtitle display method and display equipment
CN111629249B (en) Method for playing startup picture and display device
CN111246309A (en) Method for displaying channel list in display device and display device
CN111343492B (en) Display method and display device of browser in different layers
CN111654743B (en) Audio playing method and display device
CN111045557A (en) Moving method of focus object and display device
CN111417027A (en) Method for switching small window playing of full-screen playing of webpage video and display equipment
CN113347413A (en) Window position detection method and display device
CN111277911B (en) Image processing method of panoramic video, display device and server
CN111526401B (en) Video playing control method and display equipment
CN112040308A (en) HDMI channel switching method and display device
CN111641856A (en) Prompt message display method for guiding user operation in display equipment and display equipment
CN111885415B (en) Audio data rapid output method and display device
CN111405329B (en) Display device and control method for EPG user interface display
CN113497906B (en) Volume adjusting method and device and terminal
CN111601147A (en) Content display method and display equipment
CN112040285A (en) Interface display method and display equipment
CN111614995A (en) Menu display method and display equipment
CN111459372A (en) Network list refreshing display method and display equipment
CN111405332B (en) Display device and control method for EPG user interface display
CN111901686B (en) Method for keeping normal display of user interface stack and display equipment
CN112040317B (en) Event response method and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221031

Address after: 83 Intekte Street, Devon, Netherlands

Applicant after: VIDAA (Netherlands) International Holdings Ltd.

Address before: 266061 room 131, 248 Hong Kong East Road, Laoshan District, Qingdao City, Shandong Province

Applicant before: QINGDAO HISENSE MEDIA NETWORKS Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20200505