CN113825032A - Media asset playing method and display equipment

Info

Publication number
CN113825032A
Authority
CN
China
Prior art keywords
playing
parameter
media asset
display device
quality parameter
Prior art date
Legal status
Pending
Application number
CN202010559495.7A
Other languages
Chinese (zh)
Inventor
秦鹏鹏
Current Assignee
Juhaokan Technology Co Ltd
Original Assignee
Juhaokan Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Juhaokan Technology Co Ltd
Priority to CN202010559495.7A
Priority to PCT/CN2021/081356 (WO2021253895A1)
Publication of CN113825032A
Legal status: Pending

Classifications

All classifications fall under H04N21/00, Selective content distribution, e.g. interactive television or video on demand [VOD]:

    • H04N21/485 End-user interface for client configuration
    • H04N21/2662 Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4402 Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440263 Reformatting by altering the spatial resolution, e.g. for displaying on a connected PDA
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/4858 End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows

Abstract

Embodiments of the present application disclose a media asset playing method and a display device, belonging to the technical field of terminals. The method includes: receiving a control display instruction; in response to the control display instruction, obtaining a first playing effect parameter supported by the currently played media asset, where the first playing effect parameter indicates a playing effect of the media asset; determining a target playing effect parameter based on the first playing effect parameter and a second playing effect parameter supported by the display device, where the second playing effect parameter indicates a playing effect with which the display device can play media assets, and the target playing effect parameter is included in both the first playing effect parameter and the second playing effect parameter; and displaying a control corresponding to the target playing effect parameter. In this way, the user can switch the playing effect of the media asset by directly operating the displayed control corresponding to the target playing effect parameter, and can quickly experience a better media asset playing effect.

Description

Media asset playing method and display equipment
Technical Field
Embodiments of the present application relate to the technical field of terminals, and in particular, to a media asset playing method and a display device.
Background
In the process of playing a media asset, the display device can switch the first playing effect parameter of the media asset based on a user operation, so that the user enjoys a better audio-visual experience. The media asset may include audio and video, and the first playing effect parameter may be a sound effect of the audio or an image quality of the video. For example, if the image quality of a currently played video needs to be switched, the same video may correspond, under one first playing quality parameter (that is, under one definition), to multiple code streams with different image qualities, so that when the display device plays the video it can switch among the code streams of different image qualities based on the user's operation, thereby switching the image quality of the video. However, the user is usually not a professional and may need many attempts to switch to a high image quality, which wastes a lot of time on switching the image quality, reduces the video playing efficiency, and in turn affects the user experience.
Disclosure of Invention
Embodiments of the present application provide a media asset playing method and a display device, which can solve the problem of low video playing efficiency in the related art. The technical solutions are as follows:
in one aspect, there is provided a display apparatus, including: a display configured to display a user interface; a communicator for communicating with a mobile terminal or a server; a controller in communication with the display, the controller to:
receiving a control display instruction;
responding to the control display instruction, and acquiring a first play effect parameter supported by the currently played media asset, wherein the first play effect parameter is used for indicating the play effect of the media asset;
determining a target playing effect parameter based on the first playing effect parameter and a second playing effect parameter supported by a display device, wherein the second playing effect parameter is used for indicating the playing effect of playing media assets in the display device, and the target playing effect parameter is simultaneously included in the first playing effect parameter and the second playing effect parameter;
and displaying the control corresponding to the target playing effect parameter.
In another aspect, there is provided a display apparatus including: a display configured to display a user interface; a communicator for communicating with a mobile terminal or a server; a controller in communication with the display, the controller to:
receiving a control display instruction;
responding to the control display instruction, and acquiring a first image quality parameter supported by the currently played media asset under the current resolution;
determining a target image quality parameter with the clearest image quality based on the intersection of the first image quality parameter and a second image quality parameter supported by a display device, wherein the target image quality parameter is contained in the first image quality parameter and the second image quality parameter;
and displaying the image quality control corresponding to the target image quality parameter.
In another aspect, a media asset playing method is provided, applied to a display device, and the method includes:
receiving a control display instruction;
responding to the control display instruction, and acquiring a first play effect parameter supported by the currently played media asset, wherein the first play effect parameter is used for indicating the play effect of the media asset;
determining a target playing effect parameter based on the first playing effect parameter and a second playing effect parameter supported by a display device, wherein the second playing effect parameter is used for indicating the playing effect of playing media assets in the display device, and the target playing effect parameter is simultaneously included in the first playing effect parameter and the second playing effect parameter;
and displaying the control corresponding to the target playing effect parameter.
In another aspect, a computer-readable storage medium is provided, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the media asset playing method described above.
In another aspect, a computer program product containing instructions is provided, which when run on a computer, causes the computer to perform the steps of the method for playing media assets described above.
The technical scheme provided by the embodiment of the application can at least bring the following beneficial effects:
if the display device receives the control display instruction, it can be considered that the user wants to switch the playing effect parameter of the currently played media asset; therefore, the first playing effect parameter supported by the currently played media asset, which indicates the playing effect of the media asset, can be obtained. Since a first playing effect parameter supported by the media asset is not necessarily supported by the display device, the target playing effect parameter supported by both the media asset and the display device is determined from the first playing effect parameter and the second playing effect parameter supported by the display device, and the control corresponding to the target playing effect parameter is then displayed. In this way, the user can switch the playing effect of the media asset by directly operating the displayed control corresponding to the target playing effect parameter without complicated operations, and can quickly experience a better media asset playing effect, which improves the media asset playing efficiency and brings the user a better audio-visual experience.
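To make the negotiation above concrete, the following is a minimal sketch, not taken from the original disclosure, of how a controller might compute the target playing effect parameters as the set of parameters supported by both the currently played media asset and the display device; the class name, method names, and parameter strings are all hypothetical.

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Hypothetical helper: intersects asset-supported and device-supported parameters.
public final class PlayEffectNegotiator {

    // Returns the playing effect parameters included in both the first parameters
    // (supported by the media asset) and the second parameters (supported by the
    // display device), preserving the asset's ordering.
    public static List<String> targetParameters(List<String> assetSupported,
                                                List<String> deviceSupported) {
        Set<String> device = new LinkedHashSet<>(deviceSupported);
        Set<String> target = new LinkedHashSet<>();
        for (String param : assetSupported) {
            if (device.contains(param)) {
                target.add(param); // supported by both sides
            }
        }
        return List.copyOf(target);
    }

    public static void main(String[] args) {
        // Example: image quality parameters under the current resolution.
        List<String> asset = List.of("DOLBY_VISION", "HDR10", "SDR");
        List<String> device = List.of("HDR10", "SDR");
        // Only controls for HDR10 and SDR would be displayed.
        System.out.println(targetParameters(asset, device));
    }
}
```

Displaying only this intersection keeps the user from selecting a playing effect that either the asset or the device cannot actually deliver.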
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those of ordinary skill in the art from these drawings without creative effort.
FIG. 1 is a schematic diagram illustrating an operational scenario between a display device and a control apparatus, according to some exemplary embodiments;
fig. 2 is a block diagram illustrating a hardware configuration of a display device according to some exemplary embodiments;
FIG. 3 is a block diagram illustrating a configuration of a control device, according to some exemplary embodiments;
FIG. 4 is a schematic diagram illustrating a functional configuration of a display device according to some exemplary embodiments;
FIG. 5 is a block diagram illustrating a configuration of a software system in a display device, according to some exemplary embodiments;
FIG. 6 is a block diagram illustrating a configuration of an application in a display device, according to some exemplary embodiments;
FIG. 7 is a flow diagram illustrating a method of playing a media asset, according to some exemplary embodiments;
FIG. 8 is a diagram illustrating a switching of a first play effect parameter, according to some example embodiments;
fig. 9 is a diagram illustrating another switching of a first play effect parameter according to some example embodiments;
fig. 10 is a schematic diagram illustrating yet another switching of a first play effect parameter according to some example embodiments;
FIG. 11 is a diagram illustrating a list of controls, according to some demonstrative embodiments;
fig. 12 is a thread diagram illustrating a method of media asset playback according to another exemplary embodiment.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
All other embodiments, which can be derived by a person skilled in the art from the exemplary embodiments shown in the present application without inventive step, are within the scope of the embodiments of the present application. In addition, while the disclosure in the embodiments of the present application is presented in terms of one or more exemplary embodiments, it should be appreciated that individual aspects of the disclosure can also constitute a complete technical solution on their own.
It should be understood that the terms "first," "second," "third," and the like in the description, the claims, and the drawings of the embodiments of the application are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application described herein can, for example, be implemented in orders other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module" as used in the embodiments of the present application refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote controller" used in the embodiments of the present application refers to a component of an electronic device (such as the display device disclosed in the embodiments of the present application), which can be controlled wirelessly, typically in a short distance range. The touch screen remote control device is generally connected with an electronic device by using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as WiFi, wireless USB (Universal Serial Bus), bluetooth, and a motion sensor.
The term "gesture" used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose, or result through a change in hand shape or an action such as hand movement.
Fig. 1 is a schematic diagram illustrating an operational scenario between a display device and a control apparatus, according to some exemplary embodiments. As shown in fig. 1, a user may operate the display device 200 through the mobile terminal 300 and the control apparatus 100.
The control device 100 may be a remote controller, which may use infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods, and controls the display apparatus 200 wirelessly or in another wired manner. The user may input user commands through keys on the remote controller, voice input, control panel input, etc. to control the display apparatus 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, etc. on the remote controller to control the display device 200.
In some embodiments, mobile terminals, tablets, computers, laptops, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device. The application, through configuration, may provide the user with various controls in an intuitive User Interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 300 may install a software application with the display device 200, implement connection communication through a network communication protocol, and implement the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 300 and the display device 200 can establish a control instruction protocol, synchronize a remote control keyboard to the mobile terminal 300, and control the display device 200 by controlling a user interface on the mobile terminal 300. The audio and video content displayed on the mobile terminal 300 can also be transmitted to the display device 200, so as to realize the synchronous display function.
As also shown in fig. 1, the display apparatus 200 also performs data communication with the server 400 through various communication means. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display apparatus 200. Illustratively, the display device 200 receives software program updates or accesses a remotely stored digital media library by sending and receiving information and through Electronic Program Guide (EPG) interactions. The server 400 may be one group or multiple groups of servers, and may be one or more types of servers. The server 400 also provides other web service contents such as video on demand and advertisement services.
The display device 200 may be a liquid crystal display, an OLED (Organic Light-Emitting Diode) display, or a projection display device. The particular display device type, size, resolution, etc. are not limited, and those skilled in the art will appreciate that the display device 200 may be changed in performance and configuration as needed.
The display apparatus 200 may additionally provide an intelligent network TV function that provides a computer support function in addition to the broadcast receiving TV function. Illustratively, the display device 200 may also provide smart TV functions such as network TV, smart TV, and Internet Protocol Television (IPTV).
Next, a description is given of a display device provided in an embodiment of the present application.
Referring to fig. 2, fig. 2 is a block diagram illustrating a hardware configuration of a display apparatus according to some exemplary embodiments. The display device 200 includes a controller 210, a tuning demodulator 220, a communication interface 230, a detector 240, an input/output interface 250, a video processor 260-1, an audio processor 260-2, a display 280, an audio output 270, a memory 290, a power supply, and an infrared receiver.
A display 280 for receiving the image signal from the video processor 260-1 and displaying the video content and image and components of the menu manipulation interface. The display 280 includes a display screen assembly for presenting a picture, and a driving assembly for driving the display of an image. The displayed video content may be broadcast television content, i.e., various broadcast signals received through a wired or wireless communication protocol, or the displayed video content may be various image content transmitted by a network server side received through a network communication protocol.
Meanwhile, the display 280 may also display a user manipulation UI interface generated in the display apparatus 200 and used to control the display apparatus 200. In addition, the display 280 may further include a driving component for driving the display according to the type of the display 280. Alternatively, in case the display 280 is a projection display, it may also comprise a projection device and a projection screen.
The communication interface 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communication interface 230 may be a Wifi chip 231, a bluetooth communication protocol chip 232, a wired ethernet communication protocol chip 233, or other network communication protocol chips or near field communication protocol chips, and an infrared receiver (not shown).
The display apparatus 200 may establish a connection for transmission and reception of control signals and data signals with an external control apparatus or a content providing apparatus through the communication interface 230. In addition, the infrared receiver is an interface for receiving an infrared control signal of the control device 100 (e.g., an infrared remote controller, etc.).
The detector 240 may be used to collect signals from the external environment or signals of interaction with the outside. The detector 240 includes a light receiver 242, which is a sensor for collecting the intensity of ambient light; by collecting the ambient light, display parameters and the like can be adaptively changed.
The detector 240 further includes an image collector 241, such as a camera, etc., which may be used to collect external environment scenes, collect attributes of the user or interact gestures with the user, adaptively change display parameters, and also recognize gestures of the user, so as to implement the function of interaction with the user.
In some embodiments, the detector 240 may further include a temperature sensor, and the display apparatus 200 may adaptively adjust a display color temperature of the image by sensing an ambient temperature through the temperature sensor. For example, when the ambient temperature is higher, the display apparatus 200 may be adjusted to display a cool tone, or when the ambient temperature is lower, the display apparatus 200 may be adjusted to display a warm tone.
In other embodiments, the detector 240 may further include a sound collector, such as a microphone, which may be used to receive a user's voice, a voice signal including a control instruction for the user to control the display device 200, or collect an ambient sound for identifying an ambient scene type, and the display device 200 may adapt to the ambient noise.
The input/output interface 250 is used for data transmission between the display device 200 and other external devices under the control of the controller 210. Such as receiving video signals, audio signals, command instructions, etc. from an external device.
The input/output interface 250 may include, but is not limited to, any one or more of the following interfaces: a High-Definition Multimedia Interface (HDMI) 251, an analog or digital high-definition component input interface 253, a composite video input interface 252, a USB input interface 254, and an RGB (Red Green Blue) port (not shown in the figure).
In some exemplary embodiments, the input/output interface 250 may also be a composite input/output interface formed by the above-mentioned plurality of interfaces.
The tuning demodulator 220 receives the broadcast television signal in a wired or wireless receiving manner, and may perform modulation and demodulation processing such as amplification, mixing, resonance, and the like, and demodulate a television audio/video signal carried in a television channel frequency selected by a user and an EPG (Electronic Program Guide) data signal from a plurality of wireless or wired broadcast television signals.
The tuner demodulator 220 responds, under the control of the controller 210, to the television signal frequency selected by the user and the television signal carried on that frequency.
The tuner demodulator 220 may have a variety of ways to receive signals, depending on the broadcast system of the television signal, such as: terrestrial broadcast, cable broadcast, satellite broadcast, or internet broadcast signals, etc.; according to different modulation types, digital modulation and analog modulation modes can be carried out; depending on the type of television signal being received, both analog and digital signals may be received.
In other exemplary embodiments, the tuner/demodulator 220 may be in an external device, such as an external set-top box. In this way, the set-top box outputs television audio/video signals after modulation and demodulation, and the television audio/video signals are input into the display device 200 through the input/output interface 250.
The video processor 260-1 is configured to receive an external video signal, and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, image synthesis, and the like according to a standard codec protocol of the input signal, so as to obtain a signal that can be directly displayed or played on the display device 200.
For example, the video processor 260-1 may include a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio/video data stream; for example, an input MPEG-2 stream is demultiplexed into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
The image synthesis module, such as an image synthesizer, is configured to superimpose and mix the graphics generated by a graphics generator, according to a GUI (Graphical User Interface) signal input by the user or generated by the system, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, for example, converting a 60 Hz frame rate into a 120 Hz or 240 Hz frame rate, commonly by means of frame interpolation.
And the display formatting module is used for changing the video output signals after the frame rate conversion is received to obtain signals conforming to the display format, such as output RGB data signals.
The audio processor 260-2 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, amplification processing, and the like to obtain an audio signal that can be played in the speaker.
In other exemplary embodiments, video processor 260-1 may include one or more chips. The audio processor 260-2 may also include one or more chips.
In other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or may be integrated with the controller 210 in one or more chips.
An audio output 270 receives the sound signal output by the audio processor 260-2 under the control of the controller 210, and includes, in addition to the speaker 272 carried by the display device 200 itself, an external sound output terminal 274 that can output to a sound-generating device of an external apparatus, such as an external sound interface or an earphone interface.
The power supply uses the power input from the external power source to provide power supply support for the display apparatus 200 under the control of the controller 210. The power supply may be a built-in power supply circuit installed inside the display device 200, or may be a power supply installed outside the display device 200, and provides a power supply interface for an external power supply in the display device 200.
A user input interface for receiving a user input signal and then transmitting the received user input signal to the controller 210. The user input signal may be a remote controller signal received through an infrared receiver, or may be various user control signals received through a network communication module.
Illustratively, the user inputs a user input signal through the control device 100 or the mobile terminal 300, and the user input interface responds to the user input signal through the controller 210 by the display device 200 according to the user input signal.
In some embodiments, a user may display a Graphical User Interface (GUI) on the display 280 to input a user command, which is received by the user input interface through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the input user command by recognizing the sound or gesture through the sensor.
The controller 210 controls the operation of the display apparatus 200 and responds to the user's operation through various software control programs stored in the memory 290.
As shown in fig. 2, the controller 210 includes a RAM (Random Access Memory) 213, a ROM (Read-Only Memory) 214, a graphics processor 216, a CPU processor 212, and a communication interface 218, such as: a first interface 218-1 through an nth interface 218-n, and a communication bus. The RAM213 and the ROM214, the graphic processor 216, the CPU processor 212, and the communication interface 218 are connected via a bus.
The RAM 213 stores instructions for various system boots. When the display device 200 receives a power-on signal and starts to power up, the CPU (Central Processing Unit) processor 212 executes the system boot instructions in the RAM 213 and copies the operating system stored in the memory 290 into the RAM 213 to start running the operating system. After the operating system has started, the CPU processor 212 copies the various application programs in the memory 290 into the RAM 213 and then starts running the various application programs.
A graphics processor 216 for generating various graphics objects, such as: icons, operation menus, user input instruction display graphics, and the like. The graphic processor 216 includes an operator for performing an operation by receiving various interactive instructions input by a user, and displays various objects according to display attributes. The graphics processor 216 also includes a renderer that generates various objects based on the operator and displays the rendered results on the display 280.
The CPU processor 212 is configured to execute the operating system and application program instructions stored in the memory 290, and execute various application programs, data and contents according to various received interactive instructions of external input, so as to finally display and play various audio-video contents.
In some exemplary embodiments, the CPU processor 212 may include a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor is used for performing some operations of the display apparatus 200 in a pre-power-up mode and/or for displaying a screen in the normal mode. The one or more sub-processors are used for performing operations in a standby mode or the like.
The controller 210 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
Wherein the object may be any one of the selectable objects, such as a hyperlink or an icon. Operations related to the selected object may include, for example: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
The memory 290 includes a memory for storing various software modules for driving the display device 200. Such as: various software modules stored in memory 290, including: the system comprises a basic module, a detection module, a communication module, a display control module, a browser module, various service modules and the like.
The basic module is used for signal communication among the hardware in the display device 200 and for sending processing and control signals from the bottom-layer software to the upper-layer modules. The detection module is used for collecting various information from various sensors or user input interfaces, and for performing digital-to-analog conversion, analysis, and management.
For example, the voice recognition module includes a voice analysis module and a voice instruction database module. The display control module is used for controlling the display 280 to display image content, and can be used for playing information such as multimedia image content and UI interfaces. The communication module is used for control and data communication with external devices. The browser module is used for performing data communication with browsing servers. The service module is used for providing various services, including various application programs.
Meanwhile, the memory 290 is also used to store visual effect maps and the like for receiving external data and user data, images of respective items in various user interfaces, and a focus object.
Referring to fig. 3, fig. 3 is a block diagram illustrating a configuration of a control apparatus according to some exemplary embodiments. The control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control device 100 is configured to control the display device 200; it can receive an operation instruction input by the user, convert the operation instruction into an instruction that the display device 200 can recognize and respond to, and serve as an interaction intermediary between the user and the display device 200. For example, when the user operates the channel up/down keys on the control device 100, the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications that control the display apparatus 200 according to user demands.
In some embodiments, as shown in fig. 1, the mobile terminal 300 or another intelligent electronic device may perform a function similar to that of the control device 100 after installing an application that manipulates the display device 200. For example, by installing the application, the user can use the various function keys or virtual buttons of the graphical user interface available on the mobile terminal 300 or other intelligent electronic device to implement the functions of the physical keys of the control device 100.
The controller 110 includes a processor 112 and RAM113 and ROM114, a communication interface 118, and a communication bus. The controller 110 is used to control the operation of the control device 100, as well as the internal components for communication and coordination and external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communication interface 130 may include at least one of a WiFi chip, a bluetooth module, an NFC module, and other near field communication modules.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, and other input interfaces. Such as: the user can realize a user instruction input function through actions such as voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display apparatus 200. In some embodiments, the interface may be an infrared interface or a radio frequency interface. For example, when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol, and the infrared control signal is sent to the display device 200 through an infrared sending module. As another example, when the radio frequency signal interface is used, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 through a radio frequency sending terminal.
In some embodiments, the control device 100 includes at least one of a communication interface 130 and an output interface. The control device 100 is provided with a communication interface 130, such as: a WiFi module, a bluetooth module, an NFC (Near Field Communication) module, etc. which may encode a user input command through a WiFi protocol, a bluetooth protocol, or an NFC protocol and send the user input command to the display device 200.
A memory 190 for storing various operation programs, data, and applications for driving and controlling the display device 200 under the control of the controller 110. The memory 190 may store various control signal commands input by a user.
The power supply 180, which is used to provide operational power support for the various components of the control device 100 under the control of the controller 110, may include a battery and associated control circuitry.
Referring to fig. 4, fig. 4 is a schematic diagram illustrating a functional configuration of a display device according to some exemplary embodiments.
As shown in fig. 4, the memory 290 is used to store an operating system, applications, contents, user data, and the like, and performs system operations for driving the display device 200 and various operations in response to a user under the control of the controller 210. The memory 290 may include volatile and/or nonvolatile memory.
The memory 290 is specifically configured to store an operating program for driving the controller 210 in the display device 200, and to store various applications installed in the display device 200, various applications downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used for storing System software such as an OS (Operating System) kernel, middleware, and applications, and storing input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers and related data such as the audio/video processors 260-1 and 260-2, the display 280, the communication interface 230, the tuning demodulator 220, the input/output interface of the detector 240, and the like.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an Application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290, for example, includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, and other applications 2912, a browser module, and the like. The controller 210 performs functions such as: a broadcast television signal reception demodulation function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal reception function, an electric power control function, a software control platform supporting various functions, a browser function, and the like.
Referring to fig. 5, fig. 5 is a block diagram illustrating a configuration of a software system in a display device according to some exemplary embodiments.
As shown in FIG. 5, an operating system 2911, including executing operating software for handling various basic system services and for performing hardware related tasks, acts as an intermediary for data processing performed between application programs and hardware components. In some embodiments, portions of the operating system kernel may contain a series of software to manage the display device hardware resources and provide services to other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display screen, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application program to achieve accessibility and operability of the application program for displaying content.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
The user interface module 2911-3 is configured to provide an object for displaying a user interface, so that each application program can access the object, and user operability can be achieved.
Control applications 2911-4 for controllable process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911 or within the application program 2912; in some embodiments it is implemented partly within the operating system 2911 and partly within the application program 2912. It is configured to listen for various user input events and, according to the recognition of various types of events or sub-events, invoke handlers that perform one or more predefined sets of operations.
The event monitoring module 2914-1 is configured to monitor an event or a sub-event input by the user input interface.
The event identification module 2914-2 is configured with definitions of the various types of events for the various user input interfaces, recognizes the various events or sub-events, and transmits them to the process used to execute the one or more corresponding sets of processing.
The event or sub-event refers to an input detected by one or more sensors in the display device 200 or an input from an external control device (e.g., the control device 100), such as various sub-events input by voice, gesture inputs recognized through gesture recognition, and sub-events input through remote control key commands of the control device. Illustratively, the one or more sub-events from the remote control include various forms, including but not limited to one or a combination of pressing the up/down/left/right keys, the OK key, key presses and holds, and the like, as well as non-physical key operations such as moving, holding, and releasing.
The interface layout manager 2913, directly or indirectly receiving the input events or sub-events from the event transmission system 2914, monitors the input events or sub-events, and updates the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, and the size, position, and level of the container, and other various execution operations related to the layout of the interface.
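As a rough illustration of the event listening and dispatching described above for the event transmission system 2914, the following sketch registers handlers and forwards recognized key events to them; it is an assumption for illustration only, and all class, enum, and method names are hypothetical rather than part of the original software system.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch of event listening and dispatching.
public final class EventTransmissionSystemSketch {

    public enum KeyEvent { UP, DOWN, LEFT, RIGHT, OK, MENU }

    private final List<Consumer<KeyEvent>> handlers = new ArrayList<>();

    // Registers a handler that performs a predefined set of operations for an event.
    public void register(Consumer<KeyEvent> handler) {
        handlers.add(handler);
    }

    // Called once the monitoring/identification modules have recognized an event.
    public void dispatch(KeyEvent event) {
        for (Consumer<KeyEvent> handler : handlers) {
            handler.accept(event);
        }
    }

    public static void main(String[] args) {
        EventTransmissionSystemSketch system = new EventTransmissionSystemSketch();
        // e.g. an interface layout update triggered by an OK key press
        system.register(e -> System.out.println("handle " + e));
        system.dispatch(KeyEvent.OK);
    }
}
```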
Referring to fig. 6, fig. 6 is a block diagram illustrating a configuration of an application program in a display device according to some exemplary embodiments.
As shown in fig. 6, the application layer 2912 contains various applications that may be executed at the display device 200. Applications may include, but are not limited to, one or more applications such as: live television applications, video-on-demand applications, media center applications, application program centers, gaming applications, and the like.
The live television application can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display a video of the live television signal on the display device 200.
The video-on-demand application can provide videos from different storage sources. Unlike live television applications, video on demand provides a video display from a storage source. For example, the video on demand may come from a server side of cloud storage or from a local hard disk storage containing stored video programs.
The media center application can provide various multimedia content playing applications. For example, a media center may provide services different from live television or video on demand, and the user can access various image or audio content through the media center application.
The application program center can provide and store various applications. The application may be a game, an application, or some other application associated with a computer system or other device that may be run in the smart television. The application center may obtain these applications from different sources, store them in local storage, and then be operable on the display device 200.
The following explains the media asset playing method provided in the embodiment of the present application in detail.
Fig. 7 is a flowchart illustrating a method for playing media assets, which is applied to the display device, and in particular, can be applied to a controller of the display device according to some exemplary embodiments. The following embodiments are explained only taking the display device as an execution subject. Referring to fig. 7, the method may include the following steps:
step 701: and receiving a control display instruction.
The control display instruction may be sent by the control device under the operation of the user.
As an example, in the process of playing the media asset by the display device, the user may send a control display instruction to the display device through the control device, and accordingly, the display device may receive the control display instruction.
Since this step is performed while the display device is playing a media asset, the media asset may already be playing on the display device before the control display instruction is received.
In implementation, in response to detecting the play instruction, the display device may send a second request to the server, where the second request includes a media asset identifier of the media asset, and the second request is used to request the server to feed back the first play quality parameter supported by the media asset based on the media asset identifier. And receiving a first play quality parameter supported by the media assets sent by the server. And acquiring a code stream of the media asset and decoding and playing the code stream based on the first playing quality parameter and the reference playing quality parameter supported by the media asset.
Wherein, the asset identifier can be used for uniquely indicating one asset. For example, the asset identifier may be a name, number, etc. of the asset.
The playing quality parameter may be a resolution parameter or a sound quality parameter. As an example, if the currently played media asset is a video, the playing quality parameter may be a resolution parameter. If the currently played media asset is audio, the playing quality parameter may be a sound quality parameter.
The playing instruction can be generated based on the detected triggering operation of the playing control, and the triggering operation can be executed by the user through the control device.
The reference playback quality parameter refers to the playback quality parameter used when the display device has not been configured after being turned on. As an example, the reference playing quality parameter may also be referred to as a default playing quality parameter, i.e., the playing quality parameter used automatically by default after the display device is turned on.
That is, if the display device detects a play instruction, it may be considered that the user wants to play the asset, and a second request may be sent to the server, where the second request includes an asset identifier of the asset. After receiving the second request, the server may obtain, based on the asset identifier, a first play quality parameter supported by the asset indicated by the asset identifier, and feed back the obtained first play quality parameter supported by the asset to the display device. After receiving the first play quality parameter supported by the media asset fed back by the server, the display device can acquire a code stream of the media asset and perform decoding play according to the first play quality parameter supported by the media asset and the reference play quality parameter.
In some embodiments, after the user turns on the display device, the display device may display a first interface that includes a plurality of asset controls, each corresponding to one media asset. If the display device detects a triggering operation on one of the plurality of asset controls, indicating that the user has selected that asset, the display device can obtain the asset identifier of the asset and then send an asset information acquisition request to the server. The asset information acquisition request includes the asset identifier of the asset and is used to request the server to feed back the asset information of the asset corresponding to the asset identifier. After receiving the asset information acquisition request, the server can obtain, based on the asset identifier, the asset information corresponding to the asset identifier, where the asset information includes a poster, a play address, and the like of the asset. After receiving the asset information, the display device can switch from the first interface to a second interface, on which it can display the poster, a brief introduction, and the like of the asset, as well as an option control and a play control.
As an example, if the user wants to play the media asset, a triggering operation on the play control may be executed through the control device, and a play instruction is generated based on the triggering operation. If the display device detects the play instruction, it may send a second request including the media asset identifier to the server. After the server receives the second request, since the server stores a plurality of media asset identifiers and the first play quality parameter supported by each of them, the server may directly obtain, according to the media asset identifier, the first play quality parameter supported by the corresponding media asset. The server then feeds back the determined first play quality parameter to the display device, and the display device receives the first play quality parameter supported by the media asset sent by the server.
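For illustration only, the following is a minimal Python sketch of this exchange, assuming an in-memory table on the server side; the media asset identifier "X", the table name, and the function name are hypothetical and do not correspond to any actual interface of the display device or server.

    # Hypothetical server-side table: media asset identifier -> first play quality
    # parameters (here, resolution labels) supported by that media asset.
    SUPPORTED_QUALITY = {
        "X": ["4K", "1080P", "720P"],
    }

    def handle_second_request(asset_id):
        """Feed back the first play quality parameters supported by the media asset."""
        return SUPPORTED_QUALITY.get(asset_id, [])

    # Display-device side: a play instruction for media asset "X" triggers the request.
    first_play_quality_params = handle_second_request("X")
    print(first_play_quality_params)  # ['4K', '1080P', '720P']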
In implementation, based on the first play quality parameter and the reference play quality parameter supported by the media asset, acquiring a code stream of the media asset and performing decoding play may include the following implementation manners:
in some embodiments, if the first playback quality parameter includes a reference playback quality parameter, a third request is sent to the server, where the third request includes a media asset identifier of the media asset and the reference playback quality parameter, and the third request is used to request the server to feed back, based on the media asset identifier and the reference playback quality parameter, a code stream of the media asset under the reference playback quality parameter. And receiving a code stream of the media assets fed back by the server under the reference playing quality parameter, and decoding and playing the code stream.
That is, if the media asset supports the reference playback quality parameter, a third request including the media asset identifier and the reference playback quality parameter may be transmitted to the server. After receiving the third request, the server may determine, based on the media asset identifier and the reference playback quality parameter, the code stream of the media asset under the reference playback quality parameter, and then feed back the determined code stream to the display device. The display device receives the code stream of the media asset under the reference playing quality parameter sent by the server, and decodes and plays the received code stream to realize the playing of the media asset.
In some embodiments, as can be seen from the above embodiments, each asset identifier in the server may store multiple sets of corresponding relationships correspondingly. For a media asset, the server can also store multiple code streams of the media asset, and each code stream corresponds to a group of corresponding relations of the media asset. For example, for the asset X, assuming that 3 sets of corresponding relationships, namely 1080P and dolby vision, 1080P and HDR, 720P and SDR, are stored in the server, 3 kinds of code streams corresponding to the asset X may also be stored, one set is the code stream of the asset X under 1080P and dolby vision, one set is the code stream of the asset X under 1080P and HDR, and one set is the code stream under 720P and SDR.
As an example, assume that the first playback quality parameter is a resolution parameter. Assuming that resolution parameters supported by the media asset X are 4K, 1080P and 720P, and a reference resolution parameter is 1080P, it may be determined that the first playback quality parameter includes the reference playback quality parameter, and a third request including a media asset identifier X and the reference resolution parameter 1080P may be sent to the server, and after receiving the third request, the server may determine, according to the media asset identifier X, a plurality of code streams corresponding to the media asset, and then determine, according to 1080P, a code stream of the media asset X under 1080P. If 1080P and Dolby vision, and 1080P and HDR code streams exist under 1080P, one code stream can be randomly acquired and fed back to the display device, and a default code stream can also be fed back to the display device. And the display equipment receives the code stream of the media asset X under the reference resolution parameter 1080P, which is sent by the server, and decodes and plays the received code stream to realize the playing of the media asset.
In this implementation manner, if the media asset supports the reference playing quality parameter, the code stream corresponding to the reference playing quality parameter supported by the media asset can be directly acquired and decoded and played. Therefore, the quality of the played media assets is more in line with the watching habits of general users, so that the users can enjoy good visual experience.
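A minimal Python sketch of the server-side lookup described in this implementation manner is given below, assuming the code streams are indexed in memory by (media asset identifier, resolution); the identifiers, URLs, and the default-selection rule are hypothetical.

    # Hypothetical stream table: (asset id, resolution) -> code-stream descriptors,
    # one per picture quality parameter available at that resolution.
    STREAMS = {
        ("X", "1080P"): [
            {"picture_quality": "dolby vision", "url": "x_1080p_dv.m3u8"},
            {"picture_quality": "HDR", "url": "x_1080p_hdr.m3u8"},
        ],
        ("X", "720P"): [
            {"picture_quality": "SDR", "url": "x_720p_sdr.m3u8"},
        ],
    }

    def handle_third_request(asset_id, reference_quality):
        """Feed back one code stream of the media asset under the reference play
        quality parameter; when several picture qualities exist at that
        resolution, the first (default) entry is returned here for simplicity."""
        candidates = STREAMS.get((asset_id, reference_quality), [])
        return candidates[0] if candidates else None

    print(handle_third_request("X", "1080P"))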
In other embodiments, if the first playing quality parameter does not include the reference playing quality parameter, then in response to the maximum second playing quality parameter supported by the display device being greater than or equal to the maximum first playing quality parameter among the first playing quality parameters, a fourth request is sent to the server, where the second playing quality parameter is used to indicate the playing quality of the media asset when played on the display device, the fourth request includes the media asset identifier of the media asset and the maximum first playing quality parameter, and the fourth request is used to request the server to feed back, based on the media asset identifier and the maximum first playing quality parameter, the code stream of the media asset under the maximum first playing quality parameter. The display device then receives the code stream of the media asset under the maximum first playing quality parameter fed back by the server, and decodes and plays the code stream.
The maximum second playing quality parameter is, among the second playing quality parameters supported by the display device, the one that yields the best playing quality when the media asset is played on the display device. For example, if the second playing quality parameter is a resolution parameter, the maximum second playing quality parameter is, among the resolution parameters supported by the display device, the one that makes the media asset the clearest when played on the display device.
The maximum first playing quality parameter is, among the first playing quality parameters supported by the media asset, the one that yields the best playing quality of the media asset. For example, if the first playing quality parameter is a resolution parameter, the maximum first playing quality parameter is, among the resolution parameters supported by the media asset, the one that makes the media asset the clearest.
That is to say, in the case that the media asset does not support the reference playback quality parameter, since the display device can support any playback quality parameter smaller than the maximum second playback quality parameter it supports, if the maximum second playback quality parameter supported by the display device is greater than or equal to the maximum first playback quality parameter among the first playback quality parameters, it can be considered that the display device supports all the first playback quality parameters supported by the media asset, that is, any code stream of the media asset can be played on the display device. In this way, a fourth request including the media asset identifier of the media asset and the maximum first playing quality parameter may be sent to the server. After receiving the fourth request, the server may determine, based on the media asset identifier and the maximum first playing quality parameter, the code stream of the media asset under the maximum first playing quality parameter, and then feed back the determined code stream to the display device. The display device receives the code stream of the media asset under the maximum first playing quality parameter sent by the server, and decodes and plays the received code stream to realize the playing of the media asset.
As an example, assume that the first playback quality parameter is a resolution parameter. Assuming that the resolution parameters supported by the media asset X are 4K, 720P, and 360P, the reference resolution parameter is 1080P, and the maximum resolution parameter supported by the display device is 4K, it may be determined that the first playback quality parameter does not include the reference playback quality parameter, and that the maximum resolution parameter supported by the display device, 4K, is equal to the maximum resolution parameter supported by the media asset X; it may therefore be considered that the display device can support all the resolution parameters supported by the media asset X. Therefore, a fourth request including the media asset identifier X and the maximum resolution parameter 4K may be sent to the server, and after receiving the fourth request, the server may determine, according to the media asset identifier X, the plurality of code streams corresponding to the media asset, and then determine, according to 4K, the code stream of the media asset X at 4K. If there are three code streams at 4K, namely 4K and dolby vision, 4K and HDR, and 4K and SDR, one code stream may be randomly acquired and fed back to the display device. The display device receives the code stream of the media asset X under the maximum resolution parameter 4K sent by the server, and decodes and plays the received code stream to realize the playing of the media asset.
In this implementation manner, if the media asset does not support the reference playback quality parameter, the code stream corresponding to the maximum first playback quality parameter supported by the media asset can be acquired for decoding playback on the premise of ensuring the support of the display device. Therefore, the quality of the played media assets is higher than the reference playing quality parameter, and better experience is brought to the user.
In still other embodiments, if the first playing quality parameter does not include the reference playing quality parameter, then in response to the maximum second playing quality parameter supported by the display device being smaller than the maximum first playing quality parameter among the first playing quality parameters, and if the first playing quality parameter includes the maximum second playing quality parameter, a fifth request is sent to the server, where the second playing quality parameter is used to indicate the playing quality of the media asset when played on the display device, the fifth request includes the media asset identifier of the media asset and the maximum second playing quality parameter, and the fifth request is used to request the server to feed back, based on the media asset identifier and the maximum second playing quality parameter, the code stream of the media asset under the maximum second playing quality parameter. The display device then receives the code stream of the media asset under the maximum second playing quality parameter fed back by the server, and decodes and plays the code stream.
That is to say, in the case that the media asset does not support the reference playback quality parameter, since the display device can support any playback quality parameter smaller than the maximum second playback quality parameter it supports, if the maximum second playback quality parameter supported by the display device is smaller than the maximum first playback quality parameter among the first playback quality parameters, it may be considered that the display device does not support the maximum first playback quality parameter supported by the media asset, that is, the code stream corresponding to the maximum first playback quality parameter supported by the media asset cannot be played on the display device. In this case, if the media asset also supports the maximum second playback quality parameter supported by the display device, a fifth request including the media asset identifier of the media asset and the maximum second playback quality parameter may be sent to the server. After receiving the fifth request, the server may determine, based on the media asset identifier and the maximum second playback quality parameter, the code stream of the media asset under the maximum second playback quality parameter, and then feed back the determined code stream to the display device. The display device receives the code stream of the media asset under the maximum second playing quality parameter sent by the server, and decodes and plays the received code stream to realize the playing of the media asset.
As an example, assume that the first playback quality parameter is a resolution parameter. Assuming that the resolution parameters supported by the media asset X are 4K, 720P, and 360P, the reference resolution parameter is 1080P, and the maximum resolution parameter supported by the display device is 720P, it may be determined that the first playback quality parameter does not include the reference playback quality parameter, and that the maximum resolution parameter supported by the display device, 720P, is smaller than the maximum resolution parameter 4K supported by the media asset X; it may therefore be considered that the display device cannot play the code stream under the maximum resolution parameter 4K supported by the media asset X. In this case, since the media asset also supports the maximum resolution parameter 720P supported by the display device, a fifth request including the media asset identifier X and the maximum second playback quality parameter 720P may be sent to the server. After receiving the fifth request, the server may determine the plurality of code streams corresponding to the media asset according to the media asset identifier X, and then determine the code stream of the media asset X under 720P according to 720P. If there are two code streams under 720P, namely 720P and HDR, and 720P and SDR, one code stream may be randomly obtained and fed back to the display device. The display device receives the code stream of the media asset X under the maximum resolution parameter 720P supported by the display device sent by the server, and decodes and plays the received code stream to realize the playing of the media asset.
In this implementation manner, if the media asset does not support the reference playback quality parameter and the display device cannot play the code stream of the maximum first playback quality parameter supported by the media asset, then, provided that the first playback quality parameter supported by the media asset includes the maximum second playback quality parameter supported by the display device, the code stream corresponding to the maximum second playback quality parameter supported by the display device may be obtained for decoding and playing. Therefore, although the playing quality is not the highest the media asset offers, it is the highest playing quality the display device can support, which avoids presenting a code stream of poor playing quality on the display device and ensures the playing quality of the played video.
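The three implementation manners above can be summarized as a single client-side decision. The following Python sketch illustrates that decision under the simplifying assumption that play quality parameters are resolution labels with a fixed ordering; the rank table and function name are hypothetical.

    # Hypothetical ordering of resolution labels, used to compare play quality
    # parameters (a larger rank means better play quality).
    RANK = {"360P": 1, "720P": 2, "1080P": 3, "4K": 4}

    def choose_request_quality(first_params, reference, max_second):
        """Pick the play quality parameter the display device should request:
        the reference parameter if supported (third request), otherwise the
        asset's maximum if the device can play it (fourth request), otherwise
        the device's maximum if the asset provides it (fifth request)."""
        max_first = max(first_params, key=RANK.__getitem__)
        if reference in first_params:
            return reference
        if RANK[max_second] >= RANK[max_first]:
            return max_first
        if max_second in first_params:
            return max_second
        return None

    print(choose_request_quality(["4K", "1080P", "720P"], "1080P", "4K"))   # 1080P
    print(choose_request_quality(["4K", "720P", "360P"], "1080P", "4K"))    # 4K
    print(choose_request_quality(["4K", "720P", "360P"], "1080P", "720P"))  # 720P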
Step 702: and responding to the control display instruction, and acquiring a first play effect parameter supported by the currently played media asset, wherein the first play effect parameter is used for indicating the play effect of the media asset.
The playing effect parameter can be a picture quality parameter or a sound effect parameter.
As an example, if the currently played media asset is a video, the playing effect parameter may be a picture quality parameter and/or a sound effect parameter. If the currently played media asset is audio, the playing effect parameter may be a sound effect parameter.
As an example, if the currently played media asset is a video and the playing quality parameter is a resolution parameter, the playing effect parameter is an image quality parameter and/or a sound effect parameter; if the currently played media asset is audio and the playing quality parameter is a sound quality parameter, the playing effect parameter is a sound effect parameter.
In implementation, if a control display instruction is received, it may be considered that a user wants to switch a first play effect parameter of a currently played media asset, but if the currently played media asset does not support a certain play effect parameter, the switching cannot be implemented, and therefore, the first play effect parameter supported by the currently played media asset needs to be obtained first.
In implementation, the specific implementation of obtaining the first play effect parameter supported by the currently played media asset may include: and sending a first request to a server, wherein the first request comprises a media asset identifier of media assets and a current first playing quality parameter, the first playing quality parameter is used for indicating the playing quality of the media assets, and the first request is used for indicating the server to feed back a first playing effect parameter supported by the media assets based on the media asset identifier and the first playing quality parameter. And receiving a first play effect parameter supported by the media assets fed back by the server under the first play quality parameter.
That is, after receiving the control display instruction, the display device may send a first request to the server, where the first request may include a media asset identifier of a currently playing media asset and a current first playing quality parameter. After receiving the first request, the server may obtain a first play effect parameter supported by the currently played media asset under the current first play quality parameter based on the media asset identifier and the current first play quality parameter, and feed back the first play effect parameter to the display device.
In some embodiments, the server may store a plurality of asset identifiers of assets, and may also store a correspondence between a first play quality parameter and a first play effect parameter of each asset, that is, each asset identifier may store a plurality of sets of correspondences, where each set of correspondences includes a first play quality parameter and a first play effect parameter.
As an example, assuming that a media asset X can support a first playing quality parameter a and a first playing quality parameter B, can support a first playing effect parameter a and a first playing effect parameter B under the first playing quality parameter a, and can only support the first playing effect parameter a under the first playing quality parameter B, for the media asset X, 3 sets of corresponding relationships may be stored in the server, where the first set is the first playing quality parameter a and the first playing effect parameter a, the second set is the first playing quality parameter a and the first playing effect parameter B, and the third set is the first playing quality parameter B and the first playing effect parameter a.
Illustratively, if the currently played media asset is a video, the first playing quality parameter is a resolution parameter, and the first playing effect parameter is an image quality parameter. It is assumed that the resolution parameters supported by the media asset X are 1080P and 720P, that at 1080P the image quality parameters supported by the media asset X are dolby vision and HDR (High Dynamic Range), and that at 720P the image quality parameter supported by the media asset X is SDR (Standard Dynamic Range). Then for the media asset X, there may be 3 sets of correspondences stored in the server, the first set being 1080P and dolby vision, the second set being 1080P and HDR, and the third set being 720P and SDR.
In some embodiments, after receiving the first request, the server may directly obtain a plurality of sets of corresponding relationships corresponding to the media asset according to the media asset identifier, then determine, according to the current first playing quality parameter, a first playing effect parameter corresponding to the current first playing quality parameter from the corresponding relationships, send the determined first playing effect parameter to the display device as a first playing effect parameter supported by the media asset under the current first playing quality parameter, and the display device may obtain the first playing effect parameter supported by the currently played media asset under the current first playing quality parameter.
As an example, assuming that a media asset identifier received by the server is X, and the current first playing quality parameter is a first playing quality parameter a, according to the media asset identifier, the server may determine a corresponding relationship between 3 sets of the first playing quality parameter a and a first playing effect parameter a, the first playing quality parameter a and a first playing effect parameter B, and the first playing quality parameter B and the first playing effect parameter a. Then, according to the first playing quality parameter a, the first playing effect parameter corresponding to the first playing quality parameter a can be determined to be the first playing effect parameter a and the first playing effect parameter b from the 3-group corresponding relationship. And then the server determines the first playing effect parameter a and the first playing effect parameter b as a first playing effect parameter supported by the currently played media asset under the current first playing quality parameter, and feeds the first playing effect parameter back to the display device, and the display device can acquire the first playing effect parameter supported by the currently played media asset under the current first playing quality parameter.
Illustratively, if the currently played media asset is a video, the first playing quality parameter is a resolution parameter, and the first playing effect parameter is an image quality parameter. Assuming that the media asset identifier received by the server is X and the current resolution parameter is 1080P, then according to the media asset identifier the server may determine the 3 sets of corresponding relationships, namely 1080P and dolby vision, 1080P and HDR, and 720P and SDR. Then, according to 1080P, it may determine that the image quality parameters corresponding to the current resolution parameter are dolby vision and HDR. The server may determine dolby vision and HDR as the first playing effect parameters of the media asset X at 1080P and feed them back to the display device, and the display device may thus acquire the image quality information supported by the currently played video at the current resolution.
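A minimal Python sketch of the server-side lookup in this example is given below, assuming the corresponding relationships are stored as (play quality parameter, play effect parameter) pairs per media asset identifier; the table and function names are hypothetical.

    # Hypothetical correspondence table: asset id -> list of
    # (first play quality parameter, first play effect parameter) pairs.
    CORRESPONDENCES = {
        "X": [
            ("1080P", "dolby vision"),
            ("1080P", "HDR"),
            ("720P", "SDR"),
        ],
    }

    def handle_first_request(asset_id, current_quality):
        """Feed back the first play effect parameters supported by the media
        asset under the current first play quality parameter."""
        return [effect for quality, effect in CORRESPONDENCES.get(asset_id, [])
                if quality == current_quality]

    print(handle_first_request("X", "1080P"))  # ['dolby vision', 'HDR']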
In other embodiments, the server may store a plurality of asset identifiers of assets, and may further store a corresponding relationship between a first playing quality parameter of each asset and a first playing effect parameter, that is, each asset identifier may store a plurality of groups of corresponding relationships, each group of corresponding relationships includes one first playing quality parameter and two first playing effect parameters, and the two first playing effect parameters are a parameter related to an image of a video and a parameter related to an audio of the video, respectively, that is, the two first playing effect parameters are an image quality parameter and an audio effect parameter, respectively.
Illustratively, if the currently played media asset is a video, the first playing quality parameter is a resolution parameter, and the first playing effect parameters are a sound effect parameter and an image quality parameter. It is assumed that the resolution parameters supported by the media asset X are 1080P and 720P; that at 1080P the image quality parameters supported by the media asset X are dolby vision and HDR (High Dynamic Range) and the supported sound effect parameters are dolby sound and panoramic sound; and that at 720P the image quality parameter supported by the media asset X is SDR (Standard Dynamic Range) and the supported sound effect parameter is panoramic sound. Then for the media asset X, there may be 5 sets of correspondences stored in the server, the first set being 1080P, dolby vision and dolby sound, the second set being 1080P, dolby vision and panoramic sound, the third set being 1080P, HDR and dolby sound, the fourth set being 1080P, HDR and panoramic sound, and the fifth set being 720P, SDR and panoramic sound.
As an example, after receiving the first request, the server may directly obtain a plurality of sets of corresponding relationships corresponding to the media asset according to the media asset identifier, then determine, according to the current first playing quality parameter, a first playing effect parameter corresponding to the current first playing quality parameter from the corresponding relationships, send the determined first playing effect parameter to the display device as a first playing effect parameter supported by the media asset under the current first playing quality parameter, and the display device may obtain the first playing effect parameter supported by the currently played media asset under the current first playing quality parameter.
Illustratively, if the currently played media asset is a video, the first playing quality parameter is a resolution parameter, and the first playing effect parameters are a sound effect parameter and an image quality parameter. Assuming that the media asset identifier received by the server is X and the current resolution parameter is 1080P, then based on the media asset identifier the server may determine the 4 sets of corresponding relationships under 1080P, namely 1080P, dolby vision and dolby sound; 1080P, dolby vision and panoramic sound; 1080P, HDR and dolby sound; and 1080P, HDR and panoramic sound. Then, according to 1080P, the image quality parameters corresponding to the current resolution parameter may be determined to be dolby vision and HDR, and the sound effect parameters corresponding to the current resolution may be determined to be dolby sound and panoramic sound. The server may determine dolby vision and HDR, together with dolby sound and panoramic sound, as the first playing effect parameters of the media asset X at 1080P and feed them back to the display device, and the display device may thus acquire the image quality information and sound effect information supported by the currently played video at the current resolution.
Step 703: and determining a target playing effect parameter based on the first playing effect parameter and a second playing effect parameter supported by the display equipment, wherein the second playing effect parameter is used for indicating the playing effect of playing the media assets in the display equipment, and the target playing effect parameter is contained in the first playing effect parameter and the second playing effect parameter at the same time.
The target playing effect parameter is a playing effect parameter determined from the intersection of the first playing effect parameter and the second playing effect parameter, and is therefore included in both of them. That is, the target playing effect parameter is a playing effect parameter supported by both the display device and the currently played media asset.
The second play effect parameter supported by the display device may be set by a technician before the display device leaves a factory, and may be stored in the server. For example, the display device may send a parameter acquisition request to the server, where the parameter acquisition request includes a device identifier of the display device. After receiving the parameter obtaining request, the server may determine, based on the device identifier of the display device, a second playing effect parameter supported by the display device, and feed back the determined second playing effect parameter to the display device.
In an implementation, determining a specific implementation of the target play effect parameter based on the first play effect parameter and the second play effect parameter supported by the display device may include: and determining the intersection of the first playing effect parameter and the second playing effect parameter supported by the display equipment to obtain the candidate playing effect parameter. And if the number of the candidate playing effect parameters is multiple, determining the candidate playing effect parameter with the minimum arrangement sequence number in the multiple candidate playing effect parameters as the target playing effect parameter. And if the number of the candidate playing effect parameters is one, determining the candidate playing effect parameters as the target playing effect parameters.
As an example, the candidate playing effect parameter is an intersection of the first playing effect parameter and the second playing effect parameter, that is, the candidate playing effect parameter is a playing effect parameter that is supported by the display device and the currently playing media asset at the same time.
Because the first playing effect parameter supported by the media asset under the first playing quality parameter is not necessarily supported by the display device, for the first playing effect parameter unsupported by the display device, the display device cannot play the code stream under the first playing effect parameter unsupported by the display device, so that the user cannot watch the media asset. Therefore, it is necessary to determine an intersection of a first playing effect parameter supported by the media asset under the first playing quality parameter and a second playing effect parameter supported by the display device, so as to obtain a candidate playing effect parameter. And selecting the playing effect parameter which enables the playing effect of the currently played media asset to be the best from the candidate playing effect parameters as a target playing effect parameter.
Illustratively, the play effect parameter is assumed to be a picture quality parameter. Assuming that the first play effect parameters supported by the media asset under the first play quality parameters include dolby vision, HDR, and SDR and the second play effect parameters supported by the display device include dolby vision and HDR, it may be determined that the intersection of the first play effect parameters and the second play effect parameters are dolby vision and HDR, i.e., the candidate play effect parameters are dolby vision and HDR.
In some embodiments, the display device stores a ranking number of the supported second play effect parameters, where the ranking number is inversely related to the play effect corresponding to the corresponding second play effect parameter. That is, the smaller the sequence number is, the better the playing effect corresponding to the corresponding second playing effect parameter is.
As an example, if the play effect parameter is a picture quality parameter, assuming that the second play effect parameter supported by the display device includes dolby vision, HDR, and SDR, and the arrangement number of dolby vision is 1, the arrangement number of HDR is 2, and the arrangement number of SDR is 3, it can be determined that the order of the play effects from good to bad is dolby vision, HDR, and SDR.
In some embodiments, if the number of the candidate play effect parameters is multiple, the target play effect parameter that makes the play effect of the media asset best can be determined from the multiple; if there is one candidate playing effect parameter, the candidate playing effect parameter can be directly determined as the target playing effect parameter.
Continuing with the above example, it is assumed that the playback effect parameter is an image quality parameter, the candidate playback effect parameters include dolby vision and HDR, and, among the arrangement numbers of the second playback effect parameters supported by the display device, the arrangement number of dolby vision is 1 and the arrangement number of HDR is 2. Thus, the candidate playback effect parameter with the smallest arrangement number can be determined to be dolby vision; it is the candidate image quality parameter that makes the presented image quality of the media asset the best and the clearest, and can be determined as the target playback effect parameter. Assuming that the candidate playback effect parameters include only dolby vision, dolby vision may be directly determined as the target playback effect parameter.
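A minimal Python sketch of this selection is given below, assuming each second play effect parameter has an arrangement number as described above; the ranking table and function name are hypothetical.

    def choose_target_effect(first_effects, second_effects, arrangement_number):
        """Intersect the play effect parameters supported by the media asset and
        by the display device, then return the candidate with the smallest
        arrangement number, i.e. the best play effect; None if no candidate."""
        candidates = [e for e in first_effects if e in second_effects]
        if not candidates:
            return None
        return min(candidates, key=lambda e: arrangement_number[e])

    # Device-side arrangement numbers: a smaller number means a better play effect.
    arrangement_number = {"dolby vision": 1, "HDR": 2, "SDR": 3}
    print(choose_target_effect(["dolby vision", "HDR", "SDR"],
                               ["dolby vision", "HDR"],
                               arrangement_number))  # dolby vision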
In some embodiments, the playing effect parameter and the playing quality parameter are two different playing parameters; both influence the audio-visual effect during playing of the media asset, and each code stream corresponds to one playing quality parameter and one playing effect parameter. The media asset corresponding to one media asset identifier may support multiple first playing quality parameters; for any one of the supported first playing quality parameters, the server may store multiple code streams satisfying that playing quality parameter, with each code stream corresponding to a different playing effect parameter. Alternatively, the media asset corresponding to one media asset identifier may support multiple first playing effect parameters; for any one of the supported first playing effect parameters, the server may store multiple code streams satisfying that playing effect parameter, with each code stream corresponding to a different playing quality parameter.
For example, assume that the playback effect parameter is a picture quality parameter and the playback quality parameter is a resolution parameter. It is assumed that the first play effect parameters supported by the media asset corresponding to the media asset identifier A include dolby vision, HDR, and SDR, and the supported first play quality parameters include 4K, 720P, and 360P. The code streams stored in the server corresponding to the media asset identifier A then include: a code stream with picture quality dolby vision and resolution 4K, one with dolby vision and 720P, one with dolby vision and 360P, one with HDR and 4K, one with HDR and 720P, one with HDR and 360P, one with SDR and 4K, one with SDR and 720P, and one with SDR and 360P.
In other embodiments, the same code stream corresponds to one play quality parameter and two play effect parameters. For example, assume that the playback effect parameters are a sound effect parameter and a picture quality parameter, and the playback quality parameter is a resolution parameter. Assume that the first image quality parameters supported by the media asset corresponding to the media asset identifier A include dolby vision and HDR, the supported first sound effect parameters include dolby sound and panoramic sound, and the supported first resolution parameters include 4K and 1080P. The code streams stored in the server corresponding to the media asset identifier A then include: a code stream with image quality dolby vision, sound effect dolby sound, and resolution 4K; one with dolby vision, dolby sound, and 1080P; one with dolby vision, panoramic sound, and 4K; one with dolby vision, panoramic sound, and 1080P; one with HDR, dolby sound, and 1080P; one with HDR, panoramic sound, and 4K; and one with HDR, panoramic sound, and 1080P.
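A minimal Python sketch of how such code streams might be indexed is given below, assuming one entry per stored combination of resolution, image quality, and sound effect for the media asset identifier A; all keys, URLs, and names are hypothetical.

    # Hypothetical index for media asset A: one code stream per stored combination
    # of (resolution, image quality parameter, sound effect parameter).
    STREAM_INDEX = {
        ("4K", "dolby vision", "dolby sound"): "a_4k_dv_ds.m3u8",
        ("1080P", "dolby vision", "dolby sound"): "a_1080p_dv_ds.m3u8",
        ("4K", "dolby vision", "panoramic sound"): "a_4k_dv_ps.m3u8",
        ("1080P", "dolby vision", "panoramic sound"): "a_1080p_dv_ps.m3u8",
        ("1080P", "HDR", "dolby sound"): "a_1080p_hdr_ds.m3u8",
        ("4K", "HDR", "panoramic sound"): "a_4k_hdr_ps.m3u8",
        ("1080P", "HDR", "panoramic sound"): "a_1080p_hdr_ps.m3u8",
    }

    def lookup_stream(resolution, image_quality, sound_effect):
        """Return the code stream matching one play quality parameter and two
        play effect parameters, or None if that combination is not stored."""
        return STREAM_INDEX.get((resolution, image_quality, sound_effect))

    print(lookup_stream("1080P", "HDR", "panoramic sound"))  # a_1080p_hdr_ps.m3u8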
In some embodiments, the playback effect parameter may be a picture quality parameter, and the playback quality parameter may be a resolution parameter. In a corresponding example, the code stream may include multiple image quality parameters at the same resolution, and after receiving a control display instruction of a user, a control name corresponding to the code stream with the best image quality effect at the current resolution may be displayed.
Step 704: and displaying the control corresponding to the target playing effect parameter.
In some embodiments, after the target play effect parameter is determined, a control corresponding to the target play effect parameter may be presented in a control list, so as to facilitate selection by a user.
For example, referring to fig. 8, assuming that the target play effect parameter is dolby vision, a control corresponding to dolby vision may be presented in the control list of the display device.
In some embodiments, as shown in fig. 8, it is assumed that the playback effect parameter is a picture quality parameter and the playback quality parameter is a resolution parameter. At the current resolution, for example, when the first picture quality parameters include dolby vision, HDR, and SDR, and the second picture quality parameters include dolby vision and HDR, the target picture quality parameter may be determined to be dolby vision, which makes the picture quality effect of the media asset the best, and the corresponding picture quality control "dolby vision" may be displayed accordingly.
In some embodiments, as shown in fig. 8, it is assumed that the playback effect parameter is a picture quality parameter and the playback quality parameter is a resolution parameter. At the current resolution, for example, when the first picture quality parameters include dolby vision and SDR, and the second picture quality parameters include dolby vision, HDR, and SDR, the target picture quality parameter may be determined to be dolby vision, which makes the picture quality effect of the media asset the best, and the corresponding picture quality control "dolby vision" may be displayed accordingly.
In some embodiments, as shown in fig. 9, it is assumed that the playing effect parameter is a sound effect parameter and the playing quality parameter is a resolution parameter. At the current resolution, for example, when the first sound effect parameters include the dolby sound effect, the A sound effect, and the B sound effect, and the second sound effect parameters include the dolby sound effect and the B sound effect, the target sound effect parameter may be determined to be the dolby sound effect, which makes the sound effect of the media asset the best, and the corresponding sound effect control "dolby sound effect" may be displayed accordingly.
In some embodiments, as shown in fig. 10, it is assumed that the playback effect parameter is a picture quality parameter and the playback quality parameter is a resolution parameter. At the current resolution, for example, when the first picture quality parameters include dolby vision, HDR, and SDR, and the second picture quality parameters include HDR and SDR, the target picture quality parameter is HDR, which makes the picture quality effect of the media asset the best, and the corresponding picture quality control "HDR" may be displayed accordingly.
In some embodiments, as shown in fig. 10, it is assumed that the playback effect parameter is a picture quality parameter and the playback quality parameter is a resolution parameter. At the current resolution, for example, when the first picture quality parameters include HDR and SDR, and the second picture quality parameters include dolby vision, HDR, and SDR, the target picture quality parameter is HDR, which makes the picture quality effect of the media asset the best, and the corresponding picture quality control "HDR" may be displayed accordingly.
In some embodiments, as shown in fig. 11, if there are two target playback effect parameters, which are the image quality parameter and the audio effect parameter, both the two target playback effect parameters can be displayed on the display screen. Assuming that the image quality parameter is HDR and the sound effect parameter is panoramic sound, the image quality control "HDR" and the sound effect control "panoramic sound" may be respectively displayed in a control list of the display device.
In some embodiments, if the playback effect parameter is a picture quality parameter, in the embodiments of the present application, a picture quality parameter with the best picture quality effect may be selected and displayed to the user; or, if the playing effect parameter is a sound effect parameter, in the embodiment of the present application, a sound effect parameter with the best sound effect may be selected and displayed to the user; alternatively, if the playback effect parameters are the image quality parameters and the sound effect parameters, in the embodiment of the present application, the image quality parameters with the best image quality and the sound effect parameters with the best sound effect may be selected and displayed to the user. Namely, by the method of the embodiment of the application, the playing effect parameter which enables the playing effect of the media assets to be the best can be determined, and the control is displayed to the user, so that the user can quickly switch to the code stream with the best playing effect to play, the operation of the user is reduced, and the use experience of the user is improved.
Further, if the control list further includes a switching control corresponding to the first play quality parameter, in response to detecting a trigger operation on the switching control, an option corresponding to a candidate play quality parameter is displayed, where the candidate play quality parameter is used to indicate a play quality parameter supported by the currently played media asset on the display device. Responding to the selection operation of the option corresponding to any one of the candidate playing quality parameters, and acquiring the code stream of the media asset under any one of the candidate playing quality parameters based on the media asset identification and any one of the candidate playing quality parameters. And decoding and playing the acquired code stream.
Wherein the triggering operation of the switching control can be executed by the user through the control device.
As an example, the candidate playback quality parameter is an intersection of the first playback quality parameter and the second playback quality parameter, that is, the candidate playback quality parameter is a playback quality parameter that is supported by the display device and the currently played media asset at the same time. Wherein the second playing quality parameter is used for indicating the playing quality of the playing media assets in the display device.
As an example, different current resolutions may be selected by switching the control, and image quality effect parameters supported by corresponding code streams may be different in different resolutions.
That is, if the control list further includes a switching control corresponding to the first playing quality parameter, if the triggering operation on the switching control is detected, it indicates that the user wants to switch the first playing quality parameter of the currently played media asset. The display device can display the options corresponding to the candidate playing quality parameters, and if the selection operation of the option corresponding to any candidate playing quality parameter is detected, the code stream of the media asset under the selected candidate playing quality parameter can be acquired.
As an example, referring to fig. 11, assuming that the playback quality parameter is a resolution parameter, a switching control corresponding to the resolution parameter may be displayed on the display device, and when the switching control is displayed on the display device, the switching control may be displayed as a "definition" control, where the control corresponds to settings of different resolutions. The resolution ratio during the initial media asset playing can be preset by a server, or can be determined according to the bandwidth of the display device and the current network, and a user can switch the resolution ratio of the media asset through the definition control, so that code streams at different resolution ratios can be played.
In some embodiments, after detecting the trigger operation of the switching control, the display device may determine, according to a first play quality parameter supported by the media asset and a maximum second play quality parameter supported by the display device, a play quality parameter supported by both the media asset and the display device, that is, a candidate play quality parameter. And then displaying options corresponding to each candidate playing quality parameter, if the selection operation of the option of any candidate quality playing parameter is detected, acquiring a code stream of the media asset under any candidate playing quality parameter based on the media asset identifier and the selected any candidate quality playing parameter, and decoding and playing the acquired code stream.
The maximum second playing quality parameter is the second playing quality parameter which enables the playing quality of the media assets to be the highest in the second playing quality parameters supported by the display equipment. For example, if the second playback Quality parameter is a sound Quality parameter, assuming that the sound Quality parameters supported by the display device include an HQ (High Quality) sound effect and an SQ (Super Quality) sound effect, the maximum sound Quality parameter can be determined to be SQ, where the maximum sound Quality parameter is the sound Quality parameter with the best audio-visual effect.
As an example, determining a specific implementation of the candidate playback quality parameter based on the first playback quality parameter supported by the media asset and the maximum second playback quality parameter supported by the display device may include: and if the first playing quality parameters supported by the media assets comprise first playing quality parameters which are greater than the maximum second playing quality parameter, determining the first playing quality parameters except the first playing quality parameters which are greater than the maximum second playing quality parameter in the first playing quality parameters supported by the media assets as candidate playing quality parameters. And if the first playing quality parameters supported by the media assets do not comprise the first playing quality parameters which are larger than the maximum second playing quality parameter, determining the first playing quality parameters supported by the media assets as candidate playing quality parameters.
That is, if the first playback quality parameters supported by the media asset include first playback quality parameters that are not supported by the display device, those unsupported first playback quality parameters may be removed; since the remaining playback quality parameters are smaller than or equal to the maximum second playback quality parameter supported by the display device, that is, they are supported by the display device, the remaining playback quality parameters may be determined as the candidate playback quality parameters. If the first playback quality parameters supported by the media asset are all smaller than or equal to the maximum second playback quality parameter supported by the display device, that is, all of them are supported by the display device, the first playback quality parameters supported by the media asset may be directly determined as the candidate playback quality parameters.
Illustratively, the media asset is taken as a video, and the first playing quality parameter is taken as a resolution parameter. It is assumed that the supported resolution parameters of the media assets are 4K, 1080P and 720P, and the supported maximum resolution parameter of the display device is 1080P. It may be determined that the resolution parameters supported by the media asset include 4K that is not supported by the display device, and the others are 1080P and 720P, and since the maximum resolution parameter supported by the display device is 1080P, the display device may also support 720P, and therefore, resolution parameters other than 4K, that is, 1080P and 720P, in the resolution parameters supported by the media asset may be determined as candidate playback quality parameters. It is assumed that the supported resolution parameters of the media assets are 4K, 1080P and 720P, and the maximum supported resolution parameter of the display device is 4K. It may be determined that the resolution parameters supported by the media asset do not include the resolution parameter that is not supported by the display device, and then the resolution parameters 4K, 1080P, and 720P supported by the media asset may be determined as candidate playback quality parameters.
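The following Python sketch illustrates this filtering, again under the simplifying assumption that play quality parameters are resolution labels with a fixed ordering; the rank table and function name are hypothetical.

    RANK = {"360P": 1, "720P": 2, "1080P": 3, "4K": 4}

    def candidate_quality_params(first_params, max_second):
        """Drop every first play quality parameter larger than the maximum second
        play quality parameter supported by the display device; the remaining
        parameters are the candidate play quality parameters shown as options."""
        return [p for p in first_params if RANK[p] <= RANK[max_second]]

    print(candidate_quality_params(["4K", "1080P", "720P"], "1080P"))  # ['1080P', '720P']
    print(candidate_quality_params(["4K", "1080P", "720P"], "4K"))     # ['4K', '1080P', '720P']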
As an example, after determining the candidate playback quality parameters, an option for each candidate playback quality parameter may be presented in the display device. If the user wants to switch to a candidate playing quality parameter, the user can select the candidate playing quality parameter through the control device or the gesture.
As an example, after detecting the selection operation of any of the candidate quality playing parameters, the display device may send a seventh request to the server, where the seventh request may include the asset identifier and any of the selected candidate quality playing parameters. After receiving the seventh request, the server may determine, based on the media asset identifier and any selected candidate quality playing parameter, a code stream of the media asset under any candidate playing quality parameter, and since there may be code streams corresponding to multiple playing effect parameters under any candidate playing quality parameter, a code stream corresponding to a target playing effect parameter under any candidate playing quality parameter may be selected, and the selected code stream is fed back to the display device, and the display device may decode and play the obtained code stream after receiving the code stream fed back by the server.
Further, after the control corresponding to the target playing effect parameter is displayed, the media asset can be played based on the target playing effect parameter in response to a triggering operation on the control corresponding to the target playing effect parameter.
That is to say, after the control corresponding to the target playing effect parameter is displayed, if the trigger operation on the control corresponding to the target playing effect parameter is detected, it can be considered that the user wants to play the code stream under the target playing effect parameter, and therefore, the media asset can be played based on the target playing effect parameter.
As an example, referring to fig. 8, assuming that the target playing effect parameter is dolby vision, as can be seen from fig. 8, after the triggering operation on the control corresponding to dolby vision is detected, the display device may further display an open option and a close option. If the user selects the open option through the control device, the display device may play the media asset based on the target playing effect parameter, and when the open option is in the selected state, it indicates that the picture quality has been successfully switched to dolby vision.
Further, after the switching is completed, the display device may further display a window for prompting the user that the switching is completed.
In implementation, in response to a triggering operation on a control corresponding to a target playing effect parameter, a sixth request may be sent to the server, where the sixth request includes a media asset identifier, a current first playing quality parameter of the media asset, and the target playing effect parameter, and the sixth request is used to instruct the server to feed back a code stream of the media asset indicated by the media asset identifier under the current first playing quality parameter and the target playing effect parameter. And receiving the code stream fed back by the server and decoding and playing the received code stream.
That is, the display device may transmit a sixth request to the server, where the sixth request may include the asset identifier, the current first playback quality parameter of the asset, and the target playback effect parameter. After receiving the sixth request, the server may determine, based on the media asset identifier, the current first playing quality parameter of the media asset, and the target playing effect parameter, a code stream corresponding to the correspondence between the current first playing quality parameter and the target playing effect parameter in the code stream of the media asset, and then feed back the determined code stream to the display device, and the display device decodes and plays the code stream after receiving the code stream fed back by the server, so as to achieve an effect of switching the playing effect parameter of the media asset to the target playing effect parameter.
As an example, the first playback quality parameter is taken as a resolution parameter, and the target playback effect parameter is taken as an image quality parameter and an audio effect parameter. Assuming that the sixth request includes a media asset identifier X, 1080P, HDR and a dolby sound effect, the server may determine a plurality of sets of corresponding relationships corresponding to the media asset based on the media asset identifier X, then determine a set of corresponding relationship between 1080P, HDR and the dolby sound effect in the plurality of sets of corresponding relationships, then obtain a code stream corresponding to 1080P, HDR and the dolby sound effect set, feed the obtained code stream back to the display device, and after receiving the code stream fed back by the server, the display device decodes and plays the received code stream, where the resolution of the played media asset is 1080P, the image quality is HDR, and the sound effect is the dolby sound effect, and a better playing effect can be achieved.
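A minimal Python sketch of the server-side handling of the sixth request in this example is given below, assuming the same kind of combination-keyed index as above; the identifiers and URLs are hypothetical.

    # Hypothetical per-asset index: (resolution, image quality, sound effect) -> stream.
    ASSET_STREAMS = {
        "X": {
            ("1080P", "HDR", "dolby sound"): "x_1080p_hdr_ds.m3u8",
            ("1080P", "dolby vision", "panoramic sound"): "x_1080p_dv_ps.m3u8",
        },
    }

    def handle_sixth_request(asset_id, current_quality, image_quality, sound_effect):
        """Feed back the code stream of the media asset under the current first
        play quality parameter and the target play effect parameters."""
        return ASSET_STREAMS.get(asset_id, {}).get(
            (current_quality, image_quality, sound_effect))

    # The display device then decodes and plays the returned code stream.
    print(handle_sixth_request("X", "1080P", "HDR", "dolby sound"))  # x_1080p_hdr_ds.m3u8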
It should be noted that, in the above embodiments, the play effect parameter under the current play quality parameter is determined according to the play quality parameter. If the playing effect parameters are the sound effect parameter and the image quality parameter, the target sound effect parameter and the target image quality parameter may each be determined according to the playing quality parameter, that is, they are determined independently of each other. In other embodiments, if the playing effect parameters are the sound effect parameter and the image quality parameter, then, with the playing quality parameter fixed, the target sound effect parameter under the current playing quality parameter may be determined first, and the target image quality parameter may then be determined according to the current playing quality parameter and the target sound effect parameter; or, with the playing quality parameter fixed, the target image quality parameter under the current playing quality parameter may be determined first, and the target sound effect parameter may then be determined according to the current playing quality parameter and the target image quality parameter. For the specific implementation process, reference may be made to the above embodiments, and details are not described herein again.
In the embodiments of the present application, if the display device receives the control display instruction, it may be considered that the user wants to switch the playing effect parameter of the currently played media asset. Therefore, the display device may obtain the first playing effect parameter supported by the currently played media asset, which indicates the playing effect of the media asset. Since a playing effect parameter supported by the media asset is not necessarily supported by the display device, the target playing effect parameter supported by both the media asset and the display device is determined according to the first playing effect parameter and the second playing effect parameter supported by the display device, and the control corresponding to the target playing effect parameter is then displayed. In this way, the user can switch the playing effect of the media asset simply by operating the displayed control corresponding to the target playing effect parameter, without any complicated operations, and can quickly experience a better media asset playing effect, which improves media asset playing efficiency and brings a better audio-visual experience to the user.
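The determination of the target playing effect parameter can be pictured as an intersection followed by a preference pick. The sketch below assumes, as in claim 4 below, that the display device stores its supported second playing effect parameters in arrangement order with the best playing effect first; all names and example values are illustrative.

```python
def determine_target_effect(first_params, second_params_ordered):
    """first_params: playing effect parameters supported by the media asset.
    second_params_ordered: parameters supported by the display device, stored
    in arrangement order (smaller sequence number = better playing effect)."""
    supported = set(first_params)
    candidates = [p for p in second_params_ordered if p in supported]
    if not candidates:
        return None        # no common parameter, so no control is displayed
    return candidates[0]   # candidate with the smallest arrangement sequence number


# Illustrative values only.
asset_supported = ["HDR10", "HLG"]
device_supported = ["Dolby Vision", "HDR10", "HLG"]  # best effect first
print(determine_target_effect(asset_supported, device_supported))  # -> HDR10
```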
For ease of understanding, the media asset playing method provided in the embodiments of the present application is described below by way of example with reference to FIG. 12.
In FIG. 12, it is assumed that the media asset is a video, the playing quality parameter is a resolution parameter, and the playing effect parameter is an image quality parameter.
The server configures, in advance, the image quality parameters supported by the display device according to the device type of the display device. The user selects a video to play through the control device, and the display device starts playing at the reference resolution parameter. During playing, the user performs a trigger operation on the control list display control through the control device. Upon detecting the control list display instruction, the display device may acquire from the server the image quality information supported by the currently played video under the current resolution parameter and store it, and may also send a parameter acquisition request containing the device identifier to the server and receive the image quality parameters supported by the display device fed back by the server. The display device then determines the target image quality parameter according to the image quality parameters supported by the display device and the image quality parameters supported by the video under the current resolution parameter, and displays the option corresponding to the target image quality parameter. When the user selects the enable option of the image quality parameter through the control device, the display device switches to the target image quality parameter corresponding to the current resolution parameter for playing and prompts the user that the switch is complete.
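A compact sketch of the two lookups in this FIG. 12 walkthrough is given below: one request for the image quality parameters the video supports at the current resolution, and one parameter acquisition request carrying the device identifier. As before, the server address, URL paths and field names are hypothetical placeholders, not interfaces defined by the embodiment.

```python
import requests  # hypothetical HTTP transport, as in the earlier sketch

SERVER = "https://media-server.example.com"  # placeholder address


def fetch_asset_image_qualities(asset_id: str, resolution: str) -> list[str]:
    """Image quality parameters the currently played video supports at this resolution."""
    resp = requests.get(f"{SERVER}/asset/{asset_id}/image-qualities",
                        params={"resolution": resolution}, timeout=5)
    resp.raise_for_status()
    return resp.json()["image_qualities"]


def fetch_device_image_qualities(device_id: str) -> list[str]:
    """Parameter acquisition request: the server returns the image quality
    parameters it has pre-configured for this device type."""
    resp = requests.get(f"{SERVER}/device/{device_id}/image-qualities", timeout=5)
    resp.raise_for_status()
    return resp.json()["image_qualities"]


def on_control_list_instruction(asset_id: str, device_id: str, resolution: str):
    """Decide which image quality option to show when the control list is opened."""
    asset_q = fetch_asset_image_qualities(asset_id, resolution)
    device_q = fetch_device_image_qualities(device_id)
    targets = [q for q in device_q if q in asset_q]  # intersection, device order kept
    return targets[0] if targets else None
```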
It should be noted that the division into functional modules described for the display device provided in the foregoing embodiments is merely illustrative of media asset playing; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the display device provided in the foregoing embodiments and the media asset playing method embodiments belong to the same concept; the specific implementation process thereof is detailed in the method embodiments and is not described herein again.
In some embodiments, a computer-readable storage medium is further provided, in which a computer program is stored, and when executed by a processor, the computer program implements the steps of the media asset playing method in the above embodiments. For example, the computer readable storage medium may be a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It is noted that the computer-readable storage medium referred to in the embodiments of the present application may be a non-volatile storage medium, in other words, a non-transitory storage medium.
It should be understood that all or part of the steps for implementing the above embodiments may be implemented by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The computer instructions may be stored in the computer-readable storage medium described above.
That is, in some embodiments, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the steps of the above-described media asset playing method.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A display device, characterized in that the display device comprises:
a display configured to display a user interface;
a communicator for communicating with a mobile terminal or a server;
a controller in communication with the display, the controller to:
receiving a control display instruction;
responding to the control display instruction, and acquiring a first play effect parameter supported by the currently played media asset, wherein the first play effect parameter is used for indicating the play effect of the media asset;
determining a target playing effect parameter based on the first playing effect parameter and a second playing effect parameter supported by the display device, wherein the second playing effect parameter is used for indicating the playing effect of playing media assets on the display device, and the target playing effect parameter is included in both the first playing effect parameter and the second playing effect parameter;
and displaying the control corresponding to the target playing effect parameter.
2. The display device of claim 1, wherein the controller obtains a first play effect parameter supported by a currently played media asset, and is specifically configured to:
sending a first request to the server, wherein the first request comprises a media asset identifier of the media asset and a current first playing quality parameter, the first playing quality parameter is used for indicating the playing quality of the media asset, and the first request is used for indicating the server to feed back a first playing effect parameter supported by the media asset based on the media asset identifier and the first playing quality parameter;
and receiving a first playing effect parameter supported by the media asset fed back by the server under the first playing quality parameter.
3. The display device of claim 2, wherein the controller, prior to receiving the control-reveal instruction, is further to:
responding to the detected playing instruction, sending a second request to the server, wherein the second request comprises a media asset identifier of the media asset, and the second request is used for requesting the server to feed back a first playing quality parameter supported by the media asset based on the media asset identifier;
receiving a first play quality parameter supported by the media asset sent by the server;
and acquiring a code stream of the media asset and decoding and playing the code stream based on the first playing quality parameter supported by the media asset and a reference playing quality parameter, wherein the reference playing quality parameter is the playing quality parameter used by the display device after startup when no setting has been made.
4. The display device according to claim 1, wherein the display device stores an arrangement sequence number for each supported second playing effect parameter, the arrangement sequence number being negatively correlated with the playing effect corresponding to that second playing effect parameter, and when determining the target playing effect parameter based on the first playing effect parameter and the second playing effect parameter supported by the display device, the controller is specifically configured to:
determining an intersection of the first playing effect parameter and a second playing effect parameter supported by the display device to obtain a candidate playing effect parameter;
if the number of the candidate playing effect parameters is multiple, determining the candidate playing effect parameter with the minimum arrangement sequence number in the multiple candidate playing effect parameters as the target playing effect parameter;
and if the number of the candidate playing effect parameters is one, determining the candidate playing effect parameters as the target playing effect parameters.
5. The display device of claim 2, wherein the controller is further to:
responding to a triggering operation of a control corresponding to the target playing effect parameter, and sending a sixth request to the server, where the sixth request includes the media asset identifier, the current first playing quality parameter of the media asset, and the target playing effect parameter, and the sixth request is used to instruct the server to feed back a code stream of the media asset indicated by the media asset identifier under the current first playing quality parameter and the target playing effect parameter;
and receiving the code stream fed back by the server and decoding and playing the received code stream.
6. The display device of claim 3, wherein the controller is configured to display a control corresponding to the target play effect parameter, and is specifically configured to:
displaying a control corresponding to the target playing effect parameter in a control list;
if the control list further comprises a switching control corresponding to a first playing quality parameter, responding to the detection of the triggering operation of the switching control, and displaying an option corresponding to a candidate playing quality parameter, wherein the candidate playing quality parameter is used for indicating the playing quality parameter supported by the media asset on the display equipment;
responding to the selection operation of an option corresponding to any candidate playing quality parameter in the candidate playing quality parameters, and acquiring a code stream of the media asset under any candidate playing quality parameter based on the media asset identification and any candidate playing quality parameter;
and decoding and playing the acquired code stream.
7. The display device according to any one of claims 1 to 6,
if the media asset is a video, the playing quality parameter is a resolution parameter, and the playing effect parameter is a picture quality parameter and/or a sound effect parameter;
if the media asset is audio, the playing quality parameter is a sound quality parameter, and the playing effect parameter is a sound effect parameter.
8. A display device, characterized in that the display device comprises:
a display configured to display a user interface;
a communicator for communicating with a mobile terminal or a server;
a controller in communication with the display, the controller to:
receiving a control display instruction;
responding to the control display instruction, and acquiring a first image quality parameter supported by the currently played media asset under the current resolution;
determining a target image quality parameter with the clearest image quality based on the intersection of the first image quality parameter and a second image quality parameter supported by a display device, wherein the target image quality parameter is contained in the first image quality parameter and the second image quality parameter;
and displaying the image quality control corresponding to the target image quality parameter.
9. The display device of claim 8,
receiving a selection operation of the image quality control;
sending a request to a server according to the current resolution, a target image quality parameter corresponding to the image quality control and a media asset identifier of the currently played media asset so that the server feeds back a code stream of the media asset indicated by the media asset identifier under the current resolution and the target image quality parameter;
and receiving and decoding the code stream fed back by the server.
10. A media asset playing method, applied to a display device, the method comprising the following steps:
receiving a control display instruction;
responding to the control display instruction, and acquiring a first play effect parameter supported by the currently played media asset, wherein the first play effect parameter is used for indicating the play effect of the media asset;
determining a target playing effect parameter based on the first playing effect parameter and a second playing effect parameter supported by the display device, wherein the second playing effect parameter is used for indicating the playing effect of playing media assets on the display device, and the target playing effect parameter is included in both the first playing effect parameter and the second playing effect parameter;
and displaying the control corresponding to the target playing effect parameter.
CN202010559495.7A 2020-06-18 2020-06-18 Media asset playing method and display equipment Pending CN113825032A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010559495.7A CN113825032A (en) 2020-06-18 2020-06-18 Media asset playing method and display equipment
PCT/CN2021/081356 WO2021253895A1 (en) 2020-06-18 2021-03-17 Media resource playing method and display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010559495.7A CN113825032A (en) 2020-06-18 2020-06-18 Media asset playing method and display equipment

Publications (1)

Publication Number Publication Date
CN113825032A true CN113825032A (en) 2021-12-21

Family

ID=78911798

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010559495.7A Pending CN113825032A (en) 2020-06-18 2020-06-18 Media asset playing method and display equipment

Country Status (2)

Country Link
CN (1) CN113825032A (en)
WO (1) WO2021253895A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023185954A1 (en) * 2022-04-01 2023-10-05 海信视像科技股份有限公司 Display device and processing method for display device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114554284B (en) * 2022-02-22 2023-08-11 网易(杭州)网络有限公司 Image quality information processing method, image quality information processing device, computer equipment and storage medium
CN114710686A (en) * 2022-03-11 2022-07-05 武汉斗鱼鱼乐网络科技有限公司 Exposure method, device, medium and equipment for live broadcast room

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1992890A (en) * 2005-10-11 2007-07-04 美国博通公司 Apparatus and method for providing media program
CN102893626A (en) * 2010-05-17 2013-01-23 Lg电子株式会社 Method of providing definition selection menu and broadcasting receiving apparatus
CN104038839A (en) * 2014-04-29 2014-09-10 四川长虹电器股份有限公司 TV (television) image quality adjustment method
JP2015154303A (en) * 2014-02-17 2015-08-24 シャープ株式会社 Television set and control method of television set
CN104934048A (en) * 2015-06-24 2015-09-23 小米科技有限责任公司 Sound effect regulation method and device
CN105872843A (en) * 2016-04-18 2016-08-17 青岛海信电器股份有限公司 Method and device for video playing
CN105898364A (en) * 2016-05-26 2016-08-24 北京小米移动软件有限公司 Video playing processing method, device, terminal and system
CN106024034A (en) * 2016-06-16 2016-10-12 广东欧珀移动通信有限公司 Method for adjusting sound effect and terminal
CN106303601A (en) * 2016-08-15 2017-01-04 腾讯科技(深圳)有限公司 The playing method and device of multimedia file
CN106791947A (en) * 2016-12-28 2017-05-31 北京金山安全软件有限公司 Method and device for transmitting network video and electronic equipment
CN107203363A (en) * 2017-06-06 2017-09-26 网易(杭州)网络有限公司 Method, device and electronic equipment that image quality for application program is adjusted
CN107734388A (en) * 2017-11-20 2018-02-23 青岛海信电器股份有限公司 A kind of player method and device of television startup displaying file
CN108810649A (en) * 2018-07-12 2018-11-13 深圳创维-Rgb电子有限公司 Picture quality regulation method, intelligent TV set and storage medium
CN109040802A (en) * 2018-09-03 2018-12-18 青岛海信传媒网络技术有限公司 A kind of method and device that media resource obtains
CN109151573A (en) * 2018-09-30 2019-01-04 Oppo广东移动通信有限公司 Video source modeling control method, device and electronic equipment
CN109587560A (en) * 2018-11-27 2019-04-05 Oppo广东移动通信有限公司 Method for processing video frequency, device, electronic equipment and storage medium
US20190174166A1 (en) * 2017-12-01 2019-06-06 At&T Intellectual Property I, L.P. Dynamic playlist customization by adaptive streaming client
CN111263188A (en) * 2020-02-17 2020-06-09 腾讯科技(深圳)有限公司 Video image quality adjusting method and device, electronic equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103200444A (en) * 2012-01-04 2013-07-10 蓝云科技股份有限公司 Audio-video information exchange system and operational method thereof
WO2014036683A1 (en) * 2012-09-04 2014-03-13 华为终端有限公司 Media playback method, control point and terminal
US10440082B1 (en) * 2016-06-21 2019-10-08 Amazon Technologies, Inc. Adjusting parameter settings for bitrate selection algorithms
CN109391786A (en) * 2017-08-02 2019-02-26 学习王科技股份有限公司 The mickey mouse storage device and method of energy adjust automatically output image quality
CN108111910B (en) * 2017-12-22 2020-01-21 烽火通信科技股份有限公司 Method and system for adjusting video playing definition

Also Published As

Publication number Publication date
WO2021253895A1 (en) 2021-12-23

Similar Documents

Publication Publication Date Title
CN111277884B (en) Video playing method and device
CN111405338B (en) Intelligent image quality switching method and display device
CN113259741B (en) Demonstration method and display device for classical viewpoint of episode
CN111752518A (en) Screen projection method of display equipment and display equipment
CN112214189B (en) Image display method and display device
CN112019782B (en) Control method and display device of enhanced audio return channel
CN111131898B (en) Method and device for playing media resource, display equipment and storage medium
CN111208969A (en) Selection control method of sound output equipment and display equipment
CN112333509B (en) Media asset recommendation method, recommended media asset playing method and display equipment
CN113825032A (en) Media asset playing method and display equipment
CN112118400B (en) Display method of image on display device and display device
CN112188279A (en) Channel switching method and display equipment
CN111176603A (en) Image display method for display equipment and display equipment
CN111954059A (en) Screen saver display method and display device
CN112199064A (en) Interaction method of browser application and system platform and display equipment
CN111757024A (en) Method for controlling intelligent image mode switching and display equipment
CN110602540B (en) Volume control method of display equipment and display equipment
CN111741314A (en) Video playing method and display equipment
CN111263223A (en) Media volume adjusting method and display device
CN112333520B (en) Program recommendation method, display device and server
CN113259733B (en) Display device
CN111988646B (en) User interface display method and display device of application program
CN112118476B (en) Method for rapidly displaying program reservation icon and display equipment
CN111479146B (en) Display apparatus and display method
CN112261463A (en) Display device and program recommendation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination