CN111669638A - Video rotation playing method and display equipment - Google Patents


Info

Publication number
CN111669638A
CN111669638A
Authority
CN
China
Prior art keywords
video
rotation
playing
picture
data
Prior art date
Legal status
Granted
Application number
CN202010530170.6A
Other languages
Chinese (zh)
Other versions
CN111669638B (en)
Inventor
高雯雯 (Gao Wenwen)
陆世明 (Lu Shiming)
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Publication of CN111669638A
Application granted
Publication of CN111669638B
Status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8455Structuring of content, e.g. decomposing content into time segments involving pointers to the content, e.g. pointers to the I-frames of the video stream

Abstract

The application discloses a video rotation playing method that supports both automatic rotation based on angle information and manual rotation, and can play video rotated at any angle, thereby improving the user experience. The method comprises the following steps: in response to a user-input command to play a video, parsing the video to obtain metadata information and acquiring a rotation angle from the metadata information; if the rotation angle is not zero, acquiring each frame of data of the video and storing it as a picture; performing rotation processing on each frame of data; and playing the rotated pictures at a preset time interval.

Description

Video rotation playing method and display equipment
This application claims priority to the application filed with the Chinese Patent Office on 28 February 2020, with application number 202010130047.5, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of display technologies, and in particular, to a video rotation playing method and a display device.
Background
More and more people are accustomed to shooting videos on mobile phones. Videos shot at an angle play normally on the phone, but play tilted on the television side, giving a poor viewing experience; some chips cannot support rotated playback at all.
As shown in fig. 2, video playback can be divided into three layers: the video layer, the middleware layer (player layer), and the OSD layer, stacked upward in that order with the video layer at the bottom. At present, most television-side solutions do not support a rotation function; some chip vendors' solutions can support video rotation, implemented at the OSD layer. A common OSD-layer scheme uses a dedicated View, sets a rotation angle on the View, and rotates the video playback through the View's angle transform. However, this rotates more than the video itself: elements that should not rotate, such as the whole screen or on-screen controls, rotate as well, giving a poor user experience.
Disclosure of Invention
Embodiments of the present application provide a video rotation playing method and a display device that support both automatic rotation of videos and manual rotation by angle information, and can play video rotated at any angle, thereby improving the user experience.
In a first aspect, there is provided a display device comprising:
a display;
a user interface;
a controller for performing:
in response to a user-input command to play a video, parsing the video to obtain metadata information, and acquiring a rotation angle from the metadata information;
if the rotation angle is not zero, acquiring each frame of data of the video and storing the data as a picture;
performing rotation processing on each frame of data;
and playing the rotated pictures according to a preset time interval.
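The four steps above hinge on the rotation angle read from the video metadata. A minimal Python sketch of that gate, in which the `metadata` dict and the `rotate` key are hypothetical stand-ins for the parsed container metadata (phone cameras commonly record such an angle field):

```python
def get_rotation_angle(metadata: dict) -> int:
    """Return the rotation angle recorded in the parsed metadata, or 0.

    `metadata` is a hypothetical dict standing in for the parsed
    container metadata; the "rotate" key is an assumption.
    """
    return int(metadata.get("rotate", 0)) % 360


def should_use_picture_pipeline(metadata: dict) -> bool:
    """Frames are extracted and redrawn only when the angle is non-zero."""
    return get_rotation_angle(metadata) != 0
```

When the angle is zero, the ordinary player path is used; otherwise every frame is saved as a picture and rotated as described in the following embodiments.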
In some embodiments, the controller performs the rotation processing on each frame data in the following manner:
acquiring the display size of the display equipment, and determining the reference width and height of the display equipment;
acquiring bitmap information;
performing matrix transformation processing according to the bitmap information, the rotation angle, the preset scaling ratio, the reference width and the height to obtain a transformation matrix;
and drawing a picture according to the bitmap information and the transformation matrix.
In some embodiments, the controller performs the matrix transformation according to the bitmap information, the rotation angle, the preset scaling, and the reference width and height to obtain a transformation matrix by:
initializing a matrix;
performing horizontal and vertical displacement transformation on the initialized matrix according to a preset scaling, the reference width and height and the width and height of bitmap information to obtain a displacement matrix;
and performing rotation transformation on the displacement matrix according to the rotation angle, the reference width and the height to obtain a transformation matrix.
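The three sub-steps (initialise, displace, rotate) can be sketched with plain 3x3 homogeneous matrices. This stands in for a platform matrix API (e.g. Android's `Matrix.postTranslate`/`postRotate`); the centring policy used for the displacement is an assumption for illustration only:

```python
import math


def identity():
    return [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]


def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]


def translate(m, dx, dy):
    # Prepend a translation: points transformed by m are then shifted.
    t = [[1.0, 0.0, dx], [0.0, 1.0, dy], [0.0, 0.0, 1.0]]
    return matmul(t, m)


def rotate_about(m, degrees, cx, cy):
    # Rotate about (cx, cy): shift the centre to the origin,
    # rotate, then shift back.
    rad = math.radians(degrees)
    c, s = math.cos(rad), math.sin(rad)
    r = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    return translate(matmul(r, translate(m, -cx, -cy)), cx, cy)


def build_transform(bmp_w, bmp_h, ref_w, ref_h, scale, angle):
    """Step 1: initialise the matrix; step 2: displace so the scaled
    bitmap is centred on the reference area; step 3: rotate about the
    reference centre."""
    m = identity()
    dx = (ref_w - bmp_w * scale) / 2.0
    dy = (ref_h - bmp_h * scale) / 2.0
    m = translate(m, dx, dy)
    return rotate_about(m, angle, ref_w / 2.0, ref_h / 2.0)
```

The resulting matrix would then be handed to the drawing step together with the bitmap information.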
In some embodiments, the controller performs the obtaining bitmap information by:
acquiring a URL address of a picture, and converting the picture into an input stream;
and analyzing the input stream to obtain bitmap information.
In some embodiments, the controller is specifically configured to play the rotated pictures at preset time intervals in the following manner:
and playing the rotated picture on the OSD layer according to a preset time interval.
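"Playing the rotated pictures according to a preset time interval" is effectively a slideshow schedule. A sketch, assuming the interval is derived from the source frame rate (the text leaves the interval unspecified):

```python
def slideshow_schedule(frame_count: int, fps: float):
    """Return (frame_index, presentation_time_in_seconds) pairs,
    spaced by the preset interval (here assumed to be 1/fps so the
    slideshow keeps the original video timing)."""
    interval = 1.0 / fps
    return [(i, i * interval) for i in range(frame_count)]
```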
In some embodiments, the controller further performs the rotating of each frame data by:
and transmitting the bitmap information, the transformation matrix and the drawn picture to a video layer.
In some embodiments, the controller is specifically configured to play the rotated pictures at preset time intervals in the following manner:
and playing the rotated pictures on the video layer according to a preset time interval.
In a second aspect, there is provided a display device comprising:
a display;
a user interface;
a controller for performing:
playing a video in response to a user-input instruction instructing the video to be played;
in response to a user-input instruction to rotate the playing video, acquiring each frame of data of the un-played part of the video and storing it as a picture, wherein the rotation instruction carries the rotation angle information;
performing rotation processing on each frame data according to the rotation angle;
and playing the rotated pictures according to a preset time interval.
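In this second aspect the angle comes from the user command rather than from metadata, and only the un-played frames are reprocessed. A sketch of both points; the 90-degree key step is an assumption for illustration (the text allows any angle):

```python
def apply_rotate_command(current_angle: int, step: int = 90) -> int:
    """Advance the angle on each press of a (hypothetical) rotate key,
    normalised to [0, 360)."""
    return (current_angle + step) % 360


def frames_to_reprocess(total_frames: int, current_frame: int) -> range:
    """Only the un-played part of the video is extracted and rotated."""
    return range(current_frame, total_frames)
```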
In some embodiments, the controller performs the rotation processing on each frame data in the following manner:
acquiring the display size of the display equipment, and determining the reference width and height of the display equipment;
acquiring bitmap information;
performing matrix transformation processing according to the bitmap information, the rotation angle, the preset scaling ratio, the reference width and the height to obtain a transformation matrix;
and drawing a picture according to the bitmap information and the transformation matrix.
In some embodiments, the controller performs the matrix transformation according to the bitmap information, the rotation angle, the preset scaling, and the reference width and height to obtain a transformation matrix by:
initializing a matrix;
performing horizontal and vertical displacement transformation on the initialized matrix according to a preset scaling, the reference width and height and the width and height of bitmap information to obtain a displacement matrix;
and performing rotation transformation on the displacement matrix according to the rotation angle, the reference width and the height to obtain a transformation matrix.
In some embodiments, the controller performs the obtaining bitmap information by:
acquiring a URL address of a picture, and converting the picture into an input stream;
and analyzing the input stream to obtain bitmap information.
In some embodiments, the controller is specifically configured to play the rotated pictures at preset time intervals in the following manner:
and playing the rotated picture on the OSD layer according to a preset time interval.
In some embodiments, the controller further performs the rotating of each frame data by:
and transmitting the bitmap information, the transformation matrix and the drawn picture to a video layer.
In some embodiments, the controller is specifically configured to play the rotated pictures at preset time intervals in the following manner:
and playing the rotated pictures on the video layer according to a preset time interval.
In a third aspect, a video rotation playing method is provided, including:
in response to a user-input command to play a video, parsing the video to obtain metadata information, and acquiring a rotation angle from the metadata information;
if the rotation angle is not zero, acquiring each frame of data of the video and storing the data as a picture;
performing rotation processing on each frame of data;
and playing the rotated pictures according to a preset time interval.
In a fourth aspect, a video rotation playing method is provided, including:
playing a video in response to a user-input instruction instructing the video to be played;
in response to a user-input instruction to rotate the playing video, acquiring each frame of data of the un-played part of the video and storing it as a picture, wherein the rotation instruction carries the rotation angle information;
performing rotation processing on each frame data according to the rotation angle;
and playing the rotated pictures according to a preset time interval.
In the above embodiments, the display device determines that the video contains angle information, uses the decoder to acquire the data of each frame of the video, stores the frames as pictures, processes the pictures in batch, and plays the processed pictures continuously like a slideshow, thereby achieving the effect of rotated video playback. The scheme supports automatic rotation of videos and manual rotation by angle information, can play video rotated at any angle, and improves the user experience.
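The summary above can be sketched end to end. Decoding and drawing are elided here, and the scheduling policy (interval = 1/fps) is an assumption:

```python
def rotate_play(frames, angle, fps):
    """Return (frame, presentation_time, applied_angle) triples for the
    slideshow path, or None when the angle is zero and the ordinary
    player path should be used instead."""
    applied = angle % 360
    if applied == 0:
        return None
    interval = 1.0 / fps
    return [(f, i * interval, applied) for i, f in enumerate(frames)]
```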
Drawings
Fig. 1A is a schematic diagram illustrating an operation scenario between the display device 200 and the control apparatus 100;
fig. 1B is a block diagram schematically illustrating a configuration of the control apparatus 100 in fig. 1A;
fig. 1C is a block diagram schematically illustrating a configuration of the display device 200 in fig. 1A;
a block diagram of the architectural configuration of the operating system in the memory of the display device 200 is illustrated in fig. 1D.
Fig. 2 is a block diagram illustrating an architectural configuration of video playback in a memory of a display device 200;
fig. 3 is a schematic diagram illustrating another GUI provided by the display apparatus 200;
FIGS. 4A-4C are diagrams illustrating a GUI provided by display device 200;
FIGS. 5A-5C are schematic diagrams illustrating yet another GUI provided by display device 200;
FIG. 6 is a flow chart illustrating a method for providing video rotation playback in a display device;
fig. 7 is a flowchart illustrating a method of providing a rotation process for each frame data in the display device;
FIG. 8 is a flow chart illustrating another method for performing a rotation process on each frame of data provided in the display device;
fig. 9 is a flow chart illustrating a method for providing still another video rotation play in a display device.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
All other embodiments obtained by a person of ordinary skill in the art from the exemplary embodiments shown in the present application without inventive effort shall fall within the scope of protection of the present application. Moreover, while the disclosure herein is presented through one or more exemplary examples, it should be understood that each aspect of the disclosed solution can also be practiced independently and separately from the other aspects.
The terms "comprises" and "comprising," and any variations thereof, as used herein, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module," as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "gesture" as used in this application refers to a user's behavior through a change in hand shape or an action such as hand motion to convey a desired idea, action, purpose, or result.
Fig. 1A is a schematic diagram illustrating an operation scenario between the display device 200 and the control apparatus 100. As shown in fig. 1A, the control apparatus 100 and the display device 200 may communicate with each other in a wired or wireless manner.
The control apparatus 100 is configured to control the display device 200: it receives operation instructions input by the user and converts them into instructions the display device 200 can recognize and respond to, acting as an intermediary between the user and the display device 200. For example, when the user operates the channel up/down key on the control apparatus 100, the display device 200 responds to the channel up/down operation.
The control device 100 may be a remote controller 100A, which controls the display apparatus 200 wirelessly or through other wired means using infrared protocol communication, Bluetooth protocol communication, or other short-range communication methods. The user may input user instructions through keys on the remote controller, voice input, control panel input, etc. to control the display apparatus 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, etc. on the remote controller to control the display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, and the like. For example, the display device 200 is controlled using an application program running on the smart device. The application program may provide various controls to a user through an intuitive User Interface (UI) on a screen associated with the smart device through configuration.
For example, the mobile terminal 100B may install a software application with the display device 200 to implement connection communication through a network communication protocol for the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 100B may be caused to establish a control instruction protocol with the display device 200 to implement the functions of the physical keys as arranged in the remote control 100A by operating various function keys or virtual buttons of the user interface provided on the mobile terminal 100B. The audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
The display apparatus 200 may be implemented as a television, and may provide an intelligent network television function of a broadcast receiving television function as well as a computer support function. Examples of the display device include a digital television, a web television, a smart television, an Internet Protocol Television (IPTV), and the like.
The display device 200 may be a liquid crystal display, an organic light emitting display, a projection display device. The specific display device type, size, resolution, etc. are not limited.
The display apparatus 200 also performs data communication with the server 300 through various communication means. Here, the display apparatus 200 may be allowed to be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 300 may provide various contents and interactions to the display apparatus 200. By way of example, the display device 200 may send and receive information such as: receiving Electronic Program Guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The servers 300 may be a group or groups of servers, and may be one or more types of servers. Other web service contents such as a video on demand and an advertisement service are provided through the server 300.
Fig. 1B is a block diagram illustrating the configuration of the control device 100. As shown in fig. 1B, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, an output interface 150, and a power supply 160.
The controller 110 includes a Random Access Memory (RAM)111, a Read Only Memory (ROM)112, a processor 113, a communication interface, and a communication bus. The controller 110 is used to control the operation of the control device 100, as well as the internal components of the communication cooperation, external and internal data processing functions.
Illustratively, when an interaction of a user pressing a key disposed on the remote controller 100A or an interaction of touching a touch panel disposed on the remote controller 100A is detected, the controller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.
And a memory 120 for storing various operation programs, data and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 120 may store various control signal commands input by a user.
The communicator 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. For example: the control apparatus 100 transmits a control signal (e.g., a touch signal or a button signal) to the display device 200 via the communicator 130, and the control apparatus 100 may receive signals transmitted by the display device 200 via the communicator 130. The communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132. For example, when the infrared signal interface is used, a user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and sent to the display device 200 through the infrared transmitting module. As another example, when the RF signal interface is used, a user input command needs to be converted into a digital signal, modulated according to the RF control signal modulation protocol, and then transmitted to the display device 200 through the RF transmitting terminal.
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like, so that a user can input a user instruction regarding controlling the display apparatus 200 to the control apparatus 100 through voice, touch, gesture, press, and the like.
The output interface 150 outputs a user instruction received by the user input interface 140 to the display apparatus 200, or outputs an image or voice signal received by the display apparatus 200. Here, the output interface 150 may include an LED interface 151, a vibration interface 152 generating vibration, a sound output interface 153 outputting sound, a display 154 outputting an image, and the like. For example, the remote controller 100A may receive an output signal such as audio, video, or data from the output interface 150, and display the output signal in the form of an image on the display 154, in the form of audio on the sound output interface 153, or in the form of vibration on the vibration interface 152.
The power supply 160 provides operating power support for each element of the control device 100 under the control of the controller 110, and may take the form of a battery and associated control circuitry.
A hardware configuration block diagram of the display device 200 is exemplarily illustrated in fig. 1C. As shown in fig. 1C, the display apparatus 200 may further include a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, an audio processor 280, an audio output interface 285, and a power supply 290.
The tuner demodulator 210 receives broadcast television signals in a wired or wireless manner, may perform modulation/demodulation processing such as amplification, mixing, and resonance, and is configured to demodulate, from among multiple received wireless or wired broadcast television signals, the audio/video signal carried on the frequency of the television channel selected by the user, together with additional information (e.g., EPG data).
The tuner demodulator 210 is responsive to the user selected frequency of the television channel and the television signal carried by the frequency, as selected by the user and controlled by the controller 250.
The tuner demodulator 210 can receive a television signal in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; and according to different modulation types, a digital modulation mode or an analog modulation mode can be adopted; and can demodulate the analog signal and the digital signal according to the different kinds of the received television signals.
In other exemplary embodiments, the tuning demodulator 210 may also be in an external device, such as an external set-top box. In this way, the set-top box outputs a television signal after modulation and demodulation, and inputs the television signal into the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the display apparatus 200 may transmit content data to an external apparatus connected via the communicator 220, or browse and download content data from an external apparatus connected via the communicator 220. The communicator 220 may include a network communication protocol module or a near field communication protocol module, such as a WIFI module 221, a bluetooth communication protocol module 222, and a wired ethernet communication protocol module 223, so that the communicator 220 may receive a control signal of the control device 100 according to the control of the controller 250 and implement the control signal as a WIFI signal, a bluetooth signal, a radio frequency signal, and the like.
The detector 230 is a component of the display apparatus 200 for collecting signals of an external environment or interaction with the outside. The detector 230 may include an image collector 231, such as a camera, a video camera, etc., which may be used to collect external environment scenes to adaptively change the display parameters of the display device 200; and the function of acquiring the attribute of the user or interacting gestures with the user so as to realize the interaction between the display equipment and the user. A light receiver 232 may also be included to collect ambient light intensity to adapt to changes in display parameters of the display device 200, etc.
In some other exemplary embodiments, the detector 230 may further include a temperature sensor, such as by sensing an ambient temperature, and the display device 200 may adaptively adjust a display color temperature of the image. For example, when the temperature is higher, the display apparatus 200 may be adjusted to display a color temperature of an image that is cooler; when the temperature is lower, the display device 200 may be adjusted to display a warmer color temperature of the image.
In some other exemplary embodiments, the detector 230, which may further include a sound collector, such as a microphone, may be configured to receive a sound of a user, such as a voice signal of a control instruction of the user to control the display device 200; alternatively, ambient sounds may be collected that identify the type of ambient scene, enabling the display device 200 to adapt to ambient noise.
The external device interface 240 is a component for providing the controller 250 to control data transmission between the display apparatus 200 and an external apparatus. The external device interface 240 may be connected to an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner, and may receive data such as a video signal (e.g., moving image), an audio signal (e.g., music), additional information (e.g., EPG), etc. of the external apparatus.
The external device interface 240 may include: a High Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not shown), a red, green, blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to the operation of the user by running various software control programs (such as an operating system and various application programs) stored on the memory 260.
As shown in fig. 1C, the controller 250 includes a Random Access Memory (RAM)251, a Read Only Memory (ROM)252, a graphics processor 253, a CPU processor 254, a communication interface 255, and a communication bus 256. The RAM251, the ROM252, the graphic processor 253, and the CPU processor 254 are connected to each other through a communication bus 256 through a communication interface 255.
The ROM252 stores various system boot instructions. When the display apparatus 200 starts power-on upon receiving the power-on signal, the CPU processor 254 executes a system boot instruction in the ROM252, copies the operating system stored in the memory 260 to the RAM251, and starts running the boot operating system. After the start of the operating system is completed, the CPU processor 254 copies the various application programs in the memory 260 to the RAM251 and then starts running and starting the various application programs.
A graphic processor 253 for generating screen images of various graphic objects such as icons, images, and operation menus. The graphic processor 253 may include an operator for performing an operation by receiving various interactive instructions input by a user, and further displaying various objects according to display attributes; and a renderer for generating various objects based on the operator and displaying the rendered result on the display 275.
A CPU processor 254 for executing operating system and application program instructions stored in memory 260. And according to the received user input instruction, processing of various application programs, data and contents is executed so as to finally display and play various audio-video contents.
In some example embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and a plurality of or one sub-processor. A main processor for performing some initialization operations of the display apparatus 200 in the display apparatus preload mode and/or operations of displaying a screen in the normal mode. A plurality of or one sub-processor for performing an operation in a state of a standby mode or the like of the display apparatus.
The communication interface 255 may include a first interface to an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
Where the object may be any one of the selectable objects, such as a hyperlink or an icon. The operation related to the selected object is, for example, an operation of displaying a link to a hyperlink page, document, image, or the like, or an operation of executing a program corresponding to an icon. The user input command for selecting the GUI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch panel, etc.) connected to the display apparatus 200 or a voice command corresponding to a user uttering voice.
A memory 260 for storing various types of data, software programs, or applications for driving and controlling the operation of the display device 200. The memory 260 may include volatile and/or nonvolatile memory. And the term "memory" includes the memory 260, the RAM251 and the ROM252 of the controller 250, or a memory card in the display device 200.
In some embodiments, the memory 260 is specifically used for storing an operating program for driving the controller 250 of the display device 200; storing various application programs built in the display apparatus 200 and downloaded by a user from an external apparatus; data such as visual effect images for configuring various GUIs provided by the display 275, various objects related to the GUIs, and selectors for selecting GUI objects are stored.
In some embodiments, the memory 260 is specifically configured to store drivers and related data for the tuner demodulator 210, the communicator 220, the detector 230, the external device interface 240, the video processor 270, the display 275, the audio processor 280, and the like, external data (e.g., audio-visual data) received from the external device interface, or user data (e.g., key information, voice information, touch information, and the like) received from the user interface.
In some embodiments, memory 260 specifically stores software and/or programs representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, as well as functions implemented by other programs (e.g., the middleware, APIs, or applications); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to enable control or management of system resources.
A block diagram of the architectural configuration of the operating system in the memory of the display device 200 is illustrated in fig. 1D. The operating system architecture comprises an application layer, a middleware layer and a kernel layer from top to bottom.
The application layer: both the application programs built into the system and non-system-level application programs belong to the application layer, which is responsible for direct interaction with users. The application layer may include a plurality of applications, such as a NETFLIX application, a setup application, a media center application, and the like. These applications may be implemented as Web applications that execute on a WebKit engine, and in particular may be developed and executed based on HTML, Cascading Style Sheets (CSS), and JavaScript. The application layer includes an OSD layer, where OSD is short for on-screen display, i.e., an on-screen, menu-style adjustment mode.
Here, HTML (HyperText Markup Language) is the standard markup language for creating web pages. It describes a web page by markup tags, where the HTML tags are used to describe text, graphics, animation, sound, tables, links, and the like; a browser reads an HTML document, interprets the content of the tags in the document, and displays the content in the form of a web page.
CSS (Cascading Style Sheets) is a computer language used to express the style of HTML documents, and may be used to define style structures such as fonts, colors, and positions. A CSS style can be stored directly in the HTML web page or in a separate style file, so that the styles in the web page can be controlled.
JavaScript is a language for Web page programming that can be inserted into an HTML page and interpreted and executed by a browser. The interaction logic of a Web application is implemented in JavaScript. Through the browser, JavaScript can wrap a JavaScript extension interface to communicate with the kernel layer.
the middleware layer may provide some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as multimedia and hypermedia information coding experts group (MHEG) middleware related to data broadcasting, DLNA middleware which is middleware related to communication with an external device, middleware which provides a browser environment in which each application program in the display device operates, and the like. The middleware layer includes a player layer.
The kernel layer provides core system services, such as: file management, memory management, process management, network management, system security authority management and the like. The kernel layer may be implemented as a kernel based on various operating systems, for example, a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware, and provides device driver services for various hardware, such as: provide display driver for the display, provide camera driver for the camera, provide button driver for the remote controller, provide wiFi driver for the WIFI module, provide audio driver for audio output interface, provide power management drive for Power Management (PM) module etc.. The core layer includes a video layer.
A user interface 265 receives various user interactions. Specifically, it is used to transmit an input signal of a user to the controller 250 or transmit an output signal from the controller 250 to the user. For example, the remote controller 100A may transmit an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by the user to the user interface 265, and then the input signal is transferred to the controller 250 through the user interface 265; alternatively, the remote controller 100A may receive an output signal such as audio, video, or data output from the user interface 265 via the controller 250, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user interface 265 receives the user input commands through the GUI. Specifically, the user interface 265 may receive user input commands for controlling the position of a selector in the GUI to select different objects or items.
Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user interface 265 receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 270 is configured to receive an external video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on the display 275.
Illustratively, the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is configured to demultiplex an input audio/video data stream; for example, for an input MPEG-2 stream (a compression standard for digital storage media moving images and audio), the demultiplexing module demultiplexes it into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
And the image synthesis module is used for carrying out superposition mixing processing on the GUI signal input by the user or generated by the user and the video image after the zooming processing by the graphic generator so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of an input video, for example, to convert the frame rate of an input 60Hz video into a frame rate of 120Hz or 240Hz, which is commonly implemented by, for example, a frame interpolation method.
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format of a display, such as converting the format of the signal output by the frame rate conversion module to output an RGB data signal.
And a display 275 for receiving the image signal output from the video processor 270 and displaying video, images, and menu manipulation interfaces. For example, the display may display video from a broadcast signal received by the tuner demodulator 210, may display video input from the communicator 220 or the external device interface 240, and may display an image stored in the memory 260. The display 275 also displays a user manipulation interface (UI) generated in the display apparatus 200 and used to control the display apparatus 200.
And, the display 275 may include a display screen assembly for presenting a picture and a driving assembly for driving the display of an image. Alternatively, a projection device and projection screen may be included, provided display 275 is a projection display.
The audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification processing to obtain an audio signal that can be played by the speaker 286.
Illustratively, audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, Advanced Audio Coding (AAC), high efficiency AAC (HE-AAC), and the like.
Audio output interface 285 receives audio signals from the output of audio processor 280. For example, the audio output interface may output audio in a broadcast signal received via the tuner demodulator 210, may output audio input via the communicator 220 or the external device interface 240, and may output audio stored in the memory 260. The audio output interface 285 may include a speaker 286, or an external audio output terminal 287, such as an earphone output terminal, that outputs to a generating device of an external device.
In other exemplary embodiments, video processor 270 may comprise one or more chips. Audio processor 280 may also comprise one or more chips.
And, in other exemplary embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated with the controller 250 in one or more chips.
And a power supply 290 for supplying power supply support to the display apparatus 200 from the power input from the external power source under the control of the controller 250. The power supply 290 may be a built-in power supply circuit installed inside the display apparatus 200 or may be a power supply installed outside the display apparatus 200.
A schematic view of a GUI provided by the display device 200 is exemplarily shown in fig. 3.
As shown in fig. 3, the display device may provide a GUI to the display that includes multiple presentation areas providing different image content, each presentation area including one or more different items arranged therein. For example, items 411-4112 are arranged in the first display area 41. And the GUI further includes a selector 42 indicating that any one of the items is selected, the position of the selector in the GUI or the positions of the respective items in the GUI being movable by input from a user operating the control means to change the selection of a different item. For example, selector 42 indicates that item 411 within first presentation area 41 is selected.
It should be noted that an item refers to a visual object displayed in a presentation area of the GUI in the display apparatus 200 to represent corresponding content, such as an icon, a thumbnail, a video clip, or a link. Items may provide the user with various conventional program contents received through data broadcasting, as well as various application and service contents set by a content provider.
In some embodiments, the items may represent movies, image content or video clips of a television series, audio content of music, applications, or other user access content history information. If the item is a movie or a tv show, the item may be displayed as a poster of the movie or tv show, a video clip dynamic of a trailer of the movie or tv show. If the item is music, a poster of a music album may be displayed. Such as an icon for the application when the item is an application, or a screenshot of the content that captures the application when it was most recently executed. If the item is the user access history, the content screenshot in the latest execution process can be displayed.
In some embodiments, the item may represent an interface or a set of interfaces to which the display device 200 is connected with an external device, or may represent a name of an external device connected to the display device, or the like. Such as: a signal source input interface set, or an HDMI interface, a USB interface, a PC terminal interface, etc.
The presentation forms of items are often diverse. For example, the items may include text content and/or images for displaying thumbnails related to the text content, or video clips related to the text. As another example, the item may be text and/or an icon of an application.
It is further noted that the selector, such as a focus object, is used to indicate that an item has been selected. In one aspect, the movement of the focus object displayed in the display apparatus 200 may be controlled according to an input of the user through the control device 100, to select or control an item. For example, the user may select and control items by controlling the movement of the focus object between items through the direction keys on the control device 100. In another aspect, the movement of the items displayed in the display apparatus 200 may be controlled according to an input of the user through the control device 100, to cause the focus object to select or control an item. For example, the user can control the items to move left and right together through the direction keys on the control device 100, so that the focus object selects and controls an item while the position of the focus object remains unchanged.
The form of identification of the selector is often diversified. For example, the position of the focus object may be identified by enlarging the item, by setting a background color for the item, or by changing the border line, size, color, transparency, outline, and/or font of the text or image of the focused item.
A schematic view of another GUI provided by the display device 200 is exemplarily illustrated in fig. 4A-4C.
When the user selects a certain source, the video starts playing, as shown in fig. 4A. When the user presses a direction key, such as an up key or a down key, on the control device, as shown in fig. 4B, the display apparatus brings up a GUI displaying a play control bar. The first display area 41 comprises items 411 to 415, wherein the items 411 to 415 are respectively left rotation, previous song, play, next song and right rotation; the current selector 42 indicates that play is selected.
In fig. 4B, when the user presses the direction key on the control device, as shown in fig. 4C, the current selector 42 indicates that left rotation is selected. When the user presses the enter key on the control device, the display device, in response to the input left-rotation instruction, rotates the currently played video picture to the left and plays it.
A schematic view of yet another GUI provided by the display device 200 is illustrated in fig. 5A-5C.
When the user selects a certain source, the video starts playing, as shown in fig. 4A. When the user presses a preset button on the control device, the GUI displaying the settings is called up, as shown in FIG. 5A. The first display area 41 is a menu bar, the menu bar comprises items 411 to 4112, and the items 411 to 4112 are earphone volume, audio track, subtitle, zoom mode, rotation, screen closing function, repeat mode, time selection play, information, playlist, video chapter and setting respectively; the current selector 42 indicates that the earphone volume is selected.
In FIG. 5A, when the user presses the direction key on the control device, as shown in FIG. 5B, the current selector 42 indicates that rotation is selected, and the rotation item provides its corresponding sub-menu, such as items 4151-4153, which are left rotation 90, right rotation 90, and custom, respectively. When the user presses the direction key on the control device, the current selector 42 indicates that right rotation 90 is selected; when the user then presses the enter key on the control device, the display device, in response to the input instruction to rotate right by 90 degrees, rotates the currently playing video picture 90 degrees to the right and plays it. If the user selects custom, the rotation direction and the rotation angle can be input as required.
In FIG. 5A, when the user presses the direction key on the control device, as shown in FIG. 5C, the current selector 42 indicates that the zoom mode is selected, and the zoom mode provides its corresponding sub-menu, such as items 4141-4143, which are the adaptation (fit) mode, the original size, and the full screen mode, respectively. When the user presses the direction key on the control device, the current selector 42 indicates that the adaptation mode is selected; when the user presses the enter key on the control device, the display device, in response to the input adaptation mode instruction, plays the currently played video picture in the adaptation mode.
Fig. 6 and 7 are flowcharts illustrating a method for video rotation playing, which is applied to automatic rotation.
The embodiment of the application does not depend on a solution (chipset) vendor and plays on the OSD layer. The principle is to realize video rotation playing in the application layer, with the played video showing a common playing effect. The difference from the existing method is that the method of the embodiment of the application decodes the video and then rotates the decoded pictures, thereby realizing normal playing. In addition, the application uses the media playing application (media center) of the display device.
Referring to fig. 6, the method includes the following steps:
Step S61: receiving, through the user interface, an instruction input by a user operating the control device and instructing to play a video;
For example, the user interface may receive the film source selected by the user on the display device by pressing a confirmation key on the control device.
Step S62: in response to the user-input instruction instructing to play a video, parsing the video to obtain metadata information, and acquiring the rotation angle in the metadata information;
In some embodiments, the film source selected by the user is parsed, the Metadata information therein is acquired, and the value of the rotate item of the Metadata information is acquired and recorded as the rotate degree.
In some embodiments, the Metadata information is:
rotate: 90
creation_time: 2020-04-20 12:00:01
encoder: H.264
stream#0:1(und): Audio: aac (LC) (mp4a/0x6134706D), 44100 Hz, mono, fltp, 63 kb/s
step S63: judging whether the rotation angle is zero or not;
It is judged whether the rotate value is 0: if the rotate value is 0, no rotation is needed; if the rotate value is not 0, playing needs to be performed at the rotation angle.
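The rotate-item lookup of steps S62 and S63 can be sketched as follows. This is a minimal, hypothetical helper (the class and method names are assumptions, not from the patent) that reads the rotate entry out of a metadata dump like the one listed above, treating a missing or non-numeric entry as 0 so that the normal player flow is taken:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical helper for steps S62/S63: extract the "rotate" value from a
// key:value metadata dump; 0 means no rotation is needed.
public class RotateMetadata {
    public static int parseRotateDegree(String metadataDump) {
        Map<String, String> entries = new LinkedHashMap<>();
        for (String line : metadataDump.split("\n")) {
            int colon = line.indexOf(':');   // split on the first colon only
            if (colon > 0) {
                entries.put(line.substring(0, colon).trim(),
                            line.substring(colon + 1).trim());
            }
        }
        try {
            return Integer.parseInt(entries.getOrDefault("rotate", "0"));
        } catch (NumberFormatException e) {
            return 0; // malformed rotate entry: fall back to no rotation
        }
    }
}
```

In a real player this value would come from the container demuxer (e.g. the rotation tag of an MP4 file) rather than from a text dump; the sketch only illustrates the decision in step S63.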
If the rotation angle is zero, step S64 is performed.
Step S64: and starting a normal player playing flow and playing. I.e. directly using the default player flow.
If the rotation angle is not zero, step S65 is performed.
Step S65: acquiring each frame of data of the video and storing the data as a picture;
In some embodiments, step S65 may invoke an OpenCV program to acquire each frame of data of the video and store the frames in the jpg format, in a data partition or on a removable device such as a USB disk, and clear the stored data when the device is disconnected or the playback is exited.
Step S66: performing rotation processing on each frame of data;
step S67: and playing the rotated pictures according to a preset time interval.
The time interval for playing the rotated pictures, i.e. the playing time of each slide, can be set in advance by the user as needed or default to 50 ms. After audio-picture synchronization is achieved, the rotated pictures are played at the preset time interval, so that the video is played rotated.
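The timed playback of step S67 can be sketched as a slideshow scheduler. The class and callback names below are assumptions for illustration; the 50 ms default comes from the text above, and the `show` callback stands in for handing each picture to the display (e.g. an ImageView on the OSD layer):

```java
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;

// Hypothetical sketch of step S67: show the rotated frame pictures one after
// another at a fixed interval, so the sequence of stills reads as the
// rotated video.
public class SlideshowPlayer {
    public static void play(List<String> framePaths, long intervalMs,
                            Consumer<String> show) {
        ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
        CountDownLatch done = new CountDownLatch(framePaths.size());
        for (int i = 0; i < framePaths.size(); i++) {
            final String path = framePaths.get(i);
            // Frame i is shown at i * interval; a single-threaded scheduler
            // keeps the frames in order.
            timer.schedule(() -> { show.accept(path); done.countDown(); },
                           (long) i * intervalMs, TimeUnit.MILLISECONDS);
        }
        try {
            done.await(); // block until the last frame has been shown
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        timer.shutdown();
    }
}
```

A production player would additionally pace the interval against the audio clock for the sound-picture synchronization mentioned above.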
Referring to fig. 7, the method for performing rotation processing on each frame of data includes the following steps:
step S6611: acquiring the display size of the display equipment, and determining the reference width and height of the display equipment;
In some embodiments, the reference width (DisplayWidth) and height (DisplayHeight) of the display are determined by determining whether the display device is a 4K display:
if the display is a 4K display, DisplayWidth = 3840 and DisplayHeight = 1920;
otherwise, the display is a 2K display, and DisplayWidth = 1920 and DisplayHeight = 1080.
Step S6612: acquiring bitmap information;
In some embodiments, in step S6612, the URL address of the picture, such as /mnt/storage/4246-1201/1.jpg, is first acquired; the picture is converted into an input stream, and the bitmap information is then obtained through the decode-stream (decodeStream) method of a bitmap factory class, and is recorded as bitmap.
After the bitmap information is obtained, the width and height of the bitmap are also obtained and recorded as Width and Height.
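The stream-decode pattern of step S6612 (open the picture, wrap it in an input stream, decode the stream into a bitmap, read Width and Height) presumably targets an Android-style bitmap factory. The same pattern can be shown in standard Java with javax.imageio; the class and helper names here are assumptions for illustration:

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Desktop-Java analogue of step S6612: decode a picture from an input
// stream and read its width and height.
public class BitmapLoader {
    public static int[] decodeSize(InputStream pictureStream) {
        try {
            BufferedImage bitmap = ImageIO.read(pictureStream); // decode the stream
            return new int[] { bitmap.getWidth(), bitmap.getHeight() };
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    // Demonstration helper (not part of the flow): encode an image to PNG
    // bytes so a stream can be built without touching the filesystem.
    public static byte[] encodePng(BufferedImage img) {
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            ImageIO.write(img, "png", out);
            return out.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
```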
Step S6613: performing matrix transformation processing according to the bitmap information, the rotation angle, the preset scaling ratio, the reference width and the height to obtain a transformation matrix;
In some embodiments, the matrix transformation processing is performed for the parsed angle: the bitmap and the angle information rotateDegree are passed to a processing routine, e.g. handleMatrix().
A matrix is initialized and then matrix transformed.
First, a displacement in the horizontal and vertical directions is applied:
matrix.postTranslate((DisplayWidth - Width * Scale) / 2, (DisplayHeight - Height * Scale) / 2);
then, if the angle is not 0, rotation is required:
matrix.postRotate(rotateDegree, DisplayWidth/2, DisplayHeight/2), obtaining the transformed Matrix (transformation matrix).
Rotation at any angle is supported using the above approach; common angles are 0, 90 (or -270), 180 (or -180), and 270 (or -90) degrees.
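The two calls above match the post-concatenation semantics of Android's android.graphics.Matrix. The following self-contained sketch (a hypothetical stand-in, runnable without Android) reproduces that geometry: the bitmap is first centered on the screen, then rotated about the screen center, as in step S6613:

```java
// Minimal 3x3 affine matrix mirroring Matrix.postTranslate/postRotate.
public class RotationMatrix {
    final double[] m = {1, 0, 0, 0, 1, 0, 0, 0, 1}; // row-major identity

    // Post-concatenation: this = other * this (the new step is applied last).
    private void postConcat(double[] o) {
        double[] r = new double[9];
        for (int row = 0; row < 3; row++)
            for (int col = 0; col < 3; col++)
                for (int k = 0; k < 3; k++)
                    r[row * 3 + col] += o[row * 3 + k] * m[k * 3 + col];
        System.arraycopy(r, 0, m, 0, 9);
    }

    public void postTranslate(double dx, double dy) {
        postConcat(new double[] {1, 0, dx, 0, 1, dy, 0, 0, 1});
    }

    // Rotation about the pivot (px, py): translate(-p), rotate, translate(+p), fused.
    public void postRotate(double degrees, double px, double py) {
        double rad = Math.toRadians(degrees);
        double c = Math.cos(rad), s = Math.sin(rad);
        postConcat(new double[] {c, -s, px - c * px + s * py,
                                 s,  c, py - s * px - c * py,
                                 0,  0, 1});
    }

    public double[] mapPoint(double x, double y) {
        return new double[] {m[0] * x + m[1] * y + m[2],
                             m[3] * x + m[4] * y + m[5]};
    }
}
```

With DisplayWidth = 1920, DisplayHeight = 1080, a 1280x720 bitmap, and Scale = 1, the translate step moves the bitmap center onto the screen center, and the rotate step then pivots the picture about that center, leaving the center point fixed.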
The embodiment of the application supports the zooming function after the video is rotated, and can automatically adapt to the screen and zoom in or out. That is, any zooming mode is supported for playing after rotation: self-adapting, original size, enlargement, reduction, and the like. The zoom ratio is recorded as Scale, and the user can select the zoom ratio in the menu.
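The choice of Scale for the zoom modes named above can be sketched as follows (the class and method names are assumptions; the modes correspond to the adaptation, original size, and full screen items of the zoom sub-menu). Note that after a 90 or 270 degree rotation the bitmap's width and height would be compared swapped against the screen:

```java
// Hypothetical Scale selection for the zoom modes: "fit" scales uniformly
// until the picture touches a screen edge, "original" keeps the source
// size, "full screen" fills the screen (cropping the longer dimension).
public class ZoomScale {
    public static double fit(double displayW, double displayH, double w, double h) {
        return Math.min(displayW / w, displayH / h);
    }

    public static double original() {
        return 1.0;
    }

    public static double fullScreen(double displayW, double displayH, double w, double h) {
        return Math.max(displayW / w, displayH / h);
    }
}
```

The resulting Scale feeds directly into the postTranslate expression of step S6613, which centers the scaled bitmap on the screen.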
Step S6614: and drawing a picture according to the bitmap information and the transformation matrix.
In some embodiments, the picture is drawn on the ImageView using the bitmap information and the transformation matrix.
Fig. 6 and 8 are flowcharts illustrating another method for video rotation playing.
The embodiment of the application depends on the rendering interface of the solution (chipset) vendor: the parsed data is assigned to the Video layer data receiving interface, so that the effect of playing on the Video layer is realized, and a 4K playing effect can be achieved.
This has the advantages that a 4K video rotation playing effect can be achieved, 4K audio and video parameter adjustment is supported, and the playing effect is good.
Referring to fig. 8, the method for performing rotation processing on each frame of data includes the following steps:
Step S6621: initiating the playing via PhotoRender.init().
Step S6622: acquiring the display size of the display equipment, and determining the reference width and height of the display equipment;
step S6623: acquiring bitmap information;
step S6624: performing matrix transformation processing according to the bitmap information, the rotation angle, the preset scaling ratio, the reference width and the height to obtain a transformation matrix;
step S6625: and drawing a picture according to the bitmap information and the transformation matrix.
Step S6626: transmitting the bitmap information, the transformation matrix, and the drawn picture to the video layer renderer, RenderPhotoOnVdp(), to realize the 4K display effect.
In some embodiments, the bitmap information and the transformation matrix are passed to the video layer to achieve a 4K display effect.
Step S6627: releasing the resources of the PhotoRender.
The embodiment of the application supports both automatic rotation and manual rotation. When an angle is detected, the display device rotates automatically during playing; the above embodiments all describe the automatic rotation case.
Fig. 9 is a flowchart illustrating still another video rotation playing method, which is applied to manual rotation.
Step S91: an instruction for instructing to play a video, which is input by a user operating the control device, is received through the user interface.
For example, the user interface may receive the film source selected by the user to be played on the display device by pressing a confirmation key on the control device.
Step S92: playing a video in response to a user-input instruction instructing the video to be played;
step S93: and receiving an instruction which is input by a user operating the control device and indicates that the video is played in a rotating mode through the user interface.
For example, in the menu displayed during playing, when the user invokes the menu function using the Menu button, rotation can be selected in the menu.
Step S94: responding to an instruction which is input by a user and indicates to rotationally play a video, acquiring each frame of data of the video unreleased part, and storing the data as a picture, wherein the instruction which indicates to rotationally play the video carries information of a rotation angle;
step S95: performing rotation processing on each frame data according to the rotation angle;
step S96: and playing the rotated pictures according to a preset time interval.
In the above embodiments, the display device determines that the video contains angle information, acquires each frame of the video by using the decoder, stores the frames as pictures, processes the pictures in batch, and plays the processed pictures continuously as a slideshow, thereby realizing the effect of video rotation playing. The application supports both automatic rotation of videos carrying angle information and manual rotation, can realize rotating playing at any angle, and improves user experience.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (16)

1. A display device, comprising:
a display;
a user interface;
a controller for performing:
responding to a command of instructing to play a video input by a user, analyzing the video to obtain metadata information, and acquiring a rotation angle in the metadata information;
if the rotation angle is not zero, acquiring each frame of data of the video and storing the data as a picture;
performing rotation processing on each frame of data;
and playing the rotated pictures according to a preset time interval.
2. The apparatus according to claim 1, wherein said controller performs said rotation processing for each frame data in a manner that:
acquiring the display size of the display equipment, and determining the reference width and height of the display equipment;
acquiring bitmap information;
performing matrix transformation processing according to the bitmap information, the rotation angle, the preset scaling ratio, the reference width and the height to obtain a transformation matrix;
and drawing a picture according to the bitmap information and the transformation matrix.
3. The apparatus according to claim 2, wherein the controller performs the matrix transformation processing according to the bitmap information, the rotation angle, the preset scaling, and the reference width and height to obtain a transformation matrix by:
initializing a matrix;
performing horizontal and vertical displacement transformation on the initialized matrix according to a preset scaling, the reference width and height and the width and height of bitmap information to obtain a displacement matrix;
and performing rotation transformation on the displacement matrix according to the rotation angle, the reference width and the height to obtain a transformation matrix.
4. The display device according to claim 2, wherein the controller performs the acquiring of the bitmap information in:
acquiring a URL address of a picture, and converting the picture into an input stream;
and analyzing the input stream to obtain bitmap information.
5. The display device according to claim 2, wherein the controller performs the playing of the rotation-processed picture at a preset time interval in a specific manner as follows:
and playing the rotated picture on the OSD layer according to a preset time interval.
6. The display device according to claim 2, wherein the controller further performs the rotation processing for each frame data in the following manner:
and transmitting the bitmap information, the transformation matrix and the drawn picture to a video layer.
7. The display device according to claim 6, wherein the controller performs the playing of the rotation-processed picture at a preset time interval in a specific manner as follows:
and playing the rotated pictures on the video layer according to a preset time interval.
8. A display device, comprising:
a display;
a user interface;
a controller for performing:
playing a video in response to a user-input instruction instructing the video to be played;
responding to an instruction, input by a user, instructing to rotationally play the video, acquiring each frame of data of the unplayed portion of the video, and storing the data as pictures, wherein the instruction instructing to rotationally play the video carries information of a rotation angle;
performing rotation processing on each frame of data according to the rotation angle;
and playing the rotated pictures at a preset time interval.
9. The display device according to claim 8, wherein the controller performs the rotation processing on each frame of data in the following manner:
acquiring a display size of the display device, and determining a reference width and height of the display device;
acquiring bitmap information;
performing matrix transformation processing according to the bitmap information, the rotation angle, the preset scaling ratio, and the reference width and height to obtain a transformation matrix;
and drawing a picture according to the bitmap information and the transformation matrix.
10. The display device according to claim 9, wherein the controller performs the matrix transformation processing according to the bitmap information, the rotation angle, the preset scaling ratio, and the reference width and height to obtain the transformation matrix in the following manner:
initializing a matrix;
performing horizontal and vertical displacement transformation on the initialized matrix according to the preset scaling ratio, the reference width and height, and the width and height of the bitmap information to obtain a displacement matrix;
and performing rotation transformation on the displacement matrix according to the rotation angle and the reference width and height to obtain the transformation matrix.
11. The display device according to claim 9, wherein the controller performs the acquiring of the bitmap information in the following manner:
acquiring a URL address of a picture, and converting the picture into an input stream;
and analyzing the input stream to obtain bitmap information.
12. The display device according to claim 9, wherein the controller performs the playing of the rotation-processed pictures at the preset time interval in the following manner:
and playing the rotated pictures on the OSD layer at the preset time interval.
13. The display device according to claim 9, wherein the controller further performs the rotation processing on each frame of data in the following manner:
transmitting the bitmap information, the transformation matrix, and the drawn picture to a video layer.
14. The display device according to claim 13, wherein the controller performs the playing of the rotation-processed pictures at the preset time interval in the following manner:
and playing the rotated pictures on the video layer at the preset time interval.
15. A video rotation playing method, characterized by comprising the following steps:
responding to an instruction, input by a user, instructing to play a video, parsing the video to obtain metadata information, and acquiring a rotation angle from the metadata information;
if the rotation angle is not zero, acquiring each frame of data of the video and storing the data as pictures;
performing rotation processing on each frame of data;
and playing the rotated pictures at a preset time interval.
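The first two steps of claim 15, reading a rotation angle out of the video's metadata and branching on it, can be sketched as follows. Containers such as MP4 can carry a display matrix in the track header from which demuxers recover the rotation; demuxing and fixed-point decoding are omitted here, and the input is assumed to be the already-extracted 2x2 matrix as floats:

```python
import math

def rotation_from_display_matrix(m):
    """Sketch of acquiring the rotation angle from video metadata.

    The rotation component of a 2x2 display matrix [[a, b], [c, d]] is
    atan2(b, a); the result is normalized to [0, 360) degrees.
    """
    a, b = m[0][0], m[0][1]
    return round(math.degrees(math.atan2(b, a))) % 360

def needs_rotation_path(m):
    # Per claim 15: enter the picture-based rotation path only when the
    # angle is nonzero; otherwise the video plays normally.
    return rotation_from_display_matrix(m) != 0
```

Note that sign conventions for the display matrix differ between demuxers, so a real implementation would follow whatever convention its media framework documents.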
16. A video rotation playing method, characterized by comprising the following steps:
playing a video in response to a user-input instruction instructing the video to be played;
responding to an instruction, input by a user, instructing to rotationally play the video, acquiring each frame of data of the unplayed portion of the video, and storing the data as pictures, wherein the instruction instructing to rotationally play the video carries information of a rotation angle;
performing rotation processing on each frame of data according to the rotation angle;
and playing the rotated pictures at a preset time interval.
CN202010530170.6A 2020-02-28 2020-06-11 Video rotation playing method and display device Active CN111669638B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2020101300475 2020-02-28
CN202010130047 2020-02-28

Publications (2)

Publication Number Publication Date
CN111669638A true CN111669638A (en) 2020-09-15
CN111669638B CN111669638B (en) 2022-07-15

Family

ID=72386644

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010530170.6A Active CN111669638B (en) 2020-02-28 2020-06-11 Video rotation playing method and display device
CN202010597658.0A Pending CN111669634A (en) 2020-02-28 2020-06-28 Video file preview method and display equipment

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202010597658.0A Pending CN111669634A (en) 2020-02-28 2020-06-28 Video file preview method and display equipment

Country Status (2)

Country Link
CN (2) CN111669638B (en)
WO (1) WO2021169168A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112711390A (en) * 2020-12-31 2021-04-27 联想(北京)有限公司 Continuous multi-frame image display output control method and electronic equipment

Families Citing this family (2)

Publication number Priority date Publication date Assignee Title
CN112346616B (en) * 2020-11-09 2022-07-08 上海英方软件股份有限公司 Method and device for realizing dynamic icons of video files
CN114007119A (en) * 2021-10-29 2022-02-01 海信视像科技股份有限公司 Video playing method and display equipment

Citations (11)

Publication number Priority date Publication date Assignee Title
CN101023405A (en) * 2004-07-16 2007-08-22 三星电子株式会社 Display apparatus and control method thereof
US20080189751A1 (en) * 2007-02-07 2008-08-07 Weaver Timothy H Methods & Systems for image processing
CN101527795A (en) * 2009-04-13 2009-09-09 腾讯科技(深圳)有限公司 Processing method for rotating video in broadcasting, device and system thereof
US20100277608A1 (en) * 2009-05-01 2010-11-04 Sanyo Electric Co., Ltd. Image shooting apparatus, video display apparatus, and video processing system therewith
CN103220485A (en) * 2012-01-18 2013-07-24 腾讯科技(深圳)有限公司 Displayed video picture rotating method and system
CN103425401A (en) * 2013-08-21 2013-12-04 乐视网信息技术(北京)股份有限公司 Method for adjusting file playing angle and electronic terminal
CN104813220A (en) * 2012-11-19 2015-07-29 Lg电子株式会社 Video display device and method of displaying video
CN106534959A (en) * 2016-10-11 2017-03-22 北京小米移动软件有限公司 Method and device for processing live video
CN106658175A (en) * 2016-11-15 2017-05-10 Tcl集团股份有限公司 Video display method and system for television terminal
CN110290425A (en) * 2019-07-29 2019-09-27 腾讯科技(深圳)有限公司 A kind of method for processing video frequency, device and storage medium
CN110460894A (en) * 2019-06-25 2019-11-15 维沃移动通信有限公司 A kind of video image display method and terminal device

Family Cites Families (18)

Publication number Priority date Publication date Assignee Title
US7194701B2 (en) * 2002-11-19 2007-03-20 Hewlett-Packard Development Company, L.P. Video thumbnail
US20070237225A1 (en) * 2006-03-30 2007-10-11 Eastman Kodak Company Method for enabling preview of video files
CN101075258A (en) * 2007-05-14 2007-11-21 腾讯科技(深圳)有限公司 Method and device for generating video microform
CN103020076B (en) * 2011-09-23 2017-02-08 深圳市快播科技有限公司 Dynamic preview method and device for player video file
US20130305286A1 (en) * 2012-05-08 2013-11-14 Tomas Wässingbo Electronic device with multimedia content function
TW201349085A (en) * 2012-05-22 2013-12-01 Pegatron Corp Method for managing multimedia files, digital media controller, multimedia file management system
CN103513974A (en) * 2012-06-26 2014-01-15 北京新媒传信科技有限公司 Method and device for achieving dynamic icon button
CN102819396B (en) * 2012-07-31 2015-02-04 北京奇虎科技有限公司 Method and system for playing multimedia file
CN103810221B (en) * 2012-11-15 2019-03-15 腾讯科技(深圳)有限公司 A kind of method for previewing and device of file
CN102982828B (en) * 2012-11-22 2015-11-25 北京百度网讯科技有限公司 The method of the preview file of generating video file and device
CN104540000B (en) * 2014-12-04 2017-10-17 广东欧珀移动通信有限公司 The generation method and terminal of a kind of dynamic thumbnail
CN104780417A (en) * 2015-03-20 2015-07-15 广东欧珀移动通信有限公司 Display method for preview video file, mobile terminal and system
CN105992068A (en) * 2015-05-19 2016-10-05 乐视移动智能信息技术(北京)有限公司 Video file preview method and device
CN105511888A (en) * 2015-12-31 2016-04-20 魅族科技(中国)有限公司 Method and terminal for presenting file icon
US11140461B2 (en) * 2016-06-29 2021-10-05 Sony Interactive Entertainment LLC Video thumbnail in electronic program guide
CN106162378A (en) * 2016-06-30 2016-11-23 乐视控股(北京)有限公司 The browsing method of video file and browsing apparatus
US10932006B2 (en) * 2017-12-22 2021-02-23 Facebook, Inc. Systems and methods for previewing content
CN108196746A (en) * 2017-12-27 2018-06-22 努比亚技术有限公司 A kind of document presentation method and terminal, storage medium


Also Published As

Publication number Publication date
CN111669634A (en) 2020-09-15
WO2021169168A1 (en) 2021-09-02
CN111669638B (en) 2022-07-15


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant