CN111866590B - Display device

- Publication number: CN111866590B
- Application number: CN202010693749.4A
- Authority: CN (China)
- Prior art keywords: display, video, controller, rotation angle, video frame
- Legal status: Active (an assumption, not a legal conclusion)
Links
- 230000003068 static effect Effects 0.000 claims abstract description 20
- 238000000034 method Methods 0.000 claims description 56
- 230000008569 process Effects 0.000 claims description 43
- 230000004044 response Effects 0.000 claims description 29
- 238000009877 rendering Methods 0.000 claims description 11
- 238000012544 monitoring process Methods 0.000 claims description 7
- 238000012545 processing Methods 0.000 abstract description 32
- 230000033001 locomotion Effects 0.000 abstract description 20
- 230000009466 transformation Effects 0.000 abstract description 4
- 238000000844 transformation Methods 0.000 abstract description 4
- 238000013519 translation Methods 0.000 abstract description 4
- 238000004891 communication Methods 0.000 description 21
- 238000010586 diagram Methods 0.000 description 18
- 230000006870 function Effects 0.000 description 13
- 230000003993 interaction Effects 0.000 description 13
- 238000006243 chemical reaction Methods 0.000 description 8
- 230000001133 acceleration Effects 0.000 description 6
- 230000036544 posture Effects 0.000 description 6
- 230000005540 biological transmission Effects 0.000 description 5
- 230000008859 change Effects 0.000 description 5
- 230000009467 reduction Effects 0.000 description 5
- 230000005236 sound signal Effects 0.000 description 5
- 230000006872 improvement Effects 0.000 description 4
- 230000003287 optical effect Effects 0.000 description 4
- 230000009471 action Effects 0.000 description 3
- 238000012986 modification Methods 0.000 description 3
- 230000004048 modification Effects 0.000 description 3
- 230000000007 visual effect Effects 0.000 description 3
- 239000008186 active pharmaceutical agent Substances 0.000 description 2
- 230000003321 amplification Effects 0.000 description 2
- 230000015572 biosynthetic process Effects 0.000 description 2
- 230000006837 decompression Effects 0.000 description 2
- 238000001514 detection method Methods 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 230000005484 gravity Effects 0.000 description 2
- 238000002156 mixing Methods 0.000 description 2
- 238000003199 nucleic acid amplification method Methods 0.000 description 2
- 230000001360 synchronised effect Effects 0.000 description 2
- 238000003786 synthesis reaction Methods 0.000 description 2
- 230000003044 adaptive effect Effects 0.000 description 1
- 238000013473 artificial intelligence Methods 0.000 description 1
- 230000006399 behavior Effects 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 239000003086 colorant Substances 0.000 description 1
- 239000002131 composite material Substances 0.000 description 1
- 230000006835 compression Effects 0.000 description 1
- 238000007906 compression Methods 0.000 description 1
- 238000012790 confirmation Methods 0.000 description 1
- 230000003247 decreasing effect Effects 0.000 description 1
- 238000009434 installation Methods 0.000 description 1
- 230000002452 interceptive effect Effects 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000003825 pressing Methods 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 230000002194 synthesizing effect Effects 0.000 description 1
Images
Classifications
- H04N21/4402 — Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
- G05D3/12 — Control of position or direction using feedback
- H04N21/4307 — Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/431 — Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/472 — End-user interface for requesting content, additional data or services; End-user interface for interacting with content
Abstract
The display device shown in the embodiments of the application comprises a display and a controller. If the display is in a rotating state, the controller draws the video frames of the video to be played through a TextureView. A TextureView supports transformations of the video frame such as translation, scaling, rotation by an arbitrary angle, and screenshot capture. Therefore, by drawing the video frame through a TextureView, the display direction of the drawn video frame can be adapted to the display direction of the display. When the display is in a static state, the controller draws the video frames of the video to be played through a SurfaceView, so that image quality processing or motion compensation can be performed on the video frames, further improving the user experience.
Description
Technical Field
The application relates to the technical field of rotating televisions, in particular to a display device.
Background
The smart television has an independent operating system and supports function expansion. Various application programs can be installed in the smart television according to the user's needs, for example traditional video applications, social applications such as short video, and reading applications such as comics and e-books. These applications can display their pages on the screen of the smart television and provide it with rich media resources. Meanwhile, the smart television can also perform data interaction and resource sharing with different terminals. For example, the smart television can connect to a mobile phone through wireless communication such as a local area network or Bluetooth, so as to play resources on the mobile phone or directly display the mobile phone's picture by screen casting.
However, because the picture proportions of different applications, or of media assets from different sources, differ, the smart TV often has to display pictures whose proportions differ from those of traditional video. For example, video resources shot by a terminal such as a mobile phone are generally vertical media assets with aspect ratios such as 9:16 or 9:18, and the pictures provided by reading applications are vertical resources close to the aspect ratio of a book. The aspect ratio of a smart-TV display screen is generally a horizontal ratio such as 16:9 or 16:10, so when vertical media assets such as short videos and comics are displayed on the smart TV, they cannot be displayed normally because the picture proportion does not match the screen proportion. Generally, the vertical media asset picture has to be scaled down to be displayed completely, which wastes display space on the screen and degrades the user experience.
Disclosure of Invention
The application provides a display device to solve the above technical problem of traditional televisions.
The application shows a display device comprising:
a display;
the rotating assembly is connected with the display and is used for driving the display to rotate based on the control of the controller;
a controller configured with rendering controls, the rendering controls comprising a video layer rendering control and an image layer rendering control, the controller being configured to perform the following steps:
monitoring a state of the display in response to receiving a video playback instruction, the state including: a rotating state and a stationary state;
if the display is in a rotating state, drawing a video frame of the video to be played through an image layer drawing control;
and if the display is in a static state, drawing the video frame of the video to be played through a video layer drawing control.
The display device shown in the embodiments of the application comprises a display and a controller. If the display is in a rotating state, the controller draws the video frames of the video to be played through a TextureView. A TextureView supports transformations of the video frame such as translation, scaling, rotation by an arbitrary angle, and screenshot capture. Therefore, by drawing the video frame through a TextureView, the display direction of the drawn video frame can be adapted to the display direction of the display. When the display is in a static state, the controller draws the video frames of the video to be played through a SurfaceView, so that image quality processing or motion compensation can be performed on the video frames, further improving the user experience.
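As a non-authoritative illustration of the selection described above, the following Java sketch (Android-style, with assumed names; not the patent's exact implementation) chooses between a SurfaceView and a TextureView depending on whether the display is rotating:

```java
import android.content.Context;
import android.view.SurfaceView;
import android.view.TextureView;
import android.view.View;

// Hypothetical helper: pick the rendering control for the video to be played.
final class RenderViewFactory {
    static View pickRenderView(Context context, boolean displayRotating) {
        if (displayRotating) {
            // A TextureView behaves like an ordinary View, so the drawn video
            // frame can be translated, scaled, or rotated by any angle to
            // follow the physical rotation of the panel.
            return new TextureView(context);
        }
        // A SurfaceView renders on a separate video layer, which allows the
        // hardware pipeline to apply image quality processing and motion
        // compensation while the display is stationary.
        return new SurfaceView(context);
    }
}
```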
Drawings
In order to explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below; it is obvious that other drawings can be obtained by those skilled in the art from these drawings without creative effort.
Fig. 1A is an application scenario diagram of a display device according to the present application;
FIG. 1B is a rear view of a display device of the present application;
fig. 2 is a block diagram of a hardware configuration of a control apparatus according to the present application;
FIG. 3 is a block diagram of a hardware configuration of a display device according to the present application;
FIG. 4 is a block diagram of an architectural configuration of an operating system in a memory of a display device according to the present application;
FIG. 5A is a schematic view of a landscape showing directional media assets according to the present application;
FIG. 5B is a schematic diagram of the present application showing directional assets displayed in a vertical screen;
FIG. 6 is a flowchart illustrating operation of a controller during video playback according to one possible embodiment;
FIG. 7A is a flow chart of a controller in a video playback process according to one possible embodiment;
FIG. 7B is a flowchart of a controller during video playback provided in accordance with one possible embodiment;
FIG. 8 is a flowchart illustrating operation of a display device during video playback according to one possible embodiment;
FIG. 9 is a diagram illustrating a variation of a display according to an embodiment;
FIG. 10 is a flowchart illustrating the operation of a display device during playing video according to one possible embodiment;
FIG. 11 is a diagram illustrating a variation of a display according to an embodiment;
FIG. 12 is a diagram illustrating a variation of a display according to an embodiment;
FIG. 13 is a graph of a relationship between a first rotation angle and a rotation time of the display when the rotation assembly rotates the display from a vertical screen state to a horizontal screen state;
FIG. 14 is a flowchart illustrating operation of a display device during video playback according to one possible embodiment;
FIG. 15 is a diagram illustrating a variation of a display according to one possible embodiment.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The rotating television is a new type of smart television, mainly comprising a display and a rotating assembly. The display is connected to a bracket or a wall through the rotating assembly, and the placement angle of the display can be adjusted through the rotating assembly to achieve rotation. Different placement angles can accommodate display pages of different aspect ratios; for example, in most cases the display is placed horizontally to show video pages such as movies and television dramas with a 16:9 aspect ratio, while for pages such as short videos and comics with a 9:16 aspect ratio, the display can be placed vertically through the rotating assembly to accommodate the 9:16 page.
The rotating television supports many applications, and the start-up signal source of the television can be designated by setting a start-up mode for the user's convenience. For example, in order to obtain the viewing experience of a conventional television, the start-up signal source can be set to a live signal so that the television enters the live state directly after power-on. The user can set the start-up signal source to any application program through the settings program. Because different applications support different display postures, the posture of the television at power-on should be adapted to the application used as the start-up signal source so that the page corresponding to that application can be displayed normally.
However, when watching TV, the user may adjust the display posture of the rotating television as required, and the adjusted posture is kept when the television is turned off. For example, when a user watches short videos or comics on the television, the user switches the screen to the vertically placed state and turns off the television in that state. When the user turns the television on next time, the screen is still vertically placed; if the start-up signal source is set to an application that only supports the horizontally placed state, the screen does not match the application of the start-up signal source and cannot be displayed correctly. Therefore, the application provides a display device and a display method of an application interface.
In order to facilitate a user to display a target media asset detail page in different horizontal and vertical screen display directions of a display and to facilitate improvement of user viewing experience of a display device in different viewing states, embodiments of the present application provide a display device, a detail page display method, and a computer storage medium, where the display device is, for example, a rotating television. It should be noted that the method provided in this embodiment is not only applicable to the rotating television, but also applicable to other display devices, such as a computer and a tablet computer.
The term "module," as used in various embodiments of the present application, may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in the various embodiments of the present application refers to a component of an electronic device, such as the display device disclosed in the present application, that is capable of wirelessly controlling the electronic device, typically over a short distance. The component may typically be connected to the electronic device using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as WiFi, wireless USB, bluetooth, motion sensors, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in the common remote control device with the user interface in the touch screen.
The term "gesture" as used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose, or result through a change in hand shape or an action such as hand movement.
The term "hardware system" used in the embodiments of the present application may refer to a physical component having computing, controlling, storing, inputting and outputting functions, which is formed by a mechanical, optical, electrical and magnetic device such as an Integrated Circuit (IC), a Printed Circuit Board (PCB) and the like. In various embodiments of the present application, a hardware system may also be generally referred to as a motherboard (motherboard) or a host chip or controller.
Referring to fig. 1A, an application scenario diagram of a display device according to some embodiments of the present application is provided. As shown in fig. 1A, the control apparatus 100 and the display device 200 may communicate with each other in a wired or wireless manner.
Among them, the control apparatus 100 is configured to control the display device 200, which can receive an operation instruction input by a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an intermediary for interaction between the user and the display device 200. Such as: the user operates the channel up/down key on the control device 100, and the display device 200 responds to the channel up/down operation.
The control device 100 may be a remote controller 100A, which includes infrared protocol communication or bluetooth protocol communication, and other short-distance communication methods, etc. to control the display apparatus 200 in a wireless or other wired manner. The user may input a user instruction through a key on a remote controller, voice input, control panel input, etc., to control the display apparatus 200. Such as: the user can input a corresponding control command through a volume up/down key, a channel control key, up/down/left/right moving keys, a voice input key, a menu key, a power on/off key, etc. on the remote controller, to implement the function of controlling the display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, and the like. For example, the display device 200 is controlled using an application program running on the smart device. The application program may provide various controls to a user through an intuitive User Interface (UI) on a screen associated with the smart device through configuration.
For example, the mobile terminal 100B may install a software application with the display device 200, implement connection communication through a network communication protocol, and implement the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 100B may be caused to establish a control instruction protocol with the display device 200, and the functions of the physical keys as arranged by the remote control 100A may be implemented by operating various function keys or virtual controls of the user interface provided on the mobile terminal 100B. The audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
The display apparatus 200 may provide a network television function that combines a broadcast receiving function with computer support functions. The display device may be implemented as a digital television, a web television, an Internet Protocol Television (IPTV), or the like.
The display device 200 may be a liquid crystal display, an organic light emitting display, a projection device. The specific display device type, size, resolution, etc. are not limited.
The display apparatus 200 also performs data communication with the server 300 through various communication means. Here, the display apparatus 200 may be allowed to be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 300 may provide various contents and interactions to the display apparatus 200. By way of example, the display device 200 may send and receive information such as: receiving Electronic Program Guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The servers 300 may be a group or groups of servers, and may be one or more types of servers. Other web service contents such as a video on demand and an advertisement service are provided through the server 300.
In some embodiments, as shown in FIG. 1B, the display device 200 includes a controller 250, a display 275, a terminal interface extending from a gap in the backplane, and a rotating assembly 276 coupled to the backplane, the rotating assembly 276 being configured to rotate the display 275. Viewed from the front of the display device, the rotating assembly 276 can rotate the display screen to the vertical screen display direction, in which the vertical side of the screen is longer than the horizontal side, or to the horizontal screen display direction, in which the horizontal side of the screen is longer than the vertical side.
Fig. 2 exemplarily provides a block diagram of the configuration of the control apparatus 100. As shown in fig. 2, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, a user output interface 150, and a power supply 160.
The controller 110 includes a Random Access Memory (RAM) 111, a Read Only Memory (ROM) 112, a processor 113, a power-on interface, and a communication bus. The controller 110 is used to control the running and operation of the control apparatus 100, the communication and cooperation among its internal components, and external and internal data processing functions.
Illustratively, when an interaction of a user pressing a key disposed on the remote controller 100A or an interaction of touching a touch panel disposed on the remote controller 100A is detected, the controller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.
A memory 120 for storing various operation programs, data and applications of the driving and controlling apparatus 100 under the control of the controller 110. The memory 120 may store various control signal commands input by a user.
The communicator 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. For example, the control apparatus 100 transmits a control signal (e.g., a touch signal or a key signal) to the display device 200 via the communicator 130, and the control apparatus 100 may receive signals transmitted by the display device 200 via the communicator 130. The communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132. For example, when the infrared signal interface is used, a user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and sent to the display device 200 through the infrared sending module. As another example, when the radio frequency signal interface is used, a user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmitting terminal.
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like, so that a user can input a user instruction regarding controlling the display apparatus 200 to the control apparatus 100 through voice, touch, gesture, press, and the like.
The user output interface 150 outputs a user instruction received by the user input interface 140 to the display apparatus 200, or outputs an image or voice signal received by the display apparatus 200. Here, the user output interface 150 may include an LED interface 151, a vibration interface 152 generating vibration, a sound output interface 153 outputting sound, a display 154 outputting images, and the like. For example, the remote controller 100A may receive an output signal such as audio, video, or data from the user output interface 150 and display the output signal in the form of an image on the display 154, an audio on the sound output interface 153, or a vibration on the vibration interface 152.
And a power supply 160 for providing operating power support for each element of the control apparatus 100 under the control of the controller 110; it may take the form of a battery and associated control circuitry.
A hardware configuration block diagram of the display device 200 is exemplarily provided in fig. 3. As shown in fig. 3, a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, a rotating assembly 276, an audio processor 280, an audio output interface 285, and a power supply 290 may be included in the display apparatus 200.
The rotating assembly 276 may include a driving motor, a rotating shaft, and the like. Wherein, the driving motor can be connected to the controller 250 and output the rotation angle under the control of the controller 250; one end of the rotation shaft is connected to a power output shaft of the driving motor, and the other end is connected to the display 275, so that the display 275 can be fixedly mounted on a wall or a bracket through the rotation member 276.
The rotating assembly 276 may also include other components, such as a transmission component, a detection component, and the like. The transmission component can adjust the rotating speed and torque output by the rotating component 276 through a specific transmission ratio, and can be in a gear transmission mode; the detection means may be composed of a sensor, such as an angle sensor, an attitude sensor, or the like, provided on the rotation shaft. These sensors may detect parameters such as the angle at which the rotating assembly 276 is rotated and send the detected parameters to the controller 250, so that the controller 250 can determine or adjust the state of the display apparatus 200 according to the detected parameters. In practice, rotating assembly 276 may include, but is not limited to, one or more of the components described above.
The tuner demodulator 210 receives the broadcast television signal in a wired or wireless manner, may perform modulation and demodulation processing such as amplification, mixing, and resonance, and is configured to demodulate, from a plurality of wireless or wired broadcast television signals, an audio/video signal carried in a frequency of a television channel selected by a user, and additional information (e.g., EPG data).
The tuner demodulator 210 responds to the television channel frequency selected by the user and the television signal carried by that frequency, as controlled by the controller 250.
The tuner demodulator 210 may receive a television signal in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; and according to different modulation types, a digital modulation mode or an analog modulation mode can be adopted; and can demodulate the analog signal and the digital signal according to the different kinds of the received television signals.
In other exemplary embodiments, the tuning demodulator 210 may also be in an external device, such as an external set-top box. In this way, the set-top box outputs a television signal after modulation and demodulation, and inputs the television signal into the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the display apparatus 200 may transmit content data to an external apparatus connected via the communicator 220, or browse and download content data from an external apparatus connected via the communicator 220. The communicator 220 may include a network communication protocol module or a near field communication protocol module such as a WIFI module 221, a bluetooth module 222, and a wired ethernet module 223, so that the communicator 220 may receive a control signal of the control device 100 according to the control of the controller 250 and implement the control signal as a WIFI signal, a bluetooth signal, a radio frequency signal, and the like.
The detector 230 is a component of the display apparatus 200 for collecting signals of an external environment or interaction with the outside. The detector 230 may include a sound collector 231, such as a microphone, which may be used to receive a user's sound, such as a voice signal of a control instruction of the user to control the display device 200; alternatively, ambient sounds may be collected that identify the type of ambient scene, enabling the display device 200 to adapt to ambient noise.
In some other exemplary embodiments, the detector 230 may further include an image collector 232, such as a camera, a video camera, etc., which may be used to collect external environment scenes to adaptively change the display parameters of the display device 200; and the function of acquiring the attribute of the user or interacting gestures with the user so as to realize the interaction between the display equipment and the user.
In some other exemplary embodiments, the detector 230 may further include a light receiver for collecting the intensity of the ambient light to adapt to the display parameter variation of the display device 200.
In some other exemplary embodiments, the detector 230 may further include a temperature sensor, such as by sensing an ambient temperature, and the display device 200 may adaptively adjust a display color temperature of the image. For example, when the temperature is higher, the display device 200 may be adjusted to display a color temperature of the image that is colder; when the temperature is lower, the display device 200 can be adjusted to display the image with a warmer color temperature.
The external device interface 240 is a component for providing the controller 250 to control data transmission between the display apparatus 200 and an external apparatus. The external device interface 240 may be connected to an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner, and may receive data such as a video signal (e.g., moving image), an audio signal (e.g., music), additional information (e.g., EPG), etc. of the external apparatus.
The external device interface 240 may include: a High Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not shown), a red, green, blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to the operation of the user by operating various software control programs (e.g., an operating system and various application programs) stored in the memory 260.
As shown in FIG. 3, the controller 250 includes a Random Access Memory (RAM) 251, a Read Only Memory (ROM) 252, a graphics processor 253, a processor 254, a power-on interface 255, and a communication bus 256. The RAM 251, ROM 252, graphics processor 253, processor 254, and power-on interface 255 are connected through the communication bus 256.
The ROM252 stores various system startup instructions. When the power-on signal is received, the display apparatus 200 starts to be powered on, and the processor 254 executes the system boot instruction in the ROM252 and copies the operating system stored in the memory 260 to the RAM251 to start running the boot operating system. After the start of the operating system is completed, the processor 254 copies the various application programs in the memory 260 to the RAM251 and then starts running the various application programs.
And a graphic processor 253 for generating various graphic objects such as icons, operation menus, and user input instruction display graphics, etc. The graphic processor 253 may include an operator for performing an operation by receiving various interactive instructions input by a user, and further displaying various objects according to display attributes; and a renderer for generating various objects based on the operator and displaying the rendered result on the display 275.
A processor 254 for executing operating system and application program instructions stored in memory 260. And according to the received user input instruction, processing of various application programs, data and contents is executed so as to finally display and play various audio-video contents.
In some exemplary embodiments, the processor 254 may include a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor is used to perform some initialization operations of the display device 200 in the display device preloading mode and/or display operations in the normal mode. The one or more sub-processors are used to perform operations in states such as the standby mode of the display device.
The power-up interface 255 may include a first interface through an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
Where the object may be any one of the selectable objects, such as a hyperlink or an icon. The operation related to the selected object is, for example, an operation of displaying a link to a hyperlink page, document, image, or the like, or an operation of executing a program corresponding to the object. The user input command for selecting the GUI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch panel, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
A memory 260 for storing various types of data, software programs, or applications that drive and control the operation of the display device 200. The memory 260 may include volatile and/or non-volatile memory. The term "memory" includes the memory 260, the RAM 251 and ROM 252 of the controller 250, and a memory card in the display device 200.
In some embodiments, the memory 260 is specifically used for storing an operating program for driving the controller 250 of the display device 200; storing various application programs built in the display apparatus 200 and downloaded by a user from an external apparatus; data such as visual effect images for configuring various GUIs provided by the display 275, various objects related to the GUIs, and selectors for selecting GUI objects are stored.
In some embodiments, memory 260 is specifically configured to store drivers for tuner demodulator 210, communicator 220, detector 230, external device interface 240, video processor 270, display 275, audio processor 280, etc., and related data, such as external data (e.g., audio-visual data) received from the external device interface or user data (e.g., key information, voice information, touch information, etc.) received by the user interface.
In some embodiments, memory 260 specifically stores software and/or programs representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, as well as functions implemented by other programs (e.g., middleware, APIs, or applications); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to enable control or management of system resources.
A block diagram of the architectural configuration of the operating system in the memory of the display device 200 is illustratively provided in fig. 4. The operating system architecture comprises an application layer, a middleware layer and a kernel layer from top to bottom.
The application layer, the application programs built in the system and the non-system-level application programs belong to the application layer. Is responsible for direct interaction with the user. The application layer may include a plurality of applications such as a setup application, a post application, a media center application, and the like. These applications may be implemented as Web applications that execute based on a WebKit engine, and in particular may be developed and executed based on HTML5, cascading Style Sheets (CSS), and JavaScript.
Here, HTML, which stands for HyperText Markup Language, is a standard markup language for creating web pages; web pages are described by markup tags, where HTML tags are used to describe text, graphics, animation, sound, tables, links, and so on. A browser reads an HTML document, interprets the content of the tags in the document, and presents it in the form of a web page.
CSS, which stands for Cascading Style Sheets, is a computer language used to describe the style of HTML documents and can be used to define style structure such as fonts, colors, and positions. A CSS style can be stored directly in the HTML web page or in a separate style file, so that the styles in the web page can be controlled.
JavaScript is a language applied to web page programming; it can be inserted into an HTML page and is interpreted and executed by the browser. The interaction logic of a Web application is implemented by JavaScript. JavaScript can encapsulate a JavaScript extension interface through the browser to realize communication with the kernel layer.
The middleware layer may provide some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as Multimedia and Hypermedia information coding Experts Group (MHEG) middleware related to data broadcasting, DLNA middleware related to communication with external devices, middleware providing the browser environment in which each application program of the display device runs, and the like.
The kernel layer provides core system services, such as: file management, memory management, process management, network management, system security authority management and the like. The kernel layer may be implemented as a kernel based on various operating systems, for example, a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware, and provides device driver services for various hardware, such as: a display driver for the display, a camera driver for the camera, a key driver for the remote controller, a WiFi driver for the WiFi module, an audio driver for the audio output interface, a power management driver for the Power Management (PM) module, and the like.
In FIG. 3, user interface 265, receives various user interactions. Specifically, it is used to transmit an input signal of a user to the controller 250 or transmit an output signal from the controller 250 to the user. For example, the remote controller 100A may transmit an input signal input by a user, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., to the user interface 265, and then the input signal is forwarded to the controller 250 through the user interface 265; alternatively, the remote controller 100A may receive an output signal such as audio, video, or data output from the user interface 265 via the controller 250, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user interface 265 receives the user input commands through the GUI. Specifically, the user interface 265 may receive user input commands for controlling the position of a selector in the GUI to select different objects or items. Among these, "user interfaces" are media interfaces for interaction and information exchange between an application or operating system and a user, which enable the conversion between an internal form of information and a form acceptable to the user. A common presentation form of the user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a control, a menu, a tab, a text box, a dialog box, a status bar, a channel bar, a Widget, etc.
Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user interface 265 receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 270 is configured to receive an external video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on the display 275.
Illustratively, the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is configured to demultiplex an input audio/video data stream, for example, an input MPEG-2 stream (based on a compression standard of a digital storage media moving image and voice), and demultiplex the input audio/video data stream into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
And the image synthesis module, such as an image synthesizer, is used for performing superposition mixing processing on the GUI signal input by the user or generated by the user and the video image after the zooming processing by the graphics generator so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, for example converting a 60 Hz input video to a frame rate of 120 Hz or 240 Hz, which is commonly implemented by the interpolation-frame method; a conceptual sketch is given after the module descriptions below.
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format of a display, such as converting the format of the signal output by the frame rate conversion module to output an RGB data signal.
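For intuition only, the sketch below illustrates the interpolation-frame idea mentioned for the frame rate conversion module; it treats frames as flat arrays of per-sample (e.g., luma) values and is not the actual motion-compensated algorithm of the device:

```java
// Toy frame interpolation: produce one blended frame between two decoded
// frames, doubling a 60 Hz sequence to 120 Hz. Real motion-compensated
// conversion estimates motion vectors instead of simple averaging.
final class FrameInterpolator {
    static int[] midFrame(int[] frameA, int[] frameB) {
        int[] mid = new int[frameA.length];
        for (int i = 0; i < frameA.length; i++) {
            mid[i] = (frameA[i] + frameB[i]) / 2; // average neighbouring samples
        }
        return mid;
    }
}
```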
And a display 275 for receiving image signals input from the video processor 270 and displaying video content, images, and the menu manipulation interface. The displayed video content may come from the broadcast signal received by the tuner demodulator 210, or from video content input through the communicator 220 or the external device interface 240. The display 275 also displays a user manipulation interface (UI) generated in the display apparatus 200 and used to control the display apparatus 200.
And, the display 275 may include a display screen component for rendering pages and a driving component that drives the display of images. Alternatively, a projection device and projection screen may be included, provided that display 275 is a projection display.
As for the rotating assembly 276, the controller 250 may issue a control signal to cause the rotating assembly 276 to rotate the display 275.
The audio processor 280 is configured to receive an external audio signal and, according to the standard codec protocol of the input signal, perform decompression and decoding as well as audio data processing such as noise reduction, digital-to-analog conversion, and amplification, so as to obtain an audio signal that can be played through the speaker 286.
Illustratively, audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, advanced Audio Coding (AAC), high efficiency AAC (HE-AAC), and the like.
The audio output interface 285 is used for receiving the audio signal output by the audio processor 280 under the control of the controller 250. The audio output interface 285 may include a speaker 286 or an external sound output terminal 287, such as an earphone output terminal, for outputting sound to an external device.
In other exemplary embodiments, video processor 270 may comprise one or more chips. Audio processor 280 may also comprise one or more chips.
And, in other exemplary embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated with the controller 250 in one or more chips.
And a power supply 290 for providing power support to the display apparatus 200 from the power input of an external power source under the control of the controller 250. The power supply 290 may be a built-in power supply circuit installed inside the display apparatus 200, or a power supply installed outside the display apparatus 200.
Because the display device 200 provided by the present application includes the display 275 and the rotating component 276, the rotating component 276 can rotate the display 275, so that the display 275 can have different display orientations. Thus, in one implementation, the display direction may include a landscape display direction and a portrait display direction. Wherein the landscape display direction is a display direction in which the length (width) of the display 275 in the horizontal direction is greater than the length (height) of the display 275 in the vertical direction when viewed from the front of the display 275; the vertical screen display direction is a display direction in which the length (width) of the display 275 in the horizontal direction is smaller than the length (height) of the display 275 in the vertical direction when viewed from the front of the display 275.
Obviously, depending on the installation/placement position of the display device 200, the vertical direction is referred to as substantially vertical in the present application, and the horizontal direction is also referred to as substantially horizontal. The horizontal display direction is mainly used for displaying horizontal media such as a tv drama and a movie as shown in fig. 5A. The mode of operation when the display 275 is in the landscape orientation may be referred to as the landscape viewing mode, and the mode of operation when the display 275 is in the portrait orientation may be referred to as the portrait viewing mode. The controller 250 in the display device 200 is further communicatively connected to the server 300 for invoking an interface of the server 300 and obtaining corresponding data. The display 275 in the display device 200 can be rotated by the rotation assembly 276 and used to display a user interface. In practical applications, a user may control a play mode, play contents, and the like of the display apparatus 200 through the control device 100, wherein the play mode includes a landscape viewing mode and a portrait viewing mode.
The vertical screen display direction is mainly used for displaying vertical media such as short videos and cartoons, as shown in fig. 5B. In the vertical screen display direction, the display 275 may display the user interface corresponding to the vertical screen display direction and have an interface layout and an interaction mode corresponding to the vertical screen display direction. In the vertical screen media asset watching mode, a user can watch vertical screen media assets such as short videos and cartoons. Similarly, since the controller 250 in the display device 200 is further in communication connection with the server 300, the media asset data corresponding to the vertical screen can be acquired by calling the interface of the server 300 when the vertical screen displays the direction.
The vertical screen display direction is more suitable for playing pages with a ratio such as 9:16, for example short videos shot by a terminal such as a mobile phone. Because terminal devices such as mobile phones mostly adopt 9:16 or 9:18 screens, when such a terminal accesses the display device 200 and displays a terminal page through the display device 200, the vertical screen display direction avoids excessive zooming of the page, makes full use of the display area of the display 275, and provides a better user experience.
It should be noted that the horizontal screen display direction and the vertical screen display direction are merely two different display directions of the display 275 and do not limit the displayed content; for example, vertical media assets such as short videos and comics can still be displayed in the horizontal screen display direction, and horizontal media assets such as TV dramas and movies can still be displayed in the vertical screen display direction, only with the display window compressed and adjusted when the content does not match the display direction.
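The scaling mentioned above can be pictured with the following sketch (names and parameters are illustrative, not from the patent): it computes the uniform scale factor that fits, say, a 9:16 short video inside a 16:9 screen without cropping:

```java
// Aspect-preserving fit: the smaller of the two axis scale factors keeps the
// whole picture visible, at the cost of unused display space on one axis.
final class AspectFit {
    static float fitScale(int srcWidth, int srcHeight, int screenWidth, int screenHeight) {
        float scaleX = (float) screenWidth / srcWidth;
        float scaleY = (float) screenHeight / srcHeight;
        return Math.min(scaleX, scaleY);
    }
}
```

For example, a 1080x1920 source on a 1920x1080 screen yields a factor of roughly 0.56, which is exactly the kind of excessive zooming the vertical screen display direction avoids.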
When the user uses the display device 200, the display direction of the display 275 can be adjusted according to viewing needs. For example, after a rotation instruction is issued by pressing a rotation key on the control apparatus 100, selecting a rotation option on the UI, or speaking a "rotation"-related voice command through the voice system, the controller 250 controls the rotating assembly 276 to rotate according to the rotation instruction, thereby driving the display 275 to rotate. For example, when the user wants to watch a short video through the display device 200, a rotation instruction can be input in one of the above manners to rotate the display 275 from the landscape display direction counterclockwise by 90 degrees to the vertical screen display direction, so as to adapt to the picture proportion of a vertical application such as short video.
In the process of playing a video, the display device needs to perform image quality processing and/or motion compensation on the frames of the video to be played, which requires the display device to draw the video frames to an image quality processing module and/or a motion compensation module through a SurfaceView (video layer control). Therefore, during video playback the display device draws video frames through the SurfaceView. However, during rotation of the display, the SurfaceView cannot rotate the video frame by an arbitrary angle, so the display direction of the video frame is not adapted to the display direction of the display and the user experience is poor.
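Since a TextureView is an ordinary View, keeping the picture upright during rotation can be sketched as follows (a minimal illustration with assumed names, not the patent's exact implementation):

```java
import android.view.TextureView;

// Counter-rotate the TextureView by the panel's current rotation angle so the
// display direction of the drawn frame stays adapted to the display direction
// of the screen. A SurfaceView offers no equivalent arbitrary-angle transform.
final class RotationFollower {
    static void followPanelRotation(TextureView videoView, float panelAngleDegrees) {
        videoView.setPivotX(videoView.getWidth() / 2f);
        videoView.setPivotY(videoView.getHeight() / 2f);
        videoView.setRotation(-panelAngleDegrees);
    }
}
```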
In order to solve the foregoing technical problem, an embodiment of the present application provides a display device, where the display device at least includes: the functions and connection modes of the display, the rotating assembly and the controller can refer to the above embodiments, and are not described herein.
FIG. 6 is a flowchart illustrating operation of the controller during video playback according to one possible embodiment; the controller is further configured to perform steps S11 and S12.
S11, responding to the received video playing instruction, monitoring the state of the display, wherein the state comprises the following steps: a rotating state and a stationary state;
when the user wants to play the video, the user can send a video playing instruction to the display device. In some feasible embodiments, the video playing instruction may be a video playing instruction sent by the user in a form of voice; the sending process of the video playing instruction may be: the user outputs the wake-up word in advance, then outputs a video playing instruction, and establishes a connector connection with the controller 250 based on the wake-up word, so that the controller 250 can respond to the video playing instruction output by the user. For example, the user enters the wake-up word "congregation classmates" and the video play instruction "play XXX video". In some feasible embodiments, the video playing instruction may be a video playing instruction sent by a user through a remote controller; the sending process of the video playing instruction may be: the user touches the relevant control of the remote controller, so that the controller 250 can send a corresponding video playing instruction, and the controller 250 can respond to the video playing instruction accordingly. For example, the user may trigger the remote controller to send a video playing command to the controller 250 by clicking a confirmation key of the remote controller. The present embodiment provides only two exemplary starting modes of the display device, and the sending process of the video playing instruction in the process of the actual application may be, but is not limited to, the two modes.
In this application, the controller can be provided with a corresponding monitoring component so that the controller can monitor the state of the display. In some possible embodiments, an angle monitor may be disposed in the controller 250; accordingly, the controller 250 may monitor the real-time rotation angle of the display 275, and when the real-time rotation angle is greater than zero, the display is in the rotating state. In another feasible embodiment, a gravitational acceleration sensor may be disposed in the controller 250; by monitoring the sensor readings along the three directions of the spatial coordinate system (x, y, z), the posture of the display 275 at any time is obtained, and from that posture it can be determined whether the display is in the rotating state or the stationary state. In another possible embodiment, an angular acceleration sensor may be disposed in the controller 250, and the display may be determined to be stationary or rotating by monitoring the angle increment detected by the angular acceleration sensor. The controller may also be provided with other monitors for monitoring the state of the display 275.
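As an illustrative sketch only (not part of the claimed embodiments), the rotating/stationary decision based on successive angle readings could look as follows in Java; the class and method names, and the noise threshold, are assumptions for illustration:

```java
/** Illustrative sketch: classify the display as ROTATING or STATIONARY
 *  from successive rotation-angle readings reported by an angle sensor. */
public class DisplayStateMonitor {
    public enum State { STATIONARY, ROTATING }

    private float lastAngle = 0f;                 // last reported rotation angle, in degrees
    private static final float EPSILON = 0.05f;   // readings below this are treated as noise

    /** Called whenever the angle sensor delivers a new reading. */
    public State onAngleReading(float currentAngle) {
        float increment = Math.abs(currentAngle - lastAngle);
        lastAngle = currentAngle;
        // A non-zero angle increment means the rotating assembly is moving the display.
        return (increment > EPSILON) ? State.ROTATING : State.STATIONARY;
    }
}
```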
S12, if the display is in the stationary state, drawing the video frame of the video to be played through the video layer drawing control; if the display is in the rotating state, drawing the video frame of the video to be played through the image layer drawing control;
Referring to fig. 7A, which is a flowchart of the controller during the video playing process according to an embodiment, when the display is in the stationary state the controller is configured to perform the following steps:
S21, receiving audio and video data;
In this application, the controller is configured to receive audio and video data, where the audio and video data may include: network audio and video data obtained over the network, audio and video data downloaded in advance, and wired audio and video data transmitted through the USB interface. The wired audio and video data include satellite television signals, VHF television signals, UHF television signals, and other television signals transmitted by cable stations through microwaves (or optical cables). The present embodiment only illustrates several audio and video signals by way of example, and in practical application the audio and video signals may be, but are not limited to, the above signals.
S22, decoding audio and video data;
In the present application, the controller is provided with a codec, and the codec can decode the audio and video data. The decoding manner of the audio and video data may be any decoding manner customary in the art, which the applicant does not limit here.
S23, synchronizing audio and video data;
After decoding, two paths of data are obtained: one path is audio data and the other is video data.
The controller subsequently needs to transmit the audio data to the speaker so that the speaker plays it, and the video data to the display so that the display presents it. Generally, in order to improve the user experience, a series of image quality processing or motion compensation operations need to be performed on the video data, which means the video data needs a longer time to reach the display than the audio data needs to reach the speaker.
The audio data and the video data can therefore be synchronized by adding a delay parameter to the audio data so that the time at which the audio data arrives at the loudspeaker equals the time at which the video data arrives at the display. The implementation of synchronizing audio data and video data in practical application may be, but is not limited to, the above implementation, which the applicant does not limit here.
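As a minimal sketch of the delay-based synchronization described above (the latency figures are illustrative assumptions, not values from this application):

```java
/** Minimal sketch of delay-based A/V synchronization. */
public class AvSync {
    /** Extra delay (ms) to apply to the audio path so that audio reaches the speaker
     *  at the same time the matching video frame reaches the display. */
    public static long audioDelayMs(long videoPipelineLatencyMs, long audioPipelineLatencyMs) {
        // Video takes longer because of image quality processing / motion compensation,
        // so the audio path is padded with the difference.
        return Math.max(0, videoPipelineLatencyMs - audioPipelineLatencyMs);
    }

    public static void main(String[] args) {
        long delay = audioDelayMs(80, 20); // assumed 80 ms video pipeline, 20 ms audio pipeline
        System.out.println("Delay audio by " + delay + " ms"); // prints 60
    }
}
```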
S24, drawing a video frame according to the video data;
In this embodiment, the controller draws the video frame through the SurfaceView, where the drawing manner of the video frame may be any manner customary in the art, which the applicant does not limit here.
S25, performing motion compensation or image quality processing on the video frame;
The image quality processing may be the adjustment of parameters of the video frame, such as, but not limited to, digital contrast, brightness, color, hue, sharpness, color temperature, noise reduction, MPEG noise reduction, dot noise reduction, white enhancement, color enhancement, red gain, green gain, blue gain, R offset, G offset, B offset, white balance, and other parameters related to image quality. The image quality processing may also consist of adding special effects to the video frame, such as, but not limited to, snow scenes or 3D pictures. The present embodiment only describes two kinds of image quality processing by way of example, and in practical application the image quality processing may be, but is not limited to, these two implementations.
S26 controls the display to display the processed video frame.
In this application, when the display is in the stationary state the controller draws the video frame of the video to be played through the SurfaceView, so that image quality processing or motion compensation can be performed on the video frame and the user experience is improved.
Referring to fig. 7B, which is a flowchart of the controller during the video playing process according to an embodiment, when the display is in the rotating state the controller is configured to perform the following steps:
S21, receiving audio and video data;
S22, decoding audio and video data;
for the decoding manner of the audio and video data, reference may be made to the above embodiments, which are not described herein again.
S23, synchronizing audio and video data;
for the synchronization of audio and video data, reference may be made to the foregoing embodiments, which are not described herein again.
S24, drawing a video frame according to the video data and the rotation angle of the display;
In this embodiment, the controller draws the video frame of the video to be played through the TextureView. The TextureView supports translation, scaling, rotation to any angle, screenshot, and other transformations of the video frame. Therefore, by drawing the video frame through the TextureView, the display direction of the drawn video frame can be adapted to the display direction of the display. When the user watches the video on the display device provided by this embodiment, even while the display is rotating, the picture watched by the user (also referred to as the video frame in this application) always matches the display direction of the display, so the user experience is better.
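On the Android platform, TextureView is an ordinary View and can therefore be rotated to any angle. A minimal sketch, assuming the TextureView instance is supplied by the surrounding player code, might be:

```java
import android.view.TextureView;

/** Sketch: keep the frame drawn into a TextureView upright while the panel rotates. */
public class FrameRotator {
    private final TextureView textureView;

    public FrameRotator(TextureView textureView) {
        this.textureView = textureView;
    }

    /** Counter-rotate the rendered frame by the panel's current rotation angle
     *  so the picture stays upright for the viewer during rotation. */
    public void onDisplayAngleChanged(float displayAngleDegrees) {
        textureView.setRotation(-displayAngleDegrees);
    }
}
```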
S26 controls the display to display the video frame.
In the scheme shown in this embodiment of the application, if the display is in the rotating state, the controller draws the video frame of the video to be played through the TextureView. The TextureView supports translation, scaling, rotation to any angle, screenshot, and other transformations of the video frame, so the display direction of the drawn video frame can be adapted to the display direction of the display; even while the display is rotating, the picture watched by the user always matches the display direction of the display, and the user experience is better. However, if the controller drew the video frame with the TextureView throughout video playback, the video watched by the user would be video without image quality processing. In practical applications, image quality processing or motion compensation is usually required for video frames in order to provide a better visual effect, but due to hardware limitations, image quality processing cannot be performed on frames output in the graphics layer. Based on this, in the display device provided by this embodiment, when the display is in the stationary state the controller draws the video frame of the video to be played through the SurfaceView, so that image quality processing or motion compensation can be performed on the video frame and the user experience is further improved.
The following describes in detail the operation flow of the display device during the video playing process with reference to a specific example.
Example 1:
FIG. 8 is a flowchart illustrating operation of a display device during video playback according to one possible embodiment; it can be seen from the figure that during the video playback, the controller is configured to perform the steps of:
S31, receiving audio and video data;
S32, decoding audio and video data;
The decoding manner of the audio and video data may refer to the above embodiments and is not described herein again.
S33, in response to the received video playing instruction, reading a first display direction and a second display direction, wherein the first display direction is the display direction of the video to be played, and the second display direction is the current display direction of the display;
there are various ways to read the first/second display directions.
For example, the controller 250 may determine the first display direction by reading the resolution of the video, where the resolution of the video includes the width and the height of the video. If the width is larger than the height, the display direction of the video is the landscape display direction; if the width is smaller than the height, the display direction of the video is the portrait display direction.
Specifically, after the video is loaded, the attribute values of the video are stored in the local memory of the controller 250, and the controller 250 may determine the display direction of the video by reading the resolution among the attribute values. The resolution describes the number of pixels contained in the video in the horizontal and vertical dimensions. For example, a 1920 × 1080 video consists of 1920 pixels in the horizontal direction and 1080 pixels in the vertical direction (2,073,600 pixels in total). Based on the resolution, it can be determined whether the video is a landscape video or a portrait video.
In some possible embodiments, the resolution of the video loaded by the controller 250 is 320 × 240, and the controller 250 determines that the video is a landscape video; the corresponding first display direction is the landscape display direction. In another possible embodiment, the resolution of the video loaded by the controller 250 is 768 × 1024, and the controller 250 determines that the video is a portrait video; accordingly, the first display direction is the portrait display direction.
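A minimal sketch of the resolution-based decision, using the two example resolutions above:

```java
/** Sketch: derive the first display direction from the video resolution. */
public class VideoOrientation {
    public enum Orientation { LANDSCAPE, PORTRAIT }

    public static Orientation fromResolution(int width, int height) {
        // Wider than tall -> landscape video; taller than wide -> portrait video.
        return (width >= height) ? Orientation.LANDSCAPE : Orientation.PORTRAIT;
    }

    public static void main(String[] args) {
        System.out.println(fromResolution(320, 240));   // LANDSCAPE
        System.out.println(fromResolution(768, 1024));  // PORTRAIT
    }
}
```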
As another example, a direction identifier may be added to the configuration information of the video. The controller 250 may obtain the configuration information of the video being played through the application, and then determine whether the first display direction is the landscape display direction or the portrait display direction based on the configuration information.
Specifically, the controller 250 is further configured to: read the direction identifier; if the direction identifier is the first direction identifier, the display direction supported by the video is the landscape display direction; if the direction identifier is the second direction identifier, the display direction supported by the video is the portrait display direction. In practical application, the first identifier value and the second identifier value may be set according to requirements, which the applicant does not limit. For example, an identifier value taking one of the values H, V, or HV may be added to the configuration information, where H indicates that the video only supports the landscape display direction, V indicates that the video only supports the portrait display direction, and HV indicates that the video supports both the landscape and portrait display directions.
In some feasible embodiments, the controller 250 reads the direction identifier of the video as H and determines that the display direction supported by the video is the landscape display direction. In another possible embodiment, the controller 250 reads the direction identifier of the video as V and determines that the first display direction is the portrait display direction.
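A minimal sketch of reading such an identifier from the configuration information; the key name "orientation" is an assumption for illustration, while the H/V/HV values follow the convention described above:

```java
import java.util.Map;

/** Sketch: interpret a direction identifier from the video's configuration information. */
public class DirectionIdentifier {
    public static String supportedDirections(Map<String, String> config) {
        String id = config.getOrDefault("orientation", "HV"); // key name is illustrative
        switch (id) {
            case "H":  return "landscape only";
            case "V":  return "portrait only";
            default:   return "landscape and portrait"; // "HV" or anything unrecognised
        }
    }
}
```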
The current display direction (second display direction) of the display 275 may be monitored by a sensor built into the display device 200. For example, a gyroscope, a gravitational acceleration sensor, or the like is provided on the display 275 of the display device 200, and the attitude of the display 275 relative to the direction of gravity can be determined by measuring angular acceleration or the gravity direction. The monitored attitude data is then compared with the attitude data for the landscape display direction and the portrait display direction, respectively, to determine the display direction in which the display 275 is currently located.
For another example, a grating angle sensor, a magnetic field angle sensor, a sliding resistance angle sensor, or the like may be provided on the rotating member 276, and the display direction in which the display 275 is currently located may be determined by measuring the angle of rotation of the rotating member 276 and comparing the angle with the angle in the landscape display direction and the portrait display direction, respectively.
The embodiments of the present application only show several implementation manners of reading the first/second display directions by way of example, and in the process of practical application, the implementation manners of reading the first/second display directions are not limited to the above several implementation manners.
S3411, in response to the first display direction being different from the second display direction, controlling the rotating assembly to drive the display to rotate so that the display direction of the rotated display matches the first display direction, and reading a first rotation angle, where the first rotation angle is the rotation angle of the display;
In some feasible embodiments, in response to a video playing instruction input by the user, the controller 250 reads the first display direction and the second display direction; if the first display direction is not consistent with the second display direction, the controller 250 controls the rotating assembly 276 to rotate the display 275 so that after rotation the display direction of the display 275 matches the first display direction. For example, if the controller 250 detects through the angle sensor that the current rotation angle of the display 275 is 90 degrees, it determines that the display direction of the display 275 is the portrait display direction; when the first display direction is the landscape display direction, the display 275 is rotated so that the display direction of the rotated display 275 is the landscape display direction. As another example, if the controller 250 detects through the angle sensor that the current rotation angle of the display 275 is 0 degrees, it determines that the display direction of the display 275 is the landscape display direction; when the first display direction is the portrait display direction, the display 275 is rotated so that the display direction of the rotated display 275 is the portrait display direction.
In this embodiment, the controller may be configured with an angle sensor that records the first rotation angle in real time. Wherein the first selected rotation angle is a relative value, and the first selected rotation angle is an increasing angle or a decreasing angle relative to the initial state of the display as the starting point, for example, in playing a video, the controller 250 monitors that the current rotation angle of the display 275 is 90 degrees through the angle sensor, and when the rotation angle of the display is 91 degrees, the first selected rotation angle is 1 degree. When the display is rotated at 89 degrees, the first selected angle of rotation is also 1 degree.
Because the display device is limited by the rotating assembly and its mounting structure, the display cannot keep rotating in one direction indefinitely after it has rotated by a certain angle in that direction. Thus, if the rotating assembly were rotated in the same direction every time, it might be damaged after repeated rotations.
Based on this, in this application the direction in which the controller controls the display to rotate each time is opposite to the display's most recent rotation direction. For example, if the last rotation of the display was clockwise, the controller controls the next rotation to be counterclockwise; conversely, if the last rotation was counterclockwise, the next rotation is clockwise. This control manner avoids damage to the display caused by over-rotation.
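A minimal sketch of this alternating-direction rule (the initial direction is an assumption for illustration):

```java
/** Sketch: each rotation goes in the direction opposite to the most recent one,
 *  so the rotating assembly never winds repeatedly in the same direction. */
public class RotationDirectionPolicy {
    public enum Direction { CLOCKWISE, COUNTERCLOCKWISE }

    private Direction lastDirection = Direction.COUNTERCLOCKWISE; // assumed initial value

    /** Returns the direction to use for the next rotation and records it. */
    public Direction nextDirection() {
        lastDirection = (lastDirection == Direction.CLOCKWISE)
                ? Direction.COUNTERCLOCKWISE
                : Direction.CLOCKWISE;
        return lastDirection;
    }
}
```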
S3412, drawing the video frame of the video to be displayed based on the first rotation angle so that a second rotation angle is equal to the first rotation angle, the second rotation angle being the rotation angle of the video frame.
The drawing process may be as follows: when the controller 250 controls the rotating assembly 276 to drive the display 275 to rotate clockwise, the controller 250 reads the first rotation angle; when the first video frame needs to be drawn, the first rotation angle read by the controller is, for example, 0.25 degrees, and the TextureView rotates that frame counterclockwise by 0.25 degrees. Video frames of the video to be displayed continue to be drawn based on the first rotation angle in this way until the display stops rotating.
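A minimal sketch of this per-frame counter-rotation using the TextureView's transform; the surrounding decode and render plumbing is assumed and omitted:

```java
import android.graphics.Matrix;
import android.view.TextureView;

/** Sketch: before each frame is shown, the TextureView's transform is set so the
 *  frame is counter-rotated by the current first rotation angle (0.25 degrees for
 *  the first frame in the example above). */
public class RotatingFrameDrawer {
    private final TextureView textureView;

    public RotatingFrameDrawer(TextureView textureView) {
        this.textureView = textureView;
    }

    public void drawNextFrame(float firstRotationAngleDegrees) {
        Matrix m = new Matrix();
        // Rotate the frame content about the view centre, opposite to the panel rotation.
        m.setRotate(-firstRotationAngleDegrees,
                textureView.getWidth() / 2f, textureView.getHeight() / 2f);
        textureView.setTransform(m);
        // ...the decoder then renders the next video frame into the TextureView's surface...
    }
}
```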
S3413, in response to the first rotation angle being equal to the preset rotation angle, drawing the video frame of the video to be played through the video layer drawing control, and performing image quality processing or motion compensation on the video frame.
In the present application the display switches between landscape mode and portrait mode, so in a possible embodiment the preset rotation angle may be set to, but is not limited to, 90 degrees.
For image quality processing or motion compensation, reference may be made to the above embodiments, which are not described herein again.
S342, in response to the first display direction being the same as the second display direction, drawing the video frame, and performing motion compensation or image quality processing on the video frame to obtain a processed video frame.
In response to the first display direction being the same as the second display direction, the controller renders the video frame through the SurfaceView, where the rendering manner of the video frame may be any manner customary in the art, which the applicant does not limit here.
Then, the video frames are subjected to image quality processing and/or motion compensation, wherein the image quality processing and/or motion compensation processes may adopt image quality processing and/or motion compensation methods that are commonly used in the art according to requirements, which is not limited by the applicant herein.
S35, the controller controls the display to display the processed video frame.
Referring to fig. 9, which is a diagram of the variation of the displayed picture during rotation of the display according to an embodiment, it can be seen that during rotation of the display the displayed video frames rotate along with the display. The picture watched by the user always matches the display direction of the display, and the user experience is good.
Example 2:
In the scheme shown in embodiment 1, if the display is in the rotating state, the video frame of the video to be played is drawn through the image layer drawing control; if the display is in the stationary state, the video frame is drawn through the video layer drawing control. When the display switches from the rotating state to the stationary state, the controller switches from outputting video frames with the TextureView to outputting video frames with the SurfaceView. Because there is a time interval during this switch in which the controller outputs no video frame to the display, the display briefly flashes. Specifically, referring to fig. 9, when the display has rotated by 90 degrees, a black flash appears on the display (corresponding to frame 12 in fig. 9).
Based on the above problem, the applicant has made a further improvement to the display device. The operation flow of the improved display device can refer to fig. 10, which is a flowchart of the operation of the display device during the video playing process according to a feasible embodiment; as can be seen from the figure, during video playing the controller is further configured to perform the following steps:
S3414, in response to the first rotation angle being equal to the preset rotation angle, capturing an occlusion image, wherein the occlusion image is the image displayed by the display when the display stops rotating;
In this application, when the first rotation angle equals the preset rotation angle, the display stops rotating; at this moment the controller captures the occlusion image, which is the image displayed by the display when the display stops rotating.
The controller controls the display to show the occlusion image. In this application the occlusion image is placed on the layer above the video frame, so that when the occlusion image and the video frame exist at the same time, the user sees only the occlusion image.
S3415 counts a display time of the occlusion image;
The controller takes the moment at which the occlusion image starts to be displayed as the starting point and counts the display time of the occlusion image in real time.
S3416 undoes the occlusion image in response to the display time being equal to the preset still frame time.
The preset still-frame time may be configured according to requirements. In a feasible embodiment, the preset still-frame time may be the interval between a first time and a second time, where the first time is the time corresponding to the last video frame output by the TextureView and the second time is the time corresponding to the first video frame output by the SurfaceView. When the display time equals the preset still-frame time, the occlusion image is removed so that the user can continue to watch the video normally.
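A minimal sketch of this occlusion-image handover on Android; the view names, the switching helper, and the STILL_FRAME_MS value are assumptions for illustration:

```java
import android.graphics.Bitmap;
import android.view.TextureView;
import android.view.View;
import android.widget.ImageView;

/** Sketch: when the display stops rotating, the last TextureView frame is captured and
 *  shown in an ImageView layered above the video while output switches to the SurfaceView,
 *  then removed once the preset still-frame time has elapsed. */
public class OcclusionHandover {
    private static final long STILL_FRAME_MS = 300;   // assumed preset still-frame time

    private final TextureView textureView;
    private final ImageView occlusionView;            // sits on the layer above the video
    private final Runnable switchToSurfaceView;       // assumed: starts SurfaceView output

    public OcclusionHandover(TextureView t, ImageView o, Runnable switchToSurfaceView) {
        this.textureView = t;
        this.occlusionView = o;
        this.switchToSurfaceView = switchToSurfaceView;
    }

    public void onRotationFinished() {
        Bitmap occlusion = textureView.getBitmap();    // capture the occlusion image
        occlusionView.setImageBitmap(occlusion);
        occlusionView.setVisibility(View.VISIBLE);     // hides the gap during the switch

        switchToSurfaceView.run();

        // Undo the occlusion image once the preset still-frame time has elapsed.
        occlusionView.postDelayed(() -> occlusionView.setVisibility(View.GONE), STILL_FRAME_MS);
    }
}
```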
In this embodiment, referring to fig. 11, which is a diagram of the variation of the picture displayed by the display during rotation according to a feasible embodiment, it can be seen that during rotation the displayed video frames rotate along with the display. When the display switches from the rotating state to the stationary state, the controller switches from TextureView output to SurfaceView output, and during this switch the controller controls the display to show the occlusion picture (corresponding to occlusion picture 14 in fig. 11), so no black flash occurs.
Example 3:
FIG. 12 is a flowchart illustrating operation of a display device during video playback according to one possible embodiment; it can be seen from the figure that during the video playing process, the controller is further configured to perform the steps of:
S41, receiving audio and video data;
S42, decoding audio and video data;
The decoding manner of the audio and video data may refer to the above embodiments and is not described herein again.
S43, reading the angle increment value of the display device;
S4411, in response to the angle increment value being greater than 0, controlling the rotating assembly to drive the display to rotate, and calculating a first rotation angle according to the angle increment value, wherein the first rotation angle is the rotation angle of the display;
In some possible embodiments, in response to a video playing instruction input by the user, the controller 250 reads the angle increment value of the display device; if the angle increment value is greater than 0, the controller 250 controls the rotating assembly 276 to rotate the display 275 so that after rotation the display direction of the display 275 matches the display direction of the video. For example, when the controller 250 detects through the acceleration sensor that the angle increment value of the display 275 is greater than 0, the first rotation angle is calculated from the angle increment values.
The first rotation angle may be calculated in the following manner. In a possible embodiment, the controller receives an angle increment of 0.1 degrees every 10 ms, and the corresponding first rotation angles calculated every 10 ms are: 0.1 degrees, 0.2 degrees, 0.3 degrees, ..., up to 90 degrees.
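A minimal sketch of accumulating the angle increments into the first rotation angle (the 90-degree cap follows the example above):

```java
/** Sketch: accumulate the angle increments reported every 10 ms into the first
 *  rotation angle (0.1, 0.2, 0.3, ... degrees, up to 90 degrees). */
public class IncrementAccumulator {
    private float firstRotationAngle = 0f;

    /** Called each time a new angle increment arrives (e.g. every 10 ms). */
    public float onIncrement(float incrementDegrees) {
        if (incrementDegrees > 0f) {
            firstRotationAngle = Math.min(90f, firstRotationAngle + incrementDegrees);
        }
        return firstRotationAngle;
    }
}
```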
S4412 draws a video frame of the video to be displayed based on the first rotation angle, so that the display direction of the video frame is adapted to the display direction of the display.
The implementation manner of drawing the video frame of the video to be displayed based on the first rotation angle may refer to the above implementation, and is not described herein again.
S4413, in response to the first rotation angle being equal to the preset rotation angle, drawing the video frame of the video to be played through the video layer drawing control, and performing image quality processing or motion compensation on the video frame.
For image quality processing or motion compensation, reference may be made to the above embodiments, which are not described herein again.
S4414, capturing an occlusion image in response to the first rotation angle being equal to the preset rotation angle, wherein the occlusion image is the image displayed by the display when the display stops rotating;
S4415, in response to the first rotation angle being equal to the preset rotation angle, counting the display time of the occlusion image;
S4416, undoing the occlusion image in response to the display time being equal to the preset still-frame time.
S45, the controller controls the display to display the processed video frame.
In this embodiment, during rotation of the display, reference may again be made to fig. 11 for the variation of the picture displayed by the display. As can be seen from fig. 11, during rotation the displayed video frame rotates along with the display, so the user experience is better. When the display switches from the rotating state to the stationary state, the controller switches from TextureView output to SurfaceView output, and during this switch the controller controls the display to show the occlusion picture, so no black flash occurs.
Example 4:
examples 1-3 illustrate arrangements in which a chain motor (also referred to herein as a rotation assembly) is currently used to rotate the display 275. The rotating component 276 is provided with a limit switch, and when the rotating component 276 drives the display 275 to rotate 90 degrees, the rotating component will collide with the limit switch, which causes a large change in the rotation angle of the display 275, as can be seen in fig. 13. Fig. 13 is a graph of a first rotation angle of the display 275 versus rotation time during the rotation of the display 275 from the portrait screen state to the landscape screen state by the rotating assembly 276. At time Ti, the rotating member 276 strikes the limit switch, and the rotation angle of the corresponding display 275 fluctuates greatly, and accordingly, the video frame rendered based on the first rotation angle is tilted greatly, which can be seen in fig. 11 of fig. 9 or fig. 11.
Based on the above problem, the applicant makes a further improvement to the display device provided in the foregoing embodiments. The operation flow of the improved display device can refer to fig. 14, which is a flowchart of the operation of the display device during the video playing process according to a possible embodiment; as can be seen from the figure, during video playing the controller is further configured to perform the following steps:
S51, receiving audio and video data;
S52, decoding audio and video data;
for the decoding manner of the audio and video data, reference may be made to the above embodiments, which are not described herein again.
S5311, in response to the display starting to rotate, recording the rotation time, wherein the rotation time records how long the display has been rotating;
the method for determining the start of rotation of the display can refer to the above embodiments, and is not described herein.
S5312 generating a predicted angle according to the rotation time and the pre-stored rotation speed;
In this application the rotation speed is generated in advance and can be retrieved directly whenever a predicted angle needs to be generated. The rotation speed may be generated as follows: the controller is configured to perform steps (1)-(4).
Step (1): the controller records, for N rotations, the times T_{θ,1}, ..., T_{θ,N} required for the display 275 to rotate by the first preset angle θ;
The first preset angle can be set according to requirements during practical application, and in a feasible embodiment, the first preset angle is set to be 90 degrees.
First rotation: the controller 250 controls the display to rotate and records the time T_{90,1} required for the display 275 to rotate 90 degrees;
Second rotation: the controller 250 controls the display to rotate and records the time T_{90,2} required for the display 275 to rotate 90 degrees;
Third rotation: the controller 250 controls the display to rotate and records the time T_{90,3} required for the display 275 to rotate 90 degrees;
Fourth rotation: the controller 250 controls the display to rotate and records the time T_{90,4} required for the display 275 to rotate 90 degrees;
...
N-th rotation: the controller 250 controls the display to rotate and records the time T_{90,N} required for the display 275 to rotate 90 degrees.
Step (2): the controller counts the probability of occurrence of each recorded time value;
Step (3): the controller selects the time whose probability is greater than a preset probability as the target time;
The preset probability can be set according to requirements; the higher the preset probability, the more accurate the corresponding result.
Step (4): calculating the rotation speed according to the first preset angle and the target time;
The rotation speed can be calculated according to the following formula:
V = θ / T_{θ,N}
where V is the rotation speed, θ is the first preset angle, and T_{θ,N} is the target time.
In a possible embodiment, θ =30 degrees, calculated according to the method described above:
In a possible embodiment, θ = 90 degrees, and according to the method described above V = 90 / T_{90,N}.
in the present application, predicted angle = rotation time × rotation speed.
S5313 draws the video frame based on the predicted angle such that a second rotation angle is equal to the predicted angle, the second rotation angle being a rotation angle of the video frame.
The implementation of drawing the video frame based on the predicted angle may refer to the implementation of drawing the video frame based on the first rotation angle, and is not described herein again.
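A minimal sketch of the speed and predicted-angle computation described above; the numeric values in main() are illustrative assumptions:

```java
/** Sketch: V = theta / targetTime, and predicted angle = rotation time * V,
 *  so the predicted angle grows uniformly with elapsed rotation time regardless
 *  of jitter in the measured angle. */
public class PredictedAngle {
    /** Rotation speed in degrees per millisecond. */
    static float rotationSpeed(float presetAngleDeg, long targetTimeMs) {
        return presetAngleDeg / targetTimeMs;
    }

    /** Predicted angle, capped at the preset angle once the rotation completes. */
    static float predictedAngle(long rotationTimeMs, float speedDegPerMs, float presetAngleDeg) {
        return Math.min(presetAngleDeg, rotationTimeMs * speedDegPerMs);
    }

    public static void main(String[] args) {
        float v = rotationSpeed(90f, 9000);               // e.g. 90 degrees in about 9 s
        System.out.println(predictedAngle(4500, v, 90f)); // about 45 degrees halfway through
    }
}
```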
S5314, in response to the display stopping rotating, intercepting an occlusion picture;
s5315, in response to the display stopping rotating, drawing the video frame of the video to be played through the video layer drawing control, and performing image quality processing or motion compensation on the video frame.
S5316 counting the display time of the occlusion image;
s5317 in response to the display time being equal to the preset static frame time, undoing the occlusion image;
s54 the controller displays the processed video frame on the display.
In the display device of this application, during rotation of the display the predicted rotation angle is calculated from the rotation time and the rotation speed; the predicted angle is not affected by the actual rotation angle of the display, so even when the display shakes strongly because it strikes the limit switch during rotation, the controller still obtains a suitable predicted angle from the current rotation time and the rotation speed. Because the predicted angle increases uniformly with time, the video picture drawn based on the predicted angle rotates at a constant speed, the picture changes smoothly, and the user experience is better. For the variation of the picture displayed during rotation, refer to fig. 15, which is a diagram of the variation of the displayed picture according to a possible embodiment; it can be seen from fig. 15 that during rotation the displayed video frames rotate along with the display. When the display switches from the rotating state to the stationary state, the controller switches from TextureView output to SurfaceView output, and during this switch the controller controls the display to show the occlusion picture, so no black flash occurs.
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program, and when executed the program may include some or all of the steps in the embodiments of the display method provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented using software plus any required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of software products, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and include instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method in the embodiments or some parts of the embodiments of the present invention.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.
Claims (10)
1. A display device, comprising:
a display;
the rotating assembly is connected with the display and is used for driving the display to rotate based on the control of the controller;
a controller configured with rendering controls, the rendering controls comprising: an image layer rendering control and a video layer rendering control configured to perform the steps of:
monitoring a state of the display in response to receiving a video playback instruction, the state including: a rotating state and a stationary state;
in response to the display rotating, determining a predicted rotation angle of the display in the rotating process, wherein the predicted rotation angle is obtained based on the rotation time and the rotation speed of the display;
drawing a video frame based on the predicted rotation angle, wherein the rotation angle of the video frame is equal to the predicted rotation angle, the video frame rotates along with the display, and the current display direction of the video frame is matched with the current display direction of the display in the rotation process;
if the display is in a rotating state, drawing a video frame of a video to be played through an image layer drawing control;
and if the display is in a static state, drawing the video frame of the video to be played through a video layer drawing control.
2. The display device according to claim 1, wherein the controller is further configured to perform the steps of:
reading a first display direction and a second display direction in response to receiving a video playing instruction, wherein the first display direction is the current display direction of the display, and the second display direction is the display direction of the video to be played;
and in response to that the first display direction is different from the second display direction, controlling the rotating assembly to drive the display to rotate, so that the display direction of the rotated display is matched with the second display direction.
3. The display device according to claim 1 or 2, wherein the controller is further configured to perform the steps of:
recording a rotation time in response to the display starting to rotate, wherein the rotation time is used for recording the time length of the display rotation;
generating a predicted angle according to the rotation time and a prestored rotation speed;
and drawing the video frame based on the predicted angle, wherein the rotation angle of the video frame is equal to the predicted rotation angle, and the display direction of the video frame is matched with the display direction of a display.
4. The display device of claim 2, wherein the controller is further configured to perform the steps of:
reading a first rotation angle in response to the first display direction being different from the second display direction, the first rotation angle being a rotation angle of the display;
and drawing a video frame of a video to be displayed based on the first rotation angle so as to enable a second rotation angle to be equal to the first rotation angle, wherein the second rotation angle is the rotation angle of the video frame.
5. The display device according to claim 1, wherein the controller is further configured to perform the steps of:
reading an angle increment value of the display in response to receiving a video playing instruction;
in response to the angle increment value being greater than zero, calculating a first rotation angle based on the angle increment value, the first rotation angle being a rotation angle of the display;
and drawing the video frame of the video to be displayed based on the first rotation angle so as to enable a second rotation angle to be equal to the first rotation angle, wherein the second rotation angle is the rotation angle of the video frame.
6. The display device according to claim 4 or 5, wherein the controller is further configured to perform the steps of:
and responding to the first rotating angle being equal to the preset rotating angle, and drawing the video frame of the video to be played through a video layer drawing control.
7. The display device according to claim 4 or 5, wherein the controller is further configured to perform the steps of:
starting to count the rotation time of the display in response to the display starting to rotate;
and in response to the rotation time being equal to the preset rotation time, drawing the video frame of the video to be played through a video layer drawing control.
8. The display device of claim 5, wherein the controller is further configured to perform the steps of:
and responding to the angle increment value being equal to zero, and drawing the video frame of the video to be played through a video layer drawing control.
9. The display device according to claim 1, wherein the controller is further configured to:
intercepting a shielding image when the display stops rotating, wherein the shielding image is an image displayed by the display when the display stops rotating;
and controlling a display to display the shielding image, wherein the shielding image is arranged on the upper layer of the video frame.
10. The display device according to claim 9, wherein the controller is further configured to:
counting the display time of the shielding image;
and in response to the display time being equal to a preset static frame time, undoing the occlusion image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010693749.4A CN111866590B (en) | 2020-07-17 | 2020-07-17 | Display device |
PCT/CN2021/080553 WO2021180224A1 (en) | 2020-03-13 | 2021-03-12 | Display device |
PCT/CN2021/080552 WO2021180223A1 (en) | 2020-03-13 | 2021-03-12 | Display method and display device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010693749.4A CN111866590B (en) | 2020-07-17 | 2020-07-17 | Display device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111866590A CN111866590A (en) | 2020-10-30 |
CN111866590B true CN111866590B (en) | 2022-12-23 |
Family
ID=73000989
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010693749.4A Active CN111866590B (en) | 2020-03-13 | 2020-07-17 | Display device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111866590B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021180224A1 (en) * | 2020-03-13 | 2021-09-16 | 海信视像科技股份有限公司 | Display device |
CN112565861A (en) * | 2020-11-23 | 2021-03-26 | 青岛海信传媒网络技术有限公司 | Display device |
CN112565839B (en) * | 2020-11-23 | 2022-11-29 | 青岛海信传媒网络技术有限公司 | Display method and display device of screen projection image |
CN116486759B (en) * | 2023-04-11 | 2024-01-30 | 艺壹佳文化科技(广东)有限公司 | Intelligent adjustment method, device, equipment and storage medium for identification display |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005202127A (en) * | 2004-01-15 | 2005-07-28 | Seiko Epson Corp | Rotary type display device and driving method for rotary type display device |
WO2007105366A1 (en) * | 2006-03-14 | 2007-09-20 | Ntn Corporation | Rotation angle detector and bearing with rotation detector |
JP2018067249A (en) * | 2016-10-21 | 2018-04-26 | レノボ・シンガポール・プライベート・リミテッド | Information processing apparatus, image rotating method, and program |
CN111107418A (en) * | 2019-12-19 | 2020-05-05 | 北京奇艺世纪科技有限公司 | Video data processing method, video data processing device, computer equipment and storage medium |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6115025A (en) * | 1997-09-30 | 2000-09-05 | Silicon Graphics, Inc. | System for maintaining orientation of a user interface as a display changes orientation |
CN101527134B (en) * | 2009-04-03 | 2011-05-04 | 华为技术有限公司 | Display method, display controller and display terminal |
CN102724452B (en) * | 2012-06-27 | 2016-08-10 | 深圳Tcl新技术有限公司 | The picture processing method of video playback and device |
KR20140133363A (en) * | 2013-05-10 | 2014-11-19 | 삼성전자주식회사 | Display apparatus and Method for controlling the display apparatus thereof |
CN104866080B (en) * | 2014-02-24 | 2020-08-18 | 腾讯科技(深圳)有限公司 | Screen content display method and system |
EP3048601A1 (en) * | 2015-01-22 | 2016-07-27 | Thomson Licensing | Method for displaying video frames on a portable video capturing device and corresponding device |
US20170243327A1 (en) * | 2016-02-19 | 2017-08-24 | Lenovo (Singapore) Pte. Ltd. | Determining whether to rotate content based on identification of angular velocity and/or acceleration of device |
CN108260018B (en) * | 2017-02-13 | 2020-05-22 | 广州市动景计算机科技有限公司 | Full screen setting method and device for webpage video and webpage video setting mobile device |
CN111246266A (en) * | 2020-03-04 | 2020-06-05 | 海信视像科技股份有限公司 | Display equipment and UI (user interface) display method during rotation |
Also Published As
Publication number | Publication date |
---|---|
CN111866590A (en) | 2020-10-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113395558B (en) | Display equipment and display picture rotation adaptation method | |
CN112565839B (en) | Display method and display device of screen projection image | |
CN111866590B (en) | Display device | |
CN111913608B (en) | Touch screen rotation control interaction method and display device | |
CN111970550B (en) | Display device | |
CN111787388B (en) | Display device | |
CN113395562B (en) | Display device and boot animation display method | |
CN111866593B (en) | Display device and startup interface display method | |
CN112165644B (en) | Display device and video playing method in vertical screen state | |
CN111866569B (en) | Display device | |
CN112565861A (en) | Display device | |
CN113556593A (en) | Display device and screen projection method | |
CN113395554B (en) | Display device | |
CN113630639B (en) | Display device | |
CN114501087B (en) | Display equipment | |
CN113473192B (en) | Display device and starting signal source display adaptation method | |
WO2021180224A1 (en) | Display device | |
CN113542824B (en) | Display equipment and display method of application interface | |
CN113497958A (en) | Display device and picture display method | |
CN113556590A (en) | Method for detecting effective resolution of screen-projected video stream and display equipment | |
CN113542823B (en) | Display equipment and application page display method | |
CN113497962B (en) | Configuration method of rotary animation and display device | |
CN113497965B (en) | Configuration method of rotary animation and display device | |
CN115697771A (en) | Display device and display method of application interface | |
CN111787374A (en) | Display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||