CN114501087B - Display equipment - Google Patents


Info

Publication number
CN114501087B
CN114501087B (application CN202011164940.6A)
Authority
CN
China
Prior art keywords: image, display, data, thread, rotation angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011164940.6A
Other languages
Chinese (zh)
Other versions
CN114501087A (en)
Inventor
张仁义
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Priority claimed from CN202011164940.6A
Publication of CN114501087A
Application granted; publication of CN114501087B
Legal status: Active


Classifications

    • H  ELECTRICITY
    • H04  ELECTRIC COMMUNICATION TECHNIQUE
    • H04N  PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00  Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40  Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41  Structure of client; Structure of client peripherals
    • H04N 21/422  Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42204  User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N 21/43  Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431  Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312  Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/47  End-user applications
    • H04N 21/482  End-user interface for program selection
    • H04N 21/485  End-user interface for client configuration

Abstract

An embodiment of the application provides a display device including a display, a rotating assembly, and a controller. The controller includes a graphics processor (GPU), HWC hardware, and a central processing unit (CPU) that runs a data distribution thread and an image composition thread. When the display is in a rotating state, the image composition thread marks all attribute information with a second identification value, so that the data distribution thread transmits the second layer data to the image composition thread. The image composition thread then calls the HWC hardware to draw the second layer data into a second image; because this drawing does not occupy the GPU, the composition pressure on the GPU is relieved to a certain extent. When the display is in a static state, the data distribution thread can call the GPU to draw a first image and transfer it to the image composition thread, which relieves the transmission pressure on the CPU to some extent.

Description

Display equipment
Technical Field
The application relates to the technical field of rotatable televisions, and in particular to a display device.
Background
A smart television has an independent operating system and supports function expansion. Various applications can be installed on it according to user needs: traditional video applications, social applications such as short-video apps, and reading applications such as comic and e-book readers. These applications display their pages on the television screen and bring rich media resources to the smart television. The smart television can also exchange data and share resources with other terminals; for example, it can connect to a mobile phone over a local area network, Bluetooth, or another wireless link to play resources stored on the phone, or to mirror the phone's screen directly.
However, because pictures from different applications and media sources have different proportions, smart televisions are often required to display pictures that differ from the traditional video ratio. For example, video shot on a mobile phone or similar terminal is typically a vertical resource with an aspect ratio of 9:16, 9:18, or 3:4, and the pages provided by reading applications are vertical resources shaped like a book, whereas the display screen of a smart television is generally horizontal, with an aspect ratio such as 16:9 or 16:10. When the smart television displays vertical media such as short videos or comics, the mismatch between the picture ratio and the screen ratio prevents normal display: the vertical picture must be scaled down to be shown completely, which both wastes screen space and degrades the user experience.
Disclosure of Invention
The application provides a display device to solve the above technical problems of conventional televisions.
A first aspect of an embodiment of the present application shows a display device, including: a display; the rotating assembly is connected with the display and used for driving the display to rotate based on the control of the controller; and a controller, the controller comprising:
A graphics processor;
HWC hardware;
the central processing unit is used for running a data distribution thread and an image synthesis thread;
the data distribution thread is configured to perform: receiving presentation data, the presentation data comprising: at least one layer of data, each layer of data configured with attribute information; outputting attribute information;
the image composition thread is configured to perform: in response to receiving the attribute information, reading a rotation angle of the display; if the rotation angle is equal to 0, marking each piece of attribute information with a first identification value or a second identification value according to the hardware capability of the HWC hardware; if the rotation angle is greater than 0, marking all attribute information with the second identification value;
the data distribution thread is configured to perform: in response to receiving the annotated attribute information, invoking a graphics processor to render the first layer data into a first image; outputting the first image and/or the second image layer data; the first layer data is layer data corresponding to the first identification value, and the second layer data is layer data corresponding to the second identification value;
the image composition thread is configured to perform: in response to receiving the second layer data, invoking the HWC hardware to draw the second layer data into a second image;
or, in response to receiving the second layer data and the first image, invoking the HWC hardware to draw the second layer data into a second image, and combining the first image and the second image to obtain a display image;
or, in response to receiving the first image, controlling the display to present the first image.
In the display device of this embodiment, when the display is in a rotating state, all attribute information is marked with the second identification value, so that the data distribution thread transmits the second layer data (the layer data corresponding to the second identification value) to the image composition thread. The image composition thread then calls the HWC hardware to draw the second layer data into a second image; drawing the image this way does not occupy the GPU (also called the graphics processor in this embodiment), so the composition pressure on the GPU is relieved to a certain extent. When the display is in a static state, the data distribution thread can call the GPU to draw a first image and transfer it to the image composition thread. When the presentation data includes multiple pieces of first layer data, the GPU draws them into a single first image, and transmitting that one image occupies fewer resources than transmitting the raw presentation data, so the transmission pressure on the CPU (also called the central processing unit in this embodiment) is relieved to some extent.
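The marking-and-dispatch scheme described above can be sketched in code. The following Python sketch is illustrative only; the names `mark_layers`, `dispatch`, `GPU_COMPOSED`, and `HWC_COMPOSED` are hypothetical and do not come from the patent.

```python
from dataclasses import dataclass

# Hypothetical identification values; the patent only calls them the
# "first identification value" (GPU-composed) and the
# "second identification value" (HWC-composed).
GPU_COMPOSED = 1
HWC_COMPOSED = 2

@dataclass
class Layer:
    name: str
    hwc_supported: bool = True  # can the HWC hardware compose this layer?
    mark: int = 0

def mark_layers(layers, rotation_angle, hwc_capable=True):
    """Image-composition thread: tag each layer's attribute information.

    While the display rotates (angle > 0) every layer is marked for HWC
    composition, keeping the GPU free; when the display is static
    (angle == 0), layers are split between GPU and HWC according to the
    HWC hardware capability."""
    for layer in layers:
        if rotation_angle > 0 or (hwc_capable and layer.hwc_supported):
            layer.mark = HWC_COMPOSED
        else:
            layer.mark = GPU_COMPOSED
    return layers

def dispatch(layers):
    """Data-distribution thread: GPU-marked layers are drawn into a single
    first image; HWC-marked layer data is forwarded as-is."""
    first = [l for l in layers if l.mark == GPU_COMPOSED]
    second = [l for l in layers if l.mark == HWC_COMPOSED]
    first_image = f"image({'+'.join(l.name for l in first)})" if first else None
    return first_image, second
```

Transferring one composed first image instead of every individual layer is what relieves the CPU-side transmission pressure described above.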
A second aspect of the embodiments of the present application shows a display device, including: a display; the rotating assembly is connected with the display and used for driving the display to rotate based on the control of the controller; and a controller, the controller comprising:
a graphics processor;
HWC hardware;
the central processing unit is used for running a data distribution thread and an image synthesis thread;
the data distribution thread is configured to perform: receiving presentation data, the presentation data comprising: at least one layer of data, each layer of data configured with attribute information; outputting attribute information;
the image composition thread is configured to perform: in response to receiving the attribute information, reading a rotation angle of the display; if the rotation angle is equal to 0, marking all attribute information with the first identification value; if the rotation angle is greater than 0, marking all attribute information with the second identification value;
the data distribution thread is configured to perform: in response to receiving the annotated attribute information, invoking a graphics processor to render the first layer data into a first image; outputting the first image and/or the second image layer data; the first layer data is layer data corresponding to the first identification value, and the second layer data is layer data corresponding to the second identification value;
the image composition thread is configured to perform: in response to receiving the second layer data, invoking the HWC hardware to draw the second layer data into a second image;
or, in response to receiving the first image, controlling the display to present the first image.
In the display device of this embodiment, when the display is in a rotating state, the image composition thread marks all attribute information with the second identification value, so that the data distribution thread transmits the second layer data to the image composition thread. The image composition thread then calls the HWC hardware to draw the second layer data into a second image; drawing the image this way does not occupy the GPU, so the composition pressure on the GPU is relieved to a certain extent. When the display is in a static state, the data distribution thread can call the GPU to draw a first image and transfer it to the image composition thread. When the presentation data includes multiple pieces of first layer data, the GPU draws them into a single first image, and transmitting that one image occupies fewer resources than transmitting the raw presentation data, so the transmission pressure on the CPU is relieved to a certain extent.
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings that are needed in the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
Fig. 1A is an application scenario diagram of a display device of the present application;
FIG. 1B is a rear view of a display device of the present application;
FIG. 2 is a block diagram of the hardware configuration of the control device of the present application;
FIG. 3 is a block diagram of the hardware configuration of the display device of the present application;
FIG. 4 is a block diagram of an architecture configuration of an operating system in a memory of a display device of the present application;
FIG. 5A is a schematic diagram of a media asset displayed in the landscape orientation of the present application;
FIG. 5B is a schematic diagram of a media asset displayed in the portrait orientation of the present application;
FIG. 6 is a schematic diagram of a display device according to one possible embodiment;
FIG. 7 is a flow chart illustrating interaction of components of a display device according to one possible embodiment;
FIG. 8 is a schematic diagram of a display presentation interface shown according to one possible embodiment;
FIG. 9A is a first image shown according to one possible embodiment;
FIG. 9B is a first image shown according to one possible embodiment;
FIG. 9C is a first image shown according to one possible embodiment;
FIG. 10 is a presentation image shown according to one possible embodiment;
FIG. 11 is a schematic diagram showing image changes during rotation of a display according to one possible embodiment;
FIG. 12 is a schematic diagram of a display device according to one possible embodiment;
FIG. 13 is a flow chart illustrating interaction of components of a display device according to one possible embodiment;
FIG. 14 is a flow chart illustrating interaction of components of a display device according to one possible embodiment.
Detailed Description
In order to better understand the technical solutions in the present application, the following description will clearly and completely describe the technical solutions in the embodiments of the present application with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, shall fall within the scope of the present application.
The rotatable television is a new type of smart television consisting mainly of a display and a rotating assembly. The display is attached to a bracket or a wall through the rotating assembly, which can adjust the display's mounting angle and thereby rotate it. Different display angles suit pages of different aspect ratios: the display is in most cases placed horizontally to show movie and TV-series video pages with a 16:9 aspect ratio. When a video page has a 9:16 aspect ratio, as with short videos or comics, a horizontally placed display must scale the page down and show black areas on both sides. The rotating assembly can instead place the display vertically to accommodate the 9:16 video page.
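The display-space waste described above can be quantified with a short calculation; the sketch below is illustrative (the function name `fit_page` is ours, not the patent's) and shows why a 9:16 page letterboxed onto a 16:9 panel uses only about a third of the screen.

```python
def fit_page(panel_w, panel_h, page_w, page_h):
    """Scale a page to fit inside a panel while preserving its aspect
    ratio; returns the displayed size and the fraction of panel area
    left as black bars."""
    scale = min(panel_w / page_w, panel_h / page_h)
    disp_w, disp_h = page_w * scale, page_h * scale
    used = (disp_w * disp_h) / (panel_w * panel_h)
    return disp_w, disp_h, 1.0 - used

# A 9:16 short-video page (1080x1920) on a 1920x1080 landscape panel:
w, h, black = fit_page(1920, 1080, 1080, 1920)
# roughly 68% of the panel is black; rotating the display to portrait
# lets the same page fill the panel completely.
```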
A rotatable television supports many applications, and to make viewing convenient, the signal source used at power-on can be designated through a start-up setting. For example, to reproduce the conventional television viewing experience, the power-on source can be set to the live broadcast signal so that the television enters the live state directly after start-up; the user can likewise set any application as the power-on source through the settings program. Because different applications support different display postures, the posture of the television at start-up must match the application serving as the power-on source for that application's page to be displayed normally.
However, a user may adjust the posture of the display while watching and leave it in the adjusted posture when powering off. For example, after watching short videos or comics with the screen switched to the vertical state, the user powers off with the screen still vertical. At the next power-on the screen is in the vertical posture; if the power-on source is set to an application that supports only the horizontal posture, the screen posture does not match the power-on source application and the page cannot be displayed correctly. The application therefore provides a display device and a method for displaying an application interface.
To let a user view a target media detail page in either the landscape or portrait orientation of the display, and to improve the viewing experience in different viewing states, the embodiments of the application provide a display device, a detail-page display method, and a computer storage medium, where the display device is a rotatable television. It should be noted that the method provided in this embodiment is not limited to rotatable televisions; it is also applicable to other display devices, such as computers and tablet computers.
The term "module" as used in various embodiments of the present application may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in the various embodiments of the present application refers to a component of an electronic device (such as a display device as disclosed herein) that can typically wirelessly control the electronic device over a relatively short range of distances. The assembly may be connected to the electronic device generally using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as WiFi, wireless USB, bluetooth, motion sensors, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in a general remote control device with a touch screen user interface.
The term "gesture" as used in embodiments of the present application refers to a user behavior that is used to express an intended idea, action, purpose, and/or result by a change in hand or motion of a hand, etc.
The term "hardware system" as used in the various embodiments of the present application may refer to a physical component comprising mechanical, optical, electrical, and magnetic devices such as integrated circuits (ICs) and printed circuit boards (PCBs) with computing, control, storage, input, and output functions. In various embodiments of the present application, the hardware system is also generally referred to as a motherboard (or mainboard), host chip, or controller.
Referring to fig. 1A, an application scenario diagram of a display device according to some embodiments of the present application is provided. As shown in fig. 1A, communication between the control apparatus 100 and the display device 200 may be performed in a wired or wireless manner.
The control apparatus 100 is configured to control the display device 200: it receives an operation instruction input by a user, converts the instruction into one the display device 200 can recognize and respond to, and mediates interaction between the user and the display device 200. For example, the display device 200 responds to channel-up and channel-down operations when the user operates the channel +/- keys on the control apparatus 100.
The control device 100 may be a remote controller 100A that communicates with the display apparatus 200 through infrared protocol communication, Bluetooth protocol communication, or other short-range communication modes, controlling the display apparatus 200 wirelessly or by other wired modes. The user may control the display device 200 by inputting user instructions through keys on the remote controller, voice input, control-panel input, and the like; for example, the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, and power key on the remote controller can issue the corresponding control instructions to the display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, or the like. For example, the display device 200 is controlled using an application running on a smart device. The application program, by configuration, can provide various controls to the user through an intuitive User Interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 100B may install a software application with the display device 200, implement connection communication through a network communication protocol, and achieve the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 100B may be caused to establish a control instruction protocol with the display device 200 to implement functions such as physical buttons arranged by the remote controller 100A by operating various function keys or virtual controls of a user interface provided on the mobile terminal 100B. The audio/video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
The display device 200 may provide a broadcast receiving function and a computer-supported network television function. The display device may be implemented as a digital television, a web television, an Internet Protocol television (IPTV), or the like.
The display device 200 may be a liquid crystal display, an organic light emitting display, a projection device. The specific display device type, size, resolution, etc. are not limited.
The display device 200 is also in data communication with the server 300 via a variety of communication means. Display device 200 may be permitted to communicate via a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 300 may provide various contents and interactions to the display device 200. By way of example, the display device 200 may send and receive information, such as: receiving Electronic Program Guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The servers 300 may be one group, may be multiple groups, and may be one or more types of servers. Other web service content such as video on demand and advertising services are provided through the server 300.
In some embodiments, as shown in FIG. 1B, the display device 200 includes a controller 250, a display 275, a terminal interface extending from a gap in the back plate, and a rotating assembly 276 connected to the back plate; the rotating assembly 276 can rotate the display 275. Viewed from the front of the display device, the rotating assembly 276 can rotate the screen to the portrait display orientation, in which the vertical side of the screen is longer than the horizontal side, or to the landscape display orientation, in which the horizontal side is longer than the vertical side.
A block diagram of the configuration of the control apparatus 100 is exemplarily provided in fig. 2. As shown in fig. 2, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, a user output interface 150, and a power supply 160.
The controller 110 includes a random access memory (RAM) 111, a read-only memory (ROM) 112, a processor 113, a power-on interface, and a communication bus. The controller 110 controls the running and operation of the control apparatus 100, the communication and cooperation among its internal components, and external and internal data processing functions.
For example, when an interaction in which a user presses a key arranged on the remote controller 100A or an interaction in which a touch panel arranged on the remote controller 100A is touched is detected, the controller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.
The memory 120 stores various operating programs, data and applications of the driving and controlling apparatus 100 under the control of the controller 110. The memory 120 may store various control signal instructions input by a user.
The communicator 130 communicates control signals and data signals with the display device 200 under the control of the controller 110. For example, the control apparatus 100 sends a control signal (such as a touch signal or a key signal) to the display device 200 via the communicator 130, and may likewise receive signals sent by the display device 200 via the communicator 130. The communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132. When the infrared signal interface is used, a user input instruction is converted into an infrared control signal according to an infrared control protocol and sent to the display device 200 through the infrared sending module. As another example, when the radio frequency signal interface is used, the user input instruction is converted into a digital signal, modulated according to a radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmission terminal.
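As one concrete illustration of converting a user instruction into an infrared frame, the widely used NEC consumer-IR protocol packs an 8-bit device address and an 8-bit command together with their bitwise inverses for error checking. The patent does not specify which infrared control protocol is used, so this is only an example, and the key codes below are made up.

```python
def nec_frame(address: int, command: int) -> bytes:
    """Build the 32-bit NEC IR frame: address, ~address, command,
    ~command (each 8 bits). The receiver validates the frame by
    checking that each inverted byte matches its counterpart."""
    inv = lambda b: (~b) & 0xFF
    return bytes([address & 0xFF, inv(address),
                  command & 0xFF, inv(command)])

# Hypothetical mapping from remote keys to NEC command bytes:
KEYMAP = {"volume_up": 0x10, "volume_down": 0x11, "power": 0x0C}
frame = nec_frame(0x04, KEYMAP["volume_up"])
```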
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, etc., so that a user may input user instructions regarding controlling the display apparatus 200 to the control device 100 through voice, touch, gesture, press, etc.
The user output interface 150 outputs a user instruction received by the user input interface 140 to the display device 200 or outputs an image or voice signal received by the display device 200. Here, the user output interface 150 may include an LED interface 151, a vibration interface 152 generating vibrations, a sound output interface 153 outputting sound, a display 154 outputting an image, and the like. For example, the remote controller 100A may receive an output signal of audio, video, or data from the user output interface 150, and display the output signal as an image on the display 154, as an audio at the sound output interface 153, or as a vibration at the vibration interface 152.
The power supply 160 provides operating power support for the various elements of the control device 100 under the control of the controller 110, and may take the form of a battery and associated control circuitry.
A hardware configuration block diagram of the display device 200 is exemplarily provided in fig. 3. As shown in fig. 3, a modem 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, a rotating assembly 276, an audio processor 280, an audio output interface 285, a power supply 290 may be included in the display apparatus 200.
Wherein the rotating assembly 276 may include a drive motor, a rotating shaft, etc. The driving motor may be connected to the controller 250, and the controller 250 outputs a rotation angle under control; one end of the rotating shaft is connected to a power output shaft of the driving motor, and the other end is connected to the display 275, so that the display 275 can be fixedly mounted on a wall or a bracket through the rotating assembly 276.
The rotating assembly 276 may also include other components, such as a transmission component, a detection component, and the like. Wherein, the transmission component can adjust the rotation speed and torque output by the rotating component 276 through a specific transmission ratio, and can be in a gear transmission mode; the detection means may be constituted by a sensor provided on the rotation shaft, such as an angle sensor, an attitude sensor, or the like. These sensors may detect parameters such as the angle at which the rotating assembly 276 rotates and send the detected parameters to the controller 250 to enable the controller 250 to determine or adjust the status of the display device 200 based on the detected parameters. In practice, the rotating assembly 276 may include, but is not limited to, one or more of the components described above.
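A minimal sketch of how the controller might classify the display posture from the shaft angle reported by the detection component; the 0-degree landscape / 90-degree portrait convention and the 5-degree tolerance are assumptions for illustration, not values from the patent.

```python
def display_state(angle_deg: float, tol: float = 5.0) -> str:
    """Classify the display posture from the detected rotation angle.

    0 (or 180) degrees is taken as landscape and 90 (or 270) degrees
    as portrait; anything outside the tolerance band is treated as
    still rotating, during which (per the embodiments above) layer
    composition is routed to the HWC hardware."""
    a = angle_deg % 360
    if min(a % 180, 180 - (a % 180)) <= tol:
        return "landscape"
    if abs((a % 180) - 90) <= tol:
        return "portrait"
    return "rotating"
```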
The modem 210 receives broadcast television signals in a wired or wireless manner and may perform processes such as amplification, mixing, and resonance in order to demodulate, from among the many wireless or wired broadcast television signals, the audio/video signal carried on the frequency of the television channel selected by the user, together with additional information (e.g., EPG data).
Under the control of the controller 250, the tuning demodulator 210 responds to the frequency of the television channel selected by the user and the television signal carried on that frequency.
The tuning demodulator 210 can receive signals in various ways according to broadcasting systems of television signals, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; according to different modulation types, a digital modulation mode or an analog modulation mode can be adopted; and the analog signal and the digital signal can be demodulated according to the kind of the received television signal.
In other exemplary embodiments, the modem 210 may also be in an external device, such as an external set-top box or the like. In this way, the set-top box outputs a television signal after modulation and demodulation, and inputs the television signal to the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with external devices or external servers according to various types of communication protocols. For example, the display device 200 may transmit content data to an external device connected via the communicator 220, or browse and download content data from such a device. The communicator 220 may include network communication protocol modules or near field communication protocol modules such as a WiFi module 221, a Bluetooth module 222, and a wired Ethernet module 223, so that the communicator 220 can, under the control of the controller 250, receive control signals from the control device 100 in the form of WiFi signals, Bluetooth signals, radio frequency signals, etc.
The detector 230 is a component the display device 200 uses to collect signals from the external environment or from interaction with the outside. The detector 230 may include a sound collector 231, such as a microphone, which may be used to receive the user's sound, such as a voice signal of a control instruction by which the user controls the display device 200; alternatively, it may collect the ambient sound used to identify the type of ambient scene, so that the display device 200 can adapt to ambient noise.
In other exemplary embodiments, the detector 230 may further include an image collector 232, such as a camera or webcam, which may be used to collect external environmental scenes so as to adaptively change the display parameters of the display device 200, or to collect user attributes or user gestures so as to realize interaction between the display device and the user.
In other exemplary embodiments, the detector 230 may further include a light receiver for collecting ambient light intensity to adapt to changes in display parameters of the display device 200, etc.
In other exemplary embodiments, the detector 230 may further include a temperature sensor; by sensing the ambient temperature, the display device 200 may adaptively adjust the display color temperature of the image. Illustratively, when the ambient temperature is high, the display device 200 may be adjusted to display the image in a cooler color tone; when the ambient temperature is low, the display device 200 may be adjusted to display the image in a warmer color tone.
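The temperature-to-tone adjustment described above can be sketched as a simple mapping. This is a minimal illustration, not the embodiment's actual implementation; the threshold values and tone names are assumptions chosen for the example.

```python
def adjust_color_temperature(ambient_temp_c, warm_threshold=28.0, cool_threshold=15.0):
    """Map an ambient temperature reading to a display color-temperature tone.

    Thresholds (in degrees Celsius) are illustrative; a real device would
    tune them for its environment and panel.
    """
    if ambient_temp_c > warm_threshold:
        return "cool"     # high ambient temperature -> cooler image tone
    if ambient_temp_c < cool_threshold:
        return "warm"     # low ambient temperature -> warmer image tone
    return "neutral"      # comfortable range -> no adjustment
```

For instance, a reading of 30 °C would select the cooler tone, while 10 °C would select the warmer one.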
The external device interface 240 is a component that provides the controller 250 to control data transmission between the display apparatus 200 and an external device. The external device interface 240 may be connected to an external device such as a set-top box, a game device, a notebook computer, etc., in a wired/wireless manner, and may receive data such as a video signal (e.g., a moving image), an audio signal (e.g., music), additional information (e.g., an EPG), etc., of the external device.
The external device interface 240 may include: any one or more of a High Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not provided in the figure), a Red Green Blue (RGB) terminal (not provided in the figure), and the like.
The controller 250 controls the operation of the display device 200 and responds to the user's operations by running various software control programs (e.g., an operating system and various application programs) stored on the memory 260.
As shown in fig. 3, the controller 250 includes a Random Access Memory (RAM) 251, a Read Only Memory (ROM) 252, a graphics processor 253, a processor 254, a power-on interface 255, and a communication bus 256. The RAM 251, ROM 252, graphics processor 253, processor 254, and power-on interface 255 are connected by the communication bus 256.
The ROM 252 stores various system boot instructions. When a power-on signal is received, the display device 200 starts up, and the processor 254 executes the system boot instructions in the ROM 252 and copies the operating system stored in the memory 260 into the RAM 251 to begin running the operating system. After the operating system has started, the processor 254 copies the various applications in the memory 260 into the RAM 251 and then starts running them.
The graphic processor 253 generates various graphic objects such as icons, operation menus, and user input instruction display graphics, etc. The graphic processor 253 may include an operator for performing an operation by receiving user input of various interactive instructions, thereby displaying various objects according to display attributes; and a renderer for generating various objects based on the operator, and displaying the result of rendering on the display 275.
The processor 254 executes operating system and application program instructions stored in the memory 260, and processes various applications, data, and content according to received user input instructions, so as to finally display and play various audio and video content.
In some exemplary embodiments, the processor 254 may include a plurality of processors, for example one main processor and one or more sub-processors. The main processor performs some initialization operations of the display device 200 in a display device preloading mode and/or the operation of displaying pictures in the normal mode; the sub-processor(s) perform operations while the display device is in standby mode and the like.
The power-on interface 255 may include a first interface to an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
The object may be any selectable object, such as a hyperlink or an icon. The operation related to the selected object may be, for example, an operation of displaying the page, document, or image linked to by a hyperlink, or an operation of executing the program corresponding to the object. The user input command for selecting the GUI object may be a command input through various input means (e.g., mouse, keyboard, touch pad, etc.) connected to the display device 200, or a voice command corresponding to a voice uttered by the user.
The memory 260 is used to store various types of data, software programs, or applications that drive and control the operation of the display device 200. Memory 260 may include volatile and/or nonvolatile memory. And the term "memory" includes memory 260, RAM251 and ROM252 of controller 250, or a memory card in display device 200.
In some embodiments, the memory 260 is specifically configured to store an operating program that drives the controller 250 in the display device 200; various application programs built in the display device 200 and downloaded from an external device by a user are stored; data for configuring various GUIs provided by the display 275, various objects related to the GUIs, visual effect images of selectors for selecting GUI objects, and the like are stored.
In some embodiments, the memory 260 is specifically configured to store drivers and related data for the modem 210, the communicator 220, the detector 230, the external device interface 240, the video processor 270, the display 275, the audio processor 280, etc., such as external data (e.g., audio-visual data) received from the external device interface or user data (e.g., key information, voice information, touch information, etc.) received from the user interface.
In some embodiments, memory 260 specifically stores software and/or programs for representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, as well as other program implemented functions (e.g., middleware, APIs, or application programs); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to implement control or management of system resources.
An architectural configuration block diagram of the operating system in the memory of the display device 200 is provided by way of example in fig. 4. The operating system architecture is an application layer, a middleware layer and a kernel layer in sequence from top to bottom.
Application layer: applications built into the system and non-system applications both belong to the application layer, which is responsible for direct interaction with the user. The application layer may include a plurality of applications, such as a settings application, an electronic poster application, a media center application, and the like. These applications may be implemented as Web applications that execute on a WebKit engine, and in particular may be developed and executed based on HTML5, Cascading Style Sheets (CSS), and JavaScript.
Here, HTML, whose full name is HyperText Markup Language, is the standard markup language for creating web pages. Web pages are described by markup tags, which are used to describe text, graphics, animations, sounds, tables, links, etc.; a browser reads the HTML document, interprets the content of the tags within it, and presents it in the form of a web page.
CSS, collectively referred to as cascading style sheets (Cascading Style Sheets), is a computer language used to represent the style of HTML files and may be used to define style structures such as fonts, colors, positions, and the like. The CSS style can be directly stored in an HTML webpage or a separate style file, so that the control of the style in the webpage is realized.
JavaScript is a language applied to Web page programming that can be inserted into HTML pages and interpreted by a browser. The interaction logic of a Web application is implemented through JavaScript, and communication with the kernel layer can be realized by encapsulating a JavaScript extension interface through the browser.
Middleware layer: provides some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as the Multimedia and Hypermedia information coding Experts Group (MHEG) middleware related to data broadcasting, as the DLNA middleware related to communication with external devices, or as middleware providing the browser environment in which applications within the display device run, and the like.
A kernel layer providing core system services such as: file management, memory management, process management, network management, system security authority management and other services. The kernel layer may be implemented as a kernel based on various operating systems, such as a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware at the same time, providing device driver services for various hardware, such as: providing a display driver for a display, providing a camera driver for a camera, providing a key driver for a remote control, providing a WIFI driver for a WIFI module, providing an audio driver for an audio output interface, providing a Power Management (PM) module with a power management driver, and the like.
In fig. 3, the user interface 265 receives various user interactions. Specifically, it transmits the user's input signal to the controller 250, or transmits an output signal from the controller 250 to the user. Illustratively, the remote control 100A may send input signals such as a power switch signal, a channel selection signal, or a volume adjustment signal input by the user to the user interface 265, which forwards them to the controller 250; alternatively, the remote control 100A may receive an output signal such as audio, video, or data that has been processed by the controller 250 and output through the user interface 265, and display the received output signal or output it in the form of audio or vibration.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the display 275, and the user interface 265 receives the user input command through the GUI. In particular, the user interface 265 may receive user input commands for controlling the position of a selector in a GUI to select different objects or items. Wherein a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user, which enables conversion between an internal form of information and a user-acceptable form. A commonly used presentation form of the user interface is a graphical user interface (graphic user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a control, a menu, a tab, a text box, a dialog box, a status bar, a channel bar, a Widget, etc.
Alternatively, the user may enter a user command by entering a particular sound or gesture, and the user interface 265 recognizes the sound or gesture through the sensor to receive the user input command.
The video processor 270 is configured to receive an external video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image composition according to a standard codec protocol of an input signal, so as to obtain a video signal that is directly displayed or played on the display 275.
By way of example, video processor 270 includes a demultiplexing module, a video decoding module, an image compositing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module demultiplexes the input audio/video data stream, such as an input MPEG-2 stream (based on the compression standard for digital storage media moving images and audio), into video signals, audio signals, and the like.
And the video decoding module is used for processing the demultiplexed video signal, including decoding, scaling and the like.
The image synthesis module, such as an image synthesis thread, superimposes and mixes the GUI signal input by the user or generated by the graphics generator with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module converts the frame rate of the input video, for example converting an input 60 Hz video to a frame rate of 120 Hz or 240 Hz, commonly implemented by frame insertion.
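As a minimal sketch of the frame-insertion approach mentioned above, the following doubles a frame rate by repeating frames. Real frame rate conversion hardware typically interpolates new intermediate frames rather than repeating existing ones; the function name and the list-of-frames representation are assumptions for illustration.

```python
def convert_frame_rate(frames, factor):
    """Raise the frame rate by an integer factor via simple frame
    repetition, the plainest form of frame insertion."""
    if factor < 1:
        raise ValueError("factor must be >= 1")
    out = []
    for frame in frames:
        out.extend([frame] * factor)  # each source frame appears `factor` times
    return out

# Converting a 60 Hz clip to 120 Hz doubles the frame count.
clip_60 = ["f0", "f1", "f2"]
clip_120 = convert_frame_rate(clip_60, 2)
```

A 240 Hz target would use `factor=4`; interpolation-based converters replace the repeated copies with synthesized in-between frames.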
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format such as a display, for example, format converting the signal output by the frame rate conversion module to output an RGB data signal.
The display 275 receives image signals from the video processor 270 and displays video content, images, and the menu manipulation interface. The displayed video content may come from the broadcast signal received by the modem 210, or from video content input through the communicator 220 or the external device interface 240. The display 275 also displays the user manipulation interface (UI) generated in the display device 200 and used to control the display device 200.
And, the display 275 may include a display screen component for rendering the page and a drive component to drive the display of the image. Alternatively, if the display 275 is a projection display, a projection device and a projection screen may be included.
Rotating assembly 276: the controller 250 may issue a control signal to cause the rotating assembly 276 to rotate the display 275.
The audio processor 280 is configured to receive an external audio signal, decompress and decode it according to the standard codec of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification, so as to obtain an audio signal that can be played through the speaker 286.
Illustratively, the audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, advanced Audio Coding (AAC), high efficiency AAC (HE-AAC), etc.
The audio output interface 285 receives the audio signal output by the audio processor 280 under the control of the controller 250. The audio output interface 285 may include a speaker 286, or an external audio output terminal 287, such as a headphone output terminal, for output to the sound-producing device of an external apparatus.
In other exemplary embodiments, video processor 270 may include one or more chip components. Audio processor 280 may also include one or more chip components.
And, in other exemplary embodiments, video processor 270 and audio processor 280 may be separate chips or integrated with controller 250 in one or more chips.
The power supply 290 provides power supply support for the display device 200 with power input from an external power source, under the control of the controller 250. The power supply 290 may be a built-in power supply circuit mounted inside the display device 200, or a power supply mounted outside the display device 200.
Because the display device 200 provided by the application comprises the display 275 and the rotating assembly 276, the rotating assembly 276 can drive the display 275 to rotate, so that the display 275 is in different display directions. Thus, in one implementation, the presentation directions may include a landscape presentation direction and a portrait presentation direction. The horizontal screen display direction refers to a display direction in which the length (width) of the display 275 in the horizontal direction is greater than the length (height) of the display 275 in the vertical direction when viewed from the front of the display 275; the vertical screen display direction refers to a display direction in which the length (width) of the display 275 in the horizontal direction is smaller than the length (height) of the display 275 in the vertical direction when viewed from the front of the display 275.
Obviously, "vertical" in this application means substantially vertical, and "horizontal" likewise means substantially horizontal, depending on the mounting/placement position of the display device 200. The horizontal screen display direction is mainly used for displaying landscape media assets such as TV dramas and movies, as shown in fig. 5A. The mode of operation when the display 275 is in the horizontal screen display direction may be referred to as the landscape mode, and the mode of operation when the display 275 is in the vertical screen display direction may be referred to as the portrait mode. The controller 250 in the display device 200 is further communicatively connected to the server 300 for invoking an interface of the server 300 to obtain the corresponding data. The display 275 in the display device 200 can be rotated by the rotating assembly 276 and is used to display a user interface. In practical applications, the user may control the play mode, play content, etc. of the display device 200 through the control apparatus 100, where the play modes include a horizontal screen media asset viewing mode and a vertical screen media asset viewing mode.
The vertical screen display direction is mainly used for displaying vertical media such as short videos and cartoon pictures, as shown in fig. 5B. In the vertical display direction, the display 275 may display a user interface corresponding to the vertical display direction and possess an interface layout and an interaction manner corresponding to the vertical display direction. In the vertical screen media asset viewing mode, the user can view short video, cartoon and other vertical screen media assets. Similarly, since the controller 250 in the display device 200 is further communicatively connected to the server 300, the media data corresponding to the vertical screen can be obtained by calling the interface of the server 300 when the vertical screen displays the direction.
The vertical screen display direction is more suitable for playing pages with aspect ratios such as 9:16, for example short videos shot by terminals such as mobile phones. Since terminal devices such as mobile phones adopt aspect ratios such as 9:16 or 9:18, when such a terminal is connected to the display device 200 and its page is displayed by the display device 200, the vertical screen display direction can avoid excessive scaling of the page, make full use of the display area of the display 275, and provide a better user experience.
It should be noted that the horizontal screen display direction and the vertical screen display direction are merely two different display directions of the display 275 and do not limit the displayed content; for example, vertical media assets such as short videos and cartoons can still be displayed in the horizontal screen display direction, and landscape media assets such as TV dramas and movies can still be displayed in the vertical screen display direction. Only display windows that do not fit the current display direction need to be compressed and adjusted.
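The scaling argument above can be made concrete: a uniform "fit" scale preserves aspect ratio, and a 9:16 page fills a portrait display exactly, while the same page on a landscape display must be shrunk heavily. The helper below is illustrative only; the resolutions used are common example values, not values stated by the embodiment.

```python
def fit_scale(src_w, src_h, dst_w, dst_h):
    """Uniform scale factor that fits a source page inside a display
    while preserving its aspect ratio (letterbox fit)."""
    return min(dst_w / src_w, dst_h / src_h)

# A 9:16 phone page (1080x1920) on a portrait 1080x1920 display: no scaling.
portrait_scale = fit_scale(1080, 1920, 1080, 1920)

# The same page on a landscape 1920x1080 display must shrink to ~56%.
landscape_scale = fit_scale(1080, 1920, 1920, 1080)
```

This is why the vertical screen display direction makes full use of the panel for phone-style pages, whereas the horizontal direction wastes most of it.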
When using the display device 200, the user can adjust the display direction of the display 275 according to viewing needs. For example, after the user issues a rotation command through a rotation key on the control device 100, by selecting a rotation option on the UI interface, or by inputting a "rotation"-related voice command through the voice system, the controller 250 controls the rotating assembly 276 to rotate according to the rotation command, thereby driving the display 275 to rotate. For instance, when the user wants to watch a short video through the display device 200, the user may input a rotation command in one of the above ways, so that the display 275 rotates counterclockwise by 90 degrees from the horizontal screen display direction to the vertical screen display direction, thereby adapting to the image scale of vertical applications such as short video.
In the display device, the HWC (HWComposer, i.e. Hardware Composer, which may be referred to as the image composition thread in this embodiment) is a HAL-layer module in Android for window (Layer) composition and display; its implementation depends on the specific device, and it provides hardware support for the SurfaceFlinger service (which may be referred to as the data distribution thread in this embodiment). SurfaceFlinger is a stand-alone Service that receives the Surfaces of all Windows as input, calculates the position of each Surface (also referred to as layer data in this embodiment) in the final image based on their Z-order, transparency, size, position, etc., and then passes them to the HWC hardware or the graphics processor (Graphics Processing Unit, GPU) to generate the final display Buffer (also referred to as the presentation image / first image / second image in this embodiment), which is then displayed on a particular display device.
When SurfaceFlinger synthesizes Layers using the graphics processor, it occupies and consumes GPU resources, and while it does so the application cannot use the GPU to render its own output. Performing layer composition through the HWC hardware device can relieve the composition pressure on the GPU, so some display devices synthesize layers through the HWC. However, synthesizing layers with the HWC hardware requires creating a new process (the image composition thread) and transmitting the presentation data received by SurfaceFlinger to that new process for composition, which adds the performance cost of data transmission. How to reasonably use the HWC hardware or the GPU to synthesize the Buffer is therefore a problem to be solved.
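The GPU-versus-HWC trade-off can be sketched as a per-layer assignment decision, loosely modeled on how a compositor plans composition. The layer representation, the `hwc_max_layers` limit, and the "HWC"/"GPU" labels are assumptions for illustration, not the embodiment's actual interface.

```python
def plan_composition(layers, hwc_max_layers=4):
    """Assign each layer to HWC (device) or GPU (client) composition.

    hwc_max_layers models a per-frame hardware limit and is purely
    illustrative; real HWC implementations report their own capabilities.
    Layers beyond the limit fall back to GPU composition, which is the
    pressure the embodiment aims to balance.
    """
    ordered = sorted(layers, key=lambda layer: layer["z"])  # back-to-front
    return {layer["name"]: ("HWC" if i < hwc_max_layers else "GPU")
            for i, layer in enumerate(ordered)}
```

With five layers and a limit of four, the topmost layer overflows to GPU composition while the rest stay on the HWC.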
In order to solve the problems in the prior art, an embodiment of the present application provides a display device. Referring to fig. 6, which is a schematic diagram of the display device according to a feasible embodiment, the display device includes at least a display, a rotating assembly, and a controller, where the controller includes: a central processor (corresponding to the processor 254 described above), a graphics processor 253, and HWC hardware 257. The central processor is configured to execute a data distribution thread and an image composition thread; an interaction flow diagram of the two threads is shown in fig. 7. Specifically:
the data distribution thread is configured to execute step S101 to receive presentation data;
in this embodiment, the presentation data may be, but is not limited to, video data, advertisement data, page data, etc.; any data that can be displayed on the display after rendering may serve as presentation data.
The display device shown in the embodiment of the application can support multi-window playing, and each window can display a separate page. Each page may present an image corresponding to at least one layer of data. Thus, presentation data received by a data distribution thread may include multiple layers of data.
For example, in a feasible embodiment, the display device plays two video streams simultaneously; the presentation data received by the data distribution thread accordingly includes two pieces of layer data, and the final displayed interface is shown in fig. 8. FIG. 8 is a schematic diagram of a display presentation interface according to a feasible embodiment. The present application does not limit the number of pieces of layer data included in the presentation data.
In this application, each piece of layer data is configured with attribute information, which may be, but is not limited to, parameters such as the size and position of the layer data. The resources occupied by transmitting the attribute information are far smaller than those required to transmit the presentation data.
The data distribution thread is configured to perform step S102 of outputting attribute information;
the data distribution thread may transfer all attribute information to the image composition thread in the form of a list. For example, in a feasible embodiment, the presentation data includes 3 layers of data, and the corresponding attribute list can be referred to in table 1:
TABLE 1

| Sequence number | Layer data | Attribute information |
|---|---|---|
| 1 | Layer data 1 | Attribute information 1 |
| 2 | Layer data 2 | Attribute information 2 |
| 3 | Layer data 3 | Attribute information 3 |
Alternatively, the data distribution thread may transfer all attribute information to the image composition thread in the form of a folder.
In practical application, the attribute information may also be transferred in other ways; the applicant imposes no undue limitation here.
In response to receiving the attribute information, the image synthesis thread executes step S103 to read the rotation angle of the display;
the process of reading the rotation angle may be, but not limited to, that the image composition thread reads the corresponding rotation angle in the attribute database, or that the image composition thread directly reads the rotation angle of the display.
If the rotation angle is equal to 0, the image synthesis thread executes step S1041 to label attribute information according to a preset rule;
In some feasible embodiments, the preset rule may be determined by the hardware capability behind the image synthesis thread. For example, in a feasible embodiment, the image synthesis thread renders one layer per frame, which best guarantees the running smoothness of the whole display device. In response to receiving the attribute list (attribute information), the image synthesis thread tags the first attribute information in the attribute list with the second identification value and tags the remaining attribute information in the attribute list with the first identification value. The annotated attribute information may be seen in Table 2.
TABLE 2

| Sequence number | Layer data | Attribute information | Identification value |
|---|---|---|---|
| 1 | Layer data 1 | Attribute information 1 | Second identification value |
| 2 | Layer data 2 | Attribute information 2 | First identification value |
| 3 | Layer data 3 | Attribute information 3 | First identification value |
It should be noted that this embodiment merely illustrates one kind of preset rule; in practical application, the preset rule may be configured according to the hardware capability of the HWC, and the applicant imposes no undue limitation here.
In some feasible embodiments, the preset rules may be: when the rotation angle is equal to 0, all the attribute information is marked with a first identification value, and the marked attribute information can be referred to in table 3.
TABLE 3

| Sequence number | Layer data | Attribute information | Identification value |
|---|---|---|---|
| 1 | Layer data 1 | Attribute information 1 | First identification value |
| 2 | Layer data 2 | Attribute information 2 | First identification value |
| 3 | Layer data 3 | Attribute information 3 | First identification value |
If the rotation angle is greater than 0, the image synthesis thread executes step S1042 to label all attribute information with the second identification value;
when the rotation angle is greater than 0, the noted attribute information may be referred to in table 4.
TABLE 4

| Sequence number | Layer data | Attribute information | Identification value |
|---|---|---|---|
| 1 | Layer data 1 | Attribute information 1 | Second identification value |
| 2 | Layer data 2 | Attribute information 2 | Second identification value |
| 3 | Layer data 3 | Attribute information 3 | Second identification value |
It should be noted that the present embodiment does not limit the kinds of the first identification value and the second identification value. In the practical application process, the designer can configure the first identification value and the second identification value according to the requirement.
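The labeling rules of steps S1041 and S1042 can be sketched together. The concrete identifier strings and the `hwc_renders_first` switch (selecting between the Table 2 rule and the Table 3 rule at rotation angle 0) are assumptions for illustration; as the embodiment notes, the actual identification values are left to the designer.

```python
FIRST_ID = "first"    # e.g. marks a layer the GPU will draw
SECOND_ID = "second"  # e.g. marks a layer the HWC hardware will compose
# The string values above are illustrative stand-ins only.

def label_attributes(attribute_list, rotation_angle, hwc_renders_first=True):
    """Tag attribute entries per steps S1041/S1042.

    rotation_angle == 0: under one preset rule, tag the first entry with
    SECOND_ID and the rest with FIRST_ID (Table 2); under another, tag
    everything FIRST_ID (Table 3). rotation_angle > 0: tag everything
    SECOND_ID (Table 4).
    """
    if rotation_angle > 0:
        return [(attr, SECOND_ID) for attr in attribute_list]
    return [(attr, SECOND_ID if (hwc_renders_first and i == 0) else FIRST_ID)
            for i, attr in enumerate(attribute_list)]
```

Running it on a three-entry attribute list reproduces the patterns of Tables 2, 3, and 4 depending on the arguments.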
The data distribution thread executes step S105: in response to receiving the tagged attribute information, the GPU is called to draw the first layer data into a first image. The first layer data is the layer data corresponding to the attribute information tagged with the first identification value; the second layer data is the layer data corresponding to the attribute information tagged with the second identification value.
In some feasible embodiments, the data distribution thread may wait until the first image is completed and then send the first image together with the second layer data to the image composition thread. In other feasible embodiments, the data distribution thread may send the second layer data to the image composition thread as soon as the tagged attribute information is received, so that GPU composition of the first image and HWC hardware composition of the second image proceed in parallel.
The second image may be synthesized by a method commonly used in the art, which is not described in detail here. In this embodiment, the size of the first image is less than or equal to the display range of the display.
The drawn first image will be described below with reference to specific examples. Taking the tagged attribute information shown in Table 3 as an example, the first image may refer to FIG. 9A, where image 2 is the layer drawn from layer data 2, image 3 is the layer drawn from layer data 3, and the dashed frame in the figure is the display range of the display.
In a feasible embodiment, the tagged attribute information includes only one attribute information, which is tagged with the first identification value; the first image drawn by the GPU may refer to FIG. 9B.
In a feasible embodiment, the tagged attribute information includes only two attribute information, both tagged with the first identification value; the first image drawn by the GPU may refer to FIG. 9C.
In the scheme shown in this embodiment of the application, the data distribution thread calls the GPU to draw the first layer data into the first image in response to receiving the tagged attribute information, and then transfers the first image to the image composition thread. Compared with a transmission mode in which the data distribution thread directly transmits all the display data to the image composition thread, this mode reduces the performance consumed by transmission.
For example, in the transmission mode in which the data distribution thread directly transmits all display data to the image composition thread: in a feasible embodiment, the data distribution thread receives 4 channels of 1024×1080 video, and for each frame of display image it must transmit 4 channels of 1024×1080 video data to the image composition thread.
In the transmission scheme shown in this embodiment: the data distribution thread receives the same 4 channels of 1024×1080 video, draws the 4 channels of video data into a single 1024×1080 first image, and then transmits only that first image to the image composition thread, which reduces the performance consumed by transmission.
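The saving can be quantified with a small back-of-the-envelope calculation. The 4-bytes-per-pixel figure (e.g. RGBA) is an assumption for illustration; the patent does not specify a pixel format.

```python
# Hedged sketch comparing the two transmission modes for one display frame.
# bytes_per_pixel=4 (RGBA) is an assumption, not stated in the patent.
def frame_bytes(width, height, bytes_per_pixel=4):
    """Size of one uncompressed frame buffer in bytes."""
    return width * height * bytes_per_pixel

# Direct mode: four 1024x1080 layers are transmitted for every frame.
direct_mode = 4 * frame_bytes(1024, 1080)
# This embodiment: the GPU pre-draws one 1024x1080 first image; only it is sent.
composed_mode = frame_bytes(1024, 1080)
```

Under these assumptions, the pre-composed first image carries one quarter of the data of the direct mode per frame.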
The image composition thread executes step S106A: in response to receiving the second layer data, invoking the HWC hardware to draw the second layer data into a second image;
or, step S106B: in response to receiving the second layer data and the first image, invoking the HWC hardware to draw the second layer data into a second image, and combining the first image and the second image to obtain a display image;
or, step S106C: in response to receiving the first image, controlling the display to show the first image.
In a feasible embodiment, the synthesis manner of the display image may be:
The HWC hardware is configured to perform step S11 of drawing the second layer data into a second image;
the second image may be synthesized by an image synthesizing method commonly used in the art, which is not described in detail herein.
The HWC hardware is configured to perform step S12 of synthesizing the first image and the second image into a frame of presentation image;
fig. 10 is a presentation image shown according to a possible embodiment, wherein the resolution of the presentation image is equal to the resolution of the display.
In response to completion of composition of the presentation image, the image composition thread is configured to perform step S13 of reading the rotation angle of the display;
One implementation of reading the rotation angle of the display may be: a memory inside the controller receives the rotation angle of the display in real time, and the image composition thread reads the rotation angle from the memory.
Another implementation may be: the memory inside the controller records the rotation rate of the display during rotation, and the image composition thread obtains the rotation angle from the rotation rate and the elapsed rotation time.
It should be noted that this embodiment merely gives two exemplary implementations of reading the rotation angle of the display; in practical application, the implementation may be, but is not limited to, the two ways described above.
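The second implementation (rotation rate multiplied by elapsed rotation time) can be sketched as follows. The class name and interface are hypothetical; the patent only describes the rate-times-time principle.

```python
import time

class RotationTracker:
    """Hedged sketch of the second implementation above: the controller's
    memory records the rotation rate, and the angle is derived as
    rate x elapsed rotation time."""

    def __init__(self, rate_deg_per_s, start=None):
        self.rate = rate_deg_per_s
        # Record when rotation began; injectable for testing.
        self.start = time.monotonic() if start is None else start

    def current_angle(self, now=None):
        """Rotation angle in degrees at time `now` (defaults to the real clock)."""
        if now is None:
            now = time.monotonic()
        return self.rate * (now - self.start)
```

For example, a display rotating at 30 degrees per second has turned 60 degrees after two seconds.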
If the rotation angle is equal to 0, the image composition thread executes step S14A to control the display to display the display image.
If the rotation angle is greater than 0, the image composition thread is further configured to perform step S14B to rotate the presentation image based on the rotation angle; and controlling the display to display the rotated display image.
For example, fig. 11 is a schematic diagram showing image variation during rotation of the display (corresponding to the case where the rotation angle is greater than 0) according to a possible embodiment. It can be seen that the presentation image always matches the viewing angle direction of the user.
In a feasible embodiment, the synthesis manner of the display image may alternatively be:
The image composition thread is configured to perform step S21 of reading the rotation angle of the display;
the implementation manner of reading the rotation angle of the display may refer to the above embodiment, and will not be described herein.
If the rotation angle is equal to 0, the image composition thread is configured to execute step S22A to control the display to display the display image;
If the rotation angle is greater than 0, the HWC hardware is invoked to execute step S22B to draw the second layer data based on the rotation angle to obtain the second image.
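The branch in steps S21/S22A/S22B can be sketched as a small dispatch function. The function signature and the callable passed in for the HWC drawing step are hypothetical stand-ins for the hardware path:

```python
# Hedged sketch of steps S21/S22A/S22B: read the rotation angle first;
# when static, show the already-composed display image (S22A); when
# rotating, have the HWC redraw the second layer data for that angle (S22B).
# `hwc_draw` stands in for the HWC hardware path and is an assumption.
def compose_for_display(rotation_angle, display_image, second_layer_data, hwc_draw):
    if rotation_angle == 0:
        return display_image                            # S22A
    return hwc_draw(second_layer_data, rotation_angle)  # S22B
```

A caller would supply the real hardware-composition routine as `hwc_draw`; the sketch only captures the control flow.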
The following describes in detail the operation procedure of the display device shown in the embodiment of the present application with reference to specific embodiments:
example 1
In a feasible embodiment, a user uses the display device to conduct a video chat with two friends. The user enables the split-screen function of the display device so that the display can show the conversation interfaces of the user, friend A and friend B. In response to the user enabling the split-screen function, the data distribution thread receives display data, which in this embodiment includes: the local conversation video (layer data A), the conversation video of friend A (layer data B), and the conversation video of friend B (layer data C). The data distribution thread sends the attribute information of layer data A, layer data B and layer data C to the image composition thread. In response to receiving the attribute information, the image composition thread reads the rotation angle of the display. In this application, the rotation angle of the display is equal to 0; the image composition thread tags the attribute information of layer data A with the second identification value and tags the attribute information of layer data B and layer data C with the first identification value. The image composition thread transmits the tagged attribute information to the data distribution thread. In response to receiving the tagged attribute information, the data distribution thread calls the GPU to draw layer data B and layer data C into a first image, and outputs the first image and layer data A to the image composition thread. In response to receiving the first image and layer data A, the image composition thread calls the HWC hardware to draw layer data A into a second image, and combines the first image and the second image into one frame of display image.
The image synthesis thread reads the rotation angle of the display again, wherein the rotation angle of the display is equal to 0, and the image synthesis thread directly outputs the display image to the display.
Example 2
In a viable embodiment, the user utilizes the display device to conduct video chat with two friends. The user enables the split screen function of the display device so that the display can show the conversation interfaces of the user, friend a and friend B. In response to a user enabling the split screen function, the data distribution thread receives presentation data, which in this embodiment includes: local conversation video (layer data a), conversation video of friend a (layer data B), and conversation video of friend B (layer data C). The data distribution thread sends the attribute information of the layer data A, the attribute information of the layer data B and the attribute information of the layer data C to the image synthesis thread. The image composition thread reads the rotation angle of the display in response to receiving the attribute information. In the application, the rotation angle of the display is equal to 0, and the image synthesis thread marks a first identification value for attribute information of the layer data A, attribute information of the layer data B and attribute information of the layer data C. And the image synthesis thread transmits the marked attribute information to the data distribution thread. And the data distribution thread calls the GPU to draw the layer data A, the layer data B and the layer data C into a first image in response to the received marked attribute information. The data distribution thread outputs the first image to the image composition thread. In response to receiving the first image (presentation image), the image composition thread again reads the rotation angle of the display, wherein the rotation angle of the display is equal to 0, and the image composition thread directly outputs the presentation image to the display.
Example 3
In a viable embodiment, the user utilizes the display device to conduct video chat with two friends. The user enables the split screen function of the display device so that the display can show the conversation interfaces of the user, friend a and friend B. In response to a user enabling the split screen function, the data distribution thread receives presentation data, which in this embodiment includes: local conversation video (layer data a), conversation video of friend a (layer data B), and conversation video of friend B (layer data C). The data distribution thread sends the attribute information of the layer data A, the attribute information of the layer data B and the attribute information of the layer data C to the image synthesis thread. The image composition thread reads the rotation angle of the display in response to receiving the attribute information. In the application, the display is in a rotating state, the rotating angle is larger than 0, and the image synthesis thread marks the attribute information of the layer data A, the attribute information of the layer data B and the attribute information of the layer data C with a second identification value. And the image synthesis thread transmits the marked attribute information to the data distribution thread. And the data distribution thread responds to the received attribute information after labeling and sends the layer data A, the layer data B and the layer data C to the image synthesis thread. The image composition thread invokes the HWC hardware to render the received layer data a, layer data B, and layer data C into a second image (presentation image). 
In response to completion of the composition of the second image, the image composition thread reads the rotation angle of the display again, wherein the rotation angle of the display is greater than 0, and the image composition thread rotates the display image based on the rotation angle, so that the display image is matched with the viewing angle of the user, and the image composition thread displays the rotated display image.
The display device shown in this embodiment of the application comprises: a display; a rotating assembly connected with the display and used for driving the display to rotate under the control of the controller; and a controller. The controller includes: a graphics processor; HWC hardware; and a central processing unit for running a data distribution thread and an image composition thread. The image composition thread reads the rotation angle of the display in response to receiving the attribute information; if the rotation angle is equal to 0, the attribute information is tagged according to the preset rule; if the rotation angle is greater than 0, all attribute information is tagged with the second identification value. The data distribution thread calls the GPU to draw the first layer data into a first image in response to receiving the tagged attribute information, and outputs the first image and/or the second layer data, where the first layer data is the layer data corresponding to the first identification value and the second layer data is the layer data corresponding to the second identification value. In the display device shown in this embodiment, when the display is in a rotating state, the image composition thread tags all attribute information with the second identification value, so that the data distribution thread transmits the second layer data to the image composition thread.
The image composition thread can call the HWC hardware to draw the second layer data into a second image; this drawing does not occupy the GPU, which relieves the composition pressure on the GPU to a certain extent. When the display is in a static state, the data distribution thread can call the GPU to draw the first image and then transfer the first image to the image composition thread; the resources occupied by transmitting the first image are smaller than those occupied by transmitting all the display data, which relieves the transmission pressure on the CPU to a certain extent.
The embodiment of the present application further shows a display device suitable for playing video data, specifically referring to fig. 12, a central processing unit (corresponding to the above processor 254), a graphics processor 253 and HWC hardware 257, where the central processing unit is configured to execute a data distribution thread, an image synthesis thread, a video application and a decoding thread; the interactive flow chart of the data distribution thread, the image synthesis thread, the video application and the decoding thread can refer to fig. 13;
in response to receiving the video playing instruction, the video application executes step S201 to call the first interface to output a start message;
the video application creates a decoding thread instance, sets a playing film source and initializes the decoding thread. After the initialization of the decoding thread is completed, the video application sets a video display window for receiving video output data, namely, sets a surfaceveiway or surfaceexture control.
After receiving the SurfaceView control set by the video application, the decoding thread connects the SurfaceView control with its internal video decoding output interface (which may also be referred to as the second interface in this embodiment).
The video application invokes the prepare interface of the decoding thread (also referred to as the first interface in this embodiment) and sends a start message, which is used to inform the decoding thread to prepare before playing.
In response to receiving the start message, the decoding thread performs step S202 to read an initial angle of the display device;
The decoding thread shown in this embodiment of the application is connected to the rotating assembly, so that it can read the initial angle of the display. For example, in a feasible implementation, the display is initially in the landscape display state, and accordingly the initial angle may be 0 degrees. In another feasible implementation, the display is initially in the portrait display state, and accordingly the initial angle may be 90 degrees.
The decoding thread executes step S203 to write the initial angle into the attribute database;
The storage location of the attribute database is not limited in this embodiment of the application. Optionally, in a feasible embodiment, the attribute database may be stored in RAM to increase the read-write rate; after power-off, the data recorded in the attribute database is deleted, thereby reducing the amount of stored data.
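The RAM-backed attribute database can be sketched as a simple in-memory key-value store. The class and method names are illustrative; the patent only specifies that angles are written to and read from a RAM-resident database:

```python
class AttributeDatabase:
    """Hedged sketch of the RAM-stored attribute database: fast reads and
    writes, and the contents vanish on power-off because nothing is
    persisted. Interface names are assumptions, not from the patent."""

    def __init__(self):
        self._entries = {}

    def write(self, key, value):
        """E.g. the decoding thread writes the initial angle (step S203)."""
        self._entries[key] = value

    def read(self, key, default=None):
        """E.g. the image composition thread reads the rotation angle (step S212)."""
        return self._entries.get(key, default)
```

In this sketch, the decoding thread would call `write("initial_angle", …)` and the image composition thread would later call `read("rotation_angle")`.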
In response to completing the writing of the initial angle, the decoding thread executes step S204 to output feedback information;
In this application, the feedback information is used to inform the video application that the decoding thread has completed the initialization setting and the writing of the initial angle, so that the video application can proceed with subsequent work.
In response to receiving the feedback information, the video application executes step S205 to call the second interface to output the original data;
In this application, the original data refers to video data that has not been decoded. The video application transfers the original data to the decoding thread so that the decoding thread can decode it.
The video application executes step S206 to read the rotation angle of the display according to the set frequency;
the video application reads the rotation angle of the display according to the set frequency while transmitting the original data. In this application, the setting frequency may be set according to the requirement, and the applicant does not make any limitation here.
In a feasible embodiment, the set frequency is equal to the page refresh rate of the image composition thread. The video application can call a pre-stored page refresh rate, and the rotation angle of the display is read according to the page refresh rate, so that the display image drawn based on the rotation angle is matched with the display device.
For example, in some feasible embodiments the page refresh rate is 60 times per second; accordingly, the image composition thread may draw 60 display pages per second, and the video application may read 60 rotation angles per second and write the read rotation angles to the attribute database.
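Matching the sampling frequency to the page refresh rate can be illustrated with a short schedule computation (the 60 Hz value comes from the example above; the function itself is a hypothetical sketch):

```python
# Hedged sketch: when the set frequency equals the page refresh rate,
# the video application samples the rotation angle once per refresh
# period, so each drawn page has an angle reading of matching age.
PAGE_REFRESH_HZ = 60  # from the 60 times/second example above

def read_timestamps(duration_s, refresh_hz=PAGE_REFRESH_HZ):
    """Times (seconds) at which the rotation angle is read during playback."""
    period = 1.0 / refresh_hz
    return [i * period for i in range(round(duration_s * refresh_hz))]
```

Over one second of playback this yields exactly 60 readings, one per refresh period.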
The video application performs step S207 to write the rotation angle into the attribute database;
the decoding thread executes step S208 to decode the received original data to obtain video data;
The decoding may be performed in a manner conventional in the art, and is not limited here.
The decoding thread performs step S209 to output video data;
The decoding thread transmits the decoded video data to the data distribution thread.
The data distribution thread is configured to perform step S210 to receive video data;
The display device shown in this embodiment of the application supports multi-window playing, and each window can display an independent page; correspondingly, the data distribution thread can receive multiple channels of video data simultaneously. Each channel of video data is one layer of data, and each layer data is configured with attribute information;
the data distribution thread is configured to perform step S211 of outputting attribute information;
the data distribution thread may transfer all attribute information to the image composition thread in the form of a list.
In response to receiving the attribute information, the image composition thread performs step S212 to read the rotation angle of the display;
in this embodiment, the image composition thread reads the rotation angle in the attribute database.
If the rotation angle is equal to 0, the image synthesis thread executes step S213A to label attribute information according to a preset rule;
the implementation manner of labeling attribute information according to the preset rule may refer to the above embodiment, and will not be described herein.
If the rotation angle is greater than 0, the image synthesis thread executes step S213B to label the second identification value for all attribute information;
the image composition thread executes step S214 to invoke the GPU to draw the first layer data into the first image in response to receiving the annotated attribute information.
The drawing process of the first image may refer to the above embodiment, and will not be described herein.
The image composition thread performs step S215A: in response to receiving the second layer data, invoking the HWC hardware to draw the second layer data into a second image;
or, step S215B: in response to receiving the second layer data and the first image, invoking the HWC hardware to draw the second layer data into a second image, and combining the first image and the second image to obtain a display image;
or, step S215C: in response to receiving the first image, controlling the display to present the first image.
The synthesis process of the display image can be as follows: responsive to receiving the first image and/or the second layer data, the image composition thread renders the second layer data into a second image; synthesizing the first image and the second image into a frame of display image; reading a rotation angle of the display in response to completing the composition of the presentation image; and if the rotation angle is equal to 0, controlling the display to display the display image.
In a feasible embodiment, the image composition thread may read the rotation angle according to the frequency of page refreshing, so that the rotated display image matches the user viewing angle.
The mode of outputting the presentation image by the image synthesis thread may be:
If the rotation angle is equal to 0, the display image is output through the video layer; if the rotation angle is greater than 0, the display image is output through the OSD layer.
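The output-layer selection above reduces to a single conditional; the sketch below captures it (the string labels are illustrative stand-ins for the hardware plane handles):

```python
# Hedged sketch of the output routing rule: a static display image goes
# out via the video layer; while the display rotates, the image goes out
# via the OSD layer. Return values are illustrative labels, not real handles.
def select_output_layer(rotation_angle):
    return "video layer" if rotation_angle == 0 else "OSD layer"
```

This mirrors the rest of the scheme: the rotation angle alone decides the hardware path.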
In a feasible embodiment, the video application outputs an end message in response to completing the playing of the video;
in response to receiving the end message, the decoding thread releases the cached video data.
The embodiment of the present application further provides a display device; specifically, reference may be made to fig. 6, which is a schematic diagram of a display device shown in an embodiment of the present application. It can be seen that the display device at least includes a display, a rotating assembly, and a controller, and the controller includes: a central processor (corresponding to processor 254 described above), graphics processor 253 and HWC hardware 257. The central processor is configured to execute a data distribution thread and an image composition thread; the interaction flow diagram of the data distribution thread and the image composition thread may refer to fig. 14. Specifically:
the data distribution thread is configured to perform step S301 to receive presentation data;
The data distribution thread is configured to execute step S302 to output attribute information;
in response to receiving the attribute information, the image composition thread performs step S303 to read the rotation angle of the display;
if the rotation angle is equal to 0, the image synthesis thread executes step S3041 to label the first identification value for all attribute information;
if the rotation angle is greater than 0, the image synthesis thread executes step S3042 to label the second identification value for all attribute information;
the data distribution thread executes step S305, in response to receiving the noted attribute information, a GPU is called to draw the first layer data into a first image; and outputting a first image or second layer data, wherein the first layer data is layer data corresponding to a first identification value, and the second layer data is layer data corresponding to a second identification value.
The image composition thread performs step S306 to invoke HWC hardware to compose a presentation image in response to receiving the first image and/or the second layer data.
The display device shown in the embodiment of the application comprises: display, rotating assembly and controller, the controller can include a graphics processor, the graphics processor including: an image composition thread and an image composition thread; outputting attribute information by an image synthesis thread; the image synthesis thread reads the rotation angle of the display in response to receiving the attribute information; if the rotation angle is equal to 0, labeling a first identification value for all attribute information; if the rotation angle is larger than 0, marking a second identification value for all attribute information; the image synthesis thread calls the GPU to draw the first layer data into a first image in response to the received marked attribute information; outputting a first image or second layer data, wherein the first layer data is layer data corresponding to a first identification value, and the second layer data is layer data corresponding to a second identification value; the image composition thread may invoke HWC hardware. In the display device shown in this embodiment, when the display is in a rotation state, the image composition thread marks all attribute information with the second identification value, so that the data distribution thread may transmit the second layer data to the image composition thread. 
The image composition thread can call the HWC hardware to draw the second layer data into a second image; this drawing does not occupy the GPU, which relieves the composition pressure on the GPU to a certain extent. When the display is in a static state, the data distribution thread can call the GPU to draw the first image and then transfer the first image to the image composition thread; the resources occupied by transmitting the first image are smaller than those occupied by transmitting all the display data, which relieves the transmission pressure on the CPU to a certain extent.
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program, where the program may include some or all of the steps in each embodiment of the method for adjusting a shooting angle of a camera provided by the present invention when executed. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (random access memory, RAM), or the like.
It will be apparent to those skilled in the art that the techniques of embodiments of the present invention may be implemented in software plus a necessary general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be embodied essentially or in parts contributing to the prior art in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method of the embodiments or some parts of the embodiments of the present invention.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting it. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical schemes described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, characterized by comprising: a display; the rotating assembly is connected with the display and used for driving the display to rotate based on the control of the controller; the controller includes:
a graphics processor;
HWC hardware;
the central processing unit is used for running a data distribution thread and an image synthesis thread;
the data distribution thread is configured to perform: receiving presentation data, the presentation data comprising: at least one layer of data, each layer of data configured with attribute information; outputting the attribute information;
the image composition thread is configured to perform: reading the rotation angle of the display in response to receiving the attribute information; if the rotation angle is equal to 0, labeling a first identification value or a second identification value for part of attribute information according to a preset rule; if the rotation angle is larger than 0, marking a second identification value for all attribute information;
The data distribution thread is configured to perform: in response to receiving the noted attribute information, invoking a graphics processor to draw first layer data into a first image; outputting the first image and/or the second image layer data; the first layer data is layer data corresponding to a first identification value, and the second layer data is layer data corresponding to a second identification value;
the image composition thread is configured to perform: in response to receiving the second layer data, invoking HWC hardware to render the second layer data into a second image;
or, in response to receiving the second layer data and the first image, invoking HWC hardware to render the second layer data into a second image; combining the first image and the second image to obtain a display image;
or, in response to receiving the first image, controlling the display to show the first image.
2. The display device of claim 1, wherein the image composition thread is further configured to:
reading a rotation angle of the display in response to completion of composition of the presentation image;
and if the rotation angle is equal to 0, controlling the display to display the display image.
3. The display device of claim 2, wherein if the rotation angle is greater than 0, the image composition thread is further configured to:
rotating the display image based on the rotation angle;
and controlling the display to display the rotated display image.
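Claims 2 and 3 together re-check the rotation angle after composition: if the display rotated while the image was being composed, the image is rotated to match before it is shown. A minimal sketch, assuming the angle is reported in multiples of 90 degrees and modeling the image as a row-major pixel matrix:

```python
# Minimal sketch of the post-composition check in claims 2-3, assuming
# the rotation angle is a multiple of 90 degrees (an assumption; the
# claims only distinguish 0 from greater than 0).
def rotate_clockwise(pixels, angle):
    for _ in range((angle // 90) % 4):
        # transposing the reversed rows gives one 90-degree clockwise turn
        pixels = [list(row) for row in zip(*pixels[::-1])]
    return pixels

def present(display_image, read_rotation_angle, show):
    angle = read_rotation_angle()              # re-read after composition
    if angle == 0:
        show(display_image)                    # claim 2: show as composed
    else:
        show(rotate_clockwise(display_image, angle))  # claim 3
```

Re-reading the angle at presentation time is what keeps the picture aligned even when the rotating assembly moves mid-frame.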
4. The display device of claim 1, wherein the image composition thread is further configured to perform:
reading the rotation angle of the display in response to receiving the second layer data;
if the rotation angle is equal to 0, invoking the HWC hardware to draw the second layer data into a second image, and controlling the display to display the second image;
if the rotation angle is greater than 0, invoking the HWC hardware to render the second image based on the rotation angle.
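The branch in claim 4 can be sketched as below. The HWC object and the show callback are stand-in stubs; in the real device the HWC is a hardware composer block, and these names are illustrative only.

```python
# Hypothetical sketch of claim 4. StubHWC stands in for the hardware
# composer; its draw() signature is an assumption for illustration.
class StubHWC:
    def draw(self, layer_data, rotation_angle=0):
        return {"content": layer_data, "angle": rotation_angle}

def compose_second_layer(hwc, show, layer_data, rotation_angle):
    if rotation_angle == 0:
        second_image = hwc.draw(layer_data)   # plain composition
        show(second_image)                    # display immediately
    else:
        # rotate during composition; presentation then follows claims 2-3
        second_image = hwc.draw(layer_data, rotation_angle)
    return second_image
```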
5. The display device of any one of claims 1-4, wherein the presentation data is video data, and the central processor is further configured to run a video application and a decoding thread;
the video application is configured to perform: in response to receiving a video play instruction, invoking a first interface to output a start message;
the decoding thread is configured to perform: in response to receiving the start message, reading an initial angle of the display device; writing the initial angle into an attribute database; and outputting feedback information in response to completing the writing of the initial angle;
the video application is configured to perform: in response to receiving the feedback information, invoking a second interface to output original data; reading the rotation angle of the display at a set frequency; and writing the rotation angle into the attribute database;
the decoding thread is configured to perform: decoding the received original data to obtain the video data; and outputting the video data.
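The start-up handshake of claim 5 can be modeled sequentially: the decoding thread will not accept original data until the initial angle has been written to the attribute database and fed back. In this sketch the attribute database is a plain dict and the two "interfaces" are simple function calls; all of these names are illustrative assumptions.

```python
# Sequential sketch of the start-up handshake in claim 5. The attribute
# database, interface names, and decoded-frame format are assumptions.
attribute_db = {}
events = []

def read_initial_angle():
    return 0                              # assume the display starts upright

def first_interface():                    # video app -> decoding thread
    events.append("start")
    # decoding thread: write the initial angle, then send feedback
    attribute_db["rotation_angle"] = read_initial_angle()
    return "feedback"

def second_interface(original_data):      # video app -> decoding thread
    # decoding thread: decode the received original data into video data
    return [f"decoded({frame})" for frame in original_data]

def play_video(original_data):
    if first_interface() == "feedback":   # wait until the angle is written
        return second_interface(original_data)
```

The point of the handshake is ordering: the angle is guaranteed to be in the attribute database before the first frame is decoded, so later reads at the set frequency only ever update it.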
6. The display device of claim 5, wherein the image composition thread reads the rotation angle recorded in the attribute database at the page refresh frequency of the HWC hardware.
7. The display device of claim 6, wherein the video application reads the rotation angle at the page refresh frequency of the HWC hardware.
8. The display device of claim 7, wherein, in response to completing playback of the video, the video application is configured to output an end message;
the decoding thread is configured to release the cached video data in response to receiving the end message.
9. The display device of any of claims 2-4, wherein the image composition thread is further configured to:
outputting the display image through a video layer if the rotation angle is equal to 0;
and outputting the display image through an OSD layer if the rotation angle is greater than 0.
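The output-plane choice of claim 9 reduces to a single comparison. One plausible reading of the design, which the claim itself does not state, is that the hardware video plane cannot carry rotated content, so a rotated display image must travel through the OSD (graphics) layer instead:

```python
# Sketch of the output-plane selection in claim 9. The rationale in the
# comment is an inference, not stated in the claim itself.
def choose_output_layer(rotation_angle):
    # video plane for the upright case; OSD (graphics) plane once the
    # display is rotated, since the OSD path can carry a rotated image
    return "video" if rotation_angle == 0 else "OSD"
```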
10. A display device, characterized by comprising: a display; a rotating assembly connected to the display and configured to drive the display to rotate under control of a controller; the controller comprising:
a graphics processor;
HWC hardware;
a central processing unit configured to run a data distribution thread and an image composition thread;
the data distribution thread is configured to perform: receiving presentation data, the presentation data comprising at least one item of layer data, each item of layer data configured with attribute information; and outputting the attribute information;
the image composition thread is configured to perform: reading the rotation angle of the display in response to receiving the attribute information; if the rotation angle is equal to 0, labeling all of the attribute information with a first identification value; and if the rotation angle is greater than 0, labeling all of the attribute information with a second identification value;
the data distribution thread is configured to perform: in response to receiving the labeled attribute information, invoking the graphics processor to draw first layer data into a first image; and outputting the first image and/or second layer data; wherein the first layer data is layer data corresponding to the first identification value, and the second layer data is layer data corresponding to the second identification value;
the image composition thread is configured to perform: in response to receiving the second layer data, invoking the HWC hardware to render the second layer data into a second image;
or, in response to receiving the first image, controlling the display to show the first image.
CN202011164940.6A 2020-10-27 2020-10-27 Display equipment Active CN114501087B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011164940.6A CN114501087B (en) 2020-10-27 2020-10-27 Display equipment


Publications (2)

Publication Number Publication Date
CN114501087A CN114501087A (en) 2022-05-13
CN114501087B (en) 2024-04-12

Family

ID=81470156

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011164940.6A Active CN114501087B (en) 2020-10-27 2020-10-27 Display equipment

Country Status (1)

Country Link
CN (1) CN114501087B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116700655B (en) * 2022-09-20 2024-04-02 荣耀终端有限公司 Interface display method and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107786883A (en) * 2016-08-31 2018-03-09 三星电子株式会社 Image display and its operating method
CN111176603A (en) * 2019-12-31 2020-05-19 海信视像科技股份有限公司 Image display method for display equipment and display equipment
CN111246266A (en) * 2020-03-04 2020-06-05 海信视像科技股份有限公司 Display equipment and UI (user interface) display method during rotation
CN111417004A (en) * 2019-01-08 2020-07-14 三星电子株式会社 Image display apparatus and method of operating the same
CN111787388A (en) * 2020-07-10 2020-10-16 海信视像科技股份有限公司 Display device



Similar Documents

Publication Publication Date Title
CN111970550B (en) Display device
CN111787388B (en) Display device
CN112165644B (en) Display device and video playing method in vertical screen state
CN111866590B (en) Display device
CN111866593B (en) Display device and startup interface display method
CN111866569B (en) Display device
CN112565839A (en) Display method and display device of screen projection image
CN114827707B (en) Display equipment and startup animation display method
CN113556593B (en) Display device and screen projection method
CN112565861A (en) Display device
CN113395600B (en) Interface switching method of display equipment and display equipment
CN114501087B (en) Display equipment
CN113542824B (en) Display equipment and display method of application interface
CN113630639B (en) Display device
CN113556590B (en) Method for detecting effective resolution of screen-projected video stream and display equipment
CN113497958B (en) Display equipment and picture display method
CN113573118B (en) Video picture rotating method and display equipment
CN113395554B (en) Display device
CN113542823B (en) Display equipment and application page display method
CN113497965B (en) Configuration method of rotary animation and display device
CN113497962B (en) Configuration method of rotary animation and display device
CN111787374A (en) Display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant