CN110708581B - Display device and method for presenting multimedia screen saver information - Google Patents
Display device and method for presenting multimedia screen saver information
- Publication number
- CN110708581B (application CN201910795660.6A)
- Authority
- CN
- China
- Prior art keywords
- chip
- information
- screen saver
- display
- display device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4126—The peripheral being portable, e.g. PDAs or mobile phones
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/42206—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
- H04N21/4221—Dedicated function buttons, e.g. for the control of an EPG, subtitles, aspect ratio, picture-in-picture or teletext
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/4363—Adapting the video stream to a specific local network, e.g. a Bluetooth® network
- H04N21/43637—Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/485—End-user interface for client configuration
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Computer Networks & Wireless Communication (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
The application discloses a display device and a method for presenting multimedia screen saver information. In the display device, a display can present either a user interface or multimedia screen saver information provided by a screen saver application. When the first chip receives a key input, it sends first information to the second chip according to a preset rule. When the second chip receives a key input or the first information, it restarts the screen saver countdown for that key input or first information, and, if the content presented by the display is the multimedia screen saver information, it cancels the multimedia screen saver information so that the display presents the user interface. When the screen saver countdown executed by the second chip finishes, the multimedia screen saver information is presented on the display. The display device can thus respond to key input on the second chip as well as key input on the first chip, avoiding abnormal screen saver countdown and failure to exit the screen saver after the user presses a key.
Description
Technical Field
The present application relates to the field of display device technologies, and in particular, to a display device and a method for presenting multimedia screen saver information.
Background
Currently, display devices can provide a user with playback content such as audio, video and pictures. As shown in fig. 3 or 4, a dual-chip display device with a camera has a first chip (A chip) and a second chip (N chip), and can provide multi-functional experiences for the user such as "chatting while playing", "chatting while watching" and "chatting while learning". For example, in a "chatting while playing" scenario, real-time pictures of the game participants are presented while a game scene is provided for the user; in a "chatting while watching" scenario, multiple video chat pictures are presented while a video program is played for the user; and so on.
In one usage scenario, if the user leaves the display device, the display device will not receive key input for a long time, leaving the display screen static for a long time, which shortens the service life of the screen and wastes electric energy. In order to prolong the service life of the screen and save electric energy, a screen saver application can be installed on the display device. Through the screen saver application, the screen automatically enters a screen saver state after it has been static for a certain time, and automatically exits the screen saver state when the screen needs to be active again, thereby protecting the screen and saving electric energy.
However, a dual-chip display device contains two chips, and both chips can receive key input. For example, in a game scenario, key input from a keyboard or mouse is received by the first chip, while in a user settings scenario, key input from a remote controller is received by the second chip. If the screen saver application is installed on the second chip (or the first chip), it can only enter or exit the screen saver state in response to key input on that chip; when key input occurs on the other chip, the screen saver application does not respond, which degrades the user experience.
Disclosure of Invention
The application provides a display device and a method for presenting multimedia screen saver information, which aim to solve the problem that a screen saver application in a dual-chip display device can only respond to key input on the second chip (or the first chip) to enter or exit the screen saver state, and cannot respond when key input occurs on the other chip.
In a first aspect, the present application provides a display device comprising:
a display configured to present a user interface or multimedia screen saver information provided by a screen saver application running on a second chip;
a first chip in communication with the second chip configured to:
send first information to the second chip according to a preset rule when a key input is received;
the second chip is configured to:
when receiving a key input or the first information sent by the first chip, execute a screen saver countdown for the received key input or first information; and, if the content presented by the display is the multimedia screen saver information, cancel the multimedia screen saver information so that the display presents the user interface;
when the executed screen saver countdown is complete, present the multimedia screen saver information on the display.
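For illustration only, the following Java sketch models the second chip's behaviour described above: any key input or first information restarts the screen saver countdown and, if the screen saver is currently shown, cancels it first. The class and method names, the scheduler and the console output are assumptions made for the sketch, not details from the patent.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

/** Illustrative sketch of the second chip's screen saver logic; names are hypothetical. */
public class ScreenSaverController {
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
    private final long countdownSeconds;
    private ScheduledFuture<?> countdown;
    private boolean screenSaverShown = false;

    public ScreenSaverController(long countdownSeconds) {
        this.countdownSeconds = countdownSeconds;
        restartCountdown();
    }

    /** Called for a local key input on the second chip or for first information from the first chip. */
    public synchronized void onKeyInputOrFirstInformation() {
        if (screenSaverShown) {
            cancelScreenSaver();          // exit the screen saver so the display presents the user interface
        }
        restartCountdown();               // re-execute the screen saver countdown for this input
    }

    private void restartCountdown() {
        if (countdown != null) {
            countdown.cancel(false);
        }
        countdown = scheduler.schedule(this::showScreenSaver, countdownSeconds, TimeUnit.SECONDS);
    }

    private synchronized void showScreenSaver() {
        screenSaverShown = true;
        System.out.println("countdown finished: present multimedia screen saver information");
    }

    private void cancelScreenSaver() {
        screenSaverShown = false;
        System.out.println("cancel screen saver: present user interface");
    }
}
```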
Further, the second chip is further configured to:
after the multimedia screen saver information is canceled or presented on the display, send a prompt message to the first chip at preset intervals until a feedback message from the first chip for the prompt message is received, wherein the prompt message is used to notify the first chip of the content currently presented by the display.
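A minimal sketch of this "prompt until feedback" behaviour might look as follows; the ChipLink transport, the message payload and the retry interval are assumed purely for illustration.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

/** Sketch: after the displayed content changes, notify the first chip repeatedly until it acknowledges. */
class PromptSender {
    interface ChipLink { void sendPrompt(String displayedContent); }   // hypothetical transport

    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
    private final ChipLink linkToFirstChip;
    private ScheduledFuture<?> pending;

    PromptSender(ChipLink linkToFirstChip) { this.linkToFirstChip = linkToFirstChip; }

    /** Start sending the prompt message every intervalMs until the feedback message arrives. */
    synchronized void notifyDisplayChanged(String displayedContent, long intervalMs) {
        stop();   // replace any prompt that is still being retried
        pending = scheduler.scheduleAtFixedRate(
                () -> linkToFirstChip.sendPrompt(displayedContent), 0, intervalMs, TimeUnit.MILLISECONDS);
    }

    /** Called when the feedback message for the prompt is received from the first chip. */
    synchronized void onFeedbackReceived() { stop(); }

    private void stop() {
        if (pending != null) { pending.cancel(false); pending = null; }
    }
}
```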
Further, the first chip is further configured to:
when a prompt message sent by the second chip is received, send a feedback message for the prompt message to the second chip, wherein the prompt message is used to notify the first chip of the content presented by the display;
if the content presented by the display is determined to be the multimedia screen saver information according to the prompt message, record the screen state as a screen saver state;
if the content presented by the display is determined to be the user interface according to the prompt message, record the screen state as a non-screen saver state.
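The first chip's bookkeeping could be as simple as the sketch below; the message strings and the return value are hypothetical, since the patent does not define a message format.

```java
/** Sketch of how the first chip could record the screen state from the prompt message. */
class ScreenStateRecorder {
    enum ScreenState { SCREEN_SAVER, NON_SCREEN_SAVER }

    private volatile ScreenState state = ScreenState.NON_SCREEN_SAVER;

    /** Handle a prompt message from the second chip and return the feedback message. */
    String onPromptMessage(String promptedContent) {
        if ("SCREENSAVER".equals(promptedContent)) {
            state = ScreenState.SCREEN_SAVER;       // display currently shows multimedia screen saver information
        } else {
            state = ScreenState.NON_SCREEN_SAVER;   // display currently shows the user interface
        }
        return "FEEDBACK";                          // acknowledge so the second chip stops re-sending the prompt
    }

    ScreenState recordedState() { return state; }
}
```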
Further, when the first chip receives a key input, sending the first information to the second chip according to the preset rule specifically includes:
determining the recorded screen state when the key input is received;
if the recorded screen state is the screen saver state, sending the first information to the second chip;
if the recorded screen state is the non-screen saver state, sending the first information to the second chip only when a preset time has elapsed since the first information was last sent.
Further, if the recorded screen state is the screen saver state, the sending, by the first chip, of the first information to the second chip specifically includes:
sending the first information to the second chip at preset intervals until a response message of the second chip for the first information is received;
after the first chip sends the first information to the second chip, the method further includes:
sending the first information to the second chip once every preset time until a response message of the second chip for the first information is received.
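Taken together, the preset rule above might be realized along the lines of the following sketch, in which the throttling period, the transport interface and the retry bookkeeping are assumptions; a full implementation would keep re-sending while a response is still outstanding.

```java
/** Sketch of the first chip's preset rule for sending the first information; names and timing are assumed. */
class FirstInformationSender {
    enum ScreenState { SCREEN_SAVER, NON_SCREEN_SAVER }
    interface ChipLink { void sendFirstInformation(); }   // hypothetical transport to the second chip

    private final ChipLink linkToSecondChip;
    private final long presetTimeMs;          // minimum spacing between sends in the non-screen-saver state
    private long lastSentAt = 0;              // epoch ms of the last send; 0 means "never sent"
    private volatile boolean awaitingResponse = false;

    FirstInformationSender(ChipLink link, long presetTimeMs) {
        this.linkToSecondChip = link;
        this.presetTimeMs = presetTimeMs;
    }

    /** Apply the preset rule when a key input arrives on the first chip. */
    void onKeyInput(ScreenState recordedState) {
        long now = System.currentTimeMillis();
        if (recordedState == ScreenState.SCREEN_SAVER) {
            send(now);                         // screen saver state: send right away; a retry loop would
            awaitingResponse = true;           // re-send every preset interval until the response arrives
        } else if (now - lastSentAt >= presetTimeMs) {
            send(now);                         // non-screen-saver state: at most once per preset time
        }
    }

    /** Called when the second chip's response message for the first information is received. */
    void onResponse() { awaitingResponse = false; }

    private void send(long now) {
        linkToSecondChip.sendFirstInformation();
        lastSentAt = now;
    }
}
```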
Further, the second chip is further configured to:
after receiving the first information sent by the first chip, send a response message for the first information to the first chip.
Further, the first chip is further configured to:
run a communication application based on an inter-process communication technology, wherein the communication application is used to send the first information to the second chip and to receive the response message for the first information sent by the second chip;
the communication application is also used to receive the prompt message sent by the second chip and to send the feedback message for the prompt message to the second chip.
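The patent names only an inter-process communication technology, not a concrete transport. The sketch below assumes a TCP socket between the chips and illustrative message strings simply to show the send/receive roles such a communication application would play.

```java
import java.io.BufferedReader;
import java.io.Closeable;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

/** Minimal sketch of a communication application on the first chip; the socket transport is an assumption. */
class FirstChipCommunicationApp implements Closeable {
    private final Socket socket;
    private final BufferedReader in;
    private final PrintWriter out;

    FirstChipCommunicationApp(String secondChipHost, int port) throws IOException {
        socket = new Socket(secondChipHost, port);
        in = new BufferedReader(new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8));
        out = new PrintWriter(new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8), true);
    }

    /** Send the first information and block until the second chip's response message arrives. */
    String sendFirstInformation() throws IOException {
        out.println("FIRST_INFORMATION");
        return in.readLine();                 // e.g. "RESPONSE"
    }

    /** Receive a prompt message from the second chip and answer it with a feedback message. */
    String receivePromptAndAcknowledge() throws IOException {
        String prompt = in.readLine();        // e.g. "PROMPT:SCREENSAVER"
        out.println("FEEDBACK");
        return prompt;
    }

    @Override
    public void close() throws IOException { socket.close(); }
}
```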
Further, the second chip is configured to:
when a key input or the first information sent by the first chip is received, if the content presented by the display is the multimedia screen saver information, or the content presented by the display is the user interface and the user interface does not include a dynamic picture, execute the screen saver countdown for the received key input or first information;
if the content presented by the display is the user interface and the user interface includes a dynamic picture, do not execute the screen saver countdown.
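This condition can be summarized by a small policy function; the DisplayContent model below is an assumption used only to make the two branches explicit.

```java
/** Sketch of the countdown condition: restart only when no dynamic picture is playing in the user interface. */
class CountdownPolicy {
    static final class DisplayContent {
        final boolean isScreenSaver;
        final boolean hasDynamicPicture;   // e.g. a video or animation is currently playing
        DisplayContent(boolean isScreenSaver, boolean hasDynamicPicture) {
            this.isScreenSaver = isScreenSaver;
            this.hasDynamicPicture = hasDynamicPicture;
        }
    }

    /** Returns true when the received key input or first information should restart the countdown. */
    static boolean shouldRestartCountdown(DisplayContent content) {
        if (content.isScreenSaver) {
            return true;                       // screen saver shown: restart (and cancel the screen saver)
        }
        return !content.hasDynamicPicture;     // user interface: restart only if nothing dynamic is playing
    }
}
```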
Further:
the key input received by the second chip is input from a first control device connected to the second chip;
the key input received by the first chip is input from a second control device connected to the first chip.
In a second aspect, the present application further provides a method for presenting multimedia screen saver information by a dual-chip display device, including:
presenting a user interface or multimedia screen saver information on a display of a display device, the multimedia screen saver information being provided by a screen saver application running on a second chip;
when the second chip receives a key input or first information sent by the first chip, executing a screen saver countdown for the received key input or first information; and, if the content presented by the display is the multimedia screen saver information, canceling the multimedia screen saver information so that the display presents the user interface;
when the executed screen saver countdown is complete, presenting the multimedia screen saver information on the display; wherein the first information is sent by the first chip according to a preset rule when the first chip receives a key input.
According to the above technical solution, in the display device provided by the application, the display can present either a user interface or multimedia screen saver information provided by a screen saver application. When the first chip receives a key input, it sends first information to the second chip according to a preset rule, to notify the second chip that the first chip has received a key input. When the second chip receives a key input or the first information, it restarts the screen saver countdown for that key input or first information, and, if the content presented by the display is the multimedia screen saver information, it cancels the multimedia screen saver information so that the display presents the user interface. When the screen saver countdown executed by the second chip finishes, the multimedia screen saver information is presented on the display.
Because the second chip restarts the countdown whenever it receives either a key input or the first information, and cancels the screen saver whenever the screen saver is being presented, the display device can respond both to key input on the second chip and to key input on the first chip, avoiding abnormal screen saver countdown and failure to exit the screen saver after the user presses a key.
Drawings
In order to more clearly explain the technical solution of the present application, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious to those skilled in the art that other drawings can be obtained according to these drawings without any creative effort.
Fig. 1 is a schematic diagram schematically illustrating an operation scene between a display device and a control apparatus according to an embodiment;
fig. 2 is a block diagram exemplarily showing a hardware configuration of the control apparatus 100 according to the embodiment;
fig. 3 is a block diagram exemplarily showing a hardware configuration of the display device 200 according to the embodiment;
a block diagram of the hardware architecture of the display device 200 according to fig. 3 is exemplarily shown in fig. 4;
fig. 5 is a diagram exemplarily showing a functional configuration of the display device 200 according to the embodiment;
fig. 6a schematically shows a software configuration in the display device 200 according to an embodiment;
fig. 6b schematically shows a configuration of an application in the display device 200 according to an embodiment;
fig. 7 schematically illustrates a user interface in the display device 200 according to an embodiment;
FIG. 8 is an exemplary illustration of an application scenario in accordance with an embodiment of the present application;
fig. 9a is an interaction scenario of a first control device and a display device exemplarily illustrated in an embodiment of the present application;
fig. 9b is an interaction scenario of a second control device and a display device exemplarily illustrated in the embodiment of the present application;
FIG. 10 is a diagram illustrating an exemplary entry of a display screen into a screensaver state according to an embodiment of the application;
fig. 11 is a schematic diagram of an exemplary hardware structure of a display device according to an embodiment of the present application;
FIG. 12 is a flow chart of one embodiment of a method performed by a display device of the present application;
FIG. 13a is a diagram illustrating a screen in a screen saver state according to an exemplary embodiment of the present application;
FIG. 13b is a schematic diagram of a screen in another screen saver state according to an embodiment of the present application;
FIG. 14 is a flow chart of another embodiment of a method performed by a display device of the present application;
fig. 15 is a flowchart illustrating a method for presenting multimedia screen saver information according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
For the convenience of users, various external device interfaces are usually provided on a display device to allow different peripheral devices or cables to be connected and corresponding functions to be implemented. When a high-definition camera is connected to an interface of the display device, if the hardware system of the display device has no hardware interface capable of receiving the source data of a high-pixel camera, the data captured by the camera cannot be presented on the display screen of the display device.
Furthermore, due to its hardware structure, the hardware system of a conventional display device supports only one path of hard decoding resources, and usually supports video decoding at a resolution of at most 4K. Therefore, when a user wants to carry out a video chat while watching network television, the hard decoding resources (usually the GPU in the hardware system) must be used to decode the network video so that its definition is not reduced; in this case, the video chat picture can only be processed by soft decoding on a general-purpose processor (e.g., the CPU) in the hardware system.
Using soft decoding to process the video chat picture greatly increases the data processing burden on the CPU, and when that burden becomes too heavy, the picture may freeze or stutter. Further, limited by the data processing capability of the CPU, soft decoding of the video chat picture generally cannot support multi-channel video calls, so when the user wants to video chat with multiple other users in the same chat scene, access is blocked.
In view of the above aspects, to overcome the above drawbacks, the present application discloses a dual hardware system architecture to implement multiple channels of video chat data (at least one channel of local video).
The concept to which the present application relates will be first explained below with reference to the drawings. It should be noted that the following description of the respective concepts is only for the purpose of facilitating understanding of the contents of the present application, and does not represent a limitation to the scope of the present application.
The term "module," as used in various embodiments of the present application, may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in the various embodiments of the present application refers to a component of an electronic device (such as a display device as disclosed herein) that is capable of wirelessly controlling the electronic device, typically over a relatively short distance. The components may generally be connected to the electronic device using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as WiFi, wireless USB, bluetooth, motion sensors, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in the common remote control device with the user interface in the touch screen.
The term "gesture" as used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose, or result through a change in hand shape or an action such as hand movement.
The term "hardware system" used in the embodiments of the present application may refer to a physical component having computing, controlling, storing, inputting and outputting functions, which is formed by a mechanical, optical, electrical and magnetic device such as an Integrated Circuit (IC), a Printed Circuit Board (PCB) and the like. In various embodiments of the present application, a hardware system may also be referred to as a motherboard (or chip).
An illustrative view of an operational scenario between a display device and a control apparatus according to an embodiment is illustrated in fig. 1. As shown in fig. 1, a user may operate the display apparatus 200 through the control device 100.
The control device 100 may be a remote controller 100A, which may communicate with the display device 200 through an infrared protocol communication, a bluetooth protocol communication, a ZigBee (ZigBee) protocol communication, or other short-range communication, and is configured to control the display device 200 through a wireless or other wired manner. The user may input a user instruction through a key on the remote controller, voice input, control panel input, etc. to control the display apparatus 200. Such as: the user can input a corresponding control command through a volume up/down key, a channel control key, up/down/left/right moving keys, a voice input key, a menu key, a power on/off key, etc. on the remote controller, to implement the function of controlling the display device 200.
The control apparatus 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, etc., which may communicate with the display device 200 through a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), or other networks, and implement control of the display device 200 through an application program corresponding to the display device 200.
For example, the mobile terminal 100B and the display device 200 may each have a software application installed thereon, so that connection communication between the two can be realized through a network communication protocol, and the purpose of one-to-one control operation and data communication can be further realized. Such as: a control instruction protocol can be established between the mobile terminal 100B and the display device 200, a remote control keyboard is synchronized to the mobile terminal 100B, and the function of controlling the display device 200 is realized by controlling a user interface on the mobile terminal 100B; the audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
As shown in fig. 1, the display apparatus 200 may also perform data communication with the server 300 through various communication means. In various embodiments of the present application, the display device 200 may be allowed to be communicatively coupled to the server 300 via a local area network, a wireless local area network, or other network. The server 300 may provide various contents and interactions to the display apparatus 200.
Illustratively, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and interacting with an Electronic Program Guide (EPG). The servers 300 may be one group or multiple groups of servers, and may be of one or more types. The server 300 also provides other web service contents such as video on demand and advertisement services.
The display device 200 may be a liquid crystal display, an OLED (Organic Light Emitting Diode) display, a projection display device, or an intelligent TV. The specific display device type, size, resolution, etc. are not limiting, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
The display apparatus 200 may additionally provide an intelligent network tv function that provides a computer support function in addition to the broadcast receiving tv function. Examples include a web tv, a smart tv, an Internet Protocol Television (IPTV), and the like.
As shown in fig. 1, the display device may be connected or provided with a camera, and is configured to present a picture taken by the camera on a display interface of the display device or other display devices, so as to implement interactive chat between users. Specifically, the picture shot by the camera can be displayed on the display device in a full screen mode, a half screen mode or any optional area.
As an optional connection mode, the camera is connected to the display rear shell through a connecting plate and is fixedly installed in the middle of the upper side of the display rear shell. As another installation option, the camera can be fixedly installed at any position of the display rear shell, as long as its image acquisition area is not blocked by the rear shell, for example, so that the image acquisition area faces the same direction as the display device.
As another alternative connection mode, the camera is connected to the display rear shell through a connection board or another conceivable connector that allows the camera to be raised and lowered; the connector is provided with a lifting motor. When a user wants to use the camera, or an application program wants to use the camera, the camera is raised out of the display; when the camera is not needed, it can be retracted into the rear shell to protect it from damage.
As an embodiment, the camera adopted in the present application may have 16 million pixels, so as to achieve ultra-high-definition display. In actual use, cameras with more or fewer than 16 million pixels may also be used.
After the camera is installed on the display device, the contents displayed by different application scenes of the display device can be fused in various different modes, so that the function which cannot be realized by the traditional display device is achieved.
Illustratively, a user may conduct a video chat with at least one other user while watching a video program. The presentation of the video program may be as a background frame over which a window for video chat is displayed. The function is called 'chat while watching'.
Optionally, in a scene of "chat while watching", at least one video chat is performed across terminals while watching a live video or a network video.
In another example, a user may conduct a video chat with at least one other user while entering the educational application for learning. For example, a student may interact remotely with a teacher while learning content in an educational application. Vividly, this function can be called "chatting while learning".
In another example, a user may conduct a video chat with other players of a card game while playing the game. For example, a player may enable remote interaction with other players when entering a gaming application to participate in a game. Figuratively, this function may be referred to as "chatting while playing".
Optionally, the game scene is fused with the video picture: the portrait in the video picture is cut out and overlaid on the game picture, improving the user experience.
Optionally, in motion-sensing games (such as ball games, boxing, running and dancing), the camera is used to capture human posture and motion, to detect and track limbs, and to detect human skeleton key point data; these are then fused with the animation in the game, so that games in sports, dancing and other scenes can be realized.
In another example, a user may interact with at least one other user in a karaoke application by video and voice. Vividly, this function can be called "singing while watching". Preferably, when at least one user enters the application in a chat scenario, a plurality of users can jointly complete the recording of a song.
In another example, a user may turn on the camera locally to take pictures and videos; figuratively, this can be referred to as "looking in the mirror".
In other examples, more or less functionality may be added. The function of the display device is not particularly limited in this application.
Fig. 2 is a block diagram schematically showing the configuration of the control apparatus 100 according to the exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control apparatus 100 is configured to control the display device 200: it receives a user's input operation command and converts the operation command into an instruction that the display device 200 can recognize and respond to, thereby mediating the interaction between the user and the display device 200. For example, the user operates the channel up/down keys on the control apparatus 100, and the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications for controlling the display device 200 according to user demands.
In some embodiments, as shown in fig. 1, the mobile terminal 100B or another intelligent electronic device may perform a function similar to that of the control apparatus 100 after an application for manipulating the display device 200 is installed. For example, through the installed application, the various function keys or virtual buttons of the graphical user interface on the mobile terminal 100B or other intelligent electronic device can provide the functions of the physical keys of the control apparatus 100.
The controller 110 includes a processor 112, a RAM 113, a ROM 114, a communication interface, and a communication bus. The controller 110 is used to control the running and operation of the control device 100, the communication and coordination among its internal components, and external and internal data processing functions.
The communicator 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communicator 130 may include at least one of a WIFI module 131, a bluetooth module 132, an NFC module 133, and the like.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like. Such as: the user can realize a user instruction input function through actions such as voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display apparatus 200. In some embodiments, the interface may be an infrared interface or a radio frequency interface. For example, when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol and sent to the display device 200 through the infrared sending module. For another example, when the radio frequency signal interface is used, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmitting terminal.
In some embodiments, the control device 100 includes at least one of a communicator 130 and an output interface. The communicator 130 is configured in the control device 100, such as: the modules of WIFI, bluetooth, NFC, etc. may send the user input command to the display device 200 through the WIFI protocol, or the bluetooth protocol, or the NFC protocol code.
And a memory 190 for storing various operation programs, data and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 190 may store various control signal commands input by a user.
And a power supply 180 for providing operating power support for the components of the control device 100 under the control of the controller 110; it may include a battery and associated control circuitry.
A hardware configuration block diagram of a hardware system in the display apparatus 200 according to an exemplary embodiment is exemplarily illustrated in fig. 3.
When a dual hardware system architecture is adopted, the relationship between the hardware systems can be as shown in fig. 3. For convenience of description, one hardware system in the dual hardware system architecture is hereinafter referred to as the first hardware system or A system (A chip), and the other hardware system is referred to as the second hardware system or N system (N chip). The A chip includes a controller of the A chip and various modules connected to that controller through various interfaces, and the N chip includes a controller of the N chip and various modules connected to that controller through various interfaces. The A chip and the N chip may each have an independent operating system installed, so that there are two independent but interrelated subsystems in the display apparatus 200.
As shown in fig. 3, the a chip and the N chip may be connected, communicated and powered through a plurality of different types of interfaces. The interface type of the interface between the a chip and the N chip may include a General-purpose input/output (GPIO) interface, a USB interface, an HDMI interface, a UART interface, and the like. One or more of these interfaces may be used for communication or power transfer between the a-chip and the N-chip. For example, as shown in fig. 3, in the dual hardware system architecture, the N chip may be powered by an external power source (power), and the a chip may not be powered by the external power source but by the N chip.
In addition to the interface for connecting with the N chip, the a chip may further include an interface for connecting other devices or components, such as an MIPI interface for connecting a Camera (Camera) shown in fig. 3, a bluetooth interface, and the like.
Similarly, in addition to the interface for connecting with the A chip, the N chip may further include a VBY interface for connecting with the display screen TCON (timing controller), an i2S interface for connecting with a power Amplifier (AMP) and a Speaker (Speaker), and an IR/Key interface, a USB interface, a WiFi interface, a bluetooth interface, an HDMI interface, a Tuner interface, and the like.
The dual hardware system architecture of the present application is further described below with reference to fig. 4. It should be noted that fig. 4 is only an exemplary illustration of the dual hardware system architecture of the present application and does not represent a limitation of the present application. In actual practice, both hardware systems may contain more or less hardware or interfaces as desired.
A block diagram of the hardware architecture of the display device 200 according to fig. 3 is exemplarily shown in fig. 4. As shown in fig. 4, the hardware system of the display apparatus 200 may include an a chip and an N chip, and a module connected to the a chip or the N chip through various interfaces.
The N-chip may include a tuner demodulator 220, a communicator 230, an external device interface 250, a controller 210, a memory 290, a user input interface, a video processor 260-1, an audio processor 260-2, a display 280, an audio output interface 270, and a power supply. The N-chip may also include more or fewer modules in other embodiments.
The tuner demodulator 220 is configured to perform modulation and demodulation processing such as amplification, mixing, resonance and the like on a broadcast television signal received in a wired or wireless manner, so as to demodulate an audio/video signal carried in a frequency of a television channel selected by a user and additional information (e.g., an EPG data signal) from a plurality of wireless or wired broadcast television signals. Depending on the broadcast system of the television signal, the signal path of the tuner 220 may be varied, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; according to different modulation types, the signal can be adjusted in a digital modulation mode or an analog modulation mode; and depending on the type of television signal being received, tuner demodulator 220 may demodulate analog and/or digital signals.
The tuner demodulator 220 is also operative to respond to the user-selected television channel frequency and the television signals carried thereby, in accordance with the user selection, and as controlled by the controller 210.
In other exemplary embodiments, the tuner/demodulator 220 may be in an external device, such as an external set-top box. In this way, the set-top box outputs television audio/video signals after modulation and demodulation, and the television audio/video signals are input into the display device 200 through the external device interface 250.
The communicator 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communicator 230 may include a WIFI module 231, a bluetooth communication protocol module 232, a wired ethernet communication protocol module 233, and other network communication protocol modules such as an infrared communication protocol module or a near field communication protocol module.
The display apparatus 200 may establish a connection of a control signal and a data signal with an external control apparatus or a content providing apparatus through the communicator 230. For example, the communicator may receive a control signal of the remote controller 100A according to the control of the controller.
The external device interface 250 is a component for providing data transmission between the N-chip controller 210 and the a-chip and other external devices. The external device interface may be connected with an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner, and may receive data such as a video signal (e.g., moving image), an audio signal (e.g., music), additional information (e.g., EPG), etc. of the external apparatus.
The external device interface 250 may include: a High Definition Multimedia Interface (HDMI) terminal 251, a Composite Video Blanking Sync (CVBS) terminal 252, an analog or digital component terminal 253, a Universal Serial Bus (USB) terminal 254, a red, green, blue (RGB) terminal (not shown), and the like. The number and type of external device interfaces is not limited by this application.
The controller 210 controls the operation of the display device 200 and responds to the user's operation by running various software control programs (e.g., an operating system and/or various application programs) stored on the memory 290.
As shown in fig. 4, the controller 210 includes a random access memory RAM 214, a read only memory ROM 213, a graphics processor 216, a CPU processor 212, a communication interface 218, and a communication bus. The RAM 214, the ROM 213, the graphics processor 216, the CPU processor 212, and the communication interface 218 are connected via the bus.
A ROM213 for storing instructions for various system boots. If the display device 200 is powered on upon receipt of the power-on signal, the CPU processor 212 executes a system boot instruction in the ROM and copies the operating system stored in the memory 290 to the RAM214 to start running the boot operating system. After the start of the operating system is completed, the CPU processor 212 copies the various application programs in the memory 290 to the RAM214, and then starts running and starting the various application programs.
A graphics processor 216 for generating various graphics objects, such as: icons, operation menus, user input instruction display graphics, and the like. The display device comprises an arithmetic unit which carries out operation by receiving various interactive instructions input by a user and displays various objects according to display attributes. And a renderer for generating various objects based on the arithmetic unit and displaying the rendered result on the display 280.
A CPU processor 212 for executing operating system and application program instructions stored in memory 290. And executing various application programs, data and contents according to various interactive instructions received from the outside so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 212 may include a plurality of processors. The plurality of processors may include a main processor and one or more sub-processors. The main processor is used to perform some operations of the display apparatus 200 in a pre-power-on mode and/or to display a screen in the normal mode. The one or more sub-processors are used to perform operations in a standby mode and the like.
The communication interfaces may include a first interface 218-1 through an nth interface 218-n. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 210 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
Wherein the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation connected to a hyperlink page, document, image, etc., or performing an operation of a program corresponding to the icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
The memory 290 includes a memory for storing various software modules for driving and controlling the display apparatus 200. Such as: various software modules stored in memory 290, including: the system comprises a basic module, a detection module, a communication module, a display control module, a browser module, various service modules and the like.
The basic module is a bottom layer software module for signal communication between hardware in the display device 200 and sending processing and control signals to an upper layer module. The detection module is a management module used for collecting various information from various sensors or user input interfaces, and performing digital-to-analog conversion and analysis management.
For example, the voice recognition module includes a voice analysis module and a voice instruction database module. The display control module is a module for controlling the display 280 to display image content, and can be used to play multimedia image content, UI interfaces and other information. The communication module is used for control and data communication with external devices. The browser module is used for performing data communication with browsing servers. The service module is a module for providing various services and various applications.
Meanwhile, the memory 290 is also used to store a visual effect map for receiving external data and user data, images of various items in various user interfaces, and a focus object, etc.
A user input interface for transmitting an input signal of a user to the controller 210 or transmitting a signal output from the controller to the user. For example, the control device (e.g., a mobile terminal or a remote controller) may transmit an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by a user to the user input interface, and then the input signal is forwarded to the controller by the user input interface; alternatively, the control device may receive an output signal such as audio, video, or data processed by the controller to be output from the user input interface, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, a user may enter a user command on a Graphical User Interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 260-1 is configured to receive a video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal directly displayed or played on the display 280.
Illustratively, the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used to demultiplex the input audio/video data stream; for example, if an MPEG-2 stream is input, the demultiplexing module demultiplexes it into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
And the image synthesis module is used for carrying out superposition mixing processing on the GUI signal input by the user or generated by the user and the video image after the zooming processing by the graphic generator so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, for example converting a 24 Hz, 25 Hz, 30 Hz or 60 Hz video into a 60 Hz, 120 Hz or 240 Hz frame rate, where the input frame rate may be related to the source video stream and the output frame rate may be related to the refresh rate of the display. The conversion is usually realized by frame insertion (interpolation) on the input.
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format of a display, such as converting the format of the signal output by the frame rate conversion module to output an RGB data signal.
And a display 280 for receiving the image signal input from the video processor 260-1 and displaying video content, images and the menu manipulation interface. The display 280 includes a display component for presenting a picture and a driving component for driving image display. The displayed video content may come from the video in the broadcast signal received by the tuner demodulator 220, or from video content input through the communicator or the external device interface. The display 280 also displays a user manipulation interface UI generated in the display apparatus 200 and used to control the display apparatus 200.
And, a driving component for driving the display according to the type of the display 280. Alternatively, in case the display 280 is a projection display, it may also comprise a projection device and a projection screen.
The audio processor 260-2 is configured to receive an audio signal, decompress and decode the audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, amplification and other audio data processing to obtain an audio signal that can be played in the speaker 272.
An audio output interface 270 for receiving the audio signal output by the audio processor 260-2 under the control of the controller 210. The audio output interface may include a speaker 272, or an external sound output terminal 274 for output to a sound-generating device of an external apparatus, such as an external sound terminal or an earphone output terminal.
In other exemplary embodiments, video processor 260-1 may comprise one or more chip components. The audio processor 260-2 may also include one or more chips.
And, in other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or may be integrated in one or more chips with the controller 210.
And a power supply, which, under the control of the controller 210, supplies power to the display apparatus 200 from the power input of an external power source. The power supply may include a built-in power supply circuit installed inside the display apparatus 200, or may be a power supply installed outside the display apparatus 200, such as a power supply interface in the display apparatus 200 for connecting an external power supply.
Similar to the N-chip, as shown in fig. 4, the a-chip may include a controller 310, a communicator 330, a detector 340, and a memory 390. A user input interface, a video processor, an audio processor, a display, an audio output interface may also be included in some embodiments. In some embodiments, there may also be a power supply that independently powers the a-chip.
The communicator 330 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communicator 330 may include a WIFI module 331, a bluetooth communication protocol module 332, a wired ethernet communication protocol module 333, and other network communication protocol modules such as an infrared communication protocol module or a near field communication protocol module.
The communicator 330 of the A chip and the communicator 230 of the N chip also interact with each other. For example, the WiFi module 231 of the N chip is used to connect to an external network and carry out network communication with an external server, etc. The WiFi module 331 of the A chip is used to connect to the WiFi module 231 of the N chip, rather than connecting directly to an external network or the like. Therefore, for the user, a display device as in the above embodiment exposes only one WiFi account to the outside.
The detector 340 is the component through which the A chip of the display device collects signals from the external environment or interacts with the outside. The detector 340 may include a light receiver 342, a sensor for collecting the intensity of ambient light, which can be used to adapt display parameters to changes in ambient light, etc.; it may further include an image collector 341, such as a camera or video camera, which can be used to collect the external environment scene, to collect attributes of the user or gestures used to interact with the user, and to adaptively change display parameters and recognize user gestures, so as to realize the interaction function with the user.
An external device interface 350, which provides a component for data transmission between the controller 310 and the N-chip or other external devices. The external device interface may be connected with an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner.
The controller 310 controls the operation of the display device 200 and responds to the user's operation by running various software control programs stored on the memory 390 (e.g., using installed third party applications, etc.), and interacting with the N-chip.
As shown in fig. 4, the controller 310 includes a read-only memory ROM 313, a random access memory RAM 314, a graphics processor 316, a CPU processor 312, a communication interface 318, and a communication bus. The ROM 313, the RAM 314, the graphics processor 316, the CPU processor 312, and the communication interface 318 are connected via the bus.
The ROM 313 stores instructions for various system boots. The CPU processor 312 executes the system boot instructions in the ROM and copies the operating system stored in the memory 390 to the RAM 314 to begin booting the operating system. After the operating system has started, the CPU processor 312 copies the various application programs in the memory 390 to the RAM 314 and then starts running them.
The CPU processor 312 is used for executing the operating system and application program instructions stored in the memory 390, communicating with the N chip, transmitting and interacting signals, data, instructions, etc., and executing various application programs, data and contents according to various interaction instructions received from the outside, so as to finally display and play various audio and video contents.
The communication interfaces may include a first interface 318-1 through an nth interface 318-n. These interfaces may be network interfaces connected to external devices via a network, or may be network interfaces connected to the N-chip via a network.
The controller 310 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object displayed on the display 280, the controller 310 may perform an operation related to the object selected by the user command.
A graphics processor 316 for generating various graphics objects, such as: icons, operation menus, user input instruction display graphics, and the like. The display device comprises an arithmetic unit which carries out operation by receiving various interactive instructions input by a user and displays various objects according to display attributes. And a renderer for generating various objects based on the arithmetic unit and displaying the rendered result on the display 280.
Both the A-chip graphics processor 316 and the N-chip graphics processor 216 can generate various graphics objects. They differ in which applications they serve: if application 1 is installed on the A-chip and application 2 is installed on the N-chip, then when a user issues a command within the interface of application 1, the A-chip graphics processor 316 generates the graphics object; when a user issues a command within the interface of application 2, the N-chip graphics processor 216 generates the graphics object.
Fig. 5 is a diagram schematically illustrating a functional configuration of a display device according to an exemplary embodiment.
As shown in fig. 5, the memory 390 of the a-chip and the memory 290 of the N-chip are used to store an operating system, an application program, contents, user data, and the like, respectively, and perform system operations for driving the display device 200 and various operations in response to a user under the control of the controller 310 of the a-chip and the controller 210 of the N-chip. The A-chip memory 390 and the N-chip memory 290 may include volatile and/or non-volatile memory.
The memory 290 is specifically configured to store an operating program for driving the controller 210 in the display device 200, and to store various applications installed in the display device 200, various applications downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used to store system software such as an Operating System (OS) kernel, middleware, and applications, and to store input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers and related data such as the video processor 260-1 and the audio processor 260-2, the display 280, the communication interface 230, the tuner demodulator 220, the input/output interface, and the like.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, or functions implemented by other programs (such as the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290 includes, for example, a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a power control module 2910, an operating system 2911, other application programs 2912, a browser module, and so forth. By running these modules, the controller 210 performs functions such as: broadcast television signal receiving and demodulating, television channel selection control, volume selection control, image control, display control, audio control, external instruction recognition, communication control, optical signal receiving, power control, a software control platform supporting the various functions, browser functions, and other functions.
The memory 390 includes a memory storing various software modules for driving and controlling the display apparatus 200. Such as: various software modules stored in memory 390, including: the system comprises a basic module, a detection module, a communication module, a display control module, a browser module, various service modules and the like. Since the functions of the memory 390 and the memory 290 are similar, reference may be made to the memory 290 for relevant points, and thus, detailed description thereof is omitted here.
Illustratively, the memory 390 includes an image control module 3904, an audio control module 2906, an external instruction recognition module 3907, a communication control module 3908, a light receiving module 3909, an operating system 3911, other application programs 3912, a browser module, and the like. By running these modules, the controller performs functions such as: image control, display control, audio control, external instruction recognition, communication control, optical signal receiving, power control, a software control platform supporting the various functions, browser functions, and other functions.
Differently, the external instruction recognition module 2907 of the N-chip and the external instruction recognition module 3907 of the a-chip can recognize different instructions.
Illustratively, since an image receiving device such as a camera is connected to the A-chip, the external instruction recognition module 3907 of the A-chip may include an image recognition module 3907-1, which stores an image database; when the camera receives an external image instruction, the received image is matched against the images in the image database to perform instruction control on the display device. Since the voice receiving device and the remote controller are connected to the N-chip, the external instruction recognition module 2907 of the N-chip may include a voice recognition module 2907-2, which stores a voice database; when the voice receiving device receives an external voice instruction, the received voice is matched against the instructions in the voice database to perform instruction control on the display device. Similarly, a control device 100 such as a remote controller is connected to the N-chip, and a key instruction recognition module performs instruction interaction with the control device 100.
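As an illustration only (nothing below is code disclosed by the patent), the recognition flow just described amounts to matching a recognized camera or voice input against a stored database and mapping it to a control instruction. The Java sketch assumes hypothetical input keys and command names and uses a flat map in place of a real image or voice database.

```java
import java.util.HashMap;
import java.util.Map;

// Simplified, assumption-based sketch of an external instruction recognition module:
// a recognized gesture or voice phrase is looked up in an instruction database and
// mapped to a control command for the display device.
public class ExternalInstructionRecognizer {
    private final Map<String, String> instructionDatabase = new HashMap<>();

    public ExternalInstructionRecognizer() {
        // Illustrative entries only; a real module would hold image or voice templates.
        instructionDatabase.put("gesture:swipe_left", "CHANNEL_DOWN");
        instructionDatabase.put("gesture:swipe_right", "CHANNEL_UP");
        instructionDatabase.put("voice:volume up", "VOLUME_UP");
    }

    /** Returns the control command for a recognized input, or null if it is not in the database. */
    public String recognize(String recognizedInput) {
        return instructionDatabase.get(recognizedInput);
    }
}
```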
A block diagram of a configuration of a software system in a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 6 a.
For an N-chip, as shown in fig. 6a, the operating system 2911, which includes executing operating software for handling various basic system services and for performing hardware related tasks, serves as an intermediary between applications and hardware components for data processing.
In some embodiments, portions of the operating system kernel may contain a series of software to manage the display device hardware resources and provide services to other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application program to achieve accessibility and operability of the application program for displaying content.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
The user interface module 2911-3 is configured to provide an object for displaying a user interface, so that each application program can access the object, and user operability can be achieved.
Control applications 2911-4 for controlling process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911 or within the application 2912. In some embodiments, it is implemented partly within the operating system 2911 and partly within the application 2912. It listens for various user input events and invokes the handlers that perform one or more predefined sets of operations in response to the recognition of the various events or sub-events.
The event monitoring module 2914-1 is configured to monitor an event or a sub-event input by the user input interface.
The event identification module 2914-2 holds the event definitions for the various user input interfaces, identifies the various events or sub-events, and forwards them to the processes that execute their corresponding handler set or sets.
An event or sub-event refers to an input detected by one or more sensors in the display device 200 or an input from an external control device (e.g., the control apparatus 100), such as: various voice input sub-events, gesture input sub-events from gesture recognition, and remote-control key input sub-events from a control device. By way of example, the sub-events from the remote control may take a variety of forms, including but not limited to one or a combination of pressing the up/down/left/right keys or the OK key, holding a key down, and so on, as well as non-physical key operations such as move, hold, and release.
The interface layout management module 2913, directly or indirectly receiving the input events or sub-events from the event transmission system 2914, monitors the input events or sub-events, and updates the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, and the size, position, and level of the container, and performs various operations related to the layout of the interface.
Since the functions of the operating system 3911 of the a chip are similar to those of the operating system 2911 of the N chip, reference may be made to the operating system 2911 for relevant points, and details are not repeated here.
As shown in fig. 6b, the application layer of the display device contains various applications that can be executed at the display device 200.
The N-chip application layer 2912 may include, but is not limited to, one or more applications such as: video on-demand applications, application centers, gaming applications, and the like. The application layer 3912 of the A-chip may include, but is not limited to, one or more applications such as: live television applications, media center applications, and the like. It should be noted that which applications reside on the A-chip and which on the N-chip is determined by the operating system and other design considerations; the present invention does not specifically limit or divide the applications contained on the A-chip and the N-chip.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
A video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides video displays from some storage source. For example, the video on demand may come from a server side of the cloud storage, from a local hard disk storage containing stored video programs.
The media center application program can provide various multimedia playback services. For example, a media center may provide services other than live television or video on demand, allowing a user to access various images or audio through the media center application.
The application program center can provide and store various application programs. The application may be a game, an application, or some other application associated with a computer system or other device that may be run on a display device. The application center may obtain these applications from different sources, store them in local storage, and then be operable on the display device 200.
A schematic diagram of a user interface in a display device 200 according to an exemplary embodiment is illustrated in fig. 7. As shown in fig. 7, the user interface includes a plurality of view display areas, illustratively, a first view display area 201 and a play screen 202, wherein the play screen includes a layout of one or more different items. And a selector in the user interface indicating that the item is selected, the position of the selector being movable by user input to change the selection of a different item.
It should be noted that the multiple view display areas may present display screens of different hierarchies. For example, a first view display area may present video chat project content and a second view display area may present application layer project content (e.g., web page video, VOD presentations, application screens, etc.).
Optionally, different view display areas may be presented with different priorities, and view display areas with different priorities have different display precedence. For example, if the priority of the system layer is higher than that of the application layer, then when the user uses the selector and switches pictures in the application layer, the picture displayed in the system-layer view display area is not blocked; and when the size and position of the application-layer view display area change according to the user's selection, the size and position of the system-layer view display area are not affected.
Display frames of the same hierarchy can also be presented; in this case the selector can switch between the first view display area and the second view display area, and when the size and position of the first view display area change, the size and position of the second view display area change accordingly.
Since the a-chip and the N-chip may have independent operating systems installed therein, there are two independent but interrelated subsystems in the display device 200. For example, Android (Android) and various kinds of APPs can be independently installed on the chip a and the chip N, so that each chip can realize a certain function, and the chip a and the chip N cooperatively realize a certain function.
As shown in fig. 3 or 4, the display device 200 includes a first chip (a chip) and a second chip (N chip), and the first chip and the second chip can be connected, communicated, and powered through a plurality of different types of interfaces. Respectively, the first chip comprises a first controller and various modules connected with the first controller through various interfaces, and the second chip comprises a second controller and various modules connected with the second controller through various interfaces. In particular, the first chip and the second chip are respectively provided with a Bluetooth module, so that the first chip and the second chip both have a Bluetooth function.
In the technical scenario of the present application, the first chip and the second chip run different applications. For convenience of distinction and explanation, the applications run by the first chip are referred to as first applications, and the applications run by the second chip are referred to as second applications.
The first applications are third-party applications, such as game applications like 'dance' and 'violent god of driving'; the second applications include the system setting application and other preset applications, where the other preset applications are typically applications specific to the display device brand, such as social applications like 'hi see' pre-installed in the Hisense television operating system.
The system setting application refers to a utility program that lets the user modify system configuration or functions. For example, the user modifies the network connection in the system setting application, including opening, closing, or changing the connected network; sets system parameters such as sound parameters and display parameters; or manages various applications, for example by uninstalling or disabling them.
Fig. 8 is a simplified schematic diagram of an application scenario of the present application, and as shown in fig. 8, in a short-distance area of the display device 200 of the present application, there are a plurality of control apparatuses having a control function for the display device 200, such as external bluetooth devices, e.g., a bluetooth mouse 801, a bluetooth keyboard 802, a bluetooth speaker 803, and a bluetooth gamepad 804, and further, for example, a remote controller 100A or a control apparatus 100B. Wherein the external bluetooth device is connected with the bluetooth module of the display device to establish a connection with the display device to control it.
Since the first applications are all third-party applications, mainly game applications such as 'dance' and 'violent god of driving', in the scenario shown in fig. 8 external bluetooth devices such as the bluetooth mouse 801, the bluetooth keyboard 802, and the bluetooth gamepad 804 are connected to the first chip, and the data these bluetooth devices send to the display device is transmitted to the first chip through bluetooth technology. For convenience of distinction and explanation, the control devices with control functions connected to the first chip are collectively referred to as the first control device.
Exemplarily, fig. 9a is a schematic view of an interaction scene between a first control apparatus and a display device (a first chip), in fig. 9a, a display of the display device presents a game screen, a bluetooth mouse 801 and a bluetooth keyboard 802 are connected to the first chip of the display device, after a user presses a key on the bluetooth mouse 801 or the bluetooth keyboard 802, the bluetooth mouse 801 or the bluetooth keyboard 802 transmits key input data (key-EVENT data) to the first chip through an established communication channel, and after receiving the key input data, the first chip executes corresponding control logic, thereby implementing a game function on the display device.
Since hardware such as the display, the audio output interface, and the power supply is connected to the second chip, and the second applications mainly comprise the system setting application and other preset system applications, in the scenario shown in fig. 8 the remote controller 100A, the control apparatus 100B, and the bluetooth speaker 803 are connected to the second chip. For convenience of distinction and explanation, the control devices with control functions connected to the second chip are collectively referred to as the second control device.
Exemplarily, fig. 9b is a schematic view of an interaction scene between a second control apparatus and a display device (a second chip), in fig. 9b, a display of the display device presents a system setting interface, a remote controller 100A is connected to the second chip, after a user presses a key on the remote controller 100A, the remote controller 100A transmits key input data (key-EVENT data) to the second chip through an established communication channel, and after receiving the key input data, the second chip executes corresponding control logic, thereby implementing selection and determination of a control provided by the system setting interface.
In one usage scenario, if the user leaves the display device, the display device will not receive key input for a long time, which leaves the display screen in a static state for a long time; this shortens the service life of the display screen and wastes electric energy. To prolong the service life of the display screen and save electric energy, a screen saver application can be installed on the display device: it automatically enters the screen saver state after the screen has been static for a certain time and automatically exits the screen saver state when the screen needs to be active, thereby protecting the screen and saving electric energy.
Illustratively, fig. 10 is a schematic diagram of a screen saver state of a display screen, in which the display no longer presents a static user interface, but presents multimedia screen saver information provided by a screen saver application, and the multimedia screen saver information is generally dynamic information, so as to achieve the purpose of screen saver. For example, as shown in fig. 10, a plurality of pictures are displayed on the screen in a scrolling manner, and each picture is displayed on the screen for the same time.
In the above usage scenario, if the user returns to the display device and operates the display device using the remote control, the state of the screen saver shown in fig. 10 will immediately exit, and the display will not present the multimedia screen saver information, but rather a user interface.
Optionally, the user may select the multimedia screen saver information provided by the screen saver application in the screen saver state according to the preference of the user.
However, in the above scenario, although the screen saver application can prevent damage to the display screen from the screen remaining stationary for a long time, it does not by itself solve the screen saver problem of a dual-chip display device.
Specifically, since the dual-chip display device contains two chips and both chips can receive key input (for example, the first chip can receive key input from the first control device and the second chip can receive key input from the second control device), if the screen saver application is installed on the second chip (or the first chip), the screen saver application can only respond to key input on the second chip (or the first chip) to enter or exit the screen saver state; when a key input occurs on the other chip, the screen saver application does not respond, which degrades the user experience.
For example, if the screen saver application is installed on the second chip, then while the display screen is in the screen saver state, i.e., presenting the multimedia screen saver information, the screen saver application cannot exit the screen saver state in response to the user operating the display device through the first control means, and vice versa.
For another example, if the screen saver application is installed on the second chip, then while the display screen is in the non-screen-saver state, i.e., presenting the user interface, the user cannot make the screen saver application restart the countdown for entering the screen saver state by operating the display device through the first control means.
To solve the above problem, embodiments of the present application provide a display device. Fig. 11 is a schematic diagram of the hardware structure of the display device. As shown in fig. 11, the display device may include: a display 111, a second chip 112, and a first chip 113, wherein the second chip 112 is communicatively connected with the display 111, and the first chip 113 is communicatively connected with the second chip 112.
As shown in fig. 12, the display device is configured to perform the steps of:
In step 121, the display presents a user interface or the multimedia screen saver information. When the content presented by the display is the multimedia screen saver information, the screen state is the screen saver state; when the content presented by the display is the user interface, the screen state is the non-screen-saver state. That the display described in step 121 presents either the user interface or the multimedia screen saver information indicates that the screen state of the display switches back and forth between the screen saver state and the non-screen-saver state.
Optionally, in the screen saver state, the multimedia screen saver information is displayed on the user interface in a floating manner, and at this time, the layer data corresponding to the multimedia screen saver information is overlaid on the layer data corresponding to the user interface. According to different content of the multimedia screen saver information, the transparency of layer data corresponding to the multimedia screen saver information can be smaller than 100%, so that a user interface can realize fuzzy display through a layer corresponding to the multimedia screen saver information.
Alternatively, in the screen saver state, the multimedia screen saver information is displayed in a specific view display area among the screen display areas. Illustratively, the multimedia screen saver information is displayed in the middle area of the screen display area as shown in fig. 13a, or in the upper right corner area of the screen display area as shown in fig. 13 b.
In this embodiment, the user interface does not include dynamic pictures, such as videos and animations played by the display device, and dynamic progress bars displayed when playing audio, downloading files, searching signals, and the like. Of course, in further embodiments, the user interface may include the aforementioned dynamic screen.
It should be noted that in this embodiment the screen saver application is installed on the second chip; it is generally a preset application of the display device and does not need to be downloaded and installed by the user. Of course, in another embodiment the screen saver application may be installed on the first chip; in such an embodiment, the steps performed by the first chip and the second chip only need to be exchanged and slightly adjusted based on the technical idea of this embodiment to solve the technical problem addressed by the present application, and such an embodiment therefore also falls within the protection scope of the technical solution of the present application.
And step 122, when the first chip receives the key input, sending first information to the second chip according to a preset rule.
The key input received by the first chip is the key input of the first control device, for example, the interaction scenario shown in fig. 9 a.
The first information is generated and sent to the second chip according to a preset rule when the first chip receives a key input. It prompts the second chip that the first chip has currently received a key input, so that the second chip can restart the screen saver countdown in response to the first information, or cancel the multimedia screen saver information being presented, thereby avoiding the situation where key input on the first chip gets no response.
The preset rule is the rule by which the first chip sends the first information. For example, starting from the moment the key input is received (and including that moment), the first information is sent once every 500 ms until a response message to the first information is received from the second chip. This ensures that the second chip is certain to receive the first information and avoids the case where a communication problem causes the first chip's transmission or the second chip's reception to fail.
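A minimal Java sketch of this resend rule is given below, assuming a hypothetical InterChipChannel transport and the 500 ms interval mentioned above; the class and method names are illustrative, not taken from the patent.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;

// Sketch of the first chip's preset rule: on a key input, send the first information
// immediately and then every 500 ms until the second chip's response message arrives,
// so that a single lost message cannot leave the key input unanswered.
public class FirstInfoSender {
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
    private final AtomicReference<ScheduledFuture<?>> pending = new AtomicReference<>();
    private final InterChipChannel channel;   // assumed transport toward the second chip

    public FirstInfoSender(InterChipChannel channel) {
        this.channel = channel;
    }

    /** Called when a key input arrives from the first control device. */
    public void onKeyInput(long keyTimestampMs) {
        cancelPending();
        ScheduledFuture<?> task = scheduler.scheduleAtFixedRate(
                () -> channel.sendFirstInformation(keyTimestampMs),
                0, 500, TimeUnit.MILLISECONDS);
        pending.set(task);
    }

    /** Called when the second chip's response message to the first information is received. */
    public void onResponseReceived() {
        cancelPending();
    }

    private void cancelPending() {
        ScheduledFuture<?> task = pending.getAndSet(null);
        if (task != null) {
            task.cancel(false);
        }
    }

    /** Assumed interface over whatever link joins the two chips. */
    public interface InterChipChannel {
        void sendFirstInformation(long keyTimestampMs);
    }
}
```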
The key input received by the second chip is the key input of the second control device, for example, the interaction scenario shown in fig. 9 b.
When the second chip receives a key input or the first information, it restarts the screen saver countdown for the key input or first information received this time, i.e., it restarts the countdown from the time the key input was received, from the time the first information was received, or from the time, carried in the first information, at which the first chip received the key input. At the same time, the second chip checks the content currently presented by the display, i.e., the current screen state; if the content presented by the display is the multimedia screen saver information, the multimedia screen saver information is cancelled so that the display presents the user interface.
In a specific implementation, when the second chip receives a key input or the first information, it generates a command instructing the screen saver application to restart the screen saver countdown and sends it to the screen saver application, and the screen saver application restarts the screen saver countdown in response to the command. If the content presented by the display is the multimedia screen saver information, the second chip also generates an instruction directing the screen saver application to cancel the multimedia screen saver information and sends it to the screen saver application, which responds by ceasing to display the multimedia screen saver information on the display.
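The following Java sketch illustrates the second chip's side under the same caveat: the countdown length, the scheduling mechanism, and the show/hide hooks are assumptions standing in for the screen saver application's own interfaces.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

// Sketch of the second chip's handling: any key input or received first information
// restarts the screen saver countdown and dismisses the screen saver if it is showing.
public class ScreenSaverController {
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
    private final long countdownSeconds;   // user-adjustable countdown length
    private ScheduledFuture<?> countdown;
    private boolean screenSaverShowing;

    public ScreenSaverController(long countdownSeconds) {
        this.countdownSeconds = countdownSeconds;
    }

    /** Called for a key input on the second chip or for first information from the first chip. */
    public synchronized void onActivity() {
        if (screenSaverShowing) {
            hideScreenSaver();
        }
        restartCountdown();
    }

    private void restartCountdown() {
        if (countdown != null) {
            countdown.cancel(false);
        }
        countdown = scheduler.schedule(this::showScreenSaver, countdownSeconds, TimeUnit.SECONDS);
    }

    private synchronized void showScreenSaver() {
        screenSaverShowing = true;
        // Here the screen saver application would present the multimedia screen saver information.
    }

    private synchronized void hideScreenSaver() {
        screenSaverShowing = false;
        // Here the screen saver application would cancel the multimedia screen saver information
        // so that the display presents the user interface again.
    }
}
```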
Because the second chip restarts the timing when it receives a key input or the first information, and cancels the screen saver when one is being displayed, the second chip can respond both to key input on the second chip and to key input on the first chip, which avoids an abnormal screen saver countdown and a screen saver that fails to exit after the user presses a key.
It should be noted that the screen saver countdown function related to the present application can be implemented by a screen saver application having a clock function, and the countdown length can be set and adjusted by a user.
As can be seen from the above embodiments, in the display device provided by the present application, the display may present a user interface or the multimedia screen saver information provided by the screen saver application; when the first chip receives a key input, it sends the first information to the second chip according to a preset rule to prompt the second chip that the first chip has received a key input; when the second chip receives a key input or the first information, it restarts the screen saver countdown for that key input or first information, and if the content presented by the display is the multimedia screen saver information, it also cancels the multimedia screen saver information so that the display presents the user interface; and when the screen saver countdown executed by the second chip completes, the multimedia screen saver information is presented on the display.
Because the second chip restarts the timing when it receives a key input or the first information, and cancels the screen saver when one is being displayed, the second chip can respond both to key input on the second chip and to key input on the first chip, which avoids an abnormal screen saver countdown and a screen saver that fails to exit after the user presses a key.
In the scenario shown in fig. 9a, a first control device such as a bluetooth keyboard or bluetooth mouse interacts with the first chip; this is typically a game scenario. In a game scenario the user usually operates continuously, which means that the first control device sends key input data to the first chip at a high frequency. If the first chip had to generate and send the first information to the second chip every time it received a key input, this would consume a large amount of first-chip resources; meanwhile, since the second chip would have to receive each first information and execute the corresponding processing, second-chip resources would also be heavily consumed, and because the send and receive actions would be so frequent, transmission or reception failures would easily occur.
In order to save chip resources and avoid the problem of failure in transceiving information between the first chip and the second chip, fig. 14 shows another embodiment of the display device of the present application. In the display device, the display, the first chip and the second chip are respectively configured to perform the following steps:
In step 141, after the display device is powered on, the display presents a user interface.
As a possible implementation, the prompt message sent by the second chip to the first chip includes an identifier of the content presented by the display: if the multimedia screen saver information is presented, the identifier is a first identifier, and if the user interface is presented, the identifier is a second identifier. The first chip determines what the display is presenting by parsing the identifier in the prompt message.
Specifically, the second chip sends a prompt message to the first chip after it starts (or restarts) the screen saver countdown and after it presents the multimedia screen saver information on the display, and thereafter sends the prompt message every 500 ms until it receives the feedback message sent by the first chip.
And step 143, the first chip receives the prompt message sent by the second chip, and sends a feedback message for the prompt message to the second chip.
In step 144, the first chip determines the content presented by the display according to the received prompt message, if the content presented by the display is the user interface, step 145 is executed, and if the content presented by the display is the multimedia screen saver information, step 146 is executed.
In step 146, the first chip records the screen status of the display as a screen saver status.
Meanwhile, in step 147, the first chip monitors whether the key input sent by the first control device is received, if so, step 148 is executed, otherwise, no action is taken.
In step 148, the first chip determines the recorded screen status, and if the screen status is the screen saver status, step 149 is executed, and if the screen saver status is not the screen saver status, step 150 is executed.
Step 149, the first chip generates the first information and sends the generated first information to the second chip at preset intervals until receiving a response message of the second chip to the first information.
Optionally, the first chip sends the first information once every 500ms until receiving the response message of the second chip.
Optionally, the first chip obtains the time of the last transmission of the first information from the operation log, and then judges, based on the time the key input is received, whether a preset time has elapsed since the last transmission. If so, the first chip immediately sends the first information to the second chip; if not, it sends the first information to the second chip when the preset time is reached. Thereafter, it sends the first information once every preset time until it receives the second chip's response message to the first information. The preset time may be, for example, 5 s.
Alternatively, the first chip may obtain the timestamp corresponding to the last received key input, use it to determine the time of the last transmission of the first information, and judge, according to the timestamp of the key input received this time, whether the preset time since the last transmission has been reached or exceeded.
It can be seen that, when the screen of the display is in the non-screen-saver state, the first chip can keep the frequency of sending the first information within a reasonable range determined by the preset time. As long as the preset time is shorter than the screen saver countdown, the second chip is guaranteed to restart the screen saver countdown upon receiving the first information before the screen enters the screen saver state, so the countdown does not behave abnormally.
Compared with the mode that the first chip sends the first information to the second chip every time the first chip receives the key input, the implementation mode can effectively reduce the frequency of sending the first information, so that the resources of the first chip and the second chip can be saved, and the normal countdown of the screen saver can be ensured.
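As an assumption-based sketch of this throttling rule, the class below sends the first information at most once per preset interval while the screen is in the non-screen-saver state, deferring the send when a key input arrives too soon after the previous one; the 5 s value and all names are illustrative.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

// Sketch of the non-screen-saver throttling: send immediately if the preset interval has
// elapsed since the last send, otherwise defer the send until that interval is reached.
public class ThrottledFirstInfoSender {
    private static final long PRESET_INTERVAL_MS = 5_000;   // assumed preset time
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
    private final Runnable sendFirstInformation;             // assumed hook into the inter-chip channel
    private long lastSentMs = Long.MIN_VALUE / 2;            // "never sent" sentinel
    private ScheduledFuture<?> deferred;

    public ThrottledFirstInfoSender(Runnable sendFirstInformation) {
        this.sendFirstInformation = sendFirstInformation;
    }

    /** Called for each key input while the recorded screen state is the non-screen-saver state. */
    public synchronized void onKeyInput() {
        long now = System.currentTimeMillis();
        long elapsed = now - lastSentMs;
        if (elapsed >= PRESET_INTERVAL_MS) {
            sendNow();
        } else if (deferred == null) {
            deferred = scheduler.schedule(this::sendNow, PRESET_INTERVAL_MS - elapsed, TimeUnit.MILLISECONDS);
        }
    }

    private synchronized void sendNow() {
        lastSentMs = System.currentTimeMillis();
        deferred = null;
        sendFirstInformation.run();
    }
}
```

Keeping the preset interval shorter than the screen saver countdown preserves the guarantee discussed above: the second chip always receives at least one first information before its countdown can expire.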
While the first chip executes steps 143 and 150, in step 151 the second chip monitors whether it has received a key input sent by the second control device or the first information sent by the first chip; if so, execution jumps to step 142 and step 152 is executed, otherwise no action is taken. If what is received in step 151 is the first information, step 154 is also executed.
In step 154, the second chip sends a response message for the first information to the first chip.
As can be seen from the embodiment shown in fig. 14, the display device provided by the present embodiment has at least the following beneficial effects:
on one hand, because the second chip restarts the timing when it receives a key input or the first information, and cancels the screen saver when one is being displayed, the second chip can respond both to key input on the second chip and to key input on the first chip, which avoids an abnormal screen saver countdown and a screen saver that fails to exit after the user presses a key.
On the other hand, under the condition that the screen of the display is in a non-screen saver state, the first chip can control the frequency of sending the first information within a reasonable range determined according to preset time, so that the frequency of sending the first information is effectively reduced, the resources of the first chip and the second chip are saved, and the normal countdown of the screen saver can be ensured.
It should be noted that, in the above embodiments, there are many methods for implementing communication between the first chip and the second chip, and one possible implementation manner provided by the present application is as follows:
the first chip is also configured to run a communication application (an RPC APP) based on inter-process communication technology. The communication application is used to send the first information to the second chip and to receive the response message to the first information sent by the second chip; it is also used to receive the prompt message sent by the second chip and to send the feedback message for the prompt message to the second chip.
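The patent does not disclose the transport behind the RPC APP, so the sketch below simply assumes line-delimited text messages over a local socket to show the shape of the exchange (first information, response, prompt, feedback); the host, port, and message strings are invented for illustration.

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.Closeable;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Assumption-based sketch of a first-chip communication application: it sends the first
// information and feedback messages and reads the second chip's response and prompt messages.
public class InterChipRpcClient implements Closeable {
    private final Socket socket;
    private final BufferedWriter out;
    private final BufferedReader in;

    public InterChipRpcClient(String host, int port) throws IOException {
        socket = new Socket(host, port);   // assumed address of the peer chip's RPC endpoint
        out = new BufferedWriter(new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8));
        in = new BufferedReader(new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8));
    }

    public void sendFirstInformation(long keyTimestampMs) throws IOException {
        send("FIRST_INFO " + keyTimestampMs);
    }

    public void sendFeedback() throws IOException {
        send("FEEDBACK");
    }

    /** Blocks until the next message from the second chip, e.g. "RESPONSE" or "PROMPT 2". */
    public String readMessage() throws IOException {
        return in.readLine();
    }

    private void send(String message) throws IOException {
        out.write(message);
        out.newLine();
        out.flush();
    }

    @Override
    public void close() throws IOException {
        socket.close();
    }
}
```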
In the usage scenario of the display device provided by the present application, the following scenario may occur:
exemplary scenario 1: neither the first chip nor the second chip receives any key input for a long time because the user is watching a video program;
Exemplary scenario 2: neither the first chip nor the second chip receives any key input for a long time because the user is listening to an audio program.
In both exemplary scenarios, the display screen is actually active, although the display device has been free of key inputs for a long time. Based on this, in order to avoid entering a screen saver state when the display screen is in an active state, in the display device provided in the embodiment of the present application, the second chip is configured to:
when key input or first information sent by the first chip is received, if the content presented by the display is the multimedia screen saver information, or the content presented by the display is the user interface and the user interface does not include a dynamic picture, executing screen saver countdown aiming at the received key input or the first information; if the content presented by the display is a user interface and the user interface includes a dynamic picture, then no screensaver countdown is performed.
With this implementation, when the second chip receives a key input or the first information, the countdown is restarted if the content presented by the display is the user interface and the user interface does not include a dynamic picture, and the screen saver countdown is not executed if the user interface includes a dynamic picture. This avoids entering the screen saver state in situations where the user is watching a video or listening to audio, i.e., when the display device is active but receives no key input.
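A compact way to express this condition, again as an assumption rather than the patent's own code, is a small policy check consulted before restarting the countdown:

```java
// Sketch of the dynamic-picture rule: activity restarts the countdown only when the display
// shows the screen saver or a user interface without a dynamic picture. The DisplayContent
// values are assumed placeholders for however the second chip tracks what is on screen.
public final class DynamicPictureAwarePolicy {
    public enum DisplayContent { SCREEN_SAVER, UI_STATIC, UI_WITH_DYNAMIC_PICTURE }

    /** Returns true if the screen saver countdown should be (re)started for this activity. */
    public static boolean shouldRestartCountdown(DisplayContent content) {
        return content != DisplayContent.UI_WITH_DYNAMIC_PICTURE;
    }

    private DynamicPictureAwarePolicy() {
    }
}
```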
As an alternative to the foregoing implementation, the prompt message sent by the second chip to the first chip includes an identifier of the content presented by the display: a first identifier if the multimedia screen saver information is presented, a second identifier if the user interface is presented and does not include a dynamic picture, and a third identifier if the user interface is presented and includes a dynamic picture.
Furthermore, the first chip can determine from the identifier in the prompt message whether the displayed user interface includes a dynamic picture. If it does, the first chip does not send the first information to the second chip when a key input is received; if it does not, the first information is sent to the second chip when a key input is received. This further saves resources of the first chip and the second chip.
Specifically, in this alternative, the first chip is configured to:
receiving a prompt message sent by a second chip;
determining content presented by the display according to the prompt message;
if the content presented by the display is judged to be the multimedia screen saver information, recording the screen state as the screen saver state;
if the content presented by the display is judged to be the user interface and the user interface does not comprise the dynamic picture, recording that the screen state is a non-screen saver state;
if the content presented by the display is judged to be the user interface and the user interface comprises the dynamic picture, recording that the screen state is an active state;
when key input is received, if the recorded screen state is a screen saver state, first information is sent to the second chip at preset intervals until a response message sent by the second chip is received;
if the recorded screen state is a non-screen saver state, transmitting the first information to the second chip when the time from the last transmission of the first information reaches the preset time; and sending first information to the second chip every other preset time until receiving a response message sent by the second chip.
If the recorded screen status is active, the first information is not transmitted.
With this alternative, entering the screen saver state can be avoided when the user is watching a video or listening to audio, i.e., when the display screen is active but the display device receives no key input; moreover, the frequency at which the first chip sends the first information can be further reduced, thereby saving chip resources.
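Tying the pieces together, the sketch below shows one possible bookkeeping on the first chip for this alternative: the prompt identifier sets a recorded screen state, and each key input consults that state to decide how (or whether) to send the first information. It reuses the FirstInfoSender and ThrottledFirstInfoSender sketches above; the identifier values are assumptions.

```java
// Assumption-based sketch of the first chip's recorded screen state in this alternative.
public class FirstChipScreenState {
    public enum State { SCREEN_SAVER, NON_SCREEN_SAVER, ACTIVE }

    private volatile State state = State.NON_SCREEN_SAVER;

    /** Identifier from the prompt message: 1 = screen saver, 2 = UI without dynamic picture, 3 = UI with dynamic picture. */
    public void onPromptMessage(int identifier) {
        switch (identifier) {
            case 1: state = State.SCREEN_SAVER; break;
            case 2: state = State.NON_SCREEN_SAVER; break;
            case 3: state = State.ACTIVE; break;
            default: break;   // ignore unknown identifiers
        }
    }

    /** Decides how to react to a key input from the first control device. */
    public void onKeyInput(FirstInfoSender retrySender, ThrottledFirstInfoSender throttledSender) {
        switch (state) {
            case SCREEN_SAVER:
                // Send immediately, then every 500 ms until the second chip responds.
                retrySender.onKeyInput(System.currentTimeMillis());
                break;
            case NON_SCREEN_SAVER:
                // Send at most once per preset interval (see the throttling sketch above).
                throttledSender.onKeyInput();
                break;
            case ACTIVE:
                // The user interface contains a dynamic picture: do not send the first information.
                break;
        }
    }
}
```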
In further embodiments, the user may set a blacklist and a whitelist, where the applications the user adds to the two lists differ from each other.
After the display device is turned on, the first chip and the second chip each monitor the foreground application they are running. When the foreground application is a whitelist application, the first chip and the second chip execute the logic shown in fig. 12 or fig. 14 for presenting the multimedia screen saver information and executing the screen saver countdown. When the foreground application is a blacklist application, the first chip and the second chip do not execute the logic of presenting the multimedia screen saver information and executing the screen saver countdown.
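A brief, hedged sketch of this gate follows: the screen saver logic runs only when the current foreground package passes the user-configured lists; the package names and list semantics are assumptions.

```java
import java.util.Set;

// Sketch of the whitelist/blacklist check applied before running the screen saver logic.
public class ForegroundPolicy {
    private final Set<String> whitelist;
    private final Set<String> blacklist;

    public ForegroundPolicy(Set<String> whitelist, Set<String> blacklist) {
        this.whitelist = whitelist;
        this.blacklist = blacklist;
    }

    /** Returns true if the screen saver countdown and presentation logic should run. */
    public boolean screenSaverLogicEnabled(String foregroundPackage) {
        if (blacklist.contains(foregroundPackage)) {
            return false;
        }
        return whitelist.contains(foregroundPackage);
    }
}
```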
In summary, the embodiment of the present application provides a display device, which includes a display, a first chip, and a second chip. The display may present a user interface or the multimedia screen saver information provided by the screen saver application; when the first chip receives a key input, it sends the first information to the second chip according to a preset rule to prompt the second chip that the first chip has received a key input; when the second chip receives a key input or the first information, it restarts the screen saver countdown for that key input or first information, and if the content presented by the display is the multimedia screen saver information, it also cancels the multimedia screen saver information so that the display presents the user interface; when the screen saver countdown executed by the second chip completes, the multimedia screen saver information is presented on the display.
Because the second chip restarts the timing when it receives a key input or the first information, and cancels the screen saver when one is being displayed, the second chip can respond both to key input on the second chip and to key input on the first chip, which avoids an abnormal screen saver countdown and a screen saver that fails to exit after the user presses a key.
In addition, when the preset rule for the first chip to send the first information is the rule described in the embodiment shown in fig. 14, the phenomena of abnormal screen saver countdown and no screen saver exit after a user presses a key can be avoided, the frequency of sending the first information by the first chip can be effectively reduced, chip resources are saved, and the method is particularly suitable for a game scene of continuous operation of the user.
Based on the display device provided by the application, the embodiment of the application also provides a method for presenting the multimedia screen saver information by using the dual-chip display device. As shown in fig. 15, the method may include:
It should be noted that the method of the present application further includes other steps executed by the first chip and the second chip in the display device embodiment of the present application, and specific contents may refer to the display device embodiment, and details are not described here again.
All other embodiments that a person skilled in the art can derive from the exemplary embodiments shown in the present application without inventive effort shall fall within the scope of protection of the present application. Moreover, while the disclosure herein has been presented in terms of an exemplary embodiment or embodiments, it is to be understood that each aspect of the disclosure can also be implemented separately as an individual embodiment or step.
It should be understood that the terms "first," "second," "third," and the like in the description and in the claims of the present application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances and can be implemented in sequences other than those illustrated or otherwise described herein with respect to the embodiments of the application, for example.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
In a specific implementation, the present invention further provides a computer storage medium, wherein the computer storage medium may store a program, and the program may include some or all of the steps of the embodiments of the method provided by the present invention when executed. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM) or a Random Access Memory (RAM).
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be embodied in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, and the like, and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the method according to the embodiments or some parts of the embodiments of the present invention.
The same and similar parts in the various embodiments in this specification may be referred to each other. In particular, as for the apparatus and device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference may be made to the description in the method embodiments for relevant points.
Claims (9)
1. A display device, comprising:
a display configured to present a user interface or multimedia screen saver information, the multimedia screen saver information provided by a screen saver application running on a second chip;
a first chip in communication with the second chip configured to:
when receiving a key input of a control device connected with the first chip, sending first information to the second chip according to a preset rule;
the second chip is configured to:
when receiving a key input of a control device connected with the second chip or first information sent by the first chip, executing screen saver countdown aiming at the key input or the first information of the control device connected with the second chip; and if the content presented by the display is the multimedia screen saver information, canceling the multimedia screen saver information to enable the display to present a user interface;
when the executed screen saver countdown is complete, then multimedia screen saver information is presented on the display.
2. The display device of claim 1, wherein the second chip is further configured to:
after the multimedia screen saver information is cancelled and the multimedia screen saver information is presented on the display, sending a prompt message to the first chip at a preset interval until a feedback message of the first chip for the prompt message is received, wherein the prompt message is used for notifying the first chip of the content presented by the display.
3. The display device according to claim 1, wherein the first chip is further configured to:
when a prompt message sent by the second chip is received, sending a feedback message for the prompt message to the second chip, wherein the prompt message is used for notifying the first chip of the content presented by the display;
if the content presented by the display is judged to be the multimedia screen saver information according to the prompt message, recording the screen state as the screen saver state;
and if the content presented by the display is judged to be the user interface according to the prompt message, recording that the screen state is a non-screen saver state.
4. The display device according to claim 1, wherein when the first chip receives a key input, sending first information to the second chip according to a preset rule, specifically comprises:
judging the recorded screen state when receiving key input;
if the recorded screen state is a screen saver state, sending the first information to the second chip;
and if the recorded screen state is a non-screen saver state, transmitting the first information to the second chip when the time for transmitting the first information last time reaches preset time.
5. The display device according to claim 4, wherein if the recorded screen status is a screen saver status, the sending, by the first chip, the first information to the second chip specifically includes:
sending the first information to the second chip at preset intervals until a response message of the second chip to the first information is received;
after the first chip sends the first information to the second chip again, the method further includes:
and sending the first information to the second chip once every preset time until receiving a response message of the second chip to the first information.
6. The display device of claim 1, wherein the second chip is further configured to:
after receiving first information sent by a first chip, sending a response message to the first information to the first chip.
7. The display device according to claim 1, wherein the first chip is further configured to:
running a communication application based on an interprocess communication technology, wherein the communication application is used for sending first information to the second chip and receiving a response message for the first information sent by the second chip;
and the chip is also used for receiving a prompt message sent by the second chip and sending a feedback message for the prompt message to the second chip.
8. The display device of claim 1, wherein the second chip is configured to:
when a key input or the first information sent by the first chip is received, if the content presented by the display is the multimedia screen saver information, or if the content presented by the display is the user interface and the user interface does not include a dynamic picture, executing the screen saver countdown in response to the received key input or first information;
and if the content presented by the display is the user interface and the user interface includes a dynamic picture, not executing the screen saver countdown.
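Claim 8 adds one guard: the countdown is only armed when the screen saver itself is shown, or when the user interface contains no dynamic picture (for example, no playing video). A trivial sketch of that predicate, with illustrative parameter names:

```python
# Sketch of the claim 8 condition for starting the screen saver countdown.
def should_run_countdown(showing_screen_saver: bool,
                         ui_has_dynamic_picture: bool) -> bool:
    if showing_screen_saver:
        return True                       # screen saver shown: always count down
    return not ui_has_dynamic_picture     # UI shown: only if no dynamic picture
```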
9. A method for presenting multimedia screen saver information on a dual-chip display device, characterized by comprising the following steps:
presenting a user interface or multimedia screen saver information on a display of the display device, the multimedia screen saver information being provided by a screen saver application running on a second chip;
when the second chip receives a key input from a control device connected to the second chip, or first information sent by a first chip, executing a screen saver countdown in response to the key input or the first information; and if the content presented by the display is the multimedia screen saver information, canceling the multimedia screen saver information so that the display presents a user interface;
when the screen saver countdown is completed, presenting the multimedia screen saver information on the display; wherein the first information is sent by the first chip according to a preset rule when the first chip receives a key input from a control device connected to the first chip.
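Putting the pieces together, the method of claim 9 reduces to wiring both input sources on the second chip into the same countdown handler. A hypothetical event loop, reusing the illustrative objects sketched above (the dict-based events and the `controller` interface are assumptions, not the patent's implementation):

```python
# Sketch of the claim 9 method on the second chip: local key input and the
# first information from the first chip both feed the same countdown handler.
import queue

def second_chip_event_loop(events: "queue.Queue", controller) -> None:
    """`controller` is any object exposing on_input(), e.g. the
    ScreenSaverController sketched after claim 1."""
    while True:
        event = events.get()              # blocks until key input or first information
        if event.get("type") in ("key_input", "first_information"):
            controller.on_input()         # cancel screen saver / restart countdown
```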
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910795660.6A CN110708581B (en) | 2019-08-27 | 2019-08-27 | Display device and method for presenting multimedia screen saver information |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110708581A CN110708581A (en) | 2020-01-17 |
CN110708581B true CN110708581B (en) | 2021-09-24 |
Family
ID=69193607
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910795660.6A Active CN110708581B (en) | 2019-08-27 | 2019-08-27 | Display device and method for presenting multimedia screen saver information |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110708581B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DK180684B1 (en) * | 2019-09-09 | 2021-11-25 | Apple Inc | Techniques for managing display usage |
CN113495711B (en) * | 2020-03-19 | 2023-05-05 | 聚好看科技股份有限公司 | Display apparatus and display method |
CN111586481B (en) * | 2020-05-06 | 2022-06-14 | 海信视像科技股份有限公司 | Terminal and application processing method |
CN111836115B (en) * | 2020-07-02 | 2021-12-24 | 海信视像科技股份有限公司 | Screen saver display method, screen saver skipping method and display device |
CN113301419A (en) * | 2021-05-14 | 2021-08-24 | 海信视像科技股份有限公司 | Display device and display control method for preventing screen burn-in |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8592697B2 (en) * | 2008-09-10 | 2013-11-26 | Apple Inc. | Single-chip multi-stimulus sensor controller |
- 2019-08-27 CN CN201910795660.6A patent/CN110708581B/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2546825A4 (en) * | 2010-11-04 | 2014-07-30 | Zte Corp | Method, device and terminal for identifying lcd screen |
CN108021456A (en) * | 2016-11-04 | 2018-05-11 | 阿里巴巴集团控股有限公司 | touch event processing method, device and operating system |
CN109669782A (en) * | 2017-10-13 | 2019-04-23 | 阿里巴巴集团控股有限公司 | Hardware abstraction layer multiplexing method, device, operating system and equipment |
CN110058905A (en) * | 2018-01-18 | 2019-07-26 | 阿里巴巴集团控股有限公司 | Event handling and operating system management method, apparatus, equipment and storage medium |
CN108628650A (en) * | 2018-03-20 | 2018-10-09 | 广州视源电子科技股份有限公司 | Touch event processing method and device and intelligent interaction equipment |
CN109388431A (en) * | 2018-09-19 | 2019-02-26 | 深圳创维汽车智能有限公司 | Screen awakening method, device and the storage medium of onboard system |
Non-Patent Citations (1)
Title |
---|
Improvement of a Server Screen Saver Elimination Scheme (服务器消除屏保方案改进); Hu Wenyu, Zhang Dapeng, Jiang Shengchao; Electronic World (《电子世界》); 2013-12-15 (No. 23); full text *
Also Published As
Publication number | Publication date |
---|---|
CN110708581A (en) | 2020-01-17 |
Similar Documents
Publication | Title |
---|---|
CN113330736B (en) | Display and image processing method |
CN110708581B (en) | Display device and method for presenting multimedia screen saver information |
CN112399213B (en) | Display device and remote controller key multiplexing method |
CN112068741B (en) | Display device and display method for Bluetooth switch state of display device |
CN112399232A (en) | Display equipment, camera priority use control method and device |
CN112399243A (en) | Playing method and display device |
CN112068987A (en) | Method and device for rapidly restoring factory settings |
CN112463267B (en) | Method for presenting screen saver information on display device screen and display device |
CN112073795B (en) | Video data processing method and device and display equipment |
CN112399233A (en) | Display device and position self-adaptive adjusting method of video chat window |
CN112073813B (en) | Display device and method for detecting and processing abnormal starting between two systems |
CN112073790B (en) | Display device and method for synchronizing starting states between two systems |
CN113141528B (en) | Display device, boot animation playing method and storage medium |
CN112073769A (en) | Display device and method for applying common display |
CN112073666B (en) | Power supply control method of display equipment and display equipment |
CN112073812B (en) | Application management method on smart television and display device |
CN112073776B (en) | Voice control method and display device |
WO2021169125A1 | Display device and control method |
CN112073808B (en) | Color space switching method and display device |
CN112399245A (en) | Playing method and display device |
CN112073773A (en) | Screen interaction method and device and display equipment |
CN112073816A (en) | Dual-system USB upgrading method and device and display equipment |
CN112073779B (en) | Display device and fault-tolerant method for key transmission |
CN112995113B (en) | Display device, port control method and storage medium |
CN112995762B (en) | Display device and network state synchronization method |
Legal Events
Code | Title | Description |
---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
CB02 | Change of applicant information | Address after: 266555 Qingdao economic and Technological Development Zone, Shandong, Hong Kong Road, No. 218; Applicant after: Hisense Visual Technology Co., Ltd. Address before: 266555 Qingdao economic and Technological Development Zone, Shandong, Hong Kong Road, No. 218; Applicant before: QINGDAO HISENSE ELECTRONICS Co.,Ltd. |
GR01 | Patent grant | |