CN116320554A - Display device and display method - Google Patents

Info

Publication number: CN116320554A
Application number: CN202310133508.8A
Authority: CN
Other languages: Chinese (zh)
Prior art keywords: display, incoming call, controller, video, display screen
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Inventors: 王学磊, 丁佳一, 穆聪聪
Current Assignee: Hisense Visual Technology Co Ltd
Original Assignee: Hisense Visual Technology Co Ltd
Application filed by Hisense Visual Technology Co Ltd
Publication of CN116320554A

Classifications

    • H04N21/4122: Peripherals receiving signals from specially adapted client devices; additional display device, e.g. video projector
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204: User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/426: Internal components of the client; Characteristics thereof
    • H04N21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312: Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316: Generation of visual interfaces for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N21/478: Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788: Supplemental services communicating with other users, e.g. chatting
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The application provides a display device and a display method. The display device comprises a first display screen, a second display screen and a controller. In response to an incoming call request, the display device detects the category of the media asset displayed in a first user interface. In response to the media asset category being a video picture, the controller controls the second display screen to display, in a second user interface, an incoming call reminder picture corresponding to the incoming call request, the reminder picture including an operation control for answering the incoming call request and an operation control for rejecting it. In this way, when the first display screen is playing a video picture and an incoming call request is received, the incoming call reminder picture can be displayed on the second display screen, so the video picture on the first display screen is not blocked and the user experience is improved.

Description

Display device and display method
The present application claims priority to Chinese patent application No. 2019110673655, entitled "display device, display method, and computing device", filed with the Chinese Patent Office on November 4, 2019, the entire contents of which are incorporated herein by reference.
The present application is a divisional application of Chinese patent application No. 202010327361.2, filed on April 23, 2020 and entitled "display device and display method".
Technical Field
The present invention relates to intelligent display technologies, and in particular, to a display device and a display method.
Background
With the continuous development of television and internet technologies, internet-based smart televisions have emerged. Built on internet technology, a smart television has an open operating system and chip as well as an open application platform, and can support functions such as video, entertainment and data, meeting users' diversified needs and bringing a brand-new use experience. In terms of user interaction, a smart television can support voice interaction, allowing the user to control it by voice.
In the prior art, a smart television has a single display, and all information the television needs to present to the user is shown on that screen. For example, when the television is playing a video and a pop-up window, notification message or similar content needs to be displayed, that content inevitably covers the main picture. This interferes with the user's normal viewing of the video and results in a poor user experience.
Disclosure of Invention
The embodiment of the application provides a display device and a display method.
An embodiment of the present application provides a display device, including: the system comprises a first display screen, a second display screen and a controller.
The controller is configured to: receive an incoming call request while the first display screen displays a video picture, and display an incoming call reminder picture on the second display screen.
In some embodiments, after the incoming call reminder picture is displayed on the second display screen, the controller is further configured to:
in response to a user input on the control for answering the incoming call request, control the second display screen to present a chat picture.
In some embodiments, the controller is further configured to:
in response to the user selecting the chat picture, switch the chat picture from the second display screen to the first display screen.
In some embodiments, after the incoming call reminder picture is displayed on the second display screen, the controller is further configured to:
in response to a user input on the control for answering the incoming call request, control the first display screen to present a chat picture.
In some embodiments, the controller is further configured to:
in response to a small-window video call indication input by the user, display a small-window video call picture on the second display screen.
In some embodiments, the controller is further configured to:
in response to a small-window video call indication input by the user, display a small-window video call picture in a target area of the first display screen.
In some embodiments, the controller comprises a first controller and a second controller. The first controller is configured to: receive an incoming call request while the first display screen displays a video picture and, in response to the incoming call request, send an incoming call indication to the second controller. The second controller is configured to: receive the incoming call indication and, in response to it, display an incoming call reminder picture on the second display screen.
In some embodiments, the second controller is configured to:
receive an instruction input by the user to answer the video call and, in response to that instruction, send a video call indication to the first controller and close the incoming call reminder picture.
The first controller is configured to:
in response to the video call indication, display a video call picture on the first display screen.
In some embodiments, the first controller is configured to:
receive a small-window video call instruction input by the user and, in response to it, close the video call picture and send a small-window video call indication to the second controller.
The second controller is configured to:
in response to the small-window video call indication, display a small-window video call picture on the second display screen.
In some embodiments, the first controller is configured to:
receive a small-window video call instruction input by the user and, in response to it, close the video call picture and display a small-window video call picture in a target area of the first display screen.
In some embodiments, the first controller is configured to:
and displaying the video picture on the first display screen.
The embodiment of the application also provides a display method, which comprises the following steps:
the controller receives an incoming call request when the first display screen displays a video picture;
the controller, in response to the incoming call request, displays an incoming call reminder picture on a second display screen.
In some embodiments, the method further comprises:
after the incoming call reminder picture is displayed on the second display screen, in response to a user input on the control for answering the incoming call request, controlling the second display screen to present a chat picture.
In some embodiments, the method further comprises:
in response to the user selecting the chat picture, controlling the chat picture to be switched from the second display screen to the first display screen.
In some embodiments, the method further comprises:
after the incoming call reminder picture is displayed on the second display screen, in response to a user input on the control for answering the incoming call request, controlling the first display screen to present a chat picture.
In some embodiments, the method further comprises:
in response to a small-window video call indication input by the user, displaying a small-window video call picture on the second display screen.
In some embodiments, the method further comprises:
in response to a small-window video call indication input by the user, displaying a small-window video call picture in a target area of the first display screen.
In some embodiments, the controller comprises a first controller and a second controller.
Receiving, by the controller, an incoming call request while the first display screen displays a video picture comprises:
the first controller receiving an incoming call request while the first display screen displays a video picture.
Displaying, by the controller in response to the incoming call request, an incoming call reminder picture on the second display screen comprises:
the first controller, in response to the incoming call request, sending an incoming call indication to the second controller; and the second controller receiving the incoming call indication and, in response to it, displaying an incoming call reminder picture on the second display screen.
In some embodiments, the method comprises:
the second controller receives an instruction input by the user to answer the video call and, in response to that instruction, sends a video call indication to the first controller and closes the incoming call reminder picture.
The first controller receives the video call indication and, in response to it, displays a video call picture on the first display screen.
In some embodiments, the method further comprises:
the first controller receives a small-window video call instruction input by the user, closes the video call picture in response to that instruction, and sends a small-window video call indication to the second controller, which displays a small-window video call picture on the second display screen in response to the indication.
In some embodiments, the method further comprises:
the first controller receives a small-window video call instruction input by the user and, in response to that instruction, closes the video call picture and displays a small-window video call picture in a target area of the first display screen.
In some embodiments, after closing the video call picture, the method further comprises:
the first controller displays the video picture on the first display screen.
Embodiments of the present application also provide a computing device, comprising:
a memory for storing program instructions;
and a processor, configured to call the program instructions stored in the memory and execute the display method described above according to the obtained program.
According to the display device and the display method provided by the application, when the first display screen is playing a video picture and the controller receives an incoming call request, an incoming call reminder picture is displayed on the second display screen. The video picture being shown on the first display screen is therefore not blocked, which greatly improves the user experience.
Drawings
To illustrate the technical solutions of the present invention or of the prior art more clearly, the drawings used in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention, and other drawings can be derived from them by those skilled in the art without inventive effort.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus provided in an exemplary embodiment of the present invention;
fig. 2 is a block diagram of a configuration of a control apparatus 100 provided in an exemplary embodiment of the present invention;
fig. 3a is a schematic diagram of a hardware structure of a hardware system in a display device 200 according to an exemplary embodiment of the present application;
fig. 3b is a schematic diagram of a hardware structure of a hardware system in another display device 200 according to an exemplary embodiment of the present application;
fig. 4 is a schematic diagram illustrating a connection relationship between a power board and a load in a display device 200 according to an exemplary embodiment of the present invention;
fig. 5 is a block diagram of a hardware architecture of the display device 200 according to fig. 3a and 3b in an exemplary embodiment of the present invention;
fig. 6 is a functional configuration diagram of a display device provided in an exemplary embodiment of the present invention;
fig. 7 is a block diagram showing a configuration of a software system in the display device 200 provided in the exemplary embodiment of the present invention;
FIG. 8 is an application layer schematic of a display device provided in an exemplary embodiment of the present invention;
fig. 9 is a schematic diagram of a user interface in a display device 200 provided in an exemplary embodiment of the present invention;
FIGS. 10-12 schematically illustrate interactions with a user of a user interface in a display device 200 according to an exemplary embodiment;
FIGS. 13-15 schematically illustrate interactions with a user of a user interface in a display device 200 according to an exemplary embodiment;
FIG. 16 is a schematic diagram of an interaction process of a first controller and a second controller;
FIG. 17 schematically illustrates a user interface in a display device 200 interacting with a user in accordance with an exemplary embodiment;
FIG. 18 schematically illustrates a user interface in a display device 200 interacting with a user in accordance with an exemplary embodiment;
FIG. 19 is a schematic diagram of an interaction process of a first controller and a second controller;
FIG. 20 schematically illustrates a user interface in a display device 200 interacting with a user in accordance with an exemplary embodiment;
FIG. 21 schematically illustrates a user interface in a display device 200 interacting with a user in accordance with an exemplary embodiment;
fig. 22 is a flow chart of a display method according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort fall within the scope of the invention.
The terms first, second and the like in the description, in the claims and in the drawings, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
The present application is mainly directed to the audio and video synchronization processing of a display device having a dual-system, dual-display structure, that is, a display device having a first controller (a first hardware system), a second controller (a second hardware system), a first display screen and a second display screen. The structure, functions and implementation of this dual-hardware-system display device are first described in detail below.
For convenience of use, various external device interfaces are usually provided on a display device so that different peripherals or cables can be connected to realize corresponding functions. However, when a high-definition camera is connected to such an interface, if the hardware system of the display device has no hardware interface capable of receiving the source data of a high-pixel camera, the data captured by the camera cannot be presented on the display screen of the display device.
Moreover, due to its hardware structure, the hardware system of a conventional display device supports only one channel of hard-decoding resource and usually supports video decoding at a resolution of at most 4K. Therefore, to implement video chat while watching internet television without reducing the definition of the network video picture, the hard-decoding resource (typically the GPU of the hardware system) must be used to decode the network video; the video chat picture can then only be soft-decoded by a general-purpose processor (for example, the CPU) of the hardware system.
Soft-decoding the video chat picture greatly increases the data processing load on the CPU, and when that load is too heavy the picture may freeze or play unsmoothly. In some embodiments, limited by the CPU's data processing capability, multi-party video calls cannot be realized when the chat picture is soft-decoded by the CPU, so when a user wants to video chat with several other users at the same time in the same chat scene, access is blocked.
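As a rough illustration of the constraint just described, the sketch below assigns the single hard-decoding resource to the main network video stream and falls back to CPU soft decoding for every additional stream such as a video chat picture. The function and stream names are illustrative assumptions, not part of the disclosed design.

```python
# Illustrative sketch; names are assumptions, not the disclosed design.

def assign_streams(streams):
    """Assign decoders under a single hard-decoding resource.

    The first (main) video stream gets the GPU hard decoder; every remaining
    stream, e.g. a video chat picture, must be soft-decoded on the CPU.
    """
    assignment = {}
    hard_decoder_free = True
    for name in streams:
        if hard_decoder_free:
            assignment[name] = "GPU hard decode"
            hard_decoder_free = False
        else:
            assignment[name] = "CPU soft decode"
    return assignment

if __name__ == "__main__":
    print(assign_streams(["network_video_4k", "chat_peer_1", "chat_peer_2"]))
    # {'network_video_4k': 'GPU hard decode', 'chat_peer_1': 'CPU soft decode', ...}
```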
In view of the drawbacks described above, the present application discloses a dual hardware system architecture to implement multiple channels of video chat data (including at least one channel of local video).
The concepts related to the present application will be described with reference to the accompanying drawings. It should be noted that the following descriptions of the concepts are only for making the content of the present application easier to understand, and do not represent a limitation on the protection scope of the present application.
The term "module" as used in various embodiments of the present application may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in the various embodiments of the present application refers to a component of an electronic device (such as a display device as disclosed herein) that can typically wirelessly control the electronic device over a relatively short range of distances. The assembly may be connected to the electronic device generally using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as WiFi, wireless USB, bluetooth, motion sensors, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hardware in a general remote control device with a touch screen user interface.
The term "gesture" as used in embodiments of the present application refers to a user behavior that is used to express an intended idea, action, purpose, and/or result by a change in hand or motion of a hand, etc.
The term "hardware system" as used in the various embodiments of the present application may refer to a physical component comprising mechanical, optical, electrical, magnetic devices such as integrated circuits (Integrated Circuit, ICs), printed circuit boards (Printed circuit board, PCBs) with computing, control, storage, input and output functions. In various embodiments of the present application, the hardware system may also be generally referred to as a motherboard (or a chip).
A schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment is exemplarily shown in fig. 1. As shown in fig. 1, a user may operate the display apparatus 200 by controlling the device 100.
The control device 100 may be a remote controller 100A, which may communicate with the display device 200 through infrared protocol communication, bluetooth protocol communication, zigBee protocol communication, or other short-range communication, and is used to control the display device 200 through wireless or other wired modes. The user may control the display device 200 by inputting user instructions through keys, voice input, control panel input, etc. on the remote controller 100A. Such as: the user can input corresponding control instructions through volume up-down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, on-off keys, etc. on the remote controller 100A to realize the functions of the control display device 200.
The control apparatus 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer or a notebook computer, which may communicate with the display device 200 through a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN) or another network, and control the display device 200 through an application program corresponding to the display device 200. For example, the display device 200 is controlled using an application running on the smart device. The application may provide various controls to the user through an intuitive user interface (UI) on a screen associated with the smart device.
By way of example, both the mobile terminal 100B and the display device 200 may be provided with software applications, so that connection communication between the two may be implemented through a network communication protocol, thereby achieving the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 100B and the display device 200 can be made to establish a control instruction protocol, the remote control keyboard is synchronized to the mobile terminal 100B, and the functions of controlling the display device 200 are realized by controlling the user interface on the mobile terminal 100B; the audio/video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
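The concrete control instruction protocol established between the mobile terminal 100B and the display device 200 is not specified in the text; the sketch below only illustrates the general idea with a hypothetical newline-delimited JSON message format.

```python
import json

# Hypothetical message format for illustration only; the real protocol is not disclosed here.

def encode_control_message(action: str, payload: dict) -> bytes:
    """Serialize a control instruction, e.g. a key press synced from the remote-control UI."""
    return (json.dumps({"action": action, "payload": payload}) + "\n").encode("utf-8")

def decode_control_message(raw: bytes) -> dict:
    """Parse one newline-delimited JSON control message."""
    return json.loads(raw.decode("utf-8"))

if __name__ == "__main__":
    msg = encode_control_message("key_press", {"key": "VOLUME_UP"})
    print(decode_control_message(msg))
```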
As shown in fig. 1, the display device 200 may also be in data communication with the server 300 through a variety of communication means. In various embodiments of the present application, the display device 200 may be permitted to make a wired or wireless communication connection with the server 300 via a local area network, a wireless local area network, or other network. The server 300 may provide various contents and interactions to the display device 200.
By way of example, the display device 200 receives software program updates by sending and receiving information, and electronic program guide (EPG, electronic Program Guide) interactions, or accesses a remotely stored digital media library. The servers 300 may be one group, may be multiple groups, and may be one or more types of servers. Other web service content such as video on demand and advertising services are provided through the server 300.
The display device 200 comprises a first display screen 201 and a second display screen 202, wherein the first display screen 201 and the second display screen 202 are mutually independent, and a dual hardware control system is adopted between the first display screen 201 and the second display screen 202.
Wherein the first display 201 and the second display 202 may be used to display different display screens. For example, the first display 201 may be used for displaying a conventional television program, and the second display 202 may be used for displaying auxiliary information such as a notification message, a voice assistant, etc.
In some embodiments, the content displayed on the first display 201 and the content displayed on the second display 202 may be independent of each other and not affect each other. For example, when the first display 201 plays a television program, the second display 202 may display information such as time, weather, air temperature, a reminder message, etc. that is unrelated to the television program.
In some embodiments, there may also be an association between the content displayed on the first display 201 and the content displayed on the second display 202. For example, when the first display screen 201 plays the main screen of the video chat, the second display screen 202 may display information such as an avatar, a chat duration, and the like of the user currently accessing the video chat.
In some embodiments, some or all of the content displayed on the second display 202 may be adjusted to be displayed on the first display 201. For example, information such as the time, weather, air temperature or a reminder message displayed on the second display screen 202 may be moved to the first display screen 201, while other information remains displayed on the second display screen 202.
In addition, the first display 201 may display a multi-party interaction picture while displaying a conventional television program picture, without the interaction picture obstructing the program picture. The display modes of the television program picture and the multi-party interaction picture are not limited in the present application; for example, the positions and sizes of the two pictures may be set according to their priorities.
Taking the case where the television program picture has a higher priority than the multi-party interaction picture as an example, the area of the television program picture is larger than that of the interaction picture, and the interaction picture may be positioned at one side of the television program picture or float over one corner of the television program picture.
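As an illustration of such a priority-based layout, the sketch below gives the higher-priority picture the full screen and floats the other picture in a corner at a quarter of the width. The sizes and the corner choice are assumptions for the example, not values taken from the disclosure.

```python
from dataclasses import dataclass

# Illustrative layout policy only; the actual sizing rules are not disclosed here.

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

def layout(screen_w: int, screen_h: int, tv_has_priority: bool = True):
    """Place the TV program picture and the multi-party interaction picture.

    The higher-priority picture keeps the full screen; the other one floats
    in the top-right corner at a quarter of the screen width and height.
    """
    full = Rect(0, 0, screen_w, screen_h)
    small_w, small_h = screen_w // 4, screen_h // 4
    corner = Rect(screen_w - small_w, 0, small_w, small_h)
    return (full, corner) if tv_has_priority else (corner, full)

if __name__ == "__main__":
    tv, chat = layout(3840, 2160)
    print("TV:", tv, "chat:", chat)
```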
The display device 200, in one aspect, may be a liquid crystal display, OLED (Organic Light Emitting Diode) display, projection display device; in another aspect, the display device may be a smart television or a display system of a display and a set-top box. The particular display device type, size, resolution, etc. are not limited, and those skilled in the art will appreciate that the display device 200 may be subject to some changes in performance and configuration as desired.
In addition to the broadcast-receiving television function, the display device 200 may additionally provide a computer-supported smart network television function, for example web TV, smart TV, Internet Protocol TV (IPTV) and the like. In some embodiments, the display device may not have the broadcast-receiving television function.
As shown in fig. 1, a camera may be connected to or arranged on the display device 200 so that the picture captured by the camera can be presented on the display interface of this display device or of another display device, enabling interactive chat between users. Specifically, the picture captured by the camera may be displayed full screen, half screen, or in any selectable area on the display device.
As one connection manner in some embodiments, the camera is connected to the rear shell of the display device through a connection plate and is fixedly installed in the middle of the upper side of the rear shell. As an installable manner, the camera may be fixedly installed at any position of the rear shell of the display device, as long as its image acquisition area is not blocked by the rear shell; for example, the image acquisition area may face the same direction as the display.
As another connection mode in some embodiments, the camera is connected with the rear shell of the display device in a liftable manner through a connection plate or other conceivable connector, and a lifting motor is installed on the connector, so that when the user wants to use the camera or has an application program to use the camera, the camera is lifted out of the display device, and when the user does not need to use the camera, the camera can be embedded behind the rear shell, so that the camera is protected from being damaged and privacy security of the user is protected.
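A minimal sketch of the lift behaviour just described, assuming a simple software wrapper around the lift motor: the camera is raised while at least one application is using it and retracted otherwise. The class and method names are illustrative.

```python
# Illustrative sketch; the real motor control interface is not disclosed here.

class LiftableCamera:
    """Raise the camera while in use; retract it behind the rear shell otherwise."""

    def __init__(self) -> None:
        self.users = set()   # applications currently using the camera
        self.raised = False

    def acquire(self, app: str) -> None:
        self.users.add(app)
        if not self.raised:
            self.raised = True
            print("lift motor: raise camera")

    def release(self, app: str) -> None:
        self.users.discard(app)
        if not self.users and self.raised:
            self.raised = False
            print("lift motor: retract camera")

if __name__ == "__main__":
    cam = LiftableCamera()
    cam.acquire("video_chat")
    cam.release("video_chat")
```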
As an embodiment, the camera used in the application may have 16 million pixels, so as to achieve ultra-high-definition display. In practical use, cameras with higher or lower resolution may also be used.
After the camera is installed on the display device, the contents displayed in different application scenes of the display device can be fused in a plurality of different modes, so that the function which cannot be realized by the traditional display device is achieved.
For example, a user may conduct a video chat with at least one other user while watching a video program. The video program can be presented as the background picture, over which the video chat window is displayed. This function may be vividly called "chat while watching".
In some embodiments, in a "watch-while-chat" scenario, at least one video chat is conducted across terminals while live video or network video is being viewed.
In another example, a user may conduct a video chat with at least one other user while studying in an educational application. For example, a student can interact remotely with a teacher while learning content in the educational application. This function may be vividly called "chat while learning".
In another example, a user may conduct a video chat with other players while playing a card game. For example, a player can interact remotely with other players when entering a gaming application to participate in a game. This function may be vividly called "chat while playing".
In some embodiments, the game scene is fused with the video picture, the portrait in the video picture is scratched, and the portrait is displayed in the game picture, so that the user experience is improved.
In some embodiments, in somatosensory games (such as ball games, boxing games, running games, dancing games, etc.), human body gestures and actions are obtained through a camera, limb detection and tracking, detection of human body skeleton key point data, and then fusion with animation in the games is carried out, so that the games of scenes such as sports, dancing, etc. are realized.
In another example, a user may interact with at least one other user by video and voice in a karaoke application. This function may be vividly called "watch and sing". In some embodiments, when at least one user enters the application in a chat scene, several users can jointly record a song.
In another example, the user may open the camera locally to take pictures and video, and the function may be referred to as "looking at the mirror".
In other examples, more functions may be added or the above functions may be reduced. The function of the display device is not particularly limited in this application.
A block diagram of the configuration of the control apparatus 100 according to the exemplary embodiment is exemplarily shown in fig. 2. As shown in fig. 2, the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control apparatus 100 is configured to control the display device 200: it receives the user's input operation instructions and converts them into instructions that the display device 200 can recognize and respond to, acting as an intermediary for interaction between the user and the display device 200. For example, when the user operates the channel up/down keys on the control apparatus 100, the display device 200 responds to the channel up/down operation.
In some embodiments, the control apparatus 100 may be a smart device. Such as: the control apparatus 100 may install various applications for controlling the display device 200 according to user's needs.
In some embodiments, as shown in fig. 1, a mobile terminal 100B or other intelligent electronic device may function similarly to the control apparatus 100 after installing an application that manipulates the display device 200. Such as: the user may implement the functions of the physical keys of the control apparatus 100 by installing an application, various function keys or virtual buttons of a graphical user interface available on the mobile terminal 100B or other intelligent electronic device.
The controller 110 includes a processor 112, a RAM 113 and a ROM 114, a communication interface, and a communication bus. The controller 110 is used to control the running and operation of the control device 100, the communication and cooperation among its internal components, and external and internal data processing.
The communicator 130 performs communication of control signals and data signals with the display device 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display device 200. The communicator 130 may include at least one of a WIFI module 131, a bluetooth module 132, an NFC module 133, and the like.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, a camera 145, etc. Such as: the user can implement a user instruction input function through actions such as voice, touch, gesture, press, and the like, and the input interface converts a received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the corresponding instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display device 200. In some embodiments, an infrared interface or a radio-frequency interface may be used. For example, when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and sent to the display device 200 through the infrared sending module. For another example, when the radio-frequency signal interface is used, the user input instruction is converted into a digital signal, modulated according to a radio-frequency control signal modulation protocol, and then sent to the display device 200 through the radio-frequency sending terminal.
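For the infrared path, one widely used consumer-IR scheme (NEC-style) packs a key command into a 32-bit frame consisting of the address, the inverted address, the command and the inverted command. The sketch below shows only that packing as an illustration; the actual remote-control protocol used by the display device is not specified in the text.

```python
# Illustrative NEC-style packing; not necessarily the protocol used by this device.

def nec_frame(address: int, command: int) -> int:
    """Pack a 32-bit IR frame: address, ~address, command, ~command."""
    addr = address & 0xFF
    cmd = command & 0xFF
    return (addr << 24) | ((~addr & 0xFF) << 16) | (cmd << 8) | (~cmd & 0xFF)

if __name__ == "__main__":
    print(hex(nec_frame(0x04, 0x10)))  # 0x4fb10ef
```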
In some embodiments, the control device 100 includes at least one of a communicator 130 and an output interface. When the control device 100 is provided with a communicator 130, for example WIFI, Bluetooth or NFC modules, the user input instruction may be encoded according to the WIFI, Bluetooth or NFC protocol and sent to the display device 200.
A memory 190 for storing various operation programs, data and applications for driving and controlling the control device 100 under the control of the controller 110. The memory 190 may store various control signal instructions input by a user.
The power supply 180 is configured to provide operation power support for each electrical component of the control device 100 under the control of the controller 110. The power supply 180 may use a battery and associated control circuitry to provide power.
In some embodiments, a hardware architecture diagram of a hardware system in display device 200 according to an exemplary embodiment is schematically illustrated in fig. 3 a. For ease of illustration, the display device 200 in fig. 3a is illustrated as a liquid crystal display.
As shown in fig. 3a, the display device 200 may include: the first panel 11, the first backlight assembly 12, the first rear case 13, the first controller 14, the second controller 15, the first display driving circuit 16, the second panel 21, the second backlight assembly 22, the second rear case 23, the second display driving circuit 24, and the power supply assembly 30. Additionally, in some embodiments, the display device 200 may further include:
A base or a suspension bracket. For ease of illustration, the display device 200 is illustrated in fig. 3a as including a base 41, the base 41 being configured to support the display device 200. It should be noted that only one form of base design is shown in the figures, and those skilled in the art can design different forms of base according to the product requirements.
The first panel 11 is used to present the picture of the first display 201 to the user. In some embodiments, the first panel 11 may be a liquid crystal panel. For example, the liquid crystal panel may include, in order from top to bottom: a horizontal polarizing plate, a color filter, a liquid crystal layer, a thin-film transistor (TFT) layer, a vertical polarizing plate, a light guide plate, and a printed circuit board (PCB) on which driving circuits such as a gate driving circuit and a source driving circuit are arranged. The gate driving circuit is connected to the gate of the thin-film transistor through a scanning line, and the source driving circuit is connected to the drain of the thin-film transistor through a data line.
The first backlight assembly 12 is disposed below the first panel 11, and is usually some optical assemblies for providing a light source with sufficient brightness and uniform distribution, so that the first panel 11 can display images normally. The first backlight assembly 12 further includes a first back plate (not shown).
Wherein, the first back case 13 is disposed on the first panel 11 to conceal the first backlight assembly 12, the first controller 14, the second controller 15, the first display driving circuit 16, the power supply assembly 30, and other components of the display device 200 together, thereby providing an aesthetic effect.
The first controller 14, the second controller 15, the first display driving circuit 16 and the power supply assembly 30 are disposed on a first back plate, and some convex hull structures are typically stamped and formed on the first back plate. The first controller 14, the second controller 15, and the first display driving circuit 16 and the power supply assembly 30 are fixed to the convex hull by screws or hooks.
In some embodiments, the first controller 14, the second controller 15, the first display driving circuit 16 and the power supply assembly 30 may be disposed on one board together, or may be disposed on different boards respectively, for example, the first controller 14 is disposed on a motherboard, the second controller 15 is disposed on an interaction board, the first display driving circuit 16 is disposed on the first display driving board, the power supply assembly 30 is disposed on a power supply board, or may be disposed on different boards in a combined manner, or may be disposed on one board together with the first backlight assembly 12, which may be specifically set according to practical requirements.
For convenience of explanation, fig. 3a illustrates the first controller 14, the second controller 15, the first display driving circuit 16, and the power supply assembly 30 together on one board.
The main functions of the first display driving circuit 16 are: performing thousand-zone backlight partition control according to the backlight driving signals (such as the PWM signal and the local dimming signal) transmitted by the first controller 14, where the control varies with the image content; and, after a handshake is established with the first controller 14, receiving the VbyOne display signal sent by the first controller 14 and converting it into an LVDS signal, so as to realize image display on the first display screen 201.
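As a much simplified illustration of content-based backlight zone control (not the actual driving algorithm, which is not disclosed here), the sketch below derives one backlight level per zone from the peak luminance of the pixels the zone covers.

```python
# Highly simplified illustration of local dimming; not the disclosed driving algorithm.

def local_dimming_levels(luma, zones_x, zones_y):
    """Compute one backlight level per zone from a 2-D luminance array (0-255).

    Each zone is driven by the peak luminance of the pixels it covers, so dark
    regions of the image get a dimmed backlight and bright regions stay lit.
    """
    h, w = len(luma), len(luma[0])
    zh, zw = h // zones_y, w // zones_x
    levels = []
    for zy in range(zones_y):
        row = []
        for zx in range(zones_x):
            block = [luma[y][x]
                     for y in range(zy * zh, (zy + 1) * zh)
                     for x in range(zx * zw, (zx + 1) * zw)]
            row.append(max(block))
        levels.append(row)
    return levels

if __name__ == "__main__":
    frame = [[0] * 8 for _ in range(8)]
    frame[1][6] = 240  # one bright highlight in the top-right area
    for r in local_dimming_levels(frame, zones_x=2, zones_y=2):
        print(r)  # [[0, 240], [0, 0]]
```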
Wherein the second panel 21 is used for presenting the user with a picture of the second display 202. In some embodiments, the second panel 21 may be a liquid crystal panel, and the specific structure may be referred to in the foregoing description, which is not repeated herein.
The second backlight assembly 22 is disposed below the second panel 21 and is usually composed of optical assemblies that supply a light source with sufficient brightness and uniform distribution so that the second panel 21 can display images normally. The second backlight assembly 22 further includes a second back plate (not shown).
Wherein, the second back case 23 is disposed on the second panel 21 to conceal the parts of the display device 200 such as the second backlight assembly 22 and the second display driving circuit 24 together, thereby providing an aesthetic effect.
The second display driving circuit 24 is disposed on the second back plate, and some convex hull structures are typically stamped and formed on the second back plate. The second display driving circuit 24 is fixed to the convex hull by screws or hooks. The second display driving circuit 24 may be disposed on a board alone, such as a second display driving board, or may be disposed on a board together with the second backlight assembly 22, and may be specifically disposed according to practical requirements, which is not limited in this application. For ease of illustration, the second display driver circuit 24 is shown in fig. 3a as being provided separately on a single board.
In some embodiments, the display device shown in fig. 3a further includes a key board, which may be disposed on the first back plate or on the second back plate; this is not limited in the present application. A plurality of keys and key circuits are arranged on the key board, so that the first controller 14 or the second controller 15 can receive key signals from the key board and can also send control signals to it.
In addition, the display device 200 further includes a sound reproduction means (not shown in the figure) such as an acoustic component, for example, an I2S interface including a power Amplifier (AMP) and a Speaker (Speaker), etc., for realizing reproduction of sound. Typically, the audio assembly is capable of at least two channels of sound output; when the panoramic surround effect is to be achieved, a plurality of acoustic components need to be provided to output sounds of a plurality of channels, and a detailed description thereof will not be given here.
It should be noted that the display device 200 may also be an OLED display, in which case the components included in the display device 200 change accordingly; for example, since an OLED display is self-luminous, no backlight assembly (the first backlight assembly 12 and the second backlight assembly 22 in fig. 3a) is required, which is not described in detail here.
Fig. 3a illustrates a display device with dual display screens as an example; fig. 3b exemplarily shows a schematic diagram of the hardware structure of the hardware system in a display device with a single display screen according to an exemplary embodiment.
As shown in fig. 3b, the display device with a single display screen includes: a panel 1, a backlight assembly 2, a rear case 3, a controller 4, a power supply assembly 5, and a base 6. The panel 1 is used to present pictures to the user. The backlight assembly 2 is located below the panel 1 and is usually composed of optical assemblies that supply a light source with sufficient brightness and uniform distribution so that the panel 1 can display image content normally; the backlight assembly 2 further includes a back plate. The controller 4 and the power supply assembly 5 are arranged on the back plate, on which convex hull structures are typically stamped, and are fixed to the convex hulls by screws or hooks. The rear case 3 covers the panel 1 to conceal the backlight assembly 2, the controller 4, the power supply assembly 5 and other parts of the display device, giving an aesthetic appearance. The base 6 is used to support the display device.
The controller 4 and the power supply assembly 5 may be separately disposed on a board, or may be disposed on a board together with the backlight assembly, specifically, may be disposed according to actual requirements, which is not limited in this application. For ease of illustration, in fig. 3b, the controller 4 and the power supply assembly 5 are co-located on a single board.
In addition, the display device 200 further includes a sound reproduction means (not shown in the figure) such as an acoustic component, for example, an I2S interface including a power Amplifier (AMP) and a Speaker (Speaker), etc., for realizing reproduction of sound. Typically, the audio assembly is capable of at least two channels of sound output; when the panoramic surround effect is to be achieved, a plurality of acoustic components need to be provided to output sounds of a plurality of channels, and a detailed description thereof will not be given here.
It should be noted that the display device 200 may also employ an OLED display screen, in which case the components included in the display device 200 change accordingly; this is not described in detail here.
FIG. 4 shows a schematic diagram of the connection between the power supply board and the loads. As shown in fig. 4, the power supply assembly 30 includes an input terminal IN and output terminals OUT (a first output terminal OUT1, a second output terminal OUT2, a third output terminal OUT3, a fourth output terminal OUT4 and a fifth output terminal OUT5 are shown). The input terminal IN is connected to an AC power source (for example, the mains supply), and the output terminals are connected to the loads: the first output terminal OUT1 is connected to the sound reproduction device, the second output terminal OUT2 is connected to the first panel 11/the second panel 21, the third output terminal OUT3 is connected to the first backlight assembly 12/the second backlight assembly 22, the fourth output terminal OUT4 is connected to the first controller 14/the second controller 15, and the fifth output terminal OUT5 is connected to the first display driving circuit 16/the second display driving circuit 24. The power supply assembly 30 converts the AC mains power into the DC power required by the loads, and the DC power typically has different specifications, for example 18V for the audio component and 12V/18V for the first controller 14.
For convenience of description, one hardware system in the dual hardware system architecture is hereinafter referred to as a first hardware system or a first controller, and the other hardware system is hereinafter referred to as a second hardware system or a second controller. The first controller comprises various processors and various interfaces of the first controller, and various modules connected with the first controller through the various interfaces, and the second controller comprises various processors and various interfaces of the second controller, and various modules connected with the second controller through the various interfaces.
The first controller and the second controller may each be provided with a relatively independent operating system, and the operating system of the first controller and the operating system of the second controller may communicate with each other through a communication protocol. For example, the framework layer of the first controller's operating system and the framework layer of the second controller's operating system can communicate to transmit commands and data, so that there are two independent but interrelated subsystems in the display device 200.
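The transport and message format used between the two framework layers are not described here; the sketch below models the exchange as a hypothetical command channel on which one operating system registers handlers and the other sends named commands with arguments.

```python
from typing import Callable, Dict

# Hypothetical command channel for illustration; the real transport is not disclosed here.

class FrameworkChannel:
    """Toy stand-in for the framework-layer command channel between the two systems."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[dict], None]] = {}

    def register(self, command: str, handler: Callable[[dict], None]) -> None:
        """Register a handler for a named command on the receiving system."""
        self._handlers[command] = handler

    def send(self, command: str, args: dict) -> None:
        """Deliver a named command with arguments to the registered handler."""
        handler = self._handlers.get(command)
        if handler is not None:
            handler(args)

if __name__ == "__main__":
    channel = FrameworkChannel()
    channel.register("show_incoming_call",
                     lambda a: print("second system shows reminder for", a["caller"]))
    channel.send("show_incoming_call", {"caller": "Alice"})
```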
The dual hardware system architecture of the present application is described in some embodiments below in conjunction with fig. 5. It should be noted that fig. 5 is merely an exemplary illustration of the dual hardware system architecture of the present application, and is not meant to limit the present application. In practical applications, both hardware systems may include more or fewer hardware or interfaces as desired.
A hardware architecture block diagram of the display device 200 according to fig. 3a, 3b is exemplarily shown in fig. 5. As shown in fig. 5, the hardware system of the display apparatus 200 may include a first controller 210 and a second controller 310, and modules connected to the first controller 210 or the second controller 310 through various interfaces.
In some embodiments: the second controller 310 may be configured to receive the instruction sent by the first controller 210 and control the second display 380 to display a corresponding image.
The modules connected to the first controller 210 may include a modem 220, a communicator 230, an external device interface 250, a memory 290, a user input interface 260-3, a video processor 260-1, an audio processor 260-2, a first display screen 280 (i.e., the first display screen 201 in fig. 1), an audio output interface 270, and a power supply module 240. In other embodiments, the first controller 210 may also be connected to more or fewer modules.
The modem 220 is configured to perform modulation and demodulation processes such as amplification, mixing, and resonance on a broadcast television signal received by a wired or wireless manner, so as to demodulate an audio/video signal carried in a frequency of a television channel selected by a user and additional information (e.g., an EPG data signal) from a plurality of wireless or wired broadcast television signals. Depending on the broadcasting system of the television signal, the signal paths of the modem 220 may be various, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; according to different modulation types, the signal adjustment mode can be a digital modulation mode or an analog modulation mode; and the modem 220 may demodulate analog signals and/or digital signals according to the kind of received television signals.
The modem 220 is further configured to respond, under the control of the first controller 210 and according to the user's selection, to the frequency of the television channel selected by the user and to the television signal carried on that frequency.
In other exemplary embodiments, the modem 220 may also be in an external device, such as an external set-top box, or the like. In this way, the set-top box outputs the television audio/video signal after modulation and demodulation, and inputs the television audio/video signal to the display apparatus 200 through the external device interface 250.
Communicator 230 is a component for communicating with external devices or external servers according to various communication protocol types. For example: communicator 230 may include a WIFI module 231, a bluetooth communication protocol module 232, a wired ethernet communication protocol module 233, and other network communication protocol modules or near field communication protocol modules (not shown) such as an infrared communication protocol module.
The display device 200 may establish a connection of control signals and data signals with an external control device or a content providing device through the communicator 230. For example, the communicator may receive a control signal of the remote controller 100 according to the control of the first controller 210.
The external device interface 250 is a component that provides data transmission between the first controller 210 and external other apparatuses. The external device interface 250 may be connected to an external device such as a set-top box, a game device, a notebook computer, etc., in a wired/wireless manner, and may receive data such as a video signal (e.g., a moving image), an audio signal (e.g., music), additional information (e.g., an EPG), etc., of the external device.
The external device interface 250 may include any one or more of the following: a High Definition Multimedia Interface (HDMI) terminal (also referred to as HDMI 251), a Composite Video Blanking Sync (CVBS) terminal (also referred to as AV 252), an analog or digital component terminal (also referred to as component 253), a Universal Serial Bus (USB) terminal 254, a Red Green Blue (RGB) terminal (not shown in the figures), and the like. The number and types of external device interfaces are not limited in the present application.
The first controller 210 controls the operation of the display device 200 and responds to the user's operations by running various software control programs (e.g., an operating system and/or various application programs) stored on the memory 290.
As shown in fig. 5, the first controller 210 includes a read-only memory ROM 213, a random access memory RAM 214, a graphics processor 216, a CPU processor 212, a communication interface 218, and a communication bus. The ROM 213 and the RAM 214 are connected to the graphics processor 216, the CPU processor 212, and the communication interface 218 via the bus.
The ROM 213 is used for storing instructions for various system starts. When the display device 200 starts to power up upon receiving a power-on signal, the CPU processor 212 executes the system start instructions in the ROM 213 and copies the operating system stored in the memory 290 into the RAM 214 so as to start running the operating system. After the operating system is started, the CPU processor 212 copies the various application programs in the memory 290 to the RAM 214 and then starts running the various application programs.
The graphics processor 216 is used for generating various graphical objects, such as icons, operation menus, graphics displayed in response to user input instructions, and the like. It includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which generates the various objects based on the results of the arithmetic unit, the rendered result being displayed on the first display screen 280.
The CPU processor 212 is used to execute the operating system and application program instructions stored in the memory 290, and to execute various application programs, data and contents according to the various interactive instructions received from the outside, so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 212 may include a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor is used for performing some operations of the display apparatus 200 in a pre-power-up mode and/or for displaying pictures in the normal mode. The one or more sub-processors are used for performing operations in a standby mode or the like.
Communication interface 218 may include first interface 218-1 through nth interface 218-n. These interfaces may be network interfaces that are connected to external devices via a network.
The first controller 210 may control operations of the display device 200 in relation to the first display screen 280. For example: in response to receiving a user command for selecting a UI object displayed on the first display screen 280, the first controller 210 may perform an operation related to the object selected by the user command.
The first controller 210 may control operations of the display device 200 in relation to the second display screen 380. For example: in response to receiving a user command for selecting a UI object displayed on the second display 380, the first controller 210 may perform an operation related to the object selected by the user command.
The object may be any one of the selectable objects, such as a hyperlink or an icon. Operations related to the selected object include, for example, displaying an operation of connecting to a hyperlinked page, document or image, or executing the program corresponding to an icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display device 200, or a voice command corresponding to a voice uttered by the user.
Memory 290 includes memory for storing various software modules for driving and controlling display device 200. Such as: various software modules stored in memory 290, including: a base module, a detection module, a communication module, a display control module, a browser module, various service modules, and the like (not shown in the figure).
The base module is a bottom software module for signal communication between the various hardware in the display device 200 and for sending processing and control signals to the upper modules. The detection module is a management module for collecting various information from various sensors or user input interfaces, and performing digital-to-analog conversion and analysis management. The voice recognition module comprises a voice analysis module and a voice instruction database module. The display control module is a module for controlling the first display screen 280 to display image content, and can be used for playing information such as multimedia image content and UI interface. The communication module is used for controlling and data communication with external equipment. The browser module is a module for performing data communication between the browsing servers. The service module is used for providing various services and various application programs.
Meanwhile, the memory 290 is also used to store received external data and user data, images of various items in various user interfaces, visual effect maps of focus objects, and the like.
The user input interface 260-3 is used to transmit an input signal of a user to the first controller 210 or transmit a signal output from the first controller 210 to the user. Illustratively, the control device (e.g., a mobile terminal or a remote controller) may send input signals such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by the user to the user input interface and then transferred to the first controller 210 by the user input interface 260-3; alternatively, the control device may receive an output signal such as audio, video, or data, which is output from the user input interface 260-3 via the first controller 210, and display the received output signal or output the received output signal in an audio or vibration form.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the first display screen 280, and the user input interface 260-3 receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface 260-3 recognizes the sound or gesture through the sensor to receive the user input command.
The video processor 260-1 is configured to receive a video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of an input signal, so as to obtain a video signal that is directly displayed or played on the first display screen 280.
By way of example, the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, etc. (not shown).
The demultiplexing module is used for demultiplexing the input audio/video data stream, such as the input MPEG-2, and demultiplexes the input audio/video data stream into video signals, audio signals and the like.
And the video decoding module is used for processing the demultiplexed video signal, including decoding, scaling and the like.
An image synthesis module, such as an image synthesizer, is used to superimpose and mix the GUI signal that is input by the user or generated by the graphics generator with the scaled video picture, so as to generate an image signal for display.
A frame rate conversion module, configured to convert a frame rate of an input video, such as converting a frame rate of an input 24Hz, 25Hz, 30Hz, 60Hz video to a frame rate of 60Hz, 120Hz, or 240Hz, where the input frame rate may be related to a source video stream and the output frame rate may be related to a refresh rate of a display device. And a display formatting module for changing the signal output by the frame rate conversion module into a signal conforming to a display format of a display device, such as converting the signal output by the frame rate conversion module into a format to output an RGB data signal.
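As a simplified, assumption-laden illustration of the frame rate conversion described above: when the output refresh rate is an integer multiple of the input frame rate, each input frame can simply be repeated; non-integer ratios (such as 24 Hz to 60 Hz) require patterns such as 3:2 pull-down or motion-compensated interpolation, which this sketch does not model.

    // Simplified frame rate conversion: repeat input frames to match the
    // display refresh rate (e.g. 30 Hz input on a 120 Hz panel -> each frame
    // shown 4 times). Real converters may interpolate new frames instead.
    public final class FrameRateConversion {
        public static int repeatCountPerInputFrame(int inputHz, int outputHz) {
            if (inputHz <= 0 || outputHz < inputHz || outputHz % inputHz != 0) {
                throw new IllegalArgumentException("non-integer ratios need pull-down or interpolation");
            }
            return outputHz / inputHz; // integer pull-up factor
        }

        public static void main(String[] args) {
            System.out.println(repeatCountPerInputFrame(30, 120)); // prints 4
            System.out.println(repeatCountPerInputFrame(60, 120)); // prints 2
        }
    }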
The first display screen 280 is configured to receive the image signal input from the video processor 260-1 and to display video content, images and a menu manipulation interface. The first display screen 280 includes a display assembly for presenting pictures and a driving assembly for driving the display of images. The displayed video content may come from the video in the broadcast signal received by the modem 220, or from video input through the communicator or the external device interface. The first display screen 280 simultaneously displays the user manipulation interface UI that is generated in the display device 200 and used to control the display device 200.
In addition, a driving assembly for driving the display is included depending on the type of the first display screen 280. Alternatively, if the first display screen 280 is a projection display screen, a projection device and a projection screen may also be included.
The audio processor 260-2 is configured to receive the audio signal, decompress and decode according to the standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification processing, so as to obtain an audio signal that can be played in the speaker 272.
The audio output interface 270 is used for receiving the audio signal output by the audio processor 260-2 under the control of the first controller 210. The audio output interface may include a speaker 272, or an external audio output terminal 274 for output to the sound-generating device of an external apparatus, such as an external sound terminal or an earphone output terminal.
In other exemplary embodiments, video processor 260-1 may include one or more chip components. The audio processor 260-2 may also include one or more chip components.
And, in other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or integrated with the first controller 210 in one or more chips.
The power supply module 240 is configured to provide power supply support for the display device 200 with power input from an external power source under the control of the first controller 210. The power supply module 240 may include a built-in power circuit installed inside the display apparatus 200, or may be a power source installed outside the display apparatus 200, such as a power interface providing an external power source in the display apparatus 200.
Similar to the first controller 210, as shown in FIG. 5, the modules coupled to the second controller 310 may include a communicator 330, a detector 340, a memory 390, and a second display 380 (i.e., the second display 202 of FIG. 1). User input interfaces, video processors, audio processors, display screens, audio output interfaces (not shown) may also be included in some embodiments. In some embodiments, there may also be a power module (not shown) that independently powers the second controller 310.
The communicator 330 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator 330 may include a WIFI module 331, a bluetooth communication protocol module 332, a wired ethernet communication protocol module 333, and other network communication protocol modules or near field communication protocol modules (not shown) such as an infrared communication protocol module.
The communicator 330 and the communicator 230 of the first controller 210 also interact with each other. For example, the WiFi module 231 in the hardware system of the first controller 210 is used to connect to an external network and communicate with an external server and the like. The WiFi module 331 in the hardware system of the second controller 310 is used to connect to the WiFi module 231 of the first controller 210 without connecting directly to the external network, so that the second controller 310 accesses the external network through the first controller 210. Thus, for the user, a display device according to the above embodiment presents only one WiFi account to the outside.
The detector 340 is a component used by the second controller 310 to collect signals of the external environment or signals for interaction with the outside. The detector 340 may include a light receiver 342, i.e., a sensor for collecting the intensity of ambient light, so that display parameters can be adaptively changed according to the collected ambient light. It may also include an image collector 341, such as a camera or video camera, which can be used to collect the external environment scene, to collect attributes of the user or gestures for interacting with the user, to adaptively change display parameters, and to recognize user gestures, so as to realize the function of interaction with the user.
The external device interface 350 provides a component for data transfer between the second controller 310 and the first controller 210 or other external devices. The external device interface may be connected with external apparatuses such as a set-top box, a game device, a notebook computer, and the like in a wired/wireless manner.
A video processor 360 for processing the relevant video signals.
The second controller 310 controls the operation of the display device 200 and responds to user operations by running various software control programs stored on the memory 390 (e.g., with an installed third party application, etc.), as well as interactions with the first controller 210.
As shown in fig. 5, the second controller 310 includes a read only memory ROM 313, a random access memory RAM 314, a graphic processor 316, a CPU processor 312, a communication interface 318, and a communication bus. The ROM 313 and RAM 314, and the graphics processor 316, CPU processor 312, and communication interface 318 are connected by a bus.
A ROM 313 for storing instructions for various system starts. The CPU processor 312 runs the system boot instructions in ROM and copies the operating system stored in the memory 390 into the RAM 314 to begin running the boot operating system. When the operating system is started, the CPU processor 312 copies various applications in the memory 390 to the RAM 314, and then starts running the various applications.
The CPU processor 312 is configured to execute operating system and application program instructions stored in the memory 390, and to communicate, signal, data, instruction, etc. with the first controller 210, and to execute various application programs, data, and contents according to various interactive instructions received from the outside, so as to finally display and play various audio and video contents.
The communication interfaces 318 are plural and may include a first interface 318-1 through an nth interface 318-n. These interfaces may be network interfaces connected to external devices via a network, or network interfaces connected to the first controller 210 via a network.
The second controller 310 may control operations of the display device 200 with respect to the second display screen 380. For example: in response to receiving a user command for selecting a UI object displayed on the second display 380, the second controller 310 may perform an operation related to the object selected by the user command.
The second controller 310 may also control operations of the display device 200 related to the first display screen 280. For example: in response to receiving a user command for selecting a UI object displayed on the first display screen 280, the second controller 310 may perform an operation related to the object selected by the user command.
The graphics processor 316 is used for generating various graphical objects, such as icons, operation menus, graphics displayed in response to user input instructions, and the like. It includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which generates the various objects based on the results of the arithmetic unit, the rendered result being displayed on the second display screen 380.
Both the graphics processor 316 of the second controller 310 and the graphics processor 216 of the first controller 210 are capable of generating various graphics objects. The difference is that, if application 1 is installed on the second controller 310 and application 2 is installed on the first controller 210, then when the user is in the interface of application 1 and inputs an instruction within application 1, the graphics object is generated by the graphics processor 316 of the second controller 310; when the user is in the interface of application 2 and inputs an instruction within application 2, the graphics object is generated by the graphics processor 216 of the first controller 210.
A functional configuration diagram of a display device according to an exemplary embodiment is exemplarily shown in fig. 6.
As shown in fig. 6, the memory 390 of the second controller 310 and the memory 290 of the first controller 210 are used to store an operating system, application programs, contents, user data, and the like, respectively, and perform system operations for driving the first display screen 280 and the second display screen 380 and various operations in response to a user under the control of the second controller 310 and the first controller 210. Memory 390 and memory 290 may include volatile and/or nonvolatile memory.
As for the memory 290, it is particularly used to store an operation program for driving the first controller 210 in the display device 200, and to store various application programs built in the display device 200, and various application programs downloaded by a user from an external device, and various graphic user interfaces related to the application programs, and various objects related to the graphic user interfaces, user data information, and various internal data supporting the application programs. The memory 290 is used to store system software such as an Operating System (OS) kernel, middleware and applications, and to store input video data and audio data, as well as other user data.
The memory 290 is specifically configured to store drivers and related data for the video processor 260-1 and the audio processor 260-2, the first display 280, the communicator 230, the modem 220, the input/output interface, and the like.
In some embodiments, memory 290 may store software and/or programs, the software programs used to represent an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (such as the middleware, APIs, or application programs), and the kernel may provide interfaces to allow the middleware and APIs, or applications to access the controller to implement control or management of system resources.
By way of example, the memory 290 includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, a first audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, and other applications 2912, a browser module 2913, and the like. The first controller 210 executes various software programs in the memory 290 such as: broadcast television signal receiving and demodulating functions, television channel selection control functions, volume selection control functions, image control functions, display control functions, audio control functions, external instruction recognition functions, communication control functions, optical signal receiving functions, power control functions, software control platforms supporting various functions, browser functions and other various functions.
The memory 390 is used to store the various software modules for driving and controlling the display device 200, such as: a base module, a detection module, a communication module, a display control module, a browser module, various service modules, and the like (not shown in the figure). Since the functions of the memory 390 and the memory 290 are similar, reference may be made to the memory 290 for the relevant portions, which are not described again here.
By way of example, the memory 390 includes an image control module 3904, a second audio control module 3906, an external instruction recognition module 3907, a communication control module 3908, a light receiving module 3909, an operating system 3911, other application programs 3912, a browser module 3913, and the like. The second controller 310 executes various software programs in the memory 390, such as: an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal receiving function, a power control function, a software control platform supporting various functions, a browser function, and other functions.
The difference is that the external instruction recognition module 2907 of the first controller 210 and the external instruction recognition module 3907 of the second controller 310 may recognize different instructions.
For example, since an image receiving device such as a camera is connected to the second controller 310, the external instruction recognition module 3907 of the second controller 310 may include a graphics recognition module 3907-1. A graphics database is stored in the graphics recognition module 3907-1, and when the camera receives an external graphics instruction, the instruction is matched against the instructions in the graphics database so as to perform instruction control of the display device. Since the voice receiving device and the remote controller are connected to the first controller 210, the external instruction recognition module 2907 of the first controller 210 may include a voice recognition module 2907-2. A voice database is stored in the voice recognition module 2907-2, and when the voice receiving device or the like receives an external voice instruction, the instruction is matched against the instructions in the voice database so as to perform instruction control of the display device. Similarly, a control device 100 such as a remote controller is connected to the first controller 210, and a key instruction recognition module 2907-3 performs instruction interaction with the control device 100.
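The following sketch is only a schematic illustration of the instruction-matching idea described above: a recognized voice phrase or graphics/gesture label is looked up in a stored database and mapped to a control command. The entries and command names are hypothetical and are not part of the claimed design.

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Optional;

    // Hypothetical sketch of an instruction database lookup: a recognized
    // voice phrase or gesture label is matched against stored entries and
    // mapped to a control command for the display device.
    public final class InstructionDatabase {
        private final Map<String, String> entries = new HashMap<>();

        public InstructionDatabase() {
            // Example entries; the actual database contents are not specified here.
            entries.put("volume up", "CMD_VOLUME_UP");
            entries.put("answer video", "CMD_ANSWER_VIDEO_CALL");
            entries.put("wave hand", "CMD_PAUSE_PLAYBACK");
        }

        public Optional<String> match(String recognizedInput) {
            return Optional.ofNullable(entries.get(recognizedInput.toLowerCase()));
        }
    }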
A block diagram of the configuration of the software system in the display device 200 according to an exemplary embodiment is illustrated in fig. 7.
For the first controller 210, as shown in fig. 7, the operating system 2911 includes the executing operating software that handles various underlying system services and performs hardware-related tasks, and acts as a medium for completing data processing between the application programs and the hardware components.
In some embodiments, portions of the operating system kernel may contain a series of software to manage display device hardware resources and to serve other programs or software code.
In other embodiments, portions of the operating system kernel may contain one or more device drivers, which may be a set of software code in the operating system that helps operate or control the devices or hardware associated with the display device. The driver may contain code to operate video, audio and/or other multimedia components. Examples include a display screen, camera, flash, wiFi, and audio drivers.
Wherein, accessibility module 2911-1 is configured to modify or access an application program to realize accessibility of the application program and operability of display content thereof.
The communication module 2911-2 is used for connecting with other peripheral devices via related communication interfaces and communication networks.
User interface module 2911-3 is configured to provide an object for displaying a user interface for access by each application program, so as to implement user operability.
Control applications 2911-4 are used to control process management, including runtime applications, and the like.
The event delivery system 2914 may be implemented within the operating system 2911 or in the application program 2912. In some embodiments, it is implemented partly within the operating system 2911 and partly in the application program 2912, and is used to monitor various user input events and, according to the recognition results of the various events or sub-events, invoke the handler that implements one or more sets of predefined operations.
The event monitoring module 2914-1 is configured to monitor a user input interface to input an event or a sub-event.
The event recognition module 2914-2 is configured to input definitions of various events to various user input interfaces, recognize various events or sub-events, and transmit them to a process for executing one or more corresponding sets of processes.
The event or sub-event refers to an input detected by one or more sensors in the display device 200, or an input from an external control device (such as the control apparatus 100), for example: various sub-events input through voice, gesture input sub-events recognized by gesture recognition, sub-events of remote-control key instruction input from a control device, and the like. By way of example, one or more sub-events on the remote control may take various forms, including but not limited to one or a combination of pressing the up/down/left/right keys, pressing the OK key, a key long-press, and the like, as well as operations of non-physical keys, such as move, hold and release.
The interface layout management module 2913 directly or indirectly receives the user input events or sub-events from the event transmission system 2914, and is used for updating the layout of the user interface, including but not limited to the positions of the controls or sub-controls in the interface, and various execution operations related to the interface layout, such as the size or position of the container, the level, and the like.
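A minimal sketch of the event monitoring, recognition and dispatch flow described above is given below; the event names and handler interface are assumptions made for illustration and do not represent the actual implementation of the event transmission system 2914.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Hypothetical sketch of the event delivery idea: input events are
    // monitored, recognized against predefined definitions, and dispatched
    // to the handlers registered for them (for example, the interface
    // layout management logic).
    public final class EventDeliverySketch {
        public interface Handler { void handle(String event); }

        private final Map<String, List<Handler>> handlers = new HashMap<>();

        public void register(String event, Handler handler) {
            handlers.computeIfAbsent(event, k -> new ArrayList<>()).add(handler);
        }

        // Called by the monitoring side once a raw input has been recognized
        // as a predefined event or sub-event, e.g. "KEY_OK" or "KEY_LEFT".
        public void dispatch(String event) {
            for (Handler h : handlers.getOrDefault(event, List.of())) {
                h.handle(event);
            }
        }

        public static void main(String[] args) {
            EventDeliverySketch system = new EventDeliverySketch();
            system.register("KEY_OK", e -> System.out.println("confirm focused item"));
            system.dispatch("KEY_OK");
        }
    }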
Since the operating system 3911 of the second controller 310 is similar to the operating system 2911 of the first controller 210, reference may be made to the operating system 2911 for the relevant portions, which are not described again here.
As shown in fig. 8, the application layer of the display device contains various applications that may be executed on the display device 200.
The application layer 2912 of the first controller 210 may include, but is not limited to, one or more applications such as: video on demand applications, application centers, gaming applications, etc. The application layer 3912 of the second controller 310 may include, but is not limited to, one or more applications such as: live television applications, media center applications, etc. It should be noted that, what application programs are included on the second controller 310 and the first controller 210 are determined according to the operating system and other designs, and specific limitation and division of the application programs included on the second controller 310 and the first controller 210 are not required in the present application.
Live television applications can provide live television through different signal sources. For example, a live television application may provide television signals using inputs from cable television, radio broadcast, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
Video on demand applications may provide video from different storage sources. Unlike live television applications, video-on-demand provides video displays from some storage sources. For example, video-on-demand may come from the server side of cloud storage, from a local hard disk storage containing stored video programs.
The media center application may provide various applications for playing multimedia content. For example, a media center may be a different service than live television or video on demand, and a user may access various images or audio through a media center application.
An application center may be provided to store various applications. The application may be a game, an application, or some other application associated with a computer system or other device but operable on a display device. The application center may obtain these applications from different sources, store them in local storage, and then be run on the display device 200.
Since the second controller 310 and the first controller 210 may have separate operating systems installed therein, there are two independent but interrelated subsystems in the display device 200. For example, the second controller 310 and the first controller 210 may be independently provided with Android (Android) and various APP, which may each implement a certain function, and make the second controller 310 and the first controller 210 cooperatively implement a certain function.
A schematic diagram of a user interface in the display device 200 according to an exemplary embodiment is illustrated in fig. 9. As shown in fig. 9, the user interface includes a first view display area 2011 and a second view display area 2021. The functions implemented by the first view display area 2011 and the second view display area 2021 are substantially the same, and only the first view display area 2011 is described in detail below. By way of example, the first view display area 2011 includes a layout of one or more different items. The user interface also includes a selector for indicating that an item is selected, and the position of the selector can be moved by user input to change the selection of different items.
In some embodiments, the first view display area 2011 is a scalable view display area. "Scalable" may mean that the first view display area 2011 is scalable in its size or proportion on the screen, or that the items in the first view display area 2011 are scalable in their size or proportion on the screen.
"item" refers to a visual object displayed in a view display area of a user interface in the display device 200 to represent corresponding content such as an icon, a thumbnail, a video clip, and the like. For example: the items may represent movies, image content or video clips of a television show, audio content of music, applications, or other user access content history information.
Further, the item may represent an interface or an interface set display in which the display device 200 is connected to an external device, or may represent an external device name or the like connected to the display device. Such as: a set of signal source input interfaces, or a high definition multimedia interface (High Definition Multimedia Interface, HDMI), a USB interface, a PC terminal interface, or the like.
It should be noted that: the view display area may present Video chat project content or present application layer project content (e.g., web page Video, video On Demand (VOD) presentations, application screens, etc.).
The "selector" is used to indicate that any item therein has been selected, such as a cursor or an object of focus. Positioning the selection information input according to the icon or menu position touched by the user in the display device 200 may cause movement of the display focus object in the display device 200 to select a control item, one or more of which may be selected or controlled.
The focus object refers to the object that moves between items according to user input. Illustratively, the position of the focus object is realized or identified by drawing a thick line along the edge of the item. In other embodiments, the form of the focus is not limited to this example; it may be a form that is visible or invisible to the user, such as a cursor, a 3D deformation of the item, or a change of the border line, size, color, transparency, outline and/or font of the text or image of the focused item.
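For illustration only, the selector/focus behaviour described above can be modeled as an index over the items laid out in a view display area, moved by the left and right keys of the remote controller; the item names used here are placeholders, not part of the claimed interface.

    // Illustrative model of a selector (focus object) moving across the items
    // laid out in a view display area; left/right remote-control keys move the
    // focus, and the focused item would be drawn with a highlighted border.
    public final class FocusNavigation {
        private final String[] items;
        private int focusIndex;

        public FocusNavigation(String... items) {
            this.items = items;
            this.focusIndex = 0;
        }

        public void moveLeft()  { if (focusIndex > 0) focusIndex--; }
        public void moveRight() { if (focusIndex < items.length - 1) focusIndex++; }

        public String focusedItem() { return items[focusIndex]; }

        public static void main(String[] args) {
            FocusNavigation nav =
                new FocusNavigation("movie item", "application item", "signal source item");
            nav.moveRight();
            System.out.println(nav.focusedItem()); // "application item"
        }
    }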
The event transmission system 2914, which may monitor user input for each predefined event or sub-event, provides control identifying the event or sub-event directly or indirectly to the interface layout management module 2913.
The interface layout management module 2913 is configured to monitor a user interface state (including a position and/or a size of a view partition, an item, a focus, or a cursor object, a change process, etc.), and according to the event or the sub-event, modify a layout of a size and a position, a hierarchy, etc. of a view display area, and/or adjust or modify a size or/and a layout of a position, a number, a type, a content, etc. of a layout of various items in the view display area. In some embodiments, modifying and adjusting the layout includes displaying or not displaying item content in each view partition or view partition on the screen.
And a user input interface for transmitting an input signal of a user to the controller or transmitting a signal output from the controller to the user. Illustratively, the control device (e.g., mobile terminal or remote control) may send input signals such as power switch signals, channel selection signals, volume adjustment signals, etc., input by the user to the user input interface, which may then be forwarded to the controller; alternatively, the control device may receive an output signal such as audio, video, or data, which is output from the user input interface via the controller, and display the received output signal or output the received output signal in the form of audio or vibration.
In some embodiments, a user may input a user command through a user interface displayed on the display device 200, and the user input interface receives the user input command through the user interface. Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
A "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user, which enables conversion between an internal form of information and a user-acceptable form. A commonly used presentation form of a user interface is a Graphical User Interface (GUI), which refers to a user interface graphically displayed in connection with computer operations. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
Fig. 10-12 schematically illustrate interactions with a user by a user interface in a display device 200 according to an exemplary embodiment.
As shown in fig. 10 to 11, a video picture, which may be an image content of a movie, a television series, a video clip, or a video picture in an application, or the like, is displayed in the first display screen (specifically, the first view display area 2011). Meanwhile, the controller receives the incoming call request, and the controller displays an incoming call reminder screen on the second display screen (specifically, the second view display area 2021) in response to the incoming call request. The incoming call reminder screen includes a tab for indicating a request to answer an incoming call and a tab for rejecting the request to answer the incoming call.
After the controller displays the incoming call reminding screen on the second display screen, processing can proceed in either of the following two alternative ways.
In a first alternative, the controller displays a chat screen on the first display screen in response to the user's input of the tab for answering the incoming call request. Further, the controller may also display a small-window video call screen on the second display screen, or in a target area of the first display screen, in response to a small-window video call instruction from the user.
The user's input of the tab for answering the incoming call request may also be referred to as the user inputting an indication to answer the incoming call request.
In a second alternative, the controller controls the second display screen to present the chat screen in response to the user's input of the tab for answering the incoming call request. Further, the controller may also control the chat screen to switch from the second display screen to the first display screen in response to a user selection of the chat screen.
In the two modes, the incoming call connection request may refer to, for example, receiving a video incoming call, and the chat screen may refer to, for example, a video call screen.
It should be noted that in the embodiment of the present application, the number of the controllers may be one or more. When the controller is one, the controller receives an incoming call request when the first display screen displays the video picture, and displays an incoming call reminding picture on the second display screen. When the controllers are multiple, the first display screen displays the video picture through the cooperative interaction among the controllers, and the second display screen displays the incoming call reminding picture. In one embodiment, the controller may include a first controller for receiving an incoming call request and sending an incoming call indication to a second controller when the first display screen displays a video screen, and a second controller for receiving the incoming call indication and displaying an incoming call reminder screen on the second display screen. The following embodiments of the present application are each described by taking the example that the controller includes the first controller and the second controller, but this should not be construed as limiting the present application.
Referring to fig. 10, if the type of the incoming call is a video-based incoming call such as a video call or a game invitation, the second controller may include items such as the caller's portrait, name, an incoming call prompt, an answer video button, an answer-only-voice button and a hang-up button in the incoming call reminding screen displayed in the second view display area 2021, and at the same time the second controller positions the focus object on the answer video button.
Referring to fig. 11, if the type of the incoming call is a voice call, the second controller may include items such as the caller's portrait, name, an incoming call prompt, an answer voice button and a hang-up button in the incoming call reminding screen displayed in the second view display area 2021, and at the same time the second controller positions the focus object on the answer voice button. The user can move the focus object by operating the left and right keys on the remote controller to select a button, and press the OK key to confirm.
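A minimal sketch, under the assumption of the button names shown in fig. 10 and fig. 11, of how the items of the incoming call reminding screen and the default focus position could be derived from the incoming call type; this is illustrative only and does not limit the actual composition of the screen.

    import java.util.List;

    // Hypothetical selection of reminder-screen items and the default focus
    // position based on the incoming call type (cf. fig. 10 and fig. 11).
    public final class IncomingCallReminder {
        public enum CallType { VIDEO_CALL, GAME_INVITATION, VOICE_CALL }

        public static List<String> items(CallType type) {
            if (type == CallType.VOICE_CALL) {
                return List.of("answer voice", "hang up");
            }
            return List.of("answer video", "answer only voice", "hang up");
        }

        // The focus object starts on the most likely choice for the call type.
        public static String defaultFocus(CallType type) {
            return type == CallType.VOICE_CALL ? "answer voice" : "answer video";
        }
    }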
Taking the first alternative described above as an example, if the user presses the OK key while the focus object is positioned on the answer video button of fig. 10, in one embodiment the first display screen and the second display screen may display the screen shown in fig. 13 or fig. 14. Fig. 13 is a schematic diagram of a six-party video call, and fig. 14 is a schematic diagram of a two-party video call. In the screen shown in fig. 13 or fig. 14, the user may press the hover small-window button, and the screen switches to the screen shown in fig. 17; or the user may press the switch layout button, and the screen switches to the screen shown in fig. 18. Alternatively, the first display screen and the second display screen may display the screen shown in fig. 17 or fig. 18, the user may press a specific menu key, for example the OK key, and the screen switches to the screen shown in fig. 13 or fig. 14.
If the focus object is positioned on the answer-only-voice button of fig. 10 and the OK key is pressed, the first display screen and the second display screen may display the screen shown in fig. 15. As shown in fig. 15, the first display screen may continue to display the video picture; in some embodiments, playback of the video may be paused at this time. The second display screen may display the portraits of the participants of the voice call other than the current user. It should be understood that, in the embodiments of the present application, the current user refers to the user corresponding to the display device. Meanwhile, the position corresponding to the current user on the second display screen is displayed as a black screen, and a sign indicating that the video is not turned on is superimposed on the black screen.
If the focus object is positioned on the answer voice button of fig. 11 and the OK key is pressed, a voice sign may be displayed on the second display screen to indicate that a voice call is currently in progress.
If the user presses the OK key while the focus object is positioned on the hang-up button of fig. 10 or fig. 11, the first display screen and the second display screen may display the screen shown in fig. 12. The screen shown in fig. 12 is the screen displayed after the incoming call request is not answered, or after the voice call or video call ends. As shown in fig. 12, the first display screen may continue to play the video, while the second display screen may display information such as a voice assistant, the weather, the date and the time.
As described above, fig. 13 and 14 are schematic diagrams during a video call, and are described below.
As shown in fig. 13, a video call screen is displayed in the first view display area 2011, in which a video of each of six participants participating in a video call is displayed, and the video of each participant occupies a part of the area of the first view display area 2011. The lower left corner of the area where the video of each participant is located shows the name of the participant and the lower right corner shows the participant in a mute or non-mute state.
Illustratively, the video of user A is displayed in the upper left area of the first view display area. In the area corresponding to user A, the image of user A fills the whole area; the name of user A is superimposed at the lower left corner of the area, and a sign indicating that user A is currently muted is superimposed at the lower right corner of the area. Meanwhile, on top of the videos of the participants, items for user operation and the video call duration are displayed in a semitransparent form. These items may include a hover small-window button, a switch layout button, an invite new participant button, a hang-up button, a video off button, a microphone off button, a screen sharing button, and so on. The items displayed may differ for different call types. The focus object may be positioned on one of the items.
Meanwhile, the second display screen does not display the incoming call reminding picture any more. Specifically, the second display screen may be in a state of turning off the backlight, or information such as weather, time, and the like may be displayed on the second display screen. Examples of interfaces after the user presses the hover widget button, switches layout buttons, and invites a new participant button will be described in the embodiments below. After the user presses the hang-up button, the interface displays the screen shown in fig. 12. After the user presses the video off button, the video corresponding to the user is in a black screen state, and the video of the current user is not displayed any more. After the user presses the microphone off button, a mute flag is displayed in the area where the user's video is located.
Illustratively, if the user is user A in fig. 13, after the user presses the microphone off button, the sign at the lower right corner of the area in which user A is located is displayed as a mute sign.
As shown in fig. 14, a video call screen is displayed in the first view display area 2011, in which videos of two participants participating in a video call are displayed, and each of the videos of the participants occupies half of the first view display area 2011. Meanwhile, on top of the videos of the two participants, items for user operation are displayed in a semi-transparent form. These items may include a hover widget button, a beauty filter button, an invite friend button, a hang-up button, a video off button, a microphone off button, a switch layout button, and the like. The focus object may be positioned on one of the items, for example, on a hover widget button.
Specifically, the second display screen may be in a state in which the backlight is turned off, or information such as the weather and the time may be displayed on the second display screen. Examples of the interfaces after the user presses the hover small-window button, the switch layout button, the invite friend button and the beauty filter button will be described in the embodiments below. After the user presses the hang-up button, the interface displays the screen shown in fig. 12. After the user presses the video off button, the area corresponding to the user is in a black-screen state and the video of the current user is no longer displayed. After the user presses the microphone off button, a mute sign is displayed in the area where the user's video is located. Illustratively, if the user is user A in fig. 13, after the user presses the microphone off button, the sign at the lower right corner of the area in which user A is located is displayed as a mute sign.
Fig. 16 is a schematic diagram of an interaction process of the first controller and the second controller, and as shown in fig. 16, the interaction process of the first controller and the second controller includes:
S1601, the first controller displays a video picture on the first display screen.
In some embodiments, the video picture may be a movie, image content of a television show, a video clip, or a video picture in an application, etc.
And if the video picture also has corresponding audio content, the first controller also plays the audio information corresponding to the video picture at the same time.
S1602, the first controller receives an incoming call request.
In some embodiments, the incoming call request may be a video call request, a voice call request, a game invitation, or the like.
S1603, the first controller sends an incoming call instruction to the second controller.
And if the first controller judges that the video picture is currently displayed on the first display screen, the first controller can send an incoming call instruction to the second controller in a serial port transmission mode. The incoming call indication may include information of the type of the incoming call and the caller, and the second controller selects the item to be displayed according to the information. Or the incoming call indication can comprise information of an incoming call type and an incoming call person, and also comprises an item to be displayed by the second controller on the second display screen, and the second controller directly displays the item based on the incoming call indication.
The types of incoming calls may include video calls, voice calls, game invitations, and the like.
The caller information may include the caller's portrait, name, etc.
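The application does not limit how the incoming call indication is encoded on the serial port; the following sketch merely assumes a simple delimited text line carrying the incoming call type and caller information, with hypothetical field names.

    // Hypothetical encoding of the incoming call indication sent from the
    // first controller to the second controller (for example over a serial
    // port). The field layout is an assumption for illustration only.
    public final class IncomingCallIndication {
        private final String callType;     // e.g. "VIDEO_CALL", "VOICE_CALL", "GAME_INVITATION"
        private final String callerName;   // caller information
        private final String portraitUri;  // reference to the caller's portrait

        public IncomingCallIndication(String callType, String callerName, String portraitUri) {
            this.callType = callType;
            this.callerName = callerName;
            this.portraitUri = portraitUri;
        }

        public String encode() {
            return String.join(";", callType, callerName, portraitUri);
        }

        public static IncomingCallIndication decode(String line) {
            String[] f = line.split(";", 3);
            return new IncomingCallIndication(f[0], f[1], f[2]);
        }
    }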
Correspondingly, the second controller receives the incoming call indication.
S1604, the second controller displays an incoming call reminding picture on the second display screen.
The second controller displays the incoming call reminding screen in the second view display area of the second display screen, the incoming call reminding screen including items such as the caller's portrait, name, an incoming call prompt, an answer video button, an answer-only-voice button and a hang-up button. If the type of the incoming call is a video call, a game invitation or the like, the second controller may position the focus object on the answer video button; if the type of the incoming call is a voice call, the focus object is positioned on the answer voice button.
After the second controller displays the incoming call reminding picture on the second display screen, the user can move the focus object by operating the left key and the right key on the remote controller to select the button and press the OK key to confirm. Alternatively, the user may directly press the OK key to confirm.
S1605, the second controller receives an incoming call indication of the user answering the video.
S1606, the second controller sends a video call instruction to the first controller.
After the user presses the OK key, the second controller recognizes the user's instruction based on the item on which the focus object is currently located. If the focus object is positioned on the hang-up button, the second controller confirms that a hang-up instruction has been received and sends a hang-up indication to the first controller; after receiving the hang-up indication, the first controller instructs the second controller to close the incoming call reminding screen, and the following steps are no longer executed. If the focus object is positioned on the answer-only-voice button, the second controller confirms that an instruction to answer with voice has been received and sends a voice call indication to the first controller; after receiving the voice call indication, the first controller performs control of the voice call, such as playing the caller's voice, receiving the answerer's voice and sending it to the caller's device, and so on, and at the same time the first controller may instruct the second controller to close the incoming call reminding screen and display the items of the voice call on the second display screen. The following steps are likewise no longer executed.
If the focus object is positioned on the answer video button, the second controller confirms that an answer video incoming call indication is received and sends a video call indication to the first controller.
Correspondingly, the first controller receives the video call instruction and executes the following steps.
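Steps S1605 and S1606 can be illustrated by the following sketch, in which the second controller maps the item on which the focus object rests when the OK key is pressed to the indication returned to the first controller; the item and indication names are assumptions made only for explanation.

    // Illustrative sketch of steps S1605/S1606: the currently focused item of
    // the incoming call reminding screen determines the indication that the
    // second controller sends back to the first controller.
    public final class ReminderSelectionDispatch {
        public static String indicationFor(String focusedItem) {
            switch (focusedItem) {
                case "answer video":      return "VIDEO_CALL_INDICATION";
                case "answer only voice": return "VOICE_CALL_INDICATION";
                case "hang up":           return "HANG_UP_INDICATION";
                default: throw new IllegalArgumentException("unknown item: " + focusedItem);
            }
        }
    }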
S1607, the first controller displays a video call screen on the first display screen.
In some embodiments, the video call screen may include the videos of the video call participants and items for user operation, where the items for user operation may be displayed on top of the videos of the video call participants.
In some embodiments, the first controller may determine the display position of each participant's video in the video call according to the number of participants in the video call. For example, assuming the video call is a two-party video call, the videos of the caller and the answerer each occupy half of the screen of the first display screen. For another example, assuming the video call is a six-party video call, each participant in the call occupies one sixth of the first display screen.
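A simplified sketch of determining each participant's video area from the number of participants is given below: the first display screen is divided into an equal grid, so that two participants get half the screen each and six participants get one sixth each. The grid heuristic is an assumption; the actual layout rules are not limited here.

    // Simplified layout calculation: split the first display screen into equal
    // cells, one per video call participant (2 participants -> half each,
    // 6 participants -> one sixth each). Real layouts may differ.
    public final class CallLayout {
        public static int[] gridFor(int participants) {
            int cols = (int) Math.ceil(Math.sqrt(participants));
            int rows = (int) Math.ceil(participants / (double) cols);
            return new int[] { rows, cols };
        }

        public static void main(String[] args) {
            System.out.println(java.util.Arrays.toString(gridFor(2))); // [1, 2]
            System.out.println(java.util.Arrays.toString(gridFor(6))); // [2, 3]
        }
    }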
In some embodiments, the first controller displays items for user operation over the video of the video call participant, which may include a hover widget button, a toggle layout button, an invite new participant button, a hang-up button, a video off button, a microphone off button, a screen share button, and the like. The items displayed for different call types may be different. The focus object may be positioned on one of the items, such as a hover widget button. The user can move the focus object by operating the left and right keys on the remote controller to select other items and confirm by operating the OK key on the remote controller.
S1608, the second controller closes the incoming call reminding picture.
It should be noted that the second controller may perform this step at any time after performing step S1606 above; the execution order of this step and S1607 is not fixed.
After the second controller closes the incoming call reminding picture, information such as the time, weather, temperature, and reminders may be displayed on the second display screen, or the second controller and the backlight of the second display screen may be turned off.
In the above embodiment, when the first display screen is playing a video picture, the first controller receives the incoming call request and displays the incoming call reminding picture on the second display screen, so the video picture being displayed on the first display screen is not blocked and the user experience is greatly improved. After the user chooses to answer the video call, a video call picture is displayed on the first display screen, so that the user can view the videos of all parties in the video call.
In some embodiments, in the above interaction procedure, information between the first controller and the second controller may be transmitted through a serial port.
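For the serial-port transmission mentioned here, a minimal Python sketch using the widely available pyserial package is shown below. The port name, baud rate, and newline-delimited JSON framing are illustrative assumptions, not details given in the embodiment.

import json
import serial  # pyserial

def send_indication(port_name, indication):
    """Send one newline-terminated JSON message from one controller to the other."""
    with serial.Serial(port_name, baudrate=115200, timeout=1) as port:
        port.write((json.dumps(indication) + "\n").encode("utf-8"))

def receive_indication(port_name):
    """Block until one message arrives and return it as a dict (None on timeout)."""
    with serial.Serial(port_name, baudrate=115200, timeout=1) as port:
        line = port.readline()
        return json.loads(line) if line else None

# Example: the first controller forwards an incoming call indication to the second controller.
send_indication("/dev/ttyS1", {"type": "incoming_call", "caller": "Alice", "category": "video"})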
In some embodiments, after the first controller displays the video call picture on the first display screen, the user can choose to carry out the video call in a small window. The first controller then displays the video call picture in a small window according to the user's selection while continuing to play the video picture that was shown before the video call, realizing a watch-while-chatting effect and greatly improving the user experience. Specifically, the user may press the hover widget button in fig. 13 or 14, or press the switch layout button in fig. 13 or 14, to make a small window video call.
Fig. 17 schematically illustrates interaction between the display device 200 and a user through a user interface according to an exemplary embodiment. As shown in fig. 17, in response to the small window video call indication input by the user, the video call picture in the first view display area 2011 is closed, and the video picture that was being displayed before the video call continues to be displayed. Here, the small window video call indication is input by the user pressing the hover widget button described above. Meanwhile, a small window video call picture, in which the videos of the respective video participants are displayed, is shown in the second view display area 2021.
In some embodiments, the lower left corner of the area where each participant's video is located shows the participant's name, and the lower right corner shows whether the participant is muted or unmuted. In some embodiments, a line of prompt text (not shown), such as "switch window by menu key", may be superimposed over the video of each participant in the second view display area 2021, so that the user can return to the display shown in fig. 13 or 14 according to the prompt text.
In some embodiments, after the videos of the video participants are displayed in the second view display area 2021, the user may press a specific button (e.g., a return button) on the remote controller, and the display device returns to the display shown in fig. 13 or 14. Alternatively, a return button may be displayed in the second view display area; the user selects this button by operating the remote controller and confirms, and the display device returns to the display shown in fig. 13 or 14.
Fig. 18 schematically illustrates interaction between the display device 200 and a user through a user interface according to an exemplary embodiment. As shown in fig. 18, in response to the small window video call indication input by the user, the video call picture in the first view display area 2011 is closed, and the video picture that was being displayed before the video call continues to be displayed. Here, the small window video call indication is input by the user pressing the switch layout button described above. Meanwhile, a small window video call picture is displayed below the first view display area 2011, near the lower edge of the first display screen, and the videos of all the video participants are displayed in this small window video call picture. In some embodiments, the lower left corner of the area where each participant's video is located shows the participant's name, and the lower right corner shows whether the participant is muted or unmuted.
In some embodiments, after the videos of the video participants are displayed in the upper right corner of the first view display area 2011, the user may press a specific button (e.g., a return button) on the remote controller, and the display device returns to the display shown in fig. 13 or 14. Alternatively, a return button may be displayed below the first view display area 2011, near the lower edge of the first display screen; the user selects this button by operating the remote controller and confirms, and the display device returns to the display shown in fig. 13 or 14.
Fig. 19 is a schematic diagram of an interaction process between the first controller and the second controller. As shown in fig. 19, after the video call picture is displayed on the first display screen, the user may choose to display the video call picture in a small window, and the first controller may notify the second controller to display the small window video call picture on the second display screen. The interaction process between the first controller and the second controller includes:
S1901, the first controller receives a small window video call instruction input by the user.
The user may select the hover widget button displayed on the first display screen and confirm, whereupon the first controller receives the small window video call indication.
S1902, the first controller sends a small window video call instruction to the second controller.
The small window video call indication instructs the second controller to display a small window video call picture on the second display screen. For example, the first controller may carry the number of video call participants, the participants' names, and other information in the indication.
Correspondingly, the second controller receives the small window video call indication.
S1903, the first controller closes the video call picture and displays the video picture that was shown before the video call.
S1904, the second controller displays a small window video call picture on the second display screen.
After receiving the small window video call indication, the second controller displays the videos, names, and so on of all the participants in several areas of the second display screen, based on the number of call participants, their names, and other information carried in the indication.
Steps S1903 and S1904 may be executed in either order.
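The S1901-S1904 exchange can be summarized in a few lines of Python. The class and method names are hypothetical, and the message transport between the two controllers is reduced to a direct callback, so this is a sketch of the control flow rather than an implementation.

class FirstController:
    """Sketch of the first controller's side of the small window video call exchange."""

    def __init__(self, link_to_second):
        self.link_to_second = link_to_second

    def on_widget_call_requested(self, participants):
        # S1902: tell the second controller to draw the small window video call picture,
        # carrying the participant names (and hence their number) in the indication.
        self.link_to_second({"type": "widget_video_call", "participants": participants})
        # S1903: close the full video call picture and resume the earlier video picture.
        print("first display: video call picture closed, previous video resumed")

class SecondController:
    """Sketch of the second controller's side."""

    def on_indication(self, indication):
        if indication.get("type") == "widget_video_call":
            # S1904: lay the participants out in several areas of the second display screen.
            for name in indication["participants"]:
                print(f"second display: showing video area for {name}")

second = SecondController()
first = FirstController(link_to_second=second.on_indication)
first.on_widget_call_requested(["Alice", "Bob"])  # S1901: user pressed the hover widget button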
In some embodiments, after the second controller displays the small window video call picture on the second display screen, the user may press a specific button (e.g., a return button) on the remote controller, or select a return button in the small window video call picture through the remote controller and confirm. Based on the user's operation, the first controller receives the user's return instruction; in response to it, the first controller sends an instruction to close the small window video call picture to the second controller, and the second controller closes the small window video call picture in response to that instruction. At the same time, the first controller redisplays the above video call picture on the first display screen.
In some embodiments, in the above interaction procedure, information between the first controller and the second controller may be transmitted through a serial port.
In another embodiment, after receiving the small window video call instruction input by the user, the first controller closes the video call picture and displays a small window video call picture in a target area of the first display screen.
The first controller may determine the size of the target area according to the number of video call participants and display the small window video call picture in the target area.
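One way the target area could scale with the number of participants is sketched below; the per-participant tile size, the horizontal centering, and the lower-edge anchoring are assumptions made only for illustration.

def widget_target_area(screen_width, screen_height, participant_count,
                       tile_width=320, tile_height=180):
    """Return a target rectangle sized to fit one video tile per participant."""
    width = min(tile_width * participant_count, screen_width)
    return {
        "x": (screen_width - width) // 2,   # centered horizontally
        "y": screen_height - tile_height,   # anchored near the lower edge
        "width": width,
        "height": tile_height,
    }

# A two-party small window video call on a 1920x1080 first display screen.
print(widget_target_area(1920, 1080, 2))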
In some embodiments, after the first controller displays the small window video call picture in the target area of the first display screen, the user may press a specific button (e.g., a return button) on the remote controller, or select a return button in the small window video call picture through the remote controller and confirm. Based on the user's operation, the first controller receives the user's return instruction; in response to it, the first controller closes the small window video call picture and, at the same time, redisplays the above video call picture on the first display screen.
Fig. 20 schematically illustrates interaction between the display device 200 and a user through a user interface according to an exemplary embodiment. As shown in fig. 20, in response to an invitation indication input by the user, a buddy list available for invitation is displayed on top of the video call picture shown in the first view display area 2011. The invitation indication is input by the user pressing the invite new participant button or invite buddy button described above.
In some embodiments, the buddy list may be displayed above the first view display area 2011, near the upper edge of the first display screen, and the portrait and name of each buddy may be shown in the list. After the user selects the portrait or name of a buddy and presses the OK key, that buddy joins the video call, and the buddy's video is added to the first view display area.
Fig. 21 schematically illustrates interaction between the display device 200 and a user through a user interface according to an exemplary embodiment. As shown in fig. 21, in response to a beauty filter indication input by the user, beauty filter options are displayed on top of the video call picture shown in the first view display area 2011. The beauty filter options may include background, sticker, filter, and beauty options. Under the background option, the user may select any one of a variety of backgrounds. Under the sticker option, the user may select any one of a variety of stickers. Under the filter option, the user may select any one of the filter modes. Under the beauty option, the user may select any one of the beauty modes. In some embodiments, the beauty filter options may be displayed below the first view display area 2011, near the lower edge of the first display screen, and the other buttons originally displayed at that position are temporarily hidden.
The above describes the interactive processing procedure of the first controller and the second controller and the interface display procedure by taking the first alternative manner as an example. For the second alternative manner described above, a corresponding processing and display procedure may be used: after receiving the answer-video-call instruction input by the user, the second controller displays the video call picture on the second display screen in the same manner as the first controller does in the first alternative manner, and the first controller displays the video call picture on the first display screen after the user presses the confirm key. The specific implementation process is not described here again.
Fig. 22 is a schematic flow chart of a display method provided in an embodiment of the present application. The method is mainly executed by the above-mentioned controller and, as shown in fig. 22, includes:
S2201, receiving an incoming call request when the first display screen displays a video picture.
S2202, in response to the incoming call request, displaying an incoming call reminding picture on a second display screen.
For the specific implementation of the above steps, reference may be made to the foregoing embodiments; details are not repeated here.
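Read together, S2201 and S2202 amount to a routing decision: while the first display screen is showing a video picture, the incoming call reminder goes to the second display screen. The minimal Python sketch below illustrates only that decision; the function names and the else branch (showing the reminder on the first display screen when no video is playing, as in claim 2) are illustrative assumptions about how a controller might wire this up.

def handle_incoming_call(first_screen_playing_video, show_on_second_display,
                         show_on_first_display, reminder):
    """Route the incoming call reminding picture so a playing video is never blocked."""
    if first_screen_playing_video:
        # S2202: keep the video on the first display screen, reminder on the second.
        show_on_second_display(reminder)
    else:
        # Otherwise the reminder may be shown on the first display screen (cf. claim 2).
        show_on_first_display(reminder)

# Example wiring with simple print-based display hooks.
handle_incoming_call(
    first_screen_playing_video=True,
    show_on_second_display=lambda r: print("second display:", r),
    show_on_first_display=lambda r: print("first display:", r),
    reminder="incoming video call from Alice",
)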
In this embodiment, when the first display screen is playing a video picture, the controller receives the incoming call request and displays the incoming call reminding picture on the second display screen, so the video picture being displayed on the first display screen is not blocked and the user experience is greatly improved.
In some embodiments, the controller includes a first controller and a second controller.
The controller receiving an incoming call request when the first display screen displays a video picture includes:
the first controller receiving an incoming call request when the first display screen displays a video picture.
The controller displaying, in response to the incoming call request, an incoming call reminding picture on the second display screen includes:
the first controller, in response to the incoming call request, sending an incoming call indication to the second controller; and
the second controller receiving the incoming call indication and, in response to the incoming call indication, displaying an incoming call reminding picture on the second display screen.
In some embodiments, the above method further comprises:
after the incoming call reminding picture is displayed on the second display screen, in response to an event input by the user for connecting the incoming call request, controlling the second display screen to present a chat picture.
In some embodiments, the above method further comprises:
in response to the user's selection of the chat picture, controlling the chat picture to be switched from the second display screen to the first display screen.
In some embodiments, the above method further comprises:
the second controller receives an answering-video-incoming-call instruction input by the user, and in response to it, sends a video call indication to the first controller and closes the incoming call reminding picture.
The first controller receives the video call indication sent by the second controller and, in response to the video call indication, displays a video call picture on the first display screen.
In some embodiments, the above method further comprises:
the first controller receives a small window video call instruction input by a user, responds to the small window video call instruction, closes the video call picture and sends the small window video call instruction to the second controller. And the second controller responds to the small window video call indication and displays a small window video call picture on the second display screen.
In some embodiments, the above method further comprises:
the first controller receives a small window video call instruction input by a user, responds to the small window video call instruction, closes the video call picture, and displays the small window video call picture in a target area of the first display screen.
In some embodiments, after closing the video call screen, the method further comprises:
and the first controller displays the video picture on the first display screen.
In some embodiments, the present application also provides a storage medium having instructions stored therein, which when run on a computer, cause the computer to perform the method as in the method embodiments described above.
In some embodiments, the present application further provides a chip for executing the instructions, where the chip is configured to perform the method in the method embodiment.
The present application also provides a program product, which comprises a computer program stored in a storage medium, from which at least one processor can read the computer program, and the at least one processor can implement the method in the above method embodiments when executing the computer program.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship; in a formula, the character "/" indicates a "division" relationship between the associated objects before and after it. "At least one of" the following items or similar expressions means any combination of these items, including any combination of single items or plural items. For example, at least one of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be single or plural.
It will be appreciated that the various numerical numbers referred to in the embodiments of the present application are merely for ease of description and are not intended to limit the scope of the embodiments of the present application.
It should be understood that, in the embodiments of the present application, the sequence numbers of the processes do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments can still be modified, or some or all of their technical features can be replaced by equivalents, and such modifications and substitutions do not depart from the spirit of the invention.

Claims (19)

1. A display device, characterized by comprising:
a first display screen configured to display a first user interface;
a second display screen configured to display a second user interface;
a controller configured to:
responding to an incoming call request, and detecting a media asset category displayed in the first user interface;
and responding to the media asset type as a video picture, controlling the second display screen to display an incoming call reminding picture corresponding to the incoming call request in the second user interface, wherein the incoming call reminding picture comprises an operation control for indicating to connect the incoming call request and an operation control for rejecting the incoming call request.
2. The display device of claim 1, wherein the controller is further configured to:
and responding to the media asset type as a non-video picture, and controlling the first display screen to display an incoming call reminding picture corresponding to the incoming call request in the first user interface.
3. The display device of claim 1, wherein the controller is further configured to:
closing the incoming call reminding picture in response to an event input by a user for switching on the incoming call request;
and controlling the first display screen to pause playing of the media data in the first user interface, and controlling the first display screen to display a chat picture of the incoming call participant in the first user interface, wherein the chat picture displays chat time and operation controls for user operation in a semitransparent mode.
4. The display device of claim 3, wherein the controller is further configured to:
responding to a small window video call instruction input by a user, and controlling the first display screen to continuously display the media data in the first user interface;
and controlling the second display screen to display a small window video call picture in the second user interface, wherein the small window video call picture comprises a video picture of a video participant.
5. The display device of claim 3, wherein the controller is further configured to:
responding to a small window video call instruction input by a user, and controlling the first display screen to close the chat picture in the first user interface;
and controlling the first display screen to display a small window video call picture in a target area in the first user interface.
6. The display device of claim 3, wherein the controller is further configured to:
responding to a switching request of a user on the chat screen, controlling the first display screen to close the chat screen in the first user interface, and controlling the first display screen to play media data in the first user interface;
and controlling the second display screen to display a chat picture of the incoming call participant on the second user interface.
7. The display device of claim 1, wherein the controller is further configured to:
acquiring an incoming call category in the incoming call request, wherein the incoming call category at least comprises a video category and a voice category;
if the incoming call category is the video category, controlling the second display screen to display a first control containing answering videos in the second user interface, and setting a focus on the first control;
And if the incoming call category is the voice category, controlling the second display screen to display a second control containing voice answering in the second user interface, and setting a focus on the second control.
8. The display device of claim 1, wherein the controller is further configured to:
closing the incoming call reminding picture in response to an event input by a user for switching on the incoming call request;
and controlling the second display screen to display a chat picture of the incoming call participant in the second user interface.
9. The display device of claim 8, wherein the controller is further configured to:
responding to a switching request of a user on the chat picture, and controlling the first display screen to pause playing of media asset data in the first user interface;
and controlling the second display screen to close the chat screen, and controlling the first display screen to display the chat screen of the incoming call participant in the first user interface.
10. The display device of claim 1, wherein the controller is further configured to:
and responding to an event input by a user for rejecting the incoming call request, or responding to an incoming call ending event corresponding to the incoming call request, controlling the first display screen to continuously display the first user interface, and controlling the second display screen to display preset information or setting the second display screen to be in a backlight state.
11. A display method, characterized by being applied to a display device, the display device including a first display screen, a second display screen, and a controller, the display method comprising:
responding to the incoming call request, and detecting the media resource category displayed in the first user interface;
and responding to the media asset type as a video picture, controlling the second display screen to display an incoming call reminding picture corresponding to the incoming call request in a second user interface, wherein the incoming call reminding picture comprises an operation control for indicating to connect the incoming call request and an operation control for rejecting the incoming call request.
12. A display device comprising a first display screen, a second display screen, a first controller, and a second controller, wherein:
a first display screen configured to display a first user interface;
a second display screen configured to display a second user interface;
a first controller configured to:
responding to an incoming call request, and detecting a media asset category displayed in the first user interface;
responding to the media asset type as a video picture, and sending an incoming call indication corresponding to the incoming call request to a second controller, so that the second controller controls the second display screen to display an incoming call reminding picture in the second user interface after receiving the incoming call indication; the incoming call reminding picture comprises an operation control used for indicating to connect the incoming call request and an operation control used for rejecting the incoming call request;
A second controller configured to:
receiving the incoming call indication sent by the first controller, and controlling the second display screen to display the incoming call reminding picture in the second user interface after receiving the incoming call indication.
13. The display device of claim 12, wherein the first controller is further configured to:
and responding to the media asset type as a non-video picture, and controlling the first display screen to display an incoming call reminding picture corresponding to the incoming call request in the first user interface.
14. The display device of claim 12, wherein the second controller is further configured to:
analyzing the incoming call reminding picture, wherein the incoming call reminding picture comprises an incoming call category, and the incoming call category at least comprises a video category and a voice category;
if the incoming call category is the video category, controlling the second display screen to display a first control containing answering video in the second user interface, and positioning a focus on the first control;
and if the incoming call category is the voice category, controlling the second display screen to display a second control containing voice answering in the second user interface, and positioning a focus on the second control.
15. The display device of claim 12, wherein the second controller is further configured to:
receiving an incoming call answering video indication input by a user;
and responding to the answering video incoming call indication, sending a video call indication to the first controller, and closing the incoming call reminding picture.
16. The display device of claim 15, wherein the first controller is further configured to:
and responding to the video call instruction, controlling the first display screen to pause playing of the media data in the first user interface, and controlling the first display screen to display a video call picture in the first user interface, wherein the video call picture displays chat time and an operation control for user operation in a semitransparent mode.
17. The display device of claim 16, wherein the second controller is further configured to:
identifying user indication through the operation control, wherein the operation control at least comprises a hang-up control and a voice answering control;
responding to the operation of the hang-up control input by a user, and sending a hang-up instruction corresponding to the hang-up control to a first controller, so that the first controller controls the second controller to close the incoming call reminding picture after receiving the hang-up instruction;
responding to the operation of the answering voice control input by a user, sending an answering voice instruction corresponding to the answering voice control to the first controller, so that after receiving the answering voice instruction, the first controller plays the incoming call voice and controls the second display screen to display a voice call picture in the second user interface.
18. The display device of claim 16, wherein the first controller is further configured to:
receiving a small window video call instruction input by a user;
and responding to the small window video call instruction, controlling the first display screen to close the video call picture in the first user interface, and sending the small window video call instruction to the second controller so that the second controller can display the small window video call picture on the second display screen after receiving the small window video call instruction.
19. A display method, characterized by being applied to a display device, the display device including a first display screen, a second display screen, a first controller, and a second controller, the display method comprising:
responding to the incoming call request, and detecting the media resource category displayed in the first user interface;
Responding to the media asset type as a video picture, and sending an incoming call indication corresponding to the incoming call request to a second controller, so that the second controller controls the second display screen to display an incoming call reminding picture in a second user interface after receiving the incoming call indication; the incoming call reminding picture comprises an operation control used for indicating to connect the incoming call request and an operation control used for rejecting the incoming call request;
the second controller receives the incoming call indication sent by the first controller, and after receiving the incoming call indication, controls the second display screen to display the incoming call reminding picture in the second user interface.
CN202310133508.8A 2019-11-04 2020-04-23 Display device and display method Pending CN116320554A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201911067365 2019-11-04
CN2019110673655 2019-11-04
CN202010327361.2A CN112788381B (en) 2019-11-04 2020-04-23 Display apparatus and display method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202010327361.2A Division CN112788381B (en) 2019-11-04 2020-04-23 Display apparatus and display method

Publications (1)

Publication Number Publication Date
CN116320554A true CN116320554A (en) 2023-06-23

Family

ID=75750043

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310133508.8A Pending CN116320554A (en) 2019-11-04 2020-04-23 Display device and display method
CN202010327361.2A Active CN112788381B (en) 2019-11-04 2020-04-23 Display apparatus and display method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202010327361.2A Active CN112788381B (en) 2019-11-04 2020-04-23 Display apparatus and display method

Country Status (2)

Country Link
CN (2) CN116320554A (en)
WO (1) WO2021088326A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113676689A (en) * 2021-08-18 2021-11-19 百度在线网络技术(北京)有限公司 Video call method and device and television

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101763595B1 (en) * 2010-11-16 2017-08-01 엘지전자 주식회사 Method for processing data for monitoring service in network tv and the network tv
KR101991862B1 (en) * 2011-02-10 2019-06-24 삼성전자주식회사 Portable device having touch screen display and method for controlling thereof
JP5547766B2 (en) * 2012-03-28 2014-07-16 京セラ株式会社 COMMUNICATION DEVICE, COMMUNICATION METHOD, AND COMMUNICATION PROGRAM
CN103780864B (en) * 2012-10-18 2017-10-03 腾讯科技(深圳)有限公司 Video calling interface display method and device
CN103051972B (en) * 2012-12-21 2019-01-29 康佳集团股份有限公司 A kind of control method and system sent a telegram here by smart television prompting mobile telephone set
CN105592193A (en) * 2014-10-20 2016-05-18 中国电信股份有限公司 Method and system for displaying incoming call on multiple screens
CN106445441A (en) * 2016-09-27 2017-02-22 宇龙计算机通信科技(深圳)有限公司 Information display method and system
CN106534999A (en) * 2016-11-09 2017-03-22 合浦县文化体育和旅游局 TV set capable of video chats
CN106791571B (en) * 2017-01-09 2020-05-19 宇龙计算机通信科技(深圳)有限公司 Video display method and device for double-screen display terminal
CN106953990B (en) * 2017-03-29 2020-06-16 青岛海信移动通信技术股份有限公司 Incoming call answering method of mobile terminal and mobile terminal
CN109379484B (en) * 2018-09-19 2020-09-25 维沃移动通信有限公司 Information processing method and terminal
CN109981885B (en) * 2019-02-03 2022-03-08 华为技术有限公司 Method for presenting video by electronic equipment in incoming call and electronic equipment
CN110166814A (en) * 2019-05-25 2019-08-23 Oppo广东移动通信有限公司 Video picture displaying method and relevant device
CN110572519A (en) * 2019-09-20 2019-12-13 青岛海信移动通信技术股份有限公司 Method for intercepting caller identification interface and display equipment

Also Published As

Publication number Publication date
CN112788381B (en) 2023-01-17
CN112788381A (en) 2021-05-11
WO2021088326A1 (en) 2021-05-14

Similar Documents

Publication Publication Date Title
CN111510753B (en) Display device
CN112073665B (en) Video call interface switching method on smart television
CN111526415B (en) Double-screen display equipment and HDMI switching method thereof
CN111510788B (en) Display method and display device for double-screen double-system screen switching animation
CN112788422A (en) Display device
CN111464840B (en) Display device and method for adjusting screen brightness of display device
CN112788378B (en) Display device and content display method
CN113141528B (en) Display device, boot animation playing method and storage medium
CN112783380A (en) Display apparatus and method
CN112784137A (en) Display device, display method and computing device
CN112788423A (en) Display device and display method of menu interface
CN112839254A (en) Display apparatus and content display method
CN112788381B (en) Display apparatus and display method
CN112788375B (en) Display device, display method and computing device
CN113938633B (en) Video call processing method and display device
CN112911354B (en) Display apparatus and sound control method
CN113365124A (en) Display device and display method
CN112788380B (en) Display device and display method
CN112788387A (en) Display apparatus, method and storage medium
CN113497966B (en) Double-screen display equipment
CN112786036B (en) Display device and content display method
CN113630633B (en) Display device and interaction control method
CN113495702B (en) Interactive invitation processing method and display equipment
CN113453079B (en) Control method for returning double-system-size double-screen application and display equipment
CN111970547B (en) Display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination