WO2021088326A1 - Display device and incoming call display method - Google Patents

Display device and incoming call display method

Info

Publication number
WO2021088326A1
WO2021088326A1 · PCT/CN2020/086461
Authority
WO
WIPO (PCT)
Prior art keywords
controller
screen
display
user
video
Prior art date
Application number
PCT/CN2020/086461
Other languages
English (en)
Chinese (zh)
Inventor
王学磊
丁佳一
穆聪聪
Original Assignee
海信视像科技股份有限公司 (Hisense Visual Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 海信视像科技股份有限公司 (Hisense Visual Technology Co., Ltd.)
Publication of WO2021088326A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/4104: Peripherals receiving signals from specially adapted client devices
    • H04N21/4122: Peripherals receiving signals from specially adapted client devices: additional display device, e.g. video projector
    • H04N21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204: User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/426: Internal components of the client; Characteristics thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312: Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316: Generation of visual interfaces for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N21/47: End-user applications
    • H04N21/478: Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788: Supplemental services communicating with other users, e.g. chatting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units

Definitions

  • the embodiments of the present application relate to smart display device technology, and in particular, to a display device and a caller ID display method.
  • smart TVs based on the Internet have emerged.
  • based on Internet technology, smart TVs have an open operating system and chips, and an open application platform that can support multiple functions such as audio and video, entertainment, and data, so as to meet the diverse needs of users and bring users a brand-new experience.
  • smart TVs can support voice interaction, and users can control smart TVs through voice.
  • a smart TV includes a display.
  • the information that the smart TV needs to display to the user is displayed on the screen.
  • a boot animation will be played, and pictures of the boot animation are displayed on one display.
  • the embodiment of the present application provides a dual-screen display device and a caller ID display method.
  • the technical solution is as follows.
  • an embodiment of the present application provides a display device including: a first display screen, a second display screen, and a controller.
  • the controller is configured to: when the first display screen displays a video screen, receive an incoming call request, and display an incoming call reminder screen on the second display screen
  • an embodiment of the present application also provides a caller ID display method, including:
  • the controller receives the call request when the first display screen displays the video picture
  • in response to the incoming call request, the controller displays an incoming call reminder screen on the second display screen.
  • An embodiment of the present application also provides a computing device, including:
  • a memory, used to store program instructions;
  • a processor, configured to call the program instructions stored in the memory and execute the method described in the second aspect above according to the obtained program.
  • when the first display screen is playing a video picture, the controller receives an incoming call request and then displays an incoming call reminder screen on the second display screen, so that the video picture displayed on the first display screen is not blocked, which greatly improves the user experience.
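  • A minimal Kotlin sketch of this routing rule follows; the names (CallerIdController, Screen, showReminder) and the else branch are invented for illustration and are not taken from the patent:

```kotlin
// Sketch of the routing rule: while the first screen plays video, an incoming call
// only drives a reminder on the second screen, so the video picture is never covered.
// All names here are illustrative, not taken from the patent.

data class IncomingCall(val callerId: String, val callerName: String)

interface Screen {
    val isPlayingVideo: Boolean
    fun showReminder(call: IncomingCall)
    fun showFullCallUi(call: IncomingCall)
}

class CallerIdController(
    private val firstScreen: Screen,   // main (video) screen
    private val secondScreen: Screen   // auxiliary screen
) {
    fun onIncomingCall(call: IncomingCall) {
        if (firstScreen.isPlayingVideo) {
            // Leave the first screen untouched; the reminder goes to the second screen.
            secondScreen.showReminder(call)
        } else {
            // Assumption for illustration: with no video playing, the call UI could use the main screen.
            firstScreen.showFullCallUi(call)
        }
    }
}
```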
  • FIG. 1 is a schematic diagram of an operation scenario between a display device and a control device provided in an exemplary embodiment of this application;
  • FIG. 2 is a configuration block diagram of a control device 100 provided in an exemplary embodiment of this application;
  • FIG. 3a is a schematic diagram of the hardware structure of a hardware system in a display device 200 provided in an exemplary embodiment of this application;
  • FIG. 3b is a schematic diagram of the hardware structure of a hardware system in another display device 200 provided in an exemplary embodiment of this application;
  • FIG. 4 is a schematic diagram of the connection relationship between the power supply board and the load in the display device 200 provided in an exemplary embodiment of this application;
  • FIG. 5 is a block diagram of the hardware architecture of the display device 200 shown in FIG. 3 in an exemplary embodiment of this application;
  • FIG. 6 is a schematic diagram of a functional configuration of a display device provided in an exemplary embodiment of this application.
  • FIG. 7 is a configuration block diagram of a software system in a display device 200 provided in an exemplary embodiment of this application;
  • FIG. 8 is a schematic diagram of an application layer of a display device provided in an exemplary embodiment of this application.
  • FIG. 9 is a schematic diagram of a user interface in a display device 200 provided in an exemplary embodiment of this application.
  • FIGS. 10-12 exemplarily show schematic diagrams of interaction between a user interface and a user in the display device 200 according to an exemplary embodiment;
  • FIGS. 13-15 exemplarily show schematic diagrams of interaction between a user interface and a user in the display device 200 according to an exemplary embodiment;
  • Figure 16 is a schematic diagram of the interaction process between the first controller and the second controller
  • FIG. 17 exemplarily shows a schematic diagram of interaction between a user interface and a user in the display device 200 according to an exemplary embodiment
  • FIG. 18 exemplarily shows a schematic diagram of interaction between a user interface and a user in the display device 200 according to an exemplary embodiment
  • Figure 19 is a schematic diagram of the interaction process between the first controller and the second controller
  • FIG. 20 exemplarily shows a schematic diagram of interaction between a user interface and a user in the display device 200 according to an exemplary embodiment
  • FIG. 21 exemplarily shows a schematic diagram of interaction between a user interface and a user in the display device 200 according to an exemplary embodiment
  • FIG. 22 is a schematic flowchart of a caller ID display method provided by an embodiment of this application.
  • This application is mainly directed to a display device with a dual-system and dual-display structure, that is, a display device with a first controller (a first hardware system), a second controller (a second hardware system), a first display screen, and a second display screen.
  • various external device interfaces are usually provided on the display device to facilitate the connection of different peripheral devices or cables to achieve corresponding functions.
  • when a high-resolution camera is connected to the interface of the display device, if the hardware system of the display device does not have a hardware interface capable of receiving the camera's source data, the data captured by the camera cannot be presented on the screen of the display device.
  • the hardware system of a traditional display device supports only one hard decoding resource, and usually supports only 4K-resolution video decoding; therefore, to realize video chat while watching Internet TV without reducing the definition of the network video picture, the hard decoding resource (usually the GPU in the hardware system) must be used to decode the network video.
  • in this case, the video chat picture can only be processed by soft decoding on a general-purpose processor such as the CPU, which greatly increases the data processing burden on the CPU.
  • when the data processing burden on the CPU is too heavy, the picture may freeze or become unsmooth.
  • in addition, when CPU soft decoding is used to process the video chat picture, it is usually impossible to achieve multi-channel video calls, and when the user wants to simultaneously video chat with multiple other users in the same chat scene, access will be blocked.
  • in view of this, this application discloses a dual hardware system architecture to implement multiple channels of video chat data (including at least one channel of local video).
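  • The decode-resource constraint above can be pictured with a small Kotlin sketch; the enum values and the assignDecoders helper are illustrative assumptions, not the patent's mechanism:

```kotlin
// Illustrative sketch: with a single hard-decoding resource, only one stream can be
// hardware-decoded; extra video-chat streams would fall back to CPU soft decoding,
// which is the situation the dual-hardware design is meant to avoid.

enum class Decoder { HARDWARE_GPU, SOFTWARE_CPU, SECOND_SYSTEM }

fun assignDecoders(mainVideo: String, chatStreams: List<String>, dualHardware: Boolean): Map<String, Decoder> {
    val plan = mutableMapOf(mainVideo to Decoder.HARDWARE_GPU)  // keep full 4K definition
    for (stream in chatStreams) {
        plan[stream] = if (dualHardware) Decoder.SECOND_SYSTEM   // offload to the second chip
                       else Decoder.SOFTWARE_CPU                 // single-chip fallback: CPU load grows
    }
    return plan
}
```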
  • the term "module" used in the various embodiments of this application may refer to any known or later-developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that can execute the function related to the component.
  • remote control used in the various embodiments of this application refers to a component of an electronic device (such as the display device disclosed in this application), which can generally control the electronic device wirelessly within a short distance.
  • This component can generally use infrared and/or radio frequency (RF) signals and/or Bluetooth to connect with electronic devices, and can also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors.
  • a handheld touch remote control uses a user interface in a touch screen to replace most of the physical built-in hardware in a general remote control device.
  • gesture used in the embodiments of the present application refers to a user's behavior through a change of hand shape or hand movement to express expected ideas, actions, goals, and/or results.
  • the term "hardware system" used in the various embodiments of this application may refer to a physical component having computing, control, storage, input, and output functions, including an integrated circuit (IC), a printed circuit board (PCB), and other mechanical, optical, electrical, and magnetic devices.
  • the hardware system is usually also referred to as a motherboard or a chip.
  • Fig. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control device according to an embodiment. As shown in FIG. 1, the user can operate the display device 200 by controlling the device 100.
  • the control device 100 may be a remote controller 100A, which can communicate with the display device 200 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-distance communication methods, so as to control the display device 200 in a wireless or other wired manner.
  • the user can control the display device 200 by inputting user instructions through keys, voice input, control panel input, etc. on the remote controller 100A.
  • the user can control the functions of the display device 200 by inputting corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, and power on/off keys on the remote controller 100A.
  • the control device 100 may also be a smart device, such as a mobile terminal 100B, a tablet computer, a computer, or a notebook computer, which may communicate with the display device 200 through a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), or other networks, and control the display device 200 through an application program corresponding to the display device 200.
  • the application can provide users with various controls through an intuitive user interface (UI, User Interface) on the screen associated with the smart device.
  • both the mobile terminal 100B and the display device 200 can be installed with software applications, so that the connection and communication between the two can be realized through a network communication protocol, and the purpose of one-to-one control operation and data communication can be realized.
  • the mobile terminal 100B can establish a control command protocol with the display device 200, synchronize the remote control keyboard to the mobile terminal 100B, and control the display device 200 through the user interface of the mobile terminal 100B; or the audio and video content displayed on the mobile terminal 100B can be transmitted to the display device 200 to realize a synchronous display function.
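  • By way of example, a control command protocol of this kind could look like the following Kotlin sketch; the port number and JSON message shape are purely hypothetical and not specified by the patent:

```kotlin
import java.io.PrintWriter
import java.net.Socket

// Hypothetical sketch: the mobile terminal 100B sends a key event to the display
// device 200 over the local network. Port 8899 and the JSON layout are invented
// for illustration only.
fun sendKeyCommand(displayIp: String, key: String) {
    Socket(displayIp, 8899).use { socket ->                  // 8899: assumed control port
        val writer = PrintWriter(socket.getOutputStream(), true)
        writer.println("""{"type":"key","value":"$key"}""")  // e.g. VOLUME_UP, CHANNEL_DOWN
    }
}
```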
  • the display device 200 can also communicate with the server 300 through multiple communication methods.
  • the display device 200 may be allowed to perform a wired communication connection or a wireless communication connection with the server 300 through a local area network, a wireless local area network, or other networks.
  • the server 300 may provide various contents and interactions to the display device 200.
  • the display device 200 transmits and receives information, interacts with the Electronic Program Guide (EPG), receives software program updates, or accesses a remotely stored digital media library.
  • the server 300 may be a group or multiple groups, and may be one or more types of servers.
  • the server 300 provides other network service content such as video-on-demand and advertising services.
  • the display device 200 includes a first display screen 201 and a second display screen 202, wherein the first display screen 201 and the second display screen 202 are independent of each other, and the first display screen 201 and the second display screen 202 adopt a dual hardware control system.
  • the first display screen 201 and the second display screen 202 can be used to display different display pictures.
  • the first display screen 201 can be used for screen display of traditional television programs
  • the second display screen 202 can be used for screen display of auxiliary information such as notification messages and voice assistants.
  • the content displayed on the first display screen 201 and the content displayed on the second display screen 202 may be independent of each other without affecting each other.
  • the second display screen 202 may display information such as time, weather, temperature, and reminder messages that are not related to the TV program.
  • the second display screen 202 may display information such as the avatar and the chat duration of the user currently accessing the video chat.
  • part or all of the content displayed on the second display screen 202 can be adjusted to be displayed on the first display screen 201.
  • for example, the time, weather, temperature, reminder messages, and other information displayed on the second display screen 202 can be adjusted to be displayed on the first display screen 201, and the second display screen 202 can then be used to display other information.
  • the first display screen 201 displays a multi-party interactive screen while displaying a traditional TV program screen, and the multi-party interactive screen does not block the traditional TV program screen.
  • the present application does not limit the display mode of the traditional TV program screen and the multi-party interactive screen.
  • this application can set the position and size of the traditional TV program screen and the multi-party interactive screen according to the priority of the traditional TV program screen and the multi-party interactive screen.
  • for example, the area of the traditional TV program picture is larger than that of the multi-party interactive picture, and the multi-party interactive picture can be located on one side of the traditional TV program picture or can be set to float in a corner of the traditional TV program picture.
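  • A Kotlin sketch of such priority-based placement follows; the 1/4 scale and corner margin are arbitrary illustrative choices, not values fixed by the patent:

```kotlin
// Sketch only: placing the lower-priority multi-party chat window as a floating
// picture-in-picture in a corner of the TV program picture.

data class Rect(val x: Int, val y: Int, val w: Int, val h: Int)

fun chatWindowRect(program: Rect, scale: Double = 0.25, margin: Int = 24): Rect {
    val w = (program.w * scale).toInt()
    val h = (program.h * scale).toInt()
    // Bottom-right corner of the program picture, inset by the margin.
    return Rect(program.x + program.w - w - margin,
                program.y + program.h - h - margin,
                w, h)
}
```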
  • the display device 200 may be a liquid crystal display, an OLED (Organic Light Emitting Diode) display, or a projection display device; on the other hand, the display device may be a smart TV or a display system composed of a display and a set-top box.
  • the display device 200 may make some changes in performance and configuration as required.
  • the display device 200 may additionally provide a smart network TV function that provides a computer support function. Examples include Internet TV, Smart TV, Internet Protocol TV (IPTV), and so on. In some embodiments, the display device may not have the function of broadcasting and receiving TV.
  • a camera may be connected or provided on the display device 200 for presenting a picture captured by the camera on the display interface of the display device or other display devices, so as to realize interactive chats between users.
  • the image captured by the camera can be displayed on the display device in full screen, half screen, or in any selectable area.
  • the camera is connected to the rear shell of the display device through a connecting plate, and is fixedly installed on the upper middle of the rear shell of the display device.
  • as an installable manner, the camera can be fixedly installed at any position on the rear shell of the display device, as long as the image capture area is not blocked by the rear shell; for example, the image capture area may have the same orientation as the display device.
  • the camera can be connected to the rear shell of the display device through a connecting plate or other conceivable connectors.
  • a lifting motor is installed on the connector.
  • the camera used in this application may have 16 million pixels to achieve the purpose of ultra-high-definition display; in actual use, a camera with a pixel count higher or lower than 16 million can also be used.
  • the content displayed in different application scenarios of the display device can be merged in a variety of different ways, so as to achieve functions that cannot be achieved by traditional display devices.
  • the user can video chat with at least one other user while watching a video program.
  • the presentation of the video program can be used as the background screen, and the video chat window is displayed on the background screen.
  • At least one video chat is performed across terminals.
  • the user can video chat with at least one other user while entering the education application for learning.
  • students can realize remote interaction with teachers while learning content in educational applications; vividly, this function can be called "learning and chatting".
  • a video chat is conducted with players entering the game.
  • when a player enters a game application to participate in a game, he or she can realize remote interaction with other players; vividly, this function can be called "watch and play".
  • the game scene and the video image are merged, and the portrait in the video image is cut out and displayed on the game image, which improves the user experience.
  • somatosensory games such as ball games, boxing games, running games, dancing games, etc.
  • human body postures and movements are acquired through a camera, limb detection and tracking, and key point data detection of human bones.
  • Animations are integrated in the game to realize games such as sports, dance and other scenes.
  • the user can interact by video and voice with at least one other user in the karaoke application; vividly, this function can be called "watch and sing".
  • multiple users can jointly complete the recording of a song.
  • the user can turn on the camera locally to obtain pictures and videos, which is vivid, and this function can be called "look in the mirror".
  • Fig. 2 exemplarily shows a configuration block diagram of the control device 100 according to an exemplary embodiment.
  • the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
  • the control device 100 is configured to control the display device 200, and can receive user input operation instructions, and convert the operation instructions into instructions that can be recognized and responded to by the display device 200, acting as an intermediary for the interaction between the user and the display device 200 effect.
  • the user operates the channel plus and minus key on the control device 100, and the display device 200 responds to the channel plus and minus operation.
  • control device 100 may be a smart device.
  • control device 100 can install various applications for controlling the display device 200 according to user requirements.
  • the mobile terminal 100B or other smart electronic devices can perform similar functions to the control device 100 after installing an application for controlling the display device 200.
  • by installing applications, the user can use various function keys or virtual buttons of the graphical user interface provided on the mobile terminal 100B or other smart electronic devices to realize the functions of the physical keys of the control device 100.
  • the controller 110 includes a processor 112, a RAM 113 and a ROM 114, a communication interface, and a communication bus.
  • the controller 110 is used to control the operation and operation of the control device 100, as well as communication and cooperation between internal components, and external and internal data processing functions.
  • the communicator 130 realizes the communication of control signals and data signals with the display device 200 under the control of the controller 110. For example, the received user input signal is sent to the display device 200.
  • the communicator 130 may include at least one of communication modules such as a WIFI module 131, a Bluetooth module 132, and an NFC module 133.
  • the user input/output interface 140 where the input interface includes at least one of input interfaces such as a microphone 141, a touch panel 142, a sensor 143, a button 144, and a camera 145.
  • the user can implement the user instruction input function through voice, touch, gesture, pressing and other actions.
  • the input interface converts the received analog signal into a digital signal, and the digital signal into a corresponding instruction signal, and sends it to the display device 200.
  • the output interface includes an interface for sending the received user instruction to the display device 200.
  • it may be an infrared interface or a radio frequency interface.
  • the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol, and sent to the display device 200 via the infrared sending module.
  • in the case of a radio frequency signal interface, a user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 by the radio frequency sending terminal.
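  • The two branches above can be sketched as follows; the byte layouts are invented for illustration and do not correspond to any specific infrared or radio frequency protocol:

```kotlin
// Illustrative only: the same user key is routed either to an infrared encoder or to a
// radio-frequency sender. A real remote would follow a concrete IR protocol and RF
// modulation scheme; the frames below are made up.

sealed interface ControlLink
object Infrared : ControlLink
object RadioFrequency : ControlLink

fun encodeUserKey(keyCode: Int, link: ControlLink): ByteArray = when (link) {
    Infrared ->
        // Convert the instruction into an "IR control frame": header byte + key + checksum.
        byteArrayOf(0xA5.toByte(), keyCode.toByte(), (0xA5 xor keyCode).toByte())
    RadioFrequency ->
        // Convert the instruction into a digital payload to be modulated by the RF sender.
        byteArrayOf(0x5A, keyCode.toByte())
}
```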
  • control device 100 includes at least one of a communicator 130 and an output interface.
  • the control device 100 is equipped with a communicator 130, such as WIFI, Bluetooth, NFC and other modules, which can encode the user input command through the WIFI protocol, or the Bluetooth protocol, or the NFC protocol, and send it to the display device 200.
  • the memory 190 is used to store various operating programs, data, and applications for driving and controlling the control device 100 under the control of the controller 110.
  • the memory 190 can store various control signal instructions input by the user.
  • the power supply 180 is used to provide operating power support for the electrical components of the control device 100 under the control of the controller 110.
  • the power supply 180 can be powered by a battery and related control circuits.
  • FIG. 3a exemplarily shows a schematic diagram of the hardware structure of the hardware system in the display device 200 according to the exemplary embodiment.
  • the display device 200 in FIG. 3a uses a liquid crystal display as an example for illustration.
  • the display device 200 may include: a first panel 11, a first backlight assembly 12, a first rear case 13, a first controller 14, a second controller 15, a first display driving circuit 16, and a second panel 21.
  • the display device 200 may further include a base or a suspension bracket.
  • the display device 200 in FIG. 3a includes a base 41 for illustration, and the base 41 is used to support the display device 200. It is worth noting that only one form of base design is shown in the figure, and those skilled in the art can design different forms of bases according to product requirements.
  • the first panel 11 is used to present a picture of the first display screen 201 to the user.
  • the first panel 11 may be a liquid crystal panel.
  • a liquid crystal panel may include, from top to bottom, a horizontal polarizer, a color filter, a liquid crystal layer, a thin film transistor TFT, a vertical polarizer, a light guide plate, and a printed circuit board (PCB).
  • drive circuits such as a gate drive circuit and a source drive circuit are arranged on the printed circuit board (PCB).
  • the gate driving circuit is connected to the gate of the thin film transistor TFT through a scan line
  • the source driving circuit is connected to the drain of the thin film transistor TFT through a data line.
  • the first backlight assembly 12 is located under the first panel 11, and is usually some optical components for supplying sufficient brightness and uniformly distributed light sources, so that the first panel 11 can display images normally.
  • the first backlight assembly 12 also includes a first back plate (not shown in the figure).
  • the first rear case 13 covers the first panel 11 to jointly hide the parts of the display device 200 such as the first backlight assembly 12, the first controller 14, the second controller 15, the first display driving circuit 16, and the power supply assembly 30, giving the device a neat appearance.
  • the first controller 14, the second controller 15, the first display driving circuit 16 and the power supply assembly 30 are arranged on the first backplane, and some convex hull structures are usually stamped on the first backplane.
  • the first controller 14, the second controller 15, the first display driving circuit 16 and the power supply assembly 30 are fixed on the convex hull by screws or hooks.
  • the first controller 14, the second controller 15, the first display driving circuit 16, and the power supply assembly 30 may be arranged on one board together or on different boards; for example, the first controller 14 may be set on a main board, the second controller 15 on an interactive board, the first display driving circuit 16 on a first display driving board, and the power supply assembly 30 on a power supply board; they may also be combined on different boards or set on one board together with the first backlight assembly 12, according to actual needs, which is not limited in this application.
  • the first controller 14, the second controller 15, the first display driving circuit 16 and the power supply assembly 30 are all provided on one board in FIG. 3a for illustration.
  • the main functions of the first display driving circuit 16 are: to perform thousand-level backlight partition control through the backlight driving signals transmitted by the first controller 14, such as PWM signals and local dimming signals, where this part of the control changes according to the image content; and, after a handshake is established with the first controller 14, to receive the VbyOne display signal sent by the first controller 14 and convert the VbyOne display signal into an LVDS signal, so as to realize image display on the first display screen 201.
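  • The idea behind backlight partition (local dimming) control can be illustrated with a short Kotlin sketch that derives each zone's level from the brightest pixel above it; the zone count and peak-luminance statistic are assumptions, and the real circuit works on the PWM/local dimming signals described above:

```kotlin
// Rough sketch of local-dimming backlight partition control: each backlight zone is
// driven according to the brightness of the image content above it. Zone count, the
// max-luma statistic, and 8-bit levels are illustrative assumptions.

fun zoneBacklightLevels(
    luma: Array<IntArray>,   // per-pixel luminance, 0..255
    zonesX: Int,
    zonesY: Int
): Array<IntArray> {
    val h = luma.size
    val w = luma[0].size
    val levels = Array(zonesY) { IntArray(zonesX) }
    for (zy in 0 until zonesY) {
        for (zx in 0 until zonesX) {
            var peak = 0
            for (y in zy * h / zonesY until (zy + 1) * h / zonesY) {
                for (x in zx * w / zonesX until (zx + 1) * w / zonesX) {
                    if (luma[y][x] > peak) peak = luma[y][x]
                }
            }
            levels[zy][zx] = peak   // drive the zone's PWM duty from its brightest pixel
        }
    }
    return levels
}
```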
  • the second panel 21 is used to present the screen of the second display screen 202 to the user.
  • the second panel 21 may be a liquid crystal panel, and the specific structure included can refer to the description of the foregoing content, which will not be repeated here.
  • the second backlight assembly 22 is located under the second panel 21, and usually comprises some optical components for supplying a sufficiently bright and uniformly distributed light source, so that the second panel 21 can display images normally.
  • the second backlight assembly 22 also includes a second back plate (not shown in the figure).
  • the second rear case 23 covers the second panel 21 to jointly hide the components of the display device 200 such as the second backlight assembly 22 and the second display driving circuit 24, giving the device a neat appearance.
  • the second display driving circuit 24 is disposed on the second backplane, and some convex hull structures are usually stamped on the second backplane.
  • the second display driving circuit 24 is fixed on the convex hull by screws or hooks.
  • the second display driving circuit 24 can be separately arranged on a board, such as a second display driving board, or it can be arranged on the same board together with the second backlight assembly 22, according to actual needs; this application does not limit this.
  • the second display driving circuit 24 is separately provided on a board in FIG. 3a for illustration.
  • FIG. 3a also includes a key pad.
  • the key pad can be provided on the first backplane or the second backplane, which is not limited in this application.
  • a plurality of buttons and button circuits are provided on the key pad, so that the first controller 14 or the second controller 15 can receive a button signal from the key pad, and the first controller 14 or the second controller 15 can also send control signals to the key pad.
  • the display device 200 also includes a sound reproduction device (not shown in the figure), such as an audio component, for example a power amplifier (AMP) and a speaker supporting an I2S interface, for realizing sound reproduction.
  • audio components can achieve sound output of at least two channels; when the panoramic sound surround effect is to be achieved, multiple audio components need to be set to output multiple channels of sound, which will not be described in detail here.
  • the display device 200 can also be an OLED display.
  • correspondingly, the components included in the display device 200 change accordingly; since an OLED display is self-luminous, it does not require a backlight assembly (the first backlight assembly 12 and the second backlight assembly 22 in FIG. 3a), which will not be detailed here.
  • in FIG. 3a, a display device with dual display screens is taken as an exemplary illustration.
  • FIG. 3b exemplarily shows a schematic diagram of the hardware structure of a hardware system in the display device according to an exemplary embodiment.
  • the display device includes: a panel 1, a backlight assembly 2, a rear housing 3, a controller 4, a power supply assembly 5, and a base 6.
  • the panel 1 is used to present images to the user;
  • the backlight assembly 2 is located under the panel 1, usually some optical components, used to supply sufficient brightness and uniformly distributed light sources, so that the panel 1 can display image content normally, and the backlight assembly 2 It also includes a backplane.
  • the controller 4 and the power supply assembly 5 are arranged on the backplane, and some convex structures are usually stamped on the backplane.
  • the controller 4 and the power supply assembly 5 are fixed on the convex hulls by screws or hooks; the rear shell 3 covers the panel 1 to jointly hide the display device components such as the backlight assembly 2, the controller 4, and the power supply assembly 5, giving the device a neat appearance; the base 6 is used to support the display device.
  • the controller 4 and the power supply assembly 5 can be separately arranged on a board, arranged together on the same board, or arranged on the same board together with the backlight assembly, according to actual needs; this application does not limit this.
  • the controller 4 and the power supply assembly 5 are jointly arranged on a board.
  • the display device 200 also includes a sound reproduction device (not shown in the figure), such as an audio component, for example a power amplifier (AMP) and a speaker supporting an I2S interface, for realizing sound reproduction.
  • audio components can achieve sound output of at least two channels; when the panoramic sound surround effect is to be achieved, multiple audio components need to be set to output multiple channels of sound, which will not be described in detail here.
  • the display device 200 may also adopt an OLED display screen.
  • in this case, the components included in the display device 200 will change accordingly, and no further description will be given here.
  • FIG. 4 shows a schematic diagram of the connection relationship between the power board and the load.
  • the power supply assembly 30 includes an input terminal IN and output terminals OUT (the figure shows a first output terminal OUT1, a second output terminal OUT2, a third output terminal OUT3, a fourth output terminal OUT4, and a fifth output terminal OUT5), wherein the input terminal IN is connected to an alternating current power supply AC (such as the mains), and the output terminals OUT are connected to loads: the first output terminal OUT1 is connected to the sound reproduction device, the second output terminal OUT2 is connected to the first panel 11/the second panel 21, the third output terminal OUT3 is connected to the first backlight assembly 12/the second backlight assembly 22, the fourth output terminal OUT4 is connected to the first controller 14/the second controller 15, and the fifth output terminal OUT5 is connected to the first display driving circuit 16/the second display driving circuit 24.
  • the power supply assembly 30 needs to convert AC mains power into DC power required by the load, and the DC power usually has different specifications, for example, the audio component requires 18V, the first controller 14 requires 12V/18V, and so on.
  • for ease of description, one hardware system in the dual hardware system architecture is hereinafter referred to as the first hardware system or the first controller, and the other hardware system is referred to as the second hardware system or the second controller.
  • the first controller includes various processors of the first controller, various interfaces, and various modules connected to the first controller through the various interfaces; the second controller includes various processors of the second controller, various interfaces, and various modules connected to the second controller through the various interfaces.
  • a relatively independent operating system may be installed in each of the first controller and the second controller, and the operating system of the first controller and the operating system of the second controller may communicate with each other through a communication protocol; for example, the framework layer of the operating system of the first controller and the framework layer of the operating system of the second controller can communicate for command and data transmission, so that there are two independent but interrelated subsystems in the display device 200.
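  • One hypothetical way to picture this framework-layer communication is a line-oriented local channel, sketched below in Kotlin; the socket transport, port, and message format are assumptions, since the patent only states that a communication protocol is used:

```kotlin
import java.io.BufferedReader
import java.io.InputStreamReader
import java.io.PrintWriter
import java.net.ServerSocket
import java.net.Socket

// Hypothetical model of the channel between the two systems: a line-oriented socket.
// The port and the "name=value" message shape are invented; localhost merely stands
// in for whatever physical link joins the two chips.

const val INTER_CHIP_PORT = 9000

fun secondControllerListen() {
    ServerSocket(INTER_CHIP_PORT).use { server ->
        server.accept().use { client ->
            val line = BufferedReader(InputStreamReader(client.getInputStream())).readLine()
            println("second controller received: $line")   // e.g. "showIncomingCall=13800000000"
        }
    }
}

fun firstControllerSend(command: String) {
    Socket("127.0.0.1", INTER_CHIP_PORT).use { socket ->
        PrintWriter(socket.getOutputStream(), true).println(command)
    }
}
```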
  • FIG. 5 is only an exemplary description of the dual hardware system architecture of the present application, and does not represent a limitation to the present application. In practical applications, both hardware systems can contain more or less hardware or interfaces as required.
  • FIG. 5 exemplarily shows a block diagram of the hardware architecture of the display device 200 shown in FIG. 3.
  • the hardware system of the display device 200 includes a first controller 210 and a second controller 310, and modules connected to the first controller 210 or the second controller 310 through various interfaces.
  • the second controller 310 may be used to receive instructions sent by the first controller 210 and control the second display screen 380 to display corresponding images.
  • the modules connected to the first controller 210 may include a tuner and demodulator 220, a communicator 230, an external device interface 250, a memory 290, a user input interface 260-3, a video processor 260-1, an audio processor 260-2, the first display screen 280 (that is, the first display screen 201 in FIG. 1), the audio output interface 270, and the power supply module 240.
  • the first controller 210 may also be connected with more or fewer modules.
  • the tuner and demodulator 220 is used to perform modulation and demodulation processing such as amplification, mixing, and resonance on broadcast television signals received in a wired or wireless manner, so as to demodulate, from among the multiple wireless or cable broadcast television signals, the audio and video signals carried in the frequency of the TV channel selected by the user, as well as additional information (such as EPG data signals).
  • the signal path of the tuner and demodulator 220 can be of many kinds, such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, or Internet broadcasting; according to different modulation types, the modulation method may be digital modulation or analog modulation; and according to the different types of received television signals, the tuner and demodulator 220 may demodulate analog signals and/or digital signals.
  • the tuner and demodulator 220 is also used to respond to the TV channel frequency selected by the user and the TV signal carried by that frequency, according to the user's selection and under the control of the first controller 210.
  • the tuner demodulator 220 may also be in an external device, such as an external set-top box.
  • the set-top box outputs TV audio and video signals through modulation and demodulation, and inputs them to the display device 200 through the external device interface 250.
  • the communicator 230 is a component for communicating with external devices or external servers according to various communication protocol types.
  • the communicator 230 may include a WIFI module 231, a Bluetooth communication protocol module 232, a wired Ethernet communication protocol module 233, and an infrared communication protocol module and other network communication protocol modules or near field communication protocol modules (not shown in the figure).
  • the display device 200 may establish a control signal and a data signal connection with an external control device or content providing device through the communicator 230.
  • the communicator may receive a control signal of the remote controller 100 according to the control of the first controller 210.
  • the external device interface 250 is a component that provides data transmission between the first controller 210 and other external devices.
  • the external device interface 250 can be connected to external devices such as set-top boxes, game devices, notebook computers, etc. in a wired/wireless manner, and can receive external devices such as video signals (such as moving images), audio signals (such as music), and additional information (such as EPG) and other data.
  • the external device interface 250 may include any one or more of the following: a high-definition multimedia interface (HDMI) terminal 251 (also called HDMI 251), a composite video blanking synchronization (CVBS) terminal 252 (also called AV 252), an analog or digital component terminal 253 (also called component 253), a universal serial bus (USB) terminal 254, and red, green and blue (RGB) terminals (not shown in the figure); this application does not limit the number and types of external device interfaces.
  • the first controller 210 controls the work of the display device 200 and responds to user operations by running various software control programs (such as an operating system and/or various application programs) stored on the memory 290.
  • the first controller 210 includes a random access memory RAM 213, a read-only memory ROM 214, a graphics processor 216, a CPU processor 212, a communication interface 218, and a communication bus.
  • the RAM 213 and the ROM 214, the graphics processor 216, the CPU processor 212, and the communication interface 218 are connected by a bus.
  • the graphics processor 216 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions; it includes an arithmetic unit, which performs operations by receiving the various interactive commands input by the user and displays various objects according to display attributes, and a renderer, which renders the various objects obtained by the arithmetic unit and displays the rendering result on the first display screen 280.
  • the CPU processor 212 is configured to execute the operating system and application program instructions stored in the memory 290, and, according to the various interactive instructions received from the outside, to execute various application programs, data, and content, so as to finally display and play various audio and video content.
  • the CPU processor 212 may include multiple processors.
  • the multiple processors may include one main processor and multiple or one sub-processors.
  • the main processor is used to perform some operations of the display device 200 in the pre-power-on mode, and/or to display images in the normal mode.
  • the communication interface 218 may include a first interface 218-1 to an nth interface 218-n. These interfaces may be network interfaces connected to external devices via a network.
  • the first controller 210 may control operations of the display device 200 related to the first display screen 280. For example, in response to receiving a user command for selecting a UI object to be displayed on the first display screen 280, the first controller 210 may perform an operation related to the object selected by the user command.
  • the first controller 210 may control the operation of the display device 200 related to the second display screen 380. For example, in response to receiving a user command for selecting a UI object to be displayed on the second display screen 380, the first controller 210 may perform an operation related to the object selected by the user command.
  • the object may be any one of the selectable objects, such as a hyperlink or an icon.
  • operations related to the selected object are, for example, operations of displaying a page, document, or image linked to a hyperlink, or operations of running the program corresponding to the icon.
  • the user command for selecting the UI object may be a command input through various input devices (for example, a mouse, a keyboard, a touch pad, etc.) connected to the display device 200 or a voice command corresponding to the voice spoken by the user.
  • the memory 290 stores various software modules used to drive and control the display device 200.
  • various software modules stored in the memory 290 include: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules (not shown in the figure).
  • the basic module is a bottom-level software module used for signal communication between various hardware in the display device 200 and sending processing and control signals to the upper-level module.
  • the detection module is a management module used to collect various information from various sensors or user input interfaces, and perform digital-to-analog conversion, analysis and management.
  • the voice recognition module includes a voice analysis module and a voice command database module.
  • the display control module is a module for controlling the first display screen 280 to display image content, and can be used to play information such as multimedia image content and UI interfaces.
  • the communication module is a module used for control and data communication with external devices.
  • the browser module is a module used to perform data communication between browsing servers.
  • the service module is a module used to provide various services and various applications.
  • the memory 290 is also used to store and receive external data and user data, images of various items in various user interfaces, and visual effect diagrams of focus objects, and the like.
  • the user input interface 260-3 is used to send a user's input signal to the first controller 210, or to transmit a signal output from the first controller 210 to the user.
  • for example, the control device (such as a mobile terminal or a remote controller) can send input signals input by the user, such as a power switch signal, a channel selection signal, and a volume adjustment signal, to the user input interface 260-3, which then forwards them to the first controller 210; or, the control device may receive output signals such as audio, video, or data that are processed by the first controller 210 and output through the user input interface 260-3, and display the received output signals or output them in the form of audio or vibration.
  • the user may input a user command on a graphical user interface (GUI) displayed on the first display screen 280, and the user input interface 260-3 receives the user input command through the graphical user interface (GUI).
  • the user may input a user command by inputting a specific sound or gesture, and the user input interface 260-3 recognizes the sound or gesture through a sensor to receive the user input command.
  • the video processor 260-1 is used to receive video signals and, according to the standard codec protocol of the input signal, perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, so as to obtain a video signal that can be directly displayed or played on the first display screen 280.
  • the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, etc. (not shown in the figure).
  • the demultiplexing module is used to demultiplex the input audio and video data stream. For example, if MPEG-2 is input, the demultiplexing module will demultiplex into a video signal and an audio signal.
  • the video decoding module is used to process the demultiplexed video signal, including decoding and scaling.
  • the image synthesis module, such as an image synthesizer, is used for superimposing and mixing the GUI signal generated by the graphics generator, according to user input or generated by itself, with the scaled video image, so as to generate an image signal for display.
  • the frame rate conversion module is used to convert the frame rate of the input video, for example converting an input frame rate of 24 Hz, 25 Hz, 30 Hz, or 60 Hz into a frame rate of 60 Hz, 120 Hz, or 240 Hz, where the input frame rate may be related to the source video stream and the output frame rate may be related to the refresh rate of the display device.
  • the display formatting module is used to change the signal output by the frame rate conversion module to a signal conforming to the display format of the display device, for example, format the signal output by the frame rate conversion module to output RGB data signals.
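  • A simplified Kotlin sketch of the frame rate conversion decision follows; real conversion modules may use pull-down or motion-compensated interpolation rather than plain frame repetition, so this is only an illustration:

```kotlin
// Simplified view of frame rate conversion: match the input frame rate to the panel
// refresh rate by repeating frames when the ratio is integral (e.g. 30 Hz -> 60 Hz or
// 120 Hz), and falling back to interpolation otherwise (e.g. 24 Hz -> 60 Hz).

sealed class RateConversion {
    data class RepeatEachFrame(val times: Int) : RateConversion()
    object Interpolate : RateConversion()
}

fun planConversion(inputHz: Int, panelHz: Int): RateConversion =
    if (panelHz % inputHz == 0) RateConversion.RepeatEachFrame(panelHz / inputHz)
    else RateConversion.Interpolate

// planConversion(30, 120) -> RepeatEachFrame(times = 4)
// planConversion(24, 60)  -> Interpolate
```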
  • the first display screen 280 is used to receive the image signal input from the video processor 260-1, display video content and images, and the menu control interface.
  • the first display screen 280 includes a display screen component for presenting images and a driving component for driving image display.
  • the displayed video content can be from the video in the broadcast signal received by the tuner and demodulator 220, or from the video content input by the communicator or the interface of an external device.
  • the first display screen 280 simultaneously displays a user manipulation interface UI generated in the display device 200 and used to control the display device 200.
  • the first display screen 280 also includes a driving component for driving the display.
  • If the first display screen 280 is a projection display screen, it may also include a projection device and a projection screen.
  • the audio processor 260-2 is used to receive audio signals and to perform decompression and decoding according to the standard codec protocol of the input signal, as well as audio data processing such as noise reduction, digital-to-analog conversion, and amplification, so as to obtain an audio signal that can be played by the speaker 272.
  • the audio output interface 270 is used to receive the audio signal output by the audio processor 260-2 under the control of the first controller 210.
  • the audio output interface may include the speaker 272, or an external audio output terminal 274 that outputs to a sound-generating device of an external device, such as an external audio terminal or a headphone output terminal.
  • the video processor 260-1 may include one or more chips.
  • the audio processor 260-2 may also include one or more chips.
  • the video processor 260-1 and the audio processor 260-2 may be separate chips, or may be integrated with the first controller 210 in one or more chips.
  • the power supply module 240 is configured to provide power supply support for the display device 200 with power input from an external power supply under the control of the first controller 210.
  • the power supply module 240 may include a built-in power supply circuit installed inside the display device 200, or may be a power supply installed outside the display device 200, such as a power interface for providing an external power supply in the display device 200.
  • the modules connected to the second controller 310 may include a communicator 330, a detector 340, a memory 390, and a second display screen 380 (that is, the second display screen 202 in FIG. 1). In some embodiments, it may also include a user input interface, a video processor, an audio processor, a display screen, and an audio output interface (not shown in the figure). In some embodiments, there may also be a power supply module (not shown in the figure) that independently supplies power to the second controller 310.
  • the communicator 330 is a component for communicating with external devices or external servers according to various communication protocol types.
  • the communicator 330 may include a WIFI module 331, a Bluetooth communication protocol module 332, a wired Ethernet communication protocol module 333, and an infrared communication protocol module and other network communication protocol modules or near field communication protocol modules (not shown in the figure).
  • the communicator 330 and the communicator 230 of the first controller 210 also interact with each other.
  • the WiFi module 231 in the hardware system of the first controller 210 is used to connect to an external network and to establish network communication with an external server or the like.
  • the WiFi module 331 in the hardware system of the second controller 310 is used to connect to the WiFi module 231 of the first controller 210 rather than directly to the external network; the second controller 310 reaches the external network through the first controller 210. Therefore, from the user's point of view, a display device as in the above embodiment exposes a single WiFi account to the outside.
  • the detector 340 is a component used by the second controller 310 to collect signals from the external environment or interact with the outside.
  • the detector 340 may include a light receiver 342, a sensor for collecting the intensity of ambient light, so that display parameters can be adaptively changed according to the collected ambient light; it may also include an image collector 341, such as a camera, for collecting the external environment scene and the user's attributes or interaction gestures, so that display parameters can be adaptively changed and the user's gestures can be recognized to realize interaction with the user.
  • the external device interface 350 provides a component for data transmission between the second controller 310 and the first controller 210 or other external devices.
  • the external device interface can be connected to external devices such as set-top boxes, game devices, notebook computers, etc., in a wired/wireless manner.
  • the video processor 360 is used to process related video signals.
  • the second controller 310 controls the operation of the display device 200 and responds to user operations by running various software control programs (such as installed third-party applications) stored in the memory 390 and by interacting with the first controller 210.
  • the second controller 310 includes a read-only memory ROM 313, a random access memory RAM 314, a graphics processor 316, a CPU processor 312, a communication interface 318, and a communication bus.
  • the ROM 313 and the RAM 314, the graphics processor 316, the CPU processor 312, and the communication interface 318 are connected by a bus.
  • the CPU processor 312 runs the system startup instruction in the ROM 313 and copies the operating system stored in the memory 390 to the RAM 314 to start the operating system. After the operating system is started, the CPU processor 312 copies the various application programs in the memory 390 to the RAM 314 and then starts the various application programs.
  • the CPU processor 312 is used to execute the operating system and application instructions stored in the memory 390, to communicate with the first controller 210 to exchange signals, data, and instructions, and to receive various interactive instructions from external inputs, so as to execute various applications, data, and content and finally display and play various audio and video content.
  • the second controller 310 may control operations of the display device 200 related to the second display screen 380. For example, in response to receiving a user command for selecting a UI object to be displayed on the second display screen 380, the second controller 310 may perform an operation related to the object selected by the user command.
  • the second controller 310 may control the operation of the display device 200 related to the first display screen 280. For example, in response to receiving a user command for selecting a UI object to be displayed on the first display screen 280, the first controller 210 may perform an operation related to the object selected by the user command.
  • the graphics processor 316 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations on the various interactive commands input by the user and derives the display attributes of the various objects, and a renderer, which renders the objects produced by the arithmetic unit and displays the rendering result on the second display screen 380.
  • Both the graphics processor 316 of the second controller 310 and the graphics processor 216 of the first controller 210 can generate various graphics objects. The difference is that, if application 1 is installed in the second controller 310 and application 2 is installed in the first controller 210, then when the user is on the interface of application 1 and inputs instructions in application 1, the graphics processor 316 of the second controller 310 generates the graphic objects; when the user is on the interface of application 2 and inputs instructions in application 2, the graphics processor 216 of the first controller 210 generates the graphic objects.
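  • The paragraph above implies a simple dispatch rule: the graphics objects are generated by the graphics processor of whichever controller hosts the application currently in the foreground. A minimal, hedged Java sketch of that rule (class, method, and application names are assumptions, not the patent's API):

```java
// Hypothetical sketch: choose which controller's graphics processor
// generates the graphic object, based on where the foreground app is installed.
public class GraphicsDispatch {
    enum Controller { FIRST, SECOND }

    // Maps an application name to the controller it is installed on.
    static Controller hostOf(String app) {
        // Assumption for illustration: "application1" lives on the second
        // controller, "application2" on the first, as in the example above.
        return app.equals("application1") ? Controller.SECOND : Controller.FIRST;
    }

    static String render(String foregroundApp, String userInstruction) {
        Controller c = hostOf(foregroundApp);
        return "graphics processor of " + c + " controller renders: " + userInstruction;
    }

    public static void main(String[] args) {
        System.out.println(render("application1", "open menu"));
        System.out.println(render("application2", "show volume bar"));
    }
}
```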
  • Fig. 6 exemplarily shows a schematic diagram of a functional configuration of a display device according to an exemplary embodiment.
  • the memory 390 of the second controller 310 and the memory 290 of the first controller 210 are respectively used to store the operating system, application programs, content, and user data, and, under the control of the second controller 310 and the first controller 210, to execute the system operations of driving the first display screen 280 and the second display screen 380 and responding to various user operations.
  • the memory 390 and the memory 290 may include volatile and/or non-volatile memory.
  • the memory 290 is specifically used to store the operating program that drives the first controller 210 in the display device 200, as well as the various application programs built into the display device 200, the various application programs downloaded by the user from external devices, the various graphical user interfaces related to the applications, the various objects related to the graphical user interfaces, user data information, and various internal data supporting the applications.
  • the memory 290 is used to store system software such as an operating system (OS) kernel, middleware, and applications, as well as to store input video data and audio data, and other user data.
  • the memory 290 is specifically used to store the driver programs and related data of components such as the video processor 260-1, the audio processor 260-2, the first display screen 280, the communicator 230, the tuner and demodulator 220, and the input/output interfaces.
  • the memory 290 may store software and/or programs.
  • the software programs used to represent an operating system (OS) include, for example, kernels, middleware, application programming interfaces (APIs), and/or application programs.
  • the kernel may control or manage system resources or the functions implemented by other programs (such as the middleware, APIs, or application programs), and may also provide interfaces that allow the middleware, APIs, or applications to access the controller in order to control or manage system resources.
  • the memory 290 includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, a first audio control module 2906, an external command recognition module 2907, a communication control module 2908, The light receiving module 2909, the power control module 2910, the operating system 2911, and other application programs 2912, the browser module 2913, and so on.
  • the first controller 210 executes various software programs in the memory 290 to provide functions such as: the broadcast and television signal reception and demodulation function, the TV channel selection control function, the volume selection control function, the image control function, the display control function, the audio control function, the external command recognition function, the communication control function, the optical signal receiving function, the power control function, a software control platform supporting the various functions, and the browser function.
  • the memory 390 stores various software modules used to drive and control the display device 200.
  • various software modules stored in the memory 390 include: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules (not shown in the figure). Since the functions of the memory 390 and the memory 290 are relatively similar, please refer to the memory 290 for related parts, which will not be repeated here.
  • the memory 390 includes an image control module 3904, a second audio control module 3906, an external command recognition module 3907, a communication control module 3908, an optical receiving module 3909, an operating system 3911, other application programs 3912, a browser module 3913, and so on.
  • the second controller 310 executes various software programs in the memory 390 to provide functions such as: the image control function, the display control function, the audio control function, the external command recognition function, the communication control function, the light signal receiving function, the power control function, a software control platform supporting the various functions, and the browser function.
  • the external command recognition module 2907 of the first controller 210 and the external command recognition module 3907 of the second controller 310 can recognize different commands.
  • the external command recognition module 3907 of the second controller 310 may include a graphic recognition module 3907-1, and the graphic recognition module 3907-1 stores a graphic database.
  • When the camera receives a graphic instruction from the outside world, the instruction is matched against the graphic database so as to control the display device.
  • Since the voice receiving device and the remote controller are connected to the first controller 210, the external command recognition module 2907 of the first controller 210 may include a voice recognition module 2907-2, and the voice recognition module 2907-2 stores a voice database.
  • When the voice receiving device or the like receives a voice instruction from the outside world, the instruction is matched against the voice database so as to control the display device.
  • Similarly, the control device 100, such as a remote controller, is connected to the first controller 210, and a key instruction recognition module 2907-3 exchanges instructions with the control device 100.
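  • The command recognition modules above are described as matching an external voice or graphic instruction against a stored database. A minimal, hedged Java sketch of such a lookup (the database contents and names are illustrative assumptions):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical sketch: match a recognized voice/graphic phrase against a
// command database and return the display-device control action, if any.
public class ExternalCommandRecognizer {
    private final Map<String, String> commandDb = new HashMap<>();

    ExternalCommandRecognizer() {
        // Illustrative entries only; the patent does not specify the database contents.
        commandDb.put("volume up", "AUDIO_VOLUME_UP");
        commandDb.put("next channel", "TUNER_CHANNEL_NEXT");
        commandDb.put("wave hand", "TOGGLE_PLAY_PAUSE");
    }

    Optional<String> recognize(String phrase) {
        return Optional.ofNullable(commandDb.get(phrase.toLowerCase().trim()));
    }

    public static void main(String[] args) {
        ExternalCommandRecognizer r = new ExternalCommandRecognizer();
        System.out.println(r.recognize("Volume up").orElse("unrecognized"));
    }
}
```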
  • FIG. 7 exemplarily shows a configuration block diagram of a software system in the display device 200 according to an exemplary embodiment.
  • the operating system 2911 includes operating software for processing various basic system services and for implementing hardware-related tasks, acting as a medium between application programs and hardware components for completing data processing.
  • part of the operating system kernel may include a series of software to manage the hardware resources of the display device and provide services for other programs or software codes.
  • part of the operating system kernel may include one or more device drivers, and the device drivers may be a set of software codes in the operating system to help operate or control devices or hardware associated with the display device.
  • the drive may contain code to manipulate video, audio, and/or other multimedia components. Examples include displays, cameras, Flash, WiFi, and audio drivers.
  • the accessibility module 2911-1 is used to modify or access the application program to realize the accessibility of the application program and the operability of its display content.
  • the communication module 2911-2 is used to connect to other peripherals via related communication interfaces and communication networks.
  • the user interface module 2911-3 is used to provide objects that display user interfaces for access by various applications, and can realize user operability.
  • the control application 2911-4 is used to control process management, including runtime applications.
  • the event transmission system 2914 can be implemented in the operating system 2911 or in the application program 2912. In some embodiments, it is implemented partly in the operating system 2911 and partly in the application program 2912, and is used for monitoring various user input events and, according to the recognition results of the various events or sub-events, responding to them and implementing one or more sets of pre-defined operation procedures.
  • the event monitoring module 2914-1 is used to monitor input events or sub-events of the user input interface.
  • the event recognition module 2914-2 is used to apply the definitions of various events to the various user input interfaces, to recognize the various events or sub-events, and to transmit them to the processing that executes their corresponding one or more sets of processing procedures.
  • the event or sub-event refers to the input detected by one or more sensors in the display device 200 and the input of an external control device (such as the control device 100, etc.).
  • For example, the sub-events of voice input, the gesture input sub-events of gesture recognition, and the sub-events of remote control key command input from control devices take multiple forms, including but not limited to one of, or a combination of, pressing the up/down/left/right keys, pressing the OK key, and other key presses.
  • Sub-events also include operations of non-physical keys, such as moving, pressing, and releasing.
  • the interface layout management module 2913, which directly or indirectly receives the various user input events or sub-events monitored by the event transmission system 2914, is used to update the layout of the user interface, including but not limited to the position, size, level, and other layout-related attributes of each control or sub-control and of the containers in the interface, together with the various execution operations related to the interface layout.
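  • To make the event flow above concrete, the following is a minimal, hedged Java sketch of an event transmission system that monitors input events and notifies a registered layout manager; the listener interface and names are assumptions rather than the patent's actual modules 2914-1/2914-2.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch: an event transmission system that monitors user input
// events and notifies the interface layout manager of recognized events.
public class EventTransmissionSystem {
    private final List<Consumer<String>> listeners = new ArrayList<>();

    // The interface layout manager (or any module) registers for events.
    void addListener(Consumer<String> listener) {
        listeners.add(listener);
    }

    // Called by the user input interface when an event or sub-event occurs.
    void dispatch(String event) {
        // A real implementation would recognize sub-events against definitions
        // before dispatching; this sketch forwards the raw event name.
        for (Consumer<String> l : listeners) l.accept(event);
    }

    public static void main(String[] args) {
        EventTransmissionSystem events = new EventTransmissionSystem();
        events.addListener(e -> System.out.println("layout manager updates UI for: " + e));
        events.dispatch("KEY_OK_PRESSED");
        events.dispatch("GESTURE_SWIPE_LEFT");
    }
}
```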
  • the application layer of the display device includes various application programs that can be executed on the display device 200.
  • the application layer 2912 of the first controller 210 may include, but is not limited to, one or more applications, such as: video-on-demand application, application center, game application, and so on.
  • the application layer 3912 of the second controller 310 may include, but is not limited to, one or more applications, such as a live TV application, a media center application, and so on. It should be noted that which application programs are included in the second controller 310 and which in the first controller 210 is determined by the operating system and other design choices; this application does not specifically define or divide the application programs included in the second controller 310 and the first controller 210.
  • Live TV applications can provide live TV through different sources.
  • a live TV application can provide a TV signal using input from cable TV, wireless broadcasting, satellite services, or other types of live TV services.
  • the live TV application can display the video of the live TV signal on the display device 200.
  • Video-on-demand applications can provide videos from different storage sources. Unlike live TV applications, VOD provides video display from certain storage sources. For example, video on demand can come from the server side of cloud storage, and from the local hard disk storage that contains stored video programs.
  • Media center applications can provide various multimedia content playback applications.
  • the media center can provide services that are different from live TV or video on demand, and users can access various images or audio through the media center application.
  • Application center can provide storage of various applications.
  • the applications can be games, utility applications, or other applications that are related to a computer system or other devices but can run on a display device.
  • the application center can obtain these applications from different sources, store them in the local storage, and then run on the display device 200.
  • Since the second controller 310 and the first controller 210 may each be installed with an independent operating system, there are two independent but related sub-systems in the display device 200.
  • For example, both the second controller 310 and the first controller 210 can be independently installed with Android and various apps; each can realize certain functions on its own, and the second controller 310 and the first controller 210 can also cooperate to realize certain functions.
  • FIG. 9 exemplarily shows a schematic diagram of a user interface in the display device 200 according to an exemplary embodiment.
  • the user interface includes a first view display area 2011 and a second view display area 2021.
  • the functions of the first view display area 2011 and the second view display area 2021 are basically the same. The following only focuses on the first view display area 2011.
  • the first view display area 2011 includes one or more different items in the layout.
  • the user interface also includes a selector indicating that the item is selected, and the position of the selector can be moved through user input to change the selection of different items.
  • the first view display area 2011 is a zoomable view display.
  • Scalable may indicate that the size or proportion of the first view display area 2011 on the screen is scalable, or that the size or proportion of the items in the view display 201 on the screen is scalable.
  • Items refer to visual objects displayed in the view display area of the user interface in the display device 200 to represent corresponding content such as icons, thumbnails, and video clips.
  • an item can represent image content or video clips of movies, TV series, audio content of music, applications, or other user access content history information.
  • the item may indicate an interface or a set display of interfaces connected to the display device 200 and an external device, or may indicate the name of an external device connected to the display device, or the like.
  • the view display area can present the content of the video chat project or the content of the application layer project (for example, webpage video, video on demand (VOD) display, application program screen, etc.).
  • “Selector” is used to indicate that any item has been selected, such as cursor or focus object.
  • The selection input can be positioned according to the icon or menu position touched by the user on the display device 200, or the focus object displayed on the display device 200 can be moved to select a control or item, so that one or more items can be selected or controlled.
  • the focus object refers to the object that moves between items based on user input; for example, a thick border may be drawn on the edge of an item to mark the position of the focus object.
  • The form of the focus is not limited to these examples; it can be a tangible or intangible form recognizable by the user, such as a cursor or a 3D deformation of the item, and the border line, size, color, transparency, outline, and/or font of the text or image of the focused item can also be changed.
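  • As a hedged illustration of the selector described above (item names and the bracket highlight are assumptions only), the following Java sketch moves a focus object between items with left/right key presses:

```java
// Hypothetical sketch: a selector (focus object) moving across items in a
// view display area in response to remote-control left/right key presses.
public class Selector {
    private final String[] items;
    private int focusIndex = 0;

    Selector(String... items) { this.items = items; }

    void moveLeft()  { if (focusIndex > 0) focusIndex--; }
    void moveRight() { if (focusIndex < items.length - 1) focusIndex++; }

    // Marks the focused item, e.g. with a thick border in a real renderer.
    String render() {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < items.length; i++) {
            sb.append(i == focusIndex ? "[" + items[i] + "]" : " " + items[i] + " ");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Selector s = new Selector("Answer video", "Answer voice only", "Hang up");
        s.moveRight();
        System.out.println(s.render()); // focus on "Answer voice only"
    }
}
```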
  • the event transmission system 2914 can monitor the user input of each predefined event or sub-event, and directly or indirectly provide the interface layout management module 2913 with the control of the recognized event or sub-event.
  • the interface layout management module 2913 is used to monitor the state of the user interface (including the position and/or size, change process, and so on of the view partitions, items, focus or cursor objects, etc.) and, according to the event or sub-event, to modify and adjust the layout of the view display areas; modifying and adjusting the layout includes displaying or not displaying each view partition, or the content of the items in the view partitions, on the screen.
  • the user input interface is used to send the user's input signal to the controller, or to transmit the signal output from the controller to the user.
  • the control device (such as a mobile terminal or a remote control) can send input signals entered by the user, such as a power switch signal, a channel selection signal, and a volume adjustment signal, to the user input interface, which then forwards them to the controller;
  • the control device may receive output signals such as audio, video, or data output from the user input interface processed by the controller, and display the received output signal or output the received output signal in the form of audio or vibration.
  • the user may input a user command on the user interface displayed on the display device 200, and the user input interface receives the user input command through the user interface.
  • the user may input a user command by inputting a specific voice or gesture, and the user input interface recognizes the voice or gesture through the sensor to receive the user input command.
  • User interface is a medium interface for interaction and information exchange between applications or operating systems and users. It realizes the conversion between the internal form of information and the form acceptable to users.
  • the commonly used form of the user interface is a graphical user interface (graphic user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It can be an icon, window, control and other interface elements displayed on the display screen of an electronic device.
  • Controls can include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • FIGS. 10-12 exemplarily show schematic diagrams of interaction between the user interface and the user in the display device 200 according to an exemplary embodiment.
  • the first display screen (specifically, the first view display area 2011) displays a video screen, which can be the image content of a movie, a TV series, or a video clip, or a video screen in an application, etc.
  • the controller receives the incoming call request, and in response to the incoming call request, the controller displays the incoming call reminder screen on the second display screen (specifically, the second view display area 2021).
  • the incoming call reminder screen includes a tab for instructing to connect the incoming call request and a tab for rejecting the incoming call request.
  • After the controller displays the incoming call reminder screen on the second display screen, it can proceed in either of the following two optional ways.
  • the controller displays the chat screen on the first display screen in response to the tab of the call connection request input by the user. Further, the controller may also display a small window video call screen on the second display screen or in the target area of the first display screen in response to the user's small window video call instruction.
  • the option card of the request to connect to the incoming call input by the user may also be referred to as the indication of the request to connect to the incoming call input by the user.
  • the controller controls the second display screen to present the chat screen in response to the tab for the call connection request input by the user. Further, the controller can also control the chat screen to switch from the second display screen to the first display screen in response to the user's selection of the chat screen.
  • the incoming call connection request may refer to answering a video call, for example, and the chat screen may refer to a video call screen, for example.
  • the number of controllers may be one or multiple.
  • In the case of a single controller, that controller receives the incoming call request while the first display screen displays the video image and displays the incoming call reminder image on the second display screen.
  • In the case of multiple controllers, the incoming call reminder screen is displayed on the second display screen while the video screen is displayed on the first display screen through collaboration and interaction between the controllers.
  • the controller may include a first controller and a second controller.
  • the first controller is configured to receive an incoming call request when the video screen is displayed on the first display screen and send an incoming call indication to the second controller.
  • the second controller is used to receive the incoming call indication and to display the incoming call reminder screen on the second display screen.
  • the incoming call reminder screen displayed in the second view display area 2021 by the second controller may include items such as the caller's portrait, name, and incoming call prompt, an answer video button, an answer voice only button, and a hang up button.
  • the second controller positions the focus object on the answer video button.
  • the incoming call reminder screen displayed in the second view display area 2021 by the second controller may include items such as the caller's portrait, name, and incoming call reminder, an answer voice button, and a hang up button; at the same time, the second controller positions the focus object on the answer voice button. The user can move the focus object by operating the left and right keys on the remote control to select a button, and press the OK key to confirm.
  • the first display screen and the second display screen can display the screen shown in FIG. 13 or FIG. 14.
  • Figure 13 is a schematic diagram of six parties in a video call
  • Figure 14 is a schematic diagram of two parties in a video call.
  • the user can press the floating window button, and the screen can be switched to the screen shown in FIG. 18.
  • the first display screen and the second display screen can display the screen shown in FIG. 17 or FIG. 18. The user can press a specific menu key, such as the OK key, and the screen can be switched to the screen shown in FIG. 13 or FIG. 14.
  • the first display screen and the second display screen can display the screen shown in FIG. 15. As shown in FIG. 15, the first display screen can continue to display video images. In some embodiments, the video can be paused at this time.
  • the second display screen may display portraits of participants other than the current user among the parties in the voice call. It should be understood that the current user described in the embodiment of the present application refers to a user corresponding to the display device. At the same time, the position corresponding to the current user in the second display screen is displayed as a black screen, and a sign indicating that the video is not started is superimposed on the black screen.
  • a voice logo may be displayed on the second display screen to identify that a voice call is currently in progress.
  • the first display screen and the second display screen can display the screen shown in FIG. 12.
  • the picture shown in FIG. 12 is the picture displayed when no incoming call request has been received, or after a voice call or video call has ended.
  • the first display screen can continue to play video, while the second display screen can display information such as voice assistant, weather, date, and time.
  • Fig. 13 and Fig. 14 are schematic diagrams of a video call, which will be described separately below.
  • the video call screen is displayed in the first view display area 2011.
  • the video call screen displays the video of each of the six participants in the video call, and the video of each participant occupies part of the first view display area 2011.
  • the lower left corner of the area where the video of each participant is located displays the name of the participant, and the lower right corner displays whether the participant is muted or non-muted.
  • the video of user A is displayed in the upper left corner of the display area of the first view.
  • the image of user A fills the entire area.
  • the lower left corner of the area superimposes and displays the name of user A, and the lower right corner of the area superimposes the sign that user A is currently muting.
  • the items for the user to operate and the duration of the video call are displayed in a semi-transparent form. These items can include floating small window buttons, switch layout buttons, invite new participants button, hang up button, video off button, microphone off button, screen sharing button, etc.
  • the displayed items can be different for different call types.
  • the focus object can be positioned on one of the items.
  • the call reminder screen is no longer displayed on the second display screen.
  • the second display screen may be in a state where the backlight is turned off, or the second display screen may display information such as weather and time.
  • An example of the interface after the user presses the floating window button, the layout switch button, and the invite new participant button will be described in the following embodiments.
  • After the user presses the hang up button, the interface displays the screen shown in FIG. 12 above.
  • After the user presses the video off button, the user's corresponding video area is in a black screen state, and the current user's video is no longer displayed.
  • After the user presses the microphone off button, a mute sign is displayed in the area where the user's video is located.
  • For example, if the user is user A in FIG. 13, when the user presses the microphone off button, the sign in the lower right corner of the area where user A is located is displayed as a mute sign.
  • a video call screen is displayed in the first view display area 2011, and the video call screen displays videos of two participants participating in the video call, and the video of each participant occupies half of the first view display area 2011.
  • the items for users to operate are displayed in a semi-transparent form. These items can include floating small window buttons, beauty filter buttons, invite friends buttons, hang up buttons, video off buttons, microphone off buttons, and switch layout buttons.
  • the focus object can be positioned on one of the items, for example, positioned on the floating window button.
  • the second display screen may be in a state where the backlight is turned off, or the second display screen may display information such as weather and time.
  • After the user presses the hang up button, the interface displays the screen shown in FIG. 12 above.
  • After the user presses the video off button, the user's corresponding video area is in a black screen state, and the current user's video is no longer displayed.
  • After the user presses the microphone off button, a mute sign is displayed in the area where the user's video is located.
  • For example, if the user is user A in FIG. 13, when the user presses the microphone off button, the sign in the lower right corner of the area where user A is located is displayed as a mute sign.
  • FIG. 16 is a schematic diagram of the interaction process between the first controller and the second controller. As shown in FIG. 16, the interaction process between the first controller and the second controller includes:
  • the first controller displays a video image on the first display screen.
  • the video screen may be the image content of a movie, a TV series, a video clip, or a video screen in an application.
  • the first controller also simultaneously plays the audio information corresponding to the video picture.
  • the first controller receives an incoming call request.
  • the incoming call request may be a video call request, a voice call request, a game invitation, etc.
  • the first controller sends an incoming call indication to the second controller.
  • the first controller determines that the video screen is currently being displayed on the first display screen, and can send an incoming call indication to the second controller through a serial port transmission mode.
  • the incoming call indication may include the type of the incoming call and the information of the caller, and the second controller selects the item to be displayed accordingly.
  • the incoming call indication may include both the type of the incoming call and the information of the caller, as well as the items to be displayed on the second display screen by the second controller, and the second controller directly displays the items based on the incoming call indication.
  • the types of incoming calls can include video calls, voice calls, game invitations, and so on.
  • the information of the caller may include the portrait, name, etc. of the caller.
  • the second controller receives the above-mentioned incoming call indication.
  • the second controller displays an incoming call reminder screen on the second display screen.
  • the second controller displays the incoming call reminder screen in the second view display area of the second display screen.
  • the incoming call reminder screen includes the caller's portrait, name, and incoming call reminder, an answer video button, an answer voice only button, a hang up button, and so on. If the type of the incoming call is a video-based incoming call, such as a video call or a game invitation, the second controller can position the focus object on the answer video button; if the type of the incoming call is a voice call, the second controller positions the focus object on the answer voice only button.
  • the user can move the focus object by operating the left key and the right key on the remote control to select a button, and press the OK key to confirm. Alternatively, the user can directly press the OK key to confirm.
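  • As a hedged illustration of the focus-positioning rule described above (enum and method names are assumptions), a video-type incoming call defaults the focus to the answer video button, while a voice call defaults it to the answer voice only button:

```java
// Hypothetical sketch: choose which button the focus object is positioned on
// by default, according to the type of the incoming call.
public class IncomingCallFocus {
    enum CallType { VIDEO_CALL, GAME_INVITATION, VOICE_CALL }
    enum Button { ANSWER_VIDEO, ANSWER_VOICE_ONLY, HANG_UP }

    static Button defaultFocus(CallType type) {
        switch (type) {
            case VIDEO_CALL:
            case GAME_INVITATION:
                return Button.ANSWER_VIDEO;      // video-based incoming calls
            case VOICE_CALL:
            default:
                return Button.ANSWER_VOICE_ONLY; // voice calls
        }
    }

    public static void main(String[] args) {
        System.out.println(defaultFocus(CallType.GAME_INVITATION)); // ANSWER_VIDEO
        System.out.println(defaultFocus(CallType.VOICE_CALL));      // ANSWER_VOICE_ONLY
    }
}
```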
  • the second controller receives the user's instruction to answer the video call.
  • the second controller sends a video call instruction to the first controller.
  • After the user presses the OK key, the second controller recognizes the user's instruction based on the item on which the focus object is currently positioned. If the focus object is positioned on the hang up button, the second controller confirms that it has received the hang-up instruction and sends the hang-up instruction to the first controller; after the first controller receives the hang-up instruction, it instructs the second controller to close the incoming call reminder screen, and the following steps are no longer performed. If the focus object is positioned on the answer voice only button, the second controller confirms that it has received the voice call instruction and sends the voice call instruction to the first controller.
  • After the first controller receives the voice call instruction, it executes voice call control, such as playing the caller's voice and capturing the receiver's voice and sending it to the caller's device. At the same time, the first controller can instruct the second controller to close the incoming call reminder screen and to display on the second display screen the items indicating that a voice call is in progress, and the following steps are not performed.
  • the second controller confirms that the instruction to answer the video call is received, and sends the video call instruction to the first controller.
  • the first controller receives the video call instruction and executes the following steps.
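  • The dispatch performed when the user presses the OK key can be sketched as follows; the instruction names are illustrative assumptions and not the actual serial-port protocol between the controllers.

```java
// Hypothetical sketch: when the OK key is pressed, the second controller maps
// the currently focused button to the instruction sent to the first controller.
public class OkKeyDispatcher {
    enum Button { ANSWER_VIDEO, ANSWER_VOICE_ONLY, HANG_UP }

    // Returns the instruction name forwarded to the first controller; the
    // actual wire format between controllers is not specified by the patent.
    static String onOkPressed(Button focused) {
        switch (focused) {
            case HANG_UP:           return "HANG_UP_INSTRUCTION";
            case ANSWER_VOICE_ONLY: return "VOICE_CALL_INSTRUCTION";
            case ANSWER_VIDEO:
            default:                return "VIDEO_CALL_INSTRUCTION";
        }
    }

    public static void main(String[] args) {
        System.out.println(onOkPressed(Button.ANSWER_VIDEO)); // VIDEO_CALL_INSTRUCTION
    }
}
```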
  • the first controller displays a video call screen on the first display screen.
  • the above-mentioned video call screen may include videos of the video call participants and items for the user to operate. Among them, the items for the user to operate can be displayed on the video of the video call participants.
  • the first controller may determine the video display position of each participant according to the number of participants in the video call. For example, assuming that the video call is a two-party video call, the videos of the caller and the called party each occupy half of the screen of the first display screen. For another example, assuming that the video call is a six-party video call, each participant in the call occupies one-sixth of the first display screen.
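  • A hedged sketch of the layout rule just described, splitting the first display screen into equal regions by participant count (the 1920x1080 resolution and the 3-column grid for six parties are assumptions):

```java
// Hypothetical sketch: split a 1920x1080 first display screen into equal
// regions for the participants of a video call (2 parties -> two halves,
// 6 parties -> a 3x2 grid of sixths). Grid shape is an assumption.
public class CallLayout {
    record Region(int x, int y, int w, int h) {}

    static Region[] layout(int participants, int screenW, int screenH) {
        int cols = (participants <= 2) ? participants : 3;
        int rows = (int) Math.ceil(participants / (double) cols);
        Region[] regions = new Region[participants];
        int w = screenW / cols, h = screenH / rows;
        for (int i = 0; i < participants; i++) {
            regions[i] = new Region((i % cols) * w, (i / cols) * h, w, h);
        }
        return regions;
    }

    public static void main(String[] args) {
        for (Region r : layout(6, 1920, 1080)) System.out.println(r);
    }
}
```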
  • the first controller displays items for the user to operate on top of the videos of the video call participants. These items may include a floating small window button, a switch layout button, a button for inviting new parties, a hang up button, a video off button, a microphone off button, a screen sharing button, and so on.
  • the displayed items can be different for different call types.
  • the focus object can be positioned on one of the items, such as a floating window button. The user can move the focus object by operating the left and right keys on the remote control to select other items, and confirm by operating the OK key on the remote control.
  • the second controller closes the incoming call reminder screen.
  • the second controller may execute this step after executing the above-mentioned step S1306.
  • This step and step S1607 may be executed in no particular order.
  • After the second controller closes the incoming call reminder screen, it can display information such as time, weather, temperature, and reminder messages on the second display screen, or the second controller can turn off the backlight of the second display screen.
  • In this way, when the first controller receives an incoming call request while the first display screen is playing a video screen, the incoming call reminder screen is displayed on the second display screen, so that the video screen being displayed on the first display screen is not blocked, which greatly improves the user experience. After the user chooses to answer the video call, the video call screen is displayed on the first display screen, so that the user can view the videos of all parties to the video call.
  • the information transmission between the first controller and the second controller may be transmitted through a serial port.
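  • The patent does not specify the serial-port message format; purely as an assumption for illustration, an incoming call indication carrying the call type and caller information could be encoded and decoded as in the following Java sketch.

```java
// Hypothetical sketch: encode/decode an incoming call indication exchanged
// between the first and second controllers over a serial port. The field
// layout is an assumption; the patent does not define the wire format.
public class IncomingCallIndication {
    final String callType;   // e.g. "VIDEO_CALL", "VOICE_CALL", "GAME_INVITATION"
    final String callerName;

    IncomingCallIndication(String callType, String callerName) {
        this.callType = callType;
        this.callerName = callerName;
    }

    String encode() {
        // Simple delimited text frame; real systems would add framing/checksums.
        return "CALL|" + callType + "|" + callerName + "\n";
    }

    static IncomingCallIndication decode(String frame) {
        String[] parts = frame.trim().split("\\|");
        return new IncomingCallIndication(parts[1], parts[2]);
    }

    public static void main(String[] args) {
        String wire = new IncomingCallIndication("VIDEO_CALL", "Alice").encode();
        System.out.println(decode(wire).callerName); // Alice
    }
}
```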
  • the user can choose to make a small window video call.
  • the first controller uses the small window to display the video call screen according to the user's selection.
  • the user may press the floating small window button in FIG. 13 or FIG. 14, or the user may press the switch layout button in FIG. 13 or FIG. 14 to make a small window video call.
  • FIG. 17 exemplarily shows a schematic diagram of interaction between a user interface and a user in the display device 200 according to an exemplary embodiment.
  • the video call screen in the first view display area 2011 is closed, and the video screen before the video call continues to be displayed.
  • the small window video call instruction is input by the user pressing the above floating small window button.
  • a small window video call screen is displayed in the second view display area 2021, and the video of each participant is displayed in the small window video call screen.
  • the name of the participant is displayed in the lower left corner of the area where the video of each participant is located, and the lower right corner shows that the participant is in a silent state or a non-muted state.
  • a line of prompt text (not shown in the figure), such as "press the menu key to switch windows", can be superimposed on the videos of the participants in the second view display area 2021, so that the user can follow the prompt text to return to the display screen shown in FIG. 13 or FIG. 14.
  • the user can press a specific button (for example, the return button) on the remote control, and the display device will return to the display screen shown in FIG. 13 or FIG. 14.
  • a return button may be displayed in the second view display area. The user selects this button by operating the remote control and confirms, so that the display device returns to the display screen shown in FIG. 13 or FIG. 14.
  • FIG. 18 exemplarily shows a schematic diagram of interaction between the user interface and the user in the display device 200 according to an exemplary embodiment.
  • the video call screen in the first view display area 2011 is closed, and the video screen before the video call continues to be displayed.
  • the small window video call instruction is input by the user pressing the above-mentioned switch layout button.
  • a small window video call screen is displayed below the first view display area 2011, near the lower edge of the first display screen, and the video of each participant is displayed in the small window video call screen.
  • the name of the participant is displayed in the lower left corner of the area where the video of each participant is located, and the lower right corner shows that the participant is in a silent state or a non-muted state.
  • the user can press a specific button (for example, the return button) on the remote control, and the display device will return to the display screen shown in FIG. 13 or FIG. 14.
  • a return button may be displayed below the first view display area 2011, near the lower edge of the first display screen. The user selects this button by operating the remote control and confirms, so that the display device returns to the display screen shown in FIG. 13 or FIG. 14.
  • FIG. 19 is a schematic diagram of the interaction process between the first controller and the second controller. As shown in FIG. 19, after the video call screen is displayed on the first display screen, the user can choose to display the video call screen in a small window, and the first controller can notify the second controller to display the small window video call screen on the second display screen. The interaction process between the first controller and the second controller includes:
  • the first controller receives a small window video call instruction input by the user.
  • the user can select the floating small window button displayed on the first display screen and confirm that the first controller receives the small window video call instruction.
  • the first controller sends a small window video call instruction to the second controller.
  • the small window video call indication is used to instruct the second controller to display a small window video call picture on the second display screen.
  • the first controller may carry the number of participants in the video call, the names of the parties, etc. in the instruction.
  • the second controller receives the small window video call instruction.
  • the first controller closes the video call screen and displays the video screen before the video call.
  • the second controller displays a small window video call screen on the second display screen.
  • After receiving the small window video call instruction, the second controller displays the video and name of each participant in multiple areas on the second display screen based on the number and names of the participants in the call.
  • the execution order of the above steps S1903 and S1904 may be in no particular order.
  • the user can press a specific button (for example, the return button) on the remote control, or select the return button in the small window video call screen through the remote control and confirm, so that the first controller receives the user's return instruction based on the user's operation.
  • the first controller sends the second controller an instruction to close the small window video call screen, the second controller closes the small window video call screen, and at the same time the first controller displays the above-mentioned video call screen on the first display screen again.
  • the information transmission between the first controller and the second controller may be transmitted through a serial port.
  • the first controller closes the video call screen after receiving the small window video call instruction input by the user, and displays the small window video call screen in the target area of the first display screen.
  • the first controller may determine the size of the target area according to the number of participants in the video call, and display a small window video call picture on the target area.
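  • As a hedged sketch of determining the target area for the small-window video call picture near the lower edge of the first display screen (the per-participant tile size, margins, and screen resolution are assumptions only):

```java
// Hypothetical sketch: size a small-window video call area near the lower
// edge of a 1920x1080 first display screen, widening with participant count.
public class SmallWindowArea {
    record Rect(int x, int y, int w, int h) {}

    static Rect targetArea(int participants, int screenW, int screenH) {
        int tileW = 240, tileH = 160;              // per-participant tile (assumed)
        int w = Math.min(participants * tileW, screenW);
        int x = (screenW - w) / 2;                 // centered horizontally
        int y = screenH - tileH - 40;              // 40px margin above lower edge
        return new Rect(x, y, w, tileH);
    }

    public static void main(String[] args) {
        System.out.println(targetArea(2, 1920, 1080));
        System.out.println(targetArea(6, 1920, 1080));
    }
}
```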
  • the user can press a specific button (such as the return button) on the remote control, or select the return button in the small window video call screen through the remote control and confirm, so that the first controller receives the user's return instruction based on the user's operation.
  • the first controller closes the small window video call screen and displays the above-mentioned video call screen on the first display screen again.
  • FIG. 20 exemplarily shows a schematic diagram of interaction between a user interface and a user in the display device 200 according to an exemplary embodiment.
  • a list of friends available for invitation is displayed.
  • the invitation instruction is input by the user pressing the aforementioned invite new participant button or invite friend button.
  • the friend list may be displayed above the first view display area 2011 near the upper edge of the first display screen, and each friend's portrait and name may be displayed in the friend list. After the user selects a friend's portrait or name and presses the OK button, the friend joins the video call, and the friend's video is added to the display area of the first view.
  • FIG. 21 exemplarily shows a schematic diagram of interaction between a user interface and a user in the display device 200 according to an exemplary embodiment.
  • the first view display area 2011 displays the beautification filter options on the displayed video call screen.
  • the beauty filter options can include backgrounds, stickers, filters, and beautification. Under the background option, the user can choose any one of a variety of backgrounds; under the sticker option, the user can choose any one of a variety of stickers; under the filter option, the user can choose any filter method; and under the beautification option, the user can select any beautification method.
  • the beauty filter option may be displayed at a position near the lower edge of the first display screen under the first view display area 2011. At this time, other buttons originally displayed at this position are temporarily hidden.
  • the foregoing describes the interaction processing process and the interface display process of the first controller and the second controller by taking the foregoing first optional manner as an example.
  • For the second optional manner, a corresponding processing and display process can be used.
  • After the second controller receives the instruction to answer the video call input by the user, it displays the video call screen on the second display screen in the same manner as the first controller does in the above-mentioned first optional manner, and after the user presses the confirmation key, the first controller displays the video call screen on the first display screen.
  • the specific implementation process will not be repeated here.
  • FIG. 22 is a schematic flowchart of an incoming call display method provided by an embodiment of the present application. The method is mainly executed by the aforementioned controller and, as shown in FIG. 22, includes:
  • S2201: Receive an incoming call request when a video screen is displayed on the first display screen.
  • S2202: In response to the aforementioned incoming call request, display an incoming call reminder screen on the second display screen.
  • When the first display screen is playing a video screen, the controller receives an incoming call request and then displays the incoming call reminder screen on the second display screen, so that the video screen being displayed on the first display screen is not blocked, which greatly enhances the user experience.
  • the controller includes a first controller and a second controller.
  • the controller receiving an incoming call request when a video screen is displayed on the first display screen includes:
  • the first controller receives an incoming call request when the first display screen displays a video image.
  • the controller displays an incoming call reminder screen on the second display screen in response to the incoming call request, including:
  • the first controller sends an incoming call indication to the second controller in response to the incoming call request;
  • the second controller receives the incoming call indication, and in response to the incoming call indication, displays an incoming call reminder screen on the second display screen.
  • the above method further includes:
  • after the incoming call reminder screen is displayed on the second display screen, in response to the tab, input by the user, for requesting to connect the incoming call, controlling the second display screen to present a chat screen.
  • the above method further includes:
  • in response to the user's selection of the chat screen, controlling the chat screen to switch from the second display screen to the first display screen.
  • the above method further includes:
  • the second controller receives the instruction for answering the video call input by the user, and in response to the instruction for answering the video call, sends a video call instruction to the first controller, and closes the incoming call reminder screen.
  • the first controller receives the video call instruction sent by the second controller, and in response to the video call instruction, displays a video call picture on the first display screen.
  • the above method further includes:
  • the first controller receives the small window video call instruction input by the user, responds to the small window video call instruction, closes the video call screen, and sends the small window video call instruction to the second controller.
  • the second controller displays a small window video call screen on the second display screen.
  • the above method further includes:
  • the first controller receives the small window video call instruction input by the user, and in response to the small window video call instruction, closes the video call screen, and displays the small window video call screen in the target area of the first display screen.
  • the method further includes:
  • the first controller displays the video screen on the first display screen.
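  • Putting the method above together, the following hedged Java sketch illustrates the two-controller flow of receiving an incoming call request while video plays on the first display screen and showing the reminder on the second display screen; all class and method names are assumptions for illustration only.

```java
// Hypothetical sketch of the two-controller flow: the first controller receives
// an incoming call request while video plays on the first display screen and
// sends an indication; the second controller shows the reminder screen.
public class IncomingCallFlow {
    static class SecondController {
        void onIncomingCallIndication(String callType, String caller) {
            System.out.println("second display: incoming " + callType + " from " + caller);
        }
    }

    static class FirstController {
        private final SecondController second;
        private boolean videoPlayingOnFirstScreen = true;

        FirstController(SecondController second) { this.second = second; }

        void onIncomingCallRequest(String callType, String caller) {
            if (videoPlayingOnFirstScreen) {
                // Do not block the first display screen; delegate the reminder.
                second.onIncomingCallIndication(callType, caller);
            }
        }
    }

    public static void main(String[] args) {
        new FirstController(new SecondController())
                .onIncomingCallRequest("VIDEO_CALL", "Alice");
    }
}
```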
  • the embodiments of the present application further provide a storage medium that stores instructions which, when run on a computer, cause the computer to execute the method in the above method embodiments.
  • the embodiment of the present application further provides a chip for executing instructions, and the chip is used to execute the method in the above method embodiment.
  • An embodiment of the present application also provides a program product. The program product includes a computer program stored in a storage medium; at least one processor can read the computer program from the storage medium, and when the at least one processor executes the computer program, the method in the foregoing method embodiments can be implemented.
  • "At least one" refers to one or more, and "multiple" refers to two or more.
  • “And/or” describes the association relationship of the associated objects, indicating that there can be three relationships, for example, A and/or B, which can mean: A alone exists, A and B exist at the same time, and B exists alone, where A, B can be singular or plural.
  • the character “/” generally indicates that the associated objects before and after are in an “or” relationship; in the formula, the character “/” indicates that the associated objects before and after are in a “division” relationship.
  • “The following at least one item (a)” or similar expressions refers to any combination of these items, including any combination of a single item (a) or a plurality of items (a).
  • at least one of a, b, or c can mean: a, b, c, ab, ac, bc, or abc, where a, b, and c can be singular or plural.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

According to various embodiments, the present invention relates to a display device comprising a first display screen, a second display screen, and a controller. The controller is configured to receive an incoming call request when the first display screen displays video images, and to display an incoming call alert image on the second display screen. If the controller receives an incoming call request while the first display screen displays video images, the incoming call alert image is displayed on the second display screen, so that the video images displayed on the first display screen are not blocked.
PCT/CN2020/086461 2019-11-04 2020-04-23 Dispositif d'affichage et procédé d'affichage d'appel entrant WO2021088326A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911067365 2019-11-04
CN201911067365.5 2019-11-04

Publications (1)

Publication Number Publication Date
WO2021088326A1 (fr)

Family

ID=75750043

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/086461 WO2021088326A1 (fr) 2019-11-04 2020-04-23 Dispositif d'affichage et procédé d'affichage d'appel entrant

Country Status (2)

Country Link
CN (2) CN116320554A (fr)
WO (1) WO2021088326A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113676689A (zh) * 2021-08-18 2021-11-19 Baidu Online Network Technology (Beijing) Co., Ltd. Video call method, apparatus and television

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103051972A (zh) * 2012-12-21 2013-04-17 Konka Group Co., Ltd. Control method and system for prompting a mobile phone incoming call through a smart television
US20130260728A1 (en) * 2012-03-28 2013-10-03 Kyocera Corporation Communication device, communication method, and storage medium storing communication program
CN105592193A (zh) * 2014-10-20 2016-05-18 China Telecom Corporation Limited Method and system for multi-screen incoming call display
CN106445441A (zh) * 2016-09-27 2017-02-22 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Message display method and system
CN106534999A (zh) * 2016-11-09 2017-03-22 Hepu County Bureau of Culture, Sports and Tourism Television capable of video chatting
CN106953990A (zh) * 2017-03-29 2017-07-14 Qingdao Hisense Mobile Communications Technology Co., Ltd. Incoming call answering method of a mobile terminal, and mobile terminal

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101763595B1 (ko) * 2010-11-16 2017-08-01 LG Electronics Inc. Method for controlling a network TV that processes data for a monitoring service, and the network TV
KR102014273B1 (ko) * 2011-02-10 2019-08-26 Samsung Electronics Co., Ltd. Portable device having a touch screen display and control method thereof
CN103780864B (zh) * 2012-10-18 2017-10-03 Tencent Technology (Shenzhen) Co., Ltd. Video call interface display method and apparatus
CN106791571B (zh) * 2017-01-09 2020-05-19 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Video display method and apparatus for a dual-screen display terminal
CN109379484B (zh) * 2018-09-19 2020-09-25 Vivo Mobile Communication Co., Ltd. Information processing method and terminal
CN109981885B (zh) * 2019-02-03 2022-03-08 Huawei Technologies Co., Ltd. Method for an electronic device to present video during an incoming call, and electronic device
CN110166814A (zh) * 2019-05-25 2019-08-23 Oppo Guangdong Mobile Communications Co., Ltd. Video picture display method and related device
CN110572519A (zh) * 2019-09-20 2019-12-13 Qingdao Hisense Mobile Communications Technology Co., Ltd. Method for intercepting an incoming call display interface, and display device

Also Published As

Publication number Publication date
CN116320554A (zh) 2023-06-23
CN112788381A (zh) 2021-05-11
CN112788381B (zh) 2023-01-17

Similar Documents

Publication Publication Date Title
WO2020248668A1 (fr) Dispositif d'affichage et procédé de traitement d'image
WO2021088320A1 (fr) Dispositif d'affichage et procédé d'affichage de contenu
US20210314659A1 (en) Method For Switching Video Call Interface On Smart Television, And Smart Television
CN111526415B (zh) 一种双屏显示设备及其hdmi的切换方法
CN111491190B (zh) 一种双系统摄像头切换控制方法及显示设备
CN111464840B (zh) 显示设备及显示设备屏幕亮度的调节方法
CN112788422A (zh) 显示设备
WO2020248714A1 (fr) Procédé et dispositif de transmission de données
WO2020248697A1 (fr) Dispositif d'affichage et procédé de traitement des données de communication vidéo
CN112788378B (zh) 显示设备与内容显示方法
CN113141528B (zh) 显示设备、开机动画播放方法及存储介质
WO2020248654A1 (fr) Appareil d'affichage et procéder pour afficher des applications de façon conjointe
WO2020248699A1 (fr) Procédé de traitement du son et appareil d'affichage
CN112788423A (zh) 一种显示设备及菜单界面的显示方法
CN112783380A (zh) 显示设备和方法
WO2021088326A1 (fr) Dispositif d'affichage et procédé d'affichage d'appel entrant
CN112784137A (zh) 显示设备、显示方法及计算设备
WO2021223074A1 (fr) Dispositif d'affichage et procédé de commande d'interaction
CN112788375B (zh) 显示设备、显示方法及计算设备
CN113365124B (zh) 一种显示设备及显示方法
WO2020248681A1 (fr) Dispositif d'affichage et procédé d'affichage des états de commutation bluetooth
WO2021088325A1 (fr) Dispositif d'affichage à double écran et procédé d'affichage d'image
WO2021189400A1 (fr) Dispositif d'affichage, et procédé d'affichage d'une fenêtre de chat vidéo
CN113497966B (zh) 一种双屏显示设备
CN113630633B (zh) 显示设备及交互控制方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20884511; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20884511; Country of ref document: EP; Kind code of ref document: A1)