CN112969084A - User interface display method, storage medium and display device - Google Patents


Info

Publication number
CN112969084A
Authority
CN
China
Prior art keywords
application
icon
focus
display
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911310348.XA
Other languages
Chinese (zh)
Inventor
张欣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vidaa Netherlands International Holdings BV
Original Assignee
Qingdao Hisense Media Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Hisense Media Network Technology Co Ltd
Priority to PCT/CN2020/084194 (published as WO2021114529A1)
Publication of CN112969084A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41: Structure of client; Structure of client peripherals
    • H04N21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204: User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312: Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443: OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4438: Window management, e.g. event handling following interaction with the user interface
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80: Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81: Monomedia components thereof
    • H04N21/8166: Monomedia components thereof involving executable data, e.g. software

Abstract

The application provides a user interface display method, a storage medium and a display device. When the user interface is displayed, two view display areas are set: application icons are displayed in the second display area, and when the focus is positioned on a first application icon, recommendation data corresponding to the first application is displayed in the first display area. When the focus is switched from the first application icon to a second application icon, the recommendation data displayed in the first display area is switched to the recommendation data bound to the second application. Therefore, when the user switches the focus, the recommendation data displayed in the first display area is automatically switched according to the recommendation data bound to each application, so that the user can conveniently view the recommendation data corresponding to each application and operate the applications installed in the television more conveniently and quickly, improving the user experience.

Description

User interface display method, storage medium and display device
Technical Field
The present embodiments relate to display technologies, and more particularly, to a user interface presentation method, a storage medium, and a display apparatus.
Background
The smart television is a television product designed to meet the diversified and personalized requirements of users. Based on Internet application technology, it has an open operating system and chip and an open application platform, can realize a bidirectional human-computer interaction function, and integrates various functions such as audio and video, entertainment, and data, aiming to bring users a more convenient experience.
The home application panel (also referred to as the operating system desktop) of the smart television is the user interface displayed first after the smart television is turned on and enters its normal working state. Various user interface objects, such as icons of a plurality of application programs, can be displayed in it. To meet the diversified requirements of users, more and more applications can be installed in the smart television, for example, applications for watching videos, news applications, fitness applications, and the like.
In view of the increasing number of applications installed in current smart televisions, a new user interface display method needs to be provided, so that a user can enjoy the different functions provided by each application and operate the applications installed in the smart television more conveniently and quickly.
Disclosure of Invention
The embodiments of the present application provide a user interface display method, a storage medium, and a display device, so that a user can operate the applications installed in a television more conveniently and quickly.
According to a first aspect of embodiments of the present application, there is provided a display apparatus, including:
a display configured to display a user interface, the user interface including a plurality of view display regions;
a controller communicatively coupled to the display, the controller being configured to present the user interface by:
displaying at least one application icon in the second display area;
when the focus is positioned on a first application icon, displaying recommendation data corresponding to the first application in the first display area; and
in response to the focus being switched from the first application icon to a second application icon, switching the recommendation data displayed in the first display area to the recommendation data bound to the second application.
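The controller behavior recited in the first aspect can be sketched in code. This is an illustrative model only: the class `HomePanel`, the method `on_focus_changed`, and the data shapes are hypothetical and are not part of the claims.

```python
class HomePanel:
    """Minimal model of the two-area user interface described above:
    application icons in the second display area, recommendation data
    bound to the focused application in the first display area."""

    def __init__(self, recommendations):
        # recommendations: mapping of application id -> recommendation data
        # "bound" to that application (hypothetical data source).
        self._recommendations = recommendations
        self.second_area_icons = list(recommendations)  # icons in area 2
        self.first_area_content = None                  # content of area 1
        self.focused_app = None

    def on_focus_changed(self, app_id):
        """When the focus moves to an application icon, switch the first
        display area to the recommendation data bound to that application."""
        self.focused_app = app_id
        self.first_area_content = self._recommendations.get(app_id)


panel = HomePanel({"video": ["movie A", "movie B"], "news": ["headline X"]})
panel.on_focus_changed("video")  # focus positioned on the first application icon
panel.on_focus_changed("news")   # focus switched to the second application icon
print(panel.first_area_content)  # ['headline X']
```

Note that only the first display area changes on a focus switch; the icon row in the second display area stays as-is.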
According to a second aspect of the embodiments of the present application, there is provided a user interface presentation method, including:
a display device displays a user interface, the user interface including a plurality of view display regions;
displaying at least one application icon in the second display area;
when the focus is positioned on a first application icon, displaying recommendation data corresponding to the first application in the first display area; and
in response to the focus being switched from the first application icon to a second application icon, switching the recommendation data displayed in the first display area to the recommendation data bound to the second application.
According to a third aspect of embodiments of the present application, there is provided a computer storage medium, which may store a program that, when executed, may implement the method of the second aspect of embodiments of the present application.
As can be seen from the foregoing embodiments, in the user interface display method, the storage medium, and the display device provided in the embodiments of the present application, two view display areas are set when the user interface is displayed. When the focus is positioned on a first application icon, recommendation data corresponding to the first application is displayed in the first display area; when the focus is then switched from the first application icon to a second application icon, the recommendation data displayed in the first display area is switched to the recommendation data bound to the second application. Therefore, when the user switches the focus, the recommendation data displayed in the first display area is automatically switched according to the recommendation data bound to each application, so that the user can conveniently view the recommendation data corresponding to each application and operate the applications installed in the television more conveniently and quickly, improving the user experience.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings based on these drawings without inventive effort.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus;
Fig. 2 is a block diagram illustrating a configuration of the control apparatus 100 in Fig. 1;
Fig. 3 is a block diagram illustrating a configuration of the display device 200 in Fig. 1;
Fig. 4a is a block diagram illustrating an architectural configuration of an operating system in the memory of the display device 200;
Fig. 4b is a diagram illustrating a functional configuration of the display device 200;
Fig. 4c is a diagram illustrating a software configuration of the display device 200;
Fig. 5 is a schematic view illustrating a home page interface of the display device 200;
Figs. 6a to 6h are schematic diagrams illustrating application display screens in the home page interface of the display device 200;
Figs. 7a to 7d are diagrams illustrating operations of moving the order of application icons in the home interface of the display device 200 with the control apparatus 100;
Fig. 8 is a flow chart illustrating a user interface presentation method;
Fig. 9 is a flow chart illustrating another user interface presentation method;
Figs. 10a to 10c are schematic diagrams illustrating operations of editing an application in the home page interface of the display device 200;
Figs. 11a to 11d are schematic diagrams illustrating operations of moving application icons in the home interface of the display device 200.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
In order to enable a user to operate the applications installed in a television more conveniently and quickly, this embodiment provides a user interface display method, a storage medium, and a display device. It should be noted that the method provided by this embodiment is applicable not only to the main page of the television but also to other interface displays of the television; in addition, it is applicable not only to televisions but also to other display devices, such as computers and tablet computers.
The concept to which the present application relates will be first explained below with reference to the drawings. It should be noted that the following descriptions of the concepts are only for the purpose of facilitating understanding of the contents of the present application, and do not represent limitations on the scope of the present application.
The term "module," as used in various embodiments of the present application, may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in the embodiments of the present application refers to a component of an electronic device (such as the display device disclosed in the present application) that can wirelessly control the electronic device, typically over a short distance. The component is typically connected to the electronic device using infrared and/or Radio Frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors. For example, a hand-held touch remote controller replaces most of the physical built-in hard keys of a conventional remote control device with a user interface on a touch screen.
The term "gesture" as used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose, or result through a change in hand shape or an action such as hand movement.
The term "hardware system" used in the embodiments of the present application may refer to a physical component having computing, controlling, storing, inputting and outputting functions, which is formed by a mechanical, optical, electrical and magnetic device such as an Integrated Circuit (IC), a Printed Circuit Board (PCB) and the like. In various embodiments of the present application, a hardware system may also be referred to as a motherboard (or chip).
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus. As shown in fig. 1, the control apparatus 100 and the display device 200 may communicate with each other in a wired or wireless manner.
The control apparatus 100 is configured to control the display device 200: it receives operation instructions input by the user and converts them into instructions that the display device 200 can recognize and respond to, serving as an intermediary for interaction between the user and the display device 200. For example, the user operates the channel up/down keys on the control device 100, and the display device 200 responds to the channel up/down operation.
The control device 100 may be a remote controller 100A, which may communicate with the display apparatus 200 through infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods, and controls the display apparatus 200 wirelessly or in another manner. The user may input user instructions through keys on the remote controller, voice input, control panel input, etc., to control the display apparatus 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, etc. on the remote controller to control the display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, and the like. For example, the display device 200 is controlled using an application program running on the smart device. The application program may provide various controls to a user through an intuitive User Interface (UI) on a screen associated with the smart device through configuration.
For example, the mobile terminal 100B may install a software application matching the display device 200 to implement connection communication through a network communication protocol, for the purpose of one-to-one control operation and data communication. For instance, the mobile terminal 100B may establish a control instruction protocol with the display device 200 so that, by operating the various function keys or virtual buttons of the user interface provided on the mobile terminal 100B, the functions of the physical keys arranged on the remote control 100A are implemented. The audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200 to implement a synchronous display function.
The display apparatus 200 may provide a smart network television function that combines a broadcast receiving function with computer support functions, and may be implemented as a digital television, a web television, an Internet Protocol Television (IPTV), or the like.
The display device 200 may be a liquid crystal display, an organic light emitting display, or a projection device; the specific type, size, and resolution of the display device are not limited.
The display apparatus 200 also performs data communication with the server 300 through various communication means. The display apparatus 200 may be communicatively connected via a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 300 may provide various contents and interactions to the display apparatus 200. By way of example, the display device 200 may send and receive information, such as receiving Electronic Program Guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The server 300 may be one or more groups of servers, and of one or more types. Other network service content, such as video on demand and advertisement services, is provided through the server 300.
Fig. 2 is a block diagram illustrating the configuration of the control device 100. As shown in fig. 2, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, an output interface 150, and a power supply 160.
The controller 110 includes a Random Access Memory (RAM) 111, a Read Only Memory (ROM) 112, a processor 113, a communication interface, and a communication bus. The controller 110 is used to control the operation of the control device 100, the communication and cooperation among its internal components, and its external and internal data processing functions.
Illustratively, when an interaction of a user pressing a key disposed on the remote controller 100A or an interaction of touching a touch panel disposed on the remote controller 100A is detected, the controller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.
And a memory 120 for storing various operation programs, data and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 120 may store various control signal commands input by a user.
The communicator 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. For example, the control apparatus 100 transmits a control signal (e.g., a touch signal or a button signal) to the display device 200 via the communicator 130, and may also receive signals transmitted by the display device 200 via the communicator 130. The communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132. When the infrared signal interface is used, a user input instruction is converted into an infrared control signal according to an infrared control protocol and sent to the display device 200 through the infrared sending module. When the radio frequency signal interface is used, a user input instruction is converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmitting terminal.
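The two signal paths described above can be sketched as follows. The encoding and modulation helpers are hypothetical placeholders; the actual infrared and radio frequency protocols are not specified in the text.

```python
def encode_infrared(instruction):
    # Hypothetical stand-in for encoding per an infrared control protocol.
    return f"IR:{instruction}"


def modulate_rf(digital_signal):
    # Hypothetical stand-in for the radio frequency modulation step.
    return f"RF:{digital_signal.hex()}"


def send_user_instruction(instruction, interface):
    """Dispatch a user input instruction over the communicator 130:
    the infrared path encodes an infrared control signal directly, while
    the RF path first converts the instruction into a digital signal and
    then modulates it before transmission."""
    if interface == "infrared":
        return encode_infrared(instruction)
    if interface == "rf":
        digital = instruction.encode("utf-8")  # convert to a digital signal
        return modulate_rf(digital)
    raise ValueError(f"unknown interface: {interface}")


print(send_user_instruction("volume_up", "infrared"))  # IR:volume_up
```

The key structural difference carried over from the text is that only the RF path has the extra digitize-then-modulate step.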
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like, so that a user can input a user instruction regarding controlling the display apparatus 200 to the control apparatus 100 through voice, touch, gesture, press, and the like.
The output interface 150 outputs a user instruction received by the user input interface 140 to the display apparatus 200, or outputs an image or voice signal received by the display apparatus 200. Here, the output interface 150 may include an LED interface 151, a vibration interface 152 generating vibration, a sound output interface 153 outputting sound, a display 154 outputting an image, and the like. For example, the remote controller 100A may receive an output signal such as audio, video, or data from the output interface 150, and display the output signal in the form of an image on the display 154, in the form of audio on the sound output interface 153, or in the form of vibration on the vibration interface 152.
The power supply 160 provides operation power support for each element of the control device 100 under the control of the controller 110, and may take the form of a battery and associated control circuitry.
A hardware configuration block diagram of the display device 200 is exemplarily shown in fig. 3. As shown in fig. 3, the display apparatus 200 may include a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 290, a user interface 265, a video processor 270, a display 275, an audio processor 280, an audio output interface 285, and a power supply 260.
The tuner demodulator 210 receives the broadcast television signal in a wired or wireless manner, may perform modulation and demodulation processing such as amplification, mixing, and resonance, and is configured to demodulate, from a plurality of wireless or wired broadcast television signals, an audio/video signal carried in a frequency of a television channel selected by a user, and additional information (e.g., EPG data).
Under the control of the controller 250, the tuner demodulator 210 responds to the television channel frequency selected by the user and the television signal carried by that frequency.
The tuner demodulator 210 can receive a television signal in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; and according to different modulation types, a digital modulation mode or an analog modulation mode can be adopted; and can demodulate the analog signal and the digital signal according to the different kinds of the received television signals.
In other exemplary embodiments, the tuning demodulator 210 may also be in an external device, such as an external set-top box. In this way, the set-top box outputs a television signal after modulation and demodulation, and inputs the television signal into the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the display apparatus 200 may transmit content data to an external apparatus connected via the communicator 220, or browse and download content data from such an apparatus. The communicator 220 may include a network communication protocol module or a near field communication protocol module, such as a WIFI module 221, a Bluetooth communication protocol module 222, and a wired Ethernet communication protocol module 223, so that under the control of the controller 250 the communicator 220 may receive control signals from the control device 100 in the form of a WIFI signal, a Bluetooth signal, a radio frequency signal, or the like.
The detector 230 is a component of the display apparatus 200 for collecting signals of an external environment or interaction with the outside. The detector 230 may include a sound collector 231, such as a microphone, which may be used to receive a user's sound, such as a voice signal of a control instruction of the user to control the display device 200; alternatively, ambient sounds may be collected that identify the type of ambient scene, enabling the display device 200 to adapt to ambient noise.
In some other exemplary embodiments, the detector 230 may further include an image collector 232, such as a camera or a video camera, which may be configured to collect external environment scenes so as to adaptively change the display parameters of the display device 200, and to acquire user attributes or user gestures so as to realize interaction between the display device and the user.
In some other exemplary embodiments, the detector 230 may further include a light receiver for collecting the intensity of the ambient light to adapt to the display parameter variation of the display device 200.
In some other exemplary embodiments, the detector 230 may further include a temperature sensor, such as by sensing an ambient temperature, and the display device 200 may adaptively adjust a display color temperature of the image. For example, when the temperature is higher, the display apparatus 200 may be adjusted to display a color temperature of an image that is cooler; when the temperature is lower, the display device 200 may be adjusted to display a warmer color temperature of the image.
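That adaptation rule can be sketched minimally as follows. The threshold and Kelvin values below are invented for illustration; the text gives no concrete numbers.

```python
def image_color_temperature(ambient_celsius, threshold=25, cool_k=7500, warm_k=5000):
    """Higher ambient temperature -> cooler (higher-Kelvin) image color
    temperature; lower ambient temperature -> warmer image, as described
    for the temperature-sensor embodiment."""
    return cool_k if ambient_celsius >= threshold else warm_k


print(image_color_temperature(30))  # 7500 (warm room: cooler image)
print(image_color_temperature(18))  # 5000 (cool room: warmer image)
```

A real implementation would more likely interpolate smoothly rather than switch at a single threshold; the two-value form is only the simplest model of the stated behavior.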
The external device interface 240 is a component for providing the controller 250 to control data transmission between the display apparatus 200 and an external apparatus. The external device interface 240 may be connected to an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner, and may receive data such as a video signal (e.g., moving image), an audio signal (e.g., music), additional information (e.g., EPG), etc. of the external apparatus.
The external device interface 240 may include: a High Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not shown), a red, green, blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to the operation of the user by running various software control programs (such as an operating system and various application programs) stored on the memory 290.
As shown in fig. 3, the controller 250 includes a Random Access Memory (RAM) 251, a Read Only Memory (ROM) 252, a graphics processor 253, a CPU processor 254, a communication interface 255, and a communication bus 256. The RAM 251, the ROM 252, the graphics processor 253, and the CPU processor 254 are connected to each other via the communication interface 255 and the communication bus 256.
The ROM 252 stores various system boot instructions. When the display apparatus 200 is powered on upon receiving a power-on signal, the CPU processor 254 executes the system boot instructions in the ROM 252, copies the operating system stored in the memory 290 to the RAM 251, and starts running the operating system. After the operating system has started, the CPU processor 254 copies the various application programs in the memory 290 to the RAM 251 and then starts running them.
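The boot sequence described above can be sketched as follows. The `device` dictionary is a hypothetical stand-in for the ROM 252, the memory 290, and the RAM 251.

```python
def boot(device):
    """Model of the start-up flow: execute the boot instructions from ROM,
    copy the operating system from memory to RAM and start it, then copy
    and start the application programs."""
    log = [f"executed: {device['rom']['boot_instruction']}"]
    device["ram"]["os"] = device["memory"]["os"]              # copy OS to RAM 251
    log.append("operating system started")
    device["ram"]["apps"] = list(device["memory"]["apps"])    # copy applications
    log.append("applications started")
    return log


device = {
    "rom": {"boot_instruction": "system boot"},
    "memory": {"os": "os-image", "apps": ["video", "news"]},
    "ram": {},
}
print(boot(device))
```

The ordering matters in the text: applications are copied and started only after the operating system itself has finished starting.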
The graphics processor 253 generates various graphic objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It may include an operator that performs operations on the various interactive instructions input by the user and displays the resulting objects according to their display attributes, and a renderer that generates the various objects based on the operator and displays the rendered result on the display 275.
The CPU processor 254 executes operating system and application program instructions stored in the memory 290, and processes various application programs, data, and content according to the received user input instructions, so as to finally display and play various audio and video content.
In some example embodiments, the CPU processor 254 may comprise a plurality of processors, including one main processor and one or more sub-processors. The main processor performs some initialization operations of the display apparatus 200 in the preload mode and/or operations of displaying a screen in the normal mode; the sub-processor(s) perform operations in states such as the standby mode of the display apparatus.
The communication interface 255 may include a first interface to an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
Here, the object may be any selectable object, such as a hyperlink or an icon. The operation related to the selected object may be, for example, displaying the page, document, or image linked to by a hyperlink, or executing the program corresponding to the object. The user input command for selecting the GUI object may be a command entered through various input means connected to the display apparatus 200 (e.g., a mouse, a keyboard, a touch panel, etc.) or a voice command corresponding to speech uttered by the user.
The memory 290 stores various types of data, software programs, or applications for driving and controlling the operation of the display device 200. The memory 290 may include volatile and/or nonvolatile memory. The term "memory" herein includes the memory 290, the RAM 251 and the ROM 252 of the controller 250, and any memory card in the display device 200.
In some embodiments, the memory 290 is specifically used for storing an operating program for driving the controller 250 of the display device 200; storing various application programs built in the display apparatus 200 and downloaded by a user from an external apparatus; data such as visual effect images for configuring various GUIs provided by the display 275, various objects related to the GUIs, and selectors for selecting GUI objects are stored.
In some embodiments, memory 290 is specifically configured to store drivers for tuner demodulator 210, communicator 220, detector 230, external device interface 240, video processor 270, display 275, audio processor 280, etc., and related data, such as external data (e.g., audio-visual data) received from the external device interface or user data (e.g., key information, voice information, touch information, etc.) received by the user interface.
In some embodiments, memory 290 particularly stores software and/or programs representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, as well as functions implemented by other programs (e.g., the middleware, APIs, or applications); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to enable control or management of system resources.
A block diagram of the architectural configuration of the operating system in the memory of the display device 200 is illustrated in fig. 4 a. The operating system architecture comprises an application layer, a middleware layer and a kernel layer from top to bottom.
Application layer: both the applications built into the system and non-system-level applications belong to the application layer, which is responsible for direct interaction with the user. The application layer may include a plurality of applications, such as a settings application, a post application, a media center application, and the like. These applications may be implemented as Web applications that execute on a WebKit engine, and in particular may be developed and run based on HTML5, Cascading Style Sheets (CSS), and JavaScript.
Here, HTML (HyperText Markup Language) is the standard markup language for creating web pages. Web pages are described by markup tags, which describe text, graphics, animation, sound, tables, links, and so on; a browser reads an HTML document, interprets the content of the tags in the document, and displays that content in the form of a web page.
CSS (Cascading Style Sheets) is a computer language used to express the presentation of HTML documents, and may be used to define style rules such as fonts, colors, and positions. CSS styles can be stored directly in an HTML page or in a separate style file, allowing the styling of the web page to be controlled.
JavaScript is a language for Web page programming that can be embedded in an HTML page and is interpreted and executed by the browser. The interaction logic of a Web application is implemented in JavaScript. Through the browser, JavaScript can wrap a JavaScript extension interface to communicate with the kernel layer.
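As an illustrative sketch only (the bridge name `nativeBridge`, its `call` method, and the message shape are assumptions for illustration, not an interface defined in this application), a Web application's JavaScript layer might wrap a browser-provided extension interface to communicate with the kernel layer like this:

```javascript
// Hypothetical sketch: wrapping a browser-injected extension interface so
// that application code can talk to the kernel layer through one proxy.
// The bridge object and message format are illustrative assumptions.
function createKernelProxy(bridge) {
  return {
    // Send a named request down to the native side and return its reply.
    invoke(service, payload) {
      return bridge.call(JSON.stringify({ service, payload }));
    },
  };
}

// On a real device the browser would inject the bridge object; here a stub
// stands in for it so the sketch is self-contained and runnable.
const stubBridge = {
  call(message) {
    const req = JSON.parse(message);
    return { ok: true, service: req.service };
  },
};

const kernel = createKernelProxy(stubBridge);
const reply = kernel.invoke("power", { action: "standby" });
```

The proxy keeps application code independent of the concrete bridge, which matters when the same Web application must run on different middleware environments.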
the middleware layer may provide some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as multimedia and hypermedia information coding experts group (MHEG) middleware related to data broadcasting, DLNA middleware which is middleware related to communication with an external device, middleware which provides a browser environment in which each application program in the display device operates, and the like.
The kernel layer provides core system services, such as: file management, memory management, process management, network management, system security authority management and the like. The kernel layer may be implemented as a kernel based on various operating systems, for example, a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware, supplying device driver services for various hardware: for example, a display driver for the display, a camera driver for the camera, a key driver for the remote controller, a WiFi driver for the WiFi module, an audio driver for the audio output interface, and a power management driver for the Power Management (PM) module, etc.
A user interface 265 receives various user interactions. Specifically, it is used to transmit an input signal of a user to the controller 250 or transmit an output signal from the controller 250 to the user. For example, the remote controller 100A may transmit an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by the user to the user interface 265, and then the input signal is transferred to the controller 250 through the user interface 265; alternatively, the remote controller 100A may receive an output signal such as audio, video, or data output from the user interface 265 via the controller 250, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user interface 265 receives the user input commands through the GUI. Specifically, the user interface 265 may receive user input commands for controlling the position of a selector in the GUI to select different objects or items.
Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user interface 265 receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 270 is configured to receive an external video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on the display 275.
Illustratively, the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module demultiplexes an input audio/video data stream; for example, an input MPEG-2 stream (a compression standard for moving pictures and audio on digital storage media) is demultiplexed into a video signal and an audio signal.
The video decoding module processes the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module superimposes and mixes the GUI signal generated by the graphics generator in response to user input with the scaled video image, to produce an image signal for display.
The frame rate conversion module converts the frame rate of the input video, for example converting an input 60Hz video to a frame rate of 120Hz or 240Hz, commonly by means of a frame interpolation method.
The display formatting module converts the signal output by the frame rate conversion module into a signal conforming to the display format of the display, for example outputting an RGB data signal.
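To make the frame interpolation idea concrete, here is a minimal, purely illustrative sketch (not the device's actual implementation): a 60Hz sequence is doubled to 120Hz by inserting the average of each pair of adjacent frames, with each frame modeled as a single number (e.g., one pixel's luminance) for simplicity.

```javascript
// Minimal sketch of frame-rate doubling by interpolation. Real frame-rate
// converters use motion-compensated interpolation over full frames; here a
// frame is just a number, purely to show the insertion pattern.
function doubleFrameRate(frames) {
  const out = [];
  for (let i = 0; i < frames.length; i++) {
    out.push(frames[i]);                         // original frame
    if (i + 1 < frames.length) {
      out.push((frames[i] + frames[i + 1]) / 2); // interpolated frame
    }
  }
  return out;
}

const upconverted = doubleFrameRate([0, 10, 20]); // → [0, 5, 10, 15, 20]
```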
The display 275 receives the image signal from the video processor 270 and displays video content, images, and the menu manipulation interface. The displayed video content may come from the broadcast signal received by the tuner-demodulator 210, or from content input via the communicator 220 or the external device interface 240. The display 275 also presents the user manipulation interface (UI) generated in the display apparatus 200 for controlling the display apparatus 200.
The display 275 may include a display screen assembly for presenting the picture and a driving assembly for driving image display. Alternatively, if the display 275 is a projection display, it may include a projection device and a projection screen.
The audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification processing to obtain an audio signal that can be played by the speaker 286.
Illustratively, audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, Advanced Audio Coding (AAC), high efficiency AAC (HE-AAC), and the like.
The audio output interface 285 receives the audio signal output by the audio processor 280 under the control of the controller 250. The audio output interface 285 may include a speaker 286, or an external sound output terminal 287 such as an earphone output terminal for output to a sound-reproducing device of an external apparatus.
In other exemplary embodiments, video processor 270 may comprise one or more chips. Audio processor 280 may also comprise one or more chips.
And, in other exemplary embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated with the controller 250 in one or more chips.
The power supply 260, under the control of the controller 250, supplies power to the display apparatus 200 from an external power input. The power supply 260 may be a built-in power supply circuit installed inside the display apparatus 200, or a power supply installed outside the display apparatus 200.
A schematic diagram of the functional configuration of the display device 200 is exemplarily shown in fig. 4 b. As shown in fig. 4b, the memory 290 is used to store an operating system, an application program, contents, user data, and the like, and performs system operations for driving the display device 200 and various operations in response to a user under the control of the controller 210. The memory 290 may include volatile and/or nonvolatile memory.
The memory 290 is specifically used for storing an operating program for driving the controller 210 in the display device 200, and storing various applications installed in the display device 200, various applications downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used to store system software such as an Operating System (OS) kernel, middleware, and applications, and to store input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers and related data such as the video processor 260-1 and the audio processor 260-2, the display 280, the communication interface 230, the tuner demodulator 220, the detector 240, the input/output interface, etc.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
Illustratively, the memory 290 includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external command recognition module 2907 (e.g., a graphic recognition module 2907-1, a voice recognition module 2907-2, a key command recognition module 2907-3), a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, and other applications 2912, a browser module, and so forth. The controller 210 performs functions such as: a broadcast television signal reception demodulation function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal reception function, an electric power control function, a software control platform supporting various functions, a browser function, and the like.
A block diagram of the configuration of the software system in the display device 200 is exemplarily shown in fig. 4c. As shown in fig. 4c, the operating system 2911 includes executing operating software that handles various basic system services and performs hardware-related tasks, acting as an intermediary for data processing between application programs and hardware components.
In some embodiments, portions of the operating system kernel may contain a series of software to manage the display device hardware resources and provide services to other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display screen, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application program to achieve accessibility and operability of the application program for displaying content.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
The user interface module 2911-3 is configured to provide an object for displaying a user interface, so that each application program can access the object, and user operability can be achieved.
Control applications 2911-4 for controlling process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911 or within the application 2912. In some embodiments it is implemented partly within the operating system 2911 and partly within the application 2912, where it listens for various user input events and, upon recognition of the various types of events or sub-events, performs one or more sets of predefined operations in response.
The event monitoring module 2914-1 is configured to monitor an event or a sub-event input by the user input interface.
The event recognition module 2914-2 holds the event definitions for the various user input interfaces, recognizes the various events or sub-events, and dispatches them to the processes that execute their corresponding set(s) of handlers.
An event or sub-event refers to an input detected by one or more sensors in the display device 200, or an input from an external control device (e.g., the control apparatus 100). Examples include: various sub-events input by voice, gesture sub-events input through gesture recognition, and remote-control key command input from a control device. Illustratively, the one or more sub-events from the remote control take a variety of forms, including but not limited to pressing the up/down/left/right keys or the OK key, single or combined key presses, and the like, as well as non-physical-key operations such as move, hold, and release.
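As a hedged illustration of how such sub-events might be recognized (the function and field names below, and the 500 ms threshold, are assumptions for illustration rather than values from this application), a short press can be distinguished from a hold by the time between key-down and key-up:

```javascript
// Illustrative sketch: classifying a remote-control key report into a
// "press" or "hold" sub-event based on how long the key was held down.
// The threshold is an arbitrary illustrative value.
const HOLD_THRESHOLD_MS = 500;

function recognizeKeySubEvent(key, downTimeMs, upTimeMs) {
  const duration = upTimeMs - downTimeMs;
  return {
    key, // e.g. "up", "down", "left", "right", "ok"
    type: duration >= HOLD_THRESHOLD_MS ? "hold" : "press",
  };
}

const tap = recognizeKeySubEvent("ok", 0, 120);   // short tap of OK
const hold = recognizeKeySubEvent("left", 0, 800); // long hold of left
```

The event recognition module would then dispatch each classified sub-event to the handler registered for it.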
The interface management module 2913 receives, directly or indirectly, each user input event or sub-event monitored by the event transmission system 2914, and updates the display effect of the items in the homepage interface, including but not limited to a circular focus-scrolling display of the items in the interface.
A schematic diagram of a homepage interface in the display device 200 according to an exemplary embodiment is illustrated in fig. 5. As shown in fig. 5, the user interface includes a plurality of view display areas, illustratively a first view display area 201 and a second view display area 202, each having one or more different items laid out within it. The user interface also includes a selector indicating which item is selected; the selector can be moved by user input to select different items.
It should be noted that the boundaries of the view display areas may be visible or invisible. For example, different view display areas may be distinguished by different background colors, or marked by visible indicators such as boundary lines; alternatively, there may be no visible boundary at all, with only the related items in a certain screen region sharing the same changing properties of size and/or arrangement, so that the region is perceived as a single view partition. For example, the items in the first view display area 201 may be zoomed in or out simultaneously while the second view display area 202 changes differently.
In some embodiments, one or more of the view display areas may be a scalable view display. "scalable" may mean that the view display area is scalable in size or proportion on the screen, or that the items in the view display are scalable in size or proportion on the screen.
"item" refers to a visual object displayed in each view display area of the user interface in the display device 200 to represent corresponding content such as icons, thumbnails, video clips, and the like. For example: the items may represent movies, image content or video clips of a television show, audio content of music, applications, or other user access content history information.
In some embodiments, an "item" may display an image thumbnail. Such as: when the item is a movie or a tv show, the item may be displayed as a poster of the movie or tv show. If the item is music, a poster of a music album may be displayed. Such as an icon for the application when the item is an application, or a screenshot of the content that captures the application when it was most recently executed. If the item is the user access history, the content screenshot in the latest execution process can be displayed. The "item" may be displayed as a video clip. Such as: the item is a video clip dynamic of a trailer of a television or a television show.
Further, the item may represent an interface or a collection of interfaces on which the display device 200 is connected to an external device, or may represent a name of an external device connected to the display device, or the like. Such as: a signal source input interface set, or an HDMI interface, a USB interface, a PC terminal interface, etc.
For example, as in fig. 6a, in the first view display area 201, text and/or icons for some common applications are displayed, wherein each item may comprise text content and/or an image for displaying a thumbnail related to the text content, or a video clip related to the text, etc.
A "selector" is used to indicate where any item has been selected, such as a cursor or a focus object. The cursor movement on the display device 200 is controlled to select or control an item according to the user input through the control apparatus 100. The control item may be selected by causing the movement of the focus object displayed in the display apparatus 200 according to an input of a user through the control apparatus 100, and one or more items may be selected or controlled. Such as: the user may select and control items by controlling the movement of the focus object between items through the direction keys on the control device 100.
The focus object refers to an object that moves between items in response to user input. Illustratively, the focus position is indicated by drawing a thick line along the item's edge, as in fig. 7a. In other embodiments, the focus form is not limited to this example: it may be any tangible or intangible form recognizable to the user, such as a cursor, a 3D deformation of the item, or a change in the border lines, size, color, transparency, outline and/or font of the text or image of the focused item.
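As a hedged sketch of the focus-movement logic (the function and field names are illustrative assumptions, not terms from this application), direction-key navigation can move the focus to the next item, skipping items that this embodiment later marks non-focusable, such as the ShortCuts item and the MyApps placeholder on the edit page:

```javascript
// Illustrative sketch: move the focus left or right between items, landing
// only on focusable items and staying put when the edge is reached.
function moveFocus(items, currentIndex, direction) {
  const step = direction === "right" ? 1 : -1;
  let i = currentIndex + step;
  while (i >= 0 && i < items.length) {
    if (items[i].focusable) return i; // land on the next focusable item
    i += step;
  }
  return currentIndex; // no focusable item in that direction: focus stays
}

const row = [
  { name: "ShortCuts", focusable: false },
  { name: "AppA", focusable: true },
  { name: "MyApps placeholder", focusable: false },
  { name: "AppB", focusable: true },
];
```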
In some embodiments, each item in the view display area is associated with different content or link. It should be noted that the view display areas in this embodiment are arranged horizontally in the screen, and in the actual application process, may also be arranged vertically, or arranged at any other angle.
In other embodiments, the user interface may include one or more view display areas, and in particular, the number of view display areas on the display screen may be arranged according to the amount of different classified content to be displayed.
Given that more and more applications are installed in current display devices, and in order to let the user operate each application more conveniently and quickly, this embodiment further provides a function whereby the user can edit the order of applications in the second view display area 202 according to personal preference, with the display content in the first view display area 201 changing accordingly, thereby giving the user greater freedom of customization.
Fig. 6a to 6h are schematic diagrams illustrating application display screens in the home page interface of the display device 200. As shown in fig. 6a to 6h, in the present embodiment, icons of applications installed in the terminal are displayed in the second view display area 202 (also referred to as an application display area in the present embodiment), and meanwhile, recommendation data corresponding to an application in which the current focus is located is displayed in the first view display area 201 (also referred to as a stage area in the present embodiment). In addition, in order to facilitate the user to edit the application, a prompt for pressing a prescribed key to enter an editing page is provided on the third view display area 203, and for example, "MENU available to enter an application editing state" is displayed on the third view display area 203 when the focus is on the application in the second view display area 202, and "MENU available to delete a favorite snapshot" is displayed on the third view display area 203 when the focus is on the recommended data in the first view display area 201.
Further, this embodiment provides different page display effects depending on whether the number of applications installed in the television exceeds the number of items displayable in the second view display area, and on whether the user has added the "favorite snapshot" shortcut option.
In this embodiment, at most 14 items can be displayed in the second view display area 202. As shown in fig. 6a, when the total number of applications installed in the display device 200 is less than or equal to the number of items displayable in the second view display area 202 and the user has not added the "favorite snapshot" shortcut option, all applications installed in the display device 200 can be completely displayed in the second view display area 202.
If the user wants to change the order of the applications presented on the home page and the content recommended in the first view display area 201, he can quickly jump to an edit page (the page shown in fig. 6b) by following the prompt in the third view display area 203, for example by pressing the MENU button on the control device 100.
As shown in fig. 6c, when the total number of applications installed in the display device 200 is greater than the number of items displayable in the second view display area 202 and the user has not added the "favorite snapshot" shortcut option, not all installed applications can be displayed in the second view display area 202. An aggregate application entry is therefore added, for example the MyApps icon circled by a dotted line in the figure: a MyApps data item is appended after the 13th element of the item array corresponding to the applications, so that it is displayed in the 14th item position.
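The item-array construction just described can be sketched as follows (a minimal illustration; the function name and string labels are assumptions, while the 14-item limit and the trailing MyApps entry come from the embodiment above):

```javascript
// Sketch: build the displayable item array. If the installed applications
// fit within the 14 displayable items, show them all; otherwise keep the
// first 13 and append a MyApps aggregate entry as the 14th item.
const MAX_ITEMS = 14;

function buildItemArray(installedApps) {
  if (installedApps.length <= MAX_ITEMS) {
    return installedApps.slice(); // everything fits, display as-is
  }
  return installedApps.slice(0, MAX_ITEMS - 1).concat(["MyApps"]);
}

const manyApps = Array.from({ length: 20 }, (_, i) => "app" + i);
const items = buildItemArray(manyApps); // 13 apps + "MyApps"
```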
If the user wants to change the order or content of the applications presented on the home page and edit the application icons in the second display area, he can quickly jump to the edit page (the page shown in fig. 6d) by following the prompt in the third view display area 203, for example by pressing the MENU button on the control device 100. To help the user distinguish the applications displayed on the homepage interface from those not displayed there, this embodiment displays the two groups separately: the applications displayed on the homepage interface appear in the upper half of the MyApps interface, the MyApps icon itself becomes a virtual placeholder on this interface and is not displayed, and the remaining applications appear in the lower half of the MyApps interface. On this interface, the user can edit and adjust the positions of the applications.
As shown in fig. 6e, when the total number of applications installed in the display device 200 is smaller than the number of items displayable in the second view display area 202 and the user has added the "favorite snapshot" shortcut option, a ShortCuts data item is inserted at the 1st position of the item array corresponding to the applications, so that it is displayed in the 1st item position (the icon position circled by a dotted line in the figure). When the focus is on that item, the data content previously added by the user, such as favorite channels or web addresses, is displayed in the first view display area 201. The remaining 13 item positions are used to display the icons of the applications installed in the display device 200.
Similarly, if the user wants to change the order of the applications presented on the home page and the content recommended in the first view display area 201, he can quickly jump to the edit page (the page shown in fig. 6f) by following the prompt in the third view display area 203, for example by pressing the MENU button on the control device 100. To prevent movements of applications from affecting the display position of the shortcut option, the shortcut option is set to be non-focusable in this user interface, i.e., the user cannot select that item through the control device 100, and its position cannot be moved.
As shown in fig. 6g, when the total number of applications installed in the display apparatus 200 is greater than or equal to the number of items displayable in the second view display area 202 and the user has added the "favorite snapshot" shortcut option, a ShortCuts data item is inserted at the 1st position of the item array corresponding to the applications, so that it is displayed in the 1st item position (the icon position circled by a dotted line in the figure); when the focus is on that item, the data content previously added by the user is displayed in the first view display area 201. At the same time, an aggregate application entry is added, for example the MyApps icon circled by a dotted line in the figure: a MyApps data item is appended after the 13th element of the item array, so that it is displayed in the 14th item position. The remaining 12 item positions are used to display the icons of the applications installed in the display apparatus 200.
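The combined case can be sketched like so (illustrative names; the layout itself, ShortCuts first, up to 12 application icons, MyApps last, follows the embodiment above):

```javascript
// Sketch: item array when the "favorite snapshot" shortcut is present.
// Position 1 holds ShortCuts; up to 12 application icons follow; if more
// applications are installed than fit, a MyApps entry fills position 14.
const DISPLAYABLE = 14;

function buildItemsWithShortcuts(installedApps) {
  const appSlots = installedApps.slice(0, DISPLAYABLE - 2); // up to 12 apps
  const result = ["ShortCuts", ...appSlots];
  if (installedApps.length > DISPLAYABLE - 2) {
    result.push("MyApps"); // aggregate entry for the overflow
  }
  return result;
}

const crowded = buildItemsWithShortcuts(
  Array.from({ length: 20 }, (_, i) => "app" + i)
);
```

With fewer installed applications (the fig. 6e case), the same construction yields ShortCuts followed by every application and no MyApps entry.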
Similarly, if the user wants to change the order of the applications presented on the home page and the content recommended in the first view display area 201, he can quickly jump to the edit page (the page shown in fig. 6h) by following the prompt in the third view display area 203, for example by pressing the MENU button on the control device 100. To prevent movements of applications from affecting the display position of the shortcut option, the shortcut option is set to be non-focusable and immovable in this user interface; likewise, the MyApps icon becomes a virtual placeholder on this interface and cannot be focused.
Further, the embodiment also provides a method for adjusting the display sequence of the applications in the homepage interface. Fig. 7a to 7d are operation diagrams illustrating a sequence of items in the home interface of the mobile display device 200 by the control apparatus 100.
In this embodiment, consider the case where the total number of applications installed in the display device 200 is greater than the number of items displayable in the second view display area 202 and the user has not added the "favorite snapshot" shortcut option. When the user inputs an application-editing command on the control apparatus 100, the interface shown in fig. 7a is entered; at the same time, a prompt is displayed indicating that the user can press a button on the control apparatus 100 to move or delete an application.
As shown in fig. 7b, after the user clicks a key on the control device 100 according to the prompt, for example the MENU key, two option boxes pop up at the application position where the current focus is located, namely move item position (Sort) and delete item (Uninstall); under this interface, only these two option boxes can be focused. If the user selects "Uninstall", the application is deleted; if the user selects "Sort", the operation interface for moving the item is entered, as shown in fig. 7c.
In the state shown in fig. 7c, the user may move the position of the selected application by key operations or voice operations on the control apparatus 100, so as to insert the selected application at a target position in the application icon queue in the user interface. As shown in fig. 7c, the selected application "NETFLIX" is moved to a position not displayed in the second view display area 202. After the user finishes editing and exits the editing interface, the second view display area 202 of the home interface updates the displayed items: specifically, the first 13 applications in the total item array corresponding to the applications installed in the display device 200 are displayed, and the recommended data displayed in the first view display area 201 is synchronized accordingly, resulting in the user interface display effect shown in fig. 7d, in which "NETFLIX" has been removed and is no longer displayed in the user interface.
A flow diagram of a user interface presentation method is illustrated in fig. 8. As shown in fig. 8, the method mainly includes the following steps:
S801: displaying at least one application icon in the second display area; when the focus is located at a first application icon, the recommendation data corresponding to the first application is displayed in the first display area.
In order to realize this display interface, the display device first binds each piece of recommended data to the corresponding application according to the identifier of each piece of recommended data and the identifier of each application installed in the display device.
When the applications installed in the display device change or the recommended data changes, all the applications and recommended data installed in the display device are acquired. The recommended data corresponding to each application is then identified according to an application identifier such as the application name, an appReList field is added to the data structure of the application, and the recommended data is bound to the appReList field of that application's data. Thereafter, as long as the content of the recommended data issued by the data center (such as the cloud) does not change, the recommended data can be located simply by identifying the selected application.
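A minimal sketch of this binding step follows; the object shapes (`name`, `appName`) and the function name are assumptions for illustration, and only the `appReList` field name comes from the embodiment:

```javascript
// Hypothetical sketch: attach each piece of recommended data to the matching
// installed application via an appReList field, keyed on the application name.
function bindRecommendData(installedApps, recommendList) {
  return installedApps.map(function (app) {
    // collect every recommendation whose identifier matches this application
    app.appReList = recommendList.filter(function (rec) {
      return rec.appName === app.name;
    });
    return app;
  });
}
```

Once bound, selecting an application is enough to reach its recommendations without querying the data center again.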
Then, the display device intercepts the first N items in the total item array corresponding to the applications installed in the display device, according to the number N of displayable items in the second view display area, forming a sub-item array.
For example, if the number of displayable items in the second view display area is 14 and the total item array corresponding to the applications installed in the display device is { item0, item1, item2 …, item13, … }, then the first 14 items are intercepted to form the sub-item array { item0, item1, item2 …, item13 }.
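The interception described above amounts to a simple array slice; a sketch under the assumption of a 20-item total array with the item names of the example:

```javascript
// Build an example total item array item0..item19, then intercept the
// first N items to form the sub-item array.
var allAppData = [];
for (var i = 0; i < 20; i++) {
  allAppData.push('item' + i);
}
var N = 14; // number of displayable items in the second view display area
var mainTileData = allAppData.slice(0, N); // item0 .. item13
```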
Finally, icons corresponding to the items in the sub-item array are displayed in the second display area, in the order of the items in the sub-item array, while the recommended data bound to the item where the current focus is located is displayed in the first display area.
S802: in response to the focus being switched from the first application icon to a second application icon, switching the recommendation data displayed in the first display area to the recommendation data bound to the second application.
In this way, when the user switches the focus, the recommended data content displayed in the first display area is automatically switched according to the recommended data bound to each application. This makes it convenient for the user to check the recommended data corresponding to each application and to quickly operate the applications installed in the television, improving the user experience.
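Steps S801 and S802 can be sketched as a single focus-change handler; the callback and object shapes are assumptions, while `appReList` is the bound field from the embodiment:

```javascript
// Hypothetical sketch: when the focus moves to another application icon, the
// first display area is re-rendered with the newly focused app's bound data.
function onFocusSwitched(fromApp, toApp, renderFirstArea) {
  // the previously shown recommendations are replaced, not accumulated
  renderFirstArea(toApp.appReList || []);
}
```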
Further, the present embodiment also provides a data processing method for the display device for the case where the user has added a favorite snapshot and the number of applications installed in the display device is greater than the number of applications that can be displayed in the second view display area.
A flow diagram of another user interface presentation method is illustrated in fig. 9. As shown in fig. 9, the method mainly includes the following steps:
S901: displaying at least one icon in the second display area, the icons comprising: a shortcut option icon and at least one application icon; a total application entrance icon and at least one application icon; or a shortcut option icon, a total application entrance icon, and at least one application icon.
the data processing process of the display device is as follows:
s9011: and binding each piece of recommended data to the corresponding application according to the identifier of each piece of recommended data and the identifier of the application installed in the display equipment.
S9012: it is determined whether the user has added a shortcut option.
In order to facilitate operation of the display device 200, the user may set certain contents as shortcuts according to personal preferences; for example, the user sets a certain network channel, TV channel, or website in the display device 200 as a shortcut through the control apparatus 100 so that it is presented in the home page interface. The above actions are regarded as the user adding shortcut options. If a shortcut option has been added, step S9013 is executed; otherwise step S9014 is executed.
S9013: adding the shortcut option to the total item array formed by the applications installed in the display device, wherein the position of the shortcut option in the total item array is within the first N positions.
In this embodiment, the total item array formed by the applications installed in the display device is called AllAppData. Since the shortcut option needs to be displayed in the second view display area of the home page interface, this embodiment places the shortcut option (ShortCuts) within the first N positions of the total item array, where N is the number of items that can be displayed in the second view display area; for example, the shortcut option is added at the 1st position in the AllAppData array, i.e., the position of item0.
S9014: judging whether the number of items in the total item array is greater than N.
For example, if the number of items that can be displayed in the second view display area is 14 and the number of applications installed in the display device is also 14, then after the shortcut option is added in step S9013 the number of items in the total item array is greater than 14, and step S9015 is executed; otherwise, step S9016 is executed directly.
S9015: adding a total application entry, used for identifying all the applications installed in the display device, to the total item array, wherein the position of the total application entry in the total item array is within the first N positions.
In order to display the total application entry in the second view display area, the present embodiment places the total application entry within the first N positions of the total item array, for example at the Nth position, so that the total application entry is displayed at the last item position in the second view display area.
S9016: intercepting the first N items in the total item array corresponding to the applications installed in the display device, according to the number N of displayable items in the second view display area, to form a sub-item array.
That is, the first N items in the AllAppData array are intercepted to form the sub-item array mainTileData.
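Steps S9012 to S9016 can be sketched as one function; the object shapes and the function name are assumptions for illustration, while the array names and insertion positions follow the embodiment:

```javascript
// Hypothetical sketch: build AllAppData from the installed applications,
// insert the ShortCuts item at the 1st position if the user added a shortcut
// option (S9013), insert the MyApps total application entry at the Nth
// position when the array exceeds N items (S9015), then intercept the first
// N items as mainTileData (S9016).
function buildMainTileData(installedApps, hasShortcut, n) {
  var allAppData = installedApps.slice();
  if (hasShortcut) {
    allAppData.unshift({ type: 'ShortCuts' });
  }
  if (allAppData.length > n) {
    allAppData.splice(n - 1, 0, { type: 'MyApps' });
  }
  return allAppData.slice(0, n);
}
```

With 14 installed applications, a shortcut option, and N = 14, this yields ShortCuts at the 1st position, MyApps at the 14th, and 12 application icons in between, matching the layout described for fig. 6g.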
S9017: and displaying icons corresponding to all items in the sub-item array in the second display area, and displaying recommended data bound to the item with the current focus in the first display area.
S902: when the focus is located at the first application icon, the recommendation data corresponding to the first application is displayed in the first display area.
S903: and switching the recommendation data displayed in the first display area to recommendation data bound by the second application in response to switching the focus from the first application icon to the second application icon.
Further, after step S903, the present embodiment also provides a display method for editing the applications installed in the display device.
The method specifically comprises the following steps:
S904: receiving a first operation, input by a user, of editing an application icon in the second display area.
A user input is received and the type of the user input event is determined; the controller of the display device 200 is configured to monitor the type of the user input event, for example whether the key input is a MENU key command. If the monitored user input event is a MENU key instruction, the position of the selector in the user interface is detected to determine whether the selector is located in the second view display area. If so, the key input is intended to edit the items in that view display area, and the key input is further responded to by entering the editing mode.
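A sketch of this dispatch logic follows; the event and selector shapes, the key naming, and the function names are assumptions:

```javascript
// Hypothetical sketch: a MENU key press enters the editing mode only when
// the selector currently sits inside the second view display area.
function handleMenuKey(event, selector, enterEditMode) {
  if (event.key !== 'MENU') return false;               // not an edit command
  if (selector.area !== 'secondViewArea') return false; // selector elsewhere
  enterEditMode();                                      // edit items in this area
  return true;
}
```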
S905: in response to the first operation, displaying the at least one application icon in a first area of the user interface, and displaying icons of remaining applications installed by the display device in a second area of the user interface.
After entering the editing mode, the items are presented on the user interface as a MyApps display page. In order to make it easy for the user to distinguish the positions of the items, the items in the sub-item array mainTileData of the total item array are displayed in a first area of the user interface, and the remaining items are displayed in a second area of the user interface.
In order to allow an item to be moved arbitrarily between the first area and the second area of the user interface when an application icon is moved in subsequent operations, this embodiment implements the whole MyApps page with a single grid list. To visually separate the content into two parts, the spacing between the first area and the second area does not follow the normal spacing layout of the original grid list; in addition, this embodiment adds a dividing line and/or a prompt between the first area and the second area. The core code of this implementation is as follows:
row_num = parseInt(index / this.columns); // row index of the current item in the grid list
if (row_num > 1) {
    // items from the third row onward sit below the dividing line: add a 3.3333rem offset
    style.top = (row_num * (this.cellHeight + this.spacingV) + this.spacingT) + 3.3333 + 'rem';
} else {
    style.top = (row_num * (this.cellHeight + this.spacingV) + this.spacingT) + 'rem';
}
Here, style.top is the distance from the item to the top of the outer box (also called the item container). The value 3.3333 means that for each element from the third row onward, the distance to the top of the box is increased by 3.3333rem, i.e., 100px; other values are of course possible in other embodiments. Since the items in the first area are shown in two rows in the present embodiment, the condition row_num > 1 is used in the above code.
In addition, if the sub-item array contains the shortcut option, the shortcut option is set to be non-focusable, i.e., its position cannot be edited or moved; if the sub-item array contains the total application entry, the total application entry is set to be non-focusable, i.e., its position cannot be edited or moved, and its icon is hidden. Fig. 10a to 10c are schematic diagrams illustrating operations of editing applications in the home page interface of the display device 200. As shown in fig. 10a, the focus cannot land on the shortcut option position, i.e., the position where index is 0, nor on the position of the total application entry, i.e., the position where index is 13; the other positions can receive the focus.
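Marking these fixed items when the editing page is built could be sketched as follows; the item shapes and flag names are assumptions:

```javascript
// Hypothetical sketch: the shortcut option loses focusability, and the total
// application entry (MyApps) additionally has its icon hidden; ordinary
// application icons remain focusable and movable.
function markFixedItems(items) {
  items.forEach(function (item) {
    if (item.type === 'ShortCuts') {
      item.focusable = false;
    } else if (item.type === 'MyApps') {
      item.focusable = false;
      item.hidden = true;
    } else {
      item.focusable = true;
    }
  });
  return items;
}
```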
S906: and receiving a second operation of moving the application icon in the user interface input by the user.
First, a user input is received and the type of the user input event is determined; the controller of the display apparatus 200 is configured to monitor the type of the user input event, for example whether the key input is a MENU key command. If the user input event is a MENU key instruction, two option boxes, move item position (Sort) and delete item (Uninstall), are displayed at the target item where the current focus is located, as shown in fig. 10b, and only these two option boxes can be focused under this interface. If the user selects "Sort", the operation interface for moving the item is entered, as shown in fig. 10c; under this operation interface, according to the user input, the position of the application icon where the focus is located is changed by changing the position of the target item where the current focus is located in the item queue corresponding to the total item array.
S907: and responding to the second operation, controlling the application icon where the focus is located to move on the user interface so as to insert the application icon where the focus is located to a target position in an application icon queue in the user interface.
In order to reduce the number of data updates in the item moving process and prevent stuttering while items are moved, the display device moves the application icon in the following manner:
S9071: moving the target item where the current focus is located out of the total item array, obtaining a non-target item array formed by the remaining items.
The target item where the current focus is located, i.e., the edited item, is taken out of the total item array. In this embodiment the total item array is recorded as AllAppData, the edited target item is recorded as moveItem, and the resulting sub-list array is recorded as AllAppData'.
S9072: and controlling the target item to be displayed according to a focus style, and controlling each item in the non-target item array to be displayed according to a non-focus style.
The moveItem is displayed in the focus style, and each item in the AllAppData' array is displayed in the non-focus style. Even if the focus subsequently lands on an item in the AllAppData' array, that item does not display the focus style, which gives the user the visual impression that the focus remains on the moveItem.
S9073: and controlling the movement of the display positions of the items in the target item and the non-target item array in the user interface according to the user input for moving the target item.
A user input is received and the type of the user input event is determined; the controller of the display device 200 is configured to monitor the type of the user input event, for example whether the key input is an UP, DOWN, LEFT, or RIGHT key command. If the user input event is any one of the UP, DOWN, LEFT, and RIGHT key commands, the key input is a user input for moving the target item and is responded to accordingly. For example, after the user presses the RIGHT key, intending to move the moveItem to the right, the moveItem is moved to the right by the distance of one item, and the adjacent item in the AllAppData' array also moves by the distance of one item. Through this operation, the relative position of the moveItem with respect to each item in the AllAppData' array is changed, and hence the position of the moveItem in AllAppData is changed.
Then, when the editing mode returns to the normal mode, if the user has not deleted the item, the moveItem data is inserted into the AllAppData' array.
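The flow above can be sketched as a pair of functions that touch the data only on entering and leaving the editing mode; the array and item names follow the embodiment, while the function names and the simplified focus bookkeeping are assumptions:

```javascript
// Hypothetical sketch: S9071 removes the edited item once when editing
// begins; visual moves only change targetIndex; the data is written back
// once when editing ends, so no data update happens per move.
function beginMove(allAppData, focusIndex) {
  var moveItem = allAppData.splice(focusIndex, 1)[0]; // remaining = AllAppData'
  return { moveItem: moveItem, rest: allAppData, targetIndex: focusIndex };
}
function endMove(state, deleted) {
  if (!deleted) {
    // re-insert moveItem at the position reached during the move operations
    state.rest.splice(state.targetIndex, 0, state.moveItem);
  }
  return state.rest;
}
```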
In the conventional approach, each time an item is moved, the operated item is taken out of the AllAppData array and inserted at the moved-to position (in this embodiment, the position of an item in the mainTileData array is recorded as index), the focus is changed to that position, and the display style of each item in the mainTileData array is updated. By contrast, this embodiment updates the data only when entering the editing mode or when exiting the editing mode back to the normal mode, and performs no data change during the move operation itself, so the page DOM is not updated repeatedly and item movement does not stutter. Moreover, in the prior art the data needs to be updated once for every item move, and since the pictures corresponding to the items in the data from the cloud are cached by neither the television terminal nor the browser, each picture has to be re-acquired every time.
Further, in order to more conveniently calculate the moving distance of the items in the AllAppData' array when moving the moveItem, in step S9071, after the target item where the current focus is located is moved out of the total item array, the controller is further configured to: move the focus to the item whose index value is 1 greater than the index value the target item had before being moved out of the total item array. Thus, in step S9073, the movement of items in AllAppData' can be controlled directly according to the position of the focus; for example, when the moveItem is controlled to move to the right, the focus position and the items behind the focus can be controlled directly.
Fig. 11a to 11d are schematic diagrams illustrating operations of moving application icons in the home interface of the display device 200. As shown in fig. 11a to 11d, when a controlled item is moved and the total application entry icon is included in the second display area, the position of the total application entry is set to be focusable so that the positions of the other items in the display interface can change smoothly after the moved item. To ensure that the display position of the total application entry itself does not change when the moved item is at a special position above, below, to the left of, or to the right of the total application entry, the following processing is adopted in step S9073 when controlling the movement of the display positions of the target item and the items in the non-target item array according to the user input for moving the target item:
1) If the application icon where the focus is located is the application icon directly above and adjacent to the total application entry icon, and the second operation is an operation of moving down one application icon position, the application icon where the focus is located is moved to the position before the total application entry icon. That is, for the target item position shown in fig. 11a, when the focus is at the item position with index 6, the display device adopts the following processing steps:
S01: changing the index value of the item where the focus is located by M-1, where M is the number of items contained in one row of the user interface.
Since the focus is located at the item position with index 6 and the user input moves the target item down by one item position, the normally computed index would be 6 + M = 13 (with M = 7). However, it must be ensured that the MyApps icon at index 12 in the figure does not change position, so this embodiment changes the index value of the item where the focus is located by only M - 1, i.e., to 12. That is, after the target item is moved, the focus is at the position with index 12, the position where the user sees MyApps in the figure.
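In code form, the index adjustment of step S01 under the figure's values (M, the focus index, and the MyApps index are taken from this example; the variable names are assumptions):

```javascript
// Moving down one row normally adds M to the index; next to the MyApps
// entry the change is reduced to M - 1 so that the entry keeps its position.
var M = 7;                                // items per row in the user interface
var focusIndex = 6;                       // item directly above and adjacent to MyApps
var normalIndex = focusIndex + M;         // 13: would displace the MyApps icon
var adjustedIndex = focusIndex + (M - 1); // 12: the change used by this embodiment
```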
S02: calculating the lateral offset and the longitudinal offset of each item in the non-target item array according to the index change value M-1, wherein a gap of one item is reserved before the item where the focus is located.
The longitudinal offset of each item may be calculated in the manner described in step S908. In addition, a gap of one item is left before the item where the focus is located so that the target item can be displayed there.
S03: displaying each item in the sub-item array according to the transverse offset and the longitudinal offset of each item in the non-target item array;
S04: displaying the target item at the gap position before the item where the focus is located, thereby moving the application icon where the focus is located to the position before the total application entrance icon.
2) If the application icon where the focus is located is the application icon directly before and adjacent to the total application entry icon, and the second operation is an operation of moving back one application icon position, the application icon where the focus is located is moved to the position behind the total application entry icon. That is, for the target item position shown in fig. 11b, where the focus is at the item position with index 12, the processing is: the index value of the item where the focus is located is changed by 2, i.e., index + 2 gives position 14, and the target item is moved to position 13 in fig. 11b.
3) If the application icon where the focus is located is the application icon directly behind and adjacent to the total application entry icon, and the second operation is an operation of moving forward one application icon position, the application icon where the focus is located is moved to the position before the total application entry icon. That is, for the target item position shown in fig. 11c, where the focus is at the item position with index 14, the corresponding processing is: after the move, the application icon where the focus is located is moved to the position before the total application entry icon, and the focus lands on the position of the total application entry.
4) If the application icon where the focus is located is the application icon directly below and adjacent to the total application entry icon, and the second operation is an operation of moving up one application icon position, the application icon where the focus is located is moved to the position before the total application entry icon. That is, for the target item position shown in fig. 11d, where the focus is at the item position with index 20, the processing is: the data corresponding to the total application entry is moved to the 12th position, and the actual focus is then moved to index 12.
In addition, when the target item position is moved in step S907, as shown in fig. 11a to 11d, if another application icon exists at a position adjacent to the application icon where the focus is located, a moving direction indicator pointing to that other application icon is set on the application icon where the focus is located; that is, a moving direction indicator is set only when another item exists at the position adjacent to the target item.
S908: and receiving a fifth operation of finishing editing the application icon in the user interface input by the user.
S909: and responding to the fifth operation, and displaying the application icon of the first area in the second display area.
According to the instruction, input by the user, of finishing moving the items, the first N items in the new total item array reflecting the moved target item are intercepted to form a new sub-item array. Then, icons corresponding to the items in the new sub-item array are displayed in the second display area, and the recommended data bound to the item where the current focus is located is displayed in the first display area.
If the new sub-item array contains the shortcut option, the shortcut option is set to be focusable; if the new sub-item array contains the total application entry, the total application entry is set to be focusable and its icon is made visible.
Based on the same inventive concept as the above user interface display method and display device, this embodiment further provides a computer storage medium that can store a program which, when executed, can implement any of the user interface display methods provided by the embodiments.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (10)

1. A display device, characterized in that the display device comprises:
a display configured to display a user interface, the user interface including a plurality of view display regions;
a controller communicatively coupled to the display, the controller configured to present a user interface by:
displaying at least one application icon in the second display area;
when the focus is positioned at the first application icon, the recommendation data corresponding to the first application is displayed in the first display area;
and switching the recommendation data displayed in the first display area to recommendation data bound by the second application in response to switching the focus from the first application icon to the second application icon.
2. The display device of claim 1, wherein the controller is further configured to:
receiving a first operation of editing the application icon in the second display area, which is input by a user;
in response to the first operation, displaying the at least one application icon in a first area of the user interface, and displaying icons of remaining applications installed by the display device in a second area of the user interface.
3. The display device of claim 2, wherein the controller is further configured to:
receiving a second operation of moving the application icon in the user interface input by the user;
and responding to the second operation, controlling the application icon where the focus is located to move on the user interface so as to insert the application icon where the focus is located to a target position in an application icon queue in the user interface.
4. The display device of claim 2, wherein after displaying the at least one application icon in the first region of the user interface, the controller is further configured to:
if a total application entrance icon is displayed in the second display area, hiding the total application entrance icon in the first area, wherein the position of the total application entrance icon cannot be focused; and/or,
and if a shortcut option icon is displayed in the second display area, displaying the shortcut option icon in the first area, wherein the position of the shortcut option icon cannot be focused.
5. The display device according to claim 3, wherein in response to the second operation, controlling the application icon in which the focus is located to move in the user interface comprises:
if the application icon where the focus is located is an application icon which is located in front of a total application entry icon and is adjacent to the total application entry icon, and the second operation is an operation of a backward application icon position, moving the application icon where the focus is located to a position behind the total application entry icon;
alternatively,
if the application icon where the focus is located is an application icon which is located behind a total application entry icon and is adjacent to the total application entry icon, and the second operation is an operation which is performed at the position of a previous application icon, moving the application icon where the focus is located to the position before the total application entry icon;
alternatively,
if the application icon where the focus is located is an application icon which is located above a total application entry icon and is adjacent to the total application entry icon, and the second operation is an operation of shifting down one application icon position, moving the application icon where the focus is located to a position before the total application entry icon;
alternatively,
and if the application icon where the focus is located is an application icon which is located below a total application entry icon and is adjacent to the total application entry icon, and the second operation is an operation of moving up by one application icon position, moving the application icon where the focus is located to a position before the total application entry icon.
6. The display device of claim 3, wherein, while the application icon where the focus is located is moving in the user interface, the controller is further configured to:
if other application icons exist at the adjacent positions of the application icon where the focus is located, setting a moving direction indication mark on the application icon where the focus is located, wherein the moving direction indication mark points to the other application icons.
7. The display device according to claim 3, wherein receiving the second operation, input by the user, of moving the application icon in the user interface comprises:
receiving a third operation of editing the application icon in the user interface input by the user;
in response to the third operation, displaying options of deleting the application and moving the application on the application icon where the focus is located;
receiving an input operation of selecting the option of moving the application.
8. The display device of claim 2, wherein the controller is further configured to:
receiving a fifth operation of finishing editing the application icon in the user interface, wherein the fifth operation is input by a user;
and responding to the fifth operation, and displaying the application icon of the first area in the second display area.
9. A user interface display method applied to a display device, characterized in that the method comprises:
displaying, by the display device, a user interface, the user interface comprising a plurality of view display regions;
displaying at least one application icon in a second display area;
when the focus is positioned on a first application icon, displaying recommendation data corresponding to the first application in a first display area; and
in response to the focus being switched from the first application icon to a second application icon, switching the recommendation data displayed in the first display area to the recommendation data bound to the second application.
10. A computer storage medium, characterized in that the computer storage medium stores a program which, when executed, implements the method of claim 9.
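Claims 2–9 together describe a home screen where the first display area shows recommendation data bound to whichever application icon holds focus, and where a focused icon moved up past the adjacent "total application entry icon" lands before it. A minimal sketch of that behavior, not the patent's actual implementation (class and method names such as `HomeScreen` and `move_focused_icon_up` are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class AppIcon:
    app_id: str
    is_all_apps_entry: bool = False  # models the "total application entry icon"

@dataclass
class HomeScreen:
    icons: list            # second display area: the row of application icons
    recommendations: dict  # app_id -> recommendation data bound to that app
    focus: int = 0         # index of the icon that currently holds focus

    def first_area_content(self):
        # The first display area always shows the data bound to the focused app.
        return self.recommendations.get(self.icons[self.focus].app_id, [])

    def switch_focus(self, new_index):
        # Claim 9: switching focus to another icon swaps the recommendation data.
        self.focus = new_index
        return self.first_area_content()

    def move_focused_icon_up(self):
        # Claim 5 rule: an icon adjacent to and after the all-apps entry icon,
        # when moved up by one position, is placed *before* that entry icon.
        i = self.focus
        if i > 0 and self.icons[i - 1].is_all_apps_entry:
            self.icons[i - 1], self.icons[i] = self.icons[i], self.icons[i - 1]
            self.focus = i - 1
```

In this sketch the recommendation swap is a pure lookup keyed by the focused icon, so no recommendation data is reloaded when focus merely moves back and forth.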
CN201911310348.XA 2019-12-12 2019-12-18 User interface display method, storage medium and display device Pending CN112969084A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/084194 WO2021114529A1 (en) 2019-12-12 2020-04-10 User interface display method and display device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911275517 2019-12-12
CN2019112755170 2019-12-12

Publications (1)

Publication Number Publication Date
CN112969084A true CN112969084A (en) 2021-06-15

Family

ID=71653968

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201911310348.XA Pending CN112969084A (en) 2019-12-12 2019-12-18 User interface display method, storage medium and display device
CN202010230889.8A Pending CN111447479A (en) 2019-12-12 2020-03-27 Graphical user interface method and display device for providing prompt
CN202010276220.2A Pending CN111491196A (en) 2019-12-12 2020-04-09 Display apparatus and user interface display method

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN202010230889.8A Pending CN111447479A (en) 2019-12-12 2020-03-27 Graphical user interface method and display device for providing prompt
CN202010276220.2A Pending CN111491196A (en) 2019-12-12 2020-04-09 Display apparatus and user interface display method

Country Status (2)

Country Link
CN (3) CN112969084A (en)
WO (3) WO2021114529A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113986074A (en) * 2021-10-11 2022-01-28 深圳Tcl新技术有限公司 Icon state switching method, device and equipment and computer readable storage medium
CN114727145A (en) * 2022-03-31 2022-07-08 当趣网络科技(杭州)有限公司 Display interface interaction method and device and large-screen terminal

Families Citing this family (8)

Publication number Priority date Publication date Assignee Title
CN112015411B (en) * 2020-08-14 2021-04-20 深圳市卓智荟教育科技有限公司 Education application interface configuration method and device based on SAAS and readable storage medium
CN113490060B (en) * 2020-09-09 2023-08-01 青岛海信电子产业控股股份有限公司 Display equipment and method for determining common contact person
CN112367550A (en) * 2020-10-30 2021-02-12 Vidaa美国公司 Method for realizing multi-title dynamic display of media asset list and display equipment
CN113014979A (en) * 2021-02-18 2021-06-22 青岛海信传媒网络技术有限公司 Content display method and display equipment
CN113347482B (en) * 2021-06-18 2023-10-27 聚好看科技股份有限公司 Method for playing data and display device
CN113535019A (en) * 2021-07-14 2021-10-22 Vidaa美国公司 Display device and display method of application icons
CN113794914B (en) * 2021-08-26 2023-07-28 Vidaa(荷兰)国际控股有限公司 Display equipment and method for configuring startup navigation
CN114168242B (en) * 2021-11-11 2023-04-14 青岛海信传媒网络技术有限公司 Display device and display method of content of external device

Citations (7)

Publication number Priority date Publication date Assignee Title
WO2012141037A1 (en) * 2011-04-15 2012-10-18 シャープ株式会社 Menu screen display control method and display control device
EP2908545A1 (en) * 2014-02-18 2015-08-19 Kabushiki Kaisha Toshiba Electronic device, method and storage medium
CN108307222A * 2018-01-25 2018-07-20 青岛海信电器股份有限公司 Smart television and method for accessing application content from the homepage of a display device
CN108683939A * 2018-03-16 2018-10-19 青岛海信电器股份有限公司 Method and device for setting application order in a television
CN108701001A * 2017-06-30 2018-10-23 华为技术有限公司 Method and electronic device for displaying a graphical user interface
CN109254706A * 2018-08-16 2019-01-22 青岛海信电器股份有限公司 Application icon position adjustment method and display terminal
CN110337034A (en) * 2019-07-12 2019-10-15 青岛海信传媒网络技术有限公司 Method for displaying user interface and display equipment

Family Cites Families (22)

Publication number Priority date Publication date Assignee Title
CN100373936C * 2005-08-09 2008-03-05 深圳市同洲电子股份有限公司 Device and method for dynamically displaying a set-top box operation guide
CN100531301C (en) * 2007-02-12 2009-08-19 深圳市同洲电子股份有限公司 Set-top box and its remote operation system and method
GB2453789B (en) * 2007-10-19 2012-11-14 British Sky Broadcasting Ltd Television display
CN101453580A (en) * 2007-12-05 2009-06-10 乐金电子(中国)研究开发中心有限公司 Remote controller, digital television and remote control method
EP2439933A4 (en) * 2009-06-19 2012-12-05 Shenzhen Tcl New Technology Menu generation method for television
EP2558783A2 (en) * 2010-04-12 2013-02-20 Aktiebolaget Electrolux A control interface for household appliances
KR101781129B1 (en) * 2010-09-20 2017-09-22 삼성전자주식회사 Terminal device for downloading and installing an application and method thereof
US9338510B2 (en) * 2011-07-31 2016-05-10 Google Inc. Systems and methods for presenting home screen shortcuts
US20130212517A1 (en) * 2012-02-13 2013-08-15 Lenovo (Beijing) Co., Ltd. Electronic Device and Display Processing Method
US9800951B1 (en) * 2012-06-21 2017-10-24 Amazon Technologies, Inc. Unobtrusively enhancing video content with extrinsic data
CN103517152A * 2013-06-27 2014-01-15 厦门华侨电子股份有限公司 Method for adding or deleting application quick-start programs and plug-ins on the television home page
CN104793874B * 2014-01-20 2019-03-29 联想(北京)有限公司 Interface display method and electronic device
CN103902158B * 2014-03-18 2017-08-25 深圳市艾优尼科技有限公司 Method and terminal for managing application icons
CN105282620A (en) * 2014-07-23 2016-01-27 深圳市同方多媒体科技有限公司 Homepage customization method and system of smart television
US11030385B2 (en) * 2015-03-30 2021-06-08 Microsoft Technology Licensing, Llc Enhanced preview technology for application add-ins
CN104918129B * 2015-05-26 2018-09-18 深圳创维-Rgb电子有限公司 TV desktop customization method and system
CN105516505B * 2015-12-25 2019-08-06 Tcl集团股份有限公司 Method, system and smartphone for synchronized use of multiple applications
CN106933438A * 2015-12-29 2017-07-07 宇龙计算机通信科技(深圳)有限公司 Application display method, device and mobile terminal
CN106210906A * 2016-08-12 2016-12-07 三星电子(中国)研发中心 Method for accessing smart television content
CN107197354B (en) * 2017-05-25 2020-09-25 海信视像科技股份有限公司 User interface control method and device and smart television
CN109766066B (en) * 2018-12-29 2022-03-01 华为技术有限公司 Message processing method, related device and system
CN110012340A * 2019-04-11 2019-07-12 青岛海信电器股份有限公司 Graphical user interface method and display device for providing menu items



Also Published As

Publication number Publication date
WO2021203569A1 (en) 2021-10-14
CN111447479A (en) 2020-07-24
WO2021114560A1 (en) 2021-06-17
WO2021114529A1 (en) 2021-06-17
CN111491196A (en) 2020-08-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221018

Address after: 83 Intekte Street, Devon, Netherlands

Applicant after: VIDAA (Netherlands) International Holdings Ltd.

Address before: 266061 room 131, 248 Hong Kong East Road, Laoshan District, Qingdao City, Shandong Province

Applicant before: QINGDAO HISENSE MEDIA NETWORKS Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20210615