CN109922364B - Display device

Info

Publication number
CN109922364B
Authority
CN
China
Prior art keywords
gui
display
item
selector
user
Prior art date
Legal status
Active
Application number
CN201910258277.7A
Other languages
Chinese (zh)
Other versions
CN109922364A (en)
Inventor
王大勇
陈验方
黄玖法
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Publication of CN109922364A
Priority to PCT/CN2019/126701 (published as WO2020147507A1)
Application granted
Publication of CN109922364B

Abstract

The invention relates to the field of display technology, and in particular to a display device that enables a user to quickly find items of interest among a large number of items, thereby improving user satisfaction. The display device includes: a display for displaying a first GUI comprising a plurality of items and a selector indicating that an item is selected, the selector moving its position in the first GUI based on user input to select a different item; and a controller in communication with the display and configured to: determine the position of the selector in the first GUI based on user input indicating movement of the selector; and, when the dwell time of the selector at any position in the first GUI is calculated to exceed a preset threshold, maintain the display of the first GUI while displaying, at a preset size smaller than the size of the first GUI, a second GUI associated with the item selected at that position, and move the selector onto the second GUI.

Description

Display device
Technical Field
The invention relates to the field of display technology, and in particular to a display device.
Background
Display devices typically provide a user with a large number of resources to browse and use, so as to accommodate the user's various needs. However, current display devices do not provide a function for quickly finding a resource of interest among a large number of resources. In particular, when the user is not familiar with the functions the resources provide, the user has to repeatedly start a resource, browse its content information, and close it again in order to determine which resources are of interest, which results in low user satisfaction.
Disclosure of Invention
The invention provides a display device that implements resource preview, so that a resource of interest can be found quickly among a large number of resources and user satisfaction is improved.
Embodiments of the invention provide the following specific technical solutions:
a display device, comprising:
a display for displaying a first GUI comprising a plurality of items and a selector indicating that an item is selected, the selector moving its position in the first GUI based on user input to select a different item;
a controller in communication with the display for performing:
determining a position of the selector in the first GUI based on user input indicating movement of the selector;
when the dwell time of the selector at any position in the first GUI is calculated to exceed a preset threshold, maintaining the display of the first GUI while displaying, at a preset size smaller than the size of the first GUI, a second GUI associated with the item selected at that position, and moving the selector onto the second GUI.
In some embodiments, the second GUI includes the content detail information that the selected item displays when it is activated.
In some embodiments, the second GUI includes a plurality of items relating to the content detail information of the selected item, and these items can be selected or activated by the selector based on user input that moves the position of the selector in the second GUI.
In some embodiments, maintaining the display of the first GUI further comprises:
on the first GUI, removing the selector and changing the display state of the selected item so as to distinguish it from the other displayed items; or,
on the first GUI, removing the selector and maintaining the display state the selected item had before the selector was removed.
In some embodiments, the controller is further configured to perform:
performing at least one of a moving operation and a zooming operation on the second GUI based on user input, wherein the moving operation indicates a change in the position of the second GUI on the first GUI, and the zooming operation indicates a change in the size of the second GUI relative to the preset size.
In some embodiments, the user input is a key value input through a key arranged on a remote controller, the key value corresponding to a moving operation and/or a zooming operation of the second GUI.
In some embodiments, the controller is further configured to perform:
replacing the first GUI displayed on the display by changing the second GUI from the preset size to the size of the first GUI, based on user input.
In some embodiments, the controller is further configured to perform:
closing the second GUI based on user input, and causing the selector to resume selecting the item in the first GUI that was selected before the second GUI was displayed.
In some embodiments, the controller is further configured to:
when the calculated dwell time of the selector at any position in the first GUI does not exceed the preset threshold, maintaining the display state of the item selected at that position while maintaining the display of the first GUI.
In the above embodiments, considering that a user who is interested in an item usually lets the selector dwell on it for more than 200-300 milliseconds, the display device provides the GUI associated with the item to the user based on the time the selector stays at the item's position, so that the user can preview the content detail information of the item in advance; the content detail information of the item can then be browsed further by controlling the selector, so that the user can quickly determine whether the item is of interest without starting, browsing, and closing the item many times.
Furthermore, when the user confirms that the function of the item is needed, because the GUI associated with the item has already been displayed at the preset size, the content detail information of the item can be displayed directly in full screen. The item is thus launched seamlessly, without the start-up being perceptible to the user, which improves the user's visual experience.
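For illustration only, and not as the claimed implementation, the following plain-Java sketch shows one way the dwell-time behavior described above could be modeled: each selector movement restarts a timer, and the preview (second GUI) is opened only if no further movement arrives before the threshold expires. The class names, the PreviewWindow callback, and the 300 ms constant are assumptions chosen for the example; a real device would also marshal the callback onto its UI thread.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

/** Illustrative sketch: open a small preview GUI when the selector dwells on an item. */
public class DwellPreviewController {

    /** Hypothetical callback used to open/close the second GUI; not an API from the patent. */
    public interface PreviewWindow {
        void open(String itemId);   // show the second GUI at a preset size smaller than the first GUI
        void close();               // stop displaying the second GUI
    }

    private static final long DWELL_THRESHOLD_MS = 300; // assumed value; the description uses 300 ms as an example
    private final ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
    private final PreviewWindow preview;
    private ScheduledFuture<?> pending;

    public DwellPreviewController(PreviewWindow preview) {
        this.preview = preview;
    }

    /** Called each time the selector moves to a new item (e.g., on a right-direction key press). */
    public synchronized void onSelectorMoved(String itemId) {
        if (pending != null) {
            pending.cancel(false);  // selector moved again before the threshold: no preview yet
        }
        preview.close();            // dismiss any previously shown preview
        // If the selector stays put for the threshold, open the preview; a real device
        // would post this back to its UI thread before touching the display.
        pending = timer.schedule(() -> preview.open(itemId), DWELL_THRESHOLD_MS, TimeUnit.MILLISECONDS);
    }

    public void shutdown() {
        timer.shutdownNow();
    }
}
```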
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings described below show only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1A is a schematic diagram illustrating an operation scenario between the display device 200 and the control apparatus 100;
fig. 1B is a block diagram schematically illustrating a configuration of the control apparatus 100 in fig. 1A;
fig. 1C is a block diagram schematically illustrating a configuration of the display device 200 in fig. 1A;
FIG. 1D is a block diagram illustrating an architectural configuration of an operating system in memory of display device 200;
FIGS. 2A-2G schematically illustrate a GUI400 provided by display device 200;
FIGS. 3A-3E schematically illustrate a GUI400 provided by display device 200;
FIGS. 4A-4D schematically illustrate a GUI400 provided by the display device 200;
FIGS. 5A-5C illustrate a flow chart of a method of displaying a graphical user interface provided by the display device 200.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
All other embodiments that a person skilled in the art can derive from the exemplary embodiments shown in the present application without inventive effort shall fall within the scope of protection of the present application. Moreover, while the disclosure herein is presented in terms of one or more exemplary examples, it should be understood that each aspect of the disclosure can be utilized independently and separately from the other aspects to provide a complete disclosure.
It should be understood that the terms "first," "second," "third," and the like in the description, in the claims, and in the drawings of the present application are used to distinguish between similar elements and not necessarily to describe a particular sequence or chronological order. The terms so used are interchangeable under appropriate circumstances, and the embodiments of the application can, for example, be implemented in sequences other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module," as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with the element.
The term "gesture" as used in this application refers to a user's behavior through a change in hand shape or an action such as hand motion to convey a desired idea, action, purpose, or result.
Fig. 1A is a schematic diagram illustrating an operation scenario between the display device 200 and the control apparatus 100. As shown in fig. 1A, the control apparatus 100 and the display device 200 may communicate with each other in a wired or wireless manner.
The control apparatus 100 is configured to control the display device 200. It receives operation instructions input by the user and converts them into instructions that the display device 200 can recognize and respond to, serving as an intermediary between the user and the display device 200. For example, when the user operates the channel up/down keys on the control apparatus 100, the display device 200 responds to the channel up/down operation.
The control device 100 may be a remote control 100A, such as a hand-held touch remote control, in which a user interface in a touch screen replaces most of the physical built-in hard keys in a typical remote control. The remote controller is generally connected to the display device using infrared protocol communication, bluetooth protocol communication, other short-range communication methods, and the like, and controls the display device 200 in a wireless or other wired manner. The user may input a user command through a key on a remote controller, a voice input, a control panel input, etc. to control the display apparatus 200. Such as: the user can input a corresponding control command through a volume up/down key, a channel control key, up/down/left/right moving keys, a voice input key, a menu key, a power on/off key, etc. on the remote controller, to implement the function of controlling the display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, and the like. For example, the display device 200 is controlled using an application program running on the smart device. The application program may provide various controls to a user through an intuitive User Interface (UI) on a screen associated with the smart device through configuration.
For example, a software application may be installed on both the mobile terminal 100B and the display device 200 so that the two can connect and communicate through a network communication protocol, for the purpose of one-to-one control operation and data communication. For instance, the mobile terminal 100B may establish a control instruction protocol with the display device 200, so that operating the various function keys or virtual buttons of the user interface provided on the mobile terminal 100B implements the functions of the physical keys arranged on the remote control 100A. The audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200 to implement a synchronized display function.
The display apparatus 200 may be implemented as a television, and may provide a smart network television function that combines a broadcast receiving function with computer support functions. Examples of the display device include a digital television, a web television, a smart television, an Internet Protocol television (IPTV), and the like.
The display device 200 may be a liquid crystal display, an organic light-emitting display, or a projection display device. The specific display device type, size, resolution, and so on are not limited.
The display apparatus 200 also performs data communication with the server 300 through various communication means. Here, the display apparatus 200 may be allowed to be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 300 may provide various contents and interactions to the display apparatus 200. By way of example, the display device 200 may send and receive information such as: receiving Electronic Program Guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The servers 300 may be a group or groups of servers, and may be one or more types of servers. Other web service contents such as a video on demand and an advertisement service are provided through the server 300.
Fig. 1B is a block diagram schematically showing the configuration of the control apparatus 100. As shown in fig. 1B, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, an output interface 150, and a power supply 160.
The controller 110 includes a Random Access Memory (RAM) 111, a Read Only Memory (ROM) 112, a processor 113, a communication interface, and a communication bus. The controller 110 is used to control the running and operation of the control device 100, the communication and cooperation among its internal components, and external and internal data processing functions.
Illustratively, when an interaction of a user pressing a key disposed on the remote controller 100A or an interaction of touching a touch panel disposed on the remote controller 100A is detected, the controller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.
And a memory 120 for storing various operation programs, data and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 120 may store various control signal commands input by a user.
The communicator 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. For example, the control apparatus 100 transmits a control signal (e.g., a touch signal or a button signal) to the display device 200 via the communicator 130, and the control apparatus 100 may receive signals transmitted by the display device 200 via the communicator 130. The communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132. For example, when the infrared signal interface is used, the user input instruction is converted into an infrared control signal according to an infrared control protocol and sent to the display device 200 through the infrared sending module. For another example, when the radio frequency signal interface is used, a user input command is converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmitting terminal.
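As a rough, non-authoritative illustration of the conversion described above, the sketch below packages a key value either as a small infrared frame or as a digital payload for radio-frequency transmission. The Transmitter interface, the frame layout, and the header byte are invented for the example; the actual infrared control protocol and RF modulation used by the control apparatus 100 are not specified here.

```java
/** Illustrative sketch of converting a user key value into an IR or RF control signal. */
public class ControlSignalEncoder {

    /** Hypothetical transport abstraction; the real communicator 130 is hardware-specific. */
    public interface Transmitter {
        void send(byte[] payload);
    }

    private final Transmitter infrared;        // stands in for the infrared signal interface 131
    private final Transmitter radioFrequency;  // stands in for the radio frequency signal interface 132

    public ControlSignalEncoder(Transmitter infrared, Transmitter radioFrequency) {
        this.infrared = infrared;
        this.radioFrequency = radioFrequency;
    }

    /** Encode the key value according to a simple, assumed IR control frame layout. */
    public void sendOverInfrared(int keyValue) {
        byte key = (byte) keyValue;
        // header byte, key code, inverted key code for error checking -- assumed frame layout
        byte[] frame = { (byte) 0xA5, key, (byte) ~key };
        infrared.send(frame);
    }

    /** Convert the key value into a digital payload to be modulated and sent over RF. */
    public void sendOverRadioFrequency(int keyValue) {
        byte[] digital = { (byte) (keyValue >> 8), (byte) keyValue };
        radioFrequency.send(digital); // modulation per the RF control protocol happens downstream
    }
}
```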
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like, so that a user can input a user instruction regarding controlling the display apparatus 200 to the control apparatus 100 through voice, touch, gesture, press, and the like.
The output interface 150 outputs a user instruction received by the user input interface 140 to the display apparatus 200, or outputs an image or voice signal received from the display apparatus 200. Here, the output interface 150 may include an LED interface 151, a vibration interface 152 that generates vibration, a sound output interface 153 that outputs sound, a display 154 that outputs an image, and the like. For example, the remote controller 100A may receive an output signal such as audio, video, or data from the output interface 150 and present it as an image on the display 154, as audio at the sound output interface 153, or as vibration at the vibration interface 152.
The power supply 160 provides operating power to each element of the control device 100 under the control of the controller 110, and may take the form of a battery and associated control circuitry.
A hardware configuration block diagram of the display device 200 is exemplarily illustrated in fig. 1C. As shown in fig. 1C, the display apparatus 200 may further include a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, an audio processor 280, an audio input interface 285, and a power supply 290.
The tuner demodulator 210 receives the broadcast television signal in a wired or wireless manner, may perform modulation and demodulation processing such as amplification, mixing, and resonance, and is configured to demodulate, from a plurality of wireless or wired broadcast television signals, an audio/video signal carried in a frequency of a television channel selected by a user, and additional information (e.g., EPG data).
The tuner demodulator 210 responds to the frequency of the television channel selected by the user and the television signal carried on that frequency, under the control of the controller 250.
The tuner demodulator 210 can receive a television signal in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; and according to different modulation types, a digital modulation mode or an analog modulation mode can be adopted; and can demodulate the analog signal and the digital signal according to different types of the received television signals.
In other exemplary embodiments, the tuning demodulator 210 may also be in an external device, such as an external set-top box. In this way, the set-top box outputs television signals through modulation and demodulation, and inputs the television signals into the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various types of communication protocols. For example, the display apparatus 200 may transmit content data to an external apparatus connected via the communicator 220, or browse and download content data from an external apparatus connected via the communicator 220. The communicator 220 may include a network communication protocol module or a near field communication protocol module, such as a WIFI module 221, a Bluetooth communication protocol module 222, and a wired Ethernet communication protocol module 223, so that the communicator 220 can receive the control signal of the control device 100, under the control of the controller 250, as a WiFi signal, a Bluetooth signal, a radio frequency signal, or the like.
The detector 230 is a component of the display apparatus 200 for collecting signals from the external environment or for interacting with the outside. The detector 230 may include an image collector 231, such as a camera or video camera, which may collect scenes of the external environment so that the display parameters of the display device 200 can be adaptively changed, and which may also acquire attributes of the user or gestures used to interact with the user, so as to realize interaction between the display device and the user. A light receiver 232 may also be included to collect ambient light intensity so that the display parameters of the display device 200 can adapt to changes in it.
In some other exemplary embodiments, the detector 230 may further include a temperature sensor; by sensing the ambient temperature, the display device 200 may adaptively adjust the display color temperature of the image. For example, when the ambient temperature is relatively high, the display apparatus 200 may adjust the image toward a cooler color temperature; when the ambient temperature is relatively low, the display device 200 may adjust the image toward a warmer color temperature.
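A minimal sketch of the temperature-based adaptation described above follows; the thresholds and Kelvin values are assumptions invented for the example rather than values from the embodiment.

```java
/** Illustrative sketch: adapt the display color temperature to the ambient temperature. */
public class ColorTemperatureAdapter {

    // Thresholds and Kelvin values are assumptions, not taken from the embodiment.
    private static final double WARM_LIMIT_CELSIUS = 18.0;
    private static final double COOL_LIMIT_CELSIUS = 28.0;

    /** Returns a target display color temperature in Kelvin for the sensed ambient temperature. */
    public int targetColorTemperatureK(double ambientCelsius) {
        if (ambientCelsius >= COOL_LIMIT_CELSIUS) {
            return 7500;  // higher ambient temperature: cooler (bluer) image
        } else if (ambientCelsius <= WARM_LIMIT_CELSIUS) {
            return 5000;  // lower ambient temperature: warmer (redder) image
        }
        return 6500;      // comfortable range: neutral
    }
}
```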
In some other exemplary embodiments, the detector 230, which may further include a sound collector, such as a microphone, may be configured to receive a sound of a user, such as a voice signal of a control instruction of the user to control the display device 200; alternatively, ambient sounds may be collected that identify the type of ambient scene, enabling the display device 200 to adapt to ambient noise.
The external device interface 240 is a component that enables data transmission between the display apparatus 200 and an external apparatus under the control of the controller 250. The external device interface 240 may be connected to an external apparatus such as a set-top box, a game device, or a notebook computer in a wired/wireless manner, and may receive data such as a video signal (e.g., moving images), an audio signal (e.g., music), and additional information (e.g., EPG) from the external apparatus.
The external device interface 240 may include: a High Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not shown), a red, green, blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to the operation of the user by running various software control programs (such as an operating system and various application programs) stored on the memory 260.
As shown in fig. 1C, the controller 250 includes a Random Access Memory (RAM) 251, a Read Only Memory (ROM) 252, a graphics processor 253, a CPU processor 254, a communication interface 255, and a communication bus 256. The RAM251, the ROM252, the graphic processor 253, and the CPU processor 254 are connected to each other through a communication bus 256.
The ROM252 stores various system boot instructions. When the display apparatus 200 is powered on upon receiving a power-on signal, the CPU processor 254 executes the system boot instructions in the ROM252 and copies the operating system stored in the memory 260 to the RAM251 in order to start running the operating system. After the operating system has started, the CPU processor 254 copies the various application programs in the memory 260 to the RAM251 and then starts the various application programs.
The graphics processor 253 generates screen images of various graphic objects, such as icons, images, and operation menus. The graphics processor 253 may include an operator, which performs operations by receiving the various interactive instructions input by the user and then displays the various objects according to their display attributes, and a renderer, which generates the various objects based on the operator and displays the rendered result on the display 275.
The CPU processor 254 executes operating system and application program instructions stored in the memory 260, and, according to received user input instructions, executes the processing of various application programs, data, and content, so as to finally display and play various audio and video content.
In some exemplary embodiments, the CPU processor 254 may comprise a plurality of processors, which may include one main processor and one or more sub-processors. The main processor performs some initialization operations of the display apparatus 200 in a preload mode and/or operations of displaying a screen in the normal mode. The one or more sub-processors perform operations in states such as a standby mode of the display device.
The communication interface 255 may include a first interface to an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
The object may be any selectable object, such as a hyperlink or an icon. The operation related to the selected object is, for example, displaying the linked hyperlink page, document, or image, or executing the program corresponding to the icon. The user input command for selecting the GUI object may be a command input through various input means connected to the display apparatus 200 (e.g., a mouse, a keyboard, or a touch pad) or a voice command corresponding to speech uttered by the user.
The memory 260 stores various types of data, software programs, and applications for driving and controlling the operation of the display device 200. The memory 260 may include volatile and/or nonvolatile memory. The term "memory" here includes the memory 260, the RAM251 and ROM252 of the controller 250, and a memory card in the display device 200.
In some embodiments, the memory 260 is specifically used for storing an operating program for driving the controller 250 of the display device 200; storing various application programs built in the display apparatus 200 and downloaded by a user from an external apparatus; data such as visual effect images for configuring various GUIs provided by the display 275, various objects related to the GUIs, and selectors for selecting GUI objects are stored.
In some embodiments, the memory 260 is specifically configured to store drivers and related data for the tuner demodulator 210, the communicator 220, the detector 230, the external device interface 240, the video processor 270, the display 275, the audio processor 280, and the like, external data (e.g., audio-visual data) received from the external device interface, or user data (e.g., key information, voice information, touch information, and the like) received from the user interface.
In some embodiments, memory 260 specifically stores software and/or programs representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, as well as functions implemented by other programs (e.g., the middleware, APIs, or applications); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to enable control or management of system resources.
A block diagram of the architectural configuration of the operating system in the memory of the display device 200 is illustrated in fig. 1D. The operating system architecture comprises an application layer, a framework layer and a kernel layer from top to bottom.
Application layer: the application programs built into the system and non-system-level application programs both belong to the application layer, which is responsible for direct interaction with the user. The application layer may include a plurality of applications, such as a live television application, a video-on-demand application, a media center application, and a screenshot application.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on display device 200.
A video-on-demand application may provide video from different storage sources. Unlike a live television application, video on demand provides video from a storage source, for example from the server side of a cloud storage or from a local hard disk storage containing stored video programs.
The media center application can provide various applications for playing multimedia content. Unlike live television or video on demand, the media center application allows the user to access various images or videos stored in the memory.
The screenshot application program can perform screenshot on a current display picture on the display, and perform annotation such as identification frames, names and the like on identification objects (such as characters, channel station logos, buildings and the like) contained in the screenshot image, so as to provide a display function of various identification object information contained in the display picture for a user. The current display picture can be at least one of characters, images and videos.
And the framework layer is responsible for providing the API required by the application layer. For example, a live television application, a video-on-demand application, and a media center application may call a decoder to perform audio and video decoding through an interface provided by the framework layer. For another example, the screenshot application may call the screenshot image of the current display that has been captured through an interface provided by the framework layer.
The kernel layer provides core system services, such as: file management, memory management, process management, network management, system security authority management and the like. The kernel layer may be implemented as a kernel based on various operating systems, for example, an android operating system based kernel.
The kernel layer also provides communication between system software and hardware, providing device driver services for various hardware, for example: a display driver for the display, a camera driver for the camera, a key driver for the remote controller, a WiFi driver for the WIFI module, an audio driver for the audio output interface, a power management driver for the Power Management (PM) module, and so on.
A user interface 265 receives various user interactions. Specifically, it is used to transmit an input signal of a user to the controller 250 or transmit an output signal from the controller 250 to the user. For example, the remote controller 100A may transmit an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by the user to the user interface 265, and then the input signal is transferred to the controller 250 through the user interface 265; alternatively, the remote controller 100A may receive an output signal such as audio, video, or data output from the user interface 265 via the controller 250, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user interface 265 receives the user input commands through the GUI. Specifically, the user interface 265 may receive user input commands for controlling the position of a selector in the GUI to select different objects.
Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user interface 265 receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 270 is configured to receive an external video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on the display 275.
Illustratively, the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module demultiplexes an input audio/video data stream; for example, for an input MPEG-2 stream (a compression standard for digital storage media moving images and audio), the demultiplexing module separates it into a video signal and an audio signal.
The video decoding module processes the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module superimposes and mixes the GUI signal, input by the user or generated by the graphics generator, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module converts the frame rate of the input video, for example converting the frame rate of an input 60 Hz video into 120 Hz or 240 Hz; a common approach is frame interpolation.
The display formatting module converts the signal output by the frame rate conversion module into a signal conforming to the display format of the display, for example converting it into RGB data signals for output.
The display 275 receives the image signal output by the video processor 270 and displays video, images, and menu manipulation interfaces. For example, the display may display video from a broadcast signal received by the tuner demodulator 210, video input from the communicator 220 or the external device interface 240, and images stored in the memory 260. The display 275 also displays the user manipulation interface (UI) generated in the display apparatus 200 and used to control the display apparatus 200.
The display 275 may include a display screen assembly for presenting the picture and a driving assembly for driving the display of the image. Alternatively, if the display 275 is a projection display, it may include a projection device and a projection screen.
The audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification processing to obtain an audio signal that can be played by the speaker 286.
Illustratively, audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, Advanced Audio Coding (AAC), high efficiency AAC (HE-AAC), and the like.
The audio output interface 285 receives the audio signal output by the audio processor 280. For example, the audio output interface may output audio from a broadcast signal received via the tuner demodulator 210, audio input via the communicator 220 or the external device interface 240, and audio stored in the memory 260. The audio output interface 285 may include a speaker 286, or an external audio output terminal 287, such as an earphone output terminal, that outputs to a sound-producing device of an external apparatus.
In other exemplary embodiments, video processor 270 may comprise one or more chips. Audio processor 280 may also comprise one or more chips.
And, in other exemplary embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated with the controller 250 in one or more chips.
The power supply 290 supplies power to the display apparatus 200 from power input from an external power source, under the control of the controller 250. The power supply 290 may be a built-in power supply circuit installed inside the display apparatus 200, or a power supply installed outside the display apparatus 200.
A schematic diagram of a GUI400 provided by the display device 200 is illustrated in FIGS. 2A-2G.
As shown in FIGS. 2A-2G, the display device may provide a GUI400 to the display, the GUI400 including a first GUI in which a plurality of different items are arranged, and a selector indicating that any one of the items is selected. For example, FIG. 2A shows a GUI that includes a first GUI41 in which items 411-415 are arranged and a selector 42 indicating that item 411 is selected.
Note that an item refers to a visual object displayed in the GUI provided by the display apparatus 200 to represent corresponding content, such as an icon, a thumbnail, or a link. If the item is a movie or a TV show, it may be displayed as a poster of the movie or TV show. If the item is music, a poster of the music album may be displayed. If the item is an application, it may be displayed as the application's icon, or as a screenshot of the content captured the last time the application was executed. If the item is the user's access history, a content screenshot from the most recent execution may be displayed.
The presentation forms of items are often diverse. For example, the items may include text content and/or images for displaying thumbnails related to the text content. As another example, the item may be text and/or an icon of an application.
It is further noted that the selector may be moved within the GUI by user input through the control apparatus, so as to change which item is selected. For example, the selector may be a focus object whose movement is controlled in the display device according to the user's input through the control apparatus, so as to select or control an item. For instance, the user may select and control items by moving the focus object between items using the direction keys on the control device.
The selector can be identified in a variety of forms. For example, in FIG. 2A the location of the focus object is identified by enlarging the border of item 411. As another example, in FIG. 2D the location of the focus object is identified by drawing the edge of item 431 with a thick line. Further, the position of the focus object may also be identified by changing the border line, size, color, transparency, or outline of the text or image of the focused item, and/or its font, and so on.
As shown in FIGS. 2A-2G, items 411-415 are provided on the display, corresponding respectively to the icons and names of the settings, music, television assistant, video, and file applications, each of which may be activated based on user input to display its corresponding content detail information. For example, the television assistant application corresponding to item 413 may be activated when the user presses the confirmation key on the control device while the selector 42 is at the position of item 413 in FIG. 2C, thereby displaying the content detail information of the television assistant application shown in FIG. 4D in full screen.
In fig. 2A, the first GUI41 is displayed full screen and the selector 42 indicates that the item 411 is selected. When the user presses the right direction key on the control device, as shown in fig. 2B, the display device may move the position of the selector 42 in response to the input right direction command, so that the selector 42 indicates that the item 412 is selected, the item 412 border is enlarged and displayed, and the item 411 border is restored to be the same as the other items 413-415, so as to prompt the user that the current item 412 is selected by the selector 42.
In fig. 2B, when the time for the selector 42 to stay at the position of the item 412 after moving from the position of the item 411 to the position of the item 412 is short, the user continues to press the right direction key on the control device, as shown in fig. 2C, the display device may move the position of the selector 42 in response to the input right direction instruction, so that the selector 42 indicates that the item 413 is selected, the border of the item 413 is enlarged and displayed, and the border of the item 412 is restored to be the same as the other items 411, 414-415, so as to prompt the user that the current item 413 is selected by the selector 42.
In some embodiments, in fig. 2C, when the selector 42 stays at the position of the item 413 for a longer time after moving from the position of the item 412 to the position of the item 413, the display device still does not continue to receive the input from the user operating the control device, at this time, as shown in fig. 2D, the display device may display the second GUI43 associated with the item 413 at the upper right corner position of the first GUI41 in a preset size smaller than the size of the first GUI41, so that the user can preview the content detail information of the item 413 in the second GUI43 in advance. For example, the style and content of the television assistant application interface is provided in the second GUI 43.
In FIG. 2D, the second GUI43 is displayed while the selector 42 is moved onto the second GUI43, e.g., onto item 431 in the second GUI43, so that the user can continue browsing the content detail information of item 413 in the second GUI43. The first GUI41 remains displayed, and item 413 in the first GUI41 can be shown in a display state that distinguishes it from the other items 411-412 and 414-415, for example with its edge drawn with thick lines; alternatively, its display state when selected by the selector can be maintained, for example with the border of item 413 enlarged. In either case, the user is prompted that the currently displayed second GUI43 is associated with item 413.
In some embodiments, in fig. 2C, when the selector 42 stays at the position of the item 413 for a short time after moving from the position of the item 412 to the position of the item 413, the user continues to press the right direction key on the control device, as shown in fig. 2E, the display device may move the position of the selector 42 in response to the input right direction instruction, so that the selector 42 indicates that the item 414 is selected, the border of the item 414 is enlarged and displayed, and the border of the item 413 returns to be the same as the other items 411 to 412 and 415, so as to prompt the user that the current item 414 is selected by the selector 42.
In some embodiments, in fig. 2E, when the selector 42 stays at the item 414 position for a longer time after moving from the item 413 position to the item 414 position, the display device still does not continue to receive the input from the user operating the control device, at this time, as shown in fig. 2F, the display device may display the second GUI44 associated with the item 414 at the upper right corner position of the first GUI41 in a preset size smaller than the size of the first GUI41, so that the user can preview the content detail information of the item 414 in the second GUI44 in advance.
In FIG. 2F, the second GUI44 is displayed while the selector 42 is caused to move over the second GUI44 for the user to continue browsing the content detail information of the item 414 in the second GUI 44. The first GUI41 is kept displayed, and the item 414 in the first GUI41 can be distinguished from other items 411-413, 415 in display status, for example, the item 414 is displayed with a thick line at the edge; the display state when the selector selects the same can also be maintained, for example, the item 414 is displayed in a frame enlarging mode; the user may thus be prompted that the currently displayed second GUI44 is associated with the item 414.
In some embodiments, in fig. 2E, when the time for the selector 42 to stay at the position of the item 414 after moving from the position of the item 413 to the position of the item 414 is short, the user continues to press the right direction key on the control device, as shown in fig. 2G, the display device may move the position of the selector 42 in response to the input right direction instruction, so that the selector 42 indicates that the item 415 is selected, the border of the item 415 is enlarged and displayed, and the border of the item 414 is restored to be the same as the other items 411 to 413, so as to prompt the user that the current item 415 is selected by the selector 42.
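To make the selector behavior of FIGS. 2A-2G concrete, the following sketch models a row of items with a moving focus, where the focused item is the one whose border would be enlarged. The item names and the console rendering are simplifications invented for the example; an actual display device would redraw the GUI instead of printing.

```java
import java.util.Arrays;
import java.util.List;

/** Illustrative sketch of selector movement across the items of the first GUI. */
public class FirstGuiSelector {

    private final List<String> items;   // e.g., items 411-415: settings, music, TV assistant, video, file
    private int focusedIndex = 0;       // index of the item currently indicated by the selector

    public FirstGuiSelector(List<String> items) {
        this.items = items;
    }

    /** Handle a right-direction key: move the focus to the next item and update display states. */
    public void moveRight() {
        if (focusedIndex < items.size() - 1) {
            focusedIndex++;
        }
        render();
    }

    /** Handle a left-direction key. */
    public void moveLeft() {
        if (focusedIndex > 0) {
            focusedIndex--;
        }
        render();
    }

    /** Prints which item has the "enlarged border"; a real device would redraw the GUI instead. */
    private void render() {
        for (int i = 0; i < items.size(); i++) {
            String marker = (i == focusedIndex) ? "[ " + items.get(i) + " ]" : items.get(i);
            System.out.print(marker + "  ");
        }
        System.out.println();
    }

    public static void main(String[] args) {
        FirstGuiSelector gui = new FirstGuiSelector(
                Arrays.asList("settings", "music", "tv-assistant", "video", "file"));
        gui.moveRight(); // selector moves from item 411 to item 412, as in FIG. 2B
        gui.moveRight(); // selector moves on to item 413, as in FIG. 2C
    }
}
```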
In other embodiments, in the GUI shown in FIG. 2D, the second GUI43 provides content detail information corresponding to item 413. For example, the content detail information for the television assistant application provided in the second GUI43 includes items 431-434. These items 431-434 may also be activated when the user presses the confirmation key on the control device while the selector 42 is at the corresponding item position in FIG. 2D, thereby switching to the second sub-GUI associated with that item. The user can thus switch between the content detail information of the television assistant application to further confirm whether the functionality it provides is of interest.
For example, in fig. 2D, the selector 42 indicates that the item 431 in the second GUI43 is selected. When the user presses a confirmation key on the control apparatus, as shown in fig. 3A, the display device may display a second sub-GUI 4310 associated with an item 431 in place of the second GUI43 in response to an input instruction to activate the item 431. For example, the second sub-GUI 4310 displays the content information corresponding to the memory speed-up option.
As another example, in FIG. 2D, the selector 42 indicates that item 431 in the second GUI43 is selected. When the user presses the right direction key on the control device, as shown in FIG. 3B, the display device may move the position of the selector 42 in the second GUI43 in response to the input right-direction command, such that the selector 42 indicates that item 432 is selected. When the user then presses the confirmation key on the control device, as shown in FIG. 3C, the display device may display a second sub-GUI 4320 associated with item 432 in place of the second GUI43 in response to the input instruction to activate item 432. For example, the second sub-GUI 4320 displays the content information corresponding to the garbage disposal option.
As another example, in fig. 3B, the selector 42 indicates that the item 432 in the second GUI43 is selected. When the user presses a right direction key on the control, as shown in fig. 3D, the display device may move the position of the selector 42 in the second GUI43 in response to the input right direction instruction, such that the selector 42 indicates that the item 433 is selected. When the user continues to press the enter key on the control, as shown in fig. 3E, the display device may display a second sub-GUI 4330 associated with the item 433 in place of the second GUI43 in response to an input instruction to activate the item 433. For example, the content information corresponding to the security killing option is displayed in the second sub-GUI 4330.
In still other embodiments, in the GUIs shown in fig. 2D and 3A-3E, when a user presses a preset key on the control device, the display device may control the display position and/or display size of the second GUI43 on the first GUI41, so that the second GUI may be flexibly moved and/or zoomed based on the user's needs to enhance the user's satisfaction.
For example, in fig. 2D, when the user presses a channel down key on the control apparatus, as shown in fig. 4A, the display device may control the second GUI43 to move leftward by a preset distance in response to an input key instruction; as the user continues to press the channel down key on the control device, as shown in fig. 4B, the display device may control the second GUI43 to continue moving leftward a preset distance in response to the input key command.
As another example, in fig. 4A, when the user presses the channel up key on the control apparatus, as shown in fig. 4C, the display device may control the second GUI43 to move down by a preset distance while enlarging a preset multiple in response to the input key command.
In still other embodiments, in the GUIs illustrated in fig. 2D and 3A-3E, when a user presses a preset key on the control device, the display device may control the second GUI to be displayed full screen, so that when the user is interested in the second GUI associated with an item in the first GUI, a quick launch of the item may be achieved without being perceptible to the user.
For example, in FIG. 2D, when the user presses the channel up key on the control apparatus, as shown in FIG. 4D, the display device may, in response to the input key instruction, display the second GUI43 associated with item 413 in full screen in place of the first GUI41, thereby providing the user with the function of item 413. Compared with activating item 413 by pressing the confirmation key on the control device while the selector 42 is at the position of item 413 in FIG. 2C (so as to display the content detail information of item 413 shown in FIG. 4D in full screen), item 413 has here already been activated in advance, so its content detail information can be displayed in full screen directly. This avoids the user having to wait before the content detail information of item 413 shown in FIG. 4D can be viewed in cases where item 413 requires a long start-up time.
In still other embodiments, in the GUIs illustrated in FIGS. 2D and 3A-3E, when the user presses a preset key on the control device, the display device may stop displaying the second GUI, allowing the user to continue browsing other items in the first GUI when the user is not interested in the second GUI associated with an item in the first GUI.
For example, in FIG. 2D, when the user presses a return key on the control, as shown in FIG. 2C, the display device may no longer display the second GUI43 and cause the selector 42 to move over the item 413 in response to the input return instruction.
As another example, in fig. 3B, when the user presses a return key on the control or continuously presses a return key on the control, as shown in fig. 2C, the display device may no longer display the second GUI43 and cause the selector 42 to move onto the item 413 in response to an input return instruction.
As described in the above embodiments, when a display device provides a large number of items, the display device may provide the GUI associated with an item to the user based on the time the selector remains at the item's location, so that the user previews the content detail information of the item in advance; the content detail information of the item can then be browsed further by controlling the selector, so that the user can quickly determine whether the item is of interest without repeatedly starting the item, browsing its content information, and closing it.
Further, in providing the GUI associated with the item to the user, the display device may control a display position and a display size of the GUI associated with the item on the display based on the user input, so that the user may conveniently preview the content detail information of the item according to the user's needs.
Furthermore, after the user previews the content detail information of the item and determines that the function provided by the item is needed, the display device can, based on user input, display the GUI associated with the item in full screen on the display. Because the GUI associated with the item was started in advance, the item is launched seamlessly, and the user clearly perceives that the item starts faster.
FIGS. 5A-5C illustrate a flow chart of a method of displaying a graphical user interface provided by the display device 200.
With reference to the method shown in FIG. 5A, the method includes the following steps S51-S56:
step S51: a first GUI including a plurality of items and a selector indicating that any of the items is selected is displayed. For example, the first GUI41 including the items 411 to 415 shown in FIG. 2A is displayed, along with the selector 42 indicating that the item 411 is selected.
Step S52: and receiving a movement instruction which is input by a user through the control device and indicates to move the selector. For example, the user presses the right direction key on the control to indicate moving the selector to the right.
Step S53: in response to the entered movement instruction, an actual position of the selector in the first GUI is determined.
Step S54: Judging whether the dwell time of the selector at the actual position exceeds a preset duration; if yes, step S55 is executed; otherwise, step S56 is executed.
Step S55: displaying a second GUI associated with the item selected at the actual position at a preset size smaller than the size of the first GUI, and causing the selector to move to the second GUI.
Step S56: Keeping the item at the actual position selected by the selector.
For example, in the GUI shown in FIG. 2C, the selector 42 is determined to have stopped at the position of item 413 in the first GUI41. When the calculated dwell time of the selector 42 at the position of item 413 exceeds a preset duration (e.g., 300 ms), the second GUI43 associated with item 413 is displayed at a preset size smaller than the size of the first GUI41, in the upper-right corner of the first GUI41 as shown in FIG. 2D, and the selector is moved onto item 431 in the second GUI43. When the calculated dwell time of the selector 42 at the position of item 413 does not exceed the preset duration (e.g., 300 ms), as shown in FIG. 2C, the selector 42 keeps item 413 selected, for example with the border of item 413 remaining enlarged.
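The placement of the second GUI in the example above (a preset size smaller than the first GUI, in its upper-right corner) can be sketched as a simple geometry calculation. The 0.4 scale factor and the use of java.awt.Rectangle are assumptions chosen for illustration; the embodiment only requires that the preset size be smaller than the first GUI.

```java
import java.awt.Rectangle;

/** Illustrative sketch: place the second GUI at a preset size in the upper-right corner of the first GUI. */
public class PreviewWindowPlacement {

    // Assumed scale factor; the embodiment only requires the preset size to be smaller than the first GUI.
    private static final double PRESET_SCALE = 0.4;

    /** Returns the bounds of the second GUI given the full-screen bounds of the first GUI. */
    public static Rectangle secondGuiBounds(Rectangle firstGui) {
        int width = (int) (firstGui.width * PRESET_SCALE);
        int height = (int) (firstGui.height * PRESET_SCALE);
        int x = firstGui.x + firstGui.width - width;   // flush with the right edge
        int y = firstGui.y;                            // flush with the top edge (upper-right corner)
        return new Rectangle(x, y, width, height);
    }

    public static void main(String[] args) {
        Rectangle firstGui = new Rectangle(0, 0, 1920, 1080); // full-screen first GUI41 (assumed resolution)
        System.out.println(secondGuiBounds(firstGui));        // bounds for the second GUI43, as in FIG. 2D
    }
}
```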
In addition, the method may further include:
step S57: an instruction input by a user through the control device is received to instruct to move and/or zoom the second GUI. For example, the user presses a channel up/down key on the control device to instruct movement and/or zooming of the second GUI.
Step S58: controlling the display position and/or the display size of the second GUI on the first GUI in response to the input instruction.
Here, the correspondence between key values and the operations used to control the second GUI may be prestored in the display device. For example, the key value of the channel down key may correspond to moving the second GUI 200 pixels to the left. When the user presses the channel down key on the control apparatus in FIG. 2D, the display device may control the second GUI 43 to move leftward by the preset distance in response to the input key command, as shown in FIG. 4A. Likewise, the key value of the channel up key may correspond to moving the second GUI 100 pixels downward and/or enlarging it 1.5 times. When the user presses the channel up key on the control apparatus in FIG. 4A, the display device may control the second GUI 43 to move downward by the preset distance while enlarging it by the preset multiple in response to the input key command, as shown in FIG. 4C.
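A minimal sketch of such a prestored key-value correspondence is given below (Java; the key codes and the offsets of 200 pixels, 100 pixels and 1.5 times are taken from the example above, while the class and method names are assumptions):

import android.view.KeyEvent;

public class PreviewWindowController {
    private int x, y;           // current position of the second GUI on the first GUI
    private float scale = 1.0f; // current zoom factor relative to the preset size

    // Returns true when the key was consumed as a move/zoom operation on the second GUI.
    public boolean onKey(int keyCode) {
        switch (keyCode) {
            case KeyEvent.KEYCODE_CHANNEL_DOWN: // example mapping: move 200 pixels to the left
                x -= 200;
                applyBounds();
                return true;
            case KeyEvent.KEYCODE_CHANNEL_UP:   // example mapping: move 100 pixels down and enlarge 1.5 times
                y += 100;
                scale *= 1.5f;
                applyBounds();
                return true;
            default:
                return false;
        }
    }

    // Recomputes the window bounds from (x, y, scale) and applies them to the small window.
    private void applyBounds() {
    }
}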
The following describes steps S53 to S55 in detail with reference to the method shown in FIG. 5B, taking as an example the case where the operating system stored in the memory of FIG. 1D is the Android system.
Step S531: the framework layer detects the actual position of the selector in the first GUI based on a movement instruction for moving the selector transmitted by the key driver in the kernel layer.
For example, in FIG. 2B, when the user presses the right direction key on the control device, the key driver in the kernel layer first converts the key value corresponding to the right direction key into an instruction to move the selector to the right and transmits that instruction to the framework layer, so that the framework layer detects the position to which the selector is to be moved in the first GUI; as shown in FIG. 2C, the selector 42 moves to the position of the item 413.
Step S541: the software program corresponding to the first GUI determines whether the dwell time of the selector at the actual position exceeds the preset duration; if so, step S551 is executed; otherwise, step S56 is executed.
For example, in fig. 2C, the software program corresponding to the first GUI is a home application, and the home application determines whether the stay time of the selector 42 at the position of the item 413 exceeds a preset time period.
Step S551: the software program corresponding to the first GUI sends a start request to the framework layer to activate the item selected at the actual position;
step S552: the framework layer determines whether the start request carries information indicating that the second GUI associated with the selected item is to be displayed in a small window; if so, step S553 is executed; otherwise, the flow ends.
Step S553: based on the information, the second GUI associated with the selected item is launched and displayed in the small window.
Continuing with the above example, when the home application determines that the selector 42 has stayed at the position of the item 413 longer than the preset duration, it sends a launch request to the framework layer to activate the television assistant application (i.e., item 413), carrying information indicating that the television assistant application is to be displayed in a small window. When the framework layer determines that this information is carried in the launch request, it starts the television assistant application and displays it in a small window determined from the information.
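If the launch request is expressed as an ordinary Activity start, the small-window hint could be attached as Intent extras, roughly as sketched below (Java; the extra keys are assumptions, and FLAG_ADJUST and WIN_ENABLED_RIGHT_TOP_4 mirror the parameters named further below rather than any public Android API):

import android.app.Activity;
import android.content.Intent;

public class LaunchHelper {
    // Assumed extra keys; the actual framework-layer protocol is not specified here.
    public static final String EXTRA_WINDOW_MODE = "window_mode"; // e.g. "FLAG_ADJUST"
    public static final String EXTRA_WINDOW_SPEC = "window_spec"; // e.g. "WIN_ENABLED_RIGHT_TOP_4"

    // Sent by the home application (ActivityA) to request the preview (ActivityB) in a small window.
    public static void launchPreview(Activity from, Class<?> previewActivity) {
        Intent intent = new Intent(from, previewActivity);
        intent.putExtra(EXTRA_WINDOW_MODE, "FLAG_ADJUST");             // request small-window display
        intent.putExtra(EXTRA_WINDOW_SPEC, "WIN_ENABLED_RIGHT_TOP_4"); // upper right, one fourth of the screen
        from.startActivity(intent);                                    // the framework layer reads the extras
    }
}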
Furthermore, suppose for example that the first GUI corresponds to ActivityA and the second GUI corresponds to ActivityB (an Activity is one of the four fundamental components of the Android system, and one Activity may correspond to one graphical user interface); the start request is then a request for starting ActivityB sent by ActivityA.
Specifically, in combination with the method shown in fig. 5C, step S553 further includes steps S5531 to S5535:
step S5531: detecting whether the information in the start request contains a parameter indicating that ActivityB adopts the small-window display mode; if so, step S5532 is executed; otherwise, the flow ends.
Step S5532: ActivityA is kept visible.
Step S5533: detecting whether the information in the start request carries parameters indicating the display size and display position of the small window; if so, step S5534 is executed; otherwise, the flow ends.
Step S5534: and determining the display size and the display position of the small window.
Step S5535: ActivityB starts and is displayed with a small window.
Here, the parameter indicating that ActivityB adopts the small-window display mode may be, for example, FLAG_ADJUST.
If the information in the start request contains this parameter, the flag in the ActivityTask corresponding to ActivityB is modified, namely the theme of ActivityB is forced to be transparent, so that under the current Android system rule the system does not change ActivityA from the visible state to the invisible state, and ActivityA remains visible.
Then, the parameter indicating the display size and display position of the small window may be, for example, WIN_ENABLED_RIGHT_TOP_4.
If the information in the start request carries this parameter, the boundary mBounds of the display window corresponding to ActivityB is calculated according to the window adjustment parameter (_4): the boundary size is one fourth of the display screen size, i.e., the width and height of the display window are one fourth of the width and height of the display screen respectively, for example mBounds = [0, 1920/4, 0, 1080/4]; and the position of mBounds within the display screen (here, the upper right of the display screen) is determined according to the window position parameter (_RIGHT_TOP);
finally, ActivityB starts and is displayed in a small window with the determined display size and display position.
It should be noted that, for example, the size of the boundary of the display window corresponding to ActivityB may be one third, one fourth or one fifth of the size of the display screen, and the position of the display window corresponding to ActivityB may be the upper left, upper right, lower left or lower right of the display screen.
Combining the boundary sizes and display positions, there are 12 possible display areas for the display window corresponding to ActivityB: each of the four positions (upper left, upper right, lower left and lower right of the display screen) combined with each of the three boundary sizes (one third, one fourth and one fifth of the display screen size).
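The calculation of the boundary mBounds for any of these combinations can be generalized as in the following sketch (Java; the spec parsing and the Rect-based result are assumptions used only to illustrate the arithmetic):

import android.graphics.Rect;

public class WindowBoundsCalculator {
    // position: "LEFT_TOP", "RIGHT_TOP", "LEFT_BOTTOM" or "RIGHT_BOTTOM"
    // fraction: 3, 4 or 5 (window width/height = screen width/height divided by fraction)
    public static Rect computeBounds(int screenWidth, int screenHeight, String position, int fraction) {
        int windowWidth = screenWidth / fraction;
        int windowHeight = screenHeight / fraction;
        int left = position.startsWith("RIGHT") ? screenWidth - windowWidth : 0;
        int top = position.endsWith("BOTTOM") ? screenHeight - windowHeight : 0;
        return new Rect(left, top, left + windowWidth, top + windowHeight);
    }
}

// Example: computeBounds(1920, 1080, "RIGHT_TOP", 4) yields a 480 x 270 window in the upper right corner.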
It should be noted that, in the current Android system, the task at the top of the task stack is shown in the foreground. Because ActivityA and ActivityB are in the same task stack, after ActivityB is started its corresponding task is at the top of the task stack, and accordingly the task corresponding to ActivityA is at the second position from the top of the task stack (i.e., under the task corresponding to ActivityB), so the selector can be switched from ActivityA to ActivityB. Further, other operations may be performed on ActivityB by controlling the selector, such as launching other Activities. Illustratively, the second sub-GUI 4310 shown in FIG. 3A, the second sub-GUI 4320 shown in FIG. 3C, or the second sub-GUI 4330 shown in FIG. 3E may be launched.
As described in the above embodiments, the item preview function of the graphical user interface may be implemented through the multi-window effect provided by the Android system. Android 7.0 and later versions support multiple task stacks, so different Activities can be displayed in multiple windows. Systems earlier than Android 7.0, however, do not support multiple task stacks, so different Activities cannot be displayed in multiple windows.
Therefore, in this embodiment, the Activity launch properties and the window display logic of the Android system are modified, so that a multi-window display effect is achieved within a single task stack without being limited by the Android system version.
Illustratively, in systems earlier than Android 7.0, when ActivityB is launched on top of ActivityA, the display area of ActivityB defaults to full screen, so only one full-screen window displaying ActivityB is shown. Therefore, in this embodiment, to achieve a multi-window display effect within one task stack, the display area of ActivityB is limited to the region defined by the specified boundary mBounds while the display area of ActivityA is left unchanged, so that ActivityA and ActivityB correspond to different display windows.
Further, in systems earlier than Android 7.0, because ActivityB is overlaid on ActivityA, the lifecycle of ActivityA enters the Paused state; and when the theme of ActivityB is opaque, the Android system changes ActivityA from the visible state to the invisible state, so that ActivityA can no longer be displayed. Therefore, in this embodiment, the theme of ActivityB is set to be transparent, so that under the current Android system rule ActivityA remains in the visible state and both ActivityA and ActivityB are displayed.
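For orientation only, an application-level approximation of the same effect is sketched below: giving the preview Activity a translucent theme and shrinking its window leaves the first GUI visible behind it (Java; this assumes a theme with windowIsTranslucent declared in the manifest, and it only approximates the framework-layer modification described above):

import android.app.Activity;
import android.os.Bundle;
import android.view.Gravity;

public class PreviewActivity extends Activity { // plays the role of ActivityB
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // One quarter of a 1920 x 1080 screen, anchored to the upper right corner,
        // so the first GUI (ActivityA) remains visible around the small window.
        getWindow().setLayout(1920 / 4, 1080 / 4);
        getWindow().setGravity(Gravity.TOP | Gravity.END);
        // setContentView(...) would then show the content detail information of the selected item.
    }
}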
In addition, in connection with the display device shown in fig. 1C, some components in the display device may perform:
a display for displaying a first GUI comprising a plurality of items and a selector indicating that any one of the items is selected. For example, the first GUI41 including the items 411 to 415 shown in FIG. 2A is displayed, along with the selector 42 indicating that the item 411 is selected.
A user interface for receiving user input. For example, the user interface may receive an instruction from a user to control the selector to move in a right direction in the GUI by pressing a right direction key on the control device to change the position of the selector in the GUI.
A controller for performing:
determining an actual position of the selector in the first GUI based on a user input indicating movement of the selector;
when the dwell time of the selector at the actual position in the first GUI is calculated to exceed a preset threshold, a second GUI associated with the item selected at the actual position is displayed at a preset size smaller than the size of the first GUI while the first GUI is maintained, and the selector is moved onto the second GUI. For example, in the GUI shown in FIG. 2C, the selector 42 is determined to have stopped at the position of the item 413 in the first GUI 41. When the dwell time of the selector 42 at the position of the item 413 is calculated to exceed a preset duration (e.g., 300 ms), the second GUI 43 associated with the item 413 is displayed at a preset size smaller than the size of the first GUI 41 in the upper right corner of the first GUI 41, as shown in FIG. 2D, and the selector is moved onto the item 431 in the second GUI 43.
When the dwell time of the selector at the actual position in the first GUI is calculated not to exceed the preset threshold, the first GUI display is maintained together with the display state of the item selected at the actual position. For example, in the GUI shown in FIG. 2C, the selector 42 is determined to have stopped at the position of the item 413 in the first GUI 41. When the dwell time of the selector 42 at the position of the item 413 is calculated not to exceed the preset duration (e.g., 300 ms), the selector 42 keeps the item 413 selected, as shown in FIG. 2C, for example by enlarging the border of the item 413.
In some embodiments, the controller is further configured to perform:
performing at least one of a moving operation and a zooming operation on the second GUI based on the user input; the moving operation indicates a change in the position of the second GUI on the first GUI, and the zooming operation indicates a change in the size of the second GUI relative to the preset size. For example, in FIG. 2D, when the user presses the channel down key on the control apparatus, the display device may control the second GUI 43 to move leftward by a preset distance in response to the input key instruction, as shown in FIG. 4A. As another example, in FIG. 4A, when the user presses the channel up key on the control apparatus, the display device may control the second GUI 43 to move downward by a preset distance while enlarging it by a preset multiple in response to the input key command, as shown in FIG. 4C.
In some embodiments, the controller is further configured to perform:
closing the second GUI based on the user input and causing the selector to resume selection of the item that was selected in the first GUI before the second GUI was displayed. For example, in FIG. 2D, when the user presses the return key on the control device, the display device may stop displaying the second GUI 43 and move the selector 42 back onto the item 413 in response to the input return instruction, as shown in FIG. 2C.
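A corresponding sketch of the return-key handling in the preview Activity is shown below (Java; because ActivityA and ActivityB share one task stack, finishing ActivityB is enough to bring the first GUI back to the foreground, and the selector restoration is handled by the first GUI itself; the names are assumptions):

import android.app.Activity;

public class PreviewBackHandler extends Activity { // stands in for ActivityB
    @Override
    public void onBackPressed() {
        // Closing the small window: ActivityB leaves the top of the shared task stack,
        // so ActivityA resumes and the selector returns to the previously selected item.
        finish();
    }
}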
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made in the embodiments of the present invention without departing from the spirit or scope of the embodiments of the invention. Thus, if such modifications and variations of the embodiments of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to encompass such modifications and variations.

Claims (9)

1. A display device, comprising:
a display for displaying a first GUI comprising a plurality of items, and a selector indicating that an item is selected, the selector moving its position in the first GUI to select a different one of the items based on a movement instruction input by a user in a preset direction, and receiving a confirmation selection of an item to activate the item;
a controller in communication with the display for performing:
determining, based on a user input indicating movement of the selector, an item to which the selector corresponds at a location in the first GUI, wherein the first GUI corresponds to ActivityA;
after the selector moves to any item in the first GUI, calculating whether the residence time of the selector on the item exceeds a preset threshold value;
if the residence time exceeds a preset threshold value and a next instruction is not received, receiving first request information for starting ActivityB sent by ActivityA, wherein the request information carries information of the item selected by the selector and information for indicating the display of a small window;
displaying, in a small window, a second GUI of ActivityB associated with the correspondingly selected item while the first GUI remains displayed on the display, and the selector jumping directly from the selected item on the first GUI to a first sub-item on the second GUI to effect movement of the selector on the second GUI; wherein the first GUI is still visible but cannot receive a selection by the selector, ActivityA and ActivityB being on the same task stack;
if the residence time does not reach the preset threshold value and the item on the first GUI is confirmed to be selected, receiving second request information for starting the ActivityB sent by the ActivityA, wherein the request information carries information of the item selected by the selector and information for indicating full-screen display;
displaying a second GUI associated with an item corresponding to the ActivityB on the display, the second GUI associated with the selected item being displayed full screen in place of the first GUI;
and if the residence time does not reach the preset threshold value and a next instruction for moving the selector is received, directly moving the selector from the currently selected item to the next item.
2. The display device of claim 1, wherein the second GUI includes content detail information in which the selected item is activated for display.
3. The display device of claim 2, wherein a plurality of items related to the content detail information of the selected item are included in the second GUI, the plurality of items being selectable or activatable by a selector based on a user input moving a position of the selector in the second GUI.
4. The display device of claim 1, further comprising, while maintaining the first GUI display:
changing a display state of the selected item by removing the selector and by distinguishing the selected item from other items to be displayed on the first GUI; or,
on the first GUI, by removing the selector and maintaining a display state of the selected item prior to removal of the selector.
5. The display device of claim 1, wherein the controller is further configured to perform:
performing at least one of a moving operation and a zooming operation on the second GUI based on a user input; the moving operation indicates a position change of the second GUI on the first GUI, and the zooming operation indicates a size change of the second GUI relative to a preset size.
6. The display device of claim 5, wherein the user input is a key value input through a key arranged on a remote controller, the key value corresponding to a moving operation and/or a zooming operation of the second GUI.
7. The display device of claim 5, wherein the controller is further configured to perform:
replacing the first GUI displayed on the display by changing the second GUI from a preset size to a size of the first GUI based on a user input.
8. The display device of claim 1, wherein the controller is further configured to perform:
closing the second GUI based on user input and causing the selector to resume selection of the item that was selected in the first GUI prior to display of the second GUI.
9. The display device of claim 1, wherein the controller is further configured to perform:
when the calculated residence time of the selector at any position in the first GUI does not exceed a preset threshold, maintaining the display state of the selected item at the any position while maintaining the first GUI display.
CN201910258277.7A 2019-01-16 2019-04-01 Display device Active CN109922364B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/126701 WO2020147507A1 (en) 2019-01-16 2019-12-19 Display device and display method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2019100397945 2019-01-16
CN201910039794 2019-01-16

Publications (2)

Publication Number Publication Date
CN109922364A CN109922364A (en) 2019-06-21
CN109922364B true CN109922364B (en) 2022-07-08

Family

ID=66967991

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910258277.7A Active CN109922364B (en) 2019-01-16 2019-04-01 Display device

Country Status (1)

Country Link
CN (1) CN109922364B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020147507A1 (en) * 2019-01-16 2020-07-23 青岛海信电器股份有限公司 Display device and display method
CN111414216A (en) * 2020-03-04 2020-07-14 海信视像科技股份有限公司 Display device and display method of operation guide based on display device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201239730A (en) * 2011-03-24 2012-10-01 Acer Inc Method for customizing user interface and electronic device thereof

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103561344A (en) * 2013-11-06 2014-02-05 玲珑视界科技(北京)有限公司 Interface display method for set-top box
CN103914535A (en) * 2014-03-31 2014-07-09 百度在线网络技术(北京)有限公司 Information acquisition method and device
CN104598109A (en) * 2015-01-08 2015-05-06 天津三星通信技术研究有限公司 Method and equipment for previewing application in portable terminal
CN106303740A (en) * 2015-06-10 2017-01-04 阿里巴巴集团控股有限公司 The desktop navigation system of intelligent television and the implementation method of this system
CN106354372A (en) * 2016-09-08 2017-01-25 珠海市魅族科技有限公司 Information preview method and device

Also Published As

Publication number Publication date
CN109922364A (en) 2019-06-21

Similar Documents

Publication Publication Date Title
CN109618206B (en) Method and display device for presenting user interface
CN111405333A (en) Display apparatus and channel control method
CN111654739A (en) Content display method and display equipment
CN111427643A (en) Display device and display method of operation guide based on display device
CN111182345A (en) Display method and display equipment of control
CN111629249B (en) Method for playing startup picture and display device
CN111726673B (en) Channel switching method and display device
CN111654732A (en) Advertisement playing method and display device
CN111669634A (en) Video file preview method and display equipment
CN111372133A (en) Method for reserving upgrading and display device
CN111045557A (en) Moving method of focus object and display device
CN109922364B (en) Display device
CN111901653B (en) Configuration method of external sound equipment of display equipment and display equipment
CN113115092B (en) Display device and detail page display method
CN111857363A (en) Input method interaction method and display equipment
CN112040308A (en) HDMI channel switching method and display device
CN112004126A (en) Search result display method and display device
CN111857502A (en) Image display method and display equipment
CN111526401A (en) Video playing control method and display equipment
CN112040285B (en) Interface display method and display equipment
CN113115093B (en) Display device and detail page display method
CN111726674B (en) HbbTV application starting method and display equipment
WO2020147507A1 (en) Display device and display method
CN111614995A (en) Menu display method and display equipment
CN111757160A (en) Method for starting sports mode and display equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 266100 No. 151, Zhuzhou Road, Laoshan District, Shandong, Qingdao

Applicant after: Hisense Video Technology Co.,Ltd.

Address before: 266100 No. 151, Zhuzhou Road, Laoshan District, Shandong, Qingdao

Applicant before: QINGDAO HISENSE ELECTRONICS Co.,Ltd.

CB02 Change of applicant information
GR01 Patent grant