CN109618206B - Method and display device for presenting user interface - Google Patents

Method and display device for presenting user interface

Info

Publication number
CN109618206B
CN109618206B
Authority
CN
China
Prior art keywords
display area
view display
displayed
screen
items
Prior art date
Legal status
Active
Application number
CN201910070160.6A
Other languages
Chinese (zh)
Other versions
CN109618206A (en)
Inventor
王大勇
黑建业
朱铄
于文钦
赵宾
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Priority to CN201910070160.6A
Publication of CN109618206A
Application granted
Publication of CN109618206B

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44218Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • H04N21/4858End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In an embodiment, the display device comprises a display configured to display a user interface, wherein the user interface comprises a plurality of view display areas, each view display area having one or more different items arranged therein, and a selector indicating that an item is selected, wherein the position of the selector can be moved by user input to change the selection among the different items, wherein the size and position and/or hierarchical arrangement of the view display areas are modified in response to the user input, and wherein the selector is moved to the selected item.

Description

Method and display device for presenting user interface
Technical Field
Embodiments of the present application relate to display technology, and more particularly, to a display apparatus and a method of presenting a user interface.
Background
The operability of the user interface displayed on the smart television influences the experience of the user.
However, the content of the user interface displayed by a smart television is very rich; usually, all of the interface information cannot be presented on a single screen, and multiple screens are required to display the entire user interface. At present, a scrolling mode is mostly adopted to update the user interface, but during scrolling and browsing, the pages that have been scrolled away can no longer provide information to the user. In particular, important dynamic video information, such as recommended video resources, is very important content for users or operators. It is therefore desirable to provide a novel user interface display method to improve the user experience.
Disclosure of Invention
The exemplary embodiments of the present application provide a display device and a method for presenting a user interface, which can improve the user experience of a user operating the display device.
According to an aspect of exemplary embodiments, there is provided a display apparatus including: a display configured to display a user interface, wherein the user interface includes a plurality of view display areas, each view display area includes a layout of one or more different items, and a selector indicating that an item is selected, wherein a position of the selector can be moved by a user input to cause a different item to be selected; and a controller in communication with the display, the controller being configured to present the user interface by: receiving a user input and determining the type of the user input event; initiating, based on receiving a user input event indicating movement of the selector, detection of the position of the selector in the user interface displayed on the screen; and upon determining, based on the user input event and the detection, that the selector is to move from the user interface displayed on the screen to an area outside thereof, modifying, in response to the user input, the size and position and/or hierarchical layout of at least a portion of the view display areas displayed on the screen, updating, into the user interface displayed on the screen, items from the area of the user interface not displayed on the screen, and causing the selector to move to the updated selected item in the screen.
In some exemplary embodiments, the plurality of view display areas include a scalable view display area located at an upper portion of the screen, in which at least some of the items in the scalable view display area are changeable in size and/or number on the screen, and a scroll view display area not overlapping with the scalable view display area, in which the number of items displayed in the screen is scrollably updated.
In some exemplary embodiments, the plurality of items in the view display area are arranged in rows and columns.
In some exemplary embodiments, determining, based on the user input event and the detection, that the selector is moved from the user interface displayed on the screen to an area outside thereof specifically includes: determining, based on the user input event and the detection, that the selector is currently located on an item at a boundary position of the user interface, and that the received user input indicates that the selector is to select an item outside the boundary of the user interface displayed on the screen.
In some exemplary embodiments, the plurality of view display areas includes at least a first view display area and a second view display area, and at least one item in the first view display area is displayed as a video clip.
In some exemplary embodiments, modifying the size and position of the view display area further comprises: the size of the first view display area in the moving direction of the selector is reduced to a first value, the second view display area is moved in the moving direction of the selector, the size of the second view display area in the screen and the item list are kept unchanged, and meanwhile, a third view display area is displayed on the screen.
In some exemplary embodiments, modifying the size and position of the view display area further comprises: the size of the first view display area in the moving direction of the selector is reduced to a first value, the size of the second view display area in the moving direction of the selector is increased to a second value, and one or more items are added to the second view display area.
In some exemplary embodiments, reducing the size of the first view display area to a first value in the direction of movement of the selector further comprises: updating the number of items contained in the first view display area, the size of at least some of the items, and/or the number of items laid out in at least some of the rows.
In some exemplary embodiments, updating the number of items contained in the first view display area, the size of at least some of the items, further comprises: reducing the size of a portion of the items in the first view display area, leaving the size of at least one item displayed as a video clip unchanged, and adding at least one item of the same size as the reduced item.
In some exemplary embodiments, modifying the size and location and/or the hierarchical layout of the view display area further comprises:
the first view display area is not displayed in the screen, the second view display area scrolls to the position of the first view display area in the direction opposite to the movement direction of the selector, the size of the second view display area in the screen is enlarged, at least one item originally displayed as a video clip in the first view display area is displayed on the screen in a floating mode, and the updated item is displayed on the upper layer of the second view display area.
Compared with the prior art, the technical solutions proposed in the exemplary embodiments of the present application have the following beneficial effects:
in at least some embodiments provided herein, a method of presenting a user interface is provided.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment;
fig. 2 is a block diagram exemplarily showing a hardware configuration of a display device 200 according to an embodiment;
fig. 3 is a block diagram exemplarily showing a hardware configuration of the control apparatus 100 according to the embodiment;
fig. 4 is a diagram exemplarily showing a functional configuration of the display device 200 according to the embodiment;
fig. 5a schematically shows a software configuration in the display device 200 according to an embodiment;
fig. 5b schematically shows a configuration of an application in the display device 200 according to an embodiment;
fig. 6 is a schematic diagram illustrating a user interface in the display device 200 according to an embodiment;
fig. 7a schematically illustrates a flow chart of a method of presenting a user interface in a display device 200 according to an embodiment;
fig. 7b schematically shows a flow chart of yet another method of presenting a user interface in a display device 200 according to an embodiment;
fig. 8a-8e schematically illustrate the operation between the control device 100 and the display device 200 according to an embodiment;
fig. 9 is a schematic flow chart illustrating a method according to an embodiment of the invention when the control device 100 is operated on a user interface in the display apparatus 200;
another schematic operation diagram between the control apparatus 100 and the display device 200 according to an embodiment is exemplarily shown in figs. 10a-10e;
fig. 11 is a schematic flow chart illustrating another method according to an embodiment of the present invention when the control apparatus 100 operates on the user interface of the display device 200;
fig. 12a-12e schematically illustrate yet another operation between the control device 100 and the display apparatus 200 according to an embodiment;
fig. 13a-b schematically show an application scenario of a user interface of a display device 200 according to an embodiment;
fig. 14 is a schematic diagram illustrating another application scenario of the user interface of the display device 200 according to an embodiment.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
All other embodiments, which can be derived by a person skilled in the art from the exemplary embodiments shown in the present application without inventive effort, shall fall within the scope of protection of the present application. Moreover, while the disclosure herein has been presented in terms of one or more exemplary examples, it is to be understood that each aspect of the disclosure can be utilized independently and separately from other aspects of the disclosure to provide a complete disclosure.
It should be understood that the terms "first," "second," "third," and the like in the description and claims of the present application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that, for example, the embodiments of the application described herein can be implemented in sequences other than those illustrated or otherwise described herein.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module," as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in this application refers to a component of an electronic device (such as the display device disclosed in this application) that is typically wirelessly controllable over a relatively short range of distances. Typically using infrared and/or Radio Frequency (RF) signals and/or bluetooth to connect with the electronic device, and may also include WiFi, wireless USB, bluetooth, motion sensor, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in the common remote control device with the user interface in the touch screen.
The term "gesture" as used in this application refers to a user's behavior through a change in hand shape or an action such as hand motion to convey a desired idea, action, purpose, or result.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display device 200 through the mobile terminal 300 and the control apparatus 100.
The control apparatus 100 may be, for example, a remote controller, which may control the display device 200 in a wireless or other wired manner, including infrared protocol communication, Bluetooth protocol communication, and other short-distance communication manners. The user may input a user command through a key on the remote controller, voice input, control panel input, etc., to control the display apparatus 200. For example, the user can input a corresponding control command through a volume up/down key, a channel control key, up/down/left/right movement keys, a voice input key, a menu key, a power on/off key, etc. on the remote controller, so as to control the display device 200.
In some embodiments, mobile terminals, tablets, computers, laptops, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device. The application, through configuration, may provide the user with various controls in an intuitive User Interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 300 and the display device 200 may each install a software application, and implement connection and communication through a network communication protocol, so as to achieve one-to-one control operation and data communication. For instance, the mobile terminal 300 and the display device 200 can establish a control instruction protocol, the remote-control keyboard can be synchronized to the mobile terminal 300, and the display device 200 can be controlled by operating the user interface on the mobile terminal 300. The audio and video content displayed on the mobile terminal 300 can also be transmitted to the display device 200, so as to realize a synchronous display function.
As also shown in fig. 1, the display apparatus 200 also performs data communication with the server 400 through various communication means. The display device 200 may be allowed to be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 400 may provide various contents and interactions to the display apparatus 200. Illustratively, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information, as well as Electronic Program Guide (EPG) interactions. The servers 400 may be a group or groups of servers, and may be one or more types of servers. Other web service contents such as video on demand and advertisement services are provided through the server 400.
The display device 200 may be a liquid crystal display, an OLED display, a projection display device. The particular display device type, size, resolution, etc. are not limiting, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
The display apparatus 200 may additionally provide an intelligent network tv function that provides a computer support function in addition to the broadcast receiving tv function. Examples include a web tv, a smart tv, an Internet Protocol Tv (IPTV), and the like.
A hardware configuration block diagram of a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 2. As shown in fig. 2, the display device 200 includes a controller 210, a tuning demodulator 220, a communication interface 230, a detector 240, an input/output interface 250, a video processor 260-1, an audio processor 260-2, a display 280, an audio output 270, a memory 290, a power supply, and an infrared receiver.
The display 280 is configured to receive the image signal from the video processor 260-1 and to display video content and images as well as components of the menu manipulation interface. The display 280 includes a display screen assembly for presenting the picture, and a driving assembly for driving the display of images. The video content displayed may come from broadcast television content, or from various broadcast signals that may be received via a wired or wireless communication protocol. Alternatively, various image content received from a network server via a network communication protocol may be displayed.
Meanwhile, the display 280 simultaneously displays a user manipulation UI interface generated in the display apparatus 200 and used to control the display apparatus 200.
The display 280 also includes a driving component for driving the display, depending on the type of the display 280. Alternatively, in case the display 280 is a projection display, it may further include a projection device and a projection screen.
The communication interface 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communication interface 230 may be a Wifi chip 231, a bluetooth communication protocol chip 232, a wired ethernet communication protocol chip 233, or other network communication protocol chips or near field communication protocol chips, and an infrared receiver (not shown).
The display apparatus 200 may establish transmission and reception of control signals and data signals with an external control apparatus or a content providing apparatus through the communication interface 230. The infrared receiver is an interface device for receiving infrared control signals from the control apparatus 100 (e.g., an infrared remote controller).
The detector 240 is a component used by the display device 200 to collect signals from the external environment or to interact with the outside. The detector 240 includes a light receiver 242, a sensor for collecting the intensity of ambient light, so that display parameters can be adaptively changed according to the collected ambient light.
An image acquisition device 241, such as a camera, may be used to capture the external environment scene, to collect attributes of the user or gestures used for interaction with the user, to adaptively change display parameters, and to recognize user gestures, so as to implement an interaction function with the user.
In some other exemplary embodiments, the detector 240 may further include a temperature sensor or the like; for example, by sensing the ambient temperature, the display device 200 may adaptively adjust the display color temperature of the image. For example, the display apparatus 200 may be adjusted to display a cooler tone when the ambient temperature is high, or a warmer tone when the ambient temperature is low.
In other exemplary embodiments, the detector 240 may further include a sound collector, such as a microphone, which may be used to receive the user's voice, including a voice signal of a control instruction from the user for controlling the display device 200, or to collect ambient sound for identifying the type of ambient scene, so that the display device 200 can adapt to the ambient noise.
The input/output interface 250 controls data transmission between the controller 210 of the display device 200 and other external devices, such as receiving video and audio signals or command instructions from an external device.
Input/output interface 250 may include, but is not limited to, the following: any one or more of high definition multimedia interface HDMI interface 251, analog or data high definition component input interface 253, composite video input interface 252, USB input interface 254, RGB ports (not shown in the figures), etc.
In some other exemplary embodiments, the input/output interface 250 may also form a composite input/output interface with the above-mentioned plurality of interfaces.
The tuning demodulator 220 receives the broadcast television signals in a wired or wireless receiving manner, may perform modulation and demodulation processing such as amplification, frequency mixing, resonance, and the like, and demodulates the television audio and video signals carried in the television channel frequency selected by the user and the EPG data signals from a plurality of wireless or wired broadcast television signals.
The tuning demodulator 220 responds to the television signal frequency selected by the user and the television signal carried by that frequency, as selected by the user and under the control of the controller 210.
The tuner-demodulator 220 may receive signals in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcast, cable broadcast, satellite broadcast, or internet broadcast signals, etc.; and according to different modulation types, the modulation mode can be digital modulation or analog modulation. Depending on the type of television signal received, both analog and digital signals are possible.
In other exemplary embodiments, the tuner/demodulator 220 may be in an external device, such as an external set-top box. In this way, the set-top box outputs television audio/video signals after modulation and demodulation, and the television audio/video signals are input into the display device 200 through the input/output interface 250.
The video processor 260-1 is configured to receive an external video signal, and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, image synthesis, and the like according to a standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played on the direct display device 200.
Illustratively, the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio/video data stream; for example, if an MPEG-2 stream is input, the demultiplexing module demultiplexes it into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
And the image synthesis module is used for superimposing and mixing the GUI signal, input by the user or generated by the graphics generator, with the scaled video image, so as to generate an image signal for display.
And the frame rate conversion module is configured to convert the frame rate of the input video, for example converting a 60 Hz frame rate into a 120 Hz or 240 Hz frame rate, commonly by using a frame interpolation method.
And the display formatting module is used for converting the frame-rate-converted video output signal into a signal conforming to the display format, such as an RGB data signal.
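Purely as an illustrative aid (not actual firmware or the API of any real device), the stage ordering of the video processor 260-1 described above can be sketched as follows; the enum and class names are hypothetical.

```java
// Illustrative sketch of the processing order described for video processor 260-1; names are hypothetical.
enum VideoStage { DEMULTIPLEX, DECODE_AND_SCALE, COMPOSE_WITH_GUI, FRAME_RATE_CONVERT, DISPLAY_FORMAT }

class VideoPipelineSketch {
    public static void main(String[] args) {
        // Each stage is applied in the order listed in the description above.
        for (VideoStage stage : VideoStage.values()) {
            System.out.println("apply " + stage);
        }
    }
}
```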
The audio processor 260-2 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, amplification processing, and the like to obtain an audio signal that can be played in the speaker.
In other exemplary embodiments, video processor 260-1 may comprise one or more chips. The audio processor 260-2 may also comprise one or more chips.
And, in other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or may be integrated together with the controller 210 in one or more chips.
The audio output 270 receives the sound signal output by the audio processor 260-2 under the control of the controller 210. It includes the speaker 272 carried by the display device 200 itself, and an external sound output terminal 274 that can output to a sound-producing device of an external device, such as an external sound interface or an earphone interface.
The power supply provides power supply support for the display device 200 from the power input from the external power source under the control of the controller 210. The power supply may include a built-in power supply circuit installed inside the display device 200, or may be a power supply interface installed outside the display device 200 to provide an external power supply in the display device 200.
A user input interface for receiving an input signal of a user and then transmitting the received user input signal to the controller 210. The user input signal may be a remote controller signal received through an infrared receiver, and various user control signals may be received through the network communication module.
For example, the user inputs a user command through the remote controller 100 or the mobile terminal 300; the user input interface passes the input to the controller 210, and the display device 200 then responds to the user input.
In some embodiments, a user may enter a user command on a Graphical User Interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
The controller 210 controls the operation of the display apparatus 200 and responds to the user's operation through various software control programs stored in the memory 290.
As shown in fig. 2, the controller 210 includes a RAM 213 and a ROM 214, a graphics processor 216, a CPU processor 212, a communication interface 218 (such as a first interface 218-1 through an nth interface 218-n), and a communication bus. The RAM 213, the ROM 214, the graphics processor 216, the CPU processor 212, and the communication interface 218 are connected via the bus.
The ROM 214 is used for storing instructions for various system boots. When the display apparatus 200 is powered on upon receipt of a power-on signal, the CPU processor 212 executes the system boot instructions in the ROM, copies the operating system stored in the memory 290 to the RAM 213, and starts running the operating system. After the start of the operating system is completed, the CPU processor 212 copies the various application programs in the memory 290 to the RAM 213, and then starts running the various application programs.
A graphics processor 216 is used for generating various graphics objects, such as icons, operation menus, and graphics for displaying user input instructions. The graphics processor 216 includes an arithmetic unit, which performs operations by receiving various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer, which generates the various objects based on the arithmetic unit and displays the rendered result on the display 280.
A CPU processor 212 for executing operating system and application program instructions stored in memory 290. And executing various application programs, data and contents according to various interactive instructions received from the outside so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 212 may include a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor is used for performing some operations of the display apparatus 200 in a pre-power-up mode and/or for displaying a picture in the normal mode. The one or more sub-processors are used for operations in a standby mode and the like.
The controller 210 may control the overall operation of the display apparatus 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
Wherein the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
The memory 290 includes a memory for storing various software modules for driving the display device 200. Such as: various software modules stored in memory 290, including: the system comprises a basic module, a detection module, a communication module, a display control module, a browser module, various service modules and the like.
The basic module is a bottom-layer software module for signal communication among the various hardware components in the display device 200 and for sending processing and control signals to the upper-layer modules. The detection module is a management module used for collecting various information from various sensors or user input interfaces, and for performing digital-to-analog conversion as well as analysis and management.
For example, the voice recognition module includes a voice analysis module and a voice instruction database module. The display control module is a module for controlling the display 280 to display image content, and may be used to play multimedia image content, UI interfaces, and other information. The communication module is used for control and data communication with external devices. The browser module is used for performing data communication with browsing servers. The service module is used for providing various services and includes various application programs.
Meanwhile, the memory 290 is also used to store visual effect maps and the like for receiving external data and user data, images of respective items in various user interfaces, and a focus object.
A block diagram of the configuration of the control apparatus 100 according to an exemplary embodiment is exemplarily shown in fig. 3. As shown in fig. 3, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control device 100 is configured to control the display device 200 and may receive an input operation instruction of a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200. Such as: the user responds to the channel up and down operation by operating the channel up and down keys on the control device 100.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications that control the display apparatus 200 according to user demands.
In some embodiments, as shown in fig. 1, the mobile terminal 300 or another intelligent electronic device may perform a function similar to that of the control apparatus 100 after installing an application that manipulates the display device 200. For example, by installing such an application, the user may use the various function keys or virtual buttons of the graphical user interface available on the mobile terminal 300 or other intelligent electronic device to implement the functions of the physical keys of the control apparatus 100.
The controller 110 includes a processor 112 and RAM113 and ROM114, a communication interface 218, and a communication bus. The controller 110 is used to control the operation of the control device 100, as well as the internal components for communication and coordination and external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communication interface 130 may include at least one of a WiFi chip, a bluetooth module, an NFC module, and other near field communication modules.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, and other input interfaces. Such as: the user can realize a user instruction input function through actions such as voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display apparatus 200. In some embodiments, the interface may be an infrared interface or a radio frequency interface. For example, when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol and then sent to the display device 200 through the infrared sending module. As another example, when the radio frequency signal interface is used, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmitting terminal.
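The conversion from a user key press to a transmitted control frame described above can be sketched roughly as follows. This is a hypothetical illustration only: the concrete infrared and radio frequency protocols are not specified in the text, so the encoders and frame bytes below are placeholders.

```java
// Hypothetical sketch of the output-interface behavior described above; the frame formats are placeholders.
class RemoteOutputSketch {
    interface Encoder { byte[] encode(int keyCode); }

    // Placeholder "protocols": a real remote would follow a concrete IR or RF control protocol.
    static final Encoder INFRARED = keyCode -> new byte[] { (byte) 0xA0, (byte) keyCode };
    static final Encoder RADIO    = keyCode -> new byte[] { (byte) 0xB0, (byte) keyCode };

    /** Converts a user key press into a control frame and hands it to a (stand-in) transmitter. */
    static void send(Encoder encoder, int keyCode) {
        byte[] frame = encoder.encode(keyCode);
        System.out.println("transmit frame of " + frame.length + " bytes");  // stand-in for the IR/RF transmitter
    }

    public static void main(String[] args) {
        send(INFRARED, 0x19);  // illustrative key code for a "down" key press
    }
}
```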
In some embodiments, the control device 100 includes at least one of the communication interface 130 and an output interface. The control device 100 is provided with the communication interface 130, such as a WiFi, Bluetooth, or NFC module, so that the user input command may be encoded according to the WiFi protocol, the Bluetooth protocol, or the NFC protocol and transmitted to the display device 200.
A memory 190 is used for storing various operating programs, data and applications for driving and controlling the control apparatus 100, under the control of the controller 110. The memory 190 may store various control signal instructions input by the user.
And a power supply 180 is used for providing operational power support to the various elements of the control device 100 under the control of the controller 110, and may include a battery and associated control circuitry.
Fig. 4 is a diagram schematically illustrating a functional configuration of the display device 200 according to an exemplary embodiment. As shown in fig. 4, the memory 290 is used to store an operating system, an application program, contents, user data, and the like, and performs system operations for driving the display device 200 and various operations in response to a user under the control of the controller 210. The memory 290 may include volatile and/or nonvolatile memory.
The memory 290 is specifically configured to store an operating program for driving the controller 210 in the display device 200, and to store various application programs installed in the display device 200, various application programs downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used to store system software such as an OS kernel, middleware, and applications, and to store input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers and related data such as the audio/video processors 260-1 and 260-2, the display 280, the communication interface 230, the tuning demodulator 220, the input/output interface of the detector 240, and the like.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290, for example, includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, and other applications 2912, a browser module, and the like. The controller 210 performs functions such as: a broadcast television signal reception demodulation function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal reception function, an electric power control function, a software control platform supporting various functions, a browser function, and the like.
A block diagram of a configuration of a software system in a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 5 a.
As shown in fig. 5a, an operating system 2911, including executing operating software for handling various basic system services and for performing hardware related tasks, acts as an intermediary for data processing performed between application programs and hardware components. In some embodiments, portions of the operating system kernel may contain a series of software to manage the display device hardware resources and provide services to other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display screen, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application program to achieve accessibility and operability of the application program for displaying content.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
The user interface module 2911-3 is configured to provide an object for displaying a user interface, so that each application program can access the object, and user operability can be achieved.
Control applications 2911-4 for controllable process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911 or within the application program 2912, and in some embodiments partly within the operating system 2911 and partly within the application program 2912. It is configured to listen for various user input events and, according to the recognition of the various types of events or sub-events, to invoke handlers that perform one or more sets of predefined operations.
The event monitoring module 2914-1 is configured to monitor an event or a sub-event input by the user input interface.
The event recognition module 2914-2 is configured to hold definitions of the various types of events for the various user input interfaces, to recognize the various events or sub-events, and to transmit them to the processes that execute the one or more corresponding sets of handlers.
An event or sub-event refers to an input detected by one or more sensors in the display device 200, or an input from an external control apparatus (e.g., the control apparatus 100), such as various sub-events of voice input, gesture input via gesture recognition, or sub-events of remote-control key command input from the control apparatus. Illustratively, the one or more sub-events from the remote control take various forms, including but not limited to one or a combination of pressing the up/down/left/right keys, the OK key, long key presses, and the like, as well as non-physical key operations such as move, hold, and release.
The interface layout manager 2913, directly or indirectly receiving the input events or sub-events from the event transmission system 2914, monitors the input events or sub-events, and updates the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, and the size, position, and level of the container, and other various execution operations related to the layout of the interface.
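To make the flow from the event transmission system 2914 to the interface layout manager 2913 concrete, a minimal sketch follows. It assumes a simple key-event model; all class and method names are hypothetical and are not part of the patent or of any real framework.

```java
// Illustrative sketch of the event flow described above (monitoring -> recognition -> layout manager).
import java.util.function.Consumer;

class EventFlowSketch {
    enum KeyEvent { UP, DOWN, LEFT, RIGHT, OK }

    /** Stand-in for the interface layout manager 2913: reacts to recognized events. */
    static class InterfaceLayoutManager implements Consumer<KeyEvent> {
        @Override
        public void accept(KeyEvent event) {
            // Here the container size/position/level and the item layout would be updated.
            System.out.println("relayout user interface for " + event);
        }
    }

    /** Stand-in for the event transmission system 2914: listens, recognizes, dispatches. */
    static class EventTransmissionSystem {
        private final Consumer<KeyEvent> layoutManager;
        EventTransmissionSystem(Consumer<KeyEvent> layoutManager) { this.layoutManager = layoutManager; }

        void onRawInput(String raw) {
            KeyEvent recognized = KeyEvent.valueOf(raw);  // simplified "event recognition"
            layoutManager.accept(recognized);             // hand the event to the layout manager
        }
    }

    public static void main(String[] args) {
        new EventTransmissionSystem(new InterfaceLayoutManager()).onRawInput("DOWN");
    }
}
```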
As shown in fig. 5b, the application layer 2912 contains various applications that may also be executed at the display device 200. The application may include, but is not limited to, one or more applications such as: live television applications, video-on-demand applications, media center applications, application centers, gaming applications, and the like.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
A video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides a video display from a storage source. For example, video on demand may come from the server side of cloud storage, or from local hard disk storage containing stored video programs.
The media center application program can provide various applications for playing multimedia content. For example, a media center may provide services other than live television or video on demand, allowing the user to access various images or audio through the media center application.
The application program center can provide and store various application programs. The application may be a game, an application, or some other application associated with a computer system or other device that may be run on the smart television. The application center may obtain these applications from different sources, store them in local storage, and then be operable on the display device 200.
A schematic diagram of a user interface in a display device 200 according to an exemplary embodiment is illustrated in fig. 6. As shown in FIG. 6, the user interface includes a plurality of view display areas, illustratively a first view display area 201 and a second view display area 202, each of which includes a layout of one or more different items. And a selector in the user interface indicating that any one of the items is selected, the position of the selector being movable by user input to change the selection of a different item.
It should be noted that the plurality of view display areas may have visible or invisible boundaries. For example, different view display areas may be distinguished by different background colors, by visible marks such as boundary lines, or by invisible boundaries. It is also possible that there is no visible or invisible boundary at all, and only the associated items in a certain area of the screen exhibit the same change properties in size and/or arrangement; that certain area is then regarded as bounded by the same view partition. For example, the items in the view display area 201 are simultaneously zoomed in or out, while the change in the view display area 202 is different.
In some embodiments, the first view display area 201 is a scalable view display area. "Scalable" may mean that the first view display area 201 is scalable in size or proportion on the screen, or that the items in the first view display area 201 are scalable in size or proportion on the screen. The second view display area 202 is a scroll view display area, in which the number of items displayed in the screen can be scrolled and updated by a user input.
"item" refers to a visual object displayed in each view display area of the user interface in the display device 200 to represent corresponding content such as icons, thumbnails, video clips, and the like. For example: the items may represent movies, image content or video clips of a television show, audio content of music, applications, or other user access content history information.
In some embodiments, an "item" may display an image thumbnail. Such as: when the item is a movie or a tv show, the item may be displayed as a poster of the movie or tv show. If the item is music, a poster of a music album may be displayed. Such as an icon for the application when the item is an application, or a screenshot of the content that captures the application when it was most recently executed. If the item is the user access history, the content screenshot in the latest execution process can be displayed. The "item" may be displayed as a video clip. Such as: the item is a video clip dynamic of a trailer of a television or a television show.
Further, the item may represent an interface or a collection of interfaces on which the display device 200 is connected to an external device, or may represent a name of an external device connected to the display device, or the like. Such as: a signal source input interface set, or an HDMI interface, a USB interface, a PC terminal interface, etc.
Illustratively, the items are shown as 2011-2015 in the first view display area 201 and as resources 1-9 in the second view display area 202 in fig. 7a. In some embodiments, each item may include text content and/or an image for displaying a thumbnail associated with the text content, or a video clip associated with the text. In other embodiments, each item may be text or an icon of an application.
The items may or may not be the same size. In some implementation examples, the size of the items may be varied.
A "selector" is used to indicate where any item has been selected, such as a cursor or a focus object. The cursor movement on the display device 200 is controlled to select or control an item according to a user input through the control device 100. The control item may be selected by causing movement of the focus object displayed in the display apparatus 200 according to an input of a user through the control apparatus 100, and one or more items thereof may be selected or controlled. Such as: the user may select and control items by controlling the movement of the focus object between items through the direction keys on the control device 100.
The focus object refers to an object that moves between items according to user input. Illustratively, the position of the focus object 40 is identified by drawing a thick line along the item edge, as in fig. 7a. In other embodiments, the form of the focus is not limited to this example; it may be any tangible or intangible form recognizable by the user, such as a cursor, a 3D deformation of the item, or a change in the border lines, size, color, transparency, outline and/or font of the text or image of the focused item.
In some embodiments, different content or links are associated with each item in the first view display area 201 and the second view display area 202. Illustratively, each item in the first view display area 201 is a thumbnail of a poster or a video clip, and text and/or icons of various application icons are displayed in the second view display area 202.
The event transmission system 2914 monitors user input, and for each predefined event or sub-event that it detects, directly or indirectly provides an identification of the event or sub-event to the interface layout manager 2913.
The interface layout management module 2913 is configured to monitor the state of the user interface (including the position and/or size of the view partition, the item, the focus or the cursor object, the change process, and the like), and according to the event or the sub-event, may perform modification on the layout of the size, position, hierarchy, and the like of each view display area, and/or adjust or modify the layout of the size, position, number, type, content, and the like of each type of item layout of each view display area. In some embodiments, the layout is modified and adjusted, including displaying or not displaying the view sections or the content of the items in the view sections on the screen.
In other embodiments, the user interface may include one or more view display areas, the number of view display areas on the display screen being arranged according to the amount of different classified content to be displayed. As in fig. 12b, a third view display area 203 is also included.
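The user interface structure described above (view display areas containing items, with a selector/focus object) can be captured in a minimal data-model sketch. The classes below are purely illustrative assumptions, not part of the patent or of any real UI framework.

```java
// Minimal illustrative data model for the user interface described above; all names are hypothetical.
import java.util.Arrays;
import java.util.List;

class UserInterfaceModelSketch {
    static class Item { final String title; Item(String title) { this.title = title; } }

    static class ViewDisplayArea {
        final String name;
        final List<Item> items;
        boolean scalable;    // e.g. the first view display area 201
        boolean scrollable;  // e.g. the second view display area 202
        ViewDisplayArea(String name, List<Item> items) { this.name = name; this.items = items; }
    }

    /** The selector (cursor or focus object) points at one item in one view display area. */
    static class Selector {
        ViewDisplayArea area;
        int index;
        void moveTo(ViewDisplayArea area, int index) { this.area = area; this.index = index; }
    }

    public static void main(String[] args) {
        ViewDisplayArea first = new ViewDisplayArea("201",
                Arrays.asList(new Item("item 2011 (video clip)"), new Item("item 2012 (poster)")));
        first.scalable = true;
        ViewDisplayArea second = new ViewDisplayArea("202",
                Arrays.asList(new Item("resource 1"), new Item("resource 2")));
        second.scrollable = true;

        Selector focus = new Selector();
        focus.moveTo(second, 0);  // focus object initially on "resource 1"
        System.out.println("focus on area " + focus.area.name + ", item " + focus.area.items.get(focus.index).title);
    }
}
```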
A flow diagram of a method of presenting a user interface in a display device 200 according to an embodiment is illustrated in fig. 7 a.
In implementation 710, a user input is received, and the type of the user input event is determined. The controller of the display apparatus 200 is configured to monitor user input events, for example whether a monitored key input is an up/down key command.
In implementation 730, based on receiving a user input event indicating movement of the selector, detection of the position of the selector in the user interface displayed on the screen is initiated. If the monitored user input event is an up/down key input, the position of the selector in the user interface is detected. Further, it is determined whether the selector is located on an item at an edge position of the user interface displayed on the screen. If it is, for example when the selector is on the last row of items on the screen and the user presses the down key, this indicates that the user intends to select an item in the part of the user interface not displayed on the screen, and the user input is responded to in implementation 750.
At 750, upon determining, based on the user input event and the detection, that the selector would move from the user interface displayed on the screen to an area outside it, then, in response to the user input, the size and position and/or hierarchical layout of at least part of the view display areas displayed on the screen are modified, items from the off-screen portion of the user interface are updated into the user interface displayed on the screen, and the selector is moved to the selected item newly shown on the screen.
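One compact way to read the 710/730/750 sequence is as a function from the current on-screen grid and a key press to a new grid state: if the selector is on the bottom row and more rows wait off screen, the rows are scrolled in rather than letting the selector leave the screen. The Kotlin sketch below is a minimal model under those assumptions; its data structures are illustrative only.

enum class Key { UP, DOWN, LEFT, RIGHT, OK }

data class GridState(
    val visibleRows: List<List<String>>,   // rows of item ids currently on screen
    val offScreenRows: List<List<String>>, // rows below the screen, not yet displayed
    val selectorRow: Int                   // row index of the selector within visibleRows
)

fun onKey(state: GridState, key: Key): GridState {
    if (key != Key.DOWN) return state                                   // 710: only selector movement is handled here
    val atBottomEdge = state.selectorRow == state.visibleRows.lastIndex // 730: position detection
    return when {
        !atBottomEdge ->
            state.copy(selectorRow = state.selectorRow + 1)             // normal move inside the screen
        state.offScreenRows.isEmpty() -> state                          // nothing more to show
        else -> state.copy(                                             // 750: update visible items, keep the selector on screen
            visibleRows = state.visibleRows.drop(1) + listOf(state.offScreenRows.first()),
            offScreenRows = state.offScreenRows.drop(1)
        )
    }
}

fun main() {
    var s = GridState(
        visibleRows = listOf(listOf("1", "2", "3"), listOf("4", "5", "6"), listOf("7", "8", "9")),
        offScreenRows = listOf(listOf("10", "11", "12")),
        selectorRow = 2
    )
    s = onKey(s, Key.DOWN)
    println(s.visibleRows) // rows 4 to 12 are now visible; the selector stays on the last row
}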
A further method flow diagram for presenting a user interface in a display device 200 according to an embodiment is illustrated in fig. 7 b.
At 720, a user input is received, the type of the user input event is determined, and it is determined whether the selector is located in a scroll view display area on the screen.
At 740, based on a direction-selection user input event, and while the selector is in the scroll view display area on the screen, detection of the position of the selector within the scroll view display area is initiated.
At 760, upon determining that the selector would move from the scroll view display area to the area outside the screen, the size and position and/or hierarchical layout of at least part of the view display areas displayed on the screen are modified in response to the user input, items not yet displayed on the screen are updated into the scroll view display area, and the selector is moved to the selected item in the updated display on the screen.
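The scroll view display area of the 720/740/760 variant can be treated as a fixed-size window over a longer item list, scrolled one line when the selector would otherwise leave it. The following Kotlin sketch models only that idea; the class and its fields are assumed names, not the embodiment's implementation.

class ScrollViewArea(private val allRows: List<List<String>>, private val windowSize: Int) {
    private var firstVisible = 0   // index of the first row shown on screen
    var selectorLine = 0           // selector position relative to the visible window
        private set

    fun visibleRows(): List<List<String>> =
        allRows.subList(firstVisible, minOf(firstVisible + windowSize, allRows.size))

    // Move the selector one line down; scroll the window when it would leave the screen.
    fun moveDown() {
        if (selectorLine < visibleRows().lastIndex) {
            selectorLine++                                  // move within the window
        } else if (firstVisible + windowSize < allRows.size) {
            firstVisible++                                  // 760: scroll one line, selector stays on screen
        }                                                   // else: no more content, nothing happens
    }
}

fun main() {
    val rows = (1..14).chunked(5).map { row -> row.map { "resource $it" } }
    val area = ScrollViewArea(rows, windowSize = 2)
    repeat(3) { area.moveDown() }
    println(area.visibleRows()) // the window has scrolled to reveal later resources
}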
Figs. 8a-8e schematically illustrate the operation between the control device 100 and the display device 200 according to an exemplary embodiment. As shown in figs. 8a-8e, a plurality of different items are laid out in one row and multiple columns in the first view display area 201, although the layout is not limited to one row and multiple columns. For example, item 2011 is a video clip window, and items 2012-2015 are a plurality of different posters that include text and image thumbnails.
In the second view display area 202, a plurality of items, resources 1-9, are displayed in a multi-row, multi-column arrangement; for example, resources 1-9 are application title text and/or icons.
As shown in fig. 8a, the remote control and the display device communicate with each other. Through remote-control input, the user makes selections in the user interface of the display device 200 and thereby controls the items in the display device 200. For example, the focus object is moved among the items in the user interface through the up/down/left/right and/or enter keys of the remote control to select or control each item.
Illustratively, scrolling down/up through the different items in the second view display area 202 of the user interface may be accomplished with the "up/down" keys on the remote control. In other embodiments, depending on user habits or design requirements, the "up" key may be configured to perform the upward scrolling.
Here, when the moving focus object 40 is located at the lower boundary position of the content display area of the second view display area 202 on the screen of the display device, such as on "resource 8" in fig. 8a, and the user presses the "down" key one or more times, the user interface changes to that shown in fig. 8b.
As shown in fig. 8b, the original first view display area 201 is no longer displayed on the display screen, the item list in the second view display area 202 gradually moves upwards into the position of the first view display area 201, and the width of the second view display area 202 in the screen is enlarged so that a list of additional items (e.g. resources 10-14) is newly displayed. Meanwhile, item 2011 from the original first view display area 201 becomes a floating window, which is displayed on a layer above the display content of the second view display area 202 and covers the changed part of the second view display area 202.
While the "down" key continues to be pressed, the size of the second view display area 202 may remain the same, and the displayed resource list and the list of resources to be displayed scroll up together on the display screen. Each time the focus moves down one line, the item list scrolls up one line, and the position and size of item 2011 remain unchanged. When the "down" key is pressed twice more, the user interface displayed on the screen changes to that shown in fig. 8c.
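The interaction in figs. 8b-8c, where the resource list scrolls one row per key press while item 2011 keeps its position and size as a floating window, might be modeled roughly as below. The structures are hypothetical and only illustrate that the pinned item is excluded from the scrolling.

data class Rect(val x: Int, val y: Int, val w: Int, val h: Int)

data class ScreenState(
    val pinnedItem: Pair<String, Rect>,   // e.g. item 2011 shown as a floating window
    val resourceRows: List<List<String>>, // all rows of resources
    val firstVisible: Int,                // index of the first row currently on screen
    val rowsOnScreen: Int
)

fun pressDown(state: ScreenState): ScreenState {
    val canScroll = state.firstVisible + state.rowsOnScreen < state.resourceRows.size
    // The pinned floating item is untouched; only the list offset changes, one row per press.
    return if (canScroll) state.copy(firstVisible = state.firstVisible + 1) else state
}

fun main() {
    var s = ScreenState(
        pinnedItem = "item 2011" to Rect(0, 0, 4, 2),
        resourceRows = (1..25).map { "resource $it" }.chunked(5),
        firstVisible = 0,
        rowsOnScreen = 3
    )
    repeat(2) { s = pressDown(s) }  // two more "down" presses, as between fig. 8b and fig. 8c
    println(s.firstVisible)         // 2: the list has scrolled two rows
    println(s.pinnedItem)           // the floating window's position and size are unchanged
}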
As shown in fig. 8c, when the user presses the "up" key, the user interface displayed on the screen changes in reverse, from fig. 8c to fig. 8b to fig. 8a.
When the moving focus 40 is positioned on the on-screen item 2011 in the display device, as shown in fig. 8d, and the "ok" key is pressed, the video clip playing in item 2011 is enlarged to full screen or near full screen, as shown in fig. 8e.
A schematic flow chart of a method used when the control apparatus 100 operates on a user interface in the display device 200, according to an exemplary embodiment, is illustrated in fig. 9. As shown in fig. 9, at 8011, in response to an instruction to display a user interface, the user interface is displayed on the screen; the user interface comprises a plurality of view display areas, such as a first view display area 201 and a second view display area 202, each comprising a layout of one or more different items, as shown in fig. 6 and fig. 8a.
The size and/or location and hierarchy of the multiple view display areas may be varied in response to a predefined input event. As in figs. 8a-8b, when the focus is detected at an edge area of the user interface displayed on the screen, an event of continuing to move the focus is monitored, and that event is then responded to as the predefined event.
At 8012, the event monitoring module 2914-1 monitors the user interface for a "down" key input event. In some other embodiments, the key event may be a left or right movement, depending on the interface layout. In still other embodiments, the event may be a combination of a movement key and a confirmation key.
At 8013, if the interface layout management module 2913 detects that the current focus is on the second view display area 202, execution continues with 8014.
At 8014, the interface layout management module 2913 continues to monitor the position of the focus in the on-screen user interface and determines whether the current focus is located on an item at a boundary position of the on-screen user interface display area, such as resource 8 in fig. 8a, resource 12 in fig. 8b, or resource 22 in fig. 8c, each of which is an item at a boundary position of the on-screen user interface display area.
In some embodiments, 8012 may be performed after 8013 and 8014. Execution then continues at 8015.
At 8015, while the focus is on an item at a boundary position of the on-screen user interface display area, a user-input command to move in a direction beyond the screen boundary is received, such as the "down" command in fig. 8a.
At 8016, a response is made to the one or more commands to move in a direction beyond the screen boundary, such as the down-movement command.
At 8017, the first view display area 201 is no longer displayed on the screen, the second view display area is moved up to the position of the first view display area and its size is increased to a second value, and at least some of the items to be displayed are shown in the second view display area. As shown in fig. 8b, the width of the second view display area 202 in the screen is enlarged so that a plurality of additional items (resources 10-14) are newly displayed.
At least a portion of the items in the original first view display area 201 are displayed floating over the second view display area 202. As shown in fig. 8b, item 2011 from the original first view display area 201 is displayed in a floating manner.
The floating items originating from the first view display area 201 are displayed over the second view display area 202 at a different display level, with the upper-layer items overlaid on the second view display area 202.
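One plausible way to realize this layering is to paint the view display areas in ascending display level and draw the floating item last, so that it covers the changed part of the second view display area. The Kotlin sketch below shows only that composition order and uses made-up names; it is not the embodiment's rendering code.

data class Layer(val name: String, val zOrder: Int, val visible: Boolean = true)

// Compose a frame by drawing visible layers from lowest to highest display level.
fun composeFrame(layers: List<Layer>): List<String> =
    layers.filter { it.visible }
        .sortedBy { it.zOrder }   // a higher zOrder is painted later, i.e. on top
        .map { it.name }

fun main() {
    val frame = composeFrame(
        listOf(
            Layer("first view display area 201", zOrder = 0, visible = false), // no longer shown
            Layer("second view display area 202 (enlarged)", zOrder = 0),
            Layer("floating item 2011", zOrder = 1)                            // overlaid on area 202
        )
    )
    println(frame) // [second view display area 202 (enlarged), floating item 2011]
}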
Another operational schematic between the control apparatus 100 and the display device 200 according to an exemplary embodiment is shown in figs. 10a-10e. When the focus object 40 is located on the item of resource 8 and the user presses the "down" key on the remote control one or more times, as shown in fig. 10a, the user interface changes to that shown in fig. 10b.
As shown in fig. 10b, the size of the original first view display area 201 is reduced to a first value in the focus moving direction, the size of the second view display area 202 is increased to a second value in the focus moving direction, the items in the second view display area 202 move upwards in sequence, and the width of the second view display area 202 in the screen is enlarged so that the items of resources 10-14 are newly displayed. At the same time, the position and/or size of all or some of the items 2011 and 2012-2015 in the original first view display area 201 are rearranged. In some embodiments, new items may be added to the display, or the number of existing items may be reduced.
While the "down" key continues to be pressed one or more times, the size of the first view display area 201 remains unchanged at the first value and the size of the second view display area 202 remains unchanged at the second value; the displayed item lists and the item lists to be displayed scroll up together on the display screen. Each time the focus moves one line, the item list scrolls up one line.
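The fig. 10 behavior is effectively two-phase: the first "down" press resizes the two areas to the first and second values, and subsequent presses only scroll the item list. A rough Kotlin state sketch of that behavior follows; the numeric values and field names are invented for illustration.

data class AreasState(
    var firstHeight: Int,        // height of the first view display area
    var secondHeight: Int,       // height of the second view display area
    var scrollOffset: Int = 0,   // how many lines the item list has scrolled
    var resized: Boolean = false // whether the one-time resize has already happened
)

fun pressDown(state: AreasState, firstValue: Int, secondValue: Int): AreasState {
    if (!state.resized) {
        // Phase 1: shrink the first area to the first value, grow the second to the second value.
        state.firstHeight = firstValue
        state.secondHeight = secondValue
        state.resized = true
    } else {
        // Phase 2: sizes stay fixed; only the item list scrolls, one line per press.
        state.scrollOffset++
    }
    return state
}

fun main() {
    val s = AreasState(firstHeight = 5, secondHeight = 5)
    repeat(3) { pressDown(s, firstValue = 2, secondValue = 8) }
    println(s) // heights 2 and 8 after the first press; two lines of scrolling afterwards
}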
As shown in fig. 10c, when the "up" key is pressed one or more times, the user interface displayed on the screen changes in reverse, from fig. 10c to fig. 10b to fig. 10a.
When the "ok" key is pressed while the moving focus object 40 is positioned on the on-screen item 2011 in the display device, as shown in fig. 10d, the video clip content in item 2011 is displayed full screen, as shown in fig. 10e.
Figs. 12a-12e schematically show yet another operation between the control device 100 and the display device 200 according to an exemplary embodiment. When the focus object 40 is located on the item of resource 7 and the user presses the "down" key on the remote control once, as shown in fig. 12a, the user interface changes to that shown in fig. 12b.
As shown in fig. 12b, the original first view display area 201 is reduced in size to a first value in the focus moving direction, the second view display area 202 is moved in the focus moving direction, and the size of the second view display area 202 and its item list on the screen remain unchanged. Meanwhile, a third view display area 203 is newly displayed below the second view display area 202.
When the "down" key continues to be pressed one or more times, the size of the first view display area 201 remains unchanged at the first value, and the second view display area 202 scrolls upwards until it disappears from the screen. The third view display area 203 also scrolls upwards, and stops once it has no more content to display.
As shown in fig. 12d, when the "up" key is pressed, the user interface displayed on the screen changes in reverse, from fig. 12c to fig. 12b to fig. 12a.
Fig. 11 is a schematic flow chart illustrating another method used when the control apparatus 100 operates on the user interface of the display device 200, according to an exemplary embodiment. As shown in fig. 11, unlike the method provided in fig. 9, at 9017 it is determined whether the second view display area has content to be displayed that is not yet shown on the screen.
If yes, 9018 is executed: the first view display area is reduced to the first value in the focus moving direction, after which its position and size in the user interface remain unchanged; the second view display area is increased to a second value and its position moves in the direction in which the first view display area was reduced, the originally displayed content is scrolled, and part of the content to be displayed is newly shown.
If not, the first view display area is likewise reduced to the first value in the focus moving direction, after which its position and size in the user interface remain unchanged; the second view display area is scrolled as a whole in the direction in which the first view display area was reduced, and a third view display area is newly displayed beyond the second view display area.
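The 9017/9018 branch can be summarized as: if the second view display area still has undisplayed content, enlarge it and scroll its own content; otherwise scroll it out as a whole and newly display a third view display area. The Kotlin sketch below encodes just that decision; the data types are assumptions for the example only.

data class LayoutDecision(
    val secondAreaHeight: Int,
    val scrollSecondContent: Boolean,
    val showThirdArea: Boolean
)

fun decideLayout(
    hasUndisplayedContentInSecond: Boolean,
    currentSecondHeight: Int,   // current size of the second view display area
    secondValue: Int            // size it grows to when it still has content to show
): LayoutDecision =
    if (hasUndisplayedContentInSecond)
        // 9018: enlarge the second area and scroll its own content in place.
        LayoutDecision(secondAreaHeight = secondValue, scrollSecondContent = true, showThirdArea = false)
    else
        // Otherwise: keep the second area's size, scroll it out as a whole, and reveal a third area.
        LayoutDecision(secondAreaHeight = currentSecondHeight, scrollSecondContent = false, showThirdArea = true)

fun main() {
    println(decideLayout(hasUndisplayedContentInSecond = true, currentSecondHeight = 6, secondValue = 8))
    println(decideLayout(hasUndisplayedContentInSecond = false, currentSecondHeight = 6, secondValue = 8))
}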
In some embodiments, when the user performs the scroll operation, on the one hand the user can conveniently preview and select the content or applications to be displayed; on the other hand, at least part of the content or applications in the first view display area can be kept on the display screen at all times. For example, the widget in the first view display area may remain on the screen, and the user may continue to browse the content in that widget.
In some application examples, a schematic diagram of an application scenario of a user interface of the display device 200 according to an exemplary embodiment is illustrated in figs. 13a-13b. As shown in figs. 13a-13b, the user interface of the device is displayed, and at least one poster in a video, thumbnail, text, or similar format is displayed in the first view display area 201, such as the video display window 2011 and the plurality of thumbnail display windows 2012-2016.
The second view display area displays an arrangement list of a plurality of applications, for example: the system comprises application programs such as a gathering and watching program, a gathering and learning program, a gathering and playing program, a setting option program, a live broadcast center, a VOD on demand program, a media center, a signal source, a member center, a WeChat television, a mango TV program, an enlightenment learning word, an application (program) center, a television mall and the like.
In other embodiments, the list of the plurality of applications in the second view display area may be represented in at least one of an icon or text format.
As the number of applications increases, only part of the application list in the second view display area can be displayed on the screen; for example, two rows of the second view display area are displayed on the screen in fig. 13a, while a third row of applications appears in fig. 13b. The user can scroll the list in the second view display area on the display screen by moving the focus. When the focus is at the screen boundary position, the list scrolls one line for each further movement of the focus.
In other embodiments, the applications may be classified into two or more categories according to application type, with different types of applications placed in the second view display area, the third view display area, and so on up to an Nth view display area.
In some application examples, a schematic diagram of another application scenario of a user interface of the display device 200 according to an exemplary embodiment is illustrated in fig. 14. As shown in fig. 14, the user interface of an application program is displayed, in which posters related to the application program are shown in the first view display area 201, such as the video display window 2011 and the plurality of thumbnail display windows 2012-2016, which may be in at least one of video, thumbnail, or text format.
An arranged list of a plurality of functions or content items within one application program is displayed in the second view display area. Taking a video-viewing application as an example, the list of functions or content may include: movies, TV shows, variety shows, documentaries, animations, classrooms, Chinese opera, free-to-watch, concerts, children's content, etc. The items may be in thumbnail, text, or icon format.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (2)

1. A display device, comprising:
a display configured to display a user interface, wherein the user interface comprises at least a first view display area and a second view display area, each view display area comprising a layout of one or more different items, and a selector indicating that the items are selected, wherein a position of the selector in the user interface is movable by a user input to cause the different items to be selected, wherein the items refer to visual objects representing displayed content;
a controller in communication with the display, the controller being configured to present the user interface by:
receiving user input, and determining the type of the user input event;
initiating detection of a position of the selector in the user interface displayed in the screen;
upon determining, based on the user input event and the detection, that the selector is moved from the user interface displayed on the screen to an area outside thereof, then in response to the user input, modifying the size and position and/or hierarchical layout of at least part of the view display area displayed on the screen, updating into the user interface displayed on the screen the items in areas of the user interface not displayed on the screen, and causing the selector to move to the selected items updated in the screen; wherein,
when a selector is located in a second view display area and moves to a boundary position in the second view display area in the opposite direction towards the first view display area: the size of the first view display area is reduced to a first value in the moving direction of the selector; the size of a part of the items in the first view display area is reduced, the size of at least one item displayed as a video clip is kept unchanged, and at least one item with the same size as the reduced items is added; the first value is kept unchanged until the selector moves in the opposite direction; the size of a part of the items in the first view display area is reduced; and the size of the second view display area is increased to a second value in the moving direction of the selector, with one or more items added in the second view display area;
wherein at least one item in the first view display area is displayed as a video montage.
2. The display device according to claim 1, wherein the first view display area is a scalable view display area and the second view display area is a scroll view display area, wherein at least some of the items in the scalable view display area are changeable in size and/or number on the screen, and wherein the number of items displayed in the screen in the scroll view display area that does not overlap with the scalable view display area is scrollably updated.
CN201910070160.6A 2019-01-24 2019-01-24 Method and display device for presenting user interface Active CN109618206B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910070160.6A CN109618206B (en) 2019-01-24 2019-01-24 Method and display device for presenting user interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910070160.6A CN109618206B (en) 2019-01-24 2019-01-24 Method and display device for presenting user interface

Publications (2)

Publication Number Publication Date
CN109618206A CN109618206A (en) 2019-04-12
CN109618206B true CN109618206B (en) 2021-11-05

Family

ID=66019330

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910070160.6A Active CN109618206B (en) 2019-01-24 2019-01-24 Method and display device for presenting user interface

Country Status (1)

Country Link
CN (1) CN109618206B (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110134393B (en) * 2019-05-16 2024-01-09 北京三快在线科技有限公司 Method and device for processing operation signal
CN112073798B (en) 2019-06-10 2022-09-23 海信视像科技股份有限公司 Data transmission method and equipment
CN112199124B (en) * 2019-06-21 2022-07-01 海信视像科技股份有限公司 Project opening method and device and display equipment
US11093108B2 (en) 2019-07-12 2021-08-17 Qingdao Hisense Media Networks Ltd. Method for displaying user interface and display device
CN110337034B (en) * 2019-07-12 2022-02-11 青岛海信传媒网络技术有限公司 User interface display method and display equipment
CN112399248A (en) * 2019-08-16 2021-02-23 北京迪文科技有限公司 Intelligent screen device and method supporting analog video
CN112463269B (en) * 2019-09-06 2022-03-15 青岛海信传媒网络技术有限公司 User interface display method and display equipment
CN110737840B (en) * 2019-10-22 2023-07-28 海信视像科技股份有限公司 Voice control method and display device
CN111131871B (en) * 2019-12-03 2021-03-19 海信视像科技股份有限公司 Method and display equipment for displaying EPG (electronic program guide) user interface during program playing
CN111176603A (en) * 2019-12-31 2020-05-19 海信视像科技股份有限公司 Image display method for display equipment and display equipment
CN113225424A (en) * 2020-01-17 2021-08-06 青岛海信传媒网络技术有限公司 Voice playing method based on content and display equipment
CN113259741B (en) * 2020-02-12 2022-09-16 聚好看科技股份有限公司 Demonstration method and display device for classical viewpoint of episode
CN113453056B (en) * 2020-03-27 2022-08-09 聚好看科技股份有限公司 Display method and display device for photo album control
WO2021217345A1 (en) * 2020-04-27 2021-11-04 青岛海信传媒网络技术有限公司 Content display method and display device
CN111857936B (en) * 2020-07-29 2023-10-24 聚好看科技股份有限公司 User interface display method and display device of application program
CN113034226B (en) * 2021-03-16 2023-06-02 北京达佳互联信息技术有限公司 Live broadcast data processing method and device, electronic equipment, medium and product

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103986962A (en) * 2014-06-03 2014-08-13 合一网络技术(北京)有限公司 Method and system for displaying suspended playing window
CN108307222A (en) * 2018-01-25 2018-07-20 青岛海信电器股份有限公司 Smart television and the method that upper content is applied based on access homepage in display equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110063297A (en) * 2009-12-02 2011-06-10 삼성전자주식회사 Mobile device and control method thereof
US9652125B2 (en) * 2015-06-18 2017-05-16 Apple Inc. Device, method, and graphical user interface for navigating media content
KR102354328B1 (en) * 2015-09-22 2022-01-21 삼성전자주식회사 Image display apparatus and operating method for the same
CN107835461B (en) * 2017-11-02 2021-09-28 深圳市雷鸟网络传媒有限公司 Focus movement control method, smart television and computer-readable storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103986962A (en) * 2014-06-03 2014-08-13 合一网络技术(北京)有限公司 Method and system for displaying suspended playing window
CN108307222A (en) * 2018-01-25 2018-07-20 青岛海信电器股份有限公司 Smart television and the method that upper content is applied based on access homepage in display equipment

Also Published As

Publication number Publication date
CN109618206A (en) 2019-04-12

Similar Documents

Publication Publication Date Title
CN109618206B (en) Method and display device for presenting user interface
CN111698557B (en) User interface display method and display equipment
US11093108B2 (en) Method for displaying user interface and display device
WO2021114529A1 (en) User interface display method and display device
CN113259741B (en) Demonstration method and display device for classical viewpoint of episode
CN112463269B (en) User interface display method and display equipment
CN109960556B (en) Display device
CN111970549B (en) Menu display method and display device
CN111031375B (en) Method for skipping detailed page of boot animation and display equipment
CN111770370A (en) Display device, server and media asset recommendation method
CN111176603A (en) Image display method for display equipment and display equipment
CN111479155A (en) Display device and user interface display method
CN111045557A (en) Moving method of focus object and display device
CN111954059A (en) Screen saver display method and display device
CN111083538A (en) Background image display method and device
CN109922364B (en) Display device
CN111857363A (en) Input method interaction method and display equipment
CN112199560B (en) Search method of setting items and display equipment
CN112235621B (en) Display method and display equipment for visual area
CN113259733B (en) Display device
WO2021196432A1 (en) Display method and display device for content corresponding to control
CN113115093B (en) Display device and detail page display method
CN112367550A (en) Method for realizing multi-title dynamic display of media asset list and display equipment
CN112004127A (en) Signal state display method and display equipment
CN111949179A (en) Control amplifying method and display device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 266100 No. 151, Zhuzhou Road, Laoshan District, Shandong, Qingdao

Applicant after: Hisense Video Technology Co.,Ltd.

Address before: 266100 No. 151, Zhuzhou Road, Laoshan District, Shandong, Qingdao

Applicant before: HISENSE ELECTRIC Co.,Ltd.

CB02 Change of applicant information
GR01 Patent grant
GR01 Patent grant