CN111045557A - Moving method of focus object and display device - Google Patents

Moving method of focus object and display device

Info

Publication number
CN111045557A
Authority
CN
China
Prior art keywords
item; unavailable; items; user; focus object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911328401.9A
Other languages
Chinese (zh)
Inventor
刘鹏
孙琦玮
董杰
高峰凯
Current Assignee
Hisense Electric Co Ltd
Qingdao Hisense Electronics Co Ltd
Original Assignee
Hisense Electric Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Electric Co Ltd filed Critical Hisense Electric Co Ltd
Priority to CN201911328401.9A
Publication of CN111045557A
Priority to PCT/CN2020/133646 (WO2021121051A1)
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements

Abstract

The invention provides a method for moving a focus object, and a display device. A focus-move instruction input by a user is acquired, and according to the instruction the focus object is controlled to move in a first moving direction to a first target item, where the first target item is the available item on the menu interface closest to the item where the focus object is currently located. When one or more unavailable items exist on the menu interface, the focus object can thus be moved quickly to the corresponding available item according to a single instruction input by the user, without requiring the user to input the instruction multiple times, which improves the user experience.

Description

Moving method of focus object and display device
Technical Field
The invention relates to the field of display technology, and in particular to a method for moving a focus object and a display device.
Background
When the display device displays a menu, in some cases part of the items in the menu are unavailable items (i.e., grayed-out items). An unavailable item cannot be selected, and the focus object cannot be moved onto it. For example, when the sound output of the display device is set in a certain way, setting items of the sound menu such as the sound mode and the wall-mount setting (wall-mount sound effect) cannot be operated and are unavailable items.
Currently, on a display device running the Android system (for example, Android P), when the focus object reaches an unavailable item or the boundary of several consecutive unavailable items and the user presses a direction key on the remote controller to move the focus object toward the unavailable items, the focus object does not skip over them directly. Instead, the page scrolls so that the next available item is displayed on the menu interface, and the user must press the same direction key on the remote controller again to move the focus object onto that available item.
Thus, in the prior art, when one or more unavailable items exist on the menu interface and the available item to which the focus object should move is not visible, the available item must first be scrolled into view before the focus object can move, which requires multiple operations by the user. Moreover, while the available item is being scrolled into view, the item currently holding the focus object can easily scroll out of the menu interface, so that the focus object becomes invisible. This makes operation inconvenient and degrades the user experience.
Disclosure of Invention
The present application provides a method for moving a focus object, and a display device, so that the focus object can be moved quickly to the corresponding target item according to an instruction input by a user, thereby improving the user experience.
In a first aspect, a method for moving a focus object is provided, the method including:
displaying a menu page including a plurality of items, where the menu page further includes a focus object indicating which item is selected and an unavailable-item area comprising at least one unavailable item, and an unavailable item cannot be selected by the focus object;
receiving a focus moving instruction input by a user;
when it is determined that the item currently selected by the focus object is adjacent to an unavailable-item area along a first moving direction, controlling the focus object to skip over the unavailable-item area along the first moving direction and move directly to a first target item, where the first target item is adjacent to the unavailable-item area.
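The movement logic recited above can be sketched in a few lines. This is a minimal illustrative sketch of skipping over a contiguous unavailable-item area, not the patented implementation; the data shape (an `items` array with an `available` flag) and the function name are assumptions made for the example.

```javascript
// Illustrative sketch of the claimed focus movement: given a list of menu
// items (each flagged available or unavailable), the index of the item
// currently holding the focus, and a moving direction (+1 or -1), skip over
// any contiguous unavailable-item area and land directly on the first
// available item beyond it. All names here are hypothetical.
function moveFocus(items, currentIndex, direction) {
  let i = currentIndex + direction;
  // Skip the unavailable-item area, if any, in the moving direction.
  while (i >= 0 && i < items.length && !items[i].available) {
    i += direction;
  }
  // If an available item exists beyond the area, it is the first target item.
  if (i >= 0 && i < items.length) {
    return i;
  }
  // No available item in that direction: the focus object stays where it is.
  return currentIndex;
}
```

For example, with items [available, unavailable, unavailable, available], a single move instruction in the forward direction from the first item lands the focus directly on the fourth item, in one operation.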
In a second aspect, there is provided a display device comprising:
a display for displaying a menu page including a plurality of items, where the menu page further includes a focus object indicating which item is selected and an unavailable-item area comprising at least one unavailable item, and an unavailable item cannot be selected by the focus object;
the user interface is used for receiving instructions input by a user;
a controller for performing:
in response to a focus-move instruction input by a user, when it is determined that the item currently selected by the focus object is adjacent to an unavailable-item area in a first moving direction, controlling the focus object to skip over the unavailable-item area in the first moving direction and move directly to a first target item, where the first target item is adjacent to the unavailable-item area.
In the embodiments of this scheme, a focus-move instruction input by the user is acquired, and the focus object is controlled to move in a first moving direction to a first target item according to the instruction, where the first target item is the available item on the menu interface closest to the item where the focus object is located. When one or more unavailable items exist on the menu interface, the focus object can be moved quickly to the corresponding available item according to the instruction input by the user, without requiring the user to input the instruction multiple times, which improves the user experience.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings based on them without inventive effort.
Fig. 1A is a schematic diagram illustrating an operation scenario between a display device and a control apparatus;
fig. 1B is a block diagram schematically illustrating a configuration of the control apparatus 100 in fig. 1A;
fig. 1C is a block diagram schematically illustrating a configuration of the display device 200 in fig. 1A;
FIG. 1D is a block diagram illustrating an architectural configuration of an operating system in memory of display device 200;
Fig. 2A and 2B are schematic diagrams illustrating a GUI 400 provided by the display apparatus 200;
Fig. 3 is a diagram illustrating a GUI 400-1 provided by the display device 200 in the prior art;
Fig. 4 is a schematic diagram illustrating a GUI 400 provided by the display apparatus 200;
Fig. 5 is a schematic diagram illustrating a GUI 400 provided by the display apparatus 200;
Fig. 6A and 6B are schematic diagrams illustrating a GUI 500 provided by the display apparatus 200;
FIG. 7 is a flowchart illustrating a first embodiment of a method for moving a focus object;
FIG. 8 is a flowchart illustrating a second embodiment of a method for moving a focus object;
FIG. 9 is a flowchart illustrating a third embodiment of a method for moving a focus object;
fig. 10 is a flowchart illustrating a fourth embodiment of the method for moving the focus object.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1A is a schematic diagram illustrating an operation scenario between a display device and a control apparatus. As shown in fig. 1A, the control apparatus 100 and the display device 200 may communicate with each other in a wired or wireless manner.
The control apparatus 100 is configured to control the display device 200: it receives operation instructions input by the user and converts them into instructions that the display device 200 can recognize and respond to, serving as an intermediary for interaction between the user and the display device 200. For example, the user operates the channel up/down keys on the control apparatus 100, and the display device 200 responds to the channel up/down operation.
The control apparatus 100 may be a remote controller 100A, which controls the display device 200 wirelessly or by wire using infrared protocol communication, Bluetooth protocol communication, or other short-range communication methods. The user may input user instructions through keys on the remote controller, voice input, control panel input, and so on, to control the display device 200. For example, the user may input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, etc. on the remote controller, to implement the function of controlling the display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, and the like. For example, the display device 200 is controlled using an application program running on the smart device. The application program may provide various controls to a user through an intuitive User Interface (UI) on a screen associated with the smart device through configuration.
For example, the mobile terminal 100B may install a software application with the display device 200 to implement connection communication through a network communication protocol for the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 100B may be caused to establish a control instruction protocol with the display device 200 to implement the functions of the physical keys as arranged in the remote control 100A by operating various function keys or virtual buttons of the user interface provided on the mobile terminal 100B. The audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
The display device 200 may provide a smart network television function that combines a broadcast receiving function with computer support functions. The display device may be implemented as a digital television, a web television, an Internet Protocol Television (IPTV), or the like.
The display device 200 may be a liquid crystal display, an organic light emitting display, a projection device. The specific display device type, size, resolution, etc. are not limited.
The display device 200 also performs data communication with the server 300 through various communication means. Here, the display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 300 may provide various content and interactions to the display device 200. By way of example, the display device 200 may send and receive information, such as receiving Electronic Program Guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The server 300 may be one group or multiple groups of servers, and of one or more types. The server 300 provides other web service content such as video on demand and advertising services.
Fig. 1B is a block diagram illustrating the configuration of the control device 100. As shown in fig. 1B, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, an output interface 150, and a power supply 160.
The controller 110 includes a Random Access Memory (RAM) 111, a Read Only Memory (ROM) 112, a processor 113, a communication interface, and a communication bus. The controller 110 is used to control the running of the control apparatus 100, the communication and cooperation among its internal components, and external and internal data processing functions.
Illustratively, when an interaction of a user pressing a key disposed on the remote controller 100A or an interaction of touching a touch panel disposed on the remote controller 100A is detected, the controller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.
And a memory 120 for storing various operation programs, data and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 120 may store various control signal commands input by a user.
The communicator 130 enables communication of control signals and data signals with the display device 200 under the control of the controller 110. For example, the control apparatus 100 transmits a control signal (e.g., a touch signal or a button signal) to the display device 200 via the communicator 130, and the control apparatus 100 may receive signals transmitted by the display device 200 via the communicator 130. The communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132. For example, when the infrared signal interface is used, a user input instruction is converted into an infrared control signal according to the infrared control protocol, and the infrared control signal is sent to the display device 200 through the infrared transmitting module. As another example, when the radio frequency signal interface is used, a user input command is converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmitting terminal.
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like, so that a user can input a user instruction regarding controlling the display apparatus 200 to the control apparatus 100 through voice, touch, gesture, press, and the like.
The output interface 150 outputs a user instruction received by the user input interface 140 to the display apparatus 200, or outputs an image or voice signal received by the display apparatus 200. Here, the output interface 150 may include an LED interface 151, a vibration interface 152 generating vibration, a sound output interface 153 outputting sound, a display 154 outputting an image, and the like. For example, the remote controller 100A may receive an output signal such as audio, video, or data from the output interface 150, and display the output signal in the form of an image on the display 154, in the form of audio on the sound output interface 153, or in the form of vibration on the vibration interface 152.
The power supply 160 provides operating power support for the elements of the control apparatus 100 under the control of the controller 110, and may take the form of a battery and associated control circuitry.
A hardware configuration block diagram of the display device 200 is exemplarily illustrated in fig. 1C. As shown in fig. 1C, the display apparatus 200 may include a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, an audio processor 280, an audio output interface 285, and a power supply 290.
The tuner demodulator 210 receives the broadcast television signal in a wired or wireless manner, may perform modulation and demodulation processing such as amplification, mixing, and resonance, and is configured to demodulate, from a plurality of wireless or wired broadcast television signals, an audio/video signal carried in a frequency of a television channel selected by a user, and additional information (e.g., EPG data).
The tuner demodulator 210 is responsive to the frequency of the television channel selected by the user and the television signal carried on that frequency, under the control of the controller 250.
The tuner demodulator 210 can receive a television signal in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; and according to different modulation types, a digital modulation mode or an analog modulation mode can be adopted; and can demodulate the analog signal and the digital signal according to the different kinds of the received television signals.
In other exemplary embodiments, the tuning demodulator 210 may also be in an external device, such as an external set-top box. In this way, the set-top box outputs a television signal after modulation and demodulation, and inputs the television signal into the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the display apparatus 200 may transmit content data to an external apparatus connected via the communicator 220, or browse and download content data from an external apparatus connected via the communicator 220. The communicator 220 may include a network communication protocol module or a near field communication protocol module, such as a WIFI module 221, a bluetooth communication protocol module 222, and a wired ethernet communication protocol module 223, so that the communicator 220 may receive a control signal of the control device 100 according to the control of the controller 250 and implement the control signal as a WIFI signal, a bluetooth signal, a radio frequency signal, and the like.
The detector 230 is a component of the display apparatus 200 for collecting signals of an external environment or interaction with the outside. The detector 230 may include a sound collector 231, such as a microphone, which may be used to receive a user's sound, such as a voice signal of a control instruction of the user to control the display device 200; alternatively, ambient sounds may be collected that identify the type of ambient scene, enabling the display device 200 to adapt to ambient noise.
In some other exemplary embodiments, the detector 230 may further include an image collector 232, such as a camera or video camera, which may be configured to collect scenes of the external environment so as to adaptively change the display parameters of the display device 200, and to capture user attributes or user gestures so as to realize interaction between the display device and the user.
In some other exemplary embodiments, the detector 230 may further include a light receiver for collecting the intensity of the ambient light to adapt to the display parameter variation of the display device 200.
In some other exemplary embodiments, the detector 230 may further include a temperature sensor, such as by sensing an ambient temperature, and the display device 200 may adaptively adjust a display color temperature of the image. For example, when the temperature is higher, the display apparatus 200 may be adjusted to display a color temperature of an image that is cooler; when the temperature is lower, the display device 200 may be adjusted to display a warmer color temperature of the image.
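As a hedged illustration of this adaptive color-temperature behavior, the mapping might look as follows; the thresholds, units, and function name below are invented for the example and are not specified by this description.

```javascript
// Illustrative only: map an ambient temperature reading (in degrees Celsius)
// to a display color temperature (in kelvin). A higher ambient temperature
// yields a cooler-looking (higher-kelvin) image, and a lower ambient
// temperature yields a warmer-looking (lower-kelvin) image.
// The threshold values below are arbitrary assumptions.
function adaptColorTemperature(ambientCelsius) {
  if (ambientCelsius >= 28) return 7500; // warm room: cooler image
  if (ambientCelsius <= 15) return 5000; // cold room: warmer image
  return 6500;                           // comfortable range: neutral default
}
```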
The external device interface 240 is a component for providing the controller 250 to control data transmission between the display apparatus 200 and an external apparatus. The external device interface 240 may be connected to an external apparatus such as a set-top box, a game device, a notebook computer, etc. in a wired/wireless manner, and may receive data such as a video signal (e.g., moving image), an audio signal (e.g., music), additional information (e.g., EPG), etc. of the external apparatus.
The external device interface 240 may include: a High Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not shown), a red, green, blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to the operation of the user by running various software control programs (such as an operating system and various application programs) stored on the memory 260.
As shown in fig. 1C, the controller 250 includes a Random Access Memory (RAM)251, a Read Only Memory (ROM)252, a graphics processor 253, a CPU processor 254, a communication interface 255, and a communication bus 256. The RAM251, the ROM252, the graphic processor 253, and the CPU processor 254 are connected to each other through a communication bus 256 through a communication interface 255.
The ROM252 stores various system boot instructions. When the display apparatus 200 starts power-on upon receiving the power-on signal, the CPU processor 254 executes a system boot instruction in the ROM252, copies the operating system stored in the memory 260 to the RAM251, and starts running the boot operating system. After the start of the operating system is completed, the CPU processor 254 copies the various application programs in the memory 260 to the RAM251 and then starts running and starting the various application programs.
And a graphic processor 253 for generating various graphic objects such as icons, operation menus, and user input instruction display graphics, etc. The graphic processor 253 may include an operator for performing an operation by receiving various interactive instructions input by a user, and further displaying various objects according to display attributes; and a renderer for generating various objects based on the operator and displaying the rendered result on the display 275.
A CPU processor 254 for executing operating system and application program instructions stored in memory 260. And according to the received user input instruction, processing of various application programs, data and contents is executed so as to finally display and play various audio-video contents.
In some example embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and a plurality of or one sub-processor. A main processor for performing some initialization operations of the display apparatus 200 in the display apparatus preload mode and/or operations of displaying a screen in the normal mode. A plurality of or one sub-processor for performing an operation in a state of a standby mode or the like of the display apparatus.
The communication interface 255 may include a first interface to an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
Where the object may be any one of the selectable objects, such as a hyperlink or an icon. The operation related to the selected object is, for example, an operation of displaying a link to a hyperlink page, document, image, or the like, or an operation of executing a program corresponding to the object. The user input command for selecting the GUI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch panel, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
A memory 260 for storing various types of data, software programs, or applications for driving and controlling the operation of the display device 200. The memory 260 may include volatile and/or nonvolatile memory. And the term "memory" includes the memory 260, the RAM251 and the ROM252 of the controller 250, or a memory card in the display device 200.
In some embodiments, the memory 260 is specifically used for storing an operating program for driving the controller 250 of the display device 200; storing various application programs built in the display apparatus 200 and downloaded by a user from an external apparatus; data such as visual effect images for configuring various GUIs provided by the display 275, various objects related to the GUIs, and selectors for selecting GUI objects are stored.
In some embodiments, memory 260 is specifically configured to store drivers for tuner demodulator 210, communicator 220, detector 230, external device interface 240, video processor 270, display 275, audio processor 280, etc., and related data, such as external data (e.g., audio-visual data) received from the external device interface or user data (e.g., key information, voice information, touch information, etc.) received by the user interface.
In some embodiments, memory 260 specifically stores software and/or programs representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, as well as functions implemented by other programs (e.g., the middleware, APIs, or applications); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to enable control or management of system resources.
A block diagram of the architectural configuration of the operating system in the memory of the display device 200 is illustrated in fig. 1D. The operating system architecture comprises an application layer, a middleware layer and a kernel layer from top to bottom.
Application layer: the applications built into the system and non-system-level applications belong to the application layer, which is responsible for direct interaction with the user. The application layer may include a plurality of applications, such as a settings application, a post application, a media center application, and the like. These applications may be implemented as Web applications that execute based on a WebKit engine, and in particular may be developed and executed based on HTML5, Cascading Style Sheets (CSS), and JavaScript.
Here, HTML (HyperText Markup Language) is the standard markup language for creating web pages. Web pages are described by markup tags, and HTML tags are used to describe text, graphics, animation, sound, tables, links, and so on. A browser reads an HTML document, interprets the content of the tags in the document, and displays it in the form of a web page.
CSS (Cascading Style Sheets) is a computer language used to express the style of HTML documents, and may be used to define style structures such as fonts, colors, and positions. CSS styles can be stored directly in the HTML web page or in a separate style file, enabling control over the styles in the web page.
JavaScript is a language used for web-page programming; it can be inserted into an HTML page and is interpreted and executed by the browser. The interaction logic of a Web application is implemented in JavaScript. JavaScript can wrap an extension interface through the browser to realize communication with the kernel layer.
the middleware layer may provide some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as multimedia and hypermedia information coding experts group (MHEG) middleware related to data broadcasting, DLNA middleware which is middleware related to communication with an external device, middleware which provides a browser environment in which each application program in the display device operates, and the like.
The kernel layer provides core system services, such as: file management, memory management, process management, network management, system security authority management and the like. The kernel layer may be implemented as a kernel based on various operating systems, for example, a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware, and provides device driver services for various hardware, such as: a display driver for the display, a camera driver for the camera, a button driver for the remote controller, a Wi-Fi driver for the Wi-Fi module, an audio driver for the audio output interface, a power management driver for the power management (PM) module, and so on.
The user interface 265 receives various user interactions. Specifically, it transmits an input signal from the user to the controller 250, or transmits an output signal from the controller 250 to the user. For example, the remote controller 100A may transmit an input signal entered by the user, such as a power switch signal, a channel selection signal, or a volume adjustment signal, to the user interface 265, which then transfers the input signal to the controller 250; alternatively, the remote controller 100A may receive an output signal such as audio, video, or data that the controller 250 outputs via the user interface 265, and display the received output signal or output it in audio or vibration form.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user interface 265 receives the user input commands through the GUI. Specifically, the user interface 265 may receive user input commands for controlling the position of a selector in the GUI to select different objects or items.
Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user interface 265 receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 270 is configured to receive an external video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on the display 275.
Illustratively, the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is configured to demultiplex an input audio/video data stream; for example, for an input MPEG-2 stream (a compression standard for moving images and audio on digital storage media), the demultiplexing module demultiplexes it into a video signal and an audio signal.
The video decoding module is configured to process the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module is configured to superimpose and mix the GUI signal, input by the user or generated by the graphics generator, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, for example converting a 60Hz input video to a frame rate of 120Hz or 240Hz, commonly implemented by frame interpolation.
The display formatting module is configured to convert the signal output by the frame rate conversion module into a signal conforming to the display format of the display, for example converting its format so as to output an RGB data signal.
The display 275 receives the image signal from the video processor 270 and displays video content, images, and the menu manipulation interface. The displayed video content may come from the broadcast signal received by the tuner-demodulator 210, or from video content input through the communicator 220 or the external device interface 240. Meanwhile, the display 275 displays a user manipulation interface (UI) generated in the display apparatus 200 and used to control the display apparatus 200.
In addition, the display 275 may include a display screen assembly for presenting a picture and a driving assembly for driving the display of an image. Alternatively, provided the display 275 is a projection display, it may include a projection device and a projection screen.
The audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification processing to obtain an audio signal that can be played by the speaker 286.
Illustratively, the audio processor 280 may support various audio formats, such as MPEG-2, MPEG-4, Advanced Audio Coding (AAC), high-efficiency AAC (HE-AAC), and the like.
The audio output interface 285 receives the audio signal output by the audio processor 280 under the control of the controller 250, and may include a speaker 286 or an external sound output terminal 287, such as an earphone output terminal, for outputting to a sound-reproducing device of an external apparatus.
In other exemplary embodiments, video processor 270 may comprise one or more chips. Audio processor 280 may also comprise one or more chips.
Also, in other exemplary embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated with the controller 250 in one or more chips.
The power supply 290 supplies power to the display apparatus 200 from an external power source under the control of the controller 250. The power supply 290 may be a built-in power supply circuit installed inside the display apparatus 200, or a power supply installed outside the display apparatus 200.
A schematic diagram of one GUI400 provided by the display device 200 is exemplarily shown in fig. 2A and 2B.
As shown in fig. 2A, the display apparatus provides a GUI400 to the display according to a control instruction input by the user through operation of the control device. The GUI400 includes a main display screen 41, a menu interface 42 including a plurality of items 421 to 428, and a focus object 43 located at any one of the items. The focus object 43 can be moved by a focus movement instruction input by the user to change the selected item; for example, the user inputs the focus movement instruction by pressing the up/down/left/right direction keys of a remote controller. The items in the menu interface have different states according to the current settings or state of the display device; illustratively, the items 421 to 424 in the menu interface 42 are unavailable items (i.e., grayed-out items) and the items 425 to 428 are available items. Further, the item where the focus object is located must be an available item, and the focus object cannot be placed on an unavailable item.
Optionally, the main display screen 41 may be at least one of an image, text, and video content that the user is watching (for example, the playing screen shown in fig. 2A is a picture screen), or may be a menu page, a search page, an application page, and the like (for example, the playing screen shown in fig. 2B is the upper-level menu of the menu interface 42).
A schematic diagram of a GUI400-1 provided by a prior-art display device 200 is illustrated in fig. 3. Schematic diagrams of a GUI400 provided by the display device 200 are illustrated in figs. 4 and 5.
As shown in fig. 2A, the current focus object 43 is placed on the item 425, and when the user inputs a focus movement instruction by operating the control means, the display apparatus 200 may respond to the focus movement instruction.
In the related art, when the user inputs a focus movement instruction that instructs the focus object 43 to move upward, the display apparatus 200 displays, through the display, the GUI400-1 shown in fig. 3, in which the menu interface 42-1 includes the item 420 exposed by the upward scrolling of the menu interface 42 together with the items 421 to 427, while the position of the focus object 43 does not change; only when the user inputs the focus movement instruction again does the focus object move to the item 420.
In an example provided by the present solution, in the current menu interface 42, the item 426 below the item 425 where the focus object 43 is located is an available item; if the focus movement instruction indicates moving the focus object 43 downward, the focus object 43 moves to the item 426, as shown in fig. 4. As another example, since all the items 421 to 424 above the item 425 where the focus object 43 is located are unavailable items, the items 421 to 424 form an unavailable item area 400, that is, the focus object is at the boundary of the unavailable item area; if the focus movement instruction instructs moving the focus object 43 upward, the focus object 43 skips the unavailable item area and moves directly to the item 420, as shown in fig. 5.
A schematic diagram of one GUI500 provided by the display device 200 is exemplarily shown in fig. 6A and 6B.
As shown in fig. 6A or 6B, the display apparatus provides a GUI500 to the display according to a menu interface opening operation by the user, for example a control instruction input by the user through operation of the control device. The GUI500 includes a main display screen 51 and an opened menu interface 52 including a plurality of items 521 to 525. Optionally, the main display screen 51 may be at least one of an image, text, and video content that the user is watching (for example, the playing screen shown in fig. 6A is a picture screen), or may be a menu page, a search page, an application page, and the like (for example, the playing screen shown in fig. 6B is the upper-level menu of the menu interface 52).
Here, the item 521 is the first item of the menu interface 52 and is an unavailable item, while the item 522 is the second item and is an available item; accordingly, when the menu interface 52 is opened, the focus object 53 is controlled to move to the first available item, i.e., the item 522.
Fig. 7 is a flowchart illustrating a first embodiment of a method for moving a focus object. As shown in fig. 7, the method includes:
S101: Acquiring a focus movement instruction input by a user.
In this step, the controller acquires a focus movement instruction input by the user, for example by detecting the user pressing or touching the up/down/left/right movement keys on the control apparatus 100, or by receiving a focus movement instruction input by voice or the like.
The focus movement instruction includes a first movement direction indicating a target direction in which the focus object is to be moved, such as up, down, left, right, and the like.
S102: and controlling the focus object to move to the first target item in the first moving direction according to the focus moving instruction.
The controller controls the focus object to move to a first target item according to the focus movement instruction, wherein the first target item is the available item closest, in the first moving direction, to the item where the focus object is located in the menu interface. For example, if the focus movement instruction instructs moving the focus object upward, the focus object is controlled to move to the first available item above the current item.
Optionally, if the first target item is not displayed on the current menu interface, the menu interface is scrolled until the first target item is exposed, and the focus object is moved to the first target item.
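The movement rule of steps S101 to S102 can be illustrated with a minimal, self-contained sketch. This is not the patent's implementation: the class and method names are hypothetical, and the menu is modeled simply as a list of availability flags (true = available item), with step -1 for "up" and +1 for "down".

```java
import java.util.Arrays;
import java.util.List;

// Illustrative sketch only: the menu is a list of availability flags.
class FocusMover {
    // Returns the index of the nearest available item in the moving
    // direction (the "first target item"), or the current index when no
    // available item exists in that direction.
    static int nextFocus(List<Boolean> available, int current, int step) {
        for (int i = current + step; i >= 0 && i < available.size(); i += step) {
            if (available.get(i)) {
                return i; // skips any run of unavailable (grayed-out) items
            }
        }
        return current; // no available item in that direction: focus stays
    }
}
```

Modeling fig. 2A as [item 420 available, items 421 to 424 unavailable, items 425 and 426 available] with the focus at item 425 (index 5), an upward instruction (step -1) returns index 0: the focus skips the unavailable item area in a single move and lands on item 420.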
According to the moving method of the focus object described above, a focus movement instruction input by the user is acquired, the instruction including a first moving direction, and the focus object is controlled to move to a first target item in the first moving direction, wherein the first target item is the available item closest, in that direction, to the item where the focus object is located in the menu interface. When multiple consecutive unavailable items exist in the menu interface, the focus object can be rapidly moved to the corresponding available item according to a single instruction input by the user, without requiring the user to input the instruction multiple times, thereby improving the user experience.
On the basis of the embodiment shown in fig. 7, fig. 8 exemplarily shows a flow chart of a second embodiment of the method for moving the focus object. As shown in fig. 8, step S102: controlling the focus object to move to a first target item in the first moving direction according to the focus moving instruction, specifically comprising:
S1021: It is determined whether the focus object has an adjacent unavailable item area in the first moving direction.
Here, the unavailable item area includes one or more consecutive unavailable items.
For example, in conjunction with FIG. 2A, the current focus object is in item 425, items 421-424 above item 425 constitute the unavailable items area, and item 426 below item 425 is the available item. If the first moving direction is upward, the focus object has an unavailable item area adjacent to the focus object in the first moving direction, and if the first moving direction is downward, the focus object has no unavailable item area adjacent to the focus object in the first moving direction.
If the focus object has an unavailable item area adjacent to the focus object in the first moving direction, go to step S1022; if the focus object has no adjacent unavailable item area in the first moving direction, the process proceeds to step S1023.
As an example of this step, fig. 9 exemplarily shows a flowchart of a third embodiment of the method for moving the focus object. As shown in fig. 9, the determination includes:
S1: It is determined whether the item adjacent to the focus object in the first moving direction is an unavailable item.
If yes, go to step S2; otherwise, the process proceeds to step S3.
S2: determining whether the items adjacent to the unavailable item are unavailable items, repeating the step until the items adjacent to the unavailable item are determined to be available items, and obtaining an unavailable item area.
S3: the focus object has no adjacent unavailable item area in the first moving direction.
For example, as shown in fig. 2A and fig. 5, the current focus object is at the item 425. If the first moving direction is upward, the item 424 above the item 425 is an unavailable item, so it is further determined whether the item above the item 424 is an unavailable item; the item 423 above the item 424 is also unavailable, and this step is repeated until the item 420 is determined to be an available item, at which point the process stops and an unavailable item area (including the unavailable items 421 to 424) is obtained. If the first moving direction is downward, the item 426 below the item 425 is an available item, so it is determined that the focus object has no adjacent unavailable item area in the first moving direction.
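Steps S1 to S3 amount to scanning outward from the focused item for a contiguous run of unavailable items. A hedged sketch in the same illustrative model as before (hypothetical names; availability flags stand in for per-item enabled state):

```java
import java.util.Arrays;
import java.util.List;

// Illustrative sketch of steps S1-S3: returns the [low, high] index bounds
// of the unavailable item area adjacent to `current` in the given direction,
// or null when the adjacent item is already available (step S3).
class UnavailableArea {
    static int[] find(List<Boolean> available, int current, int step) {
        int i = current + step;
        if (i < 0 || i >= available.size() || available.get(i)) {
            return null; // S3: no adjacent unavailable item area
        }
        int start = i; // S1: the adjacent item is unavailable
        // S2: keep extending while the next neighbor is also unavailable.
        while (i + step >= 0 && i + step < available.size()
                && !available.get(i + step)) {
            i += step;
        }
        return new int[]{Math.min(start, i), Math.max(start, i)};
    }
}
```

For the fig. 2A model (indices 1 to 4 unavailable, focus at index 5), scanning upward yields the area [1, 4], matching items 421 to 424, while scanning downward yields null, matching the example in the text.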
S1022: Controlling the focus object to skip the unavailable item area in the first moving direction and move to the first target item. Here, the first target item is adjacent to the unavailable item area.
S1023: controlling the focus object to move to a first target item in a first moving direction; the first target item is adjacent to the item where the current focus object is located.
In this embodiment, according to whether an unavailable item area exists in the item where the focus object is located in the first moving direction, the focus object is moved to the available item closest to the first moving direction, and the focus moving instruction of the user is quickly and accurately responded.
According to the moving method of the focus object, in the process of opening any menu interface, the focus object is controlled to be displayed on the first available item of the current menu interface, avoiding the problem that the focus object is not displayed when the first item of the menu interface is an unavailable item. The method specifically includes: in response to a menu interface opening operation input by the user, controlling the focus object to move to a second target item, wherein the second target item is the first available item in the menu interface. The present solution provides the following two possible implementations for determining the second target item:
In the first mode, it is determined in sequence whether the items of the menu interface are available items, and the first available item found is used as the second target item. That is, starting from the first item of the menu interface, each item is checked in order for availability; when an available item is found, the check stops and that item is used as the second target item.
In the second mode, it is determined whether every item of the menu interface is an available item, and the first of all available items is used as the second target item. That is, every item in the menu interface is traversed, and among all the available items determined, the one ranked first in the menu interface is selected as the second target item.
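Both modes yield the same second target item; the first stops early, while the second traverses the whole menu before choosing. A sketch under the same list-of-flags assumption (names are hypothetical, not from the patent):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Illustrative sketch of the two ways to find the second target item
// (the first available item of a freshly opened menu); returns -1 when
// the menu has no available item at all.
class FirstAvailable {
    // Mode one: check items in order and stop at the first available one.
    static int stopAtFirst(List<Boolean> available) {
        for (int i = 0; i < available.size(); i++) {
            if (available.get(i)) return i;
        }
        return -1;
    }

    // Mode two: traverse every item, collect all available ones, then take
    // the one that comes first in menu order.
    static int traverseAll(List<Boolean> available) {
        List<Integer> found = new ArrayList<>();
        for (int i = 0; i < available.size(); i++) {
            if (available.get(i)) found.add(i);
        }
        return found.isEmpty() ? -1 : found.get(0);
    }
}
```

For the fig. 6A example (item 521 unavailable, item 522 available), both modes return index 1, i.e. item 522; mode one simply performs fewer checks.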
On the basis of the foregoing embodiments, the present solution is described taking the Android P system as an example of the system used by the display device; fig. 10 exemplarily shows a flowchart of a fourth embodiment of the method for moving the focus object. As shown in fig. 10, in a specific implementation, the controller may detect key events of the remote controller in real time through a detection module. In the Android system, key events cannot be received directly by the system's native Fragment (the Fragment being used to generate the menu interface), so the present solution intercepts the key event in the Activity and then forwards it to the Fragment for processing. Specifically, the keyCode and event of the current key event are obtained in the onKeyDown() function of the TvSettings Activity; a key-event interface is then created in the Fragment, the keyCode and event sent by the Activity are received by defining an onKeyEventService() function, and onKeyEventService() is overridden in each Fragment page that needs to obtain key events. Further, a logic policy in the Fragment determines, according to the acquired key value, whether the key event is a focus movement instruction, and determines the first moving direction in the focus movement instruction.
Furthermore, the View and sub Views in the menu interface need to be acquired. The Preferences inherit from the View class: each sub View corresponds to a Preference, and each Preference corresponds to an item in the menu interface, so the Preference where the focus is located can be determined by judging which sub View layer the current focus object is on. Optionally, the above process may include determining whether the View is empty, so as to increase the fault tolerance of the system.
If the Fragment receives, in onKeyEventService(), a key event whose keyCode is 19 (the up direction key of the remote control), it sequentially judges, through the isEnabled() function of Preference, the four items above the item 425 in fig. 2A, namely TotalSonic, TotalSurround, TotalVolume, and the wall-mount setting. If all four Preferences are grayed out, i.e., all return false, this indicates that they form an unavailable item area with the current focus object located at its boundary, so the focus object needs to be moved to the item 420 shown in fig. 5. Here, the smoothScrollToPosition() function of RecyclerView is used with a parameter of 2 (the item 420 being the third item of the page), thereby achieving a smooth movement of the focus to the first target position.
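The key-value check described above can be sketched as a small dispatch step, assuming the standard Android key-code constants KEYCODE_DPAD_UP = 19 and KEYCODE_DPAD_DOWN = 20; in the actual implementation this logic would sit inside the Fragment's key-event handler rather than a standalone class.

```java
// Illustrative sketch: maps a remote-control keyCode to a focus-movement
// step (-1 = up, +1 = down, 0 = not a focus movement instruction),
// assuming Android's KEYCODE_DPAD_UP (19) and KEYCODE_DPAD_DOWN (20).
class KeyDispatch {
    static int stepFor(int keyCode) {
        switch (keyCode) {
            case 19: return -1; // KEYCODE_DPAD_UP: toward smaller indices
            case 20: return +1; // KEYCODE_DPAD_DOWN: toward larger indices
            default: return 0;  // ignore non-directional keys here
        }
    }
}
```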
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (10)

1. A method of moving a focus object, the method comprising:
displaying a menu page including a plurality of items; the menu page also comprises a focus object indicating that the item is selected and an unavailable item area comprising at least one unavailable item; the unavailable item is not selectable by a focus object;
receiving a focus moving instruction input by a user;
when determining that the current item selected by the focus object is adjacent to an unavailable item area along a first moving direction, controlling the focus object to skip the unavailable item area along the first moving direction and directly move to a first target item; the first target item is adjacent to the unavailable item area.
2. The method of claim 1, further comprising:
when determining that the current item selected by the focus object is not adjacent to an unavailable item area along a first moving direction, controlling the focus object to move to the first target item along the first moving direction; the first target item is adjacent to the current item.
3. The method of claim 1, further comprising:
an unavailable item area is determined.
4. The method of claim 3, further comprising:
determining whether an item adjacent to the current item in the first direction of movement is an unavailable item;
if yes, determining whether the items adjacent to the unavailable items are unavailable items, repeating the step until the items adjacent to the unavailable items are determined to be available items, and obtaining the unavailable item area.
5. The method of claim 1, wherein the first target item is not displayed within the menu page prior to the receiving of the user-entered focus movement instruction;
and updating and displaying the first target item in the menu page while directly moving to the first target item.
6. The method of claim 1, further comprising:
receiving a menu interface opening instruction input by a user, displaying a menu page comprising a plurality of items, and simultaneously controlling a focus object to move to a second target item; the second target item is the first available item in the menu interface.
7. The method of claim 6, further comprising:
sequentially determining whether the items of the menu interface are available items, and taking the determined first available item as the second target item;
alternatively,
determining whether each item of the menu interface is an available item, and using a first available item of all available items as the second target item.
8. A display device, comprising:
a display for displaying a menu page of a plurality of items; the menu page also comprises a focus object indicating that the item is selected and an unavailable item area comprising at least one unavailable item; the unavailable item is not selectable by a focus object;
the user interface is used for receiving instructions input by a user;
a controller for performing:
in response to a focus moving instruction input by a user, when determining that a current item selected by the focus object is adjacent to an unavailable item area in a first moving direction, controlling the focus object to move directly to a first target item by skipping the unavailable item area in the first moving direction; the first target item is adjacent to the unavailable item area.
9. The apparatus of claim 8, wherein the controller is specifically configured to:
determining whether an item adjacent to the current item in the first direction of movement is an unavailable item;
if yes, determining whether the items adjacent to the unavailable items are unavailable items, repeating the step until the items adjacent to the unavailable items are determined to be available items, and obtaining the unavailable item area.
10. The apparatus of claim 8, wherein the controller is specifically configured to:
the first target item is not displayed within the menu page prior to the focus movement instruction in response to user input;
and updating and displaying the first target item in the menu page while directly moving to the first target item.
CN201911328401.9A 2019-12-20 2019-12-20 Moving method of focus object and display device Pending CN111045557A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911328401.9A CN111045557A (en) 2019-12-20 2019-12-20 Moving method of focus object and display device
PCT/CN2020/133646 WO2021121051A1 (en) 2019-12-20 2020-12-03 Display method and display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911328401.9A CN111045557A (en) 2019-12-20 2019-12-20 Moving method of focus object and display device

Publications (1)

Publication Number Publication Date
CN111045557A true CN111045557A (en) 2020-04-21

Family

ID=70238072

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911328401.9A Pending CN111045557A (en) 2019-12-20 2019-12-20 Moving method of focus object and display device

Country Status (2)

Country Link
CN (1) CN111045557A (en)
WO (1) WO2021121051A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112511874A (en) * 2020-11-12 2021-03-16 北京视游互动科技有限公司 Game control method, smart television and storage medium
WO2021121051A1 (en) * 2019-12-20 2021-06-24 海信视像科技股份有限公司 Display method and display device
CN113703625A (en) * 2021-07-30 2021-11-26 青岛海尔科技有限公司 Method, apparatus, storage medium, and electronic apparatus for controlling focus movement

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102915190A (en) * 2011-08-03 2013-02-06 联想(北京)有限公司 Display processing method, device and electronic equipment
US20130159858A1 (en) * 2011-12-14 2013-06-20 Microsoft Corporation Collaborative media sharing
CN107092410A (en) * 2016-02-24 2017-08-25 口碑控股有限公司 Interface alternation method, equipment and the intelligent terminal of a kind of touch-screen
CN109313528A (en) * 2016-06-12 2019-02-05 苹果公司 Accelerate to roll

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1242663A (en) * 1998-04-07 2000-01-26 无线行星公司 Method for displaying selectable and non-selectable elements on small screen
JP2005327000A (en) * 2004-05-13 2005-11-24 Sony Corp User interface controller, user interface control method and computer program
CN101018282A (en) * 2006-02-09 2007-08-15 上海乐金广电电子有限公司 Method for automatically skipping the restricted channel in the broadcast receiving device
US20090132963A1 (en) * 2007-11-21 2009-05-21 General Electric Company Method and apparatus for pacs software tool customization and interaction
US10574825B2 (en) * 2017-02-15 2020-02-25 Microsoft Technology Licensing, Llc Assisted-communication with intelligent personal assistant
CN111045557A (en) * 2019-12-20 2020-04-21 青岛海信电器股份有限公司 Moving method of focus object and display device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102915190A (en) * 2011-08-03 2013-02-06 联想(北京)有限公司 Display processing method, device and electronic equipment
US20130159858A1 (en) * 2011-12-14 2013-06-20 Microsoft Corporation Collaborative media sharing
CN107092410A (en) * 2016-02-24 2017-08-25 口碑控股有限公司 Interface alternation method, equipment and the intelligent terminal of a kind of touch-screen
CN109313528A (en) * 2016-06-12 2019-02-05 苹果公司 Accelerate to roll

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021121051A1 (en) * 2019-12-20 2021-06-24 海信视像科技股份有限公司 Display method and display device
CN112511874A (en) * 2020-11-12 2021-03-16 北京视游互动科技有限公司 Game control method, smart television and storage medium
CN112511874B (en) * 2020-11-12 2023-10-03 北京视游互动科技有限公司 Game control method, intelligent television and storage medium
CN113703625A (en) * 2021-07-30 2021-11-26 青岛海尔科技有限公司 Method, apparatus, storage medium, and electronic apparatus for controlling focus movement

Also Published As

Publication number Publication date
WO2021121051A1 (en) 2021-06-24

Similar Documents

Publication Publication Date Title
CN111447498B (en) Awakening method of display equipment and display equipment
CN111314789B (en) Display device and channel positioning method
CN111182345B (en) Display method and display equipment of control
CN111654739A (en) Content display method and display equipment
CN111625169B (en) Method for browsing webpage by remote controller and display equipment
CN111427643A (en) Display device and display method of operation guide based on display device
CN111726673B (en) Channel switching method and display device
CN111246309A (en) Method for displaying channel list in display device and display device
CN111479155A (en) Display device and user interface display method
CN111414216A (en) Display device and display method of operation guide based on display device
CN111045557A (en) Moving method of focus object and display device
CN111901653B (en) Configuration method of external sound equipment of display equipment and display equipment
CN113347413A (en) Window position detection method and display device
CN112004126A (en) Search result display method and display device
CN112040308A (en) HDMI channel switching method and display device
CN111857502A (en) Image display method and display equipment
CN111857363A (en) Input method interaction method and display equipment
CN111541929A (en) Multimedia data display method and display equipment
CN112040285B (en) Interface display method and display equipment
CN113115093B (en) Display device and detail page display method
CN113010074A (en) Webpage Video control bar display method and display equipment
CN111614995A (en) Menu display method and display equipment
CN111459372A (en) Network list refreshing display method and display equipment
CN111601147A (en) Content display method and display equipment
CN111596771A (en) Display device and method for moving selector in input method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
CB02 Change of applicant information

Address after: 266555 Qingdao economic and Technological Development Zone, Shandong, Hong Kong Road, No. 218

Applicant after: Hisense Video Technology Co., Ltd

Address before: 266555 Qingdao economic and Technological Development Zone, Shandong, Hong Kong Road, No. 218

Applicant before: HISENSE ELECTRIC Co.,Ltd.

CB02 Change of applicant information
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200421

RJ01 Rejection of invention patent application after publication