WO2021121051A1 - Display method and display device - Google Patents

Display method and display device Download PDF

Info

Publication number
WO2021121051A1
WO2021121051A1 · PCT/CN2020/133646 · CN2020133646W
Authority
WO
WIPO (PCT)
Prior art keywords
item
unavailable
focus
user
focus object
Prior art date
Application number
PCT/CN2020/133646
Other languages
English (en)
French (fr)
Inventor
贾桂丽
孙琦玮
董杰
刘鹏
高峰凯
Original Assignee
海信视像科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 海信视像科技股份有限公司
Publication of WO2021121051A1

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements

Definitions

  • This application relates to the field of display technology, and in particular to a display method and display device.
  • When the display device displays a menu, under certain conditions some items in the menu are unavailable items (that is, the items are grayed out). These unavailable items cannot be selected, and the focus object cannot be moved to them; for example, when the sound output of the display device is set to a power amplifier, setting items such as the sound mode and the wall mount setting (wall-mount sound effect) in the sound menu cannot be operated and are unavailable items.
  • At present, on a display device using the Android system (for example, Android P), when the focus object moves to an unavailable item, or to the boundary of several consecutive unavailable items, and the user presses a direction key of the remote control to move the focus object toward the unavailable items, the focus object does not skip the unavailable items directly; instead, the page is scrolled so that an available item is displayed on the menu interface. Only when the user presses the same direction key of the remote control again can the focus object be moved to the available item.
  • In a first aspect, a display method is provided, and the method includes:
  • displaying a menu page including multiple items, wherein the menu page also includes a focus object indicating that an item is selected, and an unavailable item area including at least one unavailable item, the unavailable item being unselectable by the focus object;
  • receiving a focus movement instruction input by the user; and
  • when it is determined that the current item selected by the focus object is adjacent to an unavailable item area along a first movement direction, controlling the focus object to skip the unavailable item area along the first movement direction and move directly to a first target item, the first target item being adjacent to the unavailable item area.
  • In a second aspect, a display device is provided, including:
  • a display, configured to display a menu page including multiple items, wherein the menu page also includes a focus object indicating that an item is selected, and an unavailable item area including at least one unavailable item, the unavailable item being unselectable by the focus object;
  • a user interface, configured to receive instructions input by the user; and
  • a controller, configured to: in response to a focus movement instruction input by the user, when it is determined that the current item selected by the focus object is adjacent to an unavailable item area along a first movement direction, control the focus object to skip the unavailable item area along the first movement direction and move directly to a first target item, the first target item being adjacent to the unavailable item area.
  • FIG. 1A shows a usage scenario of a display device according to some embodiments
  • FIG. 1B shows a block diagram of the hardware configuration of the control device 100 according to some embodiments
  • FIG. 1C shows a block diagram of the hardware configuration of a display device 200 according to some embodiments
  • FIG. 1D shows a software configuration diagram in the display device 200 according to some embodiments
  • FIG. 2A and 2B exemplarily show a schematic diagram of a GUI 400 provided by the display device 200;
  • FIG. 3 exemplarily shows a schematic diagram of a GUI 400-1 provided by the display device 200 in the prior art
  • FIG. 4 exemplarily shows a schematic diagram of a GUI 400 provided by the display device 200
  • FIG. 5 exemplarily shows a schematic diagram of a GUI 400 provided by the display device 200
  • FIG. 6A and FIG. 6B exemplarily show a schematic diagram of a GUI 500 provided by the display device 200;
  • FIG. 7 exemplarily shows a schematic flowchart of Embodiment 1 of a method for moving a focus object
  • FIG. 8 exemplarily shows a schematic flowchart of Embodiment 2 of the method for moving a focus object
  • FIG. 9 exemplarily shows a schematic flowchart of Embodiment 3 of the method for moving a focus object
  • FIG. 10 exemplarily shows a schematic flowchart of Embodiment 4 of the method for moving a focus object.
  • The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code capable of performing the function associated with that element.
  • Fig. 1A is a schematic diagram of a usage scenario of a display device according to an embodiment.
  • the display device 200 also performs data communication with the server 300, and the user can operate the display device 200 through the smart device 100B or the control device 100A.
  • The control device 100A may be a remote controller, and communication between the remote controller and the display device includes at least one of infrared protocol communication, Bluetooth protocol communication, and other short-range communication methods; the display device 200 is controlled wirelessly or by wire. The user can control the display device 200 by inputting user instructions through at least one of keys on the remote control, voice input, and control panel input.
  • the smart device 100B may include any one of a mobile terminal, a tablet computer, a computer, a notebook computer, an AR/VR device, and the like.
  • the smart device 100B may also be used to control the display device 200.
  • an application program running on a smart device is used to control the display device 200.
  • the smart device 100B and the display device can also be used to communicate data.
  • the display device 200 can also be controlled in a manner other than the control device 100A and the smart device 100B.
  • a voice command acquisition module configured inside the display device 200 can be used to directly receive the user's voice command control.
  • the voice control device provided outside the display device 200 can also be used to receive the user's voice command control.
  • the display device 200 also performs data communication with the server 300.
  • the display device 200 may be allowed to communicate through a local area network (LAN), a wireless local area network (WLAN), and other networks.
  • the server 300 may provide various contents and interactions to the display device 200.
  • the server 300 may be one cluster or multiple clusters, and may include one or more types of servers.
  • the software steps executed by one step execution subject can be migrated to another step execution subject for data communication with it for execution as required.
  • the software steps executed by the server can be migrated to the display device in data communication with the server as required for execution, and vice versa.
  • Fig. 1B exemplarily shows a configuration block diagram of a control device 100A according to an exemplary embodiment.
  • the control device 100A includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply.
  • the control device 100 can receive an input operation instruction from a user, and convert the operation instruction into an instruction that can be recognized and responded to by the display device 200, so as to act as an intermediary between the user and the display device 200.
  • the communication interface 130 is used to communicate with the outside and includes at least one of a WIFI chip, a Bluetooth module, an NFC or an alternative module.
  • the user input/output interface 140 includes at least one of a microphone, a touch panel, a sensor, a button, or an alternative module.
  • FIG. 1C shows a block diagram of the hardware configuration of the display device 200 according to an exemplary embodiment.
  • the display device 200 includes at least one of a tuner and demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
  • the controller includes a central processing unit, a video processor, an audio processor, a graphics processor, RAM, ROM, and the first interface to the nth interface for input/output.
  • the display 260 includes a display screen component for presenting images and a driving component for driving image display, and is used for receiving image signals output from the controller and displaying video content, image content, menu control interfaces, user control UI interfaces, and the like.
  • the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
  • the tuner and demodulator 210 receives broadcast television signals through wired or wireless reception, and demodulates audio and video signals, as well as EPG data signals, from multiple wireless or wired broadcast television signals.
  • the communicator 220 is a component for communicating with external devices or servers according to various communication protocol types.
  • the communicator may include at least one of a Wifi module, a Bluetooth module, a wired Ethernet module and other network communication protocol chips or a near field communication protocol chip, and an infrared receiver.
  • the display device 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
  • the detector 230 is used to collect signals from the external environment or interaction with the outside.
  • the detector 230 includes a light receiver, i.e. a sensor used to collect the ambient light intensity; or, the detector 230 includes an image collector, such as a camera, which can be used to collect external environment scenes, user attributes, or user interaction gestures; or,
  • the detector 230 includes a sound collector, such as a microphone, for receiving external sound.
  • the external device interface 240 may include, but is not limited to, any one or more of the following interfaces: a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface (Component), a composite video input interface (CVBS), a USB input interface (USB), an RGB port, and the like. It may also be a composite input/output interface formed by several of the above interfaces.
  • the controller 250 and the tuner and demodulator 210 may be located in different separate devices; that is, the tuner and demodulator 210 may also be in a device external to the main device where the controller 250 is located, such as an external set-top box.
  • the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in the memory.
  • the controller 250 controls the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
  • the object may be any one of selectable objects, such as a hyperlink, an icon, or other operable controls.
  • Operations related to the selected object include: displaying operations connected to hyperlink pages, documents, images, etc., or performing operations corresponding to the icons.
  • the controller includes at least one of a central processing unit (Central Processing Unit, CPU), a video processor, an audio processor, a graphics processing unit (Graphics Processing Unit, GPU), a RAM (Random Access Memory), a ROM (Read-Only Memory), first to nth interfaces for input/output, a communication bus (Bus), and the like.
  • the CPU processor is used to execute the operating system and application program instructions stored in the memory, and execute various application programs, data and content according to various interactive instructions received from the outside, so as to finally display and play various audio and video content.
  • the CPU processor may include multiple processors. For example, it includes a main processor and one or more sub-processors.
  • the graphics processor is used to generate various graphics objects, such as at least one of icons, operation menus, and graphics displayed in response to user input instructions.
  • the graphics processor includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit; the rendered objects are then displayed on the display.
  • the video processor is used to receive an external video signal and to perform, according to the standard codec protocol of the input signal, at least one of decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, so as to obtain a signal that can be directly displayed or played on the display device 200.
  • the video processor includes at least one of a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
  • the demultiplexing module is used to demultiplex the input audio and video data stream.
  • the video decoding module is used to process the demultiplexed video signal, including decoding and scaling.
  • The image synthesis module, such as an image synthesizer, is used to superimpose and mix the GUI signal generated by the graphics generator, according to user input or generated by itself, with the scaled video image, so as to generate an image signal for display.
  • the frame rate conversion module is used to convert the frame rate of the input video.
  • the display formatting module is used to convert the frame-rate-converted video output signal into a signal that conforms to the display format, for example an output RGB data signal.
  • the audio processor is configured to receive an external audio signal and, according to the standard codec protocol of the input signal, perform decompression and decoding, as well as at least one of noise reduction, digital-to-analog conversion, and amplification, to obtain a sound signal that can be played by the speaker.
  • the user may input a user command on a graphical user interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the graphical user interface (GUI).
  • the user may input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
  • the "user interface” is a medium interface for interaction and information exchange between an application or operating system and a user, and it realizes the conversion between the internal form of information and the form acceptable to the user.
  • the commonly used form of the user interface is the Graphic User Interface (GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It can be an icon, window, control and other interface elements displayed on the display screen of an electronic device.
  • a control may include at least one of visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • the user interface 280 is an interface that can be used to receive control input (for example, a physical button on the body of the display device, or others).
  • the system of the display device may include a kernel (Kernel), a command parser (shell), a file system, and an application program.
  • the kernel, shell, and file system together form the basic operating system structure. They allow users to manage files, run programs, and use the system.
  • the kernel starts, activates the kernel space, abstracts hardware, initializes hardware parameters, etc., runs and maintains virtual memory, scheduler, signals, and inter-process communication (IPC).
  • After the kernel starts, the shell and user applications are loaded.
  • After an application is started, it is compiled into machine code, forming a process.
  • the system is divided into four layers, which are, from top to bottom, the applications layer (referred to as the "application layer"), the application framework layer (referred to as the "framework layer"), the Android runtime and system library layer (referred to as the "system runtime library layer"), and the kernel layer.
  • These applications may be window programs, system setting programs, or clock programs that come with the operating system; they may also be applications developed by third-party developers.
  • In specific implementations, the application packages in the application layer are not limited to the above examples.
  • the framework layer provides application programming interfaces (application programming interface, API) and programming frameworks for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer is equivalent to a processing center, which decides to let applications in the application layer take actions. Through the API interface, the application can access the resources in the system and obtain the services of the system during execution.
  • the application framework layer in this embodiment of the application includes managers (Managers), content providers (Content Provider), and the like, where the managers include at least one of the following modules: an activity manager (Activity Manager), used to interact with all activities running in the system; a location manager (Location Manager), used to provide system services or applications with access to system location services; a package manager (Package Manager), used to retrieve various information related to the application packages currently installed on the device; a notification manager (Notification Manager), used to control the display and clearing of notification messages; and a window manager (Window Manager), used to manage icons, windows, toolbars, wallpapers, and desktop widgets on the user interface.
  • the activity manager is used to manage the life cycle of each application program and the usual navigation fallback functions, such as controlling the exit, opening, and back of the application program.
  • the window manager is used to manage all window programs, such as obtaining the size of the display screen, judging whether there is a status bar, locking the screen, capturing the screen, and controlling the display window changes (such as shrinking the display window, dithering, distorting, etc.).
  • the system runtime layer provides support for the upper layer, that is, the framework layer.
  • the Android operating system will run the C/C++ library included in the system runtime layer to implement functions to be implemented by the framework layer.
  • the kernel layer is a layer between hardware and software.
  • the kernel layer contains at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a WIFI driver, a USB driver, an HDMI driver, sensor drivers (such as a fingerprint sensor, a temperature sensor, a pressure sensor, etc.), a power driver, and the like.
  • FIG. 2A and FIG. 2B exemplarily show a schematic diagram of a GUI 400 provided by the display device 200.
  • the display device provides a GUI 400 to the display according to the control instruction input by the user by operating the control device.
  • the GUI 400 includes a main display screen 41 and a menu interface 42 including a plurality of items 421 to 428.
  • a focus object 43 located on one of the items can be moved by a focus movement instruction input by the user so as to change which item is selected; for example, the user can input the focus movement instruction by pressing the up/down/left/right direction keys of the remote control.
  • Depending on the current settings or state of the display device, the items in the menu interface have different states.
  • items 421 to 424 in the menu interface 42 are unavailable items (ie, grayed out items), and items 425 to 428 are available items.
  • the item where the focus object is located is an available item, and the focus object cannot be placed on an unavailable item.
  • the main display screen 41 may be at least one of the image, text, and video content that the user is watching.
  • the playback screen shown in FIG. 2A is a picture screen, or it may be a menu page, a search page, or an application page.
  • the playback screen shown in FIG. 2B is the upper menu of the menu interface 42.
  • Fig. 3 exemplarily shows a schematic diagram of a GUI 400-1 provided by the display device 200 in the prior art.
  • FIG. 4 and FIG. 5 exemplarily show a schematic diagram of a GUI 400 provided by the display device 200.
  • the current focus object 43 is placed on the item 425.
  • the display device 200 can respond to the focus movement instruction.
  • In some embodiments, after the user inputs a focus movement instruction, if the instruction indicates moving the focus object 43 upward, the display device 200 displays, through the display, the GUI 400-1 shown in FIG. 3, in which the menu interface 42-1 includes the item 420 exposed by scrolling the menu interface 42 upward, together with the items 421 to 427, while the position of the focus object 43 does not change.
  • Only when the user inputs the focus movement instruction again does the focus object move to the item 420.
  • As an example of this solution, in the current menu interface 42, the item 426 below the item 425 where the focus object 43 is located is an available item; if the focus movement instruction indicates moving the focus object 43 downward, the focus object 43 moves to the item 426, as shown in FIG. 4. As another example, in the current menu interface 42, the items 421 to 424 above the item 425 where the focus object 43 is located are all unavailable items, and the items 421 to 424 form an unavailable item area 400, that is, the focus object is at the boundary of the unavailable item area; if the focus movement instruction indicates moving the focus object 43 upward, the focus object 43 skips the unavailable item area and moves directly to the item 420, as shown in FIG. 5.
  • 6A and 6B exemplarily show a schematic diagram of a GUI 500 provided by the display device 200.
  • As shown in FIG. 6A or FIG. 6B, the display device provides a GUI 500 to the display according to the user's menu-interface opening operation, for example, a control instruction input by the user by operating the control device.
  • The GUI 500 includes a main display screen 51 and the opened menu interface 52, which includes multiple items 521 to 525.
  • the main display screen 51 may be at least one of the image, text, and video content that the user is watching.
  • For example, the playback screen shown in FIG. 6A is a picture screen; it may also be a menu page, a search page, an application page, or the like. For example, the playback screen shown in FIG. 6B is the upper-level menu of the menu interface 52.
  • item 521 is the first item of the menu interface 52 and is an unavailable item
  • item 522 is the second item of the menu interface 52 and is an available item.
  • Then, when the menu interface 52 is opened, the focus object 53 is controlled to be on the first available item, namely the item 522.
  • FIG. 7 exemplarily shows a schematic flowchart of Embodiment 1 of a method for moving a focus object. As shown in Figure 7, the method includes:
  • S101 Acquire a focus movement instruction input by the user.
  • In this step, the controller acquires the focus movement instruction input by the user through detection; for example, it detects a focus movement instruction input by the user by pressing or touching the up/down/left/right movement keys on the control device 100, or receives a focus movement instruction input by the user through voice, etc.
  • the focus movement instruction includes a first movement direction, and the first movement direction is used to indicate the target direction in which the focus object will move, such as up, down, left, and right.
  • S102 According to the focus movement instruction, control the focus object to move to the first target item in the first movement direction.
  • the controller controls the focus object to move to the first target item according to the focus movement instruction.
  • the first target item is the available item in the menu interface that is closest to the item where the focus object is located.
  • For example, if the focus movement instruction instructs to move the focus object upward, the focus object is controlled to move to the first available item above the current item. It should be understood that, under certain conditions, some items in the menu interface cannot be set or used; for example, when the sound output is a power amplifier, items such as Sound Mode and Wall Mount Setup in the sound menu are unavailable and grayed out.
  • In some embodiments, if the first target item is not displayed on the current menu interface, the menu interface is scrolled until the first target item is exposed, and at the same time the focus object is moved to the first target item.
  • An embodiment of this solution provides a method for moving a focus object. A focus movement instruction input by a user is acquired, the focus movement instruction including a first movement direction, and according to the focus movement instruction the focus object is controlled to move, in the first movement direction, to a first target item, where the first target item is the available item in the menu interface that is closest to the item where the focus object is located. When one or more unavailable items exist on the menu interface, the focus object can thus be moved quickly to the corresponding available item according to the instruction input by the user, without requiring the user to input the instruction multiple times, which improves the user experience.
  • FIG. 8 exemplarily shows a schematic flowchart of Embodiment 2 of the method for moving a focus object.
  • As shown in FIG. 8, step S102, controlling the focus object to move in the first movement direction to the first target item according to the focus movement instruction, specifically includes:
  • S1021: Determine whether there is an unavailable item area adjacent to the focus object in the first movement direction.
  • the unavailable item area includes one or more consecutive unavailable items.
  • For example, with reference to FIG. 2A, the current focus object is on the item 425, the items 421 to 424 above the item 425 constitute an unavailable item area, and the item 426 below the item 425 is an available item. If the first movement direction is upward, the focus object has an adjacent unavailable item area in the first movement direction; if the first movement direction is downward, the focus object has no adjacent unavailable item area in the first movement direction.
  • If the focus object has an adjacent unavailable item area in the first movement direction, step S1022 is entered; if the focus object has no adjacent unavailable item area in the first movement direction, step S1023 is entered.
  • FIG. 9 exemplarily shows a schematic flowchart of Embodiment 3 of the method for moving a focus object. As shown in Figure 9, it includes:
  • S1: Determine whether the item adjacent to the focus object in the first movement direction is an unavailable item.
  • If yes, go to step S2; otherwise, go to step S3.
  • S2: Determine whether the item adjacent to that unavailable item is also an unavailable item; repeat this step until the item adjacent to an unavailable item is determined to be an available item, thereby obtaining the unavailable item area.
  • S3: The focus object has no adjacent unavailable item area in the first movement direction.
  • For example, with reference to FIG. 2A and FIG. 5, the current focus object is on the item 425. If the first movement direction is upward, the item 424 above the item 425 is an unavailable item; it is then determined whether the item above the item 424 is an unavailable item, and the item 423 above the item 424 is also unavailable. This step is repeated until the item 420 is determined to be an available item, at which point the process stops and the unavailable item area (including the unavailable items 421 to 424) is obtained. If the first movement direction is downward, the item 426 below the item 425 is an available item, and it is determined that the focus object has no adjacent unavailable item area in the first movement direction.
  • S1022: Control the focus object to skip the unavailable item area in the first movement direction and move to the first target item, where the first target item is adjacent to the unavailable item area.
  • S1023: Control the focus object to move in the first movement direction to the first target item, where the first target item is adjacent to the item where the current focus object is located.
  • In this embodiment, depending on whether an unavailable item area exists in the first movement direction of the item where the focus object is located, the focus object is moved to the closest available item in the first movement direction, so as to respond quickly and accurately to the user's focus movement instruction.
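  • As an illustration only, the logic of steps S1 to S3 and S1022/S1023 can be sketched roughly as follows; the Item interface, the item list, and the index convention are assumptions made for this example, not the patent's actual data structures:

```java
import java.util.List;

// Minimal sketch of the skip logic from steps S1-S3 and S1022/S1023.
final class FocusMover {

    interface Item {
        boolean isEnabled(); // false means the item is grayed out (unavailable)
    }

    /**
     * Returns the index the focus should move to, or -1 if no available item
     * exists in that direction.
     *
     * @param direction -1 to move up (toward index 0), +1 to move down
     */
    static int nextFocusIndex(List<? extends Item> items, int current, int direction) {
        int i = current + direction;
        // S1/S2: walk past the contiguous unavailable item area, if any.
        while (i >= 0 && i < items.size() && !items.get(i).isEnabled()) {
            i += direction;
        }
        // S1022/S1023: the first enabled item reached is the first target item.
        return (i >= 0 && i < items.size()) ? i : -1;
    }
}
```

  • For instance, with the items of FIG. 2A ordered as 420 to 428, item 425 sits at index 5 and items 421 to 424 are disabled; nextFocusIndex(items, 5, -1) then returns 0 (item 420), matching FIG. 5, while nextFocusIndex(items, 5, +1) returns 6 (item 426), matching FIG. 4.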
  • This solution further provides a method for moving the focus object in which, during the opening of any menu interface, the focus object is controlled to be displayed on the first available item of the current menu interface, so as to avoid the problem that the focus object is not displayed when the first item of the menu interface is an unavailable item. Specifically, in response to a menu-interface opening operation input by the user, the focus object is controlled to move to a second target item, the second target item being the first available item in the menu interface. For determining the second target item, the following two possible implementations are provided:
  • Method 1: Determine in sequence whether the items in the menu interface are available items, and use the first available item determined as the second target item. That is, starting from the first item in the menu interface, confirm in turn whether each item is an available item, stop when an available item is confirmed, and take that available item as the second target item.
  • Method 2: Determine whether each item in the menu interface is an available item, and use the first of the available items as the second target item. That is, traverse every item in the menu interface, and from all the available items determined, select, according to their order in the menu interface, the one sorted first as the second target item.
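  • As an illustration only, the two implementations could be sketched as follows; the list-of-items abstraction reuses the hypothetical FocusMover.Item interface from the earlier sketch and is not the patent's actual code:

```java
import java.util.ArrayList;
import java.util.List;

// Two sketches of locating the second target item (the first available item
// of a freshly opened menu page).
final class SecondTargetFinder {

    // Method 1: scan in order and stop at the first enabled item.
    static int firstAvailableByScan(List<? extends FocusMover.Item> items) {
        for (int i = 0; i < items.size(); i++) {
            if (items.get(i).isEnabled()) {
                return i; // second target item
            }
        }
        return -1; // the menu has no available item
    }

    // Method 2: collect every enabled item, then take the one ordered first.
    static int firstAvailableByTraversal(List<? extends FocusMover.Item> items) {
        List<Integer> available = new ArrayList<>();
        for (int i = 0; i < items.size(); i++) {
            if (items.get(i).isEnabled()) {
                available.add(i);
            }
        }
        return available.isEmpty() ? -1 : available.get(0); // indices are already in menu order
    }
}
```

  • In the scenario of FIG. 6A/6B, where item 521 is disabled and item 522 is the first enabled item, both methods return index 1, and the focus object 53 is placed on item 522 as the menu opens.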
  • On the basis of the foregoing embodiments, the solution is described below by taking a display device running the Android P system as an example.
  • FIG. 10 exemplarily shows a schematic flowchart of Embodiment 4 of the method for moving a focus object.
  • the controller can detect the key events of the remote control in real time through the detection module.
  • In the Android P system, the system's native Fragment cannot directly receive key events, and the Fragment is used to generate the menu interface. In this solution, the key event is intercepted by the Activity and then forwarded to the Fragment for processing.
  • In a specific implementation, the keyCode and event of the current key event are obtained in the onKeyDown() function of TvSettingsActivity; a KeyEventService interface is then created in the Fragment, an onKeyEventService() function is defined to receive the keyCode and event sent by the Activity, and onKeyEventService() is overridden on each Fragment page that needs to obtain key events. Further, the logic in the Fragment determines, according to the acquired key value, whether the key event is a focus movement instruction, and determines the first movement direction in the focus movement instruction.
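  • A minimal sketch of this interception-and-forwarding scheme is given below, assuming the androidx Fragment APIs; only onKeyDown(), the KeyEventService interface name, and onKeyEventService() come from the text, while the container id and the lookup of the current Fragment are assumptions added for illustration:

```java
import android.view.KeyEvent;
import androidx.fragment.app.Fragment;
import androidx.fragment.app.FragmentActivity;

// KeyEventService.java -- hypothetical callback implemented by Fragments that need key events.
interface KeyEventService {
    boolean onKeyEventService(int keyCode, KeyEvent event);
}

// TvSettingsActivity.java -- intercept the key in the Activity and forward it to the Fragment.
public class TvSettingsActivity extends FragmentActivity {

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        // R.id.settings_container is an assumed container id, not taken from the patent.
        Fragment f = getSupportFragmentManager().findFragmentById(R.id.settings_container);
        if (f instanceof KeyEventService
                && ((KeyEventService) f).onKeyEventService(keyCode, event)) {
            return true; // the Fragment consumed the key event
        }
        return super.onKeyDown(keyCode, event);
    }
}
```

  • On the Fragment side, the received key value could be mapped to a focus movement instruction and its first movement direction roughly like this (moveFocus() is a hypothetical helper sketched in the next example):

```java
// Inside the settings Fragment implementing KeyEventService.
@Override
public boolean onKeyEventService(int keyCode, KeyEvent event) {
    if (event.getAction() != KeyEvent.ACTION_DOWN) {
        return false;
    }
    switch (keyCode) {
        case KeyEvent.KEYCODE_DPAD_UP:   // keyCode 19, the up key of the remote control
            return moveFocus(-1);
        case KeyEvent.KEYCODE_DPAD_DOWN:
            return moveFocus(+1);
        default:
            return false;                // not a focus movement instruction
    }
}
```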
  • Next, the View (displayed content) of the menu interface and its child Views need to be obtained. Each child View corresponds to a Preference, and each Preference corresponds to one item in the menu interface, so the Preference on which the focus is located can be determined by checking which child View the current focus object is on; for example, getListView() can be used to obtain the RecyclerView of the current page, and RecyclerView.getFocusedChild() then returns the child View where the focus currently is. In some embodiments, the foregoing process may include determining whether the View is empty, so as to increase the fault tolerance of the system.
  • If the Fragment then receives, in onKeyEventService(), a key event whose keyCode is 19 (the up key of the remote control), the isEnabled() function of Preference is used to determine in turn whether the four Preferences above item 425 in FIG. 2A (TotalSonics, TotalSurround, TotalVolume, and the wall mount setting) are grayed out. If all the returned values are false, this is an unavailable item area and the current focus object is located at its boundary, which means the focus object needs to be moved to the item 420 shown in FIG. 5.
  • Here, the smoothScrollToPosition(int position) function of the RecyclerView is used with the parameter 2 (item 420 is the third item on the page), so that smooth movement of the focus to the first target position is achieved.
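  • Putting the pieces together, a hedged sketch of the boundary check and the scroll described in the last two paragraphs is given below. The Fragment name, the preference keys, and the XML resource are assumptions; getListView(), getFocusedChild(), isEnabled(), and smoothScrollToPosition() are the calls named in the text:

```java
import android.os.Bundle;
import androidx.preference.Preference;
import androidx.preference.PreferenceFragmentCompat;
import androidx.recyclerview.widget.RecyclerView;

// SoundSettingsFragment.java -- hypothetical fragment for the sound menu of FIG. 2A.
public class SoundSettingsFragment extends PreferenceFragmentCompat {

    @Override
    public void onCreatePreferences(Bundle savedInstanceState, String rootKey) {
        setPreferencesFromResource(R.xml.sound_settings, rootKey); // assumed resource
    }

    // onKeyEventService(), as sketched earlier, calls moveFocus(-1) for keyCode 19.
    private boolean moveFocus(int direction) {
        RecyclerView list = getListView();          // RecyclerView backing the preference page
        if (list == null || list.getFocusedChild() == null) {
            return false;                           // fault tolerance: bail out if the view is empty
        }
        // The four preferences above item 425 in FIG. 2A; the keys are assumed.
        String[] keysAbove = {"total_sonics", "total_surround", "total_volume", "wall_mount_setup"};
        boolean allGrayedOut = true;
        for (String key : keysAbove) {
            Preference p = findPreference(key);
            if (p != null && p.isEnabled()) {       // isEnabled() == false means grayed out
                allGrayedOut = false;
                break;
            }
        }
        if (direction < 0 && allGrayedOut) {
            // The focus sits at the boundary of an unavailable item area: jump straight
            // to item 420, the third item on the page, i.e. adapter position 2.
            list.smoothScrollToPosition(2);
            return true;
        }
        return false;                               // otherwise fall back to default focus handling
    }
}
```

  • Position 2 is passed only because, as stated above, item 420 is the third item on the page; in a real implementation the target position would be computed from the scan of the unavailable item area rather than hard-coded.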

Abstract

A display method and a display device. A focus movement instruction input by a user is acquired, and according to the focus movement instruction a focus object is controlled to move, in a first movement direction, to a first target item, the first target item being the available item in the menu interface that is closest to the item where the focus object is located. When one or more unavailable items exist on the menu interface, this enables the focus object to move quickly to the corresponding available item according to the instruction input by the user.

Description

一种显示方法及显示设备
本申请要求在2019年12月20日提交中国专利局、申请号为CN201911328401.9、申请名称为“焦点对象的移动方法及显示设备”的中国专利的优先权;其全部内容通过引用结合在本申请中。
技术领域
本申请涉及显示技术领域,尤其涉及一种显示方法及显示设备。
背景技术
显示设备显示菜单时,在特定条件下,菜单中的部分项目为不可用项目(即项目置灰),将不能对这些不可用项目进行选择,且焦点对象也不能移动至不可用项目,例如,当设置了显示设备的声音输出为功放时,声音菜单的声音模式、壁挂设置(壁挂音效)等设置项目不能被操作,为不可用项目。
目前,使用安卓(Android)系统的显示设备,例如Android的P版本,在焦点对象移动至不可用项目,或者多个连续不可用项目的边界时,若用户通过按压遥控器的方向键,让焦点对象向不可用项目的方向移动,则焦点对象不会直接跳过不可用项目,而是滚动页面使可用项目显示于菜单界面,当用户再次通过按压遥控器的同一方向键,才能使焦点对象移动至可用项目。
现有技术中,在菜单界面有一至多个不可用项目,使焦点对象所要移动至的可用项目不可见时,先将可用项目显示出来,再移动焦点对象,需要用户进行多次操作,且在将可用项目显示出来的过程中,容易导致当前焦点对象以及焦点对象所处的项目滚动出菜单界面,导致焦点对象不可见,不便于用户操作,影响了用户体验。
发明内容
本申请提供一种显示方法及显示设备,第一方面,提供一种显示方法,所述方法包括:
显示包括多个项目的菜单页面;其中,所述菜单页面中还包括指示项目被选择的焦点对象,以及包括至少一个不可用项目的不可用项目区域;所述不可用项目不可被焦点对象所选择;
接收用户输入的焦点移动指令;
在确定所述焦点对象所选择的当前项目沿第一移动方向上相邻有不可用项目区域时,控制所述焦点对象沿所述第一移动方向上,跳过所述不可用项目区域,直接移动至第一目标项目;所述第一目标项目与所述不可用项目区域相邻。
第二方面,提供一种显示设备,包括:
显示器,用于显示多个项目的菜单页面;其中,所述菜单页面中还包括指示项目被选择的焦点对象,以及包括至少一个不可用项目的不可用项目区域;所述不可用项目不可被 焦点对象所选择;
用户接口,用于接收用户输入的指令;
控制器,用于执行:
响应于用户输入的焦点移动指令,在确定所述焦点对象所选择的当前项目沿第一移动方向上相邻有不可用项目区域时,控制所述焦点对象沿所述第一移动方向上,跳过所述不可用项目区域,直接移动至第一目标项目;所述第一目标项目与所述不可用项目区域相邻。
附图说明
为了更清楚地说明本申请实施例或相关技术中的实施方式,下面将对实施例或相关技术描述中所需要使用的附图作一简单地介绍,显而易见地,下面描述中的附图是本申请的一些实施例,对于本领域普通技术人员来讲,还可以根据这些附图获得其他的附图。
图1A示出了根据一些实施例的显示设备的使用场景;
图1B示出了根据一些实施例的控制装置100的硬件配置框图;
图1C示出了根据一些实施例的显示设备200的硬件配置框图;
图1D示出了根据一些实施例的显示设备200中软件配置图;
图2A和图2B中示例性示出了显示设备200提供的一个GUI400的示意图;
图3中示例性示出了现有技术中显示设备200提供的一个GUI400-1的示意图;
图4中示例性示出了显示设备200提供的一个GUI400的示意图;
图5中示例性示出了显示设备200提供的一个GUI400的示意图;
图6A和图6B中示例性示出了显示设备200提供的一个GUI500的示意图;
图7示例性示出了焦点对象的移动方法实施例一的流程示意图;
图8示例性示出了焦点对象的移动方法实施例二的流程示意图;
图9示例性示出了焦点对象的移动方法实施例三的流程示意图;
图10示例性示出了焦点对象的移动方法实施例四的流程示意图。
具体实施方式
为使本申请的目的和实施方式更加清楚,下面将结合本申请示例性实施例中的附图,对本申请示例性实施方式进行清楚、完整地描述,显然,描述的示例性实施例仅是本申请一部分实施例,而不是全部的实施例。
需要说明的是,本申请中对于术语的简要说明,仅是为了方便理解接下来描述的实施方式,而不是意图限定本申请的实施方式。除非另有说明,这些术语应当按照其普通和通常的含义理解。
本申请中说明书和权利要求书及上述附图中的术语“第一”、“第二”、“第三”等是用于区别类似或同类的对象或实体,而不必然意味着限定特定的顺序或先后次序,除非另外注明。应该理解这样使用的用语在适当情况下可以互换。
术语“包括”和“具有”以及他们的任何变形,意图在于覆盖但不排他的包含,例如, 包含了一系列组件的产品或设备不必限于清楚地列出的所有组件,而是可包括没有清楚地列出的或对于这些产品或设备固有的其它组件。
术语“模块”是指任何已知或后来开发的硬件、软件、固件、人工智能、模糊逻辑或硬件或/和软件代码的组合,能够执行与该元件相关的功能。
图1A为根据实施例中显示设备的使用场景的示意图。如图1A所示,显示设备200还与服务器300进行数据通信,用户可通过智能设备100B或控制装置100A操作显示设备200。
在一些实施例中,控制装置100A可以是遥控器,遥控器和显示设备的通信包括红外协议通信或蓝牙协议通信,及其他短距离通信方式中的至少一种,通过无线或有线方式来控制显示设备200。用户可以通过遥控器上按键、语音输入、控制面板输入等至少一种输入用户指令,来控制显示设备200。
在一些实施例中,智能设备100B可以包括移动终端、平板电脑、计算机、笔记本电脑,AR/VR设备等中的任意一种。
在一些实施例中,也可以使用智能设备100B以控制显示设备200。例如,使用在智能设备上运行的应用程序控制显示设备200。
在一些实施例中,也可以使用智能设备100B和显示设备进行数据的通信。
在一些实施例中,显示设备200还可以采用除了控制装置100A和智能设备100B之外的方式进行控制,例如,可以通过显示设备200设备内部配置的获取语音指令的模块直接接收用户的语音指令控制,也可以通过显示设备200设备外部设置的语音控制装置来接收用户的语音指令控制。
在一些实施例中,显示设备200还与服务器300进行数据通信。可允许显示设备200通过局域网(LAN)、无线局域网(WLAN)和其他网络进行通信连接。服务器300可以向显示设备200提供各种内容和互动。服务器300可以是一个集群,也可以是多个集群,可以包括一类或多类服务器。
在一些实施例中,一个步骤执行主体执行的软件步骤可以随需求迁移到与之进行数据通信的另一步骤执行主体上进行执行。示例性的,服务器执行的软件步骤可以随需求迁移到与之数据通信的显示设备上执行,反之亦然。
图1B示例性示出了根据示例性实施例中控制装置100A的配置框图。如图1B所示,控制装置100A包括控制器110、通信接口130、用户输入/输出接口140、存储器、供电电源。控制装置100可接收用户的输入操作指令,且将操作指令转换为显示设备200可识别和响应的指令,起用用户与显示设备200之间交互中介作用。
在一些实施例中,通信接口130用于和外部通信,包含WIFI芯片,蓝牙模块,NFC或可替代模块中的至少一种。
在一些实施例中,用户输入/输出接口140包含麦克风,触摸板,传感器,按键或可替代模块中的至少一种。
图1C示出了根据示例性实施例中显示设备200的硬件配置框图。
在一些实施例中,显示设备200包括调谐解调器210、通信器220、检测器230、外部装置接口240、控制器250、显示器260、音频输出接口270、存储器、供电电源、用户接口中的至少一种。
在一些实施例中控制器包括中央处理器,视频处理器,音频处理器,图形处理器,RAM,ROM,用于输入/输出的第一接口至第n接口。
在一些实施例中,显示器260包括用于呈现画面的显示屏组件,以及驱动图像显示的驱动组件,用于接收源自控制器输出的图像信号,进行显示视频内容、图像内容以及菜单操控界面的组件以及用户操控UI界面等。
在一些实施例中,显示器260可为液晶显示器、OLED显示器、以及投影显示器中的至少一种,还可以为一种投影装置和投影屏幕。
在一些实施例中,调谐解调器210通过有线或无线接收方式接收广播电视信号,以及从多个无线或有线广播电视信号中解调出音视频信号,如以及EPG数据信号。
在一些实施例中,通信器220是用于根据各种通信协议类型与外部设备或服务器进行通信的组件。例如:通信器可以包括Wifi模块,蓝牙模块,有线以太网模块等其他网络通信协议芯片或近场通信协议芯片,以及红外接收器中的至少一种。显示设备200可以通过通信器220与控制装置100或服务器400建立控制信号和数据信号的发送和接收。
在一些实施例中,检测器230用于采集外部环境或与外部交互的信号。例如,检测器230包括光接收器,用于采集环境光线强度的传感器;或者,检测器230包括图像采集器,如摄像头,可以用于采集外部环境场景、用户的属性或用户交互手势,再或者,检测器230包括声音采集器,如麦克风等,用于接收外部声音。
在一些实施例中,外部装置接口240可以包括但不限于如下:高清多媒体接口(HDMI)、模拟或数据高清分量输入接口(分量)、复合视频输入接口(CVBS)、USB输入接口(USB)、RGB端口等任一个或多个接口。也可以是上述多个接口形成的复合性的输入/输出接口。
在一些实施例中,控制器250和调谐解调器210可以位于不同的分体设备中,即调谐解调器210也可在控制器250所在的主体设备的外置设备中,如外置机顶盒等。
在一些实施例中,控制器250,通过存储在存储器上中各种软件控制程序,来控制显示设备的工作和响应用户的操作。控制器250控制显示设备200的整体操作。例如:响应于接收到用于选择在显示器260上显示UI对象的用户命令,控制器250便可以执行与由用户命令选择的对象有关的操作。
在一些实施例中,所述对象可以是可选对象中的任何一个,例如超链接、图标或其他可操作的控件。与所选择的对象有关操作有:显示连接到超链接页面、文档、图像等操作,或者执行与所述图标相对应程序的操作。
在一些实施例中控制器包括中央处理器(Central Processing Unit,CPU),视频处理器,音频处理器,图形处理器(Graphics Processing Unit,GPU),RAM Random Access Memory,RAM),ROM(Read-Only Memory,ROM),用于输入/输出的第一接口至第n接口,通信总线(Bus)等中的至少一种。
CPU处理器。用于执行存储在存储器中操作系统和应用程序指令,以及根据接收外部输入的各种交互指令,来执行各种应用程序、数据和内容,以便最终显示和播放各种音视频内容。CPU处理器,可以包括多个处理器。如,包括一个主处理器以及一个或多个子处理器。
在一些实施例中,图形处理器,用于产生各种图形对象,如:图标、操作菜单、以及用户输入指令显示图形等中的至少一种。图形处理器包括运算器,通过接收用户输入各种交互指令进行运算,根据显示属性显示各种对象;还包括渲染器,对基于运算器得到的各种对象,进行渲染,上述渲染后的对象用于显示在显示器上。
在一些实施例中,视频处理器,用于将接收外部视频信号,根据输入信号的标准编解码协议,进行解压缩、解码、缩放、降噪、帧率转换、分辨率转换、图像合成等视频处理中的至少一种,可得到直接可显示设备200上显示或播放的信号。
在一些实施例中,视频处理器,包括解复用模块、视频解码模块、图像合成模块、帧率转换模块、显示格式化模块等中的至少一种。其中,解复用模块,用于对输入音视频数据流进行解复用处理。视频解码模块,用于对解复用后的视频信号进行处理,包括解码和缩放处理等。图像合成模块,如图像合成器,其用于将图形生成器根据用户输入或自身生成的GUI信号,与缩放处理后视频图像进行叠加混合处理,以生成可供显示的图像信号。帧率转换模块,用于对转换输入视频帧率。显示格式化模块,用于将接收帧率转换后视频输出信号,改变信号以符合显示格式的信号,如输出RGB数据信号。
在一些实施例中,音频处理器,用于接收外部的音频信号,根据输入信号的标准编解码协议,进行解压缩和解码,以及降噪、数模转换、和放大处理等处理中的至少一种,得到可以在扬声器中播放的声音信号。
在一些实施例中,用户可在显示器260上显示的图形用户界面(GUI)输入用户命令,则用户输入接口通过图形用户界面(GUI)接收用户输入命令。或者,用户可通过输入特定的声音或手势进行输入用户命令,则用户输入接口通过传感器识别出声音或手势,来接收用户输入命令。
在一些实施例中,“用户界面”,是应用程序或操作系统与用户之间进行交互和信息交换的介质接口,它实现信息的内部形式与用户可以接受形式之间的转换。用户界面常用的表现形式是图形用户界面(Graphic User Interface,GUI),是指采用图形方式显示的与计算机操作相关的用户界面。它可以是在电子设备的显示屏中显示的一个图标、窗口、控件等界面元素,其中控件可以包括图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、Widget等可视的界面元素中的至少一种。
在一些实施例中,用户接口280,为可用于接收控制输入的接口(如:显示设备本体上的实体按键,或其他等)。
在一些实施例中,显示设备的系统可以包括内核(Kernel)、命令解析器(shell)、文件系统和应用程序。内核、shell和文件系统一起组成了基本的操作系统结构,它们让用户可以管理文件、运行程序并使用系统。上电后,内核启动,激活内核空间,抽象硬件、初 始化硬件参数等,运行并维护虚拟内存、调度器、信号及进程间通信(IPC)。内核启动后,再加载Shell和用户应用程序。应用程序在启动后被编译成机器码,形成一个进程。
参见图1D,在一些实施例中,将系统分为四层,从上至下分别为应用程序(Applications)层(简称“应用层”),应用程序框架(Application Framework)层(简称“框架层”),安卓运行时(Android runtime)和系统库层(简称“系统运行库层”),以及内核层。
在一些实施例中,应用程序层中运行有至少一个应用程序,这些应用程序可以是操作系统自带的窗口(Window)程序、系统设置程序或时钟程序等;也可以是第三方开发者所开发的应用程序。在具体实施时,应用程序层中的应用程序包不限于以上举例。
框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。应用程序框架层相当于一个处理中心,这个中心决定让应用层中的应用程序做出动作。应用程序通过API接口,可在执行中访问系统中的资源和取得系统的服务。
如图1D所示,本申请实施例中应用程序框架层包括管理器(Managers),内容提供者(Content Provider)等,其中管理器包括以下模块中的至少一个:活动管理器(Activity Manager)用与和系统中正在运行的所有活动进行交互;位置管理器(Location Manager)用于给系统服务或应用提供了系统位置服务的访问;文件包管理器(Package Manager)用于检索当前安装在设备上的应用程序包相关的各种信息;通知管理器(Notification Manager)用于控制通知消息的显示和清除;窗口管理器(Window Manager)用于管理用户界面上的括图标、窗口、工具栏、壁纸和桌面部件。
在一些实施例中,活动管理器用于管理各个应用程序的生命周期以及通常的导航回退功能,比如控制应用程序的退出、打开、后退等。窗口管理器用于管理所有的窗口程序,比如获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕,控制显示窗口变化(例如将显示窗口缩小显示、抖动显示、扭曲变形显示等)等。
在一些实施例中,系统运行库层为上层即框架层提供支撑,当框架层被使用时,安卓操作系统会运行系统运行库层中包含的C/C++库以实现框架层要实现的功能。
在一些实施例中,内核层是硬件和软件之间的层。如图1D所示,内核层至少包含以下驱动中的至少一种:音频驱动、显示驱动、蓝牙驱动、摄像头驱动、WIFI驱动、USB驱动、HDMI驱动、传感器驱动(如指纹传感器,温度传感器,压力传感器等)、以及电源驱动等。图2A和图2B中示例性示出了显示设备200提供的一个GUI400的示意图。
如图2A所示,显示设备根据用户通过操作控制装置而输入的控制指令,向显示器提供一GUI400,该GUI400包括主显示画面41,和包括多个项目421~428的菜单界面42,以及处于任一项目的焦点对象43,可通过用户输入的焦点移动指令,移动焦点对象43,以改变选择的不同项目,例如,用户通过按压遥控器的上/下/左/右方向键,输入焦点移动指令。根据当前显示设备的设置或者状态,菜单界面中项目具有不同的状态,示例性的,菜单界面42中项目421~424为不可用项目(即置灰项目),项目425~428为可用项目。进一步地,焦点对象所处的项目为可用项目,焦点对象不可置于不可用项目上。
在一些实施例中,主显示画面41可以是用户正在观看的图像、文字、视频内容中的至少一个,例如图2A所示播放画面为一个图片画面,也可以是菜单页面、搜索页面、应用页面等,例如图2B所示播放画面为菜单界面42的上级菜单。
图3中示例性示出了现有技术中显示设备200提供的一个GUI400-1的示意图。图4和图5中示例性示出了显示设备200提供的一个GUI400的示意图
如图2A所示,当前的焦点对象43置于项目425上,当用户通过操作控制装置而输入焦点移动指令,显示设备200可以响应于该焦点移动指令。
在一些实施例中,当用户输入焦点移动指令后,若焦点移动指令指示将焦点对象43向上移动,则显示设备200通过显示器显示如图3所示的GUI400-1,其中,菜单界面42-1包括由菜单界面42向上移动,露出的项目420,以及项目421~427,而焦点对象43的位置不发生改变,当用户再次输入焦点移动指令后,焦点对象移动至项目420。
本方案提供的一种示例,在当前菜单界面42中,焦点对象43所处的项目425下方的项目426为可用项目,若焦点移动指令指示将焦点对象43向下移动,则焦点对象43移动至项目426,如图4所示;作为另一种示例,在当前菜单界面42中,由于焦点对象43所处的项目425上方的项目421~424均为不可用项目,项目421~424形成了不可用项目区域400,即焦点对象处于不可用项目区域的边界,若焦点移动指令指示将焦点对象43向上移动,则焦点对象43跳过不可用项目区域,直接移动至项目420,如图5所示。
图6A和图6B中示例性示出了显示设备200提供的一个GUI500的示意图。
如图6A或6B所示,显示设备根据用户的菜单界面打开操作,例如,用户通过操作控制装置而输入的控制指令,向显示器提供一GUI500,该GUI500包括主显示画面51,和所打开的包括多个项目521~525的菜单界面52。在本申请一些实施例中,主显示画面51可以是用户正在观看的图像、文字、视频内容中的至少一个,例如图5A所示播放画面为一个图片画面,也可以是菜单页面、搜索页面、应用页面等,例如图5B所示播放画面为菜单界面52的上级菜单。
其中,项目521为菜单界面52的首个项目,且为不可用项目,项目522为菜单界面52的第二个项目,且为可用项目,则在打开菜单界面52的同时,控制焦点对象53处于首个可用项目,即项目522。
图7示例性示出了焦点对象的移动方法实施例一的流程示意图。如图7所示,该方法包括:
S101:获取用户输入的焦点移动指令。
在本步骤中,控制器通过检测获取用户输入的焦点移动指令,例如,检测用户通过按压或触摸控制装置100上的上/下/左/右的移动按键输入的焦点移动指令,或者接收用户通过语音输入的焦点移动指令等。
该焦点移动指令包括第一移动方向,第一移动方向用于指示焦点对象将要移动的目标方向,如向上、向下、向左、向右等。
S102:根据焦点移动指令,控制焦点对象在第一移动方向上,移动至第一目标项目。
控制器根据焦点移动指令,控制焦点对象移动至第一目标项目,该第一目标项目为菜单界面中与焦点对象所处的项目距离最近的可用项目,例如,焦点移动指令指示将焦点对象向上移动,则控制焦点对象移动至当前项目上方的第一个可用项目上,应理解,在特定条件下,菜单界面中的部分项目不能设置或者使用,例如当声音输出为功放时,声音菜单中的声音模式(Sound Mode)、壁挂设置(Wall Mount Setup)等项目不可用,为置灰状态。
在一些实施例中,若第一目标项目未显示于当前菜单界面,则将菜单界面滚动至第一目标项目露出,同时将焦点对象移动至第一目标项目。
本方案实施例提供的一种焦点对象的移动方法,通过获取用户输入的焦点移动指令,该焦点移动指令包括第一移动方向,并根据焦点移动指令,控制焦点对象在第一移动方向上,移动至第一目标项目,其中,第一目标项目为菜单界面中与所述焦点对象所处的项目距离最近的可用项目,在菜单界面上存在一致多个不可用项目时,使焦点对象能够根据用户输入的指令快速移动至对应的可用项目,不需要用户进行多次输入指令的操作,提高了用户体验。
在图7所示实施例的基础上,图8示例性示出了焦点对象的移动方法实施例二的流程示意图。如图8所示,步骤S102:根据焦点移动指令,控制焦点对象在所述第一移动方向上,移动至第一目标项目,具体包括:
S1021:确定焦点对象在第一移动方向上是否相邻有不可用项目区域。
其中,不可用项目区域包括一至多个连续的不可用项目。
例如,结合图2A所示,当前焦点对象处于项目425,项目425上方的项目421~424组成了不可用项目区域,项目425下方的项目426为可用项目。若第一移动方向为向上,则焦点对象在第一移动方向上相邻有不可用项目区域,若第一移动方向为向下,则焦点对象在第一移动方向上没有相邻不可用项目区域。
若焦点对象在第一移动方向上相邻有不可用项目区域,则进入步骤S1022;若焦点对象在第一移动方向上没有相邻不可用项目区域,则进入步骤S1023。
作为本步骤的一种示例,图9示例性示出了焦点对象的移动方法实施例三的流程示意图。如图9所示,包括:
S1:确定在第一移动方向上与焦点对象相邻的项目是否为不可用项目。
若是,则进入步骤S2;否则,进入步骤S3。
S2:确定与所不可用项目相邻的项目是否为不可用项目,重复本步骤直至确定与不可用项目相邻的项目为可用项目,并得到不可用项目区域。
S3:焦点对象在第一移动方向上没有相邻的不可用项目区域。
例如,结合图2A和图5所示,当前焦点对象处于项目425,若第一移动方向为向上,则项目425上方的项目424为不可用项目,继续确定项目424上方的项目是否为不可用项目,项目424项目上方的项目423也为不可用项目,则重复本步骤,直至确定项目420为可用项目,停止本过程,并得到不可用项目区域(包括不可用项目421~425);若第一移动方向为向下,项目425下方的项目426为可用项目,则确定焦点对象在第一移动方向上没 有相邻的不可用项目区域。
S1022:控制焦点对象在第一移动方向上,跳过不可用项目区域,移动至第一目标项目。其中,第一目标项目与不可用项目区域相邻。
S1023:控制焦点对象在第一移动方向上,移动至第一目标项目;所述第一目标项目与当前焦点对象所处的项目相邻。
本实施例中,根据焦点对象所处的项目在第一移动方向上是否存在不可用项目区域,将焦点对象移动至第一移动方向上距离最近的可用项目,快速准确的响应用户的焦点移动指令。
本方案提供的一种焦点对象的移动方法,在任一菜单界面打开的过程中,将控制焦点对象显示于当前菜单界面的首个可用项目上,避免当菜单界面的首个项目为不可用项目时,造成焦点对象不显示的问题,具体包括:响应于用户输入的菜单界面打开操作,控制焦点对象移动至第二目标项目,该第二目标项目为菜单界面中首个可用项目。本方案对于如何确定第二目标项目,提供如下两种可能的实现方式:
方式一,依次确定菜单界面的项目是否为可用项目,将确定的首个可用项目作为第二目标项目。即由菜单界面的首个项目开始,依次确认每个项目是否为可用项目,当确认到可用项目时停止,将该可用项目作为第二目标项目。
方式二,确定菜单界面的每个项目是否为可用项目,将可用项目中的首个可用项目作为第二目标项目。遍历菜单界面中的每个项目,将确定的每个可用项目,按照其在菜单界面中的顺序,选择排序在先的项目作为第二目标项目。
在上述实施例的基础上,以显示设备使用的系统为Android P系统为例,对本方案进行说明,图10示例性示出了焦点对象的移动方法实施例四的流程示意图。如图10所示,在具体的实现过程中,控制器可通过检测模块实时检测遥控器的按键事件,例如,在Android P系统中通过系统原生的Fragment无法直接接收按键事件,其中,Fragment用于生成菜单界面,本方案通过Activity截获按键事件,再转交Fragment处理该按键事件,具体的实现是在TvSettingsActivity的onKeyDown()函数中获取到当前按键事件的keyCode及event,然后在Fragment中创建KeyEventService接口,并通过定义onKeyEventService()函数来接收Activity发送的keyCode及event,在需要获取按键事件的Fragment页面重写onKeyEventService()。进一步地,通过Fragment中的逻辑策略根据获取的按键键值确定按键事件是否为焦点移动指令,以及确定该焦点移动指令中的第一移动方向。
进而,需要获取菜单界面中的View(显示内容)及子View。Preference继承自View类,每个子View对应一个Preference,每个Preference对应菜单界面中的一个项目,因此,可以通过判断当前焦点对象位于哪个子View层上确定焦点所在的Preference,示例性的,可使用getListView()函数获取当前页面的RecycleView,再通过RecycleView.getFocusedChild()即可获取目前焦点所在的子View。在本申请一些实施例中,在上述过程中可包括判断View是否为空,增加系统的容错能力。
如果Fragment同时在onKeyEventService()接收到keyCode为19(遥控器的上方向 键)的按键事件,通过Preference的isEnabled()函数,依次判断图2A中项目425上方的TotalSonics、TotalSurround、TotalVolume及壁挂设置,四个Preference是否置灰,如果返回的都是false,则说明这是一个不可用项目区域且当前焦点对象位于该区域的边界,就说明需要将焦点对象移动到图5所示的项目420上,这里我们使用RecycleView的smoothScrollToPosition(int position)函数,并传递参数为2(项目420为页面第三项),这样就实现了焦点到第一目标位置的平滑移动。
最后应说明的是:以上各实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述各实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分或者全部技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的范围。
为了方便解释,已经结合具体的实施方式进行了上述说明。但是,上述示例性的讨论不是意图穷尽或者将实施方式限定到上述公开的具体形式。根据上述的教导,可以得到多种修改和变形。上述实施方式的选择和描述是为了更好的解释原理以及实际的应用,从而使得本领域技术人员更好的使用所述实施方式以及适于具体使用考虑的各种不同的变形的实施方式。

Claims (11)

  1. A display method, characterized in that the method comprises:
    displaying a menu page comprising multiple items, wherein the menu page further comprises a focus object indicating that an item is selected, and an unavailable item area comprising at least one unavailable item, the unavailable item being unselectable by the focus object;
    receiving a focus movement instruction input by a user; and
    when it is determined that the current item selected by the focus object is adjacent to an unavailable item area along a first movement direction, controlling the focus object to skip the unavailable item area along the first movement direction and move directly to a first target item, the first target item being adjacent to the unavailable item area.
  2. The method according to claim 1, characterized in that the method further comprises:
    when it is determined that the current item selected by the focus object is not adjacent to an unavailable item area along the first movement direction, controlling the focus object to move along the first movement direction to the first target item, the first target item being adjacent to the current item.
  3. The method according to claim 1, characterized in that the method further comprises:
    determining the unavailable item area.
  4. The method according to claim 3, characterized in that the method further comprises:
    determining whether the item adjacent to the current item in the first movement direction is an unavailable item; and
    if yes, determining whether the item adjacent to that unavailable item is an unavailable item, and repeating this step until the item adjacent to an unavailable item is determined to be an available item, thereby obtaining the unavailable item area.
  5. The method according to claim 1, characterized in that, before the receiving of the focus movement instruction input by the user, the first target item is not displayed on the menu page; and
    while the focus object moves directly to the first target item, the display is updated so that the first target item is shown on the menu page.
  6. The method according to claim 1, characterized in that the method further comprises:
    receiving a menu-interface opening instruction input by the user, displaying a menu page comprising multiple items, and at the same time controlling the focus object to move to a second target item, the second target item being the first available item in the menu interface.
  7. The method according to claim 6, characterized in that the method further comprises:
    determining in sequence whether the items of the menu interface are available items, and using the first available item determined as the second target item.
  8. The method according to claim 6, characterized in that the method further comprises:
    determining whether each item of the menu interface is an available item, and using the first available item among all the available items as the second target item.
  9. A display device, characterized by comprising:
    a display, configured to display a menu page of multiple items, wherein the menu page further comprises a focus object indicating that an item is selected, and an unavailable item area comprising at least one unavailable item, the unavailable item being unselectable by the focus object;
    a user interface, configured to receive instructions input by a user; and
    a controller, configured to perform:
    in response to a focus movement instruction input by the user, when it is determined that the current item selected by the focus object is adjacent to an unavailable item area along a first movement direction, controlling the focus object to skip the unavailable item area along the first movement direction and move directly to a first target item, the first target item being adjacent to the unavailable item area.
  10. The device according to claim 9, characterized in that the controller is specifically configured to:
    determine whether the item adjacent to the current item in the first movement direction is an unavailable item; and
    if yes, determine whether the item adjacent to that unavailable item is an unavailable item, and repeat this step until the item adjacent to an unavailable item is determined to be an available item, thereby obtaining the unavailable item area.
  11. The device according to claim 9, characterized in that the controller is specifically configured such that:
    before the focus movement instruction input by the user is responded to, the first target item is not displayed on the menu page; and
    while the focus object moves directly to the first target item, the display is updated so that the first target item is shown on the menu page.
PCT/CN2020/133646 2019-12-20 2020-12-03 一种显示方法及显示设备 WO2021121051A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911328401.9 2019-12-20
CN201911328401.9A CN111045557A (zh) 2019-12-20 2019-12-20 焦点对象的移动方法及显示设备

Publications (1)

Publication Number Publication Date
WO2021121051A1 true WO2021121051A1 (zh) 2021-06-24

Family

ID=70238072

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/133646 WO2021121051A1 (zh) 2019-12-20 2020-12-03 一种显示方法及显示设备

Country Status (2)

Country Link
CN (1) CN111045557A (zh)
WO (1) WO2021121051A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111045557A (zh) * 2019-12-20 2020-04-21 青岛海信电器股份有限公司 焦点对象的移动方法及显示设备
CN112511874B (zh) * 2020-11-12 2023-10-03 北京视游互动科技有限公司 游戏操控方法、智能电视及存储介质
CN113703625A (zh) * 2021-07-30 2021-11-26 青岛海尔科技有限公司 控制焦点移动的方法、装置、存储介质及电子装置

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1242663A (zh) * 1998-04-07 2000-01-26 无线行星公司 用于在小屏幕上显示可选和非可选单元的方法
CN1697509A (zh) * 2004-05-13 2005-11-16 索尼株式会社 用户接口控制设备、用户接口控制方法以及计算机程序
CN101018282A (zh) * 2006-02-09 2007-08-15 上海乐金广电电子有限公司 自动跳过广播接收装置中限制频道的方法
US20090132963A1 (en) * 2007-11-21 2009-05-21 General Electric Company Method and apparatus for pacs software tool customization and interaction
CN110300986A (zh) * 2017-02-15 2019-10-01 微软技术许可有限责任公司 与智能个人助理的辅助通信
CN111045557A (zh) * 2019-12-20 2020-04-21 青岛海信电器股份有限公司 焦点对象的移动方法及显示设备

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102915190B (zh) * 2011-08-03 2016-03-30 联想(北京)有限公司 一种显示处理方法、装置及电子设备
US9245020B2 (en) * 2011-12-14 2016-01-26 Microsoft Technology Licensing, Llc Collaborative media sharing
CN107092410A (zh) * 2016-02-24 2017-08-25 口碑控股有限公司 一种触摸屏的界面交互方法、设备以及智能终端设备
DK201670574A1 (en) * 2016-06-12 2018-01-02 Apple Inc Accelerated scrolling

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1242663A (zh) * 1998-04-07 2000-01-26 无线行星公司 用于在小屏幕上显示可选和非可选单元的方法
CN1697509A (zh) * 2004-05-13 2005-11-16 索尼株式会社 用户接口控制设备、用户接口控制方法以及计算机程序
CN101018282A (zh) * 2006-02-09 2007-08-15 上海乐金广电电子有限公司 自动跳过广播接收装置中限制频道的方法
US20090132963A1 (en) * 2007-11-21 2009-05-21 General Electric Company Method and apparatus for pacs software tool customization and interaction
CN110300986A (zh) * 2017-02-15 2019-10-01 微软技术许可有限责任公司 与智能个人助理的辅助通信
CN111045557A (zh) * 2019-12-20 2020-04-21 青岛海信电器股份有限公司 焦点对象的移动方法及显示设备

Also Published As

Publication number Publication date
CN111045557A (zh) 2020-04-21

Similar Documents

Publication Publication Date Title
WO2021121051A1 (zh) 一种显示方法及显示设备
CN112672195A (zh) 遥控器按键设置方法及显示设备
WO2022073392A1 (zh) 图像显示方法及显示设备
WO2022048203A1 (zh) 一种输入法控件的操控提示信息的显示方法及显示设备
CN113268199A (zh) 一种显示设备及功能项设置方法
CN114302021A (zh) 显示设备和音画同步方法
CN116612722A (zh) 一种显示设备及背光亮度调整方法
CN113794914B (zh) 显示设备及开机导航配置的方法
CN113490024A (zh) 控制装置按键设置方法及显示设备
CN113593488A (zh) 背光调整方法及显示设备
CN113301405A (zh) 一种显示设备及虚拟键盘的显示控制方法
CN113132809B (zh) 一种通道切换方法、通道节目播放方法及显示设备
CN114793298B (zh) 一种显示设备和菜单显示方法
CN113064691B (zh) 一种开机用户界面的显示方法及显示设备
CN112911371B (zh) 双路视频资源播放方法及显示设备
CN114390190B (zh) 显示设备及监测应用启动摄像头的方法
CN113703705A (zh) 显示设备及列表更新方法
CN114302070A (zh) 显示设备和音频输出方法
CN113286185A (zh) 一种显示设备及主页显示方法
CN113064534A (zh) 一种用户界面的显示方法及显示设备
CN112637683A (zh) 显示设备系统优化方法及显示设备
CN112882780A (zh) 设置页面显示方法及显示设备
CN112668546A (zh) 视频缩略图显示方法及显示设备
CN113689856B (zh) 一种浏览器页面视频播放进度的语音控制方法及显示设备
CN112835633B (zh) 显示设备及显示语言的控制方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20902818

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20902818

Country of ref document: EP

Kind code of ref document: A1