CN112929717B - Focus management method and display device - Google Patents
- Publication number: CN112929717B (application number CN201911242371.XA)
- Authority
- CN
- China
- Prior art keywords
- focus
- component object
- current
- value
- display device
- Prior art date
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
All classifications fall under H—ELECTRICITY, H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION, H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD], H04N21/40—Client devices specifically adapted for the reception of or interaction with content:
- H04N21/4122—Peripherals receiving signals from specially adapted client devices: additional display device, e.g. video projector
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/4312—Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4438—Window management, e.g. event handling following interaction with the user interface
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
The present application relates to the field of computer technologies, and in particular to a focus management method and a display device. The focus management method provided by the application comprises the following steps: the display device acquires a key value sent by a remote controller, the key value being used to control the moving direction of the focus; the object pointed to by the key value is resolved into a component object, and specifically into a reference number of the component object; the display device sends the reference number to the current page; the display device determines a second, updated position of the focus according to the current focus value at the first position of the focus on the current page and the reference number of the component object. The present application also provides a display device, comprising: a display, a focus recording module, a focus calculating module and a focus setting module. This can, to a certain extent, solve the problems of slow focus lookup and positioning, duplicated focus, and focus loss caused by an unmounted focus.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method for focus management and a display device.
Background
Focus management refers to managing, in a smart display device such as a smart TV or a set-top box, the movement and display of the focus on a page through different control and interaction means, e.g. using a remote control.
In some implementations, upon operation of the remote controller, the display device searches for the ID value of the focus planned for display by traversing, in order, the trunk, branch and leaf nodes of the document object model from the first layer down to the lowest layer; when the ID value of the focus planned for display is found, the display device sends it to the UI interface for focus setting and display, and records the state.
However, as the page complexity of the display device increases, the number of objects in the page grows. In this case, locating the focus by traversing the document object model tree nodes in order along trunks, branches and leaves takes more time, which prolongs the response time. Because display objects on multiple components may share the same ID, the focus may be duplicated. And when the page data of follow-up controls and the like is not updated in time, the focus is not yet mounted on the document object model tree, so the focus is lost.
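The three failure modes of ID-based lookup described above can be illustrated with a minimal sketch (the class and function names here are illustrative, not from the patent): a depth-first search over a DOM-like tree visits every node in the worst case, returns multiple nodes when IDs collide, and returns nothing when the target is not yet mounted.

```python
# Hypothetical sketch of locating focus by ID via depth-first traversal,
# illustrating why it is slow (O(n)), ambiguous (duplicate IDs), and
# lossy (unmounted nodes).
class Node:
    def __init__(self, node_id, children=None):
        self.node_id = node_id
        self.children = children or []

def find_all_by_id(root, target_id):
    """Depth-first search; visits every node in the worst case."""
    matches = []
    stack = [root]
    while stack:
        node = stack.pop()
        if node.node_id == target_id:
            matches.append(node)
        stack.extend(node.children)
    return matches

# Two components accidentally reuse the ID "item-1": the lookup is ambiguous.
tree = Node("root", [
    Node("list-a", [Node("item-1")]),
    Node("list-b", [Node("item-1")]),
])
print(len(find_all_by_id(tree, "item-1")))   # 2 -> duplicated focus
print(len(find_all_by_id(tree, "missing")))  # 0 -> focus not mounted, i.e. lost
```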
Disclosure of Invention
The application provides a focus management method and a display device which, by recording the focus as a component object and setting the component object as the object that holds the focus, can to a certain extent solve the problems of slow focus traversal, focus loss and focus duplication in the document object model tree.
The embodiment of the application is realized as follows:
a first aspect of an embodiment of the present application provides a display device, including:
a display configured to display a user interface, wherein the user interface comprises a plurality of view display areas, wherein each view display area comprises a layout of one or more different items, and a selector indicating that the item is selected, wherein a position of the selector in the user interface is movable by a user input to cause the different item to be selected;
a communication interface configured to acquire a key value sent by a remote controller, the key value being used to control the moving direction of the focus;
a controller coupled to the display and the communication interface and configured to: resolve the object pointed to by the key value into a component object, and specifically into a reference number of the component object, and send the reference number to the current page;
and determine a second, updated position of the focus according to the current focus value at the first position of the focus on the current page and the reference number of the component object, and switch the focus to the second position.
A second aspect of an embodiment of the present application provides a method of focus management, including:
a communication interface of the display device acquires a key value sent by a remote controller, the key value being used to control the moving direction of the focus;
the display device resolves the object pointed to by the key value into a component object, and specifically into a reference number of the component object;
the display device sends the reference number to the current page;
the display device determines a second, updated position of the focus according to the current focus value at the first position of the focus on the current page and the reference number of the component object;
the display device acquires the second position and switches the focus to the second position.
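The claimed flow can be sketched as follows. This is a hedged illustration under stated assumptions, not the patent's implementation: the key-to-direction table, the `Page` class, and the sibling-ordering of component references are all illustrative.

```python
# Illustrative sketch of the claimed method: a key value is resolved into a
# component-object reference, the reference is sent to the current page, and
# the page computes the updated (second) focus position from the current
# focus value (first position) and the reference.
KEY_TO_DIRECTION = {"KEY_LEFT": -1, "KEY_RIGHT": +1}  # key value -> move direction

class Page:
    """Current page holding component-object references and the focus value."""
    def __init__(self, component_refs, focus_ref):
        self.component_refs = component_refs  # reference numbers of component objects
        self.focus_ref = focus_ref            # current focus value (first position)

    def resolve_key(self, key_value):
        """Resolve the object the key points to into a component-object reference."""
        direction = KEY_TO_DIRECTION[key_value]
        i = self.component_refs.index(self.focus_ref)
        j = max(0, min(len(self.component_refs) - 1, i + direction))  # clamp at edges
        return self.component_refs[j]

    def switch_focus(self, key_value):
        """Determine the second position and switch the focus to it."""
        self.focus_ref = self.resolve_key(key_value)
        return self.focus_ref

page = Page(component_refs=[101, 102, 103], focus_ref=101)
print(page.switch_focus("KEY_RIGHT"))  # 102
print(page.switch_focus("KEY_RIGHT"))  # 103
```

Because each component object has exactly one reference number, the lookup is a direct indexing step rather than a tree traversal, which is the time-efficiency claim of the application.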
The beneficial effects of this application are as follows: by recording the focus as a component object, the slowness of searching for and locating the focus in the document object model tree is alleviated to a certain extent; further, the uniqueness of the component object can, to a certain extent, solve the focus loss caused by focus duplication and by an unmounted focus during focus lookup in the document object model tree; and switching the focus by the reference number of the component object through a unidirectional event stream simplifies the focus update steps.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment;
fig. 2 is a block diagram exemplarily showing a hardware configuration of a display device 200 according to an embodiment;
fig. 3 is a block diagram exemplarily showing a hardware configuration of the control apparatus 100 according to the embodiment;
fig. 4 is a diagram exemplarily showing a functional configuration of the display device 200 according to the embodiment;
fig. 5a schematically shows a software configuration in the display device 200 according to an embodiment;
fig. 5b schematically shows a configuration of an application in the display device 200 according to an embodiment;
FIG. 6 is a component object structure diagram illustrating a focus object repeat scene in a document object model tree according to an embodiment of the present application;
FIG. 7 is a diagram illustrating focus positioning by traversal of a document object model tree according to an embodiment of the present application;
FIG. 8 is a diagram illustrating a structure of a scene in which a focus object is not yet mounted in a document object model tree according to an embodiment of the present application;
FIG. 9 is a diagram illustrating a method for switching focus in a unidirectional data flow according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram illustrating a current page focus switching of a display device according to an embodiment of the present application;
FIG. 11 is a schematic diagram illustrating switching of a display device focus between sibling component objects according to an embodiment of the present application;
FIG. 12 is a schematic diagram illustrating switching of focus of a display device between parent component objects according to an embodiment of the present application;
FIG. 13 is a flow chart illustrating focus shifting provided by embodiments of the present application;
FIG. 14 shows a display device according to an embodiment of the present application;
fig. 15 is a schematic diagram illustrating a page structure of a display device according to an embodiment of the present application;
FIG. 16 is a flow chart illustrating a method for focus management according to an embodiment of the present disclosure;
fig. 17 is a flowchart illustrating a focus assignment method of a focus management method according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
All other embodiments, which can be derived by a person skilled in the art from the exemplary embodiments shown in the present application without inventive effort, shall fall within the scope of protection of the present application. Moreover, while the disclosure herein has been presented in terms of exemplary one or more examples, it is to be understood that each aspect of the disclosure can be utilized independently and separately from other aspects of the disclosure to provide a complete disclosure.
It should be understood that the terms "first," "second," "third," and the like in the description and in the claims of the present application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances and can be implemented in sequences other than those illustrated or otherwise described herein with respect to the embodiments of the application, for example.
Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module" as used herein refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Reference throughout this specification to "embodiments," "some embodiments," "one embodiment," or "an embodiment," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases "in various embodiments," "in some embodiments," "in at least one other embodiment," or "in an embodiment" or the like throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Thus, the particular features, structures, or characteristics shown or described in connection with one embodiment may be combined, in whole or in part, with the features, structures, or characteristics of one or more other embodiments, without limitation. Such modifications and variations are intended to be included within the scope of the present application.
The term "remote control" as used in this application refers to a component of an electronic device, such as the display device disclosed in this application, that can typically control that device wirelessly over a relatively short range of distances. It generally connects to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth and motion sensors. For example: a hand-held touch remote controller replaces most of the physical built-in hard keys of a common remote control device with a user interface on a touch screen.
The term "gesture" as used in this application refers to a user's behavior through a change in hand shape or an action such as hand motion to convey a desired idea, action, purpose, or result.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display device 200 through the mobile terminal 300 and the control apparatus 100.
The control apparatus 100 may be a remote controller, which controls the display device 200 wirelessly or by other wired means, including infrared protocol communication, Bluetooth protocol communication and other short-distance communication. The user may control the display device 200 by inputting user commands through keys on the remote controller, voice input, control panel input, and the like. For example: the user can input corresponding control commands through volume up/down keys, channel control keys, up/down/left/right movement keys, a voice input key, a menu key, a power on/off key, etc. on the remote controller, to implement control of the display device 200.
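The key-to-command pattern described above can be sketched as a dispatch table (a minimal illustration; the key codes and handler names are assumptions, not from the patent):

```python
# Hypothetical sketch: remote-controller key values dispatched to control
# commands on the display device, as described for volume and channel keys.
class DisplayDevice:
    def __init__(self):
        self.volume = 10
        self.channel = 1

    def handle_key(self, key_value):
        handlers = {
            "KEY_VOL_UP": lambda: setattr(self, "volume", self.volume + 1),
            "KEY_VOL_DOWN": lambda: setattr(self, "volume", max(0, self.volume - 1)),
            "KEY_CH_UP": lambda: setattr(self, "channel", self.channel + 1),
        }
        if key_value in handlers:  # real devices simply ignore unknown keys
            handlers[key_value]()

tv = DisplayDevice()
tv.handle_key("KEY_VOL_UP")
tv.handle_key("KEY_CH_UP")
print(tv.volume, tv.channel)  # 11 2
```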
In some embodiments, mobile terminals, tablets, computers, laptops, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device. The application, through configuration, may provide the user with various controls in an intuitive User Interface (UI) on a screen associated with the smart device.
For example, a software application may be installed on both the mobile terminal 300 and the display device 200 to implement connection and communication through a network communication protocol, for the purpose of one-to-one control operation and data communication. For example: a control instruction protocol can be established between the mobile terminal 300 and the display device 200, the remote control keyboard can be synchronized onto the mobile terminal 300, and the display device 200 can be controlled by operating the user interface on the mobile terminal 300. Audio and video content displayed on the mobile terminal 300 can also be transmitted to the display device 200 to implement a synchronous display function.
As also shown in fig. 1, the display apparatus 200 also performs data communication with the server 400 through various communication means. The display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display apparatus 200. Illustratively, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and through electronic program guide (EPG) interactions. The server 400 may be one group or multiple groups of servers, and may be of one or more types. Other web service contents such as video on demand and advertisement services are provided through the server 400.
The display device 200 may be a liquid crystal display, an OLED display, or a projection display device. The particular display device type, size, resolution, etc. are not limiting, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
The display apparatus 200 may additionally provide an intelligent network tv function that provides a computer support function in addition to the broadcast receiving tv function. Examples include a web tv, a smart tv, an Internet Protocol Tv (IPTV), and the like.
A hardware configuration block diagram of a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 2. As shown in fig. 2, the display device 200 includes a controller 210, a tuning demodulator 220, a communication interface 230, a detector 240, an input/output interface 250, a video processor 260-1, an audio processor 260-2, a display 280, an audio output 270, a memory 290, a power supply, and an infrared receiver.
The display 280 receives image signals from the video processor 260-1 and displays video content, images and components of the menu manipulation interface. The display 280 includes a display screen assembly for presenting pictures, and a driving assembly for driving the display of images. The displayed video content may come from broadcast television, or from various broadcast signals received via wired or wireless communication protocols. Alternatively, various image contents sent from a network server side may be received via a network communication protocol and displayed.
Meanwhile, the display 280 simultaneously displays a user manipulation UI interface generated in the display apparatus 200 and used to control the display apparatus 200.
The driving component drives the display according to the type of the display 280. Alternatively, in case the display 280 is a projection display, it may also include a projection device and a projection screen.
The communication interface 230 is a component for communicating with an external device or an external server according to various communication protocol types. For example: the communication interface 230 may be a Wifi chip 231, a bluetooth communication protocol chip 232, a wired ethernet communication protocol chip 233, or other network communication protocol chips or near field communication protocol chips, and an infrared receiver (not shown) or the like may be used to receive the remote control signal.
The display apparatus 200 may establish transmission and reception of control signals and data signals with an external control apparatus or a content providing apparatus through the communication interface 230. An infrared receiver serves as an interface device for receiving infrared control signals from the control apparatus 100 (e.g., an infrared remote controller).
The detector 240 is a component used by the display device 200 to collect signals from the external environment or to interact with the outside. The detector 240 includes a light receiver 242, a sensor for collecting the intensity of ambient light, so that display parameters can be adaptively adjusted in response to changes in ambient light.
The image acquisition device 241, such as a camera, may be used to acquire the external environment scene, collect attributes of the user or gestures used to interact with the user, adaptively change display parameters, and recognize user gestures, so as to implement interaction with the user.
In some other exemplary embodiments, the detector 240 may include a temperature sensor; for example, by sensing the ambient temperature, the display device 200 may adaptively adjust the display color temperature of the image. For example, the display apparatus 200 may be adjusted to display a cooler tone when the ambient temperature is high, or a warmer tone when the ambient temperature is low.
In other exemplary embodiments, the detector 240 may further include a sound collector, such as a microphone, used to receive the user's voice, including voice signals carrying control instructions for controlling the display device 200, or to collect ambient sounds for identifying the type of the ambient scene, so that the display device 200 can adapt to ambient noise.
The input/output interface 250, under the control of the controller 210, performs data transmission between the display device 200 and other external devices, such as receiving video and audio signals or command instructions from an external device.
The input/output interface 250 may include, but is not limited to, any one or more of the following: a high-definition multimedia interface (HDMI) 251, an analog or data high-definition component input interface 253, a composite video input interface 252, a USB input interface 254, RGB ports (not shown in the figures), etc.
In some other exemplary embodiments, the input/output interface 250 may also form a composite input/output interface with the above-mentioned plurality of interfaces.
The tuning demodulator 220 receives broadcast television signals in a wired or wireless manner, may perform processing such as amplification, frequency mixing and resonance, and demodulates, from among a plurality of wireless or wired broadcast television signals, the television audio/video signals carried on the television channel frequency selected by the user, as well as EPG data signals.
The tuner-demodulator 220 responds to the television signal frequency selected by the user and the television signal carried by that frequency, under the control of the controller 210.
The tuner-demodulator 220 may receive signals in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcast, cable broadcast, satellite broadcast, or internet broadcast signals, etc.; and according to different modulation types, the modulation mode can be digital modulation or analog modulation. Depending on the type of television signal received, both analog and digital signals are possible.
In other exemplary embodiments, the tuner/demodulator 220 may be in an external device, such as an external set-top box. In this way, the set-top box outputs television audio/video signals after modulation and demodulation, and the television audio/video signals are input into the display device 200 through the input/output interface 250.
The video processor 260-1 is configured to receive an external video signal, and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
Illustratively, the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio and video data stream; for example, an input MPEG-2 stream is demultiplexed into a video signal and an audio signal.
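Conceptually, demultiplexing splits one interleaved stream into its elementary streams. The toy sketch below uses illustrative packet tags ("V"/"A"), not real MPEG-2 syntax:

```python
# Toy sketch of demultiplexing: split an interleaved audio/video packet
# stream into separate video and audio elementary streams.
def demultiplex(packets):
    video, audio = [], []
    for kind, payload in packets:
        (video if kind == "V" else audio).append(payload)
    return video, audio

packets = [("V", "frame0"), ("A", "sample0"), ("V", "frame1")]
video, audio = demultiplex(packets)
print(video)  # ['frame0', 'frame1']
print(audio)  # ['sample0']
```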
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
The image synthesis module superimposes and mixes the GUI signal, generated by the graphics generator according to user input, with the scaled video image, to generate an image signal for display.
The frame rate conversion module is used for converting the frame rate of the input video, for example converting a 60 Hz frame rate into a 120 Hz or 240 Hz frame rate, commonly implemented by a frame interpolation method.
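The frame interpolation method mentioned above can be illustrated with a simplified sketch (real frame rate converters use motion-compensated interpolation; this toy version, which is only an illustration, inserts an averaged frame between each pair of neighbours, doubling the rate):

```python
# Simplified frame interpolation: double a 60 Hz sequence toward 120 Hz by
# inserting the per-pixel average of each pair of neighbouring frames.
def interpolate_double(frames):
    """frames: list of equal-length pixel rows (lists of numbers)."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append([(x + y) / 2 for x, y in zip(a, b)])  # inserted midpoint frame
    out.append(frames[-1])
    return out

frames_60hz = [[0, 0], [10, 20], [20, 40]]
frames_120hz = interpolate_double(frames_60hz)
print(len(frames_120hz))  # 5 (2n - 1 frames for n inputs)
print(frames_120hz[1])    # [5.0, 10.0]
```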
The display format module is used for converting the received frame-rate-converted video output signal into a signal conforming to the display format, such as an RGB data signal output.
The audio processor 260-2 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, amplification processing, and the like to obtain an audio signal that can be played in the speaker.
In other exemplary embodiments, video processor 260-1 may comprise one or more chips. The audio processor 260-2 may also comprise one or more chips.
And, in other exemplary embodiments, the video processor 260-1 and the audio processor 260-2 may be separate chips or may be integrated together with the controller 210 in one or more chips.
The audio output 270 receives the sound signal output by the audio processor 260-2 under the control of the controller 210. It includes the speaker 272 carried by the display device 200 itself, and an external sound output terminal 274 that can output to a sound-producing device of an external device, such as an external sound interface or an earphone interface.
The power supply provides power supply support for the display device 200 from the power input from the external power source under the control of the controller 210. The power supply may include a built-in power supply circuit installed inside the display apparatus 200, or may be a power supply interface installed outside the display apparatus 200 to provide an external power supply in the display apparatus 200.
A user input interface for receiving an input signal of a user and then transmitting the received user input signal to the controller 210. The user input signal may be a remote controller signal received through an infrared receiver, and various user control signals may be received through the network communication module.
For example, when the user inputs a user command through the remote controller 100 or the mobile terminal 300, the user input interface forwards the input to the controller 210, and the display device 200 responds to the user input through the controller 210.
In some embodiments, a user may enter a user command on a Graphical User Interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
The controller 210 controls the operation of the display apparatus 200 and responds to the user's operation through various software control programs stored in the memory 290.
As shown in fig. 2, the controller 210 includes a RAM213 and a ROM214, as well as a graphics processor 216, a CPU processor 212, and a communication interface 218, such as: a first interface 218-1 through an nth interface 218-n, and a communication bus. The RAM213 and the ROM214, the graphic processor 216, the CPU processor 212, and the communication interface 218 are connected via a bus.
The ROM 214 stores instructions for various system boots. When the power-on signal is received and the display apparatus 200 starts up, the CPU processor 212 executes the system boot instructions in the ROM and copies the operating system stored in the memory 290 into the RAM 213 to start running and booting the operating system. After booting of the operating system is complete, the CPU processor 212 copies the various application programs in the memory 290 into the RAM 213, and then starts running and launching the various application programs.
The graphics processor 216 generates various graphics objects, such as icons, operation menus and graphics displaying user input instructions. It includes an arithmetic unit, which performs operations by receiving the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which generates the various objects based on the arithmetic unit and displays the rendered result on the display 280.
A CPU processor 212 for executing operating system and application program instructions stored in the memory 290, and for executing various application programs, data, and content according to the various interactive instructions received from the outside, so as to finally display and play various audio and video content.
In some exemplary embodiments, the CPU processor 212 may include a plurality of processors, which may include one main processor and one or more sub-processors. The main processor performs some operations of the display apparatus 200 in a pre-power-up mode and/or displays a screen in the normal mode; the one or more sub-processors perform operations in a standby mode or the like.
The controller 210 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
The object may be any selectable object, such as a hyperlink or an icon. Operations related to the selected object include, for example, displaying the page, document, or image connected to by a hyperlink, or running the program corresponding to the icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200, or a voice command corresponding to a voice spoken by the user.
The memory 290 stores various software modules for driving the display device 200, including a basic module, a detection module, a communication module, a display control module, a browser module, various service modules, and the like.
The basic module is a bottom-layer software module for signal communication among the various hardware components in the display device 200 and for sending processing and control signals to the upper-layer modules. The detection module is used for collecting various information from the various sensors or user input interfaces, and for performing digital-to-analog conversion and analysis management.
For example, the voice recognition module includes a voice analysis module and a voice instruction database module. The display control module controls the display 280 to display image content and may be used to play information such as multimedia image content and UI interfaces. The communication module performs control and data communication with external equipment. The browser module performs data communication with browsing servers. The service module provides various services and modules, including various application programs.
Meanwhile, the memory 290 also stores received external data and user data, images of the items in various user interfaces, visual effect maps, a focus object, and the like.
A block diagram of the configuration of the control apparatus 100 according to an exemplary embodiment is exemplarily shown in fig. 3. As shown in fig. 3, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control device 100 is configured to control the display device 200 and may receive an input operation instruction of a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200. Such as: the user operates the channel up/down keys on the control device 100, and the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications that control the display apparatus 200 according to user demands.
In some embodiments, as shown in fig. 1, a mobile terminal 300 or other intelligent electronic device may serve a function similar to the control device 100 after installing an application that manipulates the display device 200. For example, by installing such an application, the user may use the various function keys or virtual buttons of the graphical user interface available on the mobile terminal 300 or other intelligent electronic device to implement the functions of the physical keys of the control device 100.
The controller 110 includes a processor 112, a RAM 113 and a ROM 114, a communication interface 218, and a communication bus. The controller 110 controls the operation of the control device 100, the communication and coordination among the internal components, and the external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communication interface 130 may include at least one of a WiFi chip, a bluetooth module, an NFC module, and other near field communication modules.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, keys 144, and other input interfaces. Such as: the user can realize a user instruction input function through actions such as voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display apparatus 200. In some embodiments, the interface may be an infrared interface or a radio frequency (RF) interface. For example, when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol, and the infrared control signal is sent to the display device 200 through the infrared sending module. As another example, when the RF signal interface is used, a user input command needs to be converted into a digital signal, modulated according to an RF control signal modulation protocol, and then transmitted to the display device 200 through the RF transmitting terminal.
In some embodiments, the control device 100 includes at least one of a communication interface 130 and an output interface. When the control device 100 is provided with a communication interface 130, such as a WiFi, Bluetooth, or NFC module, it may encode and transmit the user input command to the display device 200 through the WiFi protocol, the Bluetooth protocol, or the NFC protocol.
A memory 190 for storing various operation programs, data, and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 190 may store various control signal commands input by the user.
And a power supply 180 for providing operational power support to the various elements of the control device 100 under the control of the controller 110; it may include a battery and associated control circuitry.
Fig. 4 is a diagram schematically illustrating a functional configuration of the display device 200 according to an exemplary embodiment. As shown in fig. 4, the memory 290 is used to store an operating system, an application program, contents, user data, and the like, and performs system operations for driving the display device 200 and various operations in response to a user under the control of the controller 210. The memory 290 may include volatile and/or nonvolatile memory.
The memory 290 is specifically configured to store an operating program for driving the controller 210 in the display device 200, and to store various application programs installed in the display device 200, various application programs downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used to store system software such as an OS kernel, middleware, and applications, and to store input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers and related data such as the audio/video processors 260-1 and 260-2, the display 280, the communication interface 230, the tuning demodulator 220, the input/output interface of the detector 240, and the like.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290, for example, includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, and other applications 2912, a browser module, and the like. The controller 210 performs functions such as: a broadcast television signal reception demodulation function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal reception function, an electric power control function, a software control platform supporting various functions, a browser function, and the like.
A block diagram of the configuration of the software system in the display device 200 according to an exemplary embodiment is exemplarily shown in fig. 5 a.
As shown in fig. 5a, the operating system 2911 includes operating software for handling various basic system services and for performing hardware-related tasks, and acts as an intermediary for data processing between application programs and hardware components. In some embodiments, portions of the operating system kernel may contain a series of software to manage the display device hardware resources and provide services for other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The driver may contain code to operate video, audio and/or other multimedia components. Examples include a display screen, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application program to achieve accessibility and operability of the application program for displaying content.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
User interface modules 2911-3, which are used to provide objects for displaying user interfaces for access by various applications, enable user operability.
Control applications 2911-4 for controlling process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911 or within the application program 2912 (in some embodiments, partly within the operating system 2911 and partly within the application program 2912). It is configured to listen for various user input events and, depending on the event, to invoke handlers that perform one or more predefined operations in response to the recognition of each type of event or sub-event.
The event monitoring module 2914-1 is configured to monitor an event or a sub-event input by the user input interface.
The event identification module 2914-2 is configured with definitions of the various types of events for the various user input interfaces, identifies the various events or sub-events, and dispatches them to the processes that execute the corresponding one or more sets of handlers.
An event or sub-event refers to an input detected by one or more sensors in the display device 200 or an input from an external control device (e.g., the control device 100), such as various sub-events of voice input, gesture input through gesture recognition, and sub-events of remote-control key command input from the control equipment. Illustratively, the one or more sub-events from the remote control take a variety of forms, including but not limited to one or a combination of pressing the up/down/left/right keys, pressing the OK key, a key long-press, and the like, as well as non-physical key operations such as move, hold, and release.
The interface layout manager 2913 directly or indirectly receives the user input events or sub-events monitored by the event transmission system 2914 and updates the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, the size, position, and level of the containers, and other operations related to the layout of the interface.
As shown in fig. 5b, the application layer 2912 contains various applications that may also be executed at the display device 200. The application may include, but is not limited to, one or more applications such as: live television applications, video-on-demand applications, media center applications, application centers, gaming applications, and the like.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
A video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides a video display from some storage source. For example, the video on demand may come from a server side of the cloud storage, from a local hard disk storage containing stored video programs.
The media center application program can provide various applications for playing multimedia contents. For example, a media center application may provide services, other than live television or video on demand, through which a user may access various images or audio.
The application program center can provide and store various application programs. The application may be a game, an application, or some other application associated with a computer system or other device that may be run on the smart television. The application center may obtain these applications from different sources, store them in local storage, and then be operable on the display device 200.
The embodiment of the application can be applied to various types of display devices (including but not limited to smart televisions, set top boxes and other devices), and optionally, the devices can access various pages of the display devices by using various different controllers.
In the embodiment of the application, after the display device receives the key value of the remote controller, the display device sends the key value to the UI interface; the UI interface is a page management system. It should be noted that, specifically, the communication interface of the display device described in this embodiment receives the key value sent by the remote controller and transmits it to the controller of the display device for further processing in the focus switching process described later.
Fig. 14 is a schematic structural diagram of a UI interface provided in the embodiment of the present application. The UI interface includes a page management module, a focus recording module, and a focus setting module, and certainly, the UI interface also includes other modules, which are not illustrated one by one here. The page management module is responsible for creating, updating, switching and destroying pages; and the focus setting module is responsible for setting focus display of the focus in the page according to the position of the focus acquired from the page.
After the UI interface obtains the key value of the remote controller, the UI interface confirms the current page needing the key value by using the page management module and sends the key value to the page.
In the embodiment of the application, the page is responsible for defining and displaying the page control. Fig. 15 shows a schematic structural diagram of a page provided in an embodiment of the present application, and referring to fig. 15, in the embodiment of the present application, a focus calculation module is included in the page, and the focus calculation module may calculate a position of a focus in the page and send the calculated position of the focus to a UI interface.
The page is also responsible for defining whether a page control can obtain the focus (i.e., whether the attribute of the page control marks it as a focused page control or a non-focused page control), and describes the page control identification of the next available focus of the control in the four directions of up, down, left and right (i.e., the identification value of the next page control is specified in the page control attributes nextfocusUP, nextfocusDOWN, nextfocusLEFT, nextfocusRIGHT).
The page is also responsible for processing focus change events (a focus change event refers to a page control getting the focus (onFocus) or losing the focus (onBlur)). The page control that obtains the focus is the focused page control.
With continued reference to fig. 15, the focus calculation module is responsible for receiving the key value and converting it into a movement direction (direction) of the focus, and for calculating the updated position of the focus from the current focus position currentFocusNode and the movement direction.
It should be noted that the focus calculation module may be a JS (JavaScript) script program. This program needs to be embedded in the page for focus control.
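The behaviour described above can be sketched as a small script of the kind the focus calculation module might be. The key codes, the node field names, and the function names (keyToDirection, calcNextFocus) are illustrative assumptions, not the patent's actual identifiers.

```javascript
// Map remote-controller key values to a focus movement direction
// (key codes here are the common browser arrow-key codes, assumed for illustration).
const keyToDirection = { 38: 'up', 40: 'down', 37: 'left', 39: 'right' };

// Given the current focus node and a key value, return the identifier of the
// page control that should receive the focus next, per its nextFocus* attributes.
function calcNextFocus(currentFocusNode, keyValue) {
  const direction = keyToDirection[keyValue];
  if (!direction) return null; // the key does not move the focus
  const attr = {
    up: 'nextFocusUp',
    down: 'nextFocusDown',
    left: 'nextFocusLeft',
    right: 'nextFocusRight',
  }[direction];
  return currentFocusNode[attr] || null; // null: no control in that direction
}

// Example page control with two directional neighbours declared.
const button = { id: 'btn-play', nextFocusLeft: 'btn-back', nextFocusRight: 'btn-stop' };
```

A page embedding this script would call calcNextFocus with its currentFocusNode on every key event and move the focus to the returned identifier.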
Fig. 16 is a flowchart illustrating a method for focus management according to an embodiment of the present application. As shown in fig. 16, the method includes:
Step 1601: the display device acquires a key value sent by the remote controller; the key value is used to control the moving direction of the focus.
Step 1602: the display device parses the object pointed to by the key value into a component object, specifically into the reference number of the component object.
Step 1603: the display device sends the reference number to the current page.
Step 1604: the display device determines the updated second position of the focus, using a unidirectional data flow, according to the current focus value of the first position of the focus of the current page and the reference number of the component object.
Step 1605: the display device acquires the second position and switches the focus to the second position.
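The five steps above can be sketched as one driver function. The method names (resolveToRef, currentPage, computeNextFocus, setFocus) are hypothetical, since the patent does not specify the page management system's API at this level of detail.

```javascript
// A high-level sketch of steps 1601-1605, under assumed names.
function handleRemoteKey(displayDevice, keyValue) {
  // Step 1602: parse the object the key value points at into the reference
  // number of a component object (assigned once, when the component is created).
  const refNumber = displayDevice.resolveToRef(keyValue);
  // Step 1603: send the reference number to the current (front-most) page.
  const page = displayDevice.currentPage();
  // Step 1604: the page computes the updated (second) focus position from the
  // current focus value of the first position and the reference number.
  const secondPosition = page.computeNextFocus(page.currentFocus, refNumber);
  // Step 1605: acquire the second position and switch the focus to it.
  displayDevice.setFocus(secondPosition);
  return secondPosition;
}
```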
In step 1601, each key value of the remote controller corresponds to a moving direction of the focus. The correspondence is predetermined, and the display device can determine the moving direction corresponding to a key value from the received key value.
The user sends an operation instruction carrying a key value to the terminal using the remote controller; the remote controller, as the operating device, transmits the key value to the terminal, and the terminal passes the key value to the displayed page. The page has an attribute through which it can obtain the key value; this attribute is specified in the protocol and is not described again here.
In step 1602, the object pointed to by the key value is parsed into a component object, specifically into the reference number of the component object.
In this application, the display device records the information of the component object where the focus is located, which may specifically be the reference number of the component object, so that the display device no longer records the ID of an object that needs to be focused.
In this embodiment, the object to which the display device sets the focus can be considered to always be a component object.
The component object where the focus is located is given a reference to the component when it is created, specifically a reference number, and the reference number of the component object is unique.
Compared with the traditional approach, in which the focus lookup needs to traverse the document object model tree, the focus recording approach provided by this application does not need to traverse the whole document object model tree when searching for the component object where the focus is located: the component object is located quickly by the reference number assigned when it was created, which saves, to a certain extent, the time of traversing the document object model tree to look up the focus ID.
The component object is unique, i.e., it is given a unique number when it is created and stored.
Therefore, locating the focus by looking up the reference number of the component object where the focus is located can improve the efficiency of focus positioning.
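The lookup-by-reference-number scheme can be sketched with a registry keyed by reference number. The Map-based storage and the names (componentRegistry, createComponent, findFocusedComponent) are assumptions for illustration; the patent does not specify the storage structure.

```javascript
// Registry of all component objects, keyed by their unique reference number.
const componentRegistry = new Map();
let nextRef = 0;

// Every component object receives a unique reference number when created.
function createComponent(name) {
  const component = { ref: ++nextRef, name };
  componentRegistry.set(component.ref, component);
  return component;
}

// Locating the focused component is a direct lookup by reference number,
// instead of a traversal of the whole document object model tree.
function findFocusedComponent(ref) {
  return componentRegistry.get(ref) || null;
}
```

The design choice here mirrors the text: because the number is assigned once at creation and never reused, the lookup cannot be confused by duplicate IDs elsewhere in the tree.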
Recording the focus by means of the component object can, to a certain extent, solve the problem of duplicate IDs among the nodes of the document object model tree.
FIG. 6 illustrates the component object structure of a scene in which the focus object ID is duplicated in the document object model tree.
When the object [DIV:2-2] needs the focus, the display device can directly obtain, from the node reference of [DIV:2-2], the component object in which the object resides.
It should be noted that when the object [DIV:2-2] is not a component object, the focus assignment process can be terminated; when the object [DIV:2-2] is a component object, the display device sets the component object into the global system.
It can be seen that when the focus is recorded and found by component object, the document object model tree does not need to be traversed in the order [body], [DIV:1], [DIV:1-1], [DIV:1-2], [DIV:2], [DIV:2-1], [DIV:2-2] to find the position of [DIV:2-2]; instead, the component reference number stored in the definition at creation time is used directly for finding and positioning, namely [body], [Ref:2], [Ref:2-2], [DIV:2-2], which simplifies the complex traversal process.
When the display device operates on objects, the operations are all constrained to be within the upper-level component object, so duplicate focuses cannot occur. When finding the component object [DIV:2-2] where the focused object is located, the display device does not need to pay attention to the other component objects.
When the [DIV:2-2] component object under the [DIV:3] component object needs to be set as the focus to be displayed on the display device, the display device only needs to operate under the [DIV:3] component object, and the operation does not involve the [DIV:2-2] component object under the [DIV:2] component object. The specific operation flow is: [body], [Ref:2], [Ref:2-2], [DIV:2-2]. In this way the problem of focus duplication does not occur, and finally the display device displays the component object [DIV:2-2].
It should be noted that in a system there cannot be two equal objects that represent different elements; i.e., if two objects are equal, they necessarily represent the same element. Therefore, recording the focus by the component object fundamentally avoids the problem of finding the wrong focus.
Recording the focus by means of the component object can also, to a certain extent, solve the problem of a document object model tree node not yet being mounted during focus switching.
FIG. 8 illustrates the component object structure of a scene in which the focus object has not yet been mounted in the document object model tree.
Although there are nodes that have not been mounted, the objects have already been created, so the upper-level nodes can obtain the object references of the new nodes.
When the object holding the focus needs to be displayed, the object is fully operational, because it was given a unique reference number at creation, even though the corresponding branch has not yet been mounted in the document object model tree.
The display device can change the state of the object so that, when mounted, the object is displayed in the form the display device expects, which solves the problem of the focus being lost because the focus object was not mounted in time.
In step 1603, the display device sends the reference number to the current page. The attributes of a page include its stacking order, and a page with a higher stacking order is always displayed in front of a page with a lower stacking order. In the embodiment of the application, the page the user needs to operate with the remote controller is the front-most page, so the page with the largest stacking order among the pages displayed by the terminal, i.e., the current page, can be determined as the page where the focus is currently located.
In step 1604, the display device determines the updated second position of the focus by using a unidirectional data flow according to the current focus value of the first position of the focus of the current page and the reference number of the component object.
The focus is recorded by means of the component object; the switching of the focus is explained in detail below. To facilitate switching the focus, the display device uses a unidirectional data flow to address this issue.
Fig. 9 shows a schematic diagram of a unidirectional data flow.
A unidirectional data flow refers to a state that can be modified from only one direction. When the display device needs to modify the state of a component object, it can be considered to completely restart a modification flow. This way of modifying limits how the state can change, making the state predictable.
The data transfer between component objects is unidirectional, i.e., data is always transferred from a parent component object to a child component object. A child component object may maintain its own data internally, but it has no right to modify the data transferred to it by the parent component; this is done for better decoupling between component objects.
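The two rules just stated, read-only parent-to-child data and a single modification entry point, can be sketched as follows. The names (createParentComponent, dataForChild, dispatch) are illustrative assumptions, not the patent's terms.

```javascript
// A minimal sketch of unidirectional data flow between component objects.
function createParentComponent(initialData) {
  const state = { data: initialData };
  return {
    // The child receives a frozen snapshot; it cannot use it to mutate
    // the parent's state, which keeps the components decoupled.
    dataForChild() { return Object.freeze({ ...state }); },
    // The only way to modify state: restart a complete modification flow.
    dispatch(newData) { state.data = newData; },
    current() { return state.data; },
  };
}
```

Because every change funnels through dispatch, the state is predictable: a child can never surprise its siblings by mutating shared parent data directly.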
In the display of a page, multiple child component objects may depend on certain data of the parent component object. If the child component objects could modify the data of the parent component object, a change made by one child component object might cause changes in all the child component objects that depend on that data; therefore, the display device described in this application uses a unidirectional data flow to handle the switching of the focus.
In the document object model tree, a parent component object (which may also be regarded as a parent node) records which child component object holds the focus, and that child component object in turn records which of its own child component objects holds the focus.
Therefore, the display device can switch the focus of the page display according to the recorded information of this multilayer structure, and an initial focus can be preset for each parent component object node, further preventing focus loss.
It should be noted that presetting the initial focus data also supports a focus memory function, so the display device does not need to spend time storing and reading the previous focus position.
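The multilayer record described above can be sketched as each node carrying a currentFocus field, initialized to a preset child so the chain never dangles. The node shape and the names (makeNode, resolveFocus) are assumptions for illustration.

```javascript
// Each parent node records which child currently holds the focus,
// initialized to a preset initial focus (here: the first child).
function makeNode(id, childIds = []) {
  return {
    id,
    childIds,
    currentFocus: childIds.length > 0 ? childIds[0] : null,
  };
}

// Resolve the effective focus by walking the recorded chain down to a leaf.
function resolveFocus(startId, nodesById) {
  let node = nodesById[startId];
  while (node.currentFocus !== null) {
    node = nodesById[node.currentFocus];
  }
  return node.id;
}
```

Because each level keeps its own record, re-entering a parent restores its remembered child for free, which is the focus memory function the text mentions.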
Fig. 10 is a schematic diagram showing a structure of the current page focus switching of the display device.
Switching the focus within a page includes two scenarios: switching the focus among sibling component objects, and switching the focus between parent component objects.
Fig. 11 shows a schematic diagram of switching the focus of the display device between sibling component objects, i.e. the component objects switched by the focus are all mounted under the same parent component object.
This scenario is the most common case, as shown in fig. 11, the focus is at the first position in the current page, i.e. the child component object with the current focus value of a1, the child component object a1 is mounted under the parent component object a, a2 is the second position where the focus needs to be switched, and a2 child component object is also mounted under the parent component object a.
In this case, the display device only needs to assign the Focus value of the current page to a2 to obtain the second position of the focus.
Fig. 12 shows a schematic diagram of switching of the focus of the display device between parent component objects, i.e. the component object switched by the focus is mounted under a different parent component object.
This scenario is also very common, consisting mainly of direct switching among multiple lists in the current page of the display device. The focus is at the first position in the current page, i.e., the child component object with the current focus value of a2; a2 is mounted under the parent component object A, B1 is the second position to which the focus needs to be switched, and B1 is mounted under the parent component object B.
With continued reference to fig. 12, the display device uses a currentFocus value to record the focus under the current component object, i.e., the first position of the focus of the current page; when the focus moves out, this value can be used for focus recording.
The display device directly assigns the Focus value of the current page to the parent component object B at the second position of the focus; the recursive search mechanism of the focus then finds the currentFocus value of the parent component object B, and the Focus value is finally assigned to B1, thereby obtaining the second position of the focus.
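Both switching scenarios reduce to one routine: assign the focus to the target, then let the recursive search follow each recorded currentFocus down to a leaf. For a sibling (fig. 11) the target is already a leaf, so the assignment is direct; across parents (fig. 12) the descent turns B into B1. The node shape and the name switchFocus are assumptions.

```javascript
// Switch the page focus to the component identified by targetId.
function switchFocus(page, nodesById, targetId) {
  let node = nodesById[targetId];
  // Recursive search mechanism: descend through each recorded currentFocus.
  while (node.currentFocus) {
    node = nodesById[node.currentFocus];
  }
  page.focus = node.id; // the final assignment yields the second position
  return page.focus;
}
```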
It should be noted that the above focus assignments are performed using the reference number of the component object.
In some embodiments, the current page, specifically the focus calculation module of the current page, may determine the page control corresponding to the first position and, when a page control exists in the moving direction controlled by the key value, determine the page control whose attribute is "focused page control" as the focused page control. When a control has this attribute, the focus can be placed in that page control, and the user can perform corresponding actions on it through the focus, such as a click operation.
If the current page determines the focused page control, the position of the focused page control may be determined as the second position after the focus is updated.
It should be noted that in the embodiment of the present application, the page control refers to some visible tags in the current page, such as a button, a text input box, and the like.
It should also be noted that, in the embodiments of the present application, the UI interface of a smart television is described in detail only as an example; accessing the UI interface of a corresponding display device through other display devices, as well as the server interfaces and web interfaces interconnected with the display device, all fall within the protection scope of the present application and are not described herein again.
In some embodiments, the current page, specifically the focus calculation module of the current page, may determine that page controls corresponding to the first position exist in the moving direction indicated by the key value but that none of them carries the focused-page-control attribute; in that case, a page control carrying the attribute is searched for in the DOM (Document Object Model) of the current page, and the position of the found page control is determined as the second position after the focus is updated.
It should be noted that a page is a collection of the page controls it contains, and these page controls form a tree structure, generally referred to as a DOM tree.
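The DOM search mentioned above amounts to a traversal of that tree. The following is a minimal sketch under assumed names (`ControlNode`, `focusable`, `findFocusable`); the patent does not prescribe this API:

```typescript
// Hypothetical sketch of searching a DOM-like tree of page controls for
// one carrying the focused-page-control attribute; names are illustrative.
interface ControlNode {
  tag: string;            // e.g. "button", "input"
  focusable: boolean;     // the focused-page-control attribute
  children: ControlNode[];
}

// Depth-first search over the DOM tree for the first focusable control.
function findFocusable(node: ControlNode): ControlNode | null {
  if (node.focusable) return node;
  for (const child of node.children) {
    const hit = findFocusable(child);
    if (hit) return hit;
  }
  return null;
}

const page: ControlNode = {
  tag: "body", focusable: false, children: [
    { tag: "div", focusable: false, children: [
      { tag: "button", focusable: true, children: [] },
    ]},
  ],
};
console.log(findFocusable(page)?.tag); // "button"
```

This whole-tree walk is the slower fallback path; the component-object record described earlier exists precisely to avoid it in the common case.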
In some embodiments, the current page, specifically the focus calculation module of the current page, may determine that no page control corresponding to the first position exists in the moving direction indicated by the key value; in that case, the position of a preset page control in the first page is determined as the second position after the focus is updated. The preset page control may be any page control in the first page and is chosen according to the actual situation, which is not described herein again.
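Taken together, the embodiments above describe a three-case fallback chain for computing the second position. A minimal sketch, with all names (`PageControl`, `secondPosition`, the parameter names) assumed for illustration:

```typescript
// Hypothetical sketch of the three fallback cases for computing the
// second (post-update) focus position; the patent does not prescribe
// this API, and the names are assumptions.
interface PageControl {
  id: string;
  focusable: boolean; // the focused-page-control attribute
}

function secondPosition(
  inMoveDirection: PageControl[], // controls found in the key's direction
  domOrder: PageControl[],        // all controls of the page, DOM order
  preset: PageControl             // preset fallback control of the page
): PageControl {
  // Case 1: a focusable control exists in the moving direction.
  const hit = inMoveDirection.find(c => c.focusable);
  if (hit) return hit;
  // Case 2: controls exist in the direction but none is focusable:
  // fall back to searching the whole DOM for a focusable control.
  if (inMoveDirection.length > 0) {
    const domHit = domOrder.find(c => c.focusable);
    if (domHit) return domHit;
  }
  // Case 3: nothing in the moving direction: use the preset control.
  return preset;
}
```

In every case the function returns some control, so the focus can never be left without a position after a key press.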
In step 1605, the display device acquires the second position and switches the focus to the second position.
the above process is described below by a detailed embodiment.
As shown in fig. 13, a schematic flowchart of the focus movement provided in the embodiment of the present application includes:
step 1301: and the user operates the keys of the remote controller to trigger the remote controller to send the corresponding key values to the display equipment.
Step 1302: and the display equipment calls a standard interface and sends the key value to the UI interface.
Step 1303: and the focus recording module in the UI sends the reference number of the component object corresponding to the key value to the focus calculation module of the current page.
Step 1304: and the focus calculation module of the current page calculates a second position of the focus according to the reference number and the first position of the focus in the current page, stores the second position in the current page, and sends the second position to the focus setting module in the UI.
Fig. 17 is a flowchart illustrating a focus assignment method of a focus management method according to an embodiment of the present application.
The remote control module acquires the key value sent by the remote controller and resolves the object pointed to by the key value; it then judges whether the resolved object is a component object, and if not, the assignment process ends.
If the resolved object is a component object, the current focus value of the component object is parsed; if the current focus value is a component object, the current focus value is assigned to the current component object;
if the parsed current focus value is not a component object, the process ends.
If the current focus value is a component object and a previous focus exists, the state of the previous focus is reset, and the state of the current focus is set.
From this assignment logic it can be seen that the focus logic eliminates many operations while becoming easier to traverse, safer, and more efficient. The state of the focus is controlled uniformly by the display device, which prevents multi-focus problems from arising. The display device adds a focus state to each component object, and each component object implements its own style for each state. This achieves centralized management of the states while still meeting individual styling requirements.
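The reset-then-set discipline that prevents multi-focus problems can be sketched as follows; `FocusManager`, `FocusableComponent`, and `setFocusState` are illustrative names, not identifiers from the patent:

```typescript
// Hypothetical sketch of the single-focus state management described
// above: the display device resets the previous focus before setting
// the new one, so at most one component is ever in the focused state.
class FocusableComponent {
  focused = false;
  constructor(public ref: string) {}
  setFocusState(on: boolean) { this.focused = on; } // per-state style hook
}

class FocusManager {
  private previous: FocusableComponent | null = null;
  assign(next: FocusableComponent | null) {
    if (next === null) return;                            // not a component object: end
    if (this.previous) this.previous.setFocusState(false); // reset the previous focus
    next.setFocusState(true);                              // set the current focus
    this.previous = next;
  }
}

const m = new FocusManager();
const x = new FocusableComponent("x");
const y = new FocusableComponent("y");
m.assign(x);
m.assign(y);
console.log(x.focused, y.focused); // false true
```

Because all state changes flow through one manager, no sequence of assignments can leave two components focused at once.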
The hardware associated with implementing the above-described focus management in the embodiments of the present application will be explained below.
A display 280 configured to display a user interface, wherein the user interface comprises a plurality of view display areas, wherein each view display area comprises a layout of one or more different items, and a selector indicating that the item is selected, wherein a position of the selector in the user interface is movable by a user input to cause a different item to be selected;
the communication interface 230 is configured to obtain a key value sent by the remote controller, where the key value is used to control a moving direction of the focus.
The controller 210 resolves the object pointed to by the key value into a component object, specifically into the reference number of the component object, and sends it to the current page.
The controller then determines the updated second position of the focus in a unidirectional data flow manner, according to the current focus value of the first position of the focus of the current page and the reference number of the component object; the controller acquires the second position and switches the focus to the second position.
Optionally, the controller 210 is further configured to:
when the displayed current page is switched from the first focus to the second focus, acquiring a component object at a first position of the focus;
optionally, the controller 210 is specifically configured to:
determining the page control corresponding to the first position of the first page displayed by the terminal, and, if a page control exists in the moving direction controlled by the key value, determining the page control carrying the focused-page-control attribute as the focused page control;
and determining the position of the focused page control as the second position after the focus is updated.
Optionally, if it is determined that page controls corresponding to the first position exist in the moving direction controlled by the key value but that no page control carrying the focused-page-control attribute exists, searching for a page control carrying that attribute in the document object model DOM of the first page;
And determining the position of the searched page control as the second position after the focus is updated.
Optionally, if it is determined that no page control corresponding to the first position exists in the moving direction controlled by the key value, the position of a preset page control in the first page is determined as the second position after the focus is updated.
The beneficial effects of this application are as follows: recording the focus in the form of a component object alleviates, to a certain extent, the slowness of searching for and locating the focus in a document object model tree; further, the uniqueness of the component object solves, to a certain extent, the focus-loss problems caused by focus duplication and unmounted focuses during such a search; and performing focus switching with the reference number of the component object, by way of a unidirectional event stream, simplifies the focus update steps.
Moreover, those skilled in the art will appreciate that aspects of the present application may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufacture, or materials, or any new and useful improvement thereon. Accordingly, various aspects of the present application may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combination of hardware and software. The above hardware or software may be referred to as "data blocks," "modules," "engines," "units," "components," or "systems." Furthermore, aspects of the present application may be represented as a computer product, including computer readable program code, embodied in one or more computer readable media.
A computer storage medium may comprise a propagated data signal with computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, and the like, or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the operation of various portions of the present application may be written in any one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, or Python, a conventional procedural programming language such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, or ABAP, a dynamic programming language such as Python, Ruby, or Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any network format, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service, such as software as a service (SaaS).
Additionally, the order in which elements and sequences of the processes described herein are processed, the use of alphanumeric characters, or the use of other designations, is not intended to limit the order of the processes and methods described herein, unless explicitly claimed. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to imply that more features are required than are expressly recited in the claims. Indeed, the embodiments may be characterized as having less than all of the features of a single embodiment disclosed above.
Each patent, patent application publication, and other material, such as articles, books, specifications, publications, and documents, cited in this application is hereby incorporated by reference in its entirety, except for any application history that is inconsistent with or in conflict with the present disclosure, and except for any document that would limit the broadest scope of the claims of this application (whether presently appended or later appended). It is noted that if the descriptions, definitions, and/or use of terms in the materials accompanying this application are inconsistent with or contrary to those of the present application, the descriptions, definitions, and/or use of terms of the present application shall control.
Claims (10)
1. A display device, comprising:
a display configured to display a user interface, wherein the user interface comprises a plurality of view display areas, wherein each view display area comprises a layout of one or more different items, and a selector indicating that the item is selected, wherein a position of the selector in the user interface is movable by a user input to cause the different item to be selected;
The communication interface is used for acquiring a key value sent by the remote controller, and the key value is used for controlling the moving direction of the focus;
a controller coupled to the display and the communication interface and configured to:
analyzing the object pointed by the key value into a component object;
judging whether the analyzed object is a component object;
if the analyzed object is not the component object, ending the assignment process;
if the parsed object is a component object, parsing a current focus value of the component object,
judging whether the analyzed current focus value is a component object;
if the current focus value is a component object, assigning the current focus value to the current component object;
if the current focus value is not a component object, ending the assignment process;
specifically, the object pointed by the key value is analyzed into a reference number of the component object and sent to the current page;
and determining a second position after the focus is updated according to the current focus value of the first position of the focus of the current page and the reference number of the component object, and switching the focus to the second position.
2. The display device of claim 1, wherein the component object is unique.
3. The display device of claim 1, wherein the reference number of the component object is established by its parent component object at the time of creation of the component object.
4. The display device of claim 1, wherein determining the updated second position of the focus from the current focus value of the first position of the focus of the current page and the reference number of the component object comprises:
and when the component object at the second position and the component object at the first position are mounted under the same parent component object, the display equipment assigns the Focus value of the current page as the reference number of the component object at the second position to obtain the second position of the Focus.
5. The display device of claim 1, wherein determining the updated second position of the focus from the current focus value of the first position of the focus of the current page and the reference number of the component object comprises:
when the component object at the second position and the component object at the first position are mounted under different parent component objects, the display equipment assigns the Focus value of the current page as the reference number of the parent component object at the second position of the Focus;
The display device finds the currentFocus, i.e. the current focus value, of the parent component object through a recursive search mechanism of the Focus;
and finally, the display equipment assigns the value of the Focus of the page to the reference number of the component object to obtain a second position of the Focus.
6. A method of focus management, comprising:
the method comprises the steps that a display equipment communication interface obtains a key value sent by a remote controller, wherein the key value is used for controlling the moving direction of a focus;
resolving the object pointed by the key value into a component object;
judging whether the analyzed object is a component object;
if the analyzed object is not the component object, ending the assignment process;
if the parsed object is a component object, parsing a current focus value of the component object,
judging whether the analyzed current focus value is a component object;
if the current focus value is a component object, assigning the current focus value to the current component object;
if the current focus value is not a component object, ending the assignment process;
specifically, the object pointed by the key value is analyzed as the reference number of the component object;
the display equipment sends the reference number to a current page;
The display equipment determines a second position after the focus is updated according to a current focus value of a first position of the focus of a current page and a reference number of the component object;
the display device acquires the second position and switches a focus to the second position.
7. The method of focus management of claim 6, wherein said component object is unique.
8. A method for focus management as recited in claim 6, wherein the reference number of the component object is established by its parent component object at the time the component object is created.
9. The method of claim 6, wherein the determining, by the display device, the updated second position of the focus according to the current focus value of the first position of the focus of the current page and the reference number of the component object, comprises:
and when the component object at the second position and the component object at the first position are mounted under the same parent component object, the display equipment assigns the Focus value of the current page as the reference number of the component object at the second position to obtain the second position of the Focus.
10. The method of claim 6, wherein the determining, by the display device, the updated second position of the focus according to the current focus value of the first position of the focus of the current page and the reference number of the component object, comprises:
when the component object at the second position and the component object at the first position are mounted under different parent component objects,
the display equipment assigns the Focus value of the current page as the reference number of the parent component object at the second position of the Focus;
the display device finds the currentFocus, i.e. the current focus value, of the parent component object through a recursive search mechanism of the Focus;
and finally, the display equipment assigns the value of the Focus of the page to the reference number of the component object to obtain a second position of the Focus.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911242371.XA CN112929717B (en) | 2019-12-06 | 2019-12-06 | Focus management method and display device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911242371.XA CN112929717B (en) | 2019-12-06 | 2019-12-06 | Focus management method and display device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112929717A CN112929717A (en) | 2021-06-08 |
CN112929717B true CN112929717B (en) | 2022-07-29 |
Family
ID=76161609
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911242371.XA Active CN112929717B (en) | 2019-12-06 | 2019-12-06 | Focus management method and display device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112929717B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113703625A (en) * | 2021-07-30 | 2021-11-26 | 青岛海尔科技有限公司 | Method, apparatus, storage medium, and electronic apparatus for controlling focus movement |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005029460A1 (en) * | 2003-08-21 | 2005-03-31 | Microsoft Corporation | Focus management using in-air points |
GB0613197D0 (en) * | 2006-07-01 | 2006-08-09 | Ibm | An improved method and system for finding the focus of a document |
JP2012128662A (en) * | 2010-12-15 | 2012-07-05 | Samsung Electronics Co Ltd | Display control device, program and display control method |
CN102984569A (en) * | 2012-11-29 | 2013-03-20 | 中兴通讯股份有限公司 | Method, device and system for controlling television interface focus |
US9285964B2 (en) * | 2013-06-18 | 2016-03-15 | Google Inc. | Automatically recovering and maintaining focus |
CN103428550A (en) * | 2013-08-09 | 2013-12-04 | 华为终端有限公司 | Object selecting method and terminal |
US11023099B2 (en) * | 2014-12-16 | 2021-06-01 | Micro Focus Llc | Identification of a set of objects based on a focal object |
CN104808920A (en) * | 2015-04-30 | 2015-07-29 | 青岛海信电器股份有限公司 | Focal point control method and focal point control device |
US10353550B2 (en) * | 2016-06-11 | 2019-07-16 | Apple Inc. | Device, method, and graphical user interface for media playback in an accessibility mode |
CN106708371B (en) * | 2017-01-17 | 2020-04-07 | 深圳创维数字技术有限公司 | Method and system for realizing focus control of browser |
WO2018159864A1 (en) * | 2017-02-28 | 2018-09-07 | 엘지전자 주식회사 | Mobile terminal and control method for mobile terminal |
CN107341016B (en) * | 2017-06-30 | 2020-09-04 | 百度在线网络技术(北京)有限公司 | Focus state implementation method and device under split screen mechanism, terminal and storage medium |
CN109309874B (en) * | 2018-08-31 | 2021-05-11 | 海信视像科技股份有限公司 | Focus updating method and device |
2019-12-06 CN CN201911242371.XA patent/CN112929717B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN112929717A (en) | 2021-06-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110737840B (en) | Voice control method and display device | |
CN109618206B (en) | Method and display device for presenting user interface | |
CN111314789B (en) | Display device and channel positioning method | |
CN111405318B (en) | Video display method and device and computer storage medium | |
CN112969084A (en) | User interface display method, storage medium and display device | |
CN110659010A (en) | Picture-in-picture display method and display equipment | |
CN111897478A (en) | Page display method and display equipment | |
CN112399212A (en) | Display device, file sharing method and server | |
CN111479145A (en) | Display device and television program pushing method | |
CN111866568B (en) | Display device, server and video collection acquisition method based on voice | |
CN111787376B (en) | Display device, server and video recommendation method | |
CN110675872A (en) | Voice interaction method based on multi-system display equipment and multi-system display equipment | |
CN114079829A (en) | Display device and generation method of video collection file watermark | |
CN112463269A (en) | User interface display method and display equipment | |
CN112653910B (en) | Display device, server and control method for television program recommendation | |
CN112473121B (en) | Display device and avoidance ball display method based on limb identification | |
CN112380420A (en) | Searching method and display device | |
CN112165641A (en) | Display device | |
CN111045557A (en) | Moving method of focus object and display device | |
CN111050207A (en) | Television channel switching method and television | |
CN112162809B (en) | Display device and user collection display method | |
CN112929717B (en) | Focus management method and display device | |
CN113542899A (en) | Information display method, display device and server | |
CN111586463A (en) | Display device | |
CN110719514A (en) | Equipment control method and system and terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right |
Effective date of registration: 20221026 Address after: 83 Intekte Street, Devon, Netherlands Patentee after: VIDAA (Netherlands) International Holdings Ltd. Address before: 266061 room 131, 248 Hong Kong East Road, Laoshan District, Qingdao City, Shandong Province Patentee before: QINGDAO HISENSE MEDIA NETWORKS Ltd. |