CN112584211A - Display device - Google Patents

Display device

Info

Publication number
CN112584211A
Authority
CN
China
Prior art keywords
abscissa
control
display
ordinate
level
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011396065.4A
Other languages
Chinese (zh)
Other versions
CN112584211B (en)
Inventor
郝云英
满丽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vidaa Netherlands International Holdings BV
Vidaa USA Inc
Original Assignee
Vidaa Netherlands International Holdings BV
Vidaa USA Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vidaa Netherlands International Holdings BV, Vidaa USA Inc filed Critical Vidaa Netherlands International Holdings BV
Priority to CN202011396065.4A priority Critical patent/CN112584211B/en
Publication of CN112584211A publication Critical patent/CN112584211A/en
Priority to PCT/US2021/061652 priority patent/WO2022120079A1/en
Priority to EP21901480.0A priority patent/EP4256796A1/en
Application granted granted Critical
Publication of CN112584211B publication Critical patent/CN112584211B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42203Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] sound input device, e.g. microphone
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The display device shown in the embodiments of the present application includes a display and a controller, wherein the controller is configured to: read a first position in response to an instruction to move the focus frame, the first position being the position of the moved focus frame; generate a second position according to the first position, the second position being associated with the first position; and control the display to display prompt information for a first control at the second position, the first control being the control on which the moved focus frame rests. Thus, in the display device provided by the embodiments of the present application, the controller can generate the second position from the first position and control the display to present the prompt information at the second position, so that the user can learn the function of the control in more detail from the prompt information; moreover, the prompt information is always displayed around the first control, which improves the user experience.

Description

Display device
Technical Field
The application relates to the technical field of social television, in particular to a display device.
Background
The display device can provide functions such as playing audio, video, and pictures to a user, and has therefore attracted wide attention from users. To improve the user experience, display devices are given additional functions to meet the needs of different users. Typically, each function has a corresponding control, and the user completes the setting of a function by touching the corresponding control. For example, the display device is provided with a brightness adjustment function, and a brightness adjustment control is correspondingly arranged in the function menu of the display device; the user can adjust the brightness of the display device by adjusting the brightness adjustment control.
Typically, a display device is configured with hundreds or even thousands of functions, and correspondingly hundreds or thousands of controls may be exposed on the pages of the display device. To ensure that related controls can be displayed on one page, the area of each control must be limited, which limits the content that can be displayed on a control; as a result, the user cannot clearly see the function corresponding to each control, and the user experience is poor.
Disclosure of Invention
In order to solve the technical problems in the prior art, embodiments of the present application provide a display device.
In order to solve the problems existing in the prior art, an embodiment of the present application provides a display device, including:
a display;
a controller configured to perform:
reading a first position in response to a movement instruction of the focus frame, wherein the first position is the position of the moved focus frame;
generating a second position according to the first position, the second position being associated with the first position;
and controlling the display to display prompt information of a first control at the second position, wherein the first control is a control corresponding to the moved focus frame.
The display device shown in the embodiments of the present application includes a display and a controller, wherein the controller is configured to: read a first position in response to an instruction to move the focus frame, the first position being the position of the moved focus frame; generate a second position according to the first position, the second position being associated with the first position; and control the display to display prompt information for a first control at the second position, the first control being the control on which the moved focus frame rests. Thus, in the display device provided by the embodiments of the present application, the controller can generate the second position from the first position and control the display to present the prompt information at the second position, so that the user can learn the function of the control in more detail from the prompt information; moreover, the prompt information is always displayed around the first control, which improves the user experience.
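The three controller steps summarized above (read the first position, derive the second position, present the prompt at it) can be sketched as follows. This is a minimal illustrative model, not the patented implementation: the function names, the frame dictionary, and the below-the-frame offset rule are all assumptions for demonstration.

```python
# Minimal sketch of the controller logic summarized above. All names and the
# offset rule are illustrative assumptions, not the patented implementation.

def read_first_position(focus_frame):
    """Return (x, y, width, height) of the moved focus frame."""
    return focus_frame["x"], focus_frame["y"], focus_frame["w"], focus_frame["h"]

def generate_second_position(first_position, gap=10):
    """Derive the prompt position from the first position.

    Here the prompt is simply anchored just below the focus frame, so that
    it always appears around the first control as the frame moves.
    """
    x, y, w, h = first_position
    return (x, y + h + gap)

def on_focus_moved(focus_frame, prompt_text):
    """Handle a focus frame movement instruction end to end."""
    first = read_first_position(focus_frame)
    second = generate_second_position(first)
    # On a real device this would drive the display; here we just return
    # what would be drawn and where.
    return {"text": prompt_text, "position": second}

frame = {"x": 100, "y": 40, "w": 200, "h": 30}
print(on_focus_moved(frame, "Connect to a wireless network"))
```

Because the second position is recomputed from the first position on every move, the prompt stays adjacent to whichever control currently holds the focus frame.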
Drawings
In order to more clearly illustrate the embodiments of the present application or the implementations in the related art, the drawings required for describing the embodiments or the related art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 illustrates a usage scenario of a display device according to some embodiments;
fig. 2 illustrates a hardware configuration block diagram of the control apparatus 100 according to some embodiments;
fig. 3 illustrates a hardware configuration block diagram of the display apparatus 200 according to some embodiments;
FIG. 4 is a flow diagram illustrating interaction of a display device with a user provided in accordance with some embodiments;
FIG. 5 is a schematic diagram illustrating a presentation interface of a display when the display presents a menu list, in accordance with one possible embodiment;
FIG. 6 is a schematic diagram of a presentation interface of a display shown in accordance with a possible embodiment;
FIG. 7 is a flow chart illustrating a first position reading out manner according to one possible embodiment;
FIG. 8 is a schematic diagram of a presentation interface of a display shown in accordance with a possible embodiment;
FIG. 9 is a flow chart illustrating a second location generation approach in accordance with a possible embodiment;
FIG. 10A is a schematic diagram illustrating a variation of a presentation interface of a display in accordance with one possible embodiment;
FIG. 10B is a schematic diagram illustrating a variation of a presentation interface of a display according to one possible embodiment;
FIG. 11 is a presentation interface of the display showing the current language as the second language in accordance with one possible embodiment;
FIG. 12 is a flow chart illustrating a second location generation approach in accordance with a possible embodiment;
FIG. 13A is a schematic diagram illustrating a variation of a presentation interface of a display in accordance with one possible embodiment;
FIG. 13B is a schematic diagram illustrating a variation of a presentation interface of a display in accordance with one possible embodiment;
FIG. 14 is a flow chart of a second position generation method according to a possible embodiment;
FIG. 15 is a schematic diagram of a display presentation interface shown in accordance with one possible embodiment;
FIG. 16 is a flow chart illustrating a method of presenting toasts, in accordance with one possible embodiment;
FIG. 17 is a schematic diagram of a display presentation interface according to one possible embodiment.
Detailed Description
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in fig. 1, the display apparatus 200 is also in data communication with a server 400, and a user can operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes at least one of infrared protocol communication, Bluetooth protocol communication, and other short-distance communication methods, and the remote controller controls the display device 200 in a wireless or wired manner. The user may control the display device 200 by inputting user instructions through at least one of keys on the remote controller, voice input, control panel input, and the like.
In some embodiments, the smart device 300 may include any of a mobile terminal, a tablet, a computer, a laptop, an AR/VR device, and the like.
In some embodiments, the smart device 300 may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device.
In some embodiments, the smart device 300 and the display device may also be used for communication of data.
In some embodiments, the display device 200 may also be controlled in manners other than through the control apparatus 100 and the smart device 300. For example, the user's voice instruction may be received directly by a voice-acquisition module configured inside the display device 200, or by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers.
In some embodiments, software steps executed by one step execution agent may be migrated on demand to another step execution agent in data communication therewith for execution. Illustratively, software steps performed by the server may be migrated to be performed on a display device in data communication therewith, and vice versa, as desired.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200.
In some embodiments, the communication interface 130 is used for external communication, and includes at least one of a WIFI chip, a bluetooth module, NFC, or an alternative module.
In some embodiments, the user input/output interface 140 includes at least one of a microphone, a touchpad, a sensor, a key, or an alternative module.
Fig. 3 shows a hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment.
In some embodiments, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, a user interface.
In some embodiments the controller comprises a central processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, a first interface to an nth interface for input/output.
In some embodiments, the display 260 includes a display screen component for presenting pictures and a driving component for driving image display; it receives image signals output from the controller and displays video content, image content, menu manipulation interfaces, user manipulation UI interfaces, and the like.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the tuner demodulator 210 receives broadcast television signals via wired or wireless reception, and demodulates audio/video signals and data signals, such as EPG data, from a plurality of wireless or wired broadcast television signals.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in separate devices; that is, the tuner demodulator 210 may be located in a device external to the main device containing the controller 250, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink, an icon, or other actionable control. The operations related to the selected object are: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon.
In some embodiments the controller comprises at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a RAM Random Access Memory (RAM), a ROM (Read-Only Memory), a first to nth interface for input/output, a communication Bus (Bus), and the like.
A CPU processor is used for executing operating system and application program instructions stored in the memory, and for executing various applications, data, and content according to interactive instructions received from external input, so as to finally display and play various audio and video content. The CPU processor may include a plurality of processors, e.g., a main processor and one or more sub-processors.
In some embodiments, a graphics processor is used for generating various graphics objects, such as at least one of icons, operation menus, and graphics displayed for user input instructions. The graphics processor comprises an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays the resulting objects according to their display attributes, and a renderer, which renders the objects obtained from the arithmetic unit for display on the display.
In some embodiments, the video processor is configured to receive an external video signal and, according to the standard codec protocol of the input signal, perform at least one kind of video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, the video processor includes at least one of a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, and a display formatting module. The demultiplexing module demultiplexes the input audio/video data stream. The video decoding module processes the demultiplexed video signal, including decoding and scaling. The image synthesis module superimposes and mixes the GUI signal, input or generated by the user, with the scaled video image from the graphics generator, to produce an image signal for display. The frame rate conversion module converts the frame rate of the input video. The display formatting module converts the frame-rate-converted video output signal into a signal conforming to the display format, such as an output RGB data signal.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform at least one of noise reduction, digital-to-analog conversion, and amplification processing to obtain a sound signal that can be played in the speaker.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on display 260, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include at least one of an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc. visual interface elements.
In some embodiments, user interface 280 is an interface that may be used to receive control inputs (e.g., physical buttons on the body of the display device, or the like).
In some embodiments, a system of a display device may include a Kernel (Kernel), a command parser (shell), a file system, and an application program. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, a scheduler, signals and interprocess communication (IPC) are operated and maintained. And after the kernel is started, loading the Shell and the user application program. The application program is compiled into machine code after being started, and a process is formed.
Typically, a display device is configured with hundreds or even thousands of functions, and correspondingly hundreds or thousands of controls may be exposed on the pages of the display device. To ensure that related controls can be displayed on one page, the area of each control must be limited, which limits the content that can be displayed on a control; as a result, the user cannot clearly see the function corresponding to each control, and the user experience is poor.
In order to solve the problems in the prior art, an embodiment of the present application illustrates a display device, where the display device at least includes: a display and a controller. A flow chart of the interaction of the display device with the user can be seen in fig. 4.
The display is configured to perform step S101 to present a menu list.
Fig. 5 is a presentation interface of a display showing when the display presents a menu list according to a possible embodiment. The menu list 1 may be presented in the form of a floating window, and the menu list includes at least one control 11.
The display device mainly adopts the Android UI system, which takes controls as the basic display unit. Each control has a focus attribute, and the focus state of a control has only two values: having focus or not having focus. By default, a control that has focus does not change in appearance; to indicate to the user which control currently has focus, the focus must be shown by a focus frame, so that when focus shifts, the focus frame appears on the control that newly acquires focus.
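The focus behaviour described above can be modelled in a few lines. This is an illustrative sketch (class and control names are assumed, and the real device uses Android's view focus mechanism rather than this model): at most one control holds focus, and the focus frame belongs to whichever control currently holds it.

```python
# Illustrative model of the focus behaviour described above: each control has
# a boolean focus attribute, at most one control holds focus at a time, and
# the focus frame is drawn on whichever control currently holds it.
# Names are assumed; a real device uses the Android view focus mechanism.

class Control:
    def __init__(self, name):
        self.name = name
        self.focused = False  # the control's focus attribute

class FocusManager:
    def __init__(self, controls):
        self.controls = controls

    def move_focus(self, name):
        # Clear focus everywhere, then grant it to the named control; the
        # focus frame thus "appears on the control that newly acquires focus".
        for c in self.controls:
            c.focused = (c.name == name)

    def focus_frame_owner(self):
        """Name of the control the focus frame is currently drawn on."""
        return next((c.name for c in self.controls if c.focused), None)

controls = [Control("web"), Control("picture"), Control("sound")]
fm = FocusManager(controls)
fm.move_focus("web")
print(fm.focus_frame_owner())   # web
fm.move_focus("sound")
print(fm.focus_frame_owner())   # sound
```

The invariant that exactly one control is focused is what makes the focus frame an unambiguous indicator for the user.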
For example, when the web control obtains the focus, the display shows the interface 11 in fig. 5; as can be seen from the interface 11, the focus frame appears on the web control.
Typically, there is a "parent-child" relationship between controls. In practical applications, controls can be divided by this relationship into primary controls, secondary controls, tertiary controls, and so on. A primary control is a control set in the settings menu; it has no parent (superior) control, and the user can typically invoke it directly. A secondary control is a child (subordinate) control of a primary control; the user calls a secondary control through its primary control. Specifically, when the user clicks a primary control, the display shows the menu list corresponding to that primary control, which contains its child controls. For example, when the user clicks the web control (a primary control), the controller controls the display to show the menu list of the web control, shown as the interface 12 in fig. 5; the menu list of the web control includes a network connection control, an internet control, a wired connection control, and a wireless connection control. A tertiary control is a child (subordinate) control of a secondary control; the user first calls the corresponding secondary control through the primary control, then calls the tertiary control through the secondary control. For example, when the user clicks the web control (primary control) and then the network connection control (secondary control), the controller controls the display to show the menu list of the network connection control, shown as the interface 13 in fig. 5; the menu list of the network connection control includes a connection test control and a network test control.
In this embodiment, the number of control levels included in the display device is not limited; controls of the corresponding levels may be set as required in practice, and no further limitation is made here.
The user executes step S102 to output a focus frame moving instruction as required.
The output form of the focus frame movement instruction is not limited in this embodiment. For example, in one feasible embodiment, the user may output the instruction via a remote control. For another example, in a feasible embodiment, voice assistant software may be installed on the controller, and the user may output the focus frame movement instruction directly by voice. In practice, the output manner of the focus frame movement instruction may be, but is not limited to, the above two manners.
In response to the focus frame movement instruction, the controller is configured to execute step S103 to read a first position, the first position being the position of the moved focus frame in the interface presented on the display;
the first position can be represented in various ways. For example, in one feasible embodiment, the first position may be the distances between the four borders (top, left, bottom, and right) of the focus frame and the corresponding borders of the display area; referring to the interface 21 in fig. 6, the first position may be represented as (a1, a2, a3, a4), where a1 is the distance between the top of the focus frame and the top of the display area, a2 (not shown in the figure) is the distance between the left border of the focus frame and the left border of the display area, a3 is the distance between the bottom of the focus frame and the bottom of the display area, and a4 is the distance between the right border of the focus frame and the right border of the display area. For another example, in a feasible embodiment, the first position may be the coordinates of the four vertices (upper-left, lower-left, lower-right, and upper-right) of the focus frame. The coordinate system is not limited in this embodiment; in a feasible embodiment, any vertex of the display may be used as the origin. For example, the interface 22 in fig. 6 constructs a rectangular coordinate system with the lower-left corner of the display as the origin, and the coordinates of the four vertices of the focus frame in this coordinate system are b1(X1, Y1), b2(X2, Y2), b3(X3, Y3), and b4(X4, Y4). In practice, the first position may be represented in other ways according to the requirements of subsequent calculations, which is not limited here.
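The two representations above can be sketched as follows. This is a hypothetical illustration, not code from the patent: the field names, `FocusFrame` class, and display resolution are assumptions; the coordinate origin is the display's lower-left corner, as in the interface 22 example.

```python
# Hypothetical sketch of the two first-position representations described
# above: border distances (a1..a4) and vertex coordinates (b1..b4). Names
# and the display resolution are illustrative assumptions.

from dataclasses import dataclass

DISPLAY_W, DISPLAY_H = 1920, 1080  # assumed display resolution

@dataclass
class FocusFrame:
    left: float
    bottom: float
    width: float
    height: float

    def border_distances(self):
        """(a1, a2, a3, a4): gaps from the frame's top, left, bottom and
        right edges to the matching edges of the display area."""
        a1 = DISPLAY_H - (self.bottom + self.height)  # top gap
        a2 = self.left                                # left gap
        a3 = self.bottom                              # bottom gap
        a4 = DISPLAY_W - (self.left + self.width)     # right gap
        return a1, a2, a3, a4

    def vertices(self):
        """b1..b4: upper-left, lower-left, lower-right, upper-right, in a
        coordinate system with origin at the display's lower-left corner."""
        l, b = self.left, self.bottom
        r, t = l + self.width, b + self.height
        return (l, t), (l, b), (r, b), (r, t)

frame = FocusFrame(left=100, bottom=500, width=300, height=80)
print(frame.border_distances())  # (500, 100, 500, 1520)
```

Either representation carries the same information, so whichever form suits the subsequent second-position calculation can be chosen.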
In practice, to reduce the data processing load on the controller, the embodiment of the application restricts certain application scenarios in which the controller does not read the first position, thereby reducing the controller's data processing load and improving the operating speed of the display device as a whole.
In the embodiment of the present application, a method for reading the first position is provided; specifically, refer to fig. 7. Fig. 7 is a flowchart of a first position reading manner according to a feasible embodiment, where the controller is configured to execute steps S11 to S162.
S11 reading a first ID, wherein the first ID is the control ID of the first control;
in this application, each control is provided with configuration information, which may include, but is not limited to, the attributes of the control, the control ID, and the prompt information of the control; the configuration information is stored in an information list. The hierarchy level of the control can be determined from the control ID.
S12 determining a first level according to the first ID, the first level being a level of a first control;
in the technical solution provided by this embodiment, the purpose of reading the first position is to use it to calculate the display position of the prompt information (also referred to in this embodiment as the second position). However, no prompt information is configured for primary controls. Therefore, when the first control is a primary control, continuing to read the first position would waste the controller's computing resources.
S13, determining whether the first level is level one; if the first level is level one, the first position is not read.
There are various ways to determine whether the first level is level one. For example, in some feasible embodiments, the IDs of controls at different levels can be set to different lengths, so the level of a control can be determined from the number of characters in its ID. For example, in one feasible embodiment, the control ID of a primary control contains A characters, the control ID of a secondary control contains B characters, the control ID of a tertiary control contains C characters, and so on; the controller can then determine the first level from the number of characters in the control ID. For another example, in some feasible embodiments, different identifiers may be set in the IDs of controls at different levels, so whether a control is a primary control can be determined from the identifier contained in its ID. For example, in a feasible embodiment, the control ID of a primary control contains identifier A, the control ID of a secondary control contains identifier B, the control ID of a tertiary control contains identifier C, and so on; the controller can then determine whether the control is a primary control from the identifier contained in the control ID.
In practice, the manner of determining whether the first level is level one may be, but is not limited to, the above two.
It is worth noting that, in practice, manners other than the control ID may be used to determine the level of the first control, which are not described here again; all schemes in which the first position is not read after the first control is determined to be a primary control fall within the protection scope of the present application.
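Both ID-based checks can be sketched briefly. The ID formats below (the character counts and the identifier letters) are illustrative assumptions, not values from the patent:

```python
# Illustrative sketch of the two ID-based ways described above to
# determine a control's level: by the number of characters in the ID, or
# by an embedded level identifier. ID formats are assumptions.

LEVEL_BY_LENGTH = {4: 1, 5: 2, 6: 3}            # assumed counts A, B, C
LEVEL_BY_IDENTIFIER = {"A": 1, "B": 2, "C": 3}  # assumed identifiers

def level_from_length(control_id):
    return LEVEL_BY_LENGTH[len(control_id)]

def level_from_identifier(control_id):
    return LEVEL_BY_IDENTIFIER[control_id[0]]

def should_read_first_position(control_id):
    # Step S13: primary controls carry no prompt information, so the
    # first position is not read for them.
    return level_from_identifier(control_id) != 1

print(should_read_first_position("A001"))   # False: primary control
print(should_read_first_position("B0102"))  # True: secondary control
```

Either mapping works as long as the ID scheme is applied consistently when controls are configured.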
If the first level is not level one, step S14 is executed to determine whether the prompt information can be read from the target information list, where the target information list is the information list of the first control; if the prompt information cannot be read, the first position is not read;
according to the technical solution provided by this embodiment, the purpose of reading the first position is to subsequently calculate the display position of the prompt information. However, in an application scenario where the information list of the first control does not contain prompt information, there is no need for the controller to read the first position. Based on the scheme shown in this embodiment, the controller first determines whether the prompt information can be read from the target information list and then decides, according to the result, whether to read the first position, thereby reducing the controller's data processing load.
There are various ways to determine whether the prompt information can be read from the target information list. For example, in some feasible embodiments, the prompt information may be stored at a fixed location in the target information list; when the controller determines that the first control is not a primary control, it reads data directly from that fixed location, and it determines whether the prompt information can be read by whether data is found there. For example, in a feasible embodiment, the fixed location may be the storage positions corresponding to the Nth through Mth characters of the target information list. When configuring the target information list, a designer may write the prompt information of the first control into the storage positions of the Nth through Mth characters; when the controller determines that the first control is not a primary control, it reads directly from those positions. If data is read, prompt information is recorded in the target information list; if no data is read, no prompt information is recorded. For another example, in some feasible embodiments, a prompt information identifier may be configured, and the controller may determine whether the target information list contains prompt information according to whether the identifier is read. For example, in a feasible embodiment, the prompt information identifier is identifier A; when the controller determines that the first control is not a primary control, it searches the target information list for identifier A. If identifier A is found, prompt information is recorded in the target information list; if identifier A is not found, no prompt information is recorded.
In practice, the manner of determining whether the prompt information can be read from the target information list may be, but is not limited to, the above two.
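The two look-ups above can be sketched as follows. The list layout (a fixed slot of characters N..M reserved for the prompt) and the marker string are illustrative assumptions:

```python
# Minimal sketch of the two look-ups described above for deciding whether
# the target information list records prompt information. The fixed-slot
# layout and the identifier marker "A:" are illustrative assumptions.

def hint_from_fixed_slot(info_list, n, m):
    # Fixed-position variant: read characters N..M directly.
    data = info_list[n:m].strip()
    return data if data else None  # nothing read -> no prompt recorded

def has_hint_identifier(info_list, marker="A:"):
    # Identifier variant: search the list for the prompt marker.
    return marker in info_list

target = "B0102   Connect to a network  "
print(hint_from_fixed_slot(target, 8, 30))              # "Connect to a network"
print(hint_from_fixed_slot("B0103" + " " * 25, 8, 30))  # None
```

In both variants the controller can decide whether to read the first position before doing any position arithmetic, which is the data-saving point made above.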
When the user invokes controls at different levels, the focus frame needs to move between those levels. For example, referring to fig. 5, in the initial state the focus frame is on the network control (a primary control); when the user wants to invoke the network connection control (a secondary control), the user needs to click the network control, and while the focus frame jumps from the network control to the network connection control, the display content jumps from the system menu to the network menu. This jump of the display content from the system menu to the network menu may be referred to in this embodiment as a hierarchy change animation. During this process, drawing the network menu on the UI takes a certain amount of time, so the controller cannot read the first position immediately. When the user invokes a control at the same level, the UI does not need to redraw a new menu interface, and in this scenario the controller can read the first position immediately. To ensure that the controller knows when to read the first position, in the technical solution shown in this embodiment the controller determines the timing for reading the first position according to whether the control hierarchy changes.
If the prompt information is read, the controller executes step S15 to determine whether the control hierarchy has changed;
there are various ways to determine whether the control hierarchy has changed.
For example, in a feasible embodiment, whether the control hierarchy changes can be determined through the control ID. Specifically, in some embodiments, the controller reads a second level, where the second level is the level of a second control and the second control is the control corresponding to the focus frame before the move; in response to the focus movement instruction, the controller reads the first level and determines whether the control hierarchy has changed by checking whether the first level is consistent with the second level. For example, in a feasible embodiment, if the second level is level two and the first level is level three, the controller determines that the control hierarchy has changed.
As another example, in a feasible embodiment, focus movement instructions may be differentiated. The instruction corresponding to moving the focus between controls at the same level is designated a first control instruction, and the instruction corresponding to moving the focus between controls at different levels is designated a second control instruction, the two being distinct. In response to the first control instruction, the controller determines that the control hierarchy has not changed; in response to the second control instruction, the controller determines that the control hierarchy has changed.
In practice, the manner of determining whether the control hierarchy changes may be, but is not limited to, the above two.
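The first approach above reduces to a level comparison that gates when the first position is read. A minimal sketch, with the return strings standing in for the controller's actions:

```python
# Sketch of the level-comparison approach described above: compare the
# level of the control the focus frame left (second level) with the level
# it landed on (first level). A mismatch means the hierarchy changed, so
# a hierarchy-change animation must finish before the first position is
# read. Function names are illustrative.

def hierarchy_changed(first_level, second_level):
    return first_level != second_level

def on_focus_moved(first_level, second_level):
    if hierarchy_changed(first_level, second_level):
        # S161: play the animation; S162 runs when it finishes.
        return "play hierarchy-change animation, then read first position"
    # Same level: S162 runs immediately.
    return "read first position immediately"

print(on_focus_moved(first_level=3, second_level=2))
print(on_focus_moved(first_level=2, second_level=2))
```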
If the hierarchy has changed, step S161 is executed to control the display to play the hierarchy change animation, where the second level is the level of the second control and the second control is the control corresponding to the focus frame before the move;
when the hierarchy change animation finishes playing, or when the control hierarchy has not changed, step S162 is executed to read the first position;
the controller is configured to execute step S104 to generate a second position according to the first position, the first position being associated with the second position;
the embodiment of the application shows that the first position is associated with the second position. The association may be that the center of the first position and the center of the second position lie on the same horizontal line; referring to the interface 31 in fig. 8 (fig. 8 is a schematic view of a presentation interface of the display according to a feasible embodiment), it can be seen that the center of the first position 2 and the center of the second position 3 lie on the same horizontal line. The association may also be that the top of the first position and the top of the second position lie on the same horizontal line; referring to the interface 32 in fig. 8, it can be seen that the two tops are aligned. The association may also be that the bottom of the first position and the bottom of the second position lie on the same horizontal line; referring to the interface 33 in fig. 8, it can be seen that the two bottoms are aligned.
The embodiment of the present application merely describes three relationships between the first position and the second position by way of example, and in the process of practical application, the relationships between the first position and the second position may be, but are not limited to, the three manners described above.
In this embodiment, the second position is the display position (also referred to as the display region) of the prompt information, and in some feasible embodiments the width of the second position may be kept constant. In order to obtain as large a display region (second position) as possible for the prompt information, the embodiment of the application illustrates a method for generating the second position; specifically, refer to fig. 9, which is a flowchart of a second position generation manner according to a feasible embodiment, where the controller is configured to execute steps S21 to S221/S222.
Step S21: determining whether the first center ordinate is greater than or equal to the second center ordinate, where the second center ordinate is the ordinate of the center of the display;
the rectangular coordinate system referred to in this embodiment may use any vertex of the display as the origin. The first position may include a first top ordinate, a first bottom ordinate, and a first center ordinate, where the first top ordinate is the ordinate of the top of the first position, the first bottom ordinate is the ordinate of the bottom of the first position, and the first center ordinate is the ordinate of the center of the first position. In this application, the first position is the position of the moved focus frame, which is the same as the position of the first control.
If the first center ordinate is greater than or equal to the second center ordinate, step S221 is executed, in which the second top ordinate is set equal to the first top ordinate, the second top ordinate being the ordinate of the top of the second position; if the first center ordinate is smaller than the second center ordinate, step S222 is executed, in which the second bottom ordinate is set equal to the first bottom ordinate, the second bottom ordinate being the ordinate of the bottom of the second position.
For example, fig. 10A is a schematic diagram of a change in the presentation interface of the display according to a feasible embodiment. In the interface 4-1 of fig. 10A, the control 2 is the first control and the focus frame is located over the control 2. In this embodiment, the coordinate system takes the lower-right corner of the display as the origin, the extension line of the lower boundary of the display as the X-axis, and the extension line of the right boundary of the display as the Y-axis. The coordinates of the center of the first position are A(X5, Y5), and the coordinates of the center of the display are B(X6, Y6). The controller compares Y6 with Y5; since Y5 > Y6, the top ordinate of the second position is set equal to the top ordinate of the first position, as shown in interface 4-2 of fig. 10A. It can be seen from the figure that the top of the display position of the focus frame and the top of the display position of the prompt information are aligned. In some embodiments, if the prompt information contains more content, it may continue to be presented downward from the top of the focus frame until it reaches the edge of the display, as shown in interface 4-3 of fig. 10A.
For example, fig. 10B is a schematic diagram of a change in the presentation interface of the display according to a feasible embodiment. In the interface 5-1 of fig. 10B, the control 2 is the first control and the focus frame is located over the control 2. The coordinates of the center of the first position are C(X7, Y7), and the coordinates of the center of the display are B(X6, Y6). The controller compares Y6 with Y7; since Y7 < Y6, the second bottom ordinate is set equal to the first bottom ordinate, and the specific presentation effect can be seen in interface 5-2 of fig. 10B. It can be seen from the figure that the bottom of the display position of the focus frame and the bottom of the display position of the prompt information are aligned. In some embodiments, if the prompt information contains more content, it may continue to be presented upward from the bottom of the first position until it reaches the edge of the display, as shown in interface 5-3 of fig. 10B.
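Steps S21 to S221/S222 can be sketched as a single placement function. This is a hedged illustration: ordinates assume an origin at a lower corner of the display (Y growing upward), and the parameter names and prompt height are assumptions:

```python
# Hedged sketch of the vertical-placement steps described above: align
# the prompt area to the top of the focus frame when the frame is in the
# upper half of the display, otherwise to its bottom. Names are assumed.

def place_prompt_vertically(first_top, first_bottom, display_height,
                            prompt_height):
    """Return (top, bottom) ordinates of the second position."""
    first_center = (first_top + first_bottom) / 2
    second_center = display_height / 2  # ordinate of the display center
    if first_center >= second_center:
        # S221: second top ordinate equals first top ordinate; the
        # prompt extends downward toward the display edge.
        top = first_top
        bottom = top - prompt_height
    else:
        # S222: second bottom ordinate equals first bottom ordinate; the
        # prompt extends upward toward the display edge.
        bottom = first_bottom
        top = bottom + prompt_height
    return top, bottom

print(place_prompt_vertically(900, 820, 1080, 200))  # (900, 700)
print(place_prompt_vertically(300, 220, 1080, 200))  # (420, 220)
```

The branch choice is what guarantees the largest possible vertical extent for the prompt: the prompt always grows away from the nearer display edge.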
Text in most languages is displayed from left to right, and for displays in the corresponding regions the controls are also typically presented on the left (these languages may be referred to in this embodiment as the first language). However, text in some languages, such as Hebrew, Persian, and Arabic (these may be referred to in this embodiment as the second language), is displayed from right to left. To meet the needs of more users, the technical solution shown in this embodiment divides the languages applicable to the display into the first language and the second language: when the current language of the display device is the first language, the controls are displayed on the left side of the display interface; when the current language of the display device is the second language, the controls are displayed on the right side of the display interface.
FIG. 11 is a presentation interface of the display when the current language is the second language according to a feasible embodiment, where the interface 6-1 is the presentation interface of the system menu, the interface 6-2 is the presentation interface of the network menu, and the interface 6-3 is the presentation interface of the network connection menu.
In the display device shown in this embodiment, the first position may be located on the left side or the right side of the display. To prevent the prompt information from occluding the control, in some feasible embodiments the controller needs to determine in advance whether the first position is on the left or the right side of the display. Referring to fig. 12, fig. 12 is a flowchart of a second position generation manner according to a feasible embodiment, where the controller is configured to execute steps S31 to S321/S322.
Step S31: determining whether the first center abscissa is greater than the second center abscissa, where the second center abscissa is the abscissa of the center of the display.
The first position in this application further includes a first left boundary abscissa, a first right boundary abscissa, and a first center abscissa, where the first left boundary abscissa is the abscissa of the left boundary of the first position, the first right boundary abscissa is the abscissa of the right boundary of the first position, and the first center abscissa is the abscissa of the center of the first position;
if the first center abscissa is greater than the second center abscissa, step S321 is executed, in which the second right boundary abscissa is set less than or equal to the first left boundary abscissa, the second right boundary abscissa being the abscissa of the right boundary of the second position; if the first center abscissa is less than the second center abscissa, step S322 is executed, in which the second left boundary abscissa is set greater than or equal to the first right boundary abscissa, the second left boundary abscissa being the abscissa of the left boundary of the second position.
For example, fig. 13A is a schematic diagram of a change in the presentation interface of the display according to a feasible embodiment. In the interface 7-1 of fig. 13A, the control 2 is the first control and the focus frame is located over the control 2. In this embodiment, the coordinate system takes the lower-right corner of the display as the origin, the extension line of the lower boundary of the display as the X-axis, and the extension line of the right boundary of the display as the Y-axis. The coordinates of the center of the first position are D(X8, Y8), and the coordinates of the center of the display are B(X6, Y6). The controller compares X6 with X8 and Y6 with Y8. Since Y8 > Y6, the top ordinate of the second position is set equal to the top ordinate of the first position; since X8 > X6, the second right boundary abscissa is set less than or equal to the first left boundary abscissa. The specific presentation effect can be seen in interface 7-2 of fig. 13A. As can be seen from the interface 7-2, the top of the display position of the focus frame and the top of the display position of the prompt information are aligned, and the prompt information is displayed to the left of the focus frame. In some embodiments, if the prompt information contains more content, it may continue to be presented downward from the top of the first position until it reaches the edge of the display, as shown in interface 7-3 of fig. 13A.
For example, fig. 13B is a schematic diagram of a change in the presentation interface of the display according to a feasible embodiment. In the interface 8-1 of fig. 13B, the control 2 is the first control and the focus frame is located over the control 2. The coordinates of the center of the first position are E(X9, Y9), and the coordinates of the center of the display are B(X6, Y6). The controller compares X6 with X9 and Y6 with Y9. Since Y9 < Y6, the bottom ordinate of the second position is set equal to the bottom ordinate of the first position; since X9 > X6, the second right boundary abscissa is set less than or equal to the first left boundary abscissa. The specific presentation effect can be seen in interface 8-2 of fig. 13B. It can be seen from the figure that the bottom of the display position of the focus frame and the bottom of the display position of the prompt information are aligned, and the prompt information is displayed to the left of the focus frame. In some embodiments, if the prompt information contains more content, it may continue to be presented upward from the bottom of the first position until it reaches the edge of the display, as shown in interface 8-3 of fig. 13B.
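The horizontal branch (steps S31 to S321/S322) mirrors the vertical one and can be sketched the same way. Parameter names and the prompt width are illustrative assumptions:

```python
# Sketch of the horizontal-placement steps described above: show the
# prompt on the side of the focus frame facing the display center, by
# comparing the first center abscissa with the display's center
# abscissa. Names are assumed for illustration.

def place_prompt_horizontally(first_left, first_right, display_width,
                              prompt_width):
    """Return (left, right) abscissas of the second position."""
    first_center = (first_left + first_right) / 2
    if first_center > display_width / 2:
        # S321: the frame sits on the right half, so the prompt goes to
        # its left (second right boundary <= first left boundary).
        right = first_left
        left = right - prompt_width
    else:
        # S322: the frame sits on the left half, so the prompt goes to
        # its right (second left boundary >= first right boundary).
        left = first_right
        right = left + prompt_width
    return left, right

print(place_prompt_horizontally(1500, 1800, 1920, 400))  # (1100, 1500)
print(place_prompt_horizontally(100, 400, 1920, 400))    # (400, 800)
```

As with the vertical case, placing the prompt toward the display center both avoids occluding the control and leaves the most room for the prompt text.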
The embodiment of the present application further illustrates a method for generating a second position, specifically, referring to fig. 14, where fig. 14 is a flowchart illustrating a manner of generating a second position according to a feasible embodiment, and the controller is configured to execute steps S41 to S421/S422.
Step S41, judging whether the language currently adopted by the display is the first language;
in the embodiment of the application, each time the language setting is adjusted, the controller stores the language currently adopted by the display device. The specific storage manner may be in the form of flag bits; for example, a first flag bit corresponds to the first language and a second flag bit corresponds to the second language. When it is necessary to determine whether the controls are located on the left or the right side of the display, the controller determines whether the adopted language is the first language by reading the flag bit.
The first position here includes the first left boundary abscissa and the first right boundary abscissa, where the first left boundary abscissa is the abscissa of the left boundary of the first position and the first right boundary abscissa is the abscissa of the right boundary of the first position;
if the current language setting is the first language, step S421 is executed, in which the second left boundary abscissa is set greater than or equal to the first right boundary abscissa, the second left boundary abscissa being the abscissa of the left boundary of the second position; when the current language of the display device is the first language, the controls are displayed on the left side of the display interface; if the current language setting is the second language, step S422 is executed, in which the second right boundary abscissa is set less than or equal to the first left boundary abscissa, the second right boundary abscissa being the abscissa of the right boundary of the second position; when the current language of the display device is the second language, the controls are displayed on the right side of the display interface.
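Steps S41 to S421/S422 amount to a side selection driven by the stored language flag. The set of second (right-to-left) languages below is assumed from the examples given earlier in the text:

```python
# Minimal sketch of the language-based side selection described above.
# The membership set is an assumption taken from the text's examples of
# right-to-left ("second") languages.

SECOND_LANGUAGES = {"hebrew", "persian", "arabic"}  # right-to-left

def prompt_side(current_language):
    # S422: second language -> controls on the right, prompt on their left.
    if current_language.lower() in SECOND_LANGUAGES:
        return "left of control"
    # S421: first language -> controls on the left, prompt on their right.
    return "right of control"

print(prompt_side("English"))  # right of control
print(prompt_side("Arabic"))   # left of control
```

In the flag-bit scheme described above, the set membership test would simply be replaced by reading the stored flag bit.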
In an optional technical solution shown in the embodiment of the application, primary controls are not configured with prompt information, while controls at the second level or deeper are configured with prompt information. When the first control is a secondary control, the user can only jump from it to a lower-level control, since no primary control is configured with prompt information. When the first control is at the third level or deeper, the user may jump either to the level above the first control or to the level below it. To enable the user to judge intuitively whether the first control is a secondary control, from which only a downward jump is possible, or a control at the third level or deeper, from which both upward and downward jumps are possible, the technical solution makes the display position of the prompt information for a secondary control different from that for a control at the third level or deeper. Specifically, if the first level is level two, the second right boundary abscissa is equal to the first left boundary abscissa, or the second left boundary abscissa is equal to the first right boundary abscissa; if the first level is level three or deeper, the second right boundary abscissa is smaller than the first left boundary abscissa, or the second left boundary abscissa is larger than the first right boundary abscissa.
For example, fig. 15 is a schematic diagram of a presentation interface of the display according to a feasible embodiment, where the interface 9-1 is the presentation interface in an application scenario in which the first level is level two. As can be seen from the interface 9-1, the prompt information is presented adjacent to the first control. The interface 9-2 is the presentation interface in an application scenario in which the first level is level three or deeper. As can be seen from the interface 9-2, the prompt information is presented at a distance from the first control.
The above is only an exemplary manner of presenting the prompt information; in practice, the presentation positions of the prompt information in the application scenario where the first level is level two and in the scenario where the first level is level three or deeper may be, but are not limited to, the above manners.
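The adjacent-versus-spaced distinction illustrated by fig. 15 can be sketched as follows. The gap size, and the choice to place the prompt to the right of the control, are illustrative assumptions:

```python
# Hedged sketch of the optional scheme above, matching the fig. 15
# description: a secondary control's prompt is drawn flush against it,
# while a prompt for a third-level-or-deeper control is drawn with a
# gap, signalling that an upward jump is also possible. The gap size is
# an assumption.

GAP = 24  # assumed spacing in pixels for level >= 3

def prompt_left_edge(first_right_abscissa, level, gap=GAP):
    """Left boundary abscissa of a prompt shown to the right of the
    first control."""
    if level == 2:
        # Adjacent: second left boundary equals first right boundary.
        return first_right_abscissa
    # Spaced: second left boundary strictly greater than first right.
    return first_right_abscissa + gap

print(prompt_left_edge(400, level=2))  # 400
print(prompt_left_edge(400, level=3))  # 424
```

The mirrored case, a prompt shown to the left of the control, would subtract the gap from the first left boundary abscissa instead.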
The description of the second position generation process is thus completed.
The controller is configured to execute step S105 to control the display to display prompt information of a first control at a second position, where the first control is a control corresponding to the moved focus frame.
To present the prompt information more prominently, the embodiment of the application illustrates a method for presenting it; specifically, referring to fig. 16, fig. 16 is a flowchart of the prompt information presentation method according to a feasible embodiment, where the controller is configured to execute steps S51 to S54.
Step S51, determining whether the prompt contains a picture;
if the prompt message contains a picture, executing step S52 to control the display to display the picture;
in the scheme shown in the embodiment of the application, the controller controls the display to preferentially display picture-type prompt information, since a picture can prompt the user more strikingly.
If the prompt information does not contain a picture, step S53 is executed to determine whether the prompt information contains text;
if the prompt information contains no picture, or once rendering of the picture-type prompt information has finished, step S54 is executed to control the display to display the text;
optionally, the picture is always positioned above the text during display.
According to the scheme shown in the embodiment of the application, picture-type prompt information is always positioned above text-type prompt information. The picture can prompt the user more strikingly, thereby improving the user experience.
For example, FIG. 17 is a schematic diagram of a presentation interface of the display according to a feasible embodiment. The interface 10-1 is a presentation interface of the display in an application scenario in which the prompt information includes both text-type and picture-type prompt information; as can be seen from the interface 10-1, the picture-type prompt information is located above the text-type prompt information. The interface 10-2 is a presentation interface of the display in another such application scenario; as can be seen from the interface 10-2, the picture-type prompt information is likewise located above the text-type prompt information.
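The rendering order of steps S51 to S54 can be sketched as a short function that emits draw operations in display order. The dict shape of a prompt entry is an illustrative assumption:

```python
# Sketch of the presentation steps described above: picture-type prompt
# information is rendered first and always sits above text-type
# information. The prompt dict shape is an illustrative assumption.

def render_prompt(prompt):
    """Return the draw operations in display order (picture above text)."""
    ops = []
    if prompt.get("picture"):   # S51/S52: draw the picture first
        ops.append(f"draw picture {prompt['picture']}")
    if prompt.get("text"):      # S53/S54: then the text, below it
        ops.append(f"draw text {prompt['text']!r}")
    return ops

print(render_prompt({"picture": "hint.png", "text": "Connect a cable"}))
```

Because the picture branch runs first, a prompt with both parts always shows the picture above the text, which is the ordering shown in FIG. 17.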
The display device shown in the embodiment of the present application includes a display and a controller, wherein the controller is configured to perform: reading a first position in response to a movement instruction of the focus frame, wherein the first position is the position of the focus frame after the movement; generating a second position according to the first position, the first position being associated with the second position; and controlling the display to display prompt information of a first control at the second position, wherein the first control is the control corresponding to the focus frame after the movement. It can be seen that, with the display device provided by the embodiment of the application, the controller can generate the second position according to the first position and control the display to display the prompt information at the second position, so that the user can learn the function of the control from the prompt information in more detail; furthermore, the prompt information is always displayed around the first control, so the user experience is better.
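The derivation of the second position from the first position can be sketched as follows. All names are illustrative, the use of the display centre as the second reference point is an assumption, and coordinates follow the usual screen convention with the origin at the top left; this is a sketch under those assumptions, not the patented implementation:

```python
def second_position(first, display_size, prompt_size):
    """Derive the prompt's (second) position from the focused control's
    (first) position. Rectangles are (left, top, right, bottom) tuples.
    """
    left1, top1, right1, bottom1 = first
    disp_w, disp_h = display_size
    w, h = prompt_size
    cx1 = (left1 + right1) / 2          # first center abscissa
    cy1 = (top1 + bottom1) / 2          # first center ordinate
    cx2, cy2 = disp_w / 2, disp_h / 2   # display centre (assumed reference)

    # Vertical rule: align top edges when the first center ordinate is
    # greater than or equal to the reference ordinate; otherwise align
    # bottom edges.
    if cy1 >= cy2:
        top, bottom = top1, top1 + h
    else:
        top, bottom = bottom1 - h, bottom1

    # Horizontal rule: place the prompt on the side of the control away
    # from the display centre, so it sits beside the control rather than
    # covering it.
    if cx1 > cx2:
        left, right = left1 - w, left1    # prompt to the left of the control
    else:
        left, right = right1, right1 + w  # prompt to the right of the control
    return (left, top, right, bottom)
```

This keeps the prompt adjacent to the first control for any focus position, so the user can read the control's function without the prompt obscuring the control itself.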
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program that, when executed, performs some or all of the steps of each embodiment of the methods provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general-purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be implemented essentially, or in part, in the form of a software product, which may be stored in a storage medium such as a ROM/RAM, a magnetic disk, or an optical disk, and which includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods of the embodiments, or of some parts of the embodiments, of the present invention.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application. The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, comprising:
a display;
a controller configured to perform:
reading a first position in response to a movement instruction of a focus frame, wherein the first position is the position of the focus frame after the movement;
generating a second position according to the first position, the first position being associated with the second position;
and controlling the display to display prompt information of a first control at the second position, wherein the first control is the control corresponding to the focus frame after the movement.
2. The display device of claim 1, wherein the controller is further configured to:
reading a first ID, the first ID being a control ID of the first control;
determining a first level according to the first ID, the first level being a level of the first control;
if the first level is one level, the first position is not read.
3. The display device according to claim 2, wherein each control is configured with an information list for storing prompt information of the control; and if the first level is not one level, the controller is further configured to:
reading prompt information in a target information list, wherein the target information list is the information list of the first control;
and if no prompt information is read, the first position is not read.
4. The display device of claim 3, wherein if prompt information is read, the controller is further configured to:
if the first level is different from a second level, controlling the display to play a level change animation, wherein the second level is the level of a second control, and the second control is the control corresponding to the focus frame before the movement;
reading the first position in response to ending playback of the level change animation;
and reading the first position if the first level is the same as the second level.
5. The display device according to any one of claims 1 to 4, wherein the first position comprises: a first top end ordinate, a first bottom end ordinate, and a first center ordinate, wherein the first top end ordinate is the ordinate of the top end of the first position, the first bottom end ordinate is the ordinate of the bottom end of the first position, and the first center ordinate is the ordinate of the center of the first position;
if the first center ordinate is greater than or equal to a second center ordinate, the second top end ordinate is equal to the first top end ordinate, wherein the second top end ordinate is the ordinate of the top end of the second position;
and if the first center ordinate is smaller than the second center ordinate, the second bottom end ordinate is equal to the first bottom end ordinate, wherein the second bottom end ordinate is the ordinate of the bottom end of the second position.
6. The display device of claim 5, wherein the first position further comprises: a first left boundary abscissa, a first right boundary abscissa, and a first center abscissa, the first left boundary abscissa being the abscissa of the first position left boundary, the first right boundary abscissa being the abscissa of the first position right boundary, the first center abscissa being the abscissa of the first position center;
if the first center abscissa is greater than a second center abscissa, the second right boundary abscissa is less than or equal to the first left boundary abscissa, wherein the second center abscissa is the abscissa of the center of the display, and the second right boundary abscissa is the abscissa of the right boundary of the second position;
and if the first center abscissa is less than the second center abscissa, the second left boundary abscissa is greater than or equal to the first right boundary abscissa, wherein the second left boundary abscissa is the abscissa of the left boundary of the second position.
7. The display device of claim 5, wherein the first position further comprises: a first left boundary abscissa and a first right boundary abscissa, the first left boundary abscissa being the abscissa of the first position left boundary, the first right boundary abscissa being the abscissa of the first position right boundary;
if the current language setting is a first language, the second left boundary abscissa is greater than or equal to the first right boundary abscissa, wherein the second left boundary abscissa is the abscissa of the left boundary of the second position, and when the current language of the display device is the first language, the control is displayed on the left side of the display interface;
and if the current language setting is a second language, the second right boundary abscissa is less than or equal to the first left boundary abscissa, wherein the second right boundary abscissa is the abscissa of the right boundary of the second position, and when the current language of the display device is the second language, the control is displayed on the right side of the display interface.
8. The display device according to claim 6 or 7, wherein if the first level is two levels, the second right boundary abscissa is smaller than the first left boundary abscissa, or the second left boundary abscissa is larger than the first right boundary abscissa;
and if the first level is greater than two levels, the second right boundary abscissa is equal to the first left boundary abscissa, or the second left boundary abscissa is equal to the first right boundary abscissa.
9. The display device according to claim 1,
wherein if the prompt information contains a picture and characters, the picture is displayed preferentially in the display process.
10. The display device according to claim 1,
wherein if the prompt information contains a picture and characters, the picture is always positioned above the characters in the display process.
CN202011396065.4A 2020-12-03 2020-12-03 Display equipment Active CN112584211B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202011396065.4A CN112584211B (en) 2020-12-03 2020-12-03 Display equipment
PCT/US2021/061652 WO2022120079A1 (en) 2020-12-03 2021-12-02 Display apparatus
EP21901480.0A EP4256796A1 (en) 2020-12-03 2021-12-02 Display apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011396065.4A CN112584211B (en) 2020-12-03 2020-12-03 Display equipment

Publications (2)

Publication Number Publication Date
CN112584211A true CN112584211A (en) 2021-03-30
CN112584211B CN112584211B (en) 2023-11-03

Family

ID=75126913

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011396065.4A Active CN112584211B (en) 2020-12-03 2020-12-03 Display equipment

Country Status (1)

Country Link
CN (1) CN112584211B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113473241A (en) * 2021-07-20 2021-10-01 海信视像科技股份有限公司 Display equipment and display control method of image-text style menu
WO2023087963A1 (en) * 2021-11-22 2023-05-25 北京有竹居网络技术有限公司 Information display method and device, and storage medium
US11962865B2 (en) 2021-07-20 2024-04-16 Hisense Visual Technology Co., Ltd. Display apparatus and process method for display apparatus

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101661369A (en) * 2008-08-26 2010-03-03 阿尔派株式会社 Menu display device and menu display method
US20120096386A1 (en) * 2010-10-19 2012-04-19 Laurent Baumann User interface for application transfers
CN102523528A (en) * 2011-12-30 2012-06-27 广州弘洋视讯科技有限公司 Method for displaying interface of intelligent television
CN103210654A (en) * 2011-01-20 2013-07-17 Lg电子株式会社 Digital receiver and method of providing real-time rating thereof
CN103517148A (en) * 2012-06-15 2014-01-15 索尼公司 Information processing system, information processing apparatus, and information processing method
CN105792007A (en) * 2016-05-06 2016-07-20 青岛海信电器股份有限公司 Notification message display and interaction method and device based on intelligent television
CN109190006A (en) * 2018-07-19 2019-01-11 聚好看科技股份有限公司 A kind of exchange method and device based on information search interface

Also Published As

Publication number Publication date
CN112584211B (en) 2023-11-03

Similar Documents

Publication Publication Date Title
CN112584211B (en) Display equipment
CN113066490B (en) Prompting method of awakening response and display equipment
CN112506400A (en) Page information voice broadcasting method and display device
CN112799627B (en) Display apparatus and image display method
CN113268199A (en) Display device and function item setting method
CN115243094A (en) Display device and multi-layer stacking method
CN112860331B (en) Display equipment and voice interaction prompting method
CN112799576A (en) Virtual mouse moving method and display device
CN112601109A (en) Audio playing method and display device
CN112235621B (en) Display method and display equipment for visual area
CN112650418B (en) Display device
CN113573112A (en) Display device and remote controller
CN113784203A (en) Display device and channel switching method
CN112882780A (en) Setting page display method and display device
CN112668546A (en) Video thumbnail display method and display equipment
CN114281228A (en) Display device and control method for displaying multilevel menu panel
CN112882631A (en) Display method of electronic specification on display device and display device
CN114302070A (en) Display device and audio output method
CN113766164B (en) Display equipment and signal source interface display method
CN112416214A (en) Display device
CN114281284B (en) Display apparatus and image display method
CN115268697A (en) Display device and line drawing rendering method
EP4256796A1 (en) Display apparatus
CN112631796A (en) Display device and file copying progress display method
CN113672192A (en) Method for prompting message by browser page characters and display equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant