CN113747216B - Display device and touch menu interaction method - Google Patents


Info

Publication number
CN113747216B
Authority
CN
China
Prior art keywords
touch
display
menu
control
finger
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010474023.1A
Other languages
Chinese (zh)
Other versions
CN113747216A (en)
Inventor
Wang Xuelei (王学磊)
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Priority to CN202010474023.1A
Publication of CN113747216A
Application granted
Publication of CN113747216B
Legal status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • H04N21/4852End-user interface for client configuration for modifying audio parameters, e.g. switching between mono and stereo

Abstract

The application provides a display device and a touch menu interaction method. The interaction method can be applied to the display device to implement different functions of a touch menu. During interaction, a user can input a multi-finger touch instruction through the touch screen to call up the touch menu. After the touch menu is called up, the user can input a multi-finger rotating touch instruction to scale the diameter of the touch menu, input a single-finger sliding touch instruction to adjust the positions of the control options on the touch menu, and open any control option by inputting a single-finger clicking touch instruction. For an opened control option, an adjustment control can be displayed directly in the middle area of the touch menu, so that the user can input a single-finger sliding touch instruction to adjust the corresponding control parameter. The interaction method is simple to operate and can effectively improve the user experience.

Description

Display device and touch menu interaction method
Technical Field
The application relates to the technical field of touch televisions, and in particular to a display device and a touch menu interaction method.
Background
A touch television is a smart television device with a touch screen that supports touch interaction. By clicking, sliding, and other operations on the touch screen, the user can interact with the operation interface of the touch television, play the media displayed in the operation interface, and complete auxiliary actions such as page turning and switching. In this touch interaction process, the menu is a frequently used function.
A touch menu may contain a plurality of control options such as "back", "volume", and "home". When the user clicks a control option, the corresponding control action is executed. For example, if the user clicks the "volume" option icon in the touch menu, the television displays a volume adjustment interface. To support interactive operations on a touch menu, a typical touch television provides a dedicated menu interface in which the control options are displayed in a vertical list; for options that cannot all be displayed on one page, the user turns pages with sliding operations on the touch screen to make a selection.
However, since the display screen of a touch television is large, the user's finger must slide a long distance on the touch screen to complete a page turn, which is inconvenient. If the menu interface contains many options, the user must repeatedly slide and turn pages, performing frequent long-distance sliding operations on the screen. This interaction mode is cumbersome and seriously degrades the user experience.
Disclosure of Invention
The application provides a display device and a touch menu interaction method, which are used to solve the problem that the touch menu interaction mode of a conventional display device is cumbersome.
In a first aspect, the present application provides a display device including a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following procedure:
receiving a multi-finger touch instruction input by a user and used for calling the touch menu;
and responding to the multi-finger touch instruction, controlling the display to display the touch menu, and calculating and determining the display position of the touch menu according to the position of each touch point in the multi-finger touch instruction.
Based on the display device, the application also provides a touch menu interaction method, which comprises the following steps:
receiving a multi-finger touch instruction input by a user and used for calling a touch menu; the touch menu is an annular menu control formed by a plurality of control options;
and responding to the multi-finger touch instruction, controlling a display to display the touch menu, and calculating and determining the display position of the touch menu according to the position of each touch point in the multi-finger touch instruction.
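The patent does not give a formula for the display position; a minimal sketch, assuming the position is simply the centroid of the touch points reported by the multi-finger touch instruction (the function and parameter names are hypothetical):

```python
def menu_center(touch_points):
    """Compute the display center of the ring menu as the centroid
    of the touch points in the multi-finger touch instruction.

    touch_points: list of (x, y) tuples in screen coordinates.
    """
    n = len(touch_points)
    cx = sum(p[0] for p in touch_points) / n
    cy = sum(p[1] for p in touch_points) / n
    return cx, cy

# Example: a three-finger press roughly around (960, 540)
print(menu_center([(900, 500), (1020, 500), (960, 620)]))
```

A real implementation would additionally clamp the center so the full ring stays on screen, as the later edge-of-screen flow (Fig. 12) suggests.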
In a second aspect, the present application provides a display device including a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following procedure:
receiving a single-finger sliding touch instruction input by a user and used for calling the touch menu;
responding to the single-finger sliding touch instruction, and detecting an action track of a touch point;
and if the action track is the same as the preset judgment track, controlling the display to display the touch menu, wherein the display position of the touch menu is calculated and determined according to the action track of the touch point in the single-finger sliding touch instruction.
Based on the display device, the application also provides a touch menu interaction method, which comprises the following steps:
receiving a single-finger sliding touch instruction input by a user and used for calling a touch menu; the touch menu is an annular menu control formed by a plurality of control options;
responding to the single-finger sliding touch instruction, and detecting an action track of a touch point;
And if the action track is the same as the preset judgment track, controlling a display to display the touch menu, wherein the display position of the touch menu is calculated and determined according to the action track of the touch point in the single-finger sliding touch instruction.
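The preset judgment track is not specified in this section; as an illustrative sketch only, assuming the judgment track is a closed circle, a track can be accepted when every sample point stays within a relative tolerance of the mean radius about the track's centroid (function name and tolerance value are assumptions):

```python
import math

def matches_circle_track(track, tolerance=0.2):
    """Decide whether a single-finger action track approximates a
    circular judgment track: each point's distance to the centroid
    must stay within `tolerance` (relative) of the mean radius.

    track: list of (x, y) touch-point samples.
    """
    n = len(track)
    cx = sum(p[0] for p in track) / n
    cy = sum(p[1] for p in track) / n
    radii = [math.hypot(x - cx, y - cy) for x, y in track]
    mean_r = sum(radii) / n
    if mean_r == 0:
        return False  # all samples coincide; no recognizable track
    return all(abs(r - mean_r) / mean_r <= tolerance for r in radii)
```

Under this assumption, the centroid of an accepted track can double as the display position mentioned above.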
In a third aspect, the present application provides a display device including a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following procedure:
receiving a multi-finger rotating touch instruction input by a user and used for zooming the touch menu;
responding to the multi-finger rotating touch control instruction, and acquiring the rotating directions of a plurality of touch points in the multi-finger rotating touch control instruction;
and scaling the diameter of the touch menu according to the rotation direction, and controlling the display to display the touch menu in real time.
Based on the display device, the application also provides a touch menu interaction method, which comprises the following steps:
receiving a multi-finger rotating touch instruction input by a user and used for zooming a touch menu; the touch menu is an annular menu control formed by a plurality of control options;
Responding to the multi-finger rotating touch control instruction, and acquiring the rotating directions of a plurality of touch points in the multi-finger rotating touch control instruction;
and scaling the diameter of the touch menu according to the rotation direction, and controlling a display to display the touch menu in real time.
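The rotation direction of the touch points can be recovered as a signed angular change about their centroid; a sketch of this third aspect, with an assumed mapping (counterclockwise grows the menu, clockwise shrinks it, and the gain and clamp values are illustrative):

```python
import math

def rotation_delta(prev_points, curr_points):
    """Signed average angular change (radians) of touch points about
    their centroid; positive = counterclockwise, negative = clockwise."""
    cx = sum(p[0] for p in prev_points) / len(prev_points)
    cy = sum(p[1] for p in prev_points) / len(prev_points)
    delta = 0.0
    for (x0, y0), (x1, y1) in zip(prev_points, curr_points):
        d = math.atan2(y1 - cy, x1 - cx) - math.atan2(y0 - cy, x0 - cx)
        # unwrap into (-pi, pi] so a small rotation never reads as a big one
        while d > math.pi:
            d -= 2 * math.pi
        while d <= -math.pi:
            d += 2 * math.pi
        delta += d
    return delta / len(prev_points)

def scale_diameter(diameter, delta, gain=100.0, lo=200.0, hi=800.0):
    """Scale the ring diameter by the rotation, clamped to [lo, hi] px."""
    return max(lo, min(hi, diameter + gain * delta))
```

Called once per touch-move event, this lets the display track the zoom "in real time" as the claim requires.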
In a fourth aspect, the present application provides a display apparatus including a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following procedure:
receiving a single-finger sliding touch instruction input by a user and used for switching control options in the touch menu;
responding to the single-finger sliding touch instruction, and acquiring a touch point movement track of the single-finger sliding touch instruction;
and adjusting the display position of the control options in the annular menu control according to the movement track of the touch point, and controlling the display to display the touch menu in real time.
Based on the display device, the application also provides a touch menu interaction method, which comprises the following steps:
Receiving a single-finger sliding touch instruction input by a user and used for switching control options in a touch menu; the touch menu is an annular menu control formed by a plurality of control options;
responding to the single-finger sliding touch instruction, and acquiring a touch point movement track of the single-finger sliding touch instruction;
and adjusting the display position of the control options in the annular menu control according to the movement track of the touch point, and controlling the display to display the touch menu in real time.
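One way to realize this fourth aspect is to lay the options out at equal angles and convert the touch point's displacement into an angular offset; a sketch under the assumption that sliding one circumference rotates the ring once (all names and the mapping are illustrative):

```python
import math

def option_angles(num_options, offset=0.0):
    """Angular position (radians) of each control option on the ring,
    evenly spaced and shifted by the accumulated slide offset."""
    step = 2 * math.pi / num_options
    return [(offset + i * step) % (2 * math.pi) for i in range(num_options)]

def slide_to_offset(dx, diameter):
    """Convert a touch-point displacement (px along the ring) into an
    angular offset; circumference = pi * diameter."""
    return 2 * math.pi * dx / (math.pi * diameter)
```

Re-rendering the options at `option_angles(n, offset)` after each move event gives the real-time switching described above.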
In a fifth aspect, the present application provides a display apparatus including a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following procedure:
receiving a single-finger click touch instruction input by a user and used for opening the control options;
responding to the single-finger click touch instruction, and adding an adjustment control corresponding to the opened control option in the middle area of the annular menu control;
and controlling the display to display the touch menu and the adjustment control.
Based on the display device, the application also provides a touch menu interaction method, which comprises the following steps:
receiving a single-finger clicking touch instruction input by a user and used for opening a control option; the touch menu is an annular menu control formed by a plurality of control options;
responding to the single-finger click touch instruction, and adding an adjustment control corresponding to the opened control option in the middle area of the annular menu control;
and controlling a display to display the touch menu and the adjustment control.
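Opening a control option by a single-finger click requires hit-testing the tap against the ring; a sketch assuming option 0 starts at angle 0 and the options run counterclockwise (the layout convention and names are assumptions):

```python
import math

def hit_option(tap, center, inner_r, outer_r, num_options):
    """Return the index of the control option under a tap on the ring,
    or None if the tap falls outside the ring band (e.g. a tap in the
    middle area, where the adjustment control is displayed)."""
    dx, dy = tap[0] - center[0], tap[1] - center[1]
    r = math.hypot(dx, dy)
    if not (inner_r <= r <= outer_r):
        return None
    angle = math.atan2(dy, dx) % (2 * math.pi)
    step = 2 * math.pi / num_options
    return int(angle // step)
```

When a hit is found, the controller can add the corresponding adjustment control in the middle area and redraw the menu.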
In a sixth aspect, the present application further provides a display device, where the display device includes a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is a round menu control formed by at least one control option; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following procedure:
receiving a first direction rotation touch instruction input by a user and used for displaying hidden options in the touch menu;
and responding to the first direction rotating touch instruction, and controlling the display to sequentially display the hidden options according to the first direction.
Based on the display device, the application also provides a touch menu interaction method, which comprises the following steps:
receiving a first direction rotation touch instruction input by a user and used for displaying hidden options in a touch menu, wherein the touch menu is a circular menu control formed by at least one control option;
and responding to the first direction rotating touch instruction, and controlling the display to sequentially display the hidden options according to the first direction.
In a seventh aspect, the present application further provides a display device, where the display device includes a display, a touch sensing module, and a controller. Wherein the display is configured to display a touch menu; the touch menu is a round menu control formed by at least one control option; the touch sensing module is configured to detect a touch instruction input by a user.
The controller is configured to execute the following procedure:
acquiring a multi-finger touch instruction input by a user and used for calling the touch menu;
responding to the multi-finger touch instruction, and acquiring the number of control options contained in the touch menu;
and setting the display diameter of the touch menu according to the number of the control options, and controlling the display to display the touch menu according to the display diameter.
Based on the display device, the application also provides a touch menu interaction method, which comprises the following steps:
acquiring a multi-finger touch instruction input by a user and used for calling a touch menu; the touch menu is a round menu control formed by at least one control option;
responding to the multi-finger touch instruction, and acquiring the number of control options contained in the touch menu;
and setting the display diameter of the touch menu according to the number of the control options, and controlling the display to display the touch menu according to the display diameter.
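The seventh aspect sizes the ring from the option count; a sketch under the assumption that each option should get a roughly fixed arc of circumference, clamped to a screen-dependent range (all parameter values are illustrative, not from the patent):

```python
import math

def display_diameter(num_options, item_arc=120.0, lo=200.0, hi=800.0):
    """Choose a ring diameter so each control option gets about
    `item_arc` pixels of circumference, clamped to [lo, hi] px."""
    ideal = num_options * item_arc / math.pi  # circumference = pi * d
    return max(lo, min(hi, ideal))
```

When the ideal diameter exceeds the upper clamp, the surplus options become the "hidden options" of the sixth aspect, revealed by rotating the ring.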
According to the technical solution above, the application provides a display device and a touch menu interaction method. The interaction method can be applied to the display device to implement different functions of the touch menu. During interaction, a user can input a multi-finger touch instruction through the touch screen to call up the touch menu. After the touch menu is called up, the user can input a multi-finger rotating touch instruction to scale the diameter of the touch menu, input a single-finger sliding touch instruction to adjust the positions of the control options on the touch menu, and open any control option by inputting a single-finger clicking touch instruction. For an opened control option, an adjustment control can be displayed directly in the middle area of the touch menu, so that the user can input a single-finger sliding touch instruction to adjust the corresponding control parameter.
The touch menu interaction method provided by the application displays menu functions through an annular interface and simplifies operation through single-finger or multi-finger clicking, sliding, and rotating touch interactions, solving the problem that the touch menu interaction mode of a conventional display device is cumbersome. Moreover, the annular menu can be freely scaled according to the number of menu functions the user requires, effectively improving the user experience.
Drawings
In order to more clearly illustrate the technical solution of the present application, the drawings needed in the embodiments are briefly described below. It will be obvious that those skilled in the art can obtain other drawings from these drawings without inventive effort.
Fig. 1 is a block diagram showing a configuration of a display device in an embodiment of the present application;
FIG. 2 is a block diagram of an architecture configuration of an operating system in a memory of a display device according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an operation scenario between a display device and a control device according to an embodiment of the present application;
FIG. 4 is a block diagram illustrating a configuration of a control device according to an embodiment of the present application;
FIG. 5 is a schematic view of a touch menu according to an embodiment of the present application;
FIG. 6 is a schematic view of a touch menu zoom in an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating a zooming of a touch menu to any state according to an embodiment of the present application;
FIG. 8 is a schematic diagram of adjusting control options according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a loop display control option according to an embodiment of the present application;
FIG. 10 is a flow chart of a touch menu call by a multi-finger touch command according to an embodiment of the present application;
FIG. 11 is a schematic flow chart of displaying a touch menu according to the coordinates of a center point in an embodiment of the application;
FIG. 12 is a schematic flow chart of displaying a touch menu at the edge of a screen in an embodiment of the application;
FIG. 13 is a flow chart illustrating the adjustment of control options according to an initial diameter in accordance with an embodiment of the present application;
FIG. 14 is a flow chart of determining an initial diameter according to the number of touch points in an embodiment of the application;
FIG. 15 is a flow chart of a touch menu call by a single-finger sliding touch command in an embodiment of the application;
FIG. 16 is a schematic flow chart of displaying a touch menu according to the center position of an action track in an embodiment of the application;
FIG. 17 is a flow chart of zooming a touch menu according to an embodiment of the application;
FIG. 18 is a flow chart of calculating the scaling according to the embodiment of the present application;
FIG. 19 is a flow chart of determining the maximum angle variation in an embodiment of the present application;
FIG. 20 is a flow chart of switching control options by a single-finger sliding touch command in an embodiment of the application;
FIG. 21 is a flow chart illustrating a control option according to a sliding speed switching in an embodiment of the application;
FIG. 22 is a flow chart illustrating an adjustment control according to an embodiment of the present application;
FIG. 23 is a flow chart of adjusting control parameters by adjusting a control in an embodiment of the application;
FIG. 24 is a schematic diagram of a volume adjustment control in an embodiment of the present application;
FIG. 25 is a flowchart illustrating a method for determining whether a control option supports displaying an adjustment control according to an embodiment of the present application;
FIG. 26 is a flow chart illustrating a hidden option according to an embodiment of the present application;
fig. 27 is a flowchart of setting a display diameter according to the number of control options in an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements unless otherwise indicated. The embodiments described below do not represent all embodiments consistent with the application; they are merely examples of systems and methods consistent with aspects of the application as set forth in the claims.
The application provides a display device and a touch menu interaction method. The touch menu interaction method can be applied to a display device 200, which is a device with a display screen and a touch interaction function, such as a touch television. Obviously, the display device is not limited to a touch television; it may be any other device with a touch function, such as a mobile phone, a tablet computer, a notebook computer, or a touch display.
The display device 200 may provide a broadcast receiving function and a computer-supported network television function. It may be implemented as a digital television, a web television, an Internet Protocol television (IPTV), or the like.
The display device 200 may be a liquid crystal display, an organic light-emitting display, or a projection device. The specific display device type, size, and resolution are not limited.
A hardware configuration block diagram of the display device 200 is exemplarily shown in fig. 1. As shown in fig. 1, a modem 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, a touch sensing module 277, an audio processor 280, an audio output interface 285, and a power supply 290 may be included in the display apparatus 200.
The tuner demodulator 210 receives broadcast television signals in a wired or wireless manner and may perform modulation and demodulation processing such as amplification, mixing, and resonance, in order to demodulate, from among a plurality of wireless or wired broadcast television signals, the audio/video signal carried on the frequency of the television channel selected by the user, as well as additional information (e.g., EPG data).
Under the control of the controller 250, the tuner demodulator 210 responds to the frequency of the television channel selected by the user and the television signal carried on that frequency.
The tuner demodulator 210 can receive signals in various ways according to the broadcasting system of the television signal, such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, or internet broadcasting. According to the modulation type, a digital or analog modulation mode may be adopted, and both analog and digital signals can be demodulated according to the kind of television signal received.
In other exemplary embodiments, the tuner demodulator 210 may also be located in an external device, such as an external set-top box. In this case, the set-top box outputs a modulated and demodulated television signal, which is input to the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the display device 200 may transmit content data to an external device connected via the communicator 220, or browse and download content data from such a device. The communicator 220 may include network communication protocol modules or near-field communication protocol modules, such as a WiFi module 221, a Bluetooth communication protocol module 222, and a wired Ethernet communication protocol module 223, so that the communicator 220 can, under the control of the controller 250, receive control signals from the control device 100 in the form of WiFi signals, Bluetooth signals, radio-frequency signals, and so on.
The detector 230 is a component of the display device 200 for collecting signals from the external environment or for interaction with the outside. The detector 230 may include a sound collector 231, such as a microphone, which may be used to receive the user's sound, such as the voice signal of a control instruction with which the user controls the display device 200; alternatively, it may collect ambient sounds to identify the type of ambient scene, so that the display device 200 can adapt to ambient noise.
In other exemplary embodiments, the detector 230 may further include an image collector 232, such as a camera or webcam, which may be used to collect the external environment scene so as to adaptively change the display parameters of the display device 200, and to collect attributes of the user or gestures for interaction with the user, so as to realize interaction between the display device and the user.
In other exemplary embodiments, the detector 230 may further include a light receiver for collecting ambient light intensity to adapt to changes in display parameters of the display device 200, etc.
In other exemplary embodiments, the detector 230 may further include a temperature sensor. By sensing the ambient temperature, the display device 200 may adaptively adjust the display color temperature of the image. Illustratively, when the ambient temperature is high, the display device 200 may be adjusted to display the image with a cooler color temperature; when the ambient temperature is low, the display device 200 may be adjusted to display the image with a warmer color temperature.
The external device interface 240 is a component that enables the controller 250 to control data transmission between the display apparatus 200 and external devices. The external device interface 240 may be connected in a wired or wireless manner to external devices such as a set-top box, a game device, or a notebook computer, and may receive data from the external device such as a video signal (e.g., a moving image), an audio signal (e.g., music), and additional information (e.g., an EPG).
The external device interface 240 may include: any one or more of a High Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not shown), a Red Green Blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to the user's operations by running various software control programs (e.g., an operating system and various application programs) stored on the memory 260. For example, the controller may be implemented as a System-on-a-Chip (SOC).
As shown in fig. 1, the controller 250 includes a Random Access Memory (RAM) 251, a Read Only Memory (ROM) 252, a graphics processor 253, a CPU processor 254, a communication interface 255, and a communication bus 256. The RAM251, the ROM252, the graphics processor 253, and the CPU 254 are connected to each other via a communication bus 256.
The ROM 252 stores various system boot instructions. When the display device 200 receives a power-on signal and begins to start up, the CPU processor 254 runs the system boot instructions in the ROM 252 and copies the operating system stored in the memory 260 into the RAM 251 to start running it. After the operating system has started, the CPU processor 254 copies the various applications in the memory 260 into the RAM 251 and then starts running them.
The graphics processor 253 generates various graphic objects, such as icons, operation menus, and graphics displayed in response to user input instructions. The graphics processor 253 may include an operator that processes the various interactive instructions received from user input so that objects can be displayed according to their display attributes, and a renderer that renders the objects produced by the operator and displays the rendered result on the display 275.
The CPU processor 254 executes operating system and application program instructions stored in the memory 260, and processes various applications, data, and content according to received user input instructions, so as to finally display and play various audio and video content.
In some exemplary embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor performs some initialization operations of the display device 200 in the display device preloading mode and/or displays the picture in the normal mode. The sub-processor(s) perform operations while the display device is in the standby mode or similar states.
Communication interface 255 may include a first interface through an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display device 200. For example: in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform the operation related to the object selected by the user input command. For example, the controller may be implemented as an SOC (System on Chip) or an MCU (Micro Control Unit).
The object may be any selectable object, such as a hyperlink or an icon. The operation related to the selected object may be, for example, displaying the linked web page, document, or image, or executing the program corresponding to the object. The user input command for selecting the GUI object may be a command input through an input means (e.g., a mouse, keyboard, or touch pad) connected to the display device 200, or a voice command corresponding to a voice uttered by the user.
The memory 260 is used to store various types of data, software programs, or applications that drive and control the operation of the display device 200. The memory 260 may include volatile and/or nonvolatile memory. Here, the term "memory" covers the memory 260, the RAM 251 and ROM 252 of the controller 250, and any memory card in the display device 200.
In some embodiments, the memory 260 is specifically configured to store an operating program that drives the controller 250 in the display device 200; various application programs built in the display device 200 and downloaded from an external device by a user are stored; data for configuring various GUIs provided by the display 275, various objects related to the GUIs, visual effect images of selectors for selecting GUI objects, and the like are stored.
In some embodiments, the memory 260 is specifically configured to store drivers and related data for the modem 210, the communicator 220, the detector 230, the external device interface 240, the video processor 270, the display 275, the audio processor 280, etc., such as external data (e.g., audio-visual data) received from the external device interface or user data (e.g., key information, voice information, touch information, etc.) received from the user interface.
In some embodiments, memory 260 specifically stores software and/or programs for representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, as well as functions implemented by other programs (such as the middleware, APIs, or application programs); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to implement control or management of system resources.
An architectural configuration block diagram of the operating system in the memory of the display device 200 is illustrated in fig. 2. From top to bottom, the operating system architecture consists of an application layer, a middleware layer, and a kernel layer.
Application layer: both the applications built into the system and non-system applications belong to the application layer, which is responsible for direct interaction with the user. The application layer may include a plurality of applications, such as a settings application, an electronic post application, a media center application, and the like. These applications may be implemented as Web applications executed by a WebKit engine, and in particular may be developed and run based on HTML5, Cascading Style Sheets (CSS), and JavaScript.
Here, HTML, whose full name is HyperText Markup Language, is the standard markup language for creating web pages. Web pages are described by markup tags, which describe text, graphics, animations, sounds, tables, links, and so on; a browser reads an HTML document, interprets the tags within the document, and displays the content in the form of a web page.
CSS, whose full name is Cascading Style Sheets, is a computer language used to express the style of HTML documents, and may be used to define style structures such as fonts, colors, and positions. A CSS style can be stored directly in an HTML web page or in a separate style file, enabling control over the styles in the web page.
JavaScript is a language for Web page programming that can be inserted into HTML pages and interpreted by the browser. The interaction logic of a Web application is implemented in JavaScript. By encapsulating a JavaScript extension interface through the browser, JavaScript can also be used to communicate with the kernel layer.
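As a minimal sketch of such an extension interface, the snippet below shows how a Web application might call down toward the lower layers. The object name `bridge`, its `invoke` method, and the `audio.setVolume` command are assumptions made for illustration only; the patent does not document the actual interface.

```javascript
// Hypothetical sketch: a Web application calling through a browser-injected
// extension interface. "invoke" and the command name are illustrative
// assumptions, not a documented API of the device.
function setDeviceVolume(bridge, level) {
  if (!bridge || typeof bridge.invoke !== 'function') {
    throw new Error('native bridge not available');
  }
  // The browser forwards the call down through the middleware layer to the
  // kernel layer, where the audio driver finally applies the change.
  return bridge.invoke('audio.setVolume', { level });
}
```

In this shape, the Web application never touches the driver directly; it only names a command and parameters, and the bridge object supplied by the browser decides how that request travels through the middleware and kernel layers.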
Middleware layer: provides standardized interfaces to support various environments and systems. For example, the middleware layer may be implemented as MHEG (Multimedia and Hypermedia information coding Experts Group) middleware for data broadcasting, as DLNA middleware for communication with external devices, as middleware providing the browser environment in which the applications in the display device run, and so on.
Kernel layer: provides core system services, such as file management, memory management, process management, network management, and system security authority management. The kernel layer may be implemented as a kernel based on various operating systems, for example a kernel based on the Linux operating system.
The kernel layer also mediates between system software and hardware, providing device driver services for various hardware, such as: a display driver for the display, a camera driver for the camera, a key driver for the remote control, a WiFi driver for the WiFi module, an audio driver for the audio output interface, a power management driver for the Power Management (PM) module, and the like.
The user interface 265 receives various user interactions. Specifically, it transmits input signals from the user to the controller 250, or transmits output signals from the controller 250 to the user. Illustratively, the remote control 100A may send input signals entered by the user, such as a power switch signal, a channel selection signal, or a volume adjustment signal, to the user interface 265, which then forwards them to the controller 250; alternatively, the remote control 100A may receive output signals such as audio, video, or data that have been processed by the controller 250 and output through the user interface 265, and display the received output signals or output them in the form of audio or vibration.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the display 275, and the user interface 265 receives the user input command through the GUI. In particular, the user interface 265 may receive user input commands for controlling the position of a selector in a GUI to select different objects or items.
Alternatively, the user may enter a user command by entering a particular sound or gesture, and the user interface 265 recognizes the sound or gesture through the sensor to receive the user input command.
The video processor 270 is configured to receive an external video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image composition according to a standard codec protocol of an input signal, so as to obtain a video signal that is directly displayed or played on the display 275.
By way of example, video processor 270 includes a demultiplexing module, a video decoding module, an image compositing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module demultiplexes the input audio/video data stream, for example an input MPEG-2 stream (MPEG-2 is a compression standard for moving pictures and audio on digital storage media), into a video signal, an audio signal, and so on.
The video decoding module processes the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module, such as an image synthesizer, superimposes and mixes the GUI signal, input by the user or generated by the graphics generator, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module converts the frame rate of the input video, for example converting the frame rate of a 60 Hz input video into 120 Hz or 240 Hz; the common approach is frame insertion.
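The frame-insertion idea can be illustrated with a toy sketch (an assumption-laden simplification: frames are reduced to arrays of luminance values, and the inserted frame is a plain per-pixel blend, whereas a real conversion module estimates motion):

```javascript
// Toy frame-rate doubling by frame insertion: between each pair of adjacent
// input frames, insert a frame blended from both neighbors.
function insertFrames(frames) {
  const out = [];
  for (let i = 0; i < frames.length; i++) {
    out.push(frames[i]);
    if (i + 1 < frames.length) {
      // Inserted frame: per-pixel average of the two neighboring frames.
      out.push(frames[i].map((v, k) => (v + frames[i + 1][k]) / 2));
    }
  }
  return out;
}
```

Doubling a 60 Hz stream this way yields one inserted frame per original pair, which is why the output approaches twice the input frame rate.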
The display formatting module converts the signal output by the frame rate conversion module into a signal conforming to the display format of a device such as the display, for example converting the output of the frame rate conversion module into RGB data signals.
The display 275 receives the image signals from the video processor 270 and displays video content, images, and the menu manipulation interface. The displayed video content may come from the broadcast signal received by the modem 210, or from video content input through the communicator 220 or the external device interface 240. The display 275 also displays the user manipulation interface (UI) generated in the display device 200 for controlling the display device 200. The display 275 may include a display screen assembly for presenting the picture and a drive assembly for driving image display.
The touch sensing module 277 is configured to detect a touch action performed by a user on the display 275, and implement a touch interaction operation.
In one exemplary embodiment of the application, to enable touch interaction, the display screen of display 275 may be a touch screen through which a user may perform a series of touch interaction actions, such as: click, long press, slide, etc. The touch action performed by the user on the touch screen may be detected by the touch sensing module 277, and a corresponding interaction action is performed according to a pre-configured interaction rule.
In some embodiments, the touch screen function may be implemented by adding a layer of touch sensing elements to the display screen of the display 275, and the specific touch sensing principle may be determined according to the actual interaction requirement. For example, a capacitive touch screen, a resistive touch screen, an infrared touch screen, a surface acoustic wave touch screen, or the like may be employed according to the actual application scene of the display device 200.
The touch sensing module 277 may include a sensing unit disposed on the display 275 and a signal processing unit disposed in the display device. The sensing unit can be used for sensing touch operation of a user and converting the touch operation into an electric signal; the signal processing unit may process the generated electrical signal, including feature extraction, noise reduction, amplification, etc.
Taking a capacitive touch screen as an example, the sensing unit may be a layer of transparent special metal conductive material attached to the surface of the display screen glass of the display 275. When the finger or palm of the user touches the conductive substance layer, the capacitance value of the touch point is changed, so that a touch signal is generated. The signal processing unit may receive the touch signal and process the touch signal, and convert the touch signal into a digital signal readable by the controller 250.
In general, interactions performed by a user on a touch screen may include clicking, long pressing, sliding, and the like. In order to support more interaction modes, the touch screen can also support multi-touch. The more touch points the touch screen supports, the richer the interactive actions that can be implemented, for example multi-finger clicking, multi-finger long pressing, multi-finger sliding, and the like.
For different interaction actions, characteristics of the generated touch signal, such as the touch point positions, the number of touch points, and the touch area, can be acquired. The type of the touch signal is judged according to the signal characteristics generated by the touch points, and a touch instruction is generated accordingly. The position of the user's touch, i.e., the position where the interaction operation is performed, can be detected from the touch point positions; the number of fingers used in the touch interaction can be determined from the number of touch points; whether the user performed a click or a long-press operation can be determined from the duration of the touch signal; and a sliding operation performed by the user can be determined from the change in the touch point positions.
For example, after detecting the touch signal, the touch sensing module 277 may extract the characteristics of the touch signal; if the number of touch points in the touch signal is equal to 1, the duration of the touch signal is less than 0.5 s, and the position of the touch point is unchanged, it is determined that the interaction action input by the current user is a single-finger click action, and accordingly a single-finger click touch instruction may be generated.
For another example, if the number of touch points in the touch signal is equal to 5, the duration of the touch signal is greater than 1s, the position variation of the touch points in the touch signal is small, and the angle variation of the touch points exceeds a preset judgment threshold, the interaction action input by the current user is determined to be five-finger rotation action, and accordingly a five-finger rotation touch instruction can be generated.
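The decision rules in the two examples above can be sketched as a small classifier. The feature names (`pointCount`, `durationMs`, `maxDisplacement`, `rotationDeg`) and the movement/rotation thresholds are illustrative assumptions; the patent only fixes the point counts, the duration limits, and the existence of a preset angle threshold.

```javascript
// Sketch: map raw touch-signal features to a touch command.
function classifyTouch({ pointCount, durationMs, maxDisplacement, rotationDeg }) {
  const MOVE_EPS = 10;         // px: below this, touch points count as "unchanged" (assumed)
  const ROTATE_THRESHOLD = 15; // deg: preset judgment threshold for rotation (assumed)
  if (pointCount === 1 && durationMs < 500 && maxDisplacement < MOVE_EPS) {
    return 'single-finger-click';
  }
  if (pointCount === 5 && durationMs > 1000 &&
      maxDisplacement < MOVE_EPS && Math.abs(rotationDeg) > ROTATE_THRESHOLD) {
    return 'five-finger-rotate';
  }
  return 'unrecognized';
}
```

In a real module the unrecognized branch would fall through to further rules (long press, slide, page turn, and so on) rather than being discarded.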
The touch sensing module 277 may be connected to the controller 250 to transmit the generated touch command to the controller 250. Since the interaction process is a continuous process, the touch sensing module 277 continuously sends a touch command to the controller 250 to form a data stream. In order to distinguish between different touch commands, the touch sensing module 277 may generate a touch command according to a rule of one or more touch actions, so that the controller 250 may receive a complete and identifiable touch command.
It should be noted that, the signal processing operation performed on the touch signal may be performed by the controller 250, that is, in some embodiments, the sensing unit may directly send the detected electrical signal to the controller 250, and the controller 250 may process the touch signal by calling a preset signal processing program, so as to generate the touch instruction.
The audio processor 280 is configured to receive an external audio signal, decompress and decode according to a standard codec of an input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification, so as to obtain an audio signal that can be played in the speaker 286.
Illustratively, the audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, advanced Audio Coding (AAC), high efficiency AAC (HE-AAC), etc.
The audio output interface 285 receives the audio signal output from the audio processor 280 under the control of the controller 250. The audio output interface 285 may include a speaker 286, or an external audio output terminal 287, such as a headphone output terminal, that outputs to a sound-reproducing device of an external apparatus.
In other exemplary embodiments, video processor 270 may include one or more chip components. Audio processor 280 may also include one or more chip components.
And, in other exemplary embodiments, video processor 270 and audio processor 280 may be separate chips or integrated with controller 250 in one or more chips.
The power supply 290 provides power support for the display device 200, under the control of the controller 250, from power input by an external power source. The power supply 290 may be a power supply circuit built into the display device 200 or a power supply mounted outside the display device 200.
In addition to the above-described touch interaction, the user may also perform an interaction with the display device 200 through the control apparatus 100. A schematic diagram of an operational scenario between a display device and a control means is exemplarily shown in fig. 3. As shown in fig. 3, communication between the control apparatus 100 and the display device 200 may be performed in a wired or wireless manner.
The control apparatus 100 is configured to control the display device 200: it can receive an operation instruction input by a user and convert the operation instruction into an instruction that the display device 200 can recognize and respond to, mediating the interaction between the user and the display device 200. For example: the user operates the channel up/down keys on the control apparatus 100, and the display device 200 responds to the channel up/down operation.
The control apparatus 100 may be a remote control 100A, which communicates with the display device 200 through infrared protocol communication, Bluetooth protocol communication, or other short-range communication modes, and controls the display device 200 wirelessly or through other wired modes. The user may control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and so on. For example: the user can input corresponding control instructions through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, and power key on the remote control to realize the functions of controlling the display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, or the like. For example, the display device 200 is controlled using an application running on a smart device. The application program, by configuration, can provide various controls to the user through an intuitive User Interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 100B may install a software application matching the display device 200, implementing connection and communication through a network communication protocol for the purpose of one-to-one control operation and data communication. For example: the mobile terminal 100B may establish a control instruction protocol with the display device 200, so that by operating the various function keys or virtual buttons of the user interface provided on the mobile terminal 100B, functions like those of the physical keys arranged on the remote control 100A can be implemented. The audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, implementing a synchronous display function.
A block diagram of the configuration of the control apparatus 100 is exemplarily shown in fig. 4. As shown in fig. 4, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, an output interface 150, and a power supply 160.
The controller 110 includes a Random Access Memory (RAM) 111, a Read Only Memory (ROM) 112, a processor 113, a communication interface, and a communication bus. The controller 110 is used to control the running and operation of the control apparatus 100, the communication and cooperation between its internal components, and external and internal data processing functions.
For example, when an interaction in which a user presses a key arranged on the remote controller 100A or an interaction in which a touch panel arranged on the remote controller 100A is touched is detected, the controller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.
The display device 200 is also in data communication with the server 300 via a variety of communication means. The display device 200 may communicate via a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 300 may provide various contents and interactions to the display device 200. By way of example, the display device 200 may send and receive information, such as: receiving Electronic Program Guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The servers 300 may be one group or multiple groups, and may be of one or more types. The server 300 also provides other web service content such as video on demand and advertising services.
In some embodiments of the present application, a touch menu may be displayed on the display 275, and a touch screen may be built into the display 275 through which a user may perform interactive actions on the touch menu (or on the entire screen).
The touch menu is a display control containing a plurality of control options, and may be given different shapes according to the system UI style of the display device 200. As shown in fig. 5, the touch menu may be ring-shaped, comprising a plurality of control options arranged in a ring, where each control option corresponds to a function.
For example, in the touch menu, control options such as "set, volume, timer, screen capture, label, intelligent whiteboard, live broadcast classroom" are provided in sequence. If the user selects the "set" option through an interactive operation, the device can jump directly from the current interface to the settings interface; if the user selects the "volume" option, the device may jump to the volume adjustment interface, or a control corresponding to the volume adjustment function may be displayed on the upper layer of the current interface.
The touch menu may be invoked in any interface of the display device 200, so that the function of the control options in the touch menu may be a basic setting function for the display device 200, or other more common functions, or other functions that can only be implemented by using a touch interaction manner. For example, control options related to basic play functions such as "image, sound, signal source" are set in the touch menu. The touch menu can also comprise functional options which are more suitable for touch operation, such as a brightness adjusting function arranged in the touch menu, so as to realize stepless brightness adjustment.
In some embodiments, the touch menu may also be invoked only in part of the scene, such as in an application page and a play page, and may not be invoked at the setup interface to avoid duplicate operations.
In some embodiments, control options included in the touch menu may also support user customization, i.e., the user may add or delete control options to the touch menu through a setup page. For example, when a user uses the display device, resources are often transmitted through the bluetooth module, and then the bluetooth connection function can be added to the touch menu through the setting page, so that the user can quickly jump to the bluetooth connection setting page through the touch menu.
In some embodiments, the touch menu may also automatically adjust the control options included in the menu according to the usage scenario. The adjusted content may include not only specific control options but also the number and arrangement order of control options. For example, when the playing page calls the touch menu, the touch menu may include relatively more playing related content, such as: control options such as play/pause, volume, image quality, signal source, fast forward/fast backward, screen projection, etc., and more common control options can be preferentially displayed according to actual use frequency.
In some embodiments, each control option in the touch menu may represent its assigned function in graphical and/or textual form, for the user to open/enable. For example, the volume control option may use an icon resembling a small horn with sound waves, and a prompt text such as "volume" may also be set under the icon. The control options of different functions may also have different icon shapes in different states; for example, when the volume of the display device 200 is low, the volume control option may display fewer sound wave arcs in front of the small horn icon, and when the volume is high, it may display more sound wave arcs.
For a touch menu of a ring structure, a minimum and/or maximum display diameter may also be set. Typically, the minimum diameter may be adapted to the palm size of the user, so that the menu can be operated directly after being called out, while the maximum diameter is typically set so that the operating area of the touch menu does not exceed the edge of the display 275 screen, thereby facilitating the user's touch operation.
It should be noted that, since in the embodiments of the present application the touch menu is ring-shaped, it has several diameters, for example an inner diameter and an outer diameter, and the display diameter may refer to either of them. Since the control options are arranged on the annular band of the touch menu, and the corresponding effective operation area is also on that band, the display diameter may be taken as that of the center circle of the annular band, i.e., the average of the inner diameter and the outer diameter. Unless otherwise specified, the diameters or radii described hereinafter refer to the diameter or radius of the center circle of the annular band; that is, the display diameter is the average of the inner and outer diameters of the annular control, and the display radius is half the display diameter.
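These conventions can be captured in a small geometric sketch (a minimal illustration under the stated definitions, not part of the patent): the display radius is the mean of the inner and outer radii, and a touch counts as landing on the menu only if it falls on the annular band.

```javascript
// Geometry of the annular touch menu, per the conventions above.
function ringGeometry(innerRadius, outerRadius) {
  return {
    displayRadius: (innerRadius + outerRadius) / 2,
    // Mean of inner and outer diameters = (2*inner + 2*outer) / 2.
    displayDiameter: innerRadius + outerRadius,
    // A touch at (x, y) hits the band iff its distance from the menu
    // center (cx, cy) lies between the inner and outer radius.
    hits(cx, cy, x, y) {
      const d = Math.hypot(x - cx, y - cy);
      return d >= innerRadius && d <= outerRadius;
    },
  };
}
```

A touch inside the inner circle or outside the outer circle is thus treated as missing the menu, which matches the statement that the effective operation area is the annular band itself.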
In some embodiments, the touch menu supports scaling of its display diameter. Scaling can be triggered by a specific touch action; for example, for a better interaction experience, after the touch menu is called out, a multi-finger touch rotated clockwise may gradually increase the display diameter of the touch menu, enlarging the display area, as shown in figs. 6 and 7.
Obviously, when the display diameter of the touch menu is enlarged, the annular band of the touch menu has a larger display area, so in some embodiments more control options can be displayed after the touch menu is enlarged. As shown in fig. 5, before enlargement, the control options "set, volume, timer, screen capture, label, intelligent whiteboard, live broadcast classroom" are displayed in the touch menu. As shown in fig. 6, after the touch menu is enlarged, 3 further control options, "menu, edit, search", can be added. The radius of the ring can be increased as required, expanding the menu to N sub-function entries.
The number of control options accommodated in the annular touch menu is also limited, due to the limited area of its annular band. Therefore, in some implementations, when the touch menu contains a large number of control options, some of them may be hidden, and after the touch menu is called out, the hidden control options are displayed by sliding or turning pages.
For example, the touch menu shown in fig. 6 includes the control options "set, volume, timer, screen capture, label, intelligent whiteboard, live broadcast classroom, menu, edit, search". On the basis of this display, if the user inputs a touch instruction for turning pages, the "HDMI" option may be added after the "search" option; meanwhile, in order to keep the annular arrangement, the "set" option may be hidden, as shown in fig. 7.
In order to maintain continuity of operation, when the control options displayed in the touch menu are switched, all the control options can also be kept in a ring for display. When the last control option is reached, the first control option may be displayed again after it. As shown in figs. 8 and 9, as the "HDMI" option is brought into view by the sliding operation, if there are no further function options, the "set" option is displayed after the "HDMI" option, so that the whole touch menu remains displayed as a ring, reducing any sense of discontinuity in the interaction.
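This wrap-around behavior can be modeled by treating the option list as circular, so that after the last option the first one follows again. The function name and the slot-based interface are illustrative assumptions:

```javascript
// Return the options visible in `slots` consecutive ring positions,
// starting from startIndex, wrapping past the end of the list.
function visibleOptions(allOptions, startIndex, slots) {
  const out = [];
  for (let i = 0; i < slots; i++) {
    out.push(allOptions[(startIndex + i) % allOptions.length]);
  }
  return out;
}
```

With the modulo wrap, paging past "HDMI" naturally shows "set" again, exactly the ring behavior described for figs. 8 and 9.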
Based on the above touch menu, some embodiments of the present application provide a display device 200 and a touch menu interaction method. As shown in fig. 10, the touch menu interaction method includes the following steps:
S11: and receiving a multi-finger touch instruction input by a user and used for calling the touch menu.
In order to present the touch menu on the display screen, the controller 250 of the display device 200 may monitor the user's interactions on the touch screen in real time through the touch sensing module 277. After the user inputs the interaction for calling the touch menu through a touch gesture on the touch screen, the touch sensing module 277 generates a touch signal, converts it into a touch instruction, and sends the instruction to the controller 250, so that the controller 250 can control the display 275 to present the touch menu after receiving it.
The specific touch instruction for invoking the touch menu may be set according to the actual application scenario of the display device 200, so as to be effectively distinguished from commonly used touch instructions such as single-finger clicking, single-finger sliding, and single-finger long pressing; in some implementations a multi-finger touch instruction may be adopted. The multi-finger touch instruction may be triggered by a multi-finger click action and/or a multi-finger swipe action detected on the touch screen.
For example, after the software system of the display device 200 starts up normally, the user may touch the touch screen of the display 275 with a plurality of fingers (five, four, three, or two fingers) in any scene to invoke the touch menu function.
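As a rough sketch of the dispatch logic (the event shape and finger-count bounds are assumptions, not the patent's implementation), a touch event carrying two to five simultaneous contact points can be treated as a candidate menu-invoking gesture:

```python
def is_menu_gesture(touch_points, min_fingers=2, max_fingers=5):
    # Two to five simultaneous contacts may invoke the touch menu;
    # single-finger input is left to ordinary click/slide handling.
    return min_fingers <= len(touch_points) <= max_fingers

print(is_menu_gesture([(10, 20), (30, 40), (50, 60)]))  # three fingers
print(is_menu_gesture([(10, 20)]))                      # single finger
```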
S12: and responding to the multi-finger touch instruction, and controlling the display to display the touch menu.
After the user inputs the multi-finger touch command, the controller 250 may present the touch menu in the display screen by executing a corresponding interactive program. In order to facilitate user operation, the display position of the touch menu is calculated and determined according to the position of each touch point in the multi-finger touch instruction.
In addition, a series of guiding pictures can be displayed in the display interface at the same time of displaying the touch menu, for example, the user is instructed to zoom the touch menu through interaction of multi-finger rotation, and the like. The guide screen can be displayed in a mode of matching the graph with the characters, and manual closing is supported. For example, the guide screen may be a translucent pattern, and a close button is provided in the upper right corner area of the screen. The guide screen may also be displayed only when the touch menu is invoked for the first few times of the display apparatus 200, and the guide screen is not displayed when the number of times of invocation of the touch menu or the use time of the display apparatus 200 reaches a certain set threshold.
It can be seen that, in this embodiment, the user may invoke the touch menu on any interface through multi-finger touch operation. Because the touch modes such as multi-finger clicking or multi-finger sliding are used, compared with the traditional menu calling mode, misoperation of a user can be avoided. Meanwhile, the display position of the touch menu can be determined according to the touch position, so that the operation of a user can be facilitated, and the user experience is improved.
In some embodiments, as shown in fig. 11, the display positions of the touch menu and the guide screen may depend on the position where the system determines that the finger touch action of the user is located, so that the touch menu can appear at a position close to the position where the finger is located. That is, in the step of controlling the display to display the touch menu, the method further includes:
s121: acquiring the position coordinates of each touch point in the multi-finger touch instruction;
s122: calculating center point coordinates based on a plurality of touch points according to the position coordinates;
s123: and controlling the display to display the touch menu by taking the center point coordinates as reference.
In practical applications, the controller 250 may determine touch position points (hereinafter referred to as touch points) of five fingers (or four fingers, or three fingers, or two fingers) first, and then form a reference point based on a plurality of touch points as a center, so that the touch menu takes the reference point as a center, and forms an annular touch menu display screen.
Because the palm is usually open during touch interaction, the touch points of a multi-finger touch instruction lie approximately on the same arc; that is, the touch points form a convex polygon, so the center point determined from the touch-point positions can be the center of gravity of that convex polygon. For example, when a user touches with five fingers, the positions of the five fingertips form a convex pentagon; the position coordinates of the five vertices are extracted, the center-of-gravity coordinates of the pentagon are computed, and finally the touch menu is displayed with this center point as reference.
In the above embodiment, the center point may be determined through a plurality of touch points, so that the touch menu may be displayed at a position relatively fitting the palm. Thus, the user can realize the subsequent interactive operation on the touch menu without moving the finger to a larger distance.
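Steps S121–S123 can be sketched as follows. The shoelace-based centroid below is one way to take the "center of gravity" of the convex polygon formed by the fingertips (the function name is illustrative, and the vertices are assumed to be given in ring order around the polygon):

```python
def polygon_centroid(points):
    """Center of gravity of a polygon given its vertices in ring order."""
    n = len(points)
    if n < 3:  # one or two touch points: fall back to the arithmetic mean
        xs, ys = zip(*points)
        return sum(xs) / n, sum(ys) / n
    a = cx = cy = 0.0
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]
        cross = x0 * y1 - x1 * y0   # shoelace term for edge i -> i+1
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    # a is twice the signed area; centroid = (cx, cy) / (6 * area)
    return cx / (3 * a), cy / (3 * a)
```

The returned coordinates would then serve as the display origin of the annular menu, so the ring appears centered under the open palm.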
In an exemplary embodiment, any one touch point may also be selected directly from touch points corresponding to the multi-finger touch instruction as a display reference point of the touch menu. Namely, the step of controlling the display to display the touch menu further comprises the following steps: acquiring the position coordinates of each touch point in the multi-finger touch instruction; and controlling the display to display the touch menu with reference to the position coordinates of a designated one of the plurality of touch points.
For example, when the user inputs by right-hand five-finger touch, the touch point corresponding to the index finger may be designated as a reference, and a touch menu may be displayed, that is, the second touch point from the left to the right among the five touch points is designated as a reference point.
In practical applications the touch position is uncertain: the user may operate in the middle area of the screen of the display 275, or in an area near the edge. If the display position of the touch menu is determined purely by calculating the positions of the touch points in the multi-finger touch instruction, the menu may end up too close to the screen edge to be displayed completely. In an exemplary embodiment, as shown in fig. 12, to fully display the touch menu, the controller 250 is further configured to perform the following program steps:
S1231: acquiring the interval distance between the center point coordinate and the edge of the touch screen of the display;
s1232: if the interval distances are all larger than or equal to the initial radius of the touch menu, setting the center point coordinate as the display origin of the touch menu;
s1233: if any spacing distance is smaller than the initial radius of the touch menu, translating a display origin of the touch menu in a direction away from the corresponding side edge of the touch screen by taking the center point coordinate as a reference;
s1234: and controlling a display to display the touch menu by taking the origin as the center.
In general, the display screen of the display 275 is rectangular, so four distances between the center point and the screen edges are determined: the distance DL to the left edge, DR to the right edge, DT to the top edge, and DB to the bottom edge. A distance may be expressed in a standard length unit, such as 10 cm, or directly as the number of pixels between the center point and the screen edge, such as 720 px.
After the distances between the center point and the screen edges are obtained, the four distances can be compared with the initial radius of the touch menu to determine whether the current touch position allows the touch menu to be displayed completely. Specifically, if all the distances are greater than or equal to the initial radius, the current touch position meets the complete-display requirement, and the center point coordinate is set as the display origin of the touch menu. For example, with an initial radius of 7 cm and distances DL = 87 cm, DR = 57 cm, DT = 40 cm, and DB = 41 cm, comparison shows that every distance is greater than or equal to the initial radius, so the center point coordinate can be used directly as the display origin.
If any distance is smaller than the initial radius of the touch menu, the display origin is shifted, relative to the center point coordinate, in the direction away from the corresponding edge of the touch screen so that the touch menu can be displayed completely. For example, with an initial radius of 7 cm and distances DL = 139 cm, DR = 5 cm, DT = 40 cm, and DB = 41 cm, comparison shows DR = 5 cm < 7 cm, i.e. the touch position is too far to the right; the display origin is then shifted at least 2 cm to the left so that the touch menu is fully displayed.
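Steps S1231–S1234 amount to clamping the ring center so the whole menu fits on screen. A minimal sketch in screen coordinates (units as in the example, cm; function and parameter names are illustrative):

```python
def menu_origin(cx, cy, radius, width, height):
    # If every edge distance >= radius the center is kept as-is;
    # otherwise the origin shifts away from the offending edge.
    ox = min(max(cx, radius), width - radius)
    oy = min(max(cy, radius), height - radius)
    return ox, oy

# Example from the text: DL=139, DR=5, radius 7 on a 144 cm wide,
# 81 cm tall screen -> the origin shifts 2 cm to the left.
print(menu_origin(139, 40, 7, 144, 81))
```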
After determining the display position of the touch menu, the content displayed in the touch menu may be further determined. In order to bring about better user experience, as shown in fig. 13, the step of controlling the display 275 to display the touch menu further includes:
s124: acquiring the initial diameter of the touch menu;
s125: setting the number of control options in the touch menu and the icon size of each control option according to the initial diameter;
s126: and controlling the display to display the touch menu according to the set control option quantity and icon size.
In an exemplary embodiment, in order to display control options in the touch menu, an initial diameter of the touch menu may be obtained first, then, the number of displayable control options and the size of icons corresponding to each control option are set according to the initial diameter, and finally, the touch menu is displayed according to the set number and icon size.
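One plausible sizing rule for steps S124–S126 is to allot a fixed arc length per control option and tie the icon size to the diameter. The constants below are assumptions for illustration; the patent does not fix them:

```python
import math

def ring_layout(diameter_cm, arc_per_option_cm=4.0, icon_fraction=0.15):
    # Number of options that fit when each gets a fixed arc slot on the
    # ring, plus an icon edge length proportional to the menu diameter.
    circumference = math.pi * diameter_cm
    count = max(1, int(circumference // arc_per_option_cm))
    icon_cm = diameter_cm * icon_fraction
    return count, icon_cm
```

Under this rule a 14 cm menu holds about 10 options; enlarging the diameter automatically raises both the option count and the icon size, matching the zoom behaviour described later.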
In the above embodiment, the initial diameter (or initial radius) may be set to a default value according to the number of control options included in the touch menu, or may be determined by calculating the position of each touch point in the multi-finger touch instruction. If the initial diameter is determined based on the position calculation of each touch point, the initial diameter may be determined by the maximum distance between the plurality of touch points. Thus, as shown in fig. 14, the step of obtaining the initial diameter of the touch menu further includes:
s1241: acquiring the number of touch points and the position coordinates of each touch point in the multi-finger touch instruction;
s1242: calculating the distance between any two touch points;
s1243: if the number of the touch points is 2, setting the initial diameter of the touch menu to be equal to the distance between the two touch points;
S1244: and if the number of the touch points is greater than or equal to 3, setting the initial diameter of the touch menu to be equal to the distance between the two touch points farthest apart in all the touch points.
The maximum distance between the touch points is typically the span of the user's opened fingers; for example, in five-finger operation the maximum distance among the five touch points is the distance between the thumb and the little finger (or the thumb and the middle finger). Based on this distance, a touch menu fitted to the size of the user's palm can be displayed directly, which makes the menu convenient to operate.
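Steps S1241–S1244 reduce to taking the largest pairwise distance among the touch points; with exactly two points, the two rules coincide. A minimal sketch:

```python
import math
from itertools import combinations

def initial_diameter(points):
    # Largest distance between any two touch points; with exactly two
    # points this is simply their separation (S1243).
    if len(points) < 2:
        raise ValueError("need at least two touch points")
    return max(math.dist(p, q) for p, q in combinations(points, 2))
```

For a five-finger touch this normally picks out the thumb-to-little-finger span, so the displayed ring matches the user's hand size.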
Based on the above-mentioned touch menu interaction method, the present application further provides a display device 200, including a display 275, a touch sensing module 277 and a controller 250. Wherein the display 275 is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module 277 is configured to detect a touch instruction input by a user.
The controller 250 is configured to perform the following program steps:
s11: receiving a multi-finger touch instruction input by a user and used for calling the touch menu;
s12: and responding to the multi-finger touch instruction, and controlling the display to display the touch menu.
And the display position of the touch menu is calculated and determined according to the position of each touch point in the multi-finger touch instruction.
As can be seen from the above technical solutions, in some embodiments of the present application, a display device 200 and a touch menu interaction method are provided, where after receiving a multi-finger touch instruction for invoking a touch menu input by a user, a controller 250 of the display device 200 may respond to the multi-finger touch instruction to control a display 275 to display the touch menu. Through the above configuration of the controller 250, the user can use multi-finger touch on any interface to call the touch menu, so as to avoid misoperation. And a touch menu is displayed at a position where the user performs multi-touch. The interaction method is simple and convenient to operate, does not need a complicated calling mode of the user memory, has good interaction guidance, and can effectively improve user experience.
In the above embodiment, the touch menu is invoked by multi-finger touch; in practical applications, the touch menu may also be invoked in other ways.
As shown in fig. 15, in some embodiments of the present application, there is further provided a touch menu interaction method, including:
s21: receiving a single-finger sliding touch instruction input by a user and used for calling the touch menu;
In one exemplary embodiment, the touch menu may also be invoked by sliding a single finger along a particular track. For example, the user may input a single-finger sliding touch instruction for invoking the touch menu by sliding a "C"-shaped track on the touch screen with one finger.
It should be noted that, the single-finger sliding track is only used as an optional single-finger sliding touch instruction track, and other types of action tracks may be used in practical applications, for example: the W-shaped track, the V-shaped track, the Z-shaped track, the S-shaped track and the like can also be tracks of other forms. Obviously, the track input by the user through single-finger sliding can be set according to actual needs, and user definition can be supported.
S22: responding to the single-finger sliding touch instruction, and detecting an action track of a touch point;
After receiving the single-finger sliding touch instruction input by the user, the action track of the touch point can be detected in response to the instruction. In practical applications, detection of the action track may start as soon as the user touches the touch screen. Specifically, when the user's touch generates a touch signal, the touch sensing module 277 or the controller 250 may first detect the duration of the touch signal; if the duration is short, for example less than 0.5 s, the interaction corresponding to the touch signal is determined to be a click operation.
The position change of the touch point can be detected at the same time as the duration of the touch signal. If the touch-point position corresponding to the touch signal is unchanged, or its change distance is smaller than a preset threshold, the corresponding interaction is determined to be a long-press operation; to effectively distinguish a click from a long press, the long press typically requires a duration of more than 2–3 seconds before the corresponding control program is executed. If the position of the touch point changes, the interaction corresponding to the current touch signal is determined to be a sliding action, and the action track formed during the slide is recorded.
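The click / long-press / slide discrimination described above can be sketched as follows. The 0.5 s and 2 s thresholds come from the text; the movement threshold and function name are assumptions:

```python
def classify_touch(duration_s, move_dist_px, move_threshold_px=10):
    # Movement beyond the threshold makes the gesture a slide,
    # regardless of duration; otherwise duration decides.
    if move_dist_px >= move_threshold_px:
        return "slide"
    if duration_s < 0.5:
        return "click"
    if duration_s >= 2.0:
        return "long_press"
    return "undecided"  # between the click and long-press thresholds
```

Only the "slide" branch would go on to record the action track for comparison with the preset judgment track.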
After the action track of the touch point is detected, the detected action track can be compared with a preset judgment track, and whether the action track is identical with the preset judgment track or not is determined.
S23: and if the action track is the same as the preset judgment track, controlling the display to display the touch menu.
By comparing the motion trajectory with the preset judgment trajectory, if the motion trajectory is the same as the preset judgment trajectory, the display 275 is controlled to display the touch menu. For example, the preset determination track is a "C" track, and the interaction action input by the user through the touch screen also forms a "C" track, and the determined action track is the same as the preset determination track, so that the control program for calling the touch menu can be directly executed, and the touch menu is displayed on the display 275.
In order to facilitate the subsequent execution of interactive operation on the touch menu, the display position of the touch menu is calculated and determined according to the action track of the touch point in the single-finger sliding touch instruction. For example, a touch menu may be displayed at the end or middle of the action track. After the touch menu is displayed, the user can execute interaction action on the touch menu without moving a long distance, so that the convenience of operation is improved.
In this embodiment, the action track need not be strictly identical to the preset judgment track; it may be judged as matching whenever the track input by the user is sufficiently similar to the preset judgment track. For example, if the preset judgment track is a "C" track, a larger or a smaller "C"-shaped action track input on the touch screen is still judged to match. In the actual judgment, the variation rule of the action track can be determined from the slope change of the track, the angular difference between its start and end points, and so on.
In order to display a touch menu, in an exemplary embodiment, as shown in fig. 16, the step of controlling the display to display the touch menu further includes:
S231: acquiring an action track graph of a touch point in the single-finger sliding touch instruction;
s232: positioning a graph center point in the action track graph;
s233: and controlling the display to display the touch menu by taking the center point of the graph as a reference.
After acquiring the single-finger sliding touch instruction, the controller 250 may obtain the action-track graph of the touch point and determine the base point for displaying the touch menu by locating the center point of that graph, then display the touch menu with the determined base point as reference. The center point can be the center of gravity of the closed figure formed by the track, the center of a closed figure formed by specific points of the track, or the center of a circle circumscribing some of the points of the track. The base point or reference point mentioned in the above embodiments may serve as the center point of the annular touch menu.
For example, if the preset determination track is a "C" track, after receiving a single finger sliding touch command of the "C" track, the controller 250 may obtain the position coordinates of the start point, the end point and the farthest point, and determine the circumscribed circle of the triangle formed by the position coordinates of the three points, thereby determining the center position coordinate of the circumscribed circle, and determining the center position coordinate as the display reference point of the touch menu.
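The circumscribed-circle construction for the "C" track can be sketched with the standard circumcenter formula applied to the three sampled points (start, end, and farthest point; the function name is illustrative):

```python
def circumcenter(p1, p2, p3):
    # Center of the circle through three non-collinear points.
    (ax, ay), (bx, by), (cx, cy) = p1, p2, p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ux, uy
```

In practice one would guard against `d` being (near) zero, which occurs when the three sampled track points are collinear and no circumscribed circle exists.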
Based on the above-mentioned touch menu interaction method, some embodiments of the present application further provide a display device 200 for implementing the above-mentioned touch menu interaction method. The display device 200 includes a display 275, a touch sensing module 277 and a controller 250. Wherein the display 275 is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module 277 is configured to detect a touch instruction input by a user.
The controller 250 is configured to perform the following program steps:
s21: receiving a single-finger sliding touch instruction input by a user and used for calling the touch menu;
s22: responding to the single-finger sliding touch instruction, and detecting an action track of a touch point;
s23: and if the action track is the same as the preset judgment track, controlling the display to display the touch menu.
And the display position of the touch menu is calculated and determined according to the action track of the touch point in the single-finger sliding touch instruction.
As can be seen from the above technical solution, based on the touch menu interaction method and the display device 200 provided in the above embodiments, a user may input a single-finger sliding touch instruction on a touch screen, and make the action track of a touch point in the single-finger sliding touch instruction be the same as the preset judgment track shape, so as to trigger the call condition of the touch menu, so as to control the display 275 to display the touch menu according to the action track shape. The touch menu interaction method can be effectively distinguished from conventional touch operation through the preset action track, so that misoperation is reduced, and user experience is improved.
After invoking the touch menu, the user may perform further operations on the touch screen. Since the screen area of the display 275 is large, a large display area can be devoted to the touch menu. Therefore, to spare the user frequent page-switching operations when opening a certain control option, the display area of the display 275 can be used to present more control options; that is, the touch menu can be scaled as required.
As shown in fig. 17, in an exemplary embodiment of the present application, in order to zoom a touch menu, a touch menu interaction method is provided, which includes the following steps:
s31: receiving a multi-finger rotating touch instruction input by a user and used for zooming the touch menu;
after the display 275 displays the touch menu, the user may further input a touch interaction according to the touch screen. When the user wants to zoom the touch menu, the multi-finger rotating action can be input on the touch screen, so that a multi-finger rotating touch instruction is generated.
S32: responding to the multi-finger rotating touch control instruction, and acquiring the rotating directions of a plurality of touch points in the multi-finger rotating touch control instruction;
For the controller of the display device 200, various touch instructions input by the user may be continuously received after the touch menu is displayed. If the received touch instruction is a multi-finger rotating touch instruction, the rotation directions of the touch points can be further acquired.
Typically, a rotating touch instruction is a touch instruction with distinct rotation characteristics. In practical application, if the position movement distances of the touch points in the input instruction are short, the movement tracks of the touch points are arc-shaped, and the curvatures of the arcs change according to the same rule, the instruction input by the current user can be determined to be a multi-finger rotating touch instruction.
Therefore, the controller 250 can detect the moving distance, the track shape, the curvature change rule and other parameters of the touch point at the same time, so as to determine whether the touch command is a multi-finger rotating touch command. If the multi-finger rotating touch control instruction is input, the motion track change rule can be further extracted, so that the rotating directions of a plurality of touch points are determined. For example, the start point, the intermediate point, and the end point may be extracted, and the rotation direction may be determined by the positional relationship between the three points.
S33: and scaling the diameter of the touch menu according to the rotation direction, and controlling the display to display the touch menu in real time.
After determining the rotation direction, the diameter of the touch menu may be scaled according to the rotation direction, and it is obvious that whether the touch menu is scaled down or scaled up depends on the rotation direction of the input touch command, for example, if the rotation direction is clockwise, the diameter of the touch menu is scaled up; and if the rotation direction is anticlockwise, reducing the diameter of the touch menu.
Obviously, the zoom-in or zoom-out strategy can also be adjusted according to usage habits set for different regions or different users. For example, for a user whose dominant hand is the left hand, the diameter of the touch menu may be reduced when the rotation direction is determined to be clockwise, and enlarged when the rotation direction is counter-clockwise.
In order to obtain better interaction experience, in practical application, the touch menu can be enlarged or reduced in real time along with the rotation process of the multi-finger rotation touch instruction, namely, the diameter of the touch menu can be continuously enlarged or reduced according to the real-time change of the rotation angle.
Thus, as shown in fig. 18, in an exemplary embodiment, the step of controlling the display to display the touch menu in real time further includes:
S331: acquiring rotation angles of a plurality of touch points in the multi-finger rotation touch instruction;
s332: calculating a scaling ratio according to the proportion of the rotation angle to the maximum rotation angle;
s333: and scaling the diameter of the current touch menu according to the scaling proportion.
In this embodiment, the rotation angles of the plurality of touch points in the multi-finger rotation touch instruction may be obtained first, and then the scaling ratio may be calculated according to the ratio of the rotation angle to the maximum rotation angle, so as to scale the diameter of the current touch menu according to the scaling ratio. For example, by analyzing the motion trajectory of the touch point, it is determined that the current rotation angle is rotated 10 degrees clockwise and the maximum rotation angle is 90 degrees, and the scaling ratio is calculated to be 10/90, so that the diameter of the touch menu can be increased by 1/9 according to the calculated scaling ratio.
To obtain a continuous zooming effect, in practical application the scaling ratio can be recalculated each time the rotation angle changes by a set precision, for example every 1 degree of rotation in the multi-finger rotating touch instruction. When the rotation angle changes from 10 to 11 degrees, the scaling is recalculated as 11/90 and the diameter is increased by 11/90 accordingly; similarly, on further rotation to 12 degrees the scaling is recalculated as 12/90 and the diameter increased by 12/90.
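The angle-proportional zoom of S331–S333 can be sketched as follows. The 90° maximum and the clockwise-enlarges convention follow the examples above; the signed-angle input and the clamping are assumptions:

```python
def scaled_diameter(base_diameter, angle_deg, max_angle_deg=90.0):
    # Positive angles (clockwise) enlarge the menu, negative angles
    # (counter-clockwise) shrink it, in proportion to max_angle_deg.
    ratio = max(-1.0, min(1.0, angle_deg / max_angle_deg))
    return base_diameter * (1.0 + ratio)
```

For example, a 10° clockwise rotation yields a ratio of 10/90 and grows the diameter by 1/9, matching the worked example in the text; recomputing this on every angle update gives the continuous zoom effect.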
Obviously, the higher the calculation frequency of the scaling ratio is, the smoother the scaling effect is in the adjusting process correspondingly, so that the calculation frequency of the scaling ratio can be improved as much as possible on the premise that the hardware configuration of the display device 200 can be supported. For example, the calculation frequency may be equal to the screen refresh frequency of the display 275 to obtain the best fluency.
Because the touch points in a multi-finger rotating touch instruction correspond to different fingers, and because fingers differ in length and users differ in rotation habits, the achievable rotation-angle change can differ considerably between touch points. For example, some users are used to rotating about the center of the palm, so the rotation angles of the thumb and the other four fingers differ little; other users are used to rotating about the thumb, so the angles differ greatly, which affects the detection of the rotation angle.
Therefore, in order to determine the rotation angle, a specific one of the plurality of touch points may be selected as a judgment basis for the rotation angle, for example, the rotation angle of the index finger may be selected as a judgment basis. And the touch point rotation angle with the largest rotation angle can be determined as the calculation basis of the scaling by comparing the detection results of the plurality of rotation angles. That is, in an exemplary embodiment of the present application, as shown in fig. 19, the rotation angle is a maximum angle variation amount of the current position with respect to the initial position among the plurality of touch points; the step of obtaining the rotation angles of the plurality of touch points in the multi-finger rotation touch instruction further comprises the following steps:
S3311: acquiring initial position coordinates and current position coordinates of each touch point in the multi-finger rotating touch control instruction;
s3312: calculating the angle change quantity of the current position coordinate relative to the initial position coordinate;
s3313: and comparing the angle change amounts of the touch points, and determining the maximum angle change amount.
It can be seen that the controller 250 may determine the maximum angle variation by acquiring the initial and current position coordinates of each touch point, calculating the angle variation of each, and finally comparing the angle variations of the plurality of touch points. The maximum angle variation yields more accurate rotation data, which facilitates the subsequent calculation of the scaling ratio.
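Steps S3311–S3313 can be sketched as follows, measuring each fingertip's angle about the menu center with `atan2` and keeping the largest change (the explicit center argument and the wrap-around normalisation are assumptions about the implementation):

```python
import math

def max_angle_change(initial_pts, current_pts, center):
    # Angle of each touch point about the menu center, before vs. now;
    # the change with the largest magnitude drives the zoom.
    def angle(p):
        return math.atan2(p[1] - center[1], p[0] - center[0])
    deltas = []
    for p0, p1 in zip(initial_pts, current_pts):
        d = angle(p1) - angle(p0)
        # normalise into [-pi, pi) so a wrap-around is not overcounted
        d = (d + math.pi) % (2 * math.pi) - math.pi
        deltas.append(d)
    return max(deltas, key=abs)
```

The sign of the returned value also gives the rotation direction (positive for counter-clockwise in standard screen-independent coordinates), so a single pass can feed both S32 and S33.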
In practical application, after the diameter of the touch menu is adjusted, the area of the annular band in the touch menu changes accordingly. The number of control options and/or the icon size of the control options contained in the touch menu can therefore be modified according to the current diameter, so that the user can select more control options on the touch menu and perform more functional operations.
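One plausible way to derive the option count from the current diameter is to divide the ring circumference by the arc length each icon occupies. The `icon_spacing` default and the clamping behavior below are illustrative assumptions, not values from the patent.

```python
import math

def options_for_diameter(diameter, icon_spacing=60.0, max_options=None):
    """Estimate how many control-option icons fit on the annular band:
    ring circumference (pi * d) divided by the arc length per icon."""
    n = int(math.pi * diameter // icon_spacing)
    if max_options is not None:
        n = min(n, max_options)   # never show more options than exist
    return max(n, 1)              # always keep at least one option visible
```

With this sketch, enlarging the diameter from 200 px to 400 px roughly doubles the number of icons the band can hold.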
Based on the above-mentioned interaction method of touch menu, some embodiments of the present application further provide a display device 200 for implementing the above-mentioned interaction method of touch menu. The display device 200 includes a display 275, a touch sensing module 277 and a controller 250. Wherein the display 275 is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module 277 is configured to detect a touch instruction input by a user.
The controller 250 is configured to perform the following program steps:
S31: receiving a multi-finger rotating touch instruction input by a user and used for zooming the touch menu;
S32: responding to the multi-finger rotating touch control instruction, and acquiring the rotating directions of a plurality of touch points in the multi-finger rotating touch control instruction;
S33: and scaling the diameter of the touch menu according to the rotation direction, and controlling the display to display the touch menu in real time.
As can be seen from the above technical solutions, the above embodiments provide a touch menu interaction method and a display device 200. After the touch menu is called up, the user can touch with multiple fingers (e.g. five fingers) and rotate clockwise (or counterclockwise) to gradually enlarge (or reduce) the diameter of the touch menu, thereby displaying more control option functions. When the diameter of the touch menu changes, the interaction method can modify the number of control options and/or the icon size of the control options contained in the menu so as to present more functions. Through the annular menu, the number of control option functions in the touch menu can be freely scaled according to the user's needs, effectively improving the user experience of the product.
For the touch menu, the maximum diameter may be set according to the actual screen size so as to display a greater number of control options. However, if the touch menu contains many control options, even at the maximum diameter it may not be able to display all of them. Thus, as shown in fig. 20, in an exemplary embodiment, the touch operation performed on the touch menu may further include the following steps:
S41: receiving a single-finger sliding touch instruction input by a user and used for switching control options in the touch menu;
S42: responding to the single-finger sliding touch instruction, and acquiring a touch point movement track of the single-finger sliding touch instruction;
S43: and adjusting the display position of the control options in the annular menu control according to the movement track of the touch point, and controlling the display to display the touch menu in real time.
In this embodiment, the user may drag the control options in the annular menu by sliding a single finger, so that hidden control options are revealed. This switches the control option pages of the touch menu, allowing more control options to be displayed.
To achieve a better interaction effect, the single-finger sliding touch instruction may be input in the annular band area of the touch menu (or an associated or nearby area). As the user's finger slides, a touch point movement track is formed, and the display positions of the control option icons in the annular menu control are adjusted according to that track.
In practical application, the control option icons should move at the same speed as the sliding finger, so that the icons follow the finger. If the sliding speed exceeds a certain value, the control options continue to move within the annular band after the slide ends, producing an inertia effect: the faster the sliding speed, the farther the options continue to move, which gives a more realistic touch experience.
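The inertia effect described above can be modeled by letting the ring keep moving at the release speed and decaying it by a per-frame friction factor; the friction factor and stop threshold below are assumed values for illustration, not parameters from the patent.

```python
def inertia_offsets(release_speed, friction=0.9, min_speed=1.0):
    """Return the per-frame movement offsets after the finger lifts:
    each frame moves at the current speed, which then decays by the
    friction factor until it drops below the stop threshold."""
    offsets = []
    v = release_speed
    while abs(v) >= min_speed:
        offsets.append(v)
        v *= friction
    return offsets
```

Because the total travel is the sum of the decaying offsets, a faster release speed yields a proportionally longer coast distance, matching the behavior described above.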
For the sliding speed of the touch finger, further related controls may be set; for example, different sliding speeds may produce different switching effects. That is, as shown in fig. 21, in an exemplary embodiment, the method further includes:
S431: acquiring the sliding speed of the touch point in the single-finger sliding touch instruction;
S432: if the sliding speed is smaller than or equal to a preset speed threshold, controlling the control options to move in the annular menu control along with the touch point;
S433: and if the sliding speed is greater than a preset speed threshold, controlling the touch menu to switch the display page.
The sliding speed of the touch point is judged against a preset speed threshold. When the sliding speed is below the threshold, the control option icons in the annular menu are dragged in the manner described above, i.e. the control options move within the annular menu control along with the touch point. When the sliding speed is above the threshold, it may be determined that the user wants to quickly find a hidden control option, so the touch menu is controlled to switch display pages, where each display page contains a plurality of control options.
It should be noted that a display page is a relative concept; in practical applications, the control options on different display pages may differ partially or completely. For example, if each display page contains at most 11 control option icons and the touch menu contains 23 control options, three display pages can be formed: the first contains the 1st to 11th control options, the second the 12th to 22nd, and the third the 13th to 23rd. Thus, when the user slides a single finger on the first display page, the menu switches to the second display page, and sliding on the second display page switches to the third.
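The 1-11 / 12-22 / 13-23 split in the example is consistent with a paging scheme whose final page is shifted back so that it always stays full, which is why the last two pages partially overlap. A sketch under that assumption:

```python
def page_slices(n_options, per_page=11):
    """Return 1-based (first, last) option ranges for each display page.
    The final page is pulled back so it remains full, so it may overlap
    the previous page when the option count is not a multiple of per_page."""
    if n_options <= per_page:
        return [(1, n_options)]
    pages = []
    start = 1
    while start + per_page - 1 < n_options:
        pages.append((start, start + per_page - 1))
        start += per_page
    # last page: the final per_page options, shifted back to stay full
    pages.append((n_options - per_page + 1, n_options))
    return pages
```

Running this with 23 options and 11 per page reproduces exactly the three pages of the example.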
Based on the above-mentioned touch menu interaction method, some embodiments of the present application further provide a display device 200, including a display 275, a touch sensing module 277 and a controller 250. Wherein the display 275 is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module 277 is configured to detect a touch instruction input by a user.
The controller 250 is configured to perform the following program steps:
S41: receiving a single-finger sliding touch instruction input by a user and used for switching control options in the touch menu;
S42: responding to the single-finger sliding touch instruction, and acquiring a touch point movement track of the single-finger sliding touch instruction;
S43: and adjusting the display position of the control options in the annular menu control according to the movement track of the touch point, and controlling the display to display the touch menu in real time.
As can be seen from the above technical solution, in the touch menu interaction method and display device 200 provided by the above embodiments, after the touch menu is displayed, a single-finger sliding touch instruction can be received and its touch point movement track obtained, so that the display positions of the control options in the annular menu control are adjusted according to the track and shown on the display 275 in real time. The interaction method therefore lets the annular touch menu rotate as the user slides a finger over the menu area, so that more menu function entries can be viewed; accelerated rotation is also supported.
For the touch menu, the corresponding setting function can be started by clicking a control option contained in it. Because different control options have different functions, some functions need detailed configuration in a dedicated control interface, while others can be set through simple controls; different screens can therefore be displayed depending on which option is started.
In an exemplary embodiment, as shown in fig. 22, after the touch menu is invoked, the following interaction method may be further included:
S51: receiving a single-finger click touch instruction input by a user and used for opening the control options;
S52: responding to the single-finger click touch instruction, and adding an adjustment control corresponding to the opened control option in the middle area of the annular menu control;
S53: and controlling the display to display the touch menu and the adjustment control.
In practical application, when the user has not started any control option, a control corresponding to a common function can be displayed in the middle area of the touch menu. For example, a return-to-home control is displayed in the middle area, and when the user clicks this control, the display switches to the home page of the operating system.
When the user starts a control option by clicking its icon, an adjustment control corresponding to the opened option can be added in the middle area of the annular menu control in response to the single-finger click touch instruction. For example, when the user clicks the volume control option, the return-to-home control in the middle area may be replaced with a volume setting control, allowing the user to complete the volume setting through further interaction.
Therefore, the adjustment control displayed in the middle of the annular touch menu is suited to setting functions with relatively simple interaction. For example, the adjustment control may include a draggable slider for parameter setting. Thus, as shown in fig. 23, in an exemplary embodiment, the method further includes:
S54: receiving a single-finger sliding instruction which is input by a user in the adjustment control area and used for adjusting control parameters;
S55: adjusting control parameters of the opened control options in response to the single-finger sliding instruction;
S56: and adjusting the display shape of the adjustment control according to the adjusted control parameter, and controlling the display to display the adjustment control.
For example, as shown in fig. 24, the volume adjustment control may take the form of an annular drag line with a slider. After the user clicks the volume control option, the volume adjustment control is displayed in the middle area of the touch menu, and a single-finger sliding instruction input by the user is received to drag the slider and adjust the volume control parameter. Naturally, different slider positions on the volume adjustment control correspond to different volumes, thereby achieving the purpose of volume adjustment.
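Mapping a slider position on the annular drag line to a volume value is a simple linear interpolation over the arc. The 270° arc span and 0-100 volume range below are assumed values for illustration; the patent does not specify them.

```python
def slider_angle_to_volume(angle_deg, arc_start=0.0, arc_end=270.0,
                           vol_min=0, vol_max=100):
    """Map the slider's angular position on the annular drag line to a
    volume value, clamping drags past either end of the arc."""
    angle = min(max(angle_deg, arc_start), arc_end)     # clamp to the arc
    frac = (angle - arc_start) / (arc_end - arc_start)  # 0.0 .. 1.0
    return round(vol_min + frac * (vol_max - vol_min))
```

Under these assumptions, the midpoint of the arc corresponds to 50% volume, and dragging beyond the arc's end simply pins the volume at its maximum.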
Since some control options cannot display an adjustment control in the middle area of the touch menu, as shown in fig. 25, in an exemplary embodiment, the step of adding an adjustment control corresponding to the opened control option in the middle area of the annular menu control further includes:
S521: judging whether the opened control option supports displaying the adjustment control;
S522: if the opened control options do not support to display the adjustment control, controlling a display to display a setting interface corresponding to the opened control options;
S523: and if the opened control options support to display the adjustment control, adding the adjustment control corresponding to the opened control options in the middle area of the annular menu control.
After the user clicks any control option, it is judged whether the setting program corresponding to the clicked option supports displaying an adjustment control. A specific way to judge this is to preset a list of supported options and match the opened option against it. If the opened control option is in the list, it supports displaying the adjustment control, and the display is directly controlled to show the corresponding adjustment control; if it is not in the list, it does not support displaying the adjustment control, and the display jumps directly to the setting interface corresponding to the opened control option.
In this embodiment, a Home key function may be defined in the middle of the touch menu: the user clicks the central control area to enter the home page. When the user clicks the volume function entry area, the volume adjustment page is entered, and the volume value can be adjusted by sliding the circular control in the central control area; clicking the volume function entry area again, or a timeout, exits the volume adjustment page, and the central control area displays the Home key function again.
In an exemplary embodiment, when the user no longer uses the touch menu, an exit instruction may be input, and the corresponding display device 200 may receive the exit instruction input by the user to exit the touch menu; and responding to the exit instruction, and controlling the display to cancel displaying the touch menu and the adjustment control.
The exit instruction is a single-finger click touch instruction input by the user in an area outside the touch menu; or a single-finger click touch instruction input by the user on a control in the touch menu indicating an exit function; or an instruction indicating an exit function, input by the user through the control device 100.
In practical application, after invoking the touch menu's total entry function and completing the related operations, a user who wishes to exit may click a blank area outside the on-screen menu to exit the total entry function; pressing the return key on the control device 100 of the display apparatus 200 to exit is also supported. In addition, if the user inputs no other interaction instruction within a preset time after calling out the touch menu, the touch menu is exited automatically when the preset time elapses.
Based on the above-mentioned touch menu interaction method, some embodiments of the present application further provide a display device 200, including a display 275, a touch sensing module 277 and a controller 250. Wherein the display 275 is configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options; the touch sensing module 277 is configured to detect a touch instruction input by a user.
The controller 250 is configured to perform the following program steps:
S51: receiving a single-finger click touch instruction input by a user and used for opening the control options;
S52: responding to the single-finger click touch instruction, and adding an adjustment control corresponding to the opened control option in the middle area of the annular menu control;
S53: and controlling the display to display the touch menu and the adjustment control.
In some embodiments, the touch menu may also be a circular menu including at least one control option. After the touch menu is invoked, at least one control option, such as a "home option," may be displayed in the circular touch menu. And other control options in the touch menu can be set as hidden options, and the hidden options can be displayed according to further interactive operation of the user.
As shown in fig. 26, in order to display a hidden option, the interaction method of a touch menu provided in some embodiments of the present application includes the following steps:
S61: receiving a first direction rotation touch instruction input by a user and used for displaying hidden options in the touch menu;
S62: and responding to the first direction rotating touch instruction, and controlling the display to sequentially display the hidden options according to the first direction.
After the touch menu is invoked, the user may input a first-direction rotation touch instruction through the touch screen. The first direction may be set according to actual interaction requirements; for example, it may be clockwise. To distinguish it effectively from other interaction modes, the first direction may also be a more specific rotation gesture, for example clockwise rotation at a particular rotation speed, or simultaneous clockwise rotation of a particular number of touch points.
After the user inputs the first direction rotating touch command, the display device 200 may detect the first direction rotating touch command through the touch sensing module 277, and determine that the user is currently to display the hidden option by detecting the rotating touch operation in the first direction, so that the display 275 may be controlled by the controller 250 to display the hidden option.
The hidden options may be displayed at the outer circumference of the circular menu and, for a better interaction effect, may be revealed in the interface sequentially in the same direction as the first direction. For example, when the first direction is clockwise, the hidden options may be displayed one by one around the outer circumference of the "home" control in the clockwise direction, forming an annular layout of control options.
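Placing the revealed options around the outer circumference amounts to stepping an angle around a circle. The sketch below assumes screen coordinates (y grows downward), under which an increasing angle moves visually clockwise; the starting angle at the top is also an assumption.

```python
import math

def hidden_option_positions(n, radius, start_deg=-90.0):
    """Return (x, y) offsets from the menu center for n hidden options,
    stepping clockwise in screen coordinates starting from the top."""
    step = 360.0 / n
    positions = []
    for i in range(n):
        a = math.radians(start_deg + i * step)  # increasing angle = clockwise on screen
        positions.append((radius * math.cos(a), radius * math.sin(a)))
    return positions
```

To animate the sequential reveal, the i-th option would simply be shown after a delay proportional to i while keeping these target positions.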
In some embodiments, the displayed hidden options may be further hidden by reversing the operation of rotating the touch command in the first direction. The interaction method of the touch menu further comprises the following steps:
S63: receiving a second direction rotation touch instruction input by a user and used for canceling display of hidden options in the touch menu;
S64: and responding to the second direction rotating touch instruction, and controlling the display to cancel displaying the hidden options in sequence according to the second direction.
The second direction is the rotation direction opposite to the first direction; for example, when the first direction is clockwise, the second direction is counterclockwise. After the display device 200 displays the hidden options, the user may input a second-direction rotation touch instruction through the touch screen. The controller 250 may detect this instruction through the touch sensing module 277, thereby switching the hidden options back to the hidden state so that they are no longer displayed in the interface.
Likewise, for a better interaction effect, the hidden options may be hidden sequentially in the second direction; for example, the control options may be cancelled from the interface one by one in the counterclockwise direction.
Based on the above-mentioned interaction method of touch menu, some embodiments of the present application further provide a display device 200, including a display 275, a touch sensing module 277 and a controller 250. Wherein the display 275 is configured to display a touch menu; the touch menu is a round menu control formed by at least one control option; the touch sensing module 277 is configured to detect a touch instruction input by a user.
The controller 250 is configured to perform the following program steps:
S61: receiving a first direction rotation touch instruction input by a user and used for displaying hidden options in the touch menu;
S62: and responding to the first direction rotating touch instruction, and controlling the display to sequentially display the hidden options according to the first direction.
As shown in fig. 27, in some embodiments, if the touch menu is a circular or annular menu containing at least one control option, the display radius (or diameter) of the touch menu may also be set according to the number of control options it contains. That is, some embodiments of the present application further provide a touch menu interaction method including:
S71: acquiring a multi-finger touch instruction input by a user and used for calling a touch menu;
S72: responding to the multi-finger touch instruction, and acquiring the number of control options contained in the touch menu;
S73: and setting the display diameter of the touch menu according to the number of the control options, and controlling the display to display the touch menu according to the display diameter.
In practical application, the larger the display diameter of the touch menu, the larger the display space it provides and the more control options it can accommodate. Therefore, after the user inputs the multi-finger touch instruction to call out the touch menu, the controller 250 may further extract the number of control options contained in the menu, determine its initial display diameter according to that number, and display the touch menu at that initial display diameter.
For example, if 10 control options can be displayed simultaneously at the preset initial display diameter, then when the touch menu contains more than 10 control options, the display diameter can be increased to obtain more display space and show more than 10 options. Similarly, when the touch menu contains fewer than 10 control options, the display diameter can be reduced, yielding more varied display effects and improving the user experience.
It should be noted that, depending on the size and resolution of the display 275 of the display device 200, upper and lower limits may be set for the display diameter, with corresponding upper and lower limits on the number of control options. When the number of control options exceeds the upper limit, the display diameter is not increased further; instead, all control options can be shown by switching display pages. When the number of control options is below the lower limit, the display diameter is not reduced further; a better display layout can be obtained by adjusting the spacing between control options.
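The diameter selection with its clamping bounds can be sketched as a scaled-and-clamped mapping; all the constants below (base diameter, base option count, and the clamp limits) are assumed screen-dependent values, not figures from the patent.

```python
def initial_diameter(n_options, base_diameter=300.0, base_count=10,
                     d_min=200.0, d_max=600.0):
    """Scale the initial display diameter linearly with the option count
    and clamp it to the screen-dependent lower and upper limits."""
    d = base_diameter * n_options / base_count
    return min(max(d, d_min), d_max)
```

Outside the clamp range, the surrounding text's fallbacks apply: page switching handles the excess options above `d_max`, and icon spacing is adjusted below `d_min`.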
Based on the above-mentioned touch menu interaction method, some embodiments of the present application further provide a display device 200, including a display 275, a touch sensing module 277 and a controller 250. Wherein the display 275 is configured to display a touch menu; the touch menu is a round menu control formed by at least one control option; the touch sensing module 277 is configured to detect a touch instruction input by a user.
The controller 250 is configured to perform the following program steps:
S71: acquiring a multi-finger touch instruction input by a user and used for calling a touch menu;
S72: and responding to the multi-finger touch instruction, and controlling the display to display the touch menu.
As can be seen from the above technical solutions, the present application provides a display device 200 and a touch menu interaction method. The interaction method can be applied to the display device 200 to realize different functions of the touch menu. During interaction, the user can input a multi-finger touch instruction through the touch screen to call out the touch menu. After the touch menu is called out, the user can input a multi-finger rotation touch instruction to scale the diameter of the touch menu; input a single-finger sliding touch instruction to adjust the positions of the control options on the touch menu; and input a single-finger click touch instruction to open any control option. For an opened control option, an adjustment control can be displayed directly in the middle area of the touch menu, so that the user can input a single-finger sliding instruction to adjust its control parameters.
The detailed description provided above presents merely a few examples under the general inventive concept of the present application and does not limit its scope. Any other embodiment extended from the solution of the present application by a person skilled in the art without inventive effort falls within the protection scope of the present application.

Claims (20)

1. A display device, characterized by comprising:
a display configured to display a touch menu; the touch menu is an annular menu control formed by a plurality of control options;
the touch sensing module is configured to detect a touch instruction input by a user;
a controller configured to:
receiving a multi-finger touch instruction input by a user and used for calling the touch menu;
responding to the multi-finger touch instruction, and acquiring the position coordinate of each touch point in the multi-finger touch instruction;
calculating center point coordinates based on a plurality of touch points according to the position coordinates;
acquiring the interval distance between the center point coordinate and the edge of the touch screen of the display;
if the interval distances are all larger than or equal to the initial radius of the touch menu, setting the center point coordinate as the display origin of the touch menu;
if any spacing distance is smaller than the initial radius of the touch menu, translating a display origin of the touch menu in a direction away from the corresponding side edge of the touch screen by taking the center point coordinate as a reference;
controlling a display to display the touch menu by taking the origin as the center;
receiving a multi-finger rotating touch instruction input by a user and used for zooming the touch menu;
responding to the multi-finger rotating touch control instruction, and acquiring the rotating directions of a plurality of touch points in the multi-finger rotating touch control instruction;
acquiring initial position coordinates and current position coordinates of each touch point in the multi-finger rotating touch control instruction;
calculating the angle change quantity of the current position coordinate relative to the initial position coordinate;
comparing the angle change amounts of the touch points to determine the maximum angle change amount;
acquiring the rotation angles of a plurality of touch points in the multi-finger rotation touch instruction according to the maximum angle variation;
calculating a scaling ratio according to the proportion of the rotation angle to the maximum rotation angle;
and scaling the diameter of the current touch menu according to the rotation direction and the scaling ratio, and controlling the display to display the touch menu in real time.
2. The display device of claim 1, wherein in the step of controlling the display to display the touch menu, the controller is further configured to:
acquiring an initial diameter of the touch menu, wherein the initial diameter is a preset default diameter or is calculated and determined according to the position of each touch point in the multi-finger touch instruction;
setting the number of control options in the touch menu and the icon size of each control option according to the initial diameter;
and controlling the display to display the touch menu according to the set control option quantity and icon size.
3. The display device of claim 2, wherein in the step of obtaining the initial diameter of the touch menu, the controller is further configured to:
acquiring the number of touch points and the position coordinates of each touch point in the multi-finger touch instruction;
calculating the distance between any two touch points;
if the number of the touch points is 2, setting the initial diameter of the touch menu to be equal to the distance between the two touch points;
and if the number of the touch points is greater than or equal to 3, setting the initial diameter of the touch menu to be equal to the distance between the two touch points farthest apart in all the touch points.
4. A display device according to any one of claims 1-3, wherein the multi-finger touch instruction is triggered by a multi-finger swipe action input and/or a multi-finger slide action input detected on the touch screen.
5. The display device of claim 1, wherein in the step of scaling the diameter of the touch menu according to the direction of rotation, the controller is further configured to:
If the rotation direction is clockwise, the diameter of the touch menu is enlarged;
and if the rotation direction is anticlockwise, reducing the diameter of the touch menu.
6. The display device of claim 1, wherein in the step of scaling the diameter of the touch menu according to the direction of rotation, the controller is further configured to:
acquiring rotation angles of a plurality of touch points in the multi-finger rotation touch instruction;
calculating a scaling ratio according to the proportion of the rotation angle to the maximum rotation angle;
and scaling the diameter of the current touch menu according to the scaling proportion.
7. The display device according to claim 6, wherein the rotation angle is a maximum angle change amount of a current position with respect to an initial position among the plurality of touch points; the controller is further configured to:
acquiring initial position coordinates and current position coordinates of each touch point in the multi-finger rotating touch control instruction;
calculating the angle change quantity of the current position coordinate relative to the initial position coordinate;
and comparing the angle change amounts of the touch points, and determining the maximum angle change amount.
8. The display device of claim 1, wherein in the step of scaling the diameter of the touch menu according to the direction of rotation, the controller is further configured to:
And modifying the quantity of the control options and/or the icon size of the control options contained in the touch menu according to the current diameter of the touch menu.
9. The display device of claim 1, wherein the controller is further configured to:
receiving a single-finger click touch instruction input by a user and used for opening the control options;
responding to the single-finger click touch instruction, and adding an adjustment control corresponding to the opened control option in the middle area of the annular menu control;
and controlling the display to display the touch menu and the adjustment control.
10. The display device of claim 9, wherein the controller is further configured to:
receiving a single-finger sliding instruction which is input by a user in the adjustment control area and used for adjusting control parameters;
adjusting control parameters of the opened control options in response to the single-finger sliding instruction;
and adjusting the display shape of the adjustment control according to the adjusted control parameter, and controlling the display to display the adjustment control.
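Claim 10 maps a single-finger slide inside the adjustment control to a parameter change, then redraws the control's shape from the new value. A sketch of both mappings; the horizontal-slide sensitivity, the 0–100 parameter range, and the arc-sweep rendering are assumptions for illustration:

```python
def adjust_parameter(value: int, slide_dx: float, sensitivity: float = 0.5) -> int:
    """Map a horizontal slide distance (pixels) to a parameter change,
    clamped to an assumed 0-100 range."""
    return max(0, min(100, round(value + slide_dx * sensitivity)))

def arc_sweep_degrees(value: int) -> float:
    """Display shape: sweep angle of the adjustment arc, proportional to the
    parameter value (full circle at 100)."""
    return 360.0 * value / 100.0
```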
11. The display device of claim 9, wherein in the step of adding an adjustment control corresponding to the opened control option in a middle region of the annular menu control, the controller is further configured to:
judging whether the opened control option supports displaying the adjustment control;
if the opened control option does not support displaying the adjustment control, controlling the display to display a setting interface corresponding to the opened control option;
and if the opened control option supports displaying the adjustment control, adding the adjustment control corresponding to the opened control option in the middle area of the annular menu control.
12. The display device of claim 9, wherein the controller is further configured to:
receiving an exit instruction input by a user for exiting the touch menu;
and responding to the exit instruction, and controlling the display to cancel displaying the touch menu and the adjustment control.
13. The display device according to claim 12, wherein the exit instruction is a single-finger click touch instruction input by a user in a region outside the touch menu;
or the exit instruction is a single-finger click touch instruction input by a user on a control indicating an exit function in the touch menu;
or the exit instruction is an instruction input by a user through the control device for indicating an exit function.
14. The display device of claim 1, wherein the controller is further configured to:
controlling the touch menu to display at least one control option through a circular control, and setting other control options in the touch menu as hidden options;
receiving a first direction rotation touch instruction input by a user and used for displaying hidden options in the touch menu;
and responding to the first direction rotating touch instruction, and controlling the display to sequentially display the hidden options according to the first direction.
15. The display device of claim 14, wherein the controller is further configured to:
receiving a second direction rotation touch instruction input by a user and used for canceling display of hidden options in the touch menu, wherein the second direction is opposite to the first direction;
and responding to the second direction rotating touch instruction, and controlling the display to cancel displaying the hidden options in sequence according to the second direction.
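Claims 14 and 15 pair opposite rotation directions with revealing and re-hiding the hidden options in sequence. A minimal model of that state; the one-option-per-rotation-step granularity and the rule that at least one option stays visible are assumptions:

```python
class RingMenu:
    """Model of claims 14-15: rotating in the first direction reveals hidden
    options in order; the opposite direction hides them again."""

    def __init__(self, options, visible=1):
        self.options = list(options)
        self.visible = visible  # assume at least one option is always shown

    def rotate(self, first_direction: bool):
        """Apply one rotation step and return the options now displayed."""
        if first_direction:
            self.visible = min(len(self.options), self.visible + 1)
        else:
            self.visible = max(1, self.visible - 1)
        return self.options[:self.visible]
```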
16. The display device of claim 1, wherein the controller is further configured to:
acquiring a multi-finger touch instruction input by a user and used for calling the touch menu;
responding to the multi-finger touch instruction, and acquiring the number of control options contained in the touch menu;
and setting the display diameter of the touch menu according to the number of the control options, and controlling the display to display the touch menu according to the display diameter.
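Claim 16 sizes the menu from its option count so that icons stay legible as options are added. One way to derive a diameter from the count is to give each icon a fixed arc length along the ring; the 60-pixel arc and the diameter bounds below are illustrative assumptions, not values from the claim:

```python
import math

def display_diameter(option_count: int, arc_per_icon: float = 60.0,
                     min_d: float = 200.0, max_d: float = 600.0) -> float:
    """Diameter whose circumference grants each option `arc_per_icon` pixels,
    clamped to assumed minimum and maximum diameters."""
    needed = option_count * arc_per_icon / math.pi  # circumference = pi * d
    return max(min_d, min(max_d, needed))
```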
17. A touch menu interaction method, characterized by comprising:
receiving a multi-finger touch instruction input by a user and used for calling a touch menu; the touch menu is an annular menu control formed by a plurality of control options;
responding to the multi-finger touch instruction, and acquiring the position coordinate of each touch point in the multi-finger touch instruction;
calculating center point coordinates based on a plurality of touch points according to the position coordinates;
acquiring the interval distance between the center point coordinate and the edge of the touch screen of the display;
if the interval distances are all larger than or equal to the initial radius of the touch menu, setting the center point coordinate as the display origin of the touch menu;
if any spacing distance is smaller than the initial radius of the touch menu, translating a display origin of the touch menu in a direction away from the corresponding side edge of the touch screen by taking the center point coordinate as a reference;
controlling a display to display the touch menu by taking the origin as the center;
receiving a multi-finger rotating touch instruction input by a user and used for zooming a touch menu;
responding to the multi-finger rotating touch control instruction, and acquiring the rotation directions of a plurality of touch points in the multi-finger rotating touch control instruction;
acquiring initial position coordinates and current position coordinates of each touch point in the multi-finger rotating touch control instruction;
calculating the angle change quantity of the current position coordinate relative to the initial position coordinate;
comparing the angle change amounts of the touch points to determine the maximum angle change amount;
acquiring the rotation angles of a plurality of touch points in the multi-finger rotation touch instruction according to the maximum angle variation;
calculating a scaling ratio as the ratio of the rotation angle to the maximum rotation angle;
and scaling the diameter of the current touch menu according to the rotation direction and the scaling ratio, and controlling the display to display the touch menu in real time.
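The placement steps in claim 17, averaging the touch points and then keeping the full ring on screen, amount to computing a centroid and clamping it by the menu's radius against each screen edge. A sketch of that logic; the screen dimensions and radius in the usage are example values:

```python
def menu_origin(touch_points, screen_w, screen_h, radius):
    """Centroid of the touch points, translated away from any screen edge
    whose distance to the centroid is less than `radius`, so the whole
    ring stays visible."""
    n = len(touch_points)
    cx = sum(x for x, _ in touch_points) / n
    cy = sum(y for _, y in touch_points) / n
    # Clamp each coordinate so the ring edge cannot cross the screen edge.
    cx = min(max(cx, radius), screen_w - radius)
    cy = min(max(cy, radius), screen_h - radius)
    return cx, cy
```

For touches near a corner the origin is shifted inward on both axes; for touches near the screen center it is the plain centroid.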
18. The touch menu interaction method of claim 17, further comprising:
receiving a single-finger clicking touch instruction input by a user and used for opening a control option;
responding to the single-finger click touch instruction, and adding an adjustment control corresponding to the opened control option in the middle area of the annular menu control;
and controlling a display to display the touch menu and the adjustment control.
19. The touch menu interaction method of claim 17, further comprising:
controlling the touch menu to display at least one control option through a circular control, and setting other control options in the touch menu as hidden options;
receiving a first direction rotation touch instruction input by a user and used for displaying hidden options in a touch menu;
and responding to the first direction rotating touch instruction, and controlling the display to sequentially display the hidden options according to the first direction.
20. The touch menu interaction method according to claim 17, further comprising, after receiving the multi-finger touch instruction input by a user for invoking the touch menu:
responding to the multi-finger touch instruction, and acquiring the number of control options contained in the touch menu;
and setting the display diameter of the touch menu according to the number of the control options, and controlling the display to display the touch menu according to the display diameter.
CN202010474023.1A 2020-05-29 2020-05-29 Display device and touch menu interaction method Active CN113747216B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010474023.1A CN113747216B (en) 2020-05-29 2020-05-29 Display device and touch menu interaction method


Publications (2)

Publication Number Publication Date
CN113747216A CN113747216A (en) 2021-12-03
CN113747216B true CN113747216B (en) 2023-09-08

Family

ID=78724555

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010474023.1A Active CN113747216B (en) 2020-05-29 2020-05-29 Display device and touch menu interaction method

Country Status (1)

Country Link
CN (1) CN113747216B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114706515A (en) * 2022-04-26 2022-07-05 长沙朗源电子科技有限公司 Figure three-finger rotation method and device based on electronic whiteboard
CN116737019B (en) * 2023-08-15 2023-11-03 山东泰克信息科技有限公司 Intelligent display screen induction identification control management system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1713123A (en) * 2004-06-26 2005-12-28 鸿富锦精密工业(深圳)有限公司 Choice style of displaying menu on annular screen and displaying device thereof
JP2006139615A (en) * 2004-11-12 2006-06-01 Access Co Ltd Display device, menu display program, and tab display program
CN103677558A (en) * 2012-08-29 2014-03-26 三星电子株式会社 Method and apparatus for controlling zoom function in electronic device
CN103713809A (en) * 2012-09-29 2014-04-09 中国移动通信集团公司 Dynamic generating method and dynamic generating device for annular menu of touch screen
CN106033300A (en) * 2015-03-10 2016-10-19 联想(北京)有限公司 A control method and an electronic apparatus
KR20190114348A (en) * 2018-03-29 2019-10-10 주식회사 네틱스 Apparatus and method for multi-touch recognition

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8572508B2 (en) * 2010-11-22 2013-10-29 Acer Incorporated Application displaying method for touch-controlled device and touch-controlled device thereof




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant