CN114415864A - Touch area determination method and display device


Info

Publication number
CN114415864A
CN114415864A (application number CN202011089349.9A)
Authority
CN
China
Prior art keywords
touch
touch area
area
determining
sub
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011089349.9A
Other languages
Chinese (zh)
Other versions
CN114415864B (en)
Inventor
王武军
张晓娜
赵书凯
赵宏洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Hisense Commercial Display Co Ltd
Original Assignee
Qingdao Hisense Commercial Display Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Hisense Commercial Display Co Ltd filed Critical Qingdao Hisense Commercial Display Co Ltd
Priority to CN202011089349.9A
Publication of CN114415864A
Application granted
Publication of CN114415864B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/147 Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a touch area determination method and a display device. A first touch area is determined according to the light paths blocked in a first direction of a touch screen; if an unblocked light path in a second direction exists within the first touch area, a second touch area is determined according to the first touch area and the unblocked light path, the second touch area being a sub-area of the first touch area. In other words, in the embodiments of the present application, the blocked light paths yield the first touch area, which serves as a coarse outline; the unblocked light path then refines it into the second touch area. Because the second touch area is a sub-area of the first touch area, it bounds the touch more tightly, so the determined touch area is more accurate.

Description

Touch area determination method and display device
Technical Field
Embodiments of the present application relate to the field of touch technology, and in particular to a touch area determination method and a display device.
Background
With the widespread adoption of electronic products, the display screen of a product directly shapes the user's experience of it. To simplify human-computer interaction, the touch screen emerged: it detects the touch area and responds according to that area. For example, if a control for closing the current interface is displayed in the touch area, the current interface is closed.
In the prior art, the infrared touch screen is a common type of touch screen. It has a rectangular structure: the two opposite long edges are a long-edge emitting edge and a long-edge receiving edge, with several emitting lamps arranged on the former and several receiving lamps on the latter; likewise, the two opposite short edges are a short-edge emitting edge and a short-edge receiving edge, again carrying emitting lamps and receiving lamps respectively. The infrared light emitted by an emitting lamp can be received by one or more opposite receiving lamps, and a light path is formed between the emitting lamp and each corresponding receiving lamp. When a user touches the infrared touch screen, some of the light paths between emitting and receiving lamps are blocked, so the affected receiving lamps no longer receive the infrared light, and a touch area can be determined from the blocked light paths.
However, in studying the prior art, the applicant found that this method of determining the touch area suffers from low accuracy.
Disclosure of Invention
Embodiments of the present application provide a touch area determination method and a display device, which address the technical problem that the accuracy of current touch area determination methods is low.
In a first aspect, an embodiment of the present application provides a display device, including:
the touch screen is used for receiving touch operation;
a controller communicatively coupled to the touch screen, the controller configured to:
determining a first touch area according to the light paths blocked in a first direction of the touch screen;
if an unblocked light path in a second direction exists within the first touch area, determining a second touch area according to the first touch area and the unblocked light path, wherein the second touch area is a sub-area of the first touch area.
In a second aspect, an embodiment of the present application provides a method for determining a touch area, where the method includes:
determining a first touch area according to the light paths blocked in a first direction of the touch screen;
if an unblocked light path in a second direction exists within the first touch area, determining a second touch area according to the first touch area and the unblocked light path, wherein the second touch area is a sub-area of the first touch area.
According to the touch area determination method and the display device, the first touch area is determined according to the light paths blocked in the first direction of the touch screen; if an unblocked light path in the second direction exists within the first touch area, the second touch area is determined according to the first touch area and the unblocked light path, the second touch area being a sub-area of the first touch area. In other words, the blocked light paths yield the first touch area as a coarse outline, and the unblocked light path then refines it into the second touch area; because the second touch area is a sub-area of the first touch area, it bounds the touch more tightly, so the determined touch area is more accurate.
Drawings
To illustrate the embodiments of the present application or the prior-art technical solutions more clearly, the drawings used in describing them are briefly introduced below. The drawings described below are clearly only some embodiments of the present application; those of ordinary skill in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of an operational scenario between a display device and a control apparatus according to some embodiments;
Fig. 2 is a block diagram of a hardware configuration of a display device 200 according to some embodiments;
Fig. 3 illustrates an icon control interface of an application in the display device 200 according to some embodiments;
Fig. 4 is a schematic diagram of infrared light emitted by a touch screen;
Fig. 5 is a schematic diagram of an optical net in one direction of a touch screen;
Fig. 6 is a schematic diagram of an optical net in another direction of a touch screen;
Fig. 7 is a schematic diagram of an optical net in another direction of a touch screen;
Fig. 8 is a schematic diagram of an optical net in another direction of a touch screen;
Fig. 9 is a schematic diagram of an optical net in another direction of a touch screen;
Fig. 10 is a schematic diagram of an optical net in another direction of a touch screen;
Fig. 11 is an infrared light matrix diagram of a touch screen;
Fig. 12 is a schematic diagram of a user operation area in a touch screen;
Fig. 13 is a schematic diagram of candidate regions in a touch screen;
Fig. 14 is a schematic diagram of a first touch area in a touch screen;
Fig. 15 is a schematic diagram of a second touch area in a touch screen;
Fig. 16 is a flowchart of a method for determining a touch area.
Detailed Description
To make the objects, embodiments and advantages of the present application clearer, exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It should be understood that the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
All other embodiments obtained by a person skilled in the art from the exemplary embodiments described herein without inventive effort fall within the scope of the appended claims. In addition, while the disclosure herein is presented in terms of one or more exemplary examples, it should be appreciated that individual aspects of the disclosure may also constitute a complete embodiment on their own.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and are not necessarily intended to limit the order or sequence of any particular one, Unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances such that the embodiments described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display device 200 through the mobile terminal 300 and the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, other short-range communication methods and the like, and the display device 200 is controlled wirelessly or by wire. The user may input user commands through keys on the remote controller, voice input, a control panel, etc. to control the display device 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power key and so on, to control the functions of the display device 200.
In some embodiments, mobile terminals, tablets, computers, laptops, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device. The application, through configuration, may provide the user with various controls in an intuitive User Interface (UI) on a screen associated with the smart device.
In some embodiments, the mobile terminal 300 and the display device 200 may each install a software application, so that connection and communication between them are implemented through network communication protocols, achieving one-to-one control operation and data communication. For example, the mobile terminal 300 and the display device 200 can establish a control instruction protocol and a remote-control keypad can be synchronized to the mobile terminal 300, so that the display device 200 is controlled through the user interface on the mobile terminal 300; the audio and video content displayed on the mobile terminal 300 can also be transmitted to the display device 200 to achieve a synchronized display function.
As also shown in fig. 1, the display device 200 also performs data communication with the server 400 through various communication means. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various content and interactions to the display device 200. For example, the display device 200 may send and receive information, interact with an Electronic Program Guide (EPG), receive software program updates, or access a remotely stored digital media library. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers. The server 400 provides other web service content such as video on demand and advertising services.
The display device 200 may be a liquid crystal display, an OLED display, or a projection display. The particular display device type, size, resolution, etc. are not limiting, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
In addition to the broadcast-receiving television function, the display device 200 may additionally provide smart network television functions backed by computer support, including but not limited to network TV, smart TV, Internet Protocol TV (IPTV), and the like.
A hardware configuration block diagram of a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 2.
In some embodiments, at least one of the controller 250, the tuner demodulator 210, the communicator 220, the detector 230, the input/output interface 255, the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the display apparatus 200.
In some embodiments, the display screen 275 receives image signals output by the first processor and displays video content, images, and components of the menu manipulation interface.
In some embodiments, the display screen 275 includes a display component for presenting pictures and a driving component for driving the display of images.
In some embodiments, the video content displayed comes from broadcast television content, or from various broadcast signals received via wired or wireless communication protocols; alternatively, various image content sent from a network server via network communication protocols may be displayed.
In some embodiments, the display screen 275 is used to present a user-manipulated UI interface generated in the display device 200 and used to control the display device 200.
In some embodiments, a drive assembly for driving the display is also included, depending on the type of display screen 275.
In some embodiments, the display screen 275 is a projection display screen, which may also include a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver.
In some embodiments, the display apparatus 200 may establish control signal and data signal transmission and reception with the control device 100 or the content providing apparatus through the communicator 220.
In some embodiments, the user interface 265 may be configured to receive infrared control signals from a control device 100 (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is a component used by the display device 200 to collect signals from the external environment or to interact with the outside.
In some embodiments, the detector 230 includes a light receiver, a sensor for collecting the intensity of ambient light, so that display parameters can be adaptively changed in response, and the like.
In some embodiments, the detector 230 may further include an image collector, such as a camera, which may be used to collect external environment scenes and the attributes of the user or the gestures used to interact with the user, adaptively change display parameters, and recognize user gestures, so as to implement interaction with the user.
In some embodiments, the detector 230 may also include a temperature sensor or the like for sensing the ambient temperature.
In some embodiments, the display device 200 may adaptively adjust the display color temperature of the image. For example, the display device 200 may be adjusted to display a cool tone in a high-temperature environment, or a warm tone in a low-temperature environment.
In some embodiments, the detector 230 may also include a sound collector, such as a microphone, which may be used to receive the user's voice, for example a voice signal containing a control instruction for controlling the display device 200, or to collect ambient sound for recognizing the type of the surrounding scene, so that the display device 200 can adapt to ambient noise.
In some embodiments, as shown in fig. 2, the input/output interface 255 is configured to enable data transfer between the controller 250 and other external devices or other controllers 250, such as receiving video signal data and audio signal data of an external device, or command instruction data.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: the interface can be any one or more of a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface, a composite video input interface, a USB input interface, an RGB port and the like. The plurality of interfaces may form a composite input/output interface.
In some embodiments, as shown in fig. 2, the tuner demodulator 210 is configured to receive broadcast television signals by wired or wireless reception, perform modulation and demodulation processing such as amplification, mixing and resonance, and demodulate, from among the plurality of wireless or wired broadcast television signals, the audio and video signals carried in the television channel frequency selected by the user, as well as EPG data signals.
In some embodiments, the frequency points demodulated by the tuner demodulator 210 are controlled by the controller 250; the controller 250 can send control signals according to the user's selection so that the tuner demodulator responds to the television signal frequency selected by the user and demodulates the television signal carried on that frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to the broadcasting system of the television signal. Or may be classified into a digital modulation signal, an analog modulation signal, and the like according to a modulation type. Or the signals are classified into digital signals, analog signals and the like according to the types of the signals.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in different separate devices; that is, the tuner demodulator 210 may also be in an external device of the main device where the controller 250 is located, such as an external set-top box. In that case, the set-top box outputs the television audio and video signals demodulated from the received broadcast television signals to the main device, which receives them through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
As shown in fig. 2, the controller 250 includes at least one of a Random Access Memory (RAM) 251, a Read-Only Memory (ROM) 252, a video processor 270, an audio processor 280, other processors 253 (e.g., a Graphics Processing Unit (GPU)), a Central Processing Unit (CPU) 254, a communication interface, and a communication bus 256 that connects these components.
In some embodiments, the RAM 251 is used to store temporary data for the operating system or other running programs.
In some embodiments, ROM 252 is used to store instructions for various system boots.
In some embodiments, the ROM 252 is used to store a Basic Input Output System (BIOS), which completes the power-on self-test of the system, initializes each functional module in the system, provides drivers for basic system input/output, and boots the operating system.
In some embodiments, when a power-on signal is received, the display device 200 starts up, and the CPU executes the system boot instructions in the ROM 252 and copies the operating system's temporary data from memory into the RAM 251 so that the operating system can start or run. After the operating system has started, the CPU copies the temporary data of the various applications from memory into the RAM 251, and the applications are then started or run.
In some embodiments, the central processor 254 is used to execute the operating system and application instructions stored in memory, and to execute various applications, data and content according to the interactive instructions received from the outside, so as to finally display and play various audio and video content.
In some exemplary embodiments, the central processor 254 may include multiple processors: a main processor and one or more sub-processors. The main processor performs some operations of the display device 200 in the pre-power-up mode and/or displays the screen in normal mode; the sub-processors handle operations in standby mode and the like.
In some embodiments, the graphics processor 253 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations on the various interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for display on the display screen.
In some embodiments, the video processor 270 is configured to receive an external video signal and, according to the standard codec protocol of the input signal, perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion and image synthesis, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used to demultiplex the input audio/video data stream; for example, an input MPEG-2 stream is demultiplexed into a video signal and an audio signal.
The video decoding module is used to process the demultiplexed video signal, including decoding, scaling and the like.
The image synthesis module is used to superimpose and mix the GUI signal, generated by the graphics generator in response to user input, with the scaled video image, to produce the image signal for display.
The frame rate conversion module is configured to convert the input video frame rate, for example converting a 60 Hz frame rate into a 120 Hz or 240 Hz frame rate, usually by frame interpolation.
The display formatting module is used to convert the frame-rate-converted video output signal into a signal conforming to the display format, such as an RGB data signal.
In some embodiments, the graphics processor 253 and the video processor 270 may be integrated or configured separately. When integrated, they can together process graphics signals output to the display screen; when separate, they perform different functions, for example in a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, and amplification processes to obtain an audio signal that can be played in a speaker.
In some embodiments, video processor 270 may comprise one or more chips. Audio processor 280 may also comprise one or more chips.
In some embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated together with the controller in one or more chips.
The power supply 290, under the control of the controller 250, supplies the display device 200 with power input from an external power source. The power supply 290 may be a built-in power supply circuit installed inside the display device 200, or a power supply interface installed outside the display device 200 that provides external power to it.
A user interface 265 for receiving an input signal of a user and then transmitting the received user input signal to the controller 250. The user input signal may be a remote controller signal received through an infrared receiver, and various user control signals may be received through the network communication module.
In some embodiments, the user inputs a user command through the control apparatus 100 or the mobile terminal 300, the user input interface forwards the input, and the display device 200 responds to it through the controller 250.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
In some embodiments, as shown in fig. 3, the application layer containing at least one application may display a corresponding icon control in the display screen, such as: the system comprises a live television application icon control, a video on demand application icon control, a media center application icon control, an application center icon control, a game application icon control and the like.
In some embodiments, the live television application may provide live television via different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
In some embodiments, a video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides a video display from some storage source. For example, the video on demand may come from a server side of the cloud storage, from a local hard disk storage containing stored video programs.
In some embodiments, the media center application may provide various multimedia content playback applications. For example, a media center may offer services, distinct from live television or video on demand, through which the user can access various images or audio via the media center application.
In some embodiments, the application center may provide storage for various applications. An application may be a game, an application proper, or some other application that is associated with a computer system or other device but can run on the smart television. The application center may obtain these applications from different sources, store them in local storage, and make them runnable on the display device 200.
Currently, the display screen may be a touch screen for receiving touch operations. When a user performs a touch operation on the touch screen, the operation blocks light paths on the touch screen, so the touch area corresponding to the operation can be determined from the blocked light paths. However, when the touch screen supports multi-touch, a user may touch the screen with several fingers at once; the fingers block the light paths in some directions but leave the light paths in other directions unblocked, so a touch area determined only from the blocked light paths in those directions is too large, and its accuracy is low.
To solve this technical problem, in the embodiments of the present application a first touch area is determined from the blocked light paths; taking the first touch area as a coarse outline, a second touch area can then be further determined from an unblocked light path. The second touch area is a sub-area of the first touch area, so it bounds the touch more tightly and the determined touch area is more accurate. The following embodiments illustrate this.
In the embodiment of the present application, the controller 250 is configured to:
determining a first touch area according to the light paths blocked in a first direction of the touch screen; if an unblocked light path in a second direction exists within the first touch area, determining a second touch area according to the first touch area and the unblocked light path, wherein the second touch area is a sub-area of the first touch area.
The touch screen may be an infrared touch display screen, also called an Infrared Technology Touch Screen (ITST), which has the advantages of strong environmental adaptability, long service life, many recognizable touch areas, and so on. The infrared touch display screen has a rectangular structure. As shown in fig. 4, the four edges of the rectangle are a long-edge emitting edge LS1, a long-edge receiving edge LS2, a short-edge emitting edge SS1 and a short-edge receiving edge SS2. Several emitting lamps are arranged on LS1 and SS1, and several receiving lamps on LS2 and SS2; the emitting lamps on LS1 emit infrared light that the receiving lamps on LS2 receive, and similarly the emitting lamps on SS1 emit infrared light that the receiving lamps on SS2 receive.
As can be seen from fig. 4, one emitting lamp can emit infrared light in multiple directions: the emitting lamp TL1 on LS1 emits infrared light in the three directions D1, D2 and D3, forming a light path in each; similarly, the emitting lamp TL2 on SS1 emits infrared light in the three directions D4, D5 and D6, likewise forming a light path in each.
In practice, any emitting lamp can emit infrared light in multiple directions, a light path forms between each emitting lamp and its corresponding receiving lamp, and the parallel light paths of one direction form an infrared detection network, also called an optical net, on the touch screen. For fig. 4, every emitting lamp on LS1 can emit infrared light in the directions D1, D2 and D3, and every emitting lamp on SS1 in the directions D4, D5 and D6, so the light paths of each of the directions D1 to D6 form an optical net on the touch screen, as shown in figs. 5 to 10 respectively.
It will be appreciated that optical nets of different directions can form an infrared light matrix of criss-crossing paths. For example, the vertical optical net shown in fig. 6 and the horizontal optical net shown in fig. 9 together constitute the infrared light matrix shown in fig. 11. In addition, the infrared touch display screen includes an ordinary display screen and a touch frame positioned around it.
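The optical-net model described above can be made concrete with a short sketch. The following Python fragment is a minimal illustration and is not taken from the patent: the class and function names, the lamp coordinates, and the pairing of emitters with receivers by index are all assumptions made for exposition.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LightPath:
    direction: str                 # emission direction, e.g. "D1".."D6"
    emitter: tuple[float, float]   # lamp position on the emitting edge (LS1/SS1)
    receiver: tuple[float, float]  # lamp position on the receiving edge (LS2/SS2)

def build_optical_net(direction: str, emitters, receivers):
    """All parallel light paths of one direction form one optical net;
    here emitters and receivers are simply paired by index."""
    return [LightPath(direction, e, r) for e, r in zip(emitters, receivers)]

# Example: a vertical net as in Fig. 6 and a horizontal net as in Fig. 9.
vertical = build_optical_net("D2", [(x, 0.0) for x in range(16)],
                                   [(x, 9.0) for x in range(16)])
horizontal = build_optical_net("D5", [(0.0, y) for y in range(9)],
                                     [(16.0, y) for y in range(9)])
# Nets of different directions together form the infrared light matrix (Fig. 11).
```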
For the infrared touch display screen, any object performing a touch operation blocks the infrared light paths passing through the contact point, so the touch area can be determined from the coordinate positions of the blocked light paths and the touch operation answered according to the control located in that area. In the embodiments of the present application, a first direction is a direction in which blocked light paths exist; a blocked light path is one blocked by the user operation area, i.e. the area a finger covers when the user touches the touch screen. As shown in fig. 12, the shaded area AR1 is a user operation area that blocks light paths in the three directions D2, D5 and D6, so D2, D5 and D6 are first directions.
By the same token, a second direction is a direction in which no light path is blocked, and an unblocked light path is one not blocked by the user operation area. As shown in fig. 12, the user operation area blocks no light path in the direction D3, so D3 is a second direction.
In this embodiment, the area formed by the blocked light paths in the first directions may be taken as the first touch area. For example, the area bounded by the blocked light paths in the first directions D2, D5 and D6 in fig. 12 may be the first touch area.
In a possible embodiment, there are at least two first directions, and the specific step of determining the first touch area may include:
determining a candidate region corresponding to each first direction, the candidate region being the area formed on the touch screen by the blocked light paths in that first direction;
determining the overlapping area between the candidate regions corresponding to the at least two first directions as the first touch area.
The candidate regions correspond to the first directions, one candidate region per first direction. First, the consecutive blocked light paths in the first direction are obtained; then the first and last of these consecutive light paths are selected as boundary light paths; finally, the area between the boundary light paths is taken as the candidate region. For example, as shown in fig. 13, in the first direction D2 there are two blocked light paths, LR1 and LR2, so LR1 and LR2 are the boundary light paths and the shaded region AR2 between them is the candidate region of the direction D2.
After the candidate regions corresponding to the first directions D5 and D6 are determined in the same way, the overlapping region of all the candidate regions may be taken as the first touch area. As shown in fig. 14, the first touch area AR3 determined from the first directions D2, D5 and D6 is the area bounded by the light paths of these three directions, i.e. the overlapping area of the candidate regions.
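As a hedged illustration of this intersection step, the sketch below discretizes the screen into a coarse grid and treats each candidate region as the strip between its two boundary light paths. The function names, the grid resolution, the angles, and the line-based geometry are assumptions made for exposition, not the patent's implementation.

```python
import numpy as np

def signed_dist(pt, line_pt, angle_deg):
    """Signed distance from pt to the line through line_pt at angle_deg."""
    theta = np.radians(angle_deg)
    nx, ny = -np.sin(theta), np.cos(theta)          # unit normal of the path
    return (pt[0] - line_pt[0]) * nx + (pt[1] - line_pt[1]) * ny

def candidate_region(first_path_pt, last_path_pt, angle_deg, w, h):
    """Grid cells lying between the first and last blocked light paths
    (the boundary paths, LR1 and LR2) of one first direction."""
    region = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            c = (x + 0.5, y + 0.5)
            d1 = signed_dist(c, first_path_pt, angle_deg)
            d2 = signed_dist(c, last_path_pt, angle_deg)
            region[y, x] = d1 * d2 <= 0             # opposite sides: inside the strip
    return region

def first_touch_area(candidate_regions):
    """The first touch area is the overlap of all candidate regions."""
    return np.logical_and.reduce(candidate_regions)

# Usage: intersect the strips of three assumed first directions D2, D5 and D6.
area = first_touch_area([
    candidate_region((4, 0), (7, 0), 90, 16, 9),    # D2: vertical strip
    candidate_region((0, 3), (0, 6), 0, 16, 9),     # D5: horizontal strip
    candidate_region((2, 0), (5, 0), 60, 16, 9),    # D6: slanted strip
])
```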
After the first touch area is determined, the second touch area is determined according to the first touch area and the unblocked light path in the second direction, which specifically includes:
dividing the first touch area into sub-areas by the unblocked light path;
determining the sub-areas as the second touch area.
In the embodiments of the present application, the unblocked light path passes through the first touch area and can therefore divide it into one or more sub-areas. For example, as shown in fig. 12, the light path in the second direction D3 is not blocked, so that light path can divide the first touch area AR3 obtained in fig. 14 into the three sub-areas AR4, AR5 and AR6 shown in fig. 15, and these sub-areas may be taken as the second touch area.
In the embodiments of the present application, sub-areas whose area is smaller than or equal to a first area threshold may be deleted so that the remaining sub-areas serve as the second touch area. Of the sub-areas AR4, AR5 and AR6 shown in fig. 15, the areas of AR5 and AR6 are smaller than or equal to the first area threshold, so after AR5 and AR6 are deleted the remaining sub-area AR4 is the second touch area.
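Continuing the grid-based sketch above (it reuses `signed_dist` and `area` from the previous fragment), the subdivision and size filtering might look like the following. The use of scipy's connected-component labelling and the `on_unblocked_path` predicate are illustrative assumptions, not the patent's firmware.

```python
import numpy as np
from scipy import ndimage

def second_touch_areas(first_area, on_unblocked_path, min_cells=4):
    """Erase cells of the first touch area crossed by the unblocked
    second-direction light path, label the remaining connected sub-areas
    (AR4/AR5/AR6 in Fig. 15), and keep only those above a size threshold."""
    sub = first_area.copy()
    for y, x in zip(*np.nonzero(sub)):
        if on_unblocked_path(x + 0.5, y + 0.5):     # the D3 beam's footprint
            sub[y, x] = False
    labels, n = ndimage.label(sub)                  # split into sub-areas
    return [labels == i for i in range(1, n + 1)
            if np.count_nonzero(labels == i) >= min_cells]

# Usage with the `area` grid from the previous sketch and an assumed beam:
beam = lambda x, y: abs(signed_dist((x, y), (0, 4.5), 30)) < 0.6
remaining = second_touch_areas(area, beam)          # e.g. only an AR4-like area survives
```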
As can be seen from fig. 12, the operation area AR1 is an elongated, tilted strip. The first touch area AR3 determined in fig. 14 is the touch area the prior art produces from the blocked light paths alone, whereas the second touch area AR4 determined in fig. 15 is the touch area the embodiments of the present application produce by combining blocked and unblocked light paths. The first touch area AR3 in fig. 14 reflects neither the strip shape nor the tilt of the user operation area. In fig. 15, however, the first touch area is divided by the unblocked light path, so the resulting sub-areas take on the strip shape and tilt along the second direction, which the unblocked light path shows the original user operation area also has. On this principle, the second touch area reflects the elongated, tilted character of the user operation area more faithfully, so compared with the first touch area determined by the prior art it characterizes the user operation area more accurately.
In summary, the first touch area is determined from the blocked light paths alone, while the second touch area is a sub-area of the first touch area determined from the first touch area and the unblocked light path, so the first touch area is larger than or equal to the second touch area.
In another possible embodiment, before determining the sub-region as the second touch region, the controller is further configured to:
deleting the sub-areas that do not satisfy a preset condition;
determining the sub-region as a second touch region, including:
determining the sub-areas remaining after the deletion as the second touch area.
The preset condition may be set according to the actual application scenario; it constrains the second touch area so that the determined second touch area meets expectations. The preset condition may concern, but is not limited to, size, shape and position.
The size may be an area, a length, a width and so on, so the preset condition may be that the length is greater than or equal to a first length threshold, or the width is greater than or equal to a first width threshold, or the area is greater than or equal to a first area threshold. For example, if the minimum length the touch screen can recognize is 2 mm, the first width threshold and the first length threshold may be 1.5 mm, and the first area threshold may be 1.5 mm by 1.5 mm.
Of course, in practice the second touch area may also be too large. In that case there are usually several controls inside the second touch area and it is impossible to determine which of them should respond, so an oversized second touch area may be treated as abnormal and the corresponding user touch operation as an abnormal touch operation. Specifically, a second length threshold, a second width threshold or a second area threshold may be set, and the preset condition may be that the length is less than or equal to the second length threshold, or the width is less than or equal to the second width threshold, or the area is less than or equal to the second area threshold, where the second length threshold is greater than the first length threshold, the second width threshold is greater than the first width threshold, and the second area threshold is greater than the first area threshold. The first length and width thresholds screen out abnormal touch operations with too small an area; the second thresholds screen out abnormal touch operations with too large an area.
As for shape, the preset condition may be that the shape of the sub-area is a target shape, set according to the actual application scenario: touch areas of the target shape are responded to and others are not. For example, the target shape may be a rectangle whose length is at most the second length threshold and whose width is at most the second width threshold. This avoids responding to abnormal touch operations in which the user operation area is so large that most of the touch screen, or even all of it, is blocked; in that case the determined second touch area would usually contain several controls, making it impossible to determine which one should respond, so the operation may be treated as abnormal.
As for position, some normal positions, as opposed to positions with special functions, may be taken as target positions: the system responds to touch operations on the target positions and does not respond at the special-function positions. The preset condition may therefore be that the sub-area lies within a target area containing several contiguous or non-contiguous target positions.
Constraining the second touch area through these preset conditions further improves its accuracy.
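A predicate combining these size, shape and position conditions could be sketched as follows. Apart from the 1.5 mm example given above, the parameter names and default values are assumptions for illustration only.

```python
def satisfies_preset(length_mm, width_mm, center, target_area,
                     min_len=1.5, min_wid=1.5,      # reject too-small touches
                     max_len=400.0, max_wid=400.0): # reject screen-wide touches
    """Size: length/width within [min, max], which also bounds the target
    rectangle shape; position: the sub-area's center must fall inside the
    responsive target area."""
    if not (min_len <= length_mm <= max_len):
        return False
    if not (min_wid <= width_mm <= max_wid):
        return False
    return center in target_area    # target_area: any container of positions

# Usage: a 3 mm x 2 mm sub-area centered on a responsive position passes.
ok = satisfies_preset(3.0, 2.0, (120, 80), {(120, 80), (121, 80)})
```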
In another possible embodiment, deleting the sub-region that does not satisfy the preset condition includes:
and deleting the subarea with the size smaller than or equal to the preset size threshold.
The size may be an area, a length, a width, and the like, and the corresponding preset size threshold may be referred to as a first area threshold, a first length threshold, and a first width threshold.
This prevents sub-areas whose size is smaller than or equal to the preset size threshold from serving as the second touch area. In practice, the touch area corresponding to an abnormal touch operation is usually small, so removing such sub-areas avoids responding to abnormal touch operations and reduces wasted resources.
In another possible embodiment, deleting the sub-region that does not satisfy the preset condition includes:
the delete shape does not belong to a sub-region of the target shape.
The target shape may refer to the foregoing description, and is not described herein again.
According to the embodiment of the application, the abnormal touch operation can be identified through the target shape, the response to the abnormal touch operation is avoided, and the resource waste is reduced.
In another possible embodiment, before determining the candidate region corresponding to the first direction, the controller is further configured to:
determining the scanning period and a preset number of directions;
in each scanning period, determining the directions in which blocked light paths exist as the first directions.
The scanning period may be set according to the actual application scenario and is not limited in the embodiments of the present application. Within a scanning period the different directions are scanned to determine whether a blocked light path exists in each of them; if a receiving lamp cannot receive the infrared light emitted by its corresponding emitting lamp, the light path between them is called a blocked light path.
It will be understood that in each scanning period the light paths of all directions are scanned once to determine whether they are blocked, so the second touch area of that period can be determined from the period's blocked and unblocked light paths; over multiple scanning periods, multiple second touch areas are obtained. Since the scanning periods are ordered in time, these second touch areas form a trajectory in chronological order, and the trajectory can be smoothed. If no second touch area exists in some period, i.e. no light path is blocked, then the second touch area of the preceding period is the end of the trajectory, indicating that the user's continuous touch operation has ended. Before the trajectory is generated, the coordinates can be converted from a custom coordinate system to a standard coordinate system, so that the second touch areas conform to the standard and the generated trajectory displays correctly on the touch screen.
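The per-period loop described here can be summarized in a hedged sketch; `scan_once`, `compute_second_areas` and `smooth` stand in for device-specific routines and are assumptions, as is the dictionary layout of the scan result.

```python
def run_scan_loop(scan_once, compute_second_areas, smooth, max_periods=1000):
    """Each scanning period scans every direction once; directions that
    contain blocked light paths become the first directions of that period.
    The per-period second touch areas, in time order, form the trajectory."""
    trajectory = []
    for _ in range(max_periods):
        blocked = scan_once()                        # direction -> blocked paths
        first_dirs = [d for d, paths in blocked.items() if paths]
        if not first_dirs:                           # nothing blocked: touch ended
            break
        trajectory.append(compute_second_areas(blocked, first_dirs))
    return smooth(trajectory)                        # e.g. smoothing before display
```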
In practice, after the second touch areas are obtained and before the trajectory is generated, a ghost-point removal operation may be performed on them. When a user touches two points on the touch screen, the scan yields two abscissas and two ordinates, which combine into four candidate points: the two real touch points plus two spurious "ghost" points. Ghost-point removal is a relatively mature technique and is not described further in the embodiments of the present application.
According to the display device provided by the embodiments of the present application, the first touch area is determined from the blocked light paths; then, taking the first touch area as a coarse outline, the second touch area can be further determined from the unblocked light path. The second touch area is a sub-area of the first touch area, so it bounds the touch area more tightly and the determined touch area is more accurate.
Based on the content described in the foregoing embodiments, an embodiment of the present application further provides a method for determining a touch area, and referring to fig. 16, a flowchart of the method for determining a touch area is exemplarily shown in fig. 16, where the method includes:
step 301, determining a first touch area according to the blocked light path in the first direction of the touch screen.
Step 302, if an unblocked light path in the second direction exists within the first touch area, determining a second touch area according to the first touch area and the unblocked light path, where the second touch area is a sub-area of the first touch area.
According to the touch area determination method provided by the embodiments of the present application, the first touch area is determined from the blocked light paths; then, taking the first touch area as a coarse outline, the second touch area can be further determined from the unblocked light path. The second touch area is a sub-area of the first touch area, so it bounds the touch area more tightly and the determined touch area is more accurate.
In a possible embodiment, determining the second touch area according to the first touch area and the unblocked light path includes:
dividing the first touch area into sub-areas by the unblocked light path;
determining the sub-areas as the second touch area.
In a possible implementation manner, before determining the sub-area as the second touch area, the method further includes:
deleting the sub-areas that do not satisfy a preset condition;
the determining of the sub-areas as the second touch area includes:
determining the sub-areas remaining after the deletion as the second touch area.
In a possible embodiment, deleting the sub-region that does not satisfy the preset condition includes:
and deleting the subarea with the size smaller than or equal to the preset size threshold.
In a possible embodiment, deleting the sub-region that does not satisfy the preset condition includes:
the delete shape does not belong to a sub-region of the target shape.
In a possible embodiment, there are at least two first directions, and determining the first touch area according to the light paths blocked in the first direction of the touch screen includes:
determining a candidate region corresponding to each first direction, the candidate region being the area formed on the touch screen by the blocked light paths in that first direction;
determining the overlapping area between the candidate regions corresponding to the at least two first directions as the first touch area.
In a possible implementation, before the candidate region corresponding to each first direction is determined, the method further includes:
determining a preset number of scanning periods and their corresponding directions;
and determining, in each scanning period, a direction in which a blocked light path exists as a first direction (see the sketch below).
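The direction-selection step reduces to a small loop over the scanning schedule: each scanning period is paired with one direction, and any direction that reports at least one blocked light path during its period becomes a first direction. The direction names and the frame format below are assumptions for this sketch.

    SCAN_DIRECTIONS = ["horizontal", "vertical", "diagonal-45", "diagonal-135"]

    def first_directions(frames):
        # frames maps a direction to the beams reported blocked during that
        # direction's scanning period (an empty list means nothing blocked).
        return [d for d in SCAN_DIRECTIONS if frames.get(d)]

    frames = {"horizontal": [5, 6], "vertical": [10, 11], "diagonal-45": []}
    print(first_directions(frames))  # ['horizontal', 'vertical']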
It should be noted that the implementation principle of each step of the method for determining a touch area is consistent with that of the corresponding functions of the display device described in the foregoing embodiments; for details, reference may be made to the embodiments corresponding to the display device, and the details are not repeated here.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and practical applications, and thereby to enable others skilled in the art to make the best use of the embodiments, with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, characterized in that the display device comprises:
a touch screen, configured to receive touch operations;
a controller communicatively coupled to the touch screen, the controller configured to:
determine a first touch area according to the blocked light paths in a first direction of the touch screen;
if an unblocked light path in a second direction exists in the first touch area, determine a second touch area according to the first touch area and the unblocked light path, wherein the second touch area is a sub-area of the first touch area.
2. The display device of claim 1, wherein determining the second touch area according to the first touch area and the unblocked light path comprises:
dividing the first touch area into sub-areas by the unblocked light path;
and determining the sub-areas as second touch areas.
3. The display device of claim 2, wherein before determining the sub-areas as second touch areas, the controller is further configured to:
delete sub-areas that do not satisfy a preset condition;
and determining the sub-areas as second touch areas comprises:
determining the sub-areas remaining after the deletion as second touch areas.
4. The display device according to claim 3, wherein deleting the sub-areas that do not satisfy the preset condition comprises:
deleting sub-areas whose size is smaller than or equal to a preset size threshold.
5. The display device according to claim 3, wherein deleting the sub-areas that do not satisfy the preset condition comprises:
deleting sub-areas whose shape does not belong to a target shape.
6. The display device according to any one of claims 1 to 5, wherein there are at least two first directions, and determining the first touch area according to the blocked light paths in the first directions of the touch screen comprises:
determining a candidate region corresponding to each first direction, wherein the candidate region is the region formed on the touch screen by the blocked light paths in that first direction;
and determining the overlapping region of the candidate regions corresponding to the at least two first directions as the first touch area.
7. The display device according to claim 6, wherein before determining the candidate region corresponding to each first direction, the controller is further configured to:
determine a preset number of scanning periods and their corresponding directions;
and determine, in each scanning period, a direction in which a blocked light path exists as a first direction.
8. A method for determining a touch area, the method comprising:
determining a first touch area according to the blocked light paths in a first direction of a touch screen;
if an unblocked light path in a second direction exists in the first touch area, determining a second touch area according to the first touch area and the unblocked light path, wherein the second touch area is a sub-area of the first touch area.
9. The method of claim 8, wherein determining the second touch area according to the first touch area and the unblocked light path comprises:
dividing the first touch area into sub-areas by the unblocked light path;
and determining the sub-areas as second touch areas.
10. The method of claim 9, wherein before determining the sub-areas as second touch areas, the method further comprises:
deleting sub-areas that do not satisfy a preset condition;
and determining the sub-areas as second touch areas comprises:
determining the sub-areas remaining after the deletion as second touch areas.
CN202011089349.9A 2020-10-13 2020-10-13 Touch area determining method and display device Active CN114415864B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011089349.9A CN114415864B (en) 2020-10-13 2020-10-13 Touch area determining method and display device

Publications (2)

Publication Number Publication Date
CN114415864A 2022-04-29
CN114415864B 2024-02-02

Family

ID=81260411

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant