CN114415864B - Touch area determining method and display device - Google Patents

Touch area determining method and display device

Info

Publication number
CN114415864B
CN114415864B (application number CN202011089349.9A)
Authority
CN
China
Prior art keywords
touch area
touch
area
determining
blocked
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011089349.9A
Other languages
Chinese (zh)
Other versions
CN114415864A (en)
Inventor
王武军
张晓娜
赵书凯
赵宏洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Hisense Commercial Display Co Ltd
Original Assignee
Qingdao Hisense Commercial Display Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Hisense Commercial Display Co Ltd filed Critical Qingdao Hisense Commercial Display Co Ltd
Priority to CN202011089349.9A
Publication of CN114415864A
Application granted
Publication of CN114415864B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/147 Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a touch area determining method and a display device. A first touch area is determined according to the light paths that are blocked in a first direction of the touch screen; if the first touch area has light paths that are unblocked in a second direction, a second touch area is determined according to the first touch area and the unblocked light paths, the second touch area being a sub-area of the first touch area. In other words, in the embodiments of the present application, the first touch area is first determined from the blocked light paths; it then serves as a rough outline from which the second touch area is further determined using the unblocked light paths. Because the second touch area is a sub-area of the first touch area, the touch area it defines is smaller, and the accuracy of the determined touch area is higher.

Description

Touch area determining method and display device
Technical Field
The embodiments of the present application relate to the field of touch technology, and in particular to a touch area determining method and a display device.
Background
With the widespread use of electronic products, the display screen of an electronic product directly affects how users experience the product. To simplify human-machine interaction, touch screens have emerged: a touch screen can detect a touch area and respond according to that area. For example, if a control for closing the current interface is displayed in the touch area, the current interface is closed.
In the prior art, the infrared touch screen is a common type of touch screen. It has a rectangular structure in which the two opposite long sides are a long-side emitting side and a long-side receiving side: a plurality of emitting lamps are arranged on the long-side emitting side and a plurality of receiving lamps on the long-side receiving side. Similarly, the two opposite short sides are a short-side emitting side and a short-side receiving side, with emitting lamps on the former and receiving lamps on the latter. The infrared light emitted by an emitting lamp can be received by one or more opposite receiving lamps, forming an optical path between the emitting lamp and each corresponding receiving lamp. When a user touches the infrared touch screen, the light paths between some emitting lamps and receiving lamps are blocked, so that those receiving lamps cannot receive the emitted infrared light, and the touch area can be determined from the blocked light paths.
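As a minimal illustration of this detection step (the threshold value, the data layout, and the function name below are assumptions of this sketch, not taken from the patent), a blocked light path can be modeled as a path whose receiving lamp measures an intensity below a cutoff:

```python
BLOCK_THRESHOLD = 0.2  # hypothetical normalized-intensity cutoff

def blocked_paths(readings):
    """readings maps a path id to the normalized intensity measured at its
    receiving lamp; a path is treated as blocked when the emitted infrared
    light does not arrive (reading below the cutoff)."""
    return {pid for pid, level in readings.items() if level < BLOCK_THRESHOLD}

# Example: paths 3 and 4 are covered by a finger.
print(blocked_paths({1: 0.90, 2: 0.85, 3: 0.05, 4: 0.10, 5: 0.88}))  # {3, 4}
```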
However, in studying the prior art, the applicant found that the above method of determining a touch area suffers from low accuracy.
Disclosure of Invention
The embodiments of the present application provide a touch area determining method and a display device, which can solve the technical problem of low accuracy in current touch area determination.
In a first aspect, embodiments of the present application provide a display device, including:
the touch screen is used for receiving touch operation;
a controller in communication with the touch screen, the controller configured to:
determining a first touch area according to the light paths that are blocked in a first direction of the touch screen;
and if the first touch area has light paths that are unblocked in a second direction, determining a second touch area according to the first touch area and the unblocked light paths, wherein the second touch area is a sub-area of the first touch area.
In a second aspect, an embodiment of the present application provides a method for determining a touch area, where the method includes:
determining a first touch area according to the light paths that are blocked in a first direction of the touch screen;
and if the first touch area has light paths that are unblocked in a second direction, determining a second touch area according to the first touch area and the unblocked light paths, wherein the second touch area is a sub-area of the first touch area.
According to the touch area determining method and the display device, the first touch area is determined according to the light paths that are blocked in the first direction of the touch screen; if the first touch area has light paths that are unblocked in the second direction, the second touch area is determined according to the first touch area and the unblocked light paths, the second touch area being a sub-area of the first touch area. In other words, the first touch area is first determined from the blocked light paths and then serves as a rough outline from which the second touch area is further determined using the unblocked light paths. Because the second touch area is a sub-area of the first touch area, the touch area it defines is smaller and its accuracy is higher.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. It is obvious that the following drawings show only some embodiments of the present application, and that a person skilled in the art may obtain other drawings from them without inventive effort.
A schematic diagram of an operational scenario between a display device and a control apparatus according to some embodiments is schematically shown in fig. 1;
a hardware configuration block diagram of a display device 200 according to some embodiments is exemplarily shown in fig. 2;
an icon control interface display schematic of an application in a display device 200 according to some embodiments is illustrated in fig. 3;
a schematic diagram of a touch screen emitting infrared light is exemplarily shown in fig. 4;
FIG. 5 illustrates a schematic view of an optical network in one direction of a touch screen;
FIG. 6 illustrates a schematic view of an optical network in another direction of a touch screen;
FIG. 7 illustrates a schematic view of an optical network in another direction of a touch screen;
FIG. 8 illustrates a schematic view of an optical network in another direction of a touch screen;
FIG. 9 illustrates a schematic view of an optical network in another direction of a touch screen;
FIG. 10 illustrates a schematic view of an optical network in another direction of a touch screen;
FIG. 11 illustrates an infrared light matrix schematic of a touch screen;
FIG. 12 illustrates a schematic view of a user operation area in a touch screen;
FIG. 13 illustrates a schematic diagram of candidate regions in a touch screen;
FIG. 14 illustrates a schematic view of a first touch area in a touch screen;
FIG. 15 illustrates a schematic view of a second touch area in a touch screen;
a flow chart of a method of determining a touch area is schematically shown in fig. 16.
Detailed Description
For purposes of clarity regarding the objectives, embodiments, and advantages of the present application, the exemplary embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described exemplary embodiments are only some, not all, of the embodiments of the present application.
Based on the exemplary embodiments described herein, all other embodiments obtained by a person of ordinary skill in the art without inventive effort fall within the scope of the appended claims. Furthermore, while the disclosure is presented in the context of one or more exemplary embodiments, it should be appreciated that individual aspects of the disclosure may each constitute a complete embodiment on their own.
It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first", "second", "third", and the like in the description, the claims, and the above drawings are used to distinguish similar objects or entities and do not necessarily describe a particular order or sequence, unless otherwise indicated. It is to be understood that terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can, for example, be practiced in sequences other than those illustrated or described herein.
Furthermore, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
A schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment is exemplarily shown in fig. 1. As shown in fig. 1, a user may operate the display apparatus 200 through the mobile terminal 300 and the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-range communication modes, and the display device 200 is controlled wirelessly or by wire. The user may control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and the like. For example, the user can input corresponding control instructions through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, and power key on the remote controller to control the functions of the display device 200.
In some embodiments, mobile terminals, tablet computers, notebook computers, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on a smart device. The application program, by configuration, can provide various controls to the user in an intuitive User Interface (UI) on a screen associated with the smart device.
In some embodiments, the mobile terminal 300 and the display device 200 may install matching software applications, enabling connection and communication through a network communication protocol for the purpose of one-to-one control operation and data communication. For example, a control command protocol can be established between the mobile terminal 300 and the display device 200, the remote control keyboard can be synchronized onto the mobile terminal 300, and the display device 200 can be controlled through the user interface on the mobile terminal 300. The audio/video content displayed on the mobile terminal 300 can also be transmitted to the display device 200 to realize a synchronous display function.
As also shown in fig. 1, the display device 200 is also in data communication with the server 400 via various communication means. The display device 200 may be allowed to establish communication connections via a local area network (LAN), a wireless local area network (WLAN), and other networks. The server 400 may provide various content and interactions to the display device 200. For example, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and through electronic program guide (EPG) interactions. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers. Other network service content such as video on demand and advertising services is provided through the server 400.
The display device 200 may be a liquid crystal display, an OLED display, a projection display. The particular display device type, size, resolution, etc. are not limited, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
In addition to the broadcast receiving television function, the display apparatus 200 may additionally provide a computer-supported smart network television function, including, but not limited to, network television, smart television, Internet Protocol television (IPTV), and the like.
A hardware configuration block diagram of the display device 200 according to an exemplary embodiment is illustrated in fig. 2.
In some embodiments, at least one of the controller 250, the modem 210, the communicator 220, the detector 230, the input/output interface 255, the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the display apparatus 200.
In some embodiments, the display 275 is configured to receive image signals from the first processor output, and to display video content and images and components of the menu manipulation interface.
In some embodiments, the display 275 includes a display screen assembly for presenting a picture, and a drive assembly for driving the display of an image.
In some embodiments, the displayed video content may come from broadcast television content, or from various broadcast signals received via wired or wireless communication protocols. Alternatively, various image content sent by a network server via network communication protocols may be displayed.
In some embodiments, the display 275 is used to present a user-manipulated UI interface generated in the display device 200 and used to control the display device 200.
In some embodiments, depending on the type of display 275, a drive assembly for driving the display is also included.
In some embodiments, display 275 is a projection display and may also include a projection device and projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver.
In some embodiments, the display apparatus 200 may establish control signal and data signal transmission and reception between the control device 100 or the content providing apparatus through the communicator 220.
In some embodiments, the user interface 265 may be used to receive infrared control signals from the control device 100 (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is used by the display device 200 to collect signals from, or interact with, the external environment.
In some embodiments, the detector 230 includes a light receiver, a sensor for capturing the intensity of ambient light, so that display parameters can be adapted according to the captured ambient light, and so on.
In some embodiments, the detector 230 may further include an image collector, such as a camera, a video camera, etc., which may be used to collect external environmental scenes, collect attributes of a user or interact with a user, adaptively change display parameters, and recognize a user gesture to realize an interaction function with the user.
In some embodiments, the detector 230 may also include a temperature sensor or the like, such as by sensing ambient temperature.
In some embodiments, the display device 200 may adaptively adjust the display color temperature of the image. For example, the display device 200 may be adjusted to display an image with a colder color temperature when the ambient temperature is high, or with a warmer color temperature when the ambient temperature is low.
In some embodiments, the detector 230 may also include a sound collector such as a microphone, which may be used to receive the user's voice, for example a voice signal containing a control instruction for controlling the display apparatus 200, or to collect environmental sounds to identify the type of environmental scene so that the display apparatus 200 can adapt to environmental noise.
In some embodiments, as shown in fig. 2, the input/output interface 255 is configured to enable data transfer between the controller 250 and external other devices or other controllers 250. Such as receiving video signal data and audio signal data of an external device, command instruction data, or the like.
In some embodiments, external device interface 240 may include, but is not limited to, the following: any one or more interfaces of a high definition multimedia interface HDMI interface, an analog or data high definition component input interface, a composite video input interface, a USB input interface, an RGB port, and the like can be used. The plurality of interfaces may form a composite input/output interface.
In some embodiments, as shown in fig. 2, the modem 210 is configured to receive broadcast television signals by wired or wireless reception, perform modulation and demodulation processing such as amplification, mixing, and resonance, and demodulate, from among multiple wireless or wired broadcast television signals, the audio/video signal carried in the television channel frequency selected by the user, as well as the EPG data signal.
In some embodiments, the frequency demodulated by the modem 210 is controlled by the controller 250: the controller 250 may send a control signal according to the user's selection so that the modem responds to the television signal frequency selected by the user and demodulates the television signal carried on that frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to a broadcasting system of the television signal. Or may be differentiated into digital modulation signals, analog modulation signals, etc., depending on the type of modulation. Or it may be classified into digital signals, analog signals, etc. according to the kind of signals.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices; that is, the modem 210 may also be located in a device external to the main device containing the controller 250, such as an external set-top box. In this way, the set-top box outputs the television audio/video signal demodulated from the received broadcast television signal to the main device, which receives the audio/video signal through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored on the memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
As shown in fig. 2, the controller 250 includes at least one of a random access memory 251 (Random Access Memory, RAM), a read-only memory 252 (ROM), a video processor 270, an audio processor 280, other processors 253 (e.g., a graphics processing unit (Graphics Processing Unit, GPU)), a central processing unit 254 (Central Processing Unit, CPU), a communication interface, and a communication bus 256 that connects the components.
In some embodiments, RAM 251 is used to store temporary data for the operating system or other running programs.
In some embodiments, ROM 252 is used to store instructions for various system boots.
In some embodiments, ROM 252 is used to store a basic input output system (Basic Input Output System, BIOS), which is used to complete the power-on self-test of the system, the initialization of each functional module in the system, the drivers for the system's basic input/output, and booting the operating system.
In some embodiments, upon receipt of the power-on signal, the display device 200 starts up, and the CPU runs the system boot instructions in ROM 252, copying the temporary data of the operating system stored in memory into RAM 251 in order to start or run the operating system. After the operating system is started, the CPU copies the temporary data of the various applications in memory into RAM 251 in order to start or run those applications.
In some embodiments, the central processor 254 is used to execute the operating system and application program instructions stored in memory, and to execute various applications, data, and content according to the interactive instructions received from outside, so as to finally display and play various audio and video content.
In some exemplary embodiments, the central processor 254 may comprise a plurality of processors, including one main processor and one or more sub-processors. The main processor performs some operations of the display apparatus 200 in the pre-power-up mode and/or displays pictures in the normal mode; the one or more sub-processors handle operations in standby and similar modes.
In some embodiments, the graphics processor 253 is configured to generate various graphical objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It comprises an arithmetic unit, which performs operations on the interactive instructions input by the user and displays the various objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for display on the display screen.
In some embodiments, the video processor 270 is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image composition according to the standard codec protocol of the input signal, to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image compositing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module demultiplexes the input audio/video data stream, for example an input MPEG-2 stream, into video signals, audio signals, and the like.
The video decoding module processes the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module, such as an image synthesizer, superimposes and mixes the GUI signal that is input by the user or generated by the graphics generator with the scaled video image, to generate an image signal for display.
The frame rate conversion module converts the frame rate of the input video, for example converting a 60 Hz frame rate into a 120 Hz or 240 Hz frame rate, commonly by means of frame interpolation.
The display formatting module converts the frame-rate-converted video signal into a video output signal conforming to the display format, for example an RGB data signal.
In some embodiments, the graphics processor 253 and the video processor 270 may be integrated or configured separately. When integrated, they jointly process the graphics signals output to the display screen; when configured separately, they perform different functions, for example in a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the audio signal according to a standard codec protocol of an input signal, and perform noise reduction, digital-to-analog conversion, and amplification processing, so as to obtain a sound signal that can be played in a speaker.
In some embodiments, video processor 270 may include one or more chips. Audio processor 280 may also include one or more chips.
In some embodiments, video processor 270 and audio processor 280 may be separate chips or may be integrated together with the controller in one or more chips.
The power supply 290 supplies power to the display device 200 from an external power source under the control of the controller 250. The power supply 290 may include a built-in power circuit installed inside the display device 200, or may be an external power supply, with the display device 200 providing a power interface for connecting the external supply.
The user interface 265 is used to receive an input signal from a user and then transmit the received user input signal to the controller 250. The user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
In some embodiments, the user inputs a user command through the control apparatus 100 or the mobile terminal 300, the user input interface passes the input to the controller 250, and the display device 200 then responds to the user input.
In some embodiments, a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
In some embodiments, as shown in fig. 3, the application layer contains at least one icon control that the application can display in the display screen, such as: a live television application icon control, a video on demand application icon control, a media center application icon control, an application center icon control, a game application icon control, and the like.
In some embodiments, the live television application may provide live television via different signal sources. For example, a live television application may provide television signals using inputs from cable television, radio broadcast, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
In some embodiments, the video on demand application may provide video from different storage sources. Unlike live television applications, video-on-demand provides video displays from some storage sources. For example, video-on-demand may come from the server side of cloud storage, from a local hard disk storage containing stored video programs.
In some embodiments, the media center application may provide various multimedia content playing applications. For example, a media center may be a different service than live television or video on demand, and a user may access various images or audio through a media center application.
In some embodiments, an application center may be provided to store various applications. The application may be a game, an application, or some other application associated with a computer system or other device but which may be run in a smart television. The application center may obtain these applications from different sources, store them in local storage, and then be run on the display device 200.
Currently, the display screen may be a touch screen for receiving touch operations. When a user performs a touch operation on the touch screen, the operation blocks light paths on the screen, so the touch area corresponding to the operation can be determined from the blocked light paths. However, when the touch screen supports multi-touch, a user may touch it with several fingers, which block light paths only in certain directions while leaving light paths in other directions unblocked. A touch area determined only from the blocked light paths is then larger than the actual contact, resulting in lower accuracy.
To solve this technical problem, in the embodiments of the present application the first touch area is determined from the blocked light paths; the first touch area then serves as a rough outline from which a second touch area is further determined using the unblocked light paths. Because the second touch area is a sub-area of the first touch area, the touch area it defines is smaller and its accuracy is higher. The embodiments are explained in detail below.
In the embodiment of the present application, the controller 250 is configured to:
determining a first touch area according to the light paths blocked in a first direction of the touch screen; and if the first touch area has light paths that are unblocked in a second direction, determining a second touch area according to the first touch area and the unblocked light paths, wherein the second touch area is a sub-area of the first touch area.
The touch screen may be an infrared touch display screen, which has the advantages of strong environmental adaptability, long service life, and the ability to identify multiple touch areas. The infrared touch display screen has a rectangular structure and is also known as an infrared technology touch screen (ITST, Infrared Touch Screen Technology). As shown in fig. 4, the four sides of the rectangle are a long-side emitting side LS1, a long-side receiving side LS2, a short-side emitting side SS1, and a short-side receiving side SS2. A plurality of emitting lamps are arranged on LS1 and SS1, and a plurality of receiving lamps on LS2 and SS2. The emitting lamps on LS1 emit infrared light that the receiving lamps on LS2 receive; likewise, the emitting lamps on SS1 emit infrared light that the receiving lamps on SS2 receive.
As can be seen from fig. 4, one emitting lamp can emit infrared light in multiple directions. The emitting lamp TL1 on LS1 in fig. 4 emits infrared light in the three directions D1, D2, and D3, forming light paths in each of those directions; likewise, the emitting lamp TL2 on SS1 emits infrared light in the three directions D4, D5, and D6, forming light paths in each of those directions.
In practical application, any emitting lamp can emit infrared light in multiple directions, and a light path is formed between each emitting lamp and its corresponding receiving lamp. The multiple parallel light paths in one direction form an infrared detection network on the touch screen, also called an optical network. For example, in fig. 4, each emitting lamp on LS1 can emit infrared light in the three directions D1, D2, and D3, and each emitting lamp on SS1 in the three directions D4, D5, and D6, so the light paths of each of the directions D1 through D6 form an optical network on the touch screen; figs. 5 to 10 show the optical networks formed by the light paths in directions D1 to D6, respectively.
It is understood that the optical networks in different directions can form an infrared light matrix of horizontally and vertically intersecting paths. For example, the vertical optical network shown in fig. 6 and the horizontal optical network shown in fig. 9 can form the infrared light matrix shown in fig. 11. In addition, the infrared touch display includes a general display and a touch frame positioned around it.
On the infrared touch display screen, any object performing a touch operation blocks the infrared light paths passing through the contact point, so the touch area can be determined from the coordinate positions of the blocked light paths, and the touch operation can be responded to according to the controls within that area. In the embodiments of the present application, a first direction is a direction containing blocked light paths; a blocked light path is one blocked by the user operation area, i.e. the area covered by the finger when the user touches the touch screen. As shown in fig. 12, the hatched area AR1 is a user operation area that blocks light paths in the three directions D2, D5, and D6, so D2, D5, and D6 are each referred to as a first direction.
Correspondingly, a second direction is a direction containing no blocked light path, that is, none of the light paths in that direction is blocked; an unblocked light path is one that the user operation area does not block. As shown in fig. 12, the user operation area does not block the light paths in direction D3, so D3 is referred to as a second direction.
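To make the terminology concrete, the following toy model (the direction labels follow fig. 4, but the path indices are invented for illustration) treats a first direction as any direction containing blocked paths, and a second direction as any direction containing none:

```python
DIRECTIONS = ["D1", "D2", "D3", "D4", "D5", "D6"]  # as labeled in fig. 4

# For each direction, the indices of the parallel light paths blocked in
# the current scan; the values mirror the fig. 12 situation, in which
# only D2, D5, and D6 contain blocked paths.
blocked_by_direction = {
    "D1": set(), "D2": {4, 5, 6}, "D3": set(),
    "D4": set(), "D5": {5, 6},    "D6": {7, 8, 9},
}

first_directions = [d for d in DIRECTIONS if blocked_by_direction[d]]
second_directions = [d for d in DIRECTIONS if not blocked_by_direction[d]]
print(first_directions)   # ['D2', 'D5', 'D6']
print(second_directions)  # ['D1', 'D3', 'D4']
```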
In the embodiments of the present application, the area formed by the light paths blocked in the first directions may serve as the first touch area. For example, the area formed by the first directions D2, D5, and D6 in fig. 12 may be the first touch area.
In a possible implementation, there are at least two first directions, and the specific step of determining the first touch area may include:
determining a candidate region corresponding to the first direction; the candidate region includes: a region formed on the touch screen by the light path which is blocked in the first direction of the touch screen;
and determining an overlapping region between candidate regions respectively corresponding to at least two first directions as a first touch region.
The candidate regions correspond to the first directions, one candidate region per first direction. First, the consecutive blocked light paths in the first direction are obtained; then, the first and last of these consecutive light paths are selected as the boundary light paths; finally, the region between the boundary light paths is taken as the candidate region. For example, as shown in fig. 13, for the first direction D2 there are two blocked light paths, LR1 and LR2, so LR1 and LR2 are the boundary light paths, and the hatched region AR2 between them is the candidate region for direction D2.
After the candidate regions corresponding to the first directions D5 and D6 are determined in the same way, the overlapping region of the candidate regions can be taken as the first touch area. Fig. 14 shows the first touch area AR3 determined from the first directions D2, D5, and D6; it is the region enclosed by the boundary light paths of the three directions, that is, the overlapping region of their candidate regions.
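The two steps above can be sketched with simple strip geometry (the direction normals, the unit path spacing, and the blocked-path indices below are made-up illustration values, not taken from the patent): each first direction contributes the strip between its boundary light paths, and the first touch area is the overlap of all strips:

```python
def strip_bounds(blocked_indices, spacing=1.0):
    """The boundary light paths are the first and last of a run of
    consecutive blocked paths; the candidate region is the strip between
    them (cf. LR1/LR2 and AR2 in fig. 13)."""
    return min(blocked_indices) * spacing, max(blocked_indices) * spacing

def in_first_touch_area(point, strips):
    """strips holds one ((nx, ny), (lo, hi)) entry per first direction;
    a point belongs to the first touch area (AR3 in fig. 14) only if it
    lies inside every strip, i.e. in the overlap of all candidate regions."""
    x, y = point
    return all(lo <= nx * x + ny * y <= hi for (nx, ny), (lo, hi) in strips)

# Hypothetical normals and blocked runs for D2-, D5-, and D6-like families:
strips = [
    ((1.0, 0.0), strip_bounds({4, 5, 6})),
    ((0.0, 1.0), strip_bounds({5, 6})),
    ((0.7, 0.7), strip_bounds({7, 8, 9})),
]
print(in_first_touch_area((5.5, 5.5), strips))  # True: inside all strips
```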
After the first touch area is determined, the second touch area is determined according to the first touch area and the light paths that are unblocked in the second direction. Specifically, this includes:
dividing the first touch area into sub-areas by the unblocked light paths;
determining the sub-areas as the second touch area.
In the embodiments of the present application, the unblocked light paths pass through the first touch area and can therefore divide it into one or more sub-areas. For example, as shown in fig. 12, the light paths in the second direction D3 are not blocked, so they can divide the first touch area AR3 obtained in fig. 14 into the three sub-areas AR4, AR5, and AR6 shown in fig. 15, and these sub-areas can be taken as the second touch area.
In the embodiments of the present application, sub-areas whose area is smaller than or equal to a first area threshold may also be deleted, so that the remaining sub-area serves as the second touch area. As shown in fig. 15, the areas of AR5 and AR6 are smaller than or equal to the first area threshold, so after deleting AR5 and AR6 the remaining sub-area AR4 serves as the second touch area.
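The split-then-filter step can be sketched as follows (the geometry is reduced to one axis for brevity, and all numeric values, including the threshold, are made-up illustration values): the unblocked second-direction paths cut the first touch area into sub-areas, and sub-areas at or below the size threshold are dropped:

```python
def split_by_unblocked(lo, hi, cut_offsets):
    """Split the interval [lo, hi] (the first touch area's extent across
    the second direction) at each unblocked-path offset inside it."""
    cuts = sorted(c for c in cut_offsets if lo < c < hi)
    edges = [lo, *cuts, hi]
    return list(zip(edges, edges[1:]))   # sub-areas as (start, end) pairs

def filter_small(subareas, min_size):
    """Drop sub-areas at or below the size threshold."""
    return [(a, b) for a, b in subareas if b - a > min_size]

subareas = split_by_unblocked(0.0, 10.0, cut_offsets=[7.5, 9.0])
print(subareas)                      # [(0.0, 7.5), (7.5, 9.0), (9.0, 10.0)]
print(filter_small(subareas, 1.6))   # [(0.0, 7.5)] -- the analogue of AR4
```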
As can be seen from fig. 12, the operation area AR1 is an inclined, elongated shape. The first touch area AR3 determined in fig. 14 is the touch area of the prior art, determined only from the blocked light paths, while the second touch area AR4 determined in fig. 15 is the touch area of the embodiments of the present application, determined by combining the blocked and unblocked light paths. The first touch area AR3 in fig. 14 reflects neither the elongated nor the inclined character of the user operation area. In fig. 15, however, the first touch area is divided by the unblocked light paths, which themselves run elongated and inclined along the second direction just like the original user operation area, so the resulting sub-area exhibits both characteristics. On this principle, the second touch area reflects the elongated and inclined characteristics, and therefore the user operation area, more accurately than the first touch area determined in the prior art.
In summary, the first touch area is determined from the blocked light paths, and the second touch area, a sub-area of it, is determined from the first touch area and the unblocked light paths. The first touch area is therefore at least as large as the second touch area: the first touch area is a coarser-grained touch area, and the second touch area is the more accurate one.
In another possible embodiment, before determining the sub-region as the second touch region, the controller is further configured to:
deleting the subareas which do not meet the preset conditions;
determining the sub-region as a second touch region, comprising:
and determining the subarea remained after the deletion as a second touch area.
The preset condition may be set according to the actual application scenario and is used to constrain the second touch area so that it meets a preset expectation. The preset condition may include, but is not limited to, constraints on size, shape, and position.
The size may be an area, a length, a width, etc., so the preset condition may be that the length is greater than or equal to a first length threshold, that the width is greater than or equal to a first width threshold, or that the area is greater than or equal to a first area threshold. For example, if the minimum recognizable length of the touch screen is 2 mm, the first width threshold and the first length threshold may be 1.5 mm, and the first area threshold may be 1.5 mm × 1.5 mm.
Of course, in practical application the second touch area may also be too large. In that case there are usually several controls inside the second touch area and it cannot be determined which control should respond, so an oversized second touch area can be treated as abnormal and the corresponding user touch operation as an abnormal touch operation. Specifically, a second length threshold, a second width threshold, or a second area threshold may be set, and the preset condition may be that the length is less than or equal to the second length threshold, the width is less than or equal to the second width threshold, or the area is less than or equal to the second area threshold, where the second length threshold is greater than the first length threshold, the second width threshold is greater than the first width threshold, and the second area threshold is greater than the first area threshold. The first thresholds are used to exclude abnormal touch operations whose area is too small, and the second thresholds to exclude abnormal touch operations whose area is too large.
For the constraint on shape, the preset condition may be that the shape of the sub-area is a target shape set according to the actual application scenario: touch areas of the target shape are responded to, and touch areas of other shapes are not. For example, the target shape may be a rectangle whose length is less than or equal to the second length threshold and whose width is less than or equal to the third width threshold, which avoids responding to the following abnormal touch operation: an oversized user operation area blocks a large portion of the touch screen or even the entire screen. In such a case the normally determined second touch area contains several controls, so it cannot be determined which control should respond, and the operation can be treated as abnormal.
For the constraint on position, the normal positions other than positions with special functions may be taken as target positions: the system responds to touch operations on target positions and does not respond at positions with special functions. The preset condition may thus be that the position of the sub-area lies within a target region, which contains a number of continuous or discontinuous target positions.
Constraining the second touch area with the preset condition further improves its accuracy.
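A hedged sketch of such a preset-condition check follows; the threshold values, the rectangle test, and the optional target region are illustrative stand-ins chosen for this example, not values from the patent:

```python
def satisfies_preset(sub, *,
                     min_len=1.5, min_wid=1.5,        # reject too-small areas
                     max_len=400.0, max_wid=400.0,    # reject too-large areas
                     target_region=None):
    """sub is a dict with 'length', 'width', 'shape', and 'position' keys;
    a sub-area is kept only if it passes the size, shape, and position
    constraints described above."""
    ok_size = (min_len <= sub["length"] <= max_len
               and min_wid <= sub["width"] <= max_wid)
    ok_shape = sub["shape"] == "rectangle"            # one possible target shape
    ok_pos = target_region is None or sub["position"] in target_region
    return ok_size and ok_shape and ok_pos

candidate = {"length": 20.0, "width": 6.0, "shape": "rectangle",
             "position": (120, 80)}
print(satisfies_preset(candidate))  # True -> kept as second touch area
```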
In another possible embodiment, deleting the sub-region that does not meet the preset condition includes:
deleting sub-regions with a size less than or equal to a preset size threshold.
The size may be an area, a length, a width, etc., and the corresponding preset size threshold may be the first area threshold, the first length threshold, or the first width threshold described above.
In this way, sub-areas whose size is less than or equal to the preset size threshold are prevented from serving as the second touch area. In practical application, the touch area corresponding to an abnormal touch operation is usually small, so removing such areas avoids responding to abnormal touch operations and reduces wasted resources.
In another possible embodiment, deleting the sub-region that does not meet the preset condition includes:
the deleted shape does not belong to a sub-region of the target shape.
The target shape may refer to the foregoing description, and will not be described herein.
In this way, abnormal touch operations can be identified through the target shape, responses to them avoided, and wasted resources reduced.
In another possible embodiment, before determining the candidate region corresponding to the first direction, the controller is further configured to:
determining a scanning period and a preset number of directions;
in each scanning period, the direction in which the blocked light path exists is determined as a first direction.
The scanning period may be set according to the actual application scenario, which the embodiments of the present application do not limit. The scanning period is used to scan the different directions and determine whether each direction contains a blocked light path: if a receiving lamp cannot receive the infrared light emitted by its corresponding emitting lamp, the light path between that emitting lamp and receiving lamp is called a blocked light path.
It will be appreciated that in each scanning period the light paths in all directions are scanned once to determine whether they are blocked, so the second touch area of that period can be determined from the blocked and unblocked light paths within it; over multiple scanning periods, multiple second touch areas are thus obtained. Since the scanning periods are ordered in time, the second touch areas can form a track in time order, and the track can further be smoothed. Of course, if no second touch area exists within a certain period, that is, no light path is blocked, then the second touch area of the previous period is the end of the track, which indicates that the user's continuous touch operation has ended. Before the track is generated, the second touch areas can be converted from a custom coordinate system to a standard coordinate system, so that they conform to the standard and the generated track is displayed correctly on the touch screen.
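The per-period flow can be sketched as below; the function names are placeholders for the steps just described (detection, smoothing), not APIs from the patent, and the frame data is invented for the usage example:

```python
def run_scan_loop(periods, detect_second_touch_areas, smooth):
    """One frame per scanning period: consecutive non-empty detections
    extend the touch track, and an empty period ends the track, which is
    then smoothed and emitted as one stroke."""
    track = []
    for frame in periods:
        areas = detect_second_touch_areas(frame)
        if areas:
            track.extend(areas)           # ordered by scan time
        elif track:
            yield smooth(track)           # touch ended; emit the stroke
            track = []
    if track:
        yield smooth(track)

# Toy usage: periods 0 and 1 detect a touch, period 2 detects none.
strokes = list(run_scan_loop(
    periods=[{"t": 0}, {"t": 1}, {"t": 2}],
    detect_second_touch_areas=lambda f: [(f["t"], f["t"])] if f["t"] < 2 else [],
    smooth=lambda pts: pts,
))
print(strokes)  # [[(0, 0), (1, 1)]]
```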
In practical application, after the second touch areas are obtained and before the track is generated, a ghost point removal operation can be performed on them. When a user touches two points on the touch screen, two horizontal coordinates and two vertical coordinates appear, forming 4 candidate points; in reality the user touched only two of them, and the other two are ghost points. Ghost point removal is a relatively mature technique and is not described further in the embodiments of the present application.
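The four-point ambiguity described above can be reproduced in a few lines (the offsets are arbitrary example values); deciding which two of the four candidates are the real touches is exactly the ghost point removal step:

```python
from itertools import product

# Two real touches block two offsets in each of the two intersecting path
# families, so the matrix reports four candidate intersections; two of
# them are ghost points.
xs, ys = [3, 8], [2, 6]            # offsets of blocked x- and y-paths
candidates = list(product(xs, ys))
print(candidates)  # [(3, 2), (3, 6), (8, 2), (8, 6)]
```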
According to the display device provided by the embodiments of the present application, the first touch area is determined from the blocked light paths; the first touch area then serves as a rough outline from which the second touch area is further determined using the unblocked light paths. Because the second touch area is a sub-area of the first touch area, the resulting touch area is smaller and its accuracy is higher.
Based on the foregoing description of the embodiments, a method for determining a touch area is further provided in the embodiments of the present application, and referring to fig. 16, a flowchart of a method for determining a touch area is shown in fig. 16, and the method includes:
step 301, determining a first touch area according to the light path blocked in the first direction of the touch screen.
Step 302, if there is an optical path which is not blocked in the second direction in the first touch area, determining a second touch area according to the first touch area and the optical path which is not blocked, wherein the second touch area is a sub-area of the first touch area.
According to the method for determining a touch area provided by the embodiments of the present application, the first touch area is determined from the blocked light paths and then serves as a rough outline from which the second touch area is further determined using the unblocked light paths; since the second touch area is a sub-area of the first touch area, the resulting touch area is smaller and its accuracy is higher.
In a possible implementation manner, the determining the second touch area according to the first touch area and the unobstructed optical path includes:
dividing the first touch area into subareas through the light path which is not shielded;
the sub-region is determined to be the second touch region.
In a possible implementation manner, before the determining the sub-area as the second touch area, the method further includes:
deleting the subareas which do not meet the preset conditions;
the determining the sub-region as the second touch region includes:
and determining the subarea remained after the deletion as a second touch area.
In a possible implementation manner, the deleting the sub-area that does not meet the preset condition includes:
deleting sub-regions with a size less than or equal to a preset size threshold.
In a possible implementation manner, the deleting the sub-area that does not meet the preset condition includes:
the deleted shape does not belong to a sub-region of the target shape.
In a possible implementation manner, at least two first directions exist, and determining the first touch area according to the light path blocked in the first direction of the touch screen includes:
determining a candidate region corresponding to the first direction; the candidate region includes: a region formed on the touch screen by the light path which is blocked in the first direction of the touch screen;
and determining an overlapping region between candidate regions respectively corresponding to at least two first directions as a first touch region.
In a possible implementation manner, before the determining the candidate region corresponding to the first direction, the method further includes:
determining a scanning period and a preset number of directions; and
in each scanning period, determining each direction in which a blocked light path exists as a first direction.
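A minimal sketch of this selection, assuming a preset list of scan directions and a caller-supplied probe is_blocked(direction) that reports whether any light path in that direction is interrupted during the current scanning period (both assumptions for illustration):

PRESET_DIRECTIONS = (0, 45, 90, 135)  # hypothetical preset directions, in degrees

def first_directions_for_period(is_blocked, directions=PRESET_DIRECTIONS):
    # Every direction showing at least one blocked light path in this
    # scanning period is treated as a first direction.
    return [d for d in directions if is_blocked(d)]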
It should be noted that the implementation principle of each step of the method for determining a touch area is consistent with that of the corresponding functions of the display device described in the foregoing embodiments; for details, reference may be made to the descriptions of those embodiments, which are not repeated here.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and that such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. Many modifications and variations are possible in light of the above teachings. The embodiments were chosen and described in order to best explain the principles and their practical application, and to thereby enable others skilled in the art to best utilize the various embodiments, with such modifications as are suited to the particular use contemplated.

Claims (7)

1. A display device, comprising:
a touch screen configured to receive touch operations; and
a controller in communication with the touch screen, the controller being configured to:
determine, according to light paths blocked in first directions of the touch screen, candidate areas corresponding to the first directions, and determine an overlapping area between the candidate areas corresponding to at least two first directions as a first touch area, wherein each candidate area comprises a region formed on the touch screen by the light paths blocked in the corresponding first direction; and
if there is a light path in the first touch area that is not blocked in a second direction, divide the first touch area into sub-areas by the unblocked light path and determine the sub-areas as a second touch area, wherein the second touch area is a sub-area of the first touch area.
2. The display device of claim 1, wherein, before the determining the sub-areas as the second touch area, the controller is further configured to:
delete sub-areas that do not meet a preset condition;
and wherein the determining the sub-areas as the second touch area comprises:
determining the sub-areas remaining after the deletion as the second touch area.
3. The display device of claim 2, wherein the deleting the sub-areas that do not meet the preset condition comprises:
deleting sub-areas whose size is less than or equal to a preset size threshold.
4. The display device of claim 2, wherein the deleting the sub-areas that do not meet the preset condition comprises:
deleting sub-areas whose shape does not belong to a target shape.
5. The display device of claim 1, wherein, before the determining the candidate areas corresponding to the first directions, the controller is further configured to:
determine a scanning period and a preset number of directions; and
in each scanning period, determine each direction in which a blocked light path exists as a first direction.
6. A method of determining a touch area, the method comprising:
determining, according to light paths blocked in first directions of a touch screen, candidate areas corresponding to the first directions, and determining an overlapping area between the candidate areas corresponding to at least two first directions as a first touch area, wherein each candidate area comprises a region formed on the touch screen by the light paths blocked in the corresponding first direction; and
if there is a light path in the first touch area that is not blocked in a second direction, dividing the first touch area into sub-areas by the unblocked light path and determining the sub-areas as a second touch area, wherein the second touch area is a sub-area of the first touch area.
7. The method of claim 6, wherein, before the determining the sub-areas as the second touch area, the method further comprises:
deleting sub-areas that do not meet a preset condition;
and wherein the determining the sub-areas as the second touch area comprises:
determining the sub-areas remaining after the deletion as the second touch area.
CN202011089349.9A 2020-10-13 2020-10-13 Touch area determining method and display device Active CN114415864B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011089349.9A CN114415864B (en) 2020-10-13 2020-10-13 Touch area determining method and display device


Publications (2)

Publication Number Publication Date
CN114415864A CN114415864A (en) 2022-04-29
CN114415864B (en) 2024-02-02

Family

ID=81260411

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011089349.9A Active CN114415864B (en) 2020-10-13 2020-10-13 Touch area determining method and display device

Country Status (1)

Country Link
CN (1) CN114415864B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102722291A (en) * 2012-06-01 2012-10-10 合肥工业大学 Method for driving infrared multi-point touch screen
CN104238834A (en) * 2014-09-28 2014-12-24 青岛海信电器股份有限公司 Method and system for improving touch sensitivity
CN105045439A (en) * 2015-08-07 2015-11-11 青岛海信电器股份有限公司 Touch screen multi-point positioning method and apparatus and touchscreen device
CN105260064A (en) * 2015-10-15 2016-01-20 青岛海信电器股份有限公司 Touch point identification method and apparatus, and display device
CN105302380A (en) * 2015-12-02 2016-02-03 青岛海信电器股份有限公司 Touch point obtaining method, infrared touch device and touch screen equipment
WO2016115818A1 (en) * 2015-01-23 2016-07-28 中兴通讯股份有限公司 Integrated optical-to-electrical touch screen and apparatus, electronic device, and touch recognition method and system
CN106020568A (en) * 2016-05-11 2016-10-12 青岛海信电器股份有限公司 Touch area extracting method, device and display device
CN106201064A (en) * 2016-06-24 2016-12-07 青岛海信电器股份有限公司 Touch localization method, device and touch screen
CN106406613A (en) * 2016-09-14 2017-02-15 青岛海信电器股份有限公司 Touch point positioning method and device
CN109542276A (en) * 2018-11-22 2019-03-29 青岛海信商用显示股份有限公司 A kind of touch point identification method and device and display equipment
CN110489015A (en) * 2019-08-19 2019-11-22 青岛海信商用显示股份有限公司 Touch point determines method and device, touch screen and display
CN110581904A (en) * 2018-06-11 2019-12-17 Oppo广东移动通信有限公司 Control method, control device, electronic device, and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106293444B (en) * 2015-06-25 2020-07-03 小米科技有限责任公司 Mobile terminal, display control method and device
CN105589613B (en) * 2016-01-28 2019-04-19 华为技术有限公司 A kind of touch independent positioning method, device and terminal device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design of an ARM-based infrared multi-point touch screen; Zhang Wei et al.; Chinese Journal of Liquid Crystals and Displays (Issue 05); full text *

Also Published As

Publication number Publication date
CN114415864A (en) 2022-04-29

Similar Documents

Publication Publication Date Title
CN109618206B (en) Method and display device for presenting user interface
CN112118400B (en) Display method of image on display device and display device
US11960674B2 (en) Display method and display apparatus for operation prompt information of input control
CN112118468A (en) Method for changing color of peripheral equipment along with color of picture and display equipment
CN111176603A (en) Image display method for display equipment and display equipment
CN111899175A (en) Image conversion method and display device
CN111954059A (en) Screen saver display method and display device
CN111954043B (en) Information bar display method and display equipment
CN111984167B (en) Quick naming method and display device
CN112269668A (en) Application resource sharing and display equipment
CN114430492A (en) Display device, mobile terminal and picture synchronous zooming method
CN114079819A (en) Content display method and display equipment
CN112783380A (en) Display apparatus and method
CN111078926A (en) Method for determining portrait thumbnail image and display equipment
CN112235621B (en) Display method and display equipment for visual area
CN114415864B (en) Touch area determining method and display device
CN111988646B (en) User interface display method and display device of application program
CN113485613A (en) Display equipment and method for realizing free-drawing screen edge painting
CN112363683A (en) Method for supporting multi-layer display of webpage application and display equipment
CN112367550A (en) Method for realizing multi-title dynamic display of media asset list and display equipment
CN114417035A (en) Picture browsing method and display device
CN111310424B (en) Form generation method and display device
CN113436564B (en) EPOS display method and display equipment
CN111462702B (en) Display device backlight control method and display device
CN112286472B (en) UI display method and display equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant