CN114237419B - Display device and touch event identification method

Display device and touch event identification method

Info

Publication number
CN114237419B
Authority
CN
China
Prior art keywords
touch
smaller value
ratio
touch event
target
Prior art date
Legal status
Active
Application number
CN202111506266.XA
Other languages
Chinese (zh)
Other versions
CN114237419A (en)
Inventor
张先震
娄建生
王兆恩
王武军
Current Assignee
Qingdao Hisense Commercial Display Co Ltd
Original Assignee
Qingdao Hisense Commercial Display Co Ltd
Priority date
Filing date
Publication date
Application filed by Qingdao Hisense Commercial Display Co Ltd
Publication of CN114237419A
Application granted
Publication of CN114237419B
Legal status: Active


Classifications

    • G (PHYSICS) > G06 (COMPUTING; CALCULATING OR COUNTING) > G06F (ELECTRIC DIGITAL DATA PROCESSING) > G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements > G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer > G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means > G06F3/0412 Digitisers structurally integrated in a display
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks > G06F3/0354 with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface > G06F3/03545 Pens or stylus
    • G06F3/042 Digitisers characterised by the transducing means by opto-electronic means > G06F3/0421 by interrupting or reflecting a light beam, e.g. optical touch-screen

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application belongs to display technology and provides a display device and a method for identifying touch events. The display device comprises an infrared touch screen, configured to respond to a touch operation acting on the infrared touch screen, acquire the touch point coordinates corresponding to the touch operation, and transmit the touch point coordinates to a processor; and a processor connected to the infrared touch screen, configured to determine, from the touch point coordinates, a target width and a target height of the touch point corresponding to the touch operation; determine the ratio of the larger of the target width and the target height to the smaller, together with that smaller value; and identify the touch event corresponding to the touch operation according to the ratio, the smaller value, and threshold information, where the threshold information indicates the relation between the ratio, the smaller value, and their corresponding thresholds. The method and the device can accurately identify touch events and improve user experience.

Description

Display device and touch event identification method
Cross reference to related applications
This application claims priority to Chinese patent application No. 202110345405.9, entitled "Display device, method and apparatus for recognizing touch events," filed with the Chinese Patent Office on 31 March 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The application relates to the technical field of touch control, in particular to a display device and a touch event identification method.
Background
As large-screen touch technology has matured, touch all-in-one machines of every kind have emerged in quick succession, bringing great convenience to people's life and work. In the education and conference fields, for example, a single interactive touch all-in-one machine can cover usage scenarios such as writing, erasing, recording and broadcasting, online teaching, and remote conferencing. Various conference and teaching applications for such machines have appeared as well, such as electronic whiteboard writing software and annotation software. This software receives touch data sent by the touch screen in real time, derives touch point information from the touch data, and raises a touch event; it then responds to the event according to the application's current mode.
At present, when an interactive touch all-in-one machine identifies a touch event, the touch screen evaluates the touch point to distinguish the fine nib of a stylus from its thick nib. The inventors found in practice, however, that this scheme has a high misrecognition rate when the stylus is held at a low angle.
Disclosure of Invention
Exemplary embodiments of the application provide a display device and a method for identifying touch events that can accurately identify touch events and improve user experience.
In a first aspect, an embodiment of the present application provides a display apparatus, including:
an infrared touch screen, configured to respond to a touch operation acting on the infrared touch screen, acquire the touch point coordinates corresponding to the touch operation, and transmit the touch point coordinates to the processor;
a processor coupled to the infrared touch screen, the processor configured to:
determining, according to the touch point coordinates, a target width and a target height of the touch point corresponding to the touch operation;
determining the ratio of the larger of the target width and the target height to the smaller, together with that smaller value; and
identifying the touch event corresponding to the touch operation according to the ratio, the smaller value, and threshold information, where the threshold information indicates the relation between the ratio, the smaller value, and their corresponding thresholds.
In some possible implementations, when determining the ratio of the larger of the target width and the target height to the smaller, together with the smaller value, the processor is specifically configured to: if the target width is larger than the target height, take the ratio as the target width divided by the target height and the smaller value as the target height; or, if the target width is smaller than or equal to the target height, take the ratio as the target height divided by the target width and the smaller value as the target width.
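A minimal sketch of this comparison (illustrative only; the function name and Python types are assumptions, not part of the patent):

```python
def ratio_and_smaller(target_width: float, target_height: float) -> tuple[float, float]:
    """Return (ratio of the larger dimension to the smaller, smaller dimension)."""
    if target_width > target_height:
        return target_width / target_height, target_height
    # target_width <= target_height
    return target_height / target_width, target_width
```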
In some possible implementations, when identifying the touch event corresponding to the touch operation according to the ratio, the smaller value, and the threshold information, the processor is specifically configured to: determine, from the ratio and the thresholds corresponding to the ratio, the ratio range in which the ratio falls, where different ratio ranges correspond to different smaller-value thresholds; and identify the touch event corresponding to the touch operation from the smaller value and the smaller-value thresholds of that ratio range, the touch event being one of a fine nib touch event, a thick nib touch event, a finger touch event, and a palm touch event.
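The tiered lookup can be sketched as follows; the ratio ranges and every numeric threshold below are invented placeholders for illustration, not values disclosed in the patent:

```python
# Each row: (ratio_low, ratio_high, fine_max, thick_max, finger_max).
# A smaller value at or below fine_max is a fine nib, at or below
# thick_max a thick nib, at or below finger_max a finger; else a palm.
THRESHOLDS = [
    (1.0, 1.5, 3.0, 7.0, 15.0),
    (1.5, 2.5, 2.5, 6.0, 13.0),
    (2.5, float("inf"), 2.0, 5.0, 11.0),
]

def identify_event(ratio: float, smaller: float) -> str:
    for lo, hi, fine_max, thick_max, finger_max in THRESHOLDS:
        if lo <= ratio < hi:
            if smaller <= fine_max:
                return "fine_nib"
            if smaller <= thick_max:
                return "thick_nib"
            if smaller <= finger_max:
                return "finger"
            return "palm"
    return "palm"  # ratios below 1.0 cannot occur by construction
```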
In some possible implementations, when determining, according to the touch point coordinates, the target width and the target height of the touch point corresponding to the touch operation, the processor is specifically configured to: acquire the touch point coordinates of a preset number of touch points; determine the actual width and actual height of each corresponding touch point from its coordinates; take the average width of the preset number of touch points as the target width of the touch point corresponding to the touch operation; and take the average height of the preset number of touch points as the target height of the touch point corresponding to the touch operation.
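Averaging over the first few reported points smooths jitter at pen-down. A sketch, assuming the width/height pairs have already been derived from the coordinates:

```python
def target_size(points: list[tuple[float, float]]) -> tuple[float, float]:
    """Average the (actual_width, actual_height) pairs of the preset
    number of touch points into a target width and target height."""
    n = len(points)  # the preset number of touch points
    avg_width = sum(w for w, _ in points) / n
    avg_height = sum(h for _, h in points) / n
    return avg_width, avg_height
```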
In some possible implementations, when determining, according to the touch point coordinates, the target width and the target height of the touch point corresponding to the touch operation, the processor is specifically configured to: determine the actual width and actual height of the touch point from the touch point coordinates; transform the actual width and the actual height according to the aspect ratio of the touch screen; take the transformed width as the target width of the touch point corresponding to the touch operation; and take the transformed height as the target height of the touch point corresponding to the touch operation.
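One plausible reading of this transformation (an assumption; the patent does not fix the coordinate convention): if each axis is reported in the same logical range, one logical unit spans a different physical length on the two axes of a 16:9 panel, so each axis is rescaled by its physical extent before width and height are compared:

```python
def to_common_scale(actual_width: float, actual_height: float,
                    screen_w: float = 16.0, screen_h: float = 9.0) -> tuple[float, float]:
    """Rescale both axes by the (assumed 16:9) screen proportion so that
    the transformed width and height are physically comparable."""
    return actual_width * screen_w, actual_height * screen_h
```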
In some possible implementations, when identifying the touch event corresponding to the touch operation according to the ratio, the smaller value, and the threshold information, the processor is specifically configured to: for a continuous touch event, scale the smaller value, where a continuous touch event is a touch event whose touch time interval is smaller than a duration threshold, the touch time interval being the difference between the pen-down time of the current touch event and the pen-up time of the previous touch event; and identify the touch event corresponding to the touch operation according to the ratio, the scaled smaller value, and the threshold information.
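The continuity test itself is a simple time comparison; the threshold value below is an assumed placeholder, not a value from the patent:

```python
DURATION_THRESHOLD_MS = 300  # assumed duration threshold

def is_continuous(pen_down_ms: int, last_pen_up_ms: int) -> bool:
    """A touch event is 'continuous' when the gap between the current
    pen-down time and the previous pen-up time is under the threshold."""
    return (pen_down_ms - last_pen_up_ms) < DURATION_THRESHOLD_MS
```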
In some possible implementations, the processor is specifically configured to: acquire the touch time interval of the continuous touch event; and scale the smaller value according to the touch time interval and a preset scaling weight corresponding to the previous touch event.
In some possible implementations, when scaling the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event, the processor is specifically configured to: if the previous touch event is a fine nib touch event, scale down the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event; or, if the previous touch event is a thick nib touch event or a finger touch event and the smaller value is larger than the mean of the corresponding smaller-value thresholds, scale down the smaller value in the same way; or, if the previous touch event is a thick nib touch event or a finger touch event and the smaller value is smaller than or equal to the mean of the corresponding smaller-value thresholds, scale up the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event.
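A sketch of these three branches; how the interval and weight combine into a factor is an assumed reading (shorter gaps adjust more strongly), and the default weight is a placeholder:

```python
def scale_smaller(smaller: float, last_event: str, interval_ms: int,
                  threshold_avg: float, weight: float = 0.0005,
                  duration_threshold_ms: int = 300) -> float:
    """Bias the smaller value of a continuous touch event toward the
    class of the previous event; threshold_avg is the mean of the
    smaller-value thresholds corresponding to the previous event."""
    factor = weight * max(0, duration_threshold_ms - interval_ms)
    if last_event == "fine_nib":
        return smaller * (1.0 - factor)      # shrink toward the fine nib band
    if last_event in ("thick_nib", "finger"):
        if smaller > threshold_avg:
            return smaller * (1.0 - factor)  # shrink toward the threshold mean
        return smaller * (1.0 + factor)      # grow toward the threshold mean
    return smaller                           # no adjustment otherwise
```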
In some possible implementations, before identifying the touch event corresponding to the touch operation according to the ratio, the smaller value, and the threshold information, the processor is further configured to obtain the threshold information as follows: acquire the target widths and target heights of a plurality of test points; for each test point, determine the ratio of the larger of its target width and target height to the smaller; classify the ratios corresponding to the test points to determine a plurality of ratio ranges; and, for each ratio range, determine the smaller-value threshold corresponding to each touch event from the smaller values of the target widths and target heights of the test points within that range.
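The calibration step can be sketched as below, assuming labelled test data and a given, sorted list of ratio boundaries; taking the per-class maximum smaller value as the threshold is one plausible rule, not the patent's stated one:

```python
from collections import defaultdict

def build_threshold_info(test_points, range_edges):
    """test_points: (target_width, target_height, event_label) tuples;
    range_edges: sorted ratio boundaries defining the ratio ranges.
    Returns {range_index: {event_label: smaller-value threshold}}."""
    buckets = defaultdict(lambda: defaultdict(list))
    for w, h, label in test_points:
        ratio = max(w, h) / min(w, h)
        smaller = min(w, h)
        idx = sum(1 for edge in range_edges if ratio >= edge)  # ratio range index
        buckets[idx][label].append(smaller)
    return {idx: {label: max(vals) for label, vals in by_label.items()}
            for idx, by_label in buckets.items()}
```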
In a second aspect, an embodiment of the present application provides a method for identifying a touch event, which is applied to a display device, where the method for identifying a touch event includes:
responding to a touch operation acting on the infrared touch screen, acquiring the touch point coordinates corresponding to the touch operation; determining, from the touch point coordinates, a target width and a target height of the touch point corresponding to the touch operation; determining the ratio of the larger of the target width and the target height to the smaller, together with that smaller value; and identifying the touch event corresponding to the touch operation according to the ratio, the smaller value, and threshold information, where the threshold information indicates the relation between the ratio, the smaller value, and their corresponding thresholds.
In some possible implementations, determining the ratio of the larger of the target width and the target height to the smaller, together with the smaller value, includes: if the target width is larger than the target height, taking the ratio as the target width divided by the target height and the smaller value as the target height; or, if the target width is smaller than or equal to the target height, taking the ratio as the target height divided by the target width and the smaller value as the target width.
In some possible implementations, identifying the touch event corresponding to the touch operation according to the ratio, the smaller value, and the threshold information includes: determining, from the ratio and the thresholds corresponding to the ratio, the ratio range in which the ratio falls, where different ratio ranges correspond to different smaller-value thresholds; and identifying the touch event corresponding to the touch operation from the smaller value and the smaller-value thresholds of that ratio range, the touch event being one of a fine nib touch event, a thick nib touch event, a finger touch event, and a palm touch event.
In some possible implementations, determining, according to the touch point coordinates, the target width and the target height of the touch point corresponding to the touch operation includes: acquiring the touch point coordinates of a preset number of touch points; determining the actual width and actual height of each corresponding touch point from its coordinates; taking the average width of the preset number of touch points as the target width of the touch point corresponding to the touch operation; and taking the average height of the preset number of touch points as the target height of the touch point corresponding to the touch operation.
In some possible implementations, determining, according to the touch point coordinates, the target width and the target height of the touch point corresponding to the touch operation includes: determining the actual width and actual height of the touch point from the touch point coordinates; transforming the actual width and the actual height according to the aspect ratio of the touch screen; taking the transformed width as the target width of the touch point corresponding to the touch operation; and taking the transformed height as the target height of the touch point corresponding to the touch operation.
In some possible implementations, identifying the touch event corresponding to the touch operation according to the ratio, the smaller value, and the threshold information includes: for a continuous touch event, scaling the smaller value, where a continuous touch event is a touch event whose touch time interval is smaller than a duration threshold, the touch time interval being the difference between the pen-down time of the current touch event and the pen-up time of the previous touch event; and identifying the touch event corresponding to the touch operation according to the ratio, the scaled smaller value, and the threshold information.
In some possible implementations, scaling the smaller value of the touch point includes: acquiring the touch time interval of the continuous touch event; and scaling the smaller value according to the touch time interval and a preset scaling weight corresponding to the previous touch event.
In some possible implementations, scaling the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event includes: if the previous touch event is a fine nib touch event, scaling down the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event; or, if the previous touch event is a thick nib touch event or a finger touch event and the smaller value is larger than the mean of the corresponding smaller-value thresholds, scaling down the smaller value in the same way; or, if the previous touch event is a thick nib touch event or a finger touch event and the smaller value is smaller than or equal to the mean of the corresponding smaller-value thresholds, scaling up the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event.
In some possible implementations, before identifying the touch event corresponding to the touch operation according to the ratio, the smaller value, and the threshold information, the method further includes obtaining the threshold information as follows:
acquiring the target widths and target heights of a plurality of test points; for each test point, determining the ratio of the larger of its target width and target height to the smaller; classifying the ratios corresponding to the test points to determine a plurality of ratio ranges; and, for each ratio range, determining the smaller-value threshold corresponding to each touch event from the smaller values of the target widths and target heights of the test points within that range.
In a third aspect, an embodiment of the present application provides a touch event recognition device, applied to a display device, where the touch event recognition device includes:
the first acquisition module, configured to respond to a touch operation acting on the infrared touch screen and acquire the touch point coordinates corresponding to the touch operation;
the first determining module, configured to determine, from the touch point coordinates, a target width and a target height of the touch point corresponding to the touch operation;
the second determining module, configured to determine the ratio of the larger of the target width and the target height to the smaller, together with that smaller value;
the identification module, configured to identify the touch event corresponding to the touch operation according to the ratio, the smaller value, and threshold information, where the threshold information indicates the relation between the ratio, the smaller value, and their corresponding thresholds.
In some possible implementations, the second determining module is specifically configured to: if the target width is larger than the target height, take the ratio as the target width divided by the target height and the smaller value as the target height; or, if the target width is smaller than or equal to the target height, take the ratio as the target height divided by the target width and the smaller value as the target width.
In some possible implementations, the identification module is specifically configured to: determine, from the ratio and the thresholds corresponding to the ratio, the ratio range in which the ratio falls, where different ratio ranges correspond to different smaller-value thresholds; and identify the touch event corresponding to the touch operation from the smaller value and the smaller-value thresholds of that ratio range, the touch event being one of a fine nib touch event, a thick nib touch event, a finger touch event, and a palm touch event.
In some possible implementations, the first determining module is specifically configured to: acquire the touch point coordinates of a preset number of touch points; determine the actual width and actual height of each corresponding touch point from its coordinates; take the average width of the preset number of touch points as the target width of the touch point corresponding to the touch operation; and take the average height of the preset number of touch points as the target height of the touch point corresponding to the touch operation.
In some possible implementations, the first determining module is specifically configured to: determine the actual width and actual height of the touch point from the touch point coordinates; transform the actual width and the actual height according to the aspect ratio of the touch screen; take the transformed width as the target width of the touch point corresponding to the touch operation; and take the transformed height as the target height of the touch point corresponding to the touch operation.
In some possible implementations, the identification module is specifically configured to: for a continuous touch event, scale the smaller value, where a continuous touch event is a touch event whose touch time interval is smaller than a duration threshold, the touch time interval being the difference between the pen-down time of the current touch event and the pen-up time of the previous touch event; and identify the touch event corresponding to the touch operation according to the ratio, the scaled smaller value, and the threshold information.
In some possible implementations, the identification module is specifically configured to: acquire the touch time interval of the continuous touch event; and scale the smaller value according to the touch time interval and a preset scaling weight corresponding to the previous touch event.
In some possible implementations, when scaling the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event, the identification module is specifically configured to: if the previous touch event is a fine nib touch event, scale down the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event; or, if the previous touch event is a thick nib touch event or a finger touch event and the smaller value is larger than the mean of the corresponding smaller-value thresholds, scale down the smaller value in the same way; or, if the previous touch event is a thick nib touch event or a finger touch event and the smaller value is smaller than or equal to the mean of the corresponding smaller-value thresholds, scale up the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event.
In some possible implementations, the device further includes a second acquisition module, configured to obtain the threshold information, before the touch event corresponding to the touch operation is identified according to the ratio, the smaller value, and the threshold information, as follows:
acquiring the target widths and target heights of a plurality of test points; for each test point, determining the ratio of the larger of its target width and target height to the smaller; classifying the ratios corresponding to the test points to determine a plurality of ratio ranges; and, for each ratio range, determining the smaller-value threshold corresponding to each touch event from the smaller values of the target widths and target heights of the test points within that range.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing computer program instructions which, when executed, implement the method for identifying a touch event according to the second aspect of the present application.
In a fifth aspect, an embodiment of the present application provides a computer program product, including a computer program, where the computer program when executed by a processor implements a method for identifying a touch event according to the second aspect of the present application.
The present application provides a display device and a method and apparatus for identifying touch events. The display device comprises an infrared touch screen, configured to respond to a touch operation acting on it, acquire the touch point coordinates corresponding to the touch operation, and transmit the touch point coordinates to a processor; and a processor connected to the infrared touch screen, configured to determine, from the touch point coordinates, a target width and a target height of the touch point corresponding to the touch operation; determine the ratio of the larger of the target width and the target height to the smaller, together with that smaller value; and identify the touch event corresponding to the touch operation according to the ratio, the smaller value, and threshold information, where the threshold information indicates the relation between the ratio, the smaller value, and their corresponding thresholds. Because the touch event corresponding to a touch operation is identified from the ratio of the larger of the touch point's target width and target height to the smaller, the smaller value itself, and the threshold information, touch events can be identified accurately and user experience is improved.
These and other aspects of the application will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
Drawings
To illustrate the embodiments of the present application or the related art more clearly, the drawings needed for describing them are briefly introduced below. The drawings described here show only some embodiments of the present application; a person of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an operation scenario between a display device and a user according to an embodiment;
FIG. 2 is a hardware configuration block diagram of the display device 200 according to an exemplary embodiment;
FIG. 3 is a schematic diagram of a software system of a display device according to the present application;
FIG. 4 is a flowchart of a method for identifying a touch event according to an embodiment of the present application;
FIG. 5a is a schematic diagram of the touch point of a stylus held at 90° to the horizontal according to an embodiment of the present application;
FIG. 5b is a schematic diagram of the touch point of a stylus held at 45° to the horizontal according to an embodiment of the present application;
FIG. 5c is a schematic diagram of the touch point of a stylus held at 30° to the horizontal according to an embodiment of the present application;
FIG. 6 is a schematic diagram of threshold analysis of coarse and fine nibs according to an embodiment of the present application;
FIG. 7 is a flowchart of a method for identifying a touch event according to another embodiment of the present application;
FIG. 8 is a flowchart of a method for recognizing a touch event according to another embodiment of the present application;
FIG. 9 is a diagram illustrating timestamp information according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a high frequency range of a thick nib according to an embodiment of the present application;
FIG. 11 is a schematic structural diagram of a touch event recognition device according to an embodiment of the application.
Detailed Description
For the purposes of making the objects, embodiments, and advantages of the present application more apparent, exemplary embodiments of the present application will be described more fully below with reference to the accompanying drawings, in which exemplary embodiments of the application are shown. It should be understood that the exemplary embodiments described are merely some, not all, of the examples of the application.
Based on the exemplary embodiments described herein, all other embodiments obtained by a person of ordinary skill in the art without inventive effort fall within the scope of the appended claims. Furthermore, while the present disclosure has been described in terms of one or more exemplary embodiments, it should be understood that each aspect of the disclosure can be practiced separately from the others.
It should be noted that the brief description of the terminology in the present application is for the purpose of facilitating understanding of the embodiments described below only and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first," second, "" third and the like in the description and in the claims and in the above drawings are used for distinguishing between similar or similar objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated (Unless otherwise indicated). It is to be understood that the terms so used are interchangeable under appropriate circumstances such that the embodiments of the application are, for example, capable of operation in sequences other than those illustrated or otherwise described herein.
Furthermore, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" as used in this disclosure refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the function associated with that element.
The term "remote control" as used herein refers to a component of an electronic device (such as a display device as disclosed herein) that can be controlled wirelessly, typically over a relatively short distance. Typically, the electronic device is connected to the electronic device using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as WiFi, wireless USB, bluetooth, motion sensors, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in a general remote control device with a touch screen user interface.
The term "gesture" as used herein refers to a user action by a change in hand shape or hand movement, etc., used to express an intended idea, action, purpose, or result.
A schematic diagram of an operation scenario between a display device and a user according to an embodiment is exemplarily shown in fig. 1. As shown in fig. 1, a user may perform writing operation on an infrared touch screen of the display device 200 through the stylus 100, and the processor of the display device 200 receives the touch operation on the infrared touch screen, obtains data of a touch point corresponding to the touch operation, and identifies a touch event corresponding to the touch operation according to the data of the touch point.
As also shown in fig. 1, the display device 200 is in data communication with the server 400 via various communication means. The display device 200 may establish communication connections over a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various content and interactions to the display device 200. By way of example, the display device 200 may receive software program updates, access a remotely stored digital media library, or exchange electronic program guide (EPG) interactions by sending and receiving information. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers. The server 400 also provides other web service content such as video on demand and advertising services.
The embodiments of the present application are not limited to the type, size, resolution, etc. of the particular display device 200, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
A hardware configuration block diagram of the display device 200 according to an exemplary embodiment is illustrated in fig. 2.
In some embodiments, at least one of the controller 250, the modem 210, the communicator 220, the detector 230, the input/output interface 255, the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the display apparatus 200.
In some embodiments, the display 275 is configured to receive image signals from the first processor output, and to display video content and images and components of the menu manipulation interface.
In some embodiments, display 275 includes a display screen assembly for presenting pictures, and a drive assembly for driving the display of images.
In some embodiments, the displayed video content may come from broadcast television, or from various broadcast signals received via wired or wireless communication protocols. Alternatively, various image contents sent by a network server over a network communication protocol may be displayed.
In some embodiments, the display 275 is used to present a user-manipulated UI interface generated in the display device 200 and used to control the display device 200.
In some embodiments, depending on the type of display 275, a drive assembly for driving the display is also included.
In some embodiments, display 275 is a projection display and may further include a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver.
In some embodiments, the display device 200 may establish control signal and data signal transmission and reception between the communicator 220 and an external control device or a content providing device.
In some embodiments, the user interface 265 may be used to receive infrared control signals from a control device (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is a component the display device 200 uses to collect signals from the external environment or to interact with it.
In some embodiments, the detector 230 includes a light receiver, i.e., a sensor for capturing the intensity of ambient light, so that display parameters can adapt to the ambient light, and the like.
In some embodiments, the detector 230 may further include an image collector, such as a camera, a video camera, etc., which may be used to collect external environmental scenes, collect attributes of a user or interact with a user, adaptively change display parameters, and recognize a user gesture to realize an interaction function with the user.
In some embodiments, the detector 230 may also include a temperature sensor or the like, such as by sensing ambient temperature.
In some embodiments, the display device 200 may adaptively adjust the display color temperature of the image. For example, the display device 200 can be adjusted to render the image with a cooler color temperature when the ambient temperature is high, or with a warmer one when the ambient temperature is low.
In some embodiments, the detector 230 may also be a sound collector or the like, such as a microphone, that may be used to receive the user's sound. Illustratively, a voice signal including a control instruction for a user to control the display apparatus 200, or an acquisition environmental sound is used to recognize an environmental scene type so that the display apparatus 200 can adapt to environmental noise.
In some embodiments, as shown in fig. 2, the input/output interface 255 is configured to enable data transfer between the controller 250 and other external devices or other controllers 250, such as receiving video signal data, audio signal data, or command instruction data from an external device.
In some embodiments, external device interface 240 may include, but is not limited to, the following: any one or more interfaces of a high definition multimedia interface HDMI interface, an analog or data high definition component input interface, a composite video input interface, a USB input interface, an RGB port, and the like can be used. The plurality of interfaces may form a composite input/output interface.
In some embodiments, as shown in fig. 2, the modem 210 receives broadcast television signals in a wired or wireless manner, performs modulation and demodulation processing such as amplification, mixing, and resonance, and demodulates, from among the many wired or wireless broadcast television signals, the audio/video signal carried in the television channel frequency selected by the user, together with the EPG data signal.
In some embodiments, the frequency point demodulated by the modem 210 is controlled by the controller 250, and the controller 250 may send a control signal according to the user selection, so that the modem responds to the television signal frequency selected by the user and modulates and demodulates the television signal carried by the frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to a broadcasting system of the television signal. Or may be differentiated into digital modulation signals, analog modulation signals, etc., depending on the type of modulation. Or it may be classified into digital signals, analog signals, etc. according to the kind of signals.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like. In this way, the set-top box outputs the television audio and video signals modulated and demodulated by the received broadcast television signals to the main body equipment, and the main body equipment receives the audio and video signals through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored on the memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command to select to display a UI object on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation of connecting to a hyperlink page, a document, an image, or the like, or executing an operation of a program corresponding to the icon. The user command for selecting the UI object may be an input command through various input means (e.g., mouse, keyboard, touch pad, etc.) connected to the display device 200 or a voice command corresponding to a voice uttered by the user.
As shown in fig. 2, the controller 250 includes at least one of a random access memory 251 (RAM), a read-only memory 252 (ROM), a video processor 270, an audio processor 280, other processors 253 (e.g., a graphics processing unit, GPU), a CPU processor 254 (central processing unit), a communication interface, and a communication bus 256 that connects the components.
In some embodiments, RAM 251 is used to store temporary data for the operating system or other running programs, and in some embodiments ROM 252 is used to store various system boot instructions.
In some embodiments, ROM 252 stores a Basic Input Output System (BIOS), which comprises the driver programs and the boot operating system used to complete the power-on self-test of the system, the initialization of each functional module in the system, and the system's basic input/output.
In some embodiments, upon receiving a power-on signal, the display device 200 starts up, and the CPU runs the system boot instructions in ROM 252 and copies the temporary data of the operating system stored in memory into RAM 251 so that the operating system can start or run. Once the operating system has started, the CPU copies the temporary data of the various applications in memory into RAM 251 to start or run those applications.
In some embodiments, the CPU processor 254 executes the operating system and application program instructions stored in memory, and runs various applications, data, and content according to the interactive instructions received from outside, so as to finally display and play various audio and video content.
In some exemplary embodiments, the CPU processor 254 may comprise a plurality of processors: one main processor and one or more sub-processors. The main processor performs some operations of the display apparatus 200 in the pre-power-up mode and/or displays pictures in normal mode; the sub-processors handle operations in standby mode and the like.
In some exemplary embodiments, the display 275 of the display device 200 is an infrared touch screen; the display 275 then sends information about the touch point corresponding to a touch operation to the processor 254 in real time, and the processor 254 receives that information and identifies the touch event corresponding to the touch operation.
In some embodiments, the graphics processor 253 is configured to generate various graphical objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It comprises an arithmetic unit, which receives the various interactive instructions input by the user, performs operations, and displays the various objects according to their display attributes; and a renderer, which renders the objects produced by the arithmetic unit for display on the display.
In some embodiments, the video processor 270 receives external video signals and performs video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image composition according to the standard codec protocol of the input signal, yielding a signal that can be displayed or played directly on the display device 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image compositing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio/video data stream, such as the input MPEG-2, and demultiplexes the input audio/video data stream into video signals, audio signals and the like.
And the video decoding module is used for processing the demultiplexed video signals, including decoding, scaling and the like.
And an image synthesis module, such as an image synthesizer, which superimposes and mixes the GUI signal, input by the user or generated by the graphics generator itself, with the scaled video image to generate an image signal for display.
The frame rate conversion module converts the frame rate of the input video, for example from 60 Hz to 120 Hz or 240 Hz, commonly by frame insertion.
The display formatting module converts the received frame-rate-converted video signal into an output signal conforming to the display format, such as an RGB data signal.
In some embodiments, the graphics processor 253 may be integrated with the video processor or configured separately. When integrated, it can process the graphics signals output to the display; when separate, the two perform different functions, for example a GPU + FRC (frame rate conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the audio signal according to a standard codec protocol of an input signal, and perform noise reduction, digital-to-analog conversion, and amplification processing, so as to obtain a sound signal that can be played in a speaker.
In some embodiments, video processor 270 may include one or more chips. The audio processor may also comprise one or more chips.
In some embodiments, video processor 270 and audio processor 280 may be separate chips or may be integrated together with the controller in one or more chips.
In some embodiments, the audio output receives, under the control of the controller 250, the sound signal output by the audio processor 280. Besides the speaker 286 carried by the display device 200 itself, it includes an external sound output terminal that can feed a sound-producing device of an external device, such as an external sound interface or a headphone jack; the communication interface may also include a near-field communication module, for example a Bluetooth module for sound output to a Bluetooth speaker.
The power supply 290 supplies power input from an external power source to the display device 200 under the control of the controller 250. The power supply 290 may include a power circuit built into the display device 200, or an external power supply connected through a power interface on the display device 200.
The user interface 265 is used to receive an input signal from a user and then transmit the received user input signal to the controller 250. The user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
In some embodiments, a user inputs a command through the control device or the mobile terminal 300; the user input interface receives the input, and the display apparatus 200 responds to it through the controller 250.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the display 275, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
In some embodiments, a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
The memory 260 includes memory storing various software modules for driving the display device 200. Such as: various software modules stored in the first memory, including: at least one of a base module, a detection module, a communication module, a display control module, a browser module, various service modules, and the like.
The base module is a bottom software module for signal communication between the various hardware in the display device 200 and for sending processing and control signals to the upper modules. The detection module is used for collecting various information from various sensors or user input interfaces and carrying out digital-to-analog conversion and analysis management.
For example, the voice recognition module includes a voice analysis module and a voice instruction database module. The display control module is used for controlling the display to display the image content, and can be used for playing the multimedia image content, the UI interface and other information. And the communication module is used for carrying out control and data communication with external equipment. And the browser module is used for executing data communication between the browsing servers. And the service module is used for providing various services and various application programs. Meanwhile, the memory 260 also stores received external data and user data, images of various items in various user interfaces, visual effect maps of focus objects, and the like.
Fig. 3 is a schematic diagram of a software system of a display device according to the present application. Referring to FIG. 3, in some embodiments, the system is divided into four layers, from top to bottom: an application layer (simply "application layer"), an application framework layer (Application Framework, simply "framework layer"), the Android runtime (Android Runtime) and system library layer (simply "system runtime layer"), and a kernel layer.
In some embodiments, at least one application program is running in the application program layer, and these application programs may be a Window (Window) program of an operating system, a system setting program, a clock program, a camera application, and the like; and may be an application program developed by a third party developer, such as a hi-see program, a K-song program, a magic mirror program, etc. In particular implementations, the application packages in the application layer are not limited to the above examples, and may actually include other application packages, which the embodiments of the present application do not limit.
The framework layer provides an application programming interface (API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions; it acts as a processing center that decides how the applications in the application layer act. Through the API interface, an application program can access the resources in the system and obtain the services of the system during execution.
As shown in fig. 3, the application framework layer in the embodiment of the present application includes a manager (Manager), a content provider (Content Provider), and the like, where the manager includes at least one of the following modules: an activity manager (Activity Manager), used to interact with all activities running in the system; a location manager (Location Manager), used to provide system services or applications with access to the system location service; a package manager (Package Manager), used to retrieve various information about the application packages currently installed on the device; a notification manager (Notification Manager), used to control the display and clearing of notification messages; and a window manager (Window Manager), used to manage the icons, windows, toolbars, wallpaper, and desktop widgets on the user interface.
In some embodiments, the activity manager is to: the lifecycle of each application program is managed, as well as the usual navigation rollback functions, such as controlling the exit of the application program (including switching the currently displayed user interface in the display window to the system desktop), opening, backing (including switching the currently displayed user interface in the display window to the previous user interface of the currently displayed user interface), etc.
In some embodiments, the window manager is configured to manage all window procedures, such as obtaining the display screen size, determining whether there is a status bar, locking the screen, capturing the screen, and controlling display window changes (e.g., scaling the display window down, dithering it, distorting it, etc.).
In some embodiments, the system runtime layer provides support for the layer above it, the framework layer. When the framework layer is in use, the Android operating system runs the C/C++ libraries contained in the system runtime layer to implement the functions the framework layer needs.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 3, the kernel layer contains at least one of the following drivers: audio drive, display drive, bluetooth drive, camera drive, WIFI drive, USB drive, HDMI drive, sensor drive (e.g., fingerprint sensor, temperature sensor, touch sensor, pressure sensor, etc.), and the like.
In some embodiments, the kernel layer further includes a power driver module for power management.
In some embodiments, the software programs and/or modules corresponding to the software architecture in fig. 3 are stored in the first memory or the second memory shown in fig. 2.
In some embodiments, taking the magic mirror application (a photographing application) as an example: when the remote control receiving device receives an input operation from the remote control, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the input operation into an original input event (including the value of the input operation, the timestamp of the input operation, etc.), and the original input event is stored at the kernel layer. The application framework layer obtains the original input event from the kernel layer, identifies the control corresponding to the event according to the current focus position, and treats the input operation as a confirmation operation. The control corresponding to the confirmation operation being the icon of the magic mirror application, the magic mirror application calls the interface of the application framework layer to start itself, and then starts the camera driver by calling the kernel layer, so that a still image or video is captured through the camera.
In some embodiments, for a display device with a touch function, taking a split-screen operation as an example: the display device receives an input operation (such as a split-screen operation) performed by a user on the display screen, and the kernel layer generates a corresponding input event from the input operation and reports the event to the application framework layer. The activity manager of the application framework layer sets the window mode (e.g., multi-window mode) and the window position and size corresponding to the input operation. The window manager of the application framework layer draws the window according to the settings of the activity manager and sends the drawn window data to the display driver of the kernel layer, which displays the application interfaces corresponding to the window data in different display areas of the display screen.
At present, when an interactive touch all-in-one machine identifies a touch event, the touch screen judges the touch point in order to distinguish the thin nib and the thick nib of a stylus. At low angles, for example when the pen is held at 45° and a short line is drawn, the false recognition rate of the touch event is high. Therefore, the application provides a display device and a touch event identification method and apparatus, which accurately identify thin-nib touch events, thick-nib touch events, finger touch events, and palm touch events by optimizing the selection of the touch points corresponding to a touch operation, the choice of judgment scheme, the setting of graded thresholds, and the pen-lift time, making user operation more convenient and improving the user experience.
The following detailed embodiments are used to describe how the present application recognizes touch events.
Fig. 4 is a flowchart of a method for identifying a touch event according to an embodiment of the application. As shown in fig. 4, the infrared touch screen in the display device is configured to perform steps S401 and S402, and the processor in the display device is configured to perform steps S403 to S405. Specifically:
in S401, in response to a touch operation acting on the infrared touch screen, coordinates of a touch point corresponding to the touch operation are acquired.
In the embodiment of the application, the infrared touch screen belongs, for example, to an interactive touch conference all-in-one machine. For example, the machine uses a 65-inch infrared touch frame with a touch speed of 7 ms, a touch precision of ±1 mm, and a minimum recognizable object of 2 mm. The open pluggable specification (Open Pluggable Specification, OPS) computer used is configured as an i5-8400 with 8 GB of RAM and a 128 GB SSD, running the Windows 10 Enterprise 1809 LTSC operating system. The development language used is C#, and the UI framework is the Microsoft Windows Presentation Foundation (WPF) framework.
Illustratively, a user uses a dual-color stylus to input touch operations on the infrared touch screen; the dual-color stylus may be a 3D-printed test pen. For example, in three-level recognition (distinguishing the thin nib and the thick nib of the dual-color stylus and the palm), the diameter of the thin nib is 3 mm and the diameter of the thick nib is 9 mm; in four-level recognition (distinguishing the thin nib and the thick nib of the dual-color stylus, the finger, and the palm), the diameter of the thin nib is 3 mm and the diameter of the thick nib is 6.2 mm.
In practical application, the infrared touch screen can monitor whether a user inputs touch operation in real time, and respond to the touch operation when the user inputs the touch operation, so as to obtain touch point coordinates corresponding to the touch operation.
In S402, touch point coordinates are transmitted to a processor.
In practical applications, the infrared touch screen transmits the monitored coordinates of the touch point to the processor. In one implementation, based on the point reporting rate, the infrared touch screen transmits the monitored touch point coordinates to the processor; in another implementation, the infrared touch screen transmits the monitored coordinates of the touch point to the processor in real time. Correspondingly, the processor receives the touch point coordinates and executes the identification of the touch event corresponding to the touch operation based on the touch point coordinates.
In S403, the target width and the target height of the touch point corresponding to the touch operation are determined according to the coordinates of the touch point.
In this step, the processor obtains the target width and target height of the touch point from the touch point coordinates. For example, the target height of a touch point corresponds to two coordinate values on the infrared touch screen, and the difference between the two coordinate values gives the target height of the touch point.
In S404, the ratio of the larger value to the smaller value and the smaller value of the target width and the target height are determined.
In the embodiment of the application, after the target width and target height of the touch point are obtained, they are compared so that the larger and smaller of the two can be determined, and the ratio of the larger value to the smaller value can be obtained. For example, if the target width is 50 pixels and the target height is 60 pixels, the target height is the larger value and the target width the smaller value; the ratio of the larger value to the smaller value is 1.2, and the smaller value is 50 pixels.
In S405, a touch event corresponding to the touch operation is identified according to the ratio, the smaller value, and threshold information, where the threshold information is used to indicate a relationship between the ratio, the smaller value, and the corresponding threshold.
In this step, threshold information is acquired in advance based on the statistical value, the threshold information indicating the relationship of the ratio, the smaller value, and the corresponding threshold. Wherein the thresholds for different touch events are different. After the ratio of the larger value to the smaller value in the target width and the target height and the smaller value are obtained, identifying a touch event corresponding to the touch operation according to the ratio, the smaller value and the threshold information.
In some embodiments, the threshold information may be obtained based on: acquiring target widths and target heights of a plurality of test points; for each test point, determining the ratio of a larger value to a smaller value in the target width and the target height of the test point; classifying the ratios corresponding to the test points, and determining a plurality of ratio ranges; and determining a smaller value threshold corresponding to each touch event according to smaller values in the target width and the target height of the test points in the ratio range for each ratio range.
In the embodiment of the application, two methods are compared to address the high false recognition rate of touch events at low angles: one is the method provided by the embodiment of the application, which may be called the smaller-edge judgment method; the other is a touch-point-area judgment method. Fig. 5a is a schematic diagram of the touch point when the pen is at 90° to the horizontal, fig. 5b at 45°, and fig. 5c at 30°. As shown in figs. 5a to 5c, as the angle changes between 90°, 45°, and 30°, the touch area of the thin nib of the dual-color pen changes greatly. Experiments show that the smaller-edge judgment method is more stable than the touch-point-area judgment method and is more accurate when writing at angles below 60°; the specific test results are shown in Table 1. When the writing angle is below 45°, the false recognition rate of the smaller-edge judgment method increases, as writing at such angles is itself more prone to misrecognition. Below 45°, the smaller-edge judgment method is on a par with the touch-point-area judgment method for horizontal strokes and superior to it for vertical short strokes.
Table 1: Comparison of the touch-point-area judgment method and the smaller-edge judgment method
In the comparison between the touch-point-area judgment method and the smaller-edge judgment method, the threshold information is determined by the smaller-edge judgment method. The specific process is as follows: obtain the target widths and target heights of a plurality of test points; for each test point, determine the ratio of the larger to the smaller of its target width and target height; classify the ratios of the test points and determine a plurality of ratio ranges. Illustratively, the ratios are divided into three layers, whose ratio ranges are the interval [1, 1.4), the interval [1.4, 2), and the interval [2, +∞). For each ratio range, the smaller-value threshold corresponding to each touch event is determined from the smaller of the target width and target height of the test points in that range. Dividing the ratios into three layers, each with its own smaller-value thresholds, makes the threshold setting more targeted and the selection more flexible and accurate; obtaining accurate thresholds through statistical analysis improves the recognition rate of touch events. For example, in the smaller-edge judgment method in Table 1, the ratio range [1, 1.4) has 3 smaller-value thresholds: 79.2, 146, and 370. When determining the thresholds, touch point data are collected in real time and written to a file; a large amount of data is collected by category and by angle; the collected data are cleaned (removing unused touch point identification information, such as the serial number of the touch point); and the data obtained after extensive testing are analyzed with software such as Minitab to obtain the thresholds between the thin nib and the thick nib, between the thick nib and the finger, and between the finger and the palm. FIG. 6 is a schematic diagram of the threshold analysis for the thick nib and the thin nib according to an embodiment of the present application. As shown in FIG. 6, it illustrates the smaller-edge values corresponding to the thick nib and the thin nib in different ratio ranges; for example, for the ratio range [1, 1.4), the smaller-edge threshold is determined from the boundary point between the thick-nib and thin-nib polylines in that range and the points around it.
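As a concrete illustration of the graded threshold information described above, the layers can be held in a small lookup table. The following C# sketch (C# being the development language named earlier) is a minimal, hypothetical encoding: only the thresholds 79.2, 146, and 370 for the ratio range [1, 1.4) appear in the text, so the values for the other two ranges are placeholders that would have to be replaced by statistically derived ones.

using System;
using System.Collections.Generic;

// One layer of the graded threshold information: a ratio range and the
// smaller-value thresholds separating thin nib / thick nib / finger / palm.
public sealed class ThresholdLayer
{
    public double RatioLow;            // inclusive lower bound of the ratio range
    public double RatioHigh;           // exclusive upper bound; +infinity for the last layer
    public double[] SmallerThresholds; // ascending, e.g. { 79.2, 146, 370 }

    public bool Contains(double ratio) => ratio >= RatioLow && ratio < RatioHigh;
}

public static class ThresholdInfo
{
    // The thresholds for [1.4, 2) and [2, +inf) below are assumptions;
    // only the [1, 1.4) values are given in the description.
    public static readonly List<ThresholdLayer> Layers = new List<ThresholdLayer>
    {
        new ThresholdLayer { RatioLow = 1.0, RatioHigh = 1.4, SmallerThresholds = new[] { 79.2, 146.0, 370.0 } },
        new ThresholdLayer { RatioLow = 1.4, RatioHigh = 2.0, SmallerThresholds = new[] { 70.0, 130.0, 340.0 } },
        new ThresholdLayer { RatioLow = 2.0, RatioHigh = double.PositiveInfinity, SmallerThresholds = new[] { 60.0, 120.0, 300.0 } },
    };
}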
In some scenarios, this touch event identification method can be applied to software for conference flat panels and education flat panels, making user operation more convenient and improving the user experience.
According to the touch event identification method provided by the embodiment of the application, the display device comprises an infrared touch screen configured to respond to a touch operation acting on it, acquire the touch point coordinates corresponding to the touch operation, and transmit the touch point coordinates to the processor; and a processor connected to the infrared touch screen and configured to: determine the target width and target height of the touch point corresponding to the touch operation according to the touch point coordinates; determine the ratio of the larger value to the smaller value of the target width and target height, together with the smaller value; and identify the touch event corresponding to the touch operation according to the ratio, the smaller value, and threshold information, where the threshold information indicates the relation between the ratio, the smaller value, and the corresponding thresholds. Because the touch event corresponding to the touch operation is identified from the ratio of the larger to the smaller of the target width and target height, the smaller value itself, and the threshold information, the touch event can be identified accurately and the user experience is improved.
The following describes how the processor recognizes the touch event in the display device according to the embodiment of the present application in detail with reference to specific steps. Fig. 7 is a flowchart of a method for identifying a touch event according to another embodiment of the present application. As shown in fig. 7, the processor in the display device is configured to perform the steps of:
in S701, the actual width and the actual height of the touch point are determined according to the touch point coordinates.
For example, after the touch point coordinates are obtained, the actual height of the touch point corresponds to two coordinate values on the infrared touch screen, and the difference between the two coordinate values gives the actual height of the touch point.
In S702, the actual width and the actual height are respectively transformed according to the screen ratio of the touch screen.
In the embodiment of the application, after the actual width and actual height of the touch point are obtained, they are respectively coordinate-transformed according to the screen aspect ratio of the touch screen. For example, with a screen ratio of 16:9, the transformed height is obtained by Formula 1:

height_transformed_i = height_i × 9   (Formula 1)

where height_transformed_i represents the transformed height of each touch point and height_i represents the actual height of each touch point.
The transformed width is obtained by Formula 2:

width_transformed_i = width_i × 16   (Formula 2)

where width_transformed_i represents the transformed width of each touch point and width_i represents the actual width of each touch point.
In S703, the average width of the coordinate transformed preset number of touch points is determined as the target width of the touch point corresponding to the touch operation.
In the embodiment of the application, the preset number is 3. After the actual widths of the preset number of touch points are coordinate-transformed according to the screen ratio of the touch screen, the average width of the transformed touch points can be determined; this average width is the target width of the touch point corresponding to the touch operation. Illustratively, the actual widths of the first 3 touch points are coordinate-transformed and denoted width_1, width_2, and width_3 respectively; the average width is obtained by Formula 3:

width_AVG = (width_1 + width_2 + width_3) / 3   (Formula 3)

where width_AVG represents the average width.
In S704, the average height after coordinate transformation of the preset number of touch points is determined as the target height of the touch point corresponding to the touch operation.
In the embodiment of the application, after the actual heights of the preset number of touch points are respectively coordinate-transformed according to the screen ratio of the touch screen, the average height of the transformed touch points can be determined; this average height is the target height of the touch point corresponding to the touch operation. Illustratively, the actual heights of the first 3 touch points are coordinate-transformed and denoted height_1, height_2, and height_3 respectively; the average height is obtained by Formula 4:

height_AVG = (height_1 + height_2 + height_3) / 3   (Formula 4)

where height_AVG represents the average height.
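The computation in S701 to S704 is small enough to show end to end. The following C# sketch is a minimal illustration assuming a 16:9 screen and a preset number of 3 points; the type and method names (TouchPoint, TargetSize.Compute) are illustrative and not taken from the original.

using System;
using System.Linq;

public struct TouchPoint
{
    public double Width;   // actual width reported by the infrared touch frame
    public double Height;  // actual height reported by the infrared touch frame
}

public static class TargetSize
{
    // Formulas 1-4: scale each width by 16 and each height by 9 (16:9 screen),
    // then average the first 3 points to obtain the target width and height.
    public static (double TargetWidth, double TargetHeight) Compute(TouchPoint[] points)
    {
        if (points.Length < 3)
            throw new ArgumentException("need at least the preset number (3) of touch points");

        double widthAvg  = points.Take(3).Average(p => p.Width * 16);  // Formulas 2 and 3
        double heightAvg = points.Take(3).Average(p => p.Height * 9);  // Formulas 1 and 4
        return (widthAvg, heightAvg);
    }
}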
In S705, it is determined whether the target width is greater than the target height.
In the embodiment of the application, after the target width and the target height are obtained, the sizes of the target width and the target height can be judged. If the target width is greater than the target height, step S706 is performed; if the target width is equal to or smaller than the target height, step S707 is performed.
In S706, the ratio is determined as the ratio of the target width to the target height, and the smaller value is the target height.
In the embodiment of the application, when the target width is larger than the target height, the ratio is determined as the ratio of the target width to the target height, and the smaller value is the target height. Illustratively, the target width of the touch point is width_AVG and the target height is height_AVG; if width_AVG is greater than height_AVG, the smaller value is height_AVG and the ratio is obtained by Formula 5:

Ratio = width_AVG / height_AVG   (Formula 5)

where Ratio represents the ratio.
In S707, the ratio is determined as the ratio of the target height to the target width, and the smaller value is the target width.
In the embodiment of the application, when the target width is smaller than or equal to the target height, the ratio is determined as the ratio of the target height to the target width, and the smaller value is the target width. Illustratively, if height_AVG is greater than width_AVG, the smaller value is width_AVG and the ratio is Ratio = height_AVG / width_AVG.
In S708, the ratio range in which the ratio lies is determined according to the ratio and its corresponding thresholds; the smaller-value thresholds corresponding to different ratio ranges are different.
In the embodiment of the application, after the ratio is obtained, the ratio range in which the ratio lies is determined according to the ratio and its corresponding thresholds. Illustratively, if the ratio is 1.2 and the ratio ranges are the interval [1, 1.4), the interval [1.4, 2), and the interval [2, +∞), then the ratio 1.2 lies in the interval [1, 1.4). The smaller-value thresholds differ across ratio ranges; for example, the range [1, 1.4) corresponds to 3 smaller-value thresholds: 79.2, 146, and 370.
In S709, a touch event corresponding to the touch operation is identified according to the smaller value and the smaller value threshold corresponding to the ratio range.
The touch event is one of a thin-nib touch event, a thick-nib touch event, a finger touch event, and a palm touch event.
In the embodiment of the application, after the ratio range in which the ratio lies is determined, the touch event corresponding to the touch operation is identified from the smaller value and the smaller-value thresholds of that range. Illustratively, if the ratio is 1.2 and the smaller value is 60, the ratio lies in the range [1, 1.4); as in the smaller-edge judgment method in Table 1, that range has 3 smaller-value thresholds: 79.2, 146, and 370. Since the smaller value 60 falls within (0, 79.2), the touch event corresponding to the touch operation is determined to be a thin-nib touch event.
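Putting S705 to S709 together, the identification reduces to a range lookup followed by a threshold comparison. The following C# sketch, which builds on the hypothetical ThresholdLayer table above, shows one way the described logic could look; it is an illustration, not the patented implementation itself.

public enum TouchEvent { ThinNib, ThickNib, Finger, Palm }

public static class TouchClassifier
{
    // S705-S707: ratio of the larger to the smaller of target width and height.
    public static (double Ratio, double Smaller) RatioAndSmaller(double targetWidth, double targetHeight)
    {
        return targetWidth > targetHeight
            ? (targetWidth / targetHeight, targetHeight)  // Formula 5
            : (targetHeight / targetWidth, targetWidth);
    }

    // S708-S709: find the ratio range, then compare the smaller value
    // against that range's graded smaller-value thresholds.
    public static TouchEvent Identify(double ratio, double smaller)
    {
        foreach (var layer in ThresholdInfo.Layers)
        {
            if (!layer.Contains(ratio)) continue;
            double[] t = layer.SmallerThresholds;  // e.g. { 79.2, 146, 370 } for [1, 1.4)
            if (smaller < t[0]) return TouchEvent.ThinNib;
            if (smaller < t[1]) return TouchEvent.ThickNib;
            if (smaller < t[2]) return TouchEvent.Finger;
            return TouchEvent.Palm;
        }
        throw new ArgumentOutOfRangeException(nameof(ratio), "ratio is expected to be >= 1");
    }
}

For the worked example above, Identify(1.2, 60) falls into the [1, 1.4) layer and, since 60 < 79.2, returns ThinNib, matching the text.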
According to the method for identifying the touch event, disclosed by the embodiment of the application, the average width and the average height of the preset number of touch points after coordinate transformation are used as the target width and the target height of the touch points corresponding to the touch operation, and then the touch event corresponding to the touch operation is identified based on the ratio of the larger value to the smaller value, the smaller value and the threshold value information in the target width and the target height of the touch points, so that the touch event can be accurately identified, and the user experience is improved.
Fig. 8 is a flowchart of a touch event recognition method according to another embodiment of the present application, and on the basis of the foregoing embodiment, the embodiment of the present application further describes how a processor in a display device recognizes a continuous touch event. As shown in fig. 8, the processor in the display device is configured to perform the steps of:
in S801, an actual width and an actual height of a touch point are determined according to the touch point coordinates.
In S802, the actual width and the actual height are respectively transformed according to the screen ratio of the touch screen.
In S803, the average width after coordinate transformation of the preset number of touch points is determined as the target width of the touch point corresponding to the touch operation.
In S804, the average height after coordinate transformation of the preset number of touch points is determined as the target height of the touch point corresponding to the touch operation.
In S805, the relative sizes of the target width and the target height are compared.
In S806, the ratio is determined as the ratio of the target width to the target height, and the smaller value is the target height.
In S807, the ratio is determined as the ratio of the target height to the target width, and the smaller value is the target width.
In S808, the ratio range in which the ratio is located is determined according to the ratio and the threshold corresponding to the ratio, and the smaller threshold corresponding to the different ratio ranges is different.
In the embodiment of the present application, the specific implementation process of S801 to S808 may refer to the related description of the embodiment shown in fig. 7, which is not repeated here.
In S809, scaling processing is performed on the smaller value for the continuous touch event.
The continuous touch event refers to a touch event with a touch time interval smaller than a duration threshold, and the touch time interval is a time difference between a pen down time of a current touch event and a pen up time of a previous touch event.
In the embodiment of the present application, the duration threshold is, for example, 1 second. Research shows that when using such software there are many continuous-operation scenarios, such as continuous writing and continuous line drawing, so handling continuous touch events well matters for product competitiveness. In continuous operation, however, the strokes follow each other within a short time: if the first recognition is wrong, subsequent rapid strokes keep being misrecognized; that is, changes in the touch area cause a high misrecognition rate over a short time. For these scenarios, the method of the application optimizes the judgment values through dynamic scaling. For example, analysis of the data protocol of an interactive touch all-in-one machine shows that the acquired touch data packets carry no timestamp information, which therefore has to be collected manually. The size (Size) data of a touch point is acquired through the down, move, and up events of WPF; the current pen-down time is obtained from the down event, and the pen-up time is recorded with the up event. Print tests show that the timestamp information is fairly stable. Fig. 9 is a schematic diagram of timestamp information provided in an embodiment of the present application; as shown in fig. 9, in the rectangular frame area, the print data obtained by testing continuous short-line drawing contains timestamp information in seconds, which satisfies the requirement of dynamically obtaining the pen-lift time in the continuous writing state. Obtaining an accurate dynamic time interval lays the foundation for the subsequent dynamic scaling of the smaller value. The recognition rate in continuous writing is optimized from the writing interval time: the smaller value in the continuous writing state (for example, when the pen-lift interval is less than 1 second) is scaled, and the shorter the writing interval, the greater the scaling, so that the judgment tends more strongly toward the previous mode. It should be noted that palm touch events differ greatly from finger touch events, so the smaller value of a palm touch event is not scaled.
In some embodiments, scaling the smaller value of the touch point may include: acquiring a touch time interval of a continuous touch event; and scaling the smaller value according to the touch time interval and the preset scaling weight corresponding to the last touch event.
Illustratively, the touch time interval is the writing interval time, which is obtained by Formula 6:

IntervalTime = StartTime − FinishTime   (Formula 6)

where IntervalTime represents the writing interval time, FinishTime represents the pen-up time of the previous stroke, and StartTime represents the pen-down time of the current stroke.
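The pen-down and pen-up timestamps can be captured directly from the WPF touch events mentioned earlier. The following is a minimal sketch assuming a standard WPF UIElement; the class and field names are illustrative, and in practice the computed interval would feed the scaling step below.

using System;
using System.Windows;
using System.Windows.Input;

public class StrokeTimer
{
    private DateTime _finishTime;       // pen-up time of the previous stroke
    public double IntervalTimeSeconds;  // Formula 6, in seconds

    public void Attach(UIElement surface)
    {
        surface.TouchDown += OnTouchDown;  // yields StartTime
        surface.TouchUp   += OnTouchUp;    // yields FinishTime
    }

    private void OnTouchDown(object sender, TouchEventArgs e)
    {
        DateTime startTime = DateTime.Now;
        // Formula 6: writing interval = current pen-down - previous pen-up.
        IntervalTimeSeconds = (startTime - _finishTime).TotalSeconds;
    }

    private void OnTouchUp(object sender, TouchEventArgs e)
    {
        _finishTime = DateTime.Now;
    }
}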
After the touch time interval is obtained, the smaller value can be scaled according to the touch time interval and the preset scaling weight corresponding to the last touch event.
In some embodiments, scaling the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event may further include: if the previous touch event is a thin-nib touch event, reducing the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event; or, if the previous touch event is a thick-nib touch event or a finger touch event and the smaller value is greater than the average of the corresponding smaller-value thresholds, reducing the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event; or, if the previous touch event is a thick-nib touch event or a finger touch event and the smaller value is smaller than or equal to the average of the corresponding smaller-value thresholds, enlarging the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event.
For example, if the previous touch event is a thin-nib touch event, the smaller value is scaled down according to Formula 7:

Side_shrunk = Side − (1 − IntervalTime) × thinPenWeight   (Formula 7)

where Side_shrunk represents the smaller value after reduction, Side represents the smaller value, and thinPenWeight is a preset scaling weight obtained through statistical analysis of the data, for example 15.6. Through the transformation of Formula 7, the smaller value becomes smaller, so the judgment tends more toward a thin-nib touch.
Illustratively, the average of the smaller-value thresholds is the average of the two bounding thresholds. In the smaller-edge judgment method of Table 1, for the thick-nib touch event the two bounding thresholds are 79.2 and 146, whose average is 112.6. If the previous touch event is a thick-nib touch event or a finger touch event, the two cases are handled in essentially the same way. For example, if the previous touch event is a thick-nib touch event and the smaller value is greater than the average of the corresponding smaller-value thresholds, the smaller value is scaled down according to Formula 8:
Side_shrunk = Side − (1 − IntervalTime) × thickPenWeight1   (Formula 8)

where thickPenWeight1 is a preset scaling weight obtained through statistical analysis of the data, for example 25.
For example, if the previous touch event is a thick-nib touch event and the smaller value is smaller than or equal to the average of the corresponding smaller-value thresholds, the smaller value is enlarged according to Formula 9:

Side_amplified = Side + (1 − IntervalTime) × thickPenWeight2   (Formula 9)

where thickPenWeight2 is a preset scaling weight obtained through statistical analysis of the data, for example 28, and Side_amplified represents the smaller value after enlargement.
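Formulas 7 to 9 amount to nudging the smaller value toward the previously recognized mode, with a strength that grows as the writing interval shrinks. A minimal C# sketch, assuming IntervalTime is in seconds and already known to be below the 1-second duration threshold, and reusing the TouchEvent enumeration from the earlier sketch:

public static class ContinuousScaling
{
    private const double ThinPenWeight   = 15.6; // value given in the text
    private const double ThickPenWeight1 = 25;   // value given in the text
    private const double ThickPenWeight2 = 28;   // value given in the text

    // thresholdAverage is the average of the two bounding thresholds of the
    // previous event's class, e.g. (79.2 + 146) / 2 = 112.6 for the thick nib.
    public static double Scale(double smaller, double intervalTime,
                               TouchEvent previous, double thresholdAverage)
    {
        double factor = 1 - intervalTime;  // shorter interval -> stronger scaling

        switch (previous)
        {
            case TouchEvent.ThinNib:
                return smaller - factor * ThinPenWeight;   // Formula 7
            case TouchEvent.ThickNib:
            case TouchEvent.Finger:
                return smaller > thresholdAverage
                    ? smaller - factor * ThickPenWeight1   // Formula 8
                    : smaller + factor * ThickPenWeight2;  // Formula 9
            default:
                return smaller;  // palm touch events are not scaled
        }
    }
}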
The preset scaling weights must be chosen carefully to ensure that the high-frequency interval is always correctly identified. Taking the thick nib of the dual-color stylus to be adapted by the conference machine as an example, fig. 10 is a schematic diagram of the high-frequency interval of the thick nib provided by an embodiment of the present application. As shown in fig. 10, when Ratio ∈ [1, 1.4), a smaller value within the interval [140, 194) must always be judged as a thick-nib touch event; that is, if the writing interval is short enough, a thin-nib touch event can only be determined when the smaller value is less than 140. The same holds for finger touch events, which can only be determined when the smaller edge is greater than or equal to 194. This guarantees correct judgment in the high-frequency interval and avoids runs of consecutively misrecognized strokes.
In S810, a touch event corresponding to the touch operation is identified according to the ratio, the scaled smaller value and the threshold information.
In the embodiment of the application, after the scaled smaller value is obtained, on the basis of the embodiment shown in fig. 7, the touch event corresponding to the touch operation is identified according to the ratio, the scaled smaller value and the threshold information.
Based on the above embodiments, four-level or three-level touch event identification can be realized according to product customization requirements; the difference is that three-level identification has one fewer class (no finger touch event) than four-level identification. For example, about 800 data samples were selected to test the three-level and four-level touch events, and the recognition accuracies obtained are shown in Table 2 and Table 3 respectively. Whether for three-level or four-level touch events, the method provided by the application reaches a very high recognition accuracy, and it has been successfully applied in the MR6B whiteboard software.
Table 2: Accuracy of the three-level recognition algorithm
Table 3: Accuracy of the four-level recognition algorithm
According to the method for identifying the touch events, disclosed by the embodiment of the application, for the continuous touch events, the touch events corresponding to the touch operation are identified according to the ratio, the scaled smaller value and the threshold information by scaling the smaller value, so that the continuous touch events can be accurately identified, and the user experience is improved.
The following are examples of the apparatus of the present application that may be used to perform the method embodiments of the present application. For details not disclosed in the embodiments of the apparatus of the present application, please refer to the embodiments of the method of the present application.
Fig. 11 is a schematic structural diagram of a touch event recognition device according to an embodiment of the application. The embodiment of the application provides a touch event recognition device which is applied to display equipment. As shown in fig. 11, the touch event recognition apparatus 1100 includes: a first acquisition module 1101, a first determination module 1102, a second determination module 1103, and an identification module 1104. Wherein:
the first obtaining module 1101 is configured to obtain touch point coordinates corresponding to a touch operation in response to the touch operation acting on the infrared touch screen;
the first determining module 1102 is configured to determine, according to the coordinates of the touch point, a target width and a target height of the touch point corresponding to the touch operation;
a second determining module 1103 for determining a ratio of a larger value to a smaller value and a smaller value of the target width and the target height;
the identifying module 1104 is configured to identify a touch event corresponding to the touch operation according to the ratio, the smaller value, and threshold information, where the threshold information is used to indicate a relationship between the ratio, the smaller value, and the corresponding threshold.
In some possible implementations, the second determining module 1103 is specifically configured to: if the target width is larger than the target height, determining the ratio as the ratio of the target width to the target height, and determining the smaller value as the target height; or if the target width is smaller than or equal to the target height, determining the ratio as the ratio of the target height to the target width, and determining the smaller value as the target width.
In some possible implementations, the identification module 1104 is specifically configured to: determining a ratio range in which the ratio is positioned according to the ratio and a threshold value corresponding to the ratio, wherein smaller value threshold values corresponding to different ratio ranges are different; and identifying a touch event corresponding to the touch operation according to the smaller value and the smaller value threshold corresponding to the ratio range, wherein the touch event is one of a fine pen point touch event, a thick pen point touch event, a finger touch event and a palm touch event.
In some possible implementations, the first determining module 1102 is specifically configured to: acquiring touch point coordinates of a preset number of touch points; determining the actual width and the actual height of the corresponding touch point according to the coordinates of the touch point; determining the average width of a preset number of touch points as the target width of the touch points corresponding to the touch operation; and determining the average height of the preset number of touch points as the target height of the touch point corresponding to the touch operation.
In some possible implementations, the first determining module 1102 is specifically configured to: determining the actual width and the actual height of the touch point according to the coordinates of the touch point; respectively carrying out coordinate transformation on the actual width and the actual height according to the screen proportion of the touch screen; determining the width after coordinate transformation as the target width of the touch point corresponding to the touch operation; and determining the height after coordinate transformation as the target height of the touch point corresponding to the touch operation.
In some possible implementations, the identification module 1104 is specifically configured to: for continuous touch events, scaling the smaller value, wherein the continuous touch events refer to touch events with a touch time interval smaller than a duration threshold, and the touch time interval is the time difference between the pen down time of the current touch event and the pen lifting time of the last touch event; and identifying a touch event corresponding to the touch operation according to the ratio, the scaled smaller value and the threshold information.
In some possible implementations, the identification module 1104 is specifically configured to: acquiring a touch time interval of a continuous touch event; and scaling the smaller value according to the touch time interval and the preset scaling weight corresponding to the last touch event.
In some possible implementations, the identification module 1104 is configured to, when performing scaling processing on the smaller value according to the touch time interval and the preset scaling weight corresponding to the previous touch event, specifically: if the last touch event is a fine pen point touch event, performing reduction processing on the smaller value according to the touch time interval and the preset scaling weight corresponding to the last touch event; or if the last touch event is a thick pen point touch event or a finger touch event and the smaller value is larger than the average value of the corresponding smaller value thresholds, performing reduction processing on the smaller value according to the touch time interval and the preset scaling weight corresponding to the last touch event; or if the last touch event is a thick pen point touch event or a finger touch event and the smaller value is smaller than or equal to the average value of the corresponding smaller value thresholds, amplifying the smaller value according to the touch time interval and the preset scaling weight corresponding to the last touch event.
In some possible implementations, the method further includes: the second obtaining module 1105 is configured to obtain, before identifying a touch event corresponding to a touch operation according to the ratio, the smaller value, and the threshold information, the threshold information based on the following manner:
Acquiring target widths and target heights of a plurality of test points; for each test point, determining the ratio of a larger value to a smaller value in the target width and the target height of the test point; classifying the ratios corresponding to the test points, and determining a plurality of ratio ranges; and determining a smaller value threshold corresponding to each touch event according to smaller values in the target width and the target height of the test points in the ratio range for each ratio range.
It should be noted that, the device provided in this embodiment may be used to execute the above-mentioned method for identifying a touch event, and its implementation manner and technical effects are similar, and this embodiment is not repeated here.
It should be understood that the division of the above apparatus into modules is merely a division of logical functions; in actual implementation, the modules may be fully or partially integrated into one physical entity or physically separated. The modules may all be implemented in the form of software invoked by a processing element, or all in hardware; alternatively, some modules may be implemented in software invoked by a processing element and some in hardware. For example, a processing module may be a separately established processing element, may be integrated into a chip of the above apparatus, or may be stored in a memory of the above apparatus in the form of program code, with its functions invoked and executed by a processing element of the apparatus. The implementation of the other modules is similar. In addition, all or part of these modules may be integrated together or implemented independently. The processing element here may be an integrated circuit with signal processing capability. In implementation, the steps of the above method or the above modules may be completed by integrated logic circuits of hardware in the processor element or by instructions in software form.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs). For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a CPU or another processor that can invoke program code. For another example, these modules may be integrated together and implemented in the form of a System-on-a-Chip (SoC).
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product comprises one or more computer programs. When the computer program instructions are loaded and executed on a computer, the processes or functions in accordance with embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer program may be stored in or transmitted from one computer readable storage medium to another, for example, a website, computer, server, or data center via a wired (e.g., coaxial cable, fiber optic, digital Subscriber Line (DSL)) or wireless (e.g., infrared, wireless, microwave, etc.). Computer readable storage media can be any available media that can be accessed by a computer or data storage devices, such as servers, data centers, etc., that contain an integration of one or more available media. Usable media may be magnetic media (e.g., floppy disks, hard disks, magnetic tape), optical media (e.g., DVD), or semiconductor media (e.g., solid State Disk (SSD)), among others.
The embodiment of the application also provides a computer readable storage medium, wherein a computer program is stored in the computer readable storage medium, and when the computer program is executed by a processor, the method for identifying touch events according to any one of the method embodiments is realized.
Embodiments of the present application also provide a computer program product, where the computer program product includes a computer program, where the computer program is stored in a computer readable storage medium, and at least one processor may read the computer program from the computer readable storage medium, where the at least one processor may implement the method for identifying a touch event according to any one of the method embodiments above when the computer program is executed.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (9)

1. A display device, characterized by comprising:
the infrared touch screen is configured to respond to touch operation acted on the infrared touch screen, acquire touch point coordinates corresponding to the touch operation, and transmit the touch point coordinates to the processor;
the processor coupled to the infrared touch screen, the processor configured to:
determining the target width and the target height of the touch point corresponding to the touch operation according to the touch point coordinates;
determining a ratio of a larger value to a smaller value of the target width and the target height;
identifying a touch event corresponding to the touch operation according to the ratio, the smaller value and threshold information, wherein the threshold information is used for indicating the relation between the ratio, the smaller value and the corresponding threshold;
The processor is configured to, when identifying a touch event corresponding to the touch operation according to the ratio, the smaller value and the threshold information, specifically:
determining a ratio range in which the ratio is positioned according to the ratio and a threshold value corresponding to the ratio, wherein smaller value threshold values corresponding to different ratio ranges are different;
and identifying a touch event corresponding to the touch operation according to the smaller value and a smaller value threshold corresponding to the ratio range, wherein the touch event is one of a thin pen point touch event, a thick pen point touch event, a finger touch event and a palm touch event.
2. The display device according to claim 1, wherein the processor, when being configured to determine the ratio of the larger value to the smaller value of the target width and the target height and the smaller value, is specifically configured to:
if the target width is larger than the target height, determining the ratio as the ratio of the target width to the target height, and the smaller value as the target height; or,
if the target width is smaller than or equal to the target height, determining the ratio as the ratio of the target height to the target width, and the smaller value as the target width.
3. The display device according to claim 1, wherein the processor is configured to, when configured to determine, according to the touch point coordinates, a target width and a target height of a touch point corresponding to the touch operation, specifically:
acquiring touch point coordinates of a preset number of touch points;
determining the actual width and the actual height of the corresponding touch point according to the touch point coordinates;
determining the average width of the preset number of touch points as the target width of the touch point corresponding to the touch operation;
and determining the average height of the preset number of touch points as the target height of the touch point corresponding to the touch operation.
4. The display device according to claim 1, wherein the processor is configured to, when configured to determine, according to the touch point coordinates, a target width and a target height of a touch point corresponding to the touch operation, specifically:
determining the actual width and the actual height of the touch point according to the touch point coordinates;
respectively carrying out coordinate transformation on the actual width and the actual height according to the screen proportion of the touch screen;
determining the width after coordinate transformation as the target width of the touch point corresponding to the touch operation;
And determining the height after coordinate transformation as the target height of the touch point corresponding to the touch operation.
5. The display device according to any one of claims 1 to 4, wherein the processor is configured to, when configured to identify a touch event corresponding to the touch operation according to the ratio, the smaller value, and threshold information, specifically:
for continuous touch events, scaling the smaller value, wherein the continuous touch events are touch events with a touch time interval smaller than a duration threshold, and the touch time interval is the time difference between the pen down time of the current touch event and the pen lifting time of the last touch event;
and identifying a touch event corresponding to the touch operation according to the ratio, the scaled smaller value and the threshold information.
6. The display device of claim 5, wherein the processor, when configured to scale the smaller value, is configured to:
acquiring a touch time interval of a continuous touch event;
and scaling the smaller value according to the touch time interval and the preset scaling weight corresponding to the last touch event.
7. The display device of claim 6, wherein the processor is configured to, when performing the scaling processing on the smaller value according to the touch time interval and a preset scaling weight corresponding to the previous touch event, specifically:
if the last touch event is a thin pen point touch event, performing reduction processing on the smaller value according to the touch time interval and a preset scaling weight corresponding to the last touch event; or,
if the last touch event is a thick pen point touch event or a finger touch event and the smaller value is larger than the average value of the corresponding smaller value thresholds, performing reduction processing on the smaller value according to the touch time interval and the preset scaling weight corresponding to the last touch event; or,
if the last touch event is a thick pen point touch event or a finger touch event and the smaller value is smaller than or equal to the average value of the corresponding smaller value thresholds, performing amplification processing on the smaller value according to the touch time interval and the preset scaling weight corresponding to the last touch event.
8. The display device of any one of claims 1 to 4, wherein the processor, prior to being configured to identify a touch event corresponding to the touch operation based on the ratio, the smaller value, and threshold information, is further configured to:
the threshold information is obtained based on the following manner:
acquiring target widths and target heights of a plurality of test points;
For each test point, determining the ratio of a larger value to a smaller value in the target width and the target height of the test point;
classifying the ratios corresponding to the test points, and determining a plurality of ratio ranges;
and determining a smaller value threshold corresponding to each touch event according to smaller values in the target width and the target height of the test points in the ratio range for each ratio range.
9. The method for identifying the touch event is characterized by being applied to a display device, and comprises the following steps:
responding to touch operation acted on an infrared touch screen, and acquiring touch point coordinates corresponding to the touch operation;
determining the target width and the target height of the touch point corresponding to the touch operation according to the touch point coordinates;
determining a ratio of a larger value to a smaller value of the target width and the target height;
identifying a touch event corresponding to the touch operation according to the ratio, the smaller value and threshold information, wherein the threshold information is used for indicating the relation between the ratio, the smaller value and the corresponding threshold;
and identifying the touch event corresponding to the touch operation according to the ratio, the smaller value and the threshold information, wherein the identifying comprises the following steps:
Determining a ratio range in which the ratio is positioned according to the ratio and a threshold value corresponding to the ratio, wherein smaller value threshold values corresponding to different ratio ranges are different;
and identifying a touch event corresponding to the touch operation according to the smaller value and a smaller value threshold corresponding to the ratio range, wherein the touch event is one of a thin pen point touch event, a thick pen point touch event, a finger touch event and a palm touch event.
CN202111506266.XA 2021-03-31 2021-12-10 Display device and touch event identification method Active CN114237419B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2021103454059 2021-03-31
CN202110345405 2021-03-31

Publications (2)

Publication Number Publication Date
CN114237419A CN114237419A (en) 2022-03-25
CN114237419B true CN114237419B (en) 2023-10-27

Family

ID=80754599

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111506266.XA Active CN114237419B (en) 2021-03-31 2021-12-10 Display device and touch event identification method

Country Status (1)

Country Link
CN (1) CN114237419B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115185411B (en) * 2022-07-08 2024-03-15 北京字跳网络技术有限公司 Cursor moving method and device and electronic equipment
CN115793893B (en) * 2023-02-07 2023-05-19 广州众远智慧科技有限公司 Touch writing handwriting generation method and device, electronic equipment and storage medium
CN115793892B (en) * 2023-02-07 2023-05-16 广州众远智慧科技有限公司 Touch data processing method and device, electronic equipment and storage medium
CN118132294A (en) * 2024-05-06 2024-06-04 海马云(天津)信息技术有限公司 Trigger event information processing method and device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100328241A1 (en) * 2009-06-12 2010-12-30 Keith Paulsen Method and system for measuring position on surface capacitance touch panel using a flying capacitor
JP5790203B2 (en) * 2011-06-29 2015-10-07 ソニー株式会社 Information processing apparatus, information processing method, program, and remote operation system
US10140011B2 (en) * 2011-08-12 2018-11-27 Microsoft Technology Licensing, Llc Touch intelligent targeting
TW201405363A (en) * 2012-07-26 2014-02-01 Hon Hai Prec Ind Co Ltd Application controlling system and method
CN106020678A (en) * 2016-04-29 2016-10-12 青岛海信移动通信技术股份有限公司 Method and device for implementing touch operation in mobile equipment
CN110543246A (en) * 2018-05-28 2019-12-06 深圳市鸿合创新信息技术有限责任公司 touch event processing method and touch screen device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015192490A1 (en) * 2014-06-17 2015-12-23 中兴通讯股份有限公司 Method, device, and terminal for removing jittering on touchscreen
CN106547433A (en) * 2016-11-07 2017-03-29 青岛海信电器股份有限公司 Written handwriting determination method and device
CN111158505A (en) * 2018-11-08 2020-05-15 鸿合科技股份有限公司 Thick-thin handwriting writing method and device and electronic equipment
CN111078108A (en) * 2019-12-13 2020-04-28 惠州Tcl移动通信有限公司 Screen display method and device, storage medium and mobile terminal
CN112468863A (en) * 2020-11-24 2021-03-09 北京字节跳动网络技术有限公司 Screen projection control method and device and electronic device

Also Published As

Publication number Publication date
CN114237419A (en) 2022-03-25

Similar Documents

Publication Title
CN114237419B (en) Display device and touch event identification method
EP3647922A1 (en) User terminal device and method for controlling the user terminal device thereof
US9720567B2 (en) Multitasking and full screen menu contexts
CN112181207B (en) Display device and geometric figure recognition method
CN112866773B (en) Display equipment and camera tracking method in multi-person scene
JP2012503801A (en) Object detection and user settings
US20190012129A1 (en) Display apparatus and method for controlling display apparatus
CN114157889B (en) Display equipment and touch control assisting interaction method
CN111949782A (en) Information recommendation method and service equipment
WO2020055480A1 (en) Multi-region detection for images
CN112165641A (en) Display device
CN113630569B (en) Display apparatus and control method of display apparatus
CN115129214A (en) Display device and color filling method
CN114115637A (en) Display device and electronic drawing board optimization method
CN111939561B (en) Display device and interaction method
CN113485613A (en) Display equipment and method for realizing free-drawing screen edge painting
CN113076031B (en) Display equipment, touch positioning method and device
KR20210023434A (en) Display apparatus and control method thereof
CN112926420B (en) Display device and menu character recognition method
CN116801027A (en) Display device and screen projection method
CN114760513A (en) Display device and cursor positioning method
WO2020156381A1 (en) Video playing method and terminal device
CN115390702A (en) Display device, touch point positioning method and device
CN111931692A (en) Display device and image recognition method
CN112199560A (en) Setting item searching method and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant