CN115562514B - Touch display method, graphical interface and related device - Google Patents

Info

Publication number
CN115562514B
Authority
CN
China
Prior art keywords
touch
coordinates
coordinate
compensation function
measured
Prior art date
Legal status
Active
Application number
CN202210193026.7A
Other languages
Chinese (zh)
Other versions
CN115562514A (en)
Inventor
邸皓轩
李丹洪
张晓武
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202210193026.7A
Publication of CN115562514A
Application granted
Publication of CN115562514B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0414 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/04162 Control or interface arrangements specially adapted for digitisers for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
    • G06F3/04166 Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G06F3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04105 Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position


Abstract

The application discloses a touch display method, a graphical interface and a related device. In the method, the touch device obtains a horizontal-axis compensation function and a vertical-axis compensation function in advance. The horizontal-axis compensation function represents the correspondence between a measured abscissa and a compensation value, and the vertical-axis compensation function represents the correspondence between a measured ordinate and a compensation value. When the touch device detects a touch operation of the input device acting on the touch screen, it obtains the measured coordinate corresponding to the touch coordinate of the operation, looks up the compensation value corresponding to the abscissa of the measured coordinate in the horizontal-axis compensation function, and adds that value to the measured abscissa to obtain a compensated abscissa; similarly, it looks up the compensation value corresponding to the ordinate of the measured coordinate in the vertical-axis compensation function and adds it to the measured ordinate to obtain a compensated ordinate, thereby obtaining the compensation coordinate corresponding to the measured coordinate. Finally, it displays the corresponding movement track according to a plurality of continuous compensation coordinates.

Description

Touch display method, graphical interface and related device
Technical Field
The present application relates to the field of terminals, and in particular, to a touch display method, a graphical interface, and a related device.
Background
With the popularization of electronic devices such as smartphones, tablets and drawing boards, touch screens and touch display technologies have also been widely adopted. When an input device such as a finger or an electronic pen writes or draws on the touch screen, the sensing units arranged in the touch screen detect capacitance changes and generate electric signals; the processor calculates, from these electric signals, the specific coordinates at which the input device lands on the touch screen using a center-of-gravity algorithm, a triangulation algorithm or the like, and finally drives the display screen to connect the successively detected coordinates into handwriting and display it. However, because the coordinate-calculation algorithm is not perfectly accurate and the hardware properties of the touch screen introduce noise, the handwriting output by the display screen is not the straight or smooth movement track that was input, but a wavy movement track.
How to solve the above problem is an urgent issue to be addressed.
Disclosure of Invention
The application provides a touch display method, a graphical interface and a related device. After the measured coordinates are obtained from the touch operation detected by the touch screen, they are compensated according to the difference between the measured coordinates and the theoretical coordinates; that is, each measured coordinate is superimposed with its corresponding compensation value to obtain a compensated coordinate, and finally a movement track formed by connecting a plurality of continuous compensated coordinates is displayed, so that the displayed track is smoother and straighter.
In a first aspect, the present application provides a touch display method applied to an electronic device. The method includes: a sensor in a touch screen of the electronic device acquires a first downlink signal sent by an input device; the electronic device determines a plurality of continuous first measured coordinates on the touch screen according to the positions at which the sensor acquires the first downlink signal; the electronic device obtains a first compensation value corresponding to each first measured coordinate from a compensation function and superimposes the first compensation value on the first measured coordinate to obtain a first compensation coordinate, where the compensation function represents the correspondence between a measured coordinate and a compensation value, and the compensation value indicates the difference between the measured coordinate and the corresponding theoretical coordinate; the electronic device displays a movement track formed by connecting a plurality of continuous first compensation coordinates.
By implementing the method provided in the first aspect, the measured coordinates can be compensated according to the difference between the measured coordinates and the theoretical coordinates; that is, each measured coordinate is superimposed with its corresponding compensation value to obtain a compensated coordinate, and a movement track formed by connecting a plurality of continuous compensated coordinates is finally displayed. This solves the wave or sawtooth effect of existing movement tracks and makes the displayed track smoother and straighter.
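For illustration only, the following minimal Python sketch shows how such per-point compensation could be applied at runtime. The compensation functions are assumed to already be available as callables (for example, polynomials obtained as described later); the function and variable names are illustrative and not part of the claimed method.

```python
from typing import Callable, List, Tuple

Point = Tuple[float, float]

def compensate_point(measured: Point,
                     h_comp: Callable[[float], float],
                     v_comp: Callable[[float], float]) -> Point:
    """Superimpose the per-axis compensation values on one measured coordinate."""
    x, y = measured
    # Each compensation value approximates (theoretical - measured), so adding it
    # moves the measured coordinate toward the theoretical coordinate.
    return (x + h_comp(x), y + v_comp(y))

def compensate_trajectory(measured_points: List[Point],
                          h_comp: Callable[[float], float],
                          v_comp: Callable[[float], float]) -> List[Point]:
    """Compensate every sampled coordinate; the display then connects the results."""
    return [compensate_point(p, h_comp, v_comp) for p in measured_points]
```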
With reference to the first aspect, the compensation function includes a horizontal-axis compensation function and a vertical-axis compensation function, and the compensation value includes a horizontal-axis compensation value and a vertical-axis compensation value. The horizontal-axis compensation function represents the correspondence between the abscissa of a measured coordinate and the horizontal-axis compensation value, which indicates the difference between the abscissa of the measured coordinate and the abscissa of the theoretical coordinate; the vertical-axis compensation function represents the correspondence between the ordinate of a measured coordinate and the vertical-axis compensation value, which indicates the difference between the ordinate of the measured coordinate and the ordinate of the theoretical coordinate.
In this way, the electronic device can compensate both the abscissa and the ordinate of the measured coordinate according to the horizontal-axis compensation function and the vertical-axis compensation function to obtain a compensation coordinate closer to the theoretical coordinate, which solves the wave or sawtooth effect of existing movement tracks and makes the displayed track smoother and straighter.
With reference to the first aspect, the sensors are arranged in a first direction and a second direction, the first direction being perpendicular to the second direction. The horizontal-axis compensation function is obtained by a second device as follows: the input device moves across the touch screen parallel to the first direction at a first speed while sending a downlink signal, where the first speed is constant and lower than a preset speed; a plurality of continuous measured coordinates are determined according to the positions at which the sensor acquires the downlink signal; a plurality of continuous theoretical coordinates are determined according to the initial measured coordinate, the final measured coordinate and the first speed; and the horizontal-axis compensation function is obtained from the plurality of continuous measured coordinates and the plurality of continuous theoretical coordinates.
In this way, the second device can collect experimental data (that is, measured coordinates) and theoretical data through a controlled experiment and derive an accurate compensation function through mathematical operations, so that the compensation effect is more accurate.
With reference to the first aspect, obtaining the horizontal-axis compensation function from the plurality of continuous measured coordinates and the plurality of continuous theoretical coordinates specifically includes: the second device performs data fitting on the plurality of continuous measured coordinates to obtain a horizontal-axis fitting function; the second device obtains a plurality of differences between the horizontal-axis fitting function and the plurality of continuous theoretical coordinates; and the second device performs data fitting on the plurality of differences to obtain the horizontal-axis compensation function.
In this way, the second device can collect experimental and theoretical data through a controlled experiment and process the experimental data (that is, the measured coordinates) by data fitting, obtaining a function that describes the relationship within the measured data without having to process each individual data point, which simplifies the subsequent coordinate compensation.
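As a rough illustration of this fitting procedure, the sketch below derives a horizontal-axis compensation function with NumPy polynomial fitting from the measured and theoretical abscissas of the calibration stroke. The use of polynomials and the chosen degree are assumptions made for illustration; the patent does not prescribe a particular fitting model.

```python
import numpy as np

def fit_horizontal_compensation(theoretical_x: np.ndarray,
                                measured_x: np.ndarray,
                                degree: int = 5) -> np.poly1d:
    """Return a callable mapping a measured abscissa to its compensation value."""
    # Step 1: fit a function describing the measured abscissas (the fitting function).
    fit_measured = np.poly1d(np.polyfit(theoretical_x, measured_x, degree))

    # Step 2: differences between the theoretical abscissas and the fitting function.
    diffs = theoretical_x - fit_measured(theoretical_x)

    # Step 3: fit the differences against the (fitted) measured abscissas so the result
    # can be looked up with a measured value at runtime: comp(measured) ~ theoretical - measured.
    return np.poly1d(np.polyfit(fit_measured(theoretical_x), diffs, degree))
```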
With reference to the first aspect, the horizontal-axis compensation function and the vertical-axis compensation function may be the same or different. When the sensors are arranged according to the same rule in the first direction and the second direction and have the same hardware structure, the two functions are the same; otherwise, they are different.
In this way, when the sensors in the touch screen are arranged according to the same rule in the horizontal and vertical directions and have the same hardware structure, only the compensation function for one arrangement direction needs to be obtained, and the coordinates in both arrangement directions can be compensated with that single function, which simplifies the acquisition of the compensation function.
With reference to the first aspect, the second device and the electronic device are the same device or different devices.
In this way, the compensation function may be acquired by the electronic device itself, or acquired by another device and then stored in the electronic device.
With reference to the first aspect, before the electronic device displays the movement track formed by connecting the plurality of continuous first compensation coordinates, the method further includes: the electronic device runs and displays an interface provided by a drawing or writing application.
In this way, when a user draws or writes on the touch screen of the electronic device, the method provided by this solution makes the written or drawn strokes appear as smoother lines, improving the user experience.
With reference to the first aspect, the input device includes, but is not limited to, an active capacitive pen.
Because the tip of an active capacitive pen is thin and the sensor's detection of its downlink signal is not ideal, conventionally displayed handwriting is poor and shows obvious wavy lines; after the solution provided by the application is adopted, the final handwriting display is smoother and straighter.
With reference to the first aspect, when the input device is the active capacitive pen, the downlink signal is specifically an excitation signal sent by the driving unit in the active capacitive pen to the pen tip.
In this way, driven by the active capacitive pen, the sensor in the electronic device can effectively receive the downlink signal, ensuring smooth display of the movement track.
With reference to the first aspect, the types of the downlink signal include, but are not limited to, three types: square wave, sine wave and triangular wave.
In this way, the active capacitive pen can carry the downlink signal on various types of signals, improving the feasibility of the application.
With reference to the first aspect, when the input device is the active capacitive pen, the downlink signal carries pressure information of the active capacitive pen, and the pressure information is related to the display effect of the movement track.
In this way, the electronic device can control the display effect of the movement track according to the pressure information sent by the active capacitive pen; for example, the greater the pressure, the thicker the displayed track, and the smaller the pressure, the thinner the displayed track, so that the user can write or draw according to personal needs.
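A hypothetical mapping from the reported pressure to a stroke width is sketched below; the value range and the linear mapping are illustrative assumptions only and are not specified by the patent.

```python
def stroke_width(pressure: float,
                 min_width: float = 1.0,
                 max_width: float = 8.0,
                 max_pressure: float = 4096.0) -> float:
    """Map pen pressure to a display line width: higher pressure, thicker stroke."""
    ratio = max(0.0, min(pressure / max_pressure, 1.0))  # clamp to [0, 1]
    return min_width + ratio * (max_width - min_width)
```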
In a second aspect, the present application provides a chip applied to an electronic device, the chip comprising one or more processors configured to invoke computer instructions to cause the electronic device to perform the method described in any implementation of the first aspect.
In a third aspect, the present application provides a computer-readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform the method described in any implementation of the first aspect.
In a fourth aspect, the present application provides an electronic device comprising one or more processors and one or more memories, wherein the one or more memories are coupled to the one or more processors and store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method described in any implementation of the first aspect.
Drawings
Fig. 1 is a schematic view of an application scenario of a touch method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an active pen 200 according to an embodiment of the present application;
fig. 3A is a schematic diagram of sensor arrangement of a touch screen 110 according to an embodiment of the present application;
fig. 3B to fig. 3C are schematic diagrams illustrating a method for detecting coordinates of a touch screen 110 according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a handwriting display interface according to an embodiment of the present application;
fig. 5 is a schematic flow chart of a touch display method according to an embodiment of the present application;
FIG. 6 is a schematic flow chart of a compensation function obtaining method according to an embodiment of the present application;
fig. 7 is a schematic diagram of a correspondence between a theoretical abscissa and an actually measured abscissa according to an embodiment of the present application;
FIG. 8 is a graph illustrating a cross-axis fitting function according to an embodiment of the present application;
FIG. 9 is a graph illustrating a cross-axis fit difference function according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a correspondence between a compensation abscissa and a theoretical abscissa according to an embodiment of the present application;
FIG. 11 is a schematic diagram of another handwriting display interface according to an embodiment of the present application;
fig. 12 is a schematic hardware structure diagram of a touch device 100 according to an embodiment of the present application;
Fig. 13 is a schematic software structure diagram of a touch device 100 according to an embodiment of the present application;
fig. 14 is a schematic hardware structure of an active pen 200 according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be described clearly and thoroughly below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation between associated objects and indicates that three relations may exist; for example, "A and/or B" may indicate: A exists alone, A and B exist together, or B exists alone.
The terms "first," "second," and the like, are used below for descriptive purposes only and are not to be construed as implying or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature, and in the description of embodiments of the application, unless otherwise indicated, the meaning of "a plurality" is two or more.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the described embodiments of the application may be combined with other embodiments.
The term "User Interface (UI)" in the following embodiments of the present application is a media interface for interaction and information exchange between an application program or an operating system and a user, which enables conversion between an internal form of information and a form acceptable to the user. The user interface is a source code written in a specific computer language such as java, extensible markup language (extensible markup language, XML) and the like, and the interface source code is analyzed and rendered on the electronic equipment to finally be presented as content which can be identified by a user. A commonly used presentation form of the user interface is a graphical user interface (graphic user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be a visual interface element of text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, etc., displayed in a display of the electronic device.
With the wide application of touch screens and touch display technologies, the defects of the current touch display technology have become apparent: because the algorithm used to calculate touch coordinates is not perfectly accurate and the hardware properties of the touch screen introduce noise, the handwriting output by the display screen is not a straight or smooth line but a wavy one. To address this, the application provides a touch display method, a graphical interface and a related device. In the method, the touch device obtains a horizontal-axis compensation function and a vertical-axis compensation function in advance. The horizontal-axis compensation function represents the correspondence between a measured abscissa and a compensation value, and the vertical-axis compensation function represents the correspondence between a measured ordinate and a compensation value. When the touch device detects a touch operation of the input device acting on the touch screen, it obtains the measured coordinate corresponding to the touch coordinate of the operation, looks up the compensation value corresponding to the abscissa of the measured coordinate in the horizontal-axis compensation function, and adds that value to the measured abscissa to obtain a compensated abscissa; similarly, it looks up the compensation value corresponding to the ordinate of the measured coordinate in the vertical-axis compensation function and adds it to the measured ordinate to obtain a compensated ordinate, thereby obtaining the compensation coordinate corresponding to the measured coordinate. Finally, the touch device displays the movement track formed by connecting a plurality of continuous compensation coordinates.
In the following embodiments of the present application, three kinds of coordinate points are involved, specifically as follows:
Touch coordinates: the coordinate point at which the input device actually contacts the touch screen; the touch coordinates may also be called the theoretical coordinates.
Measured coordinates: the coordinate point preliminarily calculated by the processor from the electric signals detected by the plurality of sensing units arranged in the touch screen.
Compensation coordinates: the coordinate point obtained, on the basis of the measured coordinates, by compensating the error, and which is closer to the touch point than the measured coordinates.
Therefore, the touch display method provided by the application can compensate the error introduced when the processor calculates the touch coordinates, so as to obtain compensation coordinates close to the actual touch coordinates (theoretical coordinates). This greatly reduces or eliminates the influence of the coordinate-calculation error, so that when the user writes or draws on the touch device, it can display straighter, smoother handwriting that matches the user's expectation.
The specific method for obtaining the compensation function is described in the method embodiments below and is not repeated here.
To describe the touch display method provided by the application more clearly, a schematic scenario to which the method applies is introduced first.
Referring to fig. 1, fig. 1 illustrates a schematic view of a scene to which the touch display method provided by the present application is applicable.
As shown in fig. 1, the scene includes a touch device 100 and an input device. In fig. 1, the touch device 100 is shown as a tablet and the input device as an electronic pen, by way of example. The input device may provide input to the touch device 100, and the touch device 100 performs an operation in response to that input. Specifically, the touch device 100 may be provided with a touch area (for example, the touch screen 110) in which the input device can write, draw and so on, so that the touch device can display the movement path of the input device, that is, display handwriting. The handwriting display method is specifically as follows: the touch device 100 obtains a horizontal-axis compensation function and a vertical-axis compensation function in advance, where the horizontal-axis compensation function represents the correspondence between a measured abscissa and a compensation value, and the vertical-axis compensation function represents the correspondence between a measured ordinate and a compensation value. When the touch device detects a touch operation of the input device acting on the touch screen, it obtains the measured coordinate corresponding to the touch coordinate of the operation, looks up the compensation value corresponding to the abscissa of the measured coordinate in the horizontal-axis compensation function, and adds that value to the measured abscissa to obtain a compensated abscissa; similarly, it looks up the compensation value corresponding to the ordinate of the measured coordinate in the vertical-axis compensation function and adds it to the measured ordinate to obtain a compensated ordinate, thereby obtaining the compensation coordinate corresponding to the measured coordinate. Finally, the touch device displays handwriting formed by connecting a plurality of continuous compensation coordinates.
In one embodiment, the touch device 100 and the input device may be interconnected through a communication network to exchange wireless signals. The communication network may be, but is not limited to: a WI-FI hotspot network, a WI-FI peer-to-peer (P2P) network, a Bluetooth network, a ZigBee network, or a near field communication (NFC) network.
It should be understood that the touch device 100 and the input device shown in fig. 1 are only examples; in other embodiments of the present application, the touch device 100 may be a device other than a tablet, and the input device may be a device other than an electronic pen. Specifically:
The touch device 100 may also be another device equipped with the touch screen 110, such as a mobile phone, a drawing tablet, an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device and/or a smart city device, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA). The embodiments of the present application do not specifically limit the type of the touch device 100.
The input device may also be a medium other than an electronic pen, such as a finger or another conductive medium. When the input device is an electronic pen, the electronic pen may be, but is not limited to, an inductive pen or a capacitive pen. When the input device is a capacitive pen, the capacitive pen may be a passive capacitive pen or an active capacitive pen; passive capacitive pens may also be called passive pens, and active capacitive pens may also be called active pens. One or more electrodes may be provided in the active capacitive pen (for example, inside the pen tip), through which the active capacitive pen can emit signals. When the electronic pen is an active capacitive pen, an electrode array must be integrated on the touch screen 110 of the touch device 100 that interacts with it. In one embodiment, the electrode array may be a capacitive electrode array. The touch device 100 can receive signals from the active capacitive pen through the electrode array and then recognize the position of the active capacitive pen on the touch screen, its tilt angle and so on based on the change of the capacitance values on the touch screen 110 when the signal is received. When the input device is an inductive pen, an electromagnetic induction board must be integrated on the touch screen 110 of the touch device 100 that interacts with it. Coils are distributed on the electromagnetic induction board, and a coil is also integrated in the inductive pen. Based on the electromagnetic induction principle, the inductive pen accumulates electric energy as it moves within the magnetic field generated by the electromagnetic induction board. The inductive pen can transmit the accumulated energy back to the electromagnetic induction board through its coil by free oscillation, and the board can then scan its own coils based on the energy received from the pen and calculate the position of the inductive pen on the touch screen 110.
It will be appreciated that, in one embodiment, the touch screen may also be called a touch panel, and the electronic pen may also be called a stylus or a handwriting pen; the embodiments of the present application do not limit the names of these devices.
Based on the above description of the scenario shown in fig. 1, the structures of the touch screen 110 and the input device in this scenario are described next. In the present application, the input device being an active capacitive pen (active pen 200 for short) and the touch screen 110 adapted to the active pen 200 are taken as examples to describe both structures in detail.
Referring to fig. 2, fig. 2 schematically illustrates a structural diagram of an active pen 200.
As shown in fig. 2, the active pen 200 includes a pen tip 201, a driving unit 202, a body 203, and a power supply unit 204.
The pen tip 201 may be internally provided with a pressure sensor (not shown) and one or more transmitting and receiving electrodes (not shown). The pressure sensor detects the pressure with which the pen tip acts on the touch screen 110 (that is, the external force received from the touch screen 110), so that the thickness of the line output by the touch device 100 when writing on the touch screen 110 can be controlled according to the pressure detected at the pen tip 201. The transmitting electrode is used to send a downlink signal to the touch screen 110 of the touch device 100, and the receiving electrode is used to receive an uplink signal sent by the touch screen 110 of the touch device 100.
The pen tip 201, the driving unit 202 and the power supply unit 204 may be electrically connected to one another through wires or a flexible circuit board. The power supply unit 204 may include a lithium-ion battery, or a nickel-chromium battery, an alkaline battery, a nickel-metal hydride battery, or the like. In one embodiment, the battery in the power supply unit 204 may be a rechargeable battery or a disposable battery; when it is rechargeable, the active pen 200 may charge it by wireless charging. Of course, the battery in the power supply unit 204 may also be charged in a wired manner, for example through a power connector (not shown) disposed at the end of the pen body 203 near the pen tail and connected to the power supply unit 204.
In the embodiment of the present application, the power supply unit 204 is configured to supply power to the driving unit 202, and the driving unit 202 may actively generate an excitation signal (also referred to as a downlink signal) and send the excitation signal to the pen tip 201. When the tip 201 of the active pen 200 contacts the touch screen 110, the capacitance value at the corresponding position of the touch screen 110 may change, and the touch device 100 may determine the position of the tip 201 of the active pen 200 on the touch screen 110 based on the capacitance value on the touch screen 110.
In an embodiment, the downlink signal may be a square wave, a sine wave, a triangular wave or another type of signal, and may carry pressure information, key information and other information of the active pen 200.
Referring to fig. 3A, fig. 3A schematically illustrates a sensor arrangement of the touch screen 110.
As shown in fig. 3A, when the input device is the active pen 200, the touch screen 110 adapted to the active pen 200 includes a plurality of sensor channels, such as a plurality of horizontal-axis sensor channels 111 arranged horizontally and a plurality of vertical-axis sensor channels 112 arranged vertically. The sensor channels are specifically provided as a number of mutually insulated electrode arrays. In one embodiment, the electrode array may be a capacitive electrode array. The touch device 100 can receive the downlink signal sent by the active pen 200 by scanning the electrode array, and then identify the position of the active pen 200 on the touch screen 110 (that is, the touch coordinates) based on the change of the capacitance values on the touch screen 110 when the downlink signal is received.
In the process of scanning the electrode array to determine the touch coordinates, the touch device 100 continuously scans the horizontal-axis sensor channels 111 and the vertical-axis sensor channels 112 to detect the capacitance change at any touch point in the effective area of the touch screen, and then calculates and outputs the horizontal and vertical coordinates from that capacitance change. It should be noted that the sensor channels in the touch screen 110 form a grid-like sensing circuit of horizontally and vertically interleaved capacitive electrodes. When the user holds the active pen 200 and writes on the touch screen 110, whenever the pen tip 201 lands on a point of the touch screen 110, the capacitance of multiple electrodes near that point changes; in particular, when the active pen 200 writes continuously on the touch screen 110, the capacitance of multiple electrodes changes near every coordinate point along its movement track. Although the processor of the touch device 100 can calculate coordinates from the capacitance changes using a specific algorithm, such as a center-of-gravity algorithm or a triangulation algorithm, errors may still occur between the finally calculated measured coordinates and the actual touch coordinates due to the hardware properties of the touch screen and the inaccuracy of the coordinate-calculation algorithm. That is, during actual writing, the detected measured coordinate of each touch coordinate along the movement track of the active pen may differ from the actual touch coordinate (that is, the theoretical coordinate); the measured coordinate may be any coordinate point around the touch coordinate, so the finally displayed handwriting differs from the actual track along which the user moves the active pen 200. This can be seen in fig. 3B to 3C.
As shown in fig. 3B, when the active pen 200 is held to input a line that is not perpendicular or parallel to the sensor channels in the touch screen 110, that line is the actual movement track of the active pen 200 on the touch screen 110 and includes contact coordinate 1, contact coordinate 2, contact coordinate 3, contact coordinate 4, and so on. However, because the sensor in the touch screen 110 detects capacitance changes at a plurality of positions near each contact coordinate and the coordinate-calculation algorithm is not exact, the finally calculated coordinates may be measured coordinate 1, measured coordinate 2, measured coordinate 3, measured coordinate 4, and so on, as shown in fig. 3C.
It should be understood that fig. 3B to 3C merely illustrate, by way of example, the error between the measured coordinates calculated by the touch device 100 and the touch coordinates when a line that is not perpendicular or parallel to the sensor channels is input on the touch screen 110. In other embodiments, when a straight line perpendicular or parallel to the sensor channels is input on the touch screen 110, there may also be an error between the measured coordinates calculated by the touch device 100 and the touch coordinates; however, the measured coordinates lie near the touch coordinates along the straight line itself (before or after them on a line perpendicular to the sensor channels, or to their left or right on a line parallel to the sensor channels), so the handwriting displayed after connecting the measured coordinates is still a straight line perpendicular or parallel to the sensor channels, and the user cannot observe that the touch coordinates were calculated incorrectly.
Referring to FIG. 4, FIG. 4 illustrates a schematic diagram of a handwriting display interface.
Lines 411A, 412A and 413A represent the actual movement track of the active pen 200 as it writes on the touch screen 110. Lines 411B, 412B and 413B are the corresponding handwriting that the touch device 100 displays after detecting the actual movement tracks 411A, 412A and 413A of the active pen 200.
It should be noted that the user interface 410 actually displayed by the touch device 100 includes only line 411B, line 412B and line 413B. The user interface 410 may be an interface provided by a writing or painting application installed on the touch device 100. Lines 411A, 412A and 413A shown in fig. 4 are not actually displayed by the touch device 100; they merely illustrate the actual movement track of the active pen 200 when it writes on the touch screen 110, and the actual movement track of the active pen 200 is not necessarily at the positions where lines 411A, 412A and 413A are drawn. When the display screen 120 of the touch device 100 is laminated with the touch screen 110, that is, when the touch screen 110 and the display screen 120 are integrated into one touch display screen, the actual movement track of the active pen 200 on the touch screen 110 largely coincides with the handwriting displayed on the display screen 120; when the touch screen 110 is separate from the body of the touch device but connected to it through wireless or wired communication, that is, when the sensor in the touch screen 110 is not laminated with the display screen 120, the actual movement track of the active pen 200 on the touch screen 110 corresponds to the handwriting displayed on the display screen 120.
Next, combining the descriptions of fig. 3A to 3C and fig. 4, it can be seen that only when the active pen 200 is held to input a straight line perpendicular to the horizontal-axis sensor channels (such as line 411A) or a straight line parallel to the horizontal-axis sensor channels (such as line 412A) on the touch screen 110 does the touch device 100 display the corresponding line 411B or line 412B with a straight-line effect; when a straight line or curve that is not perpendicular or parallel to the sensor channels (such as line 413A) is input on the touch screen 110, the touch device 100 displays a corresponding line 413B with a wave effect. In some embodiments, the line 413B corresponding to line 413A may also be jagged rather than wavy, and so on, which is not limited by the embodiments of the present application.
It will be appreciated that when the user writes or draws with the active pen 200, the movement track of the active pen 200 on the touch screen 110 is in most cases a straight line that is not perpendicular or parallel to the sensor channels, since it is difficult for a user holding the active pen 200 to draw a line strictly perpendicular or parallel to the sensor channels; as a result, the finally displayed handwriting is a wavy or jagged line instead of a straight one. The touch display method provided by the present application solves this problem well, as described in the following method embodiments.
Referring to fig. 5, fig. 5 schematically illustrates a flowchart of a touch display method provided by the present application.
As shown in fig. 5, the touch display method specifically includes the following steps:
s501, the touch equipment receives a downlink signal sent by the input equipment.
Specifically, the touch device 100 may activate the sensor in the touch screen 110 in advance to detect the downlink signal sent by the input device. One specific implementation of the sensor in the touch screen 110 is the sensor network composed of the electrode arrays shown in fig. 3A. In the embodiment of the present application, the touch device may keep the sensor in the touch screen working continuously after power-on, keep it working while the display screen is lit, or keep it working after detecting that a communication connection with the input device has been established. The embodiment of the present application does not limit the working time of the touch screen.
The input device may be, for example, a finger or an electronic pen, which is not limited by the embodiment of the present application. When the input device is specifically the active pen 200, the active pen 200 may send the downlink signal to the touch device 100 through any one or more of a transmitting electrode, a WI-FI module, a Bluetooth module, a ZigBee module and an NFC module, and the touch device 100 may correspondingly receive the downlink signal through any one or more of a receiving electrode, a WI-FI module, a Bluetooth module, a ZigBee module and an NFC module. The receiving electrode is one specific implementation form of the sensor in the touch screen 110.
In some embodiments, when the active pen 200 contacts the touch screen (for example, when the input device is held against the touch screen 110 of the touch device 100 to tap, write, and so on), the touch device 100 may receive the downlink signal and calculate the measured coordinates of the contact point on the touch screen from it. In other embodiments, when the active pen 200 hovers above the touch screen 110 within a preset distance, the touch device can also detect the downlink signal sent by the active pen 200. It can be appreciated that the information carried by the downlink signal may differ in different transmission scenarios: for example, when the active pen 200 contacts the touch screen 110, the downlink signal may carry pressure information so that the touch device can determine the thickness of the handwriting from it; when the active pen 200 does not contact the touch screen 110, the downlink signal carries other functional information, such as key, working-mode and tilt-angle information, for the touch device 100 to perform corresponding operations.
In some embodiments, when the input device is not an active pen, for example a finger, a passive pen or an inductive pen, the input device may not emit a downlink signal. When such an input device touches the touch screen, its conductive or electromagnetic-induction capability changes the capacitance, electric field and so on in the touch screen, so the touch device obtains the measured coordinates from the positions of the sensors where the change occurred.
S502, the touch device calculates, according to the downlink signal, a plurality of measured coordinates at which the input device lands on the touch screen.
When the touch device 100 receives the downlink signal, it may calculate the measured coordinates from the capacitance changes of the touch screen 110. When the input device lands at a certain position on the touch screen 110, the capacitance of several sensing units (that is, electrodes) around that position changes, and the capacitance change values differ at different positions. The sensor then sends the information about the electrodes whose capacitance change satisfies a preset condition to the processor, and the processor calculates the contact position of the input device on the touch screen 110, that is, the measured coordinates, using a coordinate algorithm.
For the specific algorithm used by the processor to calculate the coordinates, reference may be made to the prior art, such as a triangulation algorithm or a center-of-gravity algorithm; this is not limited or described in detail in the embodiments of the present application.
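For reference, a center-of-gravity (weighted centroid) estimate is commonly computed as a capacitance-weighted average of the electrode positions near the touch point. The sketch below shows only that generic form; it is not necessarily the exact algorithm used by the touch device 100.

```python
from typing import List, Tuple

def weighted_centroid(samples: List[Tuple[float, float, float]]) -> Tuple[float, float]:
    """Estimate a measured coordinate from (x, y, delta_capacitance) electrode samples."""
    total = sum(dc for _, _, dc in samples)
    if total <= 0:
        raise ValueError("no capacitance change detected")
    cx = sum(x * dc for x, _, dc in samples) / total
    cy = sum(y * dc for _, y, dc in samples) / total
    return (cx, cy)
```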
It will be appreciated that the measured coordinates calculated in step S502 suffer from the calculation inaccuracy illustrated in fig. 3B to 3C above. If the touch device directly displays the handwriting formed by connecting the plurality of measured coordinates calculated in step S502, the handwriting shows a display effect similar to that of the user interface shown in fig. 4 (waves, jagged lines, and so on).
S503, the touch device obtains the compensation values corresponding to the measured coordinates from the compensation function, and calculates a plurality of compensated coordinates.
Specifically, after the processor of the touch device 100 calculates the plurality of measured coordinates, it can obtain the compensation value corresponding to each measured coordinate from the compensation function and then superimpose each measured coordinate value with its corresponding compensation value to obtain the final compensation coordinate.
In the embodiment of the present application, the compensation function specifically includes a horizontal-axis compensation function and a vertical-axis compensation function. The horizontal-axis compensation function represents the correspondence between a measured abscissa and a compensation value, and the vertical-axis compensation function represents the correspondence between a measured ordinate and a compensation value. The compensation value corresponding to the abscissa of the measured coordinate is looked up in the horizontal-axis compensation function and added to the measured abscissa to obtain the compensated abscissa; similarly, the compensation value corresponding to the ordinate of the measured coordinate is looked up in the vertical-axis compensation function and added to the measured ordinate to obtain the compensated ordinate. The compensation coordinate corresponding to the measured coordinate is thus obtained, and the touch device displays the handwriting formed by connecting a plurality of continuous compensation coordinates.
It will be appreciated that in some embodiments the horizontal-axis compensation function and the vertical-axis compensation function may be the same or different. Specifically, when the sensor channels in the touch screen 110 of the touch device 100 are arranged centrally symmetrically, for example when the number of sensor channels, the spacing between them, and the number of electrodes and structure of each sensing unit are the same in both directions, the horizontal-axis and vertical-axis compensation functions may be the same; otherwise, they are different.
That is, the compensation function provided in the embodiment of the present application is related to the hardware structure of the touch screen 110 in the touch device 100 and to the algorithm used to calculate the measured coordinates (such as the center-of-gravity algorithm or the triangulation algorithm); touch screens 110 with different hardware or algorithms use different compensation functions. For example, touch devices of the same model have the same touch-screen hardware structure and calculate touch coordinates with the same algorithm, so they can use the same compensation function. Therefore, the compensation function can be obtained in advance after the touch screen 110 is manufactured and stored in the memory of the touch device 100 in advance, so that it can be read directly from memory when the touch device 100 later calculates compensation coordinates from measured coordinates.
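Because the compensation function depends only on the touch-screen hardware and the coordinate algorithm, it can be computed once and persisted. The sketch below stores and reloads the per-axis functions as polynomial coefficients; the JSON file format and the polynomial representation are assumptions made for illustration, not part of the patent.

```python
import json
import numpy as np

def save_compensation(path: str, h_coeffs, v_coeffs) -> None:
    """Persist the factory-calibrated compensation polynomials."""
    with open(path, "w") as f:
        json.dump({"horizontal": [float(c) for c in h_coeffs],
                   "vertical": [float(c) for c in v_coeffs]}, f)

def load_compensation(path: str):
    """Rebuild callable horizontal- and vertical-axis compensation functions."""
    with open(path) as f:
        data = json.load(f)
    return np.poly1d(data["horizontal"]), np.poly1d(data["vertical"])
```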
The method for obtaining the compensation function may refer to the flow shown in fig. 6, which specifically includes the following steps:
s5031, a horizontal straight line parallel to a horizontal axis (X axis) is drawn in the touch screen 110 at a first speed using a manipulator.
In the embodiment of the present application, the manipulator is not the active pen 200 held by a user or a user's finger; rather, it holds or acts as a medium whose input operations on the touch screen 110 can be detected by the touch screen 110, so that touch coordinates can be calculated from those input operations.
In the embodiment of the application, the first speed is constant and lower than a preset value; that is, the manipulator slowly draws a horizontal straight line parallel to the horizontal axis on the touch screen 110 at a constant speed. This not only ensures that the touch screen 110 has enough time to detect the touch operation of the manipulator, but also makes it possible to obtain, by simple calculation, the ideal coordinates that the touch device 100 should theoretically detect, that is, the theoretical coordinates, preparing for the subsequent acquisition of the compensation function.
S5032, a plurality of continuous measured coordinates contained in the horizontal straight line are obtained by calculation according to the downlink signal detected by the touch screen 110.
Specifically, when the manipulator draws the straight line on the touch screen 110 at a constant speed, the sensor in the touch screen 110 periodically scans the downlink signal sent by the input device to detect the corresponding coordinates, and finally obtains a plurality of continuous measured coordinates contained in the horizontal straight line. The ordinates of these continuous measured coordinates are a constant value, but their abscissas do not increase as an arithmetic progression starting from the abscissa of the initial measured coordinate. This is because the hardware structure of the sensor in the touch screen 110 and the coordinate-calculation algorithm described above introduce errors into the coordinate-calculation results; therefore, although the manipulator slowly draws a horizontal straight line parallel to the horizontal axis (X axis) on the touch screen 110 at a constant speed, the plurality of continuous measured coordinates obtained from the periodically scanned capacitance changes do not have the arithmetic-progression abscissa values that the theoretical coordinates have.
In the following embodiments of the present application, the abscissa values of the measured coordinates are taken, in order, as 0, 40, 110, 140, 210, 240, 310, 340, 410, 440, 510, 540, 610, 640, 710, 1000, etc. for the purposes of description. These abscissa values are merely examples; there may be more or fewer of them, which is not limited by the embodiment of the present application.
S5033, a plurality of continuous theoretical coordinates contained in the horizontal straight line are obtained according to the initial measured coordinates, the final measured coordinates and the first speed of the manipulator.
Specifically, the initial measured coordinate (0, 0) and the final measured coordinate (1000, 1000) of the horizontal straight line may be taken as the start point and the end point of the theoretical coordinates of that line, respectively; then, combining the first speed of the manipulator, a plurality of theoretical coordinates consistent with the scanning period of the touch screen 110 are obtained by calculation.
It can be seen that the ordinate of the plurality of continuous theoretical coordinates contained in the horizontal straight line is a constant value, and their abscissa values, starting from the abscissa of the initial measured coordinate (also referred to as the initial theoretical coordinate), increase as an arithmetic progression. This is because, when a horizontal straight line parallel to the horizontal axis (X axis) is drawn slowly on the touch screen 110 at a constant speed with the manipulator, the theoretical coordinates obtained at fixed time intervals satisfy this condition.
The following embodiments of the present application are described by taking the abscissa values of the theoretical coordinates as 0, 50, 100, 150, 200, 250, 300, 350, 400, 450, 500, 550, 600, 650, 700, and 1000 in order as examples. These abscissa values are merely examples; there may be more or fewer of them, which is not limited by the embodiment of the present application.
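For illustration only, the following sketch shows how such theoretical abscissa values could be generated from the initial and final measured coordinates, the first speed, and the scanning period. The concrete speed and period values are assumptions chosen so that the step equals 50, and the function name is hypothetical; this is not the patented implementation.

```python
# Illustrative sketch only: generate the theoretical abscissas for a
# horizontal line drawn at a constant "first speed".
def theoretical_abscissas(start_x, end_x, speed, scan_period):
    """Return the abscissas the touch screen should report in theory.

    Constant speed means consecutive samples differ by a fixed step
    (an arithmetic progression) from start_x up to end_x.
    """
    step = speed * scan_period          # distance covered per scanning period
    xs = []
    x = start_x
    while x < end_x:
        xs.append(round(x))
        x += step
    xs.append(end_x)                    # close with the final measured abscissa
    return xs

# Assumed example: a speed of 500 units/s and a scan period of 0.1 s give a
# step of 50, i.e. 0, 50, 100, ..., matching the example values above.
print(theoretical_abscissas(0, 1000, 500, 0.1))
```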
Referring to fig. 7, fig. 7 schematically illustrates a correspondence between a theoretical abscissa and an actually measured abscissa.
As shown in fig. 7, the vertical axis of the two-dimensional coordinate system represents the abscissa value of the measured coordinate, and the horizontal axis represents the abscissa value of the theoretical coordinate. The coordinates in the coordinate system shown in fig. 7, such as (0, 0), (50, 40), (100, 110), (150, 140), (200, 210), (250, 240), (300, 310), (350, 340), (400, 410), (450, 440), (500, 510), (550, 540), (600, 610), (650, 640), (700, 710), and (1000, 1000), etc., are obtained from the abscissa values of the measured coordinates and the theoretical coordinates obtained in step S5032 and step S5033, respectively, and are not exhaustively listed here.
From these coordinates, the relationship between the abscissa value of the theoretical coordinate and the abscissa value of the measured coordinate can be drawn, namely the wavy line shown in fig. 7. The wavy line vividly reflects the difference between the abscissa value of the measured coordinate and that of the theoretical coordinate. In an ideal calculation result, the value of the measured coordinate obtained by detection and calculation would equal the value of the theoretical coordinate, which would best match the user's writing experience. That is, in the coordinate system shown in fig. 7, only when the functional relationship between the theoretical abscissa value and the measured abscissa value satisfies y = x is the measured coordinate calculated by the touch device 100 equal to the theoretical coordinate, as it should be.
S5034, performing data fitting according to a plurality of original coordinates consisting of the measured abscissa value and the corresponding theoretical abscissa value to obtain a horizontal axis fitting function.
First, as can be seen from the functional relationship shown in fig. 7, every half of a sensing channel in the touch screen 110 corresponds to one period: because the hardware structure of each sensing channel is symmetrical, the function over each half channel constitutes one period in the coordinate system. Therefore, the fitting function can be obtained by studying the coordinate points within half a sensing channel in the functional relationship shown in fig. 7. The specific steps are as follows:
Referring to fig. 8, fig. 8 illustrates data fitting performed on the original coordinates within half a sensing channel to obtain a horizontal axis fitting function.
As shown in fig. 8, the series of original coordinates in fig. 8 is obtained by extracting a plurality of coordinates within the half channel shown in fig. 7. Data fitting is then performed on these original coordinates to obtain the horizontal axis fitting function. In the embodiment of the present application, the data fitting may specifically be a first-order fit, that is, linear regression, or a second-order fit, a third-order fit, or the like, which is not limited by the embodiment of the present application. The fitting function shown in fig. 8 is only an example of the result obtained with a third-order (cubic) fit. For the specific fitting steps, reference may be made to the prior art, and they are not described again here.
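As a rough illustration of step S5034, the sketch below performs such a third-order fit, assuming NumPy is available and assuming, purely for illustration, that the first eight sample pairs listed above fall within one half channel; the variable names are illustrative only.

```python
import numpy as np

# Assumed half-channel samples taken from the example values above:
# (theoretical abscissa, measured abscissa) pairs.
theoretical = np.array([0, 50, 100, 150, 200, 250, 300, 350], dtype=float)
measured = np.array([0, 40, 110, 140, 210, 240, 310, 340], dtype=float)

# Step S5034 (sketch): fit the measured abscissa as a function of the
# theoretical abscissa with a third-order polynomial; a first-order (linear)
# or second-order fit could be used instead.
coeffs = np.polyfit(theoretical, measured, deg=3)
horizontal_axis_fit = np.poly1d(coeffs)

print(horizontal_axis_fit(200.0))   # value of the fitting function at x = 200
```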
S5035, a difference calculation is performed between the horizontal axis fitting function and the theoretical coordinates to obtain a horizontal axis fitting difference function (namely, a horizontal axis compensation function).
Referring to fig. 9, fig. 9 illustrates a schematic diagram of a horizontal axis fitting difference function.
As shown in fig. 9, the dotted line represents the real difference between the horizontal axis fitting function and the theoretical coordinates; the solid line is the fitted difference function obtained by fitting the real difference, also called the horizontal axis compensation function. It will be appreciated that the fitted difference function is illustrated here only as a third-order fit of the real difference; in other embodiments of the present application, the corresponding fitted difference function may be obtained by a first-order or second-order fit of the real difference, which is not limited in the embodiment of the present application.
It should be noted that the compensation function applied in the embodiment of the present application is the fitted difference function obtained by fitting the real difference, rather than the original real difference itself, because fitting the original data has the following significance:
data fitting is a method that is often used when processing and analyzing experimental data (i.e., measured abscissa). The purpose of the data fitting is to be able to find an expression of the relationship between the reaction variables (i.e. the measured abscissa) so that it can best approach the known data under certain criteria. The principle of the method specifically comprises the following steps: least square method, chebyshev method, etc. In particular, the method of applying curve fitting reveals that the relation between the measured abscissas has important theoretical and practical significance. For example, the relation between the measured abscissas is reflected by using the cross-axis fitting difference function, so that the relation between the measured abscissas can be clearly and simply reflected, the measured abscissas are directly connected into the function instead of being directly connected into the function when the compensating abscissas are obtained according to the measured abscissas, a final result can be obtained through simple calculation, and the subsequent data processing steps are simplified.
S5036, the measured abscissa is compensated according to the horizontal axis compensation function to obtain a compensated abscissa.
Specifically, after the touch device 100 detects a measured coordinate, it can look up the fitted difference value corresponding to the measured abscissa on the horizontal axis compensation function; this fitted difference value may also be called the compensation value. The measured abscissa value is then superimposed with the compensation value to obtain the final compensated abscissa.
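Step S5036 then reduces to a lookup-and-add, sketched below; `horizontal_axis_compensation` is the fitted polynomial from the previous sketch, and how close the result comes to the theoretical value depends on how well the fit captures the systematic error.

```python
def compensate_abscissa(measured_x, compensation_fn):
    """Superimpose the compensation value onto the measured abscissa
    (sketch of step S5036)."""
    return measured_x + compensation_fn(measured_x)

# Usage with the horizontal_axis_compensation polynomial from the sketch above:
# compensated_x = compensate_abscissa(210.0, horizontal_axis_compensation)
```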
Referring to fig. 10, fig. 10 schematically illustrates a correspondence between the compensation abscissa and the theoretical abscissa.
As shown in fig. 10, the functional relationship between the compensated abscissa and the theoretical abscissa is approximately y = x. That is, after the compensation function acquisition method shown in steps S5031 to S5036 is adopted, the measured abscissa can be compensated so that the resulting compensated abscissa is almost equal to the ideal theoretical abscissa value, and the abscissa detection error is largely eliminated.
It is to be understood that the above steps S5031 to S5036 only describe a method for obtaining the horizontal axis compensation function; the method for obtaining the vertical axis compensation function is similar. It specifically includes the following steps (an illustrative sketch combining the two axis compensation functions is given after the list):
drawing a vertical straight line parallel to the longitudinal axis (Y axis) slowly on the touch screen 110 at a constant speed using the manipulator;
calculating, according to the downlink signal detected by the touch screen 110, a plurality of continuous measured coordinates contained in the vertical straight line;
acquiring a plurality of continuous theoretical coordinates contained in the vertical straight line according to the initial measured coordinate, the final measured coordinate, and the first speed of the manipulator;
performing data fitting according to a plurality of original coordinates consisting of the measured ordinate values and the corresponding theoretical ordinate values to obtain a vertical axis fitting function;
obtaining, from the vertical axis fitting function, the functional relation between the measured ordinate and the vertical axis fitting difference value (namely, the vertical axis compensation function);
and compensating the measured ordinate according to the vertical axis compensation function to obtain a compensated ordinate.
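The sketch below combines the horizontal axis and vertical axis compensation functions to turn one measured coordinate into a compensated coordinate. It assumes both functions are already available, for example as fitted polynomials read from device memory; the coefficients shown are placeholders, not values from the embodiment.

```python
import numpy as np

# Assumed pre-computed compensation functions; the all-zero coefficients are
# placeholders. In practice they would come from steps S5031-S5036 and their
# vertical axis counterpart and be stored in the memory of the touch device.
horizontal_axis_compensation = np.poly1d([0.0, 0.0, 0.0])
vertical_axis_compensation = np.poly1d([0.0, 0.0, 0.0])

def compensate_coordinate(measured_x, measured_y):
    """Apply both axis compensation functions to one measured coordinate."""
    return (measured_x + horizontal_axis_compensation(measured_x),
            measured_y + vertical_axis_compensation(measured_y))

print(compensate_coordinate(210.0, 305.0))
```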
S504, the touch control device controls the display screen to display handwriting formed by connecting a plurality of compensation coordinates.
Specifically, after the processor of the touch device 100 obtains the compensation coordinates corresponding to the plurality of measured coordinates, the display screen may be controlled to display corresponding handwriting according to the plurality of compensation coordinates. Reference may be made in particular to fig. 11.
Fig. 11 illustrates another schematic diagram of a user interface output by the touch device 100.
As shown in fig. 11, line 611A, line 612A, and line 613A are the actual movement trajectories of active pen 200 when active pen 200 is held to write in touch screen 110, respectively. The line 611B, the line 612B, and the line 613B are the corresponding handwriting displayed by the touch device 100 according to the actual movement track line 611A, the line 612A, and the line 613A of the active pen 200, respectively.
As can be seen from fig. 11, when the touch display method shown in fig. 5 is adopted and the user inputs handwriting such as line 613A to the touch device 100 using the active pen 200, the touch device 100 displays a line 613B with a straight-line effect, instead of the wavy or saw-tooth line 413B displayed in the prior art; when the user inputs handwriting such as line 611A and line 612A to the touch device 100 using the active pen 200, the touch device 100 likewise displays lines 611B and 612B with a straight-line effect, and these are more accurate than the lines 411B and 412B displayed in the prior art.
It should be noted that the user interface 610 actually displayed by the touch device 100 contains only the line 611B, the line 612B, and the line 613B. The user interface 610 may be an interface provided by a writing or drawing application installed in the touch device 100. The lines 611A, 612A, and 613A shown in fig. 11 are not actually displayed by the touch device 100; they merely illustrate the actual movement track of the active pen 200 when it writes on the touch screen 110, and that track is not limited to the positions where lines 611A, 612A, and 613A are drawn in the figure. When the display screen 120 of the touch device 100 is attached to the touch screen 110, that is, when the touch screen 110 and the display screen 120 are integrated into one touch display screen, the actual movement track of the active pen 200 on the touch screen 110 almost completely coincides with the handwriting displayed on the display screen 120. When the touch screen 110 is separate from the body of the touch device but connected to it through wireless or wired communication, that is, when the sensor in the touch screen 110 is not attached to the display screen 120, the actual movement track of the active pen 200 on the touch screen 110 corresponds positionally to the handwriting displayed on the display screen 120; for example, when writing is performed in the upper left corner of the touch screen 110, the corresponding handwriting is displayed in the upper left corner of the display screen 120.
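When the touch screen and the display screen are separate, this positional correspondence can be realised, for example, by normalising the touch coordinate to the touch screen's dimensions and scaling it to the display's dimensions. The sketch below shows only one such proportional mapping; the dimension values are assumed.

```python
# Assumed dimensions for illustration: a 1000 x 600 external touch pad mapped
# onto a 2000 x 1200 display. A point in the upper left corner of the touch
# screen then maps to the upper left corner of the display.
TOUCH_W, TOUCH_H = 1000, 600
DISPLAY_W, DISPLAY_H = 2000, 1200

def map_touch_to_display(touch_x, touch_y):
    """Proportionally map a touch screen coordinate to a display coordinate."""
    return (touch_x / TOUCH_W * DISPLAY_W,
            touch_y / TOUCH_H * DISPLAY_H)

print(map_touch_to_display(10, 10))   # near the upper left corner of the display
```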
The display screen 120 of the touch device 100 may display a corresponding effect according to the electrical signal received by the touch screen 110. For example, when the touch device 100 opens an application (APP) installed in the operating system, such as a writing or drawing application, contacting and moving the input device within a designated area of the touch screen 110 causes the moving path of the active pen 200 to be displayed on the display screen, that is, handwriting is displayed; the color, thickness, shape, and other effects of the handwriting can be configured through software functions.
In the embodiment of the present application, the touch screen 110 may be integrated with the touch device 100 into an all-in-one machine; in that case the touch screen 110 and the display screen 120 form a single touch display screen, that is, the sensor unit in the touch screen 110 is attached to the display screen 120. It will be appreciated that in other embodiments of the present application, the touch screen 110 may be separate from the body of the touch device but connected to it through wireless or wired communication. That is, the sensor unit in the touch screen 110 and the display screen 120 may not be attached to each other; for example, for a notebook computer with an external handwriting pad, the notebook computer includes the display screen 120, while the sensor unit for detecting the touch signal is disposed in the external handwriting pad.
Based on the above detailed description of the application scenario of the touch display method and of the method embodiments, the device embodiments of the touch device 100 and the active pen 200 involved in that scenario are mainly described below. The details are as follows:
referring to fig. 12, fig. 12 schematically illustrates a hardware structure of a touch device 100 provided by the present application.
As shown in fig. 12, the touch device 100 may include a touch screen 110, a display 120, a processor 130, a memory 140, a universal serial bus (universal serial bus, USB) interface 150, a charge management module 160, a power management module 161, a battery 162, an antenna 1, an antenna 2, a mobile communication module 170, a wireless communication module 180, a sensor module 190, and other modules not shown in the drawings such as keys, motors, indicators, cameras, and subscriber identity module (subscriber identification module, SIM) card interfaces. The sensor modules 190 may include pressure sensors 190A, touch sensors 190B, air pressure sensors 190C, magnetic sensors 190D, acceleration sensors 190E, distance sensors 190F, proximity sensors 190G, fingerprint sensors 190H, temperature sensors 190J, gyroscope sensors 190K, ambient light sensors 190L, bone conduction sensors 190M, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the touch device 100. In other embodiments of the present application, the touch device 100 may include more or fewer components than illustrated, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The touch screen 110 may also be referred to as a "touch panel". The touch screen 110 has a sensor layer integrated therein, which may include a pressure sensor 190A, a touch sensor 190B, and the like. The sensor layer can operate in a variety of modes. If operating in mutual capacitance mode, the column and row traces form a single capacitive sense node at each overlap point (e.g., a "vertical" mutual capacitance). If operating in self-capacitance mode, the column and row traces form two (vertically aligned) capacitive sense nodes at each overlap point. In another embodiment, if operating in a mutual capacitance mode, adjacent column traces and/or adjacent row traces may each form a single capacitive sense node (e.g., a "horizontal" mutual capacitance). As described above, the sensor layer may detect the presence of the nib 201 of the active pen 200 and/or the touch of the user's finger by monitoring the change in capacitance (e.g., mutual capacitance or self capacitance) presented at each capacitive sensing node.
In some embodiments of the present application, the touch screen 110 and the display screen 120 may be integrated into one touch display screen, that is, the sensor layer in the touch screen 110 is attached to the components in the display screen; the user inputs a touch operation on the touch display screen and the corresponding effect is displayed on the same touch display screen. In other embodiments of the present application, the touch screen 110 and the display screen 120 may be two separate devices. That is, the sensor layer in the touch screen 110 is not attached to the components in the display screen, so the user inputs a touch operation on the touch screen 110 and the corresponding effect is displayed on the display screen 120.
The pressure sensor 190A in the touch screen 110 may be used to sense a pressure signal and convert it into an electrical signal. There are many kinds of pressure sensors 190A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material; when a force is applied to the pressure sensor 190A, the capacitance between the electrodes changes, and the touch device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation is applied to the display screen 120, the touch device 100 detects the intensity of the touch operation through the pressure sensor 190A, and may also calculate the touch location based on the detection signal of the pressure sensor 190A. For the sensing network composed of capacitive pressure sensors in the touch screen 110, reference may be made to the detailed description of fig. 3A, which is not repeated here. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
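The short message example can be summarised as a simple threshold test; the sketch below uses an assumed normalised pressure scale and a hypothetical threshold value, purely for illustration.

```python
FIRST_PRESSURE_THRESHOLD = 0.5   # assumed value on a normalised 0..1 scale

def instruction_for_sms_icon(touch_pressure):
    """Map the detected touch intensity on the short message icon to an
    operation instruction, as in the example above (threshold is assumed)."""
    if touch_pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"
    return "create_new_short_message"

print(instruction_for_sms_icon(0.3))   # -> "view_short_message"
```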
The touch sensor 190B in the touch screen 110 described above may be used to detect touch operations acting on or near it. The touch sensor 190B may communicate the detected touch operation to the application processor 130 to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 120.
The display screen 120 is used for displaying images, video, and the like. The display screen 120 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the touch device 100 may include 1 or N display screens 120, where N is a positive integer greater than 1.
In the embodiment of the present application, the display screen 120 may display corresponding handwriting on the display screen 120 according to the writing or drawing operation input by the user in the touch screen 110. Specifically, when the user holds an input device, such as the active pen 200, to write or draw in the touch screen 110, the touch screen 110 may detect a change in capacitance according to the user operation, and send the capacitance change information to the processor 130, the processor 130 may calculate, according to the capacitance change information, a contact coordinate of the active pen 200 in the touch screen 110, and then the processor 130 may control the display screen 120 to display handwriting formed by connecting a plurality of continuous coordinates.
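By way of illustration of how contact coordinates can be computed from capacitance change information, the sketch below uses a generic centre-of-gravity (centroid) calculation over the sensing nodes. This is a common technique and is not claimed to be the exact algorithm of the embodiment; the node grid and pitch are assumptions.

```python
# Assumed example: capacitance change reported per sensing node, with nodes on
# a regular grid of pitch 40 (arbitrary units). A generic gravity-center
# (centroid) calculation weights each node position by its capacitance change.
NODE_PITCH = 40

def centroid_coordinate(cap_deltas):
    """cap_deltas: {(col, row): capacitance change}; returns (x, y)."""
    total = sum(cap_deltas.values())
    x = sum(col * NODE_PITCH * d for (col, _), d in cap_deltas.items()) / total
    y = sum(row * NODE_PITCH * d for (_, row), d in cap_deltas.items()) / total
    return x, y

# A touch centred between nodes (5, 3) and (6, 3):
print(centroid_coordinate({(5, 3): 10.0, (6, 3): 10.0, (5, 4): 2.0, (6, 4): 2.0}))
```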
Processor 130 may include one or more processing units such as, for example: processor 130 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the touch device 100. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 130 for storing instructions and data. In some embodiments, the memory in the processor 130 is a cache memory. The memory may hold instructions or data that the processor 130 has just used or recycled. If the processor 130 needs to reuse the instruction or data, it may be called directly from the memory. Repeated accesses are avoided and the latency of the processor 130 is reduced, thereby improving the efficiency of the system.
In the embodiment of the present application, the memory provided in the processor 130 may be used to store an algorithm for calculating the touch coordinates, and after receiving the capacitance change information sent by the touch screen, the processor 130 may directly call the algorithm for calculating the touch coordinates, and perform coordinate calculation to obtain the actual measurement coordinates described in the above method embodiment; the processor 130 may then call the compensation function to obtain final compensation coordinates for controlling the display screen to display the corresponding handwriting according to the compensation coordinates.
Memory 140 may include an internal memory 141 and an external memory interface 142.
The internal memory 141 may include one or more random access memories (random access memory, RAM) and one or more nonvolatile memories (non-volatile memory, NVM). The random access memory may include a static random-access memory (static random-access memory, SRAM), a dynamic random-access memory (dynamic random access memory, DRAM), a synchronous dynamic random-access memory (synchronous dynamic random access memory, SDRAM), a double data rate synchronous dynamic random-access memory (double data rate synchronous dynamic random access memory, DDR SDRAM; for example, the fifth generation DDR SDRAM is commonly referred to as DDR5 SDRAM), and the like. The nonvolatile memory may include a magnetic disk storage device and a flash memory (flash memory). Divided according to operating principle, the flash memory may include NOR FLASH, NAND FLASH, 3D NAND FLASH, and the like; divided according to the level of the memory cell, it may include single-level cells (SLC), multi-level cells (MLC), triple-level cells (TLC), quad-level cells (QLC), and the like; divided according to storage specification, it may include universal flash storage (universal flash storage, UFS), embedded multimedia memory card (embedded multi media Card, eMMC), and the like. The random access memory may be read and written directly by the processor 130; it may be used to store executable programs (e.g., machine instructions) of an operating system or other running programs, and may also be used to store data of users and applications. The nonvolatile memory may also store executable programs and data of users and applications, which may be loaded into the random access memory in advance for the processor 130 to read and write directly.
The external memory interface 142 may be used to connect to an external nonvolatile memory, so as to extend the storage capability of the touch device 100. The external nonvolatile memory communicates with the processor 130 through the external memory interface 142 to implement data storage functions. For example, files such as music and video are stored in an external nonvolatile memory.
In the embodiment of the present application, the memory 140 is further used for storing a compensation function. The compensation function is obtained in advance through the method flow shown in fig. 6 and stored in the memory 140 of the touch device 100, so that the processor 130 can obtain the compensation coordinate when calculating the compensation coordinate according to the measured coordinate.
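A minimal sketch of this pre-storage, assuming the compensation functions are kept simply as polynomial coefficients; the file name, format, and coefficient values are assumptions for illustration, not part of the embodiment.

```python
import json

# Assumed: the fitted compensation functions are represented by their
# polynomial coefficients (highest order first) and persisted in device storage.
compensation = {
    "horizontal_axis": [1.2e-6, -0.0009, 0.12, 2.0],   # placeholder coefficients
    "vertical_axis": [0.8e-6, -0.0007, 0.10, 1.5],
}

with open("compensation_function.json", "w") as f:
    json.dump(compensation, f)

# Later, when compensating measured coordinates, the coefficients are read back
# and evaluated instead of being recomputed.
with open("compensation_function.json") as f:
    loaded = json.load(f)
print(loaded["horizontal_axis"])
```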
The USB interface 150 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 150 may be used to connect a charger to charge the touch device 100, or may be used to transfer data between the touch device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as a stand-alone drawing board, AR device, etc.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only illustrative, and is not limited to the structure of the touch device 100. In other embodiments of the present application, the touch device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the above embodiments.
The charge management module 160 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 160 may receive a charging input of a wired charger through the USB interface 150. In some wireless charging embodiments, the charging management module 160 may receive wireless charging input through a wireless charging coil of the touch device 100. The charging management module 160 may also power the electronic device through the power management module 161 while charging the battery 162.
The power management module 161 is used for connecting the battery 162, the charge management module 160 and the processor 130. The power management module 161 receives input from the battery 162 and/or the charge management module 160 and provides power to the processor 130, the internal memory 141, the external memory, the display 120, the camera, the wireless communication module 180, etc. The power management module 161 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 161 may also be located in the processor 130. In other embodiments, the power management module 161 and the charge management module 160 may be disposed in the same device.
The wireless communication function of the touch device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 170, the wireless communication module 180, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the touch device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 170 may provide a solution including 2G/3G/4G/5G wireless communication applied on the touch device 100. The mobile communication module 170 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 170 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 170 may amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate the electromagnetic waves. In some embodiments, at least some of the functional modules of the mobile communication module 170 may be disposed in the processor 130. In some embodiments, at least some of the functional modules of the mobile communication module 170 may be disposed in the same device as at least some of the modules of the processor 130.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to speakers, receivers, etc.), or displays images or video through the display screen 120. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 170 or other functional module, independent of the processor 130.
The wireless communication module 180 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., applied to the touch device 100. The wireless communication module 180 may be one or more devices integrating at least one communication processing module. The wireless communication module 180 receives electromagnetic waves via the antenna 2, demodulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 130. The wireless communication module 180 may also receive a signal to be transmitted from the processor 130, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 170 of touch device 100 are coupled, and antenna 2 and wireless communication module 180 are coupled, such that touch device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
In the embodiment of the present application, the touch device 100 may also establish a communication connection with an input device such as the active pen 200 through the above-mentioned wireless communication module 180 or mobile communication module 170 for data transmission. For example, the touch device 100 may establish a communication connection with the active pen 200 through a near field communication network such as a WI-FI hotspot network, a WI-FI peer-to-peer (P2P) network, a Bluetooth network, a ZigBee network, or a near field communication (near field communication, NFC) network, so as to receive the downlink signal sent by the active pen 200.
Optionally, in other embodiments of the present application, the touch device 100 further includes the devices shown in fig. 12, which are specifically described as follows:
the gyro sensor may be used to determine the motion posture of the touch device 100. In some embodiments, the angular velocity of the touch device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor. The gyro sensor may be used for anti-shake during photographing. For example, when the shutter is pressed, the gyro sensor detects the shake angle of the touch device 100, calculates the distance that the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the touch device 100 through reverse motion, thereby achieving anti-shake. The gyro sensor may also be used for navigation and somatosensory game scenes.
The air pressure sensor is used for measuring air pressure. In some embodiments, the touch device 100 calculates altitude from barometric pressure values measured by barometric pressure sensors, aiding in positioning and navigation.
The magnetic sensor includes a Hall sensor. The touch device 100 may detect the opening and closing of a flip cover using the magnetic sensor. In some embodiments, when the touch device 100 is a flip machine, the touch device 100 may detect the opening and closing of the flip according to the magnetic sensor, and then set features such as automatic unlocking upon flip opening according to the detected opening or closing state of the leather case or of the flip.
The acceleration sensor may detect the magnitude of the acceleration of the touch device 100 in various directions (typically along three axes), and may detect the magnitude and direction of gravity when the touch device 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
A distance sensor is used for measuring distance. The touch device 100 may measure distance by infrared or laser. In some embodiments, in a photographing scene, the touch device 100 may use the distance sensor to measure distance so as to achieve quick focusing.
The proximity light sensor may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The touch device 100 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the touch device 100; when insufficient reflected light is detected, the touch device 100 may determine that there is no object nearby. The touch device 100 can use the proximity light sensor to detect that the user is holding the touch device 100 close to the ear for a call, so as to automatically turn off the screen and save power. The proximity light sensor can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor is used for sensing ambient light brightness. The touch device 100 may adaptively adjust the brightness of the display 120 according to the perceived ambient light level. The ambient light sensor may also be used to automatically adjust white balance when taking a photograph. The ambient light sensor may also cooperate with the proximity light sensor to detect whether the touch device 100 is in a pocket to prevent false touches.
The fingerprint sensor is used for collecting fingerprints. The touch device 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor is used for detecting temperature. In some embodiments, the touch device 100 performs a temperature processing strategy using the temperature detected by the temperature sensor. For example, when the temperature reported by the temperature sensor exceeds a threshold, the touch device 100 performs a reduction in performance of a processor located near the temperature sensor in order to reduce power consumption to implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the touch device 100 heats the battery 162 to avoid abnormal shutdown of the touch device 100 caused by low temperature. In other embodiments, when the temperature is lower than the further threshold, the touch device 100 performs boosting on the output voltage of the battery 162 to avoid abnormal shutdown caused by low temperature.
The bone conduction sensor may acquire vibration signals. In some embodiments, the bone conduction sensor may acquire the vibration signal of the bone mass vibrated by the human voice. The bone conduction sensor can also contact the human pulse to receive blood-pressure beating signals. In some embodiments, the bone conduction sensor may also be provided in an earphone, combined into a bone conduction earphone. The audio module can parse out a voice signal based on the vibration signal of the voice-vibrated bone mass obtained by the bone conduction sensor, thereby realizing a voice function. The application processor can parse out heart rate information based on the blood-pressure beating signal obtained by the bone conduction sensor, thereby realizing a heart rate detection function.
It will be apparent to those skilled in the art that some of the specific details presented above with respect to the touch device 100 may not be required to practice a particular described embodiment or equivalent thereof. Similarly, other touch devices may include a greater number of subsystems, modules, components, etc. Some of the sub-modules may be implemented as software or hardware, where appropriate. It should be understood, therefore, that the foregoing description is not intended to be exhaustive or to limit the disclosure to the precise form described herein. On the contrary, many modifications and variations will be apparent to those of ordinary skill in the art in light of the above teachings.
Alternatively, the touch device 100 may implement audio functions through an audio module, a speaker, a receiver, a microphone, an earphone interface, an application processor, and the like. Such as music playing, recording, etc.
The audio module is used for converting digital audio information into analog audio signals for output and also used for converting analog audio input into digital audio signals. The audio module may also be used to encode and decode audio signals. In some embodiments, the audio module may be disposed in the processor 130, or a portion of the functional modules of the audio module may be disposed in the processor 130.
Speakers, also known as "horns," are used to convert audio electrical signals into sound signals. The touch device 100 may listen to music through a speaker or to hands-free conversations.
A receiver, also called an "earpiece", is used to convert the audio electrical signal into a sound signal. When the touch device 100 is answering a phone call or voice message, the voice can be received by placing the receiver close to the human ear.
A microphone, also known as a "mic" or "sound transmitter", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone to input a sound signal into it. The touch device 100 may be provided with at least one microphone. In other embodiments, the touch device 100 may be provided with two microphones, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the touch device 100 may be provided with three, four, or more microphones to collect sound signals, reduce noise, identify sound sources, implement directional recording functions, and so on.
The earphone interface is used for connecting wired earphones. The earphone interface may be the USB interface 150, a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The touch device 100 may implement a photographing function through an ISP, a camera, a video codec, a GPU, a display screen 120, an application processor, and the like.
The ISP is used for processing the data fed back by the camera. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera.
Optionally, the touch device 100 may also be used to capture still images or video through a camera. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the touch device 100 may include 1 or N cameras, where N is a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the touch device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, and so on.
Video codecs are used to compress or decompress digital video. The touch device 100 may support one or more video codecs. In this way, the touch device 100 may play or record video in multiple encoding formats, for example: dynamic picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent cognition of the touch device 100 can be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The software system of the touch device 100 may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example to illustrate the software structure of the touch device 100. The touch device may have a built-in Android operating system or another operating system.
Fig. 13 is a software block diagram of the touch device 100 according to the embodiment of the application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: an application layer, an application framework layer, the Android runtime and system libraries, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 13, the application package may include writing/drawing type applications, gallery, call, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 13, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the touch device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction. Such as notification manager is used to inform that the download is complete, message alerts, etc. The notification manager may also be a notification in the form of a chart or scroll bar text that appears on the system top status bar, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, the text information is prompted in the status bar, a prompt tone is sent, the touch control device vibrates, and the indicator lights flash.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
The core library consists of two parts: one part is a function which needs to be called by java language, and the other part is a core library of android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The workflow of the touch device 100 software and hardware is illustrated below in connection with a writing/drawing scenario.
When the touch sensor 190B receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into an original input event (including information such as touch coordinates and the time stamp of the touch operation). The original input event is stored at the kernel layer. The application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation as a single-click touch operation and the control corresponding to the single-click operation as the icon control of a writing/drawing application as an example, the writing/drawing application calls an interface of the application framework layer to start the writing/drawing application, then starts the touch screen 110 driver by calling the kernel layer, and captures the movement track input by the user for writing/drawing through the touch screen 110.
Referring to fig. 14, fig. 14 is a schematic hardware structure of an active pen 200 according to an embodiment of the present application.
As shown in fig. 14, the active pen 200 may include a processor 210 and one or more sensors 220, and may further include keys 230, an indicator light 240, one or more electrodes 250, a drive circuit 260, a power supply 270, a wireless communication module 280, and the like.
It should be understood that the illustrated structure of the present embodiment does not constitute a specific limitation on the active pen 200. In other embodiments of the present application, active pen 200 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include memory circuitry and processing circuitry to support the performance of operations by active pen 200. The storage and processing circuitry may include storage devices such as non-volatile memory (e.g., flash memory or other electrically programmable read-only memory configured as a solid state drive), volatile memory (e.g., static or dynamic random access memory), and the like. Processing circuitry in processor 210 may be used to control the operation of active pen 200. The processing circuitry may be based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio chips, application specific integrated circuits, and the like.
The sensor 220 may include a pressure sensor 221. The pressure sensor 221 may be disposed at the tip 201 of the active pen 200 (as shown in fig. 2). Of course, the pressure sensor 221 may also be disposed within the body 203 of the active pen 200, so that when a force is applied to one end of the nib 201 of the active pen 200, the other end of the nib 201 moves and applies force to the pressure sensor 221. In one embodiment, the processor 210 may adjust the thickness of the line written by the nib 201 of the active pen 200 based on the amount of pressure detected by the pressure sensor 221.
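One simple way to realise such pressure-dependent line thickness is a bounded linear mapping, sketched below; the normalised pressure range and the width limits are assumptions, not values from the embodiment.

```python
def line_width_from_pressure(pressure, min_width=1.0, max_width=8.0):
    """Map the pressure detected by the pressure sensor to a stroke width.

    The pressure is assumed to be normalised to [0, 1]; the linear mapping and
    the width range are illustrative choices.
    """
    pressure = max(0.0, min(1.0, pressure))
    return min_width + pressure * (max_width - min_width)

print(line_width_from_pressure(0.25))   # -> 2.75
```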
The sensor 220 may also include an inertial sensor 222. Inertial sensor 222 may include a three-axis accelerometer and a three-axis gyroscope, and/or other components for measuring motion of active pen 200, for example, a three-axis magnetometer may be included in the sensor in a nine-axis inertial sensor configuration. The sensors 220 may also include other sensors such as temperature sensors, ambient light sensors, light-based proximity sensors, contact sensors, magnetic sensors, pressure sensors, and/or other sensors.
The keys 230 include a power-on key and the like. The keys 230 may be mechanical keys or touch keys. The active pen 200 may receive key inputs and generate key signal inputs related to user settings and function control of the active pen 200.
The indicator light 240 may be used to indicate a state of charge, a change in charge, an indication message, a notification, etc.
The electrode 250 may be located within the nib 201 of the active pen 200 (see fig. 2 above), and the drive circuit 260 may be located within the body 203 of the active pen 200 (see fig. 2 above). The drive circuit 260 may be used to connect the electrode 250 with the power supply 270. The drive circuit may receive the electrical signal provided by the battery, amplify it, and drive the electrode 250 to transmit the downlink signal to the touch device 100.
The power supply 270 may be a nickel-cadmium battery, a nickel-metal hydride battery, or a lithium-ion battery. In the embodiment of the present application, the power supply 270 may also be formed of a plurality of batteries connected in series and/or in parallel, and the voltage output by the batteries is not limited. In one possible implementation, the power supply 270 may be a power supply external to the active pen 200. The embodiment of the present application does not limit the type of the power supply 270 or the value of the output voltage.
The wireless communication module 280 may support wireless communication between the active pen 200 and the touch device 100. The wireless communication module 280 may be a Bluetooth module, a Wi-Fi hotspot module, a Wi-Fi point-to-point module, an NFC module, a ZigBee module, or the like. The Bluetooth module may include a radio frequency transceiver and may also include one or more antennas. The transceiver may transmit and/or receive wireless signals using the antenna; depending on the type of wireless module, the wireless signals may be Bluetooth signals, wireless local area network signals, long-range signals such as cellular telephone signals, near field communication signals, or other wireless signals.
It will be appreciated that the active pen 200 may include a microphone, speaker, audio generator, vibrator, camera, data port, and other devices, as desired. The user may control the operation of the touch device 100 by providing commands through these devices, and may receive status information and other outputs.
The foregoing describes the touch display method provided by the embodiments of the present application, as well as the related software and hardware structures of the touch device 100 and of the active pen 200. The touch display method, the graphical interface, and the related device provided by the present application therefore have the following beneficial effects: touch coordinate compensation can be performed on a touch device whose touch coordinate calculation is inaccurate, greatly reducing or eliminating the influence of touch coordinate calculation errors, so that when a user writes or draws on the touch device, the device can display straighter and smoother handwriting that meets the user's expectations, thereby improving the user experience.
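As a minimal sketch of the compensation step summarized above, and assuming purely for illustration that the horizontal axis and vertical axis compensation functions are represented as polynomial coefficient lists (a representation the application does not specify), the compensated coordinates could be computed as follows:

def apply_compensation(measured_points, x_comp_coeffs, y_comp_coeffs):
    """Superimpose per-axis compensation values on each measured coordinate.

    measured_points: list of (x, y) measured coordinates.
    x_comp_coeffs / y_comp_coeffs: polynomial coefficients (highest power
    first) of the assumed horizontal axis and vertical axis compensation
    functions; the polynomial form is an illustrative assumption.
    """
    def poly(coeffs, t):
        # Horner evaluation of the compensation polynomial at t.
        value = 0.0
        for c in coeffs:
            value = value * t + c
        return value

    compensated = []
    for x, y in measured_points:
        dx = poly(x_comp_coeffs, x)  # horizontal axis compensation value
        dy = poly(y_comp_coeffs, y)  # vertical axis compensation value
        compensated.append((x + dx, y + dy))
    return compensated

The track displayed to the user would then be drawn by connecting the returned compensated coordinates in order.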
The embodiments of the present application may be arbitrarily combined to achieve different technical effects.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (e.g., coaxial cable, optical fiber, digital subscriber line) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), a semiconductor medium (e.g., a solid state disk), or the like.
Those of ordinary skill in the art will appreciate that all or part of the processes of the above-described method embodiments may be implemented by a computer program instructing related hardware; the program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above-described method embodiments. The aforementioned storage medium includes: a ROM, a random access memory (RAM), a magnetic disk, an optical disc, or the like.
In summary, the foregoing description merely sets out exemplary embodiments of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, or the like made according to the disclosure of the present application shall be included in the protection scope of the present application.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, and such modifications and substitutions do not depart from the spirit of the application.

Claims (13)

1. A touch display method applied to an electronic device, the method comprising:
a sensor in a touch screen of the electronic device acquires a downlink signal sent by an input device;
the electronic device determines a plurality of consecutive measured coordinates on the touch screen according to the positions of the sensors that collect the downlink signal, wherein the measured coordinates are located in any area of the touch screen;
the electronic device obtains, from a compensation function, a compensation value corresponding to each measured coordinate, and superimposes the corresponding compensation value on each measured coordinate to obtain a plurality of compensated coordinates; the compensation function comprises a horizontal axis compensation function and a vertical axis compensation function, and the compensation value comprises a horizontal axis compensation value and a vertical axis compensation value; the horizontal axis compensation function is used for representing the correspondence between the abscissa of a measured coordinate and the horizontal axis compensation value, and the horizontal axis compensation value indicates the difference between the abscissa of the measured coordinate and the abscissa of the theoretical coordinate; the vertical axis compensation function is used for representing the correspondence between the ordinate of a measured coordinate and the vertical axis compensation value, and the vertical axis compensation value indicates the difference between the ordinate of the measured coordinate and the ordinate of the theoretical coordinate; and the electronic device displays a movement track formed by connecting the plurality of consecutive compensated coordinates.
2. The method of claim 1, wherein the sensors are arranged in a first direction and a second direction, the first direction being perpendicular to the second direction; and the horizontal axis compensation function is obtained by a second device by:
the input device moves on the touch screen at a first speed parallel to the first direction, and the input device sends a downlink signal during the movement; the first speed is lower than a preset speed and is a constant value;
determining a plurality of consecutive measured coordinates according to the positions of the sensors that collect the downlink signal;
determining a plurality of consecutive theoretical coordinates according to the initial measured coordinate and the final measured coordinate among the plurality of consecutive measured coordinates, and the first speed;
and obtaining the horizontal axis compensation function according to the plurality of consecutive measured coordinates and the plurality of consecutive theoretical coordinates.
3. The method according to claim 2, wherein the second device obtaining the horizontal axis compensation function from the plurality of consecutive measured coordinates and the plurality of consecutive theoretical coordinates specifically comprises:
the electronic device performs data fitting on the plurality of consecutive measured coordinates to obtain a horizontal axis fitting function;
the electronic device obtains a plurality of differences between the horizontal axis fitting function and the plurality of consecutive theoretical coordinates;
and the electronic device performs data fitting on the plurality of differences to obtain the horizontal axis compensation function.
4. The method according to claim 2 or 3, wherein the horizontal axis compensation function and the vertical axis compensation function are the same or different; when the arrangement rules of the sensors in the first direction and in the second direction are consistent and the sensor hardware structures are the same, the horizontal axis compensation function and the vertical axis compensation function are the same; otherwise, the horizontal axis compensation function and the vertical axis compensation function are different.
5. The method of claim 2, wherein the second device and the electronic device are the same device or different devices.
6. The method of claim 1, wherein before the electronic device displays the movement track formed by connecting the plurality of consecutive compensated coordinates, the method further comprises:
the electronic device runs and displays an interface provided by a drawing or writing type application program.
7. The method of claim 1, wherein the input device comprises, but is not limited to, an active capacitive pen.
8. The method of claim 7, wherein when the input device is the active capacitive pen, the downlink signal is specifically an excitation signal sent by a driving unit in the active capacitive pen to the pen tip.
9. The method according to claim 7 or 8, wherein the types of the downlink signal include, but are not limited to, three types: a square wave, a sine wave, and a triangular wave.
10. The method of claim 7, wherein when the input device is the active capacitive pen, the downlink signal carries pressure information of the active capacitive pen; and the pressure information is related to a display effect of the movement track.
11. A chip for application to an electronic device, the chip comprising one or more processors to invoke computer instructions to cause the electronic device to perform the method of any of claims 1-10.
12. A computer readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform the method of any of claims 1-10.
13. An electronic device comprising one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors, the one or more memories for storing computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any of claims 1-10.
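For illustration only, and not as part of the claims, the following sketch approximates the single-axis calibration procedure described in claims 2 and 3: the input device moves at a constant first speed, the theoretical coordinates are reconstructed from the initial and final measured coordinates, and the compensation function is obtained by fitting the differences between the fitted measured coordinates and the theoretical coordinates. The use of least-squares polynomial fitting and the chosen degree are assumptions made for this example.

import numpy as np

def fit_axis_compensation(measured_x, degree=3):
    """Fit an assumed polynomial compensation function for one axis.

    measured_x: 1-D sequence of consecutive measured coordinates collected
    while the input device moved parallel to this axis at a constant speed.
    Returns polynomial coefficients (highest power first) mapping a measured
    coordinate to its compensation value.
    """
    measured_x = np.asarray(measured_x, dtype=float)
    n = len(measured_x)
    # A constant speed implies the theoretical coordinates are evenly spaced
    # between the initial and the final measured coordinate.
    theoretical_x = np.linspace(measured_x[0], measured_x[-1], n)
    # Fit the measured coordinates, then take differences against theory.
    sample_index = np.arange(n)
    fit_coeffs = np.polyfit(sample_index, measured_x, degree)
    fitted_x = np.polyval(fit_coeffs, sample_index)
    differences = theoretical_x - fitted_x
    # Fit the compensation value as a function of the measured coordinate.
    return np.polyfit(measured_x, differences, degree)

The returned coefficients are ordered highest power first, matching the convention assumed in the apply_compensation sketch given earlier in the description.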
CN202210193026.7A 2022-02-28 2022-02-28 Touch display method, graphical interface and related device Active CN115562514B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210193026.7A CN115562514B (en) 2022-02-28 2022-02-28 Touch display method, graphical interface and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210193026.7A CN115562514B (en) 2022-02-28 2022-02-28 Touch display method, graphical interface and related device

Publications (2)

Publication Number Publication Date
CN115562514A CN115562514A (en) 2023-01-03
CN115562514B true CN115562514B (en) 2023-11-24

Family

ID=84736629

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210193026.7A Active CN115562514B (en) 2022-02-28 2022-02-28 Touch display method, graphical interface and related device

Country Status (1)

Country Link
CN (1) CN115562514B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103970323A (en) * 2013-01-30 2014-08-06 北京汇冠新技术股份有限公司 Method and system for tracking of trajectory of touch screen
CN104765543A (en) * 2015-04-17 2015-07-08 努比亚技术有限公司 Audio playing parameter adjustment method and device
CN105975122A (en) * 2016-04-27 2016-09-28 集怡嘉数码科技(深圳)有限公司 Touch track compensation method and apparatus as well as terminal device
CN106201091A (en) * 2016-07-13 2016-12-07 北京集创北方科技股份有限公司 The coordinate processing method of touch screen and device
CN107272921A (en) * 2016-01-28 2017-10-20 乐金显示有限公司 Active touch control pen includes its touch-sensing system and touch-sensing method
CN109445636A (en) * 2018-10-31 2019-03-08 上海海栎创微电子有限公司 A kind of self-capacitance touch screen edge touch coordinate compensation method

Also Published As

Publication number Publication date
CN115562514A (en) 2023-01-03

Similar Documents

Publication Publication Date Title
KR102470275B1 (en) Voice control method and electronic device
CN110910872B (en) Voice interaction method and device
CN115866121B (en) Application interface interaction method, electronic device and computer readable storage medium
CN113641271A (en) Application window management method, terminal device and computer readable storage medium
CN114461057A (en) VR display control method, electronic device and computer readable storage medium
EP4228233A1 (en) Method for adding operation sequence, electronic device, and system
CN115016697A (en) Screen projection method, computer device, readable storage medium, and program product
CN115119048B (en) Video stream processing method and electronic equipment
CN115032640B (en) Gesture recognition method and terminal equipment
CN111249728A (en) Image processing method and image processing device
CN112416984A (en) Data processing method and device
CN113380240B (en) Voice interaction method and electronic equipment
CN115562514B (en) Touch display method, graphical interface and related device
CN113050864B (en) Screen capturing method and related equipment
CN114173286A (en) Method and device for determining test path, electronic equipment and readable storage medium
CN113821130A (en) Method and related device for determining screenshot area
CN116048236B (en) Communication method and related device
CN116320880B (en) Audio processing method and device
CN116795476B (en) Wallpaper deleting method and electronic equipment
CN114205318B (en) Head portrait display method and electronic equipment
WO2022166550A1 (en) Data transmission method and electronic device
CN117013660B (en) Charging icon display method and electronic equipment
WO2022143891A1 (en) Focal point synchronization method and electronic device
WO2021104000A1 (en) Screen display method and electronic device
CN117708009A (en) Signal transmission method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant