CN115562514A - Touch display method, graphical interface and related device - Google Patents


Info

Publication number
CN115562514A
Authority
CN
China
Prior art keywords
coordinate
touch
coordinates
compensation function
compensation
Prior art date
Legal status
Granted
Application number
CN202210193026.7A
Other languages
Chinese (zh)
Other versions
CN115562514B (en)
Inventor
邸皓轩
李丹洪
张晓武
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202210193026.7A priority Critical patent/CN115562514B/en
Publication of CN115562514A publication Critical patent/CN115562514A/en
Application granted granted Critical
Publication of CN115562514B publication Critical patent/CN115562514B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0414 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/04162 Control or interface arrangements specially adapted for digitisers for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
    • G06F3/04166 Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G06F3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04105 Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The application discloses a touch display method, a graphical interface and a related device. In the method, a touch device acquires a horizontal-axis compensation function and a vertical-axis compensation function in advance. The horizontal-axis compensation function represents the correspondence between measured abscissas and compensation values, and the vertical-axis compensation function represents the correspondence between measured ordinates and compensation values. When the touch device detects a touch operation performed on the touch screen by an input device, it obtains the measured coordinate corresponding to the touch coordinate at which the operation occurred, looks up the compensation value corresponding to the abscissa of the measured coordinate in the horizontal-axis compensation function, and adds that compensation value to the measured abscissa to obtain the compensated abscissa. Similarly, it looks up the compensation value corresponding to the ordinate of the measured coordinate in the vertical-axis compensation function and adds it to the measured ordinate to obtain the compensated ordinate, thereby obtaining the compensation coordinate corresponding to the measured coordinate. The corresponding movement track is then displayed according to a plurality of continuous compensation coordinates.

Description

Touch display method, graphical interface and related device
Technical Field
The present application relates to the field of terminals, and in particular, to a touch display method, a graphical interface and a related device.
Background
With the popularization of electronic devices such as smart phones, tablets and drawing boards, touch screens and touch display technologies are widely applied. When an input device such as a finger or an electronic pen writes or draws on the touch screen, a plurality of sensing units arranged in the touch screen detect capacitance changes and generate electric signals. The processor calculates, from these electric signals, the specific coordinates at which the input device touches down on the touch screen, using a center-of-gravity algorithm, a trigonometric algorithm or the like, and finally drives the display screen to connect a plurality of continuously detected coordinates into handwriting and display it. However, the algorithm used for calculating the coordinates is not perfectly accurate, and the hardware properties of the touch screen introduce noise, so the handwriting output by the display screen is not the straight or smooth movement track that was input but a wavy one.
How to solve the above problems is an issue that needs to be addressed.
Disclosure of Invention
The application provides a touch display method, a graphical interface and a related device. With this method, after measured coordinates are obtained from a touch operation detected by the touch screen, the measured coordinates are compensated according to the difference between the measured coordinates and the theoretical coordinates; that is, each measured coordinate is superimposed with its corresponding compensation value to obtain a compensated coordinate. Finally, a movement track formed by connecting a plurality of continuous compensated coordinates is displayed, so that the displayed track is smoother and straighter.
In a first aspect, the present application provides a touch display method applied to an electronic device, including: a sensor in a touch screen of the electronic device acquires a first downlink signal sent by an input device; the electronic device determines a plurality of continuous first measured coordinates on the touch screen according to the positions of the sensors that acquired the first downlink signal; the electronic device acquires, from a compensation function, a first compensation value corresponding to each first measured coordinate, and superimposes the first measured coordinate with the first compensation value to obtain a first compensation coordinate; the compensation function represents the correspondence between measured coordinates and compensation values, and a compensation value indicates the difference between a measured coordinate and its corresponding theoretical coordinate; the electronic device displays a movement track formed by connecting a plurality of continuous first compensation coordinates.
By implementing the method provided in the first aspect, the measured coordinates can be compensated according to the difference between the measured coordinates and the theoretical coordinates, that is, each measured coordinate is superimposed with its corresponding compensation value to obtain a compensated coordinate, and finally a movement track formed by connecting a plurality of continuous compensated coordinates is displayed. This alleviates the wave or sawtooth effect of existing movement tracks and makes the displayed track smoother and straighter.
In connection with the method provided by the first aspect, the compensation function includes a horizontal-axis compensation function and a vertical-axis compensation function, and the compensation values include a horizontal-axis compensation value and a vertical-axis compensation value. The horizontal-axis compensation function represents the correspondence between the abscissa of a measured coordinate and the horizontal-axis compensation value, which indicates the difference between the abscissa of the measured coordinate and the abscissa of the theoretical coordinate. The vertical-axis compensation function represents the correspondence between the ordinate of a measured coordinate and the vertical-axis compensation value, which indicates the difference between the ordinate of the measured coordinate and the ordinate of the theoretical coordinate.
In this way, the electronic device can compensate both the abscissa and the ordinate of a measured coordinate according to the horizontal-axis and vertical-axis compensation functions to obtain a compensation coordinate closer to the theoretical coordinate, alleviating the wave or sawtooth effect of existing movement tracks and making the displayed track smoother and straighter.
In combination with the method provided by the first aspect, the sensors are arranged in a first direction and a second direction, the first direction being perpendicular to the second direction. The horizontal-axis compensation function is obtained by a second device as follows: the input device moves across the touch screen at a first speed in a direction parallel to the first direction and sends a downlink signal during the movement, the first speed being a constant value lower than a preset speed; a plurality of continuous measured coordinates are determined according to the positions of the sensors that acquired the downlink signal; a plurality of continuous theoretical coordinates are determined from the initial measured coordinate, the final measured coordinate and the first speed; and the horizontal-axis compensation function is obtained from the plurality of continuous measured coordinates and the plurality of continuous theoretical coordinates.
In this way, experimental data (i.e., measured coordinates) and theoretical data can be acquired through controlled experiments, and a more accurate compensation function can be obtained through mathematical operation, so that the compensation is more accurate.
With reference to the method provided by the first aspect, the second device obtains the horizontal-axis compensation function from the plurality of continuous measured coordinates and the plurality of continuous theoretical coordinates specifically as follows: it performs data fitting on the continuous measured coordinates to obtain a horizontal-axis fitting function; it acquires a plurality of differences between the horizontal-axis fitting function and the continuous theoretical coordinates; and it performs data fitting on the plurality of differences to obtain the horizontal-axis compensation function.
In this way, experimental data and theoretical data can be acquired through controlled experiments, and the experimental data (i.e., measured coordinates) are processed by data fitting to obtain a more accurate function describing the relationship in the measured data. Individual experimental samples do not need to be processed one by one, which simplifies the subsequent coordinate compensation.
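To make the fitting step above concrete, the following is a minimal Python sketch of one way the horizontal-axis compensation function could be derived from measured and theoretical abscissas by two rounds of fitting. The patent does not fix a particular fitting model; the use of polynomial fits, the polynomial degrees, and the sample values (which loosely follow the example abscissas given later in this description) are illustrative assumptions only.

```python
import numpy as np

# Illustrative samples only: theoretical abscissas increase uniformly because the
# stroke is drawn at constant speed; measured abscissas contain a periodic error.
theoretical_x = np.arange(0, 701, 50)                      # 0, 50, 100, ..., 700
measured_x = np.array([0, 40, 110, 140, 210, 240, 310, 340,
                       410, 440, 510, 540, 610, 640, 710], dtype=float)

# Step 1: fit the measured abscissas against the theoretical abscissas
# (the horizontal-axis fitting function of the description).
fit_measured = np.polynomial.Polynomial.fit(theoretical_x, measured_x, deg=7)

# Step 2: differences between the fitting function and the theoretical abscissas.
diff = theoretical_x - fit_measured(theoretical_x)

# Step 3: fit the differences against the measured abscissa to obtain the
# horizontal-axis compensation function: compensation value = f(measured abscissa).
compensation_fn = np.polynomial.Polynomial.fit(measured_x, diff, deg=7)

# Example lookup: compensation value for a measured abscissa of 310.
print(compensation_fn(310.0))
```

The same procedure would be repeated along the other arrangement direction to obtain the vertical-axis compensation function when the two functions differ.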
In combination with the method provided in the first aspect, the horizontal axis compensation function and the vertical axis compensation function are the same or different; when the arrangement rules of the sensors in the first direction and the second direction are consistent, and the hardware structures of the sensors are the same, the horizontal axis compensation function is the same as the vertical axis compensation function; otherwise, the horizontal axis compensation function and the vertical axis compensation function are different.
In this way, when the arrangement rules of the sensors in the touch screen are consistent in the horizontal and vertical directions and the hardware structures of the sensors are the same, a compensation function need only be obtained for one arrangement direction and can then be used to compensate the coordinates in both arrangement directions, which simplifies the acquisition of the compensation function.
In combination with the method provided by the first aspect, the second device and the electronic device are the same device or different devices.
In this way, the compensation function may be acquired by the electronic device itself, or may be acquired by another device and then stored in the electronic device.
With reference to the method provided by the first aspect, before the electronic device displays the movement track formed by connecting the plurality of continuous first compensation coordinates, the method further includes: the electronic device runs and displays an interface provided by a drawing or writing application program.
In this way, when a user draws or writes on the touch screen of the electronic device, adopting the method provided by this solution makes the written or drawn handwriting appear as smoother lines, improving the user experience.
In connection with the method provided by the first aspect, the input device includes, but is not limited to, an active capacitive pen.
Because the tip of an active capacitive pen is relatively thin and the sensor's detection of its downlink signal is not ideal, the existing handwriting display effect is poor and shows obvious wavy lines; after the solution provided by the present application is adopted, the final handwriting display is smoother and straighter.
With reference to the method provided by the first aspect, when the input device is the active capacitive pen, the downlink signal is specifically an excitation signal that the driving unit in the active capacitive pen sends to the pen tip.
In this way, driven by the active capacitive pen, the sensor in the electronic device can effectively receive the downlink signal, ensuring smooth display of the movement track.
In combination with the method provided by the first aspect, the downlink signal may be, but is not limited to, one of three types: a square wave, a sine wave or a triangular wave.
In this way, the active capacitive pen can carry the downlink signal using various signal types, which improves the practicability of the application.
With reference to the method provided by the first aspect, when the input device is the active capacitive stylus, the downlink signal carries pressure information of the active capacitive stylus; the pressure information is related to the display effect of the movement trajectory.
In this way, the electronic device can also control the display effect of the movement track according to the pressure information sent by the active capacitive pen; for example, the larger the pressure, the thicker the displayed track, and the smaller the pressure, the thinner the displayed track, so that the user can write or draw according to personal needs.
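As a hedged illustration only, the relationship between reported pressure and stroke width could be implemented as a simple mapping like the following; the linear rule, the pressure range and the width range are assumptions and are not specified by the patent.

```python
def stroke_width(pressure: float,
                 min_width: float = 1.0,
                 max_width: float = 8.0,
                 max_pressure: float = 4096.0) -> float:
    """Map reported pen pressure to a display stroke width.

    Assumed linear mapping for illustration: higher pressure gives a thicker
    stroke, lower pressure a thinner one. The pressure range (0..max_pressure)
    and the width range are hypothetical values.
    """
    ratio = max(0.0, min(pressure, max_pressure)) / max_pressure
    return min_width + ratio * (max_width - min_width)


print(stroke_width(1024.0))   # lighter touch -> thinner line
print(stroke_width(3800.0))   # firmer touch  -> thicker line
```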
In a second aspect, the present application provides a chip for an electronic device, the chip comprising one or more processors configured to invoke computer instructions to cause the electronic device to perform the method as described in any of the above first aspects.
In a third aspect, the present application provides a computer-readable storage medium comprising instructions that, when executed on an electronic device, cause the electronic device to perform the method as described in any of the first aspects above.
In a fourth aspect, the present application provides an electronic device comprising one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors, the one or more memories being configured to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method as described in any of the first aspects above.
Drawings
Fig. 1 is a schematic view of an application scenario of a touch method according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of an active pen 200 according to an embodiment of the present disclosure;
fig. 3A is a schematic diagram of a sensor arrangement of a touch screen 110 according to an embodiment of the present disclosure;
fig. 3B-3C are schematic diagrams illustrating a coordinate detection method of the touch screen 110 according to an embodiment of the disclosure;
FIG. 4 is a schematic diagram of a handwriting display interface provided by an embodiment of the present application;
fig. 5 is a schematic flowchart of a touch display method according to an embodiment of the present disclosure;
fig. 6 is a schematic flowchart of a compensation function obtaining method according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram illustrating a correspondence between a theoretical abscissa and an actual abscissa provided in an embodiment of the present application;
FIG. 8 is a graph illustrating a horizontal axis fitting function provided in an embodiment of the present application;
FIG. 9 is a graph illustrating a horizontal axis fitting difference function according to an embodiment of the present disclosure;
fig. 10 is a schematic diagram illustrating a relationship between a compensation abscissa and a theoretical abscissa according to an embodiment of the present application;
FIG. 11 is a schematic diagram of another handwriting display interface provided in the embodiments of the present application;
fig. 12 is a schematic diagram of a hardware structure of a touch device 100 according to an embodiment of the present disclosure;
fig. 13 is a schematic diagram of a software structure of a touch device 100 according to an embodiment of the present disclosure;
fig. 14 is a schematic diagram of a hardware structure of an active pen 200 according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be described in detail and clearly with reference to the accompanying drawings. In the description of the embodiments herein, "/" means "or" unless otherwise specified, for example, a/B may mean a or B; the "and/or" in the text is only an association relation describing the association object, and indicates that three relations may exist, for example, a and/or B may indicate: a exists alone, A and B exist simultaneously, and B exists alone.
In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the application, unless stated otherwise, "plurality" means two or more.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The term "User Interface (UI)" in the following embodiments of the present application is a medium interface for performing interaction and information exchange between an application program or an operating system and a user, and implements conversion between an internal form of information and a form acceptable to the user. The user interface is source code written by java, extensible markup language (XML) and other specific computer languages, and the interface source code is analyzed and rendered on the electronic device and finally presented as content which can be identified by the user. A commonly used presentation form of the user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be a visual interface element such as text, an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc. displayed in a display of the electronic device.
With the wide application of touch screens and touch display technologies, the present application provides a touch display method, a graphical interface and a related device to address the defects of existing touch display technologies, namely that the algorithm used for calculating touch coordinates is inaccurate and the hardware properties of the touch screen introduce noise, so that the handwriting output by the display screen is not a straight or smooth line but a wavy one. In the method, a touch device acquires a horizontal-axis compensation function and a vertical-axis compensation function in advance. The horizontal-axis compensation function represents the correspondence between measured abscissas and compensation values, and the vertical-axis compensation function represents the correspondence between measured ordinates and compensation values. When the touch device detects a touch operation performed on the touch screen by an input device, it obtains the measured coordinate corresponding to the touch coordinate at which the operation occurred, looks up the compensation value corresponding to the abscissa of the measured coordinate in the horizontal-axis compensation function, and adds that compensation value to the measured abscissa to obtain the compensated abscissa; similarly, it looks up the compensation value corresponding to the ordinate of the measured coordinate in the vertical-axis compensation function and adds it to the measured ordinate to obtain the compensated ordinate, finally obtaining the compensation coordinate corresponding to the measured coordinate. The touch device then displays a movement track formed by connecting a plurality of continuous compensation coordinates.
In the following embodiments of the present application, three kinds of coordinate points are involved, specifically as follows:
Touch coordinate: the coordinate point on the touch screen actually touched by the input device; it may also be called a theoretical coordinate.
Measured coordinate: the coordinate point preliminarily calculated by the processor from the electric signals detected by the plurality of sensing units arranged in the touch screen.
Compensation coordinate: a coordinate point obtained by compensating the error of the measured coordinate, which is closer to the touch point than the measured coordinate.
Therefore, the touch display method provided by the application can compensate errors existing when the processor calculates the touch coordinates, obtain the compensation coordinates which are closer to the actual touch coordinates (theoretical coordinates), and greatly reduce or eliminate the influence caused by the calculation errors of the touch coordinates, so that when a user writes or draws on the touch equipment, the touch equipment can display straighter and smoother handwriting which accords with the expectation of the user.
The specific method for obtaining the compensation function may refer to the following method, and is not described herein again.
In order to more clearly introduce the touch display method provided by the present application, a scene diagram applicable to the touch display method provided by the present application is introduced first.
Referring to fig. 1, fig. 1 schematically illustrates a scene diagram to which the touch display method provided by the present application is applied.
As shown in fig. 1, the scene includes a touch device 100 and an input device. In fig. 1, the touch device 100 is a tablet and the input device is an electronic pen, as an example. The input device provides input to the touch device 100, and the touch device 100 performs an operation in response to that input. Specifically, the touch device 100 may provide a touch area (e.g., the touch screen 110), and the input device may write, draw and so on in the touch area, so that the touch device can display the movement path of the input device, i.e. display handwriting. The handwriting display method is specifically as follows: the touch device 100 acquires the horizontal-axis compensation function and the vertical-axis compensation function in advance. The horizontal-axis compensation function represents the correspondence between measured abscissas and compensation values, and the vertical-axis compensation function represents the correspondence between measured ordinates and compensation values. When the touch device detects a touch operation performed on the touch screen by the input device, it obtains the measured coordinate corresponding to the touch coordinate at which the operation occurred, looks up the compensation value corresponding to the abscissa of the measured coordinate in the horizontal-axis compensation function, and adds that compensation value to the measured abscissa to obtain the compensated abscissa; similarly, it looks up the compensation value corresponding to the ordinate of the measured coordinate in the vertical-axis compensation function and adds it to the measured ordinate to obtain the compensated ordinate, finally obtaining the compensation coordinate corresponding to the measured coordinate. The touch device then displays the handwriting formed by connecting a plurality of continuous compensation coordinates.
In one embodiment, the touch device 100 and the input device may be interconnected through a communication network to realize interaction of wireless signals. The communication network may be, but is not limited to: a near field communication network such as a WI-FI hotspot network, a WI-FI peer-to-peer (P2P) network, a bluetooth network, a ZigBee network, or a Near Field Communication (NFC) network.
It is understood that the forms of the touch device 100 and the input device shown in fig. 1 are only examples. In other embodiments of the present application, the touch device 100 may be a tablet or another device, and the input device may be a device other than an electronic pen. Specifically:
the touch device 100 may also be other devices configured with the touch screen 110, such as a mobile phone, a hand drawing board, an Augmented Reality (AR) device, a Virtual Reality (VR) device, an Artificial Intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device and/or a smart city device, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, and a personal digital assistant (PDA, etc.), and the embodiment of the present application does not particularly limit the specific type of the touch device 100.
The input device may also be a medium other than an electronic pen, such as another conductive medium like a finger. When the input device is an electronic pen, the electronic pen may be, but is not limited to, an inductive pen or a capacitive pen. When the input device is a capacitive pen, the capacitive pen may include a passive capacitive pen and an active capacitive pen; a passive capacitive pen may also be called a passive pen, and an active capacitive pen may also be called an active pen. One or more electrodes may be disposed in the active capacitive pen (e.g., within the pen tip), through which the active capacitive pen can transmit signals. When the electronic pen is an active capacitive pen, an electrode array needs to be integrated on the touch screen 110 of the touch device 100 that interacts with it. In one embodiment, the electrode array may be a capacitive electrode array. The touch device 100 can receive a signal from the active capacitive pen through the electrode array and then, upon receiving the signal, identify the position of the active capacitive pen on the touch screen, the tilt angle of the active capacitive pen and the like based on the change of the capacitance values on the touch screen 110. When the input device is an inductive pen, an electromagnetic induction board needs to be integrated on the touch screen 110 of the touch device 100 that interacts with it. Coils are distributed on the electromagnetic induction board, and a coil is also integrated in the inductive pen. Based on the electromagnetic induction principle, the inductive pen accumulates electric energy as it moves within the magnetic field generated by the electromagnetic induction board. The inductive pen can transmit the accumulated electric energy back to the electromagnetic induction board through free oscillation of its coil. The electromagnetic induction board can scan its coils based on the electric energy from the inductive pen and calculate the position of the inductive pen on the touch screen 110.
It is to be understood that, in some embodiments, the touch screen may also be called a touch panel, the electronic pen may also be called a stylus, and so on; the embodiments of this application do not limit the names of the above devices.
Based on the above description of the scenario shown in fig. 1, the following mainly describes the structures of the touch screen 110 and the input device involved in the scenario. The present application takes an active capacitive pen (abbreviated as the active pen 200) as the input device and a touch screen 110 adapted to the active pen 200 as an example, and describes the structures of the two in detail.
Referring to fig. 2, fig. 2 illustrates a schematic structural view of an active pen 200.
As shown in fig. 2, the active pen 200 includes a pen tip 201, a driving unit 202, a pen body 203, and a power supply unit 204.
The pen tip 201 may be provided with a pressure sensor (not shown in the figure) and one or more transmitting electrodes and receiving electrodes (not shown in the figure) inside. The pressure sensor is used for detecting the pressure applied by the pen tip on the touch screen 110 (i.e. the external force applied by the touch screen 110), so as to control the thickness of the line output by the touch device 100 when writing on the touch screen 110 according to the pressure detected by the pen tip 201. The transmitting electrode is used for transmitting a downlink signal to the touch screen 110 of the touch device 100. The receiving electrode is used for receiving an uplink signal transmitted by the touch screen 110 of the touch device 100.
Among them, the pen tip 201 and the driving unit 202, the driving unit 202 and the power supply unit 204, etc. can be electrically connected through wires or flexible circuit boards. The power supply unit 204 may include a lithium ion battery, or the power supply unit 204 may include a nickel-chromium battery, an alkaline battery, or a nickel-hydrogen battery. In one embodiment, the battery included in the power supply unit 204 may be a rechargeable battery or a disposable battery, wherein when the battery included in the power supply unit 204 is a rechargeable battery, the active pen 200 may charge the battery in the power supply unit 204 by a wireless charging manner. Of course, the battery in the power supply unit 204 may also be charged by wired charging, for example, a power connector (not shown) is disposed at an end of the pen body 203 near the pen tail, and the power connector is connected to the power supply unit 204.
In the embodiment of the present application, the power supply unit 204 is used to supply power to the driving unit 202, and the driving unit 202 can actively generate an excitation signal (also called a downlink signal) to be sent to the pen tip 201. When the tip 201 of the active pen 200 contacts the touch screen 110, the capacitance value at the corresponding position of the touch screen 110 changes, and the touch device 100 may determine the position of the tip 201 of the active pen 200 on the touch screen 110 based on the capacitance value on the touch screen 110.
In an embodiment, the downlink signal may be a square wave, a sine wave, a triangular wave, or the like, and the downlink signal may carry pressure information, key information, or the like of the active pen 200.
Referring to fig. 3A, fig. 3A schematically illustrates a sensor arrangement of the touch screen 110.
As shown in fig. 3A, when the input device is an active pen 200, the touch screen 110 adapted to the active pen 200 includes a plurality of sensor channels, such as a plurality of transverse-axis sensor channels 111 arranged in a transverse direction, and a plurality of longitudinal-axis sensor channels 112 arranged in a longitudinal direction. The sensor channel is specifically provided with a plurality of mutually insulated electrode arrays. In one embodiment, the electrode array may be a capacitive electrode array. The touch device 100 may receive the downlink signal transmitted from the active pen 200 through the scan electrode array, and then recognize a position (i.e., touch coordinates) of the active pen 200 on the touch screen 110 and the like based on a change in capacitance value on the touch screen 110 when receiving the downlink signal.
In the process in which the touch device 100 determines touch coordinates by scanning the electrode array, the touch device 100 may detect the capacitance change at any touch point in the effective area of the touch screen by continuously scanning the horizontal-axis sensor channels 111 and the vertical-axis sensor channels 112, and then calculate and output horizontal-axis and vertical-axis coordinates according to the capacitance change. It should be noted that the sensor channels in the touch screen 110 form a grid-shaped sensing circuit of capacitive electrodes interleaved horizontally and vertically. When a user holds the active pen 200 to write on the touch screen 110, whenever the pen tip 201 of the active pen 200 lands on a fixed point on the touch screen 110, the capacitance of multiple electrodes near that point changes; in particular, when the active pen 200 writes continuously on the touch screen 110, the capacitance of multiple electrodes near every coordinate point included in the movement track of the active pen 200 changes. Although the processor of the touch device 100 may calculate coordinates from these capacitance changes using a specific algorithm such as a center-of-gravity algorithm or a trigonometric algorithm, because of the hardware properties of the touch screen and the inaccuracy of the coordinate calculation algorithm, there is still a difference between the finally calculated measured coordinates and the actual touch coordinates. That is to say, during actual writing, the measured coordinate detected for each touch coordinate included in the movement track of the active pen may differ from the actual touch coordinate (i.e., the theoretical coordinate); the measured coordinate may be any coordinate point around the touch coordinate, so the finally displayed handwriting differs from the actual track along which the user moved the active pen 200. Reference may be made to fig. 3B-3C.
As shown in fig. 3B, when the active pen 200 is held to input a line on the touch screen 110 that is not perpendicular or parallel to the sensor channels, that line is the actual movement track of the active pen 200 on the touch screen 110, and it includes contact coordinate 1, contact coordinate 2, contact coordinate 3, contact coordinate 4, and so on. However, because the sensors in the touch screen 110 detect capacitance changes at multiple positions near each touch coordinate, and the algorithm for calculating the coordinates is not perfectly accurate, the finally calculated coordinates may be the measured coordinates 1, 2, 3, 4 and so on shown in fig. 3C.
It is understood that fig. 3B-3C merely illustrate that there is an error between the measured coordinates calculated by the touch device 100 and the touch coordinates when a line that is not perpendicular or parallel to the sensor channels is input on the touch screen 110. In other embodiments, when a straight line perpendicular or parallel to the sensor channels is input on the touch screen 110, the measured coordinates calculated by the touch device 100 may still differ from the contact coordinates, but each measured coordinate is a point slightly before or after the contact coordinate on a line perpendicular to the sensor channel, or slightly to the left or right of the contact coordinate on a line parallel to the sensor channel. The handwriting displayed after connecting the measured coordinates is therefore still a straight line perpendicular or parallel to the sensor channels, and the user does not observe that the contact coordinates were calculated incorrectly.
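The center-of-gravity (centroid) algorithm mentioned above can be illustrated with a minimal sketch: the measured coordinate is taken as the capacitance-change-weighted average of the positions of the electrodes that responded. This is a generic sketch of the technique, not the exact algorithm of the touch device 100; the electrode positions and capacitance deltas below are hypothetical.

```python
def centroid_coordinate(electrodes):
    """Estimate a touch coordinate from electrode responses.

    `electrodes` is a list of (x, y, delta_c) tuples, where (x, y) is the
    electrode position and delta_c the detected capacitance change.
    Returns the capacitance-weighted average position (the measured coordinate).
    """
    total = sum(dc for _, _, dc in electrodes)
    x = sum(px * dc for px, _, dc in electrodes) / total
    y = sum(py * dc for _, py, dc in electrodes) / total
    return x, y


# Hypothetical responses of electrodes near a pen-down point:
responses = [(300, 500, 0.8), (310, 500, 1.6), (320, 500, 0.9), (310, 510, 0.5)]
print(centroid_coordinate(responses))  # measured coordinate, which may deviate
                                       # from the true touch coordinate
```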
Referring to FIG. 4, FIG. 4 illustrates a diagram of a handwriting display interface.
Lines 411A, 412A and 413A represent the actual movement tracks of the active pen 200 while it writes on the touch screen 110. Lines 411B, 412B and 413B are the handwriting correspondingly displayed by the touch device 100 for the actual movement tracks 411A, 412A and 413A of the active pen 200.
It should be noted that the user interface 410 actually displayed by the touch device 100 only includes lines 411B, 412B and 413B. The user interface 410 may be an interface provided by a writing or drawing application installed in the touch device 100. Lines 411A, 412A and 413A shown in fig. 4 are not actually displayed by the touch device 100; they merely indicate the actual movement track of the active pen 200 when writing on the touch screen 110, and that track is not necessarily at the positions where lines 411A, 412A and 413A are drawn. When the display screen 120 of the touch device 100 is attached to the touch screen 110, that is, when the touch screen 110 and the display screen 120 are integrated into one touch display screen, the handwriting displayed on the display screen 120 largely coincides with the actual movement track of the active pen 200 on the touch screen 110; when the touch screen 110 is separate from the body of the touch device but connected to it by wireless or wired communication, that is, when the sensors in the touch screen 110 are not attached to the display screen 120, the actual movement track of the active pen 200 on the touch screen 110 corresponds to the handwriting displayed on the display screen 120.
Next, in conjunction with the descriptions of fig. 3A-3C and fig. 4 above, it can be seen that the touch device 100 displays a correspondingly straight line 411B or 412B only when the active pen 200 is used to input, on the touch screen 110, a straight line perpendicular to the horizontal-axis sensor channels (e.g., line 411A) or parallel to them (e.g., line 412A); when the active pen 200 inputs a straight line or curve that is not perpendicular or parallel to the sensor channels (e.g., line 413A), the touch device 100 displays a corresponding line 413B with a wave effect. In some embodiments, the line 413B corresponding to line 413A may also appear jagged rather than wavy, which is not limited in this application.
It is understood that, when a user uses the active pen 200 to write or draw, in most cases the movement track of the active pen 200 on the touch screen 110 is a line that is not perpendicular or parallel to the sensor channels; even when the user intends to draw a straight line, it is difficult to draw one exactly perpendicular or parallel to the sensor channels while holding the active pen 200. As a result, the finally displayed handwriting is not a straight or smooth line but a wavy or jagged one. The touch display method provided by the present application addresses this problem well; reference may be made to the following method embodiments.
Referring to fig. 5, fig. 5 schematically illustrates a flow chart of a touch display method provided in the present application.
As shown in fig. 5, the touch display method specifically includes the following steps:
s501, the touch device receives a downlink signal sent by the input device.
Specifically, the touch device 100 may activate a sensor in the touch screen 110 in advance, so as to detect a downlink signal sent by the input device. One specific implementation of the sensor in the touch screen 110 is a sensing network formed by an electrode array as shown in fig. 3A. In this embodiment of the application, the touch device may always keep the sensor in the touch screen to continuously work after the touch device is powered on, or the touch device may keep the sensor in the touch screen to continuously work when detecting that the display screen is lighted, or the touch device may keep the sensor in the touch screen to continuously work after detecting that the communication connection with the input device is established. The working time of the touch screen is not limited in the embodiment of the application.
The input device may specifically be a medium such as the above-mentioned finger, an electronic pen, and the like, which is not limited in this embodiment of the present application. When the input device is specifically the active pen 200, the active pen 200 may transmit a downlink signal to the touch device 100 through any one or more of the transmitting electrode, the WI-FI module, the bluetooth module, the ZigBee module, and the NFC module, and the corresponding touch device 100 may receive the downlink signal through any one or more of the receiving electrode, the WI-FI module, the bluetooth module, the ZigBee module, and the NFC module. The receiving electrodes are the specific implementation form of the sensor in the touch screen 110.
In some embodiments, when the active pen 200 touches the touch screen (e.g., when the input device is held to perform a touch, writing or similar operation on the touch screen 110 of the touch device 100), the touch device 100 may receive the downlink signal and calculate the measured coordinates of the touch point on the touch screen according to the downlink signal. In other embodiments, when the active pen 200 hovers above the touch screen 110 and the distance between the active pen 200 and the touch screen 110 is within a preset range, the touch device may also detect the downlink signal sent by the active pen 200. It can be understood that, in different sending scenarios, the information carried by the downlink signal may differ: when the active pen 200 contacts the touch screen 110, the downlink signal carries pressure information, which the touch device uses to determine the thickness of the writing trace; when the active pen 200 does not contact the touch screen 110, the downlink signal carries more function information, such as key, working mode and tilt angle information, for controlling the touch device 100 to perform corresponding operations without contact.
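As an illustration only, the information carried by the downlink signal in the two scenarios above could be represented as follows; the patent does not define a concrete data format, and all field names below are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DownlinkInfo:
    """Hypothetical representation of information carried by a downlink signal."""
    in_contact: bool                  # True when the pen tip touches the touch screen
    pressure: Optional[int] = None    # carried when in contact; used for stroke thickness
    key_state: Optional[int] = None   # carried when hovering; pen key information
    work_mode: Optional[str] = None   # carried when hovering; working mode
    tilt_deg: Optional[float] = None  # carried when hovering; pen tilt angle


# A contact report carrying pressure, and a hover report carrying function information:
contact_report = DownlinkInfo(in_contact=True, pressure=2048)
hover_report = DownlinkInfo(in_contact=False, key_state=1, work_mode="draw", tilt_deg=30.0)
print(contact_report, hover_report)
```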
In some embodiments, when the input device is an inactive pen, such as a finger, a passive pen, an inductive pen, or the like, such input device may not emit a downlink signal, and when the input device touches the touch screen, because the input device has a conductive capability or an electromagnetic induction capability, a capacitance, an electric field, or the like in the touch screen may be changed, so that the touch device obtains measured coordinates according to a position of a changed sensor.
S502, the touch device calculates a plurality of measured coordinates of the input device's pen-down points on the touch screen according to the downlink signal.
When the touch device 100 receives the downlink signal, the measured coordinates can be calculated according to the change of capacitance in the touch screen 110. When the input device touches down at a certain position on the touch screen 110, capacitance changes occur at a plurality of sensing units, i.e. electrodes, around that position, and the capacitance change values at different positions may differ. The sensor then sends the information of the electrodes whose capacitance change satisfies a preset condition to the processor, and the processor calculates the contact position of the input device on the touch screen 110, i.e. the measured coordinate, using a coordinate algorithm.
In the above, the algorithm specifically used when the processor calculates the coordinate may refer to the prior art, such as a trigonometric algorithm, a center of gravity algorithm, and the like, which is not limited in the embodiment of the present application and is not repeated.
It is understood that the measured coordinates calculated in step S502 have the property of being computationally inaccurate as illustrated in fig. 3B-3C above. If the touch device displays the handwriting formed by connecting the plurality of actual measurement coordinates according to the actual measurement coordinates calculated in step S502, the display effect of the handwriting is similar to that introduced in the user interface shown in fig. 4 (the effect of the wave, the sawtooth line, and the like).
S503, the touch control device obtains compensation values corresponding to the plurality of actual measurement coordinates in the compensation function respectively, and calculates a plurality of compensation coordinates of the plurality of actual measurement coordinates after compensation.
Specifically, after the processor of the touch device 100 calculates the plurality of measured coordinates, it can obtain, from the compensation function, the compensation value corresponding to each measured coordinate, and then superimpose each measured coordinate with its corresponding compensation value to obtain the final compensation coordinate.
In this embodiment of the application, the compensation function specifically includes a horizontal-axis compensation function and a vertical-axis compensation function, where the horizontal-axis compensation function represents the correspondence between measured abscissas and compensation values, and the vertical-axis compensation function represents the correspondence between measured ordinates and compensation values. The compensation value corresponding to the abscissa of a measured coordinate is looked up in the horizontal-axis compensation function, and the measured abscissa is added to that compensation value to obtain the compensated abscissa; similarly, the compensation value corresponding to the ordinate of the measured coordinate is looked up in the vertical-axis compensation function, and the measured ordinate is added to that compensation value to obtain the compensated ordinate. The compensation coordinate corresponding to the measured coordinate is thus obtained, and the touch device then displays the handwriting formed by connecting a plurality of continuous compensation coordinates.
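A minimal sketch of this compensation step is shown below, assuming the horizontal-axis and vertical-axis compensation functions are available as callables (for example, the fitted polynomials described later); the zero-valued placeholder functions in the usage line are assumptions for illustration only.

```python
from typing import Callable, Iterable, List, Tuple

Coordinate = Tuple[float, float]


def compensate(measured: Iterable[Coordinate],
               h_comp: Callable[[float], float],
               v_comp: Callable[[float], float]) -> List[Coordinate]:
    """Superimpose each measured coordinate with its compensation values.

    h_comp maps a measured abscissa to its horizontal-axis compensation value;
    v_comp maps a measured ordinate to its vertical-axis compensation value.
    """
    return [(x + h_comp(x), y + v_comp(y)) for x, y in measured]


# Usage with placeholder compensation functions (identically zero here):
trace = [(0.0, 0.0), (40.0, 42.0), (110.0, 96.0)]
print(compensate(trace, lambda x: 0.0, lambda y: 0.0))
```

The resulting list of compensation coordinates is what the touch device connects into the displayed handwriting.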
It will be appreciated that, in some embodiments, the horizontal-axis compensation function and the vertical-axis compensation function may be the same or different. Specifically, when the sensor channels in the touch screen 110 of the touch device 100 are arranged symmetrically, for example when the numbers of horizontal and vertical sensor channels are the same, the spacings between the horizontal and vertical sensor channels are the same, and the number and structure of the sensing units, i.e. electrodes, in the horizontal and vertical sensors are the same, the horizontal-axis compensation function and the vertical-axis compensation function may be the same; otherwise, the two compensation functions are different.
That is to say, the compensation function proposed in this embodiment of the application is related to the hardware structure of the touch screen 110 in the touch device 100 and to the algorithm (for example, the center-of-gravity algorithm or the trigonometric algorithm) used to calculate the measured coordinates; touch screens 110 with different hardware structures, or using different algorithms to calculate the measured coordinates, use different compensation functions. For example, within the same model of touch device, the hardware structure of the touch screen is the same and the algorithm used to calculate the touch coordinates is the same, so the same compensation function can be used. That is, the compensation function can be obtained in advance after the touch screen 110 is produced and stored in the memory of the touch device 100 beforehand, so that it can be read directly from the memory when the touch device 100 subsequently calculates compensation coordinates from measured coordinates.
As for the method for obtaining the compensation function, reference may be made to the method flow shown in fig. 6, which specifically includes the following steps:
s5031, a horizontal straight line parallel to the horizontal axis (X axis) is drawn in the touch panel 110 at a first speed using the manipulator.
In the embodiment of the present application, the manipulator moves the active pen 200 (or another input medium) instead of the pen being held by the user's hand or the user's finger; in general, the manipulator is any medium whose input operation on the touch screen 110 can be detected by the touch screen 110 so that contact coordinates can be calculated from that input operation.
In the embodiment of the present application, the first speed is a constant speed lower than a preset value. That is, the manipulator draws a horizontal straight line parallel to the horizontal axis on the touch screen 110 at a low, constant speed. This not only ensures that the touch screen 110 has enough time to detect the manipulator's touch operation, but also allows the ideal coordinates that the touch device 100 should theoretically detect, i.e. the theoretical coordinates, to be obtained later through a simple calculation, preparing for the subsequent acquisition of the compensation function.
S5032, according to the downlink signal detected by the touch screen 110, a plurality of continuous actual measurement coordinates included in the horizontal straight line are obtained through calculation.
Specifically, when the manipulator draws a straight line in the touch screen 110 at a constant speed, the sensor in the touch screen 110 periodically scans the downlink signal sent by the input device to detect the corresponding coordinates, and finally obtains a plurality of continuous measured coordinates contained in the horizontal straight line. The ordinate of these continuous measured coordinates is a fixed value, but their abscissa values do not increase by a constant difference from the abscissa value of the initial measured coordinate. This is because the hardware structure of the sensor and the coordinate calculation algorithm in the touch screen 110, described above with respect to fig. 3A to 3C, introduce errors into the coordinate calculation result. Therefore, although the manipulator draws a horizontal straight line parallel to the horizontal axis (X axis) in the touch screen 110 at a constant speed, the plurality of continuous measured coordinates obtained by the sensor in the touch screen 110 through periodically scanning the capacitance changes neither match the theoretical abscissa values nor form an arithmetic progression.
In the following embodiments of the present application, the abscissa values in the measured coordinates are taken to be 0, 40, 110, 140, 210, 240, 310, 340, 410, 440, 510, 540, 610, 640, 710, and 1000 in this order as an example. These values are only an example; the measured coordinates may also include more or fewer abscissa values, which is not limited in the embodiment of the present application.
S5033, obtaining a plurality of continuous theoretical coordinates included in the horizontal straight line according to the start actual measurement coordinate, the end actual measurement coordinate, and the first speed of the manipulator.
Specifically, the initial actual measurement coordinate (i.e., (0, 0)) and the final actual measurement coordinate (i.e., (1000, 1000)) of the actual measurement coordinates of the horizontal line may be taken as the initial point and the final point of the theoretical coordinates of the horizontal line, and then a plurality of theoretical coordinates consistent with the scanning period of the touch screen 110 may be calculated by combining the first speed of the manipulator.
It can be seen that the ordinate of the plurality of consecutive theoretical coordinates contained in the horizontal straight line is a fixed value, and the abscissa values of the plurality of consecutive theoretical coordinates form an arithmetic progression starting from the abscissa value of the initial measured coordinate (also called the initial theoretical coordinate). This is because, if the manipulator draws a horizontal straight line parallel to the horizontal axis (X axis) in the touch screen 110 at a constant speed, the theoretical coordinates sampled at a fixed period satisfy the above condition.
In the following embodiments of the present application, the abscissa values in the theoretical coordinates are taken to be 0, 50, 100, 150, 200, 250, 300, 350, 400, 450, 500, 550, 600, 650, 700, and 1000 in this order as an example. These values are only an example; the theoretical coordinates may also include more or fewer abscissa values, which is not limited in the embodiments of the present application.
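For illustration, step S5033 can be pictured as spacing theoretical abscissas evenly between the start and end measured abscissas; in the sketch below the sample count, which stands in for the number of scan periods derived from the first speed, is an assumed value:

```python
import numpy as np

def theoretical_abscissas(start_x: float, end_x: float, num_samples: int) -> np.ndarray:
    """Evenly spaced abscissas: constant drawing speed plus a fixed scan period
    means the theoretical abscissas form an arithmetic progression."""
    return np.linspace(start_x, end_x, num=num_samples)

# With 21 samples between the start (0) and end (1000) measured abscissas,
# the values 0, 50, 100, ..., 1000 are obtained (21 is an assumed count).
print(theoretical_abscissas(0.0, 1000.0, 21))
```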
Referring to fig. 7, fig. 7 is a schematic diagram illustrating a correspondence relationship between a theoretical abscissa and a measured abscissa.
As shown in fig. 7, the ordinate axis in the two-dimensional coordinate system represents the abscissa value in the measured coordinate, and the abscissa axis represents the abscissa value in the theoretical coordinate. A plurality of coordinates in the coordinate system shown in fig. 7, such as (0, 0), (50, 40), (100, 110), (150, 140), (200, 210), (250, 240), (300, 310), (350, 340), (400, 410), (450, 440), (500, 510), (550, 540), (600, 610), (650, 640), (700, 710), and (1000, 1000), etc., are obtained according to the abscissa values in the measured coordinates and the theoretical coordinates respectively acquired in step S5032 and step S5033, and are not exhaustively listed herein.
According to these coordinates, the relationship between the abscissa value in the theoretical coordinate and the abscissa value in the measured coordinate can be drawn, that is, the wave line shown in fig. 7. The wave line vividly shows that the abscissa value in the measured coordinate differs from the abscissa value in the theoretical coordinate. In an ideal calculation result, the measured coordinate value obtained by detection and calculation equals the theoretical coordinate value, which better satisfies the writing experience of the user. That is, in the coordinate system shown in fig. 7, only when the functional relationship between the abscissa value in the theoretical coordinate and the abscissa value in the measured coordinate satisfies Y = X does the measured coordinate value obtained by the detection calculation in the touch device 100 equal the theoretical coordinate value, as it should.
S5034, performing data fitting according to a plurality of original coordinates consisting of the measured abscissa values and the corresponding theoretical abscissa values to obtain a horizontal axis fitting function.
First, as can be seen from the functional relationship shown in fig. 7, the function repeats with a period of half a sensing channel in the touch screen 110: because the hardware structure of each sensing channel is symmetrical, the curve over each half of a sensing channel constitutes one period in the coordinate system. Thus, a fitting function can be obtained by studying the coordinate points within half of a sensing channel in the functional relationship shown in fig. 7. The specific steps are as follows:
referring to fig. 8, fig. 8 illustrates a graph in which the original coordinates within half a sensing channel are data-fitted to obtain the horizontal axis fitting function.
As shown in fig. 8, the series of original coordinates in fig. 8 is obtained by extracting the plurality of coordinates within the half channel shown in fig. 7. Data fitting is then performed on these original coordinates to obtain the horizontal axis fitting function. In this embodiment of the present application, the data fitting may specifically be a first-order fitting, that is, a linear regression, or may be a second-order fitting, a third-order fitting, and the like, which is not limited in this embodiment of the present application. The fitting function shown in fig. 8 takes the result obtained after a third-order (cubic) fitting as an example. The specific steps of fitting can refer to the prior art and are not described herein.
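A minimal sketch of the data fitting of step S5034, assuming a third-order (cubic) polynomial as in fig. 8; the calibration values below are synthetic and only illustrate the idea of fitting the measured abscissa against the theoretical abscissa within half a sensing channel:

```python
import numpy as np

# Illustrative original coordinates (theoretical abscissa, measured abscissa)
# within one half sensing channel; synthetic values, not data from the embodiment.
theoretical = np.linspace(0.0, 100.0, 11)
measured = theoretical - 15.0 * np.sin(np.pi * theoretical / 100.0)

# Third-order fit of the measured abscissa against the theoretical abscissa,
# as in fig. 8; a first-order (linear regression) or second-order fit could
# be used instead.
fit_fn = np.poly1d(np.polyfit(theoretical, measured, deg=3))
```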
S5035, performing a difference calculation on the horizontal axis fitting function and the theoretical coordinate to obtain a horizontal axis fitting difference function (i.e., a horizontal axis compensation function).
Referring to fig. 9, fig. 9 illustrates a diagram of fitting a difference function on the horizontal axis.
As shown in fig. 9, the dashed line is used to represent the true difference between the horizontal axis fitting function and the theoretical coordinate; the solid line is the fitting difference function obtained by fitting the true difference, also called the horizontal axis compensation function. It can be understood that the fitting difference function is only illustrated as a cubic fit of the true difference; in other embodiments of the present application, a corresponding fitting difference function may also be obtained by performing a first-order or second-order fitting on the true difference, which is not limited in this embodiment of the present application.
It should be noted that the compensation function applied in the embodiment of the present application is the fitting difference function obtained by fitting the true difference, not the original true difference function, because fitting the original data has the following significance:
data fitting is a method often used when processing and analyzing experimental data (i.e., measured abscissa). The purpose of the data fitting is to find an expression of the relationship between the reaction variables (i.e. measured abscissas) that best approximates the known data under certain criteria. The principle specifically comprises: least squares, chebyshev, and the like. Particularly, the method of applying curve fitting reveals that the relation between the measured abscissas has important theoretical significance and practical significance. For example, the correlation between the measured abscissas is reflected by the cross-axis fitting difference function, so that the correlation between the measured abscissas can be vividly and simply reflected, and the final result can be obtained by simply calculating only by directly connecting the measured abscissas to the function when the compensated abscissas are obtained subsequently according to the measured abscissas, thereby simplifying the subsequent data processing steps.
S5036, compensating the measured abscissa according to the horizontal axis compensation function to obtain a compensated abscissa.
Specifically, after the touch device 100 detects the measured coordinates, the horizontal axis fitting difference corresponding to the measured abscissa value may be found in the horizontal axis compensation function; this horizontal axis fitting difference may also be referred to as a compensation value, and the measured abscissa value is superimposed with the compensation value to obtain the final compensated abscissa.
Referring to fig. 10, fig. 10 exemplarily shows a diagram illustrating a correspondence relationship between a compensation abscissa and a theoretical abscissa.
As shown in fig. 10, the functional relationship of the compensated abscissa to the theoretical abscissa can be approximated by the expression Y = X. That is, after the compensation function obtaining method shown in steps S5031 to S5036 is adopted, the measured abscissa can be compensated so that the obtained compensated abscissa is almost equal to the theoretical abscissa in the ideal case, and the abscissa detection error is thereby eliminated to a great extent.
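Continuing the same synthetic sketch, step S5036 can be pictured as superimposing the looked-up compensation value on each measured abscissa and comparing the result against the theoretical abscissas:

```python
import numpy as np

# Synthetic illustrative values and the fitted compensation function from above.
theoretical = np.linspace(0.0, 100.0, 11)
measured = theoretical - 15.0 * np.sin(np.pi * theoretical / 100.0)
comp_x = np.poly1d(np.polyfit(measured, theoretical - measured, deg=3))

# Step S5036: compensated abscissa = measured abscissa + compensation value.
compensated = measured + comp_x(measured)

# Ideally the compensated abscissas track the theoretical ones, i.e. the
# relationship approaches Y = X as in fig. 10.
print(np.max(np.abs(compensated - theoretical)))   # residual error after compensation
```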
It is understood that the above steps S5031-S5036 only describe the method of obtaining the horizontal axis compensation function; the method of obtaining the vertical axis compensation function is similar thereto (a sketch follows the steps below). The method specifically comprises the following steps:
drawing a vertical straight line parallel to the longitudinal axis (Y axis) in the touch screen 110 at a constant speed by using the manipulator;
according to the downlink signal detected by the touch screen 110, a plurality of continuous measured coordinates contained in the vertical straight line are obtained through calculation;
acquiring a plurality of continuous theoretical coordinates contained in the vertical straight line according to the initial measured coordinate, the final measured coordinate and the first speed of the manipulator;
performing data fitting according to a plurality of original coordinates consisting of the actual measurement longitudinal coordinate values and the corresponding theoretical longitudinal coordinate values to obtain a longitudinal axis fitting function;
acquiring a functional relation (namely a longitudinal axis compensation function) of the actually measured longitudinal coordinate and the longitudinal axis fitting difference value according to the longitudinal axis fitting function;
and compensating the measured vertical coordinate according to the vertical axis compensation function to obtain a compensated vertical coordinate.
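Since the vertical-axis flow mirrors the horizontal-axis flow step for step, the sketch referred to above wraps the fitting into a single helper applied once per axis; the calibration values and polynomial degree remain synthetic assumptions:

```python
import numpy as np

def build_axis_compensation(measured: np.ndarray, theoretical: np.ndarray, deg: int = 3) -> np.poly1d:
    """Fit the true difference (theoretical - measured) as a polynomial in the
    measured value; used once for the X axis and once for the Y axis."""
    return np.poly1d(np.polyfit(measured, theoretical - measured, deg))

# Synthetic calibration strokes: a horizontal line for the X axis and a
# vertical line for the Y axis (values are illustrative only).
theo = np.linspace(0.0, 100.0, 11)
meas_x = theo - 15.0 * np.sin(np.pi * theo / 100.0)
meas_y = theo - 12.0 * np.sin(np.pi * theo / 100.0)

comp_x = build_axis_compensation(meas_x, theo)   # horizontal axis compensation function
comp_y = build_axis_compensation(meas_y, theo)   # vertical axis compensation function
```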
And S504, the touch control equipment controls the display screen to display the handwriting formed by connecting the compensation coordinates.
Specifically, after the processor of the touch device 100 acquires the compensation coordinates corresponding to the plurality of actually measured coordinates, the display screen may be controlled to display the corresponding handwriting according to the plurality of compensation coordinates. Reference may be made in particular to fig. 11.
Fig. 11 illustrates another schematic user interface output by the touch device 100.
As shown in fig. 11, the line 611A, the line 612A, and the line 613A are actual movement traces of the active pen 200 while the active pen 200 is writing on the touch screen 110. The line 611B, the line 612B, and the line 613B are the handwriting displayed by the touch device 100 according to the actual movement trajectories of the active pen 200, namely the line 611A, the line 612A, and the line 613A, respectively.
As can be seen from fig. 11, after the touch display method shown in fig. 5 is adopted, when the user uses the active pen 200 to input handwriting such as the line 613A to the touch device 100, the touch device 100 displays the line 613B with a straight line effect, instead of the line 413B with a wave or sawtooth effect displayed in the prior art; when the user inputs handwriting such as the line 611A and the line 612A to the touch device 100 using the active pen 200, the touch device 100 also displays the line 611B and the line 612B with a straight line effect, which are more accurate than the line 411B and the line 412B displayed in the prior art.
It is noted that the user interface 610 actually displayed by the touch device 100 only includes the line 611B, the line 612B, and the line 613B. The user interface 610 may be an interface provided by a writing and drawing type application installed in the touch device 100. The line 611A, the line 612A, and the line 613A shown in fig. 11 are not actually displayed by the touch device 100; they are merely used to illustrate the actual movement track of the active pen 200 when the active pen 200 writes on the touch screen 110, and the actual movement track of the active pen 200 is not located at the positions where the line 611A, the line 612A, and the line 613A are drawn in the figure. When the display screen 120 of the touch device 100 is attached to the touch screen 110, that is, when the touch screen 110 and the display screen 120 are integrated into a touch display screen, the actual moving track of the active pen 200 on the touch screen 110 almost completely overlaps the handwriting displayed on the display screen 120. When the touch screen 110 is independent from the body of the touch device but is connected to the touch device through wireless or wired communication, that is, when the sensor in the touch screen 110 is not attached to the display screen 120, the actual moving track of the active pen 200 on the touch screen 110 has a corresponding relationship with the handwriting displayed on the display screen 120; for example, when writing is performed in the upper left corner of the touch screen 110, the corresponding handwriting is displayed in the upper left corner of the display screen 120.
The display screen 120 of the touch device 100 may display a corresponding effect according to the electrical signal received by the touch screen 110. For example, when the touch device 100 opens an application (APP) installed in the operating system, such as a writing or drawing application, the display screen may display the moving path of the active pen 200, that is, the handwriting, generated by the input device contacting and moving within an area designated by the touch screen 110. Effects of the handwriting, such as color, thickness, and shape, may be configured through software functions.
In the embodiment of the present application, the touch screen 110 may be integrated with the touch device 100 into a single machine; in that case, the touch screen 110 is further integrated with the display screen 120 into a touch display screen, that is, the sensor unit in the touch screen 110 is attached to the display screen 120. It is understood that, in other embodiments of the present application, the touch screen 110 may be independent from the body of the touch device but connected to the touch device through wireless or wired communication. That is, the sensor unit of the touch screen 110 and the display screen 120 may not be attached to each other; for example, in the case of a notebook computer with an external drawing tablet, the notebook computer includes the display screen 120, and the sensor unit for detecting the touch signal is disposed in the external drawing tablet.
Based on the above detailed description of the application scenario and method embodiments of the touch display method, embodiments of the touch device 100 and the active pen 200 related to the application scenario are described next. The details are as follows:
referring to fig. 12, fig. 12 schematically illustrates a hardware structure of the touch device 100 provided in the present application.
As shown in fig. 12, the touch device 100 may include a touch screen 110, a display 120, a processor 130, a memory 140, a Universal Serial Bus (USB) interface 150, a charging management module 160, a power management module 161, a battery 162, an antenna 1, an antenna 2, a mobile communication module 170, a wireless communication module 180, a sensor module 190, and other modules not shown in the drawings, such as a key, a motor, an indicator, a camera, and a Subscriber Identity Module (SIM) card interface. The sensor module 190 may include a pressure sensor 190A, a touch sensor 190B, an air pressure sensor 190C, a magnetic sensor 190D, an acceleration sensor 190E, a distance sensor 190F, a proximity light sensor 190G, a fingerprint sensor 190H, a temperature sensor 190J, a gyroscope sensor 190K, an ambient light sensor 190L, a bone conduction sensor 190M, and the like.
It is understood that the exemplary structure of the embodiment of the present application does not constitute a specific limitation to the touch device 100. In other embodiments of the present application, the touch device 100 may include more or fewer components than those shown, or combine certain components, or split certain components, or arrange different components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Touch screen 110 may also be referred to as a "touch panel". The touch screen 110 has a sensor layer integrated therein, which may include a pressure sensor 190A, a touch sensor 190B, and the like. The sensor layer can operate in multiple modes. If operating in mutual capacitance mode, the column and row traces form a single capacitive sensing node at each overlap point (e.g., a "vertical" mutual capacitance). If operating in self-capacitance mode, the column and row traces form two (vertically aligned) capacitive sensing nodes at each overlap point. In another embodiment, adjacent column traces and/or adjacent row traces may each form a single capacitive sensing node (e.g., a "horizontal" mutual capacitance) if operating in a mutual capacitance mode. As described above, the sensor layer may detect the presence of the tip 201 of the active pen 200 and/or the touch of the user's finger by monitoring changes in capacitance (e.g., mutual or self capacitance) present at each capacitive sensing node.
In some embodiments of the present application, the touch screen 110 and the display screen 120 may be integrated into a touch display screen, that is, the sensor layer in the touch screen 110 is attached to the components in the display screen, and the user can input a touch operation in the touch display screen and the corresponding effect is also displayed in the touch display screen. In other embodiments of the present application, the touch screen 110 and the display screen 120 may be two independent devices. That is, the sensor layer in the touch screen 110 is not attached to the components in the display screen, and the user inputs a touch operation in the touch screen 110 while the corresponding effect is displayed in the display screen 120.
The pressure sensor 190A in the touch screen 110 can be used for sensing a pressure signal and converting the pressure signal into an electrical signal. The pressure sensor 190A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 190A, the capacitance between the electrodes changes. The touch device 100 determines the intensity of the pressure according to the change of the capacitance. When a touch operation is applied to the display screen 120, the touch device 100 detects the intensity of the touch operation according to the pressure sensor 190A. The touch device 100 may also calculate the touched position according to the detection signal of the pressure sensor 190A. For the sensing network in the touch screen 110 formed by the capacitive pressure sensors, reference may be made to the detailed description of fig. 3A, which is not repeated herein. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
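Purely for illustration, the mapping from touch intensity to operation instruction described in the example above might look like the following sketch; the threshold value and instruction names are assumptions, not the embodiment's actual logic:

```python
FIRST_PRESSURE_THRESHOLD = 0.5   # assumed normalized pressure threshold

def instruction_for_touch(pressure: float) -> str:
    """Map the detected touch intensity on the short message application icon
    to an operation instruction."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"      # lighter press: view the short message
    return "create_short_message"        # firmer press: create a new short message
```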
The touch sensor 190B in the touch screen 110 described above may be used to detect a touch operation applied thereto or nearby. The touch sensor 190B may pass the detected touch operation to the application processor 130 to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 120.
A display screen 120, the display screen 120 for displaying images, video, etc. The display screen 120 includes a display panel. The display panel may employ a Liquid Crystal Display (LCD). The display panel may also be made of organic light-emitting diodes (OLEDs), active-matrix organic light-emitting diodes (AMOLEDs), flexible light-emitting diodes (FLEDs), mini-LEDs, micro-LEDs, quantum dot light-emitting diodes (QLEDs), and the like. In some embodiments, the touch device 100 may include 1 or N display screens 120, N being a positive integer greater than 1.
In the embodiment of the present application, the display screen 120 may display corresponding handwriting on the display screen 120 according to a writing or drawing operation input by a user in the touch screen 110. Specifically, when a user holds an input device, for example, the active pen 200, to write or draw on the touch screen 110, the touch screen 110 may detect that the capacitance changes according to a user operation, and send capacitance change information to the processor 130, the processor 130 may calculate a contact coordinate of the active pen 200 in the touch screen 110 according to the capacitance change information, and then the processor 130 may control the display screen 120 to display handwriting formed by connecting a plurality of continuous coordinates.
Processor 130 may include one or more processing units, such as: the processor 130 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
The controller may be a neural center and a command center of the touch device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 130 for storing instructions and data. In some embodiments, the memory in the processor 130 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 130. If the processor 130 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 130, thereby increasing the efficiency of the system.
In this embodiment of the present application, the memory provided in the processor 130 may be configured to store an algorithm for calculating touch coordinates, and after receiving capacitance change information sent by the touch screen, the processor 130 may directly call the algorithm for calculating touch coordinates to perform coordinate calculation, so as to obtain the actual measurement coordinates described in the above method embodiment; the processor 130 may also call a compensation function to obtain a final compensation coordinate, which is used to control the display screen to display the corresponding handwriting according to the compensation coordinate.
Memory 140 may include internal memory 141 and external memory interface 142.
The internal memory 141 may include one or more Random Access Memories (RAMs) and one or more non-volatile memories (NVMs). The random access memory may include static random-access memory (SRAM), dynamic random-access memory (DRAM), synchronous dynamic random-access memory (SDRAM), double data rate synchronous dynamic random-access memory (DDR SDRAM), such as fifth generation DDR SDRAM generally referred to as DDR5SDRAM, and the like; the nonvolatile memory may include a magnetic disk storage device, a flash memory (flash memory). The FLASH memory may include NOR FLASH, NAND FLASH, 3D NAND FLASH, etc. according to the operation principle, may include single-level cells (SLC), multi-level cells (MLC), three-level cells (TLC), four-level cells (QLC), etc. according to the level order of the memory cell, and may include universal FLASH memory (UFS), embedded multimedia memory cards (eMMC), etc. according to the storage specification. The random access memory may be read and written directly by the processor 130, may be used to store executable programs (e.g., machine instructions) of an operating system or other programs in operation, and may also be used to store data of users and applications, etc. The nonvolatile memory may also store executable programs, data of users and application programs, and the like, and may be loaded into the random access memory in advance for the processor 130 to directly read and write.
The external memory interface 142 may be used to connect an external nonvolatile memory, so as to expand the storage capability of the touch device 100. The external non-volatile memory communicates with the processor 130 through the external memory interface 142 to perform data storage functions. For example, files such as music, video, etc. are saved in an external nonvolatile memory.
In the embodiment of the present application, the memory 140 is further used for storing the compensation function. The compensation function is obtained in advance through the method flow shown in fig. 6 above and is stored in the memory 140 of the touch device 100, so that the compensation function is obtained when the processor 130 calculates the compensation coordinates according to the measured coordinates.
The USB interface 150 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 150 can be used to connect a charger to charge the touch device 100, and can also be used to transmit data between the touch device 100 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as stand-alone drawing boards, AR devices, etc.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not form a structural limitation on the touch device 100. In other embodiments of the present application, the touch device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 160 is used to receive charging input from the charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 160 may receive charging input from a wired charger via the USB interface 150. In some wireless charging embodiments, the charging management module 160 may receive a wireless charging input through a wireless charging coil of the touch device 100. The charging management module 160 may also provide power to the electronic device via the power management module 161 while charging the battery 162.
The power management module 161 is used to connect the battery 162, the charging management module 160 and the processor 130. The power management module 161 receives input from the battery 162 and/or the charging management module 160, and supplies power to the processor 130, the internal memory 141, the external memory, the display 120, the camera, the wireless communication module 180, and the like. The power management module 161 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, the power management module 161 may be disposed in the processor 130. In other embodiments, the power management module 161 and the charging management module 160 may be disposed in the same device.
The wireless communication function of the touch device 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 170, the wireless communication module 180, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the touch device 100 can be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 170 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied on the touch device 100. The mobile communication module 170 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 170 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the filtered electromagnetic wave to the modem processor for demodulation. The mobile communication module 170 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 170 may be disposed in the processor 130. In some embodiments, at least some of the functional modules of the mobile communication module 170 may be disposed in the same device as at least some of the modules of the processor 130.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to a speaker, a receiver, etc.) or displays an image or video through the display screen 120. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 170 or other functional modules, independent of the processor 130.
The wireless communication module 180 may provide a solution for wireless communication applied to the touch device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 180 may be one or more devices integrating at least one communication processing module. The wireless communication module 180 receives electromagnetic waves via the antenna 2, demodulates and filters electromagnetic wave signals, and transmits the processed signals to the processor 130. The wireless communication module 180 may also receive a signal to be transmitted from the processor 130, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the touch device 100 is coupled to the mobile communication module 170 and antenna 2 is coupled to the wireless communication module 180, such that the touch device 100 can communicate with a network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
In this embodiment, the touch device 100 may further establish a communication connection with an input device, such as the active pen 200, through the wireless communication module 180 or the mobile communication module 170, for data transmission. For example, a communication connection may be established with the active pen 200 through a near field communication network such as a WI-FI hotspot network, a WI-FI peer-to-peer (P2P) network, a Bluetooth network, a ZigBee network, or a Near Field Communication (NFC) network, so as to receive a downlink signal transmitted by the active pen 200.
Optionally, in other embodiments of the present application, the touch device 100 further includes a device shown in fig. 12, which is specifically as follows:
the gyro sensor may be used to determine the motion pose of the touch device 100. In some embodiments, the angular velocity of touch device 100 about three axes (i.e., x, y, and z axes) may be determined by a gyroscope sensor. The gyro sensor may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyroscope sensor detects the shake angle of the touch device 100, calculates the distance to be compensated for the lens module according to the shake angle, and allows the lens to counteract the shake of the touch device 100 through reverse movement, thereby achieving anti-shake. The gyroscope sensor can also be used for navigation and motion sensing game scenes.
The air pressure sensor is used for measuring air pressure. In some embodiments, the touch device 100 calculates an altitude from a barometric pressure value measured by a barometric pressure sensor to assist positioning and navigation.
The magnetic sensor includes a hall sensor. The touch device 100 may detect the opening and closing of a flip holster using the magnetic sensor. In some embodiments, when the touch device 100 is a flip device, the touch device 100 can detect the opening and closing of the flip cover according to the magnetic sensor, and then set characteristics such as automatic unlocking of the flip cover according to the detected opening and closing state of the holster or the flip cover.
The acceleration sensor may detect the magnitude of acceleration of the touch device 100 in various directions (generally, three axes). The magnitude and direction of gravity can be detected when the touch device 100 is stationary. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor for measuring a distance. The touch device 100 may measure the distance by infrared or laser. In some embodiments, shooting a scene, the touch device 100 may utilize a range sensor to measure distance to achieve fast focus.
The proximity light sensor may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The touch device 100 emits infrared light to the outside through the light emitting diode. The touch device 100 detects infrared reflected light from a nearby object using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the touch device 100. When insufficient reflected light is detected, the touch device 100 can determine that there is no object near the touch device 100. The touch device 100 can detect, by using the proximity light sensor, that the user holds the touch device 100 close to the ear for talking, so as to automatically turn off the screen to save power. The proximity light sensor can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor is used for sensing the ambient light brightness. The touch device 100 can adaptively adjust the brightness of the display screen 120 according to the perceived ambient light brightness. The ambient light sensor can also be used to automatically adjust the white balance when taking a picture. The ambient light sensor may also cooperate with the proximity light sensor to detect whether the touch device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor is used for collecting fingerprints. The touch device 100 may utilize the collected fingerprint characteristics to unlock a fingerprint, access an application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and the like.
The temperature sensor is used for detecting temperature. In some embodiments, the touch device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor. For example, when the temperature reported by the temperature sensor exceeds the threshold, the touch device 100 performs a reduction in performance of a processor located near the temperature sensor, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the touch device 100 heats the battery 162 to avoid abnormal shutdown of the touch device 100 due to low temperature. In some other embodiments, when the temperature is lower than a further threshold, the touch device 100 boosts the output voltage of the battery 162 to avoid abnormal shutdown caused by low temperature.
The bone conduction sensor may acquire a vibration signal. In some embodiments, the bone conduction sensor may acquire a vibration signal of a human voice vibrating a bone mass. The bone conduction sensor can also contact the pulse of the human body to receive the blood pressure jumping signal. In some embodiments, the bone conduction sensor may also be disposed in a headset, integrated into a bone conduction headset. The audio module can analyze a voice signal based on the vibration signal of the sound part vibration bone block acquired by the bone conduction sensor, so as to realize a voice function. The application processor can analyze heart rate information based on the blood pressure beating signals acquired by the bone conduction sensor, and a heart rate detection function is realized.
It will be apparent to one skilled in the art that some of the specific details presented above with respect to touch device 100 may not be needed to practice particular described implementations or their equivalents. Similarly, other touch devices may include a greater number of subsystems, modules, components, and the like. Some sub-modules may be implemented as software or hardware, where appropriate. Accordingly, it should be understood that the above description is not intended to be exhaustive or to limit the disclosure to the precise form disclosed herein. On the contrary, many modifications and variations are possible in light of the above teaching, as would be apparent to those of ordinary skill in the art.
Optionally, the touch device 100 may implement an audio function through an audio module, a speaker, a receiver, a microphone, an earphone interface, an application processor, and the like. Such as music playing, recording, etc.
The audio module is used for converting digital audio information into analog audio signals to be output and converting the analog audio input into digital audio signals. The audio module may also be used to encode and decode audio signals. In some embodiments, the audio module may be disposed in the processor 130, or a portion of the functional modules of the audio module may be disposed in the processor 130.
Loudspeakers, also known as "horns," are used to convert electrical audio signals into sound signals. The touch device 100 can listen to music through a speaker or listen to a hands-free call.
Receivers, also called "earpieces", are used to convert electrical audio signals into acoustic signals. When the touch device 100 receives a call or voice information, the voice can be received by placing the receiver close to the ear.
Microphones, also known as "microphones", are used to convert sound signals into electrical signals. When making a call or sending voice information, a user can input a voice signal into the microphone by making a sound by approaching the microphone through the mouth of the user. The touch device 100 may be provided with at least one microphone. In other embodiments, the touch device 100 may be provided with two microphones to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the touch device 100 may further include three, four or more microphones for collecting sound signals, reducing noise, identifying sound sources, and implementing directional recording functions.
The earphone interface is used for connecting a wired earphone. The earphone interface may be a USB interface 150, or may be a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The touch device 100 may implement a shooting function through an ISP, a camera, a video codec, a GPU, a display screen 120, an application processor, and the like.
And the ISP is used for processing data fed back by the camera. For example, when a user takes a picture, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and converting into an image visible to the naked eye. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in a camera.
Optionally, the touch device 100 can also be used to capture still images or video through a camera. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the touch device 100 may include 1 or N cameras, where N is a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the touch device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. Touch device 100 may support one or more video codecs. In this way, the touch device 100 can play or record videos in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor, which processes input information quickly by referring to a biological neural network structure, for example, by referring to a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the touch device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The software system of the touch device 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes the layered-architecture Android system as an example to illustrate the software structure of the touch device 100. The touch device may run the Android system or another operating system.
Fig. 13 is a block diagram of a software structure of the touch device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 13, the application package may include writing/drawing type applications, gallery, talk, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 13, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
Content providers are used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide the communication function of the touch device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a brief dwell, and does not require user interaction. Such as a notification manager used to inform download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is given, the touch device vibrates, the indicator light flickers, and the like.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
The core library comprises two parts: one part is the functions that need to be called by the java language, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The following describes exemplary workflow of software and hardware of the touch device 100 in conjunction with a writing/drawing scenario.
When the touch sensor 190B receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into an original input event (including touch coordinates, a time stamp of the touch operation, and other information). The raw input events are stored at the kernel layer. And the application program framework layer acquires the original input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation as a touch click operation, and taking the control corresponding to the click operation as the control of the writing/drawing application icon as an example, the writing/drawing application calls the interface of the application framework layer, starts the writing/drawing application, further starts the driving of the touch screen 110 by calling the kernel layer, and captures the movement track input by the user for writing/drawing through the touch screen 110.
Referring to fig. 14, fig. 14 is a schematic diagram of a hardware structure of an active pen 200 according to an embodiment of the present disclosure.
As shown in fig. 14, the active pen 200 may include a processor 210 and one or more sensors 220, and may further include: keys 230, an indicator light 240, one or more electrodes 250, a driver circuit 260, a power supply 270, a wireless communication module 280, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the active pen 200. In other embodiments of the present application, the active pen 200 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 210 may include memory circuitry and processing circuitry for supporting the operation of the active pen 200. The storage and processing circuitry may include storage devices such as non-volatile memory (e.g., flash memory or other electrically programmable read-only memory configured as a solid state drive), volatile memory (e.g., static or dynamic random access memory), and so forth. Processing circuitry in the processor 210 may be used to control the operation of the active pen 200. The processing circuitry may be based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio chips, application specific integrated circuits, and the like.
Sensor 220 may include a pressure sensor 221. The pressure sensor 221 may be disposed at a tip 201 (shown in FIG. 2) of the active pen 200. Of course, the pressure sensor 221 may also be disposed in the body 203 of the active pen 200, such that when a force is applied to one end of the nib 201 of the active pen 200, the other end of the nib 201 moves to apply a force to the pressure sensor 221. In one embodiment, the processor 210 may adjust the line thickness of the active pen 200 when the tip 201 writes according to the pressure detected by the pressure sensor 221.
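As a toy illustration of adjusting line thickness according to the pressure detected by the pressure sensor 221, the sketch below uses an assumed linear mapping and assumed width constants:

```python
MIN_WIDTH_PX = 1.0    # assumed thinnest stroke width
MAX_WIDTH_PX = 8.0    # assumed thickest stroke width

def line_width(pressure: float) -> float:
    """Linearly map a normalized tip pressure in [0, 1] to a stroke width."""
    pressure = min(max(pressure, 0.0), 1.0)
    return MIN_WIDTH_PX + pressure * (MAX_WIDTH_PX - MIN_WIDTH_PX)
```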
The sensors 220 may also include inertial sensors 222. The inertial sensors 222 may include a three-axis accelerometer and a three-axis gyroscope, and/or other components for measuring motion of the active pen 200, e.g., a three-axis magnetometer may be included in the sensors in a nine-axis inertial sensor configuration. The sensors 220 may also include other sensors, such as temperature sensors, ambient light sensors, light-based proximity sensors, contact sensors, magnetic sensors, pressure sensors, and/or other sensors.
The key 230 includes a power-on key and the like. The keys 230 may be mechanical keys. Or may be touch keys. The touch device 100 may receive a key input, and generate a key signal input related to user setting and function control of the touch device 100.
Indicator light 240 may be used to indicate a state of charge, a change in charge, or may be used to indicate a message, notification, etc.
The electrode 250 may be located within the tip 201 of the active pen 200 (see fig. 2 above), and the driver circuit 260 may be located within the body 203 of the active pen 200 (see fig. 2 above). The driver circuit 260 may be used to connect the electrode 250 to the power supply 270. The driver circuit may receive the electrical signal provided by the battery, amplify it, and drive the electrode 250 to transmit a downlink signal to the touch device 100.
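The downlink signal is an excitation waveform applied to the electrode; claim 10 below mentions square, sine, and triangular waves as possible waveform types. The numpy sketch below merely generates such waveforms for illustration; the frequency, amplitude, duration, and sample-rate values are assumptions, not parameters disclosed in this application.

```python
import numpy as np

def downlink_waveform(kind="square", freq_hz=100e3, amplitude=1.0,
                      duration_s=1e-4, sample_rate=10e6):
    """Generate one burst of an illustrative downlink excitation signal."""
    t = np.arange(0.0, duration_s, 1.0 / sample_rate)
    phase = 2.0 * np.pi * freq_hz * t
    if kind == "square":
        return amplitude * np.sign(np.sin(phase))
    if kind == "sine":
        return amplitude * np.sin(phase)
    if kind == "triangular":
        # Triangle wave obtained from the arcsine of a sine wave
        return amplitude * (2.0 / np.pi) * np.arcsin(np.sin(phase))
    raise ValueError(f"unknown waveform type: {kind}")

burst = downlink_waveform("square")
```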
The power supply 270 may be a nickel-cadmium battery, a nickel-metal-hydride battery, or a lithium-ion battery. In the embodiment of the present application, the power supply 270 may also be formed by a plurality of batteries connected in series and/or in parallel, and the voltage output by the batteries is not limited. In one possible implementation, the power supply 270 may be a power source external to the active pen 200. The embodiment of the present application does not limit the type of the power supply 270 or its output voltage value.
The wireless communication module 280 may support wireless communication between the active pen 200 and the touch device 100. The wireless communication module 280 may be a Bluetooth module, a Wi-Fi hotspot module, a Wi-Fi point-to-point module, an NFC module, a ZigBee module, or the like. The Bluetooth module may include a radio frequency transceiver and may also include one or more antennas. Depending on the type of wireless module, the transceiver may use an antenna to transmit and/or receive wireless signals, which may be Bluetooth signals, wireless local area network signals, long-range signals such as cellular telephone signals, near field communication signals, or other wireless signals.
It will be appreciated that the active pen 200 may include a microphone, a speaker, an audio generator, a vibrator, a camera, a data port, and other devices, as desired for the application. A user can control the operation of touch device 100 by providing commands with these devices, and receive status information and other outputs.
The foregoing describes the touch display method provided in the embodiments of the present application, as well as the software and hardware structures of the touch device 100 and the active pen 200 involved in the method. The touch display method, the graphical interface, and the related device provided by the present application therefore have the following beneficial effects: touch coordinate compensation can be performed on a touch device whose touch coordinate calculation is inaccurate, so that the influence of touch coordinate calculation errors is greatly reduced or eliminated; when a user writes or draws on the touch device, the touch device can display straighter and smoother handwriting that meets the user's expectations, thereby improving the user experience.
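As a rough illustration of the compensation idea summarized above (and recited more precisely in claims 1 and 2 below), the sketch stores a per-axis compensation function as a fitted polynomial and adds the compensation value to each measured coordinate before display. The polynomial form, class name, and coefficient values are assumptions for illustration only, not the specific implementation of this application.

```python
import numpy as np

class CoordinateCompensator:
    """Applies per-axis compensation polynomials to measured touch coordinates."""

    def __init__(self, x_coeffs, y_coeffs):
        # Coefficients of the horizontal-axis and vertical-axis compensation
        # functions (highest degree first, as returned by numpy.polyfit).
        self.x_poly = np.poly1d(x_coeffs)
        self.y_poly = np.poly1d(y_coeffs)

    def compensate(self, measured_points):
        """Superpose the compensation value onto each measured coordinate."""
        compensated = []
        for x, y in measured_points:
            compensated.append((x + self.x_poly(x), y + self.y_poly(y)))
        return compensated

# Example: a previously fitted compensation function applied to a short stroke
comp = CoordinateCompensator(x_coeffs=[1e-6, -0.002, 0.5],
                             y_coeffs=[0.0, 0.0, 0.0])
stroke = [(100.0, 200.0), (110.0, 200.5), (120.0, 201.0)]
print(comp.compensate(stroke))  # compensated points to be connected and displayed
```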
The embodiments of the present application can be combined arbitrarily to achieve different technical effects.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be implemented wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in accordance with the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, or digital subscriber line) or wirelessly (e.g., infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk), among others.
A person of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.
In summary, the above description is merely an embodiment of the technical solution of the present application and is not intended to limit the protection scope of the present application. Any modification, equivalent replacement, improvement, and the like made in accordance with the disclosure of the present application shall fall within the protection scope of the present application.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and these modifications or substitutions do not depart from the scope of the technical solutions of the embodiments of the present application.

Claims (14)

1. A touch display method is applied to electronic equipment and is characterized by comprising the following steps:
a sensor in a touch screen of the electronic equipment acquires a first downlink signal sent by input equipment;
the electronic equipment determines a plurality of continuous first measured coordinates on the touch screen according to the position of the sensor collecting the first downlink signal;
the electronic equipment acquires a first compensation value corresponding to the first measured coordinate in a compensation function, and the first measured coordinate is superposed with the first compensation value to acquire a first compensation coordinate; the compensation function is used for representing the corresponding relation between the measured coordinates and the compensation value, and the compensation value indicates the difference between the measured coordinates and the corresponding theoretical coordinates;
and the electronic equipment displays a plurality of continuous moving tracks formed by connecting the first compensation coordinates.
2. The method of claim 1, wherein the compensation function comprises a horizontal-axis compensation function and a vertical-axis compensation function, and wherein the compensation values comprise a horizontal-axis compensation value and a vertical-axis compensation value;
the horizontal axis compensation function is used for representing the corresponding relation between the horizontal coordinate of the measured coordinate and the horizontal axis compensation value, and the horizontal axis compensation value indicates the difference value between the horizontal coordinate of the measured coordinate and the horizontal coordinate of the theoretical coordinate;
the longitudinal axis compensation function is used for representing the corresponding relation between the longitudinal coordinate of the measured coordinate and the longitudinal axis compensation value, and the longitudinal axis compensation value indicates the difference value between the longitudinal coordinate of the measured coordinate and the longitudinal coordinate of the theoretical coordinate.
3. The method of claim 2, wherein the sensors are arranged in a first direction and a second direction, the first direction being perpendicular to the second direction; the cross-axis compensation function is obtained by the second device by:
the input device moves in the touch screen at a first speed in a direction parallel to the first direction, and the input device sends a downlink signal in the moving process; the first speed is lower than a preset speed and is a constant value;
determining a plurality of continuous measured coordinates according to the position of the sensor which acquires the downlink signal;
determining a plurality of continuous theoretical coordinates according to the initial actual measurement coordinate, the final actual measurement coordinate and the first speed in the plurality of continuous actual measurement coordinates;
and acquiring the transverse axis compensation function according to the plurality of continuous measured coordinates and the plurality of continuous theoretical coordinates.
4. The method according to claim 3, wherein the second device obtains the cross-axis compensation function according to the plurality of consecutive measured coordinates and the plurality of consecutive theoretical coordinates by:
the electronic equipment performs data fitting on the continuous measured coordinates to obtain a cross-axis fitting function;
the electronic device obtaining a plurality of differences between the cross-axis fitting function and the plurality of consecutive theoretical coordinates;
the electronic device performs data fitting on the plurality of difference values to obtain the cross-axis compensation function.
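A minimal sketch of the calibration procedure recited in claims 3 and 4, assuming evenly timed samples and a polynomial fit via numpy.polyfit; the helper name, polynomial degree, and sampling details are illustrative assumptions, not limitations of the claims.

```python
import numpy as np

def fit_cross_axis_compensation(measured_x, sample_dt, first_speed, degree=3):
    """Derive a cross-axis compensation function from a constant-speed stroke.

    measured_x  -- abscissas of the consecutive measured coordinates
    sample_dt   -- time between consecutive samples (assumed constant)
    first_speed -- constant speed of the input device along the first direction
    degree      -- polynomial degree used for the fits (illustrative choice)
    """
    measured_x = np.asarray(measured_x, dtype=float)
    t = np.arange(len(measured_x)) * sample_dt

    # Theoretical abscissas: uniform motion from the initial measured coordinate
    # toward the final measured coordinate at the known constant speed.
    direction = np.sign(measured_x[-1] - measured_x[0])
    theoretical_x = measured_x[0] + direction * first_speed * t

    # Cross-axis fitting function: smooth the measured coordinates over time.
    fit_coeffs = np.polyfit(t, measured_x, degree)
    fitted_x = np.polyval(fit_coeffs, t)

    # Differences between the theoretical coordinates and the fitted curve,
    # fitted against the measured abscissa to obtain the cross-axis
    # compensation function (compensation value as a function of measured x).
    differences = theoretical_x - fitted_x
    comp_coeffs = np.polyfit(measured_x, differences, degree)
    return comp_coeffs  # usable with numpy.poly1d as the compensation function
```

The returned coefficients could then be used as the x_coeffs of the compensation sketch given earlier, so that superposing the compensation value onto a measured coordinate moves it toward its theoretical position.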
5. The method of claim 3 or 4, wherein the transverse axis compensation function and the longitudinal axis compensation function are the same or different; when the arrangement rules of the sensors in the first direction and the second direction are consistent, and the hardware structures of the sensors are the same, the horizontal axis compensation function and the vertical axis compensation function are the same; otherwise, the horizontal axis compensation function and the vertical axis compensation function are different.
6. The method of any of claims 3-5, wherein the second device and the electronic device are the same device or different devices.
7. The method according to any one of claims 1-5, wherein before the electronic device displays a plurality of consecutive movement tracks connected by the first compensation coordinate, the method further comprises:
the electronic equipment runs and displays an interface provided by a drawing or writing application program.
8. The method according to any of claims 1-7, wherein the input device includes, but is not limited to, an active capacitive stylus.
9. The method according to claim 8, wherein when the input device is the active capacitive stylus, the downlink signal is specifically an excitation signal sent by a driving unit in the active capacitive stylus to the pen tip.
10. The method according to claim 8 or 9, wherein the types of the downlink signal include, but are not limited to, three types: a square wave, a sine wave, and a triangular wave.
11. The method according to any one of claims 8-10, wherein when the input device is the active capacitive stylus, the downlink signal carries pressure information of the active capacitive stylus; the pressure information is related to the display effect of the movement track.
12. A chip for application to an electronic device, the chip comprising one or more processors for invoking computer instructions to cause the electronic device to perform the method of any of claims 1-11.
13. A computer-readable storage medium comprising instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-11.
14. An electronic device, comprising one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors for storing computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of any of claims 1-11.
CN202210193026.7A 2022-02-28 2022-02-28 Touch display method, graphical interface and related device Active CN115562514B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210193026.7A CN115562514B (en) 2022-02-28 2022-02-28 Touch display method, graphical interface and related device


Publications (2)

Publication Number Publication Date
CN115562514A true CN115562514A (en) 2023-01-03
CN115562514B CN115562514B (en) 2023-11-24

Family

ID=84736629

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210193026.7A Active CN115562514B (en) 2022-02-28 2022-02-28 Touch display method, graphical interface and related device

Country Status (1)

Country Link
CN (1) CN115562514B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117289808A (en) * 2023-09-21 2023-12-26 深圳市瀚天鑫科技有限公司 Capacitive miniature cursor positioning method and system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103970323A (en) * 2013-01-30 2014-08-06 北京汇冠新技术股份有限公司 Method and system for tracking of trajectory of touch screen
CN104765543A (en) * 2015-04-17 2015-07-08 努比亚技术有限公司 Audio playing parameter adjustment method and device
CN105975122A (en) * 2016-04-27 2016-09-28 集怡嘉数码科技(深圳)有限公司 Touch track compensation method and apparatus as well as terminal device
CN106201091A (en) * 2016-07-13 2016-12-07 北京集创北方科技股份有限公司 The coordinate processing method of touch screen and device
CN107272921A (en) * 2016-01-28 2017-10-20 乐金显示有限公司 Active touch control pen includes its touch-sensing system and touch-sensing method
CN109445636A (en) * 2018-10-31 2019-03-08 上海海栎创微电子有限公司 A kind of self-capacitance touch screen edge touch coordinate compensation method



Also Published As

Publication number Publication date
CN115562514B (en) 2023-11-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant