CN115617192B - Touch positioning method and electronic equipment - Google Patents

Touch positioning method and electronic equipment

Info

Publication number
CN115617192B
Authority
CN
China
Prior art keywords
touch
ultrasonic
contact
coordinates
touch screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210700269.5A
Other languages
Chinese (zh)
Other versions
CN115617192A (en)
Inventor
韩帅
樊亮
冀焕霞
赵元凤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202210700269.5A priority Critical patent/CN115617192B/en
Publication of CN115617192A publication Critical patent/CN115617192A/en
Application granted granted Critical
Publication of CN115617192B publication Critical patent/CN115617192B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/04162Control or interface arrangements specially adapted for digitisers for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0386Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry for light pen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/0418Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G06F3/0441Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using active external devices, e.g. active pens, for receiving changes in electrical potential transmitted by the digitiser, e.g. tablet driving signals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/54Extraction of image or video features relating to texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/13Sensors therefor
    • G06V40/1306Sensors therefor non-optical, e.g. ultrasonic or capacitive sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of this application disclose a touch positioning method and an electronic device, relate to the field of touch screens, and can improve the positioning accuracy of a user's touch position. The method comprises the following steps: determining the coordinates of a contact point in response to a contact operation of a contact object on the touch screen; acquiring first texture information of the contact object, wherein the first texture information indicates the texture of the contact surface between the contact object and the touch screen; and judging whether the first texture information matches preset texture features, wherein the preset texture features comprise at least one of the following: texture features of a fingerprint, or texture features of a stylus. If so, the coordinates of the contact point are determined to be valid touch coordinates; if not, they are determined to be invalid touch coordinates.

Description

Touch positioning method and electronic equipment
Technical Field
The embodiment of the application relates to the field of touch screens, in particular to a touch positioning method and electronic equipment.
Background
The touch screen is the window through which an electronic device interacts with the outside world: it receives external input and displays the device's output. A touch screen may comprise a touch panel and a chip, and the chip can determine the coordinates of the position touched by the user from the parameters of each contact point in the touch panel. Taking a capacitive screen as an example, each contact point in its touch panel is a capacitor. When a user touches the capacitive screen, the screen's chip can determine the coordinates of the touch position from the distribution of capacitance values in the touch panel.
In practice, however, external interference, device quality, and other factors can produce, in touch areas the user has not touched, a parameter distribution similar to that of a real touch, so the touch position determined by the chip is not accurate enough.
Disclosure of Invention
The embodiment of the application provides a touch positioning method and electronic equipment, which can improve the positioning accuracy of a touch position of a user.
In order to achieve the above purpose, the following technical scheme is adopted in the embodiment of the application.
In a first aspect, a touch positioning method is provided, applied to an electronic device that comprises a touch screen. The method comprises the following steps: determining the coordinates of a contact point in response to a contact operation of a contact object on the touch screen; acquiring first texture information of the contact object, wherein the first texture information indicates the texture of the contact surface between the contact object and the touch screen; and judging whether the first texture information matches preset texture features, wherein the preset texture features comprise at least one of the following: texture features of a fingerprint, or texture features of a stylus. If so, the coordinates of the contact point are determined to be valid touch coordinates; if not, they are determined to be invalid touch coordinates.
Based on this scheme, by judging whether the texture of the contact surface between the contact object and the touch screen is that of a fingerprint or a stylus, touch interference from objects that are neither a finger nor a stylus is eliminated, improving the positioning accuracy of the user's touch position.
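The validation flow described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: `matches_fingerprint` and `matches_stylus` are hypothetical stand-ins for whatever texture matcher the device actually uses.

```python
def matches_fingerprint(texture):
    # Placeholder: a real matcher would compare ridge/valley patterns
    # against the preset fingerprint texture features.
    return texture.get("kind") == "fingerprint"

def matches_stylus(texture):
    # Placeholder: a real matcher would recognize the stylus tip texture.
    return texture.get("kind") == "stylus"

def classify_touch(coords, texture):
    """Tag the contact-point coordinates as valid or invalid touch coordinates."""
    if matches_fingerprint(texture) or matches_stylus(texture):
        return {"coords": coords, "valid": True}
    return {"coords": coords, "valid": False}

# A water droplet produces a contact whose surface texture is neither a
# fingerprint nor a stylus tip, so its coordinates are marked invalid:
print(classify_touch((120, 340), {"kind": "droplet"}))
# → {'coords': (120, 340), 'valid': False}
```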
In one possible design, the touch screen is a capacitive screen. Determining coordinates of a contact point in response to a contact operation of a contact object with respect to a touch screen, including: and acquiring capacitance value distribution of the capacitive screen. And determining the contact area of the contact object and the capacitive screen according to the capacitance value distribution of the capacitive screen. Coordinates of the contact point are calculated from the capacitance distribution in the contact area and the resolution of the touch screen. Based on this scheme, the accuracy of calculating the coordinates of the contact point can be improved.
In one possible design, determining the contact area of the contact object and the capacitive screen according to the capacitance value distribution of the capacitive screen includes: and determining an area formed by the capacitors with the capacitance values larger than a preset threshold value in the capacitive screen as a contact area according to the capacitance value distribution of the capacitive screen. Based on this scheme, the accuracy of the determined contact area can be improved.
In one possible design, determining the area formed by capacitors whose capacitance values are greater than the preset threshold as the contact area comprises: determining a first capacitor according to the capacitance value distribution of the capacitive screen, the first capacitor being the capacitor with the largest capacitance value in the screen; taking the first capacitor as the initial stack element, and repeatedly pushing onto the stack any capacitor adjacent to a capacitor in the stack whose capacitance value is greater than the preset threshold, until no such adjacent capacitor remains; and determining the area formed by the capacitors in the stack as the contact area. Based on this scheme, the accuracy of the determined contact area can be improved.
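The stack-based region growing described above can be sketched as follows. This is a hedged illustration under assumptions the patent does not fix: the capacitance grid, 4-neighbor adjacency, and the threshold value are all invented for the example.

```python
def contact_area(grid, threshold):
    """Grow the contact region from the cell with the largest capacitance."""
    rows, cols = len(grid), len(grid[0])
    # Seed: the "first capacitor", i.e. the cell with the maximum value.
    seed = max(((r, c) for r in range(rows) for c in range(cols)),
               key=lambda rc: grid[rc[0]][rc[1]])
    if grid[seed[0]][seed[1]] <= threshold:
        return set()  # no cell exceeds the threshold: no contact at all
    stack, region = [seed], {seed}
    while stack:
        r, c = stack.pop()
        # Absorb adjacent cells whose capacitance exceeds the threshold.
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and grid[nr][nc] > threshold):
                region.add((nr, nc))
                stack.append((nr, nc))
    return region

grid = [
    [100,  100,  100, 100],   # illustrative capacitance values in pF
    [100, 1200, 1500, 100],
    [100, 1100, 1300, 100],
    [100,  100,  100, 100],
]
print(sorted(contact_area(grid, 1000)))
# → [(1, 1), (1, 2), (2, 1), (2, 2)]
```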
In one possible design, calculating the coordinates of the contact point from the capacitance value distribution in the contact area and the resolution of the touch screen comprises: taking the capacitance value of each capacitor in the contact area as a weight, calculating the weighted average of the abscissas and the weighted average of the ordinates of those capacitors; then taking the product of the weighted-average abscissa and the resolution as the abscissa of the contact point, and the product of the weighted-average ordinate and the resolution as the ordinate of the contact point. Based on this scheme, the coordinates of the contact point are determined by the centroid principle, which can improve their accuracy.
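The centroid computation can be sketched as below. Assumptions for illustration only: column indices serve as abscissas, row indices as ordinates, and "resolution" is treated as a single pixels-per-cell scale factor applied to both axes.

```python
def centroid_coords(region_values, resolution):
    """Capacitance-weighted average of cell indices, scaled by resolution.

    region_values maps (row, col) -> capacitance value (e.g. in pF).
    """
    total = sum(region_values.values())
    x = sum(c * v for (r, c), v in region_values.items()) / total  # abscissa
    y = sum(r * v for (r, c), v in region_values.items()) / total  # ordinate
    return x * resolution, y * resolution

# The four above-threshold cells of the previous example, with 40 px per cell:
region = {(1, 1): 1200, (1, 2): 1500, (2, 1): 1100, (2, 2): 1300}
x, y = centroid_coords(region, 40)
print(round(x, 1), round(y, 1))
# → 62.0 58.8  (pulled toward the higher-capacitance cells)
```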
In one possible design, the electronic device further comprises a plurality of ultrasonic sensors disposed on the inner-screen side of the touch screen. Determining the coordinates of the contact point in response to a contact operation of a contact object on the touch screen comprises: transmitting an ultrasonic wave in a first direction through each ultrasonic sensor, the first direction being perpendicular to the touch screen and pointing toward its outer-screen side; in response to each ultrasonic sensor receiving the reflected wave, calculating that sensor's ultrasonic return time, i.e. the difference between the time the reflected wave is received and the time the ultrasonic wave was transmitted; and determining the coordinates of the contact point based on the ultrasonic return time of each sensor. Based on this scheme, the coordinates of the contact point can be determined by the ultrasonic sensors without requiring the touch screen to function as a capacitive screen, at low cost.
In one possible design, determining the coordinates of the contact point based on the ultrasonic return time of each ultrasonic sensor comprises: taking the touch-screen area corresponding to the ultrasonic sensors whose return times fall within a first preset interval as the contact area between the contact object and the touch screen; and taking, as the coordinates of the contact point, the coordinates corresponding to those sensors in the contact area whose return-time differences with adjacent sensors fall within a second preset interval. Based on this scheme, the coordinates of the contact point can be determined by the ultrasonic sensors without requiring the touch screen to function as a capacitive screen, at low cost.
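The ultrasonic variant above can be sketched as follows. This is a hedged illustration: sensor positions are modeled as grid cells, return times are in arbitrary units, and both "preset intervals" are invented numbers — the patent leaves their actual values to the implementation.

```python
def locate_contact(return_times, contact_interval, diff_interval):
    """return_times maps sensor (row, col) -> ultrasonic return time.

    Sensors whose return time falls in contact_interval form the contact
    area; within it, sensors whose return-time difference to an adjacent
    sensor falls in diff_interval are taken as contact-point coordinates.
    """
    lo, hi = contact_interval
    contact = {rc for rc, t in return_times.items() if lo <= t <= hi}
    dlo, dhi = diff_interval
    points = []
    for (r, c) in contact:
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if nb in contact:
                d = abs(return_times[(r, c)] - return_times[nb])
                if dlo <= d <= dhi:
                    points.append((r, c))
                    break
    return contact, points

# Echoes from the touched row return sooner (shorter round trip):
return_times = {
    (0, 0): 9.0, (0, 1): 9.0,   # untouched: long return time
    (1, 0): 5.0, (1, 1): 5.2,   # touched: short return time
}
contact, points = locate_contact(return_times, (4.0, 6.0), (0.0, 0.3))
print(sorted(contact), sorted(points))
# → [(1, 0), (1, 1)] [(1, 0), (1, 1)]
```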
In one possible design, obtaining first texture information for a contact includes: the first texture information is determined based on the ultrasonic return time of each ultrasonic sensor in the contact region. Based on the scheme, the texture information of the contact surface can be determined through the ultrasonic sensor, and the required cost is low.
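One way this could work, sketched under assumptions not spelled out in the text: fingerprint ridges sit closer to the screen than valleys, so their ultrasonic echoes return slightly sooner, and thresholding the return times within the contact area yields a binary ridge/valley map usable as the first texture information. The cutoff value is illustrative.

```python
def texture_map(return_times, ridge_cutoff):
    """Classify each contact-area cell as ridge or valley by return time.

    return_times maps sensor (row, col) -> return time; cells at or below
    ridge_cutoff are assumed to be ridges (closer to the screen).
    """
    return {rc: ("ridge" if t <= ridge_cutoff else "valley")
            for rc, t in return_times.items()}

times = {(0, 0): 5.00, (0, 1): 5.08, (1, 0): 5.07, (1, 1): 5.01}
print(texture_map(times, 5.05))
# → {(0, 0): 'ridge', (0, 1): 'valley', (1, 0): 'valley', (1, 1): 'ridge'}
```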
In a second aspect, an electronic device is provided, comprising: a touch screen, a touch module, an identification module, and a chip. The chip is connected to the touch module and to the identification module, and the chip, the touch module, and the identification module are disposed on the inner-screen side of the touch screen. The touch module is configured to determine the coordinates of a contact point in response to a contact operation of a contact object on the touch screen. The identification module is configured to acquire first texture information of the contact object, the first texture information indicating the texture of the contact surface between the contact object and the touch screen. The chip is configured to judge whether the first texture information matches preset texture features, the preset texture features comprising at least one of the following: texture features of a fingerprint, or texture features of a stylus. If so, the coordinates of the contact point are determined to be valid touch coordinates; if not, they are determined to be invalid touch coordinates.
In one possible design, the identification module is any one of the following: the device comprises an optical fingerprint identification module, an ultrasonic fingerprint identification module and a semiconductor fingerprint identification module.
In a third aspect, an electronic device is provided that includes one or more processors and one or more memories. One or more memories are coupled to the one or more processors, the one or more memories storing computer instructions. The computer instructions, when executed by one or more processors, cause the electronic device to perform the touch location method of any of the first aspects.
In a fourth aspect, a computer readable storage medium is provided, the computer readable storage medium comprising computer instructions which, when executed, perform the touch location method of any of the first aspects.
In a fifth aspect, a computer program product is provided, comprising instructions which, when run on a computer, enable the computer to perform the touch positioning method of any of the first aspects.
It should be appreciated that the technical features of the solutions provided in the second, third, fourth, and fifth aspects all correspond to the touch positioning method provided in the first aspect and its possible designs, so the achievable beneficial effects are similar and are not repeated here.
Drawings
FIG. 1 is a schematic diagram of a capacitive screen before and after being touched by a user;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 3 is a schematic diagram of an electronic device according to an embodiment of the present application;
fig. 4 is a schematic view of an application scenario of an electronic device according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a touch module according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the present application before and after reflection of ultrasonic waves;
FIG. 7 is a schematic diagram of an ultrasonic return time provided by an embodiment of the present application;
fig. 8 is a schematic diagram of a finger fingerprint touch screen according to an embodiment of the present application;
fig. 9 is a flowchart of a touch positioning method according to an embodiment of the present application;
fig. 10 is a schematic diagram of an electronic device according to an embodiment of the present application;
fig. 11 is a schematic diagram of a system on chip according to an embodiment of the present application.
Detailed Description
The terms "first," "second," "third," etc. in embodiments of this application are used to distinguish different objects, not to indicate a particular order. Furthermore, the words "exemplary" and "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "such as" should not be construed as preferred or advantageous over other embodiments or designs; rather, these words are intended to present related concepts in a concrete fashion.
In order to facilitate understanding of the embodiments of the present application, the following description will first be given of the background of the application.
The chip of the touch screen can position the touch position through the parameters of each contact point in the touch panel. Parameters may include capacitance, resistance, etc. A touch panel whose parameter is capacitance may be referred to as a capacitive screen, and a touch panel whose parameter is resistance may be referred to as a resistive screen.
Ambient temperature and humidity, unintentional false touches, and damage to the touch panel can all cause touch areas not touched by the user to exhibit a parameter distribution similar to that of a user touch, making the chip's positioning of the touch position inaccurate.
Taking a capacitive screen as an example, please refer to fig. 1, a schematic diagram of a capacitive screen before and after being touched by a user. As shown in fig. 1, a plurality of capacitors are disposed in the touch panel corresponding to the touch area of the capacitive screen. In the absence of a user touch or external interference, the capacitance value of each capacitor in the touch panel is smaller than a preset threshold, where the preset threshold may be, for example, 1000 pF. When a user touches a touch area of the screen, the coupling between the human body and the capacitors causes the capacitance values in the corresponding region of the touch panel to exceed the preset threshold. The chip of the capacitive screen takes the positions of the capacitors whose values exceed the preset threshold as the touch position.
When the touch panel of the capacitive screen is subject to external interference, the capacitance values of some capacitors may exceed the preset threshold. For example, a water droplet in a touch area may push the capacitance values of the corresponding capacitors in the touch panel above the preset threshold, so that the touch position determined by the chip is incorrect or inaccurate.
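The failure mode just described can be made concrete with a small sketch. This is a hedged illustration with invented values: a naive per-cell threshold cannot distinguish a finger from a water droplet, since both can push capacitance above the threshold — which is precisely why the texture check of this application is needed.

```python
def naive_touch_cells(grid, threshold=1000):
    """Return all cells whose capacitance exceeds the threshold (naive detection)."""
    return {(r, c)
            for r, row in enumerate(grid)
            for c, v in enumerate(row)
            if v > threshold}

finger  = [[100, 1200], [100, 1300]]  # illustrative values in pF
droplet = [[100, 1150], [100,  100]]

print(sorted(naive_touch_cells(finger)))   # real touch is detected...
print(sorted(naive_touch_cells(droplet)))  # ...but so is the droplet: a false touch
```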
In the embodiments of this application, coordinates of a touch position determined by the chip when no user touch has occurred are referred to as invalid touch coordinates, and coordinates determined when the user actually touches the screen are referred to as valid touch coordinates.
The chip of the capacitive screen may report the determined touch position to a processor of the electronic device, and the processor can then respond to the reported position. For example, if the processor determines that the reported touch position lies within an application's icon, it starts that application. However, when the coordinates of the touch position determined by the chip are invalid touch coordinates, the processor's response may not match the user's expectation, resulting in a poor touch experience.
In order to solve the problems, the embodiment of the application provides a touch positioning method and electronic equipment, which can improve the positioning accuracy of a chip of a touch screen to a touch position and improve the touch experience of a user.
In embodiments of this application, the electronic device may be any device having a touch screen, such as a mobile phone, a tablet computer, a wearable device (e.g., a smart watch), a vehicle-mounted device, a laptop, a desktop computer, and so on. Exemplary embodiments of electronic devices include, but are not limited to, devices running various operating systems. As an example, please refer to fig. 2, a schematic structural diagram of an electronic device 200 according to an embodiment of this application. The touch positioning method provided by the embodiments of this application can be applied to the electronic device 200 shown in fig. 2.
As shown in fig. 2, the electronic device 200 may include a processor 201, a communication module 202, a display screen 203, and the like.
The processor 201 may include one or more processing units, for example: the processor 201 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video stream codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors 201.
The controller may be a neural hub and command center of the electronic device 200. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 201 for storing instructions and data. In some embodiments, the memory in the processor 201 is a cache memory. The memory may hold instructions or data that the processor 201 has just used or recycled. If the processor 201 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 201 is reduced, thus improving the efficiency of the system.
In some embodiments, the processor 201 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface 211, among others.
The electronic device 200 implements display functions through a GPU, a display screen 203, and an application processor 201, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 203 and the application processor 201. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 201 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 203 is used to display images, video streams, and the like.
The communication module 202 may include antenna 1, antenna 2, a mobile communication module 202A, and/or a wireless communication module 202B. The following description takes as an example a communication module 202 that includes antenna 1, antenna 2, the mobile communication module 202A, and the wireless communication module 202B simultaneously.
The wireless communication function of the electronic device 200 can be realized by the antenna 1, the antenna 2, the mobile communication module 202A, the wireless communication module 202B, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 200 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 202A may provide a solution for wireless communication, including 2G/3G/4G/5G, applied on the electronic device 200. The mobile communication module 202A may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 202A may receive electromagnetic waves from the antenna 1, perform processing such as filtering and amplifying the received electromagnetic waves, and transmit the processed electromagnetic waves to a modem processor for demodulation. The mobile communication module 202A may amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate the electromagnetic waves. In some embodiments, at least some of the functional modules of the mobile communication module 202A may be provided in the processor 201. In some embodiments, at least some of the functional modules of the mobile communication module 202A may be provided in the same device as at least some of the modules of the processor 201.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to speaker 206A, receiver 206B, etc.), or displays images or video streams through display 203. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 202A or other functional module, independent of the processor 201.
The wireless communication module 202B may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., applied to the electronic device 200. The wireless communication module 202B may be one or more devices that integrate at least one communication processing module. The wireless communication module 202B receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 201. The wireless communication module 202B may also receive a signal to be transmitted from the processor 201, frequency modulate it, amplify it, and convert it into electromagnetic waves to radiate through the antenna 2.
In some embodiments, antenna 1 and mobile communication module 202A of electronic device 200 are coupled, and antenna 2 and wireless communication module 202B are coupled, such that electronic device 200 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
As shown in fig. 2, in some implementations, the electronic device 200 may further include an external memory interface 210, an internal memory 204, a universal serial bus (universal serial bus, USB) interface 211, a charge management module 212, a power management module 213, a battery 214, an audio module 206, a speaker 206A, a receiver 206B, a microphone 206C, an earphone interface 206D, a sensor module 205, keys 209, a motor, an indicator 208, a camera 207, and a subscriber identity module (subscriber identification module, SIM) card interface, etc.
The charge management module 212 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 212 may receive a charging input of a wired charger through the USB interface 211. In some wireless charging embodiments, the charge management module 212 may receive wireless charging input through a wireless charging coil of the electronic device 200. The charging management module 212 may also power the electronic device 200 through the power management module 213 while charging the battery 214.
The power management module 213 is used for connecting the battery 214, and the charge management module 212 and the processor 201. The power management module 213 receives input from the battery 214 and/or the charge management module 212 to power the processor 201, the internal memory 204, the external memory, the display 203, the camera 207, the wireless communication module 202B, and the like. The power management module 213 may also be configured to monitor the capacity of the battery 214, the number of cycles of the battery 214, and the state of health (leakage, impedance) of the battery 214. In other embodiments, the power management module 213 may also be disposed in the processor 201. In other embodiments, the power management module 213 and the charge management module 212 may be disposed in the same device.
The external memory interface 210 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 200. The external memory card communicates with the processor 201 via an external memory interface 210 to implement data storage functions. For example, files such as music, video streams, etc. are stored in an external memory card.
The internal memory 204 may be used to store computer executable program code including instructions. The processor 201 executes various functional applications of the electronic device 200 and data processing by executing instructions stored in the internal memory 204.
The internal memory 204 may also store one or more computer programs corresponding to the data transmission method provided in the embodiment of the present application.
The electronic device 200 may implement audio functions through an audio module 206, a speaker 206A, a receiver 206B, a microphone 206C, an earphone interface 206D, and an application processor 201, etc. Such as music playing, recording, etc.
Keys 209 include a power-on key, a volume key, and the like. The keys 209 may be mechanical keys or touch keys. The electronic device 200 may receive key inputs and generate key signal inputs related to the user settings and function control of the electronic device 200.
The indicator 208 may be an indicator light, which may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface is used for connecting a SIM card. The SIM card may be inserted into or removed from the SIM card interface to make contact with or separate from the electronic device 200. The electronic device 200 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface simultaneously; the types of the cards may be the same or different. The SIM card interface may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 200 interacts with the network through the SIM card to realize functions such as calls and data communication. In some embodiments, the electronic device 200 employs an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device 200 and cannot be separated from it.
The sensor module 205 in the electronic device 200 may include components such as touch sensors, pressure sensors, gyroscopic sensors, barometric pressure sensors, magnetic sensors, acceleration sensors, distance sensors, proximity sensors, ambient light sensors, fingerprint sensors, temperature sensors, bone conduction sensors, etc. to enable sensing and/or acquisition of different signals.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic apparatus 200. In other embodiments, the electronic device 200 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The electronic device may further include a touch sensor, a fingerprint recognition module, and a chip, each of which is described in detail below. The fingerprint recognition module may also be referred to as a recognition module.
Fig. 3 is a schematic diagram of an electronic device according to an embodiment of the application. As shown in fig. 3, the electronic device may include: a display screen 301, a touch module 302, a fingerprint identification module 303 and a chip 304.
The touch module 302 may be disposed in the display 301 (not shown in fig. 3) or on an inner screen side of the display 301, and connected to the chip 304. The fingerprint recognition module 303 may be disposed on the inner screen side of the display screen 301 and connected to the chip 304.
It should be noted that the touch module 302 may be a capacitive or resistive touch panel.
In some embodiments, the touch module 302 may correspond to a portion of the display 301, providing touch functionality for the portion of the display 301. In other embodiments, the touch module 302 may correspond to the complete display 301, providing full screen touch functionality.
For convenience of description, a portion of the display screen 301 corresponding to the touch module 302 will be referred to as a touch screen in the following embodiments.
In an embodiment of the present application, the touch module 302 may detect a touch operation of an object with respect to the touch screen. The touch module 302 is configured to generate first information in response to a touch operation of an object on the touch screen, and send the first information to the chip 304. The first information is used for indicating coordinates of a contact position of the touch screen and the object. The object may be a physical entity such as a finger, stylus, pencil, paper, etc.
In embodiments of the application, the object may also be referred to as a contact. The coordinates of the contact location may also be referred to as coordinates of the contact point.
The fingerprint recognition module 303 may be an optical fingerprint recognition module, an ultrasonic fingerprint recognition module, a semiconductor fingerprint recognition module, or the like.
The fingerprint identification module 303 is configured to obtain second information of the object, and send the second information to the chip 304. The second information is used to indicate texture features of the contact surface of the object with the touch screen. Wherein the second information may also be referred to as first texture information.
That is, the second information may indicate whether a contact surface of the object with the touch screen at the contact position includes fingerprint information, texture information of a tip of the stylus, or the like.
In some embodiments, the fingerprint recognition module 303 may correspond to a portion of the display 301, providing fingerprint recognition functionality for that portion of the display 301. In other embodiments, the fingerprint recognition module 303 may correspond to the complete display 301, providing full screen fingerprint recognition functionality.
The chip 304 is configured to take, as the effective touch coordinates, the coordinates of the contact position between the touch screen and the object in the first information when the second information indicates that the contact surface of the object and the touch screen at the contact position meets a preset condition. The effective touch coordinates refer to the coordinates of the contact point when a finger or a stylus touches the touch screen. In other words, the chip 304 may filter out contacts made by objects other than a finger or a stylus, retaining only the touch coordinates produced when the user touches the touch screen with a finger or a stylus.
The preset condition may be a preset texture feature, such as a texture feature of a fingerprint, a texture feature of a stylus tip, and the like.
For example, please refer to fig. 4, which is a schematic diagram of an application scenario of an electronic device according to an embodiment of the present application. As shown in fig. 4, area 1 of the touch screen has no object contact, area 2 has pencil contact, and area 3 has finger contact.
In the embodiment of the application, in response to the pencil and the finger touching the touch screen, the touch module sends the coordinates of area 2 and the coordinates of area 3 to the chip and the fingerprint identification module. After receiving these coordinates, the fingerprint identification module detects whether area 2 and area 3 include fingerprint information or texture information of a stylus tip, and sends the detection result to the chip. Since area 2 is contacted by a pencil and area 3 by a finger, the detection result is that area 2 includes neither fingerprint information nor texture information of a stylus tip, while area 3 includes fingerprint information. The chip therefore takes the coordinates of area 3 as the effective touch coordinates.
In this way, the user's touch position is located more accurately, which improves the user's touch experience.
In the embodiment of the application, after the chip determines the effective touch coordinates, it can report them to an upper-layer module such as the processor of the electronic device, so that the processor can respond accordingly to the effective touch coordinates reported by the chip. For the processing procedure after the chip determines the effective touch coordinates, reference may be made to the related art, which is not repeated here.
In the embodiment of the present application, the touch module may determine the coordinates of the contact position of the touch screen with the object through ultrasonic waves, and the detailed description is given below with reference to fig. 5.
Fig. 5 is a schematic diagram of a touch module according to an embodiment of the application. As shown in fig. 5, the touch module may include a driving circuit 501, a processing module 502, and a plurality of ultrasonic sensors 503. Referring to fig. 5, a plurality of ultrasonic sensors 503 are respectively connected to the driving circuit 501 and the processing module 502, and the processing module 502 is also respectively connected to the driving circuit 501 and the chip 304.
The driving circuit 501 is configured to periodically send an electrical pulse to the ultrasonic sensor 503 and the processing module 502.
The ultrasonic sensor 503 is configured to mechanically vibrate under the excitation of an electric pulse, thereby emitting ultrasonic waves in a first direction. The first direction is perpendicular to the touch screen and points to the outer screen side of the touch screen.
Fig. 6 is a schematic diagram of the ultrasonic wave reflection according to an embodiment of the application. As shown in fig. 6, after the ultrasonic wave is emitted, if an object such as a finger is encountered, the ultrasonic wave is reflected back to the ultrasonic sensor 503. In embodiments of the present application, the reflected ultrasonic waves may also be referred to as reflected waves.
The reflected wave causes the ultrasonic sensor 503 to vibrate. The ultrasonic sensor 503 is also configured to convert the vibration caused by the reflected wave into an electrical signal and transmit the electrical signal to the processing module 502.
The processing module 502 has a clock disposed therein. The processing module 502 is configured to determine a return time of the ultrasonic wave emitted by the corresponding ultrasonic sensor 503 according to a time when the electric pulse is received and a time when the electric signal is received in the same period.
For example, the time when the processing module 502 receives the electrical pulse in the same period is referred to as a first time, the time when the electrical signal is received is referred to as a second time, and the difference between the second time and the first time is the return time of the ultrasonic wave.
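As a hedged illustration of the return-time computation just described (the function and variable names are assumptions, not part of the embodiment), the subtraction of the first time from the second time can be sketched as:

```python
# Hypothetical sketch: the return time of the ultrasonic wave is the
# difference between the time the electrical signal (the converted
# reflected wave) is received and the time the electric pulse is
# received within the same period.
def return_time(first_time: float, second_time: float) -> float:
    """first_time: moment the electric pulse was received;
    second_time: moment the electrical signal was received."""
    if second_time < first_time:
        raise ValueError("the signal cannot arrive before the pulse")
    return second_time - first_time
```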
It will be appreciated that if the ultrasonic wave encounters no obstructing medium after transmission, it is not reflected back.
The processing module 502 is further configured to generate first information according to each return time, and send the first information to the chip 304.
For example, after receiving the return time sent by each ultrasonic sensor 503, the processing module 502 may first determine whether each return time is within a first preset interval.
The boundary values of the first preset interval may be empirical values. For example, a finger, a stylus, or another object is randomly selected to contact the touch screen, the return times of the ultrasonic sensors 503 corresponding to the contact area are collected, and the process is repeated multiple times to obtain a set of return times. As an alternative embodiment, the minimum value in the return time set may be taken as the minimum of the first preset interval, and the maximum value in the set as its maximum.
In this way, if the return time is within the first preset interval, it is indicated that the ultrasonic wave corresponding to the return time is reflected by the object contacting the touch screen, and the touch screen area corresponding to the ultrasonic sensor 503 corresponding to the return time is the contact area between the touch screen and the object.
The processing module 502 may take, as the coordinates of the contact point (that is, the first information), the coordinates corresponding to those ultrasonic sensors in the contact area whose difference in ultrasonic return time from an adjacent ultrasonic sensor falls within a second preset interval. Similarly, the second preset interval is also determined empirically, and is not described further here.
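The two-step screening described above can be sketched as follows: return times within the first preset interval define the contact area, and return-time differences of adjacent sensors within the second preset interval yield the contact-point coordinates. All names, the grid layout of the sensors, and the interval values are illustrative assumptions, not values from the embodiment.

```python
# Hedged sketch of the ultrasonic contact-point screening.
def contact_point_coords(return_times, coords, first_interval, second_interval):
    """return_times: sensor_id -> ultrasonic return time (None: no reflection);
    coords: sensor_id -> (x, y) grid position of the sensor's screen area."""
    lo1, hi1 = first_interval
    # step 1: sensors whose return time lies in the first preset interval
    # form the contact area between the contact object and the touch screen
    contact = [s for s, t in return_times.items()
               if t is not None and lo1 <= t <= hi1]
    lo2, hi2 = second_interval
    points = set()
    # step 2: within the contact area, adjacent sensors whose return-time
    # difference lies in the second preset interval contribute coordinates
    for s in contact:
        for n in contact:
            if s != n and _adjacent(coords[s], coords[n]) \
                    and lo2 <= abs(return_times[s] - return_times[n]) <= hi2:
                points.add(coords[s])
    return sorted(points)

def _adjacent(p, q):
    # grid neighbours: differ by at most one unit along each axis
    return p != q and abs(p[0] - q[0]) <= 1 and abs(p[1] - q[1]) <= 1
```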
In the embodiment of the application, the touch module may further include a processing module and a capacitive touch panel. The processing module can determine the coordinates of the contact position between the touch screen and the object through capacitance.
As shown in fig. 1, a plurality of capacitors are disposed in the touch panel. In the absence of a user touch or external interference, the capacitance value of each capacitor in the touch panel is smaller than a preset threshold (for example, 1000 pF). When a user touches the touch screen with a finger, the coupling effect between the human body and the capacitors causes the capacitance values of some of the capacitors in the touch panel to exceed the preset threshold.
In some embodiments, the processing module may acquire capacitance values of all the capacitances, and use an area formed by the capacitances with capacitance values greater than a preset threshold as the touch area.
Illustratively, the capacitor having the largest capacitance value is referred to as the first capacitor. The processing module can take the first capacitor as the initial stack element; determine, among the 8 capacitors surrounding the first capacitor, a second capacitor whose capacitance value is greater than the preset threshold and push it onto the stack; then determine, among the 8 capacitors surrounding the second capacitor, a third capacitor whose capacitance value is greater than the preset threshold; and so on, until the number of elements in the stack no longer increases. The area formed by the capacitors in the stack is then the touch area.
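A minimal sketch of this stack-based region growth, assuming the capacitor grid is represented as a dictionary keyed by (row, column) position (the names and threshold are illustrative assumptions):

```python
# Illustrative region growth: start from the capacitor with the largest
# capacitance value and repeatedly absorb, among the 8 surrounding
# capacitors, those whose value exceeds the threshold, until the region
# stops growing.
def touch_area(cap, threshold):
    """cap: dict (row, col) -> capacitance value."""
    # the first capacitor: the cell with the largest capacitance value
    start = max(cap, key=cap.get)
    if cap[start] <= threshold:
        return set()
    region, stack = {start}, [start]
    while stack:
        r, c = stack.pop()
        # examine the 8 capacitors surrounding the current one
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nb = (r + dr, c + dc)
                if nb != (r, c) and nb in cap and nb not in region \
                        and cap[nb] > threshold:
                    region.add(nb)
                    stack.append(nb)
    return region
```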
Following the centroid principle, the processing module may calculate the weighted averages of the abscissas and ordinates of the capacitors in the touch area, using the capacitance value of each capacitor as its weight. The processing module may multiply the weighted average of the abscissas by the horizontal resolution to obtain the abscissa of the contact position between the touch screen and the object, and multiply the weighted average of the ordinates by the vertical resolution to obtain the ordinate of the contact position. In the embodiment of the present application, the processing module may be integrated into a chip; in other words, the chip may integrate the above-described functions of the processing module.
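A hedged sketch of the centroid computation, assuming the capacitor coordinates are normalized to [0, 1] before being scaled by the screen resolution (the normalization scheme and all names are assumptions for illustration):

```python
# Capacitance values act as weights for a weighted average of the
# normalized capacitor coordinates; the result is scaled by the
# horizontal and vertical screen resolutions.
def contact_position(region, cap, res_x, res_y):
    """region: iterable of (x, y) normalized capacitor coordinates in [0, 1];
    cap: (x, y) -> capacitance value, used as the weight."""
    total = sum(cap[p] for p in region)
    wx = sum(p[0] * cap[p] for p in region) / total  # weighted avg abscissa
    wy = sum(p[1] * cap[p] for p in region) / total  # weighted avg ordinate
    return wx * res_x, wy * res_y                    # scale by resolution
```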
In the embodiment of the present application, when the touch module is as shown in fig. 5, the process of acquiring the second information of the object by the fingerprint recognition module may be completed by ultrasonic waves. The following is a detailed description.
For ultrasonic waves of the same initial energy, the farther they travel in space, the greater their energy loss and thus the lower their amplitude. The relative distance traveled by a reflected wave can therefore be determined from its amplitude.
Fig. 7 is a schematic diagram of an ultrasonic return time according to an embodiment of the application.
As shown in fig. 7, when there is no touch, that is, when the touch screen is not contacted by any object, the ultrasonic wave produces no reflected wave.
When the touch screen is touched by an object without texture, the contact surface between the object and the touch screen is a textureless plane. The distances between the ultrasonic sensors emitting the ultrasonic waves and the contact surface are equal, so the distances traveled by the reflected waves from the contact surface are also equal.
Therefore, in the same period T, the ultrasonic wave return time of the reflected wave reflected by the contact surface is T1, and the amplitude of each reflected wave is P1.
When the touch screen is touched by a finger, the reflected waves received by ultrasonic sensors at different positions differ in amplitude, and their ultrasonic return times also differ.
The above differences in amplitude and ultrasonic return time arise mainly because a fingerprint includes ridges and valleys. Please refer to fig. 8, which is a schematic diagram of a finger fingerprint touching the touch screen according to an embodiment of the present application. As shown in fig. 8, a ridge is a raised portion of the fingerprint that is in direct contact with the touch screen. A valley is a recessed portion of the fingerprint that is not in contact with the touch screen and is separated from it by a certain distance. After the finger contacts the touch screen, the return time of the ultrasonic wave reflected by a ridge is referred to as the first time, and the return time of the ultrasonic wave reflected by a valley is referred to as the second time; it is understood that the first time is smaller than the second time.
For example, as shown in fig. 7, the ultrasonic waves corresponding to t2 and P2 are those reflected by ridges, and the ultrasonic waves corresponding to t3 and P3 are those reflected by valleys. Since the ultrasonic waves reflected by the valleys travel a greater distance in space than those reflected by the ridges, as shown in fig. 7, P2 is greater than P3, and t3 is greater than t2.
In the embodiment of the application, the fingerprint identification module can acquire the return time of each ultrasonic wave within the contact surface between the object and the touch screen, and determine whether the contact surface includes fingerprint information or texture information of a stylus tip.
In some embodiments, a value may be preset, referred to as a first preset value. When the return time of a certain ultrasonic wave in the contact surface is smaller than the first preset value, the ultrasonic wave is determined to be reflected by the ridge. And when the return time of a certain ultrasonic wave in the contact surface is greater than the first preset value, determining that the ultrasonic wave is reflected by the valley. Thus, the distribution of ridges and valleys in the fingerprint, namely the first texture information, can be obtained according to the ultrasonic return time of each position of the contact surface.
In other embodiments, a value may be preset, referred to as a second preset value. When the amplitude of a reflected wave in the contact surface is larger than the second preset value, determining that the reflected wave is reflected by the ridge. When the amplitude of a certain reflected wave in the contact surface is smaller than the second preset value, determining that the reflected wave is reflected by the valley. Thus, the distribution of ridges and valleys in the fingerprint, namely the first texture information, can be obtained according to the amplitude of the reflected wave at each position of the contact surface.
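Both classification rules described above can be sketched as follows; the threshold values and all names are illustrative assumptions, not values from the embodiment. A ridge reflection returns sooner (return time below the first preset value) and loses less energy (amplitude above the second preset value).

```python
# Rule 1: classify by ultrasonic return time against a first preset value.
def classify_by_return_time(t, first_preset):
    return "ridge" if t < first_preset else "valley"

# Rule 2: classify by reflected-wave amplitude against a second preset value.
def classify_by_amplitude(p, second_preset):
    return "ridge" if p > second_preset else "valley"

def texture_map(samples, first_preset):
    """samples: dict (x, y) -> return time at that position of the contact
    surface; returns the ridge/valley distribution, i.e. the first texture
    information."""
    return {pos: classify_by_return_time(t, first_preset)
            for pos, t in samples.items()}
```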
It can be seen that, in the electronic device provided by the embodiment of the application, when the second information indicates that the contact surface of the object and the touch screen at the contact position includes fingerprint information or texture information of a stylus tip, the chip takes the coordinates of the contact position between the touch screen and the object in the first information as the effective touch coordinates. The user's touch position is thus located more accurately, which can improve the user's touch experience.
Based on the electronic device, the embodiment of the application also provides a touch positioning method which is applied to the electronic device described in the embodiment.
Referring to fig. 9, a flowchart of a touch positioning method according to an embodiment of the application is shown. As shown in fig. 9, the scheme includes S901 to S904 (including S904a and S904 b).
And S901, responding to the contact operation of the contact object on the touch screen, and determining the coordinates of the contact point.
S902, acquiring first texture information of the contact.
The first texture information is used for indicating texture information of a contact surface of the contact object and the touch screen.
S903, judging whether the first texture information accords with preset texture characteristics.
The preset texture features include at least one of the following: texture features of the fingerprint, texture features of the stylus.
If yes, S904a described below is performed. If not, the following S904b is executed.
And S904a, determining the coordinates of the contact point as effective touch coordinates.
S904b, determining the coordinates of the contact point as invalid touch coordinates.
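A minimal sketch of the S901 to S904 flow, assuming a placeholder texture matcher (the feature labels and all names are illustrative assumptions, not the embodiment's implementation):

```python
# Preset texture features per S903: fingerprint and stylus-tip textures.
PRESET_FEATURES = {"fingerprint", "stylus_tip"}

def locate_touch(contact_coords, first_texture_info):
    """Returns (coords, True) for effective touch coordinates (S904a),
    (coords, False) for invalid touch coordinates (S904b)."""
    # S903: does the first texture information match a preset feature?
    effective = first_texture_info in PRESET_FEATURES
    return contact_coords, effective
```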
It should be noted that for all relevant content of the modules in the above system embodiment, reference may be made to the descriptions of the corresponding method steps, which are not repeated here.
Fig. 10 is a schematic diagram of an electronic device 1000 according to an embodiment of the application. The electronic device 1000 may be any of the above examples, for example, the electronic device 1000 may be a mobile phone, a computer, or the like. For example, as shown in fig. 10, the electronic device 1000 may include: a processor 1001 and a memory 1002. The memory 1002 is used to store computer-executable instructions. For example, in some embodiments, the processor 1001, when executing the instructions stored in the memory 1002, may cause the electronic device 1000 to perform any of the functions of the electronic device in the above embodiments to implement any of the methods in the above examples.
It should be noted that for all relevant content of the steps in the above method embodiment, reference may be made to the functional descriptions of the corresponding functional modules, which are not repeated here.
Fig. 11 shows a schematic diagram of the components of a chip system 1100. The chip system 1100 may be provided in an electronic device. For example, the system-on-chip 1100 may be provided in a cell phone. By way of example, the chip system 1100 may include: a processor 1101 and a communication interface 1102 for supporting the electronic device to implement the functions referred to in the above embodiments. In one possible design, the chip system 1100 may further include memory to hold the program instructions and data necessary for the electronic device. The chip system can be composed of chips, and can also comprise chips and other discrete devices. It should be noted that, in some implementations of the present application, the communication interface 1102 may also be referred to as an interface circuit.
It should be noted that for all relevant content of the steps in the above method embodiment, reference may be made to the functional descriptions of the corresponding functional modules, which are not repeated here.
The embodiment of the application also provides a computer storage medium, in which computer instructions are stored, which when run on a terminal device, cause the terminal device to execute the relevant method steps to implement the method in the above embodiment.
The embodiments of the present application also provide a computer program product which, when run on a computer, causes the computer to perform the above-mentioned related steps to implement the method in the above-mentioned embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be embodied as a chip, component or module, which may include a processor and a memory coupled to each other; the memory is configured to store computer-executable instructions, and when the device is operated, the processor may execute the computer-executable instructions stored in the memory, so that the chip performs the methods in the above method embodiments.
The terminal device, the computer storage medium, the computer program product, or the chip provided in the embodiments of the present application are used to execute the corresponding methods provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding methods provided above, and are not described herein.
The scheme provided by the embodiment of the application is mainly described from the perspective of the electronic equipment. To achieve the above functions, it includes corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application can divide the functional modules of the devices involved in the method according to the method example, for example, each functional module can be divided corresponding to each function, and two or more functions can be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
The functions or acts or operations or steps and the like in the embodiments described above may be implemented in whole or in part by software, hardware, firmware or any combination thereof. When implemented using a software program, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in accordance with embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by a wired (e.g., coaxial cable, fiber optic, digital subscriber line (digital subscriber line, DSL)) or wireless (e.g., infrared, wireless, microwave, etc.). The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device including one or more servers, data centers, etc. that can be integrated with the medium. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), or the like.
Although the application has been described in connection with specific features and embodiments thereof, it will be apparent that various modifications and combinations can be made without departing from the spirit and scope of the application. Accordingly, the specification and drawings are merely exemplary illustrations of the application as defined by the appended claims, and are intended to cover any and all modifications, variations, combinations, or equivalents falling within the scope of the application. If such modifications and variations fall within the scope of the appended claims and their equivalents, the present application is intended to include them as well.

Claims (7)

1. A touch positioning method, characterized by being applied to an electronic device, wherein the electronic device comprises a touch screen and a plurality of ultrasonic sensors, the plurality of ultrasonic sensors being arranged on the inner-screen side of the touch screen; the method comprises the following steps:
in response to a touch operation of a contact object on the touch screen, taking the touch-screen area corresponding to the ultrasonic sensors whose ultrasonic return time lies within a first preset interval as the contact area between the contact object and the touch screen; wherein the ultrasonic return time of an ultrasonic sensor is the difference between the time at which the reflected ultrasonic wave is received and the time at which the ultrasonic wave is transmitted;
taking the coordinates corresponding to the ultrasonic sensors in the contact area whose difference in ultrasonic return time from an adjacent ultrasonic sensor lies within a second preset interval as the coordinates of the contact point;
acquiring first texture information of the contact object, wherein the first texture information indicates texture information of the contact surface between the contact object and the touch screen;
determining whether the first texture information conforms to a preset texture feature, wherein the preset texture feature comprises at least one of the following: a texture feature of a fingerprint, or a texture feature of a stylus;
if so, determining the coordinates of the contact point as valid touch coordinates; if not, determining the coordinates of the contact point as invalid touch coordinates.
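The three-step flow of claim 1 (contact area from return times, contact coordinates from adjacent-sensor time differences, texture-based validation) can be summarized in a brief sketch. This is purely illustrative and not the claimed implementation: the 4-neighbour sensor grid, the interval values, and the function name are hypothetical.

```python
def locate_touch(return_times, coords, first_interval, second_interval,
                 texture_matches):
    """Illustrative sketch of the claim-1 logic (hypothetical layout).

    return_times: sensor id -> ultrasonic return time (receive time
    minus transmit time); coords: sensor id -> (x, y) grid position.
    """
    lo1, hi1 = first_interval
    # Step 1: sensors whose return time falls within the first preset
    # interval delimit the contact area.
    contact_area = {s for s, t in return_times.items() if lo1 <= t <= hi1}

    def adjacent(a, b):
        # Hypothetical 4-neighbour adjacency on a rectangular grid.
        ax, ay = coords[a]
        bx, by = coords[b]
        return abs(ax - bx) + abs(ay - by) == 1

    lo2, hi2 = second_interval
    # Step 2: within the contact area, sensors whose return-time
    # difference from an adjacent sensor lies within the second preset
    # interval yield the contact-point coordinates.
    contact_points = [
        coords[a] for a in contact_area
        if any(adjacent(a, b)
               and lo2 <= abs(return_times[a] - return_times[b]) <= hi2
               for b in contact_area if b != a)
    ]

    # Step 3: the coordinates are valid touch coordinates only if the
    # contact surface's texture matches a preset feature.
    return contact_points if texture_matches else []
```

On a hypothetical 3x3 grid where only two adjacent sensors report short return times, the sketch reports just those two positions, and reports nothing when the texture check fails.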
2. The method according to claim 1, wherein before taking the touch-screen area corresponding to the ultrasonic sensors whose ultrasonic return time lies within the first preset interval as the contact area between the contact object and the touch screen, the method further comprises:
transmitting ultrasonic waves in a first direction through each ultrasonic sensor, wherein the first direction is perpendicular to the touch screen and points toward the outer-screen side of the touch screen;
in response to each ultrasonic sensor receiving a reflected ultrasonic wave, calculating the ultrasonic return time of that ultrasonic sensor; the ultrasonic return time of an ultrasonic sensor is the difference between the time at which the reflected wave is received and the time at which the ultrasonic wave was transmitted.
3. The method of claim 1, wherein acquiring the first texture information of the contact object comprises:
determining the first texture information according to the ultrasonic return time of each ultrasonic sensor in the contact area.
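The claims do not spell out how the per-sensor return times map to texture information. One hedged reading of claim 3 is that fingerprint ridges press against the glass (short acoustic path) while valleys leave an air gap (longer path), so thresholding the return times yields a binary ridge/valley pattern. The function name and threshold below are hypothetical illustrations, not the patented computation.

```python
def texture_from_return_times(area_times, ridge_threshold):
    """Hypothetical sketch for claim 3: derive a binary ridge/valley
    texture map from the ultrasonic return times of the sensors in
    the contact area.

    area_times: sensor id -> ultrasonic return time.
    """
    # Assumed model: a ridge in contact with the glass reflects sooner
    # than a valley whose air gap lengthens the acoustic path.
    return {s: ("ridge" if t <= ridge_threshold else "valley")
            for s, t in area_times.items()}
```

The resulting pattern could then be compared against preset fingerprint or stylus texture features to decide validity, as in claim 1.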
4. An electronic device, comprising: a touch screen, a touch module, an identification module, and a chip; the chip is connected to the touch module and to the identification module respectively; the chip, the touch module, and the identification module are arranged on the inner-screen side of the touch screen; the touch module comprises a plurality of ultrasonic sensors, and the plurality of ultrasonic sensors are arranged on the inner-screen side of the touch screen;
the touch module is configured to, in response to a touch operation of a contact object on the touch screen, take the touch-screen area corresponding to the ultrasonic sensors whose ultrasonic return time lies within a first preset interval as the contact area between the contact object and the touch screen, wherein the ultrasonic return time of an ultrasonic sensor is the difference between the time at which the reflected ultrasonic wave is received and the time at which the ultrasonic wave is transmitted; and to take the coordinates corresponding to the ultrasonic sensors in the contact area whose difference in ultrasonic return time from an adjacent ultrasonic sensor lies within a second preset interval as the coordinates of the contact point;
the identification module is configured to acquire first texture information of the contact object, the first texture information indicating texture information of the contact surface between the contact object and the touch screen;
the chip is configured to determine whether the first texture information conforms to a preset texture feature, the preset texture feature comprising at least one of the following: a texture feature of a fingerprint, or a texture feature of a stylus; if so, to determine the coordinates of the contact point as valid touch coordinates; if not, to determine the coordinates of the contact point as invalid touch coordinates.
5. The electronic device of claim 4, wherein the identification module is any one of the following: an optical fingerprint identification module, an ultrasonic fingerprint identification module, or a semiconductor fingerprint identification module.
6. An electronic device comprising one or more processors and one or more memories; the one or more memories are coupled to the one or more processors and store computer instructions; when the computer instructions are executed by the one or more processors, the electronic device performs the touch positioning method of any one of claims 1-3.
7. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises computer instructions which, when run, perform the touch positioning method of any one of claims 1-3.
CN202210700269.5A 2022-06-20 2022-06-20 Touch positioning method and electronic equipment Active CN115617192B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210700269.5A CN115617192B (en) 2022-06-20 2022-06-20 Touch positioning method and electronic equipment


Publications (2)

Publication Number Publication Date
CN115617192A CN115617192A (en) 2023-01-17
CN115617192B true CN115617192B (en) 2023-11-10

Family

ID=84857333

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210700269.5A Active CN115617192B (en) 2022-06-20 2022-06-20 Touch positioning method and electronic equipment

Country Status (1)

Country Link
CN (1) CN115617192B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004164505A (en) * 2002-11-15 2004-06-10 Fujitsu Ltd Coordinate output device
JP2009134505A (en) * 2007-11-30 2009-06-18 Pentel Corp Handwriting input system
CN105824559A (en) * 2016-02-29 2016-08-03 维沃移动通信有限公司 Unintended activation recognizing and treating method and electronic equipment
CN106970726A (en) * 2017-03-16 2017-07-21 宇龙计算机通信科技(深圳)有限公司 Control method and device for the electronic equipment with full frame fingerprint recognition
CN107291307A (en) * 2017-07-26 2017-10-24 京东方科技集团股份有限公司 Ultrasonic wave contactor control device and method, display device
CN107426434A (en) * 2017-08-01 2017-12-01 京东方科技集团股份有限公司 Input unit and electronic equipment
CN109241957A (en) * 2018-11-19 2019-01-18 Oppo广东移动通信有限公司 Electronic device, fingerprint collecting method, device, storage medium and mobile terminal
CN110221720A (en) * 2019-04-29 2019-09-10 华为技术有限公司 A kind of touch method and electronic equipment
CN110287931A (en) * 2019-07-01 2019-09-27 Oppo广东移动通信有限公司 Touch coordinate determines method, apparatus, terminal and storage medium
CN110892372A (en) * 2017-08-30 2020-03-17 华为技术有限公司 Screen control method and terminal
CN111381729A (en) * 2020-03-27 2020-07-07 深圳市鸿合创新信息技术有限责任公司 Touch point positioning method and device of capacitive touch screen



Similar Documents

Publication Publication Date Title
CN108388390B (en) Apparatus and method for controlling fingerprint sensor
CN106066986B (en) Method and apparatus for sensing a fingerprint
JP7244666B2 (en) Screen control method, electronic device, and storage medium
US10699093B2 (en) Mobile terminal, method and device for displaying fingerprint recognition region
US10256658B2 (en) Operating method of an electronic device and electronic device supporting the same
EP3637225B1 (en) Display processing method and apparatus
US20180314874A1 (en) Method For Displaying Fingerprint Identification Area And Mobile Terminal
US10182769B2 (en) Information management method and electronic device
CN112771481B (en) Electronic device for pairing with handwriting pen and method thereof
EP3142352A1 (en) Method for processing sound by electronic device and electronic device thereof
CN107423601A (en) Method for controlling fingerprint identification and Related product
CN116070035B (en) Data processing method and electronic equipment
CN113778255B (en) Touch recognition method and device
CN113108945A (en) Temperature anomaly detection method and device
CN115617192B (en) Touch positioning method and electronic equipment
CN112513789A (en) Method for controlling operation mode by using electronic pen and electronic device
CN113343709B (en) Method for training intention recognition model, method, device and equipment for intention recognition
CN112365088B (en) Method, device and equipment for determining travel key points and readable storage medium
CN112990421B (en) Method, device and storage medium for optimizing operation process of deep learning network
US20200167016A1 (en) Electronic device and method of transmitting data by electronic device
CN111510553A (en) Motion trail display method and device and readable storage medium
CN112749583A (en) Face image grouping method and device, computer equipment and storage medium
CN113052408B (en) Method and device for community aggregation
CN114264884B (en) Dielectric constant measuring method and device
CN116668951B (en) Method for generating geofence, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant