CN111625175B - Touch event processing method, touch event processing device, medium and electronic equipment - Google Patents


Info

Publication number
CN111625175B
Authority
CN
China
Prior art keywords
touch event
touch
processor
user
interrupt signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010374275.7A
Other languages
Chinese (zh)
Other versions
CN111625175A (en)
Inventor
古启才
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Oppo Chongqing Intelligent Technology Co Ltd
Original Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo Chongqing Intelligent Technology Co Ltd
Priority to CN202010374275.7A
Publication of CN111625175A
Application granted
Publication of CN111625175B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 - User authentication
    • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 - Multiprogramming arrangements
    • G06F 9/48 - Program initiating; Program switching, e.g. by interrupt
    • G06F 9/4806 - Task transfer initiation or dispatching
    • G06F 9/4812 - Task transfer initiation or dispatching by interrupt, e.g. masked
    • G06F 9/4818 - Priority circuits therefor
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 - Multiprogramming arrangements
    • G06F 9/50 - Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F 9/5005 - Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F 9/5011 - Allocation of resources, e.g. of the central processing unit [CPU] to service a request, the resources being hardware resources other than CPUs, Servers and Terminals
    • G06F 9/5022 - Mechanisms to release resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides a touch event processing method, a touch event processing apparatus, a computer-readable storage medium, and an electronic device, and relates to the technical field of human-computer interaction. The touch event processing method is applied to a touch terminal that includes a touch screen, and comprises the following steps: in a black-screen state, when the touch screen receives a touch event input by a user, triggering a timer to start timing; sending an interrupt signal to a processor and maintaining the touch event; when the processor reads the touch event, releasing the touch event and clearing the timer; and when the timer reaches a first preset time, sending the interrupt signal to the processor again so that the processor reads the touch event. By maintaining the touch event while waiting for the processor to read it, the method avoids the situation in which a lost touch event makes it impossible to identify the user's information.

Description

Touch event processing method, touch event processing device, medium and electronic equipment
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular, to a touch event processing method, a touch event processing apparatus, a computer-readable storage medium, and an electronic device.
Background
With the rapid development of information technology, terminal devices such as mobile phones and tablet computers are widely used. Because these devices may store a large amount of users' personal information, their security has received increasing attention, and fingerprint recognition technology in particular has developed considerably.
Under-display optical fingerprint recognition is currently one of the most popular fingerprint recognition technologies: a light source is provided by the display device, and a fingerprint image of the user is captured optically, enabling functions such as unlocking the terminal or authorizing a payment. In practice, however, when the terminal device is in a black-screen or deep-sleep state and a user touches the screen expecting fingerprint recognition, the touch event is often lost because the processor wakes up slowly; the user's information cannot be collected, fingerprint recognition fails, and the user experience suffers. How to obtain a user's valid touch event in time while the touch terminal is in a black-screen or deep-sleep state, so that fingerprint recognition can be completed, is therefore a problem to be solved urgently in the prior art.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The disclosure provides a touch event processing method, a touch event processing apparatus, a computer-readable storage medium, and an electronic device, so that, at least to some extent, a processor can acquire a user's touch event effectively and in time, and fingerprint identification can proceed normally.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, a touch event processing method is provided, applied to a touch terminal that includes a touch screen. The method includes: in a black-screen state, when the touch screen receives a touch event input by a user, triggering a timer to start timing; sending an interrupt signal to a processor and maintaining the touch event; when the processor reads the touch event, releasing the touch event and clearing the timer; and when the timer reaches a first preset time, sending the interrupt signal to the processor again so that the processor reads the touch event.
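The steps of the first aspect can be sketched as a small controller-side state machine. The sketch below is illustrative only and does not come from the patent: the names (`touch_ctl`, `FIRST_TIMEOUT_TICKS`) and the assumption of a polled timer tick are invented for this example.

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical "first preset time", expressed in timer ticks. */
#define FIRST_TIMEOUT_TICKS 5

struct touch_ctl {
    bool event_held;   /* the touch event is being maintained      */
    bool irq_pending;  /* interrupt line asserted to the processor */
    int  timer_ticks;  /* 0 = timer cleared                        */
    int  irq_count;    /* interrupts sent so far (for inspection)  */
};

/* A touch arrives in the black-screen state: start the timer,
 * raise the interrupt, and hold (maintain) the event. */
void on_touch_event(struct touch_ctl *c)
{
    c->event_held  = true;
    c->timer_ticks = 1;     /* timer starts timing       */
    c->irq_pending = true;  /* first interrupt signal    */
    c->irq_count++;
}

/* Periodic tick: if the processor has not read the event by the
 * first preset time, send the interrupt signal again and keep
 * holding the event rather than dropping it. */
void on_timer_tick(struct touch_ctl *c)
{
    if (!c->event_held || c->timer_ticks == 0)
        return;
    if (++c->timer_ticks >= FIRST_TIMEOUT_TICKS) {
        c->irq_pending = true;  /* interrupt re-sent */
        c->irq_count++;
        c->timer_ticks = 1;     /* restart the wait  */
    }
}

/* The processor reads the event: release it and clear the timer. */
void on_processor_read(struct touch_ctl *c)
{
    c->event_held  = false;
    c->irq_pending = false;
    c->timer_ticks = 0;
}
```

The point of the design is that the event stays latched until the processor actually reads it, so a slow wake-up from deep sleep only delays fingerprint capture instead of losing the touch altogether.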
According to a second aspect of the present disclosure, a touch event processing apparatus is provided, applied to a touch terminal that includes a touch screen. The apparatus includes: an event receiving module, configured to trigger a timer to start timing when the touch screen receives a touch event input by a user in a black-screen state; a signal sending module, configured to send an interrupt signal to a processor and maintain the touch event; an event releasing module, configured to release the touch event and clear the timer when the processor reads the touch event; and an event reading module, configured to send the interrupt signal to the processor again when the timer reaches a first preset time, so that the processor reads the touch event.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described touch event processing method.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the above-described touch event processing method via execution of the executable instructions.
The technical scheme of the disclosure has the following beneficial effects:
according to the touch event processing method, the touch event processing apparatus, the computer-readable storage medium, and the electronic device, in a black-screen state, when the touch screen receives a touch event input by a user, a timer is triggered to start timing; an interrupt signal is sent to a processor and the touch event is maintained; when the processor reads the touch event, the touch event is released and the timer is cleared; and when the timer reaches a first preset time, the interrupt signal is sent to the processor again so that the processor reads the touch event. On the one hand, maintaining the touch event input by the user avoids the situation in which the event is lost, for power-saving reasons, while the touch terminal's screen is off, so that fingerprint identification cannot proceed, and thus improves the user experience. On the other hand, in addition to the initial interrupt signal, a timer is set, and when the timer reaches the first preset time the interrupt signal is sent to the processor again so that the processor reads the touch event; this further avoids the problem of the processor failing to read the event in time.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 shows a schematic diagram of an electronic device of the present exemplary embodiment;
Fig. 2 is a flowchart illustrating a touch event processing method according to the present exemplary embodiment;
Fig. 3 shows a system structure for sending an interrupt signal in a touch event processing method according to the present exemplary embodiment;
Fig. 4 illustrates a sub-flowchart of a touch event processing method of the present exemplary embodiment;
Fig. 5 shows a schematic diagram of a touch screen in the present exemplary embodiment;
Fig. 6 is a flowchart illustrating another touch event processing method according to the present exemplary embodiment;
Fig. 7 shows a block diagram of a touch event processing device according to the present exemplary embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
An exemplary embodiment of the present disclosure provides an electronic device for implementing a touch event processing method. The electronic device may be implemented in various forms, and may include, for example, a mobile device such as a mobile phone, a tablet computer, a notebook computer, a Personal Digital Assistant (PDA), a navigation device, a wearable device, an unmanned aerial vehicle, and a stationary device such as a desktop computer and a smart television. The following takes the mobile terminal 100 in fig. 1 as an example, and exemplifies the configuration of the electronic device. It will be appreciated by those skilled in the art that the configuration of figure 1 can also be applied to fixed type devices, in addition to components specifically intended for mobile purposes. In other embodiments, mobile terminal 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The interfacing relationship between the components is only schematically illustrated and does not constitute a structural limitation of the mobile terminal 100. In other embodiments, the mobile terminal 100 may also interface differently than shown in fig. 1, or a combination of multiple interfaces.
As shown in fig. 1, the mobile terminal 100 may specifically include: a processor 110, an internal memory 121, an external memory interface 122, a Universal Serial Bus (USB) interface 130, a charging management Module 140, a power management Module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication Module 150, a wireless communication Module 160, an audio Module 170, a speaker 171, a receiver 172, a microphone 173, an earphone interface 174, a sensor Module 180, a display 190, a camera Module 191, an indicator 192, a motor 193, a key 194, and a Subscriber Identity Module (SIM) card interface 195. Wherein the sensor module 180 may include a depth sensor 1801, a pressure sensor 1802, a gyroscope sensor 1803, an air pressure sensor 1804, and the like.
Processor 110 may include one or more processing units, such as: the Processor 110 may include an Application Processor (AP), a modem Processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural-Network Processing Unit (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of reading the instruction and executing the instruction.
A memory may also be provided in processor 110 for storing instructions and data. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transmission instructions, and notification instructions, and are controlled to be executed by the processor 110. In some implementations, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some implementations, the processor 110 may include one or more interfaces. The interfaces may include an Inter-Integrated Circuit (I2C) interface, an Inter-IC Sound (I2S) interface, a Pulse Code Modulation (PCM) interface, a Universal Asynchronous Receiver/Transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a General-Purpose Input/Output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc. Connections are made with other components of the mobile terminal 100 through different interfaces.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini-USB interface, a Micro-USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the mobile terminal 100, to connect an earphone and play audio through it, or to connect the mobile terminal 100 to other electronic devices such as a computer or a peripheral device.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the mobile terminal 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives the input of the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 190, the camera module 191, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the mobile terminal 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the mobile terminal 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the mobile terminal 100. The mobile communication module 150 may include at least one filter, a switch, a power Amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 171, the receiver 172, etc.) or displays an image or video through the display screen 190. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 110 and may be disposed in the same device as the mobile communication module 150 or other functional modules.
The Wireless Communication module 160 may provide solutions for Wireless Communication applied to the mobile terminal 100, including Wireless Local Area Networks (WLANs) (e.g., Wireless Fidelity (Wi-Fi) network), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the mobile terminal 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the mobile terminal 100 can communicate with a network and other devices through wireless communication technology. The wireless communication technology may include Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (Code Division Multiple Access, CDMA), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (Long Term Evolution, LTE), New air interface (New Radio, NR), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. GNSS may include Global Positioning System (GPS), Global Navigation Satellite System (GLONASS), Beidou Navigation Satellite System (BDS), Quasi-Zenith Satellite System (QZSS), and/or Satellite Based Augmentation System (SBAS).
The mobile terminal 100 implements a display function through the GPU, the display screen 190, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to a display screen 190 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 190 is used to display images, video, and the like. The display screen 190 includes a display panel. The display panel may be a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), an Active-Matrix Organic Light-Emitting Diode (AMOLED), a Flexible Light-Emitting Diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a Quantum dot Light-Emitting Diode (QLED), or the like. In some embodiments, the mobile terminal 100 may include 1 or N display screens 190, N being a positive integer greater than 1.
The mobile terminal 100 may implement a photographing function through the ISP, the camera module 191, the video codec, the GPU, the display screen 190, the application processor, and the like.
The ISP is used to process the data fed back by the camera module 191. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera module 191.
The camera module 191 is used to capture still images or videos. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a Complementary Metal-Oxide-Semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the mobile terminal 100 may include 1 or N camera modules 191, where N is a positive integer greater than 1, and if the mobile terminal 100 includes N cameras, one of the N cameras is the main camera.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the mobile terminal 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The mobile terminal 100 may support one or more video codecs. In this way, the mobile terminal 100 may play or record video in a variety of encoding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and the like.
The external memory interface 122 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the mobile terminal 100. The external memory card communicates with the processor 110 through the external memory interface 122 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (e.g., audio data, a phonebook, etc.) created during use of the mobile terminal 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk Storage device, a Flash memory device, a Universal Flash Storage (UFS), and the like. The processor 110 executes various functional applications of the mobile terminal 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The mobile terminal 100 may implement an audio function through the audio module 170, the speaker 171, the receiver 172, the microphone 173, the earphone interface 174, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 171, also called a "loudspeaker", converts an audio electrical signal into a sound signal. The user can listen to music or hands-free calls through the speaker 171 of the mobile terminal 100.
A receiver 172, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the mobile terminal 100 receives a call or voice information, it can receive voice by placing the receiver 172 close to the human ear.
A microphone 173, also referred to as a "mic" or "mouthpiece", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 173 by speaking close to it. The mobile terminal 100 may be provided with at least one microphone 173. In other embodiments, the mobile terminal 100 may be provided with two microphones 173 to implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the mobile terminal 100 may further include three, four, or more microphones 173 to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The earphone interface 174 is used to connect a wired earphone. The earphone interface 174 may be the USB interface 130, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of America (CTIA) standard interface.
The depth sensor 1801 is used to acquire depth information of a scene. In some embodiments, the depth sensor may be disposed in the camera module 191.
The pressure sensor 1802 is used to sense a pressure signal, which can be converted into an electrical signal. In some embodiments, the pressure sensor 1802 may be disposed on the display screen 190. The pressure sensors 1802 can be of a wide variety, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, and the like.
The gyro sensor 1803 may be used to determine the motion posture of the mobile terminal 100. In some embodiments, the angular velocity of the mobile terminal 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 1803. The gyro sensor 1803 may be used for anti-shake photographing. For example, when the shutter is pressed, the gyro sensor 1803 detects the shaking angle of the mobile terminal 100, calculates the distance the lens module needs to compensate according to the shaking angle, and allows the lens to counteract the shaking of the mobile terminal 100 through a reverse motion, thereby achieving image stabilization. The gyro sensor 1803 may also be used for navigation and for motion sensing in game scenes.
Air pressure sensor 1804 is used to measure air pressure. In some embodiments, mobile terminal 100 may calculate altitude, aid in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 1804.
In addition, other functional sensors, such as a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc., may be disposed in the sensor module 180 according to actual needs.
The keys 194 include a power-on key, a volume key, and the like. The keys 194 may be mechanical keys or touch keys. The mobile terminal 100 may receive key inputs and generate key signal inputs related to user settings and function control of the mobile terminal 100.
The motor 193 may generate vibration prompts, such as for incoming calls, alarm clocks, and received messages, and may also be used for touch vibration feedback. Touch operations in different applications (e.g., photographing, games, audio playing) or on different areas of the display screen 190 may correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to and detached from the mobile terminal 100 by being inserted into or pulled out of the SIM card interface 195. The mobile terminal 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a standard SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time, and the types of these cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The mobile terminal 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the mobile terminal 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card may be embedded in the mobile terminal 100 and cannot be separated from it.
Exemplary embodiments of the present disclosure provide a touch event processing method, which may be applied to a touch terminal including a touch screen. The touch terminal may include, but is not limited to, a smart phone, a tablet computer, a notebook computer, a palm computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), an electronic reader, a vehicle-mounted device, a navigation device, a wearable device, and other devices including a touch screen. The touch screen is positioned in the terminal equipment and can be used for realizing interactive operation with a user.
The following describes a touch event processing method and a touch event processing apparatus according to exemplary embodiments of the present disclosure.
Fig. 2 shows a flow of a touch event processing method in the present exemplary embodiment, which includes the following steps S210 to S240:
in step S210, in the black screen state, when the touch screen receives a touch event input by a user, a timer is triggered to start timing.
A touch screen (also called a "touch panel") is a touch-sensitive display device that can be used to receive user input signals. According to its working principle and transmission medium, touch screens can be divided into several types, including resistive touch screens, capacitive touch screens, infrared touch screens, surface acoustic wave touch screens, and the like; among them, the capacitive touch screen is one of the most widely used types in terminal devices. In the present exemplary embodiment, the touch screen may be a capacitive touch screen. It works on the principle of capacitive sensing: when a user performs a specific operation on the touch screen, such as clicking, double-clicking, pressing, or sliding, with a specific object such as a finger or a stylus, a corresponding function can be realized. In the present exemplary embodiment, the touch event is an operation event input by the user for under-screen fingerprint unlocking, such as a pressing operation, a lifting operation, or a long-press operation. A touch event may carry a variety of information; for example, when the user presses the touch screen with a finger, the touch event may include the pressing force, the contact area between the finger and the touch screen, the position of the pressed area, the pressing time, and the like.
In order to extend the battery life of the touch terminal and save power, in practical applications the touch terminal is generally placed in a black screen state when the user does not need to use it. In an exemplary embodiment, the black screen state is: the touch screen is in a dormant state, and the under-screen fingerprint recognition function is enabled.
In the black screen state, the touch screen or the touch terminal is in a dormant or deep dormant state, and some unused application programs may be closed automatically. To allow the user to unlock the touch terminal through a touch operation, the under-screen fingerprint recognition function remains enabled, that is, a touch event input by the user can still be detected; therefore, when a touch event input by the user is received, a timer is triggered to start timing.
In addition, the black screen state can also be set as: a state in which black screen gestures are enabled and the under-screen fingerprint recognition function is enabled. When black screen gestures are enabled, the user can light up the screen or trigger certain functions with quick gestures while the phone is in the black screen state; for example, sliding a V shape on the touch screen in the black screen state can trigger the flashlight to turn on.
Step S220 is to send an interrupt signal to the processor and maintain the touch event.
In the exemplary embodiment, the touch screen may have its own independent memory and processor. When the touch screen receives a touch event input by a user, as shown in fig. 3, the touch screen 310 usually triggers an interrupt signal periodically through a specific pin, such as an INT pin or an I2C (Inter-Integrated Circuit)/SPI (Serial Peripheral Interface) bus, to enable the processor 320 of the touch terminal to read the touch event. However, in practical applications, because the processor is in a deep sleep state while the screen is black, after receiving an interrupt signal the processor needs to wake up step by step before it can respond to the interrupt and begin reading the touch event received by the touch screen. This wake-up process is slow and usually introduces a certain lag; for example, the lag before reading on a Qualcomm platform is about 40 ms (milliseconds), i.e., the processor responds to the interrupt signal 40 ms after receiving it. During this interval the touch event may be lost, so that the processor cannot acquire it.
Based on this, the present exemplary embodiment may set a mechanism for maintaining the touch event, that is, after receiving the touch event input by the user, the touch event is maintained until the processor reads the touch event and then the touch event is cleared. Specifically, in an exemplary embodiment, the maintaining the touch event may include:
and storing the touch event into a data register of the touch screen.
The data register may be a memory of the touch screen system, configured to buffer the received touch event, wait for the processor to read it, and clear the touch event only after the processor finishes reading it or after the buffering time exceeds a certain limit.
In step S230, when the processor reads the touch event, the touch event is released and the timer is cleared.
In step S240, when the timing time of the timer reaches the first preset time, the interrupt signal is sent to the processor again, so that the processor reads the touch event.
The first preset time refers to a time threshold that triggers resending the interrupt signal to the processor. If, after the interrupt signal is sent to the processor in step S220, the processor responds to it and reads the touch event, the touch event can be released and the timer reset to zero and stopped. If the processor does not respond to the interrupt signal sent in step S220, then when the timer reaches the first preset time the interrupt signal is sent to the processor again so that the processor reads the touch event, thereby ensuring that the processor reliably reads the touch event. It should be noted that, in the present exemplary embodiment, the period at which the touch screen sends the interrupt signal to the processor in step S220 is less than the first preset time; for example, if the touch screen sends the interrupt signal at a frequency of 120 Hz (hertz), the first preset time may be set to 8.3 ms, etc.
In an exemplary embodiment, the touch event processing method may further include the steps of:
when the timing time of the timer reaches a second preset time, releasing the touch event;
and the second preset time is greater than the first preset time.
In this exemplary embodiment, a maximum timing time may be set for the timer. When the timer reaches the second preset time, the touch event may be considered invalid and is released, so that the touch event is not maintained for too long and the storage pressure on the touch screen system does not increase. The second preset time is greater than the first preset time; for example, the first preset time may be set to 40 ms and the second preset time to 80 ms.
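The retry logic of steps S220 to S240, together with the second-preset-time release just described, can be sketched as a small simulation. The function name, the 1 ms tick granularity, and the callback interface are assumptions made for illustration; the 40 ms / 80 ms defaults follow the example values in the text.

```python
def process_touch_event(send_interrupt, processor_ready,
                        first_preset_ms=40, second_preset_ms=80):
    """Simulate maintaining a touch event: resend the interrupt each
    time the timer reaches a multiple of first_preset_ms, and release
    (discard) the event once second_preset_ms elapses (assumed sketch).

    send_interrupt: callable invoked each time an interrupt is sent.
    processor_ready: callable polled once per elapsed millisecond;
                     returns True once the processor has woken up.
    """
    timer_ms = 0
    send_interrupt()                       # initial interrupt (step S220)
    while timer_ms < second_preset_ms:
        if processor_ready():              # processor responded: it reads
            return "read"                  # the event; timer is cleared
        timer_ms += 1
        if timer_ms % first_preset_ms == 0 and timer_ms < second_preset_ms:
            send_interrupt()               # resend (step S240)
    return "released"                      # second preset reached (invalid)
```

For a processor that wakes 45 ms after the touch, the initial interrupt plus one resend at 40 ms is enough; if the processor never responds, the event is released at 80 ms after only the permitted resends.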
In summary, according to the touch event processing method, touch event processing apparatus, computer-readable storage medium, and electronic device of the exemplary embodiment: in the black screen state, when the touch screen receives a touch event input by a user, a timer is triggered to start timing; an interrupt signal is sent to the processor and the touch event is maintained; when the processor reads the touch event, the touch event is released and the timer is cleared; and when the timer reaches a first preset time, the interrupt signal is sent to the processor again so that the processor reads the touch event. On the one hand, maintaining the touch event input by the user solves the problem that, in the power-saving black screen state, the touch event may be lost and fingerprint recognition may fail, thereby improving the user experience. On the other hand, in addition to sending the interrupt signal to the processor, a timer is set, and when the timer reaches the first preset time the interrupt signal is sent again so that the processor reads the touch event, further avoiding the problem that the processor cannot read the touch event in time.
In an exemplary embodiment, the reading of the touch event by the processor may include:
the processor extracts fingerprint information of the user from the touch event and verifies the fingerprint information;
and if the fingerprint information passes the verification, triggering to unlock the touch terminal.
After the processor receives the touch event, it can extract the user's fingerprint information from the touch event and verify it. The specific verification process may include: extracting fingerprint feature information from the fingerprint information and matching it against the feature information of a preset fingerprint enrolled in the system; when the degree of matching meets a certain condition, the match is considered successful, the user's fingerprint information is considered verified, and the touch terminal is unlocked.
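The "matching degree meets a certain condition" rule can be illustrated with a deliberately simplified sketch. Real fingerprint matchers compare minutiae geometry; here, purely as an assumed toy model, features are opaque labels and the condition is a fractional overlap threshold.

```python
def verify_fingerprint(event_features: set[str],
                       enrolled_features: set[str],
                       threshold: float = 0.8) -> bool:
    """Toy matching rule (assumption, not the patent's algorithm):
    the fraction of enrolled features found in the captured print
    must reach the threshold for verification to pass."""
    if not enrolled_features:
        return False
    matched = len(event_features & enrolled_features)
    return matched / len(enrolled_features) >= threshold

# Hypothetical enrolled template with five feature labels.
enrolled = {"f1", "f2", "f3", "f4", "f5"}
assert verify_fingerprint({"f1", "f2", "f3", "f4", "x"}, enrolled)  # 4/5 match
assert not verify_fingerprint({"f1", "x", "y"}, enrolled)           # 1/5 match
```

On success the caller would trigger unlocking of the touch terminal, as described above.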
In an exemplary embodiment, as shown in fig. 4, when the touch screen receives a touch event input by a user, the touch event processing method may further include the steps of:
step S410, judging whether a touch event input by a user acts in a preset coordinate range of the touch screen;
in step S420, if the touch event input by the user acts within the preset coordinate range of the touch screen, a step of triggering a timer to start timing is performed.
In order to ensure that the touch event input by the user is a valid touch event and to avoid accidental unlocking caused by an unintended touch, in the exemplary embodiment a pre-check may be performed before the timer is triggered: it is determined whether the touch event input by the user falls within a preset coordinate range of the touch screen. The preset coordinate range may be set at the specific position of the area that provides under-screen fingerprint unlocking. For example, as shown in fig. 5, the coordinates within the area 510 may be set as the preset coordinate range; when the user presses the touch screen with a finger located in the area 510, it may be considered that the user currently intends to perform fingerprint recognition, and the step of triggering the timer to start timing may then be executed.
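The pre-check reduces to a point-in-rectangle test. The following sketch assumes a rectangular fingerprint region given as `(x_min, y_min, x_max, y_max)` in screen coordinates; the specific numbers are made up for illustration and do not come from the patent.

```python
def in_fingerprint_region(x: float, y: float,
                          region: tuple = (300, 1600, 480, 1800)) -> bool:
    """Return True if the touch point (x, y) lies within the assumed
    rectangular under-screen fingerprint area (cf. area 510 in fig. 5)."""
    x_min, y_min, x_max, y_max = region
    return x_min <= x <= x_max and y_min <= y <= y_max

assert in_fingerprint_region(400, 1700)       # press inside the region
assert not in_fingerprint_region(50, 200)     # touch elsewhere: ignored
```

Only when this test passes does the method go on to trigger the timer.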
In an exemplary embodiment, when the touch event input by the user acts within the preset coordinate range of the touch screen, the touch event processing method may further include:
judging whether the action area of a touch event input by a user on the touch screen meets a preset area or not;
and if the preset area is met, executing a step of triggering a timer to start timing.
It is possible that the user touches within the preset coordinate range by mistake or otherwise does not intend to perform fingerprint recognition. Therefore, the present exemplary embodiment may further provide that, after it is determined that the touch event input by the user falls within the preset coordinate range of the touch screen, whether to trigger the timer is further decided by examining the contact area of the user's touch on the touch screen. The preset area can be set according to actual needs, for example according to the user's habits or other factors; since the contact area of a thumb is generally larger than that of other fingers, the area of the touch screen pressed by a thumb may be used as the preset area in order to improve the speed and accuracy of recognition, and so on. The present disclosure does not specifically limit this.
Fig. 6 schematically shows a flowchart of another touch event processing method, which may specifically include the following steps:
step S610, in a black screen state, starting a fingerprint identification function under a screen;
step S620, judging whether the touch screen receives a touch event input by a user;
if the touch event input by the user is not received, executing step S630, and ending the current process;
if the touch event input by the user is received, executing step S640, and determining whether the touch event input by the user acts within a preset coordinate range of the touch screen;
if the current flow does not act in the preset coordinate range of the touch screen, executing the step S630 and ending the current flow;
if the touch event acts in the preset coordinate range of the touch screen, executing step S650, and judging whether the acting area of the touch event input by the user acting on the touch screen meets the preset area;
if the action area does not meet the preset area, executing step S630, and ending the current process;
if the action area meets the preset area, executing step S660, triggering a timer to start timing, sending an interrupt signal to the processor, and maintaining the touch event;
step S670, judging whether the timing time of the timer reaches a second preset time;
if the second preset time is reached, executing step S671 to release the touch event;
if the second preset time is not reached, executing step S680 to determine whether the timing time of the timer reaches the first preset time;
if the first preset time is reached, executing step S690, and sending the interrupt signal to the processor again, so that the processor reads the touch event;
if the first preset time is not reached, step S691 is executed: no interrupt signal is sent, and the process returns to step S680.
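The flow of steps S610 to S691 can be summarized in one end-to-end sketch. Everything below the step comments is an assumption for illustration: the region rectangle, the 25 mm² minimum contact area, the 1 ms tick model of the processor's wake-up, and the 40 ms / 80 ms preset times taken from the earlier example values.

```python
def handle_black_screen_touch(event: dict, timer_ticks,
                              first_preset: int = 40,
                              second_preset: int = 80) -> str:
    """End-to-end sketch of steps S610-S691 (assumed names/thresholds).

    event: dict with "x", "y", "area" of the touch.
    timer_ticks: iterable of booleans, one per elapsed millisecond;
                 True means the processor has woken and reads the event.
    """
    FP_REGION = (300, 1600, 480, 1800)   # assumed fingerprint area (S640)
    PRESET_AREA = 25.0                   # assumed minimum contact area (S650)

    x_min, y_min, x_max, y_max = FP_REGION
    if not (x_min <= event["x"] <= x_max and y_min <= event["y"] <= y_max):
        return "ignored: outside fingerprint region"
    if event["area"] < PRESET_AREA:
        return "ignored: contact area below preset"

    interrupts = 1                                   # S660: timer + interrupt
    for elapsed, processor_awake in enumerate(timer_ticks, start=1):
        if processor_awake:                          # processor reads event
            return f"read after {elapsed} ms, {interrupts} interrupt(s)"
        if elapsed >= second_preset:                 # S670: release event
            return "released: second preset time reached"
        if elapsed % first_preset == 0:              # S680/S690: resend
            interrupts += 1
    return "released: second preset time reached"
```

A valid thumb press with the processor waking at 45 ms is read after one resend; a press that the processor never services is released once the second preset time is reached.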
Exemplary embodiments of the present disclosure also provide a touch event processing apparatus. As shown in fig. 7, the touch event processing apparatus 700 may include: the event receiving module 710 is configured to, in a black screen state, trigger a timer to start timing when the touch screen receives a touch event input by a user; a signal sending module 720, configured to send an interrupt signal to the processor and maintain a touch event; an event releasing module 730, configured to release the touch event and clear the timer when the processor reads the touch event; the event reading module 740 is configured to send an interrupt signal to the processor again when the timing time of the timer reaches a first preset time, so that the processor reads the touch event.
In an exemplary embodiment, the touch event processing apparatus may further include: the second release module is used for releasing the touch event when the timing time of the timer reaches a second preset time; and the second preset time is greater than the first preset time.
In an exemplary embodiment, the black screen state is: the touch screen is in a dormant state, and the fingerprint identification function under the touch screen is started.
In an exemplary embodiment, the event reading module includes: the verification unit is used for sending the interrupt signal to the processor again when the timing time of the timer reaches a first preset time so that the processor extracts the fingerprint information of the user from the touch event and verifies the fingerprint information; and the unlocking unit is used for triggering the unlocking of the touch terminal if the fingerprint information passes the verification.
In an exemplary embodiment, the signal transmission module includes: and the event maintaining unit is used for storing the touch event into a data register of the touch screen.
In an exemplary embodiment, the touch event processing apparatus may further include: the first judgment module is used for judging whether the touch event input by the user acts in a preset coordinate range of the touch screen when the touch screen receives the touch event input by the user; and if the touch event input by the user acts in the preset coordinate range of the touch screen, executing a step of triggering a timer to start timing.
In an exemplary embodiment, the touch event processing apparatus may further include: the second judgment module is used for judging whether the action area of the touch event input by the user on the touch screen meets the preset area or not if the touch event input by the user acts in the preset coordinate range of the touch screen; and if the preset area is met, executing a step of triggering a timer to start timing.
The specific details of each module in the above apparatus have been described in detail in the method section, and details that are not disclosed may refer to the method section, and thus are not described again.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product including program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the "exemplary methods" section above of this specification, when the program product is run on the terminal device, for example, any one or more of the steps in fig. 2, fig. 4 or fig. 6 may be performed.
The present disclosure describes a program product for implementing the above method, which may employ a portable compact disc read-only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (9)

1. A touch event processing method is applied to a touch terminal, the touch terminal comprises a touch screen, and the method comprises the following steps:
in a black screen state, when the touch screen receives a touch event input by a user, triggering a timer to start timing, wherein the touch event is an operation event input by the user and used for performing fingerprint identification under the screen;
sending an interrupt signal to a processor and maintaining the touch event;
if the processor responds to the interrupt signal, enabling the processor to read the touch event, releasing the touch event and clearing the timer;
if the processor does not respond to the interrupt signal, repeatedly sending the interrupt signal to the processor again when the timing time of the timer reaches a first preset time and does not reach a second preset time, and reading the touch event if the processor responds to the interrupt signal;
if the processor does not respond to the interrupt signal, releasing the touch event when the timing time of the timer reaches the second preset time;
and the second preset time is greater than the first preset time.
2. The method of claim 1, wherein the black screen state is: the touch screen is in a dormant state, and the fingerprint identification function under the touch screen is started.
3. The method of claim 1, wherein the processor reading the touch event comprises:
the processor extracts fingerprint information of the user from the touch event and verifies the fingerprint information;
and if the fingerprint information passes the verification, triggering to unlock the touch terminal.
4. The method of claim 1, wherein the maintaining the touch event comprises:
and storing the touch event into a data register of the touch screen.
5. The method of claim 1, wherein when the touch screen receives a touch event input by a user, the method further comprises:
judging whether the touch event input by the user acts in a preset coordinate range of the touch screen;
and if the touch event input by the user acts in the preset coordinate range of the touch screen, executing a step of triggering a timer to start timing.
6. The method of claim 5, wherein if the touch event of the user input is applied within a preset coordinate range of the touch screen, the method further comprises:
judging whether the action area of the touch event input by the user on the touch screen meets a preset area or not;
and if the preset area is met, executing a step of triggering a timer to start timing.
7. A touch event processing device, applied to a touch terminal, the touch terminal comprising a touch screen, the device comprising:
an event receiving module, configured to trigger a timer to start timing when the touch screen receives, in a black screen state, a touch event input by a user, wherein the touch event is an operation event input by the user for performing under-screen fingerprint identification;
a signal sending module, configured to send an interrupt signal to the processor and maintain the touch event;
an event release module, configured to, if the processor responds to the interrupt signal, cause the processor to release the touch event and clear the timer upon reading the touch event;
and an event reading module, configured to, if the processor does not respond to the interrupt signal, repeatedly resend the interrupt signal to the processor when the timed duration of the timer reaches a first preset time but has not yet reached a second preset time, and read the touch event if the processor then responds to the interrupt signal; and if the processor still does not respond to the interrupt signal, release the touch event when the timed duration of the timer reaches the second preset time;
wherein the second preset time is greater than the first preset time.
8. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method of any one of claims 1 to 6.
9. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1 to 6 via execution of the executable instructions.
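The flow recited in the claims above — start a timer on a black-screen touch, hold the event, interrupt the processor, resend the interrupt between the two preset times, and release the event when the second preset time expires — can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the class and method names, the concrete preset times, coordinate range, and contact-area threshold are all hypothetical values chosen for the example (the claims require only that the second preset time exceed the first), and a real controller would be interrupt-driven firmware rather than a polling loop.

```python
import time

# Hypothetical timing values; the claims only require SECOND_PRESET > FIRST_PRESET.
FIRST_PRESET = 0.05   # first preset time, in seconds
SECOND_PRESET = 0.20  # second preset time, in seconds

# Hypothetical gating parameters for claims 5-6.
PRESET_REGION = (100, 300, 100, 300)  # x_min, x_max, y_min, y_max
MIN_AREA = 20.0                       # preset contact area


def should_start_timer(x, y, area):
    """Claims 5-6: start the timer only when the touch falls within the
    preset coordinate range and its contact area meets the preset area."""
    x_min, x_max, y_min, y_max = PRESET_REGION
    return x_min <= x <= x_max and y_min <= y <= y_max and area >= MIN_AREA


class TouchEventHandler:
    """Holds a black-screen touch event, interrupts the processor, resends
    the interrupt between the two preset times, and releases the event if
    the processor never responds (claims 1 and 7)."""

    def __init__(self, processor):
        self.processor = processor
        self.held_event = None  # stands in for the touch screen's data register

    def on_touch(self, event):
        self.held_event = event          # maintain the touch event
        start = time.monotonic()         # trigger the timer
        self.processor.send_interrupt()  # send the interrupt signal
        while True:
            if self.processor.responded():
                self.processor.read(self.held_event)  # e.g. fingerprint check
                self.held_event = None   # processor read it: release, clear timer
                return True
            elapsed = time.monotonic() - start
            if elapsed >= SECOND_PRESET:
                self.held_event = None   # second preset reached: release the event
                return False
            if elapsed >= FIRST_PRESET:
                self.processor.send_interrupt()  # resend the interrupt signal
            time.sleep(0.005)            # poll interval, purely illustrative
```

Releasing the event at the second preset time keeps an unresponsive processor from leaving stale touch data in the register, while the resend window between the two preset times gives a busy processor a second chance to service the interrupt.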
CN202010374275.7A 2020-05-06 2020-05-06 Touch event processing method, touch event processing device, medium and electronic equipment Active CN111625175B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010374275.7A CN111625175B (en) 2020-05-06 2020-05-06 Touch event processing method, touch event processing device, medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN111625175A CN111625175A (en) 2020-09-04
CN111625175B true CN111625175B (en) 2021-06-22

Family

ID=72259691

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010374275.7A Active CN111625175B (en) 2020-05-06 2020-05-06 Touch event processing method, touch event processing device, medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111625175B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112596600A (en) * 2020-12-16 2021-04-02 惠州Tcl移动通信有限公司 Screen unlocking method and device, storage medium and mobile terminal
CN115016666B (en) * 2021-11-18 2023-08-25 荣耀终端有限公司 Touch processing method, terminal equipment and storage medium

Citations (5)

Publication number Priority date Publication date Assignee Title
CN1766860A (en) * 2005-12-02 2006-05-03 北京中星微电子有限公司 Interrupt control handling apparatus and method
CN105824743A (en) * 2016-03-14 2016-08-03 广东欧珀移动通信有限公司 Method for testing unlocking time and mobile terminal
WO2016150163A1 (en) * 2015-03-26 2016-09-29 小米科技有限责任公司 Module wakeup method and apparatus
CN107194228A (en) * 2017-04-28 2017-09-22 广东欧珀移动通信有限公司 Solve lock control method and Related product
CN107450837A (en) * 2017-07-28 2017-12-08 广东欧珀移动通信有限公司 Respond method, apparatus, storage medium and the mobile terminal of blank screen gesture

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN102236528A (en) * 2010-04-21 2011-11-09 乐金电子(中国)研究开发中心有限公司 Method and device for adjusting sampling interval of touch screen and digital terminal

Also Published As

Publication number Publication date
CN111625175A (en) 2020-09-04

Similar Documents

Publication Publication Date Title
WO2020156269A1 (en) Display method for electronic device having flexible screen and electronic device
WO2020224449A1 (en) Split-screen display operation method and electronic device
CN110347269B (en) Empty mouse mode realization method and related equipment
CN112671976B (en) Control method and device of electronic equipment, electronic equipment and storage medium
CN111742539B (en) Voice control command generation method and terminal
WO2021052139A1 (en) Gesture input method and electronic device
WO2020019355A1 (en) Touch control method for wearable device, and wearable device and system
CN110742580A (en) Sleep state identification method and device
CN113641488A (en) Method and device for optimizing resources based on user use scene
CN113448482A (en) Sliding response control method and device of touch screen and electronic equipment
WO2020221062A1 (en) Navigation operation method and electronic device
CN111625175B (en) Touch event processing method, touch event processing device, medium and electronic equipment
CN114822525A (en) Voice control method and electronic equipment
US11899929B2 (en) Method for customizing key of foldable device, device, and storage medium
CN114095602B (en) Index display method, electronic device and computer readable storage medium
CN114089902A (en) Gesture interaction method and device and terminal equipment
CN109285563B (en) Voice data processing method and device in online translation process
CN114077519A (en) System service recovery method and device and electronic equipment
CN115878500A (en) Memory recovery method and device, electronic equipment and readable storage medium
CN113867520A (en) Device control method, electronic device, and computer-readable storage medium
CN115393676A (en) Gesture control optimization method and device, terminal and storage medium
CN115442474A (en) Information protection method and electronic equipment
CN114079809A (en) Terminal and input method and device thereof
CN112463086A (en) Display control method and electronic equipment
CN116723384B (en) Process control method, electronic device and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant