CN116225274A - Identification method and device for touch operation, electronic equipment and storage medium - Google Patents
- Publication number: CN116225274A (application CN202310476916.3A)
- Authority: CN (China)
- Prior art keywords: touch operation, acceleration data, touch screen
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F3/04815 — Interaction with a metaphor-based environment or an interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/0488 — Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
- Y02D30/70 — Reducing energy consumption in wireless communication networks
Abstract
The application discloses a touch operation identification method and device, an electronic device, and a storage medium. The method is applied to an electronic device that includes a touch screen, an application processor (AP), and a baseband processor (CP), and comprises: the AP monitors whether there is a touch operation on the touch screen; the CP monitors acceleration data and stores the monitored acceleration data in a storage module; when the AP detects a touch operation, the CP identifies whether it is a knuckle-tapping action according to the acceleration data stored in the storage module and sends the identification result to the AP. With this embodiment, the CP itself recognizes whether the touch operation is a knuckle tap from the acceleration data. Compared with the prior art, this omits transmitting the acceleration data from the CP to the AP and having the AP perform the recognition on the received data, so the touch operation can be identified promptly and the recognition rate of knuckle taps on the screen improved.
Description
Technical Field
The present application relates to the field of electronic devices, and in particular to a method and apparatus for identifying a touch operation, an electronic device, and a storage medium.
Background
As the functions of electronic devices such as mobile phones and tablets diversify, a user can save images or videos displayed on the screen through functions such as screen capture, region screen capture, or screen recording. Besides starting these functions through menus or physical keys, they can also be started by operations such as a single or double knuckle tap on the touch screen.
In practice, when a knuckle tap on the display screen is used to start such a function, the success rate is low or misoperations are frequent.
Therefore, how to improve the success rate of recognizing knuckle operations is a technical problem to be solved.
Disclosure of Invention
The application provides a touch operation identification method and device, an electronic device, and a storage medium, which can improve the recognition rate of knuckle taps on the screen.
In a first aspect, an embodiment of the present application provides a method for identifying a touch operation, applied to an electronic device that includes a touch screen, an application processor (AP), and a baseband processor (CP), the method comprising:
the AP monitors whether there is a touch operation on the touch screen;
the CP monitors acceleration data and stores the monitored acceleration data in a storage module;
when the AP detects a touch operation, the CP identifies whether the touch operation is a knuckle-tapping action according to the acceleration data stored in the storage module and sends the identification result to the AP.
With this embodiment, when the AP detects a touch operation, the CP identifies whether it is a knuckle-tapping action according to the acceleration data. Compared with the prior art, in which the CP transmits the acceleration data to the AP and the AP identifies whether the touch operation is a knuckle tap from the received data, this omits the transfer step, so the touch operation can be identified promptly and the recognition rate improved.
In one possible implementation, the touch operation identification method may further include:
when the AP detects a touch operation, acquiring a capacitance value of the touch screen;
the AP identifies whether the touch operation is a knuckle-tapping action according to the capacitance value;
when both the result obtained by the AP from the capacitance value and the result obtained by the CP from the acceleration data indicate a knuckle-tapping action, determining that the touch operation is a knuckle tap.
In this embodiment, the AP identifies the touch operation from the capacitance value of the touch screen, and the touch operation is determined to be a knuckle tap only when the AP's capacitance-based result and the CP's acceleration-based result both indicate one; this improves recognition accuracy.
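The dual-result check described above can be sketched as a small helper (a hedged illustration; the function and label names are hypothetical, not from the patent):

```python
def classify_touch(ap_is_knuckle: bool, cp_is_knuckle: bool) -> str:
    """Combine the AP's capacitance-based result with the CP's
    acceleration-based result: only when both indicate a knuckle
    tap is the touch treated as one; otherwise it is a click."""
    if ap_is_knuckle and cp_is_knuckle:
        return "knuckle_tap"
    return "click"
```

Requiring agreement between two independent signals trades a little sensitivity for fewer false triggers, which matches the accuracy claim above.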
In one possible implementation, the touch operation identification method may further include:
determining the touch operation to be a click operation when the result obtained by the AP from the capacitance value and the result obtained by the CP from the acceleration data are not both knuckle-tapping actions.
In one possible implementation, the storage module is one whose power consumption is lower than that of double data rate synchronous dynamic random access memory (DDR).
With this embodiment, compared with the prior art in which DDR stores the acceleration data, using a storage module with lower power consumption than DDR reduces power consumption.
In a second aspect, an embodiment of the present application provides a touch operation method, applied to an electronic device that includes a touch screen, an application processor (AP), and a baseband processor (CP), the method comprising:
the AP monitors whether there is a touch operation on the touch screen;
the CP monitors acceleration data and stores the monitored acceleration data in a preset storage module;
when the AP detects a touch operation, the CP identifies whether the touch operation is a knuckle-tapping action according to the acceleration data stored in the storage module and sends the identification result to the AP;
in response to the touch operation, the AP executes a preset operation matched with the identified type of the touch operation.
In one possible implementation, when the touch operation is determined to include a knuckle-tapping action, the preset operation includes any one of screen capture, region screen capture, or screen recording.
In a third aspect, an embodiment of the present application provides a touch operation recognition device, applied to an electronic device that includes a touch screen. The recognition device includes an application processor (AP) and a baseband processor (CP), where:
the AP is used to monitor whether there is a touch operation on the touch screen;
the CP is used to monitor acceleration data, store the monitored acceleration data in a preset storage module, identify whether the touch operation is a knuckle-tapping action according to the stored acceleration data when the AP detects a touch operation, and send the identification result to the AP.
In a fourth aspect, an embodiment of the present application provides a touch operation device, applied to an electronic device that includes a touch screen. The touch operation device includes a response module and the touch operation recognition device described in the third aspect;
the response module is used to respond to the touch operation when the recognition device determines that it includes a knuckle-tapping action, and to execute a preset operation matched with the touch operation.
In a fifth aspect, embodiments of the present application provide an electronic device, including: a touch screen, an application processor (AP), a baseband processor (CP), and a memory storing instructions that, when executed by the AP and the CP, cause the electronic device to perform the method of the first aspect or any possible implementation thereof, or the method of the second aspect or any possible implementation thereof.
In a sixth aspect, embodiments of the present application provide a computer-readable storage medium storing instructions that, when executed by an electronic device, cause the electronic device to perform the method of the first aspect or any possible implementation thereof, or the method of the second aspect or any possible implementation thereof.
In a seventh aspect, embodiments of the present application provide a computer program product comprising instructions that, when executed by an electronic device, cause the electronic device to perform the method of the first aspect or any possible implementation thereof, or the method of the second aspect or any possible implementation thereof.
It will be appreciated that, for the advantages achieved by the second to seventh aspects above, reference may be made to the corresponding advantages of the method of the first aspect or any possible implementation thereof.
Drawings
Fig. 1 is a flowchart of a touch operation recognition method according to an embodiment of the present application;
Fig. 2 is a schematic diagram of information interaction between an AP and a CP in the prior art;
fig. 3 is a schematic diagram of information interaction between an AP and a CP in an embodiment of the present application;
fig. 4 is a flowchart of a method for identifying a touch operation according to an embodiment of the present application;
fig. 5 is a flowchart of a touch operation method according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a touch operation recognition device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a touch operation device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 9 is a block diagram of a software system of an electronic device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be understood that "a plurality" herein means two or more. In the description of the present application, "/" means "or" unless otherwise indicated; for example, A/B may represent A or B. "And/or" herein merely describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. In addition, to describe the technical solutions of the present application clearly, the words "first", "second", etc. are used to distinguish between identical or similar items having substantially the same function and effect. Those skilled in the art will appreciate that these words do not limit quantity or execution order, and do not necessarily indicate a difference.
When using an electronic device with a touch screen, such as a mobile phone or tablet, a user can tap the touch screen with a knuckle to make the device execute a related operation. If a touch operation on the touch screen is detected, its type must be identified, determining whether it is a knuckle tap or an ordinary touch click; then, in response to the determined type, the related operation matched with the touch operation is executed. At present, most smartphones adopt an AP-plus-CP system architecture, where the AP runs the operating system and handles high-load multimedia applications, and the CP completes functions such as network interaction.
Referring to fig. 1, fig. 1 is a flowchart of a touch operation recognition method. As shown in fig. 1, the method for identifying a touch operation may include steps 101 to 106, where:
101. Monitor for a touch operation on the touch screen.
102. Obtain a first result from the capacitance value of the touch screen, the first result being a determination of whether the touch operation is a knuckle-tapping action.
103. Obtain a second result from the acceleration data, the second result being a determination of whether the touch operation is a knuckle-tapping action.
In some possible implementations, the acceleration data may be monitored through a knuckle virtual sensor. A virtual sensor is mainly used where the measured quantity is difficult to measure directly or direct measurement is costly: a model of related quantities is built, the measured signals are processed, and the desired information is obtained indirectly. In a sense it is a mathematical model that, by extending the system model, outputs measured data derived from related raw data. The raw data of a virtual sensor may come from a virtual machining or simulation process, or from a real physical sensor.
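As a hedged sketch of the virtual-sensor idea, the derived quantity below (a peak "impact strength" that is not measured directly) is computed by a simple model over raw 3-axis accelerometer samples; the model and all names are illustrative assumptions, not the patent's algorithm:

```python
import math

def virtual_knuckle_sensor(raw_samples):
    """Illustrative virtual sensor: indirectly obtains an 'impact
    strength' from raw (x, y, z) accelerometer samples by modeling
    the resting baseline and reporting the peak deviation from it."""
    magnitudes = [math.sqrt(x * x + y * y + z * z) for (x, y, z) in raw_samples]
    baseline = sum(magnitudes) / len(magnitudes)  # crude rest-level estimate
    return max(m - baseline for m in magnitudes)  # peak excursion = "impact"
```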
104. Judge whether the first result and the second result both indicate a knuckle-tapping action.
If yes, go to step 105; if not, go to step 106.
105. Determine the touch operation to be a knuckle-tapping action.
106. Determine the touch operation to be an ordinary touch click operation.
It should be noted that the difference between the embodiment of the present application and the prior art lies in the specific implementation of step 103 in fig. 1.
Referring to fig. 2 and fig. 3, fig. 2 is a schematic diagram of information interaction between the AP and the CP in the prior art, and fig. 3 is a schematic diagram of information interaction between the AP and the CP in an embodiment of the present application. As shown in fig. 2, in the prior art the CP monitors acceleration data and stores it in a storage module; when the AP determines that there is a touch operation, the CP sends the stored acceleration data to the AP. On the AP side, the knuckle driver requests the acceleration data and judges, via a knuckle algorithm combined with the acceleration data acquired from the CP, whether the touch operation is a knuckle-tapping action. As can be seen from fig. 2, the CP does not itself recognize the monitored acceleration data; it merely transmits the stored data to the AP, and the AP performs the recognition on the received data to determine whether a knuckle tap occurred.
As an improvement, the technical solution adopted in the embodiment of the present application is shown in fig. 3. The CP monitors acceleration data and stores it in a storage module that may be configured to hold a preset number of samples (e.g., 128); once the module is full, each newly monitored sample overwrites the earliest stored one. When the AP determines that there is a touch operation, the CP judges, from the preset number of acceleration samples in the storage module, whether the touch operation is a knuckle-tapping action, and then sends the identification result to the AP. With this embodiment, the CP does not transmit the stored acceleration data to the AP, which saves transmission time, avoids transmission errors, and improves the recognition rate of knuckle taps on the screen.
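The fixed-size store that overwrites its oldest sample is a classic ring buffer; a minimal sketch follows (the 128-sample default mirrors the example count in the text; the class and method names are illustrative):

```python
class AccelRingBuffer:
    """Holds a preset number of acceleration samples; once full,
    each new sample overwrites the earliest stored one."""

    def __init__(self, capacity: int = 128):
        self.capacity = capacity
        self.buf = []
        self.next = 0  # slot the next sample will occupy

    def push(self, sample):
        if len(self.buf) < self.capacity:
            self.buf.append(sample)
        else:
            self.buf[self.next] = sample  # overwrite oldest
        self.next = (self.next + 1) % self.capacity

    def snapshot(self):
        """Samples in arrival order, oldest first."""
        return self.buf[self.next:] + self.buf[:self.next]
```

When the AP reports a touch, the CP would run its recognition over `snapshot()` rather than shipping the buffer to the AP.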
Moreover, in the prior art, when the electronic device turns on the knuckle recognition function, the DDR is woken up and the acceleration data monitored by the CP is stored in it. In the embodiment of the present application, to save power, the DDR is not woken up; the acceleration data monitored by the CP is instead stored in a storage module with lower power consumption than the DDR (such as a low-power "island" memory).
It should be noted that the electronic device in the embodiments of the present application may be any device with a touch screen, such as a mobile phone or tablet computer; the embodiments do not limit its specific type. For ease of understanding, the touch operation identification method provided in the present application is described in detail below, taking a mobile phone as the example electronic device.
Referring to fig. 4, fig. 4 is a flowchart of a touch operation recognition method provided in an embodiment of the present application. In this embodiment, the method for identifying a touch operation includes steps 401 to 404, wherein:
401. The AP monitors whether there is a touch operation on the touch screen.
Specifically, the AP may monitor for touch operations through a sensor on the electronic device.
402. The CP monitors acceleration data and stores the monitored acceleration data in the storage module.
Specifically, the CP may acquire the monitored acceleration data using a knuckle virtual sensor.
403. When the AP detects the touch operation, the CP identifies whether it is a knuckle-tapping action according to the acceleration data stored in the storage module.
404. The CP sends the identification result to the AP.
With this embodiment, when the AP detects a touch operation, the CP identifies whether it is a knuckle-tapping action according to the acceleration data stored in the storage module. Compared with the prior art, in which the CP transmits the stored acceleration data to the AP and the AP identifies whether the touch operation is a knuckle tap from the received data, this omits the transfer step, so the touch operation can be identified promptly and the recognition rate improved.
In some possible embodiments, when the AP detects a touch operation, the AP identifies whether it is a knuckle-tapping action according to the capacitance value of the touch screen; specifically, this may be determined from a trained neural-network model of touch operations together with the capacitance value. When both the result obtained by the AP from the capacitance value and the result obtained by the CP from the acceleration data indicate a knuckle-tapping action, the touch operation is determined to be a knuckle tap.
It can be appreciated that when a user clicks, slides, or taps a knuckle on the touch screen, the capacitance of the touched area changes, and a touch driving module can monitor the capacitance of each area on the touch screen.
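A hedged sketch of that monitoring step: compare each region's current capacitance against its untouched baseline and flag regions whose change exceeds a threshold (the threshold value and units are illustrative assumptions, not from the patent):

```python
def touched_regions(baseline, current, threshold=5.0):
    """Return indices of regions whose capacitance changed by more
    than `threshold` relative to the untouched baseline reading."""
    return [i for i, (b, c) in enumerate(zip(baseline, current))
            if abs(c - b) > threshold]
```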
In this embodiment, the AP identifies the touch operation from the capacitance value of the touch screen, and the touch operation is determined to be a knuckle tap only when the AP's capacitance-based result and the CP's acceleration-based result both indicate one; this improves recognition accuracy.
In some possible implementations, the storage module storing the acceleration data is one with lower power consumption than DDR, and the CP stores the acceleration data into this low-power storage module.
With this embodiment, compared with the prior art in which DDR stores the acceleration data, using the low-power storage module reduces power consumption.
As shown in fig. 5, the embodiment of the present application further provides a touch operation method, applied to an electronic device that includes a touch screen, an application processor (AP), and a baseband processor (CP). The touch operation method includes the following steps:
501. The AP monitors whether there is a touch operation on the touch screen.
502. The CP monitors acceleration data and stores the monitored acceleration data in the storage module.
In some possible implementations, the monitored acceleration data may be acquired using a knuckle virtual sensor, and may be stored in a low-power storage module corresponding to a low-power island.
503. When the AP monitors the touch operation, the CP identifies whether the touch operation is a finger joint knocking action according to the acceleration data stored in the storage module.
504. The CP sends the identification result obtained from the acceleration data to the AP.
In some possible embodiments, the method may further include: when the AP detects a touch operation, the AP identifies whether it is a knuckle-tapping action according to the capacitance value of the touch screen; when both the result obtained by the AP from the capacitance value and the result obtained by the CP from the acceleration data indicate a knuckle-tapping action, the touch operation is determined to be a knuckle tap.
505. In response to the touch operation, the AP performs a preset operation matching the type of the touch operation.
The preset operation may include any one of screen capture, region screen capture, screen recording, and the like.
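The matching of recognized gestures to preset operations can be sketched as a lookup table (the gesture names and the gesture-to-operation pairing are hypothetical; only the operation names come from the text):

```python
# Hypothetical gesture-to-operation mapping; the operations listed
# (screen capture, region screen capture, screen recording) follow the text.
PRESET_OPS = {
    "single_knuckle_tap": "screen_capture",
    "knuckle_region_gesture": "region_screen_capture",
    "double_knuckle_tap": "screen_recording",
}

def dispatch(gesture: str) -> str:
    """Return the preset operation matched to the identified gesture,
    or 'none' for an unrecognized gesture."""
    return PRESET_OPS.get(gesture, "none")
```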
With this embodiment, when the AP detects a touch operation, the CP identifies whether it is a knuckle-tapping action according to the acceleration data stored in the storage module. Compared with the prior art, in which the CP transmits the stored acceleration data to the AP and the AP identifies whether the touch operation is a knuckle tap from the received data, this omits the transfer step, so the touch operation can be identified promptly and the recognition rate improved.
As shown in fig. 6, the embodiment of the present application further provides a touch operation recognition device 600, applied to an electronic device that includes a touch screen. The recognition device 600 includes an application processor AP601 and a baseband processor CP602, where AP601 is used to monitor whether there is a touch operation on the touch screen, and CP602 is used to monitor acceleration data and store it in the storage module; when AP601 detects a touch operation, CP602 identifies whether it is a knuckle-tapping action according to the acceleration data stored in the storage module and sends the identification result to AP601.
In some possible embodiments, the AP601 is further configured to identify, when it detects a touch operation, whether the touch operation is a knuckle-tapping action according to the capacitance value of the touch screen; when both the result obtained by the AP601 from the capacitance value and the result obtained by the CP602 from the stored acceleration data indicate a knuckle-tapping action, the touch operation is determined to be a knuckle tap.
In some possible implementations, the storage module is one with power consumption lower than that of DDR.
As shown in fig. 7, the embodiment of the present application further provides a touch operation device 700, applied to an electronic device that includes a touch screen. The touch operation device 700 includes a response module 701 and a recognition device 702, where the recognition device 702 includes an application processor AP7021 and a baseband processor CP7022;
the response module 701 is configured to, when the recognition device 702 determines that the touch operation includes a knuckle-tapping action, respond to the touch operation by executing a preset operation matched with it.
The AP7021 is used to monitor whether there is a touch operation on the touch screen; the CP7022 is used to monitor acceleration data, store the monitored acceleration data in the storage module, identify whether the touch operation is a knuckle-tapping action according to the stored acceleration data when the AP7021 detects a touch operation, and send the identification result to the AP7021.
In some possible embodiments, the AP7021 is further configured to identify, when it detects a touch operation, whether the touch operation is a knuckle-tapping action according to the capacitance value of the touch screen; when both the result obtained by the AP7021 from the capacitance value and the result obtained by the CP7022 from the acceleration data indicate a knuckle-tapping action, the touch operation is determined to be a knuckle tap.
In some possible implementations, the storage module is one whose power consumption is lower than that of double data rate synchronous dynamic random access memory (DDR).
Next, an electronic device according to an embodiment of the present application will be described.
Fig. 8 is a schematic structural diagram of an electronic device 800, which may be a mobile phone, a tablet computer, or other devices. Referring to fig. 8, the electronic device 800 may include a processor 810, an external memory interface 820, an internal memory 821, a universal serial bus (universal serial bus, USB) interface 830, a charge management module 840, a power management module 841, a battery 842, an antenna 1, an antenna 2, a mobile communication module 850, a wireless communication module 860, an audio module 870, a speaker 870A, a receiver 870B, a microphone 870C, an earphone interface 870D, a sensor module 880, a key 890, a motor 891, an indicator 892, a camera 893, a screen 894, a user identification module (subscriber identification module, SIM) card interface 895, and the like. The sensor module 880 may include, among other things, a pressure sensor 880A, a gyroscope sensor 880B, an air pressure sensor 880C, a magnetic sensor 880D, an acceleration sensor 880E, a distance sensor 880F, a proximity light sensor 880G, a fingerprint sensor 880H, a temperature sensor 880J, a touch sensor 880K, an ambient light sensor 880L, a bone conduction sensor 880M, and the like.
The processor 810 may include one or more processing units, such as: the processor 810 may include an AP, CP, modem processor, graphics processor (graphics processing unit, GPU), image signal processor (image signal processor, ISP), controller, memory, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural Network Processor (NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 800. The controller can generate an operation control signal according to an instruction operation code and a timing signal, to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 810 for storing instructions and data. In some embodiments, the memory in the processor 810 is a cache, which may hold instructions or data that the processor 810 has just used or uses cyclically. If the processor 810 needs to reuse the instructions or data, it can fetch them directly from the cache, which avoids repeated accesses, reduces the waiting time of the processor 810, and thereby improves the efficiency of the system. The processor may also be provided with a low-power memory (e.g., island low power, etc.) to reduce power consumption.
The electronic device 800 implements display functions through a GPU, a screen 894, and an application processor, etc. The GPU is a microprocessor for image processing, connected to the screen 894 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 810 may include one or more GPUs that execute program instructions to generate or change display information.
The screen 894 is used to display images, videos, and the like. The screen 894 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 800 may include 1 or N screens 894, where N is an integer greater than 1.
Electronic device 800 may implement shooting functionality through an ISP, camera 893, video codec, GPU, screen 894, and application processor, among others.
The ISP is used to process the data fed back by the camera 893. For example, during photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, where it is converted into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin color of the image, as well as parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be located in the camera 893.
The camera 893 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and transfers it to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 800 may include 1 or N cameras 893, where N is an integer greater than 1.
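The YUV-to-RGB conversion step mentioned above can be illustrated with a minimal per-pixel sketch. The BT.601 full-range coefficients used here are an assumption for illustration; the actual coefficients depend on the device's color pipeline.

```python
def yuv_to_rgb(y: int, u: int, v: int) -> tuple:
    """Convert one full-range YUV sample to RGB using BT.601 coefficients
    (assumed here for illustration), clamping each channel to [0, 255]."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)
```

For example, a neutral sample with U and V at their 128 midpoints maps to an equal-valued gray in RGB.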
The external memory interface 820 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 800. The external memory card communicates with the processor 810 through an external memory interface 820 to implement data storage functions. Such as storing files of music, video, etc. in an external memory card.
The internal memory 821 may be used to store computer-executable program code, which includes instructions. The processor 810 performs the various functional applications and data processing of the electronic device 800 by executing the instructions stored in the internal memory 821. The internal memory 821 may include a program storage area and a data storage area. The program storage area may store the operating system, an application program required for at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data created during the use of the electronic device 800 (such as audio data and a phonebook), and so on. In addition, the internal memory 821 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The acceleration sensor 880E can detect the magnitude of acceleration of the electronic device 800 in various directions (typically along three axes), and can detect the magnitude and direction of gravity when the electronic device 800 is stationary. The acceleration sensor 880E can also be used to recognize the posture of the electronic device 800, which is applied in scenarios such as landscape/portrait switching and pedometers. Of course, the acceleration sensor 880E may also be combined with the gyroscope sensor 880B to recognize the posture of the electronic device 800 for landscape/portrait switching.
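The landscape/portrait decision from gravity can be sketched as follows. The axis convention (y running along the long edge of the screen) and the simple comparison of gravity components are assumptions for illustration, not the device's actual posture algorithm.

```python
def detect_orientation(ax: float, ay: float, az: float) -> str:
    """Classify screen orientation from the gravity components (m/s^2)
    measured by a 3-axis accelerometer while the device is stationary.
    Assumed axis convention: y runs along the long edge of the screen."""
    # When gravity lies mostly along the y axis, the device is upright.
    if abs(ay) >= abs(ax):
        return "portrait"
    return "landscape"
```

A device held upright reports most of gravity on the y axis; tilted on its side, gravity shifts to the x axis and the layout switches to landscape.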
The gyroscope sensor 880B may be used to determine the motion posture of the electronic device 800. In some embodiments, the angular velocity of the electronic device 800 about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor 880B. The gyroscope sensor 880B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyroscope sensor 880B detects the shake angle of the electronic device 800, calculates the distance that the lens module needs to compensate according to that angle, and lets the lens counteract the shake of the electronic device 800 through reverse motion, thereby realizing anti-shake. The gyroscope sensor 880B can also be used for landscape/portrait switching, navigation, and motion sensing in game scenarios.
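The angle-to-distance computation in the anti-shake example can be sketched with a small-angle optical model. The f·tan(θ) relation and the 26 mm focal length used in the usage note are illustrative assumptions, not the device's actual OIS algorithm.

```python
import math

def lens_compensation_mm(shake_angle_deg: float, focal_length_mm: float) -> float:
    """Distance the lens module must shift to cancel the image displacement
    caused by a shake of `shake_angle_deg`, using a simple f*tan(theta)
    model (an assumption for illustration)."""
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))
```

Under this model, a 0.5-degree shake with a 26 mm equivalent focal length requires a shift of roughly a quarter of a millimeter, which the OIS actuator applies in the opposite direction.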
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 800. In other embodiments of the present application, electronic device 800 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The electronic device provided in the embodiment of the present application may be a User Equipment (UE), for example, a mobile terminal (such as a mobile phone), a tablet computer, and other devices.
In addition, an operating system runs on these components, for example, the open-source Android operating system developed by Google.
The software system of the electronic device may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, a cloud architecture, or the like. In order to more clearly illustrate the identification method of the touch operation provided by the embodiment of the application, the embodiment of the application takes an Android (Android) system with a layered architecture as an example, and illustrates a software system of the electronic device.
As shown in fig. 9, the electronic device may include a hardware layer and a software layer, where the Android system of the layered architecture may include an application layer, an application framework layer, a system library layer, and a kernel layer. In some alternative embodiments, the system of the electronic device may also include a hierarchy not mentioned in the above technical architecture, such as the Android Runtime. The application layer may include a series of application packages, such as a navigation application, a music application, a video application, a knuckle-tap screen application, and the like. The application packages may include video, chat, and other applications, as well as a system user interface (System UI); the knuckle-tap screen application may be used for screenshots, screen recordings, long screenshots, region screenshots, and the like.
Video, chat, and other applications are used to provide corresponding services to users. For example, a user watches a video using a video application, chats with other users using a chat application, and listens to music using a music application.
The system UI is used to manage the user interface (UI) of the electronic device; in the embodiments of the present application, the system UI is used to monitor touch operations on the touch screen.
The application framework layer provides an application programming interface (API) and a programming framework for the applications of the application layer, and includes a number of predefined functions. The application framework layer may include a window manager service module (WMS), a display rotation module (also known as displayport), an activity manager service module (AMS), an input management module (also known as Input), and the like.
The WMS is used to manage windows. The window manager can acquire the size of the screen, determine whether there is a status bar, capture images of the screen, and the like. In the embodiments of the present application, the WMS can create and manage the window corresponding to an application.
The display rotation module is used to control screen rotation; through rotation, the screen displays a portrait or landscape layout. For example, when it is determined that the screen needs to rotate, the display rotation module notifies SurfaceFlinger to switch the application interface between landscape and portrait.
The AMS is used to launch a specific application according to a user's operation. For example, when an operation that triggers the video application is detected, the AMS creates an application stack corresponding to the video application, so that the video application can run normally.
The system library layer may include a plurality of functional modules, such as: a sensor module (also known as a sensor) and a SurfaceFlinger.
The sensor module is used to acquire data collected by sensors, for example, acquiring the ambient light under the screen and collecting the gravity direction information of the electronic device. The sensor module can also adjust the brightness of the screen according to the ambient light, and determine the landscape/portrait state information of the electronic device according to the gravity direction information, where the landscape/portrait state information indicates whether the electronic device is in a landscape state or a portrait state.
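The brightness adjustment mentioned above can be illustrated with a minimal mapping from illuminance to a backlight level. The logarithmic curve and the 10,000-lux saturation point are assumptions for illustration; real brightness policies are device-specific.

```python
import math

def backlight_level(lux: float, max_level: int = 255) -> int:
    """Map ambient illuminance (lux) to a backlight level in [1, max_level].
    The logarithmic curve and 10,000-lux saturation are assumed."""
    lux = max(lux, 1.0)
    # Perceived brightness is roughly logarithmic in illuminance.
    level = max_level * math.log10(lux) / math.log10(10000.0)
    return int(max(1, min(max_level, round(level))))
```

With this curve, darkness maps to the dimmest usable level and full daylight (about 10,000 lux) saturates the backlight at its maximum.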
SurfaceFlinger is a system service responsible for functions such as layer creation, control, and management.
In addition, the system library layer may further include: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), and the like. The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications. The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like, and may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG. The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, composition, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. In this embodiment of the present application, the kernel layer at least includes a touch driving module and a display driving module.
The display driving module is used to display the synthesized image on the screen according to image data provided by the modules of the application framework layer and the applications of the application layer. For example, the video application transfers a frame of video image data to the display driving module, and the display driving module displays that frame of the video on the touch screen based on the image data. The system UI transmits image data to the display driving module, and the display driving module displays the synthesized image on the screen.
The touch driving module is used to monitor the capacitance data of each area of the touch screen. When a user taps or slides on the touch screen, the capacitance value of the tapped or slid area changes. The touch driving module monitors the change of the capacitance value of each area on the touch screen and sends a capacitance change message to the input management module, where the capacitance change message carries information such as the change amplitude and the change time of the capacitance value of each area of the touch screen.
The input management module can determine the touch operation according to the reported capacitance change message, and then send the identified touch operation to other modules. The touch operation here may include a knuckle-tap operation, a click operation, a drag operation, and specific gesture operations (e.g., a swipe-up gesture operation, a side-slide gesture operation, etc.).
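The kind of classification the input management module performs can be sketched by distinguishing a click, a long press, and a drag from the reported touch points. The thresholds (20 px of movement, 500 ms of duration) are illustrative assumptions, not values from the patent.

```python
import math

def classify_touch(points):
    """Classify one touch sequence from its reported samples.
    points: list of (timestamp_ms, x, y) tuples for a single gesture.
    Thresholds below are assumed for illustration."""
    if len(points) < 2:
        return "click"
    (t0, x0, y0), (t1, x1, y1) = points[0], points[-1]
    moved = math.hypot(x1 - x0, y1 - y0)
    if moved > 20:            # assumed touch-slop threshold (px)
        return "drag"
    if (t1 - t0) > 500:       # assumed long-press threshold (ms)
        return "long_press"
    return "click"
```

A short, nearly stationary sequence reads as a click; a sequence that travels across the screen reads as a drag regardless of its duration.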
The hardware layer includes a screen, an ambient light sensor, and the like, which are used to detect information such as the ambient light under the screen. In the method, the application processor monitors touch operations on the touch screen, while the baseband processor monitors acceleration data and stores the monitored acceleration data into the storage module; when the AP monitors a touch operation, the CP identifies whether the touch operation is a knuckle-tap action according to the acceleration data stored in the storage module, and sends the identification result to the AP.
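The CP-side check above, deciding from buffered acceleration data whether a touch was a knuckle tap, can be sketched as a peak search around the touch timestamp. The 50 ms window and the spike threshold are illustrative assumptions, not the patent's actual classifier.

```python
def is_knuckle_tap(samples, touch_time_ms, window_ms=50, spike_threshold=3.0):
    """Decide whether a touch at `touch_time_ms` was a knuckle tap.
    samples: list of (timestamp_ms, accel_magnitude) pairs read back from
    the storage module, with gravity already removed. A knuckle tap is
    assumed to produce a sharp acceleration spike near the touch time."""
    window = [a for t, a in samples if abs(t - touch_time_ms) <= window_ms]
    return bool(window) and max(window) >= spike_threshold
```

Because the CP buffers the acceleration stream continuously, the spike around the touch timestamp is still available when the AP reports the touch, even though the two processors run independently.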
The above technical architecture exemplifies modules and devices in an electronic device that may be involved in the present application. In practical applications, the electronic device may include all or part of the modules and devices of the above technical architecture, and other modules and devices not mentioned in the above technical architecture, and of course, may also include only the modules and devices of the above technical architecture, which is not limited in this embodiment.
The present application also provides a computer-readable storage medium storing a computer program, which when executed by a processor is capable of implementing the steps in the above-described method embodiments.
The present application further provides a computer program product including a computer program, which, when executed by a processor, can implement the steps in the above method embodiments.
All or part of the procedures in the methods of the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of the above method embodiments. The computer program includes computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing device/electronic apparatus, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example, a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, in accordance with legislation and patent practice, computer-readable media may not be electrical carrier signals or telecommunications signals.
In the foregoing embodiments, each embodiment is described with its own emphasis; for a part that is not described or detailed in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed method and electronic device may be implemented in other manners. For example, the apparatus/network device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.
Claims (10)
1. An identification method for a touch operation, applied to an electronic device, wherein the electronic device comprises a touch screen, an application processor AP, a baseband processor CP, and a storage module, and the method comprises:
the AP monitors whether there is a touch operation on the touch screen;
the CP monitors acceleration data and stores the monitored acceleration data into a storage module;
when the AP monitors a touch operation, the CP identifies whether the touch operation is a knuckle-tap action according to the acceleration data stored in the storage module, and sends an identification result to the AP.
2. The method as recited in claim 1, further comprising:
when the AP monitors a touch operation, acquiring a capacitance value of the touch screen;
the AP identifies whether the touch operation is a knuckle-tap action according to the capacitance value;
and when the identification result obtained by the AP according to the capacitance value and the identification result obtained by the CP according to the acceleration data are both knuckle-tap actions, determining that the touch operation is a knuckle-tap action.
3. The method as recited in claim 2, further comprising:
and when neither the identification result obtained by the AP according to the capacitance value nor the identification result obtained by the CP according to the acceleration data is a knuckle-tap action, determining that the touch operation is a click operation.
4. The method according to any one of claims 1 to 3, wherein
the storage module is a memory whose power consumption is lower than that of double data rate synchronous dynamic random access memory (DDR).
5. A touch operation method, applied to an electronic device, wherein the electronic device comprises a touch screen, an application processor AP, a baseband processor CP, and a storage module, and the method comprises:
the AP and the CP identifying the type of a touch operation using the method according to any one of claims 1 to 4;
and the AP, in response to the touch operation, executing a preset operation matched with the type of the touch operation.
6. The method according to claim 5, wherein
when it is determined that the touch operation comprises a knuckle-tap action, the preset operation comprises any one of screen capturing, region screen capturing, or screen recording.
7. An identification apparatus for a touch operation, applied to an electronic device, wherein the electronic device comprises a touch screen, and the identification apparatus comprises an application processor AP and a baseband processor CP, wherein
the AP is configured to monitor whether there is a touch operation on the touch screen;
the CP is configured to monitor acceleration data and store the monitored acceleration data into a storage module, and, when the AP monitors a touch operation, to identify whether the touch operation is a knuckle-tap action according to the acceleration data stored in the storage module and send an identification result to the AP.
8. A touch operation apparatus, applied to an electronic device, wherein the electronic device comprises a touch screen, and the touch operation apparatus comprises: a response module and the identification apparatus for a touch operation according to claim 7, wherein
the response module is configured to, when the identification apparatus determines that the touch operation comprises a knuckle-tap action, respond to the touch operation and execute a preset operation matched with the touch operation.
9. An electronic device, comprising: a touch screen, an application processor AP, a baseband processor CP, and a memory storing instructions that, when executed by the AP and the CP, cause the electronic device to perform the method of any of claims 1-4 or any of claims 5-6.
10. A computer-readable storage medium storing instructions that, when executed by an electronic device, cause the electronic device to perform the method of any one of claims 1 to 4 or any one of claims 5 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310476916.3A CN116225274A (en) | 2023-04-28 | 2023-04-28 | Identification method and device for touch operation, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116225274A true CN116225274A (en) | 2023-06-06 |
Family
ID=86573472
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310476916.3A Pending CN116225274A (en) | 2023-04-28 | 2023-04-28 | Identification method and device for touch operation, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116225274A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116450026A (en) * | 2023-06-16 | 2023-07-18 | 荣耀终端有限公司 | Method and system for identifying touch operation |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104253900A (en) * | 2013-06-28 | 2014-12-31 | 展讯通信(上海)有限公司 | Smart phone and data transmission method and data transmission system thereof |
US20150309650A1 (en) * | 2014-04-24 | 2015-10-29 | Qualcomm Incorporated | Efficient lossless compression for peripheral interface data transfer |
CN106415472A (en) * | 2015-04-14 | 2017-02-15 | 华为技术有限公司 | Gesture control method, device, terminal apparatus and storage medium |
CN106445120A (en) * | 2016-09-05 | 2017-02-22 | 华为技术有限公司 | Touch operation identification method and apparatus |
CN108418768A (en) * | 2018-02-13 | 2018-08-17 | 广东欧珀移动通信有限公司 | Recognition methods, device, terminal and the storage medium of business datum |
CN110475023A (en) * | 2019-08-19 | 2019-11-19 | Oppo广东移动通信有限公司 | Context data processing method, device, electronic equipment and computer-readable medium |
CN113805487A (en) * | 2020-07-23 | 2021-12-17 | 荣耀终端有限公司 | Control instruction generation method and device, terminal equipment and readable storage medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |