CN115016629A - Method and device for preventing false touch - Google Patents

Method and device for preventing false touch

Info

Publication number
CN115016629A
CN115016629A (application CN202111408327.9A); granted as CN115016629B
Authority
CN
China
Prior art keywords
motion
aod
terminal device
monitoring result
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111408327.9A
Other languages
Chinese (zh)
Other versions
CN115016629B (en)
Inventor
金学奎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310591455.4A (published as CN116820222A)
Priority to CN202111408327.9A (granted as CN115016629B)
Publication of CN115016629A
Application granted
Publication of CN115016629B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234Power saving characterised by the action undertaken
    • G06F1/325Power saving in peripheral device
    • G06F1/3265Power saving in display device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental & Geological Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)

Abstract

The embodiment of the application provides a method and a device for preventing false touch, relates to the field of terminal technologies, and is applied to a terminal device. The method comprises: the terminal device receives a first operation for enabling always on display (AOD); in response to the first operation, the terminal device obtains a monitoring result based on first data, where the first data comprises acceleration data, angular acceleration data, and ambient light data; when the monitoring result indicates that the terminal device satisfies the pocket mode, the terminal device switches from AOD to screen-off; or, when the monitoring result indicates that the terminal device does not satisfy the pocket mode, the terminal device maintains AOD. In this way, by recognizing the pocket mode during AOD, the terminal device can avoid the application processor (AP) being continuously woken up by false touches in a pocket, thereby reducing power consumption.

Description

Method and device for preventing false touch
Technical Field
The application relates to the field of terminal technologies, and in particular, to a method and a device for preventing false touch.
Background
With the popularization and development of the internet, people's functional requirements on terminal devices have diversified. For example, to meet users' need to view time, incoming messages, notification messages, and other content on the display screen of the terminal device at any time, many terminal devices support AOD (always on display). AOD can be understood as a low-power-consumption display state while the screen is off.
Normally, when the user presses the power key to turn off the screen, the display screen of the terminal device may enter a doze (DOZE) state. If the user puts the terminal device into a pocket while AOD is displayed after the screen is turned off, the application processor (AP) may be continuously woken up to run AOD as the user moves, resulting in higher power consumption and faster battery drain.
Disclosure of Invention
The embodiment of the application provides a method for preventing false touch, so that by recognizing the pocket mode during AOD, the terminal device can avoid the AP being continuously woken up by false touches in a pocket, thereby reducing power consumption.
In a first aspect, an embodiment of the present application provides a method for preventing false touch, applied to a terminal device. The method comprises: the terminal device receives a first operation for enabling always on display (AOD); in response to the first operation, the terminal device obtains a monitoring result based on first data, where the first data comprises acceleration data, angular acceleration data, and ambient light data; when the monitoring result indicates that the terminal device satisfies the pocket mode, the terminal device switches from AOD to screen-off; or, when the monitoring result indicates that the terminal device does not satisfy the pocket mode, the terminal device maintains AOD. In this way, by recognizing the pocket mode during AOD, the terminal device can avoid the AP being continuously woken up by false touches in a pocket, thereby reducing power consumption.
In a possible implementation manner, obtaining, by the terminal device, the monitoring result based on the first data includes: when the acceleration data and the angular acceleration data indicate that the terminal device satisfies a first preset direction and the ambient light data is smaller than a first brightness threshold, the terminal device determines that the monitoring result is that the terminal device satisfies the pocket mode; or, when the acceleration data and the angular acceleration data indicate that the terminal device does not satisfy the first preset direction and/or the ambient light data is greater than a second brightness threshold, the terminal device determines that the monitoring result is that the terminal device does not satisfy the pocket mode; where the second brightness threshold is greater than or equal to the first brightness threshold. In this way, the terminal device can accurately determine whether it is in the pocket mode based on the ambient light data, the acceleration data, and the angular acceleration data.
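As a rough illustration of the decision logic above, the following C++ sketch combines the orientation check with the two brightness thresholds. All names, threshold values, and the axis convention are assumptions for illustration and are not taken from the patent; note that making the second threshold no smaller than the first yields hysteresis, so small brightness fluctuations near the boundary do not flip the result back and forth.

```cpp
// Minimal sketch of the pocket-mode decision described above. All names,
// threshold values, and the axis convention are illustrative assumptions.
struct SensorData {
    float accel[3];     // accelerometer reading, m/s^2
    float gyro[3];      // angular velocity from the gyroscope, rad/s
    float ambientLux;   // ambient light level, lux
};

enum class MonitorResult { PocketMode, NotPocketMode, Unchanged };

constexpr float LUX_ENTER = 5.0f;   // first brightness threshold (assumed value)
constexpr float LUX_EXIT  = 10.0f;  // second brightness threshold, >= LUX_ENTER

// Stand-in for the "first preset direction" (head-down): gravity points out of
// the top edge of the device, so the y-axis reading is strongly negative.
bool satisfiesPresetDirection(const SensorData& d) {
    return d.accel[1] < -7.0f;  // roughly within 45 degrees of straight down
}

MonitorResult evaluatePocketMode(const SensorData& d) {
    if (satisfiesPresetDirection(d) && d.ambientLux < LUX_ENTER)
        return MonitorResult::PocketMode;     // dark and head-down: in a pocket
    if (!satisfiesPresetDirection(d) || d.ambientLux > LUX_EXIT)
        return MonitorResult::NotPocketMode;  // bright or not head-down
    return MonitorResult::Unchanged;          // between thresholds: keep last state
}
```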
In a possible implementation manner, the first preset direction is a head-down orientation of the terminal device.
In a possible implementation manner, the display mode of the AOD includes a first display mode or a second display mode, and the terminal device performs display through the AOD application as follows: when the display mode of the AOD application is the first display mode, the terminal device displays a preset animation through the AOD application and turns off the screen after the preset animation finishes; or, when the display mode of the AOD application is the second display mode, the terminal device displays the preset animation through the AOD application and displays the last frame of the preset animation after the animation finishes. In this way, the terminal device can run AOD in the preset mode, enhancing the user's AOD experience. The first display mode may be a tap display mode, and the second display mode may be an all-day display mode.
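The two display modes can be pictured with the following minimal sketch; the function names and their stub bodies are illustrative stand-ins for real display-control calls, not patent text.

```cpp
#include <cstdio>

// Sketch of the two AOD display modes described above; all names are assumed.
enum class AodDisplayMode { TapDisplay, AllDayDisplay };

void playAnimation() { std::puts("play preset AOD animation to its last frame"); }
void showLastFrame() { std::puts("keep last frame on screen, AP sleeps"); }
void turnScreenOff() { std::puts("screen off, AP sleeps"); }

void runAod(AodDisplayMode mode) {
    playAnimation();
    if (mode == AodDisplayMode::TapDisplay)
        turnScreenOff();   // first display mode: screen off after the animation
    else
        showLastFrame();   // second display mode: last frame stays on screen
}
```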
In one possible implementation, the terminal device includes an AOD application and a display screen, and the method further comprises: the terminal device receiving the first operation for enabling AOD includes: the AOD application receives the first operation for enabling AOD; in response to the first operation, the terminal device obtaining the monitoring result based on the first data includes: in response to the first operation, the AOD application obtains the monitoring result based on the first data; when the monitoring result indicates that the terminal device satisfies the pocket mode, the terminal device switching from AOD to screen-off includes: the AOD application instructs the display screen to switch from AOD to screen-off; or, when the monitoring result indicates that the terminal device does not satisfy the pocket mode, the terminal device maintaining AOD includes: the AOD application instructs the display screen to maintain AOD. In this way, by recognizing the pocket mode, the AOD application can avoid the AP being continuously woken up by false touches in a pocket, thereby reducing power consumption.
In a possible implementation manner, the terminal device further includes a motion framework and a motion sensor control center, and obtaining, by the AOD application in response to the first operation, the monitoring result based on the first data includes: in response to the first operation, the AOD application sends a first message to the motion framework, where the first message instructs the motion framework to register a listener, and the listener is used for monitoring whether the terminal device is in the pocket mode; in response to the first message, the motion framework registers the listener; the motion framework sends a second message to the motion sensor control center, where the second message is used for starting the listener; in response to the second message, the motion sensor control center obtains the monitoring result based on the first data; the motion sensor control center sends the monitoring result to the AOD application; and the AOD application obtains the monitoring result. In this way, through the registered listener, the AOD application can listen for the monitoring result obtained by the motion sensor control center based on the first data.
In one possible implementation, the motion framework registering the listener includes: the motion framework calls the new HWExtMotion(MOTION_TYPE_HEAD_DOWN_WITH_ALS) interface to register the listener.
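A minimal sketch of what this registration step might look like is given below. Only the HWExtMotion name and the MOTION_TYPE_HEAD_DOWN_WITH_ALS identifier come from the patent text; the constant's value, the class layout, and the surrounding framework class are assumptions.

```cpp
#include <memory>

// Sketch of the listener-registration step; everything beyond the two quoted
// names is assumed for illustration.
constexpr int MOTION_TYPE_HEAD_DOWN_WITH_ALS = 100;  // assumed value

struct HWExtMotion {
    explicit HWExtMotion(int motionType) : type(motionType) {}
    int type;  // which motion this listener monitors
};

class MotionFramework {
public:
    // Mirrors "new HWExtMotion(MOTION_TYPE_HEAD_DOWN_WITH_ALS)": register a
    // listener for the head-down-with-ambient-light (pocket mode) motion.
    void registerListener() {
        listener_ = std::make_unique<HWExtMotion>(MOTION_TYPE_HEAD_DOWN_WITH_ALS);
    }
private:
    std::unique_ptr<HWExtMotion> listener_;
};
```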
In one possible implementation, the motion sensor control center obtaining the monitoring result based on the first data includes: when the acceleration data and the angular acceleration data indicate that the terminal device satisfies the first preset direction and the ambient light data is smaller than the first brightness threshold, the motion sensor control center determines that the monitoring result is that the terminal device satisfies the pocket mode; or, when the acceleration data and the angular acceleration data indicate that the terminal device does not satisfy the first preset direction and/or the ambient light data is greater than the second brightness threshold, the motion sensor control center determines that the monitoring result is that the terminal device does not satisfy the pocket mode; where the second brightness threshold is greater than or equal to the first brightness threshold. In this way, the motion sensor control center can accurately determine whether the terminal device is in the pocket mode from the ambient light data, the acceleration data, and the angular acceleration data.
In a possible implementation manner, the terminal device further includes a motion hardware abstraction layer interface description language and a motion hardware abstraction layer, and sending, by the motion framework, the second message to the motion sensor control center includes: the motion framework sends the second message to the motion hardware abstraction layer interface description language; the motion hardware abstraction layer interface description language sends the second message to the motion hardware abstraction layer; and the motion hardware abstraction layer sends the second message to the motion sensor control center. In this way, after the motion framework starts pocket-mode monitoring, the motion hardware abstraction layer interface description language and the motion hardware abstraction layer can relay the message for starting pocket-mode monitoring to the motion sensor control center, so that the motion sensor control center can monitor the pocket mode.
In one possible implementation, sending, by the motion framework, the second message to the motion hardware abstraction layer interface description language includes: the motion framework calls the int HWExtMotionService::enableMotion(int motionType, int action, int delay) interface to send the second message to the motion hardware abstraction layer interface description language. In this way, the motion framework can accurately send the second message to the motion hardware abstraction layer interface description language based on the int HWExtMotionService::enableMotion(int motionType, int action, int delay) interface.
In one possible implementation, sending, by the motion hardware abstraction layer interface description language, the second message to the motion hardware abstraction layer includes: the motion hardware abstraction layer interface description language calls the Return&lt;int32_t&gt; Motion::startMotionReco(int32_t recoMotion) interface to send the second message to the motion hardware abstraction layer. In this way, the motion hardware abstraction layer interface description language can accurately send the second message to the motion hardware abstraction layer based on the Return&lt;int32_t&gt; Motion::startMotionReco(int32_t recoMotion) interface.
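The downward path of the second message can be pictured with the following sketch. The enableMotion and startMotionReco signatures are quoted from the patent; the Return&lt;T&gt; wrapper, the function bodies, and the wiring between layers are assumptions made for illustration.

```cpp
#include <cstdint>
#include <cstdio>

// Sketch of the downward path: framework -> HIDL -> HAL -> sensorhub.
template <typename T> struct Return { T value; };

struct Motion {  // motion HAL, reached through the HIDL layer
    Return<int32_t> startMotionReco(int32_t recoMotion) {
        std::printf("HAL: start recognition for motion %d\n", (int)recoMotion);
        // ...would forward the request to the motion sensor control center...
        return {0};
    }
};

struct HWExtMotionService {  // motion framework side
    Motion hal;
    int enableMotion(int motionType, int action, int delay) {
        (void)action; (void)delay;  // unused in this sketch
        // Forward the second message toward the HAL via the HIDL interface.
        return hal.startMotionReco(motionType).value;
    }
};
```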
In one possible implementation, sending, by the motion sensor control center, the monitoring result to the AOD application includes: the motion sensor control center sends the monitoring result to the motion hardware abstraction layer; the motion hardware abstraction layer sends the monitoring result to the motion hardware abstraction layer interface description language; the motion hardware abstraction layer interface description language sends the monitoring result to the motion framework; and the motion framework sends the monitoring result to the AOD application. In this way, the AOD application can accurately receive the monitoring result sent by the motion sensor control center via the motion hardware abstraction layer and the motion hardware abstraction layer interface description language.
In one possible implementation, sending, by the motion sensor control center, the monitoring result to the motion hardware abstraction layer includes: the motion sensor control center calls the pb_send_sensor_stream_event interface to send the monitoring result to the motion hardware abstraction layer. In this way, the motion sensor control center can accurately send the monitoring result to the motion hardware abstraction layer based on the pb_send_sensor_stream_event interface.
In one possible implementation, sending, by the motion hardware abstraction layer interface description language, the monitoring result to the motion framework includes: the motion hardware abstraction layer interface description language calls the Return&lt;void&gt; Motion::pollMotionResult(pollMotionResult_cb _hidl_cb) interface to send the monitoring result to the motion framework. In this way, the motion hardware abstraction layer interface description language can accurately send the monitoring result to the motion framework based on the Return&lt;void&gt; Motion::pollMotionResult(pollMotionResult_cb _hidl_cb) interface.
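The upward path of the monitoring result can be sketched in a similar way. The pollMotionResult and pb_send_sensor_stream_event names come from the patent text; the callback type, parameters, and plumbing are assumptions for illustration.

```cpp
#include <cstdint>
#include <cstdio>
#include <functional>
#include <utility>

// Sketch of the upward path: sensorhub -> HAL -> HIDL -> framework -> AOD app.
using pollMotionResult_cb = std::function<void(int32_t motionType, bool inPocket)>;

struct MotionHal {
    pollMotionResult_cb pending_cb;

    // HIDL-style poll: the caller supplies a callback that receives the result.
    void pollMotionResult(pollMotionResult_cb _hidl_cb) { pending_cb = std::move(_hidl_cb); }

    // Stand-in for pb_send_sensor_stream_event(): the sensorhub reports upward.
    void onSensorStreamEvent(int32_t motionType, bool inPocket) {
        if (pending_cb) pending_cb(motionType, inPocket);
    }
};

int main() {
    MotionHal hal;
    hal.pollMotionResult([](int32_t type, bool inPocket) {  // framework -> AOD app
        std::printf("motion %d: pocket mode %s\n", (int)type, inPocket ? "met" : "not met");
    });
    hal.onSensorStreamEvent(/*motionType=*/100, /*inPocket=*/true);
    return 0;
}
```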
In one possible implementation, the display mode of the AOD includes the first display mode or the second display mode, and the AOD application instructing the display screen to maintain AOD includes: when the display mode of the AOD application is the first display mode, the AOD application instructs the display screen to display the preset animation and turn off after the preset animation finishes; or, when the display mode of the AOD application is the second display mode, the AOD application instructs the display screen to display the preset animation and show the last frame of the preset animation after the animation finishes. In this way, the AOD application can run AOD in the preset mode, enhancing the user's AOD experience.
In a second aspect, an embodiment of the present application provides a terminal device, which may also be referred to as a terminal (terminal), a user equipment (UE), a mobile station (MS), a mobile terminal (MT), or the like. The terminal device may be a mobile phone (mobile phone), a smart TV, a wearable device, a tablet computer (Pad), a computer with a wireless transceiving function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self-driving (self-driving), a wireless terminal in remote surgery (remote medical surgery), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), and so on.
The terminal device comprises a processor for invoking a computer program in a memory for performing the method according to the first aspect.
In a third aspect, embodiments of the present application provide a computer-readable storage medium storing computer instructions that, when executed on a terminal device, cause the terminal device to perform a method according to the first aspect, or cause the terminal device to perform a method according to the second aspect.
In a fourth aspect, embodiments of the present application provide a computer program product which, when executed, causes a terminal device to perform a method as in the first aspect, or a computer to perform a method as in the second aspect.
In a fifth aspect, embodiments of the present application provide a chip comprising a processor configured to invoke a computer program in a memory to perform a method according to the first aspect or to perform a method according to the second aspect.
It should be understood that the second aspect to the fifth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the beneficial effects achieved by the aspects and the corresponding possible implementations are similar and will not be described again.
Drawings
Fig. 1 is a schematic interface diagram of an AOD according to an embodiment of the present application;
Fig. 2 is a schematic interface diagram of another AOD according to an embodiment of the present application;
Fig. 3 is a schematic illustration of an animation display according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a software structure of a terminal device according to an embodiment of the present application;
Fig. 6 is a schematic flowchart of a method for preventing false touch according to an embodiment of the present application;
Fig. 7 is a schematic interface diagram of another AOD according to an embodiment of the present application;
Fig. 8 is a schematic interface diagram of another AOD according to an embodiment of the present application;
Fig. 9 is a schematic interface diagram of another AOD according to an embodiment of the present application;
Fig. 10 is a schematic interface diagram of another AOD according to an embodiment of the present application;
Fig. 11 is a schematic structural diagram of an apparatus for preventing false touch according to an embodiment of the present application;
Fig. 12 is a schematic diagram of a hardware structure of an apparatus for preventing false touch according to an embodiment of the present application.
Detailed Description
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish the same or similar items having substantially the same function and effect. For example, a first value and a second value are only used to distinguish different values, and no order between them is limited. Those skilled in the art will appreciate that the terms "first," "second," and the like do not limit quantity or execution order, and do not denote importance.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to mean exemplary, illustrative, or descriptive. Any embodiment or design described herein as "exemplary" or "such as" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a and b, a and c, b and c, or a, b and c, wherein a, b and c can be single or multiple.
More and more terminal devices support AOD, and as the AOD function further improves, terminal devices can support multiple AOD display modes. Fig. 1 is a schematic interface diagram of an AOD according to an embodiment of the present application. In the embodiment corresponding to fig. 1, a mobile phone is taken as an example of the terminal device for description, and this example does not limit the embodiments of the present application.
When the mobile phone receives an operation of the user opening the display-mode setting interface in the AOD application, the mobile phone may display the interface shown in fig. 1, which may include, for example: tap display, all-day display, and timed display.
In one implementation, in the tap display mode, after the AOD is displayed for a period of time, for example, after a segment of AOD animation is displayed, the AOD may stop being displayed (or be understood as hidden), the terminal device enters the screen-off state, and the AP sleeps; in the screen-off state, when the terminal device receives the user's trigger operation on the display screen, the terminal device wakes up the AP, makes the display screen enter the DOZE state again, and starts the AOD.
In another implementation, in the all-day display mode, after the AOD is displayed for a period of time, for example, after a segment of AOD animation is displayed, the last frame of the AOD animation may be displayed continuously; at this time, the display screen enters the DOZE_SUSPEND state and the AP sleeps. In the DOZE_SUSPEND state, when the terminal device receives the user's trigger operation on the display screen, the terminal device wakes up the AP, switches the display screen from the DOZE_SUSPEND state to the DOZE state, and starts the AOD.
In another implementation, in the timed display mode, within a preset time period, after the AOD is displayed for a period of time, for example, after the AOD animation is displayed for a certain period, the last frame of the AOD animation may be displayed continuously; at this time, the display screen enters the DOZE_SUSPEND state and the AP sleeps. In the DOZE_SUSPEND state, when the terminal device receives the user's trigger operation on the display screen, the terminal device may determine whether the current time falls within the preset time period; if not, the mobile phone may enter the screen-off state without performing AOD. It can be understood that, outside the preset time period, even if the terminal device receives the user's trigger operation on the display screen, the terminal device does not need to start the AOD.
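The timed-display decision on a touch can be summarized by the following sketch; the time-window representation and all names are illustrative assumptions, not patent text.

```cpp
#include <cstdio>

// Sketch of the timed-display check described above.
struct TimeWindow { int startMinutes; int endMinutes; };  // minutes since midnight

bool inPresetPeriod(int nowMinutes, const TimeWindow& w) {
    return nowMinutes >= w.startMinutes && nowMinutes < w.endMinutes;
}

void onScreenTouched(int nowMinutes, const TimeWindow& w) {
    if (inPresetPeriod(nowMinutes, w)) {
        std::puts("wake AP, enter DOZE, start AOD");
    } else {
        std::puts("stay screen-off, do not start AOD");  // outside the window
    }
}

int main() {
    const TimeWindow window{8 * 60, 22 * 60};  // e.g. 08:00 to 22:00
    onScreenTouched(9 * 60, window);   // within the window: AOD starts
    onScreenTouched(23 * 60, window);  // outside the window: screen stays off
    return 0;
}
```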
Further, after the terminal device receives the user's operations of starting the AOD and selecting a display mode as in the embodiment corresponding to fig. 1, the terminal device performs the AOD; in the DOZE state, the terminal device may display the interface corresponding to fig. 2. As in the interface corresponding to fig. 2, the terminal device may play a preset AOD animation; fig. 3 shows an example of the animation playing process.
Illustratively, fig. 2 is a schematic interface diagram of another AOD provided in an embodiment of the present application.
When the user has enabled the AOD and the terminal device receives a screen-off event, the terminal device may trigger the AOD and display the interface corresponding to fig. 2. As shown in fig. 2, one or more of the following may be displayed in the interface, for example: an AOD animation 201, such as a polar bear lying on an ice surface, time information 202, and date information 203. In a possible implementation manner, information such as a fingerprint identification mark and a battery level mark may also be displayed in the interface corresponding to fig. 2, and the content displayed in the interface corresponding to fig. 2 may include other content according to the actual scene, which is not limited in this embodiment of the application.
In the embodiment of the present application, the screen-off event may be understood as the user pressing the power key (power) while the terminal device's screen is on, or as the terminal device detecting no user operation for a period of time while the screen is on.
Illustratively, fig. 3 is a schematic diagram of an animation display provided in an embodiment of the present application. As shown in fig. 3, the AOD animation may contain multiple consecutive frames, which may include, for example: the first frame shown in a in fig. 3, the frame shown in b in fig. 3, the frame shown in c in fig. 3, the frame shown in d in fig. 3, and the last frame shown in e in fig. 3. The first frame shown in a in fig. 3 and the frame shown in b in fig. 3 may be pictures of a polar bear lying on the ice, the frame shown in c in fig. 3 may be a picture of the polar bear rising from the ice, the frame shown in d in fig. 3 may be a picture of the polar bear walking on the ice, and the last frame shown in e in fig. 3 may be a picture of the polar bear walking to the edge of the ice.
In the tap display mode, when the terminal device receives an operation triggering a screen-off event, the terminal device may display the AOD animation shown in fig. 3, for example, playing from the first frame shown in a in fig. 3 to the last frame shown in e in fig. 3; when the AOD animation finishes playing, the terminal device may turn off the screen, and the AP sleeps.
In a possible implementation manner, while the terminal device's screen is off, when the terminal device receives the user's trigger operation on the display screen, the terminal device may wake up the AP to display the AOD animation or another preset display animation.
In the all-day display mode, when the terminal device receives an operation triggering a screen-off event, the terminal device may display the AOD animation shown in fig. 3, playing from the first frame shown in a in fig. 3 to the last frame shown in e in fig. 3; when the AOD animation finishes playing, the terminal device may continuously display the last frame shown in e in fig. 3, and the AP sleeps.
In a possible implementation manner, while the terminal device continuously displays the last frame shown in e in fig. 3, when the terminal device receives the user's operation of touching the screen, the terminal device may wake up the AP to display the AOD animation or another preset display animation.
However, whether the AOD display mode is all-day display or tap display, when the user puts the terminal device into a pocket or a bag during AOD, the display screen may be touched repeatedly as the user walks, so that the AOD repeatedly enters the DOZE state and the AP is repeatedly woken up, resulting in higher power consumption and faster battery drain.
In view of this, the embodiment of the present application provides a method for preventing false touch, so that by recognizing the pocket mode during AOD, the terminal device can avoid the AP being continuously woken up by false touches in a pocket, thereby reducing power consumption. The pocket mode may be understood as the user placing the terminal device in a pocket or a bag, so that the terminal device is oriented head-down and detects weak ambient light.
It is understood that the terminal device may also be referred to as a terminal (terminal), a user equipment (UE), a mobile station (MS), a mobile terminal (MT), and so on. The terminal device may be a mobile phone (mobile phone) with the AOD function, a smart TV, a wearable device, a tablet computer (Pad), a computer with a wireless transceiving function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self-driving (self-driving), a wireless terminal in remote surgery (remote medical surgery), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), and the like. The embodiment of the present application does not limit the specific technology and specific device form adopted by the terminal device.
Therefore, in order to better understand the embodiments of the present application, the following describes a structure of a terminal device according to the embodiments of the present application. Exemplarily, fig. 4 is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, an indicator 192, a camera 193, a display screen 194, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiments of the present application does not constitute a specific limitation to the terminal device. In other embodiments of the present application, a terminal device may include more or fewer components than shown, or some components may be combined, or some components may be split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it may fetch them directly from this memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose-input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface, thereby implementing the touch function of the terminal device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a display screen serial interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture function of terminal device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the terminal device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device 100, and may also be used to transmit data between the terminal device 100 and a peripheral device. It may also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is an exemplary illustration, and does not limit the structure of the terminal device 100. In other embodiments of the present application, the terminal device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the terminal device 100. The charging management module 140 may also supply power to the terminal device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The antennas in terminal device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the terminal device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the terminal device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the terminal device 100 can communicate with the network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. GNSS may include Global Positioning System (GPS), global navigation satellite system (GLONASS), beidou satellite navigation system (BDS), quasi-zenith satellite system (QZSS), and/or Satellite Based Augmentation System (SBAS).
The terminal device 100 implements a display function by the GPU, the display screen 194, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the terminal device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1. Where AOD is provided on the terminal device, the display screen 194 may perform AOD display on part of the screen.
The terminal device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, the terminal device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record video in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the terminal device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, a phonebook, etc.) created during use of the terminal device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the terminal device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The terminal device 100 may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The terminal device 100 can listen to music through the speaker 170A, or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the terminal device 100 answers a call or voice information, it is possible to answer a voice by bringing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking the user's mouth near the microphone 170C. The terminal device 100 may be provided with at least one microphone 170C. In other embodiments, the terminal device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the terminal device 100 may further include three, four or more microphones 170C to collect a sound signal, reduce noise, identify a sound source, and implement a directional recording function.
The headphone interface 170D is used to connect wired headphones. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a variety of types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The terminal device 100 determines the intensity of the pressure from the change in the capacitance. When a touch operation is applied to the display screen 194, the terminal device 100 detects the intensity of the touch operation from the pressure sensor 180A. The terminal device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions.
The gyro sensor 180B may be used to determine the motion attitude of the terminal device. In some embodiments, the angular velocity of the terminal device about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for image stabilization during shooting. Illustratively, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the terminal device, calculates the distance the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the terminal device through reverse movement, thereby achieving anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios. The air pressure sensor 180C is used to measure air pressure.
The acceleration sensor 180E can detect the magnitude of the terminal device's acceleration in various directions (generally three axes). When the terminal device is stationary, the magnitude and direction of gravity can be detected. It can also be used to recognize the attitude of the terminal device, and is applied in applications such as landscape/portrait switching and pedometers.
In this embodiment, the gyro sensor 180B may be used together with the acceleration sensor 180E to detect whether the user has placed the terminal device head down; that is, the terminal device may collect motion data through the gyro sensor 180B and the acceleration sensor 180E and detect from the motion data whether the terminal device is in the head-down state. The head-down state may be used as one of the determination conditions of the pocket mode; a sketch of one possible test follows.
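The embodiment does not spell out the head-down test itself. The following C++ sketch shows one plausible rule under stated assumptions: the axis convention, the thresholds, and the extra requirement that the device be nearly still are illustrative choices, not the patent's algorithm.

    // Illustrative sketch only: one simple way firmware might classify the
    // "head down" state from accelerometer data, using the gyroscope to
    // require that the device is also nearly still. All names, axis
    // conventions, and thresholds are assumptions.
    #include <cmath>

    struct MotionSample {
        float ax, ay, az;   // accelerometer, m/s^2 (Android-style axes assumed)
        float gx, gy, gz;   // gyroscope, rad/s
    };

    bool isHeadDown(const MotionSample& s) {
        // With Android axis conventions, +y points out of the top of the
        // device; when the top points at the ground, the accelerometer
        // reads roughly -1 g on the y axis.
        constexpr float kGravity = 9.81f;
        constexpr float kAxisRatio = 0.8f;      // assumed tilt tolerance
        constexpr float kMaxAngularRate = 0.5f; // rad/s, assumed "still" bound

        float rate = std::sqrt(s.gx * s.gx + s.gy * s.gy + s.gz * s.gz);
        return s.ay < -kAxisRatio * kGravity && rate < kMaxAngularRate;
    }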
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode. The light emitting diode may be an infrared light emitting diode. The terminal device emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from a nearby object. When sufficient reflected light is detected, it can be determined that there is an object near the terminal device; when insufficient reflected light is detected, the terminal device may determine that there is no object nearby. The terminal device can use the proximity light sensor 180G to detect that the user is holding the terminal device close to the ear for a call, so as to automatically turn off the screen to save power.
The ambient light sensor 180L is used to sense the ambient light level. The terminal device may adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture.
The fingerprint sensor 180H is used to collect a fingerprint. The temperature sensor 180J is used to detect temperature. The touch sensor 180K is also called a "touch device". The bone conduction sensor 180M may acquire a vibration signal.
The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch panel". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the terminal device at a position different from that of the display screen 194.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The terminal device 100 may receive key input and generate key signal input related to user settings and function control of the terminal device 100.
The motor 191 may generate a vibration cue. The indicator 192 may be an indicator light, which may be used to indicate a charging state or a change in battery level, or to indicate a message, a missed call, a notification, and the like. The SIM card interface 195 is used to connect a SIM card.
The software system of the terminal device may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture, which will not be described herein again.
Exemplarily, fig. 5 is a schematic diagram of a software structure of a terminal device according to an embodiment of the present application.
As shown in fig. 5, the layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the system may be divided into five layers, which are, from top to bottom, an application layer, an application framework (framework) layer, a service HIDL (hardware abstraction layer interface description language) layer, a hardware abstraction layer (HAL), and a sensor hub (sensorhub) layer. It can be understood that the layered architecture may include other layers according to the actual scenario, which is not described in this embodiment.
The application layer may include a series of application programs. For example, the applications may include an AOD application, which may be an application for implementing the always-on display (AOD) of the terminal device.
The framework layer provides an application programming interface (API) and a programming framework for the applications of the application layer. The framework layer may include some predefined functions.
As shown in fig. 5, the framework layer may include a motion framework module. In this embodiment of the present application, the motion framework is configured to forward, through an interface, messages sent by the AOD application to the motion HIDL in the service HIDL layer, and to forward, through the interface, messages sent by the motion HIDL in the service HIDL layer to the AOD application. The motion framework may also be used to receive a registration indication message sent by the AOD application for registering the pocket mode, and to register a pocket mode listener based on the indication message. The pocket mode listener is used to monitor whether the terminal device satisfies the pocket mode, as sketched below.
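As an illustration of the registration-and-callback relationship just described, the following C++ sketch shows a pocket mode listener relayed through the motion framework. The class and method names are assumptions for illustration, since the patent only names the modules and the messages they exchange.

    // Minimal sketch of the pocket mode listener path; all names assumed.
    class PocketModeListener {
    public:
        virtual ~PocketModeListener() = default;
        // result == 1: pocket mode satisfied; result == 0: not satisfied.
        virtual void onPocketModeResult(int result) = 0;
    };

    class MotionFramework {
    public:
        // Registered in response to the AOD application's indication message.
        void registerListener(PocketModeListener* listener) { listener_ = listener; }

        // Called when the motion HIDL forwards a monitoring result upward;
        // the framework relays it to the AOD application's listener.
        void onResultFromHidl(int result) {
            if (listener_ != nullptr) listener_->onPocketModeResult(result);
        }

    private:
        PocketModeListener* listener_ = nullptr;
    };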
In a possible implementation, the framework layer may further include a power management module (not shown in fig. 5). The power manager can switch the state of the system based on locks and timers, so that the terminal device can implement the false-touch prevention function. In this embodiment of the application, the power manager provides a wakelock mechanism: while a wakelock is held, the system is in the AP awake state; when the wakelock is released, the system may enter the AP sleep state. Further, the wakelock may carry an identifier indicating whether it has a timeout; for example, if the wakelock is automatically released after the timeout, the system may then start the mechanism for entering the sleep state.
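A minimal sketch of the wakelock semantics described above, assuming a simple lock object with an optional timeout; this is illustrative only, not the platform's actual power manager API.

    #include <chrono>
    #include <optional>

    class WakeLock {
    public:
        // Acquire the lock, optionally with a timeout after which it
        // self-releases ("automatically released after the timeout").
        void acquire(std::optional<std::chrono::milliseconds> timeout = std::nullopt) {
            held_ = true;
            if (timeout) {
                deadline_ = std::chrono::time_point_cast<std::chrono::steady_clock::duration>(
                    std::chrono::steady_clock::now() + *timeout);
            } else {
                deadline_.reset();
            }
        }

        void release() {
            held_ = false;
            deadline_.reset();
        }

        // The system may enter the AP sleep state only when no live lock is held.
        bool apMaySleep() {
            if (held_ && deadline_ && std::chrono::steady_clock::now() >= *deadline_) {
                release();  // timed out: auto-release, enabling sleep
            }
            return !held_;
        }

    private:
        bool held_ = false;
        std::optional<std::chrono::steady_clock::time_point> deadline_;
    };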
The service HIDL layer may include a motion HIDL, which may be configured to forward, through an interface, messages received from the motion framework in the framework layer to the motion HAL in the HAL layer, and to forward, through the interface, messages received from the motion HAL in the HAL layer to the motion framework in the framework layer.
The HAL layer is used to abstract the hardware and can provide a unified interface for upper-layer applications to query hardware devices. The HAL layer hides the hardware interface details of a specific platform and provides a virtual hardware platform for the operating system, making it hardware-independent and portable across various platforms. In embodiments of the present application, the HAL layer may include, for example, a motion HAL. The motion HAL is configured to forward, through an interface, messages received from the motion HIDL in the service HIDL layer to the sensorhub layer, and to forward, through the interface, messages received from the motion sensorhub in the sensorhub layer to the motion HIDL in the service HIDL layer.
The sensorhub layer is used to fuse different types of sensor data, so that the terminal device can implement new functions derived from combining multiple kinds of sensor data. In this embodiment of the application, the sensorhub layer may include a motion sensorhub, and the motion sensorhub can detect whether the terminal device is in the pocket mode based on the ambient light data, the gyroscope data, and the acceleration data.
It will be appreciated that the ambient light data is monitored by the ambient light sensor, the gyroscope data by the gyro sensor, and the acceleration data by the acceleration sensor. For the functions of the ambient light sensor, the gyro sensor, and the acceleration sensor, reference may be made to the related descriptions of the sensors in fig. 4, which are not repeated here.
The following describes in detail, through specific embodiments, the technical solutions of the present application and how they solve the above technical problems. The following embodiments may be implemented independently or in combination, and the same or similar concepts or processes may not be repeated in some embodiments.
Exemplarily, fig. 6 is a schematic flowchart of a method for preventing false touch according to an embodiment of the present application. As shown in fig. 6, the terminal device may include an APP layer, a framework layer, a service HIDL layer, a HAL layer, and a sensorhub layer. Each layer of the terminal device may include modules for supporting the false-touch prevention method provided in this embodiment of the present application: for example, the APP layer may include the AOD application; the framework layer may include the motion framework; the service HIDL layer may include the motion HIDL; the HAL layer may include the motion HAL; and the sensorhub layer may include the motion sensorhub.
As shown in fig. 6, the method for preventing false touch may include the following steps:
S601, when the AOD application receives the operation for turning off the screen, the AOD application sends an indication message to the motion framework.
Illustratively, the AOD application may send the indication message to the motion framework through an interface, such as a registerdeviceiriesener interface. Correspondingly, the motion framework may receive the indication message sent by the AOD application.
In this embodiment of the present application, the indication message is used to instruct the motion framework to register a listener for monitoring the pocket mode. The operation for turning off the screen may be triggered by the user, or may be triggered automatically when no user operation is detected for a period of time in the bright-screen state. For example, the user-triggered screen-off operation may include one or more of the following: a voice operation, or a user's tap or long-press operation on the power key; this is not limited in this embodiment of the application.
Illustratively, fig. 7 is an interface schematic diagram of another AOD provided in an embodiment of the present application. As shown in a in fig. 7, in the case that the user has turned on the AOD, for example in the tap-display mode (or the all-day display mode, the scheduled display mode, etc.), when the terminal device receives an operation of the user triggering the power key 701, or when the terminal device does not detect a user operation within a period of time while the display screen is bright, the terminal device may switch from the bright-screen state shown in a in fig. 7 to the AOD shown in b in fig. 7, in which the display screen is in the DOZE state and the AP is in the awake state. The interface shown in a in fig. 7 may be similar to the interface shown in fig. 1, and the interface shown in b in fig. 7 may be similar to the interface shown in fig. 2, which are not described herein again.
It is understood that the interface shown in a in fig. 7 is only an example when the terminal device is in the bright screen state, and in a possible implementation manner, the terminal device may also be switched to the AOD from interfaces in other bright screen states, which is not limited in this embodiment of the application.
It will be appreciated that during the AOD process, the AOD application may instruct the power manager in the framework layer to apply for a wakelock. The display content of the AOD may be an AOD animation, a single frame of image, static information, or other display content, which is not limited in this embodiment of the application.
After S601, the user may place the terminal device in a pocket (or a backpack). When the user walks or moves, the terminal device in the pocket (or backpack) may shake continuously with the user's movement, repeatedly triggering the display screen (or changes in ambient light) and thus repeatedly waking up the AP, which increases power consumption.
S602, the motion framework registers a listener for monitoring the pocket mode. Illustratively, the motion framework may call an interface, such as a new HwExtMotion(MOTION_TYPE_HEAD_DOWN_WITH_ALS) interface, to register the listener, for example by registering a callback function mHwExtDeviceManager.
S603, the motion framework sends a message instructing to turn on pocket mode listening to the motion HIDL.
Illustratively, the motion framework may call an interface, such as the int HwExtMotionService::enableMotion(int motionType, int action, int delay) interface, to notify the motion HIDL of the message for turning on pocket mode listening. Correspondingly, the motion HIDL may receive the message sent by the motion framework instructing to turn on pocket mode listening.
S604, the motion HIDL sends a message instructing to turn on pocket mode listening to the motion HAL.
Illustratively, the motion HIDL may call an interface, such as the Return<int32_t> Motion::startMotionReco(int32_t recoMotion) interface, to notify the motion HAL of the message for turning on pocket mode listening. Correspondingly, the motion HAL may receive the message sent by the motion HIDL instructing to turn on pocket mode listening.
S605, the motion HAL sends a message instructing to turn on pocket mode listening to the motion sensorhub.
For example, the motion HAL may call an interface, such as a send-request interface, to notify the motion sensorhub of the message for turning on pocket mode listening. Correspondingly, the motion sensorhub may receive the message sent by the motion HAL instructing to turn on pocket mode listening.
S606, the motion sensorhub acquires monitoring data and calculates whether the terminal device is in the pocket mode, to obtain a pocket mode monitoring result.
For example, the motion sensorhub may start data monitoring through a send-request interface of the Qualcomm message interface (QMI), and, when the corresponding sensor data is received, call an algorithm interface, for example a proxy_with_aod interface, to calculate whether the terminal device is in the pocket mode, so as to obtain the pocket mode monitoring result.
In this embodiment of the present application, the monitoring data may include ambient light data, acceleration data, gyroscope data, and the like. The acceleration data, gyroscope data, and the like may be used to monitor the attitude of the terminal device, for example whether the terminal device is in the head-down state. For example, the motion sensorhub may determine, using a preset rule, whether the terminal device satisfies the head-down state based on the acceleration data and the gyroscope data.
For example, when the motion sensorhub determines that the terminal device satisfies the head-down state and the ambient light data is less than (or less than or equal to) a preset first brightness threshold, the motion sensorhub may determine that the pocket mode monitoring result satisfies the pocket mode, for example setting the pocket mode monitoring result to 1. Alternatively, when the motion sensorhub determines that the terminal device does not satisfy the head-down state and/or the ambient light data is greater than or equal to (or greater than) a preset second brightness threshold, the motion sensorhub may determine that the pocket mode monitoring result does not satisfy the pocket mode, for example setting the pocket mode monitoring result to 0.
The second brightness threshold may be greater than or equal to the first brightness threshold. The values of the first and second brightness thresholds may be obtained by learning from historical touch data generated when the terminal device is in the pocket mode; the specific values of the thresholds are not limited in this embodiment of the application. A minimal sketch of the decision rule described above is given below.
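The following C++ sketch implements the decision rule of the two preceding paragraphs. The threshold values shown are hypothetical, and the handling of the band between the two thresholds (head down, but brightness between the first and second thresholds) is an assumption, since the embodiment does not specify it.

    struct PocketInputs {
        bool  headDown;    // derived from the acceleration and gyroscope data
        float ambientLux;  // from the ambient light data
    };

    // Returns the pocket mode monitoring result: 1 = satisfies the pocket
    // mode, 0 = does not satisfy the pocket mode.
    int pocketModeResult(const PocketInputs& in,
                         float firstLuxThreshold,   // e.g. 5 lux (assumed)
                         float secondLuxThreshold)  // >= first, e.g. 10 lux (assumed)
    {
        if (in.headDown && in.ambientLux < firstLuxThreshold) {
            return 1;  // head down and dark enough: pocket mode satisfied
        }
        if (!in.headDown || in.ambientLux >= secondLuxThreshold) {
            return 0;  // posture or brightness rules the pocket mode out
        }
        return 0;      // band between thresholds: assumption, not specified by the patent
    }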
It can be understood that the data for monitoring whether the terminal device satisfies the pocket mode may include other contents according to an actual scene, which is not limited in this embodiment of the application.
S607, the motion sensorhub sends the pocket mode monitoring result to the motion HAL.
For example, the motion sensorhub may call an interface, such as the pb_send_sensor_stream_event interface in the sensorhub, to send the pocket mode monitoring result to the motion HAL. Correspondingly, the motion HAL may receive the pocket mode monitoring result sent by the motion sensorhub.
S608, the motion HAL sends the pocket mode monitoring result to the motion HIDL.
For example, the motion HAL may call an interface, such as a MotionEventPoll interface, to assign the pocket mode monitoring result to a pointer of the HIDL layer and release a corresponding semaphore; when the HIDL layer detects that the corresponding semaphore has been released, it can receive the corresponding pocket mode monitoring result. Correspondingly, the motion HIDL may receive the pocket mode monitoring result sent by the motion HAL. A sketch of this handoff follows.
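A sketch of the pointer-plus-semaphore handoff described above, using POSIX semaphores for illustration; the function and variable names are assumptions, not the patent's code.

    #include <semaphore.h>
    #include <atomic>

    static std::atomic<int> g_pocketResult{0};
    static sem_t g_resultReady;

    // Call once at startup; the semaphore starts at 0 (no result pending).
    void initResultChannel() { sem_init(&g_resultReady, 0, 0); }

    // HAL side: publish a new pocket mode monitoring result.
    void halPublishResult(int result) {
        g_pocketResult.store(result, std::memory_order_release);
        sem_post(&g_resultReady);  // "release the corresponding semaphore"
    }

    // HIDL side: block until a result is available, then read it.
    int hidlPollResult() {
        sem_wait(&g_resultReady);  // woken when the HAL posts
        return g_pocketResult.load(std::memory_order_acquire);
    }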
S609, the motion HIDL sends the pocket mode monitoring result to the motion framework.
Illustratively, the motion HIDL sends the pocket mode monitoring result to the motion framework through the Return<void> Motion::pollMotionResult(pollMotionResult_cb _hidl_cb) interface. Correspondingly, the motion framework may receive the pocket mode monitoring result sent by the motion HIDL.
S610, the motion framework sends the pocket mode monitoring result to the AOD application.
For example, the motion framework may call an interface, such as a threadLoop interface, to return the pocket mode monitoring result to the AOD application. Correspondingly, the AOD application may receive the pocket mode monitoring result sent by the motion framework.
S611, the AOD application determines the display mode according to the pocket mode monitoring result.
In this embodiment of the application, when the pocket mode monitoring result (for example, identified as 1) indicates that the terminal device satisfies the pocket mode, the AOD application may instruct the display screen to turn off, so that the display screen switches from the DOZE state to the screen-off state, and the AOD application instructs the power manager in the framework layer to release the wakelock, whereupon the AP enters the sleep state through the corresponding mechanism. Alternatively, when the pocket mode monitoring result (for example, identified as 0) indicates that the terminal device does not satisfy the pocket mode, the AOD application may instruct the display screen to keep the AOD display; the display screen stays in the DOZE state and the AP remains in the awake state. An illustrative sketch of this control flow follows.
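The control flow of S611 can be summarized in the following C++ sketch, where the display and power-manager helpers are hypothetical stand-ins for the operations described above.

    #include <iostream>

    enum class DisplayState { Doze, ScreenOff };

    void setDisplayState(DisplayState s) {  // stand-in for the real display call
        std::cout << (s == DisplayState::Doze ? "display: DOZE (AOD)\n"
                                              : "display: screen off\n");
    }

    void releaseWakeLock() {                // stand-in for the power manager call
        std::cout << "wakelock released; AP may enter sleep\n";
    }

    void onPocketModeResult(int result) {
        if (result == 1) {  // pocket mode satisfied
            setDisplayState(DisplayState::ScreenOff);  // DOZE -> screen off
            releaseWakeLock();                         // let the AP sleep
        } else {            // result == 0: pocket mode not satisfied
            setDisplayState(DisplayState::Doze);       // keep the AOD display
            // the wakelock stays held, so the AP remains awake
        }
    }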
Illustratively, fig. 8 is a schematic interface diagram of another AOD provided in an embodiment of the present application. In the case of the AOD display shown in a in fig. 8, when the terminal device receives, based on the steps shown in S605-S611, a pocket mode monitoring result indicating that the pocket mode is satisfied, the terminal device may switch from the interface shown in a in fig. 8 to the interface shown in b in fig. 8, in which the terminal device is in the screen-off state.
Alternatively, in the case of the AOD display shown in a in fig. 8, when the terminal device receives, based on the steps shown in S605-S611, a pocket mode monitoring result indicating that the pocket mode is not satisfied, the terminal device may switch from the interface shown in a in fig. 8 to the interface shown in c in fig. 8, in which the terminal device maintains the AOD display.
The interface shown as a in fig. 8 and the interface shown as c in fig. 8 are similar to the interface shown in fig. 3, and are not repeated herein.
It can be understood that when the terminal device is in the pocket mode and has turned off the screen, even if the terminal device receives a trigger on the display screen, it can maintain the screen-off state and avoid the AOD display, ensuring that the AP stays in the sleep state and reducing power consumption.
Based on this, through the recognition of the pocket mode during AOD, the terminal device can avoid the situation where the AP is continuously woken up by false touches in the pocket, achieving the purpose of reducing power consumption.
In a possible implementation, after the monitoring result in the step shown in S611 indicates that the pocket mode is not satisfied and the AOD display is maintained, the terminal device may display according to the display mode of the AOD.
In one implementation, when the AOD display mode is the tap display, the terminal device may turn off the screen a period of time after the AOD display ends. Illustratively, fig. 9 is an interface schematic diagram of another AOD according to an embodiment of the present application.
In the case where the AOD display mode is the tap display, the terminal device may display from the first frame of the AOD, shown as a in fig. 9, to the last frame, shown as b in fig. 9, and turn off the screen a certain period of time, for example 7 seconds, after the AOD display ends, for example switching from the interface shown as b in fig. 9 to the interface shown as c in fig. 9. The interface shown as a in fig. 9 and the interface shown as b in fig. 9 may be similar to the interface shown in fig. 2, and are not repeated herein.
In another implementation, when the AOD display mode is the all-day display, the terminal device may continue to display the last frame of the AOD after the AOD display ends. Illustratively, fig. 10 is a schematic interface diagram of another AOD provided in an embodiment of the present application.
In the case where the AOD display mode is the all-day display, the terminal device may display from the first frame of the AOD, shown as a in fig. 10, to the last frame, shown as b in fig. 10, and hold the last frame shown as b in fig. 10 after the AOD display ends. The interface shown as a in fig. 10 and the interface shown as b in fig. 10 may be similar to the interface shown in fig. 2, and are not described herein again. A sketch of the two display-mode behaviors follows.
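The two display-mode behaviors can be summarized in the following sketch; the mode names and helper functions are illustrative, and the 7-second dwell is taken from the example above only.

    #include <chrono>
    #include <iostream>

    enum class AodMode { TapDisplay, AllDayDisplay };

    void scheduleScreenOff(std::chrono::seconds dwell) {  // hypothetical helper
        std::cout << "screen off in " << dwell.count() << " s\n";
    }

    void keepLastFrameDisplayed() {                       // hypothetical helper
        std::cout << "holding last frame of AOD animation\n";
    }

    // Called when the AOD animation has played to its last frame.
    void onAodAnimationFinished(AodMode mode) {
        if (mode == AodMode::TapDisplay) {
            scheduleScreenOff(std::chrono::seconds(7));  // tap display: time out
        } else {
            keepLastFrameDisplayed();                    // all-day display: hold
        }
    }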
It is understood that the interface provided by the embodiments of the present application is only an example, and does not constitute a further limitation to the embodiments of the present application.
The method for preventing false touch in the embodiment of the present application has been described above, and the following describes the terminal device that executes the method for preventing false touch in the embodiment of the present application. Those skilled in the art can understand that the method and the apparatus can be combined and referred to each other, and the terminal device provided in the embodiments of the present application can perform the steps in the above-mentioned method for preventing false touch.
As shown in fig. 11, fig. 11 is a schematic structural diagram of a device for preventing false touch according to an embodiment of the present application. The device for preventing false touch may be the terminal device in the embodiments of the present application. The device for preventing false touch includes: a display screen 1101 for displaying an image; one or more processors 1102; a memory 1103; a plurality of application programs; and one or more computer programs, wherein the one or more computer programs are stored in the memory 1103, and the one or more computer programs comprise instructions which, when executed by the device for preventing false touch, cause the device to perform the steps of the above-described method for preventing false touch.
Fig. 12 is a schematic hardware structure diagram of a device for preventing false touch according to an embodiment of the present disclosure. Referring to fig. 12, the device includes: a memory 1201, a processor 1202, and an interface circuit 1203. The device may also include a display screen 1204, where the memory 1201, the processor 1202, the interface circuit 1203, and the display screen 1204 may communicate, illustratively through a communication bus. The memory 1201 is used to store computer-executable instructions, which are controlled and executed by the processor 1202 and communicate through the interface circuit 1203, so as to implement the method for preventing false touch provided by the embodiments of the present application.
Optionally, the interface circuit 1203 may also include a transmitter and/or a receiver. Optionally, the processor 1202 may include one or more CPUs, or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the methods disclosed in this application may be implemented directly by a hardware processor, or by a combination of hardware and software modules in the processor.
In a possible implementation, the computer-executable instructions in the embodiments of the present application may also be referred to as application program code, which is not specifically limited in the embodiments of the present application.
The device for preventing false touch provided by the embodiment of the application is used for executing the method for preventing false touch of the embodiment, and the technical principle and the technical effect are similar, and are not repeated here.
The embodiment of the application provides a terminal device, the structure of which is shown in fig. 4. The memory of the terminal device may be configured to store at least one program instruction, and the processor is configured to execute the at least one program instruction, so as to implement the technical solutions of the above-mentioned method embodiments. The implementation principle and technical effects are similar to those of the related method embodiments, and are not described herein again.
The embodiment of the application provides a chip. The chip comprises a processor for calling a computer program in the memory to execute the technical solution in the above embodiments. The principle and technical effects are similar to those of the related embodiments, and are not described herein again.
The embodiment of the present application provides a computer program product, which enables a terminal device to execute the technical solutions in the above embodiments when the computer program product runs on an electronic device. The principle and technical effects are similar to those of the related embodiments, and are not described herein again.
The embodiment of the present application provides a computer-readable storage medium, on which program instructions are stored, and when the program instructions are executed by a terminal device, the terminal device is enabled to execute the technical solutions of the above embodiments. The principle and technical effects are similar to those of the related embodiments, and are not described herein again.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above embodiments are only for illustrating the embodiments of the present invention and are not to be construed as limiting the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made on the basis of the embodiments of the present invention shall be included in the scope of the present invention.

Claims (19)

1. A method for preventing false touch, applied to a terminal device, the method comprising:
the terminal device receives a first operation for implementing always-on display (AOD);
in response to the first operation, the terminal device obtains a monitoring result based on first data; the first data comprises acceleration data, angular acceleration data, and ambient light data;
when the monitoring result indicates that the terminal device satisfies a pocket mode, the terminal device switches from the AOD to screen-off;
or, when the monitoring result indicates that the terminal device does not satisfy the pocket mode, the terminal device maintains the AOD.
2. The method of claim 1, wherein the terminal device obtaining the monitoring result based on the first data comprises:
when the acceleration data and the angular acceleration data indicate that the terminal device meets a first preset direction and the ambient light data is smaller than a first brightness threshold, the terminal device determines that the monitoring result is that the terminal device meets the pocket mode;
or when the acceleration data and the angular acceleration data indicate that the terminal device does not satisfy a first preset direction, and/or the ambient light data is greater than a second brightness threshold, the terminal device determines that the monitoring result is that the terminal device does not satisfy the pocket mode; wherein the second brightness threshold is greater than or equal to the first brightness threshold.
3. The method of claim 2, wherein the first preset direction is a head-down direction of the terminal device.
4. The method according to any one of claims 1 to 3, wherein the display mode of the AOD comprises a first display mode or a second display mode, and the terminal device performing display through the AOD application comprises:
when the display mode of the AOD application is the first display mode, the terminal device displays a preset animation through the AOD application and turns off the screen after the preset animation is displayed;
or when the display mode of the AOD application is the second display mode, the terminal device displays the preset animation through the AOD application, and displays the last frame of picture in the preset animation after the preset animation is displayed.
5. The method according to any of claims 1-4, wherein the terminal device comprises: an AOD application and a display screen, the method further comprising:
the terminal device receiving a first operation for implementing AOD comprises: the AOD application receiving the first operation for implementing AOD;
in response to the first operation, the terminal device obtaining a monitoring result based on first data comprises: in response to the first operation, the AOD application obtaining the monitoring result based on the first data;
when the monitoring result indicates that the terminal device satisfies the pocket mode, the terminal device switching from the AOD to screen-off comprises: when the monitoring result indicates that the terminal device satisfies the pocket mode, the AOD application instructing the display screen to switch from the AOD to screen-off;
or, when the monitoring result indicates that the terminal device does not satisfy the pocket mode, the terminal device maintaining the AOD comprises: when the monitoring result indicates that the terminal device does not satisfy the pocket mode, the AOD application instructing the display screen to maintain the AOD.
6. The method of claim 5, wherein the terminal device further comprises: a motion framework and a motion sensor control center, and wherein, in response to the first operation, the AOD application obtaining the monitoring result based on the first data comprises:
in response to the first operation, the AOD application sending a first message to the motion framework; the first message is used for indicating the motion framework to register a listener; the listener is used for monitoring whether the terminal equipment is in a pocket mode;
in response to the first message, the motion framework registering the listener;
the motion framework sends a second message to the motion sensor control center; the second message is used for instructing the motion framework to start the listener;
in response to the second message, the motion sensor control center obtains the monitoring result based on the first data;
the motion sensor control center sends the monitoring result to the AOD application;
and the AOD application obtains the monitoring result.
7. The method of claim 6, wherein the motion framework registering the listener comprises:
the motion framework calls a new HwExtMotion(MOTION_TYPE_HEAD_DOWN_WITH_ALS) interface to register the listener.
8. The method of claim 6 or 7, wherein the motion sensor control center derives the monitoring result based on the first data, comprising:
when the acceleration data and the angular acceleration data indicate that the terminal device meets a first preset direction and the ambient light data are smaller than a first brightness threshold, the motion sensor control center determines that the monitoring result is that the terminal device meets the pocket mode;
or when the acceleration data and the angular acceleration data indicate that the terminal device does not satisfy a first preset direction, and/or the ambient light data is greater than a second brightness threshold, the motion sensor control center determines that the monitoring result is that the terminal device does not satisfy the pocket mode; wherein the second brightness threshold is greater than or equal to the first brightness threshold.
9. The method according to any of claims 6-8, wherein the terminal device further comprises: a motion hardware abstraction layer interface description language and a motion hardware abstraction layer, and the motion framework sending the second message to the motion sensor control center comprises:
the motion framework sends the second message to the motion hardware abstraction layer interface description language;
the motion hardware abstraction layer interface description language sends the second message to the motion hardware abstraction layer;
the motion hardware abstraction layer sends the second message to the motion sensor control center.
10. The method of claim 9, wherein the motion framework sends the second message to the motion hardware abstraction layer interface description language, comprising:
the motion framework calls an int HwExtMotionService::enableMotion(int motionType, int action, int delay) interface, and sends the second message to the motion hardware abstraction layer interface description language.
11. The method of claim 9, wherein the motion hardware abstraction layer interface description language sending the second message to the motion hardware abstraction layer comprises:
the motion hardware abstraction layer interface description language calls a Return<int32_t> Motion::startMotionReco(int32_t recoMotion) interface, and sends the second message to the motion hardware abstraction layer.
12. The method according to any one of claims 6-11, wherein the motion sensor control center sends the monitoring result to the AOD application, comprising:
the motion sensor control center sends the monitoring result to the motion hardware abstraction layer;
the motion hardware abstraction layer sends the monitoring result to the motion hardware abstraction layer interface description language;
the motion hardware abstraction layer interface description language sends the monitoring result to the motion framework;
and the motion framework sends the monitoring result to the AOD application.
13. The method of claim 12, wherein the motion sensor control center sends the monitoring results to the motion hardware abstraction layer, comprising:
and the motion sensor control center calls a pb_send_sensor_stream_event interface and sends the monitoring result to the motion hardware abstraction layer.
14. The method of claim 12, wherein the motion hardware abstraction layer interface description language sending the monitoring result to the motion framework comprises:
and the motion hardware abstraction layer interface description language calls a Return<void> Motion::pollMotionResult(pollMotionResult_cb _hidl_cb) interface and sends the monitoring result to the motion framework.
15. The method of any of claims 5-14, wherein the display mode of the AOD comprises a first display mode or a second display mode, and the AOD application instructing the display screen to maintain the AOD comprises:
when the display mode of the AOD application is the first display mode, the AOD application instructs the display screen to display a preset animation, and the screen is turned off after the preset animation is displayed;
or, when the display mode of the AOD application is the second display mode, the AOD application instructs the display screen to display the preset animation, and the last frame of the preset animation is displayed after the preset animation is displayed.
16. A terminal device, characterized in that the terminal device comprises a processor for invoking a computer program in a memory for executing the method according to any of claims 1-15.
17. A computer-readable storage medium, having stored thereon computer instructions, which, when run on a terminal device, cause the terminal device to perform the method of any one of claims 1-10.
18. A computer program product, characterized in that it comprises a computer program which, when run, causes a terminal device to perform the method according to any one of claims 1-15.
19. A chip, characterized in that the chip comprises a processor for calling a computer program in a memory for performing the method according to any of claims 1-15.
CN202111408327.9A 2021-11-19 2021-11-19 Method and device for preventing false touch Active CN115016629B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310591455.4A CN116820222A (en) 2021-11-19 2021-11-19 Method and device for preventing false touch
CN202111408327.9A CN115016629B (en) 2021-11-19 2021-11-19 Method and device for preventing false touch

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111408327.9A CN115016629B (en) 2021-11-19 2021-11-19 Method and device for preventing false touch

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310591455.4A Division CN116820222A (en) 2021-11-19 2021-11-19 Method and device for preventing false touch

Publications (2)

Publication Number Publication Date
CN115016629A (en) 2022-09-06
CN115016629B CN115016629B (en) 2023-06-06

Family

ID=83065073

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310591455.4A Pending CN116820222A (en) 2021-11-19 2021-11-19 Method and device for preventing false touch
CN202111408327.9A Active CN115016629B (en) 2021-11-19 2021-11-19 Method and device for preventing false touch

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202310591455.4A Pending CN116820222A (en) 2021-11-19 2021-11-19 Method and device for preventing false touch

Country Status (1)

Country Link
CN (2) CN116820222A (en)


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150153803A1 (en) * 2013-12-04 2015-06-04 Sony Corporation Apparatus and method for controlling a suspended state
CN108646973A (en) * 2018-05-10 2018-10-12 Oppo广东移动通信有限公司 Put out screen display methods, mobile terminal and computer readable storage medium
CN108696639A (en) * 2018-05-10 2018-10-23 Oppo广东移动通信有限公司 Put out screen display methods, mobile terminal and storage medium
CN108900710A (en) * 2018-06-29 2018-11-27 Oppo(重庆)智能科技有限公司 False-touch prevention method, apparatus, mobile terminal and the storage medium of mobile terminal
CN111542802A (en) * 2018-09-21 2020-08-14 华为技术有限公司 Method for shielding touch event and electronic equipment
CN111596781A (en) * 2019-02-20 2020-08-28 华为技术有限公司 False touch prevention method and terminal
WO2021017836A1 (en) * 2019-07-30 2021-02-04 华为技术有限公司 Method for controlling display of large-screen device, and mobile terminal and first system
CN111290690A (en) * 2020-01-15 2020-06-16 Oppo(重庆)智能科技有限公司 Terminal control method and device, mobile terminal and storage medium
WO2021213274A1 (en) * 2020-04-24 2021-10-28 深圳市万普拉斯科技有限公司 Method and apparatus for preventing false touch of mobile terminal, and computer device and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116208705A (en) * 2023-04-24 2023-06-02 荣耀终端有限公司 Equipment abnormality recovery method and electronic equipment
CN116208705B (en) * 2023-04-24 2023-09-05 荣耀终端有限公司 Equipment abnormality recovery method and electronic equipment

Also Published As

Publication number Publication date
CN115016629B (en) 2023-06-06
CN116820222A (en) 2023-09-29

Similar Documents

Publication Publication Date Title
US11703960B2 (en) Air mouse mode implementation method and related device
CN111552451B (en) Display control method and device, computer readable medium and terminal equipment
US20220174143A1 (en) Message notification method and electronic device
CN112312366B (en) Method, electronic equipment and system for realizing functions through NFC (near field communication) tag
US20230189366A1 (en) Bluetooth Communication Method, Terminal Device, and Computer-Readable Storage Medium
CN110557740A (en) Electronic equipment control method and electronic equipment
EP4033341A1 (en) Always on display method and mobile device
EP4132202A1 (en) Data downloading method and apparatus, and terminal device
US20240098354A1 (en) Connection establishment method and electronic device
CN113596919B (en) Data downloading method and device and terminal equipment
CN114221402A (en) Charging method and device of terminal equipment and terminal equipment
CN111935705A (en) Data service management method and device, computer readable medium and terminal equipment
CN115016629B (en) Method and device for preventing false touch
EP4280596A1 (en) Video call method and related device
CN114375027A (en) Method and device for reducing power consumption
CN113391735A (en) Display form adjusting method and device, electronic equipment and storage medium
WO2024055881A1 (en) Clock synchronization method, electronic device, system, and storage medium
WO2023207715A1 (en) Screen-on control method, electronic device, and computer-readable storage medium
CN115665632B (en) Audio circuit, related device and control method
WO2023020420A1 (en) Volume display method, electronic device, and storage medium
WO2024093614A1 (en) Device input method and system, and electronic device and storage medium
WO2022252786A1 (en) Window split-screen display method and electronic device
CN111801931B (en) Method for switching on and hanging up telephone when call occurs SRVCC switch
CN115580541A (en) Information synchronization method and electronic equipment
CN114691066A (en) Application display method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant