WO2023098208A1 - Method and apparatus for preventing false touches (防误触的方法和装置) - Google Patents

Method and apparatus for preventing false touches (防误触的方法和装置)

Info

Publication number
WO2023098208A1
WO2023098208A1 (application PCT/CN2022/117611)
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
information
ambient light
screen
threshold
Prior art date
Application number
PCT/CN2022/117611
Other languages
English (en)
French (fr)
Inventor
邸皓轩
李丹洪
张晓武
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority: EP22857074.3A (published as EP4212999A4)
Publication of WO2023098208A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G06F 3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 1/00 Details not covered by groups G06F 3/00-G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1626 Portable computers with a single-body enclosure integrating a flat display, e.g. personal digital assistants [PDAs]
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615-G06F 1/1626
    • G06F 1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635-G06F 1/1675
    • G06F 1/1694 Integrated I/O peripherals being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G06F 1/26 Power supply means, e.g. regulation thereof
    • G06F 1/32 Means for saving power
    • G06F 1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F 1/3206 Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F 1/3215 Monitoring of peripheral devices
    • G06F 1/3231 Monitoring the presence, absence or movement of users
    • G06F 1/3234 Power saving characterised by the action undertaken
    • G06F 1/325 Power saving in peripheral device
    • G06F 1/3262 Power saving in digitizer or tablet
    • G06F 1/3265 Power saving in display device
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 User interfaces with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454 Adapting functionality according to context-related or environment-related conditions
    • H04M 1/72463 Adapting functionality to restrict the functionality of the device
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00 Reducing energy consumption in communication networks
    • Y02D 30/70 Reducing energy consumption in wireless communication networks

Definitions

  • the present application relates to the technical field of terminals, and in particular, to a method and device for preventing false touches.
  • when the screen of a touch-screen terminal is lit, contact between the screen and a capacitive substance is prone to causing false touches.
  • if the screen stays lit for a long time it consumes more power; repeated accidental touches of the fingerprint sensor may lock the fingerprint unlocking function, accidental touches may trigger false alarms, etc.
  • the present application provides a method, apparatus, computer-readable storage medium and computer program product for preventing false touches, which can effectively solve the false-touch problem in pocket mode and greatly improve the user experience.
  • a method for preventing false touches, including:
  • based on a quaternion of the terminal, determining first information and attitude angle information of the terminal, where the first information is used to identify whether the terminal is in a head-down state, and the attitude angle information is used to identify the attitude of the terminal;
  • based on the first information, the attitude angle information, the motion information, and the ambient light information, determining that the terminal is in a first mode;
  • the first mode is used to indicate that the state information of the terminal meets the corresponding preset conditions of the terminal.
  • the terminal enters a screen-off state.
  • the terminal being in the head-down pocket mode requires the following conditions: the first information satisfies a first preset condition, the attitude angle information satisfies a second preset condition, the motion information satisfies a third preset condition, and the ambient light information satisfies a fourth preset condition.
  • the foregoing method may be executed by a terminal device or a chip in the terminal device.
  • the terminal's information (including but not limited to ambient light information, first information, attitude angle information, and motion information) is detected cooperatively by multiple sensors, and whether the terminal is in the first mode is determined from this information. If the terminal is determined to be in the head-down pocket mode, it enters the screen-off state in pocket mode to save power; this also avoids falsely waking the terminal from the screen-off or lock-screen state, effectively preventing false touches.
  • determining that the terminal is in the first mode based on the first information includes: the first information satisfies a first preset condition;
  • the first information satisfies a first preset condition, including: the product of the first gravity component and the second gravity component of the terminal is negative; the absolute value of the first gravity component is greater than a first threshold; and the absolute value of the second gravity component is smaller than the first threshold; wherein the first gravity component and the second gravity component are calculated based on the quaternion.
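As a rough illustration, the head-down check above can be sketched as follows. The (w, x, y, z) quaternion order, the mapping of the first and second gravity components to the body-frame y and z axes, and the 0.6 threshold are all assumptions, since the patent does not fix these conventions:

```python
def gravity_components(q):
    """Project the unit gravity vector into the device body frame
    from a unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    gx = 2.0 * (x * z - w * y)
    gy = 2.0 * (w * x + y * z)
    gz = 1.0 - 2.0 * (x * x + y * y)
    return gx, gy, gz

def first_preset_condition(q, first_threshold=0.6):
    """Head-down check: the product of the two gravity components is
    negative, |first| exceeds the threshold, and |second| stays below
    it. The y/z axis choice and threshold value are assumptions."""
    _, gy, gz = gravity_components(q)
    return gy * gz < 0 and abs(gy) > first_threshold and abs(gz) < first_threshold
```

With the identity quaternion (device flat, screen up) the gravity vector lands entirely on the z axis and the condition fails; a large tilt about the x axis moves gravity onto the y axis with opposite sign on z, satisfying it.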
  • determining that the terminal is in the first mode based on the attitude angle information includes: the attitude angle information satisfies a second preset condition, namely the pitch angle of the terminal is within a preset angle range; wherein the pitch angle of the terminal is calculated based on the quaternion.
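One possible reading of the pitch computation uses the standard ZYX Tait–Bryan extraction from the quaternion; the patent does not state its angle convention, and the range bounds below are purely illustrative:

```python
import math

def pitch_angle_deg(q):
    """Pitch in degrees from a unit quaternion (w, x, y, z), using the
    ZYX Tait-Bryan convention (an assumed convention)."""
    w, x, y, z = q
    s = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))  # clamp against rounding
    return math.degrees(math.asin(s))

def second_preset_condition(q, lo_deg=-90.0, hi_deg=-30.0):
    """Attitude check: pitch within a preset range. The bounds are
    illustrative placeholders, not values from the patent."""
    return lo_deg <= pitch_angle_deg(q) <= hi_deg
```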
  • determining that the terminal is in the first mode based on the motion information includes: the motion information satisfies a third preset condition, namely the resultant accelerometer velocity value in each of n consecutive frames is less than or equal to a resultant velocity threshold, where n ≥ 2 and n is an integer.
  • the motion information satisfying the third preset condition further includes: in the n consecutive frames, the difference between the resultant accelerometer velocity value of the i-th frame and that of the (i-1)-th frame is less than a predetermined difference threshold, i ∈ [2, n]. Judging the terminal's motion state from the resultant accelerometer velocity values of n consecutive frames avoids misjudging the motion state when the terminal shakes strongly in pocket mode, helping ensure the accuracy of the judgment result.
  • determining that the terminal is in the first mode based on the ambient light information includes: the ambient light information satisfies a fourth preset condition, namely the ambient light intensity of the terminal is less than or equal to the first light threshold.
  • when the ambient light intensity of the terminal is greater than the first light threshold, the method further includes: detecting whether the ambient light intensity is greater than a second light threshold, the second light threshold being greater than the first light threshold; when the ambient light intensity is greater than the second light threshold, the screen of the terminal is turned on; when the ambient light intensity is less than or equal to the second light threshold, the terminal enters the screen-off state.
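The two-threshold light logic above can be sketched as a simple hysteresis decision; the lux values are illustrative placeholders, not thresholds from the patent:

```python
def light_decision(lux, first_threshold=5.0, second_threshold=50.0):
    """Screen action implied by the ambient light level: at or below
    the first threshold the terminal stays screen-off (dark pocket);
    above the second it lights up; in the band between the two it
    remains off, so the second threshold acts as hysteresis."""
    if lux <= first_threshold:
        return "screen_off"
    if lux > second_threshold:
        return "screen_on"
    return "screen_off"
```

The gap between the two thresholds keeps dim, ambiguous readings (a loosely woven pocket, a hand over the sensor) from flickering the screen on and off.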
  • the above method for preventing false touches is applicable in this case as well.
  • a proximity light sensor is disposed in the terminal. Before determining that the terminal is in the first mode based on the first information, the attitude angle information, the motion information, and the ambient light information, the method further includes: detecting reflected light information of the terminal by the proximity light sensor; when no reflected light is detected, determining whether the terminal is in the head-down pocket mode according to the first information, the attitude angle information, the motion information, and the ambient light information.
  • when the proximity light sensor determines that no object is near, it may be further determined, in combination with the above first information, attitude angle information, motion information, and ambient light information, whether the terminal is in the head-down pocket mode, obtaining a more accurate judgment result.
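The proximity-gated decision described above might be combined as follows. This is a sketch: the boolean inputs stand for the four preset conditions, and the case where reflected light is detected is simply left to the proximity path (returning False here) rather than modeled in full:

```python
def head_down_pocket_mode(proximity_reflected, first_ok, attitude_ok,
                          motion_ok, light_ok):
    """Only when the proximity light sensor detects no reflected light
    are the four preset conditions combined; all four must hold for
    the terminal to be judged in the head-down pocket mode."""
    if proximity_reflected:
        # an object is near; this sketch defers to the proximity path
        return False
    return first_ok and attitude_ok and motion_ok and light_ok
```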
  • before the terminal enters the screen-off state, the method further includes: detecting the interface of the terminal; if the interface is an always-on-display (AOD) interface with the screen off, the AOD is turned off; if the interface is a lock screen interface, the terminal enters the anti-false-touch mode. Detecting the interface before entering the screen-off state, and handling each case according to the actual situation, can further save power.
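The interface check before screen-off can be sketched as below; the interface labels and action names are illustrative, not identifiers from the patent:

```python
def handle_interface_before_screen_off(interface):
    """Map the current interface to the action taken before entering
    the screen-off state: an AOD interface turns the AOD off, a lock
    screen interface enters the anti-false-touch mode."""
    if interface == "aod":
        return "turn_aod_off"
    if interface == "lock_screen":
        return "enter_anti_false_touch_mode"
    return "no_action"
```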
  • an apparatus for preventing false touches, including units for performing any one of the methods of the first aspect.
  • the apparatus may be a terminal (or terminal equipment), or a chip in the terminal (or terminal equipment).
  • the device includes an input unit, a display unit and a processing unit.
  • the processing unit may be a processor, the input unit may be a communication interface, and the display unit may be a graphics processing module and a screen; the terminal may also include a memory, which is used to store computer program code; when the processor executes the computer program code stored in the memory, the terminal is caused to execute any method of the first aspect.
  • the processing unit may be a logic processing unit inside the chip, the input unit may be an output interface, a pin, a circuit, or the like, and the display unit may be a graphics processing unit inside the chip; the chip may also include a memory, which can be a memory inside the chip (for example, a register, a cache, etc.) or a memory located outside the chip (for example, a read-only memory, a random access memory, etc.); the memory is used to store computer program code, and when the processor executes the computer program code stored in the memory, the chip is caused to perform any one method of the first aspect.
  • the processing unit is configured to: obtain a quaternion of the terminal through an acceleration sensor and a gyroscope sensor, the quaternion being used to characterize the attitude of the terminal; determine, based on the quaternion, first information and attitude angle information of the terminal, where the first information is used to identify whether the terminal is in a head-down state and the attitude angle information is used to identify the attitude of the terminal; detect motion information of the terminal through the acceleration sensor, the motion information being used to identify the motion state of the terminal; detect ambient light information of the terminal through an ambient light sensor, the ambient light information being used to identify the light intensity of the terminal's environment; and determine, based on the first information, the attitude angle information, the motion information, and the ambient light information, that the terminal is in a first mode, the first mode being used to indicate that the state information of the terminal meets the corresponding preset conditions; the display unit then enters the screen-off state.
  • the determining by the processing unit that the terminal is in the first mode based on the first information includes: the first information satisfies a first preset condition, including: the product of the first gravity component and the second gravity component of the terminal is negative; the absolute value of the first gravity component is greater than a first threshold; and the absolute value of the second gravity component is smaller than the first threshold; wherein the first gravity component and the second gravity component are calculated based on the quaternion.
  • the determining by the processing unit that the terminal is in the first mode based on the attitude angle information includes: the attitude angle information satisfies a second preset condition, namely the pitch angle of the terminal is within the preset angle range; wherein the pitch angle of the terminal is calculated based on the quaternion.
  • the determining by the processing unit that the terminal is in the first mode based on the motion information includes: the motion information satisfies a third preset condition, including: the resultant accelerometer velocity value in each of n consecutive frames is less than or equal to the resultant velocity threshold, n ≥ 2, and n is an integer.
  • the motion information satisfying the third preset condition further includes: the difference between the resultant accelerometer velocity value of the i-th frame and that of the (i-1)-th frame in the n consecutive frames is less than a predetermined difference threshold, i ∈ [2, n].
  • the determining by the processing unit that the terminal is in the first mode based on the ambient light information includes: the ambient light information satisfies a fourth preset condition, namely the ambient light intensity of the terminal is less than or equal to the first light threshold.
  • the processing unit is further configured to: when the ambient light intensity of the terminal is greater than the first light threshold, detect through the ambient light sensor whether the ambient light intensity is greater than a second light threshold, the second light threshold being greater than the first light threshold; when the ambient light intensity is greater than the second light threshold, the screen of the terminal is turned on; when the ambient light intensity is less than or equal to the second light threshold, the terminal enters the screen-off state.
  • the processing unit is further configured to: detect reflected light information of the terminal through a proximity light sensor; when no reflected light is detected, determine whether the terminal is in the head-down pocket mode according to the first information, the attitude angle information, the motion information, and the ambient light information.
  • before the terminal enters the screen-off state, the processing unit is further configured to: detect the interface of the terminal; if the interface is an AOD interface with the screen off, the display unit turns the AOD of the terminal off; if the interface is a lock screen interface, the terminal enters the anti-false-touch mode.
  • a computer-readable storage medium storing computer program code which, when executed by the apparatus for preventing false touches, causes the apparatus to execute any one of the methods of the first aspect.
  • a computer program product comprising computer program code which, when run by the apparatus for preventing false touches, causes the apparatus to execute any one of the methods of the first aspect.
  • FIG. 1 is an example diagram of an application scenario of an embodiment of the present application;
  • FIG. 2 is a schematic diagram of a hardware system applicable to the electronic device of the present application;
  • FIG. 3 is a schematic diagram of a software system applicable to the electronic device of the present application;
  • FIG. 4 is an example diagram of the coordinate system defined by the mobile phone;
  • FIG. 5 is a schematic flowchart of a method for preventing false touches according to an embodiment of the present application;
  • FIG. 6 is a specific flowchart of the method for preventing false touches according to an embodiment of the present application;
  • FIG. 7 is an example diagram of a user holding the terminal;
  • FIG. 8 is an example diagram of the interface display of the anti-false-touch mode;
  • FIG. 9 is an example diagram of an interface for enabling the anti-false-touch mode.
  • the false touch prevention method provided in the embodiment of the present application can be applied to a terminal with a touch screen.
  • the terminal may be, for example, a mobile phone, a tablet computer, a multimedia playback device, an e-book reader, a personal computer, a personal digital assistant (personal digital assistant, PDA), a smart watch, and the like.
  • This application does not limit the specific form of the terminal.
  • the technical solutions of the embodiments of the present application are applied to a scenario where the terminal is in a backpack or a pocket of clothing (such as clothes, trousers, etc.).
  • the user can place the terminal in a clothes pocket, trousers pocket or backpack.
  • the scenario in which "the terminal is in a pocket of a backpack or clothing (such as clothes, trousers, etc.)" may be referred to as a pocket mode.
  • the terminal in the pocket mode includes the following common postures: vertical screen head up, vertical screen head down, screen up, screen down, horizontal screen upright (there may be a certain inclination angle), etc.
  • the vertical screen is the most common state in which the terminal is placed in a pocket in daily life.
  • the user's behavior scene includes but not limited to the following behaviors: walking, running, sitting, standing, lying down, jumping, riding, etc.
  • FIG. 1 shows an example diagram of an application scenario of an embodiment of the present application.
  • the terminal 12 is placed in the backpack 11 .
  • the user has no need to use the terminal 12 .
  • while the terminal 12 is in the backpack 11, it may come into contact with the backpack 11 itself or a capacitive substance (metal conductive material, skin, etc.) contained in the backpack 11, thereby causing false touches.
  • the backpack 11 shown in FIG. 1 can also be replaced by a clothing pocket, which is not specifically limited.
  • FIG. 1 is only a schematic illustration of an application scenario of the present application, which does not limit the embodiment of the present application, and the present application is not limited thereto.
  • the objects in contact with the terminal include but are not limited to: clothing pockets, backpacks, or capacitive substances contained therein (metal conductive objects, skin, etc.).
  • the screen-on state includes the always-on display (AOD) being on and the screen being lit on the lock screen.
  • objects in contact with the terminal may trigger it multiple times, resulting in multiple false touches that seriously affect the user experience (for example, multiple false touches can lock the fingerprint unlocking function, multiple wrong password entries can lock the password input function, and an accidental touch of the emergency call function can cause a false alarm, etc.).
  • AOD means that, without lighting up the entire screen, part of the terminal's screen is lit to display some important information.
  • the technical solution provided by the embodiments of this application uses multiple sensors to cooperatively detect terminal information (including but not limited to ambient light information, first information, attitude angle information, and motion information), and determines from this information whether the terminal is in the head-down pocket mode. If the terminal is determined to be in the head-down pocket mode, it enters the screen-off state in pocket mode to save power; this also avoids falsely waking the terminal from the screen-off or lock-screen state, effectively preventing false touches.
  • Fig. 2 shows a hardware system applicable to the electronic equipment of this application.
  • the electronic device 100 may be a mobile phone, a smart screen, a tablet computer, a wearable electronic device, a vehicle-mounted electronic device, an augmented reality (AR) device, a virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a projector, etc.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure shown in FIG. 2 does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than those shown in FIG. 2, or a combination of some of the components shown in FIG. 2, or subcomponents of some of the components shown in FIG. 2.
  • the proximity light sensor 180G shown in FIG. 2 may be optional.
  • the components shown in FIG. 2 can be realized in hardware, software, or a combination of software and hardware.
  • Processor 110 may include one or more processing units.
  • the processor 110 may include at least one of the following processing units: an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and a neural-network processing unit (NPU).
  • the controller can generate operation control signals according to instruction opcodes and timing signals, completing control of instruction fetch and execution.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs the instruction or data again, it can be fetched directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • the connection relationship between the modules shown in FIG. 2 is only schematic, and does not constitute a limitation on the connection relationships between the modules of the electronic device 100.
  • each module of the electronic device 100 may also adopt a combination of various connection modes in the foregoing embodiments.
  • the electronic device 100 can realize the display function through the GPU, the display screen 194 and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • Display 194 may be used to display images or video.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a mini light-emitting diode (Mini LED), a micro light-emitting diode (Micro LED), a micro OLED (Micro OLED), or quantum dot light-emitting diodes (QLED).
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 , and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • light passes through the lens to the photosensitive element of the camera, which converts the optical signal into an electrical signal; the photosensitive element then transmits the electrical signal to the ISP, which processes it and converts it into an image visible to the naked eye.
  • the ISP can run algorithms to optimize image noise, brightness, and color, and can also optimize parameters such as the exposure and color temperature of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts digital image signals into standard image signals in formats such as red green blue (RGB) and YUV.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • the electronic device 100 can implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and can also be used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 or some functional modules of the audio module 170 may be set in the processor 110 .
  • Speaker 170A, also known as a "horn", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music or make a hands-free call through the speaker 170A.
  • Receiver 170B, also known as an earpiece, is used to convert audio electrical signals into sound signals.
  • when the user answers a call or listens to a voice message on the electronic device 100, the voice can be heard by holding the receiver 170B close to the ear.
  • Microphone 170C, also known as a "mic" or "mouthpiece", is used to convert sound signals into electrical signals.
  • a user can input a sound signal into the microphone 170C by speaking close to it.
  • the earphone interface 170D is used for connecting wired earphones.
  • the earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • pressure sensor 180A may be a resistive pressure sensor, an inductive pressure sensor or a capacitive pressure sensor.
  • the capacitive pressure sensor may include at least two parallel plates with conductive materials.
  • touch operations acting on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short-message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the same icon, an instruction to create a new short message is executed.
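The pressure-threshold dispatch described above can be sketched as follows. This is a minimal illustration; the threshold value and the function/action names are assumptions for the example, not values from the patent:

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # illustrative, unit-less example value


def sms_icon_action(touch_intensity: float) -> str:
    """Map touch intensity on the short-message icon to an instruction,
    as described above (action names are illustrative)."""
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"   # below the first pressure threshold
    return "new_sms"        # at or above the first pressure threshold
```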
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • the angular velocity of the electronic device 100 around three axes may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, and calculates the distance that the lens module needs to compensate according to the angle, so that the lens can counteract the shake of the electronic device 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used in scenarios such as navigation and somatosensory games.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip leather case.
  • the electronic device 100 can detect opening and closing of the clamshell according to the magnetic sensor 180D.
  • the electronic device 100 may set features such as automatic unlocking of the flip cover according to the detected opening and closing state of the leather case or the opening and closing state of the flip cover.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally x-axis, y-axis and z-axis). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The acceleration sensor 180E can also be used to identify the posture of the electronic device 100 as an input parameter for application programs such as horizontal and vertical screen switching and pedometer.
  • the quaternion of the electronic device 100 can be acquired through the acceleration sensor 180E and the gyro sensor 180B.
  • the motion information of the electronic device 100 can be detected by the acceleration sensor 180E.
  • the distance sensor 180F is used to measure distance.
  • the electronic device 100 may measure the distance by infrared or laser. In some embodiments, for example, in a shooting scene, the electronic device 100 can use the distance sensor 180F for distance measurement to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode.
  • the LEDs may be infrared LEDs.
  • the electronic device 100 emits infrared light through the LED.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When the reflected light is detected, the electronic device 100 may determine that there is an object nearby. When no reflected light is detected, the electronic device 100 may determine that there is no object nearby.
  • the electronic device 100 can use the proximity light sensor 180G to detect whether the user is holding the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
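The proximity-based screen-off decision described above can be sketched as a pair of checks. The threshold value and function names are illustrative assumptions, not taken from the patent:

```python
def object_nearby(reflected_ir: float, threshold: float = 1.0) -> bool:
    """A photodiode reading above the threshold means reflected infrared
    light was detected, i.e. an object is nearby (threshold illustrative)."""
    return reflected_ir > threshold


def should_screen_off(in_call: bool, reflected_ir: float) -> bool:
    """During a call, a nearby object (the user's ear) turns the screen
    off to save power, as described above."""
    return in_call and object_nearby(reflected_ir)
```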
  • the proximity light sensor 180G can also be used for automatic unlocking and automatic screen locking in leather-case mode or pocket mode. It should be understood that the proximity light sensor 180G described in FIG. 2 may be an optional component. In some scenarios, an ultrasonic sensor may be used instead of the proximity light sensor 180G to detect proximity.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket, so as to prevent accidental touch.
  • ambient light information of the terminal may be detected by the ambient light sensor 180L.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement functions such as unlocking, accessing the application lock, taking pictures, and answering incoming calls.
  • the touch sensor 180K is also referred to as a touch device.
  • the touch sensor 180K may be disposed on the display screen 194; the touch sensor 180K and the display screen 194 together form a touchscreen, which is also called a "touch-control screen".
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor 180K may transmit the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 and disposed at a different position from the display screen 194 .
  • Keys 190 include a power key and a volume key.
  • the key 190 can be a mechanical key or a touch key.
  • the electronic device 100 can receive key input signals and implement functions related to the key input signals.
  • the motor 191 can generate vibrations.
  • the motor 191 can be used for incoming call notification, and can also be used for touch feedback.
  • the motor 191 can generate different vibration feedback effects for touch operations on different application programs. For touch operations acting on different areas of the display screen 194, the motor 191 can also generate different vibration feedback effects. Different application scenarios (for example, time reminder, receiving information, alarm clock and games) may correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the processor 110 may determine first information and attitude angle information of the terminal based on the quaternion of the terminal, where the first information is used to identify whether the terminal is head-down and the attitude angle information is used to identify the attitude of the terminal; based on the first information, the attitude angle information, the motion information, and the ambient light information, the processor 110 determines that the electronic device 100 is in the first mode.
  • the hardware system of the electronic device 100 is described in detail above, and the software system of the electronic device 100 is introduced below.
  • the software system may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the embodiment of the present application uses a layered architecture as an example to exemplarily describe the software system of the electronic device 100 .
  • a software system adopting a layered architecture is divided into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the software system can be divided into four layers, which are, from top to bottom: the application layer, the application framework layer, the Android runtime (Android Runtime) and system library, and the kernel layer.
  • the application layer can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer can include some predefined functions.
  • the application framework layer includes window managers, content providers, view systems, telephony managers, resource managers, and notification managers.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display, determine whether there is a status bar, lock the screen, and capture the screen.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, and phonebook.
  • the view system includes visual controls, such as those that display text and those that display pictures.
  • the view system can be used to build applications.
  • the display interface may be composed of one or more views, for example, a display interface including an SMS notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the electronic device 100, such as management of call status (connected or hung up).
  • the resource manager provides various resources to the application, such as localized strings, icons, pictures, layout files, and video files.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
  • the notification manager is used for download completion notifications and message reminders.
  • the notification manager can also manage notifications that appear in the status bar at the top of the system in the form of charts or scrolling text, such as notifications from applications running in the background.
  • the notification manager can also manage notifications that appear on the screen in the form of dialog windows, such as prompting text messages in the status bar, making alert sounds, vibrating electronic devices, and blinking lights.
  • the Android runtime includes a core library and a virtual machine, and is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the function libraries that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules, such as a surface manager (surface manager), media libraries (Media Libraries), a three-dimensional graphics processing library (for example, the open graphics library for embedded systems (open graphics library for embedded systems, OpenGL ES)), and a 2D graphics engine (for example, the skia graphics library (skia graphics library, SGL)).
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D layers and 3D layers for multiple applications.
  • the media library supports playback and recording of multiple audio formats, playback and recording of multiple video formats, and still image files.
  • the media library can support multiple audio and video encoding formats, such as MPEG4, H.264, moving picture experts group audio layer III (MP3), advanced audio coding (AAC), adaptive multi-rate (AMR), joint photographic experts group (JPG), and portable network graphics (PNG).
  • the 3D graphics processing library can be used to implement 3D graphics drawing, image rendering, compositing and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer may include driver modules such as display driver, camera driver, audio driver and sensor driver.
  • when a touch operation is received, a corresponding hardware interrupt is sent to the kernel layer; the kernel layer processes the touch operation into an original input event, which includes information such as the touch coordinates and a timestamp of the touch operation.
  • the original input event is stored in the kernel layer, and the application framework layer obtains the original input event from the kernel layer, identifies the control corresponding to the original input event, and notifies the corresponding application (application, APP) of the control.
  • for example, if the above-mentioned touch operation is a single-click operation and the APP corresponding to the above-mentioned control is a camera APP, then after the camera APP is awakened by the single-click operation, it can call the camera driver of the kernel layer through the API and control the camera 193 to take pictures through the camera driver.
  • the kernel layer of the software architecture shown in FIG. 3 may also include a sensor control center (Sensorhub) layer, or called a sensor software driver layer.
  • a six-axis fusion attitude angle algorithm module (which may be referred to as a six-axis fusion module) is provided in the Sensorhub layer.
  • the six-axis fusion attitude angle algorithm module is used to fuse the 3-axis data obtained by the acceleration sensor and the 3-axis data collected by the gyroscope to obtain 6-axis data, and output the quaternion of the terminal based on the 6-axis data.
  • the quaternion can represent the Euler angle (or called the attitude angle) of the terminal.
  • Euler angles include pitch angle (pitch), roll angle (roll), and yaw angle (yaw).
  • the purpose of setting the six-axis fusion attitude angle algorithm module on the Sensorhub layer is that the Sensorhub layer can run with low power consumption, so that the sensor data can be processed in real time without taking up too much power consumption.
  • attitude angle can be represented by the following formula:
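The formula itself did not survive this extraction. A commonly used form (not necessarily the patent's exact expression) for the Euler angles of a unit quaternion $q = (q_0, q_1, q_2, q_3)$ is:

```latex
\begin{aligned}
\mathrm{roll}  &= \operatorname{atan2}\!\bigl(2(q_0 q_1 + q_2 q_3),\; 1 - 2(q_1^2 + q_2^2)\bigr) \\
\mathrm{pitch} &= \arcsin\!\bigl(2(q_0 q_2 - q_3 q_1)\bigr) \\
\mathrm{yaw}   &= \operatorname{atan2}\!\bigl(2(q_0 q_3 + q_1 q_2),\; 1 - 2(q_2^2 + q_3^2)\bigr)
\end{aligned}
```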
  • the following code can be used to calculate the quaternion and use the quaternion to represent the Euler angle.
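The referenced code is not present in this extraction. Below is a minimal Python sketch of the quaternion-to-Euler-angle step (a standard formulation, not the patent's original code, which the Sensorhub layer would run natively):

```python
import math


def quat_to_euler(q0, q1, q2, q3):
    """Convert a unit quaternion (w, x, y, z) to Euler angles
    (pitch, roll, yaw) in degrees."""
    roll = math.degrees(math.atan2(2 * (q0 * q1 + q2 * q3),
                                   1 - 2 * (q1 * q1 + q2 * q2)))
    # Clamp the asin argument to guard against numerical drift outside [-1, 1].
    s = max(-1.0, min(1.0, 2 * (q0 * q2 - q3 * q1)))
    pitch = math.degrees(math.asin(s))
    yaw = math.degrees(math.atan2(2 * (q0 * q3 + q1 * q2),
                                  1 - 2 * (q2 * q2 + q3 * q3)))
    return pitch, roll, yaw
```

For example, the identity quaternion (1, 0, 0, 0) yields all-zero angles, and a 90-degree rotation about the y-axis yields a 90-degree pitch, the angle the method later compares against the preset range.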
  • the gravitational component of the terminal can also be calculated by using the quaternion.
  • the gravitational component is expressed by the following formula
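The formula was lost in this extraction. A commonly used expression (not necessarily the patent's exact one) for the body-frame gravity components of a unit quaternion $q = (q_0, q_1, q_2, q_3)$ is:

```latex
\begin{aligned}
v.x &= 2(q_1 q_3 - q_0 q_2) \\
v.y &= 2(q_0 q_1 + q_2 q_3) \\
v.z &= q_0^2 - q_1^2 - q_2^2 + q_3^2
\end{aligned}
```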
  • FIG. 5 shows a schematic flowchart of a method for preventing false touches according to an embodiment of the present application.
  • the acceleration sensor, the gyroscope sensor and the ambient light sensor in FIG. 5 are only described as examples, and the present application is not limited thereto. It is understood that these sensors can also be replaced by sensors having the same function.
  • the ambient light sensor can be used to detect the ambient light where the terminal is located, then the ambient light sensor can be replaced with a sensor for detecting the ambient light where the terminal is located.
  • This method can be applied to the scenario shown in Figure 1.
  • the method includes:
  • each of the acceleration sensor and the gyroscope collects 3-axis data to obtain 6-axis data, and then inputs the 6-axis data to the aforementioned 6-axis fusion module.
  • the 6-axis fusion module outputs the quaternion of the terminal based on the 6-axis data.
  • S402. Determine first information and attitude angle information of the terminal based on the quaternion of the terminal.
  • the attitude angle information is used to identify the attitude of the terminal.
  • the attitude angle of the terminal can be calculated by using the quaternion. When judging whether the terminal is in the head-down pocket mode, the pitch angle is mainly used.
  • the attitude angle information of the terminal includes a pitch angle of the terminal.
  • the first information is used to identify whether the terminal is in a head-down state.
  • the head-down state refers to the situation where the forward direction of the terminal (the forward direction is the direction of the terminal's head when the user uses it in portrait mode) is facing downward.
  • whether the terminal is in the head-down state may be determined by whether the downward-facing angle of the terminal is within a certain preset range. For example, when the terminal is placed upside down in a pocket, the terminal is in the head-down state.
  • the gravity components v.x, v.y and v.z can also be calculated by using the quaternion, wherein v.y and v.z can be used as the basis for judging that the terminal is facing down.
  • the first information of the terminal includes gravity components v.y and v.z of the terminal.
  • the motion state of the terminal includes the following two states: a steady state and a motion state.
  • the stable state means that the terminal maintains a relatively stable state within a certain period of time. "The terminal maintains a relatively stable state for a certain period of time” does not mean that the terminal does not shake during this period of time. If the terminal shakes slightly at a certain point in time, it remains stable during this period of time. This case is also considered to be a steady state. For example, if the user walks with the mobile phone in his pocket, the terminal may shake slightly. At this time, the terminal can be considered to be in a stable state.
  • the motion state refers to that in certain scenarios when the user uses the terminal, the terminal shakes or shakes to a certain extent. For example, when a user walks with a terminal in hand, the terminal is in a motion state.
  • the motion information includes the combined speed value of the terminal's accelerometer.
  • the motion state of the terminal can be judged by comparing the combined speed value of the terminal's accelerometer with the combined speed threshold, which will be described in detail later.
  • the ambient light information is used to identify the light intensity (or light brightness) of the environment where the terminal is located.
  • the ambient light information is used to identify the light intensity of the environment where the terminal is located.
  • the intensity of light can be specifically characterized by the illuminance value.
  • the ambient light information includes the illuminance of the environment where the terminal is located, and the unit of the ambient illuminance value may be lux.
  • if the terminal is in an open environment, the ambient light sensor will detect a relatively large illuminance value for the environment where the terminal is located; if the terminal is in the user's pocket, the ambient light sensor will detect a relatively small illuminance value.
  • based on the ambient light information, the first information, the attitude angle information, and the motion information, the terminal determines that it is in a first mode, where the first mode is used to identify that the state information of the terminal satisfies the corresponding preset conditions.
  • the state information of the terminal includes information of the terminal in various dimensions (including: ambient light intensity, motion state of the terminal, head-down state of the terminal, spatial attitude of the terminal, etc.). In the embodiment of the present application, it needs to be judged in combination with information of the terminal in various dimensions that the terminal is in the first mode.
  • the first mode may be the head-down pocket mode.
  • for the meaning of head-down, refer to the foregoing description.
  • pocket mode is a general term for a type of scene where the user places the terminal in a low-light environment (even without light) without the need to use the terminal. For example, when a user is walking, the terminal can be placed in a trouser pocket or in a backpack. For another example, the user may place the terminal in a drawer or the like.
  • the head-down pocket mode can refer to a state where the terminal is placed in a low-light environment and is head-down. It should be understood that the pocket mode is just a name for the above-mentioned type of scenarios, and this name does not limit the protection scope of the embodiment of the present application.
  • the terminal being in the head-down pocket mode includes the following conditions: the head-down information satisfies a first preset condition, the attitude angle information satisfies a second preset condition, the motion information satisfies a third preset condition, and The ambient light information satisfies a fourth preset condition.
  • judging whether the terminal is in the head-down pocket mode should be judged jointly based on the above four kinds of information.
  • the terminal can be judged to be in the head-down pocket mode only when the above four kinds of information meet the preset conditions.
  • the head-down information satisfies the first preset condition, including: (1) the product of the first gravity component (denoted as v.y) and the second gravity component (denoted as v.z) of the terminal is a negative number; (2) The absolute value of v.y is greater than the first threshold, and the absolute value of v.z is smaller than the first threshold.
  • the first threshold may be a value from 5-10 inclusive. It should be understood that the present application does not specifically limit the value of the first threshold. For example, the first threshold may be 6, or 7, or 8, and so on.
  • a headdown flag (headdown_flag) may be introduced to identify whether the terminal is in a headdown state.
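The first preset condition above can be sketched as a simple flag computation. The threshold value 6.0 is one example from the 5-10 range mentioned above; the unit scale of the gravity components is an assumption for the example:

```python
FIRST_THRESHOLD = 6.0  # example value from the 5-10 range described above


def headdown_flag(v_y: float, v_z: float, thr: float = FIRST_THRESHOLD) -> bool:
    """True when the gravity components indicate a head-down terminal:
    (1) v.y * v.z is negative, and
    (2) |v.y| exceeds the first threshold while |v.z| is below it."""
    return (v_y * v_z < 0) and (abs(v_y) > thr) and (abs(v_z) < thr)
```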
  • the attitude angle information satisfies a second preset condition, including: the pitch angle of the terminal is within a preset angle range.
  • the preset angle range is 70 degrees to 130 degrees.
  • a pitch angle flag may be introduced to identify whether the pitch angle of the terminal is within a preset angle range.
  • the motion information satisfies a third preset condition, including: the combined accelerometer speed value in each of n consecutive frames is less than or equal to the combined speed threshold.
  • the value of the combined speed threshold may be a value from 900 to 1300.
  • for example, the combined speed threshold is 900, 1000, 1050, 1100, and so on.
  • the motion information satisfying the third preset condition further includes: the difference between the combined accelerometer speed value of the i-th frame and that of the (i-1)-th frame in the n consecutive frames is less than a predetermined difference threshold, where i ∈ [2, n], n ≥ 2.
  • the predetermined difference threshold may be a value from 100 to 180 (inclusive), which is not limited in the present application.
  • the predetermined difference threshold may be 110, 120, 135, 150, or 160, etc.
  • a steady state flag (steady_flag) may be introduced to identify whether the terminal is in a motion state or in a stable state in the pocket mode.
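The third preset condition and its flag can be sketched as follows. The two threshold values are example picks from the ranges given above (900-1300 and 100-180), not values fixed by the patent:

```python
COMBINED_SPEED_THRESHOLD = 1000  # example from the 900-1300 range
DIFF_THRESHOLD = 150             # example from the 100-180 range


def steady_flag(combined_speeds):
    """True when, over the given consecutive frames, every combined
    accelerometer speed value is at or below the combined speed threshold
    and consecutive frames differ by less than the difference threshold."""
    if any(v > COMBINED_SPEED_THRESHOLD for v in combined_speeds):
        return False
    return all(abs(combined_speeds[i] - combined_speeds[i - 1]) < DIFF_THRESHOLD
               for i in range(1, len(combined_speeds)))
```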
  • the ambient light information satisfying a fourth preset condition includes: ambient light intensity of the terminal is less than or equal to a first light threshold.
  • the first light threshold can be set to 6.0 lux, or 7.0 lux, etc., which is not specifically limited.
  • that is, the above-mentioned head-down information satisfies the first preset condition, the attitude angle information satisfies the second preset condition, the motion information satisfies the third preset condition, and the ambient light information satisfies the fourth preset condition.
  • the description of the above conditions is exemplary, and the present application is not limited thereto. Those skilled in the art can set other reasonable judgment conditions for determining that the terminal is in the head-down pocket mode.
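The joint check of the four preset conditions can be sketched as a single conjunction. The light threshold default of 6.0 lux is one of the example values named above; the flag inputs stand for the per-condition judgments and are assumptions of this sketch:

```python
def in_headdown_pocket_mode(headdown_ok: bool, pitch_ok: bool,
                            steady_ok: bool, ambient_lux: float,
                            first_light_threshold: float = 6.0) -> bool:
    """The terminal is judged to be in the first (head-down pocket) mode
    only when all four preset conditions hold jointly."""
    return (headdown_ok and pitch_ok and steady_ok
            and ambient_lux <= first_light_threshold)
```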
  • the terminal enters a screen-off state.
  • the screen-off state refers to a state in which the screen of the terminal does not emit light.
  • when the terminal receives a signal that triggers the screen to turn on (this signal may be caused by a false touch), the ambient light information, first information, attitude angle information, and motion information of the terminal are detected cooperatively through multiple sensors, and based on this information it is determined whether the terminal is in the head-down pocket mode. If the terminal is determined to be in the first mode (or head-down pocket mode), the terminal enters the screen-off state to save power; false wake-up of the terminal from the screen-off, screen-locked state can also be avoided, effectively preventing accidental touches.
  • 501. Use an ambient light sensor to detect whether the ambient light of the terminal is greater than a second light threshold.
  • if yes, the AOD is turned on; if not, skip to 510 to determine whether the ambient light of the terminal is greater than the first light threshold (that is, the first light threshold mentioned at S406 above).
  • whether the current frame is in a relatively stable state can be judged by the following condition (1): whether the combined accelerometer speed value of the current frame is greater than the combined speed threshold. If the combined accelerometer speed value of the current frame is not greater than the threshold, the terminal can be considered to be in a relatively stable state in the current frame; if it is greater than the threshold, the terminal can be considered not to be in a relatively stable state.
  • the combined speed threshold may be denoted as A, and the value of the combined speed threshold A may be any value from 900 to 1300 (including the endpoint value), which is not limited in this application.
  • the relatively stable state can also be judged by the following condition (2): whether the difference between the accelerometer total velocity value of the current frame and the accelerometer total velocity value of the previous frame does not exceed a predetermined difference value threshold. That is to say, if the accelerometer combined velocity value in the current frame is not greater than the combined velocity threshold, and the difference between the accelerometer combined velocity value in the current frame and the accelerometer combined velocity value in the previous frame does not exceed the predetermined difference threshold, then it can be considered The terminal is in a relatively stable state in the current frame; if the total velocity value of the accelerometer in the current frame is greater than the velocity threshold, and the difference between the total velocity value of the accelerometer in the current frame and the total velocity value of the accelerometer in the previous frame exceeds the predetermined difference threshold , it can be determined that the terminal is not in a relatively stable state in the current frame.
  • the data saved in the buffer may include the calculated accelerometer combined velocity value of each frame, or the state judgment result of each frame, which is not specifically limited.
  • the purpose of introducing the buffer here is to save the data of multiple consecutive frames for use in the subsequent judgment step 503. For example, the data of the last 5 frames is always maintained in the buffer.
  • if the n consecutive frames are all in a stable state, it can be determined that the terminal is in a stable state; if the n consecutive frames do not all satisfy the stable-state condition, the terminal is considered to be in a moving state.
  • the terminal is in a stable state.
  • the magnitude of the accelerometer combined velocity value depends on the motion state of the terminal. For example, the greater the movement amplitude of the terminal, the greater the combined velocity value.
  • the range of motion of the terminal may depend on the location of the terminal (for example, in a pocket, in the user's hand) and the user's behavior scene.
  • FIG. 7 shows a scene in which the user walks holding the terminal 12. As can be seen from FIG. 7, when the user walks holding the phone, the terminal 12 moves with the user's hand as the arm swings.
  • compared with the scenario in FIG. 1, the accelerometer combined velocity value of the terminal 12 in the scenario in FIG. 7 is greater than the accelerometer combined velocity value of the terminal 12 in the scenario in FIG. 1.
  • for example, the accelerometer combined velocity value of the terminal 12 in the scenario of Figure 7 is any value from 900 to 1300 (inclusive), while the accelerometer combined velocity value of the terminal 12 in the scenario of Figure 1 is any value from 500 to 800 (inclusive), for example, 550, 600, or 700.
  • when judging the motion state of the terminal, one cannot simply decide whether the terminal is in a motion state or a stable state from the data of a single frame; the data of n consecutive frames must be considered together.
  • avoiding misjudgment of the motion state helps ensure the accuracy of the judgment result. For example, when the terminal is in pocket mode, a large shake may occasionally occur. The accelerometer combined velocity value produced by such a shake is greater than the combined velocity threshold, but the shake is sporadic and does not mean the terminal must be in motion.
  • the data of the above n consecutive frames can be obtained from the buffer introduced at step 502 .
  • specifically, the accelerometer combined velocity values of n consecutive frames can be obtained from the buffer, and the value of each frame is compared with the combined velocity threshold. If the combined velocity values of all n consecutive frames are greater than the combined velocity threshold, it can be determined that the terminal is indeed in a motion state. If the combined velocity values of the n consecutive frames are all less than the combined velocity threshold, it can be determined that the terminal is in a stable state in the pocket. In addition, all situations other than "the combined velocity values of all n consecutive frames are greater than the combined velocity threshold" are treated here as the stable state; in other words, unless every frame exceeds the threshold, the terminal is considered to be in a stable state in the pocket.
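The n-consecutive-frame check described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the buffer length N, the threshold value A (taken from the 900-1300 range given in the text), and the function name are all assumptions.

```python
from collections import deque

A = 1100       # assumed combined velocity threshold, within the 900-1300 range
N = 5          # assumed buffer length: the last N frames are kept, as in step 502

buffer = deque(maxlen=N)  # combined velocity values of the most recent N frames

def push_frame_and_get_steady_flag(combined_velocity):
    """Append the current frame's combined velocity and return steady_flag.

    Per step 503, the terminal is judged to be moving (flag 0) only when ALL
    N buffered frames exceed the threshold; every other case is treated as
    the stable-in-pocket state (flag 1).
    """
    buffer.append(combined_velocity)
    if len(buffer) == N and all(v > A for v in buffer):
        return 0  # moving state
    return 1      # stable state in the pocket
```

With this rule, a sporadic spike in a single frame does not flip the flag to "moving", which matches the occasional-shake example above.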
  • the stable state flag is used to identify whether the terminal is in a stable state or in a moving state. For example, when the value of steady_flag is 1, the terminal is in a stable state in the pocket; when the value of steady_flag is 0, the terminal is in a moving state.
  • the above steps 502-504 are the steps of obtaining the stable state identifier.
  • the six-axis fusion algorithm module outputs a quaternion.
  • the head-down judgment refers to judging whether the first gravity component (denoted as v.y) and the second gravity component (denoted as v.z) of the terminal satisfy the following two conditions: (1) the product of v.y and v.z is negative; (2) the absolute value of v.y is greater than the first threshold, and the absolute value of v.z is less than the first threshold.
  • Judging the attitude angle refers to judging whether the pitch angle is within the preset angle range.
  • 509: determine a head-down flag (headdown_flag) based on the judgment result of step 507, and determine a pitch angle flag (pitch_flag) based on the judgment result of step 508.
  • the head-down flag is used to identify whether the terminal is head down. For example, when the value of headdown_flag is 1, the terminal is head down; when the value of headdown_flag is 0, it is not. Specifically, if the gravity components in 507 satisfy the preset conditions, the terminal is in a head-down state and headdown_flag is set to 1; if the gravity components in 507 do not satisfy the preset conditions, the terminal is considered not to be head down and headdown_flag is set to 0.
  • the pitch angle flag is used to identify whether the pitch angle of the terminal is within a preset angle range. For example, when the value of pitch_flag is 1, the pitch angle of the terminal is within the preset angle range; when the value of pitch_flag is 0, it is not. Specifically, if it is judged in 508 that the pitch angle is within the preset angle range, pitch_flag is set to 1; otherwise, pitch_flag is set to 0.
  • the above steps 505-509 are the steps of obtaining the head-down flag and the pitch angle flag.
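The two flags just described can be sketched as small helpers. This is an illustrative assumption-laden sketch: the first threshold of 7 is one value from the 5-10 range the description allows, the 70-130 degree window is the example pitch range given elsewhere in the text, and the function names are invented.

```python
FIRST_THRESHOLD = 7          # assumed value from the 5-10 range in the description
PITCH_RANGE = (70, 130)      # assumed preset angle range, in degrees

def head_down_flag(vy, vz):
    """headdown_flag per 507: v.y * v.z negative, |v.y| above and |v.z| below the threshold."""
    head_down = vy * vz < 0 and abs(vy) > FIRST_THRESHOLD and abs(vz) < FIRST_THRESHOLD
    return 1 if head_down else 0

def pitch_angle_flag(pitch_deg):
    """pitch_flag per 508: pitch angle inside the preset range."""
    return 1 if PITCH_RANGE[0] <= pitch_deg <= PITCH_RANGE[1] else 0
```

For example, gravity components (v.y, v.z) = (8, -1) would set headdown_flag to 1, while (8, 1) would not, since the product is not negative.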
  • steps 501, 502-504, and 505-509 can be understood as three branches of algorithm execution.
  • these three branches can be carried out at the same time, and the order of the step numbers does not represent the execution sequence among the three branches.
  • the first light threshold is smaller than the second light threshold.
  • if the stable state flag is 1, the head-down flag is 1, the pitch angle flag is 1, and the ambient light is less than or equal to the first light threshold, the terminal is in the head-down pocket mode, and 511 is executed.
  • if the ambient light is greater than the first light threshold, continue to detect the ambient light where the terminal is located, and perform 512.
  • 511: detect the interface of the terminal. If the interface is the always-on display (AOD) interface, the AOD is turned off; if it is the lock screen interface, the terminal enters the anti-false-touch mode.
  • FIG. 8 shows an interface display diagram of the false touch prevention mode.
  • the words “Do not block the top of the screen” and a corresponding schematic diagram are presented to the user in an interface 801.
  • the anti-false-touch interface also carries the words “swipe down twice to forcibly exit the anti-false-touch mode”; that is, the user can exit the anti-false-touch mode by sliding down on the control 802 twice. It can be understood that if the terminal is currently in the head-down pocket mode, the user will not slide down on the control 802, and the terminal will automatically turn off the screen.
  • the premise of entering the anti-false-touch mode is that the anti-false-touch mode of the terminal has been turned on, or the anti-false-touch function is built into the terminal (that is, the user does not need to enable it manually).
  • Fig. 9 shows an interface diagram of how to enable the false touch prevention mode.
  • the auxiliary function control 901 includes an option 902 for the anti-false-touch mode. By tapping option 902, the user can turn the anti-false-touch mode on or off.
  • the interface shown in FIG. 9 displays the option 902 for enabling the anti-false-touch mode. After the user enables this function, if a false touch occurs on the lock screen, the mobile phone can enter the anti-false-touch mode and display the interface shown in FIG. 8.
  • the auxiliary function control 901 also includes other function options, such as the accessibility, one-handed mode, quick start and gestures, smart multi-window, and scheduled power on/off options shown in FIG. 9, which are not specifically limited.
  • whether or not a proximity light sensor is provided in the terminal, the above method for preventing false touches is applicable.
  • in a possible implementation, a proximity light sensor is disposed in the terminal.
  • in this case, the method in Fig. 6 further includes: detecting the reflected light information of the terminal through the proximity light sensor.
  • in some cases, the proximity light sensor is not very sensitive, resulting in inaccurate judgment results. Based on this, when the proximity light sensor determines that nothing is approaching, the judgment may further be made in combination with the conditions in 510 above to determine whether the terminal is in the head-down pocket mode.
  • it should be understood that FIGS. 6-9 are only intended to help those skilled in the art understand the embodiments, and do not limit the protection scope of the embodiments of the present application.
  • in summary, the false-touch prevention method provided by this application uses multiple sensors to cooperatively detect the ambient light information, head-down information, attitude angle information, and motion information of the terminal, and determines based on this information whether the terminal is in the head-down pocket mode, which can effectively solve the problem of preventing false touches in pocket mode and greatly improve the user experience.
  • the present application also provides a computer program product, which implements the method described in any method embodiment in the present application when the computer program product is executed by a processor.
  • the computer program product can be stored in a memory, and finally converted into an executable object file that can be executed by a processor after preprocessing, compiling, assembling, linking and other processing processes.
  • the present application also provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a computer, the method described in any method embodiment in the present application is implemented.
  • the computer program may be a high-level language program or an executable object program.
  • the computer readable storage medium may be a volatile memory or a nonvolatile memory, or may include both a volatile memory and a nonvolatile memory.
  • the non-volatile memory may be a read-only memory (read-only memory, ROM), a programmable read-only memory (programmable ROM, PROM), an erasable programmable read-only memory (erasable PROM, EPROM), an electrically erasable programmable read-only memory (electrically EPROM, EEPROM), or a flash memory.
  • volatile memory may be random access memory (random access memory, RAM), which acts as an external cache. By way of example rather than limitation, many forms of RAM are available, for example:
  • static random access memory (static RAM, SRAM)
  • dynamic random access memory (dynamic RAM, DRAM)
  • synchronous dynamic random access memory (synchronous DRAM, SDRAM)
  • double data rate synchronous dynamic random access memory (double data rate SDRAM, DDR SDRAM)
  • enhanced synchronous dynamic random access memory (enhanced SDRAM, ESDRAM)
  • synchlink dynamic random access memory (synchlink DRAM, SLDRAM)
  • direct rambus random access memory (direct rambus RAM, DR RAM)
  • the disclosed systems, devices and methods may be implemented in other ways. For example, some features of the method embodiments described above may be omitted, or not implemented.
  • the device embodiments described above are only illustrative, and the division of units is only a logical function division. In actual implementation, there may be other division methods, and multiple units or components may be combined or integrated into another system.
  • the coupling between the various units or the coupling between the various components may be direct coupling or indirect coupling, and the above coupling includes electrical, mechanical or other forms of connection.
  • the sequence numbers of the processes do not mean the order of execution; the execution order of the processes should be determined by their functions and internal logic, and does not constitute any limitation on the implementation process of the embodiments of the present application.
  • system and “network” are often used herein interchangeably.
  • the term “and/or” herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean three cases: A exists alone, both A and B exist, and B exists alone.
  • the character “/” herein generally indicates that the objects before and after it are in an “or” relationship.


Abstract

A method and apparatus for preventing false touches, applied to the field of terminal technologies. The method includes: obtaining a quaternion of a terminal; determining first information and attitude angle information of the terminal based on the quaternion; detecting motion information of the terminal; detecting ambient light information of the terminal; determining, based on the first information, the attitude angle information, the motion information, and the ambient light information, that the terminal is in a first mode; and the terminal entering a screen-off state to save power. This also prevents the terminal from being falsely woken up in the always-on display state and the lit lock-screen state, effectively preventing false touches.

Description

Method and apparatus for preventing false touches

This application claims priority to Chinese Patent Application No. 202111459980.8, filed with the China National Intellectual Property Administration on December 1, 2021 and entitled "Method and apparatus for preventing false touches", which is incorporated herein by reference in its entirety.

Technical Field

This application relates to the field of terminal technologies, and in particular, to a method and apparatus for preventing false touches.

Background

When a user puts a touchscreen terminal in a backpack or pocket, the lit screen may contact capacitive material, and false touches easily occur. The following problems then arise: a screen lit for a long time consumes power; repeated false touches on the fingerprint sensor may lock the fingerprint unlock function; repeated false password entries may lock the password input function; and a falsely touched emergency call may raise a false alarm. All of these seriously affect the user experience. A method for preventing false touches is therefore urgently needed.
Summary

In view of this, this application provides a method and apparatus for preventing false touches, a computer-readable storage medium, and a computer program product, which can effectively solve the problem of preventing false touches in pocket mode and greatly improve the user experience.

According to a first aspect, a method for preventing false touches is provided, including:

obtaining a quaternion of a terminal through an accelerometer and a gyroscope sensor, where the quaternion is used to represent an attitude of the terminal;

determining first information and attitude angle information of the terminal based on the quaternion of the terminal, where the first information is used to identify whether the terminal is in a head-down state, and the attitude angle information is used to identify the attitude of the terminal;

detecting motion information of the terminal through the accelerometer, where the motion information is used to identify a motion state of the terminal;

detecting ambient light information of the terminal through an ambient light sensor, where the ambient light information is used to identify the light intensity of the environment in which the terminal is located; and

determining, based on the first information, the attitude angle information, the motion information, and the ambient light information, that the terminal is in a first mode, where the first mode is used to identify that state information of the terminal satisfies corresponding preset conditions of the terminal.

The terminal enters a screen-off state.

Optionally, the terminal being in the head-down pocket mode includes the following conditions: the first information satisfies a first preset condition, the attitude angle information satisfies a second preset condition, the motion information satisfies a third preset condition, and the ambient light information satisfies a fourth preset condition.

The above method may be performed by a terminal device or by a chip in a terminal device. Based on the above solution, information of the terminal (including but not limited to the ambient light information, the first information, the attitude angle information, and the motion information) is detected cooperatively by multiple sensors, and whether the terminal is in the first mode is determined from this information. If the terminal is determined to be in the head-down pocket mode, the terminal enters the screen-off state in pocket mode to save power; in addition, false wake-ups of the terminal in the always-on display state and the lit lock-screen state are avoided, effectively preventing false touches.
In a possible implementation, determining based on the first information that the terminal is in the first mode includes: the first information satisfies a first preset condition.

The first information satisfying the first preset condition includes: the product of a first gravity component and a second gravity component of the terminal is negative; and the absolute value of the first gravity component is greater than a first threshold, and the absolute value of the second gravity component is less than the first threshold; where the first gravity component and the second gravity component are calculated based on the quaternion.

In a possible implementation, determining based on the attitude angle information that the terminal is in the first mode includes: the attitude angle information satisfies a second preset condition. The attitude angle information satisfying the second preset condition includes: the pitch angle of the terminal is within a preset angle range, where the pitch angle of the terminal is calculated based on the quaternion.

In a possible implementation, determining based on the motion information that the terminal is in the first mode includes: the motion information satisfies a third preset condition. The motion information satisfying the third preset condition includes: the accelerometer combined velocity values of n consecutive frames are less than or equal to a combined velocity threshold, where n ≥ 2 and n is an integer.

In a possible implementation, the motion information satisfying the third preset condition further includes: the difference between the accelerometer combined velocity value of the i-th frame and that of the (i-1)-th frame in the n consecutive frames is less than a predetermined difference threshold, i ∈ [2, n]. Judging the motion state of the terminal from the combined velocity values of n consecutive frames thus avoids misjudging the motion state when the terminal shakes sharply in pocket mode, and helps ensure the accuracy of the judgment result.

In a possible implementation, determining based on the ambient light information that the terminal is in the first mode includes: the ambient light information satisfies a fourth preset condition. The ambient light information satisfying the fourth preset condition includes: the ambient light illuminance of the terminal is less than or equal to a first light threshold.

In a possible implementation, when the ambient light illuminance of the terminal is greater than the first light threshold, the method further includes:

detecting, through the ambient light sensor, whether the ambient light of the terminal is greater than a second light threshold, where the second light threshold is greater than the first light threshold;

when the ambient light illuminance of the terminal is greater than the second light threshold, the screen of the terminal lights up; and

when the ambient light illuminance of the terminal is less than or equal to the second light threshold, the terminal enters the screen-off state.

Thus, by further detecting the ambient light of the terminal and comparing the ambient light illuminance with the second light threshold to decide whether to turn the screen off or light it up, needless power consumption is avoided, helping to further save power.

In the embodiments of this application, the above method for preventing false touches is applicable regardless of whether a proximity light sensor is provided in the terminal.

In a possible implementation, a proximity light sensor is provided in the terminal. Before determining, based on the first information, the attitude angle information, the motion information, and the ambient light information, that the terminal is in the first mode, the method further includes: detecting reflected light information of the terminal through the proximity light sensor; and when no reflected light is detected, determining, based on the head-down information, the attitude angle information, the motion information, and the ambient light information, whether the terminal is in the head-down pocket mode.

Thus, when the proximity light sensor judges that nothing is approaching, the judgment can further be made in combination with the first information, the attitude angle information, the motion information, and the ambient light information to determine whether the terminal is in the head-down pocket mode, obtaining a more accurate judgment result.

In a possible implementation, before the terminal enters the screen-off state, the method further includes: detecting the interface of the terminal; if the interface of the terminal is the always-on display (AOD) interface, turning off the AOD; and if the interface of the terminal is the lock screen interface, the terminal entering an anti-false-touch mode. Thus, by detecting the interface of the terminal before entering the screen-off state and handling each case accordingly, power consumption can be further saved.
According to a second aspect, an apparatus for preventing false touches is provided, including units for performing any one of the methods of the first aspect. The apparatus may be a terminal (or terminal device), or a chip in a terminal (or terminal device). The apparatus includes an input unit, a display unit, and a processing unit.

When the apparatus is a terminal, the processing unit may be a processor, the input unit may be a communication interface, and the display unit may be a graphics processing module and a screen; the terminal may further include a memory configured to store computer program code, and when the processor executes the computer program code stored in the memory, the terminal is caused to perform any one of the methods of the first aspect.

When the apparatus is a chip in a terminal, the processing unit may be a logic processing unit inside the chip, the input unit may be an output interface, a pin, a circuit, or the like, and the display unit may be a graphics processing unit inside the chip; the chip may further include a memory, which may be a memory inside the chip (for example, a register or a cache) or a memory outside the chip (for example, a read-only memory or a random access memory); the memory is configured to store computer program code, and when the processor executes the computer program code stored in the memory, the chip is caused to perform any one of the methods of the first aspect.
In an implementation, the processing unit is configured to: obtain a quaternion of the terminal through an accelerometer and a gyroscope sensor, where the quaternion is used to represent an attitude of the terminal; determine first information and attitude angle information of the terminal based on the quaternion, where the first information is used to identify whether the terminal is in a head-down state, and the attitude angle information is used to identify the attitude of the terminal; detect motion information of the terminal through the accelerometer, where the motion information is used to identify a motion state of the terminal; detect ambient light information of the terminal through an ambient light sensor, where the ambient light information is used to identify the light intensity of the environment in which the terminal is located; and determine, based on the first information, the attitude angle information, the motion information, and the ambient light information, that the terminal is in a first mode, where the first mode is used to identify that state information of the terminal satisfies corresponding preset conditions of the terminal. The display unit enters a screen-off state.

In an implementation, the processing unit being configured to determine based on the first information that the terminal is in the first mode includes: the first information satisfies a first preset condition; the first information satisfying the first preset condition includes: the product of a first gravity component and a second gravity component of the terminal is negative; and the absolute value of the first gravity component is greater than a first threshold, and the absolute value of the second gravity component is less than the first threshold;

where the first gravity component and the second gravity component are calculated based on the quaternion.

In an implementation, the processing unit being configured to determine based on the attitude angle information that the terminal is in the first mode includes: the attitude angle information satisfies a second preset condition; the attitude angle information satisfying the second preset condition includes: the pitch angle of the terminal is within a preset angle range;

where the pitch angle of the terminal is calculated based on the quaternion.

In an implementation, the processing unit being configured to determine based on the motion information that the terminal is in the first mode includes: the motion information satisfies a third preset condition; the motion information satisfying the third preset condition includes: the accelerometer combined velocity values of n consecutive frames are less than or equal to a combined velocity threshold, where n ≥ 2 and n is an integer.

In an implementation, the motion information satisfying the third preset condition further includes: the difference between the accelerometer combined velocity value of the i-th frame and that of the (i-1)-th frame in the n consecutive frames is less than a predetermined difference threshold, i ∈ [2, n].

In an implementation, the processing unit being configured to determine based on the ambient light information that the terminal is in the first mode includes: the ambient light information satisfies a fourth preset condition; the ambient light information satisfying the fourth preset condition includes: the ambient light illuminance of the terminal is less than or equal to a first light threshold.

In an implementation, the processing unit is further configured to: when the ambient light illuminance of the terminal is greater than the first light threshold, detect, through the ambient light sensor, whether the ambient light of the terminal is greater than a second light threshold, where the second light threshold is greater than the first light threshold;

when the ambient light illuminance of the terminal is greater than the second light threshold, the screen of the terminal lights up; and

when the ambient light illuminance of the terminal is less than or equal to the second light threshold, the terminal enters the screen-off state.

In an implementation, the processing unit is further configured to: detect reflected light information of the terminal through a proximity light sensor; and

when no reflected light is detected, determine, based on the first information, the attitude angle information, the motion information, and the ambient light information, that the terminal is in the first mode.

In an implementation, before the terminal enters the screen-off state, the processing unit is further configured to: detect the interface of the terminal; if the interface of the terminal is the always-on display (AOD) interface, the AOD of the display unit is turned off; and if the interface of the terminal is the lock screen interface, the terminal enters an anti-false-touch mode.
According to a third aspect, a computer-readable storage medium is provided, storing computer program code which, when run by an apparatus for preventing false touches, causes the apparatus to perform any one of the methods of the first aspect.

According to a fourth aspect, a computer program product is provided, including computer program code which, when run by an apparatus for preventing false touches, causes the apparatus to perform any one of the methods of the first aspect.
Brief Description of Drawings

FIG. 1 is an example diagram of an application scenario of an embodiment of this application;

FIG. 2 is a schematic diagram of a hardware system of an electronic device applicable to this application;

FIG. 3 is a schematic diagram of a software system of an electronic device applicable to this application;

FIG. 4 is an example diagram of a coordinate system defined for a mobile phone;

FIG. 5 is a schematic flowchart of a method for preventing false touches according to an embodiment of this application;

FIG. 6 is a specific flowchart of a method for preventing false touches according to an embodiment of this application;

FIG. 7 is an example diagram of a user holding a terminal;

FIG. 8 is an example diagram of the interface displayed in the anti-false-touch mode;

FIG. 9 is an example diagram of an interface for enabling the anti-false-touch mode.
Description of Embodiments

The technical solutions in the embodiments of this application are described below with reference to the accompanying drawings.

The method for preventing false touches provided in the embodiments of this application can be applied to a terminal having a touchscreen. The terminal may be, for example, a mobile phone, a tablet computer, a multimedia playback device, an e-book reader, a personal computer, a personal digital assistant (PDA), or a smart watch. This application does not limit the specific form of the terminal.

The technical solutions of the embodiments of this application apply to scenarios in which the terminal is in a backpack or in a pocket of clothing (for example, a coat or trousers). In practice, a user may place the terminal in a coat pocket, a trouser pocket, or a backpack. For ease of description, the scenario in which "the terminal is in a backpack or a clothing pocket" may be called the pocket mode.

It can be understood that a terminal in pocket mode commonly takes the following attitudes: portrait head up, portrait head down, screen up, screen down, landscape upright (possibly with some inclination), and so on. Among these, portrait head down is by far the most common state when a terminal is put into a pocket in daily life.

It can also be understood that this application does not limit the user's behavior scene. For example, the user's behavior includes but is not limited to: walking, running, sitting, standing, lying, jumping, riding, and so on.

FIG. 1 shows an example diagram of an application scenario of an embodiment of this application. As shown in FIG. 1, in this scenario the terminal 12 is placed in a backpack 11. When the terminal 12 is placed in the backpack 11, the user has no need to use the terminal 12. While in the backpack 11, the terminal 12 may contact the backpack 11 itself or capacitive material contained in it (metal conductors, skin, and the like), so that false touches occur.

It can be understood that the backpack 11 shown in FIG. 1 may also be replaced by a clothing pocket; this is not specifically limited.

It should be understood that the scenario in FIG. 1 only schematically illustrates one application scenario of this application; it does not limit the embodiments of this application, and this application is not limited thereto.

When the terminal is in pocket mode, an object in contact with the terminal can trigger the screen to light up, causing the terminal to be falsely woken up. Such lighting is not what the user expects. If the terminal stays lit for a long time in pocket mode, power is seriously wasted. Objects in contact with the terminal include but are not limited to: the clothing pocket, the backpack, and capacitive material contained in the clothing pocket or backpack (metal conductors, skin, and the like).

In addition, when the screen of the terminal is lit (including the always-on display (AOD) being lit and the lit lock-screen state), an object in contact with the terminal may trigger the terminal repeatedly, so that multiple false touches occur and the user experience is seriously affected (for example, repeated false touches may lock the fingerprint unlock function, repeated false password entries may lock the password input function, and a falsely touched emergency call may raise a false alarm).

AOD refers to lighting up part of the terminal's screen, without lighting the whole screen, to display some important information on the terminal.

In the technical solution provided by the embodiments of this application, multiple sensors cooperatively detect information of the terminal (including but not limited to ambient light information, first information, attitude angle information, and motion information), and whether the terminal is in the head-down pocket mode is determined from this information. If the terminal is determined to be in the head-down pocket mode, the terminal enters the screen-off state in pocket mode to save power; in addition, false wake-ups in the always-on display state and the lit lock-screen state are avoided, effectively preventing false touches.
FIG. 2 shows a hardware system of an electronic device applicable to this application.

The electronic device 100 may be a mobile phone, a smart screen, a tablet computer, a wearable electronic device, a vehicle-mounted electronic device, an augmented reality (AR) device, a virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a projector, and so on; the embodiments of this application place no limit on the specific type of the electronic device 100.

The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and so on.

It should be noted that the structure shown in FIG. 2 does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown in FIG. 2, or combinations of some of the components shown in FIG. 2, or sub-components of some of the components shown in FIG. 2. For example, the proximity light sensor 180G shown in FIG. 2 may be optional. The components shown in FIG. 2 may be implemented in hardware, software, or a combination of software and hardware.

The processor 110 may include one or more processing units. For example, the processor 110 may include at least one of the following processing units: an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and a neural-network processing unit (NPU). Different processing units may be independent devices or integrated devices.

The controller can generate operation control signals according to instruction operation codes and timing signals, completing the control of instruction fetching and execution.

A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory can hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory. Repeated access is avoided and the waiting time of the processor 110 is reduced, improving the efficiency of the system.

The connection relationships between the modules shown in FIG. 2 are only schematic and do not limit the connection relationships between the modules of the electronic device 100. Optionally, the modules of the electronic device 100 may also adopt a combination of the connection manners in the above embodiments.

The electronic device 100 can implement display functions through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor. The GPU performs mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.

The display screen 194 may be used to display images or video. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flex light-emitting diode (FLED), a mini light-emitting diode (Mini LED), a micro light-emitting diode (Micro LED), a micro OLED (Micro OLED), or quantum dot light emitting diodes (QLED). In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 can implement shooting functions through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.

The ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter opens, light is transmitted through the lens to the camera photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can perform algorithmic optimization on the noise, brightness, and color of the image, and can also optimize parameters such as exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.

The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and then passes the electrical signal to the ISP for conversion into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as red green blue (RGB) or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.

The digital signal processor is used to process digital signals; besides digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform and the like on the frequency point energy.

The electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and the like.

The audio module 170 is used to convert digital audio information into an analog audio signal output, and can also convert an analog audio input into a digital audio signal. The audio module 170 can also be used to encode and decode audio signals. In some embodiments, the audio module 170 or some functional modules of the audio module 170 may be provided in the processor 110.

The speaker 170A, also called the loudspeaker, is used to convert audio electrical signals into sound signals. The electronic device 100 can play music or make a hands-free call through the speaker 170A.

The receiver 170B, also called the earpiece, is used to convert audio electrical signals into sound signals. When a user answers a call or a voice message with the electronic device 100, the voice can be heard by bringing the receiver 170B close to the ear.

The microphone 170C, also called the mic, is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can input a sound signal into the microphone 170C by speaking close to it.

The headset jack 170D is used to connect wired headsets. The headset jack 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.

The pressure sensor 180A is used to sense pressure signals and can convert pressure signals into electrical signals. In some embodiments, the pressure sensor 180A may be provided on the display screen 194. There are many types of pressure sensors 180A, for example resistive, inductive, or capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates of conductive material; when a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the touch operation through the pressure sensor 180A. The electronic device 100 can also calculate the touch position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations acting on the same touch position but with different strengths may correspond to different operation instructions. For example, when a touch operation whose strength is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose strength is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.

The gyroscope sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 around three axes (that is, the x axis, y axis, and z axis) can be determined by the gyroscope sensor 180B. The gyroscope sensor 180B can be used for image stabilization when shooting. For example, when the shutter is pressed, the gyroscope sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate from the angle, and lets the lens counteract the shake of the electronic device 100 through reverse motion, achieving stabilization. The gyroscope sensor 180B can also be used in scenarios such as navigation and somatosensory games.
The barometric pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 100 calculates altitude from the pressure value measured by the barometric pressure sensor 180C to assist positioning and navigation.

The magnetic sensor 180D includes a Hall sensor. The electronic device 100 can use the magnetic sensor 180D to detect the opening and closing of a flip leather case. In some embodiments, when the electronic device 100 is a flip phone, it can detect the opening and closing of the flip cover through the magnetic sensor 180D. The electronic device 100 can set features such as automatic unlocking on flip-open according to the detected open or closed state of the leather case or flip cover.

The acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in all directions (generally the x, y, and z axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. The acceleration sensor 180E can also be used to recognize the attitude of the electronic device 100 as an input parameter for applications such as landscape/portrait switching and pedometers.

In some embodiments, the quaternion of the electronic device 100 can be obtained through the acceleration sensor 180E and the gyroscope sensor 180B.

In some embodiments, the motion information of the electronic device 100 can be detected through the acceleration sensor 180E.

The distance sensor 180F is used to measure distance. The electronic device 100 can measure distance by infrared or laser. In some embodiments, for example in a shooting scenario, the electronic device 100 can use the distance sensor 180F to measure distance for fast focusing.

The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The LED may be an infrared LED. The electronic device 100 emits infrared light outward through the LED. The electronic device 100 uses the photodiode to detect infrared light reflected from nearby objects. When reflected light is detected, the electronic device 100 can determine that an object exists nearby. When no reflected light is detected, the electronic device 100 can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect whether the user is holding the electronic device 100 close to the ear during a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G can also be used for automatic unlocking and automatic screen locking in leather case mode or pocket mode. It should be understood that the proximity light sensor 180G in FIG. 2 may be an optional component. In some scenarios, an ultrasonic sensor may be used instead of the proximity light sensor 180G to detect proximity light.

The ambient light sensor 180L is used to sense ambient light brightness. The electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the sensed ambient light brightness. The ambient light sensor 180L can also be used to automatically adjust white balance when taking photos. The ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, to prevent false touches. In some embodiments, the ambient light information of the terminal can be detected through the ambient light sensor 180L.

The fingerprint sensor 180H is used to collect fingerprints. The electronic device 100 can use the collected fingerprint characteristics to implement functions such as unlocking, accessing the application lock, taking photos, and answering calls.

The touch sensor 180K is also called a touch device. The touch sensor 180K may be provided on the display screen 194; the touch sensor 180K and the display screen 194 form a touchscreen, also called a touch screen. The touch sensor 180K is used to detect touch operations acting on or near it. The touch sensor 180K can pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation can be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be provided on the surface of the electronic device 100 at a position different from that of the display screen 194.

The button 190 includes a power button and volume buttons. The button 190 may be a mechanical button or a touch button. The electronic device 100 can receive button input signals and implement functions related to the button input signals.

The motor 191 can generate vibration. The motor 191 can be used for incoming call alerts and for touch feedback. The motor 191 can produce different vibration feedback effects for touch operations acting on different applications. For touch operations acting on different areas of the display screen 194, the motor 191 can also produce different vibration feedback effects. Different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) can correspond to different vibration feedback effects. Touch vibration feedback effects can also be customized.

In some embodiments, the processor 110 may determine, based on the quaternion of the terminal, the first information and attitude angle information of the terminal, where the first information is used to identify whether the terminal is in a head-down state and the attitude angle information is used to identify the attitude of the terminal; and determine, based on the first information, the attitude angle information, the motion information, and the ambient light information, that the electronic device 100 is in the first mode.
The hardware system of the electronic device 100 has been described in detail above; the software system of the electronic device 100 is introduced below. The software system may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of this application take the layered architecture as an example to exemplarily describe the software system of the electronic device 100.

As shown in FIG. 3, the software system adopting the layered architecture is divided into several layers, each with a clear role and division of labor. The layers communicate through software interfaces. In some embodiments, the software system may be divided into four layers: from top to bottom, the application layer, the application framework layer, Android Runtime and system libraries, and the kernel layer.

The application layer may include applications such as Camera, Gallery, Calendar, Phone, Maps, Navigation, WLAN, Bluetooth, Music, Video, and Messages.

The application framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer. The application framework layer may include some predefined functions.

For example, the application framework layer includes a window manager, a content provider, a view system, a telephony manager, a resource manager, and a notification manager.

The window manager is used to manage window programs. The window manager can obtain the display screen size, determine whether there is a status bar, lock the screen, and capture the screen.

The content provider is used to store and retrieve data and make the data accessible to applications. The data may include videos, images, audio, calls made and received, browsing history and bookmarks, and the phone book.

The view system includes visual controls, such as controls that display text and controls that display pictures. The view system can be used to build applications. A display interface may consist of one or more views; for example, a display interface containing a short message notification icon may include a view displaying text and a view displaying pictures.

The telephony manager is used to provide the communication functions of the electronic device 100, for example management of the call state (connected or hung up).

The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files.

The notification manager enables applications to display notification information in the status bar; it can be used to convey notification-type messages that automatically disappear after a short stay without user interaction. For example, the notification manager is used for download-complete notices and message reminders. The notification manager can also manage notifications that appear in the system top status bar as charts or scrolling text, such as notifications of applications running in the background. The notification manager can also manage notifications that appear on the screen as dialog windows, for example prompting text in the status bar, sounding a prompt tone, vibrating the electronic device, and flashing the indicator light.
Android Runtime includes core libraries and a virtual machine. Android Runtime is responsible for scheduling and management of the Android system.

The core libraries contain two parts: one part is the function library that the java language needs to call, and the other part is the core libraries of Android.

The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.

The system libraries may include multiple functional modules, for example: a surface manager, Media Libraries, a three-dimensional graphics processing library (for example, open graphics library for embedded systems (OpenGL ES)), and a 2D graphics engine (for example, skia graphics library (SGL)).

The surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.

The media library supports playback and recording of multiple audio formats, playback and recording of multiple video formats, and still image files. The media library can support multiple audio and video encoding formats, for example: MPEG4, H.264, moving picture experts group audio layer III (MP3), advanced audio coding (AAC), adaptive multi-rate (AMR), joint photographic experts group (JPG), and portable network graphics (PNG).

The three-dimensional graphics processing library can be used to implement three-dimensional graphics drawing, image rendering, compositing, and layer processing.

The two-dimensional graphics engine is a drawing engine for 2D drawing.

The kernel layer is the layer between hardware and software. The kernel layer may include driver modules such as the display driver, camera driver, audio driver, and sensor driver.

The workflow of the software system and hardware system of the electronic device 100 is exemplarily described below in conjunction with a photographing display scenario.

When the user performs a touch operation on the touch sensor 180K, a corresponding hardware interrupt is sent to the kernel layer, and the kernel layer processes the touch operation into a raw input event; the raw input event includes, for example, the touch coordinates and the timestamp of the touch operation. The raw input event is stored in the kernel layer. The application framework layer obtains the raw input event from the kernel layer, identifies the control corresponding to the raw input event, and notifies the application (APP) corresponding to the control. For example, if the touch operation is a tap and the APP corresponding to the control is the camera APP, after the camera APP is woken up by the tap it can call the camera driver of the kernel layer through the API, and the camera driver controls the camera 193 to shoot.

The kernel layer of the software architecture shown in FIG. 3 may also include a sensor control center (Sensorhub) layer, also called the sensor software driver layer. A six-axis fusion attitude angle algorithm module (abbreviated as the six-axis fusion module) is provided in the Sensorhub layer. The six-axis fusion attitude angle algorithm module fuses the 3-axis data obtained by the acceleration sensor with the 3-axis data collected by the gyroscope to obtain 6-axis data, and outputs the quaternion of the terminal based on the 6-axis data. The quaternion can represent the Euler angles (also called attitude angles) of the terminal. The Euler angles include the pitch angle (pitch), roll angle (roll), and yaw angle (yaw).

The purpose of placing the six-axis fusion attitude angle algorithm module in the Sensorhub layer is that the Sensorhub layer can run at low power, so sensor data can be processed in real time without consuming too much power.
Take the coordinate system defined for the mobile phone shown in FIG. 4 as an example. Suppose the phone lies flat with its screen up, the x axis pointing horizontally to the right, the y axis pointing vertically forward, and the z axis pointing straight up from the front face. When the phone sways left and right (rotating around the y axis), a changing roll angle is obtained; when the phone sways back and forth (rotating around the x axis), a changing pitch angle is obtained; when the phone switches from landscape to portrait or from portrait to landscape (rotating around the z axis), a changing yaw angle is obtained.
Assuming the quaternion is (q0, q1, q2, q3), the attitude angles can be characterized by the following formulas (in degrees, where 57.3 ≈ 180/π):

pitch = -arcsin(2*q1*q3 + 2*q0*q2) * 57.3

roll = atan2(2*q1*q2 - 2*q0*q3, 2*q0*q0 + 2*q1*q1 - 1) * 57.3

yaw = atan2(2*q2*q3 - 2*q0*q1, 2*q0*q0 + 2*q3*q3 - 1) * 57.3
For example, the quaternion can be calculated with the following code, and the Euler angles expressed from the quaternion. (The code assumes a 50 Hz sample rate, hence the 1/50 factors, and that the state variables q0-q3, the integral terms exInt/eyInt/ezInt, and the gains Ki and Kp are initialized elsewhere.)

import math

# normalize the accelerometer measurement
norm=math.sqrt(ax*ax+ay*ay+az*az)
# convert to a unit vector
ax=ax/norm
ay=ay/norm
az=az/norm
# estimated direction of gravity
vx=q1*q3-q0*q2
vy=q0*q1+q2*q3
vz=q0*q0-0.5+q3*q3
# error: cross product between the measured direction and the estimated reference direction of gravity
ex=(ay*vz-az*vy)
ey=(az*vx-ax*vz)
ez=(ax*vy-ay*vx)
# accumulate the integral error, scaled by the integral gain Ki
exInt+=ex*Ki*(1/50)
eyInt+=ey*Ki*(1/50)
ezInt+=ez*Ki*(1/50)
# adjusted gyroscope measurement (Kp is the proportional gain)
gx+=Kp*ex+exInt
gy+=Kp*ey+eyInt
gz+=Kp*ez+ezInt
# integrate the quaternion rate
gx*=0.5*(1/50)
gy*=0.5*(1/50)
gz*=0.5*(1/50)
q0+=(-q1*gx-q2*gy-q3*gz)
q1+=(q0*gx+q2*gz-q3*gy)
q2+=(q0*gy-q1*gz+q3*gx)
q3+=(q0*gz+q1*gy-q2*gx)
# normalize the quaternion
norm=math.sqrt(q0*q0+q1*q1+q2*q2+q3*q3)
q0/=norm
q1/=norm
q2/=norm
q3/=norm
# obtain the Euler angles pitch, roll, yaw (57.3 ≈ 180/π converts radians to degrees)
#pitch=math.asin(-2*q1*q3+2*q0*q2)*57.3
#roll=math.atan2(2*q2*q3+2*q0*q1,-2*q1*q1-2*q2*q2+1)*57.3
#yaw=math.atan2(2*(q1*q2+q0*q3),q0*q0+q1*q1-q2*q2-q3*q3)*57.3
pitch=-math.asin(2*q1*q3+2*q0*q2)*57.3
roll=math.atan2(2*q1*q2-2*q0*q3,2*q0*q0+2*q1*q1-1)*57.3
yaw=math.atan2(2*q2*q3-2*q0*q1,2*q0*q0+2*q3*q3-1)*57.3
After the attitude angles of the terminal are obtained from the quaternion, they can assist in judging the attitude of the terminal. The specific judgment method is introduced later.

In addition, the gravity components of the terminal can also be calculated from the quaternion. Writing the quaternion as (q1, q2, q3, q4), the gravity components are expressed by the following formulas:

v.x=2*(q2*q4-q1*q3);

v.y=2*(q1*q2+q3*q4);

v.z=1-2*(q2*q2+q3*q3);

After the above gravity components are obtained from the quaternion, they can be used to judge whether the terminal is head down; the specific judgment method is introduced later.
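For illustration, the gravity-component formulas above can be written as a small helper using the 0-indexed quaternion (q0, q1, q2, q3) of the earlier code (so that q1..q4 above correspond to q0..q3 here). This is a sketch for clarity, not the patented implementation, and the function name is an assumption:

```python
def gravity_components(q0, q1, q2, q3):
    """Gravity components v.x, v.y, v.z of a unit quaternion
    (the 0-indexed form of the formulas above)."""
    vx = 2 * (q1 * q3 - q0 * q2)
    vy = 2 * (q0 * q1 + q2 * q3)
    vz = 1 - 2 * (q1 * q1 + q2 * q2)
    return vx, vy, vz
```

For the identity quaternion (1, 0, 0, 0), i.e. the phone lying flat with the screen up, this yields (0, 0, 1): gravity falls entirely on the z axis, as expected.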
The method for preventing false touches of the embodiments of this application is described below with reference to FIG. 5 to FIG. 9.

Referring to FIG. 5, FIG. 5 shows a schematic flowchart of the method for preventing false touches of an embodiment of this application. It can be understood that the acceleration sensor, gyroscope sensor, and ambient light sensor in FIG. 5 are only exemplary descriptions, and this application is not limited thereto. It can be understood that these sensors may also be replaced by sensors with the same functions. For example, the ambient light sensor is used to detect the ambient light where the terminal is located, so it may be replaced by another sensor that detects the ambient light where the terminal is located. The method can be applied to the scenario shown in FIG. 1. The method includes:

S401: Obtain a quaternion of the terminal through the acceleration sensor and the gyroscope sensor, where the quaternion is used to represent the attitude of the terminal.

Specifically, the quaternion of the terminal is obtained as follows: 3-axis data is collected by each of the acceleration sensor and the gyroscope to obtain 6-axis data, and the 6-axis data is input into the six-axis fusion module mentioned above. The six-axis fusion module outputs the quaternion of the terminal based on the 6-axis data.

S402: Determine first information and attitude angle information of the terminal based on the quaternion of the terminal.

The attitude angle information is used to identify the attitude of the terminal. As described above, the attitude angles of the terminal can be calculated from the quaternion. In judging whether the terminal is in the head-down pocket mode, mainly the pitch angle is used. The attitude angle information of the terminal includes the pitch angle of the terminal.

The first information is used to identify whether the terminal is in a head-down state. The head-down state is the situation in which the forward direction of the terminal (the direction of the terminal's head when the user uses it in portrait orientation) points downward. In some embodiments, whether the terminal is head down can be judged by whether the head-down angle of the terminal, when the forward direction points downward, is within a certain preset range. For example, when the terminal is placed upside down in a pocket, the terminal is in a head-down state.

In addition, as described above, the gravity components v.x, v.y, and v.z can also be calculated from the quaternion, where v.y and v.z can serve as the basis for judging that the terminal is head down. The first information of the terminal includes the gravity components v.y and v.z.

S403: Detect motion information of the terminal through the acceleration sensor, where the motion information is used to identify the motion state of the terminal.

The motion state of the terminal includes the following two states: the stable state and the moving state.

The stable state means the terminal remains relatively steady over a certain period of time. "Remaining relatively steady over a certain period of time" does not require that the terminal never shakes during that period; if the terminal shakes slightly at some moment but remains steady overall during the period, this is still considered a stable state. For example, when the user walks with the phone in a pocket, the terminal may shake slightly; the terminal can still be considered to be in a stable state.

The moving state means that, in certain scenarios in which the user uses the terminal, the terminal shakes or swings to some extent. For example, when the user walks holding the terminal, the terminal is in a moving state.

The motion information includes the accelerometer combined velocity value. The motion state of the terminal can be judged by comparing the accelerometer combined velocity value of the terminal with a combined velocity threshold, as described in detail later.

S404: Detect ambient light information of the terminal through the ambient light sensor, where the ambient light information is used to identify the light intensity (or light brightness) of the environment in which the terminal is located. In other words, the ambient light information identifies how strong or weak the light of the terminal's environment is. The strength of the light can specifically be characterized by an illuminance value. For example, the ambient light information includes the illuminance of the terminal's environment, and the unit of the ambient illuminance value may be lux.

For example, if the user is holding and using the terminal during the day, the illuminance value of the terminal's environment detected by the ambient light sensor will be relatively large; if the terminal is in the user's coat pocket, the illuminance value detected by the ambient light sensor will be relatively small.
S405,基于所述环境光信息、所述第一信息、所述姿态角信息以及所述运动信息,确定所述终端处于第一模式,所述第一模式用于标识所述终端的状态信息满足终端的相应预设条件。
其中,所述终端的状态信息包括终端在各个维度(包括:环境光线强度、终端所处的运动状态、终端的头朝下状态、终端的空间姿态等)的信息。在本申请实施例中,终端处于第一模式需要结合终端在各个维度的信息来判断。
例如,第一模式可以指头朝下的口袋模式。其中,头朝下的解释参见前文。
口袋模式的解释可参考下文:通常来讲,口袋模式是对用户在没有使用终端需求的情况下,将终端放置在某个弱光环境(甚至是不见光)下的一类场景的统称。比如说,用户在走路时,可以将终端放置在裤子口袋中或者背包中。又比如说,用户可以将终端放置在抽屉中等。结合上文关于头朝下状态的解释,例如,头朝下的口袋模式可以指终端被置于 某个弱光环境,且是头朝下的状态。应理解,口袋模式只是对上述一类场景的一个命名,此命名不对本申请实施例的保护范围构成限定。
所述终端处于头朝下的口袋模式包括以下条件:所述头朝下信息满足第一预设条件、所述姿态角信息满足第二预设条件、所述运动信息满足第三预设条件以及所述环境光信息满足第四预设条件。
在本申请实施例中,判断终端是否处于头朝下的口袋模式,要基于上述4种信息共同判断。在上述4种信息均满足预设条件的情况下,终端才能被判断为是处于头朝下的口袋模式。
可选地,所述头朝下信息满足第一预设条件,包括:(1)终端的第一重力分量(记作v.y)和第二重力分量(记作v.z)两轴的乘积为负数;(2)v.y的绝对值大于第一阈值,v.z的绝对值小于该第一阈值。第一阈值可以是5-10中的一个数值(包含端点值)。应理解,本申请对第一阈值的取值不作具体限定。比如,第一阈值可以是6,或者7,或者8等。在具体实现时,可以引入头朝下标识(headdown_flag)来标识终端是否是头朝下的状态。
可选地,所述姿态角信息满足第二预设条件,包括:终端的俯仰角在预设角度范围内。例如,预设角度范围是70度到130度。在具体实现时,可以引入俯仰角标识(pitch_flag)来标识终端的俯仰角是否在预设角度范围内。
可选地,所述运动信息满足第三预设条件,包括:连续n帧的加速度计合速度值小于或等于合速度阈值。例如,合速度阈值的取值可以为900-1300中的某个值。比如,合速度阈值取值为900、1000、1050或1100等。
可选地,所述运动信息满足第三预设条件还包括:所述连续n帧中第i帧的加速度计合速度值与第i-1帧的加速度计合速度值的差值小于预定差值阈值,其中,i∈[2,n],n≥2。例如,假设n取值为5,i取值为4,对于连续5帧的加速度计合速度值,第4帧与第3帧的加速度计合速度值的差值小于预定差值阈值。预定差值阈值可以是100-180(包含端点值)中的某个值,本申请对此不作限定。比如,预定差值阈值可以取110、120、135、150或160等。在具体实现时,可以引入稳定状态标识(steady_flag)标识终端是处于运动状态还是口袋模式下的稳定状态。
Optionally, the ambient light information satisfying the fourth preset condition includes: the terminal's ambient illuminance is less than or equal to a first light threshold. For example, the first light threshold may be set to 6.0 lux, 7.0 lux or another value, without specific limitation.
It should be understood that the above descriptions of the head-down information satisfying the first preset condition, the attitude angle information satisfying the second preset condition, the motion information satisfying the third preset condition, and the ambient light information satisfying the fourth preset condition are exemplary, and this application is not limited to them. In fact, those skilled in the art may set other reasonable criteria for judging that the terminal is in the head-down pocket mode.
S406: The terminal enters the screen-off state, i.e. a state in which the terminal's screen emits no light.
In the embodiments of this application, when the terminal receives a signal that triggers the screen to light up (a signal possibly caused by an accidental touch), multiple sensors cooperatively detect the terminal's ambient light information, first information, attitude angle information and motion information, and these are used to determine whether the terminal is in the head-down pocket mode. If the terminal is determined to be in the first mode (that is, the head-down pocket mode), it enters the screen-off state to save power; moreover, false wake-ups in both the always-on display (AOD) state and the lock-screen lit state are avoided, effectively preventing accidental touches.
For ease of understanding, the decision logic for each kind of information is described in detail below with reference to the flow shown in Figure 6. As shown in Figure 6, the flow includes:
501: Detect, through the ambient light sensor, whether the terminal's ambient light is greater than a second light threshold.
If the terminal's ambient light is detected to be greater than the second light threshold, the AOD lights up; otherwise, jump to 510 and judge whether the terminal's ambient light is greater than the first light threshold (i.e. the first light threshold mentioned above).
502: Obtain the current frame's accelerometer resultant value through the accelerometer.
From the current frame's accelerometer resultant value, one can judge whether the current frame is in a relatively steady state.
Based on the current frame's accelerometer resultant value, the relatively steady state of the current frame can be judged by the following condition (1): whether the current frame's accelerometer resultant value is greater than the resultant threshold. If it is not greater than the resultant threshold, the terminal can be considered relatively steady in the current frame; if it is greater than the resultant threshold, the terminal can be considered not relatively steady in the current frame.
The accelerometer resultant value can be denoted a and expressed by the formula a = x² + y² + z², where the values of x, y and z are collected by the accelerometer. The resultant threshold can be denoted A, and A may take any value in [900, 1300] (endpoints included), which this application does not limit.
Optionally, besides condition (1) above, the relatively steady state can also be judged by the following condition (2): whether the difference between the current frame's accelerometer resultant value and that of the previous frame does not exceed the predetermined difference threshold. That is, if the current frame's resultant value is not greater than the resultant threshold, and the difference between the current frame's resultant value and the previous frame's does not exceed the predetermined difference threshold, the terminal can be considered relatively steady in the current frame; if the current frame's resultant value is greater than the resultant threshold, and the difference from the previous frame exceeds the predetermined difference threshold, the terminal is determined not to be relatively steady in the current frame.
For the description of the predetermined difference threshold, refer to the foregoing; for brevity, it is not repeated here.
In addition, a buffer may be introduced here to save the current frame's data as the basis for the subsequent step 503. The data saved in the buffer may include the already-computed accelerometer resultant value of each frame, or the per-frame state decision result, without specific limitation. The purpose of the buffer is to keep the data of several consecutive frames for use in the subsequent judgment step 503; for example, the buffer always holds the data of the latest 5 frames.
The data of multiple consecutive frames can then be used to judge whether the terminal is in a relatively steady state.
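The buffer described above can be kept as a fixed-length ring buffer. The sketch below stores the already-computed resultant value of each frame and keeps only the latest 5 frames, as in the example; the names are illustrative, not from the patent.

```python
from collections import deque

FRAME_WINDOW = 5                       # buffer always holds the latest 5 frames
frame_buffer = deque(maxlen=FRAME_WINDOW)

def push_frame(x, y, z):
    """Compute the current frame's resultant value a = x^2 + y^2 + z^2
    from the raw accelerometer axes and append it; once the buffer is
    full, the oldest frame is dropped automatically."""
    a = x * x + y * y + z * z
    frame_buffer.append(a)
    return a
```

Step 503 can then read the whole window via `list(frame_buffer)`.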
503: Judge whether all of n consecutive frames are steady.
If all n consecutive frames are steady, the terminal can be determined to be in the steady state; if the n consecutive frames do not all satisfy the steady condition, the terminal is considered to be in the motion state.
It can be understood that the state of each of the n consecutive frames can be judged in the same way as the current-frame steadiness judgment described above.
For example, if the accelerometer resultant values of n consecutive frames are all smaller than the resultant threshold, and within those n consecutive frames the difference between the resultant values of frame i and frame i-1 is smaller than the predetermined difference threshold for i ∈ [2, n], the terminal can be determined to be in the steady state.
It can be understood that the magnitude of the accelerometer resultant value depends on the terminal's motion state. For example, the larger the terminal's motion amplitude, the larger the resultant value. The motion amplitude may in turn depend on where the terminal is located (e.g. in a pocket or in the user's hand) and on the user's activity scenario.
Take walking as the user's activity. When the user walks, the terminal may be in one of two situations: in a trouser pocket, or in the user's hand. The difference is that if the terminal is in a trouser pocket, its motion amplitude while the user walks is not large, whereas if the terminal is in the user's hand, it swings widely with the arm as the user walks. Figure 7 shows a user walking while holding terminal 12. As can be seen from Figure 7, when the user walks holding the phone, terminal 12 moves with the user's hand as the arm swings. Compared with the scenario of Figure 1, the accelerometer resultant value of terminal 12 in the Figure 7 scenario is larger than in the Figure 1 scenario. For example, in the Figure 7 scenario the resultant value of terminal 12 is any value in [900, 1300] (endpoints included), while in the Figure 1 scenario it is any value in [500, 800] (endpoints included), such as 550, 600 or 700.
When judging the terminal's motion state, one cannot simply decide from the data of a single frame whether the terminal is moving or steady; the data of n consecutive frames must be combined. This avoids misjudging the motion state when the terminal undergoes a large jolt in pocket mode, helping to ensure an accurate result. For example, in pocket mode the terminal may occasionally experience a single large jolt whose computed resultant value exceeds the resultant threshold; but such a jolt is sporadic and does not by itself mean the terminal is in the motion state.
The data of the above n consecutive frames can be obtained from the buffer introduced at 502. For example, the resultant values of n consecutive frames are read from the buffer, and each frame's resultant value is compared with the resultant threshold. If the resultant values of all n consecutive frames are greater than the resultant threshold, the terminal is determined to be in the motion state. If the resultant values of all n consecutive frames are smaller than the resultant threshold, the terminal is determined to be in the steady state inside a pocket. Furthermore, every case other than "the resultant values of all n consecutive frames are greater than the resultant threshold" is here treated as the steady state; in other words, if it is not the "all greater" case, the terminal is considered to be in the steady state in a pocket.
504: Determine the steady-state flag (steady_flag) based on the result of 503.
The steady-state flag indicates whether the terminal is in the steady state or the motion state. For example, when steady_flag is 1, the terminal is in the steady state in a pocket; when steady_flag is 0, the terminal is in the motion state.
If the result of 503 is no (for example, the accelerometer resultant values of n consecutive frames are all greater than the resultant threshold), set steady_flag to 0. If the result of 503 is yes, set steady_flag to 1.
Steps 502-504 above are the steps for obtaining the steady-state flag.
505: Feed the 3-axis data obtained through the accelerometer and the 3-axis data obtained through the gyroscope into the six-axis fusion algorithm module.
506: The six-axis fusion algorithm module outputs a quaternion.
507: Compute the gravity components from the quaternion and perform the head-down judgment.
508: Compute the attitude angles from the quaternion and perform the attitude angle judgment.
For how the quaternion is obtained, and how the attitude angles and gravity components are computed from it, refer to the foregoing description; it is not repeated here.
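The patent does not spell out a formula for the pitch computation in step 508. Since the preset window (70 to 130 degrees) exceeds the ±90-degree range of the usual asin convention, the sketch below assumes a [0, 180]-degree convention, taking pitch as the angle between the device's screen normal and the world vertical. This is an assumption for illustration, not necessarily the convention used in the actual implementation.

```python
import math

def pitch_from_quaternion(q0, q1, q2, q3):
    """Pitch in degrees, assuming a [0, 180] convention: the angle between
    the body z-axis (screen normal) and the world vertical, derived from
    the gravity z-component of the unit quaternion (w, x, y, z)."""
    v_z = q0 * q0 - q1 * q1 - q2 * q2 + q3 * q3
    v_z = max(-1.0, min(1.0, v_z))     # clamp numerical noise before acos
    return math.degrees(math.acos(v_z))

def pitch_flag(pitch_deg, low=70.0, high=130.0):
    """Return 1 if the pitch lies inside the preset angle range, else 0."""
    return 1 if low <= pitch_deg <= high else 0
```

Under this convention a device lying flat gives 0 degrees (flag 0), while a device standing roughly vertical in a pocket gives about 90 degrees (flag 1).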
The head-down judgment checks whether the terminal's first gravity component (denoted v.y) and second gravity component (denoted v.z) satisfy the following two conditions: (1) the product of v.y and v.z is negative; (2) the absolute value of v.y is greater than the first threshold, and the absolute value of v.z is smaller than that first threshold.
The attitude angle judgment checks whether the pitch angle is within the preset angle range.
509: Determine the head-down flag (headdown_flag) based on the result of step 507, and determine the pitch flag (pitch_flag) based on the result of step 508.
The head-down flag indicates whether the terminal is head-down. For example, when headdown_flag is 1, the terminal is head-down; when headdown_flag is 0, the terminal is not head-down. Specifically, if the gravity components in 507 satisfy the preset condition, the terminal is in the head-down state and headdown_flag is set to 1; if the gravity components in 507 do not satisfy the preset condition, the terminal is considered not head-down and headdown_flag is set to 0.
The pitch flag indicates whether the terminal's pitch angle is within the preset angle range. For example, when pitch_flag is 1, the pitch angle is within the preset range; when pitch_flag is 0, it is not. Specifically, if 508 finds the pitch angle within the preset angle range, pitch_flag is set to 1; if 508 finds it outside the preset angle range, pitch_flag is set to 0.
Steps 505-509 above are the steps for obtaining the head-down flag and the pitch flag.
It should be understood that steps 501, 502-504 and 505-509 above can be regarded as three branches of the algorithm. In a concrete implementation, these three branches may run simultaneously; the step numbering does not imply any execution order among them.
510: Judge whether the steady-state flag is 1, the head-down flag is 1, the pitch flag is 1, and the ambient light is less than or equal to the first light threshold.
Here, the first light threshold is smaller than the second light threshold.
If the steady-state flag is 1, the head-down flag is 1, the pitch flag is 1 and the ambient light is less than or equal to the first light threshold, the terminal is in the head-down pocket mode, and 511 is executed.
Otherwise, if the ambient light is greater than the first light threshold, the ambient light around the terminal continues to be monitored, and 512 is executed.
511: Judge whether the interface is the AOD interface or the lock-screen interface (i.e. the unlock interface).
If it is the AOD interface, the AOD is turned off. If it is the lock-screen interface, the terminal enters the accidental-touch prevention mode.
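Steps 510 and 511 can be combined into a single decision sketch. The flag semantics follow the text; the return strings and the 6.0 lux default are illustrative names, not from the patent.

```python
def on_wake_trigger(steady, headdown, pitch, ambient_lux, interface,
                    first_light_threshold=6.0):
    """All four conditions of step 510 must hold for the head-down pocket
    mode; step 511 then chooses the action based on the current interface
    ('aod' for the always-on display, 'lock' for the lock screen)."""
    pocket_mode = (steady == 1 and headdown == 1 and pitch == 1
                   and ambient_lux <= first_light_threshold)
    if not pocket_mode:
        return "keep_monitoring"       # e.g. fall through to step 512
    return "aod_off" if interface == "aod" else "anti_touch_mode"
```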
Figure 8 shows the interface displayed in the accidental-touch prevention mode. As shown in Figure 8, interface 801 presents the text "Do not cover the top of the screen" together with a corresponding illustration. The anti-touch interface also shows the text "Swipe down twice to force-exit the accidental-touch prevention mode"; that is, the user can exit the mode by swiping down twice on control 802. Understandably, if the terminal is currently in the head-down pocket mode, the user will not swipe control 802, and the screen will turn off automatically.
In addition, under the lock-screen interface, entering accidental-touch prevention presupposes either that the terminal's accidental-touch prevention mode has been enabled, or that the feature is built into the terminal (i.e. the user does not need to enable it).
Figure 9 shows the interface for enabling the accidental-touch prevention mode. In the interface shown in Figure 9, accessibility control 901 includes option 902 for the accidental-touch prevention mode; by tapping option 902, the user can choose to turn the mode on or off. The interface shown in Figure 9 displays option 902 with the mode enabled. Once the user enables this feature, an accidental touch on the lock screen can put the phone into the accidental-touch prevention mode and bring up the interface shown in Figure 8. Understandably, accessibility control 901 also includes options for other features (such as the accessibility, one-handed mode, quick launch and gestures, smart multi-window, and scheduled power on/off options shown in Figure 9), without specific limitation.
512: Judge whether the ambient light is greater than the second light threshold. If yes, the AOD lights up; if no, the AOD turns off.
In the embodiments of this application, the above accidental-touch prevention method applies regardless of whether the terminal is equipped with a proximity light sensor.
As one possible implementation, the terminal is equipped with a proximity light sensor. Correspondingly, the method in Figure 6 further includes:
513: Judge proximity through the proximity light sensor. If something is near, the AOD turns off; if nothing is judged to be near, jump to 510.
In some scenarios (for example, when the terminal is placed in the pocket of dark clothing), the proximity light sensor is not very sensitive, making its result inaccurate. For this reason, when the proximity light sensor judges "not near", the conditions in 510 above can additionally be applied to determine whether the terminal is in the head-down pocket mode.
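The proximity-sensor fallback of step 513 can be sketched as below; `pocket_mode_check` stands in for the four-condition test of step 510 and is passed as a callable. This structure is an illustrative assumption, not the patent's exact interface.

```python
def proximity_branch(proximity_near, pocket_mode_check):
    """Step 513: a 'near' reading turns the AOD off immediately; a
    'not near' reading can be unreliable (e.g. in a dark-fabric pocket),
    so the flow falls back to the step-510 pocket-mode test."""
    if proximity_near:
        return "aod_off"
    return "pocket_mode" if pocket_mode_check() else "keep_monitoring"
```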
It can be understood that the examples in Figures 6-9 are only intended to help those skilled in the art understand, and do not limit the scope of protection of the embodiments of this application.
As can be seen from the above, in the accidental-touch prevention method provided by this application, multiple sensors cooperatively detect the terminal's ambient light information, head-down information, attitude angle information and motion information, and these are used to determine whether the terminal is in the head-down pocket mode. This effectively solves the problem of preventing accidental touches in pocket mode and greatly improves the user experience.
This application further provides a computer program product which, when executed by a processor, implements the method described in any method embodiment of this application.
The computer program product may be stored in a memory and, after processing such as preprocessing, compilation, assembly and linking, is finally converted into an executable object file that can be executed by a processor.
This application further provides a computer-readable storage medium storing a computer program which, when executed by a computer, implements the method described in any method embodiment of this application. The computer program may be a high-level language program or an executable object program.
The computer-readable storage medium may be a volatile memory or a non-volatile memory, or may include both. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (electrically EPROM, EEPROM) or a flash memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM) and direct rambus RAM (DR RAM).
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes and technical effects of the apparatus and device described above may refer to the corresponding processes and technical effects in the foregoing method embodiments, and are not repeated here.
In the several embodiments provided in this application, the disclosed system, apparatus and method may be implemented in other ways. For example, some features of the method embodiments described above may be ignored or not performed. The apparatus embodiments described above are merely illustrative; the division into units is only a division by logical function, and other divisions are possible in actual implementation; multiple units or components may be combined or integrated into another system. In addition, the coupling between units or between components may be direct or indirect, and the above coupling includes electrical, mechanical or other forms of connection.
It should be understood that, in the various embodiments of this application, the sequence numbers of the processes do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of this application.
In addition, the terms "system" and "network" are often used interchangeably herein. The term "and/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following associated objects.
In summary, the above are merely preferred embodiments of the technical solution of this application and are not intended to limit its scope of protection. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of this application shall fall within the scope of protection of this application.

Claims (13)

  1. A method for preventing accidental touch, characterized by comprising:
    obtaining a quaternion of a terminal through an accelerometer and a gyroscope sensor, the quaternion being used to characterize the attitude of the terminal;
    determining, based on the quaternion of the terminal, first information and attitude angle information of the terminal, wherein the first information is used to identify whether the terminal is in a head-down state, and the attitude angle information is used to identify the attitude of the terminal;
    detecting motion information of the terminal through the accelerometer, the motion information being used to identify the motion state of the terminal;
    detecting ambient light information of the terminal through an ambient light sensor, the ambient light information being used to identify the light intensity of the environment in which the terminal is located;
    determining, based on the first information, the attitude angle information, the motion information and the ambient light information, that the terminal is in a first mode, the first mode being used to indicate that the state information of the terminal satisfies corresponding preset conditions of the terminal; and
    the terminal entering a screen-off state.
  2. The method according to claim 1, characterized in that determining that the terminal is in the first mode based on the first information comprises: the first information satisfying a first preset condition;
    wherein the first information satisfying the first preset condition comprises:
    the product of a first gravity component and a second gravity component of the terminal being negative;
    and the absolute value of the first gravity component being greater than a first threshold, and the absolute value of the second gravity component being smaller than the first threshold;
    wherein the first gravity component and the second gravity component are computed based on the quaternion.
  3. The method according to claim 1 or 2, characterized in that determining that the terminal is in the first mode based on the attitude angle information comprises: the attitude angle information satisfying a second preset condition;
    wherein the attitude angle information satisfying the second preset condition comprises:
    the pitch angle of the terminal being within a preset angle range;
    wherein the pitch angle of the terminal is computed based on the quaternion.
  4. The method according to any one of claims 1 to 3, characterized in that determining that the terminal is in the first mode based on the motion information comprises: the motion information satisfying a third preset condition;
    wherein the motion information satisfying the third preset condition comprises:
    the accelerometer resultant values of n consecutive frames being less than or equal to a resultant threshold, where n ≥ 2 and n is an integer.
  5. The method according to claim 4, characterized in that the motion information satisfying the third preset condition further comprises:
    the difference between the accelerometer resultant value of the i-th frame and that of the (i-1)-th frame of the n consecutive frames being smaller than a predetermined difference threshold, where i ∈ [2, n].
  6. The method according to any one of claims 1 to 5, characterized in that determining that the terminal is in the first mode based on the ambient light information comprises: the ambient light information satisfying a fourth preset condition;
    wherein the ambient light information satisfying the fourth preset condition comprises:
    the ambient illuminance of the terminal being less than or equal to a first light threshold.
  7. The method according to claim 6, characterized in that, when the ambient illuminance of the terminal is greater than the first light threshold, the method further comprises:
    detecting, through the ambient light sensor, whether the ambient light of the terminal is greater than a second light threshold, the second light threshold being greater than the first light threshold;
    when the ambient illuminance of the terminal is greater than the second light threshold, lighting up the screen of the terminal; and
    when the ambient illuminance of the terminal is less than or equal to the second light threshold, the terminal entering the screen-off state.
  8. The method according to any one of claims 1 to 7, characterized in that, before determining that the terminal is in the first mode based on the first information, the attitude angle information, the motion information and the ambient light information, the method further comprises:
    detecting reflected light information of the terminal through a proximity light sensor;
    wherein determining that the terminal is in the first mode based on the first information, the attitude angle information, the motion information and the ambient light information comprises:
    when no reflected light is detected, determining that the terminal is in the first mode based on the first information, the attitude angle information, the motion information and the ambient light information.
  9. The method according to any one of claims 1 to 8, characterized in that, before the terminal enters the screen-off state, the method further comprises:
    detecting the interface of the terminal;
    wherein the terminal entering the screen-off state comprises:
    if the interface of the terminal is an always-on display (AOD) interface, turning off the AOD; and
    if the interface of the terminal is a lock-screen interface, the terminal entering an accidental-touch prevention mode.
  10. An electronic device, characterized by comprising a processor and a memory, the processor and the memory being coupled, the memory being configured to store a computer program which, when executed by the processor, causes the electronic device to perform the method according to any one of claims 1 to 9.
  11. A computer-readable storage medium, characterized in that it stores a computer program which, when executed by a processor, causes the processor to perform the method according to any one of claims 1 to 9.
  12. A chip, characterized by comprising a processor which, when executing instructions, performs the method according to any one of claims 1 to 9.
  13. A computer program product, characterized by comprising a computer program which, when run, causes a computer to perform the method according to any one of claims 1 to 9.
PCT/CN2022/117611 2021-12-01 2022-09-07 Method and apparatus for preventing accidental touch WO2023098208A1

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22857074.3A EP4212999A4 (en) 2021-12-01 2022-09-07 METHOD AND DEVICE FOR PREVENTING ACCIDENTAL CONTACT

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111459980.8 2021-12-01
CN202111459980.8A CN116204075A (en) 2021-12-01 2021-12-01 Method and apparatus for preventing accidental touch

Publications (1)

Publication Number Publication Date
WO2023098208A1 2023-06-08




