WO2023142736A1 - Touch operation response method and electronic device - Google Patents

Touch operation response method and electronic device

Info

Publication number
WO2023142736A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
area
user
interface
duration
Application number
PCT/CN2022/138814
Other languages
English (en)
French (fr)
Inventor
赵文龙 (Zhao Wenlong)
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Publication of WO2023142736A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Definitions

  • the present application relates to the technical field of terminals, and in particular to a touch operation response method and electronic equipment.
  • in scenarios such as walking, running, and exercising, the electronic device carried by the user shakes along with the user.
  • in such scenarios, the user may still trigger the display interface of the electronic device to execute a corresponding function.
  • Embodiments of the present application provide a touch operation response method and an electronic device, which can improve the problem of invalid touch or false touch when a user inputs a touch operation to the electronic device in a scene with high acceleration.
  • the present application provides a method for responding to a touch operation, which is applied to an electronic device.
  • the touch operation response method provided in the present application includes: the electronic device displays a first interface. When the first area of the first interface receives a user's touch operation, the electronic device acquires the acceleration in the direction of its screen. When the acceleration is greater than the acceleration threshold, the electronic device determines the second area on the first interface according to the first displacement of the first area within the previous first duration, and executes the function corresponding to the second area being triggered.
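The correction just described can be sketched in Python. This is a minimal illustration under stated assumptions, not the disclosed implementation: the acceleration threshold, the first duration, the uniform-acceleration model s = a·t²/2, and the scale between device motion and screen coordinates are all placeholders, since the application discloses none of these values.

```python
# Hedged sketch of the claimed correction. The threshold, the first duration,
# and the motion-to-pixel scale are illustrative assumptions, not disclosed
# values.

ACCEL_THRESHOLD = 0.2      # m/s^2, example acceleration threshold
FIRST_DURATION = 0.5       # s, time from touch intent to finger contact
PIXELS_PER_METRE = 1.0     # placeholder scale between device motion and screen

def corrected_touch_point(touch_xy, accel_xy):
    """Map the actually touched point (first area) to the intended point (second area)."""
    x, y = touch_xy
    ax, ay = accel_xy
    if (ax * ax + ay * ay) ** 0.5 <= ACCEL_THRESHOLD:
        return touch_xy                      # device steady: no correction
    # First displacement of the device over the first duration, assuming
    # uniform acceleration: s = 1/2 * a * t^2, applied per axis.
    sx = 0.5 * ax * FIRST_DURATION ** 2 * PIXELS_PER_METRE
    sy = 0.5 * ay * FIRST_DURATION ** 2 * PIXELS_PER_METRE
    # The intended ("second") area lies opposite to the device's displacement.
    return (x - sx, y - sy)
```

With the device moving upward during the touch, the correction shifts the touched point back toward where the control was when the user initiated the touch.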
  • the electronic device determines the target area that overlaps with the first area as the third area, wherein the third area includes a touch control; the electronic device then executes the function corresponding to the touch control in the third area being triggered.
  • after the first area of the first interface of the electronic device receives a touch operation, if the motion state of the electronic device indicates that the acceleration in the direction of the screen is greater than the acceleration threshold, the electronic device has moved noticeably between the moment the user initiates the touch and the moment the touch lands on the first interface. As a result, there is a large offset between the first area the user actually triggers and the area the user originally intended to trigger. The electronic device therefore determines the second area according to the first displacement of the first area within the previous first duration. Understandably, the second area is the area the user originally intended to trigger, so executing the function corresponding to the second area being triggered is more accurate.
  • the third area determined by the electronic device includes a touch control. Since the third area is larger than the area occupied by the touch control itself, the touch can still land inside the third area even if the first area actually triggered by the user is offset from the touch control the user originally intended to trigger.
  • the electronic device executes the function corresponding to the touch control in the third area triggered by the user, that is, the function corresponding to the touch control in the third area that overlaps with the first area; this approach has high reliability.
  • the method provided in the present application further includes: setting the first duration by the electronic device in response to a user's trigger operation.
  • the user can set the first duration that matches himself.
  • the method provided by the present application further includes: the electronic device displays a second interface, where the second interface includes a target control and a first control; the first control is used to instruct the user to trigger the countdown,
  • and the target control is the control the user is instructed to trigger after the countdown ends.
  • the electronic device starts counting down in response to the user's trigger operation on the first control. After the countdown is finished, the electronic device records the first moment when the countdown is finished. The electronic device records the second moment in response to the user's trigger operation on the target control. The electronic device calculates the time difference between the first moment and the second moment to obtain the first duration.
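The countdown-based measurement just described can be sketched as follows. This is a hedged stand-in: `wait_for_tap` is a hypothetical callback that blocks until the user triggers the target control, where a real device would instead hook the touch event.

```python
import time

# Hedged sketch of the countdown-based calibration of the first duration.
# `wait_for_tap` is a hypothetical stand-in for whatever blocks until the
# target control is triggered.

def measure_first_duration(wait_for_tap, countdown_s=3):
    for remaining in range(countdown_s, 0, -1):
        print(remaining)                 # show the remaining countdown seconds
        time.sleep(1)
    print(0)                             # "0" prompts the user to tap now
    t_first = time.monotonic()           # first moment: countdown ends
    wait_for_tap()                       # blocks until the target control is tapped
    t_second = time.monotonic()          # second moment: trigger received
    return t_second - t_first            # the first duration
```

A monotonic clock is used for both timestamps so the difference is unaffected by wall-clock adjustments.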
  • the first duration is preconfigured by the electronic device.
  • the first duration does not need to be configured by the user, which saves the user's operations.
  • the first duration includes multiple different values, and each value of the first duration corresponds to a type parameter.
  • the method provided by the present application further includes: the electronic device obtains the type parameter.
  • the electronic device determines the value of the first duration according to the type parameter.
  • determining the value of the first duration according to the type parameter can make the determined first duration more reliable.
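A type-parameter lookup of this kind might look like the following sketch. Both the keys (age brackets as an example type parameter) and the duration values are placeholders; the actual mapping table is not disclosed in this excerpt.

```python
# Illustrative mapping from a type parameter to a value of the first duration.
# The keys and durations below are placeholder assumptions, not disclosed data.

FIRST_DURATION_BY_TYPE = {
    "child": 0.35,    # seconds
    "adult": 0.45,
    "senior": 0.55,
}

def first_duration_for(type_param, default=0.45):
    """Look up the first duration for a type parameter, falling back to a default."""
    return FIRST_DURATION_BY_TYPE.get(type_param, default)
```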
  • the electronic device determines the second area on the first interface according to the first displacement of the first area within the previous first duration, including: the electronic device determines, according to the acceleration and the first duration, the first displacement of the first area within the previous first duration; the electronic device then determines the second area on the first interface according to the first displacement.
  • the second displacement, that is, the displacement that would move the second region to coincide with the first region, has the same magnitude as the first displacement and the opposite direction.
  • the electronic device determines the target area that overlaps with the first area as the third area, including: when the acceleration is greater than the acceleration threshold, the electronic device determines at least one target area on the first interface, each target area including a touch control. The electronic device then determines the target area overlapping with the first area as the third area.
  • any two target areas do not overlap.
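The third-area selection can be sketched with axis-aligned rectangles. Modelling each target area as an `(x, y, width, height)` tuple is an assumption for illustration; the application does not specify area geometry.

```python
# Hedged sketch of selecting the "third area": each target area is modelled as
# an axis-aligned rectangle (x, y, width, height) containing a touch control.
# The rectangle model is an illustrative assumption.

def rects_overlap(a, b):
    """True when two axis-aligned rectangles share interior area."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def third_area(first_area, target_areas):
    """Return the target area overlapping the first area, or None."""
    for target in target_areas:                # target areas never overlap each
        if rects_overlap(first_area, target):  # other, so at most one matches
            return target
    return None
```

Because any two target areas do not overlap, the first match found is the unique third area.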
  • the electronic device is a mobile phone, a smart watch, a smart bracelet, a tablet computer, or a vehicle-mounted terminal.
  • the present application also provides a touch operation response apparatus, applied to an electronic device, and the apparatus includes:
  • a display unit configured to display the first interface
  • the processing unit is configured to obtain the acceleration of the screen direction of the electronic device when the first area of the first interface of the electronic device receives a user's touch operation;
  • the processing unit is further configured to determine the second area on the first interface according to the first displacement of the first area within the previous first duration when the acceleration is greater than the acceleration threshold, and to execute the function corresponding to the second area being triggered;
  • the processing unit is further configured to determine the target area overlapping with the first area as the third area when the acceleration is greater than the acceleration threshold, wherein the third area includes a touch control, and to execute the function corresponding to the touch control in the third area being triggered.
  • an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory is used to store code instructions and the processor is used to run the code instructions, so that the electronic device performs the touch operation response method described in the first aspect or any implementation of the first aspect.
  • the embodiment of the present application also provides a computer-readable storage medium storing instructions that, when executed, cause a computer to perform the touch operation response method described in the first aspect or any implementation of the first aspect.
  • the embodiment of the present application further provides a computer program product, including a computer program that, when run, causes a computer to perform the touch operation response method described in the first aspect or any implementation of the first aspect.
  • FIG. 1 is a schematic diagram of a scene in which a user A lifts up an electronic device 100 to record a scene in front of him while walking;
  • FIG. 2 is a schematic diagram of a scene where the electronic device 100 moves upward during the period from when the user A initiates a touch action to when the electronic device 100 receives the trigger operation;
  • FIG. 3 is a schematic diagram of the hardware architecture of the mobile phone provided by the embodiment of the present application.
  • FIG. 4 is a schematic diagram of the hardware architecture of the smart bracelet provided by the embodiment of the present application.
  • FIG. 5 is one of the flow charts of the touch operation response method provided by the embodiment of the present application.
  • FIG. 6 is one of the schematic diagrams of the interface provided by the embodiment of the present application for the user to set the first duration
  • FIG. 7 is a schematic diagram of the interface of the mobile phone 100 according to the embodiment of the present application to execute the function corresponding to the second area of the first interface;
  • FIG. 8 is the second flowchart of the touch operation response method provided by the embodiment of the present application.
  • Fig. 9 is the second schematic diagram of the interface provided by the embodiment of the present application for the user to set the first duration
  • FIG. 10 is a schematic diagram of a scene where the smart bracelet 200 moves upwards during the period from user A initiating a touch action to when the smart bracelet 200 receives a trigger operation;
  • FIG. 11 is one of the interface schematic diagrams of the smart bracelet 200 provided in the embodiment of the present application to execute the functions corresponding to the second area of the first interface;
  • FIG. 12 is the third flowchart of the touch operation response method provided by the embodiment of the present application.
  • FIG. 13 is the second schematic diagram of the smart bracelet 200 performing the functions corresponding to the second area of the first interface provided by the embodiment of the present application;
  • Fig. 14 is the third interface schematic diagram of the smart bracelet 200 provided in the embodiment of the present application to execute the function corresponding to the second area of the first interface;
  • FIG. 15 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application.
  • FIG. 16 is a schematic structural diagram of a chip provided by an embodiment of the present application.
  • words such as “first” and “second” are used to distinguish the same or similar items with basically the same function and effect.
  • the first value and the second value are only used to distinguish different values, and their sequence is not limited.
  • words such as “first” and “second” do not limit the quantity or the execution order, and do not necessarily mean that the items so designated are different.
  • “At least one” means one or more, and “multiple” means two or more.
  • “And/or” describes the association relationship of associated objects, indicating that there may be three types of relationships, for example, A and/or B, which can mean: A exists alone, A and B exist at the same time, and B exists alone, where A, B can be singular or plural.
  • the character “/” generally indicates that the contextual objects are an “or” relationship.
  • “At least one of the following” or similar expressions refers to any combination of the listed items, including any combination of single or plural items.
  • For example, at least one of a, b, or c can represent: a, b, c, a and b, a and c, b and c, or a, b and c, where each of a, b, and c can be singular or plural.
  • electronic devices have become a part of people's work and life, bringing convenience to users.
  • in scenarios where the acceleration of the electronic device is relatively high, for example when the user carries the device while walking, running, or exercising, the electronic device shakes as the user moves.
  • the user may also trigger the display interface of the electronic device to execute the corresponding function.
  • user A lifts up the electronic device 100 to record the scenery ahead while walking, and initiates a touch action on the camera icon 102 on the system desktop 101 displayed on the electronic device 100. Since user A is walking, the electronic device 100 may shake up and down with user A. During the time from user A initiating the touch action to the electronic device 100 receiving the trigger operation, as shown in (a)-(b) in FIG. 2, the electronic device 100 may have moved upward by a distance S. As shown in (b) in FIG. 2, user A therefore does not touch the camera icon 102 but instead touches the area 103 below it, causing an invalid touch.
  • the present application provides a touch operation response method. After a touch operation is received in the first area of the first interface of the electronic device, if the acceleration of the electronic device in the direction of the screen is greater than the acceleration threshold, the electronic device has moved noticeably between the moment the user initiates the touch and the moment the touch lands on the first interface. As a result, there is a large offset between the first area the user actually triggers and the area the user originally intended to trigger. The electronic device therefore executes the function corresponding to the second area being triggered, where the second area is obtained according to the displacement of the first area within the previous first duration. Understandably, the second area is the area the user originally intended to trigger, so this response is more accurate.
  • the above-mentioned electronic device may also be called a terminal, a user equipment (user equipment, UE), a mobile station (mobile station, MS), a mobile terminal (mobile terminal, MT) and so on.
  • the electronic device may be a mobile phone (mobile phone), a wearable device, a tablet computer (Pad), a virtual reality (virtual reality, VR) electronic device, or an augmented reality (augmented reality, AR) electronic device.
  • the embodiments of the present application do not limit the specific technology and specific device form adopted by the electronic device.
  • FIG. 3 is a schematic structural diagram of a mobile phone provided by an embodiment of the present application.
  • the mobile phone can include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, an antenna 1, an antenna 2, and a mobile communication module 150 , a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a sensor module 180, a button 190, an indicator 192, a camera 193, and a display screen 194, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure shown in the embodiment of the present application does not constitute a specific limitation on the mobile phone.
  • the mobile phone may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • Processor 110 may include one or more processing units. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the wireless communication module 160 can provide wireless communication solutions applied on the mobile phone, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), the global navigation satellite system (GNSS), and frequency modulation (FM).
  • the mobile phone realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the mobile phone may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the mobile phone can realize shooting function through ISP, camera 193 , video codec, GPU, display screen 194 and application processor.
  • Camera 193 is used to capture still images or video.
  • the mobile phone may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the internal memory 121 stores the first duration, or stores the mapping relationship between different type parameters and different values of the first duration.
  • the mobile phone can realize the audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • The speaker 170A, also referred to as a “loudspeaker”, is used to convert audio electrical signals into sound signals.
  • the mobile phone can play music or conduct hands-free calls through the speaker 170A.
  • The receiver 170B, also called the “earpiece”, is used to convert audio electrical signals into sound signals. When the mobile phone receives a call or a voice message, the receiver 170B can be placed close to the ear to hear the voice.
  • The microphone 170C, also called the “mic”, is used to convert sound signals into electrical signals.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • the gyroscope sensor 180B can be used to determine the motion posture of the mobile phone.
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the acceleration sensor 180E can detect the acceleration of the mobile phone in various directions (generally three axes).
  • the distance sensor 180F is used to measure the distance.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the temperature sensor 180J is used to detect temperature.
  • the touch sensor 180K is also called “touch device”.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the bone conduction sensor 180M can acquire vibration signals.
  • FIG. 4 is a schematic diagram of a hardware structure of a smart bracelet provided by an embodiment of the present application.
  • the smart bracelet can include a processor 210, an internal memory 221, a charging management module 240, a power management module 241, an antenna 1, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a sensor module 280, Button 290, indicator 292, camera 293, display screen 294, etc.
  • the sensor module 280 may include: a gyro sensor 280B, a barometer 280C, a magnetic sensor 280D, an acceleration sensor 280E, a proximity light sensor 280G, a temperature sensor 280J, a touch sensor 280K, and an ambient light sensor 280L.
  • each module in the wearable device has the same functional principle as the identically named module in the mobile phone described above, and the details are not repeated here.
  • Acceleration in the direction of the screen: the acceleration in the plane in which the display screen of the electronic device lies.
  • the touch operation response method provided by the embodiment of the present application includes:
  • the mobile phone 100 displays the second interface.
  • the second interface includes a target control and a first control
  • the first control is used to instruct the user to trigger the countdown
  • the target control is a control for instructing the user to trigger after the countdown ends.
  • the mobile phone 100 may display a setting list interface 601 in response to a user's trigger operation on the "settings icon" on the system desktop (not shown in FIG. 6 ).
  • the setting list interface 601 includes a first setting option 602 .
  • the first setting option 602 is used to indicate to set the user's touch response time, that is, the time from when the user initiates the touch action to when the mobile phone 100 receives the trigger operation.
  • the mobile phone 100 may display the second interface 603 in response to the user's trigger operation on the first setting option 602 .
  • the second interface 603 includes a target control 604 and a first control 605 .
  • S502 The mobile phone 100 starts counting down in response to the user's trigger operation on the first control.
  • the mobile phone 100 may respond to the user's trigger operation on the first control 605, and the mobile phone 100 starts counting down.
  • the duration of the countdown may be but not limited to 3s (for example, it may also be 4s, or 5s, etc.).
  • the second interface 603 can display the remaining duration of the countdown in seconds, for example “3”, “2”, “1”, and “0”, with an interval of 1s between successive values. Understandably, as shown in (d) of FIG. 6, when the countdown ends, the second interface 603 displays the value “0”, which prompts the user that the countdown has ended.
  • the user may be ready to initiate a touch operation to the target control 604 .
  • the user can initiate touch actions.
  • the mobile phone 100 records the first moment when the countdown ends, that is, records the moment when the user can initiate a touch action.
  • S504 The mobile phone 100 records a second moment in response to the user's trigger operation on the target control.
  • the mobile phone 100 may record the second moment in response to the user's trigger operation on the target control 604 . Understandably, the second moment is the moment when the mobile phone 100 receives a user's trigger operation on the target control 604 .
  • S505 The mobile phone 100 calculates the time difference between the first moment and the second moment to obtain the first duration.
  • the first duration can be understood as the duration from when the user initiates the touch action to when the mobile phone 100 receives the trigger operation.
  • the first duration may be 0.55s, 0.45s, and 0.35s, etc., which is not limited herein.
  • the mobile phone 100 may execute S502-S505 in a loop multiple times (for example, 3 times or more), obtaining multiple durations. The mobile phone 100 may then calculate the average of the obtained durations and store it. For example, if the first durations obtained by the mobile phone 100 are 0.45s, 0.5s, and 0.55s, the calculated average is 0.5s, so the first duration stored in the mobile phone 100 is 0.5s.
  • the average value stored by the mobile phone 100 is the first duration of final storage. Understandably, when the first duration is the average value of multiple obtained first durations, the reliability is higher.
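The averaging in the worked example is straightforward to reproduce:

```python
# Reproducing the worked example from the text: three measured first durations
# are averaged, and the average is stored as the final first duration.
durations_s = [0.45, 0.50, 0.55]
first_duration = sum(durations_s) / len(durations_s)
print(round(first_duration, 2))  # 0.5
```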
  • S506 The mobile phone 100 stores the first duration.
  • the mobile phone 100 sets the first duration in response to the user's trigger operation. In this way, the user can set the first duration that matches himself.
  • the mobile phone 100 may be pre-configured with the first duration when leaving the factory. In this way, the first duration does not need to be configured by the user, which saves the user's operations. In this way, the above S501-S506 can be omitted.
  • the first duration pre-stored in the mobile phone 100 at the factory may be one value, or may include multiple values.
  • each value of the first duration corresponds to a type parameter.
  • the type parameter may be age or exercise state, etc., which is not limited here.
  • the mapping relationship between each value of the first duration and the type parameter may be shown in Table 1 below.
  • S507 The mobile phone 100 displays the first interface.
  • the system desktop 101 of the mobile phone 100 is the first interface.
  • S508 The mobile phone 100 determines that the first area of the first interface receives a user's touch operation.
  • user A may initiate a touch action on the camera icon 102 in the system desktop 101 (ie, the first interface) displayed on the mobile phone 100 for video recording.
  • the user's touch actually lands on the first area 103 of the system desktop 101
  • the mobile phone 100 determines that the first area 103 of the system desktop 101 has received a user's touch operation.
  • S509 The mobile phone 100 acquires the motion state of the mobile phone 100.
  • the motion state is used to represent the acceleration of the mobile phone 100 .
  • the motion state of the mobile phone 100 includes a walking state, a running state, etc., which are not limited here. Understandably, the mobile phone 100 exhibits different accelerations in different motion states.
  • the way for the mobile phone 100 to obtain the motion state is: the mobile phone 100 continuously collects pose parameters and accelerations at different times.
  • the collected pose parameters and accelerations may be filtered data, which can enhance the reliability of the collected data.
  • the mobile phone 100 inputs the collected pose parameters and accelerations into the pre-trained first model, so that the first model outputs a motion state based on the pose parameters and accelerations.
  • the first model may be trained by using massive pose parameters and accelerations as training samples, and using the motion state as a training target.
  • the above S509 may be replaced by: the mobile phone 100 acquires the acceleration of the mobile phone 100 in the direction of the screen.
  • S510 The mobile phone 100 determines whether the acceleration in the direction of the screen is greater than the acceleration threshold, and if so, executes S511.
  • the acceleration of the mobile phone 100 in the direction of the screen can be understood as: the acceleration in the extending direction of the plane where the display screen 194 of the mobile phone 100 is located.
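The "acceleration in the direction of the screen" in S509-S510 can be sketched as the in-plane component of a 3-axis accelerometer reading. This is an illustrative assumption: it takes device coordinates with the z axis normal to the display, so the screen-plane component is the x/y part; the patent does not specify the sensor convention.

```python
import math

def screen_plane_acceleration(ax, ay, az):
    """Magnitude of acceleration in the plane of the display (m/s^2).

    Assumes device coordinates with the z axis normal to the screen,
    so the in-plane ("screen direction") component is the x/y part.
    """
    return math.hypot(ax, ay)

ACCEL_THRESHOLD = 0.2  # m/s^2, the example threshold from the text

def exceeds_threshold(ax, ay, az, threshold=ACCEL_THRESHOLD):
    """S510: decide whether compensation (S511) should run."""
    return screen_plane_acceleration(ax, ay, az) > threshold
```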
  • S511 The mobile phone 100 determines the second area on the first interface according to the first displacement of the first area within the previous first duration.
  • the second displacement that occurs when the second region 104 is moved to coincide with the first region 103 has the same distance as the first displacement and the opposite direction.
  • the mobile phone 100 may shake up and down with user A, so the acceleration of the mobile phone 100 may be greater than the acceleration threshold (eg, 0.2 m/s²).
  • from the time user A initiates the touch action to the time the mobile phone 100 receives the trigger operation, the mobile phone 100 may have moved up a distance S.
  • user A does not touch the camera icon 102 , but touches the first area 103 below the camera icon 102 .
  • the mobile phone 100 needs to determine the second area 104 (ie, the area where the camera icon 102 is located).
  • the following describes how the mobile phone 100 specifically determines the second area on the first interface.
  • the mobile phone 100 may acquire the first duration from the internal memory 121. Furthermore, the mobile phone 100 determines the first displacement of the mobile phone 100 within the previous first duration. Wherein, when the first duration stored in the internal memory 121 includes only one value, the first duration may be obtained directly from the internal memory 121.
  • the mobile phone 100 may acquire the type parameter first.
  • the type parameter is age
  • the mobile phone 100 may obtain the age from the operating system or an application program.
  • the age recorded in the operating system or in a certain application program may be personal information entered by the user when registering an account. Because the internal memory 121 stores a mapping relationship between different ages and different values of the first duration, the mobile phone 100 can determine the first duration from the internal memory 121 according to the acquired age.
  • the mobile phone 100 may determine the first duration from the internal memory 121 based on the motion state obtained in S509 above.
  • determining the value of the first duration according to the type parameter can make the determined first duration more reliable.
  • the mobile phone 100 can calculate the first displacement according to the formula S_k = Σ_{i=1}^{k} ((a(i-1) + a(i)) / 2) · Δt². That is, the first displacement determined by the mobile phone 100 satisfies the condition S_k = Σ_{i=1}^{k} ((a(i-1) + a(i)) / 2) · Δt². Wherein, S_k is the first displacement, Δt is the subdivided interval duration within the first duration, a(i-1) is the acceleration of the mobile phone 100 in the (i-1)-th interval within the first duration, a(i) is the acceleration of the mobile phone 100 in the i-th interval within the first duration, and k is a positive integer.
  • the mobile phone 100 can also calculate the first displacement according to the formula S_k = v₀·t + (1/2)·a·t². That is, the first displacement determined by the mobile phone 100 satisfies the condition S_k = v₀·t + (1/2)·a·t². Wherein, S_k is the first displacement, t is the first duration, a is the acceleration, and v₀ is the moving speed of the mobile phone 100 before the first duration.
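Both displacement calculations above can be sketched numerically. The first variant below integrates a series of sampled accelerations with the trapezoidal rule (assuming, for illustration, that the device is at rest at the start of the window); the second is the closed-form constant-acceleration case. The exact integration scheme in the application may differ.

```python
def displacement_trapezoidal(accels, dt):
    """First displacement over the first duration from sampled
    accelerations a(0)..a(k) at interval dt.

    Velocity and position are both accumulated with the trapezoidal
    rule; the device is assumed to start the window at rest (v0 = 0).
    """
    v = 0.0
    s = 0.0
    for i in range(1, len(accels)):
        a_mid = (accels[i - 1] + accels[i]) / 2.0  # trapezoidal average
        s += v * dt + 0.5 * a_mid * dt * dt
        v += a_mid * dt
    return s

def displacement_uniform(v0, a, t):
    """Closed form for constant acceleration: S = v0*t + a*t^2/2."""
    return v0 * t + 0.5 * a * t * t
```

With constant acceleration the two agree, e.g. a = 2 m/s² over t = 0.5 s from rest gives 0.25 m either way.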
  • the mobile phone 100 may determine the second area 104 (ie, the area within the camera icon 102 ).
  • the second displacement that occurs when the second region 104 is moved to coincide with the first region 103 has the same distance and opposite direction as the first displacement. That is to say, when the user initiates a touch action, he originally intends to touch the second area 104 .
  • the mobile phone 100 moved by the first displacement during the first duration, so the first area 103 of the mobile phone 100 was triggered by the user. In this way, there is an error in the position triggered by the user.
  • to compensate for this error, the mobile phone 100 obtains a second displacement opposite to the first displacement and moves the first area 103 by the second displacement, thereby determining the second area 104 that the user originally intended to touch.
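The compensation step above amounts to shifting the touched rectangle by the inverse of the first displacement. A minimal sketch, with rectangles as `(x, y, w, h)` tuples in screen coordinates (an assumed representation, not specified in the application):

```python
def second_area(first_area, first_displacement):
    """Shift the touched area by the inverse of the first displacement.

    first_area: (x, y, w, h) rectangle actually touched.
    first_displacement: (dx, dy) movement of the device during the
    first duration. The area the user originally aimed at is the
    touched area moved by the opposite (second) displacement.
    """
    x, y, w, h = first_area
    dx, dy = first_displacement
    return (x - dx, y - dy, w, h)
```

For example, if the phone moved so that the touch landed 20 px below the camera icon, shifting the touched area back by 20 px recovers the icon's area.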
  • S512 The mobile phone 100 executes a corresponding function when the second area is triggered.
  • the second area is located where the camera icon 102 is located. Furthermore, as shown in (c) of FIG. 7 , the mobile phone 100 may respond to the function of displaying the shooting preview interface 106 after the camera icon 102 is triggered. Understandably, in this case, the corresponding function when the second region 104 is triggered is the function of displaying the shooting preview interface 106 .
  • the position of the camera icon 102 is the area that the user originally intended to trigger. In this way, after the mobile phone 100 responds to the camera icon 102 being triggered, the reliability of performing the shooting function is higher.
  • the above takes the mobile phone 100 as an example of the electronic device 100 to illustrate the touch operation response method provided in the embodiments of the present application.
  • the following describes the touch operation response method using the smart bracelet 200 as an example; this example does not constitute a limitation on the embodiments of the present application.
  • the basic principles and technical effects of the touch operation response method when the electronic device is the smart bracelet 200 are the same as those of the above-mentioned embodiments; for details, refer to the corresponding content in the above-mentioned embodiments.
  • the touch operation response method includes:
  • S801 The smart bracelet 200 displays a second interface.
  • the second interface includes the target control and the first control.
  • the first control is used to trigger the countdown.
  • the smart bracelet 200 can display a system desktop 1001 .
  • the system desktop 1001 includes a “Settings” icon 1002 .
  • the smart bracelet 200 may display a setting list interface 1003 in response to a user's trigger operation on the “settings” icon 1002 on the system desktop 1001 .
  • the settings list interface 1003 includes a first settings option 1004 .
  • the first setting option 1004 is used to indicate to set the touch response time (that is, the time period from when the user initiates the touch action to when the smart bracelet 200 receives the trigger operation).
  • the smart bracelet 200 may display the second interface 1005 in response to the user's trigger operation on the first setting option 1004 .
  • the second interface 1005 includes a target control 1006 and a first control 1007 . Understandably, the first control 1007 is used to trigger the countdown.
  • S802 The smart bracelet 200 starts counting down in response to the user's trigger operation on the first control.
  • the smart bracelet 200 may respond to the user's trigger operation on the first control 1007 , and the smart bracelet 200 starts counting down.
  • the duration of the countdown may be but not limited to 3s.
  • the second interface 1005 may display the remaining duration of the countdown (in seconds), for example, "3", "2", "1", "0". It can be understood that the interval between displaying "3", "2", "1" and "0" is 1s. Understandably, as shown in (e) of FIG. 9, when the countdown ends, the second interface 1005 displays the value "0", which is used to prompt the user that the countdown has ended.
  • S803 After the countdown ends, the smart bracelet 200 records the first moment when the countdown ended.
  • S804 The smart bracelet 200 records a second moment in response to the user's trigger operation on the target control.
  • the smart bracelet 200 may record the second moment in response to the user's trigger operation on the target control 1006 . Understandably, the second moment is the moment when the smart bracelet 200 receives a user's trigger operation on the target control 1006 .
  • S805 The smart bracelet 200 calculates the time difference between the first moment and the second moment to obtain the first duration.
  • S806 The smart bracelet 200 stores the first duration.
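The S801-S806 calibration flow above can be sketched as a single function: count down, record the first moment when the countdown ends, record the second moment when the user taps the target control, and take the difference. `wait_for_tap` is a hypothetical hook standing in for the real touch event; the patent does not name such an API.

```python
import time

def calibrate_first_duration(countdown_s=3, wait_for_tap=None):
    """Sketch of the countdown-based calibration (S801-S805).

    Returns the first duration in seconds: the interval between the
    countdown ending and the user tapping the target control.
    """
    time.sleep(countdown_s)           # S802: the countdown runs out
    first_moment = time.monotonic()   # S803: record countdown end
    wait_for_tap()                    # user reacts and taps the target
    second_moment = time.monotonic()  # S804: record the tap
    return second_moment - first_moment  # S805: the first duration
```

A real implementation would be event-driven rather than blocking; the blocking form is used here only to keep the sequence of moments visible.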
  • S807 The smart bracelet 200 displays the first interface.
  • the smart bracelet 200 may display a running record interface 1010 (ie, the first interface) in response to the user's trigger operation on the exercise application.
  • the running record interface 1010 includes information such as running mileage, running time, and running speed.
  • the running record interface 1010 also includes a pause control 1007 .
  • S808 The smart bracelet 200 determines that the first area of the first interface receives a user's touch operation.
  • if the user wants to stop recording information such as running mileage, running time, and running speed, the user can initiate a touch action on the pause control 1007 while running.
  • the smart bracelet 200 determines that the first area 1008 of the running record interface 1010 has received a user's touch operation.
  • S809 The smart bracelet 200 obtains the motion state of the smart bracelet 200.
  • the motion state is used to characterize the acceleration of the smart bracelet 200 .
  • S809 may be replaced by: the smart bracelet 200 acquires the acceleration of the smart bracelet 200 in the direction of the screen.
  • S810 The smart bracelet 200 judges whether the acceleration in the direction of the screen is greater than the acceleration threshold, and if so, execute S811.
  • S811 The smart bracelet 200 determines the second area on the first interface according to the first displacement of the first area within the previous first duration.
  • the second displacement that occurs when the second region is moved to coincide with the first region has the same distance and opposite direction to the first displacement.
  • the smart bracelet 200 may shake up and down with user A, so that the acceleration of the smart bracelet 200 may be greater than the acceleration threshold (eg, 0.2 m/s²). In this way, as shown in (a)-(b) in FIG. 10, the smart bracelet 200 may have moved a distance S downward from the time when user A initiates the touch action to the time when the smart bracelet 200 receives the trigger operation. In this way, as shown in (b) in FIG. 10, user A does not touch the pause control 1007, but touches the first area 1008 above the pause control 1007. In this way, as shown in (a) in FIG. 11, the smart bracelet 200 needs to determine the second area 1009, which is the area that the user originally wanted to trigger.
  • S812 The smart bracelet 200 executes the corresponding function when the second area is triggered.
  • the smart bracelet 200 executes the function when the second area 1009 (ie, the area included in the pause control 1007 ) is triggered. That is, the smart bracelet 200 executes the function of suspending recording information such as running mileage, running time, and running speed.
  • the smart bracelet 200 moved by the first displacement during the first duration, so the first area 1008 of the smart bracelet 200 was triggered by the user. In this way, there is an error in the position triggered by the user. To compensate for this error, the smart bracelet 200 obtains a second displacement opposite to the first displacement and moves the first area 1008 by the second displacement.
  • in this way, the smart bracelet 200 can determine the area reached after moving the first area 1008 by the second displacement as the second area 1009 that the user originally wanted to touch (that is, the area included in the pause control 1007). In this way, the smart bracelet 200 is more reliable in pausing the recording of information such as running mileage, running time, and running speed.
  • S1211 The smart bracelet 200 determines the target area overlapping with the first area as a third area, wherein the third area includes a touch control.
  • the touch elements are located in the target area, and when there are multiple touch elements, the target areas corresponding to any two touch elements do not overlap. In this way, the probability of false triggering can be reduced.
  • a pause control 1007 is included in the running record interface 1010 (ie, the first interface), and the pause control 1007 can be understood as a touch control.
  • the smart bracelet 200 determines the target area 1011 corresponding to the pause control 1007, wherein there is an overlapping relationship between the target area 1011 and the first area 1008, it can be seen that the target area 1011 is the third area 1011.
  • the pause control 1007 is located in the third area 1011 . It can be seen that the area of the third area 1011 is larger than the area of the pause control 1007 .
  • the running record interface 1501 (namely the first interface) includes a start control 1502 , a pause control 1503 and an end control 1504 .
  • the start control 1502 is used to instruct the user to start recording information such as running mileage, running time, and running speed
  • the pause control 1503 is used to instruct the user to suspend recording information such as running mileage, running time, and running speed
  • the end control 1504 is used to instruct the user to end Record running mileage, running time, running speed and other information.
  • the smart bracelet 200 determines the target area 1507 corresponding to the start control 1502, the target area 1508 corresponding to the pause control 1503, and End the target area 1509 corresponding to the control 1504 .
  • the start control 1502 is located in the target area 1507
  • the pause control 1503 is located in the target area 1508, and the end control 1504 is located in the target area 1509; and there is no overlapping relationship between the target area 1507, the target area 1508, and the target area 1509.
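The S1211 determination above reduces to a rectangle-overlap test between the touched first area and each control's enlarged target area. A minimal sketch, with rectangles as `(x, y, w, h)` tuples (an assumed representation); since target areas do not overlap one another, at most one can match:

```python
def rects_overlap(a, b):
    """Axis-aligned rectangle overlap test; rects are (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def third_area(first_area, target_areas):
    """Return (control name, target rect) for the target area that
    overlaps the touched first area, or None if nothing overlaps.

    target_areas maps a control name to its enlarged target rectangle;
    the enlarged rectangles are assumed not to overlap each other.
    """
    for control, rect in target_areas.items():
        if rects_overlap(first_area, rect):
            return control, rect
    return None
```

With three non-overlapping targets (start, pause, end), a touch landing just above the pause control still falls inside the pause target area, so the pause function fires.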
  • S1212 The smart bracelet 200 determines the target area overlapping with the first area as the third area, and executes the corresponding function when the touch control in the third area is triggered.
  • the smart bracelet 200 executes the function corresponding to the touch control in the third area 1011 when it is triggered.
  • the third area 1011 includes the pause control 1007; when the pause control 1007 is triggered, the smart bracelet 200 executes the function of suspending the recording of information such as running mileage, running time, and running speed.
  • when the user initiates a touch action, the user originally wants to touch the pause control 1007. However, the smart bracelet 200 has moved by the first displacement within the first duration, so that the first area 1008 of the smart bracelet 200 is triggered by the user, and there is no overlapping relationship between the first area 1008 and the pause control 1007, resulting in an error in the position triggered by the user. To compensate for this error, the smart bracelet 200 may determine a third area 1011 whose area is larger than that of the pause control 1007. In this way, the third area 1011 may overlap with the first area 1008. Furthermore, when the pause control 1007 located in the third area 1011 is triggered, the smart bracelet 200 suspends the recording of running mileage, running time, running speed and other information, which has high reliability.
  • the smart bracelet 200 determines that the target area 1508 overlapping with the first area 1505 is the third area 1508 .
  • the pause control 1503 in the third area 1508 is triggered, the function of recording running mileage, running time, running speed and other information is suspended, which has high reliability.
  • the principle of the smart bracelet 200 executing the corresponding function when the third area 1508 is triggered is the same as the principle of the first method in S1212 , and will not be repeated here.
  • with the touch operation response method provided by this application, after a touch operation is received in the first area of the first interface of the electronic device, if the motion state of the electronic device indicates that the acceleration of the electronic device in the direction of the screen is greater than the acceleration threshold, this indicates that the electronic device underwent a large offset from the moment the user initiated the touch to the moment the touch reached the first interface. In this way, there is a large offset between the first area that the user actually triggered and the area that the user originally intended to trigger. The electronic device can therefore determine the second area according to the first displacement of the first area within the previous first duration. Understandably, the second area is the area that the user originally wanted to trigger. In this way, the electronic device executes the function corresponding to the second area being triggered more accurately.
  • alternatively, the third area determined by the electronic device includes a touch control. Since the area of the third area is larger than the area where the touch control is located, even if the first area actually triggered by the user is offset from the touch control that the user originally intended to trigger, the electronic device executes the function corresponding to the touch control triggered by the user in the third area, that is, the function corresponding to the touch control in the third area overlapping the first area being triggered, with high reliability.
  • the trigger operation mentioned may include: click operation, long press operation, and gesture trigger operation, etc., which are not limited here.
  • the electronic device mentioned above can also be a smart watch, a tablet computer, a vehicle-mounted terminal, etc., which is not limited here.
  • the embodiment of the present application also provides a touch operation response device, which is applied to electronic equipment.
  • the touch operation response device provided in the embodiment of the present application includes: a display unit configured to display a first interface.
  • the processing unit is configured to obtain the acceleration in the direction of the screen of the electronic device when the first area of the first interface of the electronic device receives a user's touch operation.
  • the processing unit is further configured to determine the second area on the first interface according to the first displacement of the first area within the previous first duration when the acceleration is greater than the acceleration threshold, and to execute the corresponding function when the second area is triggered.
  • the processing unit is further configured to determine the target area overlapping with the first area as the third area when the acceleration is greater than the acceleration threshold, wherein the third area includes a touch control; and execute the third area The corresponding function when triggered.
  • the processing unit is further configured to set the first duration in response to a user's trigger operation.
  • the display unit is also used to display the second interface.
  • the second interface includes a target control and a first control
  • the first control is used to instruct the user to trigger the countdown
  • the target control is a control for instructing the user to trigger after the countdown ends.
  • the electronic device starts counting down in response to the user's trigger operation on the first control. After the countdown is finished, the electronic device records the first moment when the countdown is finished. The electronic device records the second moment in response to the user's trigger operation on the target control. The electronic device calculates the time difference between the first moment and the second moment to obtain the first duration.
  • the first duration is preconfigured by the processing unit.
  • the first duration includes multiple different values, and each value of the first duration corresponds to a type parameter.
  • processing unit is also used to acquire type parameters.
  • the electronic device determines the value of the first duration according to the type parameter.
  • the processing unit is specifically configured to determine, according to the acceleration and the first duration, the first displacement of the first area within the previous first duration; and to determine the second area on the first interface according to the first displacement.
  • the second displacement that occurs when the second region is moved to coincide with the first region has the same distance as the first displacement and the opposite direction.
  • the processing unit is specifically configured to: when the acceleration is greater than an acceleration threshold, the electronic device determines at least one target area, and the target area includes a touch control element. The electronic device determines the target area overlapping with the first area as the third area.
  • any two target areas do not overlap.
  • the electronic device is a mobile phone, a smart watch, a smart bracelet, a tablet computer, or a vehicle-mounted terminal.
  • FIG. 15 is a schematic diagram of a hardware structure of a terminal device provided by an embodiment of the present application.
  • the terminal device includes a processor 1501, a communication line 1504 and at least one communication interface (the communication interface 1503 is taken as an example for illustration).
  • the processor 1501 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling execution of the programs of the present application.
  • Communication lines 1504 may include circuitry that communicates information between the components described above.
  • the communication interface 1503 uses any device such as a transceiver for communicating with other devices or communication networks, such as Ethernet or wireless local area networks (WLAN).
  • the terminal device may also include a memory 1502 .
  • the memory 1502 may be a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto.
  • the memory may exist independently and be connected to the processor through the communication line 1504 . Memory can also be integrated with the processor.
  • the memory 1502 is used to store computer-executed instructions for implementing the solution of the present application, and the execution is controlled by the processor 1501 .
  • the processor 1501 is configured to execute computer-executed instructions stored in the memory 1502, so as to realize the touch operation response method provided in the embodiment of the present application.
  • the computer-executed instructions in the embodiment of the present application may also be referred to as application program code, which is not specifically limited in the embodiment of the present application.
  • the processor 1501 may include one or more CPUs, for example, CPU0 and CPU1 in FIG. 15 .
  • a terminal device may include multiple processors, for example, processor 1501 and processor 1505 in FIG. 15 .
  • processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor.
  • a processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (eg, computer program instructions).
  • FIG. 16 is a schematic structural diagram of a chip provided by an embodiment of the present application.
  • the chip 160 includes one or more than two (including two) processors 1610 and a communication interface 1630.
  • memory 1640 stores the following elements: executable modules or data structures, or subsets thereof, or extensions thereof.
  • the memory 1640 may include a read-only memory and a random access memory, and provides instructions and data to the processor 1610 .
  • a part of the memory 1640 may also include a non-volatile random access memory (non-volatile random access memory, NVRAM).
  • the processor 1610, the communication interface 1630 and the memory 1640 are coupled together through the bus system 1620.
  • the bus system 1620 may include not only a data bus, but also a power bus, a control bus, and a status signal bus.
  • the various buses are labeled as the bus system 1620 in FIG. 16.
  • the methods described in the foregoing embodiments of the present application may be applied to the processor 1610 or implemented by the processor 1610 .
  • the processor 1610 may be an integrated circuit chip with signal processing capability.
  • each step of the above method may be implemented by an integrated logic circuit of hardware in the processor 1610 or instructions in the form of software.
  • the above-mentioned processor 1610 may be a general-purpose processor (for example, a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate, a transistor logic device or a discrete hardware component. The processor 1610 can implement or execute the methods, steps and logical block diagrams disclosed in the embodiments of the present application.
  • the steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor.
  • the software module may be located in a mature storage medium in the field such as random access memory, read-only memory, programmable read-only memory, or electrically erasable programmable read only memory (EEPROM).
  • the storage medium is located in the memory 1640, and the processor 1610 reads the information in the memory 1640, and completes the steps of the above method in combination with its hardware.
  • the instructions stored in the memory for execution by the processor may be implemented in the form of computer program products.
  • the computer program product may be written in the memory in advance, or may be downloaded and installed in the memory in the form of software.
  • a computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the embodiments of the present application will be generated in whole or in part.
  • the computer can be a general purpose computer, special purpose computer, computer network, or other programmable apparatus.
  • computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server or data center to another website, computer, server or data center via wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center integrating one or more available media.
  • available media may include magnetic media (e.g., floppy disks, hard disks, or tapes), optical media (e.g., a digital versatile disc (DVD)), or semiconductor media (e.g., a solid state disk (SSD)), etc.
  • Computer-readable media may include computer storage media and communication media, and may include any medium that can transfer a computer program from one place to another.
  • a storage media may be any target media that can be accessed by a computer.
  • the computer-readable medium may include a compact disc read-only memory (CD-ROM), RAM, ROM, EEPROM or other optical disc storage; the computer-readable medium may also include a magnetic disk memory or other magnetic disk storage devices.
  • any such connection is also properly termed a computer-readable medium.
  • disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Environmental & Geological Engineering (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

本申请提供一种触控操作响应方法和电子设备，涉及终端技术领域。在电子设备的第一界面的第一区域接收到触控操作后，若电子设备的运动状态指示电子设备在屏幕方向上的加速度大于加速度阈值，则说明用户从发起触控，至触控到第一界面时，电子设备发生了较大的偏移。这样一来，用户实际触发到的第一区域与用户原本需要触发的区域存在较大的偏移。如此，电子设备可以根据第一区域在之前的第一时长内发生的第一位移，确定第二区域，从而更准确地执行第二区域被触发时对应的功能。或者，电子设备确定的第三区域内包括触控件。由于第三区域的面积大于触控件所在位置的面积，执行第三区域内的触控件被触发时对应的功能，可靠性高。

Description

触控操作响应方法和电子设备
本申请要求于2022年01月26日提交中国国家知识产权局、申请号为202210092654.6、申请名称为“触控操作响应方法和电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端技术领域,尤其涉及一种触控操作响应方法和电子设备。
背景技术
目前,电子设备已经成为人们工作生活的一部分,为用户的生活带来了方便。在一些电子设备的加速度较大的场景中,例如,用户携带电子设备步行、跑步以及健身等场景中,用户携带的电子设备在晃动。在这种情况下,用户也可能会触发电子设备的显示界面,以执行对应的功能。
然而,在上述的场景中,用户在对电子设备输入触控操作时,会造成无效触控或误触控。
发明内容
本申请实施例提供一种触控操作响应方法和电子设备,可以改善加速度较大的场景下,用户在对电子设备输入触控操作时,会造成无效触控或误触控的问题。
第一方面,本申请提供的一种触控操作响应方法,应用于电子设备。本申请提供的触控操作响应方法包括:电子设备显示第一界面。电子设备的第一界面的第一区域接收到用户的触控操作时,获取电子设备的屏幕方向的加速度。在加速度大于加速度阈值时,电子设备根据第一区域在之前的第一时长内发生的第一位移,在第一界面确定第二区域;电子设备执行第二区域被触发时对应的功能。或者,在加速度大于加速度阈值时,电子设备将与第一区域存在重叠关系的目标区域,确定为第三区域,其中,第三区域内包括有一个触控件;电子设备执行第三区域内的触控件被触发时对应的功能。
本申请提供的触控操作响应方法，在电子设备的第一界面的第一区域接收到触控操作后，若电子设备的运动状态指示电子设备在屏幕方向上的加速度大于加速度阈值，则说明用户从发起触控，至触控到第一界面时，电子设备发生了较大的偏移。这样一来，用户实际触发到的第一区域与用户原本需要触发的区域存在较大的偏移。如此，电子设备可以根据第一区域在之前的第一时长内发生的第一位移，确定第二区域。可以理解地，第二区域即用户原本想要触发的区域。如此，电子设备执行第二区域被触发时对应的功能更准确。
或者,电子设备确定的第三区域内包括触控件。由于第三区域的面积大于触控件所在位置的面积,这样一来,即使用户实际触发到的第一区域与用户原本需要触发的触控件存在偏移。电子设备执行被用户触发的第三区域内的触控件对应的功能,即执行与第一区域存在重叠关系的第三区域内的触控件被触发时对应的功能,可靠性高。
在一种可选的实施方式中,在电子设备显示第一界面之前,本申请提供的方法还包括:电子设备响应于用户的触发操作,设置第一时长。
这样一来,用户可以设置与自己匹配的第一时长。
进一步地,在电子设备显示第一界面之前,本申请提供的方法还包括:电子设备显示第二界面,其中,第二界面包括目标控件和第一控件,第一控件用于指示用户触发倒计时,目标控件为指示用户在倒计时结束后触发的控件。电子设备响应于用户对第一控件的触发操作,开始倒计时。电子设备在倒计时完毕后,记录倒计时完毕的第一时刻。电子设备响应于用户对目标控件的触发操作,记录第二时刻。电子设备计算第一时刻和第二时刻的时差,得到第一时长。
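上述通过倒计时测量第一时长的过程，可以用如下Python示意代码概括（仅为示意性草图，时间戳的获取方式为假设，并非本申请的实际实现）：

```python
import time

def first_moment() -> float:
    """倒计时完毕时调用，返回第一时刻（示意：以单调时钟时间戳表示）。"""
    return time.monotonic()

def first_duration(t1: float, t2: float) -> float:
    """计算第一时刻t1(倒计时完毕)与第二时刻t2(用户触发目标控件)的时差，即第一时长。"""
    if t2 < t1:
        raise ValueError("第二时刻应不早于第一时刻")
    return t2 - t1
```

例如，第一时刻为10.00s、第二时刻为10.55s时，得到的第一时长为0.55s。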
在一种可选的实施方式中,第一时长是电子设备预配置的。
这样,第一时长无需用户配置,节省了用户的操作。
进一步地,第一时长包括多个不同的取值,第一时长的各个取值对应有一个类型参数。
更进一步地,在满足加速度大于加速度阈值之后,本申请提供的方法还包括:电子设备获取类型参数。电子设备根据类型参数,确定第一时长的取值。
可以理解地,根据类型参数,确定第一时长的取值,可以使得确定的第一时长的可靠性更高。
在一种可选的实施方式中,电子设备根据第一区域在之前的第一时长内发生的第一位移,在第一界面确定第二区域,包括:电子设备根据加速度和第一时长,确定第一区域在之前的第一时长内发生的第一位移。电子设备根据第一位移,在第一界面确定第二区域。
进一步地,将第二区域移动到与第一区域重合时发生的第二位移,与第一位移的距离相同且方向相反。
这样一来,确定的第二区域的可靠性更高。
在一种可选的实施方式中，在加速度大于加速度阈值时，电子设备将与第一区域存在重叠关系的目标区域，确定为第三区域，包括：在加速度大于加速度阈值时，电子设备确定至少一个目标区域，且目标区域内包括一个触控件。电子设备将与第一区域存在重叠关系的目标区域，确定为第三区域。
进一步地,目标区域的数量为多个,且任两个目标区域不重叠。
这样一来,可以减少误触发的概率。
在一种可选的实施方式中,电子设备为手机、智能手表、智能手环、平板电脑、或者车载终端。
第二方面,本申请还提供一种触控操作响应装置,应用于电子设备,装置包括:
显示单元,用于显示第一界面;
处理单元,用于电子设备的第一界面的第一区域接收到用户的触控操作时,获取电子设备的屏幕方向的加速度;
处理单元,还用于在加速度大于加速度阈值时,根据第一区域在之前的第一时长内发生的第一位移,在第一界面确定第二区域;并执行第二区域被触发时对应的功能;
或者，处理单元，还用于在加速度大于加速度阈值时，将与第一区域存在重叠关系的目标区域，确定为第三区域，其中，第三区域内包括有一个触控件；并执行第三区域被触发时对应的功能。
第三方面,本申请实施例提供了一种电子设备,包括处理器和存储器,存储器用于存储代码指令;处理器用于运行代码指令,使得电子设备以执行如第一方面或第一方面的任一种实现方式中描述的触控操作响应方法。
第四方面,本申请实施例还提供一种计算机可读存储介质,计算机可读存储介质存储有指令,当指令被执行时,使得计算机执行如第一方面或第一方面的任一种实现方式中描述的触控操作响应方法。
第五方面,本申请实施例还提供一种计算机程序产品,包括计算机程序,当计算机程序被运行时,使得计算机执行如第一方面或第一方面的任一种实现方式中描述的触控操作响应方法。
应当理解的是,本申请的第二方面至第五方面与本申请的第一方面的技术方案相对应,各方面及对应的可行实施方式所取得的有益效果相似,不再赘述。
附图说明
图1为用户A在步行的过程中,举起电子设备100录制前方的风景的场景示意图;
图2为用户A从发起触控行为,至电子设备100接收到触发操作的时间内,电子设备100向上移动的场景示意图;
图3为本申请实施例提供的手机的硬件架构示意图;
图4为本申请实施例提供的智能手环的硬件架构示意图;
图5为本申请实施例提供的触控操作响应方法的流程图之一;
图6为本申请实施例提供的用户设置第一时长的界面示意图之一;
图7为本申请实施例提供的手机100执行第一界面的第二区域对应的功能的界面示意图;
图8为本申请实施例提供的触控操作响应方法的流程图之二;
图9为本申请实施例提供的用户设置第一时长的界面示意图之二;
图10为用户A从发起触控行为,至智能手环200接收到触发操作的时间内,智能手环200向上移动的场景示意图;
图11为本申请实施例提供的智能手环200执行第一界面的第二区域对应的功能的界面示意图之一;
图12为本申请实施例提供的触控操作响应方法的流程图之三;
图13为本申请实施例提供的智能手环200执行第一界面的第二区域对应的功能的界面示意图之二;
图14为本申请实施例提供的智能手环200执行第一界面的第二区域对应的功能的界面示意图之三;
图15为本申请实施例提供的一种电子设备的硬件结构示意图;
图16为本申请实施例提供的一种芯片的结构示意图。
具体实施方式
为了便于清楚描述本申请实施例的技术方案，在本申请的实施例中，采用了“第一”、“第二”等字样对功能和作用基本相同的相同项或相似项进行区分。例如，第一值和第二值仅仅是为了区分不同的值，并不对其先后顺序进行限定。本领域技术人员可以理解“第一”、“第二”等字样并不对数量和执行次序进行限定，并且“第一”、“第二”等字样也并不限定一定不同。
需要说明的是,本申请中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其他实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。
本申请中,“至少一个”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A,B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一项(个)”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b,或c中的至少一项(个),可以表示:a,b,c,a-b,a-c,b-c,或a-b-c,其中a,b,c可以是单个,也可以是多个。
目前,电子设备已经成为人们工作生活的一部分,为用户的生活带来了方便。在一些电子设备的加速度较大的场景中,例如,用户携带电子设备步行、跑步以及健身等场景中,用户携带电子设备时,电子设备在晃动。在这种情况下,用户也可能会触发电子设备的显示界面,以执行对应的功能。
如图1所示,用户A在步行的过程中,举起电子设备100录制前方的风景。如此,用户可以发起对电子设备100显示的系统桌面101中的相机图标102的触控行为。由于用户A在步行的过程中,电子设备100可能随着用户A上下晃动。如此,在从用户A发起触控行为,至电子设备100接收到触发操作的时间内,如图2中的(a)-(b)所示,电子设备100可能向上移动了距离S。这样一来,如图2中的(b)所示,会导致用户A并未触控到相机图标102,而是触控到位于相机图标102下方的区域103。如此,造成一次无效触控。
有鉴于此，本申请提供了一种触控操作响应方法，在电子设备的第一界面的第一区域接收到触控操作后，若电子设备在屏幕方向上的加速度大于加速度阈值，则说明用户从发起触控，至触控到第一界面时，电子设备发生了较大的偏移。这样一来，用户实际触发到的第一区域与用户原本需要触发的区域存在较大的偏移。如此，电子设备执行第二区域被触发时对应的功能。其中，第二区域是根据第一区域在之前的第一时长内发生的位移得到的。可以理解地，第二区域即用户原本想要触发的区域。如此，电子设备执行第二区域被触发时对应的功能更准确。
可以理解的是，上述电子设备也可以称为终端(terminal)、用户设备(user equipment,UE)、移动台(mobile station,MS)、移动终端(mobile terminal,MT)等。电子设备可以是手机(mobile phone)、穿戴式设备、平板电脑(Pad)、虚拟现实(virtual reality,VR)电子设备、增强现实(augmented reality,AR)电子设备。本申请的实施例对电子设备所采用的具体技术和具体设备形态不做限定。
为了能够更好地理解本申请实施例,下面对本申请实施例的电子设备的结构进行介绍。示例性的,下面以电子设备100为手机为例,说明电子设备100的结构示意图。图3为本申请实施例提供的一种手机的结构示意图。
手机可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,传感器模块180,按键190,指示器192,摄像头193,以及显示屏194等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本申请实施例示意的结构并不构成对手机的具体限定。在本申请另一些实施例中,手机可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。处理器110中还可以设置存储器,用于存储指令和数据。
无线通信模块160可以提供应用在手机上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络)，蓝牙(bluetooth,BT)，全球导航卫星系统(global navigation satellite system,GNSS)，调频(frequency modulation,FM)等无线通信的解决方案。
手机通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。在一些实施例中,手机可以包括1个或N个显示屏194,N为大于1的正整数。
手机可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
摄像头193用于捕获静态图像或视频。在一些实施例中,手机可以包括1个或N个摄像头193,N为大于1的正整数。
内部存储器121可以用于存储计算机可执行程序代码,可执行程序代码包括指令。内部存储器121可以包括存储程序区和存储数据区。内部存储器121存储有第一时长,或者存储有不同的类型参数与第一时长的不同取值的映射关系。
手机可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出，也用于将模拟音频输入转换为数字音频信号。扬声器170A，也称“喇叭”，用于将音频电信号转换为声音信号。手机可以通过扬声器170A收听音乐，或收听免提通话。受话器170B，也称“听筒”，用于将音频电信号转换成声音信号。当手机接听电话或语音信息时，可以通过将受话器170B靠近人耳接听语音。麦克风170C，也称“话筒”，“传声器”，用于将声音信号转换为电信号。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。陀螺仪传感器180B可以用于确定手机的运动姿态。气压传感器180C用于测量气压。磁传感器180D包括霍尔传感器。加速度传感器180E可检测手机在各个方向上(一般为三轴)加速度的大小。距离传感器180F,用于测量距离。接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。环境光传感器180L用于感知环境光亮度。指纹传感器180H用于采集指纹。温度传感器180J用于检测温度。触摸传感器180K,也称“触控器件”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。骨传导传感器180M可以获取振动信号。
另外,当电子设备100为穿戴设备时,穿戴设备可以为智能手环。图4为本申请实施例提供的一种智能手环的硬件结构示意图。
智能手环可以包括处理器210,内部存储器221,充电管理模块240,电源管理模块241,天线1,移动通信模块250,无线通信模块260,音频模块270,扬声器270A,受话器270B,传感器模块280,按键290,指示器292,摄像头293,以及显示屏294等。其中传感器模块280可以包括:陀螺仪传感器280B,气压计280C,磁传感器280D,加速度传感器280E,接近光传感器280G,温度传感器280J,触摸传感器280K,以及环境光传感器280L等。
其中,穿戴设备中各个模块的功能,与上述的手机中名称对应的模块的功能的原理相同,在此不作赘述。
术语解释:
屏幕方向上的加速度:电子设备的显示屏的所在平面的延伸方向上的加速度。
下面,以电子设备100为手机100为例,对本申请实施例提供的触控操作响应方法进行说明,该示例并不构成对本申请实施例的限定。下述实施例可以相互结合,对于相同或相似的概念或过程不再赘述。如图5所示,本申请实施例提供的触控操作响应方法包括:
S501:手机100显示第二界面。其中,第二界面包括目标控件和第一控件,第一控件用于指示用户触发倒计时,目标控件为指示用户在倒计时结束后触发的控件。
如图6中的(a)所示,手机100可以响应于用户对系统桌面(图6中未示)中的“设置图标”的触发操作,显示设置列表界面601。设置列表界面601包括第一设置选项602。其中,第一设置选项602用于指示设置用户的触控反应时长,即用户从发起触控行为,至手机100接收到触发操作的时长。如图6中的(b)所示,手机100可以响应于用户对第一设置选项602的触发操作,显示第二界面603。其中,第二界面603包括目标控件604和第一控件605。
S502:手机100响应于用户对第一控件的触发操作,开始倒计时。
进而,如图6中的(c)-(d)所示,手机100可以响应于用户对第一控件605的触发操作,手机100开始倒计时。其中,倒计时的时长可以为但不限于3s(如还可以为4s、或者5s等)。第二界面603可以显示倒计时的剩余的时长(单位,秒), 例如,“3”、“2”、“1”以及“0”,且显示“3”、“2”、“1”以及“0”的间隔时长为1s。可以理解地,如图6中的(d)所示,在倒计时结束时,第二界面603显示数值“0”,其中,数值“0”用于提示用户倒计时结束。
S503:手机100在倒计时完毕后,记录倒计时完毕的第一时刻。
可以理解地,在倒计时期间,用户可以做好向目标控件604发起触控操作的准备。在倒计时结束时,用户可以开始发起触控行为。如此,手机100记录倒计时完毕的第一时刻,即记录用户可以开始发起触控行为的时刻。
S504:手机100响应于用户对目标控件的触发操作,记录第二时刻。
如图6中的(e)所示,手机100可以响应于用户对目标控件604的触发操作,记录第二时刻。可以理解地,第二时刻即手机100接收到用户对目标控件604的触发操作的时刻。
S505:手机100计算第一时刻和第二时刻的时差,得到第一时长。
其中,第一时长可以理解为用户从发起触控行为,至手机100接收到触发操作的时长。例如,第一时长可以为0.55s、0.45s、以及0.35s等,在此不作限定。
需要说明的是，手机100可以将S502-S505循环执行多次(如3次或3次以上)。这样，手机100可以得到多个时长。进而，手机100可以计算得到的多个时长的平均值。如此，手机100存储计算得到的平均值。例如，手机100得到的多个第一时长分别为0.45s、0.5s、以及0.55s，则手机100计算0.45s、0.5s、以及0.55s的平均值为0.5s。如此，手机100存储的第一时长为0.5s。
其中,手机100存储的平均值,即最终存储的第一时长。可以理解地,当第一时长为得到的多个第一时长的平均值时,可靠性更高。
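对多次测量结果取平均的做法可以写成如下示意函数（示意性草图）：

```python
def averaged_first_duration(samples: list) -> float:
    """对多次测得的第一时长取平均值，作为最终存储的第一时长。"""
    if not samples:
        raise ValueError("至少需要一次测量结果")
    return sum(samples) / len(samples)
```

按正文示例，0.45s、0.5s与0.55s的平均值为0.5s。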
S506:手机100存储第一时长。
可见,上述的S501-S506的方案为:手机100响应于用户的触发操作,设置第一时长。这样一来,用户可以设置与自己匹配的第一时长。
在另一些实施方式中,手机100可以在出厂时预配置有第一时长。这样,第一时长无需用户配置,节省了用户的操作。如此,上述的S501-S506可以省略。
其中,手机100在出厂时预存储的第一时长可以是一个取值,也可以包括多个取值。其中,当预存储的第一时长包括多个取值时,第一时长的各取值对应有类型参数。可以理解地,在类型参数不同时,第一时长的取值也不同。其中,类型参数可以为年龄或者运动状态等,在此不作限定。当类型参数为年龄时,第一时长的各取值与类型参数的映射关系可以如下表1所示。
表1
年龄 第一时长
18-30岁 0.55s
30-45岁 0.6s
45-60岁 0.65s
60岁以上 0.7s
从表1中可以看出,不同的年龄段与第一时长的不同取值存在映射关系,年龄越 大,第一时长的取值也越大。另外,当类型参数为运动状态时,第一时长的各取值与类型参数的映射关系可以如下表2所示。
表2
运动状态 第一时长
静止状态 0.55s
步行状态 0.6s
跑步状态 0.65s
驾车状态 0.7s
从表2中可以看出,不同的运动状态与第一时长的不同取值存在映射关系。
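表1与表2所示的类型参数与第一时长取值的映射关系，可以用如下示意代码表达（表1中区间端点的归属在正文中存在重叠，此处按左闭右开处理，属于假设）：

```python
# 运动状态到第一时长的映射（取值来自正文表2）
MOTION_TO_DURATION = {
    "静止状态": 0.55,
    "步行状态": 0.6,
    "跑步状态": 0.65,
    "驾车状态": 0.7,
}

def duration_by_age(age: int) -> float:
    """根据年龄确定第一时长的取值（取值来自正文表1）。"""
    if age < 30:
        return 0.55
    if age < 45:
        return 0.6
    if age < 60:
        return 0.65
    return 0.7
```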
可以理解地，上述的S501-S506的方案为，如何得到并存储第一时长。下面，结合S507-S512说明，手机100如何响应用户的触发操作。
S507:手机100显示第一界面。
仍如图1所示,用户A在步行的过程中,举起手机100录制前方的风景。如此,用户A可以解锁手机100的屏幕,显示系统桌面101,系统桌面101包括相机图标102。可以理解地,在这种情况下,手机100的系统桌面101即第一界面。
S508:手机100确定第一界面的第一区域接收到用户的触控操作。
可以理解地,用户A可以发起对手机100显示的系统桌面101(即第一界面)中的相机图标102的触控行为,以便进行视频录制。当手机100触发到系统桌面101的第一区域103时,手机100确定系统桌面101的第一区域103接收到用户的触控操作。
S509:手机100获取手机100的运动状态。其中,运动状态用于表征手机100的加速度。
其中,手机100的运动状态包括步行状态、跑步状态等,在此不作限定。可以理解地,手机100在不同的运动状态下,表征的加速度不同。
手机100获取运动状态的方式为：手机100连续采集不同时刻的位姿参数和加速度。其中，采集的位姿参数和加速度可以是滤波后的数据，这样可以增强采集的数据的可靠性。进而，手机100将采集到的位姿参数和加速度输入到预训练的第一模型中，以使第一模型基于位姿参数和加速度输出运动状态。其中，第一模型可以是以海量的位姿参数和加速度作为训练样本，以运动状态作为训练目标训练得到的。
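第一模型的训练与推理细节正文未展开。下面给出一个不依赖模型、仅根据加速度采样的波动幅度粗略区分运动状态的示意实现（阈值均为假设值，仅用于说明“不同运动状态对应不同的加速度特征”这一思路，并非第一模型的实际实现）：

```python
def classify_motion_state(accel_samples: list) -> str:
    """根据加速度采样的方差粗略判断运动状态（示意实现，阈值为假设值）。"""
    n = len(accel_samples)
    mean = sum(accel_samples) / n
    variance = sum((a - mean) ** 2 for a in accel_samples) / n
    if variance < 0.01:   # 假设阈值：波动极小视为静止
        return "静止状态"
    if variance < 1.0:    # 假设阈值：中等波动视为步行
        return "步行状态"
    return "跑步状态"
```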
另外,上述的S509可以替换为:手机100获取手机100在屏幕方向上的加速度。
S510:手机100判断在屏幕方向上的加速度是否大于加速度阈值,如果是,则执行S511。
其中,手机100在屏幕方向上的加速度可以理解为:手机100的显示屏194的所在平面的延伸方向上的加速度。
S511:手机100根据第一区域在之前的第一时长内发生的第一位移,在第一界面确定第二区域。
在一些实施方式中,将第二区域104移动到与第一区域103重合时发生的第二位移,与第一位移的距离相同且方向相反。
由于用户A在步行的过程中，手机100可能随着用户A上下晃动，这样一来，手机100的加速度可能大于加速度阈值(如0.2m/s²)。如此，仍如图2中的(a)-(b)所示，从用户A发起触控行为，至手机100接收到触发操作的时间内，手机100可能向上移动了距离S。这样一来，如图7中的(a)所示，用户A并未触控到相机图标102，而是触控到位于相机图标102下方的第一区域103。这样一来，如图7中的(b)所示，手机100需要确定第二区域104(即相机图标102所在的区域)。
下面介绍,手机100具体如何在第一界面上确定第二区域。
示例性地，手机100可以从内部存储器121获取第一时长。进而，手机100确定之前的第一时长内，手机100发生的第一位移。其中，当内部存储器121存储的第一时长仅包括一个取值时，可以直接从内部存储器121获取第一时长。
当内部存储器121存储的第一时长包括多个取值时,手机100可以先获取类型参数。例如,类型参数为年龄时,手机100可以从操作系统、或者某一应用程序获取年龄。其中,操作系统、或者某一应用程序中的年龄可以为用户在注册账号时,输入的个人信息。由于在内部存储器121中,不同的年龄与第一时长的不同取值存在映射关系。如此,手机100可以根据获取的年龄从内部存储器121中,确定第一时长。
再例如,当类型参数为运动状态时,由于在内部存储器121中,不同的运动状态与第一时长的不同取值存在映射关系。如此,手机100可以基于上述的S509得到的运动状态,从内部存储器121中,确定第一时长。
可以理解地,根据类型参数,确定第一时长的取值,可以使得确定的第一时长的可靠性更高。
这样一来，手机100可以根据算式

S_k = Σ_{i=1}^{k} [(a(i-1)+a(i))/2]·Δt²

计算第一位移。其中，S_k为第一位移，Δt为第一时长中被细分的间隔时长，a(i-1)为手机100在第一时长中的第i-1个间隔时长的加速度，a(i)为手机100在第一时长中的第i个间隔时长的加速度，k为正整数。或者，手机100还可以根据算式

S_k = v_0·t + (1/2)·a·t²

计算第一位移。其中，S_k为第一位移，t为第一时长，a为加速度，v_0为手机100在第一时长之前的移动速度。
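上述两个算式在原文中以图片形式给出，无法完全复原。下面按标准的梯形法两次数值积分与匀加速公式各给出一个示意实现（属于合理假设，并非专利算式的逐字实现）：

```python
def displacement_trapezoid(accels: list, dt: float, v0: float = 0.0) -> float:
    """按梯形法对加速度采样两次积分，估算第一时长内的第一位移S_k。
    accels为第一时长内各间隔端点的加速度采样a(0)..a(k)，dt为间隔时长Δt，v0为初速度。"""
    s, v = 0.0, v0
    for i in range(1, len(accels)):
        a_avg = 0.5 * (accels[i - 1] + accels[i])  # 第i个间隔的平均加速度
        s += v * dt + 0.5 * a_avg * dt * dt        # 本间隔内产生的位移
        v += a_avg * dt                            # 更新间隔末端速度
    return s

def displacement_uniform(a: float, t: float, v0: float = 0.0) -> float:
    """匀加速近似：S_k = v0*t + a*t^2/2。"""
    return v0 * t + 0.5 * a * t * t
```

加速度恒定时，两种算法的结果一致。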
进而，如图7中的(b)所示，手机100可以确定第二区域104(即相机图标102内的区域)。其中，将第二区域104移动到与第一区域103重合时发生的第二位移，与第一位移的距离相同且方向相反。即是说，用户在发起触控行为时，原本想要触控第二区域104。然而，手机100在第一时长内移动了第一位移，这样一来，手机100的第一区域103被用户触发到。如此，导致用户触发的位置存在误差。为了补偿用户触发的位置存在的误差，手机100得到与第一位移反方向的第二位移，并将第一区域103移动第二位移。进而，手机100可以将第一区域103移动第二位移后的区域，确定为用户原本想要触控的第二区域104。
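补偿位移、确定第二区域的步骤可以写成如下示意函数（屏幕坐标系与区域的(x, y, w, h)表示方式均为假设）：

```python
def second_region(first_region: tuple, displacement: tuple) -> tuple:
    """将第一区域平移一个与第一位移距离相同、方向相反的第二位移，得到第二区域。
    first_region为(x, y, w, h)，displacement为屏幕方向上的第一位移(dx, dy)。"""
    x, y, w, h = first_region
    dx, dy = displacement
    return (x - dx, y - dy, w, h)
```

返回的区域即按相反方向补偿第一位移后、用户原本想要触发的区域，区域尺寸保持不变。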
S512:手机100执行第二区域被触发时对应的功能。
一些实施方式中,基于上述的S511中,第二区域位于相机图标102的所在位置。进而,如图7中的(c)所示,手机100可以响应于相机图标102被触发后,显示拍摄预览界面106的功能。可以理解地,在这种情况下,第二区域104被触发时对应的功能,即显示拍摄预览界面106的功能。
可以理解地，基于上述的S511的描述，相机图标102的位置为用户原本想要触发的区域。如此，手机100响应于相机图标102被触发，显示拍摄预览界面106的可靠性更高。
在上述实施例中,是以电子设备100为手机100为例,对本申请实施例提供的触控操作响应方法进行说明的。下面,以电子设备100为智能手环200为例,对本申请实施例提供的触控操作响应方法进行说明,该示例也不构成对本申请实施例的限定。需要说明的是,当电子设备采用智能手环200时所提供的触控操作响应方法,其基本原理及产生的技术效果和上述实施例相同,为简要描述,本实施例部分未提及之处,可参考上述的实施例中相应内容。
如图8所示,当电子设备100为智能手环200时,本申请实施例提供的触控操作响应方法包括:
S801:智能手环200显示第二界面。其中,第二界面包括目标控件和第一控件。其中,第一控件用于触发倒计时。
如图9中的(a)所示,智能手环200可以显示系统桌面1001。系统桌面1001包括“设置”图标1002。如图9中的(b)所示,智能手环200可以响应于用户对系统桌面1001中的“设置”图标1002的触发操作,显示设置列表界面1003。设置列表界面1003包括第一设置选项1004。其中,第一设置选项1004用于指示设置触控反应时间(即用户从发起触控行为,至智能手环200接收到触发操作的时长)。如图9中的(b)-(c)所示,智能手环200可以响应于用户对第一设置选项1004的触发操作,显示第二界面1005。第二界面1005包括目标控件1006和第一控件1007。可以理解地,第一控件1007用于触发倒计时。
S802:智能手环200响应于用户对第一控件的触发操作,开始倒计时。
进而，如图9中的(c)-(d)所示，智能手环200可以响应于用户对第一控件1007的触发操作，智能手环200开始倒计时。其中，倒计时的时长可以为但不限于3s。第二界面1005可以显示倒计时的剩余的时长(单位，秒)。例如，“3”、“2”、“1”、“0”。可以理解地，显示“3”、“2”、“1”、以及“0”的间隔时长为1s。可以理解地，如图9中的(e)所示，在倒计时结束时，第二界面1005显示数值“0”，其中，数值“0”用于提示用户倒计时结束。
S803:智能手环200在倒计时完毕后,记录倒计时完毕的第一时刻。
其中,S803的原理与上述的S503的原理相同,在此不再赘述。
S804:智能手环200响应于用户对目标控件的触发操作,记录第二时刻。
如图9中的(f)所示,智能手环200可以响应于用户对目标控件1006的触发操作,记录第二时刻。可以理解地,第二时刻即智能手环200接收到用户对目标控件1006的触发操作的时刻。
S805:智能手环200计算第一时刻和第二时刻的时差,得到第一时长。
其中,S805的原理与上述的S505的原理相同,在此不做赘述。
S806:智能手环200存储第一时长。
其中,S806的原理与上述的S506的原理相同,在此不赘述。
可以理解地，上述的S801-S806的方案为，如何得到并存储第一时长。下面，结合S807-S812说明，智能手环200如何响应用户的触发操作。
S807:智能手环200显示第一界面。
在一些应用场景中,用户A在跑步机跑步的过程中,佩戴上了智能手环200。如图10中的(a)所示,智能手环200可以响应于用户对运动应用的触发操作,显示跑步记录界面1010(即第一界面)。跑步记录界面1010包括有跑步里程、跑步时间、跑步速度等信息。另外,跑步记录界面1010还包括暂停控件1007。
S808:智能手环200确定第一界面的第一区域接收到用户的触控操作。
可以理解地，若用户想要停止记录跑步里程、跑步时间、跑步速度等信息，用户可以在跑步的过程中，向暂停控件1007发起触控行为。如图10中的(b)所示，当用户触发到跑步记录界面1010的第一区域1008时，智能手环200确定跑步记录界面1010的第一区域1008接收到用户的触控操作。
S809:智能手环200获取智能手环200的运动状态。其中,运动状态用于表征智能手环200的加速度。
其中，S809可以替换为：智能手环200获取智能手环200在屏幕方向上的加速度。
其中,S809与上述的S509的原理相同,在此不赘述。
S810:智能手环200判断在屏幕方向上的加速度是否大于加速度阈值,如果是,则执行S811。
S811:智能手环200根据第一区域在之前的第一时长内发生的第一位移,在第一界面确定第二区域。示例性地,将第二区域移动到与第一区域重合时发生的第二位移,与第一位移的距离相同且方向相反。
由于用户A在跑步的过程中，智能手环200可能随着用户A上下晃动，这样一来，智能手环200的加速度可能大于加速度阈值(如0.2m/s²)。如此，仍如图10中的(a)-(b)所示，从用户A发起触控行为，至智能手环200接收到触发操作的时间内，智能手环200可能向下移动了距离S。这样一来，如图10中的(b)所示，用户A并未触控到暂停控件1007，而是触控到位于暂停控件1007上方的第一区域1008。这样一来，如图11中的(a)所示，智能手环200需要确定第二区域1009，第二区域1009即用户原本想要触发的区域。
S812:智能手环200执行第二区域被触发时对应的功能。
如图11中的(a)-(b)所示,智能手环200执行第二区域1009(即暂停控件1007包括的区域)被触发时的功能。即,智能手环200执行暂停记录跑步里程、跑步时间、跑步速度等信息的功能。
可以理解地，用户在发起触控行为时，原本想要触控第二区域1009(即暂停控件1007包括的区域)。然而，智能手环200在第一时长内移动了第一位移，这样一来，智能手环200的第一区域1008被用户触发到。如此，导致用户触发的位置存在误差。为了补偿用户触发的位置存在的误差，智能手环200得到与第一位移反方向的第二位移，并将第一区域1008的位置移动第二位移。进而，智能手环200可以将第一区域1008的位置移动第二位移后的区域，确定为用户原本想要触控的第二区域1009(即暂停控件1007包括的区域)。如此，智能手环200暂停记录跑步里程、跑步时间、跑步速度等信息的可靠性更高。
在另一些实施方式中,如图12所示,上述的S811-S812还可以替换为:
S1211：智能手环200将与第一区域存在重叠关系的目标区域，确定为第三区域，其中，第三区域内包括有一个触控件。其中，触控件位于目标区域内，且在触控件的数量为多个时，任意两个触控件对应的目标区域不重叠。这样，可以减少误触发的概率。
下面，结合两种方式介绍S1211的实现原理：
第一种方式:仍如图10中的(a)-(b)所示,在跑步记录界面1010(即第一界面)包括一个暂停控件1007,暂停控件1007可以理解为一个触控件。如图13中的(a)所示,智能手环200确定暂停控件1007对应的目标区域1011,其中,目标区域1011与第一区域1008之间存在重叠关系,可见,目标区域1011即第三区域1011。在图13中的(a)中,暂停控件1007位于第三区域1011内。可见,第三区域1011的面积大于暂停控件1007的面积。
第二种方式:如图14中的(a)-(b)所示,在跑步记录界面1501(即第一界面)包括开始控件1502、暂停控件1503以及结束控件1504。其中,开始控件1502用于指示用户开始记录跑步里程、跑步时间、跑步速度等信息;暂停控件1503用于指示用户暂停记录跑步里程、跑步时间、跑步速度等信息;结束控件1504用于指示用户结束记录跑步里程、跑步时间、跑步速度等信息。
如图14中的(c)所示,在跑步记录界面1501的第一区域1505接收到触发操作时,智能手环200确定开始控件1502对应的目标区域1507、暂停控件1503对应的目标区域1508以及结束控件1504对应的目标区域1509。其中,开始控件1502位于目标区域1507内,暂停控件1503位于目标区域1508内,以及结束控件1504位于目标区域1509内;且目标区域1507、目标区域1508、以及目标区域1509之间没有重叠关系。
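判断第一区域与哪个目标区域存在重叠关系，本质上是矩形相交测试。下面是一个示意实现（区域以(x, y, w, h)表示，属于假设）：

```python
def rects_overlap(r1: tuple, r2: tuple) -> bool:
    """判断两个矩形区域是否存在重叠关系。"""
    x1, y1, w1, h1 = r1
    x2, y2, w2, h2 = r2
    return x1 < x2 + w2 and x2 < x1 + w1 and y1 < y2 + h2 and y2 < y1 + h1

def third_region(first_region: tuple, target_regions: list):
    """在互不重叠的目标区域中，确定与第一区域存在重叠关系的目标区域，作为第三区域。
    若不存在重叠的目标区域，返回None。"""
    for region in target_regions:
        if rects_overlap(first_region, region):
            return region
    return None
```

由于各目标区域互不重叠，与第一区域重叠的目标区域至多不会产生歧义，可减少误触发的概率。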
S1212：智能手环200执行第三区域内的触控件被触发时对应的功能。
基于上述的S1211中的第一种方式，第一区域1008与暂停控件1007不存在重叠关系，但第一区域1008与第三区域1011存在重叠关系。如此，智能手环200执行第三区域1011内的触控件被触发时对应的功能。如图13中的(a)-(b)所示，由于第三区域1011包括暂停控件1007，智能手环200执行暂停控件1007被触发时，暂停记录跑步里程、跑步时间、跑步速度等信息的功能。
可以理解地，用户在发起触控行为时，原本想要触控暂停控件1007。然而，智能手环200在第一时长内移动了第一位移，这样一来，智能手环200的第一区域1008被用户触发到，且第一区域1008与暂停控件1007不存在重叠关系，导致用户触发的位置存在误差。为了补偿用户触发的位置存在的误差，智能手环200可以确定面积大于暂停控件1007的第三区域1011。这样一来，可以使得第三区域1011与第一区域1008存在重叠关系。进而，智能手环200执行位于第三区域1011内的暂停控件1007被触发时，暂停记录跑步里程、跑步时间、跑步速度等信息的功能，可靠性高。
另外，基于上述的S1211中的第二种方式，如图14中的(d)所示，智能手环200确定与第一区域1505存在重叠关系的目标区域1508为第三区域1508。第三区域1508中的暂停控件1503被触发时，智能手环200执行暂停记录跑步里程、跑步时间、跑步速度等信息的功能，可靠性高。其中，智能手环200执行第三区域1508被触发时对应的功能的原理，与S1212中的第一种方式的原理相同，在此不再赘述。
综上所述，本申请提供的触控操作响应方法，在电子设备的第一界面的第一区域接收到触控操作后，若电子设备的运动状态指示电子设备在屏幕方向上的加速度大于加速度阈值，则说明用户从发起触控，至触控到第一界面时，电子设备发生了较大的偏移。这样一来，用户实际触发到的第一区域与用户原本需要触发的区域存在较大的偏移。如此，电子设备可以根据第一区域在之前的第一时长内发生的第一位移，确定第二区域。可以理解地，第二区域即用户原本想要触发的区域。如此，电子设备执行第二区域被触发时对应的功能更准确。
或者,电子设备确定的第三区域内包括触控件。由于第三区域的面积大于触控件所在位置的面积,这样一来,即使用户实际触发到的第一区域与用户原本需要触发的触控件存在偏移。电子设备执行被用户触发的第三区域内的触控件对应的功能,即执行与第一区域存在重叠关系的第三区域内的触控件被触发时对应的功能,可靠性高。
另外,上述介绍的本申请实施例提供的触控操作响应方法中,提到的触发操作可以包括:点击操作、长按操作、以及手势触发操作等,在此不做限定。
另外,上述介绍的本申请实施例提供的触控操作响应方法中,提到的电子设备除了是手机或者智能手环外,还可以是:智能手表、平板电脑/车载终端等,在此不做限定。
另外,本申请实施例还提供一种触控操作响应装置,应用于电子设备。本申请实施例提供的触控操作响应装置包括:显示单元,用于显示第一界面。处理单元,用于电子设备的第一界面的第一区域接收到用户的触控操作时,获取电子设备的屏幕方向的加速度。处理单元,还用于在加速度大于加速度阈值时,根据第一区域在之前的第一时长内发生的第一位移,在第一界面确定第二区域;并执行第二区域被触发时对应的功能。或者,处理单元,还用于在加速度大于加速度阈值时,将与第一区域存在重叠关系的目标区域,确定为第三区域,其中,第三区域内包括有一个触控件;并执行第三区域被触发时对应的功能。
在一种可选的实施方式中,处理单元,还用于响应于用户的触发操作,设置第一时长。
进一步地,显示单元,还用于显示第二界面。其中,第二界面包括目标控件和第一控件,第一控件用于指示用户触发倒计时,目标控件为指示用户在倒计时结束后触发的控件。电子设备响应于用户对第一控件的触发操作,开始倒计时。电子设备在倒计时完毕后,记录倒计时完毕的第一时刻。电子设备响应于用户对目标控件的触发操作,记录第二时刻。电子设备计算第一时刻和第二时刻的时差,得到第一时长。
在一种可选的实施方式中,第一时长是处理单元预配置的。
进一步地,第一时长包括多个不同的取值,第一时长的各个取值对应有一个类型参数。
更进一步地,处理单元,还用于获取类型参数。电子设备根据类型参数,确定第一时长的取值。
在一种可选的实施方式中,处理单元,具体用于根据加速度和第一时长,确定第一区域在之前的第一时长内发生的第一位移;根据第一位移,在第一界面确定第二区域。
进一步地，将第二区域移动到与第一区域重合时发生的第二位移，与第一位移的距离相同且方向相反。
在一种可选的实施方式中,处理单元,具体用于在加速度大于加速度阈值时,电子设备确定至少一个目标区域,且目标区域内包括一个触控件。电子设备将与第一区域存在重叠关系的目标区域,确定为第三区域。
进一步地,目标区域的数量为多个,且任两个目标区域不重叠。
在一种可选的实施方式中,电子设备为手机、智能手表、智能手环、平板电脑、或者车载终端。
示例性的,图15为本申请实施例提供的一种终端设备的硬件结构示意图,如图15所示,该终端设备包括处理器1501,通信线路1504以及至少一个通信接口(图15中示例性的以通信接口1503为例进行说明)。
处理器1501可以是一个通用中央处理器(central processing unit,CPU),微处理器,特定应用集成电路(application-specific integrated circuit,ASIC),或一个或多个用于控制本申请方案程序执行的集成电路。
通信线路1504可包括在上述组件之间传送信息的电路。
通信接口1503,使用任何收发器一类的装置,用于与其他设备或通信网络通信,如以太网,无线局域网(wireless local area networks,WLAN)等。
可能的,该终端设备还可以包括存储器1502。
存储器1502可以是只读存储器(read-only memory,ROM)或可存储静态信息和指令的其他类型的静态存储设备,随机存取存储器(random access memory,RAM)或者可存储信息和指令的其他类型的动态存储设备,也可以是电可擦可编程只读存储器(electrically erasable programmable read-only memory,EEPROM)、只读光盘(compact disc read-only memory,CD-ROM)或其他光盘存储、光碟存储(包括压缩光碟、激光碟、光碟、数字通用光碟、蓝光光碟等)、磁盘存储介质或者其他磁存储设备、或者能够用于携带或存储具有指令或数据结构形式的期望的程序代码并能够由计算机存取的任何其他介质,但不限于此。存储器可以是独立存在,通过通信线路1504与处理器相连接。存储器也可以和处理器集成在一起。
其中,存储器1502用于存储执行本申请方案的计算机执行指令,并由处理器1501来控制执行。处理器1501用于执行存储器1502中存储的计算机执行指令,从而实现本申请实施例所提供的触控操作响应方法。
可能的,本申请实施例中的计算机执行指令也可以称之为应用程序代码,本申请实施例对此不作具体限定。
在具体实现中,作为一种实施例,处理器1501可以包括一个或多个CPU,例如图15中的CPU0和CPU1。
在具体实现中,作为一种实施例,终端设备可以包括多个处理器,例如图15中的处理器1501和处理器1505。这些处理器中的每一个可以是一个单核(single-CPU)处理器,也可以是一个多核(multi-CPU)处理器。这里的处理器可以指一个或多个设备、电路、和/或用于处理数据(例如计算机程序指令)的处理核。
示例性的,图16为本申请实施例提供的一种芯片的结构示意图。芯片160包括 一个或两个以上(包括两个)处理器1610和通信接口1630。
在一些实施方式中,存储器1640存储了如下的元素:可执行模块或者数据结构,或者他们的子集,或者他们的扩展集。
本申请实施例中,存储器1640可以包括只读存储器和随机存取存储器,并向处理器1610提供指令和数据。存储器1640的一部分还可以包括非易失性随机存取存储器(non-volatile random access memory,NVRAM)。
本申请实施例中，处理器1610、通信接口1630以及存储器1640通过总线系统1620耦合在一起。其中，总线系统1620除包括数据总线之外，还可以包括电源总线、控制总线和状态信号总线等。为了便于描述，在图16中将各种总线都标为总线系统1620。
上述本申请实施例描述的方法可以应用于处理器1610中,或者由处理器1610实现。处理器1610可能是一种集成电路芯片,具有信号的处理能力。在实现过程中,上述方法的各步骤可以通过处理器1610中的硬件的集成逻辑电路或者软件形式的指令完成。上述的处理器1610可以是通用处理器(例如,微处理器或常规处理器)、数字信号处理器(digital signal processing,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现成可编程门阵列(field-programmable gate array,FPGA)或者其他可编程逻辑器件、分立门、晶体管逻辑器件或分立硬件组件,处理器1610可以实现或者执行本申请实施例中的公开的各方法、步骤及逻辑框图。
结合本申请实施例所公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。其中,软件模块可以位于随机存储器、只读存储器、可编程只读存储器或带电可擦写可编程存储器(electrically erasable programmable read only memory,EEPROM)等本领域成熟的存储介质中。该存储介质位于存储器1640,处理器1610读取存储器1640中的信息,结合其硬件完成上述方法的步骤。
在上述实施例中,存储器存储的供处理器执行的指令可以以计算机程序产品的形式实现。其中,计算机程序产品可以是事先写入在存储器中,也可以是以软件形式下载并安装在存储器中。
计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行计算机程序指令时，全部或部分地产生按照本申请实施例的流程或功能。计算机可以是通用计算机、专用计算机、计算机网络或者其他可编程装置。计算机指令可以存储在计算机可读存储介质中，或者从一个计算机可读存储介质向另一计算机可读存储介质传输，例如，计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(digital subscriber line,DSL))或无线(例如红外、无线电、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。计算机可读存储介质可以是计算机能够存储的任何可用介质或者是包括一个或多个可用介质集成的服务器、数据中心等数据存储设备。例如，可用介质可以包括磁性介质(例如，软盘、硬盘或磁带)、光介质(例如，数字通用光盘(digital versatile disc,DVD))、或者半导体介质(例如，固态硬盘(solid state disk,SSD))等。
本申请实施例还提供了一种计算机可读存储介质。上述实施例中描述的方法可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。计算机可读介质可以包括计算机存储介质和通信介质,还可以包括任何可以将计算机程序从一个地方传送到另一个地方的介质。存储介质可以是可由计算机访问的任何目标介质。
作为一种可能的设计，计算机可读介质可以包括紧凑型光盘只读储存器(compact disc read-only memory,CD-ROM)、RAM、ROM、EEPROM或其它光盘存储器；计算机可读介质可以包括磁盘存储器或其它磁盘存储设备。而且，任何连接线也可以被适当地称为计算机可读介质。例如，如果使用同轴电缆，光纤电缆，双绞线，DSL或无线技术(如红外，无线电和微波)从网站，服务器或其它远程源传输软件，则同轴电缆，光纤电缆，双绞线，DSL或诸如红外，无线电和微波之类的无线技术包括在介质的定义中。如本文所使用的磁盘(disk)和光盘(disc)包括压缩光盘(CD)、激光盘、光碟、数字通用光盘(digital versatile disc,DVD)、软盘和蓝光盘，其中磁盘通常以磁性方式再现数据，而光盘利用激光光学地再现数据。
上述的组合也应包括在计算机可读介质的范围内。以上,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以权利要求的保护范围为准。

Claims (13)

  1. 一种触控操作响应方法,其特征在于,应用于电子设备,所述方法包括:
    所述电子设备显示第一界面;
    所述电子设备的所述第一界面的第一区域接收到用户的触控操作时,获取所述电子设备的屏幕方向的加速度;
    在所述加速度大于加速度阈值时,所述电子设备根据所述第一区域在之前的第一时长内发生的第一位移,在所述第一界面确定第二区域;所述电子设备执行所述第二区域被触发时对应的功能;
    或者,在所述加速度大于加速度阈值时,所述电子设备将与所述第一区域存在重叠关系的目标区域,确定为第三区域,所述第三区域内包括有一个触控件;所述电子设备执行所述第三区域内的触控件被触发时对应的功能。
  2. 根据权利要求1所述的方法,其特征在于,在所述电子设备显示第一界面之前,所述方法还包括:
    所述电子设备响应于用户的触发操作,设置所述第一时长。
  3. 根据权利要求2所述的方法,其特征在于,在所述电子设备显示第一界面之前,所述方法还包括:
    所述电子设备显示第二界面,其中,所述第二界面包括目标控件和第一控件,所述第一控件用于指示用户触发倒计时,所述目标控件为指示用户在倒计时结束后触发的控件;
    所述电子设备响应于用户对所述第一控件的触发操作,开始倒计时;
    所述电子设备在倒计时完毕后,记录倒计时完毕的第一时刻;
    所述电子设备响应于用户对所述目标控件的触发操作,记录第二时刻;
    所述电子设备计算所述第一时刻和所述第二时刻的时差,得到所述第一时长。
  4. 根据权利要求1所述的方法,其特征在于,所述第一时长是所述电子设备预配置的。
  5. 根据权利要求4所述的方法,其特征在于,所述第一时长包括多个不同的取值,所述第一时长的各个取值对应有一个类型参数。
  6. 根据权利要求5所述的方法,其特征在于,在满足所述加速度大于加速度阈值之后,所述方法还包括:
    所述电子设备获取所述类型参数;
    所述电子设备根据所述类型参数,确定所述第一时长的取值。
  7. 根据权利要求1所述的方法,其特征在于,所述电子设备根据所述第一区域在之前的第一时长内发生的第一位移,在所述第一界面确定第二区域,包括:
    所述电子设备根据所述加速度和所述第一时长,确定所述第一区域在之前的所述第一时长内发生的第一位移;
    所述电子设备根据所述第一位移,在所述第一界面确定所述第二区域。
  8. 根据权利要求7所述的方法,其特征在于,将所述第二区域移动到与所述第一区域重合时发生的第二位移,与所述第一位移的距离相同且方向相反。
  9. 根据权利要求1所述的方法,其特征在于,在所述加速度大于加速度阈值时,所述电子设备将所述第一区域存在重叠关系的目标区域,确定为第三区域,包括:
    在所述加速度大于加速度阈值时,所述电子设备确定至少一个目标区域,且所述目标 区域内包括一个触控件;
    所述电子设备将与所述第一区域存在重叠关系的目标区域,确定为所述第三区域。
  10. 根据权利要求9所述的方法,其特征在于,所述目标区域的数量为多个,且任两个所述目标区域不重叠。
  11. 根据权利要求1-10任一所述的方法,其特征在于,所述电子设备为手机、智能手表、智能手环、平板电脑、或者车载终端。
  12. 一种电子设备,包括存储器、处理器以及存储在所述存储器中并可在所述处理器上运行的计算机程序,其特征在于,所述处理器执行所述计算机程序时,使得所述电子设备执行如权利要求1至11任一项所述的方法。
  13. 一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,其特征在于,所述计算机程序被处理器执行时,使得计算机执行如权利要求1至11任一项所述的方法。
PCT/CN2022/138814 2022-01-26 2022-12-13 触控操作响应方法和电子设备 WO2023142736A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210092654.6A CN116540862B (zh) 2022-01-26 2022-01-26 触控操作响应方法和电子设备
CN202210092654.6 2022-01-26

Publications (1)

Publication Number Publication Date
WO2023142736A1 true WO2023142736A1 (zh) 2023-08-03

Family

ID=87442247

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/138814 WO2023142736A1 (zh) 2022-01-26 2022-12-13 触控操作响应方法和电子设备

Country Status (2)

Country Link
CN (1) CN116540862B (zh)
WO (1) WO2023142736A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030222858A1 (en) * 2002-05-28 2003-12-04 Pioneer Corporation Touch panel device
CN101978337A (zh) * 2008-02-11 2011-02-16 苹果公司 屏幕的运动补偿
CN103294232A (zh) * 2012-02-22 2013-09-11 华为终端有限公司 一种触摸操作的处理方法及终端
CN105094440A (zh) * 2015-08-18 2015-11-25 惠州Tcl移动通信有限公司 一种基于移动终端的触摸屏防抖方法、系统及移动终端
CN105260044A (zh) * 2014-07-18 2016-01-20 国基电子(上海)有限公司 电子设备及触控操作识别方法
CN107390931A (zh) * 2017-07-26 2017-11-24 广东欧珀移动通信有限公司 触摸操作的响应控制方法、装置、存储介质及移动终端

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010108316A (ja) * 2008-10-30 2010-05-13 Kyocera Corp 電子機器およびキー制御方法
US9841839B2 (en) * 2013-10-07 2017-12-12 Tactual Labs Co. System for measuring latency on a touch device
CN106775084B (zh) * 2016-12-16 2019-04-16 Oppo广东移动通信有限公司 一种触摸屏的防误触方法、装置及移动终端
JP6890743B2 (ja) * 2019-04-04 2021-06-18 三菱電機株式会社 指示判定装置、車載機器、及び指示判定方法
CN110077421A (zh) * 2019-04-18 2019-08-02 广州小鹏汽车科技有限公司 车辆触控输入事件的处理方法、处理装置和车辆触摸屏

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030222858A1 (en) * 2002-05-28 2003-12-04 Pioneer Corporation Touch panel device
CN101978337A (zh) * 2008-02-11 2011-02-16 苹果公司 屏幕的运动补偿
CN103294232A (zh) * 2012-02-22 2013-09-11 华为终端有限公司 一种触摸操作的处理方法及终端
CN105260044A (zh) * 2014-07-18 2016-01-20 国基电子(上海)有限公司 电子设备及触控操作识别方法
CN105094440A (zh) * 2015-08-18 2015-11-25 惠州Tcl移动通信有限公司 一种基于移动终端的触摸屏防抖方法、系统及移动终端
CN107390931A (zh) * 2017-07-26 2017-11-24 广东欧珀移动通信有限公司 触摸操作的响应控制方法、装置、存储介质及移动终端

Also Published As

Publication number Publication date
CN116540862B (zh) 2023-12-01
CN116540862A (zh) 2023-08-04

Similar Documents

Publication Publication Date Title
US10390140B2 (en) Output device outputting audio signal and control method thereof
US10554807B2 (en) Mobile terminal and method of operating the same
US11044684B2 (en) Method and device for measuring amount of user physical activity
KR102561587B1 (ko) 전자 장치 및 그의 동작 방법
US11200022B2 (en) Method and apparatus of playing audio data
US11574009B2 (en) Method, apparatus and computer device for searching audio, and storage medium
US20150382321A1 (en) Method and apparatus for providing notification
US11095838B2 (en) Electronic device and method for capturing image in electronic device
WO2018026145A1 (ko) 전자 장치 및 전자 장치의 시선 추적 방법
CN102708120A (zh) 生活流式传输
US20170144042A1 (en) Mobile terminal, training management program and training management method
WO2021213451A1 (zh) 轨迹回放方法及相关装置
KR102517228B1 (ko) 사용자의 입력에 대한 외부 전자 장치의 응답 시간에 기반하여 지정된 기능을 제어하는 전자 장치 및 그의 방법
KR102452314B1 (ko) 컨텐츠 재생 방법 및 이를 지원하는 전자 장치
US9338340B2 (en) Launching a camera of a wireless device from a wearable device
WO2019165786A1 (zh) 数据传输方法、装置及系统、显示装置
WO2019205735A1 (zh) 数据传输方法、装置、显示屏及显示装置
WO2024016564A1 (zh) 二维码识别方法、电子设备以及存储介质
WO2021052035A1 (zh) 一种屏幕侧面区域显示方法及电子设备
WO2018131928A1 (ko) 적응적인 사용자 인터페이스를 제공하기 위한 장치 및 방법
WO2018084649A1 (ko) 눈을 촬영하여 정보를 획득하는 방법 및 장치
WO2017191908A1 (ko) 위치 정보 계산 방법 및 그 전자 장치
CN113706807B (zh) 发出报警信息的方法、装置、设备及存储介质
WO2023142736A1 (zh) 触控操作响应方法和电子设备
WO2018182282A1 (ko) 전자 장치 및 그의 이미지 처리 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22923523

Country of ref document: EP

Kind code of ref document: A1