CN113946208B - Touch control panel control method and electronic equipment - Google Patents

Info

Publication number
CN113946208B
CN113946208B (application CN202111062574.8A)
Authority
CN
China
Prior art keywords
gesture
user
notebook computer
touch
touch pad
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111062574.8A
Other languages
Chinese (zh)
Other versions
CN113946208A (en)
Inventor
王晓杨 (Wang Xiaoyang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority claimed from application CN202111062574.8A
Publication of CN113946208A
Application granted
Publication of CN113946208B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545 Pens or stylus
    • G06F3/03547 Touch pads, in which fingers can move on a surface
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application disclose a touch pad control method and an electronic device, relating to the field of electronic devices, so as to simplify the process of using a touch pad and improve the efficiency with which a user uses the touch pad. The specific scheme is as follows: the electronic device receives a first gesture input by a user on a touch pad of the electronic device using a first operation object; the electronic device executes a first event in response to the first gesture; while receiving the first gesture, the electronic device receives a second gesture input by the user on the touch pad using a second operation object, the starting time of the second gesture being later than the starting time of the first gesture; the electronic device continues to execute the first event and, in response to the second gesture, executes a second event, the second event being different from the first event.

Description

Touch control panel control method and electronic equipment
Technical Field
The present disclosure relates to the field of electronic devices, and in particular, to a method for controlling a touch panel and an electronic device.
Background
Currently, a touch pad is widely used as a built-in input device of a notebook computer. A user may operate the touch pad using one or more fingers of one hand, or using fingers of both hands, to form different gestures. The notebook computer responds differently to operations made with different gestures.
However, when the user performs a plurality of operations on the touch pad at the same time, the notebook computer can only respond to one of the plurality of operations, and cannot respond to the plurality of operations at the same time. This results in a cumbersome and inefficient process of operating the touch pad.
Disclosure of Invention
The embodiment of the application provides a touch pad control method and electronic equipment, which can simplify the process of using a touch pad by a user and improve the efficiency of using the touch pad by the user.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
in a first aspect, an embodiment of the present application provides a method for operating a touch pad, which is applied to an electronic device. The method may include: the electronic device receives a first gesture input by a user on a touch pad of the electronic device using a first operation object; the electronic device executes a first event in response to the first gesture; while receiving the first gesture, the electronic device receives a second gesture input by the user on the touch pad using a second operation object, the starting time of the second gesture being later than the starting time of the first gesture; the electronic device continues to execute the first event and, in response to the second gesture, executes a second event, the second event being different from the first event.
According to the method of the first aspect, the electronic device receives a second gesture of the user in the process of receiving the first gesture of the user and responding to the first gesture, and the electronic device continues to respond to the first gesture and responds to the second gesture. According to the scheme, under the condition that the user inputs multiple gestures on the touch pad of the electronic equipment, the electronic equipment can respond to the multiple gestures at the same time, so that the process of using the touch pad by the user is simplified, and the efficiency of using the touch pad by the user is improved.
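As a minimal illustrative sketch (not the claimed implementation), the concurrent behaviour described above might be organized as follows; the gesture names, event names, and the TouchPadController class are assumptions introduced only for illustration.

```python
# Illustrative sketch only (not the patented implementation): a later gesture
# does not cancel an earlier one; the first event keeps running while the
# second gesture triggers a second, different event.
from dataclasses import dataclass

@dataclass
class Gesture:
    kind: str          # e.g. "one_finger_slide", "two_finger_rotate" (hypothetical names)
    start_time: float  # seconds at which the gesture's first contact began

class TouchPadController:
    def __init__(self):
        self.active_events = []  # events currently being executed

    def on_gesture(self, gesture: Gesture):
        # Hypothetical mapping from recognized gestures to events.
        event = {"one_finger_slide": "select_content",
                 "two_finger_rotate": "rotate_selection"}.get(gesture.kind)
        if event is not None:
            self.active_events.append(event)  # earlier events are kept running
        return list(self.active_events)

controller = TouchPadController()
print(controller.on_gesture(Gesture("one_finger_slide", 0.00)))   # ['select_content']
print(controller.on_gesture(Gesture("two_finger_rotate", 0.30)))  # ['select_content', 'rotate_selection']
```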
With reference to the first aspect, in another possible implementation manner, an interval time between the starting time of inputting the second gesture and the starting time of inputting the first gesture is greater than or equal to an interval time threshold.
Based on the possible implementation manner, the first gesture or the second gesture can be accurately determined by that the interval time between the starting time of inputting the second gesture and the starting time of inputting the first gesture is greater than or equal to the interval time threshold.
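A minimal sketch of such a threshold check, assuming a hypothetical constant INTERVAL_THRESHOLD expressed in seconds:

```python
# Sketch: a later contact is treated as the start of a separate second gesture
# only if it begins at least INTERVAL_THRESHOLD seconds after the first gesture.
INTERVAL_THRESHOLD = 0.05  # hypothetical value, in seconds

def is_second_gesture(first_start: float, second_start: float) -> bool:
    return (second_start - first_start) >= INTERVAL_THRESHOLD

print(is_second_gesture(0.00, 0.02))  # False: likely part of the same gesture
print(is_second_gesture(0.00, 0.30))  # True: recognized as a distinct second gesture
```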
With reference to the first aspect, in another possible implementation manner, the first gesture and the second gesture are each one of the following gestures: click, touch, slide, rotate, multi-finger pinch, and multi-finger separation.
Based on the possible implementation manner, the electronic device can accurately recognize the first gesture or the second gesture input by the user, so that the electronic device can execute different events.
With reference to the first aspect, in another possible implementation manner, the first event is selecting content displayed in the current interface, and the second event is rotating the selected content, zooming out or enlarging the selected content, displaying a desktop of the electronic device and displaying the selected content on the desktop, switching the desktop of the electronic device and displaying the selected content on the switched desktop, adjusting a speed of selecting the content in the current interface, or scrolling the content in the current interface; or, the first event is to scroll and display the content in the current interface, and the second event is to adjust the speed of scrolling and displaying the content.
Based on the possible implementation mode, the electronic equipment can respond to a plurality of gestures input by the user at the same time through the difference between the first event and the second event, so that the process of using the touch pad by the user is simplified, and the efficiency of using the touch pad by the user is improved.
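The pairings of first and second events listed above can be pictured as a lookup table keyed by the two gestures; the table below only paraphrases those combinations, and the gesture and event names are chosen for readability rather than taken from the patent.

```python
# Illustrative mapping from (first gesture, second gesture) to the pair of
# events executed concurrently; entries paraphrase the combinations above.
EVENT_TABLE = {
    ("one_finger_slide", "two_finger_rotate"): ("select_content", "rotate_selection"),
    ("one_finger_slide", "two_finger_pinch"):  ("select_content", "zoom_selection"),
    ("one_finger_slide", "two_finger_swipe"):  ("select_content", "switch_desktop"),
    ("one_finger_slide", "one_finger_tap"):    ("select_content", "reduce_selection_speed"),
    ("two_finger_slide", "one_finger_tap"):    ("scroll_content", "reduce_scroll_speed"),
}

def events_for(first_gesture: str, second_gesture: str):
    return EVENT_TABLE.get((first_gesture, second_gesture))

print(events_for("one_finger_slide", "two_finger_pinch"))  # ('select_content', 'zoom_selection')
```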
With reference to the first aspect, in another possible implementation manner, the electronic device includes an input device with a touch pad, or is connected to the input device with the touch pad in a wireless or wired manner; the first operation object and the second operation object are fingers or touch pens.
Based on the possible implementation mode, different gestures can be input into the input device with the touch pad through a finger or a touch pen, and the electronic device can respond to the gestures at the same time, so that the process of using the touch pad by a user is simplified, and the efficiency of using the touch pad by the user is improved.
With reference to the first aspect, in another possible implementation manner, the second gesture and the first gesture are gestures input by a user on different touch sub-areas of the touch pad.
Based on the possible implementation manner, the second gesture and the first gesture are gestures input by the user on different touch sub-areas of the touch pad, and the first gesture and the second gesture can be accurately determined to be different gestures input by the user.
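A minimal sketch of attributing a contact to one of two touch sub-areas by its x coordinate; the pad width and sub-area names are assumptions:

```python
# Sketch: split the touch pad into a left and a right touch sub-area and
# attribute each contact to one of them by its x coordinate.
PAD_WIDTH_MM = 120.0  # hypothetical touch pad width

def touch_sub_area(x_mm: float) -> str:
    return "sub_area_left" if x_mm < PAD_WIDTH_MM / 2 else "sub_area_right"

# Contacts landing in different sub-areas can then be fed to independent
# gesture recognizers, so the two gestures are kept apart.
print(touch_sub_area(20.0))   # sub_area_left
print(touch_sub_area(95.0))   # sub_area_right
```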
In a second aspect, an embodiment of the present application provides a touch pad manipulation device, which can be applied to an electronic device, for implementing the method in the first aspect. The functions of the touch control panel control device can be realized by hardware, and can also be realized by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above functions, for example, a receiving module and an executing module, etc.
The receiving module may be configured to receive a first gesture input by a user on a touch pad of the electronic device using a first operation object.
An execution module may be configured to execute the first event in response to the first gesture.
The receiving module may be further configured to receive a second gesture input by the user on the touch pad by using a second operation object in the process of receiving the first gesture, where a start time of inputting the second gesture is later than a start time of inputting the first gesture.
The execution module may be further configured to continue to execute the first event and execute a second event in response to the second gesture, where the second event is different from the first event.
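Purely for illustration, the module split could be sketched as two cooperating objects; the recognition logic shown is a placeholder, not the claimed method, and all names are hypothetical.

```python
# Illustration only: the "receiving module" turns raw contact groups into
# gesture names, and the "execution module" runs the corresponding events and
# keeps earlier events running when a new one arrives.
class ReceivingModule:
    def receive(self, contact_groups):
        # Placeholder recognition: one contact -> slide, two contacts -> rotate.
        return ["one_finger_slide" if len(group) == 1 else "two_finger_rotate"
                for group in contact_groups]

class ExecutionModule:
    EVENTS = {"one_finger_slide": "select_content",
              "two_finger_rotate": "rotate_selection"}

    def __init__(self):
        self.running = []

    def execute(self, gesture):
        self.running.append(self.EVENTS[gesture])  # earlier events are not interrupted
        return list(self.running)

receiving, execution = ReceivingModule(), ExecutionModule()
for g in receiving.receive([[(10, 20)], [(60, 30), (70, 35)]]):
    print(execution.execute(g))
```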
With reference to the second aspect, in another possible implementation manner, an interval time between the starting time of inputting the second gesture and the starting time of inputting the first gesture is greater than or equal to an interval time threshold.
With reference to the second aspect, in another possible implementation manner, the first gesture and the second gesture are each one of the following gestures: click, touch, slide, rotate, multi-finger pinch, and multi-finger separation.
With reference to the second aspect, in another possible implementation manner, the first event is selecting content displayed in the current interface, and the second event is rotating the selected content, zooming out or enlarging the selected content, displaying a desktop of the electronic device and displaying the selected content on the desktop, switching the desktop of the electronic device and displaying the selected content on the switched desktop, adjusting a speed of selecting the content in the current interface, or scrolling the content in the current interface; or, the first event is to scroll and display the content in the current interface, and the second event is to adjust the speed of scrolling and displaying the content.
With reference to the second aspect, in another possible implementation manner, the electronic device includes an input device with a touch pad, or is connected to the input device with a touch pad in a wireless or wired manner; the first operation object and the second operation object are fingers or touch pens.
With reference to the second aspect, in another possible implementation manner, the second gesture and the first gesture are gestures input by the user on different touch sub areas of the touch pad.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor, a memory for storing instructions executable by the processor. The processor is configured to execute the above instructions to enable the electronic device to implement the touch pad manipulation method according to the first aspect or any one of the possible implementation manners of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having computer program instructions stored thereon. The computer program instructions, when executed by the electronic device, cause the electronic device to implement the touch pad manipulation method of the first aspect or any of its possible implementations.
In a fifth aspect, an embodiment of the present application provides a computer program product, which includes computer readable code, and when the computer readable code is run in an electronic device, the electronic device is caused to implement the touch pad manipulation method according to the first aspect or any one of the possible implementation manners of the first aspect.
It should be understood that the beneficial effects of the second to fifth aspects can be seen from the description of the first aspect, and are not repeated herein.
Drawings
Fig. 1 is a schematic view of a notebook computer according to an embodiment of the present disclosure;
fig. 2 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application;
fig. 3 is a first schematic display interface diagram of an electronic device according to an embodiment of the present disclosure;
fig. 4 is a second schematic display interface diagram of an electronic device according to an embodiment of the present application;
fig. 5 is a third schematic view of a display interface of an electronic device according to an embodiment of the present application;
fig. 6 is a fourth schematic view of a display interface of the electronic device according to the embodiment of the present application;
fig. 7 is a fifth schematic view of a display interface of an electronic device according to an embodiment of the present application;
fig. 8 is a sixth schematic view of a display interface of an electronic device according to an embodiment of the present application;
fig. 9 is a seventh schematic display interface of an electronic device according to an embodiment of the present application;
fig. 10A is a first flowchart illustrating a method for operating a touch pad according to an embodiment of the present disclosure;
fig. 10B is a second flowchart illustrating a method for controlling a touch pad according to an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of a touch pad operating device according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or to implicitly indicate the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present application, "a plurality" means two or more unless otherwise specified.
Currently, there are many pointer input devices for personal computers (e.g., notebook computers) or smart mobile terminals, such as a mouse, a touch pad, a touch screen, a stylus pen, and the like. Among them, the touch panel is widely used as a built-in input device of a notebook computer. For example, as shown in fig. 1, a user may use one or more fingers on one hand or fingers on both hands to form different gestures to operate the touch pad 02 of a notebook computer, i.e., use one or more fingers to input different gestures (e.g., sliding or rotating) on the touch pad 02. The response of the notebook computer is different for different gestures. For example, the notebook computer may display the interface on the display screen 01 after responding to different gestures.
For example, when a user opens an article on a notebook computer, the article has a plurality of pages. The user can slide on the touch pad 02 of the notebook computer with one finger. In response, the laptop may highlight the text selected by the pointer in the currently displayed page of the article. As another example, the user may also slide on the touch pad 02 of the notebook computer using two fingers on one hand. In response, the laptop may scroll through the text of other pages in the article.
However, when the user performs multiple operations on the touch pad at the same time, that is, the user inputs multiple gestures on the touch pad at the same time, the notebook computer can only respond to one of the multiple gestures, and cannot respond to the multiple gestures input by the user at the same time, that is, the touch pad of the notebook computer can only perform a single-line operation. This results in a cumbersome and inefficient process for the user to use the touch pad.
In the prior art, the efficiency of using the notebook computer can be improved by using a multi-point parallel operation mode. For example, a user may simultaneously operate two physical buttons on a mouse connected to the notebook computer or simultaneously operate two input devices of the notebook computer to enable the notebook computer to simultaneously respond to two different operations, thereby improving the efficiency of using the notebook computer.
For example, the user may simultaneously operate the left button and the wheel of the mouse, and the notebook computer may respond to the operation of the left button and the wheel of the mouse. For example, the display of a notebook computer displays an article, the article has a plurality of pages, and a certain page of the article is currently displayed. The user can operate the left button of the mouse to select a part of the characters in the article which is currently displayed. Meanwhile, the user can operate the scroll wheel of the mouse, for example, rotate the scroll wheel of the mouse to scroll and display the characters included in other pages of the article. That is, the notebook computer can also scroll and show the characters included in other pages of the article while responding to the characters selected by the user, thereby facilitating the user to continuously select the characters included in other pages.
For another example, the user may simultaneously operate the touch pad of the notebook computer and the wheel of the mouse, and the notebook computer may respond to the operation of the user on the wheel of the mouse while responding to the operation of the user on the touch pad. For example, the display screen of the notebook computer displays an article with multiple pages, and currently displays one page of the article. The user may operate the touch pad, for example, the user may slide a finger on the touch pad to select a portion of the text currently displayed on the display screen. Meanwhile, the user can operate the scroll wheel of the mouse, for example, rotate the scroll wheel of the mouse to scroll and display the characters included in other pages of the article. That is, the notebook computer can also scroll and show the characters included in other pages of the article while responding to the characters selected by the user, thereby facilitating the user to continuously select the characters included in other pages.
Multi-point parallel operation enables the notebook computer to respond to a plurality of operations at the same time, so the efficiency of using the notebook computer can be improved compared with single-line operation (i.e., the notebook computer responding to one operation at a time). However, for a notebook computer that uses a touch pad as its only pointer input device, that is, when the touch pad is the only pointer input device of the notebook computer, the user cannot perform the above-mentioned multi-point parallel operation, which reduces the user's operation efficiency in various working scenarios. Therefore, the multi-point parallel operation mode in the prior art does not adequately solve the problems of the cumbersome process and low efficiency of operating the touch pad.
In view of the above problems, embodiments of the present application provide a method for operating a touch pad, which is applied to an electronic device and which enables the electronic device to respond to a plurality of gestures simultaneously when a user inputs the plurality of gestures on the touch pad of the electronic device, so as to simplify the process of using the touch pad by the user and improve the efficiency of using the touch pad by the user.
The following describes a method for manipulating a touch pad provided in an embodiment of the present application.
The touch pad control method provided by the embodiments of the application can be applied to an electronic device. In some examples, the electronic device may be a handheld computer, a personal computer (PC) (e.g., a notebook computer), a personal digital assistant (PDA), or the like. The electronic device includes an input device having a touch pad, such as a notebook computer, or the electronic device and an input device having a touch pad are connected by a wireless or wired connection. The embodiments of the present application do not limit the specific form of the electronic device.
By way of example, taking an electronic device as a notebook computer as an example, fig. 2 shows a schematic structural diagram of an electronic device provided in an embodiment of the present application.
As shown in fig. 2, the notebook computer may include: processor 210, fan 211, external memory interface 220, internal memory 221, Universal Serial Bus (USB) interface 230, charging management module 240, power management module 241, battery 242, display 250, antenna, wireless communication module 260, audio module 270, speaker (i.e., loudspeaker) 270A, microphone 270C, headphone interface 270B, touch pad 280, keyboard 290, and camera 291, among others.
The devices (such as the processor 210, the fan 211, the external memory interface 220, the internal memory 221, the usb interface 230, the charging management module 240, the power management module 241, the battery 242, the antenna, the wireless communication module 260, the audio module 270, the touch pad 280, the speaker 270A, the microphone 270C, the earphone interface 270B, the keyboard 290, the camera 291, and the like) other than the display 250 can be disposed on the base of the notebook computer. The camera 291 may also be disposed on a frame of the display screen 250 of the notebook computer.
It is understood that the structure illustrated in the embodiment is not a specific limitation to the notebook computer. In other embodiments, a notebook computer may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units, such as: the processor 210 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can be the neural center and the command center of a notebook computer. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 210 for storing instructions and data. In some embodiments, the memory in processor 210 is a cache memory. The memory may hold instructions or data that have just been used or recycled by processor 210. If the processor 210 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 210, thereby increasing the efficiency of the system.
In some embodiments, processor 210 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
It should be understood that the interface connection relationship between the modules in this embodiment is only schematically illustrated, and does not limit the structure of the notebook computer. In other embodiments, the notebook computer may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 240 is used to receive charging input from a charger (e.g., a wireless charger or a wired charger) to charge the battery 242. The power management module 241 is used to connect the battery 242, the charging management module 240 and the processor 210. The power management module 241 receives the input of the battery 242 and/or the charging management module 240 to supply power to the components of the notebook computer.
The wireless communication function of the notebook computer can be realized by the antenna and wireless communication module 260, the modem processor, the baseband processor, and the like.
The antenna is used for transmitting and receiving electromagnetic wave signals. Each antenna in a notebook computer may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
In some embodiments, the antenna of the notebook computer is coupled to the wireless communication module 260 so that the notebook computer can communicate with the network and other devices through wireless communication techniques. The wireless communication module 260 may provide a solution for wireless communication applied to a notebook computer, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
The notebook computer can implement the display function through the GPU, the display screen 250, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 250 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information. The display screen 250 is used to display images, video, and the like.
The touch pad 280 is integrated with a touch sensor. The notebook computer can receive a control command of the notebook computer from a user through the touch pad 280 and the keyboard 290.
The notebook computer can realize the shooting function through the ISP, the camera 291, the video codec, the GPU, the display screen 250, the application processor, and the like. The ISP is used to process the data fed back by the camera 291. In some embodiments, the ISP may be provided in camera 291. The camera 291 is used to capture still images or video. In some embodiments, the notebook computer may include 1 or N cameras 291, N being a positive integer greater than 1.
The external memory interface 220 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the notebook computer. Internal memory 221 may be used to store computer-executable program code, including instructions. The processor 210 executes various functional applications of the notebook computer and data processing by executing instructions stored in the internal memory 221. For example, in the present embodiment, the processor 210 may execute instructions stored in the internal memory 221, and the internal memory 221 may include a program storage area and a data storage area.
The notebook computer can implement an audio function through the audio module 270, the speaker 270A, the microphone 270C, the earphone interface 270B, the application processor, and the like. Such as music playing, recording, etc.
Audio module 270 is used to convert digital audio signals into analog audio signal outputs and also to convert analog audio inputs into digital audio signals. Audio module 270 may also be used to encode and decode audio signals. In some embodiments, the audio module 270 may be disposed in the processor 210, or some functional modules of the audio module 270 may be disposed in the processor 210. The speaker 270A, also called a "loudspeaker", is used to convert an audio electrical signal into an acoustic signal. The microphone 270C, also referred to as a "mic", is used to convert acoustic signals into electrical signals. The earphone interface 270B is used to connect a wired earphone. The earphone interface 270B may be the USB interface 230, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The notebook computer according to the embodiment of the present application may include one or more speakers 270A and one or more microphones 270C.
The fan 211 is used for heat dissipation of the notebook computer. The processor 210 can control the fan 211 to operate at different rotation speeds to dissipate heat of the notebook computer.
Of course, it should be understood that fig. 2 is only an exemplary illustration of the electronic device in the form of a notebook computer. If the electronic device is in the form of a handheld computer, a PDA, or other devices, the structure of the electronic device may include fewer structures than those shown in fig. 2, or may include more structures than those shown in fig. 2, and is not limited herein.
The methods in the following embodiments may be implemented in an electronic device having the above hardware structure. In the embodiment of the present application, an electronic device is taken as a notebook computer for illustration.
For ease of understanding, the scheme of the present application is exemplified below with reference to specific scenarios and fig. 3 to 9.
In some examples, a user may select a portion of content, such as pictures or text, in an interface currently displayed on a display screen of a laptop computer. Meanwhile, the user can rotate the selected part of the content. That is, the notebook computer can respond to the gesture of the user for selecting part of the content in the current display interface and the gesture for rotating the selected content at the same time.
For example, a user may use one finger of one hand to slide over a touchpad of a notebook computer. In response, the notebook computer may select the corresponding content and highlight (e.g., highlight) the content selected by the pointer, such as text or pictures, on the display screen. Meanwhile, the user can use two fingers on the other hand to rotate on the touch pad of the notebook computer, such as clockwise rotation or counterclockwise rotation. In response, the notebook computer may rotate the content selected by the pointer and display a rotating animation of the content on the display screen.
For example, referring to fig. 3 (a), taking the case that the display screen 01 of the notebook computer currently displays a picture, and the touch area of the touch pad 02 of the notebook computer includes the touch sub-area 03 and the touch sub-area 04, the touch sub-area 03 and the touch sub-area 04 are used for detecting different gestures input by the user.
For example, the user may slide the index finger of the right hand on the touch sub-area 04, such as sliding to the lower right. In response, the notebook computer may select a picture that the pointer passes over, such as picture 05. Also, as shown in fig. 3 (a), the notebook computer may highlight the picture 05, such as by adding a selection frame around the picture 05 and displaying it on the display screen 01. Thereafter, the user may keep the position of the right index finger on the touch sub-area 04 unchanged, or keep the right index finger sliding on the touch sub-area 04. Meanwhile, the user may rotate the index finger and thumb of the left hand clockwise on the touch sub-area 03. In response, the notebook computer may rotate the picture 05 clockwise. And, as the user's fingers rotate, the display screen 01 of the notebook computer may display a rotation animation of the picture 05 as shown in (b) of fig. 3. That is, the notebook computer can respond to the user's rotation gesture while responding to the user's selection gesture.
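A sketch of how a two-finger rotation on the touch sub-area 03 could be converted into a rotation angle for the selected picture; the frame format and function name are assumptions, not the patented algorithm.

```python
# Sketch: derive a rotation angle for the selected picture from the change in
# orientation of the line joining the two left-hand contacts between frames.
import math

def rotation_delta_degrees(prev_pair, curr_pair):
    """prev_pair / curr_pair are ((x1, y1), (x2, y2)) contact positions."""
    (ax, ay), (bx, by) = prev_pair
    (cx, cy), (dx, dy) = curr_pair
    angle_prev = math.atan2(by - ay, bx - ax)
    angle_curr = math.atan2(dy - cy, dx - cx)
    return math.degrees(angle_curr - angle_prev)

# The two fingers start on a horizontal line and then rotate by roughly 10 degrees.
print(round(rotation_delta_degrees(((0, 0), (10, 0)), ((0, 0), (9.8, 1.7))), 1))
```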
In other examples, the user may select a portion of the content, such as a picture or text, in the interface currently displayed on the display screen of the laptop. Meanwhile, the user can zoom out or zoom in the selected part of the content. That is, the notebook computer can respond to a gesture of the user for selecting a part of the content in the current display interface and a gesture for zooming out or zooming in the selected content at the same time.
For example, a user may use one finger of one hand to slide on the touch pad of a notebook computer. In response, the notebook computer may select the corresponding content and highlight (e.g., with highlighting) the content selected by the pointer on the display screen. Meanwhile, the user can use two fingers of the other hand on the touch pad to pinch (i.e., move the fingers closer to each other) or separate (i.e., move the fingers away from each other). In response, the notebook computer may zoom out or zoom in on the content selected by the pointer and display a zoom-out or zoom-in animation on the display screen.
For example, referring to fig. 4 (a), taking an example that the display screen 01 of the notebook computer currently displays a picture, and the touch area of the touch pad 02 of the notebook computer includes a touch sub-area 03 and a touch sub-area 04, the touch sub-area 03 and the touch sub-area 04 are used for detecting different gestures of the user.
For example, the user can slide the index finger of the right hand on the touch sub-area 04, such as sliding to the lower right. In response, the notebook computer may select a picture that the pointer passes over, such as picture 05. Also, as shown in fig. 4 (a), the notebook computer may highlight the picture 05, such as by adding a selection frame around the picture 05 and displaying it on the display screen 01. Thereafter, the user may keep the position of the right index finger on the touch sub-area 04 unchanged, or keep the right index finger sliding on the touch sub-area 04. Meanwhile, the user may move the index finger and the thumb of the left hand closer to each other on the touch sub-area 03. In response, the notebook computer may zoom out the picture 05. And, as the user's fingers move closer to each other, that is, as the user's fingers are pinched together, the display screen 01 of the notebook computer may display a zoom-out animation of the picture 05, as shown in fig. 4 (b). That is, the notebook computer can respond to the user's zoom-out gesture while responding to the user's selection gesture.
Accordingly, while the user keeps the position of the index finger of the right hand on the touch sub-area 04 unchanged, the user can move the index finger and the thumb of the left hand away from each other on the touch sub-area 03, i.e., the user's fingers are separated. In response, the notebook computer may zoom in on the picture 05, and the display screen 01 of the notebook computer can display a zoom-in animation of the picture 05. That is, the notebook computer may respond to the user's zoom-in gesture while responding to the user's selection gesture.
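Similarly, a sketch of deriving a zoom factor from the change in distance between the two left-hand contacts (an illustrative assumption about how the zoom could be computed, not the patented algorithm):

```python
# Sketch: derive a zoom factor for the selected picture from how far apart the
# two left-hand contacts are now, compared with when the pinch began.
import math

def zoom_factor(initial_pair, current_pair):
    (ax, ay), (bx, by) = initial_pair
    (cx, cy), (dx, dy) = current_pair
    d0 = math.hypot(bx - ax, by - ay)
    d1 = math.hypot(dx - cx, dy - cy)
    return d1 / d0  # < 1: fingers pinched together (zoom out); > 1: fingers separated (zoom in)

print(zoom_factor(((0, 0), (40, 0)), ((5, 0), (25, 0))))  # 0.5 -> picture shown at half size
print(zoom_factor(((0, 0), (40, 0)), ((0, 0), (80, 0))))  # 2.0 -> picture shown at double size
```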
In other examples, a laptop may include multiple desktops, and different desktops may be used to display different content, such as applications or files. A user may select certain content, such as applications or files, on a desktop. Meanwhile, the user can drag the selected content to another desktop. That is to say, the notebook computer may respond to the gesture used by the user to select a part of the content in the current display interface and the gesture used to display the other desktops of the notebook computer at the same time, so that the selected content may be dragged to the other desktops of the notebook computer.
For example, a user may use one finger of one hand to slide over a touchpad of a notebook computer. In response, the notebook computer may select corresponding content on the currently displayed desktop of the display screen, and highlight (e.g., highlight) the content selected by the pointer on the display screen. Meanwhile, the user can slide on the touch pad of the notebook computer, such as left-right sliding, by using two fingers on the other hand. In response, the notebook computer may display the other desktop of the notebook computer and display the content selected by the pointer on the desktop, i.e., the user drags the selected content to another desktop.
For example, as shown in (a) of fig. 5, a notebook computer may include a desktop 06, and the desktop 06 may include a plurality of applications, such as a music application, and the like. As shown in fig. 5 (b), the notebook computer may further include a desktop 07, and the desktop 07 may also include a plurality of applications, such as a weather application. The user may select one or more applications on desktop 06, such as a music application, and drag and move the selected applications on desktop 06 into desktop 07.
As shown in (a) and (b) of fig. 5, taking the example that the touch area of the touch pad 02 of the notebook computer includes the touch sub-area 03 and the touch sub-area 04, the touch sub-area 03 and the touch sub-area 04 are used to detect different gestures of the user. The user can slide on the touch sub-area 04 with the index finger of the right hand, such as to slide to the lower right. In response, the laptop may select a portion of the desktop 06 content, such as a music application. Also, as shown in fig. 5 (c), the notebook computer may highlight the music application, such as highlighting the music application selected by the pointer.
Thereafter, the user may keep the right index finger sliding on the touch sub-area 04, dragging the music application to a different location. Meanwhile, the user can slide left and right, such as slide right, on the touch sub-area 03 using two fingers of the left hand. In response, the notebook computer may display the desktop 07 and display the music application on the desktop 07. The index finger of the right hand of the user may continue to slide on the touch sub-area 04, so that the music application selected by the pointer may be dragged and moved to a corresponding position in the desktop 07. As shown in fig. 5 (d), the display screen 01 of the notebook computer may display the dragged desktop 07, and the dragged desktop 07 includes a music application. Accordingly, the dragged desktop 06 does not include a music application. That is, the notebook computer can respond to the gesture of displaying other desktops of the notebook computer while responding to the gesture selected by the user, so that the selected content can be dragged to the other desktops of the notebook computer.
In other examples, the user may select a portion of the content, such as a picture or text, in the interface currently displayed on the display screen of the laptop. At the same time, the user may drag and move the selected portion of content to another application. That is, the notebook computer may respond to a gesture of the user for selecting a part of the content in the current display interface and a gesture for displaying the desktop of the notebook computer at the same time, and the desktop of the notebook computer may include a plurality of applications, so that the selected content may be dragged to other applications.
For example, a user may use one finger of one hand to slide over a touchpad of a notebook computer. In response, the display screen of the notebook computer may highlight (e.g., highlight, etc.) the content selected by the pointer, such as a picture in the selected picture application. Meanwhile, the user can slide on the touch pad of the notebook computer by using three fingers on the other hand, such as left-right sliding. In response, the laptop may display a desktop that may include multiple applications such that a user may drag and move content selected by the pointer into another application, such as dragging and moving a selected picture into an email application such that the picture may be sent via email.
For example, referring to fig. 6 (a), taking the display screen 01 of the notebook computer currently displays the interface 08 of the picture application, the interface 08 of the picture application includes a plurality of pictures, and the touch area of the touch pad 02 of the notebook computer includes the touch sub-area 03 and the touch sub-area 04, for example, the touch sub-area 03 and the touch sub-area 04 are used to detect different gestures of the user.
The user may click, such as single-click, on the touch sub-area 04 using the right index finger. In response, the notebook computer may select the picture indicated by the pointer, such as the picture 09 in the interface 08 of the picture application. Also, as shown in fig. 6 (a), the display screen 01 of the notebook computer may highlight the picture 09 selected by the pointer. Thereafter, the user can slide the right index finger on the touch sub-area 04, such as sliding to the lower right, thereby dragging the picture 09. In response, the display screen 01 of the notebook computer may display the animation 10 of zooming the picture 09 in the interface 08 of the picture application. Meanwhile, the user can slide three fingers of the left hand on the touch sub-area 03, such as up and down. In response, as shown in (b) of fig. 6, the notebook computer may display the desktop 06, which includes a plurality of applications such as an email application, together with the animation 10 of the zoomed picture 09. Thereafter, the user may keep the right index finger sliding on the touch sub-area 04 to drag the animation 10 onto the desktop 06 and toward another application, such as the email application. In response, the notebook computer may drag the animation 10 into the email application. The desktop 06 may also include multiple pages, and each page may include multiple applications. When the currently displayed page of the desktop 06 does not include the email application, the user may keep the index finger of the right hand sliding on the touch sub-area 04 while, as shown in fig. 6 (c), sliding two fingers of the left hand on the touch sub-area 03, such as left and right, so that the notebook computer pages through the desktop 06 and displays applications that are not currently displayed, allowing the animation 10 to be dragged to another application. Afterwards, the notebook computer can open the e-mail, and the e-mail can include the picture 09, so that the user can edit the e-mail and send the picture 09 to other users. That is, the notebook computer may respond to a gesture that displays the desktop of the notebook computer while responding to the user's selection gesture, and because the desktop of the notebook computer may include a plurality of applications, the selected content can be dragged to other applications.
In other examples, the user may select a portion of the content, such as a picture or text, in the interface currently displayed by the laptop. Meanwhile, the user can operate the notebook computer to scroll and display the content in the current interface. That is, the notebook computer can respond to the gesture of the user for selecting part of the content in the current display interface and the gesture for scrolling and displaying the content in the current interface at the same time.
For example, a user may use one finger of one hand to slide over a touchpad of a notebook computer. In response, the display screen of the notebook computer may highlight (e.g., highlight, etc.) the content selected by the pointer, such as a portion of text in a currently displayed page of an article. Meanwhile, the user can slide on the touch pad 02 of the notebook computer using two fingers on the other hand. In response, the laptop may scroll through the text of the other pages in the article so that the user may continue to select the text included in the other pages in the article.
For example, referring to fig. 7 (a), taking an example that a user opens an article on a notebook computer, the article has a plurality of pages, a display screen 01 of the notebook computer currently displays a certain page of the article, and a touch area of a touch pad 02 of the notebook computer includes a touch sub-area 03 and a touch sub-area 04, the touch sub-area 03 and the touch sub-area 04 are used to detect different gestures of the user.
The user can slide on the touch sub-area 04 using the index finger of the right hand, such as to slide to the lower right. In response, the notebook computer may select a portion of the text on the first page of the article, such as the last two lines of text. Also, as shown in fig. 7 (a), the notebook computer may highlight the character selected by the pointer. Thereafter, the user may keep the right index finger sliding on the touch sub-area 04 while the user may use both fingers of the left hand to slide, e.g., down, on the touch sub-area 03. In response, the display screen 01 of the notebook computer may scroll through the contents of the other pages of the article, while the pointer of the notebook computer may continue to select some or all of the text of the other pages of the article. Also, as shown in fig. 7 (b), the display screen of the notebook computer may highlight the character selected by the pointer. That is, the notebook computer may respond to a gesture of scrolling the content in the current interface by the user while responding to the gesture selected by the user.
In other examples, the user may select a portion of the content, such as a picture or text, in the interface currently displayed by the laptop. Meanwhile, the user can adjust the speed of the selected content, for example, the speed of the selected content is reduced, so that the precision of the selected content can be improved. That is, the notebook computer may respond to both a gesture of the user for selecting a portion of the content in the current display interface and a gesture for adjusting the speed of selecting the content in the current interface.
For example, a user may use one finger of one hand to slide over a touchpad of a notebook computer. In response, the display screen of the notebook computer may highlight (e.g., highlight, etc.) the content selected by the pointer, such as a portion of text in a currently displayed page of an article. Meanwhile, the user can touch the touch pad of the notebook computer by using one finger on the other hand, such as single finger tap. In response, the notebook computer may reduce the speed at which the pointer selects content while selecting content.
For example, referring to fig. 8 (a), taking an example that a user opens an article on a notebook computer, the article has a plurality of pages, the display screen 01 of the notebook computer currently displays a certain page of the article, and the touch area of the touch pad 02 of the notebook computer includes the touch sub-area 03 and the touch sub-area 04, the touch sub-area 03 and the touch sub-area 04 are used for detecting different gestures of the user.
The user can slide on the touch sub-area 04 with the index finger of the right hand, such as to slide to the lower right. In response, the notebook computer may select a portion of the text on the first page of the article, such as the last two lines of text. Also, as shown in fig. 8 (a), the notebook computer may highlight a portion of the text selected by the pointer. Thereafter, the user may keep the right index finger sliding on the touch sub-area 04, and at the same time, the user may touch on the touch sub-area 03 with a finger of the left hand, such as a single finger tap. In response, the notebook computer can reduce the speed of selecting characters by the pointer, for example, one character is selected each time, so that the precision of selecting characters can be improved. As shown in fig. 8 (b), the display screen 01 of the notebook computer may highlight a portion of the text selected by the pointer. That is, the notebook computer can respond to the gesture that the user reduces the speed of the selected content while responding to the gesture selected by the user.
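One possible reading of reducing the selection speed is to scale down how far the selection advances per millimetre of finger movement while the tap is active; the rates below are hypothetical and serve only as an illustration.

```python
# Sketch: while a single-finger tap is held on the other sub-area, the same
# finger movement advances the text selection by fewer characters.
NORMAL_CHARS_PER_MM = 4.0
PRECISE_CHARS_PER_MM = 1.0  # hypothetical "precision" rate once the tap is detected

def selection_advance(finger_delta_mm: float, precision_tap_active: bool) -> int:
    rate = PRECISE_CHARS_PER_MM if precision_tap_active else NORMAL_CHARS_PER_MM
    return round(finger_delta_mm * rate)

print(selection_advance(3.0, precision_tap_active=False))  # 12 characters
print(selection_advance(3.0, precision_tap_active=True))   # 3 characters
```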
In other examples, the user may scroll through the content, such as pictures or text, in the current interface of the notebook computer. Meanwhile, the user can adjust the speed of scrolling the content in the current interface, for example, reduce the scrolling speed, so that the precision of scrolling through the content can be improved. That is, the notebook computer can respond to the user's gesture for scrolling the content in the current interface and the gesture for adjusting the speed of scrolling and displaying the content at the same time.
For example, a user may use two fingers on one hand to slide, such as down, on a touch pad of a notebook computer. In response, the display screen of the notebook computer may scroll through the content in the current interface, such as the text of other pages in an article. Meanwhile, the user can touch the touch pad of the notebook computer by using one finger on the other hand, such as single finger tap. In response, the notebook computer reduces the speed of the scrolling while scrolling the content in the current interface.
For example, referring to fig. 9 (a), taking an example that a user opens an article on a notebook computer, the article has a plurality of pages, a display screen 01 of the notebook computer currently displays a certain page of the article, and a touch area of a touch pad 02 of the notebook computer includes a touch sub-area 03 and a touch sub-area 04, the touch sub-area 03 and the touch sub-area 04 are used to detect different gestures of the user.
The user can slide on the touch sub-area 03 using two fingers of the left hand, such as sliding down. In response, the display of the notebook computer may display the text of other pages of the article. Thereafter, the user may keep two fingers of the left hand sliding on the touch sub-area 03, and at the same time, the user may use one finger of the right hand to touch on the touch sub-area 03, such as a single finger tap. In response, the notebook computer may slow down the scrolling, such as scrolling a line of text at a time. Also, as shown in fig. 9 (b), the display screen 01 of the notebook computer may scroll to display the text of the other pages of the article. That is, the notebook computer can respond to the gesture of reducing the scrolling speed of the user while responding to the scrolling gesture of the user.
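Scroll-speed adjustment can be pictured the same way, as a factor applied to the scroll delta while the extra tap is active (again an illustrative assumption rather than the patented method):

```python
# Sketch: scale the scroll delta down while the single-finger tap is active,
# e.g. so that the article scrolls roughly one line at a time.
SLOW_SCROLL_FACTOR = 0.25  # hypothetical

def scroll_lines(raw_delta_lines: float, slow_tap_active: bool) -> float:
    return raw_delta_lines * (SLOW_SCROLL_FACTOR if slow_tap_active else 1.0)

print(scroll_lines(8.0, slow_tap_active=False))  # 8.0 lines
print(scroll_lines(8.0, slow_tap_active=True))   # 2.0 lines
```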
The following describes a method for manipulating a touch pad provided in an embodiment of the present application.
Fig. 10A is a schematic flowchart of a method for operating a touch pad according to an embodiment of the present disclosure. As shown in fig. 10A, the touch pad manipulation method may include S1001-S1006.
S1001, the notebook computer receives a first gesture input on the touch pad by a user through a first operation object.
When a user needs to operate the notebook computer, the user may use an operation object to perform a corresponding operation on the touch pad of the notebook computer, for example on the touch area of the touch pad (i.e., the area on the touch pad for detecting gestures made by the user with the operation object). That is, the user can input a corresponding gesture on the touch pad of the notebook computer with the operation object. In this embodiment of the application, the operation object (such as the first operation object described above) may be one or more fingers of the user, a touch pen, or the like; this embodiment is not limited in this respect. For convenience of description, the following embodiments are schematically illustrated by taking the case in which the operation object is one or more fingers of the user as an example. For example, the first operation object may be a finger of the user's left hand or a finger of the user's right hand, which is not limited in this embodiment of the application.
In some examples, the notebook computer may detect the touch area of the touch pad to determine whether there is motion of the operation object within the touch area of the touch pad. Therefore, when a user uses the operation object to input a corresponding gesture on the touch area of the touch pad of the notebook computer, the notebook computer can detect the motion of the operation object in the touch area of the touch pad, that is, detect the gesture input by the user. Then, different gestures input by the user can be recognized according to different motions detected by the notebook computer, namely different operations of the user can be recognized, so that different responses are carried out. For example, a user may use multiple fingers on one hand to input different gestures (e.g., click, slide, or rotate, etc.) on the touchpad to operate the touchpad of a notebook computer, or a user may use multiple fingers on two hands to input different gestures on the touchpad to operate the touchpad of a notebook computer. The response of the notebook computer is different for different gestures. The notebook computer can also display the interface after responding to different gestures on the display screen.
When a user inputs different gestures on the touch pad with a plurality of fingers, it is difficult for the plurality of fingers to start operating on the touch pad at exactly the same time, so the notebook computer can detect the user's gestures in stages and determine whether an operation the user performs on the touch pad with a plurality of fingers constitutes one gesture. For example, if the notebook computer detects within the same stage that the user slides down on the touch pad with two fingers, the notebook computer may determine that the two-finger downward slide is one gesture. If the notebook computer detects that the user slides down on the touch pad with one finger in one stage and with another finger in another stage, the notebook computer may determine that these are two gestures input by the user, rather than one gesture.
In some examples, the notebook computer may determine whether the movement of one or more of the user's fingers is detected in the touch area of the touch pad within a first preset duration (which may also be referred to as the first stage), so as to determine whether that movement is one gesture, i.e., one operation; in other words, whether the user has input the first gesture on the touch pad of the notebook computer with the first operation object. The one or more fingers used by the user within the first preset duration may be referred to as the first operation object, and the movement of the first operation object on the touch pad within the first preset duration may be referred to as the first gesture. The first preset duration may be set according to the actual situation, which is not limited in this embodiment of the application.
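As a rough illustration of this staged detection, the following Python sketch groups contacts by the stage in which they begin; the event structure, the field names, and the 300 ms value used for the first preset duration are illustrative assumptions, not values taken from the embodiment.

from dataclasses import dataclass
from typing import List

FIRST_STAGE_MS = 300   # hypothetical "first preset duration"

@dataclass
class TouchEvent:
    finger_id: int
    t_ms: int        # time at which the contact was detected, in milliseconds
    x: float
    y: float

def first_gesture_contacts(events: List[TouchEvent], stage_start_ms: int) -> List[TouchEvent]:
    """Return the contacts whose motion begins within the first preset duration.

    Contacts starting inside [stage_start_ms, stage_start_ms + FIRST_STAGE_MS)
    are treated together as the first operation object performing the first
    gesture; a contact that begins after this stage belongs to a later stage.
    """
    stage_end = stage_start_ms + FIRST_STAGE_MS
    return [e for e in events if stage_start_ms <= e.t_ms < stage_end]

if __name__ == "__main__":
    events = [
        TouchEvent(finger_id=1, t_ms=10, x=0.40, y=0.50),   # first finger lands
        TouchEvent(finger_id=2, t_ms=80, x=0.42, y=0.48),   # second finger lands in the same stage
        TouchEvent(finger_id=3, t_ms=900, x=0.10, y=0.60),  # lands later: a different stage
    ]
    same_stage = first_gesture_contacts(events, stage_start_ms=0)
    print([e.finger_id for e in same_stage])  # [1, 2] -> detected as a single two-finger gesture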
The first gesture may be a click (e.g., a single click or a double click), a touch (e.g., a single-finger tap), a slide (e.g., a vertical slide, a horizontal slide, or a slide in another direction), a rotation (e.g., a clockwise or counterclockwise rotation), multiple operation objects moving toward each other (e.g., a multi-finger pinch), multiple operation objects moving apart (e.g., a multi-finger separation), and so on. The embodiment of the present application does not limit the specific type of the first gesture.
For example, referring to fig. 3 (a), when the notebook computer detects within the first preset duration that the user slides with one finger (for example, toward the lower right) in the touch area of the touch pad 02, the notebook computer may determine that the user needs to select corresponding content, such as text or an application, in the interface currently displayed on the display screen 01. In response, the notebook computer may select the content the pointer passes over and may highlight the selected content on the display screen 01.
For another example, continuing with fig. 9 (a), in a case that the notebook computer detects that two fingers of the user slide downward in the touch area of the touch pad 02 in the first stage, the notebook computer may determine that the user needs to scroll contents that are not displayed in the current display interface of the display screen 01, such as characters of other pages in an article. In response, the notebook computer may scroll down within the display area of the display screen 01 to display content not displayed in the current display interface of the display screen 01.
In some examples, in order to distinguish the motion of the operation object within different preset time lengths, i.e. distinguish the gestures input by the user within different preset time lengths, the touch area of the touch pad of the notebook computer may include a plurality of touch sub-areas. The different touch sub-areas are used for detecting different gestures input by the user within different preset durations, that is, the user can input different gestures on the different touch sub-areas.
For example, referring to fig. 3 (a), the touch area of the touch pad 02 of the notebook computer includes two touch sub-areas, i.e., the touch sub-area 03 and the touch sub-area 04. The touch sub-area 04 may be configured to detect a gesture input by the user within the first preset duration, and the touch sub-area 03 may be configured to detect a gesture input by the user within another preset duration. For example, the user may slide down within the touch sub-area 04 with one finger; the notebook computer may determine that it has received the gesture input by the user in the touch sub-area 04 within the first preset duration, namely a single-finger downward slide. Thereafter, the user can slide down within the touch sub-area 03 with two fingers; the notebook computer may determine that it has received the gesture input by the user in the touch sub-area 03 within the other preset duration (for example, a second preset duration), namely a two-finger downward slide.
It can be understood that the touch sub-area 03 may also be configured to detect a gesture input by the user within a first preset duration, and correspondingly, the touch sub-area 04 may be configured to detect a gesture input by the user within another preset duration.
In other examples, in order to distinguish the motions of operation objects within different preset durations, i.e., to distinguish the gestures input by the user within different preset durations, operation objects whose touch points on the touch area of the touch pad are separated by a distance smaller than or equal to a distance threshold may be regarded as operation objects within the same preset duration, and the motion of those operation objects within that preset duration is regarded as one gesture input by the user within the same preset duration. The distance threshold may be set according to actual conditions and is not limited in this embodiment of the application.
For example, the user may slide down within the touch area of the touch pad of the notebook computer with two fingers. The notebook computer can detect whether the distance between the two touch points of the two fingers on the touch area is smaller than or equal to the distance threshold. When the distance between the two touch points is smaller than or equal to the distance threshold, the notebook computer may determine that the two-finger downward slide is a gesture input by the user within the same preset duration, that is, one gesture, namely one operation. When the distance between the two touch points is greater than the distance threshold, the notebook computer may determine that the two downward slides are gestures input by the user within two different preset durations rather than within the same preset duration, that is, two operations.
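The grouping by distance threshold could, for instance, be sketched as below; the normalised coordinates and the 0.15 threshold value are illustrative assumptions only.

import math
from typing import Dict, List, Tuple

DISTANCE_THRESHOLD = 0.15  # hypothetical threshold, in normalised touch-pad units

def group_by_distance(touch_points: Dict[int, Tuple[float, float]],
                      threshold: float = DISTANCE_THRESHOLD) -> List[List[int]]:
    """Group contacts whose touch points lie within the distance threshold.

    Each group is treated as one operation object, so its combined motion is
    interpreted as a single gesture; contacts farther apart than the threshold
    fall into separate groups and therefore separate operations.
    """
    groups: List[List[int]] = []
    for finger_id, (x, y) in touch_points.items():
        for group in groups:
            gx, gy = touch_points[group[0]]
            if math.hypot(x - gx, y - gy) <= threshold:
                group.append(finger_id)
                break
        else:
            groups.append([finger_id])
    return groups

if __name__ == "__main__":
    # Two fingers close together slide down: one gesture, one operation.
    print(group_by_distance({1: (0.30, 0.40), 2: (0.36, 0.42)}))   # [[1, 2]]
    # Two fingers far apart: two separate operations.
    print(group_by_distance({1: (0.10, 0.40), 2: (0.80, 0.45)}))   # [[1], [2]]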
S1002, the notebook computer responds to the first gesture and executes a first event.
Under the condition that the notebook computer receives a first gesture input by a user on the touch pad by using a first operation object within a first preset time length, the notebook computer can respond to the first gesture in a corresponding mode.
The notebook computer responds to the first gesture, and the first event may be opening content, selecting content, scrolling the displayed content, dragging content, rotating content, and the like, where the content may be an application, a folder, text, a photo, and the like. For example, when the first gesture is a click, such as a single click, the notebook computer responds to the first gesture, and executing the first event may be selecting content, such as a certain picture. When the first gesture is a click, such as a double click, the notebook computer responds to the first gesture, and executing the first event may be opening content, such as opening an application or a folder. When the first gesture is the user sliding on the touch pad with a finger, such as sliding downward, the notebook computer responds to the first gesture, and executing the first event may be selecting content, such as selecting a plurality of characters in a piece of text. When the first gesture is the user rotating on the touch pad with a plurality of fingers, the notebook computer responds to the first gesture, and executing the first event may be rotating content, such as rotating a certain picture. When the first gesture is the user moving a plurality of fingers toward or away from each other on the touch pad, the notebook computer responds to the first gesture, and executing the first event may be zooming out or zooming in on content, such as zooming out or zooming in on a certain picture.
For example, as shown in fig. 3 (a), take the first gesture as the user sliding toward the lower right with one finger. When the user wants to select one of the pictures displayed on the display screen 01, the user can slide toward the lower right on the touch sub-area 04 with one finger to select the corresponding picture, namely the picture 05. In response, the notebook computer may select the picture 05 and may highlight it, such as by displaying a selection box around the picture 05.
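One way to picture the mapping from a recognised first gesture to the first event described above is a simple lookup table, as in the following sketch; the gesture and event names are placeholders and do not come from the embodiment.

FIRST_EVENT_TABLE = {
    "single_click":        "select_content",
    "double_click":        "open_content",
    "one_finger_slide":    "select_text_range",
    "multi_finger_rotate": "rotate_content",
    "pinch_in":            "zoom_out_content",
    "pinch_out":           "zoom_in_content",
}

def first_event_for(gesture: str) -> str:
    """Return the event the notebook computer would execute for the first gesture."""
    try:
        return FIRST_EVENT_TABLE[gesture]
    except KeyError:
        raise ValueError(f"unrecognised gesture: {gesture!r}")

if __name__ == "__main__":
    print(first_event_for("one_finger_slide"))  # select_text_range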
S1003, the notebook computer receives a second gesture input on the touch pad by the user through the second operation object.
In some examples, the notebook computer may receive, within a second preset duration, a second gesture input by the user on the touch pad with the second operation object. The second preset duration (which may also be referred to as the second stage) may be a preset duration after the first preset duration; i.e., the starting time at which the user inputs the second gesture is later than the starting time at which the user inputs the first gesture. The second preset duration may be set according to the actual situation, which is not limited in this embodiment of the application, and it may be the same as or different from the first preset duration.
The notebook computer may detect the touch area of the touch pad to determine whether there is a motion of the operation object in the touch area of the touch pad within a second preset time period, where the operation object within the second preset time period may be referred to as a second operation object. In this way, when the user uses the operation object to input a corresponding gesture on the touch area of the touch pad of the notebook computer within the second preset time period, the notebook computer may detect the motion of the operation object in the touch area of the touch pad, that is, detect the gesture input by the user, where the gesture input by the user within the second preset time period may be referred to as a second gesture.
The second operation object may be one or more fingers of the user, a touch pen, or the like, which is not limited in this embodiment, and it may be the same as or different from the first operation object. In the embodiment of the present application, the description is schematically given by taking the case in which the second operation object is one or more fingers of the user as an example. For example, the second operation object may be a finger of the user's left hand or a finger of the user's right hand, which is not limited in this embodiment of the application.
The second operation object and the first operation object may be different fingers on one hand of the user, or fingers on the user's two hands. For example, the first operation object may be one or more fingers of the user's right hand and the second operation object may be one or more fingers of the user's left hand; conversely, the first operation object may be one or more fingers of the left hand and the second operation object may be one or more fingers of the right hand. For another example, the first operation object and the second operation object may both be one or more fingers of the user's right hand, or may both be one or more fingers of the user's left hand.
In some embodiments, to distinguish the first gesture input by the user within the first preset duration from the second gesture input by the user within the second preset duration, the interval between the second preset duration and the first preset duration may be greater than or equal to an interval time threshold; i.e., the interval between the starting time at which the user inputs the second gesture and the starting time at which the user inputs the first gesture is greater than or equal to the interval time threshold. The interval time threshold may be set according to actual conditions and is not limited in this embodiment of the application.
For example, suppose the first gesture input by the user within the first preset duration is a downward slide with one finger of the right hand, and the second gesture input by the user within the second preset duration is a downward slide with two fingers of the left hand. When the interval between the second preset duration and the first preset duration is greater than or equal to the interval time threshold, the notebook computer can determine that the single-finger downward slide of the right hand and the two-finger downward slide of the left hand are different gestures input by the user within two different preset durations, i.e., the first gesture and the second gesture are two different gestures, and the notebook computer can give two different responses. When the interval between the second preset duration and the first preset duration is smaller than the interval time threshold, the notebook computer may determine that the single-finger downward slide of the right hand and the two-finger downward slide of the left hand are motions of operation objects within the same stage, i.e., the first gesture and the second gesture are the same gesture, for example the user slides down with three fingers (one finger of the right hand and two fingers of the left hand), so that the notebook computer scrolls to display the content that is not displayed in the current interface. At this time, the notebook computer gives only one response.
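A minimal sketch of the interval-time check might look as follows, assuming a hypothetical 200 ms threshold.

INTERVAL_THRESHOLD_MS = 200  # hypothetical interval time threshold

def is_separate_second_gesture(first_start_ms: int, second_start_ms: int,
                               threshold_ms: int = INTERVAL_THRESHOLD_MS) -> bool:
    """Decide whether a later contact counts as a distinct second gesture.

    If the interval between the two start times reaches the threshold, the
    later motion is treated as a second gesture that gets its own response;
    otherwise the contacts are merged into a single gesture (e.g. a
    three-finger slide) with only one response.
    """
    return (second_start_ms - first_start_ms) >= threshold_ms

if __name__ == "__main__":
    print(is_separate_second_gesture(0, 350))  # True  -> two gestures, two responses
    print(is_separate_second_gesture(0, 50))   # False -> one combined gesture, one response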
In some embodiments, the second gesture and the first gesture may be gestures input by a user in different touch areas of a touch pad of a notebook computer.
Optionally, with reference to fig. 10A, as shown in fig. 10B, after S1003, the method for operating a touch pad provided in the embodiment of the present application may further include S1007.
S1007, the notebook computer determines whether the second gesture and the first gesture are gestures input by the user on different touch sub-regions of the touch pad.
In order to distinguish different gestures input by a user, a touch area of a touch pad of a notebook computer may include a plurality of touch sub-areas. Different touch sub-areas may be used to detect different gestures input by the user, i.e. the user may input different gestures on different touch sub-areas.
When the notebook computer determines that the second gesture and the first gesture are not gestures input by the user on different touch sub-areas of the touch pad, that is, the second gesture and the first gesture are gestures input by the user on the same touch sub-area of the touch pad, the notebook computer may determine that the second gesture and the first gesture are the same gesture, and the notebook computer may continue to execute S1002. In the case that the notebook computer determines that the second gesture and the first gesture are gestures input by the user on different touch sub-areas of the touch pad, the notebook computer may determine that the second gesture and the first gesture are different gestures input by the user, and the notebook computer may continue to execute S1004.
For example, with reference to fig. 3 (a), the touch area of the touch pad of the notebook computer includes two touch sub-areas, i.e., the touch sub-area 03 and the touch sub-area 04. The touch sub-area 04 may be configured to detect the first gesture input by the user within the first preset duration, and the touch sub-area 03 may be configured to detect the second gesture input by the user within the second preset duration. After the notebook computer detects, in the touch sub-area 04, the first gesture input by the user within the first preset duration, the notebook computer detects, in the touch sub-area 03, the second gesture input by the user within the second preset duration. In this case, the second gesture and the first gesture are determined to be gestures input by the user on different touch sub-areas of the touch pad; that is, the notebook computer may determine that the second gesture detected in the touch sub-area 03 is not the same gesture as the first gesture detected in the touch sub-area 04.
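The sub-area check of S1007 could be sketched as below, under the simplifying assumption that the touch area is split at the horizontal midline, with the touch sub-area 03 on the left half and the touch sub-area 04 on the right half; the actual layout in fig. 3 (a) may differ.

from typing import Tuple

def sub_area_of(point: Tuple[float, float]) -> str:
    # Hypothetical layout: left half -> sub-area 03, right half -> sub-area 04.
    x, _y = point
    return "sub_area_03" if x < 0.5 else "sub_area_04"

def gestures_on_different_sub_areas(first_point: Tuple[float, float],
                                    second_point: Tuple[float, float]) -> bool:
    """Step S1007: report whether the two gestures fall on different sub-areas.

    Different sub-areas -> treat them as two distinct gestures (continue to S1004);
    same sub-area -> treat the later motion as part of the first gesture (back to S1002).
    """
    return sub_area_of(first_point) != sub_area_of(second_point)

if __name__ == "__main__":
    print(gestures_on_different_sub_areas((0.75, 0.40), (0.20, 0.60)))  # True  -> two gestures
    print(gestures_on_different_sub_areas((0.75, 0.40), (0.80, 0.20)))  # False -> same gesture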
The second gesture may be a click (a single click or a double click), a touch (such as a single-finger tap), a slide (such as an up-down slide, a left-right slide, or a slide in another direction), a rotation (such as a clockwise or counterclockwise rotation), multiple operation objects moving toward or away from each other, and the like. It is understood that the second gesture may be the same as or different from the first gesture. The embodiment of the present application does not limit the specific type of the second gesture.
S1004, the notebook computer determines whether a second gesture input by the user on the touch pad by using a second operation object is received in the process of receiving the first gesture.
When the user needs to perform multiple operations (for example, two operations) on the content displayed on the notebook computer, the user can keep the motion of the first operation object (i.e., the first gesture), and at the same time, the user can input the motion of the second operation object (i.e., the second gesture) on the touch pad. In other words, in the case that the notebook computer determines that the second gesture input by the user on the touch pad is received, the notebook computer may determine whether the first gesture input by the user on the touch pad is still received, so as to determine whether the user needs to perform two operations on the content displayed on the notebook computer at the same time.
In a case that the notebook computer determines that the second gesture input by the user on the touch pad by using the second operation object is received in the process of receiving the first gesture, the notebook computer may continue to perform S1005. In a case that the notebook computer determines that the second gesture input by the user on the touch pad using the second operation object is not received in the process of receiving the first gesture, the notebook computer may continue to perform S1006.
S1005, the notebook computer continues to execute the first event and responds to the second gesture to execute a second event, wherein the second event is different from the first event.
In the process that the notebook computer determines to receive the first gesture, under the condition that a second gesture input by a user on the touch pad by using a second operation object is received, it can be determined that the user needs to perform two operations on the content displayed on the notebook computer at the same time, so that the notebook computer can respond to the two operations at the same time, namely respond to the first gesture and the second gesture at the same time.
The notebook computer responds to the second gesture, and the second event may be opening content, selecting content, dragging content, rotating content, enlarging content, reducing content, scrolling the displayed content, reducing the selection speed, or reducing the scrolling speed, and the like. The content may be an application, a file, a folder, text, a photo, or the like.
For example, when the second gesture is the user rotating with two fingers on the touch pad of the notebook computer, the notebook computer responds to the second gesture, and executing the second event may be rotating content, such as rotating a certain picture. When the second gesture is the user moving two fingers toward or away from each other on the touch pad, the notebook computer responds to the second gesture, and executing the second event may be zooming out or zooming in on content, such as zooming out or zooming in on a certain picture. When the second gesture is the user sliding with two fingers on the touch pad, for example sliding left and right, the notebook computer responds to the second gesture, and executing the second event may be switching to another desktop of the notebook computer, so that the content selected by the pointer can be dragged and moved to that other desktop. When the second gesture is the user sliding with three fingers on the touch pad, for example sliding left and right, the notebook computer responds to the second gesture, and executing the second event may be displaying the desktop of the notebook computer, so that the content selected by the pointer can be dragged and moved to another application. When the second gesture is the user sliding with two fingers on the touch pad, for example sliding up and down, the notebook computer responds to the second gesture, and executing the second event may be scrolling the content in the current interface. When the second gesture is the user touching the touch pad with one finger, such as a single-finger tap, the notebook computer responds to the second gesture, and executing the second event may be reducing the speed at which the pointer selects content, or reducing the speed at which the displayed content is scrolled.
In some examples, the first event is different from the second event, such that the laptop may respond to two different operations simultaneously, i.e., the first gesture and the second gesture.
For example, when the notebook computer responds to the first gesture and the first event is selecting content, the notebook computer may respond to the second gesture and the second event may be rotating the content selected by the first gesture; or zooming out or zooming in on the content selected by the first gesture; or switching to another desktop of the notebook computer, so that the content selected by the first gesture can be dragged and moved to that other desktop; or displaying the desktop of the notebook computer, where the desktop may include a plurality of applications, so that the content selected by the first gesture can be dragged and moved to another application; or scrolling the content in the current interface. When the notebook computer responds to the first gesture and the first event is scrolling the content in the current interface, the notebook computer responds to the second gesture and the second event is reducing the speed at which the content is scrolled.
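The way the two events combine, for example scrolling while a single-finger tap reduces the scrolling speed, could be pictured as follows; the line counts used for the normal and reduced scroll steps are illustrative assumptions.

DEFAULT_SCROLL_LINES = 5   # hypothetical lines scrolled per slide step
PRECISE_SCROLL_LINES = 1   # hypothetical reduced step while the modifier tap is held

def scroll_step(first_gesture_active: bool, modifier_tap_active: bool) -> int:
    """Return how many lines to scroll for the next step of the first event.

    While the two-finger slide (first gesture) continues, a single-finger tap
    (second gesture) acts as a modifier: the scroll keeps running, but at a
    reduced speed so the user can position the content precisely.
    """
    if not first_gesture_active:
        return 0
    return PRECISE_SCROLL_LINES if modifier_tap_active else DEFAULT_SCROLL_LINES

if __name__ == "__main__":
    print(scroll_step(True, False))  # 5 -> normal scrolling
    print(scroll_step(True, True))   # 1 -> scrolling continues, slower and more precise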
S1006, the notebook computer responds to the second gesture and executes a third event.
In the case that the notebook computer determines that the second gesture input by the user on the touch pad by using the second operation object is not received in the process of receiving the first gesture, the notebook computer may determine that the user only needs to perform one operation on the content displayed by the notebook computer, that is, the notebook computer may only respond to the second gesture and execute the third event.
The notebook computer responds to the second gesture, and the third event may be opening content, selecting content, scrolling the displayed content, dragging content, or rotating content, where the content may be an application, a folder, text, or a photo.
In the case that the notebook computer determines that a second gesture input by the user on the touch pad by using the second operation object is not received in the process of receiving the first gesture, the notebook computer may respond to the second gesture. At this time, the second gesture may also be referred to as the first gesture. Then, the notebook computer may continue to receive other gestures input by the user on the touch pad by using the operation object within other preset time periods, and determine whether to receive other gestures input by the user on the touch pad by using the operation object in the process of receiving the second gesture. In the case that the notebook computer determines that other gestures input by the user on the touch pad using the operation object are received in the process of receiving the second gesture, the notebook computer may respond to the second gesture and the other gestures input by the user at the same time, that is, the notebook computer may continue to perform the above-mentioned S1003-S1006.
By adopting the scheme of the application, the electronic equipment can determine whether a second gesture input by a user on the touch pad by using a second operation object is received in the process of receiving the first gesture, and can continue the response of the first gesture and respond to the second gesture when the electronic equipment receives the second gesture input by the user on the touch pad by using the second operation object in the process of receiving the first gesture. That is, when a user inputs a plurality of gestures on the touch pad of the electronic device, the electronic device can respond to the plurality of gestures at the same time, so that the process of using the touch pad by the user is simplified, and the efficiency of using the touch pad by the user is improved.
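Putting the pieces together, a toy version of the S1001-S1006 decision flow might look like the following sketch; the gesture names, event names, and the returned structure are all illustrative.

from typing import Dict, Optional

def handle_touchpad(first_gesture: str,
                    second_gesture: Optional[str],
                    first_still_active: bool,
                    event_table: Dict[str, str]) -> dict:
    """Sketch of the S1001-S1006 decision flow for one pair of gestures."""
    # S1001-S1002: the first gesture is received and the first event executed.
    result = {"first_event": event_table.get(first_gesture, "ignore"),
              "first_event_continues": False,
              "second_response": None}

    if second_gesture is None:
        return result

    # S1004: was the second gesture received while the first gesture was still held?
    if first_still_active:
        # S1005: keep executing the first event and, at the same time,
        # respond to the second gesture with a different second event.
        result["first_event_continues"] = True
        result["second_response"] = event_table.get(second_gesture, "ignore")
    else:
        # S1006: the first gesture has ended, so only the second gesture gets
        # a response (a third event); it can then be treated as a new first gesture.
        result["second_response"] = event_table.get(second_gesture, "ignore")
    return result

if __name__ == "__main__":
    table = {"two_finger_slide_down": "scroll_content",
             "single_finger_tap": "reduce_scroll_speed"}
    # The user keeps the two-finger slide going and taps with another finger.
    print(handle_touchpad("two_finger_slide_down", "single_finger_tap", True, table))
    # The user lifted the two fingers before tapping.
    print(handle_touchpad("two_finger_slide_down", "single_finger_tap", False, table))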
Corresponding to the method in the foregoing embodiment, an embodiment of the present application further provides a touch pad operating device. The touch pad control device can be applied to electronic equipment and used for realizing the method in the embodiment. The functions of the device can be realized by hardware, and can also be realized by executing corresponding software by hardware. The hardware or software includes one or more modules corresponding to the above-described functions.
For example, fig. 11 shows a schematic structural diagram of a touch pad manipulation device 1100, and as shown in fig. 11, the touch pad manipulation device 1100 may include: a receiving module 1101, an executing module 1102, and the like.
The receiving module 1101 may be configured to receive a first gesture input by a user on a touch pad of the electronic device by using a first operation object.
The execution module 1102 may be configured to execute a first event in response to a first gesture.
The receiving module 1101 may be further configured to receive a second gesture input by the user on the touch pad by using a second operation object in the process of receiving the first gesture, where a starting time of inputting the second gesture is later than a starting time of inputting the first gesture.
The executing module 1102 may be further configured to continue executing the first event and execute a second event in response to the second gesture, where the second event is different from the first event.
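A toy rendering of this module structure in Python might look as follows; the class and method names are illustrative and are shown only to mirror the receiving module 1101 and the execution module 1102 of fig. 11.

class ReceivingModule:
    """Illustrative counterpart of the receiving module 1101 in fig. 11."""

    def __init__(self):
        self._gestures = []

    def receive(self, gesture: str, start_ms: int) -> None:
        # Record each gesture with its start time so later logic can tell
        # the first gesture from the second one.
        self._gestures.append((gesture, start_ms))

    @property
    def gestures(self):
        return list(self._gestures)


class ExecutionModule:
    """Illustrative counterpart of the execution module 1102 in fig. 11."""

    def execute(self, event: str) -> None:
        print(f"executing event: {event}")


class TouchPadManipulationDevice:
    """Toy composition of the two modules from the device embodiment."""

    def __init__(self):
        self.receiving = ReceivingModule()
        self.execution = ExecutionModule()


if __name__ == "__main__":
    device = TouchPadManipulationDevice()
    device.receiving.receive("one_finger_slide", start_ms=0)
    device.execution.execute("select_content")
    device.receiving.receive("single_finger_tap", start_ms=400)
    device.execution.execute("reduce_selection_speed")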
In another possible implementation manner, an interval time between the starting time of inputting the second gesture and the starting time of inputting the first gesture is greater than or equal to an interval time threshold.
In another possible implementation manner, the first gesture and the second gesture are one of the following gestures: click, touch, slide, rotate, multi-finger pinch, and multi-finger separation.
In another possible implementation manner, the first event is selecting content displayed in the current interface, and the second event is rotating the selected content, zooming out or enlarging the selected content, displaying a desktop of the electronic device and displaying the selected content on the desktop, switching the desktop of the electronic device and displaying the selected content on the switched desktop, adjusting the speed of the selected content in the current interface, or scrolling the content in the current interface; or, the first event is to scroll and display the content in the current interface, and the second event is to adjust the speed of scrolling and displaying the content.
In another possible implementation manner, the electronic device includes an input device with a touch pad, or is connected with the input device with the touch pad in a wireless or wired manner; the first operation object and the second operation object are fingers or touch pens.
In another possible implementation manner, the second gesture and the first gesture are gestures input by the user on different touch sub-areas of the touch pad.
It should be understood that the division of units or modules (hereinafter referred to as units) in the above apparatus is only a division of logical functions, and may be wholly or partially integrated into one physical entity or physically separated in actual implementation. And the units in the device can be realized in the form of software called by the processing element; or may be implemented entirely in hardware; part of the units can also be realized in the form of software called by a processing element, and part of the units can be realized in the form of hardware.
For example, each unit may be a processing element separately set up, or may be implemented by being integrated into a chip of the apparatus, or may be stored in a memory in the form of a program, and a function of the unit may be called and executed by a processing element of the apparatus. In addition, all or part of the units can be integrated together or can be independently realized. The processing element described herein, which may also be referred to as a processor, may be an integrated circuit having signal processing capabilities. In the implementation process, the steps of the method or the units above may be implemented by integrated logic circuits of hardware in a processor element or in a form called by software through the processor element.
In one example, the units in the above apparatus may be one or more integrated circuits configured to implement the above method, such as: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of at least two of these integrated circuit forms.
As another example, when a unit in an apparatus may be implemented in the form of a processing element scheduler, the processing element may be a general-purpose processor, such as a CPU or other processor that may invoke a program. As another example, these units may be integrated together and implemented in the form of a system-on-a-chip (SOC).
In one implementation, the unit of the above apparatus for implementing each corresponding step in the above method may be implemented in the form of a processing element scheduler. For example, the apparatus may include a processing element and a memory element, the processing element calling a program stored by the memory element to perform the method described in the above method embodiments. The memory elements may be memory elements on the same chip as the processing elements, i.e. on-chip memory elements.
In another implementation, the program for performing the above method may be in a memory element on a different chip than the processing element, i.e. an off-chip memory element. At this time, the processing element calls or loads a program from the off-chip storage element onto the on-chip storage element to call and execute the method described in the above method embodiment.
For example, the embodiments of the present application may also provide an apparatus, such as: an electronic device may include: a processor, a memory for storing instructions executable by the processor. The processor is configured to execute the above instructions, so that the electronic device implements the touch pad manipulation method according to the foregoing embodiments. The memory may be located within the electronic device or external to the electronic device. And the processor includes one or more.
In yet another implementation, the unit of the apparatus for implementing the steps of the method may be configured as one or more processing elements, and these processing elements may be disposed on the electronic device corresponding to the foregoing, where the processing elements may be integrated circuits, for example: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of these types of integrated circuits. These integrated circuits may be integrated together to form a chip.
For example, the embodiment of the present application also provides a chip, and the chip can be applied to the electronic device. The chip includes one or more interface circuits and one or more processors; the interface circuit and the processor are interconnected through a line; the processor receives and executes computer instructions from the memory of the electronic device through the interface circuitry to implement the methods described in the method embodiments above.
Embodiments of the present application further provide a computer program product, which includes computer instructions executed by the electronic device as described above.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied in the form of a software product, such as a program. The software product is stored in a program product, such as a computer-readable storage medium, and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disk, and various other media capable of storing program code.
For example, embodiments of the present application may also provide a computer-readable storage medium having computer program instructions stored thereon. The computer program instructions, when executed by the electronic device, cause the electronic device to implement the touchpad manipulation method as described in the aforementioned method embodiments.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the present application should be covered within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A touch pad control method is applied to an electronic device, and the method comprises the following steps:
the electronic equipment receives a first gesture input by a user on a touch pad of the electronic equipment by using a first operation object; the first operation object is an operation object, within a first preset time length, of which the distance between touch points on a touch area of the touch pad is smaller than or equal to a distance threshold value;
the electronic equipment responds to the first gesture and executes a first event;
in the process of receiving the first gesture, the electronic equipment receives a second gesture input by a user on the touch pad by using a second operation object, wherein the starting time of inputting the second gesture is later than the starting time of inputting the first gesture; the second operation object is an operation object of which the distance between touch points on the touch area of the touch pad is smaller than or equal to a distance threshold value within a second preset time length;
and the electronic equipment continues to execute the first event and responds to the second gesture to execute a second event, wherein the second event is different from the first event.
2. The method of claim 1, wherein an interval time between the starting time of the inputting of the second gesture and the starting time of the inputting of the first gesture is greater than or equal to an interval time threshold.
3. The method of claim 1, wherein the first gesture and the second gesture are one of: click, touch, slide, rotate, multi-finger pinch, and multi-finger separation.
4. The method of claim 1,
the first event is the selection of the content displayed in the current interface, and the second event is the rotation of the selected content, the reduction or the enlargement of the selected content, the display of the desktop of the electronic equipment and the display of the selected content on the desktop, the switching of the desktop of the electronic equipment and the display of the selected content on the switched desktop, the adjustment of the speed of the selected content in the current interface, or the scrolling of the content in the current interface;
or,
the first event is that the content in the current interface is displayed in a scrolling mode, and the second event is that the speed of the content displayed in the scrolling mode is adjusted.
5. The method of claim 1,
the electronic equipment comprises input equipment with the touch pad, or is connected with the input equipment with the touch pad in a wireless or wired mode;
the first operation object and the second operation object are fingers or touch pens.
6. The method of any of claims 1-5, wherein the second gesture and the first gesture are gestures input by a user on different touch sub-areas of the touch pad.
7. An electronic device, comprising a processor, a memory for storing processor-executable instructions; the processor is configured to execute the instructions to cause the electronic device to:
receiving a first gesture input by a user on a touch pad of the electronic equipment by using a first operation object; the first operation object is an operation object, within a first preset time length, of which the distance between touch points on a touch area of the touch pad is smaller than or equal to a distance threshold value;
executing a first event in response to the first gesture;
in the process of receiving the first gesture, receiving a second gesture input by a user on the touch pad by using a second operation object, wherein the starting time of inputting the second gesture is later than that of inputting the first gesture; the second operation object is an operation object in which the distance between touch points on the touch area of the touch pad is smaller than or equal to a distance threshold value within a second preset time period; the second preset time is a preset time after the first preset time;
and continuing to execute the first event, and responding to the second gesture, and executing a second event, wherein the second event is different from the first event.
8. The electronic device of claim 7, wherein a time interval between the time of the start of the input of the second gesture and the time of the start of the input of the first gesture is greater than or equal to a time interval threshold.
9. The electronic device of claim 7, wherein the first gesture and the second gesture are one of: click, touch, slide, rotate, multi-finger pinch, and multi-finger separation.
10. The electronic device of claim 7,
the first event is the selection of the content displayed in the current interface, and the second event is the rotation of the selected content, the reduction or the enlargement of the selected content, the display of the desktop of the electronic equipment and the display of the selected content on the desktop, the switching of the desktop of the electronic equipment and the display of the selected content on the switched desktop, the adjustment of the speed of the selected content in the current interface, or the scrolling display of the content in the current interface;
or,
the first event is that the content in the current interface is displayed in a scrolling mode, and the second event is that the speed of the content displayed in the scrolling mode is adjusted.
11. The electronic device of claim 7,
the electronic equipment comprises input equipment with the touch pad, or is connected with the input equipment with the touch pad in a wireless or wired mode;
the first operation object and the second operation object are fingers or touch pens.
12. The electronic device of any of claims 7-11, wherein the second gesture and the first gesture are gestures input by a user on different touch sub-regions of the touch pad.
13. A computer readable storage medium having stored thereon computer program instructions; characterized in that,
the computer program instructions, when executed by an electronic device, cause the electronic device to implement the method of any of claims 1 to 6.
CN202111062574.8A 2021-09-10 2021-09-10 Touch control panel control method and electronic equipment Active CN113946208B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111062574.8A CN113946208B (en) 2021-09-10 2021-09-10 Touch control panel control method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111062574.8A CN113946208B (en) 2021-09-10 2021-09-10 Touch control panel control method and electronic equipment

Publications (2)

Publication Number Publication Date
CN113946208A CN113946208A (en) 2022-01-18
CN113946208B true CN113946208B (en) 2023-03-28

Family

ID=79328046

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111062574.8A Active CN113946208B (en) 2021-09-10 2021-09-10 Touch control panel control method and electronic equipment

Country Status (1)

Country Link
CN (1) CN113946208B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102968277A (en) * 2012-11-30 2013-03-13 北京小米科技有限责任公司 Method and device for deleting or cutting file based on touch screen
WO2014065845A1 (en) * 2012-10-25 2014-05-01 Blackberry Limited Electronic device and method of displaying information in response to detecting a gesture
WO2015192085A2 (en) * 2014-06-12 2015-12-17 Apple Inc. Systems and methods for multitasking on an electronic device with a touch-sensitive display
CN105359083A (en) * 2013-04-15 2016-02-24 微软技术许可有限责任公司 Dynamic management of edge inputs by users on a touch device
US10409412B1 (en) * 2015-09-30 2019-09-10 Apple Inc. Multi-input element for electronic device

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7834855B2 (en) * 2004-08-25 2010-11-16 Apple Inc. Wide touchpad on a portable computer
WO2011130919A1 (en) * 2010-04-23 2011-10-27 Motorola Mobility, Inc. Electronic device and method using touch-detecting surface
US20120026077A1 (en) * 2010-07-28 2012-02-02 Google Inc. Mapping trackpad operations to touchscreen events
TW201327273A (en) * 2011-12-23 2013-07-01 Wistron Corp Touch keypad module and mode switching method thereof
TWI470475B (en) * 2012-04-17 2015-01-21 Pixart Imaging Inc Electronic system
US20140168097A1 (en) * 2012-12-17 2014-06-19 Motorola Mobility Llc Multi-touch gesture for movement of media
KR102015347B1 (en) * 2013-01-07 2019-08-28 삼성전자 주식회사 Method and apparatus for providing mouse function using touch device
KR20160032611A (en) * 2014-09-16 2016-03-24 삼성전자주식회사 Method and apparatus for controlling an electronic device using a touch input
US11392290B2 (en) * 2020-06-26 2022-07-19 Intel Corporation Touch control surfaces for electronic user devices and related methods

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014065845A1 (en) * 2012-10-25 2014-05-01 Blackberry Limited Electronic device and method of displaying information in response to detecting a gesture
CN102968277A (en) * 2012-11-30 2013-03-13 北京小米科技有限责任公司 Method and device for deleting or cutting file based on touch screen
CN105359083A (en) * 2013-04-15 2016-02-24 微软技术许可有限责任公司 Dynamic management of edge inputs by users on a touch device
WO2015192085A2 (en) * 2014-06-12 2015-12-17 Apple Inc. Systems and methods for multitasking on an electronic device with a touch-sensitive display
US10409412B1 (en) * 2015-09-30 2019-09-10 Apple Inc. Multi-input element for electronic device

Also Published As

Publication number Publication date
CN113946208A (en) 2022-01-18

Similar Documents

Publication Publication Date Title
JP6816858B2 (en) How to control the display of multiple objects by operation-related input to the mobile terminal and the mobile terminal
US11112956B2 (en) Device, method, and graphical user interface for switching between camera interfaces
EP3686723B1 (en) User terminal device providing user interaction and method therefor
CN108900770B (en) Method and device for controlling rotation of camera, smart watch and mobile terminal
US9645699B2 (en) Device, method, and graphical user interface for adjusting partially off-screen windows
US9058095B2 (en) Method for displaying data in mobile terminal having touch screen and mobile terminal thereof
US8347238B2 (en) Device, method, and graphical user interface for managing user interface content and user interface elements by dynamic snapping of user interface elements to alignment guides
EP3547218B1 (en) File processing device and method, and graphical user interface
US20080129759A1 (en) Method for processing image for mobile communication terminal
US9360923B2 (en) System and method for managing display power consumption
JP2012048725A (en) Portable electronic device having multi-touch input
CN110569094B (en) Display method and electronic equipment
CN111147660B (en) Control operation method and electronic equipment
WO2017101445A1 (en) Method for responding to operation track and operation track response apparatus
US20130179845A1 (en) Method and apparatus for displaying keypad in terminal having touch screen
JP7446441B2 (en) Image cropping method and electronic equipment
WO2015047602A1 (en) System and method for capturing images
CN111459363A (en) Information display method, device, equipment and storage medium
WO2019047129A1 (en) Method for moving application icons, and terminal
CN112230910B (en) Page generation method, device and equipment of embedded program and storage medium
CN110266875B (en) Prompt message display method and electronic equipment
CN113946208B (en) Touch control panel control method and electronic equipment
US10958815B1 (en) Folded flex circuit board for camera ESD protection
JP6100657B2 (en) Electronics
CN114461312B (en) Display method, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant