CN113821113B - Interaction method and system between an electronic device and a stylus, and electronic device - Google Patents

Interaction method and system between an electronic device and a stylus, and electronic device

Info

Publication number
CN113821113B
Authority
CN
China
Prior art keywords
screen
stylus
electronic device
electrode
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111382321.9A
Other languages
Chinese (zh)
Other versions
CN113821113A (en)
Inventor
李江涛 (Li Jiangtao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Glory Smart Technology Development Co., Ltd.
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202111382321.9A
Publication of CN113821113A
Application granted
Publication of CN113821113B
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545 Pens or stylus
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044 Digitisers characterised by capacitive transducing means
    • G06F3/0441 Digitisers by capacitive means using active external devices, e.g. active pens, for receiving changes in electrical potential transmitted by the digitiser, e.g. tablet driving signals
    • G06F3/0442 Digitisers by capacitive means using active external devices, e.g. active pens, for transmitting changes in electrical potential to be received by the digitiser
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The embodiment of the present application provides an interaction method and system between an electronic device and a stylus, and an electronic device. When the touch mode is a first touch mode, the electronic device performs the handwriting display operation and does not respond to the screen edge gesture even if the operation of the stylus is the same as the screen edge gesture. When the touch mode is a second touch mode, if the operation of the stylus is the same as the screen edge gesture, the electronic device performs the operation in response to the screen edge gesture. In this way, false triggering of the screen edge gesture by the stylus can be avoided, and the drawing experience of the user is improved.

Description

Interaction method and system between an electronic device and a stylus, and electronic device
Technical Field
The embodiments of the present application relate to communication technologies, and in particular, to an interaction method and system between an electronic device and a stylus, and an electronic device.
Background
With the development of touch technology, more and more electronic devices use touch for human-computer interaction. For example, a user can provide input to an electronic device by operating its touch screen with a stylus, and the electronic device performs a corresponding operation based on the stylus input.
At present, when a user writes or draws on an electronic device with a stylus, an operation of the stylus at the edge of the touch screen is prone to falsely trigger a screen edge gesture, which causes the electronic device to perform operations such as displaying multiple windows or exiting an application program.
Disclosure of Invention
The embodiments of the present application provide an interaction method and system between an electronic device and a stylus, and an electronic device, which can prevent the stylus from falsely triggering a screen edge gesture and thereby improve the drawing experience of the user.
In a first aspect, an embodiment of the present application provides an interaction method between an electronic device and a stylus. The method may be executed by the electronic device or by a chip in the electronic device; the following description takes the electronic device as the execution subject as an example. According to the method, when a user draws handwriting on the screen of the electronic device with a stylus, the electronic device can acquire the position of the stylus on the screen of the electronic device, and when the electronic device detects that the position of the stylus on the screen is in a preset area at the edge of the screen, it detects the touch mode of the stylus. The electronic device can perform different operations in response to different touch modes.
In response to the touch mode being a first touch mode, the electronic device displays the handwriting of the stylus. In response to the touch mode being a second touch mode, if the operation of the stylus on the screen is the same as a first screen edge gesture, the electronic device performs an operation in response to the first screen edge gesture.
In the embodiment of the present application, the user can draw on the screen in different touch modes, and when the position of the stylus on the screen of the electronic device is in the preset area at the edge of the screen, the electronic device can perform different operations in response to the touch mode of the stylus. If the touch mode is the first touch mode, the electronic device performs the handwriting display operation, and even if the operation of the stylus is the same as a screen edge gesture, the electronic device does not respond to the screen edge gesture. If the touch mode is the second touch mode and the operation of the stylus is the same as a screen edge gesture, the electronic device performs the operation in response to the screen edge gesture. In this way, false triggering of the screen edge gesture by the stylus can be avoided, and the drawing experience of the user is improved.
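For illustration only, the following is a minimal Python sketch of the decision flow described above. All names (TouchMode, handle_stylus_event, and the string results) are hypothetical and are not part of this application; the sketch only shows how the detected touch mode could gate the response to a screen edge gesture.

```python
from enum import Enum, auto

class TouchMode(Enum):
    FIRST = auto()   # the pen tip contacts the screen
    SECOND = auto()  # the pen tail contacts the screen

def handle_stylus_event(touch_mode, in_edge_preset_area, matches_edge_gesture):
    """Decide how to respond to a stylus operation near the screen edge.

    touch_mode           -- detected TouchMode of the stylus
    in_edge_preset_area  -- True if the stylus position is in a preset edge area
    matches_edge_gesture -- True if the operation matches a screen edge gesture
    """
    if not in_edge_preset_area:
        return "display_handwriting"      # normal drawing outside edge areas
    if touch_mode is TouchMode.FIRST:
        # Tip contact: keep drawing; never respond to the edge gesture.
        return "display_handwriting"
    if touch_mode is TouchMode.SECOND and matches_edge_gesture:
        # Tail contact: perform the operation in response to the edge gesture.
        return "respond_to_screen_edge_gesture"
    return "display_handwriting"
```

For example, handle_stylus_event(TouchMode.FIRST, True, True) returns "display_handwriting", while handle_stylus_event(TouchMode.SECOND, True, True) returns "respond_to_screen_edge_gesture".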
The preset area at the edge of the screen may be an area corresponding to any screen edge gesture. For example, if the screen edge gesture is a gesture for triggering the electronic device to exit an application, the corresponding area is an area at the bottom edge of the screen. Different screen edge gestures correspond to different areas. In a possible implementation, the first screen edge gesture in the embodiment of the present application is any one of the following: a gesture for triggering the electronic device to exit an application, a gesture for triggering the electronic device to display a pull-down notification bar, a gesture for triggering the electronic device to display a pull-up control center, a gesture for triggering the electronic device to display multiple windows, or a gesture for triggering the electronic device to display a multi-task switching interface. A hypothetical mapping of edge areas to gestures is sketched below.
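The mapping below is a hypothetical sketch of how preset edge areas might correspond to screen edge gestures; the 40-pixel edge width and the gesture names are assumptions made for illustration and are not specified in this application.

```python
EDGE_WIDTH = 40  # pixels; hypothetical width of the preset edge areas

def edge_gesture_for(x, y, screen_w, screen_h):
    """Return a hypothetical screen edge gesture for a touch position,
    or None if the position is not in any preset edge area."""
    if y >= screen_h - EDGE_WIDTH:
        return "exit_application"            # bottom edge area
    if y <= EDGE_WIDTH:
        return "pull_down_notification_bar"  # top edge area
    if x <= EDGE_WIDTH or x >= screen_w - EDGE_WIDTH:
        return "display_multi_window"        # left and right edge areas
    return None
```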
In one possible implementation, different touch modes of the stylus correspond to different parts of the stylus touching the screen. That is, when the user touches the screen with different parts of the stylus, the touch mode of the stylus is different.
To follow the user's writing habits, the first touch mode is that the tip of the stylus contacts the screen, and the second touch mode is that the tail of the stylus contacts the screen. That is, the user may touch the screen with the tip of the stylus, and in the preset area at the edge of the screen, even if the operation of the stylus is the same as a screen edge gesture, the electronic device does not respond to the screen edge gesture and continues to display the handwriting of the stylus. The user may also contact the screen with the tail of the stylus; in a non-preset area of the screen edge, the electronic device displays the handwriting of the stylus, while in a preset area of the screen edge, when the operation of the stylus is the same as a screen edge gesture, the electronic device performs the operation in response to that screen edge gesture.
The following describes manners in which the electronic device may detect the touch mode of the stylus:
First, the frequency of the first electrode is different from that of the second electrode. The electronic device may detect the touch mode of the stylus based on the frequency of the signal from the electrode. Illustratively, the frequency of the first electrode is a first frequency and the frequency of the second electrode is a second frequency. If the frequency of the signal from the electrode is the first frequency, the electronic device may determine that the touch mode of the stylus is the second touch mode. Similarly, if the frequency of the signal from the electrode is the second frequency, the electronic device may determine that the touch mode of the stylus is the first touch mode.
Second, a first electrode is disposed at the tail of the stylus, a second electrode is disposed at the tip of the stylus, and the frequencies of the first electrode and the second electrode are the same. Since the contact areas of the tip and the tail of the stylus are different, the area over which the signal from the tip electrode changes the capacitance value of the screen differs from the area over which the signal from the tail electrode changes the capacitance value of the screen.
When the tip contacts the screen, the area over which the capacitance value of the screen changes falls within a first area range; when the tail contacts the screen, the area falls within a second area range. If the area over which the signal from the electrode changes the capacitance value of the screen is within the first area range, the electronic device may determine that the touch mode is the first touch mode. If that area is within the second area range, the electronic device may determine that the touch mode is the second touch mode.
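A minimal sketch of this second manner, assuming the touch controller reports the area (for example in square millimetres) over which the capacitance change exceeds a detection threshold. The two area ranges are hypothetical values chosen only to illustrate the comparison.

```python
FIRST_AREA_RANGE = (0.0, 8.0)    # mm^2, hypothetical range for tip contact
SECOND_AREA_RANGE = (8.0, 40.0)  # mm^2, hypothetical range for tail contact

def touch_mode_from_area(changed_area_mm2):
    """Classify the touch mode from the area of the capacitance change."""
    low, high = FIRST_AREA_RANGE
    if low <= changed_area_mm2 < high:
        return "first_touch_mode"    # the pen tip contacts the screen
    low, high = SECOND_AREA_RANGE
    if low <= changed_area_mm2 < high:
        return "second_touch_mode"   # the pen tail contacts the screen
    return "unknown"
```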
Third, a first electrode is disposed at the tail of the stylus, a second electrode is disposed at the tip of the stylus, and the frequencies of the first electrode and the second electrode are the same. In addition, the tip is further provided with a pressure sensor. When the tip contacts the screen, the stylus sends the pressure data collected by the pressure sensor to the electronic device through the wireless connection between the stylus and the electronic device.
In this manner, the electronic device can detect the touch mode of the stylus according to the area over which the signal from the electrode changes the capacitance value of the screen and according to whether pressure data is received from the stylus. When the tip contacts the screen, the area over which the capacitance value of the screen changes falls within the first area range; when the tail contacts the screen, the area falls within the second area range.
If the area over which the signal from the electrode changes the capacitance value of the screen is within the first area range and the electronic device receives pressure data from the stylus, the electronic device may determine that the touch mode is the first touch mode. If that area is within the second area range and the electronic device does not receive pressure data from the stylus, the electronic device may determine that the touch mode is the second touch mode.
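A minimal sketch of this third manner, reusing the hypothetical area ranges from the previous sketch and combining them with whether pressure data has been received from the stylus over the wireless connection.

```python
def touch_mode_from_area_and_pressure(changed_area_mm2, pressure_data_received):
    """Classify the touch mode from the contact area and the pressure report."""
    in_first_range = FIRST_AREA_RANGE[0] <= changed_area_mm2 < FIRST_AREA_RANGE[1]
    in_second_range = SECOND_AREA_RANGE[0] <= changed_area_mm2 < SECOND_AREA_RANGE[1]
    if in_first_range and pressure_data_received:
        return "first_touch_mode"    # tip contact: the tip pressure sensor reports data
    if in_second_range and not pressure_data_received:
        return "second_touch_mode"   # tail contact: no pressure data is reported
    return "unknown"
```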
In a second aspect, an embodiment of the present application provides an electronic device, which may include: a processor, a memory. The memory is for storing computer executable program code, the program code comprising instructions; the instructions, when executed by the processor, cause the electronic device to perform the method as in the first aspect.
In a third aspect, an embodiment of the present application provides a stylus, which may include a first electrode and a second electrode, where the first electrode and the second electrode are disposed at different parts of the stylus.
In one possible implementation, the first electrode is disposed at a tip of the stylus pen, and the second electrode is disposed at a tail of the stylus pen.
In a fourth aspect, an embodiment of the present application provides a touch interaction system, where the system includes the electronic device as described in the second aspect above and a stylus as described in the third aspect above, and the touch interaction system may implement the interaction method between the electronic device and the stylus as described in the first aspect above.
In a fifth aspect, embodiments of the present application provide a computer program product containing instructions, which when run on a computer, cause the computer to perform the method of the first aspect.
In a sixth aspect, embodiments of the present application provide a computer-readable storage medium, which stores instructions that, when executed on a computer, cause the computer to perform the method in the first aspect.
For the beneficial effects of each possible implementation of the second aspect to the sixth aspect, reference may be made to the beneficial effects of the first aspect, and details are not repeated here.
The embodiment of the present application provides an interaction method and system between an electronic device and a stylus, and an electronic device. When the touch mode is a first touch mode, the electronic device performs the handwriting display operation and does not respond to the screen edge gesture even if the operation of the stylus is the same as the screen edge gesture. When the touch mode is a second touch mode, if the operation of the stylus is the same as the screen edge gesture, the electronic device performs the operation in response to the screen edge gesture. In this way, false triggering of the screen edge gesture by the stylus can be avoided, and the drawing experience of the user is improved.
Drawings
Fig. 1 is a schematic view of a scenario provided in an embodiment of the present application;
fig. 2A is a schematic structural diagram of a stylus pen according to an embodiment of the present disclosure;
fig. 2B is a schematic diagram of a partially disassembled structure of a stylus provided in the embodiment of the present application;
FIG. 3 is a schematic diagram of interaction between a stylus and an electronic device according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a hardware structure of a stylus according to an embodiment of the present disclosure;
fig. 5 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
FIG. 6A is a diagram illustrating a user drawing on a screen of an electronic device using a stylus;
FIG. 6B is another schematic diagram of a user drawing on a screen of an electronic device using a stylus;
FIG. 6C is another schematic diagram of a user drawing on a screen of an electronic device using a stylus;
FIG. 6D is another diagram illustrating a user drawing on a screen of an electronic device using a stylus;
fig. 7 is another schematic structural diagram of a stylus pen according to an embodiment of the present disclosure;
fig. 8 is a flowchart illustrating an embodiment of an interaction method between an electronic device and a stylus according to an embodiment of the present disclosure;
FIG. 9 is a schematic interface diagram illustrating interaction between an electronic device and a stylus according to an embodiment of the present disclosure;
FIG. 10 is a schematic diagram of another interface for interaction between an electronic device and a stylus, according to an embodiment of the present disclosure;
FIG. 11 is a schematic diagram of another interface for interaction between an electronic device and a stylus according to an embodiment of the present disclosure;
FIG. 12 is a schematic diagram of another interface for interaction between an electronic device and a stylus, according to an embodiment of the present disclosure;
fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Fig. 1 is a schematic view of a scene applicable to the embodiment of the present application. Referring to fig. 1, the scene includes a stylus (stylus) 100 and an electronic device 200, and the electronic device 200 is illustrated as a tablet computer (PAD) in fig. 1. The stylus 100 may provide an input to the electronic apparatus 200, and the electronic apparatus 200 performs an operation in response to the input based on the input of the stylus 100. In one embodiment, a wireless keyboard 300 may also be included in the scenario. The wireless keyboard 300 may also provide input to the electronic device 200, and the electronic device 200 performs an operation in response to the input based on the input of the wireless keyboard 300.
In one embodiment, a touch area may be disposed on wireless keyboard 300, stylus 100 may operate the touch area of wireless keyboard 300 to provide input to wireless keyboard 300, and wireless keyboard 300 may perform an operation in response to the input based on the input of stylus 100. In one embodiment, the stylus 100 and the electronic device 200, the stylus 100 and the wireless keyboard 300, and the electronic device 200 and the wireless keyboard 300 may be interconnected through a communication network to realize wireless signal interaction. The communication network may be, but is not limited to: a close-range communication network such as a WI-FI hotspot network, a WI-FI peer-to-peer (P2P) network, a bluetooth network, a zigbee network, or a Near Field Communication (NFC) network. The following embodiments mainly describe the interaction process between the stylus 100 and the electronic device 200.
The stylus 100 in the embodiment of the present application is an active capacitive stylus, which may also be referred to as an active capacitive pen or an active pen.
Fig. 2A is a schematic structural diagram of a stylus pen according to an embodiment of the present disclosure. Referring to fig. 2A, the stylus 100 may include a pen tip 10, a pen barrel 20, and a rear cover 30. The inside of the barrel 20 is a hollow structure, the pen tip 10 and the rear cover 30 are respectively located at the two ends of the barrel 20, and the rear cover 30 may be plugged into or snap-fitted onto the barrel 20. The fit between the pen tip 10 and the barrel 20 is described in detail in fig. 2B.
Fig. 2B is a schematic diagram of a partially disassembled structure of a stylus provided in the embodiment of the present application. Referring to fig. 2B, the stylus 100 further includes a spindle assembly 50, the spindle assembly 50 is located in the barrel 20, and the spindle assembly 50 is slidably disposed in the barrel 20. The spindle assembly 50 has an external thread 51 thereon, and the nib 10 includes a writing end 11 and a connecting end 12, wherein the connecting end 12 of the nib 10 has an internal thread (not shown) that is engaged with the external thread 51.
When the spindle assembly 50 is assembled into the barrel 20, the connection end 12 of the pen tip 10 protrudes into the barrel 20 and is threadedly connected with the external thread 51 of the spindle assembly 50. In some other examples, the connection end 12 of the pen tip 10 and the spindle assembly 50 may be detachably connected by a snap fit or the like. The detachable connection between the connection end 12 of the pen tip 10 and the spindle assembly 50 allows the pen tip 10 to be replaced.
In order to detect the pressure applied to the writing end 11 of the pen tip 10, referring to fig. 2A, a gap 10a is formed between the pen tip 10 and the pen barrel 20, so that when the writing end 11 of the pen tip 10 is subjected to an external force, the pen tip 10 can move toward the pen barrel 20, and the movement of the pen tip 10 drives the spindle assembly 50 to move within the pen barrel 20. To detect this external force, referring to fig. 2B, a pressure sensing assembly 60 is disposed on the spindle assembly 50; part of the pressure sensing assembly 60 is fixedly connected to a fixing structure in the barrel 20, and part of it is fixedly connected to the spindle assembly 50. Thus, when the spindle assembly 50 moves along with the pen tip 10, the movement of the spindle assembly 50 deforms the pressure sensing assembly 60 because part of the pressure sensing assembly 60 is fixed to the structure in the barrel 20. The deformation of the pressure sensing assembly 60 is transmitted to the circuit board 70 (for example, the pressure sensing assembly 60 and the circuit board 70 may be electrically connected through a wire or a flexible circuit board), and the circuit board 70 detects the pressure on the writing end 11 of the pen tip 10 according to the deformation of the pressure sensing assembly 60, so as to control the line thickness according to the pressure on the writing end 11 of the pen tip 10.
It should be noted that the pressure detection of the pen tip 10 includes, but is not limited to, the above method. For example, a pressure sensor may be provided in writing end 11 of pen tip 10, and the pressure of pen tip 10 may be detected by the pressure sensor.
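As an illustration of controlling line thickness from the detected tip pressure, the following sketch maps a pressure reading to a stroke width. The pressure range, width range, and linear mapping are assumptions for illustration; the application does not specify a concrete mapping.

```python
MIN_PRESSURE, MAX_PRESSURE = 0.0, 4.0  # assumed pressure range of the sensor (N)
MIN_WIDTH, MAX_WIDTH = 1.0, 8.0        # assumed stroke width range (pixels)

def stroke_width(pressure):
    """Map the pressure detected at the writing end to a stroke width."""
    clamped = min(max(pressure, MIN_PRESSURE), MAX_PRESSURE)
    ratio = (clamped - MIN_PRESSURE) / (MAX_PRESSURE - MIN_PRESSURE)
    return MIN_WIDTH + ratio * (MAX_WIDTH - MIN_WIDTH)
```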
In this embodiment, referring to fig. 2B, the stylus pen 100 further includes a plurality of electrodes, which may be, for example, a first transmitting electrode 41, a ground electrode 43, and a second transmitting electrode 42. The first emitter electrode 41, the ground electrode 43, and the second emitter electrode 42 are all electrically connected to the circuit board 70. The first transmitting electrode 41 may be located in the pen tip 10 and near the writing end 11, the circuit board 70 may be configured as a control board that may provide signals to the first transmitting electrode 41 and the second transmitting electrode 42, respectively, the first transmitting electrode 41 is used to transmit a first signal, and when the first transmitting electrode 41 is near the touch screen 201 of the electronic device 200, a coupling capacitance may be formed between the first transmitting electrode 41 and the touch screen 201 of the electronic device 200, so that the electronic device 200 may receive the first signal. The second transmitting electrode 42 is configured to transmit a second signal, and the electronic device 200 determines the tilt angle of the stylus pen 100 according to the received first signal and the received second signal. In the embodiment of the present application, the second emitter electrode 42 may be located on the inner wall of the barrel 20. In one example, the second emitter electrode 42 may also be located on the spindle assembly 50.
The ground electrode 43 may be located between the first and second emitter electrodes 41 and 42, or the ground electrode 43 may be located at the outer peripheries of the first and second emitter electrodes 41 and 42, the ground electrode 43 serving to reduce the coupling of the first and second emitter electrodes 41 and 42 to each other.
When the electronic device 200 receives the first signal from the stylus pen 100, the capacitance value at the corresponding position of the touch screen 201 changes. Accordingly, electronic device 200 can determine the location of stylus 100 (or the tip of stylus 100) on touch screen 201 based on changes in capacitance values on touch screen 201. In addition, the electronic device 200 may acquire the tilt angle of the stylus pen 100 by using a dual-tip projection method in the tilt angle detection algorithm. Here, the positions of the first transmitting electrode 41 and the second transmitting electrode 42 in the stylus pen 100 are different, so that when the electronic device 200 receives the first signal and the second signal from the stylus pen 100, capacitance values at two positions on the touch screen 201 are changed. The electronic device 200 may obtain the tilt angle of the stylus pen 100 according to the distance between the first transmitting electrode 41 and the second transmitting electrode 42 and the distance between two positions where the capacitance value of the touch screen 201 changes, and for more details, refer to the related description of the dual-tip projection method in the prior art. In one embodiment, the touch screen may be referred to as a screen.
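The dual-tip projection idea can be illustrated as follows: the two transmitting electrodes are a known distance apart along the pen axis, and each produces a capacitance change at a point on the touch screen; the ratio of the on-screen distance between those two points to the electrode spacing then gives the tilt of the stylus. The sketch below is a simplified geometric model under the assumption that each capacitance peak lies directly beneath its electrode; it is not the exact algorithm of the prior art referred to above.

```python
import math

def tilt_angle_deg(electrode_spacing_mm, projected_distance_mm):
    """Estimate the tilt of the stylus relative to the screen plane.

    electrode_spacing_mm  -- distance between the two transmitting electrodes
                             measured along the pen axis
    projected_distance_mm -- distance between the two positions on the touch
                             screen where the capacitance value changes
    """
    # When the pen is tilted by angle t from the screen plane, the on-screen
    # separation of the two electrodes is spacing * cos(t).
    ratio = min(max(projected_distance_mm / electrode_spacing_mm, 0.0), 1.0)
    return math.degrees(math.acos(ratio))
```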
In the embodiment of the present application, referring to fig. 2B, the stylus 100 further includes a battery assembly 80, which is used to provide power to the circuit board 70. The battery assembly 80 may include a lithium-ion battery, or the battery assembly 80 may include a nickel-cadmium battery, an alkaline battery, a nickel-metal hydride battery, or the like. In one embodiment, the battery assembly 80 may include a rechargeable battery or a disposable battery. When the battery assembly 80 includes a rechargeable battery, the stylus 100 may charge the battery in the battery assembly 80 through wireless charging.
An electrode array is integrated in the touch screen 201 of the electronic device 200. Referring to fig. 3, after the electronic device 200 is wirelessly connected to the stylus 100, the electronic device 200 may transmit an uplink signal to the stylus 100 through the electrode array. The stylus 100 may receive the uplink signal through a receiving electrode, and the stylus 100 transmits a downlink signal through transmitting electrodes (e.g., the first transmitting electrode 41 and the second transmitting electrode 42). The downlink signal includes the first signal and the second signal described above. When the tip 10 of the stylus 100 contacts the touch screen 201, the capacitance value at the corresponding position of the touch screen 201 changes, and the electronic device 200 may determine the position of the tip 10 of the stylus 100 on the touch screen 201 based on the capacitance values on the touch screen 201. In one embodiment, the uplink and downlink signals may be square wave signals.
Fig. 4 is a schematic diagram of a hardware structure of a stylus pen according to an embodiment of the present disclosure. Referring to FIG. 4, a processor 110 is included in stylus 100. Processor 110 may include storage and processing circuitry to support the operation of stylus 100. The storage and processing circuitry may include storage devices such as non-volatile memory (e.g., flash memory or other electrically programmable read-only memory configured as a solid state drive), volatile memory (e.g., static or dynamic random access memory), and so forth. Processing circuitry in processor 110 may be used to control the operation of stylus 100. The processing circuitry may be based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio chips, application specific integrated circuits, and the like.
One or more sensors can be included in stylus 100. For example, the sensor may include a pressure sensor 120. Pressure sensor 120 may be disposed at writing end 11 of stylus 100 (as shown in fig. 2B). Of course, the pressure sensor 120 may be disposed in the shaft 20 of the stylus 100, such that when a force is applied to one end of the tip 10 of the stylus 100, the other end of the tip 10 moves to apply a force to the pressure sensor 120. In one embodiment, processor 110 may adjust the line thickness of stylus 100 when writing with tip 10 according to the pressure detected by pressure sensor 120.
The sensors may also include inertial sensors 130. Inertial sensors 130 may include three-axis accelerometers and three-axis gyroscopes, and/or other components for measuring motion of stylus 100, e.g., a three-axis magnetometer may be included in the sensor in a nine-axis inertial sensor configuration. The sensors may also include additional sensors such as temperature sensors, ambient light sensors, light-based proximity sensors, contact sensors, magnetic sensors, pressure sensors, and/or other sensors.
A status indicator 140, such as a light emitting diode, and a button 150 may be included in stylus 100. Status indicator 140 is used to alert a user to the status of stylus 100. Buttons 150 may include mechanical and non-mechanical buttons, and buttons 150 may be used to collect button press information from a user.
In the embodiment of the present application, one or more electrodes 160 (refer to the description in fig. 2B in particular) may be included in the stylus 100, wherein one electrode 160 may be located at the writing end of the stylus 100, and one electrode 160 may be located in the pen tip 10, as described above.
Sensing circuitry 170 may be included in stylus 100. Sensing circuitry 170 can sense capacitive coupling between electrodes 160 and drive lines of a capacitive touch sensor panel interacting with stylus 100. The sensing circuit 170 can include an amplifier to receive capacitance readings from the capacitive touch sensor panel, a clock to generate a demodulation signal, a phase shifter to generate a phase shifted demodulation signal, a mixer to demodulate the capacitance readings using an in-phase demodulation frequency component, and a mixer to demodulate the capacitance readings using a quadrature demodulation frequency component, among others. The results of the mixer demodulation can be used to determine an amplitude proportional to the capacitance so that stylus 100 can sense contact with the capacitive touch sensor panel.
It is understood that a microphone, speaker, audio generator, vibrator, camera, data port, and other devices may be included in stylus 100, depending on the actual requirements. A user can control the operation of stylus 100 and electronic device 200 interacting with stylus 100 by providing commands with these devices, as well as receive status information and other outputs.
Processor 110 may be used to run software on stylus 100 that controls the operation of stylus 100. During operation of stylus 100, software running on processor 110 may process sensor inputs, button inputs, and inputs from other devices to monitor movement of stylus 100 and other user inputs. Software running on the processor 110 may detect the user command and may communicate with the electronic device 200.
To support wireless communication of stylus 100 with electronic device 200, stylus 100 may include a wireless module. Fig. 4 illustrates an example in which the wireless module is a bluetooth module 180. The wireless module can also be a WI-FI hotspot module, a WI-FI point-to-point module and the like. Bluetooth module 180 may include a radio frequency transceiver, such as a transceiver. Bluetooth module 180 may also include one or more antennas. The transceiver may transmit and/or receive wireless signals, which may be bluetooth signals, wireless local area network signals, long range signals such as cellular telephone signals, near field communication signals, or other wireless signals, based on the type of wireless module, using the antenna.
Stylus 100 may further include a charging module 190, and charging module 190 may support charging of stylus 100 to provide power to stylus 100.
It should be understood that the electronic device 200 in the embodiment of the present application may be referred to as a User Equipment (UE), a terminal (terminal), and the like, for example, the electronic device 200 may be a mobile terminal or a fixed terminal having a touch screen, such as a tablet computer (PAD), a Personal Digital Assistant (PDA), a handheld device having a wireless communication function, a computing device, a vehicle-mounted device, or a wearable device, a Virtual Reality (VR) terminal device, an Augmented Reality (AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self driving (self driving), a wireless terminal in remote medical (remote medical), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), and the like. The form of the terminal device is not particularly limited in the embodiment of the present application.
Fig. 5 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure. Referring to fig. 5, the electronic device 200 may include multiple subsystems that cooperate to perform, coordinate, or monitor one or more operations or functions of the electronic device 200. Electronic device 200 includes processor 210, input surface 220, coordination engine 230, power subsystem 240, power connector 250, wireless interface 260, and display 270.
For example, coordination engine 230 may be used to communicate with and/or process data with other subsystems of electronic device 200; communicate and/or transact data with stylus 100; measure and/or obtain the output of one or more analog or digital sensors (such as touch sensors); measure and/or obtain the output of one or more sensor nodes of an array of sensor nodes (such as an array of capacitive sensing nodes); receive and locate tip and ring signals from stylus 100; and locate the stylus 100 based on the positions of the tip signal crossing area and the ring signal crossing area; and so on.
Coordination engine 230 of electronic device 200 includes or is otherwise communicatively coupled to a sensor layer located below or integrated with input surface 220. Coordination engine 230 locates stylus 100 on input surface 220 using the sensor layer and estimates the angular position of stylus 100 relative to the plane of input surface 220 using the techniques described herein. In one embodiment, the input surface 220 may be referred to as a touch screen 201.
For example, the sensor layer of coordination engine 230 of electronic device 200 is a grid of capacitive sensing nodes arranged as columns and rows. More specifically, the array of column traces is disposed perpendicular to the array of row traces. The sensor layer may be separate from other layers of the electronic device, or the sensor layer may be disposed directly on another layer, such as, but not limited to: display stack layers, force sensor layers, digitizer layers, polarizer layers, battery layers, structural or decorative outer shell layers, and the like.
The sensor layer can operate in multiple modes. If operating in mutual capacitance mode, the column and row traces form a single capacitive sensing node at each overlap point (e.g., a "vertical" mutual capacitance). If operating in self-capacitance mode, the column and row traces form two (vertically aligned) capacitive sensing nodes at each overlap point. In another embodiment, adjacent column traces and/or adjacent row traces may each form a single capacitive sensing node (e.g., a "horizontal" mutual capacitance) if operating in a mutual capacitance mode. As described above, the sensor layer may detect the presence of the tip 10 of the stylus 100 and/or the touch of a user's finger by monitoring changes in capacitance (e.g., mutual or self capacitance) present at each capacitive sensing node. In many cases, coordination engine 230 may be configured to detect tip and ring signals received from stylus 100 through the sensor layer via capacitive coupling.
Wherein the tip signal and/or the ring signal may include specific information and/or data that may be configured to cause the electronic device 200 to recognize the stylus 100. Such information is generally referred to herein as "stylus identity" information. This information and/or data may be received by the sensor layer and interpreted, decoded, and/or demodulated by the coordination engine 230.
The processor 210 may use the stylus identity information to receive input from more than one stylus at the same time. In particular, the coordination engine 230 may be configured to transmit the position and/or angular position of each of the number of styli detected by the coordination engine 230 to the processor 210. In other cases, the coordination engine 230 may also transmit information to the processor 210 regarding the relative positions and/or relative angular positions of the plurality of styli detected by the coordination engine 230. For example, coordination engine 230 may notify processor 210 that a detected first stylus is located a distance from a detected second stylus.
In other cases, the end signal and/or the ring signal may also include specific information and/or data for the electronic device 200 to identify a particular user. Such information is generally referred to herein as "user identity" information.
The coordination engine 230 may forward the user identity information (if detected and/or recoverable) to the processor 210. If the user identity information cannot be recovered from the tip signal and/or the ring signal, the coordination engine 230 may optionally indicate to the processor 210 that the user identity information is not available. The processor 210 can utilize the user identity information (or the absence of such information) in any suitable manner, including but not limited to: accept or reject input from a particular user, allow or reject access to a particular function of the electronic device, and the like. The processor 210 may use the user identity information to receive input from more than one user at the same time.
In still other cases, the tip signal and/or the ring signal may include specific information and/or data that may be configured to cause the electronic device 200 to identify settings or preferences of the user or stylus 100. Such information is generally referred to herein as "stylus setting" information.
The coordination engine 230 may forward the stylus setting information (if detected and/or recoverable) to the processor 210. If the stylus setting information cannot be recovered from the tip signal and/or the ring signal, the coordination engine 230 may optionally indicate to the processor 210 that the stylus setting information is not available. The electronic device 200 can utilize the stylus to set information (or the absence of such information) in any suitable manner, including but not limited to: applying settings to the electronic device, applying settings to a program running on the electronic device, changing line thickness, color, patterns rendered by a graphics program of the electronic device, changing settings of a video game operating on the electronic device, and so forth.
In general, the processor 210 may be configured to perform, coordinate, and/or manage the functions of the electronic device 200. Such functions may include, but are not limited to: communicate and/or transact data with other subsystems of electronic device 200, communicate and/or transact data with stylus 100, communicate and/or transact data via a wireless interface, communicate and/or transact data via a wired interface, facilitate power exchange via a wireless (e.g., inductive, resonant, etc.) or wired interface, receive position and angular position of one or more styli, and/or the like.
Processor 210 may be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions. For example, the processor may be a microprocessor, central processing unit, application specific integrated circuit, field programmable gate array, digital signal processor, analog circuit, digital circuit, or a combination of these devices. The processor may be a single threaded or a multi-threaded processor. The processor may be a single core or a multi-core processor.
During use, the processor 210 may be configured to access a memory having stored instructions. The instructions may be configured to cause the processor to perform, coordinate, or monitor one or more operations or functions of the electronic device 200.
The instructions stored in the memory may be configured to control or coordinate the operation of other components of the electronic device 200, such as, but not limited to: another processor, analog or digital circuitry, a volatile or non-volatile memory module, a display, a speaker, a microphone, a rotary input device, a button or other physical input device, a biometric authentication sensor and/or system, a force or touch input/output component, a communication module (such as a wireless interface and/or a power connector), and/or a haptic or tactile feedback device.
The memory may also store electronic data that may be used by the stylus or the processor. For example, the memory may store electronic data or content (such as media files, documents, and applications), device settings and preferences, timing signals and control signals or data for various modules, data structures or databases, files or configurations related to detecting tip signals and/or ring signals, and so forth. The memory may be configured as any type of memory. For example, the memory may be implemented as random access memory, read only memory, flash memory, removable memory, other types of storage elements, or a combination of such devices.
The electronic device 200 also includes a power subsystem 240. Power subsystem 240 may include a battery or other power source. The power subsystem 240 may be configured to provide power to the electronic device 200. The power subsystem 240 may also be coupled to a power connector 250. Power connector 250 may be any suitable connector or port that may be configured to receive power from an external power source and/or configured to provide power to an external load. For example, in some embodiments, power connector 250 may be used to recharge a battery within power subsystem 240. In another embodiment, power connector 250 can be used to transmit power stored in (or available to) power subsystem 240 to stylus 100.
Electronic device 200 also includes a wireless interface 260 to facilitate electronic communication between electronic device 200 and stylus 100. In one embodiment, electronic device 200 may be configured to communicate with stylus 100 via a low energy bluetooth communication interface or a near field communication interface. In other examples, the communication interface facilitates electronic communication between the electronic device 200 and an external communication network, device, or platform.
The wireless interface 260 (whether a communication interface between the electronic device 200 and the stylus 100 or another communication interface) may be implemented as one or more wireless interfaces, bluetooth interfaces, near field communication interfaces, magnetic interfaces, universal serial bus interfaces, inductive interfaces, resonant interfaces, capacitive coupling interfaces, Wi-Fi interfaces, TCP/IP interfaces, network communication interfaces, optical interfaces, acoustic interfaces, or any conventional communication interfaces.
The electronic device 200 also includes a display 270. The display 270 may be located behind the input surface 220 or may be integral therewith. The display 270 may be communicatively coupled to the processor 210. Processor 210 may present information to a user using display 270. In many cases, processor 210 uses display 270 to present an interface with which a user may interact. In many cases, a user manipulates stylus 100 to interact with the interface.
It will be apparent to one skilled in the art that some of the specific details presented above with respect to the electronic device 200 may not be required to practice particular described embodiments or their equivalents. Similarly, other electronic devices may include a greater number of subsystems, modules, components, etc. Some sub-modules may be implemented as software or hardware, where appropriate. Accordingly, it should be understood that the above description is not intended to be exhaustive or to limit the disclosure to the precise form disclosed herein. On the contrary, many modifications and variations are possible in light of the above teaching, as would be apparent to those of ordinary skill in the art.
When a user opens a drawing application or a memo application on the electronic device, or when the global writing function of the electronic device is enabled, the user can use the stylus to draw, write, and so on (collectively referred to as drawing operations) on the screen of the electronic device, and the electronic device displays the handwriting drawn by the stylus accordingly. Fig. 6A-6D are schematic diagrams illustrating a user drawing on a screen of an electronic device using a stylus. Fig. 6A-6D take the user drawing handwriting on the interface of the memo application as an example.
Referring to a in fig. 6A, the user draws a line on the interface of the memo using the stylus, and the electronic device displays the line drawn by the user accordingly. Referring to b in fig. 6A, when the user draws a line at the bottom edge of the screen of the electronic device using the stylus, the drawing action is the same as the gesture that triggers the electronic device to exit an application, which may cause the electronic device to exit the memo application. It should be understood that the dotted line in b of fig. 6A represents the drawing operation of the stylus at the bottom edge of the screen. Referring to fig. 6A, in response to the drawing operation of the stylus at the bottom edge of the screen, the electronic device exits the memo application and displays the main interface of the electronic device.
It should be noted that, during drawing, the user originally only wanted to draw a line at the bottom edge of the screen, but the line-drawing action is the same as the gesture (or action) for triggering the electronic device to exit the application, so the exit operation is falsely triggered. If the user wants to continue drawing, the user needs to re-enter the memo application, which affects the drawing experience. Moreover, to avoid such false triggering, the user has to avoid the edge area of the screen when drawing handwriting, which also affects the user experience.
At the edge of the screen of the electronic device, besides the gesture for triggering the electronic device to exit an application in fig. 6A, there are also gestures for triggering the electronic device to display the pull-down notification bar, display the pull-up control center, display multiple windows, display a multitask switching interface, and the like. Referring to b in fig. 6B, when the user draws a line at the top edge of the screen of the electronic device using the stylus, the line-drawing action is the same as the gesture for triggering the electronic device to display the pull-down notification bar, which causes the electronic device to display the pull-down notification bar. Referring to c in fig. 6B, the electronic device displays the pull-down notification bar in response to the drawing operation of the stylus at the top edge of the screen. It should be understood that a in fig. 6B is the same as a in fig. 6A, and the handwriting of the stylus is displayed on the electronic device.
When the user uses the stylus to perform sliding operations at the left and right edges of the screen of the electronic device, the electronic device may be triggered to display a multi-window menu. For example, referring to b in fig. 6C, when the user uses the stylus to draw a line from the left edge of the screen toward the right, the electronic device may be falsely triggered to display a multi-window menu, as shown in c in fig. 6C. It should be understood that a in fig. 6C is the same as a in fig. 6A, and the handwriting of the stylus is displayed on the electronic device.
Referring to b in fig. 6D, when the user draws a line at the bottom edge of the screen of the electronic device using the stylus, and the drawing speed is relatively slow, the action of drawing the line is the same as the gesture for triggering the electronic device to display the multitask switching interface, and the electronic device may be triggered by mistake to display the multitask switching interface, where the multitask switching interface is shown as c in fig. 6D. It should be understood that a in fig. 6D is the same as a in fig. 6A, and the handwriting of the stylus is displayed on the electronic device.
Fig. 6A-6D above illustrate examples of gestures that trigger a response of the electronic device at the edge of its screen; other gestures for triggering a response of the electronic device may also exist. In the following embodiments, such gestures are collectively referred to as "screen edge gestures".
At present, various screen edge gestures are defined at the screen edge of the electronic device. Therefore, when a user draws at the screen edge of the electronic device using a stylus, if the drawing operation is the same as a screen edge gesture, the electronic device is falsely triggered to perform the operation in response to that screen edge gesture, so the user cannot continue drawing, which affects the drawing experience of the user.
To avoid falsely triggering the electronic device to respond to a screen edge gesture, in the embodiment of the present application, the electronic device may be prohibited from responding to screen edge gestures while the user draws at the edge of the screen with the stylus. However, after the electronic device is prohibited from responding to screen edge gestures, if the user actually wants the electronic device to perform operations such as exiting the application or pulling down the notification bar, the electronic device does not respond, which also affects use.
Before introducing the interaction method between the electronic device and the stylus provided by the embodiment of the present application, a structure of the stylus is introduced:
fig. 7 is another schematic structural diagram of a stylus pen according to an embodiment of the present disclosure. It should be understood that a in fig. 7 is a schematic diagram of an internal structure of the stylus pen, and b in fig. 7 is a schematic diagram of connection of the stylus pen. Referring to a and B in fig. 7, on the basis of the stylus shown in fig. 2A and 2B, the stylus shown in fig. 7 is added with an electrode at the tail of the stylus, such as an electrode disposed in the rear cover 30 of fig. 2B.
In the following embodiments, the electrode disposed at the tail of the stylus is referred to as a first electrode 71, and the electrode disposed at the tip of the stylus is referred to as a second electrode 72; the second electrode 72 may be understood with reference to fig. 2B. In an embodiment, unlike fig. 2B, the second electrode 72 may include one electrode, for example the first transmitting electrode 41; the second electrode 72 is not limited in this embodiment. Referring to fig. 7, the stylus may include: a first electrode 71, a second electrode 72, a micro controller unit (MCU) 73, and a communication module 74. The MCU 73 is connected to the first electrode 71, the second electrode 72, and the communication module 74, respectively.
The first electrode 71 and the second electrode 72 may be the same or different. In the embodiment of the present application, whether the first electrode 71 and the second electrode 72 are the same or different refers to whether the frequency of the first electrode 71 and the frequency of the second electrode 72 are the same or different. Illustratively, the frequency of the first electrode 71 and the frequency of the second electrode 72 are the same, for example, both are 360 Hz. Alternatively, the frequency of the first electrode 71 is different from the frequency of the second electrode 72, for example, the frequency of the first electrode 71 is 120 Hz and the frequency of the second electrode 72 is 360 Hz.
The MCU 73 is configured to control the first electrode 71 and the second electrode 72 to emit downlink signals and to process signals from the electronic device. In one embodiment, the MCU 73 may be the circuit board 70 of fig. 2B described above.

The communication module 74 is configured to enable wireless communication between the stylus and the electronic device. Illustratively, the communication module 74 may be a Bluetooth module, a Wi-Fi module, or the like.

In one embodiment, the stylus may further include a pressure sensor 75, which may be connected to the MCU 73. The pressure sensor 75 is disposed at the tip of the stylus, as described with reference to fig. 2B and 4.
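For orientation only, the stylus-side components described above might be modeled in software as in the minimal sketch below; the class and field names (Electrode, Stylus, frequency_hz, and so on) are illustrative assumptions and do not appear in the embodiment.

```python
from dataclasses import dataclass

# Illustrative model of the stylus of fig. 7 (names are assumptions, not from the embodiment).
@dataclass
class Electrode:
    location: str        # "tail" for the first electrode 71, "tip" for the second electrode 72
    frequency_hz: float  # downlink signal frequency, e.g. 120.0 or 360.0 Hz

@dataclass
class Stylus:
    first_electrode: Electrode        # electrode 71 at the tail
    second_electrode: Electrode       # electrode 72 at the tip
    has_pressure_sensor: bool = True  # pressure sensor 75 at the tip (optional in the embodiment)

    def emit_downlink(self):
        """The MCU 73 drives both electrodes to emit downlink signals."""
        return [(e.location, e.frequency_hz) for e in (self.first_electrode, self.second_electrode)]

# Example configuration with different frequencies, matching one example in the text.
stylus = Stylus(Electrode("tail", 120.0), Electrode("tip", 360.0))
print(stylus.emit_downlink())  # [('tail', 120.0), ('tip', 360.0)]
```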
On the basis of the stylus shown in fig. 7, the interaction method between the electronic device and the stylus provided in the embodiments of the present application is described below with reference to specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in every embodiment.
Fig. 8 is a flowchart illustrating an embodiment of an interaction method between an electronic device and a stylus according to an embodiment of the present disclosure. Referring to fig. 8, an interaction method of an electronic device and a stylus provided in an embodiment of the present application may include:
s801, the electronic device acquires the position of the touch pen on the screen.
In one embodiment, the electronic device may acquire the position of the stylus on the screen while the electronic device is in drawing mode. The electronic device being in drawing mode can be understood as follows: the stylus can draw handwriting on the screen of the electronic device, and the electronic device displays the handwriting of the stylus accordingly. Illustratively, if the user opens a drawing application program or a memo application program on the electronic device, or enables the global writing function of the electronic device, the electronic device is in drawing mode, and the user can use the stylus to draw, write and so on on the screen (collectively referred to as drawing handwriting). For the global writing function, the user may turn on a control for the global writing function on a settings interface of the electronic device, which triggers the electronic device to enable the global writing function.
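As a rough illustration of the drawing-mode condition just described, the check might look like the sketch below; the function name, the flag, and the application identifiers are assumptions for illustration only.

```python
# Hypothetical sketch: the device is in drawing mode when a drawing-capable application is in the
# foreground or the global writing function has been enabled in settings.
DRAWING_APPS = {"drawing", "memo"}   # assumed identifiers, not from the embodiment

def is_in_drawing_mode(foreground_app: str, global_writing_enabled: bool) -> bool:
    return foreground_app in DRAWING_APPS or global_writing_enabled

print(is_in_drawing_mode("memo", False))     # True: memo application is open
print(is_in_drawing_mode("browser", True))   # True: global writing function enabled
print(is_in_drawing_mode("browser", False))  # False: non-drawing mode
```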
In this application, after the electronic device is connected to the stylus via Bluetooth, the MCU in the stylus can control the first electrode and the second electrode to transmit signals at a certain frequency, and the electronic device can receive the signals from the first electrode and the second electrode. In one embodiment, the farther the stylus is from the screen of the electronic device, the weaker the signal the electronic device receives from the first electrode and the second electrode; because of this, the electronic device can receive the signal from the first electrode only when the first electrode is within a certain distance of the screen, and likewise can receive the signal from the second electrode only when the second electrode is within a certain distance of the screen. Illustratively, the distance may be 3 mm.
When the electronic device receives a signal from the first electrode or the second electrode, the capacitance value at the corresponding position on the screen of the electronic device changes. Accordingly, the electronic device can determine the position of the tip or the tail of the stylus on the screen based on the change of the capacitance value on the screen; reference may be made to the related description of fig. 2B.
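This position determination can be pictured as scanning the screen's capacitive sensing grid for the cell whose capacitance changed the most, as in the hedged sketch below; the grid representation, threshold value, and function name are assumptions rather than the embodiment's implementation.

```python
from typing import List, Optional, Tuple

# Illustrative only: locate the stylus by the grid cell whose capacitance changed the most.
def locate_stylus(cap_delta: List[List[float]], threshold: float = 5.0) -> Optional[Tuple[int, int]]:
    """cap_delta[row][col] is the capacitance change at that sensing cell."""
    best, best_pos = 0.0, None
    for r, row in enumerate(cap_delta):
        for c, delta in enumerate(row):
            if delta > best:
                best, best_pos = delta, (r, c)
    return best_pos if best >= threshold else None  # None: stylus too far from the screen (e.g. beyond ~3 mm)

grid = [[0.0, 0.0, 0.0],
        [0.0, 9.3, 1.2],
        [0.0, 0.8, 0.0]]
print(locate_stylus(grid))  # (1, 1)
```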
In the embodiment of the application, because both the tip and the tail of the stylus include electrodes, the user can draw handwriting on the screen with either the tip or the tail, and the electronic device can display the handwriting at the corresponding position based on the position of the tip or the tail of the stylus on the screen.
S802, in response to the position of the stylus being within a preset area at the edge of the screen, the electronic device detects the touch mode of the stylus.
Different touch modes cause the electronic device to perform different actions (or operations, responses) in response to the operation of the stylus.

Different touch modes correspond to different parts of the stylus being in contact with the screen; that is, when the user brings different parts of the stylus into contact with the screen, the touch mode of the stylus is different.
In one embodiment, the stylus may include at least two pen tips; for example, the pen shaft of the stylus contains a plurality of pen tips, one of which can be ejected out of the pen shaft when the stylus is used, and each pen tip may invoke a different function of the electronic device. In this example, the touch mode of the stylus differs depending on which pen tip touches the screen. Illustratively, the first touch mode is the first pen tip touching the screen, and the second touch mode is the second pen tip touching the screen. The pen tips may be differentiated by different thicknesses, and/or by electrodes of different frequencies.
In one embodiment, the first touch mode characterizes the tip of the stylus contacting the screen, and the second touch mode characterizes the tail of the stylus contacting the screen. That is, the user can use different touch modes of the stylus by contacting the screen with either the tip or the tail.

In the following embodiments, the description takes as an example that the first touch mode represents the tip of the stylus contacting the screen and the second touch mode represents the tail of the stylus contacting the screen. In this example, the above S802 may be replaced with: in response to the position of the stylus being within the preset area at the screen edge, the electronic device detects whether it is the tip or the tail of the stylus that is in contact with the screen.
The preset area at the screen edge differs according to the screen edge gesture. Illustratively, when the screen edge gesture is the gesture for triggering the electronic device to exit the application program, the preset area of the screen edge is the area 61 at the bottom edge of the screen, as shown by a in fig. 6A. When the screen edge gesture is the gesture for triggering the electronic device to display the pull-down notification bar, the preset area of the screen edge is the area 62 at the top edge of the screen, as shown by a in fig. 6B.
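As an illustration only, the correspondence between a screen edge gesture and its preset area could be represented as in the sketch below; the function names, the coordinate convention, and the 80-pixel strip height are assumptions, not values from the embodiment.

```python
# Hypothetical mapping from a screen edge gesture to its preset region (x0, y0, x1, y1)
# on a screen of width W and height H; the 80-pixel strip height is an assumed value.
def preset_region(gesture: str, W: int, H: int, strip: int = 80):
    if gesture == "exit_application":        # bottom-edge strip, like area 61 in fig. 6A
        return (0, H - strip, W, H)
    if gesture == "pull_down_notification":  # top-edge strip, like area 62 in fig. 6B
        return (0, 0, W, strip)
    raise ValueError(f"unknown gesture: {gesture}")

def in_region(x: int, y: int, region) -> bool:
    x0, y0, x1, y1 = region
    return x0 <= x < x1 and y0 <= y < y1

print(in_region(300, 2350, preset_region("exit_application", 1080, 2400)))  # True
```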
It should be understood that, in the embodiment of the present application, when the electronic device detects that the position of the stylus is within a preset area at any edge of the screen, it may detect the touch mode of the stylus (i.e., whether it is the tip or the tail of the stylus that touches the screen).
In an embodiment of the present application, the manner of detecting the touch mode of the stylus by the electronic device may be:
In the first manner, the first electrode and the second electrode have different frequencies.
In this manner, the electronic device can determine the touch mode of the stylus based on the frequency of the signal received from the electrode. For example, if the frequency of the first electrode is 360 Hz and the frequency of the second electrode is 120 Hz, and the frequency of the signal received by the electronic device is 360 Hz, it may be determined that it is the tail that touches the screen, i.e., the touch mode of the stylus is the second touch mode.
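A minimal sketch of this first manner follows, under the assumption (taken from the example above) that the tail electrode emits at 360 Hz and the tip electrode at 120 Hz; the function name and the tolerance value are illustrative, not part of the embodiment.

```python
# Frequency-based detection (first manner): classify the touch mode from the received signal frequency.
TAIL_FREQ_HZ = 360.0  # first electrode in the example above
TIP_FREQ_HZ = 120.0   # second electrode in the example above

def touch_mode_by_frequency(received_hz: float, tolerance_hz: float = 10.0) -> str:
    if abs(received_hz - TIP_FREQ_HZ) <= tolerance_hz:
        return "first"   # first touch mode: the pen tip contacts the screen
    if abs(received_hz - TAIL_FREQ_HZ) <= tolerance_hz:
        return "second"  # second touch mode: the tail contacts the screen
    return "unknown"

print(touch_mode_by_frequency(360.0))  # 'second'
print(touch_mode_by_frequency(121.5))  # 'first'
```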
In the second manner, the frequency of the first electrode is the same as that of the second electrode.
The contact area of the pen tip with the screen is smaller than that of the tail, so the area of the screen in which the capacitance value changes when the pen tip contacts the screen is smaller than when the tail contacts the screen. In this manner, the electronic device may detect the touch mode of the stylus based on the area of the screen in which the capacitance value changes. In one embodiment, the area of the screen in which the capacitance value changes can be represented by a number of pixels on the screen; the larger the number of pixels, the larger the area.
The electronic device stores a first area range for the area in which the capacitance value of the screen changes when the pen tip contacts the screen, and a second area range for the area in which the capacitance value changes when the tail contacts the screen. For example, the electronic device detects the area of the screen in which the capacitance value changes; if that area is within the first area range, it may determine that it is the pen tip touching the screen, that is, the touch mode is the first touch mode, and if that area is within the second area range, it may determine that it is the tail touching the screen, that is, the touch mode is the second touch mode.
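The second manner can be sketched as comparing the number of sensing cells (or pixels) whose capacitance changed against the two stored area ranges; the concrete range values below are illustrative assumptions, not values from the embodiment.

```python
# Area-based detection (second manner): the pen tip disturbs fewer cells than the tail.
FIRST_AREA_RANGE = (1, 4)    # assumed pixel-count range for the pen tip
SECOND_AREA_RANGE = (5, 12)  # assumed pixel-count range for the tail

def touch_mode_by_area(changed_cells: int) -> str:
    lo, hi = FIRST_AREA_RANGE
    if lo <= changed_cells <= hi:
        return "first"   # pen tip contacts the screen
    lo, hi = SECOND_AREA_RANGE
    if lo <= changed_cells <= hi:
        return "second"  # tail contacts the screen
    return "unknown"     # e.g. a much larger area; finger handling is discussed below

print(touch_mode_by_area(2))  # 'first'
print(touch_mode_by_area(8))  # 'second'
```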
In one embodiment, the tip of the stylus is provided with a pressure sensor, and when the tip contacts the screen, the stylus can send the pressure data of the tip to the electronic device through the Bluetooth module in the stylus. In such an embodiment, to detect the touch mode of the stylus more accurately, the electronic device may further take into account whether pressure data is received from the stylus.
For example, if the electronic device detects that the area of the screen in which the capacitance value changes is within the first area range and it also receives pressure data from the stylus, it may determine that it is the pen tip that touches the screen, that is, the touch mode is the first touch mode. If the electronic device detects that the area in which the capacitance value changes is within the second area range and it does not receive pressure data from the stylus, it may determine that it is the tail that touches the screen, that is, the touch mode is the second touch mode.
It should be noted that when a user's finger touches the screen, the capacitance value on the screen also changes, and the area in which the capacitance value changes when a finger touches the screen is larger than the area in which it changes when the tail touches the screen. In the embodiment of the application, the electronic device may therefore determine whether it is the pen tip, the tail, or a finger that is in contact with the screen, based on the area of the screen in which the capacitance value changes, the first area range corresponding to the pen tip, the second area range corresponding to the tail, and the area range corresponding to a finger.
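Combining the area ranges with the pressure report, and adding a larger range for a finger, might look like the following sketch; all three ranges and the names are assumptions for illustration only.

```python
# Combined detection: area of capacitance change plus whether pressure data arrived over Bluetooth.
FIRST_AREA_RANGE = (1, 4)     # assumed range for the pen tip
SECOND_AREA_RANGE = (5, 12)   # assumed range for the tail
FINGER_AREA_RANGE = (13, 40)  # assumed range for a finger (larger than the tail)

def classify_contact(changed_cells: int, pressure_received: bool) -> str:
    if FIRST_AREA_RANGE[0] <= changed_cells <= FIRST_AREA_RANGE[1] and pressure_received:
        return "tip"     # first touch mode
    if SECOND_AREA_RANGE[0] <= changed_cells <= SECOND_AREA_RANGE[1] and not pressure_received:
        return "tail"    # second touch mode
    if FINGER_AREA_RANGE[0] <= changed_cells <= FINGER_AREA_RANGE[1]:
        return "finger"
    return "unknown"

print(classify_contact(3, True))    # 'tip'
print(classify_contact(9, False))   # 'tail'
print(classify_contact(20, False))  # 'finger'
```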
S803, in response to the touch mode being the first touch mode, the electronic device displays handwriting on the screen based on the position of the stylus on the screen.
The first touch mode represents that the pen point of the touch pen is in contact with the screen.
When the tip of the stylus is in contact with the screen, the electronic device may display handwriting at the corresponding position based on the position of the tip on the screen. That is, when it is the tip of the stylus that contacts the screen, the electronic device may inhibit responding to the screen edge gesture; even if it detects that the user performs the screen edge gesture using the tip of the stylus, it does not execute the operation responding to that gesture.
Illustratively, a in fig. 9 is an interface of a memo, on which the user draws handwriting using the pen tip of the stylus, and the electronic device may display the handwriting based on the position of the pen tip. Referring to b in fig. 9, when the user draws handwriting at the bottom edge of the screen using the pen tip, the drawing operation of the stylus is the same as the screen edge gesture that triggers "exit application", yet the electronic device does not respond to the screen edge gesture and instead displays the handwriting on the screen based on the position of the pen tip on the screen.
S804, in response to the touch mode being the second touch mode, if the drawing operation of the stylus is the same as a first screen edge gesture, the electronic device executes the operation responding to the first screen edge gesture.
The second touch mode represents that it is the tail of the stylus that contacts the screen.

When the tail of the stylus is in contact with the screen, the electronic device may obtain the drawing operation (or operation, or action) of the tail on the screen based on the position of the tail on the screen; if the drawing operation of the tail is the same as the first screen edge gesture, the electronic device executes the operation responding to the first screen edge gesture. That is to say, in the embodiment of the present application, when the tail of the stylus contacts the screen, the electronic device does not prohibit the response to screen edge gestures: when it detects that the user performs a screen edge gesture using the tail of the stylus, it may execute the operation responding to that gesture. It should be understood that the first screen edge gesture is a screen edge gesture associated with the preset area of the screen edge, such as the screen edge gesture that triggers "exit application".
In one embodiment, the first screen edge gesture is any one of: the gesture of triggering the electronic device to exit the application program, the gesture of triggering the electronic device to display a pull-down notification bar, the gesture of triggering the electronic device to display a pull-up control center, the gesture of triggering the electronic device to display multiple windows, or the gesture of triggering the electronic device to display a multi-task switching interface.
Illustratively, a in fig. 10 is a memo interface on which the user draws handwriting using the tail of the stylus, and the electronic device may display the handwriting based on the position of the tail. Referring to b in fig. 10, when the user draws handwriting at the bottom edge of the screen using the tail of the stylus, the electronic device detects that the drawing operation of the tail is the screen edge gesture of "triggering the electronic device to exit the application", and the electronic device may then execute the operation of exiting the application. As shown by c in fig. 10, the electronic device may exit the memo application and display the main interface. It should be understood that the dashed line in a in fig. 10 represents the drawing operation of the tail of the stylus on the screen.
In summary, S801 to S804 provide an interaction method between an electronic device and a stylus. When the user draws handwriting using the tip of the stylus, even in an edge area of the screen of the electronic device, the electronic device displays the handwriting of the tip and is not falsely triggered to execute an operation responding to a screen edge gesture. In addition, if the user does want to trigger the electronic device to execute an operation responding to a screen edge gesture, the user may perform the screen edge gesture using the tail of the stylus, which triggers the electronic device to execute that operation. In the embodiment of the application, the user can both finish drawing at the edge of the screen and trigger the electronic device to execute operations responding to screen edge gestures with a single stylus; the operation is simple and the user experience can be improved.
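Putting S801 to S804 together, the dispatch in the edge area might be sketched as below; the function name, the string labels, and the returned actions are illustrative assumptions rather than the embodiment's implementation.

```python
# End-to-end sketch of S801-S804 in the edge area: the tip always draws,
# while the tail is allowed to trigger the matching screen edge gesture.
def handle_edge_touch(touch_mode: str, operation: str, edge_gesture: str) -> str:
    if touch_mode == "first":         # S803: pen tip -> always draw, never trigger the gesture
        return "display handwriting"
    if touch_mode == "second":        # S804: tail -> respond if the operation matches the gesture
        if operation == edge_gesture:
            return f"perform response to {edge_gesture}"
        return "display handwriting"  # tail drawing that does not match any gesture
    return "ignore"

# Same operation at the bottom edge, different touch modes, different behavior.
print(handle_edge_touch("first", "exit_application", "exit_application"))   # 'display handwriting'
print(handle_edge_touch("second", "exit_application", "exit_application"))  # 'perform response to exit_application'
```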
Illustratively, a in fig. 11 is a memo interface on which a user may draw handwriting using a pen tip of a stylus, and the electronic device may display the handwriting accordingly. Referring to b in fig. 11, when the user desires to view the pull-down notification bar, the user may perform a screen edge gesture triggering the "pull-down notification bar" in an edge area of the top of the screen using the tail of the stylus pen, and the electronic device may display the pull-down notification bar in response to an operation of the tail of the stylus pen. The drop down notification bar may be as shown in c in fig. 11.
The above embodiments describe the interaction method between the electronic device and the stylus when the electronic device is in drawing mode; the following describes the interaction method between the electronic device and the stylus when the electronic device is in non-drawing mode.
The electronic device being in non-drawing mode can be understood as follows: the electronic device does not display handwriting based on the operation of the stylus, but instead executes an operation responding to that operation. For example, when the user performs a sliding operation on a page using the stylus, the electronic device may slide the displayed page in response. That is, when the electronic device is in non-drawing mode, the user may click, slide, and so on using either the tip or the tail of the stylus, and the electronic device accordingly executes the operation responding to the operation of the tip or the tail.
For example, a in fig. 12 shows the main interface of the electronic device. The user may click an icon of an application program displayed on the main interface using the pen tip of the stylus, for example, click the icon of a social application program, and the electronic device may display the interface of that social application program. As shown by b in fig. 12, the interface of the social application includes dialog boxes 121 between the user and a plurality of contacts; the user may click any one of the dialog boxes 121 using the tail of the stylus, and the electronic device may display the dialog interface shown by c in fig. 12.
In the embodiment of the application, when the electronic device is in non-drawing mode, the user can click, slide, and so on with either the tip or the tail of the stylus, and the electronic device executes the corresponding operation based on the operation of the tip or the tail. The first electrode newly added at the tail of the stylus does not affect the current interaction flow between the stylus and the electronic device; it enriches the interaction modes between the stylus and the electronic device and reduces wear of the pen tip.
In an embodiment, an embodiment of the present application further provides an electronic device, and with reference to fig. 13, the electronic device may include: a processor (e.g., CPU) 1301, a memory 1302, and a screen 1303. The memory 1302 may include a random-access memory (RAM) and may further include a non-volatile memory (NVM), such as at least one disk memory, and the memory 1302 may store various instructions for performing various processing functions and implementing the method steps of the present application.
Optionally, the electronic device related to the present application may further include: a power supply 1304, a communication bus 1305, and a communication port 1306. The communication port 1306 is used for enabling connection communication between the electronic device and other peripherals. In an embodiment of the present application, the memory 1302 is used for storing computer executable program code, which includes instructions; when the processor executes the instructions, the instructions cause the processor of the electronic device to execute the actions in the above method embodiments, which have similar implementation principles and technical effects, and are not described herein again.
In an embodiment, the present application further provides a touch interaction system, which may include an electronic device and a stylus, as shown in fig. 1. The electronic device may perform the steps performed by the electronic device in the above embodiments, so the touch interaction system can implement the interaction method between the electronic device and the stylus provided in the embodiments of the present application; for the specific implementation principles and technical effects, reference may be made to the relevant descriptions in the above embodiments.
It should be noted that the modules or components described in the above embodiments may be one or more integrated circuits configured to implement the above methods, for example: one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs), etc. For another example, when one of the above modules is implemented in the form of a processing element scheduling program code, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor that can call program code, such as a controller. As another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the present application are generated in whole or in part when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that incorporates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk (SSD)), among others.
The term "plurality" herein means two or more. The term "and/or" herein is merely an association describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship; in the formula, the character "/" indicates that the preceding and following related objects are in a relationship of "division". In addition, it is to be understood that the terms first, second, etc. in the description of the present application are used for distinguishing between the descriptions and not necessarily for describing a sequential or chronological order.
It is to be understood that the various numerical references referred to in the embodiments of the present application are merely for descriptive convenience and are not intended to limit the scope of the embodiments of the present application.
It should be understood that, in the embodiment of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiment of the present application.

Claims (11)

1. An interaction method between an electronic device and a stylus, applied to the electronic device, the method comprising:

acquiring a position of a stylus on a screen of the electronic device;

in response to the position being within a preset area at the edge of the screen, detecting a touch mode of the stylus, wherein different touch modes of the stylus represent different parts of the stylus contacting the screen, and electrodes are arranged on the parts of the stylus that contact the screen;

in response to the touch mode being a first touch mode, displaying handwriting of the stylus;

and in response to the touch mode being a second touch mode, if an operation of the stylus on the screen is the same as a first screen edge gesture, executing an operation responding to the first screen edge gesture.
2. The method of claim 1, wherein the first touch mode is indicative of a tip of the stylus contacting the screen and the second touch mode is indicative of a tail of the stylus contacting the screen.
3. The method of claim 1 or 2, wherein the first screen edge gesture is any one of: triggering a gesture of the electronic device for exiting an application program, triggering a gesture of the electronic device for displaying a pull-down notification bar, triggering a gesture of the electronic device for displaying a pull-up control center, triggering a gesture of the electronic device for displaying multiple windows, or triggering a gesture of the electronic device for displaying a multi-task switching interface.
4. The method of claim 2, wherein the tail is provided with a first electrode, the tip is provided with a second electrode, the first electrode and the second electrode have different frequencies, and the detecting the touch pattern of the stylus comprises:
detecting a touch mode of the stylus pen according to a frequency of a signal from an electrode.
5. The method of claim 2, wherein the tail is provided with a first electrode, the tip is provided with a second electrode, the first electrode and the second electrode have the same frequency, and the detecting the touch pattern of the stylus comprises:
detecting a touch mode of the stylus according to an area where a signal from an electrode causes a change in capacitance value of the screen.
6. The method of claim 5, wherein the area causing the change in the capacitance value of the screen when the tip contacts the screen is within a first area range, the area causing the change in the capacitance value of the screen when the tail contacts the screen is within a second area range, and the detecting the touch mode of the stylus based on the area where the signal from the electrode causes the change in the capacitance value of the screen comprises:

if the area where the signal from the electrode causes the capacitance value of the screen to change is within the first area range, determining that the touch mode is the first touch mode;

and if the area where the signal from the electrode causes the capacitance value of the screen to change is within the second area range, determining that the touch mode is the second touch mode.
7. The method of claim 5, wherein the pen tip is further provided with a pressure sensor, when the pen tip contacts the screen, the stylus sends pressure data collected by the pressure sensor to the electronic device through a wireless connection with the electronic device, and the detecting the touch mode of the stylus according to an area where a signal from the electrode causes a change in capacitance value of the screen comprises:
detecting a touch mode of the stylus according to an area where a signal from an electrode causes a change in capacitance value of the screen and whether pressure data from the stylus is received.
8. The method of claim 7, wherein the area that causes the change in the capacitance of the screen when the tip contacts the screen is within a first area range, and the area that causes the change in the capacitance of the screen when the tail contacts the screen is within a second area range;
the detecting a touch mode of the stylus according to an area where a signal from an electrode causes a change in capacitance value of the screen and whether pressure data from the stylus is received, includes:
if the area where the signal from the electrode causes the capacitance value of the screen to change is within the first area range and pressure data from the stylus is received, determining that the touch mode is the first touch mode;

and if the area where the signal from the electrode causes the capacitance value of the screen to change is within the second area range and pressure data from the stylus is not received, determining that the touch mode is the second touch mode.
9. An electronic device, comprising: a processor, a memory, and a screen;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored by the memory, causing the processor to perform the method of any of claims 1-8.
10. A touch interactive system, comprising: a stylus, and the electronic device of claim 9.
11. A computer-readable storage medium, in which a computer program or instructions are stored which, when executed, implement the method of any one of claims 1-8.
CN202111382321.9A 2021-11-22 2021-11-22 Electronic equipment and touch pen interaction method and system and electronic equipment Active CN113821113B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111382321.9A CN113821113B (en) 2021-11-22 2021-11-22 Electronic equipment and touch pen interaction method and system and electronic equipment

Publications (2)

Publication Number Publication Date
CN113821113A CN113821113A (en) 2021-12-21
CN113821113B true CN113821113B (en) 2022-04-22

Family

ID=78917915

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111382321.9A Active CN113821113B (en) 2021-11-22 2021-11-22 Electronic equipment and touch pen interaction method and system and electronic equipment

Country Status (1)

Country Link
CN (1) CN113821113B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115543105B (en) * 2022-01-11 2023-10-20 荣耀终端有限公司 Information transmission method and device
CN114911364B (en) * 2022-03-29 2023-06-13 荣耀终端有限公司 Control method, touch pen and touch system
CN115113747B (en) * 2022-08-22 2023-04-07 荣耀终端有限公司 Touch pen using method and system and touch pen
CN116736991A (en) * 2022-09-23 2023-09-12 荣耀终端有限公司 Control method of touch pen, touch pen and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN203812196U (en) * 2014-01-29 2014-09-03 上海允得文具礼品有限公司 Active type touch pen allowing writing
KR20140128818A (en) * 2013-04-29 2014-11-06 인포뱅크 주식회사 A portable terminal and a method for operating it
KR20160102811A (en) * 2015-02-23 2016-08-31 엘지전자 주식회사 Mobile terminal that can control handwriting relevant function through gesture of the hand which is worn a wrist wearable device and method for controlling the same

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7256773B2 (en) * 2003-06-09 2007-08-14 Microsoft Corporation Detection of a dwell gesture by examining parameters associated with pen motion
KR20100093293A (en) * 2009-02-16 2010-08-25 주식회사 팬택 Mobile terminal with touch function and method for touch recognition using the same
KR102084041B1 (en) * 2012-08-24 2020-03-04 삼성전자 주식회사 Operation Method And System for function of Stylus pen
WO2018136057A1 (en) * 2017-01-19 2018-07-26 Hewlett-Packard Development Company, L.P. Input pen gesture-based display control
WO2020107384A1 (en) * 2018-11-30 2020-06-04 深圳市柔宇科技有限公司 Handwriting clearing tool, writing apparatus and writing system
CN113641283A (en) * 2021-07-05 2021-11-12 华为技术有限公司 Electronic device, screen writing mode switching method and medium thereof

Also Published As

Publication number Publication date
CN113821113A (en) 2021-12-21

Similar Documents

Publication Publication Date Title
CN113238703B (en) Note generation method and system
CN113821113B (en) Electronic equipment and touch pen interaction method and system and electronic equipment
KR102139526B1 (en) Apparatus, method and computer readable recording medium for fulfilling a plurality of objects displayed on an electronic device
KR102092132B1 (en) Electronic apparatus providing hovering input effect and control method thereof
US10775901B2 (en) Techniques for identifying rolling gestures on a device
EP2743819A2 (en) Terminal and method for providing user interface using a pen
JP6085630B2 (en) Touch pen system and touch pen
CN114461129B (en) Handwriting drawing method and device, electronic equipment and readable storage medium
KR20140126129A (en) Apparatus for controlling lock and unlock and method therefor
CN109478108B (en) Stylus communication channel
EP2703978B1 (en) Apparatus for measuring coordinates and control method thereof
KR102051585B1 (en) An electronic device and method having a function of hand writing using multi-touch
US11216121B2 (en) Smart touch pad device
CN117032485A (en) Touch pen-based use method and device
EP3433713B1 (en) Selecting first digital input behavior based on presence of a second, concurrent, input
US20150370352A1 (en) Active stylus pen, data input system and control method of active stylus pen
CN114995665A (en) Method for prompting pen point life of touch control pen, touch control pen and electronic equipment
CN115562560A (en) Handwriting drawing method and device, electronic equipment and readable storage medium
KR101961786B1 (en) Method and apparatus for providing function of mouse using terminal including touch screen
JPWO2020158088A1 (en) Drawing system
EP4206885A1 (en) Functional mode switching method, electronic device, and system
CN115543105A (en) Information transmission method and device
CN115599229A (en) Control method of touch pen and touch pen device
WO2022019899A1 (en) Stylus with force sensor arrays
KR20230023425A (en) Method for sensing touch gesture and electronic device supporting the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230919

Address after: 201306 building C, No. 888, Huanhu West 2nd Road, Lingang New District, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai

Patentee after: Shanghai Glory Smart Technology Development Co.,Ltd.

Address before: Unit 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong 518040

Patentee before: Honor Device Co.,Ltd.