WO2024032470A1 - Stylus-based usage method and device - Google Patents

Stylus-based usage method and device

Info

Publication number
WO2024032470A1
Authority
WO
WIPO (PCT)
Prior art keywords
stylus
area
user
electronic device
mode
Prior art date
Application number
PCT/CN2023/111063
Other languages
English (en)
French (fr)
Other versions
WO2024032470A9 (zh)
Inventor
张北航
Original Assignee
荣耀终端有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 荣耀终端有限公司
Priority to CN202380025107.9A (published as CN118786408A)
Publication of WO2024032470A1
Publication of WO2024032470A9


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/04162Control or interface arrangements specially adapted for digitisers for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present application relates to the field of terminal technology, and in particular to a stylus-based usage method and device.
  • a stylus is wrapped with a touch film near the pen tip, so that the stylus can control functions of electronic devices based on the user's trigger operations on the touch film.
  • the accuracy of the above stylus-based usage method is low, which affects the user's experience of controlling electronic devices with the stylus.
  • Embodiments of the present application provide a stylus-based usage method and device, which can improve the accuracy of operation recognition by identifying trigger data for the first area where the touch film overlaps the cut surface, thereby improving the user's experience of controlling an electronic device with the stylus.
  • embodiments of the present application provide a method of using a stylus.
  • Part or all of the body of the stylus is surrounded by a touch film.
  • the body of the stylus is provided with a cut surface.
  • the touch film includes a first area; the first area is smaller than the pen body area covered by the touch film, and the first area is located at the cut surface.
  • the method includes: when the stylus receives the first operation for the first area, the stylus recognizes the first operation, obtains a recognition result, and controls the target device based on the recognition result.
  • the target device establishes a communication connection with the stylus. In this way, the accuracy of operation recognition can be improved by identifying the trigger data of the first area where the touch film overlaps with the cut surface, thereby improving the user experience of using the stylus to control the target device.
  • when the first operation is a sliding operation away from the pen tip of the stylus, the recognition result is used to instruct the target device to turn the page down; or, when the first operation is a sliding operation toward the pen tip of the stylus, the recognition result is used to instruct the target device to turn the page up.
  • In this way, the stylus can control the target device to turn pages according to the user's sliding operation on the first area, avoiding accidental touches caused by the user triggering other positions of the touch film, and thereby improving the accuracy with which the user controls the target device using the stylus.
  • the method further includes: the stylus obtains status information of the stylus, where the status information of the stylus is used to indicate the orientation of the pen tip of the stylus; and recognizing the first operation, obtaining the recognition result, and controlling the target device based on the recognition result includes: the stylus recognizing the first operation and the status information, obtaining the recognition result, and controlling the target device based on the recognition result.
  • the stylus can jointly control page turning of the target device according to the direction of the tip of the stylus and the sliding operation on the first area, so that the user can use the stylus to flexibly control the target device.
  • when the status information meets the preset condition and the first operation is a sliding operation away from the pen tip of the stylus, the recognition result is used to instruct the target device to turn the page down; or, when the status information does not meet the preset condition and the first operation is a sliding operation away from the pen tip of the stylus, the recognition result is used to instruct the target device to turn the page up; or, when the status information meets the preset condition and the first operation is a sliding operation close to the pen tip of the stylus, the recognition result is used to instruct the target device to turn the page up; or, when the status information does not meet the preset condition and the first operation is a sliding operation close to the pen tip of the stylus, the recognition result is used to instruct the target device to turn the page down. In this way, the stylus can control page turning of the target device jointly according to the orientation of the pen tip of the stylus and the sliding operation on the first area, so that the user can use the stylus to control the target device flexibly.
  • the preset condition may be that the pen tip is upward as described in the embodiment of the present application; if the preset condition is not satisfied, the pen tip may be downward as described in the embodiment of the present application.
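As an illustration only (not language from the claims), the page-direction mapping described in the preceding paragraphs can be summarized in a short decision function. The function and type names below are assumptions made for this sketch; the preset condition is taken to be "pen tip facing up", as stated above.

```c
/* Minimal sketch of the page-direction decision described above: the result
 * depends on whether the preset condition (pen tip facing up) is met and on
 * the sliding direction relative to the pen tip. Names are illustrative. */
#include <stdbool.h>

typedef enum { PAGE_DOWN, PAGE_UP } page_cmd_t;
typedef enum { SLIDE_AWAY_FROM_TIP, SLIDE_TOWARD_TIP } slide_dir_t;

/* tip_up corresponds to "the status information meets the preset condition". */
static page_cmd_t recognize_page_turn(bool tip_up, slide_dir_t dir)
{
    if (dir == SLIDE_AWAY_FROM_TIP)
        return tip_up ? PAGE_DOWN : PAGE_UP;
    else /* SLIDE_TOWARD_TIP */
        return tip_up ? PAGE_UP : PAGE_DOWN;
}
```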
  • the method further includes: switching the stylus from a first mode to a second mode; in the first mode, the stylus can respond to a trigger operation at any position of the touch film; in the second mode, the stylus responds to a trigger operation in the first area and does not respond to a trigger operation in a second area of the touch film other than the first area; and the stylus receiving the first operation for the first area includes: the stylus, in the second mode, receiving the first operation for the first area. In this way, the stylus can switch from the first mode to the second mode before controlling the target device and receive the user's operation on the first area in the second mode, thereby avoiding accidental touches that would be caused by receiving the user's operations in the first mode.
  • the first mode may be the writing mode described in the embodiments of this application
  • the second mode may be the page-turning pen mode described in the embodiments of this application.
  • the stylus includes a touch integrated circuit (IC) and a microcontroller unit (MCU).
  • the method further includes: the MCU sending a first instruction to the touch IC, where the first instruction is used to instruct the stylus to switch from the first mode to the second mode; the touch IC acquiring, based on the second mode, the first trigger data corresponding to the first operation and sending the first trigger data to the MCU; and the stylus recognizing the first operation includes: the MCU recognizing the first trigger data.
  • In this way, the touch IC in the stylus can obtain only the first trigger data corresponding to the first operation according to the second mode, thereby reducing the memory usage that would be caused by obtaining trigger data for the second area.
  • the stylus includes a touch IC and an MCU, and the method further includes: the touch IC obtaining second trigger data and sending the second trigger data to the MCU, where the second trigger data is the trigger data for the touch film and includes the first trigger data corresponding to the first operation; and the stylus recognizing the first operation includes: the MCU recognizing the first trigger data based on the second mode.
  • In this way, the MCU in the stylus can receive the second trigger data sent by the touch IC and, based on the second mode, filter it down to the first trigger data, so that the stylus can improve the accuracy of operation recognition based on the first trigger data.
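The second variant above (the touch IC reports all trigger data for the touch film, and the MCU keeps only the first-area samples while in the second mode) could be sketched as follows. The data structure, the assumed grid placement of the first area, and all names are illustrative assumptions, not definitions from the application.

```c
/* Illustrative sketch of MCU-side filtering of the "second trigger data"
 * down to the "first trigger data" when the stylus is in the second mode. */
#include <stddef.h>
#include <stdbool.h>

typedef enum { MODE_WRITING = 1, MODE_PAGE_TURNING = 2 } stylus_mode_t;

typedef struct {
    int row;        /* channel row on the touch film    */
    int col;        /* channel column on the touch film */
    int cap_delta;  /* capacitance change at the channel */
} trigger_sample_t;

static bool in_first_area(const trigger_sample_t *s)
{
    /* Assumed layout: the first area is one channel wide (the column over
     * the cut surface) and four channels long, per the 1x4 example. */
    return s->col == 0 && s->row >= 1 && s->row <= 4;
}

/* Returns the number of samples kept in `out` (the "first trigger data"). */
static size_t mcu_filter_trigger_data(stylus_mode_t mode,
                                      const trigger_sample_t *in, size_t n,
                                      trigger_sample_t *out)
{
    size_t kept = 0;
    for (size_t i = 0; i < n; i++) {
        /* In the first (writing) mode every position is accepted; in the
         * second mode only the first area at the cut surface is accepted. */
        if (mode == MODE_WRITING || in_first_area(&in[i]))
            out[kept++] = in[i];
    }
    return kept;
}
```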
  • the stylus recognizing the first operation, obtaining the recognition result, and controlling the target device based on the recognition result includes: the stylus recognizing the first operation, obtaining the recognition result, and controlling, based on the recognition result, the movement of the cursor in the target device and/or changes in the target device page. In this way, the stylus can not only turn pages on the target device but also control the movement of the cursor and changes in the page of the target device, enhancing the flexibility of controlling the target device.
  • embodiments of the present application provide a device based on a stylus.
  • Part or all of the body of the stylus is surrounded by a touch film.
  • the body of the stylus is provided with a cut surface.
  • the touch film includes a first area; the first area is smaller than the pen body area covered by the touch film, and the first area is located at the cut surface.
  • when the first operation is a sliding operation away from the pen tip of the stylus, the recognition result is used to instruct the target device to turn the page down; or, when the first operation is a sliding operation toward the pen tip of the stylus, the recognition result is used to instruct the target device to turn the page up.
  • the processing unit is further configured to obtain the status information of the stylus, where the status information of the stylus is used to indicate the orientation of the pen tip of the stylus; the recognition unit is further configured to recognize the first operation and the status information to obtain the recognition result; and the processing unit is further configured to control the target device based on the recognition result.
  • when the status information meets the preset condition and the first operation is a sliding operation away from the pen tip of the stylus, the recognition result is used to instruct the target device to turn the page down; or, when the status information does not meet the preset condition and the first operation is a sliding operation away from the pen tip of the stylus, the recognition result is used to instruct the target device to turn the page up; or, when the status information meets the preset condition and the first operation is a sliding operation close to the pen tip of the stylus, the recognition result is used to instruct the target device to turn the page up; or, when the status information does not meet the preset condition and the first operation is a sliding operation close to the pen tip of the stylus, the recognition result is used to instruct the target device to turn the page down.
  • the processing unit is further configured to switch from the first mode to the second mode; in the first mode, the stylus can respond to a trigger operation at any position of the touch film; in the second mode, the stylus responds to a trigger operation in the first area and does not respond to a trigger operation in the second area of the touch film other than the first area; and the processing unit is further configured to receive, in the second mode, the first operation for the first area.
  • the stylus includes a touch integrated circuit (IC) and a microcontroller unit (MCU).
  • the processing unit is further configured to send a first instruction to the touch IC, where the first instruction is used to instruct the stylus to switch from the first mode to the second mode; the processing unit is further configured to obtain, based on the second mode, the first trigger data corresponding to the first operation and send the first trigger data to the MCU; and the processing unit is further configured to recognize the first trigger data.
  • the stylus includes a touch IC and an MCU, and the processing unit is further configured to obtain the second trigger data and send the second trigger data to the MCU, where the second trigger data is the trigger data for the touch film and includes the first trigger data corresponding to the first operation; and the processing unit is further configured to recognize the first trigger data based on the second mode.
  • the recognition unit is further configured to recognize the first operation and obtain a recognition result
  • the processing unit is further configured to control the movement of the cursor in the target device and/or control the change of the target device page based on the recognition result.
  • embodiments of the present application provide a stylus, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor.
  • When the processor executes the computer program, the stylus is caused to perform the method described in the first aspect or any possible implementation of the first aspect.
  • embodiments of the present application provide a computer-readable storage medium.
  • A computer program is stored in the computer-readable storage medium.
  • When the computer program is executed by a processor, it causes the computer to perform the method described in the first aspect or any possible implementation of the first aspect.
  • embodiments of the present application provide a computer program product, including a computer program, which, when run, causes the computer to perform the method described in the first aspect or any possible implementation of the first aspect.
  • embodiments of the present application provide a system, including: a stylus and a target device.
  • the target device establishes a communication connection with the stylus.
  • Part or all of the body of the stylus is surrounded by a touch film.
  • the body of the pen is provided with a cut surface.
  • the touch film includes a first area. The first area is smaller than the area of the pen body covered by the touch film and the first area is located at the position of the cut surface.
  • the stylus is configured to: when receiving the first operation for the first area, recognize the first operation and obtain the recognition result; the stylus is further configured to: send the recognition result to the target device; and the target device is configured to: receive the recognition result and perform control based on the recognition result.
  • Figure 1 is a schematic diagram of a scenario provided by an embodiment of the present application.
  • Figure 2 is a schematic diagram of a touch film of a stylus provided by an embodiment of the present application.
  • Figure 3 is a schematic diagram of a holding posture provided by an embodiment of the present application.
  • Figure 4 is a schematic diagram of the hardware structure of a stylus provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the present application.
  • Figure 6 is a schematic diagram of a touch film of another stylus provided by an embodiment of the present application.
  • Figure 7 is a schematic flowchart of a stylus-based usage method provided by an embodiment of the present application.
  • Figure 8 is a schematic diagram of an interface for turning on the page-turning pen mode provided by an embodiment of the present application.
  • Figure 9 is a schematic diagram of a scene simulating a laser pointer based on a stylus according to an embodiment of the present application.
  • Figure 10 is a schematic diagram of a scene for controlling page turning based on a stylus according to an embodiment of the present application.
  • Figure 11 is a schematic diagram of another scenario for controlling page turning based on a stylus according to an embodiment of the present application.
  • Figure 12 is a schematic structural diagram of a stylus-based usage device provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of the hardware structure of another stylus provided by an embodiment of the present application.
  • words such as “first” and “second” are used to distinguish the same or similar items with basically the same functions and effects.
  • the first value and the second value are only used to distinguish different values, and their order is not limited.
  • words such as “first” and “second” do not limit the number and execution order.
  • At least one refers to one or more, and “plurality” refers to two or more.
  • “And/or” describes the association of associated objects, indicating that there can be three relationships, for example, A and/or B, which can mean: A exists alone, A and B exist simultaneously, and B exists alone, where A, B can be singular or plural.
  • the character “/” generally indicates that the related objects are in an “or” relationship.
  • “At least one of the following” or similar expressions thereof refers to any combination of these items, including any combination of a single item (items) or a plurality of items (items).
  • At least one of a, b, or c can represent: a, b, c, a and b, a and c, b and c, or a, b and c, where a, b, c can be single or multiple.
  • Figure 1 is a schematic diagram of a scenario provided by an embodiment of the present application.
  • the electronic device is a tablet as an example for illustration. This example does not constitute a limitation on the embodiment of the present application.
  • this scene may include an electronic device 200 and a stylus 100 .
  • the stylus 100 may be used to input data into the electronic device 200 , or may also be used to control the electronic device 200 .
  • both the stylus 100 and the electronic device 200 can support two working modes, such as a writing mode and a page-turning pen mode.
  • the writing mode may be a mode that the stylus 100 enters by default.
  • the user can use the stylus 100 to write on the touch screen of the electronic device 200, thereby inputting data into the electronic device 200.
  • In the page-turning pen mode, page-turning control of the electronic device 200 can be realized based on the operation of the user holding the stylus 100 and sliding up and down (or left and right) on the touch film.
  • the touch film can be a component surrounding the front end of the stylus and used to receive user trigger operations.
  • the touch film can be used to receive the user's up and down (or left and right) sliding operation in the page-turning pen mode.
  • FIG. 2 is a schematic diagram of a touch film of a stylus provided by an embodiment of the present application.
  • the front end of the stylus close to the pen tip is surrounded by a touch film.
  • the touch film can be at a position corresponding to the gray area shown in a in Figure 2 .
  • the touch film when the touch film is unfolded, the touch film may be a rectangular area with a width W and a height H.
  • the stylus can receive and identify operation data at any position in the touch film. For example, taking the page-turning pen mode where the user realizes page-turning control of the electronic device based on the stylus as an example, an example of the user's holding posture when triggering the stylus is provided.
  • FIG. 3 is a schematic diagram of a holding posture provided by an embodiment of the present application. It can be understood that the stylus can realize page-turning control of the electronic device based on recognition of the user's trigger on the touch film in any of the holding postures shown in a, b, or c in Figure 3.
  • However, the stylus can easily mistake the triggering of the touch film by the user's palm or multiple fingers for a page-turning operation in the page-turning pen mode, so the accuracy with which the stylus recognizes trigger operations is low, which affects the user experience of controlling electronic devices based on the stylus.
  • embodiments of the present application provide a method of using a stylus.
  • the stylus is surrounded by a touch film near the tip, and the stylus receives a first operation on the touch film; in response to the first operation, the stylus recognizes the operation in the first area of the touch film and obtains the recognition result, so that the stylus can recognize the operation based on the first area, avoiding accidental touches caused by the user triggering other positions of the touch film, thereby improving the accuracy with which users use the stylus to control electronic devices.
  • the first area is smaller than the corresponding area of the touch film.
  • FIG. 4 is a schematic diagram of the hardware structure of a stylus provided by an embodiment of the present application.
  • the stylus may also be called a touch pen, a capacitive pen, an inductive pen, etc.
  • the stylus 100 may have a processor 110 .
  • Processor 110 may include storage and processing circuitry for supporting operation of stylus 100 .
  • Storage and processing circuitry may include storage devices such as non-volatile memory (e.g., flash memory or other electrically programmable read-only memory configured as a solid-state drive), volatile memory (e.g., static or dynamic random access memory), etc.
  • the processing circuitry in the processor 110 may be used to control the operation of the stylus 100 .
  • the processing circuit may be implemented based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio chips, or application specific integrated circuits.
  • the processor 110 may include modules such as a touch integrated circuit (IC) and a microcontroller unit (MCU).
  • the touch IC can be used to obtain the trigger operation in the touch film;
  • the MCU is used to determine the working mode of the stylus and identify the trigger operation based on the working mode.
  • the sensor may include pressure sensor 120 .
  • the pressure sensor 120 may be provided on the writing end of the stylus pen 100 .
  • the pressure sensor 120 can also be provided in the pen holder 20 of the stylus pen 100 . In this way, after one end of the pen tip of the stylus pen 100 receives force, the other end of the pen tip moves and applies force to the pressure sensor 120 .
  • the processor 110 can adjust the line thickness of the tip of the stylus pen 100 when writing according to the pressure detected by the pressure sensor 120 .
  • Sensors may also include inertial sensors 130 .
  • Inertial sensor 130 may include a three-axis accelerometer and a three-axis gyroscope sensor, and/or other components for measuring the motion of stylus 100; for example, a three-axis magnetometer may be included in the sensor in a nine-axis inertial sensor configuration.
  • the gyro sensor can be used to detect the pen tip orientation of the stylus pen 100.
  • For example, the gyro sensor can be used to detect whether the pen tip of the stylus is facing up or facing down.
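A rough sketch of how the tip orientation might be derived is shown below. The application only states that a gyro (or other inertial) sensor detects the pen tip orientation; this example instead uses the gravity reading of a three-axis accelerometer, with an assumed axis convention, as one possible realization.

```c
/* Assumed sketch: derive pen-tip orientation from the accelerometer in
 * inertial sensor 130. Convention assumed here: +x points toward the tip,
 * readings are in g units, and the pen is roughly at rest. */
#include <stdbool.h>

typedef struct { float x, y, z; } accel_sample_t;

/* Returns true when the pen tip faces up (the "preset condition"),
 * false when it faces down. */
static bool tip_faces_up(accel_sample_t a)
{
    /* With +x toward the tip, a resting sensor reads roughly +1 g along x
     * when the tip points up and roughly -1 g when it points down. */
    return a.x > 0.0f;
}
```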
  • Sensors may also include additional sensors such as temperature sensors, ambient light sensors, light-based proximity sensors, contact sensors, magnetic sensors, pressure sensors, and/or other sensors.
  • a status indicator 140 such as a light emitting diode and a button 150 may be included in the stylus 100 .
  • the status indicator 140 is used to prompt the user of the status of the stylus pen 100 .
  • Buttons 150 may include mechanical buttons and non-mechanical buttons, and buttons 150 may be used to collect button press information from users.
  • the stylus 100 may include one or more electrodes 160, one of which may be located at the writing end of the stylus 100, and one of which may be located within the nib. Please refer to the above related descriptions.
  • Sensing circuit 170 may be included in stylus 100. Sensing circuitry 170 may sense capacitive coupling between electrodes 160 and drive lines of the capacitive touch sensor panel that interacts with stylus 100. Sensing circuitry 170 may include an amplifier to receive capacitance readings from the capacitive touch sensor panel, a clock to generate a demodulation signal, a phase shifter to generate a phase-shifted demodulation signal, a mixer to demodulate the capacitance readings using in-phase and quadrature demodulation frequency components, etc. The results of the mixer demodulation can be used to determine an amplitude proportional to the capacitance, so that the stylus 100 can sense contact with the capacitive touch sensor panel.
  • the stylus 100 may include a microphone, a speaker, an audio generator, a vibrator, a camera, a data port, and other devices.
  • the user can control the operation of the stylus 100 and the electronic device 200 that interacts with the stylus 100 by providing commands using these devices, and receive status information and other output.
  • the processor 110 may be used to run software on the stylus 100 that controls the operation of the stylus 100 .
  • software running on processor 110 may process sensor input, button input, and input from other devices to monitor movement of stylus 100 and other user input.
  • Software running on processor 110 can detect user commands and can communicate with the electronic device 200.
  • the stylus 100 may include a wireless module.
  • the wireless module is the Bluetooth module 180 as an example for illustration.
  • the wireless module can also be a WI-FI hotspot module, WI-FI point-to-point module, etc.
  • Bluetooth module 180 may include a radio frequency transceiver, such as a transceiver.
  • Bluetooth module 180 may also include one or more antennas.
  • the transceiver may utilize an antenna to transmit and/or receive wireless signals, which, based on the type of wireless module, may be Bluetooth signals, wireless LAN signals, long-range signals such as cellular phone signals, near field communication signals, or other wireless signals.
  • the stylus 100 may also include a charging module 190 , and the charging module 190 may support charging of the stylus 100 and provide power for the stylus 100 .
  • the electronic device 200 in the embodiment of the present application can be called a user equipment (UE), a terminal, etc.
  • the electronic device 200 can be a tablet computer (portable android device, PAD), a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device or wearable device with wireless communication functions, a virtual reality (VR) electronic device, an augmented reality (AR) electronic device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical, a wireless terminal in smart grid, a wireless terminal in transportation safety, a wireless terminal in smart city, a wireless terminal in smart home, or another mobile terminal or fixed terminal with a touch screen.
  • the form of the electronic device is not specifically limited.
  • FIG. 5 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the present application.
  • electronic device 200 may include multiple subsystems that cooperate to perform, coordinate, or monitor one or more operations or functions of electronic device 200.
  • Electronic device 200 includes processor 210, input surface 220, coordination engine 230, power subsystem 240, power connector 250, wireless interface 260, and display 270.
  • the coordination engine 230 may be used to: communicate and/or process data with other subsystems of the electronic device 200; communicate and/or transact data with the stylus 100; measure and/or obtain the output of one or more analog or digital sensors, such as a touch sensor; measure and/or obtain the output of one or more sensor nodes of an array of sensor nodes, such as an array of capacitive sensing nodes; receive and locate tip signals and ring signals from the stylus 100; position the stylus 100 based on the positions of the tip signal intersection area and the ring signal intersection area; and so on.
  • the processor 210 may be configured to receive control instructions from the stylus, and the control instructions may be used to instruct the electronic device 200 to perform responsive control.
  • For example, when the stylus receives a user's trigger operation in the preset area, the stylus can recognize the trigger operation and send a control instruction corresponding to the recognition result to the electronic device 200, so that the electronic device 200 can execute the control instruction.
  • the electronic device 200 can perform page turning control according to the control instruction.
  • processor 210 may be configured to perform, coordinate, and/or manage the functions of electronic device 200 .
  • Such functionality may include, but is not limited to: communicating and/or transacting data with other subsystems of the electronic device 200, communicating and/or transacting data with the stylus 100, communicating and/or transacting data via a wireless interface, communicating and/or transacting data via a wired interface, facilitating the exchange of power via a wireless (e.g., inductive, resonant, etc.) or wired interface, receiving the position and angular position of one or more styluses, and so on.
  • Processor 210 may be implemented as any electronic device capable of processing, receiving, or sending data or instructions.
  • a processor may be a microprocessor, a central processing unit, an application-specific integrated circuit, a field-programmable gate array, a digital signal processor, an analog circuit, a digital circuit, or a combination of these devices.
  • a processor can be a single-threaded or multi-threaded processor.
  • the processor can be a single-core or multi-core processor.
  • processor 210 may be configured to access memory in which instructions are stored.
  • the instructions may be configured to cause the processor to perform, coordinate, or monitor one or more operations or functions of electronic device 200 .
  • the instructions stored in memory may be configured to control or coordinate the operation of other components of electronic device 200, such as, but not limited to: another processor, analog or digital circuitry, volatile or non-volatile memory modules, Displays, speakers, microphones, rotary input devices, buttons or other physical input devices, biometric sensors and/or systems, force or touch input/output components, communication modules (such as wireless interfaces and/or power connectors), and/or Tactile feedback device.
  • the memory may also store electronic data that can be used by the stylus or processor.
  • memory may store electronic data or content (such as media files, documents, and applications), device settings and preferences, timing signals and control signals, data, data structures, or databases for various modules, files or configurations related to detecting tip signals and/or ring signals, etc.
  • the memory can be configured as any type of memory.
  • the memory may be implemented as random access memory, read-only memory, flash memory, removable memory, other types of storage elements, or combinations of such devices.
  • the memory may also store the stylus that has been connected to the electronic device 200, the working mode corresponding to the stylus, and the usage time of the stylus in different working modes, so that the next time the electronic device establishes a connection with the stylus, the working mode with the longest usage time is sent to the stylus, avoiding frequent switching of working modes by the stylus.
  • Electronic device 200 also includes power subsystem 240.
  • Power subsystem 240 may include a battery or other power source. Power subsystem 240 may be configured to provide power to electronic device 200 . Power subsystem 240 may also be coupled to power connector 250 .
  • Electronic device 200 also includes a wireless interface 260 to facilitate electronic communication between electronic device 200 and stylus 100 .
  • the electronic device 200 may be configured to use a low-energy Bluetooth communication interface.
  • the electronic device 200 may receive a recognition result from a stylus based on the wireless interface 260 .
  • the electronic device 200 also communicates with the stylus 100 using a near field communication interface.
  • the communication interface facilitates electronic communication between electronic device 200 and external communication networks, devices, or platforms.
  • the wireless interface 260 may be implemented as one or more wireless interfaces, a Bluetooth interface, a near field communication interface, a magnetic interface, a universal serial bus (USB) interface, an inductive interface, a resonant interface, a capacitive coupling interface, a Wi-Fi interface, a TCP/IP interface, a network communication interface, an optical interface, an acoustic interface, or any traditional communication interface.
  • Electronic device 200 also includes display 270 .
  • Display 270 may be located behind input surface 220 or may be integrated therewith.
  • Processor 210 may use display 270 to present information to the user. In many cases, processor 210 uses display 270 to present an interface with which the user can interact. In many cases, the user manipulates the stylus 100 to interact with the interface. In this embodiment of the present application, the display 270 can display an interface controlled by a stylus pen.
  • FIG. 6 is a schematic diagram of a touch film of another stylus provided by an embodiment of the present application.
  • the body of the stylus can be provided with at least one cut surface 601.
  • the section 601 is provided with a preset area 602 (or can also be called the first area) near the pen tip.
  • the preset area 602 corresponding to the cut surface 601 can be used to receive the user's trigger operation on the preset area 602 when the stylus is in the page-turning pen mode.
  • the cut plane is a plane without curvature, which extends longitudinally along the body of the stylus.
  • the cut surface can indicate a position on the body of the stylus, so that the user can quickly distinguish the cut area from the non-cut areas just by holding the stylus.
  • For example, as shown in b in Figure 6, the stylus can divide the touch film with width W and height H into multiple channels in a 6×6 grid, recognize the user's trigger operation on the preset area 602 of 1×4 size at the cut surface as the trigger operation in the page-turning pen mode, and then use the recognition result to control the electronic device.
  • the preset area 602 with a size of 1×4 can match the size of the user's finger when triggering with a single finger. It can be understood that this can avoid accidental touches caused by multiple fingers triggering in the preset area 602 and improve the accuracy of operation recognition.
  • the size of the preset area 602 at the cut surface position may also be 1×3, 1×5, or 1×6.
  • the preset area can be understood as a long strip area provided at the cut surface of the stylus body.
  • the size of the preset area may be related to the size of the grid resulting from the division of the touch film, the size of the cut surface, and the size of the finger. For example, when the touch film is divided into a 12×12 grid, the size of the preset area 602 may be 2×6, 2×7, or 3×7, which is not limited in the embodiments of this application.
  • the 6×6 grid division of the touch film and the 1×4 size of the preset area are only an example, and this example does not constitute a limitation on the embodiments of the present application.
  • For example, when the touch film is divided into a 6×6 grid, the size of the preset area may also be 1×3, 1×4, or 1×5; or, when the touch film is divided into a 7×7 grid, the size of the preset area may also be 1×3, 1×4, 1×5, 1×6, or 1×7, etc., which is not limited in the embodiments of the present application.
  • In order to prevent the stylus from recognizing trigger operations at any position in the touch film, which results in low accuracy in recognizing the user's trigger operation, the stylus can recognize only the trigger operation in the preset area of the touch film at the cut surface position and ignore trigger operations at other locations in the touch film outside the preset area, thereby reducing false touches and improving the accuracy with which the stylus recognizes trigger operations.
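The channel division and preset-area check described above might look like the following sketch. The 6×6 grid and the 1×4 strip come from the example in the text; which column lies over the cut surface and which rows the strip spans are assumptions made for this illustration.

```c
/* Illustrative mapping of a touch point on the W x H touch film to a
 * channel in a 6x6 grid, plus a check against the assumed 1x4 preset area. */
#include <stdbool.h>

#define GRID_ROWS 6
#define GRID_COLS 6

/* Assumed placement of the 1x4 preset area 602: column 0, rows 1..4. */
#define PRESET_COL       0
#define PRESET_ROW_FIRST 1
#define PRESET_ROW_LAST  4

/* Map a touch point (x along the unrolled width W, y along the height H,
 * in the same length unit as w and h) to channel indices. */
static void point_to_channel(float x, float y, float w, float h,
                             int *row, int *col)
{
    *col = (int)(x / w * GRID_COLS);
    *row = (int)(y / h * GRID_ROWS);
    if (*col >= GRID_COLS) *col = GRID_COLS - 1;
    if (*row >= GRID_ROWS) *row = GRID_ROWS - 1;
}

static bool channel_in_preset_area(int row, int col)
{
    return col == PRESET_COL &&
           row >= PRESET_ROW_FIRST && row <= PRESET_ROW_LAST;
}
```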
  • FIG. 7 is a schematic flowchart of a stylus-based usage method provided by an embodiment of the present application.
  • the stylus-based usage method may involve an electronic device and a stylus.
  • the electronic device may be a tablet, and the stylus may include a touch IC and an MCU as an example. Examples are not intended to limit the embodiments of this application.
  • the stylus-based usage method may include the following steps:
  • the electronic device establishes a communication connection with the stylus.
  • the stylus and the electronic device can be interconnected through a communication network to achieve wireless signal interaction.
  • the communication network may be, but is not limited to, a WI-FI hotspot network, a WI-FI peer-to-peer (P2P) network, a Bluetooth network, a zigbee network, a near field communication (NFC) network, or another short-distance communication network; this is not limited in the embodiments of this application.
  • connection between the stylus and the electronic device based on Bluetooth will be taken as an example for illustration.
  • the electronic device turns on the page-turning pen mode.
  • the page-turning pen mode can be turned on by the user's triggering operation on the electronic device.
  • FIG. 8 is a schematic diagram of an interface for turning on the page-turning pen mode provided by an embodiment of the present application.
  • the electronic device may display an interface as shown in a in Figure 8 .
  • the interface may include: a text box 801 for searching for setting items, a control for viewing account information, a control for setting WLAN, a control for setting Bluetooth, etc.
  • the account information can be 1234567XXXX;
  • the operation by which the user opens the settings function can be a trigger operation on the control corresponding to the settings function on the desktop of the electronic device, a trigger operation on the thumbnail corresponding to the settings function in the multi-task background interface of the electronic device, a voice operation, etc., which is not limited in the embodiments of the present application.
  • the trigger operation may be a click operation, a long press operation, a sliding operation, a voice operation or other gesture operations, etc., which is not limited in the embodiments of the present application.
  • when the electronic device receives the user's search input for the page-turning pen mode in the text box 801 for searching for setting items, the electronic device can open the function interface corresponding to the page-turning pen mode and display the interface shown in b in Figure 8.
  • the interface may include: a control 802 for turning on the page-turning pen mode, and prompt information corresponding to turning on the page-turning pen.
  • the prompt information may be displayed as: after turning on the page-turning pen mode, you can use the stylus to control the electronic device to achieve functions such as turning pages.
  • the stylus can also turn on the page-turning pen mode based on the user's trigger operation on the stylus in the writing mode. For example, the stylus can switch from the writing mode to the page-turning pen mode based on the user's multiple tapping operations on the touch film, or based on the user's long press or multiple presses on the stylus tip, and then the stylus executes the steps shown in S703.
  • the electronic device sends a message for instructing the stylus to switch to the page-turning pen mode to the stylus.
  • the electronic device can send the message for instructing the stylus to switch to the page-turning pen mode to the stylus based on Bluetooth or other methods.
  • the stylus can receive a message sent by the electronic device for instructing the stylus to switch to the page-turning pen mode.
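A minimal sketch of how the stylus might handle the S703 message is given below. The message format, opcode value, and function names are assumptions for illustration; the application only states that the stylus receives a message (for example over Bluetooth) instructing it to switch to the page-turning pen mode.

```c
/* Assumed handling of the mode-switch message on the stylus MCU. */
#include <stdint.h>

typedef enum { MODE_WRITING = 1, MODE_PAGE_TURNING = 2 } stylus_mode_t;

#define MSG_SET_PAGE_TURNING_MODE 0x01   /* assumed opcode */

static stylus_mode_t g_mode = MODE_WRITING;   /* writing mode by default */

static void touch_ic_set_mode(stylus_mode_t mode)
{
    /* Stub: in a real firmware this would forward the "first instruction"
     * to the touch IC over the internal bus (e.g. I2C/SPI). */
    (void)mode;
}

static void on_ble_message(const uint8_t *payload, uint32_t len)
{
    if (len >= 1 && payload[0] == MSG_SET_PAGE_TURNING_MODE) {
        g_mode = MODE_PAGE_TURNING;
        touch_ic_set_mode(g_mode);   /* see method one described in S704 */
    }
}
```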
  • When the stylus receives the user's trigger operation on the touch film, the stylus recognizes the trigger operation in the preset area of the touch film and obtains the recognition result.
  • when the stylus receives a user's trigger operation on the touch film, the stylus can recognize the trigger operation in the preset area of the touch film based on either of the following two methods.
  • the MCU in the stylus can send an instruction message containing the page-turning pen mode to the touch IC, instructing the touch IC to only receive trigger operations corresponding to the preset area.
  • the touch IC can send the trigger data corresponding to the trigger operation received in the preset area to the MCU, and the MCU will identify it and obtain the identification result.
  • the trigger data may include: the position corresponding to the trigger operation, the capacitance change during the trigger, etc.
  • the touch IC can activate only the channels corresponding to the preset area to receive the trigger operation according to the instruction message indicating the page-turning pen mode, so that the touch IC receives only the trigger operation corresponding to the preset area and obtains the trigger data corresponding to that trigger operation.
  • Alternatively, the touch IC can normally receive the user's trigger operation at any position of the touch film and send the trigger data corresponding to the trigger operation to the MCU; the MCU can, based on the received message used to instruct the stylus to switch to the page-turning pen mode, recognize the trigger data corresponding to the preset area and obtain the recognition result. It can also be understood that the MCU filters out trigger data outside the preset area of the touch film.
  • For example, the touch IC can send the trigger data and the channels corresponding to the trigger data to the MCU, so that the MCU can, according to the message used to instruct the stylus to switch to the page-turning pen mode, recognize the trigger data for the channels of the preset area and obtain the recognition result.
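One way the MCU could turn the preset-area trigger data of S704 (positions and capacitance changes) into a sliding-direction result is sketched below. The sample layout, the one-channel threshold, and the convention that higher row indices are closer to the pen tip are assumptions made for this example.

```c
/* Assumed classification of a sequence of preset-area trigger samples into
 * a sliding-direction recognition result. */
#include <stddef.h>

typedef enum { SLIDE_NONE, SLIDE_TOWARD_TIP, SLIDE_AWAY_FROM_TIP } slide_result_t;

typedef struct {
    int row;        /* channel index along the pen axis (higher = nearer the tip) */
    int cap_delta;  /* capacitance change at the channel */
} preset_sample_t;

static slide_result_t classify_slide(const preset_sample_t *s, size_t n)
{
    if (n < 2)
        return SLIDE_NONE;

    int delta = s[n - 1].row - s[0].row;   /* net movement along the pen axis */

    if (delta >= 1)
        return SLIDE_TOWARD_TIP;
    if (delta <= -1)
        return SLIDE_AWAY_FROM_TIP;
    return SLIDE_NONE;
}
```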
  • the stylus sends a message indicating the recognition result to the electronic device.
  • the stylus may also send the message indicating the recognition result to the electronic device based on Bluetooth.
  • the electronic device can receive the recognition result sent by the stylus.
  • the electronic device performs control based on the recognition result.
  • embodiments of the present application can provide a variety of stylus-based control methods for electronic devices.
  • the user can simulate a laser pointer based on the trigger operation on the stylus and display a laser point in the electronic device (see the corresponding embodiment in Figure 9); or, the user can also implement page-turning control of the electronic device based on the user's trigger operation on the stylus (see the corresponding embodiment in Figure 10).
  • a user can simulate a laser pointer to display a laser point in the electronic device based on triggering operations on the stylus.
  • FIG. 9 is a schematic diagram of a scenario for simulating a laser pointer based on a stylus according to an embodiment of the present application.
  • when the stylus receives the user's pressing operation (or long-press operation) in the preset area in the step shown in S704, the stylus can send the recognition result corresponding to the pressing operation to the electronic device through S705, so that the electronic device displays the laser point 901 in the step shown in S706. Users can use the stylus to simulate a laser pointer and provide visual demonstrations during business presentations.
  • the interface displayed by the electronic device can be the homepage of the slide show.
  • the interface can include: the laser point 901, information indicating page 1, text information displayed as an introduction to the stylus, information showing the sharing person: User A, and time information indicating July 7, 2022, etc.
  • the electronic device when the user ends the pressing operation on the preset area of the stylus, the electronic device can cancel the display of the laser point.
  • the user can simulate the laser pointer to display the laser point in the electronic device based on the pressing operation on the preset area of the stylus, and improve the accuracy of pressing operation recognition.
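A minimal sketch of the laser-pointer behaviour is given below, assuming hypothetical command codes and function names: a press in the preset area makes the electronic device show the laser point 901, and releasing the press cancels the display.

```c
/* Assumed press-and-hold handling for the laser-pointer simulation. */
#include <stdbool.h>
#include <stdint.h>

#define CMD_LASER_POINT_SHOW 0x20   /* assumed command codes */
#define CMD_LASER_POINT_HIDE 0x21

static void ble_send_cmd(uint8_t cmd)
{
    /* Stub: would transmit the recognition result to the electronic device. */
    (void)cmd;
}

static bool g_laser_shown = false;

static void on_preset_area_press_state(bool pressed)
{
    if (pressed && !g_laser_shown) {
        ble_send_cmd(CMD_LASER_POINT_SHOW);   /* display laser point 901 */
        g_laser_shown = true;
    } else if (!pressed && g_laser_shown) {
        ble_send_cmd(CMD_LASER_POINT_HIDE);   /* press ended: cancel display */
        g_laser_shown = false;
    }
}
```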
  • the user can also control the page turning of the electronic device based on the user's triggering operation of the stylus.
  • FIG. 10 is a schematic diagram of a scenario for controlling page turning based on a stylus according to an embodiment of the present application.
  • when the stylus receives the user's sliding operation away from the pen tip in the preset area in the step shown in S704, the stylus can send the recognition result corresponding to the sliding operation to the electronic device through S705, so that the electronic device can turn pages in the step shown in S706. For example, the electronic device can turn the page down.
  • For example, the electronic device can switch from the interface shown in a in Figure 10 to the interface shown in b in Figure 10.
  • when the stylus receives the user's sliding operation close to the pen tip in the preset area in the step shown in S704, the stylus can send the recognition result corresponding to the sliding operation to the electronic device through S705, so that the electronic device can turn pages in the step shown in S706.
  • the electronic device can turn pages upward.
  • For example, the electronic device can switch from the interface shown in b in Figure 10 to the interface shown in a in Figure 10.
  • the interface displayed by the electronic device shown in b in FIG. 10 may include: information indicating that the page is 2, a definition introduction of the stylus, etc.
  • when the user attaches the stylus to the bottom of the electronic device, or the user turns off the page-turning pen mode switch in the interface shown in b in Figure 8, or the user taps the touch film with the stylus multiple times, etc., the stylus can exit the page-turning pen mode.
  • the user can control the electronic device to turn pages based on the sliding operation of the preset area of the stylus, and improve the accuracy of page turning recognition.
  • the stylus can also control electronic devices according to the direction of the pen tip.
  • In the interfaces shown in a and b in Figure 10, with the tip of the stylus pointing upward, when the stylus receives the user's sliding operation away from the pen tip in the preset area, the stylus can control the electronic device to turn the page down; with the pen tip facing up, when the stylus receives the user's sliding operation near the pen tip position in the preset area, the stylus can control the electronic device to turn the page up.
  • the stylus may be provided with a gyroscope sensor or other sensor for detecting the orientation of the stylus.
  • Alternatively, when the tip of the stylus faces downward and the stylus receives the user's sliding operation close to the pen tip in the preset area, the stylus can control the electronic device to turn the page down; or, with the tip of the stylus facing downward, when the stylus receives the user's sliding operation away from the pen tip in the preset area, the stylus can control the electronic device to turn the page up.
  • In this way, the stylus in the embodiment of the present application can recognize operations based on the preset area of the touch film, avoiding accidental touches caused by the user triggering other positions on the touch film, thereby improving the accuracy with which users use the stylus to control electronic devices.
  • when the electronic device displays document content, web page content, etc., the stylus can also realize continuous movement control of the cursor in the electronic device, or continuous page-turning control of the document content, web page content, etc.
  • FIG. 11 is a schematic diagram of another scenario of page turning controlled by a stylus according to an embodiment of the present application.
  • the electronic device can display document content, and the document content can include a cursor 1101.
  • the interface displayed by the electronic device may also include: a control for indicating completion of document editing, a save control, an undo typing control, a repeat typing control, a control for exiting the document, a control for page up, a control for page down, and a flag indicating the location of the currently displayed content in the document.
  • the document content displayed by the electronic device in the scene shown in b in FIG. 11 may be different from the document content displayed by the electronic device in the scene shown in a in FIG. 11 .
  • when the stylus receives the user's sliding operation away from the pen tip in the preset area in the step shown in S704, the stylus can send the recognition result corresponding to the sliding operation to the electronic device through S705, so that the electronic device controls the cursor to move downward continuously in the step shown in S706. Further, when the user stops operating on the preset area, the electronic device can end control of the cursor. For example, the electronic device can control the movement of the cursor according to the user's sliding on the stylus.
  • For example, the electronic device can display the interface shown in a in Figure 11 when the user has not yet operated the stylus, and display the interface shown in b in Figure 11 when the user finishes operating the stylus. It is understandable that the movement of the cursor will lead to changes in the content displayed on the page.
  • Alternatively, when the stylus receives the user's sliding operation away from the pen tip in the preset area, the stylus can also control the page displayed by the electronic device to turn downward continuously. Further, when the user stops operating on the preset area, the electronic device can end the page-turning operation.
  • Similarly, when the stylus receives the user's sliding operation close to the pen tip in the preset area in the step shown in S704, the stylus can send the recognition result corresponding to the sliding operation to the electronic device through S705, so that the electronic device controls the cursor to move upward continuously in the step shown in S706. Further, when the user stops operating on the preset area, the electronic device can end control of the cursor. For example, the electronic device can control the movement of the cursor according to the user's sliding on the stylus: the electronic device can display the interface shown in b in Figure 11 when the user has not yet operated the stylus, and display the interface shown in a in Figure 11 when the user finishes operating the stylus.
  • Alternatively, when the stylus receives the user's sliding operation close to the pen tip in the preset area, the stylus can also control the page displayed by the electronic device to turn upward continuously. Further, when the user stops operating on the preset area, the electronic device can end the page-turning operation.
  • the specific process is similar to the way a stylus controls the cursor movement in an electronic device, and will not be described again here.
  • The stylus can also control the electronic device according to the orientation of the pen tip.
  • With the tip of the stylus facing upward, when the stylus receives the user's sliding in the preset area away from the pen tip, the stylus can control the page displayed on the electronic device to turn downward continuously (or the cursor displayed on the electronic device to move downward continuously); or, with the tip of the stylus facing upward, when the stylus receives the user's sliding in the preset area toward the pen tip, the stylus can control the page displayed on the electronic device to turn upward continuously (or the cursor displayed on the electronic device to move upward continuously).
  • With the tip of the stylus facing downward, when the stylus receives the user's sliding in the preset area toward the pen tip, the stylus can control the page displayed on the electronic device to turn downward continuously (or the cursor displayed on the electronic device to move downward continuously); or, with the tip of the stylus facing downward, when the stylus receives the user's sliding in the preset area away from the pen tip, the stylus can control the page displayed on the electronic device to turn upward continuously (or the cursor displayed on the electronic device to move upward continuously).
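  • The orientation-dependent behaviour above can be summarized as a small lookup table, as in the following sketch; the orientation and gesture labels are assumptions, and in practice the stylus would obtain the pen-tip orientation from a sensor such as the gyroscope mentioned elsewhere in this application.

```python
# Hypothetical mapping from (pen-tip orientation, slide direction in the preset
# area) to the resulting page-turn (or cursor-move) direction, following the
# examples above; the string labels are assumptions for illustration only.

PAGE_TURN_MAP = {
    ("tip_up",   "away_from_tip"): "page_down",  # or: cursor moves downward
    ("tip_up",   "toward_tip"):    "page_up",    # or: cursor moves upward
    ("tip_down", "toward_tip"):    "page_down",
    ("tip_down", "away_from_tip"): "page_up",
}

def page_turn_for(orientation: str, slide: str) -> str:
    """Return the page-turn command for a slide recognized in the preset area."""
    return PAGE_TURN_MAP[(orientation, slide)]

assert page_turn_for("tip_up", "away_from_tip") == "page_down"
```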
  • Based on this, the stylus can recognize operations on the preset area of the touch film together with the orientation of the stylus, avoiding accidental touches when the user triggers other positions on the touch film and improving the accuracy with which the user controls the electronic device using the stylus.
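  • One way to realize "respond only to the preset area" is to filter trigger data by touch-film channel, in line with the 6×6 channel grid and the 1×4 preset region given as an example in the description; the concrete channel indices and data structures below are assumptions, not the actual implementation.

```python
GRID_ROWS, GRID_COLS = 6, 6   # example channel grid from the description
# Assumed indices of the 1x4 preset-area strip at the cut surface, near the tip.
PRESET_CHANNELS = {(0, 0), (1, 0), (2, 0), (3, 0)}
assert all(0 <= r < GRID_ROWS and 0 <= c < GRID_COLS for r, c in PRESET_CHANNELS)

def filter_to_preset_area(trigger_points):
    """Keep only trigger data that falls inside the preset area.

    trigger_points: iterable of (row, col, capacitance_change) tuples reported
    by the touch IC. Everything outside the preset area is ignored, which is
    what suppresses accidental touches by the palm or by other fingers.
    """
    return [p for p in trigger_points if (p[0], p[1]) in PRESET_CHANNELS]

# usage sketch: a palm touch at (4, 3) is discarded, a finger slide at (1, 0) is kept
print(filter_to_preset_area([(4, 3, 12), (1, 0, 35)]))   # -> [(1, 0, 35)]
```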
  • Figure 12 is a schematic structural diagram of a stylus-based usage device provided by an embodiment of the present application.
  • The stylus-based usage device may be the wearable device in the embodiments of the present application, or it may be a chip or system-on-chip within the wearable device.
  • the stylus-based usage device 1200 may be used in communication equipment, circuits, hardware components or chips.
  • the stylus-based usage device includes: an identification unit 1201 and a processing unit 1202 .
  • The recognition unit 1201 is used to support the stylus-based usage device 1200 in performing the data recognition steps.
  • The processing unit 1202 is used to support the stylus-based usage device 1200 in performing the data processing steps.
  • the embodiment of the present application provides a stylus-based usage device 1200.
  • Part or all of the body of the stylus is surrounded by a touch film.
  • the body of the stylus is provided with a cut surface.
  • The touch film includes a first area; the first area is smaller than the pen-body area covered by the touch film, and the first area is located at the cut-surface position.
  • When the processing unit 1202 receives the first operation directed at the first area, the recognition unit 1201 is used to recognize the first operation and obtain a recognition result, and the processing unit 1202 is configured to control the target device based on the recognition result, the target device having established a communication connection with the stylus.
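  • As an object-level sketch only, the following shows how the recognition unit 1201 and the processing unit 1202 of the device 1200 could cooperate; the class and method names and the gesture representation are assumptions rather than the actual implementation.

```python
class RecognitionUnit:                     # plays the role of recognition unit 1201
    def recognize(self, first_operation):
        """Recognize a first operation received on the first area."""
        # assumed: the operation carries the slide direction relative to the pen tip
        return {"gesture": first_operation["direction"]}

class ProcessingUnit:                      # plays the role of processing unit 1202
    def __init__(self, link):
        self.link = link                   # communication connection to the target device
    def control_target_device(self, recognition_result):
        """Control the target device based on the recognition result."""
        self.link.send(recognition_result)

class PrintLink:                           # dummy link used only for this sketch
    def send(self, message):
        print("to target device:", message)

unit_1201, unit_1202 = RecognitionUnit(), ProcessingUnit(PrintLink())
unit_1202.control_target_device(unit_1201.recognize({"direction": "away_from_tip"}))
```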
  • the stylus-based usage device 1200 may also include a communication unit 1203 .
  • the communication unit 1203 is configured to support the stylus-based usage device 1200 to perform the steps of sending data and receiving data.
  • the communication unit 1203 may be an input or output interface, a pin or a circuit, etc.
  • the stylus-based usage device 1200 may further include: a storage unit 1204 .
  • the processing unit 1202 and the storage unit 1204 are connected through lines.
  • the storage unit 1204 may include one or more memories, which may be devices used to store programs or data in one or more devices or circuits.
  • the storage unit 1204 may exist independently and be connected to the processing unit 1202 of the stylus-based usage device through a communication line.
  • the storage unit 1204 may also be integrated with the processing unit 1202.
  • The storage unit 1204 may store computer-executable instructions of the methods in the wearable device, so that the processing unit 1202 executes the methods in the above embodiments.
  • the storage unit 1204 may be a register, cache, RAM, etc., and the storage unit 1204 may be integrated with the processing unit 1202.
  • the storage unit 1204 may be a read-only memory (ROM) or other type of static storage device that can store static information and instructions, and the storage unit 1204 may be independent from the processing unit 1202.
  • Figure 13 is a schematic diagram of the hardware structure of another stylus provided by an embodiment of the present application.
  • The stylus includes a processor 1301, a communication line 1304, and at least one communication interface (FIG. 13 takes the communication interface 1303 as an example for explanation).
  • The processor 1301 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits used to control the execution of the program of the present application.
  • Communication lines 1304 may include circuitry that communicates information between the components described above.
  • the communication interface 1303 uses any device such as a transceiver to communicate with other devices or communication networks, such as Ethernet, wireless local area networks (WLAN), etc.
  • the stylus may also include memory 1302.
  • Memory 1302 may be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, or a random access memory (RAM) or another type of dynamic storage device that can store information and instructions; it may also be an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage, optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto.
  • the memory may exist independently and be connected to the processor through a communication line 1304. Memory can also be integrated with the processor.
  • The memory 1302 is used to store the computer-executable instructions for executing the solution of the present application, and the processor 1301 controls their execution.
  • The processor 1301 is used to execute the computer-executable instructions stored in the memory 1302, thereby implementing the wearing detection method provided by the embodiments of the present application.
  • The computer-executable instructions in the embodiments of the present application may also be called application code, which is not specifically limited in the embodiments of the present application.
  • the processor 1301 may include one or more CPUs, such as CPU0 and CPU1 in FIG. 13 .
  • the stylus may include multiple processors, such as processor 1301 and processor 1305 in Figure 13 .
  • Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor.
  • a processor here may refer to one or more devices, circuits, and/or processing cores for processing data (eg, computer program instructions).
  • The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part.
  • the computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
  • Computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (e.g., coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, or microwave).
  • The computer-readable storage medium may be any available medium that a computer can store, or a data storage device such as a server or data center that integrates one or more available media.
  • Available media may include magnetic media (e.g., floppy disks, hard disks, or tapes), optical media (e.g., digital versatile discs (DVD)), or semiconductor media (e.g., solid-state disks (SSD)), etc.
  • Embodiments of the present application also provide a system.
  • the system includes a stylus and a target device.
  • the target device establishes a communication connection with the stylus.
  • Part or all of the body of the stylus is surrounded by a touch film.
  • The body of the stylus is provided with a cut surface, the touch film includes a first area, the first area is smaller than the pen-body area covered by the touch film, and the first area is located at the cut-surface position. The stylus is used to: recognize the first operation and obtain a recognition result when a first operation directed at the first area is received; the stylus is also used to: send the recognition result to the target device; and the target device is used to: receive the recognition result and control the target device based on the recognition result.
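  • To illustrate the division of work in this system, the sketch below pairs a stylus-side step (recognize the first operation on the first area and send the recognition result) with a target-device-side step (receive the result and act on it). The in-memory link, the message format, and the Display helper are assumptions; in the described system the result would travel over the established communication connection, such as Bluetooth.

```python
from collections import deque

class Link:
    """Toy stand-in for the communication connection between stylus and target device."""
    def __init__(self):
        self.queue = deque()
    def send(self, message):
        self.queue.append(message)
    def receive(self):
        return self.queue.popleft()

class Display:
    """Toy stand-in for the page shown on the target device."""
    def __init__(self):
        self.page = 1
    def page_down(self):
        self.page += 1
    def page_up(self):
        self.page = max(1, self.page - 1)

def stylus_side(first_operation, link):
    """Recognize a first operation on the first area and send the recognition result."""
    if first_operation["area"] == "first_area":               # other areas are ignored
        link.send({"gesture": first_operation["direction"]})  # the recognition result

def target_device_side(link, display):
    """Receive the recognition result and control the target device based on it."""
    result = link.receive()
    if result["gesture"] == "away_from_tip":
        display.page_down()
    elif result["gesture"] == "toward_tip":
        display.page_up()

link, display = Link(), Display()
stylus_side({"area": "first_area", "direction": "away_from_tip"}, link)
target_device_side(link, display)
print(display.page)   # -> 2
```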
  • Computer-readable media may include computer storage media and communication media and may include any medium that can transfer a computer program from one place to another.
  • the storage media can be any target media that can be accessed by the computer.
  • the computer-readable medium may include compact disc read-only memory (CD-ROM), RAM, ROM, EEPROM or other optical disk storage; the computer-readable medium may include a magnetic disk memory or other disk storage device.
  • any connection line is also properly termed a computer-readable medium.
  • For example, if software is transmitted from a website, server, or other remote source using coaxial cable, fiber-optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave, then coaxial cable, fiber-optic cable, twisted pair, DSL, and wireless technologies such as infrared, radio, and microwave are included in the definition of medium.
  • Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks typically reproduce data magnetically, while discs reproduce data optically using lasers.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

本申请实施例提供一种基于触控笔的使用方法和装置,涉及终端技术领域,触控笔的部分或全部笔身围绕有触控薄膜,触控笔的笔身处设置有切面,触控薄膜中包括第一区域,第一区域小于触控薄膜覆盖的笔身区域且第一区域位于切面位置处,方法包括:当触控笔接收到针对第一区域的第一操作时,触控笔识别第一操作,得到识别结果,并基于识别结果控制目标设备,目标设备与触控笔建立有通信连接。这样,可以通过针对触控薄膜与切面重叠处的第一区域的触发数据的识别,提高操作识别的准确性,进而提高用户使用触控笔控制目标设备的使用体验。

Description

基于触控笔的使用方法和装置
本申请要求于2022年08月09日提交中国国家知识产权局、申请号为202210951471.5、申请名称为“基于触控笔的使用方法和装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端技术领域,尤其涉及一种基于触控笔的使用方法和装置。
背景技术
随着互联网的普及和发展,人们对于电子设备的功能需求也越发多样化。例如,为了简化使用电子设备的方式,触控笔逐渐成为向电子设备中输入数据,或者控制电子设备的重要工具。
通常情况下,触控笔靠近笔头的位置处包裹有触控薄膜,使得触控笔可以基于用户对于触控薄膜的触发操作,控制电子设备等功能。
然而,上述基于触控笔的使用方法的准确性较低,影响用户基于触控笔控制电子设备的使用体验。
发明内容
本申请实施例提供一种基于触控笔的使用方法和装置,可以通过针对触控薄膜与切面重叠处的第一区域的触发数据的识别,提高操作识别的准确性,进而提高用户使用触控笔控制电子设备的使用体验。
第一方面,本申请实施例提供一种基于触控笔的使用方法,触控笔的部分或全部笔身围绕有触控薄膜,触控笔的笔身处设置有切面,触控薄膜中包括第一区域,第一区域小于触控薄膜覆盖的笔身区域且第一区域位于切面位置处,方法包括:当触控笔接收到针对第一区域的第一操作时,触控笔识别第一操作,得到识别结果,并基于识别结果控制目标设备,目标设备与触控笔建立有通信连接。这样,可以通过针对触控薄膜与切面重叠处的第一区域的触发数据的识别,提高操作识别的准确性,进而提高用户使用触控笔控制目标设备的使用体验。
其中,该第一区域可以为本申请实施例中描述的预设区域;目标设备可以为本申请实施例中描述的电子设备。
在一种可能的实现方式中,当第一操作为向远离触控笔的笔尖位置处的滑动操作时,识别结果用于指示目标设备向下翻页;或者,当第一操作为向靠近触控笔的笔尖位置处的滑动操作时,识别结果用于指示目标设备向上翻页。这样,触控笔可以根据用户针对第一区域的滑动操作,控制目标设备进行翻页,避免用户触发到触控薄膜的其他位置时带来的误触,进而提高用户使用触控笔控制目标设备的准确性。
在一种可能的实现方式中,方法还包括:触控笔获取触控笔的状态信息;触控笔的状态信息用于指示触控笔的笔尖朝向;触控笔识别第一操作,得到识别结果,并基 于识别结果控制目标设备,包括:触控笔识别第一操作以及状态信息,得到识别结果,并基于识别结果控制目标设备。这样,触控笔可以根据触控笔的笔尖朝向以及针对第一区域的滑动操作共同对目标设备进行翻页控制,使得用户可以利用触控笔对于目标设备进行灵活控制。
在一种可能的实现方式中,当状态信息满足预设条件,且第一操作为向远离触控笔的笔尖位置处的滑动操作时,识别结果用于指示目标设备向下翻页;或者,当状态信息不满足预设条件,且第一操作为远离触控笔的笔尖位置处的滑动操作时,识别结果用于指示目标设备向上翻页;或者,当状态信息满足预设条件,且第一操作为向靠近触控笔的笔尖位置处的滑动操作时,识别结果用于指示目标设备向上翻页;或者,当状态信息不满足预设条件,且第一操作为靠近触控笔的笔尖位置处的滑动操作时,识别结果用于指示目标设备向下翻页。这样,触控笔可以根据触控笔的笔尖朝向以及针对第一区域的滑动操作共同对目标设备进行翻页控制,使得用户可以利用触控笔对于目标设备进行灵活控制。
其中,该预设条件可以为本申请实施例中描述的笔尖朝上;不满于预设条件可以为本申请实施例中描述的笔尖朝下。
在一种可能的实现方式中,方法还包括:触控笔从第一模式切换到第二模式;其中,在第一模式中,触控笔能够响应触控薄膜任意位置的触发操作;在第二模式中,触控笔能够响应第一区域中的触发操作,且不响应触控薄膜中除第一区域外的第二区域的触发操作;触控笔接收到针对第一区域的第一操作,包括:处于第二模式的触控笔接收到针对第一区域的第一操作。这样,触控笔可以在对目标设备进行控制前,由第一模式切换至第二模式,并在第二模式中接收用户针对第一区域的操作,避免在第一模式中接收用户针对第一区域的第一操作时带来的误触。
其中,该第一模式可以为本申请实施例中描述的书写模式,第二模式可以为本申请实施例中描述的翻页笔模式。
在一种可能的实现方式中,触控笔中包括触控集成电路板IC以及微控制单元MCU,方法还包括:MCU向触控IC发送第一指令;第一指令用于指示触控笔从第一模式切换到第二模式;触控IC基于第二模式,获取第一操作对应的第一触发数据,并将第一触发数据发送至MCU;触控笔识别第一操作,包括:MCU识别第一触发数据。这样,使得触控笔中的触控IC可以根据第二模式,仅获取该第一操作对应的第一触发操作,减少获取第二区域的触发操作带来的内存占用。
在一种可能的实现方式中,触控笔中包括触控IC以及MCU,方法还包括:触控IC获取第二触发数据,并将第二触发数据发送至MCU;其中,第二触发数据为针对触控薄膜的触发数据,第二触发数据中包括第一操作对应的第一触发数据;触控笔识别第一操作,包括:MCU基于第二模式识别第一触发数据。这样,触控笔中的MCU可以接收触控IC发来第二触发数据,并通过第二模式筛选出第一触发数据,使得触控笔可以基于第一触发数据提高操作识别的准确性。
在一种可能的实现方式中,触控笔识别第一操作,得到识别结果,并基于识别结果控制目标设备,包括:触控笔识别第一操作,得到识别结果,并基于识别结果控制目标设备中光标的移动和/或控制目标设备页面的变化。这样,触控笔不仅可以实现对 于目标设备的翻页,还可以控制光标的移动以及目标设备页面的变化,增强控制目标设备的灵活性。
第二方面,本申请实施例提供一种基于触控笔的使用装置,触控笔的部分或全部笔身围绕有触控薄膜,触控笔的笔身处设置有切面,触控薄膜中包括第一区域,第一区域小于触控薄膜覆盖的笔身区域且第一区域位于切面位置处,当处理单元接收到针对第一区域的第一操作时,识别单元用于,识别第一操作,得到识别结果,处理单元用于基于识别结果控制目标设备,目标设备与触控笔建立有通信连接。
在一种可能的实现方式中,当第一操作为向远离触控笔的笔尖位置处的滑动操作时,识别结果用于指示目标设备向下翻页;或者,当第一操作为向靠近触控笔的笔尖位置处的滑动操作时,识别结果用于指示目标设备向上翻页。
在一种可能的实现方式中,处理单元,还用于获取触控笔的状态信息;触控笔的状态信息用于指示触控笔的笔尖朝向;识别单元,还用于识别第一操作以及状态信息,得到识别结果,处理单元,还用于基于识别结果控制目标设备。
在一种可能的实现方式中,当状态信息满足预设条件,且第一操作为向远离触控笔的笔尖位置处的滑动操作时,识别结果用于指示目标设备向下翻页;或者,当状态信息不满足预设条件,且第一操作为远离触控笔的笔尖位置处的滑动操作时,识别结果用于指示目标设备向上翻页;或者,当状态信息满足预设条件,且第一操作为向靠近触控笔的笔尖位置处的滑动操作时,识别结果用于指示目标设备向上翻页;或者,当状态信息不满足预设条件,且第一操作为靠近触控笔的笔尖位置处的滑动操作时,识别结果用于指示目标设备向下翻页。
在一种可能的实现方式中,处理单元,还用于从第一模式切换到第二模式;其中,在第一模式中,触控笔能够响应触控薄膜任意位置的触发操作;在第二模式中,触控笔能够响应第一区域中的触发操作,且不响应触控薄膜中除第一区域外的第二区域的触发操作;处理单元,还用于在第二模式下接收到针对第一区域的第一操作。
在一种可能的实现方式中,触控笔中包括触控集成电路板IC以及微控制单元MCU,处理单元,还用于向触控IC发送第一指令;第一指令用于指示触控笔从第一模式切换到第二模式;处理单元,还用于基于第二模式,获取第一操作对应的第一触发数据,并将第一触发数据发送至MCU;处理单元,还用于识别第一触发数据。
在一种可能的实现方式中,触控笔中包括触控IC以及MCU,所处理单元,还用于获取第二触发数据,并将第二触发数据发送至MCU;其中,第二触发数据为针对触控薄膜的触发数据,第二触发数据中包括第一操作对应的第一触发数据;处理单元,还用于基于第二模式识别第一触发数据。
在一种可能的实现方式中,识别单元,还用于识别第一操作,得到识别结果,处理单元,还用于基于识别结果控制目标设备中光标的移动和/或控制目标设备页面的变化。
第三方面,本申请实施例提供一种触控笔,包括存储器、处理器以及存储在存储器中并可在处理器上运行的计算机程序,处理器执行计算机程序时,使得触控笔执行如第一方面或第一方面中任一种可能的实现方式中描述的方法。
第四方面,本申请实施例提供一种计算机可读存储介质,计算机可读存储介质存 储有计算机程序,计算机程序被处理器执行时,使得计算机执行如第一方面或第一方面中任一种可能的实现方式中描述的方法。
第五方面,本申请实施例提供一种计算机程序产品,包括计算机程序,当计算机程序被运行时,使得计算机执行如第一方面或第一方面中任一种可能的实现方式中描述的方法。
第六方面,本申请实施例提供一种系统,包括:触控笔以及目标设备,目标设备与触控笔建立有通信连接,触控笔的部分或全部笔身围绕有触控薄膜,触控笔的笔身处设置有切面,触控薄膜中包括第一区域,第一区域小于触控薄膜覆盖的笔身区域且第一区域位于切面位置处,触控笔用于:当接收到针对第一区域的第一操作时,识别第一操作,得到识别结果;触控笔还用于:将识别结果发送至目标设备;目标设备用于:接收识别结果,并基于识别结果控制目标设备。
应当理解的是,本申请的第三方面至第六方面与本申请的第一方面的技术方案相对应,各方面及对应的可行实施方式所取得的有益效果相似,不再赘述。
附图说明
图1为本申请实施例提供的一种场景示意图;
图2为本申请实施例提供的一种触控笔的触控薄膜的示意图;
图3为本申请实施例提供的一种握持姿势的示意图;
图4为本申请实施例提供的一种触控笔的硬件结构示意图;
图5为本申请实施例提供的一种电子设备的硬件结构示意图;
图6为本申请实施例提供的另一种触控笔的触控薄膜的示意图;
图7为本申请实施例提供的一种基于触控笔的使用方法的流程示意图;
图8为本申请实施例提供的一种开启翻页笔模式的界面示意图;
图9为本申请实施例提供的一种基于触控笔模拟激光笔的场景示意图;
图10为本申请实施例提供的一种基于激光笔控制翻页的场景示意图;
图11为本申请实施例提供的另一种基于触控笔控制翻页的场景示意图;
图12为本申请实施例提供的一种基于触控笔的使用装置的结构示意图;
图13为本申请实施例提供的另一种触控笔的硬件结构示意图。
具体实施方式
为了便于清楚描述本申请实施例的技术方案,在本申请的实施例中,采用了“第一”、“第二”等字样对功能和作用基本相同的相同项或相似项进行区分。例如,第一值和第二值仅仅是为了区分不同的值,并不对其先后顺序进行限定。本领域技术人员可以理解“第一”、“第二”等字样并不对数量和执行次序进行限定,并且“第一”、“第二”等字样也并不限定一定不同。
需要说明的是,本申请中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其他实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。
本申请中,“至少一个”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A,B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一项(个)”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b,或c中的至少一项(个),可以表示:a,b,c,a和b,a和c,b和c,或a、b和c,其中a,b,c可以是单个,也可以是多个。
示例性的,图1为本申请实施例提供的一种场景示意图。在图1对应的实施例中,以电子设备为平板为例进行示例说明,该示例并不构成对本申请实施例的限定。
如图1所示,该场景中可以包括电子设备200以及触控笔100,触控笔100可以用于向电子设备200中输入数据,或者也可以用于对电子设备200进行控制。
通常情况下,触控笔100以及电子设备200均可以支持两种工作模式,如书写模式以及翻页笔模式。其中,书写模式可以为触控笔100默认进入的模式,在书写模式下,用户可以利用触控笔100在电子设备200的触控屏中书写,进而实现向电子设备200中输入数据。在翻页笔模式下,可以基于用户握持触控笔100并在触控薄膜中上下(或左右)滑动的操作,实现对于电子设备200的翻页控制。
可以理解的是,触控薄膜可以为围绕于触控笔前端、用于接收用户触发操作的部件。例如,该触控薄膜可以用于接收用户在翻页笔模式下上下(或左右)滑动的操作等。
示例性的,图2为本申请实施例提供的一种触控笔的触控薄膜的示意图。如图2中的a所示,触控笔靠近笔头位置处的前端围绕有触控薄膜,该触控薄膜可以为图2中的a所示的灰色区域对应的位置。如图2中的b所示,当展开该触控薄膜时,该触控薄膜可以为宽为W高为H的矩形区域。
通常情况下,在用户触发该触控薄膜后,触控笔可以实现对于触控薄膜中任意位置处操作数据的接收以及识别。例如,以翻页笔模式下,用户基于触控笔实现对于电子设备的翻页控制为例,对用户触发触控笔时的握持姿势进行示例说明。示例性的,图3为本申请实施例提供的一种握持姿势的示意图。可以理解的是,触控笔可以基于用户在图3中的a、图3中的b或图3中的c中任一种针对触控薄膜的握持姿势的识别,实现对于电子设备的翻页控制。
然而,由于用户在触控薄膜中触发时的握持姿势千差万别,使得触控笔很容易将用户的手掌或多个手指针对触摸薄膜的触发误认为翻页笔模式下的翻页操作,导致触控笔识别触发操作的准确率较低,进而影响用户基于触控笔控制电子设备的使用体验。
有鉴于此,本申请实施例提供一种基于触控笔的使用方法,触控笔靠近笔尖位置处围绕有触控薄膜,触控笔接收针对触控薄膜的第一操作;响应于第一操作,触控笔识别触控薄膜的第一区域中的操作,得到识别结果,使得触控笔可以基于第一区域处的操作的识别,避免用户触发到触控薄膜的其他位置时带来的误触,进而提高用户使用触控笔控制电子设备的准确性。其中,第一区域小于触控薄膜对应的区域。
示例性的,图4为本申请实施例提供的一种触控笔的硬件结构示意图。
其中,触控笔也可以称为手写笔、电容笔、或电感笔等。如图4所示,触控笔100可以具有处理器110。处理器110可以包括用于支持触控笔100的操作的存储和处理电路。 存储和处理电路可以包括诸如非易失性存储器的存储装置(例如,闪存存储器或构造为固态驱动器的其它电可编程只读存储器)、易失性存储器(例如,静态或动态随机存取存储器)等。处理器110中的处理电路可以用来控制触控笔100的操作。处理电路可以基于一个或多个微处理器、微控制器、数字信号处理器、基带处理器、电源管理单元、音频芯片、或专用集成电路等实现。
可以理解的是,处理器110中可以包括:触控集成电路板(integrated circuit,IC)、以及微控制单元(microcontroller unit,MCU)等模块。
其中,触控IC可以用于实现对于触控薄膜中的触发操作的获取;MCU用于确定触控笔所处的工作模式,以及基于工作模式对触发操作进行识别。
触控笔100中可以包括一个或多个传感器。例如,传感器可以包括压力传感器120。压力传感器120可以设置在触控笔100的书写端。当然,压力传感器120还可以设在触控笔100的笔杆20内,这样,触控笔100的笔尖一端受力后,笔尖的另一端移动将力作用到压力传感器120。在一种实施例中,处理器110根据压力传感器120检测到的压力大小可以调整触控笔100的笔尖书写时的线条粗细。
传感器也可以包括惯性传感器130。惯性传感器130可以包括三轴加速计和三轴陀螺仪传感器,和/或,用于测量触控笔100的运动的其它部件,例如,三-轴磁力计可以以九-轴惯性传感器的构造被包括在传感器中。本申请实施例中,该陀螺仪传感器可以用于检测触控笔100的笔尖朝向,例如陀螺仪传感器可以用于检测触控笔的笔尖朝向、以及用于检测触控笔的笔尖朝下等状态。
传感器也可以包括附加的传感器,诸如温度传感器、环境光传感器、基于光的接近传感器、接触传感器、磁传感器、压力传感器和/或其它传感器。
触控笔100中可以包括如发光二极管的状态指示器140和按钮150。状态指示器140用于向用户提示触控笔100的状态。按钮150可以包括机械按钮和非机械按钮,按钮150可以用于从用户收集按钮按压信息。
本申请实施例中,触控笔100中可以包括一个或多个电极160,其中一个电极160可以位于触控笔100的书写端处,其中一个电极160可以位于笔尖内,可以参照上述的相关描述。
触控笔100中可以包括感测电路170。感测电路170可感测位于电极160和与触控笔100交互的电容触摸传感器面板的驱动线之间的电容耦合。感测电路170可以包括用以接收来自电容触摸传感器面板的电容读数的放大器、用以生成解调信号的时钟、用以生成相移的解调信号的相移器、用以使用同相解调频率分量来解调电容读数的混频器、以及用以使用正交解调频率分量来解调电容读数的混频器等。混频器解调的结果可用于确定与电容成比例的振幅,使得触控笔100可以感测到与电容触摸传感器面板的接触。
可以理解的是,根据实际需求,在触控笔100可以包括麦克风、扬声器、音频发生器、振动器、相机、数据端口以及其它设备。用户可以通过利用这些设备提供命令来控制触控笔100和与触控笔100交互的电子设备200的操作,并且接收状态信息和其它输出。
处理器110可以用于运行触控笔100上的控制触控笔100的操作的软件。触控笔100的操作过程中,运行在处理器110上的软件可以处理传感器输入、按钮输入和来自其它装置的输入以监视触控笔100的移动和其它用户输入。在处理器110上运行的软件可以检测 用户命令并且可以与电子设备200通信。
为了支持触控笔100与电子设备200的无线通信,触控笔100可以包括无线模块。图4中以无线模块为蓝牙模块180为例进行说明。无线模块还可以为WI-FI热点模块、WI-FI点对点模块等。蓝牙模块180可以包括射频收发器,例如收发器。蓝牙模块180也可以包括一个或多个天线。收发器可以利用天线发射和/或接收无线信号,无线信号基于无线模块的类型,可以是蓝牙信号、无线局域网信号、诸如蜂窝电话信号的远程信号、近场通信信号或其它无线信号。
触控笔100还可以包括充电模块190,充电模块190可以支持触控笔100的充电,为触控笔100提供电力。
可以理解的是,本申请实施例中的电子设备200可以称为用户设备(user equipment,UE)、终端(terminal)等,例如,电子设备200可以为平板电脑(portable android device,PAD)、个人数字处理(personal digital assistant,PDA)、具有无线通信功能的手持设备、计算设备、车载设备或可穿戴设备,虚拟现实(virtual reality,VR)电子设备、增强现实(augmented reality,AR)电子设备、工业控制(industrial control)中的无线终端、无人驾驶(self driving)中的无线终端、远程医疗(remote medical)中的无线终端、智能电网(smart grid)中的无线终端、运输安全(transportation safety)中的无线终端、智慧城市(smart city)中的无线终端、智慧家庭(smart home)中的无线终端等具有触控屏的移动终端或固定终端。本申请实施例中对电子设备的形态不做具体限定。
示例性的,图5为本申请实施例提供的一种电子设备的硬件结构示意图。
参照图5,电子设备200可以包括多个子系统,这些子系统协作以执行、协调或监控电子设备202的一个或多个操作或功能。电子设备200包括处理器210、输入表面220、协调引擎230、电源子系统240、电源连接器250、无线接口260和显示器270。
示例性的,协调引擎230可以用于与电子设备200的其他子系统进行通信和/或处理数据;与触控笔100通信和/或交易数据;测量和/或获得一个或多个模拟或数字传感器(诸如触摸传感器)的输出;测量和/或获得传感器节点阵列(诸如电容感测节点的阵列)的一个或多个传感器节点的输出;接收和定位来自触控笔100的尖端信号和环信号;基于尖端信号交叉区域和环形信号交叉区域的位置来定位触控笔100等。
处理器210可以用于接收来自触控笔的控制指令,该控制指令可以用于指示电子设备200执行响应的控制。本申请实施例中,当触控笔接收到用户在预设区域中的触发操作时,触控笔可以对该触发操作进行识别,并将该识别结果对应的控制指令发送至电子设备200,使得电子设备200可以执行该控制指令。例如,电子设备200可以根据该控制指令进行翻页控制。
一般而言,处理器210可被配置为执行、协调和/或管理电子设备200的功能。此类功能可以包括但不限于:与电子设备200的其他子系统通信和/或交易数据,与触控笔100通信和/或交易数据,通过无线接口进行数据通信和/或交易数据,通过有线接口进行数据通信和/或交易数据,促进通过无线(例如,电感式、谐振式等)或有线接口进行电力交换,接收一个或多个触笔的位置和角位置等。
处理器210可被实现为能够处理、接收或发送数据或指令的任何电子设备。例如,处理器可以是微处理器、中央处理单元、专用集成电路、现场可编程门阵列、数字信号处理 器、模拟电路、数字电路或这些设备的组合。处理器可以是单线程或多线程处理器。处理器可以是单核或多核处理器。
在使用期间,处理器210可被配置为访问存储有指令的存储器。该指令可被配置为使处理器执行、协调或监视电子设备200的一个或多个操作或功能。
存储在存储器中的指令可被配置为控制或协调电子设备200的其他部件的操作,该部件诸如但不限于:另一处理器、模拟或数字电路、易失性或非易失性存储器模块、显示器、扬声器、麦克风、旋转输入设备、按钮或其他物理输入设备、生物认证传感器和/或系统、力或触摸输入/输出部件、通信模块(诸如无线接口和/或电源连接器),和/或触觉反馈设备。
存储器还可存储可由触控笔或处理器使用的电子数据。例如,存储器可以存储电子数据或内容(诸如媒体文件、文档和应用程序)、设备设置和偏好、定时信号和控制信号或者用于各种模块的数据、数据结构或者数据库,与检测尖端信号和/或环信号相关的文件或者配置等等。存储器可被配置为任何类型的存储器。例如,存储器可被实现作为随机存取存储器、只读存储器、闪存存储器、可移动存储器、其他类型的存储元件或此类设备的组合。
本申请实施例中,存储器也可以与电子设备200连接过的触控笔、该触控笔对应的工作模式、以及触控笔在不同工作模式下的使用时长,使得电子设备可以在下次与触控笔建立连接时,将使用时长最长的工作模式发送至触控笔,避免触控笔进行工作模式的频繁切换。
电子设备200还包括电源子系统240。电源子系统240可包括电池或其它电源。电源子系统240可被配置为向电子设备200提供电力。电源子系统240还可耦接到电源连接器250。
电子设备200还包括无线接口260,以促进电子设备200与触控笔100之间的电子通信。本申请实施例中,电子设备200可被配置为经由低能量蓝牙通信接口,例如电子设备200可以基于无线接口260接收来自触控笔的识别结果。
在一些实施例中,电子设备200也利用近场通信接口与触控笔100通信。在其他示例中,通信接口有利于电子设备200与外部通信网络、设备或平台之间的电子通信。
无线接口260无论是电子设备200与触控笔100之间的通信接口还是另外的通信接口)可被实现为一个或多个无线接口、蓝牙接口、近场通信接口、磁性接口、通用串行总线接口、电感接口、谐振接口,电容耦合接口、Wi-Fi接口、TCP/IP接口、网络通信接口、光学接口、声学接口或任何传统的通信接口。
电子设备200还包括显示器270。显示器270可以位于输入表面220后方,或者可以与其集成一体。处理器210可以使用显示器270向用户呈现信息。在很多情况下,处理器210使用显示器270来呈现用户可以与之交互的界面。在许多情况下,用户操纵触控笔100与界面进行交互。本申请实施例中,显示器270可以显示经过触控笔控制后的界面。
对于本领域的技术人员而言将显而易见的是,上文关于电子设备200所呈现的具体细节中的一些细节可为实践特定的实施方案或其等同物所不需要的。类似地,其他电子设备可以包括更多数量的子系统、模块、部件等。在适当的情况下,一些子模块可以被实现为软件或硬件。因此,应当理解,上述描述并非旨在穷举或将本公开限制于本文的精确形式。相反,对于本领域的普通技术人员而言将显而易见的是,根据上述教导内容,许多修改和变型是可能的。
下面以具体地实施例对本申请的技术方案以及本申请的技术方案如何解决上述技术问题进行详细说明。下面这几个具体的实施例可以独立实现,也可以相互结合,对于相同或相似的概念或过程可能在某些实施例中不再赘述。
本申请实施例中,对围绕于触控笔前端的触控薄膜进行示例说明。示例性的,图6为本申请实施例提供的另一种触控笔的触控薄膜的示意图。在图6中对应的实施例中,触控笔的笔身可以设置有至少一个切面601,该切面601靠近笔尖位置处设置有预设区域602(或也可以称为第一区域),该切面601位置处对应的预设区域602可以用于接收触控笔处于翻页笔模式下时用户针对预设区域602的触发操作。
其中,切面为没有弧度的平面,其沿着触控笔的笔身纵向延伸,切面可以指示触控笔的笔身位置,用户在使用触控笔时,可以仅仅通过握持便快速区分切面区域和非切面区域。
如图6中的a所示,触控笔可以将宽为W高为H的触控薄膜划分成6×6等尺寸的多个通道,并基于用户针对如图6中的b所示的切面处的1×4等尺寸的预设区域602的触发操作,对翻页笔模式下的触发操作进行识别,进而利用识别结果控制电子设备。
其中,该1×4等尺寸的预设区域602可以满足用户单指触发时手指的尺寸,可以理解的是这样可以避免多个手指在预设区域602中触发时带来的误触情况,提高操作识别的准确性。
可能的实现方式中,该切面位置处的预设区域602的尺寸也可以为1×3、1×5、或1×6等尺寸。
可以理解的是,该预设区域可以理解为设置在触控笔笔身的切面处的长条形区域。预设区域的尺寸可以与触控薄膜的划分所导致的网格的大小、切面的大小、以及手指的大小相关。例如,当该触控薄膜划分为12×12的网格时,该预设区域602的尺寸可以为2×6、2×7、或3×7等尺寸,本申请实施例中对此不做限定。
可以理解的是,对于触控薄膜6×6的尺寸划分,以及预设区域对应的1×4的尺寸仅作为一种示例,该示例并不能构成对本申请实施例的限定。例如,当该触控薄膜划分为5×5的多个通道时,该预设区域的尺寸也可以为1×3、1×4或1×5等;或者,当该触控薄膜划分为7×7的多个通道时,该预设区域的尺寸也可以为1×3、1×4、1×5、1×6、或1×7等,本申请实施例中对此不做限定。
可以理解的是,为了避免触控笔对于触控薄膜中任一位置处的触发操作的识别,使得识别用户的触发操作准确率低的情况,触控笔可以对触控薄膜中位于切面位置处预设区域的触发操作进行识别,并忽略触控薄膜中除预设区域以外的其他位置处的触发操作,降低误触情况,进而提高触控笔识别触发操作的准确性。
在图6对应的实施例的基础上,本申请实施例提供一种基于触控笔的使用方法。示例性的,图7为本申请实施例提供的一种基于触控笔的使用方法的流程示意图。在图7对应的实施例中,该基于触控笔的使用方法可以涉及电子设备以及触控笔,以电子设备可以为平板,触控笔中包括触控IC以及MCU为例进行示例说明,该示例并不能够成对本申请实施例的限定。
如图7所示,该基于触控笔的使用方法可以包括如下步骤:
S701、电子设备与触控笔建立通信连接。
其中,触控笔和电子设备之间,可以通过通信网络进行互联以实现无线信号的交互。 该通信网络可以但不限于为:WI-FI热点网络、WI-FI点对点(peer-to-peer,P2P)网络、蓝牙网络、zigbee网络或近场通信(near field communication,NFC)网络等近距离通信网络,本申请实施例中对此不做限定。
可以理解的是,本申请实施例中将以触控笔与电子设备之间基于蓝牙连接为例进行示例说明。
S702、电子设备开启翻页笔模式。
可以理解的是,由于触控笔开启时默认进入书写模式,因此可以通过用户针对电子设备的触发操作开启翻页笔模式。
示例性的,图8为本申请实施例提供的一种开启翻页笔模式的界面示意图。
当电子设备接收到用户打开设置功能的操作时,电子设备可以显示如图8中的a所示的界面。如图8中的a所示的界面,该界面中可以包括:用于搜索设置项的文本框801、用于查看账号信息的控件、用于设置WLAN的控件、用于设置蓝牙的控件、用于设置移动网络的控件、用于查看超级终端的控件等。其中,账号信息可以为1234567XXXX;该用户打开设置功能的操作可以为用户在电子设备的桌面中针对设置功能对应的控件的触发操作,或者也可以为用户在电子设备的多任务后台界面中针对设置功能对应的缩略图的触发操作,或者也可以为语音操作等,本申请实施例中对此不做限定。其中,该触发操作可以为点击操作、长按操作、滑动操作、语音操作或其他手势操作等,本申请实施例中对此不做限定。
如图8中的a所示的界面,当电子设备接收到用户在用于搜索设置项的文本框801中输入搜索“演示翻页笔模式”时,电子设备可以打开延时翻页笔模式对应的功能界面,并显示如图8中的b所示的界面。如图8中的b所示的界面中,该界面中可以包括:用于开启翻页笔模式的控件802,以及开启翻页笔对应的提示信息,该提示信息可以显示为:开启翻页笔模式后,可以利用触控笔控制电子设备实现翻页等功能。
在如图8中的b所示的界面中,在该翻页笔模式为关闭状态下,当电子设备接收到用户针对用于开启翻页笔模式的控件802的触发操作时,电子设备可以执行S703所示的步骤。
可能的实现方式中,触控笔也可以基于用户针对处于书写模式的触控笔的触发操作开启翻页笔模式。例如,触控笔可以基于用户针对触控薄膜的多次敲击操作,或者基于用户长时间按压或多次按压触控笔笔尖等操作等,使得触控笔由书写模式切换至翻页笔模式,进而触控笔执行S703所示的步骤。
S703、电子设备将用于指示触控笔切换至翻页笔模式的消息发送至触控笔。
示例性的,在电子设备与触控笔基于蓝牙实现通信连接的情况下,电子设备可以基于蓝牙等方式将该用于指示触控笔切换至翻页笔模式的消息发送至触控笔。适应的,触控笔可以接收到电子设备发送的用于指示触控笔切换至翻页笔模式的消息。
S704、当触控笔接收到用户针对触控薄膜的触发操作时,触控笔对触控薄膜中的预设区域处的触发操作进行识别,得到识别结果。
本申请实施例中,当触控笔接收到用户针对触控薄膜的触发操作时,触控笔可以基于下述两种方式对触控薄膜中的预设区域处的触发操作进识别。
一种实现中,在触控笔接收到来自电子设备的用于指示触控笔切换至翻页笔模式的消 息时,触控笔中的MCU可以将包含翻页笔模式的指示消息发送至触控IC,指示触控IC仅接收预设区域对应的触发操作。进一步的,触控IC可以将预设区域内接收到的触发操作对应的触发数据发送至MCU,由MCU进行识别,得到识别结果。其中,该触发数据可以包括:触发操作对应的位置、以及触发时的容值变化等。
其中,触控IC可以根据该用于指示翻页笔模式的指示消息,仅启动预设区域对应的部分通道进行触发操作的接收,使得触控IC可以接收预设区域对应的触发操作,得到该触发操作对应的触发数据。
可以理解的是,即使用户触发该触控薄膜中除预设区域以外的区域,触控IC也无法识别到用户的触发操作。
另一种实现中,触控IC可以正常接收用户针对触控薄膜任一位置处的触发操作,并将该触发操作对应的触发数据发送至MCU;MCU可以基于接收到的用于指示触控笔切换至翻页笔模式的消息,对预设区域处对应的触发数据进行识别,得到识别结果。或者也可以理解为MCU可以过滤掉触控薄膜中预设区域以外的触发数据。
其中,触控IC可以将触发数据以及该触发数据所对应的通道一并发送至MCU,使得MCU可以根据用于指示触控笔切换至翻页笔模式的消息,对预设区域所处通道处的触发数据进行识别,得到识别结果。
其中,该预设区域的含义可以参见图6对应的实施例中的描述,在此不再赘述。
S705、触控笔将用于指示识别结果的消息发送至电子设备。
示例性的,触控笔也可以基于蓝牙的方式将该用于指示识别结果的消息发送至电子设备。适应的,电子设备可以接收到触控笔发送的识别结果。
S706、电子设备基于识别结果控制电子设备。
其中,在翻页笔模式中,本申请实施例可以提供多种基于触控笔对于电子设备的控制方式。例如,用户可以基于针对触控笔的触发操作模拟激光笔,在电子设备中显示激光点(参见图9对应的实施例);或者,用户也可以基于用户针对触控笔的触发操作实现对于电子设备的翻页控制(参见图10对应的实施例)。
一种实现中,用户可以基于针对触控笔的触发操作模拟激光笔,在电子设备中显示激光点。
示例性的,图9为本申请实施例提供的一种基于触控笔模拟激光笔的场景示意图。
如图9中的a所示的场景,当触控笔在S704所示的步骤中接收到用户在预设区域中的按压操作(或长按操作)等时,触控笔可以将该按压操作对应的识别结果通过S705发送至电子设备,使得电子设备在S706所示的步骤中显示激光点901。用户可以利用触控笔模拟激光笔,在进行业务展示时提供可视化示范。
如图9中的a所示的电子设备显示的界面,该界面中可以为幻灯片的首页,该界面中可以包括:激光点901,用于指示页面为1的信息、显示为触控笔介绍的文字信息、显示为分享人:用户A的信息、以及用于指示时间为2022年7月7日的时间信息等。
可能的实现方式中,如图9中的b所示的场景,当用户结束针对触控笔中预设区域的按压操作等时,电子设备可以取消显示激光点。
可以理解的是,用户可以基于针对触控笔的预设区域的按压操作,模拟激光笔在电子设备中显示激光点,并且提高按压操作识别的准确性。
另一种实现中,用户也可以基于用户针对触控笔的触发操作实现对于电子设备的翻页控制。
示例性的,图10为本申请实施例提供的一种基于触控笔控制翻页的场景示意图。
如图10中的a所示的场景,当触控笔在S704所示的步骤中接收到用户在预设区域中的向远离笔尖位置处滑动等时,触控笔可以将该滑动操作对应的识别结果通过S705发送至电子设备,使得电子设备在S706所示的步骤中进行翻页,例如电子设备可以实现向下翻页,此时电子设备可以由图10中的a所示的电子设备显示的界面切换至图10中的b所示的电子设备显示的界面。
如图10中的b所示的场景,当触控笔在S704所示的步骤中接收到用户在预设区域中的向靠近笔尖位置处滑动等时,触控笔可以将该滑动操作对应的识别结果通过S705发送至电子设备,使得电子设备在S706所示的步骤中进行翻页,例如电子设备可以实现向上翻页,此时电子设备可以由图10中的b所示的电子设备显示的界面切换至图10中的a所示的电子设备显示的界面。其中,图10中的b所示的电子设备显示的界面可以包括:用于指示页面为2的信息、以及触控笔的定义介绍等。
可能的实现方式中,当用户将触控笔吸附到电子设备底部,或者用户在如图8中的b所示的界面中关闭演示翻页笔模式的开关,或者用户在触控笔中多次敲击触控薄膜等时,触控笔可以退出翻页笔模式。
可以理解的是,用户可以基于针对触控笔的预设区域的滑动操作,控制电子设备进行翻页,并且提高翻页识别的准确性。
可以理解的是,触控笔也可以根据笔尖的朝向对电子设备进行控制。如图10中的a所示的界面以及图10中的b所示的界面。在触控笔的笔尖朝上的情况下,当触控笔接收到用户在预设区域中远离笔尖位置处滑动时,触控笔可以控制电子设备向下翻页;或者,在触控笔的笔尖朝上的情况下,当触控笔接收到用户在预设区域中靠近笔尖位置处滑动时,触控笔可以控制电子设备向上翻页。其中,触控笔中可以设置有陀螺仪传感器等用于检测触控笔的朝向的传感器。
可能的实现方式中,在触控笔的笔尖朝下的情况下,当触控笔接收到用户在预设区域中靠近笔尖位置处滑动时,触控笔可以控制电子设备向下翻页;或者,在触控笔的笔尖朝下的情况下,当触控笔接收到用户在预设区域中远离笔尖位置处滑动时,触控笔可以控制电子设备向上翻页。
可以理解的是,不同触控笔的朝向、以及不同朝向状态下用于针对预设区域的不同触发操作,对于电子设备的控制仅作为一种示例,并不能构成对本申请实施例的限定。
基于图7对应的实施例中描述的内容,本申请实施例中的触控笔可以基于对触控薄膜中预设区域的操作的识别,避免用户触发到触控薄膜的其他位置时带来的误触,进而提高用户使用触控笔控制电子设备的准确性。
可能的实现方式中,当电子设备中显示文档内容、或者网页内容等时,触控笔也可以实现对于电子设备中的光标、或者对于文档内容以及网页内容等的连续翻页控制。
示例性的,图11为本申请实施例提供的另一种基于触控笔控制翻页的场景示意图。
如图11中的a所示的场景,电子设备可以显示文档内容,该文档内容中可以包括光标1101。电子设备显示的界面中还可以包括:用于指示完成文档编辑的控件、保存控件、撤 销键入控件、重复键入控件、用于退出文档的控件、用于向上翻页的控件、用于向下翻页的控件、以及用于指示当前显示的内容在文档中的位置的标识等。图11中的b所示的场景中电子设备显示的文档内容,可以与图11中的a所示的场景中电子设备显示的文档内容不同。
如图11中的a所示的场景,当触控笔在S704所示的步骤中接收到用户在预设区域中的向远离笔尖位置处滑动等时,触控笔可以将该滑动操作对应的识别结果通过S705发送至电子设备,使得电子设备在S706所示的步骤中或者控制光标连续向下移动。进一步的,当用户停止针对预设区域的操作时,电子设备可以结束针对光标的控制。例如,电子设备可以根据用户针对触控笔的滑动控制光标的移动,如电子设备可以在用户未对触控笔进行操作时显示如图11中的a所示的电子设备显示的界面,并在用户结束对触控笔的操作时如图11中的b所示的电子设备显示的界面。可以理解的是,光标的移动将会带动页面显示内容的变化。
可能的实现方式中,在触控笔接收到用户在预设区域中的向远离笔尖位置处滑动等的时,触控笔也可以控制电子设备显示的页面向下连续翻页。进一步的,当用户停止针对预设区域的操作时,电子设备可以结束翻页操作。
适应的,如图11中的b所示的场景,当触控笔在S704所示的步骤中接收到用户在预设区域中的向靠近笔尖位置处滑动等时,触控笔可以将该滑动操作对应的识别结果通过S705发送至电子设备,使得电子设备在S706所示的步骤中或者控制光标连续向上移动。进一步的,当用户停止针对预设区域的操作时,电子设备可以结束针对光标的控制。例如,电子设备可以根据用户针对触控笔的滑动控制光标的移动,如电子设备可以在用户未对触控笔进行操作时显示如图11中的b所示的电子设备显示的界面,并在用户结束对触控笔的操作时如图11中的a所示的电子设备显示的界面。
可能的实现方式中,在触控笔接收到用户在预设区域中靠近笔尖位置处滑动等时,触控笔也可以控制电子设备显示的页面向上连续翻页。进一步的,当用户停止针对预设区域的操作时,电子设备可以结束翻页操作。具体过程与触控笔控制电子设备中的光标移动的方式类似,在此不再赘述。
可能的实现方式中,触控笔也可以根据笔尖的朝向对电子设备进行控制。如图11中的a所示的界面以及图11中的b所示的界面。在触控笔的笔尖朝上的情况下,当触控笔接收到用户在预设区域中远离笔尖位置处滑动时,触控笔可以控制电子设备中显示的页面连续向下翻页(或者电子设备中显示的光标连续向下移动);或者,在触控笔的笔尖朝上的情况下,当触控笔接收到用户在预设区域中靠近笔尖位置处滑动时,触控笔可以控制电子设备中显示的页面连续向上翻页(或者电子设备中显示的光标连续向上移动)。
可能的实现方式中,在触控笔的笔尖朝下的情况下,当触控笔接收到用户在预设区域中靠近笔尖位置处滑动时,触控笔可以控制电子设备中显示的页面连续向下翻页(或者电子设备中显示的光标连续向下移动);或者,在触控笔的笔尖朝上的情况下,当触控笔接收到用户在预设区域中远离笔尖位置处滑动时,触控笔可以控制电子设备中显示的页面连续向上翻页(或者电子设备中显示的光标连续向上移动)。
基于此,触控笔可以基于对触控薄膜中预设区域的操作的识别以及触控笔的朝向的识别,避免用户触发到触控薄膜的其他位置时带来的误触,并且提高用户利用触控笔控制电 子设备的准确性。
可以理解的是,本申请实施例提供的界面仅作为一种示例,并不能构成对本申请实施例的限定。
上面结合图6-图11,对本申请实施例提供的方法进行了说明,下面对本申请实施例提供的执行上述方法的装置进行描述。如图12所示,图12为本申请实施例提供的一种基于触控笔的使用装置的结构示意图,该基于触控笔的使用装置可以是本申请实施例中的可穿戴设备,也可以是可穿戴设备内的芯片或芯片系统。
如图12所示,基于触控笔的使用装置1200可以用于通信设备、电路、硬件组件或者芯片中,该基于触控笔的使用装置包括:识别单元1201、以及处理单元1202。其中,识别单元1201用于支持基于触控笔的使用装置1200执行数据识别的步骤,处理单元1202用于支持基于触控笔的使用装置1200执行数据处理的步骤。
具体的,本申请实施例提供一种基于触控笔的使用装置1200,触控笔的部分或全部笔身围绕有触控薄膜,触控笔的笔身处设置有切面,触控薄膜中包括第一区域,第一区域小于触控薄膜覆盖的笔身区域且第一区域位于切面位置处,当处理单元1202接收到针对第一区域的第一操作时,识别单元1201用于,识别第一操作,得到识别结果,处理单元1202用于基于识别结果控制目标设备,目标设备与触控笔建立有通信连接。
可能的实现方式中,该基于触控笔的使用装置1200中也可以包括通信单元1203。具体的,通信单元1203用于支持基于触控笔的使用装置1200执行数据的发送以及数据的接收的步骤。其中,该通信单元1203可以是输入或者输出接口、管脚或者电路等。
可能的实施例中,基于触控笔的使用装置1200还可以包括:存储单元1204。处理单元1202、存储单元1204通过线路相连。存储单元1204可以包括一个或者多个存储器,存储器可以是一个或者多个设备、电路中用于存储程序或者数据的器件。存储单元1204可以独立存在,通过通信线路与基于触控笔的使用装置具有的处理单元1202相连。存储单元1204也可以和处理单元1202集成在一起。
存储单元1204可以存储可穿戴设备中的方法的计算机执行指令,以使处理单元1202执行上述实施例中的方法。存储单元1204可以是寄存器、缓存或者RAM等,存储单元1204可以和处理单元1202集成在一起。存储单元1204可以是只读存储器(read-only memory,ROM)或者可存储静态信息和指令的其他类型的静态存储设备,存储单元1204可以与处理单元1202相独立。
图13为本申请实施例提供的另一种触控笔的硬件结构示意图,如图13所示,该触控笔包括处理器1301,通信线路1304以及至少一个通信接口(图13中示例性的以通信接口1303为例进行说明)。
处理器1301可以是一个通用中央处理器(central processing unit,CPU),微处理器,特定应用集成电路(application-specific integrated circuit,ASIC),或一个或多个用于控制本申请方案程序执行的集成电路。
通信线路1304可包括在上述组件之间传送信息的电路。
通信接口1303,使用任何收发器一类的装置,用于与其他设备或通信网络通信,如以太网,无线局域网(wireless local area networks,WLAN)等。
可能的,该触控笔还可以包括存储器1302。
存储器1302可以是只读存储器(read-only memory,ROM)或可存储静态信息和指令的其他类型的静态存储设备,随机存取存储器(random access memory,RAM)或者可存储信息和指令的其他类型的动态存储设备,也可以是电可擦可编程只读存储器(electrically erasable programmable read-only memory,EEPROM)、只读光盘(compact disc read-only memory,CD-ROM)或其他光盘存储、光碟存储(包括压缩光碟、激光碟、光碟、数字通用光碟、蓝光光碟等)、磁盘存储介质或者其他磁存储设备、或者能够用于携带或存储具有指令或数据结构形式的期望的程序代码并能够由计算机存取的任何其他介质,但不限于此。存储器可以是独立存在,通过通信线路1304与处理器相连接。存储器也可以和处理器集成在一起。
其中,存储器1302用于存储执行本申请方案的计算机执行指令,并由处理器1301来控制执行。处理器1301用于执行存储器1302中存储的计算机执行指令,从而实现本申请实施例所提供的佩戴检测方法。
可能的,本申请实施例中的计算机执行指令也可以称之为应用程序代码,本申请实施例对此不作具体限定。
在具体实现中,作为一种实施例,处理器1301可以包括一个或多个CPU,例如图13中的CPU0和CPU1。
在具体实现中,作为一种实施例,触控笔可以包括多个处理器,例如图13中的处理器1301和处理器1305。这些处理器中的每一个可以是一个单核(single-CPU)处理器,也可以是一个多核(multi-CPU)处理器。这里的处理器可以指一个或多个设备、电路、和/或用于处理数据(例如计算机程序指令)的处理核。
计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行计算机程序指令时,全部或部分地产生按照本申请实施例的流程或功能。计算机可以是通用计算机、专用计算机、计算机网络或者其他可编程装置。计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一计算机可读存储介质传输,例如,计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(digital subscriber line,DSL)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。计算机可读存储介质可以是计算机能够存储的任何可用介质或者是包括一个或多个可用介质集成的服务器、数据中心等数据存储设备。例如,可用介质可以包括磁性介质(例如,软盘、硬盘或磁带)、光介质(例如,数字通用光盘(digital versatile disc,DVD))、或者半导体介质(例如,固态硬盘(solid state disk,SSD))等。
本申请实施例还提供一种系统,系统包括触控笔以及目标设备,目标设备与触控笔建立有通信连接,触控笔的部分或全部笔身围绕有触控薄膜,触控笔的笔身处设置有切面,触控薄膜中包括第一区域,第一区域小于触控薄膜覆盖的笔身区域且第一区域位于切面位置处,触控笔用于:当接收到针对第一区域的第一操作时,识别第一操作,得到识别结果;触控笔还用于:将识别结果发送至目标设备;目标设备用于:接收识别结果,并基于识别结果控制目标设备。
本申请实施例还提供了一种计算机可读存储介质。上述实施例中描述的方法可以全部 或部分地通过软件、硬件、固件或者其任意组合来实现。计算机可读介质可以包括计算机存储介质和通信介质,还可以包括任何可以将计算机程序从一个地方传送到另一个地方的介质。存储介质可以是可由计算机访问的任何目标介质。
作为一种可能的设计,计算机可读介质可以包括紧凑型光盘只读储存器(compact disc read-only memory,CD-ROM)、RAM、ROM、EEPROM或其它光盘存储器;计算机可读介质可以包括磁盘存储器或其它磁盘存储设备。而且,任何连接线也可以被适当地称为计算机可读介质。例如,如果使用同轴电缆,光纤电缆,双绞线,DSL或无线技术(如红外,无线电和微波)从网站,服务器或其它远程源传输软件,则同轴电缆,光纤电缆,双绞线,DSL或诸如红外,无线电和微波之类的无线技术包括在介质的定义中。如本文所使用的磁盘和光盘包括光盘(CD),激光盘,光盘,数字通用光盘(digital versatile disc,DVD),软盘和蓝光盘,其中磁盘通常以磁性方式再现数据,而光盘利用激光光学地再现数据。
上述的组合也应包括在计算机可读介质的范围内。以上,仅为本发明的具体实施方式,但本发明的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本发明揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本发明的保护范围之内。因此,本发明的保护范围应以权利要求的保护范围为准。

Claims (36)

  1. 一种系统,其特征在于,所述系统包括触控笔和与所述触控笔建立有通信连接的电子设备,
    所述触控笔的至少部分笔身设置有触控薄膜,所述触控薄膜包括第一区域和第二区域,所述第一区域小于所述触控薄膜对应的所述触控笔的笔身区域,所述第二区域为除了所述第一区域外的其他触控薄膜的区域;
    在第一模式下,所述触控笔用于在所述电子设备的触控屏中书写,所述系统能够在第一应用中对用户在所述触控薄膜的任意位置的第一操作进行响应;
    在第二模式下,所述系统能够在第二应用中对用户在所述第一区域的第二操作进行响应,其中,所述系统在所述第二应用中对用户在所述第二区域的第二操作不响应;
    所述第一模式不同于所述第二模式;
    所述第一操作不同于所述第二操作。
  2. 根据权利要求1所述的系统,其特征在于,所述第二操作为滑动操作。
  3. 根据权利要求2所述的系统,其特征在于,所述系统能够在第二应用中对用户在所述第一区域的第二操作进行响应包括:所述触控笔能够识别用户在所述第一区域的滑动操作,且所述电子设备的显示页面响应于所述滑动操作翻页。
  4. 根据权利要求3所述的系统,所述触控笔能够识别用户在所述第一区域的滑动操作,且所述电子设备的显示页面响应于所述滑动操作翻页包括:
    所述触控笔能够识别用户在所述第一区域向远离所述触控笔的笔尖位置处的第一滑动操作,且所述电子设备的显示页面响应于所述第一滑动操作向下翻页;
    以及,所述触控笔能够识别用户在所述第一区域向靠近所述触控笔的笔尖位置处的第二滑动操作,且所述电子设备的显示页面响应于所述第二滑动操作向上翻页。
  5. 根据权利要求1所述的系统,其特征在于,所述第二操作为长按操作。
  6. 根据权利要求5所述的系统,其特征在于,所述系统能够在第二应用中对用户在所述第一区域的第二操作进行响应包括:所述触控笔能够识别用户在所述第一区域的长按操作,且所述电子设备的显示页面响应于所述长按操作显示激光点。
  7. 根据权利要求6所述的系统,其特征在于,所述系统能够在第二应用中对用户在所述第一区域的第二操作进行响应还包括:所述触控笔能够识别用户所述第一区域的长按操作结束,且所述电子设备的显示页面响应于所述长按操作的结束取消显示激光点。
  8. 根据权利要求1-7中任一项所述的系统,其特征在于,所述第一操作包括敲击操作。
  9. 根据权利要求8所述的系统,其特征在于,所述系统能够在第一应用中对用户在所述触控薄膜的任意位置的第一操作进行响应包括:所述触控笔能够识别用户在所述触控薄膜任意位置的敲击操作,且由所述第一模式切换到所述第二模式。
  10. 根据权利要求1-7中任一项所述的系统,其特征在于,所述电子设备还用于显示第一界面,所述第一界面包括第一控件,所述第一控件用于接收用户的操作以指示所述系统处于所述第一模式或所述第二模式。
  11. 根据权利要求10所述的系统,其特征在于,所述第一界面还包括第一提示信 息,所述提示信息用于提示用户在所述第二模式下利用所述触控笔控制所述电子设备实现翻页。
  12. 根据权利要求1-11中任一项所述的系统,其特征在于,所述第一模式为书写模式,所述第二模式为翻页笔模式。
  13. 根据权利要求1-12中任一项所述的系统,其特征在于,所述第一应用是进行书写的应用;所述第二应用是放映幻灯片的应用。
  14. 根据权利要求1-13中任一项所述的系统,其特征在于,
    所述系统能够在第二应用中对用户在所述第一区域的第二操作进行响应包括:
    所述电子设备能够接收到所述触控笔发送的第一消息,所述第一消息用于指示用户的操作为第二操作的识别结果;
    所述电子设备能够基于所述识别结果控制所述电子设备的显示内容。
  15. 根据权利要求1-14中任一项所述的系统,其特征在于,所述触控笔中包括触控集成电路板IC以及微控制单元MCU;
    所述触控IC用于在所述第二模式下,向所述MCU发送第一触发数据,所述第一触发数据仅包含第一区域内接收到的触发操作对应的触发数据。
  16. 根据权利要求1-14中任一项所述的系统,其特征在于,所述触控笔中包括触控集成电路板IC以及微控制单元MCU;
    所述触控IC用于在所述第二模式下,向所述微控制单元MCU发送第二触发数据,所述第二触发数据包含所述触控薄膜任一位置处接收到的触发操作对应的触发数据,所述微控制单元MCU用于在所述第二模式下,仅对所述第一区域处对应的触发数据进行识别。
  17. 根据权利要求1-16中任一项所述的系统,其特征在于,所述触控笔的笔身处设置有切面,所述第一区域位于所述切面位置处。
  18. 根据权利要求17所述的系统,所述第一区域小于所述触控薄膜对应的笔身切面区域。
  19. 一种显示方法,所述方法应用于包括触控笔和电子设备的系统,所述触控笔的至少部分笔身设置有触控薄膜,所述触控薄膜包括第一区域和第二区域,所述第一区域小于所述触控薄膜对应的所述触控笔的笔身区域,所述第二区域为除了所述第一区域外的其他触控薄膜的区域;所述方法包括:
    所述电子设备和所述触控笔建立通信连接;
    在第一模式下,所述触控笔在所述电子设备的触控屏中书写,所述系统在第一应用中对用户在所述触控薄膜的任意位置的第一操作进行响应;
    在第二模式下,所述系统在第二应用中对用户在所述第一区域的第二操作进行响应,其中,所述系统在所述第二应用中对用户在所述第二区域的第二操作不响应;
    所述第一模式不同于所述第二模式;
    所述第一操作不同于所述第二操作。
  20. 根据权利要求19所述的方法,其特征在于,所述第二操作为滑动操作。
  21. 根据权利要求20所述的方法,其特征在于,所述系统能够在第二应用中对用户在所述第一区域的第二操作进行响应包括:所述触控笔能够识别用户在所述第一区 域的滑动操作,且所述电子设备的显示页面响应于所述滑动操作翻页。
  22. 根据权利要求21所述的方法,其特征在于,所述触控笔能够识别用户在所述第一区域的滑动操作,且所述电子设备的显示页面响应于所述滑动操作翻页包括:
    所述触控笔能够识别用户在所述第一区域向远离所述触控笔的笔尖位置处的第一滑动操作,且所述电子设备的显示页面响应于所述第一滑动操作向下翻页;
    以及,所述触控笔能够识别用户在所述第一区域向靠近所述触控笔的笔尖位置处的第二滑动操作,且所述电子设备的显示页面响应于所述第二滑动操作向上翻页。
  23. 根据权利要求19所述的方法,其特征在于,所述第二操作为长按操作。
  24. 根据权利要求23所述的方法,其特征在于,所述系统能够在第二应用中对用户在所述第一区域的第二操作进行响应包括:所述触控笔能够识别用户在所述第一区域的长按操作,且所述电子设备的显示页面响应于所述长按操作显示激光点。
  25. 根据权利要求24所述的方法,其特征在于,所述系统能够在第二应用中对用户在所述第一区域的第二操作进行响应还包括:所述触控笔能够识别用户所述第一区域的长按操作结束,且所述电子设备的显示页面响应于所述长按操作的结束取消显示激光点。
  26. 根据权利要求19-24中任一项所述的方法,其特征在于,所述第一操作包括敲击操作。
  27. 根据权利要求26所述的方法,其特征在于,所述系统能够在第一应用中对用户在所述触控薄膜的任意位置的第一操作进行响应包括:所述触控笔能够识别用户在所述触控薄膜任意位置的敲击操作,且由所述第一模式切换到所述第二模式。
  28. 根据权利要求19-26中任一项所述的方法,其特征在于,所述方法还包括:
    所述电子设备显示第一界面,所述第一界面包括第一控件,所述第一控件用于接收用户的操作以指示所述系统处于所述第一模式或所述第二模式。
  29. 根据权利要求28所述的方法,其特征在于,所述第一界面还包括第一提示信息,所述提示信息用于提示用户在所述第二模式下利用所述触控笔控制所述电子设备实现翻页。
  30. 根据权利要求19-29中任一项所述的方法,其特征在于,所述第一模式为书写模式,所述第二模式为翻页笔模式。
  31. 根据权利要求19-30中任一项所述的方法,其特征在于,所述第一应用是进行书写的应用;所述第二应用是放映幻灯片的应用。
  32. 根据权利要求19-31中任一项所述的方法,其特征在于,所述系统能够在第二应用中对用户在所述第一区域的第二操作进行响应包括:
    所述电子设备能够接收到所述触控笔发送的第一消息,所述第一消息用于指示用户的操作为第二操作的识别结果;
    所述电子设备能够基于所述识别结果控制所述电子设备的显示内容。
  33. 根据权利要求19-32中任一项所述的方法,其特征在于,所述触控笔中包括触控集成电路板IC以及微控制单元MCU;
    所述触控IC用于在所述第二模式下,向所述MCU发送第一触发数据,所述第一触发数据仅包含第一区域内接收到的触发操作对应的触发数据。
  34. 根据权利要求19-32中任一项所述的方法,其特征在于,所述触控笔中包括触控集成电路板IC以及微控制单元MCU;
    所述触控IC用于在所述第二模式下,向所述微控制单元MCU发送第二触发数据,所述第二触发数据包含所述触控薄膜任一位置处接收到的触发操作对应的触发数据,所述微控制单元MCU用于在所述第二模式下,仅对所述第一区域处对应的触发数据进行识别。
  35. 根据权利要求19-34中任一项所述的方法,其特征在于,所述触控笔的笔身处设置有切面,所述第一区域位于所述切面位置处。
  36. 根据权利要求35所述的方法,其特征在于,所述第一区域小于所述触控薄膜对应的笔身切面区域。
PCT/CN2023/111063 2022-08-09 2023-08-03 基于触控笔的使用方法和装置 WO2024032470A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202380025107.9A CN118786408A (zh) 2022-08-09 2023-08-03 基于触控笔的使用方法和装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210951471.5A CN116088698B (zh) 2022-08-09 2022-08-09 基于触控笔的使用方法和装置
CN202210951471.5 2022-08-09

Publications (2)

Publication Number Publication Date
WO2024032470A1 true WO2024032470A1 (zh) 2024-02-15
WO2024032470A9 WO2024032470A9 (zh) 2024-05-10

Family

ID=86208865

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/111063 WO2024032470A1 (zh) 2022-08-09 2023-08-03 基于触控笔的使用方法和装置

Country Status (2)

Country Link
CN (3) CN116088698B (zh)
WO (1) WO2024032470A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116088698B (zh) * 2022-08-09 2024-07-02 荣耀终端有限公司 基于触控笔的使用方法和装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE202013103197U1 (de) * 2013-07-17 2014-10-20 Stabilo International Gmbh Digitaler Stift
CN110045843A (zh) * 2019-03-26 2019-07-23 维沃移动通信有限公司 电子笔、电子笔控制方法及终端设备
CN110347269A (zh) * 2019-06-06 2019-10-18 华为技术有限公司 一种空鼠模式实现方法及相关设备
CN114461129A (zh) * 2021-07-02 2022-05-10 荣耀终端有限公司 笔迹绘制方法、装置、电子设备和可读存储介质
CN116088698A (zh) * 2022-08-09 2023-05-09 荣耀终端有限公司 基于触控笔的使用方法和装置

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019417B (zh) * 2012-11-28 2016-09-14 沈阳工业大学 一种识别手指挠笔和敲笔动作的电子手写笔及其输入方法
US9261985B2 (en) * 2013-03-11 2016-02-16 Barnes & Noble College Booksellers, Llc Stylus-based touch-sensitive area for UI control of computing device
CN103941892A (zh) * 2014-04-29 2014-07-23 常熟思睿科电子有限公司 一种智能家电笔式控制终端及其工作方法
TWI564756B (zh) * 2016-02-05 2017-01-01 致伸科技股份有限公司 觸控筆
CN106227370A (zh) * 2016-07-15 2016-12-14 汤棋 一种智能触控笔
US10025403B2 (en) * 2016-07-25 2018-07-17 Microsoft Technology Licensing, Llc Stylus communication channels
JP6562007B2 (ja) * 2017-01-30 2019-08-21 京セラドキュメントソリューションズ株式会社 ペン型入力装置及び表示入力システム
US11340716B2 (en) * 2018-07-06 2022-05-24 Apple Inc. Touch-based input for stylus
TWI697814B (zh) * 2018-07-16 2020-07-01 禾瑞亞科技股份有限公司 功能模組化之觸控筆
CN109710093B (zh) * 2018-12-14 2021-11-30 掌阅科技股份有限公司 阅读操作方法、手写阅读设备和存储介质
CN113168242A (zh) * 2018-12-19 2021-07-23 深圳市柔宇科技股份有限公司 手写系统的控制方法和手写系统
KR20200115889A (ko) * 2019-03-28 2020-10-08 삼성전자주식회사 전자 펜을 통한 사용자 입력에 기초하여 동작을 실행하는 전자 디바이스 및 그 동작 방법
CN110865734B (zh) * 2019-11-13 2022-10-25 北京字节跳动网络技术有限公司 目标对象显示方法、装置、电子设备和计算机可读介质
CN112947773B (zh) * 2019-11-26 2024-01-26 京东方科技集团股份有限公司 触控笔及触控系统
CN113176831A (zh) * 2021-04-21 2021-07-27 维沃移动通信有限公司 触控笔、设置方法及设置装置
CN113970971B (zh) * 2021-09-10 2022-10-04 荣耀终端有限公司 基于触控笔的数据处理方法和装置
CN114185442A (zh) * 2021-11-30 2022-03-15 联想(北京)有限公司 触摸板模块及电子设备
CN114415850A (zh) * 2021-12-29 2022-04-29 联想(北京)有限公司 一种控制方法、装置、触控笔及计算机可读存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE202013103197U1 (de) * 2013-07-17 2014-10-20 Stabilo International Gmbh Digitaler Stift
CN110045843A (zh) * 2019-03-26 2019-07-23 维沃移动通信有限公司 电子笔、电子笔控制方法及终端设备
CN110347269A (zh) * 2019-06-06 2019-10-18 华为技术有限公司 一种空鼠模式实现方法及相关设备
CN114461129A (zh) * 2021-07-02 2022-05-10 荣耀终端有限公司 笔迹绘制方法、装置、电子设备和可读存储介质
CN116088698A (zh) * 2022-08-09 2023-05-09 荣耀终端有限公司 基于触控笔的使用方法和装置

Also Published As

Publication number Publication date
CN117032485B (zh) 2024-07-23
CN117032485A (zh) 2023-11-10
CN116088698B (zh) 2024-07-02
CN116088698A (zh) 2023-05-09
WO2024032470A9 (zh) 2024-05-10
CN118786408A (zh) 2024-10-15

Similar Documents

Publication Publication Date Title
WO2022193814A1 (zh) 笔记生成方法和系统
US11740764B2 (en) Method and system for providing information based on context, and computer-readable recording medium thereof
US9265074B2 (en) Pen-based content transfer system and method thereof
CN111149086B (zh) 编辑主屏幕的方法、图形用户界面及电子设备
US9521237B2 (en) Cellular communication device with wireless pointing device function
US20150378557A1 (en) Foldable electronic apparatus and interfacing method thereof
KR100922643B1 (ko) 핸드헬드 포인터 기반 사용자 인터페이스의 제공 방법 및장치
JP2015005173A (ja) タッチ・スクリーンを備える携帯式情報端末および入力方法
WO2021197487A1 (zh) 一种鼠标控制终端屏幕的方法、装置、鼠标及存储介质
US20140351725A1 (en) Method and electronic device for operating object
WO2024032470A1 (zh) 基于触控笔的使用方法和装置
WO2019036826A1 (zh) 一种对电子设备的控制方法以及输入设备
WO2020125476A1 (zh) 一种触控显示屏操作方法和用户设备
WO2019047129A1 (zh) 一种移动应用图标的方法及终端
CN109074124A (zh) 数据处理的方法及移动设备
KR20200019426A (ko) 스마트 터치패드 인터페이스 방법 및 그 장치
CN112639695B (zh) 自适应数字笔和触敏设备
US9292107B1 (en) Mobile telephone as computer mouse
US9411443B2 (en) Method and apparatus for providing a function of a mouse using a terminal including a touch screen
US10416852B2 (en) Display and interaction method in a user interface
JP2018170048A (ja) 情報処理装置、入力方法及びプログラム
US20160342280A1 (en) Information processing apparatus, information processing method, and program
TW201702818A (zh) 用於配合多重上墨技術使用之單一觸控筆
TW202001510A (zh) 可應用於互動控制之輸入裝置與電子裝置
KR20150099888A (ko) 디스플레이를 제어하는 전자 장치 및 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23851691

Country of ref document: EP

Kind code of ref document: A1