WO2023216078A1 - Touch control method, apparatus, device, system, storage medium and program product - Google Patents

Touch control method, apparatus, device, system, storage medium and program product

Info

Publication number
WO2023216078A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
signal
stylus
identification
pressure
Prior art date
Application number
PCT/CN2022/091805
Other languages
English (en)
French (fr)
Inventor
郝帅凯
Original Assignee
广州视源电子科技股份有限公司
广州视睿电子科技有限公司
Priority date
Filing date
Publication date
Application filed by 广州视源电子科技股份有限公司 and 广州视睿电子科技有限公司
Priority to PCT/CN2022/091805 (WO2023216078A1)
Priority to CN202280007020.4A (CN117377934A)
Publication of WO2023216078A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Definitions

  • the present application relates to the field of touch control, and in particular, to a touch control method, device, equipment, system, storage medium and program product.
  • a touch device is a device equipped with a touch component and a display screen.
  • the user can operate the touch device by touching the components or content displayed on the display screen. Since touch devices get rid of the constraints of keyboard and mouse, they make human-computer interaction more straightforward and are favored by more and more users.
  • touch devices using capacitive sensing touch components support being touched by multiple touch objects, such as the user's fingers and at least one active stylus.
  • However, the touch response of such touch devices is not flexible and intelligent enough, and cannot meet the actual needs of users.
  • Embodiments of the present application provide a touch method, device, equipment, system, storage medium and program product, which can solve the problem that, when a touch device using a capacitive sensing touch component is touched by multiple touch objects, the touch response of the touch device is not flexible and intelligent enough.
  • the technical solutions are as follows:
  • embodiments of the present application provide a touch control method, which is applied to a touch control device.
  • the method includes:
  • when it is determined that a touch object touches the touch device, obtaining the first touch information of the touch object according to the touch sensing signal; wherein the first touch information includes: the touch position, and the identification of the touch object determined based on the intensity of the touch sensing signal; the touch device supports multiple touch objects, and the signal strengths of the touch sensing signals corresponding to different touch objects are different;
  • according to the touch parameters corresponding to the identification of the touch object, a touch response is performed at the touch position.
  • the plurality of touch objects include the user's limbs and M touch pens, where M is an integer greater than or equal to 1; the signal strength of the touch sensing signal corresponding to any part of the user's limbs is the same;
  • the plurality of touch objects include M touch pens, where M is an integer greater than or equal to 2;
  • the signal strengths and/or phases of the touch signals emitted by each stylus are different, so that the signal strengths of the touch sensing signals detected by the touch device are different.
  • when M is greater than or equal to 2,
  • the M touch pens include a first stylus and a second stylus, and the phase of the touch signal emitted by the first stylus is opposite to the phase of the touch signal emitted by the second stylus, while the phase of the touch signal emitted by the second stylus is the same as the phase of the touch driving signal of the touch device;
  • obtaining the identification of the touch object according to the touch sensing signal includes:
  • if the signal strength of the touch sensing signal is less than the first preset signal strength threshold and greater than or equal to the second preset signal strength threshold, it is determined that the identification of the touch object is the identification of the user's limb;
  • if the signal strength of the touch sensing signal is less than the second preset signal strength threshold, it is determined that the identification of the touch object is the identification of the first stylus;
  • if the signal strength of the touch sensing signal is greater than or equal to the third preset signal strength threshold, it is determined that the identification of the touch object is the identification of the second stylus; the third preset signal strength threshold is greater than the first preset signal strength threshold.
  • when M is greater than or equal to 2,
  • the M touch pens include a first stylus and a second stylus, and the phase of the touch signal emitted by the first stylus is opposite to the phase of the touch signal emitted by the second stylus, while the phase of the touch signal emitted by the second stylus is the same as the phase of the touch driving signal of the touch device;
  • obtaining the identification of the touch object according to the touch sensing signal includes:
  • if the signal strength of the touch sensing signal is less than the second preset signal strength threshold, it is determined that the identification of the touch object is the identification of the first stylus;
  • if the signal strength of the touch sensing signal is greater than or equal to the third preset signal strength threshold, it is determined that the identification of the touch object is the identification of the second stylus; the third preset signal strength threshold is greater than the second preset signal strength threshold.
  • before performing a touch response at the touch position according to the touch parameters corresponding to the identification of the touch object, the method further includes:
  • updating the first touch information to obtain updated touch information;
  • the updated touch information includes: the identification of the touch object, the touch position, and the pressure used by the touch object when it touches the touch device;
  • performing a touch response at the touch position according to the touch parameters corresponding to the identification of the touch object includes:
  • according to the touch parameters corresponding to the identification of the touch object and the pressure, performing a touch response at the touch position.
  • If the identification of the touch object is the identification of the user's limb,
  • the updating of the first touch information to obtain updated touch information includes:
  • the detected pressure is added to the first touch information to obtain the updated touch information.
  • If the identification of the touch object is the identification of the stylus,
  • the method further includes:
  • receiving second touch information sent by the stylus, where the second touch information includes the identification of the stylus and the pressure used when the stylus touches the touch device;
  • the updating of the first touch information to obtain updated touch information includes:
  • the first touch information and the second touch information are combined to obtain the updated touch information.
  • If the touch operation performed by the touch object on the touch device is a writing operation,
  • the touch parameters include drawing parameters;
  • performing a touch response at the touch position according to the touch parameters corresponding to the identification of the touch object and the pressure includes:
  • displaying, at the touch position, handwriting that matches the drawing parameters and the pressure.
  • embodiments of the present application provide a touch method, which is applied to a stylus, and the method includes:
  • when the pressure of the pen tip is greater than or equal to a preset pressure threshold, sending touch information to the touch device and emitting, through the pen tip, a touch signal with a preset phase and signal strength;
  • the touch information includes: the identification of the stylus, and the pressure;
  • transmitting a touch signal with a preset phase and signal strength to the touch device through the pen tip includes:
  • processing the touch driving signal sensed from the touch device to obtain the touch signal with the preset phase and signal strength, and transmitting the touch signal to the touch device through the pen tip.
  • embodiments of the present application provide a touch control method, which is applied to a touch control device.
  • the method includes:
  • when it is determined that a first touch object and a second touch object touch the touch device simultaneously, obtaining the first touch information of the first touch object according to the touch sensing signal corresponding to the first touch object, and obtaining the first touch information of the second touch object according to the touch sensing signal corresponding to the second touch object; wherein the first touch information includes: the touch position, and the identification of the touch object determined based on the intensity of the touch sensing signal corresponding to the touch object;
  • performing a touch response at the touch position of the first touch object according to the touch parameters corresponding to the identification of the first touch object, and performing a touch response at the touch position of the second touch object according to the touch parameters corresponding to the identification of the second touch object.
  • a touch device including:
  • Detection module used to detect touch sensing signals
  • a determination module configured to determine whether a touch object touches the touch device according to the detected touch sensing signal
  • An acquisition module, configured to acquire first touch information of the touch object according to the touch sensing signal when it is determined that a touch object touches the touch device; wherein the first touch information includes: the touch position, and the identification of the touch object determined based on the intensity of the touch sensing signal; the touch device supports multiple touch objects, and the signal strengths of the touch sensing signals corresponding to different touch objects are different;
  • a response module configured to perform a touch response at the touch position according to the touch parameters corresponding to the identification of the touch object.
  • a touch device including:
  • Detection module used to detect touch sensing signals
  • a determination module configured to determine whether a touch object touches the touch device according to the detected touch sensing signal
  • An acquisition module, configured to, when it is determined that a first touch object and a second touch object touch the touch device simultaneously, acquire the first touch information of the first touch object according to the touch sensing signal corresponding to the first touch object, and acquire the first touch information of the second touch object according to the touch sensing signal corresponding to the second touch object; wherein the first touch information includes: the touch position, and the identification of the touch object determined based on the intensity of the touch sensing signal corresponding to the touch object;
  • a response module, configured to perform a touch response at the touch position of the first touch object according to the touch parameters corresponding to the identification of the first touch object, and to perform a touch response at the touch position of the second touch object according to the touch parameters corresponding to the identification of the second touch object.
  • a stylus, which includes: a pen tip, an induction ring, a pressure sensor, an analog-to-digital conversion module, a touch signal output unit, a general control module, a communication module, and a communication antenna;
  • the pressure sensor is used to detect the pressure of the pen tip
  • the analog-to-digital conversion module is used to convert the electrical signal of the pen tip pressure detected by the pressure sensor into a digital signal and transmit it to the general control module;
  • the induction ring is used to sense the touch drive signal emitted by the touch component of the touch device
  • the general control module is configured to: when the pressure is greater than or equal to a preset pressure threshold, send touch information to the touch device through the communication module and the communication antenna, and control the touch signal output unit to output a touch signal with a preset phase and signal strength, which is transmitted to the touch device through the pen tip; and when the pressure is less than the preset pressure threshold, stop sending the touch information to the touch device through the communication module and the communication antenna, and control the touch signal output unit to stop outputting the touch signal; wherein the touch information includes: the identification of the stylus, and the pressure.
  • embodiments of the present application provide a touch device, including: a touch component, a processor, and a memory; wherein the memory stores a computer program, and the computer program is adapted to be loaded by the processor and Perform any one of the method steps of the first aspect or the third aspect above.
  • embodiments of the present application provide a stylus, including: a processor and a memory; wherein the memory stores a computer program, and the computer program is adapted to be loaded by the processor and execute the method steps of the second aspect described above.
  • embodiments of the present application provide a touch system, which includes: the touch device as described in the seventh aspect, and at least two stylus pens as described in the eighth aspect.
  • embodiments of the present application provide a computer storage medium that stores a plurality of instructions, and the instructions are suitable for being loaded by a processor and executing any one of the method steps of the first aspect or the third aspect.
  • embodiments of the present application provide a computer storage medium that stores a plurality of instructions, and the instructions are suitable for being loaded by a processor and executing any of the method steps of the second aspect.
  • embodiments of the present application provide a computer program product, including a computer program that implements any one of the method steps of the first aspect or the third aspect when executed by a processor.
  • embodiments of the present application provide a computer program product, including a computer program that implements any of the method steps of the second aspect when executed by a processor.
  • In the embodiments of the present application, when the touch device performs touch recognition, it can not only identify the touch position of the touch object, but also identify the current touch object, so that it can perform a touch response based on the touch parameters corresponding to the touch object, achieving the effect of different touch responses for different touch objects and improving the flexibility and intelligence of the touch response of the touch device.
  • Figure 1 is a schematic diagram of a scene applicable to the touch method provided by the embodiment of the present application.
  • Figure 2 is a schematic structural diagram of a possible touch component provided by an embodiment of the present application.
  • Figure 3 is a schematic structural diagram of a possible intelligent processing system provided by an embodiment of the present application.
  • Figure 4 is a schematic flowchart of a touch control method provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of signal strength provided by an embodiment of the present application.
  • Figure 6 is another signal strength diagram provided by an embodiment of the present application.
  • Figure 7 is a schematic flow chart of another touch method provided by an embodiment of the present application.
  • Figure 8 is a schematic structural diagram of a possible stylus 200 provided by an embodiment of the present application.
  • Figure 9 is a schematic structural diagram of the first touch signal output unit provided by an embodiment of the present application.
  • Figure 10 is a schematic structural diagram of a second touch signal output unit provided by an embodiment of the present application.
  • Figure 11 is a schematic structural diagram of the third touch signal output unit provided by an embodiment of the present application.
  • Figure 12 is a schematic structural diagram of a fourth touch signal output unit provided by an embodiment of the present application.
  • Figure 13 is a schematic structural diagram of a fifth touch signal output unit provided by an embodiment of the present application.
  • Figure 14 is a schematic structural diagram of a sixth touch signal output unit provided by an embodiment of the present application.
  • Figure 15 is a schematic structural diagram 2 of the third touch signal output unit provided by an embodiment of the present application.
  • Figure 16 is a schematic structural diagram of a touch device provided by an embodiment of the present application.
  • Figure 17 is a schematic structural diagram of an intelligent interactive tablet provided by an embodiment of the present application.
  • FIG. 1 is a schematic diagram of a scenario in which the touch control method provided by the embodiment of the present application is applicable.
  • the application scenario includes a touch object and a touch device 100 .
  • the touch device 100 can be any device that uses a capacitive sensing touch component to implement touch functions, such as smart interactive tablets, mobile phones, tablets, computers, virtual reality (VR) terminal equipment, augmented reality (AR) terminal equipment, wireless terminals in industrial control, wireless terminals in self-driving, wireless terminals in remote medical surgery, wireless terminals in smart grids, wireless terminals in transportation safety, wireless terminals in smart cities, wireless terminals in smart homes, personal digital assistants (PDA), vehicle-mounted devices, wearable devices, etc.
  • the touch device 100 is an intelligent interactive tablet as an example for illustration.
  • the touch object can touch the touch device 100 to provide input to the touch device 100, and the touch device 100 performs operations in response to the input based on the input of the touch object.
  • the touch object mentioned here can be any object that can implement touch operations on the touch device 100, such as the stylus 200, the user's finger 300, etc.
  • the stylus 200 involved in the embodiment of the present application may be a capacitive pen.
  • the capacitive pen may include: a passive capacitive pen and an active capacitive pen.
  • Passive capacitive pens may also be called passive stylus pens;
  • active capacitive pens may be called active stylus pens, which are divided into active passive pens and active active pens.
  • the stylus pen 200 may also be called a handwriting pen, and the embodiments of this application do not distinguish between these terms.
  • the application scenarios of the embodiments of this application mainly involve active stylus pens.
  • the following embodiments take active passive pens as examples for illustration. That is, the stylus and active stylus described in subsequent application documents have the same meaning, and no distinction is made between them.
  • the so-called active passive pen refers to a stylus that can process the detected touch drive signal of the touch device 100 to generate a touch signal and actively output it.
  • the touch device 100 can identify whether the active passive pen touches and the touch position by receiving the touch signal output from the active passive pen.
  • the stylus 200 and the touch device 100 may be connected by wire or interconnected through a communication network to achieve interaction.
  • the communication network can be, but is not limited to, a WI-FI hotspot network, a WI-FI peer-to-peer (P2P) network, a Bluetooth network, a ZigBee network or a near field communication (NFC) network.
  • It should be understood that when the stylus 200 and the touch device 100 are interconnected through a communication network, they may each have a communication module and a communication antenna capable of establishing the communication network.
  • Taking the Bluetooth network as an example, both can have a Bluetooth module and a Bluetooth antenna respectively, so that a Bluetooth network can be established using the Bluetooth module and the Bluetooth antenna.
  • the touch device 100 shown in FIG. 1 supports touching it through multiple touch objects.
  • For example, the touch device 100 supports touch by M stylus pens, where M is an integer greater than or equal to 2; alternatively, the touch device 100 supports touch by both the user's limbs (e.g., fingers) and M stylus pens, where M is an integer greater than or equal to 1.
  • the user's limbs mentioned here may be, for example, the user's fingers; the following description takes the user's fingers as an example. It should be noted that in this embodiment, the signal strength of the touch sensing signal corresponding to any part of the user's limbs is the same.
  • Currently, touch devices can only identify the touch position of any touch object. Therefore, the touch device uses the same touch response for the touch of any touch object, and as a result the touch response of such touch devices is not flexible and intelligent enough to meet the actual needs of users.
  • the touch device 100 as an intelligent interactive tablet as an example, currently, for an intelligent interactive tablet that supports touch control through multiple touch objects, the following usage scenarios may exist:
  • user A writes on the smart interactive tablet with user A's finger
  • user B writes on the smart interactive tablet with stylus 1
  • user C writes on the smart interactive tablet with stylus 2.
  • Since the smart interactive tablet can only identify the touch position of any touch object, it uses the same drawing parameters, such as the color and thickness of the handwriting, to process and display the writing of every touch object. As a result, when multiple touch objects are used to write on the smart interactive tablet, all of the displayed writing has the same drawing parameters, so the user cannot tell from the drawing parameters which user wrote which content, and the user's actual needs cannot be met.
  • embodiments of the present application provide a new touch method.
  • In this method, the touch device can perform a touch response based on the touch parameters corresponding to the touch object, so that the touch device has different touch responses for different touch objects, which improves the flexibility and intelligence of the touch response of the touch device.
  • the hardware part of the touch device 100 is composed of, for example, a touch display module, an intelligent processing system, and other parts, which are combined together by the overall structural components and are also supported by a dedicated software system.
  • the touch display module includes a display screen, a touch component and a backlight component.
  • the backlight component is used to provide a backlight source for the display screen;
  • the display screen generally uses a liquid crystal display device for picture display;
  • the touch component is disposed on the display screen, or at the front of the display screen, or independently of the display screen as a touch panel outside the screen; that is, the touch panel exists alone or is set on a keyboard, etc.
  • the touch component may be a touch component using capacitive technology, that is, a capacitive sensing touch component.
  • When screen data is displayed on the display screen of the touch control device 100 and a touch object (such as a finger 300 or a stylus pen 200) touches the screen, the touch component of the touch device 100 collects the touch data; the touch component either converts the touch data into the coordinate data of the touch point and then sends it to the intelligent processing system, or sends the raw touch data to the intelligent processing system, where the intelligent processing system converts it into the coordinate data of the touch point.
  • After the intelligent processing system obtains the coordinate data of the touch point, it implements the corresponding control operation according to a preset program, driving the display content of the display screen to change so as to achieve diversified display and operation effects.
  • FIG. 2 is a schematic structural diagram of a possible touch component provided by an embodiment of the present application. As shown in FIG. 2 , taking the touch component being disposed on the display screen or at the front end of the display screen as an example, the touch component may include: a touch driving electrode layer, a touch sensing electrode layer and a touch screen control system.
  • the touch driving electrode layer and the touch sensing electrode layer are both stacked on the upper side of the display screen, that is, the two electrode layers are on top and the display screen is on the bottom; together, the touch sensing electrode layer, the touch driving electrode layer and the display screen constitute the touch screen 101 (also called a touch screen) of the touch device 100.
  • the touch driving electrode layer is provided with multiple rows of driving electrodes Dx
  • the touch sensing electrode layer is provided with multiple rows of sensing electrodes Sy.
  • the multiple rows of driving electrodes Dx and the multiple rows of sensing electrodes Sy are arranged in a staggered manner.
  • the x and y mentioned here represent the number of rows of electrodes, and are both integers greater than or equal to 1.
  • the maximum values of x and y may be the same or different, specifically related to the number of rows of electrodes provided in the electrode layer.
  • Figure 2 is a schematic diagram taking the values of x and y as both 5 as an example.
  • the touch screen control system may include, for example: a touch main control module, a driving module and a sensing module.
  • one end of the touch main control module can be connected to one end of the driving module and one end of the sensing module respectively
  • the other end of the driving module can be connected to each row of driving electrodes through the driving electrode bus
  • the other end of the sensing module can be connected through the sensing electrode bus. Connect to each row of sensing electrodes.
  • the touch main control module controls the driving module to send a driving signal (which may also be called a touch driving signal) through the driving electrode, and the emitted driving signal is coupled to the sensing electrode through capacitive coupling, so that it can be sensed by the sensing electrode.
  • the signal sensed by the sensing electrode is called a touch sensing signal.
  • the touch main control module can control the sensing module to detect the intensity of the touch sensing signal (or the magnitude of the touch sensing signal).
  • When a touch object touches the touch screen, the intensity of the touch sensing signal will change, so that whether there is a touch and the touch location, that is, the coordinate data of the touch point, can be determined based on the intensity of the touch sensing signal.
  • the above driving signal may also be called a scanning signal.
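  • As a rough illustration of how a touch point can be located from the sensed intensities, the following sketch assumes the sensing module reports one intensity value per driving-electrode/sensing-electrode intersection and that H is the baseline intensity with no touch; the function name, grid size and threshold value are illustrative assumptions, not part of this application.
```python
# Minimal sketch of locating a touch point from touch sensing signals.
# Assumed model: one intensity value per (driving electrode Dx, sensing
# electrode Sy) node; H is the baseline intensity with nothing touching.

H = 100.0            # baseline intensity at every node when nothing touches
DETECT_DELTA = 5.0   # minimum change treated as a touch (illustrative value)

def find_touch_point(intensity_grid):
    """Return (x, y) of the node whose intensity deviates most from the
    baseline H, or None if no deviation exceeds DETECT_DELTA."""
    best = None
    best_delta = 0.0
    for x, row in enumerate(intensity_grid):
        for y, value in enumerate(row):
            delta = abs(value - H)
            if delta > best_delta:
                best_delta = delta
                best = (x, y)
    return best if best_delta >= DETECT_DELTA else None

# 5 x 5 grid as in Figure 2; a finger at node (2, 3) attenuates the signal.
grid = [[H] * 5 for _ in range(5)]
grid[2][3] = 80.0
print(find_touch_point(grid))   # -> (2, 3)
```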
  • FIG 3 is a schematic structural diagram of a possible intelligent processing system provided by an embodiment of the present application.
  • the intelligent processing system may include, for example: a host control module, a touch data receiving module, and a communication module.
  • the host control module is communicatively connected with the communication module and the touch data receiving module respectively.
  • the touch data receiving module can be communicatively connected with the touch main control module of the touch screen control system to receive touch data from the touch component.
  • the communication module can establish a communication connection with the stylus pen 200 to receive touch data from the stylus pen 200 .
  • the intelligent processing system may also include a communication antenna, so that the communication module establishes a communication connection with the stylus 200 through the communication antenna.
  • the host control module is used to perform touch response based on touch data.
  • It should be noted that the structure of the touch device 100 involved in this application is not limited to the above, and the method in the embodiments of this application can be applied to any touch device 100 that can implement the method in the embodiments of this application.
  • In order to enable the touch device to distinguish different stylus pens, the following settings are made for the stylus pens in this application: the signal strengths and/or phases of the touch signals emitted by different stylus pens are different, so that the touch sensing signals detected by the touch device based on the touch signals emitted by different stylus pens have different signal strengths, and the current touch object can thereby be distinguished.
  • FIG. 4 is a schematic flowchart of a touch control method provided by an embodiment of the present application. As shown in Figure 4, the method may include:
  • S101: Detect touch sensing signals.
  • this step may be performed by the sensing module.
  • S102 Determine whether a touch object touches the touch device according to the detected touch sensing signal.
  • this step may be performed by the touch main control module.
  • If there is a touch object touching the touch device, step S103 is executed; if there is no touch object touching the touch device, the process returns to step S101.
  • Combining the touch device 100, the touch component, and the stylus 200 in Figures 2 and 3, when a touch object touches the touch device, the intensity of the touch sensing signal will change.
  • Therefore, the touch device can determine whether a touch object touches the touch device based on whether the intensity of the detected touch sensing signal changes.
  • Figure 5 is a schematic diagram of signal strength provided by an embodiment of the present application.
  • Figure 5 takes the driving signal emitted by the driving electrode of the touch component of the touch device as an example.
  • When no touch object touches the touch device, the intensity of the touch sensing signal sensed by the sensing electrode of the touch component is H.
  • When the user's finger touches the touch device, part of the driving signal emitted by the driving electrode of the touch component is attenuated by human-body coupling, causing the touch sensing signal intensity sensed by the sensing electrode to be H1, which is less than H. In this way it can be determined whether a touch object touches the touch device, as well as the touch location.
  • S103: When it is determined that a touch object touches the touch device, acquire the first touch information of the touch object according to the touch sensing signal; wherein the first touch information includes: the touch position, and the identification of the touch object determined based on the intensity of the touch sensing signal.
  • the touch device supports touch with multiple touch objects, and the signal strengths of the touch sensing signals corresponding to different touch objects are different.
  • this step can be performed by the touch main control module, and the result is provided to the host control module.
  • When the M touch objects supported by the touch device include at least two stylus pens, that is, when M is greater than or equal to 2, in order to enable the touch device to distinguish different stylus pens, this application sets the stylus pens as follows: the signal strengths and/or phases of the touch signals emitted by different stylus pens are different, so that the touch sensing signals detected by the touch device based on the touch signals emitted by different stylus pens have different signal strengths, and the current touch object can thereby be distinguished.
  • For example, the touch signals used by different stylus pens have the same phase but different intensities; or the touch signals used by different stylus pens have different phases but the same intensity; or the touch signals used by different stylus pens have different phases and different intensities.
  • Alternatively, some of the multiple stylus pens have the same intensity but different phases, or some of the stylus pens have the same phase but different intensities, etc.
  • For example, taking two stylus pens, a first stylus and a second stylus, as an example, the touch signals of the two stylus pens can satisfy any one of the following relationships (summarized in the sketch after this list):
  • The first relationship: the phase of the touch signal emitted by the first stylus is opposite to the phase of the touch signal emitted by the second stylus, and the phase of the touch signal emitted by the second stylus is the same as the phase of the touch drive signal of the touch device. In this relationship, the intensities of the touch signal emitted by the first stylus and the touch signal emitted by the second stylus may be the same or different.
  • The second relationship: the phase of the touch signal emitted by the first stylus is opposite to the phase of the touch signal emitted by the second stylus, and the phase of the touch signal emitted by the first stylus is the same as the phase of the touch drive signal of the touch device. In this relationship, the intensities of the touch signal emitted by the first stylus and the touch signal emitted by the second stylus may be the same or different.
  • The third relationship: the phase of the touch signal emitted by the first stylus and the phase of the touch signal emitted by the second stylus are both the same as the phase of the touch drive signal of the touch device, but the intensities of the touch signal emitted by the first stylus and the touch signal emitted by the second stylus are different.
  • The fourth relationship: the phase of the touch signal emitted by the first stylus and the phase of the touch signal emitted by the second stylus are both opposite to the phase of the touch drive signal of the touch device, but the intensities of the touch signal emitted by the first stylus and the touch signal emitted by the second stylus are different.
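  • The four relationships above can also be written out as data; the following sketch is purely illustrative (the dictionary keys and labels are assumed names, not terms from this application).
```python
# The four phase/strength relationships for two pens, expressed relative to
# the touch drive signal ("in_phase" or "inverted"); illustrative only.
RELATIONSHIPS = {
    1: {"pen1_phase": "inverted",    "pen2_phase": "in_phase",
        "strengths": "same or different"},
    2: {"pen1_phase": "in_phase",    "pen2_phase": "inverted",
        "strengths": "same or different"},
    3: {"pen1_phase": "in_phase",    "pen2_phase": "in_phase",
        "strengths": "must differ"},
    4: {"pen1_phase": "inverted",    "pen2_phase": "inverted",
        "strengths": "must differ"},
}
print(RELATIONSHIPS[1])
```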
  • Taking the case where the touch signals of the first stylus and the second stylus satisfy the first relationship as an example, the phase of the touch signal emitted by the first stylus is opposite to that of the touch drive signal emitted by the driving electrode, and the phase of the touch signal emitted by the second stylus is the same as that of the touch drive signal emitted by the driving electrode. That is to say, the first stylus emits an inverted signal and the second stylus emits an in-phase signal.
  • When the first stylus touches the touch device, the inverted signal emitted by the first stylus weakens the intensity of the driving signal, so that the intensity of the touch sensing signal sensed by the sensing electrode is H2, which is less than H1.
  • the touch device can determine which touch object is currently being touched and its touch position based on the intensity of the detected touch sensing signal, and obtain the first touch information of the touch object.
  • Figure 6 is another signal strength diagram provided by an embodiment of the present application.
  • As shown in Figure 6, taking the user's limb as the user's finger, when the finger, the first stylus and the second stylus respectively touch the touch device, the intensities of the touch sensing signals satisfy the following relationship: the intensity H3 of the touch sensing signal when the second stylus touches > the intensity H1 of the touch sensing signal when the finger touches > the intensity H2 of the touch sensing signal when the first stylus touches. Therefore, the following thresholds can be set to determine the touch object:
  • the first preset signal strength threshold TH1 is between the signal strength H when there is no touch and the signal strength H1 received when the finger is in contact.
  • the second preset signal strength threshold TH2 is between the signal strength H1 received when the finger touches and the signal strength H2 received when the pen 1 touches.
  • the third preset signal strength threshold TH3 is between the signal strength H when there is no touch and the signal strength H3 received when the pen 2 is in contact.
  • Based on the above thresholds, the identification of the touch object can be obtained as follows (see the sketch after this list):
  • if the signal strength of the touch sensing signal is less than the first preset signal strength threshold TH1 and greater than or equal to the second preset signal strength threshold TH2, it is determined that the identification of the touch object is the identification of the user's limb; or,
  • if the signal strength of the touch sensing signal is less than the second preset signal strength threshold TH2, it is determined that the identification of the touch object is the identification of the first stylus; or,
  • if the signal strength of the touch sensing signal is greater than or equal to the third preset signal strength threshold TH3, it is determined that the identification of the touch object is the identification of the second stylus.
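  • The classification above can be sketched as follows, assuming the intensity ordering described for Figure 6 (H3 > TH3 > H > TH1 > H1 > TH2 > H2); the concrete numbers and identifier strings are illustrative assumptions.
```python
H = 100.0                             # intensity with no touch (illustrative)
TH1, TH2, TH3 = 90.0, 70.0, 110.0     # first/second/third preset thresholds

def identify_touch_object(strength):
    """Map the signal strength of a touch sensing signal to an identification."""
    if strength >= TH3:
        return "second_stylus"   # in-phase pen raises the intensity above H
    if strength < TH2:
        return "first_stylus"    # inverted pen lowers it below the finger level
    if strength < TH1:
        return "finger"          # user's limb: between TH2 and TH1
    return None                  # close to the baseline H: treat as no touch

print(identify_touch_object(125.0))  # second_stylus
print(identify_touch_object(85.0))   # finger
print(identify_touch_object(60.0))   # first_stylus
```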
  • The above example illustrates how to identify the identification of the touch object when a finger, the first stylus and the second stylus are used to touch the touch device.
  • For other touch objects, the identification can be determined with reference to the identification methods of the first stylus and the second stylus described in the above example, which will not be described again here.
  • It should be noted that the operation of determining the touch position based on the strength of the touch sensing signal can be implemented by the host control module of the touch device or by the touch main control module; this application does not limit this.
  • how to determine the touch position based on the intensity of the touch sensing signal please refer to the implementation of the existing technology, which will not be described again here.
  • S104: Perform a touch response at the touch position according to the touch parameters corresponding to the identification of the touch object.
  • this step may be performed by the host control module.
  • the touch parameters mentioned here may, for example, be related to the function currently used by the user, or the function of the APP currently used, or in other words, related to the touch operation performed.
  • the touch parameters mentioned here may include drawing parameters, such as the color of the handwriting, the thickness of the handwriting, etc.
  • the touch parameters mentioned here may be tactile feedback parameters, so that different touch objects can adopt different tactile feedbacks.
  • For example, the drawing parameter corresponding to the user's limbs is that the color of the handwriting is blue, the drawing parameter corresponding to stylus 1 is that the color of the handwriting is red, and the drawing parameter corresponding to stylus 2 is that the color of the handwriting is green.
  • Then, when user A writes on the smart interactive tablet with a finger, the handwriting displayed on the smart interactive tablet is blue; when user B writes with stylus 1, the handwriting displayed is red; and when user C writes with stylus 2, the handwriting displayed is green.
  • the user can accurately distinguish which handwriting corresponds to which user based on the color of each handwriting on the intelligent interactive tablet without the need for the user to change the color when writing.
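  • A minimal sketch of selecting drawing parameters by the identification of the touch object follows; the handwriting colors are taken from the example above, while the width values and function names are illustrative assumptions.
```python
# Per-touch-object drawing parameters; colors follow the example above,
# the widths are made-up placeholders.
DRAWING_PARAMS = {
    "finger":   {"color": "blue",  "width": 3},
    "stylus_1": {"color": "red",   "width": 2},
    "stylus_2": {"color": "green", "width": 2},
}

def touch_response(identification, position):
    """Pick the drawing parameters for this touch object and 'draw' at position."""
    params = DRAWING_PARAMS.get(identification)
    if params is None:
        return
    print(f"draw at {position} with color={params['color']} width={params['width']}")

touch_response("stylus_1", (120, 348))   # red handwriting for user B's pen
```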
  • In this way, the smart interactive tablet can distinguish the input behavior of the user's limbs and of at least two stylus pens; not only can multiple touch objects be used to touch the smart interactive tablet, but the smart interactive tablet can also have different touch responses for different touch objects, which improves the flexibility and intelligence of the touch response of the smart interactive tablet.
  • In the embodiments of the present application, when the touch device performs touch recognition, it can not only identify the touch position of the touch object, but also identify the current touch object, so that it can perform a touch response based on the touch parameters corresponding to the touch object, achieving the effect of different touch responses for different touch objects and improving the flexibility and intelligence of the touch response of the touch device.
  • the above-mentioned touch device can further perform touch response in combination with the pressure used by the touch object when touching the touch device. That is, before step S104, the touch device may also update the first touch information to obtain updated touch information.
  • the updated touch information includes: the identification of the touch object, the touch position, and the pressure used by the touch object when it touches the touch device.
  • Then, the touch device performs a touch response at the touch position based on the touch parameters corresponding to the identification of the touch object and the pressure used when the touch object touches the touch device. Because the pressure reflects the strength used by the user when touching, in this embodiment the touch device can further combine the pressure used when the touch object touches the touch device to bring a more realistic touch response to the user.
  • For example, taking a writing operation as an example, the touch device can display, at the touch position on the display screen, handwriting that matches the drawing parameters of the touch object and the pressure used when the touch object touches the touch device. That is, the pressure used when the touch object touches the touch device can be further combined to present handwriting that matches the strength corresponding to the pressure, so as to more accurately restore the user's real handwriting and thereby give the user a more realistic touch response.
  • For another example, the touch device can further combine the pressure used when the touch object touches the touch device to provide the user with tactile feedback that matches the intensity corresponding to the pressure, thus giving the user a more realistic touch response.
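  • One possible way to combine the pressure with the drawing parameters is sketched below; the mapping from pressure to stroke width is an illustrative assumption and is not specified in this application.
```python
def stroke_width(base_width, pressure, max_pressure=1.0):
    """Scale the base width by the normalized pressure (clamped to [0, 1])."""
    ratio = max(0.0, min(pressure / max_pressure, 1.0))
    return base_width * (0.5 + ratio)   # 0.5x width at zero pressure, 1.5x at full

print(stroke_width(base_width=2, pressure=0.8))   # -> 2.6
```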
  • the touch component of the touch device may further include at least one pressure sensor.
  • the pressure sensor may be, for example, a stress plate that induces deformation, or a sensor that can convert pressure into an electromagnetic signal, such as a piezoelectric sensor, a magnetostrictive sensor, an acceleration sensor, an optical sensor, etc.
  • the specific location of the pressure sensor and the number of the pressure sensor can be determined according to the actual product form of the touch device.
  • In this implementation, the touch device can identify, through the at least one pressure sensor, the pressure used by the user when using the touch object for touch, so that the touch device uses the pressure to update the first touch information to obtain the updated touch information.
  • In another possible implementation, in addition to the touch component of the touch device including at least one pressure sensor, the touch object (for example, the stylus) may also be provided with a pressure sensor accordingly.
  • In this case, when touching the touch device, the touch object may send touch information to the touch device; the touch information may include, for example, the identification of the touch object and the pressure used when touching the touch device.
  • For a touch object that does not send touch information (for example, the user's limb), the touch device uses the pressure it recognizes based on its own pressure sensor to update the first touch information to obtain the updated touch information.
  • When the touch device detects that the touch object is a stylus, the touch device can obtain the touch information sent by the stylus.
  • Taking the first stylus as an example, the first stylus can send second touch information to the touch device; the second touch information includes the identification of the first stylus and the pressure used when the first stylus touches the touch device.
  • When the touch device determines that the identification of the touch object determined based on the touch sensing signal is the same as the identification carried in the second touch information sent by the first stylus, it merges the first touch information and the second touch information to obtain the updated touch information.
  • the above operations of receiving the second touch information and updating the first touch information may be performed by the host control module.
  • the first touch information is as shown in the following Table 1:
  • the second touch information is shown in Table 2 below:
  • the updated touch information is obtained, for example, as shown in Table 3 below:
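  • The merge step described above (Tables 1 to 3) might look like the following sketch; the field names and example values are illustrative assumptions.
```python
# First touch information comes from the touch component, second touch
# information comes from the stylus over the communication link.
first_touch_info = {"identification": "first_stylus", "position": (512, 260)}
second_touch_info = {"identification": "first_stylus", "pressure": 0.42}

def merge_touch_info(first, second):
    """Merge the two records once their identifications match."""
    if first["identification"] != second["identification"]:
        raise ValueError("touch information belongs to different touch objects")
    updated = dict(first)
    updated["pressure"] = second["pressure"]
    return updated

print(merge_touch_info(first_touch_info, second_touch_info))
# {'identification': 'first_stylus', 'position': (512, 260), 'pressure': 0.42}
```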
  • In this way, when the touch device performs a touch response, it can further combine the pressure used when the touch object touches the touch device, thereby providing the user with a more realistic touch response.
  • the stylus involved in the embodiments of the present application may be, for example, an active stylus.
  • the touch device can identify whether the active stylus is touching, and its touch position, by receiving the touch signal emitted by the active stylus; therefore, the active stylus can achieve touch with a small pen tip.
  • However, if the intensity of the touch signal emitted by the active stylus is small, the touch device may be unable to accurately identify the touch position.
  • When the touch position recognition is inaccurate, the written handwriting shows obvious jitter and the fonts are severely distorted.
  • FIG. 7 is a schematic flowchart of another touch control method provided by an embodiment of the present application. As shown in Figure 7, considering the above problems, the stylus in the embodiment of the present application uses the following method to transmit touch signals:
  • S201: Detect the pressure of the tip of the stylus.
  • S202: Determine whether the pressure of the tip of the stylus is greater than or equal to a preset pressure threshold.
  • If the pressure is greater than or equal to the preset pressure threshold, step S203 is executed.
  • If the pressure is less than the preset pressure threshold, step S204 is performed.
  • S203: Send touch information to the touch device, and transmit a touch signal with a preset phase and signal strength to the touch device through the pen tip; the touch information includes: the identification of the stylus, and the pressure used when the stylus touches the touch device.
  • S204: Stop sending the touch information to the touch device, and stop transmitting the touch signal through the pen tip.
  • the pressure used when the stylus pen touches the touch device is the pressure of the tip of the stylus pen.
  • When the stylus detects the touch drive signal emitted by the drive electrode of the touch component of the touch device, the detected touch drive signal is processed to obtain the touch signal with the preset phase and signal strength.
  • the stylus can transmit a touch signal with high signal strength to the touch device to improve the accuracy of the touch device in identifying its touch position, so that the touch device can provide an accurate touch response.
  • Taking the touch device being a smart interactive tablet as an example, the stylus can identify whether the tip of the stylus is in contact with the screen of the smart interactive tablet based on the relationship between the pressure of the pen tip and the preset pressure threshold.
  • When the stylus recognizes that the pressure of the pen tip is greater than or equal to the preset pressure threshold, the stylus can determine that the pen tip is in contact with the screen of the intelligent interactive tablet.
  • At this time, the stylus can transmit touch signals to the smart interactive tablet through the pen tip, and send touch information to the smart interactive tablet through the communication module and communication antenna, so that the smart interactive tablet can perform a touch response.
  • the stylus pen when the stylus pen recognizes that the pressure of the pen tip is less than the preset pressure threshold, the stylus pen can determine that the pen tip of the stylus pen has left the screen of the intelligent interactive tablet. At this time, the stylus can stop transmitting touch signals to the smart interactive tablet through the pen tip, and stop sending touch information to the smart interactive tablet through the communication module and communication antenna to avoid floating touch.
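  • The pen-side flow of Figure 7 can be sketched as below; the Pen class is a stub standing in for the real hardware (pressure sensor, Bluetooth link, touch signal output unit), and all names and the threshold value are illustrative assumptions.
```python
PRESSURE_THRESHOLD = 0.05   # preset pressure threshold (illustrative)

class Pen:
    """Stub for the stylus hardware used in this sketch."""
    def __init__(self, pen_id):
        self.pen_id = pen_id
        self.emitting = False            # is the touch signal output unit on?
    def read_tip_pressure(self):         # stand-in for the pressure sensor + ADC
        return 0.12
    def send_touch_info(self, pressure):
        print(f"send touch info: id={self.pen_id} pressure={pressure}")
    def stop_touch_info(self):
        print("stop sending touch info")

def pen_step(pen):
    """One iteration of the control loop: read the pressure, then S203 or S204."""
    pressure = pen.read_tip_pressure()
    if pressure >= PRESSURE_THRESHOLD:   # tip pressed against the screen
        pen.send_touch_info(pressure)    # S203: report id + pressure over Bluetooth
        pen.emitting = True              #        and emit the touch signal via the tip
    else:                                # tip has left the screen
        pen.stop_touch_info()            # S204: stop reporting
        pen.emitting = False             #        and stop emitting, avoiding floating touch

pen_step(Pen("first_stylus"))
```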
  • In the following, taking the communication module being a Bluetooth module and the communication antenna being a Bluetooth antenna as an example, the structure of the stylus 200 is schematically explained.
  • FIG. 8 is a schematic structural diagram of a possible stylus 200 provided by an embodiment of the present application.
  • the stylus 200 used to implement the embodiment shown in FIG. 7 may include, for example: a pen tip, an induction ring, a pressure sensor, an analog-to-digital conversion module, a touch signal output unit, a general control module, a power module, a Bluetooth module and a Bluetooth antenna.
  • Pen tip used to transmit the pressure generated when the user uses the stylus pen 200 to touch to the touch device 100, and to emit the touch signal output by the output amplification module.
  • Induction loop used to sense the touch drive signal emitted by the touch component of the touch device 100 .
  • Pressure sensor used to detect the pressure generated when the user uses the stylus pen 200 for touch, that is, the pressure transmitted through the pen tip.
  • Analog-to-digital conversion module used to convert the electrical signal of pressure detected by the pressure sensor into a digital signal.
  • the touch signal output unit is used to perform the action of outputting touch signals according to the control instructions of the main control module.
  • Power module provides power to the stylus.
  • General control module: used to determine whether the pressure generated when the user uses the stylus pen 200 for touch is greater than or equal to the preset pressure threshold, and when it is greater than or equal to the preset pressure threshold, perform the following operations: send touch information to the touch device 100 through the Bluetooth module and Bluetooth antenna, and control the touch signal output unit to output a touch signal with a preset phase and signal strength, which is transmitted to the touch device 100 through the pen tip.
  • the touch information may include: the identification of the stylus, and the pressure generated when the user uses the stylus 200 for touch.
  • In addition, the general control module can also have the following function: when the pressure is less than the preset pressure threshold, stop sending the touch information through the Bluetooth module and Bluetooth antenna, and control the touch signal output unit to stop outputting the touch signal.
  • the above-mentioned overall control module can be implemented using any circuit or chip that can realize the control function, such as MCU.
  • the specific modules included in the above-mentioned touch signal output unit are related to the phase of the touch signal that needs to be output.
  • For a stylus whose touch signal is opposite in phase to the touch drive signal, the touch signal output unit needs to have the following functions: perform inverting amplification processing on the detected touch drive signal, and be able to be shut down according to the instruction of the general control module.
  • In this case, the touch signal output unit may include the following implementation methods:
  • Figure 9 is a schematic structural diagram of the first touch signal output unit provided by an embodiment of the present application. As shown in Figure 9, the touch signal output unit includes: an output control module and an output amplification module.
  • the output control module is used to control whether the output amplification module outputs the touch signal according to the control instructions of the main control module.
  • the output amplification module is used to perform inverse amplification processing on the touch drive signal sensed by the induction ring. After obtaining the touch signal, it is output to the pen tip.
  • the above-mentioned output amplification module may also be called an inverting amplification module.
  • FIG 10 is a schematic structural diagram of the second touch signal output unit provided by the embodiment of the present application.
  • the touch signal output unit includes: an output control module, an input detection module and an output amplification module.
  • the output control module is used to control whether the output amplification module outputs the touch signal according to the control instructions of the main control module.
  • the input detection module is used to perform in-phase amplification processing on the touch drive signal sensed by the induction ring.
  • the output amplification module is used to perform inverse amplification processing on the touch drive signal amplified by the input detection module. After obtaining the touch signal, it is output to the pen tip.
  • the above-mentioned input detection module can also be called a non-inverting amplification module
  • the above-mentioned output amplification module can also be called an inverting amplification module.
  • Figure 11 is a schematic structural diagram of the third touch signal output unit provided by the embodiment of the present application. As shown in Figure 11, the touch signal output unit includes: an output control module, an input detection module and an output amplification module. .
  • the output control module is used to control whether the output amplification module outputs the touch signal according to the control instructions of the main control module.
  • the input detection module is used for inverting and amplifying the touch drive signal sensed by the induction ring.
  • the output amplification module is used to perform in-phase amplification processing on the touch drive signal amplified by the input detection module. After obtaining the touch signal, it is output to the pen tip.
  • the above-mentioned input detection module may also be called an inverting amplification module
  • the above-mentioned output amplification module may also be called a non-inverting amplification module.
  • For a stylus whose touch signal is in phase with the touch drive signal, the touch signal output unit needs to have the following functions: perform in-phase amplification processing on the detected touch drive signal, and be able to be shut down according to the instruction of the general control module.
  • In this case, the touch signal output unit may include the following implementation methods:
  • Figure 12 is a schematic structural diagram of a fourth touch signal output unit provided by an embodiment of the present application. As shown in Figure 12, the touch signal output unit includes: an output control module and an output amplification module.
  • the output control module is used to control whether the output amplification module outputs the touch signal according to the control instructions of the main control module.
  • the output amplification module is used to perform in-phase amplification processing on the touch drive signal sensed by the induction ring. After obtaining the touch signal, it is output to the pen tip.
  • the above output amplification module may also be called a non-inverting amplification module.
  • Figure 13 is a schematic structural diagram of the fifth touch signal output unit provided by an embodiment of the present application. As shown in Figure 13, the touch signal output unit includes: an output control module, an input detection module and an output amplification module.
  • the output control module is used to control whether the output amplification module outputs the touch signal according to the control instructions of the main control module.
  • the input detection module is used to perform in-phase amplification processing on the touch drive signal sensed by the induction ring.
  • the output amplification module is used to perform in-phase amplification processing on the touch drive signal amplified by the input detection module. After obtaining the touch signal, it is output to the pen tip.
  • both the input detection module and the output amplification module can be called non-inverting amplification modules.
  • Figure 14 is a schematic structural diagram of a sixth touch signal output unit provided by an embodiment of the present application.
  • the touch signal output unit includes: an output control module, an input detection module and an output amplification module.
  • the output control module is used to control whether the output amplification module outputs the touch signal according to the control instructions of the main control module.
  • the input detection module is used for inverting and amplifying the touch drive signal sensed by the induction ring.
  • the output amplification module is used to perform inverse amplification processing on the touch drive signal amplified by the input detection module. After obtaining the touch signal, it is output to the pen tip.
  • both the input detection module and the output amplification module can be called inverting amplification modules.
  • In the implementations shown above, the overall control module can identify whether the pen tip of the stylus touches the touch component of the touch device based on the relationship between the pressure at the pen tip and the preset pressure threshold. When the pressure at the pen tip of the stylus is greater than or equal to the preset pressure threshold, the overall control module may determine that the pen tip is in contact with the touch component of the touch device. At this time, the overall control module can, through the output control module, control the output amplification module to output the touch signal to the pen tip, so as to emit the touch signal to the touch device.
  • When the pressure at the pen tip of the stylus is less than the preset pressure threshold, the overall control module may determine that the pen tip has left the touch component of the touch device. At this time, the overall control module can, through the output control module, control the output amplification module to stop emitting the touch signal to the touch device, so as to avoid floating touch.
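  • As an illustration only, the following sketch models the pressure-threshold gating described above; the threshold value and the helper names (set_output_enabled, send_touch_info) are assumptions for the example, not part of the application:

```python
# Hedged sketch of the pen-tip pressure gating in the stylus overall control module.

PRESSURE_THRESHOLD = 30  # preset pressure threshold, arbitrary units

def set_output_enabled(enabled: bool) -> None:
    # stands in for driving the output control module of the touch signal output unit
    print("touch signal output", "enabled" if enabled else "disabled")

def send_touch_info(stylus_id: str, pressure: float) -> None:
    # stands in for the communication module / antenna path to the touch device
    print(f"touch info -> device: id={stylus_id}, pressure={pressure}")

def on_pressure_sample(stylus_id: str, pressure: float) -> None:
    if pressure >= PRESSURE_THRESHOLD:
        # pen tip is in contact: emit the touch signal and report touch information
        set_output_enabled(True)
        send_touch_info(stylus_id, pressure)
    else:
        # pen tip lifted: stop emitting to avoid floating touch
        set_output_enabled(False)

if __name__ == "__main__":
    for p in (0, 45, 60, 10):
        on_pressure_sample("stylus-1", p)
```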
  • It should be understood that the in-phase amplification modules in the touch signal output units shown in FIGS. 9 to 14 can be implemented by any circuit with a non-inverting amplification function, and the inverting amplification modules can be implemented by any circuit with an inverting amplification function.
  • the above-mentioned output control module can be implemented, for example, by any switch control circuit that can be turned on or off through control instructions.
  • For example, taking the structure of the third touch signal output unit shown in Figure 11 as an example, Figure 15 is a second schematic structural diagram of the third touch signal output unit provided by an embodiment of the present application.
  • In this implementation, the output control module, input detection module and output amplification module included in the touch signal output unit can be implemented, for example, by a circuit as shown in Figure 15, which will not be described again.
  • the stylus can avoid floating touch without reducing the intensity of the touch signal emitted by the stylus to the touch device. Therefore, the stylus can emit a touch signal with high signal strength to the touch device to improve the accuracy of the touch device in identifying its touch position, so that the touch device can provide an accurate touch response.
  • FIG. 16 is a schematic structural diagram of a touch device provided by an embodiment of the present application.
  • the touch device includes a detection module 11 , a determination module 12 , an acquisition module 13 , and a response module 14 .
  • the touch device may also include: an update module 15 and/or a receiving module 16.
  • The detection module 11 is used to detect touch sensing signals.
  • The determination module 12 is used to determine, according to the detected touch sensing signal, whether a touch object touches the touch device.
  • the acquisition module 13 is configured to acquire the first touch information of the touch object according to the touch sensing signal when it is determined that a touch object touches the touch device; wherein, the first touch The information includes: the touch position, and the identification of the touch object determined based on the intensity of the touch sensing signal; the touch device supports multiple touch objects, and the touch objects corresponding to different touch objects The signal strength of the induction signal is different;
  • the response module 14 is configured to perform a touch response at the touch position according to the touch parameters corresponding to the identification of the touch object.
  • Taking the case where a first touch object and a second touch object touch the touch device at the same time as an example, the acquisition module 13 may be specifically configured to: when it is determined that a first touch object and a second touch object touch the touch device at the same time, acquire the first touch information of the first touch object according to the touch sensing signal corresponding to the first touch object, and acquire the first touch information of the second touch object according to the touch sensing signal corresponding to the second touch object; wherein the first touch information includes: the touch position, and the identification of the touch object determined based on the intensity of the touch sensing signal corresponding to the touch object.
  • The response module 14 is specifically configured to perform a touch response at the touch position of the first touch object according to the touch parameter corresponding to the identification of the first touch object, and to perform a touch response at the touch position of the second touch object according to the touch parameter corresponding to the identification of the second touch object.
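  • For illustration, a minimal sketch of this per-object dispatch is given below; the identifiers and the touch-parameter table are assumed values, not defined by the application:

```python
# Per-identification touch parameters; the table and identifiers are invented
# for the example only.
TOUCH_PARAMS = {
    "finger":   {"color": "blue"},
    "stylus_1": {"color": "red"},
    "stylus_2": {"color": "green"},
}

def respond(identification: str, position: tuple) -> None:
    params = TOUCH_PARAMS[identification]
    print(f"touch response at {position} with {params}")

def handle_frame(touch_points: list) -> None:
    """touch_points: [(identification, position), ...] for every object touching in this frame."""
    for identification, position in touch_points:
        # each simultaneous touch object gets the response tied to its own identification
        respond(identification, position)

if __name__ == "__main__":
    handle_frame([("stylus_1", (120, 80)), ("finger", (400, 300))])
```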
  • Optionally, the plurality of touch objects include the user's limbs and M styluses, where M is an integer greater than or equal to 1, and the touch sensing signals corresponding to any part of the user's limbs have the same signal strength; alternatively, the plurality of touch objects include M styluses, where M is an integer greater than or equal to 2.
  • When M is greater than or equal to 2, the signal strengths and/or phases of the touch signals emitted by the respective styluses are different, so that the signal strengths of the touch sensing signals detected by the touch device are different.
  • For example, M is greater than or equal to 2, the M styluses include a first stylus and a second stylus, the phase of the touch signal emitted by the first stylus is opposite to the phase of the touch signal emitted by the second stylus, and the phase of the touch signal emitted by the second stylus is the same as the phase of the touch drive signal of the touch device.
  • In this example, the acquisition module 13 is specifically configured to: determine that the identification of the touch object is the identification of the user's limb when the signal strength of the touch sensing signal is less than or equal to the first preset signal strength threshold and greater than or equal to the second preset signal strength threshold; determine that the identification of the touch object is the identification of the first stylus when the signal strength of the touch sensing signal is less than the second preset signal strength threshold; and determine that the identification of the touch object is the identification of the second stylus when the signal strength of the touch sensing signal is greater than or equal to the third preset signal strength threshold, where the third preset signal strength threshold is greater than the first preset signal strength threshold.
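  • A minimal sketch of this threshold-based identification might look as follows; the threshold values are assumptions chosen only so the example runs:

```python
from typing import Optional

# TH1 > TH2 and TH3 > TH1; the concrete values below are placeholders.
TH1, TH2, TH3 = 60, 40, 110  # first, second, third preset signal strength thresholds

def identify_touch_object(strength: float) -> Optional[str]:
    if strength >= TH3:
        return "stylus_2"   # in-phase pen boosts the sensed signal above TH3
    if strength < TH2:
        return "stylus_1"   # anti-phase pen attenuates the signal below TH2
    if TH2 <= strength <= TH1:
        return "finger"     # a limb attenuates the signal into [TH2, TH1]
    return None             # otherwise treated as no recognized touch in this sketch

if __name__ == "__main__":
    for s in (120, 30, 50, 90):
        print(s, identify_touch_object(s))
```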
  • As another example, M is greater than or equal to 2, the M styluses include a first stylus and a second stylus, the phase of the touch signal emitted by the first stylus is opposite to the phase of the touch signal emitted by the second stylus, and the phase of the touch signal emitted by the second stylus is the same as the phase of the touch drive signal of the touch device.
  • In this example, the acquisition module 13 is specifically configured to: determine that the identification of the touch object is the identification of the first stylus when the signal strength of the touch sensing signal is less than the second preset signal strength threshold; and determine that the identification of the touch object is the identification of the second stylus when the signal strength of the touch sensing signal is greater than or equal to the third preset signal strength threshold, where the third preset signal strength threshold is greater than the first preset signal strength threshold.
  • Optionally, the update module 15 is configured to update the first touch information to obtain updated touch information before the response module 14 performs a touch response at the touch position according to the touch parameters corresponding to the identification of the touch object. The updated touch information includes: the identification of the touch object, the touch position, and the pressure used when the touch object touches the touch device.
  • In this case, the response module 14 is specifically configured to perform a touch response at the touch position according to the touch parameters corresponding to both the identification of the touch object and the pressure.
  • For example, when the identification of the touch object is the identification of the user's limb, the update module 15 is specifically configured to detect the pressure used when the touch object touches the touch device, and add the detected pressure to the first touch information to obtain the updated touch information.
  • For example, when the identification of the touch object is the identification of a stylus, the receiving module 16 is configured to receive second touch information sent by the stylus; the second touch information includes the identification of the stylus and the pressure used when the stylus touches the touch device.
  • In this case, the update module 15 is specifically configured to merge the first touch information and the second touch information to obtain the updated touch information when the identification of the touch object determined based on the intensity of the touch sensing signal is the same as the identification of the stylus.
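  • As a rough illustration, merging the two pieces of touch information when the identifications match could be sketched as follows; the field names are assumptions made for the example:

```python
from typing import Optional

def merge_touch_info(first: dict, second: dict) -> Optional[dict]:
    """Merge device-side and stylus-side reports when they name the same stylus.

    first:  {"id": ..., "position": ...}  derived from the touch sensing signal
    second: {"id": ..., "pressure": ...}  received from the stylus over the radio link
    """
    if first["id"] != second["id"]:
        return None  # the reports belong to different styluses; do not merge
    return {"id": first["id"], "position": first["position"], "pressure": second["pressure"]}

if __name__ == "__main__":
    first = {"id": "stylus_1", "position": (120, 80)}
    second = {"id": "stylus_1", "pressure": 52}
    print(merge_touch_info(first, second))
```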
  • Optionally, the touch operation performed by the touch object on the touch device is a writing operation, and the touch parameters include drawing parameters.
  • In this case, the response module 14 is specifically configured to display, at the touch position, writing handwriting that matches the drawing parameters, according to the drawing parameters corresponding to both the identification of the touch object and the pressure.
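  • For illustration, a possible mapping from the identification and the pressure to drawing parameters is sketched below; the colors and the width formula are assumptions, not specified by the application:

```python
# Assumed per-identification base style; heavier pressure yields a thicker stroke,
# mimicking a real pen. The mapping is an example only.
BASE_STYLE = {
    "finger":   {"color": "blue",  "base_width": 3.0},
    "stylus_1": {"color": "red",   "base_width": 2.0},
    "stylus_2": {"color": "green", "base_width": 2.0},
}

def drawing_params(identification: str, pressure: float) -> dict:
    style = BASE_STYLE[identification]
    width = style["base_width"] * (1.0 + pressure / 100.0)  # scale width with pressure
    return {"color": style["color"], "width": round(width, 2)}

if __name__ == "__main__":
    print(drawing_params("stylus_1", 20))   # light stroke
    print(drawing_params("stylus_1", 80))   # heavier, thicker stroke
```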
  • It should be noted that when the touch device provided in the above embodiments performs the foregoing touch method, the division of the above functional modules is only used as an example. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
  • In addition, the touch device provided by the above embodiments belongs to the same concept as the foregoing touch method embodiments; for details of the implementation process, refer to the method embodiments, which will not be described again here.
  • Embodiments of the present application also provide a computer storage medium.
  • the computer storage medium can store multiple instructions.
  • the instructions are suitable for being loaded by a processor to execute the steps performed by the touch device in the touch method described above, which will not be described in detail here.
  • the device where the storage medium is located may be a touch device, and the touch device may be, for example, a smart interactive tablet.
  • Embodiments of the present application also provide another computer storage medium.
  • the computer storage medium can store multiple instructions.
  • the instructions are suitable for being loaded by a processor to execute the steps performed by the stylus in the touch method described above; no further details will be given here.
  • the device where the storage medium is located may be a stylus, and the touch device may be, for example, an intelligent interactive tablet.
  • Embodiments of the present application also provide a touch device.
  • the touch device may include: a touch component, a processor, and a memory; wherein the memory stores a computer program, and the computer program is adapted to be loaded by the processor to execute the steps of the touch method described above, which will not be described again here.
  • the touch device mentioned here may be, for example, the touch device described in the embodiments of the present application.
  • the touch device may be, for example, a smart interactive tablet.
  • FIG 17 is a schematic structural diagram of an intelligent interactive tablet provided by an embodiment of the present application.
  • the intelligent interactive tablet 1000 may include: at least one processor 1001, at least one network interface 1004, user interface 1003, memory 1005, and at least one communication bus 1002.
  • the communication bus 1002 is used to realize connection communication between these components.
  • the user interface 1003 may include a display screen (Display) and a camera (Camera), and the optional user interface 1003 may also include a standard wired interface and a wireless interface.
  • the network interface 1004 may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface).
  • the processor 1001 may include one or more processing cores.
  • the processor 1001 uses various interfaces and lines to connect the various parts of the entire intelligent interactive tablet 1000, and executes the various functions of the intelligent interactive tablet 1000 and processes data by running or executing instructions, programs, code sets or instruction sets stored in the memory 1005 and calling data stored in the memory 1005.
  • the processor 1001 can be implemented in hardware using at least one of digital signal processing (DSP), field-programmable gate array (FPGA), and programmable logic array (PLA).
  • the processor 1001 can integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processor (Graphics Processing Unit, GPU), a modem, etc.
  • the CPU mainly handles the operating system, user interface, and applications
  • the GPU is responsible for rendering and drawing the content that needs to be displayed on the display
  • the modem is used to handle wireless communications. It can be understood that the above-mentioned modem may not be integrated into the processor 1001 and may be implemented by a separate chip.
  • the memory 1005 may include random access memory (RAM) or read-only memory (ROM).
  • the memory 1005 includes non-transitory computer-readable storage medium.
  • Memory 1005 may be used to store instructions, programs, codes, sets of codes, or sets of instructions.
  • the memory 1005 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing the operating system, instructions for at least one function (such as a touch function, a sound playback function, an image playback function, etc.), instructions for implementing each of the above method embodiments, and so on; the data storage area may store the data involved in each of the above method embodiments.
  • the memory 1005 may optionally be at least one storage device located remotely from the aforementioned processor 1001. As shown in Figure 17, memory 1005, which is a computer storage medium, may include an operating system, a network communication module, a user interface module, and an operating application program for an intelligent interactive tablet.
  • In the intelligent interactive tablet 1000 shown in Figure 17, the user interface 1003 is mainly used to provide an input interface for the user and obtain the data input by the user, and the processor 1001 can be used to call the operating application program of the intelligent interactive tablet stored in the memory 1005 and specifically execute the actions of the touch device in the aforementioned touch method.
  • Embodiments of the present application also provide a stylus, which may include: a processor and a memory; wherein the memory stores a computer program, and the computer program is adapted to be loaded by the processor to execute the steps of the touch method described above, which will not be described again here.
  • Embodiments of the present application provide a touch system, which includes: a touch device as described above, and at least two styluses as described above, for implementing the foregoing touch method, which will not be described in detail here.
  • embodiments of the present application may be provided as methods, systems, or computer program products. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment that combines software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means, and the instruction means implement the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions may also be loaded onto a computer or other programmable data processing device, causing a series of operation steps to be performed on the computer or other programmable device to produce computer-implemented processing, such that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
  • In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • Memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in a computer-readable medium, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
  • Computer-readable media include persistent and non-persistent, removable and non-removable media, and may implement information storage by any method or technology.
  • Information may be computer-readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
  • As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.

Abstract

本申请实施例公开了一种触控方法、装置、设备、系统、存储介质及程序产品,属于触控技术领域。方法包括:检测触控感应信号;根据检测到的触控感应信号,确定是否有触控物触控触控设备;若确定有触控物触控触控设备,则根据触控感应信号,获取触控物的第一触控信息;其中,第一触控信息包括:触控位置,以及,基于触控感应信号的强度确定的触控物的标识;触控设备支持多个触控物,不同触控物的触控感应信号的信号强度不同;根据触控物的标识对应的触控参数,在触控位置进行触控响应。因此,本申请实施例可以在有多个触控物触控触控设备时,提高触控设备的触控响应的灵活性和智能化。

Description

触控方法、装置、设备、系统、存储介质及程序产品 技术领域
本申请涉及触控领域,尤其涉及一种触控方法、装置、设备、系统、存储介质及程序产品。
背景技术
触控设备是一种设置有触控组件和显示屏的设备,用户在使用触控设备时,可以通过触摸显示屏上显示的组件或内容,实现对触控设备的操作。由于触控设备摆脱了键盘和鼠标的束缚,使得人机交互更为直截了当,受到越来越多的用户的青睐。
目前,针对采用电容感应式的触摸组件的触控设备,虽然支持通过多个触控物对其进行触控,例如,用户的手指和至少一个有源触控笔等,但是,目前此类触控设备的触控响应不够灵活和不够智能化,无法满足用户实际使用时的需求。
发明内容
本申请实施例提供了一种触控方法、装置、设备、系统、存储介质及程序产品,可以解决在通过多个触控物对采用电容感应式的触摸组件的触控设备进行触控时,触控设备的触控响应不够灵活和不够智能化的问题。所述技术方案如下:
第一方面,本申请实施例提供了一种触控方法,所述方法应用于触控设备,所述方法包括:
检测触控感应信号;
根据检测到的触控感应信号,确定是否有触控物触控所述触控设备;
若确定有触控物触控所述触控设备,则根据所述触控感应信号,获取所述触控物的第一触控信息;其中,所述第一触控信息包括:触控位置,以及,基于所述触控感应信号的强度确定的所述触控物的标识;所述触控设备支持多个触控物触控,不同触控物对应的触控感应信号的信号强度不同;
根据所述触控物的标识对应的触控参数,在所述触控位置进行触控响应。
可选地,所述多个触控物包括用户的肢体,以及,M个触控笔,其中,M为大于或等于1的整数;所述用户的肢体的任一部位对应的触控感应信号的信号强度相同;
或者,所述多个触控物包括M个触控笔,其中,M为大于或等于2的整数;
当M大于或等于2时,各触控笔发射的触控信号的信号强度和/或相位不同,以使所述触控设备检测到的触控感应信号的信号强度不同。
可选地,所述M大于或等于2,所述M个触控笔包括第一触控笔和第二触控笔,所述第一触控笔发射的触控信号的相位和所述第二触控笔发射的触控信号的相位相反,且所述第二触控笔的发射的触控信号的相位与所述触控设备的触控驱动信号的相位相同;
根据所述触控感应信号,获取所述触控物的标识,包括:
若所述触控感应信号的信号强度小于或等于第一预设信号强度阈值、且大于或等于第二预设信号强度阈值,则确定所述触控物为用户的肢体的标识;
若所述触控感应信号的信号强度小于所述第二预设信号强度阈值,则确定所述触控物 的标识为所述第一触控笔的标识;
若所述触控感应信号的信号强度大于或等于第三预设信号强度阈值,则确定所述触控物的标识为所述第二触控笔的标识;所述第三预设信号强度阈值大于所述第一预设信号强度阈值。
可选地,所述M大于或等于2,所述M个触控笔包括第一触控笔和第二触控笔,所述第一触控笔发射的触控信号的相位和所述第二触控笔发射的触控信号的相位相反,且所述第二触控笔的发射的触控信号的相位与所述触控设备的触控驱动信号的相位相同;
根据所述触控感应信号,获取所述触控物的标识,包括:
若所述触控感应信号的信号强度小于第二预设信号强度阈值,则确定所述触控物的标识为所述第一触控笔的标识;
若所述触控感应信号的信号强度大于或等于第三预设信号强度阈值,则确定所述触控物的标识为所述第二触控笔的标识;所述第三预设信号强度阈值大于所述第二预设信号强度阈值。
可选地,所述根据所述触控物的标识对应的触控参数,在所述触控位置进行触控响应之前,还包括:
对所述第一触控信息更新,得到更新后的触控信息,所述更新后的触控信息包括:所述触控物的标识、所述触控位置,以及,所述触控物触控所述触控设备时所使用的压力;
所述根据所述触控物的标识对应的触控参数,在所述触控位置进行触控响应,包括:
根据所述触控物的标识和所述压力两者对应的触控参数,在所述触控位置进行触控响应。
可选地,所述触控物的标识为用户的肢体的标识,所述对所述第一触控信息更新,得到更新后的触控信息,包括:
检测所述触控物触控所述触控设备时所使用的压力;
将检测到的所述压力添加至所述第一触控信息中,得到所述更新后的触控信息。
可选地,所述触控物的标识为触控笔的标识,所述方法还包括:
接收所述触控笔发送的第二触控信息;所述第二触控信息包括所述触控笔的标识,以及,所述触控笔触控所述触控设备时所使用的压力;
所述对所述第一触控信息更新,得到更新后的触控信息,包括:
若基于所述触控感应信号的强度确定的所述触控物的标识与所述触控笔的标识相同,则合并所述第一触控信息和所述第二触控信息,得到更新后的触控信息。
可选地,所述触控物触控触控设备执行的触控操作为书写操作,所述触控参数包括绘制参数;
所述根据所述触控物的标识和所述压力两者对应的触控参数,在所述触控位置对所述触控物的触控进行触控响应,包括:
根据所述触控物的标识和所述压力两者对应的绘制参数,在所述触控位置显示与所述绘制参数匹配的书写笔迹。
第二方面,本申请实施例提供了一种触控方法,所述方法应用于触控笔,所述方法包括:
检测所述触控笔的笔尖的压力;
若所述压力大于或等于预设压力阈值,则向所述触控设备发送触控信息以及通过所述笔尖发射预设相位和信号强度的触控信号;所述触控信息包括:所述触控笔的标识,以及, 所述压力;
若所述压力小于所述预设压力阈值,则停止向所述触控设备发送所述触控信息,以及,停止通过所述笔尖发射所述触控信号。
可选地,所述通过所述笔尖向所述触控设备传输预设相位和信号强度的触控信号,包括:
将检测到的所述触控设备的触控驱动信号进行处理,得到预设相位和信号强度的触控信号;
通过所述笔尖向所述触控设备传输预设相位和信号强度的触控信号。
第三方面,本申请实施例提供了一种触控方法,所述方法应用于触控设备,所述方法包括:
检测触控感应信号;
根据检测到的触控感应信号,确定是否有触控物触控所述触控设备;
若确定有第一触控物和第二触控物同时触控所述触控设备,则根据所述第一触控物对应的触控感应信号,获取所述第一触控物的第一触控信息,以及,根据所述第二触控物对应的触控感应信号,获取所述第二触控物的第一触控信息;其中,所述第一触控信息包括:触控位置,以及,基于触控物对应的触控感应信号的强度确定的触控物的标识;
根据所述第一触控物的标识对应的触控参数,在所述第一触控物的触控位置进行触控响应;
根据所述第二触控物的标识对应的触控参数,在所述第二触控物的触控位置进行触控响应。
第四方面,本申请实施例提供了一种触控设备,包括:
检测模块,用于检测触控感应信号;
确定模块,用于根据检测到的触控感应信号,确定是否有触控物触控所述触控设备;
获取模块,用于在确定有触控物触控所述触控设备时,根据所述触控感应信号,获取所述触控物的第一触控信息;其中,所述第一触控信息包括:触控位置,以及,基于所述触控感应信号的强度确定的所述触控物的标识;所述触控设备支持多个触控物触控,不同触控物对应的触控感应信号的信号强度不同;
响应模块,用于根据所述触控物的标识对应的触控参数,在所述触控位置进行触控响应。
第五方面,本申请实施例提供了一种触控设备,包括:
检测模块,用于检测触控感应信号;
确定模块,用于根据检测到的触控感应信号,确定是否有触控物触控所述触控设备;
获取模块,用于在确定有第一触控物和第二触控物同时触控所述触控设备时,根据所述第一触控物对应的触控感应信号,获取所述第一触控物的第一触控信息,以及,根据所述第二触控物对应的触控感应信号,获取所述第二触控物的第一触控信息;其中,所述第一触控信息包括:触控位置,以及,基于触控物对应的触控感应信号的强度确定的触控物的标识;
响应模块,用于根据所述第一触控物的标识对应的触控参数,在所述第一触控物的触控位置进行触控响应,以及,根据所述第二触控物的标识对应的触控参数,在所述第二触控物的触控位置进行触控响应。
第六方面,本申请实施例提供了一种触控笔,所述触控笔包括:笔尖、感应环、压力传感器、模数转换模块、触控信号输出单元、总控制模块、通信模块和通信天线;
所述压力传感器,用于检测所述笔尖的压力;
所述模数转换模块,用于将所述压力传感器检测到的所述笔尖的压力的电信号转换为数字信号后传输至所述总控制模块;
所述感应环,用于感应触控设备的触控组件发出的触控驱动信号;
所述总控制模块,用于在所述压力大于或等于预设压力阈值时,通过所述通信模块和通信天线向所述触控设备发送触控信息,以及控制所述触控设备传输触控信号输出单元输出预设相位和信号强度的触控信号,并通过所述笔尖向所述触控设备传输所述触控信号;在所述压力小于所述预设压力阈值时,停止通过所述通信模块和通信天线向所述触控设备发送所述触控信息,以及,控制所述触控信号输出单元停止输出所述触控信号;其中,所述触控信息包括:所述触控笔的标识,以及,所述压力。
第七方面,本申请实施例提供了一种触控设备,包括:触控组件、处理器和存储器;其中,所述存储器存储有计算机程序,所述计算机程序适于由所述处理器加载并执行如上述第一方面任意一项或第三方面的方法步骤。
第八方面,本申请实施例提供了一种触控笔,包括:处理器和存储器;其中,所述存储器存储有计算机程序,所述计算机程序适于由所述处理器加载并执行上述第二方面的方法步骤。
第九方面,本申请实施例提供了一种触控系统,所述触控系统包括:如第七方面所述的触控设备,以及,至少两个如第八方面所述的触控笔。
第十方面,本申请实施例提供一种计算机存储介质,所述计算机存储介质存储有多条指令,所述指令适于由处理器加载并执行上述第一方面任一项或第三方面的方法步骤。
第十一方面,本申请实施例提供一种计算机存储介质,所述计算机存储介质存储有多条指令,所述指令适于由处理器加载并执行上述第二方面任一项的方法步骤。
第十二方面,本申请实施例提供一种计算机程序产品,包括计算机程序,该计算机程序被处理器执行时实现上述第一方面任一项或第三方面的方法步骤。
第十三方面,本申请实施例提供一种计算机程序产品,包括计算机程序,该计算机程序被处理器执行时实现上述第二方面任一项的方法步骤。
在本申请实施例中,触控设备在进行触控识别时,不仅可以识别触控物的触控位置,还可以识别出当前的触控物,从而可以基于该触控物对应的触控参数,进行触控响应,以达到针对不同的触控物具有不同的触控响应的效果,提高了触控设备的触控响应的灵活性和智能化。
附图说明
为了更清楚地说明本申请实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1为本申请实施例提供的触控方法适用的一种场景示意图;
图2为本申请实施例提供的一种可能的触控组件的结构示意图;
图3为本申请实施例提供的一种可能的智能处理系统的结构示意图;
图4为本申请实施例提供的一种触控方法的流程示意图;
图5为本申请实施例提供的一种信号强度示意图;
图6为本申请实施例提供的另一种信号强度示意图;
图7为本申请实施例提供的另一种触控方法的流程示意图;
图8为本申请实施例提供的一种可能的触控笔200的结构示意图;
图9为本申请实施例提供的第一种触控信号输出单元的结构示意图;
图10为本申请实施例提供的第二种触控信号输出单元的结构示意图;
图11为本申请实施例提供的第三种触控信号输出单元的结构示意图一;
图12为本申请实施例提供的第四种触控信号输出单元的结构示意图;
图13为本申请实施例提供的第五种触控信号输出单元的结构示意图;
图14为本申请实施例提供的第六种触控信号输出单元的结构示意图;
图15为本申请实施例提供的第三种触控信号输出单元的结构示意图二;
图16为本申请实施例提供的一种触控装置的结构示意图;
图17是本申请实施例提供的一种智能交互平板的结构示意图。
具体实施方式
为使本申请的目的、技术方案和优点更加清楚,下面将结合附图对本申请实施例方式作进一步地详细描述。
应当明确,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其它实施例,都属于本申请保护的范围。
下面的描述涉及附图时,除非另有表示,不同附图中的相同数字表示相同或相似的要素。以下示例性实施例中所描述的实施方式并不代表与本申请相一致的所有实施方式。相反,它们仅是如所附权利要求书中所详述的、本申请的一些方面相一致的装置和方法的例子。
在本申请的描述中,需要理解的是,术语“第一”、“第二”、“第三”等仅用于区别类似的对象,而不必用于描述特定的顺序或先后次序,也不能理解为指示或暗示相对重要性。对于本领域的普通技术人员而言,可以根据具体情况理解上述术语在本申请中的具体含义。此外,在本申请的描述中,除非另有说明,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。字符“/”一般表示前后关联对象是一种“或”的关系。
图1为本申请实施例提供的触控方法适用的一种场景示意图。参照图1,该应用场景中包括触控物和触控设备100。
其中,触控设备100可以为任一采用电容感应式的触控组件实现触控功能的设备,例如,智能交互平板、手机(mobile phone)、平板电脑(pad)、电脑、虚拟现实(virtual reality,VR)终端设备、增强现实(augmented reality,AR)终端设备、工业控制(industrial control)中的无线终端、无人驾驶(self driving)中的无线终端、远程手术(remote medical surgery)中的无线终端、智能电网(smart grid)中的无线终端、运输安全(transportation safety)中的无线终端、智慧城市(smart city)中的无线终端、智慧家庭(smart home)中的无线终端、个人数字处理(personal digital assistant,PDA)、车载设备、可穿戴设备等。图1中以触控设备100为智能交互平板为例进行说明。
触控物可以触控触控设备100,以向触控设备100提供输入,触控设备100基于触控物的输入,执行响应于该输入的操作。此处所说的触控物可以为任一能够对触控设备100实现触控操作的对象,例如触控笔200、用户的手指300等。
应用于上述采用电容感应式的触控组件实现触控功能的设备,本申请实施例涉及的触控笔200可以为电容笔。其中,电容笔可以包括:无源电容笔和有源电容笔。无源电容笔可以称为被动式电容笔或者无源触控笔,有源电容笔可以称为有源触控笔,分为有源式被 动笔和有源式主动笔。
在一些实施例中,触控笔100也可以称为手写笔,本申请实施例对此不进行区分。本申请实施例的应用场景中主要涉及有源触控笔,下述实施例均以有源式被动笔为例进行示例说明。即,后续申请文件中所描述的触控笔与有源触控笔的含义等同,对此不进行区分。
所谓有源式被动笔是指可以将检测到的触控设备100的触控驱动信号进行处理后生成触控信号并主动输出的触控笔。触控设备100可以通过接收来自有源式被动笔所输出的触控信号,以识别有源式被动笔是否触控,以及,触控位置。
触控笔200和触控设备100之间可以有线连接,也可以通过通信网络进行互联,以实现交互。该通信网络可以但不限于为:WI-FI热点网络、WI-FI点对点(peer-to-peer,P2P)网络、蓝牙网络、zigbee网络或近场通信(near field communication,NFC)网络等近距离通信网络。应理解,当触控笔200和触控设备100之间通过通信网络进行互联时,两者可以分别具有能够建立该通信网络的通信模块和通信天线。以蓝牙网络为例,两者可以分别具有蓝牙模块和蓝牙天线,从而能够利用蓝牙模块和蓝牙天线建立蓝牙网络。
目前,针对图1所示的触控设备100,虽然支持通过多个触控物对其进行触控,例如,触控设备100支持M个触控笔触控,在该示例中,M为大于或等于2的整数;或者,触控设备100支持用户的肢体(例如手指)触控,以及,M个触控笔实现触控,在该示例中,M为大于或等于1的整数。此处所说的用户的肢体例如可以是用户的手指等,下述均以用户的手指为例进行说明。需说明,在本实施例中,用户的肢体的任一部位对应的触控感应信号的信号强度均相同。
但是,目前此类触控设备针对任意触控物的触控,仅能够识别其触控的位置,因此,触控设备对于任意触控物的触控均采用相同的触控响应,导致此类触控设备的触控响应不够灵活和不够智能化,无法满足用户实际使用时的需求。
以触控设备100为智能交互平板为例,目前,针对支持通过多个触控物对其进行触控的智能交互平板来说,可能会存在如下使用场景:
例如,在会议中,用户A通过用户A的手指在智能交互平板上进行书写,用户B通过触控笔1在该智能交互平板上进行书写,用户C通过触控笔2在该智能交互平板上进行书写。
然而,由于智能交互平板针对任意触控物的触控,仅能够识别其触控的位置,因此,智能交互平板针对每个触控物的书写,都采用相同的绘制参数进行处理和显示,例如,笔迹的颜色、笔迹的粗细等,导致在采用多个触控物触控智能交互平板进行书写时,智能交互平板所显示的书写内容均具有相同的绘制参数,导致用户无法通过绘制参数区分是哪个用户的书写,无法满足用户实际使用时的需求。
发明人通过研究发现,支持多个触控物触控的触控设备存在上述技术问题的原因是:触控设备在进行触控识别时无法区分触控物,导致触控设备采用相同的触控响应。
有鉴于此,本申请实施例提供了一种新的触控方法,在采用某一触控物对触控设备进行触控时,触控设备能够基于该触控物对应的触控参数,进行触控响应,从而使触控设备针对不同的触控物,具有不同的触控响应,提高了触控设备的触控响应的灵活性和智能化。
为了便于对本申请实施例的理解,下面分别对本申请实施例涉及的触控设备100的结构进行示例说明。
示例性的,触控设备100的硬件部分例如由触控显示模组、智能处理系统等部分所构成,由整体结构件结合到一起,同时也由专用的软件系统作为支撑。
其中,触控显示模组包括显示屏、触控组件和背光灯组件。背光灯组件用于为显示屏提供背光光源;显示屏一般采用液晶显示装置,用于进行画面展示;触控组件设置在显示屏上、或者、设置在显示屏前端、或者、设置在独立于显示屏之外的触控面板(即,该触控面板单独存在,或者设置在例如键盘上等)上,用于实现触控识别。本实施例中,触控组件可以是采用电容技术的触摸组件,即,电容感应式的触摸组件。
以触控组件设置在显示屏上或者设置在显示屏前端为例,在实际使用中,触控设备100的显示屏上显示画面数据,当用户通过手指300或者触控笔200等触控物点击显示屏上显示的内容,例如点击显示屏上显示的图形按钮时,触控设备100的触控组件将采集到触控数据,从而触控组件将该触控数据转换为触控点的坐标数据后发送到智能处理系统,或者发送到智能处理系统处由智能处理系统转换为触控点的坐标数据,智能处理系统获得触控点的坐标数据后,根据预先设定的程序实现相应的控制操作,驱动显示屏显示内容发生变化,实现多样化的显示、操作效果。
图2为本申请实施例提供的一种可能的触控组件的结构示意图。如图2所示,以触控组件设置在显示屏上或者设置在显示屏前端为例,该触控组件例如可以包括:触控驱动电极层、触控感应电极层和触控屏控制系统。
其中,触控驱动电极层和触控感应电极层设置在显示屏的上侧。即,触控感应电极层和触控驱动电极层均堆叠在显示屏上,触控感应电极层和触控驱动电极层在上,显示屏在下,触控感应电极层、触控驱动电极层、显示屏三者构成触控设备100的触控屏101(也可以称为触摸屏)。
触控驱动电极层设置有多行驱动电极Dx,触控感应电极层设置有多行感应电极Sy,多行驱动电极Dx与多行感应电极Sy交错排列。此处所说的x和y,表征电极的行数,均为大于或等于1的整数,x和y的最大值可以相同或不同,具体与该电极层设置的电极的行数有关。图2是以x和y的取值均为5为例的示意图。
触控屏控制系统例如可以包括:触控主控模块、驱动模块和感应模块。其中,触控主控模块的一端可以分别与驱动模块的一端和感应模块的一端连接,驱动模块的另一端可以通过驱动电极总线与每行驱动电极连接,感应模块的另一端可以通过感应电极总线与每行感应电极连接。
触控主控模块控制驱动模块通过驱动电极发送驱动信号(也可以称为触控驱动信号),以通过电容耦合的方式,将发出的驱动信号耦合到感应电极,从而能够被感应电极感应到。在本实施例中,将感应电极感应到的信号称为触控感应信号。触控主控模块可以控制感应模块检测触控感应信号的强度(或者说触控感应信号的大小)。
当用户用触控笔200或手指300触摸显示屏时,触控感应信号的强度会发生变化,从而可以基于该触控感应信号的强度可以确定是否存在触控,以及触控位置。即,触控点的坐标数据。在一些实施例中,上述驱动信号也可以称为扫描信号。
图3为本申请实施例提供的一种可能的智能处理系统的结构示意图。如图3所示,该智能处理系统例如可以包括:主机控制模块、触控数据接收模块、通信模块。其中,主机控制模块分别与通信模块和触控数据接收模块通信连接。
以图2所示的触控屏控制系统为例,触控数据接收模块可以与触控屏控制系统的触控主控模块通信连接,用于接收来自触控组件的触控数据。
通信模块可以与触控笔200建立通信连接,以接收来自触控笔200的触控数据。若该通信连接为无线连接,则该智能处理系统还可以包括通信天线,以使通信模块通过通信天线与触控笔200建立通信连接。
主机控制模块用于基于触控数据进行触控响应。
应理解,上述图2和图3,仅是示例性的给出了触控设备100的一种可能的结构,该结构仅是重点强调了与本申请方法相关的内容,对于触控设备100是否还具有其他的部件,以及,是否具有其他的功能,本申请并不限定。另外,上述图2和图3中针对触控设备100中的各模块的划分仅是一种示意,本申请对各模块的划分,以及,各模块的命名并不进行限定。
需说明,本申请所涉及的触控设备100并不以此为限,本申请实施例的方法可以适用于任一能够实现本申请实施例方法的触控设备100。
通过上述图2和图3中关于触控设备100的介绍可知,当用户用触控笔200或手指300触摸显示屏时,触控感应信号的强度会发生变化。
因此,为了能够使触控设备区分出不同的触控笔,本申请中针对触控笔进行如下设置:不同触控笔发射的触控信号的信号强度和/或相位不同,以使触控设备基于不同触控笔发射的触控信号检测到的触控感应信号的信号强度不同,从而能够区分出当前的触控物。
下面结合具体地实施例对本申请实施例提供的触控方法的技术方案进行详细说明。下面这几个具体的实施例可以相互结合,对于相同或者相似的概念或者过程可能在某些实施例不再赘述。
图4为本申请实施例提供的一种触控方法的流程示意图。如图4所示,该方法可以包括:
S101,检测触控感应信号。
例如,该步骤例如可以由感应模块执行。
S102,根据检测到的触控感应信号,确定是否有触控物触控触控设备。
例如,该步骤例如可以由触控主控模块执行。
若有触控物触控触控设备,则执行步骤S103,若无触控物触控触控设备,则返回执行步骤S101。
通过上述图2和图3中关于触控设备100、触控组件、触控笔200的介绍可知,当用户用触控笔200或手指300触摸显示屏时,触控感应信号的强度会发生变化。因此,触控设备可以基于检测到的触控感应信号的强度是否发生变化,确定是否有触控物触控触控设备。
以触控物为用户的手指为例,图5为本申请实施例提供的一种信号强度示意图,如图5所示,示例性的,以触控设备的触控组件的驱动电极发出如图所示的驱动信号为例,则在该示例下,当无任何触控物触控触控设备时,触控组件的感应电极感应到的触控感应信号强度为H。
当用户通过手指进行触控时,触控设备的触控组件的驱动电极发出的驱动信号中的部分信号会被人体耦合衰减,导致感应电极感应到的触控感应信号强度为小于H的H1,从而可以确定是否有触控物触控触控设备,以及触控位置。
S103、根据触控感应信号,获取触控物的第一触控信息,其中,第一触控信息包括: 触控位置,以及,基于触控感应信号确定的触控物的标识;触控设备支持多个触控物触控,不同触控物对应的触控感应信号的信号强度不同。
例如,该步骤例如可以由触控主控模块执行,并提供给主机控制模块。
触控设备支持的M个触控物包括至少两个触控笔为例,即M大于或等于2时,为了能够使触控设备区分出不同的触控笔,本申请中针对触控笔进行如下设置,不同触控笔发射的触控信号的信号强度和/或相位不同,以使触控设备基于不同触控笔发射的触控信号检测到的触控感应信号的信号强度不同,从而能够区分出当前的触控物。
例如,不同触控笔之间所采用的触控信号的相位相同、但强度不同;或者,不同触控笔之间所采用的触控信号的相位不同、但强度相同;或者,不同触控笔之间所采用的触控信号的相位不同、强度也不同。再例如,多个触控笔中部分触控笔的强度相同、相位不同,或者,部分触控笔的相位相同但强度不同等。
以M个触控笔包括第一触控笔和第二触控笔为例,该两个触控笔的触控信号可以满足如下关系:
第一种关系:第一触控笔发射的触控信号的相位和第二触控笔发射的触控信号的相位相反,且第二触控笔的发射的触控信号的相位与触控设备的触控驱动信号的相位相同。此时,第一触控笔发射的触控信号和第二触控笔发射的触控信号的强度相同或不同。
第二种关系:第一触控笔发射的触控信号的相位和第二触控笔发射的触控信号的相位相反,且第一触控笔的发射的触控信号的相位与触控设备的触控驱动信号的相位相同。此时,第一触控笔发射的触控信号和第二触控笔发射的触控信号的强度相同或不同。
第三种关系:第一触控笔发射的触控信号的相位、第二触控笔发射的触控信号的相位均与触控设备的触控驱动信号的相位相同,但第一触控笔发射的触控信号和第二触控笔发射的触控信号的强度不同。
第四种关系:第一触控笔发射的触控信号的相位、第二触控笔发射的触控信号的相位均与触控设备的触控驱动信号的相位相反,但第一触控笔发射的触控信号和第二触控笔发射的触控信号的强度不同。
继续参照图5,以第一触控笔和第二触控笔的触控信号满足第一种关系为例,即,第一触控笔发射的触控信号与驱动电极发出的触控驱动信号的相位相反,第二触控笔发射的触控信号与驱动电极发出的触控驱动信号的相位相同。也就是说,第一触控笔发射反相信号、第二触控笔发射同相信号。
在该示例下,当用户通过第一触控笔进行触控时,第一触控笔发射的反相信号会减弱驱动信号的强度,从而使感应电极感应到的触控感应信号强度为小于H1的H2。
当用户通过第二触控笔进行触控时,第二触控笔发射的同相信号会增强驱动信号的强度,从而使感应电极感应到的触控感应信号强度为大于H的H3。在该实现方式下,触控设备可以基于检测到的触控感应信号的强度,确定当前是哪个触控物,以及,其触控位置,得到该触控物的第一触控信息。
图6为本申请实施例提供的另一种信号强度示意图。如图6所示,在图5所示的示例下,以用户的肢体为用户的手指为例,当采用手指、第一触控笔和第二触控笔触控触控设备时,触控感应信号的强度满足如下关系:第二触控笔触控时触控感应信号的强度H3>手指触控时触控感应信号的强度H1>第一触控笔触控时触控感应信号的强度H2。因此,可 以设置如下阈值用于判断触控物:
第一预设信号强度阈值TH1,介于无触控时的信号强度H和手指接触时接收到的信号强度H1之间。
第二预设信号强度阈值TH2,介于手指接触时接收到的信号强度H1和笔1接触时接收到的信号强度H2之间。
第三预设信号强度阈值TH3,介于无触控时的信号强度H和笔2接触时接收到的信号强度H3之间。
即,第三预设信号强度阈值TH3>第一预设信号强度阈值TH1>第二预设信号强度阈值TH2。
在该示例下,可以采用如下方式获取触控物的标识:
若触控感应信号的信号强度小于或等于第一预设信号强度阈值TH1、且大于或等于第二预设信号强度阈值TH2,则确定触控物为用户的肢体的标识;或者,
若触控感应信号的信号强度小于第二预设信号强度阈值TH2,则确定触控物的标识为第一触控笔的标识;或者,
若触控感应信号的信号强度大于或等于第三预设信号强度阈值TH3,则确定触控物的标识为第二触控笔的标识。
应理解,若触控感应信号的信号强度不满足上述任一条件,则确定当前无任何触控物触控触控设备。
另外,虽然上述示例以采用手指、第一触控笔和第二触控笔触控触控设备时,如何识别触控物的标识为例进行了说明。但是应理解,当采用多个触控笔触控触控设备时,例如,采用第一触控笔和第二触控笔触控触控设备,可以采用上述示例中所描述的第一触控笔的标识和第二触控笔的标识的判断方式进行判断,在此不再赘述。
需说明,关于如何基于触控感应信号的强度,确定触控位置这个操作,可以是由触控设备的主机控制模块来实现,也可以是由触控主控模块实现,本申请对此不进行限定,另外,关于如何基于触控感应信号的强度,确定触控位置,可以参见现有技术的实现方式,在此不再赘述。
S104、根据触控物的标识对应的触控参数,在触控位置进行触控响应。
例如,该步骤例如可以由主机控制模块执行。
此处所说的触控参数例如可以与用户当前所使用的功能,或者,当前所使用的APP的功能有关,或者说,与执行的触控操作有关。
例如,以用户使用触控设备进行书写操作为例,则此处所说的触控参数例如可以包括绘制参数,例如,笔迹的颜色、笔迹的粗细等。
再例如,以用户使用触控设备进行触控游戏操作为例,则此处所说的触控参数例如可以为触感反馈参数,以使不同的触控物可以采用不同的触感反馈。
仍然以前述关于触控设备为智能交互平板的示例为例,假定用户的肢体对应的绘制参数为笔迹的颜色为蓝色、触控笔1对应的绘制参数为笔迹的颜色为红色,触控笔2对应的绘制参数为笔迹的颜色为绿色。
在该实现方式下,当在会议中,用户A通过用户A的手指在智能交互平板上进行书写时,智能交互平板所呈现的笔迹为蓝色。用户B通过触控笔1在该智能交互平板上进行书 写时,智能交互平板所呈现的笔迹为红色,用户C通过触控笔2在该智能交互平板上进行书写时,智能交互平板所呈现的笔迹为绿色。
这样,无需用户自己在书写时进行颜色的变更,即可基于智能交互平板上各笔迹的颜色准确的区分出哪个笔迹与哪个用户对应。通过该方式,智能交互平板可以区分用户的肢体、至少两个触控笔的输入行为,不仅可以采用多个触控物对智能交互平板进行触控,还可以使智能交互平板针对不同的触控物具有不同的触控响应,提高了智能交互平板的触控响应的灵活性和智能化。
在本申请实施例中,触控设备在进行触控识别时,不仅可以识别触控物的触控位置,还可以识别出当前的触控物,从而可以基于该触控物对应的触控参数,进行触控响应,以达到针对不同的触控物具有不同的触控响应的效果,提高了触控设备的触控响应的灵活性和智能化。
进一步地,在上述实施例的基础上,上述触控设备还可以进一步结合触控物进行触控所述触控设备时所使用的压力,进行触控响应。即,在上述步骤S104之前,触控设备还可以先对第一触控信息更新,得到更新后的触控信息。其中,更新后的触控信息包括:触控物的标识、触控位置,以及,触控物触控触控设备时所使用的压力。
这样,触控设备根据触控物的标识对应的触控参数,以及,触控物触控触控设备时所使用的压力,在触控位置进行触控响应。由于该压力可以反映出用户触控时所使用的力度。因此,本实施例中,触控设备可以进一步结合触控物触控触控设备时所使用的压力,给用户带来更加真实的触控响应。
以书写操作为例,使用真正的笔进行书写时,所书写出来的字迹会因为使用者的书写力度不同,出现粗细不同的情况,当力度越大时,字迹会越粗。因此,本实施例中,触控设备在用户书写时,可以根据触控物的绘制参数,以及,触控物触控触控设备时所使用的压力,在显示屏的触控位置显示与绘制参数和压力匹配的书写笔迹。即,可以进一步结合触控物触控触控设备时所使用的压力,呈现出与该压力对应的力度匹配的笔迹,以更加准确地还原用户的真实笔迹,从而给用户带来更加真实的触控响应。
以触感反馈为例,则触控设备可以进一步结合触控物触控触控设备时所使用的压力,给用户带来与该压力对应的力度匹配的触感反馈,从而给用户带来更加真实的触控响应。
关于如何更新第一触控信息,例如可以包括如下几种实现方式:
第一种实现方式:触控设备的触控组件还可以包括至少一个压力传感器。具体实现时,上述压力传感器例如可以是感应形变的应力片,也可以是压电传感器、磁致伸缩传感器、加速度传感器、光学传感器等能够将压力转化为电磁信号的传感器。压力传感器的具体设置位置,以及,压力传感器的数量可以根据触控设备的实际产品形态确定。
在该实现方式下,触控设备可以通过该至少一个压力传感器识别用户使用触控物进行触控时所使用的压力,从而触控设备使用该压力更新第一触控信息,得到更新后的触控信息。
第二种实现方式:触控设备的触控组件还可以包括至少一个压力传感器,另外,针对除用户手指之外、且能够跟触控设备进行通信交互的触控物(例如,触控笔)也相应设置有压力传感器。当该触控物触控触控设备时,该触控物可以向触控设备发送触控信息,该触控信息例如可以包括:触控物的标识和触控触控设备时所使用的压力。
当触控设备检测到触控物为用户的肢体时,触控设备使用自己基于压力传感器识别到 的压力,对第一触控信息进行更新,得到更新后的触控信息。
当触控设备检测到触控物为触控笔时,触控设备可以获取该触控笔发送的触控信息。
以第一触控笔为例,第一触控笔可以向触控设备发送第二触控信息;该第二触控信息包括第一触控笔的标识,以及,第一触控笔触控触控设备时所使用的压力。则在该示例下,触控设备若判断确定基于触控感应信号确定的触控物的标识与第一触控笔的标识相同,则合并第一触控信息和第二触控信息,得到更新后的触控信息。
例如,上述接收第二触控信息,以及,更新第一触控信息的操作例如可以由主机控制模块执行。
示例性的,第一触控信息例如下述表1所示:
表1
触控物的标识 触控位置
触控笔1 (X,Y)
第二触控信息例如下述表2所示:
表2
触控物的标识 触控触控设备时所使用的压力
触控笔1 压力1
合并第一触控信息和第二触控信息,得到更新后的触控信息例如下述表3所示:
表3
触控物的标识 触控位置 触控触控设备时所使用的压力
触控笔1 (X,Y) 压力1
在本申请实施例中,触控设备在进行触控响应时,还可以进一步结合触控物触控触控设备时所使用的压力,进行触控响应,从而可以给用户带来更加真实的触控响应。
如上述实施例所述,本申请实施例涉及的触控笔例如可以是有源触控笔。针对有源触控笔,触控设备可以通过接收来自有源触控笔所发射的触控信号,以识别有源触控笔是否触控。因此,有源触控笔可以实现小笔尖触控。
目前为了避免触控笔在接近触控设备、但未实际触控触控设备时,被触控设备识别为触控(即悬浮触控),有源触控笔所发射的触控信号的强度较小,导致触控设备无法准确的识别出其触控位置,尤其是通过有源触控笔进行书写操作时,会出现因触控位置识别不准确,导致书写的笔迹存在明显的抖动,字体失真严重的情况。
图7为本申请实施例提供的另一种触控方法的流程示意图。如图7所示,考虑到上述问题,本申请实施例中触控笔采用如下方式发射触控信号:
S201,检测触控笔的笔尖的压力。
S202,判断触控笔的笔尖的压力是否大于或等于预设压力阈值。
若是,即触控笔的笔尖的压力大于或等于预设压力阈值,说明触控笔当前已触控触控设备,或者说,触控笔的笔尖已接触触控设备的触控组件,此时,即便向触控设备发送触控信号,也不会出现悬浮触控的情况,则执行步骤S203。
若否,即触控笔的笔尖的压力小于预设压力阈值,说明触控笔当前未触控触控设备,或者说,触控笔的笔尖未接触触控设备的触控组件,此时,如果向触控设备发送触控信号,会出现悬浮触控的情况,则执行步骤S204。
S203,向触控设备发送触控信息,以及,通过笔尖向触控设备传输预设相位和信号强度的触控信号,其中,触控信息包括:触控笔的标识,以及,触控笔触控触控设备时所使用的压力。
此时,触控笔触控触控设备时所使用的压力即为触控笔的笔尖的压力。
关于预设相位和信号强度的触控信号,可以是触控笔检测到触控设备的触控组件的驱动电极发出的触控驱动信号后,将检测到的所述触控设备的触控驱动信号进行处理得到的。
S204、停止向触控设备发送触控信息,以及,停止通过笔尖向触控设备传输触控信号。
通过上述方法,无需通过减小触控笔向触控设备传输的触控信号的强度,即可避免悬浮触控。因此,触控笔可以向触控设备传输信号强度大的触控信号,以提高触控设备识别其触控位置的准确性,从而可以使触控设备提供准确的触控响应。
以智能交互平板的书写操作为例,触控笔基于笔尖的压力与预设压力阈值的关系,可以识别出触控笔的笔尖是否接触到智能交互平板的屏幕。触控笔的笔尖的压力大于或等于预设压力阈值时,触控笔可以判定触控笔的笔尖是否接触到智能交互平板的屏幕。此时,触控笔可以通过笔尖向智能交互平板传输触控信号,并通过通信模块和通信天线向智能交互平板发送触控信息,以供智能交互平板进行触控响应。
在书写过程中,当触控笔识别到笔尖的压力小于预设压力阈值时,触控笔可以判定触控笔的笔尖已离开智能交互平板的屏幕。此时,触控笔可以停止通过笔尖向智能交互平板传输触控信号,以及,并停止通过通信模块和通信天线向智能交互平板发送触控信息,以避免出现悬浮触控。
通过上述方式,可以避免出现因触控信号的强度较小,导致触控位置识别不准确,进而导致书写的笔迹存在明显的抖动,字体失真严重的情况,提高了书写的准确性,进而使触控设备可以更加真实的还原用户的笔迹。
以采用蓝牙连接与触控设备建立通信连接、且基于检测到的触控设备的触控驱动信号,得到触控信号的触控笔为例,即,以通信模块为蓝牙模块,通信天线为蓝牙天线为例,对触控笔200的结构进行示意说明。
图8为本申请实施例提供的一种可能的触控笔200的结构示意图。如图8所示,用于实现图7所示的实施例的触控笔200例如可以包括:笔尖、感应环、压力传感器、模数转换模块、触控信号输出单元、总控制模块、电源模块、蓝牙模块和蓝牙天线。
(1)笔尖:用于向触控设备100传递用户使用触控笔200进行触控时产生的压力,以及,发射输出放大模块输出的触控信号。
(2)感应环:用于感应触控设备100的触控组件发出的触控驱动信号。
(3)压力传感器:用于检测用户使用触控笔200进行触控时产生的压力,即,通过笔尖传递过来的压力。
(4)模数转换模块:用于将压力传感器检测到的压力的电信号转换为数字信号。
(5)触控信号输出单元,用于根据总控制模块的控制指令,执行输出触控信号的动作。
(6)电源模块:为触控笔供电。
(7)总控制模块:用于判断用户使用触控笔200进行触控时产生的压力是否大于预设阈值,并在大于或等于预设压力阈值时,执行如下操作:
1)控制触控信号输出单元工作,即,输出触控信号。
2)通过蓝牙模块和蓝牙天线与触控设备100发送触控信息,该触控信息可以包括:触控笔的标识,以及,用户使用触控笔200进行触控时产生的压力。
在小于该预设压力阈值时,执行如下操作:
A)控制触控信号输出单元停止工作,即,停止输出触控信号。
B)停止通过蓝牙模块和蓝牙天线向触控设备100发送触控信息。
可选地,该总控制模块还可以具有如下功能:
a)校准压力传感器;
b)触控笔的电源管理,例如,用户未使用触控笔时进入休眠状态,以节省耗电量,在识别到用户使用触控笔时进入唤醒状态,并进行工作。
c)通过蓝牙模块和蓝牙天线与触控设备100进行交互,接收来自触控设备100的信息或者指令。
示例性的,上述总控制模块例如可以采用任一能够实现控制功能的电路或者芯片实现,例如MCU。
示例性的,上述触控信号输出单元具体所包括的模块,与所需输出的触控信号的相位相关。
以触控笔需输出与触控驱动信号相位相反的触控信号为例,则该触控信号输出单元需具有如下功能:对检测到的触控驱动信号进行反相放大处理,并能基于总控制模块的指令进行关断。
可选地,触控信号输出单元例如可以包括如下几种实现方式:
实现方式1:图9为本申请实施例提供的第一种触控信号输出单元的结构示意图,如图9所示,触控信号输出单元包括:输出控制模块和输出放大模块。
输出控制模块,用于根据总控制模块的控制指令,控制输出放大模块是否输出触控信号。
输出放大模块,用于对感应环感应到的触控驱动信号进行反相放大处理,得到触控信号后,输出至笔尖。
在该实现方式下,上述输出放大模块也可以称为反相放大模块。
实现方式2:图10为本申请实施例提供的第二种触控信号输出单元的结构示意图,如图10所示,触控信号输出单元包括:输出控制模块、输入检测模块和输出放大模块。
输出控制模块,用于根据总控制模块的控制指令,控制输出放大模块是否输出触控信号。
输入检测模块,用于对感应环感应到的触控驱动信号进行同相放大处理。
输出放大模块,用于对输入检测模块放大后的触控驱动信号进行反相放大处理,得到触控信号后,输出至笔尖。
在该实现方式下,上述输入检测模块也可以称为同相放大模块,上述输出放大模块也可以称为反相放大模块。
实现方式3:图11为本申请实施例提供的第三种触控信号输出单元的结构示意图一,如图11所示,触控信号输出单元包括:输出控制模块、输入检测模块和输出放大模块。
输出控制模块,用于根据总控制模块的控制指令,控制输出放大模块是否输出触控信号。
输入检测模块,用于对感应环感应到的触控驱动信号进行反相放大处理。
输出放大模块,用于对输入检测模块放大后的触控驱动信号进行同相放大处理,得到触控信号后,输出至笔尖。
在该实现方式下,上述输入检测模块也可以称为反相放大模块,上述输出放大模块也可以称为同相放大模块。
以触控笔需输出与触控驱动信号相位相同的触控信号为例,则该触控信号输出单元需具有如下功能:对检测到的触控驱动信号进行同相放大处理,并能基于总控制模块的指令进行关断。
可选地,触控信号输出单元例如可以包括如下几种实现方式:
实现方式1:图13为本申请实施例提供的第四种触控信号输出单元的结构示意图,如图13所示,触控信号输出单元包括:输出控制模块和输出放大模块。
输出控制模块,用于根据总控制模块的控制指令,控制输出放大模块是否输出触控信号。
输出放大模块,用于对感应环感应到的触控驱动信号进行同相放大处理,得到触控信号后,输出至笔尖。
在该实现方式下,上述输出放大模块也可以称为同相放大模块。
实现方式2:图14为本申请实施例提供的第五种触控信号输出单元的结构示意图,如图14所示,触控信号输出单元包括:输出控制模块、输入检测模块和输出放大模块。
输出控制模块,用于根据总控制模块的控制指令,控制输出放大模块是否输出触控信号。
输入检测模块,用于对感应环感应到的触控驱动信号进行同相放大处理。
输出放大模块,用于对输入检测模块放大后的触控驱动信号进行同相放大处理,得到触控信号后,输出至笔尖。
在该实现方式下,上述输入检测模块和输出放大模块均可以称为同相放大模块。
实现方式3:图15为本申请实施例提供的第六种触控信号输出单元的结构示意图,如图15所示,触控信号输出单元包括:输出控制模块、输入检测模块和输出放大模块。
输出控制模块,用于根据总控制模块的控制指令,控制输出放大模块是否输出触控信号。
输入检测模块,用于对感应环感应到的触控驱动信号进行反相放大处理。
输出放大模块,用于对输入检测模块放大后的触控驱动信号进行反相放大处理,得到触控信号后,输出至笔尖。
在该实现方式下,上述输入检测模块和输出放大模块均可以称为反相放大模块。
在上述图9至图14所示的实现方式下,总控制模块基于笔尖的压力与预设压力阈值的关系,可以识别出触控笔的笔尖是否接触到触控设备的触控组件。当触控笔的笔尖的压力大于或等于预设压力阈值时,总控制模块可以判定触控笔的笔尖接触到触控设备的触控组件。此时,总控制模块可以通过输出控制模块,控制输出放大模块向笔尖输出触控信号,以向触控设备发射触控信号。
当触控笔的笔尖的压力小于预设压力阈值时,总控制模块可以判定触控笔的笔尖已离开接触到触控设备的触控组件。此时,总控制模块可以通过输出控制模块,控制输出放大模块停止智能交互平板发射触控信号,以避免出现悬浮触控。
应理解,上述图9至图14所示的触控信号输出单元中所示的同相放大模块例如可以通过任一具有同相放大功能的电路实现,反相放大模块例如可以通过任一具有反相放大功能的电路实现,上述输出控制模块例如可以通过任一通过控制指令可以打开或关断的开关 控制电路实现。
例如,以图11所示的第三种触控信号输出单元的结构为例,图15为本申请实施例提供的第三种触控信号输出单元的结构示意图二,在该实现方式下,上述触控信号输出单元所包括的输出控制模块、输入检测模块和输出放大模块例如可以通过如图15所示的电路实现,对此不再赘述。
通过上述结构,触控笔无需通过减小触控笔向触控设备发射的触控信号的强度,即可避免悬浮触控。因此,触控笔可以向触控设备发射信号强度大的触控信号,以提高触控设备识别其触控位置的准确性,从而可以使触控设备提供准确的触控响应。
下述为本申请装置实施例,可以用于执行本申请方法实施例。对于本申请装置实施例中未披露的细节,请参照本申请方法实施例。
图16为本申请实施例提供的一种触控设备的结构示意图。如图16所示,该触控设备包括检测模块11、确定模块12、获取模块13、响应模块14。可选地,该触控设备还可以包括:更新模块15和/或接收模块16。
检测模块11,用于检测触控感应信号;
确定模块12,用于根据检测到的触控感应信号,确定是否有触控物触控所述触控设备;
获取模块13,用于在确定有触控物触控所述触控设备时,根据所述触控感应信号,获取所述触控物的第一触控信息;其中,所述第一触控信息包括:触控位置,以及,基于所述触控感应信号的强度确定的所述触控物的标识;所述触控设备支持多个触控物触控,不同触控物对应的触控感应信号的信号强度不同;
响应模块14,用于根据所述触控物的标识对应的触控参数,在所述触控位置进行触控响应。
以同时有第一触控物和第二触控物同时触控所述触控设备为例,则获取模块13,可以具体用于在确定有第一触控物和第二触控物同时触控所述触控设备时,根据所述第一触控物对应的触控感应信号,获取所述第一触控物的第一触控信息,以及,根据所述第二触控物对应的触控感应信号,获取所述第二触控物的第一触控信息;其中,所述第一触控信息包括:触控位置,以及,基于触控物对应的触控感应信号的强度确定的触控物的标识;响应模块14,具体用于根据所述第一触控物的标识对应的触控参数,在所述第一触控物的触控位置进行触控响应,以及,根据所述第二触控物的标识对应的触控参数,在所述第二触控物的触控位置进行触控响应。
可选地,所述多个触控物包括用户的肢体,以及,M个触控笔,其中,M为大于或等于1的整数;所述用户的肢体的任一部位对应的触控感应信号的信号强度相同;或者,所述多个触控物包括M个触控笔,其中,M为大于或等于2的整数。当M大于或等于2时,各触控笔发射的触控信号的信号强度和/或相位不同,以使所述触控设备检测到的触控感应信号的信号强度不同。
例如,所述M大于或等于2,所述M个触控笔包括第一触控笔和第二触控笔,所述第一触控笔发射的触控信号的相位和所述第二触控笔发射的触控信号的相位相反,且所述第二触控笔的发射的触控信号的相位与所述触控设备的触控驱动信号的相位相同。则在该示例下,所述获取模块13,具体用于在所述触控感应信号的信号强度小于或等于第一预设信号强度阈值、且大于或等于第二预设信号强度阈值时,确定所述触控物为用户的肢体的标识;在所述触控感应信号的信号强度小于所述第二预设信号强度阈值时,确定所述触控 物的标识为所述第一触控笔的标识;在所述触控感应信号的信号强度大于或等于第三预设信号强度阈值时,确定所述触控物的标识为所述第二触控笔的标识;所述第三预设信号强度阈值大于所述第一预设信号强度阈值。
再例如,所述M大于或等于2,所述M个触控笔包括第一触控笔和第二触控笔,所述第一触控笔发射的触控信号的相位和所述第二触控笔发射的触控信号的相位相反,且所述第二触控笔的发射的触控信号的相位与所述触控设备的触控驱动信号的相位相同。则在该示例下,所述获取模块13,具体用于在所述触控感应信号的信号强度小于所述第二预设信号强度阈值时,确定所述触控物的标识为所述第一触控笔的标识;在所述触控感应信号的信号强度大于或等于第三预设信号强度阈值时,确定所述触控物的标识为所述第二触控笔的标识;所述第三预设信号强度阈值大于所述第一预设信号强度阈值。
可选地,所述更新模块15,用于在所述响应模块14根据所述触控物的标识对应的触控参数,在所述触控位置进行触控响应之前,对所述第一触控信息更新,得到更新后的触控信息,所述更新后的触控信息包括:所述触控物的标识、所述触控位置,以及,所述触控物触控所述触控设备时所使用的压力。则在该示例下,所述响应模块14,具体用于根据所述触控物的标识和所述压力两者对应的触控参数,在所述触控位置进行触控响应。
例如,所述触控物的标识为用户的肢体的标识,所述更新模块15,具体用于检测所述触控物触控所述触控设备时所使用的压力,并将检测到的所述压力添加至所述第一触控信息中,得到所述更新后的触控信息。
例如,所述触控物的标识为触控笔的标识,接收模块16,用于接收所述触控笔发送的第二触控信息;所述第二触控信息包括所述触控笔的标识,以及,所述触控笔触控所述触控设备时所使用的压力。则在该示例下,所述更新模块15,具体用于在基于所述触控感应信号的强度确定的所述触控物的标识与所述触控笔的标识相同时,合并所述第一触控信息和所述第二触控信息,得到更新后的触控信息。
可选地,所述触控物触控触控设备执行的触控操作为书写操作,所述触控参数包括绘制参数;所述响应模块14,具体用于根据所述触控物的标识和所述压力两者对应的绘制参数,在所述触控位置显示与所述绘制参数匹配的书写笔迹。
需要说明的是,上述实施例提供的触控设备在执行前述的触控设备的方法时,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将设备的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。另外,上述实施例提供的触控设备与前述实现触控方法实施例属于同一构思,其体现实现过程详见方法实施例,这里不再赘述。
本申请实施例还提供了一种计算机存储介质,所述计算机存储介质可以存储有多条指令,所述指令适于由处理器加载并执行如上述实现触控方法中触控设备的步骤,在此不进行赘述。存储介质所在设备可以是触控设备,该触控设备例如可以是智能交互平板。
本申请实施例还提供了另一种计算机存储介质,所述计算机存储介质可以存储有多条指令,所述指令适于由处理器加载并执行如上述实现触控方法中触控笔的步骤,在此不进行赘述。存储介质所在设备可以是触控笔,该触控设备例如可以是智能交互平板。
本申请实施例还提供了一种触控设备,该触控设备可以包括:触控组件、处理器和存储器;其中,所述存储器存储有计算机程序,所述计算机程序适于由所述处理器加载并执 行如上述实现触控方法的步骤,在此不进行赘述。
此处所说的触控设备例如可以是前述本申请实施例所描述的触控设备。作为一种可能的实现方式,该触控设备例如可以是智能交互平板。
图17为本申请实施例提供的一种智能交互平板的结构示意图。如图17所示,所述智能交互平板1000可以包括:至少一个处理器1001,至少一个网络接口1004,用户接口1003,存储器1005,至少一个通信总线1002。
其中,通信总线1002用于实现这些组件之间的连接通信。
其中,用户接口1003可以包括显示屏(Display)、摄像头(Camera),可选用户接口1003还可以包括标准的有线接口、无线接口。
其中,网络接口1004可选的可以包括标准的有线接口、无线接口(如WI-FI接口)。
其中,处理器1001可以包括一个或者多个处理核心。处理器1001利用各种接口和线路连接整个智能交互平板1000内的各个部分,通过运行或执行存储在存储器1005内的指令、程序、代码集或指令集,以及调用存储在存储器1005内的数据,执行智能交互平板1000的各种功能和处理数据。可选的,处理器1001可以采用数字信号处理(Digital Signal Processing,DSP)、现场可编程门阵列(Field-Programmable Gate Array,FPGA)、可编程逻辑阵列(Programmable Logic Array,PLA)中的至少一种硬件形式来实现。处理器1001可集成中央处理器(Central Processing Unit,CPU)、图像处理器(Graphics Processing Unit,GPU)和调制解调器等中的一种或几种的组合。其中,CPU主要处理操作系统、用户界面和应用程序等;GPU用于负责显示屏所需要显示的内容的渲染和绘制;调制解调器用于处理无线通信。可以理解的是,上述调制解调器也可以不集成到处理器1001中,单独通过一块芯片进行实现。
其中,存储器1005可以包括随机存储器(Random Access Memory,RAM),也可以包括只读存储器(Read-Only Memory)。可选的,该存储器1005包括非瞬时性计算机可读介质(non-transitory computer-readable storage medium)。存储器1005可用于存储指令、程序、代码、代码集或指令集。存储器1005可包括存储程序区和存储数据区,其中,存储程序区可存储用于实现操作系统的指令、用于至少一个功能的指令(比如触控功能、声音播放功能、图像播放功能等)、用于实现上述各个方法实施例的指令等;存储数据区可存储上面各个方法实施例中涉及到的数据等。存储器1005可选的还可以是至少一个位于远离前述处理器1001的存储装置。如图17所示,作为一种计算机存储介质的存储器1005中可以包括操作系统、网络通信模块、用户接口模块以及智能交互平板的操作应用程序。
在图17所示的智能交互平板1000中,用户接口1003主要用于为用户提供输入的接口,获取用户输入的数据;而处理器1001可以用于调用存储器1005中存储的智能交互平板的操作应用程序,并具体执行前述触控方法中触控设备的动作。
本申请实施例还提供了一种触控笔,该触控设备可以包括:处理器和存储器;其中,所述存储器存储有计算机程序,所述计算机程序适于由所述处理器加载并执行如上述实现触控方法的步骤,在此不进行赘述。
本申请实施例提供了一种触控系统,所述触控系统包括:如前述所描述的触控设备,以及,至少两个如前述所描述的触控笔,用于实现前述的触控方法,在此不进行赘述。
本领域内的技术人员应明白,本申请的实施例可提供为方法、系统、或计算机程序产品。因此,本申请可采用完全硬件实施例、完全软件实施例、或结合软件和硬件方面的实 施例的形式。而且,本申请可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。
本申请是参照根据本申请实施例的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
在一个典型的配置中,计算设备包括一个或多个处理器(CPU)、输入/输出接口、网络接口和内存。
存储器可能包括计算机可读介质中的非永久性存储器,随机存取存储器(RAM)和/或非易失性内存等形式,如只读存储器(ROM)或闪存(flash RAM)。存储器是计算机可读介质的示例。
计算机可读介质包括永久性和非永久性、可移动和非可移动媒体可以由任何方法或技术来实现信息存储。信息可以是计算机可读指令、数据结构、程序的模块或其他数据。计算机的存储介质的例子包括,但不限于相变内存(PRAM)、静态随机存取存储器(SRAM)、动态随机存取存储器(DRAM)、其他类型的随机存取存储器(RAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、快闪记忆体或其他内存技术、只读光盘只读存储器(CD-ROM)、数字多功能光盘(DVD)或其他光学存储、磁盒式磁带,磁带磁磁盘存储或其他磁性存储设备或任何其他非传输介质,可用于存储可以被计算设备访问的信息。按照本文中的界定,计算机可读介质不包括暂存电脑可读媒体(transitory media),如调制的数据信号和载波。
还需要说明的是,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、商品或者设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、商品或者设备所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括要素的过程、方法、商品或者设备中还存在另外的相同要素。
以上仅为本申请的实施例而已,并不用于限制本申请。对于本领域技术人员来说,本申请可以有各种更改和变化。凡在本申请的精神和原理之内所作的任何修改、等同替换、改进等,均应包含在本申请的权利要求范围之内。

Claims (18)

  1. 一种触控方法,其特征在于,所述方法应用于触控设备,所述方法包括:
    检测触控感应信号;
    根据检测到的触控感应信号,确定是否有触控物触控所述触控设备;
    若确定有触控物触控所述触控设备,则根据所述触控感应信号,获取所述触控物的第一触控信息;其中,所述第一触控信息包括:触控位置,以及,基于所述触控感应信号的强度确定的所述触控物的标识;所述触控设备支持多个触控物触控,不同触控物对应的触控感应信号的信号强度不同;
    根据所述触控物的标识对应的触控参数,在所述触控位置进行触控响应。
  2. 根据权利要求1所述的方法,其特征在于,所述多个触控物包括用户的肢体,以及,M个触控笔,其中,M为大于或等于1的整数;所述用户的肢体的任一部位对应的触控感应信号的信号强度相同;
    或者,所述多个触控物包括M个触控笔,其中,M为大于或等于2的整数;
    当M大于或等于2时,各触控笔发射的触控信号的信号强度和/或相位不同,以使所述触控设备检测到的触控感应信号的信号强度不同。
  3. 根据权利要求2所述的方法,其特征在于,所述M大于或等于2,所述M个触控笔包括第一触控笔和第二触控笔,所述第一触控笔发射的触控信号的相位和所述第二触控笔发射的触控信号的相位相反,且所述第二触控笔的发射的触控信号的相位与所述触控设备的触控驱动信号的相位相同;
    根据所述触控感应信号,获取所述触控物的标识,包括:
    若所述触控感应信号的信号强度小于或等于第一预设信号强度阈值、且大于或等于第二预设信号强度阈值,则确定所述触控物为用户的肢体的标识;
    若所述触控感应信号的信号强度小于所述第二预设信号强度阈值,则确定所述触控物的标识为所述第一触控笔的标识;
    若所述触控感应信号的信号强度大于或等于第三预设信号强度阈值,则确定所述触控物的标识为所述第二触控笔的标识;所述第三预设信号强度阈值大于所述第一预设信号强度阈值。
  4. 根据权利要求2所述的方法,其特征在于,所述M大于或等于2,所述M个触控笔包括第一触控笔和第二触控笔,所述第一触控笔发射的触控信号的相位和所述第二触控笔发射的触控信号的相位相反,且所述第二触控笔的发射的触控信号的相位与所述触控设备的触控驱动信号的相位相同;
    根据所述触控感应信号,获取所述触控物的标识,包括:
    若所述触控感应信号的信号强度小于第二预设信号强度阈值,则确定所述触控物的标识为所述第一触控笔的标识;
    若所述触控感应信号的信号强度大于或等于第三预设信号强度阈值,则确定所述触控物的标识为所述第二触控笔的标识;所述第三预设信号强度阈值大于所述第二预设信号强度阈值。
  5. 根据权利要求1-4任一项所述的方法,其特征在于,所述根据所述触控物的标识对 应的触控参数,在所述触控位置进行触控响应之前,还包括:
    对所述第一触控信息更新,得到更新后的触控信息,所述更新后的触控信息包括:所述触控物的标识、所述触控位置,以及,所述触控物触控所述触控设备时所使用的压力;
    所述根据所述触控物的标识对应的触控参数,在所述触控位置进行触控响应,包括:
    根据所述触控物的标识和所述压力两者对应的触控参数,在所述触控位置进行触控响应。
  6. 根据权利要求5所述的方法,其特征在于,所述触控物的标识为用户的肢体的标识,所述对所述第一触控信息更新,得到更新后的触控信息,包括:
    检测所述触控物触控所述触控设备时所使用的压力;
    将检测到的所述压力添加至所述第一触控信息中,得到所述更新后的触控信息。
  7. 根据权利要求5所述的方法,其特征在于,所述触控物的标识为触控笔的标识,所述方法还包括:
    接收所述触控笔发送的第二触控信息;所述第二触控信息包括所述触控笔的标识,以及,所述触控笔触控所述触控设备时所使用的压力;
    所述对所述第一触控信息更新,得到更新后的触控信息,包括:
    若基于所述触控感应信号的强度确定的所述触控物的标识与所述触控笔的标识相同,则合并所述第一触控信息和所述第二触控信息,得到更新后的触控信息。
  8. 根据权利要求6-7任一项所述的方法,其特征在于,所述触控物触控触控设备执行的触控操作为书写操作,所述触控参数包括绘制参数;
    所述根据所述触控物的标识和所述压力两者对应的触控参数,在所述触控位置对所述触控物的触控进行触控响应,包括:
    根据所述触控物的标识和所述压力两者对应的绘制参数,在所述触控位置显示与所述绘制参数匹配的书写笔迹。
  9. 一种触控方法,其特征在于,所述方法应用于触控笔,所述方法包括:
    检测所述触控笔的笔尖的压力;
    若所述压力大于或等于预设压力阈值,则向所述触控设备发送触控信息以及通过所述笔尖向所述触控设备传输预设相位和信号强度的触控信号;所述触控信息包括:所述触控笔的标识,以及,所述压力;
    若所述压力小于所述预设压力阈值,则停止向所述触控设备发送所述触控信息,以及,停止通过所述笔尖向所述触控设备传输所述触控信号。
  10. 根据权利要求9所述的方法,其特征在于,所述通过所述笔尖向所述触控设备传输预设相位和信号强度的触控信号,包括:
    将检测到的所述触控设备的触控驱动信号进行处理,得到预设相位和信号强度的触控信号;
    通过所述笔尖向所述触控设备传输预设相位和信号强度的触控信号。
  11. 一种触控方法,其特征在于,所述方法应用于触控设备,所述方法包括:
    检测触控感应信号;
    根据检测到的触控感应信号,确定是否有触控物触控所述触控设备;
    若确定有第一触控物和第二触控物同时触控所述触控设备,则根据所述第一触控物对 应的触控感应信号,获取所述第一触控物的第一触控信息,以及,根据所述第二触控物对应的触控感应信号,获取所述第二触控物的第一触控信息;其中,所述第一触控信息包括:触控位置,以及,基于触控物对应的触控感应信号的强度确定的触控物的标识;
    根据所述第一触控物的标识对应的触控参数,在所述第一触控物的触控位置进行触控响应;
    根据所述第二触控物的标识对应的触控参数,在所述第二触控物的触控位置进行触控响应。
  12. 一种触控设备,其特征在于,所述触控设备包括:
    检测模块,用于检测触控感应信号;
    确定模块,用于根据检测到的触控感应信号,确定是否有触控物触控所述触控设备;
    获取模块,用于在确定有触控物触控所述触控设备时,根据所述触控感应信号,获取所述触控物的第一触控信息;其中,所述第一触控信息包括:触控位置,以及,基于所述触控感应信号的强度确定的所述触控物的标识;所述触控设备支持多个触控物触控,不同触控物对应的触控感应信号的信号强度不同;
    响应模块,用于根据所述触控物的标识对应的触控参数,在所述触控位置进行触控响应。
  13. 一种触控笔,其特征在于,所述触控笔包括:笔尖、感应环、压力传感器、模数转换模块、触控信号输出单元、总控制模块、通信模块和通信天线;
    所述压力传感器,用于检测所述笔尖的压力;
    所述模数转换模块,用于将所述压力传感器检测到的所述笔尖的压力的电信号转换为数字信号后传输至所述总控制模块;
    所述感应环,用于感应触控设备的触控组件发出的触控驱动信号;
    所述总控制模块,用于在所述压力大于或等于预设压力阈值时,通过所述通信模块和通信天线向所述触控设备发送触控信息,以及控制触控信号输出单元输出预设相位和信号强度的触控信号,并通过所述笔尖向所述触控设备传输所述触控信号;在所述压力小于所述预设压力阈值时,停止通过所述通信模块和通信天线向所述触控设备发送所述触控信息,以及,控制所述触控信号输出单元停止输出所述触控信号;其中,所述触控信息包括:所述触控笔的标识,以及,所述压力。
  14. 一种触控设备,其特征在于,包括:触控组件、处理器和存储器;其中,所述存储器存储有计算机程序,所述计算机程序适于由所述处理器加载并执行如权利要求1-8、11任意一项的方法步骤。
  15. 一种触控笔,其特征在于,包括:处理器和存储器;其中,所述存储器存储有计算机程序,所述计算机程序适于由所述处理器加载并执行如权利要求9-10任意一项的方法步骤。
  16. 一种触控系统,其特征在于,所述触控系统包括:如权利要求14所述的触控设备,以及,至少两个如权利要求15所述的触控笔。
  17. 一种计算机存储介质,其特征在于,所述计算机存储介质存储有多条指令,所述指令适于由处理器加载并执行如权利要求1-11任意一项的方法步骤。
  18. 一种计算机程序产品,包括计算机程序,该计算机程序被处理器执行时实现权利要求1-11中任一项所述的方法。
PCT/CN2022/091805 2022-05-09 2022-05-09 触控方法、装置、设备、系统、存储介质及程序产品 WO2023216078A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2022/091805 WO2023216078A1 (zh) 2022-05-09 2022-05-09 触控方法、装置、设备、系统、存储介质及程序产品
CN202280007020.4A CN117377934A (zh) 2022-05-09 2022-05-09 触控方法、装置、设备、系统、存储介质及程序产品

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/091805 WO2023216078A1 (zh) 2022-05-09 2022-05-09 触控方法、装置、设备、系统、存储介质及程序产品

Publications (1)

Publication Number Publication Date
WO2023216078A1 true WO2023216078A1 (zh) 2023-11-16

Family

ID=88729507

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/091805 WO2023216078A1 (zh) 2022-05-09 2022-05-09 触控方法、装置、设备、系统、存储介质及程序产品

Country Status (2)

Country Link
CN (1) CN117377934A (zh)
WO (1) WO2023216078A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104156088A (zh) * 2013-05-14 2014-11-19 汉王科技股份有限公司 主动式电容笔及触控装置
CN104834395A (zh) * 2014-02-10 2015-08-12 宏碁股份有限公司 触控电子系统及其触控判断方法
US20180046269A1 (en) * 2016-08-11 2018-02-15 Microsoft Technology Licensing, Llc Pen Wake Up on Screen Detect
CN108595047A (zh) * 2018-04-20 2018-09-28 北京硬壳科技有限公司 触控物识别方法及装置
CN211087190U (zh) * 2019-07-30 2020-07-24 联想(北京)有限公司 触控显示装置和电子设备
CN112486354A (zh) * 2020-11-30 2021-03-12 维沃移动通信有限公司 电子设备的触控方法、触控组件的信息传输方法及装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104156088A (zh) * 2013-05-14 2014-11-19 汉王科技股份有限公司 主动式电容笔及触控装置
CN104834395A (zh) * 2014-02-10 2015-08-12 宏碁股份有限公司 触控电子系统及其触控判断方法
US20180046269A1 (en) * 2016-08-11 2018-02-15 Microsoft Technology Licensing, Llc Pen Wake Up on Screen Detect
CN108595047A (zh) * 2018-04-20 2018-09-28 北京硬壳科技有限公司 触控物识别方法及装置
CN211087190U (zh) * 2019-07-30 2020-07-24 联想(北京)有限公司 触控显示装置和电子设备
CN112486354A (zh) * 2020-11-30 2021-03-12 维沃移动通信有限公司 电子设备的触控方法、触控组件的信息传输方法及装置

Also Published As

Publication number Publication date
CN117377934A (zh) 2024-01-09

Similar Documents

Publication Publication Date Title
KR102553493B1 (ko) 터치 감지 장치 및 펜과 위치 측정 방법
CN109313519B (zh) 包括力传感器的电子设备
US10725578B2 (en) Apparatus and method for controlling fingerprint sensor
US10996786B2 (en) Method and apparatus for controlling multi window display in interface
EP3096210B1 (en) Method and apparatus for processing input using touch screen
US10261683B2 (en) Electronic apparatus and screen display method thereof
US20150205412A1 (en) Method for obtaining input in electronic device, electronic device, and storage medium
US9880642B2 (en) Mouse function provision method and terminal implementing the same
US20150324004A1 (en) Electronic device and method for recognizing gesture by electronic device
US10545662B2 (en) Method for controlling touch sensing module of electronic device, electronic device, method for operating touch sensing module provided in electronic device, and touch sensing module
US10642380B2 (en) Input device, method, and system for electronic device
KR20180015987A (ko) 펜과 관련된 정보를 판단하는 터치 감지 장치 및 그 제어 방법과 펜
WO2020073980A1 (zh) 寄宿应用的处理方法、设备及计算机可读存储介质
KR102140290B1 (ko) 입력 처리 방법 및 그 전자 장치
US11204645B2 (en) Method for providing haptic feedback, and electronic device for performing same
US10037135B2 (en) Method and electronic device for user interface
US9606665B2 (en) Object moving method and electronic device implementing the same
US20210064207A1 (en) Electronic device and method for changing condition for determining touch input to be pressure input
WO2020135010A1 (zh) 寄宿应用的处理方法、设备及计算机可读存储介质
EP3035313B1 (en) Method and apparatus for remote control
WO2023216078A1 (zh) 触控方法、装置、设备、系统、存储介质及程序产品
CN112654955B (zh) 检测笔相对于电子设备的定位
KR102246270B1 (ko) 전자 장치 및 그 연동 방법
WO2020181423A1 (zh) 书写笔记信息的传输方法及其传输系统、电子设备
US20160062596A1 (en) Electronic device and method for setting block

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22941045

Country of ref document: EP

Kind code of ref document: A1