WO2016058387A1 - Touch interaction processing method, apparatus, and system

Touch interaction processing method, apparatus, and system

Info

Publication number
WO2016058387A1
WO2016058387A1 (PCT/CN2015/080243)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
start time
hand gesture
information
user
Prior art date
Application number
PCT/CN2015/080243
Other languages
English (en)
French (fr)
Inventor
丁强
高小榕
黄肖山
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to KR1020177012912A (KR101875350B1)
Priority to BR112017007752-3A (BR112017007752B1)
Priority to JP2017520458A (JP6353982B2)
Priority to EP15851451.3A (EP3200051B1)
Publication of WO2016058387A1
Priority to US15/486,452 (US10372325B2)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/04166 Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G06F3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04104 Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
    • G06F2203/04105 Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the embodiments of the present invention relate to the field of human-computer interaction, and in particular, to a method, an apparatus, and a system for processing touch interaction.
  • Touch technology is one of the key technologies for human-computer interaction. According to the number of touch points, it is divided into single touch and multi-touch. Single touch can recognize and support only one finger clicking or touching at a time. Multi-touch, also known as multi-point sensing, can collect signals from multiple points simultaneously and perform gesture recognition, so that clicks and touch actions made by several fingers at once can be recognized and supported. Because touch operation is convenient, natural, and friendly, its application fields continue to broaden.
  • In the prior art, a touch command is detected by detecting touch points on the interface: one touch point is treated as a single touch and multiple touch points as a multi-touch, but it is impossible to distinguish which user, which hand, or which finger performs a given touch operation.
  • The multi-touch commands that can be identified are also relatively simple; for example, movement of the index finger plus the thumb and movement of the middle finger plus the thumb are both recognized as zoom operations.
  • Because the left and right hands cannot be distinguished, touch commands are easily confused.
  • For example, simultaneous movement of fingers of the left and right hands is misidentified as a one-handed index-finger-plus-thumb zoom operation.
  • When multiple users perform touch operations, the users cannot be distinguished, so simultaneous touch operation by multiple users in the same area cannot be realized.
  • In summary, the prior art has the following drawback: when a single user or multiple users simultaneously perform touch operations in the same area, touch commands are easily confused.
  • The embodiments of the invention provide a method, an apparatus, and a system for processing touch interaction, which enable touch interaction in the same area by a single user or by multiple users simultaneously.
  • an embodiment of the present invention provides a method for processing a touch interaction, including:
  • receiving first information sent by an electromyography (EMG) signal acquisition device and second information sent by a position capture device; the first information includes a device identifier of the EMG signal acquisition device, the hand gesture with which the user performs the touch, and a first start time of the touch; the second information includes the number of touch points of the touch, a second start time of the touch, and coordinate information of each touch point;
  • the first start time of the touch is the time at which the EMG signal acquisition device recognizes that the user starts the touch operation;
  • the second start time of the touch is the time at which the position capture device recognizes that the user starts the touch operation;
  • if it is determined that the time interval between the first start time and the second start time of the touch is less than a preset threshold, and the number of touch points corresponding to the hand gesture is consistent with the number of touch points included in the second information, a touch command is generated; the touch command includes the device identifier of the EMG signal acquisition device, the hand gesture, and the coordinate information of each touch point.
  • the method further includes:
  • the first information further includes an operating strength with which the user performs the touch;
  • the touch command further includes the operating strength;
  • the hand gesture and operating strength of the user, continuously sent by the EMG signal acquisition device, and the coordinate information of each touch point, continuously sent by the position capture device, are received and used to update the touch command.
  • the method further includes: deleting the touch command if the operating strength is determined to be less than a second preset threshold.
  • an embodiment of the present invention provides a method for processing a touch interaction, including:
  • the EMG signal acquisition device periodically collects surface EMG signals S1 of a plurality of channels;
  • the EMG signal acquisition device determines the hand gesture with which the user performs the touch according to the time-frequency domain characteristics of the surface EMG signals S1, and determines a first start time of the touch according to the hand gesture; the first start time of the touch is the time at which the EMG signal acquisition device recognizes that the user starts the touch operation;
  • the EMG signal acquisition device sends first information to the processing device, so that the processing device generates a touch command according to the first information and second information sent by the position capture device and performs a corresponding interaction operation; the first information includes a device identifier of the EMG signal acquisition device, the hand gesture, and the first start time of the touch; the second information includes the number of touch points of the touch, a second start time of the touch, and coordinate information of each touch point, where the second start time of the touch is the time at which the position capture device recognizes that the user starts the touch operation.
  • the EMG signal acquisition device determining, according to the time-frequency domain characteristics of the surface EMG signals S1, the hand gesture with which the user performs the touch includes:
  • the EMG signal acquisition device determines the type of the hand gesture according to the amplitude and frequency of the surface EMG signals S1.
  • the method further includes: the EMG signal acquisition device determines the operating strength of the user's touch operation according to the time-frequency domain characteristics of the surface EMG signals S1;
  • the first information further includes the operating strength;
  • determining the first start time of the touch according to the hand gesture then becomes determining the first start time of the touch according to the hand gesture and the operating strength.
  • the determining of the operating strength of the user's touch operation includes: the EMG signal acquisition device superimposes and averages the surface EMG signals S1 to obtain a single-channel EMG signal S2, and calculates the average amplitude of the single-channel EMG signal S2 within a sliding time window as the operating strength S.
  • determining the first start time of the touch according to the hand gesture and the operating strength includes: after the hand gesture and the operating strength are obtained, determining the first start time of the touch by looking up a table in which a first preset threshold of the operating strength corresponding to each hand gesture is pre-stored; if the operating strength obtained from the surface EMG signals S1 is greater than the first preset threshold, the current system time is acquired as the first start time of the touch.
  • the processing device generates the touch command when it determines that the time interval between the first start time of the touch and the second start time of the touch is less than a preset threshold and that the number of touch points corresponding to the hand gesture is consistent with the number of touch points included in the second information.
  • an embodiment of the present invention provides a processing apparatus, including:
  • an information receiving module configured to receive first information sent by the EMG signal acquisition device and second information sent by the position capture device; the first information includes a device identifier of the EMG signal acquisition device, the hand gesture with which the user performs the touch, and a first start time of the touch; the second information includes the number of touch points of the user's touch, a second start time of the touch, and coordinate information of each touch point;
  • the first start time of the touch is the time at which the EMG signal acquisition device recognizes that the user starts the touch operation;
  • the second start time of the touch is the time at which the position capture device recognizes that the user starts the touch operation;
  • a command generating module configured to generate a touch command if it is determined that the time interval between the first start time of the touch and the second start time of the touch is less than a preset threshold and that the number of touch points corresponding to the hand gesture is consistent with the number of touch points included in the second information;
  • the touch command includes the device identifier of the EMG signal acquisition device, the hand gesture, and the coordinate information of each touch point;
  • an instruction execution module configured to execute the interaction operation corresponding to the touch command.
  • the command generating module is further configured to update the touch command according to the hand gesture of the user's touch continuously received by the information receiving module from the EMG signal acquisition device, and the coordinate information of each touch point sent by the position capture device.
  • the first information further includes the operating strength of the user's touch; the touch command then further includes the operating strength.
  • the command generating module is further configured to update the touch command according to the hand gesture and the operating strength of the touch continuously received by the information receiving module from the EMG signal acquisition device, and the coordinate information of each touch point sent by the position capture device.
  • the command generating module is further configured to determine that the touch command ends and delete the touch command if it determines that the operating strength is less than a second preset threshold.
  • an embodiment of the present invention provides a device for collecting an electromyogram signal, including:
  • An acquisition module configured to periodically collect surface electromyogram signals S1 of multiple channels
  • a processing module configured to determine, according to the time-frequency domain characteristics of the surface EMG signals S1, the hand gesture with which the user performs the touch, and to determine a first start time of the touch according to the hand gesture; the first start time of the touch is the time at which the EMG signal acquisition device recognizes that the user starts the touch operation;
  • a sending module configured to send first information to the processing device, so that the processing device generates a touch command according to the first information and second information sent by the position capture device and performs a corresponding interaction operation;
  • the first information includes the device identifier of the EMG signal acquisition device, the hand gesture, and the first start time of the touch;
  • the second information includes the number of touch points of the user's touch, the second start time of the touch, and the coordinate information of each touch point; the second start time of the touch is the time at which the position capture device recognizes that the user starts the touch operation.
  • the processing module determining, according to the time-frequency domain characteristics of the surface EMG signals S1, the hand gesture with which the user performs the touch includes: the processing module determines the type of the hand gesture according to the amplitude and frequency of the surface EMG signals S1.
  • the processing module is further configured to determine the operating strength of the user's touch according to the time-frequency domain characteristics of the surface EMG signals S1;
  • the first information further includes the operating strength, and determining the first start time of the touch according to the hand gesture then becomes determining the first start time of the touch according to the hand gesture and the operating strength.
  • the processing module determining the operating strength of the touch includes: the processing module superimposes and averages the surface EMG signals S1 to obtain a single-channel EMG signal S2, and calculates the average amplitude of the single-channel EMG signal S2 within a sliding time window as the operating strength S.
  • the processing module determining the first start time of the touch according to the hand gesture and the operating strength includes: after the hand gesture and the operating strength are obtained, the processing module determines the first start time of the touch by looking up a table in which a first preset threshold of the operating strength corresponding to each hand gesture is pre-stored; if the operating strength obtained from the surface EMG signals S1 is greater than the first preset threshold, the current system time is acquired as the first start time of the touch.
  • the processing device generating the touch command according to the first information and the second information sent by the position capture device includes: the processing device determines that the time interval between the first start time of the touch and the second start time of the touch is less than a preset threshold and that the number of touch points corresponding to the hand gesture is consistent with the number of touch points included in the second information, and then generates the touch command.
  • an embodiment of the present invention provides a touch processing system, including a position capture device, a processing apparatus according to the third aspect or any one of the first to fourth possible implementations of the third aspect, and at least one EMG signal acquisition device according to the fourth aspect or any one of the first to fifth possible implementations of the fourth aspect, where the EMG signal acquisition device and the position capture device are each communicatively connected to the processing device.
  • In the method, apparatus, and system for processing touch interaction provided by the embodiments of the invention, a touch command is generated from the first information sent by the EMG signal acquisition device and the second information sent by the position capture device, and is then executed. Since the device identifier corresponding to the EMG signal acquisition device can distinguish the touch operations of different users and of the left and right hands of the same user, a single user or multiple users can simultaneously perform touch operations in the same area without confusion of touch commands.
  • FIG. 1 is a flowchart of Embodiment 1 of the touch interaction processing method according to the present invention;
  • FIG. 2 is a flowchart of Embodiment 2 of the touch interaction processing method according to the present invention;
  • FIG. 3 is a schematic structural diagram of Embodiment 1 of a processing apparatus according to the present invention;
  • FIG. 4 is a schematic structural diagram of Embodiment 1 of an EMG signal acquisition device according to the present invention;
  • FIG. 5 is a schematic structural diagram of Embodiment 1 of a touch processing system according to the present invention;
  • FIG. 6 is a schematic structural diagram of Embodiment 1 of a processing device according to the present invention;
  • FIG. 7 is a schematic structural diagram of Embodiment 1 of an EMG signal acquisition device according to the present invention;
  • FIG. 8 is a schematic structural diagram of Embodiment 2 of a touch processing system according to the present invention.
  • the execution body of the touch interaction processing method may be a processing device, such as a chip, a mobile terminal, or the like.
  • the processing apparatus can be integrated in a processing device; the processing device can be a mobile terminal, a computer, a server, or the like, and a chip can be integrated in a mobile terminal, a computer, or the like.
  • the processing apparatus and the processing device may be any apparatus and device having storage and computing functions, which is not limited by the embodiment of the present invention.
  • the processing method of the touch interaction may include:
  • Step 101 Receive first information sent by the myoelectric signal acquisition device, and second information sent by the location capture device.
  • the first information includes a device identifier of an electromyography (EMG) signal acquisition device, a hand gesture that the user performs touch, and a first start time of the touch.
  • the second information includes the number of touch points touched by the user, the second start time of the touch, and the coordinate information of each touch point.
  • the first start time of the touch is the time at which the EMG signal acquisition device recognizes that the user starts the touch operation;
  • the second start time of the touch is the time at which the position capture device recognizes that the user starts the touch operation.
  • the myoelectric signal acquisition device and the position capture device can respectively identify the touch operation event of the user in the same touch operation area.
  • the EMG signal acquisition device may be any device that can collect multi-channel surface electromyography (sEMG) signals.
  • the EMG signal acquisition device can be disposed on the user's arm; the position capture device can be any device that can recognize the user's touch operation.
  • the user touch operation event recognized by the myoelectric signal acquisition device may be defined as a myoelectric touch event Touch_EMG
  • the user touch operation event recognized by the position capture device may be defined as a track touch event Touch_TrackSys.
  • the electromyography touch event Touch_EMG contains three parameters, namely: the device identification (device ID) of the myoelectric signal acquisition device, the hand gesture (G) for the user to touch, and the first start time (T1) of the touch.
  • the track touch event Touch_TrackSys includes three parameters: the number of touch points (N2) for the touch, the second start time (T2), and the coordinate information (L) of each touch point.
  • the device ID can uniquely distinguish the EMG signal acquisition device.
  • the device ID may be a number, a letter, or any other form, which is not limited by the embodiment of the present invention. For example, suppose two users perform touch operations at the same time: user A touches with both the left and right hands, and user B touches only with the right hand. An EMG signal acquisition device then needs to be provided on each of these arms: the EMG signal acquisition device with device ID 210 collects the multi-channel surface EMG signals of user A's left hand, the device with ID 211 collects those of user A's right hand, and the device with ID 220 collects those of user B's right hand.
  • Because device ID 210, device ID 211, and device ID 220 uniquely distinguish the different EMG signal acquisition devices, the touch operations of user A's left hand, user A's right hand, and user B's right hand can be distinguished. The device identifier of the EMG signal acquisition device can therefore distinguish the touch operations of different users and of the left and right hands of the same user, so that multiple users, or the two hands of one user, can simultaneously perform touch operations in the same area without confusion of touch operations.
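  • As an illustration only, the two touch events and the device-ID assignment in this two-user scenario could be modeled as follows; the class names, field names, and ID-to-hand mapping are assumptions for the sketch, not notation taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TouchEMG:            # EMG touch event recognized by an EMG signal acquisition device
    device_id: int         # device ID, unique per device (and therefore per user/hand)
    gesture: str           # hand gesture G of the touch
    t1: float              # first start time T1 of the touch

@dataclass
class TouchTrackSys:       # track touch event recognized by the position capture device
    n_points: int          # number of touch points N2
    t2: float              # second start time T2 of the touch
    coords: List[Tuple[float, float]]  # coordinate information L of each touch point

# Device-ID assignment for the example above: user A wears two devices, user B wears one.
DEVICE_REGISTRY = {210: ("user A", "left"), 211: ("user A", "right"), 220: ("user B", "right")}
```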
  • the hand gesture G of the user may be a one-handed single-finger hand gesture or a one-handed multi-finger hand gesture, and the corresponding touch point (N1) may be determined according to the hand gesture G.
  • For example, a one-handed single-finger gesture can be a gesture of the thumb, the index finger, or the ring finger alone; in that case the number of touch points N1 corresponding to the hand gesture G is 1. A one-handed multi-finger gesture can be, for example, the index finger combined with the ring finger; in that case the number of touch points N1 corresponding to the hand gesture G is 2.
  • the EMG signal acquisition device predefines a set of hand gestures, and only the hand gestures in the collection can be recognized. For the hand gestures in the set, the hand gestures that are common in the field may be included, and the new hand gestures may be defined in advance, which is not limited in the embodiment of the present invention.
  • the first start time T1 of the touch indicates the start time of the electromyography touch event Touch_EMG, and is the start time of the touch operation performed by the user recognized by the myoelectric signal acquisition device.
  • the second touch start time T2 represents the start time of the track touch event Touch_TrackSys, which is the start time of the touch operation recognized by the position capture device.
  • Step 103: If it is determined that the time interval between the first start time of the touch and the second start time of the touch is less than a preset threshold, and the number of touch points corresponding to the hand gesture is consistent with the number of touch points included in the second information, generate a touch command; the touch command includes the device identifier of the EMG signal acquisition device, the hand gesture, and the coordinate information of each touch point.
  • the preset threshold T_th can be set as needed.
  • If the time interval between T1 and T2 is less than T_th and N1 equals N2, the EMG touch event Touch_EMG recognized by the EMG signal acquisition device and the track touch event Touch_TrackSys recognized by the position capture device are regarded as the same touch operation event, and a touch command Touch_eff is generated for that touch operation.
  • the touch command Touch_eff includes three parameters: the device ID of the EMG signal acquisition device, the hand gesture G, and the coordinate information L of each touch point.
  • the touch command Touch_eff may be a common touch command such as select, move, zoom, or rotate, and may also be a newly defined touch command such as a user-defined line drawing or a volume adjustment, which is not limited by the embodiment of the present invention.
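  • A minimal sketch of this matching step is shown below, reusing the TouchEMG and TouchTrackSys dataclasses sketched earlier; the gesture-to-N1 table, the threshold value, and the dictionary layout chosen for Touch_eff are illustrative assumptions rather than the patent's implementation.

```python
# Hypothetical sketch: fuse an EMG touch event with a track touch event into Touch_eff.
GESTURE_TOUCH_POINTS = {"thumb": 1, "index": 1, "thumb+index": 2}  # N1 per gesture (assumed table)
T_TH = 0.2  # preset time threshold in seconds (example value)

def generate_touch_command(emg_event: "TouchEMG", track_event: "TouchTrackSys"):
    """Return a Touch_eff-style dict if the two events describe the same touch, else None."""
    n1 = GESTURE_TOUCH_POINTS.get(emg_event.gesture)
    same_time = abs(emg_event.t1 - track_event.t2) < T_TH
    same_points = (n1 == track_event.n_points)
    if same_time and same_points:
        return {
            "device_id": emg_event.device_id,   # distinguishes user and hand
            "gesture": emg_event.gesture,
            "coords": track_event.coords,       # one (x, y) per touch point
        }
    return None
```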
  • Step 105 Perform an interaction operation corresponding to the touch instruction.
  • the interaction operation corresponding to the touch command Touch_eff is executed.
  • The touch command Touch_eff includes the hand gesture G, so additional features can be defined for the hand gesture G to increase the information richness of Touch_eff.
  • If the hand gesture G is a one-handed single-finger gesture, different touch commands Touch_eff can be generated according to which finger is used; for example, the thumb can be defined to indicate target translation and the index finger to indicate click selection.
  • If the hand gesture G is a one-handed single-finger gesture and the generated touch command is to draw a line, different fingers can be defined to represent different colors or line types.
  • If the hand gesture G is a one-handed multi-finger gesture, different touch commands Touch_eff can be generated according to the finger combination; for example, thumb plus index finger can be defined to indicate zoom, thumb plus ring finger to indicate brightness adjustment, and index finger plus ring finger to indicate volume adjustment.
  • The embodiment of the present invention does not limit this.
  • the method may further include:
  • Step 107: Continuously receive the hand gesture of the user's touch sent by the EMG signal acquisition device and the coordinate information of each touch point sent by the position capture device, and update the touch command.
  • the embodiment of the invention provides a method for processing touch interaction, which generates and executes a touch command by using the first information sent by the myoelectric signal acquisition device and the second information sent by the position capture device. Since the device identification of the EMG signal acquisition device can distinguish the touch operations of different users and the left and right hands of the same user, the single user or multiple users can simultaneously perform touch operations in the same area, and the touch commands are not confused.
  • In a further implementation, the first information may further include the operating strength with which the user performs the touch operation, and the touch command then also includes the operating strength.
  • the operating strength (S) of the user's touch corresponds to the user's touch hand gesture G and indicates how forcefully the user performs the touch operation. It can be understood that an effective touch operation must have a certain operating strength; if the operating strength is too small, the contact can be understood as an accidental touch in the touch area rather than an effective touch operation.
  • the EMG signal acquisition device predefines a first preset threshold S_th(G) corresponding to each hand gesture G; when the operating strength S corresponding to the hand gesture G is greater than the first preset threshold S_th(G) corresponding to that gesture, the hand gesture G is considered an effective hand gesture and the touch operation is defined as an EMG touch event Touch_EMG.
  • the first preset threshold S_th(G) can be set as needed.
  • the touch command Touch_eff may include the operating strength S, and additional features may be defined for the operating strength S to increase the information richness of the touch command Touch_eff.
  • For example, when the generated touch command Touch_eff is to draw a line, the magnitude of the operating strength S can be defined to represent the thickness of the line; when the generated touch command Touch_eff is to adjust the volume, the magnitude of S can be defined to indicate the loudness. The embodiment of the present invention does not limit this.
  • the method may further include:
  • Step 109: Continuously receive the hand gesture and operating strength of the touch sent by the EMG signal acquisition device, and the coordinate information of each touch point sent by the position capture device, and update the touch command.
  • Step 111: If it is determined that the operating strength is less than the second preset threshold, delete the touch command.
  • the processing device may predefine a second preset threshold S2_th(G) corresponding to each hand gesture G; when the operating strength S corresponding to the hand gesture G is smaller than the second preset threshold S2_th(G) corresponding to that gesture, the touch command Touch_eff is considered to have ended and the touch command Touch_eff is deleted.
  • the second preset threshold S2_th(G) can be set as needed.
  • Step 111 may also be performed before step 109.
  • the method further includes: maintaining a touch command list for storing touch commands. When a touch command is generated, it is added to the touch command list; when it is determined that a touch command has ended, it is deleted from the touch command list.
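  • An illustrative sketch of this command-list maintenance follows; the per-gesture threshold values and field names are assumptions, and the command dictionaries follow the Touch_eff sketch above.

```python
# Illustrative sketch of the touch-command list maintenance described above.
# Threshold values, gesture names and field names are assumptions, not the patent's notation.
S2_TH = {"index": 0.10, "thumb+index": 0.15}  # second preset threshold S2_th(G) per gesture (example values)

class TouchCommandList:
    def __init__(self):
        self.commands = {}  # device_id -> touch command dict (one active command per device)

    def add(self, cmd):
        # Step 103/105: a newly generated command is stored while it stays active.
        self.commands[cmd["device_id"]] = cmd

    def update(self, device_id, gesture, strength, coords):
        # Steps 107/109: refresh the command with the latest gesture, strength and coordinates.
        cmd = self.commands.get(device_id)
        if cmd is None:
            return
        cmd.update(gesture=gesture, strength=strength, coords=coords)
        # Step 111: the command ends once the operating strength drops below S2_th(G).
        if strength < S2_TH.get(gesture, 0.1):
            del self.commands[device_id]
```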
  • the execution body of the touch interaction processing method may be an EMG signal acquisition apparatus, for example a set of EMG signal acquisition electrodes, or an EMG signal acquisition device such as a wristband-type EMG signal acquisition device.
  • the EMG signal acquisition apparatus can be integrated in an EMG signal acquisition device; for example, a plurality of EMG signal acquisition electrodes can be integrated in a wristband-type EMG signal acquisition device.
  • the EMG signal acquisition apparatus and the EMG signal acquisition device may be any apparatus and device that can collect multi-channel surface EMG signals, which is not limited in the embodiment of the present invention.
  • the EMG signal acquisition device can identify the user's touch operation event, and the user touch operation event can be defined as the myoelectric touch event Touch_EMG.
  • the processing method of the touch interaction may include:
  • Step 201 The EMG signal collecting device periodically collects the surface electromyogram signal S1 of the plurality of channels.
  • the myoelectric signal acquisition device may comprise a plurality of electrodes capable of collecting surface electromyogram signals, and each electrode collects a surface electromyogram signal of one channel.
  • the EMG signal acquisition device periodically collects surface electromyogram signals S1 of multiple channels, and the collection period can be set as needed.
  • the multi-channel surface EMG signals S1 may be pre-processed; the pre-processing may include signal amplification of the collected multi-channel surface EMG signals S1, power-frequency interference notch filtering, band-pass filtering, and the like.
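  • A minimal pre-processing sketch of that kind is given below, assuming a 1000 Hz sampling rate and 50 Hz mains frequency (both example values, not taken from the patent).

```python
import numpy as np
from scipy.signal import iirnotch, butter, filtfilt

FS = 1000.0  # sampling rate in Hz (assumed)

def preprocess(s1, gain=1000.0, mains_hz=50.0):
    """Amplify, notch out power-frequency interference, and band-pass filter
    multi-channel surface EMG signals s1 with shape (channels, samples)."""
    s1 = np.asarray(s1, dtype=float) * gain                          # signal amplification
    b_notch, a_notch = iirnotch(mains_hz, Q=30.0, fs=FS)             # power-frequency notch filter
    s1 = filtfilt(b_notch, a_notch, s1, axis=-1)
    b_bp, a_bp = butter(4, [20.0, 450.0], btype="bandpass", fs=FS)   # typical sEMG band (assumed)
    return filtfilt(b_bp, a_bp, s1, axis=-1)
```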
  • the myoelectric signal acquisition device can be placed on the user's arm. If the user's left and right hands are touch-operated, each arm needs to be equipped with a myoelectric signal acquisition device.
  • Each EMG signal acquisition device has a device identification (device ID) corresponding thereto, and the device ID can uniquely distinguish the EMG signal acquisition device.
  • the device ID may be a number, a letter, or any other form, which is not limited by the embodiment of the present invention.
  • the device identification corresponding to the EMG signal acquisition device can distinguish the touch operations of different users and the left and right hands of the same user, so that the right and left hands of multiple users or one user can simultaneously touch in the same area. Control operations without the confusion of touch operations.
  • Step 203 The EMG signal acquisition device determines a hand gesture of the user to perform touch according to the time-frequency domain characteristic of the surface electromyography signal S1; and determines a first start time of the touch according to the hand gesture.
  • the first start time of the touch is the start time of the EMG signal acquisition device to recognize the user performing the touch operation.
  • the EMG signal acquisition device determines the hand gesture (G) of the user to perform the touch according to the time-frequency domain characteristic of the surface EMG signal S1, which may specifically include:
  • the EMG signal acquisition device determines the type of the hand gesture G according to the amplitude and frequency of the surface EMG signal S1.
  • the linear discriminant analysis (LDA) algorithm or the support vector machine (SVM) algorithm may be used to determine the hand gesture type G.
  • the method for determining the type of the hand gesture is not limited in the embodiment of the present invention.
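  • As a hedged illustration of such a classifier (the feature choices, gesture labels, and use of scikit-learn are assumptions, not the patent's implementation), an LDA model could be trained on simple time-frequency features as follows.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 1000.0  # sampling rate in Hz (assumed)

def time_frequency_features(window):
    """Toy time-frequency features for one multi-channel sEMG window of shape (channels, samples):
    mean absolute value, zero-crossing count, and mean frequency per channel."""
    mav = np.mean(np.abs(window), axis=1)
    zc = np.count_nonzero(np.diff(np.signbit(window).astype(int), axis=1), axis=1)
    spectrum = np.abs(np.fft.rfft(window, axis=1))
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / FS)
    mean_freq = spectrum @ freqs / np.sum(spectrum, axis=1)
    return np.concatenate([mav, zc, mean_freq])

def train_gesture_classifier(train_windows, train_labels):
    """train_windows: list of (channels, samples) arrays; train_labels: gesture names."""
    X = np.array([time_frequency_features(w) for w in train_windows])
    clf = LinearDiscriminantAnalysis()   # an SVM (sklearn.svm.SVC) could be used the same way
    clf.fit(X, train_labels)
    return clf
```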
  • the type of the hand gesture G can be a one-handed single-finger gesture or a one-handed multi-finger gesture, and each type of gesture has a corresponding number of touch points N1.
  • For example, the number of touch points N1 corresponding to a one-handed thumb gesture is 1, and the number of touch points N1 corresponding to a one-handed index-finger-plus-ring-finger gesture is 2; the type of the hand gesture G is not limited in the embodiment of the present invention.
  • a set of hand gestures is predefined, and only the hand gestures in the collection can be recognized.
  • the hand gestures that are common in the field may be included, and the new hand gestures may be defined in advance, which is not limited in the embodiment of the present invention.
  • Determining the first start time (T1) of the touch according to the hand gesture G includes: after the hand gesture G is obtained, the EMG signal acquisition device defines the touch operation as an EMG touch event Touch_EMG and acquires the current system time as the first start time T1 of the touch in the EMG touch event Touch_EMG.
  • Step 205 The EMG signal collecting device sends the first information to the processing device, so that the processing device generates a touch command according to the first information and the second information sent by the location capturing device, and performs a corresponding interaction operation.
  • the first information includes a device identifier of the myoelectric signal acquisition device, a hand gesture, and a first start time of the touch;
  • the second information includes the number of touch points of the user's touch, the second start time of the touch, and the coordinate information of each touch point; the second start time of the touch is the time at which the position capture device recognizes that the user starts the touch operation.
  • the embodiment of the invention provides a method for processing touch interaction.
  • the EMG signal acquisition device periodically collects surface EMG signals of multiple channels, determines the hand gesture of the touch operation, and determines the first start time of the touch according to the hand gesture.
  • the EMG signal acquisition device sends the first information including the device identifier, the hand gesture, and the first start time of the touch to the processing device, so that the processing device generates the touch according to the first information and the second information sent by the location capture device. Control the instructions and execute them. Since the device identification of the EMG signal acquisition device can distinguish the touch operations of different users and the left and right hands of the same user, the single user or multiple users can simultaneously perform touch operations in the same area, and the touch commands are not confused.
  • the EMG signal acquisition device determines the operation strength of the user to perform the touch according to the time-frequency domain characteristic of the surface electromyogram signal S1.
  • the first information further includes the operating strength, and determining the first start time of the touch according to the hand gesture is determining the first start time of the touch according to the hand gesture and the operating strength.
  • determining the strength (S) of the user's touch operation may specifically include:
  • the EMG signal acquisition device superimposes and averages the surface EMG signals S1 to obtain a single-channel EMG signal S2, and uses a sliding time window to calculate the average amplitude of the single-channel EMG signal S2;
  • the average amplitude is taken as the operating strength S.
  • the sliding time window is described by a window width I and a sliding step length J, and these parameter values can be set as needed.
  • a number of calculations K of the sliding time window may also be used, where K is an integer greater than 1.
  • the width I of the sliding time window indicates that the single-channel EMG signal S2 is averaged over a window of width I to obtain an average amplitude Z1;
  • the sliding step J indicates that the average amplitude of the single-channel EMG signal S2 over a window of width I is computed once every interval J; the number of calculations K means that K consecutive results are averaged to obtain an average amplitude Z2. Either the average amplitude Z1 or the average amplitude Z2 can be used as the operating strength S.
  • the average amplitude of the single-channel EMG signal S2 is thus calculated as the operating strength S by using a sliding time window.
  • For example: calculate the average amplitude Z1 of the single-channel EMG signal S2 over a window of 5 seconds, recomputed once every 1 second; average 3 consecutive values of Z1 to obtain the average amplitude Z2; take Z2 as the operating strength S.
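  • A sketch of that computation with the example parameters above (I = 5 s, J = 1 s, K = 3) might look like this; the sampling rate is an assumption.

```python
import numpy as np

FS = 1000           # sampling rate in Hz (assumed)
I, J, K = 5, 1, 3   # window width (s), sliding step (s), number of averaged results

def operating_strength(s1):
    """s1: multi-channel surface EMG, shape (channels, samples); assumes at least I seconds of signal.
    Returns the operating strength S (Z2) from the last K sliding windows."""
    s2 = np.mean(s1, axis=0)                                # superimpose-and-average -> single channel S2
    win, step = I * FS, J * FS
    starts = range(0, len(s2) - win + 1, step)
    z1 = [np.mean(np.abs(s2[i:i + win])) for i in starts]   # average amplitude per window (Z1)
    return float(np.mean(z1[-K:]))                          # Z2: mean of the last K values of Z1
```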
  • the first start time of the touch is determined according to the hand gesture and the operating strength, and specifically includes:
  • after the hand gesture G and the operating strength S are obtained, the first start time T1 of the touch is determined by looking up a table.
  • the first preset threshold S_th(G) of the operating strength corresponding to each hand gesture G is pre-stored in the table; if the operating strength S obtained from the surface EMG signal S1 is greater than the first preset threshold S_th(G), the current system time is acquired as the first start time T1 of the touch.
  • each hand gesture G is predefined with a corresponding first preset threshold S_th(G), and the first preset threshold S_th(G) can be set as needed.
  • When the operating strength exceeds the threshold, the touch operation is an effective touch operation; it is defined as an EMG touch event Touch_EMG, and the current system time is acquired as the first start time T1 of the touch in the EMG touch event Touch_EMG.
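  • A compact sketch of this threshold-table lookup follows; the gesture names and threshold values are purely illustrative.

```python
import time

# First preset threshold S_th(G) per hand gesture G (example values, settable as needed).
S_TH = {"thumb": 0.12, "index": 0.10, "thumb+index": 0.18}

def first_start_time(gesture, strength):
    """Return T1 (the current system time) if the touch is effective for this gesture, else None."""
    if strength > S_TH.get(gesture, float("inf")):  # unknown gestures are never treated as effective
        return time.time()   # effective touch: define Touch_EMG and record T1
    return None              # strength too small: treated as an accidental contact
```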
  • the processing device generating the touch command according to the first information and the second information sent by the position capture device includes: the processing device determines that the time interval between the first start time of the touch and the second start time of the touch is less than the preset threshold and that the number of touch points corresponding to the hand gesture is consistent with the number of touch points included in the second information, and then generates the touch command.
  • the touch command may include a device identifier of the myoelectric signal acquisition device, a hand gesture, and coordinate information of each touch point.
  • the touch command may further include a touch operation strength of the user.
  • the touch command includes the hand gesture G, so additional features can be defined for the hand gesture G to increase the information richness of the touch command.
  • If the hand gesture is a one-handed single-finger gesture, different touch commands can be generated according to which finger is used; for example, the thumb can be defined to indicate target translation and the index finger to indicate click selection.
  • If the hand gesture is a one-handed single-finger gesture and the generated touch command is to draw a line, different fingers can be defined to represent different colors or line types.
  • If the hand gesture is a one-handed multi-finger gesture, different touch commands can be generated according to the finger combination; for example, thumb plus index finger can be defined to indicate zoom, index finger plus ring finger to indicate volume adjustment, and thumb plus ring finger to indicate brightness adjustment. The embodiment of the present invention does not limit this.
  • the touch command may further include an operation force S, and an additional feature may be defined for the operation force S to increase the information richness of the touch command.
  • For example, when the generated touch command is to draw a line, the magnitude of the operating strength can be defined to indicate the thickness of the line; when the generated touch command is to adjust the volume, the magnitude of the operating strength can be defined to indicate the loudness. The embodiment of the present invention does not limit this.
  • FIG. 3 is a schematic structural diagram of Embodiment 1 of a processing apparatus according to the present invention. As shown in FIG. 3, the processing apparatus may include:
  • the information receiving module 11 is configured to receive first information sent by the myoelectric signal collecting device and second information sent by the position capturing device.
  • the first information includes the device identifier of the EMG signal acquisition device, the hand gesture with which the user performs the touch, and the first start time of the touch; the second information includes the number of touch points of the user's touch, the second start time of the touch, and the coordinate information of each touch point.
  • the first start time of the touch is the time at which the EMG signal acquisition device recognizes that the user starts the touch operation;
  • the second start time of the touch is the time at which the position capture device recognizes that the user starts the touch operation.
  • the command generating module 13 is configured to generate the touch command if it is determined that the time interval between the first start time of the touch and the second start time of the touch is less than a preset threshold and that the number of touch points corresponding to the hand gesture is consistent with the number of touch points included in the second information.
  • the touch command includes a device identifier of the myoelectric signal acquisition device, a hand gesture, and coordinate information of each touch point.
  • the instruction execution module 15 is configured to execute an interaction operation corresponding to the touch instruction.
  • the command generating module 13 is further configured to update the touch command according to the hand gesture of the user's touch continuously received by the information receiving module from the EMG signal acquisition device, and the coordinate information of each touch point sent by the position capture device.
  • the first information may further include the operating strength of the user's touch; the touch command may further include the operating strength.
  • the command generating module 13 is further configured to update the touch command according to the hand gesture and operating strength of the touch continuously received by the information receiving module from the EMG signal acquisition device, and the coordinate information of each touch point sent by the position capture device.
  • the command generating module 13 is further configured to determine that the touch command ends and delete the touch command if it determines that the operating strength is less than the second preset threshold.
  • the processing apparatus may further include: a storage module 17 configured to maintain a touch command list for storing touch commands.
  • the embodiment of the present invention does not limit the form of the processing device, and may be a chip, a smart phone, a computer, a server, or the like, or may be another device having computing and storage capabilities.
  • The embodiment of the invention provides a processing apparatus: the information receiving module receives the first information sent by the EMG signal acquisition device and the second information sent by the position capture device; the command generating module generates a touch command that includes the device identifier of the EMG signal acquisition device, the hand gesture, and the coordinate information of each touch point; and the instruction execution module performs the interaction operation corresponding to the touch command. Since the device identifier of the EMG signal acquisition device can distinguish the touch operations of different users and of the left and right hands of the same user, a single user or multiple users can simultaneously perform touch operations in the same area without confusion of touch commands.
  • FIG. 4 is a schematic structural diagram of Embodiment 1 of the EMG signal acquisition device of the present invention; as shown in FIG. 4, the EMG signal acquisition device may include:
  • the acquisition module 21 is configured to periodically collect surface electromyogram signals S1 of the plurality of channels.
  • the processing module 23 is configured to determine, according to the time-frequency domain feature of the surface electromyogram signal S1, a hand gesture that the user performs touch, and determine a first start time of the touch according to the hand gesture.
  • the first start time of the touch is the start time of the EMG signal acquisition device to recognize the user performing the touch operation.
  • the sending module 25 is configured to send the first information to the processing device, so that the processing device generates the touch command according to the first information and the second information sent by the location capturing device, and performs a corresponding interaction operation.
  • the first information includes a device identifier of the myoelectric signal acquisition device, a hand gesture, and a first start time of the touch;
  • the second information includes the number of touch points of the user's touch, the second start time of the touch, and the coordinate information of each touch point; the second start time of the touch is the time at which the position capture device recognizes that the user starts the touch operation.
  • the processing module 23 determining, according to the time-frequency domain characteristics of the surface EMG signal S1, the hand gesture with which the user performs the touch includes: the processing module 23 determines the type of the hand gesture according to the amplitude and frequency of the surface EMG signal S1.
  • the processing module 23 is further configured to determine the operating strength of the user's touch according to the time-frequency domain characteristics of the surface EMG signal S1;
  • the first information further includes the operating strength;
  • determining the first start time of the touch according to the hand gesture then becomes determining the first start time of the touch according to the hand gesture and the operating strength.
  • the processing module 23 determining the operating strength of the user's touch operation may include: the processing module 23 superimposes and averages the surface EMG signals S1 to obtain a single-channel EMG signal S2, and calculates the average amplitude of the single-channel EMG signal S2 within a sliding time window as the operating strength S.
  • the processing module 23 determines the first start time of the touch according to the hand gesture and the operating strength, and may include: after acquiring the hand gesture and the operating strength, the processing module 23 determines the first start time of the touch by using the lookup table.
  • the first preset threshold value of the operating strength corresponding to each hand gesture is pre-stored in the table, and if the operating strength obtained according to the surface electromyogram signal S1 is greater than the first preset threshold, the current system is acquired. Time is used as the first start time of the touch.
  • the processing device generates the touch command according to the first information and the second information sent by the location capture device, and the method includes: the processing device determines the time between the first start time of the touch and the second start time of the touch If the interval is less than the preset threshold, and the number of touch points corresponding to the hand gesture is consistent with the number of touch points included in the second information, a touch command is generated.
  • the touch command may include a device identifier of the myoelectric signal acquisition device, a hand gesture, and coordinate information of each touch point.
  • the touch command may further include a touch operation strength of the user.
  • the embodiment of the present invention does not limit the form of the myoelectric signal acquisition device; it may take the form of a wearable device such as a wristband or a watch, or may be multiple electrodes capable of collecting surface electromyogram signals.
  • the embodiment of the invention provides an electromyography signal acquisition device: the acquisition module collects the surface electromyogram signals S1 of multiple channels; the processing module determines the hand gesture with which the user performs the touch and the first start time of the touch; and the sending module sends the first information to the processing device, so that the processing device generates and executes a touch command according to the first information and the second information sent by the position capture device, where the touch command includes a device identifier of the myoelectric signal acquisition device, the hand gesture, and the coordinate information of each touch point. Since the device identifier corresponding to the EMG signal acquisition device can distinguish the touch operations of different users and of the left and right hands of the same user, a single user's two hands or multiple users can perform touch operations simultaneously in the same area without confusing touch commands.
  • FIG. 5 is a schematic structural diagram of Embodiment 1 of the touch processing system of the present invention.
  • the touch processing system may include: a position capture device 105, a processing device 103, and at least one myoelectric signal acquisition device 101.
  • the position capture device 105 can be any existing device capable of obtaining the number of touch points, the touch start time, and the touch point coordinates of a touch operation.
  • the processing device 103 can adopt the structure of the device embodiment of FIG. 3 and, correspondingly, can execute the technical solution of the method embodiment of FIG. 1.
  • the electromyography signal acquisition device 101 can adopt the structure of the device embodiment of FIG. 4 and, correspondingly, can execute the technical solution of the method embodiment of FIG. 2.
  • the EMG signal acquisition device 101 and the position capture device 105 are respectively connected to the processing device 103, and can communicate by means of wired, wireless, Bluetooth, wifi, or the like.
  • the location capture device 105 can include: a capacitive sensing module 1031, an infrared sensing module 1033, and an ultrasound sensing module 1035.
  • the capacitive sensing module 1031 is configured to obtain the number of touch points, the touch start time, and the touch point coordinates of the touch operation through a capacitive touch screen;
  • the infrared sensing module 1033 is configured to obtain the number of touch points, the touch start time, and the touch point coordinates of the touch operation through an infrared touch sensing system;
  • the ultrasonic sensing module 1035 is configured to obtain the number of touch points, the touch start time, and the touch point coordinates of the touch operation through an ultrasonic touch sensing system.
  • the position capture device 105 and the processing device 103 may be integrated or may be separately and independently arranged.
  • the touch processing system may further include a user feedback device 107 for displaying an execution result of the touch instruction, such as an LED display screen, a projection display device, a speaker, a tactile feedback device, and the like.
  • the user feedback device 107 may include: a display module 1071, a sound module 1073, and a haptic feedback module 1075.
  • the touch processing system can be applied to traditional electronic touch devices, such as touch mobile phones and touch computers, and can also be applied to fields such as education, corporate offices, entertainment, and advertisement display. For example, in an art class taught on desk surfaces, the desk surface serves as the drawing paper, fingers serve as brushes, and several students work together on the drawing paper to complete one piece of work.
  • FIG. 6 is a schematic structural diagram of Embodiment 1 of a processing device according to the present invention.
  • the processing device may include: a receiver 31, a first memory 33, and a processor 35.
  • the receiver 31 and the first memory 33 are each connected to the processor 35 through a bus.
  • the receiver 31 is configured to receive first information sent by the myoelectric signal acquisition device and second information sent by the location capture device.
  • the first information includes a device identifier of the myoelectric signal acquisition device, the hand gesture with which the user performs the touch, and the first start time of the touch;
  • the second information includes the number of touch points with which the user performs the touch, the second start time of the touch, and the coordinate information of each touch point.
  • the first start time of the touch is the time at which the EMG signal acquisition device recognizes that the user starts the touch operation;
  • the second start time of the touch is the time at which the position capture device recognizes that the user starts the touch operation.
  • the first memory 33 is used to store instructions.
  • the processor 35 is configured to run the instructions stored in the first memory 33 to perform the following steps:
  • if it is determined that the time interval between the first start time of the touch and the second start time of the touch is less than the preset threshold, and the number of touch points corresponding to the hand gesture matches the number of touch points included in the second information, generate a touch command, where the touch command includes a device identifier of the myoelectric signal acquisition device, the hand gesture, and the coordinate information of each touch point; and execute the interaction operation corresponding to the touch command.
  • the processor 35 is further configured to update the touch command according to the hand gesture with which the user performs the touch, continuously received from the EMG signal acquisition device, and the coordinate information of each touch point, continuously received from the position capture device.
  • the first information may further include a touch operation strength of the user; the touch command may further include the operation strength.
  • the processor 35 is further configured to perform the following step: update the touch command according to the hand gesture and the operating strength with which the user performs the touch, continuously received from the EMG signal acquisition device, and the coordinate information of each touch point, continuously received from the position capture device.
  • the processor 35 is further configured to: if it is determined that the operating strength is less than the second preset threshold, determine that the touch command ends, and delete the touch command.
  • the processing device may further include: a second memory 37, configured to maintain a touch command list, where the touch command list stores touch commands.
  • the second memory 37 is connected to the processor 35 via a bus.
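A minimal sketch of the touch-command list maintenance described in the preceding paragraphs (update on newly received gesture, strength, and coordinates; delete when the strength falls below the second preset threshold). The dictionary-based storage and the threshold values are illustrative assumptions.

```python
# Hypothetical "second preset threshold" table: gesture -> minimum strength.
SECOND_THRESHOLD = {"index": 0.05, "thumb_index": 0.08, "index_ring": 0.08}

touch_commands = {}  # device_id -> active touch command (the touch command list)

def register_touch_command(cmd):
    touch_commands[cmd["device_id"]] = cmd

def update_touch_command(device_id, gesture, strength, coords):
    cmd = touch_commands.get(device_id)
    if cmd is None:
        return
    if strength < SECOND_THRESHOLD.get(gesture, 0.0):
        del touch_commands[device_id]   # strength dropped: the command has ended
    else:
        cmd.update(gesture=gesture, strength=strength, coords=coords)
```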
  • the embodiment of the present invention does not limit the form of the processing device, and may be a chip, a smart phone, a computer, a server, or the like, or may be another device having computing and storage capabilities.
  • the embodiment of the invention provides a processing device, the receiver receives the first information sent by the myoelectric signal acquisition device, and the second information sent by the location capture device; the processor generates the touch instruction and performs the interaction operation corresponding to the touch instruction.
  • the touch command includes a device identifier of the myoelectric signal acquisition device, the hand gesture, and the coordinate information of each touch point. Since the device identifier of the EMG signal acquisition device can distinguish the touch operations of different users and of the left and right hands of the same user, a single user's two hands or multiple users can perform touch operations simultaneously in the same area without confusing touch commands.
  • FIG. 7 is a schematic structural diagram of Embodiment 1 of the electromyography signal collection device of the present invention.
  • the EMG signal acquisition device may include: a processor 41, a transmitter 43, and a third memory 45; the transmitter 43 and the third memory 45 are each connected to the processor 41 through a bus.
  • the third memory 45 is for storing instructions.
  • the processor 41 is configured to run the instructions stored in the third memory 45 to perform the following steps:
  • the surface electromyogram signal S1 of the plurality of channels is periodically collected.
  • the hand gesture with which the user performs the touch is determined according to the time-frequency domain features of the surface electromyogram signal S1, and the first start time of the touch is determined according to the hand gesture.
  • the first start time of the touch is the time at which the EMG signal acquisition device recognizes that the user starts the touch operation.
  • determining, according to the time-frequency domain features of the surface electromyogram signal S1, the hand gesture with which the user performs the touch may include: identifying the hand gesture, by gesture type, according to the amplitude and frequency of the surface electromyogram signal S1.
  • the transmitter 43 is configured to send the first information to the processing device, so that the processing device generates the touch instruction and performs the corresponding interaction operation according to the first information and the second information sent by the location capture device.
  • the first information includes a device identifier of the myoelectric signal acquisition device, the hand gesture, and the first start time of the touch;
  • the second information includes the number of touch points with which the user performs the touch, the second start time of the touch, and the coordinate information of each touch point; the second start time of the touch is the time at which the position capture device recognizes that the user starts the touch operation.
  • the processor 41 is further configured to determine, according to the time-frequency domain features of the surface electromyogram signal S1, the operating strength with which the user performs the touch; the first information then further includes the operating strength, and correspondingly, determining the first start time of the touch according to the hand gesture becomes determining the first start time of the touch according to the hand gesture and the operating strength.
  • determining the operating strength with which the user performs the touch may include: superposition-averaging the surface electromyogram signals S1 to obtain a single-channel electromyogram signal S2, and calculating the mean amplitude of the single-channel electromyogram signal S2 over a sliding time window as the operating strength S.
  • determining the first start time of the touch according to the hand gesture and the operation strength may include: after acquiring the hand gesture and the operation strength, determining the first start time of the touch by looking up the table.
  • the table pre-stores, for each hand gesture, a first preset threshold of the operating strength; if the operating strength obtained from the surface electromyogram signal S1 is greater than the first preset threshold, the current system time is taken as the first start time of the touch.
  • the processing device generating the touch command according to the first information and the second information sent by the position capture device may include: if the processing device determines that the time interval between the first start time of the touch and the second start time of the touch is less than a preset threshold, and the number of touch points corresponding to the hand gesture matches the number of touch points included in the second information, the touch command is generated.
  • the touch command may include a device identifier of the EMG signal acquisition device, the hand gesture, and the coordinate information of each touch point.
  • the touch command may further include a touch operation strength of the user.
  • the embodiment of the present invention does not limit the form of the myoelectric signal acquisition device; it may take the form of a wearable device such as a wristband or a watch, or may be multiple electrodes capable of collecting surface electromyogram signals.
  • the embodiment of the invention provides an EMG signal acquisition device: the processor collects the surface electromyogram signals S1 of multiple channels and determines the hand gesture with which the user performs the touch and the first start time of the touch; the transmitter sends the first information to the processing device, so that the processing device generates and executes a touch command according to the first information and the second information sent by the position capture device, where the touch command includes a device identifier of the EMG signal acquisition device, the hand gesture, and the coordinate information of each touch point. Since the device identifier corresponding to the EMG signal acquisition device can distinguish the touch operations of different users and of the left and right hands of the same user, a single user's two hands or multiple users can perform touch operations simultaneously in the same area without confusing touch commands.
  • FIG. 8 is a schematic structural diagram of Embodiment 2 of the touch processing system of the present invention.
  • the touch processing system may include: a location capture device 205, a processing device 203, and at least one myoelectric signal acquisition device 201.
  • the position capture device 205 can be any existing device capable of obtaining the number of touch points, the touch start time, and the touch point coordinates of a touch operation.
  • the processing device 203 can adopt the structure of the device embodiment of FIG. 6 and, correspondingly, can execute the technical solution of the method embodiment of FIG. 1; the electromyography signal acquisition device 201 can adopt the structure of the device embodiment of FIG. 7 and, correspondingly, can execute the technical solution of the method embodiment of FIG. 2.
  • the EMG signal collection device 201 and the location capture device 205 are respectively connected to the processing device 203, and can communicate by means of wired, wireless, Bluetooth, wifi, and the like.
  • the position capture device 205 may include: a capacitive sensor 2031, an infrared sensor 2033, and an ultrasonic sensor 2035.
  • the capacitive sensor 2031 is configured to obtain the number of touch points, the touch start time, and the touch point coordinates of the touch operation through a capacitive touch screen;
  • the infrared sensor 2033 is configured to obtain the number of touch points, the touch start time, and the touch point coordinates of the touch operation through an infrared touch sensing system;
  • the ultrasonic sensor 2035 is configured to obtain the number of touch points, the touch start time, and the touch point coordinates of the touch operation through an ultrasonic touch sensing system.
  • the location capture device 205 and the processing device 203 may be integrated or may be separately and independently set.
  • the touch processing system may further include a user feedback device 207 configured to display the execution result of the touch command, such as an LED display screen, a projection display device, a speaker, or a tactile feedback device.
  • the user feedback device 207 may include: a display device 2071, a sound device 2073, and a haptic feedback device 2075.
  • the touch processing system can be applied to traditional electronic touch devices, such as touch mobile phones and touch computers, and can also be applied to fields such as education, corporate offices, entertainment, and advertisement display. For example, in an art class taught on desk surfaces, the desk surface serves as the drawing paper, fingers serve as brushes, and several students work together on the drawing paper to complete one piece of work.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Dermatology (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

A touch interaction processing method, device, and system. The touch interaction processing method includes: receiving first information sent by an electromyography signal acquisition device and second information sent by a position capture device (101); if it is determined that the time interval between the first start time of the touch and the second start time of the touch is less than a preset threshold, and the number of touch points corresponding to the hand gesture matches the number of touch points included in the second information, generating a touch command, where the touch command includes a device identifier of the electromyography signal acquisition device, the hand gesture, and the coordinate information of each touch point (103); and executing the interaction operation corresponding to the touch command (105). The processing method, device, and system enable a single user's two hands, or multiple users, to perform touch interaction operations simultaneously in the same area.

Description

触控交互的处理方法、装置和系统 技术领域
本发明实施例涉及人机交互领域,尤其涉及一种触控交互的处理方法、装置和系统。
背景技术
触控技术是人机交互的关键技术之一,根据触控点的数目分为单点触控和多点触控。单点触控只能识别和支持每次一个手指的点击、触控。多点触控又称多重触控、多点感应,可以同时采集多点信号并进行手势识别,从而实现识别和支持五个手指同时做的点击、触控动作。由于触控操作便捷、自然、友好,其应用领域越来越广泛。
现有技术中,通过检测界面上的触控点进行触控指令识别,一个触控点即为单点触控,多个触控点即为多点触控,但是该触控操作是由哪个用户哪只手哪几个手指进行的却无法区分。当单用户进行单手多点触控操作时,由于无法区分手指,因此可识别的多点触控指令也比较单一,比如食指加拇指的移动和中指加拇指的移动都识别为缩放操作。当单用户进行双手触控操作时,由于无法区分左右手,容易造成触控指令的混淆,比如将左右手单指的同时移动误识别为单手的食指加拇指的缩放操作。当多用户进行触控操作时,由于无法区分用户,多用户同时同区域的触控操作将无法实现。
综上,现有技术存在如下缺陷:单用户双手或者多用户在同一区域同时进行触控操作时,容易造成触控指令的混淆。
发明内容
本发明实施例提供一种触控交互的处理方法、装置和系统,用以实现单用户双手或者多用户在同一区域同时进行触控交互操作。
第一方面,本发明实施例提供一种触控交互的处理方法,包括:
接收肌电信号采集装置发送的第一信息,以及位置捕获装置发送的第二信息;其中,所述第一信息包括所述肌电信号采集装置的设备标识、用户进 行触控的手型手势,以及触控第一起始时间;所述第二信息包括所述用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息;所述触控第一起始时间为所述肌电信号采集装置识别出用户进行触控操作的起始时间,所述触控第二起始时间为所述位置捕获装置识别出用户进行触控操作的起始时间;
若确定所述触控第一起始时间与所述触控第二起始时间之间的时间间隔小于预设阈值,且所述手型手势对应的触控点数与所述第二信息包括的所述触控点数一致,生成触控指令,所述触控指令包括所述肌电信号采集装置的设备标识、所述手型手势和所述各触控点的坐标信息;
执行所述触控指令对应的交互操作。
结合第一方面,在第一方面的第一种可能的实现方式中,所述方法还包括:
持续接收所述肌电信号采集装置发送的所述用户进行触控的手型手势,以及所述位置捕获装置发送的所述各触控点的坐标信息,对所述触控指令进行更新。
结合第一方面,在第一方面的第二种可能的实现方式中,所述第一信息中还包括用户进行触控的操作力度;所述触控指令中还包括所述操作力度。
结合第一方面的第二种可能的实现方式,在第一方面的第三种可能的实现方式中,持续接收所述肌电信号采集装置发送的所述用户进行触控的手型手势和所述操作力度,以及所述位置捕获装置发送的所述各触控点的坐标信息,对所述触控指令进行更新。
结合第一方面的第三种可能的实现方式,在第一方面的第四种可能的实现方式中,所述方法还包括:若确定所述操作力度小于第二预设门限值,删除所述触控指令。
第二方面,本发明实施例提供一种触控交互的处理方法,包括:
肌电信号采集装置周期性采集多个通道的表面肌电信号S1;
所述肌电信号采集装置根据所述表面肌电信号S1的时频域特征,确定用户进行触控的手型手势;根据所述手型手势确定触控第一起始时间;所述触控第一起始时间为所述肌电信号采集装置识别出用户进行触控操作的起始时间;
所述肌电信号采集装置向处理装置发送第一信息,以供所述处理装置根 据所述第一信息,以及位置捕获装置发送的第二信息,生成触控指令并执行对应的交互操作;其中,所述第一信息包括所述肌电信号采集装置的设备标识、所述手型手势,以及所述触控第一起始时间;所述第二信息包括所述用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息,所述触控第二起始时间为所述位置捕获装置识别出用户进行触控操作的起始时间。
结合第二方面,在第二方面的第一种可能的实现方式中,所述肌电信号采集装置根据所述表面肌电信号S1的时频域特征,确定用户进行触控的手型手势包括:所述肌电信号采集装置根据所述表面肌电信号S1的幅值和频率,按照手型手势的类型进行所述手型手势的判断。
结合第二方面,在第二方面的第二种可能的实现方式中,还包括:所述肌电信号采集装置根据所述表面肌电信号S1的时频域特征,确定用户进行触控的操作力度;则所述第一信息中还包括所述操作力度,所述根据所述手型手势确定触控第一起始时间为根据所述手型手势和所述操作力度确定所述触控第一起始时间。
结合第二方面的第二种可能的实现方式,在第二方面的第三种可能的实现方式中,所述确定用户进行触控的操作力度包括:所述肌电信号采集装置将所述表面肌电信号S1进行叠加平均得到单通道肌电信号S2,并采用滑动时间窗的方式计算所述单通道肌电信号S2的平均幅值作为所述操作力度S。
结合第二方面的第二种可能的实现方式,在第二方面的第四种可能的实现方式中,所述根据所述手型手势和所述操作力度确定所述触控第一起始时间包括:在获取所述手型手势和所述操作力度后,通过查表确定所述触控第一起始时间;其中,所述表中预先存储有各手型手势所对应的操作力度的第一预设门限值,若根据所述表面肌电信号S1获取的操作力度大于所述第一预设门限值,则获取当前的系统时间作为所述触控第一起始时间。
结合第二方面或者第二方面的第一种至第四种任一种可能的实现方式,在第二方面的第五种可能的实现方式中,所述处理装置根据所述第一信息,以及位置捕获装置发送的第二信息,生成触控指令包括:所述处理装置确定所述触控第一起始时间与所述触控第二起始时间之间的时间间隔小于预设阈值,且所述手型手势对应的触控点数与所述第二信息包括的触控点数一致,则生成所述触控指令。
第三方面,本发明实施例提供一种处理装置,包括:
信息接收模块,用于接收肌电信号采集装置发送的第一信息,以及位置捕获装置发送的第二信息;其中,所述第一信息包括所述肌电信号采集装置的设备标识、用户进行触控的手型手势,以及触控第一起始时间;所述第二信息包括所述用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息;所述触控第一起始时间为所述肌电信号采集装置识别出用户进行触控操作的起始时间,所述触控第二起始时间为所述位置捕获装置识别出用户进行触控操作的起始时间;
指令生成模块,用于若确定所述触控第一起始时间与所述触控第二起始时间之间的时间间隔小于预设阈值,且所述手型手势对应的触控点数与所述第二信息包括的所述触控点数一致,生成触控指令,所述触控指令包括所述肌电信号采集装置的设备标识、所述手型手势和所述各触控点的坐标信息;
指令执行模块,用于执行所述触控指令对应的交互操作。
结合第三方面,在第三方面的第一种可能的实现方式中,所述指令生成模块还用于,根据所述信息接收模块持续接收到的所述肌电信号采集装置发送的所述用户进行触控的手型手势,以及所述位置捕获装置发送的各触控点的坐标信息,对所述触控指令进行更新。
结合第三方面,在第三方面的第二种可能的实现方式中,所述第一信息中还包括用户进行触控的操作力度;所述触控指令中还包括所述操作力度。
结合第三方面的第二种可能的实现方式,在第三方面的第三种可能的实现方式中,所述指令生成模块还用于,根据所述信息接收模块持续接收到的所述肌电信号采集装置发送的所述用户进行触控的手型手势和所述操作力度,以及所述位置捕获装置发送的所述各触控点的坐标信息,对所述触控指令进行更新。
结合第三方面的第三种可能的实现方式,在第三方面的第四种可能的实现方式中,所述指令生成模块还用于,若确定所述操作力度小于第二预设门限值,判断所述触控指令结束,并删除所述触控指令。
第四方面,本发明实施例提供一种肌电信号采集装置,包括:
采集模块,用于周期性采集多个通道的表面肌电信号S1;
处理模块,用于根据所述表面肌电信号S1的时频域特征,确定用户进行触控的手型手势;根据所述手型手势确定触控第一起始时间;所述触控第一起始时间为所述肌电信号采集装置识别出用户进行触控操作的起始时间;
发送模块,用于向处理装置发送第一信息,以供所述处理装置根据所述第一信息,以及位置捕获装置发送的第二信息,生成触控指令并执行对应的交互操作;其中,所述第一信息包括所述肌电信号采集装置的设备标识、所述手型手势,以及所述触控第一起始时间;所述第二信息包括所述用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息,所述触控第二起始时间为所述位置捕获装置识别出用户进行触控操作的起始时间。
结合第四方面,在第四方面的第一种可能的实现方式中,所述处理模块根据所述表面肌电信号S1的时频域特征,确定用户进行触控的手型手势包括:所述处理模块根据所述表面肌电信号S1的幅值和频率,按照手型手势的类型进行所述手型手势的判断。
结合第四方面,在第四方面的第二种可能的实现方式中,所述处理模块还用于,根据所述表面肌电信号S1的时频域特征,确定用户进行触控的操作力度;则所述第一信息中还包括所述操作力度,所述根据所述手型手势确定触控第一起始时间为根据所述手型手势和所述操作力度确定所述触控第一起始时间。
结合第四方面的第二种可能的实现方式,在第四方面的第三种可能的实现方式中,所述处理模块确定用户进行触控的操作力度包括:所述处理模块将所述表面肌电信号S1进行叠加平均得到单通道肌电信号S2,并采用滑动时间窗的方式计算所述单通道肌电信号S2的平均幅值作为所述操作力度S。
结合第四方面的第三种可能的实现方式,在第四方面的第四种可能的实现方式中,所述处理模块根据所述手型手势和所述操作力度确定所述触控第一起始时间包括:所述处理模块在获取所述手型手势和所述操作力度后,通过查表确定所述触控第一起始时间;其中,所述表中预先存储有各手型手势所对应的操作力度的第一预设门限值,若根据所述表面肌电信号S1获取的操作力度大于所述第一预设门限值,则获取当前的系统时间作为所述触控第一起始时间。
结合第四方面或者第四方面的第一种至第四种任一种可能的实现方式,在第四方面的第五种可能的实现方式中,所述处理装置根据所述第一信息,以及位置捕获装置发送的第二信息,生成触控指令包括:所述处理装置确定所述触控第一起始时间与所述触控第二起始时间之间的时间间隔小于预设阈值,且所述手型手势对应的触控点数与所述第二信息包括的所述触控点数一 致,则生成所述触控指令。
第五方面,本发明实施例提供一种触控处理系统,包括位置捕获装置、如第三方面、第三方面的第一至第四种任一种可能的实现方式的处理装置,至少一个如第四方面、第四方面的第一至第五种任一种可能的实现方式的肌电信号采集装置;其中,所述肌电信号采集装置和所述位置捕获装置分别与所述处理装置通信连接。
本发明实施例提供的触控交互的处理方法、装置和系统,通过肌电信号采集装置发送的第一信息,以及位置捕获装置发送的第二信息,生成触控指令并执行。由于肌电信号采集装置对应的设备标识可以区分出不同用户以及同一用户左右手的触控操作,实现了单用户双手或者多用户在同一区域同时进行触控操作,触控指令不会混淆。
附图说明
为了更清楚地说明本发明实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作一简单地介绍,显而易见地,下面描述中的附图是本发明的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1为本发明触控交互的处理方法实施例一的流程图;
图2为本发明触控交互的处理方法实施例二的流程图;
图3为本发明处理装置实施例一的结构示意图;
图4为本发明肌电信号采集装置实施例一的结构示意图;
图5为本发明触控处理系统实施例一的结构示意图;
图6为本发明处理设备实施例一的结构示意图;
图7为本发明肌电信号采集设备实施例一的结构示意图;
图8为本发明触控处理系统实施例二的结构示意图。
具体实施方式
为使本发明实施例的目的、技术方案和优点更加清楚,下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本发明一部分实施例,而不是全部的实施例。基于 本发明中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
图1为本发明触控交互的处理方法实施例一的流程图,该触控交互的处理方法的执行主体可以是处理装置,例如:芯片、移动终端等。该处理装置可以集成在处理设备中,处理设备可以为,移动终端、计算机、服务器等,芯片可以集成在移动终端、计算机等情况。处理装置和处理设备可以是任意具有存储、计算功能的装置和设备,本发明实施例对此不加以限制。如图1所示,该触控交互的处理方法可以包括:
步骤101、接收肌电信号采集装置发送的第一信息,以及位置捕获装置发送的第二信息。
其中,第一信息包括肌电(electromyography,简称EMG)信号采集装置的设备标识、用户进行触控的手型手势,以及触控第一起始时间。第二信息包括用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息。触控第一起始时间为肌电信号采集装置识别出用户进行触控操作的起始时间,触控第二起始时间为位置捕获装置识别出用户进行触控操作的起始时间。
在本实施例中,肌电信号采集装置和位置捕获装置在同一触控操作区域可以分别识别出用户的触控操作事件。需要说明,肌电信号采集装置可以是任意可以采集多通道表面肌电(surface electromyography,简称sEMG)信号的装置,优选地,肌电信号采集装置可以设置在用户的手臂上;位置捕获装置可以是任意可以识别出用户的触控操作的装置。肌电信号采集装置识别出的用户触控操作事件可以定义为肌电触控事件Touch_EMG,位置捕获装置识别出的用户触控操作事件可以定义为轨迹触控事件Touch_TrackSys。其中,肌电触控事件Touch_EMG包含三个参数,分别为:肌电信号采集装置的设备标识(设备ID)、用户进行触控的手型手势(G)、触控第一起始时间(T1)。其中,轨迹触控事件Touch_TrackSys包含三个参数,分别为:用户进行触控的触控点数(N2)、触控第二起始时间(T2)、各触控点的坐标信息(L)。
其中,设备ID可以唯一区分肌电信号采集装置。设备ID可以是数字、字母或任意形式,本发明实施例对此不加以限制。例如:两个用户同时进行触控操作,用户A的左右手都进行触控操作,用户B只使用右手进行触控操作,则用户的每条手臂都需要设置一个肌电信号采集装置,设备ID为210的肌电信号采集装置采集用户A左手的多通道表面肌电信号,设备ID为211 的肌电信号采集装置采集用户A右手的多通道表面肌电信号,设备ID为220的肌电信号采集装置采集用户B右手的多通道表面肌电信号。由于设备ID210、设备ID211、设备ID220可以唯一区分不同的肌电信号采集装置,从而可以区分出用户A左手、用户A右手以及用户B右手的触控操作。所以,通过肌电信号采集装置的设备标识可以区分出不同用户以及同一用户左右手的触控操作,从而可以实现多用户或者一个用户的左右手同时同区域进行的触控操作,而不会产生触控操作的混淆。
其中,用户进行触控的手型手势G可以是单手单手指的手型手势,也可以是单手多手指的手型手势,根据手型手势G可以确定相应的触控点数(N1)。例如:单手单手指的手型手势可以是拇指、食指或者无名指的手型手势,此时,该手型手势G对应的触控点数N1为1;单手多手指的手型手势可以是食指与无名指组合的手型手势,此时,该手型手势G对应的触控点数N1为2。可选的,肌电信号采集装置预先定义一个手型手势的集合,只有该集合中的手型手势才能被识别出来。对于集合中的手型手势,可以包括本领域常见的手型手势,也可以预先定义新的手型手势,本发明实施例对此不加以限制。
其中,触控第一起始时间T1表示肌电触控事件Touch_EMG的起始时间,为肌电信号采集装置识别出的用户进行触控操作的起始时间。触控第二起始时间T2表示轨迹触控事件Touch_TrackSys的起始时间,为位置捕获装置识别出的用户进行触控操作的起始时间。
其中,各触控点的坐标信息L表示在触控区域中由位置捕获装置识别出的N2个触控点的位置坐标,L={l1,l2,…,lN2},其中,li(i=1,2,…,N2)表示由位置捕获装置识别出的每个触控点的二维坐标。
步骤103、若确定触控第一起始时间与触控第二起始时间之间的时间间隔小于预设阈值,且手型手势对应的触控点数与第二信息包括的触控点数一致,生成触控指令,所述触控指令包括肌电信号采集装置的设备标识、手型手势和各触控点的坐标信息。
其中,预设阈值T_th可以根据需要进行设置。
具体地,肌电信号采集装置识别出的肌电触控事件Touch_EMG中触控第一起始时间T1与位置捕获装置识别出的轨迹触控事件Touch_TrackSys中触控第二起始时间T2的时间间隔T_gap=|T1-T2|,当时间间隔T_gap小于预 设阈值T_th,且肌电触控事件Touch_EMG中手型手势G对应的触控点数N1与轨迹触控事件Touch_TrackSys中用户进行触控的触控点数N2一致,即,T_gap=|T1-T2|<T_th且N1=N2,则说明肌电信号采集装置识别出的肌电触控事件Touch_EMG与位置捕获装置识别出的轨迹触控事件Touch_TrackSys是同一个触控操作事件,则,根据该触控操作可以生成触控指令Touch_eff。该触控指令Touch_eff包含三个参数,分别为:肌电信号采集装置的设备标识(设备ID)、用户进行触控的手型手势(G)、各触控点的坐标信息(L)。
可选的,触控指令Touch_eff可以是选中、移动、缩放、旋转等常见触控指令,也可以是用户定义的画线条、调节音量等新增触控指令,本发明实施例对此不加以限制。
步骤105、执行触控指令对应的交互操作。
具体地,生成触控指令Touch_eff后,将执行该触控指令Touch_eff对应的交互操作。
需要说明的是,触控指令Touch_eff中包含手型手势G,可以对手型手势G定义附加特征,增加触控指令Touch_eff的信息丰富度。例如:当手型手势G为单手单手指的手势,则可以根据不同的手指生成不同的触控指令Touch_eff,可以定义拇指表示目标平移、食指表示单击选中等;当手型手势G为单手单手指的手势,且生成的触控指令为画线条时,则可以定义不同的手指代表不同的颜色或者线条类型;当手型手势G为单手多手指的手势,则可以根据不同的手指组合生成不同的触控指令Touch_eff,可以定义拇指加食指表示缩放、拇指加无名指表示调节亮度、食指加无名指表示调节音量等。本发明实施例对此不加以限制。
进一步,在步骤105之后,还可以包括:
步骤107、持续接收肌电信号采集装置发送的用户进行触控的手型手势,以及位置捕获装置发送的各触控点的坐标信息,对触控指令进行更新。
本发明实施例提供了一种触控交互的处理方法,通过肌电信号采集装置发送的第一信息,以及位置捕获装置发送的第二信息,生成触控指令并执行。由于肌电信号采集装置的设备标识可以区分出不同用户以及同一用户左右手的触控操作,可以实现单用户双手或者多用户在同一区域同时进行触控操作,触控指令不会混淆。
可选的,在上述实施例中,第一信息中还可以包括用户进行触控的操作 力度,触控指令中还包括所述操作力度。
其中,用户进行触控的操作力度(S)与用户进行触控的手型手势G相对应,表示用户进行触控操作的轻重程度。可以理解的是,有效的触控操作必然有一定的操作力度,如果操作力度过小,可以理解为是用户在触控区域不小心进行的触碰,并不是有效的触控操作。可选的,肌电信号采集装置预先定义每一个手型手势G对应的第一预设门限值S_th(G),当手型手势G对应的操作力度S大于该手型手势G对应的第一预设门限值S_th(G)时,认为该手型手势G是有效的手型手势,定义该触控操作为肌电触控事件Touch_EMG。其中,第一预设门限值S_th(G)可以根据需要进行设置。
需要说明的是,触控指令Touch_eff中可以包含操作力度S,可以对操作力度S定义附加特征,增加触控指令Touch_eff的信息丰富度。例如:生成的触控指令Touch_eff为画线条时,可以定义操作力度S的大小表示线条的粗细;生成的触控指令Touch_eff为调节音量时,可以定义操作力度S的大小表示音量的大小。本发明实施例对此不加以限制。
进一步,在上述实施例中,在步骤105之后,还可以包括:
步骤109、持续接收肌电信号采集装置发送的用户进行触控的手型手势和操作力度,以及位置捕获装置发送的各触控点的坐标信息,对触控指令进行更新。
步骤111、若确定操作力度小于第二预设门限值,删除触控指令。
其中,处理装置可以预先定义每一个手型手势G对应的第二预设门限值S2_th(G),当手型手势G对应的操作力度S小于该手型手势G对应的第二预设门限值S2_th(G)时,认为该触控指令Touch_eff结束,则删除该触控指令Touch_eff。其中,第二预设门限值S2_th(G)可以根据需要进行设置。
可选的,步骤111也可以位于步骤109之前。
可选的,在上述实施例中,还可以包括:维护一个触控指令列表,用于存储触控指令。当新生成一个触控指令时,将该触控指令加入到触控指令列表中;当判断触控指令结束时,将该触控指令从触控指令列表中删除。
图2为本发明触控交互的处理方法实施例二的流程图,该触控交互的处理方法的执行主体可以是肌电信号采集装置,例如,多个肌电信号采集电极片、腕带型肌电信号采集设备。该肌电信号采集装置可以集成在肌电信号采集设备中,例如:多个肌电信号采集电极片集成在腕带型肌电信号采集设备 中,肌电信号采集装置和肌电信号采集设备可以是任意可以采集多通道表面肌电信号的装置和设备,本发明实施例对此不加以限制。在本实施例中,肌电信号采集装置可以识别出用户的触控操作事件,可以将该用户触控操作事件定义为肌电触控事件Touch_EMG。如图2所示,该触控交互的处理方法可以包括:
步骤201、肌电信号采集装置周期性采集多个通道的表面肌电信号S1。
其中,肌电信号采集装置可以包含多个可以采集表面肌电信号的电极,每个电极采集一个通道的表面肌电信号。当用户进行多点触控操作时,该肌电信号采集装置周期性采集多个通道的表面肌电信号S1,采集周期可以根据需要进行设置。可选的,多个通道的表面肌电信号S1可以是经过预处理之后的,预处理过程可以包括:对采集到的多个通道的表面肌电信号S1进行信号放大处理,进行工频干扰陷波处理,进行滤波处理等。
优选地,肌电信号采集装置可以设置在用户的手臂上。如果用户的左右手都进行触控操作,则,每条手臂都需要设置一个肌电信号采集装置。每个肌电信号采集装置具有与其对应的设备标识(设备ID),该设备ID可以唯一区分该肌电信号采集装置。设备ID可以是数字、字母或者其他任意形式,本发明实施例对此不加以限制。当用户进行多点触控操作时,通过肌电信号采集装置对应的设备标识可以区分出不同用户以及同一用户左右手的触控操作,从而可以实现多用户或者一个用户的左右手同时在同区域进行触控操作,而不会产生触控操作的混淆。
步骤203、肌电信号采集装置根据表面肌电信号S1的时频域特征,确定用户进行触控的手型手势;根据手型手势确定触控第一起始时间。触控第一起始时间为肌电信号采集装置识别出用户进行触控操作的起始时间。
可选的,肌电信号采集装置根据表面肌电信号S1的时频域特征,确定用户进行触控的手型手势(G),具体可以包括:
肌电信号采集装置根据表面肌电信号S1的幅值和频率,按照手型手势的类型进行手型手势G的判断。具体地,可以采用线性判别分析(Linear Discriminant Analysis,简称LDA)算法或者支持向量机(Support Vector Machine,简称SVM)算法,进行手型手势类型G的判断。本发明实施例对判断手型手势类型的方法不加以限制。
其中,手型手势G的类型可以是单手单手指的手型手势,也可以是单手 多手指的手型手势,每一种手型手势G有其相应的触控点数N1。例如单手拇指的手势对应的触控点数N1为1,单手食指与无名指组合的手势对应的触控点数N1为2,本发明实施例对手型手势G的类型不加以限制。可选的,在进行手型手势的识别时,预先定义一个手型手势的集合,只有该集合中的手型手势才能被识别出来。对于集合中的手型手势,可以包括本领域常见的手型手势,也可以预先定义新的手型手势,本发明实施例对此不加以限制。
可选的,根据手型手势G确定触控第一起始时间(T1),具体可以包括:在获取手型手势G后,肌电信号采集装置将该触控操作定义为肌电触控事件Touch_EMG,获取当前的系统时间作为肌电触控事件Touch_EMG中的触控第一起始时间T1。
步骤205、肌电信号采集装置向处理装置发送第一信息,以供处理装置根据第一信息,以及位置捕获装置发送的第二信息,生成触控指令并执行对应的交互操作。其中,第一信息包括肌电信号采集装置的设备标识、手型手势,以及触控第一起始时间;第二信息包括用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息,触控第二起始时间为位置捕获装置识别出用户进行触控操作的起始时间。
本发明实施例提供了一种触控交互的处理方法,肌电信号采集装置周期性采集多个通道的表面肌电信号,确定触控操作的手势手型,根据手型手势确定触控第一起始时间。肌电信号采集装置将包含设备标识、手型手势、触控第一起始时间的第一信息发送给处理装置,以供处理装置根据该第一信息以及位置捕获装置发送的第二信息,生成触控指令并执行。由于肌电信号采集装置的设备标识可以区分出不同用户以及同一用户左右手的触控操作,可以实现单用户双手或者多用户在同一区域同时进行触控操作,触控指令不会混淆。
可选的,在上述实施例中,还可以包括:肌电信号采集装置根据表面肌电信号S1的时频域特征,确定用户进行触控的操作力度。则第一信息中还包括所述操作力度,根据手型手势确定触控第一起始时间为根据手型手势和操作力度确定触控第一起始时间。
可选的,确定用户进行触控的操作力度(S),具体可以包括:
肌电信号采集装置将表面肌电信号S1进行叠加平均得到单通道肌电信号S2,并采用滑动时间窗的方式计算单通道肌电信号S2的平均幅值作为所 述操作力度S。
其中,滑动时间窗包括滑动时间窗的宽度I,滑动时间窗的滑动步长J,参数数值可以根据需要进行设置。可选的,还可以包括滑动时间窗的计算次数K,K为大于1的整数。其中,滑动时间窗的宽度I表示对单通道肌电信号S2在I上取平均得到平均幅值Z1,滑动时间窗的滑动步长J表示每间隔时间J都要计算一次单通道肌电信号S2在I上的平均幅值,滑动时间窗的计算次数K表示取K次的计算结果再取平均得到平均幅值Z2,可以将平均幅值Z1或者平均幅值Z2作为操作力度S。例如:设置I为5秒钟、J为1秒钟、K为3次,则,采用滑动时间窗的方式计算单通道肌电信号S2的平均幅值作为所述操作力度S,具体步骤为:对单通道肌电信号S2计算5秒长度内的平均幅值Z1,每过1秒钟都计算一次,取连续3次的平均幅值Z1做平均得到平均幅值Z2,将平均幅值Z2作为操作力度S。
可选的,根据手型手势和操作力度确定触控第一起始时间,具体可以包括:
在获取手型手势G和操作力度S后,通过查表确定触控第一起始时间T1。其中,表中预先存储有各手型手势G所对应的操作力度的第一预设门限值S_th(G),若根据表面肌电信号S1获取的操作力度S大于第一预设门限值S_th(G),则获取当前的系统时间作为触控第一起始时间T1。
其中,在预先定义的手型手势集合中,每一个手型手势G都预先定义有相应的第一预设门限值S_th(G),该第一预设门限值S_th(G)可以根据需要进行设置。在获取手型手势G和对应的操作力度S后,如果判断操作力度S大于该手型手势G对应的第一预设门限值S_th(G),即,S>S_th(G),则认为该触控操作是一个有效的触控操作,定义为肌电触控事件Touch_EMG,并获取当前的系统时间作为该肌电触控事件Touch_EMG的触控第一起始时间T1。
可选的,在上述实施例中,处理装置根据第一信息,以及位置捕获装置发送的第二信息,生成触控指令包括:处理装置确定触控第一起始时间与触控第二起始时间之间的时间间隔小于预设阈值,且手型手势对应的触控点数与第二信息包括的触控点数一致,则生成触控指令。触控指令可以包括肌电信号采集装置的设备标识、手型手势和各触控点的坐标信息。可选的,触控指令还可以包括用户进行触控的操作力度。
需要说明的是,触控指令中包含手型手势G,可以对手型手势G定义附 加特征,增加触控指令的信息丰富度。例如:当手型手势为单手单手指的手势,则可以根据不同的手指生成不同的触控指令,可以定义拇指表示目标平移、食指表示单击选中等;当手型手势为单手单手指的手势,且生成的触控指令为画线条时,则可以定义不同的手指代表不同的颜色或者线条类型;当手型手势为单手多手指的手势,则可以根据不同的手指组合生成不同的触控指令,可以定义拇指加食指表示缩放、食指加无名指表示调节音量、拇指加无名指表示调节亮度。本发明实施例对此不加以限制。
需要说明的是,触控指令中还可以包含操作力度S,可以对操作力度S定义附加特征,增加触控指令的信息丰富度。例如:生成的触控指令为画线条时,可以定义操作力度的大小表示线条的粗细;生成的触控指令为调节音量时,可以定义操作力度的大小表示音量的大小。本发明实施例对此不加以限制。
图3为本发明处理装置实施例一的结构示意图;如图3所示,该处理装置可以包括:
信息接收模块11,用于接收肌电信号采集装置发送的第一信息,以及位置捕获装置发送的第二信息。其中,第一信息包括肌电信号采集装置的设备标识、用户进行触控的手型手势,以及触控第一起始时间;第二信息包括用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息。触控第一起始时间为肌电信号采集装置识别出用户进行触控操作的起始时间,触控第二起始时间为位置捕获装置识别出用户进行触控操作的起始时间。
指令生成模块13,用于若确定触控第一起始时间与触控第二起始时间之间的时间间隔小于预设阈值,且手型手势对应的触控点数与第二信息包括的触控点数一致,生成触控指令,所述触控指令包括肌电信号采集装置的设备标识、手型手势和各触控点的坐标信息。
指令执行模块15,用于执行触控指令对应的交互操作。
进一步,指令生成模块13还用于,根据信息接收模块持续接收到的肌电信号采集装置发送的用户进行触控的手型手势,以及位置捕获装置发送的各触控点的坐标信息,对触控指令进行更新。
可选的,第一信息中还可以包括用户进行触控的操作力度;触控指令中还可以包括所述操作力度。
可选的,指令生成模块13还用于,根据信息接收模块持续接收到的肌电 信号采集装置发送的用户进行触控的手型手势和操作力度,以及位置捕获装置发送的各触控点的坐标信息,对触控指令进行更新。
可选的,所述指令生成模块13还用于,若确定操作力度小于第二预设门限值,判断触控指令结束,并删除触控指令。
可选的,肌电信号采集装置还可以包括:存储模块17,用于维护一个触控指令列表,用于存储所述触控指令。
需要说明的是,本发明实施例对处理装置的形态不加以限定,可以是芯片、智能手机、电脑、服务器等,也可以是具备计算、存储能力的其他装置。
本发明实施例提供了一种处理装置,信息接收模块接收肌电信号采集装置发送的第一信息,以及位置捕获装置发送的第二信息;指令生成模块生成触控指令,触控指令包括肌电信号采集装置的设备标识、手型手势和各触控点的坐标信息;指令执行模块执行触控指令对应的交互操作。由于肌电信号采集装置的设备标识可以区分出不同用户以及同一用户左右手的触控操作,可以实现单用户双手或者多用户在同一区域同时进行触控操作,触控指令不会混淆。
图4为本发明肌电信号采集装置实施例一的结构示意图;如图4所示,该肌电信号采集装置可以包括:
采集模块21,用于周期性采集多个通道的表面肌电信号S1。
处理模块23,用于根据表面肌电信号S1的时频域特征,确定用户进行触控的手型手势,根据手型手势确定触控第一起始时间。其中,触控第一起始时间为肌电信号采集装置识别出用户进行触控操作的起始时间。
发送模块25,用于向处理装置发送第一信息,以供处理装置根据第一信息,以及位置捕获装置发送的第二信息,生成触控指令并执行对应的交互操作。其中,第一信息包括肌电信号采集装置的设备标识、手型手势,以及触控第一起始时间;第二信息包括用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息,触控第二起始时间为位置捕获装置识别出用户进行触控操作的起始时间。
可选的,处理模块23根据表面肌电信号S1的时频域特征,确定用户进行触控的手型手势包括:处理模块23根据表面肌电信号S1的幅值和频率,按照手型手势的类型进行手型手势的判断。
可选的,处理模块23还可以用于,根据表面肌电信号S1的时频域特征, 确定用户进行触控的操作力度;则第一信息中还包括操作力度,相应地,根据手型手势确定触控第一起始时间为根据手型手势和操作力度确定触控第一起始时间。
可选的,处理模块23确定用户进行触控的操作力度,可以包括:处理模块23将表面肌电信号S1进行叠加平均得到单通道肌电信号S2,并采用滑动时间窗的方式计算单通道肌电信号S2的平均幅值作为操作力度S。
可选的,处理模块23根据手型手势和操作力度确定触控第一起始时间,可以包括:处理模块23在获取手型手势和操作力度后,通过查表确定触控第一起始时间。其中,表中预先存储有各手型手势所对应的操作力度的第一预设门限值,若根据表面肌电信号S1获取的操作力度大于第一预设门限值,则获取当前的系统时间作为触控第一起始时间。
可选的,处理装置根据第一信息,以及位置捕获装置发送的第二信息,生成触控指令,可以包括:处理装置确定触控第一起始时间与触控第二起始时间之间的时间间隔小于预设阈值,且手型手势对应的触控点数与第二信息包括的触控点数一致,则生成触控指令。触控指令可以包括肌电信号采集装置的设备标识、手型手势和各触控点的坐标信息。可选的,触控指令还可以包括用户进行触控的操作力度。
需要说明的是,本发明实施例对肌电信号采集装置的形态不加以限定,可以是腕带、手表等可穿戴设备形态,也可以是多个可以采集表面肌电信号的电极。
本发明实施例提供了一种肌电信号采集装置,采集模块采集多个通道的表面肌电信号S1;处理模块确定用户进行触控的手型手势以及触控第一起始时间;发送模块向处理装置发送第一信息,以供处理装置根据该第一信息以及位置捕获装置发送的第二信息,生成触控指令并执行,触控指令包括肌电信号采集装置的设备标识、手型手势和各触控点的坐标信息。由于肌电信号采集装置对应的设备标识可以区分出不同用户以及同一用户左右手的触控操作,可以实现单用户双手或者多用户在同一区域同时进行触控操作,触控指令不会混淆。
图5为本发明触控处理系统实施例一的结构示意图,如图5所示,该触控处理系统可以包括:位置捕获装置105、处理装置103,以及至少一个肌电信号采集装置101。其中,位置捕获装置105可以采用任何可以获取触控操 作的触控点数、触控起始时间和触控点坐标的现有装置;处理装置103可以采用图3装置实施例的结构,相对应地,可以执行图1方法实施例的技术方案;肌电信号采集装置101可以采用图4装置实施例的结构,相对应地,可以执行图2方法实施例的技术方案。
其中,肌电信号采集装置101和位置捕获装置105分别与处理装置103通信连接,可以采用有线、无线、蓝牙、wifi等方式进行通信。
可选的,位置捕获装置105可以包括:电容感应模块1031、红外感应模块1033、超声感应模块1035。其中,电容感应模块1031用于通过电容触屏获得触控操作的触控点数、触控起始时间和触控点坐标;红外感应模块1033用于通过红外触控感应系统获得触控操作的触控点数、触控起始时间和触控点坐标;超声感应模块1035用于通过超声触控感应系统获得触控操作的触控点数、触控起始时间和触控点坐标。
可选的,位置捕获装置105和处理装置103可以集成在一体,也可以分开独立设置。
可选的,触控处理系统还可以包括用户反馈装置107,用于显示触控指令的执行结果,例如是LED显示屏幕、投影显示设备、扬声器、触觉反馈装置等。可选的,用户反馈装置107可以包括:显示模块1071、声音模块1073、触觉反馈模块1075。
需要说明的是,本发明实施例提供的触控处理系统,可以应用于传统的电子触控设备,例如触控手机、触控电脑等,也可以应用于教育、企业办公、娱乐、广告展示等多个领域,例如使用课桌桌面进行的美术课教学,将课桌桌面作为画纸,手指作为画笔,多名学生在画纸上共同完成一幅作品。
图6为本发明处理设备实施例一的结构示意图;如图6所示,该处理设备可以包括:接收器31、第一存储器33和处理器35,接收器31、第一存储器33分别通过总线与处理器35连接。
接收器31,用于接收肌电信号采集设备发送的第一信息,以及位置捕获设备发送的第二信息。其中,第一信息包括肌电信号采集设备的设备标识、用户进行触控的手型手势,以及触控第一起始时间;第二信息包括用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息。触控第一起始时间为肌电信号采集设备识别出用户进行触控操作的起始时间,触控第二起始时间为位置捕获设备识别出用户进行触控操作的起始时间。
第一存储器33,用于存储指令。
处理器35,用于运行第一存储器33中所存储的指令,以执行以下步骤:
若确定触控第一起始时间与触控第二起始时间之间的时间间隔小于预设阈值,且手型手势对应的触控点数与第二信息包括的触控点数一致,生成触控指令,所述触控指令包括肌电信号采集设备的设备标识、手型手势和各触控点的坐标信息。
执行触控指令对应的交互操作。
可选的,处理器35还可以用于执行以下步骤:根据信息接收模块持续接收到的肌电信号采集设备发送的用户进行触控的手型手势,以及位置捕获设备发送的各触控点的坐标信息,对触控指令进行更新。
可选的,第一信息中还可以包括用户进行触控的操作力度;触控指令中还可以包括所述操作力度。
可选的,处理器35还可以用于执行以下步骤:根据信息接收模块持续接收到的肌电信号采集设备发送的用户进行触控的手型手势和操作力度,以及位置捕获设备发送的各触控点的坐标信息,对触控指令进行更新。
可选的,处理器35还可以用于执行以下步骤:若确定操作力度小于第二预设门限值,判断触控指令结束,并删除触控指令。
可选的,处理设备还可以包括:第二存储器37,用于维护一个触控指令列表,触控指令列表中存储有触控指令。第二存储器37通过总线与处理器35连接。
需要说明的是,本发明实施例对处理设备的形态不加以限定,可以是芯片、智能手机、电脑、服务器等,也可以是具备计算、存储能力的其他设备。
本发明实施例提供了一种处理设备,接收器接收肌电信号采集设备发送的第一信息,以及位置捕获设备发送的第二信息;处理器生成触控指令并执行触控指令对应的交互操作,触控指令包括肌电信号采集设备的设备标识、手型手势和各触控点的坐标信息。由于肌电信号采集设备的设备标识可以区分出不同用户以及同一用户左右手的触控操作,可以实现单用户双手或者多用户在同一区域同时进行触控操作,触控指令不会混淆。
图7为本发明肌电信号采集设备实施例一的结构示意图;如图7所示,该肌电信号采集设备可以包括:处理器41、发送器43和第三存储器45,发送器43、第三存储器45分别通过总线与处理器41连接。
第三存储器45,用于存储指令。
处理器41,用于运行第三存储器45中所存储的指令,以执行以下步骤:
周期性采集多个通道的表面肌电信号S1。
根据表面肌电信号S1的时频域特征,确定用户进行触控的手型手势,根据手型手势确定触控第一起始时间。其中,触控第一起始时间为肌电信号采集设备识别出用户进行触控操作的起始时间。
可选的,根据表面肌电信号S1的时频域特征,确定用户进行触控的手型手势可以包括:根据表面肌电信号S1的幅值和频率,按照手型手势的类型进行所述手型手势的判断。
发送器43,用于向处理设备发送第一信息,以供处理设备根据第一信息,以及位置捕获设备发送的第二信息,生成触控指令并执行对应的交互操作。其中,第一信息包括肌电信号采集设备的设备标识、手型手势,以及触控第一起始时间;第二信息包括用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息,触控第二起始时间为位置捕获设备识别出用户进行触控操作的起始时间。
可选的,处理器41还可以用于执行以下步骤:根据表面肌电信号S1的时频域特征,确定用户进行触控的操作力度;则第一信息中还包括操作力度,相应地,根据手型手势确定触控第一起始时间为根据手型手势和操作力度确定触控第一起始时间。
可选的,确定用户进行触控的操作力度,可以包括:将表面肌电信号S1进行叠加平均得到单通道肌电信号S2,并采用滑动时间窗的方式计算单通道肌电信号S2的平均幅值作为操作力度S。
可选的,根据手型手势和操作力度确定触控第一起始时间,可以包括:获取手型手势和操作力度后,通过查表确定触控第一起始时间。其中,表中预先存储有各手型手势所对应的操作力度的第一预设门限值,若根据表面肌电信号S1获取的操作力度大于第一预设门限值,则获取当前的系统时间作为触控第一起始时间。
可选的,处理设备根据第一信息,以及位置捕获设备发送的第二信息,生成触控指令,可以包括:处理设备确定触控第一起始时间与触控第二起始时间之间的时间间隔小于预设阈值,且手型手势对应的触控点数与第二信息包括的触控点数一致,则生成触控指令。触控指令可以包括肌电信号采集设 备的设备标识、手型手势和各触控点的坐标信息。可选的,触控指令还可以包括用户进行触控的操作力度。
需要说明的是,本发明实施例对肌电信号采集设备的形态不加以限定,可以是腕带、手表等可穿戴设备形态,也可以是多个可以采集表面肌电信号的电极。
本发明实施例提供了一种肌电信号采集设备,处理器采集多个通道的表面肌电信号S1,确定用户进行触控的手型手势以及触控第一起始时间;发送器向处理设备发送第一信息,以供处理设备根据该第一信息以及位置捕获设备发送的第二信息,生成触控指令并执行,触控指令包括肌电信号采集设备的设备标识、手型手势和各触控点的坐标信息。由于肌电信号采集设备对应的设备标识可以区分出不同用户以及同一用户左右手的触控操作,可以实现单用户双手或者多用户在同一区域同时进行触控操作,触控指令不会混淆。
图8为本发明触控处理系统实施例二的结构示意图,如图8所示,该触控处理系统可以包括:位置捕获设备205、处理设备203,以及至少一个肌电信号采集设备201。其中,位置捕获设备205可以采用任何可以获取触控操作的触控点数、触控起始时间和触控点坐标的现有设备;处理设备203可以采用图6设备实施例的结构,相对应地,可以执行图1方法实施例的技术方案;肌电信号采集设备201可以采用图7设备实施例的结构,相对应地,可以执行图2方法实施例的技术方案。
其中,肌电信号采集设备201和位置捕获设备205分别与处理设备203通信连接,可以采用有线、无线、蓝牙、wifi等方式进行通信。
可选的,位置捕获设备205可以包括:电容感应器2031、红外感应器2033、超声感应器2035。其中,电容感应器2031用于通过电容触屏获得触控操作的触控点数、触控起始时间和触控点坐标;红外感应器2033用于通过红外触控感应系统获得触控操作的触控点数、触控起始时间和触控点坐标;超声感应器2035用于通过超声触控感应系统获得触控操作的触控点数、触控起始时间和触控点坐标。
可选的,位置捕获设备205和处理设备203可以集成在一体,也可以分开独立设置。
可选的,触控处理系统还可以包括用户反馈设备207,用于显示触控指令的执行结果,例如是LED显示屏幕、投影显示设备、扬声器、触觉反馈设 备等。可选的,用户反馈设备207可以包括:显示设备2071、声音设备2073、触觉反馈设备2075。
需要说明的是,本发明实施例提供的触控处理系统,可以应用于传统的电子触控设备,例如触控手机、触控电脑等;也可以应用于教育、企业办公、娱乐、广告展示等多个领域,例如使用课桌桌面进行的美术课教学,将课桌桌面作为画纸,手指作为画笔,多名学生在画纸上共同完成一幅作品。
最后应说明的是:以上各实施例仅用以说明本发明的技术方案,而非对其限制;尽管参照前述各实施例对本发明进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分或者全部技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本发明各实施例技术方案的范围。

Claims (23)

  1. 一种触控交互的处理方法,其特征在于,包括:
    接收肌电信号采集装置发送的第一信息,以及位置捕获装置发送的第二信息;其中,所述第一信息包括所述肌电信号采集装置的设备标识、用户进行触控的手型手势,以及触控第一起始时间;所述第二信息包括所述用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息;所述触控第一起始时间为所述肌电信号采集装置识别出用户进行触控操作的起始时间,所述触控第二起始时间为所述位置捕获装置识别出用户进行触控操作的起始时间;
    若确定所述触控第一起始时间与所述触控第二起始时间之间的时间间隔小于预设阈值,且所述手型手势对应的触控点数与所述第二信息包括的所述触控点数一致,生成触控指令,所述触控指令包括所述肌电信号采集装置的设备标识、所述手型手势和所述各触控点的坐标信息;
    执行所述触控指令对应的交互操作。
  2. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    持续接收所述肌电信号采集装置发送的所述用户进行触控的手型手势,以及所述位置捕获装置发送的所述各触控点的坐标信息,对所述触控指令进行更新。
  3. 根据权利要求1所述的方法,其特征在于,所述第一信息中还包括用户进行触控的操作力度;所述触控指令中还包括所述操作力度。
  4. 根据权利要求3所述的方法,其特征在于,持续接收所述肌电信号采集装置发送的所述用户进行触控的手型手势和所述操作力度,以及所述位置捕获装置发送的所述各触控点的坐标信息,对所述触控指令进行更新。
  5. 根据权利要求4所述的方法,其特征在于,所述方法还包括:
    若确定所述操作力度小于第二预设门限值,删除所述触控指令。
  6. 一种触控交互的处理方法,其特征在于,包括:
    肌电信号采集装置周期性采集多个通道的表面肌电信号S1;
    所述肌电信号采集装置根据所述表面肌电信号S1的时频域特征,确定用户进行触控的手型手势;根据所述手型手势确定触控第一起始时间;所述 触控第一起始时间为所述肌电信号采集装置识别出用户进行触控操作的起始时间;
    所述肌电信号采集装置向处理装置发送第一信息,以供所述处理装置根据所述第一信息,以及位置捕获装置发送的第二信息,生成触控指令并执行对应的交互操作;其中,所述第一信息包括所述肌电信号采集装置的设备标识、所述手型手势,以及所述触控第一起始时间;所述第二信息包括所述用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息,所述触控第二起始时间为所述位置捕获装置识别出用户进行触控操作的起始时间。
  7. 根据权利要求6所述的方法,其特征在于,所述肌电信号采集装置根据所述表面肌电信号S1的时频域特征,确定用户进行触控的手型手势包括:
    所述肌电信号采集装置根据所述表面肌电信号S1的幅值和频率,按照手型手势的类型进行所述手型手势的判断。
  8. 根据权利要求6所述的方法,其特征在于,还包括:所述肌电信号采集装置根据所述表面肌电信号S1的时频域特征,确定用户进行触控的操作力度;则所述第一信息中还包括所述操作力度,所述根据所述手型手势确定触控第一起始时间为根据所述手型手势和所述操作力度确定所述触控第一起始时间。
  9. 根据权利要求8所述的方法,其特征在于,所述确定用户进行触控的操作力度包括:
    所述肌电信号采集装置将所述表面肌电信号S1进行叠加平均得到单通道肌电信号S2,并采用滑动时间窗的方式计算所述单通道肌电信号S2的平均幅值作为所述操作力度S。
  10. 根据权利要求8所述的方法,其特征在于,所述根据所述手型手势和所述操作力度确定所述触控第一起始时间包括:
    在获取所述手型手势和所述操作力度后,通过查表确定所述触控第一起始时间;其中,所述表中预先存储有各手型手势所对应的操作力度的第一预设门限值,若根据所述表面肌电信号S1获取的操作力度大于所述第一预设门限值,则获取当前的系统时间作为所述触控第一起始时间。
  11. 根据权利要求6至10任一所述的方法,其特征在于,所述处理装置根据所述第一信息,以及位置捕获装置发送的第二信息,生成触控指令包 括:所述处理装置确定所述触控第一起始时间与所述触控第二起始时间之间的时间间隔小于预设阈值,且所述手型手势对应的触控点数与所述第二信息包括的触控点数一致,则生成所述触控指令。
  12. 一种处理装置,其特征在于,包括:
    信息接收模块,用于接收肌电信号采集装置发送的第一信息,以及位置捕获装置发送的第二信息;其中,所述第一信息包括所述肌电信号采集装置的设备标识、用户进行触控的手型手势,以及触控第一起始时间;所述第二信息包括所述用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息;所述触控第一起始时间为所述肌电信号采集装置识别出用户进行触控操作的起始时间,所述触控第二起始时间为所述位置捕获装置识别出用户进行触控操作的起始时间;
    指令生成模块,用于若确定所述触控第一起始时间与所述触控第二起始时间之间的时间间隔小于预设阈值,且所述手型手势对应的触控点数与所述第二信息包括的所述触控点数一致,生成触控指令,所述触控指令包括所述肌电信号采集装置的设备标识、所述手型手势和所述各触控点的坐标信息;
    指令执行模块,用于执行所述触控指令对应的交互操作。
  13. 根据权利要求12所述的处理装置,其特征在于,所述指令生成模块还用于,根据所述信息接收模块持续接收到的所述肌电信号采集装置发送的所述用户进行触控的手型手势,以及所述位置捕获装置发送的各触控点的坐标信息,对所述触控指令进行更新。
  14. 根据权利要求12所述的处理装置,其特征在于,所述第一信息中还包括用户进行触控的操作力度;所述触控指令中还包括所述操作力度。
  15. 根据权利要求14所述的处理装置,其特征在于,所述指令生成模块还用于,根据所述信息接收模块持续接收到的所述肌电信号采集装置发送的所述用户进行触控的手型手势和所述操作力度,以及所述位置捕获装置发送的所述各触控点的坐标信息,对所述触控指令进行更新。
  16. 根据权利要求15所述的处理装置,其特征在于,所述指令生成模块还用于,若确定所述操作力度小于第二预设门限值,判断所述触控指令结束,并删除所述触控指令。
  17. 一种肌电信号采集装置,其特征在于,包括:
    采集模块,用于周期性采集多个通道的表面肌电信号S1;
    处理模块,用于根据所述表面肌电信号S1的时频域特征,确定用户进行触控的手型手势;根据所述手型手势确定触控第一起始时间;所述触控第一起始时间为所述肌电信号采集装置识别出用户进行触控操作的起始时间;
    发送模块,用于向处理装置发送第一信息,以供所述处理装置根据所述第一信息,以及位置捕获装置发送的第二信息,生成触控指令并执行对应的交互操作;其中,所述第一信息包括所述肌电信号采集装置的设备标识、所述手型手势,以及所述触控第一起始时间;所述第二信息包括所述用户进行触控的触控点数、触控第二起始时间和各触控点的坐标信息,所述触控第二起始时间为所述位置捕获装置识别出用户进行触控操作的起始时间。
  18. 根据权利要求17所述的采集装置,其特征在于,所述处理模块根据所述表面肌电信号S1的时频域特征,确定用户进行触控的手型手势包括:
    所述处理模块根据所述表面肌电信号S1的幅值和频率,按照手型手势的类型进行所述手型手势的判断。
  19. 根据权利要求17所述的采集装置,其特征在于,所述处理模块还用于,根据所述表面肌电信号S1的时频域特征,确定用户进行触控的操作力度;则所述第一信息中还包括所述操作力度,所述根据所述手型手势确定触控第一起始时间为根据所述手型手势和所述操作力度确定所述触控第一起始时间。
  20. 根据权利要求19所述的采集装置,其特征在于,所述处理模块确定用户进行触控的操作力度包括:
    所述处理模块将所述表面肌电信号S1进行叠加平均得到单通道肌电信号S2,并采用滑动时间窗的方式计算所述单通道肌电信号S2的平均幅值作为所述操作力度S。
  21. 根据权利要求19所述的采集装置,其特征在于,所述处理模块根据所述手型手势和所述操作力度确定所述触控第一起始时间包括:
    所述处理模块在获取所述手型手势和所述操作力度后,通过查表确定所述触控第一起始时间;其中,所述表中预先存储有各手型手势所对应的操作力度的第一预设门限值,若根据所述表面肌电信号S1获取的操作力度大于所述第一预设门限值,则获取当前的系统时间作为所述触控第一起始时间。
  22. 根据权利要求17至21任一所述的采集装置,其特征在于,所述处理装置根据所述第一信息,以及位置捕获装置发送的第二信息,生成触控指 令包括:所述处理装置确定所述触控第一起始时间与所述触控第二起始时间之间的时间间隔小于预设阈值,且所述手型手势对应的触控点数与所述第二信息包括的所述触控点数一致,则生成所述触控指令。
  23. 一种触控处理系统,其特征在于,包括位置捕获装置、如权利要求12-16任一所述的处理装置,以及至少一个如权利要求17-22任一所述的肌电信号采集装置;其中,所述肌电信号采集装置和所述位置捕获装置分别与所述处理装置通信连接。
PCT/CN2015/080243 2014-10-16 2015-05-29 触控交互的处理方法、装置和系统 WO2016058387A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
KR1020177012912A KR101875350B1 (ko) 2014-10-16 2015-05-29 터치 상호작용을 처리하기 위한 방법, 디바이스 및 시스템
BR112017007752-3A BR112017007752B1 (pt) 2014-10-16 2015-05-29 Método, dispositivo, e sistema de processamento de interação por toque
JP2017520458A JP6353982B2 (ja) 2014-10-16 2015-05-29 タッチインタラクション処理方法、装置及びシステム
EP15851451.3A EP3200051B1 (en) 2014-10-16 2015-05-29 Method, device and system for processing touch interaction
US15/486,452 US10372325B2 (en) 2014-10-16 2017-04-13 Electromyographic based touch interaction processing method, device, and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410549003.0A CN105573536B (zh) 2014-10-16 2014-10-16 触控交互的处理方法、装置和系统
CN201410549003.0 2014-10-16

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/486,452 Continuation US10372325B2 (en) 2014-10-16 2017-04-13 Electromyographic based touch interaction processing method, device, and system

Publications (1)

Publication Number Publication Date
WO2016058387A1 true WO2016058387A1 (zh) 2016-04-21

Family

ID=55746078

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/080243 WO2016058387A1 (zh) 2014-10-16 2015-05-29 触控交互的处理方法、装置和系统

Country Status (7)

Country Link
US (1) US10372325B2 (zh)
EP (1) EP3200051B1 (zh)
JP (1) JP6353982B2 (zh)
KR (1) KR101875350B1 (zh)
CN (1) CN105573536B (zh)
BR (1) BR112017007752B1 (zh)
WO (1) WO2016058387A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111338516A (zh) * 2020-02-26 2020-06-26 业成科技(成都)有限公司 手指触控的检测方法和装置、电子设备、存储介质

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10188309B2 (en) 2013-11-27 2019-01-29 North Inc. Systems, articles, and methods for electromyography sensors
US20150124566A1 (en) 2013-10-04 2015-05-07 Thalmic Labs Inc. Systems, articles and methods for wearable electronic devices employing contact sensors
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US10761718B2 (en) * 2015-09-25 2020-09-01 Ricoh Company, Ltd. Electronic whiteboard, method for image processing in electronic whiteboard, and recording medium containing computer program of electronic whiteboard
WO2019079757A1 (en) 2017-10-19 2019-04-25 Ctrl-Labs Corporation SYSTEMS AND METHODS FOR IDENTIFYING BIOLOGICAL STRUCTURES ASSOCIATED WITH NEUROMUSCULAR SOURCE SIGNALS
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US11961494B1 (en) 2019-03-29 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
US11150730B1 (en) 2019-04-30 2021-10-19 Facebook Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US11049053B2 (en) * 2018-03-29 2021-06-29 Ricoh Company, Ltd. Communication terminal, sharing system, communication method, and non-transitory recording medium storing program
CN112789577B (zh) 2018-09-20 2024-04-05 元平台技术有限公司 增强现实系统中的神经肌肉文本输入、书写和绘图
CN109542278B (zh) * 2018-09-27 2022-02-11 江苏特思达电子科技股份有限公司 触摸数据的处理方法、装置及触摸设备
US10908783B2 (en) * 2018-11-06 2021-02-02 Apple Inc. Devices, methods, and graphical user interfaces for interacting with user interface objects and providing feedback
EP3886693A4 (en) 2018-11-27 2022-06-08 Facebook Technologies, LLC. METHOD AND DEVICE FOR AUTOCALIBRATION OF A PORTABLE ELECTRODE SENSING SYSTEM
CN111783056B (zh) * 2020-07-06 2024-05-14 诺百爱(杭州)科技有限责任公司 一种基于肌电信号识别用户身份的方法、装置和电子设备
CN112506379A (zh) * 2020-12-21 2021-03-16 北京百度网讯科技有限公司 触控事件的处理方法、装置、设备以及存储介质
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
CN113126771A (zh) * 2021-05-07 2021-07-16 浙江柔灵科技有限公司 一种具有反馈信息功能的肌电手势识别系统
CN114642440B (zh) * 2022-05-23 2022-08-12 博睿康科技(常州)股份有限公司 获取刺激系统预设时长的方法、刺激系统及其调控方法
CN116243825B (zh) * 2023-05-06 2023-07-25 成都市芯璨科技有限公司 一种基于电容检测的触控检测芯片及装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102117150A (zh) * 2011-03-30 2011-07-06 汉王科技股份有限公司 字符后处理方法及系统
CN102184056A (zh) * 2011-04-26 2011-09-14 广东威创视讯科技股份有限公司 多触摸点识别方法及装置
CN102449573A (zh) * 2009-06-09 2012-05-09 索尼爱立信移动通讯有限公司 基于手指识别区分右手输入和左手输入
WO2013103344A1 (en) * 2012-01-05 2013-07-11 Sony Ericsson Mobile Communications Ab Adjusting coordinates of touch input

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07248873A (ja) * 1994-03-08 1995-09-26 Sharp Corp 筋電信号を用いた制御装置
EP1330777A1 (en) * 2000-10-27 2003-07-30 Elo Touchsystems, Inc. Touch confirming touchscreen utilizing plural touch sensors
JP4099117B2 (ja) * 2003-07-22 2008-06-11 シャープ株式会社 仮想キーボードシステム
US8441467B2 (en) 2006-08-03 2013-05-14 Perceptive Pixel Inc. Multi-touch sensing display through frustrated total internal reflection
US8022941B2 (en) 2006-10-12 2011-09-20 Disney Enterprises, Inc. Multi-user touch screen
US8581856B2 (en) * 2009-05-27 2013-11-12 Microsoft Corporation Touch sensitive display apparatus using sensor input
JP4988016B2 (ja) 2009-08-27 2012-08-01 韓國電子通信研究院 指の動き検出装置およびその方法
KR20110032640A (ko) 2009-09-23 2011-03-30 삼성전자주식회사 멀티 터치 인식 디스플레이 장치
US8692799B1 (en) 2011-07-05 2014-04-08 Cypress Semiconductor Corporation Single layer multi-touch capacitive sensor
US9134807B2 (en) 2012-03-02 2015-09-15 Microsoft Technology Licensing, Llc Pressure sensitive key normalization
US9201546B2 (en) 2012-03-09 2015-12-01 Elo Touch Solutions, Inc. Acoustic touch apparatus with multi-touch capability
US8866771B2 (en) 2012-04-18 2014-10-21 International Business Machines Corporation Multi-touch multi-user gestures on a multi-touch display
US9011232B2 (en) 2012-05-18 2015-04-21 Universal Entertainment Corporation Gaming machine and gaming method
JP5929572B2 (ja) * 2012-07-09 2016-06-08 コニカミノルタ株式会社 操作表示装置およびプログラム
US9223459B2 (en) 2013-01-25 2015-12-29 University Of Washington Through Its Center For Commercialization Using neural signals to drive touch screen devices

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102449573A (zh) * 2009-06-09 2012-05-09 索尼爱立信移动通讯有限公司 基于手指识别区分右手输入和左手输入
CN102117150A (zh) * 2011-03-30 2011-07-06 汉王科技股份有限公司 字符后处理方法及系统
CN102184056A (zh) * 2011-04-26 2011-09-14 广东威创视讯科技股份有限公司 多触摸点识别方法及装置
WO2013103344A1 (en) * 2012-01-05 2013-07-11 Sony Ericsson Mobile Communications Ab Adjusting coordinates of touch input

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3200051A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111338516A (zh) * 2020-02-26 2020-06-26 业成科技(成都)有限公司 手指触控的检测方法和装置、电子设备、存储介质
CN111338516B (zh) * 2020-02-26 2022-05-10 业成科技(成都)有限公司 手指触控的检测方法和装置、电子设备、存储介质

Also Published As

Publication number Publication date
JP6353982B2 (ja) 2018-07-04
KR101875350B1 (ko) 2018-07-05
JP2017534980A (ja) 2017-11-24
BR112017007752A2 (zh) 2018-02-06
BR112017007752B1 (pt) 2023-01-31
KR20170067873A (ko) 2017-06-16
EP3200051A4 (en) 2017-09-20
CN105573536B (zh) 2018-09-07
EP3200051B1 (en) 2018-12-12
CN105573536A (zh) 2016-05-11
US20170220245A1 (en) 2017-08-03
EP3200051A1 (en) 2017-08-02
US10372325B2 (en) 2019-08-06

Similar Documents

Publication Publication Date Title
WO2016058387A1 (zh) 触控交互的处理方法、装置和系统
US9978261B2 (en) Remote controller and information processing method and system
KR102170321B1 (ko) 파지된 물체를 이용한 모션을 인식하는 장치 및 방법, 시스템
US20150205400A1 (en) Grip Detection
US20180253163A1 (en) Change of active user of a stylus pen with a multi-user interactive display
TWI530867B (zh) 觸感回饋系統及其提供觸感回饋的方法
CN108958615A (zh) 一种显示控制方法、终端及计算机可读存储介质
CN202748770U (zh) 一种自适应调整屏幕触控输入范围的移动终端
US11409371B2 (en) Systems and methods for gesture-based control
CN105117003A (zh) 智能穿戴设备及其工作方法
WO2012152205A1 (zh) 一种人机交互设备
US10754446B2 (en) Information processing apparatus and information processing method
CN103809866A (zh) 一种操作模式切换方法和电子设备
CN103823630A (zh) 一种虚拟鼠标
CN107273009A (zh) 一种移动终端快速截屏的方法及系统
CN102662511A (zh) 通过触摸屏进行控制操作的方法及终端
TW201430627A (zh) 觸控筆操作識別系統及方法
CN104461365A (zh) 终端的触控方法和装置
US20160085311A1 (en) Control unit and method of interacting with a graphical user interface
JP2015053034A (ja) 入力装置
CN111176421B (zh) 可穿戴设备及其操控方法、操控系统和存储装置
CN108205390A (zh) 一种终端操作的方法和终端
WO2016201760A1 (zh) 一种触摸显示装置中识别手势的方法和系统
CN207965847U (zh) 一种触控控制设备
Odagaki et al. Touch interface for sensing fingertip force in mobile device using electromyogram

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15851451

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017520458

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112017007752

Country of ref document: BR

REEP Request for entry into the european phase

Ref document number: 2015851451

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20177012912

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 112017007752

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20170413