CN112214109B - Myoelectricity and gesture data-based composite control method, device and system - Google Patents


Info

Publication number
CN112214109B
Authority: CN (China)
Prior art keywords: signal, target, determining, value, myoelectricity
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202011063852.7A
Other languages: Chinese (zh)
Other versions: CN112214109A (en)
Inventors: 于文龙, 张元康, 莫博康, 黄天展, 翁恭伟, 梁旭, 刘永建, 黄品高, 王辉, 高超
Current Assignee (the listed assignees may be inaccurate; Google has not performed a legal analysis): Shenzhen Runyi Taiyi Technology Co ltd
Original Assignee: Shenzhen Runyi Taiyi Technology Co ltd
Application filed by Shenzhen Runyi Taiyi Technology Co ltd
Priority claimed from application CN202011063852.7A
Publication of CN112214109A
Application granted
Publication of CN112214109B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/01 Indexing scheme relating to G06F 3/01
    • G06F 2203/011 Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Dermatology (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

An embodiment of the invention provides a composite control method, device and system based on myoelectricity and gesture data, applied to a composite control system based on myoelectricity and gesture data that comprises at least one myoelectricity collector and an inertial sensor. The method acquires the electromyographic signal collected by the at least one myoelectricity collector; determines a target control mode corresponding to the electromyographic signal; acquires the posture data collected by the inertial sensor; and executes a target control operation in the target control mode according to the posture data. Operation control can thus be performed cooperatively on the basis of myoelectricity and posture recognition, which improves the flexibility of the composite control system based on myoelectricity and gesture data.

Description

Myoelectricity and gesture data-based composite control method, device and system
Technical Field
The invention relates to the field of operation control, in particular to a composite control method, device and system based on myoelectricity and gesture data.
Background
In a control system driven by myoelectricity alone, the electromyographic signal generated by a movement intention is fed to a pattern recognition algorithm, the control mode is recognized from the signal, and the operation of the system is then controlled. However, the recognition result usually represents only a scalar (a magnitude with no direction), so one recognition result can control only one fixed operation of the system at a time; cooperative, directional operation of a complex system is therefore difficult to achieve, and each recognition result carries a considerable delay, which hinders continuous operation.
Disclosure of Invention
The embodiment of the invention provides a composite control method, device and system based on myoelectricity and gesture data, which can perform operation control cooperatively on the basis of myoelectricity and posture recognition, thereby improving the flexibility of the composite control system based on myoelectricity and gesture data.
The first aspect of the embodiment of the invention provides a composite control method based on myoelectricity and gesture data, which is applied to a composite control system based on myoelectricity and gesture data, wherein the composite control system based on myoelectricity and gesture data comprises at least one myoelectricity collector and an inertial sensor, and the method comprises the following steps:
acquiring an electromyographic signal collected by the at least one myoelectricity collector;
determining a target control mode corresponding to the electromyographic signal;
acquiring posture data collected by the inertial sensor;
and executing a target control operation in the target control mode according to the posture data.
Optionally, the determining the target control mode corresponding to the electromyographic signal includes:
processing the electromyographic signals to obtain target signal processing data;
and determining a target control mode corresponding to the target signal processing data according to a mapping relation between the preset signal processing data and the control mode.
Optionally, the processing the electromyographic signal to obtain target signal processing data includes:
performing signal processing on the electromyographic signal to obtain an electromyographic signal curve;
determining a target segment with a signal peak value in the electromyographic signal curve, wherein the starting position of the target segment corresponds to a first signal valley value, and the ending position of the target segment corresponds to a second signal valley value;
determining a target signal peak value, a first signal point with rising speed larger than a first value and a second signal point with falling speed larger than a second value in the target segment, wherein the first signal point corresponds to a first time point; the second signal point corresponds to a second time point;
determining a first signal strength value corresponding to the first signal point and a second signal strength value corresponding to the second signal point, the first signal strength value being greater than or equal to the first signal valley, the second signal strength value being greater than or equal to the second signal valley;
determining a first time period between the first time point and the second time point;
determining a first signal strength difference between the first signal strength value and the second signal strength value;
determining a first offset value corresponding to the target signal peak value;
and determining a first signal range according to the first offset value and the first signal strength difference value, and taking the first duration and the first signal range as the target signal processing data.
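The sequence of determining steps above can be sketched in Python. This is a minimal illustration rather than the patented implementation: the input is assumed to be a rectified, smoothed single-burst EMG envelope, the "steepest rise/fall" interpretation of the first and second signal points follows the detailed description, and the 10% offset rule and parameter names are assumptions.

```python
import numpy as np

def signal_processing_data(envelope, fs, rise_thresh, fall_thresh):
    """Derive (first duration, first signal range) from one EMG burst.

    envelope    -- 1-D rectified/smoothed EMG curve (one target segment)
    fs          -- sampling rate in Hz
    rise_thresh -- the 'first value' for the rising speed
    fall_thresh -- the 'second value' for the falling speed
    """
    d = np.diff(envelope)
    peak_idx = int(np.argmax(envelope))           # target signal peak
    if peak_idx == 0 or peak_idx >= len(d):
        return None
    # first signal point: steepest rise before the peak
    t1 = int(np.argmax(d[:peak_idx]))
    # second signal point: steepest fall after the peak
    t2 = peak_idx + int(np.argmin(d[peak_idx:])) + 1
    if d[t1] <= rise_thresh or d[t2 - 1] >= -fall_thresh:
        return None                               # no qualifying points
    first_duration = (t2 - t1) / fs               # first time period (s)
    strength_diff = abs(float(envelope[t1]) - float(envelope[t2]))
    offset = 0.1 * float(envelope[peak_idx])      # assumed offset rule
    first_range = (strength_diff - offset, strength_diff + offset)
    return first_duration, first_range
```

The pair `(first_duration, first_range)` then serves as the target signal processing data to match against the presets.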
Optionally, the determining, according to a mapping relationship between preset signal processing data and a control mode, a target control mode corresponding to the target signal processing data includes:
matching the first duration with the durations in the mapping relation between the preset signal processing data and the control modes, and matching the first signal range with the signal ranges in the mapping relation, to obtain a preset duration successfully matched with the first duration and a preset signal range successfully matched with the first signal range;
and determining a target control mode corresponding to the preset duration and the preset signal range in the mapping relation.
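The matching step above can be sketched as follows. The tolerance on the duration and the containment rule for signal ranges are assumptions, since the text does not specify numerically what "successfully matched" means.

```python
def match_preset(first_duration, first_range, presets, tol=0.05):
    """presets: list of (preset_duration, (range_low, range_high)) pairs.

    A duration matches within +/- tol seconds; a signal range matches
    if it lies inside the preset range (both criteria are assumptions).
    Returns the matched (preset_duration, preset_range), or None.
    """
    for preset_duration, preset_range in presets:
        duration_ok = abs(first_duration - preset_duration) <= tol
        range_ok = (preset_range[0] <= first_range[0]
                    and first_range[1] <= preset_range[1])
        if duration_ok and range_ok:
            return preset_duration, preset_range
    return None
```

The returned preset pair is then used as the key into the mapping relation to obtain the target control mode.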
Optionally, the method further comprises:
if the electromyographic signal curve comprises a plurality of segments, each containing a signal peak value, determining the maximum peak value among the signal peak values corresponding to the segments;
and determining, among the segments, the target segment corresponding to the maximum peak value.
Optionally, the composite control system based on myoelectricity and gesture data further comprises a controlled device, and the controlled device comprises at least two of the following components: a chassis, a cradle head, a mechanical arm and a manipulator;
the target control mode includes any one of the following: emergency stop mode, chassis translation mode, chassis rotation and cradle head rotation mode, mechanical arm mode, and manipulator mode.
Optionally, the posture data includes acceleration and angular velocity; the executing of the target control operation in the target control mode according to the posture data includes:
determining a target component that performs the target control operation in the target control mode;
calculating control parameters of the target component according to the posture data;
and controlling the target component to execute the target control operation according to the control parameters.
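The three steps above can be sketched as below. The mode and component names, the proportional mapping from acceleration and angular velocity to velocity commands, and the parameter keys are all illustrative assumptions, not the patent's actual control law.

```python
from dataclasses import dataclass

@dataclass
class PostureData:
    ax: float = 0.0   # acceleration components
    ay: float = 0.0
    az: float = 0.0
    wx: float = 0.0   # angular velocity components
    wy: float = 0.0
    wz: float = 0.0

# assumed mapping from control mode to the component that executes it
MODE_COMPONENT = {
    "chassis_translation": "chassis",
    "chassis_and_cradle_head_rotation": "cradle_head",
    "mechanical_arm": "mechanical_arm",
    "manipulator": "manipulator",
}

def plan_control(mode, posture, gain=0.5):
    """Determine the target component and compute its control parameters."""
    component = MODE_COMPONENT[mode]
    if mode == "chassis_translation":
        # translate in the direction indicated by the measured acceleration
        params = {"vx": gain * posture.ax, "vy": gain * posture.ay}
    else:
        # rotate/move at a rate proportional to the angular velocity
        params = {"rate": gain * posture.wz}
    return component, params
```

A caller would pass the `(component, params)` pair to whatever actuator interface the controlled device exposes.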
A second aspect of an embodiment of the present invention provides a composite control device based on myoelectricity and gesture data, applied to a composite control system based on myoelectricity and gesture data, where the composite control system based on myoelectricity and gesture data includes at least one myoelectricity collector and an inertial sensor, the device includes:
the acquisition unit is used for acquiring the electromyographic signal collected by the at least one myoelectricity collector;
the determining unit is used for determining a target control mode corresponding to the electromyographic signal;
the acquisition unit is also used for acquiring the posture data collected by the inertial sensor;
and the execution unit is used for executing the target control operation in the target control mode according to the posture data.
A third aspect of the embodiments of the present invention provides a composite control system based on myoelectricity and gesture data, the composite control system based on myoelectricity and gesture data including at least one myoelectricity collector, an inertial sensor, a controlled device, and a controller, the at least one myoelectricity collector and the inertial sensor being connected to the controller, wherein,
the myoelectricity collector is used for collecting myoelectricity signals of a human body;
the controller is used for determining a target control mode corresponding to the electromyographic signal;
the inertial sensor is used for collecting posture data of a human body;
the controller is further used for controlling the controlled device to execute the target control operation in the target control mode according to the posture data.
A fourth aspect of the embodiments of the present invention provides a computer readable storage medium for storing a computer program for execution by a processor to implement some or all of the steps described in the method according to the first aspect of the embodiments of the present invention.
A fifth aspect of the embodiments of the present invention provides a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described in the method of the first aspect of the embodiments of the present invention.
The embodiment of the invention has at least the following beneficial effects:
It can be seen that, with the myoelectricity and gesture data-based composite control method, device and system of the embodiment of the invention, applied to a composite control system comprising at least one myoelectricity collector and an inertial sensor, the electromyographic signal collected by the at least one myoelectricity collector is acquired; a target control mode corresponding to the electromyographic signal is determined; the posture data collected by the inertial sensor is acquired; and the target control operation in the target control mode is executed according to the posture data. Operation control can therefore be performed cooperatively on the basis of myoelectricity and posture recognition, which improves the flexibility of the composite control system based on myoelectricity and gesture data.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of a composite control system based on myoelectricity and gesture data according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a composite control method based on myoelectricity and gesture data according to an embodiment of the present invention;
fig. 3 is a schematic flow chart of another composite control method based on myoelectricity and gesture data according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a composite control device based on myoelectricity and gesture data according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terms first, second and the like in the description and in the claims and in the above-described figures are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the described embodiments of the invention may be combined with other embodiments.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a composite control system based on myoelectricity and gesture data according to an embodiment of the present invention, where the composite control system based on myoelectricity and gesture data includes at least one myoelectricity collector, an inertial sensor, a controlled device, and a controller, where the at least one myoelectricity collector and the inertial sensor are both connected to the controller,
The myoelectricity collector is used for collecting myoelectricity signals of a human body;
the controller is used for determining a target control mode corresponding to the electromyographic signal;
the inertial sensor is used for collecting posture data of a human body;
the controller is further used for controlling the controlled device to execute the target control operation in the target control mode according to the posture data.
The myoelectricity collector comprises an electrode array, at least two differential signal acquisition circuits, a control module, a transmission module and a power module. The electrode array contacts the human body to collect the electromyographic signal; the differential signal acquisition circuits preprocess the collected signal to obtain a processed electromyographic signal, where the preprocessing may comprise at least one of: signal amplification, analog-to-digital conversion, low-pass filtering and electromagnetic interference filtering. The transmission module transmits the preprocessed electromyographic signal to the controller.
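As an illustration of the amplification and low-pass filtering steps, a single-pole IIR filter in plain Python is shown below; the cut-off frequency, gain and function names are assumptions, not the acquisition circuit the patent describes.

```python
import math

def lowpass(samples, fs, fc):
    """Single-pole IIR low-pass: y[n] = y[n-1] + a * (x[n] - y[n-1])."""
    dt = 1.0 / fs
    rc = 1.0 / (2.0 * math.pi * fc)
    a = dt / (rc + dt)                 # smoothing coefficient in (0, 1)
    y, out = 0.0, []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out

def preprocess(raw, fs, gain=1000.0, fc=450.0):
    """Amplify one channel of raw EMG, then low-pass filter it."""
    return lowpass([gain * x for x in raw], fs, fc)
```

In a real collector these stages are analog circuitry followed by analog-to-digital conversion; the sketch only mirrors their effect on sampled data.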
The inertial sensor can be used for collecting posture data, such as the acceleration and angular velocity generated when the user's limbs move.
The controller controls the controlled device to execute the target control operation in the target control mode according to the posture data. After the target control mode has been identified from the electromyographic signal, the controlled device is therefore controlled directionally, according to the posture data, to execute the target control operation in that mode, which makes the operation control more intelligent.
Optionally, in the aspect of determining the target control mode corresponding to the electromyographic signal, the controller is specifically configured to:
processing the electromyographic signals to obtain target signal processing data;
and determining a target control mode corresponding to the target signal processing data according to a mapping relation between the preset signal processing data and the control mode.
Optionally, in the aspect of processing the electromyographic signal to obtain target signal processing data, the controller is specifically configured to:
performing signal processing on the electromyographic signal to obtain an electromyographic signal curve;
determining a target segment with a signal peak value in the electromyographic signal curve, wherein the starting position of the target segment corresponds to a first signal valley value, and the ending position of the target segment corresponds to a second signal valley value;
determining a target signal peak value, a first signal point with rising speed larger than a first value and a second signal point with falling speed larger than a second value in the target segment, wherein the first signal point corresponds to a first time point; the second signal point corresponds to a second time point;
determining a first signal strength value corresponding to the first signal point and a second signal strength value corresponding to the second signal point, the first signal strength value being greater than or equal to the first signal valley, the second signal strength value being greater than or equal to the second signal valley;
determining a first time period between the first time point and the second time point;
determining a first signal strength difference between the first signal strength value and the second signal strength value;
determining a first offset value corresponding to the target signal peak value;
and determining a first signal range according to the first offset value and the first signal strength difference value, and taking the first duration and the first signal range as the target signal processing data.
Optionally, in the aspect of determining a target control mode corresponding to the target signal processing data according to a mapping relationship between preset signal processing data and control modes, the controller is specifically configured to:
matching the first duration with the durations in the mapping relation between the preset signal processing data and the control modes, and matching the first signal range with the signal ranges in the mapping relation, to obtain a preset duration successfully matched with the first duration and a preset signal range successfully matched with the first signal range;
and determining a target control mode corresponding to the preset duration and the preset signal range in the mapping relation.
Optionally, the controlled device includes at least two of the following components: a chassis, a cradle head, a mechanical arm and a manipulator;
the target control mode includes any one of the following: emergency stop mode, chassis translation mode, chassis rotation and cradle head rotation mode, mechanical arm mode, and manipulator mode.
Optionally, the controller may be disposed on the controlled device, or may be independently disposed outside the controlled device.
Optionally, the controller is further configured to:
if the electromyographic signal curve comprises a plurality of segments, each containing a signal peak value, determining the maximum peak value among the signal peak values corresponding to the segments;
and determining, among the segments, the target segment corresponding to the maximum peak value.
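The segment-selection rule above reduces to taking the segment whose peak is largest. A short Python sketch, with each segment assumed to be represented as a list of samples:

```python
def select_target_segment(segments):
    """Given several curve segments (lists of signal samples), return
    the one containing the maximum of all the signal peak values."""
    return max(segments, key=max)
```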
Optionally, the posture data includes acceleration and angular velocity; in the aspect of executing the target control operation in the target control mode according to the posture data, the controller is specifically configured to:
determining a target component that performs the target control operation in the target control mode;
calculating control parameters of the target component according to the posture data;
and controlling the target component to execute the target control operation according to the control parameter.
The controlled device may comprise the following components: a chassis, a cradle head, a mechanical arm and a manipulator. In the emergency stop mode, the controlled device can be controlled to stop immediately; in the chassis translation mode, the controlled device can be controlled to translate using the control parameters; in the chassis rotation and cradle head rotation mode, the rotation of the chassis and the rotation of the cradle head can be controlled separately using the control parameters; in the mechanical arm mode, the mechanical arm can be controlled to move up and down and left and right using the control parameters; and in the manipulator mode, the manipulator can be controlled to grasp using the control parameters.
It can be seen that, in the composite control system based on myoelectricity and gesture data of the embodiment of the invention, the system comprises at least one myoelectricity collector, an inertial sensor, a controlled device and a controller, with the at least one myoelectricity collector and the inertial sensor both connected to the controller. The myoelectricity collector collects the electromyographic signal of the human body; the controller determines the target control mode corresponding to the electromyographic signal; the inertial sensor collects the posture data of the human body; and the controller further controls the controlled device to execute the target control operation in the target control mode according to the posture data. Operation control can therefore be performed cooperatively on the basis of myoelectricity and posture recognition, improving the flexibility of the composite control system based on myoelectricity and gesture data.
Referring to fig. 2, fig. 2 is a flow chart of a composite control method based on myoelectricity and gesture data according to an embodiment of the present invention. As shown in fig. 2, the myoelectricity and gesture data-based composite control method provided by the embodiment of the invention is applied to the myoelectricity and gesture data-based composite control system shown in fig. 1, wherein the myoelectricity and gesture data-based composite control system comprises at least one myoelectricity collector and an inertial sensor, and the method can comprise the following steps:
201. acquiring an electromyographic signal acquired by the at least one electromyographic acquisition device;
Here, the at least one myoelectricity collector collects the electromyographic signal of the human body.
Optionally, there are a plurality of myoelectricity collectors distributed at different positions on the human body. Electromyographic signals at different positions can be collected by the collectors at those positions, so that the user's movement intention can be analyzed more accurately from the signals reflected by different parts of the body.
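One simple way to combine several collectors, offered only as an assumed illustration (the patent does not specify a fusion rule), is to rectify each channel and average sample by sample:

```python
def fuse_channels(channels):
    """channels: equal-length sample lists, one per myoelectricity
    collector. Rectify each channel and average them into one envelope."""
    n = len(channels)
    return [sum(abs(ch[i]) for ch in channels) / n
            for i in range(len(channels[0]))]
```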
202. Determining a target control mode corresponding to the electromyographic signal;
In a specific implementation, for different controlled devices, a plurality of control modes can be set in which different components of the controlled device realize various functions. When the user wants to select different control modes, the corresponding electromyographic signals differ, so a mapping relation between electromyographic signals and control modes can be preset, and the target control mode corresponding to the electromyographic signal can be determined according to that mapping relation.
Here, the target control mode includes any one of the following: emergency stop mode, chassis translation mode, chassis rotation and cradle head rotation mode, mechanical arm mode, and manipulator mode.
Optionally, in step 202, the determining the target control mode corresponding to the electromyographic signal includes:
21. processing the electromyographic signals to obtain target signal processing data;
22. and determining a target control mode corresponding to the target signal processing data according to a mapping relation between the preset signal processing data and the control mode.
In a specific implementation, a plurality of preset signal processing data can be obtained in advance, together with the control mode corresponding to each of them, giving a plurality of control modes; the mapping relation between signal processing data and control modes is then established. Table 1 below gives an example of such a mapping relation provided by an embodiment of the present invention:
Signal processing data      Control mode
Signal processing data 1    Control mode 1
Signal processing data 2    Control mode 2
Signal processing data 3    Control mode 3
...                         ...
Signal processing data n    Control mode n
TABLE 1
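Table 1 can be encoded as a dictionary lookup. The bucket names, split thresholds and mode labels below are placeholders for the unspecified presets, purely to show the shape of the mapping:

```python
# assumed quantization of the signal processing data into buckets
MODE_TABLE = {
    ("short", "narrow"): "emergency_stop",
    ("short", "wide"):   "chassis_translation",
    ("long",  "narrow"): "chassis_and_cradle_head_rotation",
    ("long",  "wide"):   "mechanical_arm",
}

def lookup_mode(first_duration, first_range, t_split=0.5, s_split=0.3):
    """Quantize (duration, signal range) and look up the control mode."""
    d_key = "short" if first_duration < t_split else "long"
    width = first_range[1] - first_range[0]
    s_key = "narrow" if width < s_split else "wide"
    return MODE_TABLE[(d_key, s_key)]
```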
Optionally, the electromyographic signal may be subjected to signal processing to obtain an electromyographic signal curve, and then, a target control mode corresponding to the electromyographic signal curve is determined according to a mapping relationship between a preset signal curve and the control mode.
Here, a plurality of preset signal curves can be obtained in advance, together with the control mode corresponding to each of them, giving a plurality of control modes; the mapping relation between signal curves and control modes is then established.
Optionally, in step 21, the processing the myoelectric signal to obtain target signal processing data includes:
2101. performing signal processing on the electromyographic signal to obtain an electromyographic signal curve;
2102. determining a target segment with a signal peak value in the electromyographic signal curve, wherein the starting position of the target segment corresponds to a first signal valley value, and the ending position of the target segment corresponds to a second signal valley value;
2103. determining a target signal peak value, a first signal point with rising speed larger than a first value and a second signal point with falling speed larger than a second value in the target segment, wherein the first signal point corresponds to a first time point; the second signal point corresponds to a second time point;
2104. determining a first signal strength value corresponding to the first signal point and a second signal strength value corresponding to the second signal point, the first signal strength value being greater than or equal to the first signal valley, the second signal strength value being greater than or equal to the second signal valley;
2105. determining a first time period between the first time point and the second time point;
2106. determining a first signal strength difference between the first signal strength value and the second signal strength value;
2107. determining a first offset value corresponding to the target signal peak value;
2108. determining a first signal range according to the first offset value and the first signal strength difference value, and taking the first duration and the first signal range as the target signal processing data.
In a specific implementation, different movement intentions of the user produce different electromyographic signal curves, in both the response time and the signal intensity of the electromyographic signal, so the electromyographic signal curve can be analyzed to identify the user's movement intention. Specifically, a target segment with a signal peak value in the electromyographic signal curve may be determined, and then the target signal peak value in the target segment, a first signal point with a rising speed greater than a first value, and a second signal point with a falling speed greater than a second value may be determined. Generally, on one side of the target signal peak value the signal intensity of the electromyographic signal is in a rising trend, and the first signal point, where the electromyographic signal curve changes most steeply, can be determined on that side; on the other side of the target signal peak value the signal intensity is in a falling trend, and the second signal point, where the curve changes most steeply, can be determined on that side. The first signal point corresponds to a first time point, and the second signal point corresponds to a second time point. A first time length may then be determined from the first time point and the second time point; it represents the duration of the change in the electromyographic signal while the human body generates the motion intention. Further, a first signal strength difference value between the first signal strength value and the second signal strength value may be determined, a first offset value corresponding to the target signal peak value may be determined, and a first signal range may be determined according to the first offset value and the first signal strength difference value. The first signal range represents the range over which the electromyographic signal of the human body changes while the motion intention is generated. Since both the duration and the signal intensity change differ between movement intentions, the first time length and the first signal range can serve as the target signal processing data.
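The per-segment computation of steps 2101-2108 can be sketched as follows. This is an illustrative assumption-laden sketch, not the patented implementation: the threshold values, the use of simple finite differences for the rising/falling speed, and the interpretation of the first offset value as a symmetric margin around the strength difference are all choices made for illustration.

```python
def extract_target_features(curve, times, rise_threshold, fall_threshold, offset):
    """curve: sampled EMG signal strengths of one target segment;
    times: matching timestamps. Returns (first duration, first signal range)."""
    # Step 2103: locate the target signal peak value inside the segment.
    peak_index = max(range(len(curve)), key=lambda i: curve[i])

    # First signal point: steepest rise on the rising side of the peak whose
    # rising speed exceeds the first value (rise_threshold).
    first_point = None
    for i in range(1, peak_index + 1):
        speed = (curve[i] - curve[i - 1]) / (times[i] - times[i - 1])
        if speed > rise_threshold and (first_point is None or speed > first_point[1]):
            first_point = (i, speed)

    # Second signal point: steepest fall on the falling side of the peak whose
    # falling speed exceeds the second value (fall_threshold).
    second_point = None
    for i in range(peak_index + 1, len(curve)):
        speed = (curve[i - 1] - curve[i]) / (times[i] - times[i - 1])
        if speed > fall_threshold and (second_point is None or speed > second_point[1]):
            second_point = (i, speed)

    if first_point is None or second_point is None:
        return None  # no signal point exceeded its threshold

    i1, i2 = first_point[0], second_point[0]
    first_duration = times[i2] - times[i1]              # step 2105
    strength_diff = abs(curve[i1] - curve[i2])          # step 2106
    # Step 2108 (assumed form): offset applied around the strength difference.
    first_signal_range = (strength_diff - offset, strength_diff + offset)
    return first_duration, first_signal_range
```

The pair returned here is exactly the target signal processing data that step 22 matches against the preset mapping.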
Optionally, in the step 22, the determining, according to a mapping relationship between preset signal processing data and a control mode, a target control mode corresponding to the target signal processing data includes:
matching the first time length against the time lengths in the mapping relationship between preset signal processing data and control modes, and matching the first signal range against the signal ranges in the same mapping relationship, so as to obtain a preset duration successfully matched with the first time length and a preset signal range successfully matched with the first signal range;
and determining a target control mode corresponding to the preset duration and the preset signal range in the mapping relation.
In a specific implementation, a mapping relationship between signal processing data and a control mode may be preset, as shown in the following table 2, which is an example of a mapping relationship between signal processing data and a control mode provided in an embodiment of the present invention:
(Table 2 is reproduced as an image in the original publication; it pairs preset durations and preset signal ranges with their corresponding control modes.)
TABLE 2
The first time length can be matched in turn against the plurality of preset time lengths in the mapping relationship to obtain the preset duration successfully matched with the first time length, and the first signal range can be matched in turn against the plurality of signal ranges in the mapping relationship to obtain the preset signal range successfully matched with the first signal range; the target control mode corresponding to that preset duration and preset signal range can then be determined, so that the target control mode is determined more accurately on the basis of the first time length and the first signal range.
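A minimal sketch of this Table-2-style lookup is shown below. The concrete interval boundaries and the containment test used for a "successful match" are illustrative assumptions; the patent does not fix these values.

```python
# Each preset entry pairs a duration interval and a signal-strength interval
# with a control mode. All numbers here are invented for illustration.
PRESET_MAPPING = [
    # (duration_low, duration_high), (range_low, range_high), control mode
    ((0.0, 0.3), (0.0, 20.0), "emergency stop mode"),
    ((0.3, 0.8), (20.0, 50.0), "chassis translation mode"),
    ((0.8, 1.5), (50.0, 90.0), "mechanical arm mode"),
]

def match_control_mode(first_duration, first_signal_range):
    """Match the first duration and first signal range against the mapping."""
    lo, hi = first_signal_range
    for (d_lo, d_hi), (r_lo, r_hi), mode in PRESET_MAPPING:
        duration_ok = d_lo <= first_duration < d_hi   # duration matched
        range_ok = r_lo <= lo and hi <= r_hi          # signal range contained
        if duration_ok and range_ok:
            return mode                               # target control mode
    return None                                       # no successful match
```

For example, a 0.5 s duration with a signal range of (25.0, 40.0) would resolve to the chassis translation mode under these invented intervals.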
203. Acquiring attitude data acquired by the inertial sensor;
the gesture data may include acceleration, angular velocity, etc. of the human body entity.
In particular embodiments, the user may wear the inertial sensor on a part of the body used to control the controlled device, for example an arm, wrist, finger, or waist, which is not limited herein.
204. And executing target control operation in the target control mode according to the gesture data.
Specifically, Euler angles may be calculated from attitude data such as the acceleration and angular velocity, and the controlled device may then be controlled, according to the Euler angles, to execute the target control operation in the target control mode.
The Euler angles may include a pitch angle, a roll angle, and a yaw angle.
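One common way to recover these Euler angles from a single accelerometer/gyroscope sample is sketched below; this is a generic IMU attitude estimate under the assumption that the accelerometer measures only gravity at rest, not necessarily the attitude calculation used in the embodiment.

```python
import math

def euler_from_imu(ax, ay, az, gyro_z, yaw_prev, dt):
    """Return (pitch, roll, yaw) in radians from one IMU sample.
    ax, ay, az: accelerometer axes (m/s^2); gyro_z: yaw rate (rad/s)."""
    # Pitch and roll from the direction of gravity in the sensor frame.
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))  # rotation about Y
    roll = math.atan2(ay, az)                              # rotation about X
    # Yaw is unobservable from gravity; integrate the gyroscope rate instead.
    yaw = yaw_prev + gyro_z * dt
    return pitch, roll, yaw
```

With the sensor level and at rest (ax = ay = 0, az = g), pitch and roll come out as zero and yaw simply accumulates the measured rate.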
Optionally, the controlled device of the composite control system based on myoelectricity and gesture data comprises at least two of the following components: a chassis, a cradle head, a mechanical arm, and a manipulator; the target control mode includes any one of the following: an emergency stop mode, a chassis translation mode, a chassis rotation and cradle head rotation mode, a mechanical arm mode, and a manipulator mode.
Optionally, the performing the target control operation in the target control mode according to the gesture data includes:
Determining a target component that performs the target control operation in the target control mode;
calculating control parameters of the target component according to the attitude data;
and controlling the target component to execute the target control operation according to the control parameter.
Specifically, an emergency stop mode, a chassis translation mode, a chassis rotation and cradle head rotation mode, a mechanical arm mode, and a manipulator mode can be preset for the controlled device. Thus, in the emergency stop mode, the controlled device can be controlled to stop immediately; in the chassis translation mode, the controlled device can be controlled to translate using the control parameters; in the chassis rotation and cradle head rotation mode, the rotation of the controlled device and the rotation of the cradle head can be controlled separately using the control parameters; in the mechanical arm mode, the mechanical arm can be controlled to move up and down and left and right using the control parameters; and in the manipulator mode, the manipulator can be controlled to grasp using the control parameters.
Wherein the control parameters may include at least one of: the chassis translation distance, chassis translation direction, chassis rotation angle, cradle head rotation direction, cradle head rotation angle, mechanical arm movement direction, mechanical arm movement amplitude, and the like, which are not limited herein.
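The mode-to-component dispatch described above can be sketched as follows. The component names, parameter keys, and the specific mapping from pitch/roll to each parameter are assumptions made for illustration; the embodiment does not prescribe them.

```python
# Assumed mapping from target control mode to the target component.
MODE_TO_COMPONENT = {
    "emergency stop mode": "all",
    "chassis translation mode": "chassis",
    "chassis rotation and cradle head rotation mode": "chassis+cradle head",
    "mechanical arm mode": "mechanical arm",
    "manipulator mode": "manipulator",
}

def build_control_command(mode, pitch, roll):
    """Turn the target control mode plus attitude angles into a command."""
    component = MODE_TO_COMPONENT[mode]
    if mode == "emergency stop mode":
        return component, {"action": "stop"}  # stop immediately
    if mode == "chassis translation mode":
        # Assumed convention: pitch drives forward/backward, roll left/right.
        return component, {"translate_x": pitch, "translate_y": roll}
    if mode == "chassis rotation and cradle head rotation mode":
        return component, {"chassis_angle": roll, "cradle_head_angle": pitch}
    if mode == "mechanical arm mode":
        return component, {"up_down": pitch, "left_right": roll}
    return component, {"grasp": pitch > 0}    # manipulator mode
```

A caller would feed the Euler angles computed from the inertial sensor into this function and forward the returned command to the controlled device.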
Optionally, in order to ensure the control stability and safety of the composite control system based on myoelectricity and posture data, the state of the control mode may be indicated using indicator lights, vibration, or the like.
It can be seen that, in the embodiment of the present invention, the myoelectric signal acquired by the at least one myoelectricity collector is acquired; a target control mode corresponding to the electromyographic signal is determined; attitude data acquired by the inertial sensor is acquired; and the target control operation in the target control mode is executed according to the gesture data, so that operation control can be performed based on the cooperation of myoelectricity and gesture recognition, and the flexibility of the composite control system based on myoelectricity and gesture data is improved.
For example, as shown in fig. 3, fig. 3 is a flowchart of another composite control method based on myoelectricity and gesture data provided by the present invention. The myoelectric signal acquired by the at least one myoelectricity collector can be acquired, the myoelectric signal can be processed to obtain target signal processing data, and the target control mode corresponding to the target signal processing data can be determined from the mapping relationship between preset signal processing data and control modes. Attitude data such as acceleration and angular velocity acquired by the inertial sensor can then be acquired. A target component for executing the target control operation in the target control mode is determined, and attitude calculation is performed on the acceleration and angular velocity to obtain the control parameters of the target component, which may include a pitch angle and a roll angle; the target component is then controlled to execute the target control operation according to the pitch angle and the roll angle. In the emergency stop mode, the controlled device can be controlled to stop immediately; in the chassis translation mode, the controlled device can be controlled to translate using the control parameters; in the chassis rotation and cradle head rotation mode, the rotation of the controlled device and the rotation of the cradle head can be controlled separately using the control parameters; in the mechanical arm mode, the mechanical arm can be controlled to move up and down and left and right using the control parameters; and in the manipulator mode, the manipulator can be controlled to grasp using the control parameters. In this way the controlled device can be controlled in a directed manner, and the flexibility of the composite control system based on myoelectricity and gesture data is improved.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a composite control device based on myoelectricity and gesture data, where the composite control device 400 based on myoelectricity and gesture data is applied to the composite control system based on myoelectricity and gesture data shown in fig. 1, and the composite control system based on myoelectricity and gesture data includes at least one myoelectricity collector and an inertial sensor, and the device 400 includes an acquisition unit 401, a determination unit 402 and an execution unit 403, where,
the acquiring unit 401 is configured to acquire an electromyographic signal acquired by the at least one electromyographic acquirer;
the determining unit 402 is configured to determine a target control mode corresponding to the electromyographic signal;
the acquiring unit 401 is further configured to acquire attitude data acquired by the inertial sensor;
the executing unit 403 is configured to execute a target control operation in the target control mode according to the gesture data.
Optionally, in the aspect of determining the target control mode corresponding to the electromyographic signal, the determining unit 402 is specifically configured to:
processing the electromyographic signals to obtain target signal processing data;
and determining a target control mode corresponding to the target signal processing data according to a mapping relation between the preset signal processing data and the control mode.
Optionally, in the aspect of processing the electromyographic signal to obtain target signal processing data, the determining unit 402 is specifically configured to:
signal processing is carried out on the electromyographic signals to obtain an electromyographic signal curve,
determining a target segment with a signal peak value in the electromyographic signal curve, wherein the starting position of the target segment corresponds to a first signal valley value, and the ending position of the target segment corresponds to a second signal valley value;
determining a target signal peak value, a first signal point with rising speed larger than a first value and a second signal point with falling speed larger than a second value in the target segment, wherein the first signal point corresponds to a first time point; the second signal point corresponds to a second time point;
determining a first signal strength value corresponding to the first signal point and a second signal strength value corresponding to the second signal point, the first signal strength value being greater than or equal to the first signal valley, the second signal strength value being greater than or equal to the second signal valley;
determining a first time period between the first time point and the second time point;
determining a first signal strength difference between the first signal strength value and the second signal strength value;
Determining a first offset value corresponding to the target signal peak value;
and determining a first signal range according to the first offset value and the first signal strength difference value, and taking the first duration and the first signal range as the target signal processing data.
Optionally, in the aspect of determining the target control mode corresponding to the target signal processing data according to the mapping relationship between the preset signal processing data and the control mode, the determining unit 402 is specifically configured to:
matching the first time length with the time length in the mapping relation between the preset signal processing data and the control mode; matching the first signal range with a signal range in a mapping relation between preset signal processing data and a control mode to obtain a preset duration successfully matched with the first time length and a preset signal range successfully matched with the first signal range;
and determining a target control mode corresponding to the preset duration and the preset signal range in the mapping relation.
Optionally, the controlled device includes at least two of the following components: a chassis, a cradle head, a mechanical arm, and a manipulator;
the target control mode includes any one of the following: an emergency stop mode, a chassis translation mode, a chassis rotation and cradle head rotation mode, a mechanical arm mode, and a manipulator mode.
Optionally, the determining unit 402 is further configured to:
if the electromyographic signal curve comprises a plurality of segments, each of which comprises a signal peak value, determining the maximum peak value among the plurality of signal peak values corresponding to the segments;
and determining a target segment corresponding to the maximum peak value in the segments.
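This optional rule — when the curve contains several segments, keep the segment whose signal peak is largest — can be sketched as a few lines; the list-of-lists representation of segments is an assumption for illustration:

```python
def select_target_segment(segments):
    """segments: list of lists of signal samples; return the max-peak segment."""
    peaks = [max(seg) for seg in segments]       # signal peak of each segment
    return segments[peaks.index(max(peaks))]     # segment with the maximum peak
```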
Optionally, the gesture data includes acceleration and angular velocity; in terms of the performing the target control operation in the target control mode according to the attitude data, the performing unit 403 is specifically configured to:
determining a target component that performs the target control operation in the target control mode;
calculating control parameters of the target component according to the attitude data;
and controlling the target component to execute the target control operation according to the control parameter.
It can be seen that the composite control device based on myoelectricity and gesture data described in the embodiment of the present invention acquires the myoelectric signal acquired by the at least one myoelectricity collector; determines a target control mode corresponding to the electromyographic signal; acquires attitude data acquired by the inertial sensor; and executes the target control operation in the target control mode according to the gesture data, so that operation control can be performed based on the cooperation of myoelectricity and gesture recognition, and the flexibility of the composite control system based on myoelectricity and gesture data is improved.
The embodiment of the invention also provides a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute part or all of the steps of any one of the composite control methods based on myoelectricity and gesture data described in the foregoing method embodiments.
Embodiments of the present invention also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program that causes a computer to perform some or all of the steps of any one of the composite control methods based on myoelectricity and gesture data described in the foregoing method embodiments.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present invention is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present invention. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present invention.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, such as the division of the units, merely a logical function division, and there may be additional manners of dividing the actual implementation, such as multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, or may be in electrical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present invention may be integrated in one processing unit, each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units described above may be implemented either in hardware or in software program modules.
The integrated units, if implemented in the form of software program modules, may be stored in a computer-readable memory for sale or use as a stand-alone product. Based on this understanding, the technical solution of the present invention may be embodied essentially or partly in the form of a software product, or all or part of the technical solution, which is stored in a memory, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned memory includes: a U-disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Those of ordinary skill in the art will appreciate that all or a portion of the steps in the various methods of the above embodiments may be implemented by a program that instructs associated hardware, and the program may be stored in a computer readable memory, which may include: flash disk, read-only memory, random access memory, magnetic or optical disk, etc.
The foregoing has outlined rather broadly the more detailed description of embodiments of the invention, wherein the principles and embodiments of the invention are explained in detail using specific examples, the above examples being provided solely to facilitate the understanding of the method and core concepts of the invention; meanwhile, as those skilled in the art will have variations in the specific embodiments and application scope in accordance with the ideas of the present invention, the present description should not be construed as limiting the present invention in view of the above.

Claims (10)

1. A composite control method based on myoelectricity and gesture data, which is characterized by being applied to a composite control system based on myoelectricity and gesture data, wherein the composite control system based on myoelectricity and gesture data comprises at least one myoelectricity collector and an inertial sensor, and the method comprises the following steps:
acquiring an electromyographic signal acquired by the at least one electromyographic acquisition device;
Determining a target control mode corresponding to the electromyographic signal;
acquiring attitude data acquired by the inertial sensor;
executing target control operation in the target control mode according to the gesture data;
the determining the target control mode corresponding to the electromyographic signal comprises the following steps: processing the electromyographic signals to obtain target signal processing data; determining a target control mode corresponding to the target signal processing data according to a mapping relation between preset signal processing data and control modes;
the processing the electromyographic signals to obtain target signal processing data includes:
signal processing is carried out on the electromyographic signals to obtain an electromyographic signal curve,
determining a target segment with a signal peak value in the electromyographic signal curve, wherein the starting position of the target segment corresponds to a first signal valley value, and the ending position of the target segment corresponds to a second signal valley value;
determining a target signal peak value, a first signal point with rising speed larger than a first value and a second signal point with falling speed larger than a second value in the target segment, wherein the first signal point corresponds to a first time point; the second signal point corresponds to a second time point;
Determining a first signal strength value corresponding to the first signal point and a second signal strength value corresponding to the second signal point, the first signal strength value being greater than or equal to the first signal valley, the second signal strength value being greater than or equal to the second signal valley;
determining a first time period between the first time point and the second time point;
determining a first signal strength difference between the first signal strength value and the second signal strength value;
determining a first offset value corresponding to the target signal peak value;
and determining a first signal range according to the first offset value and the first signal strength difference value, and taking the first duration and the first signal range as the target signal processing data.
2. The method according to claim 1, wherein the determining a target control mode corresponding to the target signal processing data according to a mapping relationship between preset signal processing data and control modes includes:
matching the first time length with the time length in the mapping relation between the preset signal processing data and the control mode; matching the first signal range with a signal range in a mapping relation between preset signal processing data and a control mode to obtain a preset duration successfully matched with the first time length and a preset signal range successfully matched with the first signal range;
And determining a target control mode corresponding to the preset duration and the preset signal range in the mapping relation.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
if the electromyographic signal curve comprises a plurality of segments, each segment comprises a signal peak value, and determining the maximum peak value in a plurality of signal peak values corresponding to the segments;
and determining a target segment corresponding to the maximum peak value in the segments.
4. The method according to any of claims 1-2, wherein the composite control system based on myoelectricity and gesture data further comprises a controlled device comprising at least two of: a chassis, a cradle head, a mechanical arm, and a manipulator;
the target control mode includes any one of the following: an emergency stop mode, a chassis translation mode, a chassis rotation and cradle head rotation mode, a mechanical arm mode, and a manipulator mode.
5. The method according to claim 3, wherein the composite control system based on myoelectricity and gesture data further comprises a controlled device comprising at least two of: a chassis, a cradle head, a mechanical arm, and a manipulator;
the target control mode includes any one of the following: an emergency stop mode, a chassis translation mode, a chassis rotation and cradle head rotation mode, a mechanical arm mode, and a manipulator mode.
6. The method of claim 4, wherein the gesture data comprises acceleration and angular velocity; the executing the target control operation in the target control mode according to the gesture data includes:
determining a target component that performs the target control operation in the target control mode;
calculating control parameters of the target component according to the attitude data;
and controlling the target component to execute the target control operation according to the control parameter.
7. The method of claim 5, wherein the gesture data comprises acceleration and angular velocity; the executing the target control operation in the target control mode according to the gesture data includes:
determining a target component that performs the target control operation in the target control mode;
calculating control parameters of the target component according to the attitude data;
and controlling the target component to execute the target control operation according to the control parameter.
8. A composite control device based on myoelectricity and gesture data, characterized in that it is applied to a composite control system based on myoelectricity and gesture data, the composite control system based on myoelectricity and gesture data comprising at least one myoelectricity collector and an inertial sensor, the device comprising:
the acquisition unit is used for acquiring the myoelectric signals acquired by the at least one myoelectric acquisition unit;
the determining unit is used for determining a target control mode corresponding to the electromyographic signal;
the acquisition unit is also used for acquiring the attitude data acquired by the inertial sensor;
an execution unit configured to execute a target control operation in the target control mode according to the posture data;
the determining unit is further used for processing the electromyographic signals to obtain target signal processing data, and determining a target control mode corresponding to the target signal processing data according to a mapping relation between preset signal processing data and control modes;
the determining unit is also used for carrying out signal processing on the electromyographic signals to obtain an electromyographic signal curve,
determining a target segment with a signal peak value in the electromyographic signal curve, wherein the starting position of the target segment corresponds to a first signal valley value, and the ending position of the target segment corresponds to a second signal valley value;
Determining a target signal peak value, a first signal point with rising speed larger than a first value and a second signal point with falling speed larger than a second value in the target segment, wherein the first signal point corresponds to a first time point; the second signal point corresponds to a second time point;
determining a first signal strength value corresponding to the first signal point and a second signal strength value corresponding to the second signal point, the first signal strength value being greater than or equal to the first signal valley, the second signal strength value being greater than or equal to the second signal valley;
determining a first time period between the first time point and the second time point;
determining a first signal strength difference between the first signal strength value and the second signal strength value;
determining a first offset value corresponding to the target signal peak value;
and determining a first signal range according to the first offset value and the first signal strength difference value, and taking the first duration and the first signal range as the target signal processing data.
9. A composite control system based on myoelectricity and gesture data, characterized in that the composite control system based on myoelectricity and gesture data comprises at least one myoelectricity collector, an inertial sensor, a controlled device, and a controller, wherein the at least one myoelectricity collector and the inertial sensor are connected to the controller;
The myoelectricity collector is used for collecting myoelectricity signals of a human body;
the controller is used for determining a target control mode corresponding to the electromyographic signal;
the inertial sensor is used for collecting posture data of a human body;
the controller is further used for controlling the controlled device to execute target control operation in the target control mode according to the gesture data;
the controller is also used for processing the electromyographic signals to obtain target signal processing data; determining a target control mode corresponding to the target signal processing data according to a mapping relation between preset signal processing data and control modes;
the controller is further configured to:
signal processing is carried out on the electromyographic signals to obtain an electromyographic signal curve,
determining a target segment with a signal peak value in the electromyographic signal curve, wherein the starting position of the target segment corresponds to a first signal valley value, and the ending position of the target segment corresponds to a second signal valley value;
determining a target signal peak value, a first signal point with rising speed larger than a first value and a second signal point with falling speed larger than a second value in the target segment, wherein the first signal point corresponds to a first time point; the second signal point corresponds to a second time point;
Determining a first signal strength value corresponding to the first signal point and a second signal strength value corresponding to the second signal point, the first signal strength value being greater than or equal to the first signal valley, the second signal strength value being greater than or equal to the second signal valley;
determining a first time period between the first time point and the second time point;
determining a first signal strength difference between the first signal strength value and the second signal strength value;
determining a first offset value corresponding to the target signal peak value;
and determining a first signal range according to the first offset value and the first signal strength difference value, and taking the first duration and the first signal range as the target signal processing data.
10. A computer-readable storage medium, characterized in that a computer program for electronic data exchange is stored, wherein the computer program causes a computer to perform the method according to any one of claims 1-7.
CN202011063852.7A 2020-09-30 2020-09-30 Myoelectricity and gesture data-based composite control method, device and system Active CN112214109B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011063852.7A CN112214109B (en) 2020-09-30 2020-09-30 Myoelectricity and gesture data-based composite control method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011063852.7A CN112214109B (en) 2020-09-30 2020-09-30 Myoelectricity and gesture data-based composite control method, device and system

Publications (2)

Publication Number Publication Date
CN112214109A CN112214109A (en) 2021-01-12
CN112214109B true CN112214109B (en) 2023-06-23

Family

ID=74051687

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011063852.7A Active CN112214109B (en) 2020-09-30 2020-09-30 Myoelectricity and gesture data-based composite control method, device and system

Country Status (1)

Country Link
CN (1) CN112214109B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20230086750A (en) * 2021-03-19 2023-06-15 썬전 샥 컴퍼니 리미티드 Movement monitoring method and movement monitoring system
CN113021349A (en) * 2021-03-24 2021-06-25 季华实验室 Remote operation control method, device, system, equipment and storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
JP2001054507A (en) * 1999-08-17 2001-02-27 Sony Corp Motion capture device using myoelectric information, its controlling method, electric stimulator using this, force tactile presenting device and controlling method of these
CN110169851A (en) * 2019-05-28 2019-08-27 南京航空航天大学 The artificial hand control system of electromyography signal automatic adjusument

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
CN107553499A (en) * 2017-10-23 2018-01-09 上海交通大学 Natural the gesture motion control system and method for a kind of Multi-shaft mechanical arm
CN109901539A (en) * 2019-03-27 2019-06-18 辽东学院 A kind of man-machine interactive system and its control method applied to smart home

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
JP2001054507A (en) * 1999-08-17 2001-02-27 Sony Corp Motion capture device using myoelectric information, its controlling method, electric stimulator using this, force tactile presenting device and controlling method of these
CN110169851A (en) * 2019-05-28 2019-08-27 南京航空航天大学 The artificial hand control system of electromyography signal automatic adjusument

Also Published As

Publication number Publication date
CN112214109A (en) 2021-01-12

Similar Documents

Publication Publication Date Title
CN107378944B (en) Multidimensional surface electromyographic signal artificial hand control method based on principal component analysis method
CN112214109B (en) Myoelectricity and gesture data-based composite control method, device and system
CN107856014B (en) Mechanical arm pose control method based on gesture recognition
CN110125909B (en) Multi-information fusion human body exoskeleton robot control protection system
Artemiadis et al. EMG-based teleoperation of a robot arm in planar catching movements using ARMAX model and trajectory monitoring techniques
CN106695794A (en) Mobile machine arm system based on surface myoelectric signal and control method of mobile machine arm system
Bouteraa et al. A gesture-based telemanipulation control for a robotic arm with biofeedback-based grasp
CN109521880B (en) Teleoperation robot system and method based on mixed bioelectricity signal driving
CN105014676A (en) Robot motion control method
CN105138133A (en) Biological signal gesture recognition device and method
Aswath et al. Human gesture recognition for real-time control of humanoid robot
Mongardi et al. A low-power embedded system for real-time sEMG based event-driven gesture recognition
Kim et al. Arm motion estimation algorithm using MYO armband
Fukuda et al. Development of an IoT-based prosthetic control system
Fan et al. Improved teleoperation of an industrial robot arm system using leap motion and myo armband
CN203552178U (en) Wrist strip type hand motion identification device
CN105718032A (en) Spaced control autodyne aircraft
Patel et al. EMG-based human machine interface control
CN111358659B (en) Robot power-assisted control method and system and lower limb rehabilitation robot
CN204725501U (en) Body sense mechanical arm comfort level checkout gear
Wei et al. Research on robotic arm movement grasping system based on MYO
CN113021349A (en) Remote operation control method, device, system, equipment and storage medium
CN204748634U (en) Motion control system of robot
Zhao et al. A multimodal-signals-based gesture recognition method for human machine interaction
CN111230872B (en) Object delivery intention recognition system and method based on multiple sensors

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant