CN112123334B - Interactive arm control method and system based on event-driven mechanism - Google Patents

Interactive arm control method and system based on event-driven mechanism

Info

Publication number
CN112123334B
CN112123334B CN202010854724.8A
Authority
CN
China
Prior art keywords
speed
virtual human
event
control
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010854724.8A
Other languages
Chinese (zh)
Other versions
CN112123334A (en)
Inventor
翟超
郭博文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Geosciences
Original Assignee
China University of Geosciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Geosciences filed Critical China University of Geosciences
Priority to CN202010854724.8A priority Critical patent/CN112123334B/en
Publication of CN112123334A publication Critical patent/CN112123334A/en
Application granted granted Critical
Publication of CN112123334B publication Critical patent/CN112123334B/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1612 Programme controls characterised by the hand, wrist, grip control
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661 Programme controls characterised by task planning, object-oriented languages
    • B25J9/1664 Programme controls characterised by motion, path, trajectory planning

Abstract

The invention provides an interactive arm control method and system based on an event-driven mechanism. Hand position signals of an autistic patient are acquired and processed by filtering and speed estimation to obtain a series of hand position-time and speed-time sequences of the patient. The method then judges whether the difference between the current hand position of the virtual human and a preset boundary point position is smaller than a set threshold. If so, one round of PD control is triggered, and the motion speed and motion trajectory of the virtual human are adjusted according to the obtained position-time and speed-time sequences; if not, the virtual human moves according to a preset speed and a preset trajectory. The beneficial effects of the invention are: adding the designed event-driven mechanism to the PD controller makes the control method more human-like, improves the control effect to a certain extent, and reduces the amount of signal transmission.

Description

Interactive arm control method and system based on event-driven mechanism
Technical Field
The invention relates to the field of automatic control, in particular to an interactive arm control method and system based on an event-driven mechanism.
Background
In traditional arm control methods, the controller acts on a fixed time period. The control effect can be improved by tuning the control parameters, but this approach still has drawbacks: first, it does not match the control mechanism of the human body; second, to guarantee a good control effect, the amount of signal transmission among the modules of the control model is excessive, causing large resource consumption, and a large number of experiments are required to determine the most suitable control parameters. Therefore, to make the controller model better fit the control mechanism of the human body and to reduce unnecessary signal transmission, an event-driven controller model based on threshold triggering is proposed. The method can be used in the fields of consumer entertainment and healthcare, creating a virtual avatar of a real person for real-time human-machine cooperative motion games. In the field of mental health, it can be used in the clinical treatment of autism and social disorders, and has good application prospects.
Disclosure of Invention
In order to solve the above problems, the invention provides an interactive arm control method and system based on an event-driven mechanism, which control the arm movement of a virtual human in real time and realize motion coordination and interaction between the virtual human and a real person. The interactive arm control method based on an event-driven mechanism mainly comprises the following steps:
S1: acquiring a hand position signal of a person, and performing filtering and speed estimation to obtain a series of hand position-time and speed-time sequences of the person;
S2: judging whether the difference between the current hand position of the virtual human and the preset boundary point position is smaller than a set threshold; if yes, go to step S3; if not, go to step S4;
S3: triggering one round of PD control, and correspondingly adjusting the motion speed and motion trajectory of the virtual human according to the series of hand position-time and speed-time sequences of the person obtained in step S1;
S4: the virtual human moves according to the preset speed and the preset trajectory.
Further, the hand position signal of the person is collected by utilizing a backward difference rule.
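As an illustrative sketch (not given in the patent), a backward-difference rule recovers a velocity sequence from sampled hand positions; the function name and the zero-initialization of the first sample are assumptions:

```python
def backward_difference_velocity(positions, dt):
    """Estimate velocity from sampled positions with the backward
    difference v[k] = (x[k] - x[k-1]) / dt; the first sample has no
    predecessor, so its velocity is taken as 0."""
    velocities = [0.0]
    for k in range(1, len(positions)):
        velocities.append((positions[k] - positions[k - 1]) / dt)
    return velocities
```

Because only successive position samples are needed, this rule matches the patent's claim that velocity signals can be computed from a small number of acquired positions.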
Further, the PD control process is as follows:
[Equation image BDA0002646018440000021]
where ẍ is the acceleration of the virtual human, Δ is the set threshold, a1, a2 and a3 are control parameters, x and ẋ respectively represent the current hand position and velocity of the virtual human, h and ḣ respectively represent the current position and velocity of the person's hand, l1 and l2 are the preset boundary points, and v is the preset speed of the virtual human.
An interactive arm control system based on an event-driven mechanism enables the virtual human to generate a motion trajectory with self-similar characteristics and to adaptively adjust its motion state to achieve human-machine cooperation; the system comprises a signal processing module, an event-trigger judgment module, a control module and an end-effector module;
the signal processing module is used for acquiring hand position signals of a person, and performing filtering and speed estimation processing;
the event-trigger judgment module is used for judging whether the difference between the current hand position of the virtual human and the preset boundary point position is smaller than the set threshold;
the control module is used for correspondingly controlling and adjusting the motion trajectory of the virtual human;
and the end-effector module is used for adjusting the motion trajectory of the virtual human.
Furthermore, the signal processing module comprises a signal sampling unit, a filtering unit and a speed estimation unit; the signal sampling unit is used for acquiring the position signal of the person's hand; the filtering unit is used for removing interference signals from the position signal of the person's hand; and the speed estimation unit is used for autonomously generating a series of position and speed signals.
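The patent does not specify which filter the filtering unit uses; as an illustrative stand-in (an assumption, not the patented design), a simple moving-average low-pass filter can remove high-frequency interference from the sampled positions:

```python
def moving_average_filter(signal, window=3):
    """Moving-average low-pass filter: each output sample is the mean
    of the last `window` input samples (fewer at the start of the
    signal). A placeholder for the patent's unspecified filtering unit."""
    out = []
    for k in range(len(signal)):
        lo = max(0, k - window + 1)
        chunk = signal[lo:k + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

In practice the filtered positions would then be fed to the speed estimation unit.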
Further, the control module is a PD controller.
The technical scheme provided by the invention has the following beneficial effects: compared with existing control methods, the control effect is improved to a certain extent and the amount of signal transmission is reduced.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a flowchart of the interactive arm control method based on an event-driven mechanism according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a rehabilitation experiment performed by an autistic patient with a virtual human on a screen according to an embodiment of the invention;
FIG. 3 is a schematic diagram comparing event-driven control and conventional time-period control in terms of time synchrony in an embodiment of the present invention;
FIG. 4 is a schematic diagram comparing event-driven control and conventional time-period control in terms of motion coordination in an embodiment of the present invention.
Detailed Description
For a more clear understanding of the technical features, objects and effects of the present invention, embodiments of the present invention will now be described in detail with reference to the accompanying drawings.
The embodiment of the invention provides an interactive arm control method and system based on an event-driven mechanism.
Referring to fig. 1, which is a flowchart of the interactive arm control method based on an event-driven mechanism in an embodiment of the present invention (fig. 2 shows the rehabilitation experiment performed by an autistic patient with a virtual human on a screen, while fig. 3 and fig. 4 compare event-driven control with conventional time-period control in terms of time synchrony and motion coordination, respectively), the interactive arm control method based on an event-driven mechanism specifically comprises the following steps:
S1: collecting hand position signals of an autistic patient, and performing filtering and speed estimation to obtain a series of hand position-time and speed-time sequences of the patient. The hand position signals are acquired using a backward difference rule; only a small number of hand position samples need to be acquired, and usable speed and position signals can then be calculated automatically, without being limited to the previously acquired signals.
S2: judging whether the difference between the current hand position of the virtual human and the preset boundary point position is smaller than the set threshold; if yes, go to step S3; if not, go to step S4;
S3: triggering one round of PD control, and correspondingly adjusting the motion speed and motion trajectory of the virtual human according to the series of hand position-time and speed-time sequences of the patient obtained in step S1.
Whether the current position of the virtual human is close to a boundary point of the motion trajectory is judged in advance, and different control strategies are executed according to the result: if the current position of the virtual human is far from the boundary point, the virtual human moves according to the preset speed v and the preset trajectory; if the current position is close to the boundary point, an event is triggered and the system executes one round of PD control.
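The boundary-proximity test above can be sketched as follows; the function name and the numeric threshold are illustrative assumptions, not values from the patent:

```python
DELTA = 0.05  # set threshold (illustrative value)

def event_triggered(x, boundaries, delta=DELTA):
    """Return True when the virtual human's hand position x is within
    delta of either preset boundary point l1 or l2, i.e. when one round
    of PD control should be triggered."""
    l1, l2 = boundaries
    return abs(x - l1) < delta or abs(x - l2) < delta
```

When this returns False, the virtual human simply continues at the preset speed v along the preset trajectory.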
The procedure for PD control is as follows:
[Equation image BDA0002646018440000041]
where ẍ is the acceleration of the virtual human, Δ is the set threshold, a1, a2 and a3 are control parameters, x and ẋ respectively represent the current hand position and velocity of the virtual human, h and ḣ respectively represent the current position and velocity of the autistic patient's hand, l1 and l2 are the preset boundary points, and v is the preset speed of the virtual human.
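The patent gives the PD control law only as an equation image. As a hedged illustration over the same symbols, a generic PD tracking law might look like the following sketch; the specific functional form and gain values are assumptions, not the patented formula:

```python
def pd_acceleration(x, x_dot, h, h_dot, a1=1.0, a2=1.0, a3=0.5, v=0.2):
    """One plausible PD-style law: a proportional term tracks the
    patient's hand position h, a derivative term tracks the hand
    velocity h_dot, and a third term pulls the virtual human's speed
    toward the preset speed v. Illustrative stand-in for the patent's
    equation image."""
    return a1 * (h - x) + a2 * (h_dot - x_dot) + a3 * (v - x_dot)
```

Any law of this shape yields an acceleration ẍ that depends on the same quantities (x, ẋ, h, ḣ, v) listed in the patent's symbol definitions.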
Appropriate values for the preset speed, boundary points, set threshold and control parameters are selected based on extensive experimental simulation and data analysis.
S4: the virtual human moves according to the preset speed and the preset trajectory.
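Steps S1-S4 can be combined into a single simulation tick, sketched below; the function name, the Euler-integration dynamics, and all numeric values are illustrative assumptions rather than the patent's implementation:

```python
def control_step(x, x_dot, h, h_dot, boundaries, v, dt,
                 delta=0.05, a1=1.0, a2=1.0, a3=0.5):
    """One tick of the event-driven loop: move at the preset speed v
    unless the virtual human is within delta of a boundary point, in
    which case a single PD update adjusts the velocity (assumed PD
    form and simple Euler integration)."""
    l1, l2 = boundaries
    if abs(x - l1) < delta or abs(x - l2) < delta:
        # event triggered: one round of PD control toward the patient's hand
        accel = a1 * (h - x) + a2 * (h_dot - x_dot) + a3 * (v - x_dot)
        x_dot = x_dot + accel * dt
    else:
        # far from the boundary: follow the preset speed
        x_dot = v
    return x + x_dot * dt, x_dot

# Far from the boundary, the virtual human simply moves at v:
x, x_dot = control_step(0.5, 0.0, 0.5, 0.0, (0.0, 1.0), v=0.2, dt=0.1)
```

PD updates fire only near the boundary points, which is what reduces the signal transmission compared with fixed-period control.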
An interactive arm control system based on an event-driven mechanism comprises a signal processing module, an event trigger judging module, a control module and an end effector module;
the signal processing module is used for acquiring hand position signals of a certain autism patient, and performing filtering and speed estimation processing; the signal processing module comprises a signal sampling unit, a filtering unit and a speed estimation unit, wherein the signal sampling unit is used for collecting position signals of the hands of the autism patient; the filtering unit is used for removing interference signals in position signals of the hands of the autism patient; a speed estimation unit for autonomously generating a series of position and speed signals.
The event-trigger judgment module is used for judging whether the difference between the current hand position of the virtual human and the preset boundary point position is smaller than the set threshold.
The control module is used for correspondingly controlling and adjusting the motion trajectory of the virtual human; the control module is a PD controller.
The end-effector module is used for adjusting the motion trajectory of the virtual human.
As shown in fig. 2, the left bead represents the virtual human and the right bead represents the autistic patient; x and ẋ respectively represent the current position and velocity of the virtual human's hand, h and ḣ respectively represent the current position and velocity of the autistic patient's hand, and l1 and l2 are the boundary points. Both move along parallel horizontal line segments.
As shown in fig. 3 and fig. 4, extensive experimental analysis shows the following: if the patient already shows good signs of recovery (i.e. the patient has regained social ability; in particular, when performing a cooperative-motion experiment with another real person, the level of coordination, namely the Cnl value, is higher), then motion coordination is better under time-period control (PD control); if the patient's condition is still severe, motion coordination is better under event-driven control (ET control).
With the interactive arm control method and system, the virtual human can generate a motion trajectory with self-similar characteristics and adaptively adjust its motion state to achieve human-machine cooperation. In addition, the control method effectively reduces information transmission and energy consumption in human-machine interactive motion. By controlling the interaction between the on-screen virtual human and the autistic patient, the ultimate goal of treating autism is pursued. Experimental results show that, compared with existing control methods, this control method better fits real-person experimental data.
The beneficial effects of the invention are: adding the designed event-driven mechanism to the PD controller makes the control method more human-like; in addition, the control effect is improved to a certain extent and the amount of signal transmission is reduced.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (4)

1. An interactive arm control method based on an event-driven mechanism, which enables a virtual human to generate a motion trajectory with self-similar characteristics and to adaptively adjust the motion state of the virtual human to achieve human-machine cooperation, characterized in that the method comprises the following steps:
S1: acquiring a hand position signal of a person, and performing filtering and speed estimation to obtain a series of hand position-time and speed-time sequences of the person;
S2: judging whether the difference between the current hand position of the virtual human and the preset boundary point position is smaller than a set threshold; if yes, go to step S3; if not, go to step S4;
S3: triggering one round of PD control, and correspondingly adjusting the motion speed and motion trajectory of the virtual human according to the series of hand position-time and speed-time sequences of the person obtained in step S1;
the procedure for PD control is as follows:
[Equation image FDA0003153789490000011]
wherein ẍ is the acceleration of the virtual human, Δ is the set threshold, a1, a2 and a3 are control parameters, x and ẋ respectively represent the current hand position and velocity of the virtual human, h and ḣ respectively represent the current position and velocity of the person's hand, l1 and l2 are the preset boundary points, and v is the preset speed of the virtual human;
S4: the virtual human moves according to the preset speed and the preset trajectory.
2. The interactive arm control method based on event-driven mechanism as claimed in claim 1, wherein: in step S1, the hand position signal of the person is acquired by using a backward difference rule.
3. An interactive arm control system based on an event-driven mechanism, which enables a virtual human to generate a motion trajectory with self-similar characteristics and to adaptively adjust the motion state of the virtual human to achieve human-machine cooperation, characterized in that the system comprises a signal processing module, an event-trigger judgment module, a control module and an end-effector module;
the signal processing module is used for acquiring hand position signals of a person, and performing filtering and speed estimation processing;
the event-trigger judgment module is used for judging whether the difference between the current hand position of the virtual human and the preset boundary point position is smaller than a set threshold;
the control module is used for correspondingly controlling and adjusting the motion trajectory of the virtual human; the control module is a PD controller, and the control process is as follows:
[Equation image FDA0003153789490000021]
wherein ẍ is the acceleration of the virtual human, Δ is the set threshold, a1, a2 and a3 are control parameters, x and ẋ respectively represent the current hand position and velocity of the virtual human, h and ḣ respectively represent the current position and velocity of the person's hand, l1 and l2 are the preset boundary points, and v is the preset speed of the virtual human;
and the end-effector module is used for adjusting the motion trajectory of the virtual human.
4. The interactive arm control system based on an event-driven mechanism as claimed in claim 3, wherein: the signal processing module comprises a signal sampling unit, a filtering unit and a speed estimation unit; the signal sampling unit is used for collecting a position signal of a person's hand; the filtering unit is used for removing interference signals from the position signal of the person's hand; and the speed estimation unit is used for autonomously generating a series of position and speed signals.
CN202010854724.8A 2020-08-24 2020-08-24 Interactive arm control method and system based on event-driven mechanism Active CN112123334B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010854724.8A CN112123334B (en) 2020-08-24 2020-08-24 Interactive arm control method and system based on event-driven mechanism

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010854724.8A CN112123334B (en) 2020-08-24 2020-08-24 Interactive arm control method and system based on event-driven mechanism

Publications (2)

Publication Number Publication Date
CN112123334A CN112123334A (en) 2020-12-25
CN112123334B true CN112123334B (en) 2021-09-10

Family

ID=73847176

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010854724.8A Active CN112123334B (en) 2020-08-24 2020-08-24 Interactive arm control method and system based on event-driven mechanism

Country Status (1)

Country Link
CN (1) CN112123334B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110794964A (en) * 2019-10-22 2020-02-14 深圳追一科技有限公司 Interaction method and device for virtual robot, electronic equipment and storage medium
CN111443619A (en) * 2020-04-17 2020-07-24 南京工程学院 Virtual-real fused human-computer cooperation simulation method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10319109B2 (en) * 2017-03-31 2019-06-11 Honda Motor Co., Ltd. Interaction with physical objects as proxy objects representing virtual objects

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110794964A (en) * 2019-10-22 2020-02-14 深圳追一科技有限公司 Interaction method and device for virtual robot, electronic equipment and storage medium
CN111443619A (en) * 2020-04-17 2020-07-24 南京工程学院 Virtual-real fused human-computer cooperation simulation method and system

Also Published As

Publication number Publication date
CN112123334A (en) 2020-12-25

Similar Documents

Publication Publication Date Title
CN106671084B (en) A kind of autonomous householder method of mechanical arm based on brain-computer interface
CN106621287A (en) Upper limb rehabilitation training method based on brain-computer interface and virtual reality technology
CN107308638A (en) A kind of entertaining rehabilitation training of upper limbs system and method for virtual reality interaction
CN106504751A (en) Self adaptation lip reading exchange method and interactive device
CN112114670B (en) Man-machine co-driving system based on hybrid brain-computer interface and control method thereof
CN103207667B (en) A kind of control method of human-computer interaction and its utilization
CN106890423A (en) Using the intelligent running machine of stereoscopic vision
CN112951409A (en) Hemiplegia patient rehabilitation system based on Kinect interaction and virtual reality
CN103349595A (en) Intelligent brain-computer interface wheelchair based on multi-mode hierarchical control
CN110812122A (en) Sitting and standing training method and system for lower limb rehabilitation robot
CN112123334B (en) Interactive arm control method and system based on event-driven mechanism
CN103390193B (en) A kind of automatic trainer of rat robot towards navigation and rat behavior recognition methods and training method
CN107544675A (en) Brain control formula virtual reality method
CN111603158B (en) Fatigue driving warning method and system based on electrophysiological signal artificial intelligent analysis
CN103110486A (en) Wheel chair and method controlled by tooth
CN117694907A (en) Fine motion motor imagery electroencephalogram signal classification method and device
CN103736249A (en) Multi-user bicycle analog simulation system based on Internet
CN105549733B (en) Brain-computer interface system and method based on stable state vision inducting under a kind of intelligent space
CN206559469U (en) A kind of target following equipment based on FPGA
CN111243705A (en) Self-adaptation VR mirror image training system
CN106371588A (en) Movement imagery brain-computer interface-based hand function rehabilitation method
CN114610156A (en) Interaction method and device based on AR/VR glasses and AR/VR glasses
CN112306244B (en) Limb movement imagination brain-computer interaction method and system
CN106598243A (en) Multimode adaptive cursor control method and system
CN107496141A (en) A kind of finger rehabilitation device device and finger training method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant