CN113311943B - Wearable interactive device for guiding movement of upper limbs of human - Google Patents

Wearable interactive device for guiding movement of upper limbs of human

Info

Publication number
CN113311943B
Authority
CN
China
Prior art keywords
upper limb
human
control module
mcu control
relative vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110577259.2A
Other languages
Chinese (zh)
Other versions
CN113311943A (en)
Inventor
刘冠阳
黄超
王毅
王宇航
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University
Priority to CN202110577259.2A
Publication of CN113311943A
Application granted
Publication of CN113311943B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/455 Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F 9/45533 Hypervisors; Virtual machine monitors
    • G06F 9/45558 Hypervisor-specific management and integration aspects
    • G06F 2009/45562 Creating, deleting, cloning virtual machine instances
    • G06F 2009/45595 Network integration; Enabling network access in virtual machine instances
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B 5/6802 Sensor mounted on worn items


Abstract

The invention relates to a wearable interactive device for guiding the movement of the human upper limbs, comprising a MEMS sensor, an MCU control module, a vibration motor module and an upper computer (host PC) virtual environment. The MEMS sensor collects motion information of the human upper limbs and sends it to the MCU control module; the MCU control module receives the motion information and forwards it to the upper computer virtual environment; the upper computer virtual environment generates a touch coding instruction from the received motion information and sends it back to the MCU control module; the vibration motor module receives the touch coding instruction from the MCU control module, vibrates accordingly, and thereby guides the movement of the human upper limbs. Because a MEMS sensor is used to acquire the motion information, the precision of motion guidance is improved.

Description

Wearable interactive device for guiding movement of upper limbs of human
Technical Field
The invention relates to the technical field of motion guidance, and in particular to a wearable interactive device for guiding the movement of the human upper limbs.
Background
As a large-scale, long-life space research facility with human participation, the Chinese space station is being built into a national-level space laboratory that conducts broad and systematic space science research, pushes important fields of space science to the world frontier, produces important discoveries, leads technological development, and enables space science and its applications to develop by leaps. A space science experiment is a scientific experiment carried out on the space station: ground scientists send control instructions to set experiment parameters, or send instructions to the astronauts on the space station directing them to control the experiment manually.
To meet the interaction requirements of these varied experimental control tasks, a wearable guidance system for remote science operations needs to be researched and realized, so that ground personnel and astronauts can coordinate their interaction through limb operations and wearable devices, for example when lifting, rotating, replacing and maintaining modules. At present, human posture information is usually obtained by visual methods, for example with a Kinect device, but optical equipment suffers from occlusion and limited precision.
Disclosure of Invention
The invention aims to provide a wearable interactive device for guiding the movement of the human upper limbs that improves guidance precision.
To achieve this aim, the invention provides the following scheme:
A wearable interactive device for guiding motion of a human upper limb, comprising: a MEMS sensor, an MCU control module, a vibration motor module and an upper computer virtual environment;
the MEMS sensor is used for collecting the motion information of the upper limbs of the human body and sending the motion information of the upper limbs of the human body to the MCU control module;
the MCU control module is used for receiving the human upper limb movement information and sending the human upper limb movement information to the upper computer virtual environment;
the upper computer virtual environment is used for receiving the human upper limb movement information, generating a touch coding instruction according to the human upper limb movement information, and sending the touch coding instruction to the MCU control module;
the vibration motor module is used for receiving the touch coding instruction sent by the MCU control module, vibrating according to the touch coding instruction and guiding the upper limb of the human body to move.
Optionally, the wearable interactive device further comprises a Bluetooth module, and the MCU control module and the upper computer virtual environment transmit data through the Bluetooth module.
Optionally, the Bluetooth module is an HC-05 Bluetooth module.
Optionally, the MCU control module comprises an Arduino Nano single-chip microcomputer.
Optionally, a plurality of vibration motor modules are distributed on the human upper limb, the MCU control module comprises a plurality of PWM output ports, and each vibration motor module exchanges data with the MCU control module through a PWM output port.
Optionally, the upper computer virtual environment is built with Unity 3D.
Optionally, the upper computer virtual environment uses Bluetooth serial-port communication.
Optionally, the vibration motor module comprises a DC eccentric rotating mass (ERM) motor.
Optionally, the upper computer virtual environment receives human upper limb movement information of a master-end person and human upper limb movement information of a slave-end person;
the human upper limb movement information of the master-end person and of the slave-end person each comprise the movement information of the left arm and of the right arm; the movement information of each arm comprises a wrist joint point coordinate, an elbow joint point coordinate and a shoulder joint point coordinate; the relative vector of the forearm is the relative vector from the elbow joint point coordinate to the wrist joint point coordinate, and the relative vector of the upper arm is the relative vector from the shoulder joint point coordinate to the elbow joint point coordinate;
and the upper computer virtual environment sends out a touch coding instruction according to the difference between the relative vectors of the left forearms of the master-end and slave-end persons, the difference between the relative vectors of the left upper arms, the difference between the relative vectors of the right forearms, and the difference between the relative vectors of the right upper arms.
Optionally, the wrist joint point coordinate, the elbow joint point coordinate and the shoulder joint point coordinate are coordinates on a two-dimensional coordinate system.
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
according to the invention, the MEMS sensor is used for acquiring the motion information of the upper limbs of the human body, so that the problem of shielding of optical equipment can be avoided, the precision of motion guidance is improved, and meanwhile, the MEMS sensor enables the wearable interactive equipment to be convenient and fast and has low power consumption.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a schematic structural diagram of a wearable interactive device for guiding the movement of a human upper limb according to the present invention;
FIG. 2 is a schematic structural view of four vibration motor modules distributed on a single strap according to the present invention;
FIG. 3 is a schematic diagram of the coordinates of the wrist joint point of the forearm relative to the elbow joint point measured by the MEMS sensor according to the present invention;
FIG. 4 is a schematic diagram of the distribution of the wearable interactive device for guiding the movement of a human upper limb on the human upper limb;
FIG. 5 is a schematic diagram of four relative vectors of human upper limb information mapped onto a two-dimensional coordinate system according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention aims to provide a wearable interactive device for guiding the movement of upper limbs of a human, which improves the guiding precision.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Fig. 1 is a schematic structural diagram of a wearable interactive device for guiding the movement of human upper limbs, and as shown in fig. 1, a wearable interactive device for guiding the movement of human upper limbs includes: the system comprises a MEMS sensor 101, an MCU control module 102, a vibration motor module 103 and an upper computer virtual environment 105.
The MEMS sensor 101 is used for collecting the motion information of the upper limbs of the human body and sending the motion information of the upper limbs of the human body to the MCU control module 102.
The MCU control module 102 is configured to receive the human upper limb movement information and send the human upper limb movement information to the upper computer virtual environment 105.
The upper computer virtual environment 105 is used for receiving the human upper limb movement information, generating a touch coding instruction according to the human upper limb movement information, and sending the touch coding instruction to the MCU control module 102.
The vibration motor module 103 is used for receiving the tactile coding instruction sent by the MCU control module 102, vibrating according to the tactile coding instruction, and guiding the movement of the upper limbs of the human body.
The wearable interactive device further comprises a Bluetooth module 104, through which the MCU control module 102 and the upper computer virtual environment 105 transmit data.
The Bluetooth module 104 is an HC-05 Bluetooth module.
The MCU control module 102 comprises an Arduino Nano single-chip microcomputer. It provides several PWM outputs for driving the haptic device (the vibration motor module 103) and serial communication interfaces for exchanging data with the computer and with the attitude sensor (the MEMS sensor 101). Because the wearable device must also be convenient and low-power, the compact Arduino Nano microcontroller board is chosen. The Arduino Nano uses an ATmega328 microcontroller and provides 6 PWM output ports, 16 IO ports, a UART interface and an IIC (I2C) interface. The MCU control module 102 is powered through its VIN port by a 7.4 V lithium battery.
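For illustration, a minimal Arduino sketch consistent with this description is given below; the pin assignments, the baud rate and the empty main loop are assumptions for the sketch, not details taken from the patent.

```cpp
// A minimal Arduino Nano sketch consistent with the description above:
// hardware UART for the HC-05 link to the host PC, I2C (Wire) for the
// MPU9250, and PWM pins for one strap's four vibration motors. Pin
// numbers, baud rate and the empty loop are illustrative assumptions.
#include <Wire.h>

const uint8_t MOTOR_PINS[4] = {3, 5, 6, 9};  // four of the Nano's six PWM pins

void setup() {
  Serial.begin(9600);        // HC-05 modules commonly default to 9600 baud
  Wire.begin();              // join the I2C bus as master for the MPU9250
  for (uint8_t i = 0; i < 4; i++) {
    pinMode(MOTOR_PINS[i], OUTPUT);
    analogWrite(MOTOR_PINS[i], 0);  // all motors off at start-up
  }
}

void loop() {
  // read the sensor, forward data over Serial, apply received haptic codes
}
```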
The plurality of vibration motor modules 103 are distributed on the upper limb of the human body, and each vibration motor module 103 is in data communication with the MCU control module 102 through a PWM output port.
For the Micro-Electro-Mechanical System (MEMS) sensor, an MPU9250 inertial sensor, which is small, accurate and cost-effective, is selected as the acquisition node for motion capture. The MPU9250 integrates a three-axis accelerometer, a three-axis gyroscope and a three-axis magnetometer; it can measure the attitude of an object with high precision and without drift error. The MEMS sensor 101 communicates with the Arduino Nano over the IIC (I2C) protocol; the static accuracy of the X and Y axes is 0.05 degrees, their dynamic accuracy is 0.1 degrees, and the accuracy of the Z axis is 1 degree.
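As a hedged sketch of how such a sensor is typically read, the following Arduino code fetches raw accelerometer samples from an MPU9250 over I2C; register addresses follow the public MPU9250 register map, and the scaling and fusion of raw data into joint coordinates, which the patent does not detail, are omitted.

```cpp
// Hedged sketch of reading raw accelerometer samples from the MPU9250 over
// I2C with the Wire library. Register addresses follow the public MPU9250
// register map (default address 0x68, ACCEL_XOUT_H = 0x3B); scaling and
// the fusion of raw data into joint coordinates are omitted.
#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;  // I2C address with the AD0 pin low

void readAccelRaw(int16_t& ax, int16_t& ay, int16_t& az) {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);                       // start at ACCEL_XOUT_H
  Wire.endTransmission(false);            // repeated start, keep the bus
  Wire.requestFrom(MPU_ADDR, (uint8_t)6); // six bytes: X, Y, Z high/low pairs
  ax = (Wire.read() << 8) | Wire.read();  // high byte first
  ay = (Wire.read() << 8) | Wire.read();
  az = (Wire.read() << 8) | Wire.read();
}
```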
The upper computer virtual environment 105 is built with Unity 3D and uses Bluetooth serial-port communication.
The vibration motor module 103 comprises a DC eccentric rotating mass (ERM) motor. The vibration motor module 103 is driven by a MOSFET: when its signal line is connected to an IO port of the Arduino Nano, vibration is switched on and off by the output level; when it is connected to a PWM output port of the Arduino Nano, the vibration strength can be controlled by changing the PWM duty cycle of the input signal.
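The two drive modes just described might look as follows in Arduino code; the pin number is an illustrative assumption.

```cpp
// The two drive modes described above, assuming the MOSFET gate of one
// motor is wired to pin 3 (an illustrative assumption). The pin must be
// configured as OUTPUT in setup().
const uint8_t MOTOR_PIN = 3;

void motorOnOff(bool on) {
  digitalWrite(MOTOR_PIN, on ? HIGH : LOW);  // on/off via the output level
}

void motorIntensity(uint8_t strength) {
  analogWrite(MOTOR_PIN, strength);  // 0..255 duty cycle sets vibration strength
}
```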
The structure of a single strap built from the vibration motor module 103 is shown in fig. 2: four vibration motors are fixed on a Velcro strap to provide vibration guidance in four directions (1, 2, 3 and 4 in fig. 2).
The upper computer virtual environment 105 receives the human upper limb movement information of the master-end person and of the slave-end person.
The vibrotactile device (the vibration motor module 103) adopts a split design: the vibrotactile devices of the left arm and the right arm are independent. The vibration motors of the vibration motor module 103 are placed at two positions on each arm, the forearm and the upper arm; at each position, 4 vibration motors are arranged at equal intervals around the circumference of the forearm or upper arm and indicate the four directions up, down, left and right. As shown in fig. 3, the coordinates of the wrist joint point relative to the elbow joint point of the forearm are measured using the MEMS sensor 101. In the same way, the MEMS sensor 101 can measure the wrist-relative-to-elbow coordinates of both forearms and the elbow-relative-to-shoulder coordinates of both upper arms.
The human upper limb movement information of the master-end person and of the slave-end person each comprise the movement information of the left arm and of the right arm; the movement information of each arm comprises a wrist joint point coordinate, an elbow joint point coordinate and a shoulder joint point coordinate. The posture (relative vector) of the forearm is represented by the relative vector from the elbow joint point coordinates to the wrist joint point coordinates, and the posture (relative vector) of the upper arm is represented by the relative vector from the shoulder joint point coordinates to the elbow joint point coordinates.
The upper computer virtual environment 105 sends out a touch coding instruction according to the difference between the relative vectors of the left forearms of the master-end and slave-end persons, the difference between the relative vectors of the left upper arms, the difference between the relative vectors of the right forearms, and the difference between the relative vectors of the right upper arms.
When the difference between the relative vector of the left forearm of the master-end person and that of the slave-end person is greater than a set threshold, the relative vector difference is decomposed into an x-axis vector difference and a y-axis vector difference; the two are compared, and the upper computer virtual environment 105 sends out a touch coding instruction according to the x-axis and y-axis vector differences.
The coordinates of the wrist joint point, the elbow joint point and the shoulder joint point are all coordinates on a two-dimensional coordinate system.
The distribution of the wearable interactive device on the human upper limb is shown in fig. 4. In a typical operation-guidance task, the most important actions of the user are concentrated in the upper limbs. The operation guidance algorithm therefore takes the upper limb posture data of the master-end person and the slave-end person as input, compares the difference between them, and outputs a tactile vibration command that the vibration guidance module can act on. Measured by the MEMS sensor 101, the upper limb action is decomposed into four segments: left forearm, left upper arm, right forearm and right upper arm, whose posture information is represented by four relative vectors. The Z-axis coordinate, perpendicular to the human body, is ignored, and the three-dimensional relative vectors are mapped onto a two-dimensional coordinate system. Fig. 5 shows the mapping of the four relative vectors onto the two-dimensional coordinate system.
The relative vector from the elbow joint point to the wrist joint point represents the posture of the forearm, and the relative vector from the shoulder joint point to the elbow joint point represents the posture of the upper arm. Taking the calculation of the posture of the upper arm as an example, the calculation formula is as follows.
$P_{elbow} = (X_{elbow},\, Y_{elbow}), \qquad P_{shoulder} = (X_{shoulder},\, Y_{shoulder})$

$P_{UpperArm} = (X_{elbow} - X_{shoulder},\; Y_{elbow} - Y_{shoulder})$

where $P_{elbow}$ denotes the world coordinates of the elbow joint point and $P_{shoulder}$ the world coordinates of the shoulder joint point; both are obtained through the Position property of the Transform component in Unity 3D. $P_{UpperArm}$ is the projection vector of the upper arm posture in the XY plane (perpendicular to the line connecting the human body and the Kinect).
Taking the left forearm as an example, the master-slave operation guidance algorithm is introduced below.

The relative vector of the left forearm of the master-end person is recorded as:

$P_m = (x_m,\, y_m)$

The relative vector of the left forearm of the slave-end person is:

$P_s = (x_s,\, y_s)$

The difference of the two vectors represents the difference between the left-forearm postures of the master end and the slave end:

$\Delta P = P_m - P_s = (x_m - x_s,\; y_m - y_s)$
when | Δ P ≦ Δ d (Δ d is the first set threshold), the master and slave are considered to be in the same attitude, and when | Δ P ≦ Δ d, Δ P is decomposed into two vectors in the x, y directions: delta P x ,ΔP y . Comparing two directionsMeasuring the size of the modulus to output vibrotactile stimulation at the corresponding position. I Δ P i.e. the euclidean distance, the calculation formula is:
$|\Delta P| = \sqrt{(x_m - x_s)^2 + (y_m - y_s)^2}$
the master-slave attitude difference versus vibration guidance is shown in table 1.
Table 1 master-slave attitude difference corresponding vibration guiding mapping table
Figure BDA0003084780100000072
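Since Table 1 is only available as an image, the sketch below shows one plausible reading of the guidance step it summarizes: compute $\Delta P$ for one arm segment and, if its Euclidean norm exceeds the threshold $\Delta d$, vibrate the motor on the dominant axis. The direction-to-motor mapping is an assumption standing in for the table, not a transcription of it.

```cpp
// Hedged sketch of one master-slave guidance step for a single arm segment,
// following the algorithm above: if the Euclidean distance |dP| between the
// master and slave posture vectors exceeds the threshold dd, vibrate the
// motor on the dominant axis. The direction-to-motor mapping is an
// assumption standing in for Table 1.
#include <cmath>

struct Vec2 { float x, y; };                     // 2D posture vector

enum Direction { NONE, UP, DOWN, LEFT, RIGHT };  // the four motors on one strap

Direction guidanceStep(const Vec2& master, const Vec2& slave, float deltaD) {
    const float dx = master.x - slave.x;          // x component of dP
    const float dy = master.y - slave.y;          // y component of dP
    if (std::sqrt(dx * dx + dy * dy) <= deltaD)   // |dP| <= dd: same posture
        return NONE;
    if (std::fabs(dx) >= std::fabs(dy))           // x component dominates
        return dx > 0.0f ? RIGHT : LEFT;
    return dy > 0.0f ? UP : DOWN;                 // y component dominates
}
```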
In the present specification, the embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
The principles and embodiments of the present invention have been described herein using specific examples, which are provided only to help understand the method and the core concept of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, the specific embodiments and the application range may be changed. In view of the above, the present disclosure should not be construed as limiting the invention.

Claims (9)

1. A wearable interactive device for guiding motion of a human upper limb, comprising: a MEMS sensor, an MCU control module, a vibration motor module and an upper computer virtual environment;
the MEMS sensor is used for collecting the motion information of the upper limbs of the human body and sending the motion information of the upper limbs of the human body to the MCU control module;
the MCU control module is used for receiving the human upper limb movement information and sending the human upper limb movement information to the upper computer virtual environment;
the upper computer virtual environment is used for receiving the human upper limb movement information, generating a touch coding instruction according to the human upper limb movement information, and sending the touch coding instruction to the MCU control module;
the vibration motor module is used for receiving the touch coding instruction sent by the MCU control module, vibrating according to the touch coding instruction and guiding the upper limb of the human body to move;
the upper computer virtual environment receives human upper limb movement information of a master-end person and human upper limb movement information of a slave-end person;
the human upper limb movement information of the master-end person and the human upper limb movement information of the slave-end person comprise the movement information of the left arm and the movement information of the right arm; the movement information of the left arm and the movement information of the right arm each comprise a wrist joint point coordinate, an elbow joint point coordinate and a shoulder joint point coordinate; the relative vector of the forearm is the relative vector from the elbow joint point coordinate to the wrist joint point coordinate, and the relative vector of the upper arm is the relative vector from the shoulder joint point coordinate to the elbow joint point coordinate;
and the upper computer virtual environment sends out a touch coding instruction according to the difference between the relative vectors of the left forearms of the master-end and slave-end persons, the difference between the relative vectors of the left upper arms of the master-end and slave-end persons, the difference between the relative vectors of the right forearms of the master-end and slave-end persons, and the difference between the relative vectors of the right upper arms of the master-end and slave-end persons.
2. The wearable interactive device for guiding human upper limb movement according to claim 1, further comprising a Bluetooth module, wherein the MCU control module and the upper computer virtual environment transmit data through the Bluetooth module.
3. The wearable interactive device for guiding human upper limb movement according to claim 2, wherein the Bluetooth module is an HC-05 Bluetooth module.
4. The wearable interactive device for guiding human upper limb movement according to claim 1, wherein the MCU control module comprises an Arduino Nano single-chip microcomputer.
5. The wearable interactive device for guiding human upper limb movement according to claim 1, wherein a plurality of vibration motor modules are distributed on the human upper limb, the MCU control module comprises a plurality of PWM output ports, and each vibration motor module is in data communication with the MCU control module through the PWM output ports.
6. The wearable interactive device for guiding human upper limb movement according to claim 1, wherein the upper computer virtual environment is built with Unity 3D.
7. The wearable interactive device for guiding human upper limb movement according to claim 1, wherein the upper computer virtual environment uses Bluetooth serial-port communication.
8. The wearable interactive device for guiding human upper limb movement according to claim 1, wherein the vibration motor module comprises a DC eccentric rotating mass motor.
9. A wearable interactive device for guiding human upper extremity movement according to claim 1, characterized in that the wrist, elbow and shoulder joint point coordinates are coordinates on a two-dimensional coordinate system.
CN202110577259.2A 2021-05-26 2021-05-26 Wearable interactive device for guiding movement of upper limbs of human Active CN113311943B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110577259.2A CN113311943B (en) 2021-05-26 2021-05-26 Wearable interactive device for guiding movement of upper limbs of human

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110577259.2A CN113311943B (en) 2021-05-26 2021-05-26 Wearable interactive device for guiding movement of upper limbs of human

Publications (2)

Publication Number Publication Date
CN113311943A CN113311943A (en) 2021-08-27
CN113311943B (en) 2022-10-04

Family

ID=77375032

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110577259.2A Active CN113311943B (en) 2021-05-26 2021-05-26 Wearable interactive device for guiding movement of upper limbs of human

Country Status (1)

Country Link
CN (1) CN113311943B (en)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108187310B (en) * 2017-12-21 2019-05-31 东南大学 Feel that the limb motion of information and posture information is intended to understand and upper-limbs rehabilitation training robot and its control method based on power

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101514902A (en) * 2008-12-01 2009-08-26 东南大学 Navigation device for the blind
CN110327048A (en) * 2019-03-11 2019-10-15 浙江工业大学 A kind of human upper limb posture reconstruction system based on wearable inertial sensor
CN110161900A (en) * 2019-04-25 2019-08-23 中国人民解放军火箭军工程大学 The wearable remote control operation platform of one remote operation
CN110688910A (en) * 2019-09-05 2020-01-14 南京信息职业技术学院 Method for realizing wearable human body basic posture recognition
CN210377375U (en) * 2019-11-14 2020-04-21 云南电网有限责任公司电力科学研究院 Somatosensory interaction device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design of a finger-sleeve force/tactile interaction system for mobile terminals; Wang Lu (王路); China Excellent Master's Theses Full-text Database (Master's), Information Science & Technology Series; 2016-08-15 (No. 8); sections 1.3, 2.3.1, 3.4.2, 3.5 and chapter 6 *

Also Published As

Publication number Publication date
CN113311943A (en) 2021-08-27

Similar Documents

Publication Publication Date Title
CN106326881B (en) Gesture recognition method and gesture recognition device for realizing man-machine interaction
CN103192387A (en) Robot and control method thereof
CN110327048B (en) Human upper limb posture reconstruction system based on wearable inertial sensor
Righetti et al. Proposition of a modular I2C-based wearable architecture
CN108279773B (en) Data glove based on MARG sensor and magnetic field positioning technology
CN109048897A (en) A kind of method of principal and subordinate's teleoperation of robot
CN106227368B (en) A kind of human synovial angle calculation method and device
Krupke et al. Prototyping of immersive HRI scenarios
CN102156540A (en) Three-dimensional somatosensory man-machine interactive system with vibrotactile feedback and interactive method thereof
Shao et al. A natural interaction method of multi-sensory channels for virtual assembly system of power transformer control cabinet
CN113311943B (en) Wearable interactive device for guiding movement of upper limbs of human
CN113305830A (en) Humanoid robot action system based on human body posture control and control method
CN111152260A (en) Joint corner auxiliary measurement system and method for serial rotary joint mechanical arm
Sharma et al. Design and implementation of robotic hand control using gesture recognition
CN206578829U (en) A kind of bionical body-sensing mechanical arm of seven freedom
Ma et al. Magnetic hand motion tracking system for human–machine interaction
Lobo et al. InerTouchHand system (iTH): demonstration of a glove device with distributed inertial sensors and vibro-tactile feedback
He et al. Design and implementation of low-cost inertial sensor-based human motion capture system
CN109426346A (en) A kind of data glove based on force feedback technique
CN217767390U (en) Glove type control system
Chenghao et al. Research on human posture recognition system based on inertial sensor
CN207704451U (en) Gesture acquisition system
CN202694258U (en) Limb posture recognition device
Qian et al. DH Parameter Method-based Wearable Motion Tracking
Graziano et al. A wireless haptic data suit for controlling humanoid robots

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant