CN112883935A - Neural network real-time sign language translation system equipment - Google Patents

Neural network real-time sign language translation system equipment

Info

Publication number
CN112883935A
CN112883935A
Authority
CN
China
Prior art keywords
module
data
sign language
main control
control module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110358848.1A
Other languages
Chinese (zh)
Inventor
蔡向东 (Cai Xiangdong)
朱思旭 (Zhu Sixu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin University of Science and Technology
Original Assignee
Harbin University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin University of Science and Technology
Priority to CN202110358848.1A
Publication of CN112883935A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Psychiatry (AREA)
  • Multimedia (AREA)
  • Social Psychology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a neural network real-time sign language translation system device, belongs to the technical field of sign language translation, and can translate sign language actions made by device users into written characters in real time and display the written characters on a mobile phone APP to help deaf-mutes reduce difficulty in communication with people. The invention comprises a main control module, a gesture module, a communication module and a terminal module. The main control module is used for acquiring signals from the gesture module, processing the signals through a neural network sign language recognition algorithm and transmitting the signals to the terminal module through the communication module; the gesture module is used for collecting sign language action data of a user of the equipment, consists of a bending sensor for collecting data when fingers are bent and an MPU6050 chip module for detecting the orientation of a palm, and is connected with the main control module through a connecting circuit; the communication module adopts a half-duplex working HC-05 Bluetooth module, is connected with an I/O port of the main control module through a UART interface and is used as a data transmission channel; the terminal module adopts a mobile phone terminal App form with a Bluetooth function, receives signals processed by the main control module through the communication module, and obtains a final recognition translation result which is displayed on a mobile phone screen in a written text form.

Description

Neural network real-time sign language translation system equipment
Technical Field
The invention belongs to the technical field of sign language translation, and particularly relates to a neural network real-time sign language translation system device.
Background
According to recent statistics, more than 20 million people in China have hearing impairment, and the figure reaches roughly 70 million worldwide. In recent years, researchers at home and abroad have invested substantial effort in overcoming the shortcomings and challenges in the field of sign language recognition, exploring the potential of input sensors such as data gloves and special-purpose cameras. Although webcams and stereo cameras can track hand movements accurately and quickly, they struggle with complex backgrounds and lighting.
Disclosure of Invention
The invention provides a neural network real-time sign language translation system device, which translates sign language actions made by the device user into written text in real time and displays the text on a mobile phone APP, helping deaf-mute users communicate with others more easily.
The technical scheme for solving the technical problems is as follows: a real-time sign language translation system device of neural network comprises a main control module, a gesture module, a communication module and a terminal module:
the main control module is used for acquiring signals from the gesture module, processing the signals through a neural network sign language recognition algorithm and transmitting the signals to the terminal module through the communication module;
the gesture module is used for collecting sign language action data, consists of a bending sensor for collecting data when fingers are bent and an MPU6050 chip module for detecting the orientation of a palm, and is connected with the main control module through a connecting circuit;
the communication module adopts a half-duplex working HC-05 Bluetooth module, is connected with an I/O port of the main control module through a UART interface and is used as a data transmission channel;
the terminal module adopts a mobile phone terminal App form, receives the signals processed by the main control module through the communication module, and obtains a final recognition translation result which is displayed on a mobile phone screen in a written text form;
the design process of the sign language recognition algorithm of the neural network comprises the steps of firstly utilizing data equipment to collect sign language actions of equipment users, completing collection by different users in batches at different periods, further establishing a sign language database for training and verification, then preprocessing all data according to different categories, finally determining the structure and the activation function of the neural network according to the length and the type of the data, determining a weight matrix to be transplanted to the data equipment after completing training and obtaining a stable result in verification, and introducing action codes to avoid repeated recognition of static actions.
The invention has the advantages that: the gesture language action made by the equipment wearer can be recognized in real time and translated in the form of text; when the device is used by different device users with different hand posture actions, the device can still keep high precision; the signal transmission speed is high, and the data equipment is not influenced by the external environment in the communication process with the mobile phone terminal; the energy consumption is low, and the research and development cost is moderate.
Drawings
FIG. 1 is a general system design diagram of a neural network real-time sign language translation system device according to the present invention;
FIG. 2 is a diagram of the operation of the data device of the present invention at the handset side;
FIG. 3 is a basic flow diagram of neural network training and recognition of the present invention.
Detailed Description
For a further understanding of the present invention, reference will now be made in detail to the embodiments illustrated in the drawings.
As shown in fig. 1, a neural network real-time sign language translation system device includes a main control module, a gesture module, a communication module, and a terminal module;
the gesture module is mainly used for gesture acquisition, namely collecting the hand motion information of the device user, and comprises bending sensors and a gyroscope sensor; each bending sensor in turn comprises a force-sensitive element, an elastic encapsulation layer and a flexible circuit;
the main control module mainly performs fusion processing of the received data and consists of an i.MX6ULL microcontroller and an AD conversion circuit;
the communication module is mainly used for transmitting data and outputting text data obtained by system translation and recognition to the terminal module;
the terminal module is a mobile phone end APP, and the final translation recognition result is received through the Bluetooth device and displayed on the mobile phone APP.
The working process of the data device shown in fig. 2 on the handset side is as follows:
(1) the user wears the data device; bending a finger changes the resistance of the corresponding bending sensor and hence the sampled voltage, while rotating the wrist changes the palm orientation and hence the gyroscope readings;
(2) the i.MX6ULL microcontroller acquires the changed voltage values, together with the changes in the gyroscope's raw data, through the AD converter;
(3) the measured voltage changes are fed into the trained neural network to predict the current sign, and the predicted result is matched to the corresponding text;
(4) transmitting the text data in a data packet form by using a communication module;
(5) and the mobile terminal of the mobile phone receives the data packet and displays the final translation recognition result on the APP of the mobile phone terminal in a text form.
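Steps (2) through (5) amount to a read-normalize-predict-transmit loop. The sketch below is a hypothetical host-side illustration: `read_adc_channels` and `uart_send` stand in for the real i.MX6ULL ADC and HC-05 UART drivers, and the weights, vocabulary, and packet format are invented for the example.

```python
import json
import numpy as np

# Hypothetical stand-ins for device I/O: on real hardware these would read
# the i.MX6ULL ADC channels and write to the HC-05 Bluetooth module's UART.
def read_adc_channels() -> np.ndarray:
    # 5 flex-sensor voltages + 6 MPU6050 axis readings (fabricated values).
    return np.array([1.2, 0.8, 2.1, 1.9, 0.5, 0.1, -0.2, 9.8, 0.0, 0.0, 0.3])

def uart_send(packet: bytes) -> None:
    print(packet.decode())  # a real driver would write to the serial port

# Toy weights standing in for the transplanted matrices; the vocabulary
# and JSON packet format are likewise invented for the example.
rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(11, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)
x_min, x_max = np.full(11, -1.0), np.full(11, 10.0)
LABELS = ["hello", "thanks", "goodbye"]

def translate_once() -> str:
    x = read_adc_channels()
    x = (x - x_min) / (x_max - x_min)             # same min-max scaling as training
    h = np.tanh(x @ W1 + b1)                      # step (3): neural network
    word = LABELS[int((h @ W2 + b2).argmax())]    # class index -> text
    uart_send(json.dumps({"text": word}).encode())  # step (4): packetize and send
    return word

translate_once()
```

On the phone side, step (5) is the inverse: receive the packet over Bluetooth, decode it, and display the text field on the APP.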
As shown in fig. 3, after the acquisition, labeling and preprocessing of the sign language actions are completed, training of the neural network is performed, a part a on the left side of fig. 3 is trained by using a sign language database composed of the acquired action samples, and a part b on the right side of fig. 3 is used for real-time recognition of the sign language actions by using the trained weight matrix.
As shown in fig. 3, data acquisition is first performed on the sign language actions of device users using the data device, with different users completing acquisition in batches at different times, thereby building a sign language database for training and verification. All data are then preprocessed by category. Finally, the structure and activation function of the neural network are determined from the length and type of the data, and after training is completed and verification yields stable results, the weight matrix to be transplanted to the data glove is determined. In addition, action codes are introduced to avoid repeated recognition of static actions.
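The patent introduces "action codes" to avoid repeated recognition of static actions but does not spell out the mechanism. One plausible minimal reading, sketched below, is a debounce filter: a result is emitted only when a code has been stable for several frames and differs from the last emission.

```python
from typing import Optional

class ActionCodeFilter:
    """Suppress repeated output while a static sign is held.

    A minimal interpretation of the patent's "action code": each recognized
    sign carries a code, and a result is emitted only when the code has been
    stable for `hold` consecutive frames AND differs from the last emission.
    """
    def __init__(self, hold: int = 3):
        self.hold = hold
        self.last_emitted: Optional[int] = None
        self.candidate: Optional[int] = None
        self.count = 0

    def update(self, code: int) -> Optional[int]:
        if code == self.candidate:
            self.count += 1
        else:
            self.candidate, self.count = code, 1
        if self.count >= self.hold and code != self.last_emitted:
            self.last_emitted = code
            return code      # emit once per newly stable sign
        return None          # suppress repeats while the sign is held

f = ActionCodeFilter(hold=2)
stream = [5, 5, 5, 5, 7, 7, 7]   # per-frame class codes from the network
emitted = [c for c in (f.update(s) for s in stream) if c is not None]
print(emitted)  # [5, 7]: one event per distinct held sign
```

The `hold` threshold trades latency for stability: larger values filter out transient misclassifications between signs at the cost of a slower first emission.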

Claims (3)

1. A real-time sign language translation system device of a neural network is characterized by comprising a main control module, a gesture module, a communication module and a terminal module;
the main control module is a microcontroller and is used for acquiring signals from the gesture module, processing the signals and transmitting the signals to the terminal module through the communication module;
the gesture module is used for collecting sign language action data, and the gesture module is divided into two parts: the bending sensor for collecting data when the fingers are bent and the MPU6050 chip module for detecting the orientation of the palm are connected with the main control module through the connecting circuit;
the communication module adopts an HC-05 Bluetooth module, is connected with the main control module through a serial port and is used as a data transmission channel;
the terminal module adopts a form of a mobile phone end APP, and receives a final recognition translation result from the main control module through the communication module.
2. The real-time sign language translation system equipment of claim 1, wherein the main control module uses an i.MX6ULL chip and adopts a sign language recognition algorithm based on a neural network; the i.MX6ULL chip has an ARM Cortex-A7 core and a 12-bit ADC;
the gesture module is divided into an MPU6050 chip module for detecting the orientation of the palm and bending sensors for collecting data when the fingers bend; the MPU6050 chip module integrates a three-axis accelerometer and a three-axis gyroscope into a six-axis motion processing component; a bending sensor is mounted on each finger and connected with the main control module through the connecting circuit, bending a finger deforms its sensor, and the resulting signal is converted into a voltage output through the amplifying circuit and the ADC;
the communication module uses an HC-05 Bluetooth module connected with the main control module through a serial interface; the HC-05 works in half-duplex mode, each module can both transmit and receive, at least two modules are required for point-to-point wireless data communication, and the module's UART interface connects conveniently to an I/O port of the microcontroller;
the terminal module receives the final recognition translation result through the communication module, and the result is broadcasted and displayed in a voice and text mode through a matched mobile phone APP.
3. The device of claim 2, wherein the design process of the neural network-based sign language recognition algorithm comprises acquisition of sign language data, preprocessing of data, training and recognition of a neural network, and avoidance of repeated recognition of static actions;
the acquisition process of the sign language data comprises first establishing a sign language database from the collected actions, then performing state labeling according to different states such as static or dynamic, curling and straightening of the fingers, and palm orientation; a few samples may exhibit isolated missing data during collection, and after abnormal data are removed, the mean of the remaining data is used in place of the abnormal or missing values;
in the data preprocessing process, min-max (maximum-minimum) normalization is adopted for data normalization, which avoids the influence on analysis results of the bending sensors' differing ranges across different people's fingers and places all indicators on the same order of magnitude;
in the training and recognition process of the neural network, a sign language database composed of the collected action samples is used for training, and the trained weight matrix is then used for real-time recognition of actions;
the process of avoiding repeated identification of static actions adopts a method of introducing action codes;
the 12-bit ADC acquires the voltage changes of the bending sensors over multiple channels; direct memory access (DMA) transfer provides high-speed data transmission and meets the requirement of uninterrupted data acquisition.
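Claim 3's preprocessing, removing abnormal or missing samples, filling them with the mean of the remaining data, and min-max normalizing each channel, can be sketched as follows. The median/MAD outlier rule is an illustrative assumption, since the claim does not say how abnormal data are detected.

```python
import numpy as np

def clean_channel(values, k=5.0):
    """Replace abnormal or missing readings with the mean of the remaining
    data, as claim 3 describes. The median/MAD outlier rule is an
    illustrative choice, not specified by the patent."""
    v = np.asarray(values, dtype=float)
    ok = np.isfinite(v)                      # drop missing (NaN) readings
    med = np.median(v[ok])
    mad = np.median(np.abs(v[ok] - med))     # robust spread estimate
    if mad > 0:
        ok &= np.abs(v - med) <= k * mad     # drop gross outliers
    v[~ok] = v[ok].mean()                    # fill gaps with the remaining mean
    return v

def min_max_normalize(x):
    """Max-min standardization from claim 3: map each channel into [0, 1]
    so all indicators share the same order of magnitude."""
    x = np.asarray(x, dtype=float)
    lo, hi = x.min(axis=0), x.max(axis=0)
    return (x - lo) / np.where(hi > lo, hi - lo, 1.0)

raw = [2.01, 1.98, float("nan"), 2.05, 40.0, 1.97]  # one missing, one outlier
print(clean_channel(raw))  # NaN and 40.0 both become 2.0025, the remaining mean
```

Per-channel normalization is what compensates for different users' finger sizes: each bending sensor's own observed range, not a global range, defines its [0, 1] scale.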
CN202110358848.1A 2021-04-02 2021-04-02 Neural network real-time sign language translation system equipment Pending CN112883935A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110358848.1A CN112883935A (en) 2021-04-02 2021-04-02 Neural network real-time sign language translation system equipment

Publications (1)

Publication Number Publication Date
CN112883935A true CN112883935A (en) 2021-06-01

Family

ID=76040483

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110358848.1A Pending CN112883935A (en) 2021-04-02 2021-04-02 Neural network real-time sign language translation system equipment

Country Status (1)

Country Link
CN (1) CN112883935A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113434042A (en) * 2021-06-29 2021-09-24 深圳市阿尓法智慧科技有限公司 Deaf-mute interactive AI intelligent navigation device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG Wei: "Research on a Sign Language Translation System Based on Neural Networks" (基于神经网络的手语翻译系统研究), China Excellent Master's Theses Full-text Database, Information Science and Technology Series (Monthly) *


Similar Documents

Publication Publication Date Title
CN105511615B (en) Wearable text input system and method based on EMG
JP2021072136A (en) Methods and devices for combining muscle activity sensor signals and inertial sensor signals for gesture-based control
CN204791666U (en) Portable intelligent sign language interpreter device
CN102063825A (en) Sign language recognizing device based on data gloves
CN201936248U (en) Sign language recognition device based on data glove
CN210402266U (en) Sign language translation system and sign language translation gloves
CN111562842B (en) Virtual keyboard design method based on electromyographic signals
Bhardwaj A Review Paper on Smart Glove-Converts Gestures into Speech and Text
CN112883935A (en) Neural network real-time sign language translation system equipment
CN104980599A (en) Sign language-voice call method and sign language-voice call system
CN110865709A (en) Flexible sensor-based gesture recognition system and method and glove
CN110807471A (en) Behavior recognition system and recognition method of multi-mode sensor
CN110189590A (en) A kind of adaptively correcting formula sign language mutual translation system and method
Tao et al. Research on communication APP for deaf and mute people based on face emotion recognition technology
CN109567819A (en) A kind of wearable device based on PVDF sensor array
CN110413106B (en) Augmented reality input method and system based on voice and gestures
CN111831122A (en) Gesture recognition system and method based on multi-joint data fusion
CN112099669B (en) Electret capacitive pressure sensor unit array for wrist back gesture recognition
CN111766941B (en) Gesture recognition method and system based on intelligent ring
CN213582081U (en) Gesture recognition system based on flexible antibacterial biological membrane multi-channel data acquisition module
Manware et al. Smart Gloves as a Communication Tool for the Speech Impaired and Hearing Impaired
CN209070491U (en) A kind of pliable pressure sensing hand language recognition device
CN110764621A (en) Self-powered intelligent touch glove and mute gesture broadcasting system
Zhang et al. Stacked LSTM-Based Dynamic Hand Gesture Recognition with Six-Axis Motion Sensors
CN111367400A (en) Gesture recognition device based on flexible sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210601
