CN112402194A - Auxiliary terminal system for visually impaired people - Google Patents

Auxiliary terminal system for visually impaired people

Info

Publication number
CN112402194A
Authority
CN
China
Prior art keywords
camera
sensor
visually impaired
information
processing unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910784819.4A
Other languages
Chinese (zh)
Inventor
黄宗元 (Huang Zongyuan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Hyperx Ai Computing Technology Co ltd
Original Assignee
Beijing Hyperx Ai Computing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date: 2019-08-23
Publication date: 2021-02-26
Application filed by Beijing Hyperx Ai Computing Technology Co ltd filed Critical Beijing Hyperx Ai Computing Technology Co ltd
Priority to CN201910784819.4A
Publication of CN112402194A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00Appliances for aiding patients or disabled persons to walk about
    • A61H3/06Walking aids for blind persons
    • A61H3/061Walking aids for blind persons with electronic detecting or guiding means

Abstract

The invention relates to an auxiliary terminal system for visually impaired people, comprising a glasses end and a host end. The glasses end comprises a sensor unit, a depth image processing unit, a color image processing unit and a sound processing unit; the sensor unit comprises a first camera, a second camera, a third camera, an inertial measurement sensor, an ultrasonic sensor and an infrared distance sensor. The system can acquire environmental information and the wearer's own state information, and can provide object recognition, three-dimensional scene reconstruction, collision risk detection, map navigation, voice interaction, tactile prompting and other functions. Combining the depth camera, the miniature ultrasonic sensor and the inertial measurement unit greatly reduces system power consumption and the performance required of the host end, which prolongs single-charge use time for the visually impaired and reduces the annoyance of frequent charging; at the same time, higher-quality, more accurate and richer environmental information can be provided, bringing greater convenience to the daily lives of visually impaired people.

Description

Auxiliary terminal system for visually impaired people
Technical Field
The invention relates to the field of assistive devices for visually impaired people, and in particular to an auxiliary terminal system for visually impaired people.
Background
In the field of assistive devices for visually impaired people, conventional aids have the following shortcomings: 1. they use only sound signals as the medium of information interaction, so interaction is unfriendly and spatial information cannot be fed back accurately in real time; 2. their sensors are single-function and cannot construct real environmental information accurately in real time and feed it back to the visually impaired, which greatly inconveniences visually impaired people when they travel.
Disclosure of Invention
The present invention is directed to solving the above-mentioned problems of conventional visual impairment aids.
In order to achieve the above object, the present application provides an auxiliary terminal system for visually impaired people, comprising: a glasses end and a host end; the glasses end comprises: a sensor unit, a depth image processing unit, a color image processing unit and a sound processing unit;
the sensor unit comprises a first camera, a second camera, a third camera, an inertial measurement sensor, an ultrasonic sensor and an infrared distance sensor;
the depth image processing unit is connected with the first camera and the second camera, and is used for acquiring image information of the environment around the glasses end through the two cameras, outputting the original image and the corresponding depth map, and transmitting both to the host end through a data line;
the color image processing unit is connected with the third camera, and is used for acquiring image information of the environment around the glasses end through the third camera and transmitting it to the host end through a data line;
the ultrasonic sensor is used for acquiring distance information of obstacles around the glasses end and transmitting it to the host end through a data line;
the inertial measurement sensor is used for acquiring the attitude information of the glasses end itself and transmitting it to the host end through a data line;
the host end is used for dynamically reconstructing the surrounding scene through computer image recognition and graphic-geometry techniques from the environmental information acquired by the sensor unit and the attitude of the glasses, judging the collision risk between surrounding obstacles and the visually impaired person, and then emitting a sound reminder signal and a tactile prompt signal; or identifying the type and position of surrounding objects through computer image recognition and then emitting a sound reminder signal.
Preferably, the system further comprises a microphone. The host end collects the visually impaired person's voice through the microphone, converts the speech to text with an offline or online speech recognition algorithm, searches for and confirms the destination with an offline or online map destination search function, and then plans a navigation route offline or online to guide the visually impaired person to the destination.
Preferably, the system further comprises an infrared distance sensor mounted on the glasses end facing the wearer's forehead; a corresponding algorithm can detect whether the user is wearing and normally using the product. If not, the system automatically enters a standby mode after a timeout and then automatically shuts down after a further timeout, intelligently managing the system's energy consumption, prolonging its normal working time, and sparing visually impaired people the unnecessary trouble of frequent charging.
Preferably, the color image processing unit of the system comprises an auto-focusing module. The auto-focusing module has a gesture focusing function, can be controlled by a touch sensor, and can take high-resolution pictures; wherever the user's finger points, that position is used as the focus center.
Preferably, the system further comprises a touch sensor, an infrared distance sensor and a global navigation satellite system.
Preferably, the processing unit at the host end of the system is an ARM-architecture application processor used to implement acquisition of environmental and self-state information, object recognition, three-dimensional scene reconstruction, collision risk detection, map navigation, voice interaction and tactile sensing; the touch sensor realizes information interaction between the terminal and the person through conversion of bioelectric signals.
Combining the depth camera, the miniature ultrasonic sensor and the inertial measurement unit greatly reduces system power consumption and the performance required of the host end, which prolongs single-charge use time for the visually impaired and reduces the annoyance of frequent charging; at the same time, higher-quality, more accurate and richer environmental information can be provided, bringing greater convenience to the daily lives of visually impaired people.
Drawings
Fig. 1 is an architecture diagram of an auxiliary terminal system for visually impaired people according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is an architecture diagram of an auxiliary terminal system for visually impaired people according to an embodiment of the present invention. As shown in Fig. 1, the auxiliary terminal system for visually impaired people includes: a glasses end and a host end.
The glasses end comprises: a sensor unit, a depth image processing unit, a color image processing unit and a sound processing unit. The sensor unit comprises a first camera, a second camera, a third camera, an inertial measurement unit (IMU), an ultrasonic sensor and an infrared distance sensor. The depth image processing unit is connected with the first camera and the second camera; it acquires image information of the environment around the glasses end through the two cameras and outputs the original image (RGB or gray-scale) and the corresponding depth map to the host end through a data line. The color image processing unit is connected with the third camera; it acquires image information of the environment around the glasses end through the third camera and transmits it to the host end through a data line. The inertial measurement sensor acquires the attitude information of the glasses end itself and transmits it to the host end through a data line. The ultrasonic sensor acquires distance information of obstacles around the glasses end and transmits it to the host end through a data line. The ultrasonic sensor is a miniature ultrasonic sensor: small and low-power, it is easy to mount in the limited space of the glasses end, and combined with the depth sensor it can recognize not only ordinary environments and scenes but also single-plane scenes such as white walls, glass, mirrors, windows and water surfaces, improving the safety of visually impaired people in daily life.
According to the environmental information acquired by the sensor unit and the attitude of the glasses, the host end dynamically reconstructs the surrounding scene through computer image recognition and graphic-geometry techniques, such as an extended Kalman filter algorithm based on data association; it then judges the collision risk between surrounding obstacles and the visually impaired person and emits a sound reminder signal and a tactile prompt signal, or identifies the type and position of surrounding objects through computer image recognition and then emits a sound reminder signal.
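As a hedged illustration of the filtering step, the following Python sketch fuses successive depth-camera and ultrasonic range readings for a single obstacle with a one-dimensional Kalman filter, the linear special case of the extended Kalman filter named above. The noise variances and walking speed are illustrative assumptions, not values from the patent.

    class RangeKF:
        """1-D Kalman filter over the distance to one obstacle."""
        def __init__(self, r0_m, var0=1.0):
            self.x, self.p = r0_m, var0      # distance estimate (m) and its variance

        def predict(self, walk_speed_mps, dt_s, q=0.05):
            self.x -= walk_speed_mps * dt_s  # wearer walks toward the obstacle
            self.p += q                      # process noise inflates uncertainty

        def update(self, z_m, r):
            k = self.p / (self.p + r)        # Kalman gain; r is the measurement variance
            self.x += k * (z_m - self.x)
            self.p *= (1.0 - k)

    kf = RangeKF(3.0)
    kf.predict(walk_speed_mps=1.0, dt_s=0.1)
    kf.update(z_m=2.85, r=0.04)   # depth-camera reading (noisier)
    kf.update(z_m=2.90, r=0.01)   # ultrasonic reading (tighter)
    print(round(kf.x, 2))         # about 2.89 m

A full implementation would track many obstacles and use the data association the patent mentions to decide which measurement belongs to which obstacle; the structure of the predict/update loop stays the same.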
In one embodiment, when the wearer of the glasses, for example a visually impaired person, is 3 m from a door, the first and second cameras act as the wearer's two eyes: they capture images of the real environment and pass them to the depth image processing unit, which uses triangulation on the captured image pair to compute the depth of each pixel in the field of view, generates the corresponding depth map, and transmits the first camera's original image together with the depth map to the host through the USB interface.
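For the triangulation step, a minimal Python sketch: with a rectified stereo pair, the depth of a pixel follows from its disparity as Z = f * B / d, where f is the focal length in pixels and B the baseline between the first and second cameras. The focal length and baseline below are hypothetical calibration values; the patent specifies neither.

    import numpy as np

    def depth_from_disparity(disparity_px, focal_length_px=700.0, baseline_m=0.06):
        """Depth map (m) from a disparity map (px): Z = f * B / d."""
        depth = np.full(disparity_px.shape, np.inf)
        valid = disparity_px > 0              # zero disparity means a point at infinity
        depth[valid] = focal_length_px * baseline_m / disparity_px[valid]
        return depth

    # A door 3 m away gives disparity d = f * B / Z = 700 * 0.06 / 3 = 14 px.
    print(depth_from_disparity(np.array([[14.0]])))   # [[3.]]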
Meanwhile, the IMU transmits head attitude data to the host in real time through the USB interface. When an interrupt signal arrives, the frame number of the current image is read from the image processing unit over the I2C interface and attached to the IMU data for that moment.
The ultrasonic data are collected in real time by a single-chip microcomputer and transmitted to the host through the USB interface. After the host receives the depth map, the first camera's original image and the IMU data, it uses the YOLO algorithm on the original image to detect whether there is a door in the field of view; if a door is detected, the IMU attitude data are used to determine whether the visually impaired person is facing it. If so, the distance to the door is judged in real time and the degree of collision risk is announced by sound. Meanwhile, the depth map and the original image are used to reconstruct the surrounding scene through a Kalman filter algorithm, and the collision risk posed by other surrounding objects is also reported by sound. The received ultrasonic data additionally allow the distance to the door, and whether the door is open or closed, to be judged more accurately, so that the visually impaired person can be guided through the door correctly.
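A hedged sketch of this decision logic, assuming a YOLO-style detector output. The facing tolerance and the distance thresholds are assumptions (the patent grades risk but gives no numbers), and the ultrasonic reading is preferred where available because it stays reliable on glass and other single-plane surfaces where stereo depth degrades.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Detection:
        label: str          # class name from a YOLO-style detector, e.g. "door"
        bearing_deg: float  # horizontal angle to the object in the head frame
        depth_m: float      # distance read from the depth map at the detection

    def door_collision_risk(det: Detection, heading_deg: float,
                            ultrasonic_m: Optional[float]) -> str:
        """Grade the collision risk for a detected door."""
        if det.label != "door":
            return "none"
        if abs(heading_deg - det.bearing_deg) > 20.0:   # wearer is not facing the door
            return "none"
        distance = ultrasonic_m if ultrasonic_m is not None else det.depth_m
        if distance < 1.0:
            return "high"    # immediate voice warning
        if distance < 3.0:
            return "medium"  # periodic voice reminder
        return "low"

    print(door_collision_risk(Detection("door", 2.0, 3.0), 0.0, 2.8))   # medium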
The whole system keeps reconstructing the scene and judging the degree of collision risk in real time as the distance between the visually impaired person and the door changes, until the person passes smoothly through the door ahead.
The processing unit at the host end is an ARM-architecture application processor that implements acquisition of environmental and self-state information, object recognition, three-dimensional scene reconstruction, collision risk detection, map navigation, voice interaction and tactile sensing; the touch sensor realizes information interaction between the terminal and the person through conversion of bioelectric signals.
This arrangement reduces the load on the host end and the demands on its processing capacity; meanwhile, computing the depth map on a dedicated chip improves computational efficiency while reducing and distributing power consumption.
As shown in Fig. 1, the auxiliary terminal system further includes a microphone, a touch sensor, an infrared distance sensor and a global navigation satellite system. The host end collects the visually impaired person's voice through the microphone, converts the speech to text with an offline or online speech recognition algorithm, for example one based on natural language processing, then searches for and confirms the destination with an offline or online map destination search function, and then plans a navigation route offline or online to guide the visually impaired person to the destination.
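A hedged sketch of this voice-to-navigation flow. The tiny in-memory map and the straight-line route below are illustrative stand-ins: the patent only states that recognition, destination search and route planning may each run offline or online, so real engines would be swapped in behind the same three steps.

    from typing import List, Tuple

    OFFLINE_MAP = {"pharmacy": (39.908, 116.397), "bus stop": (39.910, 116.400)}

    def recognize_speech(utterance: str) -> str:
        # Stand-in for the offline/online speech recognizer: assume text already.
        return utterance.strip().lower()

    def search_destination(query: str) -> Tuple[float, float]:
        return OFFLINE_MAP[query]             # offline map destination search

    def plan_route(start, goal, steps: int = 3) -> List[Tuple[float, float]]:
        # Straight-line interpolation as a placeholder for real route planning.
        return [(start[0] + (goal[0] - start[0]) * i / steps,
                 start[1] + (goal[1] - start[1]) * i / steps)
                for i in range(steps + 1)]

    dest = search_destination(recognize_speech("Pharmacy"))
    print(plan_route((39.906, 116.395), dest))   # waypoints read out to the wearer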
The infrared distance sensor is mounted on the glasses end facing the wearer's forehead; together with the camera's position and attitude information, it determines whether the user is wearing and normally using the glasses. If not, the system automatically enters a standby mode after a timeout and then automatically shuts down after a further timeout, intelligently managing the system's energy consumption, prolonging its normal working time, and sparing visually impaired people the unnecessary trouble of frequent charging.
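A minimal sketch of this wear-detection power management. The infrared threshold and the two timeouts are assumptions; the patent describes the standby-then-shutdown behavior but gives no values.

    WEAR_THRESHOLD_M = 0.05     # forehead expected within ~5 cm (assumed)
    STANDBY_TIMEOUT_S = 60.0    # assumed first timeout
    SHUTDOWN_TIMEOUT_S = 300.0  # assumed second timeout

    def power_state(ir_distance_m: float, seconds_unworn: float) -> str:
        if ir_distance_m <= WEAR_THRESHOLD_M:
            return "active"                      # glasses are on the wearer's head
        if seconds_unworn >= SHUTDOWN_TIMEOUT_S:
            return "off"                         # second timeout: power down
        if seconds_unworn >= STANDBY_TIMEOUT_S:
            return "standby"                     # first timeout: low-power mode
        return "active"

    print(power_state(0.5, 120.0))   # standby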
As an improvement, the color image processing unit comprises an auto-focusing module. The auto-focusing module has a gesture focusing function, can be controlled by a touch sensor, and can take high-resolution pictures; wherever the user's finger points, that position is used as the focus center.
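A minimal sketch of gesture focusing: the fingertip's pixel position becomes the center of a small auto-focus window. The fingertip detector is a hypothetical stand-in, since the patent does not say how the fingertip is located.

    from typing import Optional, Tuple

    def detect_fingertip(frame_wh: Tuple[int, int]) -> Optional[Tuple[int, int]]:
        # Stand-in detector: pretend the finger points slightly right of center.
        w, h = frame_wh
        return (int(w * 0.6), int(h * 0.5))

    def focus_roi(frame_wh: Tuple[int, int], half: int = 40):
        """(x0, y0, x1, y1) of the focus window centered on the fingertip."""
        w, h = frame_wh
        cx, cy = detect_fingertip(frame_wh) or (w // 2, h // 2)  # fall back to center
        return (max(cx - half, 0), max(cy - half, 0),
                min(cx + half, w - 1), min(cy + half, h - 1))

    print(focus_roi((1920, 1080)))   # (1112, 500, 1192, 580)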
As another improvement, in this embodiment a single-chip microcomputer keeps the camera and IMU data tightly synchronized, improving the accuracy of functions such as acquisition of environmental and self-state information, object recognition, three-dimensional scene reconstruction, collision risk detection and map navigation:
each time the depth processing chip captures an image from the camera, an interrupt signal and timestamp information are generated and transmitted to the single chip microcomputer. And after the single chip microcomputer receives the interrupt and the information, the time stamp is used for marking IMU data of the current frame. After receiving the depth map, the corresponding original map and the data of the IMU, the host side compares and aligns the time stamps and then carries out the following algorithm processing.
The above-mentioned embodiments describe the objects, technical solutions and advantages of the present invention in further detail. It should be understood that they are merely exemplary embodiments of the present invention and are not intended to limit its scope; any modifications, equivalent substitutions, improvements and the like made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (6)

1. An auxiliary terminal system for visually impaired people, comprising a glasses end and a host end, characterized in that the glasses end comprises: a sensor unit, a depth image processing unit, a color image processing unit and a sound processing unit;
the sensor unit comprises a first camera, a second camera, a third camera, an inertial measurement sensor and an ultrasonic sensor;
the depth image processing unit is connected with the first camera and the second camera, and is used for acquiring image information of the environment around the glasses end through the two cameras, outputting the original image and the corresponding depth map, and transmitting both to the host end through a data line;
the color image processing unit is connected with the third camera, and is used for acquiring image information of the environment around the glasses end through the third camera and transmitting it to the host end through a data line;
the ultrasonic sensor is used for acquiring distance information of obstacles in the environment around the glasses end and transmitting it to the host end through a data line;
the inertial measurement sensor is used for acquiring the attitude information of the glasses end itself and transmitting it to the host end through a data line;
the host end is used for dynamically reconstructing the surrounding scene through computer image recognition and graphic-geometry techniques from the environmental information acquired by the sensor unit and the attitude of the glasses, judging the collision risk between surrounding obstacles and the visually impaired person, and then emitting a sound reminder signal and a tactile prompt signal; or identifying the type and position of surrounding objects through computer image recognition and then emitting a sound reminder signal.
2. The system of claim 1, further comprising a microphone, wherein the host end collects the visually impaired person's voice through the microphone, converts the speech to text with an offline or online speech recognition algorithm, searches for and confirms the destination with an offline or online map destination search function, and then plans a navigation route offline or online to guide the visually impaired person to the destination.
3. The system of claim 1, further comprising an infrared distance sensor mounted on the glasses end facing the wearer's forehead, which can detect through a corresponding algorithm whether the user is wearing and normally using the product; if not, the system automatically enters a standby mode after a timeout and then automatically shuts down after a further timeout, intelligently managing the system's energy consumption, prolonging its normal working time, and sparing visually impaired people the unnecessary trouble of frequent charging.
4. The system of claim 1, wherein the color image processing unit comprises an auto-focusing module, the auto-focusing module has a gesture focusing function, can be controlled by a touch sensor, and can take high-resolution pictures; wherever the user's finger points, that position is used as the focus center.
5. The system of claim 1, further comprising a touch sensor, an infrared distance sensor and a global navigation satellite system.
6. The system of claim 1, wherein the processing unit at the host end is an ARM-architecture application processor used to implement acquisition of environmental and self-state information, object recognition, three-dimensional scene reconstruction, collision risk detection, map navigation, voice interaction and tactile sensing, and the touch sensor realizes information interaction between the terminal and the person through conversion of bioelectric signals.
CN201910784819.4A, priority and filing date 2019-08-23: Auxiliary terminal system for visually impaired people (status: Pending)

Priority Applications (1)

Application Number: CN201910784819.4A
Priority Date: 2019-08-23
Filing Date: 2019-08-23
Title: Auxiliary terminal system for visually impaired people

Applications Claiming Priority (1)

Application Number: CN201910784819.4A
Priority Date: 2019-08-23
Filing Date: 2019-08-23
Title: Auxiliary terminal system for visually impaired people

Publications (1)

Publication Number: CN112402194A
Publication Date: 2021-02-26

Family

ID=74779829

Family Applications (1)

Application Number: CN201910784819.4A
Title: Auxiliary terminal system for visually impaired people
Priority Date: 2019-08-23
Filing Date: 2019-08-23
Status: Pending

Country Status (1)

Country Link
CN (1) CN112402194A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114420131A (en) * 2022-03-16 2022-04-29 云天智能信息(深圳)有限公司 Intelligent voice auxiliary recognition system for weak eyesight

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103595919A (en) * 2013-11-15 2014-02-19 深圳市中兴移动通信有限公司 Manual focusing method and shooting device
CN104127302A (en) * 2014-07-31 2014-11-05 青岛歌尔声学科技有限公司 Walk safety navigation method for the visually impaired
CN104506765A (en) * 2014-11-21 2015-04-08 惠州Tcl移动通信有限公司 Mobile terminal and focusing method based on touch gesture of mobile terminal
US20160078278A1 (en) * 2014-09-17 2016-03-17 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable eyeglasses for providing social and environmental awareness
CN106597690A (en) * 2016-11-23 2017-04-26 杭州视氪科技有限公司 Visually impaired people passage prediction glasses based on RGB-D camera and stereophonic sound
CN106843491A (en) * 2017-02-04 2017-06-13 上海肇观电子科技有限公司 Smart machine and electronic equipment with augmented reality
CN108245385A (en) * 2018-01-16 2018-07-06 曹醒龙 A kind of device for helping visually impaired people's trip
CN109085926A (en) * 2018-08-21 2018-12-25 华东师范大学 A kind of the augmented reality system and its application of multi-modality imaging and more perception blendings

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103595919A (en) * 2013-11-15 2014-02-19 深圳市中兴移动通信有限公司 Manual focusing method and shooting device
CN104127302A (en) * 2014-07-31 2014-11-05 青岛歌尔声学科技有限公司 Walk safety navigation method for the visually impaired
US20160078278A1 (en) * 2014-09-17 2016-03-17 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable eyeglasses for providing social and environmental awareness
CN104506765A (en) * 2014-11-21 2015-04-08 惠州Tcl移动通信有限公司 Mobile terminal and focusing method based on touch gesture of mobile terminal
CN106597690A (en) * 2016-11-23 2017-04-26 杭州视氪科技有限公司 Visually impaired people passage prediction glasses based on RGB-D camera and stereophonic sound
CN106843491A (en) * 2017-02-04 2017-06-13 上海肇观电子科技有限公司 Smart machine and electronic equipment with augmented reality
CN108245385A (en) * 2018-01-16 2018-07-06 曹醒龙 A kind of device for helping visually impaired people's trip
CN109085926A (en) * 2018-08-21 2018-12-25 华东师范大学 A kind of the augmented reality system and its application of multi-modality imaging and more perception blendings

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114420131A (en) * 2022-03-16 2022-04-29 云天智能信息(深圳)有限公司 Intelligent voice auxiliary recognition system for weak eyesight
CN114420131B (en) * 2022-03-16 2022-05-31 云天智能信息(深圳)有限公司 Intelligent voice auxiliary recognition system for weak eyesight

Similar Documents

Publication Publication Date Title
US10498944B2 (en) Wearable apparatus with wide viewing angle image sensor
US20150379896A1 (en) Intelligent eyewear and control method thereof
US20130250078A1 (en) Visual aid
KR102242681B1 (en) Smart wearable device, method and system for recognizing 3 dimensional face and space information using this
Sáez et al. Aerial obstacle detection with 3-D mobile devices
CN111643324A (en) Intelligent glasses for blind people
EP3413165B1 (en) Wearable system gesture control method and wearable system
KR20090036183A (en) The method and divice which tell the recognized document image by camera sensor
Owayjan et al. Smart assistive navigation system for blind and visually impaired individuals
WO2015093130A1 (en) Information processing device, information processing method, and program
CN105116544A (en) Electronic glasses operated by head
CN106937909A (en) A kind of intelligent assisting blind glasses system
CN112402194A (en) Auxiliary terminal system for visually impaired people
CN112002186B (en) Information barrier-free system and method based on augmented reality technology
TW201629924A (en) Visual assistant system and wearable device having the same
Biswas et al. Shortest path based trained indoor smart jacket navigation system for visually impaired person
KR101191640B1 (en) Apparatus and method for providing information to user
Rahman et al. An automated navigation system for blind people
CN108125776A (en) A kind of glasses for guiding blind
Botezatu et al. Development of a versatile assistive system for the visually impaired based on sensor fusion
Vyavahare et al. Assistant for visually impaired using computer vision
CN112494290A (en) Navigation glasses
Imtiaz et al. Wearable scene classification system for visually impaired individuals
AU2021100117A4 (en) Ankle band for identifying nearest obstacles
CN115079833B (en) Multilayer interface and information visualization presenting method and system based on somatosensory control

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
WD01: Invention patent application deemed withdrawn after publication (application publication date: 20210226)