CN110623820A - Wearable intelligent blind guiding device - Google Patents

Wearable intelligent blind guiding device

Info

Publication number
CN110623820A
Authority
CN
China
Prior art keywords
blind
module
equipment
information
blind guiding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910639192.3A
Other languages
Chinese (zh)
Inventor
宛处好
陈雨濛
杨力川
黄昕阳
吕乐斌
黄卓斌
马宇辰
蔡家杰
徐哲
徐鹏
黄惟暄
高阳
杨晓玲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201910639192.3A priority Critical patent/CN110623820A/en
Publication of CN110623820A publication Critical patent/CN110623820A/en
Pending legal-status Critical Current

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H3/06 Walking aids for blind persons
    • A61H3/061 Walking aids for blind persons with electronic detecting or guiding means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 Sonar systems specially adapted for specific applications
    • G01S15/93 Sonar systems specially adapted for specific applications for anti-collision purposes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/01 Constructive details
    • A61H2201/0173 Means for preventing injuries
    • A61H2201/0184 Means for preventing injuries by raising an alarm
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/16 Physical interface with patient
    • A61H2201/1602 Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
    • A61H2201/165 Wearable interfaces
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50 Control means thereof
    • A61H2201/5007 Control means thereof computer controlled
    • A61H2201/501 Control means thereof computer controlled connected to external computer devices or networks
    • A61H2201/5015 Control means thereof computer controlled connected to external computer devices or networks using specific interfaces or standards, e.g. USB, serial, parallel
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50 Control means thereof
    • A61H2201/5023 Interfaces to the user
    • A61H2201/5048 Audio interfaces, e.g. voice or music controlled
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50 Control means thereof
    • A61H2201/5058 Sensors or detectors
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50 Control means thereof
    • A61H2201/5058 Sensors or detectors
    • A61H2201/5064 Position sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Epidemiology (AREA)
  • Pain & Pain Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Rehabilitation Therapy (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Acoustics & Sound (AREA)
  • Navigation (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a wearable intelligent blind guiding device. When the device is started, the user is first located: GPS is used in outdoor environments, while gyroscope and binocular vision camera positioning is used indoors. Once positioning is complete, the device recognizes the user's voice command and calls map software to plan the optimal route. While the user travels, the SOC analyzes the data transmitted by the ultrasonic matrix radar and the camera (convolutional neural network, SLAM scanning imaging, and pattern recognition), processes information about the surrounding environment, builds a real-time 3D model of it, plans paths through complex environments, and gives real-time voice obstacle-avoidance prompts, achieving obstacle avoidance without blind spots.

Description

Wearable intelligent blind guiding device
Technical Field
The invention relates to blind guidance, and in particular to a wearable intelligent blind guiding device.
Background
As a special group in society, the blind and the severely visually impaired face two main problems in daily life: how to travel safely and how to interact with the surrounding environment. The electronic blind guiding systems successfully developed to date fall roughly into three categories: ultrasonic blind guiding instruments, mobile robots, and guiding canes. The ultrasonic blind guiding instrument locates obstacles by transmitting ultrasonic waves and receiving the reflected echoes; its drawbacks are that the user must continuously perform scanning motions while walking, which slows travel, and that detection blind spots remain. The mobile robot is an ideal electronic blind guiding device whose performance greatly exceeds that of the ultrasonic instrument, but its structure is complicated, and shortcomings such as high cost and a limited range of action prevent wide application. The guiding cane essentially removes the mobile robot's power system while keeping its intelligent sensing and control parts; it is currently the most practical electronic blind guiding device, but it suffers from a single detection mode and weak interaction capability, and its high cost hinders popularization. Biological guide dogs can also help the blind move about, but their training and adaptation periods are too long and their cost is high, so their usage rate is very low. Finally, existing technical schemes adopt only an ultrasonic module or a single camera sensor and cannot keep the working mode and state stable.
For example, in an environment with multiple noise pollution sources, data reception by the acoustic sensor may be abnormal, and in very dark or very bright environments, data acquisition by the camera may be abnormal. As for the interaction mode, current devices only use a voice module to announce obstacles and cannot realize bidirectional interaction between user and device, i.e., the user controlling the device by voice while the device announces road conditions and navigation information by voice. No existing solution addresses device state control and power management. Likewise, there is currently no solution for indoor positioning, none for identifying the type of obstacle (such as stairs, pillars, or potholes), and none for user safety early warning.
Disclosure of Invention
The wearable intelligent blind guiding device carries an ultrasonic matrix radar, a three-dimensional depth-of-field camera, an SOC (system-on-chip), a natural language interaction processing chip, a GPS outdoor locator, and an indoor gyroscope with a binocular camera for inertial positioning. For the interaction mode, the invention adopts natural language bidirectional interaction, letting the user control the device and the device announce road condition information. For obstacle type identification, the measurement outputs of the various sensors are delivered to the SOC chip, which computes and draws a spatial stereogram of the user's surroundings; algorithms such as the support vector machine, the perceptron, the convolutional neural network, and the hidden Markov model perform environment pattern matching and obstacle type identification. For working stability, the invention provides a solution that considers noise pollution and light pollution at the same time: the ultrasonic matrix radar, the three-dimensional depth-of-field camera, and the other sensors work cooperatively, ensuring the blind guiding device remains stable and reliable in complex, changeable environments. For indoor positioning, the invention provides a solution based on a nine-axis gyroscope and a binocular camera. For device state control and power management, the invention adopts an indoor high-function mode, a minimum power consumption mode, a lower power consumption mode, and an outdoor mode; the set of sensors active in each mode differs, which is how power is managed.
For user safety early warning, the invention adopts an on-board safety early warning scheme that monitors the hardware system's current in real time and sends the user's geographic information and other data to the user's family when the device behaves abnormally. The technical scheme of the invention is a wearable intelligent blind guiding device comprising: a data acquisition system, a data processing system, and a human-computer interaction system, where the data processing system is responsible for data processing and transmission for the other two. The data acquisition system includes a wearable frame fixed on the wearer's head, a fixing plate arranged on the part of the frame positioned at the wearer's face, two ultrasonic radar arrays mounted on the two sides of the fixing plate, and three cameras arranged transversely in sequence between the two arrays. The data processing system takes a ZYNQ SOC with DDR3 SODIMM memory as its core. Internally it is divided into a PL (programmable digital logic) part and a PS (ARM Cortex-A9 processing system) part. The PL part performs digital acceleration on the data, converting serial data streams into parallel processing. Data from the three-dimensional depth-of-field camera and the ultrasonic matrix radar are controlled and compiled through the SDK and preprocessed by the PS, which then passes the data to the PL part via the AMBA bus. The human-computer interaction system comprises an earphone and a microphone.
In the present invention, data processing and interaction among the modules of the data processing system proceed as follows. Data packets returned by the voice interaction module, the GPS, the nine-axis gyroscope, and the ultrasonic matrix module are packed according to the IIC communication protocol and transmitted to the L4 processing unit in the SOC through a UART serial port. The data stream produced by the voice signal is parsed into user instructions by a speech recognition algorithm, and these instructions are the basis of the voice control function. The depth camera module collects scene information and transmits the corresponding data signals over a serial interface or a universal serial bus channel; after preprocessing under SDK control, the data are returned to the ZYNQ unit in the SOC via the AMBA bus, where an SVM (support vector machine) performs pattern recognition on the image information. The recognition results are returned to the L4 processor, which analyzes the user's environment by combining the peripheral devices' data streams with the content of the user's instructions; according to this combined analysis, the L4 processing unit directs the speech synthesis module to generate the corresponding voice signal and transmit it to the earphone, realizing functions such as real-time obstacle avoidance.
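The packet flow described above — peripheral modules framing readings for the L4 processing unit over a serial link — can be sketched as follows. The frame layout used here (header byte, module id, payload length, payload, 8-bit checksum) is a hypothetical illustration, not a format specified in the patent.

```python
import struct

HEADER = 0xAA  # hypothetical frame-start marker

def frame_packet(module_id: int, payload: bytes) -> bytes:
    """Frame a sensor payload: header, module id, length, payload, checksum."""
    body = bytes([module_id, len(payload)]) + payload
    checksum = sum(body) & 0xFF
    return bytes([HEADER]) + body + bytes([checksum])

def parse_packet(frame: bytes):
    """Validate and unpack a frame; returns (module_id, payload) or None."""
    if len(frame) < 4 or frame[0] != HEADER:
        return None
    module_id, length = frame[1], frame[2]
    payload = frame[3:3 + length]
    if len(payload) != length:
        return None  # truncated frame
    if (sum(frame[1:3 + length]) & 0xFF) != frame[3 + length]:
        return None  # checksum mismatch
    return module_id, payload

# Example: an ultrasonic module (hypothetical id 0x03) reporting 1250 mm
frame = frame_packet(0x03, struct.pack("<H", 1250))
module_id, payload = parse_packet(frame)
```

The checksum step matters on a UART link shared by several modules: a corrupted frame is dropped rather than announced to the user.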
The wearable intelligent blind guiding device mainly comprises a 3D infrared imaging camera, an SOC, an ultrasonic matrix radar, a GPS module, and a natural language interaction processor, and guides the user by employing key technologies such as SLAM (simultaneous localization and mapping), an SOC integrated system, convolutional neural networks, natural language interaction, and the ultrasonic matrix radar. When the device is started, the user is first located: GPS is used outdoors, while gyroscope and binocular vision camera positioning is used indoors. Once positioning is complete, the device recognizes the user's voice command and calls map software to plan the optimal route. While the user travels, the SOC analyzes the data transmitted by the radar and the camera (neural network inference, image recognition, and SLAM scanning imaging), processes information about the surrounding environment, builds a real-time 3D model of it, plans paths through complex environments, and gives real-time voice obstacle-avoidance prompts, achieving obstacle avoidance without blind spots. The SOC of the core processing module adopts a multi-core integrated form in which multiple main controllers run interactively, giving advantages such as low power consumption and high integration.
For scene identification, combining the thermal imaging vision system with the ultrasonic matrix radar keeps the mapping function working under severe conditions such as wide swings in environmental temperature and light intensity, giving strong interference resistance, wide applicability, and strong functional stability. For user positioning, the invention uses a triple GPS, vision, and gyroscope positioning system that can overcome the error accumulation of inertial navigation and achieve accurate positioning in complex environments. For user control, the invention uses a natural language bidirectional interaction system that identifies the user's environment while responding to the user's voice commands in real time, providing a more humane service and a safer, more reliable guarantee for the user's activities.
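The GPS/vision/gyroscope positioning scheme can be illustrated with a minimal sketch: outdoors a GPS fix is trusted directly, while indoors the position is advanced by dead reckoning from the gyroscope's heading. The step model, function names, and the simple source-selection rule are illustrative assumptions, not the patent's algorithm.

```python
import math

def dead_reckon(position, heading_deg, step_length_m):
    """Advance a 2D position by one step along the current heading
    (simple inertial dead reckoning, as used indoors)."""
    rad = math.radians(heading_deg)
    x, y = position
    return (x + step_length_m * math.sin(rad), y + step_length_m * math.cos(rad))

def fuse_position(gps_fix, inertial_pos, gps_available: bool):
    """Select the position source: trust GPS outdoors, fall back to
    inertial dead reckoning indoors. A real system would also re-anchor
    the inertial estimate whenever a fix returns, limiting drift."""
    if gps_available and gps_fix is not None:
        return gps_fix      # outdoor: GPS dominates
    return inertial_pos     # indoor: gyroscope dead reckoning

# Walk two 0.7 m steps due north (heading 0 degrees) from the origin
pos = (0.0, 0.0)
for _ in range(2):
    pos = dead_reckon(pos, 0.0, 0.7)
```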
Drawings
Fig. 1 is a schematic structural diagram of the whole intelligent wearable device.
Fig. 2 is a schematic view of a head-mounted frame of the wearable device.
Fig. 3 is a schematic view of a main control system of the blind guiding device.
In Fig. 1, the head sensor portion comprises a head-worn spectacle frame, a three-dimensional depth-of-field camera, an ultrasonic matrix radar, and a data cable that transmits the data acquired by the sensors to the SOC processor system. These feed the master control system of the blind guiding device, which contains: a relay module that controls the system's power supply; a speech synthesis module for speech synthesis and recognition during natural language interaction; a ZYNQ system-on-chip for processing the video stream data and the ultrasonic matrix radar data; and a nine-axis gyroscope for indoor positioning. An earphone cable and a microphone together provide the interaction between the user and the device.
Detailed Description
The present invention will be further described with reference to the accompanying drawings and embodiments. The system is divided into an SOC system module, peripheral modules, a voice interaction module, a micro-ultrasonic module, and GPS positioning and image processing modules.
Specific functions of each SOC module:
1. Given the ZYNQ's higher energy consumption, the STM32L4 is kept running at all times to process real-time voice information for system control; it also monitors the battery state and automatically enters a low-power mode when the battery voltage is low;
2. Division of labor for the L4: monitor the battery state; perform real-time voice recognition and interaction; read the ultrasonic matrix and use its data for preliminary obstacle-avoidance prompts; read the nine-axis acceleration gyroscope; communicate with the ZYNQ to obtain obstacle-avoidance information after image processing; and adjust GPS operation in real time according to voice information or battery state;
3. Division of labor for the ZYNQ: drive the depth camera and the conventional camera, perform image processing, and exchange image information with the L4.
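The L4's duties above can be sketched as one pass of a supervisory loop. The function names, the 3.3 V battery threshold, and the 1.0 m prompt distance are illustrative assumptions, not values from the patent.

```python
LOW_BATTERY_V = 3.3  # hypothetical low-battery threshold

def l4_step(battery_v, voice_cmd, ultrasonic_m, zynq_obstacle):
    """One pass of the STM32L4 duties: battery check, voice handling,
    preliminary ultrasonic prompt, and forwarding the ZYNQ's image result."""
    actions = []
    if battery_v < LOW_BATTERY_V:
        actions.append("enter_low_power")           # battery monitoring
    if voice_cmd:
        actions.append(f"handle_voice:{voice_cmd}")  # real-time voice interaction
    if ultrasonic_m is not None and ultrasonic_m < 1.0:
        actions.append("preliminary_obstacle_prompt")  # ultrasonic matrix
    if zynq_obstacle:
        actions.append("announce_image_obstacle")    # result from ZYNQ
    return actions
```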
Functions of each peripheral module:
GPS module: positioning using an offline map.
Nine-axis gyroscope: path recording and indoor positioning.
Ultrasonic matrix: obstacle avoidance in dark and low-power conditions, serving as the preliminary obstacle-avoidance judgment.
Dual lithium battery module: system power supply.
TPS7A4701 LDO: steps the voltage down to levels suitable for the different peripheral modules.
Voice interaction: speech recognition and interaction, under the control of the L4.
Relay module: a B3GA4.5Z relay manages power on/off for the whole system; relay control is handled on the waist processing board.
The wearable intelligent blind guiding device takes obstacle avoidance as its processing core, with the other functions built around it; image processing is the core of the development effort, assisted by the micro-ultrasonic sensors. Image processing centers on the depth camera: because height fluctuation during walking is too large, the usefulness of ultrasound is limited, so it serves as an auxiliary function for emergency obstacle avoidance.
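The division of labor above — depth-camera processing as the primary channel, ultrasound as the emergency assist — can be sketched as a simple arbitration rule. The 0.5 m emergency threshold and the prompt strings are illustrative assumptions.

```python
def avoidance_prompt(depth_obstacle_m, ultrasonic_m, emergency_threshold_m=0.5):
    """Arbitrate between the depth camera (primary) and the ultrasonic
    matrix (emergency assist): an ultrasonic return closer than the
    threshold always wins; otherwise the camera's estimate is announced."""
    if ultrasonic_m is not None and ultrasonic_m < emergency_threshold_m:
        return f"Stop: obstacle {ultrasonic_m:.1f} m ahead"
    if depth_obstacle_m is not None:
        return f"Obstacle {depth_obstacle_m:.1f} m ahead"
    return "Path clear"
```

Letting the fast, coarse sensor override the slower, richer one is a common safety pattern: the camera pipeline may lag by a frame or fail in darkness, but an imminent ultrasonic return must interrupt immediately.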
Further, for voice interaction, the earphone and microphone provide real-time prompts for the blind user. The goal of speech recognition at the present stage is to hear a user command and produce the correct control action. Meanwhile, a speech synthesizer gives real-time obstacle-avoidance and geographic-position prompts.
Further, the micro-ultrasonic module: transmit and receive range of no less than 2.5 m, with the volume kept within 2.5 × 2.5. Communication over either the IIC or the 485 interface is proposed. Making maximum use of the transducer is highly efficient.
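An ultrasonic module of this kind derives distance from the echo's round-trip time. A minimal sketch of that conversion, checked against the 2.5 m range requirement above; the 343 m/s speed of sound (air at about 20 °C) is an assumption, and a temperature-compensated system would adjust it.

```python
SPEED_OF_SOUND_M_S = 343.0  # air at ~20 degrees C; varies with temperature

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to an obstacle from an ultrasonic echo's round-trip time:
    the pulse travels out and back, so halve the path length."""
    return round_trip_s * SPEED_OF_SOUND_M_S / 2.0

def within_spec(round_trip_s: float, max_range_m: float = 2.5) -> bool:
    """Check a reading against the 2.5 m transmit/receive range above."""
    return 0.0 < echo_distance_m(round_trip_s) <= max_range_m
```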
Further, GPS positioning: so that map preloading does not rely on bare longitude and latitude alone, consider adding a SIM7XXX device.
Further, for image processing, image detection is the core. Exploiting the three-dimensional camera's advantage in depth ranging, a development environment such as Linux is built on the platform to ease later addition of machine learning and algorithm updates.
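As a toy illustration of the depth-ranging advantage for step and kerb detection (the actual system runs full image processing on the depth camera's frames), a sudden jump between successive foot-level depth readings can flag a step edge. The 0.12 m jump threshold is an assumption.

```python
def detect_step(depth_row_m, jump_threshold_m=0.12):
    """Scan a row of foot-level depth readings (near to far) and return the
    first index where depth jumps by more than the threshold - a crude
    stand-in for the step/kerb detection described above."""
    for i in range(1, len(depth_row_m)):
        if abs(depth_row_m[i] - depth_row_m[i - 1]) > jump_threshold_m:
            return i
    return None  # no step edge found
```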
The main control states are:
1. Minimum power consumption mode: only the STM32L4 and voice interaction remain active (for use by the blind in a familiar environment).
2. Lower power consumption mode: the STM32L4, voice interaction, and the ultrasonic module remain active (for use in low-battery and unfamiliar environments).
3. Outdoor mode: all devices are on, with the GPS active at all times.
4. Indoor high-function mode: all devices are on, and the GPS is turned off after positioning is complete.
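The four control states above can be summarized as a mode table plus a selection rule. The module names and the familiar-environment heuristic for choosing between the two low-power modes are illustrative assumptions.

```python
# Which subsystems each mode keeps powered (module names are illustrative)
MODES = {
    "minimum_power": {"stm32l4", "voice"},
    "lower_power":   {"stm32l4", "voice", "ultrasonic"},
    "outdoor":       {"stm32l4", "voice", "ultrasonic", "camera", "zynq", "gps"},
    "indoor_high":   {"stm32l4", "voice", "ultrasonic", "camera", "zynq"},
}

def select_mode(battery_low: bool, indoors: bool, familiar: bool) -> str:
    """Pick an operating mode from battery state and environment,
    following the mode descriptions above."""
    if battery_low:
        # familiar surroundings allow the deepest power saving
        return "minimum_power" if familiar else "lower_power"
    return "indoor_high" if indoors else "outdoor"
```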

Claims (8)

1. A wearable intelligent blind guiding device, comprising a head-mounted glasses body, a wearable transmission processor module, a microphone and an earphone, characterized in that: the blind guiding device is provided with an ultrasonic matrix radar, with SLAM scanning imaging adopted to obtain obstacle information in the real-time environment and to detect obstacles within a 5 m field of view around the blind user; the device is provided with two monocular depth camera modules, one for obtaining forward distance information and implementing effective obstacle avoidance, and the other for obtaining information at the user's feet, so as to better handle low-height obstacles such as street kerbs and to prompt the user when going up and down steps.
2. The intelligent blind guiding device of claim 1, wherein: the blind guiding device adopts a split design: the worn part carries the ultrasonic sensors and the camera module for acquiring real-time information, while the battery module, computing module, GPS positioning module and wireless communication module are all located in a computing box. The box adopts a miniaturized design with a footprint of 4.3 inches × 2.5 inches. The two parts are connected by a wired data cable, with the USB 2.0 transmission protocol guaranteeing the quality of the transmitted information; this reduces the perceived weight of the head-worn part.
3. The intelligent blind guiding device of claim 2, wherein: a camera is used to acquire information at the user's feet and can prompt the user when going up and down steps; a lighting system is additionally installed on it to satisfy the illumination conditions of dark environments and better meet the step detection requirement.
4. The intelligent blind guiding device of claim 1, wherein: the product is provided with a natural language interaction module together with earphone and microphone hardware, can receive voice signal input from the blind user, and can give voice prompt feedback to the blind user.
5. The intelligent blind guiding device of claim 2, wherein: the GPS navigation module and the indoor nine-axis gyroscope inertial positioning module can record the blind user's track, effectively plan the user's outdoor and indoor walking routes, and report the position in real time.
6. The intelligent blind guiding device of claim 2, wherein: a 4G communication module allows the blind guiding device to be networked, feeding the current position back to the blind user's relatives through host computer software. At dangerous moments, the user can raise an alarm or seek medical help through voice interaction. A SIM card installed in the computing box supports the 4G networking communication requirement.
7. The intelligent blind guiding device of claim 1, wherein: the intelligent recognition system can perform object detection and character recognition, can effectively recognize signage information such as bus stops, tactile paving for the blind, and guideboards, and performs information interaction through the voice module.
8. The intelligent blind guiding device according to any one of claims 1 to 7, wherein: computation uses a self-developed ZYNQ main computing center (responsible for video processing) and an STM32 (responsible for controlling each sensor and its peripherals), giving the excellent properties of low power consumption and long battery life.
CN201910639192.3A 2019-07-15 2019-07-15 Wearable intelligent blind guiding device Pending CN110623820A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910639192.3A CN110623820A (en) 2019-07-15 2019-07-15 Wearable intelligent blind guiding device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910639192.3A CN110623820A (en) 2019-07-15 2019-07-15 Wearable intelligent blind guiding device

Publications (1)

Publication Number Publication Date
CN110623820A true CN110623820A (en) 2019-12-31

Family

ID=68968915

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910639192.3A Pending CN110623820A (en) 2019-07-15 2019-07-15 Wearable intelligent blind guiding device

Country Status (1)

Country Link
CN (1) CN110623820A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109730910A (en) * 2018-11-30 2019-05-10 深圳市智瞻科技有限公司 Vision-aided system and its ancillary equipment, method, the readable storage medium storing program for executing of trip
CN111743740A (en) * 2020-06-30 2020-10-09 平安国际智慧城市科技股份有限公司 Blind guiding method and device, blind guiding equipment and storage medium
CN113031265A (en) * 2021-02-05 2021-06-25 杭州小派智能科技有限公司 Split AR display device and display method
CN113707161A (en) * 2021-09-22 2021-11-26 珠海华网科技有限责任公司 Method for playing partial discharge signal audio based on ZYNQ platform
CN114587949A (en) * 2022-02-21 2022-06-07 北京航空航天大学 Blind guiding system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006251596A (en) * 2005-03-14 2006-09-21 Kyoto Institute Of Technology Support device for visually handicapped person
CN101999972A (en) * 2010-11-24 2011-04-06 上海理工大学 Stereoscopic-vision-based auxiliary walking device for the blind and auxiliary method thereof
CN103941574A (en) * 2014-04-18 2014-07-23 邓伟廷 Intelligent spectacles
CN204542562U (en) * 2015-04-02 2015-08-12 重庆大学 Intelligent glasses for the blind
RO132133A2 (en) * 2016-03-03 2017-09-29 Gafencu Daniela În Calitate De Tutore A Minorei Gafencu Miruna-Alexandra Individual portable device and method of orientation in motion, meant to be used by persons suffering from severe sight deficiencies
CN109481248A (en) * 2018-12-26 2019-03-19 浙江师范大学 Smart blind guiding glasses
CN109602585A (en) * 2018-11-30 2019-04-12 西安工程大学 Blind guiding glasses and anti-collision early warning method thereof


Similar Documents

Publication Publication Date Title
CN110623820A (en) Wearable intelligent blind guiding device
CN205494329U (en) Intelligent self-rescue blind guiding walking stick
CN110575371B (en) Intelligent blind-guiding walking stick and control method
CN103126862B (en) Outdoor blind guiding robot based on global position system (GPS), general packet radio service (GPRS) and radio frequency identification devices (RFID) and navigational positioning method
CN101509780B (en) Intelligent blind navigation system and navigation method
CN108536145A (en) Intelligent following robot system using machine vision and operation method thereof
CN102164344B (en) Navigation mobile phone for the blind
CN202409427U (en) Portable intelligent electronic blind guide instrument
CN106859929A (en) Multifunctional blind guiding instrument based on binocular vision
CN110478206B (en) Intelligent blind guiding system and equipment
CN109259948B (en) Wheelchair for assisting driving
CN110584962A (en) Combined obstacle-detection intelligent blind-guiding system
CN101986673A (en) Intelligent mobile phone blind-guiding device and blind-guiding method
CN112870033A (en) Intelligent blind guiding helmet system for unstructured road and navigation method
CN204446522U (en) Ultrasonic-based intelligent blind guiding device
CN103312899A (en) Smart phone with blind guide function
CN210078040U (en) Intelligent blind guiding device
CN111035543A (en) Intelligent blind guiding robot
CN109998873A (en) Wearable intelligent positioning and blind guiding system for the blind
CN112130570A (en) Blind guiding robot of optimal output feedback controller based on reinforcement learning
CN111840016A (en) Flexible and configurable intelligent navigation device for blind people
CN109662830B (en) Voice blind guiding stick and deep-neural-network optimization method based on the stick
CN211095815U (en) Multifunctional intelligent blind guiding system for glasses
CN114533503B (en) Glasses system for intelligent blind-assisting travel and interaction method
CN207663060U (en) All-weather fast obstacle detection and avoidance system for unmanned vehicles based on multi-sensor fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20191231