CN111728834A - Handheld portable blind guider - Google Patents
Handheld portable blind guider
- Publication number: CN111728834A
- Application number: CN202010727272.7A
- Authority: CN (China)
- Legal status: Pending (assumed; not a legal conclusion)
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H3/00—Appliances for aiding patients or disabled persons to walk about
- A61H3/06—Walking aids for blind persons
- A61H3/061—Walking aids for blind persons with electronic detecting or guiding means
- A61H2003/063—Walking aids for blind persons with electronic detecting or guiding means with tactile perception
Abstract
The invention provides a handheld portable blind guider. The front-end detection section of the system comprises two ultrasonic sensors, three infrared sensors and a camera. An intelligent processing module built on a RISC-V dual-core 64-bit CPU platform serves as the hardware core of visual analysis, and the artificial neural network hardware accelerator KPU runs convolution, batch normalization, activation and pooling operations to visually recognize the environment along the blind user's travel path. The guider greatly eases travel for the blind: it is light and practical, returns feedback through vibration and voice, judges both the overall direction and the local travel space, makes the travel situation easy to grasp, and provides effective human-computer interaction.
Description
Technical Field
The invention relates to the field of automatic detection technology, and in particular to a handheld portable blind guider.
Background Art
Although emerging blind-guiding products at home and abroad continue to develop toward intelligence and automation, only a few have been accepted by visually impaired users. The main research and development results among blind-guiding assistive devices can be summarized as follows:
1. Ultrasonic blind-guiding devices. The obstacle-recognition range of an ultrasonic guider is limited and constrained by terrain. Current products are uniformly single-function: they generally perform distance measurement and guidance in only one direction and lack intelligent integration.
2. Guide canes. Navigation products on the domestic market for the blind focus on electronic guide canes and alarm-equipped guide dogs. The cane remains the assistive tool most used by the blind, but its function is single and its measurement capability limited, so it does not fundamentally solve the problem of independent, free travel.
3. Mobile blind-guiding robots or robotic guide dogs. As a new class of intelligent guiding machines they can integrate road exploration, alarm prompting and navigation, and their functions are more comprehensive, but they suffer from complex mechanical structure, complicated control, poor human-machine cooperation and high development cost, and are therefore difficult to popularize.
4. Visual blind-guiding devices. A binocular vision system performs three-dimensional measurement of obstacles on the travel path, and a high-end microcomputer image-processing system analyzes the visual image data. Researchers have applied artificial-intelligence pattern recognition to such guiders and experimented with deep learning, achieving environmental measurement, tracking of objects encountered on the road, and recognition and understanding. However, the heavy computation makes the equipment inevitably bulky, and its use requires extensive preparation and debugging, so it is difficult for a user to carry.
5. Infrared laser depth-vision sensors (e.g. the Xtion) have been used to build guiding equipment capable of on-site three-dimensional environment measurement, but outdoor light interference severely degrades the measurement, restricting such devices to indoor use. Although these visual approaches position accurately and detect obstacles in three dimensions, the results are not effectively conveyed to the blind user.
Disclosure of Invention
Aiming at the usage problems of existing blind-guiding devices, the invention provides a handheld portable blind guider that is lightweight and easy to carry in the hand, supports the blind user's movement with visual artificial-intelligence technology, performs detection and positioning with tactile and voice alarm prompts, and offers simple, durable, interactive assistance when the blind travel.
The technical scheme adopted is as follows:
the utility model provides a whether this guide blind ware passes through the ultrasonic wave, infrared sensor data analysis can detect blind person advancing direction the place ahead 2 meters can hold the space range that the health passed through and have the barrier, detects out the unsmooth and the slope type condition on road surface. The system transmits vibration signals to the palm of the hand holding the blind guiding device and the outer side of the tiger's mouth by controlling the vibration pieces distributed on the surface and the rear end surface of the handle, and the signals are used for prompting whether the barrier is positioned right ahead or in the direction of the front, up, down, left and right. The front-end camera collects visual data in real time, inputs the visual data to an intelligent processing module of a RISC-V CPU architecture platform for carrying out deep learning technology development, completes the identification of the orientation of a front street, the congestion condition, the intersection position, a signal lamp and key characters, and guides the advancing direction of the blind by combining GPS data through voice.
The guider's front-end detection section comprises two ultrasonic sensors, three infrared sensors and a camera. An intelligent processing module built on a RISC-V dual-core 64-bit CPU platform serves as the hardware core of visual analysis, and the artificial neural network hardware accelerator KPU runs convolution, batch normalization, activation and pooling operations to visually recognize the environment along the travel path. The remaining sensors connect to the corresponding interfaces of the RISC-V AIoT MaixDuino module to complete obstacle detection along the blind user's path.
The application of new artificial-intelligence technology to measurement and control has expanded the design space of many assistive devices. Because a guider for the blind must be convenient to operate and suited to the full range of environments in which it is used, this guider employs a novel intelligent embedded module to complete visual analysis of the walking path. This avoids the bulk of a high-end microcomputer image-processing system, the calibration procedure and heavy computational load of a binocular vision system, and the severe outdoor interference that afflicts infrared laser depth measurement. The novelty of the guider therefore lies in meeting the needs of the blind with a single portable device that combines macroscopic visual judgment of environment and direction with local obstacle measurement, while the interaction mode at the rear handle is a usage mode not found in current devices.
To cope with the limited computing power of the hardware module and the huge volume of visual data, the guider adopts a real-time processing solution: after filtering and edge/texture processing, a Fisher classifier optimizes and extracts feature data, greatly reducing the real-time training load. Coordinate data of the contour stripes of environmental objects are recorded and sorted into a normalized training-sequence data type, which is fed to a CNN for training. The model selects convolution-layer, pooling-layer and hidden-layer parameters with good real-time performance, reducing the loss computation, and completes model construction through deep-learning training. It classifies and recognizes the positions of streets and intersections, traffic marking lines at road edges (tactile paving, pedestrian crossings, non-motorized lanes, motor-vehicle lanes, etc.) and the orientation of buildings; recognizes the density of pedestrian congestion; and is trained on road-sign characters encountered in the street environment to perform character recognition.
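The Fisher-classifier step can be illustrated with a minimal two-class Fisher linear discriminant in NumPy, which projects high-dimensional contour features onto one discriminative axis before they reach the CNN. This is a generic sketch of the technique, not the patent's actual implementation; the ridge term and function name are invented here.

```python
import numpy as np

def fisher_direction(X0, X1):
    """Fisher linear discriminant: w ∝ Sw^{-1} (m1 - m0).
    X0, X1 are (samples, features) arrays for the two classes."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter; a small ridge keeps the solve stable.
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), m1 - m0)
    return w / np.linalg.norm(w)
```

Projecting each feature vector onto `w` (`X @ w`) yields a one-dimensional score that separates the two classes, which is the dimensionality reduction the patent relies on to shrink the real-time training data.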
Each ultrasonic sensor is fitted with a sleeve made of fiber-blended foamed cotton so that its directional characteristic forms a detection sector. The sleeve narrows the sensor's detection angle, and fiber-blended foamed cotton, being an ultrasound-absorbing material, is well suited for the sleeve.
The invention has the advantages that:
The guider greatly eases travel for the blind. It is light and practical; feedback arrives through vibration and voice; it judges both the overall direction and the local travel space, so the travel situation is easy to grasp and the human-computer interaction is effective. A slight sweep of the device in use reveals the edge positions of an object, letting the blind user perceive its outline without contact. Voice prompts convey the current position and the places where the user must turn or adjust course. The palm's perception of the handle vibrators and the tiger's mouth's perception of the rear-face vibrators, combined with the frequency of the alarm tone, describe the obstacle's distance and direction to the blind user.
Before use, a family member can walk the intended route holding the guider and preset GPS position data, pressing the data-preset button at each street corner and turn to build position-navigation data and improve judgment efficiency. In use, the blind user turns and corrects direction according to the vibration signals and voice prompts.
The invention is characterized in that:
1. A method combining visual macroscopic measurement with multi-sensor local measurement:
The guider uses visual analysis for macroscopic environment exploration; the video signal collected by the camera is analyzed by the recognition algorithm to obtain the road orientation, intersection positions, pedestrian-congestion density, road-sign characters and signal-light color state. Meanwhile, several ranging sensors locally probe the travel space, judging whether there is room for the blind user to advance and whether the ground is uneven or sloped. The required travel direction is then adjusted in combination with GPS navigation data.
2. A novel coverage-area measurement method with dual ultrasonic sensors:
An ultrasonic sensor of identical model and performance is mounted on each side of the guider's front panel, each fitted with a fiber-blended foamed-cotton sleeve so that its directional characteristic forms a detection sector. The sleeve narrows the detection angle, and the foamed cotton absorbs ultrasound, preventing reflection interference. The common area where the two sectors overlap detects the space directly ahead, as shown in FIG. 2. The private area of sensor No. 1 on the left detects the corresponding space to the front-left; likewise, the private area of sensor No. 2 on the right detects the space to the front-right.
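In its simplest reading, the dual-sector logic classifies an echo by which of the two sensors sees it: both sensors means the overlap zone (dead ahead), one sensor means that side's private sector. The following sketch is an illustrative assumption; the function name, the `None` convention and the 2 m limit are invented here.

```python
def obstacle_sector(left_cm, right_cm, limit_cm=200):
    """Classify obstacle bearing from the two front ultrasonic sensors.
    None means a sensor received no echo within range."""
    left_hit = left_cm is not None and left_cm <= limit_cm
    right_hit = right_cm is not None and right_cm <= limit_cm
    if left_hit and right_hit:
        return "front"        # echo in the overlap (common) zone
    if left_hit:
        return "front-left"   # private sector of sensor No. 1
    if right_hit:
        return "front-right"  # private sector of sensor No. 2
    return "clear"
```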
3. The portable handle interaction mode of the handheld guider:
Four vibrating reeds are distributed on the upper, lower, left and right faces of the handle, and four more on the rear end face of the system box, eight in total. The handle reeds transmit direction-coded vibration signals to the corresponding parts of the palm gripping the handle, signalling obstacle warnings in the up, down, left and right directions; downward vibration corresponds to ground obstacles or up/down slopes. The reeds on the rear end face transmit signals to the sides of the index finger and thumb in contact with them, likewise indicating an obstacle in the corresponding forward direction, reinforcing the prompt. If all four rear reeds vibrate together, an obstacle lies directly ahead in the direction of travel.
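A hedged sketch of the reed-selection logic this paragraph describes. The numbering of the eight reeds (0–3 on the handle, 4–7 on the rear face) and the pairing of each handle reed with a rear-face counterpart are assumptions made for illustration; the patent itself assigns reeds by figure numerals.

```python
HANDLE = {"up": 0, "down": 1, "left": 2, "right": 3}  # handle-face reeds
REAR = {4, 5, 6, 7}                                    # rear-face reeds

def active_vibrators(direction):
    """Return the set of reed indices to drive for a given
    obstacle direction; 'front' drives all four rear reeds together."""
    if direction == "front":
        return set(REAR)
    # Lateral directions drive the matching handle reed plus its
    # rear-face counterpart to reinforce the cue.
    return {HANDLE[direction], HANDLE[direction] + 4}
```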
Drawings
FIG. 1 is a schematic structural view of the present invention;
FIG. 2 is a top view of a measurement of an ultrasonic sensor at the front end;
FIG. 3 is a left side view of the operation of the blind guider;
FIG. 4 is a diagram of a core algorithm framework for a pattern recognition training model;
fig. 5 is a schematic diagram of the rear end structure of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The parts not described in the present invention are all the prior art or standard products, and are not described again.
In the description of the present invention, it should be noted that the terms "center", "longitudinal", "lateral", "up", "down", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc., indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed and operated in a specific orientation. And therefore should not be construed as limiting the invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that unless otherwise explicitly stated or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected, mechanically connected, electrically connected, directly connected, or indirectly connected through an intermediate medium, and communicate between two elements. The specific meaning of the above terms in the present invention can be specifically understood by those of ordinary skill in the art. In addition, in the description of the present invention, "a plurality" means two or more unless otherwise specified.
Example (b):
Two ultrasonic sensors, three infrared sensors and a camera form the front-end detection section of the system, as shown in FIG. 1.
FIG. 2 is a top view of the measurement coverage of ultrasonic sensors No. 1 and No. 2 at the front end of the guider; the dotted lines represent the ultrasonic transmit/receive coverage areas of the two sensors.
Each ultrasonic sensor is fitted with a sleeve of fiber-blended foamed cotton so that its directional characteristic forms a detection sector; the sleeve narrows the detection angle, and the ultrasound-absorbing foamed cotton is well suited to the purpose. In the top view of FIG. 2, the two sensors project two fan-shaped detection zones with both overlapping and non-overlapping regions. The overlapping region detects obstacles ahead; the non-overlapping regions detect obstacles on either side of the heading.
FIG. 3 is a left side view of the guider in operation. The included angle φ is the mounting angle between the two infrared sensors, i.e. the angle between their measurement directions. The angle β is the angle between one infrared sensor and the ground normal, measured in real time by an ADXL345 three-axis accelerometer tilt module. During initialization the guider calibrates φ: the height h is computed from the measured distances d and L and the angle β, and φ is then derived. In operation, the unevenness and slope of the ground ahead are judged from d and L: the expected L is computed from the detected d and the angle φ, mean-filtered, and compared with the actually measured L, and the difference indicates the degree of surface unevenness.
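Under the simplifying assumption that the near infrared beam makes angle β with the ground normal and the far beam makes angle β + φ, the flat-ground prediction and comparison can be written as below. This is a geometric sketch consistent with the calibration described above, not the patent's exact formula; the tolerance value is invented.

```python
import math

def expected_far_reading(d, beta, phi):
    """Height above ground from the near IR beam (h = d·cos β),
    then the distance the far beam should report on flat ground.
    Angles in radians, distances in any consistent unit."""
    h = d * math.cos(beta)
    return h / math.cos(beta + phi)

def ground_anomaly(d, L_measured, beta, phi, tol=0.05):
    """Positive difference -> far beam shorter than the flat-ground
    prediction (a bump); negative -> longer (a dip or downslope)."""
    diff = expected_far_reading(d, beta, phi) - L_measured
    if abs(diff) <= tol:
        return "flat"
    return "bump" if diff > 0 else "dip"
```

In the device the measured L would first pass through the mean filter mentioned in the text before this comparison.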
The internal hardware of the guider consists of two modules. One is an embedded module responsible for data acquisition, signal analysis, measurement computation and control-signal output for the ultrasonic, infrared, gravitational-acceleration and GPS sensors; it is built on the RISC-V AIoT MaixDuino. The other is the RISC-V dual-core 64-bit CPU platform responsible for visual data processing, forming the intelligent processing module. Reference numeral 6 in FIG. 1 is the camera; its data are transmitted to the RISC-V dual-core intelligent module for analysis and processing. That module uses the AI chip K210 as its core processing unit: two cores each with an independent FPU, 64-bit CPU width, 8 MB on-chip SRAM, an adjustable nominal frequency of 400 MHz, and the convolutional neural network hardware accelerator KPU, which performs high-performance neural network operations such as convolution, batch normalization, activation and pooling. It supports preprocessing for voice direction scanning and voice data output, and can perform image detection, speech recognition, and color and object recognition. All 72 pins are brought out and freely mappable, and the FPC 24-pin socket can connect a DVP camera or an 8-bit MCU LCD.
Real-time data from the camera and from the ultrasonic, infrared, gravitational-acceleration tilt and GPS sensors are combined in a data-fusion model. Visual pattern recognition classifies and recognizes environmental target objects; software built on deep-learning algorithms is trained on street orientation and determines street position, signal-light state, pedestrian congestion and road-sign characters.
The system uses a deep-learning network model built with TensorFlow as the core algorithm of its visual pattern recognition technology. The core algorithm of the pattern-recognition training model is shown in FIG. 4:
After filtering and edge/texture processing of the visual data, a Fisher classifier optimizes and extracts feature data, reducing the real-time training load; contour-stripe coordinates of environmental objects are recorded and sorted into a normalized training-sequence type and fed to the CNN for training. The model selects convolution-layer, pooling-layer and hidden-layer parameters with good real-time performance, reducing the loss computation, and completes model construction through deep-learning training. It classifies street and intersection positions, road-edge marking lines (tactile paving, pedestrian crossings, non-motorized lanes, motor-vehicle lanes, etc.) and building orientations to judge the travel direction; recognizes pedestrian-congestion density to prompt whether to proceed; and is trained on road-sign characters encountered in the street environment, realizing character recognition that assists in understanding environmental information. A fast Fourier transform locks onto the signal-light region, whose color is then identified to judge the traffic-light state. Based on the system's recognition results, the blind user adjusts the travel direction and decides whether to take avoiding action.
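The four KPU operations named throughout this document (convolution, batch normalization, activation, pooling) can be demonstrated as a single forward stage in plain NumPy. This is a didactic sketch of the operations themselves, not the K210 hardware implementation; all function names are invented here.

```python
import numpy as np

def conv2d(x, k):
    """'Valid' 2-D cross-correlation of a single-channel image x
    with kernel k."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def batch_norm(x, eps=1e-5):
    # Normalize the whole feature map to zero mean, unit variance.
    return (x - x.mean()) / np.sqrt(x.var() + eps)

def relu(x):
    return np.maximum(x, 0.0)

def max_pool2(x):
    # Non-overlapping 2x2 max pooling (truncates odd edges).
    H, W = x.shape
    return x[:H // 2 * 2, :W // 2 * 2].reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))

def stage(x, k):
    """One forward stage: convolution -> batch norm -> activation -> pooling."""
    return max_pool2(relu(batch_norm(conv2d(x, k))))
```

A 6x6 input with a 3x3 kernel yields a 4x4 convolution output and a 2x2 pooled map, mirroring the shape reduction a KPU layer performs.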
The hardware system is housed in measuring boxes No. 1 and No. 2 marked in FIG. 5. In FIG. 5, reference numeral 3 is the rear end face of the measuring box; numerals 4, 5 and 6 mark positions of the vibrating pieces distributed on that face, four in total, corresponding to prompts in the up, down, left and right directions. Numeral 9 is the handle of the guider, which likewise carries four vibrating pieces on its surface; numerals 7 and 8 are the upper and right pieces, indicating obstacle information to the front-upper and front-right respectively, and the other pieces correspond likewise. In use the operator adjusts the wrist direction according to the reed signals, interacting with the detection process. If an obstacle lies directly ahead, all four rear-face pieces in contact with the tiger's mouth vibrate. When a sensor detects an obstacle, the pitch of the alarm tone is adjusted according to the whistle frequency set in the detection-distance control system. The palm's perception of the handle vibrators and the tiger's mouth's perception of the rear vibrators, combined with the alarm-tone frequency, describe the distance and direction of the obstacle's outline to the blind user.
Before use, a sighted companion must carry the guider along the route to preset specific points in advance, so that turns at intersections can be selected. The preset data are obtained by GPS positioning measurement and recorded in system memory by pressing the contact piece on the handle surface. When the blind user travels, the real-time position obtained from GPS satellite signals is compared with the preset point data, and the route judgment and subsequent travel direction are conveyed through text-information assistance, voice prompts and vibrotactile signals, guiding the blind user onward.
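The waypoint comparison just described can be sketched as a great-circle distance test against the stored corner points. The 15 m matching radius and all names are illustrative assumptions, not values from the patent.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def nearest_waypoint(fix, waypoints, radius_m=15.0):
    """Index of the closest preset waypoint within radius_m of the
    current (lat, lon) fix, or None if no stored corner matches."""
    best, best_d = None, radius_m
    for i, wp in enumerate(waypoints):
        d = haversine_m(fix[0], fix[1], wp[0], wp[1])
        if d <= best_d:
            best, best_d = i, d
    return best
```

When a waypoint matches, the device would issue the vibration and voice prompt recorded for that corner; when none matches, the user is simply mid-segment.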
The invention and its embodiments have been described above without limitation; the drawings show only one embodiment, and the actual structure is not limited thereto. In summary, if those of ordinary skill in the art, enlightened by the invention, devise similar structures and embodiments without creative effort and without departing from the spirit of the invention, these shall fall within the protection scope of the invention.
Claims (4)
1. A handheld portable blind guider, comprising two ultrasonic sensors, three infrared sensors and a camera, wherein the two ultrasonic sensors, the three infrared sensors and the camera form the front-end detection part of the system; an intelligent processing module built on a RISC-V dual-core 64-bit CPU architecture platform serves as the hardware core for visual analysis, and visual recognition of the blind user's travel-path environment is realized by using the artificial neural network hardware accelerator KPU to run convolution, batch-normalization, activation and pooling operations; the other sensors are connected to the corresponding interfaces of the RISC-V AIoT MaixDuino module to complete obstacle detection along the blind user's travel path; the blind guider completes visual analysis of the travel path by means of this intelligent embedded module, and the interaction scheme of its rear-end handle constitutes a novel mode of use.
2. A handheld portable blind guider which, through data analysis of the ultrasonic and infrared sensors, detects whether an obstacle exists within 2 meters ahead of the blind user in a space large enough for the body to pass, and detects concave-convex and sloped conditions of the road surface; by controlling vibrating pieces distributed on the handle surface and on the rear end face, the system transmits vibration signals to the palm holding the blind guider and to the web between thumb and forefinger, prompting whether the obstacle lies directly ahead or to the upper, lower, left or right front; the front-end camera collects visual data in real time and inputs them to the intelligent processing module of the RISC-V CPU architecture platform, on which deep-learning technology is deployed, to identify the orientation of the street ahead, congestion conditions, intersection positions, signal lights and key characters, and the device guides the blind user's direction of travel by voice in combination with GPS data.
3. A handheld portable blind guider which uses a Fisher classifier to select and extract optimal feature data after filtering, edge and texture processing, greatly reducing the volume of real-time training data; the coordinate data of the contour stripes of environmental objects are recorded and arranged into a normalized training-sequence data type, which is input into a CNN neural-network model for training; the model selects convolutional-layer, pooling-layer and hidden-layer parameters with good real-time performance to reduce the loss-computation load, completes model construction through deep-learning training, and thereby classifies and recognizes the position features of streets and intersections, the traffic markings at road edges and the orientation of buildings, recognizes the degree of pedestrian congestion, and is trained on the road-sign characters encountered in the street environment to realize character recognition.
4. A handheld portable blind guider characterized in that each ultrasonic sensor is fitted with a sleeve made of fiber-blended foamed cotton, so that the directional characteristic of each ultrasonic sensor forms a detection sector; the sleeve narrows the detection angle of the ultrasonic sensor; fiber-blended foamed cotton, being an ultrasound-absorbing material, is an ideal sleeve material.
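The contour-coordinate normalization described in claim 3 can be sketched as follows. This is a minimal illustration, not the patented implementation; the fixed sequence length of 32 and the nearest-index resampling are assumptions.

```python
# Sketch of forming a normalized training-sequence data type from the
# contour coordinates of an environmental object (claim 3).  Sequence
# length and resampling scheme are assumed for illustration.

def normalize_contour(points, seq_len=32):
    """Resample a contour to seq_len points and scale coordinates to [0, 1]."""
    # Evenly subsample (nearest index) down/up to the fixed sequence length.
    idx = [round(i * (len(points) - 1) / (seq_len - 1)) for i in range(seq_len)]
    pts = [points[i] for i in idx]
    xs, ys = zip(*pts)
    x0, y0 = min(xs), min(ys)
    w = (max(xs) - x0) or 1.0   # guard against degenerate (zero-width) contours
    h = (max(ys) - y0) or 1.0
    # The resulting fixed-length sequence is what would be fed to the CNN.
    return [((x - x0) / w, (y - y0) / h) for x, y in pts]
```

Each normalized sequence would then be labelled (street edge, intersection, building outline, etc.) and used as one training sample for the CNN model.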
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010727272.7A CN111728834A (en) | 2020-07-27 | 2020-07-27 | Handheld portable blind guider |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010727272.7A CN111728834A (en) | 2020-07-27 | 2020-07-27 | Handheld portable blind guider |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111728834A true CN111728834A (en) | 2020-10-02 |
Family
ID=72657738
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010727272.7A Pending CN111728834A (en) | 2020-07-27 | 2020-07-27 | Handheld portable blind guider |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111728834A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN204446522U (en) * | 2015-01-06 | 2015-07-08 | 中国人民解放军第三军医大学 | Intelligent blind-guiding device based on ultrasound
CN205494329U (en) * | 2016-03-23 | 2016-08-24 | 张耐华 | Intelligent self-rescue blind-guiding walking stick
CN208942782U (en) * | 2018-05-24 | 2019-06-07 | 安徽大学 | Distance perception blind-guiding walking stick based on vibration intensity |
WO2019123622A1 (en) * | 2017-12-21 | 2019-06-27 | 株式会社ニコン | Guiding device |
CN110575371A (en) * | 2019-10-22 | 2019-12-17 | 大连民族大学 | intelligent blind-guiding walking stick and control method |
CN111035542A (en) * | 2019-12-24 | 2020-04-21 | 开放智能机器(上海)有限公司 | Intelligent blind guiding system based on image recognition |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN205494329U (en) | Intelligent self-rescue blind-guiding walking stick | |
CN109144057A (en) | A kind of guide vehicle based on real time environment modeling and autonomous path planning | |
US8588464B2 (en) | Assisting a vision-impaired user with navigation based on a 3D captured image stream | |
Kriegman et al. | A mobile robot: Sensing, planning and locomotion | |
CN102389361B (en) | Blindman outdoor support system based on computer vision | |
CN106595630B (en) | Mapping system and method for an intelligent mobile robot based on laser navigation | |
Shoval et al. | Computerized obstacle avoidance systems for the blind and visually impaired | |
CN106598039B (en) | A kind of Intelligent Mobile Robot barrier-avoiding method based on laser radar | |
CN106671974A (en) | Parking space detection method for intelligent parking system | |
Bouhamed et al. | New electronic white cane for stair case detection and recognition using ultrasonic sensor | |
CN102506737A (en) | Pipeline detection device | |
CN112870033A (en) | Intelligent blind guiding helmet system for unstructured road and navigation method | |
CN101368828A (en) | Blind man navigation method and system based on computer vision | |
Borenstein | The navbelt-a computerized multi-sensor travel aid for active guidance of the blind | |
CN108021133A (en) | A kind of Multi-sensor Fusion high speed unmanned vehicle detects obstacle avoidance system | |
CN113325837A (en) | Control system and method for multi-information fusion acquisition robot | |
CN111580515B (en) | Obstacle avoidance system for unmanned sweeping machine and obstacle avoidance method thereof | |
CN110623820A (en) | Blind device is led to wearable intelligence | |
CN115416047B (en) | Blind assisting system and method based on multi-sensor four-foot robot | |
CN106774325A (en) | Robot is followed based on ultrasonic wave, bluetooth and vision | |
CN111142543A (en) | Robot obstacle avoidance device, control method, edge control method and distance measurement system | |
CN106840158A (en) | Indoor navigation system and method for visually impaired people based on Wi-Fi and attitude-angle sensors | |
CN111728834A (en) | Handheld portable blind guider | |
CN109191887A (en) | Localization method and positioning system in garage based on depth recognition | |
CN208355727U (en) | A kind of blind-guiding stick based on double ultrasonic waves |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20201002 | |