CN114191267A - Light-weight intelligent method and system for assisting blind person in going out in complex environment - Google Patents

Light-weight intelligent method and system for assisting blind person in going out in complex environment

Info

Publication number
CN114191267A
CN114191267A (application CN202111480420.0A)
Authority
CN
China
Prior art keywords
blind
module
obstacle
early warning
obstacle avoidance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111480420.0A
Other languages
Chinese (zh)
Inventor
陈晓敏
赵涛涛
孙强
赵之喻
周桓宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong University
Original Assignee
Nantong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong University filed Critical Nantong University
Priority to CN202111480420.0A priority Critical patent/CN114191267A/en
Publication of CN114191267A publication Critical patent/CN114191267A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H: PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00: Appliances for aiding patients or disabled persons to walk about
    • A61H3/06: Walking aids for blind persons
    • A61H3/061: Walking aids for blind persons with electronic detecting or guiding means
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F: FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F9/00: Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
    • A61F9/08: Devices or methods enabling eye-patients to replace direct visual perception by another kind of perception
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/045: Combinations of networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H: PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00: Appliances for aiding patients or disabled persons to walk about
    • A61H3/06: Walking aids for blind persons
    • A61H3/061: Walking aids for blind persons with electronic detecting or guiding means
    • A61H2003/063: Walking aids for blind persons with electronic detecting or guiding means with tactile perception
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H: PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00: Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50: Control means thereof
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H: PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00: Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50: Control means thereof
    • A61H2201/5058: Sensors or detectors

Abstract

The invention relates to the technical field of intelligent assistive equipment, and in particular to a lightweight intelligent method and system for assisting blind people traveling in complex environments, comprising the following steps: step S0: constructing an intelligent travel-assistance system for the blind; step S1: developing a lightweight intelligent travel-assistance application for complex environments; step S2: obstacle identification; step S3: obstacle avoidance early warning; step S4: multi-terminal interaction. By deploying a lightweight artificial intelligence model on embedded equipment, the method achieves fast computation and efficient decision-making and reduces the dependence of an artificial intelligence solution on device performance and network resources. The invention adopts energy-saving, non-intrusive multi-level somatosensory vibration and voice early-warning modules to provide the blind with accurate obstacle-avoidance information; cost and power consumption are reduced while the safety of the blind and visually impaired in the diverse scenarios of daily travel is guaranteed, providing an economical, real-time and safe intelligent service for the blind.

Description

Light-weight intelligent method and system for assisting blind person in going out in complex environment
Technical Field
The invention relates to the technical field of intelligent assistive equipment, and in particular to a lightweight intelligent method and system for assisting blind people traveling in complex environments.
Background
With the development of cities, road conditions have become increasingly complex. To standardize the routing of blind roads (tactile paving) and keep them unobstructed, roadblocks are placed around them to stop large vehicles; at major events, festivals or construction sites, warning signs and the like may be erected at the roadside. All of these are potential obstacles to a blind pedestrian. Ordinary guide devices cannot meet the travel needs of the blind; the traditional image-recognition guide cane places extremely high demands on a device's computing power and network transmission capability and is costly to build, and the diversity of obstacles makes recognition even more challenging.
At present, various intelligent travel-assistance schemes for the blind have been proposed. By model-inference mode they divide into local-inference and cloud-inference schemes; by recognition target they divide into blind-road recognition schemes and schemes that recognize obstacles on the blind road. A local-inference scheme runs the travel-assistance application on the device itself and therefore requires strong computing and storage capability; a cloud-inference scheme uploads image data collected by a network camera to the cloud for inference. A blind-road recognition scheme extracts navigation information such as turn left, turn right and go straight from blind-road images; an obstacle-on-blind-road scheme binds blind-road features to obstacle features, mainly recognizes obstacles on the blind road and reminds the blind to avoid them, but does not recognize the blind road and the obstacles separately.
Most local-inference schemes run on equipment with good performance and large storage, with high hardware cost and power consumption, and are therefore ill-suited to mobile scenarios. Schemes using a network camera instead upload a real-time data stream to the cloud, which performs inference and feeds the result back to the terminal device; such cloud-inference schemes demand substantial network resources and have poor real-time performance. Blind-road recognition schemes mainly perform scene recognition, identifying blind-road patterns and feeding effective navigation information back to the blind, but they cannot guarantee walking safety. Blind-road obstacle recognition schemes are more comprehensive: they classify the obstacles that commonly appear on blind roads, recognize them and warn the blind to avoid them, but they usually bind the recognition subject to the blind road, i.e. the blind road must appear in the recognized content, which limits the guide device's ability to cope with complex travel environments.
Disclosure of Invention
In view of these problems, the invention provides a method and system for intelligently assisting the blind traveling in a lightweight complex environment. By deploying a lightweight artificial intelligence model on embedded equipment for fast computation and efficient decision-making, it reduces the dependence of an artificial intelligence solution on device performance and network resources, lowers cost and power consumption, and guarantees the safety of the blind and visually impaired in the diverse scenarios of daily travel, thereby providing an economical, real-time and safe intelligent service for the blind.
In order to achieve this purpose, the invention adopts the following technical scheme:
A method for intelligently assisting a blind person traveling in a lightweight complex environment comprises the following steps:
step S0: constructing an intelligent travel-assistance system for the blind;
step S1: developing a lightweight intelligent travel-assistance application for complex environments;
step S2: obstacle identification;
step S3: obstacle avoidance early warning;
step S4: multi-terminal interaction.
Preferably, the specific steps of step S1 are as follows:
step S1.1: collecting image data of characteristic obstacles that affect the travel safety of the blind to establish a data set;
step S1.2: building a neural network model and training it to identify the characteristic obstacles on the characteristic obstacle data set of step S1.1;
step S1.3: quantizing the neural network model of step S1.2 to full integers, compressing the model size;
step S1.4: developing the lightweight intelligent travel-assistance application for complex environments on the microcontroller of the blind guiding device's embedded system, based on the quantized neural network model of step S1.3.
Preferably, the specific steps of step S2 are as follows:
step S2.1: initializing the embedded system microcontroller, obstacle identification module, obstacle avoidance early warning module and multi-terminal interaction module of the blind guiding device;
the blind guiding device comprises a blind guiding base body and an intelligent auxiliary device;
the intelligent auxiliary device comprises an obstacle identification module, an obstacle avoidance early warning module and a multi-terminal interaction module;
the obstacle identification module comprises an image acquisition module and an auxiliary lighting module, wherein the image acquisition module comprises an image sensor and an edge artificial intelligence image identification application module;
the obstacle avoidance early warning module comprises a distance measurement module, a linear motor vibration module and a voice synthesis broadcast module;
the multi-terminal interaction module comprises a key module, a communication module and a positioning module;
step S2.2: setting an ambient light threshold L, acquiring the ambient light intensity S with a light intensity sensor, and turning on the lighting module to assist image acquisition when S is below L;
step S2.3: acquiring environmental image data with the image sensor;
step S2.4: the lightweight intelligent travel-assistance application identifies characteristic obstacles by running inference on the image data; proceed to step S3 when a characteristic obstacle is identified, and return to step S2.2 when none is identified.
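The loop in steps S2.2 to S2.4 can be sketched as follows. This is an illustrative reconstruction, not code from the patent; the `capture`, `infer` and `light_on` parameters are hypothetical stand-ins for the hardware and model interfaces.

```python
# Control-flow sketch of steps S2.2-S2.4. All function parameters are
# hypothetical stand-ins for the sensor and model interfaces.

def recognition_pass(light_lux, threshold_lux, capture, infer, light_on):
    """One pass of the loop: manage lighting, capture a frame, run inference.

    Returns the recognized obstacle label, or None (caller loops back to S2.2).
    """
    if light_lux < threshold_lux:   # S2.2: below the ambient light threshold L
        light_on()                  # assist image capture in shadow or at night
    frame = capture()               # S2.3: image sensor acquires a frame
    return infer(frame)             # S2.4: on-device inference with the
                                    #       quantized model
```

On a recognized obstacle the caller would continue to the step-S3 ranging and warning logic; otherwise it repeats from step S2.2.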
Preferably, the specific steps of step S3 are as follows:
step S3.1: setting a safe distance S_safe and selecting, between 0 and S_safe, N-1 uniform or non-uniform division points S_risk_1, S_risk_2, ..., S_risk_(N-1);
in this step, the safe distance S_safe depends on the blind person's walking pace and on the maximum detection distance of the ultrasonic ranging equipment;
step S3.2: starting the distance measurement module; repeat step S3.2 while the obstacle distance is greater than S_safe, and proceed to step S3.3 when the obstacle distance is less than S_safe;
step S3.3: when the measured obstacle distance lies between S_risk_(N-1) and S_safe, the vibration module performs somatosensory vibration obstacle-avoidance early warning at the frequency of preset mode PWM_set_N and the voice broadcast module performs voice obstacle-avoidance early warning in preset mode V_set_N; when it lies between S_risk_(N-2) and S_risk_(N-1), the vibration module uses preset mode PWM_set_(N-1) and the voice broadcast module uses preset mode V_set_(N-1); and so on; when it lies between 0 and S_risk_1, the vibration module uses preset mode PWM_set_1 and the voice broadcast module uses preset mode V_set_1;
in this step, the suffix n of the distance S_risk_n denotes a risk level: the smaller n, the higher the risk;
in this step, the suffix n of PWM_set_n and V_set_n denotes a multi-level early-warning mode: the smaller n, the higher the danger level, the higher the vibration frequency and amplitude of the vibration module, and the faster the speech rate and louder the broadcast volume of the voice broadcast module, so that multiple danger levels are warned in a graded, progressive manner.
Preferably, the specific steps of step S4 are as follows:
step S4.1: the blind guiding device uploads its current position information and device state information to the cloud server through the communication module for analysis and storage, and the cloud server issues instructions to the blind guiding device through the communication module;
step S4.2: the cloud server periodically pushes the information uploaded by the blind guiding device to the mobile terminal application, and the mobile terminal application obtains the historical and latest information of the blind guiding device by accessing the cloud server.
The invention also provides a lightweight intelligent travel-assistance system for the blind in complex environments, comprising a blind guiding device, a cloud server and a mobile terminal application;
the blind guiding device performs real-time obstacle identification on the edge device and feeds back obstacle-avoidance early-warning information to help the blind avoid obstacles safely;
the cloud server communicates with the blind guiding device and receives the state information and position information updated by the blind guiding device;
and the mobile terminal application interacts with the cloud server to obtain the real-time position information and state information of the blind guiding device.
The invention has the following beneficial effects:
1. By collecting image data of obstacles that affect the travel safety of the blind, the invention establishes an obstacle data set of the blind travel environment and trains a lightweight artificial intelligence model on it, so as to intelligently help the blind cope with complex travel environments.
2. By deploying the lightweight artificial intelligence model on embedded equipment for fast computation and efficient decision-making, the invention reduces the dependence of an artificial intelligence solution on device performance and network resources.
3. The invention adopts energy-saving, non-intrusive multi-level somatosensory vibration and voice early-warning modules to provide the blind with accurate obstacle-avoidance information; cost and power consumption are reduced while the safety of the blind and visually impaired in the diverse scenarios of daily travel is guaranteed, providing an economical, real-time and safe intelligent service for the blind.
Drawings
FIG. 1 is a flow chart of the travel method of the present invention;
FIG. 2 is a flow chart of developing the lightweight intelligent travel-assistance application used by the travel method of the present invention;
FIG. 3 is a flow chart of obstacle identification in the travel method of the present invention;
FIG. 4 is a flow chart of obstacle avoidance early warning in the travel method of the present invention;
FIG. 5 is a detailed flow chart of an embodiment of the present invention;
FIG. 6 is a block diagram of the travel system of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings, so that those skilled in the art can better understand the advantages and features of the present invention, and thus the scope of the present invention is more clearly defined. The embodiments described herein are only a few embodiments of the present invention, rather than all embodiments, and all other embodiments that can be derived by one of ordinary skill in the art without inventive faculty based on the embodiments described herein are intended to fall within the scope of the present invention.
Referring to fig. 1-6, the system for intelligently assisting the blind traveling in a lightweight complex environment comprises a blind guiding device 1, a cloud server 2 and a mobile terminal application 3;
the blind guiding device 1 performs real-time obstacle identification on the edge device and feeds back obstacle-avoidance early-warning information to help the blind avoid obstacles safely;
the cloud server 2 communicates with the blind guiding device 1 and receives the state information and position information updated by the blind guiding device 1;
and the mobile terminal application 3 interacts with the cloud server 2 to obtain the real-time position information and state information of the blind guiding device 1.
Embodiment:
referring to fig. 5, a method for intelligently assisting a blind person to go out in a lightweight complex environment includes the following steps:
step 1: collecting image data of characteristic obstacles influencing the trip safety of the blind person to establish a data set;
step 2: building a neural network model, and training the neural network model to identify the characteristic obstacles based on the characteristic obstacle data set in the step 1;
and step 3: quantizing the neural network model in the step 2 by using a full integer, and compressing the quantity of the neural network model;
in this step, the full integer quantization method converts the floating point type input and output data of the neural network model, the weight of the neural network model, and the bias data into unsigned 8-bit binary integers.
Step 4: developing the lightweight intelligent travel-assistance application on the embedded system microcontroller of the blind guiding device, based on the quantized neural network model of step 3.
Step 5: initializing the embedded system microcontroller, obstacle identification module, obstacle avoidance early warning module and multi-terminal interaction module of the blind guiding device;
in this step, the blind guiding device comprises a blind guiding base body and an intelligent auxiliary device;
in a preferred embodiment, the blind guiding base body comprises, but is not limited to, a blind guiding stick for assisting in travelling, a wearable blind guiding belt and blind guiding glasses;
the intelligent auxiliary device comprises an obstacle identification module, an obstacle avoidance early warning module and a multi-terminal interaction module;
the obstacle identification module comprises an image acquisition module and an auxiliary lighting module, wherein the image acquisition module comprises an image sensor and a light-weight intelligent auxiliary application module for the blind person to go out in a complex environment;
the obstacle avoidance early warning module comprises a distance measurement module, a vibration module and a voice synthesis broadcast module;
the multi-terminal interaction module comprises a key module, a communication module and a positioning module;
as a preferred embodiment, the image sensor module is any one of a CMOS image sensor or a CCD image sensor;
as a preferred embodiment, the distance measuring module is any one of an infrared distance measuring module, an ultrasonic distance measuring module and a laser distance measuring module;
as a preferred embodiment, the vibration module is any one of a linear motor vibration module and a rotor motor vibration module;
as a preferred embodiment, the communication module is any one of NB-IoT, e-MTC, 4G and 5G communication modules;
as a preferred embodiment, the communication module connects to the cloud server by using one of communication protocols such as LWM2M, MQTT, CoAP, and the like;
as a preferred implementation, the positioning module is any one of GNSS, GPS, and beidou satellite system.
Step 6: setting the ambient light threshold to 30 lux, acquiring the ambient light intensity S with a light intensity sensor, and turning on the lighting module to assist image acquisition when S is below the threshold.
In this step, the light intensity sensor transmits its readings to the embedded microprocessor core system; when S < 30 lux the device is assumed to have entered a shadowed area or the night hours, and the lighting module is turned on both to assist obstacle identification and to warn pedestrians and vehicles with its light.
Step 7: the image sensor collects environmental image data.
Step 8: the lightweight intelligent travel-assistance application identifies characteristic obstacles by running inference on the image data; proceed to step 9 when a characteristic obstacle is identified, and return to step 6 when none is identified.
Step 9: in this step, a safe distance of 6 m is set according to the walking speed of the blind and the maximum detection distance of the ultrasonic ranging equipment, with uniform division points at 2 m and 4 m splitting 0-6 m into 3 equal bands; the distance measurement module is started, step 9 is repeated while the obstacle distance is greater than 6 m, and step 10 is entered when the obstacle distance is less than 6 m.
Step 10: as shown in fig. 5, when the measured obstacle distance is within 4 m to 6 m, the vibration module performs somatosensory vibration obstacle-avoidance early warning at the frequency of preset mode PWM_set_3 and the voice broadcast module performs voice early warning in preset mode V_set_3; within 2 m to 4 m, preset modes PWM_set_2 and V_set_2 are used; and within 0 to 2 m, preset modes PWM_set_1 and V_set_1 are used.
Step 11: the blind guiding device uploads its current position information and device state information to the cloud server through the communication module for analysis and storage; the cloud server issues instructions to the blind guiding device through the communication module.
Step 12: the cloud server periodically pushes the information uploaded by the blind guiding device to the mobile terminal application; the mobile terminal application obtains the historical and latest information of the blind guiding device by accessing the cloud server.
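The patent leaves the upload format of steps 11 and 12 open (it names NB-IoT/e-MTC/4G/5G links and LWM2M/MQTT/CoAP protocols as options). Purely as an illustration, the position-and-state message could be serialized as JSON like this; every field name here is an assumption, not part of the patent.

```python
import json
import time

# Hypothetical status payload for the step-11 upload; the patent does not
# specify a schema, so all field names below are assumptions.

def build_status_payload(device_id, lat, lon, battery_pct, warning_level):
    """Serialize the guide device's position and state for the cloud server."""
    return json.dumps({
        "device": device_id,
        "ts": int(time.time()),                  # upload timestamp
        "position": {"lat": lat, "lon": lon},    # from the positioning module
        "battery": battery_pct,                  # device state information
        "warning": warning_level,                # current obstacle risk level
    })
```

With an MQTT client such as paho-mqtt, the device could publish this payload to a topic the cloud server subscribes to; the server would then push it on to the mobile terminal application.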
In conclusion, the invention collects obstacle image data affecting the travel safety of the blind to establish an obstacle data set of the blind travel environment, and trains a lightweight artificial intelligence model on it to intelligently help the blind cope with complex travel environments; by deploying the lightweight model on embedded equipment, it performs fast computation and efficient decision-making and reduces the dependence of an artificial intelligence solution on device performance and network resources; it adopts energy-saving, non-intrusive multi-level somatosensory vibration and voice early-warning modules to provide the blind with accurate obstacle-avoidance information; and it reduces cost and power consumption while guaranteeing the safety of the blind and visually impaired in the diverse scenarios of daily travel, providing an economical, real-time and safe intelligent service for the blind.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and block diagrams of methods, apparatus, systems and computer program products according to embodiments of the application. It will be understood that each flow and block of the flow diagrams and block diagrams, and combinations of flows and blocks in the flow diagrams and block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and block diagram block or blocks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting the same, and although the present invention is described in detail with reference to the above embodiments, those of ordinary skill in the art should understand that: modifications and equivalents may be made to the embodiments of the invention without departing from the spirit and scope of the invention, which is to be covered by the claims.

Claims (6)

1. A method for intelligently assisting a blind person to go out in a lightweight complex environment is characterized by comprising the following steps: the method comprises the following steps:
step S0: constructing an intelligent auxiliary system for the blind to go out;
step S1: developing intelligent auxiliary application for the blind person to go out in a lightweight complex environment;
step S2: obstacle identification;
step S3: obstacle avoidance and early warning;
step S4: multi-terminal interaction.
2. The lightweight intelligent method for assisting a blind person in traveling in a complex environment according to claim 1, characterized in that step S1 comprises the following steps:
step S1.1: collecting image data of characteristic obstacles that affect the travel safety of the blind person to establish a data set;
step S1.2: building a neural network model and training it to identify the characteristic obstacles based on the characteristic obstacle data set of step S1.1;
step S1.3: applying full-integer quantization to the neural network model of step S1.2 to compress the model size;
step S1.4: deploying the quantized neural network model of step S1.3 on the microcontroller of the blind guiding device's embedded system, and developing the lightweight intelligent travel-assistance application for complex environments on top of it.
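The full-integer quantization of step S1.3 can be sketched with a minimal affine (scale/zero-point) int8 mapping. This is an illustrative sketch only; the helper names are hypothetical and the patent does not specify a quantization scheme or framework:

```python
def quantize_params(w_min, w_max, num_bits=8):
    """Compute an affine scale and zero-point mapping [w_min, w_max] onto int8."""
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1  # -128 .. 127
    scale = (w_max - w_min) / (qmax - qmin)
    zero_point = round(qmin - w_min / scale)
    return scale, zero_point

def quantize(values, scale, zero_point):
    """Map float weights/activations to clipped int8 integers."""
    return [max(-128, min(127, round(v / scale) + zero_point)) for v in values]

def dequantize(qvalues, scale, zero_point):
    """Recover approximate float values from the int8 representation."""
    return [(q - zero_point) * scale for q in qvalues]
```

Replacing 32-bit floats with 8-bit integers in this way shrinks the model to roughly a quarter of its size and allows inference with integer-only arithmetic, which is what makes deployment on a microcontroller (step S1.4) practical.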
3. The lightweight intelligent method for assisting a blind person in traveling in a complex environment according to claim 1, characterized in that step S2 comprises the following steps:
step S2.1: initializing an embedded system microcontroller, an obstacle identification module, an obstacle avoidance early warning module and a multi-terminal interaction module of the blind guiding device;
the blind guiding device comprises a blind guiding base body and an intelligent auxiliary device;
the intelligent auxiliary device comprises an obstacle identification module, an obstacle avoidance early warning module and a multi-terminal interaction module;
the obstacle identification module comprises an image acquisition module and an auxiliary lighting module, wherein the image acquisition module comprises an image sensor and an edge artificial intelligence image identification application module;
the obstacle avoidance early warning module comprises a distance measurement module, a linear motor vibration module and a voice synthesis broadcast module;
the multi-terminal interaction module comprises a key module, a communication module and a positioning module;
step S2.2: setting an ambient light threshold L, acquiring the ambient light intensity S with a light intensity sensor, and turning on the auxiliary lighting module to aid image acquisition when S is below L;
step S2.3: an image sensor collects environmental image data;
step S2.4: the lightweight intelligent travel-assistance application runs inference on the image data to identify characteristic obstacles; when a characteristic obstacle is identified, proceed to step S3; otherwise, return to step S2.2.
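The acquisition-and-inference loop of steps S2.2 to S2.4 can be sketched as a single cycle. The sensor, illumination, capture, and inference callables below are hypothetical stand-ins for the device's actual drivers and quantized model, and the lux threshold is an assumed value not specified in the patent:

```python
AMBIENT_LIGHT_THRESHOLD_L = 50  # assumed lux threshold; the patent only names it L

def identification_cycle(read_light_intensity, set_illumination, capture_image, infer):
    """One pass of steps S2.2-S2.4; returns the detected obstacle label or None."""
    s = read_light_intensity()                        # step S2.2: measure ambient light S
    set_illumination(s < AMBIENT_LIGHT_THRESHOLD_L)   # fill light on only when too dark
    image = capture_image()                           # step S2.3: acquire environment image
    return infer(image)                               # step S2.4: edge inference; None = none found
```

On the device this cycle would repeat continuously, branching to the obstacle avoidance step only when `infer` reports a characteristic obstacle.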
4. The lightweight intelligent method for assisting a blind person in traveling in a complex environment according to claim 1, characterized in that step S3 comprises the following steps:
step S3.1: setting a safety distance S_safe and selecting N-1 uniformly or non-uniformly spaced division points S_risk_1, S_risk_2, ..., S_risk_(N-1) between 0 and S_safe;
in this step, the safety distance S_safe is determined by the blind person's walking pace and the maximum detection distance of the ultrasonic ranging device;
step S3.2: starting the ranging module to measure distance; repeat step S3.2 while the obstacle distance is greater than S_safe, and proceed to step S3.3 when it is less than S_safe;
step S3.3: when the measured obstacle distance lies in the range S_risk_(n-1) to S_safe, the vibration module performs somatosensory vibration obstacle avoidance early warning at the preset frequency PWM_set_n, and the voice broadcast module performs voice obstacle avoidance early warning in the preset mode V_set_n; when the measured obstacle distance lies in the range S_risk_(n-2) to S_risk_(n-1), the vibration module vibrates at the preset frequency PWM_set_(n-1), and the voice broadcast module warns in the preset mode V_set_(n-1); and so on, until, when the measured obstacle distance lies in the range 0 to S_risk_1, the vibration module vibrates at the preset frequency PWM_set_1 and the voice broadcast module warns in the preset mode V_set_1;
in this step, the suffix n of the distance S_risk_n denotes the risk level; the smaller n is, the higher the risk level;
in this step, the suffix n of PWM_set_n and V_set_n denotes the multi-level early warning mode; the smaller n is (i.e. the higher the danger level), the higher the vibration frequency and vibration amplitude of the vibration module, and the faster the speech rate and louder the broadcast volume of the voice broadcast module, so that dangers of multiple levels are warned in a graded, progressive manner.
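The graded distance bands of steps S3.1 to S3.3 amount to a mapping from measured distance to a risk level n. A minimal sketch of that mapping follows; the function name and the example values (S_safe = 2.0 m with uniform division points at 0.5/1.0/1.5 m) are illustrative assumptions, not figures from the patent:

```python
def warning_level(distance, s_safe, thresholds):
    """Map a measured obstacle distance to a risk level n (smaller n = more danger).

    `thresholds` holds the division points S_risk_1 < ... < S_risk_(N-1),
    all below `s_safe`. Returns None when the obstacle is beyond the safety
    distance (step S3.2 keeps ranging); otherwise returns the band index n,
    which selects the preset PWM_set_n / V_set_n warning mode.
    """
    if distance >= s_safe:
        return None                      # outside S_safe: no warning, keep measuring
    for n, s_risk_n in enumerate(thresholds, start=1):
        if distance < s_risk_n:
            return n                     # band [S_risk_(n-1), S_risk_n)
    return len(thresholds) + 1           # outermost band [S_risk_(N-1), S_safe)
```

For example, with the assumed thresholds an obstacle at 0.3 m falls in the most dangerous band (n = 1), so the device would drive the strongest vibration preset and the fastest, loudest voice broadcast.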
5. The lightweight intelligent method for assisting a blind person in traveling in a complex environment according to claim 1, characterized in that step S4 comprises the following steps:
step S4.1: the blind guiding device uploads its current position information and device state information to the cloud server through the communication module for analysis and storage; the cloud server issues instructions to the blind guiding device through the communication module;
step S4.2: the cloud server periodically pushes the information uploaded by the blind guiding device to the mobile terminal application program; the mobile terminal application program obtains the historical and latest information of the blind guiding device by accessing the cloud server.
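The device-to-cloud exchange of steps S4.1 and S4.2 can be sketched with an illustrative JSON payload and a toy in-memory server. The field names and classes below are assumptions for illustration; the claims do not define a wire format or server API:

```python
import json
import time

def build_status_report(device_id, lat, lon, battery_pct):
    """Assemble the position/state message the guide device uploads (step S4.1)."""
    return json.dumps({
        "device_id": device_id,
        "timestamp": int(time.time()),
        "position": {"lat": lat, "lon": lon},
        "status": {"battery_pct": battery_pct},
    })

class MockCloudServer:
    """Toy stand-in for step S4.2: stores uploads and serves them to the app."""
    def __init__(self):
        self.history = []      # analysis and storage of every uploaded report

    def receive(self, report_json):
        self.history.append(json.loads(report_json))

    def latest(self):
        """What the cloud would push to the mobile terminal application."""
        return self.history[-1] if self.history else None
```

Keeping the full `history` mirrors the claim's requirement that the mobile application can retrieve both historical and latest information from the cloud server.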
6. A lightweight intelligent system for assisting a blind person in traveling in a complex environment according to claim 1, characterized in that the system comprises a blind guiding device, a cloud server and a mobile terminal application program;
the blind guiding device performs real-time obstacle identification on the edge device and feeds back obstacle avoidance early warning information to assist the blind person in safely avoiding obstacles;
the cloud server is communicated with the blind guiding device and receives state information and position information updated by the blind guiding device;
the mobile terminal application program interacts with the cloud server to acquire the real-time position information and state information of the blind guiding device.
CN202111480420.0A 2021-12-06 2021-12-06 Light-weight intelligent method and system for assisting blind person in going out in complex environment Pending CN114191267A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111480420.0A CN114191267A (en) 2021-12-06 2021-12-06 Light-weight intelligent method and system for assisting blind person in going out in complex environment


Publications (1)

Publication Number Publication Date
CN114191267A true CN114191267A (en) 2022-03-18

Family

ID=80650769

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111480420.0A Pending CN114191267A (en) 2021-12-06 2021-12-06 Light-weight intelligent method and system for assisting blind person in going out in complex environment

Country Status (1)

Country Link
CN (1) CN114191267A (en)

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2248956Y (en) * 1995-12-21 1997-03-05 山东师范大学 Hand holding type multifunction device for guiding blind person
CN101227539A (en) * 2007-01-18 2008-07-23 联想移动通信科技有限公司 Blind guiding mobile phone and blind guiding method
CN102641197A (en) * 2011-02-16 2012-08-22 中兴通讯股份有限公司 Portable guide terminal and guide method
CN106074099A (en) * 2016-06-13 2016-11-09 李彤轩 Blind person's special intelligent guides system
CN106389078A (en) * 2016-11-24 2017-02-15 贵州大学 Intelligent blind guiding glass system and blind guiding method thereof
CN107175645A (en) * 2017-07-05 2017-09-19 深圳悉罗机器人有限公司 Mobile robot
CN107588780A (en) * 2017-10-27 2018-01-16 朱秋华 A kind of intelligent blind guiding system
US20180185232A1 (en) * 2015-06-19 2018-07-05 Ashkon Namdar Wearable navigation system for blind or visually impaired persons with wireless assistance
CN108606916A (en) * 2018-05-24 2018-10-02 安徽大学 A kind of intelligent blind-guiding apparatus and system
CN208448080U (en) * 2018-03-26 2019-02-01 浙江师范大学 A kind of multifunctional intellectual blind-guiding stick
CN109662830A (en) * 2019-01-18 2019-04-23 湖南师范大学 A kind of language blind guiding stick, the deep neural network optimization method based on the walking stick
CN109902745A (en) * 2019-03-01 2019-06-18 成都康乔电子有限责任公司 A kind of low precision training based on CNN and 8 integers quantization inference methods
CN110081895A (en) * 2019-04-26 2019-08-02 宁波财经学院 A kind of intelligent blind-guiding method and system
CN110135580A (en) * 2019-04-26 2019-08-16 华中科技大学 A kind of full integer quantization method and its application method of convolutional network
CN110478204A (en) * 2019-07-25 2019-11-22 李高轩 A kind of glasses for guiding blind of combination image recognition and its blind guiding system of composition
CN111529324A (en) * 2020-05-07 2020-08-14 重庆工程学院 Blind guide instrument based on computer vision
CN111985495A (en) * 2020-07-09 2020-11-24 珠海亿智电子科技有限公司 Model deployment method, device, system and storage medium
CN112168634A (en) * 2020-10-29 2021-01-05 中国电子科技集团公司第二十八研究所 Multifunctional blind guiding stick
CN112508125A (en) * 2020-12-22 2021-03-16 无锡江南计算技术研究所 Efficient full-integer quantization method of image detection model
CN214632899U (en) * 2020-05-18 2021-11-09 福建农林大学 Intelligent guide walking stick


Similar Documents

Publication Publication Date Title
US11094191B2 (en) Distributed safety infrastructure for autonomous vehicles and methods of use
US9615066B1 (en) Smart lighting and city sensor
KR102657921B1 (en) End-to-end system training using fused images
CN116017818A (en) Intelligent control method and system for intelligent urban road illumination based on Internet of things
KR20230152643A (en) Multi-modal segmentation network for enhanced semantic labeling in mapping
CN114971290B (en) Park management system and method based on intelligent street lamp
CN113034938A (en) Intelligent traffic system for city management
CN111951548A (en) Vehicle driving risk determination method, device, system and medium
US11209830B2 (en) Safety aware automated governance of vehicles
CN110838219A (en) Danger prediction alarm method and device
CN112414424B (en) Blind person navigation method and blind person navigation device
CN114191267A (en) Light-weight intelligent method and system for assisting blind person in going out in complex environment
CN117237475A (en) Vehicle traffic track generation method and device based on diffusion generation model
CN111489565B (en) Intelligent traffic system based on big data and control method thereof
KR20230143961A (en) Vehicle action selection based on simulated states
CN114120665B (en) Intelligent phase control method and system based on pedestrian number
CN110446106B (en) Method for identifying front camera file, electronic equipment and storage medium
DE102022100413A1 (en) OBJECT DETECTION USING RADAR AND LIDAR COMBINATION
Lin et al. Application of the efficientdet algorithm in traffic flow statistics
Satyanarayana et al. A laser curtain for detecting heterogeneous lane-less traffic
KR102645980B1 (en) Generating corrected future maneuver parameters in a planner
Azfar et al. Incorporating Vehicle Detection Algorithms via Edge Computing on a Campus Digital Twin Model
WO2024066798A1 (en) Vehicle control method and apparatus, and device and storage medium
TWI736955B (en) Blind-man navigation method and blind-man navigation device
WO2023279396A1 (en) Operational design domain identification method and apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination