CN113885704B - Man-machine interaction method and system for blind guiding vehicle


Info

Publication number
CN113885704B
CN113885704B (application CN202111160701.8A)
Authority
CN
China
Prior art keywords
blind
blind guiding
guiding vehicle
vehicle
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111160701.8A
Other languages
Chinese (zh)
Other versions
CN113885704A (en)
Inventor
王映焓
蒲虹旭
王毅
王宜飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ziqing Zhixing Technology Beijing Co ltd
Original Assignee
Ziqing Zhixing Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ziqing Zhixing Technology Beijing Co ltd filed Critical Ziqing Zhixing Technology Beijing Co ltd
Priority to CN202111160701.8A
Publication of CN113885704A
Application granted
Publication of CN113885704B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H3/06 Walking aids for blind persons
    • A61H3/061 Walking aids for blind persons with electronic detecting or guiding means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07B TICKET-ISSUING APPARATUS; FARE-REGISTERING APPARATUS; FRANKING APPARATUS
    • G07B15/00 Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points
    • G07B15/02 Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points taking into account a variable factor such as distance or time, e.g. for passenger transport, parking systems or car rental systems
    • G07B15/04 Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points taking into account a variable factor such as distance or time, e.g. for passenger transport, parking systems or car rental systems comprising devices to free a barrier, turnstile, or the like
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H3/06 Walking aids for blind persons
    • A61H3/061 Walking aids for blind persons with electronic detecting or guiding means
    • A61H2003/063 Walking aids for blind persons with electronic detecting or guiding means with tactile perception
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/50 Control means thereof
    • A61H2201/5023 Interfaces to the user
    • A61H2201/5048 Audio interfaces, e.g. voice or music controlled
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/011 Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Abstract

The embodiments of the invention disclose a man-machine interaction method and system for a blind guiding vehicle. The method comprises the following steps: the blind guiding vehicle receives a data instruction sent by the controller, the data instruction being obtained by converting a voice instruction received by an earphone and containing indication information for starting the blind guiding vehicle; the blind guiding vehicle receives positioning information from the fixed UWB positioning devices, calculates a line-of-sight connected domain topological graph, plans a passable path to the bracelet and drives toward the bracelet along the passable path; the blind guiding vehicle periodically receives the positioning information of the fixed UWB positioning devices and updates the passable path based on the multi-sensor fusion SLAM function and the dynamic obstacle avoidance function until the target point is reached.

Description

Man-machine interaction method and system for blind guiding vehicle
Technical Field
The invention relates to the technical field of blind guiding vehicles, in particular to a man-machine interaction method and a man-machine interaction system for a blind guiding vehicle.
Background
With the improvement of urban infrastructure and the diversification of transportation and travel modes, ordinary people enjoy increasingly economical and convenient travel. However, the accelerating pace of social life means that disabled or mobility-impaired people can obtain less accompaniment from their families, and independent travel remains a major problem for visually impaired people.
Before information and electronic technology was well developed, the blind guiding dog and the blind guiding stick played important roles. However, training and raising a blind guiding dog requires continuous investment, the cost is high and the service time is limited, and the contact-based interaction of the traditional blind road and blind guiding stick can hardly cope with complex urban traffic scenes. Scientific and technical workers have therefore sought widely beneficial solutions and produced a number of practical designs, such as intelligent blind guiding sticks, intelligent blind guiding e-dogs and intelligent blind guiding helmets.
However, existing designs such as blind guiding sticks and blind guiding e-dogs are mainly intended for specific scenes and use only one or two sensors or actuators for man-machine interaction, such as voice prompts and vibration feedback. The whole man-machine interaction process lacks systematic human-factors engineering design and deep semantic understanding and association capability, so the interaction experience is poor in real working scenes.
Disclosure of Invention
It is an object of the present invention to provide a human-machine interaction method and system for a blind-guide vehicle that overcomes or at least alleviates at least one of the above-mentioned drawbacks of the prior art.
In order to achieve the above object, an embodiment of the present invention provides a human-computer interaction system for a blind guiding vehicle, including: a blind guiding vehicle, an earphone, a bracelet and a controller, wherein
The earphone is used for receiving a voice command, converting the voice command into a data command and forwarding the data command to the controller;
the controller is used for sending the data instruction to the blind guiding vehicle; the data instruction comprises indication information for starting the blind guiding vehicle;
the blind guiding vehicle is configured with a multi-sensor fusion SLAM function, a dynamic obstacle avoidance function and an ultra wide band UWB positioning function, and is used for receiving the data instruction sent by the controller, and when the data instruction contains indication information for starting the blind guiding vehicle: planning a passable path reaching the bracelet according to the blind guiding vehicle and the position information of the bracelet, and driving the bracelet according to the passable path.
Preferably, a traction ring is arranged at the tail end of the guide rod at the rear end of the blind guiding vehicle, and a buzzer is arranged on the traction ring for emitting a prompt sound after the blind guiding vehicle reaches the target point.
Preferably, the blind guiding vehicle is used for: receiving the positioning information of the fixed UWB positioning devices and calculating the line-of-sight connected domain topological graph according to the positioning information, wherein the bracelet and the blind guiding vehicle are each located within the coverage of at least two fixed UWB positioning devices.
Preferably, the blind guiding vehicle is used for informing the controller when reaching a target point;
the controller is used for playing the prompt tone through the earphone.
Preferably, the controller is configured to: when the data instruction comprises destination information, planning a route to a destination by using a preset blind guiding map service, prompting the route to the destination through the earphone voice, and sending the route to the blind guiding vehicle after the route to the destination is confirmed;
the blind guiding vehicle is used for: according to the route to the destination, planning a driving path of the blind guiding vehicle and the user as a whole; according to the information detected by the sensor and the scene switching position information in the route to the destination, identifying the area scene, wherein the area scene comprises at least one of the following: the system comprises a non-public blind guiding facility region, a public blind road facility region, an obstacle-free blind road facility region, an obstacle-contained blind road facility region, a spanning discontinuous blind road region and a public transportation transfer region.
Preferably, the system further comprises:
the blind guiding insole is internally provided with a plurality of pressure sensors and a plurality of vibration feedback devices; the pressure sensors are respectively arranged on the two sides of the forefoot, the middle of the arch and the heel, and are used for monitoring load deviation in the front-back and left-right directions of the foot; the plurality of vibration feedback devices are respectively arranged at the big toe and the little toe at the left and right ends of the front of the foot and are used for lateral feedback on the left and right sides.
Preferably, a variance threshold of the pressure sensors on the blind road is preset in the controller, which is used for receiving the pressures detected by the pressure sensors and calculating the variance; if the calculated variance is smaller than the variance threshold, it is judged that the blind guiding insole has deviated from the blind road, and the corresponding vibration feedback device is controlled to vibrate as a reminder that the user has left the blind road.
Preferably, the controller is configured to: and calculating the ratio of the pressure measured by the forefoot pressure sensor to the pressure measured by the heel pressure sensor, and informing the blind guiding vehicle to reduce the vehicle speed when the ratio exceeds a preset threshold value.
Preferably, the controller is configured to: calculate the step frequency of the blind person according to the data collected by the pressure sensors and adjust the speed of the blind guiding vehicle in combination with a preset step frequency threshold value; or calculate the blind person's step pitch according to the current speed of the blind guiding vehicle and the step frequency, and adjust the speed of the blind guiding vehicle in combination with a preset step pitch threshold value.
The embodiment of the invention also provides a man-machine interaction method for the blind guiding vehicle, which is applied to any one of the man-machine interaction systems and comprises the following steps:
the blind guiding vehicle receives the data instruction sent by the controller; the data instruction is obtained by converting a voice instruction received by the earphone and comprises indication information for starting the blind guiding vehicle;
the blind guiding vehicle receives the positioning information of the fixed UWB positioning devices, calculates the line-of-sight connected domain topological graph and plans a passable path to the bracelet, and drives toward the bracelet along the passable path;
the blind guiding vehicle periodically receives the positioning information of the fixed UWB positioning device, and updates the passable path based on the multi-sensor fusion SLAM function and the dynamic obstacle avoidance function until reaching a target point.
Due to the adoption of the technical scheme, the invention has the following advantages:
the man-machine interaction method and the man-machine interaction system for the blind guiding vehicle, provided by the embodiment of the invention, have the advantages that the voice calling function of the blind guiding vehicle is realized, and the use experience of a user is improved. Moreover, blind guiding scene requirements of various scenes can be met, walking-level advancing deviation detection and auxiliary deviation correction are provided, the wearable equipment is high in fitting degree with a user, multi-level body feeling linkage feedback can be provided for the user, and the applicability of blind guiding in various scenes is effectively improved.
Drawings
Fig. 1 is a schematic view of a human-computer interaction system for a blind guiding vehicle according to an embodiment of the present invention.
Fig. 2 shows another schematic diagram of a human-computer interaction system provided by the embodiment of the invention.
Fig. 3 shows a deployment diagram of an indoor fixed UWB locating device provided by an embodiment of the present invention.
Fig. 4 shows a schematic diagram of a blind guiding vehicle planning a passable path according to an embodiment of the present invention.
Fig. 5 shows a schematic diagram of a blind guiding vehicle establishing a line-of-sight connected domain according to an embodiment of the present invention.
Fig. 6 shows a schematic structural diagram of a blind guiding insole provided by an embodiment of the invention.
Fig. 7 shows a schematic diagram of the blind guiding insole provided by the embodiment of the invention for blind road deviation reminding and auxiliary deviation correction.
Fig. 8 shows a timing diagram of insole pressure during blind-guiding walking provided by the embodiment of the invention.
Fig. 9 shows another schematic diagram of a human-computer interaction system provided by the embodiment of the invention.
Fig. 10 is a flow chart diagram illustrating a man-machine interaction method for a blind guiding vehicle according to an embodiment of the present invention.
Detailed Description
In the drawings, the same or similar reference numerals are used to denote the same or similar elements or elements having the same or similar functions. Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
In the description of the present invention, the terms "central", "longitudinal", "lateral", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc., indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience in describing the present invention and simplifying the description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed in a particular orientation, and be operated, and therefore, should not be construed as limiting the scope of the present invention.
In the present invention, the technical features of the embodiments and implementations may be combined with each other without conflict, and the present invention is not limited to the embodiments or implementations in which the technical features are located.
The present invention will be further described with reference to the accompanying drawings and specific embodiments, it should be noted that the technical solutions and design principles of the present invention are described in detail in the following only by way of an optimized technical solution, but the scope of the present invention is not limited thereto.
The following terms are referred to herein, and their meanings are explained below for ease of understanding. It will be understood by those skilled in the art that the following terms may have other names, but any other names should be considered consistent with the terms set forth herein without departing from their meaning.
The embodiment of the invention provides a man-machine interaction system for a blind guiding vehicle, and fig. 1 shows a schematic diagram of the man-machine interaction system, which comprises: a blind guiding vehicle 10, a headset 20, a bracelet 30 and a controller 40, wherein
The earphone 20 is used for receiving voice instructions, converting the voice instructions into data instructions and forwarding the data instructions to the controller 40;
the controller 40 is used for sending the data instruction to the blind guiding vehicle 10; the data instruction comprises indication information for starting the blind guiding vehicle;
the blind guiding vehicle 10 is configured with a multi-sensor fusion SLAM function, a dynamic obstacle avoidance function and a UWB (ultra wide band) positioning function, and is configured to receive the data command sent by the controller 40, and when the data command contains indication information for starting the blind guiding vehicle: planning a passable path reaching the bracelet according to the position information of the blind guiding vehicle 10 and the bracelet 30, and driving to the bracelet according to the passable path.
In one embodiment, a traction ring is disposed at the end of the guide rod at the rear end of the blind guiding vehicle 10, and a buzzer is disposed on the traction ring for emitting the warning sound after the blind guiding vehicle reaches the target point.
In one implementation, a magnetic attachment device is further arranged on the traction ring for attaching the traction ring to the smart bracelet.
In one embodiment, the blind guiding vehicle is configured to: receive the positioning information of the fixed UWB positioning devices and calculate the line-of-sight connected domain topological graph according to the positioning information, wherein the bracelet and the blind guiding vehicle are each located within the coverage of at least two fixed UWB positioning devices.
In one embodiment, the blind guiding vehicle is used for informing the controller when reaching a target point; the controller is used for playing the prompt tone through the earphone.
In one embodiment, the controller is configured to: when the data instruction comprises destination information, planning a route to a destination by using a preset blind guiding map service, prompting the route to the destination through the earphone voice, and sending the route to the blind guiding vehicle after the route to the destination is confirmed; the blind guiding vehicle is used for: according to the route to the destination, planning a driving path of the blind guiding vehicle and the user as a whole; according to the information detected by the sensor and the scene switching position information in the route to the destination, identifying the area scene, wherein the area scene comprises at least one of the following: the system comprises a non-public blind guiding facility region, a public blind road facility region, an obstacle-free blind road facility region, an obstacle-contained blind road facility region, a spanning discontinuous blind road region and a public transportation transfer region.
In one embodiment, the human-computer interaction system further includes a blind-guiding insole 60, in which a plurality of pressure sensors and a plurality of vibration feedback devices are disposed; the pressure sensors are respectively arranged on the two sides of the forefoot, the middle of the arch and the heel, and are used for monitoring load deviation in the front-back and left-right directions of the foot; the vibration feedback devices are respectively arranged at the big toe and the little toe at the left and right ends of the front of the foot and are used for lateral feedback on the left and right sides.
In one embodiment, a pressure variance threshold of the pressure sensors on the blind road is preset in the controller, and is used for receiving the pressure detected by the pressure sensors, calculating a variance, and if the calculated variance is smaller than the variance threshold, judging that the position of the current blind guiding insole deviates from the blind road, and controlling a corresponding vibration feedback device to vibrate to remind the blind road to deviate.
In one embodiment, the controller is configured to: and calculating the ratio of the pressure measured by the forefoot pressure sensor to the pressure measured by the heel pressure sensor, and informing the blind guiding vehicle to reduce the vehicle speed when the ratio exceeds a preset threshold value.
In one embodiment, the controller is configured to: calculate the step frequency of the blind person according to the data acquired by the pressure sensors and adjust the speed of the blind guiding vehicle in combination with a preset step frequency threshold value.
In one embodiment, the controller is configured to: calculate the blind person's step pitch according to the current speed of the blind guiding vehicle and the step frequency, and adjust the speed of the blind guiding vehicle in combination with a preset step pitch threshold value.
For the convenience of understanding, the human-computer interaction system provided by the invention is described by the specific embodiment below. It should be noted that the following specific examples are only for illustrating the present invention, and the specific description thereof is not intended to limit the present invention.
Fig. 2 shows another schematic diagram of a human-computer interaction system provided by the embodiment of the invention. As shown in fig. 2, the system includes a pair of earphones 20, a bracelet 30, a pair of blind-guiding insoles 50, a controller 40 (e.g. mobile phone) and the blind-guiding vehicle 10, which together constitute a wearable blind-guiding system.
(1) Blind guiding vehicle 10
The blind guiding vehicle 10 mainly comprises a positioning and navigation module, an environment sensing module, a dynamic obstacle avoidance module and a wireless communication module. The positioning module is responsible for tracking space coordinate information in real time and supports the functions of absolute positioning and indoor space relative positioning; the environment perception module supports the space perception functions of static facilities and dynamic obstacles; the dynamic obstacle avoidance module is used for planning a passable route by taking the blind guiding vehicle and the blind as a whole; the wireless communication module is responsible for communicating with a user and external GSM/5G to update state information, and realizes relative positioning with the blind through UWB communication.
(2) Controller 40
The controller 40 is the communication and decision centre of the whole blind guiding system and comprises four modules: map blind guiding, speech recognition and synthesis, emergency help and wireless communication. The map blind guiding module is responsible for route planning and feasibility analysis according to a destination set by the user's voice. The speech recognition and synthesis module is the core medium of man-machine interaction and is responsible for accurately recognising the user's speech, correctly understanding the semantics of the scene the user is in, and scheduling the other functional modules to help achieve the goal. The emergency help module maintains the user's identity information and provides voice and physical-touch means to quickly initiate remote assistance requests such as voice and video calls. The wireless communication module communicates with the outside through GSM/5G to update information, and exchanges information with the other functional modules in the blind guiding system through short-range wireless technologies such as Bluetooth. The controller 40 may be a mobile phone, or another device having the above functions.
(3) Earphone 20
The earphone 20 is an important medium for communication of user behavior intention in the blind guiding system, and in the present example, the earphone 20 includes three modules of dynamic noise reduction and sound amplification, sound pickup and wireless communication. The dynamic noise reduction module can dynamically switch the noise reduction level according to the ambient environment condition of the user and switch the transparent mode according to the requirement of the user on environment perception. The pickup module has the functions of triggering keywords and picking up sound and reducing noise. The wireless communication module is connected with the smart phone through Bluetooth communication to transmit audio and control information.
(4) Bracelet 30
The bracelet 30 is a main medium for transmitting the body action intention of the user, and comprises four modules of inertial posture measurement, NFC authentication and payment, blind guiding traction matching and wireless communication. The inertial attitude measurement module is responsible for monitoring the travelling speed of a user and monitoring abnormal body attitudes such as falling down; the NFC authentication and payment module is used for community and building access control and public traffic non-contact payment; the blind guiding traction matching module is used for matching with a blind guiding vehicle when a user calls the blind guiding vehicle; the wireless communication module realizes relative positioning with the blind guiding vehicle through UWB communication, and information is transmitted through Bluetooth communication smart phone interconnection.
(5) Blind guiding insole 50
The blind guiding insole 50 is the most direct means for the somatosensory interactive feedback of a user, and comprises three modules of ground flatness identification, a vibration feedback array and wireless communication. The ground flatness identification module is used for assisting a user in detecting the walking ground condition and whether the user walks on a blind road or deviates; the vibration feedback array module is used for prompting whether a user deviates from a blind road and the deviated direction and assisting correction through the vibration of a plurality of vibration points distributed in the insole; the wireless communication module is connected with the smart phone through Bluetooth communication to transmit information.
The operation process of the man-machine interaction system when the blind-guiding vehicle is started indoors is described below by a specific embodiment.
In the starting process, the relative ranging and positioning capability of UWB communication between the bracelet and the blind guiding vehicle is used, combined with audio prompts from the bracelet and the earphone, to realise the indoor calling function of the blind guiding vehicle. Fixed UWB positioning devices (or modules) are deployed, with reference to the indoor layout, in the areas where the blind guiding vehicle may be called. The principle for choosing deployment points is to ensure that the blind guiding vehicle and the blind user are within the line-of-sight coverage of at least 2 UWB positioning devices, so that the user and the blind guiding vehicle achieve good indoor relative positioning accuracy. Fig. 3 shows a deployment schematic of the indoor fixed UWB positioning devices.
The starting process specifically comprises the following steps:
step one, voice awakening and calling
The user wears the intelligent earphone and speaks a voice wake-up word such as "Hello, little helper". The earphone gives feedback, e.g. a "beep", to indicate that it is listening for a user instruction. The user then speaks a voice instruction such as "Call the blind guiding vehicle". The voice instruction is correctly parsed and forwarded via the smartphone to the blind guiding vehicle, which is in a dormant state. The blind guiding vehicle performs system initialisation, completes sensor and software self-checks, enters the servo state, requests the user's position information, and feeds back through the uplink a voice prompt such as "The blind guiding vehicle has received the call and is on its way".
Step two: blind guiding vehicle execution path planning
The blind guiding vehicle and the user are positioned relative to each other through UWB communication. Unlike the traditional minimum three-point positioning mode, in this embodiment each device to be positioned only needs to remain within the coverage of two fixed UWB positioning devices. Since the blind guiding vehicle already has spatial SLAM and dynamic obstacle avoidance capability, UWB only needs to provide the line-of-sight connected domain from which a passable path is planned (referred to as one-time planning for short). As shown in fig. 4, the line of sight from position (4), where the blind user is located, covers UWB positioning devices (1) and (2), and the line of sight from position (5), where the blind guiding vehicle is located, covers UWB positioning devices (1) and (3). A line-of-sight connected domain is established on this basis, and the distance between any two points is used as the edge weight D_ij to build the connected domain graph shown in fig. 5. Based on this line-of-sight connected domain topological graph, the blind guiding vehicle can determine, through a shortest path algorithm such as Dijkstra's algorithm, that the currently reachable shortest path is (5), (1), (4) in sequence, which completes the one-time path planning.
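The one-time planning above reduces to a shortest-path search over a small weighted graph whose nodes are the fixed UWB positioning devices plus the bracelet and the blind guiding vehicle, and whose edge weights are the line-of-sight distances D_ij. A minimal Python sketch of this step is given below; the node labels follow the example of fig. 4 and fig. 5, while the distances and the helper name dijkstra_path are illustrative assumptions rather than values or APIs from the patent.

```python
import heapq

def dijkstra_path(graph, start, goal):
    """Shortest path over a line-of-sight connected-domain graph.

    graph: dict mapping node -> {neighbour: distance D_ij in metres}.
    Returns (total_distance, [node, ...]) or (inf, []) if unreachable.
    """
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    visited = set()
    while queue:
        d, node = heapq.heappop(queue)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return d, path[::-1]
        for nxt, w in graph.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(queue, (nd, nxt))
    return float("inf"), []

# Illustrative connected domain: (5) vehicle, (4) user, (1)-(3) fixed UWB anchors.
# Only node pairs in line of sight get an edge; the distances are made up.
los_graph = {
    "5": {"1": 6.0, "3": 4.0},
    "4": {"1": 5.0, "2": 3.0},
    "1": {"5": 6.0, "4": 5.0, "2": 7.0, "3": 8.0},
    "2": {"4": 3.0, "1": 7.0},
    "3": {"5": 4.0, "1": 8.0},
}

if __name__ == "__main__":
    total, path = dijkstra_path(los_graph, "5", "4")
    print(path, total)   # e.g. ['5', '1', '4'] 11.0
```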
Step three: dynamic obstacle avoidance and tracking travel of blind guiding vehicle
The blind guiding vehicle is provided with a single-line laser radar and an inertial navigation sensor, and has the ability to avoid obstacles and travel as planned within the indoor line-of-sight space. With the connected domain graph weight distance D_ij constructed by the one-time path planning in step two as a scalar reference, the secondary route planning dynamically calculates the direction of travel. The travel direction keeps approaching the next line-of-sight target position, and the judging method is as follows: from the current position, the vehicle moves with dynamic obstacle avoidance in the direction of positioning device (1) over a distance increment whose length is larger than the minimum UWB position error (10 cm); the current distances D_p1 and D_p3 between the current position p and the adjacent positioning devices are then re-evaluated, the next travel direction A_p1 is planned, and the vehicle continues to advance with dynamic obstacle avoidance.
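A rough sketch of how the secondary planning described above could be organised is given below: the vehicle follows the waypoint sequence from the one-time planning, treats a waypoint as reached once the UWB range to it falls below a tolerance somewhat larger than the 10 cm minimum UWB position error, and delegates the actual motion to the on-board dynamic obstacle avoidance. The callables uwb_range and avoid_and_step and the tolerance value are placeholders assumed for illustration, not interfaces defined in the patent.

```python
UWB_MIN_ERROR_M = 0.10          # minimum UWB position error quoted in the text
WAYPOINT_TOLERANCE_M = 0.20     # assumed: treat a waypoint as reached below this range

def follow_waypoints(waypoints, uwb_range, avoid_and_step):
    """Track the one-time-planned waypoints with dynamic obstacle avoidance.

    waypoints:      ordered node ids from the one-time planning, e.g. ["1", "4"].
    uwb_range:      callable(node_id) -> current distance to that anchor in metres.
    avoid_and_step: callable(node_id) -> executes one control step toward the anchor
                    while avoiding obstacles (placeholder for the vehicle stack).
    """
    for target in waypoints:
        while True:
            d = uwb_range(target)
            # Only trust progress checks coarser than the UWB error floor.
            if d <= max(WAYPOINT_TOLERANCE_M, UWB_MIN_ERROR_M):
                break
            avoid_and_step(target)
```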
Step four: user matching and connecting blind guiding vehicle traction ring
When the blind guiding vehicle reaches a target point (for example, a position which is less than 20cm away from the user position is detected), the controller acquires feedback of the blind guiding vehicle, and prompts the user that the blind guiding vehicle is in place through the earphone voice.
The tail end of the extension guide rod at the rear of the blind guiding vehicle can be provided with a traction ring, whose end carries an active buzzer that slowly emits a "beep" sound. When the blind guiding vehicle approaches the user and stops, the smart earphone automatically switches to the transparent mode, so that the blind person can locate the spatial direction of the traction ring at close range by hearing and reach out towards it. The end of the traction ring can also be fitted with a guide (auxiliary safety) rope connected to a magnetic snap button; when the bracelet approaches to within 10 cm, the snap automatically attaches to the magnetic seat on the bracelet, the user can slide the hand along the guide rope to grasp the traction ring, the user is thus coupled to the device, and the prompt sound stops. The magnetic guide also allows the traction ring to be found again quickly if it is briefly released during blind guiding.
The function of the blind guiding insole is explained below by a specific embodiment.
Fig. 6 shows a structural schematic diagram of the blind guiding insole. As shown, each blind guiding insole is equipped with 4 pressure sensors 61 and 2 vibration feedback devices 62. It will be readily appreciated that these numbers are merely exemplary, and other numbers of pressure sensors and vibration feedback devices may be provided. The pressure sensors are deployed to monitor the main ground-contact load points of a normal foot, namely the two sides of the forefoot, the middle of the arch and the heel, so that load deviation in both the front-back and left-right directions of the foot can be monitored. The vibration feedback devices are arranged at the big toe and the little toe at the left and right ends of the front of the foot, avoiding the main load points and lying closer to the peripheral tactile nerves, so that they can effectively provide lateral feedback on the left and right sides.
(I) Blind road deviation reminding and auxiliary deviation correction
While being pulled behind the blind guiding vehicle, the blind person continuously searches for the blind road through the sense of touch of the feet, so deviation from the blind road can easily occur. Blind road deviation identification and assisted correction rely mainly on the irregular pressure pattern produced by the uneven blind road, measured by the dispersed pressure sensors of each smart insole. When a single foot shows continuous, slow and consistent loading at several point locations over many steps, vibration feedback is given only to the foot on the deviating side, reminding the blind user to judge whether he or she has left the blind road and to correct laterally. For the blind user, the pressure mean F_avg and variance D of each point location over multiple consecutive steps are computed to calibrate the parameter values on the blind road, and the calibrated variance threshold D is used as the threshold for judging deviation. In other words, the pressure variance D of a single point location over multiple consecutive steps is preset as a threshold; the specific number of steps can be set flexibly according to actual needs.
As shown in fig. 7, at time T0 the blind person walks normally along the blind road; the walking track then drifts to the right, and at time T1 the right foot has completely left the blind road and remains off the blind road for the following n steps. During T1 to T2 the variance D_n_right computed in real time is less than the variance threshold D, while the variance D_n_left of the foot on the other side is not below the threshold; it is therefore judged that a single foot has deviated from the blind road, the vibrator of the right foot vibrates to indicate which foot has left the blind road, and the assisted correction is completed at time T3.
A deviation from the blind road can also be judged when only the variance corresponding to the outer forefoot pressure sensor is smaller than the variance threshold D while the remaining point locations have not deviated from the blind road.
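A minimal sketch of the variance-based deviation check described above, assuming per-foot, per-point pressure values recorded once per step: the variance over the last n steps is compared with a threshold D calibrated on the blind road, and a foot whose variance drops below the threshold while the other foot stays above it is flagged as the deviating side. The window length, threshold value and function names are illustrative assumptions.

```python
from statistics import pvariance

BLIND_ROAD_VARIANCE_THRESHOLD = 4.0   # calibrated on the blind road (illustrative)
WINDOW_STEPS = 8                      # n consecutive steps to evaluate (illustrative)

def foot_off_blind_road(pressures):
    """pressures: list of per-step pressure values for one point location."""
    window = pressures[-WINDOW_STEPS:]
    if len(window) < WINDOW_STEPS:
        return False                  # not enough data yet
    # Smooth, consistent loading (low variance) suggests flat ground, not blind road.
    return pvariance(window) < BLIND_ROAD_VARIANCE_THRESHOLD

def deviation_side(left_pressures, right_pressures):
    """Return 'left', 'right' or None depending on which single foot has deviated."""
    left_off = foot_off_blind_road(left_pressures)
    right_off = foot_off_blind_road(right_pressures)
    if left_off and not right_off:
        return "left"
    if right_off and not left_off:
        return "right"
    return None

# Usage: if deviation_side(...) == "right", drive the right insole's vibration devices.
```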
(II) blind person step pitch, step frequency and direction monitoring and self-adaptive speed
In the blind guiding process, the blind person needs to control the traveling speed so as to sense the underfoot and surrounding environment in real time, and the blind guiding vehicle also needs to dynamically adjust the traction speed so as to adapt to the walking speed of the blind person.
The main determining factors of the blind person's walking speed are the step pitch and the step frequency (cadence).
The single step pitch D_step can be determined in two ways. Method 1: taking into account the person's own conditions (height, posture, etc.) and environmental factors, the user walks a straight line of fixed length S_test (e.g. 10 m) and the number of steps N_step is counted; by performing the test once at a slow pace and once at a constant normal pace, the step distances D_slow and D_normal are roughly calibrated (unit: m). Method 2: the step pitch is estimated from the measured empirical value for a healthy person, where H is the height and λ is a reduction coefficient for a non-healthy person.
Method 1:
D_step = S_test / N_step
Method 2:
D_step = 0.262H + 155.911λ
For a healthy person the walking step frequency F_step has an empirical statistical value of 95 to 125 steps per minute; for the blind person it is estimated empirically with an attenuation coefficient μ, giving the range:
95μ ≤ F_step ≤ 125μ
The blind guiding insole samples the voltage of each pressure sensor at a fixed frequency and converts it into a single-point pressure value. Fig. 8 shows a pressure time chart of the insole during blind-guided walking, where the abscissa is time t and the ordinate is the pressure value N. When the sole touches the ground and bears the load, the measured sensor pressure P_step reaches its maximum peak; when the foot lifts and leaves the ground, the pressure reaches its minimum valley. The time between adjacent peaks on the horizontal axis corresponds to the period of F_step, i.e. one full cycle of a single step completed by one leg.
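As a sketch of how the step frequency could be read off the fixed-rate pressure samples of fig. 8: detect the peaks where the sole bears load, take the mean interval between adjacent peaks as the single-leg step period, and invert it to get steps per minute. The sampling rate and the simple peak criterion are assumptions made for illustration.

```python
def step_frequency_spm(samples, sample_rate_hz=100.0):
    """Estimate step frequency (steps/minute) of one leg from heel pressure samples.

    samples: sequence of pressure values taken at a fixed rate (fig. 8 style signal).
    A sample counts as a peak if it exceeds both neighbours and half the maximum,
    a crude but adequate criterion for a clean support/lift pattern.
    """
    if len(samples) < 3:
        return 0.0
    threshold = 0.5 * max(samples)
    peaks = [i for i in range(1, len(samples) - 1)
             if samples[i] > threshold
             and samples[i] > samples[i - 1]
             and samples[i] >= samples[i + 1]]
    if len(peaks) < 2:
        return 0.0
    intervals = [(b - a) / sample_rate_hz for a, b in zip(peaks, peaks[1:])]
    period_s = sum(intervals) / len(intervals)   # mean single-step period of one leg
    return 60.0 / period_s
```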
When a person stands upright wearing flat shoes or barefoot, the heel and the metatarsal region of the forefoot carry an empirical load distribution of roughly 57% to 43%, with the heel providing relatively more support. When walking speed increases, the centre of gravity shifts forward to match the speed, the forefoot continuously provides more support, and the force distribution across the sole changes. In the extreme case, when the resultant force on both soles is detected to be smaller than the statistical mean, the centre of gravity has probably shifted severely and a fall is very likely. The pedestrian's walking state is therefore monitored by detecting the pressure distribution on the soles, and the blind guiding vehicle speed is adjusted in time.
Through the above analysis, the manner of adjusting the speed of the blind guiding vehicle may include:
1: regulating the speed of the blind guiding vehicle according to the pressure distribution
A threshold for the ratio P_front : P_tail is preset, e.g. by choosing a maximum deviation factor δ in the range (0, 1); based on the standing load distribution above, the threshold condition can be written as:
P_front / P_tail > (0.43 / 0.57) × (1 + δ)
If the current P_front : P_tail acquired from the sensors is greater than the preset threshold, it is judged that the blind person is currently being pulled at too high a speed, and the blind guiding vehicle is adjusted to reduce its speed. P_front is the average of the measurements of the two front pressure sensors, and P_tail is the measurement of the rear heel pressure sensor.
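A minimal sketch of speed adjustment mode 1, assuming the threshold is the standing 43:57 forefoot-to-heel ratio scaled by (1 + δ); the δ value and the speed decrement are illustrative.

```python
STANDING_RATIO = 0.43 / 0.57   # empirical forefoot : heel load split when standing
DELTA = 0.3                    # maximum deviation factor, chosen in (0, 1)
SPEED_STEP_MPS = 0.1           # illustrative speed decrement

def adjust_speed_by_pressure(front_left, front_right, heel, current_speed_mps):
    """Lower the traction speed when the load has shifted too far onto the forefoot."""
    p_front = (front_left + front_right) / 2.0   # mean of the two forefoot sensors
    p_tail = heel                                # heel sensor reading
    if p_tail <= 0:
        return current_speed_mps                 # avoid division by zero / bad sample
    if p_front / p_tail > STANDING_RATIO * (1.0 + DELTA):
        return max(0.0, current_speed_mps - SPEED_STEP_MPS)
    return current_speed_mps
```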
2: regulating the speed of blind guiding vehicle according to the step frequency
As analysed above, the normal step frequency range is 95μ ≤ F_step ≤ 125μ. If the current F_step collected by the sensors is greater than 125μ, it is judged that the blind guiding vehicle is travelling too fast and the blind person has to chase it, so the blind guiding vehicle is adjusted to reduce its speed; conversely, if the current F_step collected by the sensors is less than 95μ, it is judged that the blind guiding vehicle is travelling too slowly, and the blind guiding vehicle is adjusted to increase its speed.
3: the speed of the blind guiding vehicle is adjusted according to the step pitch
D_step = V_speed / F_step
With the current traction speed V_speed of the blind guiding vehicle known, the current step frequency F_step is obtained by applying a Fourier transform to the time-domain signal collected by the heel pressure sensor of each foot and taking the frequency point where the amplitude is largest. If the currently computed D_step is greater than D_normal, the blind guiding vehicle is judged to be travelling too fast and its speed needs to be reduced; conversely, if the current D_step is less than the slow step distance D_slow, the blind guiding vehicle is judged to be travelling too slowly, and its speed is increased.
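The sketch below combines adjustment modes 2 and 3: the step frequency is taken from the dominant bin of an FFT of the heel pressure signal, the step pitch follows from D_step = V_speed / F_step, and the result is checked against the 95μ to 125μ cadence band and the calibrated D_slow and D_normal. All constants are illustrative assumptions, and numpy is assumed to be available.

```python
import numpy as np

MU = 0.8                        # attenuation coefficient for the blind user (illustrative)
F_MIN_SPM, F_MAX_SPM = 95 * MU, 125 * MU
D_SLOW, D_NORMAL = 0.35, 0.55   # calibrated slow / normal step pitch in metres (illustrative)

def dominant_step_frequency_hz(heel_samples, sample_rate_hz=100.0):
    """Step frequency = frequency bin with the largest FFT amplitude (DC excluded)."""
    x = np.asarray(heel_samples, dtype=float)
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / sample_rate_hz)
    return float(freqs[int(np.argmax(spectrum[1:])) + 1])   # skip the DC bin

def speed_command(v_speed_mps, heel_samples, sample_rate_hz=100.0):
    """Return 'slow_down', 'speed_up' or 'hold' based on cadence and step pitch."""
    f_hz = dominant_step_frequency_hz(heel_samples, sample_rate_hz)
    if f_hz <= 0.0:
        return "hold"
    f_spm = f_hz * 60.0
    d_step = v_speed_mps / f_hz                 # D_step = V_speed / F_step
    if f_spm > F_MAX_SPM or d_step > D_NORMAL:
        return "slow_down"                      # user is chasing the vehicle
    if f_spm < F_MIN_SPM or d_step < D_SLOW:
        return "speed_up"                       # vehicle is dragging too slowly
    return "hold"
```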
Fig. 9 exemplarily shows a human-computer interaction system, which includes an intelligent blind guiding vehicle, a smart phone, an intelligent noise reduction earphone, an intelligent bracelet and an intelligent insole.
The intelligent blind guiding vehicle mainly comprises a positioning and navigation module, an environment sensing module, a dynamic obstacle avoidance module and a wireless communication module. The positioning module is responsible for tracking space coordinate information in real time, supporting absolute positioning and indoor space relative positioning functions, environment sensing supports static facilities and dynamic barrier space sensing functions, the dynamic barrier avoiding module is responsible for planning a passable route by taking the blind guiding vehicle and the blind as a whole, the wireless communication module is responsible for communicating with users and external GSM/5G to update state information, and the relative positioning with the blind is realized through UWB communication.
The smart phone is a communication and decision center of the whole blind guiding system and comprises four modules of map blind guiding, voice recognition and synthesis, emergency help seeking and wireless communication. The map blind guiding module is responsible for carrying out route planning and feasibility analysis according to a destination set by user voice, the voice recognition and synthesis module is a core medium of man-machine interaction and is responsible for accurately recognizing the user voice, correctly understanding semantics of a scene where the user is located and scheduling other functional modules to assist in achieving a target, the emergency help module is used for maintaining identity information of the user and providing remote assistance requests of voice, video and the like in a voice and entity touch mode, the wireless communication module is communicated with the outside through GSM/5G communication and updates information, and the information can be mutually achieved with other devices such as an intelligent blind guiding vehicle, an intelligent noise reduction earphone, an intelligent insole and/or an intelligent bracelet and the like in a blind guiding system through Bluetooth communication.
The intelligent noise reduction earphone is an important medium for conveying the behavior intention of a blind guiding system user, and comprises three modules of dynamic noise reduction and sound amplification, sound pickup and wireless communication. The dynamic noise reduction module can dynamically switch noise reduction levels according to the ambient environment conditions of a user, and switches a transparent mode according to the requirement of the user for environment perception, the pickup module has a keyword triggering and pickup noise reduction function, and the wireless communication module is connected with the smart phone through Bluetooth communication to transmit audio and control information.
The intelligent bracelet is the main medium for conveying the user's limb action intentions and comprises four modules: inertial attitude measurement, NFC authentication and payment, blind guiding traction matching and wireless communication. The inertial attitude measurement module is responsible for monitoring the user's travelling speed and detecting abnormal body postures such as falling; the NFC authentication and payment module is used for community and building access control and contactless public transport payment; the blind guiding traction matching module is responsible for matching with the blind guiding vehicle when the user calls it; and the wireless communication module realises relative positioning with the blind guiding vehicle through UWB communication and exchanges information with the smartphone through Bluetooth.
The intelligent insole is the most direct means for somatosensory interactive feedback of a user and comprises three modules, namely ground flatness identification, a vibration feedback array and wireless communication. The ground flatness identification module is used for assisting a user in detecting the walking ground condition and whether the user walks on the blind road or deviates, the vibration feedback array module is used for prompting whether the user deviates from the blind road or not and the deviated direction and assisting in correction through vibration of a plurality of vibration points distributed in the insole, and the wireless communication module is used for transmitting information with the smart phone in an interconnected mode through Bluetooth communication.
The following describes a practical application of the human-computer interaction system for the blind guiding vehicle provided by the embodiment of the invention through a specific example. The somatosensory interaction system, which covers the whole process before, during and after the blind guiding service and is driven mainly by voice interaction with multi-sensor assistance, involves five steps: basic human-machine information interaction, blind guiding equipment coupling, navigation planning, navigation initiation and navigation termination. In this example, the controller is illustrated by taking a smartphone as an example.
The method comprises the following steps: man-machine basic information interaction
In the blind guiding travel service process, in common scenarios such as identity document verification, destination navigation and emergency help seeking, it is inconvenient for a blind user to carry documents or enter information by hand, so a convenient input and calling method needs to be provided in advance by means of voice interaction. For man-machine interaction the user wears the earphone, which automatically connects to the smartphone via Bluetooth.
The user wears the earphone and speaks the voice wake-up word, and receives voice feedback; the user then states by voice the following three types of configuration commands; after the speech and semantics are correctly parsed and confirmed back to the user by voice, the entry succeeds.
Setting personal information: including personal identification, physical health status
Setting a common address: including home, community, park, hospital
Setting an emergency contact means: voice, video, shared positioning function
An example of human-computer interaction:
(1) The user: "Hello, little helper"
(2) Voice feedback: "I'm here"
(3) The user: setting personal certificate information "
(4) Voice response: "Please specify the certificate category to be set"
(5) The user: identity card "
(6) Voice response: "please explain the identity card-information item to be set"
(7) The user: certificate number "
(8) Voice response: "please explain the ID card-card number to be set"
(9) The user: "100 XXX"
(10) Voice response: "You have set ID card - card number - 100XXX; to confirm please say 'confirm', to re-enter please say 'modify'"
(11) The user: "confirmation"
(12) Voice response: "successful setup"
Step two: blind guiding vehicle calling and user coupling
The blind guiding service depends on the blind guiding vehicle, and calling it requires the user to wear the earphone and the bracelet; the calling process is the operation process of the man-machine interaction system when the blind guiding vehicle is started indoors, described above, and is not repeated here. Only the speech and semantic interaction logic is introduced below.
Interaction example:
(1) The user: "Hello, little helper"
(2) Voice response: "I'm here"
(3) The user: calling blind-guiding vehicle "
(4) Voice response: "OK, calling now"
(5) The intelligent mobile phone issues a blind guiding vehicle calling command and waits for a self-checking result
(6) Voice response: 'guiding vehicle is awakened, self-checking is successful, and it is coming to your current position'
(7) Guided by UWB positioning information and dynamic obstacle avoidance using the vehicle-end sensors, the blind guiding vehicle reaches the user's side
(8) Voice response: 'guiding vehicle is in place beside your body, please hold guiding handle with hand'
(9) The earphone indicates the distance and relative position of the blind guiding vehicle's traction ring through the tempo and length of "beep" sounds; the user reaches out with the hand wearing the bracelet to find the traction ring, completing the coupling with the blind guiding vehicle equipment.
Step three: navigation planning
The user accesses the navigation planning service through voice interaction via the earphone. The user sets a destination by voice, the blind guiding map service preinstalled on the smartphone searches for candidate routes, a travel suggestion is given taking the traffic capability of the blind guiding vehicle into account, and the route planning task is completed after the user confirms the route to execute through voice interaction.
Interaction example:
(1) the user: "you are good, help"
(2) Voice response: 'in wool'
(3) The user: "search for route to XXX"
(4) Voice response: "N routes to XXX have been found for you. Route I contains X public transportation transfers along the way, Y pedestrian road crossings, a cumulative walking distance of Z km, traffic lights, W barrier-free facilities for the blind along the way and V blind-guide volunteer service stations; route J XXX …. Please choose the preferred route based on the above information"
(5) The user: navigation K route "
(6) Voice response: "good"
Step four: initiate navigation, blind guiding scene recognition and dynamic policy switching
The blind guiding vehicle and the man-machine interaction system support automatic identification of 5 blind guiding scenes, and are adapted to corresponding voice interaction blind guiding strategies, and interaction processes are introduced in sequence as follows.
4.1 non-public blind guide facility areas such as indoor and community
Blind road facilities in non-public blind guiding facility areas are often incomplete, so scene identification mainly relies on map positioning and facility recognition by the blind guiding vehicle, e.g. elevators and building conditions. Blind guiding positioning mainly relies on the multi-sensor fusion SLAM of the blind guiding vehicle; common technical means include single-line laser radar and UWB positioning. Blind guiding passage follows the keep-right principle, and going up and down floors is mainly by elevator, until the scene ends when a public blind road area is entered.
Interaction example:
(1) Voice broadcasting: "Navigation starts; the community navigation mode is enabled; the expected travel distance is X meters, passing through X elevators to reach XXX"
(2) Voice broadcasting: "Current position continues 50 meters along XXX, please note to follow the navigation traction route, currently set as right-hand traffic route"
(3) Voice broadcasting: "The elevator is X meters ahead of the current position; the elevator is currently stopped at floor X and is expected to reach this floor in X minutes"
(4) Voice broadcasting: 'an elevator arrives and please enter the elevator along with traction'
(5) Smart phone navigation sends a target floor stop request to an elevator through smart bracelet UWB communication and receives a confirmation response
(6) Voice broadcasting: 'Elevator reaches destination floor, please follow to pull out elevator'
(7) Voice broadcasting: "Current position continues 50 meters along XXX"
(8) Voice broadcasting: "Guidance in this area is finished; switching navigation mode"
4.2 Barrier-free Blind road facility area
Blind road facility area scene identification mainly depends on map marking and blind road recognition by the blind guiding vehicle. The blind guiding vehicle has a blind road detection function based on image recognition and semantic segmentation and the ability to keep travelling along a continuous blind road while pulling the blind person. Blind road deviation is detected and warned through the smart insole, and the user is prompted by foot vibration feedback to correct laterally back onto the blind road; scenes in which the blind road is blocked by obstacles or a discontinuous blind road must be crossed are judged and handled separately.
Interaction example:
(1) Voice broadcast: "Navigation starts; blind road navigation mode is on. The expected travel distance is X meters to reach XXX."
(2) When a turn greater than 45 degrees is encountered, voice broadcast: "Sharp left/right turn ahead; please keep following and stay on the blind road."
(3) The blind guiding vehicle pulls the user forward; the intelligent insole detects the following speed in real time so that the vehicle speed is adaptively adjusted (see the sketch after this example), and blind road deviation is detected in real time, with foot vibration reminding the user if the walking path drifts off the blind road.
(4) When a blind road fork is encountered, voice broadcast: "Blind road fork ahead; please follow along the left branch and stay on the blind road."
(5) Voice broadcast: "Guiding in this area is finished; switching navigation mode."
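A hedged sketch of the adaptive speed adjustment referred to in item (3) (and in claim 6): step events are counted from insole heel pressure, and the vehicle speed is nudged toward the user's cadence. The pressure threshold, target cadence, gain and speed limits are assumed values, not figures from the patent.

```python
def detect_steps(heel_pressure: list[float], threshold: float = 20.0) -> int:
    """Count heel strikes as rising crossings of a pressure threshold."""
    steps, above = 0, False
    for p in heel_pressure:
        if p >= threshold and not above:
            steps += 1
            above = True
        elif p < threshold:
            above = False
    return steps

def step_frequency(heel_pressure: list[float], window_s: float,
                   threshold: float = 20.0) -> float:
    """Steps per second over a sampling window."""
    return detect_steps(heel_pressure, threshold) / window_s

def adapt_vehicle_speed(current_speed: float, step_freq_hz: float,
                        target_freq_hz: float = 1.6, gain: float = 0.2,
                        v_min: float = 0.3, v_max: float = 1.2) -> float:
    """If the user's cadence falls below the target, slow the traction; if the
    user keeps up comfortably, allow the speed to recover."""
    new_speed = current_speed + gain * (step_freq_hz - target_freq_hz)
    return max(v_min, min(v_max, new_speed))
```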
4.3 Obstacle-containing blind road facility area
The blind guiding vehicle adopts a blind road detection and tracking strategy when travelling on the blind road. When the blind road area ahead is blocked by an obstacle, the blind guiding vehicle identifies and classifies the obstacle's type and size from the visual image and, based on these, decides whether to bypass it or to seek assistance.
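A minimal decision sketch mirroring this section's behaviour; the obstacle categories, the notion of remaining passable width and the width threshold are assumptions introduced for illustration.

```python
def decide_obstacle_strategy(obstacle_type: str, passable_width_m: float,
                             required_width_m: float = 0.8) -> str:
    """Moving obstacles -> decelerate and pass at low speed; a static obstacle
    leaving enough passable width -> plan a detour that stays on the blind road;
    otherwise stop and ask the user whether to request remote assistance or
    switch to an alternative route."""
    if obstacle_type in ("pedestrian", "non_motor_vehicle"):
        return "decelerate_and_pass"
    if passable_width_m >= required_width_m:
        return "detour_within_blind_road"
    return "stop_and_ask_user"
```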
Interaction example:
(1) When the blind guiding vehicle detects pedestrians or non-motor vehicles temporarily walking on the blind road a short distance ahead, it decelerates and voice broadcasts: "Pedestrians passing ahead; passing at low speed", and pulls the user through at low speed.
(2) When the blind guiding vehicle detects a static obstacle occupying part of the blind road ahead, it decelerates, tries to plan a detour that does not leave the blind road, and voice broadcasts: "An obstacle is partly occupying the blind road ahead; a low-speed detour along the left side has been planned, please note and avoid the obstacle on your right", and pulls the user through at low speed.
(3) When the blind guiding vehicle detects a static obstacle blocking a continuous section of the blind road ahead and fails to plan a detour along the current route, it stops and voice broadcasts: "A continuous section of the blind road ahead is blocked by obstacles and cannot be bypassed along the blind road. Please confirm the handling strategy: request remote assistance or plan an alternative detour route."
(4) The user: "View alternative detour route"
(5) Voice broadcast: "The nearest alternative detour route requires walking X meters more and crossing the traffic lights at X, increasing the travel time by X minutes. Switch to this route?"
(6) The user: "Confirm switching route"
(7) The blind guiding vehicle switches to the new route and continues navigation.
4.4 Crossing a non-continuous blind road region
Crossing a region without a continuous blind road requires the blind guiding vehicle to identify and verify intersection information based on image detection, including collecting surrounding traffic environment information such as signal lights, pedestrian crossings, the direction in which the blind road continues on the opposite side, and whether the blind roads are connected.
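A hedged sketch of the verification check described above: the crossing is only started once the crosswalk is confirmed, the light is green in the traffic direction, and a connecting blind road exists on the far side. The field names are illustrative.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class IntersectionObservation:
    signal_state: str              # "red" / "green" / "unknown"
    seconds_to_green: Optional[int]
    crosswalk_detected: bool
    blind_road_continues: bool     # a connecting blind road on the opposite side

def may_start_crossing(obs: IntersectionObservation) -> bool:
    """Hold position (and keep announcing the wait time) unless every condition holds."""
    return (obs.crosswalk_detected
            and obs.signal_state == "green"
            and obs.blind_road_continues)
```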
Interaction example:
(1) The blind guiding vehicle decelerates and stops as it approaches the intersection zebra crossing, identifies the current signal light, and voice broadcasts: "Approaching the start of the intersection zebra crossing X meters ahead; the light in the current traffic direction turns green in X seconds."
(2) When the signal light turns green, the blind guiding vehicle plays a low-frequency alert sound, turns on its warning light, and pulls the user across along the zebra crossing, voice broadcasting: "Green light in the current traffic direction; please follow the blind guiding vehicle across."
(3) After the crossing is completed and the next blind road connection point is reached, voice broadcast: "The intersection has been crossed; please continue along the blind road."
4.5 Public transport transfer area
Public transport transfer mainly refers to subway connection and transfer; subway facilities are well developed, have fixed volunteer service stations, and provide the basic environmental conditions required for blind guiding. Scene recognition relies mainly on map positioning. A public transport transfer mainly involves in-station navigation, gate ticket checking, boarding and alighting assistance, and riding. In-station navigation relies mainly on the blind guiding vehicle's multi-sensor fusion SLAM, with commonly used technical means including single-line lidar and UWB-assisted positioning; gate ticket checking is done through contactless NFC near-field communication on the smart bracelet; boarding and alighting rely on station voice broadcasts and volunteer assistance.
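The transfer flow could be modelled as a simple event-driven state machine, sketched below under assumed phase names and events; the patent does not prescribe this structure.

```python
from enum import Enum, auto

class TransferPhase(Enum):
    TO_GATE = auto()        # in-station navigation to the fare gate
    GATE_CHECK = auto()     # contactless NFC ticket check via the bracelet
    TO_PLATFORM = auto()    # navigate to the designated boarding point
    WAIT_TRAIN = auto()
    ONBOARD = auto()        # count stations until the alighting station
    EXIT = auto()           # leave through the gate, scene ends

def next_phase(phase: TransferPhase, event: str) -> TransferPhase:
    """Hypothetical transitions for the subway transfer scene."""
    transitions = {
        (TransferPhase.TO_GATE, "gate_reached"): TransferPhase.GATE_CHECK,
        (TransferPhase.GATE_CHECK, "nfc_accepted"): TransferPhase.TO_PLATFORM,
        (TransferPhase.TO_PLATFORM, "boarding_point_reached"): TransferPhase.WAIT_TRAIN,
        (TransferPhase.WAIT_TRAIN, "train_boarded"): TransferPhase.ONBOARD,
        (TransferPhase.ONBOARD, "alight_station_reached"): TransferPhase.EXIT,
    }
    return transitions.get((phase, event), phase)
```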
Interaction example:
(1) The user enters the subway station and is guided along the connecting blind road to the subway gates; voice broadcast: "Arrived at the subway gate; please swipe the code with your bracelet."
(2) The user swipes the code with the wrist-worn smart bracelet, the gate opens, and the blind guiding vehicle pulls the user through the gate passage after detecting that the way ahead is clear.
(3) The blind guiding vehicle travels along the planned transfer route to the boarding point in the designated direction, stops on arrival, and voice broadcasts: "Arrived at the boarding point; waiting for the next train."
(4) When the subway train arrives, the voice prompts the user: "The train has arrived; please follow the blind guiding vehicle and board after the doors open." If the station provides a blind guiding volunteer service, a volunteer may be asked to help the user find a seat.
(5) The user is prompted with the total number of stops and the alighting station for this ride: "This train passes X stations in total; get off at station X. The expected travel time is X minutes; arrival announcements will be tracked for you."
(6) The blind guiding vehicle tracks the current position from GPS positioning information, and when it detects arrival at the target station, it voice broadcasts: "Now arriving at station X; please get off here to transfer."
(7) The blind guiding vehicle pulls the user off the train and continues the in-station transfer; when the public transport transfer navigation is finished, the blind guiding vehicle guides the user out of the station through the gate, and blind guiding for this scene ends.
Step five: ending navigation
After the blind guiding vehicle finishes the blind guiding service, the user detaches from the blind guiding vehicle, orders it to return automatically, and stows the wearable blind guiding devices.
Interaction example:
(1) The user detaches the blind guiding traction rope from the bracelet and issues the voice command: "Hello, assistant"
(2) Voice response: "I'm here"
(3) The user: "Blind guiding vehicle, return to your position"
(4) Voice response: "OK"
(5) The blind guiding vehicle uses UWB (ultra wide band) assisted positioning and multi-sensor dynamic obstacle avoidance to track its way back to the preset parking position.
The user takes off the smart bracelet and the smart earphone, and the blind guiding service ends.
An embodiment of the present invention further provides a human-computer interaction method for a blind guiding vehicle, applied to the human-computer interaction system described in any of the foregoing embodiments; as shown in Fig. 10, the method includes:
Step 101: the blind guiding vehicle receives a data instruction sent by the controller; the data instruction is obtained by converting a voice instruction received by the earphone and comprises indication information for starting the blind guiding vehicle.
Step 102: the blind guiding vehicle receives positioning information from the fixed UWB positioning devices, calculates a line-of-sight connected domain topological graph, and plans a passable path to the bracelet; the blind guiding vehicle then drives towards the bracelet along the passable path.
Step 103: the blind guiding vehicle periodically receives the positioning information of the fixed UWB positioning devices and updates the passable path based on the multi-sensor fusion SLAM function and the dynamic obstacle avoidance function until it reaches the target point.
In step 102, the blind guiding vehicle receives the positioning information of the fixed UWB positioning devices for the blind guiding vehicle and the bracelet and calculates the line-of-sight connected domain topological graph from this positioning information, wherein the bracelet and the blind guiding vehicle are each located within the coverage range of at least two fixed UWB positioning devices.
In step 103, when the blind guiding vehicle reaches the target point, it notifies the controller, and the controller plays a prompt tone through the earphone. The blind guiding vehicle can also emit a prompt sound through a buzzer at its rear end after reaching the target point.
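The following is a minimal sketch of steps 102-103 under assumed conditions: fixed UWB anchors with known pairwise line-of-sight relations form a connectivity graph, and a breadth-first search stands in for the passable-path planning from the vehicle's anchor region to the bracelet's. The identifiers and the search strategy are illustrative, not the patent's exact planner.

```python
from collections import deque
from typing import Optional

def los_graph(anchor_ids: list[str],
              los_pairs: set[frozenset[str]]) -> dict[str, list[str]]:
    """Adjacency list over fixed UWB anchors that report line of sight to each other."""
    graph = {a: [] for a in anchor_ids}
    for pair in los_pairs:
        a, b = tuple(pair)
        graph[a].append(b)
        graph[b].append(a)
    return graph

def plan_path(graph: dict[str, list[str]], start_anchor: str,
              goal_anchor: str) -> Optional[list[str]]:
    """Breadth-first search from the anchor region covering the vehicle to the
    anchor region covering the bracelet; re-run periodically as new positioning
    information arrives (step 103)."""
    queue = deque([[start_anchor]])
    visited = {start_anchor}
    while queue:
        path = queue.popleft()
        if path[-1] == goal_anchor:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None
```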
In the man-machine interaction method, when the data instruction includes destination information, the controller plans a route to the destination using the preset blind guiding map service, announces the route through the earphone by voice, and sends the route to the blind guiding vehicle after it is confirmed. The blind guiding vehicle receives the route sent by the controller, plans a driving path treating the blind guiding vehicle and the user as a whole, and identifies the area scene from the information detected by its sensors and the scene-switching position information in the route to the destination, wherein the area scene includes at least one of the following: a non-public blind guiding facility region, a public blind road facility region, an obstacle-free blind road facility region, an obstacle-containing blind road facility region, a region spanning a discontinuous blind road, and a public transportation transfer region.
The man-machine interaction method further includes using the blind guiding insole to correct deviation from the blind road. A plurality of pressure sensors and a plurality of vibration feedback devices are arranged in the blind guiding insole. The pressure sensors are arranged on both sides of the forefoot, in the middle of the arch, and at the heel, and are used to monitor the load deviation of the foot in the front-back and left-right directions during a step. The vibration feedback devices are arranged at the big toe and the little toe at the left and right ends of the forefoot and are used for lateral feedback on the left and right sides.
The controller is preset with variance thresholds for the pressure sensors on the blind road; it receives the pressures detected by the pressure sensors and calculates their variance, and if the calculated variance is smaller than the variance threshold, it judges that the blind guiding insole has deviated from the blind road and controls the corresponding vibration feedback device to vibrate as a deviation reminder.
The controller can also calculate the ratio of the pressure measured by the forefoot pressure sensor to the pressure measured by the heel pressure sensor, and when the ratio exceeds a preset threshold, it notifies the blind guiding vehicle to reduce its speed.
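A minimal sketch of the two controller checks described above; the actual threshold values are device-specific and are assumed here for illustration.

```python
import statistics

def on_blind_road(pressures: list[float], variance_threshold: float) -> bool:
    """The raised texture of a blind road produces uneven pressure across the sole;
    a variance below the preset threshold suggests the insole has left the blind road."""
    return statistics.pvariance(pressures) >= variance_threshold

def should_slow_down(forefoot_pressure: float, heel_pressure: float,
                     ratio_threshold: float = 1.5) -> bool:
    """A large forefoot-to-heel pressure ratio indicates the user is leaning forward
    or being pulled too fast, so the vehicle is asked to reduce speed."""
    if heel_pressure <= 0:
        return True  # no heel contact: be conservative and slow down
    return forefoot_pressure / heel_pressure > ratio_threshold
```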
For the specific implementation of each step in the method, refer to the description in the system embodiment above; it is not repeated here.
The man-machine interaction system and method for a blind guiding vehicle provided by the embodiments of the present invention can cover blind guiding requirements across a variety of scenes, provide step-level travel deviation detection and assisted correction, fit closely with the wearable device design and the user, achieve multi-level somatosensory linked feedback for the user, and effectively improve the applicability of blind guiding in various scenes.
Finally, it should be pointed out that the above examples are only intended to illustrate the technical solutions of the present invention, not to limit them. Those of ordinary skill in the art will understand that modifications can be made to the technical solutions described in the foregoing embodiments, or some technical features may be replaced by equivalents; such modifications or substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present invention.

Claims (7)

1. A human-computer interaction system for a blind guiding vehicle, comprising: a blind guiding vehicle, an earphone, a bracelet, a blind guiding insole and a controller, wherein
The earphone is used for receiving a voice command, converting the voice command into a data command and forwarding the data command to the controller;
the blind guiding insole is internally provided with a plurality of pressure sensors and a plurality of vibration feedback devices; the pressure sensors are respectively arranged on both sides of the forefoot, in the middle of the arch, and at the heel, and are used to monitor the load deviation of the foot in the front-back and left-right directions during a step; the plurality of vibration feedback devices are respectively arranged at the big toe and the little toe at the left and right ends of the forefoot and are used for lateral feedback on the left and right sides;
the controller is used for sending the data instruction to the blind guiding vehicle; the data instruction comprises indication information for starting the blind guiding vehicle; the controller is internally preset with variance threshold values of the pressure sensors on the blind road, and is used for receiving the pressure detected by the pressure sensors and calculating the variance, if the calculated variance is smaller than the variance threshold value, the position of the current blind guiding insole is judged to deviate from the blind road, and the corresponding vibration feedback device is controlled to vibrate to remind the blind road of deviating; the controller is configured to: calculating the ratio of the pressure measured by the front sole pressure sensor to the pressure measured by the heel pressure sensor, and informing the blind guiding vehicle to reduce the vehicle speed when the ratio exceeds a preset threshold value;
the blind guiding vehicle is configured with a multi-sensor fusion SLAM function, a dynamic obstacle avoidance function and an ultra wide band (UWB) positioning function, and is used for receiving the data instruction sent by the controller; when the data instruction contains indication information for starting the blind guiding vehicle, the blind guiding vehicle plans a passable path to the bracelet according to the position information of the blind guiding vehicle and the bracelet, and drives towards the bracelet along the passable path.
2. The system as claimed in claim 1, wherein a traction ring is arranged at the end of the guide rod at the rear end of the blind guiding vehicle, and a buzzer is arranged on the traction ring and used for giving out a warning sound after the blind guiding vehicle reaches a target point.
3. The system of claim 1, wherein the blind guiding vehicle is configured to: receive the positioning information of the fixed UWB positioning devices and calculate a line-of-sight connected domain topological graph from the positioning information, wherein the fixed UWB positioning devices are used to position the blind guiding vehicle and the bracelet, and the bracelet and the blind guiding vehicle are each located within the coverage range of at least two fixed UWB positioning devices.
4. The system of claim 1,
the blind guiding vehicle is used for informing the controller when reaching a target point;
the controller is used for playing prompt tones through the earphones.
5. The system of any one of claims 1-4, wherein the controller is to: when the data instruction comprises destination information, planning a route to a destination by using a preset blind guiding map service, prompting the route to the destination through the earphone voice, and sending the route to the blind guiding vehicle after the route to the destination is confirmed;
the blind guiding vehicle is used for: according to the route to the destination, planning a driving path of the blind guiding vehicle and the user as a whole; according to the information detected by the sensor and the scene switching position information in the route to the destination, identifying the area scene, wherein the area scene comprises at least one of the following: the system comprises a non-public blind guiding facility region, a public blind road facility region, an obstacle-free blind road facility region, an obstacle-contained blind road facility region, a spanning discontinuous blind road region and a public transportation transfer region.
6. The system of claim 1, wherein the controller is configured to: calculate the step frequency of the blind person from the data collected by the pressure sensors and regulate the speed of the blind guiding vehicle in combination with a preset step frequency threshold; or calculate the blind person's walking speed from the current speed of the blind guiding vehicle and the step frequency, and adjust the speed of the blind guiding vehicle according to a preset walking speed threshold.
7. A man-machine interaction method for a blind guiding vehicle, applied to the system of any one of claims 1-6, comprising:
the blind guiding vehicle receives the data instruction sent by the controller; the data instruction is obtained by converting a voice instruction received by the earphone and comprises indication information for starting the blind guiding vehicle;
the blind guiding vehicle receives positioning information from the fixed UWB positioning devices, calculates a line-of-sight connected domain topological graph and plans a passable path to the bracelet, and the blind guiding vehicle drives towards the bracelet along the passable path;
the blind guiding vehicle periodically receives the positioning information of the fixed UWB positioning device, and updates the passable path based on the multi-sensor fusion SLAM function and the dynamic obstacle avoidance function until reaching a target point.
CN202111160701.8A 2021-09-30 2021-09-30 Man-machine interaction method and system for blind guiding vehicle Active CN113885704B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111160701.8A CN113885704B (en) 2021-09-30 2021-09-30 Man-machine interaction method and system for blind guiding vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111160701.8A CN113885704B (en) 2021-09-30 2021-09-30 Man-machine interaction method and system for blind guiding vehicle

Publications (2)

Publication Number Publication Date
CN113885704A CN113885704A (en) 2022-01-04
CN113885704B true CN113885704B (en) 2022-04-05

Family

ID=79004750

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111160701.8A Active CN113885704B (en) 2021-09-30 2021-09-30 Man-machine interaction method and system for blind guiding vehicle

Country Status (1)

Country Link
CN (1) CN113885704B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104856845A (en) * 2015-06-05 2015-08-26 京东方科技集团股份有限公司 Intelligent shoe suitable for the blind
CN106781332A (en) * 2017-02-14 2017-05-31 上海斐讯数据通信技术有限公司 The method and system of alarm are realized by sweeping robot
WO2018119588A1 (en) * 2016-12-26 2018-07-05 深圳前海达闼云端智能科技有限公司 Method and system for realizing safe trip of blind person, and wearable device
CN110321860A (en) * 2019-07-09 2019-10-11 华东师范大学 A kind of intelligence auxiliary system visually impaired based on distributed multi-source heterogeneous sensing technology
CN113311819A (en) * 2021-03-25 2021-08-27 华南理工大学广州学院 Method for guiding blind by robot dog and robot dog control system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106307781B (en) * 2016-11-01 2018-07-03 徐书奇 Submillineter Wave Technology blind waveguided shoe
CN208785218U (en) * 2018-01-08 2019-04-26 西安石油大学 A kind of guide donning system
DE102018109164A1 (en) * 2018-04-17 2019-10-17 Elten GmbH Shoe for obstacle detection
CN108629966A (en) * 2018-05-08 2018-10-09 京东方科技集团股份有限公司 Intersection blind guiding system, blind-guiding method and guide terminal


Also Published As

Publication number Publication date
CN113885704A (en) 2022-01-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant