CN111035543A - Intelligent blind guiding robot - Google Patents
- Publication number
- CN111035543A CN111035543A CN201911420408.3A CN201911420408A CN111035543A CN 111035543 A CN111035543 A CN 111035543A CN 201911420408 A CN201911420408 A CN 201911420408A CN 111035543 A CN111035543 A CN 111035543A
- Authority
- CN
- China
- Prior art keywords
- blind guiding
- guiding robot
- information
- vehicle
- robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H3/00—Appliances for aiding patients or disabled persons to walk about
- A61H3/06—Walking aids for blind persons
- A61H3/061—Walking aids for blind persons with electronic detecting or guiding means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/022—Optical sensing devices using lasers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/026—Acoustical sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/04—Viewing devices
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3407—Route searching; Route guidance specially adapted for specific applications
- G01C21/343—Calculating itineraries, i.e. routes leading from a starting point to a series of categorical destinations using a global route restraint, round trips, touristic trips
Abstract
The invention provides an intelligent blind guiding robot, which comprises: a vehicle chassis; a vehicle-mounted operation unit arranged on the vehicle chassis; a sensor system arranged on the vehicle chassis for sensing 360-degree environment information around the blind guiding robot and sending it to the vehicle-mounted operation unit; and a human-computer interaction module in communication connection with the vehicle-mounted operation unit for transmitting blind guiding information in real time through voice interaction. Addressing the shortcomings of existing intelligent blind guiding robots, the invention provides an intelligent blind guiding robot based on three-dimensional lidar, monocular vision, millimeter wave radar, ultrasonic radar and combined inertial navigation, thereby achieving complete environment perception, providing rich and accurate environment information for the robot's decision planning, and serving as the eyes of the visually impaired.
Description
Technical Field
The invention belongs to the field of intelligent blind guiding, and particularly relates to an intelligent blind guiding robot.
Background
China has one of the largest visually impaired populations in the world, and eye disease is a major public health problem in the country; the number of visually impaired people in China is approximately 17.3 million. Without a family member to accompany them, most visually impaired people can hardly take a step outdoors, and although tactile paving for the blind has been laid along some roads, infrastructure for the visually impaired remains far from adequate in practice. As society develops rapidly and living environments become ever more complex, visually impaired people cannot travel alone conveniently and safely without blind-guiding assistance.
The guide dog is a strictly trained working dog. A trained guide dog can lead a visually impaired person to schools, shops, laundries, parks and other destinations. Guide dogs are accustomed to the collar, guide harness and other equipment, understand many commands, can lead the visually impaired to walk safely, and will stop the owner when an obstacle or a necessary turn would otherwise pose a danger.
Yet few people can actually use a guide dog, because guide dogs are a "luxury good": a visually impaired person's chance of obtaining one is roughly one in ten thousand. Training a guide dog takes as long as two years; every step, from breeding puppies to finding foster families, is difficult; the pass rate of the final examination is only 30%; and each guide dog costs as much as 220,000 yuan. Biological blind guiding, typified by the guide dog, has therefore never been widely adopted, owing to high training costs, long training cycles and limited working lifespans.
With the rapid development of artificial intelligence and the Internet, intelligent machines have matured considerably, and a high-performance intelligent blind guiding robot is an ideal solution to the daily difficulties of the visually impaired. Existing intelligent blind guiding devices, however, either provide only a local obstacle avoidance function, require radio-frequency tags to be deployed in advance, rely on planar projections of three-dimensional models that differ greatly from the real environment, or have obvious shortcomings in human-computer interaction; they struggle to perceive complex environments accurately and effectively.
In the prior art, CN107390703 introduces an intelligent blind guiding robot and blind guiding method comprising a vehicle body, driving wheels, a radar module, a camera module, a blind-guiding connecting rod and a robot control unit; it uses visual SLAM combined with point-cloud information to map and localize in unknown environments and to recognize traffic lights. That technique has the following drawbacks: first, the sensing range is small, so the system cannot respond in time to fast-moving objects; second, the human-computer interaction module is too simple, transmitting only control commands, and is insufficiently user-friendly; third, the sensors concentrate on the front end of the device, leaving perception behind and to the sides of the visually impaired person inadequate.
It is therefore necessary to design an intelligent blind guiding robot that is easy to popularize and provides convenient service to the visually impaired, turning social and humanistic care into practical help.
Disclosure of Invention
To solve the problems in the prior art, the invention provides an intelligent blind guiding robot that achieves complete environment perception, serves as the eyes of the visually impaired, and turns social and humanistic care into practical help.
According to an aspect of the present invention, there is provided an intelligent blind guiding robot, comprising:
a vehicle chassis;
the vehicle-mounted operation unit is arranged on the vehicle chassis;
the sensor system is arranged on the vehicle chassis and used for sensing 360-degree environment information around the blind guiding robot and sending the environment information to the vehicle-mounted operation unit;
and the human-computer interaction module is in communication connection with the vehicle-mounted operation unit and used for transmitting blind guiding information in real time in a voice interaction mode.
Furthermore, the vehicle chassis is an electric vehicle driven by a motor and comprises a frame, and a plurality of groups of tires, a driving device, a braking device and a steering device which are arranged on the frame.
Furthermore, the human-computer interaction module communicates with the visually impaired person's Bluetooth headset via Bluetooth.
Further, the sensor system comprises 1 three-dimensional laser radar, 1 millimeter wave radar, 2 cameras, 4 ultrasonic radars and 1 set of combined navigation equipment.
Furthermore, the three-dimensional laser radar is installed at the middle of the top of the blind guiding robot and used for sensing 360-degree environment information around the robot.
Furthermore, the millimeter wave radar is installed in front of the blind guiding robot and used for acquiring speed and direction information of a target in front of the blind guiding robot.
Further, the camera includes:
the front vision camera is arranged in front of the blind guiding robot and used for acquiring information of traffic lights and obstacles;
and the rear vision camera is arranged behind the blind guiding robot and, together with the three-dimensional laser radar, detects the visually impaired person and his or her position behind the robot.
Furthermore, the ultrasonic radars are distributed at the front and rear of the blind guiding robot, 2 installed at each end, and are used for detecting low obstacles in front of and behind the robot.
Furthermore, the combined navigation equipment is installed in an unobstructed area at the top of the blind guiding robot, with the two GPS antennas spaced more than 1 meter apart, and is used for acquiring the positioning information of the blind guiding robot.
Further, the vehicle-mounted operation unit is connected with all devices of the sensor system and used for receiving the various kinds of sensing information and outputting information on obstacles within the sensing range, the vehicle's motion information and accurate positioning information to the human-computer interaction module.
Addressing the shortcomings of existing intelligent blind guiding robots, the invention provides an intelligent blind guiding robot based on three-dimensional lidar, monocular vision, millimeter wave radar, ultrasonic radar and combined inertial navigation, thereby achieving complete environment perception, providing rich and accurate environment information for the robot's decision planning, and serving as the eyes of the visually impaired.
The advantages of the invention are as follows:
(1) the sensing range is wide: targets more than 20 meters away can be detected accurately, so the visually impaired person can react in advance even to fast oncoming targets, fully guaranteeing safety;
(2) obstacles as close as 0.2 m to the vehicle body, and low obstacles on the ground, can be detected;
(3) 360-degree detection around the robot is achieved, so surrounding pedestrians are detected without blind spots;
(4) road information, traffic light information and the position of the visually impaired person are identified accurately;
(5) a positioning mode combining inertial navigation with the lidar SLAM algorithm overcomes interference with GPS signals in complex environments, so the robot's exact position is known in real time.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in greater detail exemplary embodiments thereof with reference to the attached drawings, in which like reference numerals generally represent like parts throughout.
Fig. 1 is a schematic composition diagram of an intelligent blind guiding robot according to an embodiment of the present invention.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
According to an aspect of the present invention, there is provided an intelligent blind guiding robot, comprising:
a vehicle chassis;
the vehicle-mounted operation unit is arranged on the vehicle chassis;
the sensor system is arranged on the vehicle chassis and used for sensing 360-degree environment information around the blind guiding robot and sending the environment information to the vehicle-mounted operation unit;
and the human-computer interaction module is in communication connection with the vehicle-mounted operation unit and used for transmitting blind guiding information in real time in a voice interaction mode.
From the information acquired by the sensor system, the vehicle-mounted operation unit can calculate road information, accurate positioning information and obstacle information within the sensing range of the intelligent blind guiding robot, and relays this to the visually impaired person in real time by voice through the human-computer interaction module, announcing the route to take and the road conditions.
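As a rough illustration of this voice-relay pipeline, the sketch below turns fused perception results into spoken prompts. The message wording, data shapes and 5-meter announcement threshold are our own assumptions, not fixed by the patent:

```python
def guidance_messages(obstacles, light_state, at_crossing):
    """Turn fused perception results into spoken prompts.

    `obstacles` is a list of (distance_m, bearing) tuples and `light_state`
    is 'red', 'green' or None.  Message wording is illustrative only; the
    patent does not fix a format."""
    msgs = []
    if at_crossing and light_state == "red":
        msgs.append("Red light ahead, please stop and wait.")
    for dist, bearing in sorted(obstacles):   # nearest obstacle first
        if dist < 5.0:                        # only announce nearby hazards
            msgs.append(f"Obstacle {dist:.0f} meters {bearing}.")
    return msgs

print(guidance_messages([(3.2, "ahead"), (8.0, "left")], "red", True))
```

In a real system these strings would be fed to the voice playback module rather than printed.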
The vehicle chassis can be a motor-driven electric vehicle comprising a frame and, mounted on the frame, several sets of tires, a driving device, a braking device and a steering device. The wheel sets are mounted on the frame, and the driving device is a motor electrically connected to a battery on the frame. The driving motor, braking device and steering device are in communication connection with the vehicle-mounted operation unit and steer, regulate speed, brake, start or stop according to its instructions. Autonomously driven robot carts of this kind are prior art and are not described further here.
The sensor system comprises 1 three-dimensional laser radar, 1 millimeter wave radar, 2 cameras, 4 ultrasonic radars and 1 set of combined navigation equipment.
Specifically, the 1 three-dimensional laser radar is installed at the middle of the vehicle top and senses 360-degree environment information around the blind guiding robot, sending it to the vehicle-mounted operation unit. The vehicle-mounted operation unit maps and localizes in real time with a SLAM algorithm and, combined with the combined navigation equipment, achieves accurate positioning of the blind guiding robot in different environments.
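The combination of lidar-SLAM positioning with the combined navigation output can be illustrated with a toy variance-weighted average. This is a simplified stand-in for the Kalman-style filters such systems typically use; the patent does not specify the fusion algorithm:

```python
def fuse_position(slam_xy, slam_var, gnss_xy, gnss_var):
    """Variance-weighted fusion of a lidar-SLAM fix and a GNSS/inertial fix.

    When GNSS degrades (large variance, e.g. urban canyons or tree cover),
    the fused estimate leans on SLAM, and vice versa.  A toy stand-in for
    the Kalman-style fusion such systems typically use."""
    w_slam = gnss_var / (slam_var + gnss_var)   # trust is inverse to variance
    return tuple(w_slam * s + (1.0 - w_slam) * g
                 for s, g in zip(slam_xy, gnss_xy))

# GNSS badly degraded: the fused fix stays close to the SLAM fix (10, 5).
print(fuse_position((10.0, 5.0), 0.1, (14.0, 9.0), 9.9))
```

With equal variances the result is simply the midpoint of the two fixes, which matches the intuition that neither source is preferred.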
The 1 millimeter wave radar is installed in front of the blind guiding robot and used for acquiring speed and direction information of a target in front of the blind guiding robot.
The 2 cameras are a front vision camera and a rear vision camera. The front vision camera is installed at the front of the blind guiding robot and acquires traffic light and obstacle information; the rear vision camera is installed at the rear and, together with the laser radar, detects the visually impaired person and his or her position behind the robot. The 2 cameras also acquire the type of each target.
The 4 ultrasonic radars are distributed at the front and rear of the blind guiding robot, 2 UPA installed at each end, and detect low obstacles in front of and behind the robot.
The combined navigation equipment is installed in an unobstructed area on top of the blind guiding robot, with the two GPS antennas spaced more than 1 m apart, and acquires the robot's positioning information. Specifically, the invention uses a positioning mode combining inertial navigation with a lidar SLAM algorithm, overcoming interference with GPS signals in complex environments so that the robot's exact position is known in real time.
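One reason for requiring the two GPS antennas to be more than 1 m apart is that a dual-antenna baseline also yields the vehicle's heading, and a longer baseline reduces the angular error caused by per-antenna position noise. A minimal sketch, assuming a local east/north coordinate frame (the frame and function are illustrative, not part of the patent):

```python
import math

def heading_from_antennas(east1, north1, east2, north2):
    """Estimate heading (degrees clockwise from north) from the positions of
    the rear (1) and front (2) GPS antennas in a local east/north frame.
    A baseline over 1 m keeps the angular error from per-antenna position
    noise small."""
    de, dn = east2 - east1, north2 - north1
    return math.degrees(math.atan2(de, dn)) % 360.0

# Front antenna 1.2 m due east of the rear antenna: heading is 90 degrees.
print(heading_from_antennas(0.0, 0.0, 1.2, 0.0))
```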
The human-computer interaction module provides real-time communication between the visually impaired person and the blind guiding robot and may include a speech recognition module for recognizing the person's instructions, e.g. stop or start, a little faster or slower. After recognizing an instruction, the module converts it into a command signal and sends it to the vehicle-mounted operation unit, which controls the driving motor, steering device and braking device accordingly. The module also includes a voice playback module connected to the vehicle-mounted operation unit; it receives the unit's data, converts it into speech, and delivers it to the visually impaired person through a loudspeaker or over Bluetooth. Bluetooth is preferred, in which case the visually impaired person need only wear a Bluetooth headset.
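The mapping from recognized phrases to control signals might look like the following sketch. The command table, phrases and field names are hypothetical; the patent names only the behaviours (stop/start, faster/slower):

```python
# Hypothetical command table: the patent names only the behaviours
# (stop/start, a little faster/slower), not the phrases or the control API.
COMMANDS = {
    "stop":   {"action": "brake",  "delta_speed": 0.0},
    "start":  {"action": "drive",  "delta_speed": 0.0},
    "faster": {"action": "adjust", "delta_speed": 0.2},
    "slower": {"action": "adjust", "delta_speed": -0.2},
}

def to_control_signal(recognized_text):
    """Map a recognized voice instruction to a control signal for the
    vehicle-mounted operation unit, or None if the phrase is unknown."""
    return COMMANDS.get(recognized_text.strip().lower())

print(to_control_signal("Faster"))
```

Returning None for unrecognized phrases lets the interaction module ask the user to repeat rather than act on a misheard command.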
The vehicle-mounted operation unit is connected to all devices of the sensor system; it receives the various kinds of sensing information and outputs information on obstacles within the sensing range, the vehicle's motion information and accurate positioning information to the human-computer interaction module. It can automatically plan a travel route from the obstacle information within the sensing range, and can automatically adjust the travel speed according to the visually impaired person's speed or distance.
For example, the front vision camera acquires traffic light information and sends it to the vehicle-mounted operation unit; the unit determines that the light ahead is red, locates the blind guiding robot through the combined navigation equipment and, if the robot is in a waiting area, outputs a stop-and-wait instruction to the human-computer interaction module, which tells the visually impaired person by voice to stop and wait for the red light.
As another example, when the ultrasonic radars detect a protrusion of about 10 cm on the ground 5 meters ahead, the obstacle information is sent to the vehicle-mounted operation unit, which locates the robot through the combined navigation equipment and, via the human-computer interaction module, keeps reminding the visually impaired person: "Caution, a 10 cm raised obstacle 5 meters ahead"; "Caution, a 10 cm raised obstacle 3 meters ahead"; and so on.
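The repeated distance reminders in this example could be generated by announcing each time the measured distance shrinks past a whole metre, as in this sketch (the thresholding scheme and wording are assumed, not specified by the patent):

```python
def countdown_alerts(detections):
    """Generate repeated reminders like those above: announce each time the
    measured distance to a raised obstacle shrinks past a whole metre.
    `detections` is a stream of (distance_m, height_cm) readings."""
    alerts, last_whole = [], None
    for dist, height in detections:
        whole = int(dist)
        if whole != last_whole:               # distance crossed a metre mark
            alerts.append(f"Caution: {height} cm raised obstacle "
                          f"{whole} meters ahead.")
            last_whole = whole
    return alerts

for msg in countdown_alerts([(5.2, 10), (4.7, 10), (4.1, 10), (3.4, 10)]):
    print(msg)
```

Deduplicating on the whole-metre value prevents the voice module from chattering on every sensor reading while still announcing each metre of approach.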
In addition, the robot may further comprise a display screen connected to the vehicle-mounted operation unit, which can display surrounding environment information, positioning information, route planning information, an obstacle map and the like in real time, making it easy for family members or passers-by to help consult the information.
The intelligent blind guiding robot also comprises a connecting rod or rope, one end attached to the frame of the blind guiding robot and the other end held by the visually impaired person. The connecting rod may be telescopic, with a control switch and a speed-regulating switch at the hand-held end, through which the visually impaired person can operate the robot.
To facilitate understanding of the solution of the embodiments of the present invention and the effects thereof, a specific application example is given below. It will be understood by those skilled in the art that this example is merely for the purpose of facilitating an understanding of the present invention and that any specific details thereof are not intended to limit the invention in any way.
As shown in fig. 1, the intelligent blind guiding robot of the present embodiment includes: a vehicle chassis comprising a frame and, mounted on the frame, wheel sets, a battery, a driving device, a braking device and a steering device;
the sensor system comprises 1 three-dimensional laser radar, 1 millimeter wave radar, 2 cameras, 4 ultrasonic radars and 1 set of combined navigation equipment. The system comprises 1 three-dimensional laser radar, a vehicle body, a combined navigation device and a control system, wherein the three-dimensional laser radar is arranged in the middle of the top of the vehicle and used for sensing environment information of the blind guiding robot at 360 degrees around the vehicle body, real-time mapping and positioning are carried out through an SLAM algorithm, and the combined navigation device is combined to realize accurate positioning of the blind guiding robot in different environments; the system comprises a blind guiding robot, a plurality of millimeter wave radars, a plurality of blind guiding robots and a plurality of blind guiding robots, wherein the 1 millimeter wave radar is arranged in front of the blind guiding robot and is used for acquiring speed and direction information of a target in front of the blind guiding robot; the front vision camera is arranged in front of the blind guiding robot and used for acquiring information of traffic lights and obstacles; the rear vision camera is arranged behind the blind guiding robot, detects the current and rear position information of the visually impaired with the laser radar, and simultaneously obtains the type information of the target by the 2 cameras; the 4 ultrasonic radars are distributed in front of and behind the blind guiding robot, and 2 UPA are respectively installed in the front and the back of the blind guiding robot and are used for detecting the information of short obstacles in the front and the back of the blind guiding robot; and the combined navigation equipment is arranged in an unobstructed area at the top of the blind guiding robot, and the distance between the two GPS antennas is greater than 1 m, so that the combined navigation equipment is used for acquiring the positioning information 
of the blind guiding robot.
The vehicle-mounted operation unit is connected to all devices of the sensor system; it receives each sensor's information and, after algorithmic processing, outputs information on obstacles within the sensing range, the vehicle's motion information and accurate positioning information.
The driving device, the braking device and the steering device are in communication connection with the vehicle-mounted computing unit through a drive-by-wire interface, and perform various actions under the control of the vehicle-mounted computing unit.
The human-computer interaction module is connected to the vehicle-mounted operation unit and provides real-time communication between the visually impaired person and the blind guiding robot; communication is over Bluetooth, so the visually impaired person need only wear a Bluetooth headset.
In addition, the system comprises a display screen connected to the vehicle-mounted operation unit, which can display surrounding environment information, positioning information, route planning information, an obstacle map and the like in real time.
In addition, anti-collision blocks are arranged at the front and the rear of the frame chassis.
In addition, a start-stop button is further arranged on the frame and used for starting and stopping the blind guiding robot.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
Claims (10)
1. An intelligent blind guiding robot, comprising:
a vehicle chassis;
the vehicle-mounted operation unit is arranged on the vehicle chassis;
the sensor system is arranged on the vehicle chassis and used for sensing 360-degree environment information around the blind guiding robot and sending the environment information to the vehicle-mounted operation unit;
and the human-computer interaction module is in communication connection with the vehicle-mounted operation unit and used for transmitting blind guiding information in real time in a voice interaction mode.
2. The intelligent blind guiding robot as claimed in claim 1, wherein the vehicle chassis is a motor-driven electric vehicle comprising a frame and a plurality of sets of tires, a driving device, a braking device and a steering device mounted on the frame.
3. The intelligent blind guiding robot as claimed in claim 1, wherein the human-computer interaction module communicates with a bluetooth headset of the visually impaired in a bluetooth communication manner.
4. The intelligent blind guiding robot of claim 1, wherein the sensor system comprises 1 three-dimensional lidar, 1 millimeter wave radar, 2 cameras, 4 ultrasonic radars and 1 set of combined navigation equipment.
5. The intelligent blind guiding robot as claimed in claim 4, wherein the three-dimensional laser radar is installed in the middle position of the top of the blind guiding robot and used for sensing 360-degree environment information around the blind guiding robot.
6. The intelligent blind guiding robot as claimed in claim 4, wherein the millimeter wave radar is installed in front of the blind guiding robot for acquiring speed and orientation information of a target in front of the blind guiding robot.
7. The intelligent blind guiding robot of claim 4, wherein the camera comprises:
the front vision camera is arranged in front of the blind guiding robot and used for acquiring information of traffic lights and obstacles;
and the rear vision camera is arranged behind the blind guiding robot and, together with the three-dimensional laser radar, detects the visually impaired person and his or her position behind the robot.
8. The intelligent blind guiding robot as claimed in claim 4, wherein the ultrasonic radars are distributed in front of and behind the blind guiding robot, 2 installed at the front and 2 at the rear, for detecting low obstacles in front of and behind the blind guiding robot.
9. The intelligent blind guiding robot as claimed in claim 4, wherein the combined navigation device is installed in an unobstructed area at the top of the blind guiding robot, and the distance between the two GPS antennas is greater than 1 meter, so as to obtain the positioning information of the blind guiding robot.
10. The intelligent blind guiding robot as claimed in claim 4, wherein the vehicle-mounted operation unit is connected with all devices of the sensor system and is used for receiving the various kinds of sensing information and outputting information on obstacles within the sensing range, the vehicle's motion information and accurate positioning information to the human-computer interaction module.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911420408.3A CN111035543A (en) | 2019-12-31 | 2019-12-31 | Intelligent blind guiding robot |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911420408.3A CN111035543A (en) | 2019-12-31 | 2019-12-31 | Intelligent blind guiding robot |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111035543A true CN111035543A (en) | 2020-04-21 |
Family
ID=70243195
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911420408.3A Pending CN111035543A (en) | 2019-12-31 | 2019-12-31 | Intelligent blind guiding robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111035543A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112932911A (en) * | 2021-04-02 | 2021-06-11 | 常州大学怀德学院 | Blind guiding robot based on hybrid sensing system |
CN113370225A (en) * | 2021-05-31 | 2021-09-10 | 山东新一代信息产业技术研究院有限公司 | Blind person guiding service robot system |
CN114712181A (en) * | 2022-05-07 | 2022-07-08 | 河海大学 | Blind person obstacle avoidance navigation system based on visual SLAM |
CN116358591A (en) * | 2023-04-07 | 2023-06-30 | 广州爱浦路网络技术有限公司 | Travel navigation assistance method for visually impaired people, electronic device and storage medium |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104899855A (en) * | 2014-03-06 | 2015-09-09 | 株式会社日立制作所 | Three-dimensional obstacle detection method and apparatus |
CN105866779A (en) * | 2016-04-06 | 2016-08-17 | 浙江大学 | Wearable barrier avoiding apparatus and barrier avoiding method based on binocular camera and millimeter-wave radar |
CN106074096A (en) * | 2016-05-27 | 2016-11-09 | 苏州铭冠软件科技有限公司 | Computer-vision-based portable navigation device for the blind |
CN106515728A (en) * | 2016-12-22 | 2017-03-22 | 深圳市招科智控科技有限公司 | System and method for avoiding collision and obstacle for a driverless bus |
CN106708040A (en) * | 2016-12-09 | 2017-05-24 | 重庆长安汽车股份有限公司 | Sensor module of automatic driving system, automatic driving system and automatic driving method |
CN106890067A (en) * | 2017-01-06 | 2017-06-27 | 南京邮电大学 | Indoor blind man navigation robot |
CN107390703A (en) * | 2017-09-12 | 2017-11-24 | 北京创享高科科技有限公司 | Intelligent blind-guiding robot and blind-guiding method thereof |
CN208366352U (en) * | 2018-05-17 | 2019-01-11 | 中兴健康科技有限公司 | Blind guiding device |
CN110353958A (en) * | 2019-08-09 | 2019-10-22 | 朱原灏 | Portable device and method for assisting the blind in riding |
CN209734484U (en) * | 2019-01-18 | 2019-12-06 | 长春光华学院 | Intelligent blind guiding device |
CN209852236U (en) * | 2019-03-07 | 2019-12-27 | 北京主线科技有限公司 | Environment sensing device for unmanned truck |
- 2019-12-31: Application CN201911420408.3A filed in China (CN); published as CN111035543A; legal status: Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111035543A (en) | Intelligent blind guiding robot | |
CN108646731B (en) | Unmanned vehicle field end control system and control method thereof | |
Göhring et al. | Semi-autonomous car control using brain computer interfaces | |
KR101193610B1 (en) | Intelligent robot system for traffic guidance of crosswalk | |
CN109795462A (en) | Controller of vehicle, control method for vehicle and storage medium | |
CN111609851B (en) | Mobile blind guiding robot system and blind guiding method | |
CN107524103B (en) | UAV-based intelligent road warning system and method | |
US20220155400A1 (en) | Microphone Array for Sound Source Detection and Location | |
CN108177651A (en) | Multi-sensor-fusion obstacle detection and avoidance system for fast unmanned vehicles | |
CN106652505B (en) | Smart-glasses-based street-crossing guidance system for visually impaired pedestrians | |
CN111459172A (en) | Autonomous navigation system of boundary security unmanned patrol car | |
CN113075926A (en) | Blind guiding robot dog based on artificial intelligence | |
CN112870033A (en) | Intelligent blind guiding helmet system for unstructured road and navigation method | |
CN109998873A (en) | Wearable intelligent positioning and blind guiding system for the blind | |
KR102184598B1 (en) | Driving Prediction and Safety Driving System Based on Judgment of Driver Emergency Situation of Autonomous Driving Vehicle | |
CN108037756A (en) | Multi-sensor-fusion obstacle detection and avoidance system for medium-speed unmanned vehicles | |
CN110623820A (en) | Wearable intelligent blind guiding device | |
CN111840016A (en) | Flexible and configurable intelligent navigation device for blind people | |
CN115416047A (en) | Blind assisting system and method based on multi-sensor quadruped robot | |
CN214376963U (en) | Parking service robot | |
WO2021187039A1 (en) | Information processing device, information processing method, and computer program | |
Gajjar et al. | A comprehensive study on lane detecting autonomous car using computer vision | |
WO2019214165A1 (en) | Intelligent flight follower for navigation for blind people | |
CN214632899U (en) | Intelligent guide walking stick | |
CN217801729U (en) | Outdoor robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
CB02 | Change of applicant information | | Address after: Floor 10, Building 1, Zone 2, Yard 9, Taihe 3rd Street, Beijing Economic and Technological Development Zone, Daxing District, Beijing 100176. Applicant after: Beijing National New Energy Vehicle Technology Innovation Center Co.,Ltd. Address before: Room 1705, Block A, Building 1, No. 10 Ronghua Middle Road, Beijing Economic and Technological Development Zone, Daxing District, Beijing 102600. Applicant before: BEIJING NEW ENERGY VEHICLE TECHNOLOGY INNOVATION CENTER Co.,Ltd.
RJ01 | Rejection of invention patent application after publication | | Application publication date: 2020-04-21