CN102895092A - Multi-sensor integration based three-dimensional environment identifying system for walker aid robot - Google Patents

Multi-sensor integration based three-dimensional environment identifying system for walker aid robot

Info

Publication number: CN102895092A
Application number: CN2011104139109A (also written CN201110413910A)
Authority: CN (China)
Prior art keywords: information, industrial computer, motion, module, assistant robot
Legal status: Pending (assumed; not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 冷春涛, 李宝顺, 黄怡
Current Assignee: Individual (listed assignee may be inaccurate)
Original Assignee: Individual
Application filed by Individual
Priority application: CN2011104139109A
Publication: CN102895092A

Abstract

The invention discloses a multi-sensor-fusion-based three-dimensional environment recognition system and method for a walking-aid robot. The system comprises an ARM information-processing module, an industrial personal computer (industrial PC) module, and a lower-computer motion-control module. The ARM information-processing module processes the acquired data to build three-dimensional scene reconstruction and target-tracking information and transmits it to the industrial PC module. The industrial PC module receives the user's commands, generates a navigation map by means of GPS, combines the three-dimensional scene reconstruction and target-tracking information with the navigation map to generate motion-path-planning information, and transmits that information to the lower-computer motion-control module. The lower-computer motion-control module drives and controls the walking-aid robot to act accordingly, and feeds the motion information back to the industrial PC module. By fusing multiple sensors, the system builds a faithful three-dimensional environment for the walking-aid robot and generates a better motion path for safe, smooth, and reliable walking; target tracking is achieved effectively, and the robot's level of intelligence is raised.

Description

Three-dimensional environment recognition system for a walking-aid robot based on multi-sensor fusion
Technical field
The present invention relates to the field of information engineering technology, and in particular to a three-dimensional environment recognition system and method for a walking-aid robot based on multi-sensor fusion.
Background art
A walking-aid robot mainly serves the elderly or mobility-impaired people, such as the blind, enabling safe, autonomous movement; it is extremely important for improving the quality of life of the elderly and the disabled and for easing the burden on families and society. Recognition of complex three-dimensional environments is one of the key technologies of walking-aid robots, and its effectiveness and reliability directly determine whether walking-aid robot technology can succeed in practical applications.
Conventional walking-aid robot technology generally uses GPS or the like for map navigation, or uses sensors such as image-processing equipment and ultrasonic devices to detect tactile paving. Such technology emphasizes recognizing and navigating the robot's walking path, while ignoring recognition of the environment the robot is in.
Users of walking-aid robots do not merely wish to be taken from one place to another; they also wish the robot to help them recognize their surroundings and to track people or things of interest to them. For the visually impaired travelling from place to place, the navigation data built from GPS only provides coarse route information; information such as road conditions and obstacles must still be obtained through image or ultrasonic sensors. Moreover, when the user arrives at a destination or moves about in an activity area, the walking-aid robot must not only generate an active path but also recognize and track the surrounding environment. For example, a visually impaired or elderly person may wish to stroll or go shopping independently, like an able-bodied person, rather than being wheeled along by a family member. Realizing this function requires three-dimensional environment recognition and target tracking by the walking-aid robot. GPS cannot provide three-dimensional information, and given its limited precision, GPS alone cannot meet this requirement; an image sensor can provide information about the surroundings, but the two-dimensional information it provides cannot satisfy the requirements of three-dimensional environment recognition and target tracking. Thus conventional technology achieves navigation through GPS positioning and map building, and achieves only path planning and obstacle avoidance in partial environments through image processing and ultrasonic techniques. These technologies do not provide the user with target selection, target recognition, or target tracking, and cannot offer a user-friendly, humanized experience; this deficiency remains to be improved.
" a kind of intelligent guide vehicle " (Chinese invention patent application number 200610088153.1) disclosed in the prior art, this car is by ultrasound barrier probe, realize the detection of sidewalk for visually impaired people, yet this system only can be used for having the place of sidewalk for visually impaired people, realize detection and the guiding of sidewalk for visually impaired people, and do not have the navigation of GPS global map, more there are not identification and target following based on three-dimensional environment.
" blind guiding system for blindman hospitalizing " (Chinese invention patent application number 200610117532.9) also disclosed in the prior art, this system adopts infrared emission and receiving system to realize the blindman hospitalizing guide, adopt the blind person of this system can effectively find the section office of seeking medical advice, but this system can not realize keeping away barrier, does not also have the identification of path planning and three-dimensional environment.
Summary of the invention
The object of the invention is to overcome the deficiencies of the prior art and to provide, for the elderly or the visually impaired, a three-dimensional environment recognition system and method for a walking-aid robot based on multi-sensor fusion.
To achieve the above object, the present invention is realized through the following technical solutions:
The invention discloses a three-dimensional environment recognition system for a walking-aid robot based on multi-sensor fusion. The system comprises: an ARM information-processing module, an industrial PC module, and a lower-computer motion-control module. The ARM information-processing module collects data, processes the data to build three-dimensional scene reconstruction and target-tracking information, and sends this information to the industrial PC module. The industrial PC module receives the user's command, generates a navigation map using its own GPS, combines the received three-dimensional scene reconstruction and target-tracking information with the navigation map to generate motion-path-planning information, and sends it to the lower-computer motion-control module. The lower-computer motion-control module drives and controls the walking-aid robot to produce the corresponding actions according to the motion-path-planning information, while feeding motion information back to the industrial PC module.
Further, the ARM information-processing module further comprises: an ARM processor system, a global camera, a local camera, a laser sensor, and a tilt sensor. The global camera collects global image information in real time and delivers it to the ARM processor system, which coarsely locates the tracking target from the global image. The local camera delivers the local image it collects to the ARM processor system in real time, which finely locates the target from the local image. The laser sensor and the tilt sensor both pass their collected data to the ARM processor system in real time, and the ARM processor system combines the laser-sensor data with the tilt-sensor data to generate three-dimensional data. Through multi-sensor information fusion, the ARM processor system produces three-dimensional scene reconstruction and target-tracking information and delivers this information to the industrial PC module.
Further, the industrial PC module further comprises: an industrial PC, a pulse sensor, a pressure sensor, a body-temperature sensor, a GPS receiver, a touch screen, a headset, and a GPRS module. The industrial PC reads the pulse sensor, pressure sensor, and body-temperature sensor in real time to generate health information. According to the command received via the headset or the touch screen, the industrial PC reads the GPS receiver to generate a navigation map, reads the three-dimensional scene reconstruction and target-tracking information delivered by the ARM information-processing module, uses the navigation map together with that information to generate motion-path-planning information, and delivers the generated motion-path-planning information to the lower-computer motion-control module. The walking-aid robot motion parameters produced by the industrial PC module and the user's health information are displayed on the touch screen in real time, output through the loudspeaker, or transmitted to a server computer over the Internet via GPRS.
Further, the lower-computer motion-control module further comprises: a motion control card, motor drivers, servomotors, a lifting mechanism, a walking mechanism, and an attitude-maintaining mechanism. The motion control card in the lower-computer motion-control module accepts the motion-path-planning information sent by the industrial PC, interpolates the motion path, and produces motion control signals. The motion control card delivers the generated motion control signals to the motor drivers, each of which drives a servomotor. A servomotor drives the lifting mechanism to produce lifting motion; a servomotor drives the walking mechanism to make the walking-aid robot walk; and a servomotor drives the attitude-maintaining mechanism to keep the attitude stable while the walking-aid robot moves.
The invention also discloses a three-dimensional environment recognition method for a walking-aid robot based on multi-sensor information fusion, for use with the system as claimed in claim 1. The method comprises the following steps:
S1. The ARM information-processing module collects data, processes the collected data to build three-dimensional scene reconstruction and target-tracking information, and delivers the information to the industrial PC module;
S2. The industrial PC module reads the information transmitted by the ARM information-processing module, generates a navigation map using its own GPS, produces motion-path-planning information according to the received command, and delivers this motion-path-planning information to the lower-computer motion-control module;
S3. The lower-computer motion-control module drives and controls each mechanism to produce the corresponding action according to the motion-path-planning information, while feeding motion information back to the industrial PC module.
Further, in step S1, the ARM information-processing module further comprises: an ARM processor system, a global camera, a local camera, a laser sensor, and a tilt sensor. The global camera collects global image information in real time and delivers it to the ARM processor system, which coarsely locates the tracking target from the global image. The local camera delivers the local image it collects to the ARM processor system in real time, which finely locates the target from the local image. The laser sensor and the tilt sensor both pass their collected data to the ARM processor system in real time, and the ARM processor system combines the laser-sensor data with the tilt-sensor data to generate three-dimensional data. Through multi-sensor information fusion, the ARM processor system produces three-dimensional scene reconstruction and target-tracking information and delivers this information to the industrial PC module.
Further, in step S2, the industrial PC module further comprises: an industrial PC, a pulse sensor, a pressure sensor, a body-temperature sensor, a GPS receiver, a touch screen, a headset, and a GPRS module. The industrial PC reads the pulse sensor, pressure sensor, and body-temperature sensor in real time to generate health information. According to the command received via the headset or the touch screen, the industrial PC reads the GPS receiver to generate a navigation map, reads the three-dimensional scene reconstruction and target-tracking information delivered by the ARM information-processing module, uses the navigation map together with that information to generate motion-path-planning information, and delivers the generated motion-path-planning information to the lower-computer motion-control module. The walking-aid robot motion parameters produced by the industrial PC and the user's health information are displayed on the touch screen in real time, output through the loudspeaker, or transmitted to a server computer over the Internet via GPRS.
Further, in step S3, the lower-computer motion-control module further comprises: a motion control card, motor drivers, servomotors, a lifting mechanism, a walking mechanism, and an attitude-maintaining mechanism. The motion control card in the lower-computer motion-control module accepts the motion-path-planning information sent by the industrial PC, interpolates the motion path, and produces motion control signals. The motion control card delivers the generated motion control signals to the motor drivers, each of which drives a servomotor. A servomotor drives the lifting mechanism to produce lifting motion; a servomotor drives the walking mechanism to make the walking-aid robot walk; and a servomotor drives the attitude-maintaining mechanism to keep the attitude stable while the walking-aid robot moves.
The invention also discloses a three-dimensional environment recognition method for a walking-aid robot based on multi-sensor information fusion, comprising the following steps:
S1. The ARM processor system performs coarse positioning and fine positioning from the images collected by the global camera and the local camera; the ARM processor system combines the laser-sensor data with the tilt-sensor data to generate three-dimensional data; the ARM processor fuses the three-dimensional data with the coarse- and fine-positioning information to generate the walking-aid robot's three-dimensional environment recognition and target-tracking information, and delivers this information to the industrial PC module;
S2. The industrial PC module reads the GPS receiver to generate a navigation map according to the command received via the headset or the touch screen, combines the three-dimensional scene reconstruction and target-tracking information sent by the ARM processor to generate motion-path-planning information, and sends it to the lower-computer motion-control module;
S3. The lower-computer motion-control module receives the motion-path-planning information sent by the industrial PC module, interpolates the motion path, and produces motion control signals; it delivers the motion control signals to the motor drivers, each of which drives a servomotor so that the walking-aid robot respectively lifts, walks, and maintains its attitude; and it feeds the motion information of each mechanism back to the industrial PC module.
Further, the method also comprises the step:
S4. The walking-aid robot motion parameters produced by the industrial PC module and the user's health information are displayed on the touch screen in real time, output through the loudspeaker, or transmitted to a server computer over the Internet via GPRS.
Beneficial effects of the present invention: by fusing multiple sensors, the invention builds a faithful three-dimensional environment for the walking-aid robot, so that a better motion path can be generated for safe, smooth, and reliable walking, effective target tracking can be achieved, and the degree of intelligence of the walking-aid robot is raised.
Description of the drawings
The present invention is described in further detail below with embodiments and with reference to the accompanying drawings.
Fig. 1 is a structural block diagram of the multi-sensor-fusion-based three-dimensional environment recognition system for a walking-aid robot according to the present invention;
Fig. 2 is a flow chart of the multi-sensor-information-fusion-based three-dimensional environment recognition method for a walking-aid robot according to the present invention.
Specific embodiments
The invention is further described below in conjunction with the drawings and embodiments.
Fig. 1 is the system architecture diagram of the multi-sensor-fusion-based three-dimensional environment recognition system for a walking-aid robot according to the present invention. As shown in Fig. 1, the system of the present invention comprises an industrial PC module, an ARM information-processing module, and a lower-computer motion-control module. The ARM information-processing module is connected to the industrial PC module through socket communication; the industrial PC module is connected to the lower-computer module through a serial port; the industrial PC module is also connected to a Web server via GPRS to exchange data with the Web, so that other users can query information about the walking-aid robot's user over the Internet.
The ARM processor module comprises: a global camera, a local camera, a laser sensor, a tilt sensor, and an ARM processor system. The global camera delivers the images it collects to the ARM processor system, which performs edge detection on the global image, then region merging, and then region feature extraction; based on the region features it coarsely locates the environmental target. The local camera sends local images to the ARM processor system, which first performs edge detection on the local image, then region merging, and then template matching on the merged regions; based on the matching result it finely locates the specific target. The laser sensor and the tilt sensor send their data to the ARM processor system, which obtains three-dimensional data through a three-dimensional data synthesis algorithm. The ARM processor system fuses the coarse-positioning information, fine-positioning information, and three-dimensional data using three-dimensional reconstruction and target-tracking algorithms, thereby achieving three-dimensional reconstruction of the walking-aid robot's surroundings and tracking of the target, and sends this information to the industrial PC processing system.
ARM (Advanced RISC Machines) is the common name of a family of 32-bit reduced-instruction-set-computer (RISC) processor architectures widely used in embedded system designs.
The industrial PC module comprises: an industrial PC, a pulse sensor, a pressure sensor, a body-temperature sensor, a GPS receiver, a touch screen, a headset, and a GPRS module. The user's command may be input through the headset or through the touch screen; when input through the headset, the industrial PC recognizes the command with its internal voice-command recognition module. The industrial PC generates a navigation path from the data transmitted by the GPS receiver and the user's command and, combining it with the three-dimensional environment recognition information obtained from the ARM information-processing module, generates the motion path plan. The industrial PC sends the generated motion-path-planning information to the lower-computer motion-control module. Meanwhile, the touch screen displays the walking-aid robot's motion parameters and the user's health information in real time, accompanied by voice prompts. The industrial PC also sends the robot's motion parameters and the user's health information through the GPRS module over the Internet to a server computer, so that other users can query and process the robot's information through the network.
The lower-computer motion-control module comprises: a motion control card, motor drivers, drive motors, and a lifting mechanism, a walking mechanism, and an attitude-maintaining mechanism. The motion control card accepts the path data planned by the industrial PC, interpolates according to these data, and sends drive signals to each motor driver; each motor driver drives its motor according to the signal given by the motion control card. The motion control card also sends the position of each mechanism back to the industrial PC, so that the industrial PC obtains information on the walking-aid robot's motion state.
Fig. 2 is the flow chart of the multi-sensor-information-fusion-based three-dimensional environment recognition method for a walking-aid robot according to the present invention. As shown in Fig. 2, the present invention includes the following steps:
S1. The ARM information-processing module collects data, processes the collected data to build three-dimensional scene reconstruction and target-tracking information, and delivers the information to the industrial PC module;
In step S1, the ARM information-processing module further comprises: an ARM processor system, a global camera, a local camera, a laser sensor, and a tilt sensor. The global camera collects global image information in real time and delivers it to the ARM processor system, which coarsely locates the tracking target from the global image. The local camera delivers the local image it collects to the ARM processor system in real time, which finely locates the target from the local image. The laser sensor and the tilt sensor both pass their collected data to the ARM processor system in real time, and the ARM processor system combines the laser-sensor data with the tilt-sensor data to generate three-dimensional data. Through multi-sensor information fusion, the ARM processor system produces three-dimensional scene reconstruction and target-tracking information and delivers this information to the industrial PC module.
S2. The industrial PC module reads the information transmitted by the ARM information-processing module, generates a navigation map using its own GPS, produces motion-path-planning information according to the received command, and delivers this motion-path-planning information to the lower-computer motion-control module;
In step S2, the industrial PC module further comprises: an industrial PC, a pulse sensor, a pressure sensor, a body-temperature sensor, a GPS receiver, a touch screen, a headset, and a GPRS module. The industrial PC reads the pulse sensor, pressure sensor, and body-temperature sensor in real time to generate health information. According to the command received via the headset or the touch screen, the industrial PC reads the GPS receiver to generate a navigation map, reads the three-dimensional scene reconstruction and target-tracking information delivered by the ARM information-processing module, uses the navigation map together with that information to generate motion-path-planning information, and delivers the generated motion-path-planning information to the lower-computer motion-control module. The walking-aid robot motion parameters produced by the industrial PC and the user's health information are displayed on the touch screen in real time, output through the loudspeaker, or transmitted to a server computer over the Internet via GPRS.
S3. The lower-computer motion-control module drives and controls each mechanism to produce the corresponding action, and feeds its motion information back to the industrial PC module;
In step S3, the lower-computer motion-control module further comprises: a motion control card, motor drivers, servomotors, a lifting mechanism, a walking mechanism, and an attitude-maintaining mechanism. The motion control card in the lower-computer motion-control module accepts the motion-path-planning information sent by the industrial PC, interpolates the motion path, and produces motion control signals. The motion control card delivers the generated motion control signals to the motor drivers, each of which drives a servomotor. A servomotor drives the lifting mechanism to produce lifting motion; a servomotor drives the walking mechanism to make the walking-aid robot walk; and a servomotor drives the attitude-maintaining mechanism to keep the attitude stable while the walking-aid robot moves.
In addition, the specific embodiment of the present invention may also comprise the following steps:
S1. The ARM processor system performs coarse positioning and fine positioning from the images collected by the global camera and the local camera; the ARM processor system combines the laser-sensor data with the tilt-sensor data to generate three-dimensional data; the ARM processor fuses the three-dimensional data with the coarse- and fine-positioning information to generate the walking-aid robot's three-dimensional environment recognition and target-tracking information, and delivers this information to the industrial PC module;
S2. The industrial PC module reads the GPS receiver to generate a navigation map according to the command received via the headset or the touch screen, combines the three-dimensional scene reconstruction and target-tracking information sent by the ARM processor to generate motion-path-planning information, and sends it to the lower-computer motion-control module;
S3. The lower-computer motion-control module receives the motion-path-planning information sent by the industrial PC module, interpolates the motion path, and produces motion control signals; it delivers the motion control signals to the motor drivers, each of which drives a servomotor so that the walking-aid robot respectively lifts, walks, and maintains its attitude; and it feeds the motion information of each mechanism back to the industrial PC module;
S4. The walking-aid robot motion parameters produced by the industrial PC module and the user's health information are displayed on the touch screen in real time, output through the loudspeaker, or transmitted to a server computer over the Internet via GPRS.
Beneficial effects of the present invention: by fusing multiple sensors, the invention builds a faithful three-dimensional environment for the walking-aid robot, so that a better motion path can be generated for safe, smooth, and reliable walking, effective target tracking can be achieved, and the degree of intelligence of the walking-aid robot is raised.
The above are only preferred embodiments of the present invention and the technical principles applied. Any changes or substitutions readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall be encompassed within the protection scope of the present invention.

Claims (10)

1. the assistant robot three-dimensional environment recognition system based on Multi-sensor Fusion is characterized in that, this system comprises: ARM message processing module, industrial computer module, slave computer motion-control module; Wherein, the ARM message processing module is set up 3 D scene rebuilding and target following information, and this information is sent to the industrial computer module after the data that gather are processed; The industrial computer module receives user's instruction, utilizes the GPS of himself to generate navigation map, with the 3 D scene rebuilding that receives and the target following information generation trajectory path planning information that combines with navigation map, is sent to the slave computer motion-control module; The slave computer control module produces corresponding actions according to trajectory path planning information-driven control assistant robot, feeds back simultaneously movable information to the industrial computer module.
2. the assistant robot three-dimensional environment recognition system based on Multi-sensor Fusion as claimed in claim 1, it is characterized in that, described ARM message processing module also comprises: the arm processor system, entirely tie up photographic head, local cameras, laser sensor and obliquity sensor; Full dimension photographic head Real-time Collection global image information is delivered to the arm processor system, and the arm processor system carries out coarse positioning according to global image to tracking target; Local cameras is delivered to the arm processor system in real time with the topography of its collection, the arm processor system carries out fine positioning according to topography to target, the data that laser sensor and obliquity sensor all collect it are passed to the arm processor system in real time, the arm processor system with laser sensor data and obliquity sensor data in conjunction with the generating three-dimensional data, the arm processor system passes through multi-sensor information fusion, carry out 3 D scene rebuilding and target following information, and these information are delivered to the industrial computer module.
3. The walker aid robot three-dimensional environment recognition system based on multi-sensor fusion as claimed in claim 1, characterized in that the industrial computer module further comprises: an industrial computer, a pulse sensor, a pressure sensor, a body temperature sensor, a GPS, a touch screen, a receiver, and a GPRS module; the industrial computer reads the pulse sensor, the pressure sensor, and the body temperature sensor in real time to generate health information; according to the command received through the receiver or the touch screen, the industrial computer reads the GPS to generate a navigation map, reads the three-dimensional scene reconstruction and target tracking information delivered by the ARM information processing module, generates motion path planning information from the navigation map together with the three-dimensional scene reconstruction and target tracking information, and delivers the generated motion path planning information to the lower computer motion control module; the walker aid robot motion parameters produced by the industrial computer module and the user's health information are either delivered to the touch screen for real-time display, or delivered to the speaker for audio output, or transmitted to a server computer over the Internet via GPRS.
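Neither this claim nor the abstract fixes a planning algorithm for generating the motion path from the navigation map combined with the reconstructed scene. As an illustrative sketch only, grid-based A* search is one common way to plan over a map onto which obstacles from the three-dimensional reconstruction have been projected (the grid encoding, function names, and unit-cost model below are assumptions, not part of the patent):

```python
from heapq import heappush, heappop

def plan_path(grid, start, goal):
    """A* over an occupancy grid; cells marked 1 (e.g. obstacles from the
    reconstructed 3-D scene overlaid on the navigation map) are blocked."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, [start])]
    seen = set()
    while open_set:
        _, g, cur, path = heappop(open_set)
        if cur == goal:
            return path
        if cur in seen:
            continue
        seen.add(cur)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = cur[0] + dr, cur[1] + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heappush(open_set, (g + 1 + h((nr, nc)), g + 1,
                                    (nr, nc), path + [(nr, nc)]))
    return None   # goal unreachable
```

With an admissible heuristic such as Manhattan distance, the returned cell sequence is a shortest obstacle-free path, which could then be handed to the lower computer motion control module.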
4. The walker aid robot three-dimensional environment recognition system based on multi-sensor fusion as claimed in claim 1, characterized in that the lower computer motion control module further comprises: a motion control card, motor drivers, servo motors, a lifting mechanism, a walking mechanism, and a posture maintaining mechanism; the motion control card in the lower computer motion control module accepts the motion path planning information sent by the industrial computer, interpolates the motion path, and produces motion control signals; the motion control card delivers the motion control signals it generates to each motor driver, and each motor driver drives its servo motor; the servo motors drive the lifting mechanism to produce lifting motion, drive the walking mechanism to make the walker aid robot walk, and drive the posture maintaining mechanism to keep the posture stable while the walker aid robot is moving.
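Claim 4 states that the motion control card interpolates the planned motion path into motion control signals, but does not specify the interpolation scheme. A minimal linear-interpolation sketch, assuming 2-D waypoint tuples and a fixed setpoint spacing (all names and the representation are illustrative assumptions):

```python
import math

def interpolate(waypoints, step):
    """Linearly interpolate between consecutive planned waypoints,
    emitting evenly spaced setpoints for the motor drivers."""
    out = [waypoints[0]]
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        n = max(1, int(dist / step))        # number of sub-steps on this segment
        for i in range(1, n + 1):
            t = i / n
            out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out
```

In a real motion control card the dense setpoint stream would be converted into per-axis pulse or velocity commands for the motor drivers; the sketch only shows the geometric interpolation step.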
5. A walker aid robot three-dimensional environment recognition method based on multi-sensor information fusion, used in the system as claimed in claim 1, characterized in that the method comprises the following steps:
S1. the ARM information processing module acquires data, processes the acquired data to build three-dimensional scene reconstruction and target tracking information, and delivers this information to the industrial computer module;
S2. the industrial computer module reads the information transmitted by the ARM information processing module, generates a navigation map with its own GPS, produces motion path planning information according to the command it receives, and delivers the motion path planning information to the lower computer motion control module;
S3. the lower computer motion control module drives the walker aid robot to perform the corresponding actions according to the motion path planning information, and simultaneously feeds motion information back to the industrial computer module.
6. The walker aid robot three-dimensional environment recognition method based on multi-sensor information fusion as claimed in claim 5, characterized in that, in step S1, the ARM information processing module further comprises: an ARM processor system, an omnidirectional camera, a local camera, a laser sensor, and an inclination sensor; wherein the omnidirectional camera collects global image information in real time and delivers it to the ARM processor system, which performs coarse positioning of the tracking target from the global image; the local camera delivers the local images it collects to the ARM processor system in real time, which performs fine positioning of the target from the local images; the laser sensor and the inclination sensor both pass the data they collect to the ARM processor system in real time, and the ARM processor system combines the laser sensor data with the inclination sensor data to generate three-dimensional data; through multi-sensor information fusion, the ARM processor system produces three-dimensional scene reconstruction and target tracking information and delivers this information to the industrial computer module.
7. The walker aid robot three-dimensional environment recognition method based on multi-sensor information fusion as claimed in claim 5, characterized in that, in step S2, the industrial computer module further comprises: an industrial computer, a pulse sensor, a pressure sensor, a body temperature sensor, a GPS, a touch screen, a receiver, and a GPRS module; wherein the industrial computer reads the pulse sensor, the pressure sensor, and the body temperature sensor in real time to generate health information; according to the command received through the receiver or the touch screen, the industrial computer reads the GPS to generate a navigation map, reads the three-dimensional scene reconstruction and target tracking information delivered by the ARM information processing module, generates motion path planning information from the navigation map together with the three-dimensional scene reconstruction and target tracking information, and delivers the generated motion path planning information to the lower computer motion control module; the walker aid robot motion parameters produced by the industrial computer and the user's health information are either delivered to the touch screen for real-time display, or delivered to the speaker for audio output, or transmitted to a server computer over the Internet via GPRS.
8. The walker aid robot three-dimensional environment recognition method based on multi-sensor information fusion as claimed in claim 5, characterized in that, in step S3, the lower computer motion control module further comprises: a motion control card, motor drivers, servo motors, a lifting mechanism, a walking mechanism, and a posture maintaining mechanism; wherein the motion control card in the lower computer motion control module accepts the motion path planning information sent by the industrial computer, interpolates the motion path, and produces motion control signals; the motion control card delivers the motion control signals it generates to each motor driver, and each motor driver drives its servo motor; the servo motors drive the lifting mechanism to produce lifting motion, drive the walking mechanism to make the walker aid robot walk, and drive the posture maintaining mechanism to keep the posture stable while the walker aid robot is moving.
9. A walker aid robot three-dimensional environment recognition method based on multi-sensor information fusion, characterized in that it comprises the following steps:
S1. the ARM processor system performs coarse positioning and fine positioning according to the images collected by the global camera and the local camera; the ARM processor system combines the laser sensor data with the inclination sensor data to generate three-dimensional data; the ARM processor fuses the three-dimensional data, the coarse positioning information, and the fine positioning information to generate the walker aid robot's three-dimensional environment recognition and target tracking information, and delivers this information to the industrial computer module;
S2. according to the command received through the receiver or the touch screen, the industrial computer module reads the GPS to generate a navigation map; combining the three-dimensional scene reconstruction and target tracking information sent by the ARM processor, it generates motion path planning information and sends the motion path planning information to the lower computer motion control module;
S3. the lower computer motion control module receives the motion path planning information sent by the industrial computer module, interpolates the motion path, and produces motion control signals; it delivers the motion control signals to each motor driver, and each motor driver drives its servo motor so that the walker aid robot performs lifting, walking, and posture maintenance respectively; motion information is simultaneously fed back to the industrial computer module.
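Steps S1 to S3 form a closed perception-planning-control loop, with the motion feedback of S3 available to the next cycle. A schematic sketch of one such cycle, with the three modules abstracted as callables (all names are illustrative, not from the patent):

```python
def control_cycle(perceive, plan, execute):
    """Run one S1 -> S2 -> S3 cycle of the recognition method.

    perceive: builds scene reconstruction / tracking info (ARM module)
    plan:     turns that info into a motion path (industrial computer)
    execute:  interpolates and runs the path, returning motion feedback
              (lower computer motion control module)
    """
    scene_info = perceive()        # S1
    path = plan(scene_info)        # S2
    feedback = execute(path)       # S3; fed back for the next cycle
    return feedback
```

Calling `control_cycle` repeatedly, and letting `plan` consume the previous cycle's feedback, would reproduce the feedback loop the claims describe between the motion control module and the industrial computer module.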
10. The walker aid robot three-dimensional environment recognition method based on multi-sensor information fusion as claimed in claim 9, characterized in that it further comprises the following step:
S4. the walker aid robot motion parameters produced by the industrial computer module and the user's health information are either delivered to the touch screen for real-time display, or delivered to the speaker for audio output, or transmitted to a server computer over the Internet via GPRS.
CN2011104139109A 2011-12-13 2011-12-13 Multi-sensor integration based three-dimensional environment identifying system for walker aid robot Pending CN102895092A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011104139109A CN102895092A (en) 2011-12-13 2011-12-13 Multi-sensor integration based three-dimensional environment identifying system for walker aid robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011104139109A CN102895092A (en) 2011-12-13 2011-12-13 Multi-sensor integration based three-dimensional environment identifying system for walker aid robot

Publications (1)

Publication Number Publication Date
CN102895092A true CN102895092A (en) 2013-01-30

Family

ID=47567788

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011104139109A Pending CN102895092A (en) 2011-12-13 2011-12-13 Multi-sensor integration based three-dimensional environment identifying system for walker aid robot

Country Status (1)

Country Link
CN (1) CN102895092A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007139710A (en) * 2005-11-22 2007-06-07 Advanced Telecommunication Research Institute International Walking-aid robot
CN101530368A (en) * 2009-04-02 2009-09-16 上海交通大学 Intelligent controller of assistant robot
CN101540948A (en) * 2009-04-09 2009-09-23 上海交通大学 Web monitoring system facing walking robot
CN101549498A (en) * 2009-04-23 2009-10-07 上海交通大学 Automatic tracking and navigation system of intelligent aid type walking robots


Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103984407A (en) * 2013-02-08 2014-08-13 英属维京群岛商速位互动股份有限公司 Method and apparatus for performing motion recognition using motion sensor fusion
CN103984407B (en) * 2013-02-08 2018-04-24 曦恩体感科技股份有限公司 The method and device of movement identification is carried out using motion sensor fusion
CN103655123A (en) * 2013-11-20 2014-03-26 南通康盛医疗器械有限公司 Safety walking aid
CN103759734A (en) * 2014-01-24 2014-04-30 成都万先自动化科技有限责任公司 Service robot assisting crossing of street
CN106726208A (en) * 2015-11-20 2017-05-31 沈阳新松机器人自动化股份有限公司 A kind of intelligent wheelchair control system
WO2017088720A1 (en) * 2015-11-26 2017-06-01 纳恩博(北京)科技有限公司 Method and device for planning optimal following path and computer storage medium
CN105425795A (en) * 2015-11-26 2016-03-23 纳恩博(北京)科技有限公司 Method for planning optimal following path and apparatus
CN107307852A (en) * 2016-04-27 2017-11-03 王方明 Intelligent robot system
CN106020201A (en) * 2016-07-13 2016-10-12 广东奥讯智能设备技术有限公司 Mobile robot 3D navigation and positioning system and navigation and positioning method
CN106155062A (en) * 2016-09-08 2016-11-23 肇庆市小凡人科技有限公司 A kind of Mobile Robot Control System
CN107831675A (en) * 2016-09-16 2018-03-23 天津思博科科技发展有限公司 Online robot control device based on intelligence learning technology
CN106679661A (en) * 2017-03-24 2017-05-17 山东大学 Simultaneous localization and mapping system and method assisted by search and rescue robot arms
CN106679661B (en) * 2017-03-24 2023-08-22 山东大学 System and method for assisting in simultaneous positioning and environment map construction of search and rescue robot arms
CN107684493B (en) * 2017-10-17 2020-05-19 冀晓静 Cloud simulation intelligent obstacle avoidance system for wheelchair
CN107684493A (en) * 2017-10-17 2018-02-13 苏州诚满信息技术有限公司 A kind of cloud for wheelchair simulates intelligent barrier avoiding system
CN108181336A (en) * 2017-11-10 2018-06-19 浙江泰克松德能源科技有限公司 A kind of full-automatic calibration data picking platform of handheld type X fluorescence spectrometer and its implementation
CN108711163A (en) * 2018-02-24 2018-10-26 中国人民解放军火箭军工程大学 A kind of robot circumstances not known method for tracking target based on multisensor
CN109106563A (en) * 2018-06-28 2019-01-01 清华大学天津高端装备研究院 A kind of automation blind-guide device based on deep learning algorithm
CN109375614A (en) * 2018-08-29 2019-02-22 上海常仁信息科技有限公司 A kind of locating and tracking system based on robot
CN111230858A (en) * 2019-03-06 2020-06-05 南昌工程学院 Visual robot motion control method based on reinforcement learning
CN111230858B (en) * 2019-03-06 2022-11-22 南昌工程学院 Visual robot motion control method based on reinforcement learning
CN110338993A (en) * 2019-07-12 2019-10-18 扬州大学 A kind of method that electric wheelchair and electric wheelchair follow personnel automatically
CN111300409A (en) * 2020-02-16 2020-06-19 上海柴孚机器人有限公司 Industrial robot path planning method
CN111267080A (en) * 2020-02-18 2020-06-12 上海柴孚机器人有限公司 Method for automatically correcting path of industrial robot
CN111267081A (en) * 2020-02-18 2020-06-12 上海柴孚机器人有限公司 Method for orienting an industrial robot
CN112656402A (en) * 2020-11-30 2021-04-16 重庆优乃特医疗器械有限责任公司 Acquisition robot linkage control system applied to 3D posture detection and analysis
CN112587378A (en) * 2020-12-11 2021-04-02 中国科学院深圳先进技术研究院 Exoskeleton robot footprint planning system and method based on vision and storage medium
CN112587378B (en) * 2020-12-11 2022-06-07 中国科学院深圳先进技术研究院 Exoskeleton robot footprint planning system and method based on vision and storage medium

Similar Documents

Publication Publication Date Title
CN102895092A (en) Multi-sensor integration based three-dimensional environment identifying system for walker aid robot
Demeester et al. User-adapted plan recognition and user-adapted shared control: A Bayesian approach to semi-autonomous wheelchair driving
US11020294B2 (en) Mobility and mobility system
Lin et al. Deep learning based wearable assistive system for visually impaired people
CN101549498B (en) Automatic tracking and navigation system of intelligent aid type walking robots
CN102411371A (en) Multi-sensor service-based robot following system and method
CN111026873B (en) Unmanned vehicle and navigation method and device thereof
CN102499638B (en) Living body detection system based on vision, hearing, smell and touch
Njah et al. Wheelchair obstacle avoidance based on fuzzy controller and ultrasonic sensors
CN102895093A (en) Walker aid robot tracking system and walker aid robot tracking method based on RGB-D (red, green and blue-depth) sensor
Manta et al. Wheelchair control by head motion using a noncontact method in relation to the pacient
Jayakody et al. Smart wheelchair to facilitate disabled individuals
Aravinth WiFi and Bluetooth based smart stick for guiding blind people
CN115698631A (en) Walking-aid robot navigation method, walking-aid robot and computer readable storage medium
Ruíz-Serrano et al. Obstacle avoidance embedded system for a smart wheelchair with a multimodal navigation interface
Mazo et al. Experiences in assisted mobility: the SIAMO project
CN115805595B (en) Robot navigation method and device and sundry cleaning robot
Wahyufitriyani et al. Review of intelligent wheelchair technology control development in the last 12 years
Lidoris et al. The autonomous city explorer project: Aims and system overview
Mazo et al. Integral system for assisted mobility
CN115399950A (en) Intelligent wheelchair with positioning navigation and multi-mode man-machine interaction functions and control method
Wang et al. An improved localization and navigation method for intelligent wheelchair in narrow and crowded environments
Ajay et al. Smart wheelchair
Aziz et al. Smart Wheelchairs: A Review on Control Methods
Sahoo et al. Autonomous navigation and obstacle avoidance in smart robotic wheelchairs

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
DD01 Delivery of document by public notice

Addressee: Li Baoshun

Document name: the First Notification of an Office Action

WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20130130