CN105425828A - Robot anti-impact double-arm coordination control system based on sensor fusion technology - Google Patents

Robot anti-impact double-arm coordination control system based on sensor fusion technology

Info

Publication number
CN105425828A
CN105425828A
Authority
CN
China
Prior art keywords
sensor
path planning
fusion
module
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510767657.5A
Other languages
Chinese (zh)
Inventor
鲁守银
李臣
王涛
高焕兵
隋首钢
刘存根
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Jianzhu University
Original Assignee
Shandong Jianzhu University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Jianzhu University filed Critical Shandong Jianzhu University
Priority to CN201510767657.5A priority Critical patent/CN105425828A/en
Publication of CN105425828A publication Critical patent/CN105425828A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 3/00: Control of position or direction
    • G05D 3/12: Control of position or direction using feedback

Abstract

The present invention discloses a robot anti-impact double-arm coordination control system based on sensor fusion technology, belonging to the field of robot control. The system comprises a binocular camera, an ultrasonic sensor, a sensor information fusion module, a path planning module, a motion controller, and first and second six-degree-of-freedom manipulators. The binocular camera and the ultrasonic sensor detect obstacle information in the operating environment; the acquired environment information is fused by the sensor information fusion module to obtain accurate obstacle position information, which is passed to the path planning module for collision-free path planning; the path planning module then sends the planned path information to the motion controller, and the motion controller controls the motion of the first and second six-degree-of-freedom manipulators. The system can effectively fuse information from different sensors and plan a reasonable path through the path planning module, thereby making robot control more precise.

Description

Robot anti-impact double-arm coordination control system based on sensor fusion technology
Technical field
The present invention relates to the field of robot control, and in particular to a robot anti-impact double-arm coordination control system based on sensor fusion technology.
Background
With the continuous improvement of industrial automation, robot applications are developing in deeper and wider directions, and robot intelligence is becoming particularly important in industrial production. In industrial production, a dual-arm robot can complete more complex operation tasks than a single-arm robot; however, for a robot to understand and adapt to its environment and to operate without colliding with obstacles in the working environment, high-performance sensors and coordination among multiple sensors are required.
Adding sensors and improving sensor performance in an intelligent system is an important means of enhancing system intelligence. A traditional single sensor has obvious shortcomings in a manipulator anti-collision system and can hardly obtain accurate environmental information, because every sensor has certain defects and relying on a single sensor alone can hardly meet the control requirements. Therefore, how to fuse information from different sensors and apply it flexibly and effectively, so as to make robot control more accurate, is currently a key focus of research.
Summary of the invention
The technical problem to be solved by the present invention is to provide a robot anti-impact double-arm coordination control system based on sensor fusion technology, which can effectively fuse information from different sensors so as to make robot control more accurate.
To solve the above technical problem, the present invention provides the following technical solution:
A robot anti-impact double-arm coordination control system based on sensor fusion technology comprises a binocular camera, an ultrasonic sensor, a sensor information fusion module, a path planning module, a motion controller, and first and second six-degree-of-freedom manipulators, wherein:
There are two binocular cameras, installed respectively at the ends of the first and second six-degree-of-freedom manipulators, and likewise two ultrasonic sensors, installed respectively at the ends of the first and second six-degree-of-freedom manipulators; the output terminals of the binocular cameras and the ultrasonic sensors are connected to the input terminal of the sensor information fusion module;
The output terminal of the sensor information fusion module is connected to the input terminal of the path planning module;
The output terminal of the path planning module is connected to the input terminal of the motion controller;
The output terminal of the motion controller is connected to the first and second six-degree-of-freedom manipulators;
The binocular camera and the ultrasonic sensor are used to detect obstacle information in the operating environment; the acquired obstacle information is fused by the sensor information fusion module to obtain comparatively accurate obstacle position information, which is passed to the path planning module for anti-collision path planning; the path planning module then sends the planned path information to the motion controller, and the motion controller controls the motion of the first and second six-degree-of-freedom manipulators.
Further, the binocular camera is further used to calibrate obstacles and to collect two-dimensional images of feature points on an obstacle from different angles;
The ultrasonic sensor is further used to monitor the surrounding environment in real time: it continuously emits ultrasonic waves and detects the echo reflected by an obstacle, measures the time T from emission to reception of the echo, and calculates the relative distance S = CT/2, where C is the speed of sound.
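The time-of-flight relation S = CT/2 can be sketched directly; the patent describes no software interface, so the function name and the assumed speed of sound in air (about 340 m/s) are illustrative:

```python
def ultrasonic_distance(t_echo_s, c=340.0):
    """Relative distance to an obstacle from ultrasonic time of flight.

    S = C*T/2: the wave travels to the obstacle and back, so the
    one-way distance is half the round-trip path length.
    """
    if t_echo_s < 0:
        raise ValueError("echo time must be non-negative")
    return c * t_echo_s / 2.0

# An echo received 10 ms after emission corresponds to roughly 1.7 m.
print(ultrasonic_distance(0.010))
```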
Further, the sensor information fusion module is further used to perform fusion calculation on the feature points captured by the binocular camera, obtain the trajectory of the feature points and the spatial attitude parameters of the plane containing them, and, after corresponding algorithmic processing, obtain the coordinates P(X_w, Y_w, Z_w) of the feature points in the world coordinate system; a neural network algorithm then fuses the three-dimensional feature-point coordinates P(X_w, Y_w, Z_w) with the distance information R captured by the ultrasonic sensor to obtain more accurate three-dimensional obstacle coordinates P(X, Y, Z).
Further, the path planning module is further used to check the inverse kinematics solutions of the manipulators according to the obstacle position information obtained by the sensor information fusion module and to design an obstacle-free motion trajectory for the robot, with the requirement that after avoiding an obstacle the robot can return to the original path and continue the corresponding operation.
Further, the neural network algorithm adopts a three-layer BP neural network. The input layer is P_in = [X_w, Y_w, Z_w, R], namely the three-dimensional spatial coordinates (X_w, Y_w, Z_w) of the object obtained after stereo-vision fusion and the distance information R obtained by the ultrasonic sensor; the hidden layer has 9 nodes; the output layer T = [X, Y, Z] is the three-dimensional spatial coordinate P(X, Y, Z) of the object obtained after fusion processing. During learning, a momentum term in the weight update and an adaptive learning-rate algorithm are adopted to suppress oscillation in the learning process, accelerate convergence, and improve the learning efficiency of the neural network.
Further, the formulas of the momentum-based weight update and the adaptive learning-rate algorithm are as follows:
ω_ij(k+1) = ω_ij(k) + Δω_ij + α(ω_ij(k) − ω_ij(k−1))
ω_j2(k+1) = ω_j2(k) + Δω_j2 + α(ω_j2(k) − ω_j2(k−1))   (1)
η(k+1) = 0.7 η(k) if E(k+1) > E(k); η(k+1) = 1.05 η(k) if E(k+1) < E(k)   (2)
Formula (1) is the weight update with added momentum, where ω_ij and ω_j2 are connection weight coefficients and α is the momentum factor, α ∈ (0, 1);
Formula (2) is the adaptive learning-rate algorithm, where η is the learning rate and E(k) is the sum of squared errors.
The present invention has the following beneficial effects:
The present invention can effectively fuse information from different sensors (namely the binocular camera and the ultrasonic sensor) and accordingly plan a reasonable path through the path planning module, thereby making robot control more accurate.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of the robot anti-impact double-arm coordination control system based on sensor fusion technology of the present invention;
Fig. 2 is a schematic diagram of the coordinate conversion used in obstacle feature-point image processing of the present invention;
Fig. 3 is a schematic diagram of the control principle of the present invention.
Detailed description of the embodiments
To make the technical problem to be solved, the technical solution, and the advantages of the present invention clearer, they are described in detail below with reference to the accompanying drawings and specific embodiments.
The present invention provides a robot anti-impact double-arm coordination control system based on sensor fusion technology which, as shown in Fig. 1, comprises a binocular camera 1, an ultrasonic sensor 2, a sensor information fusion module 3, a path planning module 4, a motion controller 5, a first six-degree-of-freedom manipulator 6, and a second six-degree-of-freedom manipulator 7, wherein:
There are two binocular cameras 1, installed respectively at the ends of the first six-degree-of-freedom manipulator 6 and the second six-degree-of-freedom manipulator 7, and likewise two ultrasonic sensors 2, installed respectively at the ends of the first six-degree-of-freedom manipulator 6 and the second six-degree-of-freedom manipulator 7; the output terminals of the binocular cameras 1 and the ultrasonic sensors 2 are connected to the input terminal of the sensor information fusion module 3;
The output terminal of the sensor information fusion module 3 is connected to the input terminal of the path planning module 4;
The output terminal of the path planning module 4 is connected to the input terminal of the motion controller 5;
The output terminal of the motion controller 5 is connected to the first six-degree-of-freedom manipulator 6 and the second six-degree-of-freedom manipulator 7;
The binocular camera 1 and the ultrasonic sensor 2 are used to detect obstacle information in the operating environment; the acquired obstacle information is fused by the sensor information fusion module 3 to obtain comparatively accurate obstacle position information, which is passed to the path planning module 4 for anti-collision path planning; the path planning module 4 then sends the planned path information to the motion controller 5, and the motion controller 5 controls the motion of the first six-degree-of-freedom manipulator 6 and the second six-degree-of-freedom manipulator 7.
The present invention can effectively fuse information from different sensors (the binocular camera and the ultrasonic sensor) and accordingly plan a reasonable path through the path planning module, thereby making robot control more accurate.
As a modification of the present invention, the binocular camera 1 is further used to calibrate obstacles and to collect two-dimensional images of feature points on an obstacle from different angles. Specifically, the world coordinate system, camera coordinate system, and imaging-plane coordinate system shown in Fig. 2 are established when obstacle feature-point position information is determined from the collected two-dimensional images. The steps for determining the position coordinates of obstacle feature points from the collected two-dimensional images are as follows:
(1) The binocular camera collects two-dimensional images of the feature points on the obstacle from different angles and obtains the coordinates (x, y) of a feature point in the imaging-plane coordinate system;
(2) The coordinates (X_c, Y_c, Z_c) of the feature point in the camera coordinate system can then be obtained from (x, y) by the following formula:
(x − u_0)(1 + k_1 r² + k_2 r⁴) = −f X_c / Z_c
(y − v_0)(1 + k_1 r² + k_2 r⁴) = −f Y_c / Z_c   (1)
where (x, y) are the coordinates of the feature point in the imaging plane;
k_1 and k_2 are the radial distortion compensation coefficients of the binocular camera lens;
(u_0, v_0) is the intersection of the camera optical axis and the imaging plane;
r² = (x − u_0)² + (y − v_0)².
When determining (X_c, Y_c, Z_c), the epipolar constraint and a stereo matching algorithm can be applied to find the two imaging points (x_1, y_1) and (x_2, y_2) of the binocular camera; substituting these two imaging points into formula (1) yields four equations, from which the three unknowns (X_c, Y_c, Z_c) can be solved, thereby obtaining the coordinates (X_c, Y_c, Z_c) of the feature point in the camera coordinate system.
(3) The coordinates P(X_w, Y_w, Z_w) of the obstacle feature point in the world coordinate system can then be obtained from (X_c, Y_c, Z_c) by the following formula:
[X_w, Y_w, Z_w] = ([X_c, Y_c, Z_c] + t) · R⁻¹   (2)
where t = [X_0, Y_0, Z_0]ᵀ is the translation vector between the camera coordinate system and the world coordinate system;
R = [1, 0, 0; 0, cos α, −sin α; 0, sin α, cos α] · [cos β, 0, sin β; 0, 1, 0; −sin β, 0, cos β] · [cos γ, −sin γ, 0; sin γ, cos γ, 0; 0, 0, 1]   (3)
where α, β, γ are the rotation angles of the camera coordinate system relative to the world coordinate system.
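The camera-to-world conversion of formulas (2) and (3) can be sketched numerically. This is an illustrative sketch, not the patent's implementation: numpy, the function names, and the row-vector convention ([X_w, Y_w, Z_w] = ([X_c, Y_c, Z_c] + t) · R⁻¹, following formula (2)) are assumptions:

```python
import numpy as np

def rotation_matrix(alpha, beta, gamma):
    """R = Rx(alpha) @ Ry(beta) @ Rz(gamma), as in formula (3)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

def camera_to_world(p_cam, t, alpha, beta, gamma):
    """Formula (2): [Xw, Yw, Zw] = ([Xc, Yc, Zc] + t) · R^-1."""
    R = rotation_matrix(alpha, beta, gamma)
    return (np.asarray(p_cam) + np.asarray(t)) @ np.linalg.inv(R)

# With zero rotation angles the transform reduces to a pure translation.
print(camera_to_world([1.0, 2.0, 3.0], [0.5, 0.0, -1.0], 0, 0, 0))
```

Since R is a product of rotations it is orthogonal, so R⁻¹ equals Rᵀ; the explicit inverse is kept only to mirror formula (2).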
The ultrasonic sensor 2 is further used to monitor the surrounding environment in real time: it continuously emits ultrasonic waves and detects the echo reflected by an obstacle, measures the time T from emission to reception of the echo, and calculates the relative distance S = CT/2, where C is the speed of sound. In this way, through the joint action of the two kinds of sensors (binocular camera and ultrasonic sensor), more accurate environmental information can be obtained.
As another improvement of the present invention, the path planning module 4 is further used to check the inverse kinematics solutions of the manipulators according to the obstacle position information obtained by the sensor information fusion module 3 and to design an obstacle-free motion trajectory for the robot, with the requirement that after avoiding an obstacle the robot can return to the original path and continue the corresponding operation.
As a further improvement of the present invention, the sensor information fusion module 3 is further used to perform fusion calculation on the feature points captured by the binocular camera 1, obtain the trajectory of the feature points and the spatial attitude parameters of the plane containing them, and, after corresponding algorithmic processing, obtain the coordinates P(X_w, Y_w, Z_w) of the feature points in the world coordinate system; a neural network algorithm then fuses the three-dimensional feature-point coordinates P(X_w, Y_w, Z_w) with the distance information R captured by the ultrasonic sensor 2 to obtain more accurate three-dimensional obstacle coordinates P(X, Y, Z).
Preferably, the neural network algorithm can adopt a three-layer BP neural network. The input layer is P_in = [X_w, Y_w, Z_w, R], namely the object's three-dimensional spatial coordinates P(X_w, Y_w, Z_w) obtained after stereo-vision fusion and the distance information R obtained by the ultrasonic sensor; the hidden layer has 9 nodes; the output layer T = [X, Y, Z] is the three-dimensional spatial coordinate P(X, Y, Z) of the object obtained after fusion processing.
In the BP neural network algorithm, training samples are first used to train the network; the network determines classification criteria according to the similarity of the samples the system has received, acquires knowledge through its learning ability, and after a number of learning steps obtains a probabilistic inference mechanism. According to the learned inference mechanism, the position information P(X_w, Y_w, Z_w) obtained by the vision sensor and the distance information R obtained by the ultrasonic sensor can be fused to obtain comparatively accurate obstacle position information P(X, Y, Z).
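The 4-9-3 fusion network described above can be sketched as a forward pass. This is an illustrative sketch only: the weights are random rather than trained, and the sigmoid hidden activation with a linear output layer is an assumption, since the patent does not specify activation functions:

```python
import numpy as np

rng = np.random.default_rng(0)

# 4 inputs [Xw, Yw, Zw, R] -> 9 hidden nodes -> 3 outputs [X, Y, Z]
W_ij = rng.standard_normal((4, 9)) * 0.1   # input -> hidden weights
W_j2 = rng.standard_normal((9, 3)) * 0.1   # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(p_in):
    """Forward propagation: input layer -> hidden layer -> output layer."""
    x_hidden = sigmoid(np.asarray(p_in) @ W_ij)  # x'_j in the text
    return x_hidden @ W_j2                       # T = [X, Y, Z]

p_in = [0.2, -0.1, 1.5, 1.48]  # vision coordinates plus ultrasonic range
print(forward(p_in).shape)     # a 3-vector estimate of P(X, Y, Z)
```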
The learning process of the BP neural network algorithm consists of forward propagation and backpropagation. Forward propagation is performed first; if the output layer does not produce the expected output during forward propagation, the algorithm turns to backpropagation, and training continues step by step until the error meets the requirements.
Forward propagation in the BP learning process is the mapping from the input layer through the hidden layer to the output layer. After forward propagation, the output layer produces the neuron output y_n(k). If the desired network output is y(k), the error between the network output and the desired output is:
e(k) = y(k) − y_n(k)   (4)
The error performance objective function is taken as:
E = (1/2) e(k)²   (5)
Backpropagation in the BP learning process propagates the error signal backward along the connection paths and adjusts the neuron weights of each layer, reducing the error by gradient descent. The learning algorithm for the connection weights ω_j2 between the output layer and the hidden layer is:
Δω_j2 = −η ∂E/∂ω_j2 = η · e(k) · x′_j   (6)
where η is the learning rate, η ∈ [0, 1];
x′_j is the output of the hidden-layer neurons.
The network weights at time k+1 are:
ω_j2(k+1) = ω_j2(k) + Δω_j2   (7)
The learning algorithm for the connection weights ω_ij between the hidden layer and the input layer is:
Δω_ij = −η ∂E/∂ω_ij = η · e(k) · ∂y_n/∂ω_ij   (8)
The network weights at time k+1 are:
ω_ij(k+1) = ω_ij(k) + Δω_ij   (9)
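The update rules (6) and (7) are a plain delta rule; a minimal single-step sketch with scalar, purely illustrative values (not from the patent):

```python
eta = 0.5        # learning rate eta, eta in [0, 1]
e_k = 0.2        # output error e(k) = y(k) - y_n(k)
x_hidden = 0.8   # hidden-neuron output x'_j

# Formula (6): gradient-descent increment for an output-layer weight
delta_w_j2 = eta * e_k * x_hidden

# Formula (7): weight at time k+1
w_j2 = 1.0
w_j2_next = w_j2 + delta_w_j2
print(w_j2_next)  # close to 1.08
```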
To suppress oscillation of the BP neural network during learning and to improve its convergence speed and learning efficiency, the momentum method and an adaptive learning rate are adopted in the learning process.
The momentum method adds a momentum factor to the weight-update process; a preferred concrete algorithm is:
ω_j2(k+1) = ω_j2(k) + Δω_j2 + α(ω_j2(k) − ω_j2(k−1))   (10)
ω_ij(k+1) = ω_ij(k) + Δω_ij + α(ω_ij(k) − ω_ij(k−1))   (11)
where α is the momentum factor, α ∈ (0, 1).
Further, the adaptive learning-rate control strategy in the neural network learning process can proceed as follows:
(1) Assume an initial learning rate η and train to obtain the error performance index E(k);
(2) Continue training to obtain the error performance index E(k+1);
(3) If E(k+1) > E(k), set η(k+1) = 0.7 η(k) and E(k) = E(k+1), then go to (2);
(4) If E(k+1) < E(k), set η(k+1) = 1.05 η(k) and E(k) = E(k+1);
(5) If E(k) meets the requirements, stop training; otherwise go to (2).
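The five steps above reduce to a simple schedule update. The 0.7 and 1.05 factors come from the text's formula (2); the function name is an illustrative assumption:

```python
def adapt_learning_rate(eta, e_prev, e_new):
    """Adaptive learning-rate rule: shrink eta by a factor of 0.7 when
    the error grows, expand it by 1.05 when the error shrinks."""
    if e_new > e_prev:
        return 0.7 * eta
    if e_new < e_prev:
        return 1.05 * eta
    return eta

eta = 0.1
assert adapt_learning_rate(eta, 1.0, 1.2) == 0.7 * eta   # error rose
assert adapt_learning_rate(eta, 1.0, 0.8) == 1.05 * eta  # error fell
```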
The above are preferred embodiments of the present invention. It should be pointed out that, for those skilled in the art, several improvements and modifications can be made without departing from the principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (6)

1. A robot anti-impact double-arm coordination control system based on sensor fusion technology, characterized by comprising a binocular camera, an ultrasonic sensor, a sensor information fusion module, a path planning module, a motion controller, and first and second six-degree-of-freedom manipulators, wherein:
There are two binocular cameras, installed respectively at the ends of the first and second six-degree-of-freedom manipulators, and likewise two ultrasonic sensors, installed respectively at the ends of the first and second six-degree-of-freedom manipulators; the output terminals of the binocular cameras and the ultrasonic sensors are connected to the input terminal of the sensor information fusion module;
The output terminal of the sensor information fusion module is connected to the input terminal of the path planning module;
The output terminal of the path planning module is connected to the input terminal of the motion controller;
The output terminal of the motion controller is connected to the first and second six-degree-of-freedom manipulators;
The binocular camera and the ultrasonic sensor are used to detect obstacle information in the operating environment; the acquired obstacle information is fused by the sensor information fusion module to obtain comparatively accurate obstacle position information, which is passed to the path planning module for anti-collision path planning; the path planning module then sends the planned path information to the motion controller, and the motion controller controls the motion of the first and second six-degree-of-freedom manipulators.
2. The robot anti-impact double-arm coordination control system based on sensor fusion technology according to claim 1, characterized in that the binocular camera is further used to calibrate obstacles and to collect two-dimensional images of feature points on an obstacle from different angles;
The ultrasonic sensor is further used to monitor the surrounding environment in real time: it continuously emits ultrasonic waves and detects the echo reflected by an obstacle, measures the time T from emission to reception of the echo, and calculates the relative distance S = CT/2, where C is the speed of sound.
3. The robot anti-impact double-arm coordination control system based on sensor fusion technology according to claim 2, characterized in that the sensor information fusion module is further used to perform fusion calculation on the feature points captured by the binocular camera, obtain the trajectory of the feature points and the spatial attitude parameters of the plane containing them, and, after corresponding algorithmic processing, obtain the coordinates P(X_w, Y_w, Z_w) of the feature points in the world coordinate system; a neural network algorithm then fuses the three-dimensional feature-point coordinates P(X_w, Y_w, Z_w) with the distance information R captured by the ultrasonic sensor to obtain more accurate three-dimensional obstacle coordinates P(X, Y, Z).
4. The robot anti-impact double-arm coordination control system based on sensor fusion technology according to claim 3, characterized in that the path planning module is further used to check the inverse kinematics solutions of the manipulators according to the obstacle position information obtained by the sensor information fusion module and to design an obstacle-free motion trajectory for the robot, with the requirement that after avoiding an obstacle the robot can return to the original path and continue the corresponding operation.
5. The robot anti-impact double-arm coordination control system based on sensor fusion technology according to claim 3, characterized in that the neural network algorithm adopts a three-layer BP neural network, the input layer being P_in = [X_w, Y_w, Z_w, R], namely the three-dimensional spatial coordinates (X_w, Y_w, Z_w) of the object obtained after stereo-vision fusion and the distance information R obtained by the ultrasonic sensor, the hidden layer having 9 nodes, and the output layer T = [X, Y, Z] being the three-dimensional spatial coordinate P(X, Y, Z) of the object obtained after fusion processing; during learning, a momentum term in the weight update and an adaptive learning-rate algorithm are adopted to suppress oscillation in the learning process, accelerate convergence, and improve the learning efficiency of the neural network.
6. The robot anti-impact double-arm coordination control system based on sensor fusion technology according to claim 5, characterized in that the formulas of the momentum-based weight update and the adaptive learning-rate algorithm are as follows:
ω_ij(k+1) = ω_ij(k) + Δω_ij + α(ω_ij(k) − ω_ij(k−1))
ω_j2(k+1) = ω_j2(k) + Δω_j2 + α(ω_j2(k) − ω_j2(k−1))   (1)
η(k+1) = 0.7 η(k) if E(k+1) > E(k); η(k+1) = 1.05 η(k) if E(k+1) < E(k)   (2)
Formula (1) is the weight update with added momentum, where ω_ij and ω_j2 are connection weight coefficients and α is the momentum factor, α ∈ (0, 1);
Formula (2) is the adaptive learning-rate algorithm, where η is the learning rate and E(k) is the sum of squared errors.
CN201510767657.5A 2015-11-11 2015-11-11 Robot anti-impact double-arm coordination control system based on sensor fusion technology Pending CN105425828A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510767657.5A CN105425828A (en) 2015-11-11 2015-11-11 Robot anti-impact double-arm coordination control system based on sensor fusion technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510767657.5A CN105425828A (en) 2015-11-11 2015-11-11 Robot anti-impact double-arm coordination control system based on sensor fusion technology

Publications (1)

Publication Number Publication Date
CN105425828A true CN105425828A (en) 2016-03-23

Family

ID=55504095

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510767657.5A Pending CN105425828A (en) 2015-11-11 2015-11-11 Robot anti-impact double-arm coordination control system based on sensor fusion technology

Country Status (1)

Country Link
CN (1) CN105425828A (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106020232A (en) * 2016-07-07 2016-10-12 天津航天中为数据系统科技有限公司 Obstacle avoidance device for unmanned aerial vehicle and its method for obstacle avoidance
CN106003043A (en) * 2016-06-20 2016-10-12 先驱智能机械(深圳)有限公司 Obstacle avoidance method and obstacle avoidance system of mechanical arm
CN106094516A (en) * 2016-06-08 2016-11-09 南京大学 A kind of robot self-adapting grasping method based on deeply study
CN106802668A (en) * 2017-02-16 2017-06-06 上海交通大学 Based on the no-manned plane three-dimensional collision avoidance method and system that binocular is merged with ultrasonic wave
CN107272705A (en) * 2017-07-31 2017-10-20 中南大学 A kind of multiple neural network controlling planning method of robot path under intelligent environment
CN107378955A (en) * 2017-09-07 2017-11-24 云南电网有限责任公司普洱供电局 A kind of distribution robot for overhauling motion arm AUTONOMOUS TASK method based on multi-sensor information fusion
CN107571246A (en) * 2017-10-13 2018-01-12 上海神添实业有限公司 A kind of component assembly system and method based on tow-armed robot
CN108107716A (en) * 2017-12-19 2018-06-01 电子科技大学 A kind of Parameter Measuring method based on improved BP neural network
WO2019010612A1 (en) * 2017-07-10 2019-01-17 深圳市艾唯尔科技有限公司 Robot joint anti-collision protection system and method based on sensing fusion technology
CN110069057A (en) * 2018-01-24 2019-07-30 南京机器人研究院有限公司 A kind of obstacle sensing method based on robot
CN112084810A (en) * 2019-06-12 2020-12-15 杭州海康威视数字技术股份有限公司 Obstacle detection method and device, electronic equipment and storage medium
CN113362395A (en) * 2021-06-15 2021-09-07 上海追势科技有限公司 Sensor fusion-based environment sensing method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6639553B2 (en) * 2000-04-22 2003-10-28 Ching-Fang Lin Passive/ranging/tracking processing method for collision avoidance guidance
CN101273688A (en) * 2008-05-05 2008-10-01 江苏大学 Apparatus and method for flexible pick of orange picking robot
CN102156476A (en) * 2011-04-14 2011-08-17 山东大学 Intelligent space and nurse robot multi-sensor system and information fusion method of intelligent space and nurse robot multi-sensor system
CN103019245A (en) * 2013-01-07 2013-04-03 西北农林科技大学 Obstacle avoidance system of mountain farming robot on basis of multi-sensor information fusion
CN103984936A (en) * 2014-05-29 2014-08-13 中国航空无线电电子研究所 Multi-sensor multi-feature fusion recognition method for three-dimensional dynamic target recognition


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
何慧娟 (He Huijuan): "Research on Obstacle Detection and Localization for Mobile Robots Based on Multiple Sensors", China Master's Theses Full-text Database, Information Science and Technology *
开平安 (Kai Ping'an) et al.: "Advanced Control Technology for Thermal Processes in Thermal Power Plants", 28 February 2010 *
鲁守银 (Lu Shouyin) et al.: "Task-Flow-Based Intelligent Control System for Live-Line Working Robots", Manufacturing Automation *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106094516A (en) * 2016-06-08 2016-11-09 南京大学 A kind of robot self-adapting grasping method based on deeply study
CN106003043A (en) * 2016-06-20 2016-10-12 先驱智能机械(深圳)有限公司 Obstacle avoidance method and obstacle avoidance system of mechanical arm
CN106020232A (en) * 2016-07-07 2016-10-12 天津航天中为数据系统科技有限公司 Obstacle avoidance device for unmanned aerial vehicle and its method for obstacle avoidance
CN106802668A (en) * 2017-02-16 2017-06-06 上海交通大学 Based on the no-manned plane three-dimensional collision avoidance method and system that binocular is merged with ultrasonic wave
CN106802668B (en) * 2017-02-16 2020-11-17 上海交通大学 Unmanned aerial vehicle three-dimensional collision avoidance method and system based on binocular and ultrasonic fusion
WO2019010612A1 (en) * 2017-07-10 2019-01-17 深圳市艾唯尔科技有限公司 Robot joint anti-collision protection system and method based on sensing fusion technology
CN107272705B (en) * 2017-07-31 2018-02-23 中南大学 A kind of multiple neural network controlling planning method of robot path under intelligent environment
CN107272705A (en) * 2017-07-31 2017-10-20 中南大学 A kind of multiple neural network controlling planning method of robot path under intelligent environment
CN107378955A (en) * 2017-09-07 2017-11-24 云南电网有限责任公司普洱供电局 A kind of distribution robot for overhauling motion arm AUTONOMOUS TASK method based on multi-sensor information fusion
CN107571246A (en) * 2017-10-13 2018-01-12 上海神添实业有限公司 A kind of component assembly system and method based on tow-armed robot
CN108107716A (en) * 2017-12-19 2018-06-01 电子科技大学 A kind of Parameter Measuring method based on improved BP neural network
CN110069057A (en) * 2018-01-24 2019-07-30 南京机器人研究院有限公司 A kind of obstacle sensing method based on robot
CN112084810A (en) * 2019-06-12 2020-12-15 杭州海康威视数字技术股份有限公司 Obstacle detection method and device, electronic equipment and storage medium
CN112084810B (en) * 2019-06-12 2024-03-08 杭州海康威视数字技术股份有限公司 Obstacle detection method and device, electronic equipment and storage medium
CN113362395A (en) * 2021-06-15 2021-09-07 上海追势科技有限公司 Sensor fusion-based environment sensing method

Similar Documents

Publication Publication Date Title
CN105425828A (en) Robot anti-impact double-arm coordination control system based on sensor fusion technology
CN110703747B (en) Robot autonomous exploration method based on simplified generalized Voronoi diagram
CN107562048B (en) Dynamic obstacle avoidance control method based on laser radar
EP3405845B1 (en) Object-focused active three-dimensional reconstruction
CN114384920B (en) Dynamic obstacle avoidance method based on real-time construction of local grid map
Chong et al. Mobile-robot map building from an advanced sonar array and accurate odometry
Wu et al. Autonomous obstacle avoidance of an unmanned surface vehicle based on cooperative manoeuvring
Zhang et al. Sim2real learning of obstacle avoidance for robotic manipulators in uncertain environments
Chen et al. Robot navigation with map-based deep reinforcement learning
Hanebeck et al. Roman: A mobile robotic assistant for indoor service applications
CN113341706B (en) Man-machine cooperation assembly line system based on deep reinforcement learning
CN114905508B (en) Robot grabbing method based on heterogeneous feature fusion
CN105892493A (en) Information processing method and mobile device
CN116540731B (en) Path planning method and system integrating LSTM and SAC algorithms
CN112904890A (en) Unmanned aerial vehicle automatic inspection system and method for power line
CN111260751A (en) Mapping method based on multi-sensor mobile robot
CN109814565A (en) The unmanned boat intelligence navigation control method of space-time double fluid data-driven depth Q study
CN110780670A (en) Robot obstacle avoidance control method based on fuzzy control algorithm
Zhang et al. Intelligent vector field histogram based collision avoidance method for auv
CN111026121A (en) Multi-level three-dimensional obstacle avoidance control method and device for intelligent sweeper
CN110610130A (en) Multi-sensor information fusion power transmission line robot navigation method and system
You et al. A novel obstacle avoidance method for low-cost household mobile robot
Guo et al. Environmental perception of mobile robot
CN117295589A (en) System and method for using simulated learning in training and refining robot control strategies
CN112947426A (en) Cleaning robot motion control system and method based on multi-sensing fusion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20160323

RJ01 Rejection of invention patent application after publication