CN108931979B - Visual tracking mobile robot based on ultrasonic auxiliary positioning and control method - Google Patents

Visual tracking mobile robot based on ultrasonic auxiliary positioning and control method

Info

Publication number
CN108931979B
Authority
CN
China
Prior art keywords
module
tracking
ultrasonic
robot
vision
Prior art date
Legal status
Active
Application number
CN201810650899.XA
Other languages
Chinese (zh)
Other versions
CN108931979A (en)
Inventor
Zhu Hua (朱华)
You Shaoze (由韶泽)
Ge Shirong (葛世荣)
Li Menggang (李锰钢)
Chen Chang (陈常)
Current Assignee
China University of Mining and Technology CUMT
Original Assignee
China University of Mining and Technology CUMT
Priority date
Filing date
Publication date
Application filed by China University of Mining and Technology CUMT filed Critical China University of Mining and Technology CUMT
Priority to CN201810650899.XA
Publication of CN108931979A
Application granted
Publication of CN108931979B
Status: Active


Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242: Control of position or course in two dimensions specially adapted to land vehicles using non-visible light signals, e.g. IR or UV signals

Abstract

The invention discloses a vision-tracking mobile robot based on ultrasonic-assisted positioning and a control method. The vision-tracking mobile robot comprises a robot housing, a chassis, a servo pan-tilt (steering-engine cradle head), a vision module, separate ultrasonic receiving modules, a control module, a driver, drive motors, drive wheels and universal wheels (casters). The control module receives the signals collected by the vision module and generates drive commands to realize target tracking; the separate ultrasonic receiving modules collect the ultrasonic signals emitted by a separate ultrasonic transmitting module carried by the tracked target, and the control module performs positioning from these ultrasonic signals. The invention effectively integrates the visual tracking and the ultrasonic positioning of the mobile robot, realizes continuous tracking of a specific target, and improves positioning accuracy and robustness.

Description

Visual tracking mobile robot based on ultrasonic auxiliary positioning and control method
Technical Field
The invention belongs to the technical field of mobile robots, and particularly relates to a vision-tracking mobile robot and a control method thereof.
Background
Target tracking is an important problem in the field of computer vision and is widely applied in automatic surveillance, human-machine interaction, military reconnaissance and other fields. A vision module gives the robot a computer analogue of human vision: according to its programming it tracks, identifies and locates a designated target, and it supplies the target's physical parameters (such as category, position, speed and number) to the controller for the robot's subsequent functions. Visual tracking, as a branch of computer vision, is widely used for autonomous following by mobile robots.
However, in the complex scenes of daily life, changes in the target's characteristics and in the environment make it difficult for a robot tracking system to lock onto a consistent target, which poses a serious challenge to the design of a robot visual tracking system. Owing to the physical characteristics of the vision sensor (camera) and the limits of the algorithm, large errors often arise under illumination changes in the external environment and under appearance deformation, rapid motion, motion blur, similar-background interference and even occlusion of the tracked target, and the target is occasionally lost. Moreover, a typical target tracking algorithm merely re-detects the target within the currently visible area after it is lost. Tracking a target with the vision sensor alone is therefore severely limited.
In the traditional TLD tracking algorithm, the detection module is a random-fern classifier trained on samples drawn from the online model acquired by the tracking module; it is a variant of the random forest. The method, however, consumes a large amount of memory; for attributes that take many distinct values, those attributes dominate the random ferns, so the attribute weights learned from the data are unreliable; and random-fern classifiers overfit on noisier classification or regression problems.
Disclosure of Invention
In order to solve the technical problems of the background art, the invention provides a vision-tracking mobile robot based on ultrasonic-assisted positioning and a control method thereof, which effectively integrate the visual tracking and the ultrasonic positioning of the mobile robot, realize continuous tracking of a specific target, and improve positioning accuracy and robustness.
In order to achieve the technical purpose, the technical scheme of the invention is as follows:
a vision tracking mobile robot based on ultrasonic auxiliary positioning comprises a robot shell, a chassis, a steering engine cradle head, a vision module, a separated ultrasonic receiving module, a control module, a driver, a driving motor, driving wheels and universal wheels, wherein the steering engine cradle head is arranged on the chassis; the control module is electrically connected with the steering engine holder, the vision module, the separated ultrasonic receiving module and the driver; the steering engine cradle head is arranged at the top of the outer side of the robot shell, and the visual module is arranged on the steering engine cradle head; at least 3 separated ultrasonic receiving modules are arranged and positioned at the front part of the robot shell in a triangular arrangement; the driving wheel and the universal wheel are arranged at the bottom of the chassis, the driving wheel rotates along with the driving motor, the driving motor is driven by the driver, the control module receives signals collected by the vision module, a driving instruction is generated, target tracking is achieved, the separated ultrasonic receiving module collects ultrasonic signals sent by the separated ultrasonic transmitting module carried by a tracked target, and the control module carries out robot position positioning according to the signals collected by the separated ultrasonic receiving module.
Further, the robot comprises at least 4 infrared modules, of which 2 are arranged at the middle of the front end of the robot housing and the other 2 are mounted on the two sides of the robot housing; the 4 infrared modules are each electrically connected with the control module.
Further, a wireless communication module is provided and electrically connected with the control module; through it the control module communicates wirelessly with the user's intelligent terminal and receives the robot control commands issued by the intelligent terminal.
Further, 2 drive motors are provided, each corresponding to 1 drive wheel; the 2 drive wheels are arranged on the left and right sides of the chassis, and the robot steers by differential speed; 2 universal wheels are provided, arranged at the front and rear of the chassis respectively.
The control method of the above vision-tracking mobile robot comprises a visual tracking method, an ultrasonic positioning method and an infrared obstacle-avoidance method.
Further, the visual tracking method adopts the LRT algorithm, structured as the combination of a tracking module, a learning module and a detection module. The detection module and the tracking module compute in parallel without interfering with each other; their running results are sent to the learning module for learning and correction, and the learned model feeds back to the tracking module and the detection module to update the model features in real time, so that tracking can continue even when the target's appearance changes.
Further, the tracking module tracks the target with the Lucas-Kanade optical flow method and integrates an image pyramid via the Median-Flow (median optical flow) method to guarantee sample tracking accuracy; the learning module adopts the PN learning mechanism and iteratively trains the classifier on the positive and negative samples generated by the tracking module and the detection module, improving the accuracy of the detection module; the detection module adopts an online SVM (support vector machine) classifier.
Further, the objective function for the hyperplane parameter h adopted by the online SVM (support vector machine) classifier is:
h = argmin_h { (λ/2)·||h||² + (1/N)·Σ_{i=1..N} l(h; (v_i, c_i)) }
where (v_i, c_i), i = 1, 2, …, N, form the training set, N being the total number of samples; v_i is the feature vector generated from the i-th sample; c_i ∈ {+1, −1} is the class label, with +1 denoting a positive sample and −1 a negative sample; l(h; (v_i, c_i)) = max{0, 1 − c_i·⟨h, v_i⟩}, where ⟨h, v_i⟩ is the inner product of h and v_i; λ is a regularization parameter, λ > 0;
the hyperplane parameter h is updated as follows: given a training sample v_t, compute the margin m_t = h_{t−1}·v_t and receive the sample label c_t; if m_t ≠ c_t, let l(h; (v_t, c_t)) = 0, and when m_t·c_t < 1, update h:
h_t = h_{t−1} + [max{0, 1 − c_t·m_t} / (v_t^T·Σ_{t−1}·v_t + r)]·Σ_{t−1}·c_t·v_t
when h_t ≠ h_{t−1}, the confidence Σ is updated:
Σ_t = Σ_{t−1} − (Σ_{t−1}·v_t·v_t^T·Σ_{t−1}) / (v_t^T·Σ_{t−1}·v_t + r)
where the subscript t denotes the iteration index and r is a hyperparameter controlling the update rate of h.
Further, if the target is lost and the visual tracking method fails, or the target confidence falls below a set threshold, the ultrasonic positioning method is started, using trilateration, with the positioning equations:
(x1 − x0)² + (y1 − y0)² = d1²
(x2 − x0)² + (y2 − y0)² = d2²
(x3 − x0)² + (y3 − y0)² = d3²
In the above formulas, (x1, y1), (x2, y2), (x3, y3) are the coordinates of the 3 separate ultrasonic receiving modules, (x0, y0) is the coordinate of the tracked target, and d1, d2, d3 are the distances from (x1, y1), (x2, y2), (x3, y3) to (x0, y0), calculated from the ultrasonic signals received by the 3 separate ultrasonic receiving modules using the ultrasonic ranging (time-of-flight) principle.
Further, the infrared obstacle-avoidance method is as follows: when the 2 infrared modules at the middle of the front end of the outside of the robot housing both output a low level, indicating an obstacle in front of the robot, the control module preferentially avoids the obstacle through an interrupt function, controlling the drive motors so that the robot first backs up a preset distance and then turns to avoid the obstacle; if one of the infrared modules on the left and right sides of the robot housing outputs a high level and the other outputs a low level, the robot first backs up a preset distance under control of the drive motors and then turns toward the side outputting the high level.
The above technical scheme brings the following beneficial effects:
the invention effectively integrates the vision tracking and the ultrasonic positioning of the mobile robot, can correct the position of the robot through the separated ultrasonic positioning under the condition that the target is out of view or the target has low reliability, enables the robot to effectively track the specific target, can relocate the vision module, realizes the continuous tracking of the specific target characteristic, and improves the positioning precision and the robustness. The invention can be applied to the fields of mobile robots, unmanned planes and the like.
Drawings
FIG. 1 is a schematic diagram of the mobile robot of the present invention;
FIG. 2 is a schematic structural diagram of the mobile robot of the present invention;
FIG. 3 is a schematic external view of the ultrasonic transmitter box carried by the target;
FIG. 4 is a structural view of the ultrasonic transmitter box carried by the target;
FIG. 5 is an overall block diagram of the control method of the present invention;
FIG. 6 is a block diagram of the visual tracking method in the control method of the present invention;
FIG. 7 is a control flow diagram of the control method of the present invention;
FIG. 8 is a schematic diagram of ultrasonic positioning in the control method of the present invention.
Description of reference numerals: 1 - robot body; 11 - robot housing; 12 - servo pan-tilt; 13 - vision module; 14 - driver; 15 - separate ultrasonic receiving module; 16 - control module; 17 - infrared module; 18 - drive motor; 19 - right power supply; 110 - Bluetooth module; 111 - switch; 112 - universal wheel; 113 - left power supply; 114 - chassis; 2 - ultrasonic transmitter box carried by the target; 21 - box housing; 22 - control board; 23 - box switch; 24 - battery; 25 - separate ultrasonic transmitting module.
Detailed Description
The technical scheme of the invention is explained in detail below with reference to the accompanying drawings.
As shown in FIGS. 1-2, a vision-tracking mobile robot based on ultrasonic-assisted positioning comprises a robot housing 11, a servo pan-tilt 12, a vision module 13, a driver 14, separate ultrasonic receiving modules 15, a control module 16, infrared modules 17, drive motors 18, a right power supply 19, a Bluetooth module 110, a switch 111, universal wheels 112, a left power supply 113 and a chassis 114, wherein:
the robot shell 11 consists of four lobes, and the separated ultrasonic receiving module 15 is arranged at the front part of the robot shell 11; the steering engine cradle head 12 is arranged on the upper part of the robot shell 11; the vision module 13 is arranged on the steering engine pan-tilt 12 and is powered by a right power supply 19; the driver 14, the separated ultrasonic receiving module 15, the control module 16, the infrared module 17, the driving motor 18 and the Bluetooth module 110 are powered by a left power supply 113, the switch 111, the Bluetooth module 110 and the infrared module 17 are mounted on a chassis 114, and a control circuit is connected with the control module 16; the driver 14 and the control module 16 are respectively installed on the left and right sides of the chassis 114.
The Bluetooth module 110 can be paired with the mobile phone of the specific target, and the soft start and stop of the robot are controlled through the phone's Bluetooth.
There are three separate ultrasonic receiving modules 15, arranged in a triangle around the robot housing 11; there are four infrared modules 17, two arranged at the front of the robot and one on each of the left and right sides.
The two drive motors 18 are distributed left and right at the bottom of the chassis 114, and steering is achieved by differential speed; the universal wheels 112 are arranged front and rear at the bottom of the chassis 114, and the bottoms of the universal wheels 112 lie in the same horizontal plane as the wheels of the drive motors 18.
As shown in FIGS. 3-4, in the ultrasonic transmitter box 2 carried by the target, the separate ultrasonic transmitting module 25 is mounted inside and exposed through a through-hole in the box housing 21, and the box switch 23 is mounted on the box housing 21; the wiring connects to the control board 22, which is powered by the battery 24.
As shown in FIGS. 5-7, the mobile robot tracks through the vision module; when the target leaves the field of view or the target confidence falls below the set threshold, the robot's position is corrected by ultrasonic positioning, so that the vision module can re-acquire the target and continuous tracking of the specific target's features is realized.
The vision module 13 communicates with the control module 16 through the IIC (I2C) protocol; the Bluetooth module 110 communicates with the control module 16 through the UART protocol; the separate ultrasonic receiving modules 15 and the infrared modules 17 communicate with the control module 16 through I/O ports.
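As an illustration of the IIC link, the following minimal Python sketch uses the smbus2 library to poll a block of target data from the vision module; the bus number, the 0x54 device address and the register layout are assumptions for illustration, not values given by the invention:

    from smbus2 import SMBus

    VISION_ADDR = 0x54   # hypothetical 7-bit I2C address of the vision module
    BLOCK_REG = 0x00     # hypothetical register holding the latest target block

    def read_target_block(bus_no=1):
        # Read 6 bytes describing the tracked target: x, y, w, h, confidence, valid.
        with SMBus(bus_no) as bus:
            x, y, w, h, conf, valid = bus.read_i2c_block_data(VISION_ADDR, BLOCK_REG, 6)
        return {"x": x, "y": y, "w": w, "h": h, "conf": conf, "valid": bool(valid)}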
After the robot's power supply is switched on, the robot connects to the user's mobile phone through Bluetooth; once the connection succeeds, the user controls the soft start and stop of the robot from the phone. The vision module then acquires the features of the tracked object, and the robot follows the target through the visual tracking algorithm. When the target leaves the field of view or the target confidence falls below the set threshold, the ultrasonic positioning system starts working as auxiliary positioning so that the robot keeps tracking the target. When the target reappears in the camera's field of view, the vision module re-detects it and tracking continues; the robot cycles through this process continuously. When an obstacle is in front of the robot, the infrared modules take priority and avoid it automatically via interrupt handling.
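The working cycle described above can be condensed into a control-loop sketch (Python). The interfaces, the returned command strings and the 0.5 confidence threshold are illustrative assumptions; the patent specifies the behavior, not an implementation:

    from dataclasses import dataclass
    from typing import Optional, Tuple

    CONF_THRESHOLD = 0.5  # example value; the patent only speaks of "a set value"

    @dataclass
    class Target:
        x: float     # horizontal offset of the target in the image (assumed units)
        conf: float  # tracker confidence in [0, 1]

    def control_step(target: Optional[Target],
                     ultra_fix: Optional[Tuple[float, float]],
                     front_obstacle: bool) -> str:
        """One cycle of the vision-first, ultrasonic-fallback policy.
        Returns the drive command as a string for illustration."""
        if front_obstacle:                     # infrared avoidance has interrupt priority
            return "back_up_then_turn"
        if target is not None and target.conf >= CONF_THRESHOLD:
            return f"steer({target.x:+.2f})"   # normal visual following
        if ultra_fix is not None:              # target lost or low confidence:
            x0, y0 = ultra_fix                 # correct pose from ultrasonic trilateration
            return f"move_toward({x0:.2f}, {y0:.2f})"
        return "stop"                          # no information: hold until re-detection

    # Example: target visible with high confidence
    print(control_step(Target(x=0.10, conf=0.8), None, False))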
The visual tracking method adopts the LRT (Long-term Real-time Tracking) algorithm, built from the cooperation of a tracking module, a learning module and a detection module. The detection module and the tracking module compute in parallel without interfering with each other; their running results are sent to the learning module for learning and correction, the target model is stored in the learner, the learned model feeds back to update the tracking and detection modules in real time, and the data are combined by an integrator, so the target can be tracked continuously even if its appearance changes. If the target disappears from the camera's field of view and visual tracking fails, or the target confidence falls below the set threshold, the system switches to ultrasonic-assisted positioning.
In the invention, the tracking module tracks the target with the Lucas-Kanade optical flow method and integrates an image pyramid via the Median-Flow (median optical flow) method to guarantee sample tracking accuracy. The learning module adopts the PN learning mechanism and iteratively trains the classifier on the positive and negative samples generated by the tracking module and the detection module, improving the accuracy of the detection module. The detection module replaces the traditional random-fern classifier with an online SVM (support vector machine) classifier, which is superior at binary classification and improves re-detection.
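As a concrete illustration of this tracking step, the sketch below (Python with OpenCV) runs pyramidal Lucas-Kanade forward and backward and keeps only the points whose forward-backward error falls below the median, which is the Median-Flow filtering rule; the window size, pyramid depth and termination criteria are illustrative choices, not values from the patent:

    import cv2
    import numpy as np

    LK_PARAMS = dict(winSize=(15, 15), maxLevel=3,  # image-pyramid depth: illustrative
                     criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 20, 0.03))

    def median_flow_step(prev_gray, next_gray, pts):
        """Track pts (N x 1 x 2 float32) from prev_gray to next_gray, then back,
        and keep only the points whose forward-backward error is below the median."""
        fwd, st1, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, pts, None, **LK_PARAMS)
        bwd, st2, _ = cv2.calcOpticalFlowPyrLK(next_gray, prev_gray, fwd, None, **LK_PARAMS)
        fb_err = np.linalg.norm(pts - bwd, axis=2).ravel()  # forward-backward error
        keep = (st1.ravel() == 1) & (st2.ravel() == 1) & (fb_err <= np.median(fb_err))
        return pts[keep], fwd[keep]

The median displacement of the surviving point pairs then moves the bounding box, which is what makes this style of tracker robust to failures of individual points.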
The objective function for the hyperplane parameter h adopted by the online SVM classifier is:
h = argmin_h { (λ/2)·||h||² + (1/N)·Σ_{i=1..N} l(h; (v_i, c_i)) }
where (v_i, c_i), i = 1, 2, …, N, form the training set, N being the total number of samples; v_i is the feature vector generated from the i-th sample; c_i ∈ {+1, −1} is the class label, with +1 denoting a positive sample and −1 a negative sample; l(h; (v_i, c_i)) = max{0, 1 − c_i·⟨h, v_i⟩}, where ⟨h, v_i⟩ is the inner product of h and v_i; λ is a regularization parameter, λ > 0;
the hyperplane parameter h is updated as follows: given a training sample v_t, compute the margin m_t = h_{t−1}·v_t and receive the sample label c_t; if m_t ≠ c_t, let l(h; (v_t, c_t)) = 0, and when m_t·c_t < 1, update h:
h_t = h_{t−1} + [max{0, 1 − c_t·m_t} / (v_t^T·Σ_{t−1}·v_t + r)]·Σ_{t−1}·c_t·v_t
when h_t ≠ h_{t−1}, the confidence Σ is updated:
Σ_t = Σ_{t−1} − (Σ_{t−1}·v_t·v_t^T·Σ_{t−1}) / (v_t^T·Σ_{t−1}·v_t + r)
where the subscript t denotes the iteration index and r is a hyperparameter controlling the update rate of h.
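Since the patent renders these update equations as images, the numpy sketch below implements the standard AROW-style confidence-weighted update, which matches the quantities defined above (margin m_t, hinge loss, confidence Σ, update-rate hyperparameter r); the exact closed form should therefore be read as an assumption:

    import numpy as np

    def online_svm_update(h, Sigma, v_t, c_t, r=1.0):
        """One confidence-weighted online update of the hyperplane h.
        v_t: feature vector; c_t: label in {+1, -1}; r: update-rate hyperparameter.
        The coefficient form follows the standard AROW update (an assumption here)."""
        m_t = h @ v_t                      # margin m_t = h_{t-1} . v_t
        if c_t * m_t < 1.0:                # update only when the margin is violated
            loss = 1.0 - c_t * m_t         # hinge loss l(h; (v_t, c_t))
            beta = 1.0 / (v_t @ Sigma @ v_t + r)
            h = h + loss * beta * c_t * (Sigma @ v_t)                   # h update
            Sigma = Sigma - beta * np.outer(Sigma @ v_t, Sigma @ v_t)   # confidence update
        return h, Sigma

    # Example: start from h = 0 and Sigma = I, feed one positive sample
    h, Sigma = np.zeros(3), np.eye(3)
    h, Sigma = online_svm_update(h, Sigma, np.array([1.0, 0.5, -0.2]), +1)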
When the target is lost and the visual tracking method fails, or the target confidence falls below the set threshold, the ultrasonic positioning method is started, using trilateration, as shown in FIG. 8; the positioning equations are:
(x1 − x0)² + (y1 − y0)² = d1²
(x2 − x0)² + (y2 − y0)² = d2²
(x3 − x0)² + (y3 − y0)² = d3²
In the above formulas, (x1, y1), (x2, y2), (x3, y3) are the coordinates of the 3 separate ultrasonic receiving modules, (x0, y0) is the coordinate of the tracked target, and d1, d2, d3 are the distances from (x1, y1), (x2, y2), (x3, y3) to (x0, y0), calculated from the ultrasonic signals received by the 3 separate ultrasonic receiving modules using the ultrasonic ranging (time-of-flight) principle.
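Subtracting the first equation from the other two cancels the quadratic terms in (x0, y0) and leaves a 2 x 2 linear system, so the position can be solved directly. A minimal numpy sketch follows; the receiver layout, times of flight and speed of sound are example values, not the patent's:

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C (example value)

    def trilaterate(receivers, distances):
        """Solve (xi - x0)^2 + (yi - y0)^2 = di^2 for i = 1..3 by subtracting
        equation 1 from equations 2 and 3, which yields a linear 2x2 system."""
        (x1, y1), (x2, y2), (x3, y3) = receivers
        d1, d2, d3 = distances
        A = 2.0 * np.array([[x2 - x1, y2 - y1],
                            [x3 - x1, y3 - y1]])
        b = np.array([d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2,
                      d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2])
        return np.linalg.solve(A, b)  # (x0, y0)

    # Distances follow from ultrasonic time of flight: d_i = SPEED_OF_SOUND * t_i
    receivers = [(0.0, 0.0), (0.2, 0.0), (0.1, 0.15)]  # example triangular layout (m)
    tof = [0.0060, 0.0058, 0.0057]                     # example one-way times (s)
    print(trilaterate(receivers, [SPEED_OF_SOUND * t for t in tof]))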
The infrared obstacle-avoidance method is as follows: when the 2 infrared modules at the middle of the front end of the outside of the robot housing both output a low level, indicating an obstacle in front of the robot, the control module preferentially avoids the obstacle through an interrupt function, controlling the drive motors so that the robot first backs up a preset distance and then turns to avoid the obstacle; if one of the infrared modules on the left and right sides of the robot housing outputs a high level and the other outputs a low level, the robot first backs up a preset distance under control of the drive motors and then turns toward the side outputting the high level.
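This level logic maps directly onto a small decision function; in the sketch below (Python), a low level means an obstacle at that module, and the preset back-up distance is an assumed example default:

    def ir_avoidance(front1_high, front2_high, left_high, right_high, back_up_m=0.3):
        """Map the four infrared module levels (True = high) to an avoidance maneuver.
        Returns None when no avoidance is needed; back_up_m is an example preset."""
        if not front1_high and not front2_high:
            # both front modules low: obstacle ahead; back up, then turn away
            return ("back_up", back_up_m, "turn")
        if left_high != right_high:
            # one side high, the other low: back up, then steer toward the high side
            side = "left" if left_high else "right"
            return ("back_up", back_up_m, "turn_" + side)
        return None

    print(ir_avoidance(False, False, True, True))  # obstacle straight ahead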
The above embodiment only illustrates the technical idea of the invention and does not limit the scope of protection; any modification made on the basis of the technical scheme according to the technical idea of the invention falls within the scope of protection of the invention.

Claims (5)

1. A control method of a vision-tracking mobile robot based on ultrasonic-assisted positioning, the robot comprising a robot housing, a chassis, a servo pan-tilt, a vision module, separate ultrasonic receiving modules, a control module, a driver, drive motors, drive wheels and universal wheels; the control module is electrically connected with the servo pan-tilt, the vision module, the separate ultrasonic receiving modules and the driver; the servo pan-tilt is mounted at the top of the outside of the robot housing, and the vision module is mounted on the servo pan-tilt; at least 3 separate ultrasonic receiving modules are provided, arranged in a triangle at the front of the robot housing; the drive wheels and the universal wheels are mounted at the bottom of the chassis, each drive wheel rotates with its drive motor, and the drive motors are driven by the driver; the control module receives the signals collected by the vision module and generates drive commands to realize target tracking; the separate ultrasonic receiving modules collect the ultrasonic signals emitted by a separate ultrasonic transmitting module carried by the tracked target, and the control module positions the robot according to these signals; the robot further comprises at least 4 infrared modules, of which 2 are arranged at the middle of the front end of the robot housing and the other 2 are mounted on the two sides of the robot housing, the 4 infrared modules each being electrically connected with the control module;
the control method is characterized by comprising a visual tracking method, an ultrasonic positioning method and an infrared obstacle avoidance method; the visual tracking method adopts an LRT algorithm, the structure of the visual tracking method is the combination of a tracking module, a learning module and a detection module, wherein the detection module and the tracking module are subjected to parallel calculation and are not interfered with each other, the operation results of the detection module and the tracking module are sent to the learning module for learning and correction, and the learned model reacts on the tracking module and the detection module to update the model characteristics in real time, so that the model can be continuously tracked even if the appearance of a target is changed; the tracking module adopts a Lucas-Kanade optical Flow method to track the target and integrates an image pyramid through a Median-Flow Median optical Flow method to ensure the sample tracking precision; the learning module adopts a PN learning mechanism, and iteratively trains a classifier according to positive and negative samples generated by the tracking module and the detection module, so as to improve the precision of the detection module; the detection module adopts an online SVM (support vector machine) classifier;
the objective function for the hyperplane parameter h adopted by the online SVM classifier is:
h = argmin_h { (λ/2)·||h||² + (1/N)·Σ_{i=1..N} l(h; (v_i, c_i)) }
where (v_i, c_i), i = 1, 2, …, N, form the training set, N being the total number of samples; v_i is the feature vector generated from the i-th sample; c_i ∈ {+1, −1} is the class label, with +1 denoting a positive sample and −1 a negative sample; l(h; (v_i, c_i)) = max{0, 1 − c_i·⟨h, v_i⟩}, where ⟨h, v_i⟩ is the inner product of h and v_i; λ is a regularization parameter, λ > 0;
the hyperplane parameter h is updated as follows: given a training sample v_t, compute the margin m_t = h_{t−1}·v_t and receive the sample label c_t; if m_t ≠ c_t, let l(h; (v_t, c_t)) = 0, and when m_t·c_t < 1, update h:
h_t = h_{t−1} + [max{0, 1 − c_t·m_t} / (v_t^T·Σ_{t−1}·v_t + r)]·Σ_{t−1}·c_t·v_t
when h_t ≠ h_{t−1}, the confidence Σ is updated:
Σ_t = Σ_{t−1} − (Σ_{t−1}·v_t·v_t^T·Σ_{t−1}) / (v_t^T·Σ_{t−1}·v_t + r)
where the subscript t denotes the iteration index and r is a hyperparameter controlling the update rate of h.
2. The control method of the vision-tracking mobile robot based on ultrasonic-assisted positioning according to claim 1, characterized in that: the vision-tracking mobile robot further comprises a wireless communication module electrically connected with the control module; the control module communicates wirelessly with the user's intelligent terminal through the wireless communication module and receives the robot control commands issued by the intelligent terminal.
3. The control method of the vision-tracking mobile robot based on ultrasonic-assisted positioning according to claim 1, characterized in that: 2 drive motors are provided, each corresponding to 1 drive wheel; the 2 drive wheels are arranged on the left and right sides of the chassis, and the robot steers by differential speed; 2 universal wheels are provided, arranged at the front and rear of the chassis respectively.
4. The control method of the vision-tracking mobile robot based on ultrasonic-assisted positioning according to claim 1, characterized in that if the target is lost and the visual tracking method fails, or the target confidence falls below a set threshold, the ultrasonic positioning method is started, using trilateration, with the positioning equations:
(x1 − x0)² + (y1 − y0)² = d1²
(x2 − x0)² + (y2 − y0)² = d2²
(x3 − x0)² + (y3 − y0)² = d3²
In the above formulas, (x1, y1), (x2, y2), (x3, y3) are the coordinates of the 3 separate ultrasonic receiving modules, (x0, y0) is the coordinate of the tracked target, and d1, d2, d3 are the distances from (x1, y1), (x2, y2), (x3, y3) to (x0, y0), calculated from the ultrasonic signals received by the 3 separate ultrasonic receiving modules using the ultrasonic ranging (time-of-flight) principle.
5. The control method of the vision-tracking mobile robot based on ultrasonic-assisted positioning according to claim 1, characterized in that the infrared obstacle-avoidance method is: when the 2 infrared modules at the middle of the front end of the outside of the robot housing both output a low level, indicating an obstacle in front of the robot, the control module preferentially avoids the obstacle through an interrupt function, controlling the drive motors so that the robot first backs up a preset distance and then turns to avoid the obstacle; if one of the infrared modules on the left and right sides of the robot housing outputs a high level and the other outputs a low level, the robot first backs up a preset distance under control of the drive motors and then turns toward the side outputting the high level.
CN201810650899.XA 2018-06-22 2018-06-22 Visual tracking mobile robot based on ultrasonic auxiliary positioning and control method Active CN108931979B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810650899.XA CN108931979B (en) 2018-06-22 2018-06-22 Visual tracking mobile robot based on ultrasonic auxiliary positioning and control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810650899.XA CN108931979B (en) 2018-06-22 2018-06-22 Visual tracking mobile robot based on ultrasonic auxiliary positioning and control method

Publications (2)

Publication Number Publication Date
CN108931979A CN108931979A (en) 2018-12-04
CN108931979B true CN108931979B (en) 2020-12-15

Family

ID=64446194

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810650899.XA Active CN108931979B (en) 2018-06-22 2018-06-22 Visual tracking mobile robot based on ultrasonic auxiliary positioning and control method

Country Status (1)

Country Link
CN (1) CN108931979B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109318243B (en) * 2018-12-11 2023-07-07 珠海一微半导体股份有限公司 Sound source tracking system and method of vision robot and cleaning robot
CN109828580B (en) * 2019-02-27 2022-05-24 华南理工大学 Mobile robot formation tracking control method based on separated ultrasonic waves
CN117255313B (en) * 2023-11-17 2024-02-06 双擎科技(杭州)有限公司 Service robot control method and system based on cloud platform

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2005076661A1 (en) * 2004-02-10 2008-01-10 三菱電機エンジニアリング株式会社 Super directional speaker mounted mobile body
CN103317514B (en) * 2013-06-20 2016-06-01 中国矿业大学 A kind of mining environment exploring robot Controlling System
CN103885449B (en) * 2014-04-04 2016-03-23 辽宁工程技术大学 Intelligent vision based on multi-sensor cooperation process follows the tracks of the control method of wheeled robot
CN104950887B (en) * 2015-06-19 2017-07-21 重庆大学 Conveying arrangement based on robotic vision system and independent tracking system
KR101849344B1 (en) * 2016-01-26 2018-04-16 (주)유프랜드 worker following automatic guided vehicle
CN205384517U (en) * 2016-03-09 2016-07-13 田君泽 It follows dolly to carry thing
CN106094875B (en) * 2016-06-27 2019-01-22 南京邮电大学 A kind of target follow-up control method of mobile robot
CN106683123B (en) * 2016-10-31 2019-04-02 纳恩博(北京)科技有限公司 A kind of method for tracking target and target tracker
CN106774325A (en) * 2016-12-23 2017-05-31 湖南晖龙股份有限公司 Robot is followed based on ultrasonic wave, bluetooth and vision
CN106774436B (en) * 2017-02-27 2023-04-25 南京航空航天大学 Control system and method for stably tracking target of rotor unmanned aerial vehicle based on vision
CN107909603A (en) * 2017-12-01 2018-04-13 浙江工业大学 It is a kind of towards following robotic vision tracking

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Target Recognition Study Using SVM, ANNs and Expert Knowledge; Guangzhi Shi et al.; Proceedings of the IEEE International Conference on Automation and Logistics; 2008-12-31; pp. 1507-1511 *

Also Published As

Publication number Publication date
CN108931979A (en) 2018-12-04

Similar Documents

Publication Publication Date Title
CN108931979B (en) Visual tracking mobile robot based on ultrasonic auxiliary positioning and control method
US11161241B2 (en) Apparatus and methods for online training of robots
US10293483B2 (en) Apparatus and methods for training path navigation by robots
US9821457B1 (en) Adaptive robotic interface apparatus and methods
US10717191B2 (en) Apparatus and methods for haptic training of robots
CN108381554B (en) Visual tracking mobile robot based on WIFI auxiliary positioning and control method
US20150032258A1 (en) Apparatus and methods for controlling of robotic devices
Fang et al. Adaptive active visual servoing of nonholonomic mobile robots
WO2018028361A1 (en) Charging method, apparatus, and device for robot
US20170348858A1 (en) Multiaxial motion control device and method, in particular control device and method for a robot arm
CN109590986B (en) Robot teaching method, intelligent robot and storage medium
JP2022542241A (en) Systems and methods for augmenting visual output from robotic devices
CN103116279B (en) Vague discrete event shared control method of brain-controlled robotic system
CN110271016B (en) Mechanical arm calligraphy writing system and method based on boundary and force feedback
CN113183133B (en) Gesture interaction method, system, device and medium for multi-degree-of-freedom robot
CN109352654A (en) A kind of intelligent robot system for tracking and method based on ROS
US20190217467A1 (en) Apparatus and methods for operating robotic devices using selective state space training
WO2014201422A2 (en) Apparatus and methods for hierarchical robotic control and robotic training
Lee et al. Fast perception, planning, and execution for a robotic butler: Wheeled humanoid m-hubo
CN212522923U (en) Ball picking robot system
CN108062102A (en) A kind of gesture control has the function of the Mobile Robot Teleoperation System Based of obstacle avoidance aiding
CN109685828B (en) Deep learning tracking acquisition method based on target posture, learning system and storage medium
Jayasurya et al. Gesture controlled AI-robot using Kinect
CN115922731B (en) Control method of robot and robot
CN213877284U (en) Intelligent programming robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant