CN101893894B - Reconfigurable miniature mobile robot cluster locating and tracking system - Google Patents


Info

Publication number
CN101893894B
CN101893894B · CN2010102148791A · CN201010214879A
Authority
CN
China
Prior art keywords
module
submodule
information
links
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2010102148791A
Other languages
Chinese (zh)
Other versions
CN101893894A (en)
Inventor
陈佳品
沈慧
李振波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN2010102148791A priority Critical patent/CN101893894B/en
Publication of CN101893894A publication Critical patent/CN101893894A/en
Application granted
Publication of CN101893894B publication Critical patent/CN101893894B/en

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a reconfigurable miniature mobile robot cluster locating and tracking system, belonging to the technical field of robots. The system comprises an image acquiring and processing module, a mark identification module, a motion tracking and locating module and an external monitoring module. The image acquiring and processing module comprises a camera, an image acquisition card, an image filtering sub-module, a gray-threshold self-adaptive selection sub-module and an image binarization sub-module; the mark identification module comprises a mark matrix comparison sub-module, a mark perimeter comparison sub-module, a mark black-and-white ratio comparison sub-module, a weighted summing sub-module and a labeling sub-module; and the motion tracking and locating module comprises a position estimating sub-module, a global scene windowing and scanning sub-module and a pose judging sub-module. The invention reduces the scanning workload of the locating system, shortens the system cycle, greatly improves the real-time performance of the system, makes the real-time tracking system faster, more accurate and more reliable, and realizes identification, locating and tracking simultaneously.

Description

Reconfigurable miniature mobile robot cluster locating and tracking system
Technical field
The present invention relates to a system in the field of robot technology, and specifically to a reconfigurable miniature mobile robot cluster locating and tracking system.
Background technology
With the continuous development of MEMS and robot technology, the two technologies have been combined, producing micro-robots. In actual daily production, micro-robots have been put into use in precision machining and electronics processing factories, where they play an irreplaceable role. To let miniature mobile robots meet the coordinated-operation requirements of more complex environments, the technique of the reconfigurable mobile micro-robot has been proposed. For the reconfigurable miniature mobile robot technology to develop further, a positioning system must be built that lets the micro-robots work in coordination in actual production operations and that extracts the relevant location information of the micro-robots. The reconfigurable miniature mobile robot cluster locating and tracking system is proposed precisely to solve this problem.
A literature search of the prior art found Chinese patent application No. 200410059618.1, which provides a system that positions a mobile robot by scanning a bar code attached to it. This positioning system, however, requires bar-code labels to be pasted on the robot body and on the experiment platform. Because the bars of a bar-code label are relatively dense, a label pasted on a micro-robot body is difficult to recognize, so this system is unsuitable for use in a micro-robot positioning system.
A further search found Chinese patent application No. 200710168718.1, which provides a free positioning system for robots consisting of a platform together with a sensor subsystem, a data processing subsystem and a voltage transformation module arranged on the platform; this system, however, is only applicable to robots working in underground environments such as coal mines and tunnels. Because the small volume of a miniature mobile robot requires its mechanical structure to be as compact as possible, it is unsuitable for carrying a sensor subsystem, so this system is also unsuitable for reconfigurable miniature mobile robots.
Summary of the invention
The object of the present invention is to overcome the above deficiencies of the prior art and to provide a reconfigurable miniature mobile robot cluster locating and tracking system. The present invention recognizes the robot marks through image processing and, using a speed-based prediction method, achieves high-precision locating and tracking of a miniature mobile robot cluster.
The present invention is realized through the following technical scheme:
The present invention comprises: an image acquiring and processing module, a mark identification module, a motion tracking and locating module and an external monitoring module, wherein: the image acquiring and processing module is connected to the mark identification module and to the motion tracking and locating module and transmits black-and-white binary image information to each; the mark identification module is connected to the motion tracking and locating module and transmits robot label information and robot initial scanning area information; the motion tracking and locating module is connected to the external monitoring module and transmits robot velocity information and robot position and pose information; and the mark identification module is connected to the external monitoring module and transmits robot label information.
Said image acquiring and processing module comprises: a camera, an image acquisition card, an image filtering sub-module, a gray-threshold self-adaptive selection sub-module and an image binarization sub-module, wherein: the camera is connected to the image acquisition card and transmits the acquired image information; the image acquisition card is connected to the image filtering sub-module and transmits robot global scene image information; the image filtering sub-module is connected to the gray-threshold self-adaptive selection sub-module and transmits the filtered image information; the gray-threshold self-adaptive selection sub-module is connected to the image binarization sub-module and transmits the optimal gray-threshold information; and the image binarization sub-module is connected to the mark identification module and to the motion tracking and locating module and transmits black-and-white binary image information to each.
Said mark identification module comprises: a mark matrix comparison sub-module, a mark perimeter comparison sub-module, a mark black-and-white ratio comparison sub-module, a weighted summing sub-module and a labeling sub-module, wherein: the input of the mark matrix comparison sub-module is connected to the image acquiring and processing module and receives black-and-white binary image information; the output of the mark matrix comparison sub-module is connected to the weighted summing sub-module and transmits the moment-value comparison information between the binary image and the template mark; the input of the mark perimeter comparison sub-module is connected to the image acquiring and processing module and receives black-and-white binary image information; the output of the mark perimeter comparison sub-module is connected to the weighted summing sub-module and transmits the perimeter difference information between the binary image and the template; the input of the mark black-and-white ratio comparison sub-module is connected to the image acquiring and processing module and receives black-and-white binary image information; the output of the mark black-and-white ratio comparison sub-module is connected to the weighted summing sub-module and transmits the difference information of the black-and-white pixel ratio between the binary image and the template; the weighted summing sub-module is connected to the motion tracking and locating module and transmits robot initial scanning area information; the weighted summing sub-module is connected to the labeling sub-module and transmits robot scanning area information; the labeling sub-module is connected to the motion tracking and locating module and transmits robot label information; and the labeling sub-module is connected to the external monitoring module and transmits robot label information.
Said motion tracking and locating module comprises: a position estimating sub-module, a global scene windowing and scanning sub-module and a pose judging sub-module, wherein: the position estimating sub-module is connected to the image acquiring and processing module and receives black-and-white binary image information; the position estimating sub-module is connected to the mark identification module and receives robot label information and robot initial scanning area information; the position estimating sub-module is connected to the external monitoring module and transmits robot velocity information; the position estimating sub-module is connected to the global scene windowing and scanning sub-module and transmits the estimated robot position information; the global scene windowing and scanning sub-module is connected to the pose judging sub-module and transmits the pixel information of the estimated robot position region; the global scene windowing and scanning sub-module is connected to the external monitoring module and transmits the robot center position information of the estimated position region; and the pose judging sub-module is connected to the external monitoring module and transmits the current robot angle information.
Said global scene windowing and scanning sub-module comprises: a windowing scanning unit, a black-and-white pixel ratio counting unit, a judging unit, a center position calculating unit and an output unit, wherein: the windowing scanning unit is connected to the position estimating sub-module and receives the estimated robot position information; the windowing scanning unit is connected to the black-and-white pixel ratio counting unit and transmits black-and-white pixel information; the black-and-white pixel ratio counting unit is connected to the judging unit and transmits black-and-white pixel ratio information; the judging unit is connected to the center position calculating unit and transmits the information that a robot appears in the estimated region; the center position calculating unit is connected to the external monitoring module and transmits the robot center position information of the estimated position region; the center position calculating unit is connected to the pose judging sub-module and transmits the pixel information of the estimated robot position region; the judging unit is connected to the output unit and transmits the information that no robot appears in the estimated region; and the output unit is connected to the external monitoring module and transmits the information that no robot appears in the estimated region.
Said pose judging sub-module comprises: a Hough transform unit and an angle extraction unit, wherein: the Hough transform unit is connected to the global scene windowing and scanning sub-module and receives the pixel information of the estimated robot position region; the Hough transform unit is connected to the angle extraction unit and transmits the Hough-transformed pixel information; and the angle extraction unit is connected to the external monitoring module and transmits the current robot angle information.
Said angle information is the angle between the robot and a set coordinate axis.
Compared with the prior art, the beneficial effects of the present invention are: combining the characteristics of reconfigurable miniature mobile robots, the present invention adopts a design based on image acquisition and processing to integrate identification, locating and tracking into one reconfigurable miniature mobile robot cluster locating and tracking system; it predicts, from robot speed information, the region where a robot is about to appear, thereby reducing the scanning workload of the positioning system, shortening the system cycle and significantly improving the real-time performance of the system, making the real-time tracking system faster, more accurate and more reliable. The invention is also applicable to other, similar marker-detection tasks in experimental scenes.
Description of drawings
Fig. 1 is a schematic diagram of the composition and connections of the system of the present invention.
Fig. 2 is a schematic diagram of the composition and connections of the image acquiring and processing module.
Fig. 3 is a schematic diagram of the composition and connections of the mark identification module.
Fig. 4 is a schematic diagram of the composition and connections of the motion tracking and locating module.
Embodiment
The system of the present invention is further described below in conjunction with the accompanying drawings. The embodiment is implemented on the premise of the technical scheme of the present invention and gives a detailed implementation and a concrete operating process, but the protection scope of the present invention is not limited to the following embodiment.
Embodiment
As shown in Figure 1, the present embodiment comprises: an image acquiring and processing module, a mark identification module, a motion tracking and locating module and an external monitoring module, wherein: the image acquiring and processing module is connected to the mark identification module and to the motion tracking and locating module and transmits black-and-white binary image information to each; the mark identification module is connected to the motion tracking and locating module and transmits robot label information and robot initial scanning area information; the motion tracking and locating module is connected to the external monitoring module and transmits robot velocity information and robot position and pose information; and the mark identification module is connected to the external monitoring module and transmits robot label information.
As shown in Figure 2, said image acquiring and processing module comprises: a camera, an image acquisition card, an image filtering sub-module, a gray-threshold self-adaptive selection sub-module and an image binarization sub-module, wherein: the camera is connected to the image acquisition card and transmits the acquired image information; the image acquisition card is connected to the image filtering sub-module and transmits robot global scene image information; the image filtering sub-module is connected to the gray-threshold self-adaptive selection sub-module and transmits the filtered image information; the gray-threshold self-adaptive selection sub-module is connected to the image binarization sub-module and transmits the optimal gray-threshold information; and the image binarization sub-module is connected to the mark identification module and to the motion tracking and locating module and transmits black-and-white binary image information to each.
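The threshold-selection and binarization stages of the module above can be sketched as follows. The patent specifies only that a weighted gray-level histogram is used and that pixels above the threshold become 255 and the rest 0; the particular weighting below (maximizing between-class separation, as in Otsu's method) is an assumption chosen for illustration, not the patent's exact scheme.

```python
import numpy as np

def select_gray_threshold(gray):
    """Pick a global threshold from the gray-level histogram.

    The patent only says a weighted histogram is used; Otsu's
    between-class-variance criterion is assumed here as one
    concrete weighting scheme.
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

def binarize(gray, threshold):
    """Pixels above the threshold -> 255, all others -> 0."""
    return np.where(gray > threshold, 255, 0).astype(np.uint8)

# Example: a synthetic scene with a dark mark on a bright background.
img = np.full((40, 40), 200, dtype=np.uint8)
img[10:30, 10:30] = 30
t = select_gray_threshold(img)
bw = binarize(img, t)
```

The resulting black-and-white binary image is what the module passes on to the mark identification and motion tracking modules.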
Said camera is a CCD camera.
Said image filtering sub-module performs smoothing and filtering with a Gaussian template. In the present embodiment, the filtering effect is achieved by convolving the Gaussian matrix with the pixels of the image input to this sub-module.
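A minimal sketch of the Gaussian-template smoothing described above, convolving a small Gaussian matrix with the input image pixels. The 3×3 template and σ = 1.0 below are assumed example values, since the patent does not fix the template size.

```python
import numpy as np

def gaussian_kernel(size=3, sigma=1.0):
    """Build a normalized size x size Gaussian template."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()

def gaussian_smooth(img, size=3, sigma=1.0):
    """Filter by convolving the Gaussian matrix with the image pixels."""
    k = gaussian_kernel(size, sigma)
    pad = size // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = (padded[i:i + size, j:j + size] * k).sum()
    return out

flat = np.full((8, 8), 100.0)
smoothed = gaussian_smooth(flat)  # a constant image is left unchanged
```

Because the kernel is normalized to sum to one, smoothing preserves overall brightness while suppressing pixel-level noise before thresholding.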
As shown in Figure 3, said mark identification module comprises: a mark matrix comparison sub-module, a mark perimeter comparison sub-module, a mark black-and-white ratio comparison sub-module, a weighted summing sub-module and a labeling sub-module, wherein: the input of the mark matrix comparison sub-module is connected to the image acquiring and processing module and receives black-and-white binary image information; the output of the mark matrix comparison sub-module is connected to the weighted summing sub-module and transmits the moment-value comparison information between the binary image and the template mark; the input of the mark perimeter comparison sub-module is connected to the image acquiring and processing module and receives black-and-white binary image information; the output of the mark perimeter comparison sub-module is connected to the weighted summing sub-module and transmits the perimeter difference information between the binary image and the template; the input of the mark black-and-white ratio comparison sub-module is connected to the image acquiring and processing module and receives black-and-white binary image information; the output of the mark black-and-white ratio comparison sub-module is connected to the weighted summing sub-module and transmits the difference information of the black-and-white pixel ratio between the binary image and the template; the weighted summing sub-module is connected to the motion tracking and locating module and transmits robot initial scanning area information; the weighted summing sub-module is connected to the labeling sub-module and transmits robot scanning area information; the labeling sub-module is connected to the motion tracking and locating module and transmits robot label information; and the labeling sub-module is connected to the external monitoring module and transmits robot label information.
As shown in Figure 4, said motion tracking and locating module comprises: a position estimating sub-module, a global scene windowing and scanning sub-module and a pose judging sub-module, wherein: the position estimating sub-module is connected to the image acquiring and processing module and receives black-and-white binary image information; the position estimating sub-module is connected to the mark identification module and receives robot label information and robot initial scanning area information; the position estimating sub-module is connected to the external monitoring module and transmits robot velocity information; the position estimating sub-module is connected to the global scene windowing and scanning sub-module and transmits the estimated robot position information; the global scene windowing and scanning sub-module is connected to the pose judging sub-module and transmits the pixel information of the estimated robot position region; the global scene windowing and scanning sub-module is connected to the external monitoring module and transmits either the robot center position information of the estimated position region or the information that no robot appears in the estimated region; and the pose judging sub-module is connected to the external monitoring module and transmits the current robot angle information.
Said global scene windowing and scanning sub-module comprises: a windowing scanning unit, a black-and-white pixel ratio counting unit, a judging unit, a center position calculating unit and an output unit, wherein: the windowing scanning unit is connected to the position estimating sub-module and receives the estimated robot position information; the windowing scanning unit is connected to the black-and-white pixel ratio counting unit and transmits black-and-white pixel information; the black-and-white pixel ratio counting unit is connected to the judging unit and transmits black-and-white pixel ratio information; the judging unit is connected to the center position calculating unit and transmits the information that a robot appears in the estimated region; the center position calculating unit is connected to the external monitoring module and transmits the robot center position information of the estimated position region; the center position calculating unit is connected to the pose judging sub-module and transmits the pixel information of the estimated robot position region; the judging unit is connected to the output unit and transmits the information that no robot appears in the estimated region; and the output unit is connected to the external monitoring module and transmits the information that no robot appears in the estimated region.
Said pose judging sub-module comprises: a Hough transform unit and an angle extraction unit, wherein: the Hough transform unit is connected to the global scene windowing and scanning sub-module and receives the pixel information of the estimated robot position region; the Hough transform unit is connected to the angle extraction unit and transmits the Hough-transformed pixel information; and the angle extraction unit is connected to the external monitoring module and transmits the current robot angle information.
Said angle information is the angle between the robot and a set coordinate axis.
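The Hough-based pose judgement above can be sketched as follows: the binary pixels of the windowed region are voted into an (angle, distance) accumulator, and the angle of the strongest line is extracted as the robot's current angle relative to the set axis. This is a generic line-Hough sketch rather than the patent's exact implementation; the 1° angular resolution is an assumption.

```python
import numpy as np

def hough_angle(points, img_diag, n_theta=180):
    """Return the dominant line angle (in degrees) among foreground pixels.

    points: (N, 2) array of (row, col) coordinates of binary-image pixels.
    A standard rho = x*cos(theta) + y*sin(theta) accumulator is used;
    img_diag bounds |rho| so negative distances index correctly.
    """
    thetas = np.deg2rad(np.arange(n_theta))          # 0..179 degrees
    n_rho = 2 * img_diag + 1
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for y, x in points:
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[np.arange(n_theta), rhos + img_diag] += 1  # one vote per theta
    t_idx, _ = np.unravel_index(acc.argmax(), acc.shape)
    return t_idx  # degrees: angle of the strongest line's normal

# Example: pixels along a horizontal segment (constant row).
pts = np.array([(5, x) for x in range(100)])
angle = hough_angle(pts, img_diag=150)
```

For a horizontal segment the accumulator peaks at θ = 90°, i.e. the line's normal is vertical; the angle extraction unit would report this value to the external monitoring module.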
Said external monitoring module comprises: a speed detection sub-module and a display sub-module, wherein: the speed detection sub-module is connected to the motion tracking and locating module and transmits robot velocity information; the display sub-module is connected to the pose judging sub-module and receives the current robot angle information; the display sub-module is connected to the output unit and receives the information that no robot appears in the estimated region; and the display sub-module is connected to the center position calculating unit and receives the robot center position information of the estimated position region.
Working process of the present embodiment: after the reconfigurable micro-robot cluster enters the system of the present embodiment, the image acquiring and processing module acts first: it captures the global scene image through the CCD camera, sends it through the image acquisition card into the image filtering sub-module, and then feeds the filtered image into the gray-threshold self-adaptive selection sub-module. In the present embodiment the gray-threshold self-adaptive selection sub-module uses weighted selection: a windowing process is carried out using the weighted gray-level histogram, pixels above the gray threshold are set to 255, and the remaining pixels are set to 0. The processed image is then sent into the mark identification module for robot identification. After the mark identification module receives the image information transmitted by the image acquiring and processing module, it begins to act and identifies the robots of the cluster one by one. The mark moment-value comparison sub-module uses geometric moments to carry out statistical computation on the scanning-area image and the template mark, obtains the moment value of each and compares the two. The mark perimeter comparison sub-module computes the perimeter of the black region in the scanning-area image and compares it with the black-region perimeter of the template. The mark black-and-white ratio comparison sub-module computes the black-and-white pixel ratio from statistics of the scanning-area image pixels and compares it with the black-and-white pixel ratio of the template. The mark moment-value comparison sub-module, the mark perimeter comparison sub-module and the mark black-and-white ratio comparison sub-module are all connected to the weighted summing sub-module and send their respective comparison results into it; the weighted summing sub-module assigns a weight to each of the three results and so obtains the similarity between the scanning-area image and the template, thereby identifying the robot; it then labels the robot and sends the robot initial region and robot label information to the motion tracking and locating module, which waits for tracking and locating to begin. After the motion tracking and locating module receives the robot sequence number, it initializes the robot initial scanning area and enables the position estimating sub-module. In the position estimating sub-module, the conversion ratio between pixels and physical size is computed from the intended robot size and the number of image pixels it actually occupies; using this ratio, the robot's speed is converted into pixels per second, so that the region where the robot is about to appear is determined, and the position information of the estimated robot position region is sent to the global scene windowing and scanning sub-module. The global scene windowing and scanning sub-module judges whether a robot is present at the predicted position and gives the corresponding return value; the center position of a successfully predicted robot is output to the external monitoring module of the system, and the windowed scene information is fed into the pose judging sub-module. The pose judging sub-module computes the current robot pose through the Hough transform and outputs it to the external monitoring module.
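The three comparisons and the weighted summing in the working process above can be sketched as follows. The concrete moment statistic, perimeter and ratio measures and the 0.5/0.3/0.2 weights are illustrative assumptions; the patent fixes only that the three comparison results are each weighted and summed into a similarity measure between the scanning area and the template.

```python
import numpy as np

def moment_value(bw):
    """A scalar geometric-moment statistic of the black (0) region."""
    ys, xs = np.nonzero(bw == 0)
    if ys.size == 0:
        return 0.0
    # mean second-order central moment as one shape descriptor
    return float(((ys - ys.mean())**2 + (xs - xs.mean())**2).mean())

def black_perimeter(bw):
    """Count black pixels touching at least one white 4-neighbour."""
    black = (bw == 0)
    padded = np.pad(black, 1, constant_values=False)
    core = padded[1:-1, 1:-1]
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    return int((core & ~interior).sum())

def bw_ratio(bw):
    """Ratio of black pixels to white pixels."""
    black = (bw == 0).sum()
    return black / max(bw.size - black, 1)

def similarity(scan, template, weights=(0.5, 0.3, 0.2)):
    """Weighted sum of the three comparison results (smaller = closer)."""
    diffs = (abs(moment_value(scan) - moment_value(template)),
             abs(black_perimeter(scan) - black_perimeter(template)),
             abs(bw_ratio(scan) - bw_ratio(template)))
    return sum(w * d for w, d in zip(weights, diffs))

# A scan window identical to the template mark scores a perfect 0.
template = np.full((10, 10), 255, dtype=np.uint8)
template[3:7, 3:7] = 0
score = similarity(template.copy(), template)
```

A labeling step would then assign the robot number whose template yields the lowest score, as the labeling sub-module does before handing the initial region to the motion tracking and locating module.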
In the present embodiment, the modules cooperate with one another and mutually transmit information and data to complete the locating and tracking of the reconfigurable miniature mobile robots, and output the relevant pose information, position coordinates and the like to the external monitoring module through the respective modules.
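The predict-then-scan loop at the heart of the motion tracking and locating module can be sketched as follows. The patent describes converting the robot's speed into pixels per second via the ratio between the robot's known physical size and the pixels it occupies, then window-scanning only the predicted region; the concrete sizes, frame interval, window half-width and black-pixel decision threshold below are illustrative assumptions.

```python
import numpy as np

def pixels_per_metre(robot_size_m, robot_size_px):
    """Conversion ratio between physical size and image pixels."""
    return robot_size_px / robot_size_m

def predict_window(pos_px, vel_mps, dt_s, scale_px_per_m, half=10):
    """Predict where the robot will appear and return a scan window."""
    dx = vel_mps[0] * dt_s * scale_px_per_m   # columns
    dy = vel_mps[1] * dt_s * scale_px_per_m   # rows
    cy, cx = pos_px[0] + dy, pos_px[1] + dx
    return (int(cy - half), int(cy + half), int(cx - half), int(cx + half))

def scan_window(bw, window, black_ratio_threshold=0.05):
    """Judge whether a robot mark appears in the window; if so, return
    its pixel centre, else None (the 'no robot appears' return value)."""
    y0, y1, x0, x1 = window
    region = bw[max(y0, 0):y1, max(x0, 0):x1]
    ys, xs = np.nonzero(region == 0)
    if ys.size < black_ratio_threshold * region.size:
        return None
    return (max(y0, 0) + ys.mean(), max(x0, 0) + xs.mean())

# Robot of 0.02 m spanning 20 px -> 1000 px per metre (assumed numbers).
scale = pixels_per_metre(0.02, 20)
win = predict_window(pos_px=(50, 50), vel_mps=(0.01, 0.0),
                     dt_s=0.1, scale_px_per_m=scale)
scene = np.full((120, 120), 255, dtype=np.uint8)
scene[48:55, 49:54] = 0          # mark near the predicted spot
centre = scan_window(scene, win)
```

Scanning only this small predicted window, instead of the whole scene, is what reduces the scanning workload and shortens the system cycle as claimed in the beneficial effects.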

Claims (6)

1. A reconfigurable miniature mobile robot cluster locating and tracking system, characterized in that it comprises: an image acquiring and processing module, a mark identification module, a motion tracking and locating module and an external monitoring module, wherein: the image acquiring and processing module is connected to the mark identification module and to the motion tracking and locating module and transmits black-and-white binary image information to each; the mark identification module is connected to the motion tracking and locating module and transmits robot label information and robot initial scanning area information; the motion tracking and locating module is connected to the external monitoring module and transmits robot velocity information and robot position and pose information; and the mark identification module is connected to the external monitoring module and transmits robot label information.
2. The reconfigurable miniature mobile robot cluster locating and tracking system according to claim 1, characterized in that said image acquiring and processing module comprises: a camera, an image acquisition card, an image filtering sub-module, a gray-threshold self-adaptive selection sub-module and an image binarization sub-module, wherein: the camera is connected to the image acquisition card and transmits the acquired image information; the image acquisition card is connected to the image filtering sub-module and transmits robot global scene image information; the image filtering sub-module is connected to the gray-threshold self-adaptive selection sub-module and transmits the filtered image information; the gray-threshold self-adaptive selection sub-module is connected to the image binarization sub-module and transmits the optimal gray-threshold information; and the image binarization sub-module is connected to the mark identification module and to the motion tracking and locating module and transmits black-and-white binary image information to each.
3. The reconfigurable miniature mobile robot cluster locating and tracking system according to claim 1, characterized in that said mark identification module comprises: a mark matrix comparison sub-module, a mark perimeter comparison sub-module, a mark black-and-white ratio comparison sub-module, a weighted summing sub-module and a labeling sub-module, wherein: the input of the mark matrix comparison sub-module is connected to the image acquiring and processing module and receives black-and-white binary image information; the output of the mark matrix comparison sub-module is connected to the weighted summing sub-module and transmits the moment-value comparison information between the binary image and the template mark; the input of the mark perimeter comparison sub-module is connected to the image acquiring and processing module and receives black-and-white binary image information; the output of the mark perimeter comparison sub-module is connected to the weighted summing sub-module and transmits the perimeter difference information between the binary image and the template; the input of the mark black-and-white ratio comparison sub-module is connected to the image acquiring and processing module and receives black-and-white binary image information; the output of the mark black-and-white ratio comparison sub-module is connected to the weighted summing sub-module and transmits the difference information of the black-and-white pixel ratio between the binary image and the template; the weighted summing sub-module is connected to the motion tracking and locating module and transmits robot initial scanning area information; the weighted summing sub-module is connected to the labeling sub-module and transmits robot scanning area information; the labeling sub-module is connected to the motion tracking and locating module and transmits robot label information; and the labeling sub-module is connected to the external monitoring module and transmits robot label information.
4. The reconfigurable miniature mobile robot cluster locating and tracking system according to claim 1, characterized in that said motion tracking and locating module comprises: a position estimating sub-module, a global scene windowing and scanning sub-module and a pose judging sub-module, wherein: the position estimating sub-module is connected to the image acquiring and processing module and receives black-and-white binary image information; the position estimating sub-module is connected to the mark identification module and receives robot label information and robot initial scanning area information; the position estimating sub-module is connected to the external monitoring module and transmits robot velocity information; the position estimating sub-module is connected to the global scene windowing and scanning sub-module and transmits the estimated robot position information; the global scene windowing and scanning sub-module is connected to the pose judging sub-module and transmits the pixel information of the estimated robot position region; the global scene windowing and scanning sub-module is connected to the external monitoring module and transmits the robot center position information of the estimated position region; and the pose judging sub-module is connected to the external monitoring module and transmits the current robot angle information.
5. The reconfigurable miniature mobile robot cluster locating and tracking system according to claim 4, characterized in that the global scene windowing and scanning sub-module comprises a windowing and scanning unit, a black-and-white pixel ratio statistics unit, a judging unit, a center position calculating unit and an output unit, wherein: the windowing and scanning unit is connected to the position estimating sub-module and transmits the estimated robot position; the windowing and scanning unit is connected to the black-and-white pixel ratio statistics unit and transmits the black-and-white pixel data; the black-and-white pixel ratio statistics unit is connected to the judging unit and transmits the black-and-white pixel ratio; the judging unit is connected to the center position calculating unit and transmits the information that a robot is present in the estimated region; the center position calculating unit is connected to the external monitoring module and transmits the robot's center position within the estimated region, and is connected to the pose judging sub-module and transmits the pixels of the estimated region; and the judging unit is connected to the output unit, and the output unit to the external monitoring module, each transmitting the information that no robot is present in the estimated region.
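The unit chain of claim 5 (scan the window, compute the black-pixel ratio, judge presence, then locate the centre) can be sketched in one function. The acceptance band `[lo, hi]` for the ratio and the centroid-based centre computation are illustrative assumptions:

```python
def scan_window(img, x0, y0, x1, y1, lo=0.1, hi=0.6):
    """Scan one window of a binary image (black = 0, white = 1).

    Decide robot presence from the black-pixel ratio; if a robot is present,
    return its centre (centroid of black pixels), otherwise return None."""
    black = [(x, y) for y in range(y0, y1) for x in range(x0, x1) if img[y][x] == 0]
    area = (x1 - x0) * (y1 - y0)
    ratio = len(black) / area
    if not (lo <= ratio <= hi):
        return None  # judging unit: no robot in this estimated region
    # center position calculating unit: centroid of the black pixels
    cx = sum(x for x, _ in black) / len(black)
    cy = sum(y for _, y in black) / len(black)
    return cx, cy
```

The `None` branch corresponds to the judging unit routing a "no robot present" message through the output unit to the external monitor.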
6. The reconfigurable miniature mobile robot cluster locating and tracking system according to claim 4, characterized in that the pose judging sub-module comprises a Hough transform unit and an angle extraction unit, wherein: the Hough transform unit is connected to the global scene windowing and scanning sub-module and transmits the pixels of the robot's estimated position region; the Hough transform unit is connected to the angle extraction unit and transmits the Hough-transformed pixel data; and the angle extraction unit is connected to the external monitoring module and transmits the robot's current angle.
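The Hough transform plus angle extraction of claim 6 can be sketched as a standard line-parameter accumulator: each point votes for every (θ, ρ) line through it, and the angle extraction step reads off the θ of the best-supported bin. This is a generic Hough sketch, not the patent's implementation; the bin count and the rounding of ρ are assumptions:

```python
import math

def dominant_angle(points, n_theta=180):
    """Return the orientation (degrees in [0, 180)) of the line with the most support,
    using a normal-form Hough accumulator over (theta, rho) bins."""
    acc = {}
    for theta_i in range(n_theta):
        theta = math.pi * theta_i / n_theta
        for x, y in points:
            # normal form: rho = x*cos(theta) + y*sin(theta); coarse rho binning via round()
            rho = round(x * math.cos(theta) + y * math.sin(theta), 3)
            acc[(theta_i, rho)] = acc.get((theta_i, rho), 0) + 1
    # angle extraction: theta of the accumulator bin with the most votes
    (theta_i, _), _ = max(acc.items(), key=lambda kv: kv[1])
    return 180.0 * theta_i / n_theta
```

Applied to the pixels of an elongated mark, the dominant line orientation serves as the robot's current angle.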
CN2010102148791A 2010-06-30 2010-06-30 Reconfigurable miniature mobile robot cluster locating and tracking system Expired - Fee Related CN101893894B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010102148791A CN101893894B (en) 2010-06-30 2010-06-30 Reconfigurable miniature mobile robot cluster locating and tracking system


Publications (2)

Publication Number Publication Date
CN101893894A CN101893894A (en) 2010-11-24
CN101893894B 2012-01-04

Family

ID=43103113

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010102148791A Expired - Fee Related CN101893894B (en) 2010-06-30 2010-06-30 Reconfigurable miniature mobile robot cluster locating and tracking system

Country Status (1)

Country Link
CN (1) CN101893894B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102445681B (en) * 2011-09-30 2013-07-03 深圳市九洲电器有限公司 Indoor positioning method and indoor positioning system of movable device
CN105654086A (en) * 2014-11-12 2016-06-08 沈阳新松机器人自动化股份有限公司 Robot key member pose identification method and system based on monocular vision
CN107105193B (en) 2016-02-23 2020-03-20 芋头科技(杭州)有限公司 Robot monitoring system based on human body information
CN106569493B (en) * 2016-11-03 2020-02-14 中国科学院深圳先进技术研究院 AGV cluster positioning method based on pulse ultra-wideband technology and AGV dispatching method
CN106647766A (en) * 2017-01-13 2017-05-10 广东工业大学 Robot cruise method and system based on complex environment UWB-vision interaction
CN107423766B (en) * 2017-07-28 2020-07-31 江苏大学 Method for detecting tail end motion pose of series-parallel automobile electrophoretic coating conveying mechanism
JP7172151B2 (en) * 2018-06-11 2022-11-16 オムロン株式会社 Control systems, controllers and programs

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003241833A (en) * 2002-02-18 2003-08-29 Hitachi Ltd Information distribution service by mobile robot and information gathering system
CN1674047A (en) * 2004-03-25 2005-09-28 上海大学 Six freedom visual tracking method and system based on micro machine parallel processing structure
CN1707223A (en) * 2004-06-12 2005-12-14 杨建华 Indoor moving robot positioning system and method based on bar code
KR100698535B1 (en) * 2005-11-04 2007-03-22 재단법인 포항산업과학연구원 Position recognition device and method of mobile robot with tilt correction function
CN101201626A (en) * 2007-12-10 2008-06-18 华中科技大学 Freedom positioning system for robot
KR100902343B1 (en) * 2007-11-08 2009-06-12 한국전자통신연구원 Robot vision system and detection method
CN101661098A (en) * 2009-09-10 2010-03-03 上海交通大学 Multi-robot automatic locating system for robot restaurant


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Chen Jiapin et al. Research on millimeter-scale omnidirectional mobile robots and their micro-assembly system. China Mechanical Engineering, 2005, Vol. 16, No. 14, pp. 1223-1225. *

Also Published As

Publication number Publication date
CN101893894A (en) 2010-11-24

Similar Documents

Publication Publication Date Title
CN101893894B (en) Reconfigurable miniature mobile robot cluster locating and tracking system
CN110948492B (en) Three-dimensional grabbing platform and grabbing method based on deep learning
CN111496770B (en) Intelligent carrying mechanical arm system based on 3D vision and deep learning and use method
CN112184818B (en) Vision-based vehicle positioning method and parking lot management system applying same
CN111055281B (en) ROS-based autonomous mobile grabbing system and method
CN102866706B (en) Cleaning robot adopting smart phone navigation and navigation cleaning method thereof
CN103294059A (en) Hybrid navigation belt based mobile robot positioning system and method thereof
CN110706248A (en) Visual perception mapping algorithm based on SLAM and mobile robot
CN203241826U (en) Mobile robot positioning system based on hybrid navigation ribbon
CN101574586B (en) Shuttlecock robot and control method thereof
CN110706267B (en) Mining process-based ore three-dimensional coordinate acquisition method and device
CN108074265A (en) A kind of tennis alignment system, the method and device of view-based access control model identification
Ismail et al. Vision-based system for line following mobile robot
CN114155610B (en) Panel assembly key action identification method based on upper half body posture estimation
CN111932617B (en) Method and system for realizing real-time detection and positioning of regular objects
CN204036474U (en) Industrial robot sorting system
CN115299245B (en) Control method and control system of intelligent fruit picking robot
CN114935341B (en) Novel SLAM navigation computation video identification method and device
CN115589845A (en) Intelligent cotton picking robot and cotton picking operation path planning method thereof
CN111182263A (en) Robot system with cloud analysis platform and visual analysis method
CN115289966A (en) Goods shelf detecting and positioning system and method based on TOF camera
CN108534788A (en) A kind of AGV air navigation aids based on kinect visions
CN114299039A (en) Robot and collision detection device and method thereof
CN113126625A (en) Robot walking method and walking device based on automatic tracking
Jia et al. Pallet Detection Based on Halcon and AlexNet Network for Autonomous Forklifts

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120104

Termination date: 20140630

EXPY Termination of patent right or utility model