CN106393144A - Method and system for visual tracking under multi-robot operation mode - Google Patents
Method and system for visual tracking under multi-robot operation mode
- Publication number
- CN106393144A (application CN201611056367.0A)
- Authority
- CN
- China
- Prior art keywords
- destination
- visual
- robot
- delta robot
- under
- Prior art date
Links
- 230000000007 visual effect Effects 0.000 title claims abstract description 57
- 238000000034 methods Methods 0.000 claims abstract description 29
- 238000001914 filtration Methods 0.000 claims description 9
- 230000002079 cooperative Effects 0.000 abstract description 5
- 238000004458 analytical methods Methods 0.000 description 4
- 238000003708 edge detection Methods 0.000 description 4
- 230000000694 effects Effects 0.000 description 4
- 238000003384 imaging method Methods 0.000 description 4
- 238000010586 diagrams Methods 0.000 description 3
- 238000005516 engineering processes Methods 0.000 description 3
- 230000002159 abnormal effects Effects 0.000 description 2
- 238000007688 edging Methods 0.000 description 2
- 238000004088 simulation Methods 0.000 description 2
- 238000003860 storage Methods 0.000 description 2
- 230000006399 behavior Effects 0.000 description 1
- 239000003814 drugs Substances 0.000 description 1
- 239000000686 essences Substances 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000002360 preparation methods Methods 0.000 description 1
- 238000004886 process control Methods 0.000 description 1
- 230000001131 transforming Effects 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
Abstract
Description
Technical field
The present invention relates to the field of intelligent manufacturing technology, and in particular to a method and system for visual tracking under a multi-robot operation mode.
Background technology
With the continuous development of robotics, more and more robots are beginning to take over tasks formerly performed by humans. "Robot" is the common term for an automatically controlled machine, which covers all machinery that simulates human behavior or thought as well as other living beings (such as robot dogs, Doraemon and so on). In the narrow sense there are many ways of classifying robots and some dispute over the definition; some computer programs are even referred to as robots. In modern industry, a robot is an artificial machine device that executes tasks automatically, used to replace or assist human work. The ideal highly humanoid robot is a product of advanced integrated control theory, mechatronics, computer science and artificial intelligence, materials science and bionics, and the scientific community is researching and developing in this direction. However, remote process control of robots is still imperfect, the application of big data is not yet widespread, data acquisition by robots is still largely offline, and deep learning for robots still relies on locally stored data.
The DELTA robot is a type of parallel robot. By driving three parallelogram branch chains through external rotating pairs, together with a central rotary drive shaft, the four-dimensional spatial movement of its end effector can be achieved. Once equipped with a vision system, DELTA robots are widely used in sorting and packaging in electronics, light industry, food, pharmaceuticals and other fields.
At present, each manufacturer builds the control system of its DELTA robots around its own dedicated robot controller, and a control system built on such a dedicated controller has poor generality. A general DELTA robot control system is composed of a motion control system, a servo system, a visual tracking system and a conveyor belt tracking system, as shown in Fig. 1. Because multiple DELTA robots work cooperatively, the target object completes different operation processes under different DELTA robots at different times, and the prior art lacks a feasible scheme for realizing video tracking during this process.
Summary of the invention
The invention provides a method and system for visual tracking under a multi-robot operation mode, which establishes video tracking under a cooperative operation mode and thereby solves the coordination problem of existing machine vision.
The invention provides a method for visual tracking under a multi-robot operation mode, wherein the robot control system includes a motion control system and a vision system, the motion control system is connected with the vision system, the motion control system is connected with at least two DELTA robots, and the motion control system and the at least two DELTA robots are connected through an EtherCAT bus. The method includes:
during the process in which the motion control system controls the operation of the DELTA robots, the vision system builds a target tracking model under each DELTA robot, and acquires and processes images of the target object;
detecting and identifying the working state of each DELTA robot with image processing methods, and obtaining the target object during the operation process of each DELTA robot;
using a distance measurement method, accurately locating the distance between a DELTA robot and the target object, and predicting the state of the target object;
the vision system tracking the next state value of the target object in time according to the prediction.
The vision system building a target tracking model under each DELTA robot, and acquiring and processing images of the target object, includes:
establishing the target tracking model with a tracking algorithm that combines the color and edges of the working background of the DELTA robots.
Detecting and identifying the working state of each DELTA robot with image processing methods includes:
detecting and identifying the target object under the working state of each DELTA robot using a frame difference method.
Using a distance measurement method to accurately locate the distance between a DELTA robot and the target object, and predicting the state of the target object, includes:
calculating the distance between the DELTA robot and the target object using a monocular vision distance measurement method;
predicting the motion state of the target object based on a Kalman filtering algorithm.
The vision system tracking the next state value of the target object in time according to the prediction includes:
tracking the target object in real time based on collaborative control instructions exchanged between the at least two DELTA robots, and capturing its state value under the different DELTA robots based on the vision system.
Accordingly, the present invention also provides a robot control system, including a motion control system and a vision system, wherein the motion control system is connected with the vision system, the motion control system is connected with at least two DELTA robots, and the motion control system and the at least two DELTA robots are connected through an EtherCAT bus, wherein:
the motion control system is used to control the target object to complete coordinated operations under the at least two DELTA robots;
the vision system is used to build a target tracking model under each DELTA robot, and to acquire and process images of the target object; to detect and identify the working state of each DELTA robot with image processing methods and obtain the target object during the operation process of each DELTA robot; and to use a distance measurement method to accurately locate the distance between a DELTA robot and the target object and predict the state of the target object; the vision system then tracks the next state value of the target object in time according to the prediction.
The vision system establishes the target tracking model with a tracking algorithm that combines the color and edges of the working background of the DELTA robots.
The vision system detects and identifies the target object under the working state of each DELTA robot using a frame difference method.
The vision system calculates the distance between a DELTA robot and the target object using a monocular vision distance measurement method, and predicts the motion state of the target object based on a Kalman filtering algorithm.
The vision system tracks the target object in real time based on the collaborative control instructions exchanged between the at least two DELTA robots, and captures its state value under the different DELTA robots.
In the present invention, by establishing video tracking under a cooperative operation mode, the vision system realizes video tracking images of the target object under multiple DELTA robots working cooperatively. Because the distance measurement method allows the state of the target object to be predicted in time, the next state value of the target object can be followed up in time, which ensures orderly monitoring of the whole operation process of the object and allows the whole operation process to be replayed in time sequence.
Brief description of the drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the drawings in the following description are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a structural schematic diagram of a robot control system in the prior art;
Fig. 2 is a structural schematic diagram of the robot control system in an embodiment of the present invention;
Fig. 3 is a flow chart of the method for visual tracking under the multi-robot operation mode in an embodiment of the present invention.
Specific embodiment
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Specifically, Fig. 2 shows a structural schematic diagram of the robot control system in an embodiment of the present invention. The robot control system includes a motion control system and a vision system; the motion control system is connected with the vision system, the motion control system is connected with at least two DELTA robots, and the motion control system and the at least two DELTA robots are connected through an EtherCAT bus. Three DELTA robots are shown in the embodiment of the present invention, wherein:
the motion control system is used to control the target object to complete coordinated operations under the at least two DELTA robots;
the vision system is used to build a target tracking model under each DELTA robot, and to acquire and process images of the target object; to detect and identify the working state of each DELTA robot with image processing methods and obtain the target object during the operation process of each DELTA robot; and to use a distance measurement method to accurately locate the distance between a DELTA robot and the target object and predict the state of the target object; the vision system then tracks the next state value of the target object in time according to the prediction.
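As a purely illustrative sketch of this topology (the class and field names such as RobotControlSystem and ethercat_node are hypothetical and not taken from the patent):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DeltaRobot:
    name: str
    ethercat_node: int   # node address on the shared EtherCAT bus

@dataclass
class RobotControlSystem:
    """Topology of Fig. 2: a motion control system connected to a vision
    system and to at least two DELTA robots over one EtherCAT bus."""
    vision_system: str
    robots: List[DeltaRobot] = field(default_factory=list)

    def validate(self) -> None:
        # The embodiment requires at least two DELTA robots on the bus.
        if len(self.robots) < 2:
            raise ValueError("at least two DELTA robots are required")

system = RobotControlSystem(
    vision_system="overhead cameras",
    robots=[DeltaRobot("delta_1", 1),
            DeltaRobot("delta_2", 2),
            DeltaRobot("delta_3", 3)],   # three robots, as in Fig. 2
)
system.validate()
```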
In the specific implementation process, the vision system establishes the target tracking model with a tracking algorithm that combines the color and edges of the working background of the DELTA robots; detects and identifies the target object under the working state of each DELTA robot using a frame difference method; calculates the distance between a DELTA robot and the target object using a monocular vision distance measurement method; predicts the motion state of the target object based on a Kalman filtering algorithm; and, based on the collaborative control instructions exchanged between the at least two DELTA robots, tracks the target object in real time and captures its state value under the different DELTA robots based on the vision system.
It should be noted that, in the embodiment of the present invention, the target tracking model is established with a tracking algorithm that combines color and edges. Some color edge detection algorithms can be adopted for this purpose; they take full account of the local edges of a color image and perform tracking using color differences, thereby compensating for the edges lost by conventional edge detection methods. Such algorithms can extract more color edge information, their detection accuracy and effect are satisfactory, and they have practical value and a good processing effect.
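By way of illustration only, a minimal sketch of such a color-plus-edge tracking template is given below, assuming OpenCV and NumPy are available; the helper name build_color_edge_model and the parameter values are hypothetical and not taken from the patent.

```python
import cv2
import numpy as np

def build_color_edge_model(frame_bgr, roi):
    """Build a simple color+edge template for the target region.

    frame_bgr: BGR frame from the camera above a DELTA robot.
    roi:       (x, y, w, h) bounding box of the target object.
    """
    x, y, w, h = roi
    patch = frame_bgr[y:y + h, x:x + w]

    # Color part: a hue histogram, which is relatively robust to lighting.
    hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)
    hue_hist = cv2.calcHist([hsv], [0], None, [32], [0, 180])
    cv2.normalize(hue_hist, hue_hist, 0, 1, cv2.NORM_MINMAX)

    # Edge part: Canny edges computed per color channel and merged, so that
    # edges visible only as a color difference are not lost.
    edges = np.zeros(patch.shape[:2], dtype=np.uint8)
    for channel in cv2.split(patch):
        edges = cv2.bitwise_or(edges, cv2.Canny(channel, 50, 150))

    return {"hue_hist": hue_hist, "edge_map": edges, "size": (w, h)}
```

The resulting template can then be matched against candidate regions in later frames.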
It should be noted that the frame difference method obtains the contour of a moving target by performing a difference operation on two adjacent frames of a video image sequence, and is well suited to situations in which there are multiple moving targets or the camera is moving. When abnormal object movement occurs in the monitored scene, an obvious difference appears between frames; subtracting two frames gives the absolute value of the brightness difference between the two frame images, and comparing it against a threshold makes it possible to analyze the motion characteristics of the video or image sequence and determine whether there is object movement in the image sequence. Differencing the image sequence frame by frame is equivalent to high-pass filtering the image sequence in the time domain.
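A minimal frame-difference sketch along these lines might look as follows; OpenCV is assumed, and the threshold and morphology settings are arbitrary example values rather than values specified by the patent.

```python
import cv2

def frame_difference(prev_gray, curr_gray, thresh=25):
    """Return a binary motion mask and moving-object contours
    from two adjacent grayscale frames."""
    # Absolute brightness difference between the two adjacent frames.
    diff = cv2.absdiff(prev_gray, curr_gray)

    # Thresholding implements the decision "difference > threshold, so motion".
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)

    # Light morphological cleanup before extracting contours.
    mask = cv2.dilate(mask, None, iterations=2)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return mask, contours
```

Each camera associated with a DELTA robot could run this on consecutive grabbed frames to decide whether the target object is moving in that robot's working area.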
It should be noted that the monocular vision distance measurement method can use the pinhole imaging principle to derive the mapping relationship between an image point and a target point and establish a pinhole camera model. Then, by analyzing the target image, the mapping relationship between the area of the object and the target image is obtained, and a straight-line distance measurement model for vision measurement is established. Through image processing, the feature points of the target image are extracted, and the distance relationship between the optical center and the object is converted into the distance relationship between the optical center and the feature points, yielding a feature-point-based monocular vision distance measurement principle.
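As a simplified illustration of the underlying pinhole relationship (a sketch only, not the patent's exact feature-point model; the focal length and the physical width of the object are assumed to be known from calibration):

```python
def monocular_distance(focal_length_px, real_width_m, pixel_width):
    """Estimate object distance from the pinhole camera model.

    focal_length_px: focal length in pixels (from camera calibration).
    real_width_m:    known physical width of the target object in meters.
    pixel_width:     measured width of the object in the image in pixels.

    Pinhole model: pixel_width / focal_length_px = real_width_m / distance,
    hence distance = focal_length_px * real_width_m / pixel_width.
    """
    if pixel_width <= 0:
        raise ValueError("pixel_width must be positive")
    return focal_length_px * real_width_m / pixel_width


# Example: a 0.05 m wide part imaged 80 px wide by a camera with an
# 800 px focal length is roughly 0.5 m from the optical center.
print(monocular_distance(800.0, 0.05, 80.0))
```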
It should be noted that the Kalman filtering algorithm uses the state equation of a linear system and the observed input and output data of the system to make an optimal estimate of the system state, thereby realizing the prediction of data values.
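For illustration, a constant-velocity Kalman filter over the image position of the target object could be set up with OpenCV's KalmanFilter as sketched below; the state layout and noise values are illustrative assumptions.

```python
import cv2
import numpy as np

def make_tracker(dt=1.0):
    """Constant-velocity Kalman filter over the state [x, y, vx, vy]."""
    kf = cv2.KalmanFilter(4, 2)   # 4 state variables, 2 measured (x, y)
    kf.transitionMatrix = np.array([[1, 0, dt, 0],
                                    [0, 1, 0, dt],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], dtype=np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], dtype=np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
    kf.errorCovPost = np.eye(4, dtype=np.float32)   # initial uncertainty
    return kf

# Feed measured image positions, then read off the predicted next position.
kf = make_tracker()
for mx, my in [(10.0, 10.0), (12.0, 11.0), (14.0, 12.0)]:
    kf.predict()                                           # prior estimate
    kf.correct(np.array([[mx], [my]], dtype=np.float32))   # measurement update
print(kf.predict()[:2].ravel())   # predicted next (x, y)
```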
Accordingly, Fig. 3 shows a flow chart of the method for visual tracking under the multi-robot operation mode in an embodiment of the present invention, wherein the robot control system includes a motion control system and a vision system, the motion control system is connected with the vision system, the motion control system is connected with at least two DELTA robots, and the motion control system and the at least two DELTA robots are connected through an EtherCAT bus. The method includes the following steps:
S301: while the motion control system controls the operation of the DELTA robots, the vision system builds a target tracking model under each DELTA robot, and acquires and processes images of the target object.
In the specific implementation process, the target tracking model is established with a tracking algorithm that combines the color and edges of the working background of the DELTA robots. In the embodiment of the present invention, some color edge detection algorithms can be adopted for this purpose; they take full account of the local edges of a color image and perform tracking using color differences, thereby compensating for the edges lost by conventional edge detection methods. Such algorithms can extract more color edge information, their detection accuracy and effect are satisfactory, and they have practical value and a good processing effect.
S302: detecting and identifying the working state of each DELTA robot with image processing methods, and obtaining the target object during the operation process of each DELTA robot.
In the specific implementation process, the target object under the working state of each DELTA robot is detected and identified using a frame difference method. The frame difference method obtains the contour of a moving target by performing a difference operation on two adjacent frames of a video image sequence, and is well suited to situations in which there are multiple moving targets or the camera is moving. When abnormal object movement occurs in the monitored scene, an obvious difference appears between frames; subtracting two frames gives the absolute value of the brightness difference between the two frame images, and comparing it against a threshold makes it possible to analyze the motion characteristics of the video or image sequence and determine whether there is object movement in the image sequence. Differencing the image sequence frame by frame is equivalent to high-pass filtering the image sequence in the time domain. The Kalman filtering algorithm uses the state equation of a linear system and the observed input and output data of the system to make an optimal estimate of the system state, thereby realizing the prediction of data values.
S303: using a distance measurement method, accurately locating the distance between a DELTA robot and the target object, and predicting the state of the target object.
In the specific implementation process, the distance between a DELTA robot and the target object is calculated using a monocular vision distance measurement method, and the motion state of the target object is predicted based on a Kalman filtering algorithm.
The monocular vision distance measurement method can use the pinhole imaging principle to derive the mapping relationship between an image point and a target point and establish a pinhole camera model. Then, by analyzing the target image, the mapping relationship between the area of the object and the target image is obtained, and a straight-line distance measurement model for vision measurement is established. Through image processing, the feature points of the target image are extracted, and the distance relationship between the optical center and the object is converted into the distance relationship between the optical center and the feature points, yielding a feature-point-based monocular vision distance measurement principle.
S304: the vision system tracks the next state value of the target object in time according to the prediction.
In the specific implementation process, the target object is tracked in real time based on the collaborative control instructions exchanged between the at least two DELTA robots, and its state value under the different DELTA robots is captured based on the vision system.
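Purely as a sketch of how the pieces illustrated above might be combined per DELTA robot (the dictionary layout and the reuse of the hypothetical helpers build_color_edge_model, frame_difference, monocular_distance and make_tracker are assumptions, not part of the patent):

```python
import cv2
import numpy as np

# One tracker entry per DELTA robot / camera pair (three robots, as in Fig. 2).
trackers = {robot_id: {"kf": make_tracker(), "model": None}
            for robot_id in ("delta_1", "delta_2", "delta_3")}

def track_step(robot_id, prev_gray, curr_gray, frame_bgr, roi):
    """One tracking step for the target object under one DELTA robot."""
    t = trackers[robot_id]
    if t["model"] is None:
        t["model"] = build_color_edge_model(frame_bgr, roi)        # S301
    predicted = t["kf"].predict()[:2].ravel()                      # S304
    _, contours = frame_difference(prev_gray, curr_gray)           # S302
    if contours:
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        distance = monocular_distance(800.0, 0.05, float(w))       # S303
        t["kf"].correct(np.array([[x + w / 2.0], [y + h / 2.0]],
                                 dtype=np.float32))
        return predicted, distance
    return predicted, None
```

The predicted position and distance returned for each robot give the state values that the vision system captures as the target object moves between the DELTA robots.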
To sum up, by establishing video tracking under a cooperative operation mode, the vision system realizes video tracking images of the target object under multiple DELTA robots working cooperatively. Because the distance measurement method allows the state of the target object to be predicted in time, the next state value of the target object can be followed up in time, which ensures orderly monitoring of the whole operation process of the object and allows the whole operation process to be replayed in time sequence.
Those of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments can be completed by a program instructing the relevant hardware, and the program can be stored in a computer-readable storage medium. The storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.
The method and system for visual tracking under a multi-robot operation mode provided by the embodiments of the present invention have been described in detail above. Specific examples are used herein to explain the principle and implementation of the present invention, and the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. At the same time, for those of ordinary skill in the art, there will be changes in the specific implementation and scope of application according to the idea of the present invention. In summary, the contents of this specification should not be construed as limiting the present invention.
Claims (10)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611056367.0A CN106393144B (en) | 2016-11-26 | 2016-11-26 | Method and system for visual tracking under a multi-robot operation mode |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611056367.0A CN106393144B (en) | 2016-11-26 | 2016-11-26 | Method and system for visual tracking under a multi-robot operation mode |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106393144A true CN106393144A (en) | 2017-02-15 |
CN106393144B CN106393144B (en) | 2018-09-04 |
Family
ID=58082673
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611056367.0A CN106393144B (en) | 2016-11-26 | 2016-11-26 | Method and system for visual tracking under a multi-robot operation mode |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106393144B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108827250A (en) * | 2018-05-07 | 2018-11-16 | 深圳市三宝创新智能有限公司 | A kind of robot monocular vision ranging technology method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103345258A (en) * | 2013-06-16 | 2013-10-09 | 西安科技大学 | Target tracking method and system of football robot |
CN103406905A (en) * | 2013-08-20 | 2013-11-27 | 西北工业大学 | Robot system with visual servo and detection functions |
CN104589357A (en) * | 2014-12-01 | 2015-05-06 | 佛山市万世德机器人技术有限公司 | Control system and method of DELTA robots based on visual tracking |
US20150202776A1 (en) * | 2014-01-23 | 2015-07-23 | Fanuc Corporation | Data generation device for vision sensor and detection simulation system |
CN204819543U (en) * | 2015-06-24 | 2015-12-02 | 燕山大学 | Centralized control formula multirobot motion control system |
CN106097322A (en) * | 2016-06-03 | 2016-11-09 | 江苏大学 | A kind of vision system calibration method based on neutral net |
- 2016-11-26 CN CN201611056367.0A patent/CN106393144B/en active IP Right Grant
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103345258A (en) * | 2013-06-16 | 2013-10-09 | 西安科技大学 | Target tracking method and system of football robot |
CN103406905A (en) * | 2013-08-20 | 2013-11-27 | 西北工业大学 | Robot system with visual servo and detection functions |
US20150202776A1 (en) * | 2014-01-23 | 2015-07-23 | Fanuc Corporation | Data generation device for vision sensor and detection simulation system |
CN104589357A (en) * | 2014-12-01 | 2015-05-06 | 佛山市万世德机器人技术有限公司 | Control system and method of DELTA robots based on visual tracking |
CN204819543U (en) * | 2015-06-24 | 2015-12-02 | 燕山大学 | Centralized control formula multirobot motion control system |
CN106097322A (en) * | 2016-06-03 | 2016-11-09 | 江苏大学 | A kind of vision system calibration method based on neutral net |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108827250A (en) * | 2018-05-07 | 2018-11-16 | 深圳市三宝创新智能有限公司 | A kind of robot monocular vision ranging technology method |
Also Published As
Publication number | Publication date |
---|---|
CN106393144B (en) | 2018-09-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Fernández-Caballero et al. | Optical flow or image subtraction in human detection from infrared camera on mobile robot | |
Linder et al. | On multi-modal people tracking from mobile platforms in very crowded and dynamic environments | |
Lippiello et al. | Position-based visual servoing in industrial multirobot cells using a hybrid camera configuration | |
WO2017041730A1 (en) | Method and system for navigating mobile robot to bypass obstacle | |
US9452531B2 (en) | Controlling a robot in the presence of a moving object | |
Stroupe et al. | Distributed sensor fusion for object position estimation by multi-robot systems | |
Mohammed et al. | Active collision avoidance for human–robot collaboration driven by vision sensors | |
Jensfelt et al. | A framework for vision based bearing only 3D SLAM | |
Chen et al. | Adaptive leader-follower formation control of non-holonomic mobile robots using active vision | |
CN102794767B (en) | B spline track planning method of robot joint space guided by vision | |
CN104282020B (en) | A kind of vehicle speed detection method based on target trajectory | |
Hu et al. | Bio-inspired embedded vision system for autonomous micro-robots: The LGMD case | |
Stückler et al. | Following human guidance to cooperatively carry a large object | |
CA2928645A1 (en) | Image-based trajectory robot programming planning approach | |
CN103170973A (en) | Man-machine cooperation device and method based on Kinect video camera | |
JP2019536012A (en) | Visual inertial navigation using variable contrast tracking residuals | |
Chauhan et al. | A comparative study of machine vision based methods for fault detection in an automated assembly machine | |
Tully et al. | Leap-frog path design for multi-robot cooperative localization | |
Mustafah et al. | Stereo vision images processing for real-time object distance and size measurements | |
GB2550296A (en) | Scale independent tracking system | |
US20040167671A1 (en) | Automatic work apparatus and automatic work control program | |
CN105760824B (en) | A kind of moving human hand tracking method and system | |
Kudoh et al. | Painting robot with multi-fingered hands and stereo vision | |
Martin et al. | Online interactive perception of articulated objects with multi-level recursive estimation based on task-specific priors | |
CN1455355A (en) | Method of treating passive optical motion to obtain data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |