Disclosure of Invention
The invention provides a visual guidance-based robot unstacking system, which aims to complete unstacking tasks for bins of different specifications and models.
In order to achieve the above purpose, the technical scheme adopted by the invention is as follows:
The invention relates to a visual guidance-based robot unstacking system, which comprises system hardware and system software; the system hardware comprises an unstacking robot, a clamping jaw mechanism and a vision device; the system software comprises a communication module, an identification and positioning module and a path planning module.
The unstacking robot is used for executing a designated planned path or taught path and carrying bins to perform bin unstacking and stacking.
The clamping jaw mechanism is used for gripping the bin; it comprises four grapples whose mutual spacing can be adjusted automatically, and each grapple can clamp and release.
The vision device consists of two parts: a fixed 3D camera mounted above the bin tray and a moving 3D camera mounted on the side of the robot clamping jaw system. The fixed 3D camera collects point cloud data of all bins on the top layer of the tray stack; the moving 3D camera collects point cloud data of the incoming bin on the roller line.
The communication module handles communication and data transmission between the system software, the vision device and the unstacking robot system.
The identification and positioning module identifies the bin model and the spatial position of the bin from the bin point cloud data acquired by the 3D camera, and identifies the tray size and the spatial position of the tray from the tray point cloud data acquired by the 3D camera.
The path planning module plans the robot's path from its current position to the target position.
In order to achieve the same object as the technical scheme, the invention provides an unstacking method of the visual guidance-based robot unstacking system, which comprises the following steps:
Step 1, robot GoHome;
Step 2, the fixed camera detects bins;
Step 3, state judgment; if a bin is present on the tray, continue with step 4; if no bin is present on the tray, go to step 15;
Step 4, model judgment; if the identified target bin is of a model that the system cannot grasp, go to step 16; otherwise, continue with step 5;
Step 5, adjust the bin clamping jaw configuration;
Step 6, open the bin clamping jaw;
Step 7, wait for the unstacking path;
Step 8, path judgment; if the path transmission fails or the transmitted path is empty, go to step 16; otherwise, continue with step 9;
Step 9, the robot runs the unstacking path;
Step 10, the fixed camera detects bins;
Step 11, wait for the discharge port to be empty;
Step 12, the robot runs the taught path - place the bin; once the photoelectric switch of step 11 detects that the discharge port is empty, the robot executes a taught path and moves from the Home position to the placing position above the roller line;
Step 13, open the bin clamping jaw;
Step 14, the robot runs the taught path - GoHome;
Step 15, unstacking is completed and the process exits;
Step 16, alarm exit.
Wherein:
Step 2 comprises the following sub-steps:
Step 2-1, the fixed camera takes a photograph;
Step 2-2, bin detection;
Step 2-3, determining the unstacking target bin;
Step 2-4, planning the bin unstacking path.
Step 2-4 plans the following path segments:
Path 2-4-1, Pick path: the path from the robot Home position to the grabbing approach position of the target bin; the robot may move quickly when executing this path;
Path 2-4-2, Approach path: the path from the grabbing approach position of the target bin to the grabbing position; the robot should move slowly when executing this path;
Path 2-4-3, Leave path: the path from the grabbing position of the target bin back to the grabbing approach position; the robot should move even more slowly when executing this path, because the clamping jaws have already gripped a bin full of workpieces;
Path 2-4-4, Transport path: the path from the grabbing approach position of the target bin to the Home position; the robot may accelerate appropriately when executing this path.
To achieve the same object, the invention further provides a stacking method for the visual guidance-based robot unstacking system, comprising the following steps:
Step 1, robot GoHome;
Step 2, the robot moves to the roller-line photographing position;
Step 3, wait for the bin in-place signal;
Step 4, the moving camera detects the incoming bin;
Step 5, model judgment: judge whether the model of the incoming bin is supported by the stacking system; if so, continue with step 6; otherwise, go to step 23;
Step 6, set the target bin;
Step 7, the fixed camera detects the tray;
Step 8, state judgment: judge whether a tray is present under the fixed 3D camera; if no tray is present, go to step 24; otherwise, continue with step 9;
Step 9, adjust the bin clamping jaw configuration;
Step 10, the robot moves to the bin grabbing position;
Step 11, the clamping jaw grips the bin;
Step 12, the robot runs the leaving path;
Step 13, wait for the stacking path;
Step 14, path judgment: if no path is received or the received path is empty, go to step 23; otherwise, continue with step 15;
Step 15, execute the stacking path;
Step 16, update the stacking stack shape;
Step 17, state judgment: judge whether stacking of the whole stack is completed; if so, go to step 22; otherwise, continue with step 18 and keep executing the stacking cycle;
Step 18, the robot moves to the roller-line photographing position, the same as step 2;
Step 19, wait for the bin in-place signal, the same as step 3;
Step 20, the moving camera detects the incoming bin, the same as step 4;
Step 21, model judgment: judge whether the model of the incoming bin is consistent with the model of the set target stacking bin; if consistent, return to step 10; if inconsistent, go to step 23;
Step 22, stacking is completed and the process exits;
Step 23, alarm exit;
Step 24, no-tray alarm exit.
Wherein:
Step 4 specifically comprises the following two sub-steps:
Step 4-1, the moving camera takes a photograph;
Step 4-2, bin model detection.
Step 7 specifically comprises the following sub-steps:
Step 7-1, the fixed camera takes a photograph;
Step 7-2, tray detection;
Step 7-3, setting the target tray;
Step 7-4, planning the bin stacking stack shape.
Step 13 specifically comprises the following sub-steps:
Step 13-1, stacking stack shape query;
Step 13-2, planning the bin stacking path.
The stacking path comprises the following four path segments:
Path 13-3-1, Place path: the path from the Home position to the approach position above the stack;
Path 13-3-2, Approach path: the path from the approach position to the stacking position;
Path 13-3-3, Leave path: the path from the stacking position back to the approach position;
Path 13-3-4, Transport path: the path from the approach position to the Home position.
By adopting the above technical scheme, the invention solves the problem of automatically unstacking mixed stacks of bins of different specifications and models. In the unstacking process, the system automatically identifies all bins on the top layer of the tray stack, automatically determines the unstacking target bin, and automatically plans the robot's bin unstacking path; in the stacking process, the system automatically identifies the tray position, automatically identifies the model of the incoming bin on the roller line, automatically plans the stack shape, and automatically plans the stacking path for the target bin. The system can thus accomplish unstacking and stacking tasks for bins of multiple specifications and models.
Detailed Description
The embodiments of the invention are described in detail below with reference to the accompanying drawings, by way of example only, to help those skilled in the art understand the inventive concept and technical solution of the invention more completely, accurately and thoroughly.
The invention belongs to the field of automatic robot unstacking and is mainly applied to the automatic storage and transportation of goods in industrial sites. The invention is a visual guidance-based robot unstacking system comprising system hardware and system software; its structure is shown in fig. 2.
To solve the problems of the prior art, overcome its defects, and achieve the aim of completing unstacking tasks for bins of various specifications and models, the invention adopts the following technical scheme:
As shown in fig. 2, in the visual guidance-based robot unstacking system of the invention, the system hardware comprises an unstacking robot, a clamping jaw mechanism and a vision device, and the system software comprises a communication module, an identification and positioning module and a path planning module.
The visual guidance-based robot unstacking system can be used for unstacking and stacking bins of various models; here it is illustrated by taking stacking of a single bin model as an example. When the system is applied to stacking bins of multiple models, a plurality of trays need to be arranged under the fixed 3D camera and the size and position of each tray detected; a separate stacking stack is planned on each tray for each bin model. After the model of the incoming bin on the roller line has been detected, it is only necessary to set the corresponding tray as the target tray and the corresponding stack as the current stacking stack.
The invention solves the problem of automatically unstacking mixed stacks of bins of different specifications and models: in the unstacking process the system automatically identifies all bins on the top layer of the tray stack, automatically determines the unstacking target bin and automatically plans the robot's bin unstacking path; in the stacking process it automatically identifies the tray position, automatically identifies the model of the incoming bin on the roller line, automatically plans the stack shape and automatically plans the stacking path for the target bin; the system can thus accomplish unstacking and stacking tasks for bins of multiple specifications and models.
Specifically, the main functions of each part of the hardware system of the invention are as follows:
1. Unstacking and stacking industrial robot system: mainly used for executing a designated planned path or taught path and carrying bins to perform bin unstacking and stacking.
2. Clamping jaw system: used for gripping the bin; it comprises four grapples whose mutual spacing can be adjusted automatically so that bins of different specifications and models can be grasped, and each grapple can clamp and release.
3. Vision system: consists of two parts, a fixed 3D camera mounted above the bin tray and a moving 3D camera mounted on the side of the robot clamping jaw system; the fixed 3D camera collects point cloud data of all bins on the top layer of the tray stack, and the moving 3D camera collects point cloud data of the incoming bin on the roller line.
The main functions of each part of the software system of the invention are as follows:
1. Communication module: handles communication and data transmission between the system software, the vision device and the unstacking robot system.
2. Identification and positioning module: identifies the bin model and the spatial position of the bin from the bin point cloud data acquired by the 3D camera, and identifies the tray size and the spatial position of the tray from the tray point cloud data acquired by the 3D camera (a sketch of this matching step follows this module list).
3. Path planning module: plans the robot's path from its current position to the target position.
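To make the operation of the identification and positioning module more concrete, the following minimal sketch (Python with NumPy; the catalog values, tolerance and function name are illustrative assumptions, not part of the original description) matches the bounding-box dimensions of a segmented point-cloud cluster against pre-stored bin models and returns the model together with an approximate spatial position. An oriented bounding-box fit and orientation estimation, which a real system would need, are omitted; tray detection works analogously on the tray's larger footprint.

```python
import numpy as np

# Hypothetical catalog of pre-stored bin models: length x width x height in metres.
# The values are illustrative only; the real system stores the actual bin dimensions.
BIN_CATALOG = {
    "BIN_A": (0.60, 0.40, 0.28),
    "BIN_B": (0.40, 0.30, 0.23),
}

def identify_bin(cluster_points, dim_tolerance=0.02):
    """Match one segmented bin cluster (an Nx3 point array in the camera or
    world frame) against the pre-stored catalog; return (model, centre) or None."""
    pts = np.asarray(cluster_points, dtype=float)
    mins, maxs = pts.min(axis=0), pts.max(axis=0)
    extents = maxs - mins                          # (dx, dy, dz) of the cluster
    footprint = sorted(extents[:2], reverse=True)  # long side first
    height = extents[2]

    for model, (length, width, model_height) in BIN_CATALOG.items():
        if (abs(footprint[0] - length) < dim_tolerance
                and abs(footprint[1] - width) < dim_tolerance
                and abs(height - model_height) < dim_tolerance):
            centre = (mins + maxs) / 2.0           # coarse spatial position of the bin
            return model, centre
    return None                                    # unknown model -> alarm branch
```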
The above technical solution can be further optimized, improved or substituted as follows:
The industrial robot can rotate 360 degrees about its first joint axis, and a single 3D camera, given its limited field of view, cannot detect a plurality of trays with high precision. Therefore, for a multi-model bin stacking system, a plurality of fixed 3D cameras can be arranged annularly around the robot to detect the size and position information of a plurality of trays and thus realize stacking of bins of various models.
To achieve the same object, the invention further provides an unstacking method for the above visual guidance-based robot unstacking system; as shown in fig. 3, the unstacking method comprises the following steps:
Step 1, robot GoHome: every unstacking task starts from the taught Home position to ensure the safety of the robot during the bin unstacking motion; if the robot is not currently at the Home position, it automatically moves to the Home position;
Step 2, the fixed camera detects bins: this process detects all bins on the top layer of the stack, determines one of them as the unstacking target bin, and plans the corresponding robot unstacking path; it specifically comprises the following sub-steps:
Step 2-1, the fixed camera takes a photograph: the fixed 3D camera above the tray is triggered and collects a 3D point cloud of the bins on the top layer of the tray stack;
Step 2-2, bin detection: the identification and positioning module identifies all bins in the acquired 3D point cloud and obtains their model and spatial position information;
Step 2-3, determining the unstacking target bin: among all top-layer bins detected in step 2-2, the target bin is selected according to the priority rule of height above the ground from high to low and distance from the robot from near to far, i.e. the highest bin and, at equal height, the bin nearest the robot is chosen, and its model and spatial position information are obtained (see the sketch following step 2-4 below);
Step 2-4, planning the bin unstacking path: this solves the path planning problem of moving the robot from the Home position to the target bin and from the target bin back to the Home position, and specifically plans 4 path segments:
Path 2-4-1, Pick path: the path from the robot Home position to the grabbing approach position of the target bin; the robot may move quickly when executing this path;
Path 2-4-2, Approach path: the path from the grabbing approach position of the target bin to the grabbing position; the robot should move slowly when executing this path;
Path 2-4-3, Leave path: the path from the grabbing position of the target bin back to the grabbing approach position; the robot should move even more slowly when executing this path, because the clamping jaws have already gripped a bin full of workpieces;
Path 2-4-4, Transport path: the path from the grabbing approach position of the target bin to the Home position; the robot may accelerate appropriately when executing this path.
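The selection rule of step 2-3 and the four-segment path of step 2-4 can be illustrated with the following short sketch (Python; the pose representation, the robot base coordinates and the speed-scaling values are assumptions for illustration, and in practice each segment would contain many interpolated waypoints):

```python
from dataclasses import dataclass

@dataclass
class BinDetection:
    model: str
    x: float
    y: float
    z: float            # height of the bin above the ground

def choose_target_bin(bins, robot_xy=(0.0, 0.0)):
    """Step 2-3: prefer the highest detected bin; among bins of equal height,
    prefer the one closest to the robot base."""
    def priority(b):
        dist = ((b.x - robot_xy[0]) ** 2 + (b.y - robot_xy[1]) ** 2) ** 0.5
        return (-b.z, dist)                        # highest first, then nearest
    return min(bins, key=priority)

def build_unstack_path(home, approach_pose, grasp_pose):
    """Step 2-4: assemble the four path segments with their speed hints
    (the speed-scaling numbers are illustrative only)."""
    return [
        ("Pick",      [home, approach_pose],       1.0),   # fast: no load yet
        ("Approach",  [approach_pose, grasp_pose], 0.2),   # slow: closing in on the bin
        ("Leave",     [grasp_pose, approach_pose], 0.1),   # slowest: bin full of workpieces
        ("Transport", [approach_pose, home],       0.6),   # moderate speed back to Home
    ]
```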
Step 3, state judgment: based on the point cloud acquired by the fixed 3D camera in step 2, the identification and positioning module judges whether a bin is still present on the tray; if a bin is present on the tray, continue with step 4; if no bin is present, go to step 15 and the unstacking task is finished;
Step 4, model judgment: the model of the target unstacking bin determined in step 2-3 is checked; if the identified target bin is of a model that the system cannot grasp, go to step 16 and exit with an alarm; otherwise, continue with step 5;
Step 5, adjust the bin clamping jaw configuration: the mutual spacing of the four grapples of the clamping jaw system is adjusted so that they can grasp the target bin;
Step 6, open the bin clamping jaw: the four grapples of the clamping jaw system are opened so that the target bin can be grasped later;
Step 7, wait for the unstacking path: the robot waits for the software system to send all 4 unstacking path segments to the robot controller;
Step 8, path judgment: judge whether the path transmission succeeded and whether the transmitted path segments are empty; if the transmission fails or the transmitted path is empty, go to step 16 and exit with an alarm; otherwise, continue with step 9;
Step 9, the robot runs the unstacking path: the robot moves point by point along the transmitted non-empty path segments; when it reaches the target grabbing position, the clamping jaw closes and grasps the target bin; after step 9 is completed, the bin has been carried to the robot Home position;
Step 10, the fixed camera detects bins: the same as step 2; the bin detection for the next cycle is carried out in advance;
Step 11, wait for the discharge port to be empty: wait until the bin discharge port on the roller line is empty, as detected by a photoelectric detection switch;
Step 12, the robot runs the taught path - place the bin: once the photoelectric switch of step 11 detects that the discharge port is empty, the robot executes a taught path and moves from the Home position to the placing position above the roller line;
Step 13, open the bin clamping jaw: when the robot reaches the placing position of the target bin, it opens the clamping jaw and the target bin is placed on the roller line;
Step 14, the robot runs the taught path - GoHome: the robot moves from the placing position back to the Home position;
Step 15, unstacking is completed and the process exits: because the next bin detection was already performed in advance in step 10, the flow can return directly to step 3 after step 14 finishes; single-bin unstacking is thus executed cyclically until the state judgment of step 3 detects that no bin remains on the tray, at which point unstacking of the whole bin stack is completed;
Step 16, alarm exit: in step 4, if the detected model of the target bin is not a model the system can grasp, the flow jumps to this step and exits with an alarm; in step 8, if the planned unstacking path fails to be sent, or a planning failure leads to an empty transmitted path, the flow likewise jumps to this step and exits with an alarm.
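The complete unstacking flow of steps 1 to 16 can be summarised by the control-loop sketch below (Python; robot, fixed_camera, jaws, path_planner and outfeed_is_empty are hypothetical interface objects and callables standing in for the real subsystems, and SUPPORTED_MODELS is an assumed configuration value; this is an illustration of the flow described above, not a definitive implementation):

```python
SUPPORTED_MODELS = {"BIN_A", "BIN_B"}        # hypothetical set of grippable bin models

class AlarmExit(RuntimeError):
    """Raised for the alarm-exit branches (step 16)."""

def run_unstacking(robot, fixed_camera, jaws, path_planner, outfeed_is_empty):
    robot.go_home()                                     # step 1
    detection = fixed_camera.detect_bins()              # step 2: photo, detect, choose, plan
    while detection.bins_present:                       # step 3: bins remain on the tray?
        target = detection.target_bin
        if target.model not in SUPPORTED_MODELS:        # step 4: model judgment
            raise AlarmExit("unsupported bin model")    # -> step 16
        jaws.configure_for(target.model)                # step 5: adjust grapple spacing
        jaws.open()                                     # step 6
        path = path_planner.wait_for_unstack_path()     # step 7: four segments expected
        if not path:                                    # step 8: failed or empty transmission
            raise AlarmExit("path transmission failed or empty")   # -> step 16
        robot.run_path(path, grasp_at_target=True)      # step 9: grasp bin, carry to Home
        detection = fixed_camera.detect_bins()          # step 10: next detection in advance
        while not outfeed_is_empty():                   # step 11: wait for the discharge port
            pass                                        # polling; a real system blocks on I/O
        robot.run_taught_path("place_on_roller_line")   # step 12
        jaws.open()                                     # step 13: release the bin
        robot.run_taught_path("go_home")                # step 14, then loop back to step 3
    return "unstacking complete"                        # step 15
```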
To achieve the same object, the invention also provides a stacking method for the visual guidance-based robot unstacking system; as shown in fig. 4, the stacking method comprises the following steps:
Step 1, robot GoHome: every stacking task starts from the taught Home position to ensure the safety of the robot during the stacking motion; if the robot is not currently at the Home position, it automatically moves to the Home position;
Step 2, the robot moves to the roller-line photographing position: the robot moves from the Home position to the roller-line photographing position, at which the incoming bin on the roller line lies within the field of view of the moving 3D camera on the side of the clamping jaw system;
Step 3, wait for the bin in-place signal: wait for the photoelectric switch signal indicating that a bin has arrived on the roller line; if there is no signal, no bin has been supplied and the system keeps waiting; otherwise, enter step 4;
Step 4, the moving camera detects the incoming bin: the moving 3D camera detects the model of the incoming bin, specifically in the following two sub-steps:
Step 4-1, the moving camera takes a photograph: the moving 3D camera is triggered and collects a 3D point cloud of the incoming bin;
Step 4-2, bin model detection: the identification and positioning module identifies the bin model from the collected bin 3D point cloud;
Step 5, model judgment: judge whether the model of the incoming bin is supported by the stacking system; if so, continue with step 6; otherwise, go to step 23 and exit with an alarm;
Step 6, set the target bin: the model of the first detected incoming bin is set as the stacking target bin model;
Step 7, the fixed camera detects the tray: the fixed 3D camera detects the size and spatial position of the tray and the stacking stack shape is planned, specifically in the following four sub-steps:
Step 7-1, the fixed camera takes a photograph: the fixed 3D camera is triggered and collects a 3D point cloud of the tray;
Step 7-2, tray detection: the identification and positioning module detects the size and position of the tray from the collected tray 3D point cloud data;
Step 7-3, set the target tray: the detected tray is set as the target tray for stacking;
Step 7-4, plan the bin stacking stack shape: the stacking stack shape is planned according to the set target bin and target tray, i.e. the number of rows, columns and layers of target bins on the target tray and the stacking position of each bin;
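Step 7-4 can be illustrated by the following sketch, which lays out the stacking stack shape as a regular grid of place positions derived from the detected tray and the set target bin (Python; the coordinate convention, the gap value and the use of a simple row-by-column grid are illustrative assumptions). Each slot carries a placed flag so that the stack-shape query of step 13-1 and the update of step 16 can operate on the same structure.

```python
def plan_stack_shape(tray_length, tray_width, tray_top_height,
                     bin_length, bin_width, bin_height,
                     layers, gap=0.01):
    """Return one slot per bin position in the stack, expressed in the tray frame.
    Rows and columns are derived from how many bins fit on the tray footprint."""
    cols = int((tray_length + gap) // (bin_length + gap))   # bins along the tray length
    rows = int((tray_width + gap) // (bin_width + gap))     # bins along the tray width
    slots = []
    for layer in range(layers):
        for row in range(rows):
            for col in range(cols):
                x = (col + 0.5) * (bin_length + gap)         # bin centre, tray-length axis
                y = (row + 0.5) * (bin_width + gap)          # bin centre, tray-width axis
                z = tray_top_height + (layer + 0.5) * bin_height
                slots.append({"position": (x, y, z), "placed": False})
    return slots
```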
Step 8, state judgment: based on the tray detection of step 7, judge whether a tray is present under the fixed 3D camera; if no tray is present, go to step 24 and exit with a no-tray alarm; otherwise, continue with step 9;
Step 9, adjust the bin clamping jaw configuration: according to the set target bin model, the spacing of the 4 grapples of the clamping jaw is adjusted so that the jaw can grasp a bin of that model;
Step 10, the robot moves to the bin grabbing position: the robot moves to the bin grabbing position along the taught material-taking path;
Step 11, the clamping jaw grips the bin: the clamping jaw system closes and grasps the bin;
Step 12, the robot runs the leaving path: the robot carries the bin and moves to the Home position;
Step 13, wait for the stacking path: the robot waits to receive the stacking path, specifically in the following two sub-steps:
Step 13-1, stacking stack shape query: the planned stack positions are queried one by one, and according to the principle of low-to-high and far-to-near, the lowest and farthest position not yet stacked is preferentially selected as the stacking position for the incoming bin (see the sketch following the path list below);
Step 13-2, plan the bin stacking path: the stacking position determined in step 13-1 is taken as the robot's target position, and a path from the Home position to the target stacking position is planned; the stacking path comprises the following four path segments:
Path 13-3-1, Place path: the path from the Home position to the approach position above the stack;
Path 13-3-2, Approach path: the path from the approach position to the stacking position;
Path 13-3-3, Leave path: the path from the stacking position back to the approach position;
Path 13-3-4, Transport path: the path from the approach position to the Home position.
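A minimal sketch of the stack-shape query of step 13-1 and the stack-shape update of step 16, operating on the slot list produced by the planning sketch above (Python; the robot-base origin used for the distance ordering is an assumption):

```python
def next_place_slot(slots, robot_xy=(0.0, 0.0)):
    """Step 13-1: among the slots not yet stacked, prefer the lowest one and,
    at equal height, the one farthest from the robot (low-to-high, far-to-near)."""
    free = [s for s in slots if not s["placed"]]
    if not free:
        return None                               # whole stack finished (step 17 -> step 22)
    def priority(slot):
        x, y, z = slot["position"]
        dist = ((x - robot_xy[0]) ** 2 + (y - robot_xy[1]) ** 2) ** 0.5
        return (z, -dist)                         # lowest first, then farthest
    return min(free, key=priority)

def mark_placed(slot):
    """Step 16: record that this slot has been stacked so the next query skips it."""
    slot["placed"] = True
```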
Step 14, path judgment: judge whether the robot has received the path and whether the received path is empty; if no path is received or the received path is empty, go to step 23 and exit with an alarm; otherwise, continue with step 15;
Step 15, execute the stacking path: the robot executes the stacking path and opens the clamping jaw to place the bin when it reaches the target stacking position;
Step 16, update the stacking stack shape: after the robot completes step 15, the corresponding position in the stack has been filled; it is marked as stacked so that it is skipped and the next free position is chosen when the stack shape is queried again;
Step 17, state judgment: judge whether stacking of the whole stack is completed; if so, go to step 22 and exit the stacking process; otherwise, continue with step 18 and keep executing the stacking cycle;
Step 18, the robot moves to the roller-line photographing position, the same as step 2;
Step 19, wait for the bin in-place signal, the same as step 3;
Step 20, the moving camera detects the incoming bin, the same as step 4;
Step 21, model judgment: note that in this step the model of the incoming bin is compared against the model of the set target stacking bin; if the models are consistent, return to step 10; if they are inconsistent, go to step 23 and exit with an alarm;
Step 22, stacking is completed and the process exits;
Step 23, alarm exit;
Step 24, no-tray alarm exit.
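By analogy with the unstacking loop sketched earlier, the stacking flow of steps 1 to 24 can be summarised as follows (Python; it reuses the hypothetical AlarmExit class and mark_placed helper from the earlier sketches, and robot, fixed_camera, moving_camera, jaws, path_planner, bin_arrived and supported_models are likewise hypothetical interfaces; the first incoming bin fixes the target model and every later incoming bin must match it):

```python
def run_stacking(robot, fixed_camera, moving_camera, jaws, path_planner,
                 bin_arrived, supported_models):
    robot.go_home()                                       # step 1
    robot.run_taught_path("roller_line_photo_position")   # step 2
    while not bin_arrived():                              # step 3: in-place photoelectric signal
        pass
    target_model = moving_camera.detect_bin_model()       # step 4
    if target_model not in supported_models:              # step 5: model judgment
        raise AlarmExit("unsupported incoming bin model")     # -> step 23
    # step 6: target_model now fixes the stacking target bin model
    tray = fixed_camera.detect_tray()                     # step 7: size, position, stack shape
    if tray is None:                                      # step 8: state judgment
        raise AlarmExit("no tray under the fixed camera")     # -> step 24
    slots = tray.stack_shape                              # planned in step 7-4
    while True:
        jaws.configure_for(target_model)                  # step 9: adjust grapple spacing
        robot.run_taught_path("bin_grasp_position")       # step 10
        jaws.close()                                      # step 11: grip the bin
        robot.run_taught_path("leave_to_home")            # step 12: carry the bin to Home
        path = path_planner.wait_for_stack_path(slots)    # step 13: query slot, plan path
        if not path:                                      # step 14: path judgment
            raise AlarmExit("path not received or empty")     # -> step 23
        robot.run_path(path, release_at_target=True)      # step 15: place the bin
        mark_placed(path.target_slot)                     # step 16: update the stack shape
        if all(s["placed"] for s in slots):               # step 17: whole stack complete?
            return "stacking complete"                    # step 22
        robot.run_taught_path("roller_line_photo_position")  # step 18
        while not bin_arrived():                          # step 19
            pass
        if moving_camera.detect_bin_model() != target_model:  # steps 20-21
            raise AlarmExit("incoming bin model inconsistent")    # -> step 23
```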
The technical effects of the invention are as follows:
1. Support for unstacking mixed-model bin stacks: in the hardware system, the spacing of the four grapples of the clamping jaw system is automatically adjustable; in the software, the identification and positioning module recognises the model of the target bin, obtains the corresponding spacing of the bin's grab holes, and transmits it to the clamping jaw system to adjust the grapple spacing, so that unstacking of bin stacks of multiple models is supported (a sketch of this mapping follows this list). For the stacking task, since the incoming bins arrive intermittently one by one, mixed stacking on a single stack cannot be realized; stacking of bins of different models is therefore realized by arranging a plurality of trays and planning a plurality of stacking stacks;
2. Safe robot path planning: in general, the tray to be stacked is transported under the fixed 3D camera by a forklift, and the accuracy of the forklift is difficult to control, so the tray position varies considerably, which produces a locally unstructured scene; in contrast, the scene from the robot Home position to the roller-line position is fixed. Therefore, for the fixed, unchanging planning scene the robot path is realized by traditional teaching (with the motion handled autonomously by the robot controller), while for the planning scene from the tray to the robot Home position, whose position changes, automatic planning by algorithm is adopted, i.e. the path planning module of the software system plans the complete robot motion path, thereby ensuring the safety of the robot's motion.
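As stated in technical effect 1 above, the identification and positioning module maps the recognised bin model to the spacing of that bin's grab holes and transmits it to the clamping jaw system. A minimal sketch of this mapping (Python; the spacing values and the jaw-controller command are hypothetical, not taken from the original description):

```python
# Hypothetical mapping from bin model to the spacing of its grab holes (metres),
# given along the bin length and the bin width.
GRAB_HOLE_SPACING = {
    "BIN_A": (0.56, 0.36),
    "BIN_B": (0.36, 0.26),
}

def configure_jaws(jaw_controller, bin_model):
    """Adjust the mutual distance of the four grapples to the grab-hole spacing
    of the recognised bin model before opening the jaws and grasping."""
    spacing = GRAB_HOLE_SPACING.get(bin_model)
    if spacing is None:
        raise ValueError(f"bin model {bin_model!r} is not grippable")  # alarm-exit branch
    jaw_controller.set_grapple_spacing(*spacing)      # hypothetical jaw command interface
```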
Summary of technical innovation points and technical key points
1. In the stacking process, robot path planning combines taught planning and automatic planning, ensuring the motion safety of the robot: the robot motion path is taught in the structured, fixed motion scene, while the path is planned automatically by algorithm where the position of the delivered tray changes, avoiding collisions and ensuring safe robot motion;
2. Through the unstacking process and the adjustable clamping jaw system, unstacking and stacking of mixed-model bins are realized.
Compared with retrieved prior art document 1, the inventiveness of the invention is embodied as follows:
Document 1 can realize automatic unstacking of workpieces of different specifications and sizes; the workpiece height is detected mainly by an external height sensor, and during unstacking each unstacked workpiece must be moved above a distance sensor so that its height can be measured and the robot's placing position calculated. In the present invention, by contrast, the different bin models are pre-stored in the system and the bin model is detected before unstacking to obtain the bin size information, so that no additional detection of the unstacking target's size information is required.
Compared with retrieved prior art document 2, the inventiveness of the invention is embodied as follows:
The automatic stacking and unstacking device of document 2 needs to shape the incoming material before stacking, realizing shaping and positioning through a shaping mechanism and trimming baffles. In the present invention, only the model of the incoming bin needs to be detected when stacking; robot stacking is guided by the moving 3D camera, different grabbing positions are taught in advance for the different bin models, and the robot moves to the corresponding grabbing position to grasp according to the judged bin model.
While the invention has been described above with reference to the accompanying drawings, it is apparent that the invention is not limited to the above embodiments; any insubstantial modification of the method concept and technical solution of the invention, whether it is applied to other occasions with modification or directly without modification, falls within the scope of protection of the invention.