CN111359915A - Material sorting method and system based on machine vision - Google Patents


Info

Publication number
CN111359915A
Authority
CN
China
Prior art keywords
grabbing
materials
robot
picture
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010215479.6A
Other languages
Chinese (zh)
Other versions
CN111359915B (en)
Inventor
莫卓亚
刘涛
刘元路
Current Assignee
Guangdong Gongye Technology Co Ltd
Original Assignee
Guangdong Gongye Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Gongye Technology Co Ltd filed Critical Guangdong Gongye Technology Co Ltd
Priority to CN202010215479.6A
Publication of CN111359915A
Application granted
Publication of CN111359915B
Active legal status
Anticipated expiration legal status


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B07: SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C: POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C 5/00: Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
    • B07C 5/02: Measures preceding sorting, e.g. arranging articles in a stream, orientating
    • B07C 5/34: Sorting according to other particular properties
    • B07C 5/342: Sorting according to optical properties, e.g. colour
    • B07C 5/3422: Sorting using video scanning devices, e.g. TV-cameras
    • B07C 5/36: Sorting apparatus characterised by the means used for distribution
    • B07C 5/361: Processing or control devices therefor, e.g. escort memory
    • B07C 5/362: Separating or distributor mechanisms
    • B07C 5/38: Collecting or arranging articles in groups
    • B07C 2501/00: Sorting according to a characteristic or feature of the articles or material to be sorted
    • B07C 2501/0063: Using robots
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/10: Terrestrial scenes

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a material sorting method and system based on machine vision. The method comprises the following steps: first, an RGB picture of the materials to be sorted on a conveying line is acquired by a camera; the picture is then fed into a deep learning model obtained through training, and the contour and material type of each material are obtained through an instance segmentation algorithm. Next, a grabbing position and grabbing posture for the current material are calculated from the contour, and the grabbing position, grabbing posture and material type are transmitted to a grabbing robot arranged near the conveying line. The grabbing robot adjusts its grabbing action according to the grabbing position and posture, executes the grab, and places the grabbed material into the corresponding partition according to the material type. With this method, materials of different shapes, and even stacked materials, are grabbed at the optimal grabbing position, which effectively improves the grabbing success rate.

Description

Material sorting method and system based on machine vision
Technical Field
The invention relates to the technical field of intelligent material sorting, in particular to a material sorting method and system based on machine vision.
Background
With the continuous development of industry, the quantity and variety of garbage generated in production and daily life keep increasing, so garbage must be classified when it is treated for recycling. In the garbage classification and recovery industry, machine vision and robots are generally used for online automatic sorting: as garbage flows along a conveying line, a camera on the equipment photographs it, and the picture is then recognized by deep learning. However, the recognition generally uses object detection, which can only identify the rectangular box around a material and takes the center of that box as the grabbing position. In real scenes, garbage is irregularly shaped and stacking is common, so when the center of the detected rectangular box is used as the grabbing target, grabbing is often inaccurate and cannot meet the needs of actual use.
Disclosure of Invention
The invention aims to provide a material sorting method based on machine vision, which is used for grabbing and sorting materials according to actual profiles and material types of the materials and improving sorting efficiency.
Another object of the present invention is to provide a material sorting system based on machine vision to perform grabbing and sorting work according to the actual profile and material type of the material, so as to improve the sorting efficiency.
In order to achieve this purpose, the invention discloses a material sorting method based on machine vision, which comprises: obtaining an RGB picture of the materials to be sorted on a conveying line through a camera, then feeding the picture into a deep learning model obtained through training, and obtaining the contour and material type of each material through an instance segmentation algorithm;
calculating a grabbing position and a grabbing posture for the current material according to the contour, and transmitting the grabbing position, grabbing posture and material type to a grabbing robot arranged near the conveying line;
and the grabbing robot adjusts the grabbing action of the material according to the grabbing position and the grabbing posture, executes the grabbing work of the material, and places the grabbed material into a corresponding subarea according to the material type.
Compared with the prior art, the machine-vision-based material sorting method processes the RGB pictures of the materials through an instance segmentation algorithm in a deep learning model to obtain the specific contour and material type of each material, and calculates the corresponding grabbing position and posture from the contour. When the grabbing robot performs a grab, it adjusts the grabbing point and grabbing direction according to this calculated position and posture, so materials of different shapes, and even stacked materials, are grabbed at the optimal grabbing position, which effectively improves the grabbing success rate.
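The patent does not state how the grabbing position and posture are computed from the contour; one common approach, shown here purely as an illustrative sketch, takes the contour centroid as the grabbing position and the principal axis of the contour points as the grabbing direction. The function name and the point-list representation are assumptions, not from the patent.

```python
import math

def grasp_pose(contour):
    """Estimate a grabbing position and orientation from a material contour.

    contour: list of (x, y) points sampled along the material outline.
    Returns (cx, cy, angle_rad): (cx, cy) is the centroid, used as the
    grabbing center, and angle_rad is the principal-axis angle, i.e. the
    long direction of the object that a gripper would align with.
    """
    n = len(contour)
    cx = sum(p[0] for p in contour) / n
    cy = sum(p[1] for p in contour) / n
    # Second-order central moments of the contour points.
    mxx = sum((p[0] - cx) ** 2 for p in contour) / n
    myy = sum((p[1] - cy) ** 2 for p in contour) / n
    mxy = sum((p[0] - cx) * (p[1] - cy) for p in contour) / n
    # Principal-axis angle from the moment tensor.
    angle = 0.5 * math.atan2(2 * mxy, mxx - myy)
    return cx, cy, angle
```

For an axis-aligned elongated rectangle the centroid falls at its center and the angle is zero; a rotated object yields a correspondingly rotated grabbing direction.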
Preferably, the grabbing robot monitors in real time the dynamic position on the conveying line of the materials photographed by the camera. A virtual grabbing area corresponding to the conveying line is configured in the grabbing robot to delimit the robot's working range on the line, and the robot dynamically adjusts the speed of the conveying line according to its own working speed and the quantity of material entering the grabbing area.
Preferably, the grabbing area comprises an upper limit located upstream and a lower limit located downstream, and the method for dynamically adjusting the speed of the conveying line comprises:
when there is no material in the grabbing area, controlling the conveying line to run at a faster first speed V0;
when material is detected in the grabbing area, controlling the conveying line to run at a slower second speed V1, calculated as
V1 = L / (N × T)
where T is the average time the grabbing robot needs for each grab, L is the distance from the grabbing center of the material in the grabbing area closest to the upper limit to the lower limit, and N is the number of materials in the grabbing area.
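The formula image did not survive extraction; from the variable definitions (T, L, N), one plausible reading is V1 = L / (N × T): the line may advance at most the distance L while the robot completes N grabs taking N × T in total. A minimal sketch under that assumption, with illustrative names:

```python
def conveyor_speed(v0, n_materials, l_distance, t_grab):
    """Dynamic conveying-line speed rule (reconstructed reading).

    v0: faster first speed V0 used when the grabbing area is empty.
    n_materials: N, number of materials currently in the grabbing area.
    l_distance: L, distance from the grabbing center of the material
        nearest the upper limit to the lower limit of the area.
    t_grab: T, average time the robot needs per grab.
    """
    if n_materials == 0:
        return v0  # empty grabbing area: run at the faster speed
    # The line may travel at most L while the robot performs N grabs.
    return l_distance / (n_materials * t_grab)
```

Each completed grab reduces N (and changes L), so the speed is recomputed after every grab, as the detailed description explains.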
Preferably, a priority adjustment area is further arranged in the grabbing area, and the grabbing priority of the material entering the priority adjustment area is dynamically adjusted according to the material type of the material.
Preferably, the priority adjustment area comprises a first boundary line and a second boundary line separated by a set distance, the second boundary line lying downstream of the first, with the second boundary line aligned with the grabbing center of the foremost material in the grabbing area.
Preferably, there are two cameras, a main camera and an auxiliary camera; the main camera is configured to be sensitive to light-colored materials and the auxiliary camera to dark-colored materials. After sorting starts, the main and auxiliary cameras photograph the materials on the conveying line simultaneously, a first picture is obtained from the main camera and a second picture from the auxiliary camera, and the grabbing information currently to be transmitted to the grabbing robot, comprising the grabbing position, grabbing posture and material type, is obtained through a comparison and judgment method comprising the following steps:
processing the first picture through the instance segmentation algorithm to obtain the contours of all materials in the first picture and the grabbing information corresponding to each contour, and storing the grabbing information obtained from the first picture in a queue to be input;
processing the second picture through the instance segmentation algorithm to obtain the contours of the materials in the second picture and the grabbing information corresponding to each contour;
comparing the contour of each material obtained from the second picture with the contours of all materials obtained from the first picture to judge whether they overlap;
if not, adding the grabbing information corresponding to the currently compared contour in the second picture to the queue to be input;
if so, selecting the grabbing information corresponding to one of the pair of mutually overlapping contours and adding it to the queue to be input;
and transmitting the grabbing information in the queue to be input to the grabbing robot.
Preferably, in the comparison and judgment method, if the pair of contours currently being compared overlap each other, the grabbing information corresponding to the contour with the higher recognition reliability is selected and added to the queue to be input.
The invention also discloses a material sorting system based on machine vision, which comprises a camera, a conveying line, a grabbing robot and a control processing system;
the conveying line is used for conveying materials to be sorted;
the camera is used for shooting RGB pictures of the materials to be sorted on the conveying line;
the control processing system comprises a control module, an image processing module and a computing module, wherein the control module is electrically connected with the camera, the grabbing robot, the image processing module and the computing module;
the image processing module is used for feeding the picture into a deep learning model obtained through training and obtaining the contour and material type of the material through an instance segmentation algorithm;
the calculation module is used for calculating the current grabbing position and posture of the material according to the profile of the material output by the image processing module;
the grabbing robot is used for adjusting its grabbing action on the material according to the grabbing position and posture calculated by the calculation module and executing the grab, and is further used for placing the grabbed material into the corresponding partition according to the material type.
Preferably, the grabbing robot is further electrically connected with a driver of the conveying line to read and control the motion state of the conveying line in real time and to monitor in real time the dynamic position on the conveying line of the materials photographed by the camera. A virtual grabbing area corresponding to the conveying line is configured in the grabbing robot to delimit the robot's working range on the line, and the robot dynamically adjusts the speed of the conveying line according to its own working speed and the quantity of material entering the grabbing area.
Preferably, the grabbing area comprises an upper limit located upstream and a lower limit located downstream. When there is no material in the grabbing area, the grabbing robot controls the conveying line to run at a faster first speed V0; when there is material in the grabbing area, the grabbing robot controls the conveying line to run at a slower second speed V1, calculated as
V1 = L / (N × T)
where T is the average time the grabbing robot needs for each grab, L is the distance from the grabbing center of the material in the grabbing area closest to the upper limit to the lower limit, and N is the number of materials in the grabbing area.
Preferably, a priority adjustment area is further arranged in the grabbing area, and the grabbing robot dynamically adjusts the grabbing priority of the materials entering the priority adjustment area according to the material types of the materials.
Preferably, the priority adjustment area comprises a first boundary line and a second boundary line separated by a set distance, the second boundary line lying downstream of the first, with the second boundary line aligned with the grabbing center of the foremost material in the grabbing area.
Preferably, there are two cameras, a main camera and an auxiliary camera; the main camera is configured to be sensitive to light-colored materials and the auxiliary camera to dark-colored materials. After sorting starts, the main and auxiliary cameras photograph the materials on the conveying line simultaneously, a first picture being obtained from the main camera and a second picture from the auxiliary camera. The control processing system further comprises a comparison and judgment module electrically connected with the control module; the control module acquires the grabbing information currently to be transmitted to the grabbing robot from the queue to be input generated by the comparison and judgment module, the grabbing information comprising the grabbing position, grabbing posture and material type;
after the image processing module processes the first picture, the contours of all materials in the first picture and the grabbing information corresponding to each contour are obtained, and the grabbing information obtained from the first picture is stored in the queue to be input;
after the image processing module processes the second picture, the contours of all materials in the second picture and the grabbing information corresponding to each contour are obtained;
the comparison and judgment module is used for comparing the contour of each material obtained from the second picture with the contours of all materials obtained from the first picture and judging whether they overlap;
if not, the grabbing information corresponding to the currently compared contour in the second picture is added to the queue to be input;
if so, the grabbing information corresponding to one of the pair of mutually overlapping contours is selected and added to the queue to be input.
Preferably, in the comparison and judgment module, if the pair of contours currently being compared overlap each other, the grabbing information corresponding to the contour with the higher recognition reliability is selected and added to the queue to be input.
The invention also discloses a material sorting system based on machine vision, which comprises:
one or more processors;
a memory;
and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs comprising instructions for performing the machine vision-based material sorting method as described above.
The present invention also discloses a computer readable storage medium comprising a computer program executable by a processor to perform the machine-vision-based material sorting method as described above.
Drawings
Fig. 1 is a schematic structural diagram of a material sorting system based on machine vision according to an embodiment of the present invention.
Fig. 2 is a schematic view of the grabbing area in the grabbing robot according to an embodiment of the invention.
Fig. 3 is a schematic diagram of the priority adjustment area in the grabbing robot according to an embodiment of the present invention.
Fig. 4 is a schematic view of a processing flow of taking photos by the main camera and the auxiliary camera in the embodiment of the present invention.
Detailed Description
In order to explain technical contents, structural features, and objects and effects of the present invention in detail, the following detailed description is given with reference to the accompanying drawings in conjunction with the embodiments.
As shown in fig. 1, the invention discloses a material sorting system based on machine vision, which comprises a camera 2, a conveying line 4, a grabbing robot 3 and a control processing system 1. The conveying line 4 conveys the materials to be sorted, and the camera 2 shoots RGB pictures of the materials to be sorted on the conveying line 4. The control processing system 1 comprises a control module 10, an image processing module 11 and a calculation module 12, wherein the control module 10 is electrically connected with the camera 2, the grabbing robot 3, the image processing module 11 and the calculation module 12. The image processing module 11 is configured to feed the picture into a deep learning model obtained through training and to obtain the material contour and material type through an instance segmentation algorithm. The calculation module 12 is used for calculating the grabbing position and posture of the current material according to the contour output by the image processing module 11.
The material sorting system works as follows. The material to be sorted is placed on the conveying line 4. When the material moves under the camera 2, the control module 10 triggers the camera 2 to photograph it, obtaining an RGB picture that is passed to the image processing module 11. The image processing module 11 processes the picture with an instance segmentation algorithm (such as Mask R-CNN) to obtain the specific contour and material type of each material. The calculation module 12 then uses the contour data to calculate the grabbing position and posture, and the control module 10 transmits the material type together with the calculated grabbing position (i.e. the grabbing center on the material), posture and related data to the grabbing robot 3. The robot adjusts its grabbing action according to this data and executes the grab; once the material is grabbed, the robot places it into the corresponding partition according to the material type, completing the sorting. With this system, materials of different shapes, and even stacked materials, are grabbed at the optimal grabbing position, which effectively improves the grabbing success rate.
In another preferred embodiment of the material sorting system, as shown in fig. 1 and fig. 2, the grabbing robot 3 is electrically connected to the driver 40 of the conveying line 4 to read and control the conveying speed in real time, and it monitors in real time the dynamic position on the conveying line 4 of the materials photographed by the camera 2. A virtual grabbing area Q corresponding to the conveying line 4 is configured in the grabbing robot 3 to delimit the robot's working range on the line, and the robot dynamically adjusts the speed of the conveying line 4 according to its own working speed and the quantity of material entering the grabbing area Q. In this embodiment, the driver 40 is a variable-frequency motor that drives the conveying line 4, and the grabbing robot 3 is electrically connected to the motor's frequency converter, through which it controls the speed of the conveying line 4.
Since the grabbing robot 3 can read the frequency converter's count in real time, each time the camera 2 completes a shot the control module 10 has the robot latch the converter count at that moment; the current position of the material on the conveying line 4 is then tracked in real time from the subsequent changes in that count. In addition, the control module 10 transmits the grabbing information (grabbing position, posture and material type) to the grabbing robot 3 in real time and stores it in the robot's processing queue. The robot can therefore compute the number of materials currently in the grabbing area Q from the number of entries in the processing queue and the real-time converter count, and dynamically adjust the conveying speed accordingly.
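The position tracking described above can be sketched as dead reckoning from the converter count latched at photo time. The function name and the calibration constant `mm_per_count` (distance the belt advances per count increment) are illustrative assumptions; the patent only says the count is read and positions follow from its changes.

```python
def material_position(x_at_photo, count_at_photo, count_now, mm_per_count):
    """Track a material's position on the line from converter counts.

    x_at_photo: position (mm along the line) where the camera saw the
        material; count_at_photo: converter count latched at that moment;
    count_now: current count; mm_per_count: hypothetical calibration
        constant, mm of belt travel per count increment.
    """
    return x_at_photo + (count_now - count_at_photo) * mm_per_count
```

Comparing each tracked position against the grabbing area's upper and lower limits gives the number N of materials currently inside the area.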
In particular, as shown in fig. 2, with respect to the travel direction F of the conveying line, the grabbing area Q comprises an upper limit Q1 upstream and a lower limit Q2 downstream. When there is no material in the grabbing area Q, the grabbing robot 3 controls the conveying line 4 to run at a faster first speed V0; when material is detected in the grabbing area Q, the robot controls the line to run at a slower second speed V1, calculated as
V1 = L / (N × T)
where T is the average time taken by the grabbing robot 3 for each grab, L is the distance from the grabbing center of the material P1 closest to the upper limit Q1 to the lower limit Q2, and N is the number of materials in the grabbing area Q (three in the figure). Each time the grabbing robot 3 completes a grab, one entry is removed from its processing queue; N and L then change, and the conveying speed of the conveying line 4 is adjusted again.
In actual work, some of the materials to be sorted (such as garbage) have higher value, for example pop cans and HD-PE items, while others, such as plastic bags, have lower value. If the grabbing robot 3 grabs strictly in arrival order, grabbing whatever flows past first, then when there is too much material for the robot to grab everything, there is no guarantee that the high-value materials are grabbed preferentially, which reduces the economic value of the sorting. As a further improvement, shown in fig. 3, a priority adjustment area Y is therefore arranged within the grabbing area Q, and the grabbing robot 3 dynamically adjusts the grabbing priority of the materials entering the priority adjustment area Y according to their material type. Preferably, the priority adjustment area Y comprises a first boundary line Y1 and a second boundary line Y2 separated by a set distance d, the second boundary line Y2 lying downstream of the first, with Y2 aligned with the grabbing center of the foremost material P2 in the grabbing area Q. In this embodiment, the grabbing robot 3 adjusts the grabbing priority as follows:
a) sorting the material types to be sorted, in this embodiment, sorting according to economic value;
b) when material has been conveyed into the region corresponding to the grabbing area Q of the grabbing robot 3, before each grabbing action, the grabbing position of the foremost material in the grabbing area Q is taken as the second boundary line Y2, and Y2 shifted upstream by the set distance d gives the first boundary line Y1, so that the region between Y1 and Y2 is the sorting priority adjustment area Y;
c) all materials in the priority adjustment area Y are found, and their grabbing priority is sorted by the economic value of their material type; if 2 or more materials of the same type are in the priority adjustment area Y, they are further sorted within that type by the area enclosed by their contours; after sorting, the grabbing order of the corresponding materials in the queue to be processed is updated;
d) the grabbing priority is adjusted in this way before every grab, ensuring that more of the high-value material types are grabbed and improving the economic value of the sorting equipment.
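Steps a) to c) amount to a two-level sort key: economic-value rank of the material type first, contour area within a type second. A minimal sketch; the dict fields and the rank table are illustrative, not from the patent.

```python
def prioritize(materials, value_rank):
    """Order materials in the priority adjustment area for grabbing.

    materials: list of dicts with 'type' (material type) and 'area'
        (area enclosed by the material's contour).
    value_rank: dict mapping material type -> economic-value rank,
        where a lower rank means higher value and is grabbed first.
    Within one type, the larger contour area is grabbed first.
    """
    return sorted(materials,
                  key=lambda m: (value_rank[m["type"]], -m["area"]))
```

Running this before each grab and rewriting the robot's queue to be processed reproduces the update in step c).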
In the sorting process, some materials, such as garbage, vary in form and come in many colors, while the conveying line 4 is usually dark-colored. When the material categories to be sorted include both dark colors (including black) and light colors, a single camera 2 with one set of camera parameters cannot simultaneously satisfy contour recognition of both dark and light materials, and some colors will inevitably fail to be recognized accurately. Therefore, in another preferred embodiment of the material sorting system, as shown in fig. 1, two cameras 2 are provided, a main camera 20 and an auxiliary camera 21; the main camera 20 is configured to be sensitive to light-colored materials and the auxiliary camera 21 to dark-colored materials. The control processing system 1 further comprises a comparison and judgment module 13 electrically connected to the control module 10, and the control module 10 obtains the grabbing information currently to be transmitted to the grabbing robot 3 from the queue to be input generated by the comparison and judgment module 13. In this embodiment, the grabbing information is generated as follows:
as shown in fig. 4, after the sorting operation starts, the main camera 20 and the auxiliary camera 21 are used to photograph the material on the conveying line 4 at the same time, so as to obtain a first picture from the main camera 20 and a second picture from the auxiliary camera 21;
processing the first picture through the image processing module 11 to obtain the outlines of the materials in the first picture and the grabbing information corresponding to each outline, and storing the grabbing information obtained from the first picture into a queue to be input;
processing the second picture through the image processing module 11 to obtain the outlines of the materials in the second picture and the grabbing information corresponding to each outline;
and comparing the profile of any material obtained from the second picture with the profiles of all materials obtained from the first picture to judge whether the profiles overlap. If they do not overlap, the grabbing information corresponding to the currently compared profile in the second picture is added to the queue to be input; if they do overlap, the grabbing information corresponding to one of the pair of overlapping profiles is selected, added to the queue to be input, and the queue to be input is updated. For example, suppose processing the first picture yields the profiles of four materials, X1, X2, X3 and X4, and processing the second picture yields the profiles of three materials, Y1, Y2 and Y3. Y1 is first compared with X1, X2, X3 and X4 in turn; if none of these profiles overlap, the grabbing information of the material corresponding to Y1 is added to the queue to be input. Y2 is then compared with X1, X2, X3 and X4 in turn; if Y2 overlaps X3, either the grabbing information corresponding to X3 or that corresponding to Y2 may be kept in the queue to be input, i.e. the grabbing information corresponding to Y2 may replace the grabbing information corresponding to X3. Finally, Y3 is compared with X1, X2, X3 and X4 in turn. Preferably, when a pair of currently compared contours overlap each other, the grabbing information corresponding to the contour with the higher recognition reliability is selected and added to the queue to be input. For example, where Y2 overlaps X3, if the recognition reliability of Y2 is 0.8 and that of X3 is 0.7, the grabbing information corresponding to Y2 replaces the grabbing information corresponding to X3.
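The comparison and judgment procedure described above can be sketched in Python. This is an illustrative reconstruction, not the patented implementation: contours are simplified to axis-aligned bounding boxes, "overlap" is approximated by box intersection, and all names (`Detection`, `merge_queues`, etc.) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One recognized material: its contour (reduced here to a bounding
    box), the grabbing information derived from it, and the recognition
    reliability of the contour."""
    box: tuple          # (x_min, y_min, x_max, y_max)
    grab_info: dict     # grabbing position, grabbing attitude, material type
    reliability: float  # recognition confidence, e.g. 0.0 .. 1.0

def boxes_overlap(a, b):
    """True if two axis-aligned boxes intersect."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

def merge_queues(first, second):
    """Build the queue to be input from main-camera detections (first
    picture) and auxiliary-camera detections (second picture). Each
    second-picture contour is compared against every first-picture
    contour; non-overlapping ones are appended, and overlapping pairs
    keep whichever side has the higher recognition reliability."""
    queue = list(first)            # all first-picture grab info starts queued
    extras = []
    for y in second:
        overlapped = None
        for i, x in enumerate(first):
            if boxes_overlap(y.box, x.box):
                overlapped = i
                break
        if overlapped is None:
            extras.append(y)       # material seen only by the auxiliary camera
        elif y.reliability > queue[overlapped].reliability:
            queue[overlapped] = y  # e.g. Y2 (0.8) replaces X3 (0.7)
        # else: keep the first-picture entry
    return [d.grab_info for d in queue + extras]
```

In the X/Y example above, Y2 (reliability 0.8) overlapping X3 (reliability 0.7) would replace X3's grabbing information in the queue, while Y1, which overlaps nothing, is simply appended.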
The present invention also discloses another machine vision-based material sorting system comprising one or more processors, memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs comprising instructions for performing the machine vision-based material sorting method as described above.
In addition, the invention also discloses a computer-readable storage medium comprising a computer program, which can be executed by a processor to complete the material sorting method based on machine vision.
In summary, the invention discloses a material sorting system based on machine vision: first, the picture taken by the camera 2 is processed by an instance segmentation algorithm (such as Mask R-CNN) to obtain the profile and material type of each material to be sorted, and the grabbing position and grabbing attitude for the material are calculated accurately from the profile data; secondly, dynamic adjustment of the conveying speed of the conveying line 4 not only reduces the miss-sorting rate but also effectively improves sorting efficiency; then, by providing the main camera 20 and the auxiliary camera 21, both dark-color and light-color materials can be sorted effectively, further reducing the miss-sorting rate; in addition, the sorting rate for heavy objects can be effectively improved by the dynamically adjusted priority adjusting area Y.
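As a rough illustration of how a grabbing position and attitude might be computed from profile data, the sketch below takes the centroid of the contour points as the grab position and the orientation of their principal axis as the grab angle. This is an assumed simplification for flat materials; the patent does not fix a particular pose-estimation formula, and the function name is hypothetical.

```python
import math

def grab_pose(contour):
    """contour: list of (x, y) points along the material's outline.
    Returns (cx, cy, angle_deg): the centroid of the points as the
    grabbing position, and the orientation of their principal axis
    (degrees from the x-axis) as the grabbing angle."""
    n = len(contour)
    cx = sum(p[0] for p in contour) / n
    cy = sum(p[1] for p in contour) / n
    # Second-order central moments of the contour point set.
    mxx = sum((p[0] - cx) ** 2 for p in contour) / n
    myy = sum((p[1] - cy) ** 2 for p in contour) / n
    mxy = sum((p[0] - cx) * (p[1] - cy) for p in contour) / n
    # Principal-axis orientation via the standard image-moment formula.
    angle = 0.5 * math.atan2(2 * mxy, mxx - myy)
    return cx, cy, math.degrees(angle)
```

A contour elongated along the belt direction yields an angle near 0°, so the gripper would be rotated to straddle the material's short side.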
The above disclosure is only a description of preferred embodiments of the present invention and certainly cannot be used to limit the scope of the rights of the present invention; therefore, equivalent changes made according to the claims of the present invention still fall within the scope covered by the invention.

Claims (16)

1. A material sorting method based on machine vision, characterized in that RGB (red, green and blue) pictures of materials to be sorted on a conveying line are obtained through a camera, the pictures are then imported into a deep learning model obtained through training, and the outlines and material types of the materials are obtained through an instance segmentation algorithm;
calculating a grabbing position and a grabbing attitude corresponding to the current material according to the profile, and transmitting the grabbing position, the grabbing attitude and the material type to a grabbing robot arranged near the conveying line;
and the grabbing robot adjusts the grabbing action of the material according to the grabbing position and the grabbing posture, executes the grabbing work of the material, and places the grabbed material into a corresponding subarea according to the material type.
2. The material sorting method based on the machine vision is characterized in that the grabbing robot monitors the dynamic position of the material photographed by the camera on the conveying line in real time, a virtual grabbing area corresponding to the conveying line is arranged in the grabbing robot, the grabbing area is used for limiting the working range of the grabbing robot on the conveying line, and the grabbing robot dynamically adjusts the speed of the conveying line according to the working speed of the grabbing robot and the quantity of the material entering the grabbing area.
3. The machine vision-based material sorting method of claim 2, wherein the grabbing area includes an upper limit upstream and a lower limit downstream, and the method of dynamically adjusting the speed of the conveying line includes:
when no material exists in the grabbing area, controlling the conveying line to run at a first speed V0;
when material is detected in the grabbing area, controlling the conveying line to run at a second, slower speed V1, where V1 is calculated by the formula
V1 = L / (T × N)
wherein T is the average time used by the grabbing robot to grab a material each time, L is the distance from the grabbing position of the material in the grabbing area closest to the upper limit to the lower limit, and N is the number of materials in the grabbing area.
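The speed formula recited in claim 3 can be checked with a short numeric sketch: running the line at V1 = L / (T × N) gives the robot exactly enough time for N grabs of T seconds each before the material nearest the upper limit travels the distance L to the lower limit. The function name and example values below are illustrative only.

```python
def conveyor_speed(v0, t_grab, dist_l, n_materials):
    """Commanded line speed: V0 when the grabbing area is empty,
    otherwise V1 = L / (T * N), so that the robot's N grabs (T seconds
    each) finish just as the material nearest the upper limit has
    travelled the distance L to the lower limit."""
    if n_materials == 0:
        return v0                           # first, faster speed V0
    return dist_l / (t_grab * n_materials)  # second, slower speed V1

# Illustrative numbers: 2 s per grab, L = 1.2 m, 3 materials queued.
```

With these numbers V1 = 1.2 / (2 × 3) = 0.2 m/s, well below a typical empty-belt speed V0, so no queued material escapes past the lower limit.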
4. The material sorting method based on machine vision according to claim 2, characterized in that a priority adjusting area is further arranged in the grabbing area, and the grabbing priority of the material entering the priority adjusting area is dynamically adjusted according to the material type of the material.
5. The machine vision-based material sorting method of claim 4, wherein the priority adjustment area comprises a first boundary line and a second boundary line, the first boundary line and the second boundary line have a set distance therebetween, the second boundary line is located in front of the first boundary line, and the second boundary line is based on the grabbing center of the forwardmost material in the grabbing area.
6. The material sorting method based on machine vision according to claim 1, wherein there are two cameras, which are respectively a main camera and an auxiliary camera, the main camera is set to be sensitive to light-color materials, the auxiliary camera is set to be sensitive to dark-color materials, after sorting work is started, the main camera and the auxiliary camera are used for simultaneously taking pictures of materials on the conveying line, a first picture is obtained from the main camera, a second picture is obtained from the auxiliary camera, and then grabbing information to be transmitted to the grabbing robot at present is obtained through a comparison judgment method, the grabbing information includes the grabbing position, the grabbing attitude and the material type, and the comparison judgment method includes:
processing the first picture through the instance segmentation algorithm to obtain the outlines of all materials in the first picture and the grabbing information corresponding to each outline, and storing the grabbing information obtained from the first picture into a queue to be input;
processing the second picture through the instance segmentation algorithm to obtain the outlines of the materials in the second picture and the grabbing information corresponding to each outline;
comparing the profile of any material obtained from the second picture with the profiles of all materials obtained from the first picture respectively to judge whether there is overlap,
if not, adding the grabbing information corresponding to the currently compared outline in the second picture into the queue to be input,
if so, selecting and using the grabbing information corresponding to one of a pair of comparison objects with mutually overlapped outlines to add into the queue to be input;
and transmitting the grabbing information in the queue to be input to the grabbing robot.
7. The machine vision-based material sorting method according to claim 6, wherein, in the comparison and judgment method, if a pair of contours of a current comparison object overlap each other, the grabbing information corresponding to the contour with the higher recognition reliability is selected and added to the queue to be input.
8. A material sorting system based on machine vision is characterized by comprising a camera, a conveying line, a grabbing robot and a control processing system;
the conveying line is used for conveying materials to be sorted;
the camera is used for shooting RGB pictures of the materials to be sorted on the conveying line;
the control processing system comprises a control module, an image processing module and a computing module, wherein the control module is electrically connected with the camera, the grabbing robot, the image processing module and the computing module;
the image processing module is used for importing the pictures into a deep learning model obtained through training, and obtaining the outline and the material type of the material through an instance segmentation algorithm;
the calculation module is used for calculating the current grabbing position and posture of the material according to the profile of the material output by the image processing module;
the grabbing robot is used for adjusting grabbing actions on the materials according to the grabbing positions and the grabbing postures of the materials calculated by the calculating module and executing grabbing work on the materials, and the grabbing robot is further used for placing the grabbed materials into corresponding subareas according to the material types calculated by the calculating module.
9. The material sorting system based on machine vision according to claim 8, wherein the grabbing robot is further electrically connected with a driver of the conveying line to read and control a motion state of the conveying line in real time, and monitor a dynamic position of the material photographed by the camera on the conveying line in real time, a virtual grabbing area corresponding to the conveying line is further provided in the grabbing robot, the grabbing area is used for limiting a working range of the grabbing robot on the conveying line, and the grabbing robot dynamically adjusts a speed of the conveying line according to a working speed of the grabbing robot and a quantity of the material entering the grabbing area.
10. The machine vision-based material sorting system of claim 9, wherein the grabbing area includes an upper limit upstream and a lower limit downstream; when no material exists in the grabbing area, the grabbing robot controls the conveying line to run at a first, faster speed V0; when material exists in the grabbing area, the grabbing robot controls the conveying line to run at a second, slower speed V1, calculated by the formula
V1 = L / (T × N)
wherein T is the average time used by the grabbing robot to grab a material each time, L is the distance from the grabbing position of the material in the grabbing area closest to the upper limit to the lower limit, and N is the number of materials in the grabbing area.
11. The machine vision-based material sorting system according to claim 9, wherein a priority adjustment area is further provided in the gripping area, and the gripping robot dynamically adjusts the gripping priority of the material entering the priority adjustment area according to the material type of the material.
12. The machine vision-based material sorting system of claim 11, wherein the priority adjustment area comprises a first boundary line and a second boundary line, the first boundary line and the second boundary line have a set distance therebetween, the second boundary line is located in front of the first boundary line, and the second boundary line is based on the grabbing center of the forwardmost material in the grabbing area.
13. The machine-vision-based material sorting system of claim 8, wherein the number of the cameras is two, namely a main camera and an auxiliary camera, the main camera is set to be sensitive to light-color materials, the auxiliary camera is set to be sensitive to dark-color materials, after the sorting work is started, the materials on the conveying line are photographed by the main camera and the auxiliary camera simultaneously, a first picture is obtained from the main camera, and a second picture is obtained from the auxiliary camera; the control processing system further comprises a comparison judgment module electrically connected with the control module, the control module acquires the grabbing information to be transmitted to the grabbing robot currently from the queue to be input generated by the comparison judgment module, and the grabbing information comprises the grabbing position, the posture and the material type;
after the image processing module processes the first picture, obtaining the outlines of all materials in the first picture and the grabbing information corresponding to each outline, and storing the grabbing information obtained from the first picture into the queue to be input;
after the image processing module processes the second picture, obtaining the outlines of all materials in the second picture and the grabbing information corresponding to each outline;
the comparison and judgment module is used for comparing the profile of any material obtained from the second picture with the profiles of all materials obtained from the first picture and judging whether the materials are overlapped,
if not, adding the grabbing information corresponding to the currently compared outline in the second picture into the queue to be input,
if yes, selecting and using the grabbing information corresponding to one of a pair of comparison objects with mutually overlapped outlines to add into the queue to be input.
14. The machine vision-based material sorting system of claim 13, wherein, in the comparison and judgment module, if a pair of contours of a current comparison object overlap each other, the grabbing information corresponding to the contour with the higher recognition reliability is selected and added to the queue to be input.
15. A machine vision based material sorting system, comprising:
one or more processors;
a memory;
and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs comprising instructions for performing the machine vision based material sorting method of any of claims 1 to 7.
16. A computer-readable storage medium comprising a test computer program executable by a processor to perform the machine vision-based material sorting method of any one of claims 1 to 7.
CN202010215479.6A 2020-03-24 2020-03-24 Material sorting method and system based on machine vision Active CN111359915B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010215479.6A CN111359915B (en) 2020-03-24 2020-03-24 Material sorting method and system based on machine vision

Publications (2)

Publication Number Publication Date
CN111359915A true CN111359915A (en) 2020-07-03
CN111359915B CN111359915B (en) 2022-05-24

Family

ID=71200938

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111899629A (en) * 2020-08-04 2020-11-06 菲尼克斯(南京)智能制造技术工程有限公司 Flexible robot teaching system and method
CN112077842A (en) * 2020-08-21 2020-12-15 上海明略人工智能(集团)有限公司 Clamping method, clamping system and storage medium
CN113877836A (en) * 2021-11-05 2022-01-04 江苏昱博自动化设备有限公司 Intelligent recognition sorting system based on visual detection system
CN114273240A (en) * 2020-09-27 2022-04-05 深圳顺丰泰森控股(集团)有限公司 Express delivery single piece separation method, device and system and storage medium
CN116548696A (en) * 2023-05-06 2023-08-08 深圳市长林自动化设备有限公司 Press fit detection device and detection method thereof
CN117330508A (en) * 2023-10-17 2024-01-02 苏州鞍利智能科技有限公司 High-precision imaging detector and application thereof in environment detection

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100714926B1 (en) * 2006-10-02 2007-05-04 (주) 현암바씨스 Apparatus for taking away garbage and managing system using the same
CN109540105A (en) * 2017-09-22 2019-03-29 北京印刷学院 A kind of courier packages' grabbing device and grasping means based on binocular vision
CN109658413A (en) * 2018-12-12 2019-04-19 深圳前海达闼云端智能科技有限公司 A kind of method of robot target grasping body position detection
CN110743818A (en) * 2019-11-29 2020-02-04 苏州嘉诺环境工程有限公司 Garbage sorting system and garbage sorting method based on vision and deep learning

Similar Documents

Publication Publication Date Title
CN111359915B (en) Material sorting method and system based on machine vision
WO2021036824A1 (en) Information collection device and method, inspection robot and storage medium
CN106737664B (en) Delta robot control method and system for sorting multiple types of workpieces
WO2021103487A1 (en) Side-by-side parcel separation device and method thereof
CN106425181A (en) Curve welding seam welding technology based on line structured light
CN106682619B (en) Object tracking method and device
CN113146172B (en) Multi-vision-based detection and assembly system and method
CN113351522B (en) Article sorting method, device and system
CN111428731A (en) Multi-class target identification and positioning method, device and equipment based on machine vision
CN113103215B (en) Motion control method for robot vision flyswatter
CN107301634A (en) A kind of robot automatic sorting method and system
CN108038861A (en) A kind of multi-robot Cooperation method for sorting, system and device
CN110633738B (en) Rapid classification method for industrial part images
CN109396053A (en) Intelligent sorting method
CN114758236A (en) Non-specific shape object identification, positioning and manipulator grabbing system and method
Jung et al. Applying MSC-HOG Feature to the Detection of a Human on a Bicycle
CN114299139A (en) 3D (three-dimensional) stacked package sorting method and system and storage medium
CN112802107A (en) Robot-based control method and device for clamp group
CN115070781B (en) Object grabbing method and two-mechanical-arm cooperation system
CN113808206B (en) Typesetting system and method based on vision tracking robot
van Vuuren et al. Towards the autonomous robotic gripping and handling of novel objects
WO2022138234A1 (en) Tool checking device, tool checking program, and tool checking method for robot arm
WO2023092519A1 (en) Grabbing control method and apparatus, and electronic device and storage medium
CN109941672A (en) The method, apparatus and electronic equipment of material correction
US11203826B1 (en) System and method for determining joinder locations for assembly of garments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant