WO2021217923A1 - Human-machine collaborative sorting system and method for sorting multiple types of materials - Google Patents

Human-machine collaborative sorting system and method for sorting multiple types of materials

Info

Publication number
WO2021217923A1
WO2021217923A1 · PCT/CN2020/104788 · CN2020104788W
Authority
WO
WIPO (PCT)
Prior art keywords
display
sorting
materials
operation signal
human
Prior art date
Application number
PCT/CN2020/104788
Other languages
English (en)
French (fr)
Inventor
莫卓亚
罗海城
陈文辉
Original Assignee
广东弓叶科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广东弓叶科技有限公司 filed Critical 广东弓叶科技有限公司
Publication of WO2021217923A1 publication Critical patent/WO2021217923A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B07 SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C5/00 Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
    • B07C5/34 Sorting according to other particular properties
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B07 SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C5/00 Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
    • B07C5/36 Sorting apparatus characterised by the means used for distribution
    • B07C5/361 Processing or control devices therefor, e.g. escort memory
    • B07C5/362 Separating or distributor mechanisms
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition

Definitions

  • This application relates to the technical field of material sorting, and in particular to a human-machine collaborative sorting system and a method for sorting multiple types of materials with it.
  • The environment at a garbage sorting site is extremely harsh, full of dust and noise; sorting workers who spend long periods in such an environment are very likely to suffer irreversible health problems.
  • Chinese patent application CN201710649904.0 proposes having an artificial intelligence robot sort large-size organic light objects out of construction waste: the waste is placed on a conveyor line, a camera captures images of it, and the images are shown on a monitor outside the sorting site; a human operator identifies the waste in the displayed image, judges the position of each item, and touches the monitor to select the position of one item in the image; finally the monitor transmits the selected position information to the on-site sorting robot, which grabs the waste.
  • However, because the waste grabbed by that sorting robot can only be placed at a single specific location, the approach applies only to grabbing one type of waste; applying it to multiple types would require setting up multiple sorting robots, each responsible for one type, making the process complicated, time-consuming, and labor-intensive.
  • The purpose of this application is therefore to provide a method for sorting multiple types of materials with a human-machine collaborative sorting system, a human-machine collaborative sorting system applicable to sorting multiple types of materials, an electronic device, and a computer-readable storage medium.
  • To this end, this application discloses a method for sorting multiple types of materials by a human-machine collaborative sorting system.
  • The human-machine collaborative sorting system includes a camera, a display, and a robot.
  • The method for sorting multiple types of materials by the human-machine collaborative sorting system includes:
  • transmitting the image of the materials captured by the camera to the display for display; receiving the operation information transmitted by the display, the operation information including a selected position and an operation signal, the operation signal corresponding to the target placement position of the material;
  • obtaining the corresponding target placement position from the operation signal, and sending the target placement position and the selected position to the robot to control the robot to grab the material corresponding to the selected position and place it at the corresponding target placement position.
  • Compared with the prior art, this application establishes a correspondence between the operation signal of the display and the target placement position of the material.
  • When an operation signal is obtained, this correspondence is first used to determine where the material currently to be grabbed should be placed, and the target placement position is then sent to the robot. The grabbed material is thus placed at different placement positions according to the operation information, which makes the method suitable for sorting multiple types of materials.
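  • The correspondence just described can be pictured as a small lookup table. The sketch below is illustrative only; the signal names and placement coordinates are assumptions, not values taken from the application:

```python
# Illustrative sketch of the operation-signal -> target-placement correspondence.
# Signal names and coordinates are assumed for illustration only.

TARGET_PLACEMENT = {
    "swipe_left": (0.0, -0.5),   # first position, e.g. plastic bottles
    "swipe_up": (0.5, 0.0),      # second position, e.g. cans
    "swipe_right": (0.0, 0.5),   # third position, e.g. glass bottles
}

def handle_operation(selected_position, operation_signal):
    """Resolve the target placement position and build the robot command."""
    target = TARGET_PLACEMENT.get(operation_signal)
    if target is None:
        raise ValueError(f"unknown operation signal: {operation_signal}")
    # Both the grab position (derived from the selected position) and the
    # target placement position are sent to the robot.
    return {"grab": selected_position, "place": target}
```

In this sketch the mapping is preset once, so adding a new material class only means adding one more entry to the table rather than installing another robot.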
  • In one embodiment, the display is a touch-screen display, the selected position is a touch position on the display, and the operation signal is a slide on the display in a preset direction starting from the selected position.
  • In another embodiment, the display is a touch-screen display provided with a plurality of selection buttons, and the operation signal is generated when a selection button is pressed while the display is touched to select the corresponding material.
  • Specifically, the method for sorting multiple types of materials by the human-machine collaborative sorting system further includes: presetting the correspondence between the operation signal and the target placement position and between the material type and the operation signal, and storing the operation signals.
  • Preferably, the method further includes: obtaining the contour of each material from the image; finding the material contour corresponding to the selected position and converting the selected position into a preset position within the area occupied by that contour; and sending the preset position to the robot as the grasping position for grabbing the corresponding material.
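  • A minimal sketch of this contour-based correction, using pure-Python geometry in place of a real vision pipeline (the polygon representation of contours and the choice of vertex-average centroid as the preset position are assumptions):

```python
# Sketch of the position-correction step: find the material contour that
# contains the operator's selected position and replace the raw touch point
# with the contour centre, which becomes the grasping position.

def point_in_polygon(pt, poly):
    """Ray-casting point-in-polygon test; poly is a list of (x, y) vertices."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def centroid(poly):
    """Vertex-average centroid, used here as the preset grasp position."""
    xs = [p[0] for p in poly]
    ys = [p[1] for p in poly]
    return (sum(xs) / len(poly), sum(ys) / len(poly))

def correct_position(selected, contours):
    """Map a raw touch position to the centre of the contour containing it."""
    for poly in contours:
        if point_in_polygon(selected, poly):
            return centroid(poly)
    return None  # no material at the selected position
```

Because every touch inside the same contour maps to the same corrected point, repeated selections of one item would naturally collapse into a single grab command.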
  • To the same end, this application also discloses a human-machine collaborative sorting system, including a conveying mechanism, a camera, a processing device communicatively connected with the camera, and a display and a robot communicatively connected with the processing device, wherein:
  • the conveying mechanism is used to carry and convey materials;
  • the camera is used to collect images of the materials;
  • the display is used to display the images and accept operations, and transmits the operation information to the processing device, the operation information including a selected position and an operation signal, the operation signal corresponding to the target placement position of the material;
  • the processing device receives the operation information transmitted by the display, obtains the corresponding target placement position from the operation signal, obtains the corresponding grabbing position from the selected position, and sends the target placement position and the grabbing position to the robot;
  • the robot grabs the corresponding material according to the grabbing position and places it at the corresponding target placement position.
  • Compared with the prior art, this system likewise establishes a correspondence between the operation signal of the display and the target placement position of the material.
  • When an operation signal is obtained, this correspondence is first used to determine where the material currently to be grabbed should be placed, and the target placement position is then sent to the robot. The grabbed material is thus placed at different placement positions according to the operation information, which makes the system suitable for sorting multiple types of materials.
  • In one embodiment, the display is a touch-screen display, the selected position is a touch position on the display, and the operation signal is a slide in a preset direction starting from the selected position on the display.
  • In another embodiment, the display is a touch-screen display provided with a plurality of selection buttons, and the operation signal is generated when a selection button is pressed while the display is touched to select the corresponding material.
  • Correspondingly, the present application also provides an electronic device including a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor; when the processor executes the computer program, the method for sorting multiple types of materials by the human-machine collaborative sorting system described above is performed.
  • Correspondingly, the present application also provides a computer-readable storage medium storing a computer program that can be executed by a processor to carry out the method for sorting multiple types of materials by the human-machine collaborative sorting system described above.
  • Figure 1 is a partial structural block diagram of the human-machine collaborative sorting system of the present application.
  • Figure 2 is a schematic structural diagram of one embodiment of the human-machine collaborative sorting system of the present application.
  • Figure 3 is a schematic structural diagram of another embodiment of the human-machine collaborative sorting system of the present application.
  • Figure 4 is a schematic structural diagram of yet another embodiment of the human-machine collaborative sorting system of the present application.
  • Figure 5 is a schematic structural diagram of an embodiment of the display of the present application.
  • Figure 6 is a schematic diagram of the position correction of the present application.
  • Figure 7 is a flowchart of the method for sorting multiple types of materials by the human-machine collaborative sorting system of the present application.
  • Figure 8 is a block diagram of the electronic device of the present application.
  • This embodiment provides a human-machine collaborative sorting system, including a conveying mechanism 1, a camera 2, a processing device 3 communicatively connected to the camera 2, a display 4 communicatively connected to the processing device 3, and a robot 5 communicatively connected to the processing device 3.
  • The conveying mechanism 1, the camera 2, and the robot 5 are arranged on the site of the sorting workshop, while the display 4 is arranged outside the site of the sorting workshop.
  • the conveying mechanism 1 is used to carry and convey materials;
  • the camera 2 is used to collect images of the materials and transmit the images to the processing device 3;
  • the processing device 3 receives the images transmitted by the camera 2, processes them, and transmits the processed images to the display 4;
  • the display 4 receives the image transmitted by the processing device 3 and displays it; it also accepts the operation of the sorting worker 10 and transmits the operation information to the processing device 3, where the operation information includes a selected position and an operation signal: the selected position corresponds to the position of the material that the sorting worker 10 has selected for the robot 5 to grab, and the operation signal corresponds to the target placement position of the selected material. The processing device 3 receives the selected position and the operation signal transmitted by the display 4, looks up the corresponding target placement position from the operation signal, obtains the corresponding grabbing position from the selected position, and sends the target placement position and the grabbing position to the robot 5. In other words, operating the display 4 selects the material to be grabbed by the robot 5 and, through the correspondence between the operation signal and the target placement position, determines where the selected material is to be placed; the robot 5 then grabs the corresponding material at the grabbing position and places it at the corresponding target placement position. In a specific implementation, different categories of material can be assigned to different target placement positions, so that grabbed materials are classified directly during sorting, reducing the number of sorting steps.
  • In the human-machine collaborative sorting system shown in Figure 2, the camera 2 and the robot 5 both straddle the conveyor belt of the conveying mechanism 1.
  • The camera 2 and the robot 5 are supported by support frames 61 and 62 respectively, and are arranged in sequence along the forward direction of the materials.
  • In some embodiments, the camera 2 straddles the conveyor belt of the conveying mechanism 1 and is supported by a support frame 6, while the robot 5 is arranged on one side of the conveying mechanism 1.
  • The bottom of the robot 5 is provided with multiple wheels 51; when the robot 5 receives a moving driving force, the wheels 51 move relative to the ground, making position adjustment of the robot 5 convenient and labor-saving (as shown in Figure 3).
  • In some embodiments, both the camera 2 and the robot 5 straddle the conveyor belt of the conveying mechanism 1, and they are integrated together and supported by the same support frame 6 (as shown in Figure 4).
  • the camera 2 is a 2D camera
  • the image is an RGB image
  • the processing device 3 can be any device with image processing capabilities and data computing capabilities, such as a PC.
  • the display 4 is a touch screen display with display and touch functions, and the selected position is the touch position of the finger of the sorting worker 10 on the display 4.
  • the display 4 may also only have a display function.
  • the material to be grasped can also be selected by clicking on the display screen of the display 4 with the mouse.
  • In this embodiment, the operation signal is a slide in a preset direction starting from the selected position on the display 4 (i.e., the touch position of the sorting worker 10's finger on the display 4). The operation is simple, quick, and not prone to error.
  • For example, when sliding to the left from the selected position, the robot 5 is controlled to place the grabbed material in the material frame at the first position; when sliding upward from the selected position, in the material frame at the second position; and when sliding to the right from the selected position, in the material frame at the third position.
  • In a specific implementation, the material frames at the first, second, and third positions can each be assigned a certain type of material.
  • For example, the material frame at the first position holds plastic bottles,
  • the material frame at the second position holds cans,
  • and the material frame at the third position holds glass bottles.
  • the operation information may also be other actions, for example, making a circle clockwise with the selected position as a starting point, or making a circle counterclockwise with the selected position as a starting point, and should not be limited to this.
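  • The gesture recognition implied above (a slide from the selected position in a preset direction) could be sketched as follows; the 20-pixel distance threshold and the signal names are assumptions for illustration:

```python
# Sketch of turning a touch gesture into an operation signal: the selected
# position is the touch-down point, and the dominant direction of travel to
# the lift-off point yields the signal.

def classify_swipe(start, end, min_dist=20):
    """Classify a swipe from start (x, y) to end (x, y) in screen pixels."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if abs(dx) < min_dist and abs(dy) < min_dist:
        return None  # too short to count as a swipe; treat as a plain tap
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    # Screen y typically grows downward, so negative dy is an upward swipe.
    return "swipe_down" if dy > 0 else "swipe_up"
```

The same skeleton extends to the circular gestures mentioned above by accumulating the signed angle swept around the start point instead of comparing straight-line displacements.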
  • the display 4 is a touch screen display with display and touch functions, and the selected position is the touch position of the finger of the sorting worker 10 on the display 4.
  • the display 4 may also only have a display function.
  • the material to be grasped can also be selected by clicking on the display screen of the display 4 with the mouse.
  • A plurality of selection buttons 41 are provided on the display 4, and the operation signal is generated when a selection button 41 is pressed while the display 4 is touched to select the corresponding material. Specifically, a key pad carrying the multiple selection buttons 41 and an IO card 42 constitute the control device 40 of the display 4.
  • The sorting worker 10 operates the control device 40 so that the display 4 obtains the operation information: for example, the worker's right hand taps the display screen of the display 4, giving the display 4 the selected position, while the left hand presses one of the selection buttons 41, giving the display 4 the operation signal.
  • the selection button can also be replaced with a stylus, gloves, etc., so it should not be limited to this.
  • Preferably, the display screen of the display 4 is a capacitive touch screen, which makes the operation smoother and more fluid, but the application should not be limited to this.
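  • A sketch of how this button-based variant might pair a touch event with the button currently held on the control device 40; all class and field names here are assumptions, not part of the application:

```python
# Sketch of the button-based variant: a touch on the screen supplies the
# selected position, and the selection button held at that moment (read via
# the IO card) supplies the operation signal.

class ControlDevice:
    """Stand-in for the key pad + IO card of the control device 40."""

    def __init__(self):
        self.pressed_button = None  # updated by an IO-card polling loop

    def press(self, button_id):
        self.pressed_button = button_id

    def release(self):
        self.pressed_button = None

def on_touch(control_device, touch_position):
    """Combine the touch with the currently pressed button into operation info."""
    if control_device.pressed_button is None:
        return None  # a touch without a button press selects nothing
    return {
        "selected_position": touch_position,
        "operation_signal": f"button_{control_device.pressed_button}",
    }
```

The split mirrors the two-handed workflow described above: one hand supplies the position, the other supplies the signal, and only their coincidence produces a complete operation record.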
  • The processing device 3 also obtains the contour of each material from the received image. (An image often contains multiple materials; after preprocessing the image, deep learning is used to obtain the contour of each material it contains. How to obtain material contours with deep learning is prior art and is not repeated here.) It then finds the material contour corresponding to the selected position, converts the selected position into a preset position within the area occupied by that contour, and sends the preset position to the robot 5 as the grasping position. That is, the operator selects on the display 4 the position of the material to be grabbed, the position is corrected according to the material contour to obtain the final grasping position, and the robot 5 grabs the material at that position. This avoids grab failures caused by a deviation in the selected position or by a mismatch between the actual selected position and the best grasping position when the picture refreshes. Moreover, because every selected position within the same contour is corrected to the same preset position, selecting the same contour area several times triggers only a single grabbing action, avoiding repeated (empty) grabs and improving the grabbing efficiency of the robot 5.
  • the preset position is the center position of the area where the material contour is located, but it should not be limited to this.
  • S1 is a material profile obtained through deep learning
  • P1 is the actual selected position.
  • the selected position P1 is located at the edge of the area where the material contour S1 is located.
  • This application corrects the selected position P1 to obtain the grasping position P2 (that is, the center position of the area where the material outline is located).
  • This embodiment provides a method for sorting multiple types of materials by a human-machine collaborative sorting system.
  • The human-machine collaborative sorting system includes a camera 2, a display 4, and a robot 5; the camera 2 and the robot 5 are set up on the sorting workshop site.
  • The display 4 is set up outside the sorting workshop site.
  • The method for sorting multiple types of materials by the human-machine collaborative sorting system includes:
  • S101: Transmit the image of the materials captured by the camera 2 to the display 4 for display;
  • S102: Receive the operation information transmitted by the display 4, the operation information including a selected position and an operation signal; the selected position corresponds to the position of the material that the sorting worker 10 has selected for the robot 5 to grab, and the operation signal corresponds to the target placement position of the material;
  • S103: Obtain the corresponding target placement position from the operation signal, and send the target placement position and the selected position to the robot to control the robot to grab the material corresponding to the selected position and place it at the corresponding target placement position.
  • the display 4 is a touch screen display with display and touch functions, and the selected position is the touch position of the finger of the sorting worker 10 on the display 4.
  • the display 4 may also only have a display function.
  • the material to be grasped can also be selected by clicking on the display screen of the display 4 with the mouse.
  • In this embodiment, the operation signal is a slide in a preset direction starting from the selected position on the display 4 (i.e., the touch position of the sorting worker 10's finger on the display 4).
  • The operation is simple, quick, and not prone to error. For example, when sliding to the left from the selected position, the corresponding target placement position is the first position, and the robot 5 places the grabbed material at the first position;
  • when sliding upward from the selected position, the corresponding target placement position is the second position, and the robot 5 places the grabbed material at the second position; when sliding to the right from the selected position, the corresponding target placement position is the third position, and the robot 5 places the grabbed material at the third position. In a specific implementation, the first, second, and third positions can each be assigned a certain type of material, such as plastic bottles at the first position, cans at the second position, and glass bottles at the third position.
  • the operation information may also be other actions, for example, making a circle clockwise with the selected position as a starting point, or making a circle counterclockwise with the selected position as a starting point, and should not be limited to this.
  • the display 4 is a touch screen display with display and touch functions, and the selected position is the touch position of the finger of the sorting worker 10 on the display 4.
  • the display 4 may also only have a display function.
  • the material to be grasped can also be selected by clicking on the display screen of the display 4 with the mouse.
  • In this embodiment, the operation signal is generated when a selection button 41 is pressed while the display 4 is touched to select the corresponding material. Specifically, a key board carrying multiple selection buttons 41 and an IO card 42 constitute the control device 40 of the display 4.
  • The sorting worker 10 operates the control device 40 so that the display 4 obtains the operation information: for example, the worker taps the display screen with the right hand, giving the display 4 the selected position, while the left hand presses one of the selection buttons 41, giving the display 4 the operation signal.
  • the selection button can also be replaced with a stylus, gloves, etc., so it should not be limited to this.
  • Preferably, the display screen of the display 4 is a capacitive touch screen, which makes the operation smoother and more fluid, but the application should not be limited to this.
  • The processing device 3 also obtains the contour of each material from the received image. (An image often contains multiple materials; after preprocessing the image, deep learning is used to obtain the contour of each material it contains. How to obtain material contours with deep learning is prior art and is not repeated here.) It then finds the material contour corresponding to the selected position, converts the selected position into a preset position within the area occupied by that contour, and sends the preset position to the robot 5 as the grasping position. That is, the operator selects on the display 4 the position of the material to be grabbed, the position is corrected according to the material contour to obtain the final grasping position, and the robot 5 grabs the material at that position. This avoids grab failures caused by a deviation in the selected position or by a mismatch between the actual selected position and the best grasping position when the picture refreshes. Moreover, because every selected position within the same contour is corrected to the same preset position, selecting the same contour area several times triggers only a single grabbing action, avoiding repeated (empty) grabs and improving the grabbing efficiency of the robot 5.
  • the preset position is the center position of the area where the material contour is located, but it should not be limited to this.
  • S1 is a material profile obtained through deep learning
  • P1 is the actual selected position.
  • the selected position P1 is located at the edge of the area where the material contour S1 is located.
  • This application corrects the selected position P1 to obtain the grasping position P2 (that is, the center position of the area where the material outline is located).
  • Specifically, before S101, the correspondence between the operation signal and the target placement position and between the material type and the operation signal is preset, and the operation signals are stored. This makes it convenient for sorting workers to review the daily sorting data.
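  • Storing operation signals so that daily sorting data can be reviewed might look like the following sketch; the record fields and the in-memory log are assumptions (a real system would presumably persist them to a database):

```python
# Sketch of storing operation signals for later review of daily sorting data.
from datetime import datetime, timezone

SIGNAL_TO_MATERIAL = {  # preset: operation signal -> material type (assumed)
    "swipe_left": "plastic bottle",
    "swipe_up": "can",
    "swipe_right": "glass bottle",
}

def record_signal(signal, log):
    """Append one timestamped operation record to the log."""
    entry = {
        "time": datetime.now(timezone.utc).isoformat(),
        "signal": signal,
        "material": SIGNAL_TO_MATERIAL.get(signal, "unknown"),
    }
    log.append(entry)
    return entry

def daily_counts(log):
    """Tally how many items of each material type were sorted."""
    counts = {}
    for entry in log:
        counts[entry["material"]] = counts.get(entry["material"], 0) + 1
    return counts
```

With the material type resolved from the preset correspondence at recording time, the stored log alone is enough to reconstruct per-type daily totals.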
  • Referring to Figure 8, the present application also discloses an electronic device 200, which includes a processor 210, a memory 220, and a computer program 230 stored in the memory 220 and configured to be executed by the processor 210; when the processor 210 executes the computer program 230, the method for sorting multiple types of materials by the human-machine collaborative sorting system described above is performed.
  • Compared with the prior art, this application establishes a correspondence between the operation signal of the display 4 and the target placement position of the material.
  • When an operation signal is obtained, this correspondence is first used to determine where the material currently to be grabbed should be placed, and the target placement position is then sent to the robot 5. The grabbed material is thus placed at different placement positions according to the operation information, which makes the system suitable for sorting multiple types of materials.
  • In addition, because the materials are sorted through human-machine collaboration, the sorting workers do not need to work on the sorting site, which avoids the health problems caused by spending long periods in the harsh sorting environment.

Abstract

A method for sorting multiple types of materials with a human-machine collaborative sorting system, comprising: transmitting images of materials captured by a camera to a display for display; receiving operation information transmitted by the display, the operation information including a selected position and an operation signal, the operation signal corresponding to a target placement position of the material; and obtaining the corresponding target placement position from the operation signal, and sending the target placement position and the selected position to a robot to control the robot to grab the material corresponding to the selected position and place it at the corresponding target placement position. The grabbed materials are thus placed at different placement positions according to the operation information, so the method is applicable to sorting multiple types of materials. The application further discloses a human-machine collaborative sorting system that sorts multiple types of materials using the above method.

Description

Human-machine collaborative sorting system and method for sorting multiple types of materials  Technical Field
This application relates to the technical field of material sorting, and in particular to a human-machine collaborative sorting system and a method for sorting multiple types of materials with it.
Background Art
The environment at a garbage sorting site is extremely harsh, full of dust and noise; sorting workers who spend long periods in such an environment are very likely to suffer irreversible health problems.
To solve the above problem, Chinese patent application CN201710649904.0 proposes a method in which an artificial intelligence robot sorts large-size organic light objects out of construction waste: the waste is placed on a conveyor line, a camera captures images of the waste, and the images are displayed on a monitor outside the sorting site; the waste in the displayed image is then identified by eye and the position of each item judged, the monitor is touched to select the position of a particular item in the image, and finally the monitor transmits the selected position information to the on-site sorting robot, which grabs the waste.
However, because the waste grabbed by the sorting robot can only be placed uniformly at one specific location, the method applies only to grabbing a single type of waste; to apply it to multiple types of waste, multiple sorting robots would have to be set up, each responsible for grabbing one type, making the process complicated, time-consuming, and costly in labor.
Summary of the Application
The purpose of this application is to provide a method for sorting multiple types of materials with a human-machine collaborative sorting system, a human-machine collaborative sorting system applicable to sorting multiple types of materials, an electronic device, and a computer-readable storage medium.
To achieve the above purpose, this application discloses a method for sorting multiple types of materials by a human-machine collaborative sorting system, the human-machine collaborative sorting system including a camera, a display, and a robot, the method including:
transmitting the image of the materials captured by the camera to the display for display;
receiving the operation information transmitted by the display, the operation information including a selected position and an operation signal, the operation signal corresponding to the target placement position of the material;
obtaining the corresponding target placement position from the operation signal, and sending the target placement position and the selected position to the robot to control the robot to grab the material corresponding to the selected position and place it at the corresponding target placement position.
Compared with the prior art, this application establishes a correspondence between the operation signal of the display and the target placement position of the material. When an operation signal is obtained, this correspondence is first used to determine where the material currently to be grabbed should be placed, and the target placement position is then sent to the robot; the grabbed material is thus placed at different placement positions according to the operation information, making the method applicable to sorting multiple types of materials.
In one embodiment, the display is a touch-screen display, the selected position is a touch position on the display, and the operation signal is a slide on the display in a preset direction starting from the selected position.
In one embodiment, the display is a touch-screen display provided with a plurality of selection buttons, and the operation signal is generated when a selection button is pressed while the display is touched to select the corresponding material.
Specifically, the method further includes: presetting the correspondence between the operation signal and the target placement position and between the material type and the operation signal, and storing the operation signals.
Preferably, the method further includes: obtaining the contour of each material from the image; finding the material contour corresponding to the selected position and converting the selected position into a preset position within the area occupied by that contour; and sending the preset position to the robot as the grasping position for grabbing the corresponding material.
To achieve the above purpose, this application also discloses a human-machine collaborative sorting system, including a conveying mechanism, a camera, a processing device communicatively connected with the camera, and a display and a robot communicatively connected with the processing device, wherein: the conveying mechanism is used to carry and convey materials; the camera is used to collect images of the materials; the display is used to display the images and accept operations, and transmits the operation information to the processing device, the operation information including a selected position and an operation signal, the operation signal corresponding to the target placement position of the material; the processing device receives the operation information transmitted by the display, obtains the corresponding target placement position from the operation signal, obtains the corresponding grabbing position from the selected position, and sends the target placement position and the grabbing position to the robot; and the robot grabs the corresponding material according to the grabbing position and places it at the corresponding target placement position.
Compared with the prior art, this application establishes a correspondence between the operation signal of the display and the target placement position of the material. When an operation signal is obtained, this correspondence is first used to determine where the material currently to be grabbed should be placed, and the target placement position is then sent to the robot; the grabbed material is thus placed at different placement positions according to the operation information, making the system applicable to sorting multiple types of materials.
In one embodiment, the display is a touch-screen display, the selected position is a touch position on the display, and the operation signal is a slide in a preset direction starting from the selected position on the display.
In one embodiment, the display is a touch-screen display provided with a plurality of selection buttons, and the operation signal is generated when a selection button is pressed while the display is touched to select the corresponding material.
To achieve the above purpose, the present application correspondingly also provides an electronic device, including a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor; when the processor executes the computer program, the method for sorting multiple types of materials by the human-machine collaborative sorting system described above is performed.
To achieve the above purpose, the present application correspondingly also provides a computer-readable storage medium storing a computer program that can be executed by a processor to carry out the method for sorting multiple types of materials by the human-machine collaborative sorting system described above.
Brief Description of the Drawings
Figure 1 is a partial structural block diagram of the human-machine collaborative sorting system of the present application.
Figure 2 is a schematic structural diagram of one embodiment of the human-machine collaborative sorting system of the present application.
Figure 3 is a schematic structural diagram of another embodiment of the human-machine collaborative sorting system of the present application.
Figure 4 is a schematic structural diagram of yet another embodiment of the human-machine collaborative sorting system of the present application.
Figure 5 is a schematic structural diagram of an embodiment of the display of the present application.
Figure 6 is a schematic diagram of the position correction of the present application.
Figure 7 is a flowchart of the method for sorting multiple types of materials by the human-machine collaborative sorting system of the present application.
Figure 8 is a block diagram of the electronic device of the present application.
Detailed Description
To explain the content, structural features, objectives, and effects of this application in detail, the technical solutions in the embodiments of this application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of this application, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort fall within the protection scope of this application.
Referring to Figures 1 and 2, this embodiment provides a human-machine collaborative sorting system, including a conveying mechanism 1, a camera 2, a processing device 3 communicatively connected to the camera 2, a display 4 communicatively connected to the processing device 3, and a robot 5 communicatively connected to the processing device 3. The conveying mechanism 1, the camera 2, and the robot 5 are arranged on the site of the sorting workshop, and the display 4 is arranged outside the site. The conveying mechanism 1 carries and conveys the materials; the camera 2 collects images of the materials and transmits them to the processing device 3; the processing device 3 receives the images transmitted by the camera 2, processes them, and transmits the processed images to the display 4; the display 4 receives and displays the images, and also accepts the operation of the sorting worker 10 and transmits the operation information to the processing device 3, where the operation information includes a selected position and an operation signal: the selected position corresponds to the position of the material that the sorting worker 10 has selected for the robot 5 to grab, and the operation signal corresponds to the target placement position of the selected material. The processing device 3 receives the selected position and the operation signal transmitted by the display 4, looks up the corresponding target placement position from the operation signal, obtains the corresponding grabbing position from the selected position, and sends both to the robot 5. In other words, operating the display 4 selects the material to be grabbed by the robot 5, and the correspondence between the operation signal and the target placement position determines where the selected material is to be placed; the robot 5 grabs the corresponding material at the grabbing position and places it at the corresponding target placement position. Grabbed materials are thus placed separately; in a specific implementation, different categories of material can be assigned to different target placement positions, so that grabbed materials are classified directly during sorting, reducing the number of sorting steps.
In the human-machine collaborative sorting system shown in Figure 2, the camera 2 and the robot 5 both straddle the conveyor belt of the conveying mechanism 1; they are supported by support frames 61 and 62 respectively and are arranged in sequence along the forward direction of the materials. In some embodiments, the camera 2 straddles the conveyor belt of the conveying mechanism 1 and is supported by a support frame 6, while the robot 5 is arranged on one side of the conveying mechanism 1, with multiple wheels 51 at its bottom; when the robot 5 receives a moving driving force, the wheels 51 move relative to the ground, making position adjustment of the robot 5 convenient and labor-saving (as shown in Figure 3). In some embodiments, both the camera 2 and the robot 5 straddle the conveyor belt of the conveying mechanism 1 and are integrated together and supported by the same support frame 6 (as shown in Figure 4). The camera 2 is a 2D camera, the images are RGB images, and the processing device 3 can be any device with image processing and data computing capabilities, such as a PC.
In one embodiment, the display 4 is a touch-screen display with display and touch functions, and the selected position is the touch position of the sorting worker 10's finger on the display 4. Of course, the display 4 may also have only a display function; in a specific implementation, the material to be grabbed can also be selected by clicking on the display screen of the display 4 with a mouse. In this embodiment, the operation signal is a slide in a preset direction starting from the selected position on the display 4 (i.e., the touch position of the worker's finger); the operation is simple, quick, and not prone to error. For example, when sliding to the left from the selected position, the robot 5 is controlled to place the grabbed material in the material frame at the first position; when sliding upward, in the material frame at the second position; and when sliding to the right, in the material frame at the third position. In a specific implementation, the material frames at the first, second, and third positions can each be assigned a certain type of material: for example, plastic bottles in the frame at the first position, cans in the frame at the second position, and glass bottles in the frame at the third position. Of course, in some embodiments, the operation information can also be other actions, such as drawing a circle clockwise or counterclockwise starting from the selected position, so the application should not be limited to this.
In one embodiment, the display 4 is a touch-screen display with display and touch functions, and the selected position is the touch position of the sorting worker 10's finger on the display 4; again, the display 4 may have only a display function, and the material to be grabbed may instead be selected by clicking on the display screen with a mouse. As shown in Figure 5, in this embodiment a plurality of selection buttons 41 are provided on the display 4, and the operation signal is generated when a selection button 41 is pressed while the display 4 is touched to select the corresponding material. Specifically, a key board carrying the multiple selection buttons 41 and an IO card 42 constitute the control device 40 of the display 4; the sorting worker 10 operates the control device 40 so that the display 4 obtains the operation information. For example, the worker taps the display screen with the right hand, giving the display 4 the selected position, while the left hand presses one of the selection buttons 41, giving the display 4 the operation signal. Of course, the selection buttons can also be replaced with a stylus, gloves, and so on, so the application should not be limited to this. Preferably, the display screen of the display 4 is a capacitive touch screen, which makes the operation smoother and more fluid, but the application should not be limited to this.
In one embodiment, the processing device 3 also obtains the contour of each material from the received image. (An image often contains multiple materials; after preprocessing the image, deep learning is used to obtain the contour of each material it contains. How to obtain material contours with deep learning is prior art and is not repeated here.) It then finds the material contour corresponding to the selected position, converts the selected position into a preset position within the area occupied by that contour, and sends the preset position to the robot 5 as the grasping position. That is, the operator selects on the display 4 the position of the material to be grabbed, and the position is corrected according to the material contour to obtain the final grasping position, at which the robot 5 grabs the material. This avoids grab failures caused by a deviation in the selected position or by a mismatch between the actual selected position and the best grasping position when the picture refreshes. Moreover, because every selected position within the same material contour is corrected to the same preset position, selecting the same contour area several times triggers only a single grabbing action by the robot 5, avoiding repeated (empty) grabs and improving the grabbing efficiency of the robot 5.
Specifically, the preset position is the center of the area occupied by the material contour, but the application should not be limited to this. As shown in Figure 6, S1 is a material contour obtained through deep learning and P1 is the actual selected position; the selected position P1 lies at the edge of the area occupied by the contour S1, and this application corrects P1 to obtain the grasping position P2 (i.e., the center of the area occupied by the contour).
请参阅图2和图7,本实施例提供一种人机协作分选系统分选多类物料的方法,人机协作分选系统包括相机2、显示器4及机器人5,相机2、机器人5设置在分选车间现场,显示器4设置在分选车间现场之外。人机协作分选系统分选多类物料的方法包括:
S101,将相机2采集到的物料的图像传送至显示器4显示;
S102,接收显示器4传送的操作信息,操作信息包括选中位置和操作信号,选中位置对应分选工人10所选中的欲要机器人5抓取的物料的所在位置,操作信号与物料的目标放置位置相对应;
S103,由操作信号获得对应的目标放置位置,并将目标放置位置及选中位置发送至机器人以控制机器人抓取选中位置对应的物料并放至对应的目标放置位置。
In one embodiment, the display 4 is a touch-screen display with both display and touch functions, and the selected position is where the sorting worker 10's finger touches the display 4; alternatively, the display 4 may have only a display function, in which case the material to be grasped is selected by clicking the display screen with a mouse. In this embodiment, the operation signal is a swipe in a preset direction starting from the selected position on the display 4 (i.e. the position the worker's finger touches), which is simple, fast, and hard to get wrong. For example, a swipe to the left from the selected position corresponds to a first target placement position, and the robot 5 places the grasped material at the first position; a swipe upward corresponds to a second position, where the robot 5 places the grasped material; and a swipe to the right corresponds to a third position, where the robot 5 places the grasped material. In a specific implementation, the first, second, and third positions can each be assigned one material type, e.g. plastic bottles at the first position, cans at the second position, and glass bottles at the third position. Of course, in some embodiments the operation signal may be a different gesture, such as a clockwise or counter-clockwise circle drawn from the selected position, so this should not be limiting.
In one embodiment, the display 4 is a touch-screen display with both display and touch functions, and the selected position is where the sorting worker 10's finger touches the display 4; alternatively, the display 4 may have only a display function, and the material to be grasped is selected by clicking the display screen with a mouse. As shown in FIG. 5, in this embodiment the display 4 is provided with a plurality of selection buttons 41, and the operation signal is a press of a selection button 41 at the same time the display 4 is touched to select the corresponding material. Specifically, a button board carrying the selection buttons 41 together with an IO card 42 forms the display's control device 40; the worker 10 operates this control device 40 so that the display 4 obtains the operation information. For example, the worker's right hand taps the display screen, giving the display 4 the selected position, while the left hand presses one of the selection buttons 41, giving the display 4 the operation signal. Of course, the selection buttons could also be replaced with a stylus, gloves, and so on, so this should not be limiting. Preferably, the display screen of the display 4 is a capacitive touch screen, which makes operation smoother, though this too is not limiting.
In one embodiment, the processing device 3 also extracts the contour of each material from the received image (an image usually contains several materials; after preprocessing, the contours of the individual materials are obtained by deep learning — how to obtain material contours with deep learning is prior art and is not repeated here). It then finds the material contour corresponding to the selected position, converts the selected position into a preset position within the region of that contour, and sends that preset position to the robot 5 as the grasp position. That is, the worker operates the display 4 to select where the material to be grasped lies, the position is corrected against the material contour to produce the final grasp position, and the robot 5 grasps the material at that final position. This avoids grasp failures caused by an off-target selection on the display 4 or by the actual selected position diverging from the optimal grasp position during an image refresh. Moreover, because every selected position inside the same material contour is corrected to the same preset position within that contour's region, selecting the same contour region several times triggers only one grasp action, avoiding repeated (empty) grasps by the robot 5 and improving its grasping efficiency.
Specifically, the preset position is the center of the region enclosed by the material contour, though this is not limiting. As shown in FIG. 6, S1 is a material contour obtained by deep learning and P1 is the actual selected position, which lies at the edge of the region of contour S1; the present application corrects the selected position P1 into the grasp position P2 (the center of the region enclosed by the contour).
Specifically, before S101, the correspondences between the operation signals and the target placement positions and between the material types and the operation signals are configured in advance, and the operation signals are stored. This makes it easy for the sorting worker to review each day's sorting data.
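The pre-configured mappings and the signal log described above might look like this; the in-memory list and all names are illustrative assumptions (a real system would persist the log):

```python
import datetime

# Pre-configured correspondences: signal <-> target position, type <-> signal.
SIGNAL_TO_TARGET = {"swipe_left": "bin_1", "swipe_up": "bin_2", "swipe_right": "bin_3"}
TYPE_TO_SIGNAL = {"plastic_bottle": "swipe_left", "can": "swipe_up", "glass": "swipe_right"}

signal_log = []  # stored operation signals, for daily review

def record_signal(signal, now=None):
    """Store one operation signal, stamped with the day it occurred."""
    now = now or datetime.datetime.now()
    signal_log.append({"day": now.date().isoformat(), "signal": signal,
                       "target": SIGNAL_TO_TARGET.get(signal)})

def daily_counts(day):
    """Per-signal counts for one day, for the worker's end-of-day review."""
    counts = {}
    for entry in signal_log:
        if entry["day"] == day:
            counts[entry["signal"]] = counts.get(entry["signal"], 0) + 1
    return counts
```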
Referring to FIG. 8, the present application also discloses an electronic device 200 comprising a processor 210, a memory 220, and a computer program 230 stored in the memory 220 and configured to be executed by the processor 210; when the processor 210 executes the computer program 230, it carries out the above method for sorting multiple types of materials with the human-machine collaborative sorting system.
Compared with the prior art, the present application establishes a correspondence between the operation signals of the display 4 and the target placement positions of the materials. When an operation signal is received, this correspondence first determines at which position the material about to be grasped should be placed, and the target placement position is then sent to the robot 5. Grasped materials are thus placed at different positions according to the operation information, which makes the system suitable for sorting many types of materials. Furthermore, because sorting is performed in human-machine collaboration, sorting workers no longer need to work on the sorting-plant floor, avoiding the health problems caused by spending long periods in a harsh sorting environment.
The above discloses only preferred embodiments of the present application and of course cannot limit its scope of rights; equivalent changes made according to the claims of the present application therefore remain within the scope covered by the present application.

Claims (10)

  1. A method for sorting multiple types of materials with a human-machine collaborative sorting system, the human-machine collaborative sorting system comprising a camera, a display, and a robot, characterized in that the method comprises:
    transmitting an image of the materials captured by the camera to the display for display;
    receiving operation information from the display, the operation information comprising a selected position and an operation signal, the operation signal corresponding to a target placement position of the material;
    obtaining the corresponding target placement position from the operation signal, and sending the target placement position and the selected position to the robot to control the robot to grasp the material corresponding to the selected position and place it at the corresponding target placement position.
  2. The method for sorting multiple types of materials with a human-machine collaborative sorting system according to claim 1, characterized in that the display is a touch-screen display, the selected position is a touch position on the display, and the operation signal is a swipe on the display in a preset direction starting from the selected position.
  3. The method for sorting multiple types of materials with a human-machine collaborative sorting system according to claim 1, characterized in that the display is a touch-screen display provided with a plurality of selection buttons, and the operation signal is a press of a selection button at the same time the display is touched to select the corresponding material.
  4. The method for sorting multiple types of materials with a human-machine collaborative sorting system according to claim 1, characterized by further comprising: presetting the correspondences between the operation signals and the target placement positions and between material types and the operation signals, and storing the operation signals.
  5. The method for sorting multiple types of materials with a human-machine collaborative sorting system according to claim 1, characterized by further comprising:
    obtaining the contour of each material from the image;
    finding the material contour corresponding to the selected position and converting the selected position into a preset position within the region of the material contour;
    sending the preset position to the robot as the grasp position for grasping the corresponding material.
  6. A human-machine collaborative sorting system, comprising a conveying mechanism, a camera, a processing device communicatively connected to the camera, a display communicatively connected to the processing device, and a robot, characterized in that:
    the conveying mechanism is used to carry and convey materials;
    the camera is used to capture images of the materials;
    the display is used to display the images and accept operations, transmitting operation information to the processing device, the operation information comprising a selected position and an operation signal, the operation signal corresponding to a target placement position of the material;
    the processing device is configured to: receive the operation information from the display, obtain the corresponding target placement position from the operation signal, obtain the corresponding grasp position from the selected position, and send the target placement position and the grasp position to the robot;
    the robot grasps the corresponding material according to the grasp position and places it at the corresponding target placement position.
  7. The human-machine collaborative sorting system according to claim 6, characterized in that the display is a touch-screen display, the selected position is a touch position on the display, and the operation signal is a swipe in a preset direction starting from the selected position on the display.
  8. The human-machine collaborative sorting system according to claim 6, characterized in that the display is a touch-screen display provided with a plurality of selection buttons, and the operation signal is a press of a selection button at the same time the display is touched to select the corresponding material.
  9. An electronic device, characterized by comprising:
    a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, wherein when the processor executes the computer program, it carries out the method for sorting multiple types of materials with a human-machine collaborative sorting system according to any one of claims 1 to 5.
  10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program executable by a processor to carry out the method for sorting multiple types of materials with a human-machine collaborative sorting system according to any one of claims 1 to 5.
PCT/CN2020/104788 2020-04-26 2020-07-27 Human-machine collaborative sorting system and method for sorting multiple types of materials thereof WO2021217923A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010342585.0A CN111582088B (zh) 2020-04-26 2020-04-26 Human-machine collaborative sorting system and method for sorting multiple types of materials thereof
CN202010342585.0 2020-04-26

Publications (1)

Publication Number Publication Date
WO2021217923A1 true WO2021217923A1 (zh) 2021-11-04

Family

ID=72121442

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/104788 WO2021217923A1 (zh) 2020-04-26 2020-07-27 Human-machine collaborative sorting system and method for sorting multiple types of materials thereof

Country Status (2)

Country Link
CN (1) CN111582088B (zh)
WO (1) WO2021217923A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130086520A1 (en) * 2011-10-03 2013-04-04 Whirlpool Corporation Method of sorting articles for treatment according to a cycle of operation implemented by an appliance
CN206701918U (zh) * 2017-01-24 2017-12-05 程泉钧 A multi-sensor fusion garbage sorting device
CN109045676A (zh) * 2018-07-23 2018-12-21 西安交通大学 A chess recognition learning algorithm and a robot intelligent automation system and method based on the algorithm
WO2019064802A1 (ja) * 2017-09-27 2019-04-04 パナソニックIpマネジメント株式会社 Package sorting system, projection instruction device, and package sorting method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103873539A (zh) * 2012-12-18 2014-06-18 天津云源科技有限公司 An autonomous article cloud management system
CN103706571B (zh) * 2013-12-27 2015-12-09 西安航天精密机电研究所 A visual positioning sorting method
CN103752534B (zh) * 2014-01-14 2016-04-20 温州中波电气有限公司 Intelligent visual-image recognition and sorting device and recognition-sorting method
CN106022386B (zh) * 2016-05-26 2019-04-30 北京新长征天高智机科技有限公司 A domestic waste target recognition system combining computer recognition and human interaction
CN106000904B (zh) * 2016-05-26 2018-04-10 北京新长征天高智机科技有限公司 An automatic domestic waste sorting system
CN106275993A (zh) * 2016-08-30 2017-01-04 苏州德品医疗科技股份有限公司 A linear automatic garbage classification and recycling system
CN107168191A (zh) * 2017-07-10 2017-09-15 上海工程技术大学 An intelligent sorting system for e-commerce articles and implementation method
CN107790402A (zh) * 2017-07-24 2018-03-13 梁超 Method for sorting large-size organic lightweight objects from construction waste with an artificial-intelligence robot
CN109471841B (zh) * 2018-10-31 2022-02-01 维沃移动通信有限公司 A file classification method and device

Also Published As

Publication number Publication date
CN111582088B (zh) 2020-12-08
CN111582088A (zh) 2020-08-25

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20933196

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20933196

Country of ref document: EP

Kind code of ref document: A1