CN117549313B - Tool changing robot workstation control method and system based on visual analysis


Info

Publication number
CN117549313B
Authority
CN
China
Prior art keywords
cutter
result
image capturing
tool
index
Prior art date
Legal status: Active
Application number
CN202410004465.8A
Other languages
Chinese (zh)
Other versions
CN117549313A (en)
Inventor
马开辉
龚楚勋
张长春
黄新颜
Current Assignee: Baigong Huizhi Suzhou Intelligent Technology Co., Ltd.
Original Assignee: Baigong Huizhi Suzhou Intelligent Technology Co., Ltd.
Priority date: 2024-01-03
Filing date: 2024-01-03
Publication date: 2024-03-29
2024-01-03: Application filed by Baigong Huizhi Suzhou Intelligent Technology Co., Ltd.; priority claimed from CN202410004465.8A
2024-02-13: Publication of CN117549313A
2024-03-29: Application granted; publication of CN117549313B
Status: Active

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control; multi-sensor controlled systems; sensor fusion
    • B25J 9/1697: Vision controlled systems
    • B25J 9/1679: Programme controls characterised by the tasks executed
    • B23: MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23Q: DETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q 3/00: Devices holding, supporting, or positioning work or tools, of a kind normally removable from the machine
    • B23Q 3/155: Arrangements for automatic insertion or removal of tools, e.g. combined with manual handling
    • B23Q 3/1552: Parts of devices for automatically inserting or removing tools
    • B23Q 3/1554: Transfer mechanisms, e.g. tool gripping arms; drive mechanisms therefor
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a control method and system for a tool changing robot workstation based on visual analysis, relating to the technical field of image data processing. The method comprises the following steps: initializing a tool library and establishing an index of the tools in the tool library; generating a tool feature set from the recorded working information; executing tool positioning calls according to the index; generating a first prediction result with a time sequence identifier; generating a second prediction result; if the trigger value meets a preset threshold, generating an image capturing result; performing tool feature comparison; and generating response control information for the tool according to the comparison result, and updating the response control information and the image capturing result to the index. The invention addresses the prior-art problem that workstation tool response control matches the actual tool condition poorly, making working quality unreliable, and achieves the technical effects of deep mining of image data, reliable analysis of the tool state, and improved response control quality.

Description

Tool changing robot workstation control method and system based on visual analysis
Technical Field
The invention relates to the technical field of image data processing, in particular to a tool changing robot workstation control method and system based on visual analysis.
Background
At present, using a tool changing robot workstation to change tools can greatly improve tool changing speed and efficiency, helping to cope with increasingly diversified process scenarios. In practical applications, however, machining quality falls far below expectations because the states of the tools in the workstation's tool library are insufficiently analysed. The prior art therefore suffers from a poor match between the workstation's tool response control and the actual tool condition, making working quality unreliable.
Disclosure of Invention
The present application provides a tool changing robot workstation control method and system based on visual analysis to solve the prior-art problem that the poor match between workstation tool response control and the actual tool condition makes working quality unreliable.
In view of the above, the present application provides a tool changing robot workstation control method and system based on visual analysis.
In a first aspect of the present application, a method for controlling a tool changing robot workstation based on visual analysis is provided, the method comprising:
initializing a tool library, and completing the establishment of indexes of tools in the tool library, wherein the indexes are positioning call indexes and information updating indexes;
recording working information of the cutters in the cutter library, and generating a cutter feature set according to a recording result, wherein the cutter feature set has a mapping relation with the index;
reading the work task information, initializing the arrangement of the cutters in the cutter library according to the work task information, and executing the positioning and calling of the cutters according to the index;
inputting the cutter feature set and the work task information into a cutter state prediction network to generate a first prediction result with a time sequence identifier;
obtaining workpiece detection information of a machined workpiece, and performing state evaluation of a mapping tool based on the workpiece detection information to generate a second prediction result;
if the trigger values of the first prediction result and the second prediction result meet a preset threshold value, invoking an industrial camera, executing image capturing of a corresponding cutter, and generating an image capturing result;
invoking an image sequence based on the index according to the image capturing result, and executing cutter feature comparison of the image capturing result and the image sequence;
and generating response control information of the cutter according to the cutter characteristic comparison result, and updating the response control information and the image capturing result to an index.
In a second aspect of the present application, a tool changing robot workstation control system based on visual analysis is provided, the system comprising:
the index establishing module is used for initializing the tool library and completing the index establishment of tools in the tool library, wherein the index is a positioning call and information updating index;
the tool feature set generation module is used for recording working information of tools in the tool library and generating a tool feature set according to a recording result, wherein the tool feature set has a mapping relation with the index;
the positioning calling module is used for reading the work task information, initializing the arrangement of the cutters in the cutter library according to the work task information, and executing the positioning calling of the cutters according to the index;
the first prediction result generation module is used for inputting the cutter characteristic set and the work task information into a cutter state prediction network to generate a first prediction result with a time sequence identifier;
the second prediction result generation module is used for obtaining workpiece detection information of a machined workpiece, and carrying out state evaluation of the mapping tool on the basis of the workpiece detection information to generate a second prediction result;
the image capturing result generating module is used for calling the industrial camera to execute image capturing of the corresponding cutter and generating an image capturing result if the trigger values of the first prediction result and the second prediction result meet the preset threshold value;
the cutter feature comparison module is used for calling an image sequence based on the index according to the image capturing result and executing cutter feature comparison of the image capturing result and the image sequence;
and the index updating module is used for generating response control information of the cutter according to the cutter characteristic comparison result and updating the response control information and the image capturing result to an index.
One or more technical solutions provided in the present application have at least the following technical effects or advantages:
according to the method, the index establishment of the tool in the tool library is completed through initializing the tool library, wherein the index is a positioning call and an information updating index, further working information recording is conducted on the tool in the tool library, a tool feature set is generated according to a recording result, then the tool feature set and the index have a mapping relation, working task information is read, tool arrangement initialization is conducted in the tool library according to the working task information, positioning call is conducted according to the index, a first prediction result with a time sequence identification is generated through inputting the tool feature set and the working task information into a tool state prediction network, then workpiece detection information of a machined workpiece is obtained, state evaluation of the mapping tool is conducted on the basis of the workpiece detection information, a second prediction result is generated, if the triggering values of the first prediction result and the second prediction result meet a preset threshold value, an industrial camera is called, image capturing is conducted on the corresponding tool, an image capturing result is generated, then image feature comparison is conducted according to the image capturing result, response control information of the tool is generated according to the tool feature comparison result of the image capturing result and the image capturing result is updated to the index. The technical effects of deep mining of image data, reliable analysis of cutter states and improvement of response control quality are achieved.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings required for describing the embodiments are briefly introduced below. The drawings show only some embodiments of the present invention; a person skilled in the art may derive other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of a control method of a workstation of a tool changing robot based on visual analysis according to an embodiment of the present application;
fig. 2 is a schematic flow chart of generating an image capturing result in a control method of a workstation of a tool changing robot based on visual analysis according to an embodiment of the present application;
fig. 3 is a schematic flow chart of performing tool feature comparison in the control method of the workstation of the tool changing robot based on visual analysis according to the embodiment of the present application;
fig. 4 is a schematic structural diagram of a workstation control system of a tool changing robot based on visual analysis according to an embodiment of the present application.
Reference numerals: index establishing module 11, tool feature set generation module 12, positioning call module 13, first prediction result generation module 14, second prediction result generation module 15, image capturing result generation module 16, tool feature comparison module 17, index updating module 18.
Detailed Description
The application provides a control method and a control system for a workstation of a tool changing robot based on visual analysis, which are used for solving the technical problems of unreliable working quality caused by low matching degree between response control of a workstation tool and actual conditions of the tool in the prior art.
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art from these embodiments without inventive effort fall within the scope of the present application.
It should be noted that the terms "comprises" and "comprising", and any variations thereof, are intended to cover a non-exclusive inclusion, so that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or modules not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
As shown in fig. 1, the present application provides a tool changing robot workstation control method based on visual analysis, wherein the method includes:
step S100: initializing a tool library, and completing the establishment of indexes of tools in the tool library, wherein the indexes are positioning call indexes and information updating indexes;
in one possible embodiment, the tool magazine is used for storage of tools, turning out of tools, types including turret, chain, disk, etc. And initializing the tool library to restore the tool library to an initial working state, thereby completing the establishment of the indexes of the tools in the tool library. Wherein the index is a positioning call and information update index. The positioning call index is used for retrieving the position of the tool when the tool is called. The information update index is an index of an update object when the cutter information is updated. By completing the index establishment of the cutters in the cutter library, the technical effect of providing support for the follow-up reliable, orderly and accurate cutter calling and cutter information updating is achieved.
Step S200: recording working information of the cutters in the cutter library, and generating a cutter feature set according to a recording result, wherein the cutter feature set has a mapping relation with the index;
in the embodiment of the application, when the cutters in the cutter library work, the working information is recorded, including recording information such as the content of a work task, the working time length, the state of the cutters after work and the like, so that the cutter characteristic set is generated according to the recording result. The tool feature set is used for describing characteristics of tools in the tool library, which are different from other tools, so that the tools can be distinguished conveniently. And the cutter feature set and the index have a mapping relation, that is to say, the cutter features and the index have a one-to-one correspondence relation, and each cutter corresponds to one cutter feature.
Step S300: reading the work task information, initializing the arrangement of the cutters in the cutter library according to the work task information, and executing the positioning and calling of the cutters according to the index;
in one possible embodiment, the work task information input into the management system of the tool changing robot workstation is read, wherein the work task information is used for describing the type and the content of work required to be completed by the tools in the tool library. And determining the cutters needing to work according to the work task information, initializing the arrangement of the cutters in the cutter library according to the work sequence in the work task information, positioning the cutters needing to work according to the index, and calling. Therefore, the targets of matching the cutters which are matched with the work task information and carrying out positioning and calling are realized.
Step S400: inputting the cutter feature set and the work task information into a cutter state prediction network to generate a first prediction result with a time sequence identifier;
in one possible embodiment, the tool state prediction network predicts the tool state after the tool completes the work task information according to the tool features described in the tool feature set, the input data is the tool feature set and the work task information, and the output data is the first prediction result with the time sequence identification. Because the same cutter can be called for multiple times when the work task information is finished, the time of each call is different, and the cutter states of the same cutter after different tasks in the work task information are finished can be effectively distinguished by carrying out time sequence identification on the prediction result. The first prediction result is used for describing the state of the cutter after finishing the work task information, and comprises information such as surface profile characteristics, cutter damage and the like.
Preferably, a training sample data set is generated by obtaining a plurality of sample tool feature sets, a plurality of work task information items, and a plurality of sample first prediction results with time sequence identifiers. The training sample data set is divided equally into n groups, a network layer built on a feedforward neural network is trained with the groups in turn, and the parameters carried into each group are those updated by the previous group's training, until the output converges; the trained tool state prediction network is thereby obtained. Repeated training and updating yields a prediction network with higher accuracy and analysis efficiency, so the tool state after the work task can be predicted reliably. The first prediction result with its time sequence identifier then provides the basis for deciding whether the industrial camera should capture images and whether tool features should be compared, effectively identifying the tool state while avoiding wasted resources.
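As a rough illustration of this group-wise training scheme, the following PyTorch sketch splits the samples into n groups and trains a small feedforward network on them sequentially, carrying the updated parameters into each next group. The network shape, layer sizes, and the MSE objective are assumptions, since the patent does not specify them:

```python
import torch
import torch.nn as nn

class ToolStateNet(nn.Module):
    """Illustrative feedforward network: tool feature vector plus task
    encoding in, predicted tool-state vector out (one output per call,
    to which a time sequence identifier can be attached externally)."""
    def __init__(self, feat_dim: int, task_dim: int, state_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim + task_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, state_dim),
        )

    def forward(self, feats, task):
        return self.net(torch.cat([feats, task], dim=-1))

def train_in_groups(model, feats, tasks, targets, n_groups=4, epochs=10, lr=1e-3):
    """Split the samples into n groups and train on them sequentially,
    so each group starts from the previous group's updated parameters."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for idx in torch.chunk(torch.randperm(feats.shape[0]), n_groups):
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(model(feats[idx], tasks[idx]), targets[idx])
            loss.backward()
            opt.step()
    return model
```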
Step S500: obtaining workpiece detection information of a machined workpiece, and performing state evaluation of a mapping tool based on the workpiece detection information to generate a second prediction result;
in one embodiment, after the tools in the tool library are called for workpiece processing, the state of the processed workpiece is detected and analyzed, and workpiece detection information is generated. The workpiece detection information reflects the state of the machined workpiece after machining. By way of example, when the stainless steel strip is sheared by the tool changing robot workstation, the flatness of the shearing edge of the stainless steel strip is detected, and whether a gap exists in the cutter blade can be analyzed according to the detection result, so that the aim of evaluating the state of the mapping cutter is fulfilled. Preferably, a workpiece detection information-tool state mapping network layer is constructed by obtaining a plurality of sample workpiece detection information and a plurality of sample second prediction results of mapping tools, and then the workpiece detection information is input into the workpiece detection information-tool state mapping network layer for mapping search, so that the second prediction results are generated. Wherein the second prediction result predicts a tool state from a point of view of processing the workpiece. By generating the first prediction result and the second prediction result, a reliable basis is provided for comparing whether the cutter needs to be subjected to cutter feature comparison or not from two dimensions, so that the technical effects of effectively analyzing the cutter state, judging in two dimensions and improving the reliability of an analysis result are achieved.
Step S600: if the trigger values of the first prediction result and the second prediction result meet a preset threshold value, invoking an industrial camera, executing image capturing of a corresponding cutter, and generating an image capturing result;
further, as shown in fig. 2, if the trigger values of the first prediction result and the second prediction result meet the preset threshold, the industrial camera is invoked to perform image capturing of the corresponding tool, and an image capturing result is generated, and step S600 in this embodiment of the present application further includes:
when the preset threshold is triggered, generating a cutter calling instruction;
calling a corresponding cutter through an index according to the cutter calling instruction, and executing initial flushing;
acquiring a tool image after flushing by the industrial camera to generate a first image capturing result;
performing deep cleaning of the corresponding cutter, and acquiring cutter images after the deep cleaning by the industrial camera to generate a second image capturing result;
and taking the first image capturing result and the second image capturing result as the image capturing result.
Further, step S600 in the embodiment of the present application further includes:
acquiring a trigger frequency of a preset threshold value, counting the trigger frequency, and generating an early warning accumulated value according to response control information;
and carrying out abnormal report management on the cutter according to the early warning accumulated value.
In this embodiment, when the trigger values of the first and second prediction results meet the preset threshold, that is, when the predicted degree to which the tool fails to meet its normal state, as reflected in the two prediction results, exceeds the preset threshold, images of the corresponding tool must be captured; this supplies the reference for the subsequent tool feature comparison and the basis for subsequent tool changing and response control of the tool library in the robot workstation. The preset threshold is the highest trigger value at which the tool can still be used normally, set by a person skilled in the art. When the trigger value meets the preset threshold, the corresponding tool can no longer meet normal working requirements, so the industrial camera collects images of the tool and the image capturing result is generated. The image capturing result reflects the tool's appearance after the work task is completed.
In one embodiment, a tool calling instruction is generated once the preset threshold is triggered; it commands the tool that triggered the threshold to be called out of the tool library. The corresponding tool is called out through the index according to the instruction, and an initial flush is executed, washing dirt off the tool surface with a water jet or similar means. The flushed tool is then imaged with the industrial camera, generating the first image capturing result. Next, the tool surface is polished with a grindstone or grinding wheel to remove iron chips and oxide, completing the deep cleaning; optionally, an ultrasonic cleaner may be used instead. The deep-cleaned tool is imaged again with the industrial camera, generating the second image capturing result, which reflects the tool state once cutting debris and residue have been removed. The first and second image capturing results together form the image capturing result and provide the basis for the subsequent tool feature comparison.
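The two-stage capture flow might be orchestrated as below. The camera and cleaner objects and their grab, flush, and deep_clean methods are assumed interfaces, not components named in the patent; the index object follows the ToolLibraryIndex sketched earlier:

```python
def capture_tool_images(tool_id, index, camera, cleaner):
    """Sketch of the flush-capture, deep-clean-capture sequence."""
    slot = index.locate(tool_id)   # call the tool out via the positioning index
    cleaner.flush(slot)            # initial rinse of surface dirt
    first = camera.grab(slot)      # first image capturing result
    cleaner.deep_clean(slot)       # grindstone/ultrasonic deep clean
    second = camera.grab(slot)     # second image capturing result
    return first, second           # together: the image capturing result
```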
In one possible embodiment, the number of times the same tool triggers the preset threshold within a preset time window is collected, and the trigger frequency is obtained by dividing that count by the length of the window. For the response control information corresponding to those triggers, an accumulated value is built up after each early warning; this early warning accumulated value reflects how much response control the tool has accumulated within the preset time. If the accumulated value exceeds a preset limit, the same tool is being called out abnormally often, and abnormal report management of the tool is carried out. Reliable evaluation of the tool state is thereby achieved.
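A minimal sketch of this bookkeeping, assuming a sliding time window and a fixed report limit; both values, and the class itself, are placeholders:

```python
import time
from collections import deque
from typing import Deque, Dict, Optional

class ToolAlarmCounter:
    """Count threshold triggers per tool inside a sliding window and
    flag the tool for abnormal report management when the accumulated
    count exceeds the preset limit."""
    def __init__(self, window_s: float = 3600.0, limit: int = 5):
        self.window_s = window_s
        self.limit = limit
        self.events: Dict[str, Deque[float]] = {}

    def record_trigger(self, tool_id: str, now: Optional[float] = None) -> bool:
        now = time.time() if now is None else now
        q = self.events.setdefault(tool_id, deque())
        q.append(now)
        while q and now - q[0] > self.window_s:   # keep only events in the window
            q.popleft()
        return len(q) > self.limit                # True -> issue abnormal report
```

Returning a boolean keeps the sketch simple; a real workstation would presumably attach the tool identifier and the accumulated value to the report it raises.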
Step S700: invoking an image sequence based on the index according to the image capturing result, and executing cutter feature comparison of the image capturing result and the image sequence;
further, as shown in fig. 3, step S700 in the embodiment of the present application further includes:
performing one-to-one mapping of images according to the image sequence and the acquisition positions of the image capturing results;
gray level processing is carried out on the image capturing result, and regional gray level sampling in a preset space is carried out, wherein the preset space is a preset cutter space and a preset background space;
carrying out continuous gray scale segmentation on the position according to the gray scale sampling result, and separating to obtain a cutter characteristic diagram;
and finishing cutter characteristic comparison according to the cutter characteristic diagram and the image sequence.
Further, step S700 in this embodiment of the present application further includes:
generating a first edge segmentation contour of the tool from the image sequence;
mapping the first edge segmentation contour to a cutter feature map based on the one-to-one mapping, and performing center sampling according to a mapping result;
taking the center sampling gray scale as a clustering center, and executing continuous clustering of gray scales;
generating a second edge segmentation contour based on the continuous clustering result;
and finishing cutter characteristic comparison through the comparison result of the first cutting edge segmentation contour and the second cutting edge segmentation contour.
Further, step S700 in the embodiment of the present application further includes:
performing contour range comparison through the first cutting edge segmentation contour and the second cutting edge segmentation contour to generate a contour range coverage result;
extracting contour point pixel gray scales of the first blade segmentation contour and the second blade segmentation contour to obtain a position gray scale set;
establishing blade position mapping of the first blade segmentation contour and the second blade segmentation contour, comparing the same-position blade gray scale based on a position gray scale set, and generating a compensation result according to the comparison result;
and compensating the contour range coverage result according to the compensation result to complete the cutter feature comparison.
In one possible embodiment, an image sequence is called up through the index corresponding to the tool in the image capturing result; it contains the images captured over the tool's repeated calls while completing the work task, arranged by calling time. Tool feature comparison is then performed between the image capturing result and this image sequence.
In a possible embodiment, the images are mapped one-to-one using the image sequence and the acquisition positions of the image capturing result, i.e. the tool positions acquired in the image capturing result are mapped one-to-one onto the corresponding images in the image sequence. The brightness of the three colour components of each image in the capturing result is averaged, completing the grayscale processing and yielding the grayscale image capturing result. Gray levels are then sampled in the preset tool space and background space to obtain the gray sampling result; the preset spaces are the region occupied by the tool and the background region outside it, as defined by a person skilled in the art. After sampling, the image is segmented continuously at the positions of the gray-level peaks and valleys, separating off the background and yielding the tool feature map, which describes the tool's appearance after working. Feature comparison is then carried out between the tool feature map and the image sequence to obtain the tool feature comparison result.
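In code, the grayscale conversion and background separation could look roughly like this. The fixed tool/background masks stand in for the preset spaces, and the midpoint threshold between the two sampled gray levels is a simplification of the peak-valley segmentation:

```python
import numpy as np

def tool_feature_map(rgb: np.ndarray, tool_mask: np.ndarray) -> np.ndarray:
    """Grayscale an HxWx3 image by channel averaging, sample the preset
    tool and background regions, and keep only the tool-side pixels."""
    gray = rgb.mean(axis=2)                 # average of the three components
    tool_level = gray[tool_mask].mean()     # regional gray sampling: tool space
    bg_level = gray[~tool_mask].mean()      # regional gray sampling: background
    cut = (tool_level + bg_level) / 2.0     # valley between the two gray peaks
    keep = gray > cut if tool_level > bg_level else gray < cut
    return np.where(keep, gray, 0.0)        # background zeroed: tool feature map
```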
In one embodiment, the image sequence is grayscale-processed with the same method, and the cutting edge region of the tool is segmented in the processed images to obtain the first edge segmentation contour, i.e. the edge contour as it was when the tool was called. This first contour is mapped onto the tool feature map via the one-to-one mapping, and the gray level of the feature map at the tool centre of the mapped contour is taken as the centre sampling gray. With that gray level as the clustering centre, clustering grows continuously outward: positions in the feature map whose gray level differs from the centre sampling gray by no more than a preset range are added to the segmented region, and the second edge segmentation contour is generated from the clustering result. The second contour is thus the outer contour of the cutting edge as depicted in the tool feature map.
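This continuous gray clustering reads like a seeded region grow; below is a sketch under that interpretation, with the gray tolerance as an assumed parameter:

```python
import numpy as np
from collections import deque

def grow_edge_region(gray: np.ndarray, seed: tuple, tol: float) -> np.ndarray:
    """4-connected region grow from the centre sample: absorb neighbours
    whose gray level stays within tol of the centre sampling gray."""
    h, w = gray.shape
    centre_gray = gray[seed]
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        if mask[y, x]:
            continue
        mask[y, x] = True
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(gray[ny, nx] - centre_gray) <= tol):
                queue.append((ny, nx))
    return mask   # its boundary is the second edge segmentation contour
```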
Preferably, the contour areas enclosed by the first and second edge segmentation contours are compared, and the ratio is taken as the contour range coverage result. A large coverage discrepancy indicates that the cutting edge was damaged during working; when the coverage result changes little, the short interval between the image sequence and the image capturing result means the before-and-after difference of the edge cannot be seen clearly from the contour alone. In that case the pixel gray levels of the peripheral contour points of both contours are extracted, giving the position gray set. An edge position mapping is then established by checking whether peripheral contour points of the two contours lie on the same line through the tool centre; points that do are treated as the same edge position before and after the working, so their gray levels can be compared position by position. Based on the position gray set and this mapping, the gray difference at each mapped position is analysed: a higher gray level (a brighter edge) indicates a better edge state, and a lower gray level (a darker edge) a worse one, so comparing the mapped gray levels reveals the edge state. The larger the gray difference, the more the tool state has degraded, and the gray differences obtained from the comparison form the compensation result. The contour range coverage result is compensated with this result; optionally, the ratios of the several gray differences to the gray levels at the corresponding first-contour positions are averaged, and the coverage result is multiplied by the reciprocal of that average, completing the compensation. The compensated value is taken as the tool feature comparison result.
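Taking that description literally, the compensation arithmetic might be sketched as follows; it assumes the matched contour-point gray levels are available and nonzero, and it reproduces the stated reciprocal form rather than a numerically safer variant:

```python
import numpy as np

def compare_tool_features(area_first: float, area_second: float,
                          gray_first: np.ndarray,
                          gray_second: np.ndarray) -> float:
    """Compensated comparison: area coverage ratio of the two contours,
    adjusted by the mean relative gray drop at mapped edge positions."""
    coverage = area_second / area_first        # contour range coverage result
    diffs = gray_first - gray_second           # darker edge -> degraded state
    rel = np.mean(diffs / gray_first)          # mean gray-difference ratio
    # Per the description: multiply the coverage result by the reciprocal
    # of the averaged ratio; guard the degenerate no-change case.
    return coverage if rel == 0 else coverage * (1.0 / rel)
```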
Step S800: and generating response control information of the cutter according to the cutter characteristic comparison result, and updating the response control information and the image capturing result to an index.
Further, step S800 in the embodiment of the present application further includes:
performing edge attachment feature recognition of the cutter through the first image capturing result to generate an attachment feature recognition result;
comparing the cutter characteristics according to the second image capturing result to obtain a cutter characteristic comparison result;
and compensating the response control information according to the attachment feature recognition result, and updating the compensation result to the index.
In one possible embodiment, control information matching the tool feature comparison result is retrieved from a response control library, and the response control information and the image capturing result are then updated to the index, completing the information update for the tools in the library and facilitating subsequent tool calls.
Preferably, the attachment features of the tool surface are determined from the first image capturing result; preferably, the first image capturing result is passed to a blade attachment feature recognition network layer, which identifies the attachment features and outputs the attachment feature recognition result. This network layer intelligently recognises foreign matter adhering to the tool surface after working, and its output includes features such as cutting debris size, cutting residue type, and attachment area. It is obtained by supervised training of a layer built on a convolutional neural network with a plurality of sample first image capturing results and the corresponding sample attachment feature recognition results, until the output converges. Tool feature comparison is performed on the second image capturing result to obtain the tool feature comparison result, and response control information is generated from it. The response control information is then compensated with the attachment feature recognition result, i.e. the parameters concerning attachment handling in the response control information are adjusted according to the recognised attachment condition, and the compensation result is updated synchronously to the index.
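A toy convolutional classifier of the kind such a recognition layer might use; the class set (none, chips, residue) and the architecture are assumptions, since the patent specifies only that the layer is built on a convolutional neural network:

```python
import torch
import torch.nn as nn

class BladeAttachmentNet(nn.Module):
    """Illustrative CNN: a single-channel tool image in, attachment
    class scores out."""
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, n_classes)
        )

    def forward(self, x):
        return self.head(self.features(x))
```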
In summary, the embodiments of the present application have at least the following technical effects:
according to the method, the index establishment of the cutter in the cutter library is completed through initializing the cutter library, the aim of supporting follow-up cutter response control is achieved, then a cutter feature set is generated according to recorded results, cutter positioning calling is carried out according to the index, intelligent prediction is carried out on the state of the cutter after a work task is completed, a first prediction result with a time sequence identifier is generated, then the state of a machined workpiece is detected, a second prediction result is generated according to detection result mapping, analysis is carried out on the first prediction result and the second prediction result, if a trigger value meets a preset threshold value, an image capturing result is generated, cutter feature comparison is carried out, response control information of the cutter is generated according to cutter feature comparison results, and the response control information and the image capturing result are updated to the index. The technical effects of deep mining of image data, reliable analysis of cutter states and improvement of response control quality are achieved.
Example 2
Based on the same inventive concept as the tool changing robot workstation control method based on visual analysis in the foregoing embodiments, as shown in fig. 4, the present application provides a tool changing robot workstation control system based on visual analysis, and the system and method embodiments in the embodiments of the present application are based on the same inventive concept. Wherein the system comprises:
the index establishing module 11 is used for initializing a tool library and completing the index establishment of tools in the tool library, wherein the index is a positioning call and information updating index;
the tool feature set generating module 12 is configured to record working information of tools in the tool library, and generate a tool feature set according to a recording result, where the tool feature set has a mapping relationship with the index;
the positioning calling module 13 is used for reading the work task information, initializing the arrangement of the cutters in the cutter library according to the work task information, and executing the positioning calling of the cutters according to the index;
a first prediction result generating module 14, configured to input the tool feature set and the work task information into a tool state prediction network, and generate a first prediction result with a time sequence identifier;
the second prediction result generating module 15 is configured to obtain workpiece detection information of a machined workpiece, and perform state evaluation of the mapping tool based on the workpiece detection information to generate a second prediction result;
the image capturing result generating module 16 is configured to invoke the industrial camera to perform image capturing of the corresponding tool and generate an image capturing result if the trigger values of the first prediction result and the second prediction result meet a preset threshold;
a cutter feature comparison module 17 for calling an image sequence based on an index according to the image capturing result, and executing cutter feature comparison of the image capturing result and the image sequence;
the index updating module 18 is configured to generate response control information of the tool according to the tool feature comparison result, and update the response control information and the image capturing result to an index.
Further, the tool feature comparison module 17 is configured to perform the following steps:
performing one-to-one mapping of images according to the image sequence and the acquisition positions of the image capturing results;
gray level processing is carried out on the image capturing result, and regional gray level sampling in a preset space is carried out, wherein the preset space is a preset cutter space and a preset background space;
carrying out continuous gray scale segmentation on the position according to the gray scale sampling result, and separating to obtain a cutter characteristic diagram;
and finishing cutter characteristic comparison according to the cutter characteristic diagram and the image sequence.
Further, the tool feature comparison module 17 is configured to perform the following steps:
generating a first edge segmentation contour of the tool from the image sequence;
mapping the first edge segmentation contour to a cutter feature map based on the one-to-one mapping, and performing center sampling according to a mapping result;
taking the center sampling gray scale as a clustering center, and executing continuous clustering of gray scales;
generating a second edge segmentation contour based on the continuous clustering result;
and finishing cutter characteristic comparison through the comparison result of the first cutting edge segmentation contour and the second cutting edge segmentation contour.
Further, the tool feature comparison module 17 is configured to perform the following steps:
performing contour range comparison through the first cutting edge segmentation contour and the second cutting edge segmentation contour to generate a contour range coverage result;
extracting contour point pixel gray scales of the first blade segmentation contour and the second blade segmentation contour to obtain a position gray scale set;
establishing blade position mapping of the first blade segmentation contour and the second blade segmentation contour, comparing the same-position blade gray scale based on a position gray scale set, and generating a compensation result according to the comparison result;
and compensating the contour range coverage result according to the compensation result to complete the cutter feature comparison.
Further, the image capturing result generating module 16 is configured to perform the following steps:
when the preset threshold is triggered, generating a cutter calling instruction;
calling a corresponding cutter through an index according to the cutter calling instruction, and executing initial flushing;
acquiring a tool image after flushing by the industrial camera to generate a first image capturing result;
performing deep cleaning of the corresponding cutter, and acquiring cutter images after the deep cleaning by the industrial camera to generate a second image capturing result;
and taking the first image capturing result and the second image capturing result as the image capturing result.
Further, the index updating module 18 is configured to perform the following steps:
performing edge attachment feature recognition of the cutter through the first image capturing result to generate an attachment feature recognition result;
comparing the cutter characteristics according to the second image capturing result to obtain a cutter characteristic comparison result;
and compensating the response control information according to the attachment feature recognition result, and updating the compensation result to the index.
Further, the image capturing result generating module 16 is configured to perform the following steps:
acquiring a trigger frequency of a preset threshold value, counting the trigger frequency, and generating an early warning accumulated value according to response control information;
and carrying out abnormal report management on the cutter according to the early warning accumulated value.
It should be noted that the order of the embodiments above is for description only and does not indicate their relative merit. The foregoing describes specific embodiments of this specification; other embodiments fall within the scope of the appended claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. Likewise, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results; in some embodiments, multitasking and parallel processing are also possible and may be advantageous.
The foregoing description covers only preferred embodiments of the present application and is not intended to limit the invention to those particular embodiments.
The specification and drawings are merely exemplary of the application and are intended to cover any modifications, variations, combinations, or equivalents within its scope. It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from its scope; if such modifications and variations fall within the scope of the present application and its equivalents, the present application is intended to cover them.

Claims (5)

1. The tool changing robot workstation control method based on visual analysis is characterized by comprising the following steps:
initializing a tool library, and completing the establishment of indexes of tools in the tool library, wherein the indexes are positioning call indexes and information updating indexes;
recording working information of the cutters in the cutter library, and generating a cutter feature set according to a recording result, wherein the cutter feature set has a mapping relation with the index;
reading the work task information, initializing the arrangement of the cutters in the cutter library according to the work task information, and executing the positioning and calling of the cutters according to the index;
inputting the cutter feature set and the work task information into a cutter state prediction network to generate a first prediction result with a time sequence identifier;
obtaining workpiece detection information of a machined workpiece, and performing state evaluation of a mapping tool based on the workpiece detection information to generate a second prediction result;
if the trigger values of the first prediction result and the second prediction result meet a preset threshold value, invoking an industrial camera, executing image capturing of a corresponding cutter, and generating an image capturing result;
invoking an image sequence based on the index according to the image capturing result, and executing cutter feature comparison of the image capturing result and the image sequence;
generating response control information of the cutter according to the cutter characteristic comparison result, and updating the response control information and the image capturing result to an index;
wherein the method further comprises:
performing one-to-one mapping of images according to the image sequence and the acquisition positions of the image capturing results;
gray level processing is carried out on the image capturing result, and regional gray level sampling in a preset space is carried out, wherein the preset space is a preset cutter space and a preset background space;
carrying out continuous gray scale segmentation on the position according to the gray scale sampling result, and separating to obtain a cutter characteristic diagram;
completing cutter characteristic comparison according to the cutter characteristic diagram and the image sequence;
wherein the method further comprises:
generating a first edge segmentation profile of a tool with the sequence of images;
mapping the first edge segmentation contour to a cutter feature map based on the one-to-one mapping, and performing center sampling according to a mapping result;
taking the center sampling gray scale as a clustering center, and executing continuous clustering of gray scales;
generating a second edge segmentation contour based on the continuous clustering result;
the cutter characteristic comparison is completed through the comparison result of the first cutter cutting edge dividing contour and the second cutter cutting edge dividing contour;
wherein the method further comprises:
performing contour range comparison through the first cutting edge segmentation contour and the second cutting edge segmentation contour to generate a contour range coverage result;
extracting contour point pixel gray scales of the first blade segmentation contour and the second blade segmentation contour to obtain a position gray scale set;
establishing blade position mapping of the first blade segmentation contour and the second blade segmentation contour, comparing the same-position blade gray scale based on a position gray scale set, and generating a compensation result according to the comparison result;
and compensating the covering result of the profile range according to the compensating result to finish the characteristic comparison of the cutter.
2. The method of claim 1, wherein if the trigger values of the first and second predicted outcomes meet a preset threshold, invoking an industrial camera to perform image capture of a corresponding tool, generating an image capture outcome, further comprising:
when the preset threshold is triggered, generating a cutter calling instruction;
calling a corresponding cutter through an index according to the cutter calling instruction, and executing initial flushing;
acquiring a tool image after flushing by the industrial camera to generate a first image capturing result;
performing deep cleaning of the corresponding cutter, and acquiring cutter images after the deep cleaning by the industrial camera to generate a second image capturing result;
and taking the first image capturing result and the second image capturing result as the image capturing result.
3. The method of claim 2, wherein the method further comprises:
performing edge attachment feature recognition of the cutter through the first image capturing result to generate an attachment feature recognition result;
comparing the cutter characteristics according to the second image capturing result to obtain a cutter characteristic comparison result;
and compensating the response control information according to the attachment feature recognition result, and updating the compensation result to the index.
4. The method of claim 1, wherein the method further comprises:
acquiring a trigger frequency of a preset threshold value, counting the trigger frequency, and generating an early warning accumulated value according to response control information;
and carrying out abnormal report management on the cutter according to the early warning accumulated value.
5. Tool changing robot workstation control system based on visual analysis, characterized in that the system comprises:
the index establishing module is used for initializing the tool library and completing the index establishment of tools in the tool library, wherein the index is a positioning call and information updating index;
the tool feature set generation module is used for recording working information of tools in the tool library and generating a tool feature set according to a recording result, wherein the tool feature set has a mapping relation with the index;
the positioning calling module is used for reading the work task information, initializing the arrangement of the cutters in the cutter library according to the work task information, and executing the positioning calling of the cutters according to the index;
the first prediction result generation module is used for inputting the cutter characteristic set and the work task information into a cutter state prediction network to generate a first prediction result with a time sequence identifier;
the second prediction result generation module is used for obtaining workpiece detection information of a machined workpiece, and carrying out state evaluation of the mapping tool on the basis of the workpiece detection information to generate a second prediction result;
the image capturing result generating module is used for calling the industrial camera to execute image capturing of the corresponding cutter and generating an image capturing result if the trigger values of the first prediction result and the second prediction result meet the preset threshold value;
the cutter feature comparison module is used for calling an image sequence based on the index according to the image capturing result and executing cutter feature comparison of the image capturing result and the image sequence;
the index updating module is used for generating response control information of the cutter according to the cutter characteristic comparison result and updating the response control information and the image capturing result to an index;
the cutter characteristic comparison module is used for executing the following steps:
performing one-to-one mapping of images according to the image sequence and the acquisition positions of the image capturing results;
gray level processing is carried out on the image capturing result, and regional gray level sampling in a preset space is carried out, wherein the preset space is a preset cutter space and a preset background space;
carrying out continuous gray scale segmentation on the position according to the gray scale sampling result, and separating to obtain a cutter characteristic diagram;
completing cutter characteristic comparison according to the cutter characteristic diagram and the image sequence;
generating a first edge segmentation contour of the tool from the image sequence;
mapping the first edge segmentation contour onto the tool feature map based on the one-to-one mapping, and performing center sampling according to the mapping result;
taking the center-sampled gray level as a clustering center and executing continuous clustering of gray levels;
generating a second edge segmentation contour based on the continuous clustering result;
completing the tool feature comparison through the comparison result of the first edge segmentation contour and the second edge segmentation contour (a contour comparison sketch follows this claim);
performing a contour range comparison between the first edge segmentation contour and the second edge segmentation contour to generate a contour range coverage result;
extracting the contour-point pixel gray levels of the first edge segmentation contour and the second edge segmentation contour to obtain a position gray level set;
establishing an edge position mapping between the first edge segmentation contour and the second edge segmentation contour, comparing same-position edge gray levels based on the position gray level set, and generating a compensation result according to the comparison result;
and compensating the contour range coverage result according to the compensation result to complete the tool feature comparison.
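The image capturing result generation module in claim 5 fires the camera only when both prediction trigger values reach the preset threshold. A minimal sketch of that gate, with OpenCV's generic VideoCapture standing in for the industrial camera SDK; the function name and camera_index parameter are hypothetical.

import cv2  # assumed stand-in for the industrial camera interface

def capture_if_triggered(first_pred: float, second_pred: float,
                         threshold: float, camera_index: int = 0):
    """Grab one frame as the image capturing result only when both
    prediction trigger values meet the preset threshold."""
    if first_pred >= threshold and second_pred >= threshold:
        cam = cv2.VideoCapture(camera_index)
        ok, frame = cam.read()
        cam.release()
        return frame if ok else None
    return None  # below threshold: no capture is triggered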
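The grayscale steps of the tool feature comparison module can be read as: sample a mean gray level in the preset tool space and the preset background space, then keep the gray band continuous with the tool sample. A sketch under those assumptions; the (x, y, w, h) ROI tuples and the midpoint cut are choices of this sketch, not specified by the patent.

import cv2
import numpy as np

def extract_tool_feature_map(capture_bgr: np.ndarray,
                             tool_roi: tuple, background_roi: tuple) -> np.ndarray:
    """Grayscale the image capturing result, sample the two preset
    spaces, and separate a tool feature map by a continuous gray cut."""
    gray = cv2.cvtColor(capture_bgr, cv2.COLOR_BGR2GRAY)

    def region_mean(roi):
        x, y, w, h = roi
        return float(gray[y:y + h, x:x + w].mean())

    tool_gray = region_mean(tool_roi)      # regional sampling: preset tool space
    bg_gray = region_mean(background_roi)  # regional sampling: preset background space

    # Continuous grayscale segmentation approximated by a midpoint cut:
    # pixels on the tool side of the cut survive into the feature map.
    cut = 0.5 * (tool_gray + bg_gray)
    mask = (gray > cut) if tool_gray > bg_gray else (gray < cut)
    return np.where(mask, gray, 0).astype(np.uint8)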
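The contour steps pair a first edge segmentation contour taken from the indexed image sequence with a second contour grown by gray-level clustering around a center sample, then compensate the contour range coverage result with same-position gray differences. The sketch below approximates this with Canny edges, a centroid-seeded intensity band in place of continuous clustering, and an area ratio; the thresholds and seed_tolerance are assumptions, and both images are assumed to actually contain a tool.

import cv2
import numpy as np

def compare_edge_contours(reference_gray: np.ndarray,
                          feature_map: np.ndarray,
                          seed_tolerance: int = 12) -> float:
    # First edge segmentation contour from the reference image sequence.
    edges = cv2.Canny(reference_gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    first = max(contours, key=cv2.contourArea)

    # Center sampling: gray value at the first contour's centroid, read
    # from the feature map via the assumed one-to-one image mapping.
    m = cv2.moments(first)
    cx = int(m["m10"] / (m["m00"] or 1.0))
    cy = int(m["m01"] / (m["m00"] or 1.0))
    seed = int(feature_map[cy, cx])

    # "Continuous clustering" simplified to an intensity band around the seed.
    lo, hi = max(seed - seed_tolerance, 0), min(seed + seed_tolerance, 255)
    band = (((feature_map >= lo) & (feature_map <= hi)) * 255).astype(np.uint8)
    second_contours, _ = cv2.findContours(band, cv2.RETR_EXTERNAL,
                                          cv2.CHAIN_APPROX_SIMPLE)
    second = max(second_contours, key=cv2.contourArea)

    # Contour range coverage result: area ratio of second to first contour.
    coverage = cv2.contourArea(second) / max(cv2.contourArea(first), 1.0)

    # Same-position gray comparison used as the compensation term.
    n = min(len(first), len(second))
    g1 = np.array([reference_gray[p[0][1], p[0][0]] for p in first[:n]], float)
    g2 = np.array([feature_map[p[0][1], p[0][0]] for p in second[:n]], float)
    compensation = float(np.abs(g1 - g2).mean()) / 255.0
    return coverage * (1.0 - compensation)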

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410004465.8A CN117549313B (en) 2024-01-03 2024-01-03 Tool changing robot workstation control method and system based on visual analysis

Publications (2)

Publication Number Publication Date
CN117549313A (en) 2024-02-13
CN117549313B (en) 2024-03-29

Family

ID=89818760

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410004465.8A Active CN117549313B (en) 2024-01-03 2024-01-03 Tool changing robot workstation control method and system based on visual analysis

Country Status (1)

Country Link
CN (1) CN117549313B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000012259A1 (en) * 1998-08-28 2000-03-09 Mori Seiki Co., Ltd. Preparation of tool information database for nc machining and system for managing tools by utilizing the tool information database
CN101670533A (en) * 2009-09-25 2010-03-17 南京信息工程大学 Cutting-tool wear state evaluating method based on image analysis of workpiece machining surface
CN101758423A (en) * 2008-12-23 2010-06-30 上海诚测电子科技发展有限公司 Rotational cutting tool state multiple parameter overall assessment method based on image identification
CN113909996A (en) * 2021-09-30 2022-01-11 华中科技大学 High-end equipment machining state monitoring method and system based on digital twinning
CN114670062A (en) * 2022-05-31 2022-06-28 苏芯物联技术(南京)有限公司 Real-time detection method and system for wear state of drilling tool
CN114742798A (en) * 2022-04-13 2022-07-12 武汉科技大学 Disc shear tool changing time prediction system and method based on shear blade wear detection
CN115129003A (en) * 2022-06-08 2022-09-30 华中科技大学 Automatic production line intelligent monitoring system based on self-learning time-varying digital twin
CN116523865A (en) * 2023-04-25 2023-08-01 大连理工大学 Workpiece surface roughness prediction method
CN116539620A (en) * 2023-04-25 2023-08-04 大连理工大学 On-machine detection method for surface defects of cutter
CN116967844A (en) * 2023-05-12 2023-10-31 南京工大数控科技有限公司 Cutter state monitoring and life predicting system for numerical control machine tool and using method thereof
WO2023228929A1 (en) * 2022-05-24 2023-11-30 三菱電機株式会社 Tool diagnosis system, tool diagnosis device, tool diagnosis method, and program


Similar Documents

Publication Publication Date Title
CN115063409B (en) Method and system for detecting surface material of mechanical cutter
CN109191367B (en) Splicing method of cutter abrasion images and life prediction method of cutter
EP0142990A2 (en) Inspecting articles
CN115311629B (en) Abnormal bending precision monitoring system of bending machine
KR20220117328A (en) A metal texture phase classification method, a metal texture phase classification apparatus, a metal texture phase learning method, a metal texture phase learning apparatus, a material property prediction method of a metal material, and a material property prediction apparatus of a metal material
CN111027343A (en) Bar code area positioning method and device
CN115170563A (en) Detection system and method for die casting after deburring based on Internet of things
CN111783544B (en) Method for building diamond milling grinding head state monitoring system for processing ceramic mobile phone backboard
CN113469951A (en) Hub defect detection method based on cascade region convolutional neural network
CN113487533A (en) Part assembly quality digital detection system and method based on machine learning
Sharma et al. Concrete crack detection using the integration of convolutional neural network and support vector machine
Schmitt et al. Machine vision system for inspecting flank wear on cutting tools
CN116797553A (en) Image processing method, device, equipment and storage medium
CN117549313B (en) Tool changing robot workstation control method and system based on visual analysis
CN112561885B (en) YOLOv 4-tiny-based gate valve opening detection method
CN115424075A (en) Method and system for monitoring pipeline state
CN116823708A (en) PC component side mold identification and positioning research based on machine vision
CN111240195A (en) Automatic control model training and target object recycling method and device based on machine vision
CN113021355B (en) Agricultural robot operation method for predicting sheltered crop picking point
CN114862786A (en) Retinex image enhancement and Ostu threshold segmentation based isolated zone detection method and system
JP3771809B2 (en) Material life evaluation system
CN117911415B (en) Automatic equipment supervision system and method based on machine vision
Jiang et al. Does background really matter? Worker activity recognition in unconstrained construction environment
CN117593301B (en) Machine vision-based memory bank damage rapid detection method and system
CN115496931B (en) Industrial robot health monitoring method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant