CN111645111A - Intelligent manufacturing system and method based on industrial robot - Google Patents

Intelligent manufacturing system and method based on industrial robot

Info

Publication number
CN111645111A
CN111645111A (application CN202010456255.4A)
Authority
CN
China
Prior art keywords
module
image
workpiece
processing
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202010456255.4A
Other languages
Chinese (zh)
Inventor
彭二宝
王宏颖
倪江楠
余森
王景
安俊杰
韩宏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Henan Polytechnic Institute
Original Assignee
Henan Polytechnic Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Henan Polytechnic Institute filed Critical Henan Polytechnic Institute
Priority to CN202010456255.4A
Publication of CN111645111A
Legal status: Withdrawn

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23P METAL-WORKING NOT OTHERWISE PROVIDED FOR; COMBINED OPERATIONS; UNIVERSAL MACHINE TOOLS
    • B23P23/00 Machines or arrangements of machines for performing specified combinations of different metal-working operations not covered by a single other subclass
    • B23P23/04 Machines or arrangements of machines for performing specified combinations of different metal-working operations not covered by a single other subclass for both machining and other metal-working operations
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/005 Manipulators for mechanical processing tasks
    • B25J11/0065 Polishing or grinding
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/04 Viewing devices

Abstract

The invention belongs to the technical field of industrial manufacturing and discloses an intelligent manufacturing system and method based on an industrial robot. The system comprises modules for processing, detection, conveying, execution, control, display, and monitoring, which exchange data rapidly and coordinate the process flow over industrial Ethernet. Vision enables automatic loading and unloading and automatic inspection of machining dimensions, and the industrial robot enables in-line processing and inspection. The vision data gives the robot a basis for sensing changes in the surrounding environment and adjusting its motion, ensures that tasks are completed correctly, and provides an external closed-loop control mechanism. The detection module also has a dimension-measurement function and can automatically gauge a machined product according to the configured settings. Compared with traditional manual quality inspection, the industrial robot offers great advantages: it operates precisely under computer control, with a high degree of automation and high manufacturing efficiency, which improves the robot's practical value.

Description

Intelligent manufacturing system and method based on industrial robot
Technical Field
The invention belongs to the technical field of industrial manufacturing, and particularly relates to an intelligent manufacturing system and method based on an industrial robot.
Background
At present, rising labor costs in manufacturing are driving up the level of automation in every industry. To replace manual labor, more and more manufacturers are adopting industrial robots for modern production.
An industrial robot is a multi-joint manipulator or multi-degree-of-freedom machine for the industrial field that can execute work automatically, relying on its own power and control capability to realize various functions. It can accept human commands and operate according to preset programs, and modern industrial robots can also act according to strategies formulated with artificial-intelligence techniques.
However, in existing applications the industrial robot serves only a single stage of the manufacturing operation, so intelligent operation of the whole production line cannot be realized. Moreover, products from the automated production line still require manual quality inspection, which reduces manufacturing efficiency.
Through the above analysis, the problems and defects of the prior art are as follows:
(1) In existing applications, the industrial robot is applied only to a single stage of the manufacturing operation, and intelligent operation of the whole process flow cannot be realized.
(2) Products produced by the automated production line require manual quality inspection, which reduces manufacturing efficiency.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides an intelligent manufacturing system and method based on an industrial robot.
The invention is realized in such a way that an industrial robot-based intelligent manufacturing method comprises the following steps:
the method comprises the following steps that firstly, a vision acquisition module monitors the equipment state and the processing workpiece state in the manufacturing process in real time through a plurality of industrial cameras, and processes and identifies the acquired images; in the process of vision acquisition, carrying out on-site control adjustment on control parameters of a central control module through a human-computer interaction interface of a display device in a human-computer interaction module;
secondly, according to the image processing and recognition results for the equipment state and the machined-workpiece state, the central control module, which is respectively connected with the communication module, the vision acquisition module, the feeding module, the numerical milling module, the polishing module, the blanking module, the detection module, the warehousing module, the human-computer interaction module, and the storage module, processes the information data through an industrial controller and coordinates the mutual work among the modules;
step three, the feeding module controls the feeding robot to clamp the workpiece and place it at the processing position of the processing equipment, and the numerical milling module controls the numerical milling equipment to mill the workpiece; the grinding module controls the polishing equipment to grind and polish the machined workpiece after the numerical milling, and the blanking module controls the blanking robot to clamp the machined workpiece and carry it to the detection platform;
after the workpiece is machined, the detection module acquires images of the machined workpiece of the detection platform through an industrial camera, judges the type of the workpiece according to the appearance of the product, measures the size, the aperture and the area of the machined workpiece and judges whether the product is qualified or not;
step five, after the product is detected to be qualified, the warehousing module stores the processed workpieces in grades according to quality grades by controlling the warehousing robot; meanwhile, the storage module stores the control parameters of the system and the preset product information through a storage server;
step six, the central control module controls the communication module to receive the control parameters of the remote monitoring terminal and send the real-time working state parameters through the signal transmitter so as to exchange data; the remote control module carries out remote monitoring on the manufacturing process through a remote monitoring terminal, and sends a control instruction through the remote monitoring terminal to carry out remote regulation and control on the manufacturing process;
the detection process of the detection module is as follows:
firstly, an image acquisition unit to be detected acquires an image of a processing workpiece of a detection platform through an industrial camera arranged at the upper end of the detection platform;
secondly, the category judgment unit compares the product appearance of the collected image with the product appearance prestored in the storage server to obtain the category corresponding to the detected product;
thirdly, the vision measuring unit measures the size, the aperture and the area of the processed workpiece according to the acquired image;
fourthly, after the measurement is finished, the result output unit compares the acquired workpiece parameters with prestored parameters of the type of the workpiece, judges whether the product is qualified or not, and outputs the detection result to a display device;
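The pass/fail comparison in the fourth detection step above can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the category name, parameter names, and tolerance values are all assumptions.

```python
# Hypothetical sketch of the result-output unit's comparison: measured
# workpiece parameters are checked against prestored parameters for the
# identified category. All names and values here are illustrative.

PRESTORED_PARAMS = {
    "flange_A": {"size_mm": 120.0, "aperture_mm": 10.0, "area_mm2": 450.0},
}
TOLERANCE = {"size_mm": 0.5, "aperture_mm": 0.1, "area_mm2": 5.0}

def judge_workpiece(category, measured):
    """The product is qualified only if every measured parameter lies
    within its tolerance band around the prestored reference value."""
    reference = PRESTORED_PARAMS[category]
    for name, ref_value in reference.items():
        if abs(measured[name] - ref_value) > TOLERANCE[name]:
            return False
    return True
```

A workpiece measuring 120.2 mm with a 10.05 mm aperture would pass under these assumed tolerances, while one measuring 125 mm would be rejected.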
the visual acquisition process of the visual acquisition module comprises the following steps:
1) the camera initialization unit restores the irradiation angles and the focusing parameters of the cameras at different positions to corresponding initial setting parameters; the image acquisition unit acquires real-time images of the operation equipment and the processing workpiece in real time;
2) after the image is collected, the image processing unit preprocesses the collected image to obtain an image with high definition;
3) according to the preset spatial information, a coordinate conversion unit converts two-dimensional coordinates of different position points into three-dimensional coordinates;
4) according to the obtained three-dimensional coordinate data, the target posture calculation unit identifies and judges the positions and working postures of different running equipment;
5) and an output servo amount unit generates a control servo amount for the operating equipment according to the judged working posture of the operating equipment and outputs the control servo amount in real time.
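The coordinate conversion of step 3) can be illustrated with a standard pinhole camera model. The patent does not specify the conversion method; the intrinsics (fx, fy, cx, cy) and the assumption of a known working-plane depth Z are hypothetical.

```python
# Illustrative 2D-pixel to 3D-point conversion under a pinhole model,
# assuming the point lies on a plane at known depth Z in the camera
# frame. Intrinsic parameter values are assumptions.

def pixel_to_3d(u, v, depth_z, fx=800.0, fy=800.0, cx=320.0, cy=240.0):
    """Back-project pixel (u, v): X = (u - cx) * Z / fx,
    Y = (v - cy) * Z / fy, Z = depth_z."""
    x = (u - cx) * depth_z / fx
    y = (v - cy) * depth_z / fy
    return (x, y, depth_z)
```

The pixel at the principal point (cx, cy) maps to (0, 0, Z), as expected.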
Further, in the process that a human-computer interaction interface of a display device in the human-computer interaction module carries out field control adjustment on the control parameter of the central control module, the method for selecting the touch point by the display device comprises the following steps:
establishing a corresponding data training sample for the touch points of the human body, and carrying out statistical analysis on the touch points;
collecting multiple readings that fall within the touch-screen range, sorting the data, and taking the difference of the two middle values;
if the difference is greater than the threshold, the batch of samples is discarded; since the data are sorted, the difference is non-negative and need not be stored as a signed number.
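The touch-point selection rule above can be sketched as a small filter. The threshold value is an assumption; the patent does not give one.

```python
# Hedged sketch of the touch-point acceptance rule: repeated readings
# are sorted and the difference of the two middle values is compared
# against a threshold. The threshold of 4 units is illustrative.

def accept_touch_samples(readings, threshold=4):
    """Sort the readings, take the (non-negative) difference of the two
    middle values, and accept the batch only if it is within the
    threshold; otherwise the batch is discarded."""
    ordered = sorted(readings)
    mid = len(ordered) // 2
    middle_diff = ordered[mid] - ordered[mid - 1]
    return middle_diff <= threshold
```

Because the list is sorted first, the middle difference is always zero or positive, which is why no signed representation is needed.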
Further, the image processing unit processes the image as follows:
carrying out color space conversion on the acquired image, and converting an RGB color image in an original format of the acquired image into a gray image;
diagnosing the definition (sharpness) and brightness of the image and comparing the results with thresholds: an image whose computed result exceeds the given threshold is judged to be clear;
and according to the image diagnosis result, performing compensation adjustment on the contrast and the brightness value of the image to obtain a clear image meeting a given threshold value.
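The two image-processing operations above can be sketched in a few lines. The ITU-R BT.601 luma weights are a standard choice for RGB-to-gray conversion; the target brightness and the linear-gain compensation scheme are assumptions, since the patent only says "compensation adjustment".

```python
# Minimal sketch: gray conversion with BT.601 weights, then a linear
# brightness compensation when the mean falls below an assumed target.

def rgb_to_gray(pixels):
    """Convert a list of (R, G, B) tuples to gray values."""
    return [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels]

def compensate(gray, target_mean=128.0):
    """If the mean brightness is below the target, apply a linear gain
    so the adjusted mean reaches it (clipped to the 8-bit range)."""
    mean = sum(gray) / len(gray)
    if mean >= target_mean:
        return gray
    gain = target_mean / mean
    return [min(255.0, g * gain) for g in gray]
```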
Further, in the process of processing and identifying the acquired image by an image processing unit in the visual acquisition module, the method for extracting the image features comprises the following steps:
performing color space conversion on the acquired equipment state image and the processed workpiece state image according to an image processing unit, and converting an RGB color image in an original format of an acquired image into a gray image;
determining, in the gray-scale image, the histogram and the mean value of the regions that reflect the equipment state and the machined-workpiece state;
and solving the feature vector of the image texture feature according to the histogram and the mean value of the region.
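One plausible reading of the feature-extraction step is a feature vector built from a normalised gray-level histogram concatenated with the regional mean; the 16-bin resolution is an assumption.

```python
# Illustrative texture-feature vector: a normalised 16-bin histogram of
# 8-bit gray values plus the mean, concatenated. Bin count is assumed.

def texture_feature_vector(gray, bins=16):
    """Build [normalised histogram..., mean] for 8-bit gray values."""
    hist = [0] * bins
    for g in gray:
        hist[min(int(g) * bins // 256, bins - 1)] += 1
    total = len(gray)
    normalised = [h / total for h in hist]
    mean = sum(gray) / total
    return normalised + [mean]
```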
Further, the process of identifying the image according to the feature vector of the image texture feature is as follows:
performing feature matching on the pre-stored data by using a feature matching model according to the feature vector of the acquired image texture feature;
according to the feature matching values, the feature with the smallest distance value is selected for accurate positioning; and
direction values are assigned to the feature points, and the features are described and explained.
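The smallest-distance selection rule can be sketched as nearest-neighbour matching. Euclidean distance is an assumed choice here; the patent names only a "feature matching model".

```python
# Hedged sketch of feature matching: the prestored feature vector with
# the smallest Euclidean distance to the acquired vector is the match.
import math

def match_feature(acquired, prestored):
    """Return the key of the nearest prestored feature vector."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(prestored, key=lambda k: distance(acquired, prestored[k]))
```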
Further, the central control module operates each module through the following process:
the initialization unit initializes each controlled device by setting parameters to enable each controlled device to be at an initial position; meanwhile, the parameter configuration unit inputs and adjusts the control parameters through external input equipment and divides a plurality of groups of different set values into different working modes;
the information processing unit receives the information acquired by each detection assembly and processes and analyzes the acquired information;
and according to the information processing result, the main control unit generates a control instruction and outputs the control instruction to control the controlled device.
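The three-part control cycle above (initialise, configure a working mode, process and command) can be sketched as a minimal controller class. Module names, working modes, and parameter values are illustrative assumptions.

```python
# A minimal sketch of the central control cycle: initialise controlled
# devices, load one of several preset parameter groups (working modes),
# analyse detection data, and emit a control instruction. All names and
# values are hypothetical.

WORKING_MODES = {"rough": {"feed_rate": 200}, "finish": {"feed_rate": 50}}

class CentralController:
    def __init__(self, devices):
        self.devices = devices
        self.params = {}

    def initialize(self):
        """Drive every controlled device to its initial position."""
        return {d: "initial_position" for d in self.devices}

    def configure(self, mode):
        """Load the set values belonging to one working mode."""
        self.params = WORKING_MODES[mode]

    def process_and_command(self, detection_info):
        """Analyse detection data and generate a control instruction."""
        if detection_info.get("workpiece_present"):
            return {"command": "start_milling", **self.params}
        return {"command": "wait"}
```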
Further, the method for detecting the surface roughness of the workpiece in the detection module comprises the following steps:
the processing workpiece is placed below the laser detector, and the shape error of the detected surface is displayed by an interference fringe pattern;
and simultaneously, denoising and amplifying the displayed interference fringe image, and detecting and calculating the surface roughness of the detected processing workpiece.
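Once the interference fringe image has been denoised and converted into surface-height samples, the arithmetic-mean roughness Ra follows directly. The fringe-to-height recovery itself is not shown; the height profile below is an assumed input.

```python
# Illustrative follow-on calculation: arithmetic-mean roughness Ra of a
# recovered height profile, defined as the mean absolute deviation of
# the heights from their mean line. Units follow the input (e.g. um).

def roughness_ra(profile_um):
    """Ra = (1/n) * sum(|h_i - mean|) over the sampled profile."""
    mean_line = sum(profile_um) / len(profile_um)
    return sum(abs(h - mean_line) for h in profile_um) / len(profile_um)
```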
Further, in the process that the storage module stores the control parameters of the system and the preset product information through the storage server, the process of classifying various types of data is as follows:
setting data classification standards in a storage module, respectively establishing corresponding samples, and establishing data classification samples for data to be classified;
respectively extracting corresponding characteristic values from the data classification standard data and the data samples to be classified;
calculating the distance between the two characteristic values by using a characteristic distance calculation model; samples that meet the specified distance requirement are grouped into one class.
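The classification rule above can be sketched as threshold-based grouping against class standards. The distance limit and the scalar feature values are assumptions for illustration.

```python
# Hedged sketch of the storage module's classification: a sample joins
# the first class whose standard feature value lies within an assumed
# distance limit; anything else is left unclassified.

def classify(samples, standards, limit=1.0):
    """Group scalar feature values by distance to class standards."""
    groups = {name: [] for name in standards}
    groups["unclassified"] = []
    for sample in samples:
        for name, standard in standards.items():
            if abs(sample - standard) <= limit:
                groups[name].append(sample)
                break
        else:
            groups["unclassified"].append(sample)
    return groups
```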
Another object of the present invention is to provide an industrial robot-based intelligent manufacturing system for implementing the industrial robot-based intelligent manufacturing method, the industrial robot-based intelligent manufacturing system comprising:
the central control module is respectively connected with the communication module, the vision acquisition module, the feeding module, the numerical milling module, the polishing module, the blanking module, the detection module, the warehousing module, the human-computer interaction module and the storage module, and is used for processing information data through an industrial controller and coordinating the mutual work among the modules;
the vision acquisition module is connected with the central control module and is used for monitoring the equipment state and the processing workpiece state in the manufacturing process in real time through a plurality of industrial cameras and processing and identifying the acquired images;
the feeding module is connected with the central control module and used for controlling the feeding robot to clamp and take the processed workpiece to the processing position of the processing equipment;
the numerical milling module is connected with the central control module and is used for controlling numerical milling equipment to perform numerical milling on the processed workpiece;
the polishing module is connected with the central control module and is used for controlling the polishing equipment to grind and polish the machined workpiece after the numerical milling;
the blanking module is connected with the central control module and used for controlling the blanking robot to clamp and take the processed workpiece to the detection platform;
the detection module is connected with the central control module and used for carrying out image acquisition on a processing workpiece of the detection platform through an industrial camera, judging the type through the appearance of the product, measuring the size, the aperture and the area of the processing workpiece and judging whether the product is qualified or not;
the warehousing module is connected with the central control module and used for storing the machined workpieces in a grading manner according to quality grades by controlling the warehousing robot;
the human-computer interaction module is connected with the central control module and is used for carrying out on-site control adjustment on the control parameters of the central control module through a human-computer interaction interface of the display device; the method for selecting the touch point by the display device comprises the following steps: establishing a corresponding data training sample for the touch points of the human body, and carrying out statistical analysis on the touch points; collecting multiple readings that fall within the touch-screen range, sorting the data, and taking the difference of the two middle values; if the difference is greater than the threshold, the batch of samples is discarded; since the data are sorted, the difference is non-negative and need not be stored as a signed number.
Further, the industrial robot-based smart manufacturing system further includes:
the storage module is connected with the central control module and used for storing the control parameters of the system and the preset product information through the storage server; setting data classification standards in the storage module, respectively establishing corresponding samples, and establishing data classification samples for data to be classified; respectively extracting corresponding characteristic values from the data classification standard data and the data samples to be classified; calculating the distance between the two characteristic values by using a characteristic distance calculation model; samples that meet the specified distance requirement are grouped into one class;
the communication module is connected with the central control module and is used for receiving the control parameters of the remote monitoring terminal and sending the real-time working state parameters through the signal transmitter to exchange data;
and the remote control module is connected with the central control module and used for remotely monitoring the manufacturing process through the remote monitoring terminal and sending a control instruction through the remote monitoring terminal to remotely regulate and control the manufacturing process.
By combining all the technical schemes, the invention has the following advantages and positive effects: the industrial robot assembly comprises modules for processing, detection, conveying, execution, control, display, and monitoring; rapid data exchange and process-flow control are completed through industrial Ethernet; automatic loading and unloading and automatic inspection of machining dimensions are realized through vision; in-line processing and inspection by the industrial robot are realized; a basis is provided for the robot to sense changes in the surrounding environment and adjust its motion, ensuring that it completes tasks correctly and providing an external closed-loop control mechanism. The detection module also has a dimension-measurement function: the dimensions of a machined product can be measured automatically according to the settings, for example judging the category from the product's appearance and judging whether the product is qualified by measuring the aperture, height, and area.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the embodiments of the present application will be briefly described below, and it is obvious that the drawings described below are only some embodiments of the present application, and it is obvious for those skilled in the art that other drawings can be obtained from the drawings without creative efforts.
Fig. 1 is a schematic structural diagram of an industrial robot-based intelligent manufacturing system according to an embodiment of the present invention.
Fig. 2 is a flowchart of an industrial robot-based intelligent manufacturing method according to an embodiment of the present invention.
Fig. 3 is a flowchart of a detection method of a detection module according to an embodiment of the present invention.
Fig. 4 is a flowchart of a visual acquisition method of a visual acquisition module according to an embodiment of the present invention.
Fig. 5 is a flowchart of an image processing method performed by an image processing unit according to an embodiment of the present invention.
In the figure: 1. a central control module; 2. a communication module; 3. a vision acquisition module; 4. a feeding module; 5. a numerical milling module; 6. a polishing module; 7. a blanking module; 8. a detection module; 9. a warehousing module; 10. a human-computer interaction module; 11. a storage module; 12. a remote control module.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In order to solve the problems in the prior art, the present invention provides an intelligent manufacturing system and method based on an industrial robot, and the present invention is described in detail below with reference to the accompanying drawings.
As shown in fig. 1, an industrial robot-based smart manufacturing system according to an embodiment of the present invention includes: a central control module 1, a communication module 2, a vision acquisition module 3, a feeding module 4, a numerical milling module 5, a polishing module 6, a blanking module 7, a detection module 8, a warehousing module 9, a human-computer interaction module 10, a storage module 11 and a remote control module 12.
The central control module 1 is respectively connected with the communication module 2, the vision acquisition module 3, the feeding module 4, the numerical milling module 5, the polishing module 6, the blanking module 7, the detection module 8, the warehousing module 9, the human-computer interaction module 10 and the storage module 11, and is used for processing information data through an industrial controller and coordinating the mutual work among the modules;
the communication module 2 is connected with the central control module and is used for receiving the control parameters of the remote monitoring terminal and sending the real-time working state parameters through the signal transmitter to carry out data exchange;
the vision acquisition module 3 is connected with the central control module and is used for monitoring the equipment state and the processing workpiece state in the manufacturing process in real time through a plurality of industrial cameras and processing and identifying the acquired images;
the feeding module 4 is connected with the central control module and is used for controlling the feeding robot to clamp and take the processed workpiece to the processing position of the processing equipment;
the numerical milling module 5 is connected with the central control module and is used for controlling numerical milling equipment to perform numerical milling on the processed workpiece;
the grinding module 6 is connected with the central control module and is used for controlling the polishing equipment to grind and polish the machined workpiece after the numerical milling;
the blanking module 7 is connected with the central control module and used for controlling the blanking robot to clamp and take the processed workpiece to the detection platform;
the detection module 8 is connected with the central control module and used for acquiring images of the processing workpieces of the detection platform through an industrial camera, judging the type of the processing workpieces through the appearance of the products, measuring the size, the aperture and the area of the processing workpieces and judging whether the products are qualified or not;
and the warehousing module 9 is connected with the central control module and used for storing the processed workpieces in grades according to quality grade by controlling the warehousing robot.
The human-computer interaction module 10 is connected with the central control module and is used for carrying out on-site control and adjustment on the control parameters of the central control module through a human-computer interaction interface of the display device;
the storage module 11 is connected with the central control module and used for storing the control parameters of the system and the preset product information through the storage server;
and the remote control module 12 is connected with the central control module and is used for remotely monitoring the manufacturing process through the remote monitoring terminal and sending a control instruction through the remote monitoring terminal to remotely regulate and control the manufacturing process.
As shown in fig. 2, an embodiment of the present invention provides an industrial robot-based intelligent manufacturing method, including:
s101: the vision acquisition module monitors the equipment state and the processing workpiece state in the manufacturing process in real time through a plurality of industrial cameras, and processes and identifies the acquired images; in the process of vision acquisition, carrying out on-site control adjustment on control parameters of a central control module through a human-computer interaction interface of a display device in a human-computer interaction module;
s102: according to the image processing and recognition results for the equipment state and the machined-workpiece state, the central control module, which is respectively connected with the communication module, the vision acquisition module, the feeding module, the numerical milling module, the polishing module, the blanking module, the detection module, the warehousing module, the human-computer interaction module and the storage module, processes the information data through an industrial controller and coordinates the mutual work among the modules;
s103: the feeding module controls the feeding robot to clamp the workpiece and place it at the machining position of the machining equipment, and the numerical milling module controls the numerical milling equipment to mill the workpiece; the grinding module controls the polishing equipment to grind and polish the machined workpiece after the numerical milling, and the blanking module controls the blanking robot to clamp the machined workpiece and carry it to the detection platform;
s104: after the workpiece is machined, the detection module acquires images of the machined workpiece of the detection platform through an industrial camera, judges the type of the workpiece according to the appearance of the product, measures the size, the aperture and the area of the machined workpiece and judges whether the product is qualified or not;
s105: after the product is detected to be qualified, the warehousing module controls the warehousing robot to store the machined workpieces in a grading manner according to quality grades; meanwhile, the storage module stores the control parameters of the system and the preset product information through a storage server;
s106: the central control module controls the communication module to receive the control parameters of the remote monitoring terminal and send the real-time working state parameters through the signal transmitter so as to exchange data; the remote control module carries out remote monitoring on the manufacturing process through the remote monitoring terminal, and sends a control instruction through the remote monitoring terminal to carry out remote control on the manufacturing process.
The detection process of the detection module is as follows:
s201: the to-be-detected image acquisition unit acquires an image of a processing workpiece of the detection platform through an industrial camera arranged at the upper end of the detection platform;
s202: the category judgment unit compares the product appearance of the acquired image with a product appearance prestored in the storage server to obtain a category corresponding to the detected product;
s203: the vision measuring unit measures the size, the aperture and the area of the processed workpiece according to the acquired image;
s204: after the measurement is finished, the result output unit compares the acquired workpiece parameters with prestored parameters of the type of the workpiece, judges whether the product is qualified or not, and outputs the detection result to the display device.
The visual acquisition process of the visual acquisition module comprises the following steps:
s301: the camera initialization unit restores the irradiation angles and the focusing parameters of the cameras at different positions to corresponding initial setting parameters; the image acquisition unit acquires real-time images of the operation equipment and the processing workpiece in real time;
s302: after the image is collected, the image processing unit preprocesses the collected image to obtain an image with high definition;
s303: according to the preset spatial information, a coordinate conversion unit converts two-dimensional coordinates of different position points into three-dimensional coordinates;
s304: according to the obtained three-dimensional coordinate data, the target posture calculation unit identifies and judges the positions and working postures of different running equipment;
s305: and an output servo amount unit generates a control servo amount for the operating equipment according to the judged working posture of the operating equipment and outputs the control servo amount in real time.
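The disclosure does not detail how S303 converts two-dimensional coordinates into three-dimensional ones; a common choice is pinhole-model back-projection, sketched here under the assumption that the depth of each point and the camera intrinsics (focal lengths fx, fy and principal point cx, cy) are known from calibration:

```python
def pixel_to_world(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with known depth into camera-frame 3-D
    coordinates using the pinhole camera model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

The resulting camera-frame coordinates would still need the calibrated camera-to-robot transform before they could drive the servo amounts of S305.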
The image processing unit in the embodiment of the invention processes the image as follows:
s401: carrying out color space conversion on the acquired image, and converting an RGB color image in an original format of the acquired image into a gray image;
s402: diagnosing the definition and brightness of the image and comparing the obtained results with threshold values; an image whose computed result exceeds the given threshold is judged to be a clear image;
s403: and according to the image diagnosis result, performing compensation adjustment on the contrast and the brightness value of the image to obtain a clear image meeting a given threshold value.
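Steps S401 to S403 can be illustrated with a minimal sketch: BT.601 luma conversion for the grayscale step, then a mean-brightness diagnosis with linear compensation. The target mean of 128 and the deviation threshold of 30 are assumed values standing in for the disclosure's unspecified thresholds:

```python
def rgb_to_gray(pixel):
    """S401: convert one RGB pixel to gray using ITU-R BT.601 luma weights."""
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b

def diagnose_and_compensate(gray, target_mean=128.0, threshold=30.0):
    """S402/S403: if the mean brightness deviates from the target by more
    than the threshold, apply a clipped linear brightness compensation."""
    mean = sum(gray) / len(gray)
    if abs(mean - target_mean) <= threshold:
        return gray  # already within the given threshold
    offset = target_mean - mean
    return [min(255.0, max(0.0, p + offset)) for p in gray]
```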
The central control module in the embodiment of the invention operates each module as follows:
the initialization unit initializes each controlled device by setting parameters to enable each controlled device to be at an initial position; meanwhile, the parameter configuration unit inputs and adjusts the control parameters through external input equipment and divides a plurality of groups of different set values into different working modes;
the information processing unit receives the information acquired by each detection assembly and processes and analyzes the acquired information;
and according to the information processing result, the main control unit generates a control instruction and outputs the control instruction to control the controlled device.
In S101, a method for selecting a touch point by a display device during a process of performing field control adjustment on a control parameter of a central control module by a human-computer interaction interface of the display device in a human-computer interaction module, provided by an embodiment of the present invention, includes:
establishing a corresponding data training sample for the touch points of the human body, and carrying out statistical analysis on the touch points;
collecting multiple readings that fall within the touch-screen range, sorting the data, and taking the difference of the two middle values;
if the difference is greater than the threshold, the sample is discarded; since the difference after sorting is positive or zero, it need not be declared as a signed number.
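A minimal sketch of the median-difference filtering described above, assuming the readings are repeated one-dimensional touch coordinates; the helper name `accept_touch` and the averaging of the two middle values are illustrative choices:

```python
def accept_touch(readings, threshold):
    """Sort repeated touch readings and take the difference of the two
    middle values; the sample is discarded when it exceeds the threshold.
    After sorting the difference is always >= 0, so no sign is needed."""
    ordered = sorted(readings)
    mid = len(ordered) // 2
    diff = ordered[mid] - ordered[mid - 1]
    if diff > threshold:
        return None  # readings too scattered: discard the sample
    return (ordered[mid - 1] + ordered[mid]) / 2  # accepted coordinate
```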
In S101 provided by the embodiment of the present invention, in the process of processing and identifying the acquired image by an image processing unit in the visual acquisition module, the method for extracting the image features includes:
performing color space conversion on the acquired equipment state image and the processed workpiece state image according to an image processing unit, and converting an RGB color image in an original format of an acquired image into a gray image;
determining a histogram and a mean value of images in a gray-scale image which can embody the equipment state and the processing workpiece state;
and solving the feature vector of the image texture feature according to the histogram and the mean value of the region.
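A simplified stand-in for the feature extraction above: a coarse histogram plus the mean of a gray region, packed into one vector. Real texture descriptors (for example gray-level co-occurrence statistics) are richer; the bin count here is an assumed parameter:

```python
def texture_features(gray, bins=4):
    """Build a feature vector from a gray region: a normalized coarse
    histogram followed by the normalized mean gray value."""
    hist = [0] * bins
    for p in gray:
        hist[min(int(p * bins / 256), bins - 1)] += 1
    n = len(gray)
    mean = sum(gray) / n
    return [h / n for h in hist] + [mean / 255.0]
```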
The process of identifying the image according to the feature vector of the image texture features comprises the following steps:
performing feature matching on the pre-stored data by using a feature matching model according to the feature vector of the acquired image texture feature;
according to the feature matching values, selecting the feature value with the smallest distance for accurate positioning;
and assigning direction values to the feature points and describing the features.
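Selecting "the feature value with the smallest distance" can be read as nearest-neighbour matching against the prestored data; the dictionary layout and the Euclidean distance below are illustrative assumptions:

```python
import math

def match_feature(query, prestored):
    """Return the key of the prestored feature vector with the smallest
    Euclidean distance to the query vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(prestored.items(), key=lambda kv: dist(query, kv[1]))[0]
```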
In S104 provided by the embodiment of the present invention, the method for detecting the surface roughness of the workpiece in the detection module includes:
the processing workpiece is placed below the laser detector, and the shape error of the detected surface is displayed by an interference fringe pattern;
and simultaneously, denoising and amplifying the displayed interference fringe image, and detecting and calculating the surface roughness of the detected processing workpiece.
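Once a height profile has been recovered from the denoised fringe image, one standard roughness figure is the arithmetic-mean roughness Ra (the mean absolute deviation from the mean line); the disclosure does not name the exact measure, so Ra is an assumption here:

```python
def roughness_ra(profile):
    """Arithmetic-mean roughness Ra of a measured height profile:
    the mean absolute deviation of the heights from their mean line."""
    mean_line = sum(profile) / len(profile)
    return sum(abs(h - mean_line) for h in profile) / len(profile)
```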
In S105 provided in the embodiment of the present invention, in the process of storing the control parameters of the system and the preset product information by the storage module through the storage server, the process of classifying various types of data is as follows:
setting data classification standards in a storage module, respectively establishing corresponding samples, and establishing data classification samples for data to be classified;
respectively extracting corresponding characteristic values from the data classification standard data and the data samples to be classified;
calculating the distance between the two characteristic values by using a characteristic distance calculation model; data whose distances meet the specified requirement are grouped into one class.
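The distance-based grouping above can be sketched as a simple threshold classifier; the one-dimensional characteristic values, the standard names, and the `max_dist` parameter are all illustrative assumptions:

```python
def classify(items, standards, max_dist):
    """Assign each (name, value) item to the first standard whose
    characteristic value lies within max_dist; items matched to the
    same standard form one class."""
    classes = {name: [] for name in standards}
    for item, value in items:
        for name, std_value in standards.items():
            if abs(value - std_value) <= max_dist:
                classes[name].append(item)
                break
    return classes
```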
The working principle of the invention is as follows: the vision acquisition module 3 monitors the equipment state and the machined workpiece state in the manufacturing process in real time through a plurality of industrial cameras, and processes and identifies the acquired images; during vision acquisition, the control parameters of the central control module 1 are adjusted on site through the human-computer interaction interface of the display device in the human-computer interaction module 10. According to the image processing and recognition results for the equipment state and the machined workpiece state, the central control module 1 is respectively connected with the communication module 2, the vision acquisition module 3, the feeding module 4, the numerical milling module 5, the polishing module 6, the blanking module 7, the detection module 8, the warehousing module 9, the human-computer interaction module 10 and the storage module 11, and processes information data through an industrial controller and coordinates the mutual work among the modules;
the feeding module 4 controls the feeding robot to clamp the workpiece to the processing position of the processing equipment, and the numerical milling module 5 controls the numerical milling equipment to mill the workpiece; the polishing module 6 controls the polishing equipment to grind and polish the machined workpiece after numerical milling, and the blanking module 7 controls the blanking robot to clamp the machined workpiece onto the detection platform; after the workpiece is machined, the detection module 8 acquires images of the machined workpiece on the detection platform through the industrial camera, determines the workpiece type from the product appearance, measures the size, aperture and area of the machined workpiece, and judges whether the product is qualified.
After the product is detected to be qualified, the warehousing module 9 controls the warehousing robot to store the processed workpieces in grades according to quality grades; meanwhile, the storage module 11 stores the control parameters of the system and the preset product information through a storage server; the central control module 1 controls the communication module to receive the control parameters of the remote monitoring terminal and send the real-time working state parameters through the signal transmitter for data exchange; the remote control module 12 remotely monitors the manufacturing process through a remote monitoring terminal, and sends a control instruction through the remote monitoring terminal to remotely regulate and control the manufacturing process.
The above description covers only preferred embodiments of the present invention and does not limit its scope; any modification, equivalent replacement, or improvement made by those skilled in the art within the technical scope disclosed herein, and within the spirit and principle of the present invention, shall fall within the scope of the present invention.

Claims (10)

1. An intelligent manufacturing method based on an industrial robot, which is characterized by comprising the following steps:
the method comprises the following steps that firstly, a vision acquisition module monitors the equipment state and the processing workpiece state in the manufacturing process in real time through a plurality of industrial cameras, and processes and identifies the acquired images; in the process of vision acquisition, carrying out on-site control adjustment on control parameters of a central control module through a human-computer interaction interface of a display device in a human-computer interaction module;
step two, according to the processing and recognition results for the equipment state and machined workpiece state images, the central control module is respectively connected with the communication module, the vision acquisition module, the feeding module, the numerical milling module, the polishing module, the blanking module, the detection module, the warehousing module, the human-computer interaction module and the storage module, and processes information data through an industrial controller and coordinates the mutual work among the modules;
step three, the feeding module controls the feeding robot to clamp the workpiece to the processing position of the processing equipment, and the numerical milling module controls the numerical milling equipment to mill the workpiece; the polishing module controls the polishing equipment to grind and polish the machined workpiece after numerical milling, and the blanking module controls the blanking robot to clamp the machined workpiece onto the detection platform;
after the workpiece is machined, the detection module acquires images of the machined workpiece of the detection platform through an industrial camera, judges the type of the workpiece according to the appearance of the product, measures the size, the aperture and the area of the machined workpiece and judges whether the product is qualified or not;
step five, after the product is detected to be qualified, the warehousing module stores the processed workpieces in grades according to quality grades by controlling the warehousing robot; meanwhile, the storage module stores the control parameters of the system and the preset product information through a storage server;
step six, the central control module controls the communication module to receive the control parameters of the remote monitoring terminal and send the real-time working state parameters through the signal transmitter so as to exchange data; the remote control module carries out remote monitoring on the manufacturing process through a remote monitoring terminal, and sends a control instruction through the remote monitoring terminal to carry out remote regulation and control on the manufacturing process;
the detection process of the detection module is as follows:
firstly, an image acquisition unit to be detected acquires an image of a processing workpiece of a detection platform through an industrial camera arranged at the upper end of the detection platform;
secondly, the category judgment unit compares the product appearance of the collected image with the product appearance prestored in the storage server to obtain the category corresponding to the detected product;
thirdly, the vision measuring unit measures the size, the aperture and the area of the processed workpiece according to the acquired image;
fourthly, after the measurement is finished, the result output unit compares the acquired workpiece parameters with prestored parameters of the type of the workpiece, judges whether the product is qualified or not, and outputs the detection result to a display device;
the visual acquisition process of the visual acquisition module comprises the following steps:
1) the camera initialization unit restores the irradiation angles and the focusing parameters of the cameras at different positions to corresponding initial setting parameters; the image acquisition unit acquires real-time images of the operation equipment and the processing workpiece in real time;
2) after the image is collected, the image processing unit preprocesses the collected image to obtain an image with high definition;
3) according to the preset spatial information, a coordinate conversion unit converts two-dimensional coordinates of different position points into three-dimensional coordinates;
4) according to the obtained three-dimensional coordinate data, the target posture calculation unit identifies and judges the positions and working postures of different running equipment;
5) and an output servo amount unit generates a control servo amount for the operating equipment according to the judged working posture of the operating equipment and outputs the control servo amount in real time.
2. An industrial robot-based intelligent manufacturing method according to claim 1, wherein in the process of performing on-site control adjustment on the control parameter of the central control module by the human-machine interaction interface of the display device in the human-machine interaction module, the method for selecting the touch point by the display device comprises the following steps:
establishing a corresponding data training sample for the touch points of the human body, and carrying out statistical analysis on the touch points;
collecting multiple readings that fall within the touch-screen range, sorting the data, and taking the difference of the two middle values;
if the difference is greater than the threshold, the sample is discarded; since the difference after sorting is positive or zero, it need not be declared as a signed number.
3. An industrial robot-based smart manufacturing method as defined in claim 1 wherein said image processing unit processes the image by:
carrying out color space conversion on the acquired image, and converting an RGB color image in an original format of the acquired image into a gray image;
diagnosing the definition and brightness of the image and comparing the obtained results with threshold values; an image whose computed result exceeds the given threshold is judged to be a clear image;
and according to the image diagnosis result, performing compensation adjustment on the contrast and the brightness value of the image to obtain a clear image meeting a given threshold value.
4. An industrial robot-based intelligent manufacturing method according to claim 1, wherein in the process of processing and identifying the acquired image by the image processing unit in the vision acquisition module, the method for extracting the image features comprises the following steps:
performing color space conversion on the acquired equipment state image and the processed workpiece state image according to an image processing unit, and converting an RGB color image in an original format of an acquired image into a gray image;
determining a histogram and a mean value of images in a gray-scale image which can embody the equipment state and the processing workpiece state;
and solving the feature vector of the image texture feature according to the histogram and the mean value of the region.
5. An industrial robot based intelligent manufacturing method according to claim 4, wherein said process of recognizing the image based on the feature vector of the image texture feature is:
performing feature matching on the pre-stored data by using a feature matching model according to the feature vector of the acquired image texture feature;
according to the feature matching values, selecting the feature value with the smallest distance for accurate positioning;
and assigning direction values to the feature points and describing the features.
6. An industrial robot-based smart manufacturing method as defined in claim 1, wherein said central control module operates each module as follows:
the initialization unit initializes each controlled device by setting parameters to enable each controlled device to be at an initial position; meanwhile, the parameter configuration unit inputs and adjusts the control parameters through external input equipment and divides a plurality of groups of different set values into different working modes;
the information processing unit receives the information acquired by each detection assembly and processes and analyzes the acquired information;
and according to the information processing result, the main control unit generates a control instruction and outputs the control instruction to control the controlled device.
7. An industrial robot-based smart manufacturing method as defined in claim 1 wherein said method of detecting roughness of a surface of a workpiece in said detection module is:
the processing workpiece is placed below the laser detector, and the shape error of the detected surface is displayed by an interference fringe pattern;
and simultaneously, denoising and amplifying the displayed interference fringe image, and detecting and calculating the surface roughness of the detected processing workpiece.
8. An industrial robot-based smart manufacturing method according to claim 1, wherein the storage module classifies various types of data in the process of storing the control parameters of the system and the preset product information through the storage server as follows:
setting data classification standards in a storage module, respectively establishing corresponding samples, and establishing data classification samples for data to be classified;
respectively extracting corresponding characteristic values from the data classification standard data and the data samples to be classified;
calculating the distance between the two characteristic values by using a characteristic distance calculation model; data whose distances meet the specified requirement are grouped into one class.
9. An industrial robot-based intelligent manufacturing system implementing the industrial robot-based intelligent manufacturing method according to any one of claims 1 to 8, wherein the industrial robot-based intelligent manufacturing system comprises:
the central control module is respectively connected with the communication module, the vision acquisition module, the feeding module, the numerical milling module, the polishing module, the blanking module, the detection module, the warehousing module, the human-computer interaction module and the storage module, and is used for processing information data through an industrial controller and performing coordination control on the mutual work among the modules;
the vision acquisition module is connected with the central control module and is used for monitoring the equipment state and the processing workpiece state in the manufacturing process in real time through a plurality of industrial cameras and processing and identifying the acquired images;
the feeding module is connected with the central control module and used for controlling the feeding robot to clamp and take the processed workpiece to the processing position of the processing equipment;
the numerical milling module is connected with the central control module and is used for controlling numerical milling equipment to perform numerical milling on the processed workpiece;
the polishing module is connected with the central control module and is used for controlling the polishing equipment to grind and polish the machined workpiece after numerical milling;
the blanking module is connected with the central control module and used for controlling the blanking robot to clamp and take the processed workpiece to the detection platform;
the detection module is connected with the central control module and used for carrying out image acquisition on a processing workpiece of the detection platform through an industrial camera, judging the type through the appearance of the product, measuring the size, the aperture and the area of the processing workpiece and judging whether the product is qualified or not;
the warehousing module is connected with the central control module and used for storing the machined workpieces in a grading manner according to quality grades by controlling the warehousing robot;
the human-computer interaction module is connected with the central control module and is used for carrying out on-site control adjustment on the control parameters of the central control module through a human-computer interaction interface of the display device; the method for selecting the touch point by the display device comprises the following steps: establishing a corresponding data training sample for the human-body touch points, and carrying out statistical analysis on the touch points; collecting multiple readings that fall within the touch-screen range, sorting the data, and taking the difference of the two middle values; if the difference is greater than the threshold, the sample is discarded; since the difference after sorting is positive or zero, it need not be declared as a signed number.
10. An industrial robot-based smart manufacturing system as defined in claim 9, further comprising:
the storage module is connected with the central control module and used for storing the control parameters of the system and the preset product information through the storage server; data classification standards are set in the storage module, corresponding samples are respectively established, and data classification samples are established for the data to be classified; corresponding characteristic values are respectively extracted from the data classification standard data and the data samples to be classified; the distance between the two characteristic values is calculated by using a characteristic distance calculation model; data whose distances meet the specified requirement are grouped into one class;
the communication module is connected with the central control module and is used for receiving the control parameters of the remote monitoring terminal and sending the real-time working state parameters through the signal transmitter to exchange data;
and the remote control module is connected with the central control module and used for remotely monitoring the manufacturing process through the remote monitoring terminal and sending a control instruction through the remote monitoring terminal to remotely regulate and control the manufacturing process.
CN202010456255.4A 2020-05-26 2020-05-26 Intelligent manufacturing system and method based on industrial robot Withdrawn CN111645111A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010456255.4A CN111645111A (en) 2020-05-26 2020-05-26 Intelligent manufacturing system and method based on industrial robot


Publications (1)

Publication Number Publication Date
CN111645111A true CN111645111A (en) 2020-09-11

Family

ID=72350975

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010456255.4A Withdrawn CN111645111A (en) 2020-05-26 2020-05-26 Intelligent manufacturing system and method based on industrial robot

Country Status (1)

Country Link
CN (1) CN111645111A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112123989A * 2020-09-22 2020-12-25 西京学院 Green environment-friendly pollution-free industrial art product manufacturing method and intelligent system
CN112091634A * 2020-09-27 2020-12-18 王凤萍 Intelligent punching robot with polishing function
CN112606006A * 2020-12-03 2021-04-06 泰州市朗嘉馨网络科技有限公司 On-site processing platform and method for collecting application parameters
CN114589688A * 2020-12-07 2022-06-07 山东新松工业软件研究院股份有限公司 Multifunctional vision control method and device applied to industrial robot
CN112816494A * 2020-12-22 2021-05-18 莱茵技术(上海)有限公司 Intelligent detection system for mechanical properties of parts
CN113145942A * 2021-03-12 2021-07-23 重庆市永川区中川科技发展有限责任公司 Work control method of gear shaping machine
CN113074633A * 2021-03-22 2021-07-06 西安工业大学 Automatic detection system and detection method for overall dimension of material
CN113074633B * 2021-03-22 2023-01-31 西安工业大学 Automatic detection system and detection method for overall dimension of material


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20200911
