CN109693140B - Intelligent flexible production line and working method thereof - Google Patents
- Publication number
- CN109693140B (application CN201811651017.8A)
- Authority
- CN
- China
- Prior art keywords
- workpiece
- image
- station
- area
- production line
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B23—MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
- B23Q—DETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
- B23Q7/00—Arrangements for handling work specially combined with or arranged in, or specially adapted for use in connection with, machine tools, e.g. for conveying, loading, positioning, discharging, sorting
- B23Q7/04—Arrangements for handling work specially combined with or arranged in, or specially adapted for use in connection with, machine tools, e.g. for conveying, loading, positioning, discharging, sorting by means of grippers
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
- Image Processing (AREA)
Abstract
The invention relates to an intelligent flexible production line and a working method thereof. The intelligent flexible production line comprises an upper computer, a transfer manipulator, a repair station and a plurality of processing stations arranged according to the processing procedure, the processing stations being connected through a conveying mechanism. A workpiece detection system is arranged at the conveying mechanism and is adapted to detect the workpieces circulating on the conveying mechanism. The upper computer is electrically connected with the workpiece detection system; when a workpiece is judged to be unqualified, the transfer manipulator clamps the workpiece and transfers it to the repair station, and after the repair station finishes repairing the workpiece, the transfer manipulator clamps the workpiece and transfers it back to the conveying mechanism so that it enters the next processing station.
Description
Technical Field
The invention belongs to the technical field of intelligent manufacturing, and particularly relates to an intelligent flexible production line and a working method thereof.
Background
A flexible production line is a production line formed by connecting a plurality of adjustable machine tools (processing equipment) and pairing them with an automatic conveying device. It relies on computer management and combines multiple production modes, thereby reducing production cost and making efficient use of equipment and materials.
However, in automatic production, manufacturing defects of a workpiece are often difficult to discover manually in time and are usually found only during manual inspection in subsequent processes, which increases scrap and lowers the production rate in intelligent manufacturing.
Therefore, based on the above problems, it is necessary to design an intelligent flexible production line and a working method thereof.
Disclosure of Invention
The invention aims to provide an intelligent flexible production line and a working method thereof.
In order to solve the technical problem, the invention provides an intelligent flexible production line, which comprises:
an upper computer, a transfer manipulator, a repair station and a plurality of processing stations arranged according to the processing procedure, wherein the processing stations are connected through a conveying mechanism;
a workpiece detection system is arranged at the conveying mechanism;
the workpiece detection system is suitable for detecting the workpieces circulated by the conveying mechanism;
the upper computer is electrically connected with the workpiece detection system, and when the workpiece is judged to be unqualified, the workpiece is clamped and transferred to the repairing station by the transfer manipulator;
and after the repairing station finishes repairing the workpiece, the transferring manipulator clamps the workpiece and transfers the workpiece back to the conveying mechanism so as to enter the next processing station.
Further, the workpiece detection system includes a detection station, a proximity switch and a positioning and releasing mechanism; a background mark plate arranged continuously along the flexible production line is provided on one side of the detection station, a plurality of mark points corresponding to the current detection station are provided on the background mark plate, an LED area light source matrix is provided on the other side of the detection station and is in communication connection with the upper computer, and the background mark plate and the LED area light source matrix together form a backlight illumination environment.
Further, the workpiece detection system further includes an area array camera, a first photoelectric conversion element, an optical fiber slip ring and a second photoelectric conversion element, the first photoelectric conversion element, the optical fiber slip ring and the second photoelectric conversion element being in communication connection with the area array camera in sequence; the axis of the photosensitive lens of the area array camera is perpendicular to the circulation direction of the detection station, and after the image collected by the area array camera has undergone photoelectric conversion, the second photoelectric conversion element sends the electric signal of the image to the upper computer.
Further, the upper computer includes an image processing unit, a feature vector extraction unit, a deep neural network unit and a computer control unit; wherein
The image processing unit is adapted to receive the image of the workpiece at the detection station acquired by the area array camera, perform resolution scanning on the received image to obtain the sensitive-area image of the current detection station, denoise the sensitive-area image, and send the denoised sensitive-area image to the feature vector extraction unit;
the feature vector extraction unit performs edge detection on the sensitive-area image to form a target region, and the edge area A_E of the target region, the edge shape factor E and the mean radius μ_R of the target region are obtained by calculation according to formulas (1) to (3) respectively; the Hu invariant moments of the first 3 dimensions are added to form a sensitive-area feature vector with four feature variables, which reflects the workpiece quality information of the current detection station and is sent to the deep neural network unit as the input layer;
in the above formulas, the parameters M and N are the numbers of edge points of the target region, and t(x, y) is the gray value of each edge point; the parameter L is the perimeter of the target region; the parameter K is the number of edge points on the boundary of the target region, (x_k, y_k) are the coordinates of the pixels located on the boundary of the target region, and (x̄, ȳ) are the coordinates of the centroid of the target region, which can be calculated by the following formula:
wherein the parameter A represents the area of the sensitive region, whose size is obtained when the sensitive region is identified during image processing;
the deep neural network unit constructs a manufacturing defect prediction model based on a neural network algorithm, trains, learns and classifies image characteristic vectors of detection stations, identifies the manufacturing defect type of a workpiece to be detected on the current detection station, and feeds back a classification result to the computer control unit;
the computer control unit is suitable for controlling the transfer manipulator to clamp and transfer the workpiece to the repairing station;
and sending the manufacturing defect type to a repair station to repair the defect of the workpiece.
In another aspect, the present invention further provides a working method of an intelligent flexible production line, including:
carrying out on-line detection on the workpiece circulated by the conveying mechanism;
if the workpiece is judged to be unqualified, transferring and repairing the workpiece;
and after the transfer and repair of the workpiece are finished, transferring the workpiece back to the conveying mechanism.
Further, the working method is adapted to use the intelligent flexible production line described above.
The invention has the following beneficial effects:
the intelligent flexible production line and the working method thereof can accurately judge the defects of the workpiece in the production process, can transfer the workpiece to the repair station for repair and rework after detecting that the workpiece is unqualified, namely the workpiece has manufacturing defects, and transfer the workpiece back to the production line after the workpiece is qualified, thereby effectively reducing the unqualified rate of workpiece processing and avoiding the situation that the product is processed for many times after the primary processing is unreasonable.
Drawings
The invention is further illustrated with reference to the following figures and examples.
FIG. 1 is a top layout view of an intelligent flexible production line of the present invention;
FIG. 2 is a control schematic block diagram of the intelligent flexible production line of the present invention.
In the figures: 1, transfer manipulator; 2, repair station; 3, processing station; 4, conveying mechanism; 5, workpiece detection system; 6, workpiece; 501, detection station; 502, background target; 503, LED area light source matrix; 504, area array camera.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention. On the contrary, the embodiments of the invention include all changes, modifications and equivalents coming within the spirit and terms of the claims appended hereto.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "axial", "radial", "circumferential", and the like, indicate orientations and positional relationships based on the orientations and positional relationships shown in the drawings, and are used merely for convenience of description and for simplicity of description, and do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and therefore, should not be considered as limiting the present invention.
Furthermore, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In the description of the present invention, it is to be noted that, unless otherwise explicitly specified or limited, the term "connected" is to be interpreted broadly, e.g., as a fixed, detachable or integral connection; it can be a mechanical or an electrical connection; and it may be a direct connection or an indirect connection through an intermediary. The specific meanings of the above terms in the present invention can be understood by those skilled in the art in specific cases. In addition, in the description of the present invention, "a plurality" means two or more unless otherwise specified.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
FIG. 1 is a top layout view of an intelligent flexible production line of the present invention;
FIG. 2 is a control schematic block diagram of the intelligent flexible production line of the present invention.
As shown in fig. 1 and 2, the present embodiment provides an intelligent flexible production line, including: the device comprises an upper computer, a transferring manipulator 1, a repairing station 2 and a plurality of processing stations 3 which are arranged according to processing procedures, wherein the processing stations 3 are connected through a conveying mechanism 4; a workpiece detection system 5 is arranged at the conveying mechanism 4; the workpiece detection system 5 is suitable for detecting the workpieces 6 circulated by the conveying mechanism 4; the upper computer is electrically connected with the workpiece detection system 5, and when the workpieces are judged to be unqualified, the workpieces are clamped and transferred to the repairing station 2 by the transfer manipulator 1; after the repair station 2 finishes repairing the workpiece, the transferring manipulator 1 clamps the workpiece and transfers the workpiece back to the conveying mechanism 4 to enter the next processing station 3.
The intelligent flexible production line and the working method thereof can accurately judge defects of a workpiece during production: when a workpiece is detected to be unqualified, i.e. to have a manufacturing defect, it is transferred to the repair station for repair and rework, and once qualified it is transferred back to the production line. This effectively reduces the reject rate of workpiece processing and avoids repeated reprocessing of a product whose initial processing was faulty.
The workpiece detection system 5 includes a detection station 501, a proximity switch and a positioning and releasing mechanism. A background mark plate 502 arranged continuously along the flexible production line is provided on one side of the detection station 501, a plurality of mark points corresponding to the current detection station 501 are provided on the background mark plate 502, an LED area light source matrix 503 is provided on the other side of the detection station 501 and is in communication connection with the upper computer, and the background mark plate 502 and the LED area light source matrix 503 together form a backlight illumination environment. In a preferred embodiment, the proximity switch and the positioning and releasing mechanism are both in communication connection with the upper computer; the positioning and releasing mechanism receives a control instruction from the upper computer and drives the workpiece to be detected to move on the detection station 501 to a preset detection position, where it triggers the proximity switch, which sends a trigger signal to the upper computer, and the upper computer then starts the workpiece detection system 5 to perform image acquisition.
In this embodiment, a plurality of mark points corresponding to the current detection station 501 may also be set on the background target 502; in a preferred embodiment, the mark point positions on the background target 502 are set according to the typical manufacturing defect information that may occur on the workpiece to be detected at the current detection station 501, so as to facilitate identification and segmentation of the sensitive area in subsequent image processing. In addition, an LED area light source matrix 503 is arranged on the other side of the detection station 501 and is in communication connection with the upper computer, which can intelligently control the switching on and off of the LED area light source matrices 503 on the whole production line. When the workpiece to be detected moves to the preset position of the current detection station 501, the upper computer sends an instruction to the LED area light source matrix 503 corresponding to that detection station 501 to turn it on; the background mark plate 502 and the LED area light source matrix 503 together form a backlight illumination environment, which facilitates the gray-level contrast between the workpiece to be detected and the background mark plate 502 in the image and reduces the time consumed by edge detection during image processing.
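Purely as an illustration of the trigger-and-acquire sequence just described, and not as text from the patent, the following Python sketch organizes the proximity-switch event handling; the class names, the print-based LED control and the queue hand-off are all hypothetical.

```python
# Minimal, hypothetical sketch of the acquisition trigger described above; the
# class and method names are illustrative and not taken from the patent.
import queue

class LedMatrix:
    def __init__(self, station_id): self.station_id = station_id
    def turn_on(self):  print(f"LED matrix {self.station_id}: on")
    def turn_off(self): print(f"LED matrix {self.station_id}: off")

class AreaArrayCamera:
    def capture(self):
        # placeholder for a real frame grab (e.g. via the camera vendor's SDK)
        return b"raw-image-bytes"

def on_proximity_trigger(station_id, led, camera, image_queue):
    """Proximity switch fired: light the backlight, grab one frame, queue it."""
    led.turn_on()                       # backlight against the background target
    frame = camera.capture()
    led.turn_off()
    image_queue.put((station_id, frame))  # hand off to image processing

q = queue.Queue()
on_proximity_trigger(501, LedMatrix(501), AreaArrayCamera(), q)
```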
The workpiece detection system 5 further comprises an area array camera 504, a first photoelectric conversion element, an optical fiber slip ring and a second photoelectric conversion element which are in communication connection in sequence; when the workpiece to be detected moves to the preset detection position on the detection station 501 and approaches the center of the shooting field of view of the area array camera 504, the proximity switch is triggered and sends a trigger signal to the upper computer. The axis of the photosensitive lens of the area array camera 504 is perpendicular to the circulation direction of the detection station 501; considering that in an intelligent flexible production line the operation of the equipment may cause a certain vibration of the detection station 501 and thus affect the shooting precision of the area array camera 504 fixedly connected with the detection station 501, the resulting deflection between the camera and the background target 502 is compensated later by the resolution-scanning step of the image processing.
In this embodiment, the upper computer includes: the device comprises an image processing unit, a feature vector extraction unit, a deep neural network unit and a computer control unit.
For the workpiece detection system 5, the first photoelectric conversion element and the second photoelectric conversion element are each provided with the same number of input ends and a corresponding number of output ends, and the number of optical fiber slip rings corresponds to the number of input ends; each input end of the first photoelectric conversion element is connected with an area array camera 504, each of its output ends is connected with an input end of the second photoelectric conversion element through one optical fiber slip ring, and each output end of the second photoelectric conversion element is connected with the image processing unit.
After the image processing unit receives the photoelectrically converted image, in order to improve image processing efficiency it first obtains the sensitive-area image of the current detection station 501 through resolution scanning and performs image segmentation, and then performs denoising and edge detection only on the obtained sensitive area; conventional image processing, by contrast, denoises the whole image first and then segments it.
Specifically, the image processing unit automatically locates the center positions of the plurality of mark points in the image and determines the angle between the background mark plate 502 and the horizontal direction, so as to calculate the deflection angle between the area array camera 504 and the background mark plate 502; the image processing unit then performs resolution scanning of the image along this deflection angle to identify the sensitive area of the current detection station 501, and after the sensitive area is identified it performs an image segmentation operation on the image to obtain a new image to be processed.
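The following Python/OpenCV sketch illustrates one possible reading of the mark-point localization, deflection-angle estimation and sensitive-area segmentation described above. It assumes an 8-bit grayscale image in which at least two mark points appear as dark blobs against the backlight, and it approximates "resolution scanning along the deflection angle" by rotating the image; none of these specifics are prescribed by the patent.

```python
# Hypothetical sketch of the deflection-correction / sensitive-area step.
import cv2
import numpy as np

def locate_sensitive_area(gray: np.ndarray) -> np.ndarray:
    """gray: 8-bit grayscale frame from the area array camera (assumption)."""
    # 1. find mark-point centres (assumed: dark blobs on the bright backlight)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centres = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            centres.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    centres = np.array(centres, dtype=np.float32)   # needs >= 2 points for fitLine

    # 2. angle of the background target relative to the horizontal = deflection angle
    vx, vy, _, _ = cv2.fitLine(centres, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    angle = float(np.degrees(np.arctan2(vy, vx)))

    # 3. "resolution scanning" along that angle, approximated here by rotating the
    #    image so the scan direction is horizontal, then cropping around the marks
    h, w = gray.shape
    rot = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    aligned = cv2.warpAffine(gray, rot, (w, h))
    pts = cv2.transform(centres[None, :, :], rot)[0]
    x0, y0 = pts.min(axis=0).astype(int)
    x1, y1 = pts.max(axis=0).astype(int)
    return aligned[max(y0, 0):y1, max(x0, 0):x1]    # segmented sensitive-area image
```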
When denoising the segmented sensitive-area image, in order to give the sensitive-area image a more natural smoothing effect and better suppress its random noise, the invention denoises the sensitive-area image with a Gaussian filter; after the smoothed sensitive-area image is obtained, the feature vector extraction unit further processes it.
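A minimal sketch of the Gaussian-filter denoising step, assuming OpenCV; the 5x5 kernel and sigma are illustrative values, not parameters given in the patent.

```python
# Gaussian-filter denoising of the segmented sensitive-area image.
import cv2

def denoise(sensitive_area):
    # kernel size and sigma are illustrative choices
    return cv2.GaussianBlur(sensitive_area, (5, 5), sigmaX=1.0)
```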
Specifically, the purpose of the feature vector extraction unit is to reduce high-dimensional image information characterized by a pixel set to low-dimensional image information characterized by a vector set, which facilitates processing by the computer and ensures the classification accuracy of the deep neural network unit. Based on the flexible production characteristics of intelligent manufacturing, shape features are the most suitable core image features for acquiring the manufacturing information of the workpiece to be detected. In order to capture the manufacturing information of the sensitive-area image comprehensively, the invention considers both the outer edge information and the inner region information of the target region, takes the edge area, the edge shape factor, the mean radius of the target region and the Hu invariant moments of the first 3 dimensions as the feature variables characterizing the sensitive-area image, and inputs these feature variables, as the feature vector of the sensitive-area image, to the deep neural network unit.
In a preferred embodiment, based on the edge attributes of the sensitive-area image, the feature vector extraction unit first performs edge detection on the sensitive-area image to obtain the target region, and calculates the edge area A_E of the target region, the edge shape factor E and the mean radius μ_R of the target region by equations (1)-(3) respectively; the Hu invariant moments of the first 3 dimensions are then added to form a sensitive-area feature vector with four feature variables, which reflects the manufacturing quality information, such as processing or assembly quality, at the current product detection platform and is sent to the deep neural network unit as the input layer;
in the above formulas, the parameters M and N are the numbers of edge points of the target region, and t(x, y) is the gray value of each edge point; the parameter L is the perimeter of the target region and can be calculated with the chain-code method of image processing; the parameter K is the number of edge points on the boundary of the target region, (x_k, y_k) are the coordinates of the pixels located on the boundary of the target region, and (x̄, ȳ) are the coordinates of the centroid of the target region, which can be calculated by the following formula:
wherein the parameter A represents the area of the sensitive region, whose size can be obtained when the sensitive region is identified during image processing.
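The formula images for equations (1) to (3) and for the centroid are not reproduced in this text. The LaTeX below restates standard forms that are consistent with the surrounding definitions (Ω denotes the pixel set of the sensitive region); it is offered only as one conventional reading, not as the patent's exact expressions.

```latex
% Hedged reconstruction from the surrounding definitions; standard forms only,
% not the patent's reproduced expressions.
\begin{align}
A_E   &= \sum_{x=1}^{M}\sum_{y=1}^{N} t(x,y)                                   \tag{1}\\
E     &= \frac{L^{2}}{4\pi A_E}                                                \tag{2}\\
\mu_R &= \frac{1}{K}\sum_{k=1}^{K}\sqrt{(x_k-\bar{x})^{2}+(y_k-\bar{y})^{2}}   \tag{3}\\
\bar{x} &= \frac{1}{A}\sum_{(x,y)\in\Omega} x, \qquad
\bar{y}  = \frac{1}{A}\sum_{(x,y)\in\Omega} y
\end{align}
```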
In addition, the Hu invariant moments, as important global features of an image, are insensitive to illumination and noise, have good geometric invariance and can effectively describe objects of complex shape; considering the typical manufacturing defect properties of intelligent manufacturing and the image processing efficiency, the Hu invariant moments of the first 3 dimensions are selected as one of the feature variables of the sensitive-area image of the workpiece to be detected.
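The Python/OpenCV sketch below illustrates the four-variable feature vector described above (edge area A_E, edge shape factor E, mean radius μ_R, and the first 3 Hu invariant moments), assuming the conventional forms of the LaTeX sketch given earlier; the Canny thresholds and the compactness form of E are assumptions, not values from the patent.

```python
# Hypothetical sketch of the sensitive-area feature vector extraction.
import cv2
import numpy as np

def extract_feature_vector(sensitive_area: np.ndarray) -> np.ndarray:
    """sensitive_area: 8-bit grayscale sensitive-area image (assumption)."""
    edges = cv2.Canny(sensitive_area, 50, 150)                 # edge detection -> target region
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    target = max(contours, key=cv2.contourArea)                # largest contour taken as target region

    a_edge = float(sensitive_area[edges > 0].sum())            # edge area A_E: sum of edge gray values
    perimeter = cv2.arcLength(target, closed=True)             # L, comparable to a chain-code perimeter
    shape_factor = perimeter ** 2 / (4.0 * np.pi * max(a_edge, 1e-6))  # E (assumed compactness form)

    pts = target.reshape(-1, 2).astype(np.float64)
    centroid = pts.mean(axis=0)                                # (x_bar, y_bar)
    mean_radius = float(np.linalg.norm(pts - centroid, axis=1).mean())  # mu_R

    hu = cv2.HuMoments(cv2.moments(target)).ravel()[:3]        # first 3 Hu invariant moments
    return np.array([a_edge, shape_factor, mean_radius, *hu], dtype=np.float64)
```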
In a preferred embodiment, the deep neural network unit is a manufacturing defect prediction model based on a deep neural network and comprises three layers, namely an input layer, a hidden layer and an output layer, the input layer and the output layer having the same scale. The input layer serves as the input interface of the defect prediction model and receives the feature vector of the workpiece image to be detected; the data reach the hidden layer through information coding and are converted to the output layer through information decoding, for which a classical coding-decoding model can be adopted. Before the deep neural network unit formally classifies the manufacturing defects of the workpiece to be detected, learning and training need to be carried out. Specifically, a standard sample image library is established for the typical manufacturing defects that may occur on the workpiece to be detected at the current detection station 501; in a preferred embodiment, the standard sample image library includes four sample classes, namely manufacturing qualified, manufacturing defect I, manufacturing defect II and manufacturing defect III, which serve as the training sample library of the deep neural network. Similar to the feature vector extraction for the workpiece to be detected, edge detection is also performed on the images in the standard sample image library, and feature variables such as the edge area, the edge standard deviation, the shape factor and the Hu invariant moments are extracted in turn to form the feature vectors of the training sample library. Finally, the input layer of the deep neural network unit reads the feature vectors of the training sample library and, based on the coding and decoding of the deep neural network unit, performs deep learning on the manufacturing information corresponding to the images of each standard sample class, thereby obtaining the manufacturing defect prediction model of the current detection station 501.
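The following PyTorch sketch illustrates a three-layer prediction model and its training on the standard sample library. The hidden width, optimizer and epoch count are illustrative, and the patent's statement that the input and output layers have the same scale is not reproduced literally, since the six-value feature vector of the earlier sketch feeds four output classes.

```python
# Minimal PyTorch sketch of the 3-layer defect-prediction model (four classes:
# qualified, defect I, defect II, defect III); all hyperparameters are illustrative.
import torch
import torch.nn as nn

N_FEATURES, N_CLASSES, HIDDEN = 6, 4, 16   # A_E, E, mu_R + 3 Hu moments -> 4 classes

model = nn.Sequential(
    nn.Linear(N_FEATURES, HIDDEN),   # "information coding" into the hidden layer
    nn.ReLU(),
    nn.Linear(HIDDEN, N_CLASSES),    # "information decoding" into the output layer
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train(features: torch.Tensor, labels: torch.Tensor, epochs: int = 200) -> None:
    """features: (n_samples, N_FEATURES) from the standard sample image library;
    labels: class indices 0..3."""
    for _ in range(epochs):
        optimizer.zero_grad()
        loss = loss_fn(model(features), labels)
        loss.backward()
        optimizer.step()

def classify(feature_vector: torch.Tensor) -> int:
    """Predict the manufacturing-defect class of one workpiece feature vector."""
    with torch.no_grad():
        return int(model(feature_vector.unsqueeze(0)).argmax(dim=1))
```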
After the deep neural network unit has been trained on the training sample library, it can classify and predict the manufacturing defect information of the workpiece to be detected at the current detection station 501, thereby identifying the manufacturing defect type of the workpiece on the current product detection platform; the manufacturing defect information of the workpiece is then sent to the computer control unit, which is adapted to control the transfer manipulator 1 to clamp and transfer the workpiece to the repair station 2 and to send the manufacturing defect type to the repair station 2 so that the defect of the workpiece can be repaired.
In a preferred embodiment, the computer control unit includes a PLC controller and is in communication connection with the product detection platform, the area array camera 504 and the deep neural network unit respectively. According to the manufacturing defect information indicated by the classification result of the deep neural network unit, and in combination with a historical repair database, it determines the resources required for repairing the manufacturing defect, automatically generates a repair strategy according to the defect type, defect position, defect degree and choice of maintenance personnel, and forms a corresponding work instruction which is sent to the PLC controller; the PLC controller executes the specific repair operation, and the repair record in the historical repair database is updated according to the execution result.
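As a hypothetical sketch of the dispatch logic described above, the snippet below maps a classification result and a stand-in historical repair database to a work instruction for the PLC; the defect classes, database fields, instruction format and the plc.execute interface are all invented for illustration.

```python
# Hypothetical sketch of the repair-dispatch logic; not an API from the patent.
HISTORICAL_REPAIRS = {
    # defect class -> (repair resources, default repair operation)
    "defect_I":   ({"tool": "deburring spindle"}, "re-machine edge"),
    "defect_II":  ({"tool": "press"}, "re-seat component"),
    "defect_III": ({"tool": "manual bench", "staff": "maintenance"}, "manual rework"),
}

def build_work_instruction(defect_class: str, position, severity: float) -> dict:
    """Turn a classification result into a work instruction for the PLC."""
    resources, operation = HISTORICAL_REPAIRS[defect_class]
    return {
        "defect_class": defect_class,
        "position": position,          # where on the workpiece the defect sits
        "severity": severity,
        "resources": resources,
        "operation": operation,
    }

def dispatch(plc, instruction: dict, history: list) -> None:
    plc.execute(instruction)           # placeholder for the PLC interface call
    history.append(instruction)        # update the historical repair database
```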
In the embodiment, the processing station 3 and the repairing station 2 are both configured with corresponding processing equipment, and each processing equipment is controlled by an upper computer.
The embodiment also provides a working method of the intelligent flexible production line, namely
Carrying out on-line detection on the workpiece circulated by the conveying mechanism 4;
if the workpiece is judged to be unqualified, transferring and repairing the workpiece;
and after the transfer and repair of the workpiece are finished, transferring the workpiece back to the conveying mechanism 4.
The method for detecting the workpiece on line comprises the following steps (an end-to-end sketch follows the last step below):
constructing a manufacturing defect prediction model based on a deep neural network, and training and learning the deep neural network through a sample image;
the positioning and releasing mechanism drives the workpiece to be detected to move on the detection station 501 under the control instruction of the computer control module so as to enable the workpiece to be detected to reach a preset detection position, triggers the proximity switch and sends a trigger signal to the computer control unit;
the computer control unit respectively sends instructions to the LED area light source matrix 503 and the workpiece detection system 5, the LED area light source matrix 503 is turned on for illumination, the area array camera 504 shoots a workpiece to be detected according to preset programs and parameters, and after photoelectric conversion, a generated image is sent to the image processing unit;
the image processing unit identifies and segments a sensitive area of the current detection station 501, performs image denoising processing on the local area, and sends an image of the sensitive area to the feature vector extraction unit after denoising;
the feature vector extraction unit carries out edge detection on the sensitive region to form a target region, obtains the edge area, the edge shape factor and the target region average radius of the target region through calculation respectively, and combines the Hu invariant moment of the previous 3 dimensions to form a feature vector of the sensitive region with four feature variables;
the manufacturing information of the feature vectors is diagnosed based on the trained deep neural network, the manufacturing defects of the workpiece to be detected are predicted and classified, and the classification result is fed back to the computer control unit;
and the computer control unit determines a maintenance strategy for maintaining the manufacturing defect based on the classification result and the historical maintenance database information, sends a working instruction to the PLC, and executes specific maintenance operation by the PLC.
The step of constructing a manufacturing defect prediction model based on a deep neural network and training the deep neural network with sample images further comprises: constructing a manufacturing defect prediction model based on a deep neural network, the deep neural network being a 3-layer network comprising an input layer, a hidden layer and an output layer, with the input layer and the output layer having the same scale; establishing a standard sample image library for the typical manufacturing defects that may occur at the current detection station 501, the standard sample image library comprising four sample classes, namely manufacturing qualified, manufacturing defect I, manufacturing defect II and manufacturing defect III, which serve as the training sample library of the deep neural network; performing edge detection on the images in the standard sample image library and extracting in turn feature variables such as the edge area, the edge standard deviation, the shape factor and the Hu invariant moments to form the feature vectors of the training sample library; and having the input layer of the deep neural network unit read the feature vectors of the training sample library and perform deep learning on the manufacturing information corresponding to the images of each standard sample class, thereby obtaining the manufacturing defect prediction model of the current detection station 501.
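A short sketch of assembling the training sample library described above, reusing the hypothetical extract_feature_vector from the earlier snippet; the folder layout and image format are assumptions.

```python
# Hypothetical sketch of building the training sample library: each folder holds
# standard sample images of one class; paths and file format are illustrative only.
import glob
import cv2
import numpy as np

CLASS_DIRS = {0: "samples/qualified", 1: "samples/defect_I",
              2: "samples/defect_II", 3: "samples/defect_III"}

def build_training_set():
    features, labels = [], []
    for label, folder in CLASS_DIRS.items():
        for path in glob.glob(f"{folder}/*.png"):
            img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            features.append(extract_feature_vector(img))  # same features as on-line detection
            labels.append(label)
    return np.stack(features), np.array(labels)
```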
In conclusion, the invention, for the first time, identifies and diagnoses product manufacturing defect information on a flexible production line on the basis of image acquisition and image processing technology. Through the manufacturing defect prediction model constructed on a deep neural network, real-time images of products in the intelligent flexible production line are acquired, processed and reduced to feature vector representations of the product manufacturing information, so that in intelligent manufacturing the product manufacturing information can be displayed, defects can be diagnosed and maintenance strategies can be generated automatically. This forms an integrated system of real-time monitoring of product manufacturing, real-time analysis of defect conclusions and automatic implementation of maintenance strategies, and effectively addresses one of the factors restricting the wide application of existing intelligent manufacturing technology. In order to obtain accurate image processing results, the use of several optical fiber slip rings and of the resolution scanning technique during image processing effectively solves the image acquisition precision problems caused by micro-vibration of the machine tools and the detection station 501 and by lens deflection of the area array camera 504 that may occur in an intelligent flexible production line, improving the precision of on-line diagnosis and identification of manufacturing defects. In order to reduce the time consumed by image processing, the mark point positions on the background mark plate 502 are set specifically according to the typical manufacturing defects of the workpiece to be detected at the current detection station 501, so that the sensitive-area image of the current detection station 501 can be located directly in subsequent image processing and image processing and feature variable extraction are performed only on this local area; this greatly improves image processing efficiency, and compared with conventional processing of the whole image and extraction of feature variables, the product on-line fault prediction system and method provided by the invention are more real-time and efficient.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, a schematic representation of the term does not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
In light of the foregoing description of the preferred embodiment of the present invention, many modifications and variations will be apparent to those skilled in the art without departing from the spirit and scope of the invention. The technical scope of the present invention is not limited to the content of the specification, and must be determined according to the scope of the claims.
Claims (4)
1. An intelligent flexible production line, comprising:
the device comprises an upper computer, a transferring manipulator, a repairing station and a plurality of processing stations which are arranged according to processing procedures, wherein the processing stations are connected through a conveying mechanism;
a workpiece detection system is arranged at the conveying mechanism;
the workpiece detection system is suitable for detecting the workpieces circulated by the conveying mechanism;
the upper computer is electrically connected with the workpiece detection system, and when the workpiece is judged to be unqualified, the workpiece is clamped and transferred to the repairing station by the transfer manipulator;
after the repairing station finishes repairing the workpiece, the transferring manipulator clamps the workpiece and transfers the workpiece back to the conveying mechanism to enter the next processing station;
the workpiece detection system includes: the device comprises a detection station, a proximity switch and a positioning and releasing mechanism, wherein a background marking plate which is continuously arranged along a flexible production line is arranged on one side of the detection station, a plurality of marking points corresponding to the current detection station are arranged on the background marking plate, an LED (light emitting diode) surface light source matrix is arranged on the other side of the detection station, the LED surface light source matrix is in communication connection with an upper computer, and the background marking plate and the LED surface light source matrix jointly form a backlight illumination environment;
the host computer includes: the device comprises an image processing unit, a feature vector extraction unit, a deep neural network unit and a computer control unit; wherein
The image processing unit is adapted to receive the image of the workpiece at the detection station acquired by the area array camera, perform resolution scanning on the received image to obtain the sensitive-area image of the current detection station, denoise the sensitive-area image, and send the denoised sensitive-area image to the feature vector extraction unit;
the feature vector extraction unit performs edge detection on the sensitive-area image to form a target region, and the edge area A_E of the target region, the edge shape factor E and the mean radius μ_R of the target region are obtained by calculation according to formulas (1) to (3) respectively; the Hu invariant moments of the first 3 dimensions are added to form a sensitive-area feature vector with four feature variables, which reflects the workpiece quality information of the current detection station and is sent to the deep neural network unit as the input layer;
in the above formulas, the parameters M and N are the numbers of edge points of the target region, and t(x, y) is the gray value of each edge point; the parameter L is the perimeter of the target region; the parameter K is the number of edge points on the boundary of the target region, (x_k, y_k) are the coordinates of the pixels located on the boundary of the target region, and (x̄, ȳ) are the coordinates of the centroid of the target region, which can be calculated by the following formula:
wherein the parameter A represents the area of the sensitive region, whose size is obtained when the sensitive region is identified during image processing;
the deep neural network unit constructs a manufacturing defect prediction model based on a neural network algorithm, trains, learns and classifies image characteristic vectors of detection stations, identifies the manufacturing defect type of a workpiece to be detected on the current detection station, and feeds back a classification result to the computer control unit;
the computer control unit is suitable for controlling the transfer manipulator to clamp and transfer the workpiece to the repairing station;
and sending the manufacturing defect type to a repair station to repair the defect of the workpiece.
2. The intelligent flexible production line according to claim 1,
the workpiece inspection system further comprises: the system comprises an area array camera, a first photoelectric conversion element, an optical fiber slip ring and a second photoelectric conversion element, wherein the first photoelectric conversion element, the optical fiber slip ring and the second photoelectric conversion element are sequentially in communication connection with the area array camera, the axis of a photosensitive lens of the area array camera is perpendicular to the circulation direction of a detection station, and after photoelectric conversion is carried out on an image collected by the area array camera, the second photoelectric conversion element sends an electric signal of the image to an upper computer.
3. A working method of an intelligent flexible production line according to claim 1 or 2, characterized by comprising:
carrying out on-line detection on the workpiece circulated by the conveying mechanism;
if the workpiece is judged to be unqualified, transferring and repairing the workpiece;
and after the transfer and repair of the workpiece are finished, transferring the workpiece back to the conveying mechanism.
4. The operating method according to claim 3,
the working method is suitable for adopting the intelligent flexible production line as claimed in claim 1 or 2.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811651017.8A CN109693140B (en) | 2018-12-31 | 2018-12-31 | Intelligent flexible production line and working method thereof |
Publications (2)
Publication Number | Publication Date
---|---
CN109693140A (en) | 2019-04-30
CN109693140B (en) | 2021-07-06
Family
ID=66232446
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN201811651017.8A (granted as CN109693140B, Active) | Intelligent flexible production line and working method thereof | 2018-12-31 | 2018-12-31
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109693140B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112330119A (en) * | 2020-10-27 | 2021-02-05 | 浙江大华技术股份有限公司 | Single-station flexible production system, method and device, electronic equipment and storage medium |
CN112846408B (en) * | 2021-01-21 | 2023-09-22 | 广东翼丰盛科技有限公司 | Flip pocket watch chassis flattening and flip detects integrative device |
CN114055181B (en) * | 2021-10-28 | 2022-12-09 | 深圳精匠云创科技有限公司 | Automatic tool machining, detecting and reworking system and method |
CN115922519A (en) * | 2022-12-21 | 2023-04-07 | 贵州安大航空锻造有限责任公司 | Intelligent processing method and device for workpiece defects, production line management and control system and medium |
CN115826547B (en) * | 2023-02-21 | 2023-04-28 | 青岛环球重工科技有限公司 | Control system of flexible segment production line |
CN116587043B (en) * | 2023-07-18 | 2023-09-15 | 太仓德纳森机电工程有限公司 | Workpiece conveying system for industrial automatic production and processing |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140050387A1 (en) * | 2012-08-17 | 2014-02-20 | Cognex Corporation | System and Method for Machine Vision Inspection |
CN104992449A (en) * | 2015-08-06 | 2015-10-21 | 西安冉科信息技术有限公司 | Information identification and surface defect on-line detection method based on machine visual sense |
CN106362961A (en) * | 2016-08-30 | 2017-02-01 | 吴正明 | Working method for flexible production line |
CN108898589A (en) * | 2018-06-19 | 2018-11-27 | 南通大学 | The quick-fried pearl intelligent detecting method of filter stick based on high speed machines vision |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103076342A (en) * | 2013-01-17 | 2013-05-01 | 陕西科技大学 | LabVIEW-based PCB (printed circuit board) plugging online detecting system and detecting method thereof |
CN106125699B (en) * | 2016-08-23 | 2019-10-29 | 常州轻工职业技术学院 | A kind of automatic production line based on image recognition detection |
CN108596880A (en) * | 2018-04-08 | 2018-09-28 | 东南大学 | Weld defect feature extraction based on image procossing and welding quality analysis method |
- 2018-12-31: Application CN201811651017.8A filed in CN; granted as patent CN109693140B (status: Active)
Non-Patent Citations (2)
Title |
---|
Research on Defect Detection and Intelligent Classification of Color-Coated Steel Sheet Based on Machine Vision; Sun Chuangkai; China Master's Theses Full-text Database, Information Science and Technology Series; 2017-07-15 (No. 7); pp. 5-11, 42-43 and 53-61 |
Research on an On-line Self-learning Visual Inspection System for the Morphology of Contact Parts; Dai Shuwen; China Master's Theses Full-text Database, Information Science and Technology Series; 2009-09-15 (No. 9); pp. 36-46 |
Also Published As
Publication number | Publication date |
---|---|
CN109693140A (en) | 2019-04-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109693140B (en) | Intelligent flexible production line and working method thereof | |
CN109840900B (en) | Fault online detection system and detection method applied to intelligent manufacturing workshop | |
CN108765416B (en) | PCB surface defect detection method and device based on rapid geometric alignment | |
EP1995553B1 (en) | System and method for identifying a feature of a workpiece | |
CN112534470B (en) | System and method for image-based target object inspection | |
CN111220582A (en) | Fluorescence penetrant inspection system and method | |
CN105468033B (en) | A kind of medical arm automatic obstacle-avoiding control method based on multi-cam machine vision | |
CN102529019B (en) | Method for mould detection and protection as well as part detection and picking | |
CN104483320A (en) | Digitized defect detection device and detection method of industrial denitration catalyst | |
US7599050B2 (en) | Surface defect inspecting method and device | |
CN110108712A (en) | Multifunctional visual sense defect detecting system | |
CN110728657A (en) | Annular bearing outer surface defect detection method based on deep learning | |
EP4202424A1 (en) | Method and system for inspection of welds | |
CN115456999B (en) | Saw chain surface defect automatic detection system and defect detection method based on machine vision | |
CN110096980A (en) | Character machining identifying system | |
CN108109154A (en) | A kind of new positioning of workpiece and data capture method | |
CN111060518A (en) | Stamping part defect identification method based on instance segmentation | |
CN111524154B (en) | Image-based tunnel segment automatic segmentation method | |
CN113744247A (en) | PCB welding spot defect identification method and system | |
CN117664990A (en) | Intelligent PCBA appearance defect detection method | |
CN114280075A (en) | Online visual inspection system and method for surface defects of pipe parts | |
CN114519792A (en) | Welding seam ultrasonic image defect identification method based on machine and depth vision fusion | |
CN216525503U (en) | Carbon fiber prepreg surface defect on-line measuring device based on machine vision | |
CN117252840B (en) | Photovoltaic array defect elimination evaluation method and device and computer equipment | |
CN117085969B (en) | Artificial intelligence industrial vision detection method, device, equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| CB02 | Change of applicant information | Address after: 213000 No.28, Mingxin Middle Road, Wujin District, Changzhou City, Jiangsu Province; Applicant after: Changzhou Polytechnic; Address before: 213100 No.28, Mingxin Middle Road, Wujin District, Changzhou City, Jiangsu Province; Applicant before: CHANGZHOU VOCATIONAL INSTITUTE OF LIGHT INDUSTRY
| GR01 | Patent grant |