CN115452844B - Injection molding part detection method and system based on machine vision - Google Patents
- Publication number: CN115452844B (application CN202211409675.2A)
- Authority
- CN
- China
- Prior art keywords
- image
- highlight
- injection molding
- annular
- determining
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/66—Analysis of geometric attributes of image moments or centre of gravity
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Quality & Reliability (AREA)
- Software Systems (AREA)
- Multimedia (AREA)
- Geometry (AREA)
- Computing Systems (AREA)
- Life Sciences & Earth Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Injection Moulding Of Plastics Or The Like (AREA)
Abstract
The invention discloses a machine-vision-based injection molded part detection method and system. In the technical scheme, two images of the same annular injection molded part under different lighting conditions are acquired and processed separately. Exploiting the fact that reflection points on an annular injection molded part are amplified once a light source is introduced, the reflection points are filtered out according to the brightness values of the highlight position points in the two acquired images, eliminating the noise that reflection points on the annular part introduce into the shape recognition of bubble defects. This solves the technical problem that the prior art cannot accurately identify bubble defects on injection molded parts, in particular on annular injection molded parts; accurate identification of these bubble defects is achieved, and the detection success rate and accuracy are improved.
Description
Technical Field
The invention relates to the technical field of image processing, and in particular to a machine-vision-based injection molded part detection method and system.
Background
An injection molded part is a workpiece formed by melt-processing raw materials such as polypropylene and polyethylene. In manufacture, plastic pellets are melted at high temperature, injected into a mold, and then pressed and cooled by the machine into a finished part. The process is strongly affected by factors such as temperature, pressure and the mold itself, so the surface of a produced part frequently shows short shots (insufficient material) or flash (excess material); produced parts therefore usually need to be inspected.
Common defect types for injection molded parts include sink marks, gas streaks, short shots, flash and weld lines. For these common types, existing inspection strategies rely mainly on manual inspection: workers classify defective parts by observing the surface of each produced part. Such apparent defects can indeed be caught manually, but special defect types such as bubbles are another matter. Bubbles form when material is injected too quickly, or when uneven volume shrinkage during molding leaves cavities inside the part. High-precision injection molded parts have strict finished-product requirements, and a part containing bubbles can compromise the equipment into which it is installed. Traditional manual inspection, however, is extremely inefficient, visual inspection has limited accuracy, and bubble defect points cannot be identified reliably. Although the development of image processing technology has produced some research on bubble defects in injection molded parts, that research is limited to directions such as analysing the causes of bubble formation and cannot locate the bubble defects on a part. Moreover, when image-based inspection is applied to an annular injection molded part, its reflection points interfere with bubble identification, making bubble defect detection on annular parts harder still.
The market therefore urgently needs a new injection molded part inspection strategy that can accurately identify bubble defects on injection molded parts, in particular on annular injection molded parts, and improve the detection success rate and accuracy.
Disclosure of Invention
The invention provides a machine-vision-based injection molded part detection method and system that accurately identify bubble defects on an injection molded part, in particular on an annular injection molded part, and improve the detection success rate and accuracy.
In order to solve the above technical problem, an embodiment of the present invention provides a machine-vision-based injection molded part detection method for detecting bubble defects on an annular injection molded part, the method comprising:
acquiring an image of the annular injection molded part to be inspected in a closed space with a shooting device to obtain a first captured image; keeping the positions of the shooting device and the part unchanged, introducing a light source into the closed space, and capturing the part a second time to obtain a second captured image;
identifying and marking the annular boundary features in the first captured image and the second captured image, and determining the annular boundary of the part in each of the two images;
preprocessing the first captured image and the second captured image, inputting the preprocessed images into a pre-established highlight area model for recognition, and marking and outputting the highlight position points present in each image;
determining the highlight position points of each captured image on its annular boundary, and determining the brightness value of each highlight position point on the annular boundary;
according to the difference between the brightness values of highlight position points at the same position on the annular boundary in the two captured images, filtering out of the first captured image, as influence factors, the highlight position points whose difference exceeds a preset threshold, to obtain a filtered image;
and inputting the filtered image into a pre-established bubble shape recognition model for recognition, marking and outputting the highlight position points whose shape matches the shape of a bubble defect, and taking these as the bubble defects on the annular injection molded part under inspection.
As a preferred scheme, the step of identifying and marking the annular boundary features in the first captured image and the second captured image, and determining the annular boundary of the annular injection molding piece to be tested in the first captured image and the second captured image respectively specifically includes:
respectively carrying out gridding processing on the first collected image and the second collected image, and determining a reference point;
establishing a three-dimensional coordinate system, moving the first collected image and the second collected image into the three-dimensional coordinate system by taking the reference point as an origin, and determining the coordinate position of each gridding point in the three-dimensional coordinate system;
determining a connecting line formed among a plurality of continuous gridding points with the same chromaticity in the first collected image, and determining the connecting line as an annular boundary when the difference value of the chromaticity between the connecting line and the adjacent gridding points which are not on the connecting line reaches a chromaticity threshold value;
and moving the reference point in the three-dimensional coordinate system to the second acquired image by taking the reference point as a reference according to the determined annular boundary in the first acquired image, and determining the annular boundary of the annular injection molding piece to be detected in the second acquired image.
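The chromaticity-contrast test described in the steps above can be sketched in a few lines. This is a minimal, hedged illustration, not the patent's implementation: the toy grid, the 4-neighbour rule, and the threshold value are all illustrative assumptions.

```python
def boundary_points(chroma, threshold):
    """Return grid coordinates whose chromaticity contrast with at
    least one 4-neighbour reaches the threshold (candidate annular
    boundary points)."""
    rows, cols = len(chroma), len(chroma[0])
    points = []
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    # A boundary point differs sharply from an
                    # adjacent off-line grid point.
                    if abs(chroma[r][c] - chroma[nr][nc]) >= threshold:
                        points.append((r, c))
                        break
    return points

# Toy 4x4 chromaticity grid: a bright block surrounded by darker cells.
grid = [
    [10, 10, 10, 10],
    [10, 80, 80, 10],
    [10, 80, 80, 10],
    [10, 10, 10, 10],
]
pts = boundary_points(grid, threshold=50)
```

In practice a connected run of such points of equal chromaticity would then be linked into the annular boundary line; the sketch only shows the per-point contrast test.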
As a preferred scheme, the step of preprocessing the first captured image and the second captured image specifically includes:
graying the first collected image and the second collected image to respectively obtain corresponding grayscale images;
in the three-dimensional coordinate system, transversely stretching the gray image by a certain multiple by taking the reference point as a midpoint to obtain a stretched image;
identifying light spot characteristics existing in the stretching image, and filtering the light spot characteristics in the stretching image to obtain a filtered image;
and according to the transverse stretching multiple, transversely reducing the filtered image to obtain a preprocessed image, and inputting the preprocessed image into a pre-established highlight area model.
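The preprocessing chain above (grayscale conversion, horizontal stretch, spot filtering, then reduction by the same factor) can be sketched as follows. The pixel values, the integer stretch factor, and the BT.601 luma weights are assumptions for illustration; the patent does not fix any of them, and the spot-filtering stage is elided here.

```python
def to_gray(rgb_row):
    # ITU-R BT.601 luma weights, a common grayscale convention
    # (an assumption; the patent only says "graying").
    return [round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in rgb_row]

def stretch_h(row, factor):
    # Repeat each pixel `factor` times along the horizontal axis.
    return [v for v in row for _ in range(factor)]

def shrink_h(row, factor):
    # Undo the stretch by keeping one sample per original pixel.
    return row[::factor]

row = [(200, 10, 10), (10, 200, 10)]   # one row of RGB pixels
gray = to_gray(row)
# Spot filtering would operate on the stretched image here.
restored = shrink_h(stretch_h(gray, 3), 3)
```

With an integer pixel-repetition stretch, the reduction step recovers the original row exactly, which is why the stretch factor must be recorded.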
As a preferred scheme, the step of establishing the highlight region model includes:
acquiring a training image, wherein the training image is obtained by capturing a training annular injection molded part with the shooting device after a light source has been introduced into the closed space;
according to the chromaticity of the training image, marking the shape boundary of the highlight generation area in the training image, respectively determining the center point of each highlight generation area and the nearest distance point on the annular boundary of the training annular injection molding part, and associating the distance points with the corresponding highlight generation areas;
establishing an initial highlight model through a machine learning algorithm, inputting the associated training images into the initial highlight model for training until the training times reach a threshold value, and generating a training highlight model;
acquiring a test image, wherein the test image is obtained by acquiring an image of a training annular injection molding piece in a closed space through shooting equipment;
and inputting the test image into the training highlight model for testing, and generating a highlight area model when the accuracy of the training highlight model for marking highlight position points with highlight areas in the test image in the output image reaches a preset threshold value.
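The highlight area model above is a learned component. As a hedged, non-learned stand-in, the sketch below marks highlight position points by a simple brightness threshold and scores the marking against reference marks, mirroring the accuracy criterion used to accept the model. The threshold, the toy image, and the Jaccard-style accuracy measure are assumptions, not the patent's method.

```python
def mark_highlights(gray, threshold):
    # Mark every pixel at or above the brightness threshold.
    return {(x, y)
            for y, row in enumerate(gray)
            for x, v in enumerate(row) if v >= threshold}

def marking_accuracy(predicted, reference):
    # Intersection-over-union of marked point sets (an illustrative
    # accuracy measure; the patent does not define one).
    if not predicted and not reference:
        return 1.0
    return len(predicted & reference) / max(len(predicted | reference), 1)

gray = [[10, 10, 250],
        [10, 240, 10],
        [10, 10, 10]]
pred = mark_highlights(gray, threshold=200)
acc = marking_accuracy(pred, {(2, 0), (1, 1)})
```

The trained model would replace `mark_highlights`; the acceptance test (accuracy reaching a preset threshold) keeps the same shape.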
Preferably, the step of identifying the spot features existing in the stretched image specifically includes:
identifying the irregular figures present in the stretched image;
respectively dividing a plurality of layers of circular ring areas in each irregular graph, determining a plurality of test points in each layer of circular ring area, and simultaneously determining the chromaticity of each test point;
calculating the average chroma of all test points in each layer of ring area, and taking the average chroma as the chroma value of the ring area;
and when the colorimetric values on the ring area at the outermost layer in the same irregular pattern are determined to be sequentially decreased towards the ring area at the innermost layer, determining that the irregular pattern is the spot characteristic existing in the stretching image.
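The light-spot test above (per-layer average chromaticity decreasing monotonically from the outermost ring layer to the innermost) reduces to a short check. The sample values per layer are illustrative assumptions.

```python
def ring_average(test_points):
    # Average chromaticity of the test points in one ring layer.
    return sum(test_points) / len(test_points)

def is_light_spot(layers):
    """layers: chromaticity samples per ring layer, ordered outermost
    first; a light spot shows strictly decreasing layer averages."""
    averages = [ring_average(layer) for layer in layers]
    return all(a > b for a, b in zip(averages, averages[1:]))

spot_layers = [[90, 92, 88], [70, 71, 69], [50, 52, 48]]    # decreasing inward
bubble_layers = [[50, 52, 48], [70, 71, 69], [90, 92, 88]]  # increasing inward
```

A figure failing the monotonic-decrease test is kept as a potential bubble rather than filtered as a spot.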
As a preferred scheme, the step of respectively determining highlight position points of the first captured image and the second captured image on the annular boundary, and determining the brightness value of each highlight position point on the annular boundary specifically includes:
respectively determining the area range of each highlight position point on the annular boundary of the first collected image and the second collected image;
and determining a circumscribed circle for the area range of each highlight position point, and taking the brightness value at the circle centre of the circumscribed circle as the brightness value of the corresponding highlight position point.
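Reading a highlight region's brightness from the centre of a circumscribing circle can be sketched as below. For brevity the circle centre is approximated by the bounding-box midpoint, a simplification (the patent does not fix a construction); the region points and brightness image are assumptions.

```python
def circumscribing_center(points):
    # Bounding-box midpoint as a simple circumscribing-circle centre.
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2)

def center_brightness(points, brightness):
    cx, cy = circumscribing_center(points)
    # Read the brightness at the nearest integer pixel to the centre.
    return brightness[round(cy)][round(cx)]

region = [(1, 1), (3, 1), (1, 3), (3, 3)]   # extent of one highlight point
image = [[0,  0,  0,  0],
         [0, 10, 20, 10],
         [0, 20, 90, 20],
         [0, 10, 20, 10]]
value = center_brightness(region, image)
```

A true minimal enclosing circle (e.g. Welzl's algorithm, or OpenCV's `cv2.minEnclosingCircle`) would replace `circumscribing_center` in a full implementation.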
As a preferred scheme, the step of filtering, according to a difference between luminance values of highlight position points at the same position on the annular boundary between the first captured image and the second captured image, the highlight position point having a difference value larger than a preset threshold as an influence factor in the first captured image to obtain a filtered image specifically includes:
defining the coordinate position of each highlight position point in the first collected image in a three-dimensional coordinate system as a whole to be a first coordinate position;
defining the coordinate position of each highlight position point in the second acquired image in the three-dimensional coordinate system as a whole as a second coordinate position;
taking the first coordinate position as a reference, and integrally moving the second coordinate position in a three-dimensional coordinate system until the second coordinate position is superposed with the first coordinate position;
and determining the difference of the brightness values of the highlight position points at the same position after superposition, and filtering the highlight position points with the difference value larger than a preset threshold value in the first collected image as influence factors to obtain a filtered image.
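The reflection-point filter above, applied after the two images' highlight points are brought into coincidence, can be sketched as a per-point brightness comparison. The coordinates, brightness values, and threshold are illustrative assumptions.

```python
def filter_reflections(first, second, threshold):
    """first/second: {(x, y): brightness} for the highlight points of
    the two aligned captures; drop points whose brightness difference
    exceeds the threshold."""
    kept = {}
    for point, b1 in first.items():
        b2 = second.get(point, b1)
        # A reflection point brightens strongly under the added light
        # source; a bubble's brightness stays comparatively stable.
        if abs(b2 - b1) <= threshold:
            kept[point] = b1
    return kept

first = {(5, 5): 120, (9, 2): 110}
second = {(5, 5): 125, (9, 2): 240}   # (9, 2) flares under the light
kept = filter_reflections(first, second, threshold=30)
```

The surviving points form the filtered image that is passed on to the bubble shape recognition model.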
Preferably, the step of establishing the bubble shape recognition model includes:
in the process of pre-establishing the bubble shape recognition model, obtaining the filtered image produced by the preceding steps;
marking bubbles in the filtered image in a manual identification mode, and determining a bubble defect range;
identifying the gray value in each bubble defect range, and dividing the bubble defect range into two areas according to the change of the gray value in the bubble defect range;
respectively determining circumscribed circles for two areas where the same bubble defect range is located, and associating the centers of the circumscribed circles corresponding to the two areas;
and establishing an initial bubble model through a machine learning algorithm, inputting the associated filtering image into the initial bubble model for training and testing, and generating a bubble shape recognition model until the times of training and testing reach a threshold value.
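The two-region split described above can be sketched as follows. Pixels inside a marked bubble defect range are divided by their gray value, here simply above or below the range mean (an assumption; the patent only says "according to the change of the gray value"), and the bounding-box midpoint stands in for each region's circumscribed-circle centre.

```python
def split_bubble_region(pixels):
    """pixels: {(x, y): gray}; returns (bright_region, dark_region)
    split at the mean gray value of the defect range."""
    mean = sum(pixels.values()) / len(pixels)
    bright = [p for p, g in pixels.items() if g >= mean]
    dark = [p for p, g in pixels.items() if g < mean]
    return bright, dark

def box_center(points):
    # Bounding-box midpoint as a stand-in circumscribed-circle centre.
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2)

defect = {(0, 0): 200, (1, 0): 210, (0, 1): 40, (1, 1): 50}
bright, dark = split_bubble_region(defect)
centers = (box_center(bright), box_center(dark))
```

The associated pair of centres is the feature the initial bubble model is trained on; a bubble typically shows a bright rim and a darker interior, which this split captures coarsely.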
Accordingly, another embodiment of the present invention further provides a machine vision-based injection molding inspection system for detecting bubble defects on a ring-shaped injection molding, the system comprising: the device comprises an image acquisition module, an annular boundary module, a highlight identification module, a highlight determination module, an image filtering module and a bubble identification module;
the image acquisition module is used for acquiring an image of the annular injection molding piece to be detected in a closed space through shooting equipment to obtain a first acquired image; keeping the positions of the shooting equipment and the annular injection molding piece to be detected unchanged, and after a light source is put into the closed space, carrying out secondary image acquisition on the annular injection molding piece to be detected to obtain a second acquired image;
the annular boundary module is used for identifying and marking annular boundary characteristics in the first collected image and the second collected image, and determining an annular boundary of the annular injection molding piece to be detected in the first collected image and the second collected image respectively;
the highlight identification module is used for preprocessing the first collected image and the second collected image, inputting the preprocessed first collected image and the preprocessed second collected image into a preset highlight area model for identification, and respectively marking and outputting highlight position points existing on the first collected image and the second collected image;
the highlight determining module is used for respectively determining highlight position points of the first acquired image and the second acquired image on the annular boundary and determining the brightness value of each highlight position point on the annular boundary;
the image filtering module is used for filtering the highlight position points with the difference value larger than a preset threshold value in the first collected image as influence factors according to the difference between the brightness values of the highlight position points at the same position on the annular boundary of the first collected image and the second collected image to obtain a filtered image;
and the bubble identification module is used for inputting the filtering image into a pre-established bubble shape identification model for identification, marking and outputting highlight position points with shapes meeting the shapes of bubble defects in the filtering image, and the highlight position points are used as the bubble defects on the annular injection molding part to be detected.
As a preferred solution, the annular boundary module is specifically configured to: respectively carrying out gridding processing on the first collected image and the second collected image, and determining a reference point; establishing a three-dimensional coordinate system, moving the first collected image and the second collected image into the three-dimensional coordinate system by taking the reference point as an origin, and determining the coordinate position of each gridding point in the three-dimensional coordinate system; determining a connecting line formed among a plurality of continuous gridding points with the same chromaticity in the first collected image, and determining the connecting line as an annular boundary when the chromaticity difference value between the connecting line and the adjacent gridding points which are not on the connecting line reaches a chromaticity threshold value; and moving the reference point in the three-dimensional coordinate system to the second acquired image by taking the reference point as a reference according to the determined annular boundary in the first acquired image, and determining the annular boundary of the annular injection molding piece to be detected in the second acquired image.
As a preferred scheme, the highlight identification module is configured to perform preprocessing on the first captured image and the second captured image, and specifically includes: graying the first collected image and the second collected image to respectively obtain corresponding grayscale images; in the three-dimensional coordinate system, transversely stretching the gray image by a certain multiple by taking the reference point as a midpoint to obtain a stretched image; identifying light spot features existing in the stretched image, and filtering the light spot features in the stretched image to obtain a filtered image; and according to the transverse stretching multiple, transversely reducing the filtered image to obtain a preprocessed image, and inputting the preprocessed image into a pre-established highlight area model.
As a preferred scheme, the step of establishing the highlight region model includes: acquiring a training image, wherein the training image is obtained by capturing a training annular injection molded part with the shooting device after a light source has been introduced into the closed space; according to the chromaticity of the training image, marking the shape boundary of each highlight generation area in the training image, determining the center point of each highlight generation area and its nearest distance point on the annular boundary of the training part, and associating the distance points with the corresponding highlight generation areas; establishing an initial highlight model through a machine learning algorithm, and inputting the associated training images into the initial highlight model for training until the number of training rounds reaches a threshold, generating a training highlight model; acquiring a test image, wherein the test image is obtained by capturing a training annular injection molded part in the closed space with the shooting device; and inputting the test image into the training highlight model for testing, and generating the highlight area model when the accuracy with which the training highlight model marks the highlight position points of highlight areas in the test image reaches a preset threshold.
As a preferred scheme, the highlight identification module is used in the step of identifying the spot features present in the stretched image, which specifically includes: identifying the irregular figures present in the stretched image; dividing a plurality of ring layers within each irregular figure, determining a plurality of test points in each ring layer, and determining the chromaticity of each test point; calculating the average chromaticity of all test points in each ring layer and taking it as the chromaticity value of that layer; and when the chromaticity values are determined to decrease monotonically from the outermost ring layer to the innermost within the same irregular figure, determining that the irregular figure is a spot feature present in the stretched image.
As a preferred scheme, the highlight determining module is specifically configured to: respectively determine the area range of each highlight position point on the annular boundary of the first captured image and the second captured image; and determine a circumscribed circle for the area range of each highlight position point, taking the brightness value at the circle centre of the circumscribed circle as the brightness value of the corresponding highlight position point.
As a preferred scheme, the image filtering module is specifically configured to: defining the coordinate position of each highlight position point in the first collected image in a three-dimensional coordinate system as a whole to be a first coordinate position; defining the coordinate position of each highlight position point in the second collected image in the three-dimensional coordinate system as a whole to be a second coordinate position; taking the first coordinate position as a reference, and integrally moving the second coordinate position in a three-dimensional coordinate system until the second coordinate position is superposed with the first coordinate position; and determining the difference of the brightness values of the highlight position points at the same position after superposition, and filtering the highlight position points with the difference value larger than a preset threshold value in the first collected image as influence factors to obtain a filtered image.
Preferably, the step of establishing the bubble shape recognition model includes: in the process of pre-establishing the bubble shape recognition model, obtaining the filtered image produced by the preceding steps; marking the bubbles present in the filtered image by manual identification, and determining each bubble defect range; identifying the gray values within each bubble defect range, and dividing the range into two areas according to the change of gray value inside it; determining a circumscribed circle for each of the two areas of the same bubble defect range, and associating the centres of the two circumscribed circles; and establishing an initial bubble model through a machine learning algorithm, inputting the associated filtered images into the initial bubble model for training and testing until the number of training and testing rounds reaches a threshold, generating the bubble shape recognition model.
An embodiment of the present invention further provides a computer-readable storage medium, where the computer-readable storage medium includes a stored computer program; wherein the computer program, when executed, controls an apparatus in which the computer readable storage medium is located to perform the machine vision-based injection molding inspection method of any of the above.
An embodiment of the present invention further provides a terminal device, which includes a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, where the processor, when executing the computer program, implements the machine vision-based injection molding detection method according to any one of the above items.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
according to the technical scheme, the two images irradiated by different light sources are collected and respectively processed for the same annular injection molding piece to be detected, the characteristic that the reflection points on the annular injection molding piece can be amplified after the light sources are irradiated is utilized, the reflection points are filtered according to the brightness values of the highlight position points in the two collected images, the influence noise of the reflection points on the annular injection molding piece on the shape recognition of the bubble defect is eliminated, the technical problem that the prior art cannot accurately recognize the bubble defect on the injection molding piece, particularly the bubble defect on the annular injection molding piece, the accurate recognition of the bubble defect on the injection molding piece is realized, particularly the bubble defect on the annular injection molding piece is recognized, and the detection success rate and the accuracy of the injection molding piece can be improved.
Drawings
FIG. 1: flowchart of the steps of the machine-vision-based injection molded part detection method provided by an embodiment of the present invention;
FIG. 2: structural diagram of the machine-vision-based injection molded part detection system provided by an embodiment of the present invention;
FIG. 3: structural diagram of an embodiment of the terminal device provided by an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of protection of the present invention.
Example one
Referring to fig. 1, a flowchart of steps of a method for inspecting an injection molded part based on machine vision according to an embodiment of the present invention is shown. The method is used for detecting the bubble defects on the annular injection molding part and comprises the following steps 101 to 106:
101, acquiring an image of an annular injection molding piece to be detected in a closed space through shooting equipment to obtain a first acquired image; and keeping the positions of the shooting equipment and the annular injection molding piece to be detected unchanged, and after a light source is put into the closed space, carrying out secondary image acquisition on the annular injection molding piece to be detected to obtain a second acquired image.
Specifically, injection molded part images develop reflective spots that interfere with defect identification, and because an annular injection molded part is curved, the reflection problem is more severe. When reflection points appear in the image, some of them are easily misidentified as bubble defects, or their coverage is so large that it obscures genuine bubble defects. For these reasons, the influence of reflection points on the image itself must be eliminated during inspection, especially for annular parts. In this step two images are acquired: one of the annular part in the closed space alone, and one under an added external light source. Research shows that adding a light source to the originally captured scene amplifies the reflection points in the image, which makes them easier to identify and remove.
102, identifying and marking annular boundary features in the first collected image and the second collected image, and determining an annular boundary of the annular injection molding piece to be detected in the first collected image and the second collected image respectively.
In this embodiment, the step 102 specifically includes: step 1021, gridding the first collected image and the second collected image respectively, and determining a reference point; step 1022, establishing a three-dimensional coordinate system, moving the first collected image and the second collected image into the three-dimensional coordinate system respectively with the reference point as the origin, and determining the coordinate position of each gridding point in the three-dimensional coordinate system; step 1023, determining a connecting line formed among a plurality of continuous gridding points with the same chromaticity in the first collected image, and determining the connecting line to be the annular boundary when the chromaticity difference between the connecting line and the adjacent gridding points not on the connecting line reaches a chromaticity threshold; and step 1024, mapping the annular boundary determined in the first acquired image onto the second acquired image with the reference point in the three-dimensional coordinate system as the reference, thereby determining the annular boundary of the annular injection molding piece to be detected in the second acquired image.
Specifically, step 101 enlarges the reflective spots on the injection molding piece. Before those spots can be identified, the annular boundary of the annular injection molding piece must first be determined, because, for annular pieces in particular, the reflective spots that are hardest to handle are the ones lying on that boundary. During identification, to eliminate the influence of the external light source on boundary recognition (for example, an overlong exposure easily blurs the annular boundary), this step aligns the first acquired image and the second acquired image at the reference point. The complete annular boundary is then identified from the chromaticity difference between the boundary and the surrounding non-boundary grid areas, which prevents reflective spots lying on the annular boundary from going unrecognized in the subsequent identification steps.
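As a rough illustration of steps 1021 to 1023, the chromaticity-difference test for boundary grid points can be sketched in NumPy. The function name, the 4-neighbour rule, and the use of a single 2-D chroma array are all assumptions made for illustration; the patent does not fix these details.

```python
import numpy as np

def find_boundary_points(chroma, chroma_threshold):
    """Mark grid points whose chroma differs from every differently
    coloured 4-neighbour by at least `chroma_threshold` -- a simple
    stand-in for the patent's test of a same-chroma connecting line
    against the adjacent off-line grid points (steps 1021-1023)."""
    h, w = chroma.shape
    boundary = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            diffs = []
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and chroma[ny, nx] != chroma[y, x]:
                    diffs.append(abs(float(chroma[ny, nx]) - float(chroma[y, x])))
            # the point is treated as boundary when every neighbour of a
            # different chroma differs by at least the chroma threshold
            if diffs and min(diffs) >= chroma_threshold:
                boundary[y, x] = True
    return boundary
```

In practice the connecting line would additionally be required to form a closed ring; this sketch only shows the per-point chromaticity test.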
Step 103, preprocessing the first collected image and the second collected image, inputting the preprocessed first collected image and the preprocessed second collected image into a pre-established highlight area model for identification, and respectively marking and outputting highlight position points existing on the first collected image and the second collected image.
In this embodiment, the preprocessing of the first collected image and the second collected image in step 103 specifically includes: step 1031, performing graying processing on the first collected image and the second collected image to obtain the corresponding grayscale images respectively; step 1032, transversely stretching each grayscale image by a certain multiple in the three-dimensional coordinate system with the reference point as the midpoint, to obtain a stretched image; step 1033, identifying the light-spot features existing in the stretched image, and filtering the light-spot features out of the stretched image to obtain a filtered image; and step 1034, transversely shrinking the filtered image back by the transverse stretching multiple to obtain the preprocessed image, and inputting the preprocessed image into the pre-established highlight area model.
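A minimal sketch of the stretch-filter-shrink pipeline of steps 1031 to 1034, assuming an integer stretch factor and averaging-based shrinking (the patent only speaks of "a certain multiple"); the spot filtering of step 1033 is left as an identity placeholder here so the sketch stays runnable.

```python
import numpy as np

def preprocess(rgb, stretch_factor=3):
    """Illustrative pipeline for steps 1031-1034. `stretch_factor`
    is an assumed value, not one taken from the patent."""
    # step 1031: simple mean-of-channels greyscale conversion
    gray = rgb.mean(axis=2)
    # step 1032: horizontal stretch by integer column repetition
    stretched = np.repeat(gray, stretch_factor, axis=1)
    # step 1033: spot-feature filtering would happen here;
    # an identity placeholder keeps this sketch self-contained
    filtered = stretched
    # step 1034: shrink back by averaging each group of repeated columns
    h, w = filtered.shape
    restored = filtered.reshape(h, w // stretch_factor, stretch_factor).mean(axis=2)
    return restored
```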
In another aspect of this embodiment, step 1033, identifying the light-spot features existing in the stretched image, specifically includes: identifying and locating the irregular figures existing in the stretched image; dividing each irregular figure into a plurality of layered ring areas, determining a plurality of test points in each ring area, and determining the chromaticity of each test point; calculating the average chromaticity of all test points in each ring area and taking it as the chromaticity value of that ring area; and, when the chromaticity values in the same irregular figure decrease monotonically from the outermost ring area to the innermost ring area, determining that the irregular figure is a light-spot feature existing in the stretched image.
Specifically, the subsequent steps need to identify the reflective spots on the annular boundary (essentially all reflective spots concentrate on the annular boundary, because the boundary mirrors the light source during irradiation). To identify the highlight position points (i.e., the suspected reflective spots, from which the real ones are later determined) more accurately, the image must be preprocessed. During preprocessing, to reduce noise, influence factors other than the annular injection molding piece itself (for example, light spots such as noise spots) must be removed from the image. After the image is stretched, such light spots are deformed into irregular figures, and the characteristic that a light spot's chromaticity decreases from its outer layer to its inner layer is used to recognize the light-spot noise in the image so that it can be filtered out.
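The outer-to-inner chromaticity-decrease test described above can be sketched as follows. Chebyshev-distance rings around the patch centre stand in for the patent's layered ring areas; the helper names and the ring construction are illustrative assumptions.

```python
import numpy as np

def ring_means(patch, n_rings):
    """Average chroma of concentric square 'rings' around the patch
    centre, approximating the multi-layer ring areas of step 1033.
    Index 0 is the innermost ring (the centre point)."""
    cy, cx = patch.shape[0] // 2, patch.shape[1] // 2
    ys, xs = np.indices(patch.shape)
    dist = np.maximum(np.abs(ys - cy), np.abs(xs - cx))  # Chebyshev rings
    return [patch[dist == r].mean() for r in range(n_rings)]

def is_spot_feature(patch, n_rings):
    """The irregular figure is judged a light spot when ring chroma
    decreases strictly from the outermost ring toward the innermost."""
    outer_to_inner = ring_means(patch, n_rings)[::-1]
    return all(a > b for a, b in zip(outer_to_inner, outer_to_inner[1:]))
```

Note that real light spots are usually brightest at the centre; the decreasing-outward chromaticity profile here follows the patent's own wording.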
In this embodiment, the step of establishing the highlight region model includes: acquiring a training image, wherein the training image is captured by the shooting equipment from a training annular injection molding piece after a light source is placed in the closed space; marking, according to the chromaticity of the training image, the shape boundary of each highlight generation area in the training image, determining the center point of each highlight generation area and the nearest point to it on the annular boundary of the training annular injection molding piece, and associating the distance points with the corresponding highlight generation areas; establishing an initial highlight model through a machine learning algorithm, and inputting the associated training images into the initial highlight model for training until the number of training iterations reaches a threshold, thereby generating a trained highlight model; acquiring a test image, wherein the test image is captured by the shooting equipment from the training annular injection molding piece in the closed space; and inputting the test image into the trained highlight model for testing, and generating the highlight region model when the accuracy with which the trained highlight model marks, in the output image, the highlight position points of the highlight areas in the test image reaches a preset threshold.
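One labelling step above, associating each highlight-area centre with its nearest point on the annular boundary, can be sketched in NumPy; the function name and the point-list representation are assumptions for illustration.

```python
import numpy as np

def associate_to_boundary(centers, boundary_pts):
    """For each highlight-generation-area centre, find the nearest
    point on the annular boundary and return the (centre, boundary
    point) pairs used as training associations."""
    assoc = []
    for c in centers:
        d = np.linalg.norm(boundary_pts - np.asarray(c), axis=1)
        assoc.append((tuple(c), tuple(boundary_pts[d.argmin()])))
    return assoc
```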
Specifically, a key point of the present solution lies in building the highlight region model. Because the highlight region model is used to identify highlight position points accurately, the training must guide the model with both the positions of the highlight areas in the training image and their associated features on the annular boundary. In this way the model continuously learns, along the annular boundary, where the image generates highlight areas, and the finally generated highlight region model can identify an input image and output the recognized highlight position points.
Step 104, respectively determining the highlight position points of the first collected image and the second collected image on the annular boundary, and determining the brightness value of each highlight position point on the annular boundary.
In this embodiment, the step 104 specifically includes: step 1041, determining the area range of each highlight position point on the annular boundary of the first collected image and the second collected image respectively; and step 1042, determining a circumscribed circle for the area range of each highlight position point, and taking the brightness value at the position of the center of the circumscribed circle as the brightness value of the corresponding highlight position point.
Specifically, the highlight position points are assigned brightness values by way of circumscribed circles: the brightness at the center of each circumscribed circle is taken as the brightness value of the corresponding highlight position point. This makes the brightness assignment of each highlight position point more accurate, so that the next step can distinguish which highlight position points are real reflective spots.
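A hedged sketch of step 1042: here the centre of the region's bounding box approximates the circumscribed-circle centre (a true minimum enclosing circle, e.g. OpenCV's `cv2.minEnclosingCircle`, could be substituted), and the brightness value is read from the grayscale image at that centre.

```python
import numpy as np

def highlight_brightness(gray, region_mask):
    """Approximate the circumscribed-circle centre of a highlight
    position point's area range by its bounding-box centre, and read
    the brightness value there (step 1042)."""
    ys, xs = np.nonzero(region_mask)
    cy = (ys.min() + ys.max()) // 2
    cx = (xs.min() + xs.max()) // 2
    return gray[cy, cx]
```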
Step 105, according to the difference between the brightness values of the highlight position points at the same position on the annular boundary of the first acquired image and the second acquired image, filtering out of the first acquired image, as influence factors, the highlight position points whose difference value is larger than a preset threshold, to obtain a filtered image.
In this embodiment, the step 105 specifically includes: step 1051, defining the coordinate positions, in the three-dimensional coordinate system, of all highlight position points in the first acquired image, taken as a whole, as the first coordinate position; step 1052, defining the coordinate positions, in the three-dimensional coordinate system, of all highlight position points in the second acquired image, taken as a whole, as the second coordinate position; step 1053, with the first coordinate position as the reference, moving the second coordinate position as a whole in the three-dimensional coordinate system until it coincides with the first coordinate position; and step 1054, determining the difference of the brightness values of the highlight position points located at the same position after the superposition, and filtering out of the first acquired image, as influence factors, the highlight position points whose difference value is larger than the preset threshold, to obtain the filtered image.
Specifically, using the characteristic that reflective spots are amplified under an external light source, the difference between the brightness values at the same position in the two acquired images reveals which highlight position points are real reflective spots; filtering them out (removing the reflective spots) leaves a noise-free image. To make the data more accurate and avoid image drift over the multiple acquisition and processing steps, "the same position" is determined through the equivalent-movement relation of the three-dimensional coordinate system: the second acquired image is moved as a whole until it coincides with the first acquired image, after which points at the same coordinates can be regarded as the same position.
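The brightness-difference filtering of step 105 then reduces, per highlight position point, to a threshold test on the pair of brightness values. A minimal sketch, assuming the highlight points of the two images have already been aligned into corresponding lists:

```python
def filter_reflection_points(points, bright1, bright2, diff_threshold):
    """Step 105 sketch: a highlight point whose brightness changes by
    more than `diff_threshold` between the two acquisitions (i.e. it
    was amplified by the extra light source) is treated as a real
    reflective spot and removed from the first image's point set."""
    kept = []
    for p, b1, b2 in zip(points, bright1, bright2):
        if abs(b2 - b1) <= diff_threshold:
            kept.append(p)  # stable brightness: likely a genuine feature
    return kept
```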
Step 106, inputting the filtered image into a pre-established bubble shape recognition model for recognition, and marking and outputting the highlight position points in the filtered image whose shapes meet the shape of a bubble defect, as the bubble defects on the annular injection molding piece to be detected.
In this embodiment, the step of establishing the bubble shape recognition model includes: obtaining, during the pre-establishment of the bubble shape recognition model, the filtered image produced by the preceding steps; marking the bubbles in the filtered image by manual identification and determining each bubble defect range; identifying the gray values within each bubble defect range, and dividing the bubble defect range into two areas according to the change of the gray values within it; determining a circumscribed circle for each of the two areas of the same bubble defect range, and associating the centers of the two circumscribed circles; and establishing an initial bubble model through a machine learning algorithm, inputting the associated filtered images into the initial bubble model for training and testing, and generating the bubble shape recognition model when the number of training and testing iterations reaches a threshold.
Specifically, through the above steps 101 to 105 a noise-free image has been obtained, and it only remains to input this noise-free image (i.e., the filtered image) into the model for identifying the bubble-defect shape. It can be understood that when the bubble shape recognition model is built, steps 101 to 105 are run in advance and the "filtered image" output by step 105 serves as the model's training image; once training is complete, the bubble shape recognition model can be used directly on subsequent annular injection molding pieces to be detected, without being rebuilt or retrained. When constructing the model, bubble defects in the image can be marked very accurately by hand. In practice, the shadowed part of a bubble makes each defect present two clearly layered areas on the injection molding piece; by associating the two areas with their respective circumscribed-circle centers, the trained model relates the two layered areas to the corresponding circle centers according to the bubble shape, so that bubble defects in the filtered image, and thus on the annular injection molding piece, are identified accurately.
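The two-area labelling used when building the bubble shape recognition model can be sketched as below. Splitting at the region's mean gray value and using bounding-box centres instead of circumscribed-circle centres are simplifying assumptions of this sketch, not details given by the patent.

```python
import numpy as np

def split_bubble_regions(gray, defect_mask):
    """Split one bubble defect range into a brighter and a darker area
    at the region's mean gray value, then associate the two areas by
    their bounding-box centres (stand-ins for the circumscribed-circle
    centres of the labelling step)."""
    vals = gray[defect_mask]
    thresh = vals.mean()
    bright = defect_mask & (gray >= thresh)
    dark = defect_mask & (gray < thresh)

    def centre(mask):
        ys, xs = np.nonzero(mask)
        return ((ys.min() + ys.max()) // 2, (xs.min() + xs.max()) // 2)

    return centre(bright), centre(dark)
```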
According to the technical scheme, two images of the same annular injection molding piece to be detected, illuminated by different light sources, are acquired and processed separately. Using the characteristic that the reflective spots on the annular injection molding piece are amplified after light-source irradiation, the reflective spots are filtered out according to the brightness values of the highlight position points in the two acquired images, which eliminates the noise that reflective spots introduce into bubble-defect shape identification. This solves the technical problem in the prior art that bubble defects on injection molding pieces, and on annular injection molding pieces in particular, cannot be identified accurately; the scheme enables accurate identification of such bubble defects and improves the detection success rate and accuracy for injection molding pieces.
Example two
Referring to fig. 2, a schematic structural diagram of a machine-vision-based injection molding inspection system according to another embodiment of the present invention is shown. The system is used for detecting bubble defects on an annular injection molding piece and includes: an image acquisition module, an annular boundary module, a highlight identification module, a highlight determination module, an image filtering module and a bubble identification module.
The image acquisition module is used for acquiring images of the annular injection molding piece to be detected in a closed space through shooting equipment to obtain a first acquired image; and keeping the positions of the shooting equipment and the annular injection molding piece to be detected unchanged, and after a light source is put into the closed space, carrying out secondary image acquisition on the annular injection molding piece to be detected to obtain a second acquired image.
The annular boundary module is used for identifying and marking annular boundary characteristics in the first collected image and the second collected image, and determining the annular boundary of the annular injection molding piece to be detected in the first collected image and the second collected image respectively.
In this embodiment, the annular boundary module is specifically configured to: respectively perform gridding processing on the first collected image and the second collected image, and determine a reference point; establish a three-dimensional coordinate system, respectively move the first collected image and the second collected image into the three-dimensional coordinate system with the reference point as the origin, and determine the coordinate position of each gridding point in the three-dimensional coordinate system; determine a connecting line formed among a plurality of continuous gridding points with the same chromaticity in the first collected image, and determine the connecting line to be the annular boundary when the chromaticity difference between the connecting line and the adjacent gridding points not on the connecting line reaches a chromaticity threshold; and map the annular boundary determined in the first acquired image onto the second acquired image with the reference point in the three-dimensional coordinate system as the reference, thereby determining the annular boundary of the annular injection molding piece to be detected in the second acquired image.
And the highlight identification module is used for preprocessing the first collected image and the second collected image, inputting the preprocessed first collected image and the preprocessed second collected image into a preset highlight area model for identification, and respectively marking and outputting highlight position points existing on the first collected image and the second collected image.
In this embodiment, the highlight identifying module is configured to perform preprocessing on the first captured image and the second captured image, and specifically includes: carrying out graying processing on the first collected image and the second collected image to respectively obtain corresponding grayscale images; in the three-dimensional coordinate system, transversely stretching the gray image by a certain multiple by taking the reference point as a midpoint to obtain a stretched image; identifying light spot features existing in the stretched image, and filtering the light spot features in the stretched image to obtain a filtered image; and according to the transverse stretching multiple, transversely reducing the filtered image to obtain a preprocessed image, and inputting the preprocessed image into a pre-established highlight area model.
In another aspect of this embodiment, the highlight identification module is configured to identify the light-spot features existing in the stretched image, which specifically includes: identifying and locating the irregular figures existing in the stretched image; dividing each irregular figure into a plurality of layered ring areas, determining a plurality of test points in each ring area, and determining the chromaticity of each test point; calculating the average chromaticity of all test points in each ring area and taking it as the chromaticity value of that ring area; and, when the chromaticity values in the same irregular figure decrease monotonically from the outermost ring area to the innermost ring area, determining that the irregular figure is a light-spot feature existing in the stretched image.
In this embodiment, the step of establishing the highlight region model includes: acquiring a training image, wherein the training image is captured by the shooting equipment from a training annular injection molding piece after a light source is placed in the closed space; marking, according to the chromaticity of the training image, the shape boundary of each highlight generation area in the training image, determining the center point of each highlight generation area and the nearest point to it on the annular boundary of the training annular injection molding piece, and associating the distance points with the corresponding highlight generation areas; establishing an initial highlight model through a machine learning algorithm, and inputting the associated training images into the initial highlight model for training until the number of training iterations reaches a threshold, thereby generating a trained highlight model; acquiring a test image, wherein the test image is captured by the shooting equipment from the training annular injection molding piece in the closed space; and inputting the test image into the trained highlight model for testing, and generating the highlight region model when the accuracy with which the trained highlight model marks, in the output image, the highlight position points of the highlight areas in the test image reaches a preset threshold.
The highlight determining module is configured to determine highlight position points of the first acquired image and the second acquired image on an annular boundary, and determine a brightness value of each highlight position point on the annular boundary.
In this embodiment, the highlight determining module is specifically configured to: respectively determine the area range of each highlight position point on the annular boundary of the first collected image and the second collected image; and determine a circumscribed circle for the area range of each highlight position point, and take the brightness value at the position of the center of the circumscribed circle as the brightness value of the corresponding highlight position point.
The image filtering module is configured to filter, according to a difference between luminance values of highlight position points at the same position on the annular boundary between the first collected image and the second collected image, the highlight position points with a difference value larger than a preset threshold as influence factors in the first collected image, so as to obtain a filtered image.
In this embodiment, the image filtering module is specifically configured to: define the coordinate positions, in the three-dimensional coordinate system, of all highlight position points in the first acquired image, taken as a whole, as the first coordinate position; define the coordinate positions, in the three-dimensional coordinate system, of all highlight position points in the second acquired image, taken as a whole, as the second coordinate position; with the first coordinate position as the reference, move the second coordinate position as a whole in the three-dimensional coordinate system until it coincides with the first coordinate position; and determine the difference of the brightness values of the highlight position points located at the same position after the superposition, and filter out of the first acquired image, as influence factors, the highlight position points whose difference value is larger than the preset threshold, to obtain the filtered image.
And the bubble identification module is used for inputting the filtering image into a pre-established bubble shape identification model for identification, marking and outputting highlight position points with shapes meeting the shapes of bubble defects in the filtering image, and the highlight position points are used as the bubble defects on the annular injection molding part to be detected.
In this embodiment, the step of establishing the bubble shape recognition model includes: obtaining, during the pre-establishment of the bubble shape recognition model, the filtered image produced by the preceding steps; marking the bubbles in the filtered image by manual identification and determining each bubble defect range; identifying the gray values within each bubble defect range, and dividing the bubble defect range into two areas according to the change of the gray values within it; determining a circumscribed circle for each of the two areas of the same bubble defect range, and associating the centers of the two circumscribed circles; and establishing an initial bubble model through a machine learning algorithm, inputting the associated filtered images into the initial bubble model for training and testing, and generating the bubble shape recognition model when the number of training and testing iterations reaches a threshold.
EXAMPLE III
An embodiment of the present invention further provides a computer-readable storage medium, where the computer-readable storage medium includes a stored computer program; wherein the computer program, when running, controls an apparatus on which the computer-readable storage medium is located to perform the machine vision-based injection molding inspection method according to any of the above embodiments.
Example four
Referring to fig. 3, a schematic structural diagram of an embodiment of a terminal device according to an embodiment of the present invention is shown, where the terminal device includes a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, and the processor, when executing the computer program, implements the machine-vision-based injection molding detection method according to any of the embodiments.
The computer program may be divided into one or more modules/units, which are stored in the memory and executed by the processor to implement the invention. The one or more modules/units may be a series of computer program instruction segments capable of performing particular functions, and the segments are used to describe the execution of the computer program in the terminal device.
The Processor may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The general purpose processor may be a microprocessor, or the processor may be any conventional processor. The processor is the control center of the terminal device and uses various interfaces and lines to connect the parts of the terminal device.
The memory mainly includes a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function, and the like, and the data storage area may store related data and the like. In addition, the memory may be a high-speed random access memory or a non-volatile memory, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, and the like, or other non-volatile solid-state memory devices.
It should be noted that the terminal device may include, but is not limited to, a processor and a memory, and those skilled in the art will understand that the terminal device is only an example and does not constitute a limitation of the terminal device, and may include more or less components, or combine some components, or different components.
The above-mentioned embodiments are provided to further explain the objects, technical solutions and advantages of the present invention in detail, and it should be understood that the above-mentioned embodiments are only examples of the present invention and are not intended to limit the scope of the present invention. It should be understood that any modifications, equivalents, improvements and the like, which come within the spirit and principle of the invention, may occur to those skilled in the art and are intended to be included within the scope of the invention.
Claims (10)
1. A machine vision based injection molding inspection method for detecting bubble defects on an annular injection molding piece, the method comprising:
acquiring an image of the annular injection molding piece to be detected in a closed space through shooting equipment to obtain a first acquired image; keeping the positions of the shooting equipment and the annular injection molding piece to be detected unchanged, and after a light source is put into the closed space, carrying out secondary image acquisition on the annular injection molding piece to be detected to obtain a second acquired image;
identifying and marking the annular boundary characteristics in the first collected image and the second collected image, and determining the annular boundary of the annular injection molding piece to be detected in the first collected image and the second collected image respectively;
preprocessing the first collected image and the second collected image, inputting the preprocessed first collected image and the preprocessed second collected image into a pre-established highlight area model for recognition, and respectively marking and outputting highlight position points existing on the first collected image and the second collected image;
respectively determining highlight position points of the first collected image and the second collected image on an annular boundary, and determining the brightness value of each highlight position point on the annular boundary;
filtering highlight position points with difference values larger than a preset threshold value in the first acquired image as influence factors according to the difference between the brightness values of the highlight position points at the same position on the annular boundary of the first acquired image and the second acquired image to obtain a filtered image;
and inputting the filtering image into a pre-established bubble shape recognition model for recognition, marking and outputting highlight position points with shapes meeting the shapes of bubble defects in the filtering image, and taking the highlight position points as the bubble defects on the annular injection molding piece to be detected.
2. The machine-vision-based injection molding inspection method of claim 1, wherein said step of identifying and marking the annular boundary features in the first captured image and the second captured image, and determining the annular boundary of the annular injection molding under test in the first captured image and the second captured image, respectively, comprises:
respectively carrying out gridding processing on the first collected image and the second collected image, and determining a reference point;
establishing a three-dimensional coordinate system, respectively moving the first collected image and the second collected image to the three-dimensional coordinate system by taking the reference point as an origin, and determining the coordinate position of each gridding point in the three-dimensional coordinate system;
determining a connecting line formed among a plurality of continuous gridding points with the same chromaticity in the first collected image, and determining the connecting line as an annular boundary when the chromaticity difference value between the connecting line and the adjacent gridding points which are not on the connecting line reaches a chromaticity threshold value;
and moving the reference point in the three-dimensional coordinate system to the second acquired image by taking the reference point as a reference according to the determined annular boundary in the first acquired image, and determining the annular boundary of the annular injection molding piece to be detected in the second acquired image.
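The connecting-line test of claim 2 — a run of consecutive grid points with equal chromaticity whose off-line neighbours differ by at least a chromaticity threshold — can be sketched along a single grid row. The function name, the minimum run length of 3, and the threshold of 30 are illustrative assumptions:

```python
def find_boundary_runs(chroma_row, min_len=3, chroma_threshold=30):
    """Return (start, end) index spans of equal-chroma runs whose neighbours
    outside the run differ by at least chroma_threshold, mimicking the
    claim's connecting-line test along one grid row."""
    runs = []
    i, n = 0, len(chroma_row)
    while i < n:
        j = i
        while j + 1 < n and chroma_row[j + 1] == chroma_row[i]:
            j += 1  # extend the run of equal chromaticity
        if j - i + 1 >= min_len:
            left_ok = i == 0 or abs(chroma_row[i - 1] - chroma_row[i]) >= chroma_threshold
            right_ok = j == n - 1 or abs(chroma_row[j + 1] - chroma_row[i]) >= chroma_threshold
            if left_ok and right_ok:
                runs.append((i, j))
        i = j + 1
    return runs

row = [10, 10, 200, 200, 200, 200, 15, 15]
print(find_boundary_runs(row))  # [(2, 5)]
```

In the claimed method this test runs over the 2-D grid of the whole image; the 1-D version above only shows the per-run logic.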
3. The machine-vision-based injection molded part inspection method of claim 2, wherein the step of preprocessing the first captured image and the second captured image comprises:
graying the first collected image and the second collected image to respectively obtain corresponding grayscale images;
in the three-dimensional coordinate system, the reference point is used as a midpoint, and the gray level image is transversely stretched by a certain multiple to obtain a stretched image;
identifying light spot features existing in the stretched image, and filtering the light spot features in the stretched image to obtain a filtered image;
and according to the transverse stretching multiple, transversely reducing the filtered image to obtain a preprocessed image, and inputting the preprocessed image into a pre-established highlight area model.
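The stretch-filter-shrink pipeline of claim 3 relies on the stretch and the reduction using the same transverse multiple. A minimal nearest-neighbour sketch (integer factor, image as a list of rows; the spot filtering itself is omitted) shows that the two steps round-trip exactly:

```python
def stretch_horizontal(img, factor):
    """Nearest-neighbour transverse stretch: repeat each column `factor` times."""
    return [[px for px in row for _ in range(factor)] for row in img]

def shrink_horizontal(img, factor):
    """Inverse of the stretch: keep every factor-th column."""
    return [row[::factor] for row in img]

gray = [[0, 128, 255],
        [64, 64, 64]]
wide = stretch_horizontal(gray, 3)       # 3x wider; spot filtering would run here
assert shrink_horizontal(wide, 3) == gray  # restores the original width
```

A production system would use proper interpolation (e.g. bilinear) rather than pixel repetition, but the invertibility requirement is the same.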
4. The machine-vision-based injection molded part inspection method of claim 3, wherein the step of establishing the highlight region model comprises:
acquiring a training image, wherein the training image is obtained by capturing an image of a training annular injection molded part with the shooting equipment after a light source is placed in the closed space;
according to the chromaticity of the training image, marking the shape boundary of the highlight generation area in the training image, respectively determining the center point of each highlight generation area and the nearest distance point on the annular boundary of the training annular injection molding part, and associating the distance points with the corresponding highlight generation areas;
establishing an initial highlight model through a machine learning algorithm, inputting the associated training images into the initial highlight model for training until the training times reach a threshold value, and generating a training highlight model;
acquiring a test image, wherein the test image is obtained by acquiring an image of a training annular injection molding piece in a closed space through shooting equipment;
and inputting the test image into the training highlight model for testing, and generating a highlight area model when the accuracy of the training highlight model for marking highlight position points with highlight areas in the test image in the output image reaches a preset threshold value.
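Claim 4's acceptance gate — the trained highlight model becomes the final highlight-area model only once its marking accuracy on the test images reaches a preset threshold — can be sketched as below; `predict`, `accept_model`, and the 0.95 threshold are illustrative assumptions:

```python
def accuracy(predictions, labels):
    """Fraction of test samples where the model's marking matches the label."""
    correct = sum(p == l for p, l in zip(predictions, labels))
    return correct / len(labels)

def accept_model(predict, test_images, test_labels, threshold=0.95):
    """Promote the trained highlight model only if its accuracy on the
    test set reaches the preset threshold, as in claim 4."""
    preds = [predict(img) for img in test_images]
    return accuracy(preds, test_labels) >= threshold

# Dummy stand-in model: "any nonzero value is a highlight".
passed = accept_model(lambda img: img > 0, [1, 2, 0, 3], [True, True, False, True])
print(passed)  # True
```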
5. The machine-vision-based injection molding inspection method of claim 3, wherein the step of identifying the spot features present in the stretched image comprises:
identifying and locating the irregular figures present in the stretched image;
dividing a plurality of layers of circular ring areas in each irregular graph respectively, determining a plurality of test points in each layer of circular ring area, and simultaneously determining the chromaticity of each test point;
calculating the average chroma of all test points in each layer of ring area, and taking the average chroma as the chroma value of the ring area;
and when the chromaticity values in the same irregular figure are determined to decrease monotonically from the outermost ring area towards the innermost ring area, determining that the irregular figure is a light-spot feature present in the stretched image.
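The ring test of claim 5 reduces to averaging sampled chromaticity per ring and checking that the means fall strictly from the outermost ring inward. A minimal sketch, with rings given as already-sampled value lists (outermost first) and hypothetical helper names:

```python
def ring_means(rings):
    """Average chromaticity per ring, outermost ring first."""
    return [sum(r) / len(r) for r in rings]

def is_spot(rings):
    """True when mean chromaticity strictly decreases from the outermost
    ring towards the innermost one, as claim 5 requires for a light spot."""
    means = ring_means(rings)
    return all(a > b for a, b in zip(means, means[1:]))

# Sampled chromaticity for three concentric rings, outermost first.
spot = [[200, 210, 190], [150, 160, 140], [90, 100, 110]]
not_spot = [[100, 100], [180, 180], [90, 90]]
print(is_spot(spot), is_spot(not_spot))  # True False
```

The claim leaves open how many rings and test points per ring to use; those would be tuned to the spot sizes produced by the light source.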
6. The machine-vision-based injection molded part inspection method of claim 1, wherein said steps of determining highlight site points of said first captured image and said second captured image, respectively, on an annular boundary, and determining a brightness value of each highlight site point on the annular boundary, specifically comprise:
respectively determining the area range of each highlight position point on the annular boundary of the first collected image and the second collected image;
and determining a circumscribed circle for the area range of each highlight position point, and taking the brightness value at the circle centre of the circumscribed circle as the brightness value of the corresponding highlight position point.
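Claim 6 reads the brightness at the centre of each region's circumscribed circle. As a simple stand-in, the sketch below approximates that centre by the centre of the region's bounding box (an exact minimum enclosing circle would need something like Welzl's algorithm, or OpenCV's `cv2.minEnclosingCircle`); the function names are hypothetical:

```python
def region_center(points):
    """Approximate the circumscribed-circle centre of a highlight region
    by the centre of its bounding box."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return ((min(xs) + max(xs)) // 2, (min(ys) + max(ys)) // 2)

def center_brightness(image, points):
    """Brightness value at the (approximate) circle centre, per claim 6."""
    x, y = region_center(points)
    return image[y][x]

image = [[0,  0,   0,   0],
         [0, 50,  90,   0],
         [0, 80, 120,   0],
         [0,  0,   0,   0]]
pts = [(1, 1), (2, 1), (1, 2), (2, 2)]  # a 2x2 highlight region
print(center_brightness(image, pts))  # 50
```

Sampling one representative brightness per region keeps the later image-to-image comparison to a single value per highlight point.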
7. The machine-vision-based injection molding detection method of claim 6, wherein the step of filtering the highlight position points with the difference value larger than a preset threshold value in the first captured image as influence factors according to the difference between the brightness values of the highlight position points at the same position on the annular boundary of the first captured image and the second captured image to obtain a filtered image specifically comprises:
defining the coordinate position of each highlight position point in the first collected image in a three-dimensional coordinate system as a whole to be a first coordinate position;
defining the coordinate position of each highlight position point in the second collected image in the three-dimensional coordinate system as a whole to be a second coordinate position;
taking the first coordinate position as a reference, and integrally moving the second coordinate position in a three-dimensional coordinate system until the second coordinate position is superposed with the first coordinate position;
and determining the difference between the brightness values of the highlight position points at the same position after superposition, and filtering out, as interference factors, the highlight position points in the first captured image whose difference exceeds a preset threshold, so as to obtain a filtered image.
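After the coordinate alignment of claim 7, the remaining step is a per-position brightness comparison. A minimal sketch, assuming the aligned highlight points of each image are stored as a dict mapping (x, y) position to brightness (a representation this patent does not prescribe):

```python
def filter_by_difference(first, second, threshold):
    """Keep only the highlight points of the first image whose brightness
    differs from the co-located point in the second image by at most
    `threshold`; larger differences are discarded as interference (claim 7).
    `first` and `second` map aligned (x, y) positions to brightness values."""
    return {
        pos: b1
        for pos, b1 in first.items()
        if pos in second and abs(b1 - second[pos]) <= threshold
    }

first = {(0, 0): 200, (1, 0): 120, (2, 0): 90}
second = {(0, 0): 60, (1, 0): 110, (2, 0): 95}
print(filter_by_difference(first, second, 20))  # {(1, 0): 120, (2, 0): 90}
```

The intuition from claim 1 is that adding the light source changes reflections and stray glare far more than it changes genuine surface features, so a large first-to-second brightness difference marks an artefact rather than a defect.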
8. A machine vision-based injection molding inspection system for detecting bubble defects on a ring-shaped injection molded part, the system comprising: the device comprises an image acquisition module, an annular boundary module, a highlight identification module, a highlight determination module, an image filtering module and a bubble identification module;
the image acquisition module is used for acquiring an image of the annular injection molding piece to be detected in a closed space through shooting equipment to obtain a first acquired image; keeping the positions of the shooting equipment and the annular injection molding piece to be detected unchanged, and after a light source is put into the closed space, carrying out secondary image acquisition on the annular injection molding piece to be detected to obtain a second acquired image;
the annular boundary module is used for identifying and marking annular boundary characteristics in the first collected image and the second collected image, and determining an annular boundary of the annular injection molding piece to be detected in the first collected image and the second collected image respectively;
the highlight identification module is used for preprocessing the first collected image and the second collected image, inputting the preprocessed first collected image and the preprocessed second collected image into a preset highlight area model for identification, and respectively marking and outputting highlight position points existing on the first collected image and the second collected image;
the highlight determining module is used for respectively determining highlight position points of the first acquired image and the second acquired image on the annular boundary and determining the brightness value of each highlight position point on the annular boundary;
the image filtering module is used for filtering the highlight position points with the difference value larger than a preset threshold value in the first collected image as influence factors according to the difference between the brightness values of the highlight position points at the same position on the annular boundary of the first collected image and the second collected image to obtain a filtered image;
and the bubble identification module is used for inputting the filtered image into a pre-established bubble-shape recognition model for identification, and for marking and outputting the highlight position points in the filtered image whose shapes match the shape of a bubble defect, these being taken as the bubble defects on the annular injection molded part under test.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a stored computer program; wherein the computer program, when executed, controls an apparatus on which the computer-readable storage medium resides to perform the machine-vision-based injection molding inspection method of any one of claims 1-7.
10. A terminal device comprising a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, the processor when executing the computer program implementing the machine-vision based injection molding detection method of any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211409675.2A CN115452844B (en) | 2022-11-11 | 2022-11-11 | Injection molding part detection method and system based on machine vision |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115452844A (en) | 2022-12-09
CN115452844B (en) | 2023-02-03
Family
ID=84295658
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211409675.2A Active CN115452844B (en) | 2022-11-11 | 2022-11-11 | Injection molding part detection method and system based on machine vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115452844B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116945521B (en) * | 2023-09-15 | 2023-12-08 | 张家港市神舟机械有限公司 | Injection molding defect detection method |
CN118549436A (en) * | 2024-05-21 | 2024-08-27 | 浙江元盛塑业股份有限公司 | Automobile plastic injection molding defect detection system based on machine vision technology |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN201251555Y (en) * | 2008-08-28 | 2009-06-03 | 赵永先 | Chip welding-quality measuring instrument |
CN104510447A (en) * | 2015-01-13 | 2015-04-15 | 哈尔滨工业大学 | Visible light and near-infrared light sublingual image acquisition system |
CN107490582A (en) * | 2017-09-20 | 2017-12-19 | 深圳市晟达机械设计有限公司 | A kind of streamline Work Piece Verification System Based |
JP2020085470A (en) * | 2018-11-15 | 2020-06-04 | 三井化学株式会社 | Bubble detection device, method and program |
CN114004826A (en) * | 2021-11-13 | 2022-02-01 | 博科视(苏州)技术有限公司 | Visual sense-based method for detecting appearance defects of medical injection molding part |
CN115082485A (en) * | 2022-08-23 | 2022-09-20 | 南通华烨塑料工业有限公司 | Method and system for detecting bubble defects on surface of injection molding product |
CN115201212A (en) * | 2022-09-19 | 2022-10-18 | 江苏华彬新材料有限公司 | Plastic product defect detection device based on machine vision |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5861462B2 (en) * | 2012-01-17 | 2016-02-16 | オムロン株式会社 | Inspection standard registration method for solder inspection and board inspection apparatus using the method |
Non-Patent Citations (1)
Title |
---|
Research and Development of a Detection System for Common Appearance Defects of Injection Molded Parts; Cui Zhibiao; China Master's Theses Full-text Database, Engineering Science and Technology I; 2019-03-15 (No. 03); pp. 1-76 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||