CN115452844A - Injection molding part detection method and system based on machine vision - Google Patents


Info

Publication number
CN115452844A (application CN202211409675.2A)
Authority
CN
China
Prior art keywords
image, highlight, injection molding, annular, determining
Prior art date
Legal status (an assumption, not a legal conclusion)
Granted
Application number
CN202211409675.2A
Other languages
Chinese (zh)
Other versions
CN115452844B (en)
Inventor
刘璨
周本政
廖光皓
谢炳生
刘焕牢
尹凝霞
Current Assignee
Guangdong Ocean University
Original Assignee
Guangdong Ocean University
Priority date
Filing date
Publication date
Application filed by Guangdong Ocean University filed Critical Guangdong Ocean University
Priority claimed from CN202211409675.2A
Publication of CN115452844A
Application granted
Publication of CN115452844B
Legal status: Active

Classifications

    • G01N 21/8851 — Investigating the presence of flaws or contamination; scan or image signal processing specially adapted therefor, e.g. for detecting different kinds of defects
    • G06T 7/0004 — Image analysis; industrial image inspection
    • G06T 7/13 — Segmentation; edge detection
    • G06T 7/66 — Analysis of geometric attributes of image moments or centre of gravity
    • G06T 7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06V 10/764 — Image or video recognition or understanding using pattern recognition or machine learning, using classification

Abstract

The invention discloses a machine-vision-based injection molded part inspection method and system. Two images of the same annular injection molded part under test are captured under different lighting conditions and processed separately. Because illumination from an added light source amplifies the reflection points on an annular part, those reflection points can be filtered out by comparing the brightness values of the highlight position points in the two captured images, eliminating the noise that reflections introduce into bubble-defect shape recognition. This solves the technical problem that the prior art cannot accurately identify bubble defects on injection molded parts, in particular on annular injection molded parts; accurate identification of such defects is realized, and both the detection success rate and the detection accuracy are improved.

Description

Injection molding part detection method and system based on machine vision
Technical Field
The invention relates to the technical field of image processing, in particular to a machine vision-based injection molding part detection method and system.
Background
An injection molded part is a workpiece formed from materials such as polypropylene and polyethylene. In manufacture, plastic pellets are melted at high temperature, injected into a mold, and pressed and cooled by the machine to form the part. Because the process is strongly affected by factors such as temperature, pressure, and the mold itself, the surface of a produced part frequently shows too little or too much material, so produced parts usually need to be inspected.
Common defect types for injection molded parts include shrinkage, gas lines, material shortage, flash, thread clamping, and the like. For these common defect types, existing inspection strategies rely mainly on manual inspection: workers sort out defective parts by observing the surface condition of each produced part. The apparent defects above can indeed be detected manually, but a special defect type such as a "bubble" is harder to handle. Bubbles form when material is injected too quickly, or when uneven volume shrinkage during molding leaves cavities inside the part. High-precision injection molded parts impose strict requirements on the finished product, and a bubble defect can compromise the equipment into which the part is later installed. The traditional manual approach, however, is extremely inefficient, the accuracy of visual examination is limited, and bubble defect points cannot be identified reliably. Although the development of image processing technology has produced some research on bubble defects in injection molded parts, that research is limited to directions such as analysing the causes of bubble formation and cannot locate bubble defects on a part. Moreover, when image-based inspection is applied to an annular injection molded part, the part's reflection points interfere with bubble recognition, making bubble defect detection on annular parts even more difficult.
The market therefore urgently needs a new injection molded part inspection strategy that can accurately identify bubble defects on injection molded parts, in particular on annular injection molded parts, and improve the detection success rate and accuracy.
Disclosure of Invention
The invention provides a machine vision-based injection molding part detection method and system, which can realize accurate identification of bubble defects on injection molding parts, particularly the bubble defects of annular injection molding parts, and can improve the detection success rate and accuracy of the injection molding parts.
In order to solve the technical problem, an embodiment of the present invention provides a machine vision-based injection molding part detection method for detecting a bubble defect on an annular injection molding part, where the method includes:
acquiring an image of the annular injection molding piece to be detected in a closed space through shooting equipment to obtain a first acquired image; keeping the positions of the shooting equipment and the annular injection molding piece to be detected unchanged, and after a light source is put into the closed space, carrying out secondary image acquisition on the annular injection molding piece to be detected to obtain a second acquired image;
identifying and marking the annular boundary characteristics in the first collected image and the second collected image, and determining the annular boundary of the annular injection molding piece to be detected in the first collected image and the second collected image respectively;
preprocessing the first collected image and the second collected image, inputting the preprocessed first collected image and the preprocessed second collected image into a pre-established highlight area model for recognition, and respectively marking and outputting highlight position points existing on the first collected image and the second collected image;
respectively determining highlight position points of the first collected image and the second collected image on an annular boundary, and determining the brightness value of each highlight position point on the annular boundary;
filtering the highlight position points with the difference value larger than a preset threshold value in the first collected image as influence factors according to the difference between the brightness values of the highlight position points at the same position on the annular boundary of the first collected image and the second collected image to obtain a filtered image;
and inputting the filtering image into a pre-established bubble shape recognition model for recognition, marking and outputting highlight position points with shapes meeting the shapes of bubble defects in the filtering image, and taking the highlight position points as the bubble defects on the annular injection molding piece to be detected.
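Taken together, the steps above amount to: find highlight points, compare the brightness of each highlight across the two exposures, discard the points the added light source amplified, and classify what remains. A minimal sketch in Python follows; the function name, the fixed grey-level highlight test, and the threshold are illustrative assumptions — the patent uses trained models for the highlight and bubble-shape steps, which are abstracted away here.

```python
import numpy as np

def detect_bubbles(img_no_light, img_with_light, diff_threshold=60):
    """Hypothetical end-to-end sketch of the claimed six-step pipeline.

    Highlight points whose brightness grows sharply once the extra
    light source is added are treated as reflection noise and removed;
    the remainder are candidate bubble defects.
    """
    # Steps 2-3 (boundary detection, trained highlight model) are
    # abstracted: here a "highlight" is simply a pixel whose grey
    # value exceeds a fixed level in the first captured image.
    highlights = np.argwhere(img_no_light > 200)
    kept = []
    for y, x in highlights:
        # Steps 4-5: compare brightness at the same position in both images.
        diff = int(img_with_light[y, x]) - int(img_no_light[y, x])
        if diff <= diff_threshold:       # amplified points are reflections
            kept.append((int(y), int(x)))  # candidate bubble defect
    return kept
```

Step 6 (shape classification of the surviving points) would then be applied to `kept`.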
As a preferred scheme, in the step of identifying and marking the annular boundary features in the first captured image and the second captured image, and determining the annular boundary of the annular injection molding piece to be tested in the first captured image and the second captured image respectively, the method specifically includes:
respectively carrying out gridding processing on the first collected image and the second collected image, and determining a reference point;
establishing a three-dimensional coordinate system, moving the first collected image and the second collected image into the three-dimensional coordinate system by taking the reference point as an origin, and determining the coordinate position of each gridding point in the three-dimensional coordinate system;
determining a connecting line formed among a plurality of continuous gridding points with the same chromaticity in the first collected image, and determining the connecting line as an annular boundary when the chromaticity difference value between the connecting line and the adjacent gridding points which are not on the connecting line reaches a chromaticity threshold value;
and moving the reference point in the three-dimensional coordinate system to the second acquired image by taking the reference point as a reference according to the determined annular boundary in the first acquired image, and determining the annular boundary of the annular injection molding piece to be detected in the second acquired image.
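The chromaticity-contrast rule in the boundary step can be illustrated with a simplified sketch restricted to horizontal runs on a grid of grey values; the function name and threshold are hypothetical, and the patent additionally places the gridded points in a three-dimensional coordinate system, which is omitted here.

```python
import numpy as np

def annular_boundary_rows(chroma, chroma_threshold=50):
    """Sketch: flag grid points lying on a same-chroma horizontal run
    whose contrast against the adjacent off-run points (the rows above
    and below) reaches the chromaticity threshold."""
    boundary = []
    h, w = chroma.shape
    for y in range(1, h - 1):
        for x in range(w):
            v = int(chroma[y, x])
            # neighbours above/below play the role of "adjacent
            # gridding points not on the connecting line"
            if (abs(v - int(chroma[y - 1, x])) >= chroma_threshold and
                    abs(v - int(chroma[y + 1, x])) >= chroma_threshold):
                boundary.append((y, x))
    return boundary
```

A real implementation would trace runs in all directions and close them into a ring, but the contrast test is the same.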
As a preferred scheme, the step of preprocessing the first collected image and the second collected image specifically includes:
carrying out graying processing on the first collected image and the second collected image to respectively obtain corresponding grayscale images;
in the three-dimensional coordinate system, the reference point is used as a midpoint, and the gray level image is transversely stretched by a certain multiple to obtain a stretched image;
identifying light spot features existing in the stretched image, and filtering the light spot features in the stretched image to obtain a filtered image;
and according to the transverse stretching multiple, transversely reducing the filtered image to obtain a preprocessed image, and inputting the preprocessed image into a pre-established highlight area model.
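The greyscale/stretch/shrink round trip described above can be sketched with numpy; the spot-filtering stage is elided, and the stretch factor and function name are illustrative assumptions.

```python
import numpy as np

def preprocess(rgb, stretch=3):
    """Sketch of the claimed preprocessing: greyscale conversion,
    transverse (column-wise) stretch, spot filtering (elided),
    then transverse shrink back by the same multiple."""
    gray = rgb.mean(axis=2)                       # graying processing
    stretched = np.repeat(gray, stretch, axis=1)  # transverse stretch
    # ... light-spot features would be identified and filtered here ...
    filtered = stretched
    # shrink back by averaging each group of `stretch` columns
    restored = filtered.reshape(filtered.shape[0], -1, stretch).mean(axis=2)
    return restored
```

Stretching before spot detection makes elongated reflective streaks easier to separate from round bubble candidates; shrinking restores the original geometry before the highlight model runs.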
As a preferred scheme, the step of establishing the highlight region model includes:
acquiring a training image, wherein the training image is obtained by image acquisition of a training annular injection molding piece by the shooting equipment after a light source is placed in the closed space;
according to the chromaticity of the training image, marking the shape boundary of the highlight generation area in the training image, respectively determining the center point of each highlight generation area and the nearest distance point on the annular boundary of the training annular injection molding part, and associating the distance points with the corresponding highlight generation areas;
establishing an initial highlight model through a machine learning algorithm, inputting the associated training images into the initial highlight model for training until the number of training iterations reaches a threshold value, and generating a training highlight model;
acquiring a test image, wherein the test image is obtained by acquiring an image of a training annular injection molding piece in a closed space through shooting equipment;
and inputting the test image into the training highlight model for testing, and generating the highlight area model when the accuracy with which the training highlight model marks the highlight position points of the test image in its output reaches a preset threshold value.
Preferably, the step of identifying the spot features existing in the stretched image specifically includes:
identifying the irregular figures present in the stretched image and determining their positions;
dividing a plurality of layers of circular ring areas in each irregular graph respectively, determining a plurality of test points in each layer of circular ring area, and simultaneously determining the chromaticity of each test point;
calculating the average chroma of all test points in each layer of ring area, and taking the average chroma as the chroma value of the ring area;
and when the chroma values are determined to decrease sequentially from the outermost ring area to the innermost ring area within the same irregular figure, determining that the irregular figure is a spot feature existing in the stretched image.
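The concentric-ring test above can be sketched directly: decompose the figure into ring regions around a centre, average the chroma in each, and require a strict outermost-to-innermost decrease. The function name and ring radii are illustrative assumptions.

```python
import numpy as np

def is_spot(blob, center, radii):
    """Sketch: a figure is a light-spot feature when the mean chroma of
    its concentric ring regions strictly decreases from the outermost
    ring towards the innermost one, as the claim specifies.

    `radii` is a list of (inner, outer) radius pairs, ordered
    outermost-first."""
    cy, cx = center
    ys, xs = np.indices(blob.shape)
    dist = np.hypot(ys - cy, xs - cx)
    ring_means = []
    for inner, outer in radii:
        mask = (dist >= inner) & (dist < outer)
        ring_means.append(blob[mask].mean())   # average chroma per ring
    return all(a > b for a, b in zip(ring_means, ring_means[1:]))
```

Figures failing the monotone-decrease test are kept as potential defect highlights rather than filtered as spots.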
As a preferred scheme, the step of respectively determining highlight position points of the first captured image and the second captured image on the annular boundary, and determining the brightness value of each highlight position point on the annular boundary specifically includes:
respectively determining the area range of each highlight position point on the annular boundary of the first collected image and the second collected image;
and determining a circumscribed circle for the area range of each highlight position point, and taking the brightness value at the position of the centre of the circumscribed circle as the brightness value of the corresponding highlight position point.
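A simple way to realise the circumscribed-circle reading is sketched below, approximating the circumscribed circle by the circle around the region's bounding box (an exact minimal enclosing circle, e.g. OpenCV's `cv2.minEnclosingCircle`, could be substituted). The function name and the approximation are assumptions, not the patent's stated implementation.

```python
import numpy as np

def highlight_brightness(image, region_points):
    """Sketch: approximate the circumscribed circle of a highlight
    region by the circle around its axis-aligned bounding box, and
    read the image brightness at the circle centre, as the claim
    specifies."""
    pts = np.asarray(region_points)
    y_min, x_min = pts.min(axis=0)
    y_max, x_max = pts.max(axis=0)
    cy, cx = (y_min + y_max) / 2.0, (x_min + x_max) / 2.0
    radius = np.hypot(y_max - y_min, x_max - x_min) / 2.0
    value = image[int(round(cy)), int(round(cx))]
    return (cy, cx), radius, float(value)
```

Using the centre value gives each highlight one representative brightness, which is what the later difference-filtering step compares.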
As a preferred scheme, the step of filtering, according to a difference between luminance values of highlight position points at the same position on the annular boundary between the first captured image and the second captured image, the highlight position point having a difference value larger than a preset threshold as an influence factor in the first captured image to obtain a filtered image specifically includes:
defining the coordinate position of each highlight position point in the first collected image in a three-dimensional coordinate system as a whole to be a first coordinate position;
defining the coordinate position of each highlight position point in the second collected image in the three-dimensional coordinate system as a whole to be a second coordinate position;
taking the first coordinate position as a reference, and integrally moving the second coordinate position in a three-dimensional coordinate system until the second coordinate position is superposed with the first coordinate position;
and determining the difference of the brightness values of the highlight position points at the same position after superposition, and filtering the highlight position points with the difference value larger than a preset threshold value in the first collected image as influence factors to obtain a filtered image.
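The align-then-subtract step can be sketched as follows; here the superposition of the two coordinate sets is approximated by a centroid-offset translation, and the function name and data layout are illustrative assumptions.

```python
def filter_reflections(points1, points2, threshold):
    """Sketch of the claimed alignment-and-difference filtering.

    points1/points2 map (y, x) highlight positions to brightness for
    the first and second captured images. points2 is translated so its
    point set coincides with points1 (here: by the offset between the
    two set centroids); points whose brightness grows by more than
    `threshold` under the added light source are dropped as reflections.
    """
    c1y = sum(y for y, _ in points1) / len(points1)
    c1x = sum(x for _, x in points1) / len(points1)
    c2y = sum(y for y, _ in points2) / len(points2)
    c2x = sum(x for _, x in points2) / len(points2)
    dy, dx = round(c1y - c2y), round(c1x - c2x)
    shifted = {(y + dy, x + dx): v for (y, x), v in points2.items()}
    kept = {}
    for pos, v1 in points1.items():
        v2 = shifted.get(pos, v1)
        if v2 - v1 <= threshold:      # amplified points are reflections
            kept[pos] = v1
    return kept
```

The surviving positions form the filtered image handed to the bubble shape recognition model.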
Preferably, the step of establishing the bubble shape recognition model includes:
in the process of pre-establishing the bubble shape recognition model, obtaining the filtered image produced by the preceding steps;
marking bubbles existing in the filtered image in a manual identification mode, and determining a bubble defect range;
identifying the gray value in each bubble defect range, and dividing the bubble defect range into two areas according to the change of the gray value in the bubble defect range;
determining circumscribed circles of two areas in which the same bubble defect range is located respectively, and associating the centers of the circumscribed circles corresponding to the two areas;
and establishing an initial bubble model through a machine learning algorithm, inputting the associated filtered images into the initial bubble model for training and testing, and generating the bubble shape recognition model when the number of training and testing iterations reaches a threshold value.
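The two-region split used when labelling bubbles can be illustrated as below; the median grey value as the dividing criterion and the centroid as a stand-in for the circumscribed-circle centre are my assumptions, since the patent only says the range is divided "according to the change of the gray value".

```python
import numpy as np

def bubble_region_pair(patch):
    """Sketch: divide a bubble-defect patch into a brighter and a
    darker region by its median grey value, and return the two region
    centres as an associated pair (the claim pairs the centres of the
    two regions' circumscribed circles; centroids approximate them)."""
    median = np.median(patch)
    bright = np.argwhere(patch > median)
    dark = np.argwhere(patch <= median)

    def centre(pts):
        return tuple(pts.mean(axis=0))

    return centre(bright), centre(dark)
```

The paired centres encode the characteristic bright-rim/dark-core (or vice versa) structure of a bubble, which the recognition model is then trained on.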
Accordingly, another embodiment of the present invention further provides a machine vision-based injection molded part inspection system for detecting bubble defects on a ring-shaped injection molded part, the system comprising: the device comprises an image acquisition module, an annular boundary module, a highlight identification module, a highlight determination module, an image filtering module and a bubble identification module;
the image acquisition module is used for acquiring images of the annular injection molding piece to be detected in a closed space through shooting equipment to obtain a first acquired image; keeping the positions of the shooting equipment and the annular injection molding piece to be detected unchanged, and after a light source is put into the closed space, carrying out secondary image acquisition on the annular injection molding piece to be detected to obtain a second acquired image;
the annular boundary module is used for identifying and marking annular boundary characteristics in the first collected image and the second collected image, and determining an annular boundary of the annular injection molding piece to be detected in the first collected image and the second collected image respectively;
the highlight identification module is used for preprocessing the first collected image and the second collected image, inputting the preprocessed first collected image and the preprocessed second collected image into a preset highlight area model for identification, and respectively marking and outputting highlight position points existing on the first collected image and the second collected image;
the highlight determining module is used for respectively determining highlight position points of the first collected image and the second collected image on an annular boundary and determining the brightness value of each highlight position point on the annular boundary;
the image filtering module is used for filtering the highlight position points with the difference value larger than a preset threshold value in the first collected image as influence factors according to the difference between the brightness values of the highlight position points at the same position on the annular boundary of the first collected image and the second collected image to obtain a filtered image;
and the bubble identification module is used for inputting the filtering image into a pre-established bubble shape identification model for identification, marking and outputting highlight position points with shapes meeting the shapes of bubble defects in the filtering image, and the highlight position points are used as the bubble defects on the annular injection molding part to be detected.
As a preferred solution, the annular boundary module is specifically configured to: respectively carrying out gridding processing on the first collected image and the second collected image, and determining a reference point; establishing a three-dimensional coordinate system, moving the first collected image and the second collected image into the three-dimensional coordinate system by taking the reference point as an origin, and determining the coordinate position of each gridding point in the three-dimensional coordinate system; determining a connecting line formed among a plurality of continuous gridding points with the same chromaticity in the first collected image, and determining the connecting line as an annular boundary when the chromaticity difference value between the connecting line and the adjacent gridding points which are not on the connecting line reaches a chromaticity threshold value; and moving the reference point in the three-dimensional coordinate system to the second acquired image by taking the reference point as a reference according to the determined annular boundary in the first acquired image, and determining the annular boundary of the annular injection molding piece to be detected in the second acquired image.
As a preferred scheme, the highlight identification module is used in the step of preprocessing the first captured image and the second captured image, and specifically includes: carrying out graying processing on the first collected image and the second collected image to respectively obtain corresponding grayscale images; in the three-dimensional coordinate system, transversely stretching the gray image by a certain multiple by taking the reference point as a midpoint to obtain a stretched image; identifying light spot features existing in the stretched image, and filtering the light spot features in the stretched image to obtain a filtered image; and according to the transverse stretching multiple, transversely reducing the filtered image to obtain a preprocessed image, and inputting the preprocessed image into a pre-established highlight area model.
As a preferred scheme, the step of establishing the highlight region model includes: acquiring a training image, wherein the training image is obtained by image acquisition of a training annular injection molding piece by the shooting equipment after a light source is placed in the closed space; according to the chromaticity of the training image, marking the shape boundary of each highlight generation area in the training image, respectively determining the center point of each highlight generation area and the nearest distance point on the annular boundary of the training annular injection molding piece, and associating the distance points with the corresponding highlight generation areas; establishing an initial highlight model through a machine learning algorithm, inputting the associated training images into the initial highlight model for training until the number of training iterations reaches a threshold value, and generating a training highlight model; acquiring a test image, wherein the test image is obtained by image acquisition of a training annular injection molding piece in the closed space through the shooting equipment; and inputting the test image into the training highlight model for testing, and generating the highlight area model when the accuracy with which the training highlight model marks the highlight position points of the test image in its output reaches a preset threshold value.
As a preferred scheme, the highlight identification module is used in the step of identifying the spot features existing in the stretched image, and specifically includes: identifying the irregular figures present in the stretched image and determining their positions; dividing a plurality of layers of circular ring areas in each irregular figure respectively, determining a plurality of test points in each layer of circular ring area, and simultaneously determining the chromaticity of each test point; calculating the average chroma of all test points in each layer of ring area, and taking the average chroma as the chroma value of the ring area; and when the chroma values are determined to decrease sequentially from the outermost ring area to the innermost ring area within the same irregular figure, determining that the irregular figure is a spot feature existing in the stretched image.
As a preferred scheme, the highlight determining module is specifically configured to: respectively determine the area range of each highlight position point on the annular boundary of the first collected image and the second collected image; and determine a circumscribed circle for the area range of each highlight position point, taking the brightness value at the position of the centre of the circumscribed circle as the brightness value of the corresponding highlight position point.
As a preferred scheme, the image filtering module is specifically configured to: defining the coordinate position of each highlight position point in the first acquired image in a three-dimensional coordinate system as a whole as a first coordinate position; defining the coordinate position of each highlight position point in the second collected image in the three-dimensional coordinate system as a whole to be a second coordinate position; taking the first coordinate position as a reference, and integrally moving the second coordinate position in a three-dimensional coordinate system until the second coordinate position is superposed with the first coordinate position; and determining the difference of the brightness values of the highlight position points at the same position after superposition, and filtering the highlight position points with the difference value larger than a preset threshold value in the first collected image as influence factors to obtain a filtered image.
Preferably, the step of establishing the bubble shape recognition model includes: in the process of pre-establishing the bubble shape recognition model, obtaining the filtered image produced by the preceding steps; marking bubbles in the filtered image by manual identification, and determining each bubble defect range; identifying the gray values within each bubble defect range, and dividing the bubble defect range into two areas according to the change of the gray value within the range; respectively determining circumscribed circles for the two areas of the same bubble defect range, and associating the centers of the circumscribed circles corresponding to the two areas; and establishing an initial bubble model through a machine learning algorithm, inputting the associated filtered images into the initial bubble model for training and testing, and generating the bubble shape recognition model when the number of training and testing iterations reaches a threshold value.
An embodiment of the present invention further provides a computer-readable storage medium, where the computer-readable storage medium includes a stored computer program; wherein the computer program, when executed, controls an apparatus in which the computer readable storage medium is located to perform a machine vision-based injection molding inspection method as in any one of the above.
An embodiment of the present invention further provides a terminal device, which includes a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, where the processor, when executing the computer program, implements the machine vision-based injection molding detection method according to any one of the above items.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
according to the technical scheme, two images irradiated by different light sources are acquired for the same annular injection molding piece to be detected and are respectively processed, the characteristic that after the light sources are irradiated, the reflection points on the annular injection molding piece can be amplified is utilized, the reflection points are filtered according to the brightness values of all highlight position points in the two acquired images, the influence noise of the reflection points on the annular injection molding piece on the bubble defect shape identification is eliminated, the technical problem that the bubble defect on the injection molding piece, particularly the bubble defect on the annular injection molding piece, cannot be accurately identified in the prior art is solved, the accurate identification of the bubble defect on the injection molding piece, particularly the bubble defect on the annular injection molding piece, can be realized, and the detection success rate and the accuracy of the injection molding piece can be improved.
Drawings
FIG. 1: a flowchart of the steps of the machine-vision-based injection molded part detection method provided by an embodiment of the invention;
FIG. 2: a structural diagram of the machine-vision-based injection molded part detection system provided by an embodiment of the invention;
FIG. 3: a structural diagram of an embodiment of the terminal device provided by an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of protection of the present invention.
Example one
Referring to fig. 1, a flowchart of steps of a method for inspecting an injection molded part based on machine vision according to an embodiment of the present invention is shown. The method is used for detecting the bubble defects on the annular injection molding part and comprises the following steps 101 to 106:
101, acquiring an image of an annular injection molding piece to be detected in a closed space through shooting equipment to obtain a first acquired image; and keeping the positions of the shooting equipment and the annular injection molding piece to be detected unchanged, and after a light source is put into the closed space, carrying out secondary image acquisition on the annular injection molding piece to be detected to obtain a second acquired image.
Specifically, reflective spots appear on injection molding part images during defect identification, and because an annular injection molding piece is curved, the reflective-spot problem is more severe on it. When reflective spots appear, some of them are mistakenly identified as bubble defects, or their coverage is so large that it interferes with identifying real bubble defects. For these reasons, the influence of reflective spots on the image must be eliminated when detecting injection molding pieces, and especially annular ones. This step therefore acquires two images: a first image of the annular injection molding piece in the closed space, and a second image acquired after an external light source is added to that space. Research shows that adding a light source to the originally photographed scene amplifies the reflective spots in the image, which makes them easier to identify and remove.
102, identifying and marking annular boundary characteristics in the first collected image and the second collected image, and determining an annular boundary of the annular injection molding piece to be detected in the first collected image and the second collected image respectively.
In this embodiment, the step 102 specifically includes: step 1021, gridding the first collected image and the second collected image respectively, and determining a reference point; step 1022, establishing a three-dimensional coordinate system, moving the first collected image and the second collected image to the three-dimensional coordinate system respectively by taking the reference point as an origin, and determining the coordinate position of each gridding point in the three-dimensional coordinate system; step 1023, determining a connecting line formed among a plurality of continuous gridding points with the same chromaticity in the first collected image, and determining the connecting line as a circular boundary when the difference value of the chromaticity between the connecting line and the adjacent gridding points which are not on the connecting line reaches a chromaticity threshold value; and 1024, moving the reference point in the three-dimensional coordinate system to the second acquired image based on the determined annular boundary in the first acquired image, and determining the annular boundary of the annular injection molding piece to be measured in the second acquired image.
Specifically, step 101 amplifies the reflective spots on the injection molding piece. To later identify the reflective spots lying on the annular boundary — the hardest ones to handle on an annular injection molding piece — the annular boundary itself must be determined first. During identification, to eliminate the influence of the external light source on boundary recognition (for example, a long exposure easily blurs the annular boundary), this step aligns the first acquired image and the second acquired image at the position of the reference point. The complete annular boundary is then identified from the chromaticity difference between grid areas on the boundary and the surrounding non-boundary areas, which prevents reflective spots lying on the annular boundary from escaping identification in the subsequent steps.
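As an illustrative aid only (not part of the claimed method), the grid-based chromaticity test of steps 1021 to 1023 can be sketched in Python with NumPy. The function name, grid step, and chromaticity threshold below are assumed values; the patent links runs of equal-chromaticity grid points and compares them with off-line neighbours, while this simplified sketch flags any grid point with a strong local chromaticity jump:

```python
import numpy as np

def ring_boundary_mask(chroma, step=4, threshold=30):
    """Mark grid points whose chromaticity differs from at least one
    4-neighbour (at grid distance `step`) by more than `threshold`;
    such points are candidates for the annular boundary."""
    h, w = chroma.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(step, h - step, step):
        for x in range(step, w - step, step):
            c = int(chroma[y, x])
            neigh = [chroma[y - step, x], chroma[y + step, x],
                     chroma[y, x - step], chroma[y, x + step]]
            if max(abs(c - int(n)) for n in neigh) >= threshold:
                mask[y, x] = True
    return mask
```

On a synthetic annulus, the flagged points cluster along the ring's inner and outer edges, mirroring the chromaticity-threshold test of step 1023.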
Step 103, preprocessing the first collected image and the second collected image, inputting the preprocessed first collected image and the preprocessed second collected image into a pre-established highlight area model for identification, and respectively marking and outputting highlight position points existing on the first collected image and the second collected image.
In this embodiment, the step 103 is used for the step of preprocessing the first captured image and the second captured image, and specifically includes: 1031, performing graying processing on the first collected image and the second collected image to obtain corresponding grayscale images respectively; step 1032, transversely stretching the gray image by a certain multiple by taking the reference point as a midpoint in the three-dimensional coordinate system to obtain a stretched image; step 1033, identifying the light spot features existing in the stretched image, and filtering the light spot features in the stretched image to obtain a filtered image; and 1034, transversely reducing the filtered image according to the transverse stretching multiple to obtain a preprocessed image, and inputting the preprocessed image into a pre-established highlight area model.
In another aspect of this embodiment, the step 1033 is configured to perform the step of identifying the spot features existing in the stretched image, and specifically includes: identifying irregular figures existing in the stretched image, and determining the irregular figures existing in the stretched image; respectively dividing a plurality of layers of circular ring areas in each irregular graph, determining a plurality of test points in each layer of circular ring area, and simultaneously determining the chromaticity of each test point; calculating the average chroma of all test points in each layer of ring area, and taking the average chroma as the chroma value of the ring area; and when the chromatic values on the ring area at the outermost layer in the same irregular figure are determined to be sequentially decreased towards the ring area at the innermost layer, determining that the irregular figure is the spot feature existing in the stretched image.
Specifically, to identify the reflective spots on the annular boundary in the subsequent steps (almost all reflective spots concentrate on the annular boundary, because during illumination the boundary reflects the light source), the presence of reflective spots on the injection molding piece must first be recognized by the model. To identify the highlight position points (i.e., suspected reflective spots, to be confirmed later) more accurately, the image needs to be preprocessed. During preprocessing, to reduce noise, influence factors other than the annular injection molding piece itself (for example, light-spot noise) must be removed from the image. After the image is stretched, light spots are deformed into irregular figures; the characteristic that a light spot's chromaticity decreases layer by layer from the outer ring to the inner ring is then used to detect these light-spot noise points and filter them out.
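Purely as an illustration of steps 1032 to 1034 (the function names, ring count, and stretch factor are editorial assumptions, not the patent's implementation), the layered-ring chromaticity test and the stretch/filter/shrink pipeline can be sketched as follows; a crude pixel-duplication stretch stands in for whatever interpolation the real system uses:

```python
import numpy as np

def ring_means(patch, n_rings=3):
    """Mean intensity of `n_rings` concentric rings around the patch
    centre, outermost ring first (the layered-ring test of step 1033)."""
    h, w = patch.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.sqrt((yy - cy) ** 2 + (xx - cx) ** 2)
    r_max = r.max()
    means = []
    for i in range(n_rings):  # i = 0 is the outermost ring
        lo = r_max * (n_rings - 1 - i) / n_rings
        hi = r_max * (n_rings - i) / n_rings
        sel = (r >= lo) & (r < hi)
        means.append(patch[sel].mean())
    return means

def is_spot(patch, n_rings=3):
    """True when the ring means strictly decrease from the outermost
    toward the innermost ring, which the patent takes as the cue for
    a light-spot feature."""
    m = ring_means(patch, n_rings)
    return all(m[i] > m[i + 1] for i in range(len(m) - 1))

def stretch_filter_shrink(gray, factor=2, spot_mask=None):
    """Steps 1032-1034: stretch laterally, blank detected spot pixels,
    then shrink back to the original width."""
    stretched = np.repeat(gray, factor, axis=1)  # crude lateral stretch
    if spot_mask is not None:
        stretched[np.repeat(spot_mask, factor, axis=1)] = 0
    return stretched[:, ::factor]                # lateral reduction
```

With no spot mask, stretching followed by reduction recovers the original image, consistent with the patent's intent that the stretch is only a temporary aid for spot detection.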
In this embodiment, the step of establishing the highlight region model includes: acquiring a training image, wherein the training image is obtained by shooting an image acquisition of a training annular injection molding piece by shooting equipment after a light source is put in a closed space; according to the chromaticity of the training image, marking the shape boundary of the highlight generation area in the training image, respectively determining the center point of each highlight generation area and the nearest distance point on the annular boundary of the training annular injection molding part, and associating the distance points with the corresponding highlight generation areas; establishing an initial highlight model through a machine learning algorithm, inputting the associated training images into the initial highlight model for training until the training times reach a threshold value, and generating a training highlight model; acquiring a test image, wherein the test image is obtained by acquiring an image of a training annular injection molding piece in a closed space through shooting equipment; and inputting the test image into the training highlight model for testing, and generating a highlight area model when the accuracy of the training highlight model for marking highlight position points with highlight areas in the test image in the output image reaches a preset threshold value.
Specifically, a key part of the scheme lies in building the highlight area model. Since the model's function is to accurately identify highlight position points, its training must be guided by the positions of the highlight generation areas in the training images and their associated feature points on the annular boundary. By repeatedly learning where highlight areas arise relative to the annular boundary, the finally generated highlight area model can process an input image and mark the identified highlight position points.
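The patent leaves the learning algorithm unspecified. As a deliberately tiny, hypothetical stand-in (the function name and the single-threshold "model" are editorial inventions, far simpler than a real learned highlight-region model), the train-until-accuracy-threshold idea can be sketched as a grid search for a brightness cut that best separates labelled highlight pixels from background:

```python
import numpy as np

def fit_highlight_threshold(samples, labels):
    """Grid-search one brightness threshold separating labelled
    highlight pixels (True) from background (False); returns the
    threshold and its accuracy, echoing the patent's pattern of
    training until an accuracy threshold is reached."""
    best_t, best_acc = 0, 0.0
    for t in range(0, 256, 5):
        pred = samples > t
        acc = float((pred == labels).mean())
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc
```

A real implementation would replace this with the patent's machine-learning model trained on the annotated highlight generation areas and their associated boundary distance points.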
And 104, respectively determining highlight position points of the first collected image and the second collected image on an annular boundary, and determining the brightness value of each highlight position point on the annular boundary.
In this embodiment, the step 104 specifically includes: step 1041, determining the area range of each highlight position point on the annular boundary of the first collected image and the second collected image respectively; step 1042, determining an circumscribed circle for the area range of each highlight position point, and taking the corresponding brightness value at the position where the center of the circumscribed circle is located as the brightness value of the corresponding highlight position point.
Specifically, a circumscribed circle is determined for the area range of each highlight position point, and the brightness at the position of the circle's center is taken as the brightness value of that highlight position point. This makes the brightness assignment of each highlight position point more accurate, so that the next step can distinguish which highlight position points are real reflective spots.
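For illustration only, step 1042 can be approximated as below. Note the assumption: instead of a true minimal enclosing circle of the region, this sketch uses the circle circumscribing the region's bounding box, whose center is usually a close proxy; the function name is invented:

```python
import numpy as np

def highlight_brightness(gray, region_mask):
    """Approximate the circumscribed circle of a highlight region by
    the circle around its bounding box, and return the brightness at
    the circle's center as the highlight position point's value."""
    ys, xs = np.nonzero(region_mask)
    cy = (ys.min() + ys.max()) / 2.0
    cx = (xs.min() + xs.max()) / 2.0
    radius = np.hypot(ys.max() - ys.min(), xs.max() - xs.min()) / 2.0
    brightness = gray[int(round(cy)), int(round(cx))]
    return brightness, (cy, cx, radius)
```

A production system might instead use an exact minimal-enclosing-circle routine (such as the one provided by common vision libraries) on the region's contour.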
And 105, filtering the highlight position points with the difference value larger than a preset threshold value in the first acquired image as influence factors according to the difference between the brightness values of the highlight position points at the same position on the annular boundary of the first acquired image and the second acquired image to obtain a filtered image.
In this embodiment, the step 105 specifically includes: step 1051, defining the coordinate positions of the highlight position points in the first collected image in a three-dimensional coordinate system as a whole as a first coordinate position; step 1052, defining the coordinate position of each highlight position point in the second acquired image in the three-dimensional coordinate system as a whole as a second coordinate position; step 1053, taking the first coordinate position as a reference, moving the second coordinate position in the three-dimensional coordinate system as a whole until the second coordinate position coincides with the first coordinate position; and 1054, determining the difference between the brightness values of the highlight position points at the same position after superposition, and filtering the highlight position points with the difference value larger than a preset threshold value in the first collected image as influence factors to obtain a filtered image.
Specifically, using the characteristic that reflective spots are amplified by the external light source, the difference between the brightness values at the same position in the two acquired images determines which highlight position points are real reflective spots; filtering those out leaves a noise-free image. To keep the data accurate and avoid image shifts introduced across repeated acquisition and processing, "the same position" is determined by rigidly translating the highlight coordinate positions of the second acquired image as a whole in the three-dimensional coordinate system until they coincide with those of the first acquired image; points at the same coordinates can then be regarded as the same position.
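The filtering rule of steps 1051 to 1054 reduces to a per-point brightness comparison once the two images are registered. A minimal sketch, assuming the registration has already been done and using invented names (`points` as shared coordinates, `bright1`/`bright2` as brightness lookups per image):

```python
def filter_glare_points(points, bright1, bright2, diff_threshold):
    """Keep highlight position points whose brightness change between
    the no-light image (bright1) and the added-light image (bright2)
    stays within diff_threshold; larger jumps mark real glare points,
    which are dropped as influence factors."""
    kept = []
    for p in points:
        if abs(bright2[p] - bright1[p]) <= diff_threshold:
            kept.append(p)
    return kept
```

The surviving points are the bubble-defect candidates passed on to step 106.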
And 106, inputting the filtering image into a pre-established bubble shape recognition model for recognition, marking and outputting highlight position points with shapes meeting the shapes of bubble defects in the filtering image, wherein the highlight position points are used as the bubble defects on the annular injection molding part to be detected.
In this embodiment, the step of establishing the bubble shape recognition model includes: in the process of pre-establishing the bubble shape recognition model, obtaining a filtering image obtained after the steps are executed; marking bubbles existing in the filtered image in a manual identification mode, and determining a bubble defect range; identifying the gray value in each bubble defect range, and dividing the bubble defect range into two areas according to the change of the gray value in the bubble defect range; determining circumscribed circles of two areas in which the same bubble defect range is located respectively, and associating the centers of the circumscribed circles corresponding to the two areas; and establishing an initial bubble model through a machine learning algorithm, inputting the associated filtering image into the initial bubble model for training and testing, and generating a bubble shape recognition model until the times of training and testing reach a threshold value.
Specifically, steps 101 to 105 above yield a noise-free image, which now only needs to be input into the model that identifies bubble-defect shapes. It should be understood that, to build the bubble shape recognition model, steps 101 to 105 are run in advance and the "filtered image" output by step 105 is used as the training image; once training is complete, the model can be used directly on subsequent annular injection molding pieces to be detected, without being rebuilt or retrained. When constructing the bubble shape recognition model, the bubble defects in the images are marked very accurately by hand. In practical application, because of its shadowed part, a bubble defect presents two clearly layered areas on the injection molding piece; by associating the two areas through the centers of their circumscribed circles, the trained model learns the relationship between the two layered areas and the corresponding circle centers, and uses it to identify bubble defects in the filtered image, thereby accurately identifying the bubble defects on the annular injection molding piece.
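The two-layered-areas observation can be illustrated with a small sketch (editorial, not the patented model): a defect patch is split at its mean grey value into a lit area and a shadow area, and the center of each area's bounding circle is returned — the pair of centers the recognition model is said to associate. The split at the mean is an assumed simplification of the patent's grey-value-change criterion:

```python
import numpy as np

def split_bubble_regions(patch):
    """Split a bubble-defect patch into its bright (lit) and dark
    (shadow) areas at the mean grey value, and return the center of
    each area's bounding circle as an associated pair."""
    thr = patch.mean()
    centres = []
    for sel in (patch > thr, patch <= thr):
        ys, xs = np.nonzero(sel)
        cy = (ys.min() + ys.max()) / 2.0
        cx = (xs.min() + xs.max()) / 2.0
        centres.append((cy, cx))
    return centres
```

For a patch whose left half is lit and right half is shadowed, the two returned centers land in the respective halves, giving the model a simple geometric signature of the layered bubble.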
In this technical scheme, two images of the same annular injection molding piece to be detected are acquired under different lighting conditions and processed separately. Because an added light source amplifies the reflection points on the annular injection molding piece, those reflection points can be filtered out by comparing the brightness values of the highlight position points across the two acquired images. This eliminates the noise that reflection points introduce into bubble-defect shape recognition, solves the prior-art problem that bubble defects on injection molding pieces, and especially on annular injection molding pieces, cannot be accurately identified, enables accurate identification of such bubble defects, and improves the detection success rate and accuracy for injection molding pieces.
Example two
Referring to fig. 2, a schematic structural diagram of a machine vision-based injection molding inspection system according to another embodiment of the present invention is shown. The system is used for detecting bubble defect on annular injection molding, includes: the device comprises an image acquisition module, an annular boundary module, a highlight identification module, a highlight determination module, an image filtering module and a bubble identification module.
The image acquisition module is used for acquiring images of the annular injection molding piece to be detected in a closed space through shooting equipment to obtain a first acquired image; and keeping the positions of the shooting equipment and the annular injection molding part to be detected unchanged, and after a light source is thrown in the closed space, carrying out secondary image acquisition on the annular injection molding part to be detected to obtain a second acquired image.
The annular boundary module is used for identifying and marking annular boundary characteristics in the first collected image and the second collected image, and determining an annular boundary of the annular injection molding piece to be detected in the first collected image and the second collected image respectively.
In this embodiment, the annular boundary module is specifically configured to: respectively carrying out gridding processing on the first collected image and the second collected image, and determining a reference point; establishing a three-dimensional coordinate system, respectively moving the first collected image and the second collected image to the three-dimensional coordinate system by taking the reference point as an origin, and determining the coordinate position of each gridding point in the three-dimensional coordinate system; determining a connecting line formed among a plurality of continuous gridding points with the same chromaticity in the first collected image, and determining the connecting line as an annular boundary when the chromaticity difference value between the connecting line and the adjacent gridding points which are not on the connecting line reaches a chromaticity threshold value; and moving the three-dimensional coordinate system to the second acquired image by taking the reference point as a reference according to the determined annular boundary in the first acquired image, and determining the annular boundary of the annular injection molding piece to be tested in the second acquired image.
And the highlight identification module is used for preprocessing the first collected image and the second collected image, inputting the preprocessed first collected image and the preprocessed second collected image into a preset highlight area model for identification, and respectively marking and outputting highlight position points existing on the first collected image and the second collected image.
In this embodiment, the highlight identifying module is configured to perform preprocessing on the first captured image and the second captured image, and specifically includes: carrying out graying processing on the first collected image and the second collected image to respectively obtain corresponding grayscale images; in the three-dimensional coordinate system, the reference point is used as a midpoint, and the gray level image is transversely stretched by a certain multiple to obtain a stretched image; identifying light spot characteristics existing in the stretching image, and filtering the light spot characteristics in the stretching image to obtain a filtered image; and according to the transverse stretching multiple, transversely reducing the filtered image to obtain a preprocessed image, and inputting the preprocessed image into a pre-established highlight area model.
In another aspect of this embodiment, the highlight identification module is configured to identify a spot feature existing in the stretched image, and specifically includes: identifying irregular figures existing in the stretched image, and determining the irregular figures existing in the stretched image; respectively dividing a plurality of layers of circular ring areas in each irregular graph, determining a plurality of test points in each layer of circular ring area, and simultaneously determining the chromaticity of each test point; calculating the average chroma of all test points in each layer of ring area, and taking the average chroma as the chroma value of the ring area; and when the chromatic values on the ring area at the outermost layer in the same irregular figure are determined to be sequentially decreased towards the ring area at the innermost layer, determining that the irregular figure is the spot feature existing in the stretched image.
In this embodiment, the step of establishing the highlight region model includes: acquiring a training image, wherein the training image is obtained by shooting an image acquisition of a training annular injection molding piece by shooting equipment after a light source is put in a closed space; according to the chromaticity of the training image, marking the shape boundary of the highlight generation area in the training image, respectively determining the center point of each highlight generation area and the nearest distance point on the annular boundary of the training annular injection molding part, and associating the distance points with the corresponding highlight generation areas; establishing an initial highlight model through a machine learning algorithm, inputting the associated training images into the initial highlight model for training until the training times reach a threshold value, and generating a training highlight model; acquiring a test image, wherein the test image is obtained by acquiring an image of a training annular injection molding piece in a closed space through shooting equipment; and inputting the test image into the training highlight model for testing, and generating a highlight area model when the accuracy of the training highlight model for marking highlight position points with highlight areas in the test image in the output image reaches a preset threshold value.
The highlight determining module is used for respectively determining highlight position points of the first acquired image and the second acquired image on the annular boundary and determining the brightness value of each highlight position point on the annular boundary.
In this embodiment, the highlight determining module is specifically configured to: respectively determining the area range of each highlight position point on the annular boundary of the first collected image and the second collected image; and determining an circumscribed circle aiming at the area range of each highlight position point, and taking the brightness value corresponding to the position of the center of the circumscribed circle as the brightness value of the corresponding highlight position point.
The image filtering module is configured to filter, according to a difference between luminance values of highlight position points at the same position on the annular boundary between the first collected image and the second collected image, the highlight position points with a difference value larger than a preset threshold as influence factors in the first collected image, so as to obtain a filtered image.
In this embodiment, the image filtering module is specifically configured to: defining the coordinate position of each highlight position point in the first collected image in a three-dimensional coordinate system as a whole to be a first coordinate position; defining the coordinate position of each highlight position point in the second collected image in the three-dimensional coordinate system as a whole to be a second coordinate position; taking the first coordinate position as a reference, and integrally moving the second coordinate position in a three-dimensional coordinate system until the second coordinate position is superposed with the first coordinate position; and determining the difference of the brightness values of the highlight position points which are positioned at the same position after superposition, and filtering the highlight position points of which the difference values are larger than a preset threshold value in the first acquired image as influence factors to obtain a filtered image.
And the bubble identification module is used for inputting the filtering image into a pre-established bubble shape identification model for identification, marking and outputting highlight position points with shapes meeting the shapes of bubble defects in the filtering image as the bubble defects on the annular injection molding piece to be detected.
In this embodiment, the step of establishing the bubble shape recognition model includes: in the process of pre-establishing the bubble shape recognition model, obtaining a filtering image obtained after the steps are executed; marking bubbles existing in the filtered image in a manual identification mode, and determining a bubble defect range; identifying the gray value in each bubble defect range, and dividing the bubble defect range into two areas according to the change of the gray value in the bubble defect range; determining circumscribed circles of two areas in which the same bubble defect range is located respectively, and associating the centers of the circumscribed circles corresponding to the two areas; and establishing an initial bubble model through a machine learning algorithm, inputting the associated filtering image into the initial bubble model for training and testing, and generating a bubble shape recognition model until the times of training and testing reach a threshold value.
EXAMPLE III
An embodiment of the present invention further provides a computer-readable storage medium, where the computer-readable storage medium includes a stored computer program; wherein the computer program, when running, controls an apparatus on which the computer-readable storage medium is located to perform the machine vision-based injection molding inspection method according to any of the above embodiments.
Example four
Referring to fig. 3, a schematic structural diagram of an embodiment of a terminal device according to an embodiment of the present invention is shown, where the terminal device includes a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, and the processor, when executing the computer program, implements the machine-vision-based injection molding detection method according to any of the embodiments.
Preferably, the computer program may be divided into one or more modules/units (e.g., computer program) that are stored in the memory and executed by the processor to accomplish the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used for describing the execution process of the computer program in the terminal device.
The processor may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The general purpose processor may be a microprocessor, or the processor may be any conventional processor. The processor is the control center of the terminal device and uses various interfaces and lines to connect the various parts of the terminal device.
The memory mainly includes a program storage area and a data storage area: the program storage area may store an operating system, application programs required for at least one function, and the like, and the data storage area may store related data and the like. In addition, the memory may be a high-speed random access memory or a non-volatile memory, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card, or other non-volatile solid state memory devices.
It should be noted that the terminal device may include, but is not limited to, a processor and a memory, and those skilled in the art will understand that the terminal device is only an example and does not constitute a limitation of the terminal device, and may include more or less components, or combine some components, or different components.
The above-mentioned embodiments are provided to further explain the objects, technical solutions and advantages of the present invention in detail, and it should be understood that the above-mentioned embodiments are only examples of the present invention and are not intended to limit the scope of the present invention. It should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A machine vision based method for inspecting injection molded parts, wherein bubble defects on annular injection molded parts are detected, the method comprising:
acquiring an image of the annular injection molding piece to be detected in a closed space through shooting equipment to obtain a first acquired image; keeping the positions of the shooting equipment and the annular injection molding piece to be detected unchanged, and after a light source is thrown in the closed space, carrying out secondary image acquisition on the annular injection molding piece to be detected to obtain a second acquired image;
identifying and marking the annular boundary characteristics in the first collected image and the second collected image, and determining the annular boundary of the annular injection molding piece to be detected in the first collected image and the second collected image respectively;
preprocessing the first collected image and the second collected image, inputting the preprocessed first collected image and the preprocessed second collected image into a pre-established highlight area model for recognition, and respectively marking and outputting highlight position points existing on the first collected image and the second collected image;
respectively determining highlight position points of the first collected image and the second collected image on an annular boundary, and determining the brightness value of each highlight position point on the annular boundary;
filtering highlight position points with difference values larger than a preset threshold value in the first acquired image as influence factors according to the difference between the brightness values of the highlight position points at the same position on the annular boundary of the first acquired image and the second acquired image to obtain a filtered image;
and inputting the filtered image into a pre-established bubble shape recognition model for recognition, and marking and outputting highlight position points in the filtered image whose shapes match the shape of a bubble defect, as the bubble defects on the annular injection molding piece to be detected.
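The difference-based filtering step recited in claim 1 can be sketched as follows. This is an illustrative Python sketch only, not the patent's implementation; representing highlight position points as a dictionary mapping boundary coordinates to brightness values is an assumption made for the example:

```python
def filter_influence_points(first, second, threshold):
    """Keep highlight points whose brightness is stable between the
    no-light and added-light captures; points whose brightness difference
    exceeds `threshold` are treated as lighting artifacts ("influence
    factors") and removed from the first image."""
    kept = {}
    for pos, b1 in first.items():
        b2 = second.get(pos)
        if b2 is not None and abs(b1 - b2) <= threshold:
            kept[pos] = b1
    return kept

first = {(0, 0): 200, (1, 0): 120, (2, 0): 90}   # brightness without added light
second = {(0, 0): 140, (1, 0): 125, (2, 0): 95}  # brightness with added light
print(filter_influence_points(first, second, 10))
```

Points whose brightness shifts sharply when the light source is added are reflections of the light itself, so only the stable points survive as bubble-defect candidates.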
2. The machine-vision-based injection molding inspection method of claim 1, wherein said step of identifying and marking the annular boundary features in the first captured image and the second captured image, and determining the annular boundary of the annular injection molding under test in the first captured image and the second captured image, respectively, specifically comprises:
respectively carrying out gridding processing on the first collected image and the second collected image, and determining a reference point;
establishing a three-dimensional coordinate system, respectively moving the first collected image and the second collected image to the three-dimensional coordinate system by taking the reference point as an origin, and determining the coordinate position of each gridding point in the three-dimensional coordinate system;
determining a connecting line formed among a plurality of continuous gridding points with the same chromaticity in the first collected image, and determining the connecting line as an annular boundary when the chromaticity difference value between the connecting line and the adjacent gridding points which are not on the connecting line reaches a chromaticity threshold value;
and moving the reference point in the three-dimensional coordinate system to the second acquired image by taking the reference point as a reference according to the determined annular boundary in the first acquired image, and determining the annular boundary of the annular injection molding piece to be detected in the second acquired image.
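Claim 2's boundary criterion — a run of continuous grid points with identical chromaticity, whose off-run neighbours differ by at least a chromaticity threshold — can be sketched for a single gridded row. An illustrative sketch under the assumption that chromaticity is a single scalar per grid point:

```python
def find_boundary_runs(chroma, threshold):
    """Scan a row of gridded chromaticity samples for runs of identical
    chromaticity whose neighbours outside the run differ by at least
    `threshold`; such runs qualify as annular-boundary segments."""
    runs = []
    i, n = 0, len(chroma)
    while i < n:
        j = i
        while j + 1 < n and chroma[j + 1] == chroma[i]:
            j += 1  # extend the run of equal chromaticity
        if j > i:  # at least two continuous grid points
            left_ok = i == 0 or abs(chroma[i - 1] - chroma[i]) >= threshold
            right_ok = j == n - 1 or abs(chroma[j + 1] - chroma[i]) >= threshold
            if left_ok and right_ok:
                runs.append((i, j))
        i = j + 1
    return runs

print(find_boundary_runs([10, 80, 80, 80, 12, 12], 50))
```

A full implementation would trace such runs along a closed curve in two dimensions rather than a single row.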
3. The machine-vision-based inspection method for injection molded parts according to claim 2, wherein the step of preprocessing the first captured image and the second captured image comprises:
carrying out graying processing on the first collected image and the second collected image to respectively obtain corresponding grayscale images;
in the three-dimensional coordinate system, the reference point is used as a midpoint, and the gray level image is transversely stretched by a certain multiple to obtain a stretched image;
identifying light spot features existing in the stretched image, and filtering the light spot features in the stretched image to obtain a filtered image;
and according to the transverse stretching multiple, transversely reducing the filtered image to obtain a preprocessed image, and inputting the preprocessed image into a pre-established highlight area model.
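The preprocessing pipeline of claim 3 — graying, transverse stretching by a fixed multiple, filtering, then transverse reduction by the same multiple — can be sketched on nested lists standing in for image rows. The graying weights and pixel-replication stretch are illustrative assumptions, not the patent's exact operators:

```python
def to_gray(rgb_row):
    # standard luma weighting as one plausible graying choice
    return [round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in rgb_row]

def stretch_h(img, k):
    # replicate each pixel k times along the row (transverse stretch)
    return [[p for p in row for _ in range(k)] for row in img]

def shrink_h(img, k):
    # undo the stretch by averaging each group of k pixels
    return [[sum(row[i:i + k]) // k for i in range(0, len(row), k)]
            for row in img]

img = [[1, 2], [3, 4]]
wide = stretch_h(img, 3)      # spot filtering would run on this image
print(shrink_h(wide, 3))      # restored to the original width
```

Stretching widens narrow light-spot features so they are easier to identify and filter; shrinking by the same factor restores the original geometry before highlight-model input.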
4. The machine-vision based injection molded part inspection method of claim 3, wherein the step of establishing the highlight region model comprises:
acquiring a training image, wherein the training image is obtained by image acquisition of a training annular injection molding piece by shooting equipment after a light source is introduced into a closed space;
according to the chromaticity of the training image, marking the shape boundary of the highlight generation area in the training image, respectively determining the center point of each highlight generation area and the nearest distance point on the annular boundary of the training annular injection molding part, and associating the distance points with the corresponding highlight generation areas;
establishing an initial highlight model through a machine learning algorithm, inputting the associated training images into the initial highlight model for training until the training times reach a threshold value, and generating a training highlight model;
acquiring a test image, wherein the test image is obtained by acquiring an image of a training annular injection molding in a closed space through shooting equipment;
and inputting the test image into the training highlight model for testing, and generating the highlight area model when the accuracy with which the training highlight model marks, in the output image, the highlight position points where highlight areas exist in the test image reaches a preset threshold value.
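The association step of claim 4 pairs each highlight generation area's center point with its nearest point on the annular boundary. A minimal sketch, assuming centers and boundary points are given as 2-D coordinate tuples:

```python
import math

def associate_highlights(centers, boundary_pts):
    """For each highlight-region centre, find the nearest point on the
    annular boundary and pair them up, as in claim 4's association step."""
    return {c: min(boundary_pts, key=lambda p: math.dist(c, p))
            for c in centers}

centers = [(0, 0), (5, 5)]
boundary = [(1, 0), (4, 4)]
print(associate_highlights(centers, boundary))
```

These center-to-boundary associations give the model a geometric prior for where highlights tend to form relative to the ring.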
5. The machine-vision-based injection molding inspection method of claim 3, wherein the step of identifying the spot features present in the stretched image comprises:
performing irregular-figure identification on the stretched image, and determining the irregular figures existing in the stretched image;
dividing a plurality of layers of circular ring areas in each irregular graph respectively, determining a plurality of test points in each layer of circular ring area, and simultaneously determining the chromaticity of each test point;
calculating the average chroma of all test points in each layer of ring area, and taking the average chroma as the chroma value of the ring area;
and when the chromaticity values within the same irregular figure are determined to decrease sequentially from the outermost ring area to the innermost ring area, determining that the irregular figure is a light spot feature existing in the stretched image.
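The light-spot test of claim 5 can be sketched directly: average the test-point chromaticities per ring layer, then check that the averages strictly decrease from the outermost layer inward. An illustrative sketch, with each ring layer represented as a list of sampled chromaticity values:

```python
def ring_means(rings):
    # average chromaticity of the test points in each ring layer
    return [sum(r) / len(r) for r in rings]

def is_spot_feature(rings):
    """Treat an irregular figure as a light-spot artifact when the mean
    chromaticity of its ring layers strictly decreases from the outermost
    layer (first list) to the innermost layer (last list)."""
    means = ring_means(rings)
    return all(a > b for a, b in zip(means, means[1:]))

print(is_spot_feature([[90, 92], [70, 72], [50]]))  # outermost -> innermost
```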
6. The machine-vision-based injection molded part inspection method of claim 1, wherein said steps of determining highlight site points of said first captured image and said second captured image, respectively, on an annular boundary, and determining a brightness value of each highlight site point on the annular boundary, specifically comprise:
respectively determining the area range of each highlight position point on the annular boundary of the first collected image and the second collected image;
and determining a circumscribed circle for the area range of each highlight position point, and taking the brightness value corresponding to the position of the center of the circumscribed circle as the brightness value of the corresponding highlight position point.
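For an arbitrary highlight area range, the circumscribed circle of claim 6 is naturally the minimum enclosing circle of the region's points; its center then indexes the representative brightness value. A brute-force sketch (O(n^4), adequate for small highlight regions; a production system would more likely use an optimized routine such as OpenCV's `minEnclosingCircle`):

```python
import math
from itertools import combinations

def _circle_two(a, b):
    # circle with segment ab as diameter
    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2, math.dist(a, b) / 2)

def _circle_three(a, b, c):
    # circumcircle via the perpendicular-bisector intersection formula
    d = 2 * (a[0] * (b[1] - c[1]) + b[0] * (c[1] - a[1]) + c[0] * (a[1] - b[1]))
    if abs(d) < 1e-12:
        return None  # collinear points have no circumcircle
    ux = ((a[0]**2 + a[1]**2) * (b[1] - c[1]) + (b[0]**2 + b[1]**2) * (c[1] - a[1])
          + (c[0]**2 + c[1]**2) * (a[1] - b[1])) / d
    uy = ((a[0]**2 + a[1]**2) * (c[0] - b[0]) + (b[0]**2 + b[1]**2) * (a[0] - c[0])
          + (c[0]**2 + c[1]**2) * (b[0] - a[0])) / d
    return (ux, uy, math.dist(a, (ux, uy)))

def _covers(circle, pts, eps=1e-9):
    cx, cy, r = circle
    return all(math.dist((cx, cy), p) <= r + eps for p in pts)

def min_enclosing_circle(pts):
    """Smallest circle containing all points, as (cx, cy, radius)."""
    best = None
    for a, b in combinations(pts, 2):
        c = _circle_two(a, b)
        if _covers(c, pts) and (best is None or c[2] < best[2]):
            best = c
    for a, b, c3 in combinations(pts, 3):
        c = _circle_three(a, b, c3)
        if c and _covers(c, pts) and (best is None or c[2] < best[2]):
            best = c
    return best

print(min_enclosing_circle([(0, 0), (0, 2), (2, 0), (2, 2)]))
```

Sampling the brightness at the returned `(cx, cy)` yields a single stable value per highlight position point, insensitive to the region's irregular outline.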
7. The machine-vision-based injection molding detection method of claim 6, wherein the step of filtering the highlight position points with the difference value larger than a preset threshold value in the first captured image as influence factors according to the difference between the brightness values of the highlight position points at the same position on the annular boundary of the first captured image and the second captured image to obtain a filtered image specifically comprises:
defining the coordinate position of each highlight position point in the first collected image in a three-dimensional coordinate system as a whole to be a first coordinate position;
defining the coordinate position of each highlight position point in the second collected image in the three-dimensional coordinate system as a whole to be a second coordinate position;
with the first coordinate position as a reference, moving the second coordinate position as a whole in the three-dimensional coordinate system until the second coordinate position coincides with the first coordinate position;
and determining the difference of the brightness values of the highlight position points at the same position after alignment, and filtering out, as influence factors, highlight position points in the first collected image whose difference value is larger than a preset threshold value, to obtain a filtered image.
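Claim 7's register-then-difference step can be sketched as a rigid translation followed by the thresholded comparison. An illustrative sketch assuming pure translation between the two captures (consistent with the fixed camera of claim 1) and 2-D point keys for brevity:

```python
def align_and_filter(first_pts, second_pts, ref1, ref2, threshold):
    """Translate the second image's highlight points so its reference point
    coincides with the first image's, then drop first-image points whose
    brightness differs from the aligned second-image point by more than
    `threshold` (the influence factors)."""
    dx, dy = ref1[0] - ref2[0], ref1[1] - ref2[1]
    shifted = {(x + dx, y + dy): b for (x, y), b in second_pts.items()}
    return {p: b for p, b in first_pts.items()
            if p in shifted and abs(b - shifted[p]) <= threshold}

first = {(0, 0): 105, (1, 0): 200}
second = {(1, 1): 100, (2, 1): 300}   # offset by (1, 1) relative to first
print(align_and_filter(first, second, (0, 0), (1, 1), 10))
```

Because the camera and part do not move between captures, a single translation of the reference point suffices to bring the two highlight point sets into coincidence.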
8. A machine-vision-based injection molding inspection system for detecting bubble defects on an annular injection molded part, the system comprising: an image acquisition module, an annular boundary module, a highlight identification module, a highlight determination module, an image filtering module and a bubble identification module;
the image acquisition module is used for acquiring an image of the annular injection molding piece to be detected in a closed space through shooting equipment to obtain a first acquired image; keeping the positions of the shooting equipment and the annular injection molding piece to be detected unchanged, and after a light source is introduced into the closed space, performing a second image acquisition of the annular injection molding piece to be detected to obtain a second acquired image;
the annular boundary module is used for identifying and marking annular boundary characteristics in the first collected image and the second collected image, and determining the annular boundary of the annular injection molding piece to be detected in the first collected image and the second collected image respectively;
the highlight identification module is used for preprocessing the first collected image and the second collected image, inputting the preprocessed first collected image and the preprocessed second collected image into a preset highlight area model for identification, and respectively marking and outputting highlight position points existing on the first collected image and the second collected image;
the highlight determining module is used for respectively determining highlight position points of the first collected image and the second collected image on an annular boundary and determining the brightness value of each highlight position point on the annular boundary;
the image filtering module is used for filtering highlight position points with difference values larger than a preset threshold value in the first acquired image as influence factors according to the difference of the brightness values of the highlight position points at the same position on the annular boundary of the first acquired image and the second acquired image to obtain a filtered image;
and the bubble identification module is used for inputting the filtered image into a pre-established bubble shape identification model for identification, and marking and outputting highlight position points in the filtered image whose shapes match the shape of a bubble defect, as the bubble defects on the annular injection molding part to be detected.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a stored computer program; wherein the computer program, when executed, controls an apparatus on which the computer-readable storage medium resides to perform the machine-vision-based injection molding inspection method of any one of claims 1-7.
10. A terminal device comprising a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, the processor implementing the machine vision-based injection molding detection method of any one of claims 1-7 when executing the computer program.
CN202211409675.2A 2022-11-11 2022-11-11 Injection molding part detection method and system based on machine vision Active CN115452844B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211409675.2A CN115452844B (en) 2022-11-11 2022-11-11 Injection molding part detection method and system based on machine vision

Publications (2)

Publication Number Publication Date
CN115452844A true CN115452844A (en) 2022-12-09
CN115452844B CN115452844B (en) 2023-02-03

Family

ID=84295658

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211409675.2A Active CN115452844B (en) 2022-11-11 2022-11-11 Injection molding part detection method and system based on machine vision

Country Status (1)

Country Link
CN (1) CN115452844B (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201251555Y (en) * 2008-08-28 2009-06-03 赵永先 Chip welding-quality measuring instrument
US20130182942A1 (en) * 2012-01-17 2013-07-18 Omron Corporation Method for registering inspection standard for soldering inspection and board inspection apparatus thereby
CN104510447A (en) * 2015-01-13 2015-04-15 哈尔滨工业大学 Visible light and near-infrared light sublingual image acquisition system
CN107490582A (en) * 2017-09-20 2017-12-19 深圳市晟达机械设计有限公司 A kind of streamline Work Piece Verification System Based
JP2020085470A (en) * 2018-11-15 2020-06-04 三井化学株式会社 Bubble detection device, method and program
CN114004826A (en) * 2021-11-13 2022-02-01 博科视(苏州)技术有限公司 Visual sense-based method for detecting appearance defects of medical injection molding part
CN115082485A (en) * 2022-08-23 2022-09-20 南通华烨塑料工业有限公司 Method and system for detecting bubble defects on surface of injection molding product
CN115201212A (en) * 2022-09-19 2022-10-18 江苏华彬新材料有限公司 Plastic product defect detection device based on machine vision

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
崔炽标 (Cui Chibiao): "Research and Development of a Detection System for Common Appearance Defects of Injection Molded Parts", China Master's Theses Full-text Database, Engineering Science and Technology I *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116945521A (en) * 2023-09-15 2023-10-27 张家港市神舟机械有限公司 Injection molding defect detection method
CN116945521B (en) * 2023-09-15 2023-12-08 张家港市神舟机械有限公司 Injection molding defect detection method

Also Published As

Publication number Publication date
CN115452844B (en) 2023-02-03

Similar Documents

Publication Publication Date Title
US8050486B2 (en) System and method for identifying a feature of a workpiece
CN105069790B (en) A kind of gear open defect fast image detection method
KR101773791B1 (en) Method and device for inspecting surfaces of an examined object
CN109900711A (en) Workpiece, defect detection method based on machine vision
CN105975941A (en) Multidirectional vehicle model detection recognition system based on deep learning
CN105740910A (en) Vehicle object detection method and device
CN107563265A (en) A kind of high beam detection method and device
CN115452844B (en) Injection molding part detection method and system based on machine vision
KR101730133B1 (en) Method of inspection for injection molding plastic
CN105303573B (en) The pin detection method and system of acupuncture needle class component
CN112334761B (en) Defect discriminating method, defect discriminating apparatus, and recording medium
CN110516514A (en) A kind of modeling method and device of target detection model
CN112414623A (en) Method and system for detecting part air tightness leakage defect based on artificial intelligence
CN116678826A (en) Appearance defect detection system and method based on rapid three-dimensional reconstruction
CN105023018A (en) Jet code detection method and system
CN110412055A (en) A kind of lens white haze defect inspection method based on multiple light courcess dark-ground illumination
CN117649404A (en) Medicine packaging box quality detection method and system based on image data analysis
CN108428247A (en) The detection method and system in bump direction
US10241000B2 (en) Method for checking the position of characteristic points in light distributions
CN116237266A (en) Flange size measuring method and device
García et al. Rail surface inspection system using differential topographic images
KR20190119801A (en) Vehicle Headlight Alignment Calibration and Classification, Inspection of Vehicle Headlight Defects
Mavi et al. Identify defects in gears using digital image processing
TWI816549B (en) Automated defect detection methods
Yemelyanova et al. APPLICATION OF MACHINE LEARNING FOR RECOGNIZING SURFACE WELDING DEFECTS IN VIDEO SEQUENCES

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant