CN114387438B - Machine vision-based die casting machine parameter regulation and control method - Google Patents

Machine vision-based die casting machine parameter regulation and control method Download PDF

Info

Publication number
CN114387438B
CN114387438B CN202210290716.4A CN202210290716A CN114387438B CN 114387438 B CN114387438 B CN 114387438B CN 202210290716 A CN202210290716 A CN 202210290716A CN 114387438 B CN114387438 B CN 114387438B
Authority
CN
China
Prior art keywords
edge
under
suspected
casting
injection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210290716.4A
Other languages
Chinese (zh)
Other versions
CN114387438A (en
Inventor
陆志惠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Jinhui Die Casting Co ltd
Original Assignee
Wuhan Jinhui Die Casting Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Jinhui Die Casting Co ltd filed Critical Wuhan Jinhui Die Casting Co ltd
Priority to CN202210290716.4A priority Critical patent/CN114387438B/en
Publication of CN114387438A publication Critical patent/CN114387438A/en
Application granted granted Critical
Publication of CN114387438B publication Critical patent/CN114387438B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks

Abstract

The invention relates to a method for regulating and controlling parameters of a die casting machine based on machine vision, and belongs to the field of regulation and control of die casting machines. The method comprises the following steps: acquiring a casting image, and extracting the edge of the casting image by using an edge detection operator to obtain a casting edge image; dividing edges in the edge graph of the casting to obtain a plurality of edge sections; grouping the edge sections according to the similarity between the edge sections, and extracting suspected under-injection areas in the groups; obtaining the number of target under-injection points of each suspected under-injection area according to the mutation degree of each suspected under-injection area, and obtaining the under-injection degree of the casting according to the number of the target under-injection points of each suspected under-injection area; and regulating and controlling the parameters of the die casting machine according to the degree of short injection of the casting. The method improves the accuracy of judging the degree of insufficient injection of the casting.

Description

Machine vision-based die casting machine parameter regulation and control method
Technical Field
The invention relates to the field of die casting machine regulation, in particular to a die casting machine parameter regulation and control method based on machine vision.
Background
The die casting machine is used for pressure casting, the die casting machine injects molten metal liquid into a die to be cooled and formed under the action of pressure, and a solid metal casting can be obtained after die opening. In the actual production process, the parameters of the die casting machine need to be adjusted according to whether the solid metal casting obtained after die opening has defects and the defect degree. The defects of the casting comprise short injection, which is also called as short injection, unclear outline and incomplete corner, and refer to the situation that a cavity is not filled with molten metal and the part with incomplete filling appears on the casting in the process of die casting. Much of the short shot occurs at the end of the casting or at the narrow deep cavity.
For the identification of casting under notes, the existing method is to obtain the edge of a target based on a Canny operator, and then detect the discontinuity of the edge. In the process of acquiring the edge of the target by the Canny operator, in order to acquire an accurate edge, a non-maximum suppression value and a hysteresis threshold value are adopted to suppress a plurality of edges with too low response and too low intensity in a single boundary neighborhood, in the process, part of under-injection points can be suppressed due to insufficient intensity, the actual under-injection degree of the casting cannot be accurately reflected by the calculated under-injection degree of the casting based on the suppressed under-injection points, and further the parameters of the die casting machine adjusted by the method are not accurate.
Disclosure of Invention
In order to solve the problem of inaccurate judgment of the degree of insufficient injection of a casting in the existing method, the invention provides a technical scheme of a die casting machine parameter regulating and controlling method based on machine vision, which comprises the following steps:
acquiring a casting image, and extracting the edge of the casting image by using an edge detection operator to obtain a casting edge image;
dividing edges in the edge graph of the casting to obtain a plurality of edge sections; grouping the edge sections according to the similarity between the edge sections, and extracting suspected under-injection areas in the groups;
obtaining the number of target under-injection points of each suspected under-injection area according to the mutation degree of each suspected under-injection area, and obtaining the under-injection degree of the casting according to the number of the target under-injection points of each suspected under-injection area;
and regulating and controlling the parameters of the die casting machine according to the degree of short injection of the casting.
Has the advantages that: the method provided by the invention considers the error caused by extracting the edge of the casting image by using the edge detection operator, and further calculates the obtained suspected under-injection areas, namely, the target under-injection point quantity of each suspected under-injection area is calculated according to the mutation degree of each suspected under-injection area, so that the more accurate real under-injection point quantity is obtained, the accuracy of judging the under-injection degree of the casting is further improved, and the problem that the judgment of the under-injection degree of the casting is inaccurate in the existing method is solved.
Further, the method for acquiring the mutation degree of each suspected under-injection area comprises the following steps:
for any suspected under-filled area:
acquiring an adjacent edge section corresponding to the suspected under-injection area as a normal edge section corresponding to the under-injection area; obtaining a Gaussian mixture model corresponding to a normal edge section corresponding to the suspected under-annotated zone, fitting the mean value points of all sub-Gaussian models in the Gaussian mixture model corresponding to the suspected under-annotated zone into a curve, and marking the curve as a reference texture curve;
constructing a sample set of the suspected under-injection area according to the position of each suspected under-injection point in the suspected under-injection area and the depth degree of the texture in the neighborhood; obtaining a Gaussian mixture model corresponding to the sample set of the suspected under-annotated zone, fitting the mean values of all sub-Gaussian models in the Gaussian mixture model into a curve, and marking as an under-annotated texture curve;
and determining the longest common subsequence of the reference texture curve and the under-annotated texture curve by using a search strategy, and calculating the mutation degree of the suspected under-annotated area according to the length of the under-annotated texture curve and the length of the longest common subsequence.
Further, the mutation degree of each suspected under-injection area is calculated by using the following calculation formula:
Figure 100002_DEST_PATH_IMAGE002
wherein the content of the first and second substances,
Figure 100002_DEST_PATH_IMAGE004
the mutation degree of a suspected under-filled region,
Figure 100002_DEST_PATH_IMAGE006
the length of the longest public subsequence corresponding to the suspected under-annotated zone;
Figure 100002_DEST_PATH_IMAGE008
the length of the under-injection texture curve corresponding to the suspected under-injection area;
Figure 100002_DEST_PATH_IMAGE010
and the housdov distances of the reference texture curve and the under-annotated texture curve corresponding to the suspected under-annotated zone.
Further, the dividing the edge in the casting edge map to obtain a plurality of edge segments includes:
extracting edge points in the edge image of the casting, fitting the extracted edge points into a curve, and marking the curve as an edge curve;
calculating the curvature of each point on the edge curve to obtain an edge curvature curve and obtain the break points of the edge curvature curve;
the edge curve is divided into different edge segments, demarcated by discontinuities.
Further, the grouping the edge segments according to the similarity between the edge segments includes:
constructing descriptions of the edge sections according to the positions of the pixel points on the edge sections and the texture depth degree in the neighborhood;
calculating the similarity between any two edge segments according to the description of each edge segment;
judging the magnitude relation between the similarity and a similarity threshold, and dividing two corresponding edge sections into a group if the similarity is greater than or equal to the similarity threshold; and if the similarity is smaller than the similarity threshold value, dividing the two corresponding edge sections into different groups.
Further, the method for obtaining the texture depth degree in the neighborhood of the pixel point on each edge segment comprises the following steps:
calculating the gradient of each pixel point in the casting gray level image, and taking the gradient as the pixel value of the corresponding pixel point to obtain a casting gradient image;
in the casting gradient image, for each edge pixel point on each edge segment: and taking the length of the corresponding edge segment as the window width, acquiring a gray level co-occurrence matrix in the window, calculating the contrast of the gray level co-occurrence matrix, and recording the contrast as the texture depth degree in the neighborhood.
Further, the similarity between any two edge segments is calculated using the following formula:
Figure 100002_DEST_PATH_IMAGE012
wherein the content of the first and second substances,
Figure 100002_DEST_PATH_IMAGE014
is the degree of similarity between the two edge segments,
Figure 100002_DEST_PATH_IMAGE016
is the minimum distance between the pixel points on the two edge sections in the airspace,
Figure 100002_DEST_PATH_IMAGE018
the KL divergence of the sample set corresponding to the two edge segments in the sample space.
Further, the extracting the suspected under-filled regions in each group includes:
for any two adjacent edge segments within any group:
recording the two adjacent edge sections as reference edge sections, and connecting two nearest pixel points in the two reference edge sections to obtain an auxiliary line section;
acquiring all connected domains in the casting gradient map, and marking the connected domains intersected with the auxiliary line segments;
and acquiring a convex hull of the marked connected domain, and marking an image area contained by the convex hull as a suspected under-injection area. Further, the target under-injection point number of each suspected under-injection area is calculated by using the following calculation formula:
Figure 100002_DEST_PATH_IMAGE020
wherein, the first and the second end of the pipe are connected with each other,
Figure 100002_DEST_PATH_IMAGE022
the mutation degree of a suspected underinjection point,
Figure 100002_DEST_PATH_IMAGE024
the number of suspected under-injection points in the suspected under-injection area is determined;
Figure 100002_DEST_PATH_IMAGE026
the number of target under-filled points in the suspected under-filled area is shown.
Further, the degree of short shot of the casting is calculated by using the following formula:
Figure 100002_DEST_PATH_IMAGE028
wherein the content of the first and second substances,
Figure 100002_DEST_PATH_IMAGE030
the degree of the short shot of the casting is,
Figure 100002_DEST_PATH_IMAGE032
the serial number of the suspected under-annotated zone,
Figure 100002_DEST_PATH_IMAGE034
is as follows
Figure 318550DEST_PATH_IMAGE032
The number of target under-annotating points in each suspected under-annotating area;
Figure 100002_DEST_PATH_IMAGE036
the number of suspected under-annotated zones;
Figure 100002_DEST_PATH_IMAGE038
is the area of the casting.
Drawings
FIG. 1 is a flow chart of a method for controlling parameters of a die casting machine based on machine vision according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be described below with reference to the drawings in the embodiments of the present invention.
The invention aims to solve the problem that the judgment of the casting short-shot degree is not accurate in the existing method, and the conception of the invention is as follows: in the process of manufacturing a casting by using a die-casting machine, the casting cast by the die-casting machine is sent into a detection area, an image of the casting is obtained by using a camera, the degree of undercjection of the casting is obtained according to the image of the casting, and adjustment parameters of the die-casting machine for casting the casting are obtained based on the degree of undercjection of the casting; and adjusting all parameters of the casting production of the die casting machine according to the adjustment parameters.
Specifically, as shown in fig. 1, the method for regulating and controlling parameters of a die-casting machine based on machine vision of the embodiment includes the following steps:
(1) acquiring a casting image, and extracting the edge of the casting image by using an edge detection operator to obtain a casting edge image;
in the embodiment, a casting produced by a die casting machine is placed on a transmission belt, and the casting is conveyed to a detection area by the transmission belt; an RGB camera is arranged above the detection area, and the optical axis of the camera should be perpendicular to the detection area as far as possible so as to ensure the maximum imaging range; when the die casting reaches the detection area, a camera is used for acquiring a casting image.
Carrying out graying processing on the obtained casting image to obtain a corresponding casting gray image; and extracting the edge of the casting by using a Canny operator according to the obtained casting gray level image to obtain a casting edge image.
The edge extraction using Canny operator is prior art and will not be described herein. In this embodiment, the casting image is acquired by using the RGB camera, and the casting image is grayed to obtain a corresponding casting grayscale image, and as another embodiment, the casting grayscale image may also be directly acquired by using the grayscale camera.
(2) Dividing edges in the edge graph of the casting to obtain a plurality of edge sections; grouping the edge sections according to the similarity between the edge sections, and extracting suspected under-injection areas in the groups;
when extracting edges using the Canny operator, non-maxima suppression and hysteresis thresholds are used to remove multiple response and low intensity edges within a single boundary neighborhood in order to obtain accurate edges. In the edge map of the casting, certain edges are influenced by the short notes, the strength of the edges at the short notes is small and is messy, so when the non-maximum value is inhibited and the lag threshold value is used for screening edge points, the edges are often cut off or distorted by the short notes and are divided into sections with different curvatures. In the divided segments, some segments are originally different parts on the same edge, in order to re-divide the different parts on the same edge, in this embodiment, the edge in the casting edge map is divided first to obtain a plurality of edge segments, and then the edge segments are grouped, and the specific process is as follows:
dividing the edge in the casting edge graph into different edge sections;
extracting edge points in the edge image of the casting, fitting the extracted edge points into a curve, and marking the curve as an edge curve;
calculating the curvature of each point on the edge curve to obtain an edge curvature curve and obtain the break points of the edge curvature curve;
the edge curve is divided into different edge segments with the break points as boundaries.
II, obtaining the description of each edge section;
obtaining a description of each edge segment, wherein the description of the edge segment refers to: the distribution of three parameters, namely the position, the distribution direction and the neighborhood texture of each edge point on the same edge segment in the graph in the sample space can be referred to as the edge segment. The acquisition process is as follows:
for each edge pixel point on each edge section, acquiring the position of the pixel point
Figure DEST_PATH_IMAGE040
Figure DEST_PATH_IMAGE042
Is the abscissa of the pixel point, and is,
Figure DEST_PATH_IMAGE044
the abscissa of the pixel point is;
calculating the gradient of each pixel point in the casting gray level image, and taking the gradient as the pixel value of the corresponding pixel point to obtain a casting gradient image; in the casting gradient image, each edge pixel point on each edge segment is processed to
Figure DEST_PATH_IMAGE046
Is the window width, order
Figure DEST_PATH_IMAGE048
Figure DEST_PATH_IMAGE050
Acquiring gray level co-occurrence matrix in the window, and calculating the contrast of the gray level co-occurrence matrix
Figure DEST_PATH_IMAGE052
Figure 754079DEST_PATH_IMAGE052
To indicate the depth of the texture in its window area;
Figure 414868DEST_PATH_IMAGE046
in order to correspond to the length of the edge segment,
Figure DEST_PATH_IMAGE054
for the abscissa offset in the gray level co-occurrence matrix,
Figure DEST_PATH_IMAGE056
is the ordinate offset in the gray level co-occurrence matrix; in this example
Figure DEST_PATH_IMAGE058
The number of the carbon atoms is 1,
Figure DEST_PATH_IMAGE060
1, as other embodiments, other values may be taken;
by the position of the pixel point
Figure 527442DEST_PATH_IMAGE040
And the depth of the texture in the neighborhood
Figure 615484DEST_PATH_IMAGE052
Sample descriptions that collectively make up each edge point
Figure DEST_PATH_IMAGE062
The sample descriptions of all the pixel points of the same edge segment jointly form a sample set of the edge segment
Figure DEST_PATH_IMAGE064
Figure DEST_PATH_IMAGE066
Is a sample set of a certain edge segment,
Figure DEST_PATH_IMAGE068
drawing a sample of the 1 st pixel point of the edge segment of the edgeIn the above-mentioned manner,
Figure DEST_PATH_IMAGE070
for the sample description of the 2 nd pixel point of the edge segment,
Figure DEST_PATH_IMAGE072
for the sample description of the 3 rd pixel point of the edge segment,
Figure DEST_PATH_IMAGE074
and describing the sample of the nth pixel point of the edge segment, wherein n is the number of the pixel points on the edge segment. Dividing the sample set into different clusters by using mean shift clustering, calculating the mean value and the variance of each cluster in a sample space, taking each cluster group as a sub-Gaussian model, and forming a Gaussian mixture model by all the sub-Gaussian models; in the sample space, the gaussian mixture model corresponding to each edge segment is the description of each edge segment.
And III, grouping the edge sections according to the similarity.
The similarity in this embodiment refers to the similarity between two edge segments, and if two edge segments are different parts on the same edge, the similarity between the two edge segments is high. And calculating the similarity between any two edge sections for all the edge sections, wherein the calculation formula of the similarity is as follows:
Figure DEST_PATH_IMAGE012A
wherein the content of the first and second substances,
Figure 413676DEST_PATH_IMAGE014
is the degree of similarity between the two edge segments,
Figure 612576DEST_PATH_IMAGE016
is the minimum distance between the pixel points on the two edge sections in the airspace,
Figure 609351DEST_PATH_IMAGE018
for two in the sample spaceKL divergence of the sample set corresponding to the edge segment; if the two edge segments are different parts of the same edge, the distribution of the sample sets corresponding to the two edge segments is similar in the sample space, and the more similar the distribution of the two edge segments is, the smaller the KL divergence of the distribution of the two edge segments is.
Setting a threshold value
Figure DEST_PATH_IMAGE076
If, if
Figure DEST_PATH_IMAGE078
If so, the two edge segments are not divided into the same group; if it is
Figure DEST_PATH_IMAGE080
The two edge segments are two adjacent edge segments on the same edge, and are divided into the same group; to this end, the edge segments on the same edge may be divided into a group.
The suspected under-annotated area is an area that may be under-annotated, and the suspected under-annotated area in each group is extracted after each edge segment is grouped in the embodiment, the process is as follows:
for any two adjacent edge segments in any group, marking the two edge segments as reference edge segments; connecting two nearest pixel points in the two reference edge sections to obtain an auxiliary line section;
acquiring all connected domains in the casting gradient map, and marking the connected domains intersected with the auxiliary line segments;
and acquiring a convex hull of the marked connected domain, wherein an image area contained by the convex hull is a suspected under-injection area, and pixel points in the suspected under-injection area are suspected under-injection points.
Any two adjacent edge sections in the same group correspond to suspected under-injection areas, so that a plurality of groups can obtain a plurality of suspected under-injection areas, and each suspected under-injection area comprises a plurality of suspected under-injection points.
(3) Obtaining the number of target under-injection points of each suspected under-injection area according to the mutation degree of each suspected under-injection area, and obtaining the under-injection degree of the casting according to the number of the target under-injection points of each suspected under-injection area;
the mutation degree of the suspected underinjected area refers to: the suspected under-annotated zone changes the degree of the change characteristic of the edge pixel points. For example, the edge where the gray value is gradually increased originally along the extension of the curve suddenly decreases the gray value in a certain region; the edge of which the gray value is gradually reduced along with the extension of the curve suddenly increases the gray value in a certain area; or edges where the gray values are originally uniform, there is a sudden change in the gray value.
Obtaining a reference texture curve corresponding to the suspected under-annotated area, wherein the process is as follows:
acquiring two adjacent edge sections corresponding to each suspected under-injection area as normal edge sections near the under-injection area; in a sample space, a normal edge section near a suspected under-injection area corresponds to a Gaussian mixture model, mean points of all sub-Gaussian models in the corresponding Gaussian mixture model are obtained, mean points of all the sub-Gaussian models of two Gaussian mixture models corresponding to the suspected under-injection area are fitted into a curve, the curve is marked as a reference texture curve, and the curve can represent gray level change of texture of the normal edge section.
Acquiring an under-injection texture curve corresponding to each suspected under-injection area, wherein the process is as follows:
for each suspected underinjection point in the same suspected underinjection area, to
Figure 71556DEST_PATH_IMAGE046
For window width, in the gradient image, let
Figure DEST_PATH_IMAGE082
Figure DEST_PATH_IMAGE084
Acquiring gray level co-occurrence matrix in the window, and calculating the contrast of the gray level co-occurrence matrix
Figure 419361DEST_PATH_IMAGE052
Figure 421952DEST_PATH_IMAGE052
To indicate the depth of the texture in its window area;
by the position of a suspected priming point
Figure 945337DEST_PATH_IMAGE040
And the depth of the texture in the neighborhood
Figure 578444DEST_PATH_IMAGE052
Sample descriptions that collectively make up each suspected point of underinjection
Figure DEST_PATH_IMAGE086
Sample descriptions of each suspected under-injection point in the same suspected under-injection area jointly form a sample set
Figure DEST_PATH_IMAGE088
Figure DEST_PATH_IMAGE090
Is a sample set of a suspected under-filled area,
Figure DEST_PATH_IMAGE092
is a sample description of the 1 st suspected priming point of the suspected priming area,
Figure DEST_PATH_IMAGE094
is a sample description of the 2 nd suspected priming point of the suspected priming area,
Figure DEST_PATH_IMAGE096
is a sample description of the 3 rd suspected underfilled point of the suspected underfilled area,
Figure DEST_PATH_IMAGE098
and describing the sample of the mth suspected under-injection point of the suspected under-injection area, wherein m is the number of the suspected under-injection points of the suspected under-injection area. Dividing the sample set into different clusters by mean shift clustering, calculating the mean and variance of each cluster in the sample space, and using each cluster group as a sub-Gaussian modelThe models jointly form a Gaussian mixture model; obtaining the mean value points of each sub-Gaussian model in the corresponding Gaussian mixture model, fitting all the mean value points into a curve, and recording the curve as an under-annotated texture curve; this curve represents the change in gray level of the texture at each suspected underslung point in the suspected underslung area.
And (3) calculating the mutation degree of each suspected under-injection area, wherein the process is as follows:
extracting a reference texture curve and an under-injection texture curve of each suspected under-injection area in a sample space, and determining the longest common subsequence of the two curves of each suspected under-injection area by using a search strategy;
the mutation degree of each suspected under-annotated zone is calculated by the following formula:
Figure DEST_PATH_IMAGE002A
wherein the content of the first and second substances,
Figure 986116DEST_PATH_IMAGE022
the mutation degree of a suspected under-filled region,
Figure DEST_PATH_IMAGE100
the length of the longest public subsequence corresponding to the suspected under-annotated zone is obtained;
Figure DEST_PATH_IMAGE102
the length of the under-injection texture curve corresponding to the suspected under-injection area;
Figure DEST_PATH_IMAGE104
the Hausdorff distance of the reference texture curve and the understeer texture curve corresponding to the suspected understeer area;
Figure DEST_PATH_IMAGE106
and
Figure DEST_PATH_IMAGE108
both represent the degree of similarity of the reference texture curve and the under-annotated texture curve, respectively; the more similar the two curves are to each other,
Figure 323556DEST_PATH_IMAGE022
the larger the value, the maximum is 1.
The larger the similarity degree of the reference texture curve and the under-annotated texture curve is, the larger the mutation of the suspected under-annotated area is; the texture change at the suspected under-noted area is opposite to the texture change of the normal edge, so when a texture in the opposite direction is obtained, the texture curves in the parameter space should be the same under ideal conditions, but under actual conditions, the suspected under-noted areas are not necessarily under-noted points, the number of true under-noted points in the under-noted areas is presumed according to the mutation degree of the suspected under-noted areas, and the formula for specifically calculating the number of target under-noted points in each suspected under-noted area is as follows:
Figure DEST_PATH_IMAGE020A
wherein the content of the first and second substances,
Figure 435868DEST_PATH_IMAGE024
the area of a certain suspected under-injection area is the number of suspected under-injection points in the suspected under-injection area;
Figure 36614DEST_PATH_IMAGE026
the number of target deli points in the suspected deli area, namely the number of real deli points, is obtained.
The degree of short shot is the severity of short shot of the casting, and the present embodiment calculates the degree of short shot using the following formula:
Figure DEST_PATH_IMAGE028A
wherein the content of the first and second substances,
Figure 359011DEST_PATH_IMAGE030
for the degree of under-pour severity of the casting,
Figure 437825DEST_PATH_IMAGE032
a serial number indicating a suspected under-annotated zone,
Figure 139065DEST_PATH_IMAGE034
is shown as
Figure 910712DEST_PATH_IMAGE032
The number of target under-injection points in each suspected under-injection area;
Figure 126930DEST_PATH_IMAGE036
indicating the number of suspected under-annotated zones;
Figure 838796DEST_PATH_IMAGE038
and the area of the casting is represented, namely the number of pixel points contained in the casting image.
(4) And regulating and controlling the parameters of the die casting machine according to the degree of short injection of the casting.
In actual production process, need adjust the parameter in the die casting machine production process according to the degree of injectivity of foundry goods, what this embodiment adjusted specifically is:
the FC deep neural network is used firstly, the adjusting parameters of the die casting machine are inferred according to the degree of undercharge of the casting, and then all the parameters in the production of the die casting machine are adjusted to be corresponding adjusting parameters. In the training process of the FC network: the adopted data is the degree of the short notes of the castings obtained based on big data, wherein 80% of the data is a training set, and the rest is a verification set; the label is the corresponding die casting machine adjusting parameter; the loss function uses the mean square error.
In this embodiment, the adjustment parameters of the die casting machine are inferred based on the trained FC deep neural network, and as another embodiment, the corresponding relationship between the degrees of insufficient injection of different castings and the adjustment parameters of the die casting machine may be established according to experience instead of based on the trained FC deep neural network, and then the adjustment parameters of the die casting machine are determined according to the corresponding relationship and the current degrees of insufficient injection of the castings.
In the embodiment, errors caused when the edges of the casting image are extracted by using the edge detection operator are considered, the obtained suspected under-injection areas are further calculated, namely, the target under-injection point number of each suspected under-injection area is calculated according to the mutation degree of each suspected under-injection area, so that the more accurate real under-injection point number is obtained, the accuracy of judging the degree of under-injection of the casting is further improved, and the problem that the judgment of the degree of under-injection of the casting is inaccurate in the existing method is solved.
It should be noted that while the preferred embodiments of the present invention have been described, additional variations and modifications to these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.

Claims (6)

1. A die casting machine parameter regulation and control method based on machine vision is characterized by comprising the following steps:
acquiring a casting image, and extracting the edge of the casting image by using an edge detection operator to obtain a casting edge image;
dividing edges in the edge graph of the casting to obtain a plurality of edge sections; grouping the edge sections according to the similarity between the edge sections, and extracting suspected under-injection areas in the groups;
obtaining the number of target under-injection points of each suspected under-injection area according to the mutation degree of each suspected under-injection area, and obtaining the under-injection degree of the casting according to the number of the target under-injection points of each suspected under-injection area;
regulating and controlling parameters of the die casting machine according to the degree of undercasting of the casting;
the method for acquiring the mutation degree of each suspected under-injection area comprises the following steps:
for any suspected under-filled area:
acquiring an adjacent edge section corresponding to the suspected under-injection area as a normal edge section corresponding to the under-injection area; obtaining a Gaussian mixture model corresponding to a normal edge section corresponding to the suspected under-annotated zone, fitting the mean value points of all sub-Gaussian models in the Gaussian mixture model corresponding to the suspected under-annotated zone into a curve, and marking the curve as a reference texture curve;
constructing a sample set of the suspected under-injection area according to the position of each suspected under-injection point in the suspected under-injection area and the depth degree of the texture in the neighborhood; obtaining a Gaussian mixture model corresponding to the sample set of the suspected under-annotated zone, fitting the mean values of all sub-Gaussian models in the Gaussian mixture model into a curve, and marking as an under-annotated texture curve;
determining the longest common subsequence of a reference texture curve and a short-bet texture curve by using a search strategy, and calculating the mutation degree of the suspected short-bet area according to the length of the short-bet texture curve and the length of the longest common subsequence;
calculating the mutation degree of each suspected under-injection area by using the following calculation formula:
Figure DEST_PATH_IMAGE002
wherein the content of the first and second substances,
Figure DEST_PATH_IMAGE004
the mutation degree of a suspected under-filled region,
Figure DEST_PATH_IMAGE006
the length of the longest public subsequence corresponding to the suspected under-annotated zone;
Figure DEST_PATH_IMAGE008
the length of the under-injection texture curve corresponding to the suspected under-injection area;
Figure DEST_PATH_IMAGE010
the Hausdorff distance of the reference texture curve and the understeer texture curve corresponding to the suspected understeer area;
calculating the number of target under-injection points of each suspected under-injection area by using the following calculation formula:
Figure DEST_PATH_IMAGE012
wherein the content of the first and second substances,
Figure DEST_PATH_IMAGE014
the mutation degree of a suspected underinjection point,
Figure DEST_PATH_IMAGE016
the number of suspected under-annotating points in the suspected under-annotating area is determined;
Figure DEST_PATH_IMAGE018
the number of target under-injection points in the suspected under-injection area is determined;
the degree of short shot of the casting is calculated by the following formula:
Figure DEST_PATH_IMAGE020
wherein the content of the first and second substances,
Figure DEST_PATH_IMAGE022
the degree of the short shot of the casting is,
Figure DEST_PATH_IMAGE024
the serial number of the suspected under-annotated zone,
Figure DEST_PATH_IMAGE026
is a first
Figure 230633DEST_PATH_IMAGE024
The number of target under-injection points in each suspected under-injection area;
Figure DEST_PATH_IMAGE028
the number of suspected under-annotated zones;
Figure DEST_PATH_IMAGE030
is the area of the casting.
2. The machine vision-based method for regulating and controlling parameters of a die casting machine according to claim 1, wherein the step of dividing the edge of the casting edge map into a plurality of edge segments comprises:
extracting edge points in the edge image of the casting, fitting the extracted edge points into a curve, and marking the curve as an edge curve;
calculating the curvature of each point on the edge curve to obtain an edge curvature curve and obtain the break points of the edge curvature curve;
the edge curve is divided into different edge segments with the break points as boundaries.
3. The machine-vision-based parameter regulation and control method of a die casting machine of claim 1, wherein the grouping of the edge segments according to the similarity between the edge segments comprises:
constructing descriptions of the edge sections according to the positions of the pixel points on the edge sections and the texture depth degree in the neighborhood;
calculating the similarity between any two edge segments according to the description of each edge segment;
judging the magnitude relation between the similarity and a similarity threshold, and dividing two corresponding edge sections into a group if the similarity is greater than or equal to the similarity threshold; and if the similarity is smaller than the similarity threshold value, dividing the two corresponding edge sections into different groups.
4. The machine vision-based parameter control method for the die casting machine according to claim 3, wherein the method for obtaining the texture depth in the neighborhood of the pixel point on each edge segment comprises:
calculating the gradient of each pixel point in the casting gray level image, and taking the gradient as the pixel value of the corresponding pixel point to obtain a casting gradient image;
in the casting gradient image, for each edge pixel point on each edge segment: and taking the length of the corresponding edge segment as the window width, acquiring a gray level co-occurrence matrix in the window, calculating the contrast of the gray level co-occurrence matrix, and recording the contrast as the texture depth degree in the neighborhood.
5. The machine vision-based die casting machine parameter regulating and controlling method as claimed in claim 3, wherein the similarity between any two edge segments is calculated by using the following formula:
Figure DEST_PATH_IMAGE032
wherein the content of the first and second substances,
Figure DEST_PATH_IMAGE034
is the degree of similarity between the two edge segments,
Figure DEST_PATH_IMAGE036
is the minimum distance between the pixel points on the two edge segments in the airspace,
Figure DEST_PATH_IMAGE038
the KL divergence degrees of the sample sets corresponding to the two edge segments in the sample space.
6. The machine-vision-based parameter regulation and control method of a die-casting machine according to claim 1, wherein the extracting the suspected under-injection regions in each group comprises:
for any two adjacent edge segments within any group:
recording the two adjacent edge sections as reference edge sections, and connecting two nearest pixel points in the two reference edge sections to obtain an auxiliary line section;
acquiring all connected domains in the casting gradient map, and marking the connected domains intersected with the auxiliary line segments;
and acquiring a convex hull of the marked connected domain, and marking an image area contained by the convex hull as a suspected under-injection area.
CN202210290716.4A 2022-03-23 2022-03-23 Machine vision-based die casting machine parameter regulation and control method Active CN114387438B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210290716.4A CN114387438B (en) 2022-03-23 2022-03-23 Machine vision-based die casting machine parameter regulation and control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210290716.4A CN114387438B (en) 2022-03-23 2022-03-23 Machine vision-based die casting machine parameter regulation and control method

Publications (2)

Publication Number Publication Date
CN114387438A CN114387438A (en) 2022-04-22
CN114387438B true CN114387438B (en) 2022-06-10

Family

ID=81205662

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210290716.4A Active CN114387438B (en) 2022-03-23 2022-03-23 Machine vision-based die casting machine parameter regulation and control method

Country Status (1)

Country Link
CN (1) CN114387438B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114972354B (en) * 2022-08-02 2022-10-28 济宁金筑新型建材科技有限公司 Image processing-based autoclaved aerated concrete block production control method and system
CN116778344B (en) * 2023-08-17 2023-12-05 牧马人(山东)勘察测绘集团有限公司 Land arrangement boundary line dividing method based on visual technology
CN116823808B (en) * 2023-08-23 2023-11-17 青岛豪迈电缆集团有限公司 Intelligent detection method for cable stranded wire based on machine vision
CN116993966B (en) * 2023-09-27 2023-12-12 诺伯特智能装备(山东)有限公司 Casting polishing vision intelligent positioning method and system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE50107827D1 (en) * 2001-03-16 2005-12-01 Yxlon Int X Ray Gmbh Use of a cylindrical test specimen and method for producing the same
CN101634551B (en) * 2009-08-18 2011-04-13 清华大学深圳研究生院 Method and system for detecting surface roughness
KR101271795B1 (en) * 2011-08-10 2013-06-07 주식회사 포스코 Apparatus and method for detecting strip defects in strip casting process
CN107230203B (en) * 2017-05-19 2021-06-08 重庆立洋机电设备制造有限公司 Casting defect identification method based on human eye visual attention mechanism
CN109035236B (en) * 2018-07-27 2024-02-23 深圳市闿思科技有限公司 Casting burr detection method and device
CN110163853B (en) * 2019-05-14 2021-05-25 广东奥普特科技股份有限公司 Edge defect detection method
CN113538433B (en) * 2021-09-17 2021-11-26 海门市创睿机械有限公司 Mechanical casting defect detection method and system based on artificial intelligence

Also Published As

Publication number Publication date
CN114387438A (en) 2022-04-22

Similar Documents

Publication Publication Date Title
CN114387438B (en) Machine vision-based die casting machine parameter regulation and control method
CN115082467B (en) Building material welding surface defect detection method based on computer vision
CN114862862A (en) Pump body cold shut defect identification method and system based on image processing
CN115100191B (en) Metal casting defect identification method based on industrial detection
CN116740070B (en) Plastic pipeline appearance defect detection method based on machine vision
CN114972203A (en) Mechanical part rolling abnormity detection method based on watershed segmentation
CN115690108A (en) Aluminum alloy rod production quality evaluation method based on image processing
CN115082429B (en) Aluminum bar defect detection method based on image processing
CN116740072B (en) Road surface defect detection method and system based on machine vision
CN110717900B (en) Pantograph abrasion detection method based on improved Canny edge detection algorithm
CN111354047B (en) Computer vision-based camera module positioning method and system
CN116228768B (en) Method for detecting scratches on surface of electronic component
CN115359053A (en) Intelligent detection method and system for defects of metal plate
CN116740054B (en) Tongue image tooth trace detection method based on image processing
CN104614386A (en) Lens defects type identification method
CN112884746A (en) Character defect intelligent detection algorithm based on edge shape matching
CN114943848A (en) Crack identification method in nickel screen laser cladding process
CN114119603A (en) Image processing-based snack box short shot defect detection method
CN115115603A (en) Automobile accessory flywheel surface detection method based on artificial intelligence
CN117094975A (en) Method and device for detecting surface defects of steel and electronic equipment
CN116664584B (en) Intelligent feedback regulating system for production of thin-wall zinc alloy die casting die
CN116883415B (en) Thin-wall zinc alloy die casting quality detection method based on image data
CN115205317B (en) Bridge monitoring photoelectric target image light spot center point extraction method
CN117173661A (en) Asphalt road quality detection method based on computer vision
CN112033419A (en) Method, electronic device, and medium for detecting automatic port driving lane line

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant