CN117058151B - Hub detection method and system based on image analysis - Google Patents

Hub detection method and system based on image analysis

Info

Publication number
CN117058151B
CN117058151B (application CN202311321399.9A)
Authority
CN
China
Prior art keywords
hub
module
value
color
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311321399.9A
Other languages
Chinese (zh)
Other versions
CN117058151A (en)
Inventor
李洪光
孙谱
李东东
丁有望
朱宝阵
贾子瑞
秦作峰
穆瑞鹏
张皓
赵强
孙涛
许震
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Juncheng Metal Technology Co ltd
Original Assignee
Shandong Juncheng Metal Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Juncheng Metal Technology Co ltd filed Critical Shandong Juncheng Metal Technology Co ltd
Priority to CN202311321399.9A priority Critical patent/CN117058151B/en
Publication of CN117058151A publication Critical patent/CN117058151A/en
Application granted granted Critical
Publication of CN117058151B publication Critical patent/CN117058151B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T 7/0006 — Industrial image inspection using a design-rule based approach
    • G06T 7/001 — Industrial image inspection using an image reference approach
    • G06T 7/12 — Edge-based segmentation
    • G06T 7/60 — Analysis of geometric attributes
    • G06T 7/90 — Determination of colour characteristics
    • G06T 2207/10024 — Color image (image acquisition modality)
    • G06T 2207/30164 — Workpiece; Machine component (industrial image inspection)
    • Y02P 90/30 — Computing systems specially adapted for manufacturing

Abstract

The invention relates to a hub detection method and detection system based on image analysis, in the technical field of image analysis processing. The detection method comprises the steps of determining a comparison area, determining pixel standard color values, determining error values, scanning hubs to be detected, comparing the hubs to be detected, positioning the hubs to be detected, and performing the final comparison. The detection system comprises an input module, a scanning module, a labeling module, an adjustment module, a cache module, a color difference calculation module, a color difference comparison module, a selection module, an alarm module and the like. The invention enables systematic detection, reduces the labor intensity of inspection personnel, detects a plurality of hubs at one time to improve detection efficiency, and bases the detection of subsequent hubs on the first batch of hubs so as to improve the actual detection precision.

Description

Hub detection method and system based on image analysis
Technical Field
The invention relates to the technical field of image analysis processing, in particular to a hub detection method and a detection system based on image analysis.
Background
With the increasing volume of automobile maintenance on the market, demand for hubs is also increasing. Today, the demands placed on a hub concern not only its performance but also its appearance. Because hubs are often damaged over the course of a vehicle's use, an owner may choose to replace one; when doing so, the owner usually replaces only the damaged hub to keep the replacement cost down, and selects a new hub of the same model as the other hubs in use, so that all hubs on the vehicle keep a consistent style and the vehicle keeps its aesthetic appeal. Hubs of the same model should therefore have a consistent appearance in order to meet owners' requirements.
At present, hub appearance is generally inspected by eye: a standard hub is placed on one turntable and a hub to be detected on another, both turntables are rotated simultaneously, and an inspector observes the differences between the two hubs with the naked eye to judge whether the hub to be detected is qualified.
Judging whether a hub is qualified by the naked eye has many drawbacks: the judgment is overly subjective, the labor intensity for inspectors is high, and the detection efficiency is low; in large-scale production, the detection efficiency of this method can hardly match the production efficiency.
Disclosure of Invention
In order to improve the detection accuracy, improve the detection efficiency and reduce the labor intensity during detection, the invention provides a hub detection method and a detection system based on image analysis.
In a first aspect, the present invention provides a hub detection method based on image analysis, which adopts the following technical scheme:
a hub detection method based on image analysis comprises the following steps:
determining a comparison area: according to the size of the hub, determining an effective coordinate area (x, y), x ∈ [x0, x1], y ∈ [y0, y1];
determining pixel standard color values: determining the pixel standard color value Ḡ(x,y) of each pixel point in the effective coordinate area as the reference sample;
determining an error value: according to the number j of color regions in the hub, determining the range A_j of each color region, and setting an error value A_j^c for each color region;
scanning hubs to be detected: selecting M hubs as pieces to be detected, and scanning and analyzing them to obtain images of the hubs to be detected, each of which likewise comprises a hub body area and a ground color area;
comparing hubs to be detected: performing temporary coordinate value assignment on the image of the mth hub to be detected, and obtaining the temporary color value G_m(x,y) of the image of the mth hub to be detected by the color value extraction method; all hub body areas of the M images of hubs to be detected fall into the effective coordinate area; within the effective coordinate area, calculating the temporary chromatic aberration G_m^c' between the current pixels of the image of the mth hub to be detected and the pixel standard color values Ḡ(x,y) of the reference sample; then translating and rotating the image of the mth hub to be detected so as to continuously update the coordinate values of its pixels, and continuously obtaining the temporary color value G_m(x,y) of the pixels of the image of the mth hub to be detected and the temporary chromatic aberration G_m^c';
wherein the calculation model of the temporary chromatic aberration G_m^c' is as follows:
G_m^c' = Σ_{(x,y) ∈ effective area} G^c(G_m(x,y), Ḡ(x,y))
wherein, when calculating the chromatic aberration of two colors, the calculation model is:
G^c(A, B) = √((r_A − r_B)² + (g_A − g_B)² + (b_A − b_B)²)
wherein: A is color A, B is color B, r is the red component value, g is the green component value, and b is the blue component value;
positioning the hub to be detected: taking the temporary color value G_m(x,y) of the pixels of the image of the mth hub to be detected at which the temporary chromatic aberration G_m^c' is minimal as the color value G_m(x,y) of the pixels of the image of the mth hub to be detected;
comparison: in each independent region A_j, performing a difference operation between the pixels of the image of the mth hub to be detected and the standard color values Ḡ(x,y) to obtain the average color difference Ḡ_j^c; if the average color difference Ḡ_j^c is greater than the error value A_j^c, the hub to be detected is judged to be an unqualified hub.
Optionally, a reference sample scanning step and a reference sample marking step are further arranged before the step of determining the comparison area;
reference sample scan: selecting N hubs as reference samples, and carrying out scanning analysis on the reference sample hubs to obtain images of the reference sample hubs, wherein the images of the reference sample hubs comprise hub body areas and ground color areas;
reference sample markers: the 1 st reference sample hub image is subjected to coordinate labeling, and the color value G of each pixel in the 1 st reference sample hub image is obtained by a color value extraction method 1 (x, y), (x, y) being the coordinates of the pixels on the image of the reference sample hub;
in the step of determining the contrast area, the hub body area of the image of the 1 st reference sample hub falls into the effective coordinate area;
after the step of determining the comparison area, a step of determining chromatic aberration in the comparison area and a step of positioning a reference sample are further arranged;
determining the color difference in the comparison area: performing temporary coordinate value assignment on the image of the nth reference sample hub, and obtaining the temporary color value G_n(x,y) of the image of the nth reference sample hub by the color value extraction method; within the effective coordinate area, calculating the temporary chromatic aberration G_n^c' between the image of the nth reference sample hub and the image of the 1st reference sample hub; then translating and rotating the image of the nth reference sample hub so as to continuously update the coordinate values of its pixels, and continuously obtaining the temporary color value G_n(x,y) of the pixels of the image of the nth reference sample hub and the temporary chromatic aberration G_n^c' between the image of the nth reference sample hub and the image of the 1st reference sample hub;
wherein the calculation model of the temporary chromatic aberration G_n^c' is as follows:
G_n^c' = Σ_{(x,y) ∈ effective area} G^c(G_n(x,y), G_1(x,y))
positioning the reference sample: taking the temporary color value G_n(x,y) of the pixels of the image of the nth reference sample hub at which the temporary chromatic aberration G_n^c' is minimal as the color value G_n(x,y) of the pixels of the image of the nth reference sample hub;
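The positioning step searches over poses of the scanned image for the one minimizing the total chromatic aberration. A brute-force sketch with NumPy, restricted to integer translations for brevity (rotation would be handled analogously); the array layout and function names are assumptions:

```python
import numpy as np

def total_color_difference(img, ref):
    """Temporary chromatic aberration: sum of per-pixel Euclidean RGB
    differences over the effective coordinate area."""
    d = img.astype(float) - ref.astype(float)
    return float(np.sqrt((d ** 2).sum(axis=-1)).sum())

def position_sample(img, ref, max_shift=2):
    """Try every integer translation in [-max_shift, max_shift]^2 and keep
    the candidate whose total chromatic aberration against ref is minimal."""
    best_diff, best_img = float("inf"), img
    for dx in range(-max_shift, max_shift + 1):
        for dy in range(-max_shift, max_shift + 1):
            cand = np.roll(img, (dx, dy), axis=(0, 1))
            diff = total_color_difference(cand, ref)
            if diff < best_diff:
                best_diff, best_img = diff, cand
    return best_img
```

In practice the patent's continuous translate-and-rotate update would replace this exhaustive integer search, but the minimization criterion is the same.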
in the step of determining the pixel standard color value, the calculation model of Ḡ(x,y) is as follows:
Ḡ(x,y) = (1/N) · Σ_{n=1}^{N} G_n(x,y)
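The standard color value at each coordinate is determined from the color values of the N registered reference images. A sketch under the assumption that this is a per-pixel mean over same-shaped arrays:

```python
import numpy as np

def pixel_standard_color_values(samples):
    """Mean color value at each coordinate over the N registered
    reference sample images (stacked into an N x H x W x 3 array)."""
    return np.stack(samples).astype(float).mean(axis=0)
```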
optionally, after the pixel standard color value is determined, a hub body area dividing step and a hub body area internal contour determining step are further arranged;
hub body region division: extracting the color value G of the ground color d If in the effective area, the standard color value of the pixel on a single coordinateColor value G with ground color d Pixel color difference G between c (x, y) is less than a first threshold value, and the current pixel standard color value +.>Standard color value +.>、/>Color difference G of adjacent pixels of (1) c When the condition that the set of (x+/-1, y+/-1) is larger than the second threshold value exists, taking the set of coordinates (x, y) as a temporary contour line L of the hub sample area 0 The method comprises the steps of carrying out a first treatment on the surface of the And obtain temporary contour edge L 0 Is a function f of (2) L0 (x,y);
determining the internal contour of the hub body region: within the contour edge line L_0 of the hub body region, if the color differences G^c(x±1, y±1) between the current pixel standard color value Ḡ(x,y) and the standard color values Ḡ(x±1,y), Ḡ(x,y±1) of its adjacent pixels are greater than a third threshold, and the color differences G^c(x±x2, y±y2) between the current pixel standard color value Ḡ(x,y) and the standard color values of the pixels at offsets (±x2, ±y2) are greater than a fourth threshold, then the set of such coordinates (x,y) is taken as the contour edge line L_i of the different color areas within the hub body;
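Both contour steps mark coordinates whose color difference to neighbouring pixels exceeds a threshold. A simplified sketch that flags 4-neighbour jumps in a standard-color-value image (the array layout is an assumption, and the patent's additional condition at offset (x2, y2) is omitted for brevity):

```python
import numpy as np

def contour_coordinates(std_img, threshold):
    """Boolean mask of pixels whose Euclidean RGB difference to the pixel
    above or to the left exceeds threshold - candidate contour points."""
    img = std_img.astype(float)
    diff_rows = np.sqrt(((img[1:, :] - img[:-1, :]) ** 2).sum(axis=-1))
    diff_cols = np.sqrt(((img[:, 1:] - img[:, :-1]) ** 2).sum(axis=-1))
    mask = np.zeros(img.shape[:2], dtype=bool)
    mask[1:, :] |= diff_rows > threshold
    mask[:, 1:] |= diff_cols > threshold
    return mask
```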
in the step of determining the error value, the contour edge line L_0 and the contour edge lines L_i are taken as references to divide the interior of the hub sample area into a plurality of independent regions A_j, and the allowable estimated error value A_j^c' of each region is calculated;
the calculation model of the estimated error value A_j^c' is as follows:
wherein: Ā_j is the average color value of the pixels within A_j;
allowable error value A for each region j c Equal to the allowed estimated error value a for each region j c’
Optionally, after the step of determining the error value, an error value compensation step is further provided;
error value compensation: compensating the estimated error value, and further calculating the allowable error value A of each region j c
the updated calculation model of the error value A_j^c is as follows:
optionally, a noise reduction step is further arranged between the hub body region dividing step and the hub body region internal contour determining step;
noise reduction: for temporary contour line L 0 Removing the line segment with the middle length less than L, and then removing the temporary contour edge line L 0 Performing integral operation, taking the coordinate set of the contour line when the integral value is the maximum value as the contour line L of the hub sample area 0 Is a set of coordinates of (a);
wherein: l (L) 0 The calculation model of (2) is as follows:
in a second aspect, the invention provides a hub detection system based on image analysis, which adopts the following technical scheme:
a hub detection system based on image analysis, comprising the following modules:
an input module: for inputting effective coordinate areas, x e [ x0, x1]],y∈[y0,y1]And is also used for inputting the standard color value of each pixel point in the effective coordinate areaThe model of the hub and the color value G of the ground color d Number j of color zones in the hub, extent A of each color zone j And an error value a for each color region j c
And a scanning module: the device is used for scanning the hub to be detected, so as to obtain an image of the hub;
and the marking module is used for: the input end is connected with the output end of the scanning module and is used for carrying out coordinate marking on the scanned image and extracting color values on the coordinates;
and an adjustment module: the input end is connected with the output end of the scanning module, and the output end is connected with the output end of the labeling module and is used for translating or rotating the image;
and a cache module: the input end is connected with the labeling module and the output end of the input module and is used for caching data;
and a color difference calculation module: the input end is connected with the output end of the buffer module, and the output end is connected with the input end of the buffer module and is used for calculating the chromatic aberration between coordinates, between coordinates and a coordinate area and between coordinates and a coordinate area;
and a color difference comparison module: the input end is connected with the output end of the buffer module, and the output end is connected with the input end of the buffer module and is used for comparing the color difference;
and a selection module: the input end is connected with the output end of the cache module, and the output end is connected with the input end of the cache module and used for selecting data in the cache module;
and an alarm module: the input end is connected with the output end of the buffer module and is used for giving an alarm.
Optionally, the system further comprises a color value calculation module, a function definition module, a function calculation module and a storage module,
and the color value calculation module is used for: the input end is connected with the output end of the cache module, and the output end is connected with the input end of the storage module and is used for calculating the coordinates and the average color value of the coordinate area;
function definition module: the input end is connected with the output end of the cache module, and the output end is connected with the input end of the cache module and is used for defining a function according to coordinates;
and a function calculation module: the input end is connected with the output end of the cache module, and the output end is connected with the input end of the cache module and is used for calling the function and carrying out data operation according to the function;
and a storage module: range A for storing hub model, effective coordinate area, each color area j And an error value a for each color region j c
Optionally, the input module is further configured to input the number N of reference sample hubs, the length threshold l, and the first threshold, the second threshold, the third threshold, and the fourth threshold for determining.
Optionally, the device also comprises a judging module,
and a judging module: the input end is connected with the input module and the output end of the storage module and is used for judging whether the information of the same type of hub is recorded in the storage module.
In summary, the present invention includes at least one of the following beneficial technical effects:
1. Through the setting of the hub comparison, hub positioning and comparison steps, the pixels of the image of the hub to be detected can be directly compared with the pixel standard color values Ḡ(x,y) to judge whether the hub is qualified, reducing the probability of human error; during detection, a plurality of hubs to be detected can be inspected simultaneously, which improves detection efficiency and reduces the labor intensity of inspection personnel.
2. Through the reference sample scanning, reference sample marking, comparison-area color difference determination and reference sample positioning steps, the pixel standard color value Ḡ(x,y) is made more accurate, which reduces the influence of errors between design and actual machining on detection and improves detection precision.
3. Through the hub body region division, hub body region internal contour determination and error value compensation steps, the error value A_j^c used as the judgment scale is made more accurate, which further improves the accuracy of detection.
Drawings
FIG. 1 is a schematic flow chart of the sample detection step in example 1;
FIG. 2 is a schematic flow chart of the step of inspecting the hub in embodiment 1;
fig. 3 is a system diagram of example 2.
Detailed Description
The invention is described in further detail below in connection with fig. 1 to 3.
Example 1: the embodiment discloses a hub detection method based on image analysis, referring to fig. 1 and 2, the hub detection method based on image analysis comprises the following steps:
s1: sample detection: and selecting N hubs as samples, detecting the samples, and further obtaining reference information required by subsequent detection. The sample detection step S1 includes the following steps.
S1-1: reference sample scan: n hubs are selected as reference samples, scanning analysis is carried out on the reference sample hubs, images of the reference sample hubs are obtained, and the images of the reference sample hubs comprise hub body areas and ground color areas.
The ground color area is the color of the workbench on which the hub is placed during scanning; it differs from every color in the hub body so that the workbench can be readily distinguished.
S1-2: reference sample marking: performing coordinate labeling on the image of the 1st reference sample hub, and obtaining the color value G_1(x,y) of each pixel in the image of the 1st reference sample hub by the color value extraction method, (x,y) being the coordinates of the pixels on the image of the reference sample hub.
S1-3: determining a comparison area: selecting an effective coordinate area (x, y), x ∈ [x0, x1], y ∈ [y0, y1], such that the hub body area of the image of the 1st reference sample hub falls within the effective coordinate area.
S1-4: determining the color difference in the comparison area: performing temporary coordinate value assignment on the image of the nth reference sample hub, and obtaining the temporary color value G_n(x,y) of the image of the nth reference sample hub by the color value extraction method; within the effective coordinate area, calculating the temporary chromatic aberration G_n^c' between the image of the nth reference sample hub and the image of the 1st reference sample hub; then translating and rotating the image of the nth reference sample hub so as to continuously update the coordinate values of its pixels, and continuously obtaining the temporary color value G_n(x,y) of the pixels of the image of the nth reference sample hub and the temporary chromatic aberration G_n^c' between the image of the nth reference sample hub and the image of the 1st reference sample hub.
wherein the calculation model of the temporary chromatic aberration G_n^c' is as follows:
G_n^c' = Σ_{(x,y) ∈ effective area} G^c(G_n(x,y), G_1(x,y))
wherein, when calculating the chromatic aberration of two colors, the calculation model is:
G^c(A, B) = √((r_A − r_B)² + (g_A − g_B)² + (b_A − b_B)²)
wherein: A is color A, B is color B, r is the red component value, g is the green component value, and b is the blue component value.
Because the effective coordinate area was selected in S1-3, each pixel can be made to correspond to every coordinate within the effective area as the image of the nth reference sample hub is translated and rotated; the error of the temporary chromatic aberration G_n^c' can thus be reduced when it is calculated.
S1-5: positioning the reference sample: taking the temporary color value G_n(x,y) of the pixels of the image of the nth reference sample hub at which the temporary chromatic aberration G_n^c' is minimal as the color value G_n(x,y) of the pixels of the image of the nth reference sample hub.
Since the sample hub is placed on the workbench, an unavoidable mounting error exists, which may be a translation or a rotation about the hub axis. By continuously translating and rotating the image, and taking the temporary color value G_n(x,y) at which the temporary chromatic aberration G_n^c' is minimal as the color value G_n(x,y) of the pixels of the image of the nth reference sample hub, the N sample hub images are effectively "registered", which facilitates the subsequent steps.
S1-6: determining pixel standard color values: determining the standard color value Ḡ(x,y) of the pixel at each coordinate.
wherein the calculation model of Ḡ(x,y) is as follows:
Ḡ(x,y) = (1/N) · Σ_{n=1}^{N} G_n(x,y)
S1-7: hub body region division: extracting the color value G_d of the ground color; if, within the effective area, the pixel color difference G^c(x,y) between the pixel standard color value Ḡ(x,y) at a single coordinate and the ground color value G_d is less than a first threshold, while the color differences G^c(x±1, y±1) between the current pixel standard color value Ḡ(x,y) and the standard color values Ḡ(x±1,y), Ḡ(x,y±1) of its adjacent pixels are greater than a second threshold, then the set of such coordinates (x,y) is taken as the temporary contour edge line L_0 of the hub sample area, and the function f_L0(x,y) of the temporary contour edge line L_0 is obtained.
The color value G_d of the ground color is determined according to the actual color of the workbench; when the first and second thresholds are set, both should be smaller than the color difference between the ground color and any color in the hub body, and the second threshold should be larger than the first threshold.
S1-8: noise reduction: for temporary contour line L 0 Removing the line segment with the middle length less than L, and then removing the temporary contour edge line L 0 Performing integral operation, taking the coordinate set of the contour line when the integral value is the maximum value as the contour line L of the hub sample area 0 Is a set of coordinates of (a);
wherein: l (L) 0 The calculation model of (2) is as follows:
after the step S1-8 of noise reduction, the contour line L of the hub sample area 0 The mixed contour lines in the hub body region are removed, so that the influence on the subsequent determination of the internal contour of the hub body region is reduced.
S1-9: determination of the internal profile of the hub body region: profile L in the hub body region 0 In, if the standard color value of the current pixelStandard color value +.>、/>Color difference G of adjacent pixels of (1) c (x+ -1, y+ -1) is greater than a third threshold and the current pixel standard color value +.>Standard color value with adjacent pixelsColor difference G of adjacent pixels of (1) c When (x+/-x 2, y+/-y 2) is larger than the fourth threshold value, the set of coordinates (x, y) is taken as the contour side line L of the different color areas in the hub body i
When the third and fourth thresholds are set, both should be smaller than the color difference between any two colors in the hub body, and the fourth threshold should be x2 times the third threshold, so as to avoid broken lines or mixed contour lines in L_i.
S1-10: determining error values: according to the contour edge line L_0 and the contour edge lines L_i, dividing the interior of the hub sample area into a plurality of independent regions A_j, and calculating the allowable estimated error value A_j^c' of each region.
The calculation model of the estimated error value A_j^c' is as follows:
wherein: Ā_j is the average color value of the pixels within A_j;
S1-11: error value compensation: compensating the estimated error value, and thereby calculating the allowable error value A_j^c of each region.
The calculation model of the error value A_j^c is as follows:
after the sample detection step S1 is completed, a hub detection step S2 is executed; the step S2 of inspecting the hub includes the following steps.
S2-1: scanning a hub to be detected: m hubs are selected as pieces to be detected, scanning analysis is carried out on the hubs to be detected, images of the hubs to be detected are obtained, and the images of the hubs to be detected also comprise hub body areas and ground color areas.
S2-2: comparing hubs to be detected: performing temporary coordinate value assignment on the image of the mth hub to be detected, and obtaining the temporary color value G_m(x,y) of the image of the mth hub to be detected by the color value extraction method; within the effective coordinate area, calculating the temporary chromatic aberration G_m^c' between the image of the mth hub to be detected and the current pixel standard color values Ḡ(x,y) of the reference sample; then translating and rotating the image of the mth hub to be detected so as to continuously update the coordinate values of its pixels, and continuously obtaining the temporary color value G_m(x,y) of the pixels of the image of the mth hub to be detected and the temporary chromatic aberration G_m^c'.
wherein the calculation model of the temporary chromatic aberration G_m^c' is as follows:
G_m^c' = Σ_{(x,y) ∈ effective area} G^c(G_m(x,y), Ḡ(x,y))
S2-3: positioning the hub to be detected: taking the temporary color value G_m(x,y) of the pixels of the image of the mth hub to be detected at which the temporary chromatic aberration G_m^c' is minimal as the color value G_m(x,y) of the pixels of the image of the mth hub to be detected.
S2-4: comparison: in each independent region A_j, performing a difference operation between the pixels of the image of the mth hub to be detected and the standard color values Ḡ(x,y) to obtain the average color difference Ḡ_j^c; if the average color difference Ḡ_j^c is greater than C times the error value A_j^c, the hub to be detected is judged to be an unqualified hub.
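The pass/fail judgment of S2-4 can be sketched end to end: compute the mean per-pixel color difference inside each independent region A_j and compare it with C times the allowed error A_j^c. The region masks, names and the default value of C are assumptions:

```python
import numpy as np

def judge_hub(img, std_img, regions, error_values, c=1.0):
    """Return 'unqualified' if, in any region A_j (a boolean mask over the
    image), the mean per-pixel Euclidean RGB difference between the hub
    image and the standard color values exceeds c * A_j^c."""
    d = img.astype(float) - std_img.astype(float)
    diff = np.sqrt((d ** 2).sum(axis=-1))  # per-pixel color difference
    for j, mask in regions.items():
        if diff[mask].mean() > c * error_values[j]:
            return "unqualified"
    return "qualified"
```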
The implementation principle of the hub detection method based on image analysis in the embodiment is as follows:
firstly, detecting a reference sample hub, and collecting coordinates and color value information of an image of the reference sample hub; then, the overlapping comparison is carried out on the hubs of the N reference samples, andthe images of the reference sample hubs are adjusted to enable the N sample hub images to be overlapped, wherein the overlapping means that the N reference sample hubs are in an effective area, and the total chromatic aberration of the N reference sample hubs reaches the minimum value. Then, according to the color value G of the pixels of the image of the N reference sample hubs n (x, y) determining the standard color value of the pixel at each coordinate. Then, the body area of the reference sample hub is internally divided to obtain j areas A with different colors j Then calculate the allowable error A of each region j c
Then the sample to be detected can be detected, the coordinates and the color value information of the image of the sample hub to be detected are collected during detection, and then the M sample hubs to be detected and the standard color value of the pixels are detectedThe coincidence comparison is carried out, and the images of the sample hubs to be detected are continuously adjusted, so that M sample hubs to be detected are enabled to be in standard color values of pixels in a databaseRealizing superposition, wherein superposition means that M sample hubs to be detected are in an effective area and pixel standard color value +.>And the total color difference of (c) reaches a minimum.
Then, in the individual area A_j, a difference operation is performed between the pixels of the image of the m-th hub to be detected and the standard color value Ḡ(x, y) to obtain the average color difference ΔG_j. If the average color difference ΔG_j of the pixels of the image of the m-th hub to be detected is greater than C times the error value A_j^c, the hub to be detected is judged to be an unqualified hub. Detecting the color of the hub in this way reduces the influence of machining errors on the detection result, so that the detection result better matches the actual situation.
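The per-region pass/fail comparison described above can be sketched as follows. The Euclidean RGB distance and the boolean region mask are assumptions standing in for the patent's color-difference model, which is given only as a figure.

```python
import numpy as np

def region_fails(image, g_bar, region_mask, error_value, c=1.0):
    """Mean per-pixel colour difference inside one individual area A_j,
    compared against C times the region's error value A_j^c.
    image, g_bar: HxWx3 arrays; region_mask: HxW boolean array."""
    diff = np.linalg.norm(
        image[region_mask].astype(float) - g_bar[region_mask].astype(float),
        axis=-1)
    return diff.mean() > c * error_value
```

With a uniform offset of 10 per channel the mean difference is 10·√3 ≈ 17.3, so the region fails against an error value of 10 but passes against 20.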
Example 2: this embodiment discloses a hub detection system based on image analysis. Referring to fig. 3, the hub detection system based on image analysis includes the following modules:
An input module: used for inputting the effective coordinate area, x ∈ [x0, x1], y ∈ [y0, y1]; also used for inputting the hub model, the color value G_d of the ground color, the current pixel standard color value Ḡ(x, y), the number j of color zones in the hub, the extent A_j of each color zone, and the error value A_j^c of each color region; and further used for inputting the number N of reference sample hubs, the length threshold l, and the first, second, third and fourth thresholds used for judgment.
And a judging module: the input end is connected with the output ends of the input module and the storage module, and is used for judging whether information on the same model of hub is recorded in the storage module.
And a scanning module: the device is used for scanning the hub to be detected, and further obtaining an image of the hub.
And a labeling module: the input end is connected with the output end of the scanning module, and is used for marking the scanned image with coordinates and extracting the color values at the coordinates.
And an adjustment module: the input end is connected with the output end of the scanning module, and the output end is connected with the input end of the labeling module; used for translating or rotating the image so that the labeling module can reassign coordinate values to the image and extract the color values at the coordinates.
And a cache module: the input end is connected with the output ends of the labeling module and the input module, and is used for caching data.
And a color difference calculation module: the input end is connected with the output end of the cache module, and the output end is connected with the input end of the cache module; used for calculating the color difference between coordinates, between a coordinate and a coordinate area, and between coordinate areas.
And a color difference comparison module: the input end is connected with the output end of the buffer memory module, and the output end is connected with the input end of the buffer memory module and is used for comparing the color difference.
And a color value calculation module: the input end is connected with the output end of the cache module, and the output end is connected with the input end of the storage module; used for calculating the average color value of coordinates and of coordinate areas.
And a selection module: the input end is connected with the output end of the buffer module, and the output end is connected with the input end of the buffer module and is used for selecting data in the buffer module.
And an alarm module: the input end is connected with the output end of the buffer module and is used for giving an alarm.
Function definition module: the input end is connected with the output end of the buffer module, and the output end is connected with the input end of the buffer module and is used for defining a function according to coordinates.
And a function calculation module: the input end is connected with the output end of the buffer module, and the output end is connected with the input end of the buffer module and is used for calling the function and carrying out data operation according to the function.
And a storage module: used for storing the hub model, the effective coordinate area, the extent A_j of each color area, and the error value A_j^c of each color area.
The implementation principle of the hub detection system based on image analysis in this embodiment is as follows:
when a hub of a certain model is detected for the first time, two detection modes of design drawing detection and actual sample detection are provided.
In the design drawing detection mode, according to the information parameters of the design drawing for the current hub model, the effective coordinate area, x ∈ [x0, x1], y ∈ [y0, y1], is input through the input module, together with the hub model, the current pixel standard color value Ḡ(x, y), the number j of color zones in the hub, the extent A_j of each color zone, and the error value A_j^c of each color region; these data are transmitted to the storage module for storage.
The hub to be detected can then be scanned to obtain an image of the hub. The labeling module marks the image of the hub with coordinates and extracts the temporary color values G_m(x, y). The color difference calculation module performs a total color difference calculation between the color values of the image within the effective area and the current pixel standard color values Ḡ(x, y), obtaining the temporary color difference G_m^c'. An operator then translates and rotates the image of the hub through the adjustment module, while the labeling module continuously updates the coordinate value information of the image and the temporary color values G_m(x, y); at the same time, the color difference calculation module updates the temporary color difference G_m^c'. After the adjustment is finished, the temporary color values G_m(x, y) at the minimum of the temporary color difference G_m^c' are selected as the color values G_m(x, y) of the hub currently being detected. The color difference calculation module then calculates, within the extent A_j of each color region, the difference between the color values of the hub to be detected and the standard color values Ḡ(x, y). The color difference comparison module compares this difference with the error value A_j^c; if the difference is greater than the error value A_j^c, the hub to be detected is shown to be unqualified, and the alarm module raises an alarm.
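The adjust-and-compare loop above, repeatedly repositioning the image and keeping the pose with the minimum total color difference, can be sketched as a brute-force search over translations. Rotation is omitted for brevity, np.roll stands in for the labeling module's coordinate re-assignment, and the summed Euclidean RGB distance is an assumed stand-in for the patent's total-color-difference model.

```python
import numpy as np

def total_color_difference(img, ref):
    """Total colour difference over the effective area: the sum of
    per-pixel Euclidean RGB distances (an assumed metric)."""
    return np.linalg.norm(img.astype(float) - ref.astype(float), axis=-1).sum()

def best_shift(img, ref, max_shift=2):
    """Brute-force the translation that minimizes the temporary total
    colour difference, mimicking the operator's translate-and-compare loop."""
    best, best_cost = (0, 0), float("inf")
    for dx in range(-max_shift, max_shift + 1):
        for dy in range(-max_shift, max_shift + 1):
            candidate = np.roll(img, (dy, dx), axis=(0, 1))
            cost = total_color_difference(candidate, ref)
            if cost < best_cost:
                best_cost, best = cost, (dx, dy)
    return best, best_cost
```

Applying a known one-column shift to a reference image and running the search recovers the inverse shift with zero residual color difference.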
When the actual sample detection mode is used, the hub model, the color value G_d of the current ground color, the number N of reference sample hubs, the length threshold l, and the first, second, third and fourth thresholds used for judgment are input; the effective coordinate area, x ∈ [x0, x1], y ∈ [y0, y1], also needs to be input.
Then, the N reference sample hubs are scanned to obtain images of the hubs. The labeling module marks the images of the hubs with coordinates and extracts the temporary color values G_n(x, y). The color difference calculation module then calculates the color differences among the N reference sample hubs within the effective area, obtaining the temporary color difference G_n^c'. An operator translates and rotates the image of the n-th reference sample hub through the adjustment module (n ≠ 1), while the labeling module continuously updates the coordinate value information of the image and the temporary color values G_n(x, y); at the same time, the color difference calculation module updates the temporary color difference G_n^c'. After the adjustment is finished, the temporary color values G_n(x, y) at the minimum of the temporary color difference G_n^c' are taken as the color values G_n(x, y) of the current reference sample hub. The color value calculation module then averages the color values G_n(x, y) of the N reference sample hubs to obtain the current pixel standard color value Ḡ(x, y).
Then, the color difference calculation module calculates, within the effective area, the difference between the current pixel standard color value Ḡ(x, y) and the color value G_d of the ground color. If, within the effective area, the pixel color difference G^c(x, y) between the pixel standard color value Ḡ(x, y) at a single coordinate and the color value G_d of the ground color is less than the first threshold, and there exists a case in which the color difference G^c(x±1, y±1) between the current pixel standard color value Ḡ(x, y) and the standard color values Ḡ(x±1, y), Ḡ(x, y±1) of its adjacent pixels is greater than the second threshold, the set of such coordinates (x, y) is taken as the temporary contour edge L_0 of the hub sample area.
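The two-threshold contour test above — pixel close to the ground color, but with at least one strongly differing neighbour — can be sketched as follows. The Euclidean RGB distance and the 8-neighbourhood are assumptions; the patent specifies only the adjacent coordinates (x±1, y±1).

```python
import numpy as np

def temporary_contour(g_bar, ground_color, t1, t2):
    """Coordinates whose standard colour is within t1 of the ground colour
    but which have some neighbour differing by more than t2 -- a sketch of
    the temporary contour edge L0 of the hub sample area."""
    h, w, _ = g_bar.shape
    ground = np.asarray(ground_color, dtype=float)
    gd = np.linalg.norm(g_bar.astype(float) - ground, axis=-1)
    pts = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if gd[y, x] >= t1:
                continue  # pixel does not match the ground colour
            centre = g_bar[y, x].astype(float)
            neighbours = [g_bar[y + dy, x + dx].astype(float)
                          for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                          if (dy, dx) != (0, 0)]
            if any(np.linalg.norm(centre - n) > t2 for n in neighbours):
                pts.append((x, y))
    return pts
```

On a toy image whose left columns are ground colored and whose right columns are bright, the detected points are exactly the ground-color pixels bordering the bright region.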
Then, the function definition module performs function definition on the temporary contour edge L_0 to obtain a function f_L0(x, y). The function calculation module then compares the temporary contour edge L_0 with the length threshold l, removes from the temporary contour edge L_0 the line segments whose length is less than l, performs an integration operation on the function f_L0(x, y), and takes the coordinate set at the maximum of the integral value as the coordinate set of the contour edge L_0.
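The noise-reduction step above can be sketched as follows. Segments are represented as lists of coordinate points, and the patent's integral-at-maximum selection of the final contour is simplified to taking the largest surviving segment — an assumption, since the integral model is given only as a figure.

```python
def denoise_contour(segments, length_threshold):
    """Noise reduction on the temporary contour edge L0: drop segments
    shorter than the length threshold l, then keep the dominant remaining
    segment as L0."""
    kept = [seg for seg in segments if len(seg) >= length_threshold]
    return max(kept, key=len) if kept else []
```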
Then, the color difference comparison module compares the color difference G^c(x±1, y±1) between the current pixel standard color value Ḡ(x, y) and the standard color values Ḡ(x±1, y), Ḡ(x, y±1) of its adjacent pixels, and the color difference G^c(x±x2, y±y2) between the current pixel standard color value Ḡ(x, y) and the standard color value Ḡ(x±x2, y±y2). If the adjacent-pixel color difference G^c(x±1, y±1) is greater than the third threshold and the color difference G^c(x±x2, y±y2) is greater than the fourth threshold, the set of coordinates (x, y) is taken as the contour edge L_i of the different color areas within the hub body. The contour edge L_0 and the contour edges L_i divide the interior of the hub sample area into a plurality of individual areas A_j.
Then, the color difference calculation module calculates the estimated error value A_j^c' from the color values within the individual area A_j, and calculates the error value A_j^c from the number N of samples. The pixel standard color values Ḡ(x, y), the contour edge L_0, the contour edges L_i, the areas A_j and the error values are then input into the storage module for subsequent use.
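The per-region error estimate above can be sketched as a deviation statistic computed inside each area. The choice of the mean Euclidean deviation from the region's average color Ḡ_j is an assumption, since the patent's actual model for A_j^c' is given only as an un-extracted formula image.

```python
import numpy as np

def estimated_error_value(region_pixels):
    """Estimated error value A_j^c' for one individual area A_j, sketched
    as the mean Euclidean deviation of the region's pixels from the
    region's average colour."""
    px = np.asarray(region_pixels, dtype=float)
    mean = px.mean(axis=0)              # region average colour, G_bar_j
    return float(np.linalg.norm(px - mean, axis=1).mean())
```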
The hubs to be detected can then be inspected; the subsequent detection procedure is the same as that used in the design drawing detection mode. Because there is a certain error between the actual machining and the design drawing, this method can detect subsequent hubs on the basis of the first batch of produced hubs, improving the actual precision of detection.
When a hub of the same model is detected later, the hub model is input through the input module, and the judging module checks the information in the database, so as to decide whether the system enters the detection program directly or the original data needs to be input first.
The above embodiments are not intended to limit the scope of the present invention; therefore, all equivalent changes made to the structure, shape and principle of the invention shall be covered by the scope of protection of the invention.

Claims (9)

1. The hub detection method based on image analysis is characterized by comprising the following steps of: the method comprises the following steps:
determining a comparison area: according to the size of the hub, determining an effective coordinate area (x, y), x ∈ [x0, x1], y ∈ [y0, y1];
determining a pixel standard color value: determining the pixel standard color value Ḡ(x, y) of each pixel point in the effective coordinate area as a reference sample;
determining an error value: determining the extent A_j of each color region according to the number j of color regions in the hub, and setting an error value A_j^c for each color region;
Scanning a hub to be detected: selecting M hubs as pieces to be detected, and carrying out scanning analysis on the hubs to be detected to obtain images of the hubs to be detected, wherein the images of the hubs to be detected also comprise hub body areas and ground color areas;
comparing hubs to be detected: assigning temporary coordinate values to the image of the m-th hub to be detected, and obtaining the temporary color values G_m(x, y) of the image of the m-th hub to be detected by a color value extraction method; all hub body areas of the images of the M hubs to be detected fall into the effective coordinate area; calculating, within the effective coordinate area, the temporary color difference G_m^c' between the image of the m-th hub to be detected and the current pixel standard color value Ḡ(x, y) of the reference sample; then translating and rotating the image of the m-th hub to be detected, continuously updating the coordinate values of the pixels of the image of the m-th hub to be detected, and obtaining the temporary color values G_m(x, y) and the temporary color difference G_m^c' of the pixels of the image of the m-th hub to be detected;
wherein the calculation model of the temporary color difference G_m^c' is as follows:
wherein, when calculating the color difference of two colors, the calculation model is as follows:
wherein: A is color A, B is color B, r is the red component value, g is the green component value, and b is the blue component value;
positioning a hub to be detected: taking the temporary color values G_m(x, y) of the pixels of the image of the m-th hub to be detected at the minimum of the temporary color difference G_m^c' as the color values G_m(x, y) of the pixels of the image of the m-th hub to be detected;
comparison: in the individual area A_j, performing a difference operation between the pixels of the image of the m-th hub to be detected and the standard color value Ḡ(x, y) to obtain the average color difference ΔG_j; if the average color difference ΔG_j is greater than the error value A_j^c, judging the hub to be detected to be an unqualified hub.
2. The hub detection method based on image analysis according to claim 1, wherein: before the step of determining the comparison area, a reference sample scanning step and a reference sample marking step are further arranged;
reference sample scan: selecting N hubs as reference samples, and carrying out scanning analysis on the reference sample hubs to obtain images of the reference sample hubs, wherein the images of the reference sample hubs comprise hub body areas and ground color areas;
reference sample marking: marking the image of the 1st reference sample hub with coordinates, and obtaining the color value G_1(x, y) of each pixel in the image of the 1st reference sample hub by a color value extraction method, (x, y) being the coordinates of the pixels on the image of the reference sample hub;
in the step of determining the comparison area, the hub body area of the image of the 1st reference sample hub falls into the effective coordinate area;
after the step of determining the comparison area, a step of determining chromatic aberration in the comparison area and a step of positioning a reference sample are further arranged;
determining the color difference in the comparison area: assigning temporary coordinate values to the image of the n-th reference sample hub, obtaining the temporary color values G_n(x, y) of the image of the n-th reference sample hub by a color value extraction method, and calculating, within the effective coordinate area, the temporary color difference G_n^c' between the image of the n-th reference sample hub and the image of the 1st reference sample hub; then translating and rotating the image of the n-th reference sample hub so that the coordinate values of the pixels of the image of the n-th reference sample hub are continuously updated, thereby continuously obtaining the temporary color values G_n(x, y) of the pixels of the image of the n-th reference sample hub and the temporary color difference G_n^c' between the image of the n-th reference sample hub and the image of the 1st reference sample hub;
wherein the calculation model of the temporary color difference G_n^c' is as follows:
positioning a reference sample: taking the temporary color values G_n(x, y) of the pixels of the image of the n-th reference sample hub at the minimum of the temporary color difference G_n^c' as the color values G_n(x, y) of the pixels of the image of the n-th reference sample hub;
in the step of determining the pixel standard color value, the calculation model of Ḡ(x, y) is as follows:
3. The hub detection method based on image analysis according to claim 2, wherein: after the step of determining the pixel standard color value, a hub body region dividing step and a step of determining the internal contour of the hub body region are further provided;
hub body region division: extracting the color value G_d of the ground color; if, within the effective area, the pixel color difference G^c(x, y) between the pixel standard color value Ḡ(x, y) at a single coordinate and the color value G_d of the ground color is less than the first threshold, and there exists a case in which the color difference G^c(x±1, y±1) between the current pixel standard color value Ḡ(x, y) and the standard color values Ḡ(x±1, y), Ḡ(x, y±1) of its adjacent pixels is greater than the second threshold, taking the set of such coordinates (x, y) as the temporary contour edge L_0 of the hub sample area, and obtaining the function f_L0(x, y) of the temporary contour edge L_0;
determining the internal contour of the hub body region: within the contour edge L_0 of the hub body region, if the color difference G^c(x±1, y±1) between the current pixel standard color value Ḡ(x, y) and the standard color values Ḡ(x±1, y), Ḡ(x, y±1) of its adjacent pixels is greater than the third threshold, and the color difference G^c(x±x2, y±y2) between the current pixel standard color value Ḡ(x, y) and the standard color value Ḡ(x±x2, y±y2) is greater than the fourth threshold, taking the set of coordinates (x, y) as the contour edge L_i of the different color areas within the hub body;
in the step of determining the error value, the interior of the hub sample area is divided into a plurality of individual areas A_j with the contour edge L_0 and the contour edges L_i as boundaries, and the allowable estimated error value A_j^c' of each region is calculated;
the calculation model of the estimated error value A_j^c' is as follows:
wherein: Ḡ_j is the average color value of the pixels within A_j;
the allowable error value A_j^c of each region is equal to the allowable estimated error value A_j^c' of that region;
4. A hub detection method based on image analysis as claimed in claim 3, wherein: after the step of determining the error value, an error value compensation step is further provided;
error value compensation: compensating the estimated error value, thereby calculating the allowable error value A_j^c of each region;
the updated calculation model of the error value A_j^c is as follows:
5. A hub detection method based on image analysis as claimed in claim 3, wherein: a noise reduction step is further provided between the hub body region dividing step and the step of determining the internal contour of the hub body region;
noise reduction: removing from the temporary contour edge L_0 the line segments whose length is less than l, then performing an integration operation on the temporary contour edge L_0, and taking the coordinate set of the contour edge at the maximum of the integral value as the coordinate set of the contour edge L_0 of the hub sample area;
wherein the calculation model of L_0 is as follows:
6. a hub detection system based on image analysis, for applying the hub detection method based on image analysis according to any one of claims 1 to 5, characterized in that: the method comprises the following modules:
an input module: used for inputting the effective coordinate area, x ∈ [x0, x1], y ∈ [y0, y1]; also used for inputting the pixel standard color value Ḡ(x, y) of each pixel point in the effective coordinate area, the hub model, the color value G_d of the ground color, the number j of color zones in the hub, the extent A_j of each color zone, and the error value A_j^c of each color region;
And a scanning module: the device is used for scanning the hub to be detected, so as to obtain an image of the hub;
and the marking module is used for: the input end is connected with the output end of the scanning module and is used for carrying out coordinate marking on the scanned image and extracting color values on the coordinates;
and an adjustment module: the input end is connected with the output end of the scanning module, and the output end is connected with the output end of the labeling module and is used for translating or rotating the image;
and a cache module: the input end is connected with the labeling module and the output end of the input module and is used for caching data;
and a color difference calculation module: the input end is connected with the output end of the buffer module, and the output end is connected with the input end of the buffer module and is used for calculating the chromatic aberration between coordinates, between coordinates and a coordinate area and between coordinates and a coordinate area;
and a color difference comparison module: the input end is connected with the output end of the buffer module, and the output end is connected with the input end of the buffer module and is used for comparing the color difference;
and a selection module: the input end is connected with the output end of the cache module, and the output end is connected with the input end of the cache module and used for selecting data in the cache module;
and an alarm module: the input end is connected with the output end of the buffer module and is used for giving an alarm.
7. The image analysis-based hub detection system of claim 6, wherein: the system further comprises a color value calculation module, a function definition module, a function calculation module and a storage module,
and the color value calculation module is used for: the input end is connected with the output end of the cache module, and the output end is connected with the input end of the storage module and is used for calculating the coordinates and the average color value of the coordinate area;
function definition module: the input end is connected with the output end of the cache module, and the output end is connected with the input end of the cache module and is used for defining a function according to coordinates;
and a function calculation module: the input end is connected with the output end of the cache module, and the output end is connected with the input end of the cache module and is used for calling the function and carrying out data operation according to the function;
and a storage module: range A for storing hub model, effective coordinate area, each color area j And an error value a for each color region j c
8. The image analysis-based hub detection system of claim 6, wherein: the input module is also used for inputting the number N of the reference sample hubs, the length threshold l, and the first threshold, the second threshold, the third threshold and the fourth threshold for judgment.
9. The image analysis-based hub detection system of claim 8, wherein: the system further comprises a judging module,
and a judging module: the input end is connected with the input module and the output end of the storage module and is used for judging whether the information of the same type of hub is recorded in the storage module.
CN202311321399.9A 2023-10-13 2023-10-13 Hub detection method and system based on image analysis Active CN117058151B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311321399.9A CN117058151B (en) 2023-10-13 2023-10-13 Hub detection method and system based on image analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311321399.9A CN117058151B (en) 2023-10-13 2023-10-13 Hub detection method and system based on image analysis

Publications (2)

Publication Number Publication Date
CN117058151A CN117058151A (en) 2023-11-14
CN117058151B true CN117058151B (en) 2024-01-05

Family

ID=88654048

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311321399.9A Active CN117058151B (en) 2023-10-13 2023-10-13 Hub detection method and system based on image analysis

Country Status (1)

Country Link
CN (1) CN117058151B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117351859B (en) * 2023-12-05 2024-02-09 深圳市深顺欣科技有限公司 Detection method, device and system for display module

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104123542A (en) * 2014-07-18 2014-10-29 大连理工大学 Device and method for positioning wheel hub work piece
WO2016201947A1 (en) * 2015-06-16 2016-12-22 华南理工大学 Method for automated detection of defects in cast wheel products
CN116482113A (en) * 2023-04-24 2023-07-25 广东绿之彩科技股份有限公司 Printed matter appearance defect detection process based on neural network


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A new image measurement method for wheel hub; Wei Zhang et al.; IEEE Xplore; full text *
Application of digital image processing technology in detecting the PCD value of aluminum wheel hubs; Yu Hongjuan; Xu Xinmin; Zhou Changming; Su Wanyong; Chen Xiaolong; Bulletin of Science and Technology (01); full text *

Also Published As

Publication number Publication date
CN117058151A (en) 2023-11-14

Similar Documents

Publication Publication Date Title
CN117058151B (en) Hub detection method and system based on image analysis
US11915407B2 (en) Automated system and method for clarity measurements and clarity grading
CN113570605B (en) Defect detection method and system based on liquid crystal display panel
EP2417419B1 (en) Digital optical comparator and corresponding digital comparison method
CN111862037A (en) Method and system for detecting geometric characteristics of precision hole type part based on machine vision
CN106920219A (en) Article defect detection method, image processing system and computer readable recording medium
CN105160652A (en) Handset casing testing apparatus and method based on computer vision
CN102138068B (en) Visual examination apparatus
JP2009544002A (en) Window glass inspection method
CN114897864B (en) Workpiece detection and defect judgment method based on digital-analog information
CN112001917A (en) Machine vision-based geometric tolerance detection method for circular perforated part
CN1565000A (en) Image processing method for appearance inspection
JPWO2004013572A1 (en) Curved shape inspection method and apparatus
CN111080582A (en) Method for detecting defects on inner surface and outer surface of workpiece
CN116703890B (en) Method and system for detecting tab defects
US20110164129A1 (en) Method and a system for creating a reference image using unknown quality patterns
CN111008602B (en) Scribing feature extraction method combining two-dimensional vision and three-dimensional vision for small-curvature thin-wall part
CN113109240A (en) Method and system for determining imperfect grains of grains implemented by computer
CN107818556A (en) Bad the line Autonomous test and self-repair method of capacitance type fingerprint acquisition system
CN111539951B (en) Visual detection method for outline size of ceramic grinding wheel head
CN114638822B (en) Method and system for detecting surface quality of automobile cover plate by using optical means
WO2004083835A1 (en) Method to determine the optical quality of a glazing
CN114219802A (en) Skin connecting hole position detection method based on image processing
JP4036048B2 (en) Defect identification method
CN115452845B (en) LED screen surface damage detection method based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant