CN116883446A - Real-time monitoring system for grinding degree of vehicle-mounted camera lens - Google Patents

Real-time monitoring system for grinding degree of vehicle-mounted camera lens Download PDF

Info

Publication number
CN116883446A
CN116883446A CN202311152368.5A CN202311152368A CN116883446A CN 116883446 A CN116883446 A CN 116883446A CN 202311152368 A CN202311152368 A CN 202311152368A CN 116883446 A CN116883446 A CN 116883446A
Authority
CN
China
Prior art keywords
degree
edge pixel
value
pixel points
epsilon
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311152368.5A
Other languages
Chinese (zh)
Other versions
CN116883446B (en
Inventor
刘峰
杜犇犇
周士兰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Luran Optoelectronics Weishan Co ltd
Original Assignee
Luran Optoelectronics Weishan Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Luran Optoelectronics Weishan Co ltd filed Critical Luran Optoelectronics Weishan Co ltd
Priority to CN202311152368.5A priority Critical patent/CN116883446B/en
Publication of CN116883446A publication Critical patent/CN116883446A/en
Application granted granted Critical
Publication of CN116883446B publication Critical patent/CN116883446B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20048Transform domain processing

Abstract

The application relates to the field of image processing, and provides a real-time monitoring system for grinding degree of a vehicle-mounted camera lens, which comprises the following components: the edge detection module is used for carrying out edge detection on an image to be detected, and determining edge pixel points in the image to be detected, wherein the edge pixel points are edges of the vehicle-mounted camera, and the image to be detected is obtained by collecting side images of the lenses of the vehicle-mounted camera; the threshold calculation module is used for determining an optimal segmentation threshold value based on the degree of abnormality of the edge pixel points; and the anomaly detection module is used for detecting the anomaly of the edge pixel points based on the optimal segmentation threshold and the anomaly degree of the edge pixel points by utilizing an isolated forest algorithm and determining whether the edge pixel points accord with a grinding standard. The system can effectively improve the inspection efficiency and the inspection accuracy.

Description

Real-time monitoring system for grinding degree of vehicle-mounted camera lens
Technical Field
The application relates to the field of image processing, in particular to a real-time monitoring system for grinding degree of a vehicle-mounted camera lens.
Background
With the development of economy and the progress of age, people travel tools from walking to bicycles to motorcycles and finally to places where automobiles are visible. The vehicle-mounted camera is an indispensable part of the automobile, and the quality problem of the vehicle-mounted camera is also a great concern.
The vehicle-mounted camera lens is a convex lens, and the lens needs to be ground so as to obtain a desired lens shape. However, there are many accidents occurring during the milling process, such as local dishing due to excessive milling, local convexity due to insufficient milling, and failure to meet the target curvature due to insufficient milling. The judgment of the grinding degree is carried out based on a manual detection mode at the present stage, but the manual detection consumes a great deal of manpower, and the judgment of each vehicle-mounted camera cannot be ensured to be accurate enough due to the large number of vehicle-mounted cameras produced each day.
Disclosure of Invention
The application provides a real-time monitoring system for grinding degree of a vehicle-mounted camera lens, which can effectively improve the inspection efficiency and the inspection accuracy.
In a first aspect, the present application provides a real-time monitoring system for grinding degree of a lens of an on-vehicle camera, comprising:
the edge detection module is used for carrying out edge detection on an image to be detected, and determining edge pixel points in the image to be detected, wherein the edge pixel points are edges of the vehicle-mounted camera, and the image to be detected is obtained by collecting side images of the lenses of the vehicle-mounted camera;
the threshold calculation module is used for determining an optimal segmentation threshold value based on the degree of abnormality of the edge pixel points;
and the anomaly detection module is used for detecting the anomaly of the edge pixel points based on the optimal segmentation threshold and the anomaly degree of the edge pixel points by utilizing an isolated forest algorithm and determining whether the edge pixel points accord with a grinding standard.
Optionally, the system further comprises:
the anomaly degree calculation module is used for determining the difference degree of two symmetrical edge pixel points based on the curvature difference and the distance degree difference of the two symmetrical edge pixel points, wherein the distance degree is the distance from the edge pixel points to the centroid of the vehicle-mounted camera lens;
and determining the abnormality degree of the edge pixel points based on the difference degree and the density of the edge pixel points.
Optionally, the abnormality degree calculation module is configured to:
determining the density of the window area of the edge pixel points based on the number of the edge pixel points in the window area taking the edge pixel points as the center and the area of the window area;
and determining the abnormality degree of the edge pixel points based on the difference degree of the edge pixel points and the density of the edge pixel points.
Optionally, the threshold calculating module is configured to:
randomly selecting part of edge pixel points as a reference pixel point set, and taking the maximum value of the abnormality degree of the edge pixel points in the reference pixel point set as an initial segmentation threshold;
determining the phase difference degree of the edge pixel point based on the difference value of the j-th segmentation threshold value under the anomaly degree and epsilon value of the edge pixel point;
determining the confusion degree obtained by the j-th segmentation threshold value under the epsilon value based on the phase difference degree of the edge pixel points;
and determining an optimal segmentation threshold value based on the obtained confusion degree of the j-th segmentation threshold value under the epsilon value.
Optionally, the threshold calculating module is configured to:
calculating to obtain a j+1th division threshold value under the epsilon value based on the confusion obtained by the jth division threshold value under the epsilon value, the confusion obtained by the j-1th division threshold value under the epsilon value and the jth division threshold value under the epsilon value;
if the absolute value of the difference value between the j+1th dividing threshold value and the j-th dividing threshold value is smaller than a first preset value, the j+1th dividing threshold value is used as a to-be-selected dividing value under the epsilon value;
changing the epsilon value, and calculating a to-be-selected segmentation value under the changed epsilon value;
and comparing the chaos degrees corresponding to the candidate segmentation thresholds obtained under all epsilon values, and selecting the candidate segmentation threshold corresponding to the minimum chaos degree as the optimal segmentation threshold.
Optionally, the threshold calculating module is configured to:
the j+1th partition threshold is calculated using the following formula:
wherein ,represents the division threshold value of the j+1st time at the first epsilon value, ++>Represents the j-1 th partition threshold value at epsilon value,/for>Indicating that ++under the first ε value>The confusion obtained by the j-th division threshold value under epsilon value is represented by ++>The degree of confusion obtained by the j-1 th division threshold value under the epsilon value is shown.
Optionally, the threshold calculating module is configured to:
determining the confusion degree obtained by the j-th segmentation threshold value under the epsilon value based on the probability that the phase difference degree of the k-th edge pixel point appears in the phase difference degree of the edge pixel points in the reference pixel point set;
the chaos degree calculation mode of the j-th segmentation threshold value under the epsilon value is as follows:
for the degree of confusion obtained for the jth segmentation threshold at the ith ε, ++>Phase difference of the kth edge pixel point +.>The probability of occurrence in the degree of phase difference of the edge pixels in the reference pixel set, n being the total number of edge pixels in the reference pixel set.
Optionally, if the confusion obtained by the j-th segmentation threshold under the epsilon value is smaller than the confusion obtained by the j-1 th segmentation threshold under the epsilon value, the segmentation effect of the j-th segmentation threshold is better than the segmentation effect of the j-1 th segmentation threshold;
if the degree of confusion obtained by the j-th division threshold value under the epsilon value is larger than the degree of confusion obtained by the j-1 th division threshold value under the epsilon value, the segmentation effect of the j-th division threshold value is inferior to the j-1 th division threshold value.
Optionally, the threshold calculating module is further configured to:
the epsilon value was changed using the following formula:
wherein ,for the first epsilon value, < >>Is the 1+1th epsilon value, wherein the first epsilon value +.>For 1, the epsilon value must not exceed 2 at maximum.
Optionally, the anomaly degree calculation module is further configured to:
obtaining a first derivative and a second derivative of the edge pixel points of the vehicle-mounted camera lens by using the Laplacian operator;
and determining the curvature of the edge pixel point based on the first derivative and the second derivative.
The application has the beneficial effects that the vehicle-mounted camera lens grinding degree real-time monitoring system is different from the prior art, and comprises: the edge detection module is used for carrying out edge detection on an image to be detected, and determining edge pixel points in the image to be detected, wherein the edge pixel points are edges of the vehicle-mounted camera, and the image to be detected is obtained by collecting side images of the lenses of the vehicle-mounted camera; the threshold calculation module is used for determining an optimal segmentation threshold value based on the degree of abnormality of the edge pixel points; and the anomaly detection module is used for detecting the anomaly of the edge pixel points based on the optimal segmentation threshold and the anomaly degree of the edge pixel points by utilizing an isolated forest algorithm and determining whether the edge pixel points accord with a grinding standard. The system can effectively improve the inspection efficiency and the inspection accuracy.
Drawings
FIG. 1 is a schematic diagram of a real-time monitoring system for grinding degree of an onboard camera lens according to a first embodiment of the present application;
fig. 2 is a schematic structural diagram of a second embodiment of the on-vehicle camera lens grinding degree real-time monitoring system of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The method is used for preprocessing images by utilizing a Canny algorithm aiming at the characteristic that the Euclidean distance between the convex and concave defect areas and the centroid of the lens is abnormal, so as to obtain the edge point position of the lens of the vehicle-mounted camera. And identifying the defect area by means of the phase difference, and obtaining a conclusion whether the grinding degree of the vehicle-mounted camera lens in detection meets the standard or not by utilizing an isolated forest algorithm through comparing the difference between the curvature and the target value. The present application will be described in detail with reference to the accompanying drawings and examples.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an embodiment of a real-time monitoring system for grinding degree of a vehicle-mounted camera lens according to the present application, which specifically includes: an edge detection module 11, a threshold calculation module 12, and an abnormality detection module 13.
The edge detection module 11 is configured to perform edge detection on an image to be detected, determine edge pixel points in the image to be detected, where the edge pixel points are edges of the vehicle-mounted camera, and the image to be detected is obtained by collecting side images of a lens of the vehicle-mounted camera.
Specifically, the application needs to detect the grinding degree of the lens of the vehicle-mounted camera, and a CMOS camera is used for collecting the side image of the lens of the vehicle-mounted camera to obtain the side image of the lens of the vehicle-mounted camera, wherein the side image of the lens of the vehicle-mounted camera is an RGB image. Preprocessing the obtained side image of the lens of the vehicle-mounted camera, eliminating the influence caused by noise and partial external interference, and enhancing the accuracy of subsequent analysis. Because the image is required to be converted into a gray level image from an RGB space later, gaussian filtering is often adopted to reduce the noise of the image, and a Gaussian function is utilized to convolve with the acquired side image of the vehicle-mounted camera lens to eliminate random noise. The gray scale difference of the two sides of the lens edge in the side image of the vehicle-mounted camera lens is large, and the lens edge can be identified according to the characteristics. And converting the denoised side image of the vehicle-mounted camera lens from the RGB image into a gray image, thereby obtaining an image to be detected.
Specifically, the edge detection module 11 is configured to identify an edge of the lens of the vehicle-mounted camera by using a Canny algorithm on an image to be detected. Solving the recognized edge position coordinates of the vehicle-mounted camera lens to obtain coordinate values of pixel points at the edge of each vehicle-mounted camera lensThe coordinate value of the ith edge pixel point of the vehicle-mounted camera lens is +.>
The threshold calculation module 12 is configured to determine an optimal segmentation threshold based on the degree of abnormality of the edge pixel points. The anomaly detection module 13 is configured to perform anomaly detection on the edge pixel point based on the optimal segmentation threshold and the anomaly degree of the edge pixel point by using an isolated forest algorithm, and determine whether the edge pixel point meets a grinding standard.
Further, referring to fig. 2, the real-time monitoring system for grinding degree of the on-vehicle camera lens further includes an anomaly calculation module 14.
Specifically, the abnormality degree calculation module 14 calculates the distance from the edge pixel point to the centroid of the lens of the vehicle-mounted camera as the distance degree:
wherein ,for the distance degree of the ith edge pixel point of the vehicle-mounted camera lens,/for the vehicle-mounted camera lens>Coordinate values corresponding to the centroid of the lens of the vehicle-mounted camera, < >>For the Euclidean distance between the ith edge pixel point of the vehicle-mounted camera lens and the centroid of the vehicle-mounted camera lens, the coordinate value of the ith edge pixel point is +.>
Further, the abnormality degree calculation module 14 is further configured to: obtaining a first derivative and a second derivative of the edge pixel points of the vehicle-mounted camera lens by using the Laplacian operator; and determining the curvature of the edge pixel point based on the first derivative and the second derivative.
Specifically, the upper and lower side edges of the vehicle-mounted camera lens should be symmetrical about the center line, so as to determine the grinding degree of the vehicle-mounted camera lens. Obtaining a first derivative of a pixel point at the edge of a lens of the vehicle-mounted camera through a Laplacian operatorAnd second derivative->
The curvature formula is as follows:
the curvature of each edge pixel point can be obtainedThe curvature of the ith edge pixel point of the vehicle-mounted camera lens is +.>
Further, the anomaly degree calculation module 14 is configured to determine the difference degree of the two symmetrical edge pixel points based on the curvature difference and the distance degree difference of the two symmetrical edge pixel points.
Specifically, because the vehicle-mounted camera lens is symmetrical, the curvatures corresponding to the two symmetrical points respectively should be the same, and the distance between the two symmetrical points should be the same, so as to calculate the difference between the pixel point at the edge of the ith vehicle-mounted camera lens and the symmetrical pixel point
The curvature of the ith edge pixel point of the vehicle-mounted camera lens is +.>The curvature corresponding to the i-th edge pixel point symmetry point of the vehicle-mounted camera lens is obtained. />The distance degree of the ith edge pixel point of the vehicle-mounted camera lens is +.>Lens for vehicle-mounted cameraAnd (5) the distance degree corresponding to the symmetric point of the ith edge pixel point. The distance degree is the distance from the edge pixel point to the centroid of the lens of the vehicle-mounted camera.
Degree of differenceIs the product of the absolute value of the difference between the ith edge pixel point of the vehicle-mounted camera lens and the curvature of the symmetrical pixel point and the absolute value of the difference between the distance degrees, +.>Is a parameter-adjusting factor whose function is to prevent calculation errors when the difference between curvatures is zero or when the difference between distances is zero,/->The size of (2) is 0.01. Degree of differentiation->The larger the curvature difference is, the larger the curvature difference between the ith pixel point and the symmetrical pixel point is or the larger the distance difference is, the curvature difference is +.>The larger the i-th pixel point is, the more likely the i-th pixel point is an asymmetric pixel point, and the i-th pixel point is not in accordance with the standard.
The anomaly calculation module 14 is also configured to calculate an anomaly based on the differenceAnd determining the degree of abnormality of the edge pixel points by the density of the edge pixel points. Specifically, the density of the window area of the edge pixel point is determined based on the number of edge pixel points in the window area centering on the edge pixel point and the area of the window area.
In an embodiment, taking the ith edge pixel point of the vehicle-mounted camera lens as the center, taking all the edge pixel points of the vehicle-mounted camera lens in the 9*9 area, and calculating the number of the edge pixel points of all the vehicle-mounted camera lens in the 9*9 area taking the ith edge pixel point of the vehicle-mounted camera lens as the center. Calculating the density of the i-th edge pixel point of the vehicle-mounted camera lens in the 9*9 area by taking the i-th edge pixel point of the vehicle-mounted camera lens as the center>
wherein ,is the area of the region, because it is the 9*9 region, +.>And 81. Since the concave or convex area is larger than the number of pixels at the edge of the lens in the normal area, the density is also high. I.e. < ->The larger the i-th vehicle-mounted camera lens edge pixel point is, the more likely the pixel point is in the convex or concave area. The density of all the vehicle-mounted camera lens edge pixel points in the area taking the ith vehicle-mounted camera lens edge pixel point as the center 9*9 is used as the density of the ith inserted camera lens edge pixel point. And calculating the density of the edge pixel points of all the vehicle-mounted camera lenses.
The outlier calculation module 14 calculates the outlier based on the edge pixel pointAnd the density of edge pixels->And determining the degree of abnormality of the edge pixel points. Specifically, the degree of abnormality +_for each edge pixel is calculated by the following formula>
in the formula For the density of the ith pixel, +.>For the degree of difference of the ith pixel, < +.>Calculating the anomaly degree of the pixel points at the edge of the lens of all the vehicle-mounted cameras according to the formula to obtain the anomaly degree of the pixel points at the edge of the lens of the ith edge>
Because the edge pixel points and the symmetrical pixel points of some lenses are in the defect area, the single degree of differenceCan not completely reflect the defect status of the pixel points at the edge of the lens, so the difference degree is +>Density->The combination of the two can completely reflect the position relation between the pixel point and the defect area. Degree of abnormality->Reflecting the position relation between the pixel point and the defect area, < >>The larger the i-th pixel is, the more likely the pixel is to be in the defective area.
The application adopts an isolated forest algorithm to identify unqualified areas. Since the isolated forest algorithm relies on the segmentation threshold, if the segmentation threshold selects an outlier, then the tree growth is subject to certain errors. The segmentation threshold of the isolated forest algorithm is improved. The system is provided with a threshold calculation module 12, and an optimal segmentation threshold is determined by the threshold calculation module 12 based on the degree of abnormality of the edge pixel points.
In a specific embodiment, the threshold calculation module 12 randomly selects a portion of the edge pixels as a reference pixel set, and uses the degree of abnormality of the edge pixels in the reference pixel setMaximum value of>As an initial segmentation threshold +.>
The threshold calculation module 12 is further based on the degree of anomaly of the edge pixelsAnd determining the phase difference degree of the edge pixel point by the difference value of the j-th segmentation threshold value under the epsilon value. Specifically, the i-th edge pixel point is different by +.>Degree of abnormality for the ith edge pixel point +.>And the jth segmentation threshold +.>Is calculated after rounding, and the specific calculation formula is as follows:
the threshold calculation module 12 determines the degree of confusion resulting from the j-th segmentation threshold at the epsilon value based on the degree of phase difference of the edge pixels. Specifically, the threshold calculation module 12 is based on the kth edgeThe probability that the phase difference degree of the edge pixel points appears in the phase difference degree of the edge pixel points in the reference pixel point set determines the confusion degree obtained by the j-th segmentation threshold value under the epsilon value. If the phase difference is highThe probability that the phase difference degree appears in the phase difference degree of the edge pixel points in the reference pixel point set is only once appears in the phase difference degrees of all n pixel points>N represents the number of edge pixels in the reference pixel. The confusion degree is obtained by multiplying the probability of the phase difference degree of each randomly extracted pixel point by the result obtained by taking the logarithm of the probability of the phase difference degree, carrying out the calculation on all randomly extracted pixel points, and summing the calculation results of all the pixel points. Specifically, the chaos degree calculation mode of the jth segmentation threshold under the epsilon value is as follows:
for the degree of confusion obtained for the jth segmentation threshold at the ith ε, ++>Phase difference of the kth edge pixel point +.>The probability of occurrence in the degree of phase difference of the edge pixels in the reference pixel set, n being the total number of edge pixels in the reference pixel set.
The threshold calculation module 12 determines an optimal segmentation threshold based on the degree of confusion resulting from the j-th segmentation threshold at the epsilon value. Specifically, the threshold calculation module 12 calculates the j+1th division threshold value for the epsilon value based on the degree of confusion obtained by the j-th division threshold value for the epsilon value, the degree of confusion obtained by the j-1th division threshold value for the epsilon value, and the j-th division threshold value for the epsilon value.
The j+1th partition threshold is calculated using the following formula:
wherein ,represents the division threshold value of the j+1st time at the first epsilon value, ++>Represents the j-1 th partition threshold value at epsilon value,/for>Indicating that ++under the first ε value>The confusion obtained by the j-th division threshold value under epsilon value is represented by ++>The degree of confusion obtained by the j-1 th division threshold value under the epsilon value is shown.
Due toThe smaller the confusion degree of the pixel points is, the better the obtained confusion degree of the jth segmentation threshold value is, the more representative the jth segmentation threshold value can segment the data well. Therefore, the confusion obtained by the jth segmentation threshold is subtracted from the confusion obtained by the jth-1 segmentation threshold, and if the confusion is negative, the jth segmentation effect is better, and the jth segmentation effect is betterAlso negative, epsilon initial +.>Taking the empirical value of 1, epsilon must not exceed 2 at maximum. Since the phase difference of each data constituted by the division threshold should distinguish the two parts, the phase difference should be constituted by approximately dividing into two parts, oneThe phase difference of one part is relatively close, the phase difference of the other part is relatively close, and the phase difference of two parts of data is not close, so that the threshold segmentation of the isolated tree in the isolated forest can be perfectly realized, and the segmentation threshold is +.>The value range of (2) is not more than the maximum degree of abnormality and not less than the minimum degree of abnormality. In general, the first segmentation threshold +.>Taking the minimum value of the abnormality +.>Segmentation threshold for the second time->Taking the second minimum value of the anomaly degree.
And if the absolute value of the difference value between the j+1th dividing threshold value and the j-th dividing threshold value is smaller than a first preset value, taking the j+1th dividing threshold value as a candidate dividing value under the epsilon value.
Changing the epsilon value, and calculating a to-be-selected segmentation value under the changed epsilon value; the epsilon value was changed using the following formula:
wherein ,for the first epsilon value, < >>Is the 1+1th epsilon value, wherein the first epsilon value +.>For 1, the epsilon value must not exceed 2 at maximum.
And comparing the chaos degrees corresponding to the candidate segmentation thresholds obtained under all epsilon values, and selecting the candidate segmentation threshold corresponding to the minimum chaos degree as the optimal segmentation threshold.
Specifically, if the degree of confusion obtained by the j-th division threshold value under the epsilon value is smaller than the degree of confusion obtained by the j-1 th division threshold value under the epsilon value, that is, the difference between the degree of confusion is a negative value, the division effect of the j-th division threshold value is better than the division effect of the j-1 th division threshold value. If the degree of confusion obtained by the j-th division threshold value under the epsilon value is larger than the degree of confusion obtained by the j-1 th division threshold value under the epsilon value, namely the difference of the degree of confusion is a positive value, the method indicates that the division effect of the j-th division threshold value is inferior to the division effect of the j-1 th division threshold value. The segmentation threshold is reasonable at this time, because the difference of the anomaly degree is larger in the left and right leaf nodes close to the root node, the data confusion degree in the two leaf nodes is smaller, and the difference of the extracted data distribution in the root node is larger. So that the abnormal data and the normal data in the random extraction data set can obtain a better segmentation result through a few nodes.
The threshold calculation module 12 calculates the optimal segmentation threshold in the manner described above. The anomaly detection module 13 is configured to perform anomaly detection on the edge pixel point based on the optimal segmentation threshold and the anomaly degree of the edge pixel point by using an isolated forest algorithm, and determine whether the edge pixel point meets a grinding standard.
Specifically, an isolated forest algorithm is used for the abnormality degree of all the pixel points, so that a conclusion is obtained that whether abnormal points exist or not, and if the abnormal points exist, the grinding degree of the vehicle-mounted camera does not meet the standard.
The real-time monitoring system for the grinding degree of the vehicle-mounted camera lens aims at the problems of the bulge, the recess and the like of the vehicle-mounted camera lens, a distance index is obtained based on Euclidean distance between the edge point of the lens and the centroid of the lens, curvature of the edge pixel point of the vehicle-mounted camera lens is obtained through a Laplace algorithm and a curvature formula, a confusion index is obtained, iterative optimization is carried out according to the confusion index to obtain an optimal threshold effect, and proper segmentation thresholds in an isolated forest algorithm are screened out, so that whether the curvature of the edge of the vehicle-mounted camera lens accords with the standard or not and whether the bulge and the recess defect possibly exist or not is judged. Compared with the prior art, the detection efficiency and the detection accuracy can be effectively improved if the grinding degree of the lens of the vehicle-mounted camera detected from the two aspects meets the standard.
The foregoing is only the embodiments of the present application, and therefore, the patent scope of the application is not limited thereto, and all equivalent structures or equivalent processes using the descriptions of the present application and the accompanying drawings, or direct or indirect application in other related technical fields, are included in the scope of the application.

Claims (10)

1. On-vehicle camera lens degree real-time supervision system that mills, its characterized in that includes:
the edge detection module is used for carrying out edge detection on an image to be detected, and determining edge pixel points in the image to be detected, wherein the edge pixel points are edges of the vehicle-mounted camera, and the image to be detected is obtained by collecting side images of the lenses of the vehicle-mounted camera;
the threshold calculation module is used for determining an optimal segmentation threshold value based on the degree of abnormality of the edge pixel points;
and the anomaly detection module is used for detecting the anomaly of the edge pixel points based on the optimal segmentation threshold and the anomaly degree of the edge pixel points by utilizing an isolated forest algorithm and determining whether the edge pixel points accord with a grinding standard.
2. The on-board camera lens grinding degree real-time monitoring system according to claim 1, further comprising:
the anomaly degree calculation module is used for determining the difference degree of two symmetrical edge pixel points based on the curvature difference and the distance degree difference of the two symmetrical edge pixel points, wherein the distance degree is the distance from the edge pixel points to the centroid of the vehicle-mounted camera lens;
and determining the abnormality degree of the edge pixel points based on the difference degree and the density of the edge pixel points.
3. The real-time monitoring system for grinding degree of an on-vehicle camera lens according to claim 2, wherein the abnormality degree calculation module is configured to:
determining the density of the window area of the edge pixel points based on the number of the edge pixel points in the window area taking the edge pixel points as the center and the area of the window area;
and determining the abnormality degree of the edge pixel points based on the difference degree of the edge pixel points and the density of the edge pixel points.
4. The on-vehicle camera lens grinding degree real-time monitoring system according to claim 1, wherein the threshold calculating module is configured to:
randomly selecting part of edge pixel points as a reference pixel point set, and taking the maximum value of the abnormality degree of the edge pixel points in the reference pixel point set as an initial segmentation threshold;
determining the phase difference degree of the edge pixel point based on the difference value of the j-th segmentation threshold value under the anomaly degree and epsilon value of the edge pixel point;
determining the confusion degree obtained by the j-th segmentation threshold value under the epsilon value based on the phase difference degree of the edge pixel points;
and determining an optimal segmentation threshold value based on the obtained confusion degree of the j-th segmentation threshold value under the epsilon value.
5. The on-vehicle camera lens grinding degree real-time monitoring system according to claim 4, wherein the threshold calculating module is configured to:
calculating to obtain a j+1th division threshold value under the epsilon value based on the confusion obtained by the jth division threshold value under the epsilon value, the confusion obtained by the j-1th division threshold value under the epsilon value and the jth division threshold value under the epsilon value;
if the absolute value of the difference value between the j+1th dividing threshold value and the j-th dividing threshold value is smaller than a first preset value, the j+1th dividing threshold value is used as a to-be-selected dividing value under the epsilon value;
changing the epsilon value, and calculating a to-be-selected segmentation value under the changed epsilon value;
and comparing the chaos degrees corresponding to the candidate segmentation thresholds obtained under all epsilon values, and selecting the candidate segmentation threshold corresponding to the minimum chaos degree as the optimal segmentation threshold.
6. The on-vehicle camera lens grinding degree real-time monitoring system according to claim 5, wherein the threshold calculating module is configured to:
the j+1th partition threshold is calculated using the following formula:
wherein ,represents the division threshold value of the j+1st time at the first epsilon value, ++>Represents the j-1 th partition threshold value at epsilon value,/for>Indicating that ++under the first ε value>The confusion obtained by the j-th division threshold value under epsilon value is represented by ++>The degree of confusion obtained by the j-1 th division threshold value under the epsilon value is shown.
7. The on-vehicle camera lens grinding degree real-time monitoring system according to claim 4, wherein the threshold calculating module is configured to:
determining the confusion degree obtained by the j-th segmentation threshold value under the epsilon value based on the probability that the phase difference degree of the k-th edge pixel point appears in the phase difference degree of the edge pixel points in the reference pixel point set;
the chaos degree calculation mode of the j-th segmentation threshold value under the epsilon value is as follows:
for the degree of confusion obtained for the jth segmentation threshold at the ith ε, ++>Phase difference of the kth edge pixel point +.>The probability of occurrence in the degree of phase difference of the edge pixels in the reference pixel set, n being the total number of edge pixels in the reference pixel set.
8. The on-vehicle camera lens degree of milling real-time supervision system of claim 5, wherein:
if the confusion obtained by the j-th segmentation threshold value under the epsilon value is smaller than the confusion obtained by the j-1 th segmentation threshold value under the epsilon value, the segmentation effect of the j-th segmentation threshold value is better than the segmentation effect of the j-1 th segmentation threshold value;
if the degree of confusion obtained by the j-th division threshold value under the epsilon value is larger than the degree of confusion obtained by the j-1 th division threshold value under the epsilon value, the segmentation effect of the j-th division threshold value is inferior to the j-1 th division threshold value.
9. The on-board camera lens grinding degree real-time monitoring system according to claim 5, wherein the threshold calculating module is further configured to:
the epsilon value was changed using the following formula:
wherein ,for the first epsilon value, < >>Is the 1+1th epsilon value, wherein the first epsilon value +.>For 1, the epsilon value must not exceed 2 at maximum.
10. The on-vehicle camera lens degree of grinding real-time monitoring system of claim 2, wherein the degree of abnormality calculation module is further configured to:
obtaining a first derivative and a second derivative of the edge pixel points of the vehicle-mounted camera lens by using the Laplacian operator;
and determining the curvature of the edge pixel point based on the first derivative and the second derivative.
CN202311152368.5A 2023-09-08 2023-09-08 Real-time monitoring system for grinding degree of vehicle-mounted camera lens Active CN116883446B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311152368.5A CN116883446B (en) 2023-09-08 2023-09-08 Real-time monitoring system for grinding degree of vehicle-mounted camera lens

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311152368.5A CN116883446B (en) 2023-09-08 2023-09-08 Real-time monitoring system for grinding degree of vehicle-mounted camera lens

Publications (2)

Publication Number Publication Date
CN116883446A true CN116883446A (en) 2023-10-13
CN116883446B CN116883446B (en) 2023-11-21

Family

ID=88259125

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311152368.5A Active CN116883446B (en) 2023-09-08 2023-09-08 Real-time monitoring system for grinding degree of vehicle-mounted camera lens

Country Status (1)

Country Link
CN (1) CN116883446B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117197140A (en) * 2023-11-07 2023-12-08 东莞市恒兴隆实业有限公司 Irregular metal buckle forming detection method based on machine vision

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FI935739A0 (en) * 1992-12-21 1993-12-20 Johnson & Johnson Vision Prod Kontrollfoerfarande och -anordning Foer ophthalmic linser
EP3355104A1 (en) * 2017-01-27 2018-08-01 Carl Zeiss Vision International GmbH Method and device and computer program for determining a representation of a spectacle glass rim
SE1930421A1 (en) * 2019-12-30 2021-07-01 Unibap Ab Method and means for detection of imperfections in products
CN113643371A (en) * 2021-10-13 2021-11-12 中国空气动力研究与发展中心低速空气动力研究所 Method for positioning aircraft model surface mark points
CN114113129A (en) * 2021-12-03 2022-03-01 中科计算技术西部研究院 Lens tiny defect identification and grabbing system and method
WO2022088620A1 (en) * 2020-10-28 2022-05-05 北京市商汤科技开发有限公司 State detection method and apparatus for camera lens, device and storage medium
CN115239727A (en) * 2022-09-23 2022-10-25 南通荣茂电子科技有限公司 PCB surface defect detection method
CN115272341A (en) * 2022-09-29 2022-11-01 华联机械集团有限公司 Packaging machine defect product detection method based on machine vision
CN115953407A (en) * 2023-03-15 2023-04-11 深圳市科姆特精密科技有限公司 Semiconductor equipment maintenance system based on computer vision
WO2023134792A2 (en) * 2022-12-15 2023-07-20 苏州迈创信息技术有限公司 Led lamp wick defect detection method

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FI935739A0 (en) * 1992-12-21 1993-12-20 Johnson & Johnson Vision Prod Kontrollfoerfarande och -anordning Foer ophthalmic linser
EP3355104A1 (en) * 2017-01-27 2018-08-01 Carl Zeiss Vision International GmbH Method and device and computer program for determining a representation of a spectacle glass rim
SE1930421A1 (en) * 2019-12-30 2021-07-01 Unibap Ab Method and means for detection of imperfections in products
WO2022088620A1 (en) * 2020-10-28 2022-05-05 北京市商汤科技开发有限公司 State detection method and apparatus for camera lens, device and storage medium
CN113643371A (en) * 2021-10-13 2021-11-12 中国空气动力研究与发展中心低速空气动力研究所 Method for positioning aircraft model surface mark points
CN114113129A (en) * 2021-12-03 2022-03-01 中科计算技术西部研究院 Lens tiny defect identification and grabbing system and method
CN115239727A (en) * 2022-09-23 2022-10-25 南通荣茂电子科技有限公司 PCB surface defect detection method
CN115272341A (en) * 2022-09-29 2022-11-01 华联机械集团有限公司 Packaging machine defect product detection method based on machine vision
WO2023134792A2 (en) * 2022-12-15 2023-07-20 苏州迈创信息技术有限公司 Led lamp wick defect detection method
CN115953407A (en) * 2023-03-15 2023-04-11 深圳市科姆特精密科技有限公司 Semiconductor equipment maintenance system based on computer vision

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
关波;王俊元;杜文华;曾志强;: "刀具轮廓亚像素精度阈值分割算法研究", 图学学报, no. 06 *
王波;: "基于细胞膜优化的图像边缘检测算法研究", 计算机仿真, no. 06 *
申宁馨;彭承琳;王翊;文静;: "基于图像的颗粒物检测研究", 计算机测量与控制, no. 01 *
陈红;熊利荣;胡筱波;王巧华;吴谋成;: "基于神经网络与图像处理的花生仁霉变识别方法", 农业工程学报, no. 04 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117197140A (en) * 2023-11-07 2023-12-08 东莞市恒兴隆实业有限公司 Irregular metal buckle forming detection method based on machine vision
CN117197140B (en) * 2023-11-07 2024-02-20 东莞市恒兴隆实业有限公司 Irregular metal buckle forming detection method based on machine vision

Also Published As

Publication number Publication date
CN116883446B (en) 2023-11-21

Similar Documents

Publication Publication Date Title
CN111292305B (en) Improved YOLO-V3 metal processing surface defect detection method
CN109141232B (en) Online detection method for disc castings based on machine vision
CN110163853B (en) Edge defect detection method
CN105067638B (en) Tire fetal membrane face character defect inspection method based on machine vision
CN111274843B (en) Truck overload monitoring method and system based on monitoring video
CN111862037A (en) Method and system for detecting geometric characteristics of precision hole type part based on machine vision
CN105160652A (en) Handset casing testing apparatus and method based on computer vision
CN116883446B (en) Real-time monitoring system for grinding degree of vehicle-mounted camera lens
CN111667470B (en) Industrial pipeline flaw detection inner wall detection method based on digital image
CN110189375B (en) Image target identification method based on monocular vision measurement
CN109559324A (en) A kind of objective contour detection method in linear array images
CN110866430A (en) License plate recognition method and device
CN109359604B (en) Method for identifying instrument under shadow interference facing inspection robot
CN112528868B (en) Illegal line pressing judgment method based on improved Canny edge detection algorithm
CN107862319A (en) A kind of heterologous high score optical image matching error elimination method based on neighborhood ballot
CN116358449A (en) Aircraft rivet concave-convex amount measuring method based on binocular surface structured light
CN112288682A (en) Electric power equipment defect positioning method based on image registration
CN111310771B (en) Road image extraction method, device and equipment of remote sensing image and storage medium
CN116703251A (en) Rubber ring production quality detection method based on artificial intelligence
CN113781413B (en) Electrolytic capacitor positioning method based on Hough gradient method
CN113129260B (en) Automatic detection method and device for internal defects of lithium battery cell
CN110705553A (en) Scratch detection method suitable for vehicle distant view image
CN114024503A (en) Solar cell color separation and defect detection system and method thereof
CN116958714B (en) Automatic identification method for wafer back damage defect
CN116758045B (en) Surface defect detection method and system for semiconductor light-emitting diode

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A real-time monitoring system for the grinding degree of car mounted camera lenses

Granted publication date: 20231121

Pledgee: China Construction Bank Corporation Weishan sub branch

Pledgor: Luran Optoelectronics (Weishan) Co.,Ltd.

Registration number: Y2024980009973