CN113450378A - Method for judging contact group difference plane height data matching degree - Google Patents

Method for judging contact group difference plane height data matching degree

Info

Publication number
CN113450378A
Authority
CN
China
Prior art keywords
image
contact
height data
gradient direction
contact feature
Prior art date
Legal status
Granted
Application number
CN202110717747.9A
Other languages
Chinese (zh)
Other versions
CN113450378B (en)
Inventor
李文华 (Li Wenhua)
韩峥 (Han Zheng)
王景芹 (Wang Jingqin)
赵正元 (Zhao Zhengyuan)
潘如政 (Pan Ruzheng)
Current Assignee
Hebei University of Technology
Original Assignee
Hebei University of Technology
Priority date
Filing date
Publication date
Application filed by Hebei University of Technology
Priority to CN202110717747.9A
Publication of CN113450378A
Application granted
Publication of CN113450378B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/13 - Edge detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 - Complex mathematical operations
    • G06F17/11 - Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
    • G06F17/13 - Differential equations
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/22 - Matching criteria, e.g. proximity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/60 - Analysis of geometric attributes
    • G06T7/62 - Analysis of geometric attributes of area, perimeter, diameter or volume

Abstract

The application provides a method for judging the matching degree of the height data of a contact group difference plane, which comprises the following steps: acquiring a surface image of the moving contact to obtain a first image; acquiring a surface image of the static contact to obtain a second image; extracting a first height data set of the first image and a second height data set of the second image; constructing a gradient direction difference cloud map of the first height data set and the second height data set; processing the gradient direction difference cloud map with a set angle threshold range θ_threshold to obtain a contact feature profile of the gradient direction difference cloud map; extracting a contact feature profile of the first image; extracting a contact feature profile of the second image; and determining the matching degree of the difference plane height data according to the contact feature profile of the gradient direction difference cloud map and the contact feature profiles of the first image and the second image. The method solves the problem that the matching degree of the difference plane height data cannot be determined intuitively, and provides initial pre-judgment information for subsequent height data calibration work.

Description

Method for judging contact group difference plane height data matching degree
Technical Field
The disclosure generally relates to the technical field of reliability tests, and particularly relates to a method for judging the matching degree of differential plane height data of a contact group.
Background
Establishing a difference plane from the height data of the relay contact group is a key step in studying relay reliability. The difference plane is established as follows: taking the three-dimensional curved surface reconstructed from the height data of the static contact as a reference, the three-dimensional curved surface reconstructed from the height data of the moving contact is mirrored; the height data matrices of the moving and static contacts are then added, the plane containing the maximum value of this sum is taken as the reference plane, and the difference between the summed surface heights of the two contacts and the reference plane is calculated to establish the height data difference plane.
The height data of the relay contact group are closely related to research on relay reliability; a difference plane established from the height data of the contact group can reflect information such as deformation and contact spots on the surfaces of the moving and static contacts after contact, and is particularly important for solving the electrical parameters and surface geometric parameters of the relay.
Because the difference plane of the contact group height data requires a one-to-one correspondence between the height data points on the surfaces of the moving and static contacts, and because the moving and static contacts of a contact group are placed at different positions on the three-dimensional topography scanner, the position points of the acquired contact surface height data often deviate, so that the height data of the moving and static contacts do not match; an intuitive and effective method for observing this mismatch of contact group height data caused by position differences has been lacking.
Disclosure of Invention
In view of the above-mentioned drawbacks and deficiencies of the prior art, it is desirable to provide a method for determining the degree of matching of the differential plane height data of the contact sets, which can solve the above-mentioned technical problems.
The application provides a method for judging the matching degree of height data of a contact group difference plane, which comprises the following steps:
acquiring a surface image of the moving contact to obtain a first image;
acquiring a static contact surface image to obtain a second image;
extracting a first height data set {h_d(1,1), ..., h_d(x,y), ..., h_d(m,n)} of the first image, wherein the first height data set has m rows and n columns, and h_d(x,y) is the height value corresponding to the point with spatial position coordinates (x,y) in the first height data set;
extracting a second height data set {h_j(1,1), ..., h_j(x,y), ..., h_j(m,n)} of the second image, wherein the second height data set has m rows and n columns, and h_j(x,y) is the height value corresponding to the point with spatial position coordinates (x,y) in the second height data set;
constructing a gradient direction difference cloud map of the first height data set and the second height data set;
processing the gradient direction difference cloud map with a set angle threshold range θ_threshold to obtain a contact feature profile of the gradient direction difference cloud map;
extracting a contact feature profile of the first image;
extracting a contact feature profile of the second image;
and determining the matching degree of the difference plane height data according to the contact feature profile of the gradient direction difference cloud map and the contact feature profiles of the first image and the second image.
According to the technical solution provided by the embodiments of the application, the method for judging the matching degree of the difference plane height data from the contact feature profile of the gradient direction difference cloud map and the contact feature profiles of the first image and the second image specifically comprises the following steps:
calculating the similarity r_d between the contact feature profile of the gradient direction difference cloud map and the contact feature profile of the first image;
obtaining the similarity r_j between the contact feature profile of the gradient direction difference cloud map and the contact feature profile of the second image;
calculating the similarity r_sim between the contact feature profile of the gradient direction difference cloud map and the contact feature profiles of the contact group according to the following formula:
[formula image not reproduced: r_sim as a function of r_d and r_j]
The similarity r_sim is used for characterizing the matching degree of the difference plane height data.
According to the technical solution provided by the embodiments of the application, the method for calculating the similarity r_d between the contact feature profile of the gradient direction difference cloud map and the contact feature profile of the first image comprises the following steps:
calculating the area A of the contact feature profile of the gradient direction difference cloud map;
calculating the area A_d of the contact feature profile of the first image;
calculating the similarity r_d between the contact feature profile of the cloud map and the contact feature profile of the first image according to the following formula:
[formula image not reproduced: r_d as a function of A and A_d]
According to the technical solution provided by the embodiments of the application, the method for obtaining the similarity r_j between the contact feature profile of the gradient direction difference cloud map and the contact feature profile of the second image comprises the following steps:
calculating the area A of the contact feature profile of the gradient direction difference cloud map;
calculating the area A_j of the contact feature profile of the second image;
calculating the similarity r_j between the contact feature profile of the cloud map and the contact feature profile of the second image according to the following formula:
[formula image not reproduced: r_j as a function of A and A_j]
According to the technical scheme provided by the embodiment of the application, the method for extracting the contact feature profile of the first image specifically comprises the following steps: carrying out gray level processing on the first image to obtain a first gray level image; and carrying out edge detection on the first gray level image through a Canny operator, and extracting a contact feature profile.
The method for extracting the contact feature profile of the second image specifically comprises the following steps: carrying out gray level processing on the second image to obtain a second gray level image; and carrying out edge detection on the second gray image through a Canny operator, and extracting a contact feature outline.
According to the technical solution provided by the embodiments of the application, the method for obtaining the contact feature profile of the gradient direction difference cloud map specifically comprises the following steps:
carrying out grayscale processing on the gradient direction difference cloud map to obtain a grayscale image of the gradient direction difference cloud map;
performing edge detection on the grayscale image of the gradient direction difference cloud map with a Canny operator, and extracting the contact feature profile;
and scaling the contact feature profile of the gradient direction difference cloud map so that it has the same size and proportion as the contact feature profile of the first image and the contact feature profile of the second image.
According to the technical solution provided by the embodiments of the application, the method for constructing the gradient direction difference cloud map of the first height data set and the second height data set specifically comprises the following steps:
calculating the angle values of the gradient directions of the points in the first height data set {h_d(1,1), ..., h_d(x,y), ..., h_d(m,n)} as a first angle set {θ_d(1,1), ..., θ_d(x,y), ..., θ_d(m,n)};
calculating the angle values of the gradient directions of the points in the second height data set {h_j(1,1), ..., h_j(x,y), ..., h_j(m,n)} as a second angle set {θ_j(1,1), ..., θ_j(x,y), ..., θ_j(m,n)};
calculating the set {θ_c(1,1), ..., θ_c(x,y), ..., θ_c(m,n)} of absolute differences between corresponding points of the first angle set {θ_d(1,1), ..., θ_d(x,y), ..., θ_d(m,n)} and the second angle set {θ_j(1,1), ..., θ_j(x,y), ..., θ_j(m,n)};
and constructing the gradient direction difference cloud map from the difference set {θ_c(1,1), ..., θ_c(x,y), ..., θ_c(m,n)}.
According to the technical solution provided by the embodiments of the application, the method for calculating the first angle set {θ_d(1,1), ..., θ_d(x,y), ..., θ_d(m,n)} specifically comprises:
θ_d(x, y) = arctan( (∂h_d(x, y)/∂y) / (∂h_d(x, y)/∂x) )
The method for calculating the second angle set {θ_j(1,1), ..., θ_j(x,y), ..., θ_j(m,n)} specifically comprises:
θ_j(x, y) = arctan( (∂h_j(x, y)/∂y) / (∂h_j(x, y)/∂x) )
According to the technical solution provided by the embodiments of the application, the method for extracting the first height data set {h_d(1,1), ..., h_d(x,y), ..., h_d(m,n)} of the first image specifically comprises the following steps:
selecting a scanning area of the first image as a first scanning area;
acquiring a height data matrix of the first scanning area as a first height data matrix;
normalizing the first height data matrix to obtain the first height data set {h_d(1,1), ..., h_d(x,y), ..., h_d(m,n)}.
According to the technical solution provided by the embodiments of the application, the method for extracting the second height data set {h_j(1,1), ..., h_j(x,y), ..., h_j(m,n)} of the second image specifically comprises the following steps:
selecting a scanning area of the second image as a second scanning area;
acquiring a height data matrix of the second scanning area as a second height data matrix;
normalizing the second height data matrix to obtain the second height data set {h_j(1,1), ..., h_j(x,y), ..., h_j(m,n)}.
The beneficial effect of this application is as follows: after extracting the first height data set of the moving contact surface image and the second height data set of the static contact surface image, a gradient direction difference cloud map of the first height data set and the second height data set is constructed, and the cloud map is processed with a set angle threshold range θ_threshold to obtain the contact feature profile of the gradient direction difference cloud map; by extracting the contact feature profile of the first image and the contact feature profile of the second image and comparing them with that of the gradient direction difference cloud map, the matching degree of the difference plane height data can be determined, providing initial pre-judgment information for subsequent height data calibration work.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is a flowchart of a method for determining a degree of matching of differential plane height data of a contact set according to the present application;
FIG. 2 is a schematic view of the first image shown in FIG. 1;
FIG. 3 is a schematic view of a second image shown in FIG. 1;
FIG. 4 is a schematic diagram of a gradient direction difference cloud map of FIG. 1;
FIG. 5 is a schematic diagram, in the method of FIG. 1, obtained after the gradient direction difference cloud map is processed with the set angle threshold range θ_threshold;
FIG. 6 is a comparison of the contact feature area profiles of FIGS. 5 and 2;
FIG. 7 is a comparison of the contact feature area profiles of FIGS. 5 and 3;
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Please refer to fig. 1, which is a flowchart of a method for determining a degree of matching between differential plane height data of a contact set according to the present application, including the following steps:
S100: acquiring a surface image of the moving contact to obtain a first image, as shown in FIG. 2;
S110: acquiring a surface image of the static contact to obtain a second image, as shown in FIG. 3;
S200: extracting a first height data set {h_d(1,1), ..., h_d(x,y), ..., h_d(m,n)} of the first image, wherein the first height data set has m rows and n columns, and h_d(x,y) is the height value corresponding to the point with spatial position coordinates (x,y) in the first height data set;
S210: extracting a second height data set {h_j(1,1), ..., h_j(x,y), ..., h_j(m,n)} of the second image, wherein the second height data set has m rows and n columns, and h_j(x,y) is the height value corresponding to the point with spatial position coordinates (x,y) in the second height data set;
S300: constructing a gradient direction difference cloud map of the first height data set and the second height data set, as shown in FIG. 4;
S400: processing the gradient direction difference cloud map with the set angle threshold range θ_threshold to obtain a contact feature profile of the gradient direction difference cloud map;
S410: extracting a contact feature profile of the first image;
S420: extracting a contact feature profile of the second image;
S500: and determining the matching degree of the difference plane height data according to the contact feature profile of the gradient direction difference cloud map and the contact feature profiles of the first image and the second image.
In this embodiment, the contact group is a relay contact group, and includes a movable contact and a stationary contact on a relay.
It will be appreciated that, in the process of establishing the difference plane of the contact group height data, the height data of the moving contact surface must correspond one-to-one with the height data of the static contact; when the height data of the moving and static contacts are acquired separately, the acquired height data do not match because the contact feature positions contained in the scanning areas of the two contacts deviate from each other, and this degree of mismatch could not previously be observed intuitively and effectively, i.e., it could not be quantitatively represented or expressed.
In the present application, after the first height data set of the moving contact surface image and the second height data set of the static contact surface image are extracted, a gradient direction difference cloud map of the first height data set and the second height data set is constructed, and the cloud map is processed with a set angle threshold range θ_threshold to obtain the contact feature profile of the gradient direction difference cloud map; by extracting the contact feature profile of the first image and the contact feature profile of the second image and comparing them with that of the gradient direction difference cloud map, the matching degree of the difference plane height data can be determined, providing initial pre-judgment information for subsequent height data calibration work.
In a preferred implementation of this embodiment, the method for determining the matching degree of the difference plane height data from the contact feature profile of the gradient direction difference cloud map and the contact feature profiles of the first image and the second image specifically comprises:
calculating the similarity r_d between the contact feature profile of the gradient direction difference cloud map and the contact feature profile of the first image;
obtaining the similarity r_j between the contact feature profile of the gradient direction difference cloud map and the contact feature profile of the second image;
calculating the similarity r_sim between the contact feature profile of the gradient direction difference cloud map and the contact feature profiles of the contact group according to the following formula:
[formula image not reproduced: r_sim as a function of r_d and r_j]
The similarity r_sim is used for characterizing the matching degree of the difference plane height data.
Specifically, the matching degree r of the difference plane height data is positively correlated with the similarity r_sim. A standard value r_b can therefore be set: when r_sim > r_b, the matching degree of the difference plane height data is relatively high; when r_sim < r_b, the matching degree of the difference plane height data is relatively low. Several standard values may also be set, so that a plurality of threshold intervals can be used to grade the matching degree: when r_sim falls in the lowest interval, the matching degree of the difference plane height data is extremely low; in the next interval it is low; in the next interval it is relatively high; and in the highest interval it is extremely high [interval boundary formulas given as images are not reproduced].
Because the matching degree r of the difference plane height data is positively correlated with the similarity r_sim, the calculated value of r_sim quantifies the matching degree of the difference plane height data; that is, when comparing the matching degree of several difference planes, the similarity r_sim of the height data of each contact group can be calculated and compared, and the larger the value of r_sim, the better the moving and static contact height data match.
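As an illustration of the interval grading described above, the following Python sketch maps a computed r_sim value to a qualitative matching degree; the interval boundaries 0.25, 0.5 and 0.75 are hypothetical placeholders and are not values taken from the patent.

    # Illustrative sketch only: the boundary values below are assumed placeholders,
    # not the interval formulas defined in the patent.
    def grade_matching_degree(r_sim, bounds=(0.25, 0.5, 0.75)):
        """Map the similarity r_sim (expected in [0, 1]) to a qualitative matching degree."""
        b1, b2, b3 = bounds
        if r_sim < b1:
            return "extremely low"
        if r_sim < b2:
            return "low"
        if r_sim < b3:
            return "relatively high"
        return "extremely high"

    print(grade_matching_degree(0.82))  # -> extremely high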
This step uses the similarities (r_d, r_j) of the contact features to describe the matching degree of the position points of the moving and static contact height data; meanwhile, to display the similarity of the two contact features intuitively, the contact feature profile of the gradient direction difference cloud map can be superimposed on the contact feature profile of the first image, as shown in FIG. 6, and superimposed on the contact feature profile of the second image, as shown in FIG. 7.
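A superposition comparison of this kind could be drawn, for example, with OpenCV and Matplotlib as in the sketch below; the file names of the pre-computed edge images are assumptions for illustration, not files produced by the patent.

    # Minimal sketch: overlay the cloud-map contact feature profile (red) on a contact
    # image profile (grayscale), in the spirit of the comparisons of FIG. 6 and FIG. 7.
    import cv2
    import matplotlib.pyplot as plt

    image_edges = cv2.imread("first_image_contour.png", cv2.IMREAD_GRAYSCALE)  # assumed file
    cloud_edges = cv2.imread("cloud_map_contour.png", cv2.IMREAD_GRAYSCALE)    # assumed file

    # Bring both edge maps to the same size before overlaying.
    cloud_edges = cv2.resize(cloud_edges, (image_edges.shape[1], image_edges.shape[0]))

    plt.imshow(image_edges, cmap="gray")
    plt.contour(cloud_edges, levels=[127], colors="red")  # cloud-map profile drawn in red
    plt.title("Superposition of contact feature profiles")
    plt.show()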
Preferably, the method for calculating the similarity r_d between the contact feature profile of the gradient direction difference cloud map and the contact feature profile of the first image comprises the following steps:
calculating the area A of the contact feature profile of the gradient direction difference cloud map;
calculating the area A_d of the contact feature profile of the first image;
calculating the similarity r_d between the contact feature profile of the cloud map and the contact feature profile of the first image according to the following formula:
[formula image not reproduced: r_d as a function of A and A_d]
Preferably, the method for obtaining the similarity r_j between the contact feature profile of the gradient direction difference cloud map and the contact feature profile of the second image comprises the following steps:
calculating the area A of the contact feature profile of the gradient direction difference cloud map;
calculating the area A_j of the contact feature profile of the second image;
calculating the similarity r_j between the contact feature profile of the cloud map and the contact feature profile of the second image according to the following formula:
[formula image not reproduced: r_j as a function of A and A_j]
Preferably, the area of a contact feature profile may be calculated as follows: the region inside the contact feature profile is filled with the imfill function in Matlab, and after filling, the area of the filled region is obtained with the regionprops function; the unit of the area is pixels. The areas of the contact feature profiles of the first image, the second image and the gradient direction difference cloud map can all be obtained in this way. Alternatively, the regionprops function can return the dimensions of the bounding rectangle of the contact feature profile, and the area of the bounding rectangle can be used to represent the area of the contact feature profile.
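For readers working outside Matlab, a rough Python/OpenCV equivalent of this area measurement is sketched below. The file names and Canny thresholds are assumptions, and because the similarity formulas for r_d, r_j and r_sim appear only as images in the source text, the ratio-and-average combination at the end is an assumed stand-in rather than the patent's own definition.

    # Sketch of the area-based comparison: fill the largest closed contour of each edge map
    # (a stand-in for imfill + regionprops) and count its pixels.
    import cv2
    import numpy as np

    def edges_of(path):
        """Grayscale read + Canny edge detection (thresholds are assumed values)."""
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        return cv2.Canny(gray, 50, 150)

    def profile_area(edge_image):
        """Area in pixels of the largest closed contact feature profile in an edge map."""
        contours, _ = cv2.findContours(edge_image, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return 0.0
        largest = max(contours, key=cv2.contourArea)
        mask = np.zeros_like(edge_image)
        cv2.drawContours(mask, [largest], -1, 255, thickness=cv2.FILLED)  # fill the profile
        return float(np.count_nonzero(mask))                              # filled area in pixels

    A = profile_area(edges_of("cloud_map.png"))       # cloud map profile area (assumed file)
    A_d = profile_area(edges_of("first_image.png"))   # moving contact profile area (assumed file)
    A_j = profile_area(edges_of("second_image.png"))  # static contact profile area (assumed file)

    # Assumed stand-ins for the similarity formulas (NOT the patent's definitions):
    r_d = min(A, A_d) / max(A, A_d)
    r_j = min(A, A_j) / max(A, A_j)
    r_sim = (r_d + r_j) / 2.0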
In a preferred embodiment of this embodiment, the method for acquiring the moving contact surface image and the fixed contact surface image specifically includes: shooting the surface micro-topography of the movable contact and the fixed contact of the contact group by a non-contact three-dimensional topography instrument to obtain a first image and a second image; an image (i.e., a first image) of the contact surface of the movable contact of a relay contact set of a certain type is shown in fig. 2, and an image (i.e., a second image) of the contact surface of the stationary contact of the relay contact set is shown in fig. 3.
Specifically, the three-dimensional topography scanner is a non-contact scanning device that detects, on an optical principle, small changes in the distance of the measured surface relative to a focusing optical system. In use, the high-definition camera on the scanner first photographs the surfaces of the moving and static contacts to obtain the contact surface images (the first image and the second image); scanning areas marked by red frames are then selected on the first image and the second image displayed by the system, and the position and size of the scanning areas can be specified manually. The scanner can be rotated through an angle to measure the contact surface rapidly, repeatedly and from multiple sides, performing micrometre-level measurement of the sample surface; the surface data are then statistically integrated to obtain a height data set.
In a preferred implementation of this embodiment, the method for extracting the first height data set {h_d(1,1), ..., h_d(x,y), ..., h_d(m,n)} of the first image specifically comprises the following steps:
selecting a scanning area of the first image as a first scanning area;
acquiring a height data matrix of the first scanning area as a first height data matrix;
Specifically, the method for acquiring the first height data matrix comprises the following steps: scanning the height data of the first scanning area with the three-dimensional topography scanner, and importing the height data into programming software to convert them into a height data matrix in matrix form.
Normalizing the first height data matrix yields the first height data set {h_d(1,1), ..., h_d(x,y), ..., h_d(m,n)}, wherein the first height data set has m rows and n columns, and h_d(x, y) is the height value corresponding to the point with spatial position coordinates (x, y) in the first height data set.
Specifically, the first height data matrix is normalized by the following formula to obtain the first height data set:
h_d(x, y) = (h_1(x, y) - min(h_1)) / (max(h_1) - min(h_1))
where min(h_1) is the minimum value in the first height matrix, max(h_1) is the maximum value in the first height matrix, and h_1(x, y) is the height value corresponding to the point with position coordinates (x, y) in the first height matrix; normalizing every point h_1(x, y) of the first height matrix in this way yields the first height data set {h_d(1,1), ..., h_d(x,y), ..., h_d(m,n)}.
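A minimal NumPy sketch of this min-max normalization, with a toy matrix standing in for the scanned height data, is:

    # Minimal sketch of the min-max normalization described above.
    import numpy as np

    def normalize_height(h):
        """Map a raw height matrix onto [0, 1]: (h - min(h)) / (max(h) - min(h))."""
        h = np.asarray(h, dtype=float)
        return (h - h.min()) / (h.max() - h.min())

    h1 = np.array([[12.0, 15.0],
                   [18.0, 21.0]])  # toy stand-in for the scanned first height matrix
    h_d = normalize_height(h1)     # first height data set, values in [0, 1]
    print(h_d)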
In a preferred implementation of this embodiment, the method for extracting the second height data set {h_j(1,1), ..., h_j(x,y), ..., h_j(m,n)} of the second image specifically comprises the following steps:
selecting a scanning area of the second image as a second scanning area;
acquiring a height data matrix of the second scanning area as a second height data matrix;
Similarly, the method for acquiring the second height data matrix comprises the following steps: scanning the height data of the second scanning area with the three-dimensional topography scanner, and importing the height data into programming software to convert them into a height data matrix in matrix form.
Normalizing the second height data matrix yields the second height data set {h_j(1,1), ..., h_j(x,y), ..., h_j(m,n)}, wherein the second height data set has m rows and n columns, and h_j(x, y) is the height value corresponding to the point with spatial position coordinates (x, y) in the second height data set.
Specifically, the second height data matrix is normalized by the following formula to obtain the second height data set:
h_j(x, y) = (h_2(x, y) - min(h_2)) / (max(h_2) - min(h_2))
where min(h_2) is the minimum value in the second height matrix, max(h_2) is the maximum value in the second height matrix, and h_2(x, y) is the height value corresponding to the point with position coordinates (x, y) in the second height matrix; normalizing every point h_2(x, y) of the second height matrix in this way yields the second height data set {h_j(1,1), ..., h_j(x,y), ..., h_j(m,n)}.
In a preferred implementation of this embodiment, the method for constructing the gradient direction difference cloud map of the first height data set and the second height data set specifically comprises:
calculating the angle values of the gradient directions of the points in the first height data set {h_d(1,1), ..., h_d(x,y), ..., h_d(m,n)} as a first angle set {θ_d(1,1), ..., θ_d(x,y), ..., θ_d(m,n)};
calculating the angle values of the gradient directions of the points in the second height data set {h_j(1,1), ..., h_j(x,y), ..., h_j(m,n)} as a second angle set {θ_j(1,1), ..., θ_j(x,y), ..., θ_j(m,n)};
calculating the set {θ_c(1,1), ..., θ_c(x,y), ..., θ_c(m,n)} of absolute differences between corresponding points of the first angle set and the second angle set;
and constructing the gradient direction difference cloud map from the difference set {θ_c(1,1), ..., θ_c(x,y), ..., θ_c(m,n)}.
Specifically, the calculation formula of the difference set is as follows:
θ_c(x, y) = |θ_d(x, y) - θ_j(x, y)|
The gradient direction difference cloud map can then be constructed from the difference set {θ_c(1,1), ..., θ_c(x,y), ..., θ_c(m,n)} using the contourf function of Matlab software.
Specifically, processing the gradient direction difference cloud map with the set angle threshold range θ_threshold allows the cloud map to suppress the information of non-deformed areas and to highlight the contact feature information contained in the height data; because the change in the gradient direction angle difference is obvious at the boundary between the non-contact feature region and the contact feature region of the contact group, the contact features of the contact group can be reflected clearly and effectively, as shown in FIG. 5.
Specifically, the angle threshold range θ_threshold is a set value.
It can be appreciated that data points with different values in the difference set {θ_c(1,1), ..., θ_c(x,y), ..., θ_c(m,n)} are represented by different colors or shades in the gradient direction difference cloud map; for example, if the value of θ_c(1,1) is 70°, it is shown in a first color in the gradient direction difference cloud map, and if the value of θ_c(2,1) is 71°, it is shown in a second color.
Threshold range thetathresholdThe setting method is as follows:
extracting a plurality of data points at the color boundary in the difference cloud picture to obtain an angle threshold set;
calculating the mean value of all data points in the angle threshold set to obtain a standard threshold value thetab
The threshold range thetathresholdSatisfies the following conditions: thetac<θb
The threshold range thetathresholdThe display range of the contact characteristics of the difference cloud chart in the gradient direction as the height data of the contact group can clearly reflect the contact characteristics of the contact surface of the contact group.
Furthermore, in order to ensure the accuracy of the threshold range, the standard threshold θ corresponding to each gradient direction difference cloud image can be respectively calculated by obtaining the gradient direction difference cloud images of a plurality of contact groups of the same modelbAll standard threshold values thetabTaking the mean value to obtain the final standard threshold value thetab′。
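A sketch of this thresholding step, with hand-picked boundary samples and a toy difference set that are purely illustrative assumptions, might be:

    # Sketch: average assumed boundary samples to get the standard threshold theta_b, then keep
    # only points of the difference set with theta_c < theta_b when displaying contact features.
    import numpy as np

    theta_c = np.abs(np.random.default_rng(1).normal(60.0, 20.0, size=(64, 64)))  # toy difference set
    boundary_samples = np.array([68.0, 72.0, 70.5, 69.0])  # assumed values picked at color boundaries

    theta_b = boundary_samples.mean()                      # standard threshold
    masked = np.where(theta_c < theta_b, theta_c, np.nan)  # suppress non-deformed regions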
In a preferred implementation of this embodiment, the method for calculating the first angle set {θ_d(1,1), ..., θ_d(x,y), ..., θ_d(m,n)} specifically comprises:
θ_d(x, y) = arctan( (∂h_d(x, y)/∂y) / (∂h_d(x, y)/∂x) )
where grad h_d(x, y) is the gradient direction corresponding to the point with spatial position (x, y) in the first height data set; the gradient direction is the direction in which the modulus of the directional derivative at the point takes its maximum value, i.e., the direction in which the height value rises fastest,
and θ_d(x, y) is the angle value of grad h_d(x, y), i.e., the angle value of the gradient direction corresponding to the point with spatial position (x, y) in the first height data set.
Specifically, the gradient direction can be calculated with the built-in gradient function of Matlab software, which computes the gradients in the X direction and the Y direction.
In a preferred implementation of this embodiment, the method for calculating the second angle set {θ_j(1,1), ..., θ_j(x,y), ..., θ_j(m,n)} specifically comprises:
θ_j(x, y) = arctan( (∂h_j(x, y)/∂y) / (∂h_j(x, y)/∂x) )
where grad h_j(x, y) is the gradient direction corresponding to the point with spatial position (x, y) in the second height data set, and θ_j(x, y) is the angle value of grad h_j(x, y), i.e., the angle value of the gradient direction corresponding to the point with spatial position (x, y) in the second height data set.
Specifically, the gradient direction can be calculated with the built-in gradient function of Matlab software, which computes the gradients in the X direction and the Y direction.
In a preferred implementation manner of this embodiment, the method for extracting the contact feature contour of the first image specifically includes:
carrying out gray level processing on the first image to obtain a first gray level image;
and carrying out edge detection on the first gray level image through a Canny operator in Matlab software, and extracting a contact characteristic outline.
In a preferred implementation manner of this embodiment, the method for extracting the contact feature contour of the second image specifically includes:
carrying out gray level processing on the second image to obtain a second gray level image;
and carrying out edge detection on the second gray image through a Canny operator in Matlab software, and extracting a contact characteristic outline.
In a preferred embodiment of this embodiment, the method for obtaining the contact feature profile of the gradient direction difference cloud map specifically includes:
carrying out grayscale processing on the gradient direction difference cloud map to obtain a grayscale image of the gradient direction difference cloud map;
performing edge detection on the grayscale image of the gradient direction difference cloud map with a Canny operator in Matlab software, and extracting the contact feature profile;
and scaling the contact feature profile of the gradient direction difference cloud map so that it has the same size and proportion as the contact feature profile of the first image and the contact feature profile of the second image.
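A compact OpenCV sketch of these grayscale, Canny and rescaling steps is given below; the file names and Canny thresholds are assumptions, not values from the patent.

    # Sketch: grayscale conversion, Canny edge detection, and rescaling the cloud-map profile
    # to the size of the contact image profiles.
    import cv2

    first_gray = cv2.cvtColor(cv2.imread("first_image.png"), cv2.COLOR_BGR2GRAY)
    second_gray = cv2.cvtColor(cv2.imread("second_image.png"), cv2.COLOR_BGR2GRAY)
    cloud_gray = cv2.cvtColor(cv2.imread("cloud_map.png"), cv2.COLOR_BGR2GRAY)

    first_profile = cv2.Canny(first_gray, 50, 150)    # contact feature profile of the first image
    second_profile = cv2.Canny(second_gray, 50, 150)  # contact feature profile of the second image
    cloud_profile = cv2.Canny(cloud_gray, 50, 150)    # contact feature profile of the cloud map

    # Scale the cloud-map profile to the same size as the contact image profiles.
    cloud_profile = cv2.resize(cloud_profile,
                               (first_profile.shape[1], first_profile.shape[0]),
                               interpolation=cv2.INTER_NEAREST)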
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by a person skilled in the art that the scope of the invention as referred to in the present application is not limited to the embodiments with a specific combination of the above-mentioned features, but also covers other embodiments with any combination of the above-mentioned features or their equivalents without departing from the inventive concept. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (10)

1. A method for judging the degree of matching of differential plane height data of a contact group, wherein the contact group comprises a movable contact and a fixed contact, and the method is characterized in that: the method comprises the following steps:
acquiring a surface image of the moving contact to obtain a first image;
acquiring a static contact surface image to obtain a second image;
extracting a first height data set {h_d(1,1), ..., h_d(x,y), ..., h_d(m,n)} of the first image, wherein the first height data set has m rows and n columns, and h_d(x,y) is the height value corresponding to the point with spatial position coordinates (x,y) in the first height data set;
extracting a second height data set {h_j(1,1), ..., h_j(x,y), ..., h_j(m,n)} of the second image, wherein the second height data set has m rows and n columns, and h_j(x,y) is the height value corresponding to the point with spatial position coordinates (x,y) in the second height data set;
constructing a gradient direction difference cloud map of the first height data set and the second height data set;
processing the gradient direction difference cloud map with a set angle threshold range θ_threshold to obtain a contact feature profile of the gradient direction difference cloud map;
extracting a contact feature profile of the first image;
extracting a contact feature profile of the second image;
and determining the matching degree of the difference plane height data according to the contact feature profile of the gradient direction difference cloud map and the contact feature profiles of the first image and the second image.
2. The method for determining the degree of matching of the differential plane height data of the contact set according to claim 1, wherein: the method for judging the matching degree of the difference plane height data from the contact feature profile of the gradient direction difference cloud map and the contact feature profiles of the first image and the second image specifically comprises the following steps:
calculating the similarity r_d between the contact feature profile of the gradient direction difference cloud map and the contact feature profile of the first image;
obtaining the similarity r_j between the contact feature profile of the gradient direction difference cloud map and the contact feature profile of the second image;
calculating the similarity r_sim between the contact feature profile of the gradient direction difference cloud map and the contact feature profiles of the contact group according to the following formula:
[formula image not reproduced: r_sim as a function of r_d and r_j]
The similarity r_sim is used for characterizing the matching degree of the difference plane height data.
3. The method for determining the degree of matching of the differential plane height data of the contact set according to claim 2, wherein: the method for calculating the similarity r_d between the contact feature profile of the gradient direction difference cloud map and the contact feature profile of the first image comprises the following steps:
calculating the area A of the contact feature profile of the gradient direction difference cloud map;
calculating the area A_d of the contact feature profile of the first image;
calculating the similarity r_d between the contact feature profile of the cloud map and the contact feature profile of the first image according to the following formula:
[formula image not reproduced: r_d as a function of A and A_d]
4. The method for determining the degree of matching of the differential plane height data of the contact set according to claim 2, wherein: the method for obtaining the similarity r_j between the contact feature profile of the gradient direction difference cloud map and the contact feature profile of the second image comprises the following steps:
calculating the area A of the contact feature profile of the gradient direction difference cloud map;
calculating the area A_j of the contact feature profile of the second image;
calculating the similarity r_j between the contact feature profile of the cloud map and the contact feature profile of the second image according to the following formula:
[formula image not reproduced: r_j as a function of A and A_j]
5. The method for determining the degree of matching of the differential plane height data of the contact set according to claim 1, wherein:
the method for extracting the contact feature profile of the first image specifically comprises the following steps: carrying out gray level processing on the first image to obtain a first gray level image; performing edge detection on the first gray level image through a Canny operator, and extracting a contact characteristic outline;
the method for extracting the contact feature profile of the second image specifically comprises the following steps: carrying out gray level processing on the second image to obtain a second gray level image; and carrying out edge detection on the second gray image through a Canny operator, and extracting a contact feature outline.
6. The method for determining the degree of matching of the differential plane height data of the contact set according to claim 1, wherein: the method for obtaining the contact characteristic profile of the gradient direction difference cloud picture specifically comprises the following steps:
carrying out gray level processing on the gradient direction difference cloud picture to obtain a gradient direction difference cloud picture gray level image;
performing edge detection on the gradient direction difference cloud image gray level image through a Canny operator, and extracting a contact characteristic profile;
and scaling the contact feature outline of the gradient direction difference cloud picture so as to be in the same size proportion with the contact feature outline of the first image and the contact feature outline of the second image.
7. The method for determining the degree of matching of the differential plane height data of the contact set according to claim 1, wherein: the method for constructing the gradient direction difference cloud map of the first height data set and the second height data set specifically comprises the following steps:
calculating the angle values of the gradient directions of the points in the first height data set {h_d(1,1), ..., h_d(x,y), ..., h_d(m,n)} as a first angle set {θ_d(1,1), ..., θ_d(x,y), ..., θ_d(m,n)};
calculating the angle values of the gradient directions of the points in the second height data set {h_j(1,1), ..., h_j(x,y), ..., h_j(m,n)} as a second angle set {θ_j(1,1), ..., θ_j(x,y), ..., θ_j(m,n)};
calculating the set {θ_c(1,1), ..., θ_c(x,y), ..., θ_c(m,n)} of absolute differences between corresponding points of the first angle set and the second angle set;
and constructing the gradient direction difference cloud map from the difference set {θ_c(1,1), ..., θ_c(x,y), ..., θ_c(m,n)}.
8. The method of claim 7, wherein the step of determining the degree of matching between the differential plane height data of the contact sets comprises:
calculating the first angle set {θ_d(1,1), ..., θ_d(x,y), ..., θ_d(m,n)} specifically comprises:
θ_d(x, y) = arctan( (∂h_d(x, y)/∂y) / (∂h_d(x, y)/∂x) )
calculating the second angle set {θ_j(1,1), ..., θ_j(x,y), ..., θ_j(m,n)} specifically comprises:
θ_j(x, y) = arctan( (∂h_j(x, y)/∂y) / (∂h_j(x, y)/∂x) )
9. the method for determining the degree of matching of the differential plane height data of the contact sets according to any one of claims 1 to 8, wherein:
the method for extracting the first height data set {h_d(1,1), ..., h_d(x,y), ..., h_d(m,n)} of the first image specifically comprises the following steps:
selecting a scanning area of the first image as a first scanning area;
acquiring a height data matrix of the first scanning area as a first height data matrix;
normalizing the first height data matrix to obtain the first height data set {h_d(1,1), ..., h_d(x,y), ..., h_d(m,n)}.
10. The method for determining the degree of matching of the differential plane height data of the contact sets according to any one of claims 1 to 8, wherein:
the method for extracting the second height data set {h_j(1,1), ..., h_j(x,y), ..., h_j(m,n)} of the second image specifically comprises the following steps:
selecting a scanning area of the second image as a second scanning area;
acquiring a height data matrix of the second scanning area as a second height data matrix;
normalizing the second height data matrix to obtain the second height data set {h_j(1,1), ..., h_j(x,y), ..., h_j(m,n)}.
CN202110717747.9A 2021-06-28 2021-06-28 Method for judging contact group difference plane height data matching degree Active CN113450378B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110717747.9A CN113450378B (en) 2021-06-28 2021-06-28 Method for judging contact group difference plane height data matching degree

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110717747.9A CN113450378B (en) 2021-06-28 2021-06-28 Method for judging contact group difference plane height data matching degree

Publications (2)

Publication Number Publication Date
CN113450378A true CN113450378A (en) 2021-09-28
CN113450378B CN113450378B (en) 2022-06-03

Family

ID=77813168

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110717747.9A Active CN113450378B (en) 2021-06-28 2021-06-28 Method for judging contact group difference plane height data matching degree

Country Status (1)

Country Link
CN (1) CN113450378B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108256394A (en) * 2016-12-28 2018-07-06 中林信达(北京)科技信息有限责任公司 A kind of method for tracking target based on profile gradients
CN107214703A (en) * 2017-07-11 2017-09-29 江南大学 A kind of robot self-calibrating method of view-based access control model auxiliary positioning
CN109584238A (en) * 2018-12-07 2019-04-05 北京航空航天大学 A kind of bow net operation conditions on-line detecting system and method based on stereoscopic vision
CN110569861A (en) * 2019-09-01 2019-12-13 中国电子科技集团公司第二十研究所 Image matching positioning method based on point feature and contour feature fusion

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
WENHUA LI ET AL.: "Research on contact di erence plane layering and corrosion zoning of sealed", 《IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING》 *
WENHUA LI ET AL.: "Research on contact di erence plane layering and corrosion zoning of sealed", 《IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING》, vol. 14, no. 11, 22 August 2019 (2019-08-22), pages 1595 - 1601, XP072431797, DOI: 10.1002/tee.22980 *
WENHUA LI ETAL.: "Contact Spots Analysis of Sealed Relay Contact Pair Based on Contact", 《IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING》 *
WENHUA LI ETAL.: "Contact Spots Analysis of Sealed Relay Contact Pair Based on Contact", 《IEEJ TRANSACTIONS ON ELECTRICAL AND ELECTRONIC ENGINEERING》, vol. 16, no. 3, 11 February 2021 (2021-02-11), pages 355 - 363 *
SONG WENRONG (宋文荣): "An intelligent detection method for relay contact spacing based on image processing technology" (一种基于图像处理技术的继电器触点间距智能检测方法), 《Applied Laser (应用激光)》 *
SONG WENRONG (宋文荣): "An intelligent detection method for relay contact spacing based on image processing technology" (一种基于图像处理技术的继电器触点间距智能检测方法), 《Applied Laser (应用激光)》, vol. 32, no. 5, 31 October 2012 (2012-10-31), pages 434 - 439 *

Also Published As

Publication number Publication date
CN113450378B (en) 2022-06-03

Similar Documents

Publication number Publication date Title
CN106651752B (en) Three-dimensional point cloud data registration method and splicing method
CN111243032B (en) Full-automatic detection method for checkerboard corner points
CN104318548B (en) Rapid image registration implementation method based on space sparsity and SIFT feature extraction
CN109215063B (en) Registration method of event trigger camera and three-dimensional laser radar
KR100810326B1 (en) Method for generation of multi-resolution 3d model
CN112132907B (en) Camera calibration method and device, electronic equipment and storage medium
CN101147159A (en) Fast method of object detection by statistical template matching
CN115096206A (en) Part size high-precision measurement method based on machine vision
CN113989336A (en) Visible light image and infrared image registration method and device
CN117495852A (en) Digital printing quality detection method based on image analysis
CN113450378B (en) Method for judging contact group difference plane height data matching degree
CN114998571B (en) Image processing and color detection method based on fixed-size markers
CN115239801B (en) Object positioning method and device
CN116125489A (en) Indoor object three-dimensional detection method, computer equipment and storage medium
CN112102347B (en) Step detection and single-stage step height estimation method based on binocular vision
JP4639044B2 (en) Contour shape extraction device
CN112766338B (en) Method, system and computer readable storage medium for calculating distance image
Wang et al. Multi-surface hydraulic valve block technique hole plug inspection from monocular image
JP2018041169A (en) Information processing device and control method and program thereof
CN110334372B (en) BIM augmented reality simulation method based on drawing registration
CN111667429A (en) Target positioning and correcting method for inspection robot
Takaoka et al. Depth map super-resolution for cost-effective rgb-d camera
CN115937320B (en) Visual positioning method for polishing mobile phone shell
JP5845139B2 (en) Graphic detection processing apparatus, graphic detection processing method, and graphic detection processing program
Meierhold et al. Referencing of images to laser scanner data using linear features extracted from digital images and range images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant