CN117495829B - Intelligent watch hardware quality detection method - Google Patents
- Publication number
- CN117495829B (application CN202311522086.XA)
- Authority
- CN
- China
- Prior art keywords
- image
- points
- chain
- block
- gray
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06T7/0002 — Inspection of images, e.g. flaw detection; G06T7/0004 — Industrial image inspection
- G06T7/11 — Region-based segmentation
- G06T7/136 — Segmentation; edge detection involving thresholding
- G06T7/194 — Segmentation involving foreground-background segmentation
- G06T2207/20021 — Dividing image into blocks, subimages or windows
- G06T2207/30108 — Industrial image inspection; G06T2207/30136 — Metal
- Y02P90/30 — Computing systems specially adapted for manufacturing
Abstract
The invention relates to the technical field of image data processing, and in particular to a quality detection method for smart watch hardware, comprising the following steps: acquiring a watch chain gray image and equally dividing it into a plurality of image blocks of the same size; obtaining the initial block overlapping rate of the watch chain gray image according to the differences between all adjacent image blocks; obtaining rejection points in the watch chain gray image according to the number and gray values of the pixel points between adjacent abrupt change points in each row; obtaining an updated block overlapping rate and, from it, a plurality of updated image blocks; and judging whether the watch chain quality in the watch chain surface image is qualified according to the gray value differences of the pixel points in all updated image blocks. By removing the rejection points, which correspond to the gaps between the links of the watch chain, and by adapting the block overlapping rate, the method improves the accuracy of threshold segmentation of defects in the image, and thereby the accuracy of the quality detection result for smart watch hardware.
Description
Technical Field
The invention relates to the technical field of image data processing, and in particular to a quality detection method for smart watch hardware.
Background
In the age of intelligence, every aspect of life involves intelligent technology, and watches are no exception: smart watches have gradually been embraced by the public for multifunctional features such as heart rate detection, positioning and water resistance. The replaceability of the watch chain satisfies the pursuit of personal style, and the traditional metal watch chain remains a popular choice because its material satisfies the pursuit of texture, so the importance of defect detection on the appearance of the watch chain is self-evident. At present, defects on the watch chain of a smart watch are usually detected by means of common image threshold segmentation.
The existing problem is as follows: on the watch chain of a smart watch, because of the gaps between the links, the image threshold segmentation method easily misdetects the gaps between the links as defects, which reduces the accuracy of the quality detection result for the smart watch hardware.
Disclosure of Invention
The invention provides a quality detection method for smart watch hardware, aiming to solve the above problem.
The quality detection method for smart watch hardware of the invention adopts the following technical scheme:
An embodiment of the invention provides a quality detection method for smart watch hardware, comprising the following steps:
acquiring a watch chain surface image of a smart watch and performing graying processing to obtain a watch chain gray image;
equally dividing the watch chain gray image into a plurality of image blocks of the same size; obtaining shadow points and highlight points in each image block according to the gray values of the pixel points in each image block; obtaining the initial block overlapping rate of the watch chain gray image according to the differences in the numbers of shadow points and highlight points in all adjacent image blocks;
obtaining abrupt change points in each row according to the gray value differences of the pixel points in each row of the watch chain gray image; obtaining rejection points in the watch chain gray image according to the number and gray values of the pixel points between adjacent abrupt change points in all rows;
obtaining an updated block overlapping rate according to the size of the watch chain gray image, the initial block overlapping rate and the number of rejection points; obtaining a plurality of updated image blocks according to the updated block overlapping rate;
and judging whether the quality of the watch chain in the watch chain surface image is qualified according to the gray value differences of the pixel points in all updated image blocks.
Further, obtaining the shadow points and highlight points in each image block according to the gray values of the pixel points in each image block comprises the following specific steps:
marking any image block as a target block; recording the mean of the gray values of all pixel points in the target block as a judgment threshold;
marking the pixel points in the target block whose gray values are smaller than or equal to the judgment threshold as shadow points;
and marking the pixel points in the target block whose gray values are greater than the judgment threshold as highlight points.
Further, obtaining the initial block overlapping rate of the watch chain gray image according to the differences in the numbers of shadow points and highlight points in all adjacent image blocks comprises the following specific steps:
in the watch chain gray image, marking the adjacent image blocks of each image block as the reference blocks of that image block;
obtaining the initial block overlapping rate of the watch chain gray image according to the differences in the numbers of shadow points and highlight points between all image blocks and their reference blocks, where, in the corresponding calculation formula:
C is the initial block overlapping rate of the watch chain gray image, K′_i is the number of shadow points in the i-th image block, K_i is the number of highlight points in the i-th image block, D′_{i,j} is the number of shadow points in the j-th reference block of the i-th image block, D_{i,j} is the number of highlight points in the j-th reference block of the i-th image block, x_i is the number of reference blocks of the i-th image block, y is the number of image blocks into which the watch chain gray image is equally divided, and |·| denotes the absolute value.
Further, obtaining the abrupt change points in each row according to the gray value differences of the pixel points in each row of the watch chain gray image comprises the following specific steps:
marking any row in the watch chain surface image as a target row; marking any pixel point in the target row as a target point;
in the target row, marking the two pixel points adjacent to the target point as reference points;
recording the absolute value of the difference between the gray values of the two reference points as the degree of abrupt change of the target point;
and marking the pixel points in the target row whose degree of abrupt change is greater than a preset abrupt change threshold as abrupt change points.
Further, obtaining the rejection points in the watch chain gray image according to the number and gray values of the pixel points between adjacent abrupt change points in all rows comprises the following specific steps:
among all abrupt change points in the target row, marking any two adjacent abrupt change points as a first abrupt change point and a second abrupt change point respectively;
recording the number of pixel points between the first abrupt change point and the second abrupt change point in the target row as the spacing between the two points;
recording the mean of the spacings between all adjacent abrupt change points in all rows as the standard spacing;
obtaining the rejection degree of the pixel points between the first abrupt change point and the second abrupt change point according to the spacing between the two points, the standard spacing and the gray values of the pixel points between them;
and, among the pixel points between all adjacent abrupt change points in all rows, marking the pixel points between two adjacent abrupt change points whose rejection degree is greater than a preset rejection threshold as rejection points.
Further, in the calculation formula corresponding to the rejection degree of the pixel points between the first abrupt change point and the second abrupt change point, obtained from the spacing between the two points, the standard spacing and the gray values of the pixel points between them:
T is the rejection degree of the pixel points between the first abrupt change point and the second abrupt change point, F is the mean of the gray values of the pixel points between the first abrupt change point and the second abrupt change point, m is the spacing between the first abrupt change point and the second abrupt change point, m_0 is the standard spacing, and |·| denotes the absolute value.
Further, in the calculation formula corresponding to the updated block overlapping rate, obtained from the size of the watch chain gray image, the initial block overlapping rate and the number of rejection points:
C′ is the updated block overlapping rate, C is the initial block overlapping rate of the watch chain gray image, M and N are the length and width of the watch chain gray image respectively, q is the number of rejection points in the watch chain gray image, and b is the preset blocking coefficient.
Further, obtaining a plurality of updated image blocks according to the updated block overlapping rate comprises the following specific steps:
removing all rejection points from the watch chain surface image to obtain a new watch chain image;
and dividing the new watch chain image into a plurality of overlapping updated image blocks of the same size by using an image overlapping blocking algorithm according to the updated block overlapping rate and the preset image block size.
Further, judging whether the quality of the watch chain in the watch chain surface image is qualified according to the gray value differences of the pixel points in all updated image blocks comprises the following specific steps:
marking any updated image block as a target updated block;
obtaining the segmentation threshold of the target updated block by using Otsu's algorithm;
in the target updated block, recording the region formed by all pixel points whose gray values are smaller than or equal to the segmentation threshold as the target region, and the region formed by all pixel points whose gray values are greater than the segmentation threshold as the background region;
recording the mean of the gray values of all pixel points in the target region as the target gray value;
recording the mean of the gray values of all pixel points in the background region as the background gray value;
calculating the difference of the background gray value minus the target gray value, and if the difference is greater than a preset gray threshold, marking the target region of the target updated block as a defect region;
and judging whether the quality of the watch chain in the watch chain surface image is qualified according to whether a defect region exists in any of the updated image blocks.
Further, judging whether the quality of the watch chain in the watch chain surface image is qualified according to whether a defect region exists in the updated image blocks comprises the following specific steps:
when no defect region exists in any updated image block, judging that the quality of the watch chain in the watch chain surface image is qualified;
and when a defect region exists in any updated image block, judging that the quality of the watch chain in the watch chain surface image is unqualified.
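The Otsu-based judgment steps above can be sketched in plain Python. The `otsu_threshold` helper is a standard implementation of Otsu's method; the preset gray threshold of 40 is an assumed illustrative value, not one fixed by the claims:

```python
def otsu_threshold(pixels):
    """Otsu's method: choose the threshold maximizing between-class variance."""
    best_t, best_var = 0, -1.0
    n = len(pixels)
    for t in range(256):
        low = [p for p in pixels if p <= t]
        high = [p for p in pixels if p > t]
        if not low or not high:
            continue  # one class empty: no valid split at this threshold
        w0, w1 = len(low) / n, len(high) / n
        mu0, mu1 = sum(low) / len(low), sum(high) / len(high)
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def block_has_defect(block, gray_threshold=40):
    """Target region: pixels <= Otsu threshold; background: the rest.
    Flag a defect when background mean minus target mean exceeds the
    preset gray threshold (40 here is an assumed value)."""
    pixels = [p for row in block for p in row]
    t = otsu_threshold(pixels)
    target = [p for p in pixels if p <= t]
    background = [p for p in pixels if p > t]
    if not target or not background:
        return False  # uniform block: nothing to separate, no defect flagged
    return sum(background) / len(background) - sum(target) / len(target) > gray_threshold
```

The watch chain is judged unqualified as soon as any updated block returns `True`.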
The beneficial effects of the technical scheme of the invention are as follows:
In the embodiment of the invention, a watch chain surface image of a smart watch is acquired to obtain a watch chain gray image, the watch chain gray image is equally divided into a plurality of image blocks of the same size, and the initial block overlapping rate of the watch chain gray image is obtained according to the differences between all adjacent image blocks. Rejection points in the watch chain gray image are obtained according to the number and gray values of the pixel points between adjacent abrupt change points in all rows; these are used to remove the link-gap pixel points, preventing them from affecting defect detection and improving defect detection accuracy. The updated block overlapping rate is obtained according to the size of the watch chain gray image, the initial block overlapping rate and the number of rejection points, yielding a plurality of updated image blocks; the adaptive block overlapping rate makes better use of the local information of the image and further improves defect detection accuracy. Whether the quality of the watch chain in the watch chain surface image is qualified is then judged according to the gray value differences of the pixel points in all updated image blocks. By removing the rejection points corresponding to the link gaps and adapting the block overlapping rate, the method improves the accuracy of threshold segmentation of defects in the image, and thereby the accuracy of the quality detection result for smart watch hardware.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the invention, and other drawings may be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of the steps of the method for detecting the quality of smart watch hardware according to the present invention;
FIG. 2 is a schematic diagram of a watch chain surface image of a smart watch to be inspected according to the present embodiment.
Detailed Description
In order to further explain the technical means and effects adopted by the invention to achieve its intended purpose, the quality detection method for smart watch hardware according to the invention is described in detail below with respect to its specific implementation, structure, features and effects, with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
A specific scheme of the quality detection method for smart watch hardware according to the invention is described below with reference to the accompanying drawings.
Referring to FIG. 1, a flowchart of the steps of the method for detecting the quality of smart watch hardware according to an embodiment of the present invention is shown; the method comprises the following steps:
step S001: and acquiring a surface image of a watch chain of the intelligent watch, and carrying out gray processing to obtain a gray image of the watch chain.
Because other objects on the production line would also appear in the acquired images, in this embodiment an industrial camera is used to photograph the smart watch chain production line from a top-down view, obtaining an image of the production line without the watch chain and an image of the production line with the watch chain.
The watch chain surface image of the smart watch is obtained from these two images by using an image differencing algorithm. Graying processing is then performed on the watch chain surface image to obtain the watch chain gray image. Image differencing and image graying are well-known techniques, and the specific methods are not described here. FIG. 2 is a schematic diagram of a watch chain surface image of a smart watch to be inspected according to the present embodiment.
It should be noted that: the watch chain is placed horizontally in the watch chain surface image of the smart watch.
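The frame-differencing step above can be sketched as follows; the function name `bracelet_foreground` and the difference threshold of 25 are illustrative assumptions, not values given in the patent:

```python
def bracelet_foreground(background, frame, diff_threshold=25):
    """Simple frame differencing: pixels whose absolute gray difference from
    the empty-line background image exceeds a threshold are kept as the
    watch chain region; the rest is zeroed out."""
    M, N = len(frame), len(frame[0])
    return [[frame[r][c] if abs(frame[r][c] - background[r][c]) > diff_threshold else 0
             for c in range(N)] for r in range(M)]
```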
Step S002: equally dividing the watch chain gray image into a plurality of image blocks of the same size; obtaining shadow points and highlight points in each image block according to the gray values of the pixel points in each image block; and obtaining the initial block overlapping rate of the watch chain gray image according to the differences in the numbers of shadow points and highlight points in all adjacent image blocks.
In this embodiment, an image adaptive threshold segmentation algorithm based on the image block overlapping rate is used to detect defects in the surface image. This algorithm is an improvement on image adaptive threshold segmentation: by setting an overlapping rate for the image blocks, a certain degree of overlap is introduced between adjacent blocks, so that the local information of the image is better used and fractures or blurred transitions caused by threshold changes at block boundaries are reduced; adaptive threshold segmentation is then performed on each image block.
The watch chain is assembled from a number of small links. If scratches appear on the surface of the watch chain, their distribution across the joints is interrupted by the link gaps, so the image must be processed in blocks during scratch detection.
When the overlapping rate is calculated after the image is divided into blocks, the pixel points in the gap regions between the links interfere with the calculation. The gap regions therefore need to be processed: they are identified, and their pixel points are removed from the calculation of the overlapping rate between blocks. Since the overlapping rate changes after the removal, the initial overlapping rate is corrected according to this change so that it meets the actual requirement.
The divided images are then adjusted according to the corrected overlapping rate, and threshold segmentation is performed on each image block, which makes the scratch regions more prominent and easier to segment.
The preset image block size in this embodiment is (M/b)×(N/b), where b is a preset blocking coefficient, here taken as 10 by way of example; other values may be set in other embodiments, and this embodiment is not limited thereto. M and N are the length and width of the watch chain gray image respectively, and the preset image block size must be set according to the size of the image so that the image can be divided exactly into equal parts.
According to the preset image block size, the watch chain gray image is equally divided into y non-overlapping image blocks of size (M/b)×(N/b), where y is 100.
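The equal, non-overlapping blocking described above can be sketched as follows, assuming the image is a list of pixel rows whose dimensions are divisible by b:

```python
def split_into_blocks(img, b=10):
    """Equally divide a grayscale image (list of rows) into b*b
    non-overlapping blocks of size (M/b) x (N/b).
    Assumes M and N are divisible by b, as the embodiment requires."""
    M, N = len(img), len(img[0])
    bh, bw = M // b, N // b  # block height and width
    blocks = []
    for r in range(b):
        for c in range(b):
            block = [row[c * bw:(c + 1) * bw] for row in img[r * bh:(r + 1) * bh]]
            blocks.append(block)
    return blocks
```

With b = 10 this yields y = 100 blocks, matching the value used in this embodiment.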
The block overlapping rate is related to the stability of the illumination variation. Illumination variations often result in differences in brightness and contrast across different areas of the image. In some cases they cause brightness jumps at block boundaries, which makes segmentation and processing difficult. A lower overlapping rate makes this problem more pronounced, because with fewer overlapping pixels between adjacent blocks a smooth transition in the illumination cannot be captured. The stability of the illumination variation therefore influences the block overlapping rate; by adapting the overlapping rate to this stability, the influence of illumination on the overlapping rate is avoided.
The overlapping rate is determined according to the stability of the illumination variation in adjacent blocks: the less stable the illumination variation, the larger the overlapping rate. The links are made of smooth metal, and under illumination their surfaces form shadow areas and areas of higher brightness; when the illumination changes, the relative positions and proportions of these areas within the blocks also change.
The overlapping rate between blocks is therefore reflected by the differences in the position and proportion of the shadow and higher-brightness areas between each image block and its adjacent image blocks: the larger the difference, the less stable the illumination variation and the larger the overlapping rate.
Any image block is marked as a target block. The mean of the gray values of all pixel points in the target block is recorded as a judgment threshold.
The pixel points in the target block whose gray values are smaller than or equal to the judgment threshold are marked as shadow points, and the pixel points whose gray values are greater than the judgment threshold are marked as highlight points.
In the above manner, the shadow points and highlight points in each image block are obtained.
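A minimal sketch of the shadow/highlight classification above (a block is assumed to be a list of pixel rows):

```python
def shadow_highlight_counts(block):
    """Classify pixels in one block against the block's mean gray value:
    <= mean -> shadow point, > mean -> highlight point.
    Returns (shadow_count, highlight_count)."""
    pixels = [p for row in block for p in row]
    mean = sum(pixels) / len(pixels)  # judgment threshold for this block
    shadow = sum(1 for p in pixels if p <= mean)
    highlight = len(pixels) - shadow
    return shadow, highlight
```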
In the watch chain gray image, the adjacent image blocks of each image block are marked as the reference blocks of that image block. From this, the initial block overlapping rate C of the watch chain gray image is calculated, where:
C is the initial block overlapping rate of the watch chain gray image, K′_i is the number of shadow points in the i-th image block, K_i is the number of highlight points in the i-th image block, D′_{i,j} is the number of shadow points in the j-th reference block of the i-th image block, D_{i,j} is the number of highlight points in the j-th reference block of the i-th image block, x_i is the number of reference blocks of the i-th image block, and y is the number of image blocks into which the watch chain gray image is equally divided. |·| denotes the absolute value.
It should be noted that: the larger the count differences between an image block and its reference blocks, the stronger the illumination variation in the local area where the i-th image block is located and the more detail information that area contains, so the larger the overlapping rate should be, to prevent the detail textures in the local area from being broken. The initial block overlapping rate of the watch chain gray image is obtained accordingly.
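The exact formula for C is given in the patent as an image that does not survive in this text. The sketch below therefore implements one plausible form consistent with the listed variables: per-block averaged absolute count differences against the reference blocks, squashed into (0, 1) so the result can serve as an overlap rate. The exponential squashing and the overall shape are assumptions:

```python
import math

def initial_overlap_rate(counts, neighbors):
    """counts[i] = (shadow_i, highlight_i) for block i; neighbors[i] = list of
    adjacent block indices (the reference blocks). Hypothetical reconstruction:
    larger shadow/highlight count differences with neighbors -> rate closer to 1."""
    y = len(counts)
    total = 0.0
    for i in range(y):
        xi = len(neighbors[i])
        # mean absolute difference of shadow and highlight counts vs. reference blocks
        d = sum(abs(counts[i][0] - counts[j][0]) + abs(counts[i][1] - counts[j][1])
                for j in neighbors[i]) / xi
        total += 1.0 - math.exp(-d)  # map [0, inf) onto [0, 1)
    return total / y
```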
Step S003: obtaining the abrupt change points in each row according to the gray value differences of the pixel points in each row of the watch chain gray image, and obtaining the rejection points in the watch chain gray image according to the number and gray values of the pixel points between adjacent abrupt change points in all rows.
When the initial block overlapping rate is calculated, the blocks contain pixel points from the link gap regions, and these pixel points participate in the calculation and interfere with the resulting overlapping rate; they must be removed so that the overlapping rate meets the requirement.
The pixel points in the link gap regions have darker gray levels, and these regions alternate with the link regions along the watch chain. The link regions have higher gray levels and larger widths, while the gap regions have darker gray levels and smaller widths. Viewed along the direction of the chain, there is an abrupt change of gray level between the edge of a link and a gap; the abrupt change points on the two sides of a gap region are close together, and the gray levels between the two points are low.
Since the watch chain is placed horizontally in the watch chain surface image, any row in the image is marked as a target row, and any pixel point in the target row is marked as a target point.
In the target row, the two pixel points adjacent to the target point are marked as reference points.
The absolute value of the difference between the gray values of the two reference points is recorded as the degree of abrupt change of the target point.
It should be noted that: in this embodiment, the degree of abrupt change of the two end points of a row is set equal to that of their adjacent pixel points.
In the above manner, the degree of abrupt change of each pixel point in the target row is obtained.
The preset abrupt change threshold in this embodiment is 50, given by way of example; other values may be set in other embodiments, and this embodiment is not limited thereto.
In the target row, the pixel points whose degree of abrupt change is greater than the preset abrupt change threshold are marked as abrupt change points.
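A minimal sketch of the abrupt change detection above, assuming each row is a list of gray values and using the example threshold of 50:

```python
def abrupt_points(row, threshold=50):
    """Degree of abrupt change of a pixel = |gray(left neighbor) - gray(right
    neighbor)|; the two end pixels take the degree of their inner neighbor.
    Returns the indices whose degree exceeds the threshold."""
    n = len(row)
    degree = [0] * n
    for k in range(1, n - 1):
        degree[k] = abs(row[k + 1] - row[k - 1])
    degree[0] = degree[1]    # end points inherit the adjacent pixel's degree
    degree[-1] = degree[-2]
    return [k for k in range(n) if degree[k] > threshold]
```

For a bright row with one dark gap pixel, the two pixels flanking the dip are flagged, matching the "abrupt change points on the two sides of a gap" described above.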
Among all abrupt change points in the target row, any two adjacent abrupt change points are marked as a first abrupt change point and a second abrupt change point respectively.
In the target row, the number of pixel points between the first abrupt change point and the second abrupt change point is recorded as the spacing between the two points.
It should be noted that: the pixel points between the first abrupt change point and the second abrupt change point include the two points themselves.
In the above manner, the spacing between each pair of adjacent abrupt change points in the target row is obtained, and then the spacing between each pair of adjacent abrupt change points in every row of the watch chain surface image.
In the watch chain surface image, the mean of the spacings between all adjacent abrupt change points in all rows is recorded as the standard spacing.
It follows that when the gray levels of the pixel points between two abrupt change points are low and the spacing between the two points is small, those pixel points are more likely to belong to a link gap on the chain, that is, more likely to need to be removed.
Therefore, the calculation formula of the rejection T of the pixel points from the first mutation point to the second mutation point is as follows:
wherein T is the rejection of the pixel points from the first mutation point to the second mutation point, F is the average value of the gray values of the pixel points from the first mutation point to the second mutation point, m is the interval from the first mutation point to the second mutation point, m 0 is the standard interval, and I is an absolute function.
It should be noted that the smaller the difference |m − m0| between the spacing of the first and second mutation points and the standard spacing, and the smaller the gray values of the pixel points between them, the more likely those pixel points are a link gap on the watch chain, i.e., the more likely they need to be removed. Accordingly, 1 / (F × |m − m0| + 1) is used as the rejection degree of the pixel points from the first mutation point to the second mutation point, where 1 is added to the denominator to prevent the denominator from being 0.
In the above manner, the rejection degree of the pixel points between every pair of adjacent mutation points in each row of the watch-chain surface image is obtained.
The preset rejection threshold in this embodiment is 0.01. This value is given only as an example; other embodiments may set other values, and the embodiment is not limited thereto.
The pixel points between two adjacent mutation points whose rejection degree is greater than the preset rejection threshold are marked as rejection points.
It should be noted that the pixel points between two adjacent mutation points whose rejection degree is less than or equal to the rejection threshold are pixel points on the surface of the watch chain.
Step S004: obtaining the updated block overlap rate according to the size of the watch-chain gray image, the initial block overlap rate and the number of rejection points; and obtaining a plurality of updated image blocks according to the updated block overlap rate.
All rejection points in the watch-chain surface image are removed to obtain a new watch-chain image. This removal changes the effective overlap rate of the original blocks in the new image, so the overlap rate must be corrected to ensure that the overlap between image blocks reaches the required value.
From this, the updated block overlap rate C′ is calculated as:

C′ = C × (1 + (q × b²) / (M × N))

wherein C′ is the updated block overlap rate, C is the initial block overlap rate of the watch-chain gray image, M and N are respectively the length and width of the watch-chain gray image, q is the number of rejection points in the watch-chain gray image, and b is the preset blocking coefficient.
It should be noted that b × b represents the preset image block size. The larger the number q of rejection points in the watch-chain gray image, the larger the required adjustment of the initial block overlap rate; therefore (q × b²) / (M × N) serves as the overlap-rate adjustment coefficient, and C × (1 + (q × b²) / (M × N)) gives the updated block overlap rate.
Using an image overlap-blocking algorithm, the new watch-chain image is divided, according to the updated block overlap rate and the preset image block size, into a plurality of equally sized, mutually overlapping updated image blocks.
It should be noted that the image overlap-blocking algorithm is a well-known technique, so the specific method is not described here. The block size and the overlap rate are its main parameters; in this embodiment, the updated block overlap rate and the preset image block size serve, respectively, as the overlap rate and block size of the algorithm. The overlap rate is the proportion of the overlapping part between adjacent image blocks; adjusting it controls how much adjacent blocks overlap and thus affects the subsequent processing results.
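The overlap-rate update and the overlap blocking could look like the following sketch. The C′ expression is my reading of the adjustment-coefficient note and is an assumption, as is mapping the overlap rate to a stride of block × (1 − overlap); the function names are likewise choices of this sketch.

```python
import numpy as np

def updated_overlap_rate(c, m, n, q, b):
    """Updated block overlap rate C'.

    Assumed form C' = C * (1 + q*b^2/(M*N)): the more rejection points q
    in an M-by-N image, the larger the upward adjustment of the initial
    overlap rate C (b is the preset blocking coefficient).
    """
    return c * (1.0 + q * b * b / (m * n))

def overlap_blocks(image, block, overlap):
    """Split a 2-D gray image into square blocks of side `block` whose
    neighbours overlap by the fraction `overlap`.

    The stride between block origins is block * (1 - overlap), so e.g.
    overlap = 0.5 makes adjacent blocks share half their width.
    """
    stride = max(1, int(round(block * (1.0 - overlap))))
    h, w = image.shape
    return [image[r:r + block, c:c + block]
            for r in range(0, h - block + 1, stride)
            for c in range(0, w - block + 1, stride)]
```

On an 8 × 8 image with block size 4 and overlap 0.5 this yields a 3 × 3 grid of 4 × 4 blocks, each sharing half its area with its neighbours.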
Step S005: and judging whether the quality of the chain in the chain surface image is qualified or not according to the gray value difference of the pixel points in all the updated image blocks.
Any one of the updated image blocks is recorded as a target update block. The segmentation threshold of the target update block is obtained using the Otsu algorithm. The Otsu algorithm is a well-known technique, so the specific method is not described here.
It is known that scratch defects on the watch chain often result in lower gray values, because a scratch can make the surface of the affected area uneven, disturbing the reflection and refraction of light and thereby reducing the intensity of the reflected or transmitted light. Therefore, in the target update block, the region formed by all pixel points whose gray value is less than or equal to the segmentation threshold is recorded as the target region, and the region formed by all pixel points whose gray value is greater than the segmentation threshold is recorded as the background region.
Since the Otsu algorithm always classifies the pixel points within the target update block into two categories, regardless of whether the block actually contains a defect, a further gray-level check is needed to distinguish a genuine defect from an ordinary intensity variation. The preset gray threshold in this embodiment is 60; this value is given only as an example, other embodiments may set other values, and the embodiment is not limited thereto.
The average of the gray values of all pixel points in the target region is recorded as the target gray value. The average of the gray values of all pixel points in the background region is recorded as the background gray value.
The difference obtained by subtracting the target gray value from the background gray value is calculated; if the difference is greater than the preset gray threshold, the target region is marked as a defect region in the target update block.
It should be noted that if the difference is less than or equal to the preset gray threshold, the target update block is defect-free.
In the above manner, it is determined whether a defect region exists in each updated image block.
When no defect region exists in any of the updated image blocks, the quality of the watch chain in the surface image is judged to be qualified.
When a defect region exists in any of the updated image blocks, the quality of the watch chain in the surface image is judged to be unqualified.
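The per-block defect judgment can be sketched as below. The patent names only the Otsu algorithm; this from-scratch NumPy implementation and the helper names are choices of the sketch, not the claimed method.

```python
import numpy as np

GRAY_THRESHOLD = 60  # preset gray threshold from the embodiment

def otsu_threshold(gray):
    """Otsu's method: the threshold maximising between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0.0
    for t in range(256):
        w0 += hist[t]                 # weight of the class <= t
        if w0 == 0:
            continue
        w1 = total - w0               # weight of the class > t
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def block_has_defect(block):
    """A block is defective when the background-minus-target mean-gray
    difference exceeds the preset gray threshold."""
    t = otsu_threshold(block)
    target = block[block <= t]        # candidate defect (dark) region
    background = block[block > t]
    if target.size == 0 or background.size == 0:
        return False
    return background.mean() - target.mean() > GRAY_THRESHOLD

def chain_quality_ok(blocks):
    """Qualified only if no updated image block contains a defect region."""
    return not any(block_has_defect(b) for b in blocks)
```

A bright block containing a dark scratch patch is flagged, while a block whose two Otsu classes differ by only a few gray levels is not, which is exactly the role of the preset gray threshold.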
At this point, the method of the present invention is complete.
In summary, in the embodiment of the invention, a surface image of the watch chain of the smart watch is acquired to obtain a watch-chain gray image, the gray image is equally divided into a plurality of image blocks of the same size, and the initial block overlap rate of the gray image is obtained from the differences between all adjacent image blocks. Mutation points in each row are obtained from the gray-value differences of the pixel points in each row of the gray image, and rejection points are obtained from the number and gray values of the pixel points between adjacent mutation points in all rows. The updated block overlap rate is then obtained from the size of the gray image, the initial block overlap rate and the number of rejection points, a plurality of updated image blocks are obtained, and whether the quality of the watch chain in the surface image is qualified is judged from the gray-value differences of the pixel points in all updated image blocks. By removing the rejection points, which are the link gaps of the watch chain, and by adaptively adjusting the block overlap rate, the method improves the accuracy of threshold segmentation of defects in the image and thus the accuracy of the smart-watch hardware quality detection result.
The above description is only of the preferred embodiments of the present invention and is not intended to limit the invention, but any modifications, equivalent substitutions, improvements, etc. within the principles of the present invention should be included in the scope of the present invention.
Claims (6)
1. The intelligent watch hardware quality detection method is characterized by comprising the following steps of:
acquiring a bracelet surface image of an intelligent watch, and carrying out gray processing to obtain a bracelet gray image;
Equally dividing the gray level image of the watch chain into a plurality of image blocks with the same size; obtaining shadow points and highlight points in each image block according to the gray value of the pixel points in each image block; obtaining the initial block overlapping rate of the chain gray level image according to the quantity difference of shadow points and highlight points in all adjacent image blocks;
Obtaining abrupt points in each row according to the gray value difference of each row of pixel points in the chain gray image; in the gray level image of the chain, according to the number of pixel points and gray values between adjacent abrupt points in all lines, eliminating points in the gray level image of the chain are obtained;
obtaining updated block overlapping rate according to the size of the gray level image of the chain, the initial block overlapping rate and the number of the eliminating points; obtaining a plurality of updated image blocks according to the updated block overlapping rate;
judging whether the quality of the chain in the chain surface image is qualified or not according to the gray value difference of the pixel points in all the updated image blocks;
the shadow points and the highlight points in each image block are obtained according to the gray value of the pixel points in each image block, and the method comprises the following specific steps:
Marking any image block as a target block; the average value of gray values of all pixel points in the target block is recorded as a judgment threshold value;
Marking the pixel points with the gray values smaller than or equal to the judgment threshold value in the target block as shadow points;
the pixel points with gray values larger than the judging threshold value in the target block are marked as highlight points;
obtaining abrupt change points in each row according to the gray value difference of each row of pixel points in the chain gray image, comprising the following specific steps:
marking any row in the surface image of the watch chain as a target row; marking any pixel point in the target row as a target point;
in the target row, two pixel points adjacent to the target point are marked as reference points;
The absolute value of the difference value of the gray values of the two reference points is recorded as the mutation degree of the target point;
the pixel points with mutation degree larger than a preset mutation threshold value in the target row are marked as mutation points;
in the gray level image of the chain, according to the number of pixel points and gray values between adjacent abrupt points in all lines, the method for obtaining the eliminating points in the gray level image of the chain comprises the following specific steps:
in all mutation points in the target row, marking any two adjacent mutation points as a first mutation point and a second mutation point respectively;
The number of the pixel points from the first mutation point to the second mutation point in the target row is recorded as the distance from the first mutation point to the second mutation point;
The average value of the distances between all adjacent mutation points in all rows is recorded as a standard distance;
obtaining the rejection of the pixel points from the first mutation point to the second mutation point according to the distance from the first mutation point to the second mutation point, the standard distance and the gray value of the pixel points from the first mutation point to the second mutation point;
Among the pixel points between all adjacent mutation points in all rows, the pixel points between two adjacent mutation points whose rejection degree is greater than a preset rejection threshold are marked as rejection points;
The method for obtaining a plurality of updated image blocks according to the updated block overlapping rate comprises the following specific steps:
removing all rejection points from the watch-chain surface image to obtain a new watch-chain image;
and dividing the new watch-chain image into a plurality of mutually overlapping updated image blocks of the same size by using an image overlap-blocking algorithm according to the updated block overlap rate and the preset image block size.
2. The method for detecting the quality of hardware of the intelligent watch according to claim 1, wherein the obtaining the initial block overlapping rate of the gray-scale image of the bracelet according to the difference between the number of shadow points and high-brightness points in all adjacent image blocks comprises the following specific steps:
In the chain gray level image, marking adjacent image blocks of each image block as reference blocks of each image block;
According to the differences in the numbers of shadow points and highlight points between all image blocks and their reference blocks, the initial block overlap rate of the watch-chain gray image is calculated as:

C = (1/y) × Σ_{i=1}^{y} (1/n_i) × Σ_{j=1}^{n_i} (|s_i − s_{i,j}| + |g_i − g_{i,j}|) / (s_i + g_i + s_{i,j} + g_{i,j})

where C is the initial block overlap rate of the watch-chain gray image, s_i is the number of shadow points within the i-th image block, g_i is the number of highlight points within the i-th image block, s_{i,j} and g_{i,j} are respectively the numbers of shadow points and highlight points in the j-th reference block of the i-th image block, n_i is the number of reference blocks of the i-th image block, y is the number of image blocks into which the watch-chain gray image is equally divided, and |·| is the absolute-value function.
3. The method for detecting the quality of hardware of the intelligent watch according to claim 1, wherein the rejection degree of the pixel points from the first mutation point to the second mutation point is calculated, according to the spacing between the first and second mutation points, the standard spacing and the gray values of the pixel points between them, as:

T = 1 / (F × |m − m0| + 1)

wherein T is the rejection degree of the pixel points from the first mutation point to the second mutation point, F is the average gray value of those pixel points, m is the spacing between the first mutation point and the second mutation point, m0 is the standard spacing, and |·| is the absolute-value function.
4. The method for detecting the quality of hardware of the intelligent watch according to claim 1, wherein the updated block overlap rate is calculated, according to the size of the watch-chain gray image, the initial block overlap rate and the number of rejection points, as:

C′ = C × (1 + (q × b²) / (M × N))

wherein C′ is the updated block overlap rate, C is the initial block overlap rate of the watch-chain gray image, M and N are respectively the length and width of the watch-chain gray image, q is the number of rejection points in the watch-chain gray image, and b is the preset blocking coefficient.
5. The method for detecting the quality of hardware in the smart watch according to claim 1, wherein the step of determining whether the quality of the bracelet in the bracelet surface image is acceptable according to the gray value difference of the pixel points in all the updated image blocks comprises the following specific steps:
Recording any one updated image block as a target updated block;
Obtaining a segmentation threshold value of the target update block by using the Otsu algorithm;
in the target updating block, the area formed by all pixel points with gray values smaller than or equal to the segmentation threshold value is marked as a target area; the area formed by all pixel points with gray values larger than the segmentation threshold value is recorded as a background area;
the average value of the gray values of all pixel points in the target area is recorded as a target gray value;
the average value of the gray values of all pixel points in the background area is recorded as the background gray value;
calculating a difference value of a background gray value minus a target gray value, and if the difference value is larger than a preset gray threshold value, marking a target area as a defect area in a target updating block;
and judging whether the quality of the chain in the chain surface image is qualified or not according to the result of whether the defect area exists in all the updated image blocks.
6. The method for detecting the quality of hardware in an intelligent watch according to claim 5, wherein the step of judging whether the quality of the bracelet in the bracelet surface image is qualified according to the result of whether the defect area exists in all updated image blocks comprises the following specific steps:
When no defect region exists in any of the updated image blocks, judging that the quality of the watch chain in the surface image is qualified;
and when a defect region exists in any of the updated image blocks, judging that the quality of the watch chain in the surface image is unqualified.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311522086.XA CN117495829B (en) | 2023-11-15 | 2023-11-15 | Intelligent watch hardware quality detection method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117495829A CN117495829A (en) | 2024-02-02 |
CN117495829B true CN117495829B (en) | 2024-04-30 |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118362483B (en) * | 2024-06-20 | 2024-09-20 | 中国科学院生态环境研究中心 | Pollen automatic sampling anti-stacking control method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102622602A (en) * | 2012-02-28 | 2012-08-01 | 中国农业大学 | Cotton foreign fiber image online dividing method and cotton foreign fiber image online dividing system |
CN109801282A (en) * | 2019-01-24 | 2019-05-24 | 湖北大学 | Pavement behavior detection method, processing method, apparatus and system |
KR20210073425A (en) * | 2019-12-10 | 2021-06-18 | 한국전자기술연구원 | Method for measuring complexity of image patch |
CN113763355A (en) * | 2021-09-07 | 2021-12-07 | 创新奇智(青岛)科技有限公司 | Defect detection method and device, electronic equipment and storage medium |
CN115082467A (en) * | 2022-08-22 | 2022-09-20 | 山东亿昌装配式建筑科技有限公司 | Building material welding surface defect detection method based on computer vision |
CN115375892A (en) * | 2022-09-20 | 2022-11-22 | 南京大学 | Large-size image preprocessing method and system |
Non-Patent Citations (3)
Title |
---|
Recognition of overlapping spinach leaves and weeds based on image blocking and reconstruction; Miao Ronghui; Yang Hua; Wu Jinlong; Liu Haoyu; Transactions of the Chinese Society of Agricultural Engineering; 2020-02-23 (04) * |
Adaptive block-based person re-identification with saliency fusion; Chen Hong et al.; Journal of Electronics & Information Technology; 2017-06-26; Vol. 39 (No. 11) * |
Adaptive block-based fusion method for multi-focus images; Zhang Chuang; Chang Jianhua; Ge Yixian; Sun Dongjiao; Science Technology and Engineering; 2013-07-28 (21) * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||