CN117809064A - Equipment label detection method, device, equipment and storage medium


Info

Publication number: CN117809064A
Application number: CN202311687872.5A
Authority: CN (China)
Prior art keywords: image, label, label image, template, mask
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 荀迅, 夏安然, 王尹, 盛宇航, 苏成琪
Current assignee: Hefei Lianbao Information Technology Co Ltd (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Hefei Lianbao Information Technology Co Ltd
Application filed by Hefei Lianbao Information Technology Co Ltd
Priority to CN202311687872.5A (the priority date is an assumption and is not a legal conclusion)
Publication of CN117809064A

Landscapes

  • Image Analysis (AREA)

Abstract

The present disclosure provides a device label detection method, apparatus, device, and storage medium. The method includes: extracting, based on label position information recorded in a preset configuration file, a first label image corresponding to that position information from an image to be detected of the target device, and acquiring a template label image corresponding to the target device; determining a mask label image based on the first label image and the template label image; sequentially rotating the template label image by a preset angle about a designated position of the template label image; for the template label image at each rotated angle, traversing the mask label image with the template label image and calculating, after each traversal step, the matching degree between the template label image and its corresponding area image in the mask label image; determining a target matching angle according to the matching degree, and determining the target label position in the mask label image based on the target matching angle; and determining, according to the target label position, whether the device label is qualified.

Description

Equipment label detection method, device, equipment and storage medium
Technical Field
The disclosure relates to the technical field of information processing, and in particular relates to a device tag detection method, a device, equipment and a storage medium.
Background
Labels such as a model label and a brand label are usually attached to the C-plane of a notebook computer, and the application of each label on the C-plane must be inspected before the notebook computer leaves the factory, so as to ensure that every label on a shipped notebook computer is applied correctly. The C-plane of a notebook computer is the face on which its keyboard is located.
At present, the label application on the C-plane of a notebook computer is checked either manually or by detection based on a template image. Manual inspection is highly variable, and visual fatigue introduces substantial error into the results, leading to a high miss rate and low detection accuracy. Detection based on a template image compares the feature positions of the template label image against the actually applied label image on the C-plane to obtain a detection result, so how accurately and efficiently the label on the C-plane can be extracted directly affects the accuracy of label detection. The main existing label extraction method is the K-Means clustering algorithm, which is easily affected by the environment and can fail to cluster.
Therefore, how to detect the label application on the C-plane of a notebook computer more accurately has become an urgent technical problem.
Disclosure of Invention
The present disclosure provides a device tag detection method, apparatus, device, and storage medium, so as to at least solve the above technical problems in the prior art.
According to a first aspect of the present disclosure, there is provided a device tag detection method, the method comprising:
extracting a first label image corresponding to the label position information from an image to be detected of target equipment based on the label position information recorded in a preset configuration file, and acquiring a template label image corresponding to the target equipment;
determining a corresponding first mask image based on the first label image and the template label image;
generating a mask label image according to the first mask image and the first label image, wherein the center position of the mask label image is overlapped with the center position of the first label image;
sequentially rotating the template label image by a preset angle, taking a designated position of the template label image as the rotation center;
traversing the mask label image with the template label image at each rotated angle, and calculating, after each traversal step, the matching degree between the template label image and its corresponding area image in the mask label image;
determining a target matching angle according to the matching degree, and determining a target label position in the mask label image based on the target matching angle;
and determining whether the equipment tag is qualified according to the target tag position.
In an embodiment, the determining the corresponding first mask image based on the first label image and the template label image includes:
determining a first diagonal corresponding to the first label image, and determining a second diagonal corresponding to the template label image;
and respectively taking the first diagonal line and the second diagonal line as the length and the width of the image to generate a first mask image.
In one embodiment, the calculating the matching degree between the template tag image and the corresponding area image of the template tag image in the mask tag image after each traversal includes:
determining a rotation matrix corresponding to the template tag image based on the current rotation angle;
performing position conversion on pixel points in the template label image based on the rotation matrix to obtain a pixel matrix corresponding to the template label image;
determining, for the template label image after each traversal step, the area image in the mask label image that corresponds to the template label image;
and determining the matching degree between the template label image and its corresponding area image in the mask label image according to the pixel matrices corresponding to the area image and the template label image.
In an embodiment, the determining, according to the pixel matrices corresponding to the area image and the template label image, the matching degree between the template label image and its corresponding area image in the mask label image includes:
determining the matching degree according to the pixel matrices corresponding to the area image and the template label image using the following formula (a normalized cross-correlation):

S(x, y) = Σ_{x′,y′} [ M(x′, y′) · O(x + x′, y + y′) ] / √( Σ_{x′,y′} M(x′, y′)² · Σ_{x′,y′} O(x + x′, y + y′)² )

where S(x, y) represents the matching degree; M(x′, y′) represents the pixel matrix corresponding to the template label image; x′ and y′ are the position coordinates of any pixel point in the template label image; O(x, y) represents the pixel matrix corresponding to the area image of the template label image in the mask label image; and x and y are the position coordinates of the pixel point in the mask label image corresponding to the pixel point with coordinates (x′, y′) in the template label image.
In one embodiment, before the template label image is sequentially rotated by a preset angle about its designated position, the method further includes:
respectively performing downsampling on the mask label image and the template label image to obtain a downsampled mask label image and a downsampled template label image;
and the step of sequentially rotating the template label image by a preset angle about its designated position includes:
sequentially rotating the downsampled template label image by the preset angle about the designated position of the downsampled template label image.
In an embodiment, the determining the target matching angle according to the matching degree, and determining the target tag position in the mask tag image based on the target matching angle includes:
determining the rotation angle with the maximum corresponding matching degree as a target matching angle;
rotating the mask tag image according to the target matching angle to obtain a rotated image;
and determining the label position in the rotated image as a target label position.
In an embodiment, the determining whether the device tag is qualified according to the target tag position includes:
determining whether the label corresponding to the target label position is a label to be applied by the target equipment;
and if so, determining that the device label of the target device is qualified.
According to a second aspect of the present disclosure, there is provided a device tag detection apparatus, the apparatus comprising:
the image acquisition module is used for extracting a first label image corresponding to the label position information from an image to be detected of target equipment based on the label position information recorded in a preset configuration file, and acquiring a template label image corresponding to the target equipment;
a first image determination module for determining a corresponding first mask image based on the first label image and the template label image;
the second image determining module is used for generating a mask label image according to the first mask image and the first label image, and the center position of the mask label image is overlapped with the center position of the first label image;
the image rotating module is used for sequentially rotating the template label image by a preset angle, taking the designated position of the template label image as the rotation center;
the matching degree calculating module is used for traversing the mask label image with the template label image at each rotated angle, and calculating, after each traversal step, the matching degree between the template label image and its corresponding area image in the mask label image;
The position determining module is used for determining a target matching angle according to the matching degree and determining a target tag position in the mask tag image based on the target matching angle;
and the detection module is used for determining whether the equipment tag is qualified according to the target tag position.
According to a third aspect of the present disclosure, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the methods described in the present disclosure.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of the present disclosure.
With the device label detection method, apparatus, device, and storage medium provided herein, a first label image corresponding to label position information recorded in a preset configuration file is extracted from an image to be detected of the target device, and a template label image corresponding to the target device is acquired; a corresponding first mask image is determined based on the first label image and the template label image; a mask label image is generated from the first mask image and the first label image, with the center position of the mask label image coinciding with the center position of the first label image; the template label image is sequentially rotated by a preset angle about its designated position; for the template label image at each rotated angle, the mask label image is traversed with the template label image and, after each traversal step, the matching degree between the template label image and its corresponding area image in the mask label image is calculated; a target matching angle is determined according to the matching degree, and a target label position in the mask label image is determined based on the target matching angle; and whether the device label is qualified is determined according to the target label position. With this method, the target matching angle can be determined from the matching degree, and the target label position in the mask label image can be accurately determined based on the target matching angle, so that whether the device label is qualified is determined according to the target label position, improving the detection accuracy of the device label.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The above, as well as additional purposes, features, and advantages of exemplary embodiments of the present disclosure will become readily apparent from the following detailed description when read in conjunction with the accompanying drawings. Several embodiments of the present disclosure are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which:
in the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
Fig. 1 shows a schematic implementation flow diagram of a device tag detection method according to an embodiment of the present disclosure;
FIG. 2 illustrates a schematic diagram of tag detection provided by embodiments of the present disclosure;
fig. 3 is a schematic structural diagram of an apparatus tag detection device according to an embodiment of the present disclosure;
fig. 4 shows a schematic diagram of a composition structure of an electronic device according to an embodiment of the disclosure.
Detailed Description
In order to make the objects, features and advantages of the present disclosure more comprehensible, the technical solutions in the embodiments of the present disclosure will be clearly described in conjunction with the accompanying drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are only some embodiments of the present disclosure, but not all embodiments. Based on the embodiments in this disclosure, all other embodiments that a person skilled in the art would obtain without making any inventive effort are within the scope of protection of this disclosure.
Because the existing label detection method has high omission ratio and low detection accuracy, in order to accurately and efficiently extract labels on the C surface of a notebook computer and improve the label detection accuracy, the disclosure provides a device label detection method, a device and a storage medium. The method provided by the present disclosure may be applied to any device capable of image processing, such as a mobile phone, a computer, a tablet computer, and the like.
The technical solutions of the embodiments of the present disclosure will be described below with reference to the drawings in the embodiments of the present disclosure.
Fig. 1 shows a schematic implementation flow diagram of a device tag detection method according to an embodiment of the present disclosure, as shown in fig. 1, where the method includes:
s101, extracting a first label image corresponding to the label position information from an image to be detected of target equipment based on the label position information recorded in a preset configuration file, and acquiring a template label image corresponding to the target equipment.
The target device can be a notebook computer, a mobile phone, a desktop computer, or another device.
In the disclosure, the preset configuration file refers to a configuration file of the target device, where location information and attribute information of each device and element of the target device are recorded, for example, if the target device is a notebook computer, attribute information and label location information of a label to be applied on a C-plane of the notebook computer are recorded in the preset configuration file. The position corresponding to the image to be detected of the target device can be determined according to the tag position information recorded in the preset configuration file, and the image at the corresponding position is extracted to be used as a first tag image corresponding to the tag position information.
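As a purely illustrative sketch of how such a preset configuration file might look (the patent does not fix a storage format; the field names and the pixel-box representation below are assumptions):

```python
# Hypothetical layout of the preset configuration file; the patent only states
# that label position information and attribute information are recorded, so
# the field names and the (x, y, width, height) pixel-box form are assumptions.
PRESET_CONFIG = {
    "device_model": "example-notebook",
    "labels": [
        {"name": "model_label", "x": 120, "y": 340, "width": 200, "height": 80},
        {"name": "brand_label", "x": 900, "y": 40, "width": 150, "height": 60},
    ],
}

def label_boxes(config):
    """Return (name, (x, y, width, height)) for every label in the config."""
    return [(lab["name"], (lab["x"], lab["y"], lab["width"], lab["height"]))
            for lab in config["labels"]]
```

A per-label bounding box of this kind is enough to locate the image region to extract in S101.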
The template label image corresponding to the target device refers to the label area image in the extracted standard applied template image. The standard applied template image is an image acquired by a pointer to a standard device to which a label is applied according to an application standard, for example, for a notebook computer, a device to which a worker applies according to the label application standard may be regarded as a standard notebook computer, and then a C-plane image of the standard notebook computer acquired by the image acquisition device may be regarded as a standard applied template image.
In the disclosure, if the target device is a notebook computer, preprocessing operation can be performed on a C-plane image of the notebook computer to correct the C-plane image to a standard detection position, then a preset configuration file is loaded, and a corresponding OriImg (first label image) is extracted according to a position of a C-plane label of the notebook computer recorded in the preset configuration file. And the ModelImg (template label image) corresponding to the position of the C-plane label of the notebook computer in the corresponding template image can be extracted according to the preset configuration file. In the present disclosure OriImg refers to the first label image and ModelImg refers to the template label image.
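The extraction of OriImg in S101 then reduces to cropping the configured region out of the (corrected) image to be detected; a minimal NumPy sketch, assuming the label position is given as an (x, y, width, height) box:

```python
import numpy as np

def extract_label_image(image, box):
    """Crop the first label image (OriImg) out of the image to be detected.
    `box` is an assumed (x, y, width, height) representation of the label
    position information read from the preset configuration file."""
    x, y, w, h = box
    return image[y:y + h, x:x + w]

# Toy image to be detected: a 100x100 gray image with a bright "label" patch.
img = np.zeros((100, 100), dtype=np.uint8)
img[30:50, 20:60] = 255
ori_img = extract_label_image(img, (20, 30, 40, 20))
```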
S102, determining a corresponding first mask image based on the first label image and the template label image.
In one embodiment, the determining the corresponding first mask image based on the first label image and the stencil label image includes steps A1-A2:
and A1, determining a first diagonal line corresponding to the first label image, and determining a second diagonal line corresponding to the template label image.
And A2, taking the first diagonal line and the second diagonal line as the length and the width of the image respectively, and generating a first mask image.
In the present disclosure, the width ModelImgWidth and height ModelImgHeigth of the template label image ModelImg, and the width OriImgWidth and height OriImgHeigth of the first label image OriImg, may be acquired. The first diagonal corresponding to the first label image is calculated from the width OriImgWidth and height OriImgHeigth of the first label image OriImg using the formula:

√(OriImgWidth² + OriImgHeigth²)

and the second diagonal corresponding to the template label image is calculated from the width ModelImgWidth and height ModelImgHeigth of the template label image ModelImg using the formula:

√(ModelImgWidth² + ModelImgHeigth²)

Then, the first diagonal is taken as the length of the image and the second diagonal as its width to obtain the first mask image; alternatively, the first diagonal is taken as the width and the second diagonal as the length to obtain the first mask image.
Alternatively, in the present disclosure, after the width ModelImgWidth and height ModelImgHeigth of the template label image ModelImg and the width OriImgWidth and height OriImgHeigth of the first label image OriImg are obtained, the width MaskWidth and height MaskHeight of the intermediate image Mask may be determined by the following formulas:
MaskWidth = MAX{ModelImgWidth, OriImgWidth};
MaskHeight = MAX{ModelImgHeigth, OriImgHeigth}.
Then, according to the width MaskWidth and height MaskHeight of the intermediate image Mask, the width MaskImgWidth and height MaskImgHeight of the first mask image MaskImg are determined so that the first mask image can still contain the intermediate image at any rotation angle, i.e.:

MaskImgWidth = MaskImgHeight = √(MaskWidth² + MaskHeight²)

In the present disclosure, ModelImgWidth refers to the width of the template label image, ModelImgHeigth refers to the height of the template label image, OriImgWidth refers to the width of the first label image, OriImgHeigth refers to the height of the first label image, Mask refers to the intermediate image, MaskWidth and MaskHeight refer to the width and height of the intermediate image, MaskImg refers to the first mask image, and MaskImgWidth and MaskImgHeight refer to the width and height of the first mask image.
And S103, generating a mask label image according to the first mask image and the first label image, wherein the center position of the mask label image is overlapped with the center position of the first label image.
In the disclosure, the first label image OriImg may be superimposed on the first mask image MaskImg at the center position to obtain the mask label image OriMaskImg. Specifically, the center position of the first label image is made to coincide with the center position of the first mask image, and the resulting superimposed image is used as the mask label image.
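The centre-aligned superimposition of S103 is straightforward; a minimal NumPy sketch:

```python
import numpy as np

def make_mask_label_image(mask_img, ori_img):
    """Superimpose the first label image OriImg at the centre of the first
    mask image MaskImg, so that the two centre positions coincide (S103);
    the result is the mask label image OriMaskImg."""
    out = mask_img.copy()
    mh, mw = mask_img.shape[:2]
    oh, ow = ori_img.shape[:2]
    top, left = (mh - oh) // 2, (mw - ow) // 2
    out[top:top + oh, left:left + ow] = ori_img
    return out

mask = np.zeros((10, 10), dtype=np.uint8)
label = np.full((4, 4), 255, dtype=np.uint8)
ori_mask_img = make_mask_label_image(mask, label)
```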
S104, sequentially rotating the template label images according to a preset angle by taking the designated position of the template label image as a rotation center.
The preset angle may be set according to an actual application scenario, for example, the preset angle may be set to 5 ° or 10 °. In the present disclosure, the specified position of the stencil label image may be a center position of the stencil label image.
In the disclosure, downsampling based on a gaussian pyramid can be performed on a template tag image ModelImg and a mask tag image OriMaskImg respectively, so that information of different scales of an effective image is obtained:
G_i = D(G_{i-1})

where G_i represents the image obtained by the i-th downsampling; G_i is obtained by downsampling the image produced by the (i-1)-th downsampling once more, and D(·) denotes the Gaussian-pyramid-based downsampling operation.
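One level of the downsampling D(·) can be sketched in pure NumPy (in an OpenCV pipeline, cv2.pyrDown performs an equivalent step); the 5-tap binomial kernel is the usual separable approximation of a Gaussian:

```python
import numpy as np

# 5-tap binomial kernel, the usual separable approximation of a Gaussian.
_KERNEL = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0

def pyr_down(img):
    """One Gaussian-pyramid downsampling step D(): blur with the separable
    5-tap kernel (edge padding), then keep every second row and column."""
    img = img.astype(np.float64)
    h, w = img.shape
    pad = np.pad(img, 2, mode="edge")
    blur = np.zeros_like(img)
    for i, ki in enumerate(_KERNEL):
        for j, kj in enumerate(_KERNEL):
            blur += ki * kj * pad[i:i + h, j:j + w]
    return blur[::2, ::2]

def gaussian_pyramid(img, levels):
    """G_0 = img, G_i = D(G_{i-1}) for i = 1..levels."""
    pyr = [np.asarray(img, dtype=np.float64)]
    for _ in range(levels):
        pyr.append(pyr_down(pyr[-1]))
    return pyr
```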
Then, taking the preset angle AngleStart as the rotation starting point, each rotation matching angle step is traversed. With the image center of the template label image ModelImg as the rotation point, a rotation-translation matrix R_t is obtained, and the template label image ModelImg is affine-transformed to the current rotation position so as to match the mask label image OriMaskImg. The rotation-translation matrix R_t is of the form:

R_t = [ cos θ  -sin θ  t_x ]
      [ sin θ   cos θ  t_y ]

where the translation components t_x and t_y are chosen so that the rotation is performed about the image center. The position transformation of any pixel point in the template image can then be expressed as:

[ x_1 ]       [ x_0 ]
[ y_1 ] = R_t [ y_0 ]
              [  1  ]

where θ represents the rotation angle, R_t represents the rotation-translation matrix, x_0 and y_0 are the abscissa and ordinate of any pixel point on the template label image, x_1 and y_1 are the abscissa and ordinate of the position obtained after affine transformation of the pixel point at (x_0, y_0), and AngleStart refers to the preset angle.
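A sketch of the rotation-translation matrix about the image centre (analogous in form to OpenCV's getRotationMatrix2D, up to the angle sign convention; the translation terms t_x and t_y below are chosen so that the centre maps to itself):

```python
import math
import numpy as np

def rotation_translation_matrix(angle_deg, center):
    """2x3 rotation-translation matrix R_t that rotates points by angle_deg
    about `center` (the image centre of ModelImg in S104)."""
    c = math.cos(math.radians(angle_deg))
    s = math.sin(math.radians(angle_deg))
    cx, cy = center
    # Translation chosen so that `center` is a fixed point of the transform.
    tx = (1 - c) * cx + s * cy
    ty = (1 - c) * cy - s * cx
    return np.array([[c, -s, tx],
                     [s, c, ty]])

def transform_point(r_t, x0, y0):
    """Apply [x1, y1]^T = R_t . [x0, y0, 1]^T to one pixel position."""
    x1, y1 = r_t @ np.array([x0, y0, 1.0])
    return x1, y1
```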
S105, traversing the mask label image according to the template label image for each template label image rotated by the preset angle, and calculating the matching degree between the template label image and the corresponding area image of the template label image in the mask label image after each traversing.
In one embodiment, the calculating the matching degree between the template label image and the corresponding area image of the template label image in the mask label image after each traversal includes steps B1-B4:
and step B1, determining a rotation matrix corresponding to the template label image based on the current rotation angle.
Specifically, the rotation matrix corresponding to the template label image may be determined using the following formula:

R_x = [ cos γ  -sin γ ]
      [ sin γ   cos γ ]

where γ refers to the current rotation angle and R_x refers to the rotation matrix corresponding to the template label image.
And B2, performing position conversion on pixel points in the template label image based on the rotation matrix to obtain a pixel matrix corresponding to the template label image.
Specifically, for each pixel point in the template label image, the converted position corresponding to the pixel point may be determined using the following formula:

[ x_1 ]       [ x_0 ]
[ y_1 ] = R_x [ y_0 ]

where x_0 and y_0 are the abscissa and ordinate of any pixel point on the template label image, and x_1 and y_1 are the abscissa and ordinate of the position obtained after the position conversion of the pixel point at (x_0, y_0).
And B3, determining an area image corresponding to the template label image and the template label image in the mask label image according to the template label image after each traversal.
For the template label image after each rotation traversal step, the area of the mask label image overlapped by the template label image can be segmented out and used as the area image in the mask label image corresponding to the template label image.
And B4, determining the matching degree of the template label image between the corresponding region images in the mask label image according to the pixel matrixes corresponding to the region images and the template label image.
Specifically, the matching degree between the template label image and its corresponding area image in the mask label image is determined according to the pixel matrices corresponding to the area image and the template label image using the following formula (a normalized cross-correlation):

S(x, y) = Σ_{x′,y′} [ M(x′, y′) · O(x + x′, y + y′) ] / √( Σ_{x′,y′} M(x′, y′)² · Σ_{x′,y′} O(x + x′, y + y′)² )

where S(x, y) represents the matching degree; M(x′, y′) represents the pixel matrix corresponding to the template label image, specifically the gray value of the pixel point with coordinates (x′, y′) in the template label image; x′ and y′ are the position coordinates of any pixel point in the template label image; O(x, y) represents the pixel matrix corresponding to the area image of the template label image in the mask label image, specifically the gray value of the pixel point with coordinates (x, y) in the mask label image; and x and y are the position coordinates of the pixel point in the mask label image corresponding to the pixel point with coordinates (x′, y′) in the template label image.
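Under a normalized-cross-correlation reading of the matching degree (the patent's formula image is not reproduced in the text, so this exact form is an assumption), the computation is a few lines of NumPy; identical gray-value patterns score exactly 1.0:

```python
import numpy as np

def matching_degree(template, region):
    """Normalized cross-correlation between the pixel matrix M of the
    (rotated) template label image and the pixel matrix O of its
    corresponding area image in the mask label image. Larger is better;
    identical patterns give exactly 1.0."""
    m = template.astype(np.float64).ravel()
    o = region.astype(np.float64).ravel()
    denom = np.sqrt((m ** 2).sum() * (o ** 2).sum())
    if denom == 0.0:
        return 0.0
    return float((m * o).sum() / denom)
```

In an OpenCV pipeline, cv2.matchTemplate with a normalized comparison method plays the same role.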
S106, determining a target matching angle according to the matching degree, and determining the target label position in the mask label image based on the target matching angle.
In an embodiment, the determining a target matching angle according to the matching degree and determining the target tag position in the mask tag image based on the target matching angle includes steps C1-C3:
and C1, determining the rotation angle with the maximum corresponding matching degree as a target matching angle.
And C2, rotating the mask label image according to the target matching angle to obtain a rotated image.
After the target matching angle is determined, the mask label image can be rotated by the target matching angle; the rotated image is the image corrected to the standard detection position, and performing label application detection on the rotated image makes the detection result more accurate.
And C3, determining the label position in the rotated image as the target label position.
Specifically, in the present disclosure, the matching scores obtained at each rotation angle within the ModelImg image matching range can be traversed to obtain the maximum matching score MaxLocMark(i). Finally, according to MaxLocMark(i), the optimal matching position point GoodMatchLoc within the matching angle range is obtained, and the matching coordinate MatchCoord and matching score MatchMark of the optimal matching position point are output:
MatchMark = Max{MaxLocMark(i)}
The matching coordinate MatchCoord of the optimal matching position point is then determined as a coordinate point of the target label position, thereby obtaining the target label position.
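Steps C1 to C3 amount to an argmax over the per-angle best scores; a sketch, assuming the search results are collected as a mapping from rotation angle to a (MaxLocMark, location) pair (the container layout is an assumption for illustration):

```python
def best_match(scores_by_angle):
    """Pick the target matching angle and MatchMark = Max{MaxLocMark(i)}.
    `scores_by_angle` maps each tried rotation angle to a (best score,
    best location) pair produced by the traversal of S105."""
    angle = max(scores_by_angle, key=lambda a: scores_by_angle[a][0])
    match_mark, match_coord = scores_by_angle[angle]
    return angle, match_mark, match_coord
```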
S107, determining whether the equipment tag is qualified according to the target tag position.
In an embodiment, the determining whether the device tag is qualified according to the target tag position includes steps D1-D2:
and D1, determining whether the label corresponding to the target label position is a label to be applied by the target equipment.
And D2, if so, determining that the equipment label of the target equipment is qualified.
According to the method, based on tag position information recorded in a preset configuration file, a first tag image corresponding to the tag position information in an image to be detected of target equipment is extracted, and a template tag image corresponding to the target equipment is obtained; determining a corresponding first mask image based on the first label image and the template label image; generating a mask label image according to the first mask image and the first label image, wherein the center position of the mask label image is overlapped with the center position of the first label image; sequentially rotating the template label images according to a preset angle by taking the designated positions of the template label images as rotation centers; traversing the mask label image according to the template label image aiming at the template label image rotated by a preset angle, and calculating the matching degree between the template label image and the corresponding area image of the template label image in the mask label image after each traversing; determining a target matching angle according to the matching degree, and determining a target tag position in the mask tag image based on the target matching angle; and determining whether the equipment tag is qualified according to the target tag position. By adopting the method, the target matching angle can be determined according to the matching degree, and the target label position in the mask label image can be accurately determined based on the target matching angle, so that whether the equipment label is qualified or not is determined according to the target label position, and the detection accuracy of the equipment label is improved.
In one embodiment, before sequentially rotating the template label image by a preset angle about the designated position of the template label image as the rotation center, the method further includes step E1:
Step E1: performing downsampling processing on the mask label image and the template label image respectively, to obtain a downsampled mask label image and a downsampled template label image.
On the basis of step E1, the step of sequentially rotating the template label image by a preset angle about the designated position of the template label image as the rotation center includes: sequentially rotating the downsampled template label image by the preset angle about the designated position of the downsampled template label image as the rotation center.
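Downsampling both images before the rotation-and-match loop reduces the per-angle cost. A minimal stride-based sketch is shown below; in practice a pyramid method such as OpenCV's `cv2.pyrDown` (which low-pass filters before decimating) would typically be preferred, but the stride version keeps the example dependency-free.

```python
import numpy as np

def downsample(img: np.ndarray, factor: int = 2) -> np.ndarray:
    """Naive downsampling by keeping every `factor`-th row and column.
    A real pipeline would usually blur first (e.g. cv2.pyrDown) to
    avoid aliasing; this sketch only illustrates the size reduction."""
    return img[::factor, ::factor]

mask_label = np.ones((200, 160), dtype=np.uint8)
small_mask = downsample(mask_label)  # each side halved
```

The same call is applied to the template label image so both images shrink by the same factor and the match positions remain comparable.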
Fig. 2 illustrates a schematic diagram of tag detection provided in an embodiment of the present disclosure. As shown in Fig. 2, for an image 200 to be detected of a target device, a first tag image 201 corresponding to the tag position information recorded in a preset configuration file may be extracted from the image 200 to be detected, and a template tag image 202 corresponding to the target device may be obtained. Then, a corresponding first mask image is determined based on the first tag image 201 and the template tag image 202, and a mask tag image is generated according to the first mask image and the first tag image 201. The template tag image is rotated successively by a preset angle about a designated position of the template tag image as the rotation center; after each rotation by the preset angle, the mask tag image is traversed with the template tag image, and after each traversal step the matching degree between the template tag image and its corresponding area image in the mask tag image is calculated; that is, the image obtained after each rotation is matched against the template tag image 202. A target matching angle is determined according to the matching degree, the target tag position in the mask tag image is determined based on the target matching angle, and the tag area in the image 200 to be detected is framed according to the target tag position to obtain a result image 204. Whether the application of the device tag is qualified is then determined according to the target tag position in the result image 204.
Based on the same inventive concept, and corresponding to the device tag detection method provided in the foregoing embodiments of the present disclosure, another embodiment of the present disclosure further provides a device tag detection apparatus, a schematic structural diagram of which is shown in Fig. 3; it specifically includes:
the image acquisition module 301 is configured to extract a first label image corresponding to the label position information in an image to be detected of a target device based on the label position information recorded in a preset configuration file, and acquire a template label image corresponding to the target device;
a first image determination module 302, configured to determine a corresponding first mask image based on the first label image and the template label image;
a second image determining module 303, configured to generate a mask tag image according to the first mask image and the first tag image, where a center position of the mask tag image and a center position of the first tag image overlap;
an image rotation module 304, configured to sequentially rotate the template tag images according to a preset angle with a designated position of the template tag image as a rotation center;
a matching degree calculating module 305, configured to traverse the mask label image according to the template label image for the template label image rotated by the preset angle, and calculate a matching degree between the template label image and a corresponding area image of the template label image in the mask label image after each traverse;
A position determining module 306, configured to determine a target matching angle according to the matching degree, and determine a target tag position in the mask tag image based on the target matching angle;
and a detection module 307, configured to determine whether the device tag is qualified according to the target tag position.
By adopting the apparatus, a first label image corresponding to the label position information recorded in a preset configuration file is extracted from an image to be detected of the target device, and a template label image corresponding to the target device is obtained; a corresponding first mask image is determined based on the first label image and the template label image; a mask label image is generated according to the first mask image and the first label image, with the center position of the mask label image coinciding with the center position of the first label image; the template label image is rotated successively by a preset angle about a designated position of the template label image as the rotation center; after each rotation by the preset angle, the mask label image is traversed with the template label image, and after each traversal step the matching degree between the template label image and its corresponding area image in the mask label image is calculated; a target matching angle is determined according to the matching degree, and the target label position in the mask label image is determined based on the target matching angle; finally, whether the device label is qualified is determined according to the target label position. In this way, the target matching angle can be determined from the matching degree and the target label position in the mask label image can be determined accurately, so that whether the device label is qualified is judged from the target label position, improving the detection accuracy of the device label.
In an embodiment, the first image determining module 302 is specifically configured to determine a first diagonal corresponding to the first label image and a second diagonal corresponding to the template label image, and to generate the first mask image by taking the first diagonal and the second diagonal as the length and the width of the image, respectively.
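Sizing the mask by the two diagonals guarantees that the template still fits inside the mask at any rotation angle, since an image's diagonal is the largest extent it can span when rotated. A minimal sketch follows; the mapping of the first diagonal to width and the second to height is an assumption, as the disclosure only says the diagonals serve as length and width respectively.

```python
import math
import numpy as np

def make_first_mask(first_shape: tuple, template_shape: tuple) -> np.ndarray:
    """Build a blank first mask image whose sides are the diagonals of
    the first label image and the template label image. `*_shape` are
    (rows, cols) tuples. Which diagonal maps to which axis is assumed."""
    d1 = math.ceil(math.hypot(first_shape[0], first_shape[1]))      # first diagonal
    d2 = math.ceil(math.hypot(template_shape[0], template_shape[1]))  # second diagonal
    return np.zeros((d2, d1), dtype=np.uint8)  # height = d2, width = d1 (assumed)

mask = make_first_mask((30, 40), (60, 80))  # 3-4-5 triangles: diagonals 50 and 100
```

Because any rotated copy of the template fits in a square of side equal to its diagonal, no content is clipped during the rotation sweep.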
In an embodiment, the matching degree calculating module 305 is specifically configured to: determine a rotation matrix corresponding to the template label image based on the current rotation angle; perform position conversion on the pixel points in the template label image based on the rotation matrix to obtain a pixel matrix corresponding to the template label image; for the template label image after each traversal step, determine the corresponding area image of the template label image in the mask label image; and determine the matching degree between the template label image and the corresponding area image in the mask label image according to the pixel matrices corresponding to the area image and the template label image.
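The rotation matrix and the position conversion of pixel points can be sketched as below. This is a plain NumPy construction equivalent to OpenCV's `cv2.getRotationMatrix2D(center, angle, 1.0)` (positive angle = counter-clockwise, unit scale); the disclosure does not fix a particular convention, so treat the sign conventions here as an assumption.

```python
import numpy as np

def rotation_matrix(angle_deg: float, center: tuple) -> np.ndarray:
    """2x3 affine matrix rotating points about `center`, matching the
    layout of cv2.getRotationMatrix2D(center, angle_deg, 1.0)."""
    theta = np.deg2rad(angle_deg)
    c, s = np.cos(theta), np.sin(theta)
    cx, cy = center
    return np.array([
        [c,  s, (1 - c) * cx - s * cy],
        [-s, c, s * cx + (1 - c) * cy],
    ])

def transform_points(points: np.ndarray, m: np.ndarray) -> np.ndarray:
    """Apply the 2x3 affine matrix to an (N, 2) array of pixel coordinates."""
    ones = np.ones((points.shape[0], 1))
    return np.hstack([points, ones]) @ m.T  # homogeneous multiply

# Rotate the pixel at (1, 0) by 90 degrees about the origin.
pts = transform_points(np.array([[1.0, 0.0]]), rotation_matrix(90.0, (0.0, 0.0)))
```

Applying this conversion to every pixel coordinate of the template yields the rotated pixel matrix that is slid over the mask label image.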
In an embodiment, the matching degree calculating module 305 is specifically configured to determine the matching degree between the template label image and the corresponding area image in the mask label image, according to the pixel matrices corresponding to the area image and the template label image, using the following formula:
wherein S(x, y) represents the matching degree; M(x', y') represents the pixel matrix corresponding to the template tag image, x' and y' being the position coordinates of any pixel point in the template tag image, so that M(x', y') is the gray value of the pixel point at (x', y') in the template tag image; O(x, y) represents the pixel matrix of the corresponding area image of the template tag image in the mask tag image, x and y being the position coordinates of the pixel point in the mask tag image that corresponds to the pixel point at (x', y') in the template tag image, so that O(x, y) is the gray value of the pixel point at (x, y) in the mask tag image.
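The matching formula itself is given only as a figure in the source, so the sketch below substitutes one plausible choice: normalized cross-correlation between the region O and the rotated template M, a standard template-matching score (cf. OpenCV's `TM_CCORR_NORMED`). It is an assumption for illustration, not the patent's exact formula.

```python
import numpy as np

def match_score(region: np.ndarray, template: np.ndarray) -> float:
    """Normalized cross-correlation between a region image O and a
    template pixel matrix M: sum(O*M) / (||O|| * ||M||). This stands
    in for the patent's (unreproduced) matching-degree formula."""
    o = region.astype(np.float64).ravel()
    m = template.astype(np.float64).ravel()
    denom = np.linalg.norm(o) * np.linalg.norm(m)
    return float(o @ m / denom) if denom else 0.0

a = np.array([[10, 20], [30, 40]])
s_identical = match_score(a, a)  # a region matched against itself
```

Identical images score 1.0 under this measure, and the score decreases as the gray-value patterns diverge, which is the behavior the traversal loop relies on.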
In an embodiment, the device further comprises:
a downsampling processing module (not shown in the figure) for respectively downsampling the mask label image and the template label image to obtain a downsampled mask label image and a downsampled template label image;
the image rotation module 304 is specifically configured to sequentially rotate the template label images after the downsampling process according to a preset angle with a designated position of the template label images after the downsampling process as a rotation center.
In an embodiment, the location determining module 306 is specifically configured to determine, as the target matching angle, the rotation angle with the largest corresponding matching degree; rotating the mask tag image according to the target matching angle to obtain a rotated image; and determining the label position in the rotated image as a target label position.
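The angle-selection logic described above reduces to an argmax over the per-angle scores. A minimal sketch, with illustrative (made-up) score values:

```python
def target_angle(scores: dict) -> float:
    """Return the rotation angle whose matching degree is largest,
    as described for the position determining module."""
    return max(scores, key=scores.get)

# Illustrative matching degrees collected over the rotation sweep.
scores = {0: 0.42, 15: 0.67, 30: 0.91, 45: 0.58}
best = target_angle(scores)
```

The mask tag image is then rotated by `best` and the label position read off the rotated image as the target label position.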
In an embodiment, the detection module 307 is specifically configured to determine whether the tag corresponding to the target tag location is a tag to which the target device should be applied; and if so, determining that the equipment label of the target equipment is qualified.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device and a readable storage medium.
Fig. 4 illustrates a schematic block diagram of an example electronic device 400 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 4, the apparatus 400 includes a computing unit 401 that can perform various suitable actions and processes according to a computer program stored in a Read Only Memory (ROM) 402 or a computer program loaded from a storage unit 408 into a Random Access Memory (RAM) 403. In RAM 403, various programs and data required for the operation of device 400 may also be stored. The computing unit 401, ROM 402, and RAM 403 are connected to each other by a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
Various components in device 400 are connected to I/O interface 405, including: an input unit 406 such as a keyboard, a mouse, etc.; an output unit 407 such as various types of displays, speakers, and the like; a storage unit 408, such as a magnetic disk, optical disk, etc.; and a communication unit 409 such as a network card, modem, wireless communication transceiver, etc. The communication unit 409 allows the device 400 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The computing unit 401 may be a variety of general purpose and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 401 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 401 performs the respective methods and processes described above, such as the device tag detection method. For example, in some embodiments, the device tag detection method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 408. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 400 via the ROM 402 and/or the communication unit 409. When the computer program is loaded into RAM 403 and executed by computing unit 401, one or more steps of the device tag detection method described above may be performed. Alternatively, in other embodiments, the computing unit 401 may be configured to perform the device tag detection method by any other suitable means (e.g. by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor, that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel or sequentially or in a different order, provided that the desired results of the technical solutions of the present disclosure are achieved, and are not limited herein.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present disclosure, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
The foregoing is merely specific embodiments of the disclosure, but the protection scope of the disclosure is not limited thereto. Any change or substitution that a person skilled in the art could readily conceive of within the technical scope of the disclosure shall be covered by the protection scope of the disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. A device tag detection method, the method comprising:
extracting a first label image corresponding to the label position information from an image to be detected of target equipment based on the label position information recorded in a preset configuration file, and acquiring a template label image corresponding to the target equipment;
determining a corresponding first mask image based on the first label image and the template label image;
generating a mask label image according to the first mask image and the first label image, wherein the center position of the mask label image is overlapped with the center position of the first label image;
sequentially rotating the template label images according to a preset angle by taking the designated positions of the template label images as rotation centers;
traversing the mask label image according to the template label image aiming at the template label image after rotating the preset angle, and calculating the matching degree between the template label image and the corresponding area image of the template label image in the mask label image after each traversing;
determining a target matching angle according to the matching degree, and determining a target tag position in the mask tag image based on the target matching angle;
And determining whether the equipment tag is qualified according to the target tag position.
2. The method of claim 1, wherein the determining a corresponding first mask image based on the first label image and the template label image comprises:
determining a first diagonal corresponding to the first label image, and determining a second diagonal corresponding to the template label image;
and respectively taking the first diagonal line and the second diagonal line as the length and the width of the image to generate a first mask image.
3. The method of claim 1, wherein said calculating a degree of matching between the template label image and a corresponding region image of the template label image in the mask label image after each traversal comprises:
determining a rotation matrix corresponding to the template tag image based on the current rotation angle;
performing position conversion on pixel points in the template label image based on the rotation matrix to obtain a pixel matrix corresponding to the template label image;
determining, for the template label image after each traversal, the corresponding area image of the template label image in the mask label image;
and determining the matching degree between the template label image and the corresponding area image in the mask label image according to the pixel matrices corresponding to the area image and the template label image.
4. A method according to claim 3, wherein said determining the degree of matching of the template label image between corresponding region images in the mask label image from the matrix of pixels corresponding to the region image and the template label image comprises:
determining the matching degree of the template label image between the corresponding region images in the mask label image according to the pixel matrixes corresponding to the region images and the template label image by adopting the following formula:
wherein S(x, y) represents the matching degree, M(x', y') represents the pixel matrix corresponding to the template label image, x' and y' refer to the position coordinates of any pixel point in the template label image, O(x, y) represents the pixel matrix of the corresponding area image of the template label image in the mask label image, and x and y refer to the position coordinates of the pixel point in the mask label image corresponding to the pixel point with position coordinates (x', y') in the template label image.
5. The method of claim 1, wherein before the sequentially rotating the template label image at a preset angle with the designated position of the template label image as a rotation center, the method further comprises:
respectively carrying out downsampling treatment on the mask label image and the template label image to obtain a mask label image after downsampling treatment and a template label image after downsampling treatment;
the step of sequentially rotating the stencil label image according to a preset angle by taking a designated position of the stencil label image as a rotation center includes:
and sequentially rotating the template label images after the downsampling process according to a preset angle by taking the designated positions of the template label images after the downsampling process as rotation centers.
6. The method of claim 1, wherein the determining a target match angle from the degree of match and determining a target tag location in the mask tag image based on the target match angle comprises:
determining the rotation angle with the maximum corresponding matching degree as a target matching angle;
rotating the mask tag image according to the target matching angle to obtain a rotated image;
And determining the label position in the rotated image as a target label position.
7. The method of claim 1, wherein said determining whether the device tag is acceptable based on the target tag location comprises:
determining whether the label corresponding to the target label position is a label to be applied by the target equipment;
and if so, determining that the equipment label of the target equipment is qualified.
8. A device tag detection apparatus, the apparatus comprising:
the image acquisition module is used for extracting a first label image corresponding to the label position information from an image to be detected of target equipment based on the label position information recorded in a preset configuration file, and acquiring a template label image corresponding to the target equipment;
a first image determination module for determining a corresponding first mask image based on the first label image and the template label image;
the second image determining module is used for generating a mask label image according to the first mask image and the first label image, and the center position of the mask label image is overlapped with the center position of the first label image;
the image rotating module is used for sequentially rotating the template label image according to a preset angle by taking the appointed position of the template label image as a rotation center;
The matching degree calculating module is used for traversing the mask label image according to the template label image for the template label image after rotating the preset angle, and calculating the matching degree between the template label image and the corresponding area image of the template label image in the mask label image after each traversing;
the position determining module is used for determining a target matching angle according to the matching degree and determining a target tag position in the mask tag image based on the target matching angle;
and the detection module is used for determining whether the equipment tag is qualified according to the target tag position.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
10. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-7.
CN202311687872.5A 2023-12-05 2023-12-05 Equipment label detection method, device, equipment and storage medium Pending CN117809064A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311687872.5A CN117809064A (en) 2023-12-05 2023-12-05 Equipment label detection method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117809064A true CN117809064A (en) 2024-04-02

Family

ID=90420817



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination