CN112529928A - Part assembly detection method, computer device and storage medium - Google Patents


Info

Publication number
CN112529928A
CN112529928A (application number CN202011578349.5A)
Authority
CN
China
Prior art keywords
parameter
assembly
contour
determining
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011578349.5A
Other languages
Chinese (zh)
Inventor
范雄
李坚
曾政
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suntown Technology Group Co Ltd
Original Assignee
Suntown Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suntown Technology Group Co Ltd filed Critical Suntown Technology Group Co Ltd
Priority application: CN202011578349.5A
Publication: CN112529928A
Legal status: Pending

Classifications

    • G06T 7/181: Segmentation; edge detection involving edge growing; involving edge linking
    • G06N 20/10: Machine learning using kernel methods, e.g. support vector machines [SVM]
    • G06T 7/0004: Industrial image inspection
    • G06T 7/13: Edge detection
    • G06T 2207/10024: Image acquisition modality: color image
    • G06T 2207/20081: Special algorithmic details: training; learning
    • G06T 2207/30108: Industrial image inspection
    • G06T 2207/30164: Workpiece; machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The present application relates to the technical field of assembly and provides a part assembly detection method, a computer device, and a storage medium. The part assembly detection method includes: inputting an acquired image containing an assembly into a pre-trained image recognition model and recognizing a first part and a second part in the image, wherein the assembly is formed by assembling the first part and the second part; determining a relative position parameter of the first part and the second part; acquiring a target assembly parameter of the first part; and comparing the relative position parameter with the target assembly parameter to generate a part assembly detection result for the assembly. The present application can improve the efficiency of part assembly detection.

Description

Part assembly detection method, computer device and storage medium
Technical Field
The present disclosure relates to the field of assembly technologies, and in particular, to a method for detecting assembly of components, a computer device, and a storage medium.
Background
In the field of construction, various assembly parts, such as aluminum alloy formworks, angle aluminum, and square tubes, generally need to be connected and assembled into assemblies used for construction. When assembly parts are assembled incorrectly, the resulting assembly does not meet construction requirements, which can easily cause construction accidents or delay construction. At present, the part assembly of an assembly is inspected manually, but manual inspection is inefficient, consumes considerable inspection time and labor, and increases construction cost.
Disclosure of Invention
In view of the above, the present application provides a method, a computer device and a storage medium for detecting component assembly, which aims to solve the technical problem of how to improve the efficiency of component assembly detection.
A first aspect of the present application provides a part assembly detection method, including:
inputting an acquired image containing a component into a pre-trained image recognition model, and recognizing a first part and a second part in the image, wherein the component is formed by assembling the first part and the second part;
determining a relative position parameter of the first part and the second part;
acquiring target assembly parameters of the first part;
and comparing the relative position parameter with the target assembly parameter to generate a part assembly detection result of the assembly.
According to an alternative embodiment of the present application, said determining a relative position parameter of said first part and said second part comprises:
extracting a first contour of the first part and a second contour of the second part;
determining a relative orientation of the first part and the second part from the first profile and the second profile;
determining a relative distance of the first part from the second part according to the relative orientation;
and generating a relative position parameter of the first part and the second part according to the relative orientation and the relative distance.
According to an alternative embodiment of the present application, said extracting a first contour of said first part and a second contour of said second part comprises:
calculating a first gradient and a second gradient for each pixel in the image;
cutting the image into a plurality of sub-images;
calculating a gradient sum corresponding to each sub-image in the plurality of sub-images according to the first gradient and the second gradient;
merging the corresponding gradient sums of the plurality of sub-images to obtain an HOG feature vector of the image;
and inputting the HOG feature vector into a pre-trained SVM model to obtain a first contour of the first part and a second contour of the second part.
According to an alternative embodiment of the present application, said determining a relative distance of said first part from said second part in dependence on said relative orientation comprises:
when the relative orientation is a first relative orientation, constructing a plane rectangular coordinate system based on any point on the first contour;
determining an abscissa value corresponding to each point in the first contour, and determining the smallest abscissa value in the first contour as the first abscissa value;
determining the abscissa value corresponding to each point in the second contour, and determining the smallest abscissa value in the second contour as the second abscissa value;
and determining the relative distance between the first part and the second part according to the difference value of the first abscissa value and the second abscissa value.
According to an alternative embodiment of the present application, said determining a relative distance of said first part from said second part in dependence on said relative orientation comprises:
when the relative orientation is a second relative orientation, constructing a plane rectangular coordinate system based on any point on the first contour;
determining the longitudinal coordinate value corresponding to each point in the first contour, and determining the minimum longitudinal coordinate value in the first contour as the first longitudinal coordinate value;
determining the longitudinal coordinate value corresponding to each point in the second contour, and determining the minimum longitudinal coordinate value in the second contour as the second longitudinal coordinate value;
and determining the relative distance between the first part and the second part according to the difference value of the first ordinate value and the second ordinate value.
According to an optional embodiment of the present application, the comparing the relative position parameter with the target assembly parameter to generate the part assembly detection result of the component includes:
extracting a first position parameter and a second position parameter in the relative position parameters;
extracting a first assembly parameter and a second assembly parameter in the target assembly parameters;
comparing the first location parameter to the first assembly parameter;
when the first position parameter is the same as the first assembly parameter, comparing the second position parameter with the second assembly parameter to obtain a second parameter difference value;
and generating a part assembly detection result of the assembly according to the second parameter difference.
According to an optional embodiment of the present application, the generating the part assembly detection result of the assembly according to the second parameter difference includes:
judging whether the second parameter difference is larger than a preset difference threshold value or not;
and generating a part assembly detection result of the assembly according to the judgment result.
According to an optional embodiment of the present application, the inputting the acquired image including the component into a pre-trained image recognition model includes:
carrying out graying processing on the acquired image containing the component to obtain a grayscale image;
carrying out normalization processing on the gray level image to obtain a processed image;
carrying out illumination processing on the processed image to obtain a target image;
and inputting the target image into a pre-trained image recognition model.
A second aspect of the present application provides a computer device comprising:
a memory to store at least one instruction;
and the processor is used for realizing the part assembly detection method when executing the at least one instruction.
A third aspect of the present application provides a computer-readable storage medium having stored therein at least one instruction which, when executed by a processor, implements the part assembly detection method described above.
according to the technical scheme, the acquired image containing the component is input into a pre-trained image recognition model, the first part and the second part in the image are recognized, the relative position parameter of the first part and the second part is determined, then the target assembly parameter of the first part is acquired, and finally the relative position parameter and the target assembly parameter are compared to generate the component assembly detection result of the component. This application can realize carrying out automated inspection to the equipment of first part and second part in the subassembly to judge whether the equipment of first part and second part accords with predetermined equipment requirement, generate the testing result, reduced check-out time and detection manpower, improved the efficiency that the part equipment detected.
Drawings
FIG. 1 is a flow chart of a method for inspecting a component assembly according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an assembly according to an embodiment of the present application;
fig. 3 is a schematic block diagram of a structure of a computer device according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The flow diagrams depicted in the figures are merely illustrative and do not necessarily include all of the elements and operations/steps, nor do they necessarily have to be performed in the order depicted. For example, some operations/steps may be decomposed, combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
The embodiment of the application provides a part assembly detection method, a part assembly detection device, computer equipment and a computer readable storage medium. The part assembly detection method can be applied to terminal equipment or a server, the terminal equipment can be electronic equipment such as a mobile phone, a tablet computer, a notebook computer, a desktop computer, a personal digital assistant and wearable equipment, and the server can be a single server or a server cluster consisting of a plurality of servers. The following explanation will be given by taking an example in which the component assembly detection method is applied to a server.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a schematic flow chart of a component assembly detection method according to an embodiment of the present application.
As shown in fig. 1, the method for detecting the assembly of parts specifically includes steps S11 to S14, and the sequence of steps in the flowchart may be changed and some steps may be omitted according to different requirements.
And S11, inputting the acquired image containing the assembly into a pre-trained image recognition model, and recognizing a first part and a second part in the image, wherein the assembly is formed by assembling the first part and the second part.
For example, a camera for capturing an image containing the component may be installed in advance, and the camera establishes communication with the server and transmits the captured image containing the component to the server. The shooting device can be a common camera or an infrared camera, and the type and the installation position of the shooting device are not limited.
Wherein the first part may be a form, such as an aluminium alloy form, and the second part may be angle aluminium. As shown in fig. 2, the assembly 20 is assembled from an aluminum alloy form 201 and an angle aluminum 202.
Illustratively, a plurality of part images are obtained and marked, the image recognition model is trained according to the part images and the marks corresponding to the part images as training images, and when the recognition accuracy of the image recognition model reaches a preset threshold value, the training is stopped, so that the trained image recognition model is obtained.
In some embodiments, inputting the acquired image including the component into a pre-trained image recognition model includes:
carrying out graying processing on the acquired image containing the component to obtain a grayscale image;
carrying out normalization processing on the gray level image to obtain a processed image;
carrying out illumination processing on the processed image to obtain a target image;
and inputting the target image into a pre-trained image recognition model.
Illustratively, the acquired image containing the assembly is converted to grayscale using a linear function. A grayscale image occupies less storage space, which can speed up image recognition.
Illustratively, the grayscale image is normalized based on a Gamma correction method. Gamma originates from the response curve of a CRT display, i.e., the non-linear dependence of its luminance on the input voltage. Correcting the Gamma curve of the grayscale image raises the dark-field gray levels, reduces the color error at each gray level, makes dark-field detail clearer, and yields consistent brightness and distinct contrast, thereby improving the accuracy of image recognition.
Illustratively, the illumination processing may include histogram normalization and/or filtering. Histogram normalization counts the number of occurrences of each gray level in the image and adjusts the gray-level distribution with a transformation, making the image's gray levels more uniform. Filtering can reduce interference caused by illumination changes by modeling the variability of light and controlling it with transformations, and can further improve the image by compressing the gray-scale range and enhancing contrast, which further improves the accuracy of image recognition.
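As a rough illustration of the preprocessing steps above, the sketch below converts an RGB image to grayscale with a linear weighted combination and applies a power-law (Gamma) correction. The BT.601 luminance weights and the gamma value are assumptions for illustration; the application does not specify the linear function or the correction parameters.

```python
import numpy as np

def to_gray(rgb):
    """Linear grayscale conversion (BT.601 weights, assumed for illustration)."""
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114

def gamma_correct(gray, gamma=0.5):
    """Normalize to [0, 1], apply power-law Gamma correction, rescale to [0, 255].

    A gamma below 1 brightens dark-field gray levels, as described in the text.
    """
    norm = np.asarray(gray, dtype=float) / 255.0
    return np.power(norm, gamma) * 255.0
```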
S12, determining the relative position parameter of the first part and the second part.
For example, the relative position parameters of the first part and the second part may include a relative orientation parameter and a relative distance parameter. The relative orientation parameters include a longitudinal orientation parameter U, D and a transverse orientation parameter L, R, wherein U indicates that the second part is above the first part, D indicates that the second part is below the first part, L indicates that the second part is to the left of the first part, and R indicates that the second part is to the right of the first part.
In some embodiments, the determining the relative positional parameter of the first part and the second part comprises:
extracting a first contour of the first part and a second contour of the second part;
determining relative orientation parameters of the first part and the second part according to the first contour and the second contour;
determining a relative distance parameter of the first part and the second part according to the relative orientation parameter;
and generating a relative position parameter of the first part and the second part according to the relative orientation parameter and the relative distance parameter.
For example, the Canny edge detection method or the Sobel operator may be used to extract the first contour of the first part and the second contour of the second part; the contour extraction method is not limited here.
After a plane rectangular coordinate system is established according to the first contour, determining relative orientation parameters of the first part and the second part according to the positions of the first contour and the second contour on the plane rectangular coordinate system.
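The text does not spell out how the orientation parameter is read off the coordinate system. One plausible rule, sketched below purely as an assumption, compares the contour centroids and lets the larger axis offset decide between the longitudinal (U/D) and transverse (L/R) parameters; a y-up coordinate system is assumed.

```python
def relative_orientation(contour1, contour2):
    """Return 'U', 'D', 'L', or 'R' for contour2 relative to contour1.

    Contours are lists of (x, y) points; the centroid-comparison rule here
    is a hypothetical reading of the text, not the patented method itself.
    """
    cx1 = sum(x for x, _ in contour1) / len(contour1)
    cy1 = sum(y for _, y in contour1) / len(contour1)
    cx2 = sum(x for x, _ in contour2) / len(contour2)
    cy2 = sum(y for _, y in contour2) / len(contour2)
    dx, dy = cx2 - cx1, cy2 - cy1
    if abs(dy) >= abs(dx):            # longitudinal offset dominates
        return 'U' if dy > 0 else 'D'
    return 'R' if dx > 0 else 'L'     # transverse offset dominates
```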
In some embodiments, said extracting a first contour of said first part and a second contour of said second part comprises:
calculating a first gradient and a second gradient for each pixel in the image;
cutting the image into a plurality of sub-images;
calculating a gradient sum corresponding to each sub-image in the plurality of sub-images according to the first gradient and the second gradient;
merging the corresponding gradient sums of the plurality of sub-images to obtain an HOG feature vector of the image;
and inputting the HOG feature vector into a pre-trained SVM model to obtain a first contour of the first part and a second contour of the second part.
Here, the first gradient may be a magnitude gradient and the second gradient may be a direction gradient. Computing both gradients for each pixel captures contour information more accurately while further reducing interference from illumination. The image is cut into a plurality of sub-images of the same size, and the gradient sum within each sub-image is calculated from the first gradient and the second gradient of each of its pixels, i.e., a histogram vector is generated for each sub-image. The gradient sums of the plurality of sub-images are merged serially to obtain the HOG feature vector of the image, i.e., the histogram vectors of the sub-images are concatenated. The HOG feature vector is then input into a pre-trained SVM model to obtain the first contour of the first part and the second contour of the second part. Using the HOG feature vector together with the SVM model can improve the accuracy of contour extraction and thus the accuracy of part assembly detection.
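A minimal sketch of the HOG computation described above, assuming central-difference gradients, square cells, and magnitude-weighted signed direction bins; the cell size and bin count are illustrative choices, and the trained SVM stage is omitted.

```python
import numpy as np

def pixel_gradients(img):
    """Per-pixel magnitude (first gradient) and direction (second gradient)."""
    img = np.asarray(img, dtype=float)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]   # central difference, x
    gy[1:-1, :] = img[2:, :] - img[:-2, :]   # central difference, y
    return np.hypot(gx, gy), np.arctan2(gy, gx)

def hog_vector(img, cell=8, bins=9):
    """Cut into cells, sum gradients per cell as a histogram, concatenate."""
    mag, ang = pixel_gradients(img)
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            # Gradient sum of the sub-image: direction histogram
            # weighted by magnitude.
            hist, _ = np.histogram(ang[i:i + cell, j:j + cell],
                                   bins=bins, range=(-np.pi, np.pi),
                                   weights=mag[i:i + cell, j:j + cell])
            feats.append(hist)
    return np.concatenate(feats)  # serial merge -> HOG feature vector
```

In practice the resulting vector would be fed to a pre-trained SVM classifier (e.g. a linear SVM) to decide which pixels belong to each part contour.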
In some embodiments, said determining a relative distance of said first part from said second part from said relative orientation comprises:
when the relative orientation is a first relative orientation, constructing a plane rectangular coordinate system based on any point on the first contour;
determining an abscissa value corresponding to each point in the first contour, and determining the smallest abscissa value in the first contour as the first abscissa value;
determining the abscissa value corresponding to each point in the second contour, and determining the smallest abscissa value in the second contour as the second abscissa value;
and determining the relative distance between the first part and the second part according to the difference value of the first abscissa value and the second abscissa value.
When the relative orientation parameter is a first relative orientation, that is, a longitudinal orientation parameter U or D, where U denotes that the second part is above the first part and D denotes that the second part is below the first part, a plane rectangular coordinate system is constructed from any point on the first contour, for example a point at the lower-left corner of the first contour. The abscissa value of each point in the first contour in this coordinate system is determined, and the smallest abscissa value in the first contour is taken as the first abscissa value, e.g., X1; likewise, the smallest abscissa value in the second contour is taken as the second abscissa value, e.g., X2. The relative distance between the first part and the second part is then obtained as the absolute value of the difference, |X1 - X2|.
In some embodiments, said determining a relative distance of said first part from said second part from said relative orientation comprises:
when the relative orientation is a second relative orientation, constructing a plane rectangular coordinate system based on any point on the first contour;
determining the longitudinal coordinate value corresponding to each point in the first contour, and determining the minimum longitudinal coordinate value in the first contour as the first longitudinal coordinate value;
determining the longitudinal coordinate value corresponding to each point in the second contour, and determining the minimum longitudinal coordinate value in the second contour as the second longitudinal coordinate value;
and determining the relative distance between the first part and the second part according to the difference value of the first ordinate value and the second ordinate value.
When the relative orientation parameter is a second relative orientation, that is, a lateral orientation parameter L or R, where L denotes that the second part is to the left of the first part and R denotes that the second part is to the right of the first part, a plane rectangular coordinate system is constructed from any point on the first contour, for example a point at the lower-left corner of the first contour. The ordinate value of each point in the first contour in this coordinate system is determined, and the smallest ordinate value in the first contour is taken as the first ordinate value, e.g., Y1; likewise, the smallest ordinate value in the second contour is taken as the second ordinate value, e.g., Y2. The relative distance between the first part and the second part is then obtained as the absolute value of the difference, |Y1 - Y2|.
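The two distance rules above can be condensed into one helper. The orientation labels U/D/L/R follow the text; representing contours as lists of (x, y) points is an illustrative assumption.

```python
def relative_distance(contour1, contour2, orientation):
    """Distance between two contours per the orientation-dependent rule.

    Longitudinal orientations (U/D) compare the smallest abscissa of each
    contour; lateral orientations (L/R) compare the smallest ordinate.
    """
    if orientation in ('U', 'D'):
        v1 = min(x for x, _ in contour1)   # first abscissa value, X1
        v2 = min(x for x, _ in contour2)   # second abscissa value, X2
    else:
        v1 = min(y for _, y in contour1)   # first ordinate value, Y1
        v2 = min(y for _, y in contour2)   # second ordinate value, Y2
    return abs(v1 - v2)                    # absolute difference
```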
S13, acquiring target assembly parameters of the first part.
Illustratively, target assembly parameters of the first part are preset, and the target assembly parameters are used for representing the assembly position relation of the first part and the second part, such as an assembly relative orientation and an assembly relative distance. For example, a target assembly parameter of the first part of L100 indicates that the second part is to the left of the first part, and the relative distance between the first part and the second part is 100; a target assembly parameter of R200 indicates that the second part is to the right of the first part and the relative distance between the first part and the second part is 200; a target assembly parameter of U50 indicates that the second part is above the first part and the relative distance between the first part and the second part is 50; a target assembly parameter of D100 indicates that the second part is below the first part and the relative distance between the first part and the second part is 100.
For example, a target assembly parameter corresponding to the first part may be retrieved from a preset database according to the number of the first part.
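A sketch of how such a lookup might work, assuming the encoded "orientation letter plus distance" format described above. The in-memory dictionary stands in for the preset database, and the part numbers are invented for illustration.

```python
# Hypothetical lookup table keyed by part number (values follow the
# encoding described in the text, e.g. L100 = left of, distance 100).
TARGET_ASSEMBLY_PARAMS = {"P001": "L100", "P002": "U50"}

def get_target_assembly_param(part_number):
    """Retrieve the encoded parameter and split it into
    (orientation, distance)."""
    encoded = TARGET_ASSEMBLY_PARAMS[part_number]
    return encoded[0], int(encoded[1:])
```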
And S14, comparing the relative position parameters with the target assembly parameters to generate a part assembly detection result of the assembly.
And comparing the relative position parameter with the target assembly parameter, and determining whether the assembly positions of the second part and the first part are in the assembly position relation range represented by the target assembly parameter.
In some embodiments, the comparing the relative position parameter with the target assembly parameter to generate the part assembly detection result of the component includes:
extracting a first position parameter and a second position parameter in the relative position parameters;
extracting a first assembly parameter and a second assembly parameter in the target assembly parameters;
comparing the first location parameter to the first assembly parameter;
when the first position parameter is the same as the first assembly parameter, comparing the second position parameter with the second assembly parameter to obtain a second parameter difference value;
and generating a part assembly detection result of the assembly according to the second parameter difference.
Illustratively, a first one of the relative position parameters represents a relative orientation of the first part and the second part, and the second position parameter represents a relative distance of the first part and the second part. A first one of the target assembly parameters represents a target assembly orientation of the first part and the second part, and a second assembly parameter represents a target relative distance of the first part and the second part.
When the first position parameter differs from the first assembly parameter, the assembly of the first part and the second part does not conform to the preset target assembly parameter, i.e., the assembly does not meet the requirement.
And when the first position parameter is the same as the first assembly parameter, calculating a difference value between the second position parameter and the second assembly parameter, and generating a part assembly detection result of the assembly according to the difference value.
In some embodiments, the generating the part assembly detection result of the assembly according to the second parameter difference comprises:
judging whether the second parameter difference is larger than a preset difference threshold value or not;
and generating a part assembly detection result of the assembly according to the judgment result.
A difference threshold is preset and may be customized according to construction requirements. It is judged whether the second parameter difference is larger than the preset difference threshold: when the second parameter difference is smaller than or equal to the threshold, the assembly of the first part and the second part meets the requirement and the assembly is determined to be normal; when the second parameter difference is larger than the threshold, the assembly does not meet the requirement and the assembly is determined to be abnormal.
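Putting the comparison of step S14 together, a minimal sketch under the assumption that both the relative position parameter and the target assembly parameter are (orientation, distance) pairs and that the result is a simple normal/abnormal label:

```python
def detect_assembly(relative_pos, target_param, threshold):
    """Compare relative position against the target assembly parameter.

    relative_pos / target_param: (orientation, distance) tuples; the
    threshold is the preset difference threshold from the text.
    """
    # First position parameter vs. first assembly parameter (orientation).
    if relative_pos[0] != target_param[0]:
        return 'abnormal'
    # Second parameter difference vs. the preset difference threshold.
    diff = abs(relative_pos[1] - target_param[1])
    return 'normal' if diff <= threshold else 'abnormal'
```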
For example, after the part assembly detection result of the assembly is generated according to the comparison result, the detection result may be sent to a terminal device in communication with the server.
In the part assembly detection method provided by the above embodiment, the acquired image containing the assembly is input into a pre-trained image recognition model, the first part and the second part in the image are recognized, the relative position parameter of the first part and the second part is determined, the target assembly parameter of the first part is acquired, and finally the relative position parameter is compared with the target assembly parameter to generate the part assembly detection result of the assembly. The method can automatically detect the assembly of the first part and the second part, judge whether their assembly meets the predetermined assembly requirement, and generate a detection result, thereby reducing detection time and labor and improving the efficiency of part assembly detection.
Referring to fig. 3, fig. 3 is a schematic block diagram of a computer device according to an embodiment of the present disclosure. The computer device may be a server or a terminal device.
As shown in fig. 3, the computer device 30 includes a processor 301 and a memory 302 connected by a system bus, wherein the memory 302 may include a nonvolatile storage medium and a volatile storage medium.
The memory 302 may store an operating system and computer programs. The computer program includes program instructions that, when executed, cause the processor 301 to perform any of the parts assembly detection methods described herein.
The processor 301 is used to provide computing and control capabilities, supporting the operation of the overall computer device.
In a possible embodiment, the computer device further includes a network interface for network communication, such as sending assigned tasks. Those skilled in the art will appreciate that the architecture shown in fig. 3 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
It should be understood that the processor 301 may be a Central Processing Unit (CPU), or another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
In one embodiment, the processor executes the computer program stored in the memory to implement the following steps:
inputting an acquired image containing a component into a pre-trained image recognition model, and recognizing a first part and a second part in the image, wherein the component is formed by assembling the first part and the second part;
determining a relative position parameter of the first part and the second part;
acquiring target assembly parameters of the first part;
and comparing the relative position parameter with the target assembly parameter to generate a part assembly detection result of the assembly.
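Taken together, the steps above can be sketched as a single function. Here `model` stands in for both the pre-trained image recognition model and the relative-position measurement; all names are hypothetical:

```python
def detect_part_assembly(image, model, target_assembly_param, error_threshold):
    """End-to-end sketch of the claimed method.

    `model(image)` is assumed to recognize the first and second parts
    in the image and return their relative position parameter; that
    value is then compared against the target assembly parameter of
    the first part, and the difference is thresholded.
    """
    relative_position_param = model(image)
    difference = abs(relative_position_param - target_assembly_param)
    return "normal" if difference <= error_threshold else "abnormal"
```

A trained recognizer would replace the `model` callable in practice; the thresholding mirrors the error-threshold check described earlier in the embodiment.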
Specifically, for the implementation of these instructions by the processor, reference may be made to the description of the relevant steps in the foregoing embodiments of the part assembly detection method, which is not repeated here.
Embodiments of the present application further provide a computer-readable storage medium storing a computer program. The computer program includes program instructions, and the method implemented when the program instructions are executed may refer to the embodiments of the part assembly detection method of the present application.
The computer-readable storage medium may be an internal storage unit of the computer device described in the foregoing embodiment, for example, a hard disk or a memory of the computer device. The computer readable storage medium may also be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the computer device.
In the computer device and the computer-readable storage medium provided in the foregoing embodiments, the acquired image containing the component is input into a pre-trained image recognition model to recognize a first part and a second part in the image; a relative position parameter of the first part and the second part is determined; a target assembly parameter of the first part is acquired; and finally the relative position parameter is compared with the target assembly parameter to generate a part assembly detection result for the component. The present application thus automatically detects the assembly of the first part and the second part in the component, determines whether the assembly meets the predetermined assembly requirements, and generates a detection result, reducing detection time and manual labor and improving the efficiency of part assembly detection.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items. It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments. While the invention has been described with reference to specific embodiments, the scope of the invention is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the invention. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A part assembly detection method, characterized by comprising:
inputting an acquired image containing a component into a pre-trained image recognition model, and recognizing a first part and a second part in the image, wherein the component is formed by assembling the first part and the second part;
determining a relative position parameter of the first part and the second part;
acquiring target assembly parameters of the first part;
and comparing the relative position parameter with the target assembly parameter to generate a part assembly detection result of the assembly.
2. The part assembly detection method of claim 1, wherein the determining a relative position parameter of the first part and the second part comprises:
extracting a first contour of the first part and a second contour of the second part;
determining relative orientation parameters of the first part and the second part according to the first contour and the second contour;
determining a relative distance parameter of the first part and the second part according to the relative orientation parameter;
and generating a relative position parameter of the first part and the second part according to the relative orientation parameter and the relative distance parameter.
3. The part assembly detection method according to claim 2, wherein the extracting a first contour of the first part and a second contour of the second part comprises:
calculating a first gradient and a second gradient for each pixel in the image;
cutting the image into a plurality of sub-images;
calculating a gradient sum corresponding to each sub-image in the plurality of sub-images according to the first gradient and the second gradient;
merging the corresponding gradient sums of the plurality of sub-images to obtain an HOG feature vector of the image;
and inputting the HOG feature vector into a pre-trained SVM model to obtain a first contour of the first part and a second contour of the second part.
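The HOG-style feature of claim 3 can be illustrated with a toy descriptor: two per-pixel gradients, per-sub-image gradient sums, and concatenation into one vector. A real pipeline would add orientation binning and block normalization, and the resulting vector would be fed to a trained SVM (e.g. `sklearn.svm.SVC`); this sketch uses only NumPy, and all names are illustrative:

```python
import numpy as np

def hog_like_feature(gray, cell=8):
    """Toy HOG-style descriptor for a grayscale image.

    Computes a vertical and a horizontal gradient per pixel (the
    "first" and "second" gradients of claim 3), sums the gradient
    magnitude over each cell-by-cell sub-image, and concatenates the
    per-sub-image sums into a single feature vector.
    """
    gy, gx = np.gradient(gray.astype(float))   # per-axis gradients (rows, cols)
    magnitude = np.hypot(gx, gy)               # combine the two gradients
    h, w = gray.shape
    sums = []
    for i in range(0, h - cell + 1, cell):     # cut the image into sub-images
        for j in range(0, w - cell + 1, cell):
            sums.append(magnitude[i:i + cell, j:j + cell].sum())
    return np.array(sums)                      # merged feature vector
```

On a 16x16 image with 8x8 cells this yields a 4-element vector, one gradient sum per sub-image.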
4. The part assembly detection method of claim 2, wherein the determining a relative distance parameter of the first part and the second part according to the relative orientation parameter comprises:
when the relative orientation parameter is a first relative orientation parameter, constructing a plane rectangular coordinate system based on any point on the first contour;
determining an abscissa value corresponding to each point in the first contour, and determining the smallest abscissa value in the first contour as the first abscissa value;
determining the abscissa value corresponding to each point in the second contour, and determining the smallest abscissa value in the second contour as the second abscissa value;
and determining a relative distance parameter of the first part and the second part according to the difference value of the first abscissa value and the second abscissa value.
5. The part assembly detection method of claim 2, wherein the determining a relative distance parameter of the first part and the second part according to the relative orientation parameter comprises:
when the relative orientation parameter is a second relative orientation parameter, constructing a plane rectangular coordinate system based on any point on the first contour;
determining the longitudinal coordinate value corresponding to each point in the first contour, and determining the minimum longitudinal coordinate value in the first contour as the first longitudinal coordinate value;
determining the longitudinal coordinate value corresponding to each point in the second contour, and determining the minimum longitudinal coordinate value in the second contour as the second longitudinal coordinate value;
and determining a relative distance parameter of the first part and the second part according to the difference value of the first longitudinal coordinate value and the second longitudinal coordinate value.
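Claims 4 and 5 differ only in the coordinate axis used, so both can be illustrated with one NumPy sketch. Representing each contour as an (N, 2) array of (x, y) points in a shared rectangular coordinate system is an assumption for illustration, not fixed by the claims:

```python
import numpy as np

def relative_distance(first_contour, second_contour, axis):
    """Relative distance sketch for claims 4 and 5.

    axis=0 compares the minimum abscissa (x) values, as in the first
    relative orientation of claim 4; axis=1 compares the minimum
    ordinate (y) values, as in the second relative orientation of
    claim 5. Each contour is an (N, 2) array of (x, y) points.
    """
    first_min = first_contour[:, axis].min()
    second_min = second_contour[:, axis].min()
    return abs(first_min - second_min)  # difference of the two minima
```

The absolute value reflects that the claims use the difference of the two coordinate values as a distance; a signed difference could be kept instead if the direction matters downstream.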
6. The part assembly detection method of claim 1, wherein the comparing the relative position parameter with the target assembly parameter to generate a part assembly detection result of the assembly comprises:
extracting a first position parameter and a second position parameter in the relative position parameters;
extracting a first assembly parameter and a second assembly parameter in the target assembly parameters;
comparing the first location parameter to the first assembly parameter;
when the first position parameter is the same as the first assembly parameter, comparing the second position parameter with the second assembly parameter to obtain a second parameter difference value;
and generating a part assembly detection result of the assembly according to the second parameter difference.
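The two-stage comparison of claim 6 can be sketched as follows. The tuple layout (orientation first, distance second) and the `None` return when the first parameters differ are assumptions for illustration:

```python
def second_parameter_difference(relative_position, target_assembly):
    """Claim 6 sketch: compare position parameters with assembly parameters.

    Each argument is a (first, second) pair, e.g. (orientation, distance).
    The second parameters are compared only when the first parameters
    match; the returned difference feeds the threshold check of claim 7.
    """
    first_position, second_position = relative_position
    first_assembly, second_assembly = target_assembly
    if first_position != first_assembly:
        return None  # first parameters differ; claim 6 compares no further
    return abs(second_position - second_assembly)
```

The caller would then apply the preset difference threshold to the returned value to decide whether the assembly is normal.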
7. The part assembly detection method of claim 6, wherein the generating a part assembly detection result of the assembly according to the second parameter difference comprises:
determining whether the second parameter difference is greater than a preset difference threshold; and
generating a part assembly detection result of the assembly according to the determination result.
8. The part assembly detection method according to any one of claims 1 to 7, wherein the inputting the acquired image containing the component into a pre-trained image recognition model comprises:
carrying out graying processing on the acquired image containing the component to obtain a grayscale image;
carrying out normalization processing on the gray level image to obtain a processed image;
carrying out illumination processing on the processed image to obtain a target image;
and inputting the target image into a pre-trained image recognition model.
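The preprocessing chain of claim 8 can be sketched with NumPy. The claim does not fix the exact operators, so the channel-mean graying, min-max normalization, and simple brightness leveling used here for the "illumination processing" step are illustrative choices:

```python
import numpy as np

def preprocess(rgb):
    """Graying, normalization, and illumination processing (claim 8 sketch)."""
    gray = rgb.astype(float).mean(axis=2)                 # graying: average the channels
    span = gray.max() - gray.min()
    norm = (gray - gray.min()) / (span if span else 1.0)  # normalize to [0, 1]
    leveled = norm - norm.mean() + 0.5                    # crude illumination leveling
    return np.clip(leveled, 0.0, 1.0)                     # target image for the model
```

In practice the illumination step might instead use histogram equalization or CLAHE; the point of the sketch is only the three-stage order the claim prescribes before the target image is fed to the recognition model.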
9. A computer device, wherein the computer device comprises a memory and a processor;
the memory is to store at least one instruction;
the processor is configured to implement the method of parts assembly detection as claimed in any one of claims 1 to 8 when executing the at least one instruction.
10. A computer-readable storage medium having stored therein at least one instruction which, when executed by a processor, implements the part assembly detection method as claimed in any one of claims 1 to 8.
CN202011578349.5A 2020-12-28 2020-12-28 Part assembly detection method, computer device and storage medium Pending CN112529928A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011578349.5A CN112529928A (en) 2020-12-28 2020-12-28 Part assembly detection method, computer device and storage medium


Publications (1)

Publication Number Publication Date
CN112529928A true CN112529928A (en) 2021-03-19

Family

ID=74976913

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011578349.5A Pending CN112529928A (en) 2020-12-28 2020-12-28 Part assembly detection method, computer device and storage medium

Country Status (1)

Country Link
CN (1) CN112529928A (en)


Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103165501A (en) * 2011-12-08 2013-06-19 财团法人金属工业研究发展中心 Aligning method for unmarked substrate assembly
CN104677277A (en) * 2015-02-16 2015-06-03 武汉天远视科技有限责任公司 Method and system measuring geometric attribute of object or distance
CN105334446A (en) * 2015-09-23 2016-02-17 南京协辰电子科技有限公司 Method for aligning workpiece to tool in electrical testing
CN107886495A (en) * 2017-09-30 2018-04-06 北京得华机器人技术研究院有限公司 A kind of auto-parts defect identification method based on similarity mode
CN108701237A (en) * 2016-01-15 2018-10-23 英卓美特公司 Method for automatically generating public-measurement across multiple module units
CN109635698A (en) * 2018-12-04 2019-04-16 杭州中房信息科技有限公司 A kind of crowd's personal safety detection method of renting a house based on SVM algorithm
CN109926817A (en) * 2018-12-20 2019-06-25 南京理工大学 Transformer automatic assembly method based on machine vision
CN110135514A (en) * 2019-05-22 2019-08-16 国信优易数据有限公司 A kind of workpiece classification method, device, equipment and medium
CN110285760A (en) * 2019-06-27 2019-09-27 重庆矢崎仪表有限公司 A kind of FPC assembling detection system and method
CN110838147A (en) * 2019-10-25 2020-02-25 深圳信息职业技术学院 Camera module detection method and device
CN111105465A (en) * 2019-11-06 2020-05-05 京东数字科技控股有限公司 Camera device calibration method, device, system electronic equipment and storage medium
CN111260629A (en) * 2020-01-16 2020-06-09 成都地铁运营有限公司 Pantograph structure abnormity detection algorithm based on image processing
CN111428731A (en) * 2019-04-04 2020-07-17 深圳市联合视觉创新科技有限公司 Multi-class target identification and positioning method, device and equipment based on machine vision

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
模具达人 [Mould Expert]: "Essential basics for machinery professionals — general assembly processes and requirements", pages 1 - 3, Retrieved from the Internet <URL:https://zhuanlan.zhihu.com/p/232198636> *
黄妙华 [Huang Miaohua] et al.: "Fundamentals of Intelligent Vehicle Control", vol. 1, 30 September 2020, China Machine Press (机械工业出版社), pages: 183 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116309351A (en) * 2023-02-15 2023-06-23 浙江丽威汽车控制系统有限公司 Automobile engineering material supply processing system
CN116309351B (en) * 2023-02-15 2023-11-21 浙江丽威汽车控制系统有限公司 Automobile engineering material supply processing system

Similar Documents

Publication Publication Date Title
CN110232369B (en) Face recognition method and electronic equipment
US20190362193A1 (en) Eyeglass positioning method, apparatus and storage medium
CN111027504A (en) Face key point detection method, device, equipment and storage medium
CN108229475B (en) Vehicle tracking method, system, computer device and readable storage medium
CN111355941B (en) Image color real-time correction method, device and system
US9767383B2 (en) Method and apparatus for detecting incorrect associations between keypoints of a first image and keypoints of a second image
US20120106784A1 (en) Apparatus and method for tracking object in image processing system
CN110675940A (en) Pathological image labeling method and device, computer equipment and storage medium
CN111310746B (en) Text line detection method, model training method, device, server and medium
CN111160169B (en) Face detection method, device, equipment and computer readable storage medium
CN111814776B (en) Image processing method, device, server and storage medium
CN113592886A (en) Method and device for examining architectural drawings, electronic equipment and medium
CN116168351B (en) Inspection method and device for power equipment
CN111507957B (en) Identity card picture conversion method and device, computer equipment and storage medium
US9824289B2 (en) Exploiting color for license plate recognition
CN112884782A (en) Biological object segmentation method, apparatus, computer device and storage medium
CN110991256A (en) System and method for carrying out age estimation and/or gender identification based on face features
CN112529928A (en) Part assembly detection method, computer device and storage medium
CN110321778B (en) Face image processing method and device and storage medium
CN112734689A (en) Gasket quality detection method, system, device and storage medium
Thomas et al. Color balancing for change detection in multitemporal images
CN113840135B (en) Color cast detection method, device, equipment and storage medium
EP2919149A2 (en) Image processing apparatus and image processing method
CN110807403B (en) User identity identification method and device and electronic equipment
CN112508925A (en) Electronic lock panel quality detection method, system, computer device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination