CN117934796A - Workpiece assembly inspection method, apparatus, device, medium, and computer program product

Workpiece assembly inspection method, apparatus, device, medium, and computer program product

Info

Publication number
CN117934796A
Authority
CN
China
Prior art keywords
component
information
detection
workpiece
assembly
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410019331.3A
Other languages
Chinese (zh)
Inventor
常欢
张文博
李孟其
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Shibite Robot Co Ltd
Original Assignee
Hunan Shibite Robot Co Ltd
Application filed by Hunan Shibite Robot Co Ltd filed Critical Hunan Shibite Robot Co Ltd
Priority claimed from CN202410019331.3A
Publication of CN117934796A
Legal status: Pending

Landscapes

  • General Factory Administration (AREA)

Abstract

The present application relates to a workpiece assembly inspection method, apparatus, device, medium and computer program product. The method comprises the following steps: for an assembled target workpiece, acquiring component information of at least one component part on the target workpiece; classifying each component part according to its component information to obtain a component classification result; for each component part, detecting and analyzing the component part according to its component information and the component classification result to obtain component detection information; and detecting the assembly condition of the target workpiece according to the component detection information of each component part to obtain a detection result. By adopting the method, the accuracy of detecting the assembly condition of the workpiece can be improved.

Description

Workpiece assembly inspection method, apparatus, device, medium, and computer program product
Technical Field
The present application relates to the field of industrial production technology, and in particular, to a workpiece assembly detection method, apparatus, device, medium, and computer program product.
Background
With the development of industrial production technology, the assembly state of a workpiece needs to be detected when the workpiece is produced. At present, workpiece assembly is usually inspected by a worker who observes the workpiece with the naked eye to judge whether it is assembled correctly. The detection process is therefore limited by the worker's subjective judgment: the detection result depends on the worker's observation ability and workpiece assembly experience, so the detection accuracy of the workpiece assembly condition is low.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a workpiece assembly detection method, apparatus, computer device, computer-readable storage medium, and computer program product that can improve the accuracy of assembly condition detection of workpieces.
In a first aspect, the present application provides a method for detecting assembly of a workpiece, comprising:
for an assembled target workpiece, acquiring component information of at least one component part on the target workpiece;
classifying each component part according to its component information to obtain a component classification result;
for each component part, detecting and analyzing the component part according to its component information and the component classification result to obtain component detection information; and
detecting the assembly condition of the target workpiece according to the component detection information of each component part to obtain a detection result.
In a second aspect, the present application also provides a workpiece assembly inspection apparatus, including:
The acquisition module is used for acquiring, for an assembled target workpiece, component information of at least one component part on the target workpiece;
the classification module is used for classifying each component part according to its component information to obtain a component classification result;
the component detection module is used for detecting and analyzing, for each component part, the component part according to its component information and the component classification result to obtain component detection information; and
the workpiece detection module is used for detecting the assembly condition of the target workpiece according to the component detection information of each component part to obtain a detection result.
In a third aspect, the present application also provides a computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
for an assembled target workpiece, acquiring component information of at least one component part on the target workpiece;
classifying each component part according to its component information to obtain a component classification result;
for each component part, detecting and analyzing the component part according to its component information and the component classification result to obtain component detection information; and
detecting the assembly condition of the target workpiece according to the component detection information of each component part to obtain a detection result.
In a fourth aspect, the present application also provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of:
for an assembled target workpiece, acquiring component information of at least one component part on the target workpiece;
classifying each component part according to its component information to obtain a component classification result;
for each component part, detecting and analyzing the component part according to its component information and the component classification result to obtain component detection information; and
detecting the assembly condition of the target workpiece according to the component detection information of each component part to obtain a detection result.
In a fifth aspect, the application also provides a computer program product comprising a computer program which, when executed by a processor, performs the steps of:
for an assembled target workpiece, acquiring component information of at least one component part on the target workpiece;
classifying each component part according to its component information to obtain a component classification result;
for each component part, detecting and analyzing the component part according to its component information and the component classification result to obtain component detection information; and
detecting the assembly condition of the target workpiece according to the component detection information of each component part to obtain a detection result.
According to the workpiece assembly detection method, apparatus, device, medium and computer program product described above, for an assembled target workpiece, component information of at least one component part on the target workpiece is acquired; each component part is classified according to its component information to obtain a component classification result; for each component part, the component part is detected and analyzed according to its component information and the component classification result to obtain component detection information; and the assembly condition of the target workpiece is detected according to the component detection information of each component part to obtain a detection result. Objectively existing, multi-dimensional component information and component classification information serve as the basis for generating the component detection information of the component parts on the target workpiece, and the component detection information in turn serves as the basis for detecting the assembly condition of the target workpiece. The detection result obtained in this way therefore accurately reflects the assembly condition of the target workpiece, the dependence of the assembly-condition detection result on the worker's observation ability and workpiece assembly experience is eliminated, and the detection accuracy of the workpiece assembly condition is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the related art, the drawings that are required to be used in the embodiments or the related technical descriptions will be briefly described, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to the drawings without inventive effort for those skilled in the art.
FIG. 1 is a schematic diagram of an application scenario of a method for detecting assembly of a workpiece in one embodiment;
FIG. 2 is a flow chart of a method of detecting assembly of a workpiece in one embodiment;
FIG. 3 is a flow chart of detecting assembly conditions of components in a scene according to an embodiment;
FIG. 4 is a schematic flow chart of detecting assembly conditions of components in another scenario in one embodiment;
FIG. 5 is a schematic overall flow chart of a method for detecting assembly of a workpiece in one embodiment;
FIG. 6 is a block diagram of a device for detecting assembly of a workpiece in one embodiment;
fig. 7 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
In the traditional method, a worker observes with the naked eye whether the workpiece is assembled correctly, and the detection process of the workpiece assembly condition is limited by the worker's subjective judgment, so that the detection result depends on the worker's observation ability and workpiece assembly experience and the detection accuracy is low. Alternatively, a worker compares the workpiece with pictures of a correctly assembled workpiece; when the assembly of the workpiece is complex or the number of workpieces awaiting detection is large, the worker has to compare the workpieces one by one, so the detection efficiency is low. In short, the accuracy and the efficiency of workpiece assembly-condition detection cannot both be ensured.
The workpiece assembly detection method provided by the embodiments of the present application can be applied to the application scenario shown in FIG. 1. The workpiece assembly system 01 comprises a camera 02, a lighting device 03 and a detection device 04. The camera 02 is used for collecting component images; the lighting device 03 is used for performing illumination compensation on the image acquisition environment of the camera 02; and the detection device 04 is used for obtaining various kinds of component information and detecting the component parts and the target workpiece according to the component information. The detection device 04 may be a terminal, and the terminal may be, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, Internet of Things devices and portable wearable devices; the Internet of Things devices may be smart speakers, smart televisions, smart air conditioners, smart vehicle-mounted devices and the like; and the portable wearable devices may be smart watches, smart bracelets, head-mounted devices and the like.
In one embodiment, as shown in fig. 2, a workpiece assembly detection method is provided, and the workpiece assembly detection method is applied to the workpiece assembly system 01 in fig. 1 for illustration, and includes the following steps:
Step 102, for an assembled target workpiece, acquiring component information of at least one component part on the target workpiece.
The target workpiece in step 102 is a workpiece set by a user according to requirements and/or selected from at least one workpiece to be detected. The component parts are components on the target workpiece; a component part may be a component additionally mounted on the target workpiece, such as a nut, a stud or silk-screen printing, or a component existing on the target workpiece itself, such as a groove, a round hole or a threaded hole, which is not limited here. The component information includes at least one of a component shape, a component size, a component position, a component material, a detection parameter, and component assembly operation information.
As an embodiment, before step 102, the method further includes: acquiring the assembly completion degree of at least one workpiece to be detected; selecting, from the not-yet-selected workpieces among all the workpieces to be detected, a target workpiece whose assembly completion degree is greater than or equal to a preset completion-degree threshold, where the preset completion-degree threshold is the critical assembly completion degree at which the assembly of a workpiece is judged to be finished, and executing step 102; and for a workpiece whose assembly completion degree is less than the preset completion-degree threshold, taking an assembly-incomplete result as the detection result of that workpiece.
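For illustration only, the following is a minimal Python sketch of the target-workpiece selection described above, assuming a simple data structure and a completion threshold of 0.95; these names and values are assumptions of this description rather than part of the disclosed method.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Workpiece:
    workpiece_id: str
    completion_degree: float              # assembly completion degree in [0, 1]
    detection_result: Optional[str] = None

def select_target_workpieces(workpieces: List[Workpiece],
                             completion_threshold: float = 0.95) -> List[Workpiece]:
    """Return workpieces whose assembly is judged finished; mark the rest as incomplete."""
    targets = []
    for wp in workpieces:
        if wp.completion_degree >= completion_threshold:
            targets.append(wp)                               # proceed to step 102
        else:
            wp.detection_result = "assembly incomplete"      # used directly as the detection result
    return targets
```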
As another embodiment, before step 102, the method further includes: and acquiring the selected information of the detection personnel on at least one workpiece to be detected, taking the workpiece corresponding to the selected information as a target workpiece, and executing step 102.
As an embodiment, step 102 includes: and acquiring a workpiece image carrying the characteristics of the target workpiece, and obtaining the shape, the size, the position and the material of at least one component part on the target workpiece by performing image recognition on the workpiece image.
As another embodiment, step 102 includes: component identifiers are respectively arranged on the component parts of the target workpiece, where a component identifier may be a component label or a component chip, which is not limited here; and the component identifiers arranged on the component parts of the target workpiece are respectively identified through RFID (Radio Frequency Identification) to obtain the component shape, component size, component position, component material and component assembly operation information of at least one component part on the target workpiece.
As yet another embodiment, step 102 includes: the detection parameters of at least one component part on the target workpiece are respectively acquired through preset sensors of at least one component part arranged on the target workpiece.
As an embodiment, step 102 includes: acquiring time-sequence workpiece images carrying assembly features of the target workpiece, where the time-sequence workpiece images are a series of workpiece images ordered in time, and performing assembly operation recognition on the time-sequence workpiece images to obtain the component assembly operation information of at least one component part on the target workpiece.
As another embodiment, step 102 includes: a pressure sensor is deployed on at least one component part of the target workpiece, and pressure data of each component part in the assembly process of the target workpiece are collected through the pressure sensor; and identifying the assembly operation of each component part in the target workpiece according to the pressure data of each component part to obtain the component assembly operation information of at least one component part on the target workpiece.
It can be understood that, in a scenario where the component information is obtained by means of image recognition, industrial production is usually carried out in a closed factory or workshop, so poor lighting conditions easily occur; an image captured in a shooting environment with poor lighting is unclear, and the component information obtained by image recognition is then inaccurate.
Therefore, there is a need for a way to improve the accuracy of component information obtained by image recognition, and as an embodiment, acquiring component information of at least one component on a target workpiece includes:
Determining, for each of the component parts, an illumination reflection capability of the component part; constructing an illumination compensation condition for carrying out illumination compensation on the component parts, wherein the stronger the illumination reflection capability of the component parts is, the lower the illumination compensation value corresponding to the illumination compensation condition is; and acquiring component part images carrying the characteristics of the component parts under an image acquisition environment corresponding to the illumination compensation conditions so as to acquire the part information of the component parts from the component part images.
Wherein the illumination compensation conditions include a device mounting position of the lighting device and an illumination intensity.
Illustratively, determining the light reflection capabilities of the component parts includes: acquiring material information of the component parts and acquiring the part colors of the component parts; and determining the illumination reflecting capacity of the component according to the material information and/or the color of the component.
As an embodiment, determining the illumination reflection capability of a component part according to its material information includes: identifying the degree of reflection of the component part according to its material information to obtain the reflectivity of the component part, where the smoothness of the part corresponding to the material information is positively correlated with the reflectivity; and determining the illumination reflection capability of the component part according to its reflectivity, where the higher the reflectivity, the stronger the illumination reflection capability.
As another embodiment, determining the illumination reflection capability of a component part according to its component color includes: if the component color of the component part consists of a single light color, determining the illumination reflection capability according to the light wavelength corresponding to the component color, where the longer the wavelength, the stronger the illumination reflection capability; for example, the spectrum can be divided by wavelength, from long to short, into the seven regions red, orange, yellow, green, cyan, blue and purple, so that red has the strongest illumination reflection capability and purple the weakest. If the component color of the component part is a mixture of several light colors, a first capability value is output as the illumination reflection capability of the component part when the component color is a light color such as white or gold, and a second capability value is output as the illumination reflection capability when the component color is a dark color such as black or brown, where the first capability value is larger than the second capability value.
As still another embodiment, determining the illumination reflection capability of a component part based on both its material information and its component color includes: determining a first reflection capability of the component part according to its material information; determining a second reflection capability of the component part according to its component color; and aggregating the first reflection capability and the second reflection capability to obtain the illumination reflection capability of the component part. Specifically, a preset first weight is multiplied by the first reflection capability to obtain a first weighted capability, a preset second weight is multiplied by the second reflection capability to obtain a second weighted capability, and the first weighted capability and the second weighted capability are added to obtain the illumination reflection capability of the component part.
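The weighted aggregation of the two reflection capabilities, and the inverse relation between reflection capability and compensation value described above, can be sketched as follows; the lookup tables, weights and scaling are illustrative assumptions only.

```python
# Sketch: aggregate material-based and color-based reflection capability, then derive
# an illumination compensation value that decreases as reflection capability increases.
MATERIAL_REFLECTIVITY = {"polished steel": 0.9, "aluminum": 0.7, "rubber": 0.2}
COLOR_REFLECTIVITY = {"white": 0.9, "gold": 0.8, "red": 0.7, "black": 0.2, "brown": 0.3}

def illumination_reflection_capability(material: str, color: str,
                                       w1: float = 0.6, w2: float = 0.4) -> float:
    first = MATERIAL_REFLECTIVITY.get(material, 0.5)   # first reflection capability
    second = COLOR_REFLECTIVITY.get(color, 0.5)        # second reflection capability
    return w1 * first + w2 * second                    # weighted aggregation

def illumination_compensation_value(reflection_capability: float,
                                    max_compensation: float = 1000.0) -> float:
    # The stronger the reflection capability, the lower the compensation value (e.g. in lux).
    return max_compensation * (1.0 - reflection_capability)
```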
Illustratively, constructing an illumination compensation condition for illumination compensating a component part includes: determining an illumination compensation value of the component according to the illumination reflection capability of the component; and constructing the equipment installation position and the illumination intensity of the illumination equipment for carrying out illumination compensation on the component parts according to the illumination compensation value.
Illustratively, capturing component part images carrying component part features in an image capturing environment corresponding to illumination compensation conditions includes: and acquiring component part images carrying component part features by a plurality of cameras under an image acquisition environment of illumination compensation of the illumination equipment corresponding to the illumination compensation conditions.
In this way, in a scenario where the lighting conditions of the shooting environment are poor, illumination compensation is performed on the shooting environment by the lighting device, which avoids unclear images caused by poor lighting and improves the accuracy of the component information obtained by image recognition.
Step 104, classifying each component according to the component information of each component to obtain a component classification result.
As an embodiment, step 104 includes: acquiring a preset classification rule, where the preset classification rule includes correspondences between preset component information and preset classification results; and mapping the component information of each component part to a component classification result through the preset classification rule.
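A minimal sketch of such a rule-based mapping is given below; the rule keys and the classification categories are illustrative assumptions, not part of the disclosure.

```python
# Sketch of a preset classification rule: a mapping from preset component information
# (here, shape and material) to a preset classification result.
PRESET_CLASSIFICATION_RULES = {
    ("hexagonal", "metal"): "nut",
    ("cylindrical", "metal"): "stud",
    ("circular", "void"): "round hole",
}

def classify_component(component_info: dict) -> str:
    """Map component information to a component classification result."""
    key = (component_info["shape"], component_info["material"])
    return PRESET_CLASSIFICATION_RULES.get(key, "unknown")
```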
As another embodiment, step 104 includes: for each of the constituent parts, extracting part features from part information of the constituent parts, and mapping the part features into part classification results of the constituent parts by a part classification model.
And 106, for each component, detecting and analyzing the component according to the component information of the component and the component classification result to obtain component detection information.
As an embodiment, step 106 includes: detecting the assembly condition of the component part according to its component information and the component classification result to obtain component detection information, where the component detection information includes component correct-assembly information or component incorrect-assembly information. The component detection information may be a binary classification result. The component detection information may also be a multi-class result; in this case, the component detection information includes a component assembly accuracy. Further, when the component assembly accuracy is higher than a first preset accuracy threshold, which is the critical component assembly accuracy, set as required, at which the component part is judged to be correctly assembled, the component part is judged to be correctly assembled; when the component assembly accuracy is not higher than the first preset accuracy threshold, the component part is judged to be incorrectly assembled. The closer the component assembly accuracy is to the first preset accuracy threshold, the higher the assembly accuracy of the component part, so that the component detection information characterizes not only the assembly condition but also the assembly accuracy of the component part.
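As an illustrative sketch of turning a component assembly accuracy into component detection information, under the assumption of a first preset accuracy threshold of 0.8 and dictionary field names chosen here for readability:

```python
# Sketch: map a component assembly accuracy to component detection information.
FIRST_PRESET_ACCURACY_THRESHOLD = 0.8

def component_detection_info(assembly_accuracy: float) -> dict:
    if assembly_accuracy > FIRST_PRESET_ACCURACY_THRESHOLD:
        label = "component correctly assembled"
    else:
        label = "component incorrectly assembled"
    return {"label": label, "assembly_accuracy": assembly_accuracy}
```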
As another embodiment, step 106 includes: the component information comprises detection parameters, and the detection parameters are analyzed according to the component classification result to obtain component detection information.
And step 108, detecting the assembly condition of the target workpiece according to the respective component detection information of each component to obtain a detection result.
The assembly condition in step 108 is used to characterize the assembly accuracy of the target workpiece. The detection result includes a workpiece correct-assembly result or a workpiece incorrect-assembly result. The detection result may be a binary classification result; in this case, the detection result includes a workpiece correct-assembly label or a workpiece incorrect-assembly label. The detection result may also be a multi-class result; in this case, the detection result includes a workpiece assembly accuracy. Further, when the workpiece assembly accuracy is higher than a second preset accuracy threshold, which is the critical workpiece assembly accuracy, set as required, at which the workpiece is judged to be correctly assembled, the target workpiece is judged to be correctly assembled; when the workpiece assembly accuracy is not higher than the second preset accuracy threshold, the target workpiece is judged to be incorrectly assembled. The closer the workpiece assembly accuracy is to the second preset accuracy threshold, the higher the assembly accuracy of the target workpiece, so that the detection result characterizes not only the assembly condition but also the assembly accuracy of the target workpiece.
As an embodiment, the component detection information includes component correct-assembly information or component incorrect-assembly information, and the target component parts, i.e. those whose component detection information includes component correct-assembly information, are accumulated; the assembly condition of the target workpiece is then detected according to the target component parts to obtain the detection result. Further, if the number of target component parts is greater than or equal to a preset number threshold, the workpiece correct-assembly result is taken as the detection result; and if the number of target component parts is smaller than the preset number threshold, the workpiece incorrect-assembly result is taken as the detection result, where the preset number threshold is the critical number, set as required, of correctly assembled component parts at which the workpiece is judged to be correctly assembled; it is determined by the total number of component parts on the target workpiece and is less than or equal to that total number.
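A minimal sketch of this counting-based judgement follows; the field names and the way detection information is represented are assumptions carried over from the earlier sketch.

```python
# Sketch: count the target component parts (those whose detection information indicates
# correct assembly) and compare the count with a preset number threshold.
def detect_workpiece_by_count(component_detection_infos: list,
                              preset_number_threshold: int) -> str:
    target_count = sum(
        1 for info in component_detection_infos
        if info["label"] == "component correctly assembled"
    )
    if target_count >= preset_number_threshold:
        return "workpiece correctly assembled"
    return "workpiece incorrectly assembled"
```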
As another embodiment, a component detection model corresponding to a target workpiece is acquired; and mapping the respective component detection information of each component into a detection result through a component detection model.
It will be appreciated that, in an actual workpiece assembly scenario, some component parts do not affect the normal use of the workpiece even if they are installed incorrectly, while other component parts seriously affect the normal use of the workpiece when installed incorrectly. For this scenario, it is desirable to provide a way of weighting the component parts to reflect how important they are for workpiece assembly detection.
Detecting the assembly condition of the target workpiece according to the component detection information of each component part to obtain a detection result includes:
for each component part, weighting the component detection information of the component part according to the weight corresponding to the component part to obtain component detection weighting information of the component part, where the weight corresponding to a component part is positively correlated with its importance level; and detecting the assembly condition of the target workpiece according to the component detection weighting information of each component part to obtain the detection result.
The importance level of a component part is used to characterize the influence that whether the component part is correctly assembled has on the normal use of the workpiece; specifically, the greater this influence, the higher the importance level of the component part.
Illustratively, weighting the component detection information of the component according to the weight corresponding to the component to obtain the component detection weighting information of the component, including: generating weights of the component parts according to the importance degrees of the component parts; and weighting the component detection information of the component according to the weight corresponding to the component to obtain the component detection weighting information of the component.
As an embodiment, weighting the component detection information of the component part according to the weight corresponding to the component part to obtain the component detection weighting information of the component part includes: multiplying the weight corresponding to the component part by the component detection information of the component part to obtain the component detection weighting information of the component part. For example, if the component detection information includes component correct-assembly information and the weight is 1.1, the component detection weighting information is 1.1 times the component correct-assembly information; or, if the component detection information includes a component assembly accuracy of 70% and the weight is 1.1, the component detection weighting information is 77%.
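The numeric case above can be sketched directly; the function name is an assumption.

```python
# Sketch: weight a component assembly accuracy by the component's importance weight,
# reproducing the 70% x 1.1 = 77% example above.
def weight_detection_info(assembly_accuracy: float, importance_weight: float) -> float:
    return round(assembly_accuracy * importance_weight, 4)

print(weight_detection_info(0.70, 1.1))   # 0.77, i.e. 77%
```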
Detecting the assembly condition of the target workpiece according to the component detection weighting information of each component part to obtain a detection result includes:
Aggregating the component detection weighting information of each component to obtain component detection aggregation information; and detecting the assembly condition of the target workpiece according to the matching relation between the component detection aggregation information and the preset aggregation information condition to obtain a detection result.
As an embodiment, aggregating the component detection weighting information of each component part to obtain component detection aggregation information includes: adding the component detection weighting information of each component part to obtain the component detection aggregation information. Further, if the component detection information includes component correct-assembly information or component incorrect-assembly information, the component detection weighting information corresponding to incorrect assembly is subtracted from the component detection weighting information corresponding to correct assembly to obtain the component detection aggregation information. For example, the target workpiece includes component parts A, B and C; the component detection weighting information of component part A is 1.2 (correct assembly), that of component part B is 0.8 (incorrect assembly), and that of component part C is 0.8 (correct assembly), so the component detection aggregation information of the target workpiece is 1.2 + 0.8 - 0.8 = 1.2 (component correct-assembly information). If the component detection information includes a component assembly accuracy, the component detection weighting information of each component part is simply added to obtain the component detection aggregation information.
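The signed aggregation in the A/B/C example can be sketched as follows; the representation of the weighted values as (value, correctly_assembled) pairs is an assumption.

```python
# Sketch: add the weighted values of correctly assembled parts and subtract those of
# incorrectly assembled parts to obtain the component detection aggregation information.
def aggregate_weighted_info(weighted_infos: list) -> float:
    """weighted_infos: list of (weighted_value, correctly_assembled) pairs."""
    total = 0.0
    for value, correctly_assembled in weighted_infos:
        total += value if correctly_assembled else -value
    return round(total, 6)

# A = 1.2 (correct), B = 0.8 (incorrect), C = 0.8 (correct) -> 1.2
print(aggregate_weighted_info([(1.2, True), (0.8, False), (0.8, True)]))
```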
As an embodiment, detecting an assembly condition of a target workpiece according to a matching relationship between component detection aggregation information and a preset aggregation information condition to obtain a detection result, including: the detection result comprises a workpiece correct assembly label or a workpiece incorrect assembly label, and if the component detection aggregation information is matched with the preset aggregation information condition, the workpiece correct assembly label is used as the detection result; and if the component detection aggregation information is not matched with the preset aggregation information condition, the workpiece error assembly label is used as a detection result.
As another embodiment, detecting the assembly condition of the target workpiece according to the matching relationship between the component detection aggregation information and the preset aggregation information condition to obtain a detection result includes: the detection result comprises workpiece assembly accuracy, and the matching degree between the component detection aggregation information and the preset aggregation information condition is output as the workpiece assembly accuracy.
In this way, a higher weight is set for component parts that are more important to workpiece detection or have a greater influence on the normal use of the workpiece, so that the component detection weighting information of those component parts plays a decisive role in workpiece detection; the assembly detection accuracy of the workpiece can therefore be improved to a certain extent.
In the above workpiece assembly detection method, for an assembled target workpiece, component information of at least one component part on the target workpiece is acquired; each component part is classified according to its component information to obtain a component classification result; for each component part, the component part is detected and analyzed according to its component information and the component classification result to obtain component detection information; and the assembly condition of the target workpiece is detected according to the component detection information of each component part to obtain a detection result. Objectively existing, multi-dimensional component information and component classification information serve as the basis for generating the component detection information of the component parts on the target workpiece, and the component detection information in turn serves as the basis for detecting the assembly condition of the target workpiece, so the detection result accurately reflects the assembly condition of the target workpiece, the dependence of the assembly-condition detection result on the worker's observation ability and workpiece assembly experience is eliminated, and the detection accuracy of the workpiece assembly condition is improved.
To ensure the accuracy of the workpiece assembly-condition detection, a method for accurately detecting the assembly condition of the component parts is required. In an exemplary embodiment, as shown in FIG. 3, in step 106 of FIG. 2, detecting and analyzing the component part according to its component information and the component classification result to obtain component detection information includes:
step 202, extracting component features from component information of the component parts.
The component information in step 202 includes a component shape, a component size, a component location, and a component material.
Illustratively, step 202 includes: for each of the component parts, component features of the component parts are extracted from the part shape, the part size, the part location and the part material of the component parts.
Illustratively, extracting component features of a component part from a component shape, a component size, a component location, and a component material of the component part includes: extracting the component characteristics of the component parts from the component parts shape, the component parts size, the component parts position and the component parts material by a characteristic extractor, wherein the characteristic extractor comprises a shape characteristic extractor, a size characteristic extractor, a position characteristic extractor and a material characteristic extractor, and extracting the shape characteristics of the component parts from the component parts shape of the component parts by the shape characteristic extractor; extracting the size characteristics of the component parts from the sizes of the component parts by a size characteristic extractor; extracting the position features of the component parts from the positions of the component parts by a position feature extractor; extracting material characteristics of the component parts from the part materials of the component parts by a material characteristic extractor; and splicing the shape characteristics, the size characteristics, the position characteristics and the material characteristics of the component parts into the component characteristics of the component parts.
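The composition of the four feature extractors and the splicing step can be sketched as follows; the toy one-hot and normalisation encodings are illustrative assumptions and stand in for the shape, size, position and material feature extractors of the disclosure.

```python
import numpy as np

SHAPE_CODES = {"hexagonal": 0, "cylindrical": 1, "circular": 2}      # toy vocabulary
MATERIAL_CODES = {"steel": 0, "aluminum": 1, "rubber": 2}

def shape_features(shape: str) -> np.ndarray:
    return np.eye(len(SHAPE_CODES))[SHAPE_CODES[shape]]               # one-hot shape feature

def size_features(size_mm: float) -> np.ndarray:
    return np.array([size_mm / 100.0])                                # normalised size feature

def position_features(xy) -> np.ndarray:
    return np.asarray(xy, dtype=float) / 1000.0                       # normalised position feature

def material_features(material: str) -> np.ndarray:
    return np.eye(len(MATERIAL_CODES))[MATERIAL_CODES[material]]      # one-hot material feature

def component_features(shape: str, size_mm: float, xy, material: str) -> np.ndarray:
    # Splice the four feature groups into a single component feature vector.
    return np.concatenate([
        shape_features(shape), size_features(size_mm),
        position_features(xy), material_features(material),
    ])

vec = component_features("hexagonal", 8.0, (120.0, 45.0), "steel")    # shape (9,)
```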
And 204, detecting the assembly condition of the component parts according to the part characteristics by a pre-trained part detection model corresponding to the part classification result to obtain part detection information.
Illustratively, step 204 includes: and acquiring a pre-trained component detection model corresponding to the component classification result, and mapping the component characteristics into component detection information through the component detection model.
Optionally, after the assembly condition of the component part is detected through the pre-trained component detection model corresponding to the component classification result according to the component features and the component detection information is obtained, the method further includes: acquiring real assembly detection information corresponding to the component part; if the difference between the real assembly detection information and the component detection information is larger than a preset difference threshold, generating sample information corresponding to the component part according to the component information of the component part, the real assembly detection information and the component detection information; and updating the component detection model according to the sample information. Specifically, the sample information is sent to a local platform through the component detection model, the local platform sends the sample information to a cloud computing platform, and the cloud computing platform updates the component detection model according to the sample information.
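A minimal sketch of this sample-feedback step is shown below; the difference threshold, the dictionary layout of the sample, and the send_to_local_platform callback are assumptions introduced here for illustration, with the local platform acting as the relay to the remote computing platform.

```python
# Sketch: when the gap between the real assembly detection information and the predicted
# detection information exceeds a preset difference threshold, build a training sample
# and hand it to the local platform, which forwards it for model updating.
PRESET_DIFFERENCE_THRESHOLD = 0.2

def maybe_feed_back_sample(component_info: dict, real_accuracy: float,
                           predicted_accuracy: float, send_to_local_platform) -> dict | None:
    if abs(real_accuracy - predicted_accuracy) > PRESET_DIFFERENCE_THRESHOLD:
        sample = {
            "component_info": component_info,
            "real": real_accuracy,
            "predicted": predicted_accuracy,
        }
        send_to_local_platform(sample)   # local platform relays to the remote platform
        return sample
    return None
```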
In this way, the sample information uses the local platform as a transmission relay, which ensures the data security and data privacy of the workpiece-detection-related data.
In this embodiment, component features are extracted from the component information of the component part, and the assembly condition of the component part is detected through the pre-trained component detection model corresponding to the component classification result according to the component features to obtain the component detection information. The assembly-condition detection of the target workpiece is thus split into assembly-condition detection of each component part on the target workpiece, which avoids detecting the assembly condition of the target workpiece directly, where inaccurate detection may occur when the detection environment is complex; the assembly-condition detection accuracy of the workpiece is therefore improved.
To ensure the accuracy of the workpiece assembly-condition detection, a method for accurately detecting the assembly condition of the component parts is required. In an exemplary embodiment, as shown in FIG. 4, in step 106 of FIG. 2, detecting and analyzing the component part according to its component information and the component classification result to obtain component detection information includes:
Step 302, detecting the component parts through a preset sensor to obtain actual detection parameters, and obtaining standard detection parameters obtained by detection when the component parts are correctly assembled to a target workpiece.
Wherein the detection parameters (including but not limited to standard detection parameters, actual detection parameters, and corrected detection parameters) referred to throughout include at least one of temperature parameters, pressure parameters, current parameters, and voltage parameters.
Illustratively, step 302 includes: acquiring standard detection parameters collected by sensors deployed on a correctly assembled component part, where the correctly assembled component part is a component part correctly assembled onto a target workpiece of the same type; and acquiring actual detection parameters collected by the preset sensors deployed on the component part, where the preset sensors include, but are not limited to, temperature sensors, pressure sensors, current sensors and voltage sensors.
And step 304, correcting the actual detection parameters according to the component information and the component classification information of the component parts to obtain corrected detection parameters.
As an embodiment, step 304 includes: determining a correction coefficient corresponding to the actual detection parameter according to the component information of the component; and multiplying the correction coefficient by the actual detection parameter to obtain the corrected detection parameter.
As another embodiment, step 304 includes: determining a correction direction and a correction value corresponding to the actual detection parameter according to the component information of the component; and correcting the actual detection parameters according to the correction direction and the correction value corresponding to the actual detection parameters to obtain corrected detection parameters.
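The two correction strategies of step 304 can be sketched as follows; the function names are assumptions, and how the coefficient, direction and correction value are looked up from the component information is left open, as in the description.

```python
# Sketch of the two correction strategies for the actual detection parameter:
# (a) multiply by a correction coefficient, (b) apply a signed correction value.
def correct_by_coefficient(actual_value: float, correction_coefficient: float) -> float:
    return actual_value * correction_coefficient

def correct_by_direction(actual_value: float, correction_value: float, direction: int) -> float:
    """direction: +1 to increase, -1 to decrease the actual detection parameter."""
    return actual_value + direction * correction_value
```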
And step 306, detecting the assembly condition of the component parts according to the similarity between the corrected detection parameters and the standard detection parameters to obtain the part detection information.
As an embodiment, step 306 includes: the component detection information includes a component correct-assembly label or a component incorrect-assembly label; the similarity between the corrected detection parameter and the standard detection parameter is obtained, and if the similarity is larger than a preset similarity threshold, where the preset similarity threshold is the critical similarity between the corrected detection parameter and the standard detection parameter at which the component part is judged to be correctly assembled, the component correct-assembly label is output as the component detection information; and if the similarity is less than or equal to the preset similarity threshold, the component incorrect-assembly label is output as the component detection information.
As another embodiment, step 306 includes: the component detection information includes a component assembly accuracy, and the similarity between the corrected detection parameter and the standard detection parameter is output as the component assembly accuracy.
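For illustration, a sketch of the similarity-based judgement is given below; the similarity measure (an inverse normalised absolute difference) and the threshold value are assumptions, since the description does not fix a particular similarity metric.

```python
# Sketch: compare the corrected detection parameter with the standard detection parameter
# and derive component detection information from their similarity.
PRESET_SIMILARITY_THRESHOLD = 0.9

def parameter_similarity(corrected: float, standard: float) -> float:
    if standard == 0:
        return 1.0 if corrected == 0 else 0.0
    return max(0.0, 1.0 - abs(corrected - standard) / abs(standard))

def detect_component_by_similarity(corrected: float, standard: float) -> dict:
    similarity = parameter_similarity(corrected, standard)
    if similarity > PRESET_SIMILARITY_THRESHOLD:
        return {"label": "component correctly assembled", "assembly_accuracy": similarity}
    return {"label": "component incorrectly assembled", "assembly_accuracy": similarity}
```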
In this embodiment, the component part is detected by the preset sensor to obtain the actual detection parameters, and the standard detection parameters obtained when the component part is correctly assembled to the target workpiece are acquired; the actual detection parameters are corrected according to the component information and the component classification information of the component part to obtain corrected detection parameters, so that measurement errors are quantified and corrected through the component information. The assembly condition of the component part is then detected according to the similarity between the corrected detection parameters and the standard detection parameters to obtain the component detection information, which to a certain extent avoids misjudging the assembly condition due to measurement errors caused by design tolerances of the component part or other unreliability, thereby improving the accuracy of the assembly-condition detection of the component parts.
In a more detailed embodiment, referring to FIG. 5, first, for each component part, the illumination reflection capability of the component part is determined; an illumination compensation condition for performing illumination compensation on the component part is constructed, where the stronger the illumination reflection capability of the component part, the lower the compensation illumination intensity corresponding to the illumination compensation condition; component part images carrying the component part features are acquired by a plurality of cameras in an image acquisition environment in which the lighting device performs illumination compensation according to the illumination compensation condition, and image recognition is performed on the component part images to obtain the component information of the component part. Then, for each component part, component features are extracted from the component shape, the component size, the component position and the component material of the component part, and the component parts are classified according to their component features to obtain component classification results.
Further, according to the component features, the assembly condition of the component part is detected through the pre-trained component detection model corresponding to the component classification result to obtain component detection information; for each component part, the component detection information of the component part is weighted according to the weight corresponding to the component part to obtain component detection weighting information of the component part, where the weight corresponding to a component part is positively correlated with its importance level; the assembly condition of the target workpiece is detected according to the component detection weighting information of each component part to obtain a detection result; real assembly detection information corresponding to the component part is acquired, and if the difference between the real assembly detection information and the component detection information is larger than a preset difference threshold, sample information corresponding to the component part is generated according to the component information of the component part, the real assembly detection information and the component detection information; the sample information is sent to the cloud computing platform through the component detection model; and the cloud computing platform updates the component detection model according to the sample information.
In this way, in a scenario where the lighting conditions of the shooting environment are poor, illumination compensation is performed on the shooting environment by the lighting device, which avoids unclear images caused by poor lighting and improves the accuracy of the component information obtained by image recognition. A higher weight is set for component parts that are more important to workpiece detection or have a greater influence on the normal use of the workpiece, so that the component detection weighting information of those component parts plays a decisive role in workpiece detection, and the assembly detection accuracy of the workpiece can be improved to a certain extent. Automatic, model-based detection of the assembly condition of the target workpiece is provided, so that workers do not need to compare the workpieces one by one, and the detection efficiency of the workpiece assembly condition is improved. Objective, multi-dimensional component information and component classification information serve as the basis for generating the component detection information of the component parts on the target workpiece, and the component detection information serves as the basis for detecting the assembly condition of the target workpiece, so the detection result accurately reflects the assembly condition of the target workpiece, the dependence of the assembly-condition detection result on the worker's observation ability and workpiece assembly experience is eliminated, and the detection accuracy of the workpiece assembly condition is improved.
It should be understood that, although the steps in the flowcharts involved in the above embodiments are shown sequentially as indicated by the arrows, these steps are not necessarily executed in the order indicated. Unless explicitly stated herein, the order of execution of these steps is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps in the flowcharts involved in the above embodiments may include multiple sub-steps or stages, which are not necessarily executed at the same moment but may be executed at different moments, and their order of execution is not necessarily sequential; they may be executed in turn or alternately with at least some of the other steps, sub-steps or stages.
Based on the same inventive concept, the embodiment of the application also provides a workpiece assembly detection device for realizing the workpiece assembly detection method. The implementation of the solution provided by the device is similar to that described in the above method, so the specific limitation of one or more embodiments of the workpiece assembly detection device provided below may be referred to above as limitation of the workpiece assembly detection method, and will not be repeated here.
In one exemplary embodiment, as shown in fig. 6, there is provided a workpiece assembly inspection apparatus 500 comprising: an acquisition module 501, a classification module 502, a component detection module 503, and a workpiece detection module 504, wherein:
An obtaining module 501, configured to obtain, for a target workpiece after assembly, component information of at least one component on the target workpiece;
The classification module 502 is configured to classify each component according to the component information of each component, so as to obtain a component classification result;
a component detection module 503, configured to detect and analyze, for each component, according to component information and component classification results of the component, to obtain component detection information;
the workpiece detection module 504 is configured to detect an assembly condition of the target workpiece according to the component detection information of each component, so as to obtain a detection result.
In one embodiment, the component detection module 503 is further configured to extract component features from component information of the component parts; and detecting the assembly condition of the component parts through a pre-trained part detection model corresponding to the part classification result according to the part characteristics, so as to obtain part detection information.
In one embodiment, the component detecting module 503 is further configured to detect the component by using a preset sensor to obtain an actual detection parameter, and obtain a standard detection parameter detected when the component is correctly assembled to the target workpiece; correcting the actual detection parameters according to the component information and the component classification information of the component parts to obtain corrected detection parameters; and detecting the assembly condition of the component parts according to the similarity between the corrected detection parameters and the standard detection parameters to obtain the part detection information.
In one embodiment, the workpiece detection module 504 is further configured to weight, for each component, component detection information of the component according to a weight corresponding to the component, to obtain component detection weighted information of the component, where the weight corresponding to the component is positively related to an importance level of the component; and detecting the assembly condition of the target workpiece according to the component detection weighting information of each component to obtain a detection result.
In one embodiment, the workpiece detection module 504 is further configured to aggregate the component detection weighting information of each component to obtain component detection aggregate information; and detecting the assembly condition of the target workpiece according to the matching relation between the component detection aggregation information and the preset aggregation information condition to obtain a detection result.
In one embodiment, the workpiece assembly inspection device 500 is further configured to: before acquiring the component information of at least one component part on the target workpiece, determine, for each component part, the illumination reflection capability of the component part; construct an illumination compensation condition for performing illumination compensation on the component part, where the stronger the illumination reflection capability of the component part, the lower the compensation illumination intensity corresponding to the illumination compensation condition; and acquire component part images carrying the component part features in an image acquisition environment corresponding to the illumination compensation condition, so as to obtain the component information of the component part from the component part images.
The above-mentioned various modules in the workpiece assembly inspection device may be implemented in whole or in part by software, hardware, and combinations thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one exemplary embodiment, a computer device is provided, which may be a terminal, and an internal structure diagram thereof may be as shown in fig. 7. The computer device includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input means. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface, the display unit and the input device are connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The input/output interface of the computer device is used to exchange information between the processor and the external device. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless mode can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement a method of workpiece assembly inspection. The display unit of the computer device is used for forming a visual picture, and can be a display screen, a projection device or a virtual reality imaging device. The display screen can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, can also be a key, a track ball or a touch pad arranged on the shell of the computer equipment, and can also be an external keyboard, a touch pad or a mouse and the like.
It will be appreciated by those skilled in the art that the structure shown in FIG. 7 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the computer device to which the solution of the present application is applied; a particular computer device may include more or fewer components than shown, combine some of the components, or have a different arrangement of components.
In an exemplary embodiment, a computer device is provided, comprising a memory and a processor, the memory storing a computer program, and the processor implementing the steps of the method embodiments described above when executing the computer program.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, implements the steps of the method embodiments described above.
In an embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
It should be noted that the image information involved in the present application (including, but not limited to, the target image, the first training sample, the second training sample, and the like) is information and data authorized by the user or fully authorized by all parties, and the collection, use, and processing of the related data must comply with the relevant regulations.
Those skilled in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware; the computer program may be stored on a non-transitory computer-readable storage medium, and when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided in the present application may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. The volatile memory may include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, the RAM may take various forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the embodiments provided in the present application may include at least one of a relational database and a non-relational database. The non-relational database may include, but is not limited to, a blockchain-based distributed database and the like. The processor referred to in the embodiments provided in the present application may be, but is not limited to, a general-purpose processor, a central processing unit, a graphics processor, a digital signal processor, a programmable logic unit, or a data processing logic unit based on quantum computing.
The technical features of the above embodiments may be combined in any manner. For brevity of description, not all possible combinations of the technical features in the above embodiments have been described; however, as long as a combination of these technical features involves no contradiction, it should be considered to fall within the scope of this specification.
The foregoing embodiments represent only a few implementations of the present application, and their descriptions are specific and detailed, but they should not be construed as limiting the scope of the application. It should be noted that those of ordinary skill in the art may make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (10)

1. A method of workpiece assembly inspection, the method comprising:
acquiring, for an assembled target workpiece, part information of at least one component part on the target workpiece;
classifying each component part according to the part information of each component part to obtain a part classification result;
For each component part, detecting and analyzing the component part according to the part information and the part classification result of the component part to obtain part detection information;
and detecting the assembly condition of the target workpiece according to the part detection information of each of the component parts to obtain a detection result.
2. The method according to claim 1, wherein the detecting and analyzing the component part according to the part information and the part classification result of the component part to obtain part detection information comprises:
extracting part features from the part information of the component part;
and detecting, according to the part features, the assembly condition of the component part through a pre-trained part detection model corresponding to the part classification result, to obtain the part detection information.
3. The method according to claim 1, wherein the detecting and analyzing the component part according to the part information and the part classification result of the component part to obtain part detection information comprises:
detecting the component part through a preset sensor to obtain actual detection parameters, and obtaining standard detection parameters obtained by detection when the component part is correctly assembled to the target workpiece;
correcting the actual detection parameters according to the part information and the part classification result of the component part to obtain corrected detection parameters;
and detecting the assembly condition of the component part according to the similarity between the corrected detection parameters and the standard detection parameters to obtain the part detection information.
4. The method according to claim 1, wherein the detecting the assembly condition of the target workpiece according to the part detection information of each of the component parts to obtain a detection result comprises:
for each component part, weighting the part detection information of the component part according to a weight corresponding to the component part to obtain component detection weighting information of the component part, wherein the weight corresponding to the component part is positively correlated with the importance degree of the component part;
And detecting the assembly condition of the target workpiece according to the component detection weighting information of each component to obtain a detection result.
5. The method according to claim 4, wherein the detecting the assembly condition of the target workpiece according to the component detection weighting information of each of the component parts comprises:
Aggregating the component detection weighting information of each component to obtain component detection aggregation information;
and detecting the assembly condition of the target workpiece according to a matching relationship between the component detection aggregation information and a preset aggregation information condition to obtain a detection result.
6. The method according to any one of claims 1 to 5, wherein prior to said acquiring part information of at least one component part on the target workpiece, the method further comprises:
determining, for each of the component parts, an illumination reflection capability of the component part;
constructing an illumination compensation condition for performing illumination compensation on the component part, wherein the stronger the illumination reflection capability of the component part, the lower the compensation illumination intensity corresponding to the illumination compensation condition for the component part;
and acquiring a component part image carrying features of the component part in an image acquisition environment corresponding to the illumination compensation condition, so as to obtain the part information of the component part from the component part image.
7. A workpiece assembly inspection device, comprising:
an acquisition module, configured to acquire, for an assembled target workpiece, part information of at least one component part on the target workpiece;
a classification module, configured to classify each component part according to the part information of each component part to obtain a part classification result;
a component detection module, configured to, for each component part, detect and analyze the component part according to the part information and the part classification result of the component part to obtain part detection information;
and a workpiece detection module, configured to detect the assembly condition of the target workpiece according to the part detection information of each of the component parts to obtain a detection result.
8. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 6 when the computer program is executed.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 6.
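As a purely illustrative, non-limiting sketch of the correction-and-similarity step recited in claim 3: raw sensor readings from a component part are adjusted with a class-specific correction factor and then compared against the standard readings taken when the part is correctly assembled. The gain value, the similarity measure (one minus the largest relative deviation), and the pass threshold below are hypothetical choices and are not prescribed by the claims.

# Illustrative only: the per-class gain, the similarity measure, and the
# threshold are assumptions made for this example.
from typing import List

def detect_part(actual: List[float], standard: List[float],
                class_gain: float, pass_threshold: float = 0.95) -> bool:
    """Correct raw sensor readings with a class-specific gain, then judge the
    part's assembly by how similar the corrected readings are to the standard."""
    corrected = [v * class_gain for v in actual]                        # correction step
    deviations = [abs(c - s) / abs(s) for c, s in zip(corrected, standard)]
    similarity = 1.0 - max(deviations)                                  # 1.0 means identical
    return similarity >= pass_threshold                                 # True -> assembled correctly

# Hypothetical usage: readings from a 'bolt'-class part with a gain calibrated
# offline for that class (standard readings assumed non-zero).
ok = detect_part(actual=[0.98, 1.21, 0.47], standard=[1.0, 1.2, 0.5], class_gain=1.02)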
CN202410019331.3A 2024-01-05 2024-01-05 Workpiece assembly inspection method, apparatus, device, medium, and computer program product Pending CN117934796A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410019331.3A CN117934796A (en) 2024-01-05 2024-01-05 Workpiece assembly inspection method, apparatus, device, medium, and computer program product

Publications (1)

Publication Number Publication Date
CN117934796A 2024-04-26

Family

ID=90755081

Country Status (1)

Country Link
CN (1) CN117934796A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination