CN116952166A - Method, device, equipment and medium for detecting parts of automobile door handle assembly - Google Patents


Info

Publication number
CN116952166A
CN116952166A (application CN202311211675.6A)
Authority
CN
China
Prior art keywords
detection
target
image
door handle
handle assembly
Prior art date
Legal status
Granted
Application number
CN202311211675.6A
Other languages
Chinese (zh)
Other versions
CN116952166B (en)
Inventor
李浩斌
陈立名
曹彬
胡江洪
袁帅鹏
李志�
Current Assignee
Fitow Tianjin Detection Technology Co Ltd
Original Assignee
Fitow Tianjin Detection Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Fitow Tianjin Detection Technology Co Ltd filed Critical Fitow Tianjin Detection Technology Co Ltd
Priority to CN202311211675.6A priority Critical patent/CN116952166B/en
Publication of CN116952166A publication Critical patent/CN116952166A/en
Application granted granted Critical
Publication of CN116952166B publication Critical patent/CN116952166B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/26: Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01D: MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D5/00: Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable
    • G01D5/26: Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable characterised by optical transfer means, i.e. using infrared, visible, or ultraviolet light
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01J: MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00: Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/46: Measurement of colour; Colour measuring devices, e.g. colorimeters
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30: Computing systems specially adapted for manufacturing

Abstract

The invention discloses a part detection method, device, equipment and medium of an automobile door handle assembly. The method comprises the following steps: controlling the target door handle assembly to sequentially move to each detection point on the detection platform, and obtaining at least one detection image of the target door handle assembly at each detection point, wherein each detection image comprises at least one part to be detected; matching each detection image with a corresponding template image, and acquiring area images respectively corresponding to each detection image according to the matching result; analyzing each area image, and determining whether each part to be detected meets the qualification conditions according to the analysis result. By adopting this technical scheme, the number of cameras used in the part detection process of the automobile door handle assembly can be reduced, and the clarity of the detection images can be effectively improved.

Description

Method, device, equipment and medium for detecting parts of automobile door handle assembly
Technical Field
The invention relates to the technical field of part detection, in particular to a part detection method, device, equipment and medium of an automobile door handle assembly.
Background
In the detection process of an automobile door handle assembly, a plurality of detection surfaces of the assembly need to be detected at the same time, and each detection surface may contain a plurality of parts to be detected.
In the prior art, when the door handle assembly has a plurality of detection surfaces and each detection surface contains many parts to be detected, a plurality of cameras generally need to be configured for each detection surface in order to obtain clear detection images of all the parts.
However, because the size of the door handle assembly is limited, if a plurality of cameras are configured for each detection surface, the cameras generally have to be mounted at a relatively long distance, so clear detection images cannot be captured; moreover, long-distance shooting requires the support of large light sources, which makes the detection cost high.
Disclosure of Invention
The invention provides a part detection method, device, equipment and medium for an automobile door handle assembly, which can reduce the number of cameras used in the part detection process of the automobile door handle assembly and can effectively improve the clarity of the detection images.
According to an aspect of the present invention, there is provided a part detection method of an automobile door handle assembly, including:
controlling a target door handle assembly to sequentially move to each detection point on a detection table, and acquiring at least one detection image of the target door handle assembly at each detection point; wherein each detection image comprises at least one part to be detected;
matching each detection image with a corresponding template image, and acquiring area images respectively corresponding to each detection image according to a matching result;
analyzing each area image, and determining whether each part to be detected meets the qualification conditions according to the analysis result.
According to another aspect of the present invention, there is provided a part detection device of an automobile door handle assembly, comprising:
the detection image acquisition module is used for controlling the target door handle assembly to sequentially move to each detection point on the detection table and acquiring at least one detection image of the target door handle assembly at each detection point; wherein each detection image comprises at least one part to be detected;
the regional image acquisition module is used for matching each detection image with the corresponding template image and acquiring regional images corresponding to each detection image respectively according to the matching result;
the part detection module is used for analyzing each area image and determining whether each part to be detected meets the qualification condition according to the analysis result.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor, to enable the at least one processor to perform the part detection method of the automobile door handle assembly according to any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions for causing a processor to execute the part detection method of the automobile door handle assembly according to any one of the embodiments of the present invention.
According to the technical scheme, the target door handle assembly is moved to each detection point, a detection image is acquired at each detection point, a region image is obtained from each detection image, and each region image is analyzed to determine whether all the parts in it meet the qualification conditions.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
FIG. 1 is a flow chart of a method for detecting a part of an automobile door handle assembly according to a first embodiment of the invention;
FIG. 2 is a flow chart of a method of detecting a part of another door handle assembly of an automobile according to a second embodiment of the invention;
fig. 3 is a schematic structural view of a part detection device of an automobile door handle assembly according to a third embodiment of the present invention;
fig. 4 is a schematic structural view of an electronic device implementing a method for detecting parts of an automobile door handle assembly according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a method for detecting parts of an automobile door handle assembly according to an embodiment of the present invention. The method is applicable to scenarios in which the automobile door handle assembly is moved to detection points on a detection table, a plurality of different detection images of each detection surface of the door handle assembly are captured by fixed cameras, and whether each part in the door handle assembly is qualified is determined according to the detection images. As shown in fig. 1, the method includes:
S110, controlling the target door handle assembly to sequentially move to each detection point on the detection platform, and acquiring at least one detection image of the target door handle assembly at each detection point.
Wherein each detection image comprises at least one part to be detected.
It can be understood that the automobile door handle assembly in the present invention may also be another mechanical structure that has a plurality of detection surfaces, each containing a plurality of parts to be detected, and whose space for configuring cameras is limited.
Alternatively, the target door handle assembly may refer to an automobile door handle assembly currently to be detected on the detection station.
Optionally, the target door handle assembly may include a plurality of detection surfaces, for example, the top surface and the side surfaces of the door handle assembly. Each detection surface may contain a plurality of parts to be detected, and on some detection surfaces the parts may be too many or too dispersed to be captured in the same image.
Alternatively, the detection images may be captured by cameras disposed on the detection table, and each detection image shows some of the parts in the door handle assembly. In the embodiment of the invention, to reduce the detection cost, and considering that the automobile door handle assembly is small and the detection space is limited, configuring a plurality of cameras for a single detection surface would increase the distance between each camera and the door handle assembly and enlarge the field of view of a single photograph, which may make the captured content unclear.
It will be appreciated that the camera configured for each inspection surface may be in a fixed state, and that a plurality of inspection points may be predetermined on the inspection table, each time the door handle assembly is moved to a different inspection point, the camera may capture a different area in the inspection surface corresponding thereto, i.e., the camera may capture a different part in the inspection surface.
S120, matching each detection image with the corresponding template image, and acquiring area images respectively corresponding to each detection image according to the matching result.
It can be understood that the detected image taken by the camera may contain useless environmental information besides the part to be detected, so that in order to improve the detection accuracy and the detection efficiency of the part, the region of interest may be obtained in the detected image, so that the part detection is performed in the region of interest.
Optionally, the template image may be a pre-acquired example image, the detection image photographed by the camera on each detection surface at each detection point may have a matched template image, the template image is substantially consistent with the parts in the corresponding detection image, the sum of the parts in each template image can cover all the parts to be detected, and the template image does not substantially contain useless information.
Furthermore, a designated part or a designated image area may be selected from the template image as a matching reference, and the position matching this reference is identified in the detection image so that the two images are aligned. The matching process may include steps such as rotation or scaling, after which the area of the detection image with the same content as the template image is extracted as the region of interest, namely the region image.
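As a purely illustrative sketch, this kind of region-of-interest extraction could be implemented with OpenCV template matching as follows; the function name, the normalized cross-correlation method and the 0.7 score threshold are assumptions rather than content of the patent, and the rotation and scaling steps mentioned above are omitted for brevity.

```python
# Illustrative sketch of region-image extraction via template matching.
# The matching method and threshold are assumptions, not patent disclosure.
import cv2
import numpy as np

def extract_region_image(detection_image: np.ndarray,
                         template_image: np.ndarray) -> np.ndarray:
    """Locate the template in the detection image and crop the matched
    region of interest (the 'region image')."""
    det_gray = cv2.cvtColor(detection_image, cv2.COLOR_BGR2GRAY)
    tpl_gray = cv2.cvtColor(template_image, cv2.COLOR_BGR2GRAY)

    # Normalized cross-correlation is fairly robust to moderate lighting changes.
    result = cv2.matchTemplate(det_gray, tpl_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)

    if max_val < 0.7:  # assumed matching-score threshold
        raise ValueError("template not found in detection image")

    x, y = max_loc
    h, w = tpl_gray.shape
    return detection_image[y:y + h, x:x + w]
```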
S130, analyzing each area image, and determining whether each part to be detected meets the qualification conditions according to the analysis result.
Optionally, the detection content of the parts of the automobile door handle assembly generally comprises the color, the angle and the presence or absence of the parts. In a specific example, the color of a target part should be the same as a preset standard color, the installation angle of a part such as a spring should remain within a certain range, and each part should be installed at its designated position.
Optionally, each part in the region image can be extracted by image recognition, the part is then evaluated against each configured detection item, and if every detection item meets the preset standard, the part meets the qualification condition.
According to the technical scheme, the target door handle assembly is moved to each detection point, a detection image is acquired at each detection point, a region image is obtained from each detection image, and each region image is analyzed to determine whether all the parts in it meet the qualification conditions.
Example two
Fig. 2 is a flowchart of a part detection method of an automobile door handle assembly according to a second embodiment of the present invention, and the method for acquiring a detection image is specifically described based on the above embodiment. As shown in fig. 2, the method includes:
S210, determining, each time a specified time node is reached, the current detection point matched with the current time node according to a preset detection point moving sequence and the current position of the target door handle assembly.
Optionally, before the door handle assembly is detected, a configuration file may be preloaded, where the configuration file may include a plurality of detection points preset on the detection platform, and a detection plane to be shot at each detection point, and a specific position of the detection point may be represented by coordinates matched with the detection platform.
It will be appreciated that when the parts to be detected on a detection surface are few or concentrated, images of all the parts on that surface may be obtained with only one or a few shots, whereas when the parts are numerous or scattered, multiple shots are required. The detection points therefore need to be set to cover the distribution of all the parts on every detection surface of the target door handle assembly. In the actual shooting process, however, only some of the detection surfaces may need to be shot at a given detection point, so the detection surfaces to be shot at each detection point may be preset in the configuration file, which makes it easy to determine the cameras used for shooting at each detection point.
Alternatively, the target door handle assembly may be moved to the next detection point according to the designated time nodes, or may be moved to the next detection point after it is detected that all detection images at the current detection point have been captured.
Optionally, in order to efficiently acquire all detection images, the sequence of the detection points may be preset to improve the movement efficiency of the door handle assembly, thereby improving the detection efficiency.
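For illustration only, the preloaded configuration described above might take a shape like the following sketch; every field name, coordinate value and camera identifier in it is an assumption rather than content disclosed by the patent.

```python
# Illustrative shape of a preloaded detection-point configuration; all names
# and values below are assumptions used only to make the structure concrete.
DETECTION_CONFIG = {
    "move_sequence": ["P1", "P2", "P3"],           # preset detection-point order
    "points": {
        "P1": {"coords": (120.0, 45.0),            # table coordinates of the point
               "surfaces": ["left", "right"]},     # detection surfaces shot here
        "P2": {"coords": (180.0, 45.0),
               "surfaces": ["top", "left", "right"]},
        "P3": {"coords": (240.0, 45.0),
               "surfaces": ["top"]},
    },
    "surface_to_camera": {"top": "cam_0",          # one fixed camera per surface
                          "left": "cam_1",
                          "right": "cam_2"},
}
```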
S220, acquiring position coordinates matched with the current detection point, and moving the target door handle assembly to the current detection point according to the position coordinates.
S230, after the target door handle assembly is determined to move to the current detection point, determining at least one target camera according to at least one current detection surface matched with the current detection point.
Each camera on the detection table is used for shooting different detection surfaces in the target door handle assembly.
Alternatively, the target camera may refer to a camera used for capturing a detection image at the current detection point. At least one current detection surface to be detected corresponds to each detection point, and after the current detection surfaces are determined, the target camera corresponding to each current detection surface may be determined.
S240, according to the part attribute information of the parts contained in each current detection surface, shooting parameters corresponding to each target camera are determined.
The determining, according to the part attribute information of the parts included in the current detection surfaces, shooting parameters corresponding to each target camera may include:
determining the part materials of the parts contained in the current detection surfaces according to the part attribute information of the parts contained in the current detection surfaces, and judging whether the exposure time difference value of the parts contained in the current detection surfaces is within a preset time range according to the exposure time matched with the part materials;
and determining the number of detection images to be shot by each target camera and the corresponding exposure time of each shooting according to the judging result.
According to the judgment result, determining the number of detection images to be shot by each target camera and the corresponding exposure time for each shooting may include:
if the exposure time difference value of each part contained in the target detection surface is within a preset time range, determining that the number of detection images to be shot by the target camera corresponding to the target detection surface is a preset number value, and the exposure time corresponding to the target camera corresponding to the target detection surface is an average value of the exposure time of each part contained in the target detection surface;
if the exposure time difference of each part contained in the target detection surface exceeds a preset time range, determining the number of detection images to be shot by the target camera corresponding to the target detection surface and the exposure time of each shot according to the exposure time and the preset time range corresponding to each part in the target detection surface.
Optionally, the shooting parameters may include an exposure time corresponding to each camera, and may further include other parameters such as a gain of the camera, where the specific shooting parameters may be determined according to an actual shooting situation, and are not limited herein.
It can be understood that parts made of different materials such as metal, plastic and rubber reflect light to different degrees. If parts made of different materials are exposed with the same exposure time, some parts may reflect light severely while others appear too dark, so that the specific structure of those parts is difficult to see. Therefore, the invention creatively proposes that the part attribute information on each detection surface at each detection point can be predetermined and the exposure time determined according to the part attribute information, so that a good shooting effect can be ensured for different parts.
Alternatively, the exposure time may refer to the time elapsed from the start of shooting to the end of shooting during each shooting.
Optionally, if the differences between the exposure times corresponding to the materials of the parts on the current detection surface are within the preset time range, using the same exposure time has little influence on the imaging of each part even though the materials differ, and clear part images can still be obtained. In this case the preset number of detection images may be 1, that is, only one image is shot for the detection surface, and the exposure time may be the average of the exposure times of the parts.
Further, if the differences between the exposure times corresponding to the part materials on the current detection surface exceed the preset time range, imaging may deteriorate if the parts are shot with a single exposure time. In this case the exposure times may be grouped according to the preset time range, the number of detection images to be shot by the target camera determined by the number of groups, and the average exposure time of each group used as the exposure time of the corresponding shot.
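One possible, non-limiting sketch of this exposure-time decision is shown below; the millisecond values, the simple left-to-right grouping strategy and the 5 ms preset range used in the example call are assumptions made only for illustration.

```python
# Illustrative sketch of the exposure-time decision described above; the
# grouping strategy and all numeric values are assumptions, not patent content.
from typing import List

def plan_exposures(part_exposures_ms: List[float],
                   preset_range_ms: float) -> List[float]:
    """Return one exposure time per detection image to be shot for a surface."""
    exposures = sorted(part_exposures_ms)
    if exposures[-1] - exposures[0] <= preset_range_ms:
        # All parts tolerate the same exposure: shoot once at the average.
        return [sum(exposures) / len(exposures)]

    # Otherwise group parts whose exposure times lie within the preset range
    # and shoot one image per group, at the group's average exposure.
    groups, current = [], [exposures[0]]
    for t in exposures[1:]:
        if t - current[0] <= preset_range_ms:
            current.append(t)
        else:
            groups.append(current)
            current = [t]
    groups.append(current)
    return [sum(g) / len(g) for g in groups]

# Example: three parts of different materials on one surface, 5 ms tolerance.
print(plan_exposures([8.0, 10.0, 22.0], preset_range_ms=5.0))  # -> [9.0, 22.0]
```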
S250, controlling the target light source to sequentially shoot in cooperation with each target camera according to shooting parameters of each target camera, and acquiring shooting images of each target camera as at least one detection image of the target door handle assembly at the current detection point.
According to the shooting parameters of each target camera, the target light source is controlled to sequentially shoot in cooperation with each target camera, and shooting images of each target camera are obtained as at least one detection image of the target door handle assembly at the current detection point, and the method can comprise the following steps:
when it is determined that the current detection point has a plurality of matched current detection surfaces, acquiring the preset shooting sequence of the target cameras;
determining the current camera among the target cameras according to the shooting sequence, controlling the target light source to expose the current detection surface shot by the current camera according to the exposure time corresponding to the current camera, and simultaneously controlling the current camera to shoot;
after the current camera finishes shooting, turning off the target light source, and repeating the previous step until every target camera at the current detection point has finished shooting.
It can be understood that the number and distribution of the parts to be detected differ between detection surfaces, so the detection surfaces that need to be shot may differ between detection points; for example, the left and right sides may need to be shot at the first detection point, while the top, left and right sides need to be shot at the second detection point. Although each detection surface is provided with a camera, shooting with all cameras simultaneously may affect image clarity. Therefore, if several detection surfaces need to be detected at the current detection point, a shooting sequence among the cameras can be determined so that only a single camera shoots at any moment, and the light source is turned off after each shot, which effectively avoids the light source influencing the next shot and preserves image clarity to the greatest extent.
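A self-contained sketch of this single-camera-at-a-time capture loop is given below; the Camera and LightSource classes are stand-in stubs rather than a real camera or light-controller SDK, and all names are assumptions.

```python
# Illustrative sketch of the sequential capture loop; Camera and LightSource
# are placeholder stubs, not an actual hardware interface.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Camera:
    name: str
    surface: str                                    # detection surface this fixed camera points at
    def capture(self, exposure_ms: float) -> str:
        return f"image[{self.name}, {exposure_ms:.1f} ms]"   # placeholder frame

class LightSource:
    def on(self, surface: str) -> None:
        print(f"light on for {surface}")
    def off(self) -> None:
        print("light off")

def capture_at_detection_point(cameras_in_order: List[Camera],
                               light: LightSource,
                               exposure_ms_by_camera: Dict[str, float]) -> List[str]:
    """Shoot one camera at a time in the preset sequence, turning the light
    source off after every shot so it cannot influence the next one."""
    images = []
    for cam in cameras_in_order:
        exposure = exposure_ms_by_camera[cam.name]
        light.on(cam.surface)
        images.append(cam.capture(exposure_ms=exposure))
        light.off()
    return images

print(capture_at_detection_point(
    [Camera("cam_1", "left"), Camera("cam_2", "right")],
    LightSource(),
    {"cam_1": 9.0, "cam_2": 22.0}))
```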
Alternatively, after the detection images at each detection point are captured, they may be named according to a predetermined rule. An alternative naming rule is to encode the automobile door handle assembly to be detected, the shooting camera and the shooting sequence of that camera; after each shot, the currently matched codes are acquired and combined to generate the name of the detection image just captured. With a reasonable naming rule, the position of a defect detected in real time can be reported back to the industrial personal computer.
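Such a naming rule might look like the following sketch; the field order, separator and file extension are assumptions rather than the rule actually used by the patent.

```python
# Illustrative naming rule combining assembly code, camera code and shot index;
# the exact fields and separator are assumptions.
def name_detection_image(assembly_code: str, camera_code: str, shot_index: int) -> str:
    return f"{assembly_code}_{camera_code}_{shot_index:02d}.png"

print(name_detection_image("DH20230920-001", "cam_1", 3))  # DH20230920-001_cam_1_03.png
```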
S260, determining the target template image matched with the target detection image according to the detection point information and the detection surface information matched with the target detection image.
S270, respectively identifying the target detection image and the positioning part in the target template image, and matching the target detection image with the target template image according to the positioning part.
Alternatively, an area with a high degree of matching to the target template image can be identified by template matching and used as the positioning area image. The positioning area image and the area where the positioning part is located are generally not disturbed by illumination, so their imaging is stable.
S280, acquiring a region image matched with the target template image in the target detection image according to the matching result, and taking the region image as a target region image.
S290, analyzing the target area image, determining identity information of each part in the target area image, and acquiring identification information of each part.
Optionally, visual algorithms such as edge detection, gray level analysis, template matching, etc. may be applied to analyze the target area image to identify each part in the target area image.
The identification information may include at least color information, angle information, and presence information.
Alternatively, the identity information of the part may be a part model, or other information that can uniquely identify the part.
Optionally, edge detection can be applied to identify the edge shape of the part; the edges are fitted to obtain two straight lines, and the included angle between the two lines can be used as the angle information of the part.
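A minimal sketch of this angle measurement is shown below; the use of cv2.fitLine with an L2 distance and the synthetic test edges are assumed implementation details, not something prescribed by the patent.

```python
# Illustrative sketch: fit each detected edge as a straight line and report the
# included angle between the two fitted lines.
import cv2
import numpy as np

def included_angle_deg(edge_points_a: np.ndarray, edge_points_b: np.ndarray) -> float:
    """edge_points_* are N x 2 arrays of (x, y) edge pixel coordinates."""
    line_a = cv2.fitLine(edge_points_a.astype(np.float32), cv2.DIST_L2, 0, 0.01, 0.01).flatten()
    line_b = cv2.fitLine(edge_points_b.astype(np.float32), cv2.DIST_L2, 0, 0.01, 0.01).flatten()
    a, b = line_a[:2], line_b[:2]                  # unit direction vectors (vx, vy)
    cos_t = abs(float(np.dot(a, b)))               # sign of direction is irrelevant
    return float(np.degrees(np.arccos(np.clip(cos_t, 0.0, 1.0))))

# Synthetic check: a horizontal edge and a 45-degree edge.
horizontal = np.array([[x, 0] for x in range(20)])
diagonal = np.array([[x, x] for x in range(20)])
print(round(included_angle_deg(horizontal, diagonal), 1))  # approximately 45.0
```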
S2100, acquiring standard configuration information of each part in the target area image according to the preloaded detection file and the identity information of each part.
Optionally, each part has standard configuration information matched with it, and the standard configuration information can likewise include color information, angle information and presence information. The color information may refer to the appearance color of the part, the angle information may refer to the installation angle of the part on the detection surface, and the presence information may refer to whether the part is installed at the specified installation position on the detection surface, that is, whether a target part exists at that installation position.
S2110, comparing the standard configuration information of each part with the identification information of each part, and determining whether each part meets the qualification condition according to the comparison result.
Optionally, the part satisfies the qualification condition when the identification information of the part is equal to or within a range specified by the standard configuration information of the part.
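The comparison in S2110 could be implemented along the lines of the sketch below; the tolerance field names, the RGB tolerance of 10 and the angle range in the example are assumptions chosen only to illustrate the comparison.

```python
# Illustrative sketch of the qualification check; the data layout and the
# tolerance values are assumptions, not taken from the patent.
def is_qualified(identified: dict, standard: dict) -> bool:
    """identified/standard carry 'color' (RGB), an angle and 'present' flags."""
    color_ok = all(abs(identified["color"][i] - standard["color"][i])
                   <= standard.get("color_tol", 10) for i in range(3))
    angle_ok = (standard["angle_min"] <= identified["angle_deg"]
                <= standard["angle_max"])
    present_ok = identified["present"] == standard["present"]
    return color_ok and angle_ok and present_ok

print(is_qualified(
    {"color": (200, 10, 12), "angle_deg": 44.0, "present": True},
    {"color": (205, 12, 10), "angle_min": 40.0, "angle_max": 50.0, "present": True}))
# -> True
```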
According to the technical scheme of this embodiment, the exposure time of each camera is determined according to the material of the parts, and the light source is turned off after each shot, which avoids unclear parts in the detection images, effectively improves the clarity of the detection images and improves part detection efficiency. The target door handle assembly is moved to each detection point, the detection images at each detection point are acquired, region images are obtained from the detection images, and the region images are analyzed to judge whether each part in them meets the qualification conditions. Clear and complete detection images can thus be obtained with fewer cameras, the number of stops of the door handle assembly can be reduced, the part detection efficiency of the automobile door handle assembly and the accuracy of the detection results are improved, and the detection cost is effectively reduced.
Example III
Fig. 3 is a schematic structural diagram of a part detection device of an automobile door handle assembly according to a third embodiment of the present invention. As shown in fig. 3, the apparatus includes: a detected image acquisition module 310, a region image acquisition module 320, and a part detection module 330.
A detection image acquisition module 310, configured to control the target door handle assembly to sequentially move to each detection point on the detection table, and acquire at least one detection image of the target door handle assembly at each detection point; wherein each detection image comprises at least one part to be detected.
The region image obtaining module 320 is configured to match each detected image with a corresponding template image, and obtain a region image corresponding to each detected image according to the matching result.
The part detection module 330 is configured to parse each area image, and determine whether each part to be detected meets a qualification condition according to the parsing result.
According to the technical scheme, the target door handle assembly is moved to each detection point, a detection image is acquired at each detection point, a region image is obtained from each detection image, and each region image is analyzed to determine whether all the parts in it meet the qualification conditions.
Based on the above embodiments, the detection image acquisition module 310 may include:
the detecting point determining unit is used for determining a current detecting point matched with a current time node according to a preset detecting point moving sequence and the current position of the target door handle assembly every time a designated time node is reached;
the door handle assembly moving unit is used for acquiring position coordinates matched with the current detection point and moving the target door handle assembly to the current detection point according to the position coordinates;
a target camera determining unit, configured to determine at least one target camera according to at least one current detection surface matched with the current detection point after determining that the target door handle assembly moves to the current detection point; each camera on the detection table is used for shooting different detection surfaces in the target door handle assembly respectively;
a shooting parameter determining unit for determining shooting parameters corresponding to each target camera according to the part attribute information of the parts contained in each current detection surface;
and the image shooting unit is used for controlling the target light source to sequentially shoot in cooperation with each target camera according to shooting parameters of each target camera, and acquiring shooting images of each target camera as at least one detection image of the target door handle assembly at the current detection point.
On the basis of the above embodiments, the photographing parameter determining unit may include:
an exposure time judging subunit, configured to determine, according to part attribute information of parts included in each current detection surface, part materials of each part included in each current detection surface, and judge whether an exposure time difference value of each part included in each current detection surface is within a preset time range according to exposure time matched with each part material;
and the parameter determination subunit is used for determining the number of the detection images to be shot by each target camera and the corresponding exposure time of each shooting according to the judging result.
On the basis of the above embodiments, the parameter determining subunit may be specifically configured to:
if the exposure time difference value of each part contained in the target detection surface is within a preset time range, determining that the number of detection images to be shot by the target camera corresponding to the target detection surface is a preset number value, and the exposure time corresponding to the target camera corresponding to the target detection surface is an average value of the exposure time of each part contained in the target detection surface;
if the exposure time difference of each part contained in the target detection surface exceeds a preset time range, determining the number of detection images to be shot by the target camera corresponding to the target detection surface and the exposure time of each shot according to the exposure time and the preset time range corresponding to each part in the target detection surface.
On the basis of the above embodiments, the image capturing unit may be specifically configured to:
when the fact that the current detection point is provided with a plurality of matched current detection surfaces is determined, acquiring a shooting sequence of each preset target camera;
determining a current camera in each target camera according to the shooting sequence, controlling a target light source to expose a current detection surface shot by the current camera according to the exposure time corresponding to the current camera, and simultaneously controlling the current camera to shoot;
after the shooting of the current camera is completed, the target light source is turned off, and the previous step is repeatedly executed until each target camera at the current detection point completes shooting.
Based on the above embodiments, the area image acquisition module 320 may be specifically configured to:
determining a target template image matched with the target detection image according to the detection point information and the detection surface information matched with the target detection image;
respectively identifying a target detection image and a positioning part in the target template image, and matching the target detection image with the target template image according to the positioning part;
and acquiring a region image matched with the target template image in the target detection image according to the matching result, and taking the region image as a target region image.
Based on the above embodiments, the part detection module 330 may specifically be used to:
analyzing the target area image, determining identity information of each part in the target area image, and acquiring identification information of each part;
wherein the identification information comprises color information, angle information and presence information;
acquiring standard configuration information of each part in the target area image according to the preloaded detection file and the identity information of each part;
and comparing the standard configuration information of each part with the identification information of each part, and determining whether each part meets the qualification condition according to the comparison result.
The part detection device of the automobile door handle assembly provided by the embodiment of the invention can execute the part detection method of the automobile door handle assembly provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 4 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the invention described and/or claimed herein.
As shown in fig. 4, the electronic device 10 includes at least one processor 11, and a memory, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, in which the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the various methods and processes described above, such as the method of detecting a part of an automobile door handle assembly according to an embodiment of the present invention. Namely:
the target door handle assembly is controlled to sequentially move to each detection point on the detection platform, and at least one detection image of the target door handle assembly at each detection point is obtained; wherein each detection image comprises at least one part to be detected;
matching each detection image with a corresponding template image, and acquiring area images corresponding to each detection image respectively according to a matching result;
analyzing each area image, and determining whether each part to be detected meets the qualification conditions according to the analysis result.
In some embodiments, the method of part detection of a car door handle assembly may be implemented as a computer program tangibly embodied on a computer readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the above-described method of detecting a part of an automobile door handle assembly may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the part detection method of the automobile door handle assembly in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems On Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: being implemented in one or more computer programs, where the one or more computer programs may be executed and/or interpreted on a programmable system including at least one programmable processor; the programmable processor may be a special-purpose or general-purpose programmable processor that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, which is a host product in the cloud computing service system and overcomes the defects of high management difficulty and weak service expansibility in traditional physical hosts and VPS (Virtual Private Server) services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (9)

1. A method of detecting a part of an automobile door handle assembly, comprising:
the target door handle assembly is controlled to sequentially move to each detection point on the detection platform, and at least one detection image of the target door handle assembly at each detection point is obtained; wherein each detection image comprises at least one part to be detected;
matching each detection image with a corresponding template image, and acquiring area images corresponding to each detection image respectively according to a matching result;
analyzing each area image, and determining whether each part to be detected meets the qualification conditions according to the analysis result;
the method for controlling the target door handle assembly to sequentially move to each detection point on the detection platform and acquiring at least one detection image of the target door handle assembly at each detection point comprises the following steps:
determining a current detection point matched with a current time node according to a preset detection point moving sequence and the current position of the target door handle assembly when reaching the specified time node;
acquiring position coordinates matched with the current detection point, and moving the target door handle assembly to the current detection point according to the position coordinates;
after determining that the target door handle assembly moves to the current detection point, determining at least one target camera according to at least one current detection surface matched with the current detection point; each camera on the detection table is used for shooting different detection surfaces in the target door handle assembly respectively;
determining shooting parameters corresponding to each target camera according to the part attribute information of the parts contained in each current detection surface;
according to the shooting parameters of each target camera, the target light source is controlled to sequentially shoot in cooperation with each target camera, and shooting images of each target camera are obtained and used as at least one detection image of the target door handle assembly at the current detection point.
2. The method according to claim 1, wherein determining shooting parameters respectively corresponding to each target camera based on part attribute information of parts contained in each current inspection surface, comprises:
determining the part materials of the parts contained in the current detection surfaces according to the part attribute information of the parts contained in the current detection surfaces, and judging whether the exposure time difference value of the parts contained in the current detection surfaces is within a preset time range according to the exposure time matched with the part materials;
and determining the number of detection images to be shot by each target camera and the corresponding exposure time of each shooting according to the judging result.
3. The method according to claim 2, wherein determining the number of detected images to be photographed by each target camera and the corresponding exposure time for each photographing according to the determination result includes:
if the exposure time difference value of each part contained in the target detection surface is within a preset time range, determining that the number of detection images to be shot by the target camera corresponding to the target detection surface is a preset number value, and the exposure time corresponding to the target camera corresponding to the target detection surface is an average value of the exposure time of each part contained in the target detection surface;
if the exposure time difference of each part contained in the target detection surface exceeds a preset time range, determining the number of detection images to be shot by the target camera corresponding to the target detection surface and the exposure time of each shot according to the exposure time and the preset time range corresponding to each part in the target detection surface.
4. The method according to claim 1, wherein controlling the target light source to sequentially photograph in cooperation with each target camera in accordance with the photographing parameters of each target camera and acquiring photographed images of each target camera as at least one detected image of the target door handle assembly at the current detection point, comprises:
when the fact that the current detection point is provided with a plurality of matched current detection surfaces is determined, acquiring a shooting sequence of each preset target camera;
determining a current camera in each target camera according to the shooting sequence, controlling a target light source to expose a current detection surface shot by the current camera according to the exposure time corresponding to the current camera, and simultaneously controlling the current camera to shoot;
after the shooting of the current camera is completed, the target light source is turned off, and the previous step is repeatedly executed until each target camera at the current detection point completes shooting.
5. The method according to claim 1, wherein matching each detected image with a corresponding template image, and obtaining an area image corresponding to each detected image, respectively, based on the matching result, comprises:
determining a target template image matched with the target detection image according to the detection point information and the detection surface information matched with the target detection image;
respectively identifying a target detection image and a positioning part in the target template image, and matching the target detection image with the target template image according to the positioning part;
and acquiring a region image matched with the target template image in the target detection image according to the matching result, and taking the region image as a target region image.
6. The method of claim 1, wherein analyzing each area image and determining whether each part to be inspected meets a qualification condition based on the analysis result comprises:
analyzing the target area image, determining identity information of each part in the target area image, and acquiring identification information of each part;
wherein the identification information comprises color information, angle information and presence information;
acquiring standard configuration information of each part in the target area image according to the preloaded detection file and the identity information of each part;
and comparing the standard configuration information of each part with the identification information of each part, and determining whether each part meets the qualification condition according to the comparison result.
7. A part detection device of an automobile door handle assembly, comprising:
the detection image acquisition module is used for controlling the target door handle assembly to sequentially move to each detection point on the detection table and acquiring at least one detection image of the target door handle assembly at each detection point; wherein each detection image comprises at least one part to be detected;
the area image acquisition module is used for matching each detection image with the corresponding template image and acquiring area images corresponding to each detection image respectively according to the matching result;
the part detection module is used for analyzing each area image and determining whether each part to be detected meets the qualification condition according to the analysis result;
wherein the detection image acquisition module comprises:
the detection point determining unit is used for determining, each time a designated time node is reached, a current detection point matched with the current time node according to a preset detection point moving sequence and the current position of the target door handle assembly;
the door handle assembly moving unit is used for acquiring position coordinates matched with the current detection point and moving the target door handle assembly to the current detection point according to the position coordinates;
a target camera determining unit, configured to determine at least one target camera according to at least one current detection surface matched with the current detection point after determining that the target door handle assembly moves to the current detection point; each camera on the detection table is used for shooting different detection surfaces in the target door handle assembly respectively;
a shooting parameter determining unit for determining shooting parameters corresponding to each target camera according to the part attribute information of the parts contained in each current detection surface;
and the image shooting unit is used for controlling the target light source to cooperate with each target camera to shoot in sequence according to the shooting parameters of each target camera, and acquiring the images shot by each target camera as at least one detection image of the target door handle assembly at the current detection point.
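For illustration, the three top-level modules of claim 7 might be wired together as below; the class and method names are hypothetical, and only the division of responsibilities mirrors the claim.

```python
# Hedged sketch of the device structure in claim 7: three modules combined
# into one inspection pipeline. Concrete implementations would plug in the
# logic sketched for claims 4-6.

class DetectionImageAcquisitionModule:
    def acquire(self, door_handle_assembly):
        """Move the assembly through the detection points and return the
        detection images captured at each point."""
        raise NotImplementedError


class AreaImageAcquisitionModule:
    def match(self, detection_images, template_images):
        """Match each detection image with its template and return the
        corresponding area images."""
        raise NotImplementedError


class PartDetectionModule:
    def analyze(self, area_images, detection_file):
        """Analyze the area images and report whether each part to be
        detected meets the qualification condition."""
        raise NotImplementedError


class DoorHandlePartDetector:
    """Combines the three modules into one inspection run."""

    def __init__(self, acquisition, matching, detection):
        self.acquisition = acquisition
        self.matching = matching
        self.detection = detection

    def run(self, assembly, template_images, detection_file):
        detection_images = self.acquisition.acquire(assembly)
        area_images = self.matching.match(detection_images, template_images)
        return self.detection.analyze(area_images, detection_file)
```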
8. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the part detection method of the automobile door handle assembly of any one of claims 1-6.
9. A computer readable storage medium storing computer instructions for causing a processor to execute the part detection method of the automobile door handle assembly of any one of claims 1-6.
CN202311211675.6A 2023-09-20 2023-09-20 Method, device, equipment and medium for detecting parts of automobile door handle assembly Active CN116952166B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311211675.6A CN116952166B (en) 2023-09-20 2023-09-20 Method, device, equipment and medium for detecting parts of automobile door handle assembly

Publications (2)

Publication Number Publication Date
CN116952166A true CN116952166A (en) 2023-10-27
CN116952166B CN116952166B (en) 2023-12-08

Family

ID=88462412

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311211675.6A Active CN116952166B (en) 2023-09-20 2023-09-20 Method, device, equipment and medium for detecting parts of automobile door handle assembly

Country Status (1)

Country Link
CN (1) CN116952166B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111879777A (en) * 2020-06-19 2020-11-03 巨轮(广州)智能装备有限公司 Soft material fitting defect detection method, device, equipment and storage medium
CN111950520A (en) * 2020-08-27 2020-11-17 重庆紫光华山智安科技有限公司 Image recognition method and device, electronic equipment and storage medium
US20210400171A1 (en) * 2018-11-09 2021-12-23 Zhejiang Uniview Technologies Co., Ltd. Method and apparatus for automatically detecting and suppressing fringes, electronic device and computer-readable storage medium
CN114241342A (en) * 2021-10-27 2022-03-25 上海艾豚科技有限公司 Method for detecting whether detection point on workpiece is mistakenly installed or not
CN115937101A (en) * 2022-11-15 2023-04-07 苏州凌云光工业智能技术有限公司 Quality detection method, device, equipment and storage medium
CN116559170A (en) * 2022-01-27 2023-08-08 华为技术有限公司 Product quality detection method and related system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YU Baojun; LIU Min: "Research on a visual inspection system for adhesive coating quality of automobile sensor circuit boards" (汽车传感器电路板涂胶质量视觉检测系统研究), Mechanical Engineer (机械工程师), no. 12

Also Published As

Publication number Publication date
CN116952166B (en) 2023-12-08

Similar Documents

Publication Publication Date Title
US11380232B2 (en) Display screen quality detection method, apparatus, electronic device and storage medium
CN115205291B (en) Circuit board detection method, device, equipment and medium
CN112597837A (en) Image detection method, apparatus, device, storage medium and computer program product
CN115471476A (en) Method, device, equipment and medium for detecting component defects
CN116559177A (en) Defect detection method, device, equipment and storage medium
CN116952958B (en) Defect detection method, device, electronic equipment and storage medium
CN116952166B (en) Method, device, equipment and medium for detecting parts of automobile door handle assembly
CN116661477A (en) Substation unmanned aerial vehicle inspection method, device, equipment and storage medium
CN115700758A (en) Sperm activity detection method, device, equipment and storage medium
CN116668843A (en) Shooting state switching method and device, electronic equipment and storage medium
CN117689660B (en) Vacuum cup temperature quality inspection method based on machine vision
CN114581890B (en) Method and device for determining lane line, electronic equipment and storage medium
CN117808848B (en) Identification tracking method and device, electronic equipment and storage medium
CN117115568B (en) Data screening method, device, equipment and storage medium
CN115631249B (en) Camera correction method, device, equipment and storage medium
CN114037865B (en) Image processing method, apparatus, device, storage medium, and program product
CN118014971A (en) Surface defect detection method, device and equipment for photovoltaic module and storage medium
CN117611529A (en) Mobile computer shell detection method, device, equipment and storage medium
CN117876490A (en) Feature determination method and device, electronic equipment and storage medium
CN117745701A (en) Defect detection method and device, electronic equipment and storage medium
CN116309586A (en) Defect detection method, device, equipment and medium based on convolutional neural network
CN116777871A (en) Defect detection method, device, equipment and medium based on X-rays
CN117689660A (en) Vacuum cup temperature quality inspection method based on machine vision
CN117783132A (en) Automatic workpiece detection method and device, electronic equipment and storage medium
CN117275006A (en) Image processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant