CN116124789A - Multi-channel image synthesis method, device, computer equipment and medium - Google Patents

Multi-channel image synthesis method, device, computer equipment and medium

Info

Publication number
CN116124789A
CN116124789A (application number CN202310025059.5A)
Authority
CN
China
Prior art keywords
image
workpiece
images
synthetic
offset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310025059.5A
Other languages
Chinese (zh)
Inventor
Inventor's name withheld at the applicant's request
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Lyric Robot Automation Co Ltd
Original Assignee
Guangdong Lyric Robot Intelligent Automation Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Lyric Robot Intelligent Automation Co Ltd filed Critical Guangdong Lyric Robot Intelligent Automation Co Ltd
Priority to CN202310025059.5A
Publication of CN116124789A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/90 Dynamic range modification of images or parts thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30164 Workpiece; Machine component
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Biochemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Quality & Reliability (AREA)
  • Signal Processing (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The embodiments of the present application provide a multi-channel image synthesis method, device, computer equipment and medium, belonging to the technical field of image synthesis. The method comprises the following steps: illuminating at least a partial region of a workpiece to be inspected each time and acquiring an image of the workpiece to obtain a plurality of workpiece images, the workpiece images including an overall image acquired while the whole workpiece is illuminated; performing weighted offset processing on the plurality of workpiece images to obtain a plurality of offset images and a composite image; calculating an intersection ratio (intersection over union) between the composite image and the overall image to obtain an intersection ratio value; and determining a target composite image according to the intersection ratio value, the composite image and a preset intersection ratio condition. The embodiments of the present application can solve the problem of non-uniform imaging and achieve uniform imaging of defect features.

Description

Multi-channel image synthesis method, device, computer equipment and medium
Technical Field
The present disclosure relates to the field of image synthesis technologies, and in particular, to a method, an apparatus, a computer device, and a medium for synthesizing a multichannel image.
Background
In appearance inspection of a device, photographs of the device usually need to be acquired, and a user determines whether the device has defects or damage by analyzing them. Photograph quality is strongly affected by illumination: an image captured in daylight may be clearly visible while one captured at night is blurred. In addition, the various parts of the device under inspection may be unevenly lit during inspection, so the acquired photographs can suffer from blurring, ghosting and similar artifacts; image details become indistinct, defect features are hard to distinguish, and inspection accuracy and efficiency drop.
Disclosure of Invention
The main objective of the embodiments of the present application is to provide a multi-channel image synthesis method, apparatus, computer device and medium, which can solve the problem of imaging non-uniformity.
To achieve the above object, a first aspect of an embodiment of the present application provides a multi-channel image synthesis method, including:
illuminating at least a partial region of the workpiece to be inspected each time and acquiring an image of the workpiece to be inspected, so as to obtain a plurality of workpiece images, wherein the workpiece images include an overall image acquired while the whole workpiece to be inspected is illuminated;
performing weighted offset processing on the plurality of workpiece images to obtain a plurality of offset images and a composite image;
performing intersection ratio calculation on the composite image and the overall image to obtain an intersection ratio value;
and determining a target composite image according to the intersection ratio value, the composite image and a preset intersection ratio condition.
In some embodiments, the performing weighted offset processing on the plurality of workpiece images to obtain a plurality of offset images and a composite image includes:
generating a weight matrix according to the plurality of workpiece images, wherein the weight matrix includes a plurality of matrix elements and the matrix elements respectively correspond to the workpiece images;
performing weighted offset processing on the workpiece images through the matrix elements in the weight matrix to obtain the plurality of offset images;
and determining a plurality of offset channels according to the offset images, and performing image synthesis on the offset images and the offset channels to obtain the composite image.
In some embodiments, the matrix elements include weight values, and the performing weighted offset processing on the workpiece images through the matrix elements in the weight matrix to obtain the plurality of offset images includes:
acquiring an image gray value of each workpiece image;
and, for each workpiece image, performing weighted offset processing on the image gray values based on a preset offset value and the weight value to obtain the plurality of offset images.
In some embodiments, the performing weighted offset processing on the image gray values based on a preset offset value and the weight value to obtain the plurality of offset images includes:
determining an image weight value corresponding to the workpiece image and an image offset value corresponding to the workpiece image;
performing weighted offset on the image gray value according to the image weight value and the image offset value to obtain a target gray value;
and adjusting the brightness of the workpiece image according to the target gray value to obtain the offset image.
In some embodiments, the performing image synthesis on the offset images and the offset channels to obtain the composite image includes:
acquiring pixel points of the offset images;
sorting the offset images to obtain an offset sequence;
and adding the pixel points and the offset channels according to the offset sequence to obtain the composite image.
In some embodiments, the performing intersection ratio calculation on the composite image and the overall image to obtain the intersection ratio value includes:
performing feature marking on the overall image to obtain defect feature information;
determining a feature width, a feature height and center point coordinates according to the defect feature information;
performing gray-value binarization on the composite image to obtain first defect feature information of the composite image;
performing threshold segmentation on the first defect feature information based on a preset speckle tool to obtain second defect feature information;
determining a composite feature width, a composite feature height and composite center point coordinates according to the second defect feature information;
and performing intersection ratio calculation on the feature width, the feature height, the center point coordinates, the composite feature width, the composite feature height and the composite center point coordinates to obtain the intersection ratio value.
In some embodiments, the determining a target composite image according to the intersection ratio value, the composite image and a preset intersection ratio condition includes:
comparing the intersection ratio value with the preset intersection ratio condition;
determining the target composite image according to the composite image when the intersection ratio value satisfies the intersection ratio condition;
and, when the intersection ratio value does not satisfy the intersection ratio condition, continuing to perform weighted offset processing on the plurality of workpiece images to obtain an iterative intersection ratio value, until the iterative intersection ratio value satisfies the intersection ratio condition.
A second aspect of the embodiments of the present application provides a multi-channel image synthesis apparatus, the apparatus comprising:
an image acquisition module, configured to illuminate at least a partial region of the workpiece to be inspected each time and acquire an image of the workpiece to be inspected, so as to obtain a plurality of workpiece images, wherein the workpiece images include an overall image acquired while the whole workpiece to be inspected is illuminated;
a weighted offset module, configured to perform weighted offset processing on the plurality of workpiece images to obtain a plurality of offset images and a composite image;
an intersection ratio calculation module, configured to perform intersection ratio calculation on the composite image and the overall image to obtain an intersection ratio value;
and a target determination module, configured to determine a target composite image according to the intersection ratio value, the composite image and a preset intersection ratio condition.
A third aspect of the embodiments of the present application provides a computer device comprising a memory and a processor, wherein the memory stores a computer program which, when executed by the processor, causes the processor to perform the multi-channel image synthesis method according to any embodiment of the first aspect of the present application.
A fourth aspect of the embodiments of the present application provides a storage medium, which is a computer-readable storage medium storing a computer program that, when executed by a computer, performs the multi-channel image synthesis method according to any embodiment of the first aspect of the present application.
According to the multi-channel image synthesis method, device, computer equipment and medium of the embodiments of the present application, at least a partial region of the workpiece to be inspected is first illuminated each time while an image of the workpiece is acquired, so that a plurality of workpiece images under different illumination conditions are obtained, the workpiece images including an overall image acquired while the whole workpiece is illuminated. The workpiece images are then subjected to weighted offset processing to obtain a plurality of offset images, which improves the uniformity of image brightness, and a composite image is obtained from the offset images while avoiding data overflow. Finally, intersection ratio calculation is performed on the composite image and the overall image to obtain an intersection ratio value, which improves the accuracy of image synthesis, and a target composite image is determined according to the intersection ratio value, the composite image and a preset intersection ratio condition. The problem of non-uniform imaging can thus be solved, uniform imaging of defect features is achieved, and the accuracy and efficiency of inspecting the workpiece are improved.
Drawings
FIG. 1 is a flowchart of a multi-channel image synthesis method provided in an embodiment of the present application;
FIG. 2 is a detailed flowchart of step S102 in FIG. 1;
FIG. 3 is a detailed flowchart of step S202 in FIG. 2;
FIG. 4 is a detailed flowchart of step S302 in FIG. 3;
FIG. 5 is a detailed flowchart of step S203 in FIG. 2;
FIG. 6 is a detailed flowchart of step S103 in FIG. 1;
FIG. 7 is a detailed flowchart of step S104 in FIG. 1;
FIG. 8 is another detailed flowchart of step S104 in FIG. 1;
FIG. 9 is a schematic structural diagram of a multi-channel image synthesis apparatus according to an embodiment of the present application;
FIG. 10 is a schematic diagram of the hardware structure of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
It should be noted that although functional block division is performed in a device diagram and a logic sequence is shown in a flowchart, in some cases, the steps shown or described may be performed in a different order than the block division in the device, or in the flowchart. The terms first, second and the like in the description and in the claims and in the above-described figures, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the present application.
The multi-channel image synthesis method provided by the embodiments of the present application can be applied to a terminal, a server, or software running on a terminal or server. In some embodiments, the terminal may be a smartphone, tablet, notebook computer, desktop computer, smart watch or the like; the server may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain-name services, security services, content delivery networks (Content Delivery Network, CDN), big data and artificial-intelligence platforms; the software may be an application implementing the above method, but is not limited to the above forms.
Embodiments of the present application may be used in a variety of general-purpose or special-purpose computer system environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
Referring to FIG. 1, FIG. 1 is a flowchart of a multi-channel image synthesis method provided in an embodiment of the present application. In some embodiments, the multi-channel image synthesis method includes, but is not limited to, steps S101 to S104.
Step S101: illuminating at least a partial region of the workpiece to be inspected each time and acquiring an image of the workpiece to be inspected to obtain a plurality of workpiece images;
the workpiece images include an overall image acquired while the whole workpiece to be inspected is illuminated.
In some embodiments, a light source emitter is arranged above the workpiece to be inspected and is switched on or off by a light source controller. During each illumination of the workpiece, the rear region, the left region, the front region, the right region and the overall lighting of the light source emitter are controlled in turn, so that at least a partial region of the workpiece is illuminated each time, and images of the workpiece are acquired under the different regional illuminations to obtain a plurality of workpiece images for subsequent image synthesis.
It can be understood that, during image acquisition, the rear-quadrant light source emitter may be switched on first and the camera triggered to obtain a first image; the left-quadrant, front-quadrant, right-quadrant and overall light source emitters are then switched on in turn to obtain a second image, a third image, a fourth image and the overall image. The first, second, third and fourth images and the overall image together form the plurality of workpiece images.
It should be noted that the switching order and the illuminated regions of the light source emitters can be adjusted according to the user's needs; this embodiment is not specifically limited in this respect. A minimal sketch of such an acquisition loop is given below.
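The following Python sketch illustrates one way such a capture loop could be driven. The `LightController` and `Camera` objects, their method names, and the channel labels are hypothetical stand-ins rather than anything specified by the patent; the point is simply that each region is lit on its own, one frame is grabbed per lighting state, and the all-on state yields the overall image.

```python
# Hypothetical acquisition loop; the controller/camera API is assumed, not specified by the patent.
CHANNELS = ["rear", "left", "front", "right", "all"]  # "all" produces the overall image

def acquire_workpiece_images(light_controller, camera):
    """Light one region at a time, trigger one exposure per region, return the image stack."""
    images = {}
    for channel in CHANNELS:
        light_controller.enable(channel)   # switch on only this group of emitters
        images[channel] = camera.grab()    # capture the workpiece under this illumination
        light_controller.disable(channel)  # restore the dark state before the next channel
    return images
```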
Step S102: performing weighted offset processing on the plurality of workpiece images to obtain a plurality of offset images and a composite image.
In some embodiments, each workpiece image is subjected to weighted offset processing to obtain a plurality of offset images, so that the brightness of the offset images is uniform and the images are neither too bright nor too dark; a composite image is then determined from the offset images, which facilitates the subsequent comparison of the composite image.
Step S103: performing intersection ratio calculation on the composite image and the overall image to obtain an intersection ratio value.
In some embodiments, the intersection ratio (intersection over union) of the composite image and the overall image is calculated to obtain the intersection ratio value, which improves the accuracy of the composite image and alleviates the problem of non-uniform defect imaging.
Step S104: determining a target composite image according to the intersection ratio value, the composite image and a preset intersection ratio condition.
In some embodiments, the intersection ratio value is compared with the preset intersection ratio condition, so that it can be judged whether the current composite image meets the requirement of the intersection ratio condition, and the target composite image is determined according to the judgment result, thereby achieving uniform imaging.
It should be noted that, after the target composite image is obtained, the workpiece to be inspected may be adjusted or modified according to the target composite image, so as to improve inspection efficiency.
In steps S101 to S104 of some embodiments, at least a partial region of the workpiece to be inspected is first illuminated each time while an image of the workpiece is acquired, so that a plurality of workpiece images under different illumination conditions are obtained, including an overall image acquired while the whole workpiece is illuminated. The workpiece images are then subjected to weighted offset processing to obtain a plurality of offset images, improving the uniformity of image brightness, and a composite image is obtained from the offset images while avoiding data overflow. Finally, intersection ratio calculation is performed on the composite image and the overall image to obtain an intersection ratio value, improving the accuracy of image synthesis, and a target composite image is determined according to the intersection ratio value, the composite image and the preset intersection ratio condition. Non-uniform imaging is thus corrected, uniform imaging of defect features is achieved, and the accuracy and efficiency of inspecting the workpiece are improved.
Referring to FIG. 2, FIG. 2 is a detailed flowchart of step S102 provided in an embodiment of the present application. In some embodiments, step S102 includes, but is not limited to, steps S201 to S203.
Step S201: generating a weight matrix according to the plurality of workpiece images;
the weight matrix includes a plurality of matrix elements, and the matrix elements respectively correspond to the workpiece images.
Step S202: performing weighted offset processing on the workpiece images through the matrix elements in the weight matrix to obtain a plurality of offset images.
Step S203: determining a plurality of offset channels according to the offset images, and performing image synthesis on the offset images and the offset channels to obtain a composite image.
In steps S201 to S203 of some embodiments, a weight matrix is randomly generated according to the plurality of workpiece images; the number of matrix elements in the weight matrix corresponds to the number of workpiece images, and the matrix elements sum to 1. Each pixel of the corresponding workpiece image is then subjected to weighted offset processing through its matrix element to obtain a plurality of offset images, which improves the uniformity of image brightness. An offset channel corresponding to each offset image is determined from the offset images, and the offset images and offset channels are combined into a completely new composite image, thereby realizing image synthesis.
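As a concrete illustration of the weight matrix described above, the sketch below draws one random weight per workpiece image and normalizes the weights so that they sum to 1. The use of NumPy and the uniform random draw are assumptions for illustration; the patent only requires that the matrix elements correspond one-to-one to the workpiece images and add up to 1.

```python
import numpy as np

def generate_weight_matrix(num_images, rng=None):
    """Return one weight (matrix element) per workpiece image; the weights sum to 1."""
    rng = rng or np.random.default_rng()
    weights = rng.random(num_images)   # random positive draws, one per image
    return weights / weights.sum()     # normalize so the elements add up to 1

weights = generate_weight_matrix(5)    # e.g. rear, left, front, right, overall
assert abs(weights.sum() - 1.0) < 1e-9
```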
Referring to FIG. 3, FIG. 3 is a detailed flowchart of step S202 provided in an embodiment of the present application. In some embodiments, step S202 includes, but is not limited to, steps S301 and S302.
It should be noted that the matrix elements include weight values.
Step S301: acquiring an image gray value of each workpiece image;
Step S302: for each workpiece image, performing weighted offset processing on the image gray values based on a preset offset value and the weight value to obtain a plurality of offset images.
In steps S301 to S302 of some embodiments, the original image gray values of each workpiece image are first obtained, and the gray values of each workpiece image are then subjected to weighted offset processing based on the preset offset value and the weight value to obtain a plurality of offset images. In this way the gray values of bright workpiece images are lowered and those of dark regions are raised, which improves the overall uniformity of image brightness.
Referring to FIG. 4, FIG. 4 is a detailed flowchart of step S302 provided in an embodiment of the present application. In some embodiments, step S302 includes, but is not limited to, steps S401 to S403.
Step S401: determining an image weight value corresponding to the workpiece image and an image offset value corresponding to the workpiece image;
Step S402: performing weighted offset on the image gray value according to the image weight value and the image offset value to obtain a target gray value;
Step S403: adjusting the brightness of the workpiece image according to the target gray value to obtain the offset image.
In steps S401 to S403 of some embodiments, the image weight value corresponding to the workpiece image is determined from the matrix elements, and the image offset value corresponding to the workpiece image is determined from the preset offset values. The image gray value is then weighted and offset according to the image weight value and the image offset value to obtain a target gray value, and the brightness of the workpiece image is finally adjusted according to the target gray value, lowering the gray values of bright regions and raising those of dark regions. Performing this weighted offset processing on every workpiece image yields a plurality of offset images with uniform brightness.
It should be noted that the weighted offset calculation is given by the following formula (1):
y = a1 * x + b1 (1)
where a1 denotes the image weight value, b1 the image offset value, x the original image gray value, and y the target gray value after the weighted offset.
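A minimal sketch of formula (1) applied to one channel image is given below, using OpenCV's convertScaleAbs, which computes the same affine mapping with saturation to the 8-bit range. The particular values a1 = 0.6 and b1 = 15 are illustrative only.

```python
import cv2
import numpy as np

a1, b1 = 0.6, 15.0                                       # illustrative weight and offset values
workpiece_image = np.full((4, 4), 200, dtype=np.uint8)   # an over-bright patch with gray value 200

# convertScaleAbs evaluates saturate(|a1 * x + b1|), i.e. formula (1) clipped to 0..255.
offset_image = cv2.convertScaleAbs(workpiece_image, alpha=a1, beta=b1)
print(int(offset_image[0, 0]))  # 0.6 * 200 + 15 = 135 -> the bright region is toned down
```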
Referring to FIG. 5, FIG. 5 is a detailed flowchart of step S203 provided in an embodiment of the present application. In some embodiments, step S203 includes, but is not limited to, steps S501 to S503.
Step S501: acquiring pixel points of the offset images;
Step S502: sorting the offset images to obtain an offset sequence;
Step S503: adding the pixel points and the offset channels according to the offset sequence to obtain a composite image.
In steps S501 to S503 of some embodiments, each pixel of the offset images is first acquired during image synthesis, and the offset images are then sorted to obtain an offset sequence, which prevents the image order from becoming scrambled during synthesis and the composited regions from being mixed up. Finally, the pixels and the offset channel corresponding to each offset image are added together according to the offset sequence to obtain the composite image.
It should be noted that the coefficient applied to each pixel of the offset images does not exceed 1, which ensures that the overall pixel gray value does not exceed 255 and thus avoids data overflow.
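The accumulation step can be sketched as below: the offset images are summed in their fixed order and the result is clipped to the 8-bit range as an extra overflow guard. The float accumulator and the explicit clip are implementation choices for the sketch, not requirements stated in the patent.

```python
import numpy as np

def compose(offset_images):
    """Sum the ordered offset images channel by channel into one composite image."""
    acc = np.zeros_like(offset_images[0], dtype=np.float32)
    for img in offset_images:            # iterate in the offset sequence fixed beforehand
        acc += img.astype(np.float32)    # per-image coefficients <= 1 keep the sum bounded
    return np.clip(acc, 0, 255).astype(np.uint8)  # guarantee the gray values never exceed 255
```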
Referring to FIG. 6, FIG. 6 is a detailed flowchart of step S103 provided in an embodiment of the present application. In some embodiments, step S103 includes, but is not limited to, steps S601 to S606.
Step S601: performing feature marking on the overall image to obtain defect feature information;
In some embodiments, an obvious feature is delineated in the overall image and marked, yielding defect feature information that is convenient for subsequent calculation.
Step S602: determining a feature width, a feature height and center point coordinates according to the defect feature information;
In some embodiments, the feature width, feature height and center point coordinates of the original defect feature are first determined from the defect feature information of the overall image and denoted w0, h0 and (x0, y0).
Step S603: performing gray-value binarization on the composite image to obtain first defect feature information of the composite image;
In some embodiments, gray-value binarization is performed on the composite image obtained in step S503 to highlight the defect features in the composite image and obtain the first defect feature information, which improves the efficiency of the defect feature search.
Step S604: performing threshold segmentation on the first defect feature information based on a preset speckle tool to obtain second defect feature information;
In some embodiments, threshold segmentation is performed on the first defect feature information based on the preset speckle tool to obtain the second defect feature information; during the threshold segmentation the areas of the regions in the composite image are taken into account, and the segmented results are filtered accordingly to obtain the second defect feature information.
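A minimal OpenCV sketch of steps S603 and S604 follows, standing in for the preset speckle tool: the composite image is binarized, connected regions are extracted, and regions are kept or discarded by area. The threshold value and the area window are illustrative parameters, not values given in the patent.

```python
import cv2

def extract_defect_blobs(composite, gray_thresh=128, min_area=50, max_area=50000):
    """Binarize the single-channel composite image and keep connected regions
    whose area falls inside the illustrative [min_area, max_area] window."""
    _, binary = cv2.threshold(composite, gray_thresh, 255, cv2.THRESH_BINARY)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary, connectivity=8)
    blobs = []
    for i in range(1, n):                 # label 0 is the background
        x, y, w, h, area = stats[i]       # bounding box and pixel count of this region
        if min_area <= area <= max_area:  # area-based filtering of the segmentation result
            blobs.append({"w": w, "h": h,
                          "cx": float(centroids[i][0]), "cy": float(centroids[i][1])})
    return blobs
```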
Step S605: determining a composite feature width, a composite feature height and composite center point coordinates according to the second defect feature information;
In some embodiments, the composite feature width w1, the composite feature height h1 and the composite center point coordinates (x1, y1) are determined from the second defect feature information.
Step S606: performing intersection ratio calculation on the feature width, the feature height, the center point coordinates, the composite feature width, the composite feature height and the composite center point coordinates to obtain the intersection ratio value.
In some embodiments, the intersection ratio of the second defect feature information and the original defect feature information is calculated from the widths, heights and center coordinates, so as to judge whether the composite image meets the required standard and to achieve uniform imaging of the defect features.
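The patent lists the six quantities that enter the calculation but not the exact formula; the sketch below shows one common way to turn them into an intersection-over-union value, treating each feature as an axis-aligned box given by its width, height and center point.

```python
def iou_from_boxes(w0, h0, c0, w1, h1, c1):
    """Intersection over union of two axis-aligned boxes given as (width, height, center)."""
    (x0, y0), (x1, y1) = c0, c1
    ax1, ay1, ax2, ay2 = x0 - w0 / 2, y0 - h0 / 2, x0 + w0 / 2, y0 + h0 / 2
    bx1, by1, bx2, by2 = x1 - w1 / 2, y1 - h1 / 2, x1 + w1 / 2, y1 + h1 / 2
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))   # width of the overlap, if any
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))   # height of the overlap, if any
    inter = iw * ih
    union = w0 * h0 + w1 * h1 - inter
    return inter / union if union > 0 else 0.0
```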
Referring to FIG. 7, FIG. 7 is a detailed flowchart of step S104 provided in an embodiment of the present application. In some embodiments, step S104 includes, but is not limited to, steps S701 and S702.
Step S701: comparing the intersection ratio value with the preset intersection ratio condition;
Step S702: determining the target composite image according to the composite image when the intersection ratio value satisfies the intersection ratio condition.
In steps S701 to S702 of some embodiments, the intersection ratio value is compared with the preset intersection ratio condition; when the intersection ratio value satisfies the condition, the composite image has reached the specified standard and can be used directly as the target composite image.
It should be noted that the intersection ratio condition may be set according to the user's needs, and is not specifically limited in this embodiment.
Referring to FIG. 8, FIG. 8 is another detailed flowchart of step S104 according to an embodiment of the present application. In some embodiments, step S104 includes, but is not limited to, step S703.
Step S703: when the intersection ratio value does not satisfy the intersection ratio condition, continuing to perform weighted offset processing on the plurality of workpiece images to obtain an iterative intersection ratio value, until the iterative intersection ratio value satisfies the intersection ratio condition.
In some embodiments, if the intersection ratio value does not satisfy the preset intersection ratio condition during the comparison, the composite image does not yet meet the specified standard, and steps S102 to S103 are repeated: weighted offset processing continues to be performed on the plurality of workpiece images and the weights are iteratively optimized until the iterative intersection ratio value satisfies the intersection ratio condition, at which point the iterative composite image corresponding to that value is taken as the target composite image.
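Pulling the earlier sketches together, the loop below shows one simple way to realize this iteration by resampling the weight matrix until the intersection ratio condition holds. It reuses the hypothetical helpers from the previous snippets; the random-resampling strategy, the 0.8 threshold and the iteration cap are illustrative choices rather than the optimization scheme prescribed by the patent.

```python
import cv2

def search_target_composite(workpiece_images, reference_box, offsets,
                            iou_threshold=0.8, max_iters=100):
    """Re-weight, re-synthesize and re-score until the intersection ratio condition is met.
    reference_box = (w0, h0, (x0, y0)) comes from the feature marked on the overall image;
    offsets holds the preset per-image offset values."""
    w0, h0, c0 = reference_box
    for _ in range(max_iters):
        weights = generate_weight_matrix(len(workpiece_images))              # step S201
        offset_imgs = [cv2.convertScaleAbs(img, alpha=w, beta=b)             # steps S301-S302
                       for img, w, b in zip(workpiece_images, weights, offsets)]
        composite = compose(offset_imgs)                                     # steps S501-S503
        for blob in extract_defect_blobs(composite):                         # steps S603-S605
            iou = iou_from_boxes(w0, h0, c0,
                                 blob["w"], blob["h"], (blob["cx"], blob["cy"]))  # step S606
            if iou >= iou_threshold:                                         # steps S701-S703
                return composite
    return None  # no composite met the condition within the illustrative budget
```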
Referring to FIG. 9, an embodiment of the present application further provides a multi-channel image synthesis apparatus that can implement the above multi-channel image synthesis method. The apparatus includes:
The image acquisition module 801 is configured to illuminate at least a partial region of the workpiece to be inspected each time and acquire an image of the workpiece to be inspected, so as to obtain a plurality of workpiece images, wherein the workpiece images include an overall image acquired while the whole workpiece to be inspected is illuminated.
The weighted offset module 802 is configured to perform weighted offset processing on the plurality of workpiece images to obtain a plurality of offset images and a composite image.
The intersection ratio calculation module 803 is configured to perform intersection ratio calculation on the composite image and the overall image to obtain an intersection ratio value.
The target determination module 804 is configured to determine a target composite image according to the intersection ratio value, the composite image and a preset intersection ratio condition.
The specific processing procedure of the multi-channel image synthesis apparatus of this embodiment is the same as that of the multi-channel image synthesis method described above and is not repeated here.
The embodiments of the present application also provide a computer device comprising a memory and a processor, wherein the memory stores a computer program which, when executed by the processor, causes the processor to perform the multi-channel image synthesis method of the embodiments of the present application.
Referring to FIG. 10, FIG. 10 is a schematic diagram of the hardware structure of a computer device provided in an embodiment of the present application.
The hardware structure of the computer device is described in detail below with reference to FIG. 10. The computer device includes: a processor 910, a memory 920, an input/output interface 930, a communication interface 940 and a bus 950.
The processor 910 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an application-specific integrated circuit (Application-Specific Integrated Circuit, ASIC), or one or more integrated circuits, and is configured to execute related programs to implement the technical solutions provided by the embodiments of the present application.
The memory 920 may be implemented in the form of a read-only memory (Read-Only Memory, ROM), a static storage device, a dynamic storage device or a random access memory (Random Access Memory, RAM). The memory 920 may store an operating system and other application programs; when the technical solutions provided by the embodiments of the present application are implemented in software or firmware, the relevant program code is stored in the memory 920 and invoked by the processor 910 to perform the multi-channel image synthesis method of the embodiments of the present application.
The input/output interface 930 is used for inputting and outputting information.
The communication interface 940 is used for communication interaction between this device and other devices, either in a wired manner (e.g., USB or network cable) or in a wireless manner (e.g., mobile network, Wi-Fi or Bluetooth); and the bus 950 transfers information between the components of the device (e.g., the processor 910, the memory 920, the input/output interface 930 and the communication interface 940).
wherein processor 910, memory 920, input/output interface 930, and communication interface 940 implement communication connections among each other within the device via a bus 950.
The present application also provides a storage medium, which is a computer-readable storage medium storing a computer program, where the computer program is configured to perform the multi-channel image synthesis method according to the above embodiments of the present application when the computer program is executed by a computer.
The memory, as a non-transitory computer readable storage medium, may be used to store non-transitory software programs as well as non-transitory computer executable programs. In addition, the memory may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory optionally includes memory remotely located relative to the processor, the remote memory being connectable to the processor through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The embodiments described in the embodiments of the present application are for more clearly describing the technical solutions of the embodiments of the present application, and do not constitute a limitation on the technical solutions provided by the embodiments of the present application, and as those skilled in the art can know that, with the evolution of technology and the appearance of new application scenarios, the technical solutions provided by the embodiments of the present application are equally applicable to similar technical problems.
It will be appreciated by those skilled in the art that the solutions shown in fig. 1-8 are not limiting to embodiments of the present application, and may include more or fewer steps than illustrated, or may combine certain steps, or different steps.
The above described apparatus embodiments are merely illustrative, wherein the units illustrated as separate components may or may not be physically separate, i.e. may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
Those of ordinary skill in the art will appreciate that all or some of the steps of the methods, systems, functional modules/units in the devices disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof.
The terms "first," "second," "third," "fourth," and the like in the description of the present application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in this application, "at least one" means one or more, and "a plurality" means two or more. "and/or" for describing the association relationship of the association object, the representation may have three relationships, for example, "a and/or B" may represent: only a, only B and both a and B are present, wherein a, B may be singular or plural. The character "/" generally indicates that the context-dependent object is an "or" relationship. "at least one of" or the like means any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one (one) of a, b or c may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of elements is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium and including multiple instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing a program, such as a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk or an optical disk.
Preferred embodiments of the present application are described above with reference to the accompanying drawings, and thus do not limit the scope of the claims of the embodiments of the present application. Any modifications, equivalent substitutions and improvements made by those skilled in the art without departing from the scope and spirit of the embodiments of the present application shall fall within the scope of the claims of the embodiments of the present application.

Claims (10)

1. A multi-channel image synthesis method, the method comprising:
illuminating at least a partial region of a workpiece to be inspected each time and acquiring an image of the workpiece to be inspected, so as to obtain a plurality of workpiece images, wherein the workpiece images include an overall image acquired while the whole workpiece to be inspected is illuminated;
performing weighted offset processing on the plurality of workpiece images to obtain a plurality of offset images and a composite image;
performing intersection ratio calculation on the composite image and the overall image to obtain an intersection ratio value;
and determining a target composite image according to the intersection ratio value, the composite image and a preset intersection ratio condition.
2. The multi-channel image synthesis method according to claim 1, wherein the performing weighted offset processing on the plurality of workpiece images to obtain a plurality of offset images and a composite image comprises:
generating a weight matrix according to the plurality of workpiece images, wherein the weight matrix comprises a plurality of matrix elements and the matrix elements respectively correspond to the workpiece images;
performing weighted offset processing on the workpiece images through the matrix elements in the weight matrix to obtain the plurality of offset images;
and determining a plurality of offset channels according to the offset images, and performing image synthesis on the offset images and the offset channels to obtain the composite image.
3. The multi-channel image synthesis method according to claim 2, wherein the matrix elements comprise weight values, and the performing weighted offset processing on the workpiece images through the matrix elements in the weight matrix to obtain the plurality of offset images comprises:
acquiring an image gray value of each workpiece image;
and, for each workpiece image, performing weighted offset processing on the image gray values based on a preset offset value and the weight value to obtain the plurality of offset images.
4. The multi-channel image synthesis method according to claim 3, wherein the performing weighted offset processing on the image gray values based on a preset offset value and the weight value to obtain the plurality of offset images comprises:
determining an image weight value corresponding to the workpiece image and an image offset value corresponding to the workpiece image;
performing weighted offset on the image gray value according to the image weight value and the image offset value to obtain a target gray value;
and adjusting the brightness of the workpiece image according to the target gray value to obtain the offset image.
5. The multi-channel image synthesis method according to claim 2, wherein the performing image synthesis on the offset images and the offset channels to obtain the composite image comprises:
acquiring pixel points of the offset images;
sorting the offset images to obtain an offset sequence;
and adding the pixel points and the offset channels according to the offset sequence to obtain the composite image.
6. The multi-channel image synthesis method according to claim 1, wherein the performing intersection ratio calculation on the composite image and the overall image to obtain the intersection ratio value comprises:
performing feature marking on the overall image to obtain defect feature information;
determining a feature width, a feature height and center point coordinates according to the defect feature information;
performing gray-value binarization on the composite image to obtain first defect feature information of the composite image;
performing threshold segmentation on the first defect feature information based on a preset speckle tool to obtain second defect feature information;
determining a composite feature width, a composite feature height and composite center point coordinates according to the second defect feature information;
and performing intersection ratio calculation on the feature width, the feature height, the center point coordinates, the composite feature width, the composite feature height and the composite center point coordinates to obtain the intersection ratio value.
7. The multi-channel image synthesis method according to claim 1, wherein the determining a target composite image according to the intersection ratio value, the composite image and a preset intersection ratio condition comprises:
comparing the intersection ratio value with the preset intersection ratio condition;
determining the target composite image according to the composite image when the intersection ratio value satisfies the intersection ratio condition;
and, when the intersection ratio value does not satisfy the intersection ratio condition, continuing to perform weighted offset processing on the plurality of workpiece images to obtain an iterative intersection ratio value, until the iterative intersection ratio value satisfies the intersection ratio condition.
8. A multi-channel image synthesis apparatus, the apparatus comprising:
an image acquisition module, configured to illuminate at least a partial region of a workpiece to be inspected each time and acquire an image of the workpiece to be inspected, so as to obtain a plurality of workpiece images, wherein the workpiece images include an overall image acquired while the whole workpiece to be inspected is illuminated;
a weighted offset module, configured to perform weighted offset processing on the plurality of workpiece images to obtain a plurality of offset images and a composite image;
an intersection ratio calculation module, configured to perform intersection ratio calculation on the composite image and the overall image to obtain an intersection ratio value;
and a target determination module, configured to determine a target composite image according to the intersection ratio value, the composite image and a preset intersection ratio condition.
9. A computer device comprising a memory and a processor, wherein the memory has stored therein a computer program which, when executed by the processor, is adapted to carry out the multi-channel image synthesis method of any of claims 1 to 7.
10. A storage medium, characterized in that the storage medium is a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for performing the multi-channel image synthesis method according to any one of claims 1 to 7 when the computer program is executed by a computer.
CN202310025059.5A 2023-01-09 2023-01-09 Multi-channel image synthesis method, device, computer equipment and medium Pending CN116124789A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310025059.5A CN116124789A (en) 2023-01-09 2023-01-09 Multi-channel image synthesis method, device, computer equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310025059.5A CN116124789A (en) 2023-01-09 2023-01-09 Multi-channel image synthesis method, device, computer equipment and medium

Publications (1)

Publication Number Publication Date
CN116124789A true CN116124789A (en) 2023-05-16

Family

ID=86296815

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310025059.5A Pending CN116124789A (en) 2023-01-09 2023-01-09 Multi-channel image synthesis method, device, computer equipment and medium

Country Status (1)

Country Link
CN (1) CN116124789A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination