CN113240614B - High-dynamic image fusion method suitable for K-TIG welding ultra-strong arc light scene

Info

Publication number
CN113240614B
Authority
CN
China
Prior art keywords: image, exposure time, pixel, sequence, pixel value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110376562.6A
Other languages
Chinese (zh)
Other versions
CN113240614A (en)
Inventor
石永华
王子顺
陈熙引
王劲一
Current Assignee
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN202110376562.6A priority Critical patent/CN113240614B/en
Publication of CN113240614A publication Critical patent/CN113240614A/en
Application granted granted Critical
Publication of CN113240614B publication Critical patent/CN113240614B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/20 - Image enhancement or restoration by the use of local operators
    • G06T5/30 - Erosion or dilatation, e.g. thinning
    • G06T5/70
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20212 - Image combination
    • G06T2207/20221 - Image fusion; Image merging
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30108 - Industrial image inspection
    • G06T2207/30152 - Solder

Abstract

The invention discloses a high-dynamic image fusion method suitable for the ultra-strong arc light scene of K-TIG welding, which fuses a low-dynamic-range (LDR) image sequence of the keyhole TIG welding state collected by an ordinary industrial camera to generate a high-dynamic-range (HDR) image that captures information on the molten pool, keyhole, arc and weld seam simultaneously. The method comprises the following steps: obtain a set of proper exposure times using an automatic exposure method and input it into the industrial camera to acquire an LDR image sequence {A}; compute, by an overall-exposure-ratio method, the sequence {B} of overall exposure ratios of each larger-exposure-time image in {A} relative to the lowest-exposure-time image; using an HDR fusion method, recover the relative brightness value of each image in {A} from the overall exposure ratio sequence {B} and obtain the welding-state HDR image C by weighted averaging; compress image C to an 8-bit image using a tone remapping method.

Description

High-dynamic image fusion method suitable for K-TIG welding ultra-strong arc light scene
Technical Field
The invention relates to the technical field of welding monitoring and image processing, in particular to a high dynamic image fusion method suitable for a K-TIG welding ultra-strong arc light scene.
Background
Keyhole deep-penetration TIG welding (K-TIG welding) is a novel, efficient welding method that uses the keyhole effect to achieve large penetration depth during welding: it can achieve single-pass penetration without beveling and single-side welding with double-side formation, produces well-formed welds on both sides with high welding quality, and has great application prospects for medium-thickness plates. The main defects of K-TIG welding are lack of penetration and excessive penetration (burn-through). The process is stable and spatter-free, but the arc is extremely bright during welding while the molten pool and base metal are comparatively dim, which prevents the efficient use of ordinary narrow-dynamic-range industrial cameras for K-TIG welding monitoring. Here, dynamic range refers to the ability of an image or camera to retain scene luminance information. In addition, seam-tracking technology, which keeps the welding torch on the weld centerline, needs to acquire information on the keyhole entrance and the base-metal weld seam at the same time. Although an ordinary narrow-dynamic-range industrial camera cannot capture the arc, keyhole, molten pool and weld seam simultaneously, it is cheap, small and easy to install, costing less than one tenth of the high-dynamic-range industrial cameras on the market.
The invention patent CN108668093A discloses a method and an apparatus for generating an HDR image, which mainly compares a low-exposure image and a high-exposure image with a medium-exposure image to obtain a motion region and derives more optimal weight values to fuse the HDR image. However, the method is mainly intended for rigid-body motion scenes and is not suitable for the fluid motion of K-TIG welding.
The invention patent CN108629739B discloses a method and a device for generating an HDR image, and a mobile terminal. The image is mainly fused using a Laplacian pyramid, with the weight coefficients adjusted according to the result. That invention does not optimize the sequence to be fused and cannot properly fuse the ultra-wide brightness range of K-TIG welding, thereby producing large ghosting defects.
The invention patent CN106162131A discloses a real-time image processing method which uses a Field Programmable Gate Array (FPGA) to implement an optimized HDR technique. It is faster, but requires building a camera anew, and it simply multiplies the exposure time by a factor of 2 to obtain the low-dynamic-range image sequence, which performs poorly in the rapidly changing scene of K-TIG welding.
To overcome the interference of strong arc light in the K-TIG welding process, realize real-time monitoring of K-TIG welding, and make up for the narrow dynamic range of current ordinary industrial cameras, which cannot directly acquire information on the arc, molten pool and base metal, it is necessary to design a method and a device that give an ordinary industrial camera improved HDR fused images.
Disclosure of Invention
The invention aims to solve the problem that an arc, a molten pool, a lock hole and a welding seam of K-TIG welding cannot be observed simultaneously due to low dynamic range of the existing low-cost industrial camera, and provides a high dynamic image fusion method and a high dynamic image fusion device suitable for a superstrong arc scene of K-TIG welding.
The purpose of the invention can be achieved by adopting the following technical scheme:
A high dynamic image fusion method suitable for a K-TIG welding ultra-strong arc light scene comprises the following steps:
S1, shooting welding scene images of K-TIG welding with an industrial camera, searching for the proper exposure time of the first frame using a random search method, and generating the proper exposure time of each subsequent frame by histogram inference until a set of 3 exposure times is generated; the exposure time set is input into the industrial camera to acquire a 3-image low-dynamic-range sequence {A};
S2, taking the lowest-exposure-time image in the obtained image sequence {A} as the reference, obtaining the overall exposure ratio sequence {B} of the higher-exposure-time images in {A} relative to the lowest-exposure-time image by means of masks;
S3, restoring the relative brightness values of the pixels of the image sequence {A} using the overall exposure ratio sequence {B}, and obtaining a high-dynamic-range image C by computing the weighted average of the relative brightness values at each pixel position across the image sequence {A};
and S4, carrying out tone remapping on the obtained high-dynamic-range image C, applying a gamma transformation to obtain an 8-bit image D that can be shown on a display.
Further, the process of generating the exposure time set in step S1 is as follows:
s11, if the exposure time set is not empty, judging whether the exposure time set needs to be regenerated or not by detecting the exposure process, and if not, acquiring an image sequence { A } directly from the obtained exposure time set; if the exposure time set is empty or needs to be regenerated after judgment, executing the next step S12;
s12, generating a first frame of proper exposure time through random search, and collecting an image of the first frame of proper exposure time;
and S13, obtaining images with proper exposure time of the second frame and the third frame through histogram reasoning to form an image sequence { A }.
Further, the process of determining, by detecting the exposure process, whether the exposure time set needs to be regenerated in step S11 is as follows:
Acquire the image at the first exposure time in the exposure time set, count the number of pixels whose value lies between 240 and 255, and divide by the total pixel count (i.e., the image resolution) to obtain the pixel ratio r1. Subtract the pixel ratio r recorded for the first frame when images were first acquired after the exposure time set was generated or regenerated, and take the absolute value r2. If r2 is greater than 0.2%, the exposure time set needs to be regenerated; otherwise, the exposure time set continues to be used to obtain the image sequence {A}.
Further, the process of generating the proper exposure time of the first frame by random search in step S12 is as follows:
S121, collecting a welding image with an exposure time of 100 microseconds through the industrial camera;
S122, accumulating, downward from pixel value 255, the ratio of the pixel count at each value to the total pixel count until the sum reaches 0.1% of the total; the pixel value reached is denoted p1;
S123, comparing p1 with 100 and 250: if p1 is lower than 100, jump to step S124; if it is higher than 250, jump to step S125; if p1 lies between 100 and 250, apply the formula:
t′=240t/p1
where t′ is the proper exposure time of the first frame and t is the exposure time of the current image, and jump to step S126;
S124, judging whether the exposure time t was retracted in the previous cycle: if so, add half of the previous retraction to the current exposure time, otherwise multiply the exposure time by 2 to obtain the exposure time t of the next cycle; acquire an image at exposure time t through the industrial camera and return to step S122;
S125, judging whether the current exposure time is 100 microseconds: if so, take t′=100 microseconds as the proper exposure time of the first frame and jump to step S126; otherwise, retract the exposure time by half of the last increase to obtain the exposure time t of the next cycle, acquire an image at exposure time t through the industrial camera and return to step S122;
S126, the proper exposure time t′ of the first frame is obtained and put into the exposure time set; the first-frame exposure image is acquired, the number of pixels whose value lies between 240 and 255 is counted and divided by the total pixel count to obtain the pixel ratio r, and the random search process ends.
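As a hedged sketch of steps S121 to S126 (the `capture` callback, helper names and the exact grow/retract bookkeeping are illustrative assumptions; the patent states the procedure only in prose):

```python
import numpy as np

def brightest_value(img, frac):
    # Pixel value at which the brightest `frac` of all pixels begins,
    # walking the histogram down from 255 (cf. step S122).
    hist = np.bincount(img.ravel(), minlength=256)
    need, acc = frac * img.size, 0
    for v in range(255, -1, -1):
        acc += hist[v]
        if acc >= need:
            return v
    return 0

def find_first_exposure(capture, t0=100.0):
    """Random-search sketch for the first frame's exposure time.
    capture(t) must return an 8-bit grayscale frame shot at exposure t (µs)."""
    t, last_step, retracted = t0, 0.0, False
    while True:
        p1 = brightest_value(capture(t), 0.001)  # S122: top 0.1% threshold
        if 100 <= p1 <= 250:                     # S123: close enough, refine linearly
            return 240.0 * t / p1
        if p1 < 100:                             # S124: too dark, grow the exposure
            step = last_step / 2 if retracted else t  # half-step after a retract, else double
            t, last_step, retracted = t + step, step, False
        else:                                    # S125: too bright, retract
            if t <= t0:
                return t0                        # floor reached: accept 100 µs
            t, last_step, retracted = t - last_step / 2, last_step / 2, True
```

In practice `capture` would wrap the industrial camera's acquisition call; here it is only a stand-in.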
Further, the histogram inference process in step S13 is as follows:
S131, collecting a welding image with exposure time t through the industrial camera;
S132, accumulating, downward from pixel value 240, the ratio of the pixel count at each value to the total pixel count until the sum reaches 30% of the total; the pixel value reached is denoted p2;
S133, through the formula t′=240t/p2, where t′ is the proper exposure time of the next frame and t is the current exposure time, judge whether t′ is larger than 8t: if so, assign 8t to t′, otherwise leave it unchanged. Put t′ into the exposure time set and judge whether the set now contains 3 exposure times; if not, return to step S131; otherwise, the histogram inference ends and images are collected with the exposure time set to obtain the image sequence {A}.
Further, the overall exposure ratio in step S2 is calculated as follows:
Taking the lowest-exposure-time image in the image sequence {A} as the reference, k_1 = 1. The average pixel value avg1 of the overexposure edge of the larger-exposure-time image in {A} is obtained by means of a mask, where overexposure refers to pixels with value 255 and the overexposure edge refers to the pixels adjacent to overexposed pixel positions; the average pixel value avg2 over the same positions in the lowest-exposure-time image is also obtained. According to the formula:
k_2 = avg1/avg2
the overall exposure ratio k_2 of the larger-exposure-time image to the lowest-exposure-time image is calculated.
Likewise, the average pixel value avg1 of the overexposure edge of the maximum-exposure-time image in {A} is obtained by means of a mask, together with the average pixel value avg2 over the same positions in the lower-exposure-time image, and by the formula:
k_3 = avg1/avg2
the overall exposure ratio k_3 of the maximum-exposure-time image to the lower-exposure-time image is calculated.
k_3 is then assigned the product of k_3 and k_2, so that both ratios are referenced to the lowest-exposure-time image, yielding the overall exposure ratio sequence {B}.
Further, the process of acquiring the average pixel value avg1 of the overexposure edge of an image in the image sequence {A} by means of a mask is as follows:
For the lowest-exposure-time image and a larger-exposure-time image in the image sequence {A}, the mask of the larger-exposure-time image is generated by the formula:
I_j = 1 if p_j = 255, and I_j = 0 otherwise,
where I_j is the pixel value at the j-th position of the mask image and p_j is the pixel value at the j-th position of the larger-exposure-time image.
Because electromagnetic interference is severe during welding, the camera image contains much salt-and-pepper noise, so the mask image undergoes morphological dilation by the formula:
I′ = I ⊕ B
where I′ is the mask image after morphological dilation, I is the mask image before dilation, B is the structuring element, and ⊕ denotes the dilation operation.
Subtracting the mask image before dilation from the dilated mask image gives the overexposure-edge positions of the larger-exposure-time image: I_mask = I′ − I. The mask is multiplied element-wise with the larger-exposure-time image and with the lowest-exposure-time image respectively; each result is summed and divided by the number of non-zero pixels to obtain the average pixel value of the overexposure edge.
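A minimal sketch of the masked overexposure-edge computation, assuming a 4-neighbour structuring element (the patent does not specify one) and the ratio k = avg1/avg2; the function names are illustrative:

```python
import numpy as np

def dilate4(mask):
    # Minimal 4-neighbour binary dilation (stand-in for the patent's
    # unspecified structuring element B).
    out = mask.copy()
    out[1:, :] |= mask[:-1, :]
    out[:-1, :] |= mask[1:, :]
    out[:, 1:] |= mask[:, :-1]
    out[:, :-1] |= mask[:, 1:]
    return out

def overall_exposure_ratio(low_img, high_img):
    """Overall-exposure-ratio sketch: mean pixel value on the overexposure
    edge of the higher-exposure image divided by the mean over the same
    positions in the reference image (k = avg1 / avg2)."""
    mask = high_img == 255                # overexposed pixels
    edge = dilate4(mask) & ~mask          # I_mask = I' - I: the edge ring
    if not edge.any():
        return 1.0                        # nothing saturated; assume ratio 1
    avg1 = float(high_img[edge].mean())
    avg2 = float(low_img[edge].mean())
    return avg1 / max(avg2, 1e-6)         # guard against division by zero
```

A real implementation would likely use `cv2.dilate` with a chosen kernel; the hand-rolled dilation keeps the sketch dependency-free.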
Further, the process of step S3 is as follows:
According to the overall exposure ratio sequence {B}, for each pixel position of each image in the image sequence {A}, the formula:
E_j = Σ_i w_i(P_ij) · (P_ij / k_i) / Σ_i w_i(P_ij)
is calculated to obtain E_j, the relative brightness value recovered at the j-th pixel position, from which the high-dynamic-range image C is obtained, where P_ij is the pixel value at the j-th pixel position of the i-th image in the image sequence {A}, k_i is the i-th ratio in the overall exposure ratio sequence {B}, and w_i(·) is the weight function corresponding to the i-th image. The weight functions are given by the following formulas:
w_1(p) = 0 for 0 ≤ p < 128, and w_1(p) = (p − 128)/127 for 128 ≤ p ≤ 255
w_2(p) = p/128 for 0 ≤ p < 128, and w_2(p) = (255 − p)/127 for 128 ≤ p ≤ 255
w_3(p) = 1 for 0 ≤ p < 128, and w_3(p) = (255 − p)/127 for 128 ≤ p ≤ 255
where p is the pixel value, and w_1(p), w_2(p) and w_3(p) are the weight functions of the first, second and third images in the image sequence {A}, respectively.
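A hedged sketch of the weighted-average fusion, assuming the relative brightness is recovered as P_ij/k_i (normalising each frame to the lowest-exposure reference) and using illustrative piecewise-linear weight curves matching the description of w_1, w_2 and w_3:

```python
import numpy as np

# Hypothetical piecewise-linear weights: w1 favours bright pixels (arc),
# w2 mid-tones (pool/keyhole), w3 dark pixels (weld seam).
w1 = lambda p: np.where(p < 128, 0.0, (p - 128) / 127.0)
w2 = lambda p: np.where(p < 128, p / 128.0, (255 - p) / 127.0)
w3 = lambda p: np.where(p < 128, 1.0, (255 - p) / 127.0)

def fuse_hdr(images, ks, weights):
    """Weighted-average recovery of relative brightness (step S3 sketch):
    E_j = sum_i w_i(P_ij) * (P_ij / k_i) / sum_i w_i(P_ij).
    Dividing by the overall exposure ratio k_i is an assumed reading of
    how the ratio sequence {B} maps frames onto the reference scale."""
    num = np.zeros(images[0].shape, dtype=np.float64)
    den = np.zeros_like(num)
    for img, k, w in zip(images, ks, weights):
        p = img.astype(np.float64)
        wp = w(p)
        num += wp * p / k
        den += wp
    return num / np.maximum(den, 1e-9)  # guard against all-zero weights
```

With consistent inputs (each frame k times brighter than the reference), the fused E reproduces the reference brightness regardless of the weights.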
Further, the process of tone remapping in step S4 is as follows:
The relative brightness value E_j of each pixel position of the high-dynamic image C undergoes a gamma transformation, whose calculation formula is:
p′_j = E_j^γ
where p′_j is the gamma-transformed pixel value and γ is the gamma factor, whose value ranges from 0.1 to 0.5 in K-TIG welding monitoring;
then, by the formula:
p″_j = 255 · (p′_j − p′_min)/(p′_max − p′_min)
the 8-bit high-dynamic image D is obtained, where p″_j is the stretched pixel value, and p′_max and p′_min are respectively the maximum and minimum values of p′_j.
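The gamma compression and linear stretch can be sketched as follows (gamma = 0.3 is an arbitrary value inside the stated 0.1 to 0.5 range; the function name is illustrative):

```python
import numpy as np

def tone_remap(E, gamma=0.3):
    """Tone-remapping sketch (step S4): gamma compression followed by a
    linear stretch of the result to the 8-bit range."""
    p = np.power(np.maximum(E, 0.0), gamma)        # p'_j = E_j ** gamma
    pmin, pmax = float(p.min()), float(p.max())
    if pmax - pmin < 1e-12:
        return np.zeros(E.shape, dtype=np.uint8)   # flat image: nothing to stretch
    # p''_j = 255 * (p'_j - p'_min) / (p'_max - p'_min)
    return (255.0 * (p - pmin) / (pmax - pmin)).astype(np.uint8)
```

The stretch guarantees the darkest recovered brightness maps to 0 and the brightest to 255, whatever range the fused values span.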
Further, after the first exposure time set is obtained in step S1, exposure determination needs to be performed first to determine whether the exposure time set needs to be regenerated.
Compared with the prior art, the invention has the following advantages and effects:
1) The method suits industrial monitoring scenes with ultra-strong arc interference in K-TIG welding, offering high real-time performance, little ghosting and rich information.
2) The method achieves the HDR function without replacing the existing monitoring camera: a computing and control unit can be added to the original monitoring system, or the function can even be realized directly on an industrial computer, which is convenient, fast and low-cost.
3) The method automatically selects proper exposure times according to the brightness range of the scene and performs linear synthesis using the exposure ratios, avoiding the ghosting that the rapidly changing brightness during welding would otherwise produce in the high-dynamic image.
Drawings
FIG. 1 is a flow chart of a high dynamic image fusion method suitable for a K-TIG welding ultra-strong arc scene disclosed by the invention;
FIG. 2 is a hardware block diagram of a high dynamic image fusion device suitable for K-TIG welding ultra-strong arc scenes in the embodiment of the invention;
FIG. 3 is a circuit diagram of an FPGA module according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
The embodiment discloses a high dynamic image fusion method suitable for the K-TIG welding ultra-strong arc light scene, used to generate high-dynamic images for K-TIG welding monitoring, comprising the following steps:
S1, shooting welding scene images of K-TIG welding with an industrial camera, searching for the proper exposure time of the first frame using a random search method, and generating the proper exposure time of each subsequent frame by histogram inference until a set of 3 exposure times is generated; the exposure time set is input into the industrial camera to acquire a 3-image low-dynamic-range sequence {A}. After the first exposure time set is obtained, exposure judgment is carried out first to decide whether the exposure time set needs to be regenerated;
Generating the exposure time set and the low-dynamic-range image sequence {A} involves exposure judgment, random search and histogram inference. The exposure judgment avoids the computational redundancy and wasted time of recalculating the exposure time set too frequently: in K-TIG welding the molten pool oscillates at a high frequency, above 30 Hz, and real-time pool monitoring cannot be achieved if the exposure time set is recalculated constantly. Compared with patent CN106162131A, which generates the image sequence by simply multiplying the exposure time by a factor of 2 and therefore yields a highly redundant sequence and a large computational load, this method uses random search to let the first frame contain 0.1% of its pixels in the overexposed state, maximally exploiting the information space of an 8-bit image, while the histogram inference method keeps the information redundancy at about 30%, so real-time monitoring of the K-TIG welding state can be realized effectively.
The process of the step is as follows:
s11, if the exposure time set is not empty, judging whether the exposure time set needs to be regenerated or not by detecting the exposure process, and if not, acquiring an image sequence { A } directly from the obtained exposure time set; if the exposure time set is empty or needs to be regenerated after judgment, executing the next step S12;
the specific implementation is as follows: acquiring a first frame of exposure time image in an exposure time set, counting the pixel quantity of which the pixel value is between 255 and 240, and dividing the pixel quantity by the total pixel quantity to obtain a pixel ratio r1, wherein the total pixel quantity refers to the image resolution, calculating the difference of the pixel ratio r corresponding to a first frame when the image is acquired for the first time after the exposure time set is generated for the first time or regenerated for the first time, and taking an absolute value r2, if r2 is more than 0.2%, regenerating the exposure time set, otherwise, continuously using the exposure time set to obtain an image sequence { A }.
S12, generating a first frame of proper exposure time through random search, and collecting an image of the first frame of proper exposure time; the specific implementation is as follows:
s121, collecting a welding image with exposure time of 100 microseconds through an industrial camera;
s122, counting the sum of the ratio of the pixel quantity of each pixel value to the total pixel quantity from the pixel value of 255 to obtain a pixel value p1 accounting for 0.1% of the total pixel quantity;
s123, comparing p1 with 100 and 250, if it is lower than 100, jumping to step S114, if it is higher than 250, jumping to step S115, if p1 is higher than 100 and lower than 250, according to the formula:
t′=240t/p1
wherein t' is the proper exposure time of the first frame, and t is the exposure time of the current image. And jumping to step S126;
s124, judging whether the exposure time t of the previous cycle retracts or not, if so, increasing half of the previous retraction to the current exposure time, otherwise, multiplying the exposure time by 2 to obtain the exposure time t of the next cycle, acquiring an image of the exposure time t by the industrial camera, and returning to the step S122;
s125 determines whether the current exposure time is 100 microseconds, and if the current exposure time is 100 microseconds, the process jumps to step S126 with t' =100 microseconds as the proper exposure time for the first frame; otherwise, retracting the exposure time to half of the last increase to obtain the exposure time t of the next cycle, acquiring an image of the exposure time t by the industrial camera and returning to the step S122;
s126, obtaining proper exposure time t 'of the first frame, putting the exposure time t' into an exposure time set to obtain an exposure time image of the first frame, counting the pixel quantity of which the pixel value is between 255 and 240, dividing the pixel quantity by the total pixel quantity to obtain a pixel ratio r, and ending the random search process.
S13, obtaining an image sequence { A } formed by the images of the second frame and the third frame with proper exposure time through histogram reasoning; the specific implementation is as follows:
s131, collecting a welding image with exposure time t through an industrial camera;
s132, counting the sum of the ratio of the pixel quantity of each pixel value to the total pixel quantity from the pixel value of 240 to obtain a pixel value p2 accounting for 30% of the total pixel quantity;
s133, through a formula: t' =240t/p2
And if not, putting the t' into an exposure time set, judging whether the exposure time quantity in the set is 3, if so, returning to the step S131, otherwise, ending the histogram inference process, and acquiring images by using the exposure time set to obtain an image sequence { A }.
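The histogram-inference rule (t′ = 240t/p2, capped at 8t) might be sketched as follows; the function name and the p2 ≥ 1 guard are illustrative additions:

```python
import numpy as np

def next_exposure(img, t, cap=8.0):
    """Histogram-inference sketch (steps S131-S133): choose the next frame's
    exposure so roughly 30% of pixels sit near the top of the 8-bit range."""
    hist = np.bincount(img.ravel(), minlength=256)
    need, acc, p2 = 0.30 * img.size, 0, 1
    # Accumulate pixel fractions downward from value 240 until 30% is covered.
    for v in range(240, -1, -1):
        acc += hist[v]
        if acc >= need:
            p2 = max(v, 1)  # guard against dividing by zero
            break
    return min(240.0 * t / p2, cap * t)  # t' = 240*t/p2, capped at 8t
```

The 8t cap bounds how far consecutive exposures can spread, which keeps the information redundancy between frames from collapsing entirely.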
S2, taking the lowest exposure time image in the obtained image sequence { A } as a reference, and obtaining an overall exposure ratio sequence { B } of a higher exposure time image in the image sequence { A } relative to the lowest exposure time image in a masking mode; the step aims to strengthen the influence of noise on the fused image after the redundancy of the image sequence is reduced, the traditional method for calculating the exposure ratio by using the exposure time set can cause serious ghosting, unnecessary troubles are added to subsequent image processing, the ghosting can be effectively removed by calculating the whole exposure ratio by using a mask mode, and the smooth transition of the fused high-dynamic image is realized. The method specifically comprises the following steps:
taking the lowest exposure time image in the image sequence { A } as a reference, k 1 =1, obtaining average pixel value avg1 of overexposure edge of image with larger exposure time in image sequence { A } by means of mask, wherein overexposure refers to pixel with pixel value of 255, and overexposure edge refers to pixel adjacent to overexposureAnd calculating the average pixel value avg2 of the same position range of the lowest exposure time image according to the formula:
Figure BDA0003009072220000101
obtaining the integral exposure ratio k of the image with larger exposure time and the image with the minimum exposure time 2 (ii) a Then, an average pixel value avg1 of an overexposure edge of the image sequence { A } maximum exposure time image is obtained in a masking mode, an average pixel value avg2 of the same position range of the image with lower exposure time is obtained, and the average pixel value avg2 is obtained through a formula:
Figure BDA0003009072220000102
obtaining the overall exposure ratio k of the maximum exposure time image and the lower exposure time image 3 (ii) a Then k is put 3 And k 2 Is given to k 3 And obtaining the whole exposure ratio sequence { B } by taking the lowest exposure time image as a reference.
The specific masking procedure is as follows. Taking the lowest-exposure-time image and the larger-exposure-time image in the image sequence {A} as an example, the mask of the larger-exposure-time image is generated by the formula:
I_j = 1 if p_j = 255, and I_j = 0 otherwise,
where I_j is the pixel value at the j-th position of the mask image and p_j is the pixel value at the j-th position of the larger-exposure-time image. Because electromagnetic interference is severe during welding, the camera image contains much salt-and-pepper noise, so the mask image undergoes morphological dilation, specifically:
I′ = I ⊕ B
where I′ is the morphologically processed mask image, I is the mask image before processing, B is the structuring element, and ⊕ denotes the dilation operation. Subtracting the mask image before dilation from the dilated mask image gives the overexposure-edge positions of the larger-exposure-time image: I_mask = I′ − I. The mask is multiplied element-wise with the larger-exposure-time image and with the lowest-exposure-time image respectively; each result is summed and divided by the number of non-zero pixels to obtain the average pixel value of the overexposure edge.
S3, restoring the relative brightness value of the pixel of the image sequence { A } by using the ratio sequence { B } obtained in the integral exposure ratio, and obtaining a high dynamic range image C by calculating the weighted average of the relative brightness values of all pixel positions in the image sequence { A }; compared with the patent CN108668093A and the patent CN108629739B, the two inventions generate a weight map by using image information so as to perform weighted synthesis on an image sequence, and the image sequence with lower information redundancy has serious loss of details, and the two inventions are both directed at rigid bodies, and a welding pool belongs to fluid, which causes serious ghost defects; compared with the patent CN106162131A, the method uses linear calculation, which avoids the complex operation of providing a camera response function, and avoids performing complex operation, thereby ensuring the real-time performance of monitoring. The method comprises the following steps:
according to the overall exposure ratio sequence { B }, the pixel positions of each image of the image sequence { A } are determined by the formula:
Figure BDA0003009072220000112
wherein, P ij Is the pixel value, k, of the jth pixel position in the ith image in the sequence of images { A } i Is the ith ratio, w, in the overall exposure ratio series { B } i ("E) is a weight function corresponding to the ith image j And obtaining the high dynamic range image C for the relative brightness value recovered from the jth pixel position of the image.
The weight function evaluates the confidence of different pixel values in each image. In K-TIG welding, the huge brightness difference tends to produce obvious pixel-value discontinuities in the first frame: when higher pixel values transition to lower ones, the pixel counts drop off in a cliff-like fashion. The first frame mainly captures arc information, so its weight increases gradually from pixel value 128 and is minimal between 0 and 128. The second frame mainly captures the molten pool and the keyhole entrance, so its weight function is triangular with 128 as the axis of symmetry. The third frame mainly captures weld-seam information, which, because of the strong K-TIG arc, still lies in the lower pixel-value region, so the weight is highest between 0 and 128 and decreases gradually from 128. The specific formulas are as follows:
$$w_1(p) = \begin{cases} 0, & 0 \le p \le 128 \\ \dfrac{p - 128}{127}, & 128 < p \le 255 \end{cases}$$

$$w_2(p) = \begin{cases} \dfrac{p}{128}, & 0 \le p \le 128 \\ \dfrac{255 - p}{127}, & 128 < p \le 255 \end{cases}$$

$$w_3(p) = \begin{cases} 1, & 0 \le p \le 128 \\ \dfrac{255 - p}{127}, & 128 < p \le 255 \end{cases}$$
where p is the pixel value, and w_1(p), w_2(p), and w_3(p) are the weight functions of the first, second, and third images in the image sequence { A }, respectively.
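The fusion of step S3 can be sketched in NumPy as follows. This is a minimal illustration, not the patent's implementation: the function names (`weight_tables`, `fuse`), the exact piecewise-linear breakpoints, and the small denominator floor are assumptions; only the overall shapes of the three weights (minimal below 128 for the arc frame, triangular about 128 for the pool frame, decreasing above 128 for the seam frame) and the weighted average of P_ij / k_i come from the text.

```python
import numpy as np

def weight_tables():
    """Piecewise-linear per-frame weights matching the described shapes;
    breakpoints and normalization are reconstructions, not verbatim."""
    p = np.arange(256, dtype=np.float64)
    w1 = np.where(p <= 128, 0.0, (p - 128) / 127.0)          # arc frame: favors high values
    w2 = np.where(p <= 128, p / 128.0, (255.0 - p) / 127.0)  # pool frame: triangle about 128
    w3 = np.where(p <= 128, 1.0, (255.0 - p) / 127.0)        # seam frame: favors low values
    return np.stack([w1, w2, w3])

def fuse(images, ratios):
    """Weighted average of the relative brightness P_ij / k_i over 3 frames.

    images: list of three uint8 arrays (lowest exposure first);
    ratios: the overall exposure ratio sequence [k1, k2, k3] from step S2."""
    w = weight_tables()
    num = np.zeros(images[0].shape, dtype=np.float64)
    den = np.zeros_like(num)
    for i, (img, k) in enumerate(zip(images, ratios)):
        wi = w[i][img]                            # per-pixel weight via lookup table
        num += wi * (img.astype(np.float64) / k)  # relative brightness term
        den += wi
    return num / np.maximum(den, 1e-9)            # high dynamic range image C
```

Using a 256-entry lookup table per frame keeps the per-pixel cost to indexing and multiply-accumulate, which is consistent with the linear, response-function-free computation the text emphasizes.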
S4, carrying out tone remapping on the obtained high dynamic range image C, and carrying out gamma conversion on the high dynamic range image C to obtain an 8-bit image D which can be displayed on a display, wherein the method specifically comprises the following steps:
The relative brightness value E_j at each pixel position of the high dynamic image C is subjected to gamma transformation, realized by the formula:

$$p'_j = E_j^{\gamma}$$

wherein p'_j is the gamma-transformed pixel value and γ is the gamma transformation factor, whose value range in K-TIG welding is 0.1 to 0.5. Then, through the formula

$$p''_j = 255 \times \frac{p'_j - p'_{\min}}{p'_{\max} - p'_{\min}}$$

wherein p''_j is the stretched pixel value and p'_max and p'_min are respectively the maximum and minimum values of p'_j, the 8-bit high dynamic image D is obtained.
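The tone remapping of step S4 (gamma compression followed by a linear stretch to 8 bits) can be sketched as below. The function name `tone_remap`, the default gamma of 0.3 (chosen inside the 0.1–0.5 range stated above), and the flat-image guard are assumptions of this sketch.

```python
import numpy as np

def tone_remap(E, gamma=0.3):
    """Gamma-compress the fused relative brightness E, then stretch to 8 bits."""
    p = np.power(np.maximum(E, 0.0), gamma)       # p'_j = E_j ** gamma
    p_min, p_max = p.min(), p.max()
    if p_max == p_min:
        return np.zeros(E.shape, dtype=np.uint8)  # flat image: avoid divide-by-zero
    stretched = 255.0 * (p - p_min) / (p_max - p_min)
    return stretched.astype(np.uint8)             # displayable 8-bit image D
```

A gamma below 1 compresses the huge arc-to-seam brightness span so that the dark weld region survives the quantization to 8 bits.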
Example two
The embodiment discloses a specific implementation method of a high dynamic image fusion method suitable for a K-TIG welding ultra-strong arc scene, and the fusion algorithm steps are shown in figure 1.
In realizing the automation of K-TIG welding, vision is indispensable as an intuitive and information-rich sensing technology. During K-TIG welding, the ultra-strong arc light drives an ordinary industrial camera into an overexposed state, so that information including the arc, keyhole, molten pool and weld seam cannot be stably acquired; this has hindered the application of ordinary industrial cameras in the welding industry. High dynamic range imaging can raise the effective dynamic range of an industrial camera and break through this obstacle.
Without changing the existing camera monitoring system, an FPGA data processing module and an MCU control module are added between the camera and the industrial personal computer; fig. 2 is a block diagram of the system. To avoid the electromagnetic interference radiated by K-TIG welding, the system is integrated outside the camera, so that the shielding of the original industrial camera is not compromised. The image processing operations designed for each step of the high dynamic image fusion method, including histogram statistics, morphological processing, relative brightness recovery and fusion, and tone remapping, are all implemented in the FPGA image processing module. As shown in fig. 3, the image enters the buffer circuit through the communication module, and the MCU controls which functional module it is routed to. Besides selecting which module of the FPGA image processing module operates on the image, the MCU is also responsible for writing the calculated exposure times to the industrial camera and triggering image acquisition. Finally, the FPGA image processing module outputs the welding image to the industrial personal computer for feature extraction, analysis and display.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments, and any other changes, modifications, substitutions, combinations, and simplifications which do not depart from the spirit and principle of the present invention should be construed as equivalents thereof, and all such modifications are intended to be included in the scope of the present invention.

Claims (6)

1. A high dynamic image fusion method suitable for a K-TIG welding ultra-strong arc light scene is characterized by comprising the following steps:
S1, shooting welding scene images of K-TIG welding with an industrial camera, searching for a proper exposure time for the first frame of the welding scene images using a random search method, and generating a proper exposure time for each following frame from the histogram until a set of 3 exposure times is generated; the exposure time set is input to the industrial camera to acquire a sequence { A } of 3 low-dynamic-range images; the process of generating the exposure time set is as follows:
s11, if the exposure time set is not empty, judging whether the exposure time set needs to be regenerated or not by detecting the exposure process, and if not, acquiring an image sequence { A } directly from the obtained exposure time set; if the exposure time set is empty or needs to be regenerated after judgment, executing the next step S12;
the process of judging whether the exposure time set needs to be regenerated or not by detecting the exposure process is as follows:
acquiring the image at the first exposure time in the exposure time set, counting the number of pixels whose pixel value lies between 240 and 255, and dividing by the total number of pixels (the image resolution) to obtain a pixel ratio r1; the pixel ratio r recorded for the first frame when images were first acquired after the exposure time set was generated or regenerated is subtracted from r1 and the absolute value r2 is taken; if r2 is greater than 0.2%, the exposure time set needs to be regenerated, otherwise the exposure time set continues to be used to obtain the image sequence { A };
s12, generating a first frame of proper exposure time through random search, and collecting an image of the first frame of proper exposure time;
s13, obtaining images with proper exposure time of the second frame and the third frame through histogram reasoning to form an image sequence { A };
s2, taking the lowest exposure time image in the obtained image sequence { A } as a reference, and obtaining an overall exposure ratio sequence { B } of a higher exposure time image in the image sequence { A } relative to the lowest exposure time image in a masking mode; the calculation process of the overall exposure ratio in the step S2 is as follows:
taking the lowest exposure time image in the image sequence { A } as reference, with k_1 = 1, the average pixel value avg1 of the overexposure edge of the image with the larger exposure time in { A } is obtained by means of a mask, where overexposure refers to pixels with a pixel value of 255 and the overexposure edge refers to pixels adjacent to overexposed pixel positions; the average pixel value avg2 over the same positions of the lowest exposure time image is then obtained, and according to the formula:

$$k_2 = k_1 \times \frac{avg1}{avg2}$$

the overall exposure ratio k_2 of the image with the larger exposure time relative to the image with the minimum exposure time is calculated;

the average pixel value avg1 of the overexposure edge of the maximum exposure time image of the image sequence { A } is likewise acquired by means of a mask, and the average pixel value avg2 over the same positions of the image with the lower exposure time is obtained; through the formula:

$$k_3 = \frac{avg1}{avg2}$$

the overall exposure ratio k_3 of the maximum exposure time image relative to the image with the lower exposure time is calculated;

the product of k_3 and k_2 is then assigned to k_3, yielding the overall exposure ratio sequence { B } referenced to the lowest exposure time image;
the process of acquiring the average pixel value avg1 of the overexposure edge of the image sequence { A } maximum exposure time image in a masking mode is as follows:
for the lowest exposure time image and an image with larger exposure time in the image sequence { A }, the mask of the larger exposure time image is generated by the formula:

$$I_j = \begin{cases} 1, & p_j = 255 \\ 0, & p_j < 255 \end{cases}$$

wherein I_j is the pixel value at the j-th position of the mask image and p_j is the pixel value at the j-th position of the larger exposure time image;
because electromagnetic interference is severe during welding, the camera image contains a great deal of salt-and-pepper noise; the mask image is therefore subjected to morphological dilation through the formula

$$I' = I \oplus B$$

wherein I' is the mask image after morphological processing, I is the mask image before processing, ⊕ denotes the dilation operation, and B is the structuring element;

subtracting the pre-dilation mask image from the dilated mask image gives the overexposure edge positions of the larger exposure time image: I_mask = I' - I; this mask is dot-multiplied with the larger exposure time image and with the lowest exposure time image respectively, and each sum is divided by the number of non-zero pixels to obtain the average pixel value of the overexposure edge;
s3, restoring the relative brightness value of the pixel of the image sequence { A } by using the ratio sequence { B } obtained in the integral exposure ratio, and obtaining a high dynamic range image C by calculating the weighted average of the relative brightness values of all pixel positions in the image sequence { A };
and S4, carrying out tone remapping on the obtained high dynamic range image C, and carrying out gamma conversion on the high dynamic range image C to obtain an 8-bit image D which can be displayed on a display.
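The step-S2 masking procedure of claim 1 (binarize saturated pixels, dilate, subtract to obtain the overexposure edge, then compare edge averages across exposures) can be sketched as follows. The 3x3 structuring element and the function names are assumptions; the patent does not specify the element's size.

```python
import numpy as np

def dilate3x3(m):
    """3x3 binary dilation implemented with NumPy shifts (8-neighborhood)."""
    out = m.copy()
    out[1:, :] |= m[:-1, :]; out[:-1, :] |= m[1:, :]      # vertical neighbors
    out[:, 1:] |= m[:, :-1]; out[:, :-1] |= m[:, 1:]      # horizontal neighbors
    out[1:, 1:] |= m[:-1, :-1]; out[:-1, :-1] |= m[1:, 1:]  # diagonals
    out[1:, :-1] |= m[:-1, 1:]; out[:-1, 1:] |= m[1:, :-1]
    return out

def exposure_ratio(low_img, high_img):
    """Ratio k = avg1 / avg2 over the overexposure edge (I_mask = I' - I)."""
    mask = high_img == 255                 # I: saturated pixels of the longer exposure
    edge = dilate3x3(mask) & ~mask         # I_mask: pixels adjacent to saturation
    if edge.sum() == 0:
        return 1.0                         # no saturation: treat exposures as comparable
    avg1 = high_img[edge].astype(np.float64).mean()  # edge mean, longer exposure
    avg2 = low_img[edge].astype(np.float64).mean()   # same positions, shortest exposure
    return avg1 / max(avg2, 1e-9)
```

Comparing the two images only on the unsaturated ring around the overexposed region is what lets the ratio be estimated linearly, without a camera response function.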
2. The method for fusing high-dynamic images in a K-TIG welding super strong arc scene according to claim 1, wherein the process of generating the first frame of proper exposure time through random search in step S12 is as follows:
s121, collecting a welding image with exposure time of 100 microseconds through an industrial camera;
S122, starting from pixel value 255 and accumulating, value by value, the ratio of the pixel count at each pixel value to the total pixel count, the pixel value p1 at which the accumulated ratio reaches 0.1% of the total is obtained;
S123, comparing p1 with 100 and 250: if p1 is not higher than 100, jump to step S124; if it is not lower than 250, jump to step S125; if p1 is higher than 100 and lower than 250, then according to the formula:
t′=240t/p1
wherein t' is the proper exposure time of the first frame, t is the exposure time of the current image, and go to step S126;
S124, judging whether the exposure time t was retracted in the previous cycle; if so, the current exposure time is increased by half of the previous retraction, otherwise the exposure time is multiplied by 2, giving the exposure time t for the next cycle; the industrial camera acquires an image at exposure time t and the process returns to step S122;
S125, judging whether the current exposure time is 100 microseconds; if it is, t' = 100 microseconds is taken as the proper exposure time of the first frame and the process jumps to step S126; otherwise the exposure time is retracted by half of the last increase, giving the exposure time t for the next cycle; the industrial camera acquires an image at exposure time t and the process returns to step S122;
S126, the proper exposure time t' of the first frame is obtained and placed into the exposure time set; the image at the first-frame exposure time is acquired, the number of pixels whose pixel value lies between 240 and 255 is counted and divided by the total pixel count to obtain the pixel ratio r, and the random search process ends.
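The doubling-and-backtracking search of steps S121-S126 can be sketched as below. Here `capture(t)` is a stand-in for the industrial camera (it must return an 8-bit image for an exposure time of t microseconds); that callback, the function names, and the iteration cap are assumptions of this sketch.

```python
import numpy as np

def brightest_value(img, frac=0.001):
    """Pixel value reached when, scanning down from 255, the accumulated
    pixel-count ratio first reaches `frac` of the total (step S122)."""
    hist = np.bincount(img.ravel(), minlength=256)
    acc = 0
    for v in range(255, -1, -1):
        acc += hist[v]
        if acc / img.size >= frac:
            return v
    return 0

def search_first_exposure(capture, t0=100, max_iters=50):
    """Search for the first-frame exposure time (microseconds)."""
    t = float(t0)
    last_increase = last_retract = 0.0
    for _ in range(max_iters):
        p1 = brightest_value(capture(t))
        if 100 < p1 < 250:
            return 240.0 * t / p1                 # t' = 240 t / p1 (S123)
        if p1 <= 100:                             # too dark (S124)
            step = last_retract / 2 if last_retract > 0 else t  # doubling adds t itself
            t += step
            last_increase, last_retract = step, 0.0
        else:                                     # too bright (S125)
            if t == t0:
                return float(t0)                  # cannot go below the 100 us floor
            step = last_increase / 2
            t -= step
            last_retract, last_increase = step, 0.0
    return t
```

The loop behaves like a coarse exponential search followed by binary refinement: double while underexposed, step back by half the last move once overexposed.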
3. The high-dynamic image fusion method suitable for the K-TIG welding super strong arc scene according to the claim 1, characterized in that the histogram inference process in the step S13 is as follows:
s131, collecting a welding image with exposure time t through an industrial camera;
S132, starting from pixel value 240 and accumulating, value by value, the ratio of the pixel count at each pixel value to the total pixel count, the pixel value p2 at which the accumulated ratio reaches 30% of the total is obtained;
S133, through the formula t' = 240t/p2, where t' is the proper exposure time of the next frame and t is the current exposure time, it is judged whether t' is greater than 8 times t; if so, 8t is assigned to t', otherwise no modification is made; t' is placed into the exposure time set, and it is judged whether the set contains 3 exposure times; if not, return to step S131, otherwise the histogram inference process ends and images are acquired with the exposure time set to obtain the image sequence { A }.
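The histogram inference of steps S131-S133 can be sketched as follows; `capture(t)` again stands in for the camera, and the function names are assumptions of this sketch.

```python
import numpy as np

def next_exposure(img, t):
    """t' = 240 t / p2, capped at 8 t, where p2 is the value reached when,
    scanning down from 240, 30% of all pixels have been accumulated (S132)."""
    hist = np.bincount(img.ravel(), minlength=256)
    acc = 0
    p2 = 1
    for v in range(240, -1, -1):
        acc += hist[v]
        if acc / img.size >= 0.30:
            p2 = max(v, 1)                  # guard against division by zero
            break
    return min(240.0 * t / p2, 8.0 * t)     # cap at 8x the current exposure (S133)

def build_exposure_set(capture, t_first):
    """Grow the 3-element exposure time set from the first-frame time."""
    times = [t_first]
    while len(times) < 3:
        times.append(next_exposure(capture(times[-1]), times[-1]))
    return times
```

The 8t cap keeps a nearly black frame from demanding an exposure so long that it would break the real-time acquisition cycle.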
4. The high-dynamic image fusion method suitable for the K-TIG welding ultra-strong arc scene according to claim 1, characterized in that the process of step S3 is as follows:
according to the overall exposure ratio sequence { B }, the relative brightness at each pixel position of each image of the image sequence { A } is calculated by the formula

$$E_j = \frac{\sum_{i=1}^{3} w_i(P_{ij})\, P_{ij}/k_i}{\sum_{i=1}^{3} w_i(P_{ij})}$$

obtaining E_j, the relative brightness value recovered at the j-th pixel position of the image, from which the high dynamic range image C is obtained, wherein P_{ij} is the pixel value at the j-th pixel position of the i-th image in the image sequence { A }, k_i is the i-th ratio in the overall exposure ratio sequence { B }, and w_i(·) is the weight function corresponding to the i-th image; the weight functions are given by the following formulas:
$$w_1(p) = \begin{cases} 0, & 0 \le p \le 128 \\ \dfrac{p - 128}{127}, & 128 < p \le 255 \end{cases}$$

$$w_2(p) = \begin{cases} \dfrac{p}{128}, & 0 \le p \le 128 \\ \dfrac{255 - p}{127}, & 128 < p \le 255 \end{cases}$$

$$w_3(p) = \begin{cases} 1, & 0 \le p \le 128 \\ \dfrac{255 - p}{127}, & 128 < p \le 255 \end{cases}$$
where p is the pixel value, and w_1(p), w_2(p), and w_3(p) are the weight functions of the first, second, and third images in the image sequence { A }, respectively.
5. The high-dynamic image fusion method suitable for the K-TIG welding ultra-strong arc scene according to claim 1, characterized in that the process of tone remapping in step S4 is as follows:
the relative brightness value E_j at each pixel position of the high dynamic image C is subjected to gamma transformation, whose calculation formula is:

$$p'_j = E_j^{\gamma}$$

wherein p'_j is the gamma-transformed pixel value and γ is the gamma transformation factor, whose value range in K-TIG welding monitoring is 0.1 to 0.5;

then, through the formula

$$p''_j = 255 \times \frac{p'_j - p'_{\min}}{p'_{\max} - p'_{\min}}$$

the 8-bit high dynamic image D is obtained, wherein p''_j is the stretched pixel value and p'_max and p'_min are respectively the maximum and minimum values of p'_j.
6. The method for fusing high-dynamic images in the ultra-strong arc scene in K-TIG welding according to claim 1, wherein after the first exposure time set is obtained in step S1, exposure judgment must first be performed to decide whether the exposure time set needs to be regenerated.
CN202110376562.6A 2021-04-07 2021-04-07 High-dynamic image fusion method suitable for K-TIG welding ultra-strong arc light scene Active CN113240614B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110376562.6A CN113240614B (en) 2021-04-07 2021-04-07 High-dynamic image fusion method suitable for K-TIG welding ultra-strong arc light scene

Publications (2)

Publication Number Publication Date
CN113240614A CN113240614A (en) 2021-08-10
CN113240614B true CN113240614B (en) 2023-02-10

Family

ID=77131169


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103826066A (en) * 2014-02-26 2014-05-28 芯原微电子(上海)有限公司 Automatic exposure adjusting method and system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10131897A1 (en) * 2001-07-04 2003-01-16 Leica Microsystems Method and measuring device for detecting an object
US8933985B1 (en) * 2011-06-06 2015-01-13 Qualcomm Technologies, Inc. Method, apparatus, and manufacture for on-camera HDR panorama
CN106162131B (en) * 2015-04-28 2018-08-28 深圳市易瞳科技有限公司 A kind of real time image processing
EP3417762B1 (en) * 2016-02-16 2023-07-05 Sony Group Corporation Image processing device, image processing method, and program
JP2017184094A (en) * 2016-03-31 2017-10-05 ソニー株式会社 Imaging control device, imaging control method, computer program and electronic equipment
CN108629739B (en) * 2017-03-23 2020-08-11 展讯通信(上海)有限公司 HDR image generation method and device and mobile terminal
CN108668093B (en) * 2017-03-31 2020-08-14 华为技术有限公司 HDR image generation method and device
US11128809B2 (en) * 2019-02-15 2021-09-21 Samsung Electronics Co., Ltd. System and method for compositing high dynamic range images


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A high dynamic range SPAD pixel for time of flight imaging; Francescopaolo Mattioli Della Rocca et al.; IEEE (https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8234049); 2017-11-01; pp. 1-3 *
Research on the rotating arc magnetic field controller and welding quality of robotic K-TIG welding; Zhong Shaotao et al.; Automation & Instrumentation; 2019-07-25 (No. 7); pp. 137-140 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant