WO2021238095A1 - Alloy analysis system and re-inspection method therefor - Google Patents

Alloy analysis system and re-inspection method therefor

Info

Publication number
WO2021238095A1
WO2021238095A1 (application PCT/CN2020/128643)
Authority
WO
WIPO (PCT)
Prior art keywords
point
structured light
detection
convex
candidate
Prior art date
Application number
PCT/CN2020/128643
Other languages
English (en)
Chinese (zh)
Inventor
孙茂杰
李福存
孙敬忠
朱正清
杨文�
周鼎
苏循亮
林启森
吴俊生
Original Assignee
江苏金恒信息科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 江苏金恒信息科技股份有限公司
Publication of WO2021238095A1


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84: Systems specially adapted for particular applications
    • G01N 21/01: Arrangements or apparatus for facilitating the optical investigation
    • G01N 2021/0106: General arrangement of respective parts
    • G01N 2021/0112: Apparatus in one mechanical, optical or electronic block

Definitions

  • This application relates to the technical field of visual inspection, and in particular to an alloy analysis system and a re-inspection method thereof.
  • A structured light measurement system is generally used to locate the best detection point on the sample surface.
  • The structured light measurement system is mainly composed of a structured light source, a camera and an image processing system.
  • The measurement principle is to project structured light onto the sample to be tested; the structured light is modulated and deformed by the sample surface, so that structured light fringes appear in the sample image captured by the camera, and the three-dimensional position coordinates of the best detection point are calculated by analysing those fringes.
  • However, the alloy analyzer may fail to detect valid data at the best detection point.
  • In that case a re-inspection is required: the surface of the sample to be tested is re-photographed and the best detection point is calculated again from the newly captured sample image, until the alloy analyzer returns valid detection data.
  • The detection efficiency of this method is low and the working cycle of the alloy analysis system is long, which cannot meet the requirements of scenarios that demand rapid detection.
  • the present application provides an alloy analysis system and a retest method thereof.
  • this application provides an alloy analysis system, including:
  • An alloy analyzer connected to the robot, the alloy analyzer including a probe;
  • a structured light source connected with the robot, and a camera assembly used to capture sample images;
  • the control system is configured to perform:
  • in response to not receiving valid detection data fed back by the alloy analyzer, control the alloy analyzer to recheck the candidate detection points one by one, in descending order of priority within the detection point set, until valid detection data is received, at which point the re-inspection process ends.
  • control system is further configured to calculate the first detection point and the candidate detection point according to the following steps:
  • the center points of the sub-stripes in the structured light stripe are sorted by v coordinate value, and the center point with the largest v coordinate value is taken as the first most convex point;
  • a candidate most convex point is the most convex center point within another interval;
  • control system is further configured to divide the structured light stripe into a plurality of sections according to the following steps:
  • starting from the first most convex point, interval nodes are calibrated at every preset step length, thereby dividing the structured light stripe into N+1 intervals;
  • N represents the preset number of detection points included in the detection point set.
  • the camera assembly includes a first camera and a second camera at the same level;
  • the sample image includes a first image taken by the first camera and a second image taken by the second camera;
  • the structured light stripe includes the first structured light stripe extracted from the first image and the second structured light stripe extracted from the second image;
  • the control system is configured to calculate the detection point set according to the following steps:
  • the first detection point is calculated according to the matching pair to which the first most convex point (u1, v1) belongs and the mapping relationship between the image coordinate system uov and the world coordinate system XYZ.
  • control system is further configured to calculate candidate detection points according to the following steps:
  • the candidate most convex point is the most convex center point included in the first target point set in the other interval;
  • the candidate detection point is calculated according to the matching pair to which the candidate most convex point belongs and the mapping relationship between the image coordinate system uov and the world coordinate system XYZ.
  • the control system is further configured to determine, for any pixel point (u1m, v1m) in the first center point set, whether a uniquely matching pixel point (u2n, v2n) exists, according to the following steps:
  • in the image coordinate system uov, calculate the distance between the pixel point (u1m, v1m) and each pixel point in the second center point set to form a distance value set;
  • a unique match is found only if exactly one distance value in the set is less than or equal to the first threshold, i.e. the distance between the pixel point (u2n, v2n) and the pixel point (u1m, v1m) is less than or equal to the first threshold.
  • the priority ranking is:
  • the first most convex point and the candidate most convex points are sorted in descending order of v coordinate value, and the first detection point and the candidate detection points are sorted correspondingly.
  • the alloy analysis system further includes a voice device, and the control system is further configured to:
  • in response to the end of the re-inspection process, control the voice device to broadcast the prompt information corresponding to the valid detection data;
  • this application provides a re-inspection method of an alloy analysis system, including:
  • in response to not receiving valid detection data fed back by the alloy analyzer, control the alloy analyzer to recheck the candidate detection points one by one, in descending order of priority within the detection point set, until valid detection data is received, at which point the re-inspection process ends.
  • the key to the re-inspection of this application is to calculate the detection point set based on the structured light fringes in the sample image in advance.
  • the detection point set includes the first detection point (that is, the global best detection point) and the candidate detection point.
  • the candidate detection points are re-examined in descending order of priority until valid detection data fed back by the alloy analyzer is received, at which point the re-examination ends.
  • with this solution, once the detection point set has been obtained in advance, only the candidate detection points in the set need to be called for re-inspection; there is no need to repeatedly take pictures to recalculate detection points. This improves the detection efficiency of the alloy analysis system and shortens its working cycle, making it better suited to scenarios that require fast detection.
  • Fig. 1 is a schematic structural diagram of an alloy analysis system shown in an embodiment of the application
  • Fig. 3 is a schematic diagram of the connection structure of the bracket, the alloy analysis visual positioning device, and the alloy analyzer shown in an embodiment of the application;
  • FIG. 5 is a schematic structural diagram of a binocular vision positioning device according to an embodiment of the application.
  • Fig. 6 is a flowchart of a re-inspection method of an alloy analysis system shown in an embodiment of the application;
  • FIG. 7 is a schematic diagram of a sample image with structured light fringes shown in an embodiment of the application.
  • FIG. 8 is a simplified schematic diagram showing the division of structured light stripe regions according to an embodiment of the application.
  • FIG. 9 is a schematic diagram of the detection principle of the Z coordinate of the first detection point shown in an embodiment of the application.
  • FIG. 10 is a schematic diagram of markings of the first most convex point and the candidate most convex points shown in an embodiment of the application;
  • FIG. 11 is a schematic diagram of electrical connections of another alloy analysis system shown in an embodiment of the application.
  • 1 - robot; 2 - bracket; 21 - flange; 22 - mounting plate; 23 - support rod; 24 - first end; 25 - second end; 3 - alloy analyzer; 31 - probe; 4 - visual positioning device; 41 - structured light source; 42 - camera assembly; 421 - first camera; 422 - second camera; 43 - bottom plate; 44 - outer shield; 441 - front panel; 5 - control system; 501 - PLC controller; 502 - computer; 6 - voice device; 100 - sample to be tested.
  • the alloy analysis system provided by an embodiment of the present application includes a robot 1, an alloy analyzer 3 connected to the robot 1, a visual positioning device 4 connected to the robot 1 and arranged adjacent to the alloy analyzer 3, and a control system 5.
  • the control system 5 is electrically connected to the robot 1 to control the movement and opening and closing of the robot 1.
  • the robot 1 drives the alloy analyzer 3 and the visual positioning device 4 to move together; the robot 1 can be a six-axis robot or another type.
  • the alloy analyzer 3 and the robot 1 can be connected by flanges or bolts, and the visual positioning device 4 and the robot 1 can be connected by bolts or mounting plates.
  • the alloy analyzer 3 is electrically connected to the control system 5, and the control system 5 is used to control the opening and closing state of the alloy analyzer.
  • The alloy analyzer 3 uses X-ray fluorescence analysis, which can quickly, accurately and non-destructively analyse a variety of materials. It has a wide, customizable grade library: users can modify the existing library, add new grades or create their own, enabling strict control over the analysis of light elements (magnesium, aluminium, silicon, phosphorus, sulfur). It also provides powerful background data management, and the software can be customized to requirements. Test results and reports can be downloaded directly to a USB drive, or data can be transmitted via WiFi, USB or network cable.
  • the alloy analysis system further includes a bracket 2.
  • the bracket 2 can adopt an L-shaped structure, and the first end 24 of the bracket 2 is connected to the robot 1 through a flange 21.
  • the alloy analyzer 3 is installed on the second end 25 of the bracket 2.
  • the second end 25 of the bracket 2 is provided with a mounting plate 22.
  • the mounting plate 22 can be specifically arranged on the side of the second end 25 of the bracket 2.
  • the visual positioning device 4 is connected to the mounting plate 22 by bolting, welding, or a similar method.
  • the alloy analyzer 3 and the visual positioning device 4 are arranged adjacently and in parallel on the bracket 2, both facing the sample 100 to be tested.
  • the two sides of the bracket 2 are connected by a support rod 23 to reinforce the support structure and thereby increase the mechanical strength of the bracket 2.
  • a motion trajectory can be generated according to the three-dimensional coordinates of the robot and the best detection point, so that after the robot 1 moves along the trajectory, the probe 31 of the alloy analyzer 3 contacts the best detection point, detects it, and completes the alloy analysis of the sample 100 to be tested.
  • the sample 100 to be tested can be a wire rod, a coil, a wire, or other samples that require alloy analysis, which is not limited in this application.
  • the visual positioning device 4 includes a structured light source 41 and a camera assembly 42.
  • the camera assembly 42 may be a monocular camera (that is, the camera assembly 42 includes one camera) or a binocular camera (that is, the camera assembly 42 includes two cameras).
  • Fig. 5 shows a binocular visual positioning device 4: the camera assembly 42 includes a first camera 421 and a second camera 422 at the same level, and the control system 5 is electrically connected to the structured light source 41, the first camera 421 and the second camera 422 to control the activation and deactivation of these components.
  • the first camera 421 and the second camera 422 use the same model of industrial cameras to ensure the same shooting parameters.
  • the first camera 421 and the second camera 422 have the same height in the vertical direction and the same distance from the surface of the sample to be tested.
  • the first camera 421 and the second camera 422 are offset only in the horizontal direction (that is, the left-right direction in FIG. 5). A monocular visual positioning device 4 has the same structure as in FIG. 5, except that it has only one camera.
  • when the structured light source 41 is activated, it generates structured light. Because structured light is deformed by the modulation of the surface of the sample to be tested, the light reflected from the surface of the sample 100 and received by the camera assembly 42 produces, in the captured image, structured light fringes that carry the true deformation characteristics of that surface.
  • the structured light fringes can be used to calculate the first detection point (ie the global best detection point) and candidate detection points of the alloy analyzer 3 on the surface of the sample 100 to be tested.
  • the camera included in the camera assembly 42 can be provided with an adjustment element for adjusting the number of structured light fringes in the sample image.
  • the visual positioning device 4 also includes a bottom plate 43 and an outer shield 44.
  • the front panel 441 (the front side) of the outer shield 44 is transparent, which ensures that the light from the structured light source 41 can reach the surface of the sample 100 to be tested and that the camera assembly 42 can capture the sample image.
  • the transparent front panel 441 also provides sealing protection. The rear end (the back) of the outer shield 44 is fixed on the bottom plate 43; the structured light source 41 and the camera assembly 42 are located inside the outer shield 44 and fixed on the bottom plate 43. The bottom plate 43 is rigidly connected to the mounting plate 22 by bolts, welding, or the like; it not only carries the structured light source 41 and the camera assembly 42, but also fixes the visual positioning device 4 to the bracket 2 and seals and protects the rear end of the visual positioning device 4.
  • the axial centers of the structured light source 41 and the camera assembly 42 are on the same vertical plane to ensure the image shooting quality.
  • the vertical distance between the camera assembly 42 and the structured light source 41 is 700 mm-100 mm.
  • a distance measuring device, such as a laser rangefinder, can be set on the top of the visual positioning device 4.
  • after the robot 1 moves the visual positioning device 4 to the sample 100 to be tested, the distance between the camera assembly 42 and the sample 100 is measured and compared with a preset distance. If the two are not equal, the control system continues to adjust the position of the camera assembly 42 via the robot 1 until the measured distance equals the preset distance, completing the positioning of the shooting position. The camera assembly 42 can then be started to photograph the surface of the sample 100 and acquire a sample image.
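The measure-compare-adjust loop above can be sketched as follows. This is a minimal illustration, not the patent's implementation: `measure_distance` and `move_toward_sample` are hypothetical callbacks standing in for the rangefinder reading and the robot motion command, and the tolerance value is assumed.

```python
def position_camera(measure_distance, move_toward_sample, preset_distance, tolerance=0.5):
    """Iteratively adjust the camera position until the measured distance
    to the sample (in mm) matches the preset shooting distance."""
    while True:
        d = measure_distance()          # e.g. laser rangefinder reading
        error = d - preset_distance
        if abs(error) <= tolerance:     # close enough: positioning complete
            return d
        move_toward_sample(error)       # positive error -> move closer
```

In practice the convergence of this loop depends on how the robot translates the error into motion; a proportional step is enough for a simulation.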
  • a preset shooting position may be set, and the preset shooting position is a fixed position preset according to factors such as the shooting distance and the shooting angle. Since the preset shooting position is fixed, a fixed motion trajectory can be generated according to the initial position of the robot 1 and the preset shooting position.
  • the control system 5 controls the robot 1 to move along the trajectory so that the visual positioning device 4 reaches the preset shooting position; the camera assembly 42 can then begin photographing the surface of the sample 100 to be tested, that is, the structured light source 41 is started so that the structured light it emits falls on the surface of the sample 100.
  • if the camera assembly 42 is a monocular camera, one frame of sample image is taken directly; if it is a binocular camera, the sample image includes the first image taken by the first camera 421 and the second image taken simultaneously by the second camera 422. There are one or more structured light fringes in the sample image.
  • the alloy analysis system also includes a voice device 6 for broadcasting the detection results of the alloy analyzer 3.
  • the voice device 6 is electrically connected to the control system 5.
  • the voice device 6 can be installed on the robot 1 or in any other scenario where voice prompts are required.
  • the essence of the re-inspection of this application is to pre-calculate the detection point set based on the structured light fringes in the sample image.
  • the detection point set includes the first detection point (that is, the global best detection point) and the candidate detection point.
  • the present application provides a re-inspection method of an alloy analysis system.
  • the re-inspection method is used in the alloy analysis system described above.
  • the method execution body is the control system 5, and the method includes:
  • Step S10: extract structured light stripes from the sample image.
  • after the camera assembly 42 takes a sample image, structured light stripes are extracted from it. If the camera assembly 42 uses a binocular camera, the first structured light stripe must be extracted from the first image and the second structured light stripe from the second image.
  • the sample image mainly consists of two parts: the dark sample background (the black part in the figure) and the structured light stripe (the white light stripe with deformation characteristics). Because the background and the stripe each have distinct characteristics and different gray levels, a gray threshold can be preset to distinguish and segment them.
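As a rough illustration of gray-threshold segmentation (an assumption about the implementation, not the patent's exact method), the sketch below keeps pixels at or above the threshold as stripe pixels and takes an intensity-weighted center v coordinate per image column:

```python
import numpy as np

def extract_stripe_centers(image, gray_threshold):
    """Segment the bright structured light stripe from the dark background by
    gray threshold, then take the per-column intensity-weighted center v of
    the stripe pixels. Returns a list of (u, v) center points."""
    centers = []
    for u in range(image.shape[1]):
        column = image[:, u]
        stripe = column >= gray_threshold     # stripe pixels are brighter
        if not stripe.any():
            continue                          # no stripe in this column
        v_coords = np.nonzero(stripe)[0]
        weights = column[stripe].astype(float)
        v_center = float(np.average(v_coords, weights=weights))
        centers.append((u, v_center))
    return centers
```

A real pipeline would typically smooth the image and reject outlier columns first; the weighted center simply gives sub-pixel v estimates.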
  • the number of structured light fringes in the sample image can be adjusted by the adjustment element in the camera assembly 42.
  • the sample image shown in FIG. 7 preferably has a single structured light stripe.
  • Step S20: calculate the first detection point and candidate detection points according to the structured light stripes.
  • the characteristic of structured light being deformed by the modulation of the sample surface is as follows: because the surface of the sample to be tested is uneven, the structured light projected onto it is phase-modulated, so that the more convex a part of the sample is, the lower the corresponding pixel of the light stripe; conversely, the more concave a part is, the higher the corresponding pixel. Based on this feature, the extracted structured light fringe can be used to analyse the unevenness of the sample surface, and the first detection point and the candidate detection points can be calculated and screened out.
  • step S20 is further configured as:
  • Step (A): in the image coordinate system uov, sort the center points of the sub-stripes in the structured light stripe by v coordinate value, and take the center point with the largest v coordinate value as the first most convex point.
  • here the number of structured light stripes equals 1.
  • the structured light stripe includes several segments of sub-stripes. The center point of each sub-stripe is obtained, and the center points are then sorted in increasing or decreasing order of v coordinate value; the center point with the largest v coordinate value is taken as the first most convex point, i.e. the global most convex point.
  • if the center points are sorted in increasing order of v coordinate value, the first most convex point is the last center point in the order; if they are sorted in decreasing order, it is the first.
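Step (A) amounts to selecting the center point with the maximum v coordinate; a minimal sketch over (u, v) center-point tuples:

```python
def first_most_convex_point(center_points):
    """center_points: list of (u, v) sub-stripe center points in the image
    coordinate system uov. Surface bulges modulate the stripe so that a more
    convex surface point yields a larger v coordinate; the global most convex
    point is therefore the center with the maximum v."""
    ordered = sorted(center_points, key=lambda p: p[1])  # ascending v
    return ordered[-1]                                    # largest v = most convex
```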
  • Step (B): using the first most convex point as the reference point, divide the structured light stripe into multiple intervals and search for candidate most convex points in the intervals other than those to which the first most convex point belongs; a candidate most convex point is the most convex center point within its interval.
  • starting from the first most convex point, interval nodes are calibrated at every preset step length, thereby dividing the structured light stripe into N+1 intervals, where N is the preset number of detection points in the detection point set, i.e. the number of the first detection point plus the candidate detection points equals N.
  • Figure 8 shows a simplified diagram of structured light stripe area division. Taking the preset number N equal to 3 as an example, with the first most convex point as the reference point, the preset step length is stepped off to the left and right along the u axis, calibrating a total of 4 nodes and dividing the structured light stripe into 4 intervals.
  • the first interval and the second interval are the intervals to which the first most convex point belongs. Since the first most convex point is the global most convex point, the most convex center point in those two intervals is the first most convex point itself; there is therefore no need to search those intervals for candidate detection points, and candidate most convex points are searched for only in the other intervals.
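A simplified sketch of step (B)'s interval search, under the assumption that the intervals are fixed-width bins of `step` pixels measured along the u axis from the first most convex point (the patent's node calibration is more elaborate):

```python
def candidate_most_convex_points(center_points, first_point, step):
    """Group the stripe center points into intervals of width `step` along
    the u axis, measured from the first most convex point, then return the
    most convex center (largest v) of every interval except the one
    containing the first most convex point itself."""
    u0, _ = first_point
    intervals = {}
    for p in center_points:
        k = (p[0] - u0) // step          # interval index relative to u0
        intervals.setdefault(k, []).append(p)
    candidates = []
    for k in sorted(intervals):
        best = max(intervals[k], key=lambda p: p[1])
        if best != first_point:          # the first point owns its interval
            candidates.append(best)
    return candidates
```

With N = 3 this yields up to three candidates alongside the first detection point, matching the preset size of the detection point set.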
  • under the monocular scheme, the preset step length is the preset multiple M times the average width, in the image coordinate system uov, of the sub-stripes of the structured light stripe; the preset multiple M is calculated from the total width of the structured light stripe and the number N+1 of intervals into which it is divided.
  • if the alloy analyzer cannot return valid data at the best detection point (i.e. the first detection point in this application), detection points close to the first detection point are likely not to meet the requirements either. The preset multiple and preset step length therefore keep the candidate detection points away from such unsuitable points, improving the reliability of the candidate detection points and thereby the efficiency and accuracy of the re-inspection.
  • Step (C): calculate the first detection point, to which the first most convex point maps in the world coordinate system XYZ, and the candidate detection points, to which the candidate most convex points map in the world coordinate system XYZ.
  • the world coordinate system XYZ is a coordinate system established in real-world space. After the image coordinate system uov and the world coordinate system XYZ are established, the mapping relationship between them can be obtained in advance from the imaging characteristics and shooting position of the camera; any pixel in the image coordinate system uov can then be mapped to the corresponding coordinate point in the world coordinate system XYZ. Through this mapping, the first detection point corresponding to the first most convex point, and the candidate detection point corresponding to each candidate most convex point, can be calculated in the world coordinate system XYZ.
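For illustration only: if the pre-calibrated uov-to-XYZ mapping is modeled as a planar homography H (an assumption made here for simplicity; a full calibration would use the camera intrinsics and extrinsics, and the binocular case would triangulate), applying the mapping to a pixel looks like:

```python
import numpy as np

def image_to_world(point_uv, H):
    """Map an image point (u, v) to world-plane coordinates using a 3x3
    homography H, as an illustrative stand-in for the calibrated mapping
    between the image coordinate system uov and the world coordinate
    system XYZ."""
    u, v = point_uv
    x, y, w = H @ np.array([u, v, 1.0])
    return (x / w, y / w)               # homogeneous -> Euclidean
```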
  • the structured light stripe includes a first structured light stripe extracted from a first image, and a second structured light stripe extracted from a second image
  • Step S20 is further configured as:
  • Step (D): in the image coordinate system uov, sort the center points of the sub-stripes in the first structured light stripe by v coordinate value to obtain the first center point set P1(u1m, v1m).
  • the first structured light stripe includes several segments of sub-stripes, and the center point of each sub-stripe is obtained separately. The resulting set of center points is named the first center point set P1(u1m, v1m), where 1 ≤ m ≤ Q and Q is the number of sub-stripes in the first structured light stripe.
  • Step (E): the center points of the sub-stripes in the second structured light stripe are sorted by v coordinate value to obtain the second center point set P2(u2n, v2n).
  • after the second structured light stripe is extracted, the center point of each of its sub-stripes is obtained, and the center points are sorted in increasing or decreasing order of v coordinate value. Note that the center points of the first and second structured light stripes must be sorted in the same way: if the first stripe's center points are sorted in increasing order of v coordinate value, the second stripe's center points must be as well.
  • after sorting, each center point in the second structured light stripe has a corresponding sequence number n, giving an ordered set named the second center point set P2(u2n, v2n), with 1 ≤ n ≤ S, where S is the number of sub-stripes in the second structured light stripe. The first and second structured light stripes may contain the same or different numbers of sub-stripes.
  • in theory, any pixel point (u1m, v1m) in the first center point set should have exactly one matching pixel point (u2n, v2n) in the second center point set, the two matching pixels corresponding to the same location point in actual space.
  • in practice, however, the same spatial location point may show a certain deviation between the images taken by the two cameras. For example, the pixel corresponding to spatial location point A may lie slightly to the left in the first image and slightly to the right in the second image. As a result, the pixel point (u1m, v1m) may fail to find any matching point in the second center point set, or may find two or more matching points; in either case the pixel point (u1m, v1m) obviously cannot participate in the calculation of the first most convex point, so as to ensure the accuracy of the detection point set.
  • Step (F): judge whether a pixel point (u2n, v2n) uniquely matching any pixel point (u1m, v1m) in the first center point set can be screened out of the second center point set. If not, execute step (G); otherwise, execute step (H).
  • Step (F) can be implemented with a threshold matching algorithm: in the image coordinate system uov, calculate the distance between the pixel point (u1m, v1m) in the first center point set and each pixel point in the second center point set, forming a distance value set. Only if exactly one value in the distance value set is less than or equal to the first threshold is it determined that a uniquely matching pixel point (u2n, v2n) has been screened out, i.e. the distance between (u2n, v2n) and (u1m, v1m) is less than or equal to the first threshold. If all distance values in the set are greater than the first threshold, or at least two distance values are less than or equal to it, it is determined that no uniquely matching pixel point (u2n, v2n) exists.
  • before calculating the distances, the first image and the second image can be imported into the same reference image coordinate system uov, which is equivalent to merging the two image coordinate systems and facilitates the calculation of the distance value set.
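The threshold matching test of step (F) can be sketched directly: a match is accepted only when exactly one point of the second center point set lies within the first threshold of the query point.

```python
import math

def unique_match(p1, second_set, threshold):
    """p1: an (u, v) point from the first center point set. Compute its
    distance to every point in the second center point set; a unique match
    exists only when exactly one distance is <= threshold. Returns the
    matching point, or None when zero or multiple candidates qualify."""
    within = [p2 for p2 in second_set if math.dist(p1, p2) <= threshold]
    return within[0] if len(within) == 1 else None
```

Returning `None` covers both failure modes the text describes: no candidate within the threshold, and an ambiguous match of two or more candidates.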
  • each sub-stripe in Figure 7 corresponds to a wire, and the wires may be distributed unevenly on the surface of the sample to be tested, so the more convex the center point of a sub-stripe, the more protruding the corresponding wire. Because the first detection point is the most convex point on the surface of the sample to be tested, this embodiment uses the heights of the more convex sub-stripes in the first and second structured light stripes to calculate the first threshold.
  • Calculate the first threshold: the first threshold is Min{H1, H2}, or (H1+H2)/2.
  • H1 is the average height, in the image coordinate system uov, of all sub-stripes in the first sub-stripe set. If the center points of the sub-stripes in the first structured light stripe are sorted by increasing v coordinate value, the more convex a center point, the later it is sorted in the first center point set, so the first sub-stripe set includes the sub-stripes corresponding to the center points sorted in the second half of the first center point set. If the center points of the sub-stripes in the first structured light stripe are sorted by decreasing v coordinate value, the more convex center points are sorted earlier in the first center point set, so the first sub-stripe set includes the sub-stripes corresponding to the center points sorted in the first half of the first center point set.
  • H2 is the average height, in the image coordinate system uov, of all sub-stripes in the second sub-stripe set, determined in the same way: if the center points of the sub-stripes in the second structured light stripe are sorted by increasing v coordinate value, the second sub-stripe set includes the sub-stripes corresponding to the center points in the second half of the second center point set; if they are sorted by decreasing v coordinate value, the second sub-stripe set includes the sub-stripes corresponding to the center points in the first half of the second center point set.
  • Calculating the first threshold in this way improves the accuracy of the threshold matching algorithm, and thus the accuracy of the calculation of the first detection point and the candidate detection points, yielding a more accurate detection point set and ensuring that detection is both accurate and efficient.
  • The first threshold is not limited to the values given in this embodiment and can also be set based on experience.
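The first-threshold calculation described above can be sketched in Python (a minimal illustration under assumed inputs: the lists of sub-stripe heights and the `use_min` switch between the two formulas are mine, not the patent's reference implementation):

```python
def first_threshold(heights1, heights2, use_min=True):
    """Compute the first threshold from the average heights (in the uov
    image coordinate system) of the sub-stripes in the more convex halves.

    heights1: heights of the sub-stripes in the first sub-stripe set
    heights2: heights of the sub-stripes in the second sub-stripe set
    Returns Min{H1, H2} by default, or (H1 + H2) / 2.
    """
    h1 = sum(heights1) / len(heights1)  # H1: average height, first set
    h2 = sum(heights2) / len(heights2)  # H2: average height, second set
    return min(h1, h2) if use_min else (h1 + h2) / 2
```

Either formula keeps the threshold on the order of a sub-stripe height, which is what makes the unique-match test meaningful.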
  • Step (G): the pixel point (u1m, v1m) is deleted from the first center point set. There are two cases: in one, all distance values in the distance value set corresponding to (u1m, v1m) are greater than the first threshold, that is, (u1m, v1m) has no matching point in the second center point set; in the other, at least two distance values in the set are less than or equal to the first threshold, that is, (u1m, v1m) has at least two matching points in the second center point set. In either case, (u1m, v1m) must be deleted from the first center point set so that it does not participate in the calculation of the first most convex point.
  • Step (H): the pixel point (u1m, v1m) is retained in the first center point set, and (u1m, v1m) and (u2n, v2n) are recorded as a matching pair. That is, each time a pixel point (u1m, v1m) with a unique matching point is found in the first center point set, it is retained, and (u1m, v1m) together with its matching pixel (u2n, v2n) forms a matching pair, so that the correspondence between the two matched pixels is recorded.
  • Step (I): traverse the first center point set according to the above screening method; when the screening is complete, the first target point set is obtained. That is, all pixels in the first center point set are screened according to steps (F) to (H), the pixels that do not meet the threshold matching condition are deleted, and the first target point set is obtained.
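Steps (F) through (I) amount to a unique-match filter. A minimal sketch (function and variable names are mine; the patent does not prescribe an implementation):

```python
import math

def filter_first_target_set(first_centers, second_centers, threshold):
    """Steps (F)-(I): keep only the points of the first center point set
    that have exactly one match (distance <= threshold) in the second
    center point set, and record each kept point with its match.

    first_centers / second_centers: lists of (u, v) pixel coordinates
    in the shared reference image coordinate system uov.
    Returns (first_target_set, matching_pairs).
    """
    target, pairs = [], []
    for p1 in first_centers:
        matches = [p2 for p2 in second_centers
                   if math.dist(p1, p2) <= threshold]
        if len(matches) == 1:   # step (H): unique match -> retain
            target.append(p1)
            pairs.append((p1, matches[0]))
        # step (G): zero or two-plus matches -> p1 is deleted
    return target, pairs
```

In the example below, (10, 0) is dropped for having two matches within the threshold and (20, 0) for having none, leaving only the uniquely matched point.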
  • Step (J): the pixel with the largest v coordinate value in the first target point set is taken as the first most convex point (u1, v1), and the first detection point is calculated according to the matching pair to which the first most convex point belongs and the mapping relationship between the image coordinate system uov and the world coordinate system XYZ.
  • That is, v1 = v1max, where v1max is the maximum v coordinate value among the pixels in the first target point set. The point (u1, v1) is the lowest point in the v-axis direction in the first structured light stripe, i.e., the first most convex point is the globally most convex point of the first structured light stripe. Since the pixels in the first target point set have already been sorted by v coordinate value, the first most convex point can be obtained directly: if they are sorted in ascending order of v, the pixel with the largest sequence number m in the first target point set is the first most convex point; conversely, if they are sorted in descending order of v, the pixel with the smallest sequence number m is the first most convex point.
  • The pixel point (u2, v2) that uniquely matches the first most convex point (u1, v1) can then be found from the previously recorded matching pairs.
  • This embodiment performs threshold matching mainly from the perspective of the first center point set. Of course, the judgment can also be made from the perspective of the second center point set, i.e., determining whether a pixel point (u1m, v1m) uniquely matching any pixel point (u2n, v2n) in the second center point set has been screened out of the first center point set. The two implementations are substantially equivalent and are not repeated here.
  • a triangulation ranging method based on binocular vision can be used to calculate the Z coordinate of the first detection point.
  • The Z coordinate of the first detection point can be calculated according to the following formula:

    Z = (f × Tx) / ((u1 − u2) × dx)

  • Tx is the horizontal distance (baseline) between the first camera 421 and the second camera 422.
  • u1 is the abscissa of the first most convex point in the image coordinate system uov.
  • u2 is the abscissa, in the image coordinate system uov, of the point uniquely matching the first most convex point.
  • f is the focal length of the first camera 421 and the second camera 422.
  • dx is an intrinsic parameter of the first camera 421 and the second camera 422 (the physical size of one pixel).
  • dx depends on the model of the first camera 421 and the second camera 422; once the camera model is determined, dx is determined.
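The binocular triangulation step can be sketched as follows (illustrative Python; the function name and the numeric values in the usage comment are assumptions, not values from the patent):

```python
def depth_from_disparity(u1, u2, f, t_x, dx):
    """Binocular triangulation: Z = (f * T_x) / ((u1 - u2) * dx).

    u1, u2: abscissas (pixels) of the matched most convex points in the
            first and second images, in the common coordinate system uov
    f:   focal length of the two cameras (e.g. in mm)
    t_x: horizontal baseline between camera 421 and camera 422 (mm)
    dx:  physical width of one pixel (mm/pixel), a camera intrinsic
    """
    disparity_mm = (u1 - u2) * dx   # pixel disparity converted to mm
    return f * t_x / disparity_mm
```

The smaller the disparity u1 − u2, the farther the point is from the camera pair, which is why the most protruding point of the sample yields the smallest Z.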
  • Step (K): using the first most convex point (u1, v1) as a reference point, divide the first structured light stripe into a plurality of intervals, and search for a candidate most convex point in each interval other than the interval to which the first most convex point belongs; the candidate most convex point is the most convex center point contained in the first target point set within that interval.
  • Taking the first most convex point as a reference, interval nodes are preset at a preset step along the stripe, so that the first structured light stripe is divided into N+1 intervals, where N represents the preset number of detection points in the detection point set, i.e., the number of the first detection point plus the candidate detection points is N.
  • The preset step here is M1 multiplied by the average width, in the image coordinate system uov, of the sub-stripes in the first structured light stripe, where M1 is the preset multiple corresponding to the binocular scheme; the preset multiple M1 is calculated from the total width of the first structured light stripe and the number of intervals, N+1, into which the first structured light stripe is divided.
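Since M1 is chosen from the total stripe width and the interval count, the division works out to N+1 roughly equal intervals; a simplified sketch under that assumption (equal spacing along the u axis is my simplification, not the patent's exact node placement):

```python
def divide_intervals(u_min, u_max, n_detection_points):
    """Split the first structured light stripe spanning [u_min, u_max]
    along the u axis into N + 1 intervals, where N is the preset number
    of detection points; returns the interval boundary coordinates.
    """
    n_intervals = n_detection_points + 1
    step = (u_max - u_min) / n_intervals   # preset step ~ M1 * avg width
    return [u_min + i * step for i in range(n_intervals + 1)]
```

Each returned pair of adjacent boundaries delimits one interval in which a candidate most convex point is searched (except the interval containing the first most convex point).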
  • In other words, the candidate most convex point must be a pixel point contained in the first target point set: one necessary condition for a candidate most convex point is that it has a unique matching point in the second center point set, and the second necessary condition is that it is the valid most convex center point within its interval.
  • In one implementation, take the intersection of the center points of the sub-stripes contained in the other interval with the first target point set, and then select from the intersection the point with the largest v coordinate value: that point is the candidate most convex point found in the interval. For example, suppose a certain interval other than the interval to which the first most convex point belongs contains 3 sub-stripes; the intersection of their center points with the first target point set yields center point 2 and center point 3 (i.e., center point 2 and center point 3 in the interval belong to the first target point set), and the v coordinate value of center point 3 is greater than that of center point 2; then center point 3 is the candidate most convex point in the interval.
  • Alternatively, the center points of the sub-stripes contained in another interval may be sorted in descending order of v coordinate value, and it is then judged whether the first-ranked center point belongs to the first target point set. If the judgment result is yes, the first-ranked center point is the candidate most convex point of the interval; if the judgment result is no, the center point with the largest v coordinate value has no unique matching point, i.e., it is an invalid most convex center point, and it is then judged whether the next center point in the sequence belongs to the first target point set. Continuing the example of the three sub-stripes with center point 1, center point 2 and center point 3: sorting by v coordinate value gives center point 1 > center point 3 > center point 2, but center point 1 does not belong to the first target point set, so it is next determined whether center point 3 belongs to the first target point set. The judgment result is that center point 3 does belong to the first target point set, i.e., center point 3 is the valid most convex center point in the interval, so center point 3 is the candidate most convex point of the interval.
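The first search variant (intersection, then max-v) can be sketched in a few lines (names are mine; a `None` return stands for an interval with no valid center point):

```python
def candidate_most_convex(interval_centers, first_target_set):
    """Return the candidate most convex point of one interval: the
    center point with the largest v coordinate among those that also
    belong to the first target point set, or None when the interval
    contains no valid (uniquely matched) center point.
    """
    valid = [c for c in interval_centers if c in first_target_set]
    return max(valid, key=lambda p: p[1]) if valid else None
```

With the example from the text — center point 1 excluded from the target set, center points 2 and 3 included — the function returns center point 3, the valid point with the largest v.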
  • The search logic for the candidate most convex point is not limited to that described in this embodiment.
  • The calculation of the candidate detection points can follow that of the first detection point; the only difference is that the first detection point is calculated using the matching pair to which the first most convex point belongs, while a candidate detection point is calculated using the matching pair to which the corresponding candidate most convex point belongs. This is not repeated here.
  • Step S30: Combine the first detection point and the candidate detection points into a detection point set according to the priority order.
  • In one implementation, the first detection point and the candidate detection points may be sorted in ascending order of Z coordinate value; the Z coordinate value is the distance between the camera component 42 and the detection point (i.e., the image depth), which indirectly indicates how far the robot 1 must travel along the Z axis for the probe 31 of the alloy analyzer 3 to touch the detection point.
  • The more protruding a detection point, the shorter the travel distance of the robot 1 along the Z axis. Equivalently, the first most convex point and the candidate most convex points may be sorted in descending order of v coordinate value, and the first detection point and the candidate detection points sorted correspondingly in that order.
  • For example, suppose the preset number N is equal to 3, and the first most convex point, candidate most convex point 1 and candidate most convex point 2 are calculated from the structured light stripes. If sorting in descending order of v coordinate value gives first most convex point > candidate most convex point 2 > candidate most convex point 1, then the priority order in the detection point set is first detection point > candidate detection point 2 > candidate detection point 1.
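Step S30's ordering rule can be sketched as a one-line sort by image depth (a sketch under my own data layout: each point is a `(name, z)` pair, smaller z meaning more protruding and hence higher priority):

```python
def build_detection_point_set(points):
    """Step S30: order the first detection point and the candidate
    detection points by priority.  Each point is (name, z), where z is
    the image depth (camera-to-point distance along the Z axis).
    Ascending z = descending protrusion = descending priority.
    """
    return [name for name, z in sorted(points, key=lambda p: p[1])]
```

Reproducing the example above, a point set where the first detection point is closest and candidate 2 is closer than candidate 1 sorts to first > candidate 2 > candidate 1.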
  • The control system 5 is also configured to perform the following program step: mark the first most convex point and the candidate most convex points in the sample image, as shown in the figure, in which two candidate most protruding points are marked, namely candidate most protruding point 1 and candidate most protruding point 2, so as to provide a reference for the user and help the user understand the specific location distribution of the first most convex point and the candidate most convex points.
  • Step S40: Control the robot to move the alloy analyzer so that the probe contacts the first detection point, and control the alloy analyzer to detect the first detection point.
  • According to the current position of the robot 1 and the three-dimensional coordinates of the first detection point, a first target motion trajectory of the robot 1 can be generated. The first target motion trajectory should be suitable for moving the probe 31 of the alloy analyzer 3 to the first detection point and ensuring that the probe 31 is in full contact with the first detection point; the control system 5 then starts the alloy analyzer 3, which detects and analyzes the alloy at the first detection point and feeds the detection result back to the control system 5.
  • Step S50: it is judged whether valid detection data fed back by the alloy analyzer has been received. If the detection data is determined to be invalid (for example, the data is all 0), or the control system 5 receives no detection result feedback from the alloy analyzer, then step S60 is executed and a re-inspection is initiated; conversely, if the control system 5 receives valid detection data fed back by the alloy analyzer, step S70 is executed.
  • Step S60: Control the alloy analyzer to re-inspect the candidate detection points in sequence according to the priority order in the detection point set; when valid detection data is received, the re-inspection process ends.
  • For example, suppose the detection point set also includes candidate detection point 1, candidate detection point 2 and candidate detection point 3, and the priority order of these three candidate detection points is candidate detection point 1 > candidate detection point 2 > candidate detection point 3. When valid detection data is not obtained at the first detection point, a second target motion trajectory of the robot 1 is generated according to the current position of the robot 1 and the three-dimensional coordinates of candidate detection point 1, and the robot 1 is controlled to move along the second target motion trajectory until the probe 31 contacts candidate detection point 1; the alloy analyzer then detects candidate detection point 1, and the control system executes step S50 again. If valid detection data is received, the re-inspection process ends and step S70 is executed; if valid detection data is not obtained at candidate detection point 1, candidate detection point 2 is re-inspected next, and so on, until detection succeeds and the re-inspection process ends.
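The retry loop of steps S40 through S60 reduces to iterating the priority-ordered point set until one reading is valid. A sketch (the `detect` callable is a stand-in I introduce for moving the robot and querying the alloy analyzer; it is not an API from the patent):

```python
def run_with_reinspection(detection_points, detect):
    """Steps S40-S60: probe the detection points in priority order
    until `detect` returns valid data.  `detect(point)` abstracts
    moving the probe to the point and reading the alloy analyzer,
    returning the detection data, or None when the reading is invalid.
    """
    for point in detection_points:     # first point, then candidates
        data = detect(point)
        if data:                       # valid feedback: stop retrying
            return point, data
    return None, None                  # every point failed
```

A usage example: with the first detection point failing and candidate 1 succeeding, the loop stops at candidate 1 and never visits candidate 2.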
  • Step S70: Control the voice device to broadcast the prompt information corresponding to the valid detection data, and control the robot to return to the initial position.
  • Specifically, the control system 5 controls the voice device 6 to broadcast the prompt information corresponding to the valid detection data, so that on-site personnel are informed whether the sample to be tested 100 is qualified.
  • The initial position is the parking position of the robot 1 when no alloy analysis test is being performed; after the alloy analysis is completed, the robot 1 is reset to it.
  • The prompt information can be preset in the voice device 6. As an example, the prompt information can be set to indicate that a certain sample to be tested passed or failed the test. The specific content of the prompt information can be set according to actual conditions, which is not limited in this embodiment.
  • The control system 5 can adopt a combined control structure of a PLC controller 501 plus a computer 502.
  • The PLC controller 501 is electrically connected to the computer 502; the camera assembly 42 and the alloy analyzer 3 are each electrically connected to the computer 502; and the robot 1, the structured light source 41 and the voice device 6 are each electrically connected to the PLC controller 501.
  • the detection point set can be stored in the database of the computer 502, so as to facilitate the call of candidate detection points for re-inspection.
  • the control flow of the cooperation between the PLC controller 501 and the computer 502 is as follows:
  • After the computer 502 receives a signal that a sample is to be detected, it sends a first control instruction to the PLC controller 501, and the PLC controller 501, in response to the first control instruction, controls the robot 1 to move to the preset shooting position. When the robot 1 reaches the preset shooting position, it feeds back a first in-position signal to the PLC controller 501, and the PLC controller 501 transmits the first in-position signal to the computer 502. On receiving the first in-position signal, the computer 502 sends second control instructions to the first camera 421 and the second camera 422, controlling the first camera 421 to take the first image and, at the same time, the second camera 422 to take the second image; the first camera 421 and the second camera 422 each transmit the captured image to the image processing system in the computer 502.
  • After the image processing system calculates the detection point set according to the aforementioned method, it stores the detection point set in the database of the computer 502 and sends the three-dimensional coordinates of the first detection point to the PLC controller 501; the PLC controller 501 generates a third control instruction according to the three-dimensional coordinates of the first detection point and sends it to the robot 1; the robot 1 responds to the third control instruction by moving the probe 31 of the alloy analyzer 3 to the first detection point.
  • When the robot 1 arrives at the first detection point, it feeds back a second in-position signal to the PLC controller 501, and the PLC controller 501 transmits the second in-position signal to the computer 502; the computer 502 receives the second in-position signal and sends a fourth control instruction to the alloy analyzer 3. After the alloy analyzer 3 receives and responds to the fourth control instruction, it starts detection of the first detection point and sends the detection result to the computer 502.
  • The computer 502 judges whether it has received valid detection data for the first detection point. If it does not receive valid detection data fed back by the alloy analyzer, it sends the three-dimensional coordinates of the candidate detection point next in sequence after the first detection point to the PLC controller 501, which controls the robot 1 to move the probe 31 to that candidate detection point accordingly.
  • When the robot 1 arrives at the candidate detection point, it feeds back a third in-position signal to the PLC controller 501, and the PLC controller 501 transmits the third in-position signal to the computer 502; the computer 502 receives the third in-position signal and sends a sixth control instruction to the alloy analyzer 3. After the alloy analyzer 3 receives and responds to the sixth control instruction, it starts detection of the candidate detection point and sends the detection result to the computer 502, and the computer judges again.
  • When the computer 502 receives valid detection data from the alloy analyzer for the first detection point, or the re-inspection process ends, it sends a confirmation signal to the PLC controller 501; the PLC controller 501 receives the confirmation signal and sends a seventh control instruction to the robot 1 to control the robot 1 to return to the initial position; at the same time, the PLC controller 501 sends an eighth control instruction to the voice device 6 to control the voice device 6 to broadcast the prompt information corresponding to the valid detection data.
  • the control system 5 may be configured with functions such as a control program and an image processing system, and the specific hardware form of the control system 5 is not limited to that described in this embodiment.
  • the robot 1 can be an ABB IRB4600 robot
  • the structured light source 41 can be an OPT-SL10B structured light source
  • the alloy analyzer 3 can be a Niton XL2980 alloy analyzer
  • the camera in the camera assembly 42 can be an AVT Mako G-192B industrial camera.
  • In summary, the detection point set includes the first detection point (that is, the globally optimal detection point) and the candidate detection points. When valid detection data is not obtained at the first detection point, the candidate detection points are re-inspected in order of priority from high to low until valid detection data fed back by the alloy analyzer is received, whereupon the re-inspection ends. This shortens the working cycle and makes the alloy analysis system better suited to fast detection scenarios.

Abstract

Disclosed in the present application are an alloy analysis system and a corresponding re-inspection method. The alloy analysis system comprises: a robot, an alloy analyzer, a structured light source connected to the robot, and a camera assembly used to capture a sample image. A control system is configured to: extract structured light stripes from a sample image; calculate a first detection point and a candidate detection point according to the structured light stripes; combine the first detection point and the candidate detection point into a detection point set according to a priority order; control the robot to move the alloy analyzer so that a probe contacts the first detection point, and control the alloy analyzer to test the first detection point; and, in response to not receiving valid detection data fed back by the alloy analyzer, control the alloy analyzer to sequentially re-test the candidate detection points according to the priority order of the detection point set, from high to low, until valid detection data is received, and end the re-inspection process. The present application improves the detection efficiency of the alloy analysis system and shortens the working cycle, making the alloy analysis system better suited to fast detection scenarios.
PCT/CN2020/128643 2020-05-27 2020-11-13 Système d'analyse d'alliage et procédé de nouveau test correspondant WO2021238095A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010457852.9 2020-05-27
CN202010457852.9A CN111562262B (zh) 2020-05-27 2020-05-27 一种合金分析系统及其复检方法

Publications (1)

Publication Number Publication Date
WO2021238095A1 true WO2021238095A1 (fr) 2021-12-02

Family

ID=72074905

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/128643 WO2021238095A1 (fr) 2020-05-27 2020-11-13 Système d'analyse d'alliage et procédé de nouveau test correspondant

Country Status (2)

Country Link
CN (1) CN111562262B (fr)
WO (1) WO2021238095A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111562262B (zh) * 2020-05-27 2020-10-13 江苏金恒信息科技股份有限公司 一种合金分析系统及其复检方法
CN114160467A (zh) * 2021-11-30 2022-03-11 湖南思博睿智能装备有限公司 基于视觉检测的滤芯清洗方法

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2089498A (en) * 1980-12-15 1982-06-23 Ramsey Eng Co Apparatus for Analysis by X-ray Fluorescence
US20040155208A1 (en) * 2002-12-13 2004-08-12 Takahiro Ikeda Method of selecting pattern to be measured, pattern inspection method, manufacturing method of semiconductor device, program, and pattern inspection apparatus
CN101319875A (zh) * 2008-07-23 2008-12-10 天津大学 基于视觉注意机制的气体泄漏源搜寻方法
CN110441336A (zh) * 2019-08-27 2019-11-12 江苏金恒信息科技股份有限公司 一种合金分析仪柔性接触装置
CN110455802A (zh) * 2019-08-27 2019-11-15 江苏金恒信息科技股份有限公司 基于视觉识别的合金分析装置及方法
CN110567963A (zh) * 2019-11-06 2019-12-13 江苏金恒信息科技股份有限公司 合金分析视觉定位方法、装置及合金分析系统
CN111562262A (zh) * 2020-05-27 2020-08-21 江苏金恒信息科技股份有限公司 一种合金分析系统及其复检方法

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4632777B2 (ja) * 2004-12-27 2011-02-16 オリンパス株式会社 欠陥検査装置及び欠陥検査方法
KR100919166B1 (ko) * 2007-04-02 2009-09-28 한국과학기술원 스테레오 모아레를 이용한 3차원 형상 측정장치 및 그 방법
SG164292A1 (en) * 2009-01-13 2010-09-29 Semiconductor Technologies & Instruments Pte System and method for inspecting a wafer
CN102221553B (zh) * 2011-03-25 2013-05-01 上海交通大学 基于结构光的铁路扣件高速识别方法
CN102879405B (zh) * 2012-09-29 2014-09-24 肇庆中导光电设备有限公司 紧凑型的检测台以及应用该检测台的检测方法
CN106872476A (zh) * 2017-03-31 2017-06-20 武汉理工大学 一种基于线结构光的铸造类工件表面质量检测方法与系统
CN110057755A (zh) * 2019-05-24 2019-07-26 广东工业大学 一种复合光学检测仪

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2089498A (en) * 1980-12-15 1982-06-23 Ramsey Eng Co Apparatus for Analysis by X-ray Fluorescence
US20040155208A1 (en) * 2002-12-13 2004-08-12 Takahiro Ikeda Method of selecting pattern to be measured, pattern inspection method, manufacturing method of semiconductor device, program, and pattern inspection apparatus
CN101319875A (zh) * 2008-07-23 2008-12-10 天津大学 基于视觉注意机制的气体泄漏源搜寻方法
CN110441336A (zh) * 2019-08-27 2019-11-12 江苏金恒信息科技股份有限公司 一种合金分析仪柔性接触装置
CN110455802A (zh) * 2019-08-27 2019-11-15 江苏金恒信息科技股份有限公司 基于视觉识别的合金分析装置及方法
CN110567963A (zh) * 2019-11-06 2019-12-13 江苏金恒信息科技股份有限公司 合金分析视觉定位方法、装置及合金分析系统
CN111562262A (zh) * 2020-05-27 2020-08-21 江苏金恒信息科技股份有限公司 一种合金分析系统及其复检方法

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PIOREK STANISLAW: "Field-Portable X-Ray Fluorescence Spectrometry: Past, Present, and Future", FIELD ANALYTICAL CHEMISTRY AND TECHNOLOGY, WILEY, NEW YORK, NY,, US, vol. 1, no. 6, 31 December 1997 (1997-12-31), US , pages 317 - 329, XP055872190, ISSN: 1086-900X, DOI: 10.1002/(SICI)1520-6521(199712)1:6<317::AID-FACT2>3.0.CO;2-N *
WEN LIANGBI, BAI RONGSHEN YANG MINGZHONG ZHOU XINZHI HUANG YANWEN JIN JIANLAN: "The Study of Portable High Resolution Alloy Analyzer", JOURNAL OF SICHUAN UNIVERSITY (NATURAL SCIENCE EDITION), vol. 33, no. 6, 31 December 1996 (1996-12-31), XP055872192 *

Also Published As

Publication number Publication date
CN111562262B (zh) 2020-10-13
CN111562262A (zh) 2020-08-21

Similar Documents

Publication Publication Date Title
Sochor et al. Comprehensive data set for automatic single camera visual speed measurement
WO2021238095A1 (fr) Système d&#39;analyse d&#39;alliage et procédé de nouveau test correspondant
WO2021088247A1 (fr) Procédé et appareil de positionnement visuel d&#39;analyse d&#39;alliage, et système d&#39;analyse d&#39;alliage
CN110064819A (zh) 基于结构光的柱面纵向焊缝特征区域提取、焊缝跟踪方法及系统
KR20070088318A (ko) 3차원 계측을 행하는 화상 처리 장치 및 화상 처리 방법
CN113674345B (zh) 一种二维像素级三维定位系统及定位方法
KR100695945B1 (ko) 용접선 위치 추적 시스템 및 그 위치 추적 방법
CN102788572B (zh) 一种工程机械吊钩姿态的测量方法、装置及系统
CN110136186B (zh) 一种用于移动机器人目标测距的检测目标匹配方法
CN102455171A (zh) 一种激光拼焊焊缝背面几何形貌检测方法及其实现装置
CN113134683A (zh) 基于机器学习的激光标刻方法及装置
CN114140439A (zh) 基于深度学习的激光焊接焊缝特征点识别方法及装置
Sochor et al. Brnocompspeed: Review of traffic camera calibration and comprehensive dataset for monocular speed measurement
CN115619738A (zh) 一种模组侧缝焊焊后检测方法
CN105023270A (zh) 用于地下基础设施结构监测的主动式3d立体全景视觉传感器
CN111397529A (zh) 一种基于双目视觉结构光的复杂表面形状检测方法
CN111754560B (zh) 基于稠密三维重建的高温冶炼容器侵蚀预警方法及系统
CN116393982B (zh) 一种基于机器视觉的螺丝锁付方法及装置
CN111272756B (zh) 一种合金分析系统
CN113506292B (zh) 一种基于位移场的结构物表面裂纹检测和提取方法
CN109791038B (zh) 台阶大小及镀金属厚度的光学测量
JP2006208187A (ja) 形状良否判定装置および形状良否判定方法
CN105526993A (zh) 机器视觉料位计及其测量料位的方法
CN115014205A (zh) 塔盘的视觉检测方法、检测系统及其引导自动化焊接系统
CN114239995A (zh) 全区域巡航路线生成方法、系统、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20937510

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20937510

Country of ref document: EP

Kind code of ref document: A1