CN115971635A - Friction stir welding control method and system based on visual sensing and machine tool


Info

Publication number: CN115971635A
Authority: CN (China)
Prior art keywords: welding, welded, image, camera, virtual
Legal status: Pending
Application number: CN202210919564.XA
Other languages: Chinese (zh)
Inventors: 韩坤, 蔡鑫, 孙家阔
Current Assignee: Ningbo Qiyun New Material Technology Co., Ltd.
Original Assignee: Ningbo Qiyun New Material Technology Co., Ltd.
Application filed by Ningbo Qiyun New Material Technology Co., Ltd.
Priority to CN202210919564.XA
Publication of CN115971635A

Landscapes

  • Pressure Welding/Diffusion-Bonding (AREA)

Abstract

The application provides a friction stir welding control method, a friction stir welding control system and a machine tool based on visual sensing. The control method comprises the following steps: acquiring, through a first camera, a first image containing a part to be welded on a welding platform; acquiring a first relative coordinate of a second mark point relative to a first mark point based on the first image, wherein the second mark point is arranged on the part to be welded and the first mark point is arranged on the welding platform; controlling a second camera to move, based on the first relative coordinate, to a position where the part to be welded can be completely photographed; acquiring a second image through the second camera, and acquiring weld information of the part to be welded based on the second image; planning a virtual welding path relative to the second mark point based on the weld information; planning an actual welding path based on the virtual welding path and the first relative coordinate; and controlling a stirring pin to friction stir weld the part to be welded along the actual welding path. The method especially solves the technical problem of low efficiency in continuous production of different types of parts to be welded.

Description

Friction stir welding control method and system based on visual sensing and machine tool
Technical Field
The application relates to the technical field of friction stir welding, in particular to a friction stir welding control method and system based on visual sensing and a machine tool.
Background
Friction Stir Welding (FSW) is a solid-phase joining technique. Unlike conventional fusion welding, friction stir welding uses a stirring head rotating at high speed that extends into the weld seam of the parts to be welded and generates heat by friction against them, so that the material at the weld seam is heated and softened and the parts are joined. Friction stir welding is widely applied in the 3C field owing to its advantages of low cost, small welding deformation, high quality and high welding efficiency.
In the prior art, friction stir welding requires the weld features of a workpiece to be extracted in advance, a welding path to be set manually, and the three-dimensional model of the part to be welded together with the set welding path to be imported into the friction stir welding machine tool, so the machining efficiency is low. Moreover, one welding machine tool has to machine parts to be welded of many different models; for each model, a welding path needs to be set manually and a model needs to be imported, which further lowers the machining efficiency. In addition, before welding, different parts to be welded need to be placed at corresponding predetermined positions, which makes the machining efficiency extremely low.
Disclosure of Invention
The application provides a friction stir welding control method, a friction stir welding control system and a machine tool based on visual sensing, and aims to solve the technical problem that friction stir welding in the prior art is low in efficiency, and especially solve the technical problem that the efficiency is low when different types of to-be-welded parts are continuously produced.
On one hand, the application provides a friction stir welding control method based on visual sensing, which is used for controlling the movement of a friction stir welding machine head, wherein a stirring needle is arranged on the friction stir welding machine head; the control method comprises the following steps: collecting a first image through a first camera;
acquiring a first relative coordinate of the second mark point relative to the first mark point and acquiring size information of a part to be welded based on the first image; the second mark point is arranged on the part to be welded, and the first mark point is arranged on the welding platform;
controlling a second camera to move to a position capable of completely shooting the part to be welded based on the first relative coordinate and the size information;
controlling the second camera to collect a second image, and acquiring the welding seam information of the part to be welded based on the second image;
planning a virtual welding path relative to a second mark point based on the welding seam information;
planning an actual welding path based on the virtual welding path and the first relative coordinates;
and controlling the stirring needle to stretch into the welding seam of the to-be-welded part, and carrying out stirring friction welding on the to-be-welded part along the actual welding path.
Optionally, at least three second mark points are determined according to the shape of the weld seam of the part to be welded; the at least three second mark points are arranged around the weld seam and define at least one virtual triangle. The step of acquiring a first relative coordinate of the second mark point with respect to the first mark point comprises: acquiring, based on the first image, three first relative coordinates of the at least three second mark points relative to the first mark point, and determining the geometric center coordinate of one of the at least one virtual triangle relative to the first mark point based on the three first relative coordinates. The step of controlling the second camera to move to a position where the second camera can completely photograph the part to be welded comprises: controlling the second camera to move above the geometric center coordinate and zooming according to the size information to acquire the second image, wherein the acquired second image completely contains the weld seam and at least two of the second mark points.
Optionally, the step of planning a virtual welding path relative to the second marker point based on the weld information includes: randomly selecting one of at least three second mark points as a main reference point; randomly selecting one of at least three second mark points as an auxiliary reference point, wherein the main reference point and the auxiliary reference point are respectively positioned on two sides of the welding seam; identifying the welding seam through an image identification technology to obtain a welding seam track; associating the welding seam track with the main reference point to obtain a first virtual welding path; associating the welding seam track with the auxiliary reference point to obtain a second virtual welding path; transforming the second virtual welding path to a third virtual welding path based on the relative positions of the secondary reference point and the primary reference point; judging the similarity between the third virtual welding path and the first virtual welding path, and if the similarity is greater than a preset parameter, taking the first virtual welding path as the virtual welding path; if the similarity is smaller than a preset parameter, at least one of resolution, brightness and/or focal length of the second camera is adjusted and/or the second camera is controlled to move close to the welding workpiece or move away from the welding workpiece in a direction perpendicular to the welding platform so as to obtain the second image again, the steps are repeated until the similarity is larger than or equal to the preset parameter, and a first virtual welding path determined finally is used as the virtual welding path.
Optionally, the step of identifying the weld seam through the image recognition technique comprises image preprocessing, weld seam identification and weld seam segmentation.
Optionally, the step of planning an actual welding path based on the virtual welding path and the first relative coordinates includes: transforming the first virtual welding path satisfying a condition into an actual welding path referring to the first mark point based on first relative coordinates of the main reference point and the first mark point.
Optionally, the step of acquiring, based on the first image, first relative coordinates of the second marker point with respect to the first marker point, and acquiring size information of a to-be-welded part specifically includes: determining whether the workpiece to be welded exists on the welding platform or not based on the first image and a preset image; if so, preprocessing the first image to improve the resolution of the first image; and identifying the workpiece to be welded and the second mark point based on the first image with the improved resolution so as to obtain a first relative coordinate of the second mark point relative to the first mark point and size information of the workpiece to be welded.
Optionally, the step of controlling the stirring pin to extend into the welding seam of the part to be welded and performing friction stir welding on the part to be welded along the actual welding path comprises: identifying a label of the part to be welded based on the size information; acquiring the rotation speed and the moving speed of the stirring pin based on the label; randomly selecting one end of the actual welding path as a starting point, controlling the stirring pin to move right above the starting point according to the relative position of the starting point and the first mark point, controlling the stirring pin to extend into a welding seam of the part to be welded, and driving the stirring pin to perform friction stir welding on the part to be welded along the actual welding path according to the rotating speed and the moving speed.
The present application further proposes a machine tool comprising:
a frame;
the welding platform is connected with the rack and used for placing a part to be welded; a first mark point is arranged on the welding platform;
a first camera fixed to the frame for capturing a first image containing a part to be welded on a welding platform; wherein a second mark point is arranged on the part to be welded;
a second camera movably disposed relative to the gantry;
the friction stir welding machine head is movably arranged on the rack and is provided with a stirring needle; and
a processor electrically connected to the first camera, the second camera and the friction stir welding head, respectively; the processor is configured to execute the friction stir welding control method based on visual sensing as described above.
Optionally, a shaft shoulder capable of rotating is arranged on the friction stir welding head, and the stirring pin is connected with the shaft shoulder; when the stirring pin extends to a welding seam of the part to be welded, the shaft shoulder is abutted against the surface of the part to be welded; the shaft shoulder is provided with a groove, and an opening of the groove is formed in the surface; the groove extends from the connection of the shoulder and the stirring pin to the peripheral wall of the shoulder in the radial direction of the shoulder, and the groove depth of the groove increases gradually in the extending direction thereof.
The application also provides a friction stir welding control system based on visual sensing, comprising: one or more processors; a memory; and one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors to implement the friction stir welding control method based on visual sensing as previously described.
According to the above technical scheme, the first relative coordinate of the second mark point relative to the first mark point is obtained from the first image captured by the first camera; the second camera is controlled, based on the first relative coordinate, to move to a position where the part to be welded can be completely photographed, so that the complete weld information of the part to be welded can be collected; a virtual welding path relative to the second mark point is planned based on the weld information; an actual welding path is obtained based on the first mark point and the virtual welding path; and the stirring pin is controlled to extend into the weld seam of the part to be welded and to friction stir weld the part to be welded along the actual welding path. In the technical scheme of the embodiments of the application, whatever the shape, size and placement position of the part to be welded, the weld information can be accurately extracted, and an actual welding path referenced to the welding platform (the first mark point) can be established by recognizing the weld information, so that the stirring pin can be controlled to perform friction stir welding. The operator does not need to deliberately place the part to be welded at a preset position, and the weld gap does not need to be modelled in advance, so that the welding efficiency is improved and continuous production is facilitated.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic flow chart diagram illustrating one embodiment of a visual sensing-based friction stir welding control method provided by an embodiment of the present application;
FIG. 2 is a schematic flow chart diagram illustrating another embodiment of a visual sensing-based friction stir welding control method provided by an embodiment of the present application;
FIG. 3 is a flowchart illustrating an embodiment of step S500 of a friction stir welding control method based on visual sensing provided in an embodiment of the present application;
fig. 4 is a flowchart illustrating step S600 of the friction stir welding control method based on visual sensing in the embodiment of the present application;
fig. 5 is a schematic flowchart of step S200 of the friction stir welding control method based on visual sensing in the embodiment of the present application;
fig. 6 is a schematic flowchart of step S700 of the friction stir welding control method based on visual sensing in the embodiment of the present application;
FIG. 7 is a schematic plan view of a part to be welded placed on the welding platform according to an embodiment of the present disclosure;
FIG. 8 is another schematic plan view of a part to be welded placed on the welding platform according to an embodiment of the present disclosure;
FIG. 9 is a schematic plan view of another workpiece to be welded placed on the welding platform according to the embodiment of the present disclosure;
FIG. 10 is a schematic view of a machine tool according to an embodiment of the present application;
FIG. 11 is a schematic view of a shoulder and pin according to an embodiment of the present disclosure;
fig. 12 is another schematic view of the shoulder and the pin according to the embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc. indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience and simplicity of description; they do not indicate or imply that the device or element being referred to must have a particular orientation or be constructed and operated in a particular orientation, and thus should not be considered as limiting the present invention. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In this application, the word "exemplary" is used to mean "serving as an example, instance, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments. The following description is presented to enable any person skilled in the art to make and use the invention. In the following description, details are set forth for the purpose of explanation. It will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known structures and processes are not shown in detail to avoid obscuring the description of the invention with unnecessary detail. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
The embodiments of the present application provide a friction stir welding control method and system based on visual sensing, and a machine tool, which are described in detail below.
The friction stir welding control method based on visual sensing is used for controlling the movement of the friction stir welding head of a friction stir welding machine tool. The friction stir welding head is provided with a stirring pin. Specifically, the control method controls the stirring pin to extend into the weld seam of the part to be welded and to friction stir weld the part to be welded along the actual welding path, so that the stirring pin rubs against the part to be welded, and the material around the weld seam is softened and fills the weld seam to join the part to be welded into a whole.
In the prior art, visual sensing technology is used for identifying a weld seam and welding the parts to be welded into a whole, generally as follows: the weld features of the parts to be welded are extracted in advance, a welding path is set manually, and the three-dimensional model of the parts to be welded and the set welding path are imported into the friction stir welding machine tool; in this method, the parts to be welded need to be placed at preset positions so as to be aligned with a preset coordinate system. However, since the weld gap in friction stir welding is small, the deviation of the placement position of the parts to be welded needs to be strictly controlled; yet the types and sizes of the parts to be welded handled by one machine tool vary, so that a large amount of time is needed to position parts to be welded of different types and sizes, the machining efficiency is low, and continuous machining is hindered.
Therefore, as shown in fig. 1, a schematic flowchart of an embodiment of the friction stir welding control method based on visual sensing in the embodiment of the present application is shown. The control method comprises the following steps:
s100, acquiring a first image by a first camera 40;
the first camera 40 is fixed on the machine tool. Generally, the first camera 40 is a wide-angle camera, which has a view angle at least covering the welding platform 10, so as to be able to capture a first image of the welding platform 10 on which the weldment 20 is placed.
S200, acquiring a first relative coordinate of the second mark point P2 relative to the first mark point P1 based on the first image. The second mark point P2 is arranged on the part to be welded 20, and the first mark point P1 is arranged on the welding platform 10.
The first image is analyzed to identify the first mark point P1 and the second mark point P2. For the same machine tool, the welding platform 10, and therefore the first mark point P1, is fixed, so the coordinates of the first mark point P1 can be stored in the system beforehand. The first relative coordinate of the second mark point P2 relative to the first mark point P1 is then obtained from the relative positions of the first mark point P1 and the second mark point P2 and the coordinate value of the first mark point P1. For example, if the system marks the coordinates of the first mark point P1 as (x₁, y₁, z₁) and the coordinates of the second mark point P2 as (x₂, y₂, z₂), then the first relative coordinate of the second mark point P2 with respect to the first mark point P1 is (x₂-x₁, y₂-y₁, z₂-z₁).
And S300, controlling the second camera 50 to move to a position capable of completely shooting the to-be-welded part 20 based on the first relative coordinate.
In order to effectively improve the processing efficiency, the placement of the to-be-welded members 20 on the welding platform 10 is arbitrary, and there is no need to place the to-be-welded members 20 at a predetermined position. For this reason, in the control method of the present application, the second camera 50 is movably disposed on the machine tool to clearly photograph the to-be-welded part 20, so as to extract the to-be-welded seam information for planning the welding path. The initial position of the second camera 50 is preset, and thus the initial position thereof is determined with respect to the first mark point P1. For this reason, the second camera 50 is controlled to move to a position where the welding part 20 to be welded can be completely photographed, based on the first relative coordinates. The second camera 50 may be driven by a motion mechanism such as a hydraulic mechanism, an electric mechanism, or a rack and pinion mechanism.
Moreover, the weld gap of the part to be welded 20 is small and difficult to recognize, and the weld information obtained from the first image acquired by the first camera 40 is not accurate enough, so a second image with higher resolution is acquired by the movable second camera 50 for extracting the weld information. That is: the first image is a global image and the second image is a local image.
S400, controlling the second camera 50 to acquire a second image, and acquiring the welding seam information of the part to be welded 20 based on the second image;
when the second camera 50 moves to a position where the work to be welded 20 can be photographed, a second image is captured by the second camera 50. And through the identification of the second image, obtaining the welding seam information of the 20 pieces to be welded. The weld information includes the width of the weld gap and the orientation of the weld gap.
S500, planning a virtual welding path relative to a second mark point P2 based on the welding seam information;
and establishing a second coordinate system by taking the second mark point P2 as an origin, and establishing a track equation of the welding seam gap based on the width and the trend of the welding seam gap in the second coordinate system. The trajectory equation is the virtual welding path. S600, planning an actual welding path based on the virtual welding path and the first relative coordinate;
and under the condition of establishing a first coordinate by taking the first mark point P1 as an origin, converting a track equation under a second coordinate system into the first coordinate based on the first relative coordinate, and further constructing another track equation, namely the actual welding path.
And S700, controlling the stirring pin 80 to extend into the welding seam of the to-be-welded part 20, and carrying out friction stir welding on the to-be-welded part 20 along the actual welding path.
In the machine tool, the initial position of the stirring pin 80 is a definite value, and this initial position is also definite relative to the first mark point P1. Therefore, once the actual welding path is determined, the stirring pin 80 is controlled to extend into the weld gap of the part to be welded 20 and to move along the actual welding path to friction stir weld the part to be welded 20 until the entire weld gap is welded.
In the technical scheme of the embodiment of the application, the first relative coordinate of the second mark point P2 relative to the first mark point P1 is obtained from the first image captured by the first camera 40; based on the first relative coordinate, the second camera 50 is controlled to move to a position where the part to be welded 20 can be completely photographed, so that the complete weld information of the part to be welded 20 can be collected; based on the weld information, a virtual welding path relative to the second mark point P2 is planned; and an actual welding path is obtained based on the first mark point P1 and the virtual welding path, so that the stirring pin 80 is controlled to extend into the weld seam of the part to be welded 20 and to friction stir weld the part to be welded 20 along the actual welding path. In this scheme, whatever the shape, size and placement position of the part to be welded 20, the weld information can be accurately extracted, and an actual welding path referenced to the welding platform 10 can be established by recognizing the weld information, so that the stirring pin 80 can be controlled to perform the welding. The operator does not need to deliberately place the part to be welded 20 at a preset position, and the weld gap does not need to be modelled in advance, so that the welding efficiency is improved and continuous production is facilitated.
In the prior art, modelling welds by image recognition has become a common approach. A weld seam is a long and narrow structure and places very high requirements on resolution and viewing angle. In the prior art, however, a fixed camera is used to photograph the part to be welded 20 to obtain an image for planning the welding path; when facing parts to be welded 20 of different sizes (as shown in fig. 7 and 9), shapes (as shown in fig. 7 and 9), types and placement positions (as shown in fig. 7 and 8), the quality of the images finally obtained for planning the path is hard to keep consistent because the shooting view angle of the fixed camera is fixed, so the systematic error is large and the welding quality cannot be ensured. Therefore, in order to reduce the systematic error and improve the welding quality, that is, in order to obtain second images of similar quality (so that the accuracy of the weld information obtained from the second image is reliable), in the embodiment of the present application the shooting position of the second camera 50 is controlled according to the first image (global image) captured by the fixed camera, so that the second camera 50 moves to a reliable position to photograph the part to be welded 20 and obtain a second image (local image); thus, when facing parts to be welded 20 of different sizes, shapes, types and placement positions, the systematic error of the obtained welding path is reduced.
In actual production, the shape of the welding seam may be linear, curved or mixed (linear + curved), etc. For this reason, in the present embodiment, at least three second marker points P2 are provided on the to-be-welded part 20, so that the second camera 50 moves to a position directly above or close to a position directly above the weld gap. At least three second marker points P2 are arranged around the weld, defining at least one virtual triangle. Generally, the weld falls within the range of the virtual triangle.
As an optional implementation manner of the foregoing embodiment, as shown in fig. 2, the step of acquiring the first relative coordinate of the second marker point P2 with respect to the first marker point P1 includes:
s203, acquiring three first relative coordinates of at least three of the three second marked points P2 (as shown by P21, P22 and P23 in the figure) relative to the first marked point P1 based on the first image, and determining coordinates of one of the at least one virtual triangle relative to the geometric center (P20) of the first marked point P1 based on the three first relative coordinates.
For example, if the coordinates of the i-th second mark point P2 are (x₂ᵢ, y₂ᵢ, z₂ᵢ), then the first relative coordinate of the i-th second mark point P2 with respect to the first mark point P1 is (x₂ᵢ-x₁, y₂ᵢ-y₁, z₂ᵢ-z₁), with i = 1, 2, 3, ..., N. The geometric center coordinates are the coordinates of the incenter or the centroid of the virtual triangle in the first coordinate system.
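As a minimal sketch of how the three first relative coordinates and the geometric center of the virtual triangle could be computed, the following Python fragment (using numpy) assumes all mark points have already been located in a common machine coordinate frame; the function name, the example coordinates and the choice of returning both centroid and incenter are illustrative only.

import numpy as np

def virtual_triangle_center(p1, p2_points):
    # p1: coordinates of the first mark point P1
    # p2_points: coordinates of the three second mark points P21, P22, P23
    p1 = np.asarray(p1, dtype=float)
    pts = np.asarray(p2_points, dtype=float)          # shape (3, 3)
    rel = pts - p1                                     # three first relative coordinates

    centroid = rel.mean(axis=0)                        # center of gravity of the virtual triangle

    # Incenter, weighted by the lengths of the sides opposite each vertex.
    a = np.linalg.norm(rel[1] - rel[2])
    b = np.linalg.norm(rel[0] - rel[2])
    c = np.linalg.norm(rel[0] - rel[1])
    incenter = (a * rel[0] + b * rel[1] + c * rel[2]) / (a + b + c)
    return rel, centroid, incenter

rel, centroid, incenter = virtual_triangle_center(
    p1=(0, 0, 0), p2_points=[(100, 40, 0), (140, 40, 0), (120, 75, 0)])
# The second camera is then moved above the chosen geometric center (centroid or incenter).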
The step of controlling the second camera 50 to move to a position where the workpiece to be welded 20 can be completely photographed comprises the following steps: s301, controlling the second camera 50 to move above the geometric center coordinate to acquire the second image, wherein the acquired second image completely contains the welding seam. Namely: the shooting position of the second camera 50 is located right above the center of gravity or the inner center of the virtual triangle, so that a high-quality second image can be obtained, and information of the weld joint close to reality can be obtained.
In the technical solution of the present application, the shooting position of the second camera 50 is located right above the geometric center of the virtual triangle. For this reason, in order to enable the weld gap to be clearly photographed, an implementer may set the virtual triangle as a regular triangle and the geometric center thereof on the weld when setting the second mark point P2, so that a connection line between the focal point of the second camera 50 and the geometric center may be perpendicular to the surface of the to-be-welded part 20 when photographing, and thus high-quality weld information may be obtained.
In the technical scheme of the embodiment of the application, in order to be able to photograph the complete weld of the part to be welded 20, the size information of the part to be welded 20 is acquired from the first image, and the focal length of the second camera 50 is adjusted according to the size information so that the complete part to be welded 20 can be photographed. In some alternative embodiments of the present application, as shown in fig. 7, after step S100 the control method further includes step S210 of acquiring size information of the part to be welded 20 based on the first image. The size information reflects the shape and size of the part to be welded 20 and is mainly used to enable the second camera 50 to capture the complete weld gap in order to model the weld. Step S210 may be performed simultaneously with step S200, or after or before step S200.
Further, in order to ensure that the virtual welding path of the established weld joint is close to the true value, the second camera 50 can shoot at least two of the second mark points P2 for mutual verification of the virtual welding path when shooting. Specifically, as an optional implementation manner of the foregoing embodiment, as shown in fig. 3, the step of planning the virtual welding path relative to the second marker point P2 based on the weld information includes:
s501, randomly selecting one of at least three second mark points P2 as a main reference point; for example, the primary reference point is P22.
S502, randomly selecting one of at least three second mark points P2 as a secondary reference point; for example, the secondary reference point is P21.
S503, identifying the welding seam through an image identification technology, and obtaining a welding seam track.
S504, associating the welding seam track with the main reference point to obtain a first virtual welding path; namely, establishing a second coordinate system I by taking the main reference point as an origin, and modeling the welding seam track in the second coordinate system I to obtain a first virtual welding path;
s505, associating the welding seam track with the auxiliary reference point to obtain a second virtual welding path;
namely, establishing a second coordinate system II by taking the auxiliary reference point as an origin, and modeling the welding seam track in the second coordinate system II to obtain the second virtual welding path;
s506, transforming the second virtual welding path into a third virtual welding path based on the relative positions of the auxiliary reference point and the main reference point;
namely, according to the relative position of the auxiliary reference point and the main reference point, the second virtual welding path is transformed into the second coordinate system I to obtain a third virtual welding path. Theoretically, the third virtual welding path coincides completely with the first virtual welding path. However, because of the limited resolution of the second image, the weld recognition algorithm has errors, which cause the third virtual welding path and the first virtual welding path to be misaligned. Therefore, in order to minimize the error of the obtained virtual welding path, the embodiment of the application reduces the error by mutual verification of the virtual welding paths. Specifically:
and S507, judging the similarity of the third virtual welding path and the first virtual welding path. Similarity can be measured by the distance of the two paths. In general, there are various classical metrics for the similarity between tracks, such as: closest-Pair Distance (CPD), sum-of-Pairs Distance (SPD), DTW, LCSS, and EDR. The similarity of the third virtual welding path to the first virtual welding path may be calculated by at least one of the metrics described above.
S508, if the similarity is greater than the preset parameter, taking the first virtual welding path as the virtual welding path; namely: if the similarity is greater than the preset parameter, the third virtual welding path and the first virtual welding path are highly similar, the mutual verification passes (PASS), and the first virtual welding path can be used as the virtual welding path.
S509, if the similarity is smaller than the preset parameter, adjusting at least one of the resolution, brightness and focal length of the second camera 50 and/or controlling the second camera 50 to move closer to or away from the part to be welded in the direction perpendicular to the welding platform 10 so as to acquire the second image again, and repeating the above steps until the similarity is greater than or equal to the preset parameter; the first virtual welding path determined last is used as the virtual welding path. Namely: if the third virtual welding path and the first virtual welding path are not similar enough, the mutual verification fails (NG), indicating that the quality of the second image is insufficient; at least one of the resolution, brightness and focal length of the second camera 50 may then be adjusted and/or the second camera 50 may be controlled to move closer to or away from the part to be welded in the direction perpendicular to the welding platform 10 to re-acquire a higher-quality second image, and the above steps are repeated until the mutual verification passes, with the newly determined first virtual welding path used as the virtual welding path.
In order to reduce errors as much as possible, the primary reference point and the secondary reference point are respectively located on both sides of the weld.
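As a rough sketch of this mutual-verification step, the fragment below implements one of the similarity measures listed above (DTW) for two polyline paths and compares a normalized similarity score with the preset parameter; the normalization and the default threshold value are assumptions of this example, not values specified by the patent.

import numpy as np

def dtw_distance(path_a, path_b):
    # Dynamic time warping distance between two welding paths given as (N, d) point arrays.
    a = np.asarray(path_a, dtype=float)
    b = np.asarray(path_b, dtype=float)
    n, m = len(a), len(b)
    d = np.full((n + 1, m + 1), np.inf)
    d[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            d[i, j] = cost + min(d[i - 1, j], d[i, j - 1], d[i - 1, j - 1])
    return d[n, m]

def paths_pass_verification(first_path, third_path, preset_parameter=0.95):
    # Turn the DTW distance into a rough 0..1 similarity score (assumed normalization)
    # and compare it with the preset parameter.
    dist = dtw_distance(first_path, third_path)
    avg_len = 0.5 * (len(first_path) + len(third_path))
    similarity = 1.0 / (1.0 + dist / avg_len)
    return similarity >= preset_parameter, similarity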
As an optional implementation manner of the foregoing embodiment, the step of identifying the weld seam through the image technology includes image preprocessing, weld seam identification, and weld seam segmentation. In image preprocessing, it is generally necessary to perform color gamut conversion, graying, and filtering on the second image. Due to various disturbances in the real process, image edge blurring is one of the most frequent problems, and the feature extraction process becomes abnormally difficult because the boundary lines of the weld seams of the part to be welded 20 are blurred and hard to recognize. Therefore, it becomes important to enhance the blurred region of the weld by using an image sharpening technique.
After image filtering, graying and image enhancement, non-relevant interference sources need to be further filtered out, and weld identification and segmentation are performed after the weld region is highlighted. In the embodiment of the application, the weld image processing unit is used for segmenting the region of interest and the region of non-interest of the target image. Further, the image is segmented and identified to separate the target object from the background, so as to obtain the position of the weld gap in the image coordinate system; then the pixels of the whole image are classified according to feature information sets of the target and corresponding judgment criteria, and the information in the weld region is extracted.
Generally, image segmentation separates the target object from the background by a threshold segmentation method. Thresholding separates the target object from the background by selecting an appropriate threshold to classify different ranges of pixel values in the image. Generally speaking, after preprocessing such as graying and filtering is completed, a suitable threshold is set by using prior knowledge or an algorithm, algebraic operations are performed between the pixel values of all pixels and the threshold, and the set of pixels meeting the constraint condition is selected as the contour information of the target object, thereby completing the separation of the target object from the background.
In the actual image segmentation process, the selection of the threshold directly influences the segmentation effect. Therefore, in order to ensure the segmentation effect, super-resolution reconstruction is introduced on the basis of the threshold segmentation operation, which effectively eliminates possible multi-threshold situations. When the maximum inter-class variance method finds more than one optimal threshold for the image segmentation processing, because the proportion between the weld region and the background in the image is unbalanced and the image features have multiple peaks, a corrected high-resolution image is obtained by the super-resolution reconstruction method, and the optimal threshold is then obtained by the optimal threshold calculation method of image segmentation, thereby improving the segmentation effect. The screening of the weld region is completed by performing super-resolution reconstruction and image threshold segmentation of the scene image in sequence.
The super-resolution reconstruction of the scene image is completed in a mode of improving and optimizing a convex set projection algorithm by taking a high-precision Point Spread Function (PSF) as priori knowledge and accurate color vector information as constraint conditions. After the high-resolution image after the super-resolution reconstruction is obtained, an optimal threshold value is obtained by using a threshold value calculation method, and super-pixel segmentation is introduced in the operation step of obtaining the weld joint area through the optimal threshold value. After the welding seam area is obtained, the central line of the welding seam is extracted, a mathematical model of the central line is established under a second coordinate system I and a second coordinate system II, and a first virtual welding path and a second virtual welding path are obtained.
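A minimal OpenCV sketch of the thresholding and centerline-extraction steps is given below. It uses Otsu's method (the maximum inter-class variance method mentioned above) and assumes the weld gap appears darker than the workpiece surface and runs roughly across the image from left to right; the super-resolution reconstruction and superpixel segmentation described above are omitted, and all names are illustrative.

import cv2
import numpy as np

def extract_weld_centerline(second_image_bgr):
    gray = cv2.cvtColor(second_image_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    # Otsu's method (maximum inter-class variance); THRESH_BINARY_INV because the
    # weld gap is assumed to be darker than the surrounding material.
    _, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    centerline = []
    for x in range(mask.shape[1]):
        ys = np.nonzero(mask[:, x])[0]          # weld pixels in this image column
        if ys.size:
            centerline.append((x, float(ys.mean())))
    return mask, np.asarray(centerline)          # centerline in pixel coordinates

The centerline points could then be modelled, for example by fitting a polynomial or spline, in the second coordinate systems I and II to obtain the first and second virtual welding paths.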
In addition, the image recognition of the welding seam area can also adopt the prior art, for example, the patent with the publication number of CN112238292 discloses a space curve track tracking method of a friction stir welding robot based on vision. The patent adopts a binarization mode to obtain the center coordinates of the welding seam from the image.
As an optional implementation manner of the foregoing embodiment, as shown in fig. 4, the step of planning an actual welding path based on the virtual welding path and the first relative coordinate includes: S601, based on the first relative coordinate of the main reference point relative to the first mark point P1, transforming the first virtual welding path that satisfies the condition into an actual welding path referenced to the first mark point P1. Because the first virtual welding path that passes the verification (i.e. whose systematic error is acceptable) is converted into the actual welding path referenced to the first mark point P1, the motion track of the stirring pin 80 is closer to the real weld track, no deviation-correcting device is needed during welding, and the welding speed and welding quality are improved.
With the control method provided by the embodiment of the application, continuous operation on the parts to be welded 20 is achieved because no personnel are needed to adjust the placement positions of the parts to be welded 20. In continuous operation, the processing of a part to be welded 20 may include transferring the part to be welded 20 onto the welding platform 10 by a robot, welding it, and then transferring the welded part 20 to the next manufacturing station. Therefore, in order to meet the requirement of an intelligent production process, as an optional implementation of the above embodiment, as shown in fig. 5, the step of acquiring the first relative coordinate of the second mark point P2 relative to the first mark point P1 based on the first image specifically includes:
s201, determining whether the workpiece to be welded exists on the welding platform 10 or not based on the first image and a preset image;
s202, if yes, preprocessing the first image to improve the resolution of the first image; and identifying the workpiece to be welded and the second mark point P2 based on the first image with the improved resolution so as to acquire a first relative coordinate of the second mark point P2 relative to the first mark point P1.
That is, based on the first image, the calculation of the first relative coordinate between the second mark point P2 and the first mark point P1 is performed only when it is determined that a workpiece to be welded is present on the welding platform 10. Since the first image is a global image, the pixel positions of the first mark point P1 and the second mark point P2 are blurred in it; therefore, the resolution of the first image needs to be improved, and the workpiece to be welded and the second mark point P2 are identified based on the first image with improved resolution, so as to obtain the first relative coordinate of the second mark point P2 relative to the first mark point P1 and thus control the movement of the second camera 50.
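The presence check of step S201 can be sketched as a simple image difference against a stored reference image, as in the Python/OpenCV fragment below; the reference image of the empty welding platform, the difference threshold and the area ratio are all assumptions of this example rather than values given by the patent.

import cv2
import numpy as np

def workpiece_present(first_image_bgr, empty_platform_bgr, diff_threshold=25, min_area_ratio=0.01):
    # Compare the current first image with a preset image of the empty welding platform;
    # if a sufficiently large region differs, a part to be welded is assumed to be present.
    cur = cv2.cvtColor(first_image_bgr, cv2.COLOR_BGR2GRAY)
    ref = cv2.cvtColor(empty_platform_bgr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(cur, ref)
    _, changed = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)
    changed_ratio = np.count_nonzero(changed) / changed.size
    return changed_ratio >= min_area_ratio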
As an alternative to the above embodiment, as shown in fig. 6, the step of controlling the stir pin 80 to extend into the weld of the to-be-welded part 20 and to perform friction stir welding on the to-be-welded part 20 along the actual welding path includes:
s701, identifying a label of the to-be-welded part 20 based on the size information; for example, when the first image is identified, the size information of the workpiece to be welded 20 is obtained according to the contour information of the workpiece to be welded, and the type of the workpiece to be welded 20 is obtained through the size information.
S702, acquiring the rotating speed and the moving speed of the stirring pin 80 based on the label; randomly selecting one end of the actual welding path as a starting point, controlling the stirring pin 80 to move right above the starting point according to the relative position of the starting point and the first mark point P1, controlling the stirring pin 80 to extend into the welding seam of the part to be welded 20, and driving the stirring pin 80 to carry out friction stir welding on the part to be welded 20 along the actual welding path according to the rotating speed and the moving speed.
In the above embodiment, the welding parameters of each part to be welded 20 may also differ. The welding parameters are mainly determined by the material, thickness and weld type of the part to be welded 20, and include the rotation speed and the moving speed of the stirring pin 80. Before welding, the practitioner can associate the rotation speed and the moving speed with the size information of the part to be welded 20. Therefore, on one production line, the control system can apply the corresponding welding parameters according to the size information of the part to be welded 20, so as to match the type of the part to be welded 20 and obtain a weld of corresponding quality.
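A minimal sketch of the association between part labels and welding parameters could be a simple lookup table keyed by the label identified from the size information; the labels and numeric values below are placeholders of this example, not parameters from the patent.

# Welding parameters associated in advance with each part label (placeholder values).
WELDING_PARAMETERS = {
    "shell_type_A": {"pin_rotation_rpm": 8000, "feed_mm_per_min": 300},
    "shell_type_B": {"pin_rotation_rpm": 6000, "feed_mm_per_min": 200},
}

def parameters_for(label):
    # Return the rotation speed and moving speed of the stirring pin for the identified label.
    try:
        return WELDING_PARAMETERS[label]
    except KeyError:
        raise ValueError(f"no welding parameters registered for part label {label!r}")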
The present application also proposes a machine tool, shown in fig. 10, comprising:
a frame 30;
the welding platform 10 is connected with the rack 30 and used for placing a part to be welded 20; a first mark point P1 is arranged on the welding platform 10;
a first camera 40, wherein the first camera 40 is fixed on the frame 30, and the first camera 40 is used for acquiring a first image containing the part to be welded 20 on the welding platform 10; wherein, a second mark point P2 is arranged on the part to be welded 20;
a second camera 50, the second camera 50 being movably disposed with respect to the gantry 30;
the friction stir welding machine head 60 is movably arranged on the frame 30, and a stirring needle 80 is arranged on the friction stir welding machine head 60; and
a processor (not shown) electrically connected to the first camera 40, the second camera 50, and the friction stir welding head 60, respectively; the processor is configured to execute the friction stir welding control method based on visual sensing described above. Since the control method adopts part or all of the foregoing embodiments, the machine tool has part or all of the technical advantages of the foregoing embodiments.
It should be noted that the second camera 50 can be movably disposed on the frame 30, and its driving structure can be a rack and pinion, a worm gear, a crawler, a hydraulic telescopic rod, an electric push rod, and the like. The second camera 50 may also be fixedly disposed on the friction stir welding head 60, so that driving the friction stir welding head 60 drives the second camera 50 to move. The friction stir welding head 60 is provided with a stirring pin 80, and the drive of the friction stir welding head 60 may be provided by any conventional means. The stirring pin 80 is rotatably provided on the friction stir welding head 60 so as to rub against the part to be welded 20 at high speed.
The processor may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor may be integrated with a GPU (Graphics Processing Unit) that is responsible for rendering and drawing the content that the display screen needs to display. The processor may further include an AI (Artificial Intelligence) processor for processing control method operations with respect to the vision sensing-based friction stir welding control system, such that a control method model of the vision sensing-based friction stir welding control system may be trained and learned autonomously, improving efficiency and accuracy.
As an alternative to the above embodiment, as shown in fig. 11 and 12, the friction stir welding head 60 is provided with a shaft shoulder 70 capable of rotating, and the stirring pin 80 is connected with the shaft shoulder 70; when the stirring pin 80 extends to the welding seam of the part to be welded 20, the shaft shoulder 70 abuts against the surface of the part to be welded 20; wherein the shaft shoulder 70 is provided with a groove 70a, and the opening of the groove 70a is arranged on the surface; the groove 70a extends from a connection of the shoulder 70 and the pin 80 to an outer peripheral wall of the shoulder 70 in a radial direction of the shoulder 70, and a groove depth of the groove 70a increases stepwise in an extending direction thereof. Even if in the technical scheme of the embodiment of the application, a part of system errors are eliminated through mutual verification of the welding paths, the control system still relies on image recognition to model the welding paths. Since the image recognition technology still has the problem of accuracy, so that the stirring pin 80 still deviates from the welding gap center in a small range during welding, in order to effectively reduce the influence of system errors on the welding quality, in the embodiment of the present application, a groove 70a is formed on the shaft shoulder 70, and an opening of the groove 70a is formed on the surface of the shaft shoulder 70, which is used for abutting against the to-be-welded part 20. And the groove 70a extends from the connection of the shoulder 70 and the stirring pin 80 to the outer peripheral wall of the shoulder 70 along the radial direction of the shoulder 70, and the groove depth of the groove 70a gradually increases in the extending direction thereof, so when the stirring pin 80 rubs against the to-be-welded member 20 on the side slightly off-center, the excessive molten liquid on the side flows radially outward from the root of the stirring pin 80 under the action of the centrifugal force, and when the shoulder 70 rotates, the excessive molten liquid is brought to the other side in the radial direction of the stirring pin 80 to fill the weld gap, so that the weld on the two sides of the center is uniform, and the influence of the systematic error of the control system on the weld quality can be effectively reduced.
Meanwhile, owing to the groove 70a, when judging the similarity between the third virtual welding path and the first virtual welding path, the requirement can be relaxed, that is, a relatively larger error between the third virtual welding path and the first virtual welding path can be tolerated, so that the resolution, brightness and/or focal length of the second camera 50 need to be adjusted, and the second camera 50 needs to be moved closer to or away from the part to be welded in the direction perpendicular to the welding platform 10, as seldom as possible, thereby improving the processing efficiency.
As shown in fig. 12, the number of the grooves is plural, and the plural grooves are provided at intervals along the circumferential direction of the pin.
The embodiment of the present application further provides a friction stir welding control system based on visual sensing, including: one or more processors; a memory; and one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the processor to implement the visual sensing-based friction stir welding control method as previously described.
Generally, the friction stir welding control system based on visual sensing comprises a processor and a memory storing a control program of the friction stir welding control system, and the control program is configured to implement the steps of the foregoing control method when executed.
The processor may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. The processor may further include an AI (Artificial Intelligence) processor for processing control method operations related to the vision sensing based friction stir welding control system, such that a control method model of the vision sensing based friction stir welding control system may be trained and learned autonomously, improving efficiency and accuracy.
The memory may include one or more computer-readable storage media, which may be non-transitory. The memory may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in a memory is used to store at least one instruction for execution by a processor to implement a visual sensing-based friction stir welding control method provided by method embodiments herein.
The foregoing describes in detail a friction stir welding control system and a machine tool based on visual sensing provided in an embodiment of the present application, and specific examples are applied herein to explain the principles and embodiments of the present invention, and the description of the foregoing examples is only used to help understanding the method and its core ideas of the present invention; meanwhile, for those skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (10)

1. A friction stir welding control method based on visual sensing, used for controlling the motion of a friction stir welding machine head, a stirring pin being provided on the friction stir welding machine head, characterized in that the method comprises:
acquiring a first image through a first camera;
acquiring a first relative coordinate of a second marking point relative to a first marking point based on the first image, wherein the second marking point is arranged on the part to be welded and the first marking point is arranged on the welding platform;
controlling a second camera, based on the first relative coordinate, to move to a position from which the part to be welded can be completely captured;
controlling the second camera to acquire a second image, and acquiring welding seam information of the part to be welded based on the second image;
planning a virtual welding path relative to the second marking point based on the welding seam information;
planning an actual welding path based on the virtual welding path and the first relative coordinate;
and controlling the stirring pin to extend into the welding seam of the part to be welded, and performing friction stir welding on the part to be welded along the actual welding path.
2. The control method according to claim 1, characterized in that at least three second marking points are provided, the second marking points being determined according to the shape of the welding seam of the part to be welded, arranged around the welding seam, and defining at least one virtual triangle;
the step of acquiring a first relative coordinate of the second marking point relative to the first marking point comprises:
acquiring, based on the first image, three first relative coordinates of three of the at least three second marking points relative to the first marking point, and determining, based on the three first relative coordinates, the coordinates of the geometric center of one of the at least one virtual triangle relative to the first marking point;
the step of controlling the second camera to move to a position from which the part to be welded can be completely captured comprises:
controlling the second camera to move to a position above the geometric center coordinates so as to acquire the second image, wherein the acquired second image completely contains the welding seam and at least two of the second marking points.
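The geometric-center step of claim 2 amounts to averaging three marker coordinates. A minimal Python sketch is given below purely as an illustration; the NumPy representation, the function name and the example coordinate values are assumptions and are not part of the claimed subject matter.

```python
import numpy as np

def triangle_geometric_center(marker_coords):
    """Geometric center of the virtual triangle defined by three second
    marking points, each given as (x, y) relative to the first marking point."""
    pts = np.asarray(marker_coords, dtype=float)
    assert pts.shape == (3, 2), "expects exactly three (x, y) coordinates"
    return pts.mean(axis=0)

# Example: three second marking points around the welding seam (values in mm).
center = triangle_geometric_center([(10.0, 5.0), (40.0, 5.0), (25.0, 30.0)])
print(center)  # position above which the second camera would be moved
```

The returned coordinates correspond to the position above which the second camera is moved before the second image is acquired.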
3. The control method of claim 2, wherein the step of planning a virtual welding path relative to the second marking point based on the welding seam information comprises:
randomly selecting one of the at least three second marking points as a main reference point;
randomly selecting another one of the at least three second marking points as an auxiliary reference point, wherein the main reference point and the auxiliary reference point are located on opposite sides of the welding seam;
identifying the welding seam through an image identification technology to obtain a welding seam track;
associating the welding seam track with the main reference point to obtain a first virtual welding path;
associating the welding seam track with the auxiliary reference point to obtain a second virtual welding path;
transforming the second virtual welding path into a third virtual welding path based on the relative position of the auxiliary reference point and the main reference point;
judging the similarity between the third virtual welding path and the first virtual welding path, and if the similarity is greater than a preset parameter, taking the first virtual welding path as the virtual welding path;
if the similarity is smaller than the preset parameter, adjusting at least one of the resolution, brightness and focal length of the second camera and/or controlling the second camera to move toward or away from the part to be welded in a direction perpendicular to the welding platform so as to acquire the second image again, and repeating the above steps until the similarity is greater than or equal to the preset parameter, the first virtual welding path finally determined being taken as the virtual welding path.
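The path consistency check of claim 3 can be pictured with a short sketch. The translation of the second virtual welding path by the offset between the two reference points and the mean point-distance similarity metric are assumptions introduced here for illustration; the claim itself only requires some similarity measure and a preset parameter.

```python
import numpy as np

def to_main_reference(path, offset_aux_to_main):
    """Re-express a path planned against the auxiliary reference point in the
    frame of the main reference point (the 'third virtual welding path')."""
    return np.asarray(path, dtype=float) + np.asarray(offset_aux_to_main, dtype=float)

def similarity(path_a, path_b):
    """Illustrative similarity score in (0, 1]: 1 / (1 + mean point distance)."""
    a, b = np.asarray(path_a, dtype=float), np.asarray(path_b, dtype=float)
    return 1.0 / (1.0 + np.linalg.norm(a - b, axis=1).mean())

# Seam points sampled relative to the main and the auxiliary reference points.
first_path = np.array([[0.0, 0.0], [5.0, 0.1], [10.0, 0.0]])
second_path = np.array([[-20.0, -10.0], [-15.0, -9.8], [-10.0, -10.1]])
offset = np.array([20.0, 10.0])  # auxiliary reference point as seen from the main one

third_path = to_main_reference(second_path, offset)
PRESET = 0.9  # assumed value of the preset parameter
if similarity(third_path, first_path) >= PRESET:
    virtual_path = first_path  # accept the first virtual welding path
else:
    # adjust resolution/brightness/focal length or the camera height,
    # re-acquire the second image and repeat the comparison
    virtual_path = None
```

The re-acquisition loop is only indicated by a comment here; in a real controller it would re-trigger the second-image acquisition described above.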
4. The control method of claim 3, wherein the step of identifying the welding seam by the image identification technology comprises image preprocessing, welding seam identification and welding seam segmentation.
5. The control method of claim 3, wherein the step of planning an actual welding path based on the virtual welding path and the first relative coordinates comprises:
transforming the first virtual welding path that satisfies the condition into an actual welding path referenced to the first marking point, based on the first relative coordinate of the main reference point relative to the first marking point.
6. The control method according to claim 1, wherein the step of acquiring a first relative coordinate of the second marking point with respect to the first marking point based on the first image specifically comprises:
determining, based on the first image and a preset image, whether the part to be welded is present on the welding platform;
if so, preprocessing the first image to improve the resolution of the first image, and identifying the part to be welded and the second marking point based on the first image with the improved resolution, so as to obtain the first relative coordinate of the second marking point relative to the first marking point.
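A possible reading of the presence check and the resolution-improving preprocessing of claim 6 is sketched below using OpenCV; the difference threshold, the upscaling factor and the use of bicubic interpolation are assumptions made for illustration only.

```python
import cv2
import numpy as np

def part_present(first_image, preset_empty_image, diff_threshold=12.0):
    """Compare the first image with a preset image of the empty welding
    platform; a large mean difference suggests a part to be welded is present."""
    diff = cv2.absdiff(first_image, preset_empty_image)
    return float(np.mean(diff)) > diff_threshold  # threshold value is assumed

def improve_resolution(first_image, factor=2):
    """Stand-in for the preprocessing step: upscale the first image before the
    part to be welded and the second marking points are identified in it."""
    h, w = first_image.shape[:2]
    return cv2.resize(first_image, (w * factor, h * factor),
                      interpolation=cv2.INTER_CUBIC)
```

Other super-resolution or denoising methods could equally serve as the preprocessing step; the claim does not prescribe a particular algorithm.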
7. The control method as set forth in claim 1, characterized in that the step of controlling the stirring pin to extend into the welding seam of the part to be welded and to friction stir weld the part to be welded along the actual welding path comprises:
identifying a label of the part to be welded based on size information, and acquiring the rotation speed and the moving speed of the stirring pin based on the label, wherein the size information is obtained by identification from the first image;
randomly selecting one end of the actual welding path as a starting point, controlling the stirring pin to move to a position directly above the starting point according to the relative position of the starting point and the first marking point, controlling the stirring pin to extend into the welding seam of the part to be welded, and driving the stirring pin to friction stir weld the part to be welded along the actual welding path at the rotation speed and the moving speed.
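The label-based parameter lookup of claim 7 can be pictured as a simple table keyed by the identified part label; the label names and the rotation and travel values below are invented for illustration and do not come from the original disclosure.

```python
# Hypothetical mapping from part label to friction stir welding parameters.
WELD_PARAMETERS = {
    "housing_small": {"rotation_rpm": 2400, "travel_mm_per_s": 3.0},
    "housing_large": {"rotation_rpm": 1800, "travel_mm_per_s": 2.0},
}

def parameters_for(label):
    """Return the stirring-pin rotation and moving speed stored for a label."""
    try:
        return WELD_PARAMETERS[label]
    except KeyError:
        raise ValueError(f"no welding parameters stored for label {label!r}")

params = parameters_for("housing_small")
start_point = (120.0, 80.0)  # either end of the actual welding path may serve as start
# The head would move above start_point, plunge the stirring pin into the seam,
# and then follow the actual welding path at params["rotation_rpm"] and
# params["travel_mm_per_s"].
```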
8. A machine tool, comprising:
a frame;
a welding platform connected with the frame and used for placing a part to be welded, a first marking point being arranged on the welding platform;
a first camera fixed to the frame and used for capturing a first image containing the part to be welded on the welding platform, wherein a second marking point is arranged on the part to be welded;
a second camera movably disposed relative to the frame;
a friction stir welding machine head movably arranged on the frame, a stirring pin being arranged on the friction stir welding machine head; and
a processor electrically connected to the first camera, the second camera and the friction stir welding machine head, respectively, wherein the processor is configured to perform the visual sensing-based friction stir welding control method of any one of claims 1 to 7.
9. The machine tool of claim 8, wherein the friction stir welding machine head is provided with a rotatably movable shaft shoulder, and the stirring pin is connected to the shaft shoulder; when the stirring pin extends into the welding seam of the part to be welded, the shaft shoulder abuts against the surface of the part to be welded;
the shaft shoulder is provided with a groove, an opening of which is formed in said surface; the groove extends, in the radial direction of the shaft shoulder, from the junction of the shaft shoulder and the stirring pin to the peripheral wall of the shaft shoulder, and the depth of the groove increases gradually along its direction of extension.
10. A visual sensing-based friction stir welding control system, comprising:
one or more processors;
a memory; and
one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the processor to implement the visual sensing-based friction stir welding control method of any of claims 1-7.
CN202210919564.XA 2022-07-27 2022-07-27 Friction stir welding control method and system based on visual sensing and machine tool Pending CN115971635A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210919564.XA CN115971635A (en) 2022-07-27 2022-07-27 Friction stir welding control method and system based on visual sensing and machine tool

Publications (1)

Publication Number Publication Date
CN115971635A true CN115971635A (en) 2023-04-18

Family

ID=85968716

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210919564.XA Pending CN115971635A (en) 2022-07-27 2022-07-27 Friction stir welding control method and system based on visual sensing and machine tool

Country Status (1)

Country Link
CN (1) CN115971635A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116252039A (en) * 2023-05-15 2023-06-13 上海耀焊科技有限公司 Intelligent control method and system for inertia friction welding machine

Similar Documents

Publication Publication Date Title
EP3863791B1 (en) System and method for weld path generation
CN109903279B (en) Automatic teaching method and device for welding seam movement track
CN103558850B (en) A kind of welding robot full-automatic movement self-calibration method of laser vision guiding
Tsai et al. Machine vision based path planning for a robotic golf club head welding system
CN111805051B (en) Groove cutting method, device, electronic equipment and system
CN107798330A (en) A kind of weld image characteristics information extraction method
Guo et al. Weld deviation detection based on wide dynamic range vision sensor in MAG welding process
US20130060369A1 (en) Method and system for generating instructions for an automated machine
CN113894481B (en) Welding pose adjusting method and device for complex space curve welding seam
CN110083157B (en) Obstacle avoidance method and device
CN104766333A (en) Vehicle door point welding robot path correction method based on stereoscopic vision
CN112238304A (en) Method for automatically welding small-batch customized special-shaped bridge steel templates by mechanical arm based on image visual recognition of welding seams
CN106056603A (en) Stereoscopic vision-based welding execution parameter on-line detection method
CN115971635A (en) Friction stir welding control method and system based on visual sensing and machine tool
CN110475627B (en) Deformation processing auxiliary system and deformation processing auxiliary method
CN114283139A (en) Weld joint detection and segmentation method and device based on area array structured light 3D vision
CN114211168A (en) Method for correcting plane welding seam track based on image subtraction
CN114964007A (en) Visual measurement and surface defect detection method for weld size
CN115018813A (en) Method for robot to autonomously identify and accurately position welding line
CN112894133B (en) Laser welding system and welding spot position extraction method
CN113182701A (en) Laser processing method, apparatus, device and storage medium
CN117506931A (en) Groove cutting path planning and correcting equipment and method based on machine vision
CN116372938A (en) Surface sampling mechanical arm fine adjustment method and device based on binocular stereoscopic vision three-dimensional reconstruction
CN116607758A (en) Construction method of tree-shaped curved surface concrete steel-wood combined formwork structure
CN115770988A (en) Intelligent welding robot teaching method based on point cloud environment understanding

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination