CN116833550A - Laser control method and laser processing apparatus - Google Patents

Laser control method and laser processing apparatus

Info

Publication number
CN116833550A
CN116833550A (Application CN202310917023.8A)
Authority
CN
China
Prior art keywords
laser
processed
workpiece
processing
point cloud
Prior art date
Legal status
Pending
Application number
CN202310917023.8A
Other languages
Chinese (zh)
Inventor
黄泽铗
Current Assignee
Orbbec Inc
Original Assignee
Orbbec Inc
Priority date
Filing date
Publication date
Application filed by Orbbec Inc filed Critical Orbbec Inc
Priority to CN202310917023.8A
Publication of CN116833550A
Legal status: Pending


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K26/00 Working by laser beam, e.g. welding, cutting or boring
    • B23K26/02 Positioning or observing the workpiece, e.g. with respect to the point of impact; Aligning, aiming or focusing the laser beam
    • B23K26/04 Automatically aligning, aiming or focusing the laser beam, e.g. using the back-scattered light
    • B23K26/046 Automatically focusing the laser beam
    • B23K26/048 Automatically focusing the laser beam by controlling the distance between laser head and workpiece
    • B23K26/70 Auxiliary operations or equipment
    • B23K26/702 Auxiliary equipment

Landscapes

  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Plasma & Fusion (AREA)
  • Mechanical Engineering (AREA)
  • Laser Beam Processing (AREA)

Abstract

Embodiments of the present application provide a laser control method and a laser processing apparatus, where the laser processing apparatus includes a 3D camera and a laser. The method includes: controlling the 3D camera to scan a workpiece to be processed along a preset scanning path to obtain a global point cloud of the workpiece; processing the global point cloud together with a preset pattern to be processed to obtain a three-dimensional processing track; determining the height of the laser relative to the workpiece, or the focusing distance of the laser, from the coordinate information of the track and a pre-established mapping relation; and controlling the laser to process the workpiece according to the track and the determined height or focusing distance. The scheme provides a low-cost three-dimensional scanning solution that can efficiently complete three-dimensional scanning and reconstruction of the surface profile of the workpiece, greatly improving the efficiency and effect of laser processing while remaining simple to operate and inexpensive.

Description

Laser control method and laser processing apparatus
Technical Field
Embodiments of the present application relate to the field of optics, and more particularly, to a laser control method and a laser processing apparatus.
Background
Laser processing equipment processes and treats workpieces using laser technology. Exploiting the high energy and tight focus of laser light, it heats, melts, evaporates, or burns the workpiece surface to cut, engrave, weld, drill, or surface-modify materials, and it is widely applied in manufacturing, electronics, the automotive industry, medical equipment, aerospace, and other fields.
For example, for a curved object composed of multiple planes, the laser processing apparatus may process the object layer by layer; for some cylindrical or spherical curved objects, it may combine rotation and translation to perform laser processing; in some cases, the focal position of the apparatus and the position and angle of the table may be adjusted to fit the shape of the curved object, achieving as uniform a processing effect over the curved surface as possible.
However, these methods impose strong restrictions on the surface shape of the object to be processed, suffer from low processing efficiency and cumbersome manual operation, cannot guarantee accurate focusing of the laser, and may require additional rotating equipment to clamp the object.
Disclosure of Invention
Embodiments of the present application provide a laser control method and a laser processing apparatus that can effectively perform laser processing on curved objects, improving processing efficiency and accuracy while remaining simple to operate.
In a first aspect, a laser control method is provided, applied to a laser processing apparatus that includes a 3D camera and a laser and is used to process a workpiece. The method includes: controlling the 3D camera to scan the workpiece along a preset scanning path to obtain a global point cloud of the workpiece; processing the global point cloud together with a preset pattern to be processed to obtain a three-dimensional processing track; determining the height of the laser relative to the workpiece, or the focusing distance of the laser, from the coordinate information of the track and a mapping relation, where the mapping relation is pre-established between the three-dimensional processing track and the height of the laser relative to the workpiece or the focusing distance of the laser; and controlling the laser to emit a laser beam to process the workpiece according to the track and the determined height or focusing distance.
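The claimed pipeline (scan → global point cloud → three-dimensional track → height/focus lookup → processing) can be sketched in a few lines of Python. All function names, the toy surface model, and the constant focal offset below are assumptions made for illustration, not part of the patent:

```python
# Hypothetical sketch of the claimed control pipeline; every name here is assumed.

def surface_height(x, y):
    """Toy curved surface standing in for the real workpiece profile."""
    return 5.0 + 0.1 * x + 0.05 * y

def scan_global_point_cloud(scan_path):
    """Simulate the 3D camera scanning along a preset path.

    Each sample is (x, y, z): the measured surface height z at (x, y).
    """
    return [(x, y, surface_height(x, y)) for (x, y) in scan_path]

def build_3d_track(pattern_xy, cloud):
    """Project a 2D pattern onto the scanned surface by looking up z at each (x, y)."""
    heights = {(x, y): z for (x, y, z) in cloud}
    return [(x, y, heights[(x, y)]) for (x, y) in pattern_xy]

def focus_distance(track_point, focal_offset=12.0):
    """Map a track point to a laser standoff via a pre-calibrated relation.

    Here the mapping is a simple constant offset above the surface; the patent
    only requires that such a mapping exists and is pre-established.
    """
    _, _, z = track_point
    return z + focal_offset

scan_path = [(x, y) for x in range(3) for y in range(3)]
cloud = scan_global_point_cloud(scan_path)
track = build_3d_track([(0, 0), (1, 1), (2, 2)], cloud)
standoffs = [focus_distance(p) for p in track]
```

The key design point the sketch captures is that height information is acquired once, up front, so per-point focusing during processing reduces to a table lookup rather than a measurement.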
This scheme provides a low-cost three-dimensional scanning solution. Scanning of the workpiece surface profile and reconstruction of a three-dimensional model can be completed efficiently, and three-dimensional height information of the surface (such as a height value or a focusing distance) is obtained in advance. The position and focal length of the laser are then adjusted automatically based on this information, so that the laser maintains a proper focal position and processing depth, accurate focusing of the laser spot is guaranteed, and the laser is always at the optimal position for processing workpieces with varying surface heights. This greatly improves the efficiency and effect of laser processing, with simple operation and low cost.
With reference to the first aspect, in some implementations of the first aspect, scanning the workpiece along a preset scanning path to acquire a global point cloud of the workpiece includes: acquiring multiple frames of local point clouds of the workpiece; and fusing the multiple frames of local point clouds to obtain the global point cloud of the workpiece.
With reference to the first aspect, in some implementations of the first aspect, fusing the multiple frames of local point clouds to obtain the global point cloud includes: preprocessing each frame of local point cloud data; aligning the preprocessed data of each frame with a world coordinate system; and fusing the aligned data of all frames with a fusion algorithm to obtain the global point cloud.
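As a hedged illustration of the fusion step, the sketch below transforms each local frame into the world coordinate system with a known rigid pose and merges the frames, de-duplicating points on a coarse voxel grid. The pose representation and voxel size are assumptions; a real system would use the calibrated extrinsics and a proper fusion algorithm:

```python
# Sketch only: (R, t) poses and voxel de-duplication are illustrative choices.

def to_world(points, pose):
    """Apply a rigid transform (R, t) to a frame of camera-coordinate points."""
    R, t = pose
    return [
        tuple(sum(R[i][k] * p[k] for k in range(3)) + t[i] for i in range(3))
        for p in points
    ]

def fuse_frames(frames, poses, voxel=0.01):
    """Align every frame to world coordinates, then merge with voxel de-duplication."""
    seen, fused = set(), []
    for points, pose in zip(frames, poses):
        for p in to_world(points, pose):
            key = tuple(round(c / voxel) for c in p)
            if key not in seen:  # keep at most one point per voxel cell
                seen.add(key)
                fused.append(p)
    return fused

identity = ([[1, 0, 0], [0, 1, 0], [0, 0, 1]], [0.0, 0.0, 0.0])
shift_x  = ([[1, 0, 0], [0, 1, 0], [0, 0, 1]], [1.0, 0.0, 0.0])
frame = [(0.0, 0.0, 5.0), (1.0, 0.0, 5.0)]
cloud = fuse_frames([frame, frame], [identity, shift_x])
```

The two frames overlap at (1, 0, 5), so the fused cloud keeps three distinct points rather than four.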
With reference to the first aspect, in some implementations of the first aspect, processing the global point cloud and the preset pattern to be processed to obtain a three-dimensional processing track includes: generating an original processing track of the laser processing apparatus from the pre-imported pattern to be processed and the path information in its pattern file; and combining the global point cloud with the original processing track to obtain the three-dimensional processing track.
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: and adjusting the position and focus of the laser based on the height of the laser relative to the workpiece to be processed or the focusing distance of the laser, so that the focusing light spot of the laser beam is always positioned on the surface of the workpiece to be processed.
With reference to the first aspect, in certain implementations of the first aspect, controlling the laser to process the workpiece according to the three-dimensional processing track and the height of the laser relative to the workpiece or the focusing distance of the laser includes: controlling the transverse movement of the laser according to the three-dimensional processing track; and controlling the longitudinal distance of the laser according to the height or focusing distance, so as to complete processing of the workpiece.
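The separation of transverse and longitudinal control described above can be sketched as a command stream. The command names and the fixed focal offset are hypothetical; the point is only the interleaving of lateral track-following with standoff adjustment:

```python
# Illustrative command stream; "move_xy"/"move_z" and the offset are assumed names.

def follow_track(track, focal_offset=12.0):
    """Interleave lateral moves along the track with longitudinal standoff moves.

    The longitudinal target keeps the focal spot on the surface:
    laser height = surface height + (assumed) focal offset.
    """
    commands = []
    for (x, y, z) in track:
        commands.append(("move_xy", x, y))             # transverse motion
        commands.append(("move_z", z + focal_offset))  # longitudinal distance
    return commands

cmds = follow_track([(0, 0, 5.0), (1, 0, 5.2)])
```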
In a second aspect, a laser control apparatus is provided, including: a control unit, configured to control the 3D camera to scan the workpiece along a preset scanning path to obtain a global point cloud of the workpiece; and a processing unit, configured to process the global point cloud and the preset pattern to be processed to obtain a three-dimensional processing track. The processing unit is further configured to determine the height of the laser relative to the workpiece, or the focusing distance of the laser, from the coordinate information of the track and a pre-established mapping relation between the track and that height or focusing distance. The control unit is further configured to control the laser to emit a laser beam to process the workpiece according to the track and the determined height or focusing distance.
In a third aspect, a laser processing apparatus for processing a workpiece is provided, including: a working platform, a laser, a 3D camera, and an upper computer, where the workpiece is placed on the working platform. The laser is configured to emit a laser beam toward the workpiece to process it. The 3D camera is configured to scan the workpiece along a preset scanning path to obtain a global point cloud of the workpiece. The upper computer is configured to: process the global point cloud and the preset pattern to be processed to obtain a three-dimensional processing track; determine the height of the laser relative to the workpiece, or the focusing distance of the laser, from the coordinate information of the track and a pre-established mapping relation; and control the laser to process the workpiece according to the track and the determined height or focusing distance.
In a fourth aspect, a computer-readable storage medium is provided for storing a computer program which, when run on a computer, causes the computer to perform the method in any possible implementation of the first aspect.
In a fifth aspect, there is provided a chip comprising a processor for calling and running a computer program from a memory, so that a device on which the chip is mounted performs the laser control method as in the first aspect.
In a sixth aspect, a computer program is provided that causes a computer to execute the control method in any possible implementation of the first aspect.
In a seventh aspect, there is provided a program product comprising computer readable code, or a non-transitory computer readable storage medium carrying computer readable code, which when run in an electronic device, causes a processor in the electronic device to perform a laser control method as in the first aspect.
Drawings
Fig. 1 and 2 are front views of a laser processing apparatus provided in an embodiment of the present application.
Fig. 3 is a schematic diagram of an operating principle of a 3D camera according to an embodiment of the present application.
Fig. 4 is a schematic flow chart of a laser control method according to an embodiment of the present application.
Fig. 5 and 6 are schematic diagrams of a region to be scanned of a workpiece to be processed according to an embodiment of the present application.
Fig. 7 is a schematic view of a three-dimensional contour of a surface of a workpiece to be processed according to an embodiment of the present application.
Fig. 8 is a schematic diagram of an original processing track corresponding to a pattern to be processed of a laser processing apparatus according to an embodiment of the present application.
Fig. 9 is a schematic diagram of a three-dimensional processing track for mapping a pattern to be processed onto a surface of a workpiece to be processed according to an embodiment of the application.
Fig. 10 to 12 are schematic diagrams illustrating controlling a longitudinal distance between a laser and a surface of a workpiece to be processed according to an embodiment of the present application.
Fig. 13 is a schematic block diagram of a laser control apparatus according to an embodiment of the present application.
Fig. 14 is a schematic structural view of a laser processing apparatus according to an embodiment of the present application.
Detailed Description
To facilitate understanding of the technical solutions of the present application, the following explanations are provided first.
In the present application, "at least one" means one or more, and "a plurality" means two or more. In the textual descriptions of the present application, the character "/" generally indicates an "or" relationship between the associated objects.
In the present application, "first", "second" and various numerical numbers are merely for convenience of description and are not intended to limit the scope of the embodiments of the present application. The sequence numbers of the processes below do not mean the sequence of execution, and the execution sequence of the processes should be determined by the functions and the internal logic, and should not be construed as limiting the implementation process of the embodiments of the present application.
In the present disclosure, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In the present disclosure, "exemplary" or "such as" and the like are used to indicate examples, illustrations, or descriptions, embodiments or designs described as "exemplary" or "such as" should not be construed as preferred or advantageous over other embodiments or designs. The use of the word "exemplary" or "such as" is intended to present the relevant concepts in a concrete fashion to facilitate understanding.
In the present application, the terms "center," "upper," "lower," and the like refer to an orientation or positional relationship based on that shown in the drawings, for convenience of description and simplicity of description, and do not indicate or imply that the apparatus or elements referred to must have a specific orientation, be constructed and operated in a specific orientation, and thus should not be construed as limiting the application.
In the present application, unless explicitly specified and limited otherwise, the terms "mounted," "fixed," "disposed," and the like are to be construed broadly and when an element is referred to as being "fixed" or "disposed" on another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present.
In the present application, unless explicitly specified and defined otherwise, the term "point cloud" may be replaced with "point cloud data", "point cloud information", "point cloud coordinates", and the like.
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
Laser processing equipment is a class of equipment that processes and treats workpieces using laser technology. It exploits the high energy and tight focus of laser light to cut, engrave, weld, drill, and surface-modify materials by heating, melting, evaporating, or burning the workpiece surface. Laser processing equipment can handle a wide variety of materials, including metals, plastics, ceramics, glass, and textiles. It is widely used in manufacturing, electronics, the automotive industry, medical equipment, aerospace, and other fields, enabling high-precision, high-efficiency, non-contact processing.
The processing effect and processing depth that a laser processing apparatus achieves at different surface heights depend mainly on the focusing properties of the laser beam in the vertical direction. In general, during laser processing it is sufficient to ensure that the focal point of the laser falls on the surface of the object being processed. The laser depth of field is the focal range of the laser beam in the direction perpendicular to the working surface. However, the depth of field of a laser processing apparatus is limited and short; its exact value depends on the characteristics of the laser, the design of the optical system, and the working distance. If the laser beam is not focused accurately, or the depth of field is small, the result may be uneven machining or an inability to machine deep layers of material. Therefore, when processing with a laser processing apparatus, the focal position and processing parameters must be adjusted to the characteristics and requirements of the material to obtain the desired effect.
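The depth-of-field constraint can be stated concretely: the surface must lie within the beam's focal range around the current focus setting, otherwise the focus must be readjusted. The numeric depth of field below is an arbitrary assumption for illustration:

```python
# Illustrative focus check; the 0.5 mm depth of field is an assumed value.

def needs_refocus(surface_z, focus_z, depth_of_field=0.5):
    """True if the surface falls outside the focal range centered on focus_z."""
    return abs(surface_z - focus_z) > depth_of_field / 2.0

flat_ok = needs_refocus(5.10, 5.00)  # deviation 0.10 <= half-range 0.25: in focus
steep   = needs_refocus(5.40, 5.00)  # deviation 0.40 >  half-range 0.25: refocus
```

This is exactly why the method acquires surface heights in advance: the check (and the resulting longitudinal move) can be made at every track point without re-measuring.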
Currently, when laser processing apparatuses process curved objects, they generally adopt layered processing, rotary processing, manual adjustment, and the like.
(1) Layered processing: for curved objects composed of multiple planes, the curved surface may be decomposed into a series of planes, and each plane is then machined separately by adjusting the position and angle of the stage or the focal length of the laser.
(2) Rotary machining: for some cylindrical or spherical curved objects, the object can be fixed on a rotating device so that the laser moves along the object's rotation axis. Through the combination of rotation and movement, a uniform machining effect can be achieved over the whole curved surface.
(3) Manual adjustment: in special cases, the focal position of the apparatus and the position and angle of the table can be adjusted manually to fit the shape of the curved object. This usually requires adjustment based on experience and trial to obtain the best processing effect.
It can be seen that the above processing modes impose great limitations on the surface shape of the workpiece, so the range of workpieces they can handle is limited. In addition, the overall laser processing operation is cumbersome and inefficient, and an additional rotating device is required to clamp the object, making the laser processing apparatus complicated and costly.
In view of this, the present application provides a laser control method, a laser control apparatus, and a laser processing apparatus that can efficiently complete scanning of the workpiece surface profile and reconstruction of a three-dimensional model, obtain the three-dimensional height information of the surface in advance, and automatically adjust the position and focal length of the laser accordingly, so that the laser is always at the optimal position for processing workpieces with varying surface heights. This greatly improves the efficiency and effect of laser processing, with simple operation and low cost.
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings. The embodiments provided by the present application may be applied to a laser processing apparatus, such as the laser processing apparatus shown in fig. 1 and 2.
Fig. 1 is a front view of a first laser processing apparatus provided in an embodiment of the present application. As shown in fig. 1, the laser processing apparatus 100 includes a laser 110 and a 3D camera 120, where the 3D camera 120 is fixed to the laser 110 and moves with it along the x/y/z axes, i.e., the 3D camera 120 is a mobile 3D camera. It will be appreciated that this arrangement suits scenarios in which a single measurement covers only a small local area. The 3D camera 120 may be, for example, a single-point rangefinder or a single-line 3D camera.
Fig. 2 is a front view of a second laser processing apparatus according to an embodiment of the present application. As shown in fig. 2, the laser processing apparatus 200 includes a laser 210 and a 3D camera 220, where the laser 210 can move along the x/y/z axes and the 3D camera 220 is fixedly installed at the top or side of the laser processing apparatus 200 to measure downward, i.e., the 3D camera 220 is a fixed 3D camera. It will be appreciated that this arrangement suits scenarios in which a single measurement covers a larger area.
In an embodiment of the present application, the laser processing apparatus (e.g., 100 or 200) shown in fig. 1 or fig. 2 includes a working platform, a laser 110/210, a 3D camera 120/220, and an upper computer, with the workpiece to be processed placed on the working platform. The laser 110/210 is configured to emit a laser beam toward the workpiece 130/230. It may be a single-line, multi-line, single-point, or multi-point laser source, for example a tunable continuous line laser that continuously outputs laser light at different stable power frequencies, or a tunable pulsed line laser that outputs pulsed laser light at different stable power frequencies; the present application does not specifically limit this. The 3D camera 120/220 is configured to collect the global point cloud of the workpiece 130/230. The upper computer is configured to process the global point cloud and the preset pattern to be processed to obtain a three-dimensional processing track, along with the height of the laser 110/210 relative to the workpiece 130/230 or the focusing distance of the laser 110/210, and then to control the laser 110/210 to process the workpiece 130/230 accordingly to obtain the target object.
It should be noted that fig. 1 and fig. 2 are only examples given for easy understanding, and the present application is not limited to the installation manner of the different 3D cameras. In addition, the two mounting manners of the 3D cameras shown in fig. 1 and fig. 2 may be implemented independently, or may be implemented in combination, for example, in some scenarios, a fixed 3D camera (e.g., the 3D camera 220 shown in fig. 2) may be used to perform global low-precision scanning first, and then a mobile 3D camera (e.g., the 3D camera 120 shown in fig. 1) may be used to perform local high-precision scanning.
Specifically, taking a 3D camera based on the line-structured-light principle as an example, the working principle of the 3D camera in the embodiments of the present application (e.g., the 3D camera 120 of fig. 1 or the 3D camera 220 of fig. 2) is described below with reference to fig. 3.
Fig. 3 is a schematic diagram of the operating principle of a 3D camera according to an embodiment of the present application. For convenience of description, as shown in fig. 3, the 3D camera 300 includes a line emitting end 310 and a receiving end 320, and the optical axis of the line emitting end 310 is disposed at an angle to, or approximately parallel with, the optical axis of the receiving end 320.
Optionally, the line emitting end 310 may be a single-line, multi-line, single-point, or multi-point laser source. Lasers offer high collimation and strong directivity, which improves calibration accuracy and efficiency.
The number of line emitting ends 310 and receiving ends 320, the wavelength of the laser emitted by the line emitting end 310, the angle between the optical axes of the line emitting end 310 and the receiving end 320, and their orientations are not specifically limited in the present application. For example, configurations in which the receiving end 320 faces vertically downward and the line emitting end 310 is tilted downward, the line emitting end 310 faces vertically downward and the receiving end 320 is tilted downward, or both are tilted downward all fall within the scope of the technical solution of the present application.
Optionally, the receiving end 320 may be a charge-coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS) sensor, or another photosensitive element capable of receiving infrared light, ultraviolet light, etc. Further, the processing chip may include a dedicated circuit, such as a dedicated SoC containing a CPU, memory, and bus, an FPGA, or an ASIC; or it may include a general-purpose processing circuit, for example when the laser profiler is integrated into an intelligent terminal such as a mobile phone, television, computer, scanner, or 3D printer, in which case the processing circuit may be at least part of the terminal's processor.
Based on the line-laser scanning principle, the line emitting end 310 projects a line laser onto the workpiece 330, the receiving end 320 captures an image, and the center line of the laser stripe in the image is extracted; the point cloud of the workpiece 330 in the world coordinate system can then be computed from the light-plane equation and the calibrated intrinsic and extrinsic parameters of the receiving end 320. A change in height along the Z axis manifests as a movement of pixels at the receiving end 320.
In a first implementation, the line emitting end 310 of the 3D camera 300 projects a line laser onto the workpiece 330. The laser sheet forms an optical-knife plane; each such plane corresponds to a plane equation, which can be obtained by calibration. The receiving end 320 collects the laser light reflected by the workpiece 330, forming a stripe image on the imaging plane. For any point on the laser stripe in the image, a ray is cast from the optical center of the receiving end 320 through that point; the intersection of this ray with the optical-knife plane gives the point's three-dimensional coordinates in the camera coordinate system, yielding the point cloud of the workpiece 330 in the camera coordinate system.
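The ray/optical-knife-plane intersection described here can be sketched directly under a pinhole camera model. The intrinsics (fx, fy, cx, cy) and the plane coefficients are assumed calibration outputs; the numbers below are made up for illustration:

```python
# Triangulation sketch; intrinsics and plane coefficients are toy values.

def pixel_ray(u, v, fx, fy, cx, cy):
    """Back-project pixel (u, v) to a ray direction through the optical center."""
    return ((u - cx) / fx, (v - cy) / fy, 1.0)

def intersect_light_plane(ray, plane):
    """Intersect the ray t*d (t > 0, from the optical center) with the
    optical-knife plane a*x + b*y + c*z + d0 = 0."""
    a, b, c, d0 = plane
    dx, dy, dz = ray
    t = -d0 / (a * dx + b * dy + c * dz)
    return (t * dx, t * dy, t * dz)

# Toy calibration: principal point (320, 240), focal lengths 600 px,
# tilted light plane x + z - 10 = 0.
plane = (1.0, 0.0, 1.0, -10.0)
point  = intersect_light_plane(pixel_ray(320.0, 240.0, 600.0, 600.0, 320.0, 240.0), plane)
point2 = intersect_light_plane(pixel_ray(920.0, 240.0, 600.0, 600.0, 320.0, 240.0), plane)
```

The central pixel's ray (0, 0, 1) meets the plane at (0, 0, 10); a pixel 600 columns to the right gives the ray (1, 0, 1) and the point (5, 0, 5), illustrating how pixel motion encodes height.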
Because the line emitting end 310 uses a single-line laser, a single-frame point cloud covers only a single line. To obtain the surface profile of the workpiece 330 to be processed, the surface can be scanned by moving the receiving end 320 or the workpiece 330 itself to acquire multiple frames of point clouds, which are then stitched together based on the intrinsic and extrinsic parameters of the receiving end 320 to obtain the three-dimensional point cloud information of the surface profile of the workpiece 330 to be processed in the world coordinate system. The origin of the world coordinate system may typically be defined as a corner point of the working platform, with the X axis and Y axis along the sides intersecting at that corner. Alternatively, the world coordinate system may be constructed in other ways, and the application is not limited in this regard.
In a second implementation, the center line of the line laser in the line image may be extracted by a center-line extraction algorithm; a ray is formed from the optical center of the receiving end 320 through any center point of the line laser, the intersection of the ray with the optical-knife plane is computed, and the three-dimensional coordinates of that center point are determined, thereby obtaining the point cloud of the workpiece 330 to be processed.
Alternatively, the point cloud acquired in this implementation may be a point cloud under the camera coordinate system.
Compared with the first implementation, which calculates the point cloud of the workpiece 330 to be processed directly from arbitrary points on the line laser, the second implementation computes the center line of the line laser to obtain center points with sub-pixel coordinates; deriving the point cloud from these sub-pixel center points improves the accuracy of the point cloud data.
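The gray-centroid (center-of-gravity) method is one common center-line extraction algorithm that yields sub-pixel coordinates. A minimal sketch, assuming the laser stripe runs roughly horizontally so that each image column contains one stripe cross-section:

```python
import numpy as np

def centerline_subpixel(image):
    """Gray-centroid estimate of the laser stripe center, one value per column.

    For each column, the intensity-weighted mean row index is returned,
    locating the stripe center with sub-pixel precision.
    """
    rows = np.arange(image.shape[0], dtype=float)[:, None]
    weights = image.astype(float)
    total = weights.sum(axis=0)
    total[total == 0] = np.nan  # mark columns with no laser signal
    return (rows * weights).sum(axis=0) / total
```

Each returned sub-pixel center point can then be triangulated against the optical-knife plane exactly as an integer pixel would be.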
Optionally, for the point clouds acquired by the two implementations or other implementations, filtering processing may be performed on the point clouds to improve the accuracy of the point clouds.
Alternatively, the receiving end 320 of the 3D camera 300 may be used alone to collect two-dimensional images of the workpiece 330 to be processed, so as to further obtain information such as line width, pattern, and texture of the workpiece 330 to be processed.
It should be understood that the above manner of acquiring the point cloud of the workpiece 330 to be processed is described mainly for single-line scanning, but the present application is equally applicable to scanning by the 3D camera 300 in a surface scanning manner. For example, if the beam emitted by the line emitting end 310 is a multi-line beam and the receiving end 320 accordingly collects the multi-line beam reflected by the workpiece 330 to be processed, a multi-line image is obtained. In this case, however, a ray formed from the optical center of the receiving end 320 through a point on any line in the multi-line image intersects the optical-knife planes corresponding to the multi-line beam at multiple points, so the optical-knife plane equation corresponding to the current line in the multi-line image cannot be uniquely determined.
In order to determine the optical-knife plane equation corresponding to the current line beam in the multi-line image, each line beam in the multi-line beam emitted by the line emitting end 310 may be encoded. For example, the multi-line beams may be encoded with an encoding pattern, or each line beam may be encoded by a property of the beam such as color or brightness. The receiving end 320 then collects the reflected beams to generate an encoded multi-line image, which is decoded to identify each line beam, so that the optical-knife plane equation corresponding to the current line beam is uniquely determined; the point cloud information of the workpiece 330 to be processed in the world coordinate system is then obtained according to the line laser scanning principle and the intrinsic and extrinsic parameters of the 3D camera 300.
Optionally, when the 3D camera 300 adopts the surface scanning mode, the line emitting end 310 may also emit a structured light pattern to the workpiece 330 to be processed, the receiving end 320 receives the light beam reflected by the workpiece 330 to be processed to generate a structured light image, and processes the structured light image based on the structured light principle, and combines the internal parameters and external parameters of the 3D camera 300 to obtain the point cloud information of the workpiece 330 to be processed under the world coordinate system.
Optionally, the 3D camera 300 may further include a processing chip (not shown in the figure) configured to process the image acquired by the receiving end 320 to obtain the point cloud of the workpiece 330 to be processed in the camera coordinate system. The processing chip may include a separate dedicated circuit, such as a dedicated SoC chip containing a CPU, a memory, a bus, and the like, an FPGA chip, or an ASIC chip; or it may include a general-purpose processing circuit, for example, when the 3D camera 300 is integrated into the laser processing apparatus (100 shown in fig. 1 or 200 shown in fig. 2), the processing circuit of the upper computer in the laser processing apparatus 100/200 may serve as a part of the processing chip.
It should be noted that the 3D camera 300 shown in fig. 3 is only an example given for ease of understanding and does not limit the technical solution of the present application. The above description takes the 3D camera 300 as a laser profiler as an example; in other embodiments, the 3D camera 300 may also be a single-point rangefinder based on triangulation or single-point direct time of flight, a multi-line rangefinder, a speckle structured-light/binocular camera, an indirect time-of-flight camera based on floodlight/speckle/linear-array illumination, and the like; any device capable of acquiring 3D information falls within the scope of the present application.
Calibration of the 3D camera 300 in an embodiment of the present application is described below with reference to fig. 1 to 3. It should be noted that calibrating the 3D camera is an optional step. For example, the 3D camera 120/220/300 may be calibrated before each processing run of the laser processing apparatus 100/200; alternatively, it may be calibrated periodically, e.g., once every 10 operations of the laser processing equipment; alternatively, it may be calibrated according to practical experience, processing results, actual usage conditions, and other factors. In short, calibrating the 3D camera 120/220/300 ensures its measurement accuracy and thereby improves the acquisition accuracy of the laser processing equipment 100/200.
In one implementation, as shown in fig. 1 or fig. 2, a calibration board 140/240 is provided. Based on the working principle of the 3D camera 300 in fig. 3, the upper computer controls the 3D camera 120/220 to move above the calibration board 140/240, photograph it from a plurality of poses, and calculate the intrinsic and extrinsic parameters of the 3D camera 120/220, thereby calibrating the camera. For example, the 3D camera 300 may acquire calibration images containing the calibration board 140/240 and then process them with a preset calibration algorithm to obtain the intrinsic and extrinsic parameters of the 3D camera 120/220. The preset calibration algorithm may be Zhang Zhengyou's calibration algorithm or another calibration algorithm.
Based on the above-described fig. 1 to 3, a laser control method provided by the present application, which is applicable to the laser processing apparatus 100/200 shown in fig. 1 or 2, will be described in detail with reference to fig. 4 to 12.
Fig. 4 is a flowchart of a laser control method 400 according to an embodiment of the present application, as shown in fig. 4, the method includes the following steps.
S410, the 3D camera is controlled to scan the workpiece to be processed according to a preset scanning path, and the global point cloud of the workpiece to be processed is obtained.
Illustratively, controlling the 3D camera 300 shown in fig. 3 to scan the workpiece to be processed according to the preset scan path may obtain a global point cloud of the workpiece to be processed. It should be noted that, in the embodiments of the present application, the point cloud may be three-dimensional coordinate information, color information, light intensity information, etc., and for convenience of understanding and description, the three-dimensional coordinate information is specifically described below as an example.
Based on the working principle of the 3D camera shown in fig. 3, the implementation of step S410 is specifically described with respect to the mobile 3D camera 120 shown in fig. 1 and the fixed 3D camera 220 shown in fig. 2.
(1) Such as the mobile 3D camera 120 shown in fig. 1.
Illustratively, the 3D camera 120 acquiring a global point cloud of the workpiece 130 to be processed includes the following steps.
S411, acquiring multi-frame local point clouds.
Illustratively, a region to be scanned of the workpiece 130 to be processed is selected on the upper-computer interface of the laser processing apparatus 100, and an optimal scan path is generated automatically from the region to be scanned, or the scan path is defined by the user. For example, a fine scanning or sparse scanning method is selected adaptively according to whether the shape of the workpiece 130 to be processed is regular or irregular.
In a first implementation, when the workpiece 130 to be processed is regular, the region to be scanned may be divided into a plurality of areas, and the 3D camera 120 is controlled to perform a global sparse scan at the center position of each area to obtain a multi-frame sparse point cloud of the workpiece 130 to be processed. That is, the spacing between the areas is set relatively large, so that the number of areas is small; illustratively, the region to be scanned is divided into 5*5 identical areas. For a regular workpiece 130 to be processed, a reasonably accurate global point cloud can be fitted from a small number of sparse point clouds, and the coarse division shortens the time the 3D camera 120 needs to acquire the point cloud; in other words, sparse scanning of the region to be scanned improves scanning efficiency and saves processing time.
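The 5*5 division into identical areas, with the camera visiting each area's center, can be sketched with a small hypothetical helper (not part of the patent):

```python
import numpy as np

def grid_scan_centers(width, height, nx=5, ny=5):
    """Center coordinates of the nx-by-ny identical areas a rectangular
    region to be scanned is divided into for sparse scanning."""
    xs = (np.arange(nx) + 0.5) * width / nx
    ys = (np.arange(ny) + 0.5) * height / ny
    # Row-major visit order: one sparse point-cloud frame per center
    return [(x, y) for y in ys for x in xs]
```

Larger areas (smaller nx, ny) mean fewer frames and faster scanning, at the cost of a coarser fitted profile.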
Fig. 5 is a schematic diagram of a region to be scanned of a first workpiece to be processed according to an embodiment of the present application. As shown in fig. 5, the 3D camera 120 scans from left to right, the region to be scanned is divided into finer-granularity areas, and the interval between areas may be set to 1 mm, i.e., the 3D camera 120 acquires a point cloud every 1 mm of movement. It can be appreciated that, because the 3D camera 120 is fixed together with the laser 110, the measuring beam can cover multiple areas at once, i.e., the width of a single measurement is large, on the order of tens or hundreds of millimeters; therefore, the 3D camera 120 can acquire the point clouds of multiple areas every 1 mm of movement, and a zigzag scan may be adopted, for example.
In a second implementation, when the workpiece 130 to be processed is irregular, the region to be scanned may be divided into areas of smaller granularity, i.e., the spacing between the areas is set relatively small so that the number of areas is large. The 3D camera 120 can then finely scan the region to be scanned to obtain a larger number of dense point clouds, and fusing these dense point clouds yields the global point cloud of the workpiece 130 to be processed, so that positions with concave-convex features on the surface of the workpiece 130 can be located more accurately. It should be noted that when the workpiece to be processed is regular, fine scanning may also be adopted to obtain a more accurate point cloud, and the application is not limited in this regard.
Fig. 6 is a schematic diagram of a region to be scanned of a second workpiece to be processed according to an embodiment of the present application. As shown in fig. 6, the 3D camera 120 moves in a bow-shaped (boustrophedon) pattern to obtain the dense point clouds of the respective areas of the workpiece 130 to be processed. With the bow-shaped pattern, the 3D camera 120 does not need to stop when it reaches the edge of the workpiece 130 to be processed, achieving continuous moving scanning; compared with always moving the 3D camera 120 in the same direction, the bow-shaped pattern saves time in acquiring the point cloud of the workpiece 130 to be processed.
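The bow-shaped visit order over a grid of scan cells can be sketched as follows; this is only an illustrative helper for the path shape described above:

```python
def serpentine_path(rows, cols):
    """Boustrophedon ('bow-shaped') visit order over a rows-by-cols grid:
    even rows left-to-right, odd rows right-to-left, so the camera turns
    at the edge instead of jumping back across the workpiece."""
    order = []
    for r in range(rows):
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        order.extend((r, c) for c in cs)
    return order
```

Consecutive cells in the returned order are always adjacent, which is what makes the continuous moving scan possible.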
In a third implementation, when the regular areas on the surface of the workpiece 130 to be processed cannot be determined precisely, or the irregular areas are scattered over the surface, the 3D camera 120 needs to perform a global fine scan of the whole surface. When, however, the irregular area is known to lie in a specific local region, the whole surface may first be sparsely scanned to obtain multi-frame sparse point clouds and locate the irregular area; the irregular area is then finely scanned to obtain a dense point cloud, and the sparse and dense point clouds are fused into the global point cloud of the workpiece 130 to be processed. Compared with a global fine scan, combining global sparse scanning with local fine scanning reduces the number of acquired points and simplifies the scanning process, thereby improving the processing speed of the 3D camera 120.
Optionally, if a line-scanning 3D camera is used, the upper computer controls the 3D camera 120 to move along the scan path, acquires multi-frame local point cloud data in real time, and records the coordinates [xw_i, yw_i, zw_i] of the 3D camera 120 (usually its optical center) in the world coordinate system (or global coordinate system) when the i-th frame of point cloud data is measured, until the scan is complete. The j-th point in the i-th frame of the local point cloud may be represented as [x_i_j, y_i_j, z_i_j].
Optionally, if a 3D camera adopting surface scanning is adopted, the scanning path may be converted into a plurality of local rectangular areas to be scanned, the upper computer controls the 3D camera 120 to move to the centers of the corresponding plurality of rectangular areas for measurement, obtains multi-frame local point cloud data corresponding to the plurality of areas, and records coordinates [ xw_i, yw_i, zw_i ] corresponding to the 3D camera 120 under a world coordinate system when the i-th frame point cloud data is measured.
S412, fusing the multi-frame local point clouds to obtain a global point cloud.
By way of example, a specific implementation of this step S412 may include the following steps:
s4121, preprocessing the obtained local point cloud of each frame.
Illustratively, preprocessing includes, but is not limited to, denoising, filtering, feature extraction, etc., and this implementation helps to reduce noise and unnecessary points, ensuring accuracy of laser machining results.
S4122, each local point cloud is aligned with the world coordinate system according to the coordinates [ xw_i, yw_i, zw_i ] of the 3D camera 120 in the world coordinate system at each measurement.
Illustratively, an iterative closest point (Iterative Closest Point, ICP) or other registration algorithm is typically used to estimate the transformation relationship between each local point cloud and the global point cloud.
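For a scan stage that only translates the camera, the per-frame alignment of step S4122 reduces to a translation by the recorded camera position; the sketch below assumes exactly that (any rotation between frames would require a full ICP refinement):

```python
import numpy as np

def align_to_world(local_cloud, cam_pos):
    """Translate one frame of points, measured in the moving camera's
    frame, into the world frame using the camera position
    [xw_i, yw_i, zw_i] recorded for that frame.
    Assumes the scan stage does not rotate the camera between frames."""
    return np.asarray(local_cloud, dtype=float) + np.asarray(cam_pos, dtype=float)
```

After this coarse alignment, ICP (or another registration algorithm) can refine the transform between overlapping frames.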
S4123, carrying out point cloud fusion on the local point cloud of each frame to obtain a global point cloud.
For example, after the local point clouds are aligned to the world coordinate system, different fusion algorithms may be used to fuse the multiple frames of local point clouds into one complete frame of global point cloud. It should be understood that point cloud fusion is the key step of merging multiple local point clouds into one complete frame of global point cloud data; this implementation obtains the contour information of different positions on the surface of the workpiece 130 to be processed and provides basic data for subsequent processing tasks.
Exemplary, common fusion methods include, but are not limited to: voxel Grid (Voxel Grid) filtering, european clustering, nearest neighbor search based on KD tree, etc.
Optionally, some incomplete or inaccurate parts may remain in the fused global point cloud; an optimization algorithm, such as nonlinear least squares or ICP iteration, may be applied to further improve the fitting quality and accuracy of the global point cloud.
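Of the fusion methods listed above, voxel-grid filtering is the simplest to illustrate: every point falling in the same cubic voxel is replaced by the voxel centroid. A minimal numpy sketch (a library such as Open3D or PCL would be used in practice):

```python
import numpy as np

def voxel_grid_filter(points, voxel_size):
    """Fuse/downsample a point cloud: all points falling in the same
    cubic voxel of edge length voxel_size are replaced by their centroid."""
    points = np.asarray(points, dtype=float)
    # Integer voxel index of every point
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    counts = np.bincount(inverse).astype(float)
    out = np.zeros((counts.size, 3))
    for dim in range(3):  # centroid per voxel, one axis at a time
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out
```

Besides merging overlapping frames, this also thins redundant points where consecutive scan lines overlap.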
Fig. 7 shows three-dimensional profile information of the surface of a workpiece to be processed according to an embodiment of the present application, i.e., the coordinate information along the X/Y/Z axes at different positions on the surface of the workpiece 130 to be processed. For example, the 3D camera 120 scans the workpiece 130 to be processed along the preset scan path to obtain the three-dimensional profile information of its global point cloud, providing basic data for the subsequent laser processing.
(2) Such as the stationary 3D camera 220 shown in fig. 2.
Illustratively, the 3D camera 220 acquires a global point cloud of the workpiece 230 to be processed, including: the 3D camera 220 collects a frame of point cloud data, which is a global point cloud.
It should be understood that the two manners above are examples given for ease of understanding and should not constitute any limitation on the technical solution of the present application. Optionally, the manner of acquiring and processing the point cloud of the workpiece 130/230 to be processed may be selected and adjusted according to the actual situation, and the application is not limited in this regard.
S420, processing the global point cloud and the preset pattern to be processed to obtain a three-dimensional processing track.
Illustratively, the upper computer combines the global point cloud with a preset pattern to be processed, and a three-dimensional processing track can be obtained.
S430, determining the height of the laser 110/210 relative to the workpiece 130/230 to be processed, or the focusing distance of the laser 110/210, according to the coordinate information of the three-dimensional processing track and a mapping relation. The mapping relation is a pre-established mapping between the coordinate information of the three-dimensional processing track and the height of the laser relative to the workpiece to be processed or the focusing distance of the laser.
Alternatively, the representation of the mapping relationship may exist in forms of a table, a function, a text, a character string, or the like, which is not limited in the present application. For example, the upper computer searches the height of the laser 110/210 with respect to the workpiece 130/230 to be processed or the focusing distance of the laser 110/210 from a preset table based on the coordinate information of the three-dimensional processing track.
First, the pattern to be processed is imported in advance into the upper-computer software of the laser processing apparatus 100/200, and a corresponding original processing track is generated from the path information in the graphic file, as shown in fig. 8. Each point of the two-dimensional track may be denoted [x_i, y_i]; the path defines the route and sequence in which the laser processing apparatus 100/200 moves during processing, and may further include laser power, working speed, scanning mode, and the like. These point coordinates contain no Z-axis information, or the Z-axis values of all points are identical. The pattern to be processed is usually in a vector graphics format; common formats include, but are not limited to, Scalable Vector Graphics (SVG), Drawing Exchange Format (DXF), and Adobe Illustrator (AI).
Next, the global point cloud is combined with the original processing track shown in fig. 8, so that the processing track gains Z-axis height information and becomes a three-dimensional processing track, as shown in fig. 9. Each point of the three-dimensional processing track may be represented as a three-dimensional coordinate [x_i, y_i, z_i], i.e., the three-dimensional coordinate of the pattern to be processed mapped onto the surface of the workpiece 130/230 to be processed.
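One simple way to attach z_i to each 2D track point is a nearest-neighbor lookup in the global point cloud. The sketch below is hypothetical (the source does not specify the combination method), and a KD-tree would replace the brute-force search in practice:

```python
import numpy as np

def lift_track_to_3d(track_xy, cloud):
    """Assign each 2D track point [x_i, y_i] the Z value of the
    global-cloud point nearest to it in the XY plane, producing the
    three-dimensional track points [x_i, y_i, z_i]."""
    cloud = np.asarray(cloud, dtype=float)
    track3d = np.empty((len(track_xy), 3))
    for i, (x, y) in enumerate(track_xy):
        # Squared XY distance from this track point to every cloud point
        d2 = (cloud[:, 0] - x) ** 2 + (cloud[:, 1] - y) ** 2
        track3d[i] = (x, y, cloud[np.argmin(d2), 2])
    return track3d
```

Interpolating between several neighbors instead of taking a single nearest point would give a smoother height profile.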
Then, the height of the laser 110/210 relative to the workpiece 130/230 to be processed, or the focusing distance of the laser 110/210, is determined from a pre-established lookup table mapping z_i to that height or focusing distance. The lookup table expresses that when a certain area of the workpiece 130/230 to be processed is at distance z_i from the working platform, setting the height of the laser 110/210 relative to the workpiece to Δh_i, or its focusing distance to Δf_i, lets the laser 110/210 achieve the best processing effect on that area. In other words, given the known height z_i in the three-dimensional processing track, the height value Δh_i or the focusing distance Δf_i of the laser 110/210 can be looked up in the table.
Further, the position and/or focus of the laser 110/210 is adjusted based on the obtained height value Δh_i or focusing distance Δf_i, so that the laser 110/210 is located at the optimal position for processing the workpiece 130/230 to be processed, yielding a target object with the best processing effect. For example, when the laser 110/210 moves to the coordinates [x_i, y_i], the height value or focusing distance to adjust can be found from the z_i value of [x_i, y_i, z_i] in the three-dimensional processing track; the upper computer then moves the laser 110/210 by the height Δh_i, or adjusts its focal length by Δf_i, ensuring that the focused spot of the laser 110/210 always lies on the surface of the workpiece 130/230 to be processed.
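The table lookup described above can be sketched as interpolation over a calibration table; the table values below are invented purely for illustration and would come from calibrating the actual laser:

```python
import numpy as np

# Hypothetical calibration table: surface height z_i of a workpiece
# region (mm) versus the focusing-distance adjustment Δf_i (mm) that
# keeps the laser spot focused on that region.
Z_TABLE = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
DF_TABLE = np.array([0.0, -5.0, -10.0, -15.0, -20.0])

def focus_offset(z):
    """Look up (with linear interpolation) the focusing adjustment Δf
    for a height z taken from the three-dimensional processing track."""
    return float(np.interp(z, Z_TABLE, DF_TABLE))
```

Interpolation lets the table stay small while still covering every z_i that appears in the track; the same scheme applies to the height adjustment Δh_i.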
S440, controlling the laser 110/210 to emit laser beams to process the workpiece 130/230 to be processed according to the three-dimensional processing track, the height of the laser 110/210 relative to the workpiece 130/230 to be processed or the focusing distance of the laser 110/210.
It should be appreciated that once the three-dimensional processing trajectory of the workpiece 130/230 to be processed is generated (as shown in fig. 9), it may be sent to the laser processing apparatus 100/200 for actual processing operations. The laser processing apparatus 100/200 may perform lateral movement of the laser 110/210 according to a three-dimensional processing track, and control a longitudinal distance of the laser 110/210 according to a height of the laser 110/210 relative to the workpiece 130/230 to be processed or a focusing distance of the laser 110/210 corresponding to the three-dimensional processing track, so as to complete processing of the workpiece 130/230 to be processed.
There are various ways in which the laser processing apparatus 100/200 controls the longitudinal distance of the laser 110/210 from the surface of the workpiece 130/230 to be processed, including the following implementations.
Mode one:
in one example, as shown in FIG. 10, the laser 110/210 includes a laser light source and a focusing lens that focuses the laser beam emitted by the source, and the focal position of the beam can be changed by adjusting the focusing lens of the laser 110/210. In general, moving the focusing lens closer to the laser source moves the beam focus farther away, while moving the lens away from the source brings the focus closer, thereby controlling the longitudinal distance between the laser 110/210 and the surface of the workpiece 130/230 to be processed.
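This behavior follows the thin-lens relation 1/f = 1/u + 1/v, where u is the source-to-lens distance and v the lens-to-focus distance. A quick numeric sketch of this idealized model (real focusing optics are more complex):

```python
def image_distance(f, u):
    """Thin-lens model: distance v at which light from a source at object
    distance u comes to focus through a lens of focal length f.
    From 1/f = 1/u + 1/v it follows that v = f*u / (u - f)."""
    return f * u / (u - f)
```

For f = 50 mm, moving the lens from u = 100 mm to u = 75 mm from the source (i.e., closer to it) pushes the focus from v = 100 mm out to v = 150 mm, matching the behavior described above.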
In another example, the laser 110/210 includes a laser light source and a zoom lens, such as a liquid lens or a metalens (superlens). Because the focal length of the zoom lens is variable, the laser beam emitted by the source can be focused onto the surface of the workpiece 130/230 to be processed by changing that focal length, thereby controlling the longitudinal distance between the laser 110/210 and the workpiece surface.
Mode two:
in one example, as shown in FIG. 11, the height of the laser 110/210 in the Z-axis may be adjusted by a Z-axis drive shaft to control the longitudinal distance of the laser 110/210 from the surface of the workpiece 130/230 to be processed.
Mode three:
in one example, as shown in FIG. 12, the distance of the laser focus in the Z-axis relative to the work platform may be varied by adjusting the height of the liftable work platform on which the work piece 130/230 to be processed is placed. For example, moving the work platform upward (from position 2 to position 1) brings the focal point toward the work platform, while moving the work platform downward (from position 1 to position 2) brings the focal point away from the work platform, thereby controlling the longitudinal distance of the laser 110/210 from the surface of the workpiece 130/230 to be processed.
It should be understood that the above-provided implementation is only an example given for ease of understanding and should not constitute any limitation on the technical solution of the present application.
In summary, to ensure that the laser focus is accurate when the laser processing equipment processes objects with varying surface heights, a low-cost three-dimensional scanning scheme is provided. A 3D camera efficiently completes the three-dimensional scanning and reconstruction of the surface profile of the workpiece to be processed, obtaining the three-dimensional profile information of the surface in advance; based on this information, the position and focal length of the laser can be adjusted in real time during processing to maintain the proper focus position and processing depth, greatly improving the efficiency and effect of laser processing.
The laser control method in the embodiment of the present application is described in detail above with reference to fig. 1 to 12, and the laser control apparatus and the laser processing device in the embodiment of the present application will be described below with reference to fig. 13 and 14. It will be appreciated that the description of the apparatus embodiments corresponds to the description of the method embodiments, and that parts not described in detail may therefore be referred to the previous method embodiments.
Fig. 13 is a schematic block diagram of a laser control apparatus according to an embodiment of the present application. As shown in fig. 13, the apparatus 1000 may include a processing unit 1020 and a control unit 1010, where the processing unit 1020 is configured to perform data processing.
In one possible design, the apparatus 1000 may implement steps or processes corresponding to those performed by the laser processing device 100/200 in the above method embodiments, where the processing unit 1020 is configured to perform operations related to processing by the laser processing device 100/200 in the above method embodiments, and the control unit 1010 is configured to perform operations related to transceiving by the laser processing device 100/200 in the above method embodiments.
The control unit 1010 is configured to control the 3D camera to scan the workpiece to be processed according to a preset scan path, so as to obtain a global point cloud of the workpiece to be processed;
The processing unit 1020 is configured to process the global point cloud and a preset pattern to be processed to obtain a three-dimensional processing track; determining the height of the laser relative to the workpiece to be processed or the focusing distance of the laser according to the coordinate information of the three-dimensional processing track and a mapping relation, wherein the mapping relation is used for representing the mapping relation between the pre-established three-dimensional processing track and the height of the laser relative to the workpiece to be processed or the focusing distance of the laser;
the control unit 1010 is further configured to control the laser to emit a laser beam to process the workpiece to be processed according to the three-dimensional processing track and the height of the laser relative to the workpiece to be processed or the focusing distance of the laser.
It should be understood that the apparatus 1000 herein is embodied in the form of functional units. The term "unit" herein may refer to an application specific integrated circuit (application specific integrated circuit, ASIC), an electronic circuit, a processor (e.g., a shared, dedicated, or group processor, etc.) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality.
The functions may be realized by hardware, or may be realized by hardware executing corresponding software. The hardware or software comprises one or more modules corresponding to the functions; for example, the processing unit or the like may be replaced by a processor to perform the processing operations in the above-described method embodiments.
Further, the processing unit may be a processing circuit. In an embodiment of the present application, the apparatus in fig. 13 may be the laser processing apparatus in the foregoing embodiments, or may be a chip or a chip system, for example, a system on chip (SoC), where the processing unit is an integrated processor, microprocessor, or integrated circuit on the chip. This is not limited herein.
Fig. 14 is a schematic block diagram of a laser processing apparatus 2000 provided by an embodiment of the present application. As shown in fig. 14, the laser processing apparatus 2000 includes a 3D camera 2010, a working platform 2020, a laser 2030, and an upper computer 2040, where a workpiece to be processed is placed on the working platform 2020. The laser 2030 may be a single-line, multi-line, single-point, or multi-point laser light source, for example a tunable continuous line laser capable of continuously outputting laser light at different stable powers and frequencies, or a tunable pulse line laser capable of outputting pulse laser light at different stable powers and frequencies, which is not particularly limited in the present application.
Illustratively, the laser is used for emitting a laser beam to the workpiece to be processed so as to process the workpiece to be processed; the 3D camera is used for scanning the workpiece to be processed according to a preset scanning path to obtain a global point cloud of the workpiece to be processed; and the upper computer is used for processing the global point cloud and a preset pattern to be processed to obtain a three-dimensional processing track; determining the height of the laser relative to the workpiece to be processed or the focusing distance of the laser according to the coordinate information of the three-dimensional processing track and a mapping relation, wherein the mapping relation represents a pre-established correspondence between the three-dimensional processing track and the height of the laser relative to the workpiece to be processed or the focusing distance of the laser; and controlling the laser to emit a laser beam to process the workpiece to be processed according to the three-dimensional processing track and the height of the laser relative to the workpiece to be processed or the focusing distance of the laser.
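The end-to-end workflow above (scan, fuse, derive the track, map each point to a laser height) could be sketched as follows; all helper names, the nearest-neighbour height lookup, and the 50 mm stand-off are illustrative assumptions, not the patent's implementation:

```python
# Hypothetical pipeline sketch: fuse per-frame local clouds into a
# global point cloud, attach a surface height to each pattern point
# to form the 3D processing track, then map every track point to a
# laser height via the mapping relation.
def surface_z(cloud, x, y):
    # Nearest-neighbour surface-height lookup; assumes a dense cloud.
    return min(cloud, key=lambda p: (p[0] - x) ** 2 + (p[1] - y) ** 2)[2]


def run_job(frames, pattern, height_map):
    # 1. Global point cloud from the per-frame local point clouds.
    global_cloud = [pt for frame in frames for pt in frame]
    # 2. Three-dimensional processing track: pattern (x, y) plus the
    #    surface height recovered from the global point cloud.
    track = [(x, y, surface_z(global_cloud, x, y)) for (x, y) in pattern]
    # 3. Laser height per track point via the mapping relation.
    return [(x, y, z, height_map(z)) for (x, y, z) in track]


frames = [[(0.0, 0.0, 1.0)], [(1.0, 0.0, 2.0)]]
result = run_job(frames, [(0.0, 0.0), (1.0, 0.0)], lambda z: 50.0 + z)
print(result)  # → [(0.0, 0.0, 1.0, 51.0), (1.0, 0.0, 2.0, 52.0)]
```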
In one embodiment, the upper computer 2040 includes a processor for sending control signals to control the various components and for executing the laser control method provided by the embodiments of the present application. Optionally, as shown in fig. 14, the upper computer 2040 may further include a memory, and the upper computer 2040 may call and run a laser control program from the memory to implement the method according to the embodiments of the present application. The memory may be a separate device independent of the processor or may be integrated in the upper computer 2040.
Optionally, the memory may include read-only memory and random access memory, and provide instructions and data to the processor. A portion of the memory may also include non-volatile random access memory; for example, the memory may also store information about the device type. The upper computer 2040 may be used to execute instructions stored in the memory, and when the upper computer 2040 executes those instructions, it performs the various steps and/or processes of the laser control method described above.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory, and the processor reads the information in the memory and, in combination with its hardware, performs the steps of the above method. To avoid repetition, a detailed description is not provided herein.
It should be noted that the processor in the embodiments of the present application may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above method embodiments may be implemented by integrated logic circuits of hardware in a processor or instructions in software form. The processor may be a general purpose processor, a digital signal processor, an application specific integrated circuit, a field programmable gate array or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component. The processor in the embodiments of the present application may implement or execute the methods, steps and logic blocks disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly in the execution of a hardware decoding processor, or in the execution of a combination of hardware and software modules in a decoding processor. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in a memory, and the processor reads the information in the memory and, in combination with its hardware, performs the steps of the above method.
It will be appreciated that the memory in embodiments of the application may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be random access memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
Optionally, an embodiment of the present application further provides a chip comprising a processor, where the processor can call and run a computer program from a memory to implement the method in the embodiments of the present application.
Optionally, an embodiment of the present application further provides a computer-readable storage medium storing a computer program, where the computer program causes a computer to execute the method in the embodiments of the present application.
Optionally, an embodiment of the present application further provides a computer program product comprising computer program instructions, where the computer program instructions cause a computer to perform the method in the embodiments of the present application.
Optionally, an embodiment of the present application further provides a computer program, where the computer program causes a computer to perform the method in the embodiments of the present application.
While the application has been described with reference to a preferred embodiment, various modifications may be made and equivalents may be substituted for elements thereof without departing from the scope of the application. In particular, the technical features mentioned in the respective embodiments may be combined in any manner as long as there is no structural conflict. The present application is not limited to the specific embodiments disclosed herein, but encompasses all technical solutions falling within the scope of the claims.

Claims (11)

1. A laser control method, applied to a laser processing apparatus including a 3D camera and a laser, for processing a workpiece to be processed, wherein the method comprises:
controlling the 3D camera to scan the workpiece to be processed according to a preset scanning path to obtain a global point cloud of the workpiece to be processed;
processing the global point cloud and a preset pattern to be processed to obtain a three-dimensional processing track;
determining the height of the laser relative to the workpiece to be processed or the focusing distance of the laser according to the coordinate information of the three-dimensional processing track and a mapping relation, wherein the mapping relation represents a pre-established correspondence between the three-dimensional processing track and the height of the laser relative to the workpiece to be processed or the focusing distance of the laser;
and controlling the laser to emit a laser beam to process the workpiece to be processed according to the three-dimensional processing track and the height of the laser relative to the workpiece to be processed or the focusing distance of the laser.
2. The method of claim 1, wherein scanning the workpiece to be processed according to a preset scanning path to obtain a global point cloud of the workpiece to be processed comprises:
acquiring multi-frame local point clouds of the workpiece to be processed;
and fusing the multi-frame local point clouds to obtain the global point cloud of the workpiece to be processed.
3. The method according to claim 2, wherein the fusing the multi-frame local point clouds to obtain the global point cloud of the workpiece to be processed comprises:
preprocessing each frame of local point cloud data in the multi-frame local point cloud;
and aligning the preprocessed local point cloud data of each frame with a world coordinate system, and fusing the aligned local point cloud data of each frame by using a fusion algorithm to obtain the global point cloud.
4. A method according to any one of claims 1 to 3, wherein the processing the global point cloud and a preset pattern to be processed to obtain a three-dimensional processing track includes:
generating an original processing track of the laser processing apparatus according to the pattern to be processed and path information in a pre-imported pattern file;
and merging the global point cloud with the original processing track to obtain the three-dimensional processing track.
5. The method according to any one of claims 1 to 4, further comprising:
and adjusting the position and focus of the laser based on the height of the laser relative to the workpiece to be processed or the focusing distance of the laser, so that the focusing light spot of the laser beam is always positioned on the surface of the workpiece to be processed.
6. The method according to claim 1 or 2, wherein the controlling the laser to process the workpiece to be processed according to the three-dimensional processing track and the height of the laser relative to the workpiece to be processed or the focusing distance of the laser comprises:
controlling the transverse movement of the laser according to the three-dimensional processing track;
and controlling the longitudinal distance of the laser according to the height of the laser relative to the workpiece to be processed or the focusing distance of the laser so as to finish the processing of the workpiece to be processed.
7. A laser processing apparatus for processing a workpiece to be processed, wherein the laser processing apparatus comprises a working platform, a laser, a 3D camera, and an upper computer, the workpiece to be processed being placed on the working platform, and wherein:
the laser is used for emitting laser beams to a workpiece to be processed so as to process the workpiece to be processed;
the 3D camera is used for scanning the workpiece to be processed according to a preset scanning path to obtain a global point cloud of the workpiece to be processed;
the upper computer is used for processing the global point cloud and a preset pattern to be processed to obtain a three-dimensional processing track; determining the height of the laser relative to the workpiece to be processed or the focusing distance of the laser according to the coordinate information of the three-dimensional processing track and a mapping relation, wherein the mapping relation represents a pre-established correspondence between the three-dimensional processing track and the height of the laser relative to the workpiece to be processed or the focusing distance of the laser; and controlling the laser to emit a laser beam to process the workpiece to be processed according to the three-dimensional processing track and the height of the laser relative to the workpiece to be processed or the focusing distance of the laser.
8. The laser processing apparatus according to claim 7, wherein,
the 3D camera is fixed to the laser and moves with the laser along the X/Y/Z axes of the coordinate system; or,
the 3D camera is fixedly arranged at the top or at the side of the laser processing equipment.
9. The laser processing apparatus according to claim 7 or 8, wherein,
the laser comprises a laser light source and a focusing lens; the focusing lens is used for focusing the laser beam emitted by the laser light source so as to change the focusing point position of the laser beam; or,
the laser comprises a laser light source and a zoom lens, wherein the focal length of the zoom lens is variable; the zoom lens is used for focusing the laser beam emitted by the laser light source so as to enable the laser beam to be focused on the surface of the workpiece to be processed.
10. The laser processing apparatus according to claim 7 or 8, wherein,
the upper computer is also used for adjusting the height of the laser on the Z axis through a Z axis transmission shaft so as to change the distance between the laser beam and the surface of the workpiece to be processed; or,
the upper computer is also used for changing the distance between the focal point of the laser beam and the working platform on the Z axis by adjusting the height of the working platform.
11. A computer readable storage medium for storing a computer program which, when run on a computer, causes the computer to perform the method of any one of claims 1 to 6.
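The multi-frame fusion recited in claims 2 and 3 can be sketched as follows. This is a minimal illustration under assumed names: a real system would apply full 6-DoF camera poses and a registration/fusion algorithm such as ICP or voxel fusion, whereas this sketch uses pure translations for brevity.

```python
# Hypothetical sketch of claims 2-3: preprocess each local frame,
# align it to the world coordinate system using the known camera
# pose, and merge the aligned frames into the global point cloud.
def fuse_frames(frames, poses):
    """frames: list of local point clouds [(x, y, z), ...];
    poses: per-frame camera translations in world coordinates
    (rotation omitted for brevity in this sketch)."""
    global_cloud = []
    for frame, (dx, dy, dz) in zip(frames, poses):
        # Preprocess: drop invalid points (e.g. zero-depth returns).
        cleaned = [p for p in frame if p[2] != 0.0]
        # Align to the world coordinate system, then merge.
        global_cloud.extend((x + dx, y + dy, z + dz) for (x, y, z) in cleaned)
    return global_cloud


print(fuse_frames([[(0.0, 0.0, 1.0), (0.0, 0.0, 0.0)]], [(1.0, 0.0, 0.0)]))
# → [(1.0, 0.0, 1.0)]
```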
CN202310917023.8A 2023-07-24 2023-07-24 Laser control method and laser processing apparatus Pending CN116833550A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310917023.8A CN116833550A (en) 2023-07-24 2023-07-24 Laser control method and laser processing apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310917023.8A CN116833550A (en) 2023-07-24 2023-07-24 Laser control method and laser processing apparatus

Publications (1)

Publication Number Publication Date
CN116833550A true CN116833550A (en) 2023-10-03

Family

ID=88163398

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310917023.8A Pending CN116833550A (en) 2023-07-24 2023-07-24 Laser control method and laser processing apparatus

Country Status (1)

Country Link
CN (1) CN116833550A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117784088A * 2024-01-30 2024-03-29 荣耀终端有限公司 Laser scanning device, system, control method and storage medium
CN118314138A * 2024-06-07 2024-07-09 深圳市牧激科技有限公司 Laser processing method and system based on machine vision
CN118314138B * 2024-06-07 2024-08-06 深圳市牧激科技有限公司 Laser processing method and system based on machine vision

Similar Documents

Publication Publication Date Title
CN116833550A (en) Laser control method and laser processing apparatus
EP2568253B1 (en) Structured-light measuring method and system
US8917942B2 (en) Information processing apparatus, information processing method, and program
CN106296716A (en) The power regulating method of light source, depth measurement method and device
DE102013105828A1 (en) Structured light contour sensing system for measuring contour of surface has control module to control micro electro mechanical system (MEMS) mirrors based on focus quality to maintain Scheimpflug tilt condition between lens and image plane
EP3789139B1 (en) Three-dimensional additive manufacturing method and device with detection of defects with backscattered electrons
US4760269A (en) Method and apparatus for measuring distance to an object
CN114111624B (en) Handheld three-dimensional scanning method, equipment and medium with mark point projection device
CN116817796B (en) Method and device for measuring precision parameters of curved surface workpiece based on double telecentric lenses
CN111932517B (en) Contour mapping method and device for residual plate, electronic equipment and storage medium
CN116839473A (en) Weld positioning and size calculating method and device, storage medium and electronic equipment
JP5136108B2 (en) 3D shape measuring method and 3D shape measuring apparatus
JP2007163429A (en) Three-dimensional distance measuring method, and instrument therefor
JP2680460B2 (en) Angle measuring device for bending machine
JP2017120515A (en) Control data generation method and control data generation device
US11933597B2 (en) System and method for optical object coordinate determination
CN109693035A (en) Control device, laser processing and the laser machine of laser machine
CN112361982B (en) Method and system for extracting three-dimensional data of large-breadth workpiece
CN112710662B (en) Generation method and device, generation system and storage medium
JP6944891B2 (en) How to identify the position in 3D space
JP2020046381A (en) Measuring apparatus, laser marking device and measuring method
JP7096415B1 (en) Laminated modeling equipment and manufacturing method of laminated modeled products
US20240255278A1 (en) Shape measuring device
KR20020068725A (en) Sensor for acquiring 3d image data
JPH0560518A (en) Three-dimensional coordinate measurement device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination