WO2022237544A1 - Trajectory generation method and apparatus, electronic device, and storage medium

Trajectory generation method and apparatus, electronic device, and storage medium

Info

Publication number
WO2022237544A1
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional
trajectory
point
item
points
Prior art date
Application number
PCT/CN2022/089578
Other languages
English (en)
Chinese (zh)
Inventor
李辉
魏海永
丁有爽
邵天兰
Original Assignee
梅卡曼德(北京)机器人科技有限公司
Priority date
Filing date
Publication date
Priority claimed from CN202110511343.4A (CN113199479B)
Priority claimed from CN202110511347.2A (CN113189934A)
Priority claimed from CN202110511370.1A (CN113199480B)
Application filed by 梅卡曼德(北京)机器人科技有限公司
Publication of WO2022237544A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls

Definitions

  • the present disclosure relates to the field of image processing, and more specifically, to a trajectory generation method, device, electronic equipment, and storage medium.
  • Intelligent industrial robots can already replace humans in various fields, such as spraying, grabbing, and handling.
  • Embodiments of the present disclosure provide a trajectory generation method, device, electronic equipment, and storage medium.
  • The trajectory generation method can generate three-dimensional trajectory points for an item, so as to at least solve the problem of inaccurate spraying trajectories.
  • The trajectory generation method includes: generating a two-dimensional image of the item based on a three-dimensional point cloud of the item; generating two-dimensional trajectory points based on the two-dimensional image of the item; acquiring height information corresponding to the two-dimensional trajectory points; and generating three-dimensional trajectory points based on the two-dimensional trajectory points and their height information.
  • The trajectory generation device includes: a two-dimensional image generation module, configured to generate a two-dimensional image of an item based on a three-dimensional point cloud of the item; a two-dimensional trajectory point generation module, configured to generate two-dimensional trajectory points based on the two-dimensional image of the item; a height information acquisition module, configured to acquire height information corresponding to the two-dimensional trajectory points; and a three-dimensional trajectory point generation module, configured to generate three-dimensional trajectory points based on the two-dimensional trajectory points and the acquired height information.
  • An embodiment of the present disclosure provides an electronic device, including: a memory and a processor;
  • the memory is used to store computer instructions; the processor is used to run the computer instructions stored in the memory to implement the trajectory generation method provided by any of the above implementations.
  • An embodiment of the present disclosure provides a computer-readable storage medium, on which a computer program is stored, and the computer program is executed by a processor to implement the trajectory generation method provided in any of the above-mentioned embodiments.
  • An embodiment of the present disclosure provides a 3D camera, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the trajectory generation method provided by any of the above embodiments is implemented.
  • Embodiments of the present disclosure provide a computer program product, including a computer program; when the computer program is executed by a processor, the trajectory generation method provided in any of the above embodiments is implemented.
  • The trajectory generation method generates a two-dimensional image of the item based on the three-dimensional point cloud of the item, generates two-dimensional trajectory points based on the two-dimensional image, and then combines the two-dimensional trajectory points with the corresponding height information to generate three-dimensional trajectory points.
  • In this way, the three-dimensional trajectory points for the operation of an intelligent industrial robot can be obtained from the three-dimensional point cloud of the object to be sprayed combined with the corresponding height information, so that the robot can spray based on the three-dimensional trajectory points.
  • Such operation not only greatly improves the spraying accuracy for the items to be sprayed, but also effectively avoids collisions between the intelligent industrial robot and objects when spraying in certain specific scenarios.
  • FIG. 1 is a first schematic flowchart of a trajectory generation method according to an embodiment of the present disclosure.
  • FIG. 2 is a second schematic flowchart of a trajectory generation method according to an embodiment of the present disclosure.
  • FIG. 3 is a schematic diagram of an object contour according to an embodiment of the present disclosure.
  • FIG. 4 is a third schematic flowchart of a trajectory generation method according to an embodiment of the present disclosure.
  • FIG. 5 is a schematic diagram of corner contour points of an object contour according to an embodiment of the present disclosure.
  • FIG. 6 is a schematic diagram of obtaining two-dimensional trajectory points of an item from contour points according to an embodiment of the present disclosure.
  • FIG. 7 is a fourth schematic flowchart of a trajectory generation method according to an embodiment of the present disclosure.
  • FIG. 8 is a schematic diagram of the circumscribed rectangle of a two-dimensional image of an item group according to an embodiment of the present disclosure.
  • FIG. 9 is a fifth schematic flowchart of a trajectory generation method according to an embodiment of the present disclosure.
  • FIG. 10 is a schematic diagram of obtaining two-dimensional trajectory points of an item from a circumscribed rectangle according to an embodiment of the present disclosure.
  • FIG. 11 is a schematic structural diagram of a trajectory generation device according to an embodiment of the present disclosure.
  • FIG. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
  • Robots can already replace humans in various fields, such as spraying, grabbing, and handling.
  • When using a robot to perform operations such as spraying, it is necessary to first determine the spraying trajectory.
  • The robot then performs the spraying operation based on the two-dimensional spraying trajectory.
  • However, the items that need to be sprayed are not always regular. If the spraying surface of the item is uneven, or when multiple items with different heights are stacked together, spraying along a two-dimensional trajectory at a fixed height will lead to abnormal situations such as uneven spraying or the nozzle colliding with the object.
  • Embodiments of the present disclosure provide a trajectory generation method, device, electronic equipment, and storage medium. After a two-dimensional image of an item is obtained from its three-dimensional point cloud, the two-dimensional trajectory points of the robot are first obtained based on the two-dimensional image, the height information corresponding to the trajectory points is then acquired, and the two-dimensional trajectory points are combined with their height information to obtain three-dimensional trajectory points.
  • The robot performs spraying operations based on the three-dimensional trajectory points, which can greatly improve the spraying accuracy for objects with large height differences and can effectively avoid collisions between the robot and objects during spraying.
  • Fig. 1 is a schematic flowchart of a trajectory generation method according to an embodiment of the present disclosure; please refer to Fig. 1. The trajectory generation method according to an embodiment of this disclosure includes:
  • Step 101 generate a 2D image of the item based on the 3D point cloud of the item.
  • the three-dimensional point cloud refers to the collection of point data on the surface of the three-dimensional appearance of the object, including the coordinate values of each point on the X, Y, and Z axes in space, and the normal direction of each point on the X, Y, and Z axes.
  • a two-dimensional image refers to a planar image of an item.
  • the three-dimensional image of the item can be acquired through the visual sensor, and then the three-dimensional point cloud of the corresponding item can be acquired according to the visual algorithm.
  • The visual sensor may be a 3D industrial camera, a CMOS (complementary metal-oxide-semiconductor) image sensor, a CCD (charge-coupled device) image sensor, or the like.
  • the vision algorithm may be a binocular stereo vision algorithm, a point cloud normal calculation algorithm, etc., and the type of the vision algorithm is not particularly limited in this embodiment.
  • A 3D industrial camera is generally equipped with two lenses, which capture the object from different angles; after processing, a three-dimensional image of the object can be obtained. The object to be captured is placed under the 3D industrial camera and the two lenses shoot at the same time.
  • The binocular stereo vision algorithm is then used to calculate the X, Y, and Z coordinates of each point of the object, and the point cloud normal calculation algorithm is used to calculate the normal direction at each coordinate; these are then converted into the three-dimensional point cloud of the object.
  • The point cloud of the object can be corrected from the camera coordinate system at the time of shooting to a coordinate system whose Z axis is perpendicular to the surface to be sprayed; that is to say, based on the shooting parameters of the 3D industrial camera, the captured 3D point cloud of the item is "aligned" so that the surface to be sprayed faces the camera.
  • components such as laser detectors, visible light detectors such as LEDs, infrared detectors, and radar detectors may also be used to obtain three-dimensional point cloud information, and the present disclosure does not limit the specific implementation manner.
  • the orthographic projection of the 3D point cloud of the item is mapped onto a 2D plane.
  • the 3D point cloud of the object is mapped to the 2D plane along the vertical direction of the object surface to obtain a 2D image.
  • Specifically, a two-dimensional plane may first be selected in space, the two-dimensional plane being perpendicular to the vertical direction; the three-dimensional point cloud is then mapped onto this plane to obtain the two-dimensional image.
  • The two-dimensional image can be a two-dimensional color image, or a binary two-dimensional image containing only two values, that is, every pixel of the image is either 0 or a non-zero value such as 255 (a pixel value of 0 is black and a value of 255 is white). A two-dimensional color image can also be converted into such a binary image after it is obtained.
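  • For illustration only (this sketch is not part of the original disclosure), the projection and binarization described above could be implemented as follows; the function name, the pixel_size parameter, and the (N, 3) NumPy array layout are assumptions made for the example.

```python
import numpy as np

def point_cloud_to_binary_image(points, pixel_size=1.0):
    """Orthographically project an (N, 3) point cloud onto the XY plane and
    rasterize it into a binary image (255 where item points fall, 0 elsewhere)."""
    xy = points[:, :2]
    origin = xy.min(axis=0)
    cols, rows = ((xy - origin) / pixel_size).astype(int).T  # metric -> pixel indices
    image = np.zeros((rows.max() + 1, cols.max() + 1), dtype=np.uint8)
    image[rows, cols] = 255
    return image, origin

# Example usage with a synthetic point cloud (millimetre coordinates assumed):
# pts = np.random.rand(1000, 3) * [200.0, 100.0, 10.0]
# img, origin = point_cloud_to_binary_image(pts, pixel_size=1.0)
```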
  • Step 102 generating two-dimensional trajectory points based on the two-dimensional image of the item.
  • the two-dimensional track point is the track point on the two-dimensional plane where the object to be sprayed is sprayed, including the coordinate value and normal direction of each point on the X and Y axes.
  • the method of generating two-dimensional trajectory points is closely related to specific industrial scenarios, and different spraying requirements and scenarios will have different and distinctive two-dimensional trajectory point generation methods.
  • two-dimensional trajectory points can be generated according to the outline of the object in the two-dimensional image.
  • When the items to be sprayed form a group of multiple items stacked together, the two-dimensional trajectory points may be generated according to the area of the item group in the two-dimensional image.
  • the method for generating two-dimensional trajectory points is not limited in this embodiment, and any method for generating two-dimensional trajectory points can be applied to the method of this embodiment.
  • The number of two-dimensional trajectory points can be preset according to the actual situation. Generally speaking, the more two-dimensional trajectory points there are, the more closely the robot's spraying trajectory matches the ideal trajectory and, correspondingly, the higher the control complexity.
  • Either the total number of two-dimensional trajectory points or the interval between two-dimensional trajectory points can be preset.
  • If the interval is preset, the two-dimensional trajectory points are selected directly according to that interval; if the total number is preset, the interval is first calculated from the spraying trajectory and the total number of points, and the two-dimensional trajectory points are then selected according to the calculated interval.
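  • As a small illustrative sketch (the function name and the millimetre units are assumptions, not part of the disclosure), the interval can be derived from a preset total number of trajectory points as follows:

```python
def point_interval(trajectory_length, total_points):
    """Derive the spacing between 2D trajectory points when only the total
    number of points is preset (the alternative to presetting the interval)."""
    if total_points < 2:
        raise ValueError("need at least two trajectory points")
    return trajectory_length / (total_points - 1)

# e.g. a 900 mm spraying path sampled with 10 points -> 100 mm spacing
```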
  • Step 103 acquiring height information corresponding to the two-dimensional track point.
  • the height information refers to the coordinate value and normal direction of the two-dimensional trajectory point on the Z axis in the corresponding three-dimensional point cloud.
  • the depth map of the corresponding two-dimensional color image may be acquired.
  • the depth map contains the depth information of each pixel point, and the depth information represents the distance between the point and the camera.
  • the height of the pixel is the value of the pixel on the Z axis when the sprayed surface is used as the XY surface.
  • the depth information of the pixel is correlated with the height information of the pixel, and the height information of the pixel can be obtained based on the depth information .
  • the two-dimensional trajectory point is a point generated according to the two-dimensional image, and is usually located in the two-dimensional image.
  • For a two-dimensional trajectory point, the most relevant pixel of the item in the two-dimensional image can be found first, for example the pixel closest to the trajectory point; the height information of that pixel is then obtained and used as the height information of the two-dimensional trajectory point.
  • When the two-dimensional trajectory points are generated based on the contour of the two-dimensional image, the pixel on the contour most relevant to the two-dimensional trajectory point may likewise be found first.
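  • A minimal sketch of this nearest-pixel height lookup is given below; the parameter names (depth_map, item_mask, camera_to_surface) and the depth-to-height conversion are illustrative assumptions, since the disclosure only states that depth and height information are correlated.

```python
import numpy as np

def height_for_track_point(point_xy, depth_map, item_mask, camera_to_surface):
    """Find the item pixel closest to a 2D trajectory point and convert its
    depth (distance from the camera) into a height above the sprayed surface."""
    ys, xs = np.nonzero(item_mask)                      # pixels belonging to the item
    d2 = (xs - point_xy[0]) ** 2 + (ys - point_xy[1]) ** 2
    i = int(np.argmin(d2))                              # index of the closest item pixel
    depth = float(depth_map[ys[i], xs[i]])
    return camera_to_surface - depth                    # larger depth -> lower point
```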
  • Step 104 generating a three-dimensional trajectory point based on the two-dimensional trajectory point and the height information of the two-dimensional trajectory point.
  • the acquired height information of the pixel point of the item most related to the two-dimensional track point may be assigned to the two-dimensional track point, thereby converting the two-dimensional track point into a three-dimensional track point.
  • In some cases, the trajectory formed by the three-dimensional trajectory points is not smooth enough in height as a whole.
  • In that case, the three-dimensional trajectory points can be smoothed so that the entire set of three-dimensional trajectory points is smoothly connected in height.
  • For example, Gaussian filtering, bilateral filtering, or the like may be used for smoothing, so that the three-dimensional trajectory points are uniform and smooth; the present disclosure does not limit the specific implementation manner.
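  • As one hedged example of such smoothing (SciPy's one-dimensional Gaussian filter and the sigma value are choices made for this sketch; the disclosure does not prescribe a library or filter parameters):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def smooth_heights(track_points_3d, sigma=2.0):
    """Smooth only the Z (height) component of an ordered array of 3D trajectory
    points so that the resulting path has no abrupt height jumps."""
    pts = np.asarray(track_points_3d, dtype=float).copy()  # shape (N, 3): x, y, z
    pts[:, 2] = gaussian_filter1d(pts[:, 2], sigma=sigma)
    return pts
```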
  • Trajectory information is then generated; the trajectory information includes the three-dimensional trajectory points and is used to control the industrial robot to spray the article based on those points.
  • In this way, the robot can not only plan a planar trajectory in real time based on vision technology, but can also plan the movement path along the X, Y, and Z axes according to the recognized heights along the trajectory. Even an irregular line-shaped object extending in three directions, or an object with an uneven surface to be sprayed, can be sprayed well, avoiding uneven or missed spraying.
  • With the trajectory generation method of the embodiments of the present disclosure, after the two-dimensional image of the object is obtained from the three-dimensional point cloud, the trajectory points of the robot are first obtained based on the two-dimensional image, the height information corresponding to the trajectory points is then acquired, and the two-dimensional trajectory points are combined with that height information to obtain the three-dimensional trajectory points of the robot operation.
  • The robot performs spraying and other operations based on the three-dimensional trajectory points, which can greatly improve the spraying accuracy for objects to be sprayed that have large height differences.
  • The general trajectory generation method has been described above; several trajectory generation methods for actual industrial scenarios in which the objects to be sprayed have different shapes are introduced below.
  • In one scenario, the object to be sprayed is line-shaped, and the spraying process requires spraying along the line of the object.
  • In other words, the robot needs to move and spray along the line of the item.
  • FIG. 2 is a second schematic flow diagram of the trajectory generation method of the embodiment of the present disclosure.
  • this embodiment provides a trajectory generation method when the object to be sprayed is in the shape of a line.
  • Step 201 generate a 2D image of the item based on the 3D point cloud of the item.
  • The implementation of step 201 is similar to that of step 101 above and will not be repeated here.
  • Step 202 acquiring the outline of the item based on the two-dimensional image of the item.
  • the outline refers to the lines constituting the outer edge of the object in the two-dimensional image.
  • the outline of the article can be in any shape, such as regular circles, rectangles, etc., or irregular shapes, such as small parts of various shapes, steel pipes bent in different directions, etc.
  • contour analysis may be performed on the two-dimensional image of the object to obtain the contour of the object.
  • Contour analysis includes, but is not limited to, edge detection techniques and deep learning algorithms.
  • The edge of the item can be obtained by edge detection. An edge is a collection of pixels around the item whose gray levels change sharply, and it is the most basic feature of an image; edges exist between objects, the background, and regions, and are the most important basis for image segmentation.
  • The edge of the object is determined by extracting features of the discontinuous parts of the image, such as positions where the gray values of the pixels on the two sides differ significantly, or turning points where the gray value changes from small to large and back to small.
  • The edge of the item in the two-dimensional image is extracted by edge detection, and this edge is used as the outline of the item.
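  • A minimal contour-extraction sketch using OpenCV is shown below for illustration; taking the largest external contour as the item outline, and the OpenCV 4 return signature, are assumptions of the example rather than requirements of the disclosure.

```python
import cv2

def item_contour(binary_image):
    """Return the outer contour of the item in a binary image; the largest
    external contour is taken to be the item's outline."""
    contours, _ = cv2.findContours(binary_image, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    return max(contours, key=cv2.contourArea)           # (M, 1, 2) array of edge pixels
```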
  • In practice, the collected point cloud may have missing or even broken regions due to lighting or reflection problems of the objects themselves. Therefore, after the two-dimensional image of the item is obtained, the image can first be dilated and/or eroded so that it is complete and unbroken; it is even helpful to let the image become slightly wider, which facilitates the subsequent contour recognition and trajectory point generation steps.
  • Specifically, dilation can be performed on the image to fill in the missing and broken positions of the point cloud.
  • In the two-dimensional image, non-black pixels are pixels of the item; where there is no item, the pixels are black.
  • The dilation step is equivalent to whitening the neighborhood of every white pixel. Therefore, if the two-dimensional image has missing or broken regions, this operation fills them with colored pixels, and after this processing the two-dimensional image becomes complete, without missing or broken parts.
  • An erosion operation can then be performed on the dilated two-dimensional image: for example, for each black pixel in the image, a certain number of surrounding points, such as 8 to 25 points, can be set to 0. This operation is equivalent to painting black around every black pixel of the two-dimensional image.
  • As a result, points near the edge of the two-dimensional image are blackened, so the image becomes "thinner" as a whole, and the processed two-dimensional image is closer to the real object.
  • Consequently, the trajectory points generated from the two-dimensional image are more accurate.
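  • The dilate-then-erode cleanup can be sketched as follows (the kernel size and single iteration are illustrative assumptions; the combination amounts to a morphological closing):

```python
import cv2
import numpy as np

def close_gaps(binary_image, kernel_size=5):
    """Dilate then erode the binary image so that gaps caused by missing
    point-cloud data are filled and the image is then thinned back toward
    the real item outline."""
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    dilated = cv2.dilate(binary_image, kernel, iterations=1)
    return cv2.erode(dilated, kernel, iterations=1)
```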
  • Step 203 generating two-dimensional track points based on the outline of the item.
  • the two-dimensional trajectory point may be a contour point on the contour of the item.
  • contour points may be selected on the contour of the object as the two-dimensional trajectory points.
  • Several contour points can be selected on the outline of the item as the corresponding two-dimensional trajectory points according to a preset number of trajectory points, where a contour point is an element of the set of outline point data of the item in the two-dimensional plane; alternatively, the interval between trajectory points can be preset and a number of contour points selected on the outline of the object as the corresponding two-dimensional trajectory points.
  • There is no limitation on the manner of acquiring the two-dimensional trajectory points.
  • Fig. 3 shows the outline of an article in this embodiment.
  • the article is a steel pipe bent in different directions.
  • a plurality of contour points are obtained on the object contour, and these contour points are used as corresponding two-dimensional trajectory points.
  • Step S204 acquiring height information corresponding to the two-dimensional track point.
  • Step S205 generating a three-dimensional trajectory point based on the two-dimensional trajectory point and the acquired height information of the two-dimensional trajectory point.
  • The implementation of step 204 and step 205 is similar to that of step 103 and step 104 above and will not be repeated here.
  • With the trajectory generation method of this embodiment, the outline of the item is generated based on its two-dimensional image, two-dimensional trajectory points are generated based on the outline, and the two-dimensional trajectory points are then converted into the three-dimensional trajectory points of the object based on the corresponding height information. The robot performs spraying and other operations based on the three-dimensional trajectory points; since the spraying trajectory is obtained from the contour of the object, the accuracy of spraying line-shaped objects can be greatly improved.
  • FIG. 4 is a third schematic flowchart of a trajectory generation method according to an embodiment of the present disclosure.
  • this embodiment describes in detail the generation of two-dimensional trajectory points based on the outline of an object.
  • The corner contour points are the contour points at the two ends of the article's contour, with two corner contour points at each end.
  • the contour can be processed according to a corner detection algorithm to obtain corner contour points.
  • the corner detection algorithm includes but not limited to: corner detection based on a binary image, corner detection based on a contour curve, and the like.
  • Fig. 5 shows the corner contour points of an article contour according to this embodiment. As shown in Fig. 5, according to the contour of the article, a first corner contour point is selected at one end of the contour and a second corner contour point is selected at the other end of the contour.
  • The first corner contour point and the second corner contour point may be on the same long side of the object contour, or they may be on different long sides of the object contour.
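  • As a hedged sketch of such corner detection on the item's binary image (the disclosure mentions corner detection based on a binary image or a contour curve without fixing an algorithm; Shi-Tomasi corner detection and its parameters are assumptions of this example):

```python
import cv2
import numpy as np

def corner_contour_points(binary_image, max_corners=4):
    """Detect strong corners on the item's binary image; for an elongated item
    the strongest corners are expected at its two ends (two per end)."""
    corners = cv2.goodFeaturesToTrack(binary_image, maxCorners=max_corners,
                                      qualityLevel=0.1, minDistance=10)
    if corners is None:
        return np.empty((0, 2), dtype=int)
    return corners.reshape(-1, 2).astype(int)             # (x, y) pixel coordinates
```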
  • Contour points are then selected on the object contour at a preset spacing, with the first corner contour point as the starting point and the second corner contour point as the end point.
  • For example, the next contour point is acquired every 10 mm until the second corner contour point is reached.
  • the total number of contour points can also be preset, and then the contour point interval can be calculated according to the total number of contour points and the length from the first corner contour point to the second corner contour point, and then contour points can be obtained according to the interval.
  • the selected contour points can be directly used as two-dimensional trajectory points, or other points can be obtained based on the contour points as two-dimensional trajectory points.
  • This embodiment does not specifically limit the method of generating two-dimensional trajectory points based on contour points; a concrete example is given below.
  • In an example, the first corner contour point and the second corner contour point are on the same long side of the object contour, and the contour points selected on the object contour are contour points on that long side. For each selected contour point, the midpoint of the article's contour in the width direction at that contour point is determined, and the midpoint is used as a two-dimensional trajectory point of the article.
  • the midpoint of the outline of the item in the width direction at the outline point is determined based on the outline point, and the midpoint is used as the two-dimensional trajectory point of the item.
  • the midpoint of the profile in the width direction is used as a two-dimensional trajectory point.
  • accurate spraying can be performed to avoid deflection of spraying.
  • The midpoint can be determined in many ways, such as fitting the centerline of the contour and then, with the contour point as a reference, selecting the midpoint on the centerline, or obtaining the midpoint from the contour point by constructing a perpendicular line.
  • For the latter, the tangent a of the long side at contour point A can be calculated first, and then the perpendicular b to the tangent at point A can be calculated.
  • The line b intersects the two long sides of the item's outline at two intersection points; the midpoint C of the segment between these two intersection points is the midpoint to be obtained, and this midpoint is used as a two-dimensional trajectory point.
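  • The perpendicular construction could be sketched as follows; estimating the tangent from neighbouring contour points and marching along the normal through an item mask are illustrative implementation choices, not the disclosure's prescribed procedure.

```python
import numpy as np

def width_midpoint(contour, idx, item_mask, max_width=200):
    """Estimate the tangent at contour point A from its neighbours, march along
    the perpendicular (normal) across the item until the opposite long side is
    left, and return the midpoint C of that crossing segment."""
    pts = contour.reshape(-1, 2).astype(float)            # contour as returned by cv2.findContours
    a = pts[idx]
    tangent = pts[(idx + 3) % len(pts)] - pts[(idx - 3) % len(pts)]
    normal = np.array([-tangent[1], tangent[0]])
    normal /= np.linalg.norm(normal)

    def inside(p):
        x, y = int(round(p[0])), int(round(p[1]))
        return (0 <= y < item_mask.shape[0] and 0 <= x < item_mask.shape[1]
                and item_mask[y, x] > 0)

    if not inside(a + 2 * normal):                         # normal points outward: flip it
        normal = -normal
    b = a
    for step in range(1, max_width):                       # march across the item width
        p = a + step * normal
        if not inside(p):
            break
        b = p
    return (a + b) / 2.0                                    # midpoint C of segment AB
```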
  • The two-dimensional trajectory acquisition method provided by the present disclosure obtains the corresponding corner contour points from the contour of the object, then obtains contour points from the corner contour points, and obtains the two-dimensional trajectory points of the object from the contour points. This makes it possible to accurately obtain the two-dimensional trajectory points of a line-shaped object and improves the spraying accuracy in this specific scenario.
  • In another scenario, the items to be sprayed are multiple items arranged side by side, and the spraying process does not need to be precise but must cover the surface of every item without omission (that is, it does not matter if some spray lands between or outside the items, as long as the surface of each item is sprayed); for example, multiple steel pipes bent in different directions are placed together, and this pile of steel pipes needs to be sprayed.
  • Fig. 7 is a fourth schematic flowchart of the trajectory generation method of an embodiment of the present disclosure.
  • On the basis of the above embodiments, this embodiment provides a trajectory generation method for the case where the items to be sprayed form an item group composed of a plurality of items arranged side by side. The trajectory generation method in this scenario is described in detail below with reference to specific embodiments.
  • Step 701 generate a 2D image of the item based on the 3D point cloud of the item group.
  • An item group includes at least one item.
  • In this scenario, the sprayed area of the item is relatively large, that is, larger than a preset area, and the surface of the item is uneven.
  • the multiple items are arranged side by side, and the multiple items have different heights, for example, multiple steel pipes bent in different directions are placed together.
  • an item group composed of at least one item may be taken as a whole, and a two-dimensional image of the item may be generated based on a three-dimensional point cloud of the item group.
  • The specific implementation is similar to that of step 101 above and will not be repeated here.
  • Step 702 acquiring the bounding rectangle of the two-dimensional image of the item group.
  • the circumscribed rectangle refers to a rectangular area covering the two-dimensional image of the item group.
  • After the two-dimensional image of the item group is obtained, the two-dimensional images of the items in the group are treated as a whole; contour analysis and corner point detection are performed on the two-dimensional image of the item group to obtain the outermost pixels of the image, and the circumscribed rectangle is obtained from those outermost pixels. For a single item, the circumscribed rectangle of the item is obtained directly.
  • For example, the four outermost pixels a, b, c, and d of the two-dimensional image are acquired, the four corresponding tangent lines are determined from these outermost pixels, and the rectangle formed by the intersection of the four tangent lines is used as the circumscribed rectangle of the two-dimensional image.
  • Optionally, the circumscribed rectangle can be extended outward; the size of the extension can be set according to actual needs and is not limited in this embodiment.
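  • A sketch of computing such a circumscribed rectangle is given below; using OpenCV's rotated minimum-area rectangle instead of the tangent-line construction described above, and the optional margin parameter, are assumptions of the example.

```python
import cv2
import numpy as np

def group_bounding_rect(binary_image, margin=0.0):
    """Treat all items in the group as one blob and compute the rotated
    rectangle that encloses them; `margin` optionally widens the rectangle."""
    ys, xs = np.nonzero(binary_image)
    pts = np.column_stack([xs, ys]).astype(np.float32)
    (cx, cy), (w, h), angle = cv2.minAreaRect(pts)       # centre, size, rotation angle
    rect = ((cx, cy), (w + 2 * margin, h + 2 * margin), angle)
    return cv2.boxPoints(rect)                           # the 4 corner points of the rectangle
```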
  • Step 703 generating two-dimensional trajectory points based on the circumscribing rectangle.
  • For example, the circumscribed rectangle can be divided into several regions along the vertical or horizontal direction, and points can be selected on the boundaries of these regions as the corresponding two-dimensional trajectory points according to a preset number of trajectory points; alternatively, the interval between trajectory points can be preset and a number of points selected on the region boundaries as the corresponding two-dimensional trajectory points.
  • Step S704 acquiring height information corresponding to the two-dimensional track point.
  • Step S705 generating a three-dimensional trajectory point based on the two-dimensional trajectory point and the acquired height information of the two-dimensional trajectory point.
  • The implementation of step 704 and step 705 is similar to that of step 103 and step 104 above and will not be repeated here.
  • The trajectory generation method of this embodiment of the present disclosure generates the circumscribed rectangle of the item group based on the two-dimensional image of the item group, generates two-dimensional trajectory points based on the circumscribed rectangle, and then converts the two-dimensional trajectory points into the three-dimensional trajectory points of the item group based on the corresponding height information.
  • The robot performs spraying and other operations based on these three-dimensional trajectory points.
  • Since the spraying trajectory of the robot is obtained from the circumscribed rectangle of the item group, the spraying accuracy for the item group can be greatly improved.
  • FIG. 9 is a schematic flowchart of a trajectory generation method according to an embodiment of the present disclosure. On the basis of the above-mentioned embodiment in FIG. 7 , this embodiment describes in detail the generation of two-dimensional trajectory points based on a circumscribing rectangle of an item.
  • the width direction is the direction of the short side of the circumscribing rectangle
  • the length direction is the direction of the long side of the circumscribing rectangle.
  • the predetermined interval can be preset, and the predetermined interval can also be determined based on the width or length of the circumscribed rectangle. In this embodiment, there is no special limitation on the implementation manner of the predetermined interval.
  • the boundary line of the circumscribed rectangle is generated at predetermined intervals, and the boundary line and the two parallel sides of the circumscribed rectangle are vertically intersected respectively, that is, the boundary line divides the circumscribed rectangle into multiple small rectangles.
  • For example, a plurality of dividing lines are generated at an interval of, for example, 20 mm, which divide the circumscribed rectangle into multiple small rectangles.
  • the 2D trajectory point is the point selected on the dividing line.
  • Points are selected on the dividing lines as two-dimensional trajectory points, for example by selecting points on each dividing line at preset intervals, or by presetting the number of two-dimensional trajectory points on each dividing line and then determining their positions from that number.
  • Alternatively, starting from any corner of the circumscribed rectangle, with the corners of the circumscribed rectangle and the intersections of the dividing lines with the circumscribed rectangle as inflection points, the dividing lines and the two sides of the circumscribed rectangle parallel to the dividing lines are traversed in a zigzag, thereby generating the two-dimensional trajectory points.
  • the corner point is one of the four corner points of the circumscribed rectangle. Since the boundary line intersects the two parallel sides of the circumscribing rectangle vertically respectively, there are two intersection points between each boundary line and the circumscribing rectangle.
  • the dividing line intersects the two sides of the circumscribing rectangle, and the dividing line divides the circumscribing rectangle into multiple small rectangles, as shown in Figure 10, the corners of the circumscribing rectangle and the intersections of the dividing line and the rectangle are recorded as P0, P1, P2...Pn+1, the corner points and intersection points on the opposite side are marked as P0', P1', P2'...Pn+1', and all the above corner points and intersection points are regarded as the inflection points of the two-dimensional trajectory.
  • The spraying trajectory starts from P0 and moves toward P0'; after reaching P0' it turns right 90 degrees and moves toward P1'; after reaching P1' it turns left 90 degrees and moves toward P1; after reaching P1 it turns right 90 degrees again, and continues in this Z-shaped pattern until the rightmost edge has been traversed and Pn+1 or Pn+1' is reached.
  • That is, P0 → P0' → P1' → P1 → ... → Pn+1 (or Pn+1') is one spraying trajectory; while traversing the dividing lines and the two sides of the circumscribed rectangle parallel to them, trajectory points are generated at a preset distance.
  • The distance between the two-dimensional trajectory points can be set first: the distance can be preset directly, or the total number of trajectory points can be preset and the distance then calculated from the trajectory length and the total number of points.
  • The preset distance and the total number of trajectory points can be determined according to the height variation of the object to be sprayed: if the surface height of the item group to be sprayed changes greatly, more trajectory points should be set so that the obtained trajectory points are closer to the real trajectory; if the height variation is small, fewer trajectory points can be set.
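  • The zigzag generation can be sketched as follows for an axis-aligned circumscribed rectangle (the axis alignment, the spacing defaults, and the function name are simplifying assumptions of the example; a rotated rectangle would first have to be transformed into this frame):

```python
import numpy as np

def zigzag_points(x0, y0, width, height, line_spacing=20.0, point_spacing=10.0):
    """Generate 2D trajectory points over an axis-aligned circumscribed rectangle:
    dividing lines every `line_spacing` along the length, traversed in the order
    P0 -> P0' -> P1' -> P1 -> P2 -> P2' -> ..., with points every `point_spacing`."""
    xs = np.append(np.arange(x0, x0 + width, line_spacing), x0 + width)  # dividing lines + right edge
    corners = []
    for i, x in enumerate(xs):
        if i % 2 == 0:
            corners += [(x, y0), (x, y0 + height)]        # Pi then Pi'
        else:
            corners += [(x, y0 + height), (x, y0)]        # Pi' then Pi
    points = []
    for (xa, ya), (xb, yb) in zip(corners[:-1], corners[1:]):
        seg_len = float(np.hypot(xb - xa, yb - ya))
        n = max(int(seg_len // point_spacing), 1)
        for t in np.linspace(0.0, 1.0, n + 1)[:-1]:       # skip the segment end to avoid duplicates
            points.append((xa + t * (xb - xa), ya + t * (yb - ya)))
    points.append(corners[-1])                            # final inflection point
    return np.array(points)
```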
  • The two-dimensional trajectory acquisition method provided by the present disclosure obtains the corresponding dividing lines from the circumscribed rectangle, obtains inflection points from the intersections of the dividing lines with the circumscribed rectangle, and obtains the two-dimensional trajectory points of the items from those points. This makes it possible to accurately obtain the two-dimensional trajectory points of the items in this specific scenario and improves the spraying accuracy for the item group.
  • In summary, the present disclosure proposes a method of generating three-dimensional trajectory points from two-dimensional trajectory points based on robot vision technology.
  • First, when items whose surfaces have large height differences are sprayed, the generated three-dimensional trajectory improves spraying accuracy and avoids the problem of the spray head colliding with the items.
  • Secondly, for a specific industrial scenario, even when a robot is used to spray irregular items, such as long strips bent in different directions, the present disclosure can accurately generate a trajectory extending in three directions along the surface of the item.
  • Thirdly, for another specific industrial scenario, namely using a robot to spray multiple side-by-side irregularly shaped items, such as multiple elongated items bent in different directions as described above or items whose spraying surfaces have large height differences, the present disclosure can also accurately generate a three-dimensional trajectory that fully covers the whole group of items or the entire surface of each item.
  • Thus, the present disclosure addresses the problems of how to generate a three-dimensional movement trajectory for a robot based on robot vision, and how to generate a two-dimensional trajectory in a specific industrial scenario and then generate a three-dimensional trajectory from it.
  • the robots in various embodiments of the present disclosure may be robot arms, and these robot arms may be general-purpose or dedicated for spraying.
  • The present disclosure can be applied to spraying any object, such as glass, table tops, steel plates, cushions, steel pipes, etc., and does not limit the specific application field.
  • a spraying head can be installed at the operating end of the robot with a pre-established communication connection, and the spraying trajectory of the object to be sprayed can be generated according to the spraying size of the spraying head and the preset moving reference point on the spraying head.
  • the moving reference point is used for locating the position of the spraying head when it moves, and when positioning during the movement, the position of the spraying head takes this point as a reference instead of locating parts other than the moving reference point of the spraying head.
  • Where exactly the moving reference point is located on the spraying head can be preset according to specific requirements and is not limited in this embodiment.
  • The spraying head can be of any shape; for example, it can be rectangular or circular.
  • the moving reference point of the above-mentioned spraying head can be set at a certain end of the spraying head according to requirements, and can also be set at the center of the spraying head.
  • When the spraying head is rectangular, the moving reference point can be set at the midpoint or a corner point of one end of the spraying head, or at the intersection of the diagonals of the spraying head, that is, its center point.
  • the spraying head may also be circular, and the moving reference point may be the center of the spraying head, or may be located on the circumference of the circular spraying head.
  • The spraying size in the embodiments of the present disclosure can be the actual size of the spraying head; for example, when the spraying head is rectangular, the spraying size can include its width and length, and when the spraying head is circular, the spraying size can be the spraying diameter.
  • the spraying size may also be a size corresponding to the shadow projected by the spraying head on the object to be sprayed.
  • Trajectory information can be sent to the robot, the trajectory information includes three-dimensional trajectory points, and the trajectory information is used to control the robot to spray objects based on the three-dimensional trajectory points.
  • For example, a communication connection with the robot may be established based on the TCP protocol, the HTTP protocol, or the gRPC protocol (Google Remote Procedure Call), and the above trajectory information is then sent over that connection.
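  • As a hedged illustration of such a transfer (the host, port, and JSON message layout are assumptions; the disclosure only names TCP, HTTP, and gRPC as possible protocols), the trajectory information could be pushed over a plain TCP socket as follows:

```python
import json
import socket

def send_trajectory(points_3d, host="192.168.1.10", port=9000):
    """Send trajectory information (a list of [x, y, z] points) to the robot
    controller as a newline-terminated JSON message over a TCP socket."""
    message = json.dumps({"trajectory": [list(map(float, p)) for p in points_3d]})
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(message.encode("utf-8") + b"\n")
```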
  • Fig. 11 is a schematic structural diagram of a trajectory generation device provided by an embodiment of the present disclosure, the device includes:
  • a 2D image generation module 1101 configured to generate a 2D image of the item based on the 3D point cloud of the item.
  • the two-dimensional track point generation module 1102 is configured to generate two-dimensional track points based on the two-dimensional image of the item.
  • a height information acquiring module 1103, configured to acquire height information corresponding to two-dimensional trajectory points.
  • the 3D track point generation module 1104 is configured to generate a 3D track point based on the 2D track point and the acquired height information of the 2D track point.
  • the two-dimensional image generating module 1101 is also used for:
  • the two-dimensional trajectory point generating module 1102 is also used for:
  • When the item is a linear item, perform a dilation operation and/or an erosion operation on the two-dimensional image; obtain the outline of the item based on the two-dimensional image of the item, and generate two-dimensional trajectory points based on the outline of the item.
  • From the first corner contour point to the second corner contour point, select contour points on the object contour based on a preset distance, and generate two-dimensional trajectory points based on the contour points, wherein the first corner contour point is the corner contour point at one end of the article and the second corner contour point is the corner contour point at the other end of the article.
  • the first corner contour point and the second corner contour point are on the same long side of the object contour, and the selected contour point on the object contour is a contour point on the long side.
  • the midpoint is used as a two-dimensional trajectory point of the item.
  • the two-dimensional trajectory point generating module 1102 is also used for:
  • the item is an item in an item group, the item group includes at least one item, and when the item group includes a plurality of items, the plurality of items are arranged side by side.
  • a circumscribing rectangle of the two-dimensional image of the item group is acquired, and a two-dimensional trajectory point is generated based on the circumscribing rectangle.
  • Generate dividing lines of the circumscribed rectangle at a predetermined interval, each dividing line intersecting the two parallel sides of the circumscribed rectangle perpendicularly; then, starting from any corner point of the circumscribed rectangle, with the corner points of the circumscribed rectangle and the intersections of the dividing lines with the circumscribed rectangle as inflection points, traverse the dividing lines and the two sides of the circumscribed rectangle parallel to them in a zigzag, thereby generating the two-dimensional trajectory points.
  • When traversing, trajectory points are generated according to a preset distance; the preset distance may be obtained based on the preset total number of trajectory points and the trajectory length.
  • the height information acquiring module 1103 is further configured to: acquire height information of the track point based on depth information of item pixels at the track point.
  • the three-dimensional trajectory point generation module 1104 is also used for:
  • the generated 3D trajectory points are smoothed in height.
  • the trajectory generation device provided in the embodiments of the present disclosure can execute the trajectory generation method shown in any of the above embodiments, the principle and technical effect of which are similar, and will not be repeated here.
  • FIG. 12 shows a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
  • the specific embodiment of the present disclosure does not limit the specific implementation of the electronic device.
  • the electronic device may be a 3D camera.
  • the electronic device may include: a processor 1201 , a communication interface 1202 , a memory 1203 , and a communication bus 1204 .
  • the processor 1201, the communication interface 1202, the memory 1203, and the communication bus 1204 can complete mutual communication.
  • the processor 1201 is configured to execute the program 1210, and may specifically execute relevant steps in the foregoing method implementation manners.
  • the program 1210 may include program codes including computer operation instructions.
  • The processor 1201 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present disclosure.
  • the one or more processors included in the electronic device may be of the same type, such as one or more CPUs, or may be different types of processors, such as one or more CPUs and one or more ASICs.
  • the memory 1203 is used to store the program 1210 .
  • the memory 1203 may include a high-speed RAM memory, and may also include a non-volatile memory (non-volatile memory), such as at least one disk memory.
  • the program 1210 may be specifically configured to enable the processor 1201 to perform various operations in the foregoing method implementation manners.
  • the present disclosure also provides a computer-readable storage medium, on which a computer program is stored, and when the program is executed by a processor, the method in any one of the above-mentioned implementation manners is implemented.
  • the computer program stored in the computer-readable storage medium in the embodiment of the present disclosure can be executed by the processor of the electronic device.
  • The computer-readable storage medium may be a storage medium built into the electronic device, or a storage medium that can be plugged into and removed from the electronic device. Therefore, the computer-readable storage medium in the embodiments of the present disclosure has high flexibility and reliability.
  • a "computer-readable medium” may be any device that can contain, store, communicate, propagate or transmit a program for use in or in conjunction with an instruction execution system, device or device.
  • computer-readable media include the following: electrical connection with one or more wires (electronic device), portable computer disk case (magnetic device), random access memory (RAM), Read Only Memory (ROM), Erasable and Editable Read Only Memory (EPROM or Flash Memory), Fiber Optic Devices, and Portable Compact Disc Read Only Memory (CDROM).
  • the computer-readable medium may even be paper or other suitable medium on which the program can be printed, since the program can be read, for example, by optically scanning the paper or other medium, followed by editing, interpretation or other suitable processing if necessary.
  • the program is processed electronically and stored in computer memory.
  • the processor can be a central processing unit (Central Processing Unit, CPU), and can also be other general-purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuit (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate array (Field- Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • each part of the embodiments of the present disclosure may be realized by hardware, software, firmware or a combination thereof.
  • various steps or methods may be implemented by software or firmware stored in memory and executed by a suitable instruction execution system.
  • For example, if implemented in hardware, as in another embodiment, it can be implemented by any one of, or a combination of, the following techniques known in the art: discrete logic circuits, ASICs with suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.
  • each functional unit in each embodiment of the present disclosure may be integrated into one processing module, each unit may exist separately physically, or two or more units may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. If the integrated modules are realized in the form of software function modules and sold or used as independent products, they can also be stored in a computer-readable storage medium.
  • the storage medium mentioned above may be a read-only memory, a magnetic disk or an optical disk, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

A trajectory generation method, comprising: acquiring a three-dimensional point cloud of an item and generating a two-dimensional image of the item based on the three-dimensional point cloud; generating two-dimensional trajectory points based on the two-dimensional image of the item; acquiring height information corresponding to the two-dimensional trajectory points; and generating three-dimensional trajectory points based on the two-dimensional trajectory points and the acquired height information. In the method, the trajectory points of a robot's movement are first acquired based on a two-dimensional image, the height information corresponding to the trajectory points is then acquired, and the three-dimensional trajectory points of the robot's movement are obtained by combining the two-dimensional trajectory points with the corresponding height information, so that the robot can perform operations such as spraying based on the three-dimensional trajectory points, which considerably improves the spraying accuracy for items to be sprayed that have a large height difference. A trajectory generation apparatus, an electronic device, and a storage medium are also provided.
PCT/CN2022/089578 2021-05-11 2022-04-27 Trajectory generation method and apparatus, electronic device and storage medium WO2022237544A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN202110511343.4A CN113199479B (zh) 2021-05-11 2021-05-11 轨迹生成方法、装置、电子设备、存储介质和3d相机
CN202110511343.4 2021-05-11
CN202110511347.2A CN113189934A (zh) 2021-05-11 2021-05-11 轨迹生成方法、装置、电子设备、存储介质和3d相机
CN202110511370.1A CN113199480B (zh) 2021-05-11 2021-05-11 轨迹生成方法、装置、电子设备、存储介质和3d相机
CN202110511347.2 2021-05-11
CN202110511370.1 2021-05-11

Publications (1)

Publication Number Publication Date
WO2022237544A1 true WO2022237544A1 (fr) 2022-11-17

Family

ID=84027992

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/089578 WO2022237544A1 (fr) 2021-05-11 2022-04-27 Procédé et appareil de génération de trajectoire, et dispositif électronique et support d'enregistrement

Country Status (1)

Country Link
WO (1) WO2022237544A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117900917A (zh) * 2024-03-19 2024-04-19 中船黄埔文冲船舶有限公司 A grinding trajectory discretization method, system, terminal and readable storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110222757A1 (en) * 2010-03-10 2011-09-15 Gbo 3D Technology Pte. Ltd. Systems and methods for 2D image and spatial data capture for 3D stereo imaging
US20150009301A1 (en) * 2012-01-31 2015-01-08 3M Innovative Properties Company Method and apparatus for measuring the three dimensional structure of a surface
CN106600643A (zh) * 2016-10-25 2017-04-26 长安大学 A people counting method based on trajectory analysis
CN111369593A (zh) * 2020-03-16 2020-07-03 梅卡曼德(北京)机器人科技有限公司 Glass gluing method and apparatus, electronic device and storage medium
CN111744706A (zh) * 2020-06-23 2020-10-09 梅卡曼德(北京)机器人科技有限公司 Glue spraying method and apparatus for an object, electronic device and storage medium
CN112967368A (zh) * 2021-04-20 2021-06-15 梅卡曼德(北京)机器人科技有限公司 Robot-vision-based method and apparatus for gluing an item surface, electronic device and storage medium
CN113189934A (zh) * 2021-05-11 2021-07-30 梅卡曼德(北京)机器人科技有限公司 Trajectory generation method and apparatus, electronic device, storage medium and 3D camera
CN113199480A (zh) * 2021-05-11 2021-08-03 梅卡曼德(北京)机器人科技有限公司 Trajectory generation method and apparatus, electronic device, storage medium and 3D camera
CN113199479A (zh) * 2021-05-11 2021-08-03 梅卡曼德(北京)机器人科技有限公司 Trajectory generation method and apparatus, electronic device, storage medium and 3D camera


Similar Documents

Publication Publication Date Title
WO2022237166A1 (fr) Procédé et appareil de génération de trajectoire, dispositif électronique, support d'enregistrement et caméra 3d
CN113199479B (zh) 轨迹生成方法、装置、电子设备、存储介质和3d相机
EP3262439B1 (fr) Utilisation des variations d'intensité dans un motif de lumière pour une cartographie de profondeur des objets dans un volume
JP5132832B1 (ja) 計測装置および情報処理装置
US20180003498A1 (en) Visual positioning system and method based on high reflective infrared identification
CN111028340B (zh) 精密装配中的三维重构方法、装置、设备及系统
Larsson et al. Path planning for laser scanning with an industrial robot
WO2020217878A1 (fr) Dispositif, procédé et programme de détection de la position et de l'orientation d'un objet
JP2008537190A (ja) 赤外線パターンを照射することによる対象物の三次元像の生成
JP2011175477A (ja) 3次元計測装置、処理方法及びプログラム
CN104567727A (zh) 立体靶标及对线结构光轮廓传感器的全局统一校准方法
CN113189934A (zh) 轨迹生成方法、装置、电子设备、存储介质和3d相机
CN113344769B (zh) 基于机器视觉的物品3d图像信息获取方法、装置、介质
Alizadeh Object distance measurement using a single camera for robotic applications
EP4189650A2 (fr) Systèmes, procédés et supports pour récupérer directement des surfaces planes dans une scène à l'aide d'une lumière structurée
WO2022237544A1 (fr) Procédé et appareil de génération de trajectoire, et dispositif électronique et support d'enregistrement
CN114543787B (zh) 基于条纹投影轮廓术的毫米级室内建图定位方法
JP5698815B2 (ja) 情報処理装置、情報処理装置の制御方法及びプログラム
JP6040264B2 (ja) 情報処理装置、情報処理装置の制御方法、およびプログラム
KR20220122645A (ko) 3차원 좌표를 2차원 피처 포인트와 연관
JPH07152810A (ja) 環境モデル作成装置
WO2022222934A1 (fr) Procédé de revêtement d'adhésif pour verre, appareil de revêtement d'adhésif pour verre, dispositif électronique et support de stockage
CN113223029A (zh) 玻璃涂胶方法、玻璃涂胶装置、电子设备和存储介质
JP6061631B2 (ja) 計測装置、情報処理装置、計測方法、情報処理方法、および、プログラム
Ji et al. A structured light based measuring system used for an autonomous interior finishing robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22806513

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22806513

Country of ref document: EP

Kind code of ref document: A1