CN112800582B - Method for generating simulated laser line of structured light vision sensor - Google Patents

Method for generating simulated laser line of structured light vision sensor

Info

Publication number
CN112800582B
CN112800582B (application number CN202011621597.3A)
Authority
CN
China
Prior art keywords
coordinate system
laser
laser plane
plane
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011621597.3A
Other languages
Chinese (zh)
Other versions
CN112800582A (en)
Inventor
王念峰
杨天
张宪民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN202011621597.3A
Publication of CN112800582A
Application granted
Publication of CN112800582B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00: Computer-aided design [CAD]
    • G06F 30/20: Design optimisation, verification or simulation
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines or moiré fringes, on the object
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/10: Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

The invention belongs to the fields of robotics and computer graphics, and discloses a method for generating a simulated laser line of a structured light vision sensor, comprising the following steps: generating a simulation environment; adjusting the simulation environment; converting a point on the laser plane and its normal vector under the camera coordinate system, together with the two end points of the laser plane under the tool coordinate system, into the world coordinate system; creating the laser plane and discretizing it into laser beams; intersecting each discrete laser beam with the workpiece surface to obtain the intersection point nearest to the virtual emission origin of the laser; connecting the intersection points into a polygon to obtain the simulated laser line, and constructing a plane from the simulated laser line and the virtual emission origin of the laser to obtain the simulated laser plane; and converting the intersection points into the image coordinate system by the perspective projection principle to obtain a simulated laser line image. The method effectively addresses the low recognition accuracy of structured light sensors under specular-reflection interference and broadens the application range of structured light vision sensors.

Description

Method for generating simulated laser line of structured light vision sensor
Technical Field
The invention belongs to the fields of robotics and computer graphics, and relates to a method for generating a simulated laser line of a structured light vision sensor.
Background
With the development of intelligent manufacturing, non-contact structured light vision sensors are being applied ever more widely in industry, notably in surface modeling, workpiece quality inspection, weld seam tracking and related fields. A vision sensor working in the line structured light mode satisfies the measurement model of laser triangulation and provides non-contact measurement with high speed and high precision. The laser line projected onto the surface of the measured object forms a light stripe; under the influence of the surface geometry the stripe becomes discontinuous and distorted, and these changes carry the depth information of the surface. By analyzing the captured laser stripe image to extract the stripe center line, the spatial position of each point on the laser center line can be computed from the geometric model formed by the camera and the laser, yielding the structural information of the measured surface.
Because of interference from noise, lighting and other factors in the industrial environment, the light stripe detected by the sensor cannot accurately reflect the true workpiece information, which affects subsequent processing. It is therefore desirable to obtain an ideal laser line on the workpiece surface in a simulation environment.
Disclosure of Invention
To address the shortcomings of the prior art, the invention provides a method for generating a simulated laser line of a structured light vision sensor. Its purpose is to obtain the laser line on the workpiece surface accurately in spite of environmental interference, improving the precision and the application range of the structured light vision sensor.
The invention is realized by adopting the following technical scheme:
A method for generating a simulated laser line of a structured light vision sensor comprises the following steps:
S1, importing the robot, the structured light sensor and the workpiece model into robot simulation software to generate a simulation environment;
S2, adjusting the simulation environment according to the absolute pose of the workpiece and the absolute pose of the robot tool center point coordinate system;
S3, converting a point on the laser plane and its normal vector under the camera coordinate system, together with the two end points of the laser plane under the tool coordinate system, into the world coordinate system according to the calibration results of the tool coordinate system, the camera coordinate system and the laser plane;
S4, creating the laser plane, calculating the virtual emission origin of the laser and the intersection point of the laser plane and the camera optical axis, and discretizing the laser plane into laser beams;
S5, intersecting each discrete laser beam of step S4 with the workpiece surface to obtain the intersection point nearest to the virtual emission origin of the laser;
S6, connecting the intersection points of step S5 into a polygon to obtain the simulated laser line, and constructing a plane from the simulated laser line and the virtual emission origin of the laser to obtain the simulated laser plane;
S7, converting the intersection points of step S5 into the image coordinate system by the perspective projection principle to obtain a simulated laser line image.
Preferably, step S3 comprises: according to the hand-eye relation ${}^{t}_{c}T$ of the camera coordinate system relative to the tool center point coordinate system, the equation in the camera coordinate system of the laser plane generated by the structured light emitter in the structured light sensor, the absolute pose of the workpiece and the absolute pose of the robot tool center point coordinate system, obtaining the coordinates, in the world coordinate system, of a point on the laser plane, of its normal vector and of the two end points of the laser plane.
Preferably, the equation of the laser plane in the camera coordinate system is $Ax + By + Cz + 1 = 0$, where $A$, $B$ and $C$ are the coefficients of the plane equation in the camera coordinate system, and $x$, $y$ and $z$ are the three-dimensional coordinates of any point on the laser plane.
Preferably, the process of obtaining, in the world coordinate system, the coordinates of a point on the laser plane and its normal vector (both given in the camera coordinate system) and of the two end points of the laser plane comprises:

given a point $P_0$ on the laser plane in the camera coordinate system and its normal vector $N(A, B, C)$, the hand-eye relation ${}^{t}_{c}T$ from the camera coordinate system to the tool center point coordinate system, and the absolute pose ${}^{w}_{t}T$ of the robot tool center point coordinate system, the pose of the camera coordinate system relative to the world coordinate system is obtained as

$${}^{w}_{c}T = {}^{w}_{t}T \, {}^{t}_{c}T$$

in the formula, $c$, $t$ and $w$ respectively denote the camera coordinate system, the tool coordinate system and the world coordinate system.

For convenience of calculation, the point $P_0$ and the normal vector $N$ are respectively expressed in homogeneous form as $P_0'(x_0, y_0, z_0, 1)$ and $N'(A, B, C, 1)$; by homogeneous transformation, the representations $P_0''$ and $N''$ of the point $P_0$ and the normal vector $N$ in the world coordinate system are obtained:

$$P_0'' = {}^{w}_{c}T \, P_0', \qquad N'' = {}^{w}_{c}T \, N'$$
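To make the composition ${}^{w}_{c}T = {}^{w}_{t}T \, {}^{t}_{c}T$ and the homogeneous transformation of $P_0$ and $N$ concrete, here is a minimal self-contained C++ sketch; the pose matrices and point values are illustrative placeholders, not the patent's calibration data.

```cpp
// Minimal sketch (not from the patent): composing the hand-eye and TCP poses
// and transforming P0 and N into the world frame. All numeric values are
// illustrative placeholders.
#include <array>
#include <cstdio>

using Mat4 = std::array<std::array<double, 4>, 4>;
using Vec4 = std::array<double, 4>;

// 4x4 homogeneous matrix product: wcT = wtT * tcT
Mat4 mul(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                r[i][j] += a[i][k] * b[k][j];
    return r;
}

// Apply a homogeneous transform to a homogeneous 4-vector.
Vec4 apply(const Mat4& m, const Vec4& v) {
    Vec4 r{};
    for (int i = 0; i < 4; ++i)
        for (int k = 0; k < 4; ++k)
            r[i] += m[i][k] * v[k];
    return r;
}

int main() {
    // Placeholder poses: wtT (tool in world) and tcT (camera in tool).
    Mat4 wtT = {{{1, 0, 0, 500}, {0, 1, 0,  0}, {0, 0, 1, 800}, {0, 0, 0, 1}}};
    Mat4 tcT = {{{1, 0, 0,  30}, {0, 1, 0, 40}, {0, 0, 1, -60}, {0, 0, 0, 1}}};
    Mat4 wcT = mul(wtT, tcT);                // camera pose in the world frame

    Vec4 P0p  = {0.0, 0.0, -0.2018, 1.0};    // P0': point on the laser plane (camera frame)
    Vec4 P0pp = apply(wcT, P0p);             // P0'': the same point in the world frame
    std::printf("P0'' = (%.4f, %.4f, %.4f)\n", P0pp[0], P0pp[1], P0pp[2]);

    // Note: the patent writes N'(A, B, C, 1); strictly, a direction is usually
    // transformed with homogeneous coordinate 0 (rotation block only) so that
    // the translation does not shift it.
    Vec4 Np  = {0.107029, 11.524, 4.955158, 0.0};
    Vec4 Npp = apply(wcT, Np);
    std::printf("N'' = (%.4f, %.4f, %.4f)\n", Npp[0], Npp[1], Npp[2]);
    return 0;
}
```

The measured end points $P_1$ and $P_2$ are converted the same way once they have been brought into the camera frame via ${}^{t}_{c}T$.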
Preferably, the process of obtaining, in the world coordinate system, the coordinates of the two end points of the laser plane comprises:

setting the pose of the robot tool coordinate system to $R_x = 180°$, $R_y = 0°$, $R_z = 180°$, and measuring the coordinates of the two end points of the real laser line in the tool center point coordinate system: $P_1(x_1, y_1, 0)$ and $P_2(x_2, y_2, 0)$; converting the two points into the camera coordinate system according to the hand-eye relation of the camera coordinate system relative to the tool coordinate system; and converting $P_1$ and $P_2$ into the world coordinate system using the pose ${}^{w}_{c}T$ of the camera coordinate system relative to the world coordinate system.
Preferably, step S4 comprises: creating a laser plane $\alpha_l$ from the point on the laser plane and the normal vector of step S3, calculating the virtual emission origin of the laser from the world coordinates of the two end points of the laser plane of step S3, calculating the intersection point of the laser plane and the camera optical axis, and discretizing the laser plane into laser beams.

Preferably, the laser plane $\alpha_l$ is created with the geometric plane class of the OCCT engine library, the input parameters being $P_0''$ and $N''$.
Preferably, the process of calculating the virtual emission origin of the laser from the world coordinates of the two end points of the laser plane of step S3, then calculating the intersection point of the laser plane and the camera optical axis, and discretizing the laser plane into laser beams comprises:

because of measurement error, the two end points of the laser plane of step S3 must first be corrected onto the laser plane by projecting each point onto the plane. Since the apex angle of the triangular laser plane emitted by the laser is 20°, rotating the straight line $P_1P_2$ by 80° about the axes formed by the two end points of the laser plane and the normal vector of the laser plane yields the two sides of the triangular laser plane, and the intersection point of the two sides is the virtual emission origin $P_o$ of the laser; the straight line $P_oP_2$ is then rotated in the positive and negative directions by 10° each, with a step of 0.1°, about the axis formed by $P_o$ and the normal vector of the laser plane, giving the discrete laser beams.
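As a quick check on the discretization (an arithmetic illustration, not text from the patent): sweeping the ray $P_oP_2$ from $-10°$ to $+10°$ in $0.1°$ steps spans the full 20° apex angle and, counting both end rays, produces $20/0.1 + 1 = 201$ discrete laser beams, all lying in the laser plane $\alpha_l$.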
Preferably, the perspective projection is expressed as a conversion from three-dimensional space to image space:

$$Z_c \begin{bmatrix} c_i \\ r_i \\ 1 \end{bmatrix} = K \begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = \begin{bmatrix} f_x & 0 & c_0 \\ 0 & f_y & r_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix}$$

in the formula, $c_i$ and $r_i$ are the pixel coordinates in the image; $K$ is the intrinsic matrix of the camera; $X_c$, $Y_c$ and $Z_c$ are the coordinates of the three-dimensional space point; the focal lengths $f_x$ and $f_y$ describe the scale between pixel units and three-dimensional coordinate units; and $c_0$ and $r_0$, the projected position of the camera optical center in the image, give the displacement between the image origin and the origin of the camera coordinate system.
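For concreteness, a minimal self-contained C++ sketch of this pinhole projection follows; the intrinsic values and the test point are illustrative placeholders, not calibration data from the patent.

```cpp
// Minimal sketch (not from the patent): perspective projection of a camera-
// frame point into pixel coordinates with the intrinsics named in the text.
#include <cstdio>

struct Intrinsics { double fx, fy, c0, r0; };

// Pinhole projection: ci = fx*Xc/Zc + c0, ri = fy*Yc/Zc + r0.
void project(const Intrinsics& K, double Xc, double Yc, double Zc,
             double& ci, double& ri) {
    ci = K.fx * Xc / Zc + K.c0;
    ri = K.fy * Yc / Zc + K.r0;
}

int main() {
    Intrinsics K{1200.0, 1200.0, 640.0, 512.0}; // placeholder fx, fy, c0, r0
    double ci, ri;
    project(K, 0.05, -0.02, 0.60, ci, ri);      // a point 0.6 m in front of the camera
    std::printf("pixel = (%.1f, %.1f)\n", ci, ri);
    return 0;
}
```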
The beneficial effects of the invention include:
(1) The invention is applied to workpiece detection; the simulation environment built with a simulation tool effectively avoids the interference present in the real environment.
(2) The invention uses data from the real robot, sensor and workpiece, making the simulation data more reliable.
(3) The invention uses modeling algorithms from the three-dimensional modeling engine OCCT and is applicable to different workpiece models.
(4) The invention realizes a method for generating simulated laser lines for workpiece detection, expanding the application range of the structured light sensor.
Drawings
FIG. 1 is a flow diagram of a method for generating a simulated laser line for a structured light vision sensor, in accordance with one embodiment;
FIG. 2 is a schematic diagram of a three-dimensional model of a robot, a structured light sensor and an initial pose of a workpiece, which are imported into software in one embodiment;
FIG. 3 is a schematic diagram of the pose of the robot, the sensor, and the workpiece adjusted in one embodiment;
FIG. 4 is a schematic diagram of measuring two endpoints of a laser plane in one embodiment;
FIG. 5 is a diagram illustrating the relative positions of the robot in the sixth axis coordinate system, the camera coordinate system, the tool coordinate system, and the workpiece coordinate system, according to one embodiment;
FIG. 6 is a simulated laser line of workpiece surface inspection generated in one embodiment;
FIG. 7 is a simulated laser line image of workpiece surface inspection generated in one embodiment;
Detailed Description
To explain the implementation of the present invention in more detail, it is further described below with reference to the accompanying drawings.
The method for generating the simulated laser line of the structured light vision sensor shown in fig. 1 comprises the following steps:
(1) Import the robot, the structured light sensor and the workpiece three-dimensional models into the robot simulation software SCUT-RobotSim to generate the simulation environment shown in FIG. 2.
(2) According to the absolute pose ${}^{w}_{o}T$ of the workpiece and the absolute pose ${}^{w}_{t}T$ of the robot tool center point (TCP) coordinate system, rearrange the poses of the robot, the structured light sensor and the other equipment of step (1); the layout result is shown in FIG. 3.
(3) From the hand-eye relation ${}^{t}_{c}T$ of the camera coordinate system relative to the tool center point coordinate system, the equation in the camera coordinate system of the laser plane generated by the structured light emitter in the structured light sensor, and the poses ${}^{w}_{o}T$ and ${}^{w}_{t}T$ of step (2), obtain the coordinates of a point on the laser plane and its normal vector in the camera coordinate system, and the coordinates of the two end points of the laser plane in the world coordinate system.

The equation of the laser plane in the camera coordinate system is $Ax + By + Cz + 1 = 0$, where $A$, $B$ and $C$ are the coefficients of the plane equation in the camera coordinate system, and $x$, $y$ and $z$ are the three-dimensional coordinates of any point on the laser plane. In this example, $A = 0.107029$, $B = 11.524$ and $C = 4.955158$.
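Since any point satisfying $Ax + By + Cz + 1 = 0$ can serve as $P_0$, one convenient choice (an illustration; the patent gives its actual $P_0$ only in figure form) is $P_0 = (0, 0, -1/C) \approx (0, 0, -0.2018)$, with the corresponding (non-unit) normal vector $N = (0.107029, 11.524, 4.955158)$.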
Given the point $P_0$ on the laser plane in the camera coordinate system and its normal vector $N(A, B, C)$, the hand-eye relation ${}^{t}_{c}T$ from the camera coordinate system to the tool center point coordinate system, and the absolute pose ${}^{w}_{t}T$ of the robot tool center point coordinate system, the pose of the camera coordinate system relative to the world coordinate system is obtained as

$${}^{w}_{c}T = {}^{w}_{t}T \, {}^{t}_{c}T$$

in the formula, ${}^{t}_{c}T$ is the homogeneous transformation matrix of the camera coordinate system in the tool center point coordinate system, ${}^{w}_{t}T$ is the homogeneous transformation matrix of the tool coordinate system in the world coordinate system, and ${}^{w}_{c}T$ is the homogeneous transformation matrix of the camera coordinate system in the world coordinate system; $c$, $t$ and $w$ respectively denote the camera coordinate system, the tool coordinate system and the world coordinate system.

For convenience of calculation, the point $P_0$ and the normal vector $N$ are respectively expressed in homogeneous form as $P_0'(x_0, y_0, z_0, 1)$ and $N'(A, B, C, 1)$; by homogeneous transformation, the representations $P_0''$ and $N''$ of the point $P_0$ and the normal vector $N$ in the world coordinate system are obtained:

$$P_0'' = {}^{w}_{c}T \, P_0', \qquad N'' = {}^{w}_{c}T \, N'$$
FIG. 4 shows the calibration method for the two end points of the laser plane: the pose of the robot tool coordinate system is set to $R_x = 180°$, $R_y = 0°$, $R_z = 180°$, and the coordinates of the two end points of the real laser line are measured in the tool coordinate system: $P_1(x_1, y_1, 0)$ and $P_2(x_2, y_2, 0)$. The two points are converted into the camera coordinate system according to the hand-eye relation of the camera coordinate system relative to the tool coordinate system, and then converted into the world coordinate system using the pose ${}^{w}_{c}T$ of the camera coordinate system relative to the world coordinate system.
(4) Create the laser plane with the class Geom_Plane of the open-source three-dimensional modeling engine library OCCT, with input parameters $P_0''$ and $N''$. Because of measurement error, the two end points of the real laser line must first be corrected onto the laser plane by projecting each point onto the plane. Since the apex angle of the triangular laser plane emitted by the laser is 20°, rotating the straight line $P_1P_2$ by 80° about the axes formed by the two end points of the laser plane and the normal vector of the laser plane yields the two sides of the triangular laser plane, and the intersection point of the two sides is the virtual emission origin $P_o$ of the laser; the straight line $P_oP_2$ is then rotated in the positive and negative directions by 10° each, with a step of 0.1°, about the axis formed by $P_o$ and the normal vector of the laser plane, giving the discrete laser beams.
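Step (4) lends itself to a short OCCT sketch. In the hedged C++ example below, Geom_Plane, gp_Trsf, gp_Ax1 and gp_Lin are real OCCT classes; the numeric inputs are placeholders chosen to be mutually consistent (the end points lie in the plane), the rotation signs depend on the orientation of the normal, and the line-line intersection formula is this sketch's own derivation, not code from the patent.

```cpp
// Hedged sketch of step (4) with OCCT 7.x (OpenCascade).
#include <cmath>
#include <vector>
#include <gp_Pnt.hxx>
#include <gp_Dir.hxx>
#include <gp_Vec.hxx>
#include <gp_Ax1.hxx>
#include <gp_Lin.hxx>
#include <gp_Trsf.hxx>
#include <Geom_Plane.hxx>

int main() {
    // World-frame inputs (placeholders): P0'', N'' and corrected end points P1, P2.
    gp_Pnt P0(0.0, 0.0, 0.0);
    gp_Dir n(0.0, 0.0, 1.0);                 // plane normal (gp_Dir normalizes)
    gp_Pnt P1(-50.0, 100.0, 0.0), P2(50.0, 100.0, 0.0);

    // Laser plane alpha_l from a point and its normal (class Geom_Plane).
    Handle(Geom_Plane) alphaL = new Geom_Plane(P0, n);
    (void)alphaL; // kept for later intersection with the camera optical axis

    // Rotate the line P1P2 by +/-80 deg about the axes (end point, plane normal)
    // to obtain the two sides of the 20-deg triangular laser fan.
    const double deg = M_PI / 180.0;
    gp_Vec d12(P1, P2);
    gp_Trsf r1, r2;
    r1.SetRotation(gp_Ax1(P1, n),  80.0 * deg);
    r2.SetRotation(gp_Ax1(P2, n), -80.0 * deg);
    gp_Vec d1 = d12.Transformed(r1);         // side through P1
    gp_Vec d2 = d12.Transformed(r2);         // side through P2

    // Both sides lie in the laser plane, so they intersect at the virtual
    // origin Po = P1 + t*d1 with t = (d12 x d2).(d1 x d2) / |d1 x d2|^2.
    gp_Vec c = d1.Crossed(d2);
    double t = d12.Crossed(d2).Dot(c) / c.SquareMagnitude();
    gp_Pnt Po = P1.Translated(d1.Multiplied(t));

    // Discretize: sweep the ray PoP2 by +/-10 deg about (Po, n) in 0.1-deg steps.
    std::vector<gp_Lin> beams;
    gp_Vec d0(Po, P2);
    for (double a = -10.0; a <= 10.0 + 1e-9; a += 0.1) {
        gp_Trsf r;
        r.SetRotation(gp_Ax1(Po, n), a * deg);
        beams.emplace_back(Po, gp_Dir(d0.Transformed(r)));
    }
    return 0;
}
```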
(5) Intersect the discrete laser beams with all surfaces of the workpiece and compute, for each beam, the intersection point closest to the virtual emission origin of the laser.
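A hedged sketch of this ray casting follows, assuming OCCT's IntCurvesFace_ShapeIntersector (a standard OCCT ray/shape intersection helper); the workpiece shape, the beam and the virtual origin $P_o$ are assumed to come from the previous steps, and the tolerance value is an assumption.

```cpp
// Hedged sketch of step (5): nearest hit of one beam on the workpiece.
#include <limits>
#include <gp_Pnt.hxx>
#include <gp_Lin.hxx>
#include <TopoDS_Shape.hxx>
#include <IntCurvesFace_ShapeIntersector.hxx>

// Return the hit of one beam on the workpiece nearest to the virtual emission
// origin Po; `found` reports whether the beam hit the workpiece at all.
gp_Pnt nearestHit(const TopoDS_Shape& workpiece, const gp_Lin& beam,
                  const gp_Pnt& Po, bool& found) {
    IntCurvesFace_ShapeIntersector inter;
    inter.Load(workpiece, 1.0e-6);                   // intersection tolerance
    inter.Perform(beam, 0.0, std::numeric_limits<double>::max());
    found = false;
    gp_Pnt best;
    double bestDist = std::numeric_limits<double>::max();
    if (inter.IsDone()) {
        for (int i = 1; i <= inter.NbPnt(); ++i) {   // OCCT indices are 1-based
            gp_Pnt p = inter.Pnt(i);
            double d = Po.Distance(p);
            if (d < bestDist) { bestDist = d; best = p; found = true; }
        }
    }
    return best;
}
```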
(6) Connect all the intersection points of step (5) into a polygon to obtain the simulated laser line, and construct a plane from the simulated laser line and the virtual emission origin of the laser to obtain the simulated laser plane.
(7) Convert all the intersection points of step (5) into the image coordinate system using the perspective projection principle to obtain the simulated laser line image. The perspective projection is expressed as a conversion from three-dimensional space to image space:

$$Z_c \begin{bmatrix} c_i \\ r_i \\ 1 \end{bmatrix} = K \begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = \begin{bmatrix} f_x & 0 & c_0 \\ 0 & f_y & r_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix}$$

in the formula, $c_i$ and $r_i$ are the pixel coordinates in the image; $K$ is the intrinsic matrix of the camera; $X_c$, $Y_c$ and $Z_c$ are the coordinates of the three-dimensional space point; the focal lengths $f_x$ and $f_y$ describe the scale between pixel units and three-dimensional coordinate units; and $c_0$ and $r_0$, the projected position of the camera optical center in the image, give the displacement between the image origin and the origin of the camera coordinate system.
FIG. 5 is a schematic diagram of the robot, structured light sensor and workpiece model in a welding scene, in which the robot uses the structured light sensor to detect the workpiece weld. The relative positional relationships among the robot's sixth-axis coordinate system, the camera coordinate system, the tool coordinate system and the workpiece coordinate system are indicated in FIG. 5, where $F_W$ is the world coordinate system, $F_6$ the sixth-axis coordinate system of the robot, $F_C$ the camera coordinate system, $F_T$ the tool coordinate system and $F_O$ the workpiece coordinate system; ${}^{w}_{6}T$ is the transformation matrix from $F_6$ to $F_W$, ${}^{6}_{t}T$ the transformation matrix from $F_T$ to $F_6$, ${}^{t}_{c}T$ the transformation matrix from $F_C$ to $F_T$, and ${}^{w}_{o}T$ the transformation matrix from $F_O$ to $F_W$.
The class Geom _ Plane and the like are library functions in an open-source three-dimensional modeling engine OpenCascade and can be directly called.
Fig. 6 is a calculated intersection line of the laser plane and the workpiece surface, and fig. 7 is an image of this intersection line in the camera coordinate system.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to them; any change, modification, substitution, combination or simplification that does not depart from the spirit and principle of the present invention should be construed as an equivalent and is intended to fall within the scope of the present invention.

Claims (9)

1. A method for generating a simulated laser line of a structured light vision sensor, characterized by comprising the following steps:
S1, importing the robot, the structured light sensor and the workpiece model into robot simulation software to generate a simulation environment;
S2, adjusting the simulation environment according to the absolute pose of the workpiece and the absolute pose of the robot tool center point coordinate system;
S3, converting a point on the laser plane and its normal vector under the camera coordinate system, together with the two end points of the laser plane under the tool coordinate system, into the world coordinate system according to the calibration results of the tool coordinate system, the camera coordinate system and the laser plane;
S4, creating the laser plane, calculating the virtual emission origin of the laser and the intersection point of the laser plane and the camera optical axis, and discretizing the laser plane into laser beams;
S5, intersecting each discrete laser beam of step S4 with the workpiece surface to obtain the intersection point nearest to the virtual emission origin of the laser;
S6, connecting the intersection points of step S5 into a polygon to obtain the simulated laser line, and constructing a plane from the simulated laser line and the virtual emission origin of the laser to obtain the simulated laser plane;
S7, converting the intersection points of step S5 into the image coordinate system by the perspective projection principle to obtain a simulated laser line image.
2. The method for generating a simulated laser line of a structured light vision sensor according to claim 1, wherein step S3 comprises: according to the hand-eye relation ${}^{t}_{c}T$ of the camera coordinate system relative to the tool center point coordinate system, the equation in the camera coordinate system of the laser plane generated by the structured light emitter in the structured light sensor, the absolute pose of the workpiece and the absolute pose of the robot tool center point coordinate system, obtaining the coordinates, in the world coordinate system, of a point on the laser plane, of its normal vector and of the two end points of the laser plane.
3. The method of claim 2, wherein the equation of the laser plane in the camera coordinate system is $Ax + By + Cz + 1 = 0$, where $A$, $B$ and $C$ are the coefficients of the plane equation in the camera coordinate system, and $x$, $y$ and $z$ are the three-dimensional coordinates of any point on the laser plane.
4. The method for generating a simulated laser line of a structured light vision sensor according to claim 2, wherein the process of obtaining, in the world coordinate system, the coordinates of a point on the laser plane and its normal vector (both given in the camera coordinate system) and of the two end points of the laser plane comprises:

given a point $P_0$ on the laser plane in the camera coordinate system and its normal vector $N(A, B, C)$, the hand-eye relation ${}^{t}_{c}T$ from the camera coordinate system to the tool center point coordinate system, and the absolute pose ${}^{w}_{t}T$ of the robot tool center point coordinate system, the pose of the camera coordinate system relative to the world coordinate system is obtained as

$${}^{w}_{c}T = {}^{w}_{t}T \, {}^{t}_{c}T$$

in the formula, $c$, $t$ and $w$ respectively denote the camera coordinate system, the tool coordinate system and the world coordinate system;

for convenience of calculation, the point $P_0$ in the camera coordinate system and the normal vector $N$ are respectively expressed in homogeneous form as $P_0'(x_0, y_0, z_0, 1)$ and $N'(A, B, C, 1)$; by homogeneous transformation, the representations $P_0''$ and $N''$ of the point $P_0$ and the normal vector $N$ in the world coordinate system are obtained:

$$P_0'' = {}^{w}_{c}T \, P_0', \qquad N'' = {}^{w}_{c}T \, N'$$
5. The method of claim 2, wherein obtaining the coordinates of the two end points of the laser plane in the world coordinate system comprises: setting the pose of the robot tool coordinate system to $R_x = 180°$, $R_y = 0°$, $R_z = 180°$; measuring the coordinates of the two end points of the real laser line in the tool center point coordinate system: $P_1(x_1, y_1, 0)$ and $P_2(x_2, y_2, 0)$; converting the two points into the camera coordinate system according to the hand-eye relation of the camera coordinate system relative to the tool coordinate system; and converting $P_1$ and $P_2$ into the world coordinate system using the pose ${}^{w}_{c}T$ of the camera coordinate system relative to the world coordinate system.
6. The method for generating a simulated laser line of a structured light vision sensor according to claim 1, wherein step S4 comprises: creating a laser plane $\alpha_l$ from the point on the laser plane and the normal vector of step S3, calculating the virtual emission origin of the laser from the world coordinates of the two end points of the laser plane of step S3, calculating the intersection point of the laser plane and the camera optical axis, and discretizing the laser plane into laser beams.
7. The method of claim 6, wherein the laser plane $\alpha_l$ is created with the geometric plane class of the OCCT engine library, the input parameters being $P_0''$ and $N''$, where $P_0''$ and $N''$ are respectively the representations of the point $P_0$ and the normal vector $N$ in the world coordinate system.
8. The method for generating a simulated laser line of a structured light vision sensor according to claim 1, wherein the process of calculating the virtual emission origin of the laser from the world coordinates of the two end points of the laser plane of step S3, then calculating the intersection point of the laser plane and the camera optical axis, and discretizing the laser plane into laser beams comprises:

because of measurement error, the two end points of the laser plane of step S3 must first be corrected onto the laser plane by projecting each point onto the plane; since the apex angle of the triangular laser plane emitted by the laser is 20°, rotating the straight line $P_1P_2$ by 80° about the axes formed by the two end points of the laser plane and the normal vector of the laser plane yields the two sides of the triangular laser plane, and the intersection point of the two sides is the virtual emission origin $P_o$ of the laser; the straight line $P_oP_2$ is then rotated in the positive and negative directions by 10° each, with a step of 0.1°, about the axis formed by $P_o$ and the normal vector of the laser plane, giving the discrete laser beams.
9. The method for generating a simulated laser line of a structured light vision sensor according to claim 1, wherein the perspective projection is expressed as a conversion from three-dimensional space to image space:

$$Z_c \begin{bmatrix} c_i \\ r_i \\ 1 \end{bmatrix} = K \begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = \begin{bmatrix} f_x & 0 & c_0 \\ 0 & f_y & r_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix}$$

in the formula, $c_i$ and $r_i$ are the pixel coordinates in the image; $K$ is the intrinsic matrix of the camera; $X_c$, $Y_c$ and $Z_c$ are the coordinates of the three-dimensional space point; the focal lengths $f_x$ and $f_y$ describe the scale between pixel units and three-dimensional coordinate units; and $c_0$ and $r_0$, the projected position of the camera optical center in the image, give the displacement between the image origin and the origin of the camera coordinate system.
CN202011621597.3A 2020-12-30 2020-12-30 Method for generating simulated laser line of structured light vision sensor Active CN112800582B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011621597.3A CN112800582B (en) 2020-12-30 2020-12-30 Method for generating simulated laser line of structured light vision sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011621597.3A CN112800582B (en) 2020-12-30 2020-12-30 Method for generating simulated laser line of structured light vision sensor

Publications (2)

Publication Number Publication Date
CN112800582A CN112800582A (en) 2021-05-14
CN112800582B (en) 2022-05-24

Family

ID=75805012

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011621597.3A Active CN112800582B (en) 2020-12-30 2020-12-30 Method for generating simulated laser line of structured light vision sensor

Country Status (1)

Country Link
CN (1) CN112800582B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106839979A (en) * 2016-12-30 2017-06-13 上海交通大学 The hand and eye calibrating method of line structured laser sensor
KR20190078853A (en) * 2017-12-27 2019-07-05 경북대학교 산학협력단 Laser projection apparatus and control method thereof, laser guidance system including the apparatus
CN110648313A (en) * 2019-09-05 2020-01-03 北京智行者科技有限公司 Laser stripe center line fitting method based on FPGA

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101986350B (en) * 2010-10-22 2012-03-28 武汉大学 Monocular structured light-based three-dimensional modeling method
US20150085080A1 (en) * 2012-04-18 2015-03-26 3Shape A/S 3d scanner using merged partial images
US9303990B2 (en) * 2014-04-11 2016-04-05 Black & Decker Inc. Laser line generating device
CN109215108B (en) * 2017-06-30 2023-05-23 深圳先进技术研究院 Panoramic three-dimensional reconstruction system and method based on laser scanning
CN108759714B (en) * 2018-05-22 2020-01-03 华中科技大学 Coordinate system fusion and rotating shaft calibration method for multi-line laser profile sensor
CN108717715B (en) * 2018-06-11 2022-05-31 华南理工大学 Automatic calibration method for linear structured light vision system of arc welding robot
CN108732556B (en) * 2018-08-17 2020-03-27 西南交通大学 Vehicle-mounted laser radar simulation method based on geometric intersection operation
CN109291048B (en) * 2018-09-26 2020-11-13 泉州华中科技大学智能制造研究院 Real-time online programming system and method for grinding and polishing industrial robot
CN110553600B (en) * 2019-08-14 2021-05-14 华南理工大学 Method for generating simulated laser line of structured light sensor for workpiece detection

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A review of modeling and simulation of laser beam machining; Parandoush P et al.; International Journal of Machine Tools and Manufacture; 2014-12-31; pp. 135-145 *
Research on a robot welding trajectory correction method based on two-dimensional laser scanning imaging; Li Bo; China Master's Theses Full-text Database, Information Science and Technology; 2015-12-15; pp. I140-196 *

Also Published As

Publication number Publication date
CN112800582A (en) 2021-05-14

Similar Documents

Publication Publication Date Title
CN110553600B (en) Method for generating simulated laser line of structured light sensor for workpiece detection
Palomer et al. Underwater laser scanner: Ray-based model and calibration
CN103678754B (en) Information processor and information processing method
TWI672207B (en) Posture positioning system for machine and the method thereof
Balanji et al. A novel vision-based calibration framework for industrial robotic manipulators
JP2004508954A (en) Positioning device and system
JP2003058911A (en) Device, method, program for modeling surface shape of three-dimensional object
Zhang et al. Error correctable hand–eye calibration for stripe-laser vision-guided robotics
Li et al. Structured light-based visual servoing for robotic pipe welding pose optimization
Liska et al. Hand-eye calibration of a laser profile scanner in robotic welding
Grudziński et al. Stereovision tracking system for monitoring loader crane tip position
CN112800582B (en) Method for generating simulated laser line of structured light vision sensor
CN117162098A (en) Autonomous planning system and method for robot gesture in narrow space
JP6848761B2 (en) Distance evaluation method between objects and interference evaluation method between relatively moving objects
CN113324538B (en) Cooperative target remote high-precision six-degree-of-freedom pose measurement method
US11055562B1 (en) Methods and systems for registering a three-dimensional pose of an object
Deng et al. Hand-eye calibration of line structured-light sensor by scanning and reconstruction of a free-placed standard cylindrical target
Erceg et al. Stereo vision based robot welding
Shi et al. Development of an automatic optical measurement system for automotive part surface inspection
CN116625242B (en) Path planning method and system for optical three-coordinate measuring machine, electronic equipment and medium
Na et al. Robotic Path Planning for Inspection of Complex-Shaped Objects
Tang et al. 3-step-calibration of 3D vision measurement system based-on structured light
JP5938201B2 (en) Position / orientation measuring apparatus, processing method thereof, and program
Madhusudanan Fast Eye-In-Hand 3D Scanner-Robot Calibration for Low Stitching Errors
Qin et al. Sensor calibration and trajectory planning in 3D vision-guided robots

Legal Events

Code: Description
PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant