CN103822594A - Workpiece scanning imaging method based on laser sensor and robot - Google Patents

Workpiece scanning imaging method based on laser sensor and robot

Info

Publication number
CN103822594A
CN103822594A CN201410072626.3A CN201410072626A
Authority
CN
China
Prior art keywords
laser
workpiece
sensor
robot
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410072626.3A
Other languages
Chinese (zh)
Inventor
张铁
李波
邹焱飚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN201410072626.3A priority Critical patent/CN103822594A/en
Publication of CN103822594A publication Critical patent/CN103822594A/en
Pending legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a workpiece scanning imaging method based on a laser sensor and a robot. The method comprises the following steps: (A) acquiring workpiece profile point data by laser scanning, and (B) processing the point data to generate a workpiece profile image. Step A acquires the raw laser scanning point data; step B converts that raw point data into a binarized workpiece profile image. The method is practical, flexible and resistant to interference, and overcomes the insufficient anti-interference performance of conventional CCD cameras in special industrial working environments.

Description

Workpiece scanning imaging method based on a laser sensor and a robot
Technical field
The present invention relates to workpiece scanning imaging technology, and in particular to a workpiece scanning imaging method based on a sensor and a robot. The scanning imaging method of the present invention is a workpiece scanning imaging method based on a stripe laser sensor and an industrial robot: the industrial robot serves as the mechanical motion actuator, the industrial robot together with the stripe laser sensor serves as the measuring device, and a computer serves as the communication, data acquisition and processing platform, so as to perform scanning imaging of a workpiece.
Background technology
Because vision systems offer good detection and positioning performance, robotic vision systems have become a focus of robotics research. Vision sensing methods are receiving increasing attention because of the rich information they acquire, their high sensitivity and accuracy, and the fact that they do not contact the workpiece.
At present, the images acquired by visual sensing include images based on natural light, images based on artificial ordinary light, and structured-light images that use a laser as the active light source. In some special industrial environments, for example at welding sites where strong arc light, dust and smoke are present, the performance of a conventional CCD camera is severely disturbed and the camera cannot complete its task reliably. In contrast, the stripe laser sensor described herein is based on the triangulation principle and measures the object profile with a linear laser beam; an optical filter matched to the laser wavelength blocks all stray light, including arc light, so that the optical receiver assembly and CMOS area detector integrated inside the sensor receive only the image formed by the laser stripe. The advantage of this stripe laser sensor is that it contains no moving parts, it is rugged and durable, and it is immune to interference such as arc light, fumes and spatter.
As an active light source, the laser offers high energy, high brightness, good monochromaticity and strong anti-interference capability, so laser vision sensors have a large development prospect.
Summary of the invention
The object of the present invention is to overcome the shortcomings and deficiencies of the prior art by providing a workpiece scanning imaging method based on a laser sensor and a robot. The method comprises two steps, acquiring workpiece profile point data by laser scanning and processing the point data to generate a workpiece profile image, and it is practical, flexible and resistant to interference.
The object of the present invention is achieved through the following technical solution:
A workpiece scanning imaging method based on a laser sensor and a robot. The method uses a stripe laser sensor with an integrated optical receiver assembly and CMOS area detector, together with a robot (including its robot controller); the stripe laser sensor is fixedly mounted on the flange at the robot end, and together they perform the measurement task of acquiring the workpiece profile point data obtained by laser scanning.
When the linear beam emitted by the stripe laser sensor is projected onto the surface of the measured object, the laser beam forms an image that follows the surface profile of the object. The beam contains a series of P continuous, evenly spaced laser sampling points, and the sensor returns the Z-axis and X-axis coordinate values of these P sampling points in the sensor coordinate system.
The method also uses a computer to collect and process the laser scanning point data and then to image the workpiece profile.
The method further requires some auxiliary components, for example a workpiece platform and the workpiece itself; the workpiece platform is used to hold the workpiece.
The workpiece scanning imaging method based on a laser sensor and a robot comprises the following steps:
Step A: acquire the workpiece profile point data obtained by laser scanning;
Step B: process the point data to generate a workpiece profile image.
Step A comprises the following steps:
A1) With the workpiece on the workpiece platform located at a spatial point A within the working radius of the robot, the robot drives the stripe laser sensor to a suitable fixed spatial point B above the workpiece;
A2) Starting from point B, the robot drives the stripe laser sensor to perform a straight-line scanning measurement of the workpiece at a constant speed V. The stripe laser sensor is triggered to emit a laser beam at period T to measure the object and returns the Z-axis and X-axis coordinate values of the laser sampling points in the sensor coordinate system. During the scan, the gap between the back plane of the stripe laser sensor and the workpiece platform is kept constant. The speed V is determined from s = d, where s is the spacing of two adjacent laser beams in the Y-axis direction of the sensor coordinate system, s = VT, and d is the spacing of adjacent laser sampling points within one beam in the X-axis direction of the sensor coordinate system;
A3) The computer continuously reads in real time, at period T, the Y-axis coordinate of the robot end in the stripe laser sensor measurement coordinate system from the robot controller;
A4) When the scanning motion ends, the stripe laser sensor moves to another suitable fixed spatial point C above the workpiece, the robot stops moving, and the sensor stops emitting the laser to measure the workpiece;
A5) Through its communication with the robot controller and the stripe laser sensor, the computer obtains the workpiece profile point data collected during the scan.
Step B comprises the following steps:
B1) Interpolate the few noise points: because some workpiece profiles are extremely irregular or have large height differences, the optical receiver assembly and CMOS area detector inside the sensor cannot detect the laser sampling points at certain positions, so the actual coordinate values at these noise points cannot be obtained. These noise points are filled in by quadratic interpolation;
B2) Interpolate to compensate the error in the laser emission period T: the scan-trigger period of the stripe laser sensor is set by a timer to T (ms), but in practice the period fluctuates slightly. The Z-axis coordinate values returned by the P sampling points of each laser beam are therefore linearly interpolated according to the ideal step length s;
B3) Interpolate the number of laser sampling points in the X-axis direction of the sensor coordinate system: the Z-axis coordinate values of the P sampling points in the X-axis direction of each laser beam are interpolated;
B4) Establish the criterion for generating the workpiece binary image matrix: a filtering criterion is established which, based on the Z-axis height information returned by the laser sampling points, separates the sampling points lying on the workpiece profile surface from those that do not. For each laser sampling point, if it is judged to lie on the workpiece surface, the corresponding element of the workpiece image binarization matrix is set to 0; otherwise it is set to 255;
B5) Generate the workpiece profile binary image from the binarized image matrix:
In the workpiece image binarization matrix, an element of value 0 corresponds to a black pixel and an element of value 255 to a white pixel.
Compared with the prior art, the present invention has the following advantages and effects:
1. The present invention is practical, flexible to use and resistant to interference.
2. The present invention is suitable for certain special industrial occasions and effectively solves the problem that conventional CCD camera techniques have insufficient anti-interference performance in such working environments.
Brief description of the drawings
Fig. 1 is an overall schematic diagram of laser scanning imaging in the present invention.
Fig. 2 is a schematic diagram of noise-point interpolation in the present invention.
Fig. 3 is a schematic diagram of the spacing between adjacent laser beams and between adjacent laser sampling points in the present invention, where each circle "○" corresponds to one pixel in the workpiece binary image.
Fig. 4 is a schematic diagram of the interpolation used in the present invention to correct the error caused by the imprecise laser emission period.
Fig. 5 is a schematic diagram of the interpolation of the number of sampling points in the present invention.
Fig. 6 is a schematic diagram of a laser beam striking the workpiece profile surface in the present invention.
Detailed description of the embodiments
The present invention is described in further detail below with reference to an embodiment and the accompanying drawings, but the embodiments of the present invention are not limited thereto.
Embodiment
A stripe laser sensor is mounted at the robot end. The sensor and the robot (including its robot controller), with the stripe laser sensor fixedly mounted on the flange at the robot end, together perform the measurement task of acquiring the workpiece profile point data obtained by laser scanning.
When the laser beam emitted by the stripe laser sensor is projected onto the surface of the measured object, the laser beam forms an image that follows the surface profile of the object. The beam contains a series of P continuous, evenly spaced laser sampling points, and the sensor returns the Z-axis and X-axis coordinate values of these P sampling points in the sensor measurement coordinate system.
The computer communicates with the laser sensor and the robot controller to collect and process the laser scanning point data and then image the workpiece profile.
This embodiment also uses some other accessories, such as a workpiece platform and the workpiece; the workpiece platform is used to hold the workpiece.
Step A (acquiring the workpiece profile point data obtained by laser scanning) comprises the following steps, as shown in Fig. 1:
A1) With the workpiece 5 on the workpiece platform 6 located at a spatial point A within the working radius of the robot 1, the robot drives the stripe laser sensor 3 to a suitable fixed spatial point B above the workpiece;
A2) Starting from point B, the robot 1 drives the stripe laser sensor 3 along a scanning path 7 to perform a straight-line scanning measurement of the workpiece 5 at a constant speed V along the Y axis of the sensor coordinate system. The stripe laser sensor 3 is triggered to emit a laser beam at period T to measure the workpiece 5 and returns the Z-axis and X-axis coordinate values of the sampling points within the beam in the sensor coordinate system 4. During the scan, the gap between the back plane 2 of the stripe laser sensor and the workpiece platform 6 is kept constant. As shown in Fig. 3, the speed V must satisfy s = d, where s is the spacing of two adjacent laser beams in the Y-axis direction of the sensor coordinate system, s = VT, and d is the spacing of adjacent laser sampling points within one beam in the X-axis direction of the sensor coordinate system, d = (X_max − X_min)/n, with X_max and X_min the maximum and minimum X-axis coordinate values returned by the sampling points of one laser beam and n the total number of laser sampling points in one beam (see the sketch below);
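For illustration only, the following is a minimal Python sketch of how the scan speed V can be chosen so that the beam spacing s = VT equals the in-beam sample spacing d; the function and variable names are not from the patent, and the units in the example are assumed (millimetres and seconds).

```python
# Sketch: choose the scan speed V so that the spacing between adjacent beams
# (s = V*T) equals the spacing between adjacent samples within one beam (d).
def scan_speed(x_max: float, x_min: float, n: int, period: float) -> float:
    """Return the scan speed V satisfying s = V*T = d."""
    d = (x_max - x_min) / n      # spacing of samples along the sensor X axis
    return d / period            # V = d / T, so that s = V*T = d

# Example: a beam spanning 100 mm with 640 samples, emitted every 10 ms
print(scan_speed(x_max=100.0, x_min=0.0, n=640, period=0.010))
```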
A3) The computer 8 continuously reads in real time, at period T, the Y-axis position of the end of robot 1 in the stripe laser sensor measurement coordinate system 4 from the robot controller 9;
A4) When the scanning motion ends, the stripe laser sensor 3 moves to another suitable fixed spatial point C above the workpiece, the robot stops moving, and the sensor stops emitting the laser to measure the workpiece;
A5) Through its communication with the robot control cabinet 9 and the stripe laser sensor 3, the computer 8 obtains the workpiece profile point data collected during the scan;
During the scanning measurement the sensor periodically emits a total of N laser beams for sampled measurement.
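As a point of reference, a minimal Python sketch (with illustrative names not taken from the patent) of the data gathered by step A: N beams, each carrying P sampled (X, Z) pairs in the sensor frame, plus the robot Y position read from the controller when each beam was fired.

```python
# Sketch of the raw scan data produced by step A; names are illustrative.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class LaserBeam:
    y: float                            # robot Y position when the beam was fired
    samples: List[Tuple[float, float]]  # P pairs of (X, Z) sensor coordinates

@dataclass
class ScanData:
    beams: List[LaserBeam]              # N beams collected during the scan
```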
Step B (processing the point data to generate the workpiece profile image) comprises the following steps:
B1) Interpolate the few noise points:
Because some workpiece profiles are extremely irregular or have large height differences, the optical receiver assembly and CMOS area detector inside the sensor cannot detect the laser sampling points at certain positions. In that case both the X-axis and Z-axis coordinate values that the laser sensor returns to the computer for that sampling point are 0; such a sampling point is called a noise point here;
As shown in Fig. 2, suppose the m-th (m < P) laser sampling point of beam i+3 (i < N−3) is a noise point. Its Z-axis coordinate value is obtained by Lagrange quadratic interpolation:
$$
\begin{aligned}
l_0 &= \frac{(Y - Y_{i+1})(Y - Y_{i+2})}{(Y_i - Y_{i+1})(Y_i - Y_{i+2})}, \qquad
l_1 = \frac{(Y - Y_i)(Y - Y_{i+2})}{(Y_{i+1} - Y_i)(Y_{i+1} - Y_{i+2})}, \qquad
l_2 = \frac{(Y - Y_i)(Y - Y_{i+1})}{(Y_{i+2} - Y_i)(Y_{i+2} - Y_{i+1})}, \\
Z_{i+3}(Y_{i+3}) &= l_0 Z_i + l_1 Z_{i+1} + l_2 Z_{i+2}, \quad \text{evaluated at } Y = Y_{i+3},
\end{aligned}
$$
where:
Y_i, Y_{i+1}, Y_{i+2} are the Y-axis coordinate values of the m-th laser sampling point in beams i, i+1 and i+2 respectively (obtained in step A3);
Z_i, Z_{i+1}, Z_{i+2} are the Z-axis coordinate values of the m-th laser sampling point in beams i, i+1 and i+2 respectively, 1 ≤ i ≤ N;
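A minimal Python sketch of this Lagrange quadratic interpolation, evaluated at the Y position of the noise point; the function and variable names are illustrative and not from the patent.

```python
# Sketch: fill the Z value of a noise point (returned X and Z both 0) from the
# same sample index m in three neighbouring beams, by Lagrange quadratic
# interpolation over their Y positions.
def interpolate_noise_z(y, y0, y1, y2, z0, z1, z2):
    """Estimate Z at beam position y from three beams at (y0, z0), (y1, z1), (y2, z2)."""
    l0 = (y - y1) * (y - y2) / ((y0 - y1) * (y0 - y2))
    l1 = (y - y0) * (y - y2) / ((y1 - y0) * (y1 - y2))
    l2 = (y - y0) * (y - y1) / ((y2 - y0) * (y2 - y1))
    return l0 * z0 + l1 * z1 + l2 * z2

# Example: beams at Y = 0, 1, 2 mm with Z = 10, 12, 11 mm; noise point at Y = 3 mm
print(interpolate_noise_z(3.0, 0.0, 1.0, 2.0, 10.0, 12.0, 11.0))
```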
B2) Interpolate to compensate the error caused by the imprecise laser emission period T:
The scan-trigger period of the stripe laser sensor is set by a timer to T (ms), but in practice the period fluctuates slightly. Because T cannot be guaranteed exactly, the actual spacing s cannot be guaranteed to equal d (see Fig. 3), and the generated image of the measured piece therefore cannot be guaranteed to be free of distortion. This is also why the Y-axis position coordinate of the robot end must be read in real time from the robot controller, that is, why the robot serves as a measuring instrument. To keep the image pixels generated later as free of geometric distortion as possible, the Z-axis coordinate values of the P sampling points corresponding to each Y value, for the spacings s_1, s_2, s_3, ..., s_N, must be linearly interpolated according to the ideal step length s. The interpolation principle is shown in Fig. 4, and the interpolation formula is:
$$
Z'[i] = \frac{S_1 + S_2 + \cdots + S_i}{V\,T\,i}\; Z[i],
$$
where:
Z[i] is an array of P elements representing the Z-axis coordinate values of the P sampling points of beam i, 1 ≤ i ≤ N;
Z'[i] is an array representing the P sampling points of beam i after interpolation;
The Y-axis direction is the scanning direction of the laser sensor. The Z-axis coordinate values of the P sampling points of the beam corresponding to each Y-axis coordinate value are corrected by this interpolation, which essentially eliminates the geometric distortion of the resulting image;
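The sketch below applies this correction beam by beam, following the formula above as stated (scaling each beam's Z values by the ratio of the actual Y distance travelled to the ideal distance V·T·i); it is illustrative only and all names are assumptions.

```python
# Sketch of step B2: compensate timer-period drift using the robot Y readings.
# spacings[k] is the measured Y increment S_{k+1} between consecutive beams.
from typing import List

def correct_period_drift(z_beams: List[List[float]],
                         spacings: List[float],
                         v: float, t: float) -> List[List[float]]:
    corrected = []
    travelled = 0.0
    for i, (z_beam, s_i) in enumerate(zip(z_beams, spacings), start=1):
        travelled += s_i                 # actual Y distance covered so far
        scale = travelled / (v * t * i)  # ratio of actual to ideal travel
        corrected.append([scale * z for z in z_beam])
    return corrected
```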
B3) Interpolate the number of the P evenly spaced laser sampling points in the sensor X-axis direction:
After the two interpolation steps above, the data of N·P laser sampling points remain. The laser sensor provides only P sampling points in the X-axis direction of its coordinate system, a limit set by the sensor itself. Clearly, the accuracy of the sampled data is positively correlated with the number of sampling points, and it directly affects the accuracy of the workpiece profile imaging. To improve the accuracy of the data in the X-axis direction, for the P sampling points of each laser beam, as shown in Fig. 5, the arithmetic mean of the Z-axis coordinate values of every two adjacent sampling points is taken as the Z-axis coordinate value of a new sampling point in the X-axis direction, i.e. Z'_i = (Z_i + Z_{i+1})/2;
where:
Z'_i is the Z-axis coordinate value of the sampling point obtained by interpolation;
Z_i is the Z-axis coordinate value of the i-th laser sampling point;
As shown in Fig. 5, the circles in the laser beam represent the original laser sampling points and the triangles represent the sampling points obtained by interpolation. After interpolation the number of sampling points in each laser beam increases from P to 2P−1 (see the sketch below);
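A minimal Python sketch of this in-beam densification: a new sample whose Z is the arithmetic mean of its two neighbours is inserted between every adjacent pair, turning P samples into 2P−1. Names are illustrative, not from the patent.

```python
# Sketch of step B3: densify the Z values of one beam from P to 2P-1 samples.
from typing import List

def densify_beam(z_values: List[float]) -> List[float]:
    out = []
    for z_a, z_b in zip(z_values, z_values[1:]):
        out.append(z_a)
        out.append((z_a + z_b) / 2.0)   # interpolated midpoint sample
    out.append(z_values[-1])
    return out                          # length 2P-1

print(densify_beam([10.0, 12.0, 11.0]))  # -> [10.0, 11.0, 12.0, 11.5, 11.0]
```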
B4) Establish the criterion for generating the workpiece binary image matrix:
A filtering criterion is established which, based on the Z-axis height information returned by the laser sampling points, separates the sampling points on the workpiece surface from those not on the workpiece surface. Fig. 6 shows a laser beam emitted by the sensor striking the workpiece profile surface:
The workpiece profile image is determined from the N(2P−1) sampling points of the N laser beams. The steps for determining the projection of the workpiece along the Z direction onto the plane P in which the workpiece lies are as follows:
For the i-th laser beam, its 2P−1 sampling points can be divided into two classes:
Class A points: sampling points on the workpiece profile surface, i.e. the sampling points of segment BG;
Class B points: sampling points not on the workpiece profile surface, i.e. the sampling points of segments AB and GH;
By analogy, the N(2P−1) sampling points of the N laser beams can likewise be divided into two parts: class A points and class B points;
All sampling points are divided into class A and class B points in the following two steps:
1: Find the maximum Z-axis coordinate value Zmax among all N(2P−1) sampling points; this point obviously belongs to class B;
2: Traverse the Z-axis coordinate values of all sampling points. Let the minimum thickness of the workpiece be T. If the Z-axis coordinate value Z[i] of a sampling point satisfies |Zmax − Z[i]| > aT (0 < a < 1, typically a = 0.9), the sampling point is considered to belong to class A; otherwise it belongs to class B;
Each laser sampling point corresponds to one element of the workpiece binary image matrix, so this matrix has N(2P−1) elements;
For each laser sampling point, if it is judged to belong to class A, the corresponding element of the workpiece image binarization matrix is set to 0; otherwise it is set to 255;
This yields an assigned image matrix ImageArray[2P−1][N] (a sketch of the classification is given below);
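A minimal Python sketch of this binarization criterion, under the patent's convention that class A points (on the workpiece, |Zmax − Z| > aT with a ≈ 0.9) map to 0 and class B points map to 255; names are illustrative, and the row/column orientation relative to ImageArray[2P−1][N] is an implementation detail that may require transposition.

```python
# Sketch of step B4: classify every sample against Zmax and build a 0/255 matrix.
from typing import List

def binarise(beams: List[List[float]], thickness: float, a: float = 0.9) -> List[List[int]]:
    """beams: N lists of 2P-1 Z values; thickness: minimum workpiece thickness."""
    z_max = max(z for beam in beams for z in beam)   # belongs to class B
    return [[0 if abs(z_max - z) > a * thickness else 255 for z in beam]
            for beam in beams]
```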
B5) Generate the workpiece profile binary image from the binarized image matrix:
When the program generates the binary image of the workpiece from the matrix ImageArray[2P−1][N], a matrix element of value 255 corresponds to a white pixel and an element of value 0 to a black pixel. In other words, the sampling points lying on the workpiece profile surface appear as black points in the final binary image, and the sampling points not on the workpiece profile surface appear as white points. Thus, along the Z-axis direction of the sensor measurement coordinate system 4 in Fig. 1, the projection of the workpiece outline onto the plane of the workpiece bottom consists entirely of black points.
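For completeness, a minimal sketch (assumed helper name, no external libraries) of writing such a 0/255 matrix to an ASCII PGM grayscale file, where 0 renders as a black pixel and 255 as a white pixel:

```python
# Sketch of step B5: dump the binarized matrix to a plain-text PGM image file.
from typing import List

def write_pgm(matrix: List[List[int]], path: str) -> None:
    height = len(matrix)
    width = len(matrix[0])
    with open(path, "w") as f:
        f.write(f"P2\n{width} {height}\n255\n")   # PGM header: format, size, max value
        for row in matrix:
            f.write(" ".join(str(v) for v in row) + "\n")
```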
The above embodiment is a preferred embodiment of the present invention, but the embodiments of the present invention are not limited thereto. Any other change, modification, substitution, combination or simplification made without departing from the spirit and principle of the present invention shall be deemed an equivalent replacement and shall fall within the protection scope of the present invention.

Claims (6)

1. A workpiece scanning imaging method based on a laser sensor and a robot, characterized by comprising the following steps:
Step A: acquiring the workpiece profile point data obtained by laser scanning;
Step B: processing the point data to generate a workpiece profile image;
wherein step A comprises the following steps:
A1) with the workpiece on the workpiece platform located at a spatial point A within the working radius of the robot, the robot drives the stripe laser sensor to a suitable fixed spatial point B above the workpiece;
A2) starting from point B, the robot drives the stripe laser sensor to perform a straight-line scanning measurement of the workpiece at speed V;
A3) the computer continuously reads coordinate information in real time from the robot controller at period T;
A4) when the scanning motion ends, the stripe laser sensor moves to another suitable fixed spatial point C above the workpiece, the robot stops moving, and the sensor stops emitting the laser to measure the workpiece;
A5) through its communication with the robot controller and the stripe laser sensor, the computer obtains the workpiece profile point data collected during the scan;
and step B comprises the following steps:
B1) interpolating the few noise points;
B2) interpolating to compensate the error caused by the insufficiently accurate laser emission period T, wherein the scan-trigger period of the stripe laser sensor is set by a timer to T, and the Z-axis coordinate values returned by the P laser sampling points of each laser beam are linearly interpolated according to the ideal step length s;
B3) interpolating the number of laser sampling points in the X-axis direction of the sensor coordinate system;
B4) establishing the criterion for generating the workpiece binary image matrix;
B5) generating the workpiece profile binary image from the binarized image matrix:
in the workpiece image binarization matrix, an element of value 0 corresponds to a black pixel and an element of value 255 to a white pixel.
2. The workpiece scanning imaging method based on a laser sensor and a robot according to claim 1, characterized in that, in step A2), the stripe laser sensor continuously emits laser beams at period T to take measurements and returns measurement data, the measurement data being the Z-axis and X-axis coordinate values of the laser sampling points in the sensor coordinate system; during the scan the gap between the back plane of the stripe laser sensor and the workpiece platform is kept constant, and the speed V of the straight-line scanning motion is determined from s = d;
where s is the spacing of two adjacent laser beams in the Y-axis direction of the sensor coordinate system, s = VT, and d is the spacing of adjacent laser sampling points within one beam in the X-axis direction of the sensor coordinate system.
3. The workpiece scanning imaging method based on a laser sensor and a robot according to claim 1 or 2, characterized in that, in step A3), the coordinate information is the Y-axis coordinate of the robot end in the stripe laser sensor measurement coordinate system.
4. The workpiece scanning imaging method based on a laser sensor and a robot according to any one of claims 1 to 3, characterized in that, in step B1), the few noise points are interpolated by quadratic interpolation.
5. The workpiece scanning imaging method based on a laser sensor and a robot according to any one of claims 1 to 4, characterized in that, in step B3), the interpolation is performed on the Z-axis coordinate values corresponding to the P laser sampling points in the X-axis direction of each laser beam.
6. The workpiece scanning imaging method based on a sensor and a robot according to any one of claims 1 to 5, characterized in that step B4) comprises the following steps:
Step B41: establishing a filtering criterion which, based on the Z-axis height information returned by the laser sampling points, separates the laser sampling points on the workpiece surface from those not on the workpiece surface;
Step B42: for each laser sampling point, if it is judged to lie on the workpiece surface, setting the corresponding element of the workpiece image binarization matrix to 0; otherwise setting it to 255.
CN201410072626.3A 2014-02-28 2014-02-28 Workpiece scanning imaging method based on laser sensor and robot Pending CN103822594A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410072626.3A CN103822594A (en) 2014-02-28 2014-02-28 Workpiece scanning imaging method based on laser sensor and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410072626.3A CN103822594A (en) 2014-02-28 2014-02-28 Workpiece scanning imaging method based on laser sensor and robot

Publications (1)

Publication Number Publication Date
CN103822594A true CN103822594A (en) 2014-05-28

Family

ID=50757771

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410072626.3A Pending CN103822594A (en) 2014-02-28 2014-02-28 Workpiece scanning imaging method based on laser sensor and robot

Country Status (1)

Country Link
CN (1) CN103822594A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107677205A (en) * 2017-09-29 2018-02-09 桂林电子科技大学 A kind of laser measurement system and method based on industrial robot
CN109746943A (en) * 2019-03-21 2019-05-14 重庆东渝中能实业有限公司 3D structure light scan device and intelligence manufacture robot
CN109773776A (en) * 2017-11-14 2019-05-21 欧姆龙株式会社 Holding method holds system and storage medium
CN110017769A (en) * 2019-03-12 2019-07-16 精诚工科汽车系统有限公司 Part detection method and system based on industrial robot
CN111664808A (en) * 2020-05-09 2020-09-15 洛阳矿山机械工程设计研究院有限责任公司 Crusher lining plate detection method based on contour scanning
CN111776762A (en) * 2018-10-30 2020-10-16 牧今科技 Robotic system with automated package scanning and registration mechanism and method of operation thereof
US11780101B2 (en) 2018-10-30 2023-10-10 Mujin, Inc. Automated package registration systems, devices, and methods
CN117444404A (en) * 2023-11-20 2024-01-26 北京绿能环宇低碳科技有限公司 Intelligent positioning method and system for laser welding

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2463784Y (en) * 2001-01-19 2001-12-05 西安交通大学 Contactless three-dimensional contour measuring instrument
CN1354355A (en) * 2001-12-10 2002-06-19 西安交通大学 Laser linear scanning three-dimensional measurement double liquid knife virtual grid mapping calibrating method and equipment
CN1474160A (en) * 2003-07-02 2004-02-11 西安邮电学院 Multiple-section synthesizing three-dimensional profile measuring method
CN101198435A (en) * 2005-06-13 2008-06-11 Abb研究有限公司 Defect detection system for identifying defects in weld seams
CN101334270A (en) * 2008-07-25 2008-12-31 西安交通大学 Laser line scanning feeler geometric transformation calibration and curved face interpolation correcting method and apparatus
CN101729739A (en) * 2009-11-16 2010-06-09 潘林岭 Method for rectifying deviation of image
CN102494961A (en) * 2011-11-14 2012-06-13 长沙理工大学 Asphalt surface layer structure anti-rutting performance evaluation method
US20130342850A1 (en) * 2012-06-26 2013-12-26 Toyota Motor Engineering & Manufacturing North America, Inc. Workpiece engagement checking apparatus

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2463784Y (en) * 2001-01-19 2001-12-05 西安交通大学 Contactless three-dimensional contour measuring instrument
CN1354355A (en) * 2001-12-10 2002-06-19 西安交通大学 Laser linear scanning three-dimensional measurement double liquid knife virtual grid mapping calibrating method and equipment
CN1474160A (en) * 2003-07-02 2004-02-11 西安邮电学院 Multiple-section synthesizing three-dimensional profile measuring method
CN101198435A (en) * 2005-06-13 2008-06-11 Abb研究有限公司 Defect detection system for identifying defects in weld seams
CN101334270A (en) * 2008-07-25 2008-12-31 西安交通大学 Laser line scanning feeler geometric transformation calibration and curved face interpolation correcting method and apparatus
CN101729739A (en) * 2009-11-16 2010-06-09 潘林岭 Method for rectifying deviation of image
CN102494961A (en) * 2011-11-14 2012-06-13 长沙理工大学 Asphalt surface layer structure anti-rutting performance evaluation method
US20130342850A1 (en) * 2012-06-26 2013-12-26 Toyota Motor Engineering & Manufacturing North America, Inc. Workpiece engagement checking apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
丁献文 (Ding Xianwen) et al., "Research on Noise Suppression Methods for Navigation Radar Images" (航海雷达图像噪声抑制方法研究), Ocean Technology (海洋技术), vol. 30, no. 3, 30 September 2011 (2011-09-30) *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107677205A (en) * 2017-09-29 2018-02-09 桂林电子科技大学 A kind of laser measurement system and method based on industrial robot
CN109773776A (en) * 2017-11-14 2019-05-21 欧姆龙株式会社 Holding method holds system and storage medium
US11636605B2 (en) 2018-10-30 2023-04-25 Mujin, Inc. Robotic system with automated package registration mechanism and minimum viable region detection
US11501445B2 (en) 2018-10-30 2022-11-15 Mujin, Inc. Robotic system with automated package scan and registration mechanism and methods of operating the same
US12002007B2 (en) 2018-10-30 2024-06-04 Mujin, Inc. Robotic system with automated package scan and registration mechanism and methods of operating the same
CN111776762A (en) * 2018-10-30 2020-10-16 牧今科技 Robotic system with automated package scanning and registration mechanism and method of operation thereof
US11062457B2 (en) 2018-10-30 2021-07-13 Mujin, Inc. Robotic system with automated package registration mechanism and minimum viable region detection
CN111776762B (en) * 2018-10-30 2021-07-23 牧今科技 Robotic system with automated package scanning and registration mechanism and method of operation thereof
US11176674B2 (en) 2018-10-30 2021-11-16 Mujin, Inc. Robotic system with automated object detection mechanism and methods of operating the same
US11189033B2 (en) 2018-10-30 2021-11-30 Mujin, Inc. Robotic system with automated package registration mechanism and auto-detection pipeline
US11288810B2 (en) 2018-10-30 2022-03-29 Mujin, Inc. Robotic system with automated package registration mechanism and methods of operating the same
US11961042B2 (en) 2018-10-30 2024-04-16 Mujin, Inc. Robotic system with automated package registration mechanism and auto-detection pipeline
US11797926B2 (en) 2018-10-30 2023-10-24 Mujin, Inc. Robotic system with automated object detection mechanism and methods of operating the same
US11780101B2 (en) 2018-10-30 2023-10-10 Mujin, Inc. Automated package registration systems, devices, and methods
CN110017769A (en) * 2019-03-12 2019-07-16 精诚工科汽车系统有限公司 Part detection method and system based on industrial robot
CN109746943A (en) * 2019-03-21 2019-05-14 重庆东渝中能实业有限公司 3D structure light scan device and intelligence manufacture robot
CN111664808A (en) * 2020-05-09 2020-09-15 洛阳矿山机械工程设计研究院有限责任公司 Crusher lining plate detection method based on contour scanning
CN117444404A (en) * 2023-11-20 2024-01-26 北京绿能环宇低碳科技有限公司 Intelligent positioning method and system for laser welding
CN117444404B (en) * 2023-11-20 2024-03-29 北京绿能环宇低碳科技有限公司 Intelligent positioning method and system for laser welding

Similar Documents

Publication Publication Date Title
CN103822594A (en) Workpiece scanning imaging method based on laser sensor and robot
US10254404B2 (en) 3D measuring machine
CN104236463B (en) Machine vision inspection system and the method for carrying out high speed focusing elevation carrection operation
EP0532169B1 (en) Optical Inspection Probe
CN100483116C (en) Method for detecting 3D defects on surface of belt material
EP3783304A1 (en) Calibration of a triangulation sensor
CN111595949B (en) Laser ultrasonic imaging detection system and detection method for self-adaptive irregular surface
US20120072170A1 (en) Vision measurement probe and method of operation
CN107121093A (en) A kind of gear measurement device and measuring method based on active vision
CN103988049A (en) Coordinate measuring machine having camera
EP0114517B1 (en) Mark position detecting method and apparatus
CN103091992A (en) Workpiece position correction device and correction method
CN103411533A (en) Structured light self-adapting repeated multi-exposure method
CN109916305A (en) Vehicle dimension measuring system and method based on optoelectronic induction and TOF technology
CN109458949A (en) A kind of object surface appearance scanning reconstructing arrangement
KR20160102244A (en) Non-imaging coherent line scanner systems and methods for optical inspection
CN104034259B (en) A kind of image measurer bearing calibration
JP5874252B2 (en) Method and apparatus for measuring relative position with object
Manthey et al. Calibration of a laser range-finding coordinate-measuring machine
US20210270969A1 (en) Enhanced depth mapping using visual inertial odometry
CN109696191A (en) A kind of virtual reality wears the mobile delay measuring method of display equipment
CN204730813U (en) A kind of medium plate Shap feature detection system controlled based on symmetric double line laser angle
CN103365099A (en) Focusing and leveling signal processing method
JPH06207812A (en) Measurement point indicator for three-dimensional measurement
CN105783782B (en) Surface curvature is mutated optical profilometry methodology

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20140528