US5570190A - Visual sensor coordinate system setting jig and setting method - Google Patents

Visual sensor coordinate system setting jig and setting method Download PDF

Info

Publication number
US5570190A
US5570190A (application US08/284,442)
Authority
US
United States
Prior art keywords
feature points
coordinate system
visual sensor
sensor coordinate
jig
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US08/284,442
Other languages
English (en)
Inventor
Fumikazu Terawaki
Fumikazu Warashina
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fanuc Corp
Original Assignee
Fanuc Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fanuc Corp
Assigned to FANUC LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TARAWAKI, FUMIKAZU; WARASHINA, FUMIKAZU
Application granted
Publication of US5570190A
Anticipated expiration
Status: Expired - Lifetime

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1692Calibration of manipulator
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques

Definitions

  • The present invention relates to a coordinate system setting jig for a visual sensor used to detect the position and attitude of an object when controlling an automatic machine such as a robot, and to a sensor coordinate system setting method using the jig.
  • A visual sensor, which includes a camera having a CCD element array and the like and an image processing device for storing and analyzing the image picked up by the camera, is often used for the control of an automatic machine such as a robot, or for other similar applications.
  • Conventionally, a system is known that is arranged as follows: in performing welding, transportation, or other operations on a workpiece by using a robot, the deviation of the workpiece, as the object of operation, from its reference position or reference attitude is detected by one or more visual sensors; data indicating this deviation is transmitted to a robot controller as correction data for the robot position or attitude; and the position or attitude is corrected in accordance with the correction data as the robot executes a taught operation.
  • Such a system must be arranged so that the coordinate system on which the image data recognized by each visual sensor is based (the camera coordinate system, more specifically the CCD pixel array, for example) and the coordinate system set in the robot (usually the work coordinate system) can be integrated, and so that position data (pixel value data) detected by the individual visual sensors on the former coordinate system can be converted into data represented on the latter coordinate system (work coordinate system data).
  • Usually, a sensor coordinate system that can be recognized in common by the visual sensors and the robot controller is established, so that the pixel value data detected by the visual sensors are first converted into data on the sensor coordinate system, and the resulting data are further converted into data on the work coordinate system on which the actual robot drive control is based.
  • The sensor coordinate system may also be made identical with the work coordinate system.
  • To make use of the object position information detected by the visual sensors, it is usually necessary for the detected information to be arranged for conversion into objective data. More specifically, information such as the pixel value data, which can be recognized directly only by the visual sensors, needs to be arranged so that it can be converted into information or data that can be represented on an objective coordinate system.
  • Hereinafter, this objective coordinate system will be referred to generally as the sensor coordinate system, without regard to the application.
  • It suffices that the sensor coordinate system be recognized by the visual sensor in some way, utilizing the CPU, image processor, memories, etc. included in the visual sensor, so that the correlation between the sensor coordinate system and the pixel values can be stored in the visual sensor.
  • Once this is done, data represented on the sensor coordinate system can be obtained for any object recognized by the visual sensor, in accordance with the stored correspondence relationship, unless the sensor coordinate system is changed.
  • Moreover, the sensor coordinate system set in the visual sensor can be changed by means of a conversion program, as long as the relationship between the new and old sensor coordinate systems is known.
  • Conventionally, a sensor coordinate system setting jig with three feature points on it is located in a predetermined position (e.g., a fixed position on a fixed plane in a work space); the three feature points, whose positions on the sensor coordinate system are known, are caught in the field of view of the camera of the visual sensor; the pixel value data for the individual feature points are stored; and, based on these data and the known position data, the visual sensor recognizes the origin and the directions and scales (actual size of one pixel on the plane of the object, or the reciprocal of the scale factor on the pixel plane) of the individual coordinate axes (normally the X- and Y-axes) of the sensor coordinate system (a computation of this kind is sketched below).
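  • The following sketch illustrates such a three-point computation (Python for illustration; the function name, the (row, column) pixel convention, and the arrangement of the three points as an origin point, an X-axis point and a Y-axis point are assumptions, since the text above only requires three points whose positions are known):

```python
import numpy as np

def sensor_frame_from_three_points(i_org, i_x, i_y, dist_x, dist_y):
    """Derive the origin, axis directions and scales of a sensor
    coordinate system from the stored pixel values of three feature
    points with known positions (a sketch, not the patent's code).

    i_org, i_x, i_y : (row, col) pixel values of the origin point, a
                      point on the X-axis and a point on the Y-axis
    dist_x, dist_y  : known distances of those points from the origin
    """
    i_org, i_x, i_y = (np.asarray(v, dtype=float) for v in (i_org, i_x, i_y))
    vx = i_x - i_org                       # image of the X-axis segment
    vy = i_y - i_org                       # image of the Y-axis segment
    x_dir = vx / np.linalg.norm(vx)        # X-axis direction on the pixel plane
    y_dir = vy / np.linalg.norm(vy)        # Y-axis direction on the pixel plane
    x_scale = dist_x / np.linalg.norm(vx)  # actual size of one pixel along X
    y_scale = dist_y / np.linalg.norm(vy)  # actual size of one pixel along Y
    return i_org, x_dir, y_dir, x_scale, y_scale
```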
  • Alternatively, the feature points are increased to four or more, e.g., seven, in number, in consideration of the fact that the object is recognized through the lens system of the camera of the visual sensor; these seven feature points are recognized by the visual sensor, and the sensor coordinate system is established by the same processing.
  • Tsai's camera model indicates that at least seven feature points are needed when the lens system is used.
  • In either case, all of the three or more feature points must be recognized by the visual sensor when catching the feature points on the jig in the field of view of the camera and executing the sensor coordinate system setting operation.
  • When the feature points are as few as three or thereabouts, it is relatively easy to catch all of them in the camera field of view and to coordinate the corresponding pixel data with the data of the individual feature points on the sensor coordinate system. Since the lens system has aberration, however, it is difficult to set the sensor coordinate system accurately, particularly in view of the image distortion that is liable to occur in the peripheral area of the field of view.
  • In some applications, moreover, the camera is located relatively close to the object, so that the actual size of the field of view is accordingly small.
  • The object of the present invention is to overcome the above drawbacks of the prior art and to provide a novel sensor coordinate system setting jig which can be used in common despite variations in the actual size of the field of view depending on the nature of the operation, and which allows a sensor coordinate system to be set by a relatively simple operation that does not require all of the feature points to be recognized.
  • The present invention is also intended to provide a sensor coordinate system setting method which utilizes the novel jig described above.
  • To attain the above objects, the present invention provides a visual sensor coordinate system setting jig which comprises a large number of feature points arranged at known intervals at the individual lattice point positions of a lattice corresponding to a visual sensor coordinate system, the feature points being capable of being recognized by a visual sensor. (The feature points are marks having a narrower extent than the lattice interval; the shape or pattern of the marks is optional and is not limited to the so-called dot pattern, and the same applies hereinafter. Likewise, the shape of each unit lattice is not limited to a square configuration; any unit having a regular array shape shall be called a lattice hereinafter.) Among this large number of feature points, a relatively small number, at least three, have an additional feature by which they can be discriminated from the remaining feature points by means of the visual sensor, and these at least three feature points represent the origin and coordinate axes of the visual sensor coordinate system, either by an array state capable of being identified by the visual sensor or by a feature of the feature points themselves. The present invention furthermore provides a sensor coordinate system setting method which uses this jig.
  • FIG. 1 is a conceptual diagram showing an arrangement used in setting a sensor coordinate system in a visual sensor by means of a jig according to the present invention.
  • FIG. 2(a) is a diagram showing a feature point array according to one embodiment of the jig of the present invention.
  • FIG. 2(b) is a diagram showing the state of an image obtained by shooting the array with a camera.
  • FIG. 3 is a diagram showing a feature point array according to another embodiment of the jig of the present invention.
  • FIG. 4 is a block diagram showing the principal part of an example of an image processing device used in carrying out a sensor coordinate system setting method using the jig of the present invention.
  • FIG. 5 shows the first half of a flow chart illustrating an example of a processing for carrying out the sensor coordinate system setting method of the present invention.
  • FIG. 6 shows the second half of the flow chart illustrating the example of the processing for carrying out the sensor coordinate system setting method of the present invention.
  • FIG. 1 shows the way a jig J, in the form of a thin plate located in a predetermined position on a work plane G (one fixed plane in the work space, in the case where a three-dimensional operation is assumed), is shot by a camera C connected to an image processing device 10, in order to establish a sensor coordinate system.
  • The optical axis of the camera C is inclined at an angle to the work plane G and, hence, to the surface of the jig, and the camera catches in its field of view a region which includes the central portion of the feature point array of the jig J.
  • For simplicity, FIG. 1 shows only some of the feature points.
  • The feature points of the jig J are arranged at known intervals d in the form of a lattice.
  • Three of them, P00, P20 and P01, are given an additional feature (here, large black dots, that is, black dots of larger area) by which they can be discriminated from the other feature points (small black dots or marks).
  • A straight line from the feature point P00 to P20 is taken as the X-axis, and a straight line from the feature point P00 to P01 as the Y-axis.
  • A coordinate system having this P00 as its origin is to be established as the sensor coordinate system in the visual sensor.
  • The coordinate values of a feature point Pmn are given by [md, nd].
  • The feature points P00, P20 and P01 having the additional feature are given a conspicuous feature which can easily be discriminated by the image processing device 10 of the visual sensor. Since these few additional feature points are located within a relatively narrow range around the origin, such as the central portion of the jig, it is easy to contain all of them (normally three, or three plus several, in number) securely in the camera field of view.
  • The jig, or the feature point array thereof, is designed relatively large, to provide for cases in which a relatively large area can be covered by the field of view.
  • In general, therefore, the camera field of view is unable to cover all the feature points, as in the case shown in FIG. 2.
  • For example, P55, P60, etc. are contained in the camera field of view, while P77, P-5-7, etc. are outside it. It is a remarkable feature of the present invention that the sensor coordinate system can be set without any hindrance at all even in such a situation.
  • A pixel value [a, b] for (the center of) Qij indicates the pixel in the a'th row and b'th column of the matrix formed by the pixel array.
  • The additional feature points P00, P20 and P01 can be discriminated from the other feature points simply by causing the visual sensor to extract those feature points which have the additional feature (a large area, in this case).
  • At this stage, the visual sensor can be considered to have fetched the minimum information necessary for determining the origin and the directions and scales of the X- and Y-axes of the sensor coordinate system.
  • The X- and Y-axes can definitely be discriminated by a feature of the array: P01-P20 distance > P00-P20 distance > P00-P01 distance.
  • The method of discriminating the additional feature points is not limited to the one which utilizes the aforesaid relationships of the distances between the points.
  • For example, the distinctions between Q00, Q20 and Q01 may be taught to the visual sensor so that the origin and the directions and scales of the X- and Y-axes of the sensor coordinate system can be determined from them, in accordance with the information that the three points Q00, Q20 and Q01 form a triangle and that the angles ∠Q20Q00Q01, ∠Q00Q01Q20 and ∠Q00Q20Q01 are approximately 90°, 60° and 30°, respectively, on condition that the camera is not inclined extremely. Alternatively, the condition ∠Q20Q00Q01 > ∠Q00Q01Q20 > ∠Q00Q20Q01 may be used. One way of discriminating the three points from their distance relationships alone is sketched below.
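  • The following sketch assigns the three detected large-dot images to P00, P20 and P01 from their pairwise distances alone, relying on the relationship P01-P20 > P00-P20 > P00-P01 quoted earlier; like the angle conditions, it presupposes that the camera is not inclined extremely (Python, with illustrative names):

```python
import itertools
import numpy as np

def assign_additional_points(r1, r2, r3):
    """Map the three large-dot images onto P00, P20 and P01 using only
    their pairwise distances on the pixel plane (a sketch)."""
    pts = [np.asarray(p, dtype=float) for p in (r1, r2, r3)]
    # The pair with the largest separation is (P20, P01), because the
    # P01-P20 distance exceeds the other two; the remaining point is P00.
    pairs = list(itertools.combinations(range(3), 2))
    a, b = max(pairs, key=lambda ij: np.linalg.norm(pts[ij[0]] - pts[ij[1]]))
    p00 = pts[({0, 1, 2} - {a, b}).pop()]
    # Of the two endpoints of the longest side, P20 lies farther from P00
    # (2d versus d on the jig surface).
    p20, p01 = sorted((pts[a], pts[b]),
                      key=lambda p: np.linalg.norm(p - p00), reverse=True)
    return p00, p20, p01
```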
  • Measured data of the pixel values of the Qmn corresponding to all the remaining lattice points have been acquired by the visual sensor, as long as recognition of the feature points has not failed; the only thing not yet determined is the correspondence between the individual measured data and the indices m and n. It is necessary, therefore, only that the measured data and m and n be coordinated by utilizing the aforesaid estimated values.
  • Abnormal measured values can be excluded by setting a suitable allowable value δ or by other means; alternatively, the estimated values can be settled by utilizing an interpolation method based on the measured values for other reliable adjacent lattice points.
  • In this way, m and n can be coordinated with respect to all the feature point images {Qij}; a sketch of this matching rule follows.
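  • The following sketch implements that rule (illustrative Python; the dictionary layout and the name delta for the allowable value are assumptions):

```python
import numpy as np

def coordinate_feature_points(measured, estimates, delta):
    """Coordinate measured pixel values with lattice indices (m, n) by
    nearest-estimate matching, excluding abnormal measured values whose
    distance to every estimated value exceeds delta (a sketch).

    measured  : iterable of measured pixel values I_q
    estimates : dict mapping (m, n) -> estimated pixel value I'_mn
    """
    settled = {}
    for i_q in measured:
        i_q = np.asarray(i_q, dtype=float)
        # find the estimated value closest to this measured value
        (m, n), dist = min(
            ((mn, np.linalg.norm(i_q - np.asarray(est, dtype=float)))
             for mn, est in estimates.items()),
            key=lambda t: t[1])
        if dist <= delta:          # otherwise treat as an abnormal value
            settled[(m, n)] = i_q  # settle I_mn := measured pixel value
    return settled
```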
  • The method for determining the individual pixel values Iij for all the feature point images {Qij} is not limited to the method described above.
  • For example, the pixel values of the images Q10, Q30, Q40, Q50, Q02, Q03, Q04, Q05, etc., lying on the images of the X- and Y-axes, may first be detected from the observed values and settled, after which the selection and settlement of the most approximate observed value, among the remaining observed values, for the nearest points Q are repeated in succession.
  • Once the sensor coordinate system is set in this manner, the pixel value data for (the center of) an image obtained when a specific point (e.g., the center of a workpiece) of an arbitrary object is observed can be converted into data on the sensor coordinate system by a suitable conventional computation, such as an interpolation based on the pixel value data for the individual lattice points; one simple choice is sketched below.
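  • The sketch below uses an inverse-distance interpolation over the k nearest registered lattice points (function and parameter names are illustrative; the text above only calls for a suitable conventional computation):

```python
import numpy as np

def pixel_to_sensor(pixel, lattice, d, k=4):
    """Convert a measured pixel value into sensor coordinates by
    interpolating over the registered lattice data (a sketch).

    lattice : dict mapping (m, n) -> registered pixel value I_mn
    d       : lattice interval; P_mn has sensor coordinates (m*d, n*d)
    """
    pixel = np.asarray(pixel, dtype=float)
    # k registered lattice points nearest to the pixel on the pixel plane
    nearest = sorted(lattice.items(),
                     key=lambda kv: np.linalg.norm(pixel - np.asarray(kv[1])))[:k]
    weights, coords = [], []
    for (m, n), i_mn in nearest:
        r = np.linalg.norm(pixel - np.asarray(i_mn, dtype=float))
        if r < 1e-9:                    # the pixel sits on a lattice image
            return np.array([m * d, n * d])
        weights.append(1.0 / r)         # inverse-distance weighting
        coords.append(np.array([m * d, n * d]))
    w = np.asarray(weights)
    return (w[:, None] * np.asarray(coords)).sum(axis=0) / w.sum()
```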
  • Furthermore, the respective positions of the vanishing points VX and VY on the pixel plane can be computed, so that the direction (direction cosines) of the optical axis of the camera can also be computed on the basis of the result.
  • This information on the direction of the optical axis can then be utilized; one way of computing a vanishing point is sketched below.
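  • The following sketch computes a vanishing point as the least-squares intersection of the image lines through the rows of lattice-point images (an assumption for illustration; the text above does not prescribe a particular computation). Applying it to the rows parallel to the X-axis gives VX, and to the columns parallel to the Y-axis gives VY.

```python
import numpy as np

def vanishing_point(rows_of_pixels):
    """Least-squares intersection of the lines fitted through several
    rows of lattice-point images on the pixel plane (a sketch).

    rows_of_pixels : list of arrays of (row, col) pixel values, one
                     array per lattice row (at least two rows)
    """
    normals, offsets = [], []
    for pts in rows_of_pixels:
        pts = np.asarray(pts, dtype=float)
        centroid = pts.mean(axis=0)
        # total-least-squares line fit: the singular vector of least
        # variance is the unit normal of the fitted line
        _, _, vt = np.linalg.svd(pts - centroid)
        normal = vt[-1]
        normals.append(normal)
        offsets.append(normal @ centroid)   # line: normal . x = offset
    # the point minimizing the squared distances to all fitted lines
    v, *_ = np.linalg.lstsq(np.asarray(normals), np.asarray(offsets),
                            rcond=None)
    return v
```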
  • FIG. 3 shows an example of this arrangement.
  • In this case, the sensor coordinate system is set on the condition that the point of intersection of a segment connecting a large white circle H-10 and a large black dot H10 with a segment connecting a large black dot H0-1 and a large black dot H01 represents the origin H00; a straight line directed from the large white circle H-10 to the large black dot H10 represents the X-axis; and a straight line directed from the large black dot H0-1 to the large black dot H01 represents the Y-axis. It will be understood without special explanation that the sensor coordinate system can be set in accordance with the same principle as described in connection with the case of FIG. 2.
  • Referring now to FIGS. 4, 5 and 6, an example of a procedure for setting coordinates by a sensor coordinate system setting method according to the present invention will be described.
  • The jig J of FIG. 2(a) is arranged in the manner shown in FIG. 1 and is shot by the camera of the visual sensor.
  • FIG. 4 is a block diagram illustrating the camera C, which constitutes the visual sensor used to carry out the present invention, and the principal part of the image processing device 10 connected thereto.
  • The image processing device 10 includes a central processing unit (hereinafter referred to as the CPU) 11.
  • The CPU 11 is connected, by means of a bus 21, with a camera interface 12, an image processor 13, a console interface 14, a communication interface 15, a TV monitor interface 16, a frame memory 17, a memory 18 for control software formed of a ROM, a program memory 19 formed of a RAM, and a data memory 20 formed of a nonvolatile RAM.
  • The camera interface 12 is connected with the camera C, which catches the jig J in its field of view in the manner shown in FIG. 1.
  • Such cameras C are connected as required, and are arranged so as to detect the position of an object of working (e.g., a workpiece to be machined) on the work plane G after the sensor coordinate system is set, and to transmit data for position correction to a robot which is connected to the communication interface 15.
  • Several cameras C, though only one is illustrated, may sometimes be connected simultaneously to the camera interface 12 for the operation.
  • In that case, a common sensor coordinate system is set for the individual cameras.
  • An image picked up by the camera C is converted into a gray-scale gradational image and loaded into the frame memory 17.
  • The image processor 13 serves to process the image data stored in the frame memory 17, recognize the images Qmn of the feature points Pmn of the jig J, and detect their positions (the corresponding pixel values; the central pixel values, if the images have an extent), as sketched below.
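  • A minimal sketch of this detection step, assuming dark marks on a lighter jig surface and using connected-component labelling (the threshold, the minimum area and the helper name are assumptions, not the patent's implementation):

```python
import numpy as np
from scipy import ndimage

def feature_point_centers(gray, threshold, min_area=1):
    """Detect feature-point images in a gray-scale frame and return
    their central pixel values together with their areas; the larger
    areas single out the additional feature points (a sketch).
    """
    mask = gray < threshold                # dark dots on a light background
    labels, count = ndimage.label(mask)    # connected components
    centers, areas = [], []
    for idx in range(1, count + 1):
        area = int((labels == idx).sum())
        if area >= min_area:               # ignore isolated noise pixels
            centers.append(ndimage.center_of_mass(mask, labels, idx))
            areas.append(area)
    return centers, areas
```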
  • The console interface 14 is connected with a console 23, which includes various command keys and an input unit for performing operations such as entry, editing, registration and execution of application programs, as well as an output unit.
  • The output unit can display, for example, menus and lists of the programs used to set various data.
  • The memory 18 for control software is loaded with the control program with which the CPU 11 controls the visual sensor system.
  • The program memory 19 is loaded with programs created by the user.
  • The communication interface 15 is connected with an automatic machine (a robot, in this case) or the like which utilizes the position information on the workpiece or the like detected by the visual sensor.
  • The TV monitor interface 16 is connected with a TV monitor 24 which can selectively display the image being shot by the camera C or the image stored in the frame memory 17.
  • Although the above-described arrangement is basically the same as that of a conventional visual sensor system, it differs from the conventional arrangement in that the memory 18 for control software and the data memory 20 are loaded with the programs and data (S0, d, etc.) necessary for executing processings such as those shown in the flow charts of FIGS. 5 and 6, in carrying out the present invention.
  • A preliminary operation, the first thing to do for setting the sensor coordinate system, is to set the cameras C to be used in the image processing device 10. This setting is achieved, for example, by assigning connector numbers in the camera interface 12 when coordinate systems are to be set for selected ones of the cameras connected to the camera interface.
  • The data for the jig J are stored in the data memory 20.
  • Particularly important input data are the distance d between the lattice points on which the feature points are arranged and the data for discriminating the additional feature.
  • In addition, codes for specifying the type of jig and the type of coordinate setting, and codes and data for specifying the algorithms necessary for recognizing the origin and coordinate axes and for settling the coordinate values corresponding to the individual lattice point images in accordance with the images of the additional feature points, are inputted or assigned; an illustrative layout of such jig data is sketched below.
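  • A possible layout of the registered jig data (every field name is illustrative; the text above only enumerates the kinds of data and codes stored in the data memory 20):

```python
from dataclasses import dataclass

@dataclass
class JigDescription:
    """Jig data registered before the coordinate setting operation
    (illustrative field names only)."""
    lattice_interval_d: float          # known interval d between lattice points
    jig_type_code: int                 # code specifying the type of jig
    coordinate_setting_code: int       # code specifying the type of coordinate setting
    origin_axis_algorithm_code: int    # algorithm for recognizing origin and axes
    settlement_algorithm_code: int     # algorithm for settling lattice coordinate values
    additional_feature: str = "large black dots"  # how the special points differ
```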
  • Described below, with reference to FIGS. 5 and 6, is an example of a processing operation of the image processing device 10, in which the sensor coordinate system setting is executed by using the jig J according to the present invention shown in FIG. 2, on the basis of the aforesaid preparations.
  • The coordinate setting operation is started when an operator gives a sensor coordinate setting start command from the console 23.
  • Codes for specifying the type of jig and the type of coordinate setting and codes for specifying the algorithms are assigned, and the specified codes and the necessary incidental data are loaded into the data memory 20 (START).
  • An image being shot by the camera C is fetched into the frame memory 17, the feature point images in the field of view are recognized by the image processing device 10 (Step S2), numbers 1, 2, . . . , N are assigned to the feature point images in proper order, and the images, along with the corresponding pixel values, are stored in the data memory 20 (Step S3).
  • If Qq is not a large black dot image, the program returns to Step S4, whereupon 1 is added again to the feature point number index q, and the program advances to Step S5.
  • Until all three large black dot images R1, R2 and R3, equal in number to the large black dots (additional feature points) of FIG. 2(a), are detected, the decision in Step S8, following Step S7, is NO; the program therefore returns to Step S4, and Steps S4 to S8 are repeated.
  • When all three large black dot images R1, R2 and R3 are detected, the decision in Step S8 is YES, whereupon the central pixel values I(R1), I(R2) and I(R3) of the large black dot images R1, R2 and R3 are computed and the result is loaded into the data memory 20 (Step S9). At this stage, the correspondence between I(R1), I(R2), I(R3) and P00, P20, P01 is not yet recognized.
  • Then, the distance D12 between I(R1) and I(R2) on the pixel plane, the distance D23 between I(R2) and I(R3), and the distance D31 between I(R3) and I(R1) are computed individually (Step S10).
  • On the basis of these distances, I(R1), I(R2) and I(R3) are made to correspond to P00, P20 and P01 on a one-to-one basis, and the individual pixel values are settled as I00, I20 and I01 (Step S11).
  • Next, the feature point image number index q is restored to 1 (Step S13), and a processing is started for determining to which estimated pixel value I'mn each of the individual feature point images corresponds.
  • It is first determined whether or not Qq is any of R1 to R3 (Step S14).
  • If it is, its pixel value has already been settled in Step S11 and is therefore not an object of coordination with the estimated pixel values; the program advances to Step S17, whereupon 1 is added to the feature point number index q.
  • It is then checked whether or not the processing of all the N feature points is finished (Step S18); unless it is finished, the program returns to Step S14.
  • If the decision in Step S14 is NO, there should be only one estimated value I'mn which approximates the measured pixel value Iq of Qq (the central pixel value, if the image covers a plurality of pixels). Then, according to equation (3) mentioned in connection with the description of the function, the distances δq-10, δq-30, δq-40, . . . , δq-77, . . . , δq-5-7 between Iq and the individual estimated values I'mn are computed, the pair (m, n) which gives the minimum distance δq-mn is obtained (Step S15), and the corresponding pixel value Iq is settled as Imn (Step S16).
  • Thereafter, the position or region on the plane on which the jig J is placed that corresponds to the image recognized by any given pixel can always be computed by reverse calculation based on the registered pixel data of the individual lattice points.
  • Thus, image information detected by the visual sensor can be converted into data on the sensor coordinate system as an objective coordinate system and utilized for robot control and the like.
  • According to the present invention, as described above, the jig is provided with a few additional feature points which can be discriminated from the other feature points and which represent the origin and the directions and scales of the coordinate axes. Therefore, even though the actual size of the field of view varies depending on the nature of the operation, fundamental data for the sensor coordinate system can be obtained easily and surely. As long as these few additional feature points can be recognized, the operator can set the sensor coordinate system by a simple operation, without paying attention to whether the other feature points are in the field of view or whether they are recognized successfully.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Numerical Control (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
US08/284,442 1992-12-03 1993-12-01 Visual sensor coordinate system setting jig and setting method Expired - Lifetime US5570190A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP4-350296 1992-12-03
JP35029692A JP3394278B2 (ja) 1992-12-03 1992-12-03 Visual sensor coordinate system setting jig and setting method
PCT/JP1993/001750 WO1994012915A1 (en) 1992-12-03 1993-12-01 Jig for setting coordinate system of visual sensor and setting method

Publications (1)

Publication Number Publication Date
US5570190A 1996-10-29

Family

ID=18409540

Family Applications (1)

Application Number Title Priority Date Filing Date
US08/284,442 Expired - Lifetime US5570190A (en) 1992-12-03 1993-12-01 Visual sensor coordinate system setting jig and setting method

Country Status (3)

Country Link
US (1) US5570190A (ja)
JP (1) JP3394278B2 (ja)
WO (1) WO1994012915A1 (ja)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4642889B2 (ja) * 2008-09-22 2011-03-02 Topcon Corp Three-dimensional field for calibration and method of photographing the three-dimensional field for calibration
JP2018001333A (ja) * 2016-06-30 2018-01-11 Seiko Epson Corp Calibration board, robot, and detection method
JP6356845B1 (ja) * 2017-02-13 2018-07-11 Fanuc Corp Device and method for generating an operation program for an inspection system
CN109434839A (zh) * 2018-12-25 2019-03-08 Jiangnan University Robot self-calibration method based on monocular-vision-assisted positioning
JP7340069B1 (ja) * 2022-06-09 2023-09-06 Daihen Corp Marker position registration program, marker position registration device, marker position registration method, and marker used in the method


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2672509B2 (ja) * 1987-06-13 1997-11-05 Omron Corp Automatic calibration method and apparatus for a camera model
JP2686351B2 (ja) * 1990-07-19 1997-12-08 Fanuc Corp Calibration method for a visual sensor
JPH04181106A (ja) * 1990-11-15 1992-06-29 Komatsu Ltd Calibration device for a position and dimension measuring apparatus

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6295403A (ja) * 1985-10-22 1987-05-01 Nec Corp Coordinate system calibration device
JPH03287343A (ja) * 1990-03-30 1991-12-18 Toyota Motor Corp Machine coordinate system correction device
US5329469A (en) * 1990-05-30 1994-07-12 Fanuc Ltd. Calibration method for a visual sensor
JPH0481903A (ja) * 1990-07-25 1992-03-16 Fanuc Ltd Method for defining the coordinate system of a robot

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6236896B1 (en) * 1994-05-19 2001-05-22 Fanuc Ltd. Coordinate system setting method using visual sensor
US6128585A (en) * 1996-02-06 2000-10-03 Perceptron, Inc. Method and apparatus for calibrating a noncontact gauging sensor with respect to an external coordinate system
US6285959B1 (en) 1996-02-06 2001-09-04 Perceptron, Inc. Method and apparatus for calibrating a non-contact gauging sensor with respect to an external coordinate system
EP0843153A1 (fr) * 1996-11-15 1998-05-20 AEROSPATIALE Société Nationale Industrielle System for measuring the characteristics of an object
FR2756042A1 (fr) * 1996-11-15 1998-05-22 Aerospatiale System for measuring the characteristics of an object
US5854679A (en) * 1996-11-15 1998-12-29 Aerospatiale Societe Nationale Industrielle Object characteristics measurement system
US7196721B2 (en) 2002-02-14 2007-03-27 Canon Kabushiki Kaisha Information processing method and apparatus, and recording medium
US20030151665A1 (en) * 2002-02-14 2003-08-14 Canon Kabushiki Kaisha Information processing method and apparatus, and recording medium
US20040170315A1 (en) * 2002-12-27 2004-09-02 Olympus Corporation Calibration apparatus, calibration method, program for calibration, and calibration jig
US7894661B2 (en) * 2002-12-27 2011-02-22 Olympus Corporation Calibration apparatus, calibration method, program for calibration, and calibration jig
US7113878B1 (en) 2005-05-18 2006-09-26 Perceptron, Inc. Target for calibrating a non-contact sensor
US20060271332A1 (en) * 2005-05-18 2006-11-30 Perceptron, Inc. Method for calibrating a non-contact sensor using a robot
US20070184957A1 (en) * 2006-02-03 2007-08-09 Mobert Srl Device and method for automatic control of plastic bag handles
US20080062266A1 (en) * 2006-09-12 2008-03-13 Mediatek Inc. Image test board
US20090309839A1 (en) * 2008-06-13 2009-12-17 Polyvision Corporation Eraser assemblies and methods of manufacturing same
US8243028B2 (en) * 2008-06-13 2012-08-14 Polyvision Corporation Eraser assemblies and methods of manufacturing same
US20100141776A1 (en) * 2008-12-10 2010-06-10 Fanuc Ltd Calibrating device for calibration and image measurement system comprising calibrating device
US8350913B2 (en) 2008-12-10 2013-01-08 Fanuc Ltd Calibrating device for calibration and image measurement system comprising calibrating device
CN102485441B (zh) * 2010-12-03 2014-07-02 Industrial Technology Research Institute Robot arm positioning method and calibration method
CN102485441A (zh) * 2010-12-03 2012-06-06 Industrial Technology Research Institute Robot arm positioning method and calibration method
US8688274B2 (en) 2010-12-03 2014-04-01 Industrial Technology Research Institute Robot positioning method and calibration method
JP2012183606A (ja) * 2011-03-04 2012-09-27 Seiko Epson Corp Robot position detection device and robot system
CN103884271B (zh) * 2012-12-20 2016-08-17 Shenyang Institute of Automation, Chinese Academy of Sciences Direct calibration method for a line-structured-light vision sensor
CN103884271A (zh) * 2012-12-20 2014-06-25 Shenyang Institute of Automation, Chinese Academy of Sciences Direct calibration method for a line-structured-light vision sensor
WO2014143614A1 (en) 2013-03-11 2014-09-18 Jan Remmereit Lipid compositions containing bioactive fatty acids
WO2014140934A2 (en) 2013-03-11 2014-09-18 Life Science Nutrition As Natural lipids containing non-oxidizable fatty acids
US20140288710A1 (en) * 2013-03-19 2014-09-25 Kabushiki Kaisha Yaskawa Denki Robot system and calibration method
US20180236661A1 (en) * 2014-07-01 2018-08-23 Seiko Epson Corporation Teaching Apparatus And Robot System
US9981380B2 (en) * 2014-07-01 2018-05-29 Seiko Epson Corporation Teaching apparatus and robot system
US20160001445A1 (en) * 2014-07-01 2016-01-07 Seiko Epson Corporation Teaching apparatus and robot system
DE102014213518A1 (de) * 2014-07-11 2016-01-14 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Method, processing machine and computer program product for the image-based placement of workpiece machining operations
US11014233B2 (en) * 2015-10-22 2021-05-25 Canon Kabushiki Kaisha Teaching point correcting method, program, recording medium, robot apparatus, imaging point creating method, and imaging point creating apparatus
US20220168902A1 (en) * 2019-03-25 2022-06-02 Abb Schweiz Ag Method And Control Arrangement For Determining A Relation Between A Robot Coordinate System And A Movable Apparatus Coordinate System
US12036663B2 (en) * 2019-03-25 2024-07-16 Abb Schweiz Ag Method and control arrangement for determining a relation between a robot coordinate system and a movable apparatus coordinate system
US20200338678A1 (en) * 2019-04-26 2020-10-29 Fanuc Corporation Machining system and machining method
US11602817B2 (en) * 2019-04-26 2023-03-14 Fanuc Corporation Machining system and machining method
DE102019126403A1 (de) * 2019-09-30 2021-04-01 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Method for loading a sheet storage device of a flatbed machine tool, and flatbed machine tool
DE102019126403B4 (de) 2019-09-30 2023-03-23 Trumpf Werkzeugmaschinen Gmbh + Co. Kg Method for loading a sheet storage device of a flatbed machine tool, and flatbed machine tool

Also Published As

Publication number Publication date
JP3394278B2 (ja) 2003-04-07
WO1994012915A1 (en) 1994-06-09
JPH06175715A (ja) 1994-06-24

Similar Documents

Publication Publication Date Title
US5570190A (en) Visual sensor coordinate system setting jig and setting method
US6114824A (en) Calibration method for a visual sensor
US7177459B1 (en) Robot system having image processing function
US6728417B1 (en) Measurement apparatus
EP2045772B1 (en) Apparatus for picking up objects
US5329469A (en) Calibration method for a visual sensor
JP3242108B2 (ja) ターゲットマークの認識・追跡システム及び方法
US5396331A (en) Method for executing three-dimensional measurement utilizing correctively computing the absolute positions of CCD cameras when image data vary
US20090070077A1 (en) Three-dimensional model data generating method, and three dimensional model data generating apparatus
US5471312A (en) Automatic calibration method
CN112308916B (zh) 一种基于图像靶标的目标位姿识别方法
US6768813B1 (en) Photogrammetric image processing apparatus and method
CN111611989B (zh) 一种基于自主机器人的多目标精准定位识别方法
US20030044047A1 (en) System and method for object localization
US20050273199A1 (en) Robot system
EP0157299B1 (en) Image processing apparatus
JP3138080B2 (ja) 視覚センサの自動キャリブレーション装置
CN115176274A (zh) 一种异源图像配准方法及系统
JPH0798208A (ja) 視覚に基く三次元位置および姿勢の認識方法ならびに視覚に基く三次元位置および姿勢の認識装置
CN111914857B (zh) 板材余料的排样方法、装置、系统、电子设备及存储介质
JP2690103B2 (ja) 指紋中心検出装置
JPH07121713A (ja) パターン認識方法
JPH06214622A (ja) ワーク位置検知装置
KR102452430B1 (ko) 비전 기반 용접 대상물 인식 장치 및 방법
WO2021157528A1 (ja) 画像処理装置

Legal Events

Date Code Title Description
AS Assignment

Owner name: FANUC LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TARAWAKI, FUMIKAZU;WARASHINA, FUMIKAZU;REEL/FRAME:007135/0964

Effective date: 19940722

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12