WO2020049973A1 - Processing device, processing device control method, and processing device control program - Google Patents

Processing device, processing device control method, and processing device control program

Info

Publication number
WO2020049973A1
WO2020049973A1 (PCT/JP2019/031959)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
processing apparatus
cameras
processing device
processing
Prior art date
Application number
PCT/JP2019/031959
Other languages
English (en)
Japanese (ja)
Inventor
山田 智明
静雄 西川
大樹 中尾
Original Assignee
Dmg森精機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dmg森精機株式会社 filed Critical Dmg森精機株式会社
Publication of WO2020049973A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23QDETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q17/00Arrangements for observing, indicating or measuring on machine tools
    • B23Q17/20Arrangements for observing, indicating or measuring on machine tools for indicating or measuring workpiece characteristics, e.g. contour, dimension, hardness
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23QDETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q17/00Arrangements for observing, indicating or measuring on machine tools
    • B23Q17/24Arrangements for observing, indicating or measuring on machine tools using optics or electromagnetic waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/245Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/401Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by control arrangements for measuring, e.g. calibration and initialisation, measuring workpiece for machining purposes

Definitions

  • the present invention relates to a processing device, a control method for the processing device, and a control program for the processing device.
  • Patent Literature 1 discloses a technique in which a camera is arranged inside a machine tool, and the inside of the machine is photographed and displayed using the camera.
  • An object of the present invention is to provide a technique for solving the above-mentioned problem.
  • In one aspect, a processing apparatus includes: at least two cameras, including at least one moving camera; and measuring means for measuring the three-dimensional shape of an imaging target using images acquired by the at least two cameras.
  • In another aspect, a method for controlling a processing apparatus provided with at least two cameras, including at least one moving camera, includes: an obtaining step of obtaining images captured by the at least two cameras; and a measurement step of measuring the three-dimensional shape of an imaging target using the acquired images.
  • In another aspect, a control program for a processing apparatus provided with at least two cameras, including at least one moving camera, causes a computer to execute: an obtaining step of obtaining images captured by the at least two cameras; and a measurement step of measuring the three-dimensional shape of an imaging target using the acquired images.
  • the processing device 100 is a device that measures a three-dimensional shape from a camera image.
  • the processing apparatus 100 includes a moving camera 101, a camera 102, and a measuring unit 103. Further, the processing apparatus 100 has at least two cameras 101 and 102 including at least one moving camera 101. The measurement unit 103 measures the three-dimensional shape of the imaging target 110 using the images acquired by at least two cameras 101 and 102.
  • FIG. 2A is a diagram illustrating an outline of an example of a configuration of a processing apparatus according to the present embodiment.
  • the processing apparatus 200 performs processing on the imaging target 210 placed on the rotary stage 230 (processing table).
  • the processing apparatus 200 has at least two cameras arranged in the apparatus.
  • the processing apparatus 200 has a moving camera 201 and a fixed camera 202.
  • the moving camera 201 is attached to the tool spindle 220.
  • A tool holder is attached to the tool spindle 220, and the moving camera 201 is attached to the tool spindle 220 via, for example, such a holder.
  • The moving camera 201 moves in accordance with the movement of the tool spindle 220 or the holder; that is, the moving camera 201 is a movable camera.
  • The moving camera 201 may be mounted so that it does not rotate in accordance with the rotation of the spindle.
  • the fixed camera 202 is attached to the ceiling of the processing device 200.
  • Since the fixed camera 202 is attached to the ceiling of the processing apparatus 200 and does not move (its position does not change), it is a fixed camera.
  • The processing apparatus 200 captures images with the moving camera 201 and the fixed camera 202 and superimposes the obtained images to obtain a three-dimensional image, as with a so-called stereo camera. Using the obtained three-dimensional image, the processing apparatus 200 measures the three-dimensional shape of the imaging target 210 (a workpiece or other processing target), the depth of a hole machined in the imaging target, the size of a step, and the like. Further, by measuring the three-dimensional shape, the processing apparatus 200 can avoid collisions between a tool and the imaging target 210, between components of the processing apparatus 200 and the imaging target 210, and the like.
  • One of the two cameras is attached to the tool spindle 220 to be the movable camera 201 and the other is fixed inside the processing apparatus 200 to be the fixed camera 202, so that a wider range of images can be obtained.
  • the accuracy of the three-dimensional shape of the imaging target 210 measured using the images acquired by the moving camera 201 and the fixed camera 202 increases as the base angle increases.
  • the base angle is the angle of a triangle formed by the mobile camera 201, the fixed camera 202, and the photographing target 210.
  • The accuracy of the three-dimensional shape increases as the base angle increases; on the other hand, the feature points must remain visible to both cameras (the moving camera 201 and the fixed camera 202), which limits how large the base angle can be made.
  • Since moving the moving camera 201 in the Z direction changes what falls within the camera's field of view (FOV), the Z position only needs to be set so that the imaging target 210 fits within the field of view.
  • The position of the moving camera 201 in the X direction may be any position at which the imaging target 210 comes to the center of the field of view.
  • The position of the moving camera 201 in the Y direction only needs to be a position at which the base angle is maximized and the surface of the imaging target 210 can be observed.
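The trade-off described above can be illustrated with the standard stereo depth-error relation dZ ≈ Z^2 · dd / (f · B): for a target at a fixed distance, a wider baseline (hence a larger base angle) gives a proportionally smaller depth uncertainty. This relation is textbook stereo geometry, not taken from the patent, and every number below is hypothetical.

```python
import math

def depth_uncertainty(z, baseline, focal_px, pixel_err=0.5):
    """Depth error of a stereo pair: grows with Z^2, shrinks with baseline."""
    return (z ** 2) * pixel_err / (focal_px * baseline)

# A wider baseline (a larger base angle for a target at fixed distance)
# gives a proportionally smaller depth uncertainty.
for baseline_mm in (50, 100, 200, 400):
    err = depth_uncertainty(z=500.0, baseline=baseline_mm, focal_px=1500.0)
    print(f"baseline {baseline_mm:4d} mm -> depth uncertainty {err:.3f} mm")
```

The same geometry explains the feature-visibility limit: past some baseline the two views no longer share the surface patch, so accuracy cannot be increased indefinitely.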
  • The photographing attitude and the installation positions of the moving camera 201 and the fixed camera 202 can be determined specifically, for example, by the following procedure.
  • (1) An attention area is set from a model of the imaging target 210 (workpiece).
  • (2) The X coordinate of the moving camera 201 is determined so that the center of the attention area coincides with the center of the field of view.
  • (3) The Z position is determined from the size of the attention area and the angle of view of the moving camera 201.
  • (4) The distribution of the Y component of the normal vector of each facet of the imaging target 210 is obtained.
  • (5) The shooting direction of the moving camera 201 is determined so that the angle between the vector directed from the moving camera 201 to the imaging target 210 and the normal vector of (4) is equal to or less than a predetermined value (for example, 70°).
  • (6) The Y coordinate is obtained from the angle obtained in (5) and the Z coordinate obtained in (3).
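As a rough illustration of the placement steps above (center the attention area, stand off along Z so the area fits the angle of view, then tilt within the predetermined limit to obtain the Y coordinate), one might compute the following. The function name, argument layout, and all numbers are assumptions for illustration only, not the patent's method.

```python
import math

def plan_moving_camera(area_center, area_size, view_angle_deg,
                       tilt_deg, max_tilt_deg=70.0):
    """Sketch: place the moving camera for a given attention area.

    area_center    -- (x, y, z) centre of the attention area on the workpiece
    area_size      -- extent of the attention area (same length unit)
    view_angle_deg -- angle of view of the moving camera
    tilt_deg       -- desired angle between viewing vector and surface normal
    """
    ax, ay, az = area_center
    # X coordinate: put the centre of the attention area on the optical axis.
    cam_x = ax
    # Z position: stand off far enough that the area fits the angle of view.
    standoff = (area_size / 2.0) / math.tan(math.radians(view_angle_deg / 2.0))
    cam_z = az + standoff
    # Keep the viewing angle within the predetermined limit (e.g. 70 degrees).
    tilt = min(tilt_deg, max_tilt_deg)
    # Y coordinate follows from the tilt angle and the Z stand-off.
    cam_y = ay + standoff * math.tan(math.radians(tilt))
    return cam_x, cam_y, cam_z

print(plan_moving_camera((0.0, 0.0, 0.0), 100.0, 40.0, 30.0))
```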
  • FIG. 2B is a diagram showing an outline of another example of the configuration of the processing apparatus according to the present embodiment.
  • the processing apparatus 240 has at least two cameras disposed in the apparatus as in the processing apparatus 200.
  • the rotary stage 230 slides and moves in the Y-axis direction (vertical direction).
  • Two cameras (moving cameras 201) are attached to the rotary stage 230.
  • The disk-shaped pallet 250, on which an imaging target 210 such as a workpiece has been set in advance, is conveyed onto the rotary stage 230 by a transfer robot or the like (not shown).
  • Since the images obtained by the two cameras (moving cameras 201) allow the imaging target 210 and other objects inside the processing apparatus 240 to be captured as a three-dimensional shape, the transfer robot can be controlled in a closed loop. Further, since the processing apparatus 240 measures the three-dimensional shape, it can determine whether the NC (Numerical Control) program is appropriate, or select a program suited to the imaging target 210, while the imaging target 210 (a workpiece or other processing target) is being transferred.
  • Since the processing apparatus 240 can measure the three-dimensional shape of what is attached to the tool spindle 220, it can check itself after a collision and check for chips wound around the tool. If the processing apparatus 240 includes an acceleration sensor or the like and can detect an abnormality such as a collision, the abnormal state can easily be verified by storing the spindle itself as a three-dimensional shape after the abnormality is detected.
  • The number of moving cameras 201 disposed inside the processing apparatuses 200 and 240 may be one or more, and all of the plurality of cameras inside the processing apparatus may be moving cameras 201. When only one moving camera is provided, at least one fixed camera 202 must be disposed inside the processing apparatuses 200 and 240; a plurality of fixed cameras 202 may also be provided.
  • FIG. 3 is a block diagram showing the configuration of the processing apparatus according to the present embodiment.
  • the processing apparatus 200 includes a moving camera 201, a fixed camera 202, a measuring unit 301, and a detecting unit 302.
  • The moving camera 201 is movable within the processing apparatus 200 and photographs its interior to acquire images. The moving camera 201 may photograph while moving to acquire images, or it may stop and capture still images.
  • the fixed camera 202 is a camera attached to a predetermined position inside the processing device 200. That is, the fixed camera 202 is a camera that does not move. The fixed camera 202 photographs the inside of the processing apparatus 200 and acquires an image.
  • What the moving camera 201 and the fixed camera 202 photograph are objects that exist inside the processing apparatus 200.
  • the object 210 to be photographed by the mobile camera 201 and the fixed camera 202 is a processing object mounted on a rotary stage 230 (processing table).
  • The imaging target 210 may be, for example, a workpiece to be machined by the processing apparatus 200, the processing table on which the workpiece is placed, the tool spindle, a tool holder, a chuck, a cutting tool, or the like. It is not limited to these, as long as it exists inside the processing apparatus 200.
  • the fixed camera 202 may be photographed by the moving camera 201 or the moving camera 201 may be photographed by the fixed camera 202.
  • the measuring unit 301 measures the three-dimensional shape of the photographing target 210 by using the images captured by the moving camera 201 and the fixed camera 202.
  • The measurement unit 301 can obtain information in the depth direction of the imaging target 210 by using images obtained by photographing the imaging target 210 simultaneously from a plurality of different directions, that is, the images acquired by the moving camera 201 and the fixed camera 202. The measurement unit 301 can thereby measure the three-dimensional shape of the imaging target 210.
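The two-view measurement described above can be sketched with standard linear (DLT) triangulation: a point seen in both images is intersected back into 3D space. The projection matrices and the sample point below are synthetic, not values from the apparatus.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen by two cameras."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean

# Fixed camera at the origin; moving camera translated 200 mm along X.
K = np.diag([1000.0, 1000.0, 1.0])
P_fixed = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_moving = K @ np.hstack([np.eye(3), np.array([[-200.0], [0.0], [0.0]])])

X_true = np.array([50.0, 30.0, 800.0, 1.0])  # a point on the workpiece (mm)
u1 = P_fixed @ X_true;  u1 = u1[:2] / u1[2]
u2 = P_moving @ X_true; u2 = u2[:2] / u2[2]
print(triangulate(P_fixed, P_moving, u1, u2))  # ≈ [50, 30, 800]
```

Applying this to every matched feature yields the depth information from which the three-dimensional shape is built.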
  • the detection unit 302 detects the position of the mobile camera 201 using the fixed camera 202, and executes calibration.
  • the measurement unit 301 acquires an image of the mobile camera 201 captured by the fixed camera 202.
  • The detection unit 302 obtains parameters such as the internal parameters, external parameters, and distortion coefficients of the moving camera 201 and the fixed camera 202, and executes calibration. Once these parameters have been obtained by executing the calibration, corresponding-point search, rectification, and the like can be performed efficiently under the epipolar geometry constraint.
  • the detection unit 302 performs calibration at a timing such as when the processing apparatus 200 is started.
  • The timing at which the detection unit 302 executes calibration is not limited to this; it may be during processing, after processing is completed, or the like. Alternatively, the three-dimensional coordinates may be calculated directly by the eight-point algorithm, without executing calibration.
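The eight-point algorithm mentioned here can be sketched as follows: the fundamental matrix F is estimated from at least eight point correspondences, after which the epipolar constraint x2ᵀ F x1 = 0 holds for matched points. This is a standard normalized eight-point implementation on synthetic cameras and points, not code from the patent.

```python
import numpy as np

def eight_point(x1, x2):
    """Normalized eight-point estimate of the fundamental matrix F
    from N >= 8 correspondences given as (N, 2) arrays."""
    def normalize(pts):
        c = pts.mean(axis=0)
        s = np.sqrt(2.0) / np.mean(np.linalg.norm(pts - c, axis=1))
        T = np.array([[s, 0, -s * c[0]], [0, s, -s * c[1]], [0, 0, 1]])
        h = np.column_stack([pts, np.ones(len(pts))])
        return (T @ h.T).T, T

    p1, T1 = normalize(x1)
    p2, T2 = normalize(x2)
    # Each correspondence contributes one row of the system A f = 0.
    A = np.column_stack([
        p2[:, 0] * p1[:, 0], p2[:, 0] * p1[:, 1], p2[:, 0],
        p2[:, 1] * p1[:, 0], p2[:, 1] * p1[:, 1], p2[:, 1],
        p1[:, 0], p1[:, 1], np.ones(len(p1)),
    ])
    _, _, vt = np.linalg.svd(A)
    F = vt[-1].reshape(3, 3)
    u, s, v = np.linalg.svd(F)          # enforce rank 2
    F = u @ np.diag([s[0], s[1], 0.0]) @ v
    return T2.T @ F @ T1                # undo the normalization

# Synthetic demo: identity intrinsics, second camera translated in X/Y.
rng = np.random.default_rng(0)
Xw = np.column_stack([rng.uniform(-100.0, 100.0, (12, 2)),
                      rng.uniform(400.0, 800.0, 12)])
t = np.array([-100.0, 20.0, 0.0])
x1 = Xw[:, :2] / Xw[:, 2:3]              # first camera  [I | 0]
x2 = (Xw + t)[:, :2] / (Xw + t)[:, 2:3]  # second camera [I | t]
F = eight_point(x1, x2)
h1 = np.column_stack([x1, np.ones(12)])
h2 = np.column_stack([x2, np.ones(12)])
residual = np.abs(np.einsum("ij,jk,ik->i", h2, F, h1)).max()
print(residual)  # epipolar constraint satisfied up to rounding error
```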
  • the measuring unit 301 measures at least one of the depth of the hole formed in the imaging target 210 and the size of the step formed in the imaging target 210 as the three-dimensional shape of the imaging target 210.
  • Based on the measured three-dimensional shape, the processing apparatus 200 can change or control the processing conditions and the like.
  • the three-dimensional shape measured by the measurement unit 301 is not limited to the depth of the hole or the size of the step, and may be, for example, the diameter of the hole, the depth and the length of the groove, and the like.
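For instance, once a three-dimensional point cloud of the target has been measured, hole depth and step size reduce to differences between surface heights. The data, the noise level, and the use of the median are all fabricated for illustration.

```python
import numpy as np

def hole_depth(z_surface, z_hole_bottom):
    """Depth of a machined hole: top-surface height minus bottom height."""
    return float(np.median(z_surface) - np.median(z_hole_bottom))

def step_size(z_upper_face, z_lower_face):
    """Size of a machined step: difference between the two face heights."""
    return float(np.median(z_upper_face) - np.median(z_lower_face))

# Synthetic measurements with a little sensor noise (mm).
rng = np.random.default_rng(1)
surface = 120.0 + rng.normal(0.0, 0.02, 500)  # Z of points on the top face
bottom  = 108.0 + rng.normal(0.0, 0.02, 500)  # Z of points at the hole bottom
lower   = 115.0 + rng.normal(0.0, 0.02, 500)  # Z of points on a lower face
print(f"hole depth {hole_depth(surface, bottom):.2f} mm")
print(f"step size  {step_size(surface, lower):.2f} mm")
```

Using the median makes the estimate robust against a few outlier points from the stereo matching.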
  • FIG. 4 is a diagram illustrating an example of a camera table included in the processing apparatus according to the present embodiment.
  • The camera table 401 stores a moving/fixed flag 412, a position 413, a specification 414, and a captured image 415 in association with a camera ID (identifier) 411.
  • the camera ID 411 is an identifier for identifying a camera.
  • The moving/fixed flag 412 indicates whether the camera is a moving camera or a fixed camera.
  • the position 413 indicates the position of the camera (the mobile camera 201 and the fixed camera 202).
  • the specification 414 indicates the performance of the camera, and includes, but is not limited to, an angle of view, an F value, a focal length, the number of pixels, a magnification, and the like.
  • the captured image 415 is an image captured by a camera (the moving camera 201 and the fixed camera 202).
  • the processing apparatus 200 refers to, for example, the camera table 401 and measures the three-dimensional shape of the imaging target.
  • the processing apparatus 200 detects the position of the mobile camera 201 with reference to the camera table 401.
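A minimal in-memory sketch of the camera table 401 might look like the following. The field names mirror FIG. 4 (moving/fixed 412, position 413, specification 414, captured image 415), while the Python structure and the sample values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class CameraRecord:
    camera_id: str                # camera ID 411
    is_moving: bool               # moving/fixed 412
    position: tuple               # position 413: (x, y, z) inside the machine
    spec: dict                    # specification 414: angle of view, F-number, ...
    captured_image: bytes = b""   # captured image 415

class CameraTable:
    def __init__(self):
        self._rows = {}

    def register(self, rec):
        self._rows[rec.camera_id] = rec

    def moving_cameras(self):
        return [r for r in self._rows.values() if r.is_moving]

    def position_of(self, camera_id):
        return self._rows[camera_id].position

table = CameraTable()
table.register(CameraRecord("CAM001", True, (10.0, 0.0, 250.0),
                            {"view_angle_deg": 40}))
table.register(CameraRecord("CAM002", False, (0.0, 500.0, 0.0),
                            {"view_angle_deg": 60}))
print([r.camera_id for r in table.moving_cameras()])  # ['CAM001']
```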
  • FIG. 5 is a block diagram showing a hardware configuration of the processing apparatus according to the present embodiment.
  • a CPU (Central Processing Unit) 510 is a processor for arithmetic control, and realizes the functional components of the processing apparatus 200 in FIG. 3 by executing a program.
  • The CPU 510 may include a plurality of processors and may execute different programs, modules, tasks, threads, and the like in parallel.
  • a ROM (Read Only Memory) 520 stores fixed data such as initial data and programs, and other programs.
  • the network interface 530 communicates with other devices via a network. Note that the number of CPUs 510 is not limited to one, and a plurality of CPUs or a GPU (Graphics Processing Unit) for image processing may be included.
  • the network interface 530 has a CPU independent of the CPU 510 and writes or reads transmission / reception data in an area of a random access memory (RAM) 540. It is desirable to provide a DMAC (Direct Memory Access Controller) for transferring data between the RAM 540 and the storage 550 (not shown). Further, CPU 510 recognizes that the data has been received or transferred to RAM 540 and processes the data. Further, the CPU 510 prepares the processing result in the RAM 540, and leaves the subsequent transmission or transfer to the network interface 530 or the DMAC.
  • the RAM 540 is a random access memory used by the CPU 510 as a work area for temporary storage. In the RAM 540, an area for storing data necessary for realizing the present embodiment is secured.
  • the position 541 is data of the position of the mobile camera 201 and the fixed camera 202 inside the processing device 200.
  • the specification 542 is data relating to the performance of the mobile camera 201 and the fixed camera 202 and the like.
  • the captured image 543 is data of an image captured by the mobile camera 201 and the fixed camera 202.
  • The parameters 544 are the internal parameters, external parameters, distortion coefficients, and the like of the moving camera 201 and the fixed camera 202, obtained by calibration.
  • the transmission / reception data 545 is data transmitted / received via the network interface 530. Further, the RAM 540 has an application execution area 546 for executing various application modules.
  • the storage 550 stores a database, various parameters, or the following data or programs necessary for realizing the present embodiment.
  • the storage 550 stores the camera table 401.
  • the camera table 401 is a table for managing the relationship between the camera ID 411 and the captured image 415 shown in FIG.
  • the storage 550 further stores a measurement module 551 and a detection module 552.
  • the measurement module 551 is a module that measures the three-dimensional shape of the photographing target 210 using images acquired by photographing with the moving camera 201 and the fixed camera 202.
  • the measurement module 551 is a module that measures, as a three-dimensional shape, at least one of the depth of a hole formed in the imaging target 210 and the size of a step formed in the imaging target 210.
  • the detection module 552 is a module that detects the position of the mobile camera 201 using the fixed camera 202. These modules 551 to 552 are read into the application execution area 546 of the RAM 540 by the CPU 510 and executed.
  • the control program 553 is a program for controlling the entire processing apparatus 200.
  • the input / output interface 560 interfaces input / output data with input / output devices.
  • the display unit 561 and the operation unit 562 are connected to the input / output interface 560.
  • a storage medium 564 may be connected to the input / output interface 560.
  • a speaker 563 as an audio output unit, a microphone (not shown) as an audio input unit, or a GPS position determination unit may be connected.
  • Note that FIG. 5 omits, from the RAM 540 and the storage 550, the programs and data relating to the general-purpose functions and other realizable functions of the processing apparatus 200.
  • FIG. 6 is a flowchart illustrating a processing procedure of the processing apparatus according to the present embodiment. This flowchart is executed by the CPU 510 in FIG. 5 using the RAM 540, and implements the functional components of the processing apparatus 200 in FIG.
  • In step S601, the processing apparatus 200 executes the start-up operation of the apparatus.
  • In step S603, the processing apparatus 200 determines whether calibration is necessary. If calibration is not necessary (NO in step S603), the processing apparatus 200 proceeds to step S609. If calibration is necessary (YES in step S603), the processing apparatus 200 proceeds to step S605.
  • In step S605, the processing apparatus 200 detects the position of the moving camera 201 using the fixed camera 202.
  • In step S607, the processing apparatus 200 executes calibration to obtain parameters such as the internal parameters, external parameters, and distortion coefficients of the moving camera 201 and the fixed camera 202.
  • In step S609, the processing apparatus 200 acquires the images captured by the moving camera 201 and the fixed camera 202.
  • the processing apparatus 200 measures the three-dimensional shape of the imaging target 210 using the images acquired by the moving camera 201 and the fixed camera 202.
  • The processing apparatus 200 measures, as the three-dimensional shape, for example, at least one of the depth of a hole machined in the imaging target 210 and the size of a step machined in the imaging target 210.
  • In step S613, the processing apparatus 200 determines whether to end the measurement. If it is determined that the measurement is not to be ended (NO in step S613), the processing apparatus 200 returns to step S609. If it is determined that the measurement is to be ended (YES in step S613), the processing apparatus 200 ends the processing.
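The flowchart of FIG. 6 can be rendered schematically as a control loop. The `machine` interface and the `_DemoMachine` stub below are invented for illustration; they are not the apparatus's actual API.

```python
def run_measurement_cycle(machine):
    machine.start_up()                         # S601: start-up operation
    if machine.needs_calibration():            # S603: calibration needed?
        pose = machine.detect_moving_camera()  # S605: locate moving camera
        machine.calibrate(pose)                # S607: internal/external
                                               #       parameters, distortion
    while True:
        images = machine.capture_all()         # S609: moving + fixed images
        shape = machine.measure_3d(images)     # measure the 3D shape
        machine.record(shape)
        if machine.should_stop():              # S613: end of measurement?
            break

class _DemoMachine:
    """Stub standing in for the real apparatus (illustration only)."""
    def __init__(self, cycles):
        self.cycles, self.shapes, self.calibrated = cycles, [], False
    def start_up(self): pass
    def needs_calibration(self): return True
    def detect_moving_camera(self): return (0.0, 0.0, 0.0)
    def calibrate(self, pose): self.calibrated = True
    def capture_all(self): return ("img_moving", "img_fixed")
    def measure_3d(self, images): return {"hole_depth_mm": 12.0}
    def record(self, shape): self.shapes.append(shape)
    def should_stop(self): return len(self.shapes) >= self.cycles

m = _DemoMachine(cycles=3)
run_measurement_cycle(m)
print(m.calibrated, len(m.shapes))  # True 3
```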
  • According to the present embodiment, highly accurate measurement based on images can be performed. Further, since the position of the moving camera is detected using the fixed camera, the cameras can be calibrated. It is also possible to measure, as a three-dimensional shape, the depth of a hole machined in the imaging target and the size of a step machined in the imaging target.
  • FIG. 7 is a diagram illustrating an outline of an example of a configuration of a processing apparatus according to the present embodiment.
  • the processing apparatus according to the present embodiment is different from the above-described second embodiment in that it has a tool magazine and a camera exchange unit.
  • Other configurations and operations are the same as those of the second embodiment, and thus the same configurations and operations are denoted by the same reference numerals and detailed description thereof will be omitted.
  • the processing device 700 includes a tool magazine 701 and a camera exchange unit 702.
  • The tool magazine 701 houses a plurality of types of cameras together with a plurality of tools.
  • the tool magazine 701 is, for example, an ATC (Automatic Tool Changer) magazine and has a role of storing a tool, selecting a tool, and exchanging a tool.
  • the tool magazine 701 may be either a turret type or a magazine storage type.
  • the camera exchange unit 702 takes out the selected camera from the tool magazine 701 and attaches the selected camera to a predetermined position inside the processing device 700.
  • the camera exchange unit 702 is, for example, a robot arm, and can take out the camera from the tool magazine 701 and attach it to a predetermined position inside the processing device 700.
  • the camera exchange unit 702 may be any device as long as it can exchange cameras.
  • FIG. 8 is a diagram illustrating an example of a camera table provided in the processing apparatus according to the present embodiment.
  • The camera table 801 stores a storage position 811 in association with the camera ID 411.
  • the storage position 811 indicates a position where the camera is stored in the tool magazine 701.
  • the processing device 700 refers to the camera table 801 to select and exchange a camera according to the purpose.
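As an illustration of purpose-based selection from the tool magazine (camera table 801 with storage position 811), one might write the following. The camera IDs, specifications, and the selection rule are all hypothetical.

```python
MAGAZINE = {
    "CAM001": {"slot": 3, "magnification": 1.0},  # wide-view camera
    "CAM003": {"slot": 7, "magnification": 5.0},  # close-up camera
}

def select_camera(purpose):
    """Return (camera ID, magazine slot) suited to the stated purpose."""
    if purpose == "fine_detail":  # e.g. measuring a small hole
        cam = max(MAGAZINE, key=lambda c: MAGAZINE[c]["magnification"])
    else:                         # overview of the whole workpiece
        cam = min(MAGAZINE, key=lambda c: MAGAZINE[c]["magnification"])
    return cam, MAGAZINE[cam]["slot"]  # slot = storage position 811

print(select_camera("fine_detail"))  # ('CAM003', 7)
print(select_camera("overview"))     # ('CAM001', 3)
```

The returned slot would then be handed to the camera exchange unit 702 (e.g. a robot arm) to fetch and mount the chosen camera.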
  • FIG. 9 is a flowchart illustrating a processing procedure of the processing apparatus according to the present embodiment. This flowchart is executed by the CPU 510 in FIG. 5 using the RAM 540, and implements the functional components of the processing apparatus 700 in FIG.
  • In step S901, the processing apparatus 700 determines whether the camera needs to be replaced. If it is determined that the camera does not need to be replaced (NO in step S901), the processing apparatus 700 proceeds to step S603. If it is determined that the camera needs to be replaced (YES in step S901), the processing apparatus 700 proceeds to step S903. In step S903, the processing apparatus 700 selects a camera according to the application from among a plurality of types of cameras and replaces the current camera with it.
  • According to the present embodiment, the camera can be selectively replaced, so that measurement suited to various uses can be performed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manufacturing & Machinery (AREA)
  • Automation & Control Theory (AREA)
  • Optics & Photonics (AREA)
  • Machine Tool Sensing Apparatuses (AREA)
  • Numerical Control (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The object of the present invention is to perform measurement with high accuracy on the basis of an image. The processing device comprises: at least two cameras including at least one mobile camera; and a measurement unit that uses images acquired by the at least two cameras to measure the three-dimensional shape of a subject to be imaged.
PCT/JP2019/031959 2018-09-06 2019-08-14 Processing device, processing device control method, and processing device control program WO2020049973A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018167081A JP6595065B1 (ja) 2018-09-06 2018-09-06 Processing device, processing device control method, and processing device control program
JP2018-167081 2018-09-06

Publications (1)

Publication Number Publication Date
WO2020049973A1 true WO2020049973A1 (fr) 2020-03-12

Family

ID=68314149

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/031959 WO2020049973A1 (fr) Processing device, processing device control method, and processing device control program

Country Status (2)

Country Link
JP (1) JP6595065B1 (fr)
WO (1) WO2020049973A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109940463B (zh) * 2019-01-18 2021-08-10 哈尔滨理工大学 On-machine inspection fixture and control method for machining dimensions of hardened steel dies for milling machines
JP6892462B2 (ja) * 2019-02-05 2021-06-23 ファナック株式会社 Machine control device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0961140 (ja) * 1995-08-28 1997-03-07 Denso Corp Image input device and inspection method using the same
JP2000131032 (ja) * 1998-10-24 2000-05-12 Hitachi Seiki Co Ltd Three-dimensional shape measuring method and apparatus
JP2000180106 (ja) * 1998-12-21 2000-06-30 Kokko:Kk Workpiece machining device and computer-readable recording medium
JP2005300512 (ja) * 2004-03-18 2005-10-27 Ricoh Co Ltd Surface defect inspection device, surface defect inspection method, and program causing a computer to execute the method
JP2008168372 (ja) * 2007-01-10 2008-07-24 Toyota Motor Corp Robot device and shape recognition method
US20180150062A1 (en) * 2016-11-25 2018-05-31 Glowforge Inc. Controlled deceleration of moveable components in a computer numerically controlled machine


Also Published As

Publication number Publication date
JP2020040130A (ja) 2020-03-19
JP6595065B1 (ja) 2019-10-23

Similar Documents

Publication Publication Date Title
JP7237483B2 (ja) Robot system control method, control program, recording medium, control device, robot system, and article manufacturing method
JP7167012B2 (ja) System, method, and apparatus for identifying the position of a part for use in a manufacturing operation
US8295585B2 (en) Method for determining the position of an object in space
EP1555508A1 (fr) Système de mesure
EP1607194A2 (fr) Système robotisé comprenant plusieurs robots munis de moyens pour calibrer leur position relative
CN111193862B (zh) 照相机校正装置以及照相机校正方法
JP5815761B2 (ja) Visual sensor data creation system and detection simulation system
JP2017021723A (ja) Machine tool control system capable of acquiring a workpiece origin, and workpiece origin setting method
JP2016185572A (ja) Robot, robot control device, and robot system
US10359266B2 (en) Position measurement method of object in machine tool and position measurement system of the same
US9942524B2 (en) Device and method for detecting the position of an object in a machine tool
JP6129058B2 (ja) Teaching point correction device and teaching point correction method
WO2020049973A1 (fr) Processing device, processing device control method, and processing device control program
CN111223048B (zh) Method and system for stitching 3D vision point cloud data
CN115066313B (zh) Workpiece mounting method for a processing device, workpiece mounting support system, and storage medium
US20170292827A1 (en) Coordinate measuring system
US12058468B2 (en) Image capturing apparatus, image processing apparatus, image processing method, image capturing apparatus calibration method, robot apparatus, method for manufacturing article using robot apparatus, and recording medium
JP2010058239A (ja) Machining method
CN101451825A (zh) Calibration method for an image measuring device
JPH09253979A (ja) Cutting edge position measuring device
TW201439572A (zh) System and method for normalizing the coordinate system of a measuring machine
WO2017009615A1 (fr) Method of measuring an artefact
CN109732601B (zh) Method and device for automatically calibrating a robot pose so that it is perpendicular to the camera optical axis
WO2022124232A1 (fr) Image processing system and image processing method
JP2019104093A (ja) Robot system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19857174

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19857174

Country of ref document: EP

Kind code of ref document: A1