WO2022107684A1 - Device for adjusting parameter, robot system, method, and computer program - Google Patents
Device for adjusting parameter, robot system, method, and computer program
- Publication number
- WO2022107684A1 (PCT/JP2021/041616)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- work
- image data
- processor
- model
- parameter
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/04—Viewing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/75—Determining position or orientation of objects or cameras using feature-based methods involving models
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40053—Pick 3-D object from pile of objects
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40499—Reinforcement learning algorithm
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40564—Recognize shape, contour of object, extract position and orientation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
Definitions
- Servo motors 29 are provided on the components of the robot 12 (the robot base 18, swivel body 20, lower arm 22, upper arm 24, and wrist 26). In response to commands from the control device 16, these servo motors 29 rotate the movable elements of the robot 12 (the swivel body 20, lower arm 22, upper arm 24, wrist 26, and wrist flange 26b) about their drive axes. As a result, the robot 12 can move the end effector 28 to an arbitrary position and posture.
- The processor 30 operates the robot 12 to perform a work-handling operation in which a work W piled in bulk in a container B is gripped by the end effector 28 and picked up.
- The processor 30 first images the work W in the container B with the visual sensor 14.
- The processor 30 acquires a parameter PM for collating a work model WM, which models the work W, with the work feature WP of the work W imaged by the visual sensor 14. The processor 30 then applies the parameter PM to a predetermined algorithm AL (software) and collates the work model WM with the work feature WP according to the algorithm AL, thereby acquiring the position data (specifically, the coordinates of the position and posture) of the work W appearing in the image data ID1 in the sensor coordinate system C3.
- The processor 30 acquires the position data of the imaged work W in the robot coordinate system C1 by converting the acquired position in the sensor coordinate system C3 into the robot coordinate system C1.
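The conversion in this step, from the sensor coordinate system C3 to the robot coordinate system C1, is a standard homogeneous-transform change of frame. The sketch below is a hypothetical illustration: the 4x4 pose `T_robot_sensor` of the sensor frame in the robot frame is assumed known (e.g. from calibration) and is not defined in the patent.

```python
import numpy as np

def sensor_to_robot(p_sensor, T_robot_sensor):
    """Convert a position from sensor coordinates C3 to robot coordinates C1.

    T_robot_sensor: 4x4 homogeneous pose of the sensor frame expressed in
    the robot frame (assumed known, e.g. from hand-eye calibration)."""
    p = np.append(np.asarray(p_sensor, dtype=float), 1.0)  # homogeneous point
    return (T_robot_sensor @ p)[:3]

# Example: sensor frame translated 0.5 along the robot's x axis, no rotation.
T = np.eye(4)
T[0, 3] = 0.5
print(sensor_to_robot([0.1, 0.2, 0.3], T))  # -> [0.6 0.2 0.3]
```

The same matrix product handles rotation as well; only the calibration that produces `T_robot_sensor` changes.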
- The processor 30 adjusts the parameter PM so as to optimize it, using the work feature WP of the work W imaged by the visual sensor 14.
- FIG. 6 shows an example of the image data ID2 generated in step S11.
- The processor 30 arranges the work model WM in a virtual space defined by the sensor coordinate system C3, and generates the image data ID2 of the virtual space in which the work model WM is arranged together with the work feature WP of the work W. The processor 30 further sets a work coordinate system C4 together with the work model WM in the sensor coordinate system C3.
- This work coordinate system C4 defines the position (specifically, the position and posture) of the work model WM.
- Using the parameter PM1 stored in the memory 32 at the start of step S11, the processor 30 obtains the position of the work W in the image data ID2 as a detection position DP1.
- When obtaining the detection position DP1, the processor 30 applies the parameter PM1 to the algorithm AL and collates the work model WM with the work feature WP appearing in the image data ID2 according to the algorithm AL.
- The processor 30 detects, as the detection position DP1, the coordinates (x, y, z, W, P, R) in the sensor coordinate system C3 of the work coordinate system C4 set in the work model WM.
- The coordinates (x, y, z) indicate the origin position of the work coordinate system C4 in the sensor coordinate system C3.
- The coordinates (W, P, R) indicate the posture (so-called yaw, pitch, and roll) of the work coordinate system C4 with respect to the sensor coordinate system C3.
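A pose given as (x, y, z, W, P, R) can be assembled into one homogeneous transform. The axis convention below (W, P, R as rotations about X, Y, Z, composed in that order, in degrees) is an assumption for illustration only; controllers differ in their Euler conventions.

```python
import numpy as np

def pose_to_matrix(x, y, z, w, p, r):
    """Build a 4x4 homogeneous transform from an (x, y, z, W, P, R) pose.
    Assumes W, P, R are rotations about the X, Y, Z axes in degrees,
    composed as Rz(R) @ Ry(P) @ Rx(W); the actual convention varies."""
    w, p, r = np.radians([w, p, r])
    rx = np.array([[1, 0, 0],
                   [0, np.cos(w), -np.sin(w)],
                   [0, np.sin(w),  np.cos(w)]])
    ry = np.array([[ np.cos(p), 0, np.sin(p)],
                   [0, 1, 0],
                   [-np.sin(p), 0, np.cos(p)]])
    rz = np.array([[np.cos(r), -np.sin(r), 0],
                   [np.sin(r),  np.cos(r), 0],
                   [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = rz @ ry @ rx    # rotation part from the W, P, R angles
    T[:3, 3] = [x, y, z]        # translation part from x, y, z
    return T

# A 90-degree rotation about Z maps the X axis onto the Y axis.
T = pose_to_matrix(1.0, 2.0, 3.0, 0.0, 0.0, 90.0)
print(np.round(T, 3))
```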
- The above-mentioned parameter PM1 is used for collating the work model WM with the work feature WP.
- It includes the size SZ, the image roughness (or resolution) used for collation, and data specifying which feature points of the work model WM and the work feature WP are to be collated with each other (for example, data identifying the "contours" of the work model WM and the work feature WP to be matched).
- The processor 30 acquires the detection position DP1 (x, y, z, W, P, R) by collating the work model WM and the work feature WP with each other using the parameter PM1. In the present embodiment, the processor 30 therefore functions as a position detection unit 54 (FIG. 2) that obtains the detection position DP1 using the parameter PM1.
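As a rough illustration of what "collating the work model WM with the work feature WP" can mean, the toy matcher below scores candidate poses by how many transformed model points land near detected feature points. It is a hypothetical stand-in for the unspecified algorithm AL, in 2-D for brevity, and it ignores the size and roughness settings.

```python
import numpy as np

def match_model(model_pts, feature_pts, candidate_poses, tol):
    """Return the candidate pose whose translated model points best overlap
    the feature points (toy collation; not the patent's algorithm AL).

    candidate_poses: list of (x, y) translations to try."""
    best_pose, best_score = None, -1
    for pose in candidate_poses:
        moved = model_pts + np.asarray(pose)
        # Distance from every moved model point to every feature point.
        d = np.linalg.norm(moved[:, None, :] - feature_pts[None, :, :], axis=2)
        # Score = number of model points with a feature point within tol.
        score = int((d.min(axis=1) <= tol).sum())
        if score > best_score:
            best_pose, best_score = pose, score
    return best_pose, best_score

model = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
feature = model + np.array([2.0, 3.0])          # the "work" sits at (2, 3)
pose, score = match_model(model, feature, [(0, 0), (2, 3), (5, 5)], tol=0.1)
print(pose, score)  # -> (2, 3) 3
```

A real matcher would search over rotation as well and report the full (x, y, z, W, P, R) detection position.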
- In step S26, the processor 30 updates the parameter PMn.
- Specifically, the processor 30 updates the parameter PMn by changing it (for example, the displacement amount E, the size SZ, or the image roughness) by the change amount Δn determined in the latest step S25.
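The repetition of steps S25 and S26 amounts to an iterative update: compute the difference between the detection position and the matching position, pick a change amount Δn that shrinks it, and update PM. Below is a minimal one-dimensional sketch; the scalar parameter, the linear toy detection function, and the fixed step size are all hypothetical simplifications of the patent's procedure.

```python
def adjust_parameter(pm, detect, matching_pos, step=0.1, iters=100, tol=1e-6):
    """Iteratively update a scalar parameter PM so that the detection
    position approaches the matching position (hypothetical 1-D sketch).

    detect(pm) -> detection position DP_n for the current parameter."""
    for _ in range(iters):
        diff = detect(pm) - matching_pos   # data representing the difference
        if abs(diff) < tol:
            break
        pm = pm - step * diff              # change PM by Delta_n to shrink it
    return pm

# Toy model: the detection position depends linearly on the parameter.
pm_opt = adjust_parameter(pm=0.0, detect=lambda pm: 2.0 + pm, matching_pos=5.0)
print(round(pm_opt, 3))  # converges near 3.0
```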
- Step S4, that is, the process of acquiring the matching position.
- Steps identical to those in the flow shown in FIG. 5 are assigned the same step numbers, and duplicate description is omitted.
- In step S4 shown in FIG. 9, the processor 30 executes steps S31 and S32 after step S11.
- The processor 30 may erroneously display the work model WM at an inappropriate position.
- FIG. 10 shows an example of the image data ID2 in which the work model WM is displayed at an inappropriate position.
- A feature F of a member different from the work W appears in the image.
- The processor 30 may also fail to recognize a work feature WP appearing in the image data ID2 and thus fail to display the corresponding work model WM.
- An example of this is shown in FIG. 11. In the image data ID2 shown in FIG. 11, of the three work features WP in total, the work model WM corresponding to the upper-right work feature WP is not displayed. In such a case, the operator needs to add the work model WM to the image data ID2.
- When the image data ID2 shown in FIG. 11 is displayed in step S11, the operator operates the input device 40 while viewing the image data ID2 and inputs input data IP4 that specifies the position (for example, the coordinates in the sensor coordinate system C3) in the image data ID2 at which the work model WM is to be added.
- In step S32, the processor 30 functions as the image generation unit 52 and, in accordance with the received input data IP3 or IP4, deletes the displayed work model WM from the image data ID2 or additionally displays a work model WM in the image data ID2. For example, when the input data IP3 is received, the processor 30 deletes the work model WM displayed at the position corresponding to the feature F from the image data ID2 shown in FIG. As a result, the image data ID2 is updated as shown in FIG.
- The processor 30 may display the work model WM at a position in the image data ID2 determined in accordance with a predetermined rule.
- For example, this rule can be defined as arranging the work models WM in the image data ID2 in a grid pattern at equal intervals.
- FIG. 14 shows an example in which the processor 30 displays the work models WM in the image data ID2 according to the rule of arranging them in a grid pattern at equal intervals.
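The "grid pattern at equal intervals" rule can be sketched as a small position generator. The image dimensions and grid counts below are made-up example inputs.

```python
def grid_positions(width, height, nx, ny):
    """Equally spaced (x, y) positions for placing work models in an image
    of the given size (a sketch of the equal-interval grid rule)."""
    sx, sy = width / (nx + 1), height / (ny + 1)   # margins on all sides
    return [(sx * (i + 1), sy * (j + 1)) for j in range(ny) for i in range(nx)]

print(grid_positions(400, 300, 2, 2))  # -> 4 evenly spaced positions
```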
- Step S4, that is, the process of acquiring the matching position.
- The processor 30 executes steps S41 and S42 after step S11.
- In step S41, the processor 30 determines whether the image data ID2 contains a work model WM that satisfies a condition G1 for deletion.
- For each work model WM appearing in the image data ID2, the processor 30 calculates the number N of points of the three-dimensional point cloud constituting the work feature WP (i.e., pixels capturing the work feature WP) that exist within the occupied area of that work model WM. In step S41, the processor 30 then determines, for each work model WM, whether the calculated number N is equal to or less than a predetermined threshold value Nth (N ≤ Nth). If there is a work model WM for which N ≤ Nth, the processor 30 specifies that work model WM as a deletion target and determines YES. That is, in the present embodiment, the condition G1 is defined as the number N being equal to or less than the threshold value Nth.
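Condition G1 (delete a work model whose occupied area contains at most Nth points of the feature point cloud) can be sketched as follows. Approximating each model's occupied area by a sphere is an assumption made here for brevity; the patent does not prescribe the shape of the occupied area.

```python
import numpy as np

def models_to_delete(models, points, n_th):
    """Identify work models satisfying condition G1: at most n_th points
    of the work-feature point cloud lie inside the model's occupied area.

    models: list of (center, radius) spheres approximating occupied areas.
    points: (N, 3) array of point-cloud coordinates."""
    doomed = []
    for idx, (center, radius) in enumerate(models):
        inside = np.linalg.norm(points - np.asarray(center), axis=1) <= radius
        if inside.sum() <= n_th:      # N <= N_th -> deletion target
            doomed.append(idx)
    return doomed

pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [5.0, 5.0, 5.0]])
models = [((0.0, 0.0, 0.0), 1.0), ((9.0, 9.0, 9.0), 1.0)]
print(models_to_delete(models, pts, n_th=1))  # -> [1]: the empty model
```

The complementary check for condition G2 would scan the other direction: feature points covered by no model at all mark a position where a model should be added.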
- If there is a work feature WP having no points (pixels) included in the occupied area of any work model WM, the processor 30 specifies that work feature WP as a model-addition target and determines YES. That is, in the present embodiment, the condition G2 is defined as the existence of a work feature WP having no points (pixels) included in the occupied area of a work model WM. For example, in the case shown in FIG. 11, in step S41 the processor 30 specifies the work feature WP shown at the upper right of the image data ID2 as a model-addition target and determines YES.
- In step S42, the processor 30 functions as the image generation unit 52 and automatically adds a work model WM to the image data ID2 at the position corresponding to the work feature WP specified as the model-addition target in step S41.
- A work model WM is thus added as shown in FIG.
- The processor 30 additionally displays the work model WM in the image data ID2 in accordance with the predetermined condition G2. According to the flow shown in FIG. 15, the processor 30 can automatically delete or add work models WM in accordance with the conditions G1 and G2, reducing the operator's workload.
- The processor 30 first executes the image acquisition process shown in FIG. 17. In step S51, the processor 30 sets the number "i" identifying the image data ID1_i captured by the visual sensor 14 to "1".
- In step S52, the processor 30 determines whether an imaging start command has been received. For example, the operator operates the input device 40 to input the imaging start command.
- If the imaging start command has been received, the processor 30 determines YES and proceeds to step S53; otherwise, it determines NO and proceeds to step S56.
- In step S53, the processor 30 images the work W with the visual sensor 14 in the same manner as in step S2 described above. As a result, the visual sensor 14 captures the i-th image data ID1_i and supplies it to the processor 30.
- In step S54, the processor 30 stores the i-th image data ID1_i acquired in the latest step S53 in the memory 32 together with the identification number "i".
- In step S55, the operator changes the arrangement of the works W in the container B shown in FIG. 1 without inputting an imaging end command.
- The operator then operates the input device 40 to input the imaging start command again.
- The processor 30 accordingly determines YES in step S52, executes steps S53 to S55 for the works W whose arrangement in the container B has changed, and acquires the (i+1)-th image data ID1_i+1.
- In step S62, the processor 30 generates image data ID2_i in which the work feature WP is displayed. Specifically, the processor 30 reads the i-th image data ID1_i identified by the identification number "i" from the memory 32. Based on the i-th image data ID1_i, the processor 30 then generates the i-th image data ID2_i (for example, as shown in FIG.), in which the work feature WP appearing in the i-th image data ID1_i is displayed as a GUI that the operator can visually recognize.
- In step S64, the processor 30 determines whether the identification number "i" exceeds the maximum value iMAX (i > iMAX).
- This maximum value iMAX is the total number of image data ID1_i acquired by the processor 30 in the flow of FIG. 17.
- A plurality of image data ID1_i of works W arranged at various positions are accumulated by the flow shown in FIG. 17, and the parameter PM is then adjusted using the plurality of accumulated image data ID1_i in the flow shown in FIG.
- According to this configuration, the parameter PM can be optimized for works W arranged at various positions.
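Adjusting PM against all of the stored images ID1_i, rather than a single one, can be sketched as averaging the per-image differences before each update. The scalar positions, scalar parameter, and linear toy detection model are again hypothetical simplifications.

```python
def adjust_over_images(pm, stored_images, detect, matching_positions,
                       step=0.05, iters=200):
    """Adjust a scalar parameter PM against several stored images ID1_i so
    that it works for works placed at various positions (hypothetical sketch).

    detect(pm, img) -> detection position for that image under PM."""
    for _ in range(iters):
        # Average the detection/matching differences over all stored images.
        diffs = [detect(pm, img) - mp
                 for img, mp in zip(stored_images, matching_positions)]
        pm -= step * sum(diffs) / len(diffs)
    return pm

# Toy example: two "images" whose detection offsets differ slightly.
images = [0.2, -0.1]
detect = lambda pm, img: pm + img
pm_best = adjust_over_images(0.0, images, detect, matching_positions=[1.0, 1.0])
print(round(pm_best, 2))  # settles near 0.95, the compromise over both images
```

Averaging is one reasonable choice; minimizing the worst-case difference over the stored images would be another.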
- The processor 30 may arrange, in a virtual space, a robot model 12M that models the robot 12 and a visual sensor model 14M fixed to an end effector model 28M of the robot model 12M, and simulate the operation of the robot model 12M and the visual sensor model 14M in the virtual space to perform the flows shown in FIGS. 3, 17, and 18 (i.e., by simulation). According to this configuration, the parameter PM can be adjusted by so-called offline operation without using the actual robot 12 and visual sensor 14.
- Robot system
- 12 Robot
- 14 Visual sensor
- 16 Control device
- 30 Processor
- 50 Device
- 52 Image generation unit
- 54 Position detection unit
- 56 Input reception unit
- 58 Matching position acquisition unit
- 60 Parameter adjustment unit
- 62 Command generation unit
Abstract
Description
Claims (11)
- A device comprising: a position detection unit that obtains, as a detection position, the position of a work in image data in which a work feature of the work imaged by a visual sensor is displayed, using a parameter for collating a work model that models the work with the work feature; a matching position acquisition unit that acquires, as a matching position, the position of the work model in the image data when the work model is arranged in the image data so as to coincide with the work feature; and a parameter adjustment unit that adjusts the parameter based on data representing a difference between the detection position and the matching position so as to enable the position detection unit to obtain the detection position as a position corresponding to the matching position.
- The device according to claim 1, further comprising an image generation unit that generates the image data.
- The device according to claim 2, wherein the image generation unit further displays the work model in the image data, the device further comprises an input reception unit that receives first input data for displacing the position of the work model in the image data, and the matching position acquisition unit acquires the matching position when the image generation unit displaces the position of the work model displayed in the image data in accordance with the first input data and arranges the work model so as to coincide with the work feature.
- The device according to claim 3, wherein the image generation unit displays the work model at the detection position acquired by the position detection unit, at a randomly determined position in the image data, or at a position in the image data determined in accordance with a predetermined rule.
- The device according to claim 3 or 4, wherein the input reception unit further receives second input data for deleting the work model from the image data or adding a second work model to the image data, and the image generation unit, in accordance with the second input data, deletes the displayed work model from the image data or additionally displays the second work model in the image data.
- The device according to claim 3 or 4, wherein the image generation unit, in accordance with a predetermined condition, deletes the displayed work model from the image data or additionally displays a second work model in the image data.
- The device according to any one of claims 1 to 6, wherein the parameter adjustment unit adjusts the parameter by repeatedly executing a series of operations of: determining, based on the data representing the difference, a change amount of the parameter that can reduce the difference; updating the parameter by changing the parameter by the determined change amount; and acquiring data representing the difference between the matching position and the detection position obtained by the position detection unit using the updated parameter.
- The device according to any one of claims 1 to 7, wherein the work feature is acquired by virtually imaging the work model with a visual sensor model that models the visual sensor.
- A robot system comprising: a visual sensor that images a work; a robot that performs an operation on the work; a command generation unit that generates an operation command for operating the robot based on image data captured by the visual sensor; and the device according to any one of claims 1 to 8, wherein the position detection unit acquires, as the detection position, the position of the work in the image data captured by the visual sensor using the parameter adjusted by the parameter adjustment unit, and the command generation unit acquires position data of the work in a control coordinate system for controlling the robot based on the detection position acquired by the position detection unit using the adjusted parameter, and generates the operation command based on the position data.
- A method comprising, by a processor: obtaining, as a detection position, the position of a work in image data in which a work feature of the work imaged by a visual sensor is displayed, using a parameter for collating a work model that models the work with the work feature; acquiring, as a matching position, the position of the work model in the image data when the work model is arranged in the image data so as to coincide with the work feature; and adjusting the parameter based on data representing a difference between the detection position and the matching position so as to enable the detection position to be obtained as a position corresponding to the matching position.
- A computer program that causes the processor to execute the method according to claim 10.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/252,189 US20230405850A1 (en) | 2020-11-18 | 2021-11-11 | Device for adjusting parameter, robot system, method, and computer program |
DE112021004779.5T DE112021004779T5 (de) | 2020-11-18 | 2021-11-11 | Vorrichtung zum Einstellen eines Parameters, Robotersystem, Verfahren und Computerprogramm |
CN202180075789.5A CN116472551A (zh) | 2020-11-18 | 2021-11-11 | 调整参数的装置、机器人系统、方法以及计算机程序 |
JP2022563719A JPWO2022107684A1 (ja) | 2020-11-18 | 2021-11-11 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-191918 | 2020-11-18 | ||
JP2020191918 | 2020-11-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022107684A1 true WO2022107684A1 (ja) | 2022-05-27 |
Family
ID=81708874
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
- PCT/JP2021/041616 WO2022107684A1 (ja) | Device for adjusting parameter, robot system, method, and computer program |
Country Status (6)
Country | Link |
---|---|
US (1) | US20230405850A1 (ja) |
JP (1) | JPWO2022107684A1 (ja) |
CN (1) | CN116472551A (ja) |
DE (1) | DE112021004779T5 (ja) |
TW (1) | TW202235239A (ja) |
WO (1) | WO2022107684A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- WO2023248353A1 (ja) * | 2022-06-21 | 2023-12-28 | Fanuc Corporation | Device for acquiring position data of a workpiece, control device, robot system, method, and computer program |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JPH02210584A (ja) * | 1989-02-10 | 1990-08-21 | Fanuc Ltd | Method of setting teaching data for image processing in a visual sensor, and visual sensor |
- JP2010210586A (ja) * | 2009-03-12 | 2010-09-24 | Omron Corp | Method for deriving parameters of three-dimensional measurement processing, and three-dimensional visual sensor |
- JP2020082274A (ja) * | 2018-11-26 | 2020-06-04 | Canon Inc | Image processing apparatus, control method therefor, and program |
- WO2020255229A1 (ja) * | 2019-06-17 | 2020-12-24 | Omron Corporation | Measurement device, measurement method, and measurement program |
-
2021
- 2021-11-10 TW TW110141851A patent/TW202235239A/zh unknown
- 2021-11-11 WO PCT/JP2021/041616 patent/WO2022107684A1/ja active Application Filing
- 2021-11-11 JP JP2022563719A patent/JPWO2022107684A1/ja active Pending
- 2021-11-11 US US18/252,189 patent/US20230405850A1/en active Pending
- 2021-11-11 DE DE112021004779.5T patent/DE112021004779T5/de active Pending
- 2021-11-11 CN CN202180075789.5A patent/CN116472551A/zh active Pending
Also Published As
Publication number | Publication date |
---|---|
CN116472551A (zh) | 2023-07-21 |
TW202235239A (zh) | 2022-09-16 |
JPWO2022107684A1 (ja) | 2022-05-27 |
US20230405850A1 (en) | 2023-12-21 |
DE112021004779T5 (de) | 2023-07-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- US11117262B2 (en) | Intelligent robots | |
- CN111482959B (zh) | Automatic hand-eye calibration system and method for a robot motion vision system | |
- KR102023588B1 (ko) | Deep machine learning method and apparatus for robotic grasping | |
- JP5850962B2 (ja) | Robot system using visual feedback | |
- JP5835926B2 (ja) | Information processing apparatus, control method for information processing apparatus, and program | |
- US9884425B2 (en) | Robot, robot control device, and robotic system | |
- JP2018111165A (ja) | Calibration device, method, and program for a visual sensor | |
- US11833697B2 (en) | Method of programming an industrial robot | |
- JP2016099257A (ja) | Information processing apparatus and information processing method | |
- JP2013215866A (ja) | Robot system, calibration method for robot system, calibration device, and digital camera | |
- JP2020012669A (ja) | Object inspection device, object inspection system, and method of adjusting inspection position | |
- WO2022107684A1 (ja) | Device for adjusting parameter, robot system, method, and computer program | |
- US20220395981A1 (en) | System and method for improving accuracy of 3D eye-to-hand coordination of a robotic system | |
- CN115446847A (zh) | System and method for improving 3D eye-hand coordination accuracy of a robotic system | |
- JP2018122376A (ja) | Image processing device, robot control device, and robot | |
- WO2023102647A1 (en) | Method for automated 3D part localization and adjustment of robot end-effectors | |
- US11559888B2 (en) | Annotation device | |
- WO2022092168A1 (ja) | Robot control device and robot system | |
- KR102430282B1 (ko) | Method and apparatus for recognizing worker position on a production line | |
- JP2020091126A (ja) | Measuring device, system, display method, and program | |
- CN115972192A (zh) | 3D computer vision system with variable spatial resolution | |
- KR20220067719A (ko) | Robot control apparatus and method using vision recognition based on deep learning and markers | |
- CN112643718A (zh) | Image processing apparatus, control method therefor, and storage medium storing control program therefor | |
- Xu et al. | A fast and straightforward hand-eye calibration method using stereo camera | |
- WO2021200743A1 (ja) | Device for correcting teaching position of robot, teaching device, robot system, teaching position correction method, and computer program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21894558 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022563719 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18252189 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202180075789.5 Country of ref document: CN |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21894558 Country of ref document: EP Kind code of ref document: A1 |