WO2019059343A1 - Workpiece information processing device and recognition method of workpiece - Google Patents

Workpiece information processing device and recognition method of workpiece Download PDF

Info

Publication number
WO2019059343A1
Authority
WO
WIPO (PCT)
Prior art keywords
work
workpiece
learning
data
unit
Prior art date
Application number
PCT/JP2018/035021
Other languages
French (fr)
Japanese (ja)
Inventor
博明 大庭 (Hiroaki Oba)
Original Assignee
NTN Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NTN Corporation
Publication of WO2019059343A1

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • The present invention relates to a workpiece information processing apparatus that recognizes the shape of a workpiece that is an object of machining or assembly.
  • In machining or assembly operations, workpieces are often picked up automatically by a robot, an assembly apparatus, or the like and set in the housing of a processing apparatus or in an assembly.
  • A workpiece information processing apparatus is used for the workpiece posture detection, shape recognition, and similar tasks required when a robot or an assembly apparatus takes out a workpiece.
  • JP-A-10-332333 (Patent Document 1) and JP-A-11-066321 (Patent Document 2) disclose workpiece information processing apparatuses that photograph stacked workpieces with a camera, differentiate the obtained image, and extract the contour.
  • In the apparatus of Patent Document 1, the curvature is computed along the extracted contour line, and pixels where the curvature is locally maximum are detected as vertices.
  • An image captured in advance under ideal conditions serves as the master pattern, and an image of a workpiece actually captured by a camera serves as the input pattern. Vertices of the master pattern and the input pattern with equal curvatures are associated with each other, and the deviation in planar position and rotation angle between the input-pattern contour and the master-pattern contour having the largest number of associated vertices is detected.
  • In the apparatus of Patent Document 2, an image captured in advance under ideal conditions is used as a template, and an image of a workpiece actually captured by a camera is used as the input image.
  • The contour of each image is traced, the tangent angle at each pixel on the contour is determined, the histograms of tangent angles of the template and the input image are superimposed, and the angular shift at which they best coincide is detected as the rotation angle.
  • The apparatuses disclosed in the above prior art have the following problems. First, optimal detection conditions must be set in order to detect a workpiece that can be gripped. If the detection conditions are not set properly, contour extraction may fail, the overlapping state may be recognized incorrectly, and the success rate of the gripping operation drops. If such situations occur frequently, work efficiency suffers.
  • The present invention has been made to solve the above problems, and its object is to provide a machine-learning-based workpiece information processing apparatus that does not require human trial and error or expert knowledge in selecting the image features and optimum conditions necessary for workpiece shape recognition.
  • In summary, the present invention is a workpiece information processing apparatus used for workpiece shape recognition, comprising a learning data creation unit that creates the training data necessary for learning based on the design information of the workpiece, a workpiece learning unit that learns the training data, and a data storage unit that stores the training data and the learning result of the workpiece learning unit.
  • Preferably, the workpiece information processing apparatus further includes a workpiece data creation unit that creates, from the design information, workpiece data including a basic image and coordinate data of the workpiece.
  • The learning data creation unit creates the training data by modifying this basic image.
  • More preferably, the learning data creation unit creates training data by applying at least one of translation, rotation, reduction, and enlargement to the basic image.
  • More preferably, the learning data creation unit creates training data by applying to the basic image at least one of the following processes: changing the color of the workpiece surface, changing the direction of the light irradiating the workpiece, and changing the intensity of the light irradiating the workpiece.
  • Preferably, the workpiece information processing apparatus further includes a workpiece observation unit that photographs the workpiece.
  • The training data includes the first image data created by the learning data creation unit and the second image data captured by the workpiece observation unit.
  • Preferably, the design information of the workpiece includes two-dimensional CAD or three-dimensional CAD design information.
  • Preferably, the workpiece information processing apparatus further includes a workpiece recognition unit that receives a captured image of the workpiece and recognizes the shape of the workpiece.
  • The workpiece learning unit changes the internal parameters of the workpiece recognition unit by learning the training data.
  • In another aspect, the present invention is a workpiece recognition method comprising the steps of: creating training data necessary for learning based on the design information of the workpiece; learning the training data; storing the training data and the learning result of the learning step; and receiving a captured image of the workpiece and recognizing the shape of the workpiece using the learning result.
  • FIG. 1 is a diagram showing a configuration example of the workpiece information processing apparatus 1 according to the present embodiment.
  • The workpiece information processing apparatus 1 includes a workpiece data creation unit 2, a learning data creation unit 3, a workpiece observation unit 4, a workpiece learning unit 5, a workpiece recognition unit 6, and a data storage unit 7.
  • The workpiece data creation unit 2 creates workpiece data that serves as the basis of the training data necessary for learning.
  • The workpiece data creation unit 2 creates the workpiece data based on the design information of the workpiece 8, including the material and processing method (surface texture) of the workpiece 8, and stores it in the data storage unit 7.
  • The design information of the workpiece 8 includes two-dimensional or three-dimensional CAD design information.
  • Although the substance of the workpiece data is an image, two-dimensional or three-dimensional dimension information is stored together with it, in a commonly used two-dimensional or three-dimensional CAD data format.
  • Information such as the material, processing method, and surface texture is also stored as necessary.
  • Formats of this kind include DXF (Drawing Exchange Format), IGES (Initial Graphics Exchange Specification), and STEP (Standard for the Exchange of Product model data), any of which can be used in the present embodiment.
  • The learning data creation unit 3 creates the training data necessary for learning based on the workpiece data created by the workpiece data creation unit 2. Images actually captured by a camera or the like are often used as such training data, but preparing a large number of such images on a machine tool or similar equipment is laborious. In the present embodiment, therefore, the learning data creation unit 3 automatically generates a large number of training data from the original workpiece data.
  • FIG. 2 is a diagram for explaining the training data created by the learning data creation unit 3.
  • The learning data creation unit 3 creates training data by modifying the basic image 100. As shown in FIG. 2, the learning data creation unit 3 creates training data 100A and 100B by programmatically applying at least one of translation, rotation, reduction, and enlargement to the basic image 100.
  • The learning data creation unit 3 also creates training data 100C and 100D by programmatically applying to the basic image 100 at least one of the following processes: changing the color of the workpiece surface, changing the direction of the light irradiating the workpiece, and changing the intensity of that light.
  • In this way, the learning data creation unit 3 artificially creates various training data from CAD data rather than from actually captured images.
  • The learning data creation unit 3 combines a program that performs at least one of the above translation, rotation, reduction, and enlargement processes with a program that performs at least one of the surface-color, light-direction, and light-intensity changes, and the order of the processes can be changed arbitrarily; a minimal augmentation sketch follows.
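  • As a concrete illustration of the bullet above, the following is a minimal sketch of how one such training image could be generated; it uses OpenCV, and the function name, parameter names, and the brightness-gain stand-in for light intensity are illustrative assumptions, not taken from the patent.

```python
import cv2
import numpy as np

def make_training_image(basic_image, dx=0, dy=0, angle=0.0, scale=1.0, gain=1.0):
    """Create one training image from the basic image by translating,
    rotating, and scaling it, then approximating a light-intensity change
    with a brightness gain. All parameter names are illustrative."""
    h, w = basic_image.shape[:2]
    # Rotation and scaling about the image centre.
    m = cv2.getRotationMatrix2D((w / 2, h / 2), angle, scale)
    # Add the translation component to the affine matrix.
    m[0, 2] += dx
    m[1, 2] += dy
    transformed = cv2.warpAffine(basic_image, m, (w, h))
    # Approximate a change of illumination intensity by a gain factor.
    return np.clip(transformed.astype(np.float32) * gain, 0, 255).astype(np.uint8)
```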
  • FIG. 3 is a diagram showing an example of training data.
  • As shown in FIG. 3, the training data is an image consisting of sample points (pixels).
  • The training data includes the contour W1 of the workpiece.
  • The training data consists of the three-dimensional coordinates (X, Y, Z) of each sample point on the workpiece surface, the workpiece color (R, G, B), and the brightness I; each is stored in an array and saved in the data storage unit 7 in association with a data name (see the storage sketch below).
  • The data name indicates the type of the created data and may, for example, be numbered sequentially from 1 or represented by characters such as A or ABC.
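  • The exact storage layout is not specified in the text, but one way to hold the per-sample-point arrays keyed by a data name is a NumPy structured array; the field names below are assumptions.

```python
import numpy as np

# One sample point: 3-D coordinates, colour, and brightness, as described above.
sample_point = np.dtype([
    ("xyz", np.float32, (3,)),   # three-dimensional coordinates (X, Y, Z)
    ("rgb", np.uint8, (3,)),     # workpiece colour (R, G, B)
    ("intensity", np.float32),   # brightness I
])

def new_record(height, width):
    """An image-sized array of sample points for one piece of training data."""
    return np.zeros((height, width), dtype=sample_point)

# Records are stored in association with a data name, e.g. "3" (parallelogram).
training_set = {"3": [new_record(480, 640)]}
```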
  • FIG. 4 is a diagram showing an example of correspondence between data names and shapes of workpieces.
  • The data names "1" to "5" correspond to five types of workpiece shape. Specifically, data name "1" corresponds to a square workpiece, "2" to a circular workpiece, and "3" to a parallelogram workpiece.
  • Data name "4" corresponds to a triangular workpiece, and data name "5" to a double-circle workpiece.
  • Although FIG. 4 shows simple two-dimensional shapes for the sake of simplicity, in practice the workpiece shapes may be three-dimensional.
  • The conditions under which the training data was created, for example, the workpiece type, translation amount, rotation angle, scale value, color, light irradiation direction, and light intensity, are stored in the data storage unit 7 together with the training data, in association with the data name.
  • Returning to FIG. 1, the workpiece observation unit 4 outputs the three-dimensional coordinates (X, Y, Z), color (R, G, B), and brightness I of each sample point on the workpiece surface.
  • In the configuration example of FIG. 1, the workpiece observation unit 4 includes two cameras that photograph the workpiece 8 and obtains the three-dimensional coordinates (X, Y, Z) of each sample point on the workpiece surface by stereo measurement. These are color cameras and can output the color (R, G, B) and brightness I of each sample point on the workpiece surface; a triangulation sketch follows.
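  • The patent does not spell out the stereo computation, but the two-camera setup suggests standard rectified-stereo triangulation, sketched below; the focal length, baseline, and principal point are assumed calibration values.

```python
def triangulate(u, v, disparity, f, baseline, cx, cy):
    """Recover the (X, Y, Z) of one sample point from a rectified stereo pair.
    u, v: pixel coordinates in the left image; disparity: horizontal shift of
    the point between the two images (pixels); f: focal length in pixels;
    baseline: distance between the cameras; cx, cy: principal point."""
    z = f * baseline / disparity   # depth from disparity
    x = (u - cx) * z / f
    y = (v - cy) * z / f
    return x, y, z
```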
  • The workpiece observation unit 4 is also connected to the workpiece learning unit 5 and the workpiece recognition unit 6.
  • The workpiece learning unit 5 and the workpiece recognition unit 6 execute the learning process and the recognition process, respectively, using the data (X, Y, Z), (R, G, B), and I.
  • The workpiece learning unit 5 performs learning using the training data created by the learning data creation unit 3.
  • As the learning method, supervised methods such as support vector machines and backpropagation (the error backpropagation method), unsupervised methods such as autoencoders, the k-means method, and principal component analysis, or machine learning methods combining them, are used; an example fit is sketched below.
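  • As one example of the supervised options named above, a support vector machine can be fitted to flattened per-point features. This scikit-learn sketch uses synthetic data; the feature layout (seven values per sample point, concatenated per image) is an assumption about how the inputs could be arranged, not the patent's implementation.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# 100 synthetic training images, each with 64 sample points carrying
# (X, Y, Z), (R, G, B), and I, flattened into one feature vector per image.
X = rng.normal(size=(100, 64 * 7))
y = rng.integers(1, 6, size=100).astype(str)  # data names "1" to "5"

clf = SVC(kernel="rbf")  # a support vector machine, one of the methods named
clf.fit(X, y)
print(clf.predict(X[:3]))  # recognise the first three images
```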
  • The workpiece recognition unit 6 performs workpiece recognition processing using the output data of the workpiece observation unit 4 and the learning result of the workpiece learning unit 5.
  • The workpiece recognition unit 6 uses a recognition model corresponding to the machine learning method, such as a neural network or a support vector machine, and, based on the learning result of the workpiece learning unit 5, performs recognition processing with the three-dimensional coordinates (X, Y, Z), color (R, G, B), and brightness I of each sample point on the workpiece surface output from the workpiece observation unit 4 as inputs.
  • Each of the workpiece data creation unit 2, the learning data creation unit 3, the workpiece learning unit 5, and the workpiece recognition unit 6 includes a central processing unit (CPU), random access memory (RAM), read-only memory (ROM), and the like, and controls each component in accordance with its information processing.
  • The functions of two or more of the workpiece data creation unit 2, the learning data creation unit 3, the workpiece learning unit 5, and the workpiece recognition unit 6 may be processed by one CPU.
  • The data storage unit 7 is, for example, an auxiliary storage device such as a hard disk drive or a solid-state drive.
  • The data storage unit 7 stores the programs executed by each of the workpiece data creation unit 2, the learning data creation unit 3, the workpiece learning unit 5, and the workpiece recognition unit 6, the data created by the workpiece data creation unit 2 and the learning data creation unit 3, and the learning result of the workpiece learning unit 5.
  • The workpiece learning unit 5 executes a learning process that changes the internal parameters of the workpiece recognition unit 6 by combining the first image data prepared in the first creation procedure described below with the second image data prepared in the second creation procedure.
  • The internal parameters correspond, for example, to the weights and biases between the layers from the input layer to the output layer of a neural network. That is, the training data learned by the workpiece learning unit 5 includes the first image data and the second image data.
  • The first creation procedure generates a large number of first image data for training from the design data (CAD data).
  • The second creation procedure uses images of actual workpieces acquired by the workpiece observation unit 4 as the second image data for training.
  • FIG. 5 is a flowchart for explaining the first procedure for creating training images.
  • The first creation procedure is executed in the learning data creation unit 3 of FIG. 1.
  • In step S1, the learning data creation unit 3 reads the CAD data (the original design data) created by the workpiece data creation unit 2.
  • In step S2, the learning data creation unit 3 asks which data name the read CAD data corresponds to, and the user selects the corresponding data name.
  • In step S3, the learning data creation unit 3 creates the correct answer corresponding to the data name input by the user. The correct answer corresponds to the recognition result output by the workpiece recognition unit 6; for example, when the probabilities for data names "1" to "5" of FIG. 4 are output as recognition results P1 to P5, the correct answer for data name "3" is (P1 = 0, P2 = 0, P3 = 1, P4 = 0, P5 = 0), as sketched below.
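  • A minimal sketch of building that correct-answer vector from a selected data name (the names and their ordering are taken from FIG. 4):

```python
DATA_NAMES = ["1", "2", "3", "4", "5"]  # square, circle, parallelogram, triangle, double circle

def correct_answer(data_name):
    """Return (P1, ..., P5) with 1 at the selected data name and 0 elsewhere."""
    return [1.0 if name == data_name else 0.0 for name in DATA_NAMES]

assert correct_answer("3") == [0.0, 0.0, 1.0, 0.0, 0.0]
```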
  • In step S4, the learning data creation unit 3 determines the variation range of each parameter.
  • The parameters include the translation amount, rotation angle, reduction or enlargement ratio, workpiece color, and the direction and intensity of the light.
  • The learning data creation unit 3 determines these ranges from user input, or may set them to predetermined standard values.
  • In step S5, the learning data creation unit 3 generates images while varying the parameters within the ranges determined in step S4. For example, if each of the six parameters (translation amount, rotation angle, magnification, workpiece color, light direction, and light intensity) is varied in five steps within its range, 5^6 = 15,625 images are generated.
  • In step S6, the learning data creation unit 3 stores each generated image in the data storage unit 7 in association with its parameters and data name; a sweep sketch follows.
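  • Step S5 amounts to a Cartesian product over the parameter ranges; with five levels per parameter it yields the 5^6 = 15,625 combinations noted above. The range values below are illustrative assumptions.

```python
import itertools
import numpy as np

# Five levels for each of the six parameters (ranges are illustrative).
dxs    = np.linspace(-20, 20, 5)   # translation in x (pixels)
dys    = np.linspace(-20, 20, 5)   # translation in y (pixels)
angles = np.linspace(0, 180, 5)    # rotation angle (degrees)
scales = np.linspace(0.8, 1.2, 5)  # magnification
hues   = np.linspace(-10, 10, 5)   # workpiece colour shift
gains  = np.linspace(0.5, 1.5, 5)  # light intensity factor

params = list(itertools.product(dxs, dys, angles, scales, hues, gains))
assert len(params) == 5 ** 6  # 15,625 images to render and store (step S6)
```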
  • FIG. 6 is a diagram showing the configuration used in the second procedure for creating training images.
  • In the second creation procedure, images captured by the workpiece observation unit 4 are used. It is possible to use only the training images prepared by the first creation procedure or only those prepared by the second creation procedure, but combining the two further improves the recognition rate.
  • The training data created by the learning data creation unit 3 is reinforced by also learning training data obtained by observing actual target workpieces with the workpiece observation unit 4.
  • In the images generated by the first creation procedure, nothing is arranged around the workpiece image.
  • In the images created by the second creation procedure, other workpieces may appear around the workpiece.
  • The workpiece learning unit 5 learns to recognize the workpiece together with its surroundings. Therefore, even when a plurality of workpieces 8 are placed in a container and partly overlap, one of them can be identified and recognized.
  • The workpiece observation unit 4 outputs the three-dimensional coordinates (X, Y, Z), color (R, G, B), and brightness I of each sample point on the workpiece surface. Since these outputs are not associated with data names, the association with a data name is performed at acquisition time.
  • FIG. 7 is a flowchart for explaining the second procedure for creating training images.
  • In step S21, the workpiece learning unit 5 acquires an image captured by the workpiece observation unit 4.
  • The workpiece learning unit 5 displays the acquired data 12 on the monitor 9 connected to the workpiece learning unit 5.
  • In step S22, the user selects one of the data names 13 using the keyboard 10 and mouse 11 connected to the workpiece learning unit 5, thereby assigning a data name to the acquired data 12.
  • The acquired data 12 displayed on the monitor 9 and the selected data name 13 are associated with each other and stored in the data storage unit 7.
  • In step S23, the learning data creation unit 3 creates the correct answer corresponding to the data name 13 input by the user.
  • In FIG. 6, the monitor 9 is connected to the workpiece learning unit 5, and the workpiece learning unit 5 executes the second creation procedure described above.
  • Alternatively, the monitor 9 may be connected to the workpiece recognition unit 6, and the second creation procedure described above may be performed by the workpiece recognition unit 6 using a keyboard 10 and mouse 11 connected to it.
  • In that case, the workpiece recognition unit 6 and the data storage unit 7 are connected so that the data can be stored.
  • In steps S24 to S27, for the association above, the user inputs the position, rotation angle, scale, and the like of the workpiece with the keyboard 10 and mouse 11 while viewing the acquired data 12 displayed on the monitor 9.
  • In step S24, the workpiece learning unit 5 displays the acquired image on the monitor 9. Subsequently, in step S25, the workpiece learning unit 5 receives the user's specification of the center position of the workpiece in the image and calculates the translation amount.
  • The translation amount is given as horizontal and vertical coordinate values with the upper-left corner of the image as the origin.
  • In step S26, the workpiece learning unit 5 calculates the rotation angle from two points of the workpiece in the image specified by the user. For example, if the workpiece is square, the user specifies two diagonal corners, from which the workpiece learning unit 5 can recognize the rotation angle.
  • In step S27, when the user specifies the outer shape on the image, the workpiece learning unit 5 can calculate the magnification of the workpiece captured in the image.
  • In step S28, the input position, rotation angle, scale, and the like are stored in the data storage unit 7 in association with the acquired data 12. A geometry sketch for steps S25 to S27 follows.
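  • Steps S25 to S27 reduce to simple geometry on the user-specified points. A sketch for a square workpiece, assuming the clicked centre, two diagonal corners, and a reference diagonal length from the CAD data are available:

```python
import math

def pose_from_clicks(center, corner_a, corner_b, reference_diagonal):
    """Translation, rotation angle, and scale of a square workpiece from user
    input: center is the clicked centre (S25); corner_a and corner_b are two
    diagonal corners (S26, S27); reference_diagonal is the diagonal length of
    the workpiece at magnification 1 (assumed known from the CAD data)."""
    # Translation: coordinates with the image's upper-left corner as origin.
    tx, ty = center
    # Rotation: orientation of the diagonal through the two corners.
    angle = math.degrees(math.atan2(corner_b[1] - corner_a[1],
                                    corner_b[0] - corner_a[0]))
    # Scale: measured diagonal length over the reference diagonal length.
    scale = math.dist(corner_a, corner_b) / reference_diagonal
    return (tx, ty), angle, scale

print(pose_from_clicks((320, 240), (300, 220), (340, 260), 56.6))
```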
  • After the training data has been prepared by the first or second creation procedure described above, the workpiece learning unit 5 performs learning according to the procedure described below.
  • FIG. 8 is a diagram for explaining learning in the workpiece learning unit 5.
  • FIG. 9 is a flowchart for explaining the workpiece learning procedure executed by the workpiece learning unit. The following description takes learning of data name "3" as an example.
  • The learning data creation unit 3 stores data names in association with the training data. In step S31, the workpiece learning unit 5 obtains from the data storage unit 7, together with the training data, (P1 = 0, P2 = 0, P3 = 1, P4 = 0, P5 = 0) as the teacher data for data name "3".
  • In step S32, the workpiece learning unit 5 initializes a variable n that counts the number of images to 1.
  • In step S33, whether the variable n is less than or equal to the number of images is determined.
  • If the variable n is less than or equal to the number of images, the processing of steps S34 to S37 is executed.
  • If the variable n exceeds the number of images, steps S34 to S37 are skipped and the process proceeds to step S38.
  • In step S34, the workpiece learning unit 5 inputs training data to the workpiece recognition unit 6. Then, in step S35, the workpiece learning unit 5 acquires the recognition result of the workpiece recognition unit 6.
  • In step S36, the workpiece learning unit 5 corrects the internal parameters of the workpiece recognition unit 6 so that the output of the workpiece recognition unit 6 matches the teacher data.
  • Recognition is performed again using the changed internal parameters, and if the output of the workpiece recognition unit 6 still does not match the teacher data, the workpiece learning unit 5 corrects the internal parameters again.
  • In step S36, the workpiece learning unit 5 thus repeats parameter correction and recognition, bringing the output of the workpiece recognition unit 6 closer to the teacher data.
  • The recognition results P1 to P5 indicate the probabilities corresponding to the data names "1" to "5", respectively.
  • After step S36, the variable n is incremented in step S37, and the determination of step S33 is performed again. If the variable n exceeds the number of training images in step S33, the process proceeds to step S38 and the learning result is stored. Specifically, in step S38, the workpiece learning unit 5 fixes the adjusted internal parameters of the workpiece recognition unit 6 and ends the learning. A minimal stand-in for this loop is sketched below.
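  • The loop of steps S33 to S38 is repeated recognition followed by parameter correction until the output approaches the teacher data. The patent does not fix a recognition model or update rule; the softmax-regression stand-in below (weights and biases as the internal parameters) is only a minimal sketch of that loop.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_classes, lr = 7, 5, 0.1
W = rng.normal(0, 0.01, (n_features, n_classes))  # internal parameters: weights
b = np.zeros(n_classes)                           # internal parameters: biases

def recognize(x):
    """Forward pass: recognition results P1..P5 for the five data names."""
    logits = x @ W + b
    e = np.exp(logits - logits.max())
    return e / e.sum()

def learn_step(x, teacher):
    """Correct the internal parameters so the output approaches the teacher data."""
    global W, b
    grad = recognize(x) - teacher      # cross-entropy gradient w.r.t. the logits
    W -= lr * np.outer(x, grad)
    b -= lr * grad

x = rng.normal(size=n_features)         # one illustrative feature vector
teacher = np.array([0, 0, 1, 0, 0.0])   # teacher data for data name "3"
for _ in range(200):                    # repeat correction and recognition
    learn_step(x, teacher)
print(recognize(x).round(2))            # P3 approaches 1
```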
  • When the learning data is created, the translation amount, rotation angle, scale, and the like of the workpiece corresponding to the acquired data may also be given as teacher data and learned directly.
  • In that case, the translation amount, rotation angle, scale, and the like are output from the workpiece recognition unit 6.
  • The translation amount is given as horizontal and vertical coordinate values with the upper-left corner of the image as the origin.
  • The workpiece recognition unit 6, whose internal parameters have been adjusted in the learning process, can identify which data name the workpiece photographed by the workpiece observation unit 4 corresponds to.
  • FIG. 10 is a flowchart for explaining the workpiece identification process executed by the workpiece identification unit.
  • In step S41, the workpiece recognition unit 6 acquires an image captured by the workpiece observation unit 4.
  • In step S42, the workpiece recognition unit 6 executes the recognition process using the internal parameters adjusted by the workpiece learning unit 5.
  • In step S43, the workpiece recognition unit 6 outputs the recognition result; an inference sketch follows.
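  • After learning, recognition (steps S41 to S43) is a single forward pass with the internal parameters held fixed; a sketch, with shapes as in the learning sketch above:

```python
import numpy as np

def recognize_workpiece(x, W, b):
    """Step S42: recognition with the internal parameters fixed after learning."""
    logits = x @ W + b
    e = np.exp(logits - logits.max())
    p = e / e.sum()                    # recognition results P1..P5
    return str(np.argmax(p) + 1), p   # step S43: identified data name, probabilities

# x would come from an image captured by the workpiece observation unit 4
# (step S41); W and b are the learned parameters from the data storage unit 7.
```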
  • The workpiece recognition method described in the above flowcharts is summarized as follows.
  • The workpiece recognition method according to the present embodiment includes the steps of creating the training data necessary for learning based on the design information of the workpiece (S1 to S6), learning the training data (S31 to S38), and receiving a captured image of the workpiece and recognizing the shape of the workpiece using the learning result (S41 to S43).
  • Such a recognition result can be used, for example, to recognize which of the five shapes shown in FIG. 4 a workpiece in a container corresponds to.
  • The posture of the workpiece can also be detected from the translation amount, rotation angle, scale, and the like included in the recognition result.
  • Since the image features necessary for detecting the workpiece posture are learned by the neural network itself, consisting of the workpiece learning unit 5 and the workpiece recognition unit 6, the effort of devising a detection algorithm is reduced.
  • Complicated operations can be omitted, and the effort of constructing a recognition device is reduced.
  • 1 workpiece information processing apparatus, 2 workpiece data creation unit, 3 learning data creation unit, 4 workpiece observation unit, 5 workpiece learning unit, 6 workpiece recognition unit, 7 data storage unit, 9 monitor, 10 keyboard, 11 mouse.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

This workpiece information processing device (1), used to recognize the shape of a workpiece, is provided with: a learning data creation unit (3) which creates the training data necessary for learning on the basis of design information on a workpiece (8); a workpiece learning unit (5) which learns the training data; and a data storage unit (7) which stores the training data and the learning result of the workpiece learning unit (5). Preferably, the workpiece information processing device (1) is further provided with a workpiece data creation unit (2) which creates, from the design information, workpiece data including a basic image and coordinate data of the workpiece. The learning data creation unit (3) creates the training data by modifying the basic image. Accordingly, the workpiece information processing device can be implemented by machine learning without requiring human trial and error or specialized knowledge in the selection of the image features and optimum conditions necessary to recognize the shape of the workpiece.

Description

Workpiece information processing apparatus and workpiece recognition method
The present invention relates to a workpiece information processing apparatus that recognizes the shape of a workpiece that is an object of machining or assembly.
In machining or assembly operations, workpieces are often picked up automatically by a robot, an assembly apparatus, or the like and set in the housing of a processing apparatus or in an assembly. At the time of such a pickup, the shape and posture of the workpiece must be recognized in order to control the pickup arm. For example, a workpiece information processing apparatus is used for the workpiece posture detection and shape recognition required when a robot or an assembly apparatus takes out a workpiece.
Japanese Patent Application Laid-Open No. 10-332333 (Patent Document 1) and Japanese Patent Application Laid-Open No. 11-066321 (Patent Document 2) disclose workpiece information processing apparatuses that photograph stacked workpieces with a camera, differentiate the obtained image, and extract the contour.
In the apparatus disclosed in Japanese Patent Application Laid-Open No. 10-332333 (Patent Document 1), the curvature is computed along the extracted contour line, and pixels where the curvature is locally maximum are detected as vertices. An image captured in advance under ideal conditions serves as the master pattern, and an image of a workpiece actually captured by a camera serves as the input pattern. Vertices of the master pattern and the input pattern with equal curvatures are associated with each other, and the deviation in planar position and rotation angle between the input-pattern contour and the master-pattern contour having the largest number of associated vertices is detected.
In the apparatus disclosed in Japanese Patent Application Laid-Open No. 11-066321 (Patent Document 2), an image captured in advance under ideal conditions is used as a template, and an image of a workpiece actually captured by a camera is used as the input image. The contour of each image is traced, the tangent angle at each pixel on the contour is determined, the histograms of tangent angles of the template and the input image are superimposed, and the angular shift at which they best coincide is detected as the rotation angle.
Patent Document 1: Japanese Patent Application Laid-Open No. 10-332333; Patent Document 2: Japanese Patent Application Laid-Open No. 11-066321
The apparatuses disclosed in the above prior art have the following problems.
First, optimal detection conditions must be set in order to detect a workpiece that can be gripped. If the detection conditions are not set properly, contour extraction may fail, the overlapping state may be recognized incorrectly, and the success rate of the gripping operation drops. If such situations occur frequently, work efficiency suffers.
Second, a detection algorithm must be constructed that takes every workpiece posture and every disturbance into consideration. Overlapping workpieces lie in various postures, so they are not illuminated uniformly and their appearance is not constant. Disturbances such as deposits (for example, oil) on the workpiece surface and extraneous light must also be considered. Devising a detection algorithm that covers all conditions is not easy; it requires much trial and error, and development takes a long time.
Detection methods based on machine learning have been proposed to solve these problems. In particular, image recognition by deep learning has been reported to achieve recognition rates surpassing those of humans (ImageNet Large Scale Visual Recognition Challenge 2015). This is because a very large number of images could be obtained and learned from using the Internet and similar sources.
However, for the industrial products handled by robots, assembly apparatuses, and the like, it is difficult to collect many images in the manufacturing process, even though machine learning requires a very large amount of image data. Collecting the large amount of image data required for learning is therefore a challenge.
The present invention has been made to solve the above problems, and its object is to provide a machine-learning-based workpiece information processing apparatus that does not require human trial and error or expert knowledge in selecting the image features and optimum conditions necessary for workpiece shape recognition.
In summary, the present invention is a workpiece information processing apparatus used for workpiece shape recognition, comprising a learning data creation unit that creates the training data necessary for learning based on the design information of the workpiece, a workpiece learning unit that learns the training data, and a data storage unit that stores the training data and the learning result of the workpiece learning unit.
Preferably, the workpiece information processing apparatus further includes a workpiece data creation unit that creates, from the design information, workpiece data including a basic image and coordinate data of the workpiece. The learning data creation unit creates the training data by modifying the basic image.
More preferably, the learning data creation unit creates training data by applying at least one of translation, rotation, reduction, and enlargement to the basic image.
More preferably, the learning data creation unit creates training data by applying to the basic image at least one of the following processes: changing the color of the workpiece surface, changing the direction of the light irradiating the workpiece, and changing the intensity of the light irradiating the workpiece.
Preferably, the workpiece information processing apparatus further includes a workpiece observation unit that photographs the workpiece. The training data includes the first image data created by the learning data creation unit and the second image data captured by the workpiece observation unit.
Preferably, the design information of the workpiece includes two-dimensional CAD or three-dimensional CAD design information.
Preferably, the workpiece information processing apparatus further includes a workpiece recognition unit that receives a captured image of the workpiece and recognizes the shape of the workpiece. The workpiece learning unit changes the internal parameters of the workpiece recognition unit by learning the training data.
In another aspect, the present invention is a workpiece recognition method comprising the steps of: creating training data necessary for learning based on the design information of the workpiece; learning the training data; storing the training data and the learning result of the learning step; and receiving a captured image of the workpiece and recognizing the shape of the workpiece using the learning result.
According to the present invention, it is possible to realize a workpiece information processing apparatus that requires neither human trial and error nor expert knowledge for selecting the image features and optimum conditions necessary for workpiece shape recognition.
FIG. 1 is a diagram showing a configuration example of the workpiece information processing apparatus 1 according to the present embodiment. FIG. 2 is a diagram for explaining the training data created by the learning data creation unit 3. FIG. 3 is a diagram showing an example of training data. FIG. 4 is a diagram showing an example of the correspondence between data names and workpiece shapes. FIG. 5 is a flowchart for explaining the first procedure for creating training images. FIG. 6 is a diagram showing the configuration used in the second procedure for creating training images. FIG. 7 is a flowchart for explaining the second procedure for creating training images. FIG. 8 is a diagram for explaining learning in the workpiece learning unit 5. FIG. 9 is a flowchart for explaining the workpiece learning procedure. FIG. 10 is a flowchart for explaining the workpiece identification process executed by the workpiece identification unit.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the following drawings, the same or corresponding parts are denoted by the same reference numerals, and their description will not be repeated.
FIG. 1 is a diagram showing a configuration example of the workpiece information processing apparatus 1 according to the present embodiment. Referring to FIG. 1, the workpiece information processing apparatus 1 includes a workpiece data creation unit 2, a learning data creation unit 3, a workpiece observation unit 4, a workpiece learning unit 5, a workpiece recognition unit 6, and a data storage unit 7.
The workpiece data creation unit 2 creates workpiece data that serves as the basis of the training data necessary for learning. The workpiece data creation unit 2 creates the workpiece data based on the design information of the workpiece 8, including the material and processing method (surface texture) of the workpiece 8, and stores it in the data storage unit 7. The design information of the workpiece 8 includes two-dimensional or three-dimensional CAD design information.
Although the substance of the workpiece data is an image, two-dimensional or three-dimensional dimension information is stored together with it, in a commonly used two-dimensional or three-dimensional CAD data format. Information such as the material, processing method, and surface texture is also stored as necessary. Formats of this kind include DXF (Drawing Exchange Format), IGES (Initial Graphics Exchange Specification), and STEP (Standard for the Exchange of Product model data), any of which can be used in the present embodiment; a reading sketch follows.
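As an illustration of reading one of these formats, the following sketch extracts two-dimensional outline points from a DXF file with the ezdxf library; the file name and the restriction to lightweight polylines are assumptions, not part of the patent.

```python
import ezdxf

# Read an (assumed) DXF export of the workpiece design data.
doc = ezdxf.readfile("workpiece.dxf")
msp = doc.modelspace()

# Collect the outline vertices of every lightweight polyline in the drawing.
outlines = []
for polyline in msp.query("LWPOLYLINE"):
    points = [(x, y) for x, y, *_ in polyline.get_points()]
    outlines.append(points)
print(f"read {len(outlines)} outline(s)")
```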
The learning data creation unit 3 creates the training data necessary for learning based on the workpiece data created by the workpiece data creation unit 2. Images actually captured by a camera or the like are often used as such training data, but preparing a large number of such images on a machine tool or similar equipment is laborious. In the present embodiment, therefore, the learning data creation unit 3 automatically generates a large number of training data from the original workpiece data. FIG. 2 is a diagram for explaining the training data created by the learning data creation unit 3. The learning data creation unit 3 creates training data by modifying the basic image 100. As shown in FIG. 2, the learning data creation unit 3 creates training data 100A and 100B by programmatically applying at least one of translation, rotation, reduction, and enlargement to the basic image 100.
Preferably, as shown in FIG. 2, the learning data creation unit 3 creates training data 100C and 100D by programmatically applying to the basic image 100 at least one of the following processes: changing the color of the workpiece surface, changing the direction of the light irradiating the workpiece, and changing the intensity of that light.
In this way, the learning data creation unit 3 artificially creates various training data from the CAD data rather than from actually captured images. The learning data creation unit 3 combines a program that performs at least one of the above translation, rotation, reduction, and enlargement processes with a program that performs at least one of the surface-color, light-direction, and light-intensity changes, and the order of the processes can be changed arbitrarily.
FIG. 3 is a diagram showing an example of training data. As shown in FIG. 3, the training data is an image consisting of sample points (pixels). The training data includes the contour W1 of the workpiece.
The training data consists of the three-dimensional coordinates (X, Y, Z) of each sample point on the workpiece surface, the workpiece color (R, G, B), and the brightness I; each is stored in an array and saved in the data storage unit 7 in association with a data name. The data name indicates the type of the created data and may, for example, be numbered sequentially from 1 or represented by characters such as A or ABC.
FIG. 4 is a diagram showing an example of the correspondence between data names and workpiece shapes. The data names "1" to "5" correspond to five types of workpiece shape. Specifically, data name "1" corresponds to a square workpiece, "2" to a circular workpiece, "3" to a parallelogram workpiece, "4" to a triangular workpiece, and "5" to a double-circle workpiece. Although FIG. 4 shows simple two-dimensional shapes for the sake of simplicity, in practice the workpiece shapes may be three-dimensional.
In addition, the conditions under which the training data was created, for example, the workpiece type, translation amount, rotation angle, scale value, color, light irradiation direction, and light intensity, are stored in the data storage unit 7 together with the training data, in association with the data name.
Returning to FIG. 1, the workpiece observation unit 4 outputs the three-dimensional coordinates (X, Y, Z), color (R, G, B), and brightness I of each sample point on the workpiece surface. In the configuration example of FIG. 1, the workpiece observation unit 4 includes two cameras that photograph the workpiece 8, and obtains the three-dimensional coordinates (X, Y, Z) of each sample point on the workpiece surface by stereo measurement. These are color cameras and can output the color (R, G, B) and brightness I of each sample point on the workpiece surface.
The workpiece observation unit 4 is also connected to the workpiece learning unit 5 and the workpiece recognition unit 6. The workpiece learning unit 5 and the workpiece recognition unit 6 execute the learning process and the recognition process, respectively, using the data (X, Y, Z), (R, G, B), and I.
The workpiece learning unit 5 performs learning using the training data created by the learning data creation unit 3. As the learning method, supervised methods such as support vector machines and backpropagation (the error backpropagation method), unsupervised methods such as autoencoders, the k-means method, and principal component analysis, or machine learning methods combining them, are used.
The workpiece recognition unit 6 performs workpiece recognition processing using the output data of the workpiece observation unit 4 and the learning result of the workpiece learning unit 5. The workpiece recognition unit 6 uses a recognition model corresponding to the machine learning method, such as a neural network or a support vector machine, and, based on the learning result of the workpiece learning unit 5, performs recognition processing with the three-dimensional coordinates (X, Y, Z), color (R, G, B), and brightness I of each sample point on the workpiece surface output from the workpiece observation unit 4 as inputs.
Each of the workpiece data creation unit 2, the learning data creation unit 3, the workpiece learning unit 5, and the workpiece recognition unit 6 includes a central processing unit (CPU), random access memory (RAM), read-only memory (ROM), and the like, and controls each component in accordance with its information processing. The functions of two or more of these units may be processed by one CPU. The data storage unit 7 is, for example, an auxiliary storage device such as a hard disk drive or a solid-state drive. The data storage unit 7 stores the programs executed by each of the four units, the data created by the workpiece data creation unit 2 and the learning data creation unit 3, and the learning result of the workpiece learning unit 5.
[Preparation of training images]
The workpiece learning unit 5 executes a learning process that changes the internal parameters of the workpiece recognition unit 6 by combining the first image data prepared in the first creation procedure described below with the second image data prepared in the second creation procedure. The internal parameters correspond, for example, to the weights and biases between the layers from the input layer to the output layer of a neural network. That is, the training data learned by the workpiece learning unit 5 includes the first image data and the second image data. The first creation procedure generates a large number of first image data for training from the design data (CAD data). The second creation procedure uses images of actual workpieces acquired by the workpiece observation unit 4 as the second image data for training.
FIG. 5 is a flowchart for explaining the first procedure for creating training images. The first creation procedure is executed in the learning data creation unit 3 of FIG. 1. Referring to FIGS. 1 and 5, in step S1 the learning data creation unit 3 reads the CAD data (the original design data) created by the workpiece data creation unit 2. In step S2, the learning data creation unit 3 asks which data name the read CAD data corresponds to, and the user selects the corresponding data name.
Subsequently, in step S3, the learning data creation unit 3 creates the correct answer corresponding to the data name input by the user. The correct answer corresponds to the recognition result output by the workpiece recognition unit 6. For example, when the workpiece recognition unit 6 outputs the probabilities corresponding to data names "1" to "5" of FIG. 4 as recognition results P1 to P5, the correct answer corresponding to data name "3" is (P1 = 0, P2 = 0, P3 = 1, P4 = 0, P5 = 0).
Subsequently, in step S4, the learning data creation unit 3 determines the variation range of each parameter. The parameters include the translation amount, rotation angle, reduction or enlargement ratio, workpiece color, and the direction and intensity of the light. The learning data creation unit 3 determines these ranges from user input, or may set them to predetermined standard values.
Subsequently, in step S5, the learning data creation unit 3 generates images while varying the parameters within the ranges determined in step S4. For example, if each of the six parameters (translation amount, rotation angle, magnification, workpiece color, light direction, and light intensity) is varied in five steps within its range, 5^6 = 15,625 images are generated.
In step S6, the learning data creation unit 3 stores the generated images in the data storage unit 7 in association with their parameters and data names.
FIG. 6 is a diagram showing the configuration used in the second procedure for creating training images.
In the second creation procedure, images captured by the workpiece observation unit 4 are used. It is possible to use only the training images prepared by the first creation procedure or only those prepared by the second creation procedure, but combining the two further improves the recognition rate. The training data created by the learning data creation unit 3 is reinforced by also learning training data obtained by observing actual target workpieces with the workpiece observation unit 4. In the images generated by the first creation procedure, nothing is arranged around the workpiece image, whereas in the images created by the second creation procedure, other workpieces may appear around the workpiece. In the subsequent learning step, the workpiece learning unit 5 learns to recognize the workpiece together with its surroundings. Therefore, even when a plurality of workpieces 8 are placed in a container and partly overlap, one of them can be identified and recognized.
The workpiece observation unit 4 outputs the three-dimensional coordinates (X, Y, Z), color (R, G, B), and brightness I of each sample point on the workpiece surface. Since these outputs are not associated with data names, the association with a data name is performed at acquisition time.
FIG. 7 is a flowchart for explaining the second procedure for creating training images. In step S21, the workpiece learning unit 5 acquires an image captured by the workpiece observation unit 4 and displays the acquired data 12 on the monitor 9 connected to the workpiece learning unit 5. Further, in step S22, the user selects one of the data names 13 using the keyboard 10 and mouse 11 connected to the workpiece learning unit 5, thereby assigning a data name to the acquired data 12. The acquired data 12 displayed on the monitor 9 and the selected data name 13 are then associated with each other and stored in the data storage unit 7. Subsequently, in step S23, the learning data creation unit 3 creates the correct answer corresponding to the data name 13 input by the user. As in step S3, when the probabilities corresponding to data names "1" to "5" of FIG. 4 are output as recognition results P1 to P5, the correct answer corresponding to data name "3" is (P1 = 0, P2 = 0, P3 = 1, P4 = 0, P5 = 0).
In FIG. 6, the monitor 9 is connected to the workpiece learning unit 5, which executes the second creation procedure described above. Alternatively, the monitor 9 may be connected to the workpiece recognition unit 6, and the workpiece recognition unit 6 may execute the second creation procedure using a keyboard 10 and mouse 11 connected to it. In that case, the workpiece recognition unit 6 and the data storage unit 7 are connected so that the data can be stored.
In steps S24 to S27, as part of the association described above, the user enters the position, rotation angle, scale, and so on of the work with the keyboard 10 and mouse 11 while viewing the acquired data 12 displayed on the monitor 9.
In step S24, the work learning unit 5 displays the acquired image on the monitor 9. Subsequently, in step S25, the work learning unit 5 receives the user's designation of the center position of the work in the image and calculates the amount of parallel movement. The amount of parallel movement is given as horizontal and vertical coordinate values with the upper-left corner of the image as the origin.
Furthermore, in step S26, the work learning unit 5 calculates the rotation angle from two points of the work designated by the user in the image. For example, if the work is rectangular, the user designates two diagonal corners, from which the work learning unit 5 can determine the rotation angle. In step S27, when the user designates the outline of the work on the image, the work learning unit 5 can calculate the magnification at which the work appears in the image.
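The pose values entered in steps S25 to S27 can be derived from a few user-designated points. The following is a minimal sketch under the stated conventions (image origin at the upper-left corner, a rectangular work annotated by two diagonal corners); the function name and the reference-diagonal parameter are assumptions for illustration:

```python
import math

def annotate_pose(center, corner_a, corner_b, reference_diagonal):
    """Derive translation, rotation angle, and scale from user clicks.

    center             -- (x, y) of the work center in pixels, origin at upper left
    corner_a, corner_b -- two diagonal corners of a rectangular work
    reference_diagonal -- diagonal length of the work at scale 1.0 (assumed known)
    """
    tx, ty = center                            # translation: coordinates of the center (S25)
    dx = corner_b[0] - corner_a[0]
    dy = corner_b[1] - corner_a[1]
    # Angle of the diagonal; note the image y-axis points downward,
    # so the sign follows image coordinates, not math convention.
    angle = math.degrees(math.atan2(dy, dx))                 # rotation (S26)
    scale = math.hypot(dx, dy) / reference_diagonal          # magnification (S27)
    return (tx, ty), angle, scale

pose = annotate_pose(center=(320, 240), corner_a=(280, 200),
                     corner_b=(360, 280), reference_diagonal=100.0)
print(pose)  # ((320, 240), 45.0, 1.131...)
```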
Then, in step S28, the entered position, rotation angle, scale, and so on are stored in the data storage unit 7 in association with the acquired data 12.
After training data have been prepared by the first or second preparation procedure described above, the work learning unit 5 performs learning by the procedure described below.
[Learning using training images]
FIG. 8 is a diagram for explaining learning in the work learning unit 5. FIG. 9 is a flowchart for explaining the work learning procedure executed by the work learning unit. The following description takes the case of learning data name "3" as an example.
The learning data creation unit 3 stores data names in association with the training data. In step S31, the work learning unit 5 obtains the training data from the data storage unit 7 together with (P1 = 0, P2 = 0, P3 = 1, P4 = 0, P5 = 0) as the teacher data for data name "3".
In step S32, the work learning unit 5 initializes a variable n, which counts the images, to 1. In step S33, it is determined whether the variable n is less than or equal to the number of images. If it is, steps S34 to S37 are executed; if the variable n exceeds the number of images, steps S34 to S37 are skipped and the process proceeds to step S38.
In step S34, the work learning unit 5 inputs training data to the work recognition unit 6. Then, in step S35, the work learning unit 5 obtains the recognition result of the work recognition unit 6.
In step S36, the work learning unit 5 corrects the internal parameters of the work recognition unit 6 so that the output of the work recognition unit 6 matches the teacher data. Recognition is performed again with the changed internal parameters, and if the output of the work recognition unit 6 still does not match the teacher data, the work learning unit 5 corrects the internal parameters again. The work learning unit 5 repeats this correction and recognition in step S36, bringing the output of the work recognition unit 6 closer to the teacher data.
In FIG. 8, the recognition results P1 to P5 indicate the probabilities of corresponding to data names "1" to "5", respectively. (P1 = 1, P2 = 0, P3 = 0.5, P4 = 1, P5 = 0) is the recognition result of the training data by the work recognition unit 6 before the internal parameters are adjusted. The teacher data (P1 = 0, P2 = 0, P3 = 1, P4 = 0, P5 = 0) shown in FIG. 8 represent the recognition result in which the probability of data name "3" is 1 and the probabilities of data names "1", "2", "4", and "5" are zero. In step S36, the work learning unit 5 executes the learning process of adjusting the internal parameters of the work recognition unit 6 so that the result of recognizing the training data approaches the teacher data (P1 = 0, P2 = 0, P3 = 1, P4 = 0, P5 = 0).
Following step S36, the variable n is incremented in step S37, and the determination of step S33 is performed again. When the variable n exceeds the number of training images in the determination of step S33, the process proceeds to step S38 and the learning result is saved. Specifically, in step S38, the work learning unit 5 fixes the internal parameters of the work recognition unit 6 that it has been adjusting, and the learning ends.
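Steps S32 to S38 amount to a supervised training loop: each training image is fed to the recognition unit, the output is compared with the teacher data, and the internal parameters are corrected until the two agree. The sketch below illustrates this with a plain softmax classifier trained by gradient descent; the network structure, loss, learning rate, and all names are assumptions for illustration, not the disclosed implementation:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def train(images, teachers, num_classes=5, lr=0.1, steps=100):
    """Adjust internal parameters (here a single weight matrix) so that the
    recognition output approaches the teacher data (steps S32-S38)."""
    dim = images[0].size
    W = np.zeros((num_classes, dim))              # internal parameters before learning
    for _ in range(steps):                        # repeat correction and recognition (S36)
        for x, t in zip(images, teachers):        # n = 1 .. number of images (S33, S37)
            p = softmax(W @ x.ravel())            # recognition result P1..P5 (S34, S35)
            W -= lr * np.outer(p - t, x.ravel())  # correct parameters toward teacher data
    return W                                      # fixed parameters = learning result (S38)

# Toy usage: two 8x8 "images" labelled with data names 3 and 1.
rng = np.random.default_rng(0)
imgs = [rng.random((8, 8)), rng.random((8, 8))]
ts = [np.array([0, 0, 1, 0, 0.]), np.array([1, 0, 0, 0, 0.])]
W = train(imgs, ts)
print(softmax(W @ imgs[0].ravel()).round(2))      # P3 close to 1 after learning
```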
The above description illustrated the procedure for learning data names. In addition to the data name, however, the work learning unit 5 may be given, as teacher data, the amount of parallel movement, rotation angle, scale, and so on of the work corresponding to the acquired data from which the learning data were created, and may learn these directly. In that case, the amount of parallel movement, rotation angle, scale, and so on are output from the work recognition unit 6 as the result of its recognition process. The amount of parallel movement is given as horizontal and vertical coordinate values with the upper-left corner of the image as the origin.
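When the pose is learned directly in this way, one conceivable encoding is simply to extend the teacher vector with the values recorded in step S28. The layout below is purely an illustrative assumption, not the patent's actual encoding:

```python
def make_pose_teacher(one_hot, translation, angle_deg, scale):
    """Hypothetical teacher data combining the data-name probabilities
    with the pose recorded at annotation time (illustrative layout)."""
    return list(one_hot) + [translation[0], translation[1], angle_deg, scale]

print(make_pose_teacher([0, 0, 1, 0, 0], (320, 240), 45.0, 1.0))
# [0, 0, 1, 0, 0, 320, 240, 45.0, 1.0]
```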
[Work identification process]
The work recognition unit 6, whose internal parameters have been adjusted in the learning process, can identify which data name the work photographed by the work observation unit 4 corresponds to.
FIG. 10 is a flowchart for explaining the work identification process executed by the work recognition unit. First, in step S41, the work recognition unit 6 acquires an image captured by the work observation unit 4. Subsequently, in step S42, the work recognition unit 6 executes the recognition process using the internal parameters adjusted by the work learning unit 5. Then, in step S43, the work recognition unit 6 outputs the recognition result.
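With the parameters fixed, identification (steps S41 to S43) is a single forward pass. Continuing the softmax training sketch above (reusing its assumed `softmax`, `W`, and `imgs`, which are illustrative, not the disclosed implementation):

```python
def identify(W, image):
    """Steps S41-S43 as a sketch: recognize a captured image with the
    fixed internal parameters and report the most probable data name."""
    p = softmax(W @ image.ravel())    # recognition with adjusted parameters (S42)
    return int(p.argmax()) + 1, p     # data name (1-based) and probabilities (S43)

name, probs = identify(W, imgs[0])
print(name, probs.round(2))           # expected: 3 for the first training image
```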
The work recognition method described in the above flowcharts can be summarized as follows. The work recognition method according to the present embodiment includes a step of creating training data necessary for learning based on the design information of the work (S1 to S6), a step of learning the training data (S31 to S38), and a step of receiving an image of the work and recognizing the shape of the work using the learning result (S41 to S43).
Because a large amount of training data is created from the design information of the work in this way, the labor of actually photographing the work with a camera to prepare many training images for learning can be reduced.
Furthermore, because machine learning obtains the correct answer from the data themselves, disturbances included, the processing of the above work recognition method need not be changed even when a misrecognition occurs. When newly misrecognized image data arise, it suffices to have the system learn those data, so there is also the advantage that no specialized knowledge of image processing is required.
Such a recognition result can be used, for example, to recognize which of the five shapes shown in FIG. 4 is contained in a container holding workpieces. The posture of the work can also be detected from the amount of parallel movement, rotation angle, scale, and so on included in the recognition result.
Because the image features necessary for detecting the posture of the work are learned by the neural network itself, consisting of the work learning unit 5 and the work recognition unit 6, the labor of devising a detection algorithm can be reduced. Since no search for or adjustment of optimal detection conditions is needed either, complicated work can be omitted and the labor of building a recognition apparatus can be reduced.
The embodiments disclosed herein should be considered illustrative in all respects and not restrictive. The scope of the present invention is defined not by the above description of the embodiments but by the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims.
1 work information processing apparatus, 2 work data creation unit, 3 learning data creation unit, 4 work observation unit, 5 work learning unit, 6 work recognition unit, 7 data storage unit, 9 monitor, 10 keyboard, 11 mouse.

Claims (8)

1.  A work information processing apparatus used for recognizing the shape of a work, comprising:
    a learning data creation unit that creates training data necessary for learning based on design information of the work;
    a work learning unit that learns the training data; and
    a data storage unit that stores the training data and the learning result of the work learning unit.
2.  The work information processing apparatus according to claim 1, further comprising a work data creation unit that creates, from the design information, work data including a basic image and coordinate data of the work,
    wherein the learning data creation unit creates the training data by modifying the basic image.
3.  The work information processing apparatus according to claim 2, wherein the learning data creation unit creates the training data by performing at least one of translation, rotation, reduction, and enlargement on the basic image.
4.  The work information processing apparatus according to claim 2 or 3, wherein the learning data creation unit creates the training data by performing, on the basic image, at least one of a change of the surface color of the work, a change of the direction of light irradiating the work, and a change of the intensity of light irradiating the work.
5.  The work information processing apparatus according to any one of claims 1 to 4, further comprising a work observation unit that photographs the work,
    wherein the training data include first image data created by the learning data creation unit and second image data captured by the work observation unit.
6.  The work information processing apparatus according to any one of claims 1 to 5, wherein the design information of the work includes two-dimensional CAD or three-dimensional CAD design information.
7.  The work information processing apparatus according to any one of claims 1 to 6, further comprising a work recognition unit that receives an image of the work and recognizes the shape of the work,
    wherein the work learning unit changes internal parameters of the work recognition unit by learning the training data.
8.  A method of recognizing a work, comprising the steps of:
    creating training data necessary for learning based on design information of the work;
    learning the training data; and
    receiving an image of the work and recognizing the shape of the work using the learning result.
PCT/JP2018/035021 2017-09-22 2018-09-21 Workpiece information processing device and recognition method of workpiece WO2019059343A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-182799 2017-09-22
JP2017182799A JP2019057250A (en) 2017-09-22 2017-09-22 Work-piece information processing system and work-piece recognition method

Publications (1)

Publication Number Publication Date
WO2019059343A1 true WO2019059343A1 (en) 2019-03-28

Family

ID=65811272

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/035021 WO2019059343A1 (en) 2017-09-22 2018-09-21 Workpiece information processing device and recognition method of workpiece

Country Status (2)

Country Link
JP (1) JP2019057250A (en)
WO (1) WO2019059343A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112184713A (en) * 2020-11-06 2021-01-05 上海柏楚电子科技股份有限公司 Control method and device for cutting pipe containing welding seam, cutting system, equipment and medium
WO2023286847A1 (en) * 2021-07-15 2023-01-19 京セラ株式会社 Recognition model generation method and recognition model generation device

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7377627B2 (en) * 2019-06-04 2023-11-10 グローリー株式会社 Object detection device, object grasping system, object detection method, and object detection program
JP2021070122A (en) * 2019-10-31 2021-05-06 ミネベアミツミ株式会社 Learning data generation method
WO2021176902A1 (en) * 2020-03-02 2021-09-10 ソニーグループ株式会社 Learning processing device, device and method for robot control, and program
WO2022202365A1 (en) * 2021-03-22 2022-09-29 パナソニックIpマネジメント株式会社 Inspection assistance system, inspection assistance method, and program
WO2023073780A1 (en) * 2021-10-25 2023-05-04 ファナック株式会社 Device for generating learning data, method for generating learning data, and machine learning device and machine learning method using learning data

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014229115A (en) * 2013-05-23 2014-12-08 キヤノン株式会社 Information processing device and method, program, and storage medium
WO2017010985A1 (en) * 2015-07-13 2017-01-19 Landmark Graphics Corporation Underbalanced drilling through formations with varying lithologies
JP2017054450A (en) * 2015-09-11 2017-03-16 キヤノン株式会社 Recognition unit, recognition method and recognition program
JP2017102755A (en) * 2015-12-02 2017-06-08 池上通信機株式会社 Machine learning support device


Also Published As

Publication number Publication date
JP2019057250A (en) 2019-04-11

Similar Documents

Publication Publication Date Title
WO2019059343A1 (en) Workpiece information processing device and recognition method of workpiece
US11338435B2 (en) Gripping system with machine learning
JP6004809B2 (en) Position / orientation estimation apparatus, information processing apparatus, and information processing method
JP5458885B2 (en) Object detection method, object detection apparatus, and robot system
CN113524194A (en) Target grabbing method of robot vision grabbing system based on multi-mode feature deep learning
JP6487493B2 (en) Image processing system
CN105818167A (en) Method for calibrating an articulated end effector employing a remote digital camera
CN111476841B (en) Point cloud and image-based identification and positioning method and system
CN111151463A (en) Mechanical arm sorting and grabbing system and method based on 3D vision
JP6912215B2 (en) Detection method and detection program to detect the posture of an object
JP2012101320A (en) Image generation apparatus, image generation method and program
Kudoh et al. Painting robot with multi-fingered hands and stereo vision
CN111360821A (en) Picking control method, device and equipment and computer scale storage medium
CN110009689B (en) Image data set rapid construction method for collaborative robot pose estimation
WO2022208963A1 (en) Calibration device for controlling robot
Jia et al. Real-time color-based sorting robotic arm system
CN114187312A (en) Target object grabbing method, device, system, storage medium and equipment
US20230150142A1 (en) Device and method for training a machine learning model for generating descriptor images for images of objects
CN115861780B (en) Robot arm detection grabbing method based on YOLO-GGCNN
Funakubo et al. Recognition and handling of clothes with different pattern by dual hand-eyes robotic system
KR102452315B1 (en) Apparatus and method of robot control through vision recognition using deep learning and marker
CN116529760A (en) Grabbing control method, grabbing control device, electronic equipment and storage medium
CN112533739B (en) Robot control device, robot control method, and storage medium
CN109934155B (en) Depth vision-based collaborative robot gesture recognition method and device
CN117769724A (en) Synthetic dataset creation using deep-learned object detection and classification

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18858122

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18858122

Country of ref document: EP

Kind code of ref document: A1