JP2019057250A - Work-piece information processing system and work-piece recognition method - Google Patents

Work-piece information processing system and work-piece recognition method

Info

Publication number
JP2019057250A
Authority
JP
Japan
Prior art keywords
work
learning
data
unit
workpiece
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2017182799A
Other languages
Japanese (ja)
Inventor
Hiroaki Oba (大庭 博明)
Original Assignee
NTN Corporation (NTN株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NTN Corporation (NTN株式会社)
Priority to JP2017182799A
Publication of JP2019057250A
Application status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical means
    • G01B11/24 Measuring arrangements characterised by the use of optical means for measuring contours or curvatures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Abstract

PROBLEM TO BE SOLVED: To provide a work-piece information processing system that uses machine learning and requires no human trial and error or expertise in selecting the image features and optimum conditions needed for work-piece shape recognition.

SOLUTION: A work-piece information processing system 1 used for recognizing the shape of a work-piece comprises: a learning data creation unit 3 that creates the training data required for learning on the basis of design information of a work-piece 8; a work-piece learning unit 5 that learns the training data; and a data storage unit 7 that stores the training data and the learning results of the work-piece learning unit 5. Preferably, the work-piece information processing system 1 further comprises a work-piece data creation unit 2 that creates work-piece data including a basic image and coordinate data of the work-piece from the design information. The learning data creation unit 3 creates the training data by changing the basic image.

SELECTED DRAWING: Figure 1

Description

  The present invention relates to a workpiece information processing apparatus that recognizes the shape of a workpiece that is an object of machining or assembly.

In machining or assembly work, the target workpiece is often picked up automatically by a robot or an assembly apparatus and set in the housing of the machining apparatus or into the assembly. During such pickup, the pickup arm must be controlled by recognizing the shape and posture of the workpiece. A workpiece information processing apparatus is used, for example, for the posture detection and shape recognition required for workpiece picking by a robot or an assembly device.

JP-A-10-332333 (Patent Document 1) and JP-A-11-066631 (Patent Document 2) disclose work information processing apparatuses in which images of workpieces stacked in layers are photographed with a camera and the obtained images are differentiated to extract contour lines.

In the apparatus disclosed in JP-A-10-332333 (Patent Document 1), the curvature is computed along the extracted contour line, and pixels of locally maximum curvature are detected as vertices. An image captured in advance under ideal conditions serves as the master pattern, and the image of the workpiece actually captured by the camera serves as the input pattern. Vertices of equal curvature in the master pattern and the input pattern are associated with each other, and from the association yielding the largest number of corresponding vertices, the deviations in plane position and rotation angle between the contour of the input pattern and the contour of the master pattern are detected.

In the apparatus disclosed in JP-A-11-066631 (Patent Document 2), an image captured in advance under ideal conditions is used as a template, and an image of the workpiece actually captured by a camera is used as the input image. The tangent angle at each pixel on the contour line is computed, the tangent-angle histograms of the template and the input image are overlaid, and the angular shift at which they best match is detected as the rotation angle.

Patent Document 1: JP-A-10-332333
Patent Document 2: JP-A-11-066631

The apparatuses disclosed in the above prior art have the following problems.
First, optimum detection conditions for detecting a grippable workpiece must be set. When the detection conditions are not set appropriately, contour extraction fails or the overlapping state is misrecognized, for example, and the success rate of the gripping operation drops. If this happens frequently, work efficiency suffers.

Secondly, a detection algorithm must be constructed that accounts for every workpiece posture and for disturbances. Because the postures of overlapping workpieces vary, the lighting does not fall on them uniformly and their appearance is not constant. In addition, disturbances such as oil adhering to the workpiece surface and extraneous light must be considered. Devising a detection algorithm that covers all of these conditions is not easy; it requires much trial and error, and development takes a long time.

To solve these problems, detection methods using machine learning have been proposed. In particular, image recognition using deep learning has been reported to achieve recognition rates exceeding human performance (ImageNet Large Scale Visual Recognition Challenge 2015). This is possible because a vast amount of image data can be collected and learned via the Internet and similar sources.

However, for industrial products handled by robots, assembly devices, and the like, it is difficult to collect many images in the manufacturing process, even though machine learning requires a large amount of image data. Collecting the large volume of image data needed for learning is therefore a problem.

The present invention has been made to solve the above problems, and its object is to provide a work information processing apparatus using machine learning that requires no human trial and error or expertise in selecting the image features and optimum conditions necessary for workpiece shape recognition.

In summary, the present invention is a work information processing apparatus used for workpiece shape recognition, comprising: a learning data creation unit that creates training data necessary for learning based on workpiece design information; a work learning unit that learns the training data; and a data storage unit that stores the training data and the learning result of the work learning unit.

Preferably, the work information processing apparatus further includes a work data creation unit that creates, from the design information, work data including a basic image and coordinate data of the work. The learning data creation unit creates training data by changing the basic image.

  More preferably, the learning data creation unit creates training data by performing at least one of parallel movement, rotation, reduction, and enlargement on the basic image.

More preferably, the learning data creation unit creates training data by applying to the basic image at least one of: a change in the color of the surface of the workpiece, a change in the irradiation direction of the light applied to the workpiece, and a change in the intensity of that light.

  Preferably, the workpiece information processing apparatus further includes a workpiece observation unit that photographs the workpiece. The training data includes first image data created by the learning data creation unit and second image data taken by the work observation unit.

  Preferably, the workpiece design information includes two-dimensional CAD or three-dimensional CAD design information.

  Preferably, the workpiece information processing apparatus further includes a workpiece recognition unit that receives an image of the workpiece and recognizes the shape of the workpiece. The work learning unit changes the internal parameters of the work recognition unit by learning the training data.

In another aspect, the present invention is a method for recognizing a workpiece, comprising the steps of: creating training data necessary for learning based on workpiece design information; learning the training data; storing the training data and the learning result; and receiving an image of the workpiece and recognizing the shape of the workpiece using the learning result.

According to the present invention, a work information processing apparatus can be realized that requires no human trial and error or specialized knowledge for selecting the image features and optimum conditions necessary for workpiece shape recognition.

FIG. 1 shows a configuration example of the work information processing apparatus 1 of the present embodiment.
FIG. 2 illustrates the training data created by the learning data creation unit 3.
FIG. 3 shows an example of training data.
FIG. 4 shows an example of the correspondence between data names and workpiece shapes.
FIG. 5 is a flowchart illustrating the first procedure for creating training images.
FIG. 6 shows the configuration used in the second procedure for creating training images.
FIG. 7 is a flowchart illustrating the second procedure for creating training images.
FIG. 8 illustrates learning in the work learning unit 5.
FIG. 9 is a flowchart illustrating the work learning procedure.
FIG. 10 is a flowchart illustrating the workpiece identification process executed in the workpiece recognition unit.

  Embodiments of the present invention will be described below with reference to the drawings. In the following drawings, the same or corresponding parts are denoted by the same reference numerals, and description thereof will not be repeated.

FIG. 1 is a diagram illustrating a configuration example of the work information processing apparatus 1 according to the present embodiment. Referring to FIG. 1, the work information processing apparatus 1 includes a work data creation unit 2, a learning data creation unit 3, a work observation unit 4, a work learning unit 5, a work recognition unit 6, and a data storage unit 7.

  The work data creation unit 2 creates work data serving as a basis for training data necessary for learning. The workpiece data creation unit 2 creates workpiece data based on the design information of the workpiece 8 including the material and processing method (surface properties) of the workpiece 8 and stores the workpiece data in the data storage unit 7. The design information of the work 8 includes design information of two-dimensional CAD or three-dimensional CAD.

Although work data is ultimately used as images, it stores two-dimensional or three-dimensional dimensional information in a commonly used 2D or 3D CAD data format. Information such as material, processing method, and surface properties is also stored as necessary. Available formats include DXF (Drawing Exchange Format), IGES (Initial Graphics Exchange Specification), and STEP (Standard for the Exchange of Product model data), any of which can be used in this embodiment.
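
As an illustration of reading such dimensional information, the following sketch uses the open-source ezdxf package, which is an assumption on our part; the patent names only the DXF/IGES/STEP formats, not any library. The drawing is built in memory so the sketch runs without an external file.

```python
# Hypothetical sketch: iterating the dimensional information a DXF drawing
# carries. The ezdxf package and the geometry below are illustrative only.
import ezdxf

doc = ezdxf.new()
msp = doc.modelspace()
msp.add_line((0, 0), (40, 0))        # one edge of a square workpiece outline
msp.add_circle((20, 20), radius=5)   # a bore in the workpiece

for entity in msp:                   # dimensional information for work data
    if entity.dxftype() == "LINE":
        print("line", entity.dxf.start, entity.dxf.end)
    elif entity.dxftype() == "CIRCLE":
        print("circle", entity.dxf.center, entity.dxf.radius)
```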

The learning data creation unit 3 creates the training data necessary for learning based on the work data created by the work data creation unit 2. Images actually taken by a camera or the like are often used as training data, but preparing a large number of such images for a machine tool or the like is laborious. Therefore, in the present embodiment, the learning data creation unit 3 automatically generates a large amount of training data from the original work data. FIG. 2 is a diagram for explaining the training data created by the learning data creation unit 3. The learning data creation unit 3 creates training data by changing the basic image 100. As illustrated in FIG. 2, it creates training data 100A and 100B by programmatically applying at least one of translation, rotation, reduction, and enlargement to the basic image 100.
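
A minimal sketch of such geometric variation, assuming OpenCV and NumPy (the patent does not name an implementation); the synthetic square image and parameter values are illustrative:

```python
# Sketch of geometric augmentation (translation, rotation, scaling) applied
# to a basic image; OpenCV/NumPy and all values here are assumptions.
import cv2
import numpy as np

def augment_geometric(basic_image, dx, dy, angle_deg, scale):
    h, w = basic_image.shape[:2]
    # Rotate and scale about the image center, then translate by (dx, dy).
    m = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, scale)
    m[0, 2] += dx
    m[1, 2] += dy
    return cv2.warpAffine(basic_image, m, (w, h))

base = np.zeros((240, 320, 3), np.uint8)
cv2.rectangle(base, (120, 80), (200, 160), (255, 255, 255), -1)  # square workpiece

variant_a = augment_geometric(base, dx=10, dy=-5, angle_deg=30, scale=1.0)
variant_b = augment_geometric(base, dx=0, dy=0, angle_deg=0, scale=0.8)
```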

Preferably, as shown in FIG. 2, the learning data creation unit 3 creates training data 100C and 100D by programmatically applying to the basic image 100 at least one of a change in the color of the workpiece surface, a change in the irradiation direction of the light applied to the workpiece, and a change in the intensity of that light.

In this manner, the learning data creation unit 3 artificially creates varied training data from CAD data instead of from actual captured images. It combines a program that performs at least one of the geometric processes described above (translation, rotation, reduction, enlargement) with a program that performs at least one of the photometric processes (change of workpiece surface color, change of light irradiation direction, change of light intensity), and the order of these processes can be changed arbitrarily.
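
A companion sketch for the photometric side (NumPy assumed; a true light-direction change would need surface normals from the CAD model, so only color tint and intensity gain are shown). Because each step maps an image to an image, the geometric and photometric functions compose in any order, as the paragraph above states.

```python
# Sketch of photometric augmentation: surface-color tint and light-intensity
# gain. Values and the uniform gray stand-in image are illustrative.
import numpy as np

def augment_photometric(image, rgb_tint=(1.0, 1.0, 1.0), intensity=1.0):
    out = image.astype(np.float32) * np.asarray(rgb_tint) * intensity
    return np.clip(out, 0, 255).astype(np.uint8)

base = np.full((240, 320, 3), 128, np.uint8)                     # stand-in image
variant_c = augment_photometric(base, rgb_tint=(1.0, 0.8, 0.8))  # surface color
variant_d = augment_photometric(base, intensity=1.3)             # light intensity
```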

FIG. 3 is a diagram illustrating an example of training data. As shown in FIG. 3, the training data is an image composed of sample points (pixels). The training data includes a workpiece outline W1.

The training data consists of the three-dimensional coordinates (X, Y, Z) of each sample point on the workpiece surface, the workpiece color (R, G, B), and the brightness I, each stored in an array, and is stored in the data storage unit 7 in association with a data name. The data name indicates the type of data created; for example, data names may be numbered sequentially from 1 or represented by characters such as A or ABC.
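
One way to picture this layout, as a sketch only (the patent lists the stored quantities but defines no schema; all field names below are hypothetical):

```python
# Illustrative record for one piece of training data, keyed by data name.
from dataclasses import dataclass
import numpy as np

@dataclass
class TrainingSample:
    data_name: str          # e.g. "3" for a parallelogram workpiece (FIG. 4)
    xyz: np.ndarray         # (N, 3) three-dimensional coordinates (X, Y, Z)
    rgb: np.ndarray         # (N, 3) workpiece color (R, G, B)
    brightness: np.ndarray  # (N,)   brightness I of each sample point

sample = TrainingSample("3", np.zeros((4, 3)), np.zeros((4, 3)), np.zeros(4))
```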

  FIG. 4 is a diagram illustrating an example of a correspondence relationship between a data name and a work shape. Numbers “1” to “5” corresponding to data names correspond to five types of workpiece shapes. Specifically, a square work corresponds to the data name “1”, a circular work corresponds to the data name “2”, and a parallelogram work corresponds to the data name “3”. The data name “4” corresponds to a triangular work, and the data name “5” corresponds to a double-circle work. FIG. 4 shows a simple two-dimensional shape as an example of the shape for the sake of simplicity, but the shape of the workpiece may actually be three-dimensional.

In addition, the conditions used when creating each piece of training data, such as the workpiece type, translation amount, rotation angle, scale value, color, light irradiation direction, and light intensity, are stored in the data storage unit 7 together with the training data, in association with the data name.

Returning to FIG. 1, the work observation unit 4 outputs the three-dimensional coordinates (X, Y, Z), color (R, G, B), and brightness I of each sample point on the workpiece surface. In the configuration example of FIG. 1, the work observation unit 4 includes two cameras that photograph the workpiece 8 and obtains the three-dimensional coordinates (X, Y, Z) of each sample point on the workpiece surface by stereo measurement. These are color cameras, which can output the color (R, G, B) and brightness I of each sample point on the workpiece surface.
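
The patent does not detail the stereo algorithm; as a hedged sketch under that caveat, OpenCV's semi-global block matching plus reprojection could recover (X, Y, Z) per pixel. Random images and an identity Q matrix stand in below so the sketch executes; real use needs a rectified camera pair and the 4x4 reprojection matrix from calibration (cv2.stereoRectify).

```python
# Hedged sketch of stereo measurement with OpenCV (an assumption; the patent
# does not specify the method). Placeholders replace real calibrated inputs.
import cv2
import numpy as np

rng = np.random.default_rng(0)
gray_l = rng.integers(0, 255, (240, 320), dtype=np.uint8)  # left image (placeholder)
gray_r = rng.integers(0, 255, (240, 320), dtype=np.uint8)  # right image (placeholder)

matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
disparity = matcher.compute(gray_l, gray_r).astype(np.float32) / 16.0

Q = np.eye(4, dtype=np.float32)                 # placeholder reprojection matrix
xyz = cv2.reprojectImageTo3D(disparity, Q)      # (H, W, 3): X, Y, Z per pixel
# Color (R, G, B) and brightness I are taken from the color image directly.
```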

  The work observation unit 4 is connected to the work learning unit 5 and the work recognition unit 6. In the workpiece learning unit 5 and the workpiece recognition unit 6, learning processing and recognition processing are executed using data (X, Y, Z), (R, G, B), and I, respectively.

The work learning unit 5 performs learning using the training data created by the learning data creation unit 3. The learning method may be a supervised method such as a support vector machine or backpropagation (the error back-propagation method), an unsupervised method such as an autoencoder (self-encoder), the k-means method, or principal component analysis, or a machine learning method combining these.
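
As an illustration of the supervised branch only, here is a sketch assuming scikit-learn (the patent names the methods, not a library); the feature layout and sizes are invented for the example:

```python
# Sketch: supervised learning with a support vector machine; scikit-learn and
# the feature layout are assumptions. X stacks flattened (X, Y, Z, R, G, B, I)
# features per training image; y holds the corresponding data names.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((100, 7 * 64))     # 100 training images, 64 sample points each
y = rng.integers(1, 6, 100)       # data names "1"..."5"

clf = SVC(kernel="rbf", probability=True).fit(X, y)
probs = clf.predict_proba(X[:1])  # analogous to recognition results P1..P5
```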

The workpiece recognition unit 6 performs workpiece recognition using the output data of the workpiece observation unit 4 and the learning result of the workpiece learning unit 5. It uses a recognition model corresponding to the machine learning method, such as a neural network or a support vector machine, and is configured to perform recognition, based on the learning result of the workpiece learning unit 5, with the three-dimensional coordinates (X, Y, Z), color (R, G, B), and brightness I of each workpiece-surface sample point output from the workpiece observation unit 4 as inputs.

Each of the work data creation unit 2, the learning data creation unit 3, the work learning unit 5, and the work recognition unit 6 includes a CPU (Central Processing Unit), RAM (Random Access Memory), ROM (Read Only Memory), and the like, and controls its own processing. Two or more of the functions of these units may be processed by a single CPU. The data storage unit 7 is an auxiliary storage device such as a hard disk drive or a solid-state drive. It stores the programs executed by the work data creation unit 2, the learning data creation unit 3, the work learning unit 5, and the work recognition unit 6, the data created by the work data creation unit 2 and the learning data creation unit 3, and the learning results of the work learning unit 5.

[Preparation of training images]
The workpiece learning unit 5 executes a learning process that changes the internal parameters of the workpiece recognition unit 6 by combining first image data prepared by the first creation procedure described below with second image data prepared by the second creation procedure. The internal parameters correspond, for example, to the weights and biases between the layers of a neural network, from the input layer to the output layer. That is, the training data learned by the work learning unit 5 includes the first image data and the second image data. The first creation procedure generates a large amount of first image data for training from design data (CAD data). The second creation procedure uses images of actual workpieces acquired by the work observation unit 4 as second image data for training.
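
To make "weights and biases" concrete, here is a minimal fully connected network sketch in NumPy; the two-layer architecture and layer sizes are assumptions, not the patent's specification. The arrays in params are exactly what the learning process adjusts.

```python
# Minimal sketch of "internal parameters": the weight matrices and bias
# vectors of a small fully connected network (layer sizes are assumptions).
import numpy as np

rng = np.random.default_rng(0)
params = {
    "W1": rng.normal(0, 0.1, (448, 32)), "b1": np.zeros(32),  # input -> hidden
    "W2": rng.normal(0, 0.1, (32, 5)),   "b2": np.zeros(5),   # hidden -> output
}

def forward(x, p):
    """Recognition pass; learning adjusts every array in `params`."""
    h = np.tanh(x @ p["W1"] + p["b1"])
    logits = h @ p["W2"] + p["b2"]
    e = np.exp(logits - logits.max())
    return e / e.sum()    # probabilities P1..P5 over the five data names
```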

FIG. 5 is a flowchart for explaining the first procedure for creating training images. The first creation procedure is executed by the learning data creation unit 3 in FIG. 1. Referring to FIGS. 1 and 5, in step S1 the learning data creation unit 3 reads the CAD data (original design data) created by the work data creation unit 2. In step S2, the learning data creation unit 3 asks which data name the read CAD data corresponds to, and the user selects the data name corresponding to the CAD data.

Subsequently, in step S3, the learning data creation unit 3 creates a correct answer corresponding to the data name input by the user. The correct answer corresponds to the recognition result output by the workpiece recognition unit 6. For example, when the workpiece recognition unit 6 outputs probabilities corresponding to the data names "1" to "5" in FIG. 4 as recognition results P1 to P5, the correct answer corresponding to data name "3" is (P1 = 0, P2 = 0, P3 = 1, P4 = 0, P5 = 0).
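
Constructing such a correct-answer vector from a data name is a one-hot encoding; a sketch for the five classes of FIG. 4 (the helper name is hypothetical):

```python
# One-hot correct answer for a data name among the five classes of FIG. 4.
import numpy as np

def correct_answer(data_name, n_classes=5):
    vec = np.zeros(n_classes)
    vec[int(data_name) - 1] = 1.0   # data names are numbered from 1
    return vec

print(correct_answer("3"))  # [0. 0. 1. 0. 0.] i.e. (P1=0, P2=0, P3=1, P4=0, P5=0)
```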

Subsequently, in step S4, the learning data creation unit 3 determines the change range of each parameter. The parameters include the translation amount, rotation angle, reduction or enlargement magnification, workpiece color, and light irradiation direction and intensity. The learning data creation unit 3 determines the change ranges from user input, or may set them to values determined in advance as standards.

Subsequently, in step S5, the learning data creation unit 3 generates images while varying the parameters within the change ranges determined in step S4. For example, when the six parameters of translation amount, rotation angle, magnification, workpiece color, light irradiation direction, and light irradiation intensity are each varied over five levels within their change ranges, 5^6 = 15,625 images are generated.
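
The combinatorial enumeration can be sketched with itertools.product; the parameter names, units, and five-level ranges below are illustrative assumptions:

```python
# Sketch: enumerating all 5**6 = 15625 parameter combinations for image
# generation in step S5. Ranges and units are assumptions.
import itertools
import numpy as np

levels = {
    "shift":     np.linspace(-20, 20, 5),   # translation amount (px)
    "angle":     np.linspace(0, 180, 5),    # rotation angle (deg)
    "scale":     np.linspace(0.8, 1.2, 5),  # magnification
    "color":     np.linspace(0.7, 1.3, 5),  # workpiece color factor
    "light_dir": np.linspace(0, 90, 5),     # light irradiation direction (deg)
    "light_int": np.linspace(0.5, 1.5, 5),  # light irradiation intensity
}

count = 0
for combo in itertools.product(*levels.values()):
    params = dict(zip(levels.keys(), combo))
    # image = render(basic_image, **params)  # hypothetical generation step
    count += 1
print(count)  # 15625
```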

  In step S6, the learning data creation unit 3 stores the generated image in the data storage unit 7 in association with the parameter and the data name.

FIG. 6 is a diagram illustrating a configuration used in the second creation procedure of the training image.
In the second creation procedure, images photographed by the work observation unit 4 are used. Either the training images prepared by the first creation procedure or those prepared by the second creation procedure may be used alone, but combining the two further improves the recognition rate. The training data created by the learning data creation unit 3 is reinforced by learning it together with training data obtained by observing the actual target workpiece with the work observation unit 4. In images generated by the first creation procedure, nothing else appears around the workpiece image, whereas in images created by the second creation procedure other workpieces may appear around the workpiece. In the later learning step, the work learning unit 5 learns to recognize the workpiece against each kind of surroundings. Therefore, even when a plurality of workpieces 8 lie partially overlapping in a container, an individual workpiece can be identified and recognized.

  The work observation unit 4 outputs three-dimensional coordinates (X, Y, Z), colors (R, G, B), and brightness I of each sample point on the work surface. Since these are not associated with the data name, they are associated with the data name at the time of acquisition.

FIG. 7 is a flowchart for explaining the second procedure for creating training images. In step S21, the work learning unit 5 acquires the image taken by the work observation unit 4 and displays the acquired data 12 on a monitor 9 connected to the work learning unit 5. In step S22, the data name corresponding to the acquired data 12 is obtained by the user selecting one of the data names 13 using the keyboard 10 and mouse 11 connected to the work learning unit 5. At this time, the acquired data 12 displayed on the monitor 9 and the selected data name 13 are associated with each other and stored in the data storage unit 7. Subsequently, in step S23, the learning data creation unit 3 creates a correct answer corresponding to the data name 13 input by the user. The correct answer corresponds to the recognition result output by the workpiece recognition unit 6. For example, when the workpiece recognition unit 6 outputs probabilities corresponding to the data names "1" to "5" in FIG. 4 as recognition results P1 to P5, the correct answer corresponding to data name "3" is (P1 = 0, P2 = 0, P3 = 1, P4 = 0, P5 = 0).

In FIG. 6, the monitor 9 is connected to the work learning unit 5 and the second creation procedure is performed by the work learning unit 5, but the monitor 9, keyboard 10, and mouse 11 may instead be connected to the work recognition unit 6 so that the work recognition unit 6 performs the second creation procedure. In that case, the work recognition unit 6 and the data storage unit 7 are connected so that data can be stored.

In steps S24 to S27, for this association, the user inputs the position, rotation angle, scale, and the like of the workpiece using the keyboard 10 and mouse 11 while viewing the acquired data 12 displayed on the monitor 9.

In step S24, the work learning unit 5 displays the acquired image on the monitor 9. Subsequently, in step S25, the work learning unit 5 receives the user's designation of the center position of the workpiece in the image and calculates the translation amount. The translation amount is given as horizontal and vertical coordinate values with the origin at the upper-left corner of the image.

Furthermore, in step S26, the work learning unit 5 calculates the rotation angle from two points on the workpiece designated by the user in the image. For example, when the workpiece is a square, the user designates two diagonal vertices, from which the work learning unit 5 can determine the rotation angle. In step S27, when the user designates the outer shape on the image, the work learning unit 5 can calculate the magnification at which the workpiece was photographed.
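
The S26/S27 computations reduce to simple geometry; a sketch under the assumption of a square workpiece whose diagonal length in the basic image is known (all coordinate values are illustrative):

```python
# Sketch of the S26/S27 computations: rotation angle from two user-designated
# diagonal vertices, and magnification from the outline size. Values are
# illustrative only.
import math

def rotation_angle(p1, p2):
    # Angle of the diagonal through the two designated vertices, in degrees.
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def magnification(p1, p2, cad_diagonal_px):
    # Ratio of the photographed diagonal length to the basic-image diagonal.
    return math.dist(p1, p2) / cad_diagonal_px

p1, p2 = (120, 80), (260, 190)        # user-designated diagonal vertices
print(rotation_angle(p1, p2))         # rotation of the square in the image
print(magnification(p1, p2, 200.0))   # scale relative to the basic image
```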

  In step S28, the input position, rotation angle, scale, and the like are stored in the data storage unit 7 in association with the acquired data 12.

  After the training data is prepared by the first creation procedure or the second creation procedure, the work learning unit 5 performs learning according to the procedure described below.

[Learning using training images]
FIG. 8 is a diagram for explaining learning in the work learning unit 5. FIG. 9 is a flowchart for explaining a work learning procedure executed by the work learning unit. Hereinafter, a case where the data name “3” is learned will be described as an example.

The learning data creation unit 3 stores each data name in association with its training data. In step S31, the work learning unit 5 acquires from the data storage unit 7 the training data together with the teacher data for data name "3" (P1 = 0, P2 = 0, P3 = 1, P4 = 0, P5 = 0).

In step S32, the work learning unit 5 initializes a variable n for counting the number of images to 1. In step S33, it determines whether the variable n is equal to or smaller than the number of images. When the variable n is equal to or smaller than the number of images, the processes in steps S34 to S37 are executed. When the variable n exceeds the number of images, the processes in steps S34 to S37 are skipped and the process proceeds to step S38.

In step S34, the work learning unit 5 inputs training data to the work recognition unit 6. In step S35, the work learning unit 5 acquires the recognition result of the work recognition unit 6.

In step S36, the work learning unit 5 corrects the internal parameters of the work recognition unit 6 so that the output result of the work recognition unit 6 matches the teacher data. When recognition is performed again with the changed internal parameters and the output of the workpiece recognition unit 6 still does not match the teacher data, the workpiece learning unit 5 corrects the internal parameters again. In step S36, the work learning unit 5 thus repeats internal-parameter correction and recognition, bringing the output of the work recognition unit 6 closer to the teacher data.

In FIG. 8, the recognition results P1 to P5 indicate the probabilities corresponding to the data names "1" to "5", respectively. (P1 = 1, P2 = 0, P3 = 0.5, P4 = 1, P5 = 0) is the recognition result for the training data produced by the workpiece recognition unit 6 before the internal parameters are adjusted. The teacher data shown in FIG. 8 (P1 = 0, P2 = 0, P3 = 1, P4 = 0, P5 = 0) indicates a probability of 1 for data name "3" and a probability of 0 for data names "1", "2", "4", and "5". In step S36, the work learning unit 5 executes a learning process that adjusts the internal parameters so that the result of recognizing the training data approaches the teacher data (P1 = 0, P2 = 0, P3 = 1, P4 = 0, P5 = 0).

  Subsequent to step S36, the variable n is incremented in step S37, and the determination process of step S33 is executed again. When the variable n exceeds the number of training images in the determination process in step S33, the process proceeds to step S38, and the learning result is stored. Specifically, in step S38, the work learning unit 5 fixes the adjusted internal parameters of the work recognition unit 6, and ends the learning.
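
As a hedged sketch of this correct-and-recognize loop (the patent says only that the parameters are corrected repeatedly; the one-layer softmax model, cross-entropy gradient, and learning rate below are assumptions):

```python
# Sketch of the S33-S37 loop: repeat recognition and internal-parameter
# correction until the output approaches the teacher data. Model and update
# rule are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(0, 0.1, (448, 5))   # internal parameters: weights
b = np.zeros(5)                    # internal parameters: biases

def recognize(x):
    logits = x @ W + b
    e = np.exp(logits - logits.max())
    return e / e.sum()             # recognition result P1..P5

x = rng.random(448)                               # one training image's features
teacher = np.array([0.0, 0.0, 1.0, 0.0, 0.0])     # teacher data for name "3"

for _ in range(100):               # repeat correction and recognition
    p = recognize(x)
    grad = p - teacher             # softmax + cross-entropy gradient w.r.t. logits
    W -= 0.5 * np.outer(x, grad)
    b -= 0.5 * grad
print(recognize(x))                # approaches (0, 0, 1, 0, 0)
```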

The above description exemplifies learning of the data name. However, in addition to the data name, the work learning unit 5 may be given as teacher data the translation amount, rotation angle, scale, and the like of the workpiece corresponding to the acquired data at the time the learning data was created, and may learn these directly. In this case, the workpiece recognition unit 6 outputs the translation amount, rotation angle, scale, and the like as the result of its recognition processing. The translation amount is given as horizontal and vertical coordinate values with the origin at the upper-left corner of the image.

[Work identification process]
The workpiece recognition unit 6 whose internal parameters have been adjusted in the learning process can identify which data name corresponds to the workpiece photographed by the workpiece observation unit 4.

FIG. 10 is a flowchart for explaining the workpiece identification process executed by the workpiece recognition unit 6. First, in step S41, the workpiece recognition unit 6 acquires an image photographed by the workpiece observation unit 4. Subsequently, in step S42, the workpiece recognition unit 6 executes recognition processing using the internal parameters adjusted by the workpiece learning unit 5. In step S43, the workpiece recognition unit 6 outputs the recognition result.
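
The identification step then amounts to a single forward pass with the fixed parameters; a sketch mirroring the linear-softmax model assumed in the learning sketch above (random stand-ins replace the learned parameters and the observation unit's output):

```python
# Sketch of S41-S43: acquire features, recognize with the fixed internal
# parameters, output the result. Random arrays stand in for the learned
# parameters and for the work observation unit 4's output.
import numpy as np

def identify(features, W, b):
    logits = features @ W + b
    e = np.exp(logits - logits.max())
    probs = e / e.sum()                        # recognition results P1..P5
    return str(int(np.argmax(probs)) + 1), probs

rng = np.random.default_rng(0)
W, b = rng.normal(0, 0.1, (448, 5)), np.zeros(5)  # stand-in learned parameters
features = rng.random(448)                        # stand-in observed features
name, probs = identify(features, W, b)
print(name, probs)                                # e.g. data name with P1..P5
```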

To summarize the work recognition method described in the above flowcharts: the work recognition method according to the present embodiment includes the steps of creating the training data necessary for learning based on workpiece design information (S1 to S6), learning the training data (S31 to S38), and receiving a photographed image of the workpiece and recognizing the shape of the workpiece using the learning result (S41 to S43).

As described above, since a large amount of training data is created based on the workpiece design information, the labor of actually photographing the workpiece with a camera to prepare many training images for learning can be reduced.

In machine learning, since the correct answer can be found from the data itself, disturbances included, the processing of the workpiece recognition method described above need not be changed even when a misrecognition occurs. When newly misrecognized image data arises, it is only necessary to learn that data itself, so there is the advantage that specialized knowledge of image processing is not required.

  Such a recognition result can be used, for example, for recognizing which of the five shapes shown in FIG. 4 is contained in the container in which the workpiece is placed. Further, the posture of the workpiece can be detected based on the parallel movement amount, the rotation angle, the scale, and the like included in the recognition result.

The image features necessary for detecting the posture of the workpiece are learned by the neural network itself, comprising the workpiece learning unit 5 and the workpiece recognition unit 6, so the effort of devising a detection algorithm can be reduced. In addition, since searching for and adjusting optimum detection conditions is unnecessary, this complicated work can be omitted and the effort of constructing the recognition apparatus reduced.

The embodiment disclosed here should be considered illustrative in all respects and not restrictive. The scope of the present invention is defined not by the above description of the embodiment but by the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims.

  DESCRIPTION OF SYMBOLS 1 Work information processing apparatus, 2 Work data creation part, 3 Learning data creation part, 4 Work observation part, 5 Work learning part, 6 Work recognition part, 7 Data storage part, 9 Monitor, 10 Keyboard, 11 Mouse.

Claims (8)

1. A workpiece information processing apparatus used for workpiece shape recognition, comprising:
    a learning data creation unit that creates training data necessary for learning based on design information of a workpiece;
    a work learning unit that learns the training data; and
    a data storage unit that stores the training data and a learning result of the work learning unit.
2. The work information processing apparatus according to claim 1, further comprising a work data creation unit that creates, from the design information, work data including a basic image and coordinate data of the work,
    wherein the learning data creation unit creates the training data by changing the basic image.
3. The work information processing apparatus according to claim 2, wherein the learning data creation unit creates the training data by performing at least one of translation, rotation, reduction, and enlargement on the basic image.
4. The work information processing apparatus according to claim 2 or 3, wherein the learning data creation unit creates the training data by applying to the basic image at least one of: a change in the color of the surface of the work, a change in the irradiation direction of light applied to the work, and a change in the intensity of light applied to the work.
5. The work information processing apparatus according to claim 1, further comprising a work observation unit that photographs the work,
    wherein the training data includes:
    first image data created by the learning data creation unit; and
    second image data photographed by the work observation unit.
  6.   The workpiece information processing apparatus according to claim 1, wherein the workpiece design information includes two-dimensional CAD or three-dimensional CAD design information.
7. The work information processing apparatus according to claim 1, further comprising a work recognition unit that receives an image of the work and recognizes the shape of the work,
    wherein the work learning unit changes internal parameters of the work recognition unit by learning the training data.
8. A work recognition method comprising the steps of:
    creating training data necessary for learning based on design information of a work;
    learning the training data; and
    receiving a photographed image of the work and recognizing the shape of the work using a learning result.
JP2017182799A 2017-09-22 2017-09-22 Work-piece information processing system and work-piece recognition method Pending JP2019057250A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017182799A JP2019057250A (en) 2017-09-22 2017-09-22 Work-piece information processing system and work-piece recognition method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017182799A JP2019057250A (en) 2017-09-22 2017-09-22 Work-piece information processing system and work-piece recognition method
PCT/JP2018/035021 WO2019059343A1 (en) 2017-09-22 2018-09-21 Workpiece information processing device and recognition method of workpiece

Publications (1)

Publication Number Publication Date
JP2019057250A true JP2019057250A (en) 2019-04-11

Family

ID=65811272

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2017182799A Pending JP2019057250A (en) 2017-09-22 2017-09-22 Work-piece information processing system and work-piece recognition method

Country Status (2)

Country Link
JP (1) JP2019057250A (en)
WO (1) WO2019059343A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6282045B2 (en) * 2013-05-23 2018-02-21 キヤノン株式会社 Information processing apparatus and method, program, and storage medium
GB2555313A (en) * 2015-07-13 2018-04-25 Landmark Graphics Corp Underbalanced drilling through formations with varying lithologies
JP2017054450A (en) * 2015-09-11 2017-03-16 キヤノン株式会社 Recognition unit, recognition method and recognition program
JP6604832B2 (en) * 2015-12-02 2019-11-13 池上通信機株式会社 Machine learning support device

Also Published As

Publication number Publication date
WO2019059343A1 (en) 2019-03-28

Similar Documents

Publication Publication Date Title
JP4878842B2 (en) Robot drive method
JP5839929B2 (en) Information processing apparatus, information processing system, information processing method, and program
JP6267097B2 (en) System and method for three-dimensional alignment of objects using machine vision
DE112011103794B4 (en) Pick-up device for workpieces
RU2566226C2 (en) Selection of physical objects in robotics system
Chan et al. A multi-sensor approach to automating co-ordinate measuring machine-based reverse engineering
EP1477924B1 (en) Gesture recognition apparatus, method and program
Rahardja et al. Vision-based bin-picking: Recognition and localization of multiple complex objects using simple visual cues
US8095237B2 (en) Method and apparatus for single image 3D vision guided robotics
EP1905548B1 (en) Workpiece picking apparatus
JP3768174B2 (en) Work take-out device
JP4317465B2 (en) Face identification device, face identification method, and face identification program
US20090118864A1 (en) Method and system for finding a tool center point for a robot using an external camera
CN101274432B (en) Apparatus for picking up objects
JP2007213353A (en) Apparatus for detecting three-dimensional object
JP5975685B2 (en) Information processing apparatus and information processing method
US20050084141A1 (en) Action recognition apparatus and apparatus for recognizing attitude of object
JP5049975B2 (en) 3D model data generation method and 3D model data generation apparatus
Suzuki et al. Visual servoing to catch fish using global/local GA search
US7283661B2 (en) Image processing apparatus
JP5787642B2 (en) Object holding device, method for controlling object holding device, and program
JP2004295223A (en) Image processing apparatus and robot system
JP4004899B2 (en) Article position / orientation detection apparatus and article removal apparatus
JP4825253B2 (en) System and method for deformable object recognition
CA2369845A1 (en) Method and apparatus for single camera 3d vision guided robotics