WO2015136714A1 - Image identification device, image sensor, and image identification method - Google Patents
Image identification device, image sensor, and image identification method
- Publication number
- WO2015136714A1 (PCT application PCT/JP2014/057003)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- model
- feature point
- similarity
- identification
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/254—Fusion techniques of classification results, e.g. of results related to same input data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/757—Matching configurations of points or features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/761—Proximity, similarity or dissimilarity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/809—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
Definitions
- the present invention relates to a technique for identifying an image based on image feature points.
- a sensor device called an image sensor (visual sensor) is widely used to measure and monitor a measurement object (hereinafter also referred to as “work”) flowing through a line.
- the image sensor is composed of a camera and an image processing apparatus, detects a workpiece in an image by matching processing with a pre-registered teacher object (hereinafter also referred to as “model” or “pattern”), and extracts necessary information.
- the output of the image sensor is used for various purposes such as workpiece identification, inspection, and sorting.
- in a mixed flow line in which a plurality of types of workpieces are mixed (hereinafter also referred to as a "mixed flow line"), the image sensor is required to accurately determine the type of each workpiece in order to perform different processing depending on the type.
- a plurality of types of model images are registered in advance in the image sensor, and the type of workpiece is estimated by obtaining a model image that best matches the workpiece image.
- the degree of similarity is calculated according to how many of the feature points of the model image match the feature points of the input image.
- consider object identification using images of the six faces of a die as shown in the figure.
- in this case, the calculated similarity is 100 points (a full score) for a plurality of model images, so the identification result ends up being decided by noise in the input image and the like.
- the reason there is no difference in the degree of similarity is that some model images are in an inclusive relationship with each other or share many common parts. That is, when there are many similar model images, it is difficult to perform correct identification by conventional template matching using similarity.
- Patent Document 1 proposes that correlation values be calculated between similar model images, that feature point positions with low correlation be stored, and that the similarity be calculated using only those feature points. By calculating the similarity using only feature points with low correlation, correct identification can be performed even when similar model images exist.
- the present invention has been made in view of the above circumstances, and its object is to provide an image identification technique that can perform identification with a low processing load and high accuracy even when similar model images are included.
- the present invention adopts a configuration in which a weight based on the saliency in the model image is set for each feature point, and the similarity considering the weight is calculated.
- an image identification apparatus according to the present invention includes a storage unit in which feature points are registered for each of a plurality of model images, a feature point extraction unit that extracts feature points from an input image, and an identification unit that calculates a first similarity (hereinafter also referred to as a weighted similarity) between the input image and each model image by comparing the feature points of the input image with the feature points of each model image, and identifies the input image based on the first similarity.
- the identification unit calculates the first similarity by adding together, for each feature point of the input image that matches a feature point of the model image subject to the first similarity calculation, a score corresponding to that feature point (hereinafter also referred to as a weight score). The score corresponding to a feature point takes a larger value as fewer of the plurality of model images contain that feature point.
- the image identification device according to the present invention sets the weight score higher for feature points having higher saliency, and performs image identification based on the first similarity obtained by adding the weight scores together. Thereby, identification with high accuracy becomes possible even when there are many similar model images.
- a weight score may be obtained for the feature points included in the input image at the time of image identification, or may be obtained in advance and stored for the feature points included in each model image at the time of model image registration.
- when obtaining the weight scores at the time of image identification, the number of model images that match the feature point may be obtained for each feature point of the input image, and the weight score may be calculated according to that number.
- when obtaining the weight scores at the time of model image registration, the number of matching model images may be obtained for each feature point of every model image, and the weight scores may be calculated in advance according to that number.
- the specific calculation method may be arbitrary as long as the weight score is calculated so as to increase as the number of matching model images decreases. For example, a value corresponding to the reciprocal of the number of matches may be used as the weight score, or a value corresponding to the difference between the number of all model images and the number of matches may be used as the weight score.
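Both weight-score options above can be illustrated with a minimal Python sketch (the function and variable names are illustrative, not from the patent; feature points are assumed to have been reduced to hashable identifiers):

```python
def match_count(feature, models):
    """Number of model images whose feature set contains this feature point."""
    return sum(1 for points in models.values() if feature in points)

def weight_reciprocal(feature, models):
    """Weight score as the reciprocal of the number of matching model images."""
    return 1.0 / match_count(feature, models)

def weight_difference(feature, models):
    """Weight score as (total model count) minus (matching model count)."""
    return len(models) - match_count(feature, models)

# Example: feature "a" appears in one model image, "b" in both.
models = {"model1": {"a", "b"}, "model2": {"b", "c"}}
```

With either formula, the rarer feature "a" receives a larger weight than the shared feature "b", matching the requirement that the score increase as the number of matching model images decreases.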
- it is also preferable that the identification unit calculates, for each of the plurality of model images, a second similarity (hereinafter also referred to as a simple similarity) according to the ratio of the feature points of the model image that are included among the feature points of the input image, and determines the model image that best matches the input image based on the first similarity and the second similarity.
- the model image having the largest simple sum or weighted sum of the first similarity and the second similarity may be determined as the model image that best matches the input image.
- alternatively, among the model images having a high second similarity, the model image having the highest first similarity may be determined as the model image that best matches the input image.
- the model images having a high second similarity may be a predetermined number of top-ranked model images by second similarity, or model images whose second similarity is within a predetermined threshold from the maximum value.
- the present invention can be understood as an image identification device or an object identification device having at least a part of the above configuration.
- the present invention can also be understood as an image sensor having a camera that captures an object and an object discrimination device or an image processing device.
- the present invention can also be understood as an image identification method or an object identification method including at least a part of the above processing, a program for causing a computer to execute the method, or a computer-readable recording medium in which the program is non-transitorily stored.
- according to the present invention, identification can be performed with a low processing load and high accuracy even when similar model images are included.
- the present invention relates to an image identification technique for extracting, by template matching, the image that best matches a given input image from a plurality of registered images (model images) registered in advance.
- This technology can be applied to object discrimination in an image sensor for FA, computer vision, machine vision, and the like, or a similar image search for detecting an image similar to a query image from a group of images in an image database.
- hereinafter, an example will be described in which the present invention is implemented in an FA image sensor that detects and discriminates individual workpieces in a mixed flow line in which a plurality of types of workpieces flow.
- the image sensor 1 is a system that is installed in a production line or the like and performs type discrimination of the workpiece 2 using an input image obtained by imaging a product (work 2).
- the image sensor 1 can be mounted with various image processing functions as necessary, such as edge detection, scratch / dirt detection, and area / length / centroid measurement.
- a PLC (Programmable Logic Controller) 4 is a device that controls a manufacturing apparatus (not shown) such as the image sensor 1, the conveyor 3, and a robot.
- the image sensor 1 generally includes a camera 11 and an image processing device 10.
- the camera 11 is a device for capturing the image of the workpiece 2 into the image processing apparatus 10, and for example, a CMOS (Complementary Metal-Oxide-Semiconductor) camera or a CCD (Charge-Coupled Device) camera can be suitably used.
- the format of the input image is arbitrary, and may be appropriately selected according to the type of workpiece 2 and the purpose of sensing.
- when a special image other than a visible light image, such as an X-ray image or a thermal image, is to be handled, a camera suited to that image may be used.
- the image processing apparatus 10 includes a CPU (Central Processing Unit) 110, a main memory 112 and a hard disk 114 as storage units, a camera interface 116, an input interface 118, a display controller 120, a PLC interface 122, a communication interface 124, and a data reader/writer 126. These units are connected to each other via a bus 128 so that data communication is possible.
- the camera interface 116 is a part that mediates data transmission between the CPU 110 and the camera 11, and has an image buffer 116 a for temporarily storing image data from the camera 11.
- the input interface 118 mediates data transmission between the CPU 110 and an input unit (mouse 13, keyboard, touch panel, jog controller, etc.).
- the display controller 120 is connected to a display 12 such as a liquid crystal monitor, and controls display on the display 12.
- the PLC interface 122 mediates data transmission between the CPU 110 and the PLC 4.
- the communication interface 124 mediates data transmission between the CPU 110 and a console (or a personal computer or a server device).
- the data reader / writer 126 mediates data transmission between the CPU 110 and the memory card 14 as a storage medium.
- the image processing apparatus 10 can be configured by a computer having a general-purpose architecture, and the CPU 110 provides various functions by reading and executing a program (instruction code) stored in the hard disk 114 or the memory card 14.
- Such a program is distributed while being stored in a computer-readable recording medium such as the memory card 14 or an optical disk.
- the program according to the present embodiment may realize its target functions by using program modules provided by the OS, which supplies the basic functions of the computer.
- the program according to the present embodiment may be provided as a single application program, or may be provided as a module incorporated in a part of another program. Further, a part or all of the functions may be replaced with a dedicated logic circuit.
- FIG. 3 shows a functional configuration related to type discrimination (object discrimination) provided by the image processing apparatus.
- the image processing apparatus 10 includes an image input unit 130, a detection unit 131, a feature point extraction unit 132, an identification unit 133, a storage unit 134, and an output unit 135 as functions related to type determination. These functional blocks are realized by the CPU 110 of the image processing apparatus 10 executing a computer program.
- as the model image feature points 134a, feature points extracted from model images are registered for a plurality of models.
- feature points based on colors and shapes may also be adopted. For example, blue, yellow, and red can be used as feature points, and the number of straight lines or the number of circles can be used as feature points. Further, feature points having a large change in gray value and color feature points can be used in combination.
- the model image feature point 134a can be extracted from the received model image by the image processing apparatus 10 receiving the model image. Alternatively, data regarding model image feature points extracted by another apparatus may be received by the image processing apparatus 10 and stored in the storage unit 134.
- FIG. 4 is a flowchart showing the overall flow of the image identification process executed by the image processing apparatus 10. The operation of each functional block in the image identification process and the overall flow of the type determination process will be described below with reference to the flowchart of FIG.
- FIG. 5A is an example of the captured image, and shows a state where five types of objects (for example, assorted chocolate) are mixed on the conveyor.
- the detection unit 131 detects each workpiece 2 from the input image (step S101). Any algorithm may be used for the detection process. For example, after separating the background by binarization, a region (pixel group) larger than a predetermined area can be detected as the workpiece 2, or a work-like region can be searched for by pattern matching.
- the detection unit 131 may perform preprocessing such as smoothing and noise removal on the input image as necessary.
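The binarization-based detection described above can be sketched in plain Python. This is a simplified stand-in for the detection unit 131, not the patent's implementation; the threshold, 4-connectivity, and list-of-rows image layout are all assumptions:

```python
def detect_regions(img, thresh, min_area):
    """Binarize a 2D grayscale image (list of rows) and return 4-connected
    pixel groups brighter than `thresh` containing at least `min_area` pixels."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if img[y][x] > thresh and not seen[y][x]:
                # Flood-fill one connected component of bright pixels.
                stack, comp = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    comp.append((cy, cx))
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and img[ny][nx] > thresh and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # Keep only regions larger than the predetermined area.
                if len(comp) >= min_area:
                    regions.append(comp)
    return regions
```

Each returned pixel group corresponds to one candidate work region, which the later steps treat as a work image.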
- FIG. 5B is an example of the detection result, and each work area detected from the image is indicated by a dotted rectangle.
- the feature point extraction unit 132 analyzes the image of each detected workpiece (hereinafter referred to as a work image) and extracts feature points (step S102).
- a feature point is a point or a region having a large change in gray value in an image. For example, an edge or a corner can be extracted as a feature point.
- the identification unit 133 compares the feature point of the workpiece with each feature point of the plurality of model images registered in the storage unit 134 to determine the type of the workpiece (step S103). By repeating the processing of steps S102 and S103 for all the workpieces detected in step S101, the type of each workpiece can be determined.
- FIG. 5C shows an example of the discrimination result, and the discrimination result (type) of each workpiece is indicated by a number from 1 to 5.
- the determination result is output to the display 12 or the PLC 4 by the output unit 135 (step S104).
- the following describes the details of the processing in step S103.
- in this embodiment, two scores are used: a simple similarity score S1 (the second similarity), based on the number of simple matches between the feature points of the work image and the feature points of the model image, and a weighted similarity score S2 (the first similarity), which takes into account a weight score corresponding to the saliency of each feature point.
- the simple similarity score calculation unit 133a of the identification unit 133 initializes a variable n representing the number of matching feature points to 0 when processing of a new model image is started (step S201).
- the simple similarity score calculation unit 133a determines, for each feature point in the model image, whether or not a corresponding feature point exists in the work image (step S202), and if it exists, increments the matching number n (step S203).
- the simple similarity score calculation unit 133a repeats the above processing for all feature points, and calculates the simple similarity score S1 as the ratio of the number n of feature points having corresponding feature points in the work image to the number N of all feature points in the model image (step S205).
- the simple similarity score S1 for the model image "1" is "100" (n/N multiplied by 100). Similarly, for the model images "2" to "5", the simple similarity score S1 is "100". For the model image "6", since only four of its six feature points are present in the work image, the simple similarity score S1 is "67".
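The computation of S1 in steps S201 to S205 can be sketched as follows (a hedged illustration: feature points are assumed to have been reduced to comparable identifiers, and the rounding to whole points matches the "67" in the text):

```python
def simple_similarity(model_points, work_points):
    """S1 = 100 * n / N, where n is the number of model feature points that
    have a corresponding feature point in the work image and N is the total
    number of feature points in the model image."""
    n = sum(1 for p in model_points if p in work_points)
    return round(100.0 * n / len(model_points))

# The "6" model above: only 4 of its 6 feature points are in the work image.
print(simple_similarity(set(range(6)), {0, 1, 2, 3}))  # -> 67
```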
- when collating whether a feature point corresponding to each feature point of each model image is present in the work image, the simple similarity score calculation unit 133a stores which model images each feature point of the work image is included in (step S204). A specific description will be given with reference to FIG. 8.
- assume that the die showing "5" is the work image; five feature points are extracted from it. As indicated by black circles in column 300 of FIG. 8, these are designated feature points 1 to 5 (for example, feature point 1 is the upper-left eye (dot)).
- when this work image and the model image of "3" are collated, corresponding feature points exist in the model image for feature points 1, 3, and 5. Therefore, in step S204, this correspondence is stored as shown in FIG. 8.
- in this embodiment, a value corresponding to the reciprocal of the number of corresponding model images is used as the weight score.
- the determination method may be arbitrary as long as the weight score becomes larger as the number of model images decreases.
- the weight score of a feature point may be determined by a monotone decreasing function using the number of model images including the feature point as a variable.
- feature points that appear in many model images have low saliency and contribute little to image identification. Conversely, feature points shared by few model images have high saliency and contribute greatly to image identification. Therefore, by setting a larger weight score for feature points used by fewer model images, a similarity that takes the saliency of the feature points into account can be calculated.
- next, the weighted similarity score calculation unit 133b calculates the weighted similarity score S2 between the work image and each model image using the weight scores. Specifically, the sum of the weight scores of the feature points in the model image that are present in the work image is calculated as the weighted similarity score S2 (step S207). As shown in FIG. 8, for the model image "1", only feature point 3 matches, and the weight score of feature point 3 is "33", so the weighted similarity score S2 between the work image (the "5" die) and the model image "1" is "33".
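The summation in step S207 can be sketched as follows (the weight values and point identifiers are illustrative assumptions, chosen to mirror the model-"1" example above):

```python
def weighted_similarity(model_points, work_points, weight):
    """S2 = sum of the weight scores of the model feature points that are
    present in the work image."""
    return sum(weight[p] for p in model_points if p in work_points)

# Model "1" has a single feature point (3); its weight score is 33,
# so S2 against the "5" work image is 33. Weights here are assumed values.
weights = {1: 33, 2: 20, 3: 33, 4: 20, 5: 33}
print(weighted_similarity({3}, {1, 2, 3, 4, 5}, weights))  # -> 33
```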
- the identification unit 133 then calculates the similarity (identification score) between the work image and each model image (step S208).
- the weighted similarity score S2 is normalized so that the maximum value is 100, and the sum of the normalized weighted similarity score S2 and the simple similarity score S1 is used as the identification score.
- FIG. 9 shows a simple similarity score S1, a weighted similarity score S2, and an identification score with each model image when the work image is a dice “5”.
- for the weighted similarity score S2, the normalized score is shown without parentheses, and the raw value obtained in step S207 is shown in parentheses.
- the identification unit 133 determines that the model image with the highest identification score is the best match with the workpiece image, and identifies the type of workpiece (S209). In the example of FIG. 9, since the identification score for the model image “5” is the maximum, the work image is determined to be the model image “5” (the image of the “5” eye of the dice).
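Putting steps S205 to S209 together, an end-to-end sketch might look like the following. The normalization detail (scaling S2 so its maximum is 100) follows the text; the per-feature weights and tie handling are assumptions:

```python
def identify(work_points, models, weight):
    """Return the model whose identification score (S1 + normalized S2) is largest."""
    s1, s2 = {}, {}
    for name, pts in models.items():
        matched = [p for p in pts if p in work_points]
        s1[name] = 100.0 * len(matched) / len(pts)   # simple similarity score
        s2[name] = sum(weight[p] for p in matched)   # weighted similarity score
    s2_max = max(s2.values()) or 1.0                 # avoid division by zero
    # Normalize S2 so its maximum is 100, then add S1 to form the identification score.
    score = {name: s1[name] + 100.0 * s2[name] / s2_max for name in models}
    return max(score, key=score.get)

# A dice-like toy example: "C" is the shared center eye, corners are rarer.
weight = {"C": 33, "TL": 50, "TR": 100, "BL": 100, "BR": 50}
models = {"1": {"C"}, "3": {"TL", "C", "BR"}, "5": {"TL", "TR", "C", "BL", "BR"}}
print(identify({"TL", "TR", "C", "BL", "BR"}, models, weight))  # -> 5
```

Although models "1", "3", and "5" all reach a simple similarity of 100 against the "5" work image, the normalized weighted score separates them, as in the FIG. 9 example.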
- the calculation results of the simple similarity score, the weighted similarity score, and the identification score when the "1" to "4" and "6" eyes of the dice are the input images are shown in FIGS. 10(A) to (C) and 11(A) to (B), respectively.
- as shown in FIGS. 10(C), 11(A), and 11(B), when the dice showing the "3", "4", and "6" eyes are the input images, the simple similarity score alone produces no difference and correct identification cannot be performed with it, but correct identification is possible by using an identification score that takes the weighted similarity score into account.
- conversely, there are cases where the weighted similarity score is the same for a plurality of model images but the difference in the simple similarity score is large, so that sufficient identification is still possible.
- in the above example, the simple sum of the simple similarity score and the weighted similarity score is used as the identification score, but a value obtained by adding these two similarity scores at a predetermined ratio (a weighted sum) may be used as the identification score instead.
- the weight ratio in this case may be a predetermined value or a value set by the user.
- alternatively, the object type may be determined using the simple similarity score and the weighted similarity score without obtaining an identification score. For example, a predetermined number of model images having a large simple similarity score may first be selected, and the model image having the highest weighted similarity score among them may be determined to match the input image.
- the predetermined number of model images having a large simple similarity score may be a predetermined number of top-ranked model images by simple similarity score, or model images whose simple similarity score is within a predetermined threshold from the maximum value.
- the threshold value in this case may be specified by an absolute value of the score, or may be specified by a ratio with respect to the maximum value.
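The two-stage selection just described might be sketched as follows (the cutoff k and the score dictionaries are illustrative; the text leaves the exact selection rule open):

```python
def identify_two_stage(s1, s2, k=3):
    """Pick the k models with the highest simple similarity score S1, then
    return the one with the highest weighted similarity score S2 among them."""
    candidates = sorted(s1, key=s1.get, reverse=True)[:k]
    return max(candidates, key=lambda name: s2[name])

# Models "a" and "b" tie on S1, so S2 decides between them.
s1 = {"a": 100, "b": 100, "c": 67}
s2 = {"a": 33, "b": 333, "c": 200}
print(identify_two_stage(s1, s2, k=2))  # -> b
```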
- in the above description, the determination is performed based on both the simple similarity score and the weighted similarity score, but the determination may also be performed based only on the weighted similarity score. As described above, correct discrimination may not always be possible with the weighted similarity score alone, but it is possible when an image for which the simple similarity score does not produce a sufficient difference is input.
- the weight score for each feature point is calculated when matching the input images.
- the weight score may be calculated when the model image is registered. That is, the weight score for each feature point can be obtained by performing the same weight score calculation processing as described above using each model image as an input image.
- the weighted similarity score can then be calculated by adding up the weight scores of the feature points in the same manner as described above. This approach provides the same effect as the above embodiment.
- the present invention can also be applied to image identification for three-dimensional images (stereoscopic images) to which depth information is added.
- in this case, both the model image and the input image may be captured using an imaging device with a distance measuring function, such as a stereo camera, and processed in the same manner as described above.
- the identification processing considering the saliency of the feature points according to the present invention can be applied not only to still images but also to time-series data such as moving images and audio data.
Abstract
Description
With reference to FIG. 1, the overall configuration and application scene of the image sensor according to an embodiment of the present invention will be described.
The image sensor 1 is a system that is installed in a production line or the like and determines the type of a workpiece 2 using an input image obtained by imaging a product (workpiece 2). In addition to type determination, various image processing functions such as edge detection, scratch/dirt detection, and area/length/centroid measurement can be implemented in the image sensor 1 as necessary.
With reference to FIG. 2, the hardware configuration of the image sensor 1 will be described. The image sensor 1 generally includes a camera 11 and an image processing device 10.
FIG. 3 shows the functional configuration related to type determination (object discrimination) provided by the image processing device. The image processing device 10 includes an image input unit 130, a detection unit 131, a feature point extraction unit 132, an identification unit 133, a storage unit 134, and an output unit 135 as functions related to type determination. These functional blocks are realized by the CPU 110 of the image processing device 10 executing a computer program.
The image processing device 10 captures images of the workpieces 2 flowing on the conveyor 3 and executes processing such as workpiece detection and type determination. FIG. 4 is a flowchart showing the overall flow of the image identification processing executed by the image processing device 10. The operation of each functional block in the image identification processing and the overall flow of the type determination processing will be described below along the flowchart of FIG. 4.
The configuration of the embodiment described above is merely a specific example of the present invention and is not intended to limit the scope of the present invention. The present invention can take various specific configurations without departing from its technical idea. For example, although the above embodiment applies the present invention to an object discrimination device, the scope of application is not limited to this, and application to a similar-image search device or the like is also preferable.
10: image processing device, 11: camera, 12: display
130: image input unit, 131: detection unit, 132: feature point extraction unit, 133: identification unit, 134: storage unit, 135: output unit
Claims (10)
- 1. An image identification device comprising:
a storage unit in which feature points are registered for each of a plurality of model images;
a feature point extraction unit that extracts feature points from an input image; and
an identification unit that calculates a first similarity between the input image and each model image by comparing the feature points of the input image with the feature points of each model image, and identifies the input image based on the first similarity,
wherein the identification unit calculates the first similarity by adding together, for each feature point of the input image that matches a feature point of the model image subject to the first similarity calculation, a score corresponding to that feature point, and
the score corresponding to a feature point takes a larger value as the number of model images containing that feature point among the plurality of model images is smaller.
- 2. The image identification device according to claim 1, wherein the identification unit obtains, for each feature point of the input image, the number of model images that match that feature point, and calculates the score corresponding to the feature point based on that number.
- 3. The image identification device according to claim 1, wherein the identification unit obtains, for each feature point of the plurality of model images, the number of model images that match that feature point, and calculates the score corresponding to the feature point in advance based on that number.
- 4. The image identification device according to any one of claims 1 to 3, wherein the identification unit also calculates, for each of the plurality of model images, a second similarity according to the ratio of the feature points of the model image that are included among the feature points of the input image, and determines the model image that best matches the input image based on the first similarity and the second similarity.
- 5. The image identification device according to claim 4, wherein the identification unit determines the model image having the largest simple sum or weighted sum of the first similarity and the second similarity as the model image that best matches the input image.
- 6. The image identification device according to claim 4, wherein the identification unit determines, among the model images having a high second similarity, the model image having the highest first similarity as the model image that best matches the input image.
- 7. An image sensor comprising: a camera that captures an object; and the image identification device according to any one of claims 1 to 6, which determines the type of the object from an image input from the camera and outputs the result.
- 8. An image identification method for identifying a model image that matches an input image, wherein a computer having a storage unit in which feature points are registered for each of a plurality of model images executes:
a feature point extraction step of extracting feature points from the input image; and
an identification step of calculating a first similarity between the input image and each model image by comparing the feature points of the input image with the feature points of each model image, and identifying the input image based on the first similarity,
wherein, in the identification step, the first similarity is calculated by adding together, for each feature point of the input image that matches a feature point of the model image subject to the first similarity calculation, a score corresponding to that feature point, and
the score corresponding to a feature point takes a larger value as the number of model images containing that feature point among the plurality of model images is smaller.
- 9. A program causing a computer to execute each step of the image identification method according to claim 8.
- 10. A computer-readable storage medium non-transitorily storing the program according to claim 9.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/121,886 US10055670B2 (en) | 2014-03-14 | 2014-03-14 | Image recognition device, image sensor, and image recognition method using feature |
PCT/JP2014/057003 WO2015136714A1 (ja) | 2014-03-14 | 2014-03-14 | 画像識別装置、画像センサ、画像識別方法 |
EP14885695.8A EP3118813B1 (en) | 2014-03-14 | 2014-03-14 | Image recognition device, image sensor, and image recognition method |
JP2016507250A JP6176388B2 (ja) | 2014-03-14 | 2014-03-14 | 画像識別装置、画像センサ、画像識別方法 |
CN201480076713.4A CN106062820B (zh) | 2014-03-14 | 2014-03-14 | 图像识别装置、图像传感器、图像识别方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/057003 WO2015136714A1 (ja) | 2014-03-14 | 2014-03-14 | 画像識別装置、画像センサ、画像識別方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015136714A1 true WO2015136714A1 (ja) | 2015-09-17 |
Family
ID=54071191
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/057003 WO2015136714A1 (ja) | 2014-03-14 | 2014-03-14 | 画像識別装置、画像センサ、画像識別方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10055670B2 (ja) |
EP (1) | EP3118813B1 (ja) |
JP (1) | JP6176388B2 (ja) |
CN (1) | CN106062820B (ja) |
WO (1) | WO2015136714A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019092782A1 (ja) * | 2017-11-07 | 2019-05-16 | 日本電気株式会社 | 情報処理装置、制御方法、及びプログラム |
JP2019192005A (ja) * | 2018-04-26 | 2019-10-31 | 株式会社日立製作所 | 物体認識装置および方法 |
CN111353419A (zh) * | 2020-02-26 | 2020-06-30 | 北京百度网讯科技有限公司 | 图像比对方法、装置、电子设备及存储介质 |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6333871B2 (ja) * | 2016-02-25 | 2018-05-30 | ファナック株式会社 | Image processing device for displaying an object detected from an input image |
CN108460389B (zh) * | 2017-02-20 | 2021-12-03 | 阿里巴巴集团控股有限公司 | Type prediction method and apparatus for recognizing an object in an image, and electronic device |
CN108694398B (zh) * | 2017-04-06 | 2020-10-30 | 杭州海康威视数字技术股份有限公司 | Image analysis method and apparatus |
JP6889865B2 (ja) * | 2017-09-22 | 2021-06-18 | オムロン株式会社 | Template creation device, object recognition processing device, template creation method, and program |
JP6687591B2 (ja) * | 2017-12-26 | 2020-04-22 | ファナック株式会社 | Article transport device, robot system, and article transport method |
JP6763914B2 (ja) * | 2018-06-08 | 2020-09-30 | ファナック株式会社 | Robot system and control method for robot system |
JP7022036B2 (ja) | 2018-08-31 | 2022-02-17 | ファナック株式会社 | Article take-out system |
US11200632B2 (en) * | 2018-11-09 | 2021-12-14 | Canon Kabushiki Kaisha | Image processing method and image processing apparatus |
JP7186128B2 (ja) * | 2019-04-24 | 2022-12-08 | 株式会社日立製作所 | Article recognition system and article recognition method |
CN110102906A (zh) * | 2019-05-10 | 2019-08-09 | 定颖电子(黄石)有限公司 | Laser engraving coding system and laser engraving coding process |
US11893718B2 (en) * | 2019-07-12 | 2024-02-06 | Murata Machinery, Ltd. | Image recognition method and image recognition device |
CN110458044A (zh) * | 2019-07-22 | 2019-11-15 | 苏州慧润百年物联科技有限公司 | Method for judging ground flooding based on image recognition |
US11948130B2 (en) * | 2020-01-30 | 2024-04-02 | BlueOwl, LLC | Systems and methods for waste management using recurrent convolution neural network with stereo video input |
CN112950623A (zh) * | 2021-03-29 | 2021-06-11 | 云印技术(深圳)有限公司 | Shipping mark recognition method and system |
CN113591921A (zh) * | 2021-06-30 | 2021-11-02 | 北京旷视科技有限公司 | Image recognition method and apparatus, electronic device, and storage medium |
CN115170566B (zh) * | 2022-09-07 | 2022-12-13 | 湖南视比特机器人有限公司 | Workpiece recognition method and recognition device |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1566788A3 (en) * | 2004-01-23 | 2017-11-22 | Sony United Kingdom Limited | Display |
EP1722331B1 (en) * | 2004-03-03 | 2010-12-01 | NEC Corporation | Image similarity calculation system, image search system, image similarity calculation method, and image similarity calculation program |
GB2431797B (en) * | 2005-10-31 | 2011-02-23 | Sony Uk Ltd | Image processing |
CN101110100B (zh) * | 2006-07-17 | 2012-05-02 | 松下电器产业株式会社 | Method and device for detecting shapes containing arbitrary combinations of line segments |
JP5018404B2 (ja) | 2007-11-01 | 2012-09-05 | ソニー株式会社 | Image identification device, image identification method, and program |
US20100166303A1 (en) * | 2008-12-31 | 2010-07-01 | Ali Rahimi | Object recognition using global similarity-based classifier |
WO2012073421A1 (ja) * | 2010-11-29 | 2012-06-07 | パナソニック株式会社 | Image classification device, image classification method, program, recording medium, integrated circuit, and model creation device |
CN102938071B (zh) * | 2012-09-18 | 2015-06-03 | 西安电子科技大学 | Fuzzy clustering analysis method for SAR image change detection based on non-local means |
WO2014101803A1 (zh) * | 2012-12-27 | 2014-07-03 | Wang Hao | Infrared selection device and infrared selection method |
- 2014-03-14 WO PCT/JP2014/057003 patent/WO2015136714A1/ja active Application Filing
- 2014-03-14 CN CN201480076713.4A patent/CN106062820B/zh active Active
- 2014-03-14 US US15/121,886 patent/US10055670B2/en active Active
- 2014-03-14 JP JP2016507250A patent/JP6176388B2/ja active Active
- 2014-03-14 EP EP14885695.8A patent/EP3118813B1/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH033089A (ja) * | 1989-05-31 | 1991-01-09 | Meidensha Corp | Pattern recognition device |
JP2006190201A (ja) * | 2005-01-07 | 2006-07-20 | Sony Corp | Image processing system, learning device and method, and program |
JP2010113731A (ja) * | 2007-03-09 | 2010-05-20 | Omron Corp | Recognition processing method and image processing device using the method |
JP2012048593A (ja) * | 2010-08-30 | 2012-03-08 | Juki Corp | Image processing device |
Non-Patent Citations (1)
Title |
---|
See also references of EP3118813A4 * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019092782A1 (ja) * | 2017-11-07 | 2019-05-16 | 日本電気株式会社 | Information processing apparatus, control method, and program |
JPWO2019092782A1 (ja) * | 2017-11-07 | 2020-11-12 | 日本電気株式会社 | Information processing apparatus, control method, and program |
US11580721B2 (en) | 2017-11-07 | 2023-02-14 | Nec Corporation | Information processing apparatus, control method, and program |
JP2019192005A (ja) * | 2018-04-26 | 2019-10-31 | 株式会社日立製作所 | Object recognition device and method |
JP7207862B2 (ja) | 2018-04-26 | 2023-01-18 | 株式会社日立製作所 | Object recognition device and method |
CN111353419A (zh) * | 2020-02-26 | 2020-06-30 | 北京百度网讯科技有限公司 | Image comparison method and apparatus, electronic device, and storage medium |
CN111353419B (zh) | 2020-02-26 | 2023-08-11 | 北京百度网讯科技有限公司 | Image comparison method and apparatus, electronic device, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
US20170017862A1 (en) | 2017-01-19 |
EP3118813A4 (en) | 2017-11-01 |
EP3118813A1 (en) | 2017-01-18 |
JP6176388B2 (ja) | 2017-08-09 |
JPWO2015136714A1 (ja) | 2017-04-06 |
CN106062820B (zh) | 2018-12-28 |
EP3118813B1 (en) | 2021-06-02 |
CN106062820A (zh) | 2016-10-26 |
US10055670B2 (en) | 2018-08-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6176388B2 (ja) | Image recognition device, image sensor, and image recognition method | |
JP6557943B2 (ja) | Image collation device, image sensor, processing system, and image collation method | |
US20230410321A1 (en) | Information processing apparatus, control method, and program | |
US10063843B2 (en) | Image processing apparatus and image processing method for estimating three-dimensional position of object in image | |
KR101551576B1 (ko) | Robot cleaner, and apparatus and method for gesture recognition | |
WO2015115274A1 (ja) | Object discrimination device, image sensor, and object discrimination method | |
US10275682B2 (en) | Information processing apparatus, information processing method, and storage medium | |
US10713530B2 (en) | Image processing apparatus, image processing method, and image processing program | |
JP6278108B2 (ja) | Image processing device, image sensor, and image processing method | |
US9286669B2 (en) | Image processing apparatus, image processing method and program | |
JP2018081402A5 (ja) | ||
CN113297963A (zh) | Multi-person pose estimation method and apparatus, electronic device, and readable storage medium | |
JP6393495B2 (ja) | Image processing device and object recognition method | |
CN106406507B (zh) | Image processing method and electronic device | |
Ramisa et al. | Evaluation of the sift object recognition method in mobile robots | |
JP2015118582A5 (ja) | ||
CN111275693B (zh) | Method and apparatus for counting objects in an image, and readable storage medium | |
WO2017179728A1 (ja) | Image recognition device, image recognition method, and image recognition program | |
WO2015136716A1 (ja) | Image processing device, image sensor, and image processing method | |
JP2015187770A (ja) | Image recognition device, image recognition method, and program | |
KR101179401B1 (ko) | Device and method for classifying matched pairs | |
CN116883945B (zh) | Person identification and localization method fusing target edge detection and scale-invariant feature transform | |
JP7035357B2 (ja) | Computer program for image determination, image determination device, and image determination method | |
WO2020003510A1 (ja) | Identification method, determination method, identification program, determination program, and information processing device | |
Wang et al. | A New Multi-scale Harris Interesting Point Detector |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14885695 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2014885695 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014885695 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2016507250 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15121886 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |