JP2016063894A - Shape recognition device and sewing machine - Google Patents

Shape recognition device and sewing machine

Info

Publication number
JP2016063894A
JP2016063894A (Application number JP2014193316A)
Authority
JP
Japan
Prior art keywords
sewing
shape
fabric
shape recognition
image data
Prior art date
Legal status (assumed, not a legal conclusion)
Pending
Application number
JP2014193316A
Other languages
Japanese (ja)
Inventor
Tomomi Yamada (山田 友美)
Yoshiaki Abe (安部 好晃)
Current Assignee
Juki Corp
Original Assignee
Juki Corp
Priority date
Filing date
Publication date
Application filed by Juki Corp filed Critical Juki Corp
Priority to JP2014193316A priority Critical patent/JP2016063894A/en
Priority to DE102015116112.5A priority patent/DE102015116112A1/en
Priority to CN201510616070.4A priority patent/CN105447847A/en
Publication of JP2016063894A publication Critical patent/JP2016063894A/en


Classifications

    • D: TEXTILES; PAPER
    • D05: SEWING; EMBROIDERING; TUFTING
    • D05B: SEWING
    • D05B 21/00: Sewing machines with devices for automatically controlling movement of work-carrier relative to stitch-forming mechanism in order to obtain particular configuration of seam, e.g. programme-controlled for sewing collars, for attaching pockets
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/0004: Industrial image inspection
    • G06T 7/0006: Industrial image inspection using a design-rule based approach
    • D: TEXTILES; PAPER
    • D05: SEWING; EMBROIDERING; TUFTING
    • D05B: SEWING
    • D05B 19/00: Programme-controlled sewing machines
    • D05B 19/02: Sewing machines having electronic memory or microprocessor control unit
    • D: TEXTILES; PAPER
    • D05: SEWING; EMBROIDERING; TUFTING
    • D05B: SEWING
    • D05B 19/00: Programme-controlled sewing machines
    • D05B 19/02: Sewing machines having electronic memory or microprocessor control unit
    • D05B 19/12: Sewing machines having electronic memory or microprocessor control unit characterised by control of operation of machine
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00: General purpose image data processing
    • G06T 1/0014: Image feed-back for automatic industrial control, e.g. robot with camera
    • D: TEXTILES; PAPER
    • D05: SEWING; EMBROIDERING; TUFTING
    • D05D: INDEXING SCHEME ASSOCIATED WITH SUBCLASSES D05B AND D05C, RELATING TO SEWING, EMBROIDERING AND TUFTING
    • D05D 2305/00: Operations on the work before or after sewing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30108: Industrial image inspection
    • G06T 2207/30124: Fabrics; Textile; Paper

Landscapes

  • Engineering & Computer Science (AREA)
  • Textile Engineering (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Sewing Machines And Sewing (AREA)
  • Robotics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To reliably recognize the outer peripheral edge of a workpiece to be sewn regardless of the workpiece's color and material.

SOLUTION: A shape recognition device 4 recognizes, from image data captured by an imaging means 41 that images from above a first object and a second object placed one on top of the other, the boundary between the two objects. The imaging means captures the image with the upper surface of either the first or the second object placed at the focal distance. The shape recognition device 4 further includes an image recognition control section 43 that performs: a sharpening process on the captured image data, in which, for each pixel whose brightness exceeds that of the surrounding pixels, the brightness value is increased according to the difference; a binarization process on the sharpened image data; and edge extraction, thereby recognizing the boundary between the first object and the second object.

SELECTED DRAWING: Figure 6

Description

The present invention relates to a shape recognition device that recognizes the shape of a workpiece to be sewn, and to a sewing machine equipped with such a device.

Conventionally, sewing apparatuses are used in which a base fabric is clamped between a cloth-feed lower plate and a cloth-feed outer frame that can move in a horizontal plane, a workpiece is placed on the base fabric, and the workpiece is sewn onto the base fabric while being moved along a pre-programmed sewing path.
In such apparatuses, to position and hold the workpiece, the cloth-feed outer frame is equipped with a cloth holding frame in which openings such as grooves and holes corresponding to the sewing locations are formed, and the workpiece is sewn along these openings. In other cases, sewing is performed along the outer edge of a workpiece of a predetermined shape while keeping a constant distance from that edge.
However, at the stage of cutting the workpiece to the predetermined shape, its size and shape may deviate from the target because of differences in fabric stretchability, operator skill, and the like. For such a workpiece, it has not been possible to sew at a constant distance from the actual outer edge.

To solve this problem, Patent Document 1 discloses an electronic sewing machine in which "the edge position sensing device has a scanning camera mounted on a rotary bearing, the bearing is mounted on the sewing head, and the camera rotates coaxially with the axis of the needle and scans a line tangent to a virtual circle around the needle penetration point in the sewing plane." That is, Patent Document 1 describes a sewing apparatus in which an illumination device mounted on the scanning camera emphasizes the outer peripheral edge of the workpiece fixed to the base fabric, the scanning camera senses that outer peripheral edge, and sewing is performed at a seam-allowance position a constant width inside the outer peripheral edge while the sewing path of a pre-programmed sewing pattern is corrected according to the detected actual position of the workpiece.

To solve the same problem, Patent Document 2 describes a sewing apparatus in which, as a step preceding the sewing process, an outline recognition device recognizes the outer shape of the workpiece: an illumination device placed to the side of the outer peripheral edge of the workpiece emphasizes that edge against the base fabric, a CCD camera senses the outer peripheral edge, the workpiece is then transferred from the outline recognition device to an electronic sewing machine, and a seam-allowance position a constant width inside the outer peripheral edge is sewn based on the detected outer shape of the workpiece.

[Patent Document 1] Japanese Examined Patent Publication No. H7-28962
[Patent Document 2] JP H05-225315 A

However, in both of the sewing apparatuses of Patent Documents 1 and 2, an illumination device emphasizes the outer peripheral edge of the workpiece against the base fabric and a camera recognizes that edge. Depending on the color and material of the workpiece, the outer peripheral edge may not be emphasized sufficiently and may be difficult to recognize; in particular, when the workpiece and the base fabric are the same color (for example, both black), the outer peripheral edge tends not to be emphasized enough.
As a result, it has been difficult to recognize the outer peripheral edge of the workpiece accurately.

An object of the present invention is therefore to reliably recognize the outer peripheral edge of a workpiece regardless of its color and material.

The invention according to claim 1 is a shape recognition device comprising an imaging means for imaging, from above, a first object and a second object placed one on top of the other, the device recognizing the boundary between the first object and the second object from image data captured by the imaging means,
wherein the imaging means brings the upper surface of one of the first object and the second object to its focal distance, thereby imaging one object at the focal distance and the other object at a non-focal distance, and
wherein the device comprises an image recognition control section that performs, on the captured image data, a sharpening process in which, for each pixel whose brightness is greater than that of the surrounding pixels, the brightness value of that pixel is increased according to the difference, then performs a binarization process on the sharpened image data, and further performs edge extraction to recognize the boundary between the first object and the second object.

The invention according to claim 2 is the shape recognition device according to claim 1, wherein the image recognition control section performs a smoothing process on the captured image data after the sharpening process and before the binarization process.

The invention according to claim 3 is the shape recognition device according to claim 1 or 2, further comprising a height adjusting means for raising and lowering the imaging means, wherein the image recognition control section controls the height adjusting means so that the imaging means is positioned at a preset height.

The invention according to claim 4 is the shape recognition device according to any one of claims 1 to 3, further comprising an illumination device capable of irradiating the object from a plurality of irradiation positions and of adjusting the illumination intensity for each of the plurality of irradiation positions.

The invention according to claim 5 is a sewing machine comprising the shape recognition device according to any one of claims 1 to 4, wherein sewing is performed after recognizing, as the object, the outer shape of a workpiece to be sewn.

The invention according to claim 6 is the sewing machine according to claim 5, wherein sewing pattern data is generated based on the outer shape of the workpiece obtained by the shape recognition device, and stitching is performed according to that sewing pattern data.

The invention according to claim 7 is the sewing machine according to claim 5, wherein existing sewing pattern data is corrected based on the outer shape of the workpiece obtained by the shape recognition device, and stitching is performed according to the corrected sewing pattern data.

The invention according to claim 8 is the sewing machine according to claim 7, wherein the outer shape of the workpiece obtained by the shape recognition device is compared with existing sewing pattern data, and whether or not to execute sewing is determined based on the comparison result.

According to the present invention, when the upper surface of one of two vertically stacked objects is imaged at the focal distance, the in-focus object appears sharp in the image while the out-of-focus object appears blurred. Applying a sharpening process to the captured image data therefore widely separates the brightness-value distributions of the upper surface of one object and the upper surface of the other.
As a result, the edge forming the boundary between the upper surface of one object and the other object can be extracted easily and clearly.
This makes it possible to recognize the boundary well regardless of the color or material of the objects.

Fig. 1 is a perspective view of a sewing apparatus according to an embodiment of the present invention.
Fig. 2 is a front view of the sewing apparatus according to the embodiment, with the camera shown separately.
Fig. 3 is a control block diagram of the sewing apparatus according to the embodiment.
Fig. 4 is a perspective view of the cloth feed section of the sewing apparatus according to the embodiment.
Fig. 5 is a flowchart showing, in order, the control and processing performed by the image recognition control section.
Fig. 6 is a captured image of the workpiece.
Fig. 7 is a captured image of the workpiece after the sharpening process.
Fig. 8 is a captured image of the workpiece after the smoothing process.
Fig. 9 is a histogram showing the frequency distribution of brightness values over all pixels of the smoothed captured image data.
Fig. 10 is a captured image of the workpiece after the binarization process.
Figs. 11(A) to 11(C) are explanatory diagrams showing examples of sewing pattern mismatch.
Fig. 12(A) shows the extraction of feature points, and Fig. 12(B) shows corresponding feature points being compared in position.
Fig. 13(A) shows the displacement of the sewing pattern being determined, and Fig. 13(B) shows the sewing pattern being translated based on that displacement.
Fig. 14(A) shows a captured image focused on the base fabric, and Fig. 14(B) shows a captured image focused between the fabric and the base fabric.
Fig. 15 is a histogram showing the frequency distribution of brightness values over all pixels of the smoothed captured image data when two fabrics are stacked on the base fabric.

An embodiment of the present invention is described below with reference to the drawings. The following is one embodiment of the present invention and does not limit the present invention.

[Outline of the sewing machine]
As shown in Figs. 1 to 3, the sewing machine 1 of this embodiment comprises a sewing machine body 2, a cloth feed section 3, a shape recognition section 4 serving as a shape recognition device, a sewing machine control section 5, a sewing pattern database 51, and a display device 52.
The sewing machine body 2 includes a needle up-down mechanism (not shown) built into the sewing machine arm 2a that moves the sewing needle up and down, a hook mechanism built into the sewing machine bed 2b that interlaces the needle thread and the bobbin thread, and a sewing machine table 2c on which the workpiece is placed; it sews the workpiece.

The shape recognition section 4 comprises a camera 41 as an imaging means that images the workpiece as the object, an illumination device 42 that illuminates the workpiece being imaged, an image recognition control section 43 that performs the various processes described later on the image data captured by the camera 41, a setting input section 44 for configuring the shape recognition of the workpiece and the operation of the sewing machine body, and a data storage section 45.

The image recognition control section 43 controls the camera 41 and the illumination device 42 to acquire captured image data, and performs the predetermined image processing described later to recognize the shape of the workpiece.
The image recognition control section 43, the sewing machine control section 5, the setting input section 44, the data storage section 45, and the sewing pattern database 51 form an information processing system consisting of an information processing device, an information input device, and an information storage device. The hardware configuration is not limited: the image recognition control section 43 and the sewing machine control section 5 may run on the same IC chip or on separate ones, the data storage section 45 and the sewing pattern database 51 may be built on the same hard disk or IC chip or on separate ones, and the system may be housed inside the sewing machine body 2 or configured as an external device.
The display device 52 optionally provides audio output in addition to image display.

The illumination device 42 has four light sources arranged horizontally in four directions around the camera 41 (more may be used). Each light source illuminates the workpiece below from its irradiation position, and the illumination intensity of each can be adjusted individually via input from the setting input section 44.
That is, as described later, when the workpiece consists of a base fabric C2 and a fabric C1 placed on it, a shadow may appear along the outer edge of the fabric C1. Since this shadow can affect the image processing described later, the intensities of the four light sources can be adjusted to make the shadow fainter. Because the shadow appears on the side opposite a light source, adjustments such as reducing the illumination intensity on the side opposite the shadow and increasing it on the shadow side are made.

The camera 41 mainly consists of an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) image sensor and an optical system.
The optical system of the camera 41 has a single, fixed focal length and a sufficiently shallow depth of field: for example, a deviation along the optical axis equal to the thickness of the fabric C1 is enough to blur the captured image out of focus.
The camera 41 is also fitted with a motor-driven height adjusting means 46 that can raise and lower the camera 41 along the vertical (Z) direction, which is the optical-axis direction. The image recognition control section 43 controls this height adjusting means 46 so that the camera 41 is positioned at a preset height.

As shown in Fig. 4, the cloth feed section 3 comprises a cloth-feed lower plate 7, a cloth-feed outer frame 8, a cloth-feed base 10, a cloth-feed arm 11, and an XY drive unit (built into the sewing machine bed 2b).
The cloth-feed outer frame 8 is supported on the cloth-feed arm 11 so as to swing about the X axis, and the cloth-feed arm 11 is supported on the cloth-feed base 10 so as to move up and down. The cloth-feed lower plate 7 is fixed to the cloth-feed base 10 below the cloth-feed outer frame 8.
The cloth-feed base 10 can move within a horizontal plane (the X-Y plane in the figure) by means of an XY drive unit composed of motors, timing belts, timing pulleys, and the like (not shown); the cloth-feed lower plate 7 and the cloth-feed outer frame 8 move together with the cloth-feed base 10 within the horizontal plane and can be positioned arbitrarily. The sewing position is defined by the two-dimensional X-Y coordinates governing the control of the cloth feed section 3. The sewing positions for each workpiece are specified by the sewing pattern data corresponding to that workpiece, and each set of sewing pattern data is stored in the sewing pattern database 51.

The workpiece consists of a fabric C1, the first object having a predetermined outer peripheral shape, and a base fabric C2, the second object onto which the fabric C1 is sewn from above, placed one on top of the other; the cloth-feed outer frame 8, by its lowering motion, holds only the base fabric C2 against the lower plate 7. The sewing pattern data records a pattern for forming a seam inside the outer peripheral edge of the fabric C1 while maintaining a constant distance from it, that is, a seam of a similar shape somewhat smaller than the outer peripheral shape of the fabric C1.
Since the fabric C1 is not held by the cloth-feed outer frame 8, it is temporarily fixed to the base fabric C2 by gluing. The fabric C1 may instead be temporarily fixed to the base fabric C2 by any other method that does not hide its outer periphery when viewed from above: for example, it may be held by a magnet or the like placed on the upper surface of the fabric C1 inside the seam formation position (or held from above by a magnetic body, with the lower plate 81 acting as a magnet).

[Flow of the sewing operation]
The sewing operation control performed by the sewing machine control section 5 is described next.
When the operator sets the fabric C1 and the base fabric C2 between the cloth-feed lower plate 7 and the cloth-feed outer frame 8 and the start of sewing is input from the setting input section 44, the sewing machine control section 5 lowers the cloth-feed outer frame 8 to hold the base fabric C2 and reads the sewing pattern data from the sewing pattern database.
Between the input of the sewing start and the actual start of the sewing operation by each part of the sewing machine body 2, the shape recognition section 4 acquires captured image data of the fabric C1, performs shape recognition, and carries out pass/fail judgment of the shape of the fabric C1, comparison against the sewing pattern data, correction processing, and so on.

The sewing machine control section 5 then starts the up-down motion of the sewing needle and, in synchronization with it, controls the XY drive unit according to the sewing pattern data, moving the cloth-feed lower plate 7 and the cloth-feed outer frame 8 to a predetermined target position for each stitch so that stitching follows the sewing pattern corresponding to the outer peripheral shape of the fabric C1. By dropping the needle at all the needle-drop positions defined in the sewing pattern data, sewing is performed along the outer peripheral shape of the fabric C1.

[Processing by the shape recognition section]
Next, the various processes performed by the image recognition control section 43 of the shape recognition section 4 after the sewing start input are described. Fig. 5 is a flowchart showing, in order, the control and processing performed by the image recognition control section 43.
Under the control of the sewing machine control section 5, the cloth-feed lower plate 7 and the cloth-feed outer frame 8 holding the workpiece have already been moved to a prescribed imaging position.
After this movement, the image recognition control section 43 controls the height adjusting means 46 to set the camera 41 to the appropriate height (step S1). The appropriate height of the camera 41 is the height at which the focal point of its optical system falls on the upper surface of the fabric C1; this value is input in advance via the setting input section 44.
The image recognition control section 43 then controls the camera 41 to image the entire fabric C1 from above (step S3), generates captured image data containing the fabric C1 and part of the base fabric C2, and stores it in the data storage section 45.
Fig. 6 shows the captured image, in which t1 denotes the upper surface of the fabric C1 and t2 the upper surface of the base fabric C2. Because the image was captured with the camera 41, whose optical system has a shallow depth of field, the upper surface of the fabric C1, at the focal distance, is imaged sharply, while the upper surface of the base fabric C2, imaged at a non-focal distance, is visibly blurred.

Next, the image recognition control section 43 applies a sharpening process to the acquired captured image data (step S5); in this role it functions as a "sharpening processing means".
The sharpening process uses an 8-direction Laplacian filter. As a result, the brightness value of each pixel of the captured image data is converted so that, for a pixel brighter than its surrounding pixels (for example, its adjacent pixels), the brightness difference becomes still larger.
Consequently, as shown in Fig. 7, after sharpening, the region t2 of the low-sharpness base fabric C2 is dominated by pixels with small brightness values and becomes black overall, while in the region t1 of the high-sharpness fabric C1 fewer pixels have small brightness values.
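As an illustration only (the patent gives no code), this sharpening step might be sketched as follows with OpenCV and NumPy. The kernel shown is the common 8-neighbour Laplacian sharpening kernel (identity plus 8-direction Laplacian), and the input file name is hypothetical:

```python
import cv2
import numpy as np

# 8-neighbour (8-direction) Laplacian sharpening kernel: output =
# original + Laplacian, so a pixel brighter than its neighbours is
# boosted in proportion to the difference (step S5).
kernel = np.array([[-1, -1, -1],
                   [-1,  9, -1],
                   [-1, -1, -1]], dtype=np.float32)

img = cv2.imread("capture.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
sharpened = cv2.filter2D(img, -1, kernel)
```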

Further, the image recognition control section 43 applies a smoothing process to the sharpened captured image data (step S7). The smoothing process uses a 15x15 averaging filter.
As shown in Fig. 8, this removes noise across the whole image, and in the region t1 of the fabric C1, which contains many pixels with large brightness values, the scattered low-brightness pixels are pulled up, raising the brightness values across the whole region.
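Continuing the same hypothetical sketch, a 15x15 averaging filter corresponds to OpenCV's normalized box filter:

```python
import cv2

# Normalized 15x15 box filter: each pixel becomes the mean of its
# 15x15 neighbourhood, suppressing isolated noise pixels (step S7).
# 'sharpened' is the output of the sharpening sketch above.
smoothed = cv2.blur(sharpened, (15, 15))
```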

Next, the image recognition control section 43 performs an edge extraction process to find the boundary between the fabric C1 and the base fabric C2 (step S9).
Because this process must clearly separate the region t1 of the fabric C1 from the region t2 of the base fabric C2, a brightness threshold for the separation must first be determined.
To this end, the image recognition control section 43 computes the frequency distribution of brightness values over all pixels of the smoothed captured image data. As shown in the histogram of Fig. 9, this yields a distribution b1 of the brightness values of the pixels in region t1 and a distribution b2 of those in region t2. For each of the distributions b1 and b2, the brightness value at its peak is found, and the value midway between them, k1, is taken as the brightness threshold.
Using this threshold, the image recognition control section 43 then binarizes the smoothed captured image data as shown in Fig. 10.
Edge detection is then performed on the region t1 of the fabric C1 and the region t2 of the base fabric C2 using a well-known edge detection operator, and all pixels forming the boundary between the fabric C1 and the base fabric C2 are identified. The outer peripheral edge (edge e) of the fabric C1 is thereby recognized, the image recognition control section 43 functioning here as a "recognition section".
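A minimal sketch of this thresholding and edge extraction, again assuming OpenCV/NumPy and continuing from the smoothing sketch; the two-peak search shown is a naive stand-in for whatever peak detection the device actually uses, and the minimum peak separation is an assumed parameter:

```python
import cv2
import numpy as np

def midpoint_threshold(gray: np.ndarray, min_gap: int = 32) -> int:
    """Return the brightness value midway between the two dominant
    histogram peaks, mirroring the threshold k1 of Fig. 9."""
    hist = cv2.calcHist([gray], [0], None, [256], [0, 256]).ravel()
    p1 = int(np.argmax(hist))                     # strongest peak
    far = np.abs(np.arange(256) - p1) >= min_gap  # bins away from it
    p2 = int(np.argmax(np.where(far, hist, 0)))   # second peak
    return (p1 + p2) // 2

gray = smoothed  # output of the smoothing sketch above
k1 = midpoint_threshold(gray)
_, binary = cv2.threshold(gray, k1, 255, cv2.THRESH_BINARY)
# boundary pixels of the bright region t1 = outer edge of fabric C1
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_NONE)  # OpenCV 4 API
```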

Next, the image recognition control section 43 generates a sewing pattern from all the edge points making up the outer peripheral edge of the fabric C1 obtained from the captured image data.
That is, the image recognition control section 43 generates a spline curve from all the edge points (some edge points may be thinned out), obtains the outer shape of the fabric C1, and then generates a pattern tracing the sewing position formed inside that outline at a constant seam allowance. Along this trace it generates a sewing pattern consisting of all the needle-drop positions at a constant stitch pitch (called the "imaging-based sewing pattern" below).
The image recognition control section 43 then compares this imaging-based sewing pattern with the sewing pattern defined in the sewing pattern data (step S11).
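The outline-to-needle-drop step could be sketched like this, assuming SciPy for the periodic spline. The inward offset uses curve normals and assumes a counter-clockwise boundary; allowance and pitch are in whatever units the machine coordinates use:

```python
import numpy as np
from scipy.interpolate import splprep, splev

def stitch_positions(edge_pts: np.ndarray, allowance: float,
                     pitch: float) -> np.ndarray:
    """edge_pts: (N, 2) ordered boundary points (counter-clockwise).
    Returns needle-drop positions offset 'allowance' inside the
    outline, spaced 'pitch' apart along it."""
    # periodic spline through the edge points (the outline of C1)
    tck, _ = splprep([edge_pts[:, 0], edge_pts[:, 1]], s=0, per=True)
    u = np.linspace(0.0, 1.0, 2000, endpoint=False)
    x, y = splev(u, tck)
    dx, dy = splev(u, tck, der=1)
    norm = np.hypot(dx, dy)
    # inward unit normal for a counter-clockwise curve
    nx, ny = -dy / norm, dx / norm
    off = np.stack([x + allowance * nx, y + allowance * ny], axis=1)
    # resample at a constant stitch pitch along the arc length
    d = np.hypot(*np.diff(off, axis=0).T)
    s = np.concatenate([[0.0], np.cumsum(d)])
    t = np.arange(0.0, s[-1], pitch)
    return np.stack([np.interp(t, s, off[:, 0]),
                     np.interp(t, s, off[:, 1])], axis=1)
```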

To compare the sewing patterns, the image recognition control section 43 extracts feature points P1 and P2 from the imaging-based sewing pattern and from the sewing pattern defined in the sewing pattern data, respectively (Fig. 12(A)), and compares the positions of corresponding feature points P1 and P2 (Fig. 12(B)).
If all corresponding feature points are close enough to be regarded as coincident, the shape patterns are judged to match; the image recognition control section 43 notifies the sewing machine control section 5 that sewing may start, and the sewing machine control section 5 executes the sewing operation based on the sewing pattern data in the data storage section 45.
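The match decision of step S11 reduces to a tolerance test on corresponding feature points; a hypothetical sketch (the tolerance value and its unit are assumptions):

```python
import numpy as np

def patterns_match(p_img: np.ndarray, p_data: np.ndarray,
                   tol: float = 1.0) -> bool:
    """p_img, p_data: (N, 2) corresponding feature points from the
    imaging-based pattern and the stored pattern. Match if every
    pair of corresponding points lies within 'tol' of each other."""
    if p_img.shape != p_data.shape:
        return False  # e.g. differing feature-point counts
    return bool(np.all(np.hypot(*(p_img - p_data).T) <= tol))
```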

If, on the other hand, the shape patterns are judged not to match, the image recognition control section 43 corrects the sewing pattern data in the data storage section 45 (step S13).
Fig. 11 shows examples of sewing pattern mismatch, with the imaging-based sewing pattern drawn as a broken line and the sewing pattern defined in the sewing pattern data as a solid line. In Fig. 11(A) the shapes coincide but the whole pattern is displaced; in Fig. 11(B) the shapes themselves differ; in Fig. 11(C) the shapes coincide but one is tilted.
For example, when the whole sewing pattern is displaced as in Fig. 11(A), the overall displacement is determined as shown in Fig. 13(A), and the entire sewing pattern defined in the sewing pattern data is translated by that displacement as shown in Fig. 13(B).
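For the Fig. 11(A) case, the correction amounts to translating the stored pattern by the mean displacement of corresponding feature points; a sketch under the same assumptions as above:

```python
import numpy as np

def translate_to_match(stored: np.ndarray, imaged: np.ndarray) -> np.ndarray:
    """Shift the stored pattern (solid line of Fig. 11(A)) by the mean
    displacement between corresponding feature points (Fig. 13)."""
    return stored + (imaged - stored).mean(axis=0)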

When the sewing pattern is elongated in a certain direction as in Fig. 11(B), the elongation ratio is determined, and coordinate conversion is performed so that the sewing pattern based on the sewing pattern data is stretched along that direction. Alternatively, the sewing pattern defined in the sewing pattern data may simply be replaced with the imaging-based sewing pattern.
When the sewing pattern is tilted as in Fig. 11(C), the tilt angle is determined, and coordinate conversion is performed so that the sewing pattern based on the sewing pattern data is rotated by that angle.
In the cases of Figs. 11(A) and 11(C) as well, the sewing pattern defined in the sewing pattern data may simply be replaced with the imaging-based sewing pattern.
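The tilt of Fig. 11(C) can be estimated from corresponding points with a least-squares (Procrustes-style) rotation about the centroid; this is one possible way to do it, not necessarily the device's actual method:

```python
import numpy as np

def rotate_to_match(stored: np.ndarray, imaged: np.ndarray) -> np.ndarray:
    """Estimate the in-plane tilt between corresponding point sets and
    rotate the stored pattern by it about its own centroid (Fig. 11(C))."""
    cs, ci = stored.mean(axis=0), imaged.mean(axis=0)
    a, b = stored - cs, imaged - ci
    # least-squares rotation angle between the centred point sets
    theta = np.arctan2(np.sum(a[:, 0] * b[:, 1] - a[:, 1] * b[:, 0]),
                       np.sum(a[:, 0] * b[:, 0] + a[:, 1] * b[:, 1]))
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return a @ rot.T + cs
```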

When the correction of the sewing pattern data in the data storage section 45 is complete, the image recognition control section 43 notifies the sewing machine control section 5 that sewing may start, and the sewing machine control section 5 executes the sewing operation based on the corrected sewing pattern data in the data storage section 45.

[Technical effects of the embodiment]
When the camera 41 images the fabric C1 and the base fabric C2, which lie at different heights, with the focus on one of them (for example the fabric C1), the image of the upper surface of the fabric C1 becomes sharper while the upper surface of the more distant base fabric C2 becomes blurred.
By sharpening this image, the image recognition control section 43 of the sewing machine 1 can therefore widely separate the brightness-value distributions of the upper surface of the fabric C1 and the upper surface of the base fabric C2.
As a result, the boundary between the fabric C1 and the base fabric C2 can be extracted easily and clearly as an edge.
This allows the image recognition control section 43 of the sewing machine 1 to recognize the outer peripheral edge of the fabric C1 well regardless of its color or material.
It also makes it possible to secure a constant seam allowance along the actual outer peripheral edge of the fabric C1 and stitch more accurately, improving sewing quality.

Further, the image recognition control section 43 of the sewing machine 1 performs the smoothing process between the sharpening process and the edge extraction (binarization) process. The brightness of noise pixels is thereby averaged with their surroundings, reducing their influence, so the region of the fabric C1 and the region of the base fabric C2 can be separated more accurately.

The sewing machine 1 also includes the illumination device 42, which can illuminate the fabric C1 from a plurality of irradiation positions and adjust the illumination intensity of the light source at each position. It can therefore be adjusted to suppress the shadow cast by the outer peripheral edge of the fabric C1, limiting its influence and allowing the outer peripheral edge of the fabric C1 to be recognized more accurately.

The sewing machine 1 also corrects the existing sewing pattern data based on the shape of the outer peripheral edge of the fabric C1 obtained by the shape recognition section 4, and the sewing machine control section 5 controls stitching according to the corrected sewing pattern data.
Consequently, even when individual differences in the cutting or mounting of the fabric C1 cause a deviation from the sewing pattern of the existing sewing pattern data, sewing can still be performed accurately, improving sewing quality.

[Others]
The above imaging of the fabric C1 and the base fabric C2 was illustrated with the height of the camera 41 adjusted so that the fabric C1 is in focus, but any arrangement that produces a distinguishable difference in sharpness between the fabric C1 and its background, the base fabric C2, may be used.
For example, as shown in Fig. 14(A), the focus may be placed on the upper surface of the base fabric C2.
If the focal length of the camera 41 is too long, a height difference on the order of the thickness of the fabric C1 produces no distinguishable difference in sharpness, as shown in Fig. 14(B), so a shorter focal length is preferable.
If the cut edge (cross section) of the fabric C1 appears in the image, it is desirable either to adjust the intensity of each light source of the illumination device 42 so that the cross section casts no shadow, or to focus at a position slightly above the upper surface of the fabric C1.
If differences in color or material between the fabric C1 and the base fabric C2 produce a large difference in contrast in the captured image, it is desirable to focus on whichever side shows contrast more readily.

The sharpening process above was illustrated with an 8-direction Laplacian filter, but other sharpening or edge-enhancement filters may be used, as may Laplacian filters with fewer or more directions.
Likewise, the smoothing process was illustrated with a 15x15 averaging filter, but other smoothing filters may be used, the averaging window may be enlarged or reduced as appropriate, and weighted averaging (for example, weighting the pixel of interest more heavily) may be applied.

The shape recognition section 4 may also compare the shape of the outer peripheral edge of the fabric C1 obtained by imaging with the existing sewing pattern data and decide, based on the comparison result, whether or not to execute sewing.
For example, when comparing the imaging-based sewing pattern with the sewing pattern defined in the sewing pattern data, the image recognition control section 43 may perform error handling according to how they disagree. Specifically, if the sizes of the patterns differ greatly or the numbers of feature points do not match, it may notify the sewing machine control section 5 to prohibit sewing and alert the operator to check.
In this way, the operator is informed of any abnormality and prompted to confirm it, which helps prevent abnormalities from being overlooked.

In the embodiment above, the camera 41 images the fabric C1 once, but the focal position may instead be switched from the fabric C1 to the base fabric C2 and several images captured, acquiring multiple sets of captured image data. A sewing pattern may then be obtained from each set and the results merged into one, for example by averaging the coordinate values to improve accuracy, before comparison with the sewing pattern of the sewing pattern data in the data storage section 45.

In the embodiment above, the workpiece consists of two layers, the fabric C1 laid on the base fabric C2, but more layers may be stacked.
For example, when a still smaller fabric is laid on the fabric C1 and a sewing pattern is to be obtained for a three-layer workpiece, focusing on the upper surface of the bottom or top fabric allows the upper surfaces of all three layers to be distinguished. In that case the histogram shows three concentrations of the distribution, as in Fig. 15; the brightness value at the peak of each distribution is found, the intermediate brightness values are taken as thresholds k1 and k2, and the boundary of each fabric is extracted as an edge. A sewing pattern based on the outer peripheral edges of the two upper fabrics can thus be obtained.
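Extending the earlier hypothetical sketch to three layers just means segmenting with both thresholds; assuming gray, k1, and k2 from the previous sketches, with k1 < k2:

```python
import numpy as np

# Classify each pixel of the smoothed image into one of three
# brightness regions using the valley thresholds k1 < k2 (Fig. 15):
# 0 below k1, 1 between k1 and k2, 2 at or above k2. Edge extraction
# can then be run on each region boundary as in the two-layer case.
labels = np.digitize(gray, bins=[k1, k2])
```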

The shape recognition section 4 above corrects the sewing pattern defined in the sewing pattern data based on the imaging-based sewing pattern, but such an update need not be performed: the sewing machine control section 5 may instead control each part of the machine so that sewing pattern data is generated directly from the imaging-based sewing pattern and stitching follows it. In that case no correction of existing sewing pattern data is needed, simplifying and speeding up the processing.

Finally, although the embodiment above applies the shape recognition device (shape recognition section 4) to the sewing machine 1, it is not limited to this: the shape recognition device can be applied to equipment in any field that needs to recognize the shape of an object.

1 Sewing machine
4 Shape recognition section (shape recognition device)
41 Camera (imaging means)
42 Illumination device
43 Image recognition control section
46 Height adjusting means
C1 Fabric (first object)
C2 Base fabric (second object)

Claims (8)

1. A shape recognition device comprising an imaging means for imaging, from above, a first object and a second object placed one on top of the other, the device recognizing a boundary between the first object and the second object from image data captured by the imaging means, wherein the imaging means brings the upper surface of one of the first object and the second object to its focal distance, thereby imaging one object at the focal distance and the other object at a non-focal distance; and wherein the device comprises an image recognition control section that performs, on the captured image data, a sharpening process in which, for each pixel whose brightness is greater than that of the surrounding pixels, the brightness value of that pixel is increased according to the difference, then performs a binarization process on the sharpened captured image data, and further performs edge extraction to recognize the boundary between the first object and the second object.
2. The shape recognition device according to claim 1, wherein the image recognition control section performs a smoothing process on the captured image data after the sharpening process and before the binarization process.
3. The shape recognition device according to claim 1 or 2, further comprising a height adjusting means for raising and lowering the imaging means, wherein the image recognition control section controls the height adjusting means so that the imaging means is positioned at a preset height.
4. The shape recognition device according to any one of claims 1 to 3, further comprising an illumination device capable of irradiating the object from a plurality of irradiation positions and of adjusting the illumination intensity for each of the plurality of irradiation positions.
5. A sewing machine comprising the shape recognition device according to any one of claims 1 to 4, wherein sewing is performed after recognizing, as the object, the outer shape of a workpiece to be sewn.
6. The sewing machine according to claim 5, wherein sewing pattern data is generated based on the outer shape of the workpiece obtained by the shape recognition device, and stitching is performed according to the sewing pattern data.
7. The sewing machine according to claim 5, wherein existing sewing pattern data is corrected based on the outer shape of the workpiece obtained by the shape recognition device, and stitching is performed according to the corrected sewing pattern data.
8. The sewing machine according to claim 7, wherein the outer shape of the workpiece obtained by the shape recognition device is compared with existing sewing pattern data, and whether or not to execute sewing is determined based on the comparison result.
JP2014193316A 2014-09-24 2014-09-24 Shape recognition device and sewing machine Pending JP2016063894A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2014193316A JP2016063894A (en) 2014-09-24 2014-09-24 Shape recognition device and sewing machine
DE102015116112.5A DE102015116112A1 (en) 2014-09-24 2015-09-23 Shape recognition device and sewing machine
CN201510616070.4A CN105447847A (en) 2014-09-24 2015-09-24 Form detection means and sewing machine

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2014193316A JP2016063894A (en) 2014-09-24 2014-09-24 Shape recognition device and sewing machine

Publications (1)

Publication Number Publication Date
JP2016063894A true JP2016063894A (en) 2016-04-28

Family

ID=55444946

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2014193316A Pending JP2016063894A (en) 2014-09-24 2014-09-24 Shape recognition device and sewing machine

Country Status (3)

Country Link
JP (1) JP2016063894A (en)
CN (1) CN105447847A (en)
DE (1) DE102015116112A1 (en)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106910220B (en) * 2017-03-01 2020-05-05 拓卡奔马机电科技有限公司 Automatic material cutting device and method
TWI646233B (en) * 2017-08-02 2019-01-01 伸興工業股份有限公司 Appliqué method based on image recognition
CN108986128A (en) * 2018-04-19 2018-12-11 芜湖圣美孚科技有限公司 A kind of direct-injection type lighting system for machine vision
DE102018207931A1 (en) * 2018-05-18 2019-11-21 Pfaff Industriesysteme Und Maschinen Gmbh Method for regulating the position of a seam course relative to structures of a sewing material
DE102021212588A1 (en) 2021-11-09 2023-05-11 Dürkopp Adler GmbH Method for creating a sewing data record as the basis of a sewing program for sewing a seam course on a sewing material having structures
CN114657712B (en) * 2022-03-11 2023-08-04 杰克科技股份有限公司 Pattern pattern optimization method based on edge recognition

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0576679A (en) * 1991-01-31 1993-03-30 Brother Ind Ltd Recognition device for pattern or the like
JPH0698331A (en) * 1991-02-25 1994-04-08 Brother Ind Ltd Recognition device for pattern and shape of cloth
JP2007263852A (en) * 2006-03-29 2007-10-11 Dainippon Printing Co Ltd Apparatus, method and processing program for detecting defect
JP2014091008A (en) * 2012-11-07 2014-05-19 Juki Corp Sewing device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05225315A (en) 1992-02-07 1993-09-03 Juki Corp External shape recognition device for object to be worked
JPH0728962A (en) 1993-07-12 1995-01-31 Sandenshi Kogyo Kk Picture recording card reutilization processor
JP3170238B2 (en) * 1997-03-24 2001-05-28 洋 古舘 SEWING SYSTEM AND SEWING METHOD
CN102547147A (en) * 2011-12-28 2012-07-04 上海聚力传媒技术有限公司 Method for realizing enhancement processing for subtitle texts in video images and device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0576679A (en) * 1991-01-31 1993-03-30 Brother Ind Ltd Recognition device for pattern or the like
JPH0698331A (en) * 1991-02-25 1994-04-08 Brother Ind Ltd Recognition device for pattern and shape of cloth
JP2007263852A (en) * 2006-03-29 2007-10-11 Dainippon Printing Co Ltd Apparatus, method and processing program for detecting defect
JP2014091008A (en) * 2012-11-07 2014-05-19 Juki Corp Sewing device

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022145797A (en) * 2018-06-20 2022-10-04 Juki株式会社 Sewing machine and sewing method
JP7316420B2 (en) 2018-06-20 2023-07-27 Juki株式会社 Sewing machine and sewing method
CN112779679A (en) * 2019-11-06 2021-05-11 Juki株式会社 Image processing device, sewing machine, and image processing method
JP2021074075A (en) * 2019-11-06 2021-05-20 Juki株式会社 Image processing device, sewing machine and image processing method
JP2021074074A (en) * 2019-11-06 2021-05-20 Juki株式会社 Image processing device, sewing machine and image processing method
JP7405565B2 (en) 2019-11-06 2023-12-26 Juki株式会社 Image processing device, sewing machine, and image processing method
JP7405564B2 (en) 2019-11-06 2023-12-26 Juki株式会社 Image processing device, sewing machine, and image processing method
CN112779679B (en) * 2019-11-06 2024-02-02 Juki株式会社 Image processing device, sewing machine, and image processing method
CN111476763A (en) * 2020-03-26 2020-07-31 创驱(上海)新能源科技有限公司 Device and method for correcting visual position

Also Published As

Publication number Publication date
CN105447847A (en) 2016-03-30
DE102015116112A1 (en) 2016-03-24


Legal Events

Date        Code  Title                                        Description
2017-08-21  A621  Written request for application examination  Free format text: JAPANESE INTERMEDIATE CODE: A621
2018-07-04  A977  Report on retrieval                          Free format text: JAPANESE INTERMEDIATE CODE: A971007
2019-03-12  A02   Decision of refusal                          Free format text: JAPANESE INTERMEDIATE CODE: A02