TW201910584A - Patch embroidering method based on image recognition which can automatically embroider an applique of any outer frame pattern without being limited to the outer frame pattern - Google Patents


Info

Publication number
TW201910584A
Authority
TW
Taiwan
Prior art keywords
image
coordinate system
embroidered
fabric
points
Prior art date
Application number
TW106126043A
Other languages
Chinese (zh)
Other versions
TWI646233B (en)
Inventor
徐坤龍
林信豪
Original Assignee
伸興工業股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 伸興工業股份有限公司 filed Critical 伸興工業股份有限公司
Priority to TW106126043A priority Critical patent/TWI646233B/en
Application granted granted Critical
Publication of TWI646233B publication Critical patent/TWI646233B/en
Publication of TW201910584A publication Critical patent/TW201910584A/en


Abstract

The invention provides a patch embroidering method based on image recognition. First, an image of a fabric to be embroidered is captured by a photographing box, the fabric being attached to a base fabric fixed to an embroidery frame. A computing device corrects the image and calculates a plurality of smoothed edge points of the fabric to be embroidered in the image, and then obtains a world coordinate corresponding to each edge point by means of image registration and fabric-thickness correction. Finally, a computerized sewing machine automatically embroiders the fabric along an embroidery path formed by the world coordinates corresponding to the edge points. The invention enables the computerized sewing machine to automatically embroider an appliqué of any outer frame pattern, not limited to the outer frame patterns pre-stored in the machine.

Description

Appliqué embroidery method based on image recognition

The present invention relates to an appliqué embroidery method, and more particularly to an appliqué embroidery method based on image recognition.

In the existing way of automatically embroidering an appliqué with a computerized sewing machine, the machine must pre-store the outer frame patterns of the appliqué, for example a circle or an animal shape; the user selects one of the pre-stored outer frame patterns, and the machine then automatically embroiders an appliqué of the selected pattern.

For example, as shown in Fig. 1(a), the user selects a circular pattern on the computerized sewing machine and has the machine automatically embroider the circular pattern 1 on a cloth member 21 fixed to an embroidery frame 23. Next, as shown in Fig. 1(b), the user cuts the circular pattern 1 out of the cloth member 21, sticks it onto a base fabric 22 fixed to the embroidery frame 23, and then has the machine automatically embroider along the outer frame of the circular pattern 1.

However, if the computerized sewing machine can only embroider appliqués of the pre-stored outer frame patterns, this is far from sufficient for the user.

Accordingly, it is an object of the present invention to provide an appliqué embroidery method based on image recognition that enables a computerized sewing machine to automatically embroider an appliqué of any outer frame pattern.

Thus, the appliqué embroidery method based on image recognition of the present invention is implemented by an embroidery system and is used to embroider a fabric to be embroidered that is attached to a base fabric. The embroidery system corresponds to a world coordinate system, and the method comprises a step (a), a step (b), a step (c), and a step (d).

Step (a) captures a first image of the fabric to be embroidered, the first image corresponding to an image coordinate system.

Step (b) performs edge detection on the first image to calculate a plurality of edge points of the fabric to be embroidered in the first image.

Step (c) calculates, from the image coordinate system and the world coordinate system, a world coordinate corresponding to each edge point in the world coordinate system.

Step (d) embroiders the fabric to be embroidered along an embroidery path formed by the world coordinates corresponding to the edge points.

The effect of the invention is to enable a computerized sewing machine to automatically embroider an appliqué of any outer frame pattern.

The appliqué embroidery method based on image recognition of the present invention is implemented by an embroidery system comprising a conventional photographing box, a computing device, and a conventional computerized sewing machine. The computing device is, for example, a desktop computer or a tablet computer. Preferably, the camera lens of the photographing box is a wide-angle lens or a fisheye lens.

Referring to Fig. 2, an embodiment of the appliqué embroidery method based on image recognition of the present invention is described in detail below.

First, in step 31, as shown in Figs. 3 and 4, the user attaches a fabric to be embroidered 41 to a base fabric 42 fixed to an embroidery frame 43; the outer frame pattern of the fabric to be embroidered 41 can be any shape, and a heart shape is used here for illustration. Then, the latch 431 of the embroidery frame 43 is engaged with the latch 53 of the photographing box 5 to fix the embroidery frame 43 in the photographing box 5, and the photographing box 5 captures a first image of the fabric to be embroidered 41.

Next, in step 32, to correct the deformation and distortion of the first image, the computing device performs image correction on the first image to obtain a first corrected image. The present invention also provides an innovative image correction method, the details of which are given at the end of this section.

Next, in step 33, referring to Fig. 5, the computing device uses an edge detection algorithm, for example the Canny edge detector, to obtain a plurality of edge points E1 of the fabric to be embroidered 41 in the first corrected image 64, and further performs a smoothing operation on these edge points to obtain another plurality of edge points. Preferably, the smoothing operation is as follows: referring to Figs. 5 and 6, the edge points E1 obtained by the edge detection algorithm are used to estimate a parametric non-uniform rational B-spline curve (NURBS curve) Cr that fits the edge points E1, and a plurality of edge points E2 are then sampled from the curve Cr; these edge points E2 are the smoothed edge points. It should be noted, however, that if the edge points E1 detected by the edge detection algorithm show no obvious burrs or jagged edges, the smoothing operation is unnecessary. Moreover, the smoothing operation is not limited to the above; for example, a general smoothing filter may also be used.
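The step-33 pipeline (edge detection, then smoothing and resampling) can be sketched as follows. This is a hedged, numpy-only stand-in: a 4-neighbour boundary test replaces the Canny detector, and a periodic moving average replaces the NURBS fit-and-resample; all function names are illustrative, not from the patent.

```python
import numpy as np

def boundary_points(mask):
    """Edge points of a binary fabric mask: foreground pixels with at
    least one background 4-neighbour (a stand-in for Canny on a clean
    segmentation)."""
    m = mask.astype(bool)
    p = np.pad(m, 1, constant_values=False)
    interior = p[:-2, 1:-1] & p[2:, 1:-1] & p[1:-1, :-2] & p[1:-1, 2:]
    return np.argwhere(m & ~interior)

def smooth_closed_contour(pts, window=5):
    """Smooth an ordered closed contour with a circular moving average
    (the patent instead fits a NURBS curve and samples points E2 from
    it; the intent, removing burrs and jagged edges, is the same)."""
    n, k = len(pts), window // 2
    idx = (np.arange(-k, k + 1)[None, :] + np.arange(n)[:, None]) % n
    return pts[idx].mean(axis=1)

# Demo: an 8x8 filled square yields 28 boundary pixels.
mask = np.zeros((10, 10), dtype=bool)
mask[1:9, 1:9] = True
edges_e1 = boundary_points(mask)   # analogous to the points E1
```

For a real fabric image one would instead run `cv2.Canny` and `cv2.findContours` (which returns the contour already ordered along the boundary) before the smoothing step.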

Next, in step 34, the computing device calculates, from an image coordinate system corresponding to the first corrected image and a world coordinate system corresponding to the computerized sewing machine, a world coordinate corresponding to each edge point E2 in the world coordinate system; this is the so-called image registration.

In detail, referring to Figs. 7 and 8, the computerized sewing machine 7 comprises a work platform 71 and a moving module 72. The moving module 72 has a body 721 and a connecting unit 722 that is connected to the body 721 and movable along it; after the embroidery frame 43 is connected to the connecting unit 722 through its latch 431, the body 721 and the connecting unit 722 can move the embroidery frame 43 in the x-axis and y-axis directions of the world coordinate system WCS corresponding to the computerized sewing machine 7 to perform embroidery. The image registration proceeds as follows. First, as shown in Fig. 8, a cloth member 8 is fixed to the embroidery frame 43, and the computerized sewing machine 7 is operated to embroider a plane coordinate system PCS on the cloth member 8. Next, the embroidery frame 43 is placed in the photographing box 5 and a second image of the plane coordinate system PCS is captured. The second image is then corrected to produce a second corrected image. Next, referring to Fig. 9, by overlaying the first corrected image 64 and the second corrected image (not shown), a rotation matrix and a displacement vector of the plane coordinate system PCS in the second corrected image relative to the image coordinate system ICS corresponding to the first corrected image 64 are calculated. Then, according to the rotation matrix, the displacement vector, and an image scale corresponding to the first corrected image 64 and the second corrected image, the coordinate of each edge point E2 of the fabric to be embroidered 41 in the image coordinate system ICS is converted into a world coordinate in the world coordinate system WCS; the unit of the image scale is, for example, micrometres/pixel, millimetres/pixel, or centimetres/pixel, i.e., the real-space length corresponding to each pixel of the first corrected image 64 or the second corrected image. More specifically, let c_i denote the coordinate of an edge point E2 in the image coordinate system ICS, c_w the world coordinate corresponding to c_i in the world coordinate system WCS, A and b the rotation matrix and the displacement vector respectively, and α the image scale; then c_w = α(A·c_i + b).
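The registration formula c_w = α(A·c_i + b) is a similarity transform and can be sketched directly. Parameterizing A by a rotation angle is an assumption for illustration; the patent only states that A and b are obtained by overlaying the two corrected images.

```python
import numpy as np

def image_to_world(points_px, angle_rad, b_px, scale):
    """Apply c_w = scale * (A @ c_i + b): rotate by `angle_rad`,
    translate by `b_px` (in pixels), then convert pixels to real
    units with `scale` (e.g. mm/pixel)."""
    A = np.array([[np.cos(angle_rad), -np.sin(angle_rad)],
                  [np.sin(angle_rad),  np.cos(angle_rad)]])
    points_px = np.atleast_2d(points_px).astype(float)
    return scale * (points_px @ A.T + b_px)
```

Applying this to every edge point E2 yields the world coordinates that form the embroidery path.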

Next, in step 35, as shown in Fig. 10, the embroidery frame 43 holding the fabric to be embroidered 41 is connected to the moving module 72 of the computerized sewing machine 7, so that the machine embroiders the fabric to be embroidered 41 along an embroidery path formed by the world coordinates corresponding to the edge points E2; the embroidery result shown in Fig. 11 is thus obtained.

In particular, the appliqué embroidery method based on image recognition described above enables the computerized sewing machine 7 to automatically embroider an appliqué of any outer frame pattern prepared by the user, no longer limited to the outer frame patterns pre-stored in the machine.

In one embodiment, if the fabric to be embroidered 41 or the base fabric 42 is relatively thick, a "fabric-thickness correction" may additionally be applied, before step 35, to the world coordinates corresponding to the edge points E2 obtained in step 34, as detailed below. Referring to Fig. 12, because the lens 51 of the photographing box 5 receives not only vertically incident light H1 but also non-vertically incident light H2 when capturing an image, the straight-line distance between the world coordinate corresponding to each edge point E2 of the first corrected image 64 and the origin O of the world coordinate system is too large; this distance therefore needs to be reduced, and the greater the sum of the thickness of the base fabric 42 and that of the fabric to be embroidered 41, the greater the reduction required.

For example, as shown in Fig. 12, suppose the base fabric 42 has thickness T0, the fabric to be embroidered 41 has thickness T1, the distance from the lens 51 of the photographing box 5 to its bottom 52 is D, and an edge point E2 in the first corrected image 64 corresponds to an actual edge point Pt of the fabric to be embroidered 41. The distance between the world coordinate of Pt calculated in step 34 and the origin O of the world coordinate system is L1, and this distance must be reduced to the actual distance L11 from Pt to the origin O; by the similar-triangle theorem, the reduction is L1 − L11 = L1 ÷ D × (T0 + T1). Similarly, the distance between the world coordinate of the actual edge point Qt of the fabric to be embroidered 41 calculated in step 34 and the origin O is L2, and it must be reduced to the actual distance L21 from Qt to O; by the similar-triangle theorem, the reduction is L2 − L21 = L2 ÷ D × T0.
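The similar-triangle reduction above amounts to shrinking each distance by the factor (1 − thickness/D); a minimal sketch (function name illustrative):

```python
def thickness_corrected_distance(L, D, thickness):
    """Fabric-thickness correction by similar triangles: an edge point
    raised by `thickness` above the box bottom appears too far from
    the origin O.  The reduction is L - L' = L / D * thickness, so
    L' = L * (1 - thickness / D).  Pass thickness = T0 + T1 for an
    edge point of the fabric to be embroidered, or T0 for a point on
    the base fabric only."""
    return L * (1.0 - thickness / D)
```

For example, with L1 = 100 mm, D = 200 mm, and T0 + T1 = 2 mm, the corrected distance is 99 mm.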

In one embodiment, the fabric-thickness correction may also be skipped for some of the world coordinates corresponding to the edge points E2 in the first corrected image 64. For example, if the computing device determines that the distance between the world coordinate corresponding to an edge point E2 and the origin O of the world coordinate system is less than a threshold value, no fabric-thickness correction is applied to that world coordinate; the correction is applied only when the distance exceeds the threshold.

The present invention provides an innovative image correction method, detailed below. Referring to Figs. 13 to 15, first, in step 321, a calibration plate is placed in the photographing box 5 and an image of the calibration plate is captured. Preferably, the calibration plate is the checkerboard calibration plate 61 shown in Fig. 14.

Next, in step 322, the computing device extracts a plurality of image feature points from the image of the calibration plate. Preferably, as shown in Fig. 15, Harris corner detection is used to extract a plurality of corner points 621 of the image 62 of the checkerboard calibration plate as the image feature points, the image feature points being floating-point values.
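Harris corner detection scores each pixel by the response R = det(M) − k·trace(M)² of the local structure tensor M. A numpy-only sketch is given below; a real implementation would use `cv2.cornerHarris` followed by sub-pixel refinement, since the feature points are floating-point values.

```python
import numpy as np

def harris_response(img, k=0.04, win=2):
    """Harris response R = det(M) - k * trace(M)^2, with the structure
    tensor M summed over a (2*win+1)^2 box window."""
    gy, gx = np.gradient(img.astype(float))        # image gradients

    def box_sum(a):                                # box filter via rolls
        out = np.zeros_like(a)
        for dy in range(-win, win + 1):
            for dx in range(-win, win + 1):
                out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
        return out

    sxx = box_sum(gx * gx)
    syy = box_sum(gy * gy)
    sxy = box_sum(gx * gy)
    return (sxx * syy - sxy ** 2) - k * (sxx + syy) ** 2

# The response peaks at corners, stays near 0 on flat areas, and goes
# negative along straight edges.
```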

Next, in step 323, the computing device calculates, from the image feature points, a geometric surface that has a plurality of control parameters and fits the image feature points. Preferably, as shown in Fig. 16, the geometric surface is a parametric non-uniform rational B-spline surface 63 (NURBS surface), estimated by the conventional parametric NURBS surface interpolation, which uses the image feature points as interpolation points; that is,

S(u,v) = [Σ_i Σ_j N_{i,p}(u)·N_{j,q}(v)·W_{i,j}·P_{i,j}] / [Σ_i Σ_j N_{i,p}(u)·N_{j,q}(v)·W_{i,j}],

where {W_{i,j}} are the weight values, {P_{i,j}} are the control points 631 (control parameters) calculated from the image feature points and likewise floating-point values, u and v are the two axes of the domain of the parametric non-uniform rational B-spline surface 63 with u ∈ [0,1] and v ∈ [0,1], {N_{i,p}(u)} and {N_{j,q}(v)} are B-spline basis functions, and p and q are the degrees in the u-axis and v-axis directions, respectively.
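The surface equation in step 323 can be evaluated directly with the Cox–de Boor recursion. The sketch below only evaluates a given NURBS surface; the interpolation step that solves for the control points P_{i,j} from the corner points is omitted, and the clamped knot vector, degrees, and 3×3 control net in the test are illustrative assumptions.

```python
import numpy as np

def bspline_basis(i, p, u, t):
    """B-spline basis function N_{i,p}(u) on knot vector t
    (Cox-de Boor recursion)."""
    if p == 0:
        if t[i] <= u < t[i + 1]:
            return 1.0
        # close the last non-empty span so u = t[-1] is covered
        return 1.0 if u == t[-1] and t[i] < t[i + 1] == t[-1] else 0.0
    out = 0.0
    if t[i + p] != t[i]:
        out += (u - t[i]) / (t[i + p] - t[i]) * bspline_basis(i, p - 1, u, t)
    if t[i + p + 1] != t[i + 1]:
        out += (t[i + p + 1] - u) / (t[i + p + 1] - t[i + 1]) \
               * bspline_basis(i + 1, p - 1, u, t)
    return out

def nurbs_point(u, v, P, W, p, q, tu, tv):
    """Evaluate S(u,v) = sum_ij N_ip(u) N_jq(v) W_ij P_ij
                         / sum_ij N_ip(u) N_jq(v) W_ij."""
    num = np.zeros(P.shape[-1])
    den = 0.0
    for i in range(P.shape[0]):
        Nu = bspline_basis(i, p, u, tu)
        if Nu == 0.0:
            continue
        for j in range(P.shape[1]):
            w = Nu * bspline_basis(j, q, v, tv) * W[i, j]
            num += w * P[i, j]
            den += w
    return num / den
```

With clamped knots and all weights equal to 1, the surface interpolates its corner control points, a quick sanity check on the basis functions.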

Next, in step 324, the computing device uses the parametric non-uniform rational B-spline surface 63 to correct an image to be corrected captured by the photographing box 5, producing a corrected image. For convenience of explanation, the image 62 of the checkerboard calibration plate shown in Fig. 15 is used below as the image to be corrected.

In detail, first, as shown in Fig. 17, a first pixel count along a first image axis (x-axis) of the corrected image 68 and a second pixel count along a second image axis (y-axis) of the corrected image 68 are defined; here, the first pixel count is k and the second pixel count is t.

Next, referring to Figs. 17 to 19, the first pixel count of sampling points {u_i | i = 1, 2, …, k} is defined on the u-axis of the domain 65 of the parametric non-uniform rational B-spline surface, and the second pixel count of sampling points {v_j | j = 1, 2, …, t} is defined on its v-axis, so that the surface points on the parametric non-uniform rational B-spline surface 63 calculated from these sampling points correspond one-to-one to the pixels of the corrected image 68. Preferably, any two adjacent sampling points on the u-axis are separated by the same distance, namely 1/k, any two adjacent sampling points on the v-axis are separated by the same distance, namely 1/t, and the surface point corresponding to (u_i, v_j) on the parametric non-uniform rational B-spline surface 63 is S((i−0.5)/k, (j−0.5)/t). That is, if f(i,j) denotes the (i,j)-th pixel of the corrected image 68, then f(i,j) corresponds to (u_i, v_j) and to the surface point S((i−0.5)/k, (j−0.5)/t), where i is a positive integer from 1 to k and j is a positive integer from 1 to t. In other words, as shown in Fig. 18, the domain 65 is divided into a number of equal squares 651 equal to the number of pixels of the corrected image 68; the squares 651 correspond one-to-one to the pixels of the corrected image 68, and the surface point corresponding to each pixel is the surface point corresponding to the centre of that pixel's square 651. Moreover, referring to Fig. 19, each square 651 of the domain 65 corresponds to a polygonal region 652 on the parametric non-uniform rational B-spline surface 63, and each polygonal region 652 contains one surface point 653 corresponding to a pixel of the corrected image 68.

Next, for each pixel f(i,j) of the corrected image 68, the pixel value of f(i,j) is calculated by interpolating at least one neighbouring pixel, in the image to be corrected 69, of the surface point 653 corresponding to the pixel f(i,j); the interpolation may be, for example, the conventional nearest-neighbour interpolation, bilinear interpolation, or higher-order interpolation. For example, for the pixel f(5,6) of the corrected image 68, its pixel value is calculated by interpolating at least one neighbouring pixel, in the image to be corrected 69, of S(4.5/k, 5.5/t), the surface point corresponding to (u_5, v_6).
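The per-pixel lookup can be vectorized: map every surface point S((i−0.5)/k, (j−0.5)/t) into the coordinates of the image to be corrected, then interpolate. A minimal bilinear sampler (equivalent in spirit to `cv2.remap` with `INTER_LINEAR`):

```python
import numpy as np

def bilinear_sample(img, xs, ys):
    """Sample a grayscale `img` at float coordinates (xs, ys); each
    output value is interpolated from the 4 surrounding pixels, as in
    the bilinear-interpolation option of step 324."""
    x0 = np.clip(np.floor(xs).astype(int), 0, img.shape[1] - 2)
    y0 = np.clip(np.floor(ys).astype(int), 0, img.shape[0] - 2)
    dx = xs - x0
    dy = ys - y0
    return ((1 - dy) * (1 - dx) * img[y0, x0] +
            (1 - dy) * dx * img[y0, x0 + 1] +
            dy * (1 - dx) * img[y0 + 1, x0] +
            dy * dx * img[y0 + 1, x0 + 1])
```

Passing the full k×t grid of mapped surface-point coordinates as `xs` and `ys` builds the corrected image in one call.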

Referring to Fig. 20, it is worth mentioning that, because each surface point is a floating-point value, if the image to be corrected 69 has M×N pixels, the image coordinate system corresponding to the image to be corrected 69 must cover the coordinate plane 9 defined by the four endpoints C1(−0.5, −0.5), C2(M−1+0.5, −0.5), C3(M−1+0.5, N−1+0.5), and C4(−0.5, N−1+0.5), so as to cover the surface points located on the boundary of the parametric non-uniform rational B-spline surface; and the centre of the (i,j)-th pixel of the image to be corrected 69 has coordinates (i−1, j−1) in this image coordinate system, where i is a positive integer from 1 to M and j is a positive integer from 1 to N.

Moreover, referring to Fig. 21, in another embodiment, the pixel value of the pixel f(5,6) of the corrected image 68 may instead be calculated as the weighted mean of the pixel values of all the pixels of the image to be corrected 69 covered by the polygonal region 652 that contains S(4.5/k, 5.5/t), the weight of each pixel being its area fraction within the polygonal region 652. For example, as shown in Fig. 21, the polygonal region 652 covers a partial area A1 of pixel P1, a partial area A2 of pixel P2, a partial area A3 of pixel P3, a partial area A4 of pixel P4, and a partial area A5 of pixel P5; letting A = A1 + A2 + A3 + A4 + A5, the weighted mean is Σ_{i=1}^{5} (A_i/A)·I(P_i), where A_i/A is the weight of pixel P_i and I(P_i) is its pixel value. In another embodiment, the weight of pixel P_i may also be defined by the distance between S(4.5/k, 5.5/t) and the centre of P_i, a shorter distance giving a larger weight.
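The area-weighted alternative reduces to a dot product of the covered pixel values with normalized area fractions; a one-function sketch (names illustrative):

```python
import numpy as np

def area_weighted_mean(values, areas):
    """Weighted mean of the pixel values covered by a polygonal
    region, with weight A_i / A for a pixel contributing covered
    area A_i (A being the total covered area)."""
    values = np.asarray(values, dtype=float)
    areas = np.asarray(areas, dtype=float)
    return float(values @ (areas / areas.sum()))
```

For example, two pixels with values 10 and 20 covering areas 1 and 3 give (1/4)·10 + (3/4)·20 = 17.5.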

In particular, any image captured by the photographing box 5, including the aforementioned first image and second image, can be corrected by means of the surface points 653 described above. Fig. 22 shows the corrected image 68 obtained by correcting the image to be corrected 69, i.e., the image 62 of the checkerboard calibration plate.

In particular, besides correcting image deformation caused by the geometric design of the camera lens's optics, the above image correction method can also correct image deformation caused by manufacturing defects of the lens elements, imprecise lens assembly, imprecise assembly of the camera's image sensor, and similar factors. In addition, deformation of the photographed object itself can also be corrected and flattened in the image.

Moreover, besides correcting the deformation of the image to be corrected, the above image correction method can also set the image size of the corrected image by setting the pixel counts of its first image axis (x-axis) and second image axis (y-axis).

Furthermore, the calibration plate may take other forms. For example, the calibration plate may be the dot calibration plate 67 shown in Fig. 23; the computing device performs image recognition on the image of the dot calibration plate 67, extracts the centre of each dot, and uses these dot centres as the image feature points. The other steps are as described above and are not repeated here.

Moreover, although the first image and the second image are preferably corrected with the image correction method of the present invention described above, the method is not limited thereto; for example, the conventional pinhole camera model may be used to correct the first image to produce the first corrected image and to obtain the world coordinate corresponding to each edge point of the fabric to be embroidered in the first corrected image.

Furthermore, it is worth mentioning that if the deformation of the images captured by the photographing box 5 is sufficiently slight, the image correction of the first image and the second image described above is not necessary.

In summary, the appliqué embroidery method based on image recognition of the present invention captures an image of the fabric to be embroidered, corrects the image, calculates a plurality of smoothed edge points of the fabric to be embroidered in it, obtains the world coordinate corresponding to each edge point through image registration and fabric-thickness correction, and then has the computerized sewing machine automatically embroider the fabric to be embroidered along the embroidery path formed by the world coordinates corresponding to the edge points; the object of the present invention is thus indeed achieved.

The above description is merely an embodiment of the present invention and is not intended to limit the scope of implementation of the invention; all simple equivalent changes and modifications made according to the claims and the specification of the present invention remain within the scope covered by the patent of the present invention.

1‧‧‧Circular pattern
21‧‧‧Cloth member
22‧‧‧Base fabric
23‧‧‧Embroidery frame
31~35‧‧‧Steps
321~324‧‧‧Steps
41‧‧‧Fabric to be embroidered
42‧‧‧Base fabric
43‧‧‧Embroidery frame
431‧‧‧Latch
5‧‧‧Photographing box
51‧‧‧Lens
52‧‧‧Bottom
53‧‧‧Latch
61‧‧‧Checkerboard calibration plate
62‧‧‧Image of the checkerboard calibration plate
621‧‧‧Corner points
63‧‧‧Parametric non-uniform rational B-spline surface
631‧‧‧Control points
64‧‧‧First corrected image
65‧‧‧Domain of the parametric non-uniform rational B-spline surface
651‧‧‧Squares
652‧‧‧Polygonal regions
653‧‧‧Surface points
67‧‧‧Dot calibration plate
68‧‧‧Corrected image
69‧‧‧Image to be corrected
7‧‧‧Computerized sewing machine
71‧‧‧Work platform
72‧‧‧Moving module
721‧‧‧Body
722‧‧‧Connecting unit
8‧‧‧Cloth member
9‧‧‧Coordinate plane
f(i,j)‧‧‧The (i,j)-th pixel of the corrected image
C1~C4‧‧‧Four endpoints of the image coordinate plane
P1~P5‧‧‧Pixels
A1~A5‧‧‧Partial area of each pixel
Cr‧‧‧Parametric non-uniform rational B-spline curve
E1‧‧‧Edge points before the smoothing operation
E2‧‧‧Edge points after the smoothing operation
WCS‧‧‧World coordinate system
ICS‧‧‧Image coordinate system
PCS‧‧‧Plane coordinate system
H1‧‧‧Vertically incident light
H2‧‧‧Non-vertically incident light
Pt‧‧‧Actual edge point of the fabric to be embroidered
Qt‧‧‧Another actual edge point of the fabric to be embroidered
O‧‧‧Origin of the world coordinate system

Other features and effects of the present invention will be clearly presented in the embodiments described with reference to the drawings, in which:
Figure 1 illustrates a conventional way of producing appliqué embroidery with a computerized sewing machine;
Figure 2 illustrates the image-recognition-based patch embroidering method of the present invention;
Figure 3 shows a fabric piece to be embroidered being attached to a base fabric fixed to an embroidery frame;
Figure 4 shows a photographing box capturing a first image of the fabric piece to be embroidered;
Figure 5 shows a plurality of edge points of the fabric piece, before the smoothing operation, in a first corrected image;
Figure 6, together with Figure 5, shows the edge points of the fabric piece after the smoothing operation in the first corrected image;
Figure 7 shows a computerized sewing machine;
Figure 8 shows the computerized sewing machine embroidering a plane coordinate system on a cloth piece;
Figure 9 shows the relative position of the plane coordinate system and an image coordinate system in an image;
Figure 10 shows the computerized sewing machine embroidering the fabric piece along an embroidery path formed by the world coordinates corresponding to the edge points;
Figure 11, together with Figure 10, shows the finished appliqué embroidery;
Figure 12 illustrates capturing the image of the fabric piece with the photographing box and the fabric-thickness correction;
Figure 13 illustrates the image correction method of the present invention;
Figure 14 shows a checkerboard calibration plate;
Figure 15 shows an image of the checkerboard calibration plate and its corner points;
Figure 16 shows a parametric non-uniform rational B-spline (NURBS) surface estimated from the corner points, together with its control points;
Figure 17 illustrates setting the number of pixels of a corrected image corresponding to an image to be corrected;
Figure 18 illustrates the domain of the parametric NURBS surface;
Figure 19, together with Figures 17 and 18, illustrates how the pixel value of each pixel of the corrected image is obtained;
Figure 20 illustrates the coordinate plane that an image coordinate system needs to cover;
Figure 21 illustrates one way of computing a pixel value of the corrected image;
Figure 22 shows the corrected image obtained after correcting the image of the checkerboard calibration plate; and
Figure 23 shows a dot calibration plate.
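The capture, edge-detection, and smoothing stages of Figures 4 through 6 can be sketched in code. The following is a minimal numpy illustration, not the patented implementation: the 4-neighbour boundary extraction stands in for the edge-detection algorithm of step (b), a circular moving average stands in for the NURBS curve smoothing of claims 2 and 3, and all function names are invented for this sketch.

```python
import numpy as np

def boundary_points(mask):
    """Extract edge pixels of a binary fabric mask (step (b)):
    a pixel is on the boundary if it is foreground and has at
    least one 4-connected background neighbour."""
    padded = np.pad(mask, 1)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    edge = mask & ~interior
    return np.argwhere(edge)  # (row, col) image coordinates

def smooth_closed_curve(points, window=5):
    """Stand-in for the NURBS fitting of claims 2-3: smooth an
    ordered closed contour with a circular moving average."""
    kernel = np.ones(window) / window
    out = np.empty_like(points, dtype=float)
    for axis in range(points.shape[1]):
        # wrap the contour so the average is circular at the seam
        wrapped = np.concatenate([points[-(window // 2):, axis],
                                  points[:, axis],
                                  points[:window // 2, axis]])
        out[:, axis] = np.convolve(wrapped, kernel, mode="valid")
    return out
```

In the patented method the smoothed points would instead be sampled from a fitted parametric NURBS curve, which preserves sharp corners better than a plain moving average.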

Claims (10)

A patch embroidering method based on image recognition, implemented by an embroidery system and used to embroider a fabric piece to be embroidered that is attached to a base fabric, the embroidery system corresponding to a world coordinate system, the method comprising the following steps: (a) capturing a first image of the fabric piece to be embroidered, wherein the first image corresponds to an image coordinate system; (b) performing edge detection on the first image to compute a plurality of edge points of the fabric piece in the first image; (c) computing, according to the image coordinate system and the world coordinate system, a world coordinate in the world coordinate system corresponding to each edge point; and (d) embroidering the fabric piece along an embroidery path formed by the world coordinates corresponding to the edge points. The patch embroidering method based on image recognition according to claim 1, wherein step (b) comprises the following sub-steps: (b1) obtaining a plurality of edge points of the fabric piece in the first image with an edge detection algorithm; and (b2) performing a smoothing operation on those edge points to obtain another plurality of edge points. The patch embroidering method based on image recognition according to claim 2, wherein in step (b2) the edge points obtained in step (b1) are first used to find a parametric non-uniform rational B-spline curve that fits them, and the other plurality of edge points is then sampled from that curve. The patch embroidering method based on image recognition according to claim 1, wherein step (c) comprises the following sub-steps: (c1) embroidering, on an object and based on the world coordinate system, a plane coordinate system corresponding to the world coordinate system; (c2) capturing a second image of the plane coordinate system; (c3) computing, by superimposing the first image and the second image, a rotation matrix and a displacement vector of the plane coordinate system in the second image relative to the image coordinate system; and (c4) converting the coordinate of each edge point in the image coordinate system into a world coordinate in the world coordinate system according to the rotation matrix, the displacement vector, and an image scale between the first image and the second image. The patch embroidering method based on image recognition according to claim 1, further comprising a step (e) performed after step (c) and before step (d): for at least one edge point, reducing the distance from the world coordinate corresponding to each such edge point to the origin of the world coordinate system according to at least one of the thickness of the base fabric and the thickness of the fabric piece to be embroidered, wherein the reduced distance is positively correlated with at least one of the thickness of the base fabric and the thickness of the fabric piece to be embroidered.
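The coordinate conversion of claim 4 and the thickness correction of claim 5 can be sketched with plain 2-D linear algebra. This is a hedged illustration under assumed conventions (a 2x2 rotation matrix, a 2-vector translation, and a scalar image-to-world scale); the function names and the tiny numerical guard are inventions of this sketch, not part of the patent.

```python
import numpy as np

def image_to_world(edge_px, R, t, scale):
    """Claim 4 sketch: map edge points (N x 2, image coordinates)
    into the world coordinate system using a rotation matrix R,
    a displacement vector t and an image-to-world scale factor."""
    return (scale * edge_px) @ R.T + t

def thickness_correction(world_pts, shrink, origin=np.zeros(2)):
    """Claim 5 sketch: pull each world coordinate toward the origin
    of the world coordinate system by a distance `shrink`, which in
    the patent grows with the base-fabric / fabric thickness."""
    vec = world_pts - origin
    dist = np.linalg.norm(vec, axis=1, keepdims=True)
    dist = np.maximum(dist, 1e-12)  # guard against a point at the origin
    factor = np.clip((dist - shrink) / dist, 0.0, None)
    return origin + vec * factor
```

Shrinking the path toward the origin compensates for the needle landing slightly inside the visible edge once the fabric thickness raises the surface seen by the camera.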
The patch embroidering method based on image recognition according to claim 1, further comprising a step (f) performed after step (a) and before step (b): performing image correction on the first image. The patch embroidering method based on image recognition according to claim 6, wherein step (f) comprises the following sub-steps: (f1) capturing an image of a calibration plate; (f2) extracting a plurality of image feature points from that image; (f3) computing, from the image feature points, a geometric surface that has a plurality of control parameters and fits the image feature points; and (f4) generating, from the first image and the geometric surface, a first corrected image corresponding to the first image, wherein each pixel of the first corrected image corresponds to a surface point on the geometric surface, and the pixel value of each pixel is computed from at least one pixel of the first image neighbouring the surface point corresponding to that pixel. The patch embroidering method based on image recognition according to claim 7, wherein in step (f3) the geometric surface is a parametric non-uniform rational B-spline (NURBS) surface, and each control parameter is a control point of that surface.
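The surface-fitting idea of claims 7 and 8 can be illustrated with a least-squares fit through calibration-plate feature points. As a stated simplification, a bivariate polynomial stands in here for the parametric NURBS surface of the patent (a NURBS surface is likewise a smooth parametric map controlled by a finite set of parameters); the function name and degree are illustrative.

```python
import numpy as np

def fit_distortion_surface(uv, xy, degree=3):
    """Claims 7-8 sketch: fit a smooth parametric surface
    (u, v) -> (x, y) through detected calibration-plate feature
    points.  `uv` holds the ideal plate coordinates of each
    feature point, `xy` the coordinates observed in the image."""
    u, v = uv[:, 0], uv[:, 1]
    # monomial basis u^i * v^j with total degree i + j <= degree
    cols = [u**i * v**j for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    A = np.stack(cols, axis=1)
    coeffs, *_ = np.linalg.lstsq(A, xy, rcond=None)

    def surface(u, v):
        terms = np.stack([u**i * v**j for i in range(degree + 1)
                          for j in range(degree + 1 - i)], axis=-1)
        return terms @ coeffs

    return surface
```

Evaluating the fitted surface at any domain point returns the distorted image location it maps to, which is exactly what the resampling step of claim 9 needs.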
The patch embroidering method based on image recognition according to claim 8, wherein step (f4) comprises the following sub-steps: (f41) defining a first number of pixels along a first image axis of the first corrected image, and defining a second number of pixels along a second image axis of the first corrected image; and (f42) defining the first number of sampling points on a first axis of the domain of the geometric surface, and defining the second number of sampling points on a second axis of the domain, wherein the surface points on the geometric surface computed from these sampling points correspond one-to-one to the pixels of the first corrected image. The patch embroidering method based on image recognition according to claim 9, wherein in step (f42) any two adjacent sampling points on the first axis of the domain of the geometric surface are equally spaced, any two adjacent sampling points on the second axis of the domain are equally spaced, and the pixel value of each pixel of the first corrected image is computed from at least one neighbouring pixel of the corresponding surface point in the first image by one of interpolation and weighted averaging.
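The resampling of claims 9 and 10 can be sketched as follows: the surface domain is sampled on a uniform grid (one sample per output pixel), each sample is pushed through the fitted surface to a location in the distorted input image, and the output pixel value is interpolated there. Bilinear interpolation is used below as one of the two value-assignment options claim 10 allows; the code assumes a single-channel image and in-bounds, non-negative sample locations, and the function names are inventions of this sketch.

```python
import numpy as np

def bilinear_sample(img, x, y):
    """Pixel value at a non-integer location (x, y) of `img`, by
    bilinear interpolation of the four neighbouring pixels."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1 = min(x0 + 1, img.shape[1] - 1)
    y1 = min(y0 + 1, img.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
    bot = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
    return (1 - fy) * top + fy * bot

def resample_corrected_image(img, surface, width, height):
    """Claims 9-10 sketch: sample the surface domain on a uniform
    width x height grid; each grid point maps through `surface`
    to a location in the distorted image, whose interpolated
    value becomes the corrected-image pixel."""
    out = np.empty((height, width), dtype=float)
    for r, v in enumerate(np.linspace(0.0, 1.0, height)):
        for c, u in enumerate(np.linspace(0.0, 1.0, width)):
            x, y = surface(u, v)
            out[r, c] = bilinear_sample(img, x, y)
    return out
```

With an identity surface the output reproduces the input, which is a quick sanity check that the sampling grid and the interpolation agree.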
TW106126043A 2017-08-02 2017-08-02 Appliqué method based on image recognition TWI646233B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW106126043A TWI646233B (en) 2017-08-02 2017-08-02 Appliqué method based on image recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW106126043A TWI646233B (en) 2017-08-02 2017-08-02 Appliqué method based on image recognition

Publications (2)

Publication Number Publication Date
TWI646233B TWI646233B (en) 2019-01-01
TW201910584A true TW201910584A (en) 2019-03-16

Family

ID=65804026

Family Applications (1)

Application Number Title Priority Date Filing Date
TW106126043A TWI646233B (en) 2017-08-02 2017-08-02 Appliqué method based on image recognition

Country Status (1)

Country Link
TW (1) TWI646233B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110390278B (en) * 2019-07-05 2022-05-31 北京大豪科技股份有限公司 Sewing material boundary identification method and device, electronic equipment and storage medium
CN114657712B (en) * 2022-03-11 2023-08-04 杰克科技股份有限公司 Pattern pattern optimization method based on edge recognition

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7324706B2 (en) * 2004-09-09 2008-01-29 Silicon Optix Inc. System and method for representing a general two dimensional spatial transformation
TWI519689B (en) * 2012-07-10 2016-02-01 鄭偉源 Embroidery and method of making the same
JP2016063894A (en) * 2014-09-24 2016-04-28 Juki株式会社 Shape recognition device and sewing machine
CN104695139A (en) * 2015-04-01 2015-06-10 华中科技大学 Industrial sewing machine system and cut part sewing processing method by same

Also Published As

Publication number Publication date
TWI646233B (en) 2019-01-01

Similar Documents

Publication Publication Date Title
JP6363863B2 (en) Information processing apparatus and information processing method
US20100194851A1 (en) Panorama image stitching
JP6899189B2 (en) Systems and methods for efficiently scoring probes in images with a vision system
JP6721112B2 (en) Camera parameter estimation device, method and program
JP6836561B2 (en) Image processing device and image processing method
TW201342304A (en) Method and system for adaptive perspective correction of ultra wide-angle lens images
JP2009042162A (en) Calibration device and method therefor
US20130058526A1 (en) Device for automated detection of feature for calibration and method thereof
JP6594170B2 (en) Image processing apparatus, image processing method, image projection system, and program
JP6721111B2 (en) Camera parameter estimation device, method and program
CN111025701A (en) Curved surface liquid crystal screen detection method
TWI646233B (en) Appliqué method based on image recognition
KR101597915B1 (en) Image processing apparatus and image processing method
WO2015101979A1 (en) Device and method with orientation indication
TWI582388B (en) Image stitching method and image stitching device
US10683596B2 (en) Method of generating an image that shows an embroidery area in an embroidery frame for computerized machine embroidery
US10619278B2 (en) Method of sewing a fabric piece onto another fabric piece based on image detection
JP6317611B2 (en) Display display pattern generating apparatus and program thereof
TWI663576B (en) Image correction method
CN115631245A (en) Correction method, terminal device and storage medium
TWI569642B (en) Method and device of capturing image with machine vision
EP3518521B1 (en) Method for realizing effect of photo being taken by others through selfie, and photographing device
JP6867766B2 (en) Information processing device and its control method, program
TWI641738B (en) Method for producing an embroidery range image of an embroidery frame
CN115239801A (en) Object positioning method and device