WO2010010767A1 - Thread measuring apparatus, measuring program, and measuring method - Google Patents

Thread measuring apparatus, measuring program, and measuring method

Info

Publication number
WO2010010767A1
WO2010010767A1 (PCT/JP2009/061175)
Authority
WO
WIPO (PCT)
Prior art keywords
yarn
color
light source
image
thread
Prior art date
Application number
PCT/JP2009/061175
Other languages
French (fr)
Japanese (ja)
Inventor
浩孝 藤崎
紘規 奥野
Original Assignee
株式会社島精機製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社島精機製作所 filed Critical 株式会社島精機製作所
Publication of WO2010010767A1 publication Critical patent/WO2010010767A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/08 Measuring arrangements characterised by the use of optical techniques for measuring diameters
    • D TEXTILES; PAPER
    • D01 NATURAL OR MAN-MADE THREADS OR FIBRES; SPINNING
    • D01H SPINNING OR TWISTING
    • D01H13/00 Other common constructional features, details or accessories
    • D01H13/32 Counting, measuring, recording or registering devices
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/2425 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures of screw-threads

Definitions

  • This invention relates to yarn measurement, and more particularly to measurement of yarn thickness and fluff amount.
  • One of the inventors previously proposed converting the yarn image into Fourier transform data or an autocorrelation function, separating the background, the yarn body, and the fluff from one another, and thereby obtaining the yarn diameter and the fluff amount.
  • While improving this technique, the inventor noticed the following problem. When imaging a yarn that contains both bright and dark parts, it is difficult to recover the dark parts of the yarn from Fourier transform data or the like against a dark background color, and conversely difficult to recover the bright parts against a bright background color. In other words, the parts of the yarn whose color is close to the background color are hard to extract from Fourier transform data or the like.
  • An object of the present invention is to make it possible to determine the yarn diameter accurately using an ordinary imaging device such as a digital camera or a scanner, and to measure the yarn properties accurately regardless of the yarn color.
  • the yarn measuring device is a device for taking a digital image of a yarn by an imaging means and obtaining a yarn diameter or fluff amount of the yarn.
  • a backlight light source that projects a background light having a variable light source color onto a thread during imaging;
  • Color extraction means for obtaining the color component of the thread from the digital image imaged by the imaging means;
  • Arithmetic means for obtaining the light source color so that a color difference of a predetermined value or more is obtained between the extracted color component and the light source color of the backlight light source;
  • Control means for controlling the backlight source according to the obtained light source color;
  • Conversion means for obtaining Fourier transform data or autocorrelation function data by digital signal processing for the digital image of the yarn imaged by the imaging means using the obtained light source color as a background color;
  • Yarn property calculating means for obtaining the yarn diameter or fluff amount of the yarn from the data obtained by the converting means.
  • The yarn measurement program is for a yarn measuring device comprising an imaging means for capturing a digital image of a yarn, a backlight light source that projects background light having a variable light source color onto the yarn, and a computer that processes the captured digital image and controls the backlight light source.
  • The program causes the computer to execute the steps of: obtaining the color components of the yarn from the digital image captured by the imaging means; obtaining the light source color so that a color difference of a predetermined value or more is obtained between the extracted color components and the light source color of the backlight light source; controlling the backlight light source according to the obtained light source color; obtaining Fourier transform data or autocorrelation function data by digital signal processing of the digital image of the yarn captured with the obtained light source color as the background color; and obtaining the yarn diameter or fluff amount of the yarn from the obtained data.
  • In the yarn measuring method, a digital image of the yarn is captured by an imaging means, and the yarn diameter or the fluff amount is obtained.
  • Imaging a digital image of the yarn by the imaging means while projecting background light having a variable light source color from the backlight light source to the yarn;
  • The control means controls the light source color of the backlight light source so as to match the obtained light source color; however, the light source color of the backlight light source need only be brought close to the obtained light source color and does not have to match it exactly.
  • the backlight light source is a color light source capable of controlling brightness and hue.
  • Preferably, the brightness of the light source color is varied to search for a brightness at which a color difference of the predetermined value or more from the yarn's color components is obtained.
  • If no such brightness is found, the hue of the light source color is varied to search for a hue at which a color difference of the predetermined value or more is obtained, and the backlight light source is controlled to the hue found by the search.
  • the description related to the yarn measuring device also applies to the yarn measuring program and the yarn measuring method
  • the description related to the yarn measuring program and the yarn measuring method also applies to the yarn measuring device.
  • the digital image of the yarn is converted into Fourier transform data or autocorrelation function data.
  • the Fourier transform data of the yarn includes data corresponding to the yarn diameter and the fluff amount, and the yarn diameter can be obtained from this.
  • The correlation width in the autocorrelation function data corresponds to the yarn diameter, and the intensity of components with very short correlation distances corresponds to the fluff amount. Therefore, in the present invention, once a digital image of the yarn is acquired with a digital camera or the like, the yarn diameter and the fluff amount can be obtained by digital signal processing alone, and no special optical system is required. Furthermore, the yarn diameter and the fluff amount can be determined even for a translucent yarn, as long as a digital image can be captured. Delicate data, such as the exact boundary between the yarn body and the fluff, is also not required.
  • the background color is controlled so that a sufficient color difference from the color component of the yarn is obtained, accurate measurement can be performed regardless of the color of the yarn. Further, the yarn diameter can be accurately obtained even with a translucent yarn, and the amount of fluff can be obtained accurately even with a translucent fluff.
  • Block diagram of the reference yarn measuring device
  • Diagram showing processing of a yarn by the two-dimensional Fourier transform
  • Diagram showing the yarn image used in the test example
  • Diagram showing the two-dimensional Fourier transform data of the yarn image
  • Block diagram of the reference yarn measurement program
  • Block diagram showing an example of the reference example applied to yarn diameter management
  • Block diagram of the yarn measuring device of the embodiment
  • Block diagram of the backlight light source control unit in the embodiment
  • Flowchart showing the backlight determination algorithm in the embodiment
  • Diagram schematically showing examples of backlight determination in the embodiment
  • Photograph of the yarn with the backlight off
  • FIGS. 1 to 10 show a yarn measuring device 2 and a yarn measuring program 80 of a reference example.
  • 4 is a bus
  • 6 is a color digital camera, and may be a color scanner
  • 8 is a color monitor
  • 10 is a keyboard
  • 12 is a color printer
  • 14 is a mouse.
  • a trackball, joystick, stylus or the like may be used instead of the mouse 14.
  • Reference numeral 16 denotes a network interface, which inputs and outputs various programs and data
  • 18 to 22 are image memories.
  • the image memory 18 stores the color image of the yarn imaged by the digital camera 6
  • the image memory 20 stores the image of the yarn body separated from the yarn image by Fourier transform
  • the image memory 22 stores the image of the fluff separated from the yarn image by Fourier transform.
  • the Fourier transform unit 24 transforms the yarn color image into a two-dimensional Fourier transform image.
  • the Fourier transform unit 24 may convert a color image into, for example, a lightness image, and may perform one-dimensional Fourier transform instead of two-dimensional Fourier transform. Further, when the color of the yarn is particularly important like melange yarn, for example, a two-dimensional Fourier transform may be performed for each of RGB components.
  • the Fourier transform image of the yarn is stored in the Fourier transform image storage unit 25.
  • the Fourier transform unit 24 also performs two-dimensional inverse Fourier transforms, applying them to the images processed by the filters to create the image of the yarn body and the image of the fluff.
  • the filter generation unit 26 generates a filter for separating the yarn body and the fluff from the Fourier transform image.
  • the binarization unit 27 binarizes the inverse Fourier transformed image, and in particular binarizes the fluff image.
  • the counter 28 counts the number of pixels in the fluff image or the yarn body image, obtaining the yarn diameter and the yarn diameter distribution or variation pattern from the yarn body, the fluff amount from the fluff, and so on.
  • the program memory 30 stores a yarn measurement program 80 and the like, and the yarn model creation unit 32 uses the yarn diameter obtained by the counter 28 and the yarn texture stored in the image memory 18 to generate a three-dimensional yarn model.
  • the simulation unit 34 uses the generated yarn model to perform simulation so as to express individual loops of the knit garment with respect to the knitting data stored in the knitting data storage unit 36.
  • FIG. 2 shows an outline of processing in the reference example.
  • the short side direction of the yarn is the x direction and the long side direction is the y direction.
  • the image obtained by the digital camera 6 includes a yarn main body image 40 and a fluff image 42, and includes an image in which the yarn main body and the fluff are mixed between the images 40 and 42.
  • the yarn body area 44 is included in the low frequency region
  • the fluff area 46 is included in the high frequency region
  • the intermediate area 48 is included therebetween.
  • one axis of the transform represents spatial frequency along the x direction,
  • and the other axis represents spatial frequency along the y direction.
  • the frequency range for performing the Fourier transform is determined according to the high frequency end of the fluff area 46.
  • the Fourier transform data has both a real component and an imaginary component, but a power spectrum may be used instead. For example, if the real component is Re(ω) and the imaginary component is I(ω), the power spectrum P(ω) is given by P(ω) = (Re²(ω) + I²(ω))^(1/2).
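The power spectrum formula above can be illustrated in a few lines of pure Python. This is a minimal sketch; the naive O(n²) DFT and the toy brightness profile are illustrative assumptions, not part of the patent:

```python
import cmath

def dft(signal):
    """Naive discrete Fourier transform (O(n^2)), enough for a sketch."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * cmath.pi * f * t / n)
                for t in range(n)) for f in range(n)]

def power_spectrum(signal):
    """P(f) = (Re(f)^2 + I(f)^2)^(1/2), i.e. the magnitude of each DFT bin."""
    # abs() of a Python complex number is exactly this magnitude.
    return [abs(c) for c in dft(signal)]

# Toy brightness profile across the yarn: dark background, bright body.
profile = [0.0] * 12 + [1.0] * 8 + [0.0] * 12
P = power_spectrum(profile)
print(round(P[0], 3))  # DC bin equals the total brightness: 8.0
```

For a real-valued profile the spectrum is symmetric, so P[1] equals P[-1]; real implementations would use an FFT instead of this quadratic-time loop.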
  • when a frequency component corresponding to the yarn main body is extracted from the two-dimensional Fourier transform image by a filter 50 that passes only the yarn main body area 44 and a two-dimensional inverse Fourier transform is performed, a yarn main body image 41 is obtained.
  • when a filter 51 that passes only the fluff area 46 is applied and a two-dimensional inverse Fourier transform is performed, fluff images 43, 43 are obtained.
  • the filter 50 passes only low frequency components along the frequency axis corresponding to the x direction,
  • and the filter 51 passes only high frequency components along that axis.
  • the narrower the band of frequencies passed by the filter 50, the smoother the yarn main body image 41 obtained;
  • narrowing the band of components passed by the filter 50 removes subtle irregularities and twists of the yarn main body.
  • a gap is provided between both ends of the frequency passed by the filter 50 and both ends of the frequency cut by the filter 51 to remove the frequency component of the intermediate area 48.
  • the intermediate area 48 may not be removed, and all the frequencies outside the frequency cut by the filter 50 may be passed by the filter 51.
  • from the yarn main body image, the yarn diameter can be obtained.
  • the yarn diameter can be obtained by counting the total number of pixels corresponding to the yarn body and dividing by the number of pixels in the length direction of the yarn body. Further, if the yarn diameter is obtained at various positions and the average value, dispersion, abnormal value, and the like are obtained, the degree of fluctuation of the yarn diameter due to twisting or knurling can be obtained. Since the fluff image 43 is generally a weak image, the amount of fluff can be obtained by separating the fluff from the background by binarization and counting the number of pixels corresponding to the fluff.
  • the total number of pixels corresponding to the fluff may be counted, or the number of pixels corresponding to the fluff may be counted on a plurality of lines along the x direction.
  • the fluff amount can also be obtained by converting the high-frequency Fourier transform components corresponding to the fluff into a fluff amount for each frequency according to a reference table (not shown) and summing over frequency.
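The pixel-counting approach for the yarn diameter described above can be sketched as follows. The tiny binary image is a hypothetical stand-in for a real binarized yarn-body image (1 = yarn pixel, 0 = background):

```python
# Hypothetical binarized yarn-body image; each inner list is one row
# along the yarn's length (the y direction of the text).
image = [
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [1, 1, 1, 0, 0],
    [0, 1, 1, 1, 0],
]

total_yarn_pixels = sum(sum(row) for row in image)
rows_with_yarn = sum(1 for row in image if any(row))

# Mean diameter in pixels = total yarn area / length, as the text describes.
mean_diameter = total_yarn_pixels / rows_with_yarn
print(mean_diameter)  # 3.0

# Per-row diameters expose the variation pattern (twist, knurling, outliers).
per_row = [sum(row) for row in image]
print(per_row)  # [3, 3, 3, 3]
```

From `per_row`, the average, variance, and abnormal values mentioned in the text follow directly.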
  • two-dimensional Fourier transform is used, but one-dimensional Fourier transform may be used.
  • with a one-dimensional Fourier transform there is no processing in the y direction, which makes the process easy to follow. In FIG. 3, 52 is a yarn main body image, and there is a fluff image 54 around it.
  • a yarn body area 56 and a fluff area 58 are obtained.
  • the filter 59 cuts out a frequency component corresponding to the yarn body
  • the filter 60 cuts out a high frequency component corresponding to the fluff.
  • a yarn main body image 53 and a fluff image 55 are obtained.
  • although rectangular filters are used in FIG. 3, filters whose transmittance changes smoothly may be used instead.
  • FIG. 4 shows filter generation.
  • the image of the yarn body has a peak near the frequency 0, and the width of the peak corresponds to the yarn diameter.
  • the bands of the filters 59 and 60 are determined from the peak shape of the yarn body area 56 in the Fourier transform. For example, the half width Δ of the peak of the yarn main body area 56 is obtained and multiplied by a first coefficient to obtain a bandwidth B1 for cutting out the yarn main body area 56; similarly, Δ is multiplied by a second coefficient to obtain a second bandwidth B2 for cutting out the fluff area 58. Bandwidth B1 is used for a low-pass filter, and bandwidth B2 for a high-pass filter.
  • a width measure other than the full width at half maximum Δ may also be used.
  • if the digital image of the yarn body is represented by a rectangle of width D,
  • its Fourier transform is approximately sinc(f·D), where sinc(x) = sin(x)/x. If the width of the yarn main body peak in FIG. 4 is known, D, that is, the yarn diameter, can be obtained; thus the yarn diameter can be determined without an inverse Fourier transform.
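The sinc relation can be checked numerically: for a rectangle of width D pixels in an N-pixel line, the magnitude of its discrete Fourier transform first reaches zero at frequency index N/D, so D can be read off the spectrum with no inverse transform. A minimal sketch (N and D are arbitrary example values):

```python
import cmath

# Model the yarn body as a rectangle of width D pixels in an N-pixel line.
N, D = 64, 8
profile = [1.0 if i < D else 0.0 for i in range(N)]

# Magnitude of the DFT; for a rectangle this follows |sinc|, with its
# first zero at frequency index N / D.
mag = [abs(sum(profile[t] * cmath.exp(-2j * cmath.pi * f * t / N)
              for t in range(N))) for f in range(N)]

first_zero = next(f for f in range(1, N // 2) if mag[f] < 1e-9)
estimated_D = N / first_zero
print(estimated_D)  # 8.0
```

The zero appears at index 8 because the 8 unit samples then wind evenly around the complex unit circle and cancel; in general the first zero index is N/D, which is the "width of the peak" the text refers to.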
  • FIGS. 5 to 8 show the processing results of the yarn image.
  • FIG. 5 is a yarn image captured by a digital camera
  • FIG. 6 is an image obtained by two-dimensional Fourier transform.
  • in FIG. 6, the two axes correspond to spatial frequency along the x and y directions of the yarn image.
  • FIG. 7 shows the image obtained by filtering the image of FIG. 6 to cut out only the low-frequency yarn body area and applying a two-dimensional inverse Fourier transform; it is an image in which the surface of the yarn body has been smoothed.
  • this image is sufficiently clear and the yarn diameter can be calculated. Note that the yarn diameter may also be calculated after binarizing the image of FIG. 7.
  • the image of FIG. 8 is a fluff image obtained by applying a filter to the image of FIG. 6.
  • the fluff image is darker than the main body image of the yarn, so the binarization threshold is set lower for the fluff than for the yarn main body. By counting the number of white pixels in FIG. 8, the fluff amount can be obtained.
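The two-threshold binarization described here can be sketched as follows. The gray levels and the two threshold values are illustrative assumptions only:

```python
# Hypothetical gray levels (0-255) for a yarn-body image and a fluff image.
body_pixels = [210, 180, 40, 220, 30, 200]
fluff_pixels = [90, 20, 70, 10, 110, 60]

BODY_THRESHOLD = 128   # the yarn body is bright and solid
FLUFF_THRESHOLD = 50   # fluff is faint, so its threshold is set lower

def binarize(pixels, threshold):
    """1 where the pixel is at or above the threshold, else 0."""
    return [1 if p >= threshold else 0 for p in pixels]

body_mask = binarize(body_pixels, BODY_THRESHOLD)
fluff_mask = binarize(fluff_pixels, FLUFF_THRESHOLD)

print(sum(body_mask))   # 4 pixels counted as yarn body
print(sum(fluff_mask))  # 4 pixels counted as fluff
```

With a single shared threshold of 128, most of the faint fluff pixels above would be lost, which is why the text lowers the threshold for the fluff image.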
  • the yarn model can be represented by a pillar such as a quadrangular prism or hexagonal prism, and can be formed into a loop by dividing its surface into polygons and bending the model along the sides of the polygons. Once the yarn diameter is determined, the diameter of the quadrangular or hexagonal prism is determined, and hence the polygons on the yarn surface are obtained. Since the texture of the yarn has already been imaged with the digital camera, it is texture-mapped onto the model.
  • FIG. 9 shows a yarn measurement program 80.
  • a two-dimensional Fourier transform command 81 performs a two-dimensional Fourier transform on the digital image of the yarn.
  • the filtering instruction 82 generates a filter from the Fourier-transformed image, and generates a filter for cutting out the yarn body and a filter for cutting out the fluff.
  • the two-dimensional inverse Fourier transform instruction 83 performs inverse Fourier transform on the filtered Fourier transform data and separates it into a yarn body image and a fluff image.
  • the binarization command 84 may binarize these images, and may omit binarization of the yarn body image as described above.
  • the count instruction 85 obtains the yarn diameter and the fluff amount, for example by counting the number of pixels corresponding to the yarn in the yarn body image and the fluff image. Since the Fourier transform and the inverse Fourier transform involve the same processing, much of their implementation can be shared.
  • Fig. 10 shows an example of application to yarn quality control.
  • Reference numeral 90 denotes a thread
  • reference numerals 91 and 92 denote thread feeding rollers.
  • the digital camera 6 captures an image of the thread.
  • the measuring device 2 obtains the yarn diameter of the captured image and stores the yarn diameter in the storage device 94.
  • the quality of the yarn 90 is controlled based on the obtained dispersion of the yarn diameter, the presence or absence of an abnormal value, and the like.
  • the amount of fluff may be measured by the measuring device 2 and stored in the storage device 94 to similarly control the amount of fluff.
  • In the reference example, the following effects can be obtained. (1) The only hardware required before signal processing is the digital camera 6; since it suffices that the yarn can be imaged by the digital camera 6, no precise adjustment is required. (2) The filter for cutting out the yarn body and the filter for cutting out the fluff can be generated automatically from the shape of the low-frequency peak corresponding to the yarn body, so both thin and thick yarns can be processed easily. (3) Since diffracted light is not used, the yarn diameter and the fluff amount can be measured even for a translucent yarn.
  • the same processing can be performed by using the correlation distance in the autocorrelation function instead of the frequency in the reference example.
  • the yarn diameter D is the width of the correlation in the autocorrelation function.
  • the width of the main correlation peak in the one-dimensional autocorrelation function gives the yarn diameter.
  • the intensity of components with correlation distance 0 or extremely short correlation distances in the autocorrelation function represents the fluff amount.
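The autocorrelation route can be illustrated with a one-dimensional brightness profile: for a rectangular yarn body of width D, the autocorrelation is a triangle that first reaches zero at lag D, so the correlation width directly gives the diameter. A minimal sketch with an invented profile:

```python
def autocorr(signal):
    """Unnormalized autocorrelation of a 1-D brightness profile."""
    n = len(signal)
    return [sum(signal[i] * signal[i + lag] for i in range(n - lag))
            for lag in range(n)]

# Rectangle of width D on a dark background: its autocorrelation is a
# triangle that reaches zero exactly at lag D.
D = 6
profile = [0.0] * 10 + [1.0] * D + [0.0] * 10

ac = autocorr(profile)
width = next(lag for lag in range(1, len(ac)) if ac[lag] == 0)
print(width)  # 6, the correlation width = yarn diameter in pixels
```

The zero-lag value `ac[0]` is the total signal energy; in a real image, fluff adds weakly correlated texture whose contribution concentrates at very short lags, which is what the text uses as the fluff measure.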
  • FIGS. 11 to 21 show the embodiment and its results.
  • FIG. 11 shows a configuration of the embodiment.
  • Reference numeral 100 denotes a backlight light source control unit which controls the backlight light source 102, and the backlight light source 102 gives background light to the yarn to be measured.
  • the backlight light source control unit 100 obtains approximate color components of the yarn 90 from the image from the digital camera 6, and controls the light source color of the backlight light source 102 so that the yarn's own color and the background color (the light source color) are a sufficient distance apart in the color space.
  • when the yarn 90 has a plurality of color components, the background color is set so that, for example, even the color component of the yarn closest to the background color is a sufficient distance from it.
  • the distance regarding the color is a distance in the color space, and the distance has the same meaning as the color difference.
  • when there are a plurality of color components, the distance is defined as the smallest of the distances between the background color and each color component.
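This distance definition can be sketched with Euclidean distance in RGB space. The yarn components and background coordinates below are hypothetical examples:

```python
import math

def color_distance(c1, c2):
    """Euclidean distance between two RGB coordinates (the 'color difference')."""
    return math.dist(c1, c2)

def yarn_background_distance(yarn_components, background):
    """With several yarn color components, the effective distance is the
    smallest distance between the background and any single component."""
    return min(color_distance(c, background) for c in yarn_components)

# Hypothetical yarn with white and black components, mid-gray background.
yarn = [(255, 255, 255), (0, 0, 0)]
background = (128, 128, 128)
print(round(yarn_background_distance(yarn, background), 1))  # 220.0
```

Here the white component at distance 127·√3 (about 220) is the nearest one, so it alone determines whether the background clears the predetermined color-difference threshold.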
  • in other respects, the embodiment is the same as the reference example of FIGS. 1 to 10.
  • FIG. 12 shows the backlight source 102 and its surroundings.
  • 91 and 92 are the rollers, and 6 is the digital camera.
  • the digital camera 6 and the backlight light source 102 are disposed so as to face each other with the thread 90 interposed therebetween.
  • a standard light source (not shown) is disposed on the front side and the back side of the digital camera 6, for example.
  • the standard light source is arranged so that only the reflected light enters the digital camera 6 from the standard light source.
  • the backlight source 102 is realized by, for example, an array of three or more LEDs, a liquid crystal panel with a backlight, a white fluorescent lamp, or the like.
  • the color coordinates of the background light are controlled by changing the brightness of the LED for each color.
  • when a white light source suffices as the backlight light source 102, an array of white LEDs or a white fluorescent lamp, for example, is used as the light source.
  • the backlight light source 102 is provided with a translucent diffuser plate (not shown) on the output side so that direct light from the LEDs, fluorescent lamp, or the like does not leak through, giving a uniform surface light.
  • a dark filter may be detachably attached to the output side of the backlight light source 102 in case a darker light source color is required.
  • FIG. 13 shows the configuration of the backlight light source control unit 100.
  • the color component extraction unit 104 extracts approximate color components of the yarn from the image captured by the digital camera 6.
  • the color component is a color coordinate in a color space; any color space such as RGB or HLS can be used. Here, the RGB space is used.
  • color components, that is, color coordinates, are obtained; when they have a wide distribution, they may be collected into a relatively small number of components, for example to simplify processing.
  • the color component distribution may be used as it is.
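One simple way to collect a wide color distribution into a few representative components is to snap each RGB coordinate to a coarse grid and keep the most frequent cells. This is only a crude stand-in for whatever clustering the device actually uses; the pixel values are invented:

```python
from collections import Counter

def extract_color_components(pixels, step=64, top=3):
    """Collapse a wide RGB distribution into a few representative components
    by snapping each channel to a coarse grid (cell centers) and keeping
    the most frequent cells."""
    snapped = [tuple((ch // step) * step + step // 2 for ch in p)
               for p in pixels]
    return [color for color, _ in Counter(snapped).most_common(top)]

# Hypothetical melange-like yarn: mostly near-white and near-black pixels.
pixels = [(250, 248, 251), (245, 250, 240), (10, 5, 8), (2, 8, 12),
          (247, 246, 250)]
print(extract_color_components(pixels, top=2))
```

The near-white pixels all land in one grid cell and the near-black pixels in another, so the wide distribution collapses to two components, which is the small set the background-color search then works against.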
  • the background color calculation unit 106 controls the brightness while keeping the backlight light source 102 in black and white, for example.
  • the brightness of the backlight light source can be referred to as the brightness of the background color.
  • a search is made for a lightness at which the distance from each color component extracted by the color component extraction unit 104 is a predetermined value or more; in other words, a point in the color space is sought at which a distance of the predetermined value or more is obtained for all color components of the yarn. If such a lightness exists, the backlight light source 102 is driven at that lightness. The search may be continued until the maximum distance is found, or terminated as soon as a distance of the predetermined value or more is obtained.
  • changing the background color here does not mean actually driving the backlight light source 102; the background color is changed only in the calculation of the distance in the color space. If the yarn is translucent, a more suitable background color can be found by actually controlling the backlight light source 102 to change the background color and measuring the distance between the yarn's color components and the background color each time.
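The black-and-white search performed by the background color calculation unit 106 can be sketched as a scan over gray levels, keeping the gray that maximizes the minimum RGB distance to the yarn's color components. The threshold value 100 is an arbitrary stand-in for the "predetermined value":

```python
import math

def best_gray_background(yarn_components, min_diff=100.0):
    """Scan achromatic backgrounds (R=G=B) and return the gray level that
    maximizes the minimum RGB distance to every yarn color component,
    that distance, and whether it clears the required color difference."""
    best = None
    for v in range(256):
        bg = (v, v, v)
        d = min(math.dist(bg, c) for c in yarn_components)
        if best is None or d > best[1]:
            best = (bg, d)
    bg, d = best
    return bg, d, d >= min_diff

# White and black yarn components: the optimum gray sits midway.
bg, d, ok = best_gray_background([(255, 255, 255), (0, 0, 0)])
print(bg, ok)  # (127, 127, 127) True
```

In the exhaustive form shown, the scan always finds the maximum-distance gray; the early-exit variant mentioned in the text would simply return as soon as `d >= min_diff`.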
  • if no suitable achromatic background is found, the background color calculation unit 108 varies the light of the backlight light source 102 within the RGB space to obtain a distance of the predetermined value or more.
  • the lightness that maximizes the distance from each color component of the yarn has already been obtained by the background color calculation unit 106. With the lightness fixed at this value, the hue and saturation of the backlight light source 102 are varied; in other words, relative to the line segment connecting white and black in the RGB space, the light source color is moved within the plane perpendicular to that segment through the optimum point obtained by the background color calculation unit 106, until a distance of the predetermined value or more is obtained.
  • the method for obtaining the optimal light source color is arbitrary: the light source color farthest from the color components of the yarn may be sought, or the search may be stopped as soon as a distance of the predetermined value or more is obtained for every component of the yarn.
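The chromatic search of the background color calculation unit 108 can be sketched by fixing the lightness and moving the candidate color in the plane perpendicular to the black-white (gray) axis of RGB space. The basis vectors, step size, and threshold below are illustrative assumptions:

```python
import math

# Two orthonormal vectors spanning the plane perpendicular to the gray
# axis (1,1,1)/sqrt(3) of RGB space.
U = (1 / math.sqrt(2), -1 / math.sqrt(2), 0.0)
W = (1 / math.sqrt(6), 1 / math.sqrt(6), -2 / math.sqrt(6))

def chromatic_search(yarn_components, gray_level, step=32):
    """With lightness fixed at the optimum gray, move the light source color
    in the plane perpendicular to the black-white line and keep the point
    farthest (in minimum RGB distance) from every yarn component."""
    origin = (float(gray_level),) * 3
    best = (origin, min(math.dist(origin, c) for c in yarn_components))
    for a in range(-128, 129, step):
        for b in range(-128, 129, step):
            cand = tuple(origin[i] + a * U[i] + b * W[i] for i in range(3))
            if not all(0 <= ch <= 255 for ch in cand):
                continue  # candidate falls outside the RGB cube
            d = min(math.dist(cand, c) for c in yarn_components)
            if d > best[1]:
                best = (cand, d)
    return best

# White, black and mid-gray components: no achromatic background works,
# but a saturated hue restores a usable color difference.
yarn = [(255, 255, 255), (0, 0, 0), (127, 127, 127)]
bg, d = chromatic_search(yarn, 127)
print(d > 100)  # True
```

At the gray origin the distance to the mid-gray component is zero; moving by 128 units within the chromatic plane keeps the lightness fixed while pushing the candidate at least 128 RGB units away from all three components.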
  • FIG. 14 shows the configuration of a program for determining the background color.
  • This program is stored in a storage medium such as a CD-ROM or supplied by a carrier wave from the Internet or the like.
  • the program for determining the background color is installed in an appropriate computer, thereby realizing the backlight light source control unit 100.
  • in step 1, color components are extracted from the yarn image captured by the digital camera 6 or the like. The process of collecting the color components into a small number may be omitted.
  • in step 2, the background color is varied within the black-and-white range, the distance in the RGB space from each color component of the yarn is obtained, and a background color is sought that gives a distance of the predetermined value or more for all the color components.
  • if no background color giving a distance of the predetermined value or more is found in step 2, then in step 3 the lightness of the background color that maximized the distance is retained, and the background color is varied within the color space, searching for one that gives a distance of the predetermined value or more for all the color components of the yarn.
  • in some variants, the background color calculation unit 108 in FIG. 13 and step 3 in FIG. 14 are not provided. Further, instead of first obtaining the optimum background color in black and white and then in color, the optimum background color may be obtained in color from the beginning.
  • Fig. 15 shows two examples of background color determination.
  • when, as in the yarn on the left side of FIG. 15, a white color component 110 and a black color component 111 are present, an optimal background color exists in the gray range, for example.
  • when, as on the right side of FIG. 15, a gray color component 112 is present in addition to the white and black color components 110 and 111, it is difficult to separate all of them sufficiently with a black-and-white background. In such a case, a chromatic background color is used so that the color components of the yarn and the background color are sufficiently separated.
  • FIG. 16 is an image of a thread imaged with a dark background color with the backlight light source off.
  • FIG. 17 is the yarn image obtained by applying a two-dimensional Fourier transform and a two-dimensional inverse Fourier transform to the yarn image of FIG. 16. In the following, the two-dimensional Fourier transform and inverse transform are performed for each component of the color space, such as RGB. The black part of the yarn disappears in FIG. 17; that is, the color components of the yarn close to the background color are lost.
  • FIG. 18 is a yarn image captured using a black and white background color with optimized brightness.
  • FIG. 19 is the yarn image reproduced by applying a two-dimensional Fourier transform and a two-dimensional inverse Fourier transform to this image.
  • FIG. 20 is an image of the yarn main body in the process of creating the image of FIG. 19, and FIG. 21 is an image of the fluff.
  • the yarn main body and the fluff are sufficiently reproduced for the black portion of the yarn.
  • a one-dimensional Fourier transform along the radial direction of the yarn may also be used. With a single piece of one-dimensional Fourier transform data, one yarn diameter value and one fluff amount value can be obtained; however, since the fluff amount varies greatly with the imaging position, it is difficult to determine the fluff amount accurately. With a plurality of one-dimensional Fourier transform data, both the yarn diameter and the fluff amount can be obtained accurately.
  • the yarn image can also be used for simulating a garment made with the measured yarn.
  • an image in which the yarn body and the fluff are separated can be obtained, and the average value and distribution of the fluff amount, the yarn diameter, the abnormal value, and the like can be obtained.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Textile Engineering (AREA)
  • Treatment Of Fiber Materials (AREA)

Abstract

Disclosed is a thread measuring apparatus, which controls a back light source (102) so that the color difference between a thread (90) and the illumination is at or above a predetermined value. With the illumination color of the back light source (102) optimized, a digital image of the thread (90) is taken by a camera (6), and a two-dimensional Fourier-transformed image is formed along the width direction and the longitudinal direction of the thread. A low-frequency component corresponding to the thread body is extracted from the two-dimensional Fourier-transformed image and subjected to a two-dimensional inverse Fourier transformation, thereby reproducing the image of the thread body. Thus, it is possible to determine the thread diameter precisely, regardless of the color components of the thread.

Description

Yarn measuring device, measuring program, and measuring method
This invention relates to yarn measurement, and more particularly to measurement of yarn thickness and fluff amount.
One of the inventors previously proposed converting a yarn image into Fourier transform data or an autocorrelation function, separating the background, the yarn body, and the fluff from one another, and obtaining the yarn diameter and the fluff amount from the result. In the course of improving this technique, the inventors noticed the following problem. When an image of a yarn containing both bright and dark portions is captured, it is difficult to extract the dark portions of the yarn from the Fourier transform data against a dark background color, and conversely it is difficult to extract the bright portions against a bright background color. In other words, data for the portions of the yarn whose color is close to the background color are difficult to obtain from the Fourier transform data.
Japanese Patent Application 2008-68841
An object of the present invention is to make it possible to determine the yarn diameter accurately using an ordinary imaging device such as a digital camera or a scanner, and to measure the properties of the yarn accurately regardless of its color.
The yarn measuring apparatus of this invention captures a digital image of a yarn with an imaging means and determines the yarn diameter or fluff amount of the yarn. It comprises:
a backlight light source that projects background light of a variable light source color onto the yarn during imaging;
a color extraction means for obtaining the color components of the yarn from the digital image captured by the imaging means;
an arithmetic means for determining the light source color such that a color difference of at least a predetermined value is obtained between the extracted color components and the light source color of the backlight light source;
a control means for controlling the backlight light source according to the determined light source color;
a conversion means for obtaining, by digital signal processing, Fourier transform data or autocorrelation function data from the digital image of the yarn captured by the imaging means with the determined light source color as the background color; and
a yarn property calculating means for obtaining the yarn diameter or fluff amount of the yarn from the data obtained by the conversion means.
The yarn measuring program of this invention is a program for a yarn measuring apparatus comprising an imaging device, which includes an imaging means for capturing a digital image of a yarn and a backlight light source that projects background light of a variable light source color onto the yarn during imaging, and a computer that processes the digital image captured by the imaging means and controls the backlight light source. The program causes the computer to execute the steps of:
obtaining the color components of the yarn from the digital image captured by the imaging means;
determining the light source color such that a color difference of at least a predetermined value is obtained between the extracted color components and the light source color of the backlight light source;
controlling the backlight light source according to the determined light source color;
obtaining, by digital signal processing, Fourier transform data or autocorrelation function data from the digital image of the yarn captured by the imaging means with the determined light source color as the background color; and
obtaining the yarn diameter or fluff amount of the yarn from the obtained Fourier transform data or autocorrelation function data.
The yarn measuring method of this invention captures a digital image of a yarn with an imaging means and determines the yarn diameter or fluff amount of the yarn. It comprises the steps of:
capturing a digital image of the yarn with the imaging means while projecting background light of a variable light source color from a backlight light source onto the yarn;
obtaining the color components of the yarn from the captured digital image with an extraction means;
determining, with an arithmetic means, a light source color such that a color difference of at least a predetermined value is obtained between the extracted color components and the light source color of the backlight light source;
controlling the backlight light source with a control means according to the determined light source color;
obtaining, with a conversion means and by digital signal processing, Fourier transform data or autocorrelation function data from the digital image of the yarn captured by the imaging means with the determined light source color as the background color; and
obtaining, with a yarn property calculating means, the yarn diameter or fluff amount of the yarn from the data obtained by the conversion means.
Since this specification handles light source colors in color spaces such as RGB, Lab, and YCbCr, changing the lightness is also included in changing the light source color. The control means controls the light source color of the backlight light source so as to match the determined light source color; an exact match is not required, as long as the light source color of the backlight light source is brought close to the determined color.
Preferably, the backlight light source is a color light source whose brightness and hue can both be controlled. In controlling the light source color, the brightness is varied first to search for a setting that yields a color difference of at least the predetermined value from the color components of the yarn. If such a brightness exists, the backlight light source is driven at that brightness; if no brightness yields the required color difference, the hue of the light source color is searched instead, and the backlight light source is controlled to the hue found.
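The two-stage search described above (brightness first, then hue) can be sketched as follows. This is a minimal illustration only: the RGB color space, the Euclidean color-difference metric, the HSV parameterization, and the step sizes are all assumptions not fixed by the specification.

```python
import itertools
import colorsys

def color_diff(c1, c2):
    # Euclidean distance in RGB, standing in for the unspecified
    # color-difference metric of the specification
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def choose_backlight(yarn_colors, threshold=100.0, hue=0.0):
    """Return an RGB backlight color whose difference from every yarn
    color component is at least `threshold`, or None if none is found."""
    # Step 1: vary only the brightness (HSV value) at the current hue.
    for v in [i / 10 for i in range(11)]:              # 0.0 .. 1.0
        rgb = tuple(255 * c for c in colorsys.hsv_to_rgb(hue, 1.0, v))
        if min(color_diff(rgb, yc) for yc in yarn_colors) >= threshold:
            return rgb
    # Step 2: no brightness sufficed, so search over hue as well.
    for h, v in itertools.product([i / 12 for i in range(12)],
                                  [i / 10 for i in range(11)]):
        rgb = tuple(255 * c for c in colorsys.hsv_to_rgb(h, 1.0, v))
        if min(color_diff(rgb, yc) for yc in yarn_colors) >= threshold:
            return rgb
    return None                                        # no suitable color

# A white yarn is handled by simply darkening the backlight:
bg = choose_backlight([(255, 255, 255)])
```

For a dark yarn the step-1 brightness loop already succeeds at a moderate brightness; a multicolored yarn may force the step-2 hue search.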
In this specification, descriptions of the yarn measuring apparatus also apply to the yarn measuring program and the yarn measuring method, and descriptions of the yarn measuring program and method likewise apply to the apparatus.
In this invention, the digital image of the yarn is converted into Fourier transform data or autocorrelation function data. The Fourier transform data of the yarn contains data corresponding to the yarn diameter and the fluff amount, from which the yarn diameter can be obtained. In the autocorrelation function data, the width of the correlation corresponds to the yarn diameter, and the intensity of components with very short correlation distances corresponds to the fluff amount. Therefore, once a digital image of the yarn is acquired with a digital camera or the like, the yarn diameter and fluff amount can be obtained by digital signal processing alone, and no special optical system is required. Moreover, even for a translucent yarn, the yarn diameter and fluff amount can be obtained as long as a digital image can be captured. Delicate data such as the exact boundary between the yarn body and the fluff are also not required.
Further, in this invention, since the background color is controlled so that a sufficient color difference from the color components of the yarn is obtained, accurate measurement is possible regardless of the color of the yarn. The yarn diameter can be determined accurately even for a translucent yarn, and the fluff amount can be determined accurately even for translucent fluff.
If the optimum background color is first searched for by varying the brightness of the light source color, the color components of the yarn can be determined accurately, so an accurate image of the yarn cut out from the background can be obtained. If no optimum background color is found this way, searching for a background color by controlling the hue of the light source color makes it possible to determine the yarn diameter and fluff amount for almost any yarn.
FIG. 1 is a block diagram of the yarn measuring apparatus of the reference example. FIG. 2 shows processing of a yarn image by two-dimensional Fourier transform in the reference example. FIG. 3 shows processing of a yarn image by one-dimensional Fourier transform in the reference example. FIG. 4 shows filter generation in the reference example. FIG. 5 shows the yarn image used in the test example. FIG. 6 shows the two-dimensional Fourier transform data of FIG. 5. FIG. 7 shows the yarn body image obtained from the data of FIG. 5 by two-dimensional inverse Fourier transform. FIG. 8 shows the fluff image obtained from the data of FIG. 5 by two-dimensional inverse Fourier transform and binarization. FIG. 9 is a block diagram of the yarn measuring program of the reference example. FIG. 10 is a block diagram showing the reference example applied to yarn diameter management. FIG. 11 is a block diagram of the yarn measuring apparatus of the embodiment. FIG. 12 is a block diagram showing the mechanical configuration of the yarn measuring apparatus of the embodiment. FIG. 13 is a block diagram of the backlight light source control unit in the embodiment. FIG. 14 is a flowchart showing the backlight determination algorithm in the embodiment. FIG. 15 schematically shows an example of backlight determination in the embodiment.
FIG. 16 is a photograph of the yarn with the backlight off. FIG. 17 shows the yarn image created by the reference example based on the image of FIG. 16. FIG. 18 is a photograph of the yarn with the backlight on. FIG. 19 shows the yarn image created by the embodiment based on the image of FIG. 18. FIG. 20 shows the yarn body image created by the embodiment based on the image of FIG. 18. FIG. 21 shows the fluff image created by the embodiment based on the image of FIG. 18.
An optimum embodiment for carrying out the present invention is described below. First, measurement of a yarn with an apparatus having no backlight light source is described with reference to FIGS. 1 to 10; the backlight light source and its control are then described with reference to FIGS. 11 to 21.
FIGS. 1 to 10 show a yarn measuring apparatus 2 and a yarn measuring program 80 of a reference example. In FIG. 1, 4 is a bus; 6 is a color digital camera, which may instead be a color scanner or the like; 8 is a color monitor; 10 is a keyboard; 12 is a color printer; and 14 is a mouse. A trackball, joystick, stylus, or the like may be used instead of the mouse 14. 16 is a network interface that inputs and outputs various programs and data, and 18 to 22 are image memories. The image memory 18 stores the color image of the yarn captured by the digital camera 6, the image memory 20 stores the image of the yarn body separated from the yarn image by Fourier transform, and the image memory 22 stores the image of the fluff separated from the yarn image by Fourier transform.
The Fourier transform unit 24 transforms the color image of the yarn into a two-dimensional Fourier transform image. As preprocessing, the Fourier transform unit 24 may convert the color image into, for example, a lightness image, and a one-dimensional Fourier transform may be used in place of the two-dimensional one. When the color of the yarn is particularly important, as with a melange yarn, a two-dimensional Fourier transform may be applied to each of the RGB components, for example. The Fourier transform image of the yarn is stored in the Fourier transform image storage unit 25. The Fourier transform unit 24 also performs two-dimensional inverse Fourier transforms, applying them to the filtered data to create the image of the yarn body and the image of the fluff. The filter generation unit 26 generates filters for separating the yarn body and the fluff from the Fourier transform image. The binarization unit 27 binarizes the inverse-Fourier-transformed images, in particular the fluff image. The counter 28 counts pixels in the fluff image and the yarn body image; for the yarn body it obtains the yarn diameter and the distribution or variation pattern of the yarn diameter, and for the fluff it obtains the fluff amount and the like.
The program memory 30 stores the yarn measuring program 80 and the like. The yarn model creation unit 32 generates a three-dimensional model of the yarn using the yarn diameter obtained by the counter 28 and the yarn texture stored in the image memory 18. The simulation unit 34 uses the generated yarn model to simulate, for the knitting data stored in the knitting data storage unit 36, a knit garment in which the individual loops are represented.
FIG. 2 outlines the processing in the reference example. The short-side direction of the yarn is taken as the x direction and the longitudinal direction as the y direction. The image obtained with the digital camera 6 includes a yarn body image 40 and a fluff image 42, and between the images 40 and 42 it includes portions where the yarn body and the fluff are mixed. In the two-dimensional Fourier transform image obtained by applying a two-dimensional Fourier transform to this image, the yarn body area 44 lies in the low-frequency region, the fluff area 46 lies in the high-frequency region, and an intermediate area 48 lies between them. In the following, ω denotes the frequency along the x direction and φ the frequency along the y direction. The frequency range over which the Fourier transform is computed is set according to the high-frequency end of the fluff area 46. The Fourier transform data is taken to have both real and imaginary components, but a power spectrum may be used instead: with real component Re(ω) and imaginary component I(ω), the power spectrum P(ω) is given by (Re²(ω) + I²(ω))^(1/2).
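The spectrum computation just described can be reproduced with a few lines of numpy; this is an illustrative sketch only, with the synthetic array `img` standing in for the lightness image of the yarn:

```python
import numpy as np

# Synthetic stand-in for the lightness image of the yarn:
# x (width) runs along axis 0, y (length) along axis 1.
img = np.zeros((64, 256))
img[28:36, :] = 1.0               # an 8-pixel-wide yarn body

F = np.fft.fft2(img)              # complex spectrum with components Re and I
P = np.abs(F)                     # power spectrum P = (Re**2 + I**2)**0.5
P = np.fft.fftshift(P)            # put the zero-frequency term at the center

# The yarn body concentrates at low (omega, phi), so after the shift the
# spectrum peaks at the center of the array.
peak = np.unravel_index(np.argmax(P), P.shape)
```

The peak at the center corresponds to the yarn body area 44; the fluff would populate the high-frequency region away from it.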
Extracting the frequency components corresponding to the yarn body from the two-dimensional Fourier transform image with the filter 50, which passes only the yarn body area 44, and applying a two-dimensional inverse Fourier transform yields the yarn body image 41. Similarly, applying the filter 51, which passes only the fluff area 46, followed by a two-dimensional inverse Fourier transform yields the fluff images 43, 43. The filter 50 passes, for example, only the low-frequency components along the ω direction, and the filter 51 passes only the high-frequency components along the ω direction. The narrower the band passed by the filter 50, the smoother the resulting yarn body image 41; conversely, widening the band passed by the filter 50 allows subtle irregularities and twists of the yarn body to be reflected in the yarn body image 41. In the reference example, a gap is left between the band passed by the filter 50 and the band passed by the filter 51, removing the frequency components of the intermediate area 48. Alternatively, the intermediate area 48 need not be removed: all frequencies outside the band cut out by the filter 50 may be passed by the filter 51.
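The filter-and-invert separation can be sketched as below. This is illustrative only: a fixed cutoff `wc` along ω is assumed here, whereas the apparatus derives the filter bands from the spectrum itself, as described with FIG. 4.

```python
import numpy as np

rng = np.random.default_rng(0)
img = np.zeros((64, 256))
img[28:36, :] = 1.0                             # smooth yarn body
img += 0.3 * (rng.random(img.shape) < 0.02)     # sparse speckle as "fluff"

F = np.fft.fft2(img)
wx = np.fft.fftfreq(img.shape[0])[:, None]      # frequency omega along x

wc = 0.1                                        # assumed cutoff frequency
low = np.abs(wx) <= wc                          # filter 50: yarn-body band
high = ~low                                     # filter 51: fluff band

body = np.fft.ifft2(F * low).real               # smoothed yarn body image
fluff = np.fft.ifft2(F * high).real             # fluff image
```

Since these two masks partition the spectrum, body + fluff reconstructs the original image exactly; leaving a gap between the bands, as the reference example does for the intermediate area 48, would discard the components in between.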
Next, the yarn diameter can be obtained by counting the width of the yarn body image 41 along the x direction. Alternatively, the yarn diameter can be obtained by counting the total number of pixels corresponding to the yarn body and dividing by the number of pixels along the length of the yarn body. Further, by obtaining the yarn diameter at various positions and computing its mean, variance, abnormal values, and so on, the degree of yarn diameter variation due to twists, slubs, and the like can be determined. Since the fluff image 43 is generally a weak image, the fluff amount can be obtained by separating the fluff from the background, for example by binarization, and counting the pixels corresponding to the fluff. The fluff amount may be obtained by counting the total number of fluff pixels, or by counting the fluff pixels on a plurality of lines along the x direction. The fluff amount can also be obtained by converting the high-frequency Fourier transform components corresponding to the fluff into a fluff amount for each frequency, using a lookup table or the like (not shown), and summing over frequency.
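The two pixel-counting methods of this paragraph, and the variation statistics used for quality control, can be sketched as follows (a synthetic binarized image is assumed):

```python
import numpy as np

# Binarized yarn-body image: True where a pixel belongs to the yarn body.
body = np.zeros((64, 256), dtype=bool)
body[28:36, :] = True                      # an 8-pixel-wide yarn

# Method 1: width along x at every y position, then the average.
widths = body.sum(axis=0)                  # one width per column
diameter = widths.mean()

# Method 2: total body pixels divided by the yarn length in pixels.
diameter2 = body.sum() / body.shape[1]

# Variation statistics (mean, variance, outliers) for quality control.
mean, var = widths.mean(), widths.var()
outliers = np.abs(widths - mean) > 3 * np.sqrt(var + 1e-12)
```

Counting white pixels in a binarized fluff image gives the fluff amount in exactly the same way.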
The reference example uses a two-dimensional Fourier transform, but a one-dimensional Fourier transform may be used instead; the one-dimensional case is easier to follow since there is no processing in the y direction of FIG. 2. In FIG. 3, 52 is the yarn body image, surrounded by the fluff image 54. Applying a one-dimensional Fourier transform along the width direction of the yarn yields the yarn body area 56 and the fluff area 58. The filter 59 then cuts out the frequency components corresponding to the yarn body, and the filter 60 cuts out the high-frequency components corresponding to the fluff. Applying inverse Fourier transforms yields the yarn body image 53 and the fluff image 55. Although rectangular filters are shown in FIG. 3, filters whose transmittance varies smoothly may also be used.
FIG. 4 shows filter generation. The image of the yarn body has a peak near frequency 0, and the width of the peak corresponds to the yarn diameter. The bands of the filters 59 and 60 are determined from the peak shape of the yarn body area 56 in the Fourier transform. For example, the half-value width α of the peak of the yarn body area 56 is obtained and multiplied by a first coefficient to give the bandwidth β for cutting out the yarn body area 56; similarly, multiplying by a second coefficient gives a second bandwidth γ for cutting out the fluff area 58. The bandwidth β is used as a low-pass filter and the bandwidth γ as a high-pass filter. Alternatively, a value in place of the half-value width α can be obtained by using the zero-crossing point between the yarn body area 56 and the fluff area 58, or by extrapolating the peak line of the yarn body area 56 to the point where it crosses zero. When the digital image of the yarn body is represented by a rectangle of width D, its Fourier transform is approximately sinc(f·D), where sinc(x) is given by sin(x)/x. If the width of the yarn body peak in FIG. 4 is known, D, that is, the yarn diameter, can be obtained, so the yarn diameter can also be determined without an inverse Fourier transform.
FIGS. 5 to 8 show the processing results for a yarn image. FIG. 5 is a yarn image captured with a digital camera, and FIG. 6 is its two-dimensional Fourier transform; the directions of the axes are indicated by ω and φ. FIG. 7 shows the image obtained by filtering the image of FIG. 6 to cut out only the low-frequency yarn body area and applying a two-dimensional inverse Fourier transform; it is an image in which the surface of the yarn body has been smoothed. This image is sufficiently clear that the yarn diameter can be calculated; the yarn diameter may also be calculated after binarizing the image of FIG. 7. The image of FIG. 8 is a fluff image obtained by applying a filter to the image of FIG. 6 to extract the frequency components of the fluff area, applying a two-dimensional inverse Fourier transform, and then binarizing. Since the fluff image is darker than the yarn body image, the binarization threshold for the fluff is set lower than for the yarn body. Counting the white pixels in FIG. 8 gives the fluff amount.
Once the diameter of the yarn body is determined, a yarn model can be created. The yarn model can be represented by a prism, for example a quadrangular or hexagonal prism; dividing its surface into polygons and bending the model along the polygon edges turns it into a loop. Once the yarn diameter is determined, the diameter of the prism is determined, and hence the polygons of the yarn surface are obtained. Since the texture of the yarn has already been captured with the digital camera, it is texture-mapped onto the model.
FIG. 9 shows the yarn measuring program 80. The two-dimensional Fourier transform instruction 81 applies a two-dimensional Fourier transform to the digital image of the yarn. The filtering instruction 82 generates, from the Fourier-transformed image, a filter for cutting out the yarn body and a filter for cutting out the fluff. The two-dimensional inverse Fourier transform instruction 83 applies an inverse Fourier transform to the filtered Fourier transform data, separating it into a yarn body image and a fluff image. The binarization instruction 84 binarizes these images; as noted above, binarization may be omitted for the yarn body image. The counting instruction 85 determines the yarn diameter and the fluff amount, for example by counting the pixels corresponding to the yarn in the yarn body image and the fluff image. Since the Fourier transform and the inverse Fourier transform share most of their processing, these steps can be largely unified.
FIG. 10 shows an application to yarn quality control. 90 is a yarn and 91, 92 are yarn feed rollers; the digital camera 6 captures an image of the yarn. To obtain an accurate image, it is preferable to stop the rollers 91 and 92 during imaging and capture a still image. The measuring apparatus 2 determines, for example, the yarn diameter from the captured image and stores it in the storage device 94. The quality of the yarn 90 is then controlled based on the variance of the obtained yarn diameters, the presence or absence of abnormal values, and so on. The measuring apparatus 2 may also measure the fluff amount, store it in the storage device 94, and control quality with respect to the fluff amount in the same way.
The reference example provides the following effects.
(1) Only the digital camera 6 is required as hardware before signal processing, and since it suffices to capture an image of the yarn with the digital camera 6, no precise adjustment is required.
(2) The filter for cutting out the yarn body and the filter for cutting out the fluff can be generated automatically from the shape of the low-frequency peak corresponding to the yarn body, so both thin and thick yarns can be processed easily.
(3) Since diffracted light is not used, the yarn diameter and fluff amount can be measured even for a translucent yarn.
Although the reference example uses the Fourier transform, the Fourier transform and the autocorrelation function have similar properties, as is well known. That is, exactly the same processing can be performed by using the correlation distance of the autocorrelation function in place of the frequency of the reference example. For example, the yarn diameter D is the width of the correlation in the autocorrelation function; in a one-dimensional autocorrelation function, the extent of the longer-range correlation peak gives the yarn diameter. The intensity of components with zero or very short correlation distance represents the fluff amount.
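The equivalence noted here is the Wiener-Khinchin relation: the autocorrelation function is the inverse Fourier transform of the power spectrum, so the correlation width recovers the yarn diameter just as the spectral peak width does. A minimal check on an idealized profile:

```python
import numpy as np

profile = np.zeros(256)
profile[100:116] = 1.0                   # 16-pixel-wide yarn cross-section

# Autocorrelation via the power spectrum (Wiener-Khinchin theorem).
spec = np.fft.fft(profile)
acf = np.fft.ifft(np.abs(spec) ** 2).real
acf /= acf[0]                            # full correlation at lag 0

# For a rectangle the correlation falls off linearly and first reaches
# zero at a lag equal to the width, i.e. the yarn diameter.
width = int(np.argmax(acf <= 1e-6))
```

Fluff, being fine and irregular, would instead contribute a sharp spike at lags at or near zero.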
FIGS. 11 to 21 show the embodiment and its results. FIG. 11 shows the configuration of the embodiment: 100 is a backlight light source control unit that controls the backlight light source 102, and the backlight light source 102 provides background light for the yarn to be measured. The backlight light source control unit 100 determines the approximate color components of the yarn 90 from the image from the digital camera 6, and controls the light source color of the backlight light source 102 so that the color of the yarn itself and the background color (light source color) are sufficiently far apart in the color space. When the yarn 90 has a plurality of color components, even the yarn color component closest to the background color, for example, is kept at a sufficient distance from the background color. In this specification, a distance between colors is a distance in the color space, and distance has the same meaning as color difference. When there are a plurality of color components, the distance is represented by the smallest of the distances between the background color and each color component. In other respects, the embodiment is the same as the reference example of FIGS. 1 to 10.
 FIG. 12 shows the backlight light source 102 and its surroundings; 91 and 92 are the rollers described above, and 6 is the digital camera. The digital camera 6 and the backlight light source 102 are arranged facing each other across the yarn 90. Preferably, in addition to the backlight light source 102, standard light sources (not shown) are arranged, for example, on the near side and the far side of the digital camera 6, and they are positioned so that only reflected light from the standard light sources, not direct light, enters the digital camera 6.
 The backlight light source 102 is implemented, for example, as an array of LEDs of three or more colors, a liquid crystal panel with a backlight, or a white fluorescent lamp; the color coordinates of the background light are then controlled, for example, by varying the LED brightness for each color. When a white light source is used as the backlight light source 102, an array of white LEDs or a white fluorescent lamp, for example, serves as the source. The backlight light source 102 is also provided with a translucent plate (not shown) on its output side, forming a uniform surface light source that lets no direct light from the LEDs or fluorescent lamp leak through. In case a darker light source color is needed, a dark filter may be detachably attached to the output side of the backlight light source 102.
 FIG. 13 shows the configuration of the backlight light source control unit 100. A color component extraction unit 104 extracts the approximate color components of the yarn from the digital camera 6. A color component is a set of color coordinates in a color space; any color space such as RGB or HLS can be used, and RGB space is used here. As shown in FIG. 12, the yarn 90 is stretched between the rollers 91 and 92, so taking the pixels on the line segment connecting the rollers 91 and 92 from the digital camera 6 yields a plurality of pixels of the yarn 90. Color components, i.e. color coordinates, are determined for these pixels, and when they have a wide distribution they are aggregated into a relatively small number of components, for example to simplify processing. The distribution of color components may also be used as it is.
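The pixel-sampling and aggregation step can be sketched as follows (a Python/NumPy illustration under assumed details: the scan line is taken as an image row standing in for the segment joining the rollers, and the aggregation is simple coarse quantization, whereas the actual unit may cluster differently):

```python
import numpy as np

def yarn_color_components(image, row, levels=4):
    """Sample RGB pixels along one scan line of `image` and collapse
    them into a few representative color components by coarse
    quantization of each channel to bucket centers."""
    img = np.asarray(image)
    pixels = img[row].reshape(-1, 3).astype(int)
    step = 256 // levels
    quantized = (pixels // step) * step + step // 2  # bucket centers
    return np.unique(quantized, axis=0)

# Two near-white and two near-black pixels collapse to two components.
line = np.array([[[250, 250, 250], [248, 252, 251],
                  [5, 3, 2], [0, 0, 0]]], dtype=np.uint8)
components = yarn_color_components(line, row=0)
print(len(components))  # → 2
```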
 The background color calculation unit 106 controls the brightness of the backlight light source 102 while keeping it, for example, within the monochrome (gray) range; the brightness of the backlight light source can then be regarded as the lightness of the background color. While varying the background color, the unit searches for a lightness at which the difference from each color component extracted by the color component extraction unit 104 is at least a predetermined value; in other words, it searches for a background color whose distance in color space from all color components of the yarn is at least the predetermined value. If such a background color exists, the backlight light source 102 is driven at that lightness. The search may be continued until the maximum distance is found, or terminated as soon as a distance of at least the predetermined value is obtained. Varying the background color here does not mean actually controlling the backlight light source 102, but varying the background color in the calculation of distances in color space. If the yarn is translucent, a more suitable background color can be obtained by controlling the backlight light source 102 to actually vary the background color, measuring the yarn color components and the background color each time, and measuring the distance between them.
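A minimal sketch of this monochrome search, assuming Euclidean RGB distance as the color difference and an optional early stop once the predetermined threshold is met (both assumptions; the patent does not fix a distance metric):

```python
import numpy as np

def best_gray_background(yarn_colors, threshold=None):
    """Scan gray backgrounds (R = G = B = v) and return (v, d), where d
    is the minimum RGB distance from the background to any yarn color
    component. If `threshold` is given, stop at the first v meeting it,
    mirroring the early termination the text allows."""
    colors = np.asarray(yarn_colors, dtype=float)
    best_v, best_d = 0, -1.0
    for v in range(256):
        gray = np.full(3, float(v))
        d = float(np.min(np.linalg.norm(colors - gray, axis=1)))
        if threshold is not None and d >= threshold:
            return v, d
        if d > best_d:
            best_v, best_d = v, d
    return best_v, best_d

# For a yarn with only white and black components, the best gray
# background lies near the middle of the lightness range.
v, d = best_gray_background([[255, 255, 255], [0, 0, 0]])
print(v)  # → 127
```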
 If a distance of at least the predetermined value cannot be obtained even when the backlight light source 102 is controlled within the monochrome range, a background color calculation unit 108 varies the light of the backlight light source 102 within RGB space to search for a light source color at a distance of at least the predetermined value. At this point the background color calculation unit 106 has already found the lightness that maximizes the distance from each color component of the yarn. With the lightness fixed at this value, the hue and saturation of the backlight light source 102 are varied; in other words, starting from the optimum point found by the background color calculation unit 106 on the line segment connecting white and black in RGB space, the light source color is varied within the plane perpendicular to that segment until a color at a distance of at least the predetermined value is found. The method of finding the optimum light source color is arbitrary: the light source color farthest from the yarn color components may be sought, or the search may be terminated as soon as a distance of at least the predetermined value is obtained for every component of the yarn.
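The second stage searches the plane through the chosen lightness and perpendicular to the gray axis, i.e. candidates with R + G + B held constant. A coarse-grid sketch of that stage, again assuming Euclidean RGB distance and an illustrative grid step:

```python
import numpy as np

def best_color_background(yarn_colors, lightness, step=16):
    """Search candidate light-source colors on the plane
    R + G + B = 3 * lightness (perpendicular to the black-white axis in
    RGB space) and return the candidate maximizing the minimum RGB
    distance to the yarn color components, with that distance."""
    colors = np.asarray(yarn_colors, dtype=float)
    best, best_d = None, -1.0
    for r in range(0, 256, step):
        for g in range(0, 256, step):
            b = 3 * lightness - r - g
            if not 0 <= b <= 255:
                continue  # outside the RGB cube
            cand = np.array([r, g, b], dtype=float)
            d = float(np.min(np.linalg.norm(colors - cand, axis=1)))
            if d > best_d:
                best, best_d = cand, d
    return best, best_d

# White, gray and black components defeat any gray background, but a
# saturated color at the same lightness is far from all three.
cand, d = best_color_background(
    [[255, 255, 255], [128, 128, 128], [0, 0, 0]], lightness=128)
print(d > 150)  # → True
```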
 FIG. 14 shows the structure of a program for determining the background color. This program is stored on a storage medium such as a CD-ROM, or supplied by a carrier wave over the Internet or the like, and realizes the backlight light source control unit 100 when installed on a suitable computer. In step 1, color components are extracted from a yarn image captured by the digital camera 6 or the like, and if there are many color components they are aggregated into a few; this aggregation may be omitted. In step 2, the background color is varied within the monochrome range, the distance in RGB space from each color component of the yarn is computed, and a background color at a distance of at least a predetermined value from all color components is sought. If no such background color is found in step 2, then in step 3, using the lightness of the background color that maximizes the distance, the background color is varied within the color space to find one whose distance from all color components of the yarn is at least the predetermined value.
 When a white light source is used as the backlight light source 102, the background color calculation unit 108 of FIG. 13 and step 3 of FIG. 14 are omitted. Furthermore, instead of the procedure of first finding the optimum monochrome background color and then finding the optimum chromatic background color, the optimum background color may be sought in color from the start.
 FIG. 15 shows two examples of background color determination. When there are only a white color component 110 and a black color component 111, as in the yarn on the left of FIG. 15, an optimum background color exists, for example, in the gray region. In contrast, when a gray color component 112 is present in addition to the white and black color components 110 and 111, as on the right of FIG. 15, a monochrome background color cannot adequately separate the yarn color components from the background color. In such a case, a chromatic background color is used to separate the yarn color components from the background color sufficiently.
 FIGS. 16 to 21 show results for the embodiment. FIG. 16 is an image of a yarn captured against a dark background with the backlight light source off. FIG. 17 is the yarn image obtained by applying a two-dimensional Fourier transform and a two-dimensional inverse Fourier transform to the yarn image of FIG. 16; in the following, the two-dimensional Fourier transform and the two-dimensional inverse Fourier transform were applied to each component of the color space, such as RGB, separately. The black portions of the yarn have disappeared in FIG. 17, which means that yarn color components close to the background color are lost. FIG. 18 is a yarn image captured against a monochrome background color with optimized lightness, and FIG. 19 is the yarn image reproduced by applying the two-dimensional Fourier transform and the two-dimensional inverse Fourier transform to it. Because the yarn color components and the background color are sufficiently separated at the stage of FIG. 18, the image in FIG. 19 reproduces the yarn adequately. FIG. 20 is the image of the yarn body obtained in the course of producing the image of FIG. 19, and FIG. 21 is the image of the fluff; both the yarn body and the fluff are adequately reproduced, including the black portions of the yarn. Instead of the two-dimensional Fourier transform, a one-dimensional Fourier transform along, for example, the radial direction of the yarn may be used. With a single set of one-dimensional Fourier transform data, one value each of yarn diameter data and fluff amount data can be obtained; but since the fluff amount varies greatly with the imaging position, the yarn diameter can be determined while the fluff amount is difficult to determine accurately. With a plurality of sets of one-dimensional Fourier transform data, both the yarn diameter and the fluff amount can be determined accurately.
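The per-channel transform-and-filter pipeline described above can be sketched as follows (a NumPy illustration; the circular low-pass/high-pass split and the cutoff value are assumptions standing in for the patent's filters 50, 51, 59, 60):

```python
import numpy as np

def split_body_and_fluff(img, cutoff=0.1):
    """Apply a 2-D FFT to each color channel, keep low spatial
    frequencies as the yarn body and high frequencies as the fluff,
    then invert the transform for each part."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape[:2]
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    low = (np.hypot(fy, fx) <= cutoff).astype(float)   # low-pass mask
    body = np.empty_like(img)
    fluff = np.empty_like(img)
    for c in range(img.shape[2]):                      # per RGB channel
        spec = np.fft.fft2(img[:, :, c])
        body[:, :, c] = np.fft.ifft2(spec * low).real
        fluff[:, :, c] = np.fft.ifft2(spec * (1 - low)).real
    return body, fluff

# The two masks partition the spectrum, so body + fluff restores the image.
rng = np.random.default_rng(0)
img = rng.random((16, 16, 3))
body, fluff = split_body_and_fluff(img)
print(np.allclose(body + fluff, img))  # → True
```

Because the low-pass and high-pass masks sum to one at every frequency, the two reconstructed parts always add back to the original image, which matches the body/fluff decomposition of FIGS. 20 and 21.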
 With a monochrome background color, the color components of the yarn body and fluff are easy to extract accurately; in particular, the color components of semi-transparent fluff can be extracted accurately. With a chromatic background color, by contrast, it is difficult to separate the color of the fluff itself from the background color. In the embodiment, even when the yarn has a plurality of color components, each color component can be determined and accurate images of the yarn and fluff can be obtained, so the yarn image can be used in simulations, such as of garments, using the measured yarn. An image in which the yarn body and the fluff are separated can also be obtained, from which the fluff amount, the mean value and distribution of the yarn diameter, outliers, and the like can be determined.
2 measuring apparatus  4 bus  6 digital camera
8 color monitor  10 keyboard  12 color printer
14 mouse  16 network interface
18-22 image memories  24 Fourier transform unit
25 Fourier-transform image storage unit  26 filter generation unit
27 binarization unit  28 counter  30 program memory
32 yarn model creation unit  34 simulation unit
36 knitting data storage unit  40, 41 yarn body images
42, 43 fluff images  44 yarn body area  46 fluff area
48 intermediate area  50, 51 filters  52, 53 yarn body images
54, 55 fluff images  56 yarn body area  58 fluff area
59, 60 filters  80 yarn measuring program
81 two-dimensional Fourier transform instruction  82 filtering instruction
83 two-dimensional inverse Fourier transform instruction  84 binarization instruction
85 count instruction  90 yarn  91, 92 rollers
94 storage device  100 backlight light source control unit
102 backlight light source  104 color component extraction unit
106, 108 background color calculation units  110-112 color components

Claims (4)

  1. An apparatus for capturing a digital image of a yarn with imaging means and determining the yarn diameter or fluff amount of the yarn, comprising:
     a backlight light source for projecting, onto the yarn during imaging, background light having a variable light source color;
     color extraction means for determining color components of the yarn from the digital image captured by the imaging means;
     arithmetic means for determining a light source color such that a color difference of at least a predetermined value is obtained between the extracted color components and the light source color of the backlight light source;
     control means for controlling the backlight light source according to the determined light source color;
     conversion means for obtaining, by digital signal processing, Fourier transform data or autocorrelation function data of a digital image of the yarn captured by the imaging means with the determined light source color as the background color; and
     yarn property calculation means for determining the yarn diameter or fluff amount of the yarn from the data obtained by the conversion means.
  2. The yarn measuring apparatus of claim 1, wherein the backlight light source is a color light source whose brightness and hue are controllable, and
     wherein the arithmetic means varies the brightness of the light source color to search for a brightness giving a color difference of at least the predetermined value from the color components of the yarn; if such a brightness exists, the control means drives the backlight light source at that brightness, and if no such brightness exists, the arithmetic means searches for a hue of the light source color giving a color difference of at least the predetermined value, and the control means drives the backlight light source at the hue found.
  3. A program for a yarn measuring apparatus comprising: an imaging device including imaging means for capturing a digital image of a yarn and a backlight light source for projecting, onto the yarn during imaging, background light having a variable light source color; and a computer that processes the digital image captured by the imaging means and controls the backlight light source, the program causing the computer to execute the steps of:
     determining color components of the yarn from the digital image captured by the imaging means;
     determining a light source color such that a color difference of at least a predetermined value is obtained between the extracted color components and the light source color of the backlight light source;
     controlling the backlight light source according to the determined light source color;
     obtaining, by digital signal processing, Fourier transform data or autocorrelation function data of a digital image of the yarn captured by the imaging means with the determined light source color as the background color; and
     determining the yarn diameter or fluff amount of the yarn from the obtained Fourier transform data or autocorrelation function data.
  4. A method of capturing a digital image of a yarn with imaging means and determining the yarn diameter or fluff amount of the yarn, comprising the steps of:
     capturing a digital image of the yarn with the imaging means while projecting, from a backlight light source onto the yarn, background light having a variable light source color;
     determining, by extraction means, color components of the yarn from the digital image captured by the imaging means;
     determining, by arithmetic means, a light source color giving a color difference of at least a predetermined value from the extracted color components;
     controlling, by control means, the backlight light source according to the determined light source color;
     obtaining, by digital signal processing with conversion means, Fourier transform data or autocorrelation function data of a digital image of the yarn captured by the imaging means with the determined light source color as the background color; and
     determining, by yarn property calculation means, the yarn diameter or fluff amount of the yarn from the data obtained by the conversion means.
PCT/JP2009/061175 2008-07-23 2009-06-19 Thread measuring apparatus, measuring program, and measuring method WO2010010767A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008-189345 2008-07-23
JP2008189345 2008-07-23

Publications (1)

Publication Number Publication Date
WO2010010767A1 true WO2010010767A1 (en) 2010-01-28

Family

ID=41570233

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/061175 WO2010010767A1 (en) 2008-07-23 2009-06-19 Thread measuring apparatus, measuring program, and measuring method

Country Status (1)

Country Link
WO (1) WO2010010767A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014075350A1 (en) * 2012-11-15 2014-05-22 深圳市华星光电技术有限公司 Linewidth measurement device
CN111815615A (en) * 2020-07-21 2020-10-23 江南大学 Double-light-source-based gray yarn image evenness information extraction method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6080821A (en) * 1983-10-07 1985-05-08 Hamamatsu Photonics Kk Excitation filter for microscope
JPS63219635A (en) * 1986-12-06 1988-09-13 ローベルト・マーセン Method and apparatus for measuring and/or monitoring characteristic of yarn or rope
JPH02257382A (en) * 1989-03-30 1990-10-18 Daiwabo Co Ltd Feather measuring instrument
JPH03238308A (en) * 1990-02-15 1991-10-24 Asahi Chem Ind Co Ltd Outer-shape measuring apparatus
JPH05172533A (en) * 1991-12-25 1993-07-09 Ono Sokki Co Ltd Dimension measuring apparatus
JPH1140983A (en) * 1997-07-18 1999-02-12 Matsushita Electric Ind Co Ltd Board mark recognition device
JP2000283725A (en) * 1999-03-29 2000-10-13 Asahi Chem Ind Co Ltd Hollow yarn measuring device and method
JP2004053583A (en) * 2002-05-22 2004-02-19 Agilent Technol Inc Controller for automatically optimizing and adjusting illuminating light


Similar Documents

Publication Publication Date Title
Feng et al. General solution for high dynamic range three-dimensional shape measurement using the fringe projection technique
JP5781743B2 (en) Image processing apparatus, image processing method, and image processing program
CN105678767B (en) A kind of cloth surface flaw detection method based on SoC Hardware/Software Collaborative Design
WO2015133287A1 (en) Surface texture indexing device, surface texture indexing method, and program
CN103759662A (en) Dynamic textile yarn diameter rapid-measuring device and method
Behera Image-processing in textiles
TW201518694A (en) Method and system for detecting luminance of a light source
CN110879131B (en) Imaging quality testing method and imaging quality testing device for visual optical system, and electronic apparatus
CN110378887A (en) Screen defect inspection method, apparatus and system, computer equipment and medium
JPH05509136A (en) Web shrinkage frequency measurement method and device
CN116703909B (en) Intelligent detection method for production quality of power adapter
CN108961331A (en) A kind of measurement method and device of twisted string pitch
CN104749801B (en) High Precision Automatic optical detecting method and system
WO2009116420A1 (en) Device, program and method for measuring yarn
CN109387524A (en) Thread defect detection method and device based on linearly polarized photon
JP2014087464A (en) Skin evaluation method and skin evaluation device
WO2010010767A1 (en) Thread measuring apparatus, measuring program, and measuring method
JP5349494B2 (en) Yarn property measuring apparatus and measuring method
CN116843687A (en) Communication optical cable surface flaw detection method and device
Prabha et al. Defect detection of industrial products using image segmentation and saliency
Sparavigna et al. The GIMP retinex filter applied to the fabric fault detection
CN110186929A (en) A kind of real-time product defect localization method
WO2018112979A1 (en) Image processing method and apparatus, and a terminal device
Dan et al. Twist detection based on machine vision
Sparavigna et al. Retinex Filtering for Fabric Fault Detection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09800282

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09800282

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP