WO2006090671A1 - Granular object inspection method and inspection apparatus implementing the method - Google Patents
Granular object inspection method and inspection apparatus implementing the method
- Publication number
- WO2006090671A1 (PCT/JP2006/302973; application JP2006302973W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- reference point
- extracted
- area
- reference points
- region
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 74
- 238000007689 inspection Methods 0.000 title claims abstract description 58
- 239000008187 granular material Substances 0.000 title claims abstract description 8
- 238000000605 extraction Methods 0.000 claims description 126
- 238000003384 imaging method Methods 0.000 claims description 23
- 239000000284 extract Substances 0.000 claims description 16
- 239000003814 drug Substances 0.000 description 79
- 229940079593 drug Drugs 0.000 description 64
- 238000003672 processing method Methods 0.000 description 23
- 238000010586 diagram Methods 0.000 description 8
- 239000005022 packaging material Substances 0.000 description 5
- 238000005286 illumination Methods 0.000 description 2
- 238000004806 packaging method and process Methods 0.000 description 2
- 241001609030 Brosme brosme Species 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000001579 optical reflectometry Methods 0.000 description 1
- 238000012856 packing Methods 0.000 description 1
- 239000002245 particle Substances 0.000 description 1
- 229940126589 solid medicine Drugs 0.000 description 1
- 239000000126 substance Substances 0.000 description 1
- 230000009747 swallowing Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/85—Investigating moving fluids or granular solids
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61J—CONTAINERS SPECIALLY ADAPTED FOR MEDICAL OR PHARMACEUTICAL PURPOSES; DEVICES OR METHODS SPECIALLY ADAPTED FOR BRINGING PHARMACEUTICAL PRODUCTS INTO PARTICULAR PHYSICAL OR ADMINISTERING FORMS; DEVICES FOR ADMINISTERING FOOD OR MEDICINES ORALLY; BABY COMFORTERS; DEVICES FOR RECEIVING SPITTLE
- A61J3/00—Devices or methods specially adapted for bringing pharmaceutical products into particular physical or administering forms
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/95—Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
- G01N21/9508—Capsules; Tablets
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/66—Trinkets, e.g. shirt buttons or jewellery items
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
Definitions
- the present invention relates to a granular object inspection method for inspecting a granular object such as a solid medicine and an inspection apparatus for performing the method.
- in a conventional method, image data is obtained by performing data processing such as binarization on an image of a medicine captured by an imaging means such as a CCD camera.
- the image is processed to determine the area, perimeter, and complexity of each object in the image, and based on the area and complexity it is determined whether the object in the image is a drug.
- complexity is a value obtained by dividing the square of the perimeter by the area.
- as shown in FIG. 13(a), there is a groove on one side of the drugs 10a and 10b, and when such drugs 10a and 10b stand upright as shown in FIG. 13(b), the groove appears as a dent in the contour; the perimeter of the drug therefore cannot be recognized correctly, and the number of drugs cannot be counted correctly.
- An object of the present invention is to provide a method for inspecting granular objects that can accurately count the number of granular objects, and an inspection apparatus that implements the method.
- an imaging region including the granular objects to be inspected is imaged, and the regions corresponding to the granular objects are identified in a digital image obtained by digitizing the pixel value of each pixel of the captured image.
- the granular objects existing in the imaging region are inspected by separating each object area from the lump area.
- a second extraction process extracts, as a base point, the reference point having the minimum count value in the counting process from among the plurality of reference points existing in the target area; a third extraction process selects all reference points that can be seen from the base point through the target area, and extracts the area enclosed by the connecting line segments joining the selected reference points and the base point as the object area of one granular object.
- the number of base points (reference points with minimum count values) extracted by repeatedly executing the second to fourth extraction processes is counted as the number of granular objects.
- the object areas corresponding to the granular objects can thus be separated one by one in the digital image, so even when a plurality of granular objects are in contact with or overlap each other, the number of granular objects can be accurately counted based on the number of extracted base points.
- for each of the base points extracted by the second to fourth extraction processes, the image processing unit may further perform a fifth extraction process that extracts, from the target area extracted by the first extraction process, all reference points that can see only that one base point through the area, as reference points belonging to that base point; the area enclosed by all the connecting line segments joining the base point and all of its belonging reference points extracted by the fifth extraction process may then be taken as the object area of the granular object corresponding to that base point. As a result, an area closer to the actual granular object area can be extracted as the object area.
- for each of the base points extracted by repeatedly executing the second to fourth extraction processes, connecting line segments may also be generated by joining pairs of the belonging reference points extracted for that base point by the fifth extraction process. As a result, an area closer to the actual granular object area can be extracted as the object area.
- a sixth extraction process may extract, from all the reference points, those not extracted as belonging reference points in the fifth extraction process, as undetermined reference points.
- for each undetermined reference point extracted in the sixth extraction process, the object area that can be seen from it through the target area extracted by the first extraction process is determined as the object area to which the undetermined reference point belongs.
- connecting line segments joining the base point corresponding to the determined object area, its belonging reference points, and the undetermined reference point may then be generated. As a result, an area closer to the actual granular object area can be extracted as the object area.
- the image processing unit may also form a plurality of search lines radiating at substantially constant angular intervals from each undetermined reference point extracted in the sixth extraction process.
- the area traversed until each search line intersects a connecting line segment joining two belonging reference points extracted by the fifth extraction process can then be added to the object area corresponding to that connecting line segment.
- in this way, the object area to which the undetermined reference point belongs is determined as the object area closest to it, and an area closer to the actual granular object area can be extracted as the object area.
- the present invention may further perform, for each of the base points extracted by repeatedly performing the second to fourth extraction processes, a seventh extraction process that extracts, as a shape determination point, a reference point that can see only that one base point through the target area extracted by the first extraction process and that lies on the opposite side of the base point across the center of the target area. If at least part of the connecting line segment between a shape determination point corresponding to one base point and a shape determination point corresponding to another base point passes outside the target area, the base points are determined to belong to different granular objects.
- when the seventh extraction process extracts no shape determination point, the image processing unit can determine that the points belong to the same granular object, so a single granular object is not erroneously identified as two and the number of granular objects can be identified accurately.
- if the difference between the maximum distance from a point on the contour line forming the first area extracted by the eighth extraction process to the first connecting line segment, and the maximum distance from a point on the contour line forming the second area to the second connecting line segment, is shorter than a predetermined reference distance, the two base points may be determined to belong to different granular objects. As a result, the time required for the arithmetic processing can be shortened compared with directly obtaining the areas of the first and second areas.
- the shape determination point having the longest distance from the corresponding reference point may be selected from among the plurality of shape determination points, and the second connecting line segment may be formed using the selected shape determination point. Thereby, the calculation time required for the determination process can be further shortened.
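The distance comparison above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names, the (x, y) point representation, and the projection-based point-to-segment distance are assumptions.

```python
import math

def point_to_segment_distance(p, a, b):
    """Distance from point p to the line segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0:
        return math.hypot(px - ax, py - ay)
    # project p onto the segment and clamp the projection to its endpoints
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def max_contour_distance(contour, a, b):
    """Maximum distance from any contour point to the connecting line segment a-b."""
    return max(point_to_segment_distance(p, a, b) for p in contour)
```

Comparing two such maxima avoids integrating over the full area of each region, which is why the patent notes the arithmetic time is shortened.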
- the present invention provides an imaging unit that captures an imaging region including the granular objects to be inspected, and
- an image processing unit having means for separating individual object areas from a lump area in a digital image obtained by digitizing the pixel value of each pixel of the image from the imaging unit, when a plurality of object areas corresponding to granular objects are in contact with each other and form one lump area.
- First extracting means for extracting the lump area from the digital image as a target area for image processing;
- Setting means for setting a plurality of reference points in a distributed manner along the outline of the target area inside the target area extracted by the first extracting means;
- Counting means for counting, for each reference point set by the setting means, the number of other reference points that can be seen from it through the target area;
- Second extraction means for extracting, as a base point, the reference point having the minimum count value by the counting means from among the plurality of reference points existing in the target area;
- Third extraction means for selecting all reference points that can be seen from the base point extracted by the second extraction means through the target area, and extracting the area enclosed by the connecting line segments joining the selected reference points and the base point as the object area of one granular object;
- Fourth extraction means for extracting, from the target area extracted by the third extraction means, the area excluding the object area as a new target area; and
- Means for counting the number of granular objects based on the number of base points extracted by repeatedly using the second to fourth extraction means. Therefore, even when a plurality of granular objects are in contact with each other or overlap, the number of granular objects can be accurately counted based on the number of extracted base points.
- FIG. 1 is a schematic configuration diagram of a granular object inspection apparatus according to a first embodiment of the present invention.
- FIG. 2 (a), (b), and (c) are explanatory diagrams of the image processing method described above.
- FIG. 3 (a) and (b) are explanatory views of the image processing method described above.
- FIGS. 4(a) and 4(b) are explanatory views of an image processing method by the granular object inspection apparatus according to Embodiment 2 of the present invention.
- FIG. 5 is an explanatory diagram of an image processing method by the granular object inspection device according to the third embodiment of the present invention.
- FIGS. 6(a) and 6(b) are explanatory diagrams of an image processing method by the granular object inspection apparatus according to Embodiment 4 of the present invention.
- FIG. 7 is an explanatory diagram showing the results of the image processing described above.
- FIGS. 8A and 8B are explanatory diagrams of an image processing method by the granular object inspection apparatus according to the fifth embodiment of the present invention.
- FIG. 9 is an explanatory diagram of an image processing method by the granular object inspection device according to the sixth embodiment of the present invention.
- FIG. 10 is an explanatory view of an image processing method by the granular object inspection device according to the seventh embodiment of the present invention.
- FIG. 11 is an explanatory diagram of an image processing method by a granular object inspection device according to an eighth embodiment of the present invention.
- FIG. 12 (a), (b), and (c) are images of granular objects to be inspected, showing an example in which two granular objects are placed on the inspection table in contact with or overlapping each other.
- FIG. 13 (a) and (b) are images of a granular object to be inspected, showing an example in which a granular object having a groove on one side is placed on an inspection table in an upright state.
- FIG. 1 shows a schematic configuration of a granular object inspection apparatus according to the present embodiment.
- This granular object inspection device has an inspection table 1 on which the granular objects to be inspected (drugs 10a and 10b) are placed,
- imaging means 2 (for example, a CCD camera) that is installed above the inspection table 1 and images the drugs 10a and 10b,
- and an illumination device 3 that is arranged on the same side as the imaging means 2 with respect to the inspection table 1 and irradiates the granular objects placed on the inspection table 1 with light.
- An image storage unit 4 for storing a binary image obtained by binarizing, with an appropriate threshold, the grayscale information of the image signal captured by the imaging means 2,
- an image processing unit 5 that performs image processing on the binary image stored in the image storage unit 4 and separates it into object areas corresponding to individual granular objects, and an inspection determination unit 6 that counts the number of granular objects based on the number of object areas separated by the image processing unit 5 are also provided.
- the image storage unit 4, the image processing unit 5, and the inspection determination unit 6 constitute an image processing / inspection determination unit 7.
- the light irradiated from the lighting device 3 increases the luminance difference between the background portion and the portion corresponding to the medicines 10a and 10b.
- the surface of the inspection table 1 has low light reflectivity, so when the illumination device 3 irradiates the inspection table 1 with light, the portions corresponding to the drugs 10a and 10b appear bright and the background portion (the surface of the inspection table 1) appears dark in the image captured by the imaging means 2. In this way, a relatively large luminance difference is generated between the contours of the medicines 10a and 10b and the background.
- alternatively, the image signal picked up by the imaging means 2 may be stored in the image storage unit 4 as grayscale image data, without binarization.
- with this grayscale image, differential processing and the like can be applied, so the determination accuracy can be improved.
- a binary image obtained by binarizing the image signal picked up by the imaging means 2 is a digital image having only the binary pixel values 0 and 1, and is stored in the image storage unit 4, which is composed of RAM.
- the image storage unit 4 is not only used as a storage area for binary images, but also used as a storage area for work in various image processing described later.
- the binary image stored in the image storage unit 4 is input to the image processing unit 5 and subjected to image processing described below. Then, the image processing unit 5 recognizes the shapes of the drugs 10a and 10b, and the examination determination unit 6 identifies the number of the drugs 10a and 10b based on the recognition result.
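As a rough sketch of the binarization step described above, assuming a list-of-lists grayscale image and an arbitrary threshold value (both illustrative choices, not details taken from the patent):

```python
def binarize(gray, threshold=128):
    """Return a 0/1 binary image: 1 for pixels at or above the threshold, else 0."""
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

# bright drug pixels become 1, the dark background (inspection table) becomes 0
binary = binarize([[10, 200], [130, 90]])
```

The resulting 0/1 image is what the visibility tests in the later extraction processes operate on.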
- a monitor device such as a CRT or a liquid crystal display is connected to the image processing unit 5.
- an image captured by the imaging means 2, a binary image produced by a binarization processing unit (not shown), the recognition result by the image processing unit 5, and the like are displayed.
- in the method for inspecting granular objects, for example, a plurality of medicines 10a and 10b to be packaged in one packaging bag are placed on the inspection table 1, and from the image of the medicines 10a and 10b captured by the imaging means 2, it is judged whether or not the medicines placed on the inspection table 1 are present in the correct number.
- the drugs 10a and 10b whose number has been confirmed are then packaged in a single package using packaging material.
- if the packaging material is transparent or translucent and an image is obtained in which the outlines of the drugs 10a and 10b can be recognized in the same way as without packaging when they are imaged by the imaging means 2, the medicines 10a and 10b may be placed on the inspection table 1 and imaged after being wrapped in the packaging material.
- FIG. 2(a) illustrates the binary image stored in the image storage unit 4 when two round tablets (drugs 10a and 10b) are placed on the inspection table 1 with parts of them overlapping each other.
- in this binary image, one continuous area (hereinafter referred to as lump area A1) corresponding to the two drugs 10a and 10b appears.
- areas corresponding to the individual drugs 10a and 10b are referred to as object areas.
- the image processing unit 5 performs a first extraction process for extracting the lump area A1 from the binary image stored in the image storage unit 4 as a target area for image processing. Then, a setting process is performed in which appropriate pixels in the vicinity of the outline of the extracted lump area A1 are set as reference points.
- the reference points are set so as to be distributed at substantially constant intervals inside the lump area A1, on the outline of the lump area A1 or within a few pixels inside the outline.
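The setting process can be sketched minimally as follows, assuming the contour is already available as an ordered list of (x, y) pixels and choosing an illustrative spacing; contour tracing itself is omitted:

```python
def set_reference_points(contour, spacing):
    """Pick every `spacing`-th contour pixel as a reference point."""
    return contour[::spacing]

contour = [(x, 0) for x in range(12)]      # a toy 12-pixel contour
points = set_reference_points(contour, 4)  # -> [(0, 0), (4, 0), (8, 0)]
```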
- FIG. 2(c) illustrates the binary image stored in the image storage unit 4 when a single circular tablet (drug 10a) is imaged.
- in this binary image, an area corresponding to the single medicine 10a (referred to as lump area A2) appears.
- the image processing unit 5 sets reference points (for example, P1, P2, P3, P4, P5, ...) distributed along the outline of the lump area A2, and forms connecting line segments (for example, S(1-20), S(2-10), S(3-15), S(5-30), ...) joining the reference points to each other.
- the image processing unit 5 then performs a counting process that counts the connecting line segments whose pixels all have the same pixel value (0 or 1). As a result, for each reference point, the number of other reference points that can be seen from it through the lump area A2 is counted.
- let arbitrary reference points set in the lump area A2 be a reference point Pm and a reference point Pn, and let the line segment connecting them be the connecting line segment S(m-n).
- if this connecting line segment S(m-n) passes only through the lump area A2, the pixel values of all the pixels on it are the same value (0 or 1).
- by comparing the pixel values of the pixels on the line segment, it is therefore determined whether or not the connecting line segment S(m-n) passes only within the lump area A2.
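A minimal sketch of this visibility test follows. Sampling the segment by rounding interpolated coordinates is an assumption for illustration; the patent only requires comparing the pixel values of the pixels on the connecting line segment:

```python
def visible(img, p, q):
    """True if the segment p-q stays entirely inside the region (pixel value 1)."""
    (x0, y0), (x1, y1) = p, q
    steps = max(abs(x1 - x0), abs(y1 - y0), 1)
    for i in range(steps + 1):
        # sample one pixel per step along the segment
        x = round(x0 + (x1 - x0) * i / steps)
        y = round(y0 + (y1 - y0) * i / steps)
        if img[y][x] == 0:
            return False
    return True

# A region with a background notch in its top row:
img = [
    [1, 1, 0, 1, 1],
    [1, 1, 1, 1, 1],
]
# points on opposite sides of the notch cannot see each other along row 0,
# but can along row 1
```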
- returning to the lump area A1, the image processing unit 5 first distributes a plurality of reference points at substantially constant intervals along the contour of the lump area A1.
- next, for each reference point (for example, P1, P2, P3, P11, P12, ...), the image processing unit 5 forms connecting line segments to the other reference points (for example, P32, P28, P10, P25, P21, ...), such as S(2-28), S(3-10), S(11-25), and S(12-21).
- some connecting line segments (for example, the line segment S(3-10)) pass outside the lump area A1.
- the count value is the number of other reference points that can be seen, through the lump area A1, from each of the plurality of reference points existing in the lump area A1.
- the count values of reference points near the overlapping portion of the medicines 10a and 10b (for example, reference points P5, P6, P25, P30) are smaller than those of reference points in other parts.
- the image processing unit 5 performs the counting process to obtain a count value for each of the plurality of reference points existing in the target area (lump area A1).
- using the count values, the image processing unit 5 then performs a second extraction process that extracts, as a base point, the reference point having the minimum count value (see Table 1) from among the plurality of reference points in the lump area A1.
- in the present embodiment, the image processing unit 5 extracts reference point P3 as the base point. Note that, as a result of the counting process, there may be a plurality of reference points whose count values are the minimum, whether among the reference points of the same granular object or among those of different granular objects.
- in this case, the image processing unit 5 arbitrarily selects any one of the reference points having the smallest count value, extracts it as the base point, and processes it as described later. The result is the same regardless of which point is used as the base point, and the number of granular objects is counted correctly.
- for the lump area A2, by contrast, the image processing unit 5 determines that it corresponds to a single granular object, terminates the subsequent processing, and determines the number of granular objects corresponding to the lump area A2 as one.
- next, as shown in FIG. 3(a), the image processing unit 5 selects all reference points (for example, P1, P2, P4 to P7, P15 to P45) that can be seen from the base point P3 through the lump area A1, and forms connecting line segments joining the base point P3 to the selected reference points. Then, the image processing unit 5 performs a third extraction process that extracts the area enclosed by these connecting line segments as an object area B1 corresponding to one granular object (see FIG. 3(b)).
- the image processing unit 5 then performs a fourth extraction process that removes the object area B1 from the lump area A1 and extracts the remaining area as a new target area A3 (not shown). Among the reference points present in this target area A3 (for example, P8, P9, ...), the reference point with the smallest count value by the counting process (see Table 1), here reference point P10, is extracted as a new base point.
- the image processing unit 5 selects all reference points (for example, P8, P9, ...) that can be seen from the base point P10 through the target area A3,
- forms connecting line segments joining the selected reference points and the base point P10 to each other,
- and extracts the area enclosed by these connecting line segments as an object area B2 corresponding to one granular object (see FIG. 3(b)).
- the combined area of the object areas B1 and B2 is substantially equal to the lump area A1.
- after the image processing unit 5 completes the base point extraction processing, the inspection determination unit 6 determines the number of extracted base points (two in the present embodiment) as the number of granular objects existing in the lump area A1.
- in this way, the image processing unit 5 extracts the reference point having the smallest count value among the reference points existing in the target area as a base point (second extraction process), extracts the area enclosed by the connecting line segments joining this base point to the reference points that can be seen from it through the target area as an object area (third extraction process), and then removes the object area from the target area to obtain a new target area (fourth extraction process).
- by repeating these processes, the object areas B1 and B2 can be separated one by one from the lump area A1, and even when a plurality of granular objects touch or overlap, the number of granular objects can be accurately counted based on the number of extracted base points.
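The repeated second to fourth extraction processes can be sketched as the following loop. The visibility helper, the data layout, and tie-breaking by list order are illustrative assumptions, not details prescribed by the patent:

```python
def visible(img, p, q):
    """True if the segment p-q stays entirely inside the region (pixel value 1)."""
    (x0, y0), (x1, y1) = p, q
    steps = max(abs(x1 - x0), abs(y1 - y0), 1)
    return all(
        img[round(y0 + (y1 - y0) * i / steps)][round(x0 + (x1 - x0) * i / steps)] == 1
        for i in range(steps + 1)
    )

def count_granular_objects(img, reference_points):
    remaining = list(reference_points)
    count = 0
    while remaining:
        # second extraction process: base point = minimum visibility count
        base = min(remaining,
                   key=lambda p: sum(visible(img, p, q) for q in remaining if q != p))
        # third extraction process: everything the base point can see is one object
        group = [q for q in remaining if q == base or visible(img, base, q)]
        # fourth extraction process: the rest becomes the new target area
        remaining = [q for q in remaining if q not in group]
        count += 1
    return count
```

On a dumbbell-shaped lump (two blobs joined by a narrow neck), reference points on one blob mostly cannot see points on the other, so the loop peels off one object area per iteration.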
- A granular object inspection method according to Embodiment 2 of the present invention and an inspection apparatus that performs the method will be described.
- the configuration of the inspection apparatus is the same as that of Embodiment 1; components identical to those of Embodiment 1 are denoted by the same reference numerals as in Embodiment 1, and their description is omitted (the same applies hereinafter).
- the object area B1 shown in FIG. 3(b) includes not only the area corresponding to the medicine 10a but also part of the area corresponding to the medicine 10b. That is, the image processing method described in Embodiment 1 can extract an area that differs considerably from the actual object areas of the medicines 10a and 10b.
- in the present embodiment, the image processing unit 5 extracts object areas close to the actual areas of the medicines 10a and 10b by performing the image processing described below. As a result, the number of granular objects can be counted more accurately.
- first, as in Embodiment 1, the image processing unit 5 extracts the two base points P3 and P10 by performing the second to fourth extraction processes.
- next, for each base point (P3 or P10), the image processing unit 5 performs a fifth extraction process that extracts, from all the reference points, those that can see only that one base point through the area; the extracted reference points are treated as belonging reference points of that base point.
- specifically, for every reference point in the lump area A1, the image processing unit 5 forms the two connecting line segments joining that reference point to the base points P3 and P10.
- if only one of these segments passes entirely through the area, the reference point is determined to be one that can see only one base point.
- reference points that can see only one of the base points P3 and P10 corresponding to the drugs 10a and 10b exist around the overlapping portion of the drugs 10a and 10b (the constricted part of the lump area A1).
- the image processing unit 5 extracts the reference points P1, P2, P4, P5, P28, and P29 as the belonging reference points G1 of the base point P3, and the reference points P8, P9, P11, P12, P24, and P25 as the belonging reference points G2 of the base point P10.
- after extracting the belonging reference points G1 and G2 of the respective base points P3 and P10, the image processing unit 5 forms, as shown in FIG. 4(b), connecting line segments joining the base point P3 to its belonging reference points G1 (for example, S(3-1), S(3-2), S(3-4), S(3-5), S(3-28), S(3-29)), and extracts the area enclosed by these connecting line segments as the object area B3 of the drug 10a corresponding to the base point P3.
- that is, the image processing unit 5 treats the belonging reference points G1, which can see only the one base point P3, as reference points of the granular object to which the base point P3 belongs,
- and extracts the area enclosed by the connecting line segments to the belonging reference points G1 as the object area corresponding to the base point P3.
- the same processing is performed for the other base point P10, and the object area corresponding to the base point P10 is extracted.
- in this way, the image processing unit 5 can extract areas close to the actual granular object areas of the medicines 10a and 10b as the object areas.
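The fifth extraction process can be sketched as follows: a reference point belongs to a base point when it can see exactly one of the base points through the region. The helper function and data layout are illustrative assumptions, not the patented implementation:

```python
def visible(img, p, q):
    """True if the segment p-q stays entirely inside the region (pixel value 1)."""
    (x0, y0), (x1, y1) = p, q
    steps = max(abs(x1 - x0), abs(y1 - y0), 1)
    return all(
        img[round(y0 + (y1 - y0) * i / steps)][round(x0 + (x1 - x0) * i / steps)] == 1
        for i in range(steps + 1)
    )

def belonging_reference_points(img, reference_points, base_points):
    """Group each reference point that sees exactly one base point under that base point."""
    groups = {b: [] for b in base_points}
    for p in reference_points:
        if p in base_points:
            continue
        seen = [b for b in base_points if visible(img, p, b)]
        if len(seen) == 1:  # sees only one base point -> belongs to it
            groups[seen[0]].append(p)
    return groups
```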
- a granular object inspection method according to Embodiment 3 of the present invention and an inspection apparatus for performing the method will be described.
- two image processing units 5 are provided.
- the second to fourth extraction processes described in the first embodiment are repeatedly executed to extract the two reference points P3 and P10, and then the reference point belonging reference points G1 and G2 belonging to the respective reference points P3 and P10 are extracted by the fifth extraction process described in the second embodiment.
- the image processing unit 5 generates, for one reference point P3, connecting line segments that connect the reference point P3 and all of the reference points belonging to the reference point belonging reference points G1, and extracts the area surrounded by these connecting line segments (referred to as B5) as the object region of the medicine 10a corresponding to the reference point P3. For the other reference point P10, connecting line segments connecting the reference point P10 and all of the reference points belonging to the reference point belonging reference points G2 are generated, and the area surrounded by these connecting line segments (referred to as B6) is extracted as the object region of the medicine 10b corresponding to the reference point P10. Thereby, a region closer to the actual regions of the medicines 10a and 10b can be extracted as the object region than in the second embodiment.
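The enclosed-area idea of this embodiment can be sketched as a polygon fill: the belonging reference points, ordered by angle around the reference point, form a closed polygon, and the pixels inside it are the object region. The coordinates and the even-odd fill rule below are illustrative assumptions, not the patent's exact procedure.

```python
import math

# Sketch: the object region B5 is the area enclosed by the connecting line
# segments, approximated as the polygon through the belonging reference
# points ordered by angle around the reference point. Hypothetical data.

def enclosed_region(ref_point, belonging_points, width, height):
    cx, cy = ref_point
    poly = sorted(belonging_points,
                  key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
    def inside(x, y):
        hits = False
        n = len(poly)
        for i in range(n):
            (x0, y0), (x1, y1) = poly[i], poly[(i + 1) % n]
            if (y0 > y) != (y1 > y):          # edge straddles the scan row
                t = (y - y0) / (y1 - y0)
                if x < x0 + t * (x1 - x0):    # crossing lies to the right
                    hits = not hits
        return hits
    return {(x, y) for x in range(width) for y in range(height) if inside(x, y)}

P3 = (5, 5)                              # hypothetical reference point
G1 = [(2, 2), (8, 2), (8, 8), (2, 8)]    # hypothetical belonging points
B5 = enclosed_region(P3, G1, 11, 11)
```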
- a granular object inspection method according to Embodiment 4 of the present invention and an inspection apparatus for performing the method will be described.
- portions where the drugs 10a and 10b are in contact with each other or overlap are extracted as object regions.
- the non-overlapping part was not extracted as an object area.
- object regions B5 and B6 are extracted only in the overlapping portions of the regions corresponding to the respective drugs 10a and 10b, and the regions C1 and C2 are not extracted as object regions.
- after the image processing unit 5 extracts the object regions B5 and B6 by the method described in the third embodiment, the reference points in the two regions that do not belong to the object regions B5 and B6 (the undecided regions C1 and C2) are extracted as unidentified reference points. Then, the following processing is performed between the unidentified reference points extracted by this sixth extraction process and the reference points in the object regions B5 and B6. For example, as shown in FIG. 6 (b), the image processing unit 5 starts from the unidentified reference points (for example, P30, P31, P32, P33, P34, ...) in the undecided region C1.
- connecting line segments are formed that connect each unidentified reference point and the reference points in the object regions B5 and B6, and it is detected which object region each connecting line segment passes through first. Then, the image processing unit 5 determines the object region through which the connecting line segment starting from each unidentified reference point first passes as the object region to which that unidentified reference point belongs. For example, when connecting line segments (for example, S(30-1), S(30-9), ...) are formed that connect the reference point P30 in the undecided region C1 to the reference points P1, P11, and so on in the object regions B5 and B6, every connecting line segment passes through the object region B5 first, so the image processing unit 5 judges that the reference point P30 belongs to the object region of the medicine 10a.
- the image processing unit 5 performs the above processing on all the unidentified reference points in the undecided regions C1 and C2, and thereby determines to which granular object's object region each unidentified reference point belongs. Then, the area surrounded by the connecting line segments connecting each unidentified reference point and the reference points in the object region through which they first pass is included in the object region B5 or B6, so the object regions B5 and B6 can be brought close to the regions corresponding to the actual drugs 10a and 10b.
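The "first object region passed through" rule can be sketched as follows: walk along each connecting segment pixel by pixel and assign the unidentified reference point to the region hit at the earliest step. The regions, reference points, and coordinates below are hypothetical stand-ins for B5, B6, and P30.

```python
# Sketch of the Embodiment 4 rule: an unidentified reference point belongs to
# the object region that a connecting line segment starting from it passes
# through first. Hypothetical one-dimensional layout for illustration.

def segment_pixels(p0, p1, samples=200):
    (x0, y0), (x1, y1) = p0, p1
    pixels = []
    for i in range(samples + 1):
        t = i / samples
        pt = (round(x0 + t * (x1 - x0)), round(y0 + t * (y1 - y0)))
        if not pixels or pixels[-1] != pt:
            pixels.append(pt)
    return pixels

def assign_unidentified(point, ref_points, regions):
    """Walk every connecting segment; the region hit earliest wins."""
    best = None                                # (step, region id)
    for ref in ref_points:
        for step, pix in enumerate(segment_pixels(point, ref)):
            hit = next((rid for rid, reg in regions.items() if pix in reg), None)
            if hit is not None:
                if best is None or step < best[0]:
                    best = (step, hit)
                break
    return best[1] if best else None

regions = {"B5": {(2, 0), (3, 0)}, "B6": {(6, 0)}}   # hypothetical regions
P30 = (0, 0)                                         # unidentified point
owner = assign_unidentified(P30, [(5, 0), (7, 0)], regions)
```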
- alternatively, the image processing unit 5 may form connecting line segments that connect the individual unidentified reference points extracted by the above processing to the other unidentified reference points belonging to the same object region, extract the regions surrounded by these connecting line segments as object regions B7 and B8 corresponding to the respective drugs, and extract the regions obtained by combining the object regions B5 and B6 extracted by the processing method of the second embodiment with the newly extracted object regions B7 and B8 as object regions B9 and B10 (see FIG. 7).
- thereby, regions close to the regions of the actual drugs 10a and 10b can be extracted as the object regions.
- a granular object inspection method according to Embodiment 5 of the present invention and an inspection apparatus that performs the method will be described.
- the present embodiment is different from the image processing method of the fourth embodiment described above in the method for obtaining the object region to which the unidentified reference point belongs.
- the object region that can be seen from the unidentified reference point extracted in the sixth extraction process is set as the object region to which the unidentified reference point belongs.
- a plurality of search lines (for example, L1, L2, ..., L8) extending radially from a certain unidentified reference point Pn are formed at a substantially constant angle (for example, about 45 degrees).
- for each of the search lines extending radially, the image processing unit 5 finds the first intersection with a connecting line segment connecting a reference point and the reference point belonging reference points extracted in the fifth extraction process, and determines the area up to that first intersection as belonging to the object region corresponding to the first intersected connecting line segment.
- that is, the image processing unit 5 extracts as the object region not only the connecting line segments connecting a reference point and the reference point belonging reference points belonging to it, but also the area between the unidentified reference point and those connecting line segments.
- the image processing unit 5 forms a plurality of search lines extending radially from a certain unidentified reference point Pn at a substantially constant angle, but some of the search lines (for example, L1) radiate from the reference point Pn toward the side opposite to the object region and therefore do not intersect any connecting line segment. Therefore, as shown in the figure, when such search lines (for example, L1 and L3) are incident on the contour line L0 of the lump area A1, the image processing unit 5 reflects them at an exit angle substantially equal to the incident angles θ1 and θ3, and the area up to the first connecting line segment that the reflected search lines L1' and L3' intersect may be determined as the object region to which that connecting line segment belongs.
- thereby, the area on the contour line L0 side with respect to the unidentified reference point Pn can also be extracted as the object region, and a region closer to the actual granular object region can be extracted as the object region.
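The reflection of a search line at the contour, with the exit angle equal to the incident angle, is the standard vector reflection: for a direction vector d and a unit contour normal n, the reflected direction is d - 2(d·n)n. The following is a minimal sketch; the direction and normal vectors are hypothetical examples, not values from the patent's figures.

```python
# Sketch of the search-line reflection: a direction vector d reflected off a
# surface with unit normal n becomes d - 2(d.n)n, so the exit angle equals
# the incident angle. Vectors below are illustrative.

def reflect(direction, normal):
    dx, dy = direction
    nx, ny = normal          # unit normal of the contour at the hit point
    dot = dx * nx + dy * ny
    return (dx - 2 * dot * nx, dy - 2 * dot * ny)

L1_out = reflect((1, -1), (0, 1))   # a line hitting a horizontal contour
```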
- a granular object inspection method according to Embodiment 6 of the present invention and an inspection apparatus for performing the method will be described.
- an area closer to the actual areas of the medicines 10a and 10b can be extracted as the object regions B5, B6, and the like.
- the identification number of the object region to which the connecting line segment first intersected by each created search line belongs is assigned to that search line and stored in the image storage unit 4. After identification numbers have been assigned to all search lines, the image processing unit 5 examines, for each reference point, the identification numbers assigned to the search lines starting from that reference point, and judges that the object region whose identification number is assigned to the largest number of search lines is the object region to which the reference point belongs.
- the image processing unit 5 determines that this reference point P27 belongs to the object region B5. The image processing unit 5 performs the above-described determination process for each reference point existing in the intermediate area C3.
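The majority-vote judgement described above can be sketched in a few lines: each search line carries the identification number of the object region whose connecting line segment it intersected first, and the region assigned to the most search lines wins. The label list below is a hypothetical example for a reference point such as P27; `None` stands for a search line that intersected no segment.

```python
from collections import Counter

# Sketch of the Embodiment 6 judgement: pick the object region whose
# identification number was assigned to the most search lines.
# The labels are hypothetical.

def region_by_vote(search_line_labels):
    counts = Counter(l for l in search_line_labels if l is not None)
    return counts.most_common(1)[0][0] if counts else None

labels_P27 = ["B5", "B5", "B6", None, "B5", "B6", "B5", None]
owner = region_by_vote(labels_P27)
```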
- a granular object inspection method according to Embodiment 7 of the present invention and an inspection apparatus that performs the method will be described.
- the number of granular objects can be accurately counted even when a plurality of granular objects to be inspected overlap or are in contact with each other.
- when the image processing unit 5 performs the image processing described in the above embodiments, a part of the contour line is dented, so two reference points are extracted.
- the image processing unit 5 can determine the exact number of granular objects even when the medicine 10a is standing.
- the reference points P3 and P10 are extracted by repeatedly executing the second to fourth extraction processes described in the first embodiment.
- each reference point (for example, P1, P2, ...) in the area of the lump area A1 is processed as described below. That is, for each of the plurality of reference points (P3 or P10), the image processing unit 5 extracts, from the plurality of reference points P in the lump area A1, all the reference points that can be seen through the region of the lump area A1 only via that reference point (P3 or P10), and extracts among them the reference points located on the opposite side of that reference point across the center of the lump area A1 as shape determination points.
- for the reference point P3, a reference point group consisting of the reference points P1, P2, P4, and P5 on the reference point P3 side (reference point group C4) and a reference point group consisting of the reference points P26, P27, and P28 on the opposite side across the central portion of the lump area A1 (reference point group C5) are extracted. Then, the image processing unit 5 extracts the reference points P26, P27, and P28 belonging to the reference point group C5 as shape determination points.
- similarly, for the reference point P10, a reference point group consisting of the reference points P8, P9, P11, and P12 on the reference point P10 side and a reference point group consisting of the reference points P23, P24, and P25 on the opposite side across the center of the lump area A1 are extracted, and the reference points P23, P24, and P25 belonging to the latter group are extracted as shape determination points.
- a connecting line segment is formed that connects a shape determination point (P26, P27, or P28) corresponding to the reference point P3 and a shape determination point (P23, P24, or P25) corresponding to the reference point P10.
- the pixel value of the background portion in the binary image is 0 and the pixel value of the drug portion is 1
- when a part of the connecting line segment connecting the reference points P3 and P10 lies outside the area of the lump area A1, there is a part where the pixel values at both ends (reference points P3, P10) of the connecting line segment are 1 while the pixel value of an intermediate part is 0.
- the two drugs 10a and 10b are overlapped or contacted.
- the connecting line segment S(24-29) passes outside the block area A1
- the pixel value at both ends of the connecting line segment is 1, and the intermediate pixel value is 0.
- the image processing unit 5 forms a connecting line segment by connecting the shape determination points corresponding to the two reference points, and if at least a part of this connecting line segment passes outside the area of the lump area A1, it can be determined that these two reference points belong to different granular objects.
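The pixel-value test described above can be sketched directly on a binary image: both endpoints of the connecting segment must lie on the drug (value 1) while some intermediate pixel lies on the background (value 0). The dumbbell-shaped one-row binary image below is a hypothetical stand-in for two touching drugs.

```python
# Sketch of the Embodiment 7 test: a connecting segment whose endpoints have
# pixel value 1 but whose middle crosses pixel value 0 passes outside the
# lump area, indicating two distinct granular objects. Hypothetical image.

def segment_pixels(p0, p1, samples=200):
    (x0, y0), (x1, y1) = p0, p1
    pixels = []
    for i in range(samples + 1):
        t = i / samples
        pt = (round(x0 + t * (x1 - x0)), round(y0 + t * (y1 - y0)))
        if not pixels or pixels[-1] != pt:
            pixels.append(pt)
    return pixels

def segment_exits_region(binary, p0, p1):
    """True if both endpoints are on the object but the middle hits background."""
    pts = segment_pixels(p0, p1)
    if binary[p0[1]][p0[0]] != 1 or binary[p1[1]][p1[0]] != 1:
        return False
    return any(binary[y][x] == 0 for (x, y) in pts[1:-1])

binary = [[1, 1, 0, 1, 1]]       # one-row image with a background gap
different = segment_exits_region(binary, (0, 0), (4, 0))
```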
- the image processing unit 5 can extract the two reference points P3 and P10 by the above-described image processing. Even if shape determination points are searched for the reference points P3 and P10, the reference points on the opposite side across the center of the lump area A1 can be seen directly from the reference points P3 and P10, so no shape determination points are extracted. As described above, when the determination condition that a part of the connecting line segment connecting the shape determination points passes through the region outside the lump area A1 is not satisfied, the image processing unit 5 judges that the two reference points P3 and P10 are reference points belonging to the same granular object. As a result, even if two reference points are extracted because a granular object is standing, the image processing unit 5 does not erroneously detect that two granular objects exist, and the number of granular objects can be counted correctly.
- the image processing unit 5 compares the area of the region surrounded by the connecting line segment connecting the two reference points P3 and P10 (referred to as the first connecting line segment Sa) and the contour line of the lump area A1 (referred to as the first region Da) with the area of the region surrounded by the connecting line segment connecting the shape determination points P29 and P24 corresponding to the reference points P3 and P10, respectively (referred to as the second connecting line segment Sb), and the contour line of the lump area A1 (referred to as the second region Db).
- the areas of the first region Da and the second region Db are substantially equal.
- the image processing unit 5 compares the area of the first region Da with the area of the second region Db, and can determine whether or not the two reference points belong to different granular objects depending on whether or not the difference between the two is smaller than a predetermined reference area.
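The area comparison can be sketched with the shoelace formula applied to the contours bounding the two regions; a difference below the reference area suggests the two reference points belong to different granular objects. The contours and the reference-area value below are hypothetical examples, not values from the patent.

```python
# Sketch of the Da/Db area comparison using the shoelace formula.
# Contours and the reference area are hypothetical.

def polygon_area(vertices):
    """Shoelace formula for the area enclosed by a closed polygon."""
    s = 0.0
    n = len(vertices)
    for i in range(n):
        x0, y0 = vertices[i]
        x1, y1 = vertices[(i + 1) % n]
        s += x0 * y1 - x1 * y0
    return abs(s) / 2.0

Da = [(0, 0), (4, 0), (4, 4), (0, 4)]      # hypothetical contour of Da
Db = [(0, 0), (4, 0), (4, 4), (1, 4)]      # hypothetical contour of Db
diff = abs(polygon_area(Da) - polygon_area(Db))
different_objects = diff < 3.0             # hypothetical reference area
```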
- since it takes a relatively long time for the image processing unit 5 to calculate the areas of the first region Da and the second region Db, the calculation time can be shortened by the following processing as compared with the case of obtaining the area values. As shown in FIG. 11 (b), the image processing unit 5 draws perpendiculars from the points on the contour line surrounding the first region Da to the first connecting line segment Sa and calculates the maximum distance between the first connecting line segment Sa and the points on the contour line (maximum distance E1); likewise, perpendiculars are drawn from the points on the contour line surrounding the second region Db to the second connecting line segment Sb, and the maximum distance between the second connecting line segment Sb and the points on the contour line (maximum distance E2) is calculated.
- the maximum distances E1 and E2 are substantially equal.
- the image processing unit 5 compares the maximum distances E1 and E2, and can determine whether or not the two reference points belong to different granular objects depending on whether or not the difference between the two is shorter than a predetermined reference distance. Thus, even if two reference points are extracted because a granular object is standing, the image processing unit 5 does not erroneously detect that two granular objects exist. In addition, the inspection determination unit 6 can accurately count the number of granular objects based on the number of reference points belonging to different granular objects.
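The shortcut comparison replaces areas with the maximum perpendicular distance from the contour points to each connecting line segment (the E1 and E2 values). A minimal sketch follows; the contour points, segment endpoints, and reference distance are hypothetical.

```python
import math

# Sketch of the E1/E2 comparison: the maximum perpendicular distance from
# contour points to the line through each connecting segment, compared
# against a reference distance. All values are hypothetical.

def max_distance_to_segment(contour_points, seg_start, seg_end):
    (x0, y0), (x1, y1) = seg_start, seg_end
    length = math.hypot(x1 - x0, y1 - y0)
    best = 0.0
    for px, py in contour_points:
        # perpendicular distance from the point to the line through the segment
        d = abs((x1 - x0) * (y0 - py) - (x0 - px) * (y1 - y0)) / length
        best = max(best, d)
    return best

E1 = max_distance_to_segment([(1, 2), (3, 1)], (0, 0), (4, 0))
E2 = max_distance_to_segment([(1, -2), (2, -1)], (0, 0), (4, 0))
different_objects = abs(E1 - E2) < 0.5      # hypothetical reference distance
```

This avoids the region fill entirely: only the contour points are visited once per segment, which is why the patent notes it is faster than computing the area values.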
- a plurality of shape determination points are extracted for each of the reference points P3 and P10.
- the above-described determination process may be performed for all shape determination points; however, since the determination process is then performed a plurality of times, the time required for the calculation process increases. Therefore, the image processing unit 5 preferably performs the above-described determination process only for the shape determination point having the maximum distance value with respect to the reference points P3 and P10 among the plurality of shape determination points extracted for them. As a result, the calculation time required for the determination process can be shortened.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biochemistry (AREA)
- Immunology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Analytical Chemistry (AREA)
- Pathology (AREA)
- Quality & Reliability (AREA)
- Multimedia (AREA)
- Geometry (AREA)
- Medicinal Chemistry (AREA)
- Pharmacology & Pharmacy (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Image Analysis (AREA)
- Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
- Image Processing (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2006800056176A CN101175990B (zh) | 2005-02-23 | 2006-02-20 | 粒状物体的检查方法及实施该方法的检查装置 |
KR1020077019019A KR100929475B1 (ko) | 2005-02-23 | 2006-02-20 | 입상물체의 검사방법 및 그 방법을 실시하는 검사장치 |
EP06714113A EP1852693A4 (en) | 2005-02-23 | 2006-02-20 | METHOD FOR EXAMINING GRANULAR MATERIAL AND EXAMINATION DEVICE FOR CARRYING OUT SAID METHOD |
US11/816,536 US7916949B2 (en) | 2005-02-23 | 2006-02-20 | Method of inspecting granular material and inspection device for conducting that method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005-048033 | 2005-02-23 | ||
JP2005-048032 | 2005-02-23 | ||
JP2005048033A JP4639841B2 (ja) | 2005-02-23 | 2005-02-23 | 粒状物体の検査方法及びそれを用いる検査装置 |
JP2005048032A JP4639840B2 (ja) | 2005-02-23 | 2005-02-23 | 粒状物体の検査方法及びそれを用いる検査装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006090671A1 true WO2006090671A1 (ja) | 2006-08-31 |
Family
ID=36927311
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/302973 WO2006090671A1 (ja) | 2005-02-23 | 2006-02-20 | 粒状物体の検査方法及びその方法を実施する検査装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US7916949B2 (ja) |
EP (1) | EP1852693A4 (ja) |
KR (1) | KR100929475B1 (ja) |
CN (1) | CN101175990B (ja) |
WO (1) | WO2006090671A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010061646A (ja) * | 2008-08-08 | 2010-03-18 | Make Softwear:Kk | 画像処理装置、画像出力装置、画像処理方法及びコンピュータプログラム |
CN109073564A (zh) * | 2016-04-22 | 2018-12-21 | 富士胶片株式会社 | 药剂监查装置及方法以及程序 |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006090671A1 (ja) * | 2005-02-23 | 2006-08-31 | Matsushita Electric Works, Ltd. | 粒状物体の検査方法及びその方法を実施する検査装置 |
JP4755714B2 (ja) * | 2009-11-17 | 2011-08-24 | 株式会社湯山製作所 | 薬剤払出装置 |
JP5886209B2 (ja) * | 2010-12-17 | 2016-03-16 | パナソニックヘルスケアホールディングス株式会社 | 錠剤鑑査装置 |
KR102223436B1 (ko) * | 2012-10-03 | 2021-03-05 | 가부시키가이샤 유야마 세이사쿠쇼 | 약제 감사 시스템, 권취 장치, 조출 장치 및 홀더 |
US9994347B2 (en) * | 2013-02-20 | 2018-06-12 | Yuyama Mfg. Co., Ltd. | Medicine inspection device and medicine packaging system |
JP6167053B2 (ja) * | 2014-02-28 | 2017-07-19 | 富士フイルム株式会社 | 検査装置及び検査方法、並びに検査方法をコンピュータに実行させるプログラム |
WO2019116543A1 (ja) * | 2017-12-15 | 2019-06-20 | 日本たばこ産業株式会社 | シガレットフィルタ検査方法、シガレットフィルタ検査装置、及びシガレットフィルタ検査プログラム |
EP3922232B1 (en) * | 2019-02-08 | 2022-11-23 | FUJIFILM Toyama Chemical Co., Ltd. | Medicine identification system, medicine identification device, medicine identification method, and program |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09231342A (ja) * | 1996-02-26 | 1997-09-05 | Sanyo Electric Co Ltd | 錠剤検査方法及び装置 |
JP2004234132A (ja) * | 2003-01-28 | 2004-08-19 | Matsushita Electric Works Ltd | 粒状物体の検査装置およびその検査方法 |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3150730A (en) * | 1960-03-23 | 1964-09-29 | Wm Ainsworth & Sons Inc | Balance |
US4093941A (en) * | 1976-12-09 | 1978-06-06 | Recognition Equipment Incorporated | Slope feature detection system |
JPH0417665A (ja) | 1990-05-10 | 1992-01-22 | Kowa Eng Kk | 銀白色装飾品 |
JP3438925B2 (ja) * | 1993-12-28 | 2003-08-18 | 三洋電機株式会社 | 錠剤検査システム |
US5978520A (en) * | 1995-07-31 | 1999-11-02 | Hitachi, Ltd. | Method of recognizing image data and apparatus therefor |
US5974174A (en) * | 1996-09-26 | 1999-10-26 | Victor Company Of Japan, Ltd. | Picture-information processing apparatus |
JP4037512B2 (ja) * | 1997-04-15 | 2008-01-23 | コニカミノルタビジネステクノロジーズ株式会社 | 画像読取装置 |
JP3557081B2 (ja) * | 1997-08-29 | 2004-08-25 | シーケーディ株式会社 | 捺印錠剤の外観検査方法と捺印錠剤の外観検査装置 |
WO2000033251A1 (fr) * | 1998-11-30 | 2000-06-08 | Yamatake Corporation | Dispositif de reconnaissance de particules |
US6307964B1 (en) * | 1999-06-04 | 2001-10-23 | Mitsubishi Electric Research Laboratories, Inc. | Method for ordering image spaces to represent object shapes |
US7110003B2 (en) * | 2000-12-22 | 2006-09-19 | Canon Kabushiki Kaisha | Rendering objects |
US6378572B1 (en) * | 2001-03-28 | 2002-04-30 | Siemens Corporate Research, Inc. | Image processing system for inspection of tablets in slab filler packaging machines |
KR100468857B1 (ko) * | 2002-11-21 | 2005-01-29 | 삼성전자주식회사 | 2차원 형상에 대한 투사 불변형 표현자를 이용한핸드/아이 캘리브레이션 방법 |
WO2006090671A1 (ja) * | 2005-02-23 | 2006-08-31 | Matsushita Electric Works, Ltd. | 粒状物体の検査方法及びその方法を実施する検査装置 |
-
2006
- 2006-02-20 WO PCT/JP2006/302973 patent/WO2006090671A1/ja active Application Filing
- 2006-02-20 KR KR1020077019019A patent/KR100929475B1/ko not_active IP Right Cessation
- 2006-02-20 EP EP06714113A patent/EP1852693A4/en not_active Withdrawn
- 2006-02-20 US US11/816,536 patent/US7916949B2/en not_active Expired - Fee Related
- 2006-02-20 CN CN2006800056176A patent/CN101175990B/zh not_active Expired - Fee Related
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09231342A (ja) * | 1996-02-26 | 1997-09-05 | Sanyo Electric Co Ltd | 錠剤検査方法及び装置 |
JP2004234132A (ja) * | 2003-01-28 | 2004-08-19 | Matsushita Electric Works Ltd | 粒状物体の検査装置およびその検査方法 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010061646A (ja) * | 2008-08-08 | 2010-03-18 | Make Softwear:Kk | 画像処理装置、画像出力装置、画像処理方法及びコンピュータプログラム |
JP2013109788A (ja) * | 2008-08-08 | 2013-06-06 | Make Softwear:Kk | 画像処理装置、画像処理方法及びコンピュータプログラム |
CN109073564A (zh) * | 2016-04-22 | 2018-12-21 | 富士胶片株式会社 | 药剂监查装置及方法以及程序 |
CN109073564B (zh) * | 2016-04-22 | 2020-12-08 | 富士胶片富山化学株式会社 | 药剂监查装置及方法以及记录介质 |
Also Published As
Publication number | Publication date |
---|---|
EP1852693A1 (en) | 2007-11-07 |
KR20070103466A (ko) | 2007-10-23 |
EP1852693A4 (en) | 2010-05-26 |
US20090123056A1 (en) | 2009-05-14 |
CN101175990B (zh) | 2011-04-20 |
US7916949B2 (en) | 2011-03-29 |
CN101175990A (zh) | 2008-05-07 |
KR100929475B1 (ko) | 2009-12-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006090671A1 (ja) | 粒状物体の検査方法及びその方法を実施する検査装置 | |
JP6369456B2 (ja) | 薬剤鑑査装置、及び薬剤分包システム | |
JP5163985B2 (ja) | 粒状物品種検査装置 | |
JP3438925B2 (ja) | 錠剤検査システム | |
CN107076677A (zh) | 检查装置以及检查方法 | |
JPS6362074A (ja) | 三次元画像の連結成分抽出装置 | |
US8331678B2 (en) | Systems and methods for identifying a discontinuity in the boundary of an object in an image | |
EP3674696B1 (en) | Medicine inspection assistance device, image processing device, image processing method, and non-transitory computer-readable recording medium | |
CA2914403A1 (en) | System and method of using imprint analysis in pill identification | |
JP4300809B2 (ja) | 粒状物体の検査装置およびその検査方法 | |
KR101330567B1 (ko) | 약품 적재함 내의 알약 영상 검출방법. | |
JP3976961B2 (ja) | 物品の外観検査方法 | |
JP4639841B2 (ja) | 粒状物体の検査方法及びそれを用いる検査装置 | |
JP4639840B2 (ja) | 粒状物体の検査方法及びそれを用いる検査装置 | |
CA3130044A1 (en) | Feature point recognition system and recognition method | |
JP2017166957A (ja) | 欠陥検出装置、欠陥検出方法およびプログラム | |
KR101150754B1 (ko) | 영상처리를 이용한 약품상자 영역 검출 시스템 및 방법 | |
Deepti | Enhanced feature extraction technique for detection of pharmaceutical drugs | |
JP2001116534A (ja) | 真円度判定方法、真円度演算装置及び記録媒体 | |
JP7309017B1 (ja) | 錠剤検査装置及び錠剤検査方法 | |
JPH09288068A (ja) | 外観検査装置 | |
JP2893412B2 (ja) | Icパッケージ検査システム | |
JPS61153507A (ja) | 三次元形状認識装置 | |
JPS6123950A (ja) | 欠陥検出装置 | |
JPH0894335A (ja) | 画像処理検品装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | WWE | Wipo information: entry into national phase | Ref document number: 200680005617.6 Country of ref document: CN |
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
 | WWE | Wipo information: entry into national phase | Ref document number: 2006714113 Country of ref document: EP; Ref document number: 11816536 Country of ref document: US |
 | WWE | Wipo information: entry into national phase | Ref document number: 1020077019019 Country of ref document: KR |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | WWP | Wipo information: published in national office | Ref document number: 2006714113 Country of ref document: EP |