WO2014092189A1 - Image recognition device, image recognition method and image recognition program - Google Patents

Image recognition device, image recognition method and image recognition program

Info

Publication number
WO2014092189A1
WO2014092189A1, PCT/JP2013/083524, JP2013083524W
Authority
WO
WIPO (PCT)
Prior art keywords
image
recognition
input image
recognition target
registered
Prior art date
Application number
PCT/JP2013/083524
Other languages
English (en)
Japanese (ja)
Inventor
博史 川口
康毅 斎藤
Original Assignee
株式会社メディポリ
チームラボ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社メディポリ, チームラボ株式会社
Publication of WO2014092189A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition

Definitions

  • the present invention relates to an image recognition apparatus, an image recognition method, and an image recognition program, and more particularly, to an input image recognition process.
  • Dispensing pharmacies that provide drugs according to doctors' prescriptions handle a wide variety of drugs. Because drugs affect human life and medication errors must not occur, the accurate drug must be provided from among this wide variety in accordance with the doctor's prescription. Among the various types of medicines, many of those packed in blister packs, for example, are similar in appearance, so confirming the medicine at the time of provision places a heavy burden on the pharmacist.
  • In the technique disclosed in Patent Document 1, a drug-name image portion from which the drug name can be recognized is extracted from an image of the packaging material and collated. However, since special fonts are often used for the drug name printed on the packaging, it can be difficult to perform highly accurate character recognition.
  • Moreover, the image displayed on the drug packaging material may contain information that is effective for drug recognition other than characters, but with character recognition such information cannot be used effectively.
  • Such a problem is not limited to medicines packaged in blister packs; it can arise for any medicine whose packaging displays information capable of identifying the medicine.
  • the present invention has been made in consideration of the above-described circumstances, and an object thereof is to improve the recognition accuracy of a recognition target based on an input image and to quickly obtain a recognition result of the recognition target.
  • One embodiment of the present invention is a drug recognition device that recognizes a drug based on a read image generated by imaging the packaging of the drug, comprising: an image acquisition unit that acquires the read image; a ranking unit that extracts a plurality of local feature amounts from the read image, refers to a registered image database in which local feature amounts extracted for each of a plurality of recognizable medicine images are registered in association with each of the plurality of medicines, associates the plurality of local feature amounts extracted from the read image with the nearest local feature amounts among those registered in the registered image database, and ranks the plurality of medicines according to the number of associated local feature amounts; a conversion information acquisition unit that obtains, based on the extracted local feature amounts and in the order of the ranks of the ranked medicines, conversion information for converting one of the medicine image and the read image so that they are superimposed on each other; and a verification processing unit that converts one of the medicine image and the read image using the conversion information obtained in that order, compares the medicine image with the read image to determine whether they are the same, and outputs the medicine corresponding to the medicine image determined to be the same as the recognition result of the drug for the read image.
  • Another embodiment is a drug recognition method for recognizing a drug based on a read image generated by imaging the packaging of the drug, in which the read image is acquired; a plurality of local feature amounts are extracted from the read image; with reference to a registered image database in which local feature amounts extracted for each of a plurality of recognizable medicine images are registered in association with each of the plurality of medicines, the plurality of local feature amounts extracted from the read image are associated with the nearest local feature amounts among those registered in the registered image database; the plurality of medicines are ranked according to the number of associated local feature amounts; conversion information for converting one of the medicine image and the read image so that they are superimposed on each other is obtained, based on the local feature amounts extracted from the medicine image and the read image, in the order of the ranks of the ranked medicines; one of the medicine image and the read image is converted using the conversion information obtained in that order, and the medicine image and the read image are compared to determine whether they are the same; and the medicine corresponding to the medicine image determined to be the same is output as the recognition result of the drug for the read image.
  • A further embodiment is a medicine recognition program for recognizing a medicine based on a read image generated by imaging a medicine packaging, the program causing an information processing apparatus to execute: a step of acquiring the read image; a step of extracting a plurality of local feature amounts from the read image; a step of associating, with reference to a registered image database in which local feature amounts extracted for each of a plurality of recognizable medicine images are registered in association with each of the plurality of medicines, the plurality of local feature amounts extracted from the read image with the nearest local feature amounts among those registered in the registered image database; a step of ranking the plurality of medicines according to the number of associated local feature amounts; a step of obtaining, based on the extracted local feature amounts and in the order of the ranks of the ranked medicines, conversion information for converting one of the medicine image and the read image so that they are superimposed on each other; a step of converting one of the medicine image and the read image using the conversion information and determining whether the medicine image and the read image are the same; and a step of outputting the medicine corresponding to the medicine image determined to be the same as the recognition result of the medicine for the read image.
  • the recognition accuracy of the recognition target based on the input image can be improved, and the recognition result of the recognition target can be obtained quickly.
  • In the following, an image recognition device will be described by taking as an example a drug recognition device that uses an image obtained by imaging a medicine packaging as the input image, recognizes the type of the drug based on that image, and notifies the user of the result.
  • FIG. 1 is a perspective view showing an appearance and an internal configuration of a medicine recognition apparatus 1 according to the present embodiment.
  • the drug recognition device 1 according to the present embodiment is configured by installing a touch panel 3 on a box-shaped housing 2.
  • a part of the upper surface of the housing 2 is formed of a transparent plate, and the transparent portion serves as an imaging stand 4 on which a medicine to be imaged is placed.
  • A camera 6 for imaging a medicine placed on the imaging table 4 is installed inside the housing 2, and a ball-type illumination 5 is provided around the camera 6 so as to face the imaging table 4. Owing to the ball-type illumination 5, the medicine placed on the imaging table 4 is irradiated with light from multiple directions, and an image of the medicine free of shadows is captured by the camera 6.
  • a controller device 7 is provided inside the housing 2, and an image captured and generated by the camera 6 is input to the controller device 7.
  • the controller device 7 processes the medicine image acquired from the camera 6 to perform the medicine recognition process, and displays the recognition result on the touch panel 3.
  • FIG. 2 is a block diagram showing a hardware configuration of the medicine recognition apparatus 1 according to the present embodiment.
  • The medicine recognition apparatus 1 according to the present embodiment includes the above-described camera 6 in addition to the same configuration as an information processing terminal such as a general server or PC (Personal Computer). That is, the drug recognition apparatus 1 according to the present embodiment includes a CPU (Central Processing Unit) 10, a RAM (Random Access Memory) 11, a ROM (Read Only Memory) 12, an HDD (Hard Disk Drive) 13, and an I/F 14, which are connected to one another via a bus 18.
  • The I/F 14 is connected to the camera 6, an LCD (Liquid Crystal Display) 15, and an operation unit 16.
  • the CPU 10 is a calculation means and controls the operation of the entire medicine recognition apparatus 1.
  • the RAM 11 is a volatile storage medium capable of reading and writing information at high speed, and is used as a work area when the CPU 10 processes information.
  • the ROM 12 is a read-only nonvolatile storage medium, and stores programs such as firmware.
  • the HDD 13 is a non-volatile storage medium that can read and write information, and stores an OS (Operating System), various control programs, application programs, and the like.
  • the I / F 14 connects and controls the bus 18 and various hardware and networks.
  • the LCD 15 is a visual user interface for the operator of the apparatus to confirm the state of the medicine recognition apparatus 1.
  • the operation unit 16 is a user interface for an operator to input information to the medicine recognition device 1.
  • the LCD 15 and the operation unit 16 constitute the touch panel 3 shown in FIG.
  • In such a hardware configuration, a program stored in a recording medium such as the ROM 12, the HDD 13, or an optical disk (not shown) is read into the RAM 11, and the CPU 10 operates according to that program, thereby configuring a software control unit.
  • a functional block that realizes the function of the medicine recognition apparatus 1 according to the present embodiment is configured by a combination of the software control unit configured as described above and hardware.
  • FIG. 3 is a block diagram showing a functional configuration of the controller device 7 according to the present embodiment.
  • The controller device 7 according to the present embodiment includes a registered image database 102 and an image processing unit 110, in addition to the camera driver 101 that drives the camera 6 and the display driver 103 that drives the LCD 15.
  • The camera driver 101, the display driver 103, and the image processing unit 110 are realized by the combination of hardware and the software control unit configured by the CPU 10 performing calculations according to the program read into the RAM 11.
  • the image processing unit 110 acquires an image of the medicine placed on the imaging stand 4 taken by the camera 6 via the camera driver 101, and performs medicine recognition processing based on the image.
  • the registered image database 102 is an information storage unit that stores information related to an image of a medicine that can be placed on the imaging table 4 and can be recognized (hereinafter referred to as “registered image”).
  • FIG. 4 is a diagram illustrating an example of information stored in the registered image database 102.
  • The registered image database 102 according to the present embodiment is information that associates a “medicine ID” identifying a recognizable drug, a “medicine name” indicating the name of the recognizable drug, and “data path” information indicating the storage area in which the image of the packaging of the recognizable drug is stored.
  • The “medicine name” includes the name of the drug itself, such as “ABC tablet”, and the amount of the drug, such as “250 mg”.
  • the “data path” is a data path indicating a storage area in the HDD 13 described with reference to FIG. 2, for example, but may be a storage area outside the medicine recognition apparatus 1 such as a network drive.
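For illustration only (the patent does not define a concrete schema), a record of the registered image database 102 can be pictured as follows. The class name, field names, and file paths are hypothetical placeholders; only the "ABC tablet 250 mg" / "0001" pairing comes from the text further below.

```python
from dataclasses import dataclass

@dataclass
class RegisteredMedicine:
    """One record of the registered image database 102 (names are illustrative)."""
    medicine_id: str    # e.g. "0001": identifies a recognizable drug
    medicine_name: str  # e.g. "ABC tablet 250 mg": drug name plus amount
    data_path: str      # storage area holding the registered/reference image

# Minimal in-memory stand-in for the database; the paths are placeholders.
REGISTERED_IMAGE_DB = [
    RegisteredMedicine("0001", "ABC tablet 250 mg", "registered/0001.png"),
    RegisteredMedicine("0002", "ABC tablet 150 mg", "registered/0002.png"),
]
```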
  • FIG. 5A is a diagram illustrating an example of a registered image according to the present embodiment. As shown in FIG. 5A, in a general medicine packaging, the name and quantity of the medicine are repeatedly displayed on one side. In the medicine recognition device 1 according to the present embodiment, an image of a surface on which such names and amounts of medicines are repeatedly displayed is registered in advance.
  • Fig. 5 (b) is an image obtained by extracting one portion of the display range of the drug name and quantity shown in Fig. 5 (a).
  • In the medicine recognition device 1 according to the present embodiment, an image of a portion that can serve as a feature in medicine recognition, as shown in FIG. 5(b) (hereinafter referred to as a “reference image”), is registered in advance.
  • the image processing unit 110 includes an image acquisition unit 111, a ranking unit 112, a rotation processing unit 113, and a verification processing unit 114.
  • the image acquisition unit 111 acquires an image (hereinafter, referred to as “read image”) that is captured and generated by the camera 6 via the camera driver 101.
  • The ranking unit 112 compares the read image acquired by the image acquisition unit 111 with the registered images whose “data path” is registered in the registered image database 102, and ranks the registered images in order of similarity to the read image.
  • The rotation processing unit 113 takes into account that the medicine may appear tilted or rotated in the read image, and performs image rotation processing that aligns the orientation of the registered image with the portion of the read image serving as the key for the image comparison. That is, the rotation processing unit 113 functions as a conversion information acquisition unit that obtains conversion information for converting either the read image or the registered image so that they overlap each other.
  • The verification processing unit 114 compares the key portion of the read image rotated by the rotation processing unit 113 with the registered images in the order of the ranking generated by the ranking unit 112, determines whether or not they are the same image, and outputs information for displaying the verification result.
  • the gist of the present embodiment is to perform each process with high accuracy and high speed by using parameters related to the processes in the ranking unit 112 and the rotation processing unit 113.
  • processing according to the gist of the present embodiment will be described.
  • Imaging by the camera 6 may be executed automatically upon detecting in real time that a medicine has been placed on the imaging table 4, or may be executed when the operator operates the touch panel 3.
  • FIG. 6A is a diagram illustrating an example of a read image generated by imaging a medicine placed on the imaging stand 4.
  • The read image generated by the imaging of the camera 6 may show the medicine in a tilted state, as shown in FIG. 6A, depending on how the operator places the medicine on the imaging stand 4.
  • the image acquisition unit 111 acquires the read image generated in this way via the camera driver 101.
  • Based on the read image acquired by the image acquisition unit 111, the ranking unit 112 ranks the registered images whose information is registered in the registered image database 102 in order of similarity to the read image. The details of this processing by the ranking unit 112 are one of the gists of the present embodiment.
  • the ranking unit 112 according to the present embodiment uses the read image and the registered image as input, and performs ranking by nearest neighbor search processing based on local feature amounts.
  • the local feature amount extraction processing includes processing for extracting key points effective for recognition in an image and processing for generating feature amounts for each of the extracted key points.
  • FIG. 6B is a diagram illustrating an example of the result of key point extraction based on the read image illustrated in FIG. 6(a).
  • Such key point extraction processing can be realized, for example, by extracting corner pixels using a simple corner detection filter. In that case, a predetermined scale can be used for the local scale. It is also possible to perform key point extraction using Fast-Hessian Detector.
  • The ranking unit 112 then extracts a feature amount based on the image around each extracted key point. Algorithms such as the following can be used for this feature amount extraction:
  • SIFT (Scale-Invariant Feature Transform)
  • SURF (Speeded-Up Robust Features)
  • FREAK (Fast Retina Keypoint)
  • The content of the feature amount extracted by such processing differs depending on which of the above algorithms is used, but in every case it is information calculated or extracted according to the content of the image. For example, when SIFT is used, a 128-dimensional feature amount is extracted for each key point.
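As a non-authoritative sketch of the key point extraction and feature amount generation described above (the patent names the algorithms but not a library), OpenCV's SIFT implementation can be used. Each returned key point carries a position and a scale, and each descriptor is the 128-dimensional feature amount mentioned in the text; the other listed algorithms could be substituted with minor changes.

```python
import cv2

def extract_local_features(image_path):
    """Key point extraction followed by per-key-point feature generation
    (sketch using OpenCV SIFT)."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    detector = cv2.SIFT_create()
    # detectAndCompute returns the key points (position kp.pt, scale kp.size)
    # and one 128-dimensional descriptor per key point.
    keypoints, descriptors = detector.detectAndCompute(img, None)
    return keypoints, descriptors
```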
  • the ranking unit 112 executes the above-described key point extraction and feature amount extraction processing in advance for an image whose information is registered in the registered image database 102.
  • FIG. 5C is a diagram illustrating an example of the processing executed in advance. As shown in FIG. 5C, the ranking unit 112 according to the present embodiment extracts key points for the reference image shown in FIG. 5B among the images whose information is registered in the registered image database 102. The feature amount extraction is executed in advance, and the result is stored in the registered image database 102.
  • The local feature amount f q of the read image and the local feature amount f t of the reference image obtained in this way are each expressed as a combination of a position, a scale, and a descriptor, for example as in the following equations (1) and (2): f q = (p q, σ q, d q) (1), and f t = (p t, σ t, d t) (2).
  • p q and p t shown in equations (1) and (2) are the positions of the feature points, and are indicated by the coordinates of the pixels in the image, for example.
  • ⁇ q and ⁇ t are feature point scales, respectively.
  • d q and d t are descriptors indicating the features of the feature points, respectively.
  • By the nearest neighbor search, the ranking unit 112 uses each feature point extracted from the read image as a key and searches for the corresponding feature point in the reference images. In this nearest neighbor search, the ranking unit 112 sequentially refers to the feature points extracted from the read image and associates each of them with the d t closest to its d q among the information registered in the registered image database.
  • In FIG. 7, the feature points in the read image and the feature points in the reference image that are associated with each other are connected by broken lines. For clarity, only the broken lines associated with a part of the feature points in the read image, specifically one display unit of the medicine name, are shown.
  • By the nearest neighbor search, every feature point included in the read image is associated with a feature point of one of the registered images whose information is registered in the registered image database. Since each of the feature points of the multiple displayed portions of the drug name in the read image is associated with a feature point in the reference image, a plurality of feature points in the read image may be redundantly associated with a single feature point in the reference image.
  • FIG. 7 shows a case where the drug in the read image is the same as the drug in the reference image.
  • FIG. 8 is a diagram illustrating an example of feature points associated with different reference images, that is, different drugs, for the same read image as FIG. 7. As shown in FIG. 8, when the drug in the read image and the drug in the reference image are different, the feature amounts are naturally different, and the number of feature points is small even if they are associated with each other.
  • The ranking unit 112 performs such feature point association by the nearest neighbor search for all registered images whose information is registered in the registered image database 102.
  • In the nearest neighbor search, it is preferable to use an approximation to speed up the processing. For the approximate nearest neighbor search, for example, a Hierarchical K-Means Tree or ANN (Approximate Nearest Neighbor) can be used.
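A sketch of the nearest neighbor association, assuming FLANN's hierarchical k-means index as the approximate search structure mentioned above. The index and search parameter values are illustrative, not taken from the patent, and the descriptors are assumed to come from a float-valued descriptor such as SIFT or SURF.

```python
import cv2
import numpy as np

FLANN_INDEX_KMEANS = 2  # FLANN's hierarchical k-means tree index

def match_nearest_neighbors(query_desc, registered_desc):
    """Associate each read-image descriptor with its (approximate) nearest
    registered descriptor. Parameter values are illustrative only."""
    index_params = dict(algorithm=FLANN_INDEX_KMEANS, branching=32, iterations=11)
    search_params = dict(checks=64)
    flann = cv2.FlannBasedMatcher(index_params, search_params)
    # k=1: only the single nearest neighbor is needed for the voting step.
    matches = flann.knnMatch(query_desc.astype(np.float32),
                             registered_desc.astype(np.float32), k=1)
    return [m[0] for m in matches if m]
```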
  • The ranking unit 112 counts, for each reference image, the number of its feature points that were associated with feature points in the read image as the number of votes, thereby obtaining the number of votes for each reference image. In other words, with the number n of feature points extracted from the read image as the total number of votes, the ranking unit 112 counts how many of the n votes each reference image obtains.
  • Having counted the votes in this way, the ranking unit 112 ranks the reference images based on the number of votes counted for each reference image.
  • FIG. 9 shows an example of the results of counting and ranking the number of votes.
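The vote counting and ranking just described can be sketched as follows, assuming that each registered descriptor is tagged with the medicine ID of the reference image it was extracted from; the mapping owner_ids is a hypothetical helper, not something defined in the patent.

```python
from collections import Counter

def rank_reference_images(matches, owner_ids):
    """Count, for each reference image, how many of the n read-image feature
    points were associated with one of its feature points, and rank by votes.

    matches   : list of cv2.DMatch, one per read-image feature point
    owner_ids : owner_ids[trainIdx] -> medicine ID of that registered descriptor
    """
    votes = Counter(owner_ids[m.trainIdx] for m in matches)
    # Highest vote count first; this ordering drives the later verification.
    return votes.most_common()

# Example result shape: [("0001", 412), ("0002", 9), ...]
```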
  • Note that the ranking by the ranking unit 112 does not have to be based on the reference images; the feature points extracted from the read image may instead be associated with feature points extracted from the registered image shown in FIG. 5(a).
  • Ideally, the medicine corresponding to the reference image ranked first by the processing of the ranking unit 112 is the accurate recognition result. In FIG. 9 as well, “ABC tablet 250 mg”, whose medicine ID is “0001”, that is, the correct recognition result, has taken first place. However, feature points may also be associated with the image of a different drug by the nearest neighbor search, so the accurate recognition result is not always ranked first.
  • Therefore, in the present embodiment, the ranking result is verified by the rotation processing unit 113 and the verification processing unit 114. To that end, the ranking unit 112 inputs the ranking result illustrated in FIG. 9, the read image and its local feature amount extraction result, and the feature point association result to the rotation processing unit 113.
  • FIG. 10 is a flowchart showing the order of processing executed by the rotation processing unit 113 and the verification processing unit 114.
  • Having acquired the above-described information from the ranking unit 112, the rotation processing unit 113 selects reference images in the order of the ranking result illustrated in FIG. 9 and obtains, for the selected reference image, a transformation matrix H for projecting it onto the read image so that corresponding feature points coincide (S1001).
  • The transformation matrix H can be obtained using, for example, RANSAC (RANdom SAmple Consensus). The processing for obtaining the transformation matrix H using RANSAC will be described below with reference to FIG. 11.
  • the rotation processing unit 113 first associates the feature points included in the reference image with the feature points included in the read image (S1101).
  • Specifically, using the f q-i and f t-j described above, the rotation processing unit 113 associates as corresponding points the pairs of f q-i and f t-j that satisfy the constraints shown in the following equations (3) and (4), obtaining the set of corresponding points C = {p q-i, p t-j}.
  • Equation (3) calculates the Hamming distance in the case of FREAK and the Euclidean distance in the case of SIFT or SURF. “T d” is therefore a threshold set for the Hamming distance or the Euclidean distance, that is, a threshold used to determine that the feature amount of a feature point in the reference image is close to the feature amount of a feature point in the read image. Further, since the reference image is stored at the same resolution as the image captured by the camera 6, “1” can be used, for example, as T s, the threshold of the scale difference.
  • the feature point association processing by the above formulas (3) and (4) can also be used in the feature point association by the ranking unit 112.
  • the result of association by the ranking unit 112 can be adopted as the result of association in S1101.
  • In the association by the ranking unit 112, many feature points are expected to be associated, and even if the feature points were associated again based on equations (3) and (4), many of them would be expected to yield the same correspondence result. Therefore, by adopting the feature point association result of the ranking unit 112 as the association result of S1101, the amount of processing can be reduced and a result can be obtained quickly without degrading the processing accuracy.
  • Having completed the processing of S1101, the rotation processing unit 113 next randomly selects one of the corresponding points associated in S1101 (S1102), and then randomly selects two other corresponding points that lie within a predetermined range on the read image side centered on the selected corresponding point (S1103). This predetermined range is determined, for example, from the diagonal of the reference image shown in FIG. 5(b).
  • When a total of three corresponding points have been acquired from the predetermined range on the read image side, the rotation processing unit 113 obtains, based on the positions {p q-i, p t-j} of these corresponding points, a transformation matrix H for projecting the feature points on the reference image side onto the read image side by affine transformation (S1104). Since an affine transformation has six parameters, the transformation matrix H can be determined from the two coordinate values contained in each of the three corresponding points.
  • Next, the rotation processing unit 113 projects the feature points of the reference image into the read image using the calculated H (S1105), and counts as inliers the corresponding points for which the positional difference between the projected feature point of the reference image and the corresponding feature point of the read image, that is, the distance between the corresponding feature points, is within a predetermined threshold (S1106). This predetermined threshold can be set as a number of pixels, for example, and a relatively small value of a few to a dozen or so pixels is set according to the resolution of the reference image.
  • The rotation processing unit 113 repeats the processing from S1102 with different selections of corresponding points (S1107/NO). When the specified number of repetitions has been completed (S1107/YES), it compares the inlier counts obtained for the respective selections, determines the transformation matrix H that yielded the largest count as the final transformation matrix H (S1108), and ends the processing. If the largest inlier count is obtained for more than one selection of corresponding points in S1108, any one of them may be selected.
  • When only a small number of inliers is obtained, the rotation processing unit 113 regards one of the currently selected corresponding points as incorrect, and returns to S1102 to select other corresponding points.
  • DLT (Direct Linear Transform)
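One way to read steps S1102-S1108 in code, as a sketch rather than the patent's reference implementation: the neighborhood-restricted sampling of S1102-S1103 is simplified here to uniform random sampling of three corresponding points, and the iteration count and pixel threshold are illustrative values. OpenCV's cv2.estimateAffine2D, shown at the end, performs an equivalent RANSAC estimation in one call.

```python
import cv2
import numpy as np

def estimate_affine_ransac(ref_pts, read_pts, iterations=500, inlier_thresh=8.0):
    """Estimate an affine matrix H projecting reference-image points onto the
    read image with a RANSAC loop in the style of S1102-S1108 (sketch).

    ref_pts, read_pts : (N, 2) float32 arrays of corresponding point positions
    inlier_thresh     : pixel distance threshold used for the inlier count
    """
    rng = np.random.default_rng()
    best_H, best_inliers = None, -1
    n = len(ref_pts)
    for _ in range(iterations):
        idx = rng.choice(n, size=3, replace=False)               # pick 3 corresponding points
        H = cv2.getAffineTransform(ref_pts[idx], read_pts[idx])  # 2x3 affine (6 parameters)
        # Project all reference-side feature points into the read image (S1105).
        projected = cv2.transform(ref_pts.reshape(-1, 1, 2), H).reshape(-1, 2)
        dists = np.linalg.norm(projected - read_pts, axis=1)
        inliers = int(np.sum(dists < inlier_thresh))             # S1106: count inliers
        if inliers > best_inliers:                               # S1108: keep the best H
            best_H, best_inliers = H, inliers
    return best_H, best_inliers

# Equivalent library call (RANSAC is built in):
# H, mask = cv2.estimateAffine2D(ref_pts, read_pts, method=cv2.RANSAC,
#                                ransacReprojThreshold=8.0)
```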
  • the rotation processing unit 113 when the rotation processing unit 113 obtains the transformation matrix H by such processing, the rotation processing unit 113 inputs the obtained transformation matrix H to the verification processing unit 114.
  • The verification processing unit 114 projects the selected reference image onto the read image based on the transformation matrix H acquired from the rotation processing unit 113. By projecting the reference image onto the read image in this way, the range corresponding to the reference image within the read image can be extracted based on the outer frame of the reference image.
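A sketch of this extraction based on the outer frame of the reference image: the four corners of the reference image are projected into the read image with H and the enclosed region is cropped. Function and variable names are illustrative, not from the patent.

```python
import cv2
import numpy as np

def extract_display_unit(read_img, ref_shape, H):
    """Project the outer frame (four corners) of the reference image into the
    read image with the affine matrix H and crop the enclosed region (sketch)."""
    h, w = ref_shape[:2]
    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    projected = cv2.transform(corners, H).reshape(-1, 2)
    x, y, bw, bh = cv2.boundingRect(projected.astype(np.float32))
    x, y = max(x, 0), max(y, 0)   # clip corners that fall outside the read image
    return read_img[y:y + bh, x:x + bw]
```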
  • The verification processing unit 114 extracts the image in the range of the read image corresponding to the reference image, that is, the image of the characteristic part for identifying the drug, such as “ABC tablet 250 mg” (hereinafter referred to as the “medicine display unit image”), and determines whether the two images are the same by comparing the extracted medicine display unit image with the reference image converted by the transformation matrix H. In other words, it verifies the accuracy of the reference image that the ranking unit 112 ranked high as being similar to the read image (S1002).
  • The image comparison by the verification processing unit 114 can be realized, for example, by calculating the similarity of shape by normalized correlation, or by calculating similarity by comparing color histograms generated in the HSV (Hue, Saturation, Value) color space, and applying a threshold to the obtained value. Verification accuracy can be improved by combining the threshold determinations on the shape similarity and the color similarity.
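A sketch of the two similarity checks named above, normalized correlation for shape and HSV histogram comparison for color. The thresholds, histogram bin counts, and the use of matchTemplate as the normalized-correlation operator are illustrative choices, not values or APIs prescribed by the patent.

```python
import cv2

def verify_same(display_unit_img, warped_ref_img,
                shape_thresh=0.7, color_thresh=0.8):
    """Judge whether two patches show the same medicine by combining a
    normalized-correlation shape score with an HSV colour-histogram score."""
    # Shape similarity: normalized correlation of the two equal-sized patches.
    a = cv2.cvtColor(display_unit_img, cv2.COLOR_BGR2GRAY)
    b = cv2.cvtColor(warped_ref_img, cv2.COLOR_BGR2GRAY)
    b = cv2.resize(b, (a.shape[1], a.shape[0]))
    shape_score = cv2.matchTemplate(a, b, cv2.TM_CCOEFF_NORMED)[0][0]

    # Colour similarity: compare H/S histograms in the HSV colour space.
    hsv_a = cv2.cvtColor(display_unit_img, cv2.COLOR_BGR2HSV)
    hsv_b = cv2.cvtColor(cv2.resize(warped_ref_img, (a.shape[1], a.shape[0])),
                         cv2.COLOR_BGR2HSV)
    hist_a = cv2.calcHist([hsv_a], [0, 1], None, [32, 32], [0, 180, 0, 256])
    hist_b = cv2.calcHist([hsv_b], [0, 1], None, [32, 32], [0, 180, 0, 256])
    cv2.normalize(hist_a, hist_a)
    cv2.normalize(hist_b, hist_b)
    color_score = cv2.compareHist(hist_a, hist_b, cv2.HISTCMP_CORREL)

    # Combining both checks improves verification accuracy, as noted above.
    return shape_score >= shape_thresh and color_score >= color_thresh
```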
  • The rotation processing unit 113 and the verification processing unit 114 perform the calculation of the transformation matrix H and the verification processing in the order of the ranking generated as shown in FIG. 9.
  • When the verification processing unit 114 determines that the reference image under determination matches the read image, that is, when the verification is passed (S1003/YES), the determination process is terminated at that point, and the determination result, that is, the recognition result of the medicine placed on the imaging stand 4, is displayed on the LCD 15.
  • When the verification processing unit 114 determines that the reference image under determination does not match the read image (S1003/NO), it notifies the rotation processing unit 113 of the determination result. The rotation processing unit 113 then executes the process described with reference to FIG. 11 for the reference image with the next highest rank in the order shown in FIG. 9, and the verification processing unit 114 performs the verification process on that reference image.
  • As described above, in the medicine recognition apparatus 1 according to the present embodiment, after the reference images are ranked based on local feature amounts, a detailed comparison inspection is performed by the verification processing unit 114 in accordance with the ranking. The detailed comparison inspection improves the accuracy of drug recognition, and because the prior ranking based on local feature amounts causes the comparison to start from the reference image most likely to be correct, and the drug recognition result is confirmed as soon as its accuracy is verified, detailed comparison inspections do not have to be carried out for many reference images; the processing load is reduced and the recognition result of the drug can be obtained quickly.
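The overall control flow described above (rank first, then verify candidates in rank order and stop at the first pass) can be summarized in a skeleton like the following. The per-step routines are passed in as callables and correspond to the sketches shown earlier, so nothing here is the patent's reference implementation.

```python
def recognize_medicine(read_img, candidates, estimate_H, extract_patch, verify):
    """Top-level loop (sketch): try candidates in ranked order and stop at the
    first reference image that passes verification.

    candidates : iterable of (medicine_id, reference_image) in ranking order
    estimate_H, extract_patch, verify : per-step routines supplied by the caller
    """
    for medicine_id, ref_img in candidates:
        H = estimate_H(ref_img, read_img)
        if H is None:
            continue                      # this candidate could not be aligned
        patch = extract_patch(read_img, ref_img.shape, H)
        if verify(patch, ref_img, H):
            return medicine_id            # S1003/YES: recognition result
    return None                           # every candidate failed verification
```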
  • In the medicine recognition apparatus 1 according to the present embodiment, the transformation matrix H calculated by the rotation processing unit 113 is used both to correct the inclination of the read image and to extract from the read image the portion corresponding to the reference image, that is, the portion subjected to the high-precision comparison inspection. In both cases, the matching result of the local feature amounts obtained by the ranking unit 112 is reused, which links the processing of the ranking unit 112 and the rotation processing unit 113, realizes efficient processing, and contributes to the rapid acquisition of the drug recognition result described above.
  • As described above, according to the medicine recognition device 1 of the present embodiment, the accuracy of medicine recognition based on an image of the packaging material can be improved and the medicine recognition result can be obtained quickly. Moreover, although the above embodiment has been described taking as an example the case where the information displayed on the medicine packaging is character information such as “ABC tablet 250 mg”, the method according to this embodiment is not restricted to character information and is widely applicable to drugs in various packaging forms.
  • Furthermore, although the above description has taken as an example the recognition of a medicine based on an image obtained by imaging its packaging, the invention is not restricted to such an aspect and can be widely used as a technique for recognizing a recognition target displayed in an input image.
  • In the above embodiment, displaying the result of the drug recognition process on the LCD 15 has been described as an example, but the drug name of the recognized drug may instead be read out by voice. If information indicating the medicine to be provided, such as prescription information, is available, it is also possible to judge whether the correct medicine has been selected by comparing the recognition result of the verification processing unit 114 with that information, and to notify the pharmacist of the judgment.
  • Further, the number of medicine display unit images included in the read image can be determined based on the number of transformation matrices H obtained. The number of medicine display unit images determined in this way can also be used in medicine recognition; this can be realized by registering the number of medicine display unit images for each medicine in the registered image database 102 described with reference to FIG. 4.
  • the medicine may be prescribed in a divided state instead of one package of the blister pack.
  • it is also possible to determine the amount of the prescribed medicine by judging the number of medicine display unit images included in the read image as described above. This makes it possible to determine whether or not the amount of medicine actually provided matches the doctor's prescription.
  • the number of medicine display unit images displayed in the medicine packaging does not always correspond to the amount of medicine, for example, the number of tablets.
  • Therefore, the number of medicine display unit images included in one package of the blister pack may be registered in the registered image database 102 for each medicine, and the amount of the medicine may be determined based on the ratio of the number of medicine display unit images determined from the read image to that registered number. Alternatively, the amount of medicine corresponding to the number of recognized medicine display unit images may be registered in the registered image database 102 for each medicine.
  • the case where the verification is terminated by the process such as the normalized correlation in the verification processing unit 114 has been described as an example.
  • a combination of medicines whose reference images are very similar may be stored in a database in advance, and when a medicine registered in the database is recognized, further detailed verification may be performed.
  • the type of medicine may be the same and the amount may be different.
  • Only the “2” of “250 mg” and the “1” of “150 mg” differ as images, so even if the read image shows “250 mg”, when “150 mg” is ranked first by the ranking unit 112, the verification by the verification processing unit 114 may erroneously be passed.
  • FIG. 15 is a diagram showing an example of a database (hereinafter referred to as “similar drug database”) in which similar drugs are registered.
  • In the similar drug database, for example, the drug IDs of drugs whose reference images are similar to each other are registered in association with one another, together with coordinate information, as “verification area coordinates”, indicating the area of the reference image to be verified further when the reference image of that drug ID passes the verification by the verification processing unit 114.
  • FIG. 16 is a diagram illustrating an example of the coordinate range specified by the verification area coordinates. As indicated by the broken line in FIG. 16, the portion that differs within a set of similar images is specified as the verification region. When the medicine ID corresponding to the reference image that passed the verification of S1003 in FIG. 10 is registered in the similar drug database, the verification processing unit 114 performs the verification process again on the verification area coordinates associated with that medicine ID, that is, it calculates the shape similarity by the above-described normalized correlation and the color similarity by comparing color histograms generated in the HSV color space. This makes it possible to improve the recognition accuracy for similar images.
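For illustration, the similar drug database of FIG. 15 can be pictured as a mapping from a medicine ID to its look-alike and the verification area coordinates. All names and coordinate values below are hypothetical placeholders.

```python
# Illustrative stand-in for the similar drug database of FIG. 15:
# medicine ID -> (similar medicine ID, verification area coordinates given as
# (x, y, width, height) in the reference image). Values are hypothetical.
SIMILAR_DRUG_DB = {
    "0001": ("0002", (40, 10, 30, 25)),   # e.g. the dose digit "2" vs "1"
    "0002": ("0001", (40, 10, 30, 25)),
}

def verification_area_for(medicine_id):
    """Return the verification area coordinates if this medicine has a
    registered look-alike, otherwise None (sketch)."""
    entry = SIMILAR_DRUG_DB.get(medicine_id)
    return entry[1] if entry else None

def crop_verification_area(img, area):
    """Cut the verification region out of an image for the extra check."""
    x, y, w, h = area
    return img[y:y + h, x:x + w]
```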
  • In this re-verification, the process of S1001 described with reference to FIG. 11 may be omitted, and only the verification process using normalized correlation or the like may be performed. In that case, if the transformation matrix H previously obtained for the erroneous reference image is used as it is, a positional deviation may occur between the read image and the reference image in the verification against the similar image; however, such misregistration can be absorbed by the normalized correlation processing, and omitting the recalculation reduces the amount of processing so that the recognition result can be obtained quickly.
  • the case has been described as an example where, after ranking by the ranking unit 112, the verification processing by the verification processing unit 114 is always performed to ensure the accuracy of the recognition result.
  • However, as a result of the ranking shown in FIG. 9, if it can be clearly determined from the difference between the first and second places that the first place is correct, the processing by the rotation processing unit 113 and the verification processing unit 114 may be omitted and the first-place result shown in FIG. 9 may be output as the recognition result. For example, if the number of votes for the second place is 1% or less of the number of votes for the first place, the ranking unit 112 determines that the medicine corresponding to the first-place medicine ID is the recognition result, generates the information for displaying the recognition result in place of the verification processing unit 114, and outputs it to the display driver 103, whereby the determination result, that is, the recognition result of the medicine placed on the imaging stand 4, is displayed on the LCD 15.
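A sketch of this short-cut: if the second-place vote count is at most 1% of the first-place count, the rotation and verification steps are skipped and the first-place medicine is reported directly. The 1% ratio is the example given above; everything else is an illustrative assumption.

```python
def shortcut_recognition(ranking, ratio=0.01):
    """Return the first-place medicine ID when its lead over second place is
    decisive (second-place votes <= ratio * first-place votes), else None.

    ranking : list of (medicine_id, votes), highest vote count first
    """
    if not ranking:
        return None
    if len(ranking) == 1:
        return ranking[0][0]
    (first_id, first_votes), (_, second_votes) = ranking[0], ranking[1]
    if second_votes <= first_votes * ratio:
        return first_id            # skip rotation/verification entirely
    return None                    # fall back to the full verification path
```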

Abstract

The present invention improves the recognition accuracy for an input image and quickly acquires the recognition result of a recognition target. The invention is characterized by comprising: a ranking unit (112) that extracts a plurality of local feature amounts from an input image, refers to a registered image database (102) in which local feature amounts extracted from each of a plurality of recognizable recognition-target images are registered in association with each of a plurality of recognition targets, associates the plurality of local feature amounts extracted from the input image with the nearest local feature amounts among the registered local feature amounts, and ranks the plurality of recognition targets according to the number of associated local feature amounts; a rotation processing unit (113) that determines, based on each extracted local feature amount, conversion information for converting the recognition-target images; and a verification processing unit (114) that performs conversion using the determined conversion information and determines whether or not the recognition-target image and the input image are the same.
PCT/JP2013/083524 2012-12-14 2013-12-13 Image recognition device, image recognition method and image recognition program WO2014092189A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-273429 2012-12-14
JP2012273429A JP5414879B1 (ja) 2012-12-14 2012-12-14 Drug recognition device, drug recognition method and drug recognition program

Publications (1)

Publication Number Publication Date
WO2014092189A1 (fr) 2014-06-19

Family

ID=50202774

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/083524 WO2014092189A1 (fr) 2012-12-14 2013-12-13 Image recognition device, image recognition method and image recognition program

Country Status (2)

Country Link
JP (1) JP5414879B1 (fr)
WO (1) WO2014092189A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019168765A (ja) * 2018-03-22 2019-10-03 オオクマ電子株式会社 Medical material recognition system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008097245A (ja) * 2006-10-11 2008-04-24 Seiko Epson Corp Rotation angle detection device, and control method and control program for rotation angle detection device
JP2009187186A (ja) * 2008-02-05 2009-08-20 Sony Corp Image processing apparatus and method, and program
JP2010026603A (ja) * 2008-07-15 2010-02-04 Canon Inc Image processing apparatus, image processing method, and computer program
JP2010152543A (ja) * 2008-12-24 2010-07-08 Fujitsu Ltd Detection device, detection method and detection program

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3128357B2 (ja) * 1992-10-20 2001-01-29 沖電気工業株式会社 Character recognition processing device
JPH09179909A (ja) * 1995-12-25 1997-07-11 Matsushita Electric Works Ltd Memory card for home automation control system
JP4923282B2 (ja) * 2002-11-22 2012-04-25 グローリー株式会社 Drug recognition device


Also Published As

Publication number Publication date
JP2014117390A (ja) 2014-06-30
JP5414879B1 (ja) 2014-02-12

Similar Documents

Publication Publication Date Title
US11468975B2 (en) Medication reconciliation system and method
CN105378793B (zh) Systems, methods and computer-readable media for identifying when a subject is likely to be affected by a medical condition
JP2020507836A (ja) Tracking of surgical items in anticipation of duplicate imaging
WO2018173649A1 (fr) Drug recognition device, drug recognition method, and drug recognition program
US11688060B2 (en) Image-based circular plot recognition and interpretation
JP6879298B2 (ja) Image processing program and image processing device
US11404153B2 (en) Drug inspection assisting apparatus and drug inspection assisting method
US11759401B2 (en) Method of monitoring medication regimen with portable apparatus
JP2016523405A (ja) System and method using mark analysis in tablet identification
US10937152B2 (en) Inspection support method and inspection support device
JP6047475B2 (ja) Image recognition device, image recognition method, and image recognition program
JP5414879B1 (ja) Drug recognition device, drug recognition method, and drug recognition program
Holtkötter et al. Development and validation of a digital image processing-based pill detection tool for an oral medication self-monitoring system
Yang et al. A novel mobile application for medication adherence supervision based on AR and OpenCV designed for elderly patients
WO2014020820A1 (fr) Mark reading device and mark reading method
JP6857373B1 (ja) Information processing device, information processing method, and program
US11594322B2 (en) Dispensing audit support apparatus and dispensing audit support method
JP7437259B2 (ja) Image processing device, drug identification device, image processing method, and drug identification method
JP4379038B2 (ja) Image matching device, image matching method, and image matching program
KR102598969B1 (ko) Pill search system
JP5582610B2 (ja) Image identification device and program
US20180021017A1 (en) A method and apparatus for displaying a region of interest on a current ultrasonic image
Ferraz Use of the smartphone camera to monitor adherence to inhalation therapy
Saitoh Pharmaceutical Blister Pack Recognition using Local Features.
Joshi et al. Automatic Pill Identifier An Overview On Identifying, Retrieving And Authenticating Drug Pill

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13862499

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13862499

Country of ref document: EP

Kind code of ref document: A1