WO2015008732A1 - Optical character recognition device - Google Patents

Optical character recognition device

Info

Publication number
WO2015008732A1
Authority
WO
WIPO (PCT)
Prior art keywords
character string
candidate
character
candidates
date
Prior art date
Application number
PCT/JP2014/068725
Other languages
English (en)
Japanese (ja)
Inventor
美 張
中村 圭吾
Original Assignee
株式会社湯山製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社湯山製作所
Priority to JP2015527291A (patent JP6344389B2)
Priority to CN201480040348.1A (patent CN105431866A)
Publication of WO2015008732A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 Character recognition
    • G06V30/18 Extraction of features or characteristics of the image
    • G06V30/1801 Detecting partial patterns, e.g. edges or contours, or configurations, e.g. loops, corners, strokes or intersections
    • G06V30/18019 Detecting partial patterns, e.g. edges or contours, or configurations, e.g. loops, corners, strokes or intersections by matching or filtering
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61J CONTAINERS SPECIALLY ADAPTED FOR MEDICAL OR PHARMACEUTICAL PURPOSES; DEVICES OR METHODS SPECIALLY ADAPTED FOR BRINGING PHARMACEUTICAL PRODUCTS INTO PARTICULAR PHYSICAL OR ADMINISTERING FORMS; DEVICES FOR ADMINISTERING FOOD OR MEDICINES ORALLY; BABY COMFORTERS; DEVICES FOR RECEIVING SPITTLE
    • A61J2200/00 General characteristics or adaptations
    • A61J2200/70 Device provided with specific sensor or indicating means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61J CONTAINERS SPECIALLY ADAPTED FOR MEDICAL OR PHARMACEUTICAL PURPOSES; DEVICES OR METHODS SPECIALLY ADAPTED FOR BRINGING PHARMACEUTICAL PRODUCTS INTO PARTICULAR PHYSICAL OR ADMINISTERING FORMS; DEVICES FOR ADMINISTERING FOOD OR MEDICINES ORALLY; BABY COMFORTERS; DEVICES FOR RECEIVING SPITTLE
    • A61J2205/00 General identification or selection means
    • A61J2205/30 Printed labels
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61J CONTAINERS SPECIALLY ADAPTED FOR MEDICAL OR PHARMACEUTICAL PURPOSES; DEVICES OR METHODS SPECIALLY ADAPTED FOR BRINGING PHARMACEUTICAL PRODUCTS INTO PARTICULAR PHYSICAL OR ADMINISTERING FORMS; DEVICES FOR ADMINISTERING FOOD OR MEDICINES ORALLY; BABY COMFORTERS; DEVICES FOR RECEIVING SPITTLE
    • A61J7/00 Devices for administering medicines orally, e.g. spoons; Pill counting devices; Arrangements for time indication or reminder for taking medicine
    • A61J7/0015 Devices specially adapted for taking medicines
    • A61J7/0046 Cups, bottles or bags
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 Character recognition

Definitions

  • The present invention relates to an optical character recognition device that optically recognizes a character string, and more particularly to an optical character recognition device that recognizes a character string representing a date.
  • The present invention also relates to an optical character recognition method, a computer program, and a recording medium for recognizing a character string representing a date.
  • There is a need for a device that optically recognizes characters printed on a medicine container (see Patent Document 1). For example, when a medicine such as an injection that was transported to a ward but ultimately not used is returned to storage, it must be sorted and stored by medicine type, name, and expiration date so that it can be retrieved quickly and reliably at its next use. If a return device that performs this sorting automatically can be realized using an optical character recognition device, it will improve work efficiency and reduce errors. In addition, if the storage location of each medicine is recorded when it is stored by such a return device, the appropriate medicine can be dispensed automatically, based on the prescription, at its next use.
  • An object of the present invention is to provide an optical character recognition device, an optical character recognition method, a computer program, and a recording medium capable of recognizing a character string representing a date with higher accuracy than conventional techniques.
  • According to the present invention, an optical character recognition device that optically recognizes a character string includes: first processing means for extracting a target area including objects to be recognized from an input image; second processing means for extracting, from the objects included in the target area, candidate objects including at least one character string candidate object; and third processing means for labeling the candidate objects to extract a plurality of objects that extend in a predetermined direction and are close to each other as a character string candidate, determining whether the character string candidate has a date pattern that includes a two-digit or four-digit number representing the year, a one-digit or two-digit number representing the month, and a predetermined punctuation mark, and recognizing the character string candidate as a date when it has the date pattern.
  • The second processing means detects the contours and edges of the objects included in the target area, and extracts objects whose contours and edges overlap each other as the candidate objects.
  • The second processing means applies a Sobel filter to the target area to detect first edges, applies a Canny filter to a region in the vicinity of the first edges to detect second edges, and uses the second edges as the edges of the objects included in the target area.
  • The third processing means labels the objects of the character string candidate to extract a plurality of character candidates, generates a plurality of bounding boxes, each of which is a rectangle that surrounds one character candidate and has a width parallel to the direction in which the character string candidate extends and a height orthogonal to that direction, deforms each bounding box so that its width increases as its height decreases, and extracts a set of character candidates whose deformed bounding boxes are connected as a new character string candidate.
  • The third processing means labels the objects of the character string candidate to extract a plurality of character candidates, and deletes character string candidates that include more than ten character candidates.
  • The third processing means labels the objects of the character string candidate to extract a plurality of character candidates, and deletes character string candidates that consist only of character candidates containing two or more objects in the direction orthogonal to the direction in which the character string candidate extends.
  • The third processing means detects the contours and edges of the objects of the character string candidate, and deletes character string candidates in which the portion where the edge pixels coincide with the contour pixels is 60% or less of the area of the edge pixels.
  • The third processing means recognizes the character string candidate as not being a date when the character string candidate includes an alphabetic character that clearly cannot be mistaken for a number.
  • When the character string candidate includes two numbers representing the month followed by at least one other character, and the distance between the two numbers representing the month is larger than the average of the distance between those numbers and the other characters and the distances between the other characters, the third processing means removes the second of the two numbers representing the month and the other characters.
  • The input image is an image of a cylindrical container that is held rotatably.
  • The first processing means extracts, as the target area, a region of the input image that includes an edge extending in a direction substantially perpendicular to the rotation axis of the cylindrical container and a portion having a luminance higher than a predetermined threshold value.
  • The optical character recognition device acquires a plurality of input images, each representing the container at a different angle, photographed while the container is rotated.
  • When the character string candidate from one input image includes only "1" as the number representing the month, the third processing means determines whether a character string candidate from another input image also includes only "1" as the number representing the month.
  • According to the thirteenth aspect of the present invention, the optical character recognition device according to any one of the tenth to twelfth aspects includes: a camera; an imaging table that holds the container so that it can rotate around the rotation axis of the cylindrical container; and a moving device that moves the container between at least one storage and the imaging table. A character string representing the expiration date of the medicine in the container is printed on the container.
  • The optical character recognition method includes: a first step of extracting a target area including objects to be recognized from an input image; a second step of extracting, from the objects included in the target area, candidate objects including at least one character string candidate object; and a third step of labeling the candidate objects to extract a plurality of objects that extend in a predetermined direction and are close to each other as a character string candidate, determining whether the character string candidate has a date pattern that includes a two-digit or four-digit number representing the year, a one-digit or two-digit number representing the month, and a predetermined punctuation mark, and recognizing the character string candidate as a date when it has the date pattern.
  • The computer program, when executed by a computer, causes the computer to execute the first, second, and third steps described above; the recording medium stores such a computer program.
  • The optical character recognition device according to any one of the first to thirteenth aspects acquires a plurality of input images, each representing the container at a different angle, taken while the container is rotated, and connects the plurality of input images.
  • The optical character recognition device, the optical character recognition method, the computer program, and the recording medium of the present invention can recognize a character string representing a date with higher accuracy than conventional techniques.
  • FIG. 9 is a flowchart showing a subroutine of the OCR process in step S6 of FIG. 6. FIG. 10 is a flowchart showing the first part of the OCR subroutine in steps S51, S53, S55, and S57 of FIG. 9. FIG. 11 is a flowchart showing the second part of the OCR subroutine in steps S51, S53, S55, and S57 of FIG. 9. FIG. 12 is a flowchart showing a subroutine of the edge strength and area luminance determination process in step S68 of FIG. 11.
  • FIG. 17 is a diagram in which the high-luminance portions of FIG. 15 and the long vertical edges of FIG. 16 are superimposed. FIG. 18 is a diagram showing the target area including the high-luminance portions of FIG. 15 and the long vertical edges of FIG. 16.
  • FIG. 29B is a diagram showing the binarized image of FIG. 29A extracted using the threshold value 200. FIG. 29C is a diagram showing the contours of FIG. 29B. FIG. 29D is a diagram showing the edges of FIG. 29A extracted using the threshold value 50. FIG. 29E is a diagram showing the edges of FIG. 29A extracted using the threshold value 200. FIG. 29F is a diagram showing the contours and edges of a character.
  • FIG. 30 is a diagram showing an example of the character string candidates extracted in step S61 of FIG. 10. FIG. 31 is a diagram illustrating the extraction of character string candidates in step S61 of FIG. 10. FIG. 32 is a diagram showing an example of the character candidates extracted in step S62 of FIG. 10 and the generated bounding boxes. FIG. 33C is a diagram showing an example of the bounding boxes 43 deformed in step S63 of FIG. 10.
  • FIG. 35 is a diagram showing an example of the character string candidates after some character string candidates are deleted in steps S65, S66, and S67 of FIG. 10. FIG. 36A is a diagram showing an example of the character string candidates extracted in step S64 of FIG. 10. FIG. 36B is a diagram showing the number of objects in the height direction of each character candidate included in the character string candidates of FIG. 36A.
  • FIG. 39D is a diagram showing the edges extracted from the image of FIG. 39A. FIG. 40 is a diagram showing an example of an input image for explaining the edge strength and area luminance determination process.
  • FIG. 1 is a block diagram showing the configuration of the optical character recognition apparatus according to the first embodiment of the present invention.
  • The optical character recognition device of FIG. 1 optically recognizes a date character string printed on the surface of a cylindrical container 13.
  • The optical character recognition device further includes at least one tray (or storage) 11, 12 that accommodates the containers 13.
  • The moving device 3 moves the container 13 between the trays 11 and 12 and the rollers 8a and 8b under the control of the control device 1.
  • The cameras 4 to 6 are provided for the trays 11 and 12 and the rollers 8a and 8b, respectively, and acquire images of the containers 13 when the containers 13 are on the trays 11 and 12 or on the rollers 8a and 8b.
  • The lighting devices 7a and 7b illuminate the container 13 on the rollers 8a and 8b.
  • The rollers 8a and 8b and the lighting devices 7a and 7b function as a photographing stand for the container 13.
  • The optical character recognition device may include, instead of the rollers 8a and 8b, another mechanism that holds the container 13 so that it can rotate around the rotation axis of the cylindrical container 13.
  • The control device 1 performs the date detection process described later with reference to FIGS. 6 to 14.
  • The control device 1 may be connected to an external personal computer (PC) 9 that operates according to a computer program read from the recording medium 10.
  • The container 13 is, for example, a medicine container (ampoule), and a character string representing the expiration date of the medicine in the container 13 is printed on the container 13.
  • The optical character recognition device moves the container 13 from the tray 11 onto the rollers 8a and 8b using the moving device 3, and optically recognizes the expiration date printed on the container 13 while rotating the container 13 with the rollers 8a and 8b.
  • Based on the recognized date, the optical character recognition device determines whether to store or discard the container, and uses the moving device 3 to move the container 13 to another tray 12 associated with the appropriate storage or with the trash.
  • FIG. 2 is a top view showing the container 13a on which the character string according to the first example is printed.
  • FIG. 3 is a top view showing the container 13b on which the character string according to the second example is printed.
  • FIG. 4 is a top view showing the container 13c on which the character string according to the third example is printed.
  • FIG. 5 is a top view showing a container 13d on which a character string according to the fourth example is printed.
  • The character string may be printed on a label attached to the container, or may be printed directly on the container. Further, the direction of the character string may be parallel to the rotation axis of the cylindrical container 13, may be orthogonal to the rotation axis of the container 13, or character strings in these directions may be mixed.
  • The control device 1 operates as first processing means that extracts a target area including objects to be recognized from an input image.
  • The control device 1 operates as second processing means that extracts, from the objects included in the target area, candidate objects including at least one character string candidate object.
  • The control device 1 also operates as third processing means that labels the candidate objects, extracts a plurality of objects extending in a predetermined direction and close to each other as a character string candidate, determines whether the character string candidate has a date pattern including a two-digit or four-digit number representing the year, a one-digit or two-digit number representing the month, and a predetermined punctuation mark, and recognizes the character string candidate as a date when it has the date pattern.
  • FIG. 6 is a flowchart showing the date detection process executed by the control device 1 of FIG. 1.
  • The control device 1 photographs the container 13 with the camera 5 while rotating the container 13 by a certain angle (for example, 15 degrees) at a time using the rollers 8a and 8b, and acquires a plurality of images (input images), each representing the container 13 at a different angle.
  • As the camera 5, a camera having sufficient resolution for optically recognizing the character string printed on the container 13 is used.
  • The container 13 has a diameter of 10 to 40 mm.
  • For example, a monochrome camera that captures a 120 × 90 mm area including the container 13 at 3840 × 2748 pixels (about 10 million pixels) is used. In this case, 1 mm on the container 13 corresponds to 32 pixels (3840 pixels / 120 mm).
  • In step S1 of FIG. 6, the control device 1 acquires one of the plurality of images of the container 13.
  • Next, in step S2, the control device 1 executes the target area extraction process.
  • FIG. 7 is a flowchart showing a subroutine of the target area extraction process in step S2 of FIG. 6.
  • In step S21 of FIG. 7, the control device 1 extracts portions having a luminance higher than a predetermined threshold (for example, portions containing illumination reflections) from the image acquired in step S1.
  • For example, the control device 1 extracts portions having a luminance higher than 220.
  • FIG. 15 is a diagram illustrating an example of the high-luminance portions of the image extracted in step S21 of FIG. 7.
  • The input image is an image of the container 13d of FIG. 5.
  • In step S22 of FIG. 7, the control device 1 extracts long edges (vertical edges) extending in a direction substantially perpendicular to the rotation axis of the cylindrical container 13 from the image acquired in step S1.
  • Although the rollers 8a and 8b appear in the background of the container 13, they extend parallel to the rotation axis of the container 13, so their influence can be removed by extracting the vertical edges.
  • FIG. 16 is a diagram illustrating an example of the long vertical edges of the image extracted in step S22 of FIG. 7.
  • The rotation axis of the container 13d of FIG. 5 is parallel to the X axis of FIG. 5; therefore, in step S22, edges extending in a direction substantially parallel to the Y axis of FIG. 5 are extracted.
  • The control device 1 then extracts a rectangular area (width w1 × height h1) including the high-luminance portions and the vertical edges as the target area, and deletes the area outside the target area.
  • FIG. 17 is a diagram in which the high luminance portion of FIG. 15 and the long vertical edge of FIG. 16 are superimposed.
  • FIG. 18 is a diagram illustrating the target area 21 including the high-luminance portions of FIG. 15 and the long vertical edges of FIG. 16.
  • The target area is the area that is considered to include the character string objects to be recognized.
  • In step S3, the control device 1 determines whether the target area was successfully extracted. If YES, the process proceeds to step S4; if NO, the process proceeds to step S10. In step S4, the control device 1 executes the candidate object extraction process.
  • FIG. 8 is a flowchart showing a subroutine of the candidate object extraction process in step S4 of FIG. 6.
  • In step S31 of FIG. 8, the control device 1 applies a moving average filter to the image of the target area, thereby binarizing the image and extracting bright objects that are brighter than their surroundings.
  • A white character string on a black background is extracted as a bright object. Because the lighting is uneven, objects cannot be detected by simple binarization with a fixed threshold; therefore, a binarization method using a moving average filter (a dynamic threshold method) is used.
  • FIG. 21 is a diagram illustrating the extraction of a bright object using the moving average filter in step S31 of FIG. 8. According to the principle shown in FIG. 21, the control device 1 calculates the local average luminance from the luminance of the input image (here, the image of the target area), adds a predetermined offset to the local average luminance to obtain a threshold, and extracts pixels brighter than this threshold as bright objects.
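A minimal sketch of this dynamic-threshold binarization, assuming OpenCV and NumPy; the window size and offset below are illustrative placeholders, since the text does not specify them:

```python
# Sketch: binarization with a moving-average (dynamic) threshold.
import cv2
import numpy as np

def extract_bright_objects(gray, win=31, offset=10):
    """Pixels brighter than the local average plus an offset become bright objects."""
    local_mean = cv2.blur(gray.astype(np.float32), (win, win))  # moving average
    return ((gray.astype(np.float32) > local_mean + offset) * 255).astype(np.uint8)

def extract_dark_objects(gray, win=31, offset=10):
    """Pixels darker than the local average minus an offset become dark objects."""
    local_mean = cv2.blur(gray.astype(np.float32), (win, win))
    return ((gray.astype(np.float32) < local_mean - offset) * 255).astype(np.uint8)
```

OpenCV's cv2.adaptiveThreshold with ADAPTIVE_THRESH_MEAN_C implements the same idea in a single call.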
  • FIG. 19 is a diagram illustrating an example of the bright objects extracted in step S31 of FIG. 8. Next, in step S32, the control device 1 detects the contours of the binarized bright objects.
  • In step S33, the control device 1 applies a moving average filter to the image of the target area, thereby binarizing the image and extracting dark objects that are darker than their surroundings.
  • In this case, a black character string on a white background is extracted as a dark object.
  • FIG. 20 is a diagram illustrating an example of the dark objects extracted in step S33 of FIG. 8.
  • In step S34, the control device 1 detects the contours of the binarized dark objects.
  • FIG. 22A is a diagram showing an image including a bright object and a dark object.
  • FIG. 22B is a diagram showing extraction of a bright object from the image of FIG. 22A.
  • FIG. 22C is a diagram illustrating extraction of a dark object from the image of FIG. 22A.
  • A given character string is either a bright object or a dark object. Since both bright and dark objects are extracted in steps S31 to S34 of FIG. 8, the date printed on the container 13 can be reliably detected.
  • In step S35 of FIG. 8, the control device 1 extracts edges in the image of the target area using a Sobel filter.
  • FIG. 23 is a diagram showing an example of the edges of the image extracted using the Sobel filter in step S35 of FIG. 8.
  • In step S36 of FIG. 8, the control device 1 deletes the areas other than the edges extracted in step S35 from the image of the target area.
  • FIG. 24 is a diagram illustrating an example of the image after the areas other than the edges extracted in step S35 are deleted in step S36 of FIG. 8.
  • In step S37 of FIG. 8, the control device 1 applies a Canny filter to the image after the deletion in step S36 and extracts edges in the image.
  • The Canny edge detection method includes the following three steps. As the first step, the gradient magnitude g(x, y) = √(f_x(x, y)² + f_y(x, y)²) and the gradient direction d(x, y) = tan⁻¹(f_y(x, y) / f_x(x, y)) are calculated for the image,
  • where f_x(x, y) denotes the convolution of the pixel value function with the first derivative in the x direction of a Gaussian function having standard deviation σ,
  • and f_y(x, y) denotes the convolution of the pixel value function with the first derivative in the y direction of the same Gaussian function.
  • As the second step, edges are detected by finding the local maxima of the gradient magnitude g(x, y).
  • The gradient magnitude along the gradient direction d(x, y) is estimated by interpolation using the 8 pixels surrounding the pixel of interest, and g(x, y) is compared with the interpolated values to determine whether it is a true local maximum.
  • As the third step, a high threshold Th_H and a low threshold Th_L are set, and thresholding with hysteresis is performed.
  • If the gradient magnitude g(x, y) is greater than the high threshold Th_H, the pixel is determined to be an edge.
  • If the gradient magnitude g(x, y) is smaller than the low threshold Th_L, the pixel is determined not to be an edge.
  • If the gradient magnitude g(x, y) is between the high threshold Th_H and the low threshold Th_L, the pixel is determined to be an edge only when it is adjacent to a pixel already detected as an edge.
  • The range of possible threshold values is 0 to 255.
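The hysteresis step can be sketched with connected-component labeling: a weak-edge region survives only if it touches at least one strong pixel. This is an illustrative reimplementation, not code from the patent, and the thresholds are placeholders:

```python
# Sketch: thresholding with hysteresis on a precomputed gradient magnitude g.
import cv2
import numpy as np

def hysteresis(g, th_low=15, th_high=40):
    strong = g > th_high                       # definite edge pixels
    weak = (g > th_low).astype(np.uint8)       # possible edge pixels
    n, labels = cv2.connectedComponents(weak, connectivity=8)
    edges = np.zeros(g.shape, dtype=bool)
    for lab in range(1, n):                    # label 0 is the background
        comp = labels == lab
        if np.any(strong & comp):              # keep components touching a strong pixel
            edges |= comp
    return edges
```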
  • FIG. 25 is a diagram illustrating an example of the edges of the image extracted using the Canny filter with the threshold value 15 in step S37 of FIG. 8.
  • FIG. 26 is a diagram illustrating an example of the edges of the image extracted using the Canny filter with the threshold value 4 in step S37 of FIG. 8.
  • The edges of FIG. 26 are used in steps S38 and S39,
  • and the edges of FIG. 25 are used in step S67 of FIG. 10.
  • In this way, edges are first extracted using a Sobel filter (step S35), the areas other than the extracted edges are deleted (step S36), and a Canny filter is applied only to the region in the vicinity of the edges extracted with the Sobel filter (step S37);
  • the resulting edges are used as the edges of the objects included in the target area.
  • With this two-stage approach, the edge extraction is about 10 times faster than when only the Canny filter is used, as sketched below.
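A sketch of this two-stage extraction with OpenCV. The Sobel threshold, the size of the "vicinity" dilation, and the per-region application of Canny are assumptions made for illustration:

```python
# Sketch: cheap Sobel edges first, expensive Canny only near them (steps S35-S37).
import cv2
import numpy as np

def fast_edges(gray, canny_low=4, canny_high=12):
    # Step S35: coarse edges with a Sobel filter.
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    mag = cv2.magnitude(gx, gy)
    coarse = (mag > mag.mean() + 2 * mag.std()).astype(np.uint8)
    # Step S36: keep only the vicinity of the coarse edges.
    vicinity = cv2.dilate(coarse, np.ones((9, 9), np.uint8))
    # Step S37: run the Canny filter only inside each vicinity region.
    edges = np.zeros_like(gray)
    n, _, stats, _ = cv2.connectedComponentsWithStats(vicinity)
    for i in range(1, n):                      # label 0 is the background
        x, y, w, h = stats[i, :4]
        roi = gray[y:y + h, x:x + w]
        edges[y:y + h, x:x + w] |= cv2.Canny(roi, canny_low, canny_high)
    return edges
```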
  • In step S38 of FIG. 8, the control device 1 extracts, as candidate objects, bright objects whose contours and edges overlap and substantially match each other.
  • FIG. 27 is a diagram illustrating an example of the bright-object candidates extracted in step S38 of FIG. 8.
  • In step S39 of FIG. 8, the control device 1 extracts, as candidate objects, dark objects whose contours and edges overlap and substantially match each other.
  • FIG. 28 is a diagram illustrating an example of the dark-object candidates extracted in step S39 of FIG. 8.
  • FIG. 29A shows an example of an object.
  • FIG. 29B is a diagram showing the binarized image of FIG. 29A extracted using the threshold value 200.
  • FIG. 29C is a diagram showing an outline of FIG. 29B.
  • FIG. 29D is a diagram showing the edges of FIG. 29A extracted using the threshold value 50.
  • FIG. 29E is a diagram showing the edges of FIG. 29A extracted using the threshold 200.
  • FIG. 29F is a diagram showing the outline and edges of a character.
  • For example, when the pixel luminance ranges from 0 to 255, the object of FIG. 29A includes portions with luminance 0, luminance 128, and luminance 255.
  • The contour of the object of FIG. 29A is obtained as the outline of its binarized image (FIG. 29B), as shown in FIG. 29C.
  • The edges of the object of FIG. 29A are obtained as the portions where the luminance changes abruptly, and different edges are extracted when different threshold values are used (FIGS. 29D and 29E).
  • As shown in FIGS. 29C to 29E, the contour and the edges of an object do not coincide in general. A character object, however, is considered to always have a closed edge, so that its contour and edge coincide. Therefore, character objects can be extracted by selecting objects whose contours and edges substantially match; objects whose edges and contours do not match are deleted as noise.
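A minimal sketch of this contour-versus-edge consistency check, assuming OpenCV 4; the one-pixel alignment tolerance is an assumption, and the 60% acceptance ratio is the figure used later in step S67:

```python
# Sketch: keep an object only if its edges substantially coincide with its contour.
import cv2
import numpy as np

def is_character_object(obj_mask, edge_map, min_match=0.6):
    """obj_mask: 0/255 mask of one candidate object; edge_map: 0/255 edges of its region."""
    contours, _ = cv2.findContours(obj_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    contour_img = np.zeros_like(obj_mask)
    cv2.drawContours(contour_img, contours, -1, 255, 1)
    # Tolerate one pixel of misalignment between contour and edge.
    contour_fat = cv2.dilate(contour_img, np.ones((3, 3), np.uint8))
    edge_px = edge_map > 0
    if not edge_px.any():
        return False
    matched = np.count_nonzero(edge_px & (contour_fat > 0))
    return matched / np.count_nonzero(edge_px) > min_match
```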
  • In step S5 of FIG. 6, the control device 1 determines whether candidate objects were successfully extracted. If YES, the process proceeds to step S6; if NO, the process proceeds to step S10. In step S6, the control device 1 executes the OCR process.
  • FIG. 9 is a flowchart showing a subroutine of the OCR process in step S6 of FIG. 6. It is unknown whether the character string to be recognized is a bright object or a dark object, and whether it extends parallel to the X axis or parallel to the Y axis of FIG. 5; therefore, the OCR subroutine of FIGS. 10 and 11 is executed for all of these combinations.
  • In step S51, the control device 1 executes the OCR subroutine on the assumption that the character string to be recognized is a bright object extending parallel to the X axis.
  • When the character string is assumed to be a bright object, the bright-object candidates extracted in step S38 of FIG. 8 are used; when it is assumed to be a dark object, the dark-object candidates extracted in step S39 of FIG. 8 are used.
  • When the character string is assumed to extend parallel to the X axis, the image of the target area is used as it is; when it is assumed to extend parallel to the Y axis, the image of the target area is rotated 90 degrees before use.
  • In step S52, the control device 1 determines whether the OCR succeeded. If YES, the process proceeds to step S7 of FIG. 6; if NO, the process proceeds to step S53.
  • In step S53, the control device 1 executes the OCR subroutine on the assumption that the character string to be recognized is a bright object extending parallel to the Y axis.
  • In step S54, the control device 1 determines whether the OCR succeeded. If YES, the process proceeds to step S7 of FIG. 6; if NO, the process proceeds to step S55.
  • In step S55, the control device 1 executes the OCR subroutine on the assumption that the character string to be recognized is a dark object extending parallel to the X axis.
  • In step S56, the control device 1 determines whether the OCR succeeded. If YES, the process proceeds to step S7 of FIG. 6; if NO, the process proceeds to step S57.
  • In step S57, the control device 1 executes the OCR subroutine on the assumption that the character string to be recognized is a dark object extending parallel to the Y axis, and then proceeds to step S7 of FIG. 6.
  • FIG. 10 is a flowchart showing the first part of the OCR subroutine in steps S51, S53, S55, and S57 of FIG. 9.
  • FIG. 11 is a flowchart showing the second part of the OCR subroutine in steps S51, S53, S55, and S57 of FIG. 9.
  • In step S61 of FIG. 10, the control device 1 labels the candidate objects, and extracts pluralities of objects that extend in a predetermined direction and are close to each other as character string candidates.
  • FIG. 30 is a diagram showing an example of the character string candidates extracted in step S61 of FIG. 10.
  • FIG. 31 is a diagram illustrating the extraction of character string candidates in step S61 of FIG. 10.
  • If the character string candidate mask 31 placed at a certain position contains even one candidate object pixel, the region inside the character string candidate mask 31 is determined to be part of a character string candidate.
  • The character string candidate mask 31 is scanned over the entire target area, and labels are assigned to the individual connected character string candidates, as sketched below.
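Scanning a mask and absorbing any region that contains a candidate pixel is equivalent to dilating the candidate image with a mask-sized kernel and then labeling the connected components. A sketch under that assumption (the mask dimensions are placeholders, and the string is assumed horizontal):

```python
# Sketch: group nearby candidate objects into character-string candidates.
import cv2
import numpy as np

def label_string_candidates(candidates, mask_w=25, mask_h=7):
    """candidates: 0/255 image of candidate objects."""
    kernel = np.ones((mask_h, mask_w), np.uint8)   # wide mask joins close characters
    merged = cv2.dilate(candidates, kernel)
    n, labels = cv2.connectedComponents(merged)
    return n - 1, labels        # number of string candidates and their label map
```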
  • When the candidate objects are labeled and character string candidates are extracted in step S61, a plurality of adjacent character strings may be extracted as one character string candidate. Therefore, each character string candidate is first separated into character candidates, and character candidates having similar feature values are then recombined into character string candidates based on the feature values (width and height) of each character candidate.
  • In step S62 of FIG. 10, the control device 1 labels the objects in each character string candidate, extracts the plurality of character candidates included in the character string candidate, and generates a bounding box for each character candidate.
  • Each bounding box is a rectangle whose width is parallel to the direction in which the character string candidate extends and whose height is orthogonal to that direction, and it encloses one character candidate.
  • In step S63, the control device 1 deforms each bounding box, based on its width and height, so that the width of the bounding box increases as its height decreases.
  • In step S64, the control device 1 extracts each set of character candidates whose bounding boxes are connected by the deformation as a new character string candidate.
  • FIG. 32 is a diagram illustrating an example of the character candidates extracted in step S62 of FIG. 10 and the generated bounding box.
  • FIG. 33A is a diagram showing an example of the character string candidate extracted in step S61 of FIG. 10.
  • FIG. 33B is a diagram showing an example of the character candidates extracted in step S62 of FIG. 10 and the generated bounding boxes 42.
  • FIG. 33C is a diagram illustrating an example of the bounding boxes 43 deformed in step S63 of FIG. 10.
  • FIG. 33D is a diagram showing an example of the new character string candidates extracted in step S64 of FIG. 10.
  • The character string candidate of FIG. 33A contains two character strings, "2012. 1" and "abc", but they are extracted as one character string candidate.
  • FIG. 33A shows the bounding box 41 of the character string candidate for explanation.
  • In FIG. 33B, the objects in the character string candidate of FIG. 33A are labeled and extracted as character candidates.
  • The bounding box 42 of each character candidate has a width w3 and a height h3.
  • In step S63, each bounding box is deformed based on its width and height.
  • The width w3' and the height h3' of the deformed bounding box 43 are computed from w3 and h3 using predetermined equations,
  • where W is the maximum width of the bounding boxes 42 of the character candidates
  • and H is the maximum height of the bounding boxes 42 of the character candidates.
  • As shown in FIG. 33C, the bounding box of each character candidate is deformed so that the width w3 expands more as the height h3 is lower. Therefore, although the distance between "." and "1" is larger than the distance between "1" and "a", "." and "1" become connected through the deformed bounding boxes 43, while "1" and "a" remain separated.
  • As shown in FIG. 33D, each set of character candidates whose deformed bounding boxes are connected is extracted as a new character string candidate.
  • FIG. 33D shows the bounding boxes 41a and 41b of the new character string candidates for explanation.
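The deformation equations themselves do not survive in this text, so the widening rule below (width grows as height shrinks, scaled by the maxima W and H) is only an assumed stand-in that reproduces the described behavior; the merging of overlapping deformed boxes follows steps S63 and S64:

```python
# Sketch: widen low boxes, then merge boxes that touch into new string candidates.
def regroup_characters(char_boxes):
    """char_boxes: non-empty list of (x, y, w, h) boxes of character candidates."""
    W = max(w for _, _, w, _ in char_boxes)        # maximum candidate width
    H = max(h for _, _, _, h in char_boxes)        # maximum candidate height
    spans = []
    for x, _, w, h in char_boxes:
        grow = W * (1.0 - h / H)                   # lower boxes grow more (assumed rule)
        spans.append((x - grow / 2, x + w + grow / 2))
    order = sorted(range(len(char_boxes)), key=lambda i: spans[i][0])
    strings, current = [], [order[0]]
    right = spans[order[0]][1]
    for i in order[1:]:
        if spans[i][0] <= right:                   # deformed boxes touch or overlap
            current.append(i)
            right = max(right, spans[i][1])
        else:
            strings.append(current)
            current, right = [i], spans[i][1]
    strings.append(current)
    return strings     # each element lists the character indices of one new string
```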
  • FIG. 34A is a diagram showing another example of a character string candidate extracted in step S61 of FIG. 10.
  • FIG. 34B is a diagram showing another example of the character candidates extracted in step S62 of FIG. 10 and the generated bounding boxes 42.
  • In this example, no set of character candidates whose deformed bounding boxes are connected is extracted as a new character string candidate in step S64, so the objects included in the character string candidate of FIG. 34A are deleted as noise.
  • In step S65 of FIG. 10, the control device 1 deletes character string candidates that include more than 10 character candidates.
  • A date character string is considered to contain at most 10 characters. Accordingly, character string candidates including more than 10 character candidates are deleted as noise.
  • In step S66, the control device 1 deletes character string candidates that consist only of character candidates containing two or more objects in the height direction.
  • To count them, the objects included in each character candidate are connected in the height direction, and the number of connected objects is counted (see the sketch after FIG. 37B below).
  • Each of the numerals "0" to "9" is a single connected object. Therefore, if a character string candidate is a date, every character candidate included in it should contain only one object in the height direction.
  • FIG. 36A is a diagram showing an example of the character string candidates extracted in step S64 of FIG. 10.
  • FIG. 36B is a diagram illustrating the number of objects in the height direction of each character candidate included in the character string candidates of FIG. 36A. The number of objects in the height direction of each character candidate is shown on the character candidates 51 of FIG. 36B.
  • FIG. 37A is a diagram showing another example of the character string candidates extracted in step S64 of FIG. 10.
  • FIG. 37B is a diagram illustrating the number of objects in the height direction of each character candidate included in the character string candidates of FIG. 37A.
  • FIGS. 37A and 37B show an example in which a character string that actually extends vertically is erroneously processed as extending horizontally. As shown in FIG. 37B, the number of objects in the height direction of each character candidate is two or more, so the character string candidates of FIGS. 37A and 37B are deleted as noise.
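A sketch of the counting used in step S66: merging everything that lies side by side (a full-width dilation) leaves exactly the objects that are stacked in the height direction, which can then be counted. The horizontal string direction is assumed:

```python
# Sketch: count the objects stacked in the height direction of one character.
import cv2
import numpy as np

def objects_in_height_direction(char_mask):
    """char_mask: 0/255 mask of one character candidate (string runs horizontally)."""
    kernel = np.ones((1, char_mask.shape[1]), np.uint8)   # full-width kernel
    merged = cv2.dilate(char_mask, kernel)   # every non-empty row becomes solid
    n, _ = cv2.connectedComponents(merged)   # vertical runs of non-empty rows
    return n - 1                             # background label excluded

# A date digit ("0" to "9") yields 1; character candidates from a vertically
# extending string processed as horizontal typically yield 2 or more.
```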
  • In step S67, the control device 1 deletes from the target area character string candidates in which the portion where the edge pixels of each object coincide with the contour pixels is 60% or less of the area (number of pixels) of the edge pixels.
  • Since both bright-object and dark-object candidates are extracted, a correct character string candidate is included in only one of the two.
  • In the correct candidate, the edge pixels substantially coincide with the contour pixels.
  • The character string candidate extracted from the other candidate objects is deleted as noise.
  • The edges extracted in step S37 of FIG. 8 are used as the object edges.
  • FIG. 38A is a diagram illustrating an example of an input image.
  • FIG. 38B is a diagram showing bright object candidate objects extracted from the image of FIG. 38A.
  • FIG. 38C is a diagram showing an outline of the candidate object of FIG. 38B.
  • FIG. 38D is a diagram showing edges extracted from the image of FIG. 38A.
  • FIG. 39A is a diagram illustrating an example of an input image.
  • FIG. 39B is a diagram showing dark object candidate objects extracted from the image of FIG. 39A.
  • FIG. 39C is a diagram showing an outline of the candidate object in FIG. 39B.
  • FIG. 39D is a diagram showing edges extracted from the image of FIG. 39A.
  • The input images of FIGS. 38A and 39A contain dark objects. Accordingly, the contours of FIG. 39C and the edges of FIG. 39D substantially match, whereas the contours of FIG. 38C and the edges of FIG. 38D do not.
  • FIG. 35 is a diagram showing an example of the character string candidates after some character string candidates are deleted in steps S65, S66, and S67 of FIG. 10. Compared with FIG. 30, it can be seen that the noise is reduced.
  • In step S68 of FIG. 11, the control device 1 executes the edge strength and area luminance determination process.
  • FIG. 12 is a flowchart showing a subroutine of the edge strength and area luminance determination process in step S68 of FIG. 11.
  • In step S90 of FIG. 12, the control device 1 selects one character string candidate.
  • In step S91, the control device 1 extracts and dilates the contours of the character candidate regions.
  • Specifically, the contour is dilated by adding one pixel around the contour pixels of the character candidate regions.
  • In step S92, the control device 1 applies a Canny filter to the contours dilated in step S91 to detect the edges of the character string candidate region.
  • In step S93, the control device 1 calculates the average edge_M and the deviation edge_D of the edge strength of the character string candidate region, and from them calculates the lower limit edge_L and the upper limit edge_H of the edge strength reference range.
  • In step S94, the control device 1 calculates the average I_M and the deviation I_D of the luminance of the character string candidate region, and from them calculates the lower limit I_L and the upper limit I_H of the luminance reference range.
  • In step S95, the control device 1 selects one character candidate in the selected character string candidate.
  • In step S96, the control device 1 calculates the average edge strength of the selected character candidate's region.
  • In step S97, the control device 1 calculates the average luminance of the selected character candidate's region.
  • In step S98, the control device 1 deletes the selected character candidate when its edge strength or luminance is outside the reference ranges. Specifically, character candidates with an edge strength less than the lower limit edge_L or greater than the upper limit edge_H are deleted as noise, and character candidates with a luminance lower than the lower limit I_L or higher than the upper limit I_H are deleted as noise.
  • In step S99, the control device 1 determines whether there is an unprocessed character candidate. If YES, the process returns to step S95; if NO, the process proceeds to step S100. In step S100, the control device 1 determines whether there is an unprocessed character string candidate. If YES, the process returns to step S90; if NO, the process proceeds to step S69 of FIG. 11.
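The equations for edge_L, edge_H, I_L, and I_H do not survive in this text, so the sketch below assumes a symmetric band of mean ± k·deviation; the factor k is a placeholder:

```python
# Sketch: screen character candidates by edge strength and luminance (steps S93-S98).
import numpy as np

def reference_range(values, k=2.0):
    m, d = float(np.mean(values)), float(np.std(values))
    return m - k * d, m + k * d                  # (lower limit, upper limit)

def screen_characters(edge_strengths, luminances):
    """Per-character average edge strengths and luminances of one string candidate."""
    e_lo, e_hi = reference_range(edge_strengths)
    i_lo, i_hi = reference_range(luminances)
    return [i for i, (e, l) in enumerate(zip(edge_strengths, luminances))
            if e_lo <= e <= e_hi and i_lo <= l <= i_hi]   # indices of kept candidates
```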
  • FIG. 40 is a diagram illustrating an example of an input image for explaining the edge strength and area luminance determination process in step S68 of FIG. 11.
  • FIG. 41 is a diagram showing an example of the character string candidates selected in step S90 of FIG. 12.
  • FIG. 42 is a diagram illustrating an example of the character string candidates processed by the edge strength and area luminance determination process in step S68 of FIG. 11. From FIGS. 40 to 42, it can be seen that the noise is reduced based on the edge strength and the region luminance.
  • In step S69 of FIG. 11, the control device 1 executes the average height determination process.
  • FIG. 13 is a flowchart showing a subroutine of the average height determination process in step S69 of FIG. 11.
  • In step S101 of FIG. 13, the control device 1 selects one character string candidate.
  • Next, the control device 1 connects the objects included in each character candidate in the height direction. To connect the objects in the height direction, a closing operation is performed in the height direction (that is, a dilation is followed by an erosion).
  • The control device 1 then calculates the average and the deviation of the heights of the character candidates and determines a reference range for the height. To determine the height reference range, the median (intermediate value) of the heights of the objects in the character string candidate may be calculated instead of the average and the deviation.
  • In step S104, the control device 1 deletes character candidates whose height is outside the reference range, and extracts the new character string candidates that are formed when the original character string candidate is separated by the deletions (see the sketch after FIG. 43B below).
  • FIG. 43A is a diagram illustrating an example of the character string candidate selected in step S101 of FIG. 13.
  • FIG. 43B is a diagram showing an example of the new character string candidates extracted in step S104 of FIG. 13.
  • In this example, the character candidate 42a is deleted as noise, and new character string candidates 41c and 41d are extracted.
  • In step S105 of FIG. 13, the control device 1 determines whether there is an unprocessed character string candidate. If YES, the process returns to step S101; if NO, the process proceeds to step S70 of FIG. 11.
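A sketch of the height screening of FIG. 13, again assuming a mean ± k·deviation reference range (the text also allows a median-based range); deleting an out-of-range character splits the string at that position:

```python
# Sketch: delete out-of-range heights and split the string at the deletions.
import numpy as np

def split_by_height(heights, k=1.5):
    """heights: per-character heights, in string order."""
    m, d = float(np.mean(heights)), float(np.std(heights))
    lo, hi = m - k * d, m + k * d
    strings, current = [], []
    for idx, h in enumerate(heights):
        if lo <= h <= hi:
            current.append(idx)
        elif current:               # an out-of-range candidate splits the string
            strings.append(current)
            current = []
    if current:
        strings.append(current)
    return strings                  # each element is one new string candidate
```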
  • In step S70 of FIG. 11, the control device 1 determines whether the number of character string candidates is 0. If YES, the process proceeds to step S52, S54, or S56 of FIG. 9 or to step S7 of FIG. 6; if NO, the process proceeds to step S71. In step S71, the control device 1 selects one character string candidate. In step S72, the control device 1 executes the date pattern determination process.
  • FIG. 14 is a flowchart showing a subroutine of the date pattern determination process in steps S72 and S75 of FIG. 11.
  • The control device 1 internally stores a table containing a plurality of date patterns, each including a two-digit or four-digit number representing the year, a one-digit or two-digit number representing the month, and a predetermined punctuation mark. Taking "July 2012" as an example, the date can be written in any of several such patterns.
  • Each date pattern specifies how the numbers and the punctuation mark are arranged.
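The patent's own pattern table is not reproduced in this text; the regular expressions below are assumed examples of such year/month/punctuation patterns, shown only to illustrate how a table-driven match can work:

```python
# Sketch: a table of date patterns and a matcher (step S113 uses such a table).
import re

DATE_PATTERNS = [
    re.compile(r"^(\d{4})\.(\d{1,2})$"),   # e.g. "2012.7" or "2012.07" (year.month)
    re.compile(r"^(\d{2})\.(\d{1,2})$"),   # e.g. "12.7"                (year.month)
    re.compile(r"^(\d{1,2})\.(\d{4})$"),   # e.g. "7.2012"              (month.year)
]

def match_date(text):
    for pattern in DATE_PATTERNS:
        m = pattern.match(text)
        if m:
            return m.groups()        # the year/month fields of the first match
    return None

print(match_date("2012.7"))          # ('2012', '7')
```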
  • In step S111 of FIG. 14, the control device 1 performs OCR for alphanumeric characters on the character string candidate.
  • In step S112, the control device 1 selects one date pattern from the date patterns held in the internal table.
  • The control device 1 then compares the character string recognized in step S111 with the date pattern selected in step S112.
  • In step S113, the control device 1 determines whether the character string matches the date pattern. If YES, the process proceeds to step S114; if NO, the process proceeds to step S115.
  • In step S114, the control device 1 determines whether the character string includes an alphabetic character that clearly cannot be mistaken for a number. If YES, the process proceeds to step S115; if NO, the process proceeds to step S117.
  • In step S115, the control device 1 determines whether all date patterns have been used. If YES, the process proceeds to step S73 (or step S76) of FIG. 11; if NO, the process returns to step S112 and the next date pattern is selected. The control device 1 also determines whether the heights of the characters in the character string are constant.
  • In step S116, the control device 1 determines whether the character string includes alphabetic characters. If YES, the process proceeds to step S117; if NO, the process proceeds to step S118. In step S117, the control device 1 performs OCR for numbers only on the character string candidate, so that characters previously recognized as alphabetic are recognized as numbers.
  • At this time, tilt correction may be performed.
  • This allows, for example, a character string including "1" to be recognized without error.
  • In step S118 of FIG. 14, the control device 1 determines whether the character string is a date. If YES, the process proceeds to step S73 (or step S76) of FIG. 11; if NO, the process returns to step S113.
  • In step S73 of FIG. 11, the control device 1 determines whether the date pattern was successfully determined. If YES, the process proceeds to step S52, S54, or S56 of FIG. 9 or to step S7 of FIG. 6; if NO, the process proceeds to step S74.
  • In step S74, the control device 1 rotates the character string candidate by 180 degrees.
  • In step S75, the control device 1 executes the same date pattern determination process as described above on the character string candidate rotated by 180 degrees.
  • In step S76, the control device 1 determines whether the date pattern was successfully determined. If YES, the process proceeds to step S52, S54, or S56 of FIG. 9 or to step S7 of FIG. 6; if NO, the process proceeds to step S77.
  • In step S77, the control device 1 determines whether there is an unprocessed character string candidate. If YES, the process returns to step S71; if NO, the process proceeds to step S52, S54, or S56 of FIG. 9 or to step S7 of FIG. 6.
  • In step S7 of FIG. 6, the control device 1 determines whether the OCR succeeded, that is, whether a character string representing the expiration date was successfully extracted. If YES, the process proceeds to step S8; if NO, the process proceeds to step S10. When the month is recognized as "1", the actual month may be "10" to "12" that was erroneously recognized as "1" because of the angle of the container 13 or the like. In the following steps, therefore, when the character string candidate from one input image includes only "1" as the number representing the month, it is determined whether a character string candidate from another input image also includes only "1" as the number representing the month. In step S8, the control device 1 determines whether the month is "1".
  • In step S9, the control device 1 determines whether the same date has been detected twice. If YES, the process proceeds to step S12; if NO, the process proceeds to step S10.
  • In step S10, the control device 1 determines whether the container 13 has made one full turn. If YES, the process proceeds to step S13; if NO, the process proceeds to step S11.
  • In step S11, the control device 1 rotates the container 13. For example, when the container 13 is rotated in 15-degree increments, a total of 24 input images can be acquired.
  • In step S12, the control device 1 outputs the date.
  • In step S13, the control device 1 outputs an error.
  • As described above, the various kinds of noise included in the image of a character string are removed in advance, before the character string is optically recognized.
  • Therefore, a character string representing a date can be recognized with high accuracy.
  • The input image is not limited to an image of a cylindrical container and may be another image (an image of a flat object, or arbitrary image data).
  • The date detection process of FIGS. 6 to 14 may be executed, at least in part, by the PC 9.
  • An optical character recognition method for recognizing a character string representing a date may also be performed.
  • Such an optical character recognition method may be implemented as a computer program that optically recognizes a character string when executed by a computer.
  • Such a computer program may be stored in a computer-readable recording medium.
  • For example, such a computer program is stored in the recording medium 10 of FIG. 1, and when the PC 9 reads the computer program from the recording medium 10, the optical character recognition method is performed according to the computer program.
  • The optical character recognition device described above can recognize a character string representing a date with higher accuracy than conventional devices, provided that the date includes a predetermined punctuation mark and is printed in a normal typeface.
  • However, dates that include special punctuation (for example, "2015/5" or "2015 5")
  • and dates printed in special typefaces (for example, characters formed from a plurality of dots separated from each other) may not be recognized.
  • To recognize such special patterns, it is necessary to relax the various judgment conditions (threshold values and so on) in the date detection process of FIGS. 6 to 14 and to extract a larger number of character string candidates. Relaxing the judgment conditions increases the noise and lengthens the time required to execute the date detection process, so it is desirable to recognize special-pattern dates while suppressing the increase in execution time.
  • FIG. 44 is a flowchart showing the date detection process executed by the control device 1 of the optical character recognition device according to the second embodiment of the present invention. Steps S1 to S12 of FIG. 44 are the same as steps S1 to S12 of FIG. 6.
  • The date detection process of FIG. 44 includes a special-pattern date detection process in step S14 instead of step S13 of FIG. 6. In the special-pattern date detection process of step S14, judgment conditions (threshold values and so on) more relaxed than those used in steps S2, S4, and S6 of FIG. 44 are set, and a date detection process similar to that of FIG. 6 is executed. Containers printed with special-pattern dates are few in type and quantity, and the dates printed on most containers can be recognized by executing a date detection process in which conditions are set rather restrictively.
  • Since the special-pattern date detection process of step S14 is executed only when no date character string is recognized by executing steps S1 to S11 of FIG. 44,
  • special-pattern dates can be recognized while the increase in execution time is suppressed.
  • When the date character string is orthogonal to the rotation axis of the cylindrical container (FIGS. 3 and 4), the entire date may not fit in one image. In particular, when the date string extends over more than half the circumference of the side surface of the cylindrical container, no single image including the entire date can be acquired.
  • FIG. 45 is a flowchart showing the date detection process executed by the control device 1 of the optical character recognition device according to the third embodiment of the present invention. Steps S1 to S12 of FIG. 45 are the same as steps S1 to S12 of FIG. 6.
  • The date detection process of FIG. 45 includes a date detection process for connected images in step S15 instead of step S13 of FIG. 6.
  • FIG. 46 is a flowchart showing a subroutine of the date detection process for connected images in step S15 of FIG. 45.
  • First, the control device 1 photographs the container 13 with the camera 5 while rotating the container 13 by a certain angle at a time using the rollers 8a and 8b, and acquires a plurality of images, each representing the container 13 at a different angle.
  • The control device 1 then connects a plurality of images showing mutually adjacent portions of the container 13 to generate one connected image. Specifically, the control device 1 connects two adjacent images by recognizing similar objects in both of them.
  • The connected image is a planar image in which the side surface of the container is developed.
  • At this time, the control device 1 corrects the curved portions of the container 13 to a plane using a projective transformation, taking the width of the container 13 as the diameter of the cylinder.
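Under a roughly orthographic view, a point at arc length s on a cylinder of radius R projects to x = R·sin(s/R), so the unrolled coordinate is s = R·asin(x/R). A sketch of such a flattening, taking the image width of the container as the cylinder diameter as stated above; everything else is an assumption:

```python
# Sketch: unroll the visible half of a cylinder into a flat image.
import cv2
import numpy as np

def unroll_cylinder(img):
    h, w = img.shape[:2]
    R = w / 2.0                                    # image width taken as the diameter
    out_w = int(np.pi * R)                         # length of the visible half circumference
    angle = np.linspace(-np.pi / 2, np.pi / 2, out_w)
    map_x = np.tile((R + R * np.sin(angle)).astype(np.float32), (h, 1))
    map_y = np.tile(np.arange(h, dtype=np.float32).reshape(-1, 1), (1, out_w))
    return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR)
```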
  • Steps S2 to S9, S12, and S13 of FIG. 46 are the same as steps S2 to S9, S12, and S13 of FIG. 6.
  • The control device 1 may generate the connected image in advance, before executing step S15 of FIG. 45.
  • A line camera is also known as a means for generating an image of the side surface of a cylindrical object.
  • However, when a line camera is used, in addition to the cost of the line camera itself, there is the cost of providing a mechanism that rotates the object with high accuracy. Since medicine containers have various shapes and sizes, the cost of rotating them with accuracy sufficient for photographing with a line camera becomes very large.
  • In the date detection process of FIG. 45, the increase in cost can be suppressed by connecting a plurality of images captured by a normal camera to generate the connected image.
  • Since the connected-image date detection process of step S15 is executed only when no date character string is recognized by executing steps S1 to S11 of FIG. 45, dates that do not fit within one image can be recognized while the increase in execution time is suppressed.
  • FIG. 47 is a flowchart showing a date pattern determination processing subroutine of date detection processing executed by the control device 1 of the optical character recognition device according to the fourth embodiment of the present invention.
  • The date pattern determination process of FIG. 47 is executed in steps S72 and S75 of FIG. 11, and includes additional steps S121 and S122 between steps S113 and S114 of FIG. 14.
  • In step S121 of FIG. 47, when the number representing the month in the character string matching the date pattern is “10”, “11”, or “12”, the control device 1 determines, based on the criterion described below, whether other characters follow the date. If YES, the process proceeds to step S122; if NO, the process proceeds to step S114.
  • FIG. 48 is a diagram showing an example of a character string candidate containing a date character string and other characters. The characters “CJ932” follow the date “2016.1”. If the “C” is misrecognized as “0”, the character string matching the date pattern is misrecognized as “2016.10”. When it detects the date character string “2016.10”, the control device 1 therefore determines whether the final “0” is actually part of the date.
  • The inter-character distances D1 to D10 in FIG. 48 are given, for example, in units of pixels. In step S121, the distance D6 between the two digits of the month is compared with the average of the inter-character distances D7 to D10 following “2016.10”.
  • When D6 is greater than this average, the control device 1 determines that the final “0” of “2016.10” is not part of the date and, in step S122, removes the characters “CJ932”.
  • Otherwise, the control device 1 determines that the final “0” of “2016.10” is part of the date. In this way, even when other characters immediately follow the date character string, the character string representing the date can be recognized with high accuracy.
  • In summary, when a character string candidate includes two digits representing the month followed by at least one other character, and the distance between the two month digits is greater than the average of the distance between the second month digit and the first other character and the distances between the other characters (step S121), the control device 1 keeps only the first month digit and removes the second digit together with the other characters (step S122); a sketch of this criterion follows.
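The criterion of steps S121 and S122 reduces to a small predicate. The gap values in the example below are invented for illustration; only the comparison itself comes from the text above.

```python
def month_digit_is_separate(d_month: float, following: list[float]) -> bool:
    """Step S121: compare the gap between the two month digits (D6 in
    FIG. 48) with the average gap between the characters that follow
    (D7 to D10). A clearly larger gap means the second "digit" belongs
    to the following character group and is removed in step S122."""
    return d_month > sum(following) / len(following)

# Invented gaps for "2016.1" + "CJ932": the gap before "C" (misread as
# "0") is wide, while the gaps inside "CJ932" are narrow.
print(month_digit_is_separate(14.0, [6.0, 5.0, 6.0, 5.0]))  # True  -> date is "2016.1"
print(month_digit_is_separate(5.0, [6.0, 5.0, 6.0, 5.0]))   # False -> month is "10"
```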
  • the optical character recognition device, the optical character recognition method, the computer program, and the recording medium of the present invention can recognize a character string representing a date with higher accuracy than before.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Character Input (AREA)
  • Character Discrimination (AREA)
  • Image Analysis (AREA)
  • Medical Preparation Storing Or Oral Administration Devices (AREA)

Abstract

The invention concerns an optical character recognition device that extracts, from an input image, a target region containing objects that are subjects of recognition. The optical character recognition device extracts candidate objects including at least one character-string candidate object from among the objects contained in the target region. The optical character recognition device labels the candidate objects, extracts a plurality of objects close to one another and extending in a predetermined direction as character string candidates, determines whether the character string candidates contain a date pattern comprising a two- or four-digit number representing the year, a one- or two-digit number representing the month, and a predetermined punctuation mark, and, when a character string candidate contains a date pattern, recognizes that character string candidate as a date.
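Read as an algorithm, the date pattern described in this abstract corresponds to a simple regular expression. The sketch below is one plausible encoding that assumes ".", "-", and "/" as the predetermined punctuation marks; the mark set actually claimed may differ.

```python
import re

# Two- or four-digit year, a predetermined punctuation mark, then a
# one- or two-digit month (1-12, optionally zero-padded).
DATE_PATTERN = re.compile(r"(?:\d{4}|\d{2})[.\-/](?:0?[1-9]|1[0-2])")

for s in ["2016.10", "16-1", "2016/13", "201.1"]:
    print(s, bool(DATE_PATTERN.fullmatch(s)))
# -> 2016.10 True, 16-1 True, 2016/13 False, 201.1 False
```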
PCT/JP2014/068725 2013-07-16 2014-07-14 Optical character recognition device WO2015008732A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2015527291A JP6344389B2 (ja) 2013-07-16 2014-07-14 光学文字認識装置
CN201480040348.1A CN105431866A (zh) 2013-07-16 2014-07-14 光学字符识别装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013147546 2013-07-16
JP2013-147546 2013-07-16

Publications (1)

Publication Number Publication Date
WO2015008732A1 true WO2015008732A1 (fr) 2015-01-22

Family

ID=52346186

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/068725 WO2015008732A1 (fr) 2013-07-16 2014-07-14 Dispositif de reconnaissance optique de caractères

Country Status (4)

Country Link
JP (1) JP6344389B2 (fr)
CN (1) CN105431866A (fr)
TW (1) TWI608422B (fr)
WO (1) WO2015008732A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180117596A (ko) * 2016-03-02 2018-10-29 핑안 테크놀로지 (션젼) 컴퍼니 리미티드 Method, apparatus and system for automatically extracting the expiration date of a driving licence, and storage medium
CN110414496A (zh) * 2018-04-26 2019-11-05 百度在线网络技术(北京)有限公司 Similar character recognition method and apparatus, computer device, and storage medium

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106407976B (zh) * 2016-08-30 2019-11-05 百度在线网络技术(北京)有限公司 Method and apparatus for generating an image character recognition model and recognizing vertical-column character images
JP6401806B2 (ja) * 2017-02-14 2018-10-10 株式会社Pfu Date identification device, date identification method, and date identification program
JP6480985B2 (ja) * 2017-07-03 2019-03-13 ファナック株式会社 NC program conversion device
JP6949596B2 (ja) 2017-07-20 2021-10-13 東芝テック株式会社 Product data processing device and product data processing program
TWI685796B (zh) * 2018-05-31 2020-02-21 國立中興大學 Intelligent character and graphic recognition method
CN110490192A (zh) * 2019-07-16 2019-11-22 广东工业大学 Method and system for detecting product production-date labels
TWI797531B (zh) * 2020-12-31 2023-04-01 國立臺北科技大學 Medicine storage management system


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5405015A (en) * 1993-08-11 1995-04-11 Videojet Systems International, Inc. System and method for seeking and presenting an area for reading with a vision system
TW200641708A (en) * 2005-05-25 2006-12-01 Systex Corp Fast inquiry system and method for merchandise data
JP2009199102A (ja) * 2008-02-19 2009-09-03 Fujitsu Ltd Character recognition program, character recognition device, and character recognition method
US8577145B2 (en) * 2009-12-19 2013-11-05 Pcas Patient Care Automation Services Inc. Automated dispensary for identifying embossed characters and package labeling
CN101968865B (zh) * 2010-11-17 2013-12-11 上海合合信息科技发展有限公司 Method for adding a reminder event to an electronic calendar
JP5647919B2 (ja) * 2011-03-07 2015-01-07 株式会社Nttドコモ Character recognition device, character recognition method, character recognition system, and character recognition program

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001175808A * 1999-12-22 2001-06-29 Fujitsu Ltd Image processing device, and computer-readable recording medium storing an image processing program
JP2004178044A * 2002-11-25 2004-06-24 Mitsubishi Electric Corp Attribute extraction method, attribute extraction device, and attribute extraction program
JP2011521520A * 2008-04-16 2011-07-21 ワイコフ, リチャード ダレル Portable multimedia receiving and transmitting device
JP2009294704A * 2008-06-02 2009-12-17 Mitsubishi Heavy Ind Ltd Number recognition device and number recognition method
JP2010115339A * 2008-11-13 2010-05-27 Ookuma Electronic Co Ltd Information reading device for empty injection-solution containers
JP2012137841A * 2010-12-24 2012-07-19 Institute Of National Colleges Of Technology Japan Picking system and picking method

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180117596A (ko) * 2016-03-02 2018-10-29 핑안 테크놀로지 (션젼) 컴퍼니 리미티드 Method, apparatus and system for automatically extracting the expiration date of a driving licence, and storage medium
JP2018533808A (ja) * 2016-03-02 2018-11-15 平安科技(深▲せん▼)有限公司 Driving licence expiration date automatic extraction method, device, system, and storage medium
EP3425563A4 (fr) * 2016-03-02 2019-10-23 Ping An Technology (Shenzhen) Co., Ltd. Automatic extraction method, device and system for driving licence expiration date, and storage medium
KR102152191B1 (ko) * 2016-03-02 2020-09-07 핑안 테크놀로지 (션젼) 컴퍼니 리미티드 Method, apparatus and system for automatically extracting the expiration date of a driving licence, and storage medium
CN110414496A (zh) * 2018-04-26 2019-11-05 百度在线网络技术(北京)有限公司 Similar character recognition method and apparatus, computer device, and storage medium
CN110414496B (zh) * 2018-04-26 2022-05-27 百度在线网络技术(北京)有限公司 Similar character recognition method and apparatus, computer device, and storage medium

Also Published As

Publication number Publication date
TWI608422B (zh) 2017-12-11
JPWO2015008732A1 (ja) 2017-03-02
TW201506800A (zh) 2015-02-16
JP6344389B2 (ja) 2018-06-20
CN105431866A (zh) 2016-03-23

Similar Documents

Publication Publication Date Title
JP6344389B2 (ja) Optical character recognition device
JP6354589B2 (ja) Object identification device, method, and program
JP7102490B2 (ja) Image recognition device
JP6143111B2 (ja) Object identification device, object identification method, and program
US8300928B2 (en) System and method for locating a target region in an image
US8971569B2 (en) Marker processing method, marker processing device, marker, object having a marker, and marker processing program
JP6278276B2 (ja) Object identification device, object identification method, and program
US8538170B2 (en) System and method for document location and recognition
TWI751426B (zh) Image processing system, image processing method, and program product
TWI725465B (zh) Image processing system, image processing method, and program product
US9652652B2 (en) Method and device for identifying a two-dimensional barcode
JP2023040038A (ja) Program, information processing method, and information processing device
JP2017173925A (ja) Optical character recognition device
JP5160366B2 (ja) Pattern matching method for electronic components
JP2006330872A (ja) Fingerprint matching device, method, and program
JP6890849B2 (ja) Information processing system
CN113658039A (zh) Method for determining the stitching order of medicine bottle label images
US11817207B1 (en) Medication inventory system including image based boundary determination for generating a medication tray stocking list and related methods
JP6941331B2 (ja) Image recognition system
JP2006330873A (ja) Fingerprint matching device, method, and program
AU2016238852A1 (en) Method for analyzing contents of at least one image of a deformed structured document
JP2023061880A (ja) Matching device and program
WO2015181580A1 (fr) Automated examination of forms by means of augmented reality
Hampapur et al. Key-word Guided Word Spotting In Printed Text
Diem et al. Registration of Manuscript Images using Rotation Invariant Features

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201480040348.1

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14825778

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015527291

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14825778

Country of ref document: EP

Kind code of ref document: A1