CN117152415A - Method, device, equipment and storage medium for detecting marker of medicine package - Google Patents

Method, device, equipment and storage medium for detecting marker of medicine package

Info

Publication number
CN117152415A
Authority
CN
China
Prior art keywords
package
marker
image
region
interest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311119565.7A
Other languages
Chinese (zh)
Other versions
CN117152415B (en)
Inventor
潘健岳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Aocheng Intelligent Technology Co ltd
Original Assignee
Beijing Aocheng Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Aocheng Intelligent Technology Co ltd filed Critical Beijing Aocheng Intelligent Technology Co ltd
Priority to CN202311119565.7A priority Critical patent/CN117152415B/en
Publication of CN117152415A publication Critical patent/CN117152415A/en
Application granted granted Critical
Publication of CN117152415B publication Critical patent/CN117152415B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1439Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1443Methods for optical code recognition including a method step for retrieval of the optical code locating of the code in an image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1439Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1447Methods for optical code recognition including a method step for retrieval of the optical code extracting optical codes from image or text carrying said optical code
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1495Methods for optical code recognition the method including an image compression step
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/28Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/30Noise filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/14Image acquisition
    • G06V30/146Aligning or centring of the image pick-up or image-field
    • G06V30/147Determination of region of interest
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/14Image acquisition
    • G06V30/148Segmentation of character regions
    • G06V30/15Cutting or merging image elements, e.g. region growing, watershed or clustering-based techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/16Image preprocessing
    • G06V30/162Quantising the image signal
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/16Image preprocessing
    • G06V30/164Noise filtering
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Artificial Intelligence (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The application provides a method, a device, equipment and a storage medium for detecting a marker of a medicine package. The method acquires a target image, wherein the target image comprises image information of a target package; extracts a region of interest from the target image according to the relative position of the target package in the duplex package; and compresses the data matrix of the region of interest to obtain a one-dimensional graph element matrix of the region of interest, in which each element represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest, and the element values at the edges of the marker are larger than a set threshold value. The method then judges whether the visual characteristics of the marker meet preset detection conditions. Embodiments of the application can realize automatic detection of the marker in the medicine package, improve the detection efficiency and accuracy of marker detection in duplex medicine packages, avoid the risk of secondary pollution caused by the detection process, and ensure the medicine quality.

Description

Method, device, equipment and storage medium for detecting marker of medicine package
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a method, a device, equipment and a storage medium for detecting a marker of medicine packaging.
Background
At present, drugs are classified into solid preparations and liquid preparations according to the formulation method. Solid preparations include various forms such as tablets, capsules, powders, granules and the like.
In the production and preparation process of the traditional Chinese medicine package, the traditional Chinese medicine preparation can be packaged by adopting a small bag.
In the related art, after the pouch packaging process is completed, it is necessary to manually check whether the package has defects. However, human eyes are easily fatigued, so the detection efficiency and accuracy of the manual detection method are limited by the condition of the operators. In addition, the intervention of manual operation can cause secondary pollution of the medicine, reducing the quality of the medicine.
Disclosure of Invention
In this context, the embodiment of the application is expected to provide a method, a device, equipment and a storage medium for detecting a marker of a medicine package, which are used for realizing automatic detection of the marker in the medicine package, improving the detection efficiency of the marker in the medicine package, improving the accuracy of the marker detection in the medicine package, avoiding secondary pollution caused by the detection process and guaranteeing the medicine quality.
In a first aspect of embodiments of the present application, there is provided a method of detecting a marker of a pharmaceutical package, comprising:
Acquiring a target image; the target image comprises image information of a target package, wherein the target package is one independent package in a dual package, and the dual package comprises two independent packages connected with each other;
extracting a region of interest from the target image according to the relative position of the target package in the duplex package;
compressing the data matrix of the region of interest to obtain a one-dimensional graph element matrix of the region of interest; wherein each element in the one-dimensional image element matrix respectively represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest; the element value at the edge of the marker in the one-dimensional graph element matrix is larger than a set threshold value; the marker is used for indicating a cuttable region in the duplex package;
judging whether the visual characteristics of the marker accord with preset detection conditions or not; wherein the visual characteristics of the marker are determined based on the elements at the edges of the marker.
In a second aspect of embodiments of the present application, there is provided a marker detection device for pharmaceutical packaging, the device comprising:
an acquisition unit configured to acquire a target image; the target image comprises image information of a target package, wherein the target package is one independent package in a dual package, and the dual package comprises two independent packages connected with each other;
The extraction unit is used for extracting a region of interest from the target image according to the relative position of the target package in the bigeminal package;
the compression unit is used for compressing the data matrix of the region of interest to obtain a one-dimensional graph element matrix of the region of interest; wherein each element in the one-dimensional image element matrix respectively represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest; the element value at the edge of the marker in the one-dimensional graph element matrix is larger than a set threshold value; the marker is used for indicating a cuttable region in the duplex package;
the judging unit is used for judging whether the visual characteristics of the marker accord with preset detection conditions or not; wherein the visual characteristics of the marker are determined based on the elements at the edges of the marker.
In a third aspect of embodiments of the present application, there is provided a computing device comprising:
at least one processor, memory, and input output unit;
wherein the memory is for storing a computer program and the processor is for invoking the computer program stored in the memory to perform the marker detection method of the pharmaceutical packaging of the first aspect.
In a fourth aspect of embodiments of the present application, there is provided a computer readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the marker detection method of the pharmaceutical product package of the first aspect.
The embodiment of the application provides a method, a device, equipment and a storage medium for detecting a marker of medicine package. In an embodiment of the application, a target image is acquired, the target image comprising image information of a target package, the target package being one of the independent packages in a duplex package, where the duplex package comprises two independent packages connected with each other. Further, a region of interest is extracted from the target image based on the relative position of the target package in the duplex package. Further, the data matrix of the region of interest is compressed, and a one-dimensional graph element matrix of the region of interest is obtained. Each element in the one-dimensional graph element matrix represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest, and the element value at the edge of the marker in the one-dimensional graph element matrix is larger than a set threshold value. The marker is used to indicate a cuttable region in the duplex package. Finally, it is judged whether the visual characteristics of the marker meet preset detection conditions. The visual characteristics of the marker are determined based on the elements at the edges of the marker.
In the embodiment of the application, the region of interest to be detected is extracted from the medicine package image (namely the target image), and then the phenomenon that the element value at the edge of the marker is larger than the set threshold value is referenced, and the visual characteristics of the marker are obtained from the one-dimensional image element matrix obtained by compressing the region of interest and are used for judging whether the marker is qualified or not, so that the automatic detection of medicine package is realized, and the detection efficiency of the marker in the medicine package is greatly improved. Meanwhile, the problem of accuracy reduction caused by human eye fatigue and the risk of secondary pollution of medicines caused by manual operation intervention are effectively avoided by automatically extracting and detecting the visual characteristics in the images, so that the accuracy of marker detection in medicine packaging is greatly improved, and the medicine quality is ensured.
Drawings
The above, as well as additional purposes, features, and advantages of exemplary embodiments of the present application will become readily apparent from the following detailed description when read in conjunction with the accompanying drawings. Several embodiments of the present application are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
FIG. 1 is a flow chart of a method for detecting a label of a pharmaceutical package according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a target image according to an embodiment of the present application;
FIG. 3 is a flowchart of a target image acquisition method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a target image according to another embodiment of the present application;
FIG. 5 is a flowchart illustrating a method for extracting a region of interest according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a marker according to an embodiment of the present application;
FIG. 7 is a flowchart of a method for determining eligibility of a marker according to an embodiment of the present application;
FIG. 8 is a schematic view of a marker according to another embodiment of the present application;
FIG. 9 is a flowchart illustrating a method for preprocessing a region of interest according to an embodiment of the present application;
FIG. 10 is a schematic diagram illustrating the effect of preprocessing a region of interest according to an embodiment of the present application;
FIG. 11 is a schematic structural view of a label detection device for pharmaceutical packaging according to an embodiment of the present application;
FIG. 12 schematically illustrates a schematic structural diagram of a medium according to an embodiment of the present application;
FIG. 13 schematically illustrates a structural diagram of a computing device in accordance with embodiments of the present application.
In the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
Detailed Description
The principles and spirit of the present application will be described below with reference to several exemplary embodiments. It should be understood that these embodiments are presented merely to enable those skilled in the art to better understand and practice the application and are not intended to limit the scope of the application in any way. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Those skilled in the art will appreciate that embodiments of the application may be implemented as a system, apparatus, device, method, or computer program product. Accordingly, the present disclosure may be embodied in the following forms, namely: complete hardware, complete software (including firmware, resident software, micro-code, etc.), or a combination of hardware and software.
At present, drugs are classified into solid preparations and liquid preparations according to the formulation method. Solid preparations include various forms such as tablets, capsules, powders, granules and the like.
In the production and preparation process of the traditional Chinese medicine package, the traditional Chinese medicine preparation can be packaged by adopting a small bag. At present, the production process of the pouch package mainly comprises: aiming at a certain batch of traditional Chinese medicine preparations, typesetting and editing corresponding medicine information, printing the medicine information on packaging materials of small bags by using a high-speed rewinder, and finally, processing the medicine raw materials and the packaging materials into the traditional Chinese medicine preparations packaged by the small bags by using a small bag packaging machine through two procedures of filling, heat sealing, cutting and the like.
In the related art, after the pouch packaging process is completed, it is necessary to manually inspect the packaging bag for possible defects, such as package breakage, white edges, printing errors and black-mark defects. However, the manual detection method is limited by the fatigue of human eyes, which reduces its accuracy and detection efficiency. In addition, the intervention of manual operation can cause secondary pollution of the medicine, reducing the quality of the medicine.
In summary, a new solution is needed to solve at least one of the above technical problems.
In order to overcome the above technical problems, according to the embodiments of the present application, a method, a device, an apparatus and a storage medium for detecting a marker of a pharmaceutical package are provided. In an embodiment of the application, a target image is acquired, the target image comprising image information of a target package, the target package being one of the independent packages in a duplex package, where the duplex package comprises two independent packages connected with each other. Further, a region of interest is extracted from the target image based on the relative position of the target package in the duplex package. Further, the data matrix of the region of interest is compressed, and a one-dimensional graph element matrix of the region of interest is obtained. Each element in the one-dimensional graph element matrix represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest, and the element value at the edge of the marker in the one-dimensional graph element matrix is larger than a set threshold value. The marker is used to indicate a cuttable region in the duplex package. Finally, it is judged whether the visual characteristics of the marker meet preset detection conditions. The visual characteristics of the marker are determined based on the elements at the edges of the marker.
In the embodiment of the application, the region of interest to be detected is extracted from the medicine package image, and furthermore, the phenomenon that the element value at the edge of the marker is larger than the set threshold value is referred to, and the visual characteristics of the marker are obtained from the one-dimensional image element matrix obtained by compressing the region of interest and used for judging whether the marker is qualified or not, so that the automatic detection of medicine package is realized, and the detection efficiency of the marker in the medicine package is greatly improved. Meanwhile, the problem of accuracy reduction caused by human eye fatigue and the risk of secondary pollution of medicines caused by manual operation intervention are effectively avoided by automatically extracting and detecting the visual characteristics in the images, so that the accuracy of marker detection in medicine packaging is greatly improved, and the medicine quality is ensured.
The technical scheme provided by the embodiment of the application can be realized by a server and/or terminal equipment. The server and/or the terminal device may be deployed in the production device of the pharmaceutical packaging, or may be a production device independent of the pharmaceutical packaging, and the transmission of the data instructions is achieved by establishing wireless communication and/or wired communication.
It should be noted that, the server according to the embodiment of the present application may be an independent physical server, or may be a server cluster or a distributed system formed by a plurality of physical servers, or may be a cloud server that provides cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, big data, and basic cloud computing services such as an artificial intelligent platform.
The terminal device according to the embodiment of the present application may be a device that provides voice and/or data connectivity to a user, a handheld device with a wireless connection function, or another processing device connected to a wireless modem, for example a mobile telephone (or "cellular" telephone) or a computer with a mobile terminal. These may be portable, pocket-sized, hand-held, computer-built-in or vehicle-mounted mobile devices that exchange voice and/or data with a radio access network, such as Personal Communication Service (PCS) phones, cordless phones, Session Initiation Protocol (SIP) phones, Wireless Local Loop (WLL) stations, Personal Digital Assistants (PDAs), and the like.
In this document, it should be understood that any number of elements in the drawings is for illustration and not limitation, and that any naming is used only for distinction and not for any limitation.
The principles and spirit of the present application are explained in detail below with reference to several representative embodiments thereof.
Referring now to fig. 1, fig. 1 is a flow chart illustrating a method for detecting a label of a pharmaceutical package according to an embodiment of the present application. The label detection method for medicine package provided by the embodiment of the application comprises the following steps:
Step S101, a target image is acquired.
In an embodiment of the present application, the target image includes a pharmaceutical package. Specifically, the target image contains image information of the target package. The target package is one of the individual packages in a dual package comprising two connected individual packages. For example, the target image includes at least a complete outline of one individual package and a partial outline of another individual package.
In practical application, the medicine package related to the application can be used for packaging medicines of solid preparations. For example, a medicine packaging bag for packaging solid Chinese medicinal preparations. From a packaging form, for example, the pharmaceutical package may be a duplex pharmaceutical package, where duplex refers to a package structure in which two individual packages are connected together. See fig. 2 for a two-part pharmaceutical package comprising a complete image of package a and a partial image of package b. In fig. 2, the package a and the package b are independent of each other and are used for packaging solid preparations, respectively. A detachable connecting structure is arranged between the packaging bag a and the packaging bag b. It should be noted that, in addition to the dual medicine package, the detachable independent medicine package may also be configured into a triple, quadruple or other number of package forms, which is not limited in this embodiment of the present application.
In the embodiment of the application, the target image can be acquired by an image acquisition device, and the image acquisition device can be a camera, a video camera and other devices. For example, the image acquisition device may be a camera module installed in a packaging device or a packaging line. Or a camera module in a mobile terminal or other device, which is connected to the packaging device by a clamping device or other connection structure.
In an alternative embodiment of the present application, as shown in fig. 3, the step of acquiring the target image in step S101 may be implemented as the following steps:
step S301, an original image containing target package image information is acquired.
In the embodiment of the application, the original image is an image comprising the complete outline of the medicine package, and can be acquired by any image acquisition mode. For example, the drug package is placed in a solid background and the original image is acquired. For example, the medicine package is fixed in the field of view of the image capturing device by the holder, whereby the original image is captured by the image capturing device.
Step S302, binarization processing is carried out on the original image to obtain a target image.
Illustratively, the original image acquired in step S301 is assumed to be fig. 2, and the package contained in the original image is assumed to be the package bag a in fig. 2. Based on this, referring to fig. 4 together, fig. 4 is a schematic diagram of a binarized image (i.e. a target image) obtained by performing the binarization processing on the original image in step S302.
Through steps S301 to S302, automatic binarization processing can be performed on the original image, so as to provide a data base for the packaged contour extraction process, and further improve robustness and usability.
In practice, the method of binarizing the original image in step S302 includes, but is not limited to: the Otsu method (OTSU), the TRIANGLE (triangule) method, the window-based binarization method, and the fuzzy set theory-based automatic image binarization (Huang's fuzzy thresholding method).
In order to acquire the outer contour of the package, in step S302, binarization processing is required for the original image. For example, to improve robustness and ease of use, the implementation principle of Huang's fuzzy thresholding method is as follows:
First, a fuzzy subset mapping the image X (i.e., the original image) to the value range [0, 1] is defined:

X = {(x_mn, μ_X(x_mn))}

where x_mn denotes the gray value of the pixel at point (m, n) of the image X, and μ_X(x_mn) denotes the membership value of that point with respect to a certain attribute, with 0 ≤ μ_X(x_mn) ≤ 1, m = 0, 1, …, M−1, n = 0, 1, …, N−1. For binarization, each pixel is closely related to the class to which it belongs; the value of μ_X(x_mn) can therefore be expressed in terms of this relationship.
For a given threshold t, the gray-value averages μ_0 and μ_1 of the background pixels and the foreground pixels are expressed by the following formulas:

μ_0 = Σ_{g=0}^{t} g·h(g) / Σ_{g=0}^{t} h(g)

μ_1 = Σ_{g=t+1}^{L−1} g·h(g) / Σ_{g=t+1}^{L−1} h(g)

where h(g) represents the number of pixels in the image having gray level g, and L is the number of gray levels.
The background gray average μ_0 and the foreground gray average μ_1 can be regarded as the target values of the background and foreground regions determined by the threshold t. The relationship between a point in the image X and the region to which it belongs should intuitively depend on the difference between the gray value of that point and the target value of its region. Thus, for point (m, n), the following membership function is used:

μ_X(x_mn) = 1 / (1 + |x_mn − μ_0| / C), if x_mn ≤ t
μ_X(x_mn) = 1 / (1 + |x_mn − μ_1| / C), if x_mn > t

where C is a constant chosen so that the above function satisfies 0.5 ≤ μ_X(x_mn) ≤ 1.
Based on the Shannon entropy function, the entropy of a fuzzy set A is defined as:

E(A) = (1 / (M·N)) Σ_m Σ_n S(μ_A(x_mn))

where the Shannon function is:

S(μ_A(x_i)) = −μ_A(x_i)·ln[μ_A(x_i)] − [1 − μ_A(x_i)]·ln[1 − μ_A(x_i)]
Since a gray-scale image has at most L gray levels, the above expression can be further converted into:

E(X; t) = (1 / (M·N)) Σ_{g=0}^{L−1} S(μ_X(g))·h(g)

Finally, among all possible thresholds t, the value of t that minimizes the Shannon entropy is taken as the final segmentation threshold. The image X is then color-segmented with this segmentation threshold, realizing automatic binarization of the image X and obtaining the binarized image.
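For illustration only, the derivation above can be sketched in Python/NumPy as follows. This is a minimal sketch of Huang's fuzzy thresholding under the assumptions stated in the comments; the function name, the choice C = g_max − g_min and the 8-bit gray input are assumptions of the example, not details taken from this application:

```python
import numpy as np

def huang_threshold(gray: np.ndarray) -> int:
    """Sketch of Huang's fuzzy thresholding: return the gray level t that
    minimises the fuzzy (Shannon) entropy of an 8-bit gray image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    levels = np.arange(256, dtype=np.float64)
    first, last = int(gray.min()), int(gray.max())
    C = max(last - first, 1)                      # assumed normalisation constant

    best_t, best_entropy = first, np.inf
    for t in range(first, last):                  # candidate thresholds
        bg, fg = hist[:t + 1], hist[t + 1:]
        if bg.sum() == 0 or fg.sum() == 0:
            continue
        mu0 = (levels[:t + 1] * bg).sum() / bg.sum()   # background gray average
        mu1 = (levels[t + 1:] * fg).sum() / fg.sum()   # foreground gray average
        # membership of every gray level to its own region
        mu = np.where(levels <= t,
                      1.0 / (1.0 + np.abs(levels - mu0) / C),
                      1.0 / (1.0 + np.abs(levels - mu1) / C))
        mu = np.clip(mu, 1e-12, 1 - 1e-12)        # keep ln() finite
        S = -mu * np.log(mu) - (1 - mu) * np.log(1 - mu)   # Shannon function
        entropy = (S * hist).sum() / hist.sum()   # weighted by the histogram h(g)
        if entropy < best_entropy:
            best_entropy, best_t = entropy, t
    return best_t

# usage sketch: binary = (gray > huang_threshold(gray)).astype(np.uint8) * 255
```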
After the target image is acquired, step S102 extracts the region of interest from the target image according to the relative position of the target package in the duplex package.
In the embodiment of the application, the region of interest is an image region which is intended to be detected by a user in the duplex package. In practical applications, taking medicine packaging as an example, the region of interest includes, but is not limited to, at least one of the following image regions: a marker for the pharmaceutical packaging.
The marker, on the one hand, provides a beat signal to the packaging equipment during the production of the pharmaceutical package; on the other hand, it indicates a trimmable area in the pharmaceutical packaging material. The packaging equipment therefore cuts the pharmaceutical packaging material at the marker position, making it into individual pharmaceutical packages. In practice, the markers are also known as black marks, i.e., two black rectangular marks in the top area of the pharmaceutical package. Typically, the black marks are printed on the pharmaceutical packaging material in advance, with their length, width and spacing being fixed values. Thus, the length and width can be used as known standard visual information to assist in detecting whether the marker is qualified.
In order to detect the marker in the pharmaceutical package, it is necessary to first extract the image area, i.e. the region of interest, containing the marker from the pharmaceutical package. In an alternative embodiment of the present application, as shown in fig. 5, the step S102 of extracting the region of interest from the target image according to the relative position of the target package in the duplex package may be implemented as the following steps:
In step S501, the relative position where the target package is in the bigeminal package is identified.
In an embodiment of the present application, the relative position of the target package is either the left bag or the right bag in the duplex medicine package, i.e. a left bag located on the left side of the duplex package, or a right bag located on the right side of the duplex package. Specifically, it is assumed that the medicine package is a duplex package, and that one individual medicine package of the duplex package is photographed at a time. Taking the center seam of the duplex package as the boundary, the duplex package can be divided into a left bag and a right bag.
As an alternative embodiment, in the currently captured target image, the method for determining whether the package bag occupying the main body position is the left bag or the right bag in the dual package may be implemented as follows: first, a minimum circumscribed profile of the target package is obtained. Further, for the minimum circumscribing contour of the irregular shape, the midpoint of the minimum circumscribing contour may be determined in a first order central moment manner. Specifically, the midpoint of the minimum circumscribing contour is determined according to the first-order center moment of the minimum circumscribing contour. Finally, the midpoint is judged to be located to the left or right of the horizontal midline in the target image. If the midpoint is to the left of the horizontal midline, the target package is the right bag on the right in the duplex package. If the midpoint is to the right of the horizontal midline, the target package is the left bag on the left in the duplex package.
For example, assume that the midpoint of the outline of the package in the horizontal direction is denoted m, and that the size of the target image is M × N. If m < M/2, the pouch is judged to be the right bag in the duplex package; if m ≥ M/2, the pouch is judged to be the left bag in the duplex package.
Taking the packaging bag a shown in fig. 4 as an example, the bounding box of the foreground portion represents the minimum circumscribed outline of the packaging bag a, the straight line at the middle of the binarized image represents the center line of the target image in the horizontal direction, and the cross mark in fig. 4 represents the midpoint of the minimum circumscribed outline of the packaging bag a. As can be seen from fig. 4, the foreground portion occupying the largest area of the image is the complete image of the package bag a and the partial image of the package bag b, so that the minimum circumscribing outline (i.e., the foreground portion, the complete image of the package bag a and the partial image of the package bag b) including the package bag a can be identified and segmented from fig. 4, and the midpoint of the minimum circumscribing outline of the package bag a (i.e., the intersection point of the cross mark in fig. 4) is located on the right side of the center line (i.e., the straight line located in the middle of the binarized image in fig. 4), in which case the package bag a can be judged as the left bag.
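For illustration, the left/right decision described above can be sketched as follows. This is a minimal sketch assuming an OpenCV-style binarized input (OpenCV ≥ 4); the function and variable names are illustrative and not taken from this application:

```python
import cv2
import numpy as np

def identify_relative_position(binary: np.ndarray) -> str:
    """Return 'left' or 'right' for the target package in the duplex package,
    based on the midpoint of its minimum circumscribed contour."""
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    largest = max(contours, key=cv2.contourArea)   # foreground occupying the main body
    moments = cv2.moments(largest)
    m = moments["m10"] / moments["m00"]            # contour midpoint (centroid x) from first-order moments
    M = binary.shape[1]                            # target image width
    # midpoint left of the horizontal midline -> right bag; otherwise -> left bag
    return "right" if m < M / 2 else "left"
```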
Step S502, determining a region of interest in the target image according to the relative position of the target package and the standard visual characteristics of the marker.
Wherein the label is used to represent a trimmable area in a duplex pharmaceutical package. As described above, the marker may provide a beat signal to the packaging device during production in addition to indicating the effect of the trimmable area. For example, the markers may be black marks, i.e. two black rectangular marks pre-printed on the top area of the pharmaceutical package.
In an embodiment of the application, the standard visual characteristics of the marker include standard size and standard location. Generally, the length, width and spacing of the black marks are all fixed values (i.e. standard sizes). For example, the width of the black label is set to be 5mm, and once the cutting position of the packaging equipment is wrong, quality problems such as powder leakage and the like can be caused. Therefore, in the case where the cutting position is not within the black mark width, the package bag can be regarded as a defective product, and rejected. For specific detection, see below, the description is not expanded here.
Wherein the standard position of the marker is related to the relative position of the target package. Taking the duplex package as an example, the left black mark of the left bag is located in a fixed area (i.e., its standard position) in the upper left corner of the region of interest, and the right black mark of the right bag is located in a fixed area (i.e., its standard position) in the upper right corner of the region of interest.
Through steps S501 to S502, the position of the target package in the duplex package is determined; then, with reference to the relative position of the target package and the standard visual characteristics of the marker, the region of interest containing the marker is segmented from the target image and used as the image sample for detecting whether the marker in the duplex package is qualified.
For example, assume that the pharmaceutical package to be tested is the left bag in a duplex package. Assuming that the bounding box coordinates of the target image are [ x, y, w, h ], then the left black label (i.e., the marker to be detected) corresponding to the left bag should be in the upper left corner region of the bounding box [ x, y, w, h ].
Further assume that the black mark size in pixels is denoted (w_b, h_b), and let r = FoV / Resolution be the pixel resolution of the optical system of the image acquisition device that captured the target image, where FoV is the field-of-view parameter and Resolution is the resolution of the image acquisition device itself. Based on these assumptions, the pixel size of the black mark in the target image is (w_b, h_b) = (W_b / r, H_b / r), where W_b and H_b are the original physical dimensions of the black mark as acquired by the image acquisition device.
Based on the above assumptions, in step S502, according to Shannon's sampling law and the standard position rule of the black mark in the pharmaceutical package (i.e., where the black mark should appear in a qualified pharmaceutical package), the region of interest [x, y, 2w_b, 2h_b] may be used to search for the left black mark in the target image, thereby obtaining a region of interest a containing the left black mark a as shown in fig. 6. Further, assume that the spacing between the two black marks in the pharmaceutical package is a fixed dimension D_b; the spacing of the two black marks in pixels is then d_b = D_b / r. Thus, based on the above assumptions, the region of interest for searching for the right black mark is [x + d_b − w_b, y, 2w_b, 2h_b], resulting in a region of interest b containing the right black mark b as shown in fig. 6. The left black mark a and the right black mark b shown in fig. 6 can thus be extracted through the above two regions of interest, respectively.
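For illustration, the region-of-interest arithmetic above can be sketched as follows. The function name and parameter names are illustrative assumptions; the physical black-mark dimensions, spacing, field of view and resolution would be supplied by the concrete optical setup:

```python
def black_mark_rois(x, y, W_b, H_b, D_b, fov, resolution):
    """Search windows [x, y, w, h] for the left and right black marks of a left
    bag, given the bounding-box origin (x, y) of the target package."""
    r = fov / resolution                 # pixel resolution of the optical system
    w_b, h_b = W_b / r, H_b / r          # black-mark size in pixels
    d_b = D_b / r                        # mark spacing in pixels
    left_roi = [x, y, 2 * w_b, 2 * h_b]                  # upper-left corner region
    right_roi = [x + d_b - w_b, y, 2 * w_b, 2 * h_b]     # shifted by the mark spacing
    return left_roi, right_roi           # round to integers before pixel indexing
```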
After the region of interest containing the marker is acquired, the region of interest may be inspected from the perspective of the visual characteristics of the marker to determine whether the pharmaceutical package is acceptable.
Step S103, compressing the data matrix of the region of interest to obtain a one-dimensional graph element matrix of the region of interest.
In the embodiment of the application, each element in the one-dimensional graph element matrix represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest. Since the marker differs significantly from the background elements in visual terms, the elements at the marker edges in the one-dimensional graph element matrix have distinctive values, i.e., they differ significantly in value from the background elements; by setting a threshold, the elements at the marker edges can therefore be distinguished from the background elements, realizing automatic identification of the marker. Optionally, the element value at the marker edge is greater than the set threshold. The set threshold in the present application may be set according to the saturation value in the binarized image of the marker, according to the type of light source and the environmental factors of the production site, or in other manners, which is not limited here. The marker is used to indicate a cuttable region in the duplex package.
Step S104, judging whether the visual characteristics of the marker meet preset detection conditions.
In an embodiment of the application, the visual characteristics of the marker are determined based on the elements at the edges of the marker.
It is worth noting that, in the medicine packaging scenario, after it is detected that the visual characteristics of the region of interest meet the preset detection conditions, the marker of the medicine package can be confirmed as qualified, and in this case the medicine package is confirmed as qualified.
Specifically, for the region of interest containing the marker, an alternative embodiment of compressing the data matrix of the region of interest in step S103 to obtain the one-dimensional graph element matrix of the region of interest, as shown in fig. 7, may be implemented as the following step S701:
in step S701, an average value of each column of elements in the data matrix of the region of interest is calculated, and the average value of each column of elements is used as each corresponding element in the first one-dimensional map element matrix of the region of interest.
In the embodiment of the application, each element in the data matrix represents the pixel value of the corresponding pixel in the region of interest. Further optionally, each element in the first one-dimensional map element matrix represents the average value of a corresponding column of elements in the data matrix of the region of interest. A single element thus replaces an entire column of elements, which reduces the amount of calculation in processing the data matrix and further improves the detection efficiency.
For example, the data matrix of the region of interest containing the black mark (i.e., the marker) is compressed into a single row in the form of average values in step S701. That is, the average value of all the elements in each column is taken as the corresponding element in the first one-dimensional graph element matrix (for convenience of distinction, the elements in the one-dimensional graph element matrix are also referred to as picture elements in the present application), so that the value of each picture element represents the overall situation of the values of all the elements in that column. For example, a 3x3 data matrix ultimately needs to be compressed into a single-row one-dimensional picture element matrix, i.e., a 1x3 matrix. Specifically, assume that the 3x3 data matrix is:

[1 2 3]
[4 5 6]
[7 8 9]

Then the first column is averaged, i.e., (1+4+7)/3 = 4, to obtain the first picture element, 4. Calculating in turn, the second column is averaged, i.e., (2+5+8)/3 = 5, to obtain the second picture element; the third column is averaged, i.e., (3+6+9)/3 = 6, to obtain the third picture element. Finally, the 1x3 first one-dimensional picture element matrix is obtained: [4 5 6]. The values in this example are presented only to introduce the mean-value compression of the region of interest and are not limiting.
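For illustration, the column-average compression of step S701 and the worked example above can be reproduced in one line of NumPy (a sketch only; variable names are illustrative):

```python
import numpy as np

roi = np.array([[1, 2, 3],
                [4, 5, 6],
                [7, 8, 9]])

# each picture element is the mean of the corresponding column of the ROI data matrix
picture_elements = roi.mean(axis=0)     # -> array([4., 5., 6.])
```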
In practice, the value of each picture element can be used to characterize the pixel values of the elements in the corresponding column. For example, assuming that the pixel values in the binarized image are 1 and 0 respectively, and that the data matrix has a depth of 8 bits, the saturation value of a picture element is 255. Based on this assumption, if the value of a picture element is the saturation value, every element in the corresponding column is 1. If the value of the picture element is greater than half the saturation value, the number of elements with a value of 1 in the corresponding column exceeds half the total number of elements in the column. If the value of the picture element is less than half the saturation value, the number of elements with a value of 1 in the corresponding column is less than half the total number of elements in the column. If the value of the picture element is 0, there are no elements with a value of 1 in the corresponding column. Of course, this is merely an example, and the setting of the picture elements in practical applications is not limited thereto.
Next, in step S104, an alternative embodiment of determining whether the visual characteristic of the marker meets the preset detection condition, as shown in fig. 7, may be implemented as the following steps S702 to S703:
step S702, traversing the first one-dimensional graph element matrix to extract the head and tail positions of the markers.
In step S703, it is determined whether the physical size of the marker meets the standard size.
In the embodiment of the application, the physical size of the marker is calculated based on the head and tail positions of the marker. It can be understood that the head and tail positions of the marker are the first elements larger than the set threshold that are encountered when the first one-dimensional graph element matrix is traversed starting from its two ends. Illustratively, an alternative way of obtaining the head and tail positions of the marker is: take the first element larger than the set threshold when traversing the first one-dimensional graph element matrix from left to right as the starting position of the marker, and take the first element larger than the set threshold when traversing from right to left as the end position of the marker. It should be noted that, in addition to the above, there are other ways of obtaining the head and tail positions of the marker, which are not expanded here.
For example, in step S702, assume that the set threshold is threshold = h_b · α, where h_b is the height of the black mark (i.e., the marker) and α is an empirical value. Based on this, the first picture element larger than threshold when traversing the first one-dimensional graph element matrix from left to right is taken as the starting position of the black mark, and the index of this position is recorded as start. The first picture element larger than the set threshold when traversing from right to left is taken as the end position of the black mark, and its index is recorded as end. Finally, the width of the black mark obtained in step S703 is w_m = end − start. The width of the black mark (i.e., its physical size) is compared with the width of the standard black mark (i.e., the standard size) to judge whether the physical size of the black mark is qualified.
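For illustration, steps S702 to S703 can be sketched as follows; the threshold factor α, the tolerance on the standard width and the function name are assumptions of the example rather than values from this application:

```python
import numpy as np

def check_mark_width(picture_elements: np.ndarray, h_b: float, alpha: float,
                     standard_width: float, tolerance: float) -> bool:
    """Locate the black-mark edges in the 1-D picture element matrix and
    compare the measured width with the standard width."""
    threshold = h_b * alpha                         # set threshold (empirical)
    above = np.flatnonzero(picture_elements > threshold)
    if above.size == 0:
        return False                                # no mark found in the ROI
    start, end = above[0], above[-1]                # first/last element above threshold
    w_m = end - start                               # measured mark width in pixels
    return abs(w_m - standard_width) <= tolerance   # assumed acceptance rule
```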
See the detection interface for a duplex pharmaceutical package shown in fig. 8. In fig. 8, the upper left corner shows the left black mark of the left bag, and the upper right corner shows the right black mark of the left bag, where the right black mark of the left bag is connected with the left black mark of the right bag. In fig. 8, the thickened line-frame area under each black mark represents the measured width of the black mark, and the value below it is the black-mark size in pixels: 47 for the left black mark and 77 for the right black mark.
Based on the example shown in fig. 7, in an alternative embodiment of step S104, the target package is judged qualified if the physical size of the marker meets the standard size. Specifically, if the physical size of the marker meets the standard size, the cutting position lies within the marker region; in this case the current medicine package does not have the powder-leakage problem caused by an incorrect cutting position, so the medicine package containing the marker can be judged qualified. Otherwise, if the physical size of the marker does not meet the standard size, the target package is judged unqualified.
Of course, other parts (such as edge areas, two-dimensional codes, text information and the like) can be detected continuously, so that after other defects in the medicine package are further eliminated, the medicine package is judged to be qualified.
Through the steps S701 to S703, the image of the marker in the medicine package can be converted into a data matrix, and the head and tail positions of the marker are obtained through calculation of the data matrix, so that the physical size of the marker is obtained, the automatic detection of the medicine package marker is realized, and the accuracy and the detection efficiency of the marker detection in the medicine package are further improved.
For a region of interest containing a marker, background elements in the region of interest may interfere with the identification of the marker contour; for example, background elements may be mistakenly treated as part of the marker, causing errors in the identification of the marker contour. Thus, to further eliminate the interference of background elements in the region of interest, the image of the region to be detected may also be preprocessed before step S103.
An alternative embodiment of preprocessing the area to be detected, as shown in fig. 9, is specifically implemented as the following steps:
Step S901, performing an open operation on the binarized image of the region of interest to obtain an initial noise-reduction image of the region of interest. Here, the background noise in the region of interest can be preliminarily eliminated by the open operation, improving the image quality.
Step S902, compressing a data matrix of the initial noise reduction image to obtain a second one-dimensional image element matrix; each element in the second one-dimensional image element matrix represents an average value of a corresponding column of elements in the data matrix of the initial noise reduction image.
Step S903, performing background elimination processing on the initial noise-reduction image by using the noise-reduction flag bits in the second one-dimensional image element matrix, to obtain a target noise-reduction image of the region of interest. A noise-reduction flag bit is the first element within a preset range that is found when the second one-dimensional image element matrix is searched starting from its two ends.
Specifically, in step S903, the noise-reduction flag bits are searched for in the second one-dimensional image element matrix. Then, background elimination processing is performed on the initial noise-reduction image based on the noise-reduction flag bits to obtain a secondary noise-reduction image. Optionally, it may also be determined whether the contour of the marker is connected with background elements; if so, the background elements having this connection relationship need to be eliminated. Further, the difference between at least two noise-reduction flag bits is taken as the initial width of the marker. Then, an open operation is performed on the secondary noise-reduction image based on this initial width, obtaining the noise-reduced region of interest (i.e., the target noise-reduction image). In this way, the adaptive open operation on the region of interest further removes the interference of background elements on the identification of the marker contour, removes burrs of the marker, and improves the accuracy of the marker contour in the region of interest.
Through the steps S901 to S903, through the above preprocessing steps, automatic noise reduction of the region of interest containing the marker is achieved, interference of background elements in the region of interest is further eliminated, accuracy of the marker outline in the region of interest is improved, and the marker extraction efficiency is further improved.
The pretreatment process for a region of interest containing a marker in an embodiment of the present application is described below by way of an alternative example.
Assume that the region of interest containing the marker is the region of interest b in fig. 6, which contains the right black mark b. Based on this, the above region of interest is processed in the automatic binarization manner described above, resulting in the region of interest b1 shown in fig. 10 (i.e., the binarized image of the region of interest).
In the region of interest b1 there is obviously some noise, along with holes caused by the noise. In this case, a 3×3 kernel may be used to perform an open operation, filling the holes of the region of interest b1 and eliminating the noise, to obtain the region of interest b2 shown in fig. 10 (i.e., the initial noise-reduction image of the region of interest).
In the region of interest b2, it can be seen that part of the black mark elements are connected with background elements, such as the burr portions. To further improve the image quality and the recognition accuracy of the black mark size, these background elements need to be further eliminated. First, the data matrix of the region of interest b2 is compressed into a single-row matrix (i.e., the second one-dimensional image element matrix). Then, the picture elements of the single-row matrix are traversed from left to right; if the value of a picture element is the saturation value, the entire corresponding column of that picture element in the data matrix is 1. In this case, the elements of the column having the same index as that picture element may be set to 0, so as to eliminate the background elements in that column. When the first unsaturated picture element with a value larger than half the saturation value is reached, the traversal is stopped and the index of that picture element is recorded as start (i.e., a noise-reduction flag bit). The same traversal is then performed from right to left until the first unsaturated picture element with a value larger than half the saturation value is reached; the traversal is stopped again and the index of that picture element is recorded as end (i.e., a noise-reduction flag bit). Through this traversal operation, the background elements can be further eliminated, thereby obtaining the secondary noise-reduction image.
Then, the initial width of the black mark is computed from the noise reduction flag bits of the region of interest b2 as w_m = end - start. Next, an open operation with a kernel of size (w_m/3, 1), determined from this initial width, is performed on the black mark to further eliminate burrs in the secondary noise reduction image. A second open operation with a kernel of size (5, w_m/2) is then performed on the black mark to make its overall shape more regular, giving the region of interest b3 shown in fig. 10 (i.e. the target noise reduction image). Further, to improve robustness, the largest outline of the black mark is searched for in the region of interest b3, and the region of interest is reduced to the bounding box of that largest outline.
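For illustration only, the preprocessing pipeline of this example can be sketched in Python with OpenCV and NumPy as follows. The function name, the 0/1 pixel convention (black mark and connected background bands equal to 1) and the exact kernel shapes are assumptions made for this sketch rather than requirements of the present application.

```python
import cv2
import numpy as np

def preprocess_roi(roi_bin: np.ndarray) -> np.ndarray:
    """Sketch of the noise reduction example above (ROI b1 -> b2 -> b3)."""
    # Step 1: open operation with a 3x3 kernel to suppress small noise (b1 -> b2).
    kernel3 = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    b2 = cv2.morphologyEx(roi_bin.astype(np.uint8), cv2.MORPH_OPEN, kernel3)

    # Step 2: compress the data matrix into a single-row matrix of column means
    # (the second one-dimensional image element matrix).
    col_mean = b2.mean(axis=0)
    saturated = 1.0                      # a column that is entirely 1 has mean 1.0

    # Step 3: zero saturated columns (background bands connected to the mark) and
    # record the noise reduction flag bits start / end.
    work = b2.copy()
    start = end = None
    for i in range(col_mean.size):                       # left -> right
        if col_mean[i] >= saturated:
            work[:, i] = 0                               # eliminate the background column
        elif col_mean[i] > saturated / 2:
            start = i
            break
    for i in range(col_mean.size - 1, -1, -1):           # right -> left
        if col_mean[i] >= saturated:
            work[:, i] = 0
        elif col_mean[i] > saturated / 2:
            end = i
            break
    if start is None or end is None or end <= start:
        return work                                      # no usable flag bits found

    # Step 4: adaptive open operations based on the initial width w_m = end - start.
    w_m = end - start
    k1 = cv2.getStructuringElement(cv2.MORPH_RECT, (max(w_m // 3, 1), 1))
    k2 = cv2.getStructuringElement(cv2.MORPH_RECT, (5, max(w_m // 2, 1)))
    b3 = cv2.morphologyEx(work, cv2.MORPH_OPEN, k1)      # remove burrs
    b3 = cv2.morphologyEx(b3, cv2.MORPH_OPEN, k2)        # regularize the mark shape

    # Step 5: shrink the ROI to the bounding box of the largest outline.
    contours, _ = cv2.findContours(b3, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        b3 = b3[y:y + h, x:x + w]
    return b3
```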
Through the preprocessing in the above example, the quality of the image to be detected is further improved, which in turn improves the accuracy and reliability of black mark qualification detection and the efficiency of automatic package detection.
In the embodiment of the application, the region of interest to be detected is extracted from the medicine package image and compressed into a one-dimensional image element matrix; using the property that the element values at the marker edges are larger than the set threshold, the visual characteristics of the marker are obtained from the one-dimensional image element matrix, and whether the marker is qualified is judged accordingly. Automatic detection of medicine packages is thus realized, and the detection efficiency of markers in medicine packages is greatly improved. Meanwhile, by automatically extracting and detecting the visual characteristics in the image, the reduction in accuracy caused by human eye fatigue and the risk of secondary contamination of medicines caused by manual intervention are effectively avoided, so that the accuracy of marker detection in medicine packaging is greatly improved and medicine quality is ensured.
Having described the apparatus and method of the exemplary embodiments of the present application, a marker detection device for pharmaceutical packages according to the exemplary embodiments of the present application is described next with reference to fig. 11. Optionally, the marker detection device may be provided in a pharmaceutical packaging detection apparatus or a pharmaceutical packaging production apparatus, the device comprising:
an acquisition unit 1101 for acquiring a target image; the target image comprises image information of a target package, wherein the target package is one independent package in a dual package, and the dual package comprises two independent packages connected with each other;
an extracting unit 1102, configured to extract a region of interest from the target image according to a relative position of the target package in the duplex package;
a compression unit 1103, configured to compress the data matrix of the region of interest to obtain a one-dimensional map element matrix of the region of interest; wherein each element in the one-dimensional image element matrix respectively represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest; the element value at the edge of the marker in the one-dimensional graph element matrix is larger than a set threshold value; the marker is used for indicating a cuttable region in the duplex package;
A judging unit 1104, configured to judge whether the visual feature of the marker meets a preset detection condition; wherein the visual characteristics of the marker are determined based on the elements at the edges of the marker.
As an alternative embodiment, the extracting unit 1102 is specifically configured to:
identifying a relative position of the target package in the duplex package; the relative position includes a left pocket located on a left side of the duplex package or a right pocket located on a right side of the duplex package;
determining the region of interest in the target image according to the relative position of the target package and the standard visual features of the marker;
wherein the standard visual characteristics of the marker include standard size and standard location; the standard position is related to the relative position in which the target package is located.
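As a minimal illustration of how the relative position and the standard visual features could determine the region of interest, the following sketch simply crops a fixed window per pocket; the coordinates in STANDARD_ROI are hypothetical values used only for this sketch, since the actual standard size and standard position depend on the package design.

```python
import numpy as np

# Hypothetical standard position/size of the marker (x, y, w, h) for each pocket,
# in pixels of the target image; real values would come from the package design.
STANDARD_ROI = {
    "left":  (60, 40, 120, 30),
    "right": (420, 40, 120, 30),
}

def extract_roi(target_image: np.ndarray, relative_position: str) -> np.ndarray:
    """Crop the region of interest for the given pocket ("left" or "right")."""
    x, y, w, h = STANDARD_ROI[relative_position]
    return target_image[y:y + h, x:x + w]
```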
As an alternative embodiment, the extracting unit 1102 is specifically configured to, when identifying the relative position of the target package in the duplex package:
acquiring the minimum external contour of the target package;
determining the midpoint of the minimum external contour according to the first-order central moment of the minimum external contour;
judging whether the midpoint is positioned at the left side or the right side of a horizontal midline in the target image;
If the midpoint is on the left side of the horizontal midline, the target package is a right bag positioned on the right side in a duplex package; or alternatively
If the midpoint is to the right of the horizontal midline, the target package is the left pocket on the left in the duplex package.
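A sketch of this left/right identification with OpenCV is given below. Taking the largest outer contour as the minimum external contour of the package, computing the midpoint from the first-order moments, and reading the "horizontal midline" as the vertical line through the midpoint of the horizontal axis are interpretations made for this sketch.

```python
import cv2
import numpy as np

def identify_relative_position(target_image: np.ndarray) -> str:
    """Return "left" or "right" for the pocket of the duplex package.

    target_image: binarized image in which package pixels are non-zero.
    """
    contours, _ = cv2.findContours(target_image.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Assume the largest outer contour is the minimum external contour of the package.
    package = max(contours, key=cv2.contourArea)

    # Midpoint of the contour from the first-order moments.
    m = cv2.moments(package)
    if m["m00"] == 0:
        raise ValueError("degenerate contour")
    cx = m["m10"] / m["m00"]

    # Compare with the midline of the horizontal axis of the image.
    mid_x = target_image.shape[1] / 2.0
    # Per the rule above: midpoint on the left of the midline -> right pocket,
    # midpoint on the right of the midline -> left pocket.
    return "right" if cx < mid_x else "left"
```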
As an alternative embodiment, the compression unit 1103 is specifically configured to:
and calculating the average value of each column of elements in the data matrix of the region of interest, and taking the average value of each column of elements as each corresponding element in the first one-dimensional graph element matrix of the region of interest.
Accordingly, the judging unit 1104 is specifically configured to:
traversing the first one-dimensional graph element matrix to extract the head and tail positions of the marker; the head and tail positions are the first elements which are traversed in the first one-dimensional image element matrix and are larger than a set threshold value by taking the first elements on two sides of the first one-dimensional image element matrix as traversal starting points;
judging whether the physical size of the marker meets the standard size; the physical size of the marker is calculated based on the head-to-tail position of the marker;
if the physical size of the marker meets the standard size, determining that the target package is qualified; or alternatively
And if the physical size of the marker does not meet the standard size, determining that the target package is not qualified.
Wherein, as an alternative embodiment, the head-tail position includes:
taking the first element which traverses from left to right and is larger than a set threshold value in the first one-dimensional graph element matrix as the initial position of the marker; and taking the first element which traverses from right to left and is larger than a set threshold value in the first one-dimensional graph element matrix as the end position of the marker.
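For illustration, the column-mean compression and the head/tail size check can be sketched as follows; the threshold, the standard size and the tolerance are parameters of this sketch rather than values prescribed by the present application.

```python
import numpy as np

def check_marker_size(roi: np.ndarray, threshold: float,
                      standard_size_px: float, tolerance_px: float) -> bool:
    """Compress the ROI column-wise, find the marker head/tail, check its width."""
    # First one-dimensional image element matrix: mean of every column.
    col_mean = roi.mean(axis=0)

    # First element larger than the threshold from the left and from the right
    # (equivalent to the two-sided traversal described above).
    above = np.flatnonzero(col_mean > threshold)
    if above.size == 0:
        return False                      # no marker edge found -> unqualified
    start, end = above[0], above[-1]

    # Physical size derived from the head and tail positions; a pixel-to-millimetre
    # scale factor could be applied here if the imaging setup provides one.
    measured = end - start + 1
    return abs(measured - standard_size_px) <= tolerance_px
```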
As an alternative embodiment, the obtaining unit 1101 is specifically configured to:
acquiring an original image containing the target package image information;
and performing binarization processing on the original image by adopting an automatic image binarization mode based on a fuzzy set theory so as to obtain the target image.
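The fuzzy-set binarization is not spelled out at this point; one common algorithm consistent with "automatic image binarization based on a fuzzy set theory" is the minimum-fuzziness thresholding of Huang and Wang, sketched below under that assumption.

```python
import numpy as np

def huang_threshold(gray: np.ndarray) -> int:
    """Minimum-fuzziness threshold (Huang & Wang) for an 8-bit grayscale image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    levels = np.arange(256, dtype=np.float64)
    first, last = int(gray.min()), int(gray.max())
    if first == last:
        return first
    c = float(last - first)              # normalization constant of the membership
    cum_h = np.cumsum(hist)
    cum_hg = np.cumsum(hist * levels)

    best_t, best_fuzz = first, np.inf
    for t in range(first, last):
        n0, n1 = cum_h[t], cum_h[-1] - cum_h[t]
        if n0 == 0 or n1 == 0:
            continue
        mu0 = cum_hg[t] / n0                         # mean of the background class
        mu1 = (cum_hg[-1] - cum_hg[t]) / n1          # mean of the foreground class
        # Membership of each gray level to its own class.
        u = np.where(levels <= t,
                     1.0 / (1.0 + np.abs(levels - mu0) / c),
                     1.0 / (1.0 + np.abs(levels - mu1) / c))
        u = np.clip(u, 1e-12, 1.0 - 1e-12)
        # Shannon fuzziness, weighted by the histogram; the threshold minimizing it wins.
        s = -u * np.log(u) - (1.0 - u) * np.log(1.0 - u)
        fuzz = float(np.sum(hist * s))
        if fuzz < best_fuzz:
            best_fuzz, best_t = fuzz, t
    return best_t

# Possible usage with OpenCV (assumed, for illustration):
# t = huang_threshold(gray)
# _, target_image = cv2.threshold(gray, t, 255, cv2.THRESH_BINARY)
```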
As an optional implementation, the apparatus further includes a preprocessing unit which is configured, before the compression unit 1103 compresses the data matrix of the region of interest into the one-dimensional graph element matrix of the region of interest, to:
performing open operation on the binarized image of the region of interest to obtain an initial noise reduction image of the region of interest;
Compressing the data matrix of the initial noise reduction image to obtain a second one-dimensional image element matrix; each element in the second one-dimensional image element matrix respectively represents the average value of a corresponding column of elements in the data matrix of the initial noise reduction image;
and performing background elimination processing on the initial noise reduction image by adopting a noise reduction zone bit in the second one-dimensional image element matrix to obtain a target noise reduction image of the region of interest.
The noise reduction zone bit is the first element which is searched in the second one-dimensional image element matrix and is in a preset range by taking the first elements at two sides in the second one-dimensional image element matrix as search starting points.
As an optional implementation manner, the preprocessing unit is configured to perform background elimination processing on the initial noise reduction image by using a noise reduction flag bit in the second one-dimensional image element matrix, so as to obtain a target noise reduction image of the region of interest, where the preprocessing unit is specifically configured to:
searching a noise reduction zone bit in the second one-dimensional picture element matrix;
performing background elimination processing on the initial noise reduction image based on the noise reduction zone bit to obtain a secondary noise reduction image; the noise reduction zone bit is the first element which is in a preset range and is searched in the second one-dimensional picture element matrix by taking the first elements at two sides in the second one-dimensional picture element matrix as search starting points;
Taking the difference value between at least two noise reduction zone bits as the initial width of the marker;
and performing open operation on the secondary noise reduction image based on the initial width to obtain the target noise reduction image.
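Complementing the pipeline sketch given after the worked example above, the following minimal sketch shows only the flag-bit search as stated here, i.e. the first element falling within a preset range when searching from both sides of the second one-dimensional image element matrix; the range bounds low and high are assumptions for illustration.

```python
import numpy as np

def find_flag_bits(col_mean: np.ndarray, low: float, high: float):
    """Return (start, end) noise reduction flag bits, or None if none is found."""
    in_range = np.flatnonzero((col_mean >= low) & (col_mean <= high))
    if in_range.size == 0:
        return None
    return int(in_range[0]), int(in_range[-1])   # searched from the left and the right
```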
According to the embodiment of the application, the marker detection device for medicine packages extracts the region of interest to be detected from the medicine package image, and, relying on the property that the element values at the marker edges are larger than the set threshold, obtains the visual characteristics of the marker from the one-dimensional image element matrix obtained by compressing the region of interest, so as to judge whether the marker is qualified. Automatic detection of medicine packages is thus realized, and the detection efficiency of markers in medicine packages is greatly improved. Meanwhile, by automatically extracting and detecting the visual characteristics in the image, the reduction in accuracy caused by human eye fatigue and the risk of secondary contamination of medicines caused by manual intervention are effectively avoided, so that the accuracy of marker detection in medicine packaging is greatly improved and medicine quality is ensured.
Having described the apparatus, method and device of the exemplary embodiments of the present application, a computer readable storage medium of the exemplary embodiments of the present application, which may be provided in a marker detection apparatus for pharmaceutical packaging, will be described with reference to fig. 12, and referring to fig. 12, the computer readable storage medium is shown as an optical disc 120 having a computer program (i.e., a program product) stored thereon, which when executed by a processor, implements the steps described in the above-described method embodiments, for example, to obtain a target image; the target image comprises image information of a target package, wherein the target package is one independent package in a dual package, and the dual package comprises two independent packages connected with each other; extracting a region of interest from the target image according to the relative position of the target package in the duplex package; compressing the data matrix of the region of interest to obtain a one-dimensional graph element matrix of the region of interest; wherein each element in the one-dimensional image element matrix respectively represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest; the element value at the edge of the marker in the one-dimensional graph element matrix is larger than a set threshold value; the marker is used for indicating a cuttable region in the duplex package; judging whether the visual characteristics of the marker accord with preset detection conditions or not; wherein the visual characteristics of the marker are determined based on the elements at the edge of the marker; the specific implementation of each step is not repeated here.
It should be noted that examples of the computer readable storage medium may also include, but are not limited to, a phase change memory (PRAM), a Static Random Access Memory (SRAM), a Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a flash memory, or other optical or magnetic storage medium, which will not be described in detail herein.
Having described the apparatus, methods, media, and devices of exemplary embodiments of the present application, next, a computing device for marker detection in pharmaceutical packaging of exemplary embodiments of the present application, which may be provided in a pharmaceutical packaging detection device or a pharmaceutical packaging production device, is described with reference to fig. 13.
FIG. 13 illustrates a block diagram of an exemplary computing device 100 suitable for implementing embodiments of the application; the computing device 100 may be a computer system or a server. The computing device 100 shown in fig. 13 is only an example and should not impose any limitation on the functionality or scope of use of embodiments of the application.
As shown in fig. 13, components of computing device 100 may include, but are not limited to: one or more processors or processing units 1001, a system memory 1002, and a bus 1003 that connects the different system components (including the system memory 1002 and the processing units 1001).
Computing device 100 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by computing device 100 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 1002 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 10021 and/or cache memory 10022. Computing device 100 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, ROM 10023 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in fig. 13, commonly referred to as a "hard disk drive"). Although not shown in fig. 13, a magnetic disk drive for reading from and writing to a removable nonvolatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media), may be provided. In these cases, each drive may be connected to bus 1003 via one or more data media interfaces. The system memory 1002 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of the embodiments of the application.
A program/utility 10025 having a set (at least one) of program modules 10024 may be stored, for example, in system memory 1002; such program modules 10024 include, but are not limited to: an operating system, one or more application programs, other program modules, and program data, and each or some combination of these examples may include an implementation of a network environment. Program modules 10024 generally perform the functions and/or methodologies of the described embodiments of the application.
Computing device 100 may also communicate with one or more external devices 1004 (e.g., keyboard, pointing device, display, etc.). Such communication may occur through an input/output (I/O) interface 605. Moreover, computing device 100 may also communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, through the network adapter 1006. As shown in fig. 13, the network adapter 1006 communicates with other modules of the computing device 100 (e.g., processing unit 1001, etc.) over the bus 1003. It should be appreciated that although not shown in fig. 13, other hardware and/or software modules may be used in conjunction with computing device 100.
The processing unit 1001 executes various functional applications and data processing by running a program stored in the system memory 1002, for example, acquires a target image; the target image comprises image information of a target package, wherein the target package is one independent package in a dual package, and the dual package comprises two independent packages connected with each other; extracting a region of interest from the target image according to the relative position of the target package in the duplex package; compressing the data matrix of the region of interest to obtain a one-dimensional graph element matrix of the region of interest; wherein each element in the one-dimensional image element matrix respectively represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest; the element value at the edge of the marker in the one-dimensional graph element matrix is larger than a set threshold value; the marker is used for indicating a cuttable region in the duplex package; judging whether the visual characteristics of the marker accord with preset detection conditions or not; wherein the visual characteristics of the marker are determined based on the elements at the edges of the marker. The specific implementation of each step is not repeated here. It should be noted that although in the above detailed description several units/modules or sub-units/sub-modules of a marker detection device of a pharmaceutical package are mentioned, such a division is only exemplary and not mandatory. Indeed, the features and functionality of two or more units/modules described above may be embodied in one unit/module in accordance with embodiments of the present application. Conversely, the features and functions of one unit/module described above may be further divided into two units/modules to be embodied.
In the description of the present application, it should be noted that the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The above-described apparatus embodiments are merely illustrative; for example, the division of the units is merely a logical functional division, and there may be other divisions in actual implementation, and for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other form.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place, or may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer readable storage medium executable by a processor. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
Finally, it should be noted that the above examples are only specific embodiments of the present application, which are used to illustrate rather than limit the technical solutions of the present application, and the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing examples, it should be understood by those skilled in the art that any person familiar with the technical field may still modify the technical solutions described in the foregoing embodiments, or easily conceive of changes, or make equivalent substitutions for some of the technical features, within the technical scope disclosed by the present application; such modifications, changes or substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Furthermore, although the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that the operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, several steps may be combined into one step to be performed, and/or one step may be split into several steps to be performed.

Claims (10)

1. A method of detecting a label for a pharmaceutical package, comprising:
acquiring a target image; the target image comprises image information of a target package, wherein the target package is one independent package in a dual package, and the dual package comprises two independent packages connected with each other;
extracting a region of interest from the target image according to the relative position of the target package in the duplex package;
compressing the data matrix of the region of interest to obtain a one-dimensional graph element matrix of the region of interest; wherein each element in the one-dimensional image element matrix respectively represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest; the element value at the edge of the marker in the one-dimensional graph element matrix is larger than a set threshold value; the marker is used for indicating a cuttable region in the duplex package;
judging whether the visual characteristics of the marker accord with preset detection conditions or not; wherein the visual characteristics of the marker are determined based on the elements at the edges of the marker.
2. The method of claim 1, wherein the extracting a region of interest from the target image based on the relative position of the target package in the duplex package comprises:
Identifying a relative position of the target package in the duplex package; the relative position includes a left pocket located on a left side of the duplex package or a right pocket located on a right side of the duplex package;
determining the region of interest in the target image according to the relative position of the target package and the standard visual features of the marker;
wherein the standard visual characteristics of the marker include standard size and standard location; the standard position is related to the relative position in which the target package is located.
3. The method of claim 2, wherein the identifying the relative position of the target package in the duplex package comprises:
acquiring the minimum external contour of the target package;
determining the midpoint of the minimum external contour according to the first-order central moment of the minimum external contour;
judging whether the midpoint is positioned at the left side or the right side of a horizontal midline in the target image;
if the midpoint is on the left side of the horizontal midline, the target package is a right bag positioned on the right side in a duplex package; or alternatively
If the midpoint is to the right of the horizontal midline, the target package is the left pocket on the left in the duplex package.
4. The method of claim 1, wherein the compressing the data matrix of the region of interest to obtain a one-dimensional matrix of map elements of the region of interest comprises:
calculating the average value of each column of elements in the data matrix of the region of interest, and taking the average value of each column of elements as each corresponding element in the first one-dimensional graph element matrix of the region of interest;
the judging whether the visual characteristics of the marker meet the preset detection conditions comprises the following steps:
traversing the first one-dimensional graph element matrix to extract the head and tail positions of the marker; the head and tail positions are the first elements which are traversed in the first one-dimensional image element matrix and are larger than a set threshold value by taking the first elements on two sides of the first one-dimensional image element matrix as traversal starting points;
judging whether the physical size of the marker meets the standard size; the physical size of the marker is calculated based on the head-to-tail position of the marker;
if the physical size of the marker meets the standard size, determining that the target package is qualified; or alternatively
And if the physical size of the marker does not meet the standard size, determining that the target package is not qualified.
5. The method of claim 4, wherein the end-to-end position comprises:
taking the first element which traverses from left to right and is larger than a set threshold value in the first one-dimensional graph element matrix as the initial position of the marker;
and taking the first element which traverses from right to left and is larger than a set threshold value in the first one-dimensional graph element matrix as the end position of the marker.
6. The method of claim 1, wherein compressing the data matrix of the region of interest to obtain the one-dimensional map element matrix of the region of interest further comprises:
performing open operation on the binarized image of the region of interest to obtain an initial noise reduction image of the region of interest;
compressing the data matrix of the initial noise reduction image to obtain a second one-dimensional image element matrix; each element in the second one-dimensional image element matrix respectively represents the average value of a corresponding column of elements in the data matrix of the initial noise reduction image;
performing background elimination processing on the initial noise reduction image by adopting a noise reduction zone bit in the second one-dimensional image element matrix to obtain a target noise reduction image of the region of interest;
The noise reduction zone bit is the first element which is searched in the second one-dimensional image element matrix and is in a preset range by taking the first elements at two sides in the second one-dimensional image element matrix as search starting points.
7. The method for detecting a marker according to claim 6, wherein the performing background elimination processing on the initial noise-reduced image by using the noise-reduced flag bit in the second one-dimensional image element matrix to obtain the target noise-reduced image of the region of interest includes:
searching a noise reduction zone bit in the second one-dimensional picture element matrix;
performing background elimination processing on the initial noise reduction image based on the noise reduction zone bit to obtain a secondary noise reduction image; the noise reduction zone bit is the first element which is in a preset range and is searched in the second one-dimensional picture element matrix by taking the first elements at two sides in the second one-dimensional picture element matrix as search starting points;
taking the difference value between at least two noise reduction zone bits as the initial width of the marker;
and performing open operation on the secondary noise reduction image based on the initial width to obtain the target noise reduction image.
8. A label detection device for pharmaceutical packaging, the device comprising:
An acquisition unit configured to acquire a target image; the target image comprises image information of a target package, wherein the target package is one independent package in a dual package, and the dual package comprises two independent packages connected with each other;
the extraction unit is used for extracting a region of interest from the target image according to the relative position of the target package in the bigeminal package;
the compression unit is used for compressing the data matrix of the region of interest to obtain a one-dimensional graph element matrix of the region of interest; wherein each element in the one-dimensional image element matrix respectively represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest; the element value at the edge of the marker in the one-dimensional graph element matrix is larger than a set threshold value; the marker is used for indicating a cuttable region in the duplex package;
the judging unit is used for judging whether the visual characteristics of the marker accord with preset detection conditions or not; wherein the visual characteristics of the marker are determined based on the elements at the edges of the marker.
9. A computing device, the computing device comprising:
At least one processor, memory, and input output unit;
wherein the memory is for storing a computer program and the processor is for invoking the computer program stored in the memory to perform the marker detection method of the pharmaceutical packaging of any one of claims 1 to 7.
10. A computer readable storage medium comprising instructions that when executed on a computer cause the computer to perform the method of marker detection of a pharmaceutical package according to any one of claims 1 to 7.
CN202311119565.7A 2023-09-01 2023-09-01 Method, device, equipment and storage medium for detecting marker of medicine package Active CN117152415B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311119565.7A CN117152415B (en) 2023-09-01 2023-09-01 Method, device, equipment and storage medium for detecting marker of medicine package

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311119565.7A CN117152415B (en) 2023-09-01 2023-09-01 Method, device, equipment and storage medium for detecting marker of medicine package

Publications (2)

Publication Number Publication Date
CN117152415A true CN117152415A (en) 2023-12-01
CN117152415B CN117152415B (en) 2024-04-23

Family

ID=88905660

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311119565.7A Active CN117152415B (en) 2023-09-01 2023-09-01 Method, device, equipment and storage medium for detecting marker of medicine package

Country Status (1)

Country Link
CN (1) CN117152415B (en)

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060124744A1 (en) * 2004-12-14 2006-06-15 Michael Gormish Location of machine readable codes in compressed representations
US20130223673A1 (en) * 2011-08-30 2013-08-29 Digimarc Corporation Methods and arrangements for identifying objects
US8712163B1 (en) * 2012-12-14 2014-04-29 EyeNode, LLC Pill identification and counterfeit detection method
CN103903265A (en) * 2014-03-31 2014-07-02 东华大学 Method for detecting industrial product package breakage
US9342900B1 (en) * 2014-12-23 2016-05-17 Ricoh Co., Ltd. Distinguishing between stock keeping units using marker based methodology
CN104537671A (en) * 2015-01-04 2015-04-22 长沙理工大学 Cigarette filter online counting and quality detecting method based on machine vision
US10803272B1 (en) * 2016-09-26 2020-10-13 Digimarc Corporation Detection of encoded signals and icons
WO2021114799A1 (en) * 2019-12-14 2021-06-17 华南理工大学广州学院 Computer vision-based matrix vehicle light identification method
CN111242900A (en) * 2019-12-31 2020-06-05 电子科技大学中山学院 Product qualification determination method and device, electronic equipment and storage medium
CN111210412A (en) * 2019-12-31 2020-05-29 电子科技大学中山学院 Package detection method and device, electronic equipment and storage medium
WO2021179751A1 (en) * 2020-03-13 2021-09-16 上海哔哩哔哩科技有限公司 Image processing method and system
US20220343507A1 (en) * 2020-03-13 2022-10-27 Shanghai Bilibili Technology Co., Ltd. Process of Image
CN111767920A (en) * 2020-06-30 2020-10-13 北京百度网讯科技有限公司 Region-of-interest extraction method and device, electronic equipment and storage medium
CN114092771A (en) * 2020-08-05 2022-02-25 北京万集科技股份有限公司 Multi-sensing data fusion method, target detection device and computer equipment
WO2022036478A1 (en) * 2020-08-17 2022-02-24 江苏瑞科科技有限公司 Machine vision-based augmented reality blind area assembly guidance method
CN113284095A (en) * 2021-05-08 2021-08-20 北京印刷学院 Method for detecting number of medicine bags in medicine box based on machine vision
CN113420690A (en) * 2021-06-30 2021-09-21 平安科技(深圳)有限公司 Vein identification method, device and equipment based on region of interest and storage medium
CN113870189A (en) * 2021-09-02 2021-12-31 广东省电信规划设计院有限公司 Industrial product circular detection method and device
CN113870217A (en) * 2021-09-27 2021-12-31 菲特(天津)检测技术有限公司 Edge deviation vision measurement method based on machine vision and image detector
CN114066810A (en) * 2021-10-11 2022-02-18 安庆师范大学 Method and device for detecting concave-convex point defects of packaging box
CN115908269A (en) * 2022-10-26 2023-04-04 中科慧远视觉技术(北京)有限公司 Visual defect detection method and device, storage medium and computer equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
李晓飞;马大玮;胡焰智;范小麟;: "A new method for region-of-interest extraction in remote sensing images of small targets" (微目标遥感图像的感兴趣区域提取新方法), 微计算机信息 (Microcomputer Information), no. 10, 5 April 2008 (2008-04-05) *
黄明晶;蹇渊;王雪梅;孟伟杰;马蒙蒙;: "A target detection method based on range images from laser three-dimensional imaging radar" (一种基于激光三维成像雷达距离像的目标检测方法), 激光与红外 (Laser & Infrared), no. 07, 20 July 2020 (2020-07-20) *

Also Published As

Publication number Publication date
CN117152415B (en) 2024-04-23

Similar Documents

Publication Publication Date Title
CN110390269B (en) PDF document table extraction method, device, equipment and computer readable storage medium
CN110443243B (en) Water level monitoring method, storage medium, network device and water level monitoring system
KR100325384B1 (en) Character string extraction apparatus and pattern extraction apparatus
US8811751B1 (en) Method and system for correcting projective distortions with elimination steps on multiple levels
US20030152272A1 (en) Detecting overlapping images in an automatic image segmentation device with the presence of severe bleeding
US8897600B1 (en) Method and system for determining vanishing point candidates for projective correction
CN107315989B (en) Text recognition method and device for medical data picture
US8913836B1 (en) Method and system for correcting projective distortions using eigenpoints
CN107516085A (en) A kind of method that black surround is automatically removed based on file and picture
CN110660072A (en) Method and device for identifying straight line edge, storage medium and electronic equipment
CN112906532B (en) Image processing method and device, electronic equipment and storage medium
CN117152415B (en) Method, device, equipment and storage medium for detecting marker of medicine package
CN113392819A (en) Batch academic image automatic segmentation and labeling device and method
CN113052181A (en) Table reconstruction method, device and equipment based on semantic segmentation and storage medium
CN116309494B (en) Method, device, equipment and medium for determining interest point information in electronic map
CN107861931B (en) Template file processing method and device, computer equipment and storage medium
CN107092907B (en) Growth curve processing method, device and system for blood bacteria culture
CN115660952A (en) Image processing method, dictionary pen and storage medium
CN111292374B (en) Method and equipment for automatically plugging and unplugging USB interface
CN115546476A (en) Multi-object detection method and data platform based on multi-scale features
CN111046878B (en) Data processing method and device, computer storage medium and computer
CN117152088B (en) Method, device, equipment and storage medium for detecting seal of medicine package
CN113536964A (en) Classification extraction method of ultrasonic videos
CN116648721A (en) Hair evaluation method, program, computer, and hair evaluation system
CN112861861A (en) Method and device for identifying nixie tube text and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant