CN117152415B - Method, device, equipment and storage medium for detecting marker of medicine package
- Publication number
- CN117152415B CN117152415B CN202311119565.7A CN202311119565A CN117152415B CN 117152415 B CN117152415 B CN 117152415B CN 202311119565 A CN202311119565 A CN 202311119565A CN 117152415 B CN117152415 B CN 117152415B
- Authority
- CN
- China
- Prior art keywords
- package
- marker
- region of interest
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
- G06K7/1443—Methods for optical code recognition including a method step for retrieval of the optical code; locating of the code in an image
- G06K7/1447—Methods for optical code recognition including a method step for retrieval of the optical code; extracting optical codes from image or text carrying said optical code
- G06K7/1495—Methods for optical code recognition, the method including an image compression step
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
- G06V10/28—Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
- G06V10/30—Noise filtering
- G06V30/147—Determination of region of interest
- G06V30/15—Cutting or merging image elements, e.g. region growing, watershed or clustering-based techniques
- G06V30/162—Quantising the image signal
- G06V30/164—Noise filtering
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The application provides a method, a device, equipment and a storage medium for detecting a marker of a medicine package. The method acquires a target image that contains image information of a target package; extracts a region of interest from the target image according to the relative position of the target package within the duplex package; and compresses the data matrix of the region of interest to obtain a one-dimensional picture-element matrix of the region of interest. Each element of the one-dimensional picture-element matrix represents the image-change characteristics of a corresponding column of elements in the data matrix of the region of interest, and the element values at the edges of the marker in the one-dimensional picture-element matrix are larger than a set threshold. The method then judges whether the visual characteristics of the marker meet preset detection conditions. Embodiments of the application enable automatic detection of markers on medicine packages, improve the efficiency and accuracy of marker detection in duplex medicine packages, avoid the risk of secondary contamination introduced by the inspection process, and safeguard medicine quality.
Description
Technical Field
Embodiments of the application relate to the field of computer technology, and in particular to a method, a device, equipment and a storage medium for detecting a marker of a medicine package.
Background
At present, drugs are classified into solid preparations and liquid preparations according to the formulation method. Solid preparations include various forms such as tablets, capsules, powders, granules and the like.
In the production of traditional Chinese medicine packaging, the preparation may be packaged in small pouches.
In the related art, after pouch packaging is completed, the packages must be inspected manually for defects. However, human eyes tire easily, so the efficiency and accuracy of manual inspection are limited by the condition of the personnel. In addition, manual handling can cause secondary contamination of the medicine and thereby reduce its quality.
Disclosure of Invention
Against this background, embodiments of the application provide a method, a device, equipment and a storage medium for detecting a marker of a medicine package, which realize automatic detection of the marker in the medicine package, improve the efficiency and accuracy of marker detection, avoid secondary contamination caused by the inspection process, and safeguard medicine quality.
In a first aspect of embodiments of the present application, there is provided a method of detecting a marker of a pharmaceutical package, comprising:
acquiring a target image; the target image comprises image information of a target package, the target package being one of the independent packages in a duplex package, and the duplex package comprising two independent packages connected with each other;
extracting a region of interest from the target image according to the relative position of the target package in the duplex package;
compressing the data matrix of the region of interest to obtain a one-dimensional picture-element matrix of the region of interest; wherein each element in the one-dimensional picture-element matrix represents the image-change characteristics of a corresponding column of elements in the data matrix of the region of interest; the element values at the edges of the marker in the one-dimensional picture-element matrix are larger than a set threshold; and the marker is used for indicating a cuttable region in the duplex package;
judging whether the visual characteristics of the marker meet preset detection conditions; wherein the visual characteristics of the marker are determined based on the elements at the edges of the marker.
In a second aspect of embodiments of the present application, there is provided a marker detection device for pharmaceutical packaging, the device comprising:
an acquisition unit, configured to acquire a target image; the target image comprises image information of a target package, the target package being one of the independent packages in a duplex package, and the duplex package comprising two independent packages connected with each other;
an extraction unit, configured to extract a region of interest from the target image according to the relative position of the target package in the duplex package;
a compression unit, configured to compress the data matrix of the region of interest to obtain a one-dimensional picture-element matrix of the region of interest; wherein each element in the one-dimensional picture-element matrix represents the image-change characteristics of a corresponding column of elements in the data matrix of the region of interest; the element values at the edges of the marker in the one-dimensional picture-element matrix are larger than a set threshold; and the marker is used for indicating a cuttable region in the duplex package;
a judging unit, configured to judge whether the visual characteristics of the marker meet preset detection conditions; wherein the visual characteristics of the marker are determined based on the elements at the edges of the marker.
In a third aspect of embodiments of the present application, there is provided a computing device comprising:
at least one processor, a memory, and an input/output unit;
Wherein the memory is for storing a computer program and the processor is for invoking the computer program stored in the memory to perform the marker detection method of the pharmaceutical packaging of the first aspect.
In a fourth aspect of embodiments of the present application, there is provided a computer readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the marker detection method of the pharmaceutical product package of the first aspect.
Embodiments of the application provide a method, a device, equipment and a storage medium for detecting a marker of a medicine package. In an embodiment of the application, a target image is acquired, the target image comprising image information of a target package, the target package being one of the independent packages in a duplex package, and the duplex package comprising two independent packages connected with each other. A region of interest is then extracted from the target image based on the relative position of the target package in the duplex package. The data matrix of the region of interest is compressed to obtain a one-dimensional picture-element matrix of the region of interest. Each element of the one-dimensional picture-element matrix represents the image-change characteristics of a corresponding column of elements in the data matrix of the region of interest, and the element values at the edges of the marker in the one-dimensional picture-element matrix are larger than a set threshold. The marker is used to indicate a cuttable region in the duplex package. Finally, it is judged whether the visual characteristics of the marker meet preset detection conditions, the visual characteristics of the marker being determined based on the elements at the edges of the marker.
In the embodiments of the application, the region of interest to be inspected is extracted from the medicine package image (i.e. the target image). Exploiting the fact that the element values at the edges of the marker are larger than the set threshold, the visual characteristics of the marker are obtained from the one-dimensional picture-element matrix produced by compressing the region of interest and are used to judge whether the marker is qualified, so that automatic inspection of medicine packages is realized and the efficiency of marker detection in medicine packaging is greatly improved. At the same time, automatically extracting and checking the visual characteristics in the image avoids both the loss of accuracy caused by eye fatigue and the risk of secondary contamination introduced by manual handling, so that the accuracy of marker detection in medicine packaging is greatly improved and medicine quality is safeguarded.
Drawings
The above, as well as additional purposes, features, and advantages of exemplary embodiments of the present application will become readily apparent from the following detailed description when read in conjunction with the accompanying drawings. Several embodiments of the present application are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
FIG. 1 is a flow chart of a method for detecting a label of a pharmaceutical package according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a target image according to an embodiment of the present application;
FIG. 3 is a flowchart of a target image acquisition method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a target image according to another embodiment of the present application;
FIG. 5 is a flowchart illustrating a method for extracting a region of interest according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a marker according to an embodiment of the present application;
FIG. 7 is a flowchart of a method for determining eligibility of a marker according to an embodiment of the present application;
FIG. 8 is a schematic view of a marker according to another embodiment of the present application;
FIG. 9 is a flowchart illustrating a method for preprocessing a region of interest according to an embodiment of the present application;
FIG. 10 is a schematic diagram illustrating the effect of preprocessing a region of interest according to an embodiment of the present application;
FIG. 11 is a schematic structural view of a label detection device for pharmaceutical packaging according to an embodiment of the present application;
FIG. 12 schematically illustrates a schematic structural diagram of a medium according to an embodiment of the present application;
FIG. 13 schematically illustrates a structural diagram of a computing device in accordance with embodiments of the present application.
In the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
Detailed Description
The principles and spirit of the present application will be described below with reference to several exemplary embodiments. It should be understood that these embodiments are presented merely to enable those skilled in the art to better understand and practice the application and are not intended to limit the scope of the application in any way. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Those skilled in the art will appreciate that embodiments of the application may be implemented as a system, apparatus, device, method, or computer program product. Accordingly, the present disclosure may be embodied in the following forms, namely: complete hardware, complete software (including firmware, resident software, micro-code, etc.), or a combination of hardware and software.
At present, drugs are classified into solid preparations and liquid preparations according to the formulation method. Solid preparations include various forms such as tablets, capsules, powders, granules and the like.
In the production of traditional Chinese medicine packaging, the preparation may be packaged in small pouches. At present, the production process of pouch packaging mainly comprises: for a given batch of traditional Chinese medicine preparations, typesetting and editing the corresponding medicine information, printing the medicine information on the pouch packaging material with a high-speed rewinder, and finally processing the medicine raw material and the packaging material into pouch-packaged traditional Chinese medicine preparations with a pouch packaging machine through procedures such as filling, heat-sealing and cutting.
In the related art, after pouch packaging is completed, the packages must be inspected manually for possible defects, such as package breakage, white edges, printing errors and black-mark defects. However, manual inspection is limited by eye fatigue, which reduces its accuracy and efficiency. In addition, manual handling can cause secondary contamination of the medicine and thereby reduce its quality.
In summary, a new solution is needed to solve at least one of the above technical problems.
In order to overcome the above technical problems, embodiments of the present application provide a method, a device, equipment and a storage medium for detecting a marker of a medicine package. In an embodiment of the application, a target image is acquired, the target image comprising image information of a target package, the target package being one of the independent packages in a duplex package, and the duplex package comprising two independent packages connected with each other. A region of interest is then extracted from the target image based on the relative position of the target package in the duplex package. The data matrix of the region of interest is compressed to obtain a one-dimensional picture-element matrix of the region of interest. Each element of the one-dimensional picture-element matrix represents the image-change characteristics of a corresponding column of elements in the data matrix of the region of interest, and the element values at the edges of the marker in the one-dimensional picture-element matrix are larger than a set threshold. The marker is used to indicate a cuttable region in the duplex package. Finally, it is judged whether the visual characteristics of the marker meet preset detection conditions, the visual characteristics of the marker being determined based on the elements at the edges of the marker.
In the embodiments of the application, the region of interest to be inspected is extracted from the medicine package image. Exploiting the fact that the element values at the edges of the marker are larger than the set threshold, the visual characteristics of the marker are obtained from the one-dimensional picture-element matrix produced by compressing the region of interest and are used to judge whether the marker is qualified, so that automatic inspection of medicine packages is realized and the efficiency of marker detection in medicine packaging is greatly improved. At the same time, automatically extracting and checking the visual characteristics in the image avoids both the loss of accuracy caused by eye fatigue and the risk of secondary contamination introduced by manual handling, so that the accuracy of marker detection in medicine packaging is greatly improved and medicine quality is safeguarded.
The technical solution provided by the embodiments of the application can be implemented by a server and/or a terminal device. The server and/or the terminal device may be deployed in the production equipment for the medicine packaging, or may be independent of that production equipment, in which case data and instructions are transmitted via wireless and/or wired communication.
It should be noted that, the server according to the embodiment of the present application may be an independent physical server, or may be a server cluster or a distributed system formed by a plurality of physical servers, or may be a cloud server that provides cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, big data, and basic cloud computing services such as an artificial intelligent platform.
The terminal device according to the embodiments of the present application may be a device that provides voice and/or data connectivity to a user, a handheld device with a wireless connection function, or another processing device connected to a wireless modem, for example a mobile telephone (or "cellular" telephone) or a computer with a mobile terminal, which may be a portable, pocket-sized, handheld, computer-built-in or vehicle-mounted mobile device that exchanges voice and/or data with a radio access network; for example, a Personal Communication Service (PCS) phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a Wireless Local Loop (WLL) station or a Personal Digital Assistant (PDA).
In this document, it should be understood that any number of elements in the drawings is for illustration and not limitation, and that any naming is used only for distinction and not for any limitation.
The principles and spirit of the present application are explained in detail below with reference to several representative embodiments thereof.
Referring now to fig. 1, fig. 1 is a flow chart illustrating a method for detecting a label of a pharmaceutical package according to an embodiment of the present application. The label detection method for medicine package provided by the embodiment of the application comprises the following steps:
step S101, a target image is acquired.
In an embodiment of the present application, the target image includes a medicine package. Specifically, the target image contains image information of the target package. The target package is one of the individual packages in a duplex package, which comprises two connected individual packages. For example, the target image includes at least the complete outline of one individual package and a partial outline of the other individual package.
In practical application, the medicine package related to the application can be used for packaging medicines of solid preparations. For example, a medicine packaging bag for packaging solid Chinese medicinal preparations. From a packaging form, for example, the pharmaceutical package may be a duplex pharmaceutical package, where duplex refers to a package structure in which two individual packages are connected together. See fig. 2 for a two-part pharmaceutical package comprising a complete image of package a and a partial image of package b. In fig. 2, the package a and the package b are independent of each other and are used for packaging solid preparations, respectively. A detachable connecting structure is arranged between the packaging bag a and the packaging bag b. It should be noted that, in addition to the dual medicine package, the detachable independent medicine package may also be configured into a triple, quadruple or other number of package forms, which is not limited in this embodiment of the present application.
In the embodiment of the application, the target image can be acquired by an image acquisition device such as a camera or a video camera. For example, the image acquisition device may be a camera module installed in the packaging equipment or on the packaging line, or a camera module in a mobile terminal or other device that is attached to the packaging equipment by a clamp or other connecting structure.
In an alternative embodiment of the present application, as shown in fig. 3, the step of acquiring the target image in step S101 may be implemented as the following steps:
step S301, an original image containing target package image information is acquired.
In the embodiment of the application, the original image is an image comprising the complete outline of the medicine package, and can be acquired by any image acquisition mode. For example, the drug package is placed in a solid background and the original image is acquired. For example, the medicine package is fixed in the field of view of the image capturing device by the holder, whereby the original image is captured by the image capturing device.
Step S302, binarization processing is carried out on the original image to obtain a target image.
Illustratively, the original image acquired in step S301 is assumed to be fig. 2, and the package contained in the original image is assumed to be the package bag a in fig. 2. Based on this, referring to fig. 4 together, fig. 4 is a schematic diagram of a binarized image (i.e. a target image) obtained by performing the binarization processing on the original image in step S302.
Through steps S301 to S302, automatic binarization can be performed on the original image, providing a data basis for the subsequent extraction of the package contour and further improving robustness and ease of use.
In practice, the methods for binarizing the original image in step S302 include, but are not limited to: the Otsu method (OTSU), the triangle (TRIANGLE) method, window-based binarization, and automatic image binarization based on fuzzy set theory (Huang's fuzzy thresholding method).
In order to acquire the outer contour of the package, in step S302, binarization processing is required for the original image. For example, to improve robustness and ease of use, the implementation principle of Huang's fuzzy thresholding method is as follows:
First, a fuzzy subset is defined that maps image $X$ (i.e. the original image) to the value range $[0,1]$:

$$\tilde{X} = \{(x_{mn}, \mu_X(x_{mn}))\}$$

where $x_{mn}$ denotes the gray value of the pixel at point $(m,n)$ in image $X$, and $\mu_X(x_{mn})$ denotes the membership value of that point with respect to some attribute, with $0 \le \mu_X(x_{mn}) \le 1$, $m = 0, 1, \dots, M-1$ and $n = 0, 1, \dots, N-1$. For binarization, each pixel has a degree of similarity to the class it belongs to, and this relationship can be used to define the value of $\mu_X(x_{mn})$.

For a given threshold $t$, the average gray values $\mu_0$ and $\mu_1$ of the background pixels and the foreground pixels are

$$\mu_0(t) = \frac{\sum_{g=0}^{t} g\,h(g)}{\sum_{g=0}^{t} h(g)}, \qquad \mu_1(t) = \frac{\sum_{g=t+1}^{L-1} g\,h(g)}{\sum_{g=t+1}^{L-1} h(g)},$$

where $h(g)$ denotes the number of pixels in the image with gray level $g$.

The background average $\mu_0$ and the foreground average $\mu_1$ can be regarded as the target values of the background and foreground for the given threshold $t$, and the relationship between a point of image $X$ and the region it belongs to should, intuitively, depend on the difference between the gray value of the point and the target value of its region. Therefore, the following membership function is used for point $(m,n)$:

$$\mu_X(x_{mn}) = \begin{cases} \dfrac{1}{1 + \lvert x_{mn} - \mu_0(t) \rvert / C}, & x_{mn} \le t,\\[2ex] \dfrac{1}{1 + \lvert x_{mn} - \mu_1(t) \rvert / C}, & x_{mn} > t, \end{cases}$$

where $C$ is a constant chosen so that $0.5 \le \mu_X(x_{mn}) \le 1$ holds.

Based on the Shannon entropy function, the entropy of a fuzzy set $A$ is defined as

$$E(A) = \frac{1}{MN} \sum_{m=0}^{M-1} \sum_{n=0}^{N-1} S\bigl(\mu_A(x_{mn})\bigr),$$

with the Shannon function

$$S(\mu_A(x_i)) = -\mu_A(x_i)\ln\bigl[\mu_A(x_i)\bigr] - \bigl[1-\mu_A(x_i)\bigr]\ln\bigl[1-\mu_A(x_i)\bigr].$$

Since the gray-scale image has at most $L$ gray levels, the entropy can be rewritten as

$$E(X) = \frac{1}{MN} \sum_{g=0}^{L-1} S\bigl(\mu_X(g)\bigr)\, h(g).$$

Finally, among all candidate thresholds $t$, the value of $t$ that minimizes the Shannon entropy is taken as the final segmentation threshold. The image $X$ is then color-segmented with this threshold, realizing automatic binarization of image $X$ and yielding the binarized image.
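For readers who wish to experiment with this thresholding step, the following is a minimal Python/NumPy sketch of Huang's fuzzy thresholding following the definitions above; the function and variable names are illustrative, and the patent itself does not prescribe any particular implementation.

```python
import numpy as np

def huang_threshold(gray: np.ndarray) -> int:
    """Return the threshold t that minimizes the fuzzy Shannon entropy.

    A minimal sketch of Huang's fuzzy thresholding for an 8-bit,
    single-channel image; names are illustrative only.
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    levels = np.arange(256, dtype=np.float64)
    nz = np.flatnonzero(hist)
    # C keeps the membership values within [0.5, 1] (gray-level range).
    C = float(nz[-1] - nz[0]) if nz.size > 1 else 1.0

    best_t, best_entropy = int(nz[0]), np.inf
    for t in range(int(nz[0]), int(nz[-1])):
        w0, w1 = hist[:t + 1].sum(), hist[t + 1:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t + 1] * hist[:t + 1]).sum() / w0   # background mean
        mu1 = (levels[t + 1:] * hist[t + 1:]).sum() / w1   # foreground mean
        # Membership of each gray level with respect to its own class.
        mu = np.where(levels <= t,
                      1.0 / (1.0 + np.abs(levels - mu0) / C),
                      1.0 / (1.0 + np.abs(levels - mu1) / C))
        mu = np.clip(mu, 1e-12, 1.0 - 1e-12)
        shannon = -mu * np.log(mu) - (1.0 - mu) * np.log(1.0 - mu)
        entropy = float((shannon * hist).sum() / hist.sum())
        if entropy < best_entropy:
            best_entropy, best_t = entropy, t
    return best_t
```

A binarized target image can then be obtained, for example, by comparing each pixel with the returned threshold and mapping foreground pixels to 255.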
After the target image is acquired, step S102 extracts the region of interest from the target image according to the relative position of the target package in the duplex package.
In the embodiment of the application, the region of interest is the image region of the duplex package that is intended to be inspected. In practical applications, taking medicine packaging as an example, the region of interest includes, but is not limited to, at least one of the following image regions: a marker of the medicine package.
The marker serves two purposes: on the one hand, it provides a timing (beat) signal to the packaging equipment during production of the medicine package; on the other hand, it indicates the cuttable region of the medicine packaging material. The packaging equipment therefore cuts the packaging material at the marker position to produce individual medicine packages. In practice, the markers are also known as black marks, i.e. two black rectangular marks in the top area of the medicine package. Typically, the marks are printed on the packaging material in advance, and their length, width and spacing are fixed values. The length and width can therefore be used as known standard visual information to assist in checking whether a marker is qualified.
In order to detect the marker in the pharmaceutical package, it is necessary to first extract the image area, i.e. the region of interest, containing the marker from the pharmaceutical package. In an alternative embodiment of the present application, as shown in fig. 5, the step S102 of extracting the region of interest from the target image according to the relative position of the target package in the duplex package may be implemented as the following steps:
In step S501, the relative position of the target package within the duplex package is identified.
In an embodiment of the present application, the relative position of the target package is either the left bag or the right bag of the duplex medicine package, i.e. the left bag located on the left side of the duplex package or the right bag located on the right side. Specifically, assume that the medicine package is a duplex package and that one individual package of the duplex package is photographed at a time. Taking the center seam of the duplex package as the boundary, the duplex package can be divided into a left bag and a right bag.
As an alternative embodiment, determining whether the package occupying the main part of the currently captured target image is the left bag or the right bag of the duplex package may be implemented as follows. First, the minimum circumscribed contour of the target package is obtained. For an irregularly shaped minimum circumscribed contour, its midpoint can be determined by means of the first-order central moment; specifically, the midpoint of the minimum circumscribed contour is determined from the first-order central moment of the contour. Finally, it is judged whether the midpoint lies to the left or to the right of the horizontal center line of the target image. If the midpoint is to the left of the center line, the target package is the right bag of the duplex package; if the midpoint is to the right of the center line, the target package is the left bag.
For example, denoting the horizontal midpoint of the package contour by m and the size of the target image by M x N, if m < M/2 the pouch is judged to be the right bag of the duplex package, and if m >= M/2 the pouch is judged to be the left bag.
Taking the package bag a shown in fig. 4 as an example, the bounding box of the foreground portion represents the minimum circumscribed contour of package bag a, the straight line in the middle of the binarized image represents the horizontal center line of the target image, and the cross mark in fig. 4 indicates the midpoint of the minimum circumscribed contour of package bag a. As can be seen from fig. 4, the foreground portion occupying the largest area of the image consists of the complete image of package bag a and a partial image of package bag b, so the minimum circumscribed contour containing package bag a can be identified and segmented from fig. 4. The midpoint of this contour (i.e. the intersection of the cross mark in fig. 4) lies to the right of the center line (i.e. the straight line in the middle of the binarized image), and in this case package bag a is judged to be the left bag.
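As an illustration of this left/right decision, the Python sketch below (using OpenCV and NumPy, neither of which the patent mandates) finds the largest foreground contour of the binarized target image, takes the x-coordinate of its midpoint from the first-order moments, and compares it with the horizontal center of the image; all names are illustrative.

```python
import cv2
import numpy as np

def classify_bag_side(binary: np.ndarray) -> str:
    """Return 'left' or 'right' for the target package in a duplex package.

    `binary` is the binarized target image (foreground = 255). A sketch only;
    the patent does not prescribe OpenCV or these particular calls.
    """
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    largest = max(contours, key=cv2.contourArea)   # minimum circumscribed contour
    moments = cv2.moments(largest)
    m = moments["m10"] / moments["m00"]            # horizontal midpoint of the contour
    M = binary.shape[1]                            # image width in pixels
    # Midpoint left of the horizontal center line -> right bag, and vice versa.
    return "right" if m < M / 2 else "left"
```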
Step S502, determining a region of interest in the target image according to the relative position of the target package and the standard visual characteristics of the marker.
The marker is used to indicate a cuttable region in the duplex medicine package. As described above, besides indicating the cuttable region, the marker may also provide a timing signal to the packaging equipment during production. For example, the markers may be black marks, i.e. two black rectangular marks pre-printed in the top area of the medicine package.
In an embodiment of the application, the standard visual characteristics of the marker include a standard size and a standard position. Generally, the length, width and spacing of the black marks are fixed values (i.e. the standard size). For example, the width of a black mark may be set to 5 mm; once the cutting position of the packaging equipment deviates, quality problems such as powder leakage can result. Therefore, if the cutting position does not lie within the width of the black mark, the package can be regarded as defective and rejected. The specific detection procedure is described below and is not expanded here.
The standard position of the marker is related to the relative position of the target package. Taking the duplex package as an example, the left black mark of the left bag is located in a fixed area (i.e. the standard position) at the upper-left corner of the region of interest, and the right black mark of the right bag is located in a fixed area (i.e. the standard position) at the upper-right corner of the region of interest.
Through steps S501 to S502, the position of the target package within the duplex package is determined; then, with reference to the relative position of the target package and the standard visual characteristics of the marker, the region of interest containing the marker is segmented from the target image to serve as the image sample for checking whether the marker in the duplex package is qualified.
For example, assume that the medicine package to be inspected is the left bag of a duplex package, and that the bounding-box coordinates of the target package in the target image are [x, y, w, h]. The left black mark (i.e. the marker to be detected) corresponding to the left bag should then lie in the upper-left corner region of the bounding box [x, y, w, h].
Further assume that the pixel size of the black mark is denoted (w_b, h_b), and that the pixel resolution of the optical system of the image acquisition device is r = FoV/Resolution, where FoV is the field-of-view parameter and Resolution is the resolution of the image acquisition device itself. Under these assumptions, the size of the left black mark in the target image, in units of pixels, is (w_b, h_b) = (W_b/r, H_b/r), where W_b and H_b are the original dimensions of the black mark.
On this basis, in step S502, according to Shannon's sampling theorem and the standard position rule of the black mark in the medicine package (i.e. where the black mark should appear in a qualified package), the region of interest [x, y, 2w_b, 2h_b] can be used to search the target image for the left black mark, yielding the region of interest a containing the left black mark a shown in fig. 6. Further, assuming that the spacing between the two black marks of the medicine package is a fixed value D_b, the spacing in units of pixels is d_b = D_b/r. The region of interest for searching for the right black mark is then [x + d_b - w_b, y, 2w_b, 2h_b], yielding the region of interest b containing the right black mark b shown in fig. 6. The left black mark a and the right black mark b shown in fig. 6 can thus be extracted by means of these two regions of interest, respectively.
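The ROI arithmetic above can be collected into a small helper; the sketch below assumes millimetre units for the physical quantities, and every name in it (fov_mm, resolution_px, and so on) is an assumption rather than something taken from the patent.

```python
from typing import Tuple

Roi = Tuple[int, int, int, int]  # (x, y, width, height) in pixels

def black_mark_rois(bbox: Roi, mark_w_mm: float, mark_h_mm: float,
                    spacing_mm: float, fov_mm: float,
                    resolution_px: int) -> Tuple[Roi, Roi]:
    """Compute search ROIs for the left and right black marks of a left bag.

    A sketch under assumed units (mm); the patent only fixes the formulas
    r = FoV / Resolution, (w_b, h_b) = (W_b / r, H_b / r), d_b = D_b / r.
    """
    x, y, _, _ = bbox
    r = fov_mm / resolution_px                  # mm per pixel
    w_b = int(round(mark_w_mm / r))             # black mark width in pixels
    h_b = int(round(mark_h_mm / r))             # black mark height in pixels
    d_b = int(round(spacing_mm / r))            # mark-to-mark spacing in pixels
    left_roi: Roi = (x, y, 2 * w_b, 2 * h_b)
    right_roi: Roi = (x + d_b - w_b, y, 2 * w_b, 2 * h_b)
    return left_roi, right_roi
```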
After the region of interest containing the marker is acquired, the region of interest may be inspected from the perspective of the visual characteristics of the marker to determine whether the pharmaceutical package is acceptable.
Step S103, compressing the data matrix of the region of interest to obtain a one-dimensional picture-element matrix of the region of interest.
In the embodiment of the application, each element of the one-dimensional picture-element matrix represents the image-change characteristics of a corresponding column of elements in the data matrix of the region of interest. Because the marker differs markedly from the background in visual terms, the picture elements at the marker edges take distinctive values, i.e. they differ significantly in value from the background elements; a set threshold can therefore be used to separate the elements at the marker edges from the background elements, enabling automatic identification of the marker. Optionally, the element values at the marker edges are larger than the set threshold. In the application, this set threshold may be chosen according to the saturation value of the binarized marker image, according to the type of light source and the environmental conditions of the production site, or in other ways, which is not limited here. The marker is used to indicate a cuttable region in the duplex package.
Step S104, judging whether the visual characteristics of the marker meet preset detection conditions.
In an embodiment of the application, the visual characteristics of the marker are determined based on the elements at the edges of the marker.
It is worth noting that, in the medicine packaging scenario, once the visual characteristics of the region of interest are found to meet the preset detection conditions, the marker of the medicine package can be confirmed as qualified, and in this case the medicine package is confirmed as qualified.
Specifically, for a region of interest containing the marker, an alternative embodiment of compressing the data matrix of the region of interest in step S103 to obtain the one-dimensional picture-element matrix of the region of interest, as shown in fig. 7, may be implemented as the following step S701:
In step S701, the average value of each column of elements in the data matrix of the region of interest is calculated, and the average value of each column is used as the corresponding element of the first one-dimensional picture-element matrix of the region of interest.
In the embodiment of the application, each element of the data matrix represents the pixel value of the corresponding pixel in the region of interest. Further, optionally, each element of the first one-dimensional picture-element matrix represents the average value of a corresponding column of elements in the data matrix of the region of interest. A single element thus replaces an entire column of elements, which reduces the amount of computation in processing the data matrix and further improves detection efficiency.
For example, in step S701 the data matrix of the region of interest containing the black mark (i.e. the marker) is compressed into a single row in the form of column averages. That is, the average of all elements in each column is taken as the corresponding element of the first one-dimensional picture-element matrix (for ease of distinction, the elements of the one-dimensional picture-element matrix are referred to as picture elements in the application), so that the value of each picture element summarizes the values of all elements in its column. For example, a 3x3 data matrix is ultimately compressed into a single-row picture-element matrix with three elements. Specifically, assume that the 3x3 data matrix is:

$$\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{bmatrix}$$

Then the first column is averaged, i.e. (1+4+7)/3 = 4, giving the first picture element, 4. Proceeding in order, the second column is averaged, i.e. (2+5+8)/3 = 5, giving the second picture element, and the third column is averaged, i.e. (3+6+9)/3 = 6, giving the third picture element. The resulting first one-dimensional picture-element matrix is [4 5 6]. The values in this example are given only to introduce the mean-value compression of the region of interest and are not limiting.
In practice, the value of a picture element characterizes the pixel values of the elements in the corresponding column. For example, assume that the pixel values in the binarized image are 1 and 0, respectively, and that the depth of the data matrix is 8 bits, so that the saturation value of a picture element is 255. Under this assumption, if the value of a picture element equals the saturation value, every element in the corresponding column is 1. If the value of the picture element is greater than half the saturation value, the number of elements with value 1 in the corresponding column exceeds half the total number of elements in the column. If the value of the picture element is less than half the saturation value, the number of elements with value 1 in the corresponding column is less than half the total number of elements in the column. If the value of the picture element is 0, there are no elements with value 1 in the corresponding column. Of course, this is merely an example, and the definition of the picture elements in practical applications is not limited thereto.
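The column-wise compression described above amounts to a single mean reduction along the row axis; a minimal NumPy sketch (illustrative only, not the patent's own code) follows, reproducing the 3x3 worked example.

```python
import numpy as np

def compress_to_row(roi: np.ndarray) -> np.ndarray:
    """Compress a 2-D ROI data matrix into a single-row picture-element matrix.

    Each output element is the mean of the corresponding column of `roi`.
    Sketch only; the dtype handling is an assumption.
    """
    return roi.astype(np.float64).mean(axis=0)

# Worked example from the description: the column means of
# [[1, 2, 3], [4, 5, 6], [7, 8, 9]] are [4, 5, 6].
demo = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
print(compress_to_row(demo))  # [4. 5. 6.]
```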
Next, in step S104, an alternative embodiment of determining whether the visual characteristic of the marker meets the preset detection condition, as shown in fig. 7, may be implemented as the following steps S702 to S703:
Step S702, traversing the first one-dimensional picture-element matrix to extract the head and tail positions of the marker.
In step S703, it is determined whether the physical size of the marker meets the standard size.
In the embodiment of the application, the physical size of the marker is calculated from the head and tail positions of the marker. The head and tail positions of the marker are the first elements larger than the set threshold encountered when traversing the first one-dimensional picture-element matrix from its two ends, i.e. taking the first elements on the two sides of the matrix as the traversal starting points. Illustratively, one way to obtain the head and tail positions of the marker is: take the first picture element larger than the set threshold when traversing the first one-dimensional picture-element matrix from left to right as the starting position of the marker, and take the first picture element larger than the set threshold when traversing from right to left as the end position of the marker. It should be noted that, besides the above approach, there are other ways to obtain the head and tail positions of the marker, which are not expanded here.
For example, in step S702, assume the threshold is set to threshold = h_b · α, where h_b is the height of the black mark (i.e. the marker) and α is an empirical value. On this basis, the first picture element larger than threshold when traversing the first one-dimensional picture-element matrix from left to right is taken as the starting position of the black mark, and the index of that position is recorded as start. The first picture element larger than the set threshold when traversing from right to left is taken as the end position of the black mark, and the index of that position is recorded as end. Finally, the width of the black mark obtained in step S703 is w_m = end - start. This width (i.e. the physical size) of the black mark is compared with the width of the standard black mark (i.e. the standard size) to judge whether the physical size of the black mark is qualified.
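As an illustration of this traversal, the sketch below scans the single-row matrix from both ends for the first picture element above the threshold and compares the resulting width w_m with the standard width; the tolerance parameter is an assumption, since the patent does not specify how the comparison is made.

```python
import numpy as np

def measure_mark_width(row: np.ndarray, threshold: float) -> int:
    """Return the black-mark width (in picture elements) from a single-row matrix.

    Scans left-to-right for the first element above `threshold` (start) and
    right-to-left for the last one (end). Sketch only.
    """
    above = np.flatnonzero(row > threshold)
    if above.size == 0:
        return 0                      # no marker edge found
    start, end = int(above[0]), int(above[-1])
    return end - start                # w_m = end - start

def mark_is_qualified(measured_px: int, standard_px: int,
                      tolerance_px: int = 5) -> bool:
    """Compare the measured width with the standard width (tolerance assumed)."""
    return abs(measured_px - standard_px) <= tolerance_px
```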
See the inspection interface for a duplex medicine package shown in fig. 8. In fig. 8, the upper-left corner shows the left black mark of the left bag and the upper-right corner shows the right black mark of the left bag, where the right black mark of the left bag adjoins the left black mark of the right bag. The thickened box under each black mark indicates the measured width of the black mark, and the number below it is the black mark size in units of pixels: 47 for the left black mark and 77 for the right black mark.
Based on the example shown in fig. 7, in an alternative embodiment of step S104, the target package is judged to be qualified if the physical size of the marker meets the standard size. Specifically, if the physical size of the marker meets the standard size, the cut was made at the specified position within the marker region; in this case the current medicine package does not suffer from medicine leakage caused by a mis-cut, so the medicine package containing the marker can be judged qualified. Otherwise, if the physical size of the marker does not meet the standard size, the target package is judged unqualified.
Of course, other parts (such as the edge regions, two-dimensional codes and text information) can subsequently be inspected as well, so that the medicine package is judged qualified only after other possible defects have also been ruled out.
Through the steps S701 to S703, the image of the marker in the medicine package can be converted into a data matrix, and the head and tail positions of the marker are obtained through calculation of the data matrix, so that the physical size of the marker is obtained, the automatic detection of the medicine package marker is realized, and the accuracy and the detection efficiency of the marker detection in the medicine package are further improved.
For a region of interest containing a marker, background elements in the region of interest may interfere with identification of the marker contour; for example, background elements may be mistakenly treated as part of the marker, causing errors in the identified contour. Therefore, to further eliminate the interference of background elements in the region of interest, the image of the region to be detected may also be preprocessed before step S103.
An alternative embodiment of preprocessing the area to be detected, as shown in fig. 9, is specifically implemented as the following steps:
Step S901, performing an open operation on the binarized image of the region of interest to obtain an initial noise reduction image of the region of interest. Here, the background noise in the region of interest can be primarily eliminated by the open operation, and the image quality is improved.
Step S902, compressing a data matrix of the initial noise reduction image to obtain a second one-dimensional image element matrix; each element in the second one-dimensional image element matrix represents an average value of a corresponding column of elements in the data matrix of the initial noise reduction image.
Step S903, performing background elimination processing on the initial noise reduction image by adopting the noise reduction zone bit in the second one-dimensional image element matrix to obtain a target noise reduction image of the region of interest. The noise reduction flag bit is the first element which is searched in the second one-dimensional image element matrix and is in a preset range by taking the first elements at two sides in the second one-dimensional image element matrix as searching starting points.
Specifically, in step S903, the noise reduction flag bit is searched for in the second one-dimensional map element matrix. And then, performing background elimination processing on the initial noise reduction image based on the noise reduction zone bit to obtain a secondary noise reduction image. Alternatively, it may also be determined whether there is a case where the outline of the marker is connected to the background element, and if this is the case, it is necessary to eliminate the background element having the connection relation. Further, the difference between at least two noise reduction flag bits is taken as the initial width of the marker. Then, the secondary noise reduction image is subjected to an open operation based on the initial width, and the region of interest (namely the target noise reduction image) after the noise reduction processing is obtained. Therefore, the interference of the background element on the identification of the marker profile is further removed through the self-adaptive open operation of the region of interest, burrs of the marker are removed, and the accuracy of the marker profile in the region of interest is improved.
Through the above preprocessing steps S901 to S903, automatic noise reduction of the region of interest containing the marker is achieved, the interference of background elements in the region of interest is further eliminated, the accuracy of the marker contour in the region of interest is improved, and the marker extraction efficiency is further improved.
The preprocessing of a region of interest containing a marker in an embodiment of the present application is described below by way of an alternative example.
Assume that the region of interest containing the marker is the region of interest b in fig. 6, which contains the black mark b on the right. This region of interest is processed with the automatic binarization described above, resulting in the region of interest b1 shown in fig. 10 (i.e., the binarized image of the region of interest).
In the region of interest b1 there is clearly some noise, together with holes caused by that noise. In this case, an open operation with a 3×3 kernel may be performed to fill the holes in the region of interest b1 and eliminate the noise, giving the region of interest b2 shown in fig. 10 (i.e., the initial noise reduction image of the region of interest).
In the region of interest b2, part of the black mark is still connected to background elements, such as the burr portions. To further improve the image quality and the accuracy of the black mark size measurement, these background elements need to be eliminated. First, the data matrix of the region of interest b2 is compressed into a single-row matrix (i.e., the second one-dimensional image element matrix). The elements of this single-row matrix are then traversed from left to right: if the value of an element is saturated, the corresponding whole column in the data matrix is 1 and belongs to the background, so the column with the same index may be set to 0 to eliminate the background element in that column. The traversal stops at the first unsaturated element whose value is larger than half of the saturated value, and the index of that element is recorded as start (i.e., a noise reduction flag bit). The same traversal is then performed from right to left until the first unsaturated element whose value is larger than half of the saturated value is reached, and its index is recorded as end (i.e., the other noise reduction flag bit). Through this traversal the background elements are further eliminated, yielding the secondary noise reduction image.
Then, the noise reduction flag bits of the region of interest b2 are used to compute the initial width of the black mark, w_m = end - start. Next, an open operation is performed on the black mark using a convolution kernel of size (w_m/3, 1) determined from this initial width, which further eliminates burrs in the secondary noise reduction image. An open operation with a kernel of size (5, w_b/2) is then applied to the black mark so that its overall shape becomes more regular, giving the region of interest b3 shown in fig. 10 (i.e., the target noise reduction image). Finally, to improve robustness, the largest contour of the black mark is searched for and selected within the region of interest b3, and the region of interest is further reduced to the bounding box of that largest contour.
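The adaptive open operations and the final bounding-box step of this example could look roughly like the following OpenCV sketch (OpenCV 4.x assumed). The kernel sizes mirror the (w_m/3, 1) and (5, w_b/2) values mentioned above, with w_b standing for the second width value referenced in the example; everything else is an assumption made for illustration, not the patented implementation.

```python
import cv2
import numpy as np

def refine_black_mark(secondary: np.ndarray, w_m: int, w_b: int) -> np.ndarray:
    """secondary: secondary noise-reduction image as uint8 (0/255). Returns the cropped target ROI."""
    k1 = cv2.getStructuringElement(cv2.MORPH_RECT, (max(w_m // 3, 1), 1))
    step1 = cv2.morphologyEx(secondary, cv2.MORPH_OPEN, k1)      # remove burrs along the mark width
    k2 = cv2.getStructuringElement(cv2.MORPH_RECT, (5, max(w_b // 2, 1)))
    step2 = cv2.morphologyEx(step1, cv2.MORPH_OPEN, k2)          # regularize the overall mark shape
    contours, _ = cv2.findContours(step2, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return step2                                             # nothing left to crop
    largest = max(contours, key=cv2.contourArea)                 # keep only the largest outline
    x, y, w, h = cv2.boundingRect(largest)
    return step2[y:y + h, x:x + w]                               # shrink the ROI to its bounding box
```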
Through the preprocessing in the above example, the quality of the image to be detected is further improved, which in turn improves the accuracy and reliability of the black mark qualification check and the efficiency of automatic package detection.
In the embodiment of the application, the region of interest to be detected is extracted from the medicine package image and compressed into a one-dimensional image element matrix; because the element values at the edges of the marker are larger than the set threshold value, the visual characteristics of the marker can be obtained from this one-dimensional image element matrix and used to judge whether the marker is qualified. Automatic detection of medicine packages is thereby realized, and the efficiency of marker detection in medicine packages is greatly improved. At the same time, because the visual characteristics are extracted and checked automatically from the images, the loss of accuracy caused by human eye fatigue and the risk of secondary contamination of the medicine caused by manual intervention are effectively avoided, so the accuracy of marker detection in medicine packaging is greatly improved and the medicine quality is ensured.
Having described the method of the exemplary embodiments of the present application, a marker detection device for pharmaceutical packages according to the exemplary embodiments of the present application is described next with reference to fig. 11. Optionally, the marker detection device may be provided in a pharmaceutical packaging detection apparatus or a pharmaceutical packaging production apparatus. The device comprises:
An acquisition unit 1101 for acquiring a target image; the target image comprises image information of a target package, wherein the target package is one independent package in a duplex package, and the duplex package comprises two independent packages connected with each other;
An extracting unit 1102, configured to extract a region of interest from the target image according to a relative position of the target package in the duplex package;
A compression unit 1103, configured to compress the data matrix of the region of interest to obtain a one-dimensional image element matrix of the region of interest; wherein each element in the one-dimensional image element matrix respectively represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest; the element value at the edge of the marker in the one-dimensional image element matrix is larger than a set threshold value; and the marker is used for indicating a cuttable region in the duplex package;
A judging unit 1104, configured to judge whether the visual characteristics of the marker accord with preset detection conditions; wherein the visual characteristics of the marker are determined based on the elements at the edges of the marker. A structural sketch of how these four units can be organised is given below.
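Purely as an illustrative skeleton (the class and method names below are assumptions, not the actual implementation of units 1101 to 1104), the device could be organised as a single class whose methods correspond to the four units:

```python
import numpy as np

class PackageMarkerDetector:
    """Illustrative skeleton mirroring units 1101-1104; method bodies are placeholders."""

    def acquire_target_image(self, raw: np.ndarray) -> np.ndarray:
        """Acquisition unit 1101: binarize the raw image into the target image."""
        raise NotImplementedError

    def extract_roi(self, target: np.ndarray, relative_position: str) -> np.ndarray:
        """Extraction unit 1102: crop the region of interest for the left or right pocket."""
        raise NotImplementedError

    def compress(self, roi: np.ndarray) -> np.ndarray:
        """Compression unit 1103: collapse the ROI data matrix into a 1-D image element matrix."""
        raise NotImplementedError

    def judge(self, row: np.ndarray, standard_mm: float) -> bool:
        """Judging unit 1104: locate the marker edges and check the preset detection condition."""
        raise NotImplementedError
```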
As an alternative embodiment, the extracting unit 1102 is specifically configured to:
identifying a relative position of the target package in the duplex package; the relative position includes a left pocket located on a left side of the duplex package or a right pocket located on a right side of the duplex package;
determining the region of interest in the target image according to the relative position of the target package and the standard visual features of the marker;
wherein the standard visual characteristics of the marker include standard size and standard location; the standard position is related to the relative position in which the target package is located.
As an alternative embodiment, the extracting unit 1102 is specifically configured to, when identifying the relative position of the target package in the duplex package:
acquiring the minimum external contour of the target package;
Determining the midpoint of the minimum external contour according to the first-order central moment of the minimum external contour;
Judging whether the midpoint is positioned at the left side or the right side of a horizontal midline in the target image;
If the midpoint is on the left side of the horizontal midline, the target package is the right pocket located on the right side in the duplex package; or
if the midpoint is on the right side of the horizontal midline, the target package is the left pocket located on the left side in the duplex package (see the sketch below).
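A rough OpenCV sketch of this left/right decision follows. The centroid computed from the contour moments stands in for the moment-based midpoint described above, and the vertical centre line of the image stands in for the horizontal midline; both readings are assumptions made for illustration.

```python
import cv2
import numpy as np

def identify_relative_position(package_mask: np.ndarray) -> str:
    """package_mask: binarized image of the target package (uint8, 0/255)."""
    contours, _ = cv2.findContours(package_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    outline = max(contours, key=cv2.contourArea)      # minimum external contour of the package
    m = cv2.moments(outline)
    cx = m["m10"] / m["m00"]                          # x-coordinate of the contour midpoint
    midline_x = package_mask.shape[1] / 2             # centre line of the target image
    # Midpoint on the left of the midline -> the package is the right pocket, and vice versa.
    return "right_pocket" if cx < midline_x else "left_pocket"
```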
As an alternative embodiment, the compression unit 1103 is specifically configured to:
calculating the average value of each column of elements in the data matrix of the region of interest, and taking the average value of each column of elements as the corresponding element in the first one-dimensional image element matrix of the region of interest.
Accordingly, the judging unit 1104 is specifically configured to:
Traversing the first one-dimensional image element matrix to extract the head and tail positions of the marker; the head and tail positions are the first elements larger than the set threshold value that are encountered when the first one-dimensional image element matrix is traversed starting from the first element at each of its two ends;
judging whether the physical size of the marker meets the standard size; the physical size of the marker is calculated based on the head-to-tail position of the marker;
if the physical size of the marker meets the standard size, determining that the target package is qualified; or
if the physical size of the marker does not meet the standard size, determining that the target package is not qualified.
As an alternative embodiment, the head and tail positions include:
the first element larger than the set threshold value that is encountered when traversing the first one-dimensional image element matrix from left to right, taken as the initial position of the marker; and the first element larger than the set threshold value that is encountered when traversing from right to left, taken as the end position of the marker. A sketch of this two-direction traversal is given below.
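The two-direction traversal can be sketched as follows; the loop structure is only an illustration of the described rule, with the set threshold passed in as a parameter rather than taken from the document.

```python
import numpy as np

def find_head_tail(row: np.ndarray, threshold: float):
    """Return (start, end) indices of the first elements above `threshold`
    when traversing `row` from the left and from the right, or None if absent."""
    start = next((i for i, v in enumerate(row) if v > threshold), None)
    if start is None:
        return None                       # no marker edge found in the matrix
    end = len(row) - 1 - next(i for i, v in enumerate(row[::-1]) if v > threshold)
    return start, end
```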
As an alternative embodiment, the acquisition unit 1101 is specifically configured to:
Acquiring an original image containing the target package image information;
and performing binarization processing on the original image using an automatic image binarization method based on fuzzy set theory, so as to obtain the target image (one candidate fuzzy thresholding method is sketched below).
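The document does not spell out which fuzzy-set-based binarization is used; one widely used candidate is the Huang–Wang fuzzy-entropy threshold, sketched below purely as an assumption. The function picks the grey level that minimizes the total fuzziness of an 8-bit greyscale image and then binarizes with OpenCV; the file name in the usage comment is illustrative.

```python
import cv2
import numpy as np

def huang_fuzzy_threshold(gray: np.ndarray) -> int:
    """Return a threshold minimizing Huang & Wang's measure of fuzziness (assumed variant, uint8 input)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    levels = np.arange(256, dtype=np.float64)
    nonzero = np.flatnonzero(hist)
    first, last = int(nonzero[0]), int(nonzero[-1])
    c = max(last - first, 1)                           # normalization constant for the membership
    cum_w, cum_s = np.cumsum(hist), np.cumsum(hist * levels)
    best_t, best_e = first, np.inf
    for t in range(first, last):
        w0, w1 = cum_w[t], cum_w[-1] - cum_w[t]
        if w0 == 0 or w1 == 0:
            continue
        mu0, mu1 = cum_s[t] / w0, (cum_s[-1] - cum_s[t]) / w1   # class means below/above t
        mu = np.where(levels <= t,
                      1.0 / (1.0 + np.abs(levels - mu0) / c),
                      1.0 / (1.0 + np.abs(levels - mu1) / c))   # membership of each grey level
        mu = np.clip(mu, 1e-12, 1.0 - 1e-12)
        fuzziness = -(mu * np.log(mu) + (1.0 - mu) * np.log(1.0 - mu))
        e = float(np.sum(hist * fuzziness))            # histogram-weighted fuzzy entropy
        if e < best_e:
            best_e, best_t = e, t
    return best_t

# Usage (illustrative):
# gray = cv2.imread("package.png", cv2.IMREAD_GRAYSCALE)
# _, target = cv2.threshold(gray, huang_fuzzy_threshold(gray), 255, cv2.THRESH_BINARY)
```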
As an optional implementation, the apparatus further includes a preprocessing unit configured to, before the compression unit 1103 compresses the data matrix of the region of interest into the one-dimensional image element matrix of the region of interest:
Performing open operation on the binarized image of the region of interest to obtain an initial noise reduction image of the region of interest;
Compressing the data matrix of the initial noise reduction image to obtain a second one-dimensional image element matrix; each element in the second one-dimensional image element matrix respectively represents the average value of a corresponding column of elements in the data matrix of the initial noise reduction image;
and performing background elimination processing on the initial noise reduction image by using the noise reduction flag bits in the second one-dimensional image element matrix to obtain a target noise reduction image of the region of interest.
A noise reduction flag bit is the first element found to lie within a preset range when the second one-dimensional image element matrix is searched starting from the first element at each of its two ends.
As an optional implementation, when performing the background elimination processing on the initial noise reduction image by using the noise reduction flag bits in the second one-dimensional image element matrix to obtain the target noise reduction image of the region of interest, the preprocessing unit is specifically configured to:
searching for the noise reduction flag bits in the second one-dimensional image element matrix;
performing background elimination processing on the initial noise reduction image based on the noise reduction flag bits to obtain a secondary noise reduction image; a noise reduction flag bit is the first element found to lie within a preset range when the second one-dimensional image element matrix is searched starting from the first element at each of its two ends;
taking the difference value between at least two noise reduction flag bits as the initial width of the marker;
and performing an open operation on the secondary noise reduction image based on the initial width to obtain the target noise reduction image.
According to the embodiment of the application, the marker detection device for medicine packages extracts the region of interest to be detected from the medicine package image and, using the fact that the element values at the edges of the marker are larger than the set threshold value, obtains the visual characteristics of the marker from the one-dimensional image element matrix into which the region of interest is compressed, so as to judge whether the marker is qualified. Automatic detection of medicine packages is thereby realized, and the efficiency of marker detection in medicine packages is greatly improved. At the same time, because the visual characteristics are extracted and checked automatically from the images, the loss of accuracy caused by human eye fatigue and the risk of secondary contamination of the medicine caused by manual intervention are effectively avoided, so the accuracy of marker detection in medicine packaging is greatly improved and the medicine quality is ensured.
Having described the method and device of the exemplary embodiments of the present application, a computer readable storage medium of the exemplary embodiments of the present application, which may be provided in a marker detection apparatus for pharmaceutical packaging, is described next with reference to fig. 12. As shown in fig. 12, the computer readable storage medium is an optical disc 120 having a computer program (i.e., a program product) stored thereon which, when executed by a processor, implements the steps described in the above method embodiments, for example: acquiring a target image, the target image comprising image information of a target package, wherein the target package is one independent package in a duplex package and the duplex package comprises two independent packages connected with each other; extracting a region of interest from the target image according to the relative position of the target package in the duplex package; compressing the data matrix of the region of interest to obtain a one-dimensional image element matrix of the region of interest, wherein each element in the one-dimensional image element matrix respectively represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest, the element value at the edge of the marker in the one-dimensional image element matrix is larger than a set threshold value, and the marker is used for indicating a cuttable region in the duplex package; and judging whether the visual characteristics of the marker accord with preset detection conditions, wherein the visual characteristics of the marker are determined based on the elements at the edge of the marker. The specific implementation of each step is not repeated here.
It should be noted that examples of the computer readable storage medium may also include, but are not limited to, a phase change memory (PRAM), a Static Random Access Memory (SRAM), a Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a flash memory, or other optical or magnetic storage medium, which will not be described in detail herein.
Having described the apparatus, methods, media, and devices of exemplary embodiments of the present application, next, a computing device for marker detection in pharmaceutical packaging of exemplary embodiments of the present application, which may be provided in a pharmaceutical packaging detection device or a pharmaceutical packaging production device, is described with reference to fig. 13.
FIG. 13 illustrates a block diagram of an exemplary computing device 100 suitable for implementing embodiments of the application; the computing device 100 may be a computer system or a server. The computing device 100 shown in fig. 13 is only one example and should not impose any limitation on the functionality or scope of use of embodiments of the application.
As shown in fig. 13, components of computing device 100 may include, but are not limited to: one or more processors or processing units 1001, a system memory 1002, and a bus 1003 that connects the different system components (including the system memory 1002 and the processing unit 1001).
Computing device 100 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by computing device 100 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 1002 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 10021 and/or cache memory 10022. Computing device 100 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, a storage system 10023 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in fig. 13, commonly referred to as a "hard disk drive"). Although not shown in fig. 13, a magnetic disk drive for reading from and writing to a removable nonvolatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media), may also be provided. In these cases, each drive may be connected to bus 1003 via one or more data media interfaces. The system memory 1002 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of the embodiments of the application.
A program/utility 10025 having a set (at least one) of program modules 10024 may be stored, for example, in system memory 1002. Such program modules 10024 include, but are not limited to: an operating system, one or more application programs, other program modules, and program data; each or some combination of these examples may include an implementation of a network environment. Program modules 10024 generally perform the functions and/or methods of the described embodiments of the application.
Computing device 100 may also communicate with one or more external devices 1004 (e.g., a keyboard, a pointing device, a display, etc.). Such communication may occur through an input/output (I/O) interface 1005. Moreover, computing device 100 may also communicate with one or more networks, such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet, through network adapter 1006. As shown in fig. 13, the network adapter 1006 communicates with the other modules of the computing device 100 (e.g., processing unit 1001, etc.) over the bus 1003. It should be appreciated that, although not shown in fig. 13, other hardware and/or software modules may be used in conjunction with computing device 100.
The processing unit 1001 executes various functional applications and data processing by running programs stored in the system memory 1002, for example: acquiring a target image, the target image comprising image information of a target package, wherein the target package is one independent package in a duplex package and the duplex package comprises two independent packages connected with each other; extracting a region of interest from the target image according to the relative position of the target package in the duplex package; compressing the data matrix of the region of interest to obtain a one-dimensional image element matrix of the region of interest, wherein each element in the one-dimensional image element matrix respectively represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest, the element value at the edge of the marker in the one-dimensional image element matrix is larger than a set threshold value, and the marker is used for indicating a cuttable region in the duplex package; and judging whether the visual characteristics of the marker accord with preset detection conditions, wherein the visual characteristics of the marker are determined based on the elements at the edges of the marker. The specific implementation of each step is not repeated here. It should be noted that, although several units/modules or sub-units/sub-modules of the marker detection device of a pharmaceutical package are mentioned in the above detailed description, such a division is only exemplary and not mandatory. Indeed, the features and functionality of two or more of the units/modules described above may be embodied in one unit/module in accordance with embodiments of the present application. Conversely, the features and functions of one unit/module described above may be further divided into multiple units/modules.
In the description of the present application, it should be noted that the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative; for example, the division of the units is merely a logical functional division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some communication interfaces, devices, or units, and may be in electrical, mechanical, or other form.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a non-volatile computer readable storage medium executable by a processor. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product; the software product is stored in a storage medium and comprises several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disc.
Finally, it should be noted that the above examples are only specific embodiments of the present application, used to illustrate its technical solutions rather than to limit them, and the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art may still modify the technical solutions described in the foregoing embodiments, easily conceive of changes to them, or make equivalent substitutions of some of the technical features within the technical scope disclosed in the present application; such modifications, changes or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application and shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Furthermore, although the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be split into multiple steps.
Claims (9)
1. A method of detecting a label for a pharmaceutical package, comprising:
acquiring a target image; the target image comprises image information of a target package, wherein the target package is one independent package in a duplex package, and the duplex package comprises two independent packages connected with each other;
extracting a region of interest from the target image according to the relative position of the target package in the duplex package;
Compressing the data matrix of the region of interest to obtain a one-dimensional image element matrix of the region of interest; wherein each element in the one-dimensional image element matrix respectively represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest; the element value at the edge of the marker in the one-dimensional image element matrix is larger than a set threshold value; the marker is used for indicating a cuttable region in the duplex package;
judging whether the visual characteristics of the marker accord with preset detection conditions or not; wherein the visual characteristics of the marker are determined based on the elements at the edge of the marker;
wherein the extracting the region of interest from the target image according to the relative position of the target package in the duplex package comprises:
identifying a relative position of the target package in the duplex package; the relative position includes a left pocket located on a left side of the duplex package or a right pocket located on a right side of the duplex package;
determining the region of interest in the target image according to the relative position of the target package and the standard visual features of the marker;
wherein the standard visual characteristics of the marker include standard size and standard location; the standard position is related to the relative position of the target package;
The compressing the data matrix of the region of interest to obtain a one-dimensional image element matrix of the region of interest includes:
calculating the average value of each column of elements in the data matrix of the region of interest, and taking the average value of each column of elements as each corresponding element in the first one-dimensional image element matrix of the region of interest.
2. The method of claim 1, wherein the identifying the relative position of the target package in the duplex package comprises:
acquiring the minimum external contour of the target package;
Determining the midpoint of the minimum external contour according to the first-order central moment of the minimum external contour;
Judging whether the midpoint is positioned at the left side or the right side of a horizontal midline in the target image;
If the midpoint is on the left side of the horizontal midline, the target package is the right pocket located on the right side in the duplex package; or
if the midpoint is on the right side of the horizontal midline, the target package is the left pocket located on the left side in the duplex package.
3. The method of claim 1, wherein determining whether the visual characteristic of the marker meets a predetermined detection condition comprises:
Traversing the first one-dimensional image element matrix to extract the head and tail positions of the marker; the head and tail positions are the first elements which are larger than a set threshold value and are traversed in the first one-dimensional image element matrix by taking the first elements on two sides of the first one-dimensional image element matrix as traversal starting points;
judging whether the physical size of the marker meets the standard size; the physical size of the marker is calculated based on the head-to-tail position of the marker;
if the physical size of the marker meets the standard size, determining that the target package is qualified; or alternatively
And if the physical size of the marker does not meet the standard size, determining that the target package is not qualified.
4. A method of detecting a marker according to claim 3, wherein the head and tail positions comprise:
taking the first element which is traversed from left to right in the first one-dimensional image element matrix and is larger than a set threshold value as the initial position of the marker;
and taking the first element which is traversed from right to left in the first one-dimensional image element matrix and is larger than a set threshold value as the end position of the marker.
5. The method of claim 1, wherein compressing the data matrix of the region of interest to obtain the one-dimensional image element matrix of the region of interest further comprises:
Performing open operation on the binarized image of the region of interest to obtain an initial noise reduction image of the region of interest;
Compressing the data matrix of the initial noise reduction image to obtain a second one-dimensional image element matrix; each element in the second one-dimensional image element matrix respectively represents the average value of a corresponding column of elements in the data matrix of the initial noise reduction image;
performing background elimination processing on the initial noise reduction image by adopting a noise reduction zone bit in the second one-dimensional image element matrix to obtain a target noise reduction image of the region of interest;
the noise reduction zone bit is the first element which is searched in the second one-dimensional image element matrix and is in a preset range by taking the first elements at two sides in the second one-dimensional image element matrix as search starting points.
6. The method for detecting a marker according to claim 5, wherein the performing background elimination processing on the initial noise-reduced image by using the noise-reduced flag bit in the second one-dimensional image element matrix to obtain a target noise-reduced image of the region of interest includes:
searching a noise reduction zone bit in the second one-dimensional image element matrix;
performing background elimination processing on the initial noise reduction image based on the noise reduction zone bit to obtain a secondary noise reduction image; the noise reduction zone bit is the first element which is in a preset range and is searched in the second one-dimensional image element matrix by taking the first elements at two sides in the second one-dimensional image element matrix as search starting points;
taking the difference value between at least two noise reduction zone bits as the initial width of the marker;
And performing open operation on the secondary noise reduction image based on the initial width to obtain the target noise reduction image.
7. A label detection device for pharmaceutical packaging, the device comprising:
An acquisition unit configured to acquire a target image; the target image comprises image information of a target package, wherein the target package is one independent package in a duplex package, and the duplex package comprises two independent packages connected with each other;
The extraction unit is used for extracting a region of interest from the target image according to the relative position of the target package in the duplex package;
The compression unit is used for compressing the data matrix of the region of interest to obtain a one-dimensional image element matrix of the region of interest; wherein each element in the one-dimensional image element matrix respectively represents the image change characteristics of a corresponding column of elements in the data matrix of the region of interest; the element value at the edge of the marker in the one-dimensional image element matrix is larger than a set threshold value; the marker is used for indicating a cuttable region in the duplex package;
The judging unit is used for judging whether the visual characteristics of the marker accord with preset detection conditions or not; wherein the visual characteristics of the marker are determined based on the elements at the edge of the marker;
the extraction unit is specifically configured to, when extracting a region of interest from the target image according to a relative position of the target package in the duplex package:
identifying a relative position of the target package in the duplex package; the relative position includes a left pocket located on a left side of the duplex package or a right pocket located on a right side of the duplex package;
determining the region of interest in the target image according to the relative position of the target package and the standard visual features of the marker;
wherein the standard visual characteristics of the marker include standard size and standard location; the standard position is related to the relative position of the target package;
the compression unit is specifically configured to, when compressing the data matrix of the region of interest to obtain the one-dimensional image element matrix of the region of interest:
calculate the average value of each column of elements in the data matrix of the region of interest, and take the average value of each column of elements as each corresponding element in the first one-dimensional image element matrix of the region of interest.
8. A computing device, the computing device comprising:
at least one processor, memory, and input output unit;
Wherein the memory is for storing a computer program and the processor is for invoking the computer program stored in the memory to perform the marker detection method of the pharmaceutical packaging of any one of claims 1 to 6.
9. A computer readable storage medium comprising instructions that when executed on a computer cause the computer to perform the method of marker detection of a pharmaceutical package according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311119565.7A CN117152415B (en) | 2023-09-01 | 2023-09-01 | Method, device, equipment and storage medium for detecting marker of medicine package |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117152415A (en) | 2023-12-01
CN117152415B (en) | 2024-04-23
Family
ID=88905660
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8712163B1 (en) * | 2012-12-14 | 2014-04-29 | EyeNode, LLC | Pill identification and counterfeit detection method |
CN103903265A (en) * | 2014-03-31 | 2014-07-02 | 东华大学 | Method for detecting industrial product package breakage |
CN104537671A (en) * | 2015-01-04 | 2015-04-22 | 长沙理工大学 | Cigarette filter online counting and quality detecting method based on machine vision |
US9342900B1 (en) * | 2014-12-23 | 2016-05-17 | Ricoh Co., Ltd. | Distinguishing between stock keeping units using marker based methodology |
CN111210412A (en) * | 2019-12-31 | 2020-05-29 | 电子科技大学中山学院 | Package detection method and device, electronic equipment and storage medium |
CN111242900A (en) * | 2019-12-31 | 2020-06-05 | 电子科技大学中山学院 | Product qualification determination method and device, electronic equipment and storage medium |
US10803272B1 (en) * | 2016-09-26 | 2020-10-13 | Digimarc Corporation | Detection of encoded signals and icons |
CN111767920A (en) * | 2020-06-30 | 2020-10-13 | 北京百度网讯科技有限公司 | Region-of-interest extraction method and device, electronic equipment and storage medium |
WO2021114799A1 (en) * | 2019-12-14 | 2021-06-17 | 华南理工大学广州学院 | Computer vision-based matrix vehicle light identification method |
CN113284095A (en) * | 2021-05-08 | 2021-08-20 | 北京印刷学院 | Method for detecting number of medicine bags in medicine box based on machine vision |
WO2021179751A1 (en) * | 2020-03-13 | 2021-09-16 | 上海哔哩哔哩科技有限公司 | Image processing method and system |
CN113420690A (en) * | 2021-06-30 | 2021-09-21 | 平安科技(深圳)有限公司 | Vein identification method, device and equipment based on region of interest and storage medium |
CN113870189A (en) * | 2021-09-02 | 2021-12-31 | 广东省电信规划设计院有限公司 | Industrial product circular detection method and device |
CN113870217A (en) * | 2021-09-27 | 2021-12-31 | 菲特(天津)检测技术有限公司 | Edge deviation vision measurement method based on machine vision and image detector |
CN114066810A (en) * | 2021-10-11 | 2022-02-18 | 安庆师范大学 | Method and device for detecting concave-convex point defects of packaging box |
WO2022036478A1 (en) * | 2020-08-17 | 2022-02-24 | 江苏瑞科科技有限公司 | Machine vision-based augmented reality blind area assembly guidance method |
CN114092771A (en) * | 2020-08-05 | 2022-02-25 | 北京万集科技股份有限公司 | Multi-sensing data fusion method, target detection device and computer equipment |
CN115908269A (en) * | 2022-10-26 | 2023-04-04 | 中科慧远视觉技术(北京)有限公司 | Visual defect detection method and device, storage medium and computer equipment |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7506817B2 (en) * | 2004-12-14 | 2009-03-24 | Ricoh Co., Ltd. | Location of machine readable codes in compressed representations |
US9367770B2 (en) * | 2011-08-30 | 2016-06-14 | Digimarc Corporation | Methods and arrangements for identifying objects |
Non-Patent Citations (2)
Title |
---|
A target detection method based on range images from a laser three-dimensional imaging radar; 黄明晶; 蹇渊; 王雪梅; 孟伟杰; 马蒙蒙; Laser & Infrared; 2020-07-20 (07); full text *
A new method for extracting regions of interest from remote sensing images of micro targets; 李晓飞; 马大玮; 胡焰智; 范小麟; Microcomputer Information; 2008-04-05 (10); full text *
Also Published As
Publication number | Publication date |
---|---|
CN117152415A (en) | 2023-12-01 |
Legal Events
Date | Code | Title | Description |
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |