CN110288359B - Code fingerprint anti-fake method - Google Patents

Code fingerprint anti-fake method

Info

Publication number
CN110288359B
CN110288359B (application CN201910547553.1A)
Authority
CN
China
Prior art keywords
code
commodity
code fingerprint
feature
image
Prior art date
Legal status
Active
Application number
CN201910547553.1A
Other languages
Chinese (zh)
Other versions
CN110288359A (en)
Inventor
孙继伟
凌辉
Current Assignee
Jiangsu Yuanyu Intelligent Technology Co ltd
Original Assignee
Jiangsu Yuanyu Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu Yuanyu Intelligent Technology Co ltd filed Critical Jiangsu Yuanyu Intelligent Technology Co ltd
Priority to CN201910547553.1A priority Critical patent/CN110288359B/en
Publication of CN110288359A publication Critical patent/CN110288359A/en
Application granted granted Critical
Publication of CN110288359B publication Critical patent/CN110288359B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K17/00Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
    • G06K17/0022Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations arrangements or provisious for transferring data to distant stations, e.g. from a sensing device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/018Certifying business or products
    • G06Q30/0185Product, service or business identity fraud
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/50Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757Matching configurations of points or features

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Business, Economics & Management (AREA)
  • Medical Informatics (AREA)
  • Accounting & Taxation (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Computing Systems (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Finance (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention provides a code fingerprint anti-counterfeiting method comprising the steps of establishing a code fingerprint database, establishing a code fingerprint for a commodity to be detected, and retrieval verification. Establishing the code fingerprint database includes extracting local feature points of the mark area image; removing the common local feature points, the remaining local feature points forming a salient local feature point map; obtaining block detail feature vectors from the salient local feature point map; and establishing a key area feature code fingerprint from the salient local feature point map and the block detail feature vectors. In the method, the commodity mark area image is collected with optical equipment so that it contains printing feature differences, the image is processed to establish a uniquely identifying code fingerprint that is stored in the code fingerprint database, and whether a commodity to be detected has its unique code fingerprint in the database is judged by collecting the corresponding mark area image of that commodity.

Description

Code fingerprint anti-fake method
Technical Field
The invention relates to the technical field of commodity anti-counterfeiting, in particular to a code fingerprint anti-counterfeiting method.
Background
With the development of the market economy, counterfeit commodities on the market have become more and more common and difficult for people to guard against; counterfeiting has become the world's second-largest public nuisance after drugs.
To combat counterfeiting, countries around the world have developed various anti-counterfeiting technologies, such as anti-counterfeiting paper, anti-counterfeiting ink, special printing, security seals, watermarks, laser holographic anti-counterfeiting, commodity bar codes, microelectronic chips, radio frequency identification (RFID) electronic tags and two-dimensional codes. However, these traditional anti-counterfeiting technologies neither truly prevent counterfeiting nor allow consumers to conveniently distinguish genuine products from fakes. For example, a counterfeiter can easily learn the packaging of a product and its commodity bar code from a genuine product already in circulation, imitate the same packaging and bar code, and obtain the same scanning result as the genuine product when the counterfeit's bar code is scanned. As another example, a counterfeiter may copy the two-dimensional code of a genuine product so that the counterfeit carries an identical two-dimensional code, which likewise yields the same scanning result as the genuine product. As a further example, near field communication (NFC) chips are used for anti-counterfeiting, but a counterfeiter can recycle old packages, extract and counterfeit the information in the NFC chips, and thereby pass the counterfeit off as genuine; copycat websites and fake databases have also been built, so that consumers cannot tell whether a product is genuine after purchasing it.
How to achieve genuine commodity anti-counterfeiting, so that counterfeiters cannot copy a commodity and consumers can query its authenticity by a simple method, is therefore an urgent problem to be solved in this field.
Disclosure of Invention
Therefore, a code fingerprint anti-counterfeiting method is needed to address the problems that traditional anti-counterfeiting methods are easy to imitate and that consumers cannot conveniently determine whether a commodity is genuine.
The invention provides a code fingerprint anti-counterfeiting method, which comprises the following steps:
a step of establishing a code fingerprint database, which comprises collecting commodity mark area images, establishing key area feature code fingerprints according to the commodity mark area images, and building a key area feature code fingerprint database from the key area feature code fingerprints of a plurality of commodities of the same type;
a step of establishing a code fingerprint for a commodity to be detected, which comprises collecting the mark area image of the commodity to be detected and establishing the to-be-detected key area feature code fingerprint according to the to-be-detected commodity mark area image;
a step of retrieval verification, which comprises retrieving the key area feature code fingerprint database according to the to-be-detected key area feature code fingerprint and judging the authenticity of the commodity to be detected according to the retrieval result;
establishing the key area feature code fingerprint according to the commodity mark area image and establishing the to-be-detected key area feature code fingerprint according to the to-be-detected commodity mark area image each comprise the following steps:
extracting local feature points of the mark area image;
removing the common local feature points among the local feature points, the remaining local feature points forming a salient local feature point map;
obtaining block detail feature vectors according to the salient local feature point map;
establishing the key area feature code fingerprint according to the salient local feature point map and the block detail feature vectors;
wherein the mark area image is the commodity mark area image or the to-be-detected commodity mark area image, and the key area feature code fingerprint is the key area feature code fingerprint or the to-be-detected key area feature code fingerprint.
In one embodiment, the local feature points of the mark area image are extracted by the scale-invariant feature transform (SIFT) method.
In one embodiment, the common local feature points are local feature points that recur in more than a preset proportion of a plurality of commodity mark area images of the same type.
In one embodiment, the block detail feature vectors are obtained from the salient local feature point map by the histogram of oriented gradients (HOG) feature detection method.
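By way of illustration, the two techniques named above (SIFT and HOG) might be combined as in the following sketch; OpenCV and NumPy are assumed tools, the 32-pixel block size is arbitrary, and the `common_positions` filter is only a simplified stand-in for the preset-proportion removal of common local feature points, not the patent's exact procedure.

```python
# Illustrative sketch only (assumes OpenCV >= 4.4 with SIFT, and NumPy);
# the patent names SIFT and HOG but does not give an implementation.
import cv2
import numpy as np

def extract_salient_points(mark_img_gray, common_positions, tol=3.0):
    """Detect SIFT keypoints, then drop those lying near positions that recur
    in most same-type label images (the 'common' local feature points)."""
    sift = cv2.SIFT_create()
    keypoints = sift.detect(mark_img_gray, None)
    salient = []
    for kp in keypoints:
        near_common = any(np.hypot(kp.pt[0] - cx, kp.pt[1] - cy) < tol
                          for cx, cy in common_positions)
        if not near_common:
            salient.append(kp)
    return salient

def block_detail_vectors(mark_img_gray, block=32):
    """Compute one HOG descriptor per block of the salient local feature point map."""
    hog = cv2.HOGDescriptor((block, block), (16, 16), (8, 8), (8, 8), 9)
    h, w = mark_img_gray.shape[:2]
    vectors = {}
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = mark_img_gray[y:y + block, x:x + block]
            vectors[(x, y)] = hog.compute(patch).ravel()
    return vectors
```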
In one embodiment, after the step of obtaining the block detail feature vectors from the salient local feature point map, the method further includes the following step:
generating an index code from the salient local feature point map.
In one embodiment, generating the index code from the salient local feature point map includes the following steps:
dividing the salient local feature point map into blocks;
extracting the central salient local feature point of each block and taking the gray value of that central point as the threshold of the corresponding block;
comparing the gray value of each salient local feature point in a block with the threshold of the corresponding block, recording 1 when the gray value is smaller than the threshold and 0 otherwise, so as to obtain a feature code for each block;
building a feature code spiral matrix from the inside outwards, starting from the feature code of the block at the center of the mark area image; this feature code spiral matrix forms the index code.
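The index-code construction can be pictured with the following hedged sketch; the per-block bit strings, the clockwise direction of the spiral and the concatenation order are illustrative assumptions rather than details fixed by the text.

```python
# Hedged sketch: block size, spiral direction and concatenation order are
# illustrative assumptions, not specified in the patent text.
def block_feature_code(gray, block_points, centre_point):
    """One bit per salient point in the block: 1 if its gray value is below the
    gray value of the block's central salient point, else 0."""
    threshold = gray[centre_point[1], centre_point[0]]
    return ''.join('1' if gray[y, x] < threshold else '0' for x, y in block_points)

def spiral_order(n):
    """Visit an n x n grid of blocks from the centre outwards (clockwise assumed)."""
    y = x = (n - 1) // 2
    dy, dx = 0, 1                        # start moving right
    seen = set()
    step = 1
    while len(seen) < n * n:
        for _ in range(2):               # two legs per step length
            for _ in range(step):
                if 0 <= y < n and 0 <= x < n and (y, x) not in seen:
                    seen.add((y, x))
                    yield (y, x)
                y, x = y + dy, x + dx
            dy, dx = dx, -dy             # rotate 90 degrees (clockwise in image coords)
        step += 1

def build_index_code(block_codes, n):
    """Concatenate per-block feature codes (keyed by grid row/column) in
    centre-out spiral order to form the index code."""
    return [block_codes[pos] for pos in spiral_order(n) if pos in block_codes]
```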
In one embodiment, retrieving the key area feature code fingerprint database according to the to-be-detected key area feature code fingerprint includes the following steps:
randomly generating an Archimedean spiral starting from the central point of the salient local feature point map of the commodity to be detected;
randomly extracting, according to a set size, several salient local feature points from all the salient local feature points that the Archimedean spiral passes through;
matching and retrieving the extracted salient local feature points against the index codes;
when the first fault tolerance between the extracted salient local feature points and a specific index code among the index codes is less than or equal to a first preset fault tolerance, performing block detail feature vector matching and retrieval between the extracted salient local feature points and that specific index code;
when the second fault tolerance of the block detail feature vector matching between the extracted salient local feature points and the specific index code is less than or equal to a second preset fault tolerance, judging the result to be true.
In one embodiment, when the second fault tolerance of the block detail feature vector matching between the extracted salient local feature points and the specific index code is greater than the second preset fault tolerance, the step of randomly extracting, according to the set size, several salient local feature points from all the salient local feature points passed by the Archimedean spiral is performed again.
In one embodiment, after the salient local feature points passed by the Archimedean spiral have been randomly re-extracted a preset number of times according to the set size, if the second fault tolerance of the block detail feature vector matching between the extracted salient local feature points and the specific index code is still greater than the second preset fault tolerance, the result is judged to be false.
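The retrieval flow of these embodiments might look roughly like the following sketch; the spiral parameterisation, the use of a simple mismatch ratio as the "fault tolerance", and every threshold value are assumptions made purely for illustration.

```python
# Hedged sketch of the two-stage retrieval; spiral parameters, the mismatch
# ratio used as "fault tolerance" and every threshold are assumed values.
import random
import numpy as np

def archimedean_sample(points, centre, sample_size, b=2.0, width=1.5):
    """Randomly phase an Archimedean spiral r = b*theta around `centre` and
    return a random subset of the salient points it passes through."""
    phase = random.uniform(0.0, 2.0 * np.pi)
    hits = []
    for (x, y) in points:
        dx, dy = x - centre[0], y - centre[1]
        r, theta = np.hypot(dx, dy), np.arctan2(dy, dx) - phase
        k = round((r / b - theta) / (2.0 * np.pi))        # nearest spiral winding
        if abs(b * (theta + 2.0 * np.pi * k) - r) < width:
            hits.append((x, y))
    return random.sample(hits, min(sample_size, len(hits)))

def verify(resample, fingerprint_db, tol1=0.2, tol2=0.15, max_attempts=5):
    """Coarse index-code match, then block detail-vector match; resample and
    retry up to `max_attempts` times before judging the commodity false."""
    for _ in range(max_attempts):
        bits, vectors = resample()        # fresh spiral sample each attempt
        for entry in fingerprint_db:      # each stored key area feature code fingerprint
            miss1 = sum(a != b for a, b in zip(bits, entry["index"])) / max(len(bits), 1)
            if miss1 > tol1:              # first fault tolerance exceeded
                continue
            diffs = [np.linalg.norm(vectors[k] - entry["vectors"][k])
                     for k in vectors if k in entry["vectors"]]
            if diffs and float(np.mean(diffs)) <= tol2:   # second fault tolerance
                return True               # judged genuine
    return False                          # judged false after all retries
```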
In one embodiment, when the judgment result is true, the key area feature code fingerprint database is updated according to the judgment result;
the update comprises deleting the corresponding key area feature code fingerprint from the key area feature code fingerprint database, or writing user information into the corresponding key area feature code fingerprint in the database.
In existing printing technology, printing quality differences arise from printing color changes caused by the blocking plate, ink-carrying variations caused by wear of the printing plate, temperature and humidity changes in the printing environment, and variations in drying speed. These common factors give every printed matter subtle printing feature differences that can be observed with optical equipment and that, like a human fingerprint, are unique and cannot be counterfeited. The code fingerprint anti-counterfeiting method collects the commodity mark area image with optical equipment so that it contains these printing feature differences, processes the image to establish a uniquely identifying code fingerprint, and stores the code fingerprint in the code fingerprint database. By collecting the corresponding mark area image of a commodity to be detected, it can be judged whether that commodity has a unique code fingerprint in the code fingerprint database: if the corresponding code fingerprint exists, the commodity to be detected is judged genuine, and if it does not, the commodity is judged false. The code fingerprint used by this method cannot be decoded or copied, so it stays beyond the reach of counterfeiters. Furthermore, by extracting local feature points and removing the common local feature points, the amount of data to be processed is reduced and the speed of anti-counterfeiting verification is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or technical solutions in the prior art, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments described in the present invention, and other drawings can be obtained by those skilled in the art according to the drawings.
FIG. 1 is a schematic flow chart of a first embodiment of the code fingerprint anti-counterfeiting method according to the present invention;
FIG. 2 is a block diagram of a second embodiment, the code fingerprint anti-counterfeiting system according to the present invention;
FIG. 3 is a schematic flow chart of a third embodiment of the code fingerprint anti-counterfeiting method, including the standardization preprocessing step, according to the present invention;
FIG. 4 is an image of a marked area of a commodity to be tested, using a bar code as an example;
FIG. 5 is an image after binarization processing of the image shown in FIG. 4;
FIG. 6 is an image of the image of FIG. 5 after a grid cut process;
FIG. 7 is an image after the image convolution process shown in FIG. 6;
FIG. 8 is a schematic image of the connected-component identification processing applied to the image shown in FIG. 7;
FIG. 9 is a schematic image of the component-orientation determination processing applied to the image of FIG. 7;
FIG. 10 is an image of FIG. 9 after the component-orientation determination processing;
FIG. 11 shows the bounding box region calculated for the image of FIG. 10;
FIG. 12 is the mark image extracted from within the bounding box shown in FIG. 11;
FIG. 13 is an image of a to-be-detected commodity mark area image after graying processing, taking a two-dimensional code as an example;
FIG. 14 is an image of the image of FIG. 13 after average highlight processing;
FIG. 15 is a schematic diagram of a Gaussian filter template;
FIG. 16 is a representation of the image of FIG. 14 after Gaussian smoothing filtering;
FIG. 17 is a schematic view of Gx and Gy in the XY-direction gradient difference processing;
FIG. 18 is an XY-direction gradient difference processed image of the image shown in FIG. 16;
FIG. 19 is a schematic diagram of neighboring pixels of (x, y);
FIG. 20 is the image of FIG. 18 after the mean filtering process;
FIG. 21 is an image of the image of FIG. 20 after binarization processing;
FIG. 22 is the image of FIG. 21 after the closing operation;
FIG. 23 is an image of the image of FIG. 22 after the erosion process;
FIG. 24 is an image of the image of FIG. 23 after three dilation operations;
FIG. 25 is the image of FIG. 24 after a bounding box has been created;
FIG. 26 is an image of the image of FIG. 25 after maximum filtering;
FIG. 27 shows the mark image extracted from within the bounding box of the image of FIG. 26;
FIG. 28 is a schematic flow chart of a fourth embodiment of the standardization preprocessing step according to the present invention;
FIG. 29 is a schematic diagram of detecting a position of an extremum point and an adjacent point in a scale space;
FIG. 30 is a schematic diagram of the scale feature descriptor;
fig. 31 is a schematic view of a gradient direction histogram.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more clearly understood, the following embodiments are described in further detail with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The invention provides a code fingerprint anti-counterfeiting method. Referring to fig. 1, a code fingerprint anti-counterfeiting method according to an embodiment of the present invention includes the following steps:
a code fingerprint database establishing step S100, which comprises the steps of collecting commodity mark area images, establishing key area characteristic code fingerprints according to the commodity mark area images, and establishing a key area characteristic code fingerprint database by a plurality of key area characteristic code fingerprints of the same type;
step S200 of establishing a code fingerprint of a commodity to be detected, which comprises the steps of collecting a mark area image of the commodity to be detected, and establishing a characteristic code fingerprint of a key area to be detected according to the mark area image of the commodity to be detected;
and a retrieval verification step S300, which comprises retrieving a key area feature code fingerprint database according to the key area feature code fingerprint to be detected and judging the authenticity of the commodity to be detected according to the retrieval result.
Although existing printing technology is very advanced and a wide variety of printed products can be produced according to design, counterfeiters exploit the same advanced printing technology, and the counterfeits they produce are convincing enough to pass for genuine. Although many anti-counterfeiting technologies exist, such as anti-counterfeiting paper, anti-counterfeiting ink, special printing, security seals, watermarks, laser holographic anti-counterfeiting, commodity bar codes, microelectronic chips, radio frequency electronic tags and two-dimensional codes, ordinary consumers still cannot conveniently and accurately judge the authenticity of commodities with them; these anti-counterfeiting technologies and verification methods are either inconvenient or can be cracked and exploited by counterfeiters, so a consumer facing a commodity cannot verify whether it is genuine.
The invention is based on the printing quality differences that exist in current printing technology: printing color changes caused by the blocking plate, ink-carrying variations caused by wear of the printing plate, temperature and humidity changes in the printing environment, and variations in drying speed. These common factors give printed matter subtle printing feature differences that can be observed with optical equipment; like human fingerprints, these differences are unique and cannot be imitated. Just as no two identical leaves exist in the world, no two identical printed matters exist in the world.
The code fingerprint anti-counterfeiting method collects the commodity mark area image with optical equipment so that it contains these printing feature differences, processes the image to establish a uniquely identifying code fingerprint, and stores the code fingerprint in the code fingerprint database. By collecting the corresponding mark area image of a commodity to be detected, it can be judged whether that commodity has a unique code fingerprint in the database: if the corresponding code fingerprint exists, the commodity is judged genuine, and if it does not, the commodity is judged false. The code fingerprint used by this method cannot be decoded or copied, so it stays beyond the reach of counterfeiters.
During printing, the commodity mark area acquires printing features that the human eye cannot directly observe, caused by various factors; these printing features may include local printing discontinuities, edge linearity, foreign labels, aspect ratio, feature shape and contrast, aspect ratio changes, and feature position and size. Because the causes of these printing features are accidental, a counterfeiter cannot copy the printing features of a specific commodity, that is, cannot copy that commodity's identity features, so copying is prevented at the source. Although the human eye cannot directly observe the printing features, optical equipment can observe and record them. The code fingerprint anti-counterfeiting method of the invention collects the commodity mark area image and establishes key area feature code fingerprints from these printing features, so that even identical marks yield different identities; once the key area feature code fingerprints of produced commodities are established, an identity-based anti-counterfeiting system is formed.
A key point of the code fingerprint anti-counterfeiting method is the collection of the commodity mark area image. If a commodity produced by the manufacturer has not had its mark area image collected and its key area feature code fingerprint established under this method, the anti-counterfeiting judgment for that genuine commodity will be false. The collection of commodity printing features should therefore be carried out strictly, avoiding missed collection. Optionally, the commodity mark area image may be collected by optical equipment immediately after commodity packaging is completed. Optionally, the image of the commodity mark area on the package may be collected before packaging is finished; in this case attention should be paid to the handling of discarded packages: on the one hand, discarded packages should be physically destroyed in time, for example by crushing, to prevent them from being recycled; on the other hand, the stored key area feature code fingerprints corresponding to scrapped packages should be removed in time, to avoid storing excessive useless code fingerprint information.
As an optional implementation, the commodity mark area image is a characteristic area on the commodity package that distinguishes the commodity from others, and the to-be-detected commodity mark area image is the corresponding area image on the commodity to be detected. For example, the product package may carry a mark such as a cartoon pattern or the product name. Preferably, to improve the accuracy of anti-counterfeiting verification and the convenience of processing, the distinguishing characteristic may be any one, or a combination, of a commodity bar code, a commodity two-dimensional code and a commodity trademark.
It should be noted that the method for establishing the to-be-detected key area feature code fingerprint from the to-be-detected commodity mark area image is the same as, or substantially the same as, the method for collecting the commodity mark area image and establishing the key area feature code fingerprint from it, so the related steps for establishing the code fingerprint of the commodity to be detected are not repeated here.
As an alternative embodiment, after the retrieval verification step S300, the method further includes the following step:
a code fingerprint database updating step S400, which comprises updating the key area feature code fingerprint database according to the judgment result when the judgment result for the commodity to be detected is true. The update may delete the key area feature code fingerprint of the corresponding commodity from the database, may mark that fingerprint in the database, or may store the judgment result information of the corresponding commodity in the database and associate it with that commodity's key area feature code fingerprint.
Through the code fingerprint database updating step, counterfeiting by recycling commodity packaging can be prevented: even if a counterfeiter recycles a genuine package that has a corresponding key area feature code fingerprint, the anti-counterfeiting judgment will still be false, because that fingerprint has already been deleted or marked.
As an alternative embodiment, after the retrieval verification step S300, the method further includes:
a commodity tracking step S500, which comprises, when the judgment result for the commodity to be detected is true, updating the key area feature code fingerprint database according to the information of the user who requested the judgment result.
Since each commodity has a unique key area feature code fingerprint, commodity tracking can be performed on that basis. In other words, the code fingerprint anti-counterfeiting method can also serve as a commodity management and control system, with different user permissions and database update policies set for different users.
Optionally, when the user information shows that the requester is an end user, updating the key area feature code fingerprint database may mean deleting the key area feature code fingerprint of the corresponding commodity or marking it in the database, so that when a judgment result is requested again for the same fingerprint, the result returned is false.
Further optionally, in the code fingerprint database establishing step S100, the key area feature code fingerprint may further include, or be associated or linked with, information such as the commodity's production date, use-by date, lot number and batch. The key area feature code fingerprint database can then be updated in real time according to the actual date and the commodity's production or use-by date, and the producer can use these updates to control commodity quality and prevent expired commodities from entering circulation. Furthermore, when the user requesting the judgment result is an end user, the end user obtains not only whether the commodity is genuine but also the associated real production date, use-by date, lot number and other information, which avoids the use of expired commodities and prevents harm to consumers' rights from falsified production or use-by dates during circulation; consumers can also learn more about the goods from the information obtained.
Optionally, when the user information shows that the requester is a non-end user such as a distributor or carrier, the corresponding key area feature code fingerprint is updated based on the user information, for example by writing in the user's basic information, geographic position and anti-counterfeiting verification time, so that the commodity can be tracked and information such as whether goods have been mixed or diverted and the commodity's transportation state can be obtained, enabling timely and convenient commodity management and control.
Optionally, when the user information shows that the requester is the producer's warehouse management, the corresponding key area feature code fingerprint is updated based on the user information; the fingerprint may be marked with the warehouse-out or warehouse-in time according to the time of the user's retrieval verification and its purpose (outbound or inbound).
As an alternative embodiment, after the retrieval verification step S300, the method further includes:
an information feedback step S600, which comprises sending a feedback invitation to the user who requested the judgment result and obtaining that user's feedback information. For example, when the user is an end user, the method can interact with the user through information feedback and collect feedback such as user experience and commodity suggestions in time. When the requesting user is a non-end user, users are classified into behavior types in advance, and each user obtains the judgment result and feeds back commodity information according to its behavior type. For example, a user applying the method for inventory management obtains inventory information for the goods, such as warehousing time, storage location and outbound time, based on the collected commodity feature area images. A user applying the method for sales can preset information such as prices, perform operations such as checkout by collecting images of the commodity feature area (for example the commodity bar code), and obtain corresponding order information; the corresponding price information can then be updated to the key area feature code fingerprint database in real time to monitor commodity price information. In other words, the code fingerprint anti-counterfeiting method can also be applied cooperatively within supply chain management systems, commodity sales systems and the like.
As an alternative embodiment, after the retrieval verification step S300, the method further includes the following step:
a commodity information sharing step S700, which comprises updating in batches the key area feature code fingerprints of specific commodities in the key area feature code fingerprint database according to commodity sharing information; the commodity sharing information may be batch recall information, batch quality information or other information that users need to know. Preferably, this information sharing step is suited to goods used over a long period, such as automobiles, air conditioners and refrigerators. In this case, the update mode of deleting the corresponding key area feature code fingerprint, described in the code fingerprint database updating step, is not suitable.
As an alternative embodiment, after the retrieval verification step S300, the method further includes the following step:
an abnormal user behavior marking step S800, which comprises obtaining the user's behavior from the information of the user requesting the judgment result, judging whether that behavior is abnormal, and marking the corresponding user according to the result. To prevent malicious competitors from using the code fingerprint anti-counterfeiting method to perform commodity authenticity verification in large batches and thereby disrupt its normal use, abnormality can be judged from the user's behavior; for example, a user who performs a large number of anti-counterfeiting verifications under an end-user identity within a preset period can be marked as an abnormal user and restricted from continuing to use the method for commodity authenticity verification.
Further optionally, the code fingerprint anti-counterfeiting method can require non-end users to authenticate before use, to prevent unlawful non-end users from interfering illegitimately, for example by providing false code fingerprint database tracking information.
A second aspect of the present invention provides a code fingerprint anti-counterfeiting system; referring to FIG. 2, it includes a factory end, and the factory end includes the following modules:
an image acquisition module: used for collecting commodity mark area images;
a data processing module: used for receiving the commodity mark area image and establishing the commodity key area feature code fingerprint according to it, and for receiving the to-be-detected commodity mark area image sent by the client and establishing the to-be-detected commodity key area feature code fingerprint according to it, or receiving the to-be-detected commodity key area feature code fingerprint sent by the client;
a data storage module: used for storing the commodity key area feature code fingerprint database formed by the key area feature code fingerprints of a plurality of commodities of the same type;
a retrieval verification module: used for retrieving the commodity key area feature code fingerprint database according to the to-be-detected commodity key area feature code fingerprint and judging the authenticity of the commodity to be detected according to the retrieval result.
The above code fingerprint anti-counterfeiting system requires only simple equipment: an image acquisition module is added to the existing production system, and commodities on the production line are sampled by the image acquisition module before being packed into cases. Nothing is added to the original commodity packaging, and no adjustment or change to existing production equipment or packaging methods is required, so the original production line is unaffected. Working with the data processing module and the data storage module, the image acquisition module quickly records and computes the original commodity mark into a unique code fingerprint; any commodity that has not been collected and processed by the image acquisition module is identified as false, stopping imitation at the source. Correspondingly, the user can verify a commodity simply by collecting its image; verification is simple and fast and yields definite authenticity information.
Optionally, the image acquisition module may be an optical device such as an industrial camera disposed in the commodity production line to collect the commodity mark area image. The enterprise's original production line does not need to be changed and its original production process is not affected; an acquisition frequency of 700 captures per minute can be achieved, acquisition efficiency is high, and the module can be customized to the production line.
Alternatively, the data processing module, the data storage module and the retrieval verification module can be integrated on a computing device or a server, and can also be other devices for realizing the purposes.
As an optional implementation manner, the data storage module is further configured to update the product key area feature code fingerprint database according to the determination result when the determination result of the product to be tested is true.
As an optional implementation manner, the data processing module is further configured to receive, when the determination result of the to-be-detected commodity is true, user information that requests to obtain the determination result and is sent by the client, and update the commodity key area feature code fingerprint database according to the user information.
As an optional implementation manner, the code fingerprint anti-counterfeiting system further comprises a client:
the client is used for collecting the marked area image of the commodity to be detected and sending the collected marked area image of the commodity to be detected to the factory, or
The client is used for collecting the mark area image of the commodity to be detected, establishing the key area characteristic code fingerprint of the commodity to be detected according to the collected mark area image of the commodity to be detected and sending the key area characteristic code fingerprint of the key area characteristic code of the commodity to be detected to the factory end.
The client can be arranged at the mobile terminal as required, for example, the client can be an APP installed in a smart phone, an independent client, or a terminal which is integrated with WeChat, Payment treasured and the like and can collect images and scan codes. For example, the user can use the code scanning function of the WeChat to collect the image of the marked area of the commodity to be tested and send the image to the factory for verifying the authenticity of the commodity to be tested. In addition, the user can also use the independently developed client to collect the marked area image of the commodity to be detected, establish the key area characteristic code fingerprint of the key area characteristic code of the commodity to be detected according to the collected marked area image of the commodity to be detected and send the key area characteristic code fingerprint to the factory end.
In one mode of application of the code fingerprint anti-counterfeiting system, a communication link is established between the factory end and the client. The client is mainly used for collecting the mark area image of the commodity to be detected, sending it to the factory end, and receiving the factory end's retrieval verification result. The factory end is mainly used for collecting commodity mark area images, establishing key area feature code fingerprints according to them, and building the key area feature code fingerprint database from the key area feature code fingerprints of a plurality of commodities of the same type; receiving the to-be-detected commodity mark area image collected by the client and establishing the to-be-detected key area feature code fingerprint from it; and then retrieving the key area feature code fingerprint database according to the to-be-detected key area feature code fingerprint, judging the authenticity of the commodity to be detected according to the retrieval result, and sending the retrieval judgment to the client. This mode of application places low demands on the client device and is preferably used by end users such as consumers.
In another mode of application, a communication link is likewise established between the factory end and the client, but the client is mainly used for collecting the mark area image of the commodity to be detected, establishing the to-be-detected key area feature code fingerprint from it, sending that fingerprint to the factory end, and receiving the factory end's retrieval verification result. The factory end collects commodity mark area images, establishes key area feature code fingerprints and builds the key area feature code fingerprint database as above; receives the to-be-detected key area feature code fingerprint sent by the client; and then retrieves the database according to it, judges the authenticity of the commodity to be detected according to the retrieval result, and sends the retrieval judgment to the client. This mode places slightly higher demands on the client's equipment and is preferably suited to non-end users such as suppliers, distributors and retailers for verifying commodity authenticity and controlling each link of the supply chain.
In other embodiments, the client may be a device including an image capture module and a computing device or server, i.e., the client may also be the same device as the factory side.
In addition, the client may also develop different functions and categories according to different users, for example, when the user type is a non-terminal customer such as a middleman, a carrier, and the like, the client may include a management function module for warehousing, ex-warehouse, selling, and the like. When the user type is a terminal client such as a consumer, the client can comprise functional modules such as information feedback and the like.
As an optional implementation manner, the client is further configured to collect user information and send the user information to the factory side.
It should be noted that the code fingerprint anti-counterfeiting system is established based on the code fingerprint anti-counterfeiting method, and all the specific steps of the code fingerprint anti-counterfeiting method can be realized through the code fingerprint anti-counterfeiting system. The specific implementation manner can be selected according to different situations, and is not limited to the method described in the following embodiment of the present invention. The implementation manner and effect of the code fingerprint anti-counterfeiting method can be realized in the code fingerprint anti-counterfeiting system, and the same contents will not be described in detail in the above description.
The third aspect of the present invention provides a code fingerprint anti-counterfeiting method, as shown in fig. 3, including the following steps:
a code fingerprint database establishing step S100, which comprises the steps of collecting commodity mark area images, establishing key area characteristic code fingerprints according to the commodity mark area images, and establishing key area characteristic code fingerprint databases by key area characteristic code fingerprints of a plurality of commodities of the same type;
step S200 of establishing a code fingerprint of a commodity to be detected, which comprises the steps of collecting a mark area image of the commodity to be detected, and establishing a characteristic code fingerprint of a key area to be detected according to the mark area image of the commodity to be detected;
a retrieval verification step S300, which comprises retrieving a key area feature code fingerprint database according to the key area feature code fingerprint to be detected and judging the authenticity of the commodity to be detected according to the retrieval result;
after the step of acquiring the mark area image, the method also comprises a step of carrying out standardization preprocessing on the mark area image so that the mark area image has standardized shape, size and shape stretching conversion; the mark area image is a commodity mark area image or a commodity mark area image to be detected.
In this code fingerprint anti-counterfeiting method, standardization preprocessing of the mark area image reduces the differences in the resulting code fingerprint caused by different acquisition equipment and acquisition conditions, reduces the amount of computation needed to establish the key area feature code fingerprint from the mark area image, and improves the efficiency and accuracy of the verification result; in addition, it lowers the method's requirements on equipment and environment and so broadens its range of application.
As an optional implementation, the collecting of the commodity sign area image is performed by any one of a mobile terminal, a camera and a video camera.
Generally, non-end users such as factories and middlemen have requirements on the efficiency and accuracy of collecting commodity mark area images and can use efficient image acquisition equipment such as industrial cameras. Moreover, because non-end users collect commodity mark area images on a production line where the relative positions of the commodity and the acquisition equipment are fixed, the collected images are also relatively consistent, so a bounding box can be created directly and the mark image within it extracted. Optionally, to make verification convenient for end users, a mobile terminal may be used to verify commodity authenticity with the code fingerprint anti-counterfeiting method; for example, a smartphone may be used. However, because users' mobile terminals vary widely in type and performance, the collected to-be-detected commodity mark area images often differ greatly in size, pixel count and other respects, and standardization processing improves the convenience and accuracy of the verification. In the following, standardization preprocessing is described taking as an example the collection of the to-be-detected commodity mark area image with a code-scanning smartphone APP; in other cases, for example for the commodity mark area image collected at the factory end, the image can be further processed with the same standardization steps, which are not repeated here.
In the following embodiment, taking the mark of the commodity to be detected as a bar code as an example, the collected to-be-detected commodity mark area image is shown in FIG. 4, and the standardization preprocessing of this image includes the following steps:
a binarization processing step, comprising performing binarization processing on the commodity mark area image to obtain the binarized commodity mark area image shown in FIG. 5.
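The patent does not name a particular thresholding method; as one plausible reading, the binarization step could be carried out with Otsu's method, as sketched below.

```python
# One plausible binarization of the captured label image (Otsu's threshold);
# the patent does not specify which thresholding method is used.
import cv2

def binarize_label(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return binary
```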
Optionally, when the size, pixel count and other conditions of the binarized commodity mark area image are sufficiently standard, the mark image extraction step can be performed directly: a bounding box is created for the binarized commodity mark area image and the mark image within the bounding box is extracted.
However, because of limitations of the acquisition device and acquisition environment, the to-be-detected commodity mark area image collected by a user through a smartphone APP often suffers from tilt, shadow interference and other factors that make it inconvenient to extract the mark area (bar code). Therefore, after the step of binarizing the commodity mark area image, the method preferably includes the following steps:
a grid cutting processing step, comprising grid-cutting the binarized commodity mark area image; the result of this step is shown in FIG. 6. The grid-cut binarized image is then subjected, in sequence, to convolution processing, connected-component identification processing and component-orientation determination processing, which are described in turn below.
In convolution processing, since information in a natural image is carried at different frequencies, with high frequencies usually encoding detail and lower frequencies usually encoding the overall structure, the output of a convolution can be viewed as a mixture of information at different frequencies. The present invention applies a convolution kernel (filter) to each pixel of the input image to obtain a feature map; the image after convolution processing is shown in FIG. 7. The purpose of the convolution processing is to process the low-frequency components of the corresponding tensor efficiently.
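As a sketch of this step, a generic low-pass (averaging) kernel applied with cv2.filter2D is shown below; the kernel itself is an assumption, since the text does not specify which convolution kernel the inventors use.

```python
# Assumed averaging kernel; the patent only states that a convolution kernel
# is applied to bring out the low-frequency structure.
import cv2
import numpy as np

def convolve_low_frequency(binary_img, ksize=5):
    kernel = np.ones((ksize, ksize), dtype=np.float32) / (ksize * ksize)
    return cv2.filter2D(binary_img, -1, kernel)   # ddepth=-1 keeps the input depth
```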
In the connected-component identification processing, if the cells of a region have mutually parallel directions and features, the region is considered a possible bar code skeleton; if not, it is not considered a bar code skeleton. As shown in FIG. 8, the cells at A share the same characteristics and are taken to be part of the bar code skeleton, while the cells at B do not share the same characteristics, are not considered a bar code skeleton, and are therefore not part of the bar code.
In the component-orientation determination processing, after the preceding step, the grid-cut image contains relatively independent small data cells carrying orientation information, as shown in FIG. 9; the bar code necessarily contains skeletons of mutually parallel features, as at C and D in the figure, but D is a false detection. During this processing, cells that are adjacent and share common characteristics can be regarded as part of the bar code to be extracted: first the groups of adjacent cells with common characteristics are extracted, such as positions C and D in the figure, and then the groups are ranked by the number of cells they contain and the largest group (position C in the figure) is selected. The processing result is shown in FIG. 10. This processing step of course still involves much useful information, which the following steps can call upon again.
After the above processing steps, unnecessary interference information has been removed and all the information needed to determine the position of the bar code has been obtained.
Optionally, the step of extracting the marker image is performed after the step of binarizing processing or the step of determining the orientation of the component, and includes creating a bounding box for the binarized image of the commodity marker region, and extracting the marker image within the bounding box.
A minimum bounding box spanning all the patches in a group is created in the image. First, the average angle of the contained patches is calculated and used to rotate the cells by exactly that angle. The bounding box is then computed from the outermost corners of all patches, giving the bounding-box area shown in fig. 11. Finally, the bounding box is rotated back in the opposite direction to return it to the original orientation, and the sign image within it is extracted, as shown in fig. 12.
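A minimal sketch of this rotate-and-crop idea is given below, assuming OpenCV and NumPy; treating every non-zero pixel as one patch group and the use of cv2.minAreaRect are simplifications of this example rather than the patent's exact procedure.

```python
import cv2
import numpy as np

def extract_marker_roi(binary_img):
    """Rotate the marker patches upright by their average angle and crop the
    bounding box around them (all non-zero pixels are treated as one group)."""
    ys, xs = np.nonzero(binary_img)
    pts = np.column_stack((xs, ys)).astype(np.float32)
    (cx, cy), (w, h), angle = cv2.minAreaRect(pts)  # box centre, size and angle

    # rotate the image so the box becomes axis-aligned (OpenCV's angle
    # convention may require an extra 90-degree adjustment for some boxes)
    rot = cv2.getRotationMatrix2D((cx, cy), angle, 1.0)
    upright = cv2.warpAffine(binary_img, rot, binary_img.shape[::-1])

    x0 = max(int(cx - w / 2), 0)
    y0 = max(int(cy - h / 2), 0)
    return upright[y0:y0 + int(h), x0:x0 + int(w)]
```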
In the following embodiment, taking a two-dimensional code as the sign of the commodity to be detected as an example, the standardized preprocessing of the to-be-detected commodity sign area image includes the following steps:
graying, average highlight processing, Gaussian smoothing filtering, XY-direction gradient difference processing and mean filtering are performed on the commodity sign area image.
The collected image of the marked area of the to-be-detected commodity after the graying processing is shown in fig. 13.
After graying, average highlight processing is performed. For a picture of width W and height H, the average brightness of the picture is L(Bri), as given by the following formula, where poi denotes the brightness value at the coordinates (x, y).
$$L(Bri) = \frac{1}{W \cdot H} \sum_{x=1}^{W} \sum_{y=1}^{H} poi(x, y)$$
The image is divided into 8 × 8 small blocks and scanned; the average brightness of each sub-block is computed, and the sub-block average brightness matrix is obtained according to the position of each sub-block, as shown below.
$$L(Bri\_block) = \begin{pmatrix} L(Bri\_block)_{1,1} & \cdots & L(Bri\_block)_{1,n} \\ \vdots & \ddots & \vdots \\ L(Bri\_block)_{m,1} & \cdots & L(Bri\_block)_{m,n} \end{pmatrix}$$

where each entry is the average brightness of the corresponding sub-block.
Each value in the sub-block brightness matrix then has the global average brightness subtracted from it, giving the sub-block brightness difference matrix: the difference is positive for sub-blocks in high-brightness regions and negative for sub-blocks in low-brightness regions. This yields the full-image brightness difference matrix L(Bri_block) − L(Bri).
The corresponding value of this difference matrix is subtracted from each pixel of the original image; if the result exceeds the highest brightness of the whole image, the highest brightness is used instead, and if it falls below the lowest brightness of the whole image, the lowest brightness is used.
The image after the average highlight processing is shown in fig. 14.
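The average highlight step can be sketched as follows with NumPy; reading "8 × 8 small blocks" as 8 × 8 pixel sub-blocks, along with the function name and the in-place loop, is an assumption of this example rather than the patent's exact procedure.

```python
import numpy as np

def average_highlight(gray, block=8):
    """Block-wise brightness equalization: subtract each sub-block's brightness
    difference from the global mean, then clamp to the picture's own range."""
    h, w = gray.shape
    global_mean = gray.mean()                       # L(Bri)
    out = gray.astype(np.float32)

    for by in range(0, h, block):
        for bx in range(0, w, block):
            sub = out[by:by + block, bx:bx + block]
            diff = sub.mean() - global_mean         # positive in bright areas, negative in dark ones
            sub -= diff                             # slice views modify `out` in place

    return np.clip(out, gray.min(), gray.max()).astype(gray.dtype)
```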
After the average highlight processing, Gaussian smoothing filtering is applied to the image. The Gaussian function used is as follows:
$$G(x, y) = \frac{1}{2\pi\sigma^{2}}\, e^{-\frac{x^{2}+y^{2}}{2\sigma^{2}}}$$
where (x, y) is a point coordinate, which may be considered an integer in image processing; σ is the standard deviation.
To obtain the template of a Gaussian filter, the Gaussian function is discretized and the resulting function values are used as the template coefficients. For example, to generate the 3 × 3 Gaussian filter template shown in fig. 15, sampling is performed with the centre of the template as the coordinate origin; the coordinates of each template position, as shown in fig. 15, run horizontally to the right along the x axis and vertically downward along the y axis.
Substituting the coordinates of each position into the Gaussian function gives the template coefficients. For a window template of size (2k+1) × (2k+1), the value of the element at position (i, j) is calculated as follows:
$$H_{i,j} = \frac{1}{2\pi\sigma^{2}}\, e^{-\frac{(i-k-1)^{2} + (j-k-1)^{2}}{2\sigma^{2}}}$$
The template calculated in this way has two forms: decimal and integer. The decimal form uses the directly calculated values without further processing; the integer form is normalized so that the value at the top-left corner of the template becomes 1.
When an integer template is used, a coefficient is placed in front of the template:

$$\frac{1}{\sum_{i,j} H_{i,j}}$$

i.e. the reciprocal of the sum of the template coefficients.
Fig. 16 shows the result of gaussian smoothing processing with 3 × 3 and σ equal to 0.
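A small sketch of generating such a template is shown below (NumPy assumed); k = 1 gives the 3 × 3 case, and the σ value is only an example.

```python
import numpy as np

def gaussian_template(k=1, sigma=1.5):
    """Sample the 2-D Gaussian at every coordinate of a (2k+1)x(2k+1) template
    and normalize so the coefficients sum to 1 (the decimal form above)."""
    ys, xs = np.mgrid[-k:k + 1, -k:k + 1]
    tpl = np.exp(-(xs ** 2 + ys ** 2) / (2.0 * sigma ** 2)) / (2.0 * np.pi * sigma ** 2)
    return tpl / tpl.sum()
```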
After Gaussian smoothing filtering, XY-direction gradient difference processing is performed on the image. As shown in fig. 17, Gx is the horizontal operator and Gy is the vertical operator.
If the original image is recorded as f, then
GX=Gx*f
GY=Gy*f
Gx=-1*f(x-1,y-1)+0*f(x,y-1)+1*f(x+1,y-1)+(-2)*f(x-1,y)+0*f(x,y)+2*f(x+1,y)+(-1)*f(x-1,y+1)+0*f(x,y+1)+1*f(x+1,y+1)
Gy=1*f(x-1,y-1)+2*f(x,y-1)+1*f(x+1,y-1)+0*f(x-1,y)+0*f(x,y)+0*f(x+1,y)+(-1)*f(x-1,y+1)+(-2)*f(x,y+1)+(-1)*f(x+1,y+1)
GX and GY are the results of convolving the original image with the respective templates.
For each pixel of the original image, the above convolution is performed within a 3 × 3 template to obtain GX and GY, and the gray value of the pixel is finally approximated as:
G=|GX|+|GY|
if G is greater than a threshold, then the point is considered an edge point.
The above processing may be performed in both directions simultaneously, or in only one direction when the edge information in that direction of the image needs to be highlighted. Fig. 18 shows the image of the two-dimensional code of this embodiment after the XY-direction gradient difference processing.
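A compact sketch of the XY-direction gradient difference step, using OpenCV's Sobel operators; the threshold value is an assumed example.

```python
import cv2
import numpy as np

def xy_gradient(gray, threshold=80):
    """Approximate the gradient magnitude as |G| = |GX| + |GY| with the 3x3
    horizontal and vertical operators, then mark points above the threshold
    as edge points."""
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)   # horizontal operator Gx
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)   # vertical operator Gy
    g = np.abs(gx) + np.abs(gy)
    return (g > threshold).astype(np.uint8) * 255
```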
Mean filtering is then applied to the image after the XY-direction gradient difference processing. Mean filtering selects, for the pixel currently being processed, a template composed of the pixel and several of its neighbours, and replaces the original pixel value with the mean of the template:
$$g(x, y) = \frac{1}{M} \sum_{(i, j) \in S} f(i, j)$$

where S is the template neighbourhood centred on (x, y) and M is the number of pixels it contains.
As shown in fig. 19, pixels 1 to 8 are the neighbours of (x, y); the filtered value is:
g=(f(x-1,y-1)+f(x,y-1)+f(x+1,y-1)+f(x-1,y)+f(x,y)+f(x+1,y)+f(x-1,y+1)+f(x,y+1)+f(x+1,y+1))/9
which corresponds to the weight coefficient template

$$\frac{1}{9}\begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}$$
After the mean filtering, the image with the high-frequency noise removed is shown in fig. 20.
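In OpenCV the 3 × 3 mean filter above is a one-liner (a sketch, not the patent's code):

```python
import cv2

def mean_filter(img):
    """Replace each pixel with the average of itself and its 8 neighbours,
    equivalent to the 1/9 weight template above."""
    return cv2.blur(img, (3, 3))
```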
The image is then binarized. Optionally, binarization uses Otsu's method (OTSU), an algorithm for determining the binary segmentation threshold of an image. An exhaustive search finds the threshold that minimizes the within-class variance, defined as the weighted sum of the variances of the two classes:
$$\sigma_{w}^{2}(t) = \omega_{1}(t)\,\sigma_{1}^{2}(t) + \omega_{2}(t)\,\sigma_{2}^{2}(t)$$
The weights $\omega_{i}$ are the probabilities of the two classes separated by the threshold t, and $\sigma_{i}^{2}$ are the variances of these two classes. Otsu showed that minimizing the within-class variance is equivalent to maximizing the between-class variance:
$$\sigma_{b}^{2}(t) = \sigma^{2} - \sigma_{w}^{2}(t) = \omega_{1}(t)\,\omega_{2}(t)\left[\mu_{1}(t) - \mu_{2}(t)\right]^{2}$$
This is expressed in terms of the class probabilities $\omega_{i}$ and class means $\mu_{i}$. The class probability $\omega_{1}(t)$ is computed from the histogram with threshold t:
$$\omega_{1}(t) = \sum_{i=0}^{t} p(i)$$
and the class mean $\mu_{1}(t)$ is:
$$\mu_{1}(t) = \frac{1}{\omega_{1}(t)} \sum_{i=0}^{t} p(i)\, x(i)$$
where x(i) is the value at the centre of the i-th histogram bin. Similarly, $\omega_{2}(t)$ and $\mu_{2}(t)$ of the right-hand part of the histogram can be computed from the bins greater than t. The class probabilities and class means can be computed iteratively, which results in an efficient algorithm.
The Otsu algorithm derives a threshold on the 0–1 range for the dynamic range of pixel intensities present in the image. Fig. 21 shows the binary image obtained with the Otsu method.
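The exhaustive Otsu search can be sketched in plain NumPy as below; in practice cv2.threshold with the THRESH_OTSU flag yields the same threshold. The histogram-probability notation follows the formulas above.

```python
import numpy as np

def otsu_threshold(gray):
    """Pick the threshold t maximizing the between-class variance
    w1*w2*(mu1-mu2)^2, which equals minimizing the within-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()                              # p(i)

    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w1, w2 = p[:t].sum(), p[t:].sum()              # class probabilities
        if w1 == 0.0 or w2 == 0.0:
            continue
        mu1 = (np.arange(t) * p[:t]).sum() / w1        # class mean of the left part
        mu2 = (np.arange(t, 256) * p[t:]).sum() / w2   # class mean of the right part
        between = w1 * w2 * (mu1 - mu2) ** 2
        if between > best_var:
            best_t, best_var = t, between
    return best_t
```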
After the commodity sign area image has been binarized by the above method to obtain the binarized image of the commodity sign area, the binarized image is subjected to closing, erosion and dilation.
The closing operation fills the gaps between the dots of the two-dimensional code to make the features clearer; the processed image is shown in fig. 22. Erosion is then performed to remove relatively isolated dots; the processed image is shown in fig. 23. Dilation is further performed to fill gaps and make the morphological features clearer; in this embodiment three dilations are used, and the processed image is shown in fig. 24. In other embodiments, the dilation may be applied consecutively 1 to 3 times.
After the dilation, a bounding box is created for the commodity sign area image, as shown in fig. 25. The binarized image of the commodity sign area with the bounding box established is then filtered so that only the maximum-area region is kept, as shown in fig. 26. Fig. 27 shows the sign image extracted within the bounding box after this maximum-area filtering.
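The closing–erosion–dilation sequence maps directly onto OpenCV's morphology functions; the 3 × 3 structuring element below is an assumed example.

```python
import cv2
import numpy as np

def clean_marker_mask(binary):
    """Closing fills the gaps between the code dots, erosion removes isolated
    dots, and three dilations make the shape more solid, as described above."""
    kernel = np.ones((3, 3), np.uint8)
    closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
    eroded = cv2.erode(closed, kernel)
    return cv2.dilate(eroded, kernel, iterations=3)
```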
It should be noted that, although this embodiment takes a two-dimensional code as the sign of the commodity to be tested, the same series of processing steps is also applicable to the standardized preprocessing of barcodes whose background interference is less severe.
Further optionally, the to-be-detected commodity sign area image and the commodity sign area image use the same standardized preprocessing operations. The standardized preprocessing produces to-be-detected commodity sign area images or commodity sign area images of uniform shape, uniform size and nearly uniform stretching, which makes it convenient to establish the key area feature code fingerprint from either image.
The fourth aspect of the invention provides a code fingerprint anti-counterfeiting method, which comprises the following steps:
a code fingerprint database establishing step S100, which comprises the steps of collecting commodity mark area images, establishing key area characteristic code fingerprints according to the commodity mark area images, and establishing key area characteristic code fingerprint databases by key area characteristic code fingerprints of a plurality of commodities of the same type;
step S200 of establishing a code fingerprint of a commodity to be detected, which comprises the steps of collecting a mark area image of the commodity to be detected, and establishing a characteristic code fingerprint of a key area to be detected according to the mark area image of the commodity to be detected;
a retrieval verification step S300: searching a key area feature code fingerprint database according to the key area feature code fingerprint to be detected, and judging the authenticity of the commodity to be detected according to the search result;
the method comprises the following steps of establishing a key area characteristic code fingerprint according to a commodity mark area image and establishing a key area characteristic code fingerprint to be detected according to a commodity mark area image, wherein the key area characteristic code fingerprint to be detected respectively comprises the following steps:
extracting local characteristic points of the mark region image;
removing general local characteristic points in the local characteristic points, and forming a remarkable local characteristic point diagram by the residual local characteristic points;
acquiring a block detailed feature vector according to the significant locality feature point diagram;
establishing a key area feature code fingerprint according to the significant locality feature point diagram and the block detailed feature vector;
the mark area image is a commodity mark area image or a commodity mark area image to be detected, and the key area feature code fingerprint is a key area feature code fingerprint or a key area feature code fingerprint to be detected.
In this code fingerprint anti-counterfeiting method, the commodity mark area image is captured with optical equipment so that it contains the printing feature differences; the image is processed by a certain method to establish a uniquely identifying code fingerprint, which is stored in the code fingerprint library. To verify a commodity, the corresponding mark area image of the commodity to be detected is captured and it is checked whether a matching unique code fingerprint exists in the code fingerprint library: if the corresponding code fingerprint exists, the commodity is judged genuine; if not, it is judged counterfeit. Because the code fingerprint of this method cannot be decoded or copied, counterfeiters cannot evade its prevention and control. Furthermore, extracting the local feature points and removing the general local feature points reduces the amount of data to be processed and speeds up anti-counterfeiting verification.
As an alternative implementation, the local feature points of the mark region image are extracted by the scale-invariant feature transform method.
The scale-invariant feature transform (SIFT) is an image processing method with scale invariance that can detect key points in an image; it is a local feature descriptor. The local feature points of the mark region image are extracted, and each local feature point (key point) is described by a 4 × 4 × 8 = 128-dimensional vector. Each key point carries position, scale and orientation information.
Optionally, the scale invariant feature transformation method for extracting the local feature points of the image of the marker region includes generating a scale space, detecting extreme points of the scale space, accurately positioning the extreme points, assigning a direction parameter to each key point, and generating a key point descriptor.
Scale space generation: the theoretical purpose of the scale space is to simulate the multi-scale characteristics of image data. The Gaussian convolution kernel is the only linear kernel that can realize scale transformation, and the scale space of a two-dimensional image is defined as:
L(x,y,σ)=G(x,y,σ)*I(x,y)
where G (x, y, σ) is a scale-variable Gaussian function.
$$G(x, y, \sigma) = \frac{1}{2\pi\sigma^{2}}\, e^{-\frac{x^{2}+y^{2}}{2\sigma^{2}}}$$
(x, y) are the spatial coordinates and σ is the scale coordinate. The value of σ determines the degree of smoothing: large scales correspond to the coarse outline features of the image, small scales to its detail features. A large σ corresponds to a coarse scale (low resolution) and, conversely, a small σ to a fine scale (high resolution).
A difference of Gaussian scale-space (DOG scale-space) is constructed as follows:
D(x,y,σ)=[G(x,y,kσ)-G(x,y,σ)]*I(x,y)=L(x,y,kσ)-L(x,y,σ)
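One octave of the difference-of-Gaussian scale space D(x, y, σ) = L(x, y, kσ) − L(x, y, σ) can be sketched as follows; σ = 1.6, k = √2 and five levels are assumed example values, not the patent's parameters.

```python
import cv2
import numpy as np

def dog_pyramid(gray, sigma=1.6, k=2 ** 0.5, levels=5):
    """Blur the image with increasing sigma and subtract neighbouring levels
    to obtain the DOG scale space of one octave."""
    gray = gray.astype(np.float32)
    gaussians = [cv2.GaussianBlur(gray, (0, 0), sigma * (k ** i)) for i in range(levels)]
    return [gaussians[i + 1] - gaussians[i] for i in range(levels - 1)]
```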
Scale-space extreme point detection: to find the extreme points of the scale space, each sample point is compared with all of its neighbours to check whether it is larger or smaller than its neighbours in both the image domain and the scale domain. As shown in fig. 29, the central detection point is compared with 26 points: its 8 neighbours at the same scale and the 9 × 2 points at the adjacent scales above and below, which ensures that extreme points are detected in both scale space and two-dimensional image space. If a point is the maximum or minimum among these 26 neighbours of the current, upper and lower layers of the DOG scale space, it is regarded as a feature point of the image at that scale.
Accurate localization of extreme points: the position and scale of each key point are determined precisely (to sub-pixel accuracy) by fitting a three-dimensional quadratic function. Since the DOG operator produces strong edge responses, key points with low contrast and unstable edge response points are removed at the same time, which improves matching stability and noise resistance. The spatial scale function is:
$$D(X) = D + \frac{\partial D^{T}}{\partial X} X + \frac{1}{2} X^{T} \frac{\partial^{2} D}{\partial X^{2}} X$$
Taking the derivative and setting it to zero gives the precise position of the extremum:
$$\hat{X} = -\left(\frac{\partial^{2} D}{\partial X^{2}}\right)^{-1} \frac{\partial D}{\partial X}$$
Among the detected feature points, those with low contrast and unstable edge response points are removed. To remove the low-contrast points, substitute equation (2) into equation (1), keeping only the first two terms:
$$D(\hat{X}) = D + \frac{1}{2} \frac{\partial D^{T}}{\partial X} \hat{X}$$
If $\left| D(\hat{X}) \right|$ is not smaller than the contrast threshold (0.03 in the standard SIFT formulation), the feature point is retained; otherwise it is rejected.
Edge response elimination: an extremum of a poorly defined difference-of-Gaussian operator has a large principal curvature across the edge and a small principal curvature in the direction perpendicular to the edge. The principal curvatures are computed from a 2 × 2 Hessian matrix H:
$$H = \begin{pmatrix} D_{xx} & D_{xy} \\ D_{xy} & D_{yy} \end{pmatrix}$$
The derivatives are estimated from differences between neighbouring sample points.
The principal curvatures of D are proportional to the eigenvalues of H. Let α be the largest eigenvalue and β the smallest; then:
Tr(H)=Dxx+Dyy=α+β
$$Det(H) = D_{xx} D_{yy} - \left(D_{xy}\right)^{2} = \alpha\beta$$
Let α = rβ; then:
$$\frac{Tr(H)^{2}}{Det(H)} = \frac{(\alpha+\beta)^{2}}{\alpha\beta} = \frac{(r\beta+\beta)^{2}}{r\beta^{2}} = \frac{(r+1)^{2}}{r}$$
If $\frac{Tr(H)^{2}}{Det(H)} > \frac{(r+1)^{2}}{r}$, the point is rejected (SIFT uses r = 10).
Here Dxx denotes the second derivative in the x direction of the image at a given scale in the DOG pyramid, and Dyy and Dxy are defined analogously.
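The edge-response test reduces to a single ratio check; a sketch (with r = 10 as in SIFT) is:

```python
def passes_edge_test(dxx, dyy, dxy, r=10.0):
    """Keep an extremum only when Tr(H)^2 / Det(H) <= (r+1)^2 / r."""
    tr = dxx + dyy
    det = dxx * dyy - dxy * dxy
    if det <= 0:                       # curvatures of opposite sign: reject
        return False
    return (tr * tr) / det <= (r + 1.0) ** 2 / r
```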
Orientation assignment: each key point is assigned an orientation parameter using the gradient direction distribution of its neighbouring pixels, so that the operator has rotation invariance.
$$m(x, y) = \sqrt{\left(L(x+1, y) - L(x-1, y)\right)^{2} + \left(L(x, y+1) - L(x, y-1)\right)^{2}}$$

$$\theta(x, y) = \arctan\frac{L(x, y+1) - L(x, y-1)}{L(x+1, y) - L(x-1, y)}$$
The two formulas above are, respectively, the modulus and the direction of the gradient at (x, y), where the scale used for L is the scale at which each key point is located. At this point the detection of the image key points is complete, and each key point carries three pieces of information: position, scale and direction. A SIFT feature region can then be determined from them.
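The two formulas translate directly into code; the NumPy sketch below computes the modulus and direction of the gradient at a single pixel of a smoothed image L.

```python
import numpy as np

def gradient_mag_ori(L, x, y):
    """Gradient modulus m(x, y) and direction theta(x, y) from central
    differences of the smoothed image L at the key point's scale."""
    dx = float(L[y, x + 1]) - float(L[y, x - 1])
    dy = float(L[y + 1, x]) - float(L[y - 1, x])
    return np.hypot(dx, dy), np.arctan2(dy, dx)
```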
Key point descriptor generation: the coordinate axes are first rotated to the orientation of the key point to ensure rotation invariance. An 8 × 8 neighbourhood centred on the feature point is taken as the sampling window, the directions of the sampling points relative to the feature point are accumulated, after Gaussian weighting, into 8-direction histograms, and 2 × 2 × 8 = 32-dimensional feature descriptors are finally obtained; a schematic diagram is shown in fig. 30.
Each small cell represents a pixel in the scale space of the feature point's neighbourhood; the arrow direction represents the pixel's gradient direction and the arrow length its magnitude. The 8-direction gradient histogram is then computed within each 4 × 4 window, and accumulating the values of each gradient direction forms a seed point.
As shown in fig. 31, each histogram covers 8 gradient directions, and each descriptor contains a 4 × 4 array of histograms located around the key point, which yields a 128-dimensional feature vector (4 × 4 histograms with 8 directions each, i.e. 4 × 4 × 8 = 128 dimensions). The vector is normalized to further remove the influence of illumination.
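In practice, the whole SIFT pipeline above is available in OpenCV; the sketch below extracts the key points (position, scale, orientation) and their 128-dimensional descriptors from a mark region image, assuming an OpenCV build (≥ 4.4) in which SIFT_create is available.

```python
import cv2

def extract_local_feature_points(marker_img):
    """Detect SIFT key points and compute their 128-dimensional descriptors."""
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(marker_img, None)
    return keypoints, descriptors       # descriptors: N x 128 array
```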
As an alternative embodiment, a general local feature point is a local feature point whose proportion among all local feature points of the mark region images of the same type of commodity exceeds a preset ratio. Through big-data learning, the general local feature points are gradually filtered out to reduce the number of key points. For example, if the proportion of the same local feature point among all the stored records exceeds a preset value of 50%, it is regarded as a general local feature point, removed, and not stored as a key point.
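A brute-force sketch of this filtering is given below; the descriptor distance threshold, the pairwise matching strategy and the 50% default ratio are assumptions of this example, not the patent's exact big-data procedure.

```python
import numpy as np

def filter_general_points(item_desc, all_items_desc, ratio=0.5, dist_thresh=150.0):
    """Drop a key point when a similar descriptor appears in more than `ratio`
    of the stored same-type items; the remaining points are the significant
    local feature points."""
    kept = []
    for d in item_desc:
        hits = sum(1 for other in all_items_desc
                   if np.linalg.norm(other - d, axis=1).min() < dist_thresh)
        if hits / max(len(all_items_desc), 1) <= ratio:
            kept.append(d)
    return np.array(kept)
```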
As an alternative embodiment, the block detailed feature vector is obtained from the significant local feature point map by the histogram of oriented gradients (HOG) feature detection method.
Around the designated key point positions, a lattice sufficient for the computation is constructed. With each key point as the centre, a 6 × 6 neighbourhood of small cells is taken; centred on this 6 × 6 neighbourhood, a 3 × 3 arrangement of larger neighbourhood cells is extracted, and the local detailed features, which contain the significant features within these larger cells, are extracted. This yields the size, orientation and contour information of the detection target.
The main idea of this process is that, in an image, the appearance and shape of a local object can be described well by the density distribution of gradient or edge directions; HOG is essentially gradient statistics, and gradients exist mainly at edges. In the implementation, the image is first divided into small connected regions called cell units, and a histogram of the gradient or edge directions of the pixels in each cell unit is then collected. Finally, these histograms are combined to form the feature descriptor.
To further improve performance, the local histograms are contrast-normalized over larger regions of the image (called blocks): the density of the histograms within a block is computed, and each cell unit in the block is normalized according to this density. This normalization gives better robustness to illumination changes and shadows. Feature extraction performs the following steps on the detection target or scanning window of the image:
1) graying (treating the image as a three-dimensional image in x, y, z (gray scale));
2) standardizing (normalizing) the color space of the input image by using a Gamma correction method; the method aims to adjust the contrast of the image, reduce the influence caused by local shadow and illumination change of the image and inhibit the interference of noise;
3) calculating the gradient (including magnitude and direction) of each pixel of the image; the method mainly aims to capture contour information and further weakens the interference of illumination;
4) dividing the image into small cells (e.g., 6 x 6 pixels/cell);
5) counting the gradient histogram (the number of different gradients) of each cell to form a descriptor of each cell;
6) grouping several cells into a block (for example, 3 × 3 cells per block) and concatenating the feature descriptors of all the cells in the block to obtain the block's feature description;
7) concatenating the feature descriptions of all blocks in the image (the target to be detected) to obtain the image's feature description. This is the final feature vector available for classification.
A sample image is divided into a number of cells, the gradient direction range is divided evenly into 9 intervals (bins), and a histogram of the gradient directions of all pixels in each cell is computed over these intervals, giving a 9-dimensional feature vector per cell. Every adjacent 2 × 2 cells (4 cells) form a block, and concatenating the feature vectors within a block gives a 36-dimensional feature vector. The sample image is scanned with this block using a stride of one cell, and finally the features of all blocks are concatenated to obtain the contour feature of the image. For example, for a 64 × 128 image with 8 × 8 pixels per cell, 2 × 2 cells per block (a 16 × 16 block) and a stride of 8 pixels, each block contains 4 × 9 = 36 features, and there are 7 scan positions horizontally and 15 vertically; that is, the 64 × 128 picture yields 36 × 7 × 15 = 3780 contour features in total.
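OpenCV's default HOG descriptor uses exactly this configuration (64 × 128 window, 8 × 8 cells, 2 × 2 cells per block, 8-pixel stride, 9 bins), so the block detailed features of a region can be sketched as follows; resizing the patch to 64 × 128 is an assumption of this example.

```python
import cv2

def block_detail_features(patch):
    """Compute the 7 * 15 * 36 = 3780-dimensional HOG contour feature vector
    of a region resized to 64x128."""
    hog = cv2.HOGDescriptor()                  # defaults match the example above
    return hog.compute(cv2.resize(patch, (64, 128)))
```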
As an alternative embodiment, after the step of obtaining the block detailed feature vector from the significant local feature point map, the method further comprises the following step: generating an index code from the significant local feature point map. Optionally, generating the index code from the significant local feature point map comprises the following steps:
dividing the significant locality feature point diagram into blocks;
extracting the central point of the significant local characteristic point in each block, and taking the gray value of the central point in each block as the threshold of the corresponding block;
comparing the gray value of each significant local characteristic point in each block with the threshold value of the corresponding block, and when the gray value of the significant local characteristic point is smaller than the threshold value, marking as 1, otherwise, marking as 0, and obtaining the characteristic code of each block;
establishing a characteristic code spiral matrix from inside to outside by taking the characteristic code of the block in the center of the mark area image as a starting point; the characteristic code spiral matrix forms an index code.
Specifically, the exact centre point of the image is extracted, for example the most central point within a 3 × 3 range, and the gray value of this central pixel is taken as the threshold. An 8 × 8 pixel region is then taken and the gray value of each pixel in the region is compared with the threshold: if it is smaller, it is recorded as 1, otherwise as 0. The encoding within the 8 × 8 grid is then as follows:
[8 × 8 binary feature code matrix of the block: one bit per pixel, 1 where the gray value is below the threshold and 0 otherwise.]
This forms a 64-bit binary feature code for the block. Taking this block as the centre, a spiral matrix is established from the inside outwards as follows:
[Feature code spiral matrix: the 64-bit block codes arranged in a spiral from the central block outwards.]
In this way, 25 blocks of information are established for the two-dimensional code, each block comprising a 64-bit binary code. Twenty-five blocks are used here to keep the example easy to follow; in actual processing the number of blocks of the two-dimensional code is much larger than 25. The block information is stored in a database such as PostgreSQL.
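The per-block 64-bit feature code and the inside-out spiral arrangement can be sketched as follows; the exact spiral traversal order and the odd grid-size assumption are illustrative choices of this example rather than the patent's precise layout.

```python
import numpy as np

def block_feature_code(block):
    """64-bit code of an 8x8 block: 1 where the gray value is below the
    block centre's gray value, 0 otherwise."""
    centre = block[block.shape[0] // 2, block.shape[1] // 2]
    return (block < centre).astype(np.uint8).ravel()

def spiral_order(n):
    """Visit an n x n grid (n odd) from the centre outwards in a spiral and
    return the visiting index of every cell; the block codes are then arranged
    in this order to form the feature code spiral matrix."""
    order = np.zeros((n, n), dtype=int)
    x = y = n // 2
    idx, step, dx, dy = 1, 1, 1, 0
    while idx < n * n:
        for _ in range(2):                     # two legs per step length
            for _ in range(step):
                x, y = x + dx, y + dy
                if 0 <= x < n and 0 <= y < n:
                    order[y, x] = idx
                    idx += 1
            dx, dy = -dy, dx                   # turn 90 degrees
        step += 1
    return order
```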
As an optional implementation manner, the searching the key area feature code fingerprint database according to the key area feature code fingerprint to be tested comprises the following steps:
randomly generating an Archimedean spiral starting from the central point of the significant local feature point map of the commodity to be detected;
randomly extracting, according to the size of the set, a number of significant local feature points from all the significant local feature points that the Archimedean spiral passes through;
matching and retrieving the extracted significant local feature points against the index codes;
when the first fault tolerance between the extracted significant local feature points and a particular index code among the index codes is less than or equal to a first preset fault tolerance, performing block detailed feature vector matching retrieval between the extracted significant local feature points and that index code;
and when the second fault tolerance of the block detailed feature vector matching between the extracted significant local feature points and that index code is less than or equal to a second preset fault tolerance, judging the result to be true.
When the second fault tolerance of the block detailed feature vector matching is greater than the second preset fault tolerance, the step of randomly extracting, according to the size of the set, a number of significant local feature points from all the significant local feature points passed by the Archimedean spiral is executed again.
When, after this random extraction step has been executed a preset number of times, the second fault tolerance of the block detailed feature vector matching between the extracted significant local feature points and that index code is still greater than the second preset fault tolerance, the result is judged to be false.
Specifically, two Archimedean spirals are randomly generated on the image, starting from its central point, and a number of significant local feature points (the significant points remaining after the general local feature points have been filtered out) are randomly extracted, according to the size of the set, from all the points the spirals pass through; for example, the Archimedean spirals may pass through blocks 1, 8, 6, 4, 11, 23, 19 and 15, and these blocks are matched and retrieved using the index code. If the retrieval satisfies the first preset fault tolerance of 40%, the key points that satisfy the index code condition undergo refined local feature comparison and matching (position, scale and orientation information; size, orientation and contour information), and the fault tolerance is computed to give different prompts. If no record matches, the random extraction of significant local feature points is executed again; if the result still does not meet the fault tolerance, the verification result for the commodity is false.
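A sketch of the spiral-based sampling is shown below; the spiral parameters, the distance tolerance and the sample size are assumptions of this example, and the subsequent index-code and detailed-feature matching are omitted.

```python
import numpy as np

def spiral_sample_points(points, centre, n_samples=20, b=2.0,
                         max_theta=12 * np.pi, tol=5.0):
    """Collect the significant local feature points that an Archimedean spiral
    r = b * theta (started at `centre`) passes close to, then return a random
    subset of at most `n_samples` of them."""
    theta = np.linspace(0.0, max_theta, 2000)
    r = b * theta
    spiral = np.column_stack((centre[0] + r * np.cos(theta),
                              centre[1] + r * np.sin(theta)))

    pts = np.asarray(points, dtype=np.float64)
    dists = np.linalg.norm(pts[:, None, :] - spiral[None, :, :], axis=2).min(axis=1)
    passed = pts[dists <= tol]
    if len(passed) == 0:
        return passed

    idx = np.random.choice(len(passed), size=min(n_samples, len(passed)), replace=False)
    return passed[idx]
```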
Optionally, when the judgment result is true, updating the key area feature code fingerprint database according to the judgment result; the updating mode comprises deleting the corresponding key area characteristic code fingerprint in the key area characteristic code fingerprint library or writing the user information into the corresponding key area characteristic code fingerprint in the key area characteristic code fingerprint library.
In the description of the present invention, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
The above embodiments express only several implementations of the present invention; their description is relatively specific and detailed, but it is not to be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the inventive concept, and these fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (7)

1. A code fingerprint anti-counterfeiting method is characterized by comprising the following steps:
a step of establishing a code fingerprint database, which comprises the steps of collecting commodity mark area images, establishing key area characteristic code fingerprints according to the commodity mark area images, and establishing a key area characteristic code fingerprint database by the key area characteristic code fingerprints of a plurality of commodities of the same type;
establishing a code fingerprint of a commodity to be detected, wherein the step comprises the steps of collecting a mark area image of the commodity to be detected, and establishing a characteristic code fingerprint of a key area to be detected according to the mark area image of the commodity to be detected;
the step of retrieval verification comprises the steps of retrieving the key area feature code fingerprint database according to the key area feature code fingerprint to be detected and judging the authenticity of the commodity to be detected according to the retrieval result;
the method comprises the following steps of establishing a key area characteristic code fingerprint according to the commodity mark area image and establishing a key area characteristic code fingerprint to be detected according to the commodity mark area image, wherein the key area characteristic code fingerprint to be detected respectively comprises the following steps:
extracting local characteristic points of the mark region image;
removing general local characteristic points in the local characteristic points, and forming a remarkable local characteristic point diagram by the residual local characteristic points;
acquiring a block detailed feature vector according to the significant locality feature point diagram;
establishing a key area feature code fingerprint according to the significant locality feature point diagram and the block detailed feature vector;
the mark area image is a commodity mark area image or a commodity mark area image to be detected, and the key area feature code fingerprint is a key area feature code fingerprint or a key area feature code fingerprint to be detected;
after the step of obtaining the block detailed feature vector according to the significant local feature point diagram, the method further comprises the following steps:
generating an index code according to the significant locality feature point diagram;
the method for generating the index code according to the significant locality feature point diagram comprises the following steps:
dividing the significant locality feature point map into blocks;
extracting the central point of the significant local characteristic point in each block, and taking the gray value of the central point in each block as the threshold of the corresponding block;
comparing the gray value of each significant local characteristic point in each block with the threshold value of the corresponding block, and when the gray value of the significant local characteristic point is smaller than the threshold value, marking as 1, otherwise, marking as 0, and obtaining the characteristic code of each block;
establishing a characteristic code spiral matrix from inside to outside by taking the characteristic code of the block in the center of the mark area image as a starting point; the characteristic code spiral matrix forms an index code;
the key area feature code fingerprint database is retrieved according to the key area feature code fingerprint to be detected, and the method comprises the following steps:
randomly generating an Archimedean spiral starting from the central point of the significant local feature point map of the commodity to be detected;
according to the size of the set, randomly extracting a plurality of significant local characteristic points of all the significant local characteristic points passed by the Archimedes spiral;
matching and retrieving the extracted several significant local characteristic points with the index codes;
when the first fault tolerance rate of the extracted plurality of the salient local characteristic points and a specific index code in the index codes is less than or equal to a first fault tolerance rate preset value, carrying out block detailed characteristic vector matching retrieval on the extracted plurality of the salient local characteristic points and the specific index code;
and when the second fault tolerance of the extracted plurality of the significant local characteristic points and the specific index code for carrying out the block detailed characteristic vector is less than or equal to a second fault tolerance preset value, judging that the result is true.
2. The code fingerprint anti-counterfeiting method according to claim 1, wherein the extraction of the local characteristic points of the image of the mark region is realized by a scale invariant characteristic transformation method.
3. The code fingerprint anti-counterfeiting method according to claim 1, wherein the universal local characteristic points are local characteristic points which account for more than a preset proportion in all local characteristic points of a plurality of same commodity sign region images.
4. The code fingerprint anti-counterfeiting method according to claim 1, wherein the acquisition of the detailed feature vector of the block according to the significant locality feature point diagram is realized by a histogram of oriented gradients feature detection method.
5. The code fingerprint anti-counterfeiting method according to claim 1, wherein, when the second fault tolerance of the block detailed feature vector matching between the extracted significant local feature points and the specific index code is greater than the second fault tolerance preset value, the step of randomly extracting, according to the size of the set, a number of significant local feature points from all the significant local feature points passed by the Archimedean spiral is performed again.
6. The code fingerprint anti-counterfeiting method according to claim 5, wherein the judgment result is false when, after the step of randomly extracting a number of significant local feature points from all the significant local feature points passed by the Archimedean spiral according to the size of the set has been performed a preset number of times, the second fault tolerance of the block detailed feature vector matching between the extracted significant local feature points and the specific index code is still greater than the second fault tolerance preset value.
7. The code fingerprint anti-counterfeiting method according to any one of claims 1 to 6, wherein when the judgment result is true, the key area feature code fingerprint database is updated according to the judgment result;
the updating mode comprises deleting the corresponding key area characteristic code fingerprint in the key area characteristic code fingerprint library or writing the user information into the corresponding key area characteristic code fingerprint in the key area characteristic code fingerprint library.
CN201910547553.1A 2019-06-24 2019-06-24 Code fingerprint anti-fake method Active CN110288359B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910547553.1A CN110288359B (en) 2019-06-24 2019-06-24 Code fingerprint anti-fake method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910547553.1A CN110288359B (en) 2019-06-24 2019-06-24 Code fingerprint anti-fake method

Publications (2)

Publication Number Publication Date
CN110288359A CN110288359A (en) 2019-09-27
CN110288359B true CN110288359B (en) 2021-11-23

Family

ID=68004630

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910547553.1A Active CN110288359B (en) 2019-06-24 2019-06-24 Code fingerprint anti-fake method

Country Status (1)

Country Link
CN (1) CN110288359B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113408526A (en) * 2021-06-18 2021-09-17 深圳市数标国际科技有限公司 Image recognition anti-counterfeiting method and image recognition anti-counterfeiting system based on Handle identification positioning

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103279731A (en) * 2013-06-06 2013-09-04 格科微电子(上海)有限公司 Two-dimension code anti-fake method and anti-fake verification method thereof
CN103309982A (en) * 2013-06-17 2013-09-18 武汉大学 Remote sensing image retrieval method based on vision saliency point characteristics
CN104036281A (en) * 2014-06-24 2014-09-10 北京奇虎科技有限公司 Matching method, searching method, and matching and searching device of pictures
CN105493143A (en) * 2013-04-04 2016-04-13 日本电气株式会社 Identification method, identification system, matching device, and program
CN106960351A (en) * 2016-01-11 2017-07-18 深圳市安普盛科技有限公司 A kind of commodity counterfeit prevention, verification method and system and bar code scanner
CN103559473B (en) * 2013-10-28 2017-08-01 汝思信息技术(上海)有限公司 The false proof method and system of stock is realized using characteristic image
CN108303435A (en) * 2017-01-12 2018-07-20 同方威视技术股份有限公司 The method for checking equipment and container being checked

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105493143A (en) * 2013-04-04 2016-04-13 日本电气株式会社 Identification method, identification system, matching device, and program
CN103279731A (en) * 2013-06-06 2013-09-04 格科微电子(上海)有限公司 Two-dimension code anti-fake method and anti-fake verification method thereof
CN103309982A (en) * 2013-06-17 2013-09-18 武汉大学 Remote sensing image retrieval method based on vision saliency point characteristics
CN103559473B (en) * 2013-10-28 2017-08-01 汝思信息技术(上海)有限公司 The false proof method and system of stock is realized using characteristic image
CN104036281A (en) * 2014-06-24 2014-09-10 北京奇虎科技有限公司 Matching method, searching method, and matching and searching device of pictures
CN106960351A (en) * 2016-01-11 2017-07-18 深圳市安普盛科技有限公司 A kind of commodity counterfeit prevention, verification method and system and bar code scanner
CN108303435A (en) * 2017-01-12 2018-07-20 同方威视技术股份有限公司 The method for checking equipment and container being checked

Also Published As

Publication number Publication date
CN110288359A (en) 2019-09-27

Similar Documents

Publication Publication Date Title
US11423641B2 (en) Database for detecting counterfeit items using digital fingerprint records
US20200226366A1 (en) Controlled authentication of physical objects
US20210142436A1 (en) Event-driven authentication of physical objects
US11321964B2 (en) Loop chain digital fingerprint method and system
Alkawaz et al. Detection of copy-move image forgery based on discrete cosine transform
US11100517B2 (en) Preserving authentication under item change
EP2869240A2 (en) Digital fingerprinting object authentication and anti-counterfeiting system
CN1661627B (en) Counterfeit and tamper resistant labels with randomly occurring features
Sirmacek et al. Urban-area and building detection using SIFT keypoints and graph theory
JP6308370B2 (en) Identification method, identification system, verification device, and program
US20150067346A1 (en) Digital fingerprinting track and trace system
JP2011510365A (en) Document verification using dynamic document identification framework
US20200065829A1 (en) Commodity anti-counterfeit verification system based on natural biological information
CN109074371B (en) Method and computing device for determining whether a mark is authentic
CN103871044A (en) Image signature generating method and image verifying method and device
CN110288359B (en) Code fingerprint anti-fake method
CN115035533B (en) Data authentication processing method and device, computer equipment and storage medium
Sun et al. TemplateFree: product detection on retail store shelves
CN110310131A (en) Code fingerprint method for anti-counterfeit and code fingerprint anti-counterfeiting system
Ishiyama et al. Melon authentication by agri-biometrics-identifying individual fruits using a single image of rind pattern
Lu et al. Computer vision for hardware security
CN110297920A (en) A kind of yard of fingerprint method for anti-counterfeit
Masri et al. Image classification using appearance based features
Fattahi et al. Detection of Copy-Move Forgery in Digital Images Using Scale Invariant Feature Transform Algorithm and the Spearman Relationship.
Liu Automatic target recognition using location uncertainty

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant